Animal learning is often conceived as a gradual process that unfolds over many trials and involves the incremental strengthening of associations among stimuli, responses, and outcomes. In recent years, however, mounting evidence suggests that animal learning is better understood as an active inference and decision-making process. This view derives from careful statistical examination of individual animals' trial-by-trial learning progress, and from the observation that learning is accompanied by sudden transitions among neural ensemble states in prefrontal cortex (PFC) that code for different behavioral rules. Results from the last funding period suggest that a) these sudden neural transitions may reflect a change in choice criterion rather than a period of uncertainty and/or exploration, and b) dopamine may likewise primarily affect the choice process rather than value updating. Building on these results and other observations, we aim here to further validate and extend our current understanding along three major directions, using a combination of multi-tetrode recordings, optogenetic manipulations, and advanced time-series and computational model-based analyses of the learning process:
1) We will dissect in more detail the task periods during which dopamine input is most crucial, how dopamine neuron activity is coordinated with PFC activity as the task progresses, and how it affects subcomponents of rule learning such as action selection and value updating;
2) We will test specific hypotheses regarding the neurodynamical mechanisms underlying the active inference process in various subdivisions of the rat PFC;
3) Through variations of the basic behavioral task design, we will test competing hypotheses about the decision-making and rule-formation mechanisms involved in the learning process.
Thus, we will continue toward our goal of a comprehensive understanding of rule learning as active inference at the neurophysiological, neurodynamical, and neurocomputational levels.
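To illustrate the kind of trial-by-trial statistical analysis referred to above, the sketch below fits a single change-point model to a simulated sequence of binary trial outcomes: an abrupt jump in accuracy, rather than a gradual ramp, is what the "sudden transition" view predicts. This is a minimal, hypothetical example (the simulated rates, trial counts, and helper names are ours), not the project's actual analysis pipeline.

```python
import numpy as np

def bernoulli_loglik(x):
    """Log-likelihood of binary outcomes under their maximum-likelihood rate."""
    n = len(x)
    if n == 0:
        return 0.0
    p = np.clip(x.mean(), 1e-9, 1 - 1e-9)
    k = x.sum()
    return k * np.log(p) + (n - k) * np.log(1 - p)

def find_change_point(outcomes):
    """Return the trial index that best splits the sequence into two
    constant-rate regimes (single abrupt-change model)."""
    outcomes = np.asarray(outcomes, dtype=float)
    scores = [
        bernoulli_loglik(outcomes[:t]) + bernoulli_loglik(outcomes[t:])
        for t in range(1, len(outcomes))
    ]
    return int(np.argmax(scores)) + 1

rng = np.random.default_rng(0)
# Simulated learner: chance-level performance, then an abrupt jump
# to high accuracy at trial 60 (all numbers are illustrative).
trials = np.concatenate([
    rng.random(60) < 0.20,  # pre-learning: ~20% correct
    rng.random(60) < 0.85,  # post-transition: ~85% correct
]).astype(int)

cp = find_change_point(trials)
print(f"Estimated transition trial: {cp}")
```

Comparing the maximized likelihood of this two-regime model against a gradual (e.g. sigmoid) learning-curve model on real data is one standard way to adjudicate between incremental and step-like accounts of learning.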