ABSTRACT
Variations in human behavior correspond to the adaptation of the nervous system to different internal and environmental demands. Attention, a cognitive process for weighing environmental demands, changes over time. Pupillary activity, which is affected by fluctuating levels of cognitive processing, appears to track neural dynamics related to different states of attention. In mice, for example, pupil dynamics directly correlate with brain-state fluctuations. In humans, alpha-band activity is associated with inhibitory processes in cortical networks during visual processing, and its amplitude is modulated by attention; however, conclusive evidence linking this narrowband activity to pupil changes over time remains sparse. We hypothesize that, because alpha activity and pupil diameter both index attentional variations over time, the two measures should be comodulated. In this work, we recorded the electroencephalographic (EEG) and pupillary activity of 16 human subjects who kept their eyes fixed on a gray screen for 1 min. Our study revealed that the alpha-band amplitude and the high-frequency component of the pupil diameter covary spontaneously. Specifically, the maximum alpha-band amplitude occurred ∼300 ms before the peak of the pupil diameter, and the minimum alpha-band amplitude occurred ∼350 ms before the trough of the pupil diameter. The consistent temporal relationship between these two measurements strongly suggests that the subject's state of attention, as indexed by the EEG alpha amplitude, changes from moment to moment and can be monitored by recording EEG together with pupil diameter.
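For illustration only, a minimal sketch of the kind of analysis this implies, cross-correlating the EEG alpha-band amplitude envelope with the high-frequency component of the pupil trace, could look as follows; the sampling rate, filter bands, and placeholder signals are assumptions and not the recording parameters of the study.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert, correlate

fs = 250.0                                 # assumed common sampling rate (Hz)
eeg = np.random.randn(int(60 * fs))        # placeholder for a 1-min occipital EEG trace
pupil = np.random.randn(int(60 * fs))      # placeholder for the co-registered pupil trace

def bandpass(x, lo, hi, fs, order=4):
    b, a = butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")
    return filtfilt(b, a, x)

# Alpha-band (8-12 Hz) amplitude envelope via the Hilbert transform
alpha_amp = np.abs(hilbert(bandpass(eeg, 8.0, 12.0, fs)))
# "High-frequency" pupil component, here an assumed 0.5-4 Hz band
pupil_hf = bandpass(pupil, 0.5, 4.0, fs)

# Normalized cross-correlation; with this ordering, a positive peak lag means
# the pupil trace follows the alpha envelope (e.g., by ~300 ms as reported).
za = (alpha_amp - alpha_amp.mean()) / alpha_amp.std()
zp = (pupil_hf - pupil_hf.mean()) / pupil_hf.std()
xcorr = correlate(zp, za, mode="full", method="fft") / len(za)
lags_s = np.arange(-(len(za) - 1), len(zp)) / fs
print("pupil follows alpha by %.3f s at the correlation peak" % lags_s[np.argmax(xcorr)])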
ABSTRACT
The importance of proportional reasoning has long been recognized by psychologists and educators, yet we still do not have a good understanding of how humans mentally represent proportions. In this paper we present a psychophysical model of proportion estimation, extending previous approaches. We assumed that proportion representations are formed by representing each magnitude of a proportion stimulus (the part and its complement) as a Gaussian activation in the mind; these activations are then mentally combined in the form of a proportion. We next derived the internal representation of proportions, including bias and internal noise parameters, which capture, respectively, how estimates depart from true values and how variable they are. Methodologically, we introduced a mixture of components to account for contaminating behaviors (guessing and reversal of responses) and framed the model hierarchically. We found empirical support for the model by testing a group of 4th-grade children in a spatial proportion estimation task. In particular, the internal density reproduced the asymmetries (skewness) seen in this and in previous reports of estimation tasks, and the model accurately described wide between-subject variations in behavior. Bias estimates were, in general, smaller than those obtained with previous approaches, owing to the model's capacity to absorb contaminating behaviors. This property of the model can be of special relevance for studies aimed at linking psychophysical measures with broader cognitive abilities. We also recovered higher levels of noise than those reported for discrimination of spatial magnitudes and discuss possible explanations for this difference. We conclude by illustrating a concrete application of our model to the study of scaling effects in proportional reasoning, highlighting the value of quantitative models in this field of research.
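To make the generative structure described above concrete, the sketch below simulates proportion estimates by encoding the part and its complement with Gaussian noise, combining them as a ratio, and mixing in guessing and response-reversal components; the specific parameterization (Weber-like noise, a power-law-style bias, and the default parameter values) is an illustrative assumption, not the exact equations of the model.

import numpy as np

rng = np.random.default_rng(0)

def simulate_responses(p_true, n_trials, noise_w=0.15, bias=1.0,
                       p_guess=0.05, p_reverse=0.03):
    """Simulate proportion estimates for a stimulus with true proportion p_true.

    noise_w   : internal noise on each encoded magnitude (assumed Weber-like scaling)
    bias      : power-law-style bias on each magnitude (assumption)
    p_guess   : probability of a uniform guess in [0, 1]
    p_reverse : probability of reporting the complement 1 - p instead of p
    """
    part, comp = p_true, 1.0 - p_true
    # Noisy, possibly biased internal magnitudes (clipped to stay positive)
    x = np.clip(rng.normal(part ** bias, noise_w * part, n_trials), 1e-6, None)
    y = np.clip(rng.normal(comp ** bias, noise_w * comp, n_trials), 1e-6, None)
    est = x / (x + y)                                  # ratio rule -> skewed response density

    # Contaminating behaviors as mixture components
    u = rng.uniform(size=n_trials)
    est = np.where(u < p_reverse, 1.0 - est, est)      # response reversal
    est = np.where(u > 1.0 - p_guess,
                   rng.uniform(size=n_trials), est)    # uniform guessing
    return est

resp = simulate_responses(p_true=0.25, n_trials=5000)
print("mean estimate %.3f, sd %.3f" % (resp.mean(), resp.std()))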
ABSTRACT
Hippocampal-dependent memories emerge late in postnatal development, in line with hippocampal maturation. The two-stage model of memory formation states that, during sleep, hippocampal-neocortical interactions synchronize cortical slow oscillations (SO), thalamocortical spindles, and hippocampal sharp-wave ripples (SWR), allowing for the consolidation of hippocampal-dependent memories. However, evidence supporting this hypothesis during development is still lacking. Therefore, we performed successive object-in-place tests during the window of memory emergence and recorded in vivo the occurrence of SO, spindles, and SWR during sleep, immediately after the memory-encoding stage of the task. We found that hippocampal-dependent memory emerges at the end of the 4th postnatal week, independently of task overtraining. Furthermore, animals with better performance in the memory task had higher spindle density and duration and lower SWR density. Moreover, we observed changes in SO-spindle and spindle-SWR temporal coupling during this developmental period. Our results provide new evidence for the onset of hippocampal-dependent memory and its relationship to the oscillatory phenomena occurring during sleep, helping us understand how memory consolidation models fit into the early stages of postnatal development.
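As a rough illustration of how such temporal coupling can be quantified from detected event times (an assumed event-based analysis, not necessarily the pipeline used here), the following sketch histograms spindle onsets relative to nearby SO troughs; all event times below are placeholders.

import numpy as np

rng = np.random.default_rng(1)
so_troughs = np.sort(rng.uniform(0, 600, 300))     # placeholder SO trough times (s)
spindle_onsets = np.sort(rng.uniform(0, 600, 150)) # placeholder spindle onset times (s)

window = 1.5                                       # seconds around each SO trough
delays = []
for t_so in so_troughs:
    d = spindle_onsets - t_so
    delays.extend(d[np.abs(d) <= window].tolist())

delays = np.asarray(delays)
counts, edges = np.histogram(delays, bins=np.arange(-window, window + 0.1, 0.1))
print("coupled spindle events: %d, modal delay ~%.2f s"
      % (len(delays), edges[np.argmax(counts)] + 0.05))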
ABSTRACT
It is widely accepted that the brain, like any other physical system, is subject to physical constraints that restrict its operation. The brain's metabolic demands are particularly critical for proper neuronal function, but the impact of these constraints remains poorly understood. Detailed single-neuron models have recently begun to integrate metabolic constraints, but their computational cost makes it challenging to explore the dynamics of extended neural networks governed by such constraints. Thus, there is a need for a simplified neuron model that incorporates metabolic activity and allows us to explore the dynamics of neural networks. This work introduces an energy-dependent leaky integrate-and-fire (EDLIF) neuronal model, an extension that accounts for the effects of metabolic constraints on single-neuron behavior. This simple, energy-dependent model can describe the relationship between the average firing rate and the adenosine triphosphate (ATP) cost, and it can replicate a neuron's behavior in a clinical setting such as amyotrophic lateral sclerosis (ALS). Additionally, the EDLIF model showed better performance in predicting real spike trains, in the sense of a spike-coincidence measure, than the classical leaky integrate-and-fire (LIF) model. The simplicity of the energy-dependent model presented here makes it computationally efficient and, thus, suitable for studying the dynamics of large neural networks.
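As a rough illustration of how an energy-dependent extension of the LIF neuron might be set up, the toy sketch below adds an ATP pool that is depleted by each spike and slowly recovers, and lets low ATP raise the effective firing threshold; this particular coupling and all parameter values are illustrative assumptions, not the published EDLIF equations.

import numpy as np

def run_ed_lif(I_ext, dt=1e-4, T=1.0,
               tau_m=0.02, v_rest=-70e-3, v_reset=-70e-3, v_thresh=-50e-3,
               atp_max=1.0, atp_cost_spike=0.02, atp_recovery_tau=0.5):
    """Toy energy-dependent LIF: I_ext is the drive in volts (input current times R)."""
    n = int(T / dt)
    v, atp = v_rest, atp_max
    spikes, atp_trace = [], np.empty(n)
    for i in range(n):
        # Leaky integration of the membrane potential
        v += dt / tau_m * (-(v - v_rest) + I_ext)
        # ATP recovers toward its maximum with a slow time constant
        atp += dt / atp_recovery_tau * (atp_max - atp)
        # Assumed coupling: low ATP raises the effective threshold
        eff_thresh = v_thresh + 10e-3 * (1.0 - atp / atp_max)
        if v >= eff_thresh:
            spikes.append(i * dt)
            v = v_reset
            atp = max(atp - atp_cost_spike, 0.0)   # metabolic cost per spike
        atp_trace[i] = atp
    return np.array(spikes), atp_trace

spikes, atp = run_ed_lif(I_ext=25e-3, T=1.0)
print("firing rate %.1f Hz, final ATP level %.2f" % (len(spikes) / 1.0, atp[-1]))

In this toy version, sustained firing depletes ATP, which raises the threshold and lowers the firing rate until consumption and recovery balance, giving a simple, energy-driven form of spike-frequency adaptation.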