Search Results

55 results for "Ben Hamed, Suliann"
Sulcal organization in the medial frontal cortex provides insights into primate brain evolution
Although the relative expansion of the frontal cortex in primate evolution is generally accepted, the nature of human uniqueness, if any, and between-species anatomo-functional comparisons of the frontal areas remain controversial. To provide a novel interpretation of the evolution of primate brains, sulcal morphological variability of the medial frontal cortex was assessed in Old World monkeys (macaque/baboon) and Hominoidea (chimpanzee/human). We show that both Hominoidea possess a paracingulate sulcus, which was previously thought to be unique to the human brain and linked to higher cognitive functions, such as mentalizing. We also show a systematic sulcal morphological organization of the medial frontal cortex that can be traced from Old World monkeys to Hominoidea species, demonstrating an evolutionarily conserved organizational principle. These data provide a new framework to compare sulcal morphology, cytoarchitectonic areal distribution, connectivity, and function across the primate order, leading to clear predictions about how other primate brains might be anatomo-functionally organized. The frontal cortex has expanded over primate evolution. Here, the authors use neuroimaging data from the brains of humans, chimpanzees, baboons, and macaques, to reveal shared and distinct sulcal morphology of the medial frontal cortex.
Prefrontal attentional saccades explore space rhythmically
Recent studies suggest that attention samples space rhythmically through oscillatory interactions in the frontoparietal network. How these attentional fluctuations coincide with spatial exploration/displacement and exploitation/selection by a dynamic attentional spotlight under top-down control is unclear. Here, we show a direct contribution of prefrontal attention selection mechanisms to continuous space exploration. Specifically, we provide a direct high spatio-temporal resolution prefrontal population decoding of the covert attentional spotlight. We show that it continuously explores space at a 7–12 Hz rhythm. Sensory encoding and behavioral reports are increased at a specific optimal phase with respect to this rhythm. We propose that this prefrontal neuronal rhythm reflects an alpha-clocked sampling of the visual environment in the absence of eye movements. These attentional explorations are highly flexible, their spatial unfolding depending on both within-trial and across-task contingencies. These results are discussed in the context of exploration-exploitation strategies and prefrontal top-down attentional control. The prefrontal attention spotlight dynamically explores space at 7–12 Hz, enhancing sensory encoding and behavior, in the absence of eye movements. This alpha-clocked sampling of space is under top-down control and implements an alternation in exploration and exploitation of the visual environment.
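The phase dependence described in this abstract (behavior improving at an "optimal phase" of a 7–12 Hz rhythm) can be illustrated with a short analysis sketch. This is a minimal illustration, not the authors' pipeline: the sampling rate, band limits, and the variable names decoded_x and hit are assumptions.

    # Minimal sketch: bin behavioral outcomes by the phase of a 7-12 Hz
    # rhythm extracted from a decoded attentional-spotlight trace.
    # Assumed inputs: decoded_x (decoded spotlight position over time,
    # sampled at fs Hz) and hit (one behavioral outcome per sample).
    import numpy as np
    from scipy.signal import butter, filtfilt, hilbert

    fs = 1000.0                                   # assumed sampling rate (Hz)
    rng = np.random.default_rng(0)
    decoded_x = rng.standard_normal(10_000)       # placeholder trace
    hit = rng.random(10_000) > 0.5                # placeholder outcomes

    # Band-pass the trace in the alpha range and extract instantaneous phase.
    b, a = butter(3, [7.0 / (fs / 2), 12.0 / (fs / 2)], btype="band")
    phase = np.angle(hilbert(filtfilt(b, a, decoded_x)))

    # Average behavioral outcome per phase bin: a phase-locked modulation
    # would show up as a peak at an "optimal" phase.
    bins = np.linspace(-np.pi, np.pi, 13)
    idx = np.digitize(phase, bins) - 1
    hit_rate_by_phase = [hit[idx == k].mean() for k in range(len(bins) - 1)]
    print(np.round(hit_rate_by_phase, 3))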
Separate and overlapping mechanisms of statistical regularities and salience processing in the occipital cortex and dorsal attention network
Attention selects behaviorally relevant inputs for in‐depth processing. Besides the role of traditional signals related to goal‐directed and stimulus‐driven control, a debate exists regarding the mechanisms governing the effect of statistical regularities on attentional selection, and how these are integrated with other control signals. Using a visuo‐spatial search task under fMRI, we tested the joint effects of statistical regularities and stimulus‐driven salience. We found that both types of signals modulated occipital activity in a spatially specific manner. Salience acted primarily by reducing the attention bias towards the target location when associated with irrelevant distractors, while statistical regularities reduced this attention bias when the target was presented at a low probability location, particularly at the lower levels of the visual hierarchy. In addition, we found that both statistical regularities and salience activated the dorsal frontoparietal network. Additional exploratory analyses of functional connectivity revealed that only statistical regularities modulated the inter‐regional coupling between the posterior parietal cortex and the occipital cortex. These results show that statistical regularities and salience signals are both spatially represented at the occipital level, but that their integration into attentional processing priorities relies on dissociable brain mechanisms. Our findings contribute to the debate concerning how statistical regularities affect attention control, here pointing to substantial differences compared to salience processing: while salience was associated primarily with local effects within the occipital cortex, statistical regularities engaged a combination of local occipital processing and occipito‐parietal interactions.
Dynamic causal interactions between occipital and parietal cortex explain how endogenous spatial attention and stimulus-driven salience jointly shape the distribution of processing priorities in 2D visual space
• We investigated the interplay between endogenous and exogenous spatial attention.
• Distributed activity in occipital cortex represents current processing priorities.
• Parietal top-down and occipital lateral signaling contribute to spatial selection.
Visuo-spatial attention prioritizes the processing of relevant inputs via different types of signals, including current goals and stimulus salience. Complex mixtures of these signals engage in everyday life situations, but little is known about how these signals jointly modulate distributed patterns of activity across the occipital regions that represent visual space. Here, we measured spatio-topic, quadrant-specific occipital activity during the processing of visual displays containing both task-relevant targets and salient color-singletons. We computed spatial bias vectors indexing the effect of attention in 2D space, as coded by distributed activity in the occipital cortex. We found that goal-directed spatial attention biased activity towards the target and that salience further modulated this endogenous effect: salient distractors decreased the spatial bias, while salient targets increased it. Analyses of effective connectivity revealed that the processing of salient distractors relied on the modulation of the bidirectional connectivity between the occipital and the posterior parietal cortex, as well as the modulation of the lateral interactions within the occipital cortex. These findings demonstrate that goal-directed attention and salience jointly contribute to shaping processing priorities in the occipital cortex and highlight that multiple functional paths determine how spatial information about these signals is distributed across occipital regions.
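The "spatial bias vector" idea in this abstract can be sketched in a few lines. The quadrant coordinates, array names, and the simple weighted-average rule below are illustrative assumptions, not the published analysis.

    # Minimal sketch of a spatial bias vector computed from quadrant-specific
    # occipital activity.
    import numpy as np

    # Unit vectors pointing to the centre of each visual quadrant
    # (upper-right, upper-left, lower-left, lower-right).
    quadrant_dirs = np.array([[1, 1], [-1, 1], [-1, -1], [1, -1]]) / np.sqrt(2)

    # Assumed input: one activity estimate per quadrant (e.g. mean responses
    # from quadrant-specific occipital ROIs in a given condition).
    activity = np.array([1.8, 0.9, 0.7, 1.0])

    # Bias vector: activity-weighted average of the quadrant directions.
    weights = activity / activity.sum()
    bias_vector = weights @ quadrant_dirs
    print("bias vector (x, y):", np.round(bias_vector, 3))
    # A vector pointing towards the target quadrant indicates an attentional
    # bias towards the target; salient distractors would shrink or deflect it.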
Automated video-based heart rate tracking for the anesthetized and behaving monkey
Heart rate (HR) is extremely valuable in the study of complex behaviours and their physiological correlates in non-human primates. However, collecting this information is often challenging, involving either invasive implants or tedious behavioural training. In the present study, we implement a Eulerian video magnification (EVM) heart-tracking method, combined with a wavelet transform, in the macaque monkey. This is based on a measure of image-to-image fluctuations in skin reflectance due to changes in blood influx. We show a strong temporal coherence and amplitude match between EVM-based heart tracking and ground-truth ECG, from both color (RGB) and infrared (IR) videos, in anesthetized macaques, to a level comparable to what can be achieved in humans. We further show that this method allows us to identify consistent HR changes following the presentation of conspecific emotional voices or faces. EVM is used to extract HR in humans but has never been applied to non-human primates. Video photoplethysmography allows the extraction of awake macaques' HR from RGB videos. In contrast, our method extracts awake macaques' HR from both RGB and IR videos and is particularly resilient to the head motion that can be observed in awake behaving monkeys. Overall, we believe that this method can be generalized as a tool to track the HR of the awake behaving monkey, for ethological, behavioural, neuroscience or welfare purposes.
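The core idea (heart rate recovered from frame-to-frame fluctuations in skin reflectance) can be illustrated with a simplified photoplethysmography-style estimate. This is a minimal sketch, not the full EVM-plus-wavelet pipeline used in the study; the frame rate, band limits, and the frames array are assumptions.

    # Minimal sketch of video-based heart-rate estimation from skin-pixel
    # fluctuations (band-pass + spectral peak).
    import numpy as np
    from scipy.signal import butter, filtfilt, periodogram

    fps = 30.0                                      # assumed camera frame rate
    rng = np.random.default_rng(1)
    # Assumed input: frames as (n_frames, height, width) grayscale skin ROI.
    frames = rng.random((900, 50, 50))

    # 1. Average pixel intensity per frame: blood influx modulates reflectance.
    signal = frames.mean(axis=(1, 2))

    # 2. Band-pass around plausible macaque heart rates (~1.5-4 Hz, 90-240 bpm).
    b, a = butter(3, [1.5 / (fps / 2), 4.0 / (fps / 2)], btype="band")
    filtered = filtfilt(b, a, signal - signal.mean())

    # 3. Take the dominant spectral peak as the heart-rate estimate.
    freqs, power = periodogram(filtered, fs=fps)
    hr_bpm = 60.0 * freqs[np.argmax(power)]
    print(f"estimated heart rate: {hr_bpm:.1f} bpm")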
Early social adversity modulates the relation between attention biases and socioemotional behaviour in juvenile macaques
Affect-biased attention may play a fundamental role in early socioemotional development, but factors influencing its emergence and associations with typical versus pathological outcomes remain unclear. Here, we adopted a nonhuman primate model of early social adversity (ESA) to: (1) establish whether juvenile, pre-adolescent macaques demonstrate attention biases to both threatening and reward-related dynamic facial gestures; (2) examine the effects of early social experience on such biases; and (3) investigate how this relation may be linked to socioemotional behaviour. Two groups of juvenile macaques (ESA exposed and non-ESA exposed) were presented with pairs of dynamic facial gestures comprising two conditions: neutral-threat and neutral-lipsmacking. Attention biases to threat and lipsmacking were calculated as the proportion of gaze to the affective versus neutral gesture. Measures of anxiety and social engagement were also acquired from videos of the subjects in their everyday social environment. Results revealed that while both groups demonstrated an attention bias towards threatening facial gestures, a greater bias linked to anxiety was demonstrated by the ESA group only. Only the non-ESA group demonstrated a significant attention bias towards lipsmacking, and the degree of this positive bias was related to duration and frequency of social engagement in this group. These findings offer important insights into the effects of early social experience on affect-biased attention and related socioemotional behaviour in nonhuman primates, and demonstrate the utility of this model for future investigations into the neural and learning mechanisms underlying this relationship across development.
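The attention-bias measure defined in this abstract (the proportion of gaze to the affective versus the neutral gesture) is simple enough to state directly. The function and example numbers below are illustrative assumptions.

    # Minimal sketch of the attention-bias measure: proportion of total
    # looking time spent on the affective (threat or lipsmacking) gesture.
    def attention_bias(gaze_affective_ms: float, gaze_neutral_ms: float) -> float:
        """0.5 = no bias; > 0.5 = bias towards the affective gesture;
        < 0.5 = bias towards the neutral gesture."""
        total = gaze_affective_ms + gaze_neutral_ms
        return gaze_affective_ms / total if total > 0 else float("nan")

    # Example: 1200 ms on the threat gesture vs 800 ms on the neutral gesture.
    print(attention_bias(1200, 800))   # 0.6 -> bias towards threat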
Strengths and challenges of longitudinal non-human primate neuroimaging
• Strengths and challenges of longitudinal non-human primate MRI are described.
• Statistical power calculations for longitudinal and cross-sectional designs are provided.
• The impact of template choice on grey matter estimation is demonstrated.
• Recommendations for designing and analysing such studies are provided.
Longitudinal non-human primate neuroimaging has the potential to greatly enhance our understanding of primate brain structure and function. Here we describe its specific strengths, compared to both cross-sectional non-human primate neuroimaging and longitudinal human neuroimaging, but also its associated challenges. We elaborate on factors guiding the use of different analytical tools, subject-specific versus age-specific templates for analyses, and issues related to statistical power.
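The statistical-power contrast between longitudinal and cross-sectional designs can be illustrated with a simplified paired versus independent-samples approximation. This is not the paper's calculation; the effect size, alpha, and target power are assumptions.

    # Minimal sketch: sample sizes needed to detect the same effect in a
    # longitudinal (within-subject) vs. cross-sectional (between-subject) design.
    from statsmodels.stats.power import TTestPower, TTestIndPower

    effect_size = 0.5   # assumed standardized effect (Cohen's d)
    alpha, power = 0.05, 0.80

    # Longitudinal: each animal is its own control (paired comparison).
    n_longitudinal = TTestPower().solve_power(effect_size=effect_size,
                                              alpha=alpha, power=power)

    # Cross-sectional: two independent groups of animals.
    n_per_group = TTestIndPower().solve_power(effect_size=effect_size,
                                              alpha=alpha, power=power)

    print(f"longitudinal: ~{n_longitudinal:.0f} subjects scanned twice")
    print(f"cross-sectional: ~{n_per_group:.0f} subjects per group "
          f"(~{2 * n_per_group:.0f} total)")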
Beyond MRI: on the scientific value of combining non-human primate neuroimaging with metadata
• Data sharing of primate neuroimaging offers new opportunities.
• The potential of metadata to enrich primate neuroimaging is described.
• An illustration of how metadata can be shared in the BIDS format is provided.
Sharing and pooling large amounts of non-human primate neuroimaging data offer exciting new opportunities to understand the primate brain. The potential of big data in non-human primate neuroimaging could, however, be tremendously enhanced by combining such neuroimaging data with other types of information. Here we describe metadata that have been identified as particularly valuable by the non-human primate neuroimaging community, including behavioural, genetic, physiological and phylogenetic data.
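One common way to attach such metadata to shared imaging data is a BIDS-style participants table with a JSON sidecar describing its columns. The species, columns, and values below are illustrative assumptions, not a prescribed metadata set.

    # Minimal sketch of sharing subject-level metadata in a BIDS-style layout.
    import csv, json
    from pathlib import Path

    dataset = Path("primate_dataset")
    dataset.mkdir(exist_ok=True)

    rows = [
        {"participant_id": "sub-01", "species": "macaca mulatta",
         "age": "6", "sex": "M", "weight": "8.2"},
        {"participant_id": "sub-02", "species": "macaca mulatta",
         "age": "9", "sex": "F", "weight": "6.5"},
    ]
    with open(dataset / "participants.tsv", "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=rows[0].keys(), delimiter="\t")
        writer.writeheader()
        writer.writerows(rows)

    # Column descriptions (and units) travel with the data in a JSON sidecar.
    sidecar = {
        "age": {"Description": "Age at first scan", "Units": "years"},
        "weight": {"Description": "Body weight at first scan", "Units": "kg"},
    }
    (dataset / "participants.json").write_text(json.dumps(sidecar, indent=2))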
Comparison of Classifiers for Decoding Sensory and Cognitive Information from Prefrontal Neuronal Populations
Decoding neuronal information is important in neuroscience, both as a basic means to understand how neuronal activity is related to cerebral function and as a processing stage in driving neuroprosthetic effectors. Here, we compare the readout performance of six commonly used classifiers at decoding two different variables encoded by the spiking activity of the non-human primate frontal eye fields (FEF): the spatial position of a visual cue, and the instructed orientation of the animal's attention. While the first variable is exogenously driven by the environment, the second variable corresponds to the interpretation of the instruction conveyed by the cue; it is endogenously driven and corresponds to the output of internal cognitive operations performed on the visual attributes of the cue. These two variables were decoded using either a regularized optimal linear estimator in its explicit formulation, an optimal linear artificial neural network estimator, a non-linear artificial neural network estimator, a non-linear naïve Bayesian estimator, a non-linear Reservoir recurrent network classifier or a non-linear Support Vector Machine classifier. Our results suggest that endogenous information such as the orientation of attention can be decoded from the FEF with the same accuracy as exogenous visual information. However, the classifiers did not all behave equally in the face of population size and heterogeneity, the number of available training and testing trials, the subject's behavior, and the temporal structure of the variable of interest. In most situations, the regularized optimal linear estimator and the non-linear Support Vector Machine classifiers outperformed the other tested decoders.
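A classifier comparison of this kind can be sketched with standard tools. The simulated spike counts, the particular scikit-learn models, and the cross-validation settings below are assumptions; the original work also evaluates reservoir and neural-network decoders not reproduced here.

    # Minimal sketch: compare decoders on simulated pseudo-population
    # spike-count data with a discrete class per trial (e.g. cue position).
    import numpy as np
    from sklearn.linear_model import RidgeClassifier
    from sklearn.naive_bayes import GaussianNB
    from sklearn.svm import SVC
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(0)
    n_trials, n_neurons, n_positions = 400, 60, 4

    # Simulated spike counts: each cue position shifts a subset of neurons.
    y = rng.integers(0, n_positions, n_trials)
    tuning = rng.normal(0, 1.0, (n_positions, n_neurons))
    X = rng.poisson(5, (n_trials, n_neurons)) + 2.0 * tuning[y]

    decoders = {
        "regularized linear": make_pipeline(StandardScaler(), RidgeClassifier(alpha=1.0)),
        "naive Bayes": make_pipeline(StandardScaler(), GaussianNB()),
        "SVM (RBF)": make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0)),
    }
    for name, model in decoders.items():
        scores = cross_val_score(model, X, y, cv=5)
        print(f"{name:>20}: {scores.mean():.2f} +/- {scores.std():.2f}")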
Socially meaningful visual context either enhances or inhibits vocalisation processing in the macaque brain
Social interactions rely on the interpretation of semantic and emotional information, often from multiple sensory modalities. Nonhuman primates send and receive auditory and visual communicative signals. However, the neural mechanisms underlying the association of visual and auditory information based on their common social meaning are unknown. Using heart rate estimates and functional neuroimaging, we show that in the lateral and superior temporal sulcus of the macaque monkey, neural responses are enhanced in response to species-specific vocalisations paired with a matching visual context, or when vocalisations follow, in time, visual information, but inhibited when vocalisations are incongruent with the visual context. For example, responses to affiliative vocalisations are enhanced when paired with affiliative contexts but inhibited when paired with aggressive or escape contexts. Overall, we propose that the identified neural network represents social meaning irrespective of sensory modality. Social interaction involves processing semantic and emotional information. Here the authors show that in the macaque monkey lateral and superior temporal sulcus, cortical activity is enhanced in response to species-specific vocalisations predicted by matching face or social visual stimuli but inhibited when vocalisations are incongruent with the predictive visual context.