Catalogue Search | MBRL
25 result(s) for "Las, Liora"
Vectorial representation of spatial goals in the hippocampus of bats
2017
To navigate, animals need to represent not only their own position and orientation, but also the location of their goal. Neural representations of an animal’s own position and orientation have been extensively studied. However, it is unknown how navigational goals are encoded in the brain. We recorded from hippocampal CA1 neurons of bats flying in complex trajectories toward a spatial goal. We discovered a subpopulation of neurons with angular tuning to the goal direction. Many of these neurons were tuned to an occluded goal, suggesting that goal-direction representation is memory-based. We also found cells that encoded the distance to the goal, often in conjunction with goal direction. The goal-direction and goal-distance signals make up a vectorial representation of spatial goals, suggesting a previously unrecognized neuronal mechanism for goal-directed navigation.
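The abstract describes goal-direction and goal-distance signals combining into a vectorial representation of the goal. As a toy illustration only (not the authors' analysis code; the coordinates, heading convention and function name are invented here), such an egocentric goal vector can be computed from an animal's position and heading:

```python
import math

def goal_vector(pos, heading_deg, goal):
    """Distance and egocentric direction (degrees) from an animal at `pos`
    with allocentric heading `heading_deg` to a goal at `goal`.
    The egocentric angle is wrapped to [-180, 180); 0 = goal straight ahead."""
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    distance = math.hypot(dx, dy)
    allocentric = math.degrees(math.atan2(dy, dx))
    egocentric = (allocentric - heading_deg + 180.0) % 360.0 - 180.0
    return distance, egocentric

# Bat at the origin heading east (0 deg), goal 3 m due north:
d, a = goal_vector((0.0, 0.0), 0.0, (0.0, 3.0))  # d = 3.0, a = 90.0 (to the left)
```

A goal-distance cell would then be tuned to `d` and a goal-direction cell to `a`, including when the goal itself is occluded.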
Journal Article
Social place-cells in the bat hippocampus
2018
Different sets of neurons encode the spatial position and orientation of an organism. However, social animals need to know the position of other individuals for social interactions, observational learning, and group navigation. Surprisingly, very little is known about how the position of other animals is represented in the brain. Danjo et al. and Omer et al. now report the discovery of a subgroup of neurons in hippocampal area CA1 that encodes the presence of conspecifics in rat and bat brains, respectively (Science, this issue p. 213, p. 218). A subpopulation of bat hippocampal CA1 neurons represents the spatial position of another bat. Social animals have to know the spatial positions of conspecifics. However, it is unknown how the position of others is represented in the brain. We designed a spatial observational-learning task, in which an observer bat mimicked a demonstrator bat while we recorded hippocampal dorsal-CA1 neurons from the observer bat. A neuronal subpopulation represented the position of the other bat, in allocentric coordinates. About half of these “social place-cells” also represented the observer’s own position; that is, they were place cells. The representation of the demonstrator bat did not reflect self-movement or trajectory planning by the observer. Some neurons also represented the position of inanimate moving objects; however, their representation differed from the representation of the demonstrator bat. This suggests a role for hippocampal CA1 neurons in social-spatial cognition.
Journal Article
Spatial cognition in bats and rats: from sensory acquisition to multiscale maps and navigation
by Ulanovsky, Nachum; Geva-Sagiv, Maya; Yovel, Yossi
in 631/378/116; 631/378/1595/1554; 631/378/1595/3922
2015
Key Points
Real-life navigation takes place in large natural environments that are complex and multiscaled. There is a fundamental gap between large-scale navigation and typical laboratory studies of the neurobiology of navigation, in which animals navigate in small boxes. This is a gap in spatial scales, in the richness of sensory information available, and in the techniques used and models proposed for describing small-scale versus large-scale navigation.
Spatial cells in the mammalian hippocampal formation include place cells, grid cells, head-direction cells and border (boundary) cells. It is unknown whether and how these neurons contribute to natural navigation on scales beyond a few meters.
Bats are excellent animal models for studying spatial cognition, because they are superb navigators, and possess multiple excellent sensory systems, including vision, olfaction and echolocation (biosonar). Bat echolocation is an active-sensing system with high accuracy and dynamic flexibility, which bears surprising similarities to rat whisking and sniffing, with the benefit that bat biosonar can be analysed quantitatively by using the mathematical theory of sonar.
Large-scale navigation is likely to be dominated by incoming sensory information rather than by self-motion cues (path integration). Two prominent sensory-based models of place fields, the boundary vector cell (BVC) model and the view-based model, both predict that sensory resolution determines the spatial resolution of place cells.
The multiscale character of natural habitats, from small rat burrows and bat caves to large kilometre-scale trajectories, suggests that neural spatial representation by place cells should adapt to this multiscale structure. Three phenomena described in the literature could serve as mechanisms for such multiscale representation: these include the dorsoventral gradient of place-field size in the hippocampus and dorsoventral gradient of grid-spacing in the medial entorhinal cortex, the dynamic adjustment of place-field size to environmental dimensions, and the enhanced precision of spatial representation using population coding.
Theoretical models have suggested several ways to bridge the gap between spatial representations in the 1-m boxes used in laboratory experiments and those needed for kilometre-scale natural environments. Models of place cells proposed that place fields could rescale according to environmental size, or that place cells can exhibit dozens or hundreds of place fields per neuron; for grid cells, models proposed a combinatorial grid code that could simultaneously represent both small-scale and large geographical environments.
Future directions include the need to understand which of these theoretical models best captures the actual neural basis of large-scale navigation, as well as the need to elucidate the neural mechanisms of route planning, re-orienting after losing one's path, and how different maps are stitched together. To answer these questions would require electrophysiological recordings from the hippocampal formation of animals navigating in large-scale, complex naturalistic environments.
Although we understand much about mechanisms of spatial navigation in the mammalian brain in the context of laboratory investigations, our knowledge of the neural bases of 'real-world' navigation is more limited. Ulanovsky and colleagues here describe how we can approach this problem through experimental research and theoretical models of large-scale navigation in bats and rats.
Spatial orientation and navigation rely on the acquisition of several types of sensory information. This information is then transformed into a neural code for space in the hippocampal formation through the activity of place cells, grid cells and head-direction cells. These spatial representations, in turn, are thought to guide long-range navigation. But how the representations encoded by these different cell types are integrated in the brain to form a neural 'map and compass' is largely unknown. Here, we discuss this problem in the context of spatial navigation by bats and rats. We review the experimental findings and theoretical models that provide insight into the mechanisms that link sensory systems to spatial representations and to large-scale natural navigation.
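One of the models mentioned in the Key Points, the combinatorial grid code, represents position through the phases of several grid modules with different spacings; the combination stays unambiguous over a range far larger than any single spacing. A minimal sketch of the idea under arbitrary illustrative module spacings (this is not the published model's code, and the brute-force decoder is only for demonstration):

```python
def grid_phases(x, spacings):
    """Phase (0..1) of position x within each grid module's spatial period."""
    return tuple((x % s) / s for s in spacings)

def decode(phases, spacings, max_range, step=0.01):
    """Brute-force decode: the position whose phases best match the code."""
    best_x, best_err = 0.0, float("inf")
    for i in range(int(max_range / step)):
        x = i * step
        err = 0.0
        for s, p in zip(spacings, phases):
            d = abs((x % s) / s - p)
            err += min(d, 1.0 - d)  # circular phase distance
        if err < best_err:
            best_x, best_err = x, err
    return best_x

spacings = (0.3, 0.4, 0.5)          # metres; unambiguous up to their LCM, 6.0 m
code = grid_phases(4.37, spacings)  # three sub-metre modules...
x_hat = decode(code, spacings, max_range=6.0)  # ...recover a position well beyond 0.5 m
```

The point of the sketch is the range extension: no single module spans more than 0.5 m, yet the joint phase code distinguishes positions over 6 m.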
Journal Article
Locally ordered representation of 3D space in the entorhinal cortex
by Burak, Yoram; Ulanovsky, Nachum; Ginosar, Gily
in 631/378/116; 631/378/1595/1554; 631/378/1595/2618
2021
As animals navigate on a two-dimensional surface, neurons in the medial entorhinal cortex (MEC) known as grid cells are activated when the animal passes through multiple locations (firing fields) arranged in a hexagonal lattice that tiles the locomotion surface [1]. However, although our world is three-dimensional, it is unclear how the MEC represents 3D space [2]. Here we recorded from MEC cells in freely flying bats and identified several classes of spatial neurons, including 3D border cells, 3D head-direction cells, and neurons with multiple 3D firing fields. Many of these multifield neurons were 3D grid cells, whose neighbouring fields were separated by a characteristic distance—forming a local order—but lacked any global lattice arrangement of the fields. Thus, whereas 2D grid cells form a global lattice—characterized by both local and global order—3D grid cells exhibited only local order, creating a locally ordered metric for space. We modelled grid cells as emerging from pairwise interactions between fields, which yielded a hexagonal lattice in 2D and local order in 3D, thereby describing both 2D and 3D grid cells using one unifying model. Together, these data and model illuminate the fundamental differences and similarities between neural codes for 3D and 2D space in the mammalian brain.
Recordings from the brains of freely flying bats show that grid cells that represent 3D space have multiple firing fields and are organized with local rather than global order.
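The unifying model in this abstract treats grid fields as emerging from pairwise interactions between fields with a characteristic separation. A generic toy version of that idea (not the authors' model; the spring-like potential and all parameters are invented): three interacting fields with preferred separation 1 relax by gradient descent into an equilateral, i.e. locally hexagonal, arrangement.

```python
import math

def relax_fields(points, d0=1.0, lr=0.05, steps=1000):
    """Gradient descent on the pairwise energy sum((|pi - pj| - d0)**2):
    every pair of firing fields prefers the characteristic distance d0."""
    pts = [list(p) for p in points]
    for _ in range(steps):
        grads = [[0.0, 0.0] for _ in pts]
        for i in range(len(pts)):
            for j in range(len(pts)):
                if i == j:
                    continue
                d = math.dist(pts[i], pts[j]) or 1e-9
                coef = 2.0 * (d - d0) / d
                for k in range(2):
                    grads[i][k] += coef * (pts[i][k] - pts[j][k])
        for i in range(len(pts)):
            for k in range(2):
                pts[i][k] -= lr * grads[i][k]
    return pts

fields = relax_fields([(0.0, 0.0), (1.0, 0.0), (0.2, 0.8)])
dists = [math.dist(fields[i], fields[j]) for i, j in ((0, 1), (0, 2), (1, 2))]
# all three separations converge toward the characteristic distance d0 = 1.0
```

With many fields and only local (neighbour) interactions, the same mechanism yields local but not global order, which is the qualitative contrast the abstract draws between 2D and 3D.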
Journal Article
Hippocampal global remapping for different sensory modalities in flying bats
by Ulanovsky, Nachum; Geva-Sagiv, Maya; Romani, Sandro
in 631/378/116/1925; 631/378/1595/1554; 631/378/1595/3922
2016
Hippocampal place cells encode the animal’s position within the environment. Using flying bats navigating either by vision or echolocation, the authors found that hippocampal spatial maps changed completely between vision and echolocation. This suggests the hippocampus does not contain a single abstract map for a given environment, but rather multiple maps for different sensory modalities.
Hippocampal place cells encode the animal's spatial position. However, it is unknown how different long-range sensory systems affect spatial representations. Here we alternated usage of vision and echolocation in Egyptian fruit bats while recording from single neurons in hippocampal areas CA1 and subiculum. Bats flew back and forth along a linear flight track, employing echolocation in darkness or vision in light. Hippocampal representations remapped between vision and echolocation via two kinds of remapping: subiculum neurons turned on or off, while CA1 neurons shifted their place fields. Interneurons also exhibited strong remapping. Finally, hippocampal place fields were sharper under vision than echolocation, matching the superior sensory resolution of vision over echolocation. Simulating several theoretical models of place cells suggested that combining sensory information and path integration best explains the experimental sharpening data. In summary, here we show sensory-based global remapping in a mammal, suggesting that the hippocampus does not contain an abstract spatial map but rather a 'cognitive atlas', with multiple maps for different sensory modalities.
Journal Article
Three-dimensional head-direction coding in the bat brain
2015
Navigation requires a sense of direction (‘compass’), which in mammals is thought to be provided by head-direction cells, neurons that discharge when the animal’s head points to a specific azimuth. However, it remains unclear whether a three-dimensional (3D) compass exists in the brain. Here we conducted neural recordings in bats, mammals well-adapted to 3D spatial behaviours, and found head-direction cells tuned to azimuth, pitch or roll, or to conjunctive combinations of 3D angles, in both crawling and flying bats. Head-direction cells were organized along a functional–anatomical gradient in the presubiculum, transitioning from 2D to 3D representations. In inverted bats, the azimuth-tuning of neurons shifted by 180°, suggesting that 3D head direction is represented in azimuth × pitch toroidal coordinates. Consistent with our toroidal model, pitch-cell tuning was unimodal, circular, and continuous within the available 360° of pitch. Taken together, these results demonstrate a 3D head-direction mechanism in mammals, which could support navigation in 3D space.
A study of freely moving bats provides new insights into how the brain encodes a three-dimensional neural compass; neurons were identified encoding the three Euler rotation angles of the head (azimuth, pitch, and roll) and recordings from these head-direction cells revealed a toroidal model of spatial orientation mapped out by cells tuned to two circular variables (azimuth × pitch).
Bats' mental compass
Most mammals can navigate complex environments thanks to an accurate neural representation of three-dimensional space involving the coordination of cells encoding space, distance, boundaries and head direction. Orientation via head-direction cells is a critical component of this navigation but little is known of the nature of this compass. In a study of freely moving Egyptian fruit bats, either in flight or crawling in search of food, Arseny Finkelstein et al. provide new insights into how the brain encodes its neural compass. Using neural recordings from the brain — specifically from the presubiculum — the authors identified neurons encoding the three Euler rotation angles of the head (azimuth, pitch and roll). Recordings from these head-direction cells revealed a toroidal model of spatial orientation mapped out by cells tuned to two circular variables — azimuth and pitch.
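The toroidal model described above treats azimuth and pitch as two independent circular variables, each spanning a full 360 degrees. A small sketch of what conjunctive tuning on such a torus could look like (a hypothetical cell; the von Mises tuning shape and its parameters are assumptions for illustration, not the paper's fitted model):

```python
import math

def von_mises(theta_deg, pref_deg, kappa=2.0):
    """Unimodal circular tuning: maximal at pref_deg, continuous across 0/360."""
    return math.exp(kappa * math.cos(math.radians(theta_deg - pref_deg)))

def torus_rate(az, pitch, pref_az, pref_pitch):
    """Firing rate of a conjunctive azimuth x pitch cell on the torus."""
    return von_mises(az, pref_az) * von_mises(pitch, pref_pitch)

# Pitch is a full circular variable here: -10 deg and 350 deg are the same
# head pose, and an inverted bat (pitch 180) is an ordinary point on the
# torus rather than a singularity, as it would be in spherical coordinates.
r_pref = torus_rate(30.0, 180.0, 30.0, 180.0)  # at the cell's preferred pose
r_off = torus_rate(30.0, 0.0, 30.0, 180.0)     # upright: far from preference
```

The continuity of the pitch variable across the full circle is the property that lets such a code represent inverted head poses smoothly.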
Journal Article
A role for descending auditory cortical projections in songbird vocal learning
by Mandelblat-Cerf, Yael; Denisenko, Natalia; Fee, Michale S
in Acoustic Stimulation; Animals; Auditory Cortex
2014
Many learned motor behaviors are acquired by comparing ongoing behavior with an internal representation of correct performance, rather than using an explicit external reward. For example, juvenile songbirds learn to sing by comparing their song with the memory of a tutor song. At present, the brain regions subserving song evaluation are not known. In this study, we report several findings suggesting that song evaluation involves an avian 'cortical' area previously shown to project to the dopaminergic midbrain and other downstream targets. We find that this ventral portion of the intermediate arcopallium (AIV) receives inputs from auditory cortical areas, and that lesions of AIV result in significant deficits in vocal learning. Additionally, AIV neurons exhibit fast responses to disruptive auditory feedback presented during singing, but not during nonsinging periods. Our findings suggest that auditory cortical areas may guide learning by transmitting song evaluation signals to the dopaminergic midbrain and/or other subcortical targets.
Most new skills, from playing a sport to learning a language, are acquired through a gradual process of trial and error. While some of this learning is driven by direct external rewards, such as praise, much of it occurs when the individual compares their current performance with their own impression of what a ‘correct’ performance should be. The way that the brain responds to external rewards is relatively well understood, but much less is known about the processes used by the brain to evaluate its own performance. One way to study this process is to examine how songbirds learn their songs. While in the nest, young male birds memorize another bird's song, usually that of their father. They learn to sing by comparing their own vocals with this memorized template, tweaking their song until the two versions match. Now, Mandelblat-Cerf et al. have identified a pathway in the brain that enables the birds to make this comparison and to use any discrepancies to improve their subsequent attempts.
Anatomical labeling experiments revealed that a brain structure called the arcopallium has a key role in this process. The ventral part of this structure (known as AIV) receives inputs from the auditory cortex—meaning that it has access to the bird’s own song—and then forms connections with a specific group of neurons in the midbrain. These midbrain neurons, which communicate using the chemical transmitter dopamine, project to brain regions that ultimately control the movements involved in singing. This means that the AIV is ideally positioned to be able to evaluate and then adjust the song as required. Consistent with this possibility, young zebra finches were less able to imitate a template song if their AIV was destroyed before they had started practicing. By contrast, destroying the AIV in adult birds who had already learned their song did not impair performance, indicating that the arcopallium circuit supports song learning rather than singing per se.
Finally, recordings of neurons in the AIV made during singing revealed that this brain area sends signals about discrepancies between what the young bird tries to sing and what he hears himself sing. In addition to providing further clues as to how the songbirds learn their songs, this work also highlights the fact that dopaminergic neurons in the midbrain—which are best known for being involved in our response to external rewards such as food and drugs—also contribute to learning that is driven internally.
Journal Article
Processing of low-probability sounds by cortical neurons
by Ulanovsky, Nachum; Nelken, Israel; Las, Liora
in Acoustic Stimulation; Action Potentials - physiology; Animal Genetics and Genomics
2003
The ability to detect rare auditory events can be critical for survival. We report here that neurons in cat primary auditory cortex (A1) responded more strongly to a rarely presented sound than to the same sound when it was common. For the rare stimuli, we used both frequency and amplitude deviants. Moreover, some A1 neurons showed hyperacuity for frequency deviants—a frequency resolution one order of magnitude better than receptive field widths in A1. In contrast, auditory thalamic neurons were insensitive to the probability of frequency deviants. These phenomena resulted from stimulus-specific adaptation in A1, which may be a single-neuron correlate of an extensively studied cortical potential—mismatch negativity—that is evoked by rare sounds. Our results thus indicate that A1 neurons, in addition to processing the acoustic features of sounds, may also be involved in sensory memory and novelty detection.
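Stimulus-specific adaptation of the kind described here can be caricatured in a few lines: each stimulus carries its own adaptation state, so a rarely presented deviant escapes the depression that the common standard accumulates. A toy model (invented parameters; not a fit to the cat A1 recordings):

```python
def ssa_responses(sequence, depress=0.4, recover=0.1):
    """Per-stimulus adaptation: responding to a stimulus depresses future
    responses to that same stimulus; unused stimuli recover toward 1."""
    state, responses = {}, []
    for stim in sequence:
        for s in state:                       # passive recovery each trial
            state[s] = min(1.0, state[s] + recover)
        r = state.get(stim, 1.0)              # response = current adaptation state
        responses.append(r)
        state[stim] = r * (1.0 - depress)     # stimulus-specific depression
    return responses

# Oddball sequence: 'std' is common (90%), 'dev' is the rare deviant (10%).
seq = (["std"] * 9 + ["dev"]) * 5
resp = ssa_responses(seq)
mean_std = sum(r for r, s in zip(resp, seq) if s == "std") / 45
mean_dev = sum(r for r, s in zip(resp, seq) if s == "dev") / 5
# the physically identical stimulus evokes a larger mean response when rare
```

Swapping which tone is common and which is rare reverses the effect, which is the signature that distinguishes stimulus-specific adaptation from overall fatigue.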
Journal Article