Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
656 result(s) for "Space perception Physiological aspects."
Brain Landscape
2009,2008
We know as architects that measuring human response to environmental stimuli will still require years of work. Neuroscience is beginning to provide an understanding of how the brain controls all of our bodily activities, and ultimately affects how we think, move, perceive, learn, and remember. In an address to the American Institute of Architects convention in 2003, “Rusty” Gage made the following observations, which set the core premise for this book: (1) the brain controls our behavior; (2) genes control the blueprints for the design and structure of the brain; (3) the environment can modulate the function of genes and, ultimately, the structure of the brain; (4) changes in the environment change the brain; (5) consequently, changes in the environment change our behavior; and (6) therefore, architectural design can change our brain and our behavior.
Vestibular Cognition
by
Ferrè, Elisa Raffaella
,
Harris, Laurence R
in
Cognitive neuroscience
,
Physiological aspects
,
Space perception
2017
In this volume, specific cognitive sub-functions are identified, and the ways in which basic vestibular input contributes to each are described. The broad range of these functions is consistent with the broad spread of vestibular projections throughout the cortex. Combining vestibular signals about the head's orientation relative to gravity with information about head position relative to the body provides sufficient information to map body position onto the ground surface and underlie the sense of spatial position. But vestibular signals are also fundamental to sensorimotor control and even to high-level bodily perception, such as the sense of body ownership and the anchoring of perspective to the body. Clinical observations confirm the essential role of vestibular signals in maintaining a coherent self-representation and suggest some novel rehabilitation strategies. The chapters presented in this volume were previously published in a Special Issue of Multisensory Research, Volume 28, Issue 5-6 (2015). Contributors are: M. Barnett-Cowan, O. Blanke, J. Blouin, G. Bosco, G. Bottini, J.-P. Bresciani, J.C. Culham, C.L. Darlington, A.W. Ellis, E.R. Ferrè, M. Gandola, L. Grabherr, S. Gravano, P. Grivaz, E. Guillaud, P. Haggard, L.R. Harris, A.E.N. Hoover, I. Indovina, K. Jáuregui Renaud, M. Kaliuzhna, F. Lacquaniti, B. Lenggenhager, C. Lopez, G. Macauda, V. Maffei, F.W. Mast, B. La Scaleia, B.M. Seemungal, M. Simoneau, P.F. Smith, J.C. Snow, D. Vibert, M. Zago, and Y. Zheng.
Spatial vision
by
De Valois, Russell L
,
De Valois, Karen K
in
Neuropsychology
,
Physiological aspects
,
Space perception
1990,1991,1988
This book presents an integrated view of how we perceive the spatial relations in our visual world, covering anatomical, physiological, psychophysical, and perceptual aspects. The book discusses the visual system primarily in terms of spatial frequency analysis using a linear systems approach. It reviews evidence supporting a local, patch-by-patch spatial frequency filtering of visual information rather than the global Fourier analysis other researchers have proposed. A separate chapter addresses the special issues surrounding color vision, and a brief, nonmathematical introduction to linear systems analysis is included.
Brain landscape : the coexistence of neuroscience and architecture
2009
Brain Landscape: The Coexistence of Neuroscience and Architecture is the first book to serve as an intellectual bridge between architectural practice and neuroscience research. John P. Eberhard, founding President of the non-profit Academy of Neuroscience for Architecture, argues that increased funding, and the ability to think beyond the norm, will lead to a better understanding of how scientific research can change how we design, illuminate, and build spaces. Conversely, he posits that the better we understand the effects that buildings and places have on us and our mental state, the better we may understand how the human brain works. This book is devoted to describing architectural design criteria for schools, offices, laboratories, memorials, churches, and facilities for the aging, and then posing hypotheses about human experiences in such settings.
Testing the role of spontaneous activity in visuospatial perception with patterned optogenetics
by
Olcese, Umberto
,
Benedetti, Davide
,
Fiorilli, Julien
in
Animals
,
Biology and Life Sciences
,
Consciousness
2025
A major debate in the field of consciousness pertains to whether neuronal activity or the causal structure of neural circuits underlies the generation of conscious experience. The former position is held by theoretical accounts of consciousness based on the predictive processing framework (such as neurorepresentationalism and active inference), while the latter is posited by the integrated information theory. This protocol describes an experiment, part of a larger adversarial collaboration, designed to address this question through a combination of behavioral tests in mice, functional imaging, patterned optogenetics, and electrophysiology. The experiment will directly test whether optogenetic inactivation of a portion of the visual cortex that does not respond to behaviorally relevant stimuli affects the perception of the spatial distribution of those stimuli, even when the inactivated neurons display no or very low spiking activity, so low that it induces no significant effect on other cortical areas. The results of the experiment will be compared against theoretical predictions and will provide a major contribution toward understanding the neuronal substrate of consciousness.
Journal Article
Vestibular stimulation and space-time interaction affect the perception of time during whole-body rotations
by
Kuldavletova, Olga
,
Laplanche, Alexis
,
Kola, Adéla
in
Adult
,
Amplitudes
,
Attention - physiology
2025
Among the factors known to distort time perception, such as emotion, vestibular stimulation causes a contraction of subjective time. Unlike emotions, the intensity of vestibular stimulation can be modified easily and precisely, making it possible to study the quantitative relationship between stimulation and its effect on time perception. We hypothesized that the contraction of subjective time would increase with the magnitude of vestibular stimulation. In the first experiment, participants sat on a rotatory chair and reproduced time intervals between the start and end of whole-body passive rotations (40° or 80°; dynamic condition) or between two consecutive low-amplitude shakes (static condition). We also assessed reaction time under the same conditions to evaluate the attentional effect of the stimuli. As expected, duration reproduction for the 40° rotation was shorter than in the static condition, but this effect was partly reversed for 80° rotations. In other words, vestibular stimulation shortens the perceived time interval, but this effect weakens with stronger stimulation. Attentional changes do not explain this unexpected result, as reaction time did not change between conditions. We hypothesized that the space-time interaction (i.e., spatially larger stimuli are perceived as lasting longer) could explain these findings. To assess this, in a second experiment participants underwent the same protocol with three rotation amplitudes (30°, 60°, and 120°). Duration reproductions were systematically shorter for the lower amplitudes than for the higher ones, so much so that for the highest amplitude (120°), the duration reproduction increased to the point that it did not differ from the static condition. Overall, the experiments show that whole-body rotation can contract subjective time, probably at the rather low level of the interval timing network, or dilate it, probably at a higher level via the space-time interaction.
Journal Article
Vestibular processing during natural self-motion: implications for perception and action
2019
How the brain computes accurate estimates of our self-motion relative to the world and our orientation relative to gravity in order to ensure accurate perception and motor control is a fundamental neuroscientific question. Recent experiments have revealed that the vestibular system encodes this information during everyday activities using pathway-specific neural representations. Furthermore, new findings have established that vestibular signals are selectively combined with extravestibular information at the earliest stages of central vestibular processing in a manner that depends on the current behavioural goal. These findings have important implications for our understanding of the brain mechanisms that ensure accurate perception and behaviour during everyday activities and for our understanding of disorders of vestibular processing. In addition to ensuring stable gaze and posture, the vestibular system contributes to the perception of self-motion and orientation. In this Review, Cullen provides a comprehensive overview of recent advances in our understanding of sensory encoding and integration in the vestibular pathways.
Journal Article
Using space and time to encode vibrotactile information: toward an estimate of the skin’s achievable throughput
2015
Touch receptors in the skin can relay various forms of abstract information, such as words (Braille), haptic feedback (cell phones, game controllers, feedback for prosthetic control), and basic visual information such as edges and shape (sensory substitution devices). The skin can support such applications with ease: they are all low bandwidth and do not require fine temporal acuity. But what of high-throughput applications? We use sound-to-touch conversion as a motivating example, though others abound (e.g., vision, stock market data). In the past, vibrotactile hearing aids have demonstrated improvement in speech perception in the deaf. However, a sound-to-touch sensory substitution device that works with high efficacy and without the aid of lipreading has yet to be developed. Is this because skin simply does not have the capacity to effectively relay high-throughput streams such as sound? Or is this because the spatial and temporal properties of skin have not been leveraged to full advantage? Here, we begin to address these questions with two experiments. First, we seek to determine the best method of relaying information through the skin using an identification task on the lower back. We find that vibrotactile patterns encoding information in both space and time yield the best overall information transfer estimate. Patterns encoded in space and time or "intensity" (the coupled coding of vibration frequency and force) both far exceed the performance of only spatially encoded patterns. Next, we determine the vibrotactile two-tacton resolution on the lower back—the distance necessary for resolving two vibrotactile patterns. We find that our vibratory motors conservatively require at least 6 cm of separation to resolve two independent tactile patterns (>80% correct), regardless of stimulus type (e.g., spatiotemporal "sweeps" versus single vibratory pulses).
Six centimeters is a greater distance than the inter-motor distances used in Experiment 1 (2.5 cm), which explains the poor identification performance of spatially encoded patterns. Hence, when using an array of vibrational motors, spatiotemporal sweeps can overcome the limitations of vibrotactile two-tacton resolution. The results provide the first steps toward obtaining a realistic estimate of the skin's achievable throughput, illustrating the best ways to encode data to the skin (using as many dimensions as possible) and how far such interfaces would need to be separated if using multiple arrays in parallel.
Journal Article
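The "information transfer estimate" reported in the vibrotactile study above is conventionally computed from a stimulus-response confusion matrix. A minimal sketch, assuming the standard maximum-likelihood estimator commonly used in tactile identification work; the function name and example counts are illustrative, not taken from the paper:

```python
import math

def information_transfer(confusion):
    """Estimate information transfer (bits) from a confusion matrix
    whose rows are stimuli and columns are responses (raw counts)."""
    n = sum(sum(row) for row in confusion)
    row_sums = [sum(row) for row in confusion]
    col_sums = [sum(col) for col in zip(*confusion)]
    it = 0.0
    for i, row in enumerate(confusion):
        for j, n_ij in enumerate(row):
            if n_ij > 0:
                # joint probability times log-ratio of joint to product of marginals
                it += (n_ij / n) * math.log2(n_ij * n / (row_sums[i] * col_sums[j]))
    return it

# Perfect identification of 4 equally likely patterns yields log2(4) = 2 bits
perfect = [[5, 0, 0, 0], [0, 5, 0, 0], [0, 0, 5, 0], [0, 0, 0, 5]]
print(information_transfer(perfect))  # 2.0
```

Note that this naive estimator is upward-biased for small trial counts, which is why identification studies typically collect many trials per stimulus or apply a bias correction.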
Biases in Visual, Auditory, and Audiovisual Perception of Space
by
Wozny, David R.
,
Odegaard, Brian
,
Shams, Ladan
in
Accuracy
,
Acoustic Stimulation - methods
,
Adolescent
2015
Localization of objects and events in the environment is critical for survival, as many perceptual and motor tasks rely on estimation of spatial location. Therefore, it seems reasonable to assume that spatial localizations should generally be accurate. Curiously, some previous studies have reported biases in visual and auditory localizations, but these studies have used small sample sizes and the results have been mixed. Therefore, it is not clear (1) if the reported biases in localization responses are real (or due to outliers, sampling bias, or other factors), and (2) whether these putative biases reflect a bias in sensory representations of space or a priori expectations (which may be due to the experimental setup, instructions, or distribution of stimuli). Here, to address these questions, a dataset of unprecedented size (obtained from 384 observers) was analyzed to examine presence, direction, and magnitude of sensory biases, and quantitative computational modeling was used to probe the underlying mechanism(s) driving these effects. Data revealed that, on average, observers were biased towards the center when localizing visual stimuli, and biased towards the periphery when localizing auditory stimuli. Moreover, quantitative analysis using a Bayesian Causal Inference framework suggests that while pre-existing spatial biases for central locations exert some influence, biases in the sensory representations of both visual and auditory space are necessary to fully explain the behavioral data. How are these opposing visual and auditory biases reconciled in conditions in which both auditory and visual stimuli are produced by a single event? Potentially, the bias in one modality could dominate, or the biases could interact/cancel out. The data revealed that when integration occurred in these conditions, the visual bias dominated, but the magnitude of this bias was reduced compared to unisensory conditions. 
Therefore, multisensory integration not only improves the precision of perceptual estimates but also their accuracy.
Journal Article
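The Bayesian Causal Inference framework invoked in the audiovisual localization abstract above hinges on computing the posterior probability that a visual and an auditory cue arose from a single source. A minimal sketch, assuming the standard Gaussian formulation with a zero-mean spatial prior; the function name, parameter names, and default values are illustrative, not the authors' implementation:

```python
import math

def posterior_common(x_v, x_a, sigma_v, sigma_a, sigma_p=10.0, p_common=0.5):
    """Posterior probability that visual (x_v) and auditory (x_a) cues share
    one cause, given Gaussian sensory noise (sigma_v, sigma_a), a zero-mean
    Gaussian spatial prior (sigma_p), and a prior probability of a common
    cause (p_common)."""
    # Likelihood of both cues under a single common source (closed form,
    # after integrating out the unknown source position)
    var_c1 = (sigma_v**2 * sigma_a**2
              + sigma_v**2 * sigma_p**2
              + sigma_a**2 * sigma_p**2)
    like_c1 = math.exp(-0.5 * ((x_v - x_a)**2 * sigma_p**2
                               + x_v**2 * sigma_a**2
                               + x_a**2 * sigma_v**2) / var_c1) \
        / (2 * math.pi * math.sqrt(var_c1))
    # Likelihood under two independent sources: each cue marginalized
    # against the prior separately
    var_v = sigma_v**2 + sigma_p**2
    var_a = sigma_a**2 + sigma_p**2
    like_c2 = math.exp(-0.5 * (x_v**2 / var_v + x_a**2 / var_a)) \
        / (2 * math.pi * math.sqrt(var_v * var_a))
    # Bayes' rule over the two causal structures
    return like_c1 * p_common / (like_c1 * p_common + like_c2 * (1 - p_common))

# Nearby cues favour a common cause; far-apart cues favour separate causes
print(posterior_common(0.0, 0.0, 2.0, 4.0) > 0.5)     # True
print(posterior_common(20.0, -20.0, 2.0, 4.0) < 0.5)  # True
```

In this scheme, the final location estimate is typically a weighted average of the fused (common-cause) and unisensory estimates, weighted by this posterior, which is how a dominant but attenuated visual bias can emerge under integration.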