Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
25 result(s) for "Dyer, Blake"
Sea-level trends across The Bahamas constrain peak last interglacial ice melt
by D’Andrea, William J.; Rovere, Alessio; Sandstrom, Michael R.
in Age determination; Archipelagoes; Chronology
2021
During the last interglacial (LIG) period, global mean sea level (GMSL) was higher than at present, likely driven by greater high-latitude insolation. Past sea-level estimates require elevation measurements and age determination of marine sediments that formed at or near sea level, and those elevations must be corrected for glacial isostatic adjustment (GIA). However, this GIA correction is subject to uncertainties in the GIA model inputs, namely, Earth’s rheology and past ice history, which reduces precision and accuracy in estimates of past GMSL. To better constrain the GIA process, we compare our data and existing LIG sea-level data across the Bahamian archipelago with a suite of 576 GIA model predictions. We calculated weights for each GIA model based on how well the model fits spatial trends in the regional sea-level data and then used the weighted GIA corrections to revise estimates of GMSL during the LIG. During the LIG, we find a 95% probability that global sea level peaked at least 1.2 m higher than today, and it is very unlikely (5% probability) to have exceeded 5.3 m. Estimates increase by up to 30% (decrease by up to 20%) for portions of melt that originate from the Greenland ice sheet (West Antarctic ice sheet). Altogether, this work suggests that LIG GMSL may be lower than previously assumed.
Journal Article
Giant boulders and Last Interglacial storm intensity in the North Atlantic
2017
As global climate warms and sea level rises, coastal areas will be subject to more frequent extreme flooding and hurricanes. Geologic evidence for extreme coastal storms during past warm periods has the potential to provide fundamental insights into their future intensity. Recent studies argue that during the Last Interglacial (MIS 5e, ∼128–116 ka) tropical and extratropical North Atlantic cyclones may have been more intense than at present, and may have produced waves larger than those observed historically. Such strong swells are inferred to have created a number of geologic features that can be observed today along the coastlines of Bermuda and the Bahamas. In this paper, we investigate the most iconic among these features: massive boulders atop a cliff in North Eleuthera, Bahamas. We combine geologic field surveys, wave models, and boulder transport equations to test the hypothesis that such boulders must have been emplaced by storms of greater-than-historical intensity. By contrast, our results suggest that with the higher relative sea level (RSL) estimated for the Bahamas during MIS 5e, boulders of this size could have been transported by waves generated by storms of historical intensity. Thus, while the megaboulders of Eleuthera cannot be used as geologic proof for past “superstorms,” they do show that with rising sea levels, cliffs and coastal barriers will be subject to significantly greater erosional energy, even without changes in storm intensity.
Journal Article
A Bayesian framework for inferring regional and global change from stratigraphic proxy records (StratMC v1.0)
2025
The chemistry of ancient sedimentary rocks encodes information about past climate, element cycling, and biological innovations. Records of large-scale Earth system change are constructed by piecing together geochemical proxy data from many different stratigraphic sections, each of which may be incomplete, time-uncertain, biased by local processes, and diagenetically altered. Accurately reconstructing past Earth system change thus requires correctly correlating sections from different locations, distinguishing between global and local changes in proxy values, and converting stratigraphic height to absolute time. Incomplete consideration of the uncertainties associated with each of these challenging tasks can lead to biased and inaccurate estimates of the magnitude, duration, and rate of past Earth system change. Here, we address this shortcoming by developing a Bayesian statistical framework for inferring the common proxy signal recorded by multiple stratigraphic sections. Using the principle of stratigraphic superposition and both absolute and relative age constraints, the model simultaneously correlates all stratigraphic sections, builds an age model for each section, and untangles global and local signals for one or more proxies. Synthetic experiments confirm that the model can correctly recover proxy signals from incomplete, noisy, and biased stratigraphic observations. Future applications of the model to the geologic record will enable geoscientists to more accurately pose and test hypotheses for the drivers of past proxy perturbations, generating new insights into Earth's history. The model is available as an open-source Python package (StratMC), which provides a flexible and user-friendly framework for studying different times and proxies recorded in sediments.
Journal Article
The origin of carbonate mud and implications for global climate
2022
Carbonate mud represents one of the most important geochemical archives for reconstructing ancient climatic, environmental, and evolutionary change from the rock record. Mud also represents a major sink in the global carbon cycle. Yet, there remains no consensus about how and where carbonate mud is formed. Here, we present stable isotope and trace-element data from carbonate constituents in the Bahamas, including ooids, corals, foraminifera, and algae. We use geochemical fingerprinting to demonstrate that carbonate mud cannot be sourced from the abrasion and mixture of any combination of these macroscopic grains. Instead, an inverse Bayesian mixing model requires the presence of an additional aragonite source. We posit that this source represents a direct seawater precipitate. We use geological and geochemical data to show that “whitings” are unlikely to be the dominant source of this precipitate and, instead, present a model for mud precipitation on the bank margins that can explain the geographical distribution, clumped-isotope thermometry, and stable isotope signature of carbonate mud. Next, we address the enigma of why mud and ooids are so abundant in the Bahamas, yet so rare in the rest of the world: Mediterranean outflow feeds the Bahamas with the most alkaline waters in the modern ocean (>99.7th percentile). Such high alkalinity appears to be a prerequisite for the nonskeletal carbonate factory because, when Mediterranean outflow was reduced in the Miocene, Bahamian carbonate export ceased for 3 million years. Finally, we show how shutting off and turning on the shallow carbonate factory can send ripples through the global climate system.
Journal Article
REPLY TO HEARTY AND TORMEY
by Rovere, Alessio; Lorscheid, Thomas; D’Andrea, William J.
in Earth, Atmospheric, and Planetary Sciences; Letters
2018
Journal Article
Stratigraphic expression and numerical modeling of meteoric diagenesis in carbonate platforms during the Late Paleozoic Ice Age
2015
The history of life on Earth is intricately tied to the coevolution of the biosphere, atmosphere, and lithosphere over billions of years. Ancient sediments are the fragmented historical record of the interactions among these systems. The Late Paleozoic Ice Age (LPIA) is an interval of extreme climate change and variability that is expressed in the physical and chemical stratigraphy of tropical sediments. The focus of this thesis is refining strategies to extract information about global and local sea level from carbonate-rich sedimentary basins. The second chapter explores the impact sea level had on the stratigraphic expression of carbonate cycles from the late Pennsylvanian. These carbonate cycles are classically interpreted as the sedimentary response to Milankovitch-style orbital forcing of climate in the Late Paleozoic, but the lateral synthesis of sedimentary facies and their carbon isotopic values suggests that sea level change was a minor component of sedimentary expression in the basin. Therefore, late Pennsylvanian ice sheets were relatively stable and not responding rapidly to changes in orbital forcing. The third chapter investigates a globally expressed sedimentary unconformity near the middle Carboniferous boundary. Glacial expansion and subsequent sea level fall resulted in a sedimentary hiatus and meteoric diagenesis of the carbon isotopes in the exposed carbonates. The observations of negative carbon isotopes in the carbonate platforms motivate an exploration of the impact on the global carbon cycle and suggest that the δ13C of the ocean may have been elevated during glacioeustasy of the LPIA. This result offers a much-needed improvement on global biogeochemical models that have struggled to provide a congruent solution to the high δ13C of the LPIA. The final chapter provides numerical methods for interpreting superimposed seawater and meteoric diagenetic isotopic signals in the stratigraphy. The merger of these numerical methods with the carbon and calcium isotopic excursions beneath the middle Carboniferous unconformity offers insight into the processes by which seawater chemistry, carbonate weathering, meteoric diagenesis, and local platform hydrology contribute to the composite stratigraphic record.
Dissertation
Clinical course, costs and predictive factors for response to treatment in carpal tunnel syndrome: the PALMS study protocol
by Blake, Julian; Dyer, Tony; Jerosch-Herold, Christina
in Biomechanical Phenomena; Carpal tunnel syndrome; Carpal Tunnel Syndrome - diagnosis
2014
Background
Carpal tunnel syndrome (CTS) is the most common neuropathy of the upper limb and a significant contributor to hand functional impairment and disability. Effective treatment options include conservative and surgical interventions; however, it is not possible at present to predict the outcome of treatment. The primary aim of this study is to identify which baseline clinical factors predict a good outcome from conservative treatment (by injection) or surgery in patients diagnosed with carpal tunnel syndrome. Secondary aims are to describe the clinical course and progression of CTS, and to describe and predict the UK cost of CTS to the individual, the National Health Service (NHS) and society over a two-year period.
Methods/Design
In this prospective observational cohort study, patients presenting with clinical signs and symptoms typical of CTS, and in whom the diagnosis is confirmed by nerve conduction studies, are invited to participate. Data on putative predictive factors are collected at baseline and follow-up through patient questionnaires and include standardised measures of symptom severity, hand function, psychological and physical health, comorbidity and quality of life. Resource use and cost over the two-year period, such as prescribed medications and NHS and private healthcare contacts, are also collected through patient self-report at 6, 12, 18 and 24 months. The primary outcome used to classify treatment success or failure will be a 5-point global assessment of change. Secondary outcomes include changes in clinical symptoms, functioning, psychological health, quality of life and resource use. A multivariable model of factors which predict outcome and cost will be developed.
Discussion
This prospective cohort study will provide important data on the clinical course and UK costs of CTS over a two-year period and begin to identify predictive factors for treatment success from conservative and surgical interventions.
Journal Article
Know Thyself by Knowing Others: Learning Neuron Identity from Population Context
2025
Neurons process information in ways that depend on their cell type, connectivity, and the brain region in which they are embedded. However, inferring these factors from neural activity remains a significant challenge. To build general-purpose representations that allow for resolving information about a neuron's identity, we introduce NuCLR, a self-supervised framework that aims to learn representations of neural activity that allow for differentiating one neuron from the rest. NuCLR brings together views of the same neuron observed at different times and across different stimuli and uses a contrastive objective to pull these representations together. To capture population context without assuming any fixed neuron ordering, we build a spatiotemporal transformer that integrates activity in a permutation-equivariant manner. Across multiple electrophysiology and calcium imaging datasets, a linear decoding evaluation on top of NuCLR representations achieves a new state-of-the-art for both cell type and brain region decoding tasks, and demonstrates strong zero-shot generalization to unseen animals. We present the first systematic scaling analysis for neuron-level representation learning, showing that increasing the number of animals used during pretraining consistently improves downstream performance. The learned representations are also label-efficient, requiring only a small fraction of labeled samples to achieve competitive performance. These results highlight how large, diverse neural datasets enable models to recover information about neuron identity that generalizes across animals. Code is available at https://github.com/nerdslab/nuclr.
Journal Article
A Unified, Scalable Framework for Neural Population Decoding
2023
Our ability to use deep learning approaches to decipher neural activity would likely benefit from greater scale, in terms of both model size and datasets. However, the integration of many neural recordings into one unified model is challenging, as each recording contains the activity of different neurons from different individual animals. In this paper, we introduce a training framework and architecture designed to model the population dynamics of neural activity across diverse, large-scale neural recordings. Our method first tokenizes individual spikes within the dataset to build an efficient representation of neural events that captures the fine temporal structure of neural activity. We then employ cross-attention and a PerceiverIO backbone to further construct a latent tokenization of neural population activities. Utilizing this architecture and training framework, we construct a large-scale multi-session model trained on large datasets from seven nonhuman primates, spanning over 158 different sessions of recording from over 27,373 neural units and over 100 hours of recordings. In a number of different tasks, we demonstrate that our pretrained model can be rapidly adapted to new, unseen sessions with unspecified neuron correspondence, enabling few-shot performance with minimal labels. This work presents a powerful new approach for building deep learning tools to analyze neural data and stakes out a clear path to training at scale.
Journal Article
Towards a "universal translator" for neural dynamics at single-cell, single-spike resolution
2024
Neuroscience research has made immense progress over the last decade, but our understanding of the brain remains fragmented and piecemeal: the dream of probing an arbitrary brain region and automatically reading out the information encoded in its neural activity remains out of reach. In this work, we build towards a first foundation model for neural spiking data that can solve a diverse set of tasks across multiple brain areas. We introduce a novel self-supervised modeling approach for population activity in which the model alternates between masking out and reconstructing neural activity across different time steps, neurons, and brain regions. To evaluate our approach, we design unsupervised and supervised prediction tasks using the International Brain Laboratory repeated site dataset, which comprises Neuropixels recordings targeting the same brain locations across 48 animals and experimental sessions. The prediction tasks include single-neuron and region-level activity prediction, forward prediction, and behavior decoding. We demonstrate that our multi-task-masking (MtM) approach significantly improves the performance of current state-of-the-art population models and enables multi-task learning. We also show that by training on multiple animals, we can improve the generalization ability of the model to unseen animals, paving the way for a foundation model of the brain at single-cell, single-spike resolution. Project page and code: https://ibl-mtm.github.io/.
Journal Article