Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
14 result(s) for "Gauthier, Jeff L"
Volumetric two-photon imaging of neurons using stereoscopy (vTwINS)
by Gauthier, Jeff L; Thiberge, Stephan Y; Charles, Adam S
in 631/1647/245/2226; 631/1647/328/2057; 631/1647/328/2235
2017
vTwINS enables high-speed volumetric calcium imaging via a V-shaped point spread function and a dedicated data-processing algorithm. Song et al. apply this strategy to image population activity in the mouse visual cortex and hippocampus.
Two-photon laser scanning microscopy of calcium dynamics using fluorescent indicators is a widely used imaging method for large-scale recording of neural activity in vivo. Here, we introduce volumetric two-photon imaging of neurons using stereoscopy (vTwINS), a volumetric calcium imaging method that uses an elongated, V-shaped point spread function to image a 3D brain volume. Single neurons project to spatially displaced 'image pairs' in the resulting 2D image, and the separation distance between projections is proportional to depth in the volume. To demix the fluorescence time series of individual neurons, we introduce a modified orthogonal matching pursuit algorithm that also infers source locations within the 3D volume. We illustrate vTwINS by imaging neural population activity in the mouse primary visual cortex and hippocampus. Our results demonstrate that vTwINS provides an effective method for volumetric two-photon calcium imaging that increases the number of neurons recorded while maintaining a high frame rate.
Journal Article
Detecting and correcting false transients in calcium imaging
by Nieh, Edward H.; Tank, David W.; Charles, Adam S.
in 631/1647/334/1874/345; 631/1647/794; 631/378/116
2022
Population recordings of calcium activity are a major source of insight into neural function. Large datasets require automated processing, but this can introduce errors that are difficult to detect. Here we show that popular time course-estimation algorithms often contain substantial misattribution errors affecting 10–20% of transients. Misattribution, in which fluorescence is ascribed to the wrong cell, arises when overlapping cells and processes are imperfectly defined or not identified. To diagnose misattribution, we develop metrics and visualization tools for evaluating large datasets. To correct time courses, we introduce a robust estimator that explicitly accounts for contaminating signals. In one hippocampal dataset, removing contamination reduced the number of place cells by 15%, and 19% of place fields shifted by over 10 cm. Our methods are compatible with other cell-finding techniques, empowering users to diagnose and correct a potentially widespread problem that could alter scientific conclusions.
SEUDO is a tool for detecting and correcting errors introduced by automated processing of calcium imaging data.
Journal Article
Volumetric Two-photon Imaging of Neurons Using Stereoscopy (vTwINS)
by Gauthier, Jeff L; Thiberge, Stephan Y; Charles, Adam S
in Calcium imaging; Calcium signalling; Confocal microscopy
2016
Two-photon laser scanning microscopy of calcium dynamics using fluorescent indicators is a widely used imaging method for large-scale recording of neural activity in vivo. Here we introduce volumetric Two-photon Imaging of Neurons using Stereoscopy (vTwINS), a volumetric calcium imaging method that employs an elongated, V-shaped point spread function to image a 3D brain volume. Single neurons project to spatially displaced "image pairs" in the resulting 2D image, and the separation distance between images is proportional to depth in the volume. To demix the fluorescence time series of individual neurons, we introduce a novel orthogonal matching pursuit algorithm that also infers source locations within the 3D volume. We illustrate vTwINS by imaging neural population activity in mouse primary visual cortex and hippocampus. Our results demonstrate that vTwINS provides an effective method for volumetric two-photon calcium imaging that increases the number of neurons recorded while maintaining a high frame rate.
Neural Anatomy and Optical Microscopy (NAOMi) Simulation for evaluating calcium imaging methods
2019
The past decade has seen a multitude of new in vivo functional imaging methodologies. However, the lack of ground-truth comparisons or evaluation metrics makes large-scale, systematic validation impossible. Here we provide a new framework for evaluating TPM methods via in silico Neural Anatomy and Optical Microscopy (NAOMi) simulation. Our computationally efficient model generates large anatomical volumes of mouse cortex, simulates neural activity, and incorporates optical propagation and scanning to create realistic calcium imaging datasets. We verify NAOMi simulations against in vivo two-photon recordings from mouse cortex. We leverage this access to in silico ground truth to perform direct comparisons between different segmentation algorithms and optical designs. We find modern segmentation algorithms extract strong neural time courses comparable to estimation using oracle spatial information, but with an increase in the false positive rate. Comparisons between optical setups demonstrate improved resilience to motion artifacts in sparsely labeled samples using Bessel beams, increased signal-to-noise ratio and cell count using low numerical aperture Gaussian beams and nuclear GCaMP, and more uniform spatial sampling with temporal focusing versus multi-plane imaging. Overall, by leveraging the rich accumulated knowledge of neural anatomy and optical physics, we provide a powerful new tool to assess and develop important methods in neural imaging.
Detecting and Correcting False Transients in Calcium Imaging
by Pillow, Jonathan W; Gauthier, Jeffrey L; Charles, Adam S
in Automation; Calcium imaging; Neuroscience
2018
Population recordings of calcium activity are a major source of insight into neural function. Large dataset sizes often require automated methods, but automation can introduce errors that are difficult to detect. Here we show that automatic time course estimation can sometimes lead to significant misattribution errors, in which fluorescence is ascribed to the wrong cell. Misattribution arises when the shapes of overlapping cells are imperfectly defined, or when entire cells or processes are not identified, and misattribution can even be produced by methods specifically designed to handle overlap. To diagnose this problem, we develop a transient-by-transient metric and a visualization tool that allow users to quickly assess the degree of misattribution in large populations. To filter out misattribution, we also design a robust estimator that explicitly accounts for contaminating signals in a generative model. Our methods can be combined with essentially any cell finding technique, empowering users to diagnose and correct at large scale a problem that has the potential to significantly alter scientific conclusions.
A genomic mutational constraint map using variation in 76,156 human genomes
by Wilson, Michael W.; Ferriera, Steven; O’Donnell-Luria, Anne
in 631/114; 631/181/2474; 631/181/457/649
2024
The depletion of disruptive variation caused by purifying natural selection (constraint) has been widely used to investigate protein-coding genes underlying human disorders [1–4], but attempts to assess constraint for non-protein-coding regions have proved more difficult. Here we aggregate, process and release a dataset of 76,156 human genomes from the Genome Aggregation Database (gnomAD)—the largest public open-access human genome allele frequency reference dataset—and use it to build a genomic constraint map for the whole genome (genomic non-coding constraint of haploinsufficient variation (Gnocchi)). We present a refined mutational model that incorporates local sequence context and regional genomic features to detect depletions of variation. As expected, the average constraint for protein-coding sequences is stronger than that for non-coding regions. Within the non-coding genome, constrained regions are enriched for known regulatory elements and variants that are implicated in complex human diseases and traits, facilitating the triangulation of biological annotation, disease association and natural selection to non-coding DNA analysis. More constrained regulatory elements tend to regulate more constrained protein-coding genes, which in turn suggests that non-coding constraint can aid the identification of constrained genes that are as yet unrecognized by current gene constraint metrics. We demonstrate that this genome-wide constraint map improves the identification and interpretation of functional human genetic variation.
A genomic constraint map for the human genome constructed using data from 76,156 human genomes from the Genome Aggregation Database shows that non-coding constrained regions are enriched for regulatory elements and variants associated with complex diseases and traits.
Journal Article
Evaluating drug targets through human loss-of-function genetic variation
2020
Naturally occurring human genetic variants that are predicted to inactivate protein-coding genes provide an in vivo model of human gene inactivation that complements knockout studies in cells and model organisms. Here we report three key findings regarding the assessment of candidate drug targets using human loss-of-function variants. First, even essential genes, in which loss-of-function variants are not tolerated, can be highly successful as targets of inhibitory drugs. Second, in most genes, loss-of-function variants are sufficiently rare that genotype-based ascertainment of homozygous or compound heterozygous ‘knockout’ humans will await sample sizes that are approximately 1,000 times those presently available, unless recruitment focuses on consanguineous individuals. Third, automated variant annotation and filtering are powerful, but manual curation remains crucial for removing artefacts, and is a prerequisite for recall-by-genotype efforts. Our results provide a roadmap for human knockout studies and should guide the interpretation of loss-of-function variants in drug development.
Analysis of predicted loss-of-function variants from 125,748 human exomes and 15,708 whole genomes in the Genome Aggregation Database (gnomAD) provides a roadmap for human ‘knockout’ studies and a guide for future research into disease biology and drug-target selection.
Journal Article
The effect of LRRK2 loss-of-function variants in humans
2020
Human genetic variants predicted to cause loss-of-function of protein-coding genes (pLoF variants) provide natural in vivo models of human gene inactivation and can be valuable indicators of gene function and the potential toxicity of therapeutic inhibitors targeting these genes [1,2]. Gain-of-kinase-function variants in LRRK2 are known to significantly increase the risk of Parkinson’s disease [3,4], suggesting that inhibition of LRRK2 kinase activity is a promising therapeutic strategy. While preclinical studies in model organisms have raised some on-target toxicity concerns [5–8], the biological consequences of LRRK2 inhibition have not been well characterized in humans. Here, we systematically analyze pLoF variants in LRRK2 observed across 141,456 individuals sequenced in the Genome Aggregation Database (gnomAD) [9], 49,960 exome-sequenced individuals from the UK Biobank and over 4 million participants in the 23andMe genotyped dataset. After stringent variant curation, we identify 1,455 individuals with high-confidence pLoF variants in LRRK2. Experimental validation of three variants, combined with previous work [10], confirmed reduced protein levels in 82.5% of our cohort. We show that heterozygous pLoF variants in LRRK2 reduce LRRK2 protein levels but that these are not strongly associated with any specific phenotype or disease state. Our results demonstrate the value of large-scale genomic databases and phenotyping of human loss-of-function carriers for target validation in drug discovery.
Analysis of large genomic datasets, including gnomAD, reveals that partial LRRK2 loss of function is not strongly associated with diseases, serving as an example of how human genetics can be leveraged for target validation in drug discovery.
Journal Article
Inferring compound heterozygosity from large-scale exome sequencing data
2024
Recessive diseases arise when both copies of a gene are impacted by a damaging genetic variant. When a patient carries two potentially causal variants in a gene, accurate diagnosis requires determining that these variants occur on different copies of the chromosome (that is, are in trans) rather than on the same copy (that is, in cis). However, current approaches for determining phase, beyond parental testing, are limited in clinical settings. Here we developed a strategy for inferring phase for rare variant pairs within genes, leveraging genotypes observed in the Genome Aggregation Database (v2, n = 125,748 exomes). Our approach estimates phase with 96% accuracy, both in trio data and in patients with Mendelian conditions and presumed causal compound heterozygous variants. We provide a public resource of phasing estimates for coding variants and counts per gene of rare variants in trans that can aid interpretation of rare co-occurring variants in the context of recessive disease.
A strategy for inferring phase for rare variant pairs is applied to exome sequencing data for 125,748 individuals from the Genome Aggregation Database (gnomAD). This resource will aid interpretation of rare co-occurring variants in the context of recessive disease.
Journal Article
Drug repurposing for Alzheimer’s disease: a Delphi consensus and stakeholder consultation
by Gauthier, Serge; Aarsland, Dag; Noble, Wendy
in Advertising executives; Alzheimer Disease - drug therapy; Alzheimer's disease
2025
Background
Alzheimer’s disease (AD) is an escalating global challenge, with more than 40 million people affected, and this number is projected to increase to more than 100 million by 2050. While amyloid-targeting antibody treatments (lecanemab and donanemab) are a significant step forward, the benefits of these therapies remain limited. This highlights the necessity for safe and effective compounds that offer greater therapeutic benefits to the majority of individuals with or at risk of AD. Drug repurposing allows for a cost-effective, time-efficient strategy to accelerate the availability of treatments, owing to the availability of safety information.
Method
This study focuses on the third iteration of the Delphi consensus programme aimed at identifying new high-priority drug candidates for repurposing in AD. An international expert panel comprising academics, clinicians and industry representatives was convened. Through a combination of anonymized drug nominations, systematic evidence reviews, iterative consensus rankings, and lay advisory inputs, drug candidates were evaluated and ranked based on rationale, non-clinical and clinical evidence, and overall safety profiles.
Results
Among the 80 candidates that were nominated by the expert panel, seven underwent review, with only three candidates meeting the following consensus criteria: relevant mechanisms for targeting neurodegenerative pathways, non-clinical efficacy, and tolerability in older individuals. The three agents were: [1] the live attenuated herpes zoster (HZ) vaccine (Zostavax); [2] sildenafil, a phosphodiesterase-5 (PDE-5) inhibitor; and [3] riluzole, a glutamate antagonist. The HZ vaccine additionally offers potential for population-level dementia risk reduction.
Conclusion
This Delphi consensus identified three high-priority drug repurposing candidates for AD with favourable safety profiles and mechanistic plausibility, which are considered suitable for pragmatic clinical trials, including remote or hybrid designs. The PROTECT platform, which supports international cohorts in the UK, Norway, and Canada, offers a well-established means to conduct such trials effectively, thus helping to accelerate the evaluation and potential deployment of these drug candidates to benefit individuals with or at risk for AD.
Journal Article