Catalogue Search | MBRL
Search Results
Explore the vast range of titles available.
5 result(s) for "Andersson, J.L.R."
Non-negative data-driven mapping of structural connections with application to the neonatal brain
by Glasser, M.F., Bastiani, M., Sotiropoulos, S.N.
in Algorithms, Brain, Brain - growth & development
2020
Mapping connections in the neonatal brain can provide insight into the crucial early stages of neurodevelopment that shape brain organisation and lay the foundations for cognition and behaviour. Diffusion MRI and tractography provide unique opportunities for such explorations, through estimation of white matter bundles and brain connectivity. Atlas-based tractography protocols, i.e. a priori defined sets of masks and logical operations in a template space, have been commonly used in the adult brain to drive such explorations. However, rapid growth and maturation of the brain during early development make it challenging to ensure correspondence and validity of such atlas-based tractography approaches in the developing brain. An alternative can be provided by data-driven methods, which do not depend on predefined regions of interest. Here, we develop a novel data-driven framework to extract white matter bundles and their associated grey matter networks from neonatal tractography data, based on non-negative matrix factorisation that is inherently suited to the non-negative nature of structural connectivity data. We also develop a non-negative dual regression framework to map group-level components to individual subjects. Using in-silico simulations, we evaluate the accuracy of our approach in extracting connectivity components and compare with an alternative data-driven method, independent component analysis. We apply non-negative matrix factorisation to whole-brain connectivity obtained from publicly available datasets from the Developing Human Connectome Project, yielding grey matter components and their corresponding white matter bundles. We assess the validity and interpretability of these components against traditional tractography results and grey matter networks obtained from resting-state fMRI in the same subjects. We subsequently use them to generate a parcellation of the neonatal cortex using data from 323 new-born babies and we assess the robustness and reproducibility of this connectivity-driven parcellation.
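The factorisation at the heart of this abstract can be sketched in plain NumPy. The multiplicative-update rules below are the classic Lee-Seung algorithm, a simplification of the paper's constrained framework, and the toy block-structured matrix stands in for a real tractography-derived connectivity matrix:

```python
import numpy as np

def nmf(X, k, n_iter=500, eps=1e-9, seed=0):
    """Non-negative matrix factorisation via multiplicative updates.

    Factorises a non-negative X (m x n) into W (m x k) @ H (k x n),
    with both factors kept element-wise non-negative throughout.
    """
    rng = np.random.default_rng(seed)
    m, n = X.shape
    W = rng.random((m, k)) + eps
    H = rng.random((k, n)) + eps
    for _ in range(n_iter):
        # Lee-Seung updates: each step cannot make an entry negative
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Toy "connectivity matrix": two non-overlapping communities
X = np.zeros((6, 6))
X[:3, :3] = 1.0
X[3:, 3:] = 1.0

W, H = nmf(X, k=2)
err = np.linalg.norm(X - W @ H)   # reconstruction error, near zero here
```

Because the updates are purely multiplicative, non-negativity is preserved automatically, which is why NMF suits connectivity data where negative "connection strength" has no meaning.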
Journal Article
A Bayesian framework for global tractography
2007
We readdress the diffusion tractography problem in a global and probabilistic manner. Instead of tracking through local orientations, we parameterise the connections between brain regions at a global level, and then infer on global and local parameters simultaneously in a Bayesian framework. This approach offers a number of important benefits. The global nature of the tractography reduces sensitivity to local noise and modelling errors. By constraining tractography to ensure a connection is found, and then inferring on the exact location of the connection, we increase the robustness of connectivity-based parcellations, allowing parcellations of connections that were previously invisible to tractography. The Bayesian framework allows a direct comparison of the evidence for connecting and non-connecting models, to test whether the connection is supported by the data. Crucially, by explicit parameterisation of the connection between brain regions, we infer on a parameter that is shared with models of functional connectivity. This model is a first step toward the joint inference on functional and anatomical connectivity.
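The model-comparison idea in this abstract, weighing the evidence for a "connecting" against a "non-connecting" model, can be illustrated on simulated data. This is a deliberately simplified toy, not the paper's tractography model: here "connection strength" is a scalar Gaussian mean, the prior and grid quadrature are my own choices, and the evidence is computed by brute force:

```python
import numpy as np

def log_marginal(d, mu_grid, log_prior):
    """Numerically integrate p(d) = integral of p(d|mu) p(mu) dmu on a grid."""
    dmu = mu_grid[1] - mu_grid[0]
    # log-likelihood of all data under N(mu, 1), for every grid value of mu
    ll = (-0.5 * ((d[:, None] - mu_grid[None, :]) ** 2).sum(axis=0)
          - 0.5 * len(d) * np.log(2 * np.pi))
    lp = ll + log_prior
    m = lp.max()                          # log-sum-exp for stability
    return m + np.log(np.exp(lp - m).sum() * dmu)

rng = np.random.default_rng(1)
d = rng.normal(1.0, 1.0, size=20)         # simulated "connecting" data

grid = np.linspace(-5.0, 5.0, 2001)
# M1 (connecting): unknown strength mu with prior N(1, 1)
log_prior1 = -0.5 * (grid - 1.0) ** 2 - 0.5 * np.log(2 * np.pi)
log_ev1 = log_marginal(d, grid, log_prior1)
# M0 (non-connecting): strength fixed at zero, no free parameter
log_ev0 = -0.5 * (d ** 2).sum() - 0.5 * len(d) * np.log(2 * np.pi)

bayes_factor = np.exp(log_ev1 - log_ev0)  # > 1 favours a connection
```

The marginal likelihood automatically penalises the extra parameter of the connecting model, so the comparison is not biased toward always declaring a connection.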
Journal Article
How to obtain high-accuracy image registration: application to movement correction of dynamic positron emission tomography data
1998
When registering dynamic positron emission tomography (PET) sequences, the time-dependent changes in uptake pattern prevent registration of all frames to the first frame in a straightforward manner. Instead, a sequential registration of each frame to its predecessor may be used, provided the registration algorithm is completely free of bias. It is shown that most existing algorithms introduce a bias, the size of which depends on the pixel size and the signal-to-noise ratio of the data. The bias is introduced by the pixelisation of the underlying continuous process. All existing cost-functions are more or less sensitive to noise, and the noise reduction resulting from translating one image set relative to the other means that a small movement will always be detected in the cases where no actual movement has occurred. The problem is solved by an initial resampling of the reference volume into a representation with another image and pixel size. If the new representation is sensibly chosen it means that all possible transforms applied to the other image volume will yield approximately the same noise reduction, thereby removing the source of the bias. The described effect is demonstrated on phantom data, and its impact is shown on human data.
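The interpolation-induced bias this abstract describes can be reproduced in a few lines. The example below is a simplified 1-D illustration with a flat underlying image (my simplification, not the paper's phantom): sub-pixel interpolation averages neighbouring noisy samples, so a sum-of-squares cost is lower at a half-pixel shift than at zero shift even though no motion occurred:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
# Two noisy "frames" of the same flat underlying image (no true motion)
f1 = rng.normal(0.0, 1.0, n)
f2 = rng.normal(0.0, 1.0, n)

def ssd_after_shift(ref, mov, t):
    """Mean squared difference after linearly interpolating mov at shift t in [0, 1)."""
    shifted = (1 - t) * mov[:-1] + t * mov[1:]
    return ((ref[:-1] - shifted) ** 2).mean()

cost_zero = ssd_after_shift(f1, f2, 0.0)  # full noise variance: ~2
cost_half = ssd_after_shift(f1, f2, 0.5)  # interpolation halves mov's variance: ~1.5
```

Since `cost_half < cost_zero` purely from noise averaging, a naive optimiser "detects" a half-pixel movement that never happened. Resampling the reference onto a different grid, as the paper proposes, makes every candidate shift smooth the moving image by a comparable amount, removing the zero-shift disadvantage.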
Journal Article
Design, construction and six years' experience of an integrated system for automated handling of discrete blood samples
1998
The present paper describes the design of an integrated system to aid in the taking and measurement of manual blood samples during nuclear medical examinations requiring blood sampling. In contrast to previously published systems, the present system is not used in the actual sampling of the blood, but aims to aid in all other aspects of handling and measurement. It consists of two main parts. One part is a distributed software system running on the scanner host computer used to register sample times, to display information pertaining to the ongoing examination and to collect data from a number of well crystals. The other main part consists of an industrial robot used to perform the actual weighing, centrifugation, pipetting and measurement of the samples. The system has been operational for 6 years, during which time it has had an "up-time" in excess of 95% and has handled and measured the blood samples from more than 5000 examinations, each comprising an average of 15 blood samples. The throughput of the system is 50 whole blood samples or 21 plasma samples per hour. In addition it has to a large extent removed the "human factor" from the process, thereby increasing the reliability of the data.
Conference Proceeding