25 results for "Rei, Luca"
Integrating longitudinal information in hippocampal volume measurements for the early detection of Alzheimer's disease
Structural MRI measures for monitoring Alzheimer's Disease (AD) progression are becoming instrumental in clinical practice, and more so in the context of longitudinal studies. This investigation addresses the impact of four image analysis approaches on the longitudinal performance of the hippocampal volume. We present a hippocampal segmentation algorithm and validate it on a gold-standard manual tracing database. We segmented 460 subjects from ADNI, each subject scanned twice at baseline and at the 12-month and 24-month follow-ups (1.5T, T1 MRI). We used the bilateral hippocampal volume v and its variation, measured as the annualized volume change Λ = δv/year (mm³/y). Four processing approaches of different complexity are compared to maximize the longitudinal information, and they are tested for cohort discrimination ability. Reference cohorts are Controls vs. Alzheimer's Disease (CTRL/AD) and CTRL vs. Mild Cognitive Impairment subjects who subsequently progressed to AD dementia (CTRL/MCI-co). We discuss the conditions on v and the added value of Λ in discriminating subjects. The age-corrected bilateral annualized atrophy rates (%/year) were: −1.6 (0.6) for CTRL, −2.2 (1.0) for MCI-nc (non-converters), −3.2 (1.2) for MCI-co and −4.0 (1.5) for AD. Combined (v, Λ) discrimination ability gave an area under the ROC curve (AUC) = 0.93 for CTRL vs. AD and AUC = 0.88 for CTRL vs. MCI-co. Longitudinal volume measurements can provide meaningful clinical insight and added value with respect to the baseline, provided the analysis procedure embeds the longitudinal information.
• A large-scale hippocampus segmentation effort was applied to 1.5T ADNI images.
• Four approaches of increasing complexity were used to implement a longitudinal measure of volume.
• We tested the potential of the longitudinal information as an early-AD marker with respect to baseline assessment.
• Clinically relevant knowledge can be extracted provided longitudinal information is embedded in the measurement.
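The abstract's key quantity is straightforward to compute. Below is a minimal sketch (not the authors' code) of the annualized volume change Λ = δv/year and the combined (v, Λ) discrimination, using synthetic volumes whose atrophy rates loosely mimic the quoted CTRL and AD figures.

```python
# Minimal sketch, assuming synthetic data: annualized hippocampal volume
# change Lambda = delta_v / year and combined (v, Lambda) discrimination.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def annualized_change(v_baseline, v_followup, dt_years):
    """Lambda = delta_v / year (mm^3/y), as defined in the abstract."""
    return (v_followup - v_baseline) / dt_years

# Hypothetical bilateral hippocampal volumes (mm^3) at baseline.
v0_ctrl = rng.normal(7000, 700, 100)
v0_ad = rng.normal(5800, 700, 100)
# One-year follow-up with group-specific atrophy rates (~-1.6 and -4.0 %/y).
lam_ctrl = annualized_change(v0_ctrl, v0_ctrl * (1 - rng.normal(0.016, 0.006, 100)), 1.0)
lam_ad = annualized_change(v0_ad, v0_ad * (1 - rng.normal(0.040, 0.015, 100)), 1.0)

X = np.column_stack([np.r_[v0_ctrl, v0_ad], np.r_[lam_ctrl, lam_ad]])
y = np.r_[np.zeros(100), np.ones(100)]

# Combine (v, Lambda) with a simple linear classifier and report the AUC.
clf = make_pipeline(StandardScaler(), LogisticRegression()).fit(X, y)
print(f"CTRL vs AD AUC for combined (v, Lambda): "
      f"{roc_auc_score(y, clf.decision_function(X)):.2f}")
```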
Local MRI analysis approach in the diagnosis of early and prodromal Alzheimer's disease
Medial temporal lobe (MTL) atrophy is one of the key biomarkers to detect early neurodegenerative changes in the course of Alzheimer's disease (AD). There is active research aimed at identifying automated methodologies able to extract accurate classification indexes from T1-weighted magnetic resonance images (MRI). Such indexes should be fit for identifying AD patients as early as possible. A reference group composed of 144 AD patients and 189 age-matched controls was used to train and test the procedure. It was then applied to a study group composed of 302 MCI subjects, 136 having progressed to clinically probable AD (MCI-converters) and 166 having remained stable or recovered to normal condition after a 24-month follow-up (MCI-non-converters). All subjects came from the ADNI database. We sampled the brain with 7 relatively small volumes, mainly centered on the MTL, and 2 control regions. These volumes were filtered to give intensity and textural MRI-based features. Each filtered region was analyzed with a Random Forest (RF) classifier to extract relevant features, which were subsequently processed with a Support Vector Machine (SVM) classifier. Once a prediction model was trained and tested on the reference group, it was used to compute a classification index (CI) on the MCI cohort and to assess its accuracy in predicting AD conversion in MCI patients. The performance of the classification based on the features extracted from all 9 volumes is compared with that derived from each single volume. All experiments were performed using a bootstrap sampling estimation, and classifier performance was cross-validated with a 20-fold paradigm. We identified a restricted set of image features correlated with the conversion to AD. It is shown that most of the information originates from a small subset of the total available features, which is enough to give a reliable assessment. We found multiple, highly localized image-based features which alone account for the overall clinical diagnosis and prognosis. The classification index is able to discriminate Controls from AD with an area under the curve (AUC) = 0.97 (sensitivity ≃89% at specificity ≃94%) and Controls from MCI-converters with an AUC = 0.92 (sensitivity ≃89% at specificity ≃80%). MCI-converters are separated from MCI-non-converters with AUC = 0.74 (sensitivity ≃72% at specificity ≃65%). The present automated MRI-based technique revealed a strong relationship between highly localized baseline-MRI features and the baseline clinical assessment. In addition, the classification index was also used to predict the probability of AD conversion within a time frame of two years. The definition of a single index combining local analysis of several regions can be useful to detect AD neurodegeneration in a typical MCI population.
► Temporal lobe atrophy in MRI images is measured with a novel, automated technique.
► Small, highly localized volumes convey all the information.
► A localized features-based index is built to classify both individuals and cohorts.
► Discrimination accuracy is comparable to whole-brain analyses in the literature.
► The index can tell prodromal Alzheimer's disease patients from Controls with high accuracy.
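A sketch of the two-stage RF-then-SVM scheme described above, with synthetic stand-ins for the filtered MRI features; feature counts, the selection threshold, and classifier settings are illustrative assumptions, not the authors' configuration.

```python
# Sketch of the two-stage scheme: a Random Forest ranks features, an SVM
# classifies on the top-ranked subset, with 20-fold cross-validation.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

# Stand-in for the reference group (~144 AD + 189 CTRL) with filtered
# intensity/textural features from the sampled volumes.
X, y = make_classification(n_samples=333, n_features=200, n_informative=15,
                           random_state=0)

rf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X, y)
top = np.argsort(rf.feature_importances_)[-20:]   # keep the top features

# For brevity, selection is done outside the CV loop here; the real
# protocol would nest it inside each fold to avoid leakage.
auc = cross_val_score(SVC(kernel="linear"), X[:, top], y,
                      cv=20, scoring="roc_auc")
print(f"20-fold CV AUC on the selected features: {auc.mean():.2f}")
```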
Towards an Automated Approach to the Semi-Quantification of [18F]F-DOPA PET in Pediatric-Type Diffuse Gliomas
Background: This study aims to evaluate the use of a computer-aided, semi-quantification approach to [18F]F-DOPA positron emission tomography (PET) in pediatric-type diffuse gliomas (PDGs) to calculate the tumor-to-background ratio. Methods: A total of 18 pediatric patients with PDGs underwent magnetic resonance imaging and [18F]F-DOPA PET, which were analyzed using both manual and automated procedures. The former provided a tumor-to-normal-tissue ratio (TN) and a tumor-to-striatal-tissue ratio (TS), while the latter provided analogous scores (tn, ts). We tested the correlation and consistency between these methods, and their ability to stratify grading and survival. Results: The ratios calculated with the two approaches showed high Pearson correlation coefficients: ρ = 0.93 (p < 10−4) and ρ = 0.814 (p < 10−4). Analysis of the residuals suggested that tn and ts were more consistent than TN and TS. As with TN and TS, the automatically computed scores showed significant differences between low- and high-grade gliomas (p ≤ 10−4, t-test), and overall survival was significantly shorter in patients with higher values than in those with lower ones (p < 10−3, log-rank test). Conclusions: This study suggests that the proposed computer-aided approach can yield results similar to the manual procedure in terms of diagnostic and prognostic information.
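A minimal sketch of the semi-quantification step: the tumor-to-background ratio as mean PET uptake in a tumor ROI divided by mean uptake in a reference region. The volume and masks below are hypothetical placeholders, not the paper's segmentation pipeline.

```python
# Hypothetical sketch: tumor-to-background ratios from mean ROI uptake.
import numpy as np

def uptake_ratio(pet, tumor_mask, reference_mask):
    """Mean uptake in the tumor ROI over mean uptake in a reference ROI."""
    return pet[tumor_mask].mean() / pet[reference_mask].mean()

# Fake PET volume and placeholder ROI masks (not a real segmentation).
pet = np.random.default_rng(1).gamma(2.0, 1.0, size=(64, 64, 64))
tumor = np.zeros(pet.shape, dtype=bool)
tumor[30:34, 30:34, 30:34] = True
normal = np.zeros(pet.shape, dtype=bool)
normal[5:15, 5:15, 5:15] = True
striatum = np.zeros(pet.shape, dtype=bool)
striatum[40:44, 20:24, 28:32] = True

TN = uptake_ratio(pet, tumor, normal)     # tumor-to-normal-tissue ratio
TS = uptake_ratio(pet, tumor, striatum)   # tumor-to-striatal-tissue ratio
print(f"TN = {TN:.2f}, TS = {TS:.2f}")
```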
Design and implementation of a seismic Newtonian noise cancellation system for the Virgo gravitational-wave detector
Terrestrial gravity perturbations caused by seismic fields produce so-called Newtonian noise in gravitational-wave detectors, which is predicted to limit their sensitivity in the upcoming observing runs. In the past, this noise was seen as an infrastructural limitation, i.e., something that cannot be overcome without major investment in a detector's infrastructure. However, it is possible to obtain at least an indirect estimate of this noise by using data from a large number of seismometers deployed around a detector's suspended test masses. The noise estimate can then be subtracted from the gravitational-wave data, a process called Newtonian noise cancellation (NNC). In this article, we present the design and implementation of the first NNC system at the Virgo detector, as part of its AdV+ upgrade. It uses data from 110 vertical geophones deployed inside the Virgo buildings in optimized array configurations. We use a separate tiltmeter channel to test the pipeline in a proof-of-principle study. The system has been running with good performance for months.
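The core idea behind NNC lends itself to a compact illustration: estimate the target channel from witness seismometers with a Wiener filter (here reduced to a zero-lag least-squares fit) and subtract the estimate. The sketch below uses synthetic data; the actual Virgo system works on 110 geophone channels with frequency-dependent filters.

```python
# Toy illustration of coherent noise subtraction, assuming synthetic data.
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_witness = 100_000, 8
witnesses = rng.standard_normal((n_samples, n_witness))   # seismometer data
coupling = rng.standard_normal(n_witness)                 # ground-to-strain path
target = witnesses @ coupling + 0.3 * rng.standard_normal(n_samples)

# Fit the coupling from the data (zero-lag Wiener filter) and subtract.
w, *_ = np.linalg.lstsq(witnesses, target, rcond=None)
residual = target - witnesses @ w

print(f"residual RMS / target RMS = {residual.std() / target.std():.3f}")
```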
Joint optimization of seismometer arrays for the cancellation of Newtonian noise from seismic body waves in the Einstein Telescope
Seismic Newtonian noise is predicted to limit the sensitivity of the Einstein Telescope (ET). It can be reduced with coherent noise cancellation techniques using data from seismometers. To achieve the best results, it is important to place the seismic sensors in optimal positions. A preliminary study on this topic, conducted for ET, focused on optimizing the seismic array for the cancellation of Newtonian noise at an isolated test mass. In this paper, we expand the study to the nested shape of ET, i.e., the four test masses of the low-frequency interferometers at each vertex of the detector. Results are investigated as a function of the polarization content of a seismic field composed of body waves. The study also examines how performance is affected when the sensor array is displaced from its optimal position or operated at frequencies other than those used for the optimization.
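As a toy illustration of the array-optimization idea, one can minimize the Wiener residual R = 1 − c·C⁻¹·c over sensor positions for an assumed isotropic field with correlation j₀(kd). The 2-D geometry, the field model, and the sensor-noise level below are simplifying assumptions, not the ET study's detailed body-wave model.

```python
# Toy 2-D array optimization: minimize the Wiener residual R = 1 - c.C^-1.c
# over sensor positions, assuming an isotropic field with j0(k*d)
# correlations and small sensor self-noise.
import numpy as np
from scipy.optimize import minimize

WAVELENGTH = 500.0                   # assumed seismic wavelength (m)
K = 2 * np.pi / WAVELENGTH
N_SENSORS = 4                        # test mass sits at the origin

def j0(x):
    """Spherical Bessel j0(x) = sin(x)/x via numpy's normalized sinc."""
    return np.sinc(x / np.pi)

def residual(flat_pos):
    pos = flat_pos.reshape(N_SENSORS, 2)
    d_ss = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
    C = j0(K * d_ss) + 1e-3 * np.eye(N_SENSORS)   # sensor-sensor correlations
    c = j0(K * np.linalg.norm(pos, axis=1))       # sensor-to-test-mass terms
    return 1.0 - c @ np.linalg.solve(C, c)        # fraction of noise left over

x0 = np.random.default_rng(3).uniform(-200, 200, size=2 * N_SENSORS)
best = minimize(residual, x0, method="Nelder-Mead")
print(f"optimized residual: {best.fun:.3f}")
```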
Impact of signal clusters in wide-band searches for continuous gravitational waves
In this paper we present a study of some relevant steps of the hierarchical frequency-Hough (FH) pipeline, used within the LIGO and Virgo Collaborations for wide-parameter-space searches of continuous gravitational waves (CWs) emitted, for instance, by spinning neutron stars (NSs). Because of their weak expected amplitudes, CWs have not been detected so far. These steps, namely the spectral estimation, the peakmap construction, and the procedure to select candidates in the FH plane, are critical as they contribute to determining the final search sensitivity. Here, we are interested in investigating their behavior in the (presently quite) extreme case of signal clusters, due to many strong CW sources emitting gravitational waves (GWs) within a small (i.e., <1 Hz wide) frequency range. This could happen for some kinds of CW sources detectable by next-generation detectors, like LISA, Einstein Telescope and Cosmic Explorer. Moreover, this possibility has recently been raised even for current Earth-based detectors, in some scenarios of CW emission from ultralight boson clouds around stellar-mass black holes (BHs). We quantitatively evaluate the robustness of the FH analysis procedure, designed to minimize the loss of single CW signals, under the unusual situation of signal clusters. Results depend mainly on how strong in amplitude and dense in frequency the signals are, and on the frequency range they cover. We show that a small sensitivity loss may indeed occur in the presence of a very high mean signal density affecting a frequency range of the order of one hertz, while when the signal cluster covers a frequency range of a tenth of a hertz or less, we may actually have a sensitivity gain. Overall, we demonstrate the FH pipeline to be robust even in the presence of moderate-to-large signal clusters.
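A rough sketch of the peakmap step mentioned above: compute short-FFT spectra, equalize by a running noise estimate, and keep local maxima above a threshold. The window length, the median-based equalization, and the threshold are illustrative choices, not the FH pipeline's actual settings.

```python
# Rough peakmap sketch: equalize short-FFT spectra by a running-median
# noise estimate and keep time-frequency bins that cross a threshold and
# are local maxima along the frequency axis.
import numpy as np
from scipy.ndimage import median_filter
from scipy.signal import spectrogram

rng = np.random.default_rng(4)
fs = 256.0
t = np.arange(0, 512, 1 / fs)                       # ~8.5 min of fake data
data = rng.standard_normal(t.size) + 0.2 * np.sin(2 * np.pi * 60.3 * t)

f, tt, Sxx = spectrogram(data, fs=fs, nperseg=4096, noverlap=0)

eq = Sxx / median_filter(Sxx, size=(101, 1))        # equalized spectra
above = eq > 2.5                                    # threshold crossing
local_max = (eq >= np.roll(eq, 1, axis=0)) & (eq >= np.roll(eq, -1, axis=0))
peakmap = above & local_max
print(f"selected {peakmap.sum()} peaks in {Sxx.shape[1]} spectra")
```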
Probing new light gauge bosons with gravitational-wave interferometers using an adapted semi-coherent method
We adapt a method, originally developed for searches for quasi-monochromatic, quasi-infinite gravitational-wave signals, to directly detect new light gauge bosons with laser interferometers, which could be candidates for dark matter. To search for these particles, we optimally choose the analysis coherence time as a function of boson mass, such that all of the signal power will be confined to one frequency bin. We focus on the dark photon, a gauge boson that could couple to baryon or baryon-lepton number, and explain that its interactions with gravitational-wave interferometers result in a narrow-band, stochastic signal. We provide an end-to-end analysis scheme, estimate its computational cost, and investigate follow-up techniques to confirm or rule out dark matter candidates. Furthermore, we derive a theoretical estimate of the sensitivity, and show that it is consistent with both the empirical sensitivity determined through simulations, and results from a cross-correlation search. Finally, we place Feldman-Cousins upper limits using data from LIGO Livingston's second observing run, which give a new and strong constraint on the coupling of gauge bosons to the interferometer.
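The coherence-time choice admits a back-of-the-envelope illustration: a dark-photon field of mass m oscillates at f = mc²/h and, for virial velocities v ~ 10⁻³c, wanders in frequency by roughly (v/c)²·f, so the FFT length is capped at T ≲ 1/(10⁻⁶ f) to keep the power in a single bin. The fractional linewidth below is an assumed round number, not the paper's exact prescription.

```python
# Back-of-the-envelope coherence-time choice, assuming a 1e-6 fractional
# linewidth from virialized dark matter: one bin of width 1/T holds the
# signal power only for T <~ 1 / (1e-6 * f).
H_EV_S = 4.135667696e-15            # Planck constant (eV s)
FRACTIONAL_LINEWIDTH = 1e-6         # ~ (v/c)^2 for virial velocities

def boson_frequency_hz(mass_ev):
    """Oscillation frequency of the dark-photon field, f = m c^2 / h."""
    return mass_ev / H_EV_S

def max_coherence_time_s(mass_ev):
    """Longest FFT keeping the signal within a single frequency bin."""
    return 1.0 / (FRACTIONAL_LINEWIDTH * boson_frequency_hz(mass_ev))

for m in (1e-13, 1e-12, 1e-11):     # boson masses in eV
    print(f"m = {m:.0e} eV -> f = {boson_frequency_hz(m):9.1f} Hz, "
          f"T_coh <= {max_coherence_time_s(m):9.1f} s")
```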
Direct constraints on ultra-light boson mass from searches for continuous gravitational waves
Superradiance can trigger the formation of an ultra-light boson cloud around a spinning black hole. Once formed, the boson cloud is expected to emit a nearly periodic, long-duration gravitational-wave signal. For boson masses in the range (10⁻¹³–10⁻¹¹) eV and stellar-mass black holes, such signals are potentially detectable by gravitational-wave detectors like Advanced LIGO and Virgo. In this Letter we present full-band upper limits for a generic all-sky search for periodic gravitational waves in LIGO O2 data, and use them to derive, for the first time, direct constraints on the ultra-light scalar boson field mass.
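For orientation, the signal from a scalar boson cloud is nearly monochromatic at about twice the boson's Compton frequency, f_gw ≈ 2mc²/h, which maps the quoted mass range onto the detectors' sensitive band. The sketch below evaluates only this leading-order relation, ignoring small level-shift corrections.

```python
# Leading-order mapping between scalar boson mass and GW frequency,
# f_gw ~ 2 m c^2 / h (small corrections ignored in this sketch).
H_EV_S = 4.135667696e-15   # Planck constant (eV s)

def gw_frequency_hz(boson_mass_ev):
    """Approximate GW frequency emitted by a scalar boson cloud."""
    return 2.0 * boson_mass_ev / H_EV_S

for m in (1e-13, 1e-12, 1e-11):   # the mass range quoted in the abstract
    print(f"m = {m:.0e} eV -> f_gw ~ {gw_frequency_hz(m):7.0f} Hz")
```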