167 result(s) for "Mean of functional data"
Detecting changes in the mean of functional observations
Principal component analysis has become a fundamental tool of functional data analysis. It represents the functional data as X_i(t) = μ(t) + Σ_{1 ≤ l < ∞} η_{i,l} v_l(t), where μ is the common mean, the v_l are the eigenfunctions of the covariance operator, and the η_{i,l} are the scores. Inferential procedures assume that the mean function μ(t) is the same for all values of i. If, in fact, the observations do not come from one population, but rather their mean changes at some point(s), the results of principal component analysis are confounded by the change(s). It is therefore important to develop a methodology to test the assumption of a common functional mean. We develop such a test using quantities which can be readily computed in the R package fda. The null distribution of the test statistic is asymptotically pivotal with a well-known asymptotic distribution. The asymptotic test has excellent finite sample performance. Its application is illustrated on temperature data from England.
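The decomposition in this abstract can be illustrated with a minimal NumPy sketch: estimate the common mean μ(t), take the eigenfunctions of the empirical covariance, and project the centered curves to obtain the scores. This is an illustrative toy (the grid, sample size, and noise level are assumptions here, and the discrete inner product ignores the grid spacing), not the paper's fda-based test.

```python
import numpy as np

# Toy sample: n curves observed on a common grid of T points.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 200)
n = 50
X = np.sin(2 * np.pi * t) + rng.normal(scale=0.3, size=(n, t.size))

mu = X.mean(axis=0)                 # estimate of the common mean mu(t)
C = np.cov(X, rowvar=False)         # empirical covariance operator (T x T)
evals, evecs = np.linalg.eigh(C)
order = np.argsort(evals)[::-1]     # eigenfunctions v_l, largest eigenvalue first
v = evecs[:, order]
scores = (X - mu) @ v               # scores eta_{i,l} = <X_i - mu, v_l>
```

A change in the mean at some point i would leak into `mu` and hence into every score, which is the confounding the paper's test is designed to detect.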
Universal Local Linear Kernel Estimators in Nonparametric Regression
New local linear estimators are proposed for a wide class of nonparametric regression models. The estimators are uniformly consistent without requiring the traditional dependence conditions on the design elements. The estimators are the solutions of a specially weighted least-squares problem. The design can be fixed or random and need not meet classical regularity or independence conditions. As an application, several estimators are constructed for the mean of dense functional data. The theoretical results of the study are illustrated by simulations, and an example of processing real medical data from the epidemiological cross-sectional study ESSE-RF is included. We compare the new estimators with the best-known estimators for such studies.
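For orientation, here is the classical local linear estimator that this paper generalizes: a kernel-weighted least-squares fit of an intercept and slope around each evaluation point. The Gaussian kernel and bandwidth are illustrative assumptions; the paper's universal estimators use a different weighting scheme.

```python
import numpy as np

def local_linear(x, y, x0, h):
    """Classical local linear estimate of m(x0) via kernel-weighted least squares."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)        # Gaussian kernel weights
    X = np.column_stack([np.ones_like(x), x - x0])  # local intercept and slope
    sw = np.sqrt(w)
    beta, *_ = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)
    return beta[0]                                  # intercept = fitted value at x0
```

Applied to dense functional data, one pools all (design point, observation) pairs from all curves and evaluates `local_linear` on a grid to estimate the mean function.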
Evaluation of a personalized functional near infra‐red optical tomography workflow using maximum entropy on the mean
In the present study, we proposed and evaluated a workflow of personalized near infra‐red optical tomography (NIROT) using functional near‐infrared spectroscopy (fNIRS) for spatiotemporal imaging of cortical hemodynamic fluctuations. The proposed workflow from fNIRS data acquisition to local 3D reconstruction consists of: (a) the personalized optimal montage maximizing fNIRS channel sensitivity to a predefined targeted brain region; (b) the optimized fNIRS data acquisition involving installation of optodes and digitalization of their positions using a neuronavigation system; and (c) the 3D local reconstruction using maximum entropy on the mean (MEM) to accurately estimate the location and spatial extent of fNIRS hemodynamic fluctuations along the cortical surface. The workflow was evaluated on finger‐tapping fNIRS data acquired from 10 healthy subjects for whom we estimated the reconstructed NIROT spatiotemporal images and compared with functional magnetic resonance imaging (fMRI) results from the same individuals. Using the fMRI activation maps as our reference, we quantitatively compared the performance of two NIROT approaches, the MEM framework and the conventional minimum norm estimation (MNE) method. Quantitative comparisons were performed at both single-subject and group level. Overall, our results suggested that MEM provided better spatial accuracy than MNE, while both methods offered similar temporal accuracy when reconstructing oxygenated (HbO) and deoxygenated hemoglobin (HbR) concentration changes evoked by finger‐tapping. Our proposed complete workflow was made available in the brainstorm fNIRS processing plugin, NIRSTORM, thus providing the opportunity for other researchers to further apply it to other tasks and on larger populations. This study introduced and evaluated a workflow for personalized near infra‐red optical tomography (NIROT); the workflow comprises a full pipeline from fNIRS data acquisition to local 3D imaging of cortical hemodynamic fluctuations.
Evaluation of sliding window correlation performance for characterizing dynamic functional connectivity and brain states
A promising recent development in the study of brain function is the dynamic analysis of resting-state functional MRI scans, which can enhance understanding of normal cognition and alterations that result from brain disorders. One widely used method of capturing the dynamics of functional connectivity is sliding window correlation (SWC). However, in the absence of a “gold standard” for comparison, evaluating the performance of the SWC in typical resting-state data is challenging. This study uses simulated networks (SNs) with known transitions to examine the effects of parameters such as window length, window offset, window type, noise, filtering, and sampling rate on the SWC performance. The SWC time course was calculated for all node pairs of each SN and then clustered using the k-means algorithm to determine how resulting brain states match known configurations and transitions in the SNs. The outcomes show that the detection of state transitions and durations in the SWC is most strongly influenced by the window length and offset, followed by noise and filtering parameters. The effect of the image sampling rate was relatively insignificant. Tapered windows provide less sensitivity to state transitions than rectangular windows, which could be the result of the sharp transitions in the SNs. Overall, the SWC gave poor estimates of correlation for each brain state. Clustering based on the SWC time course did not reliably reflect the underlying state transitions unless the window length was comparable to the state duration, highlighting the need for new adaptive window analysis techniques.
• We evaluate sliding window correlation as a dynamic network analysis method.
• We evaluate k-means for state identification of the results.
• Use of simulated networks provides ground truth for performance evaluation.
• Window length, offset, and filtering significantly influence the results.
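The core SWC computation the abstract describes is simple to sketch: slide a window of fixed length along two node time series, advancing by a fixed offset, and compute the Pearson correlation in each window. This minimal version uses a rectangular window; the study also compares tapered windows, which this sketch omits.

```python
import numpy as np

def sliding_window_corr(a, b, win, offset=1):
    """Pearson correlation of two time series in rectangular sliding windows."""
    starts = range(0, len(a) - win + 1, offset)
    return np.array([np.corrcoef(a[s:s + win], b[s:s + win])[0, 1]
                     for s in starts])
```

The resulting SWC time courses over all node pairs form the feature vectors that are then clustered (e.g., with k-means) into candidate brain states.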
Optimal Estimation of Large Functional and Longitudinal Data by Using Functional Linear Mixed Model
The estimation of large functional and longitudinal data, which refers to the estimation of mean function, estimation of covariance function, and prediction of individual trajectory, is one of the most challenging problems in the field of high-dimensional statistics. Functional Principal Components Analysis (FPCA) and Functional Linear Mixed Model (FLMM) are two major statistical tools used to address the estimation of large functional and longitudinal data; however, the former suffers from a dramatically increasing computational burden while the latter does not have clear asymptotic properties. In this paper, we propose a computationally effective estimator of large functional and longitudinal data within the framework of FLMM, in which all the parameters can be automatically estimated. Under certain regularity assumptions, we prove that the mean function estimation and individual trajectory prediction reach the minimax lower bounds of all nonparametric estimations. Through numerous simulations and real data analysis, we show that our new estimator outperforms the traditional FPCA in terms of mean function estimation, individual trajectory prediction, variance estimation, covariance function estimation, and computational effectiveness.
Combining unsupervised and supervised learning techniques for enhancing the performance of functional data classifiers
This paper offers a supervised classification strategy that combines functional data analysis with unsupervised and supervised classification methods. Specifically, a two-step classification technique for high-dimensional time series treated as functional data is suggested. The first stage is based on extracting additional knowledge from the data using unsupervised classification employing suitable metrics. The second phase applies functional supervised classification of the new patterns learned via appropriate basis representations. The experiments on ECG data and comparison with the classical approaches show the effectiveness of the proposed technique and a clear improvement in accuracy. A simulation study with six scenarios is also offered to demonstrate the efficacy of the suggested strategy. The results reveal that this line of investigation is compelling and worthy of further development.
Which results of the standard test for community-weighted mean approach are too optimistic?
Aims The community‐weighted mean (CWM) approach is used to analyse the relationship between species attributes (traits, Ellenberg‐type indicator values) and sample attributes (environmental variables, richness) via the community matrix. It has recently been shown to suffer from inflated Type I error rate if tested by a standard test and the results of many published studies are probably affected. I review the current knowledge about this problem, and clarify which studies are likely affected and by how much. Methods I suggest classifying hypotheses commonly tested by CWM approach into three categories, which differ in the formulation of the null hypothesis. I use simulated and real data to show how the Type I error rate of the standard test is affected by data characteristics. Results The CWM approach with the standard test returns a correct Type I error rate for hypotheses assuming a link between species attributes and composition (Category A). However, for hypotheses assuming a link between composition and sample attributes (Category B) or not assuming any link (Category C), the standard test is inflated, and alternative tests are needed to control for this. The inflation of standard tests for Category C is negatively related to the compositional β‐diversity, and positively to the strength of the composition–sample attributes relationship and data set sample size. These results apply to CWM analyses with extrinsic sample attributes (not derived from the compositional matrix). CWM analysis with intrinsic sample attributes (derived from the composition, such as species richness) is a case of spurious correlation and can be tested using a column‐based (modified) permutation test. Conclusions The concept of three hypothesis categories offers a simple tool to evaluate which hypothesis has been tested and whether the results have correct or inflated Type I error rate. 
In the case of inflated results, the level of inflation can be estimated from the data characteristics. The community-weighted mean approach tests the relationship of species attributes (traits, indicator values) to sample attributes (environmental variables, richness), and the test is known to have an inflated Type I error rate. I argue that whether test results are inflated depends on the type of hypothesis tested, and that the level of inflation depends on data set parameters (e.g., beta diversity).
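The CWM quantity itself is just an abundance-weighted average of species trait values per sample. A minimal sketch, with a hypothetical abundance matrix and trait vector (the inference issues discussed in the abstract concern how such CWM values are subsequently tested, not this computation):

```python
import numpy as np

def cwm(abundance, traits):
    """Community-weighted mean: rows = samples, columns = species."""
    p = abundance / abundance.sum(axis=1, keepdims=True)  # relative abundances
    return p @ traits                                     # one CWM per sample

A = np.array([[10.0, 0.0],   # sample 1: only species 1
              [5.0, 5.0]])   # sample 2: equal mix
t = np.array([1.0, 3.0])     # one trait value per species
cwm(A, t)                    # -> [1.0, 2.0]
```

Regressing these CWM values on extrinsic sample attributes and applying a standard test is exactly the "standard test" whose Type I error rate the paper examines.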
SimPACE: Generating simulated motion corrupted BOLD data with synthetic-navigated acquisition for the development and evaluation of SLOMOCO: A new, highly effective slicewise motion correction
Head motion in functional MRI and resting-state MRI is a major problem. Existing methods do not robustly reflect the true level of motion artifact for in vivo fMRI data. The primary issue is that current methods assume that motion is synchronized to the volume acquisition and thus ignore intra-volume motion. This manuscript covers three sections in the use of gold-standard motion-corrupted data to pursue an intra-volume motion correction. First, we present a way to get motion corrupted data with accurately known motion at the slice acquisition level. This technique simulates important data acquisition-related motion artifacts while acquiring real BOLD MRI data. It is based on a novel motion-injection pulse sequence that introduces known motion independently for every slice: Simulated Prospective Acquisition CorrEction (SimPACE). Secondly, with data acquired using SimPACE, we evaluate several motion correction and characterization techniques, including several commonly used BOLD signal- and motion parameter-based metrics. Finally, we introduce and evaluate a novel, slice-based motion correction technique. Our novel method, SLice-Oriented MOtion COrrection (SLOMOCO) performs better than the volumetric methods and, moreover, accurately detects the motion of independent slices, in this case equivalent to the known injected motion. We demonstrate that SLOMOCO can model and correct for nearly all effects of motion in BOLD data. Also, none of the commonly used motion metrics was observed to robustly identify motion corrupted events, especially in the most realistic scenario of sudden head movement. For some popular metrics, performance was poor even when using the ideal known slice motion instead of volumetric parameters. This has negative implications for methods relying on these metrics, such as recently proposed motion correction methods such as data censoring and global signal regression. 
• We present a pulse sequence to acquire BOLD data with known motion corruption.
• BOLD data with induced motion are acquired in cadavers and live subjects at rest.
• We show contemporary motion measures are insensitive to intra-volume motion.
• We present a retrospective algorithm to obtain full slicewise motion estimates.
• SLOMOCO is the first motion correction suitable for realistic head motion.
Optimal Estimation of the Mean Function Based on Discretely Sampled Functional Data: Phase Transition
The problem of estimating the mean of random functions based on discretely sampled data arises naturally in functional data analysis. In this paper, we study optimal estimation of the mean function under both common and independent designs. Minimax rates of convergence are established and easily implementable rate-optimal estimators are introduced. The analysis reveals interesting and different phase transition phenomena in the two cases. Under the common design, the sampling frequency solely determines the optimal rate of convergence when it is relatively small and the sampling frequency has no effect on the optimal rate when it is large. On the other hand, under the independent design, the optimal rate of convergence is determined jointly by the sampling frequency and the number of curves when the sampling frequency is relatively small. When it is large, the sampling frequency has no effect on the optimal rate. Another interesting contrast between the two settings is that smoothing is necessary under the independent design, while, somewhat surprisingly, it is not essential under the common design.
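Under the common design all curves are observed at the same sampling points, and the abstract notes that smoothing is not essential there: averaging across curves at each shared point already gives a rate-optimal estimator when the sampling frequency is large. A minimal sketch (grid, sample size, and noise level are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 25)          # common sampling grid shared by all curves
n = 200                            # number of curves
Y = np.cos(2 * np.pi * t) + rng.normal(scale=0.5, size=(n, t.size))

mu_hat = Y.mean(axis=0)            # pointwise sample mean at each shared point
```

Under the independent design, by contrast, each curve has its own sampling points, so a smoother (e.g., the local linear estimator) is needed to borrow strength across curves.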
Estimation of the mean of functional time series and a two-sample problem
The paper is concerned with inference based on the mean function of a functional time series. We develop a normal approximation for the functional sample mean and then focus on the estimation of the asymptotic variance kernel. Using these results, we develop and asymptotically justify testing procedures for the equality of means in two functional samples exhibiting temporal dependence. Evaluated by means of a simulation study and application to a real data set, these two-sample procedures enjoy good size and power in finite samples.