3,389 results for "Permutation Method"
CONTROLLING THE FALSE DISCOVERY RATE VIA KNOCKOFFS
In many fields of science, we observe a response variable together with a large number of potential explanatory variables, and would like to be able to discover which variables are truly associated with the response. At the same time, we need to know that the false discovery rate (FDR)—the expected fraction of false discoveries among all discoveries—is not too high, in order to assure the scientist that most of the discoveries are indeed true and replicable. This paper introduces the knockoff filter, a new variable selection procedure controlling the FDR in the statistical linear model whenever there are at least as many observations as variables. This method achieves exact FDR control in finite sample settings no matter the design or covariates, the number of variables in the model, or the amplitudes of the unknown regression coefficients, and does not require any knowledge of the noise level. As the name suggests, the method operates by manufacturing knockoff variables that are cheap—their construction does not require any new data—and are designed to mimic the correlation structure found within the existing variables, in a way that allows for accurate FDR control, beyond what is possible with permutation-based methods. The method of knockoffs is very general and flexible, and can work with a broad class of test statistics. We test the method in combination with statistics from the Lasso for sparse regression, and obtain empirical results showing that the resulting method has far more power than existing selection rules when the proportion of null variables is high.
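The data-dependent selection rule at the heart of the filter can be sketched briefly. In this illustration the statistics `W` are made-up values standing in for knockoff statistics (e.g. Lasso-based), and the threshold implements the knockoff+ rule: the smallest t such that (1 + #{W_j ≤ −t}) / max(1, #{W_j ≥ t}) ≤ q.

```python
import numpy as np

def knockoff_threshold(W, q=0.3):
    """Knockoff+ threshold: smallest t among |W_j| with
    (1 + #{W_j <= -t}) / max(1, #{W_j >= t}) <= q."""
    ts = np.sort(np.abs(W[W != 0]))
    for t in ts:
        ratio = (1 + np.sum(W <= -t)) / max(1, np.sum(W >= t))
        if ratio <= q:
            return t
    return np.inf  # no feasible threshold: select nothing

# illustrative statistics; large positive values suggest true signals
W = np.array([5.0, 4.2, 3.1, -0.5, 0.8, -1.1, 2.9, -0.2])
T = knockoff_threshold(W, q=0.3)
selected = np.where(W >= T)[0]  # indices of selected variables
```

Variables with W_j at or above the threshold are selected; the ratio of negatives to positives among the large |W_j| acts as a conservative estimate of the false discovery proportion.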
Asymptotic permutation tests in general factorial designs
In general factorial designs where no homoscedasticity or a particular error distribution is assumed, the well-known Wald-type statistic is a simple asymptotically valid procedure. However, it is well known that it suffers from a poor finite sample approximation, since the convergence to its χ² limit distribution is quite slow. This becomes even worse with an increasing number of factor levels. The aim of the paper is to improve the small sample behaviour of the Wald-type statistic, maintaining its applicability to general settings such as crossed or hierarchically nested designs, by applying a modified permutation approach. In particular, it is shown that this approach approximates the null distribution of the Wald-type statistic not only under the null hypothesis but also under the alternative, yielding an asymptotically valid permutation test which is even finitely exact under exchangeability. Finally, its small sample behaviour is compared with competing procedures in an extensive simulation study.
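The basic permutation recipe behind such tests can be sketched with a generic two-sample difference-in-means example (this is the textbook idea only, not the paper's modified Wald-type procedure; all names are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)

def permutation_test(x, y, n_perm=2000, rng=rng):
    """Two-sample permutation test for a difference in means:
    pool the samples, reshuffle group labels, and compare the
    observed statistic against its permutation distribution."""
    observed = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(pooled)
        stat = abs(perm[:len(x)].mean() - perm[len(x):].mean())
        count += stat >= observed
    # add-one correction keeps the p-value strictly positive
    return (count + 1) / (n_perm + 1)

x = rng.normal(0.0, 1.0, 30)
y = rng.normal(1.5, 1.0, 30)  # shifted group
p = permutation_test(x, y)
```

Under exchangeability of the pooled sample the test is exact at any sample size, which is the finite-sample property the abstract refers to.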
PERMUTATION METHODS FOR FACTOR ANALYSIS AND PCA
Researchers often have datasets measuring features x_ij of samples, such as test scores of students. In factor analysis and PCA, these features are thought to be influenced by unobserved factors, such as skills. Can we determine how many components affect the data? This is an important problem, because decisions made here have a large impact on all downstream data analysis. Consequently, many approaches have been developed. Parallel Analysis is a popular permutation method: it randomly scrambles each feature of the data. It selects components if their singular values are larger than those of the permuted data. Despite widespread use, as well as empirical evidence for its accuracy, it currently has no theoretical justification. In this paper, we show that parallel analysis (or permutation methods) consistently select the large components in certain high-dimensional factor models. However, when the signals are too large, the smaller components are not selected. The intuition is that permutations keep the noise invariant, while "destroying" the low-rank signal. This provides justification for permutation methods. Our work also uncovers drawbacks of permutation methods, and paves the way to improvements.
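The column-permutation idea can be sketched directly (a minimal illustration with assumed names, not the authors' implementation): permuting each feature independently preserves each column's marginal distribution while destroying the shared low-rank structure, so the permuted singular values serve as a noise baseline.

```python
import numpy as np

rng = np.random.default_rng(1)

def parallel_analysis(X, n_perm=20, quantile=0.95, rng=rng):
    """Parallel analysis: permute each column of X independently,
    then keep leading components whose singular values beat the
    corresponding quantile of the permuted singular values."""
    sv = np.linalg.svd(X, compute_uv=False)
    null = np.empty((n_perm, sv.size))
    for b in range(n_perm):
        Xp = np.column_stack([rng.permutation(col) for col in X.T])
        null[b] = np.linalg.svd(Xp, compute_uv=False)
    thresh = np.quantile(null, quantile, axis=0)
    k = 0
    while k < sv.size and sv[k] > thresh[k]:
        k += 1
    return k

# planted rank-2 signal plus i.i.d. Gaussian noise
n, p, r = 200, 20, 2
U, _ = np.linalg.qr(rng.normal(size=(n, r)))
V, _ = np.linalg.qr(rng.normal(size=(p, r)))
X = 60 * (U @ V.T) + rng.normal(size=(n, p))
k = parallel_analysis(X)  # expected to recover the planted rank
```

Note how the permuted data retain the signal's contribution to each column's variance; this inflation of the null singular values is exactly the mechanism behind the shadowing drawback discussed in the abstract.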
ADAPTIVE TEST OF INDEPENDENCE BASED ON HSIC MEASURES
The Hilbert–Schmidt Independence Criterion (HSIC) is a dependence measure based on reproducing kernel Hilbert spaces that is widely used to test independence between two random vectors. There remains the delicate choice of the kernel. In this work, we develop a new HSIC-based aggregated procedure which avoids such a kernel choice, and provide theoretical guarantees for this procedure. To achieve this, on the one hand, we introduce non-asymptotic single tests based on Gaussian kernels with a given bandwidth, which are of prescribed level. Then, we aggregate several single tests with different bandwidths, and prove sharp upper bounds for the uniform separation rate of the aggregated procedure over Sobolev balls. On the other hand, we provide a lower bound for the non-asymptotic minimax separation rate of testing over Sobolev balls, and deduce that the aggregated procedure is adaptive in the minimax sense over such regularity spaces. Finally, from a practical point of view, we perform numerical studies in order to assess the efficiency of our aggregated procedure and compare it to existing tests in the literature.
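A single HSIC test with a fixed Gaussian bandwidth, calibrated by permutation, is the basic building block; the paper's aggregation over bandwidths is omitted here, and all names are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

def gaussian_gram(x, bandwidth):
    """Gram matrix of the Gaussian kernel for a 1-D sample."""
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-d2 / (2 * bandwidth ** 2))

def hsic(x, y, bx=1.0, by=1.0):
    """Biased V-statistic estimate: HSIC = trace(K H L H) / n^2,
    where H = I - 11^T/n is the centering matrix."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    K, L = gaussian_gram(x, bx), gaussian_gram(y, by)
    return np.trace(K @ H @ L @ H) / n ** 2

def hsic_perm_test(x, y, n_perm=200, rng=rng):
    """Permutation p-value: shuffling y breaks any dependence on x."""
    obs = hsic(x, y)
    count = sum(hsic(x, rng.permutation(y)) >= obs for _ in range(n_perm))
    return (count + 1) / (n_perm + 1)

x = rng.normal(size=100)
y = x ** 2 + 0.1 * rng.normal(size=100)  # dependent but uncorrelated
p = hsic_perm_test(x, y)
```

The quadratic example is the standard motivation for kernel measures: x and y are (nearly) uncorrelated, yet HSIC detects the dependence.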
Deterministic parallel analysis
Factor analysis and principal component analysis are used in many application areas. The first step, choosing the number of components, remains a serious challenge. Our work proposes improved methods for this important problem. One of the most popular state-of-the-art methods is parallel analysis (PA), which compares the observed factor strengths with simulated strengths under a noise-only model. The paper proposes improvements to PA. We first derandomize it, proposing deterministic PA, which is faster and more reproducible than PA. Both PA and deterministic PA are prone to a shadowing phenomenon in which a strong factor makes it difficult to detect smaller but more interesting factors. We propose deflation to counter shadowing. We also propose to raise the decision threshold to improve estimation accuracy. We prove several consistency results for our methods, and test them in simulations. We also illustrate our methods on data from the Human Genome Diversity Project, where they significantly improve accuracy.
Molecular Dynamics Simulation Studies on the Aggregation of Amyloid-β Peptides and Their Disaggregation by Ultrasonic Wave and Infrared Laser Irradiation
Alzheimer’s disease is understood to be caused by amyloid fibrils and oligomers formed by aggregated amyloid-β (Aβ) peptides. This review article presents molecular dynamics (MD) simulation studies of Aβ peptides and Aβ fragments on their aggregation, aggregation inhibition, amyloid fibril conformations in equilibrium, and disruption of the amyloid fibril by ultrasonic wave and infrared laser irradiation. In the aggregation of Aβ, a β-hairpin structure promotes the formation of intermolecular β-sheet structures. Aβ peptides tend to exist at hydrophilic/hydrophobic interfaces and form more β-hairpin structures than in bulk water. These facts are the reasons why the aggregation is accelerated at the interface. We also explain how polyphenols, which are attracting attention as aggregation inhibitors of Aβ peptides, interact with Aβ. An MD simulation study of the Aβ amyloid fibrils in equilibrium is also presented: the Aβ amyloid fibril has a different structure at one end from that at the other end. The amyloid fibrils can be destroyed by ultrasonic wave and infrared laser irradiation. The molecular mechanisms of these amyloid fibril disruptions are also explained, particularly focusing on the function of water molecules. Finally, we discuss the prospects for developing treatments for Alzheimer’s disease using MD simulations.
Analysis of Water Yield Changes from 1981 to 2018 Using an Improved Mann-Kendall Test
Water yield (WY) refers to the difference between precipitation and evapotranspiration (ET), which is vital for available terrestrial water. Climate change has led to significant changes in precipitation and evapotranspiration on a global scale, which will affect the global WY. Nevertheless, how terrestrial WY has changed during the past few decades and which factors dominated the WY changes are not fully understood. In this study, based on climate reanalysis and remote sensing data, the spatial and temporal patterns of terrestrial WY from 1981 to 2018 were revisited globally using an improved Mann-Kendall trend test method with a permutation test. The response patterns of WY to precipitation and ET are also investigated. The results show that the global multi-year mean WY is 297.4 mm/a. Based on the traditional Mann-Kendall trend test, terrestrial WY showed a significant (p < 0.05) increase over 5.72% of the total valid grid cells and a significant decrease over 7.68% of them. After correction using the calibration method, the significantly increasing and decreasing areas are reduced by 10.52% and 10.58%, respectively. After the correction, the confirmed increases in WY are mainly located in Africa, eastern North America and Siberia, and the confirmed decreases in parts of Asia and Oceania. The dominant factor for increasing WY is precipitation, while that for decreasing WY is the combined effect of precipitation and evapotranspiration. The achievements of this study are beneficial for improving the understanding of WY in response to hydrological variables in the context of climate change.
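For reference, the classical Mann-Kendall test that the study builds on can be sketched as follows; the permutation-based correction the paper applies is not included, and names are illustrative. The statistic S sums the signs of all pairwise differences, and (with no ties) has variance n(n−1)(2n+5)/18 under the no-trend null.

```python
import math
import numpy as np

def mann_kendall(x):
    """Classical Mann-Kendall trend test (no-ties variance formula),
    with a continuity correction and a two-sided normal p-value."""
    n = len(x)
    s = sum(np.sign(x[j] - x[i])
            for i in range(n) for j in range(i + 1, n))
    var = n * (n - 1) * (2 * n + 5) / 18
    z = (s - np.sign(s)) / math.sqrt(var)
    p = math.erfc(abs(z) / math.sqrt(2))  # 2 * (1 - Phi(|z|))
    return s, z, p

rng = np.random.default_rng(5)
x = 0.05 * np.arange(40) + rng.normal(0, 0.2, 40)  # noisy upward trend
s, z, p = mann_kendall(x)
```

Because S depends only on the ranks of the series, the test is robust to non-normal data, which is why it is a standard choice for hydrological trend analysis.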
Child-parent associations of hematocrit in trios of Japanese adulthood confirmed by the random family method: The TMM BirThree Cohort Study
To examine child-parent associations of hematocrit (HCT) among Japanese adults and their parents, factors associated with HCT were analyzed in 3,574 sons and 7,203 daughters using Pearson's correlation coefficient and Student's t-test. Multiple linear regression analysis, adjusted for the factors identified by univariate analyses and for living with parents, was performed on 242 son-parent trios and 587 daughter-parent trios. When a child-parent association was observed in the multiple linear regression analysis, it was validated using the random family method (RFM). In univariate analyses, the son's HCT was associated with age (correlation coefficient = −0.072), white blood cell count (WBC) (0.19), alanine aminotransferase (ALT) (0.20), triglyceride (0.11), and estimated glomerular filtration rate (eGFR) (−0.087). The daughter's HCT was associated with WBC (0.014), ALT (0.18), and eGFR (−0.17). In multiple linear regression analysis, the son's HCT was associated with the son's WBC (coefficient = 3.48 × 10⁻⁴), the son's eGFR (0.031), the father's HCT (0.11), and the mother's HCT (0.17). RFM confirmed the association between the son's and father's HCT (p = 0.0070) and between the son's and mother's HCT (p = 0.0011). The daughter's HCT was associated with WBC (2.6 × 10⁻⁴), ALT (0.037), and the mother's HCT (0.14). RFM confirmed the association between the daughter's and mother's HCT (p = 0.00043). A child-parent association of HCT was confirmed for son-father, son-mother, and daughter-mother relationships, and differed depending on the sex of the child and the parents.
Dynamics of the Forced Thirring Instanton with Two Forcing Terms
We consider the dynamical behavior of fermionic instanton solutions of the Thirring model with two forcing terms. In particular, the study focuses on the effect of the frequency and amplitude of the forcing terms on the behavior of the fermionic instanton solutions. Numerical analyses based on the Smaller Alignment Index (SALI) method, bifurcation diagrams, and permutation entropy (PE) are used to show how and which dynamical behaviors occur in the system. Color maps and diagrams of the SALI time S_t (the time required for the SALI index to fall below a threshold value of 10⁻¹²) and of the PE of the system with respect to varying frequency and amplitude values of the forcing terms are plotted comparatively to determine the different dynamical behaviors of the system. The study shows that the fermionic instanton solutions exhibit a wide variety of dynamical behavior due to the two forcing terms. Furthermore, it is emphasized that the SALI time S_t can be easily compared with the bifurcation diagram and the complexity measure as a fast, efficient and precise way to investigate different degrees of chaos. In general, the instanton solutions with two forcing terms have been observed to exhibit different types of dynamical behavior. In this study, identical and symmetric coexisting attractors are demonstrated for these different types of behavior of the forced Thirring instanton.
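Permutation entropy itself is straightforward to compute; the following is a minimal sketch of the Bandt-Pompe procedure with illustrative names, not the authors' code: slide a window over the series, map each window to its ordinal pattern, and take the Shannon entropy of the pattern frequencies.

```python
import math
import numpy as np

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Bandt-Pompe permutation entropy: the Shannon entropy of the
    distribution of ordinal patterns (argsort of each window),
    optionally normalized by log2(order!) to lie in [0, 1]."""
    n = len(x) - (order - 1) * delay
    counts = {}
    for i in range(n):
        pattern = tuple(np.argsort(x[i:i + order * delay:delay]))
        counts[pattern] = counts.get(pattern, 0) + 1
    probs = np.array(list(counts.values())) / n
    h = -np.sum(probs * np.log2(probs))
    return h / math.log2(math.factorial(order)) if normalize else h

pe_ramp = permutation_entropy(np.arange(100.0))        # one pattern only
rng = np.random.default_rng(4)
pe_noise = permutation_entropy(rng.normal(size=2000))  # near-uniform patterns
```

Regular signals concentrate on few ordinal patterns (low PE), while white noise spreads over all order! patterns (PE near 1), which is what makes PE useful as a fast complexity measure for distinguishing dynamical regimes.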
High-Precision Switched Capacitor Device with an Energy Estimation Circuit
This article introduces a device for the precise testing of the non-linearities of inductive voltage dividers and digitizers used in digital impedance bridges. The device is based on switched capacitors composed of an NP0 dielectric, in addition to high-quality microwave relays. The article discusses issues concerning the symmetrization of the device as one of the main ways to achieve high accuracy. Furthermore, a temperature-stabilization system for the device is presented. The system uses a battery management system to estimate the quantity of energy available in the supply battery. The article further discusses problems encountered with the design of heating elements, which are situated on a laminate with an aluminum substrate.