52,524 result(s) for "normal distribution"
On Moments of Folded and Truncated Multivariate Normal Distributions
Recurrence relations for integrals that involve the density of multivariate normal distributions are developed. These recursions allow fast computation of the moments of folded and truncated multivariate normal distributions. Besides being numerically efficient, the proposed recursions also allow us to obtain explicit expressions of low-order moments of folded and truncated multivariate normal distributions. Supplementary material for this article is available online.
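The recursions above target the general multivariate case; in one dimension the folded- and truncated-normal moments already have well-known closed forms, which make a useful sanity check. A minimal standard-library sketch (function names are my own, not the paper's):

```python
import math

def phi(x):
    """Standard normal density."""
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

def Phi(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def truncated_normal_mean(a, b):
    """Mean of N(0,1) truncated to [a, b]: (phi(a) - phi(b)) / (Phi(b) - Phi(a))."""
    return (phi(a) - phi(b)) / (Phi(b) - Phi(a))

def folded_normal_mean(mu, sigma):
    """Mean of |X| for X ~ N(mu, sigma^2), i.e. the first folded-normal moment."""
    return (sigma * math.sqrt(2.0 / math.pi) * math.exp(-mu * mu / (2.0 * sigma * sigma))
            + mu * (1.0 - 2.0 * Phi(-mu / sigma)))
```

Both formulas reduce to E|Z| = sqrt(2/pi) for a standard normal Z, truncated to the positive half-line in the first case and folded at zero in the second.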
An Improved RANSAC for 3D Point Cloud Plane Segmentation Based on Normal Distribution Transformation Cells
Plane segmentation is a basic task in the automatic reconstruction of indoor and urban environments from unorganized point clouds acquired by laser scanners. As one of the most common plane-segmentation methods, standard Random Sample Consensus (RANSAC) is often used to detect planes one after another. However, it suffers from the spurious-plane problem when noise and outliers exist, owing to the uncertainty of randomly sampling a minimum subset of 3 points. An improved RANSAC method based on Normal Distribution Transformation (NDT) cells is proposed in this study to avoid spurious planes in 3D point-cloud plane segmentation. A planar NDT cell is selected as a minimal sample in each iteration to ensure that sampling takes place on the same plane surface. The 3D NDT represents the point cloud with a set of NDT cells and models the observed points with a normal distribution within each cell. The geometric appearances of the NDT cells are used to classify them into planar and non-planar cells. The proposed method is verified on three indoor scenes. The experimental results show that the correctness exceeds 88.5% and the completeness exceeds 85.0%, indicating that the proposed method identifies more reliable and accurate planes than standard RANSAC while also executing faster.
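For contrast with the NDT-cell sampling the paper proposes, here is a sketch of the standard 3-point RANSAC baseline it improves on, assuming NumPy is available (function names and thresholds are illustrative, not from the paper):

```python
import numpy as np

def fit_plane(pts):
    """Least-squares plane through pts: unit normal n and offset d with n.x + d = 0."""
    c = pts.mean(axis=0)
    _, _, vt = np.linalg.svd(pts - c)
    n = vt[-1]                       # direction of least variance = plane normal
    return n, -n @ c

def ransac_plane(points, n_iters=200, tol=0.02, rng=None):
    """Standard 3-point RANSAC plane detection (the baseline, prone to spurious
    planes because the 3 sampled points may not lie on one real surface)."""
    rng = np.random.default_rng(rng)
    best_inliers = np.zeros(len(points), dtype=bool)
    for _ in range(n_iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n, d = fit_plane(sample)
        inliers = np.abs(points @ n + d) < tol
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    n, d = fit_plane(points[best_inliers])     # refit on the consensus set
    final_inliers = np.abs(points @ n + d) < tol
    return n, d, final_inliers
```

The paper replaces the random 3-point draw with a draw of one planar NDT cell, which is what removes the spurious-plane failure mode.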
Models for Extremal Dependence Derived from Skew-symmetric Families
Skew-symmetric families of distributions such as the skew-normal and skew-t represent supersets of the normal and t distributions, and they exhibit richer classes of extremal behaviour. By defining a non-stationary skew-normal process, which allows the easy handling of positive definite, non-stationary covariance functions, we derive a new family of max-stable processes – the extremal skew-t process. This process is a superset of non-stationary processes that include the stationary extremal-t processes. We provide the spectral representation and the resulting angular densities of the extremal skew-t process and illustrate its practical implementation.
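The skew-normal building block can be simulated via the well-known stochastic representation X = delta*|Z0| + sqrt(1 - delta^2)*Z1 with delta = alpha/sqrt(1 + alpha^2); the standard-library sketch below shows only that ingredient, not the extremal skew-t process itself (the function name is mine):

```python
import math
import random

def rsn(alpha, n, seed=0):
    """Draw n samples from the standard skew-normal SN(alpha) using the
    stochastic representation X = delta*|Z0| + sqrt(1-delta^2)*Z1."""
    rng = random.Random(seed)
    delta = alpha / math.sqrt(1.0 + alpha * alpha)
    tail = math.sqrt(1.0 - delta * delta)
    out = []
    for _ in range(n):
        z0, z1 = rng.gauss(0.0, 1.0), rng.gauss(0.0, 1.0)
        out.append(delta * abs(z0) + tail * z1)
    return out
```

A quick check: the SN(alpha) mean is delta*sqrt(2/pi), so large positive alpha pushes the sample mean toward sqrt(2/pi).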
The normal law under linear restrictions: simulation and estimation via minimax tilting
Simulation from the truncated multivariate normal distribution in high dimensions is a recurrent problem in statistical computing and is typically only feasible by using approximate Markov chain Monte Carlo sampling. We propose a minimax tilting method for exact independently and identically distributed simulation from the truncated multivariate normal distribution. The new methodology provides both a method for simulation and an efficient estimator for hitherto intractable Gaussian integrals. We prove that the estimator possesses a rare vanishing-relative-error asymptotic property. Numerical experiments suggest that the proposed scheme is accurate in a wide range of set-ups for which competing estimation schemes fail. We give an application to exact independently and identically distributed simulation from the Bayesian posterior of the probit regression model.
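As a point of reference for why minimax tilting is needed: the naive alternative is rejection sampling, whose acceptance rate collapses as the dimension grows or the truncation region moves into the tails. A NumPy sketch of that baseline (not the paper's method; names are illustrative):

```python
import numpy as np

def rejection_tmvn(mean, cov, lower, upper, n, rng=None, max_tries=1_000_000):
    """Naive rejection sampler for the truncated MVN: draw from N(mean, cov)
    and keep draws inside the box [lower, upper]. The acceptance probability
    equals the Gaussian mass of the box, which vanishes in the regimes the
    minimax-tilting method is built for."""
    rng = np.random.default_rng(rng)
    out, tries = [], 0
    while len(out) < n and tries < max_tries:
        x = rng.multivariate_normal(mean, cov)
        tries += 1
        if np.all(x >= lower) and np.all(x <= upper):
            out.append(x)
    return np.array(out), tries
```

In 2D with the positive orthant as the box, about one draw in four is accepted; in 100 dimensions the same box accepts roughly one draw in 2^100, which is why exact i.i.d. simulation needs a smarter proposal.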
Time series modelling to forecast the confirmed and recovered cases of COVID-19
Coronaviruses are enveloped RNA viruses from the Coronaviridae family affecting the neurological, gastrointestinal, hepatic and respiratory systems. In late 2019 a new member of this family, belonging to the Betacoronavirus genus (referred to as COVID-19), emerged and spread quickly across the world, calling for strict containment plans and policies. In most countries the outbreak has been serious: the number of confirmed COVID-19 cases has increased daily while, fortunately, the number of recovered cases has also increased. Forecasting the confirmed and recovered COVID-19 cases therefore helps in planning disease-control measures and in allocating health-care resources. Time series models based on statistical methodology are useful for modelling and forecasting time-indexed data. Autoregressive time series models based on two-piece scale-mixture normal distributions, called TP-SMN-AR models, form a flexible family encompassing many classical symmetric/asymmetric and light-/heavy-tailed autoregressive models. In this paper, we use this family of models to analyze real-world time series data of confirmed and recovered COVID-19 cases.
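The Gaussian-error AR model is the classical special case that the TP-SMN-AR family generalizes. A minimal least-squares sketch of that baseline in NumPy (function names are illustrative, not the paper's):

```python
import numpy as np

def fit_ar(x, p):
    """Least-squares fit of x_t = c + a_1 x_{t-1} + ... + a_p x_{t-p} + e_t.
    Returns [c, a_1, ..., a_p]."""
    x = np.asarray(x, dtype=float)
    y = x[p:]
    lags = [x[p - i : len(x) - i] for i in range(1, p + 1)]   # x_{t-1}, ..., x_{t-p}
    X = np.column_stack([np.ones(len(y))] + lags)
    return np.linalg.lstsq(X, y, rcond=None)[0]

def forecast_ar(x, coef, steps):
    """Iterated multi-step forecast: feed each prediction back in as history."""
    hist = list(x)
    p = len(coef) - 1
    preds = []
    for _ in range(steps):
        nxt = coef[0] + sum(coef[i] * hist[-i] for i in range(1, p + 1))
        preds.append(nxt)
        hist.append(nxt)
    return preds
```

TP-SMN-AR keeps this autoregressive structure but replaces the Gaussian error with a two-piece scale-mixture normal, which is what buys the asymmetry and heavy tails.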
Multi-objective generalized normal distribution optimization: a novel algorithm for multi-objective problems
This study introduces the Multi-objective Generalized Normal Distribution Optimization (MOGNDO) algorithm, an advancement of the Generalized Normal Distribution Optimization (GNDO) algorithm, now adapted for multi-objective optimization tasks. The GNDO algorithm, previously known for its effectiveness in single-objective optimization, has been enhanced with two key features for multi-objective optimization. The first is the addition of an archival mechanism to store non-dominated Pareto optimal solutions, ensuring a detailed record of the best outcomes. The second enhancement is a new leader selection mechanism, designed to strategically identify and select the best solutions from the archive to guide the optimization process. This enhancement positions MOGNDO as a cutting-edge solution in multi-objective optimization, setting a new benchmark for evaluating its performance against leading algorithms in the field. The algorithm's effectiveness is rigorously tested across 35 varied case studies, encompassing both mathematical and engineering challenges, and benchmarked against prominent algorithms like MOPSO, MOGWO, MOHHO, MSSA, MOALO, MOMVO, and MOAOS. Utilizing metrics such as Generational Distance (GD), Inverted Generational Distance (IGD), and Maximum Spread (MS), the study underscores MOGNDO's ability to produce Pareto fronts of high quality, marked by exceptional precision and diversity. The results affirm MOGNDO's superior performance and versatility, not only in theoretical tests but also in addressing complex real-world engineering problems, showcasing its high convergence and coverage capabilities. The source codes of the MOGNDO algorithm are publicly available at  https://nimakhodadadi.com/algorithms-%2B-codes .
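The archival mechanism described above amounts to maintaining a set of mutually non-dominated solutions. A minimal sketch of that bookkeeping for minimization problems (illustrative only, not MOGNDO's actual implementation):

```python
def dominates(a, b):
    """a dominates b (minimization): a is no worse everywhere, strictly better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def update_archive(archive, candidate):
    """Insert candidate into a non-dominated archive: reject it if any archived
    point dominates it, otherwise add it and prune the points it dominates."""
    if any(dominates(a, candidate) for a in archive):
        return archive
    return [a for a in archive if not dominates(candidate, a)] + [candidate]
```

A real multi-objective optimizer adds a capacity limit and a diversity-aware leader-selection rule on top of this, which is exactly the second enhancement the abstract describes.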
Consistency Issues in Skew Random Fields: Investigating Proposed Alternatives and Identifying Persisting Problems
Multiple researchers have proposed skew random fields derived from multivariate skew distributions, yet the consistency of these fields has been questioned. Mahmoudian (2018) and Saber et al. (2018) have put forth alternative constructions to address these concerns. In our study, we show that the random fields outlined by Mahmoudian (2018) continue to exhibit consistency issues, suggesting a flaw in their definition. Finally, we propose a skew random field and apply it to spatial prediction.
Automated bimodal pause analysis for acoustic markers of cognitive decline and Alzheimer's disease in connected speech
INTRODUCTION: This study introduces a language-independent, acoustic-based method to identify the bimodal pauses in connected speech related to Alzheimer's disease (AD) through a log-normal distribution, aiming to explore pausing behavior as a digital marker of cognitive decline.
METHODS: We fitted a bimodal log-normal distribution to 4473 pauses automatically extracted through acoustic analysis. We compared linear and logarithmic pause indices between cognitive groups and explored their neurocognitive correlates.
RESULTS: We empirically revealed a dual-mode pause distribution, establishing a threshold of ≈180 ms to differentiate short and long pauses. This bimodal distribution effectively distinguished cognitive groups, driven by variations in the central tendency of long pauses. Both pause types were elevated in individuals with mild cognitive impairment and correlated with tau and amyloid levels.
DISCUSSION: Bimodal pause distribution shows promise as a sensitive speech-based indicator of cognitive decline, linking closely to AD biomarkers. We introduce a refined, unbiased, language-independent framework for broader application across diverse populations.
HIGHLIGHTS: Pausing in connected speech was investigated as a digital marker of cognitive decline. The bimodal log-normal pause distribution distinguishes between cognitive groups. Short (80–180 ms) and long (> 180 ms) pauses correlate with tau and amyloid.
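The short/long split can be illustrated with the reported ≈180 ms threshold. The standard-library sketch below fixes the threshold rather than estimating it from a fitted bimodal log-normal as the study does, and the field names are mine:

```python
import math

def pause_indices(durations_ms, threshold_ms=180.0):
    """Split pause durations at a fixed threshold and summarize each mode by its
    geometric mean (the natural central tendency for log-normally distributed data)."""
    short = [d for d in durations_ms if d < threshold_ms]
    long_ = [d for d in durations_ms if d >= threshold_ms]

    def gmean(xs):
        return math.exp(sum(math.log(x) for x in xs) / len(xs)) if xs else float("nan")

    return {"n_short": len(short), "n_long": len(long_),
            "gmean_short_ms": gmean(short), "gmean_long_ms": gmean(long_)}
```

Working in the log domain is what makes the geometric mean, rather than the arithmetic mean, the appropriate summary of each pause mode.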
Conjugate Bayes for probit regression via unified skew-normal distributions
Regression models for dichotomous data are ubiquitous in statistics. Besides being useful for inference on binary responses, these methods serve as building blocks in more complex formulations, such as density regression, nonparametric classification and graphical models. Within the Bayesian framework, inference proceeds by updating the priors for the coefficients, typically taken to be Gaussians, with the likelihood induced by probit or logit regressions for the responses. In this updating, the apparent absence of a tractable posterior has motivated a variety of computational methods, including Markov chain Monte Carlo routines and algorithms that approximate the posterior. Despite being implemented routinely, Markov chain Monte Carlo strategies have mixing or time-inefficiency issues in large-p and small-n studies, whereas approximate routines fail to capture the skewness typically observed in the posterior. In this article it is proved that the posterior distribution for the probit coefficients has a unified skew-normal kernel under Gaussian priors. This result allows efficient Bayesian inference for a wide class of applications, especially in large-p and small-to-moderate-n settings where state-of-the-art computational methods face notable challenges. These advances are illustrated in a genetic study, and further motivate the development of a wider class of conjugate priors for probit models, along with methods for obtaining independent and identically distributed samples from the unified skew-normal posterior.
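For context, the Markov chain Monte Carlo route that the closed-form unified skew-normal result improves on is the classical Albert-Chib data-augmentation Gibbs sampler. A compact NumPy sketch under a N(0, tau^2 I) prior (this is the baseline, not the paper's method; names are illustrative):

```python
import numpy as np

def probit_gibbs(X, y, n_iter=500, tau2=100.0, rng=None):
    """Albert-Chib Gibbs sampler for Bayesian probit regression: alternate between
    latent truncated-normal utilities z and a Gaussian draw of the coefficients."""
    rng = np.random.default_rng(rng)
    n, p = X.shape
    V = np.linalg.inv(X.T @ X + np.eye(p) / tau2)   # posterior covariance given z
    beta = np.zeros(p)
    draws = []
    for _ in range(n_iter):
        mu = X @ beta
        # z_i ~ N(mu_i, 1) truncated to (0, inf) if y_i = 1, else (-inf, 0);
        # simple rejection is fine for this small illustration
        z = np.empty(n)
        for i in range(n):
            while True:
                cand = rng.normal(mu[i], 1.0)
                if (cand > 0) == (y[i] == 1):
                    z[i] = cand
                    break
        beta = rng.multivariate_normal(V @ (X.T @ z), V)
        draws.append(beta)
    return np.array(draws)
```

The mixing and cost issues of exactly this kind of sampler in large-p, small-n settings are what motivate sampling directly from the unified skew-normal posterior instead.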
The Skew-normal Distribution and Related Multivariate Families
This paper provides an introductory overview of a portion of distribution theory which is currently under intense development. The starting point of this topic has been the so-called skew-normal distribution, but the connected area is becoming increasingly broad, and its branches now include many extensions, such as the skew-elliptical families, and some forms of semiparametric formulations, extending the relevance of the field much beyond the original theme of 'skewness'. The final part of the paper illustrates connections with various areas of application, including selective sampling, models for compositional data, robust methods, some problems in econometrics, non-linear time series, especially in connection with financial data, and more.
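The starting point mentioned above is Azzalini's skew-normal density f(x) = 2*phi(x)*Phi(alpha*x), where phi and Phi are the standard normal density and CDF. A minimal standard-library evaluation (the function name is mine):

```python
import math

def skew_normal_pdf(x, alpha):
    """Azzalini's standard skew-normal density f(x) = 2*phi(x)*Phi(alpha*x).
    alpha = 0 recovers the standard normal; alpha's sign controls the skew direction."""
    density = math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)   # phi(x)
    cdf = 0.5 * (1.0 + math.erf(alpha * x / math.sqrt(2.0)))      # Phi(alpha*x)
    return 2.0 * density * cdf
```

The factor 2*Phi(alpha*x) reweights the normal density toward one tail while keeping the total mass equal to one for every alpha.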