Search Results

473 results for "Bayesian framework"
Detection of Co-salient Objects by Looking Deep and Wide
In this paper, we propose a unified co-salient object detection framework by introducing two novel insights: (1) looking deep to transfer higher-level representations by using a convolutional neural network with additional adaptive layers could better reflect the semantic properties of the co-salient objects; (2) looking wide to take advantage of the visually similar neighbors from other image groups could effectively suppress the influence of the common background regions. The wide and deep information is explored for the object proposal windows extracted in each image. The window-level co-saliency scores are calculated by integrating the intra-image contrast, the intra-group consistency, and the inter-group separability via a principled Bayesian formulation, and are then converted to superpixel-level co-saliency maps through a foreground region agreement strategy. Comprehensive experiments on two existing datasets and one newly established dataset have demonstrated the consistent performance gain of the proposed approach.
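The Bayesian fusion of window-level cues described above can be illustrated with a deliberately simplified sketch: treat each cue (intra-image contrast, intra-group consistency, inter-group separability) as an independent likelihood ratio and combine them into a posterior co-saliency score. This naive-independence treatment and the cue values are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def fuse_cosaliency_scores(contrast, consistency, separability, prior=0.5):
    """Naive-Bayes-style fusion of three window-level cues into a posterior
    co-saliency probability. Each cue is assumed to be a likelihood ratio
    P(cue | co-salient) / P(cue | background); this is only a schematic
    stand-in for the Bayesian formulation referenced in the abstract."""
    cues = np.asarray([contrast, consistency, separability], dtype=float)
    log_odds = np.log(prior / (1.0 - prior)) + np.log(cues).sum(axis=0)
    return 1.0 / (1.0 + np.exp(-log_odds))   # posterior P(co-salient | cues)

# Hypothetical likelihood ratios for three proposal windows
contrast     = np.array([2.0, 0.8, 1.5])
consistency  = np.array([3.0, 0.5, 1.2])
separability = np.array([1.5, 0.7, 0.9])
print(fuse_cosaliency_scores(contrast, consistency, separability))
```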
Modeling the ACMG/AMP variant classification guidelines as a Bayesian classification framework
Purpose: We evaluated the American College of Medical Genetics and Genomics/Association for Molecular Pathology (ACMG/AMP) variant pathogenicity guidelines for internal consistency and compatibility with Bayesian statistical reasoning. Methods: The ACMG/AMP criteria were translated into a naive Bayesian classifier, assuming four levels of evidence and exponentially scaled odds of pathogenicity. We tested this framework with a range of prior probabilities and odds of pathogenicity. Results: We modeled the ACMG/AMP guidelines using biologically plausible assumptions. Most ACMG/AMP combining criteria were compatible. One ACMG/AMP likely pathogenic combination was mathematically equivalent to pathogenic, and one ACMG/AMP pathogenic combination was actually likely pathogenic. We modeled combinations that include evidence for and against pathogenicity, showing that our approach scored some combinations as pathogenic or likely pathogenic that ACMG/AMP would designate as variant of uncertain significance (VUS). Conclusion: By transforming the ACMG/AMP guidelines into a Bayesian framework, we provide a mathematical foundation for what was a qualitative heuristic. Only 2 of the 18 existing ACMG/AMP evidence combinations were mathematically inconsistent with the overall framework. Mixed combinations of pathogenic and benign evidence could yield a likely pathogenic, likely benign, or VUS result. This quantitative framework validates the approach adopted by the ACMG/AMP, provides opportunities to further refine evidence categories and combining rules, and supports efforts to automate components of variant pathogenicity assessments.
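As a rough, worked illustration of the naive Bayesian combination the abstract describes, the sketch below scales evidence exponentially (supporting, moderate, strong, very strong) and converts the combined odds of pathogenicity into a posterior probability via Bayes' rule in odds form. The default prior of 0.10 and very-strong odds of 350 are the values commonly quoted for this modeling exercise, but they and the handling of benign evidence are stated here as assumptions, not as the paper's definitive parameters.

```python
def combined_odds(n_supporting=0, n_moderate=0, n_strong=0, n_very_strong=0,
                  n_benign_supporting=0, n_benign_strong=0, odds_very_strong=350.0):
    """Exponentially scaled odds of pathogenicity: supporting, moderate, strong and
    very strong pathogenic evidence carry exponents 1/8, 1/4, 1/2 and 1; benign
    evidence is assumed here to contribute negative exponents of the same size."""
    exponent = (n_supporting / 8 + n_moderate / 4 + n_strong / 2 + n_very_strong
                - n_benign_supporting / 8 - n_benign_strong / 2)
    return odds_very_strong ** exponent

def posterior_pathogenicity(odds, prior=0.10):
    """Bayes' rule in odds form: P = O * prior / ((O - 1) * prior + 1)."""
    return odds * prior / ((odds - 1.0) * prior + 1.0)

# Example: one strong plus two moderate pathogenic criteria. With the assumed
# prior and odds this evaluates to about 0.97, i.e. a "likely pathogenic" range.
odds = combined_odds(n_moderate=2, n_strong=1)
print(round(posterior_pathogenicity(odds), 3))
```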
Hierarchical Cellular Automata for Visual Saliency
Saliency detection, finding the most important parts of an image, has become increasingly popular in computer vision. In this paper, we introduce Hierarchical Cellular Automata (HCA)—a temporally evolving model to intelligently detect salient objects. HCA consists of two main components: Single-layer Cellular Automata (SCA) and Cuboid Cellular Automata (CCA). As an unsupervised propagation mechanism, Single-layer Cellular Automata can exploit the intrinsic relevance of similar regions through interactions with neighbors. Low-level image features as well as high-level semantic information extracted from deep neural networks are incorporated into the SCA to measure the correlation between different image patches. With these hierarchical deep features, an impact factor matrix and a coherence matrix are constructed to balance the influences on each cell’s next state. The saliency values of all cells are iteratively updated according to a well-defined update rule. Furthermore, we propose CCA to integrate multiple saliency maps generated by SCA at different scales in a Bayesian framework. Therefore, single-layer propagation and multi-scale integration are jointly modeled in our unified HCA. Surprisingly, we find that the SCA can improve all existing methods that we applied it to, resulting in a similar precision level regardless of the original results. The CCA can act as an efficient pixel-wise aggregation algorithm that can integrate state-of-the-art methods, resulting in even better results. Extensive experiments on four challenging datasets demonstrate that the proposed algorithm outperforms state-of-the-art conventional methods and is competitive with deep learning based approaches.
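The single-layer propagation step can be sketched generically: each superpixel's saliency at the next step is a coherence-weighted blend (a single scalar here) of its own state and a similarity-weighted average of its neighbors' states. The impact-factor construction, coherence value, and toy features below are illustrative placeholders rather than the exact matrices defined in the paper.

```python
import numpy as np

def sca_propagate(saliency, features, neighbors, coherence=0.6, n_iters=20, sigma=0.1):
    """Single-layer Cellular Automata style propagation:
    s_{t+1} = c * s_t + (1 - c) * F s_t, with F a row-normalized similarity
    (impact factor) matrix restricted to neighboring superpixels."""
    n = len(saliency)
    F = np.zeros((n, n))
    for i in range(n):
        for j in neighbors[i]:
            F[i, j] = np.exp(-np.linalg.norm(features[i] - features[j]) ** 2 / sigma)
    F /= F.sum(axis=1, keepdims=True)          # row-normalize the impact factors
    s = np.asarray(saliency, dtype=float)
    for _ in range(n_iters):
        s = coherence * s + (1.0 - coherence) * F @ s
    return s

# Toy example: 4 superpixels on a chain, with hypothetical deep features
features = np.array([[0.9, 0.1], [0.8, 0.2], [0.2, 0.8], [0.1, 0.9]])
neighbors = {0: [1], 1: [0, 2], 2: [1, 3], 3: [2]}
print(sca_propagate([0.9, 0.4, 0.3, 0.1], features, neighbors))
```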
μ-STAR: A novel framework for spatio-temporal M/EEG source imaging optimized by microstates
Highlights:
• A novel M/EEG source imaging method, μ-STAR, is proposed to estimate source activities.
• Microstate analysis is employed to determine the optimal time window length for source imaging.
• Spatial constraints and temporal basis functions are used to model source dynamics.
• μ-STAR shows superior performance with high spatio-temporal accuracy and fast convergence.
Source imaging of electroencephalography (EEG) and magnetoencephalography (MEG) provides a noninvasive way of monitoring brain activities with high spatial and temporal resolution. In order to address this highly ill-posed problem, conventional source imaging models adopt spatio-temporal constraints that assume spatial stability of the source activities, neglecting the transient characteristics of M/EEG. In this work, a novel source imaging method, μ-STAR, which includes a microstate analysis and a spatio-temporal Bayesian model, is introduced to address this problem. Specifically, the microstate analysis is applied to automatically determine the time window length with a quasi-stable source activity pattern for optimal reconstruction of source dynamics. Then a user-specific spatial prior and data-driven temporal basis functions are utilized to characterize the spatio-temporal information of sources within each state. The solution of the source reconstruction is obtained through a computationally efficient algorithm based upon variational Bayesian inference and convex analysis. The performance of μ-STAR was first assessed through numerical simulations, where we found that the determination and inclusion of the optimal temporal length in the spatio-temporal prior significantly improved the performance of source reconstruction. More importantly, the μ-STAR model achieved robust performance under various settings (i.e., source numbers/areas, SNR levels, and source depths) with fast convergence speed compared with five widely used benchmark models (wMNE, STV, SBL, BESTIES, and SI-STBF). Additional validations on real data were then performed on two publicly available datasets (a block-design face-processing ERP dataset and a continuous resting-state EEG dataset). The reconstructed source activities exhibited spatially and temporally neurophysiologically plausible results consistent with previously revealed neural substrates, thereby further proving the feasibility of the μ-STAR model for source imaging in various applications.
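The microstate-driven choice of window length can be sketched in a simplified way: label each sample with the template map it correlates with most strongly, then take contiguous runs of a single label as quasi-stable windows. The random toy data, the correlation-based labeling, and the template maps are illustrative assumptions; the actual microstate analysis (typically involving GFP peaks and clustering) and the variational Bayesian source model are not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(3)

def microstate_windows(eeg, templates):
    """Label each sample with the template map of highest absolute spatial
    correlation, then return contiguous windows of constant label; such window
    lengths would define the quasi-stable segments used for source imaging."""
    X = eeg - eeg.mean(axis=0, keepdims=True)
    T = templates - templates.mean(axis=0, keepdims=True)
    corr = (T.T @ X) / (np.linalg.norm(T, axis=0)[:, None]
                        * np.linalg.norm(X, axis=0)[None, :])
    labels = np.abs(corr).argmax(axis=0)
    change = np.flatnonzero(np.diff(labels)) + 1
    bounds = np.concatenate([[0], change, [len(labels)]])
    return [(int(bounds[i]), int(bounds[i + 1]), int(labels[bounds[i]]))
            for i in range(len(bounds) - 1)]

# Toy data: 32 channels, 200 samples, 4 hypothetical microstate template maps
eeg = rng.normal(size=(32, 200))
templates = rng.normal(size=(32, 4))
print(microstate_windows(eeg, templates)[:5])   # (start, end, microstate label)
```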
A Bayesian model updating framework for robust seismic fragility analysis of non-isolated historic masonry towers
Seismic assessment of existing masonry structures requires a numerical model able both to reproduce their nonlinear behaviour and to account for the different sources of uncertainty; the latter have to be dealt with, since the unavoidable lack of knowledge of the input parameters (material properties, geometry, boundary conditions, etc.) has a significant effect on the reliability of the seismic response predicted by numerical approaches. The steadily increasing need to combine different sources of information and knowledge makes the Bayesian approach an appealing technique, not yet fully investigated for historic masonry constructions. In fact, while the Bayesian paradigm is currently employed to solve inverse problems in several sectors of the structural engineering domain, only a few studies examine its effectiveness for parameter identification of historic masonry structures. This study combines a Bayesian framework with probabilistic structural analyses: starting from Bayesian finite element model updating using experimental data, it derives robust seismic fragility curves for non-isolated masonry towers. A comparison between this method and the standard deterministic approach illustrates its benefits. This article is part of the theme issue 'Environmental loading of heritage structures'.
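As a generic sketch of Bayesian finite element model updating of the kind referred to above, the snippet below updates a single uncertain parameter (a hypothetical masonry Young's modulus) from one noisy measured natural frequency using a grid posterior. The forward model, prior, and noise level are illustrative assumptions, not the towers or the updating scheme used in the study.

```python
import numpy as np

# Hypothetical forward model: first natural frequency (Hz) of a tower as a
# function of masonry Young's modulus E (GPa); a real study would use an FE model.
def first_frequency(E_gpa):
    return 1.2 * np.sqrt(E_gpa / 2.0)

E_grid = np.linspace(0.5, 6.0, 500)                 # candidate E values (GPa)
prior = np.exp(-0.5 * ((E_grid - 2.5) / 1.0) ** 2)  # Gaussian prior, mean 2.5 GPa
prior /= prior.sum()

f_measured, sigma_f = 1.45, 0.05                    # assumed measurement and noise (Hz)
likelihood = np.exp(-0.5 * ((first_frequency(E_grid) - f_measured) / sigma_f) ** 2)

posterior = prior * likelihood
posterior /= posterior.sum()
post_mean = (E_grid * posterior).sum()
post_std = np.sqrt(((E_grid - post_mean) ** 2 * posterior).sum())
print(f"posterior mean E = {post_mean:.2f} GPa, posterior std = {post_std:.2f} GPa")
```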
Inverse solution of process parameters in gear grinding using hierarchical Bayesian physics-informed neural network (HBPINN)
Accurate inverse solution of process parameters from surface roughness is crucial for precision gear grinding. When inversely solving for process parameters, model parameters are typically obtained by fitting experimental data. However, model parameters exhibit complex correlations and uncertainties, posing significant challenges to the inverse solution. To address these challenges, this study proposes a hierarchical Bayesian physics-informed neural network (HBPINN) for the inverse solution of gear-grinding process parameters. An innovative global-group-individual hierarchical structure is constructed for the model parameters. Correlation analysis among model parameters is conducted through group effects within a hierarchical Bayesian framework, followed by uncertainty analysis. Then, multivariate regression functions describing the relationship between process parameters and surface roughness are constructed to form the physics loss function. The regularization term incorporates the Kullback-Leibler (KL) divergence of the model parameters and is integrated with the empirical loss function. Furthermore, datasets of different scales were established through Gaussian process regression (GPR). Compared with a Bayesian physics-informed neural network (BPINN), a variational-inference Bayesian physics-informed neural network (VI-BPINN), and a physics-informed neural network (PINN), HBPINN demonstrates superior efficiency and accuracy. With a training set size of 200, HBPINN reduced prediction time by a factor of 4–10 and achieved an average R² of 0.9629. The model demonstrates excellent uncertainty quantification capability and robustness.
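The three-part objective the abstract outlines (an empirical data loss, a physics loss built from regression functions linking process parameters to surface roughness, and a KL-divergence regularizer on the model parameters) can be sketched generically as below. The Gaussian KL form, the specific inputs, and the weights are illustrative assumptions, not the actual HBPINN loss.

```python
import numpy as np

def kl_gaussian(mu_q, sigma_q, mu_p=0.0, sigma_p=1.0):
    """KL divergence between univariate Gaussians, summed over parameters."""
    return np.sum(np.log(sigma_p / sigma_q)
                  + (sigma_q**2 + (mu_q - mu_p)**2) / (2.0 * sigma_p**2) - 0.5)

def hbpinn_style_loss(y_pred, y_true, roughness_pred, roughness_physics,
                      mu_q, sigma_q, w_phys=1.0, w_kl=1e-3):
    """Empirical loss + physics-residual loss + KL regularization, mirroring the
    three-term objective sketched in the abstract (weights are illustrative)."""
    empirical = np.mean((y_pred - y_true) ** 2)
    physics = np.mean((roughness_pred - roughness_physics) ** 2)
    return empirical + w_phys * physics + w_kl * kl_gaussian(mu_q, sigma_q)

# Toy call with hypothetical roughness values and variational parameters
print(hbpinn_style_loss(np.array([0.41, 0.38]), np.array([0.40, 0.37]),
                        np.array([0.42, 0.39]), np.array([0.40, 0.38]),
                        mu_q=np.array([0.1, -0.2]), sigma_q=np.array([0.9, 1.1])))
```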
Gravity as a Strong Prior: Implications for Perception and Action
In the future, humans are likely to be exposed to environments with altered gravity conditions, be it only visually (Virtual and Augmented Reality) or visually and bodily (space travel). As visually and bodily perceived gravity, as well as an interiorized representation of earth gravity, are involved in a series of tasks such as catching, grasping, body orientation estimation and spatial inferences, humans will need to adapt to these new gravity conditions. Performance under earth-gravity-discrepant conditions has been shown to be relatively poor, and the few studies conducted on gravity adaptation are rather discouraging. Especially in VR on earth, conflicts between bodily and visual gravity cues seem to make a full adaptation to visually perceived earth-discrepant gravities nearly impossible, and even in space, when visual and bodily cues are congruent, adaptation is extremely slow. We invoke a Bayesian framework for gravity-related perceptual processes, in which earth gravity holds the status of a so-called "strong prior". Like other strong priors, the gravity prior has developed through years and years of experience in an earth gravity environment. For this reason, the reliability of this representation is extremely high and it overrules any sensory information to the contrary. While other factors, such as the multisensory nature of gravity perception, also need to be taken into account, we present the strong prior account as a unifying explanation for empirical results in gravity perception and adaptation to earth-discrepant gravities.
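A standard way to formalize a "strong prior" is precision-weighted Gaussian cue combination: when the prior variance is tiny relative to the sensory variance, the posterior barely moves away from the prior, which is the over-ruling behavior described above. The numerical values below are illustrative, not taken from the article.

```python
def posterior_gaussian(prior_mean, prior_var, obs, obs_var):
    """Posterior of a Gaussian prior combined with a Gaussian likelihood:
    a precision-weighted average of the prior mean and the observation."""
    w_prior = (1.0 / prior_var) / (1.0 / prior_var + 1.0 / obs_var)
    mean = w_prior * prior_mean + (1.0 - w_prior) * obs
    var = 1.0 / (1.0 / prior_var + 1.0 / obs_var)
    return mean, var

# Earth-gravity prior (9.81 m/s^2) assumed far more reliable than a visual cue
# suggesting a much weaker gravity of 2.0 m/s^2.
print(posterior_gaussian(prior_mean=9.81, prior_var=0.01, obs=2.0, obs_var=4.0))
```

With these numbers the posterior mean stays very close to the earth-gravity prior despite the sensory cue suggesting much weaker gravity.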
Optimizing energy and latency in edge computing through a Boltzmann driven Bayesian framework for adaptive resource scheduling
This paper presents a new approach based on the Boltzmann distribution and Bayesian optimization to solve energy-efficient resource allocation in edge computing. It employs Bayesian optimization to iteratively tune the parameters for minimum energy consumption and latency. Coupled with this, a Boltzmann-driven probabilistic action selection mechanism enhances adaptability in selecting low-energy tasks by balancing exploration and exploitation through a dynamically adjusted temperature parameter. Simulation analysis demonstrates that the new method reduces energy consumption and average delay well below those of Round-Robin and threshold-based algorithms. The temperature adaptation within the Boltzmann mechanism further guides the scheduler toward optimal actions while remaining flexible under changing load percentages. Cumulative energy savings reached up to 25% compared to baseline methods, demonstrating the applicability of the framework in real-time, energy-aware applications at the edge. This work demonstrates the viability of combining probabilistic selection with parameter optimization, setting a new benchmark for energy-efficient resource scheduling. Such findings open possibilities for extending the existing literature on hybrid optimization methods to enhance sustainable computing solutions in distributed systems.
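The Boltzmann-driven selection mechanism mentioned above amounts to a softmax over negative cost with a temperature parameter: a high temperature favors exploration, a low temperature favors exploiting the lowest-energy action. The costs, cooling schedule, and task set below are illustrative assumptions, not the paper's scheduler or its Bayesian optimization loop.

```python
import numpy as np

rng = np.random.default_rng(0)

def boltzmann_select(costs, temperature):
    """Sample an action with probability proportional to exp(-cost / T)."""
    logits = -np.asarray(costs, dtype=float) / temperature
    probs = np.exp(logits - logits.max())      # numerically stable softmax
    probs /= probs.sum()
    return rng.choice(len(costs), p=probs), probs

# Hypothetical per-task energy costs (J) for three candidate edge nodes,
# with a simple geometric cooling schedule for the temperature.
costs = [5.0, 3.2, 4.1]
temperature = 2.0
for step in range(5):
    action, probs = boltzmann_select(costs, temperature)
    print(f"T={temperature:.2f}  probs={np.round(probs, 2)}  chosen={action}")
    temperature *= 0.7                         # anneal toward exploitation
```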
High-Resolution Flood Monitoring Based on Advanced Statistical Modeling of Sentinel-1 Multi-Temporal Stacks
High-resolution flood monitoring can be achieved by relying on multi-temporal analysis of remote-sensing SAR data, through the implementation of semi-automated systems. Exploiting a Bayesian inference framework, conditional probabilities can be estimated for the presence of floodwater at each image location and each acquisition date. We developed a procedure for efficient monitoring of floodwaters from SAR data cubes, which adopts a statistical modelling framework for SAR backscatter time series over normally unflooded areas based on Gaussian processes (GPs), in order to highlight flood events as outliers causing abrupt variations in the trends. We found that non-parametric time series modelling improves the performance of Bayesian probabilistic inference with respect to state-of-the-art methodologies using, e.g., parametric fits based on periodic functions, by both reducing false detections and increasing true positives. Our approach also exploits ancillary data derived from a digital elevation model, including slopes, normalized heights above nearest drainage (HAND), and SAR imaging parameters such as shadow and layover conditions. It is tested here over an area that includes the so-called Metaponto Coastal Plain (MCP), in the Basilicata region (southern Italy), which is recurrently subject to floods. We illustrate the ability of our system to detect known (although not ground-truthed) and smaller, undocumented inundation events over large areas, and offer some considerations on its prospective use in contexts affected by similar events, across various land cover scenarios and climatic settings.
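The core statistical idea, fitting a Gaussian process to the backscatter time series of a normally unflooded pixel and flagging acquisitions that fall far outside the predictive interval, can be sketched with a standard GP regression library. The synthetic series, kernel choice, and 3-sigma rule below are illustrative assumptions, not the operational system or its ancillary-data handling.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)

# Synthetic backscatter time series (dB) for a normally unflooded pixel: a slow
# seasonal oscillation plus noise, with one abrupt flood-like drop added.
t = np.arange(60, dtype=float)[:, None]            # acquisition index
sigma0 = -8.0 + 1.5 * np.sin(2 * np.pi * t[:, 0] / 30) + rng.normal(0.0, 0.4, 60)
sigma0[42] -= 6.0                                  # simulated flood acquisition

# Fit the GP on an assumed-unflooded historical stack (first 40 acquisitions),
# then compare every acquisition against the GP predictive interval.
kernel = 1.0 * RBF(length_scale=10.0) + WhiteKernel(noise_level=0.2)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
gp.fit(t[:40], sigma0[:40])

mean, std = gp.predict(t, return_std=True)
z = (sigma0 - mean) / std
print("acquisitions flagged as outliers:", np.flatnonzero(np.abs(z) > 3.0))
```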
Remaining Useful Life Prediction of Lithium-Ion Batteries Based on Wiener Processes with Considering the Relaxation Effect
Remaining useful life (RUL) prediction is of great importance in prognostics and health management (PHM). The relaxation effect refers to the capacity regeneration phenomenon of lithium-ion batteries during a long rest time, which can lead to a regenerated useful time (RUT). This paper studies the influence of the relaxation effect on the degradation law of lithium-ion batteries and proposes a novel RUL prediction method based on Wiener processes, which simplifies the modeling by using the RUT to describe the recovery process. First, the life cycle of a lithium-ion battery is divided into degradation processes, which exclude the relaxation effect, and recovery processes caused by the relaxation effect. Next, the degradation model, after eliminating the relaxation effect, is established based on a linear Wiener process, and the RUT is modeled with a normal distribution. Then, a prior parameter estimation method based on maximum likelihood estimation and an online updating method under the Bayesian framework are proposed. Finally, experiments are carried out on the lithium-ion battery degradation data published by NASA. The results show that the proposed method effectively improves the accuracy of RUL prediction and has strong engineering application value.
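For the linear Wiener degradation model referred to above, X(t) = μt + σB(t), the drift and diffusion parameters have simple maximum-likelihood estimates from the observed increments, and the first passage time to a failure threshold follows an inverse Gaussian distribution whose mean is (D − x_t)/μ. The sketch below shows the estimation step and the resulting mean RUL on synthetic data; the RUT model and the Bayesian online updating from the abstract are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic capacity-fade path from a linear Wiener process X(t) = mu*t + sigma*B(t)
mu_true, sigma_true, dt = 0.002, 0.01, 1.0         # per-cycle drift and diffusion
n_cycles = 150
increments = mu_true * dt + sigma_true * np.sqrt(dt) * rng.normal(size=n_cycles)
x = np.concatenate([[0.0], np.cumsum(increments)]) # cumulative capacity loss

# Maximum-likelihood estimates from the increments of a Wiener process
dx = np.diff(x)
mu_hat = dx.mean() / dt
sigma_hat = np.sqrt(np.mean((dx - mu_hat * dt) ** 2) / dt)

# First passage to threshold D is inverse Gaussian; its mean, given the current
# degradation level x_t, is (D - x_t) / mu.
D, x_t = 0.5, x[-1]                                # assumed failure threshold
mean_rul = (D - x_t) / mu_hat
print(f"mu_hat={mu_hat:.4f}, sigma_hat={sigma_hat:.4f}, mean RUL ~ {mean_rul:.0f} cycles")
```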