Catalogue Search | MBRL
Explore the vast range of titles available.
131 result(s) for "Drovandi, Christopher"
Bayesian Estimation of Small Effects in Exercise and Sports Science
by Gore, Christopher J.; Robert, Christian P.; Drovandi, Christopher C.
in Altitude effects; Athletes; Bayes Theorem
2016
The aim of this paper is to provide a Bayesian formulation of the so-called magnitude-based inference approach to quantifying and interpreting effects, and, in a case study example, to provide accurate probabilistic statements that correspond to the intended magnitude-based inferences. The model is described in the context of a published small-scale athlete study which employed a magnitude-based inference approach to compare the effect of two altitude training regimens (live high-train low (LHTL) and intermittent hypoxic exposure (IHE)) on running performance and blood measurements of elite triathletes. The posterior distributions, and corresponding point and interval estimates, for the parameters and associated effects and comparisons of interest, were estimated using Markov chain Monte Carlo simulations. The Bayesian analysis was shown to provide more direct probabilistic comparisons of treatments and was able to identify small effects of interest. The approach avoided asymptotic assumptions and overcame issues such as multiple testing. Bayesian analysis of unscaled effects showed a probability of 0.96 that LHTL yields a substantially greater increase in hemoglobin mass than IHE, a 0.93 probability of a substantially greater improvement in running economy and a greater than 0.96 probability that both IHE and LHTL yield a substantially greater improvement in maximum blood lactate concentration compared to a placebo. The conclusions are consistent with those obtained using a 'magnitude-based inference' approach that has been promoted in the field. The paper demonstrates that a fully Bayesian analysis is a simple and effective way of analysing small effects, providing a rich set of results that are straightforward to interpret in terms of probabilistic statements.
Journal Article
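The probabilistic statements this abstract describes reduce to tail probabilities of a posterior distribution relative to a smallest worthwhile change. A minimal sketch in Python, where the posterior draws and threshold are purely illustrative assumptions, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical posterior draws for a treatment difference (e.g. the
# LHTL-vs-IHE difference in hemoglobin-mass change). The location and
# scale here are illustrative only, not estimates from the paper.
diff_draws = rng.normal(loc=2.0, scale=1.0, size=10_000)

# Smallest worthwhile change: effects beyond this count as "substantial".
swc = 0.3

# Probability statements of the kind the abstract reports.
p_substantial_increase = np.mean(diff_draws > swc)
p_trivial = np.mean(np.abs(diff_draws) <= swc)

print(f"P(substantially greater increase) = {p_substantial_increase:.3f}")
print(f"P(trivial difference)             = {p_trivial:.3f}")
```

In practice the draws would come from the paper's Markov chain Monte Carlo simulations rather than a normal random generator; the probability statements are then just Monte Carlo averages over those draws.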
A Review of Modern Computational Algorithms for Bayesian Optimal Design
by McGree, James M.; Pettitt, Anthony N.; Drovandi, Christopher C.
in Algorithms; Bayesian analysis; Bayesian optimal design
2016
Bayesian experimental design is a fast-growing area of research with many real-world applications. As computational power has increased over the years, so has the development of simulation-based design methods, which involve a number of algorithms, such as Markov chain Monte Carlo, sequential Monte Carlo and approximate Bayes methods, enabling more complex design problems to be solved. The Bayesian framework provides a unified approach for incorporating prior information and/or uncertainties regarding the statistical model with a utility function which describes the experimental aims. In this paper, we provide a general overview of the concepts involved in Bayesian experimental design, and focus on describing some of the more commonly used Bayesian utility functions and methods for their estimation, as well as a number of algorithms that are used to search over the design space to find the Bayesian optimal design. We also discuss other computational strategies for further research in Bayesian optimal design.
Journal Article
Bayesian Indirect Inference Using a Parametric Auxiliary Model
by Pettitt, Anthony N.; Lee, Anthony; Drovandi, Christopher C.
in Approximation; Bayesian analysis; Comparative studies
2015
Indirect inference (II) is a methodology for estimating the parameters of an intractable (generative) model on the basis of an alternative parametric (auxiliary) model that is both analytically and computationally easier to deal with. Such an approach has been well explored in the classical literature but has received substantially less attention in the Bayesian paradigm. The purpose of this paper is to compare and contrast a collection of what we call parametric Bayesian indirect inference (pBII) methods. One class of pBII methods uses approximate Bayesian computation (referred to here as ABC II), where the summary statistic is formed on the basis of the auxiliary model, using ideas from II. Another approach proposed in the literature, referred to here as parametric Bayesian indirect likelihood (pBIL), uses the auxiliary likelihood as a replacement for the intractable likelihood. We show that pBIL is a fundamentally different approach from ABC II. We devise new theoretical results for pBIL to give extra insights into its behaviour and its differences from ABC II. Furthermore, we examine in more detail the assumptions required to use each pBII method. The results, insights and comparisons developed in this paper are illustrated on simple examples and two substantive applications. The first of the substantive examples involves performing inference for complex quantile distributions based on simulated data, while the second is estimating the parameters of a trivariate stochastic process describing the evolution of macroparasites within a host based on real data. We create a novel framework called Bayesian indirect likelihood (BIL) that encompasses pBII as well as general ABC methods so that the connections between the methods can be established.
Journal Article
LooplessFluxSampler: an efficient toolbox for sampling the loopless flux solution space of metabolic models
by Zapararte, Sebastian; Saa, Pedro A.; Drovandi, Christopher C.
in Adaptation, Physiological; Adaptive sampling; Advertising executives
2024
Background
Uniform random sampling of mass-balanced flux solutions offers an unbiased appraisal of the capabilities of metabolic networks. Unfortunately, it is impossible to avoid thermodynamically infeasible loops in flux samples when using convex samplers on large metabolic models. Current strategies for randomly sampling the non-convex loopless flux space display limited efficiency and lack theoretical guarantees.
Results
Here, we present LooplessFluxSampler, an efficient algorithm for exploring the loopless mass-balanced flux solution space of metabolic models, based on an Adaptive Directions Sampling on a Box (ADSB) algorithm. ADSB is rooted in the general Adaptive Direction Sampling (ADS) framework, specifically the Parallel ADS, for which theoretical convergence and irreducibility results are available for sampling from arbitrary distributions. By sampling directions that adapt to the target distribution, ADSB traverses the sample space more efficiently, achieving faster mixing than other methods. Importantly, the presented algorithm is guaranteed to target the uniform distribution over convex regions, and it provably converges on the latter distribution over more general (non-convex) regions provided the sample can have full support.
Conclusions
LooplessFluxSampler enables scalable statistical inference of the loopless mass-balanced solution space of large metabolic models. Grounded in a theoretically sound framework, this toolbox provides not only efficient but also reliable results for exploring the properties of the almost surely non-convex loopless flux space. Finally, LooplessFluxSampler includes a Markov Chain diagnostics suite for assessing the quality of the final sample and the performance of the algorithm.
Journal Article
Fully Bayesian Experimental Design for Pharmacokinetic Studies
by Pettitt, Anthony; Drovandi, Christopher; Ryan, Elizabeth
in Approximation; Bayesian analysis; Bayesian design
2015
Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future dataset drawn from the prior predictive distribution. Many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature, which rapidly obtains samples from the posterior, is importance sampling, using the prior as the importance distribution. However, importance sampling from the prior will tend to break down if there is a reasonable number of experimental observations. In this paper, we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution to obtain a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study, which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near optimal plasma sampling times that produce precise estimates of pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
Journal Article
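The idea of using a Laplace approximation as the importance distribution can be sketched on a deliberately simple one-parameter model. The model, data and prior below are illustrative assumptions standing in for the pharmacokinetic model, not anything from the paper:

```python
import numpy as np
from scipy import optimize, stats

# Toy stand-in for the pharmacokinetic model: Normal(0, 1) prior on theta,
# Normal(theta, 0.5) likelihood for illustrative observations y.
y = np.array([1.2, 0.8, 1.0, 1.4, 0.9])

def neg_log_post(theta):
    log_prior = stats.norm.logpdf(theta, 0.0, 1.0)
    log_lik = stats.norm.logpdf(y, theta, 0.5).sum()
    return -(log_prior + log_lik)

# Laplace approximation: Gaussian centred at the posterior mode, with
# variance given by the inverse curvature of the negative log posterior.
mode = optimize.minimize_scalar(neg_log_post).x
h = 1e-4
curvature = (neg_log_post(mode + h) - 2 * neg_log_post(mode)
             + neg_log_post(mode - h)) / h**2
laplace_sd = np.sqrt(1.0 / curvature)

# Importance sampling using the Laplace Gaussian, not the prior, as the
# importance distribution.
rng = np.random.default_rng(1)
draws = rng.normal(mode, laplace_sd, size=5_000)
log_w = np.array([-neg_log_post(d) for d in draws]) \
    - stats.norm.logpdf(draws, mode, laplace_sd)
w = np.exp(log_w - log_w.max())
w /= w.sum()

post_mean = np.sum(w * draws)  # self-normalised importance sampling estimate
```

For this conjugate toy model the Laplace approximation matches the exact Gaussian posterior, so the weights are nearly uniform; the point of the construction is that, unlike the prior, it remains well matched to the posterior as the number of observations grows.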
Computationally efficient mechanism discovery for cell invasion with uncertainty quantification
by VandenHeuvel, Daniel J.; Simpson, Matthew J.; Drovandi, Christopher
in Biological activity; Biological models (mathematics); Biology
2022
Parameter estimation for mathematical models of biological processes is often difficult and depends significantly on the quality and quantity of available data. We introduce an efficient framework using Gaussian processes to discover mechanisms underlying delay, migration, and proliferation in a cell invasion experiment. Gaussian processes are leveraged with bootstrapping to provide uncertainty quantification for the mechanisms that drive the invasion process. Our framework is efficient, parallelisable, and can be applied to other biological problems. We illustrate our methods using a canonical scratch assay experiment, demonstrating how simply we can explore different functional forms and develop and test hypotheses about underlying mechanisms, such as whether delay is present. All code and data to reproduce this work are available at https://github.com/DanielVandH/EquationLearning.jl.
Journal Article
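The pairing of Gaussian process regression with bootstrapping for uncertainty quantification can be sketched as follows. The synthetic logistic "cell density" data, kernel and hyperparameters are illustrative assumptions, not the paper's actual setup (which is implemented in EquationLearning.jl):

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in for scratch-assay density data: a noisy logistic curve.
t = np.linspace(0, 10, 40)
u_true = 1.0 / (1.0 + np.exp(-(t - 5)))
u_obs = u_true + rng.normal(0, 0.05, t.size)

def gp_fit_mean(x, y, xs, length=1.5, var=1.0, noise=0.05):
    """Posterior mean of a squared-exponential GP fitted to (x, y)."""
    def k(a, b):
        return var * np.exp(-0.5 * (a[:, None] - b[None, :]) ** 2 / length**2)
    K = k(x, x) + noise**2 * np.eye(x.size)
    return k(xs, x) @ np.linalg.solve(K, y)

# Bootstrap: refit the GP to resampled data to quantify uncertainty in
# the fitted curve (and hence in any mechanism derived from it).
n_boot = 200
fits = np.empty((n_boot, t.size))
for b in range(n_boot):
    idx = rng.choice(t.size, t.size, replace=True)
    fits[b] = gp_fit_mean(t[idx], u_obs[idx], t)

lower, upper = np.percentile(fits, [2.5, 97.5], axis=0)
```

Each bootstrap refit is independent, which is what makes the scheme parallelisable; the percentile band over the refits plays the role of the uncertainty quantification the abstract describes.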
Innovative approaches in soil carbon sequestration modelling for better prediction with limited data
by Davoudabadi, Mohammad Javad; Pagendam, Daniel; Drovandi, Christopher
in 704/106/694/682; 704/172; Artificial intelligence
2024
Soil carbon accounting and prediction play a key role in building decision support systems for land managers selling carbon credits, in the spirit of the Kyoto Protocol and Paris Agreement. Land managers typically rely on computationally complex models fit using sparse datasets to make these accounts and predictions. The model complexity and sparsity of the data can lead to over-fitting, which produces inaccurate results when making predictions with new data. Modellers address over-fitting by simplifying their models and reducing the number of parameters; in the current context this could involve neglecting some soil organic carbon (SOC) components. In this study, we introduce two novel SOC models and a new RothC-like model and investigate how the SOC components and complexity of the SOC models affect SOC prediction in the presence of small and sparse time series data. We develop model selection methods that can identify the soil carbon model with the best predictive performance, in light of the available data. Through this analysis we reveal that commonly used complex soil carbon models can over-fit in the presence of sparse time series data, and that our simpler models can produce more accurate predictions.
Journal Article
Water quality mediates resilience on the Great Barrier Reef
by Devlin, Michelle; Mengersen, Kerrie; Graham, Nicholas A. J.
in 631/158/2165; 631/158/2450; 631/158/672
2019
Threats from climate change and other human pressures have led to widespread concern for the future of Australia’s Great Barrier Reef (GBR). Resilience of GBR reefs will be determined by their ability to resist disturbances and to recover from coral loss, generating intense interest in management actions that can moderate these processes. Here we quantify the effect of environmental and human drivers on the resilience of southern and central GBR reefs over the past two decades. Using a composite water quality index, we find that while reefs exposed to poor water quality are more resistant to coral bleaching, they recover from disturbance more slowly and are more susceptible to outbreaks of crown-of-thorns starfish and coral disease—with a net negative impact on recovery and long-term hard coral cover. Given these conditions, we find that 6–17% improvement in water quality will be necessary to bring recovery rates in line with projected increases in coral bleaching among contemporary inshore and mid-shelf reefs. However, such reductions are unlikely to buffer projected bleaching effects among outer-shelf GBR reefs dominated by fast-growing, thermally sensitive corals, demonstrating practical limits to local management of the GBR against the effects of global warming.
Fitting a water quality index to survey-based estimates of coral resilience finds that reefs exposed to poor water quality are more resistant to bleaching but slower to recover from disturbance and more susceptible to disease outbreaks.
Journal Article
Estimating a novel stochastic model for within-field disease dynamics of banana bunchy top virus via approximate Bayesian computation
by Varghese, Abhishek; Mira, Antonietta; Drovandi, Christopher
in Agricultural industry; Agricultural management; Bayesian analysis
2020
The Banana Bunchy Top Virus (BBTV) is one of the most economically important vector-borne banana diseases throughout the Asia-Pacific Basin and presents a significant challenge to the agricultural sector. Current models of BBTV are largely deterministic, limited by an incomplete understanding of interactions in complex natural systems and the appropriate identification of parameters. A stochastic network-based Susceptible-Infected-Susceptible model has been created which simulates the spread of BBTV across the subsections of a banana plantation, parameterising nodal recovery, neighbouring and distant infectivity across summer and winter. Posterior results obtained through a Markov chain Monte Carlo approach to approximate Bayesian computation suggest seasonality in all parameters, which are influenced by correlated changes in inspection accuracy, temperatures and aphid activity. This paper demonstrates how the model may be used for monitoring and forecasting of various disease management strategies to support policy-level decision making.
Journal Article
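The inference scheme described above can be illustrated with a much simpler ABC rejection sampler on a toy SIS-style simulator. The paper itself uses an MCMC approach to ABC and a far richer plantation model; every number and mechanism below is an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate(beta, gamma, n_sites=50, steps=20):
    """Toy SIS dynamics on the subsections of a plantation (illustrative)."""
    infected = np.zeros(n_sites, dtype=bool)
    infected[:2] = True  # two initially infected subsections
    counts = []
    for _ in range(steps):
        pressure = beta * infected.mean()         # infection pressure
        new_inf = rng.random(n_sites) < pressure  # new infections
        recovered = rng.random(n_sites) < gamma   # recoveries (roguing)
        infected = (infected & ~recovered) | new_inf
        counts.append(int(infected.sum()))
    return np.array(counts)

# "Observed" infection trajectory generated at known parameters.
obs = simulate(0.4, 0.1)

# ABC rejection: keep prior draws whose simulated trajectory is close
# to the observed one; the retained draws approximate the posterior.
accepted = []
for _ in range(2000):
    beta = rng.uniform(0.0, 1.0)
    gamma = rng.uniform(0.0, 0.5)
    if np.abs(simulate(beta, gamma) - obs).mean() < 8.0:
        accepted.append((beta, gamma))
accepted = np.array(accepted)
```

Replacing the uniform prior draws with an MCMC proposal chain, as in the paper, concentrates the simulations in high-posterior regions and makes the scheme far more efficient than plain rejection.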
Framework for assessing and easing global COVID-19 travel restrictions
by Le, Thien-Minh; Hambridge, Hali; Mira, Antonietta
in 639/705/531; 692/699/255/1578; 692/700/478
2022
During the COVID-19 pandemic, many countries implemented international travel restrictions that aimed to contain viral spread while still allowing necessary cross-border travel for social and economic reasons. The relative effectiveness of these approaches for controlling the pandemic has gone largely unstudied. Here we developed a flexible network meta-population model to compare the effectiveness of international travel policies, with a focus on evaluating the benefit of policy coordination. Because country-level epidemiological parameters are unknown, they need to be estimated from data; we accomplished this using approximate Bayesian computation, given the nature of our complex stochastic disease transmission model. Based on simulation and theoretical insights we find that, under our proposed policy, international airline travel may resume up to 58% of the pre-pandemic level with pandemic control comparable to that of a complete shutdown of all airline travel. Our results demonstrate that global coordination is necessary to allow for maximum travel with minimum effect on viral spread.
Journal Article