Catalogue Search | MBRL
Explore the vast range of titles available.
251 result(s) for "Shephard, Neil"
Realising the future: forecasting with high-frequency-based volatility (HEAVY) models
2010
This paper studies in some detail a class of high-frequency-based volatility (HEAVY) models. These models are direct models of daily asset return volatility based on realised measures constructed from high-frequency data. Our analysis identifies that the models have momentum and mean reversion effects, and that they adjust quickly to structural breaks in the level of the volatility process. We study how to estimate the models and how they perform through the credit crunch, comparing their fit to more traditional GARCH models. We analyse a model-based bootstrap which allows us to estimate the entire predictive distribution of returns. We also provide an analysis of missing data in the context of these models. Copyright © 2010 John Wiley & Sons, Ltd.
Journal Article
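The scalar HEAVY recursion summarised in the abstract above drives the daily conditional variance with a lagged realised measure. A minimal sketch follows; the parameter values and the gamma-distributed realised measures are illustrative assumptions, not estimates from the paper:

```python
import numpy as np

def heavy_filter(rm, omega, alpha, beta, h0):
    """HEAVY-style conditional variance recursion:
    h_t = omega + alpha * RM_{t-1} + beta * h_{t-1},
    where RM_{t-1} is a realised measure (e.g. realised variance)
    built from high-frequency data on day t-1."""
    h = np.empty(len(rm) + 1)
    h[0] = h0
    for t in range(len(rm)):
        h[t + 1] = omega + alpha * rm[t] + beta * h[t]
    return h

# Toy realised measures standing in for high-frequency data.
rng = np.random.default_rng(0)
rm = rng.gamma(shape=2.0, scale=0.5, size=250)
h = heavy_filter(rm, omega=0.05, alpha=0.3, beta=0.6, h0=1.0)
```

Because the realised measure reacts within a day, the filtered variance adjusts to level shifts faster than a GARCH recursion driven only by squared daily returns.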
Multivariate high-frequency-based volatility (HEAVY) models
by Shephard, Neil; Noureldin, Diaa; Sheppard, Kevin
in Analysis of covariance; Analytical forecasting; Covariance
2012
This paper introduces a new class of multivariate volatility models that utilizes high-frequency data. We discuss the models' dynamics and highlight their differences from multivariate generalized autoregressive conditional heteroskedasticity (GARCH) models. We also discuss their covariance targeting specification and provide closed-form formulas for multi-step forecasts. Estimation and inference strategies are outlined. Empirical results suggest that the HEAVY model outperforms the multivariate GARCH model out-of-sample, with the gains being particularly significant at short forecast horizons. Forecast gains are obtained for both forecast variances and correlations.
Journal Article
Panel experiments and dynamic causal effects: A finite population perspective
by Shephard, Neil G; Bojinov, Iavor; Rambachan, Ashesh
in Bias; Causality; dynamic causal effects
2021
In panel experiments, we randomly assign units to different interventions, measure their outcomes, and repeat the procedure over several periods. Using the potential outcomes framework, we define finite population dynamic causal effects that capture the relative effectiveness of alternative treatment paths. For a rich class of dynamic causal effects, we provide a nonparametric estimator that is unbiased over the randomization distribution and derive its finite population limiting distribution as either the sample size or the duration of the experiment increases. We develop two methods for inference: a conservative test for weak null hypotheses and an exact randomization test for sharp null hypotheses. We further analyze the finite population probability limit of linear fixed effects estimators. These commonly used estimators do not recover a causally interpretable estimand if there are dynamic causal effects and serial correlation in the assignments, highlighting the value of our proposed estimator.
Journal Article
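As a toy illustration of the randomisation-based estimation idea in the abstract above (not the paper's estimator), a Horvitz–Thompson-style weighting with known Bernoulli assignment probabilities is unbiased for a contemporaneous effect over the randomisation distribution. All names and numbers below are made up:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy panel experiment: N units, T periods, fresh Bernoulli(1/2)
# assignment each period, a constant contemporaneous effect tau = 1.
N, T = 500, 5
W = rng.integers(0, 2, size=(N, T))          # assignment paths
tau = 1.0
Y = tau * W + rng.normal(size=(N, T))        # outcomes with noise

def ht_effect(Y, W, t, p=0.5):
    """Inverse-probability-weighted contrast at period t:
    mean of W_it Y_it / p - (1 - W_it) Y_it / (1 - p),
    unbiased for E[Y_it(1) - Y_it(0)] when p is the known
    assignment probability."""
    return np.mean(W[:, t] * Y[:, t] / p - (1 - W[:, t]) * Y[:, t] / (1 - p))

est = ht_effect(Y, W, t=T - 1)
```

Unlike a fixed-effects regression, the weighting uses only the known randomisation probabilities, so no model of the outcome dynamics is needed.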
Bayesian Inference Based Only on Simulated Likelihood: Particle Filter Analysis of Dynamic Economic Models
2011
We note that likelihood inference can be based on an unbiased simulation-based estimator of the likelihood when it is used inside a Metropolis–Hastings algorithm. This result was recently introduced into the statistics literature by Andrieu, Doucet, and Holenstein (2010, Journal of the Royal Statistical Society, Series B, 72, 269–342) and is perhaps surprising given the results on maximum simulated likelihood estimation. Bayesian inference based on simulated likelihood can be widely applied in microeconomics, macroeconomics, and financial econometrics. One way of generating unbiased estimates of the likelihood is through a particle filter. We illustrate these methods on four problems, producing rather generic methods. Taken together, these methods imply that if we can simulate from an economic model, we can carry out likelihood-based inference using its simulations.
Journal Article
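The pseudo-marginal idea in the abstract above can be sketched on a deliberately simple latent-variable model, with plain Monte Carlo averaging standing in for a particle filter. The model, sample sizes, and tuning constants below are illustrative assumptions: an unbiased likelihood estimate replaces the exact likelihood inside Metropolis–Hastings, and the estimate at the current state is retained until a proposal is accepted, so the chain still targets the exact posterior:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: y_i = theta + eta_i + eps_i with latent eta_i ~ N(0, 1)
# and eps_i ~ N(0, 0.5^2). Averaging the conditional density over
# simulated latents gives an unbiased estimate of p(y_i | theta).
theta_true, n = 1.0, 40
y = theta_true + rng.normal(size=n) + rng.normal(scale=0.5, size=n)

def loglik_hat(theta, n_sim=200):
    eta = rng.normal(size=(n_sim, n))                      # latent draws
    dens = np.exp(-0.5 * ((y - theta - eta) / 0.5) ** 2) / (
        0.5 * np.sqrt(2 * np.pi))
    return np.sum(np.log(np.mean(dens, axis=0)))           # log of estimate

def pseudo_marginal_mh(n_iter=2000, step=0.3):
    theta, ll = 0.0, loglik_hat(0.0)
    draws = []
    for _ in range(n_iter):
        prop = theta + step * rng.normal()
        ll_prop = loglik_hat(prop)
        # Accept/reject with estimated likelihoods; the current state's
        # estimate ll is reused, which is what makes this pseudo-marginal.
        if np.log(rng.uniform()) < ll_prop - ll:
            theta, ll = prop, ll_prop
        draws.append(theta)
    return np.array(draws)

draws = pseudo_marginal_mh()
```

With a flat prior the chain should concentrate around the sample mean of y; a particle filter would play the role of `loglik_hat` in a dynamic model.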
Haemorrhoidal artery ligation versus rubber band ligation for the management of symptomatic second-degree and third-degree haemorrhoids (HubBLe): a multicentre, open-label, randomised controlled trial
by Brown, Steven R; Alshreef, Abualbishr; Wailoo, Allan J
in Adult; Aged; Ambulatory Surgical Procedures - adverse effects
2016
Optimum surgical intervention for low-grade haemorrhoids is unknown. Haemorrhoidal artery ligation (HAL) has been proposed as an efficacious, safe therapy while rubber band ligation (RBL) is a commonly used outpatient treatment. We compared recurrence after HAL versus RBL in patients with grade II–III haemorrhoids.
This multicentre, open-label, parallel group, randomised controlled trial included patients from 17 acute UK NHS trusts. We screened patients aged 18 years or older presenting with grade II–III haemorrhoids. We excluded patients who had previously received any haemorrhoid surgery, more than one injection treatment for haemorrhoids, or more than one RBL procedure within 3 years before recruitment. Eligible patients were randomly assigned (in a 1:1 ratio) to either RBL or HAL with Doppler. Randomisation was computer-generated and stratified by centre with blocks of random sizes. Allocation concealment was achieved using a web-based system. The study was open-label with no masking of participants, clinicians, or research staff. The primary outcome was recurrence at 1 year, derived from the patient's self-reported assessment in combination with resource use from their general practitioner and hospital records. Recurrence was analysed in patients who had undergone one of the interventions and been followed up for at least 1 year. This study is registered with the ISRCTN registry, ISRCTN41394716.
From Sept 9, 2012, to May 6, 2014, of 969 patients screened, 185 were randomly assigned to the HAL group and 187 to the RBL group. Of these participants, 337 had primary outcome data (176 in the RBL group and 161 in the HAL group). At 1 year post-procedure, 87 (49%) of 176 patients in the RBL group and 48 (30%) of 161 patients in the HAL group had haemorrhoid recurrence (adjusted odds ratio [aOR] 2·23, 95% CI 1·42–3·51; p=0·0005). The main reason for this difference was the number of extra procedures required to achieve improvement (57 [32%] participants in the RBL group and 23 [14%] participants in the HAL group had a subsequent procedure for haemorrhoids). The mean pain 1 day after procedure was 3·4 (SD 2·8) in the RBL group and 4·6 (2·8) in the HAL group (difference −1·2, 95% CI −1·8 to −0·5; p=0·0002); at day 7 the scores were 1·6 (2·3) in the RBL group and 3·1 (2·4) in the HAL group (difference −1·5, −2·0 to −1·0; p<0·0001). Pain scores did not differ between groups at 21 days and 6 weeks. 15 individuals reported serious adverse events requiring hospital admission. One patient in the RBL group had a pre-existing rectal tumour. Of the remaining 14 serious adverse events, 12 (7%) were among participants treated with HAL and two (1%) were in those treated with RBL. Six patients had pain (one treated with RBL, five treated with HAL), three had bleeding not requiring transfusion (one treated with RBL, two treated with HAL), two in the HAL group had urinary retention, two in the HAL group had vasovagal upset, and one in the HAL group had possible sepsis (treated with antibiotics).
Although recurrence after HAL was lower than after a single RBL, HAL was more painful than RBL. The difference in recurrence was due to the need for repeat bandings in the RBL group. Patients (and health commissioners) might prefer such a course of RBL to the more invasive HAL.
NIHR Health Technology Assessment programme.
Journal Article
Designing Realized Kernels to Measure the ex post Variation of Equity Prices in the Presence of Noise
by Lunde, Asger; Shephard, Neil; Barndorff-Nielsen, Ole E.
in Acceleration of convergence; Analytical estimating; Applications
2008
This paper shows how to use realized kernels to carry out efficient feasible inference on the ex post variation of underlying equity prices in the presence of simple models of market frictions. The weights can be chosen to achieve the best possible rate of convergence and to have an asymptotic variance which equals that of the maximum likelihood estimator in the parametric version of this problem. Realized kernels can also be selected to (i) be analyzed using endogenously spaced data such as that in data bases on transactions, (ii) allow for market frictions which are endogenous, and (iii) allow for temporally dependent noise. The finite sample performance of our estimators is studied using simulation, while empirical work illustrates their use in practice.
Journal Article
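A minimal sketch of a realised kernel of the kind described above, using the Parzen weight function; the simulation design (noise level, bandwidth H=50) is an illustrative assumption, not the paper's recommended choice:

```python
import numpy as np

rng = np.random.default_rng(4)

def parzen(x):
    """Parzen weight function, a standard kernel choice."""
    if x <= 0.5:
        return 1 - 6 * x**2 + 6 * x**3
    if x <= 1.0:
        return 2 * (1 - x) ** 3
    return 0.0

def realized_kernel(r, H):
    """Realised kernel K = gamma_0 + sum_{h=1}^H k((h-1)/H)(gamma_h + gamma_{-h}),
    with gamma_h the h-th return autocovariance. The (h-1)/H argument
    puts weight k(0) = 1 on the first-order term, which removes the
    bias from i.i.d. market-microstructure noise."""
    acc = np.sum(r * r)
    for h in range(1, H + 1):
        gamma_h = np.sum(r[h:] * r[:-h])
        acc += 2 * parzen((h - 1) / H) * gamma_h
    return acc

# Toy simulation: Brownian efficient price with integrated variance 1,
# observed with i.i.d. noise. Naive realised variance is dominated by
# the noise; the realised kernel stays near the true value of 1.
n = 23400
x = np.cumsum(rng.normal(scale=np.sqrt(1 / n), size=n))  # efficient price
p = x + rng.normal(scale=0.03, size=n)                   # noisy observations
r = np.diff(p)
rv = np.sum(r * r)            # noise-inflated realised variance
rk = realized_kernel(r, H=50)
```

The bandwidth H trades off noise robustness against sampling variance; the paper derives rate-optimal choices, which this sketch does not attempt.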
Econometric Analysis of Realized Covariation: High Frequency Based Covariance, Regression, and Correlation in Financial Economics
by Shephard, Neil; Barndorff-Nielsen, Ole E.
in Analysis of covariance; Applications; Asymptotic theory
2004
This paper analyses multivariate high frequency financial data using realized covariation. We provide a new asymptotic distribution theory for standard methods such as regression, correlation analysis, and covariance. It will be based on a fixed interval of time (e.g., a day or week), allowing the number of high frequency returns during this period to go to infinity. Our analysis allows us to study how high frequency correlations, regressions, and covariances change through time. In particular we provide confidence intervals for each of these quantities.
Journal Article
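The realised covariation statistics described above are simple to compute: over a fixed interval, sum the outer products of the high-frequency return vectors, then read off covariances and correlations. A sketch on simulated data follows; the two-asset design and sample size are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

def realized_covariance(r):
    """Realised covariation over a fixed interval: the sum of outer
    products of the high-frequency return vectors, sum_j r_j r_j'."""
    return r.T @ r

def realized_correlation(rc):
    """Realised correlation matrix implied by a realised covariance."""
    d = np.sqrt(np.diag(rc))
    return rc / np.outer(d, d)

# Toy example: two correlated Brownian assets sampled n times within
# one day, with true daily covariance [[1, 0.5], [0.5, 1]].
n = 20000
cov = np.array([[1.0, 0.5], [0.5, 1.0]]) / n
r = rng.multivariate_normal(np.zeros(2), cov, size=n)
rc = realized_covariance(r)
corr = realized_correlation(rc)
```

As the number of intra-interval returns grows, these statistics converge to the true covariation, which is what the paper's asymptotic distribution theory makes precise with confidence intervals.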
Time Series Experiments and Causal Estimands: Exact Randomization Tests and Trading
2019
We define causal estimands for experiments on single time series, extending the potential outcome framework to deal with temporal data. Our approach allows the estimation of a broad class of these estimands and exact randomization-based p-values for testing causal effects, without imposing stringent assumptions. We further derive a general central limit theorem that can be used to conduct conservative tests and build confidence intervals for causal effects. Finally, we provide three methods for generalizing our approach to multiple units that are receiving the same class of treatment over time. We test our methodology on simulated "potential autoregressions," which have a causal interpretation. Our methodology is partially inspired by data from a large number of experiments carried out by a financial company that compared the impact of two different ways of trading equity futures contracts. We use our methodology to make causal statements about their trading methods.
Supplementary materials for this article are available online.
Journal Article
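The core of a randomization test like those in the abstract above fits in a few lines: under a sharp null of no effect, outcomes are unchanged by reassignment, so re-drawing the assignment path gives the statistic's null distribution. This toy version uses independent per-period Bernoulli assignments and a difference-in-means statistic, both illustrative assumptions rather than the paper's design:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy single-series experiment: a binary treatment re-randomised each
# period, with a real effect of size 1 built into the outcomes.
T = 200
w = rng.integers(0, 2, size=T)           # observed assignment path
y = 1.0 * w + rng.normal(size=T)         # observed outcomes

def stat(assign, outcome):
    """Difference in mean outcomes between treated and control periods."""
    return outcome[assign == 1].mean() - outcome[assign == 0].mean()

obs = stat(w, y)
# Under the sharp null, y is fixed; only the assignments are re-drawn.
# The exact randomisation distribution is approximated by Monte Carlo.
draws = [stat(rng.integers(0, 2, size=T), y) for _ in range(2000)]
p_value = (1 + sum(abs(d) >= abs(obs) for d in draws)) / (1 + len(draws))
```

The p-value is valid by construction of the randomisation, with no model for the time-series dependence of the outcomes.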
Quantifying complexity in DNA structures with high resolution Atomic Force Microscopy
2025
DNA topology is essential for regulating cellular processes and maintaining genome stability, yet it is challenging to quantify due to the size and complexity of topologically constrained DNA molecules. By combining high-resolution Atomic Force Microscopy (AFM) with a new high-throughput automated pipeline, we can quantify the length, conformation, and topology of individual complex DNA molecules with sub-molecular resolution. Our pipeline uses deep-learning methods to trace the backbone of individual DNA molecules and identify crossing points, efficiently determining which segment passes over which. We use this pipeline to determine the structure of stalled replication intermediates from Xenopus egg extracts, including theta structures and late replication products, and the topology of plasmids, knots and catenanes from the E. coli Xer recombination system. We use coarse-grained simulations to quantify the effect of surface immobilisation on twist-writhe partitioning. Our pipeline opens avenues for understanding how fundamental biological processes are regulated by DNA topology.
Here the authors develop a pipeline combining atomic force microscopy and deep learning to trace and quantify the structure of complex DNA molecules like replication intermediates and recombination products. Furthermore, they characterise surface deposition effects using simulations.
Journal Article