Catalogue Search | MBRL
66 result(s) for "Graves, Todd"
Predicting fault incidence using software change history
2000
This paper is an attempt to understand the processes by which software ages. We define code to be aged or decayed if its structure makes it unnecessarily difficult to understand or change and we measure the extent of decay by counting the number of faults in code in a period of time. Using change management data from a very large, long-lived software system, we explore the extent to which measurements from the change history are successful in predicting the distribution over modules of these incidences of faults. In general, process measures based on the change history are more useful in predicting fault rates than product metrics of the code: For instance, the number of times code has been changed is a better indication of how many faults it will contain than is its length. We also compare the fault rates of code of various ages, finding that if a module is, on the average, a year older than an otherwise similar module, the older module will have roughly a third fewer faults. Our most successful model measures the fault potential of a module as the sum of contributions from all of the times the module has been changed, with large, recent changes receiving the most weight.
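The abstract's most successful model, fault potential as a sum of contributions from a module's past changes with large, recent changes weighted most, can be sketched as follows. The exponential decay rate and size weighting are illustrative assumptions, not the paper's fitted parameters.

```python
import math

def fault_potential(changes, now, decay_rate=0.5):
    """Score a module's fault potential as a weighted sum over its change
    history: each (time, size) change contributes its size, discounted
    exponentially by its age, so large, recent changes dominate."""
    total = 0.0
    for time, size in changes:
        age = now - time                              # years since the change
        total += size * math.exp(-decay_rate * age)   # recency-weighted size
    return total

# A recently and heavily changed module outscores an otherwise similar
# module whose changes are years old.
recent = [(2023.5, 120), (2024.0, 300)]
stable = [(2015.0, 120), (2016.0, 300)]
print(fault_potential(recent, 2024.5) > fault_potential(stable, 2024.5))  # True
```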
Journal Article
A platform in the use of medicines to treat chronic hepatitis C (PLATINUM C): protocol for a prospective treatment registry of real-world outcomes for hepatitis C
by
Cheng, Wendy
,
Hellard, Margaret
,
Andric, Nada
in
Adaptive Clinical Trials as Topic
,
Antiviral agents
,
Antiviral Agents - therapeutic use
2020
Background
Safe, highly curative, short-course, direct-acting antiviral (DAA) therapies are now available to treat chronic hepatitis C. DAA therapy is freely available to all adults chronically infected with the hepatitis C virus (HCV) in Australia. If left untreated, hepatitis C may lead to progressive hepatic fibrosis, cirrhosis and hepatocellular carcinoma. Australia is committed to the World Health Organization's goal of eliminating hepatitis as a public health threat by 2030. However, since the introduction of funded DAA treatment, uptake has been suboptimal. Australia needs improved strategies for testing, treatment uptake and treatment completion to address the persisting hepatitis C public health problem. PLATINUM C is an HCV treatment registry and research platform for assessing the comparative effectiveness of alternative interventions for achieving virological cure.
Methods
PLATINUM C will prospectively enrol people with active HCV infection confirmed by recent detection of HCV ribonucleic acid (RNA) in blood. Those enrolled will agree to allow standardised collection of demographic, lifestyle, treatment, virological outcome and other relevant clinical data to better inform the future management of HCV infection. The primary outcome is virological cure evidenced by sustained virological response (SVR), which is defined as a negative HCV PCR result 6 to 18 months after initial prescription of DAA therapy and no less than 12 weeks after the completion of treatment. Study participants will be invited to opt-in to medication adherence monitoring and quality of life assessments using validated self-reported instruments (EQ-5D-5L).
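The SVR criterion stated above combines two timing windows with the PCR result; a minimal check of that criterion, with hypothetical argument names (the protocol itself does not define such a function), might look like:

```python
def is_svr(pcr_negative, months_since_prescription, weeks_since_treatment_end):
    """Check the trial's stated SVR criterion: a negative HCV PCR result
    taken 6 to 18 months after the initial DAA prescription and no less
    than 12 weeks after the completion of treatment."""
    return (pcr_negative
            and 6 <= months_since_prescription <= 18
            and weeks_since_treatment_end >= 12)

print(is_svr(True, 8, 14))   # True: negative PCR, both windows satisfied
print(is_svr(True, 8, 10))   # False: sampled too soon after treatment end
```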
Discussion
PLATINUM C is a treatment registry and platform for nesting pragmatic trials. Data collected will inform the design, development and implementation of pragmatic trials. The digital infrastructure, study procedures and governing systems established by the registry will allow PLATINUM C to support a wider research platform in the management of hepatitis C in primary care.
Trial registration
The trial is registered with the Australia and New Zealand Clinical Trials Register (
ACTRN12619000023156
). Date of registration: 10/01/2019.
Journal Article
The ORVAC trial: a phase IV, double-blind, randomised, placebo-controlled clinical trial of a third scheduled dose of Rotarix rotavirus vaccine in Australian Indigenous infants to improve protection against gastroenteritis: a statistical analysis plan
by
Totterdell, James
,
Snelling, Thomas L
,
Graves, Todd
in
Adaptive design
,
Antibodies, Viral - blood
,
Australia
2020
Objective
The purpose of this double-blind, randomised, placebo-controlled, adaptive design trial with frequent interim analyses is to determine whether an additional (third) dose of human rotavirus vaccine (Rotarix, GlaxoSmithKline), given to Australian Indigenous children aged 6 to < 12 months, improves protection against clinically significant all-cause gastroenteritis.
Participants
Up to 1000 Australian Aboriginal and Torres Strait Islander (hereafter Indigenous) infants aged 6 to < 12 months will be recruited from all regions of the Northern Territory.
Interventions
The intervention is the addition of a third scheduled dose of human monovalent rotavirus vaccine.
Co-primary and secondary outcome measures
ORVAC has two co-primary outcomes: (1) anti-rotavirus IgA seroconversion, defined as serum anti-rotavirus IgA ≥ 20 U/ml 28 to 55 days post Rotarix/placebo, and (2) time from randomisation to medical attendance for which the primary reason for presentation is acute gastroenteritis or acute diarrhoea illness before age 36 months. Secondary outcomes include (1) change in anti-rotavirus IgA log titre, (2) time from randomisation to hospitalisation with primary admission code presumed or confirmed acute diarrhoea illness before age 36 months, (3) time from randomisation to hospitalisation for which the admission is rotavirus confirmed diarrhoea illness before age 36 months and (4) time from randomisation to rotavirus infection (not necessarily requiring hospitalisation) meeting the jurisdictional definition before age 36 months.
Discussion
A detailed, prospective statistical analysis plan is presented for this Bayesian adaptive design. The plan was written by the trial statistician and details the study design, pre-specified adaptive elements, decision thresholds, statistical methods and the simulations used to evaluate the operating characteristics of the trial. As at August 2020, four interim analyses have been run, but no stopping rules have been triggered. Application of this SAP will minimise bias and support transparent and reproducible research.
Trial registration
Clinicaltrials.gov NCT02941107. Registered on 21 October 2016
Original protocol for the study
https://doi.org/10.1136/bmjopen-2019-032549
Journal Article
Does code decay? Assessing the evidence from change management data
2001
A central feature of the evolution of large software systems is that change, which is necessary to add new functionality, accommodate new hardware, and repair faults, becomes increasingly difficult over time. We approach this phenomenon, which we term code decay, scientifically and statistically. We define code decay and propose a number of measurements (code decay indices), on software and on the organizations that produce it, that serve as symptoms, risk factors, and predictors of decay. Using an unusually rich data set (the fifteen-plus year change history of the millions of lines of software for a telephone switching system), we find mixed, but on the whole persuasive, statistical evidence of code decay, which is corroborated by developers of the code. Suggestive indications that perfective maintenance can retard code decay are also discussed.
Journal Article
Design Ideas for Markov chain Monte Carlo Software
2007
This article discusses design ideas useful in the development of Markov chain Monte Carlo (MCMC) software. Goals of the design are to facilitate analysis of as many statistical models as possible, and to enable users to experiment with different MCMC algorithms as a research tool. These ideas have been used in YADAS, a system written in the Java language, but are also applicable in other object-oriented languages.
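The separation the article advocates, statistical models on one side and interchangeable MCMC update algorithms on the other, can be sketched as follows. YADAS itself is written in Java; this Python sketch and all its names are illustrative, not the YADAS API.

```python
import math
import random

class Model:
    """A model exposes only a log-posterior; MCMC algorithms are written
    against this interface, so users can swap either side independently."""
    def log_posterior(self, theta):
        raise NotImplementedError

class NormalModel(Model):
    """Unit-variance normal likelihood with a flat prior on the mean."""
    def __init__(self, data):
        self.data = data
    def log_posterior(self, theta):
        return -0.5 * sum((x - theta) ** 2 for x in self.data)

class MetropolisStep:
    """One interchangeable update rule; slice or Gibbs steps could
    implement the same update() interface."""
    def __init__(self, scale=1.0):
        self.scale = scale
    def update(self, model, theta):
        proposal = theta + random.gauss(0.0, self.scale)
        accept = model.log_posterior(proposal) - model.log_posterior(theta)
        return proposal if math.log(random.random()) < accept else theta

def run_chain(model, step, theta0, n):
    chain = [theta0]
    for _ in range(n):
        chain.append(step.update(model, chain[-1]))
    return chain

random.seed(0)
chain = run_chain(NormalModel([1.8, 2.1, 2.4]), MetropolisStep(), 0.0, 2000)
print(sum(chain[500:]) / len(chain[500:]))  # close to the data mean of 2.1
```

Because `run_chain` only touches the two interfaces, experimenting with a different sampler or model is a one-line change at the call site.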
Journal Article
Advances in Data Combination, Analysis and Collection for System Reliability Assessment
by
Wilson, Alyson G.
,
Hamada, Michael S.
,
Reese, C. Shane
in
Bayesian
,
Bayesian network
,
Bayesian networks
2006
The systems that statisticians are asked to assess, such as nuclear weapons, infrastructure networks, supercomputer codes and munitions, have become increasingly complex. It is often costly to conduct full system tests. As such, we present a review of methodology that has been proposed for addressing system reliability with limited full system testing. The first approaches presented in this paper are concerned with the combination of multiple sources of information to assess the reliability of a single component. The second general set of methodology addresses the combination of multiple levels of data to determine system reliability. We then present developments for complex systems beyond traditional series/parallel representations through the use of Bayesian networks and flowgraph models. We also include methodological contributions to resource allocation considerations for system reliability assessment. We illustrate each method with applications primarily encountered at Los Alamos National Laboratory.
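The simplest instance of the first idea, pooling several information sources to assess a single component's reliability, is conjugate beta-binomial updating. This is an illustrative sketch under that assumption, not one of the specific models the review covers.

```python
def pooled_beta_posterior(sources, prior=(1.0, 1.0)):
    """Combine pass/fail data from several test sources for one component.
    Each source is (successes, trials); with a Beta(a, b) prior the
    posterior is Beta(a + total successes, b + total failures)."""
    a, b = prior
    for successes, trials in sources:
        a += successes
        b += trials - successes
    return a, b

# Bench tests, field returns, and a small full-system test, pooled:
a, b = pooled_beta_posterior([(48, 50), (19, 20), (9, 10)])
print(round(a / (a + b), 3))  # 0.939: posterior mean component reliability
```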
Journal Article
Reliability Models for Almost-Series and Almost-Parallel Systems
by
Hamada, Michael S.
,
Anderson-Cook, Christine M.
,
Graves, Todd L.
in
Applied sciences
,
Bayesian
,
Binomials
2010
When assessing system reliability using system, subsystem, and component-level data, assumptions are required about the form of the system structure in order to utilize the lower-level data. We consider model forms which allow for the assessment and modeling of possible discrepancies between reliability estimates based on different levels of data. By understanding these potential conflicts between data, we can more realistically represent the true uncertainty of the estimates and gain understanding about inconsistencies which might guide further improvements to the system model. The new methodology is illustrated with several examples.
Journal Article
Visualizing software changes
by
Eick, S.G.
,
Schuster, P.
,
Graves, T.L.
in
Alliances
,
Business intelligence software
,
Complement
2002
A key problem in software engineering is changing the code. We present a sequence of visualizations and visual metaphors designed to help engineers understand and manage the software change process. The principal metaphors are matrix views, cityscapes, bar and pie charts, data sheets and networks. Linked by selection mechanisms, multiple views are combined to form perspectives that both enable discovery of high-level structure in software change data and allow effective access to details of those data. Use of the views and perspectives is illustrated in two important contexts: understanding software change by exploration of software change data and management of software development. Our approach complements existing visualizations of software structure and software execution.
Journal Article
Using version control data to evaluate the impact of software tools: a case study of the Version Editor
by
Atkins, D.L.
,
Graves, T.L.
,
Mockus, A.
in
Algorithms
,
Case studies
,
Computer aided software engineering
2002
Software tools can improve the quality and maintainability of software, but are expensive to acquire, deploy, and maintain, especially in large organizations. We explore how to quantify the effects of a software tool once it has been deployed in a development environment. We present an effort-analysis method that derives tool usage statistics and developer actions from a project's change history (version control system) and uses a novel effort estimation algorithm to quantify the effort savings attributable to tool usage. We apply this method to assess the impact of a software tool called VE, a version-sensitive editor used in Bell Labs. VE aids software developers in coping with the rampant use of certain preprocessor directives (similar to #if/#endif in C source files). Our analysis found that developers were approximately 40 percent more productive when using VE than when using standard text editors.
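The comparison at the heart of the effort-analysis method, effort per change with the tool versus without it, can be sketched as below. The per-change effort estimates and the tabular input are hypothetical stand-ins for the paper's version-control-derived data and its effort-estimation algorithm.

```python
def mean(xs):
    return sum(xs) / len(xs)

def effort_savings(changes):
    """changes: list of (used_tool, estimated_effort) pairs derived from a
    version control history. Returns the fraction by which mean effort per
    change drops when the tool is used."""
    with_tool = [e for used, e in changes if used]
    without = [e for used, e in changes if not used]
    return 1 - mean(with_tool) / mean(without)

history = [(True, 3.0), (True, 2.5), (False, 5.0), (False, 4.5)]
print(round(effort_savings(history), 2))  # 0.42: ~42% less effort per change
```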
Journal Article
Adjunctive Intravenous Argatroban or Eptifibatide for Ischemic Stroke
2024
In this trial involving patients with acute ischemic stroke treated with thrombolysis within 3 hours after symptom onset, the use of adjunctive intravenous argatroban or eptifibatide did not improve 90-day poststroke outcomes.
Journal Article