Catalogue Search | MBRL
9 result(s) for "Spencer Angus Thomas"
Deep learning for necrosis detection using canine perivascular wall tumour whole slide images
2022
Necrosis seen in histopathology whole slide images is a major criterion that contributes towards scoring tumour grade, which in turn determines treatment options. However, conventional manual assessment suffers from limited inter-operator reproducibility, impacting grading precision. To address this, automatic necrosis detection using AI may be used to assess necrosis for the scoring that contributes towards the final clinical grade. Using deep learning, we describe a novel approach for automating necrosis detection in whole slide images, tested on a canine Soft Tissue Sarcoma (cSTS) data set consisting of canine Perivascular Wall Tumours (cPWTs). A patch-based deep learning approach was developed in which different variations of training a DenseNet-161 convolutional neural network architecture were investigated, as well as a stacking ensemble. An optimised DenseNet-161 with post-processing produced a hold-out test F1-score of 0.708, demonstrating state-of-the-art performance. This represents the first automated necrosis detection method in the cSTS domain, and specifically in detecting necrosis in cPWTs, and is a significant step forward in reproducible and reliable necrosis assessment for improving the precision of tumour grading.
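A minimal sketch of the patch-based idea the abstract describes: tile a slide region, score each tile with a DenseNet-161 classifier, and assemble a coarse necrosis probability map. The patch size, binary labelling, and the numpy array standing in for tiles read from a whole slide image are illustrative assumptions, not details taken from the paper.

```python
# Patch-based necrosis scoring sketch (assumptions noted above).
import numpy as np
import torch
import torch.nn as nn
from torchvision import models, transforms

def build_patch_classifier(num_classes: int = 2) -> nn.Module:
    # DenseNet-161 backbone (no pretrained weights here) with its classifier
    # head replaced for necrosis vs. non-necrosis patch labels.
    model = models.densenet161()
    model.classifier = nn.Linear(model.classifier.in_features, num_classes)
    return model

@torch.no_grad()
def necrosis_heatmap(slide_region: np.ndarray, model: nn.Module,
                     patch: int = 224) -> np.ndarray:
    # Tile the region into non-overlapping patches, score each one,
    # and return a coarse probability map (one value per patch).
    to_tensor = transforms.Compose([
        transforms.ToTensor(),
        transforms.Normalize([0.485, 0.456, 0.406], [0.229, 0.224, 0.225]),
    ])
    model.eval()
    h, w, _ = slide_region.shape
    rows, cols = h // patch, w // patch
    heatmap = np.zeros((rows, cols), dtype=np.float32)
    for i in range(rows):
        for j in range(cols):
            tile = slide_region[i * patch:(i + 1) * patch,
                                j * patch:(j + 1) * patch]
            logits = model(to_tensor(tile).unsqueeze(0))
            heatmap[i, j] = torch.softmax(logits, dim=1)[0, 1].item()
    return heatmap

if __name__ == "__main__":
    region = np.random.randint(0, 255, (1120, 1120, 3), dtype=np.uint8)
    print(necrosis_heatmap(region, build_patch_classifier()).shape)  # (5, 5)
```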
Journal Article
The need for measurement science in digital pathology
by Dexter, Alex; Turpin, Robert James; Shaw, Mike
in Artificial intelligence; Calibration; DICOM
Pathology services experienced a surge in demand during the COVID-19 pandemic. Digitalisation of pathology workflows can help to increase throughput, yet many existing digitalisation solutions use non-standardised workflows captured in proprietary data formats and processed by black-box software, yielding data of varying quality. This study presents the views of a UK-led expert group on the barriers to adoption and the required input of measurement science to improve current practices in digital pathology.
With the aim of supporting the UK’s efforts in the digitalisation of pathology services, this study comprised: (1) a review of existing evidence, (2) an online survey of domain experts, and (3) a workshop with 42 representatives from healthcare, regulatory bodies, the pharmaceutical industry, academia, and equipment and software manufacturers. The discussion topics included sample processing, data interoperability, image analysis, equipment calibration, and the use of novel imaging modalities.
According to 80% of attendees, the lack of data interoperability within digital pathology workflows hinders data lookup and navigation. All participants stressed the importance of integrating imaging and non-imaging data for diagnosis, while 80% saw data integration as a priority challenge. 90% identified the benefits of artificial intelligence and machine learning, but highlighted the need for training and sound performance metrics.
Methods for calibration and providing traceability were seen as essential to establish harmonised, reproducible sample processing and image acquisition pipelines. Vendor-neutral data standards were seen as a “must-have” for providing meaningful data for downstream analysis. Users and vendors need good-practice guidance on the evaluation of uncertainty, fitness for purpose, and reproducibility of artificial intelligence/machine learning tools. All of the above needs to be accompanied by an upskilling of the pathology workforce.
Digital pathology requires interoperable data formats, reproducible and comparable laboratory workflows, and trustworthy computer analysis software. Despite high interest in the use of novel imaging techniques and artificial intelligence tools, their adoption is slowed by the lack of guidance and evaluation tools to assess the suitability of these techniques for specific clinical questions. Measurement science expertise in uncertainty estimation, standardisation, reference materials, and calibration can help establish reproducibility and comparability between laboratory procedures, yielding high-quality data and providing higher confidence in diagnosis.
• A change to vendor-neutral open standards for images and annotations is essential to improve data management, sharing, and re-use.
• A lack of standardised data hinders the development of AI/ML tools for pathology.
• The frequency and scope of instrument calibration vary considerably between laboratories.
• Standardised calibration tools are needed to yield consistent, comparable images.
• The pathology community needs its own metrics to assess AI/ML performance.
Journal Article
Synthesis, Analysis and Reconstruction of Gene Regulatory Networks Using Evolutionary Algorithms
2014
Large and complex biological networks are thought to be built from small functional modules called motifs. To date, the fundamental behaviour of these motifs has been insufficiently studied, resulting in a lack of consensus on their role and presence in biology. Here we investigate two networks that produce biologically important dynamics, an oscillation and a toggle switch. We couple these motifs and observe multiple sets of combined dynamic behaviour and evidence of gene connectivity preferences between the two networks. Such fundamental studies of networks can be performed computationally with detailed mathematical analysis that may not be possible from experimental data due to noise and experimental costs. Computational studies can also be used in conjunction with experimental data to analyse and interpret large-scale data sets such as high-throughput data. Here we use such an approach to go beyond fundamental networks and model a system of particular interest in biology, the bacterium Streptomyces coelicolor, which produces a plethora of antibiotics and medicinal compounds. The regulatory network of genes in S. coelicolor is vast, and sub-networks can span hundreds or even thousands of genes. Currently there is insufficient data to statistically reverse engineer regulatory networks of this size; such cases are known as underdetermined problems. The complexity of real data due to noise is also a problem for inferring networks, and as a result much of the research community focuses on small artificial data sets to benchmark their algorithms. Here we develop a novel algorithm that uses data integration and processing within a multi-objective set-up, enhancing convergence through multiobjectivization. Additionally, our algorithm uses a decoupled optimisation approach to improve the optimisation, and parallel computation to significantly reduce run times. Our algorithm is general and can be applied to any network with time series data of any size. We compare biologically relevant sub-networks of various sizes within S. coelicolor under several optimisation arrangements and demonstrate that our novel approach performs best at every network size. Furthermore, we apply our algorithm to the PhoP sub-network of 911 genes within S. coelicolor, which is strongly linked to antibiotic production. All networks here are reconstructed from real experimental data. Our algorithm is able to build a regulatory model for the 911 genes in the PhoP network from time series data sets of up to 32 points, both of which are far larger than those handled by current methods.
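A minimal sketch of the decoupled, per-gene optimisation idea: each gene's incoming regulatory weights are fitted independently against that gene's own time series, keeping the search space small and making the fits trivially parallelisable. The linear ODE model, the (1+1) evolution strategy, and all parameter values are illustrative assumptions rather than the thesis' actual formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_gene(weights, expr, x0, dt=1.0):
    # Predict one gene's trajectory from the measured expression of all genes:
    # dx_i/dt ≈ w · x(t), integrated with a forward Euler step (an assumption).
    pred = np.empty(expr.shape[0])
    pred[0] = x0
    for t in range(1, expr.shape[0]):
        pred[t] = pred[t - 1] + dt * weights @ expr[t - 1]
    return pred

def fit_gene(target, expr, generations=500, sigma=0.1):
    # Decoupled fit: a (1+1) evolution strategy on one gene's incoming weights only.
    best = rng.normal(0.0, 0.1, expr.shape[1])
    best_err = np.mean((simulate_gene(best, expr, target[0]) - target) ** 2)
    for _ in range(generations):
        cand = best + rng.normal(0.0, sigma, best.shape)
        err = np.mean((simulate_gene(cand, expr, target[0]) - target) ** 2)
        if err < best_err:
            best, best_err = cand, err
    return best, best_err

# Toy data: 32 time points (as in the thesis) for a 5-gene toy network.
expr = rng.random((32, 5))
models = [fit_gene(expr[:, i], expr) for i in range(expr.shape[1])]
print([round(err, 4) for _, err in models])
```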
Dissertation
Relative Sensitivities and Correlation of Factors Introducing Uncertainty in Radiotherapy Dosimetry Audits
by Spencer Angus Thomas; Smith, Nadia A S; Krishnadas, Padmini
in Correlation analysis; Dosimeters; Dosimetry
2024
Dosimetry audits are carried out to determine how well radiotherapy is delivered to the patient. They are also used to understand the uncertainty introduced into the measurement result when using different computational models. As measurement procedures become increasingly complex with technological advancements, it is harder to establish sources of variability in measurements and to understand whether they stem from true differences in measurands or from the measurement pipelines themselves. The gamma index calculation is a widely accepted metric for comparing measured and predicted doses in radiotherapy. However, various steps in the measurement pipeline can introduce variation into the measurement result. In this paper, we perform a sensitivity and correlation analysis to investigate the influence of various input factors (i.e. settings) in gamma index calculations on the uncertainty introduced in dosimetry audits. We identify a number of factors where standardization will improve measurements by reducing variability in outputs. Furthermore, we compare gamma index metrics and similarities across audit sites.
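For context, a sketch of the standard gamma-index comparison between measured and predicted dose profiles (in the style of Low et al.). The 3%/3 mm criteria, global normalisation, and toy 1-D profiles are assumptions for illustration and need not match the settings studied in the paper.

```python
import numpy as np

def gamma_index(dose_meas, dose_calc, positions, dose_tol=0.03, dist_tol=3.0):
    # Global dose-difference criterion, normalised to the maximum measured dose.
    ref = dose_tol * dose_meas.max()
    gammas = np.empty_like(dose_meas)
    for i, (x_m, d_m) in enumerate(zip(positions, dose_meas)):
        dist2 = ((positions - x_m) / dist_tol) ** 2   # distance-to-agreement term
        dose2 = ((dose_calc - d_m) / ref) ** 2        # dose-difference term
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

x = np.linspace(-50, 50, 201)                      # positions in mm
measured = np.exp(-x**2 / (2 * 15.0**2))           # toy Gaussian dose profile
predicted = np.exp(-(x - 1.0)**2 / (2 * 15.5**2))  # slightly shifted/broadened
g = gamma_index(measured, predicted, x)
print(f"gamma pass rate: {100 * np.mean(g <= 1.0):.1f}%")
```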
Modeling dynamic gene expression in STREPTOMYCES COELICOLOR: Comparing single and multi‐objective setups
by Smith, Colin; Thomas, Spencer Angus; Laing, Emma
in decoupled network optimization; dynamic gene expression; evolutionary algorithms
2016
A common modeling technique for biological networks is to use gene regulatory networks (GRNs), where interactions are modeled on the gene level only and are mediated by their protein products. This chapter investigates several computational configurations for modeling the dynamic gene expression of networks of increasing size within the PhoP sub‐network, using the SysMO data set, which contains 32 time points for every gene. It reviews various optimization‐based modeling methods to reproduce the dynamic gene expression profiles of the genes in these networks. The chapter compares a single objective setup (SOS) for full network optimization with a comparable multi‐objective setup (MOS), and also investigates these methods in a decoupled optimization arrangement. The method uses real biological data from microarray experiments, which give genome‐wide expression profiles for all genes in Streptomyces coelicolor. Evolutionary algorithms (EAs) have been used to reconstruct GRNs from time‐series data.
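A compact illustration of the difference between the two setups: the single-objective setup (SOS) sums all per-gene fitting errors into one scalar, whereas the multi-objective setup (MOS) keeps one objective per gene and ranks candidates by Pareto dominance. The error values below are placeholders, not results from the chapter.

```python
import numpy as np

def sos_fitness(per_gene_errors):
    # SOS: one scalar to minimise; trade-offs between genes are hidden.
    return float(np.sum(per_gene_errors))

def dominates(a, b):
    # MOS: candidate a dominates b if it is no worse on every gene's error
    # and strictly better on at least one.
    a, b = np.asarray(a), np.asarray(b)
    return bool(np.all(a <= b) and np.any(a < b))

cand_1 = [0.10, 0.40, 0.05]   # per-gene errors for two hypothetical candidates
cand_2 = [0.20, 0.20, 0.20]

print("SOS fitness:", sos_fitness(cand_1), "vs", sos_fitness(cand_2))
print("cand_1 dominates cand_2:", dominates(cand_1, cand_2))  # False: a trade-off
```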
Book Chapter
Autofocusing drift tube linac envelopes
2021
To date, beam dynamics studies and design of combined zero degree drift tube linac (DTL) structures (Kombinierte Null Grad Struktur; KONUS) have only been carried out in multiparticle codes. Quantities such as the beam envelopes are obtained by averaging over particles, whose tracking is computationally intensive for large bunch populations. Tune computations, which depend on this average, are burdensome to obtain. This has motivated the implementation of a simulation of KONUS DTLs in the code transoptr, whose Hamiltonian treatment of beam dynamics enables the integration of energy gain from the longitudinal electric field on axis while simultaneously expanding the field in the transverse directions to obtain the linear optics. The code also features an in-built space charge capability. The evolution of the beam matrix, including longitudinal optics, is computed in a reference Frenet-Serret frame through the time-dependent DTL cavity fields. This enables fast envelope simulations for DTLs, resulting in a variable-energy sequential tune optimization capability. The implementation methodology and optimization techniques, applicable to any combination of DTL tanks and bunchers, are outlined. Comparisons with the code lorasr, in addition to beam-based E/A measurements of a DTL, are presented.
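As a rough illustration of envelope-style computation (tracking second moments rather than many particles), the sketch below integrates the linear moment equation dΣ/ds = FΣ + ΣFᵀ through a toy drift-focus-drift channel. The 2×2 single-plane model, hard-edge focusing strength, and step size are assumptions; energy gain and space charge, which transoptr handles, are omitted here.

```python
import numpy as np

def infinitesimal_map(k):
    # F for one transverse plane: x' = px, px' = -k(s) x  (linear optics only).
    return np.array([[0.0, 1.0],
                     [-k, 0.0]])

def transport(sigma, k_profile, ds):
    # Forward-Euler integration of dΣ/ds = FΣ + ΣFᵀ along the lattice.
    for k in k_profile:
        F = infinitesimal_map(k)
        sigma = sigma + ds * (F @ sigma + sigma @ F.T)
    return sigma

# Initial beam: 1 mm rms size, 1 mrad rms divergence, no correlation.
sigma0 = np.diag([1e-3**2, 1e-3**2])
k_profile = [0.0] * 500 + [25.0] * 200 + [0.0] * 500   # drift, focus, drift (m^-2)
sigma1 = transport(sigma0, k_profile, ds=1e-3)
print("rms size after channel: %.3f mm" % (1e3 * np.sqrt(sigma1[0, 0])))
```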
Journal Article
Significance of Epicardial and Intrathoracic Adipose Tissue Volume among Type 1 Diabetes Patients in the DCCT/EDIC: A Pilot Study
by Backlund, Jye-Yu C.; Darabian, Sirous; Sheidaee, Nasim
in Adipose tissue; Adipose Tissue - diagnostic imaging; Adult
2016
Type 1 diabetes (T1DM) patients are at increased risk of coronary artery disease (CAD). This pilot study sought to evaluate the relationship between epicardial adipose tissue (EAT) and intra-thoracic adipose tissue (IAT) volumes and cardio-metabolic risk factors in T1DM.
EAT/IAT volumes in 100 patients who underwent non-contrast cardiac computed tomography in the Diabetes Control and Complications Trial / Epidemiology of Diabetes Interventions and Complications (DCCT/EDIC) study were measured by a certified reader. Fat was defined as a pixel density of -30 to -190 Hounsfield units. The associations were assessed using Pearson partial correlation and linear regression models adjusted for gender and age, with inverse probability sample weighting.
The weighted mean age was 43 years (range 32-57) and 53% were male. Adjusted for gender, Pearson correlation analysis showed a significant correlation between age and EAT/IAT volumes (both p<0.001). After adjusting for gender and age, participants with greater BMI, a higher waist-to-hip ratio (WTH), higher weighted HbA1c, an elevated triglyceride level, and a history of an albumin excretion rate equal to or greater than 300 mg/d (AER≥300) or end-stage renal disease (ESRD) had significantly larger EAT/IAT volumes.
T1DM patients with greater BMI, WTH ratio, weighted HbA1c level, triglyceride level, and AER≥300/ESRD had significantly larger EAT/IAT volumes. Studies with larger sample sizes are recommended to evaluate the independence of these associations.
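A small sketch of the Hounsfield-unit thresholding step used to quantify adipose tissue on non-contrast CT: voxels within the fat window stated in the abstract (-190 to -30 HU) and inside a region of interest are counted and converted to a volume. The toy CT volume, ROI, and voxel spacing are placeholders.

```python
import numpy as np

def fat_volume_ml(hu_volume, roi_mask, voxel_spacing_mm):
    # Fat voxels: within the HU window and inside the (e.g. pericardial) ROI.
    fat = (hu_volume >= -190) & (hu_volume <= -30) & roi_mask
    voxel_ml = np.prod(voxel_spacing_mm) / 1000.0   # mm^3 -> mL
    return fat.sum() * voxel_ml

hu = np.random.randint(-1000, 400, size=(40, 256, 256))   # toy CT volume
roi = np.zeros_like(hu, dtype=bool)
roi[10:30, 64:192, 64:192] = True                          # toy ROI
print("fat volume: %.1f mL" % fat_volume_ml(hu, roi, (2.5, 0.7, 0.7)))
```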
Journal Article
Quality Control Measures over 30 Years in a Multicenter Clinical Study: Results from the Diabetes Control and Complications Trial / Epidemiology of Diabetes Interventions and Complications (DCCT/EDIC) Study
by Diminick, Lisa; Morrison, Anthony D.; Klumpp, Kandace A.
in Adolescent; Adult; Care and treatment
2015
Implementation of multicenter and/or longitudinal studies requires an effective quality assurance program to identify trends, data inconsistencies and process variability of results over time. The Diabetes Control and Complications Trial (DCCT) and the follow-up Epidemiology of Diabetes Interventions and Complications (EDIC) study represent over 30 years of data collection among a cohort of participants across 27 clinical centers. The quality assurance plan is overseen by the Data Coordinating Center and is implemented across the clinical centers and central reading units. Each central unit incorporates specific DCCT/EDIC quality monitoring activities into their routine quality assurance plan. The results are reviewed by a data quality assurance committee whose function is to identify variances in quality that may impact study results from the central units as well as within and across clinical centers, and to recommend implementation of corrective procedures when necessary. Over the 30-year period, changes to the methods, equipment, or clinical procedures have been required to keep procedures current and ensure continued collection of scientifically valid and clinically relevant results. Pilot testing to compare historic processes with contemporary alternatives is performed and comparability is validated prior to incorporation of new procedures into the study. Details of the quality assurance plan across and within the clinical and central reading units are described, and quality outcomes for core measures analyzed by the central reading units (e.g. biochemical samples, fundus photographs, ECGs) are presented.
Journal Article