Catalogue Search | MBRL
24 result(s) for "real world variability"
In Shift and In Variance: Assessing the Robustness of HAR Deep Learning Models Against Variability
by Roggen, Daniel; Lago, Paula; Khaked, Azhar Ali
in Accuracy; Artificial intelligence; Automation
2025
Deep learning (DL)-based Human Activity Recognition (HAR) using wearable inertial measurement unit (IMU) sensors can revolutionize continuous health monitoring and early disease prediction. However, most DL HAR models are untested in their robustness to real-world variability, as they are trained on limited lab-controlled data. In this study, we isolated and analyzed the effects of the subject, device, position, and orientation variabilities on DL HAR models using the HARVAR and REALDISP datasets. The Maximum Mean Discrepancy (MMD) was used to quantify shifts in the data distribution caused by these variabilities, and the relationship between the distribution shifts and model performance was drawn. Our HARVAR results show that different types of variability significantly degraded the DL model performance, with an inverse relationship between the data distribution shifts and performance. The compounding effect of multiple variabilities studied using REALDISP further underscores the challenges of generalizing DL HAR models to real-world conditions. Analyzing these impacts highlights the need for more robust models that generalize effectively to real-world settings. The MMD proved valuable for explaining the performance drops, emphasizing its utility in evaluating distribution shifts in HAR data.
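The Maximum Mean Discrepancy mentioned in the abstract can be illustrated with a minimal sketch. The RBF kernel, the bandwidth `gamma`, and the synthetic Gaussian samples below are common illustrative choices, not the study's actual configuration or data.

```python
import numpy as np

def mmd_rbf(X, Y, gamma=1.0):
    """Biased (V-statistic) estimate of squared MMD between samples
    X and Y using an RBF kernel exp(-gamma * ||a - b||^2)."""
    def k(A, B):
        # pairwise squared Euclidean distances via the expansion trick
        d2 = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2 * A @ B.T
        return np.exp(-gamma * d2)
    return k(X, X).mean() + k(Y, Y).mean() - 2 * k(X, Y).mean()

rng = np.random.default_rng(0)
same = mmd_rbf(rng.normal(0, 1, (200, 3)), rng.normal(0, 1, (200, 3)))
shifted = mmd_rbf(rng.normal(0, 1, (200, 3)), rng.normal(2, 1, (200, 3)))
# A distribution shift (e.g., a different sensor position) yields a larger MMD.
assert shifted > same
```

Larger MMD values indicate a larger shift between the two distributions, which is the quantity the study relates to model performance drops.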
Journal Article
Entropy of Real-World Gait in Parkinson’s Disease Determined from Wearable Sensors as a Digital Marker of Altered Ambulatory Behavior
2020
Parkinson’s disease (PD) is a common age-related neurodegenerative disease. Gait impairment is frequent in the later stages of PD contributing to reduced mobility and quality of life. Digital biomarkers such as gait velocity and step length are predictors of motor and cognitive decline in PD. Additional gait parameters may describe different aspects of gait and motor control in PD. Sample entropy (SampEnt), a measure of signal predictability, is a nonlinear approach that quantifies regularity of a signal. This study investigated SampEnt as a potential biomarker for PD and disease duration. Real-world gait data over a seven-day period were collected using an accelerometer (Axivity AX3, York, UK) placed on the low back and gait metrics extracted. SampEnt was determined for the stride time, with vector length and threshold parameters optimized. People with PD had higher stride time SampEnt compared to older adults, indicating reduced gait regularity. The range of SampEnt increased over 36 months for the PD group, although the mean value did not change. SampEnt was associated with dopaminergic medication dose but not with clinical motor scores. In conclusion, this pilot study indicates that SampEnt from real-world data may be a useful parameter reflecting clinical status although further research is needed involving larger populations.
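Sample entropy as used above admits a compact textbook formulation: SampEn = -ln(A/B), where B counts template pairs of length m within tolerance r (Chebyshev distance) and A the same for length m+1. The sketch below uses the common convention r = 0.2 × SD; the study optimized its own vector length and threshold, which this example does not reproduce.

```python
import numpy as np

def sample_entropy(x, m=2, r=None):
    """Sample entropy SampEn(m, r) of a 1-D series: -ln(A/B).
    Higher values indicate a less regular (less predictable) signal."""
    x = np.asarray(x, dtype=float)
    if r is None:
        r = 0.2 * x.std()  # common convention: 20% of the signal's SD
    def count(mm):
        templates = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        # Chebyshev distance between all template pairs
        d = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        return np.sum(d <= r) - len(templates)  # exclude self-matches
    B, A = count(m), count(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else float("inf")

# A noisy series is less predictable than a strictly periodic one,
# mirroring the higher stride-time SampEnt reported for the PD group.
rng = np.random.default_rng(1)
periodic = np.sin(np.linspace(0, 20 * np.pi, 400))
noisy = rng.normal(size=400)
assert sample_entropy(noisy) > sample_entropy(periodic)
```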
Journal Article
Understanding the origins and variability of the fuel consumption gap: lessons learned from laboratory tests and a real-driving campaign
2020
Background
Divergence in fuel consumption (FC) between the type-approval tests and real-world driving trips, known also as the FC gap, is a well-known issue, and Europe is preparing the field for tackling it. The present study focuses on monitoring the FC of a single vehicle throughout 1 year with 20 different drivers and almost 14,000 km driven, with the aim of analyzing and quantifying the true intrinsic variability in the FC gap coming from environmental and traffic conditions and driving factors. In addition, a regression model was developed to evaluate the importance of these different factors on the FC gap's variability.
Results
The 1-year FC gap measured in this study was 29%, while drivers' averages ranged from 16 to 106%. The regression model had an R² of 90.4%, meaning that more than 90% of the FC gap's variance can be explained by this model and the factors measured in this study. The results of the model showed that, among all factors analyzed, the highest contribution to the FC gap's variance comes from the average vehicle speed (16.6%), followed by the road grade (13.4%) and trip distance (10.1%). Indeed, the highest FC gaps were measured when the average vehicle speed was below 20 km/h, the average distance-weighted road grade above 1%, and the trip distance below 5 km. In addition, the impact of driver factors is not negligible (25%): the highest FC gap was measured for trips where the average positive acceleration was higher than 0.7 m/s² (indicating aggressive driving) and the electric power demand higher than 800 W.
Conclusions
The future lifetime on-board fuel consumption reporting is a crucial instrument that will allow monitoring the evolution of the FC gap and ensuring that it does not increase over time. The analysis presented in this study is a basis for setting up a more detailed and refined prediction model, which could assist the European Commission in closely monitoring the gap and the underlying factors generating it.
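The variance-explained figure quoted above (R² of 90.4%) comes from an ordinary least-squares fit. The sketch below shows how such an R² is computed; the synthetic trip data and coefficients are purely illustrative stand-ins for the study's factors (average speed, road grade, trip distance), not the authors' dataset or model.

```python
import numpy as np

# Illustrative synthetic trips: FC gap driven mostly by three factors plus noise.
rng = np.random.default_rng(0)
n = 500
speed = rng.uniform(10, 120, n)   # average vehicle speed, km/h
grade = rng.normal(0, 1.5, n)     # distance-weighted road grade, %
dist = rng.uniform(1, 50, n)      # trip distance, km
gap = 60 - 0.3 * speed + 8 * grade - 0.5 * dist + rng.normal(0, 4, n)

# Ordinary least squares with an intercept column
X = np.column_stack([np.ones(n), speed, grade, dist])
beta, *_ = np.linalg.lstsq(X, gap, rcond=None)
pred = X @ beta

# R² = 1 - residual sum of squares / total sum of squares
r2 = 1 - np.sum((gap - pred) ** 2) / np.sum((gap - gap.mean()) ** 2)
assert r2 > 0.9  # most of the gap's variance is explained by the factors
```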
Journal Article
Development of a Physiologically Based Model of Bilirubin Metabolism in Health and Disease and Its Comparison With Real‐World Data
by Kuepfer, Lars; Sayin, Ahenk Zeynep
in Adult; ATP-Binding Cassette, Sub-Family C Proteins - metabolism; bilirubin
2026
Bilirubin is a breakdown product of erythrocytes and plays a crucial role in elimination of heme‐containing proteins. After its synthesis in the reticuloendothelial system, unconjugated bilirubin is released into plasma and taken up into the liver. In hepatocytes, bilirubin is conjugated and excreted into the gastrointestinal tract via bile, where it is further converted to urobilinoids. There are various genetic factors causing abnormal bilirubin levels in plasma, such as Gilbert syndrome, Crigler‐Najjar syndrome, Dubin‐Johnson syndrome, and Rotor syndrome. To better understand bilirubin metabolism and its disorders, this study develops a physiologically based computational model incorporating published literature as well as real‐world clinical data from the Explorys database. The model simulates bilirubin levels in both healthy individuals and patients with disorders of bilirubin metabolism. Population simulations show that Gilbert syndrome requires a substantial reduction in UDP‐glucuronosyltransferase 1A1 activity, while Crigler‐Najjar syndrome requires near‐complete loss of its function. In contrast, Dubin‐Johnson syndrome is characterized by a significant impairment of multidrug resistance‐associated protein 2 activity. To also illustrate model behavior under targeted perturbations, we simulated administration of atazanavir in healthy individuals and patients with Gilbert syndrome to investigate its effect on bilirubin levels. Relative to baseline, unconjugated bilirubin maximum concentration (Cmax) increased by 34% in healthy individuals but by 67% in Gilbert syndrome. Overall, this study provides a conceptual and mechanistically informed framework for studying bilirubin homeostasis and the functional consequences of drug administration in health and disease.
Study Highlights
What is the current knowledge on the topic?
Bilirubin metabolism involves breakdown of heme into unconjugated bilirubin, which is then conjugated in the liver and excreted into the bile. Disorders such as Gilbert syndrome, Crigler‐Najjar syndrome, Dubin‐Johnson syndrome, and Rotor syndrome disrupt this process, leading to abnormal bilirubin levels.
What question did this study address?
We developed a whole‐body physiologically based computational model representing bilirubin metabolism to evaluate bilirubin levels in health and bilirubin‐related disorders, addressing interindividual variability, disorder‐specific mechanistic alterations, and effects of dynamic perturbations.
What does this study add to our knowledge?
The study presents functional analyses of bilirubin homeostasis and specific metabolic disorders. It is the first study to systematically compare simulations of bilirubin metabolism with real‐world clinical data, providing insights into interindividual variability and disorder‐specific patterns.
How might this change drug discovery, development, and/or therapeutics?
The model provides a conceptual and mechanistic framework for investigation of drug‐induced perturbations in bilirubin metabolism. The model can in particular be used to explore aberrant states in vulnerable patient subgroups with specific metabolic disorders at the population level.
Journal Article
Use of long-acting muscarinic antagonists for severe asthma: insights from clinicians in the SHARP network
by Marcon, Alessandro; Berret, Emmanuelle; Sont, Jacob K.
in Acetylcholine receptors (muscarinic); Administration, Inhalation; Adult
2025
Background
There is limited real-world evidence on the use and positioning of inhaled long-acting muscarinic antagonists (LAMAs) for treating severe asthma.
Objective
We aimed to assess the differences in clinicians’ perspectives on prescribing LAMA for severe asthma across Europe.
Methods
Within the Severe Heterogeneous Asthma Research collaboration, Patient-centred (SHARP) network, we conducted a multi-country survey of 470 respiratory clinicians in 2023-24.
Results
Most participants were pneumonologists (83%). 68% reported using specific criteria for prescribing LAMA for severe asthma, primarily fixed bronchial obstruction (68%), frequent asthma exacerbations (65%), and a history of smoking (53%). Improved quality of life, lung function improvement, and reduction of asthma exacerbations were the three expected outcomes of LAMA treatment that showed the largest agreement across clinicians (85–95%). As compared to non-severe asthma specialists (about 50% of the sample), severe asthma specialists were more likely to report always prescribing LAMA before biologics (54 vs. 41%) and before oral corticosteroids (OCS) (51 vs. 40%). Approximately 90% of participants prioritised tapering or discontinuing OCS before stopping LAMA. Overall, participants in Northern (n = 73) and Western Europe (n = 71) appeared less prone to prescribe LAMA and less confident in their benefits compared to those from Eastern (n = 88) and Southern Europe (n = 238). In particular, most participants from Latvia (91%) and Lithuania (67%) reported that LAMAs are not reimbursed for severe asthma in their countries. Most participants who expressed their opinion favoured triple therapy (n = 236) over single inhaler therapy (n = 91).
Conclusion
Our findings show that barriers and heterogeneous approaches still limit LAMA prescription for severe asthma in Europe, potentially leading to the underuse of this treatment option. Establishing a clear role for LAMA within a precision medicine framework is crucial; this aspect is not yet firmly supported by current international recommendations.
Highlights
What is already known about this topic?
Long-acting muscarinic antagonists (LAMA) are approved for treating severe asthma, but the best responder profile is unclear.
What does this article add to our knowledge?
Our survey documents heterogeneous perspectives and barriers to LAMA use across Europe, examining the background of different prescription behaviours.
How does this study impact current management guidelines?
The findings highlight the need for international recommendations to specify which patient profiles are most likely to benefit from LAMA treatment and to clarify LAMA positioning within the step-up and step-down approach to asthma management.
Journal Article
An exploratory factor analysis model for slum severity index in Mexico City
2020
Today, over half of the world’s population lives in urban areas and it is projected that, by 2050, two out of three people will live in a city. This increased rural–urban migration, coupled with housing poverty, has led to the growth and formation of informal settlements, commonly known as slums. In Mexico, 25% of the urban population now live in informal settlements with varying degrees of deprivation. Although some informal neighbourhoods have contributed to the upward mobility of the inhabitants, the majority still lack basic services. Mexico City and the conurbation around it form a mega city of 21 million people that has been growing in a manner qualified as ‘highly unproductive, (that) deepens inequality, raises pollution levels’ (available at: https://www.smartcitiesdive.com/ex/sustainablecitiescollective/making-way-urban-reform-mexico/176466/) and contains the largest slum in the world: Neza-Chalco-Izta. Urban reforms are now aiming to improve the conditions in these slums and it is therefore very important to have reliable tools to measure the changes that are underway. In this paper, we use exploratory factor analysis to define an index of shelter deprivation in Mexico City, namely the Slum Severity Index (SSI), based on the UN-HABITAT’s definition of slum. We apply this novel approach to the Census survey of Mexico and measure the shelter deprivation levels of households from 1990 to 2010. The analysis highlights high variability in housing conditions within Mexico City. We find that the SSI decreased significantly between 1990 and 2000 as a result of several policy reforms but increased between 2000 and 2010. We also show correlations of the SSI with other social factors such as education, health and fertility. We present a validation of the SSI using Grey Level Co-occurrence Matrix (GLCM) features extracted from Very-High Resolution (VHR) remote-sensed satellite images. Finally, we show that the SSI can present a cardinally meaningful assessment of the extent of deprivation compared with a similar index defined by Connolly (Connolly P (2009) Observing the evolution of irregular settlements: Mexico city’s colonias populares, 1990 to 2005. International Development Planning Review 31: 1–35) that studies shelter deprivation in Mexico.
Journal Article
Electronic health record data quality variability across a multistate clinical research network
by Mohamed, Yahia; Song, Xing; Zozus, Meredith
in Advancing Translational Science through Real-World Data and Real-World Evidence; common data model; Data integrity
2023
Electronic health record (EHR) data have many quality problems that may affect the outcome of research results and decision support systems. Many methods have been used to evaluate EHR data quality. However, there has yet to be a consensus on the best practice. We used a rule-based approach to assess the variability of EHR data quality across multiple healthcare systems.
To quantify data quality concerns across healthcare systems in a PCORnet Clinical Research Network, we used a previously tested rule-based framework tailored to the PCORnet Common Data Model to perform data quality assessment at 13 clinical sites across eight states. Results were compared with the current PCORnet data curation process to explore the differences between both methods. Additional analyses of testosterone therapy prescribing were used to explore clinical care variability and quality.
The framework detected discrepancies across sites, revealing clear variability in data quality between them. The detailed requirements encoded in the rules captured additional data errors with a specificity that aids remediation of technical errors, compared with the current PCORnet data curation process. Other rules designed to detect logical and clinical inconsistencies may also support clinical care variability and quality programs.
Rule-based EHR data quality methods quantify significant discrepancies across all sites. Medication and laboratory sources are causes of data errors.
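A rule-based data quality assessment of the kind described above can be sketched as a set of named predicates run over each record, yielding a per-rule failure report. The field names and rules below are illustrative only; they are not the PCORnet Common Data Model or the framework's actual rule set.

```python
from datetime import date

# Each rule is (name, predicate); a record fails when the predicate is False.
# Field names and thresholds are hypothetical examples.
RULES = [
    ("birth_date not in the future", lambda r: r["birth_date"] <= date.today()),
    ("lab value non-negative", lambda r: r["lab_value"] >= 0),
    ("encounter has a known type", lambda r: r["enc_type"] in {"AV", "IP", "ED"}),
]

def run_rules(records):
    """Return {rule_name: [indices of failing records]} for remediation."""
    failures = {name: [] for name, _ in RULES}
    for i, rec in enumerate(records):
        for name, pred in RULES:
            if not pred(rec):
                failures[name].append(i)
    return failures

records = [
    {"birth_date": date(1980, 5, 1), "lab_value": 4.2, "enc_type": "AV"},
    {"birth_date": date(2999, 1, 1), "lab_value": -1.0, "enc_type": "XX"},
]
report = run_rules(records)
assert report["lab value non-negative"] == [1]
```

Reporting failures per named rule, rather than as an aggregate score, is what gives this approach the site-level specificity the abstract credits it with.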
Journal Article
Real-world antibiotic utilization during pregnancy in Italy: a multiregional retrospective population-based study
by Trotta, Francesco; Davoli, Marina; Perna, Serena
in Adult; Anti-Bacterial Agents - therapeutic use; Antibiotics
2025
Background
Exposure to antibiotics during pregnancy is frequent, despite the limited evidence derived from clinical trials. Drug utilization studies could improve knowledge on utilization of these medications during this critical period. In this context, the present study aimed to describe antibiotic exposure during pregnancy in Italy at both national and regional levels.
Methods
This retrospective population-based study involved a cohort of women who gave birth from 2016 to 2018 and were residents of one of the following Italian regions: Lombardy, Veneto, Emilia-Romagna, Tuscany, Umbria, Lazio, Apulia or Sardinia. A series of sociodemographic and clinical characteristics were retrieved from regional healthcare databases. The prevalence of antibiotic use was estimated over nine trimesters, divided into three periods: pre-pregnancy (-III, -II, -I), during pregnancy (I, II, III) and post-pregnancy (+I, +II, +III). Analyses were stratified by region and by invasive prenatal diagnostics performed.
Results
A total of 449,012 women were included in the study, of whom more than 37% were aged ≥ 35 years at birth. The overall prevalence rates of antibiotic use in the study cohort were 33.9% pre-pregnancy (per trimester: -III = 14.3%, -II = 14.5%, -I = 14.5%), 31.8% during pregnancy (per trimester: I = 12.0%, II = 16.0%, III = 11.4%) and 29.3% post-pregnancy (per trimester: +I = 15.3%, +II = 9.7%, +III = 11.0%). The regions with the lowest usage pre-, during and post-pregnancy were Lombardy (29.7%, 26.1%, 28.0%) and Veneto (28.8%, 26.4%, 25.5%), whereas Apulia reached the highest values (45.6%, 41.6%, 38.3%). The highest peaks during pregnancy were reached by Umbria (25.8%), Latium (24.1%) and Apulia (21.4%). Women who underwent chorionic villus sampling and those who underwent amniocentesis registered a peak during trimester I (25%) and trimester II (41%), respectively. These peaks were in line with the timing of the invasive prenatal diagnostic procedures.
Conclusions
The use of antibiotics during pregnancy in Italy was in line with other European countries, reflecting national and international guidelines. However, a certain level of misuse of specific antibiotics and different utilization rates across the regions were observed. Continuous monitoring of long- and short-term outcomes associated with exposure to antibiotics during pregnancy may contribute to reducing excessive utilization and improving the diffusion of more appropriate procedures and practices.
Journal Article
Real-World Insights into the Effectiveness and Tolerability of OnabotulinumtoxinA in Chronic Migraine: A Long-Term Evaluation of up to 11 Years
2025
Background: Chronic migraine (CM) is a debilitating neurological disorder that imposes substantial burdens on individuals and society, including diminished quality of life and increased healthcare utilization. While the efficacy of botulinum neurotoxin type A (BoNT-A) has been demonstrated in controlled trials, this longitudinal, real-world study offers unprecedented evidence of its long-term benefits, with patients followed for a median of 15 months (interquartile range: 6–36 months) and up to 11 years. Methods: This retrospective analysis included 579 patients diagnosed with CM who were newly treated with BoNT-A, according to the PREEMPT protocol, receiving injections every 12 weeks at doses of 155–195 units across 31–39 sites. Outcomes were assessed through changes in monthly headache days, frequency, symptomatic medication use, and migraine-related disability using Migraine Disability Assessment (MIDAS) scores up to 60 months from recruitment. Safety was evaluated by recording treatment-emergent adverse events (TEAEs), with a focus on long-term tolerability and subgroup variability. Results: Patients showed sustained improvements, with the mean number of monthly headache days decreasing from 22.7 to 5.5, and symptomatic medication use dropping from 33.4 to 3.7 mean doses at 60 months. Additionally, over 60% of patients improved from severe (MIDAS Grade IV) to minimal disability (MIDAS Grade I). Subgroup analysis revealed variability in response rates, emphasizing the need for personalized approaches. TEAEs were predominantly mild, with no new adverse events reported after 36 months, supporting the long-term safety of BoNT-A in real-world settings. Conclusions: This real-world study provides significant evidence for the long-term efficacy, safety, and tolerability of BoNT-A in the preventive treatment of CM. The findings highlight the importance of real-world data to account for patient variability and tailoring treatment strategies.
Journal Article
Epidemiology and Genetic Characterization of Distinct Ebola Sudan Outbreaks in Uganda
by Ciccozzi, Massimo; Scarpa, Fabio; Branda, Francesco
in Data collection; Datasets; Disease transmission
2025
Background. Sudan virus (SUDV) has caused multiple outbreaks in Uganda over the past two decades, leading to significant morbidity and mortality. The recent outbreaks in 2022 and 2025 highlight the ongoing threat posed by SUDV and the challenges in its containment. This study aims to characterize the epidemiological patterns and phylogenomic evolution of SUDV outbreaks in Uganda, identifying key factors influencing transmission and disease severity. Methods. We conducted a retrospective observational study analyzing epidemiological and genomic data from SUDV outbreaks in Uganda between 2000 and 2025. Epidemiological data were collected from official sources, including the Ugandan Ministry of Health and the World Health Organization, supplemented with reports from public health organizations. Genomic sequences of SUDV were analyzed to investigate viral evolution and identify genetic variations associated with pathogenicity and transmissibility. Results. The 2022 outbreak involved 164 confirmed cases and a case fatality rate (CFR) of 33.5%, with significant geographic variation in case distribution. The 2025 outbreak, still ongoing, was first detected in Kampala, with evidence of both nosocomial and community transmission. Phylogenomic analysis revealed the presence of two main genetic groups, representing Sudan and Uganda, respectively. The genetic variability of the Ugandan cluster is higher than that observed in Sudan, suggesting a greater expansion potential, which aligns with the current outbreak. Epidemiological findings indicate that human mobility, weaknesses in the health system, and delays in detection contribute to the amplification of the outbreak. Conclusions. Our findings underscore the importance of integrated genomic and epidemiological surveillance in understanding SUDV transmission dynamics. The recurrent emergence of SUDV highlights the need for improved outbreak preparedness, rapid response mechanisms, and international collaboration. Strengthening real-time surveillance and enhancing healthcare system resilience are critical to mitigating the impact of future outbreaks.
Journal Article