80 result(s) for "Wolfson, Julian"
Decision trees in epidemiological research
Background In many studies, it is of interest to identify population subgroups that are relatively homogeneous with respect to an outcome. The nature of these subgroups can provide insight into effect mechanisms and suggest targets for tailored interventions. However, identifying relevant subgroups can be challenging with standard statistical methods. Main text We review the literature on decision trees, a family of techniques for partitioning the population, on the basis of covariates, into distinct subgroups that share similar values of an outcome variable. We compare two decision tree methods, the popular Classification and Regression Tree (CART) technique and the newer Conditional Inference Tree (CTree) technique, assessing their performance in a simulation study and using data from the Box Lunch Study, a randomized controlled trial of a portion size intervention. Both CART and CTree identify homogeneous population subgroups and offer improved prediction accuracy relative to regression-based approaches when subgroups are truly present in the data. An important distinction between CART and CTree is that the latter uses a formal statistical hypothesis testing framework in building decision trees, which simplifies the process of identifying and interpreting the final tree model. We also introduce a novel way to visualize the subgroups defined by decision trees; this graphical visualization provides a more scientifically meaningful characterization of the subgroups the trees identify. Conclusions Decision trees are a useful tool for identifying homogeneous subgroups defined by combinations of individual characteristics. While all decision tree techniques generate subgroups, we advocate the use of the newer CTree technique due to its simplicity and ease of interpretation.
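The core of the CART regression-tree approach described above is choosing, at each node, the covariate split that makes the resulting child nodes most homogeneous in the outcome. A minimal pure-Python sketch of that split search on a single covariate (function name and variance-reduction criterion are illustrative, not the paper's or any library's implementation):

```python
def best_split(xs, ys):
    """Find the threshold on one covariate that minimizes the summed
    within-node sum of squared errors of the outcome -- the CART
    criterion for regression trees. Pure-Python illustration only."""
    def sse(vals):
        if not vals:
            return 0.0
        m = sum(vals) / len(vals)
        return sum((v - m) ** 2 for v in vals)

    pairs = sorted(zip(xs, ys))
    best_thr, best_cost = None, float("inf")
    for i in range(1, len(pairs)):
        if pairs[i][0] == pairs[i - 1][0]:
            continue  # cannot split between equal covariate values
        thr = (pairs[i][0] + pairs[i - 1][0]) / 2
        left = [y for x, y in pairs if x <= thr]
        right = [y for x, y in pairs if x > thr]
        cost = sse(left) + sse(right)
        if cost < best_cost:
            best_thr, best_cost = thr, cost
    return best_thr
```

A full tree applies this search recursively over all covariates; CTree replaces the raw impurity criterion with permutation hypothesis tests, which is what makes its stopping rule more principled.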
Relevance of Interleukin-6 and D-Dimer for Serious Non-AIDS Morbidity and Death among HIV-Positive Adults on Suppressive Antiretroviral Therapy
Despite effective antiretroviral treatment (ART), HIV-positive individuals are at increased risk of serious non-AIDS conditions (cardiovascular, liver and renal disease, and cancers), perhaps due in part to ongoing inflammation and/or coagulation. To estimate the potential risk reduction in serious non-AIDS conditions or death from any cause that might be achieved with treatments that reduce inflammation and/or coagulation, we examined associations of interleukin-6 (IL-6), D-dimer, and high-sensitivity C-reactive protein (hsCRP) levels with serious non-AIDS conditions or death in 3 large cohorts. In HIV-positive adults on suppressive ART, associations of IL-6, D-dimer, and hsCRP levels at study entry with serious non-AIDS conditions or death were studied using Cox regression. Hazard ratios (HR) adjusted for age, gender, study, and regression dilution bias (due to within-person biomarker variability) were used to predict risk reductions in serious non-AIDS conditions or death associated with lower "usual" levels of IL-6 and D-dimer. Over 4.9 years of mean follow-up, 260 of the 3766 participants experienced serious non-AIDS conditions or death. IL-6, D-dimer and hsCRP were each individually associated with risk of serious non-AIDS conditions or death, HR = 1.45 (95% CI: 1.30 to 1.63), 1.28 (95% CI: 1.14 to 1.44), and 1.17 (95% CI: 1.09 to 1.26) per 2x higher biomarker levels, respectively. In joint models, IL-6 and D-dimer were independently associated with serious non-AIDS conditions or death, with consistent results across the 3 cohorts and across serious non-AIDS event types. The association of IL-6 and D-dimer with serious non-AIDS conditions or death was graded and persisted throughout follow-up. For 25% lower "usual" IL-6 and D-dimer levels, the joint biomarker model estimates a 37% reduction (95% CI: 28 to 46%) in the risk of serious non-AIDS conditions or death if the relationship is causal.
Both IL-6 and D-dimer are independently associated with serious non-AIDS conditions or death among HIV-positive adults with suppressed virus. This suggests that treatments that reduce IL-6 and D-dimer levels might substantially decrease morbidity and mortality in patients on suppressive ART. Clinical trials are needed to test this hypothesis.
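The hazard ratios above are reported "per 2x higher biomarker levels". When a Cox model is fit with the biomarker on a natural-log scale, converting the estimated coefficient to a per-doubling HR is simple arithmetic, since a doubling corresponds to ln(2) units on that scale. A small sketch (function name is mine; assumes the coefficient was estimated per natural-log unit):

```python
import math

def hr_per_doubling(beta_per_log_unit):
    """Convert a Cox regression coefficient estimated on a natural-log
    biomarker scale into a hazard ratio per doubling of the biomarker.
    One doubling = ln(2) units on the log scale, so HR = exp(beta * ln 2)."""
    return math.exp(beta_per_log_unit * math.log(2))
```

For example, the reported IL-6 hazard ratio of 1.45 per 2x corresponds to a log-scale coefficient of ln(1.45)/ln(2) ≈ 0.536.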
Methods for Real-Time Prediction of the Mode of Travel Using Smartphone-Based GPS and Accelerometer Data
We propose and compare combinations of several methods for classifying transportation activity data from smartphone GPS and accelerometer sensors. We have two main objectives. First, we aim to classify our data as accurately as possible. Second, we aim to reduce the dimensionality of the data as much as possible in order to reduce the computational burden of the classification. We combine dimension reduction and classification algorithms and compare them with a metric that balances accuracy and dimensionality. In doing so, we develop a classification algorithm that accurately classifies five different modes of transportation (i.e., walking, biking, car, bus and rail) while being computationally simple enough to run on a typical smartphone. Further, collecting our data required no behavioral changes from the smartphone users. Our best classification model uses the random forest algorithm to achieve 96.8% accuracy.
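A classifier like the one described needs raw accelerometer streams reduced to a small per-window feature vector before a random forest can consume them. A minimal sketch of that step (the specific features, mean and standard deviation of the acceleration magnitude, are illustrative, not the paper's exact feature set):

```python
import math

def window_features(ax, ay, az):
    """Summarize one fixed-length accelerometer window (per-axis samples)
    into a low-dimensional feature vector: mean and standard deviation of
    the acceleration magnitude. Magnitude is used so the features are
    invariant to phone orientation."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in zip(ax, ay, az)]
    n = len(mags)
    mean = sum(mags) / n
    var = sum((m - mean) ** 2 for m in mags) / n
    return [mean, math.sqrt(var)]
```

Each window's feature vector, possibly alongside GPS-derived speed features, becomes one row of the training matrix for the mode classifier.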
Repurposing non-pharmacological interventions for Alzheimer's disease through link prediction on biomedical literature
Non-pharmaceutical interventions (NPI) have great potential to improve cognitive function, but their repurposing for Alzheimer's Disease (AD) has received limited investigation. This is the first study to develop an innovative framework to extract and represent NPI information from biomedical literature in a knowledge graph (KG), and train link prediction models to repurpose novel NPIs for AD prevention. We constructed a comprehensive KG, called ADInt, by extracting NPI information from biomedical literature. We used the previously-created SuppKG and NPI lexicon to identify NPI entities. Four KG embedding models (i.e., TransE, RotatE, DistMult and ComplEx) and two novel graph convolutional network models (i.e., R-GCN and CompGCN) were trained and compared to learn the representation of ADInt. Models were evaluated and compared on two test sets (time slice and clinical trial ground truth) and the best performing model was used to predict novel NPIs for AD. Discovery patterns were applied to generate mechanistic pathways for high scoring candidates. ADInt has 162,212 nodes and 1,017,284 edges. R-GCN performed best in the time slice (MR = 5.2054, Hits@10 = 0.8496) and clinical trial ground truth (MR = 3.4996, Hits@10 = 0.9192) test sets. After evaluation by domain experts, 10 novel dietary supplements and 10 complementary and integrative health interventions were proposed from the score table calculated by R-GCN. Among the proposed novel NPIs, we found plausible mechanistic pathways for photodynamic therapy and Choerospondias axillaris to prevent AD, and validated psychotherapy and manual therapy techniques using real-world data analysis. The proposed framework shows potential for discovering new NPIs for AD prevention and understanding their mechanistic pathways.
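Of the embedding models compared above, TransE has the simplest scoring idea: a relation is modeled as a translation in embedding space, so a triple (head, relation, tail) is plausible when head + relation lands near tail. A toy sketch of that score (hand-set vectors for illustration; real models learn the embeddings from the knowledge graph):

```python
def transe_score(h, r, t):
    """TransE plausibility score for a (head, relation, tail) triple:
    the negative L2 distance between h + r and t. Scores closer to 0
    indicate a more plausible edge; link prediction ranks candidate
    tails by this score."""
    dist_sq = sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t))
    return -dist_sq ** 0.5
```

RotatE, DistMult and ComplEx swap the translation for rotations or (complex) bilinear products, and the graph convolutional models (R-GCN, CompGCN) instead aggregate neighborhood information, but all produce a ranking of candidate links in the same way.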
Logistic burdens of cancer care: A qualitative study
Cancer treatment often creates logistic conflicts with everyday life priorities; however, these challenges and how they are subjectively experienced have been largely unaddressed in cancer care. Our goal was to describe the time and logistic requirements of cancer care and whether and how they interfered with daily life and well-being. We conducted interviews with 20 adults receiving cancer-directed treatment at a single academic cancer center. We focused on participants’ perception of the time, effort, and energy-intensiveness of cancer care activities, organization of care requirements, and preferences in how to manage the logistic burdens of their cancer care. Participant interview transcripts were analyzed using an inductive thematic analysis approach. Burdens related to travel, appointment schedules, healthcare system navigation, and consequences for relationships had roots both at the system level (e.g. labs that were chronically delayed, protocol-centered rather than patient-centered bureaucratic requirements) and in individual circumstances (e.g. greater stressors among those who were working and/or had young children versus those who were retired); together these determined subjective burdensomeness, which was highest among patients who experienced multiple sources of burden simultaneously. Our study illustrates how objective burdens of cancer care translate into subjective burden depending on patient circumstances, emphasizing that an exclusive focus on objective measures does not capture the complexity of these issues. The complex interplay between healthcare system factors and individual circumstances points to clinical opportunities, for example helping patients to find ways to meet work and childcare requirements while receiving care.
Identification of dietary supplement use from electronic health records using transformer-based language models
Background Alzheimer’s disease (AD) and related dementias (ADRD) are common in older adults, and their prevention and management are challenging problems. Dietary supplements (DS) have emerged as a promising option for preventing or delaying ADRD; however, the role of DS usage in the disease progression of patients with cognitive impairments remains unclear. Little clinical trial evidence is available, but substantial information is contained in electronic health records (EHR), including structured and unstructured data about patients’ DS usage and disease status. The objectives of this study were to (1) develop accurate natural language processing (NLP) methods to extract DS usage for patients with Mild Cognitive Impairment (MCI) and ADRD, (2) examine the coverage of DS in structured data versus unstructured data and (3) compare DS usage information in EHR with National Health and Nutrition Examination Survey (NHANES) data. Methods We collected EHR data for patients with MCI and ADRD. A pipeline to extract the usage information of DS from both structured data and unstructured clinical notes was developed in the study. For structured data, we used the medication table to identify the DS; for unstructured clinical notes, we applied a Bidirectional Encoder Representations from Transformers (BERT) fine-tuning strategy to extract the DS usage status. Results The best named entity recognition model for DS achieved an F1-score of 0.964 and the PubMed BERT-based use status classifier had a weighted F1-score of 0.879. We applied these models to extract DS usage information from unstructured clinical notes and subsequently compared and combined this information with that from structured medication orders. In total, 125 unique DS were identified for patients with MCI and 108 unique DS were identified for patients with ADRD.
Conclusions In this study, we developed an NLP-based pipeline to extract the DS use information from medication structured data and clinical notes in EHR for patients with MCI and ADRD. Our method could further help understand the DS usage of patients with MCI and ADRD, and how these DS could influence the diseases.
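The classifier performance above is reported as a weighted F1-score: per-class F1 values averaged with weights equal to each class's share of the true labels, so frequent classes count more. A self-contained sketch of that metric (mirrors the usual "weighted average" convention, e.g. scikit-learn's `average='weighted'`; not the paper's code):

```python
from collections import Counter

def weighted_f1(y_true, y_pred):
    """Weighted F1-score: compute precision, recall and F1 per class,
    then average the per-class F1 values weighted by class support
    (the class's frequency among the true labels)."""
    support = Counter(y_true)
    total = 0.0
    for cls, n in support.items():
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p == cls)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != cls and p == cls)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == cls and p != cls)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        total += (n / len(y_true)) * f1
    return total
```

On a use-status task with imbalanced labels (e.g. "taking" far more common than "discontinued"), the weighting keeps the score from being dominated by rare classes.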
Early results of a natural experiment evaluating the effects of a local minimum wage policy on the diet-related health of low-wage workers, 2018–2020
The current study presents results of a midpoint analysis of an ongoing natural experiment evaluating the diet-related effects of the Minneapolis Minimum Wage Ordinance, which incrementally increases the minimum wage to $15/h. We conducted a difference-in-difference (DiD) analysis of measures collected among low-wage workers in two U.S. cities: Minneapolis, Minnesota, which enacted the wage increase policy, and Raleigh, North Carolina, the comparison city. Measures included employment-related variables (hourly wage, hours worked and non-employment assessed by survey questions with wages verified by paystubs), BMI measured by study scales and stadiometers and diet-related mediators (food insecurity, Supplemental Nutrition Assistance Program (SNAP) participation and daily servings of fruits and vegetables, whole-grain rich foods and foods high in added sugars measured by survey questions). Participants were a cohort of 580 low-wage workers (268 in Minneapolis and 312 in Raleigh) who completed three annual study visits between 2018 and 2020. In DiD models adjusted for time-varying and non-time-varying confounders, there were no statistically significant differences in variables of interest in Minneapolis compared with Raleigh. Trends across both cities were evident, showing a steady increase in hourly wage, stable BMI, an overall decrease in food insecurity and non-linear trends in employment, hours worked, SNAP participation and dietary outcomes. There was no evidence of a beneficial or adverse effect of the Minimum Wage Ordinance on health-related variables during a period of economic and social change. The COVID-19 pandemic and other contextual factors likely contributed to the observed trends in both cities.
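The DiD design used here compares the change over time in the policy city with the change over time in the comparison city, so that shared trends (such as pandemic-era shifts affecting both cities) cancel out. In its simplest unadjusted form the estimate is just a difference of mean changes (a sketch; the study's actual models additionally adjust for confounders):

```python
def did_estimate(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Unadjusted difference-in-difference estimate from four groups of
    outcome measurements: (change in the treated city) minus
    (change in the comparison city)."""
    def mean(xs):
        return sum(xs) / len(xs)
    return (mean(treat_post) - mean(treat_pre)) - (mean(ctrl_post) - mean(ctrl_pre))
```

A positive estimate would attribute the extra change in the treated city, beyond the shared trend, to the policy, under the usual parallel-trends assumption.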
Influence of Patient Characteristics and Psychological Needs on Diabetes Mobile App Usability in Adults With Type 1 or Type 2 Diabetes: Crossover Randomized Trial
More than 1100 diabetes mobile apps are available, but app usage by patients is low. App usability may be influenced by patient factors such as age, sex, and psychological needs. Guided by Self-Determination Theory, the purposes of this study were to (1) assess the effect of patient characteristics on app usability, and (2) determine whether patient characteristics and psychological needs (competence, autonomy, and connectivity)-important for motivation in diabetes care-are associated with app usability. Using a crossover randomized design, 92 adults with type 1 or 2 diabetes tested two Android apps (mySugr and OnTrack) for seven tasks including data entry, blood glucose (BG) reporting, and data sharing. We used multivariable linear regression models to examine associations between patient characteristics, psychological needs, user satisfaction, and user performance (task time, success, and accuracy). Participants had a mean age of 54 (range 19-74) years, and were predominantly white (62%, 57/92), female (59%, 54/92), with type 2 diabetes (70%, 64/92), and had education beyond high school (67%, 61/92). Participants gave an overall user satisfaction score of 62 (SD 18), which is considered marginally acceptable. The mean satisfaction score for each app was 55 (SD 18) for mySugr and 68 (SD 15) for OnTrack. The mean task completion time for all seven tasks was 7 minutes, with a mean task success of 82% and an accuracy rate of 68%. Higher user satisfaction was observed for patients with less education (P=.04) and those reporting more competence (P=.02), autonomy (P=.006), or connectivity with a health care provider (P=.03). User performance was associated with age, sex, education, diabetes duration, and autonomy. Older patients required more time (95% CI 1.1-3.2) and had less successful task completion (95% CI 3.5-14.3%). Men needed more time (P=.01) and more technical support (P=.04) than women.
High school education or less was associated with lower task success (P=.003). Diabetes duration of ≥10 years was associated with lower task accuracy (P=.02). Patients who desired greater autonomy and were interested in learning their patterns of BG and carbohydrates had greater task success (P=.049). Diabetes app usability was associated with psychological needs that are important for motivation. To enhance patient motivation to use diabetes apps for self-management, clinicians should address competence, autonomy, and connectivity by teaching BG pattern recognition and lifestyle planning, customizing BG targets, and reviewing home-monitored data via email. App usability could be improved for older male users and those with less education and greater diabetes duration by tailoring app training and providing ongoing technical support.
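The effect estimates above are quoted with 95% confidence intervals. Under the usual normal approximation for regression coefficients, such an interval is the point estimate plus or minus 1.96 standard errors; a small sketch of that convention (function name is mine, and this is illustrative of how such intervals are typically formed, not the study's code):

```python
def ci95(estimate, std_error):
    """Normal-approximation 95% confidence interval for a point
    estimate: estimate +/- 1.96 * SE. Returns (lower, upper)."""
    half_width = 1.96 * std_error
    return (estimate - half_width, estimate + half_width)
```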
Evaluating the impact of a full-service mobile food market on food security, diet quality and food purchases: a cluster randomised trial protocol and design paper
Introduction Mobile food markets may help to mitigate diet-related and weight-related inequities by bringing low-cost, nutritious food directly to underserved populations. By stocking foods to meet a range of dietary needs, full-service mobile markets may improve multiple aspects of diet, food security and fruit and vegetable procurement with a convenient one-stop shop. Methods and analysis This cluster randomised trial is evaluating the impact of a full-service mobile market, the Twin Cities Mobile Market (TCMM). The TCMM sells staple foods at affordable prices from a retrofitted bus that regularly visits communities experiencing low incomes. The trial’s primary outcome is participant diet quality. Secondary outcomes include intake of specific foods and nutrients, food security and servings of fruits and vegetables procured for the home. Together with our partners, we enrolled four subsidised community housing sites in three waves (12 total sites), aimed to recruit 22 participants per site (N=264) and collected baseline data. Sites were then randomised to either receive the full-service TCMM intervention or serve as a waitlist control, and the full-service TCMM began implementing at intervention sites. Follow-up data collection is occurring at 6 months post-implementation. After follow-up data collection for each wave, the full-service TCMM intervention is being implemented at the waitlist control sites.
Waves 1 and 2 are complete and Wave 3 is in progress. At baseline and follow-up data collection, dietary quality and intake are being assessed through three interviewer-administered 24-hour dietary recalls, food insecurity is being assessed by the 18-item Food Security Screening Module and fruit and vegetable procurement is being measured by collecting one month of food procurement tracking forms. We will use intent-to-treat analyses to determine if participant diet quality, food security and procurement of fruits and vegetables improve in the sites that received the full-service TCMM intervention relative to the participants in the waitlist control condition. Ethics and dissemination Trial procedures have been approved by the University of Minnesota Institutional Review Board. We plan to disseminate main outcomes in Grant Year 5 in both scientific and community spaces. Trial registration number ClinicalTrials.gov: NCT05672186.
Adjudicated Morbidity and Mortality Outcomes by Age among Individuals with HIV Infection on Suppressive Antiretroviral Therapy
Non-AIDS conditions such as cardiovascular disease and non-AIDS defining cancers dominate causes of morbidity and mortality among persons with HIV on suppressive combination antiretroviral therapy. Accurate estimates of disease incidence and of risk factors for these conditions are important in planning preventative efforts. With use of medical records, serious non-AIDS events, AIDS events, and causes of death were adjudicated using pre-specified criteria by an Endpoint Review Committee in two large international trials. Rates of serious non-AIDS events, which include cardiovascular disease, end-stage renal disease, decompensated liver disease, and non-AIDS cancer, and of other serious (grade 4) adverse events were determined, overall and by age, over a median follow-up of 4.3 years for 3,570 participants with CD4+ cell count ≥300 cells/mm³ who were taking antiretroviral therapy and had an HIV RNA level ≤500 copies/mL. Cox models were used to examine the effect of age and other baseline factors on risk of a composite outcome of all-cause mortality, AIDS, or serious non-AIDS events. Five-year Kaplan-Meier estimates of the composite outcome were 8.3% overall, and 3.6% (<40 years), 8.7% (40-49 years) and 16.1% (≥50 years) by age (p<0.001). In addition to age, smoking and higher levels of interleukin-6 and D-dimer were significant predictors of the composite outcome. The composite outcome was dominated by serious non-AIDS events (overall 65% of 277 participants with a composite event). Most serious non-AIDS events were due to cardiovascular disease and non-AIDS cancers. To date, few large studies have carefully collected data on serious non-AIDS outcomes. Thus, reliable estimates of event rates are scarce. Data cited here, from a geographically diverse cohort, will be useful for planning studies of interventions aimed at reducing rates of serious non-AIDS events among people with HIV.
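The five-year estimates above come from the Kaplan-Meier estimator, which handles censored follow-up by multiplying, at each event time, the fraction of at-risk participants who remain event-free. A compact pure-Python sketch (for illustration only; survival packages additionally provide confidence bands and tie handling at scale):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival curve. `events[i]` is 1 if subject i had
    the event at times[i], 0 if censored then. At each distinct event
    time t, survival is multiplied by (n_at_risk - deaths) / n_at_risk.
    Returns a list of (event_time, survival_estimate) steps."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv = 1.0
    steps = []
    i = 0
    while i < len(data):
        t = data[i][0]
        d = sum(e for tt, e in data if tt == t)       # events at time t
        n_t = sum(1 for tt, _ in data if tt == t)     # subjects leaving at t
        if d:
            surv *= (n_at_risk - d) / n_at_risk
            steps.append((t, surv))
        n_at_risk -= n_t   # censored subjects also leave the risk set
        i += n_t
    return steps
```

Censored subjects (the `0` entries) drop out of the risk set without forcing the curve down, which is why the estimator remains valid when follow-up lengths differ across participants.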