1,789 results for "Grimm, J"
Immunotherapy for ovarian cancer: towards a tailored immunophenotype-based approach
Despite documented evidence that ovarian cancer cells express immune-checkpoint molecules, such as PD-1 and PD-L1, and of a positive correlation between the presence of tumour-infiltrating lymphocytes and favourable overall survival outcomes in patients with this tumour type, the results of trials testing immune-checkpoint inhibitors (ICIs) in these patients have thus far been disappointing. The lack of response to ICIs can be attributed to tumour heterogeneity as well as inherent or acquired resistance associated with the tumour microenvironment (TME). Understanding tumour immunobiology, discovering biomarkers for patient selection and establishing optimal treatment combinations remain the hope, but also a key challenge, for the future application of immunotherapy in ovarian cancer. In this Review, we summarize results from trials testing ICIs in patients with ovarian cancer. We propose the implementation of a systematic CD8+ T cell-based immunophenotypic classification of this malignancy, followed by discussions of the preclinical data providing the basis to treat such immunophenotypes with combination immunotherapies. We posit that the integration of an accurate TME immunophenotype characterization with genetic data can enable the design of tailored therapeutic approaches and improve patient recruitment in clinical trials. Lastly, we propose a roadmap incorporating tissue-based profiling to guide future trials testing adoptive cell therapy approaches and assess novel immunotherapy combinations while promoting collaborative research. Although ovarian cancer is considered an immunoreactive disease, the trials testing immune-checkpoint inhibitors in these patients performed thus far have shown moderate to no efficacy.
The authors of this Review summarize these results and propose a systematic classification of ovarian cancer based on CD8+ T cell immunophenotypes that, integrated with genetic data, can enable the design of tailored therapeutic approaches, including adoptive cell therapy and novel immunotherapy combinations.
Mitochondrial DNA Released by Trauma Induces Neutrophil Extracellular Traps
Neutrophil extracellular traps (NETs) are critical for anti-bacterial activity of the innate immune system. We have previously shown that mitochondrial damage-associated molecular patterns (mtDAMPs), including mitochondrial DNA (mtDNA), are released into the circulation after injury. We therefore questioned whether mtDNA is involved in trauma-induced NET formation. Treatment of human polymorphonuclear neutrophils (PMN) with mtDNA induced robust NET formation, though in contrast to phorbol myristate acetate (PMA) stimulation, no NADPH-oxidase involvement was required. Moreover, formation of mtDNA-induced NETs was completely blocked by the TLR9 antagonist ODN-TTAGGG. Knowing that infective outcomes of trauma are more severe in elderly people than in young people, we measured plasma mtDNA and NET formation in elderly and young trauma patients and control subjects. MtDNA levels were significantly higher in the plasma of elderly trauma patients than young patients, despite lower injury severity scores in the elderly group. NETs were not visible in circulating PMN isolated from either young or old control subjects. NETs were, however, detected in PMN isolated from young trauma patients and, to a lesser extent, from elderly patients. Stimulation by PMA induced widespread NET formation in PMN from both young volunteers and young trauma patients. The NET response to PMA was much less pronounced in PMN from both elderly volunteers and elderly trauma patients. We conclude that mtDNA is a potent inducer of NETs that activates PMN via TLR9 without NADPH-oxidase involvement. We suggest that decreased NET formation in the elderly, despite the higher mtDNA levels in their plasma, may result from decreased levels of TLR9 and/or of other molecules involved in NET generation, such as neutrophil elastase and myeloperoxidase.
Further study of the links between circulating mtDNA and NET formation may elucidate the mechanisms of trauma-related organ failure as well as the greater susceptibility to secondary infection in elderly trauma patients.
Radiology for Ductal Carcinoma In Situ of the Breast: Updates on Invasive Cancer Progression and Active Monitoring
Ductal carcinoma in situ (DCIS) accounts for approximately 30% of new breast cancer diagnoses. However, our understanding of how normal breast tissue evolves into DCIS and invasive cancers remains insufficient. Further, conclusions regarding the mechanisms of disease progression in terms of histopathology, genetics, and radiology are often conflicting and have implications for treatment planning. Moreover, the increase in DCIS diagnoses since the adoption of organized breast cancer screening programs has raised concerns about overdiagnosis and subsequent overtreatment. Active monitoring, a nonsurgical management strategy for DCIS, avoids surgery in favor of close imaging follow-up to de-escalate therapy and provides more treatment options. However, the two major challenges in active monitoring are identifying occult invasive cancer and identifying patients at risk of invasive cancer progression. Accordingly, four prospective active monitoring trials are ongoing to determine the feasibility of active monitoring and to refine the patient eligibility criteria and follow-up intervals. Radiologists play a major role in determining eligibility for active monitoring and in reviewing surveillance images for disease progression. Trial results published over the next few years should support a new era of multidisciplinary DCIS care.
Nonlinear Growth Curves in Developmental Research
Developmentalists are often interested in understanding change processes, and growth models are the most common analytic tool for examining such processes. Nonlinear growth curves are especially valuable to developmentalists because the defining characteristics of the growth process, such as initial levels, rates of change during growth spurts, and asymptotic levels, can be estimated. A variety of growth models are described, beginning with the linear growth model and moving to nonlinear models of varying complexity. A detailed discussion of nonlinear models is provided, highlighting the added insights into complex developmental processes associated with their use. A collection of growth models is fit to repeated measures of height from participants of the Berkeley Growth and Guidance Studies from early childhood through adulthood.
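The appeal of nonlinear growth models described above, namely that their parameters map directly onto interpretable quantities such as asymptotic level and timing of the growth spurt, can be sketched with a small fit. Everything here (the logistic form, parameter values, noise level) is invented for illustration and is not the Berkeley Growth Study data or the models used in the article.

```python
# Sketch: fitting a linear vs. a logistic (nonlinear) growth model to
# synthetic height-by-age data. All numbers are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)
age = np.linspace(2, 20, 30)                 # ages in years (assumed)

def logistic(t, asym, rate, midpoint):
    # asym: asymptotic height; rate: growth rate; midpoint: growth-spurt age
    return asym / (1.0 + np.exp(-rate * (t - midpoint)))

height = logistic(age, 175.0, 0.35, 11.0) + rng.normal(0, 2.0, size=age.size)

# Linear growth model for comparison
lin_pred = np.polyval(np.polyfit(age, height, 1), age)

# Nonlinear fit recovers the interpretable parameters directly
params, _ = curve_fit(logistic, age, height, p0=[170.0, 0.3, 10.0])
nonlin_pred = logistic(age, *params)

sse_lin = float(np.sum((height - lin_pred) ** 2))
sse_nonlin = float(np.sum((height - nonlin_pred) ** 2))
print(f"asymptotic height ~ {params[0]:.1f}, growth-spurt age ~ {params[2]:.1f}")
print(f"SSE linear {sse_lin:.1f} vs nonlinear {sse_nonlin:.1f}")
```

The point of the comparison is the one the abstract makes: the linear model misses the S-shaped trajectory, while the nonlinear parameters describe the developmental process itself.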
Investigating the effect of anatomical variations in the response of the neonatal brachial plexus to applied force: Use of a two-dimensional finite element model
The brachial plexus is a set of nerves that innervate the upper extremity and may be injured during the birthing process, an injury known as Neonatal Brachial Plexus Palsy. Studying the mechanisms of these injuries on infant cadavers is challenging due to the justifiable sensitivity surrounding such testing, so these specimens are generally unavailable for investigating variations in brachial plexus injury mechanisms. Finite element models are an alternative way to investigate the response of the neonatal brachial plexus to loading: they allow a virtual representation of the neonatal brachial plexus to be developed and analyzed, with dimensions and mechanical properties determined from experimental studies. Using ABAQUS software, a two-dimensional brachial plexus model was created to analyze how stresses and strains develop within the brachial plexus. The main objectives of this study were (1) to develop a model of the brachial plexus and validate it against previous literature, and (2) to analyze the effect of stress on the nerve roots based on variations in the angles between the nerve roots and the spinal cord. The predicted stress for C5 and C6 was calculated as 0.246 MPa and 0.250 MPa, respectively. The C5 and C6 nerve roots experience the highest stress and the largest displacement in comparison to the lower nerve roots, which correlates with clinical patterns of injury. Even small (±3 and ±6 degrees) variations in nerve root angle significantly impacted the stress at the proximal nerve root. This model is the first step towards developing a complete three-dimensional model of the neonatal brachial plexus, providing the opportunity to more accurately assess the effect of the birth process on the stretch within the brachial plexus and the impact of biological variations in structure and properties on the risk of Neonatal Brachial Plexus Palsy.
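The finite-element workflow the abstract relies on (assemble a stiffness matrix, apply boundary conditions, solve for displacements, recover stresses) can be shown in miniature. This is a generic 1D two-element bar under axial load, not the ABAQUS nerve-root model; all material values and loads are invented for illustration.

```python
# Minimal finite-element sketch: a 1D bar of two elements, fixed at one
# end and loaded axially at the other. Illustrates the generic FEM steps;
# all numbers are assumed and unrelated to the study's 2D model.
import numpy as np

E = 1.0e6      # Young's modulus (Pa), assumed
A = 1.0e-4     # cross-sectional area (m^2), assumed
L = 0.05       # element length (m), assumed
F = 10.0       # axial force at the free end (N), assumed

k = E * A / L                      # axial stiffness of one element
# Global stiffness for 3 nodes / 2 elements:
K = np.array([[ k,   -k,    0.0],
              [-k,  2 * k, -k  ],
              [ 0.0, -k,    k  ]])
f = np.array([0.0, 0.0, F])        # load vector

# Fixed boundary condition at node 0: solve the reduced system
u = np.zeros(3)
u[1:] = np.linalg.solve(K[1:, 1:], f[1:])

# Element stress: sigma = E * strain = E * (u_right - u_left) / L
stress = E * np.diff(u) / L
print("nodal displacements (m):", u)
print("element stresses (Pa):", stress)   # both elements carry F/A = 1e5 Pa
```

In a uniform bar both elements carry the same stress F/A; the value of a fuller 2D model like the one in the study is precisely that geometry (for example, nerve-root angles) makes the stress distribution non-uniform.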
Stress and sleep across the onset of the novel coronavirus disease 2019 pandemic: impact of distance learning on US college students’ health trajectories
Abstract Study Objectives This study examined associations between average and intraindividual trajectories of stress, sleep duration, and sleep quality in college students before, during, and after transitioning to online learning due to the COVID-19 pandemic. Methods One hundred and sixty-four first-year college students answered twice-weekly questionnaires assessing stress exposure and perception, sleep duration, and sleep quality from January to May 2020 (N = 4269 unique observations). Results Multilevel growth modeling revealed that prior to distance learning, student stress was increasing and sleep duration and quality were decreasing. After transitioning online, students’ stress exposure and perception trajectories immediately and continuously decreased; sleep quality initially increased but decreased over time; and sleep duration increased but then plateaued for the remainder of the semester. Days with higher stress exposure than typical for that student were associated with lower sleep quality, and both higher stress exposure and perception at the transition were linked with simultaneous lower sleep quality. Specific groups (e.g., females) were identified as at risk for stress and sleep problems. Conclusions Although transitioning to remote learning initially alleviated college students’ stress and improved sleep, these effects plateaued, and greater exposure to academic, financial, and interpersonal stressors predicted worse sleep quality on both daily and average levels. Environmental stressors may particularly dictate sleep quality during times of transition, but adaptations in learning modalities may help mitigate short-term detrimental health outcomes during global emergencies, even during a developmental period with considerable stress vulnerability. Future studies should examine longer-term implications of these trajectories on mental and physical health.
HIV Reactivation from Latency after Treatment Interruption Occurs on Average Every 5-8 Days—Implications for HIV Remission
HIV infection can be effectively controlled by anti-retroviral therapy (ART) in most patients. However, therapy must be continued for life, because interruption of ART leads to rapid recrudescence of infection from long-lived latently infected cells. A number of approaches are currently being developed to 'purge' the reservoir of latently infected cells in order to either eliminate infection completely, or significantly delay the time to viral recrudescence after therapy interruption. A fundamental question in HIV research is how frequently the virus reactivates from latency, and thus how much the reservoir might need to be reduced to produce a prolonged antiretroviral-free HIV remission. Here we provide the first direct estimates of the frequency of viral recrudescence after ART interruption, combining data from four independent cohorts of patients undergoing treatment interruption, comprising 100 patients in total. We estimate that viral replication is initiated on average once every ≈6 days (range 5.1-7.6 days). This rate is around 24 times lower than previously thought, and is very similar across the cohorts. In addition, we analyse data on the ratios of different 'reactivation founder' viruses in a separate cohort of patients undergoing ART interruption, and estimate the frequency of successful reactivation to be once every 3.6 days. This suggests that a reduction in reservoir size of around 50-70-fold would be required to increase the average time-to-recrudescence to about one year, and thus achieve at least a short period of antiretroviral-free HIV remission. Our analyses suggest that time-to-recrudescence studies will need to be large in order to detect modest changes in the reservoir, and that macaque models of SIV latency may have much higher frequencies of viral recrudescence after ART interruption than are seen in human HIV infection.
Understanding the mean frequency of recrudescence from latency is an important first step in approaches to prolong antiretroviral-free viral remission in HIV.
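The 50-70-fold figure quoted in the abstract follows from a simple scaling argument: if reactivation events occur as a Poisson process, the mean time to recrudescence scales inversely with reservoir size, so the required fold reduction is just the target remission time divided by the current mean interval. A back-of-the-envelope check, under that assumption:

```python
# Back-of-the-envelope check of the reservoir-reduction estimate,
# assuming (as the abstract's scaling implies) that reactivation is a
# Poisson process, so mean time-to-recrudescence scales inversely with
# reservoir size.
mean_interval_days = 6.0           # average reactivation interval from the study
target_remission_days = 365.0      # desired ~1 year antiretroviral-free remission

fold_reduction = target_remission_days / mean_interval_days
print(f"required fold reduction ~ {fold_reduction:.0f}x")

# Range implied by the reported 5.1-7.6 day interval estimates:
low = target_remission_days / 7.6
high = target_remission_days / 5.1
print(f"implied range: {low:.0f}x to {high:.0f}x")
```

The central estimate comes out near 61-fold, with the 5.1-7.6 day range implying roughly 48- to 72-fold, consistent with the 50-70-fold figure in the abstract.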
Classification performance bias between training and test sets in a limited mammography dataset
To assess the performance bias caused by sampling data into training and test sets in a mammography radiomics study. Mammograms from 700 women were used to study upstaging of ductal carcinoma in situ. The dataset was repeatedly shuffled and split into training (n = 400) and test cases (n = 300) forty times. For each split, cross-validation was used for training, followed by an assessment of the test set. Logistic regression with regularization and support vector machine were used as the machine learning classifiers. For each split and classifier type, multiple models were created based on radiomics and/or clinical features. Area under the curve (AUC) performances varied considerably across the different data splits (e.g., radiomics regression model: train 0.58-0.70, test 0.59-0.73). Performances for regression models showed a tradeoff where better training led to worse testing and vice versa. Cross-validation over all cases reduced this variability, but required samples of 500+ cases to yield representative estimates of performance. In medical imaging, clinical datasets are often limited to relatively small size. Models built from different training sets may not be representative of the whole dataset. Depending on the selected data split and model, performance bias could lead to inappropriate conclusions that might influence the clinical significance of the findings. Performance bias can result from model testing when using limited datasets. Optimal strategies for test set selection should be developed to ensure study conclusions are appropriate.
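The resampling experiment described above can be sketched with synthetic data. Only the overall size (700 cases), the 400/300 train/test split, and the forty shuffles mirror the study; the features, signal strength, and classifier settings below are invented stand-ins, not the mammography radiomics data.

```python
# Sketch of the split-variability experiment: repeatedly shuffle a limited
# dataset into train/test sets and watch the test AUC vary across splits.
# Synthetic data stands in for the mammography features.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

# Weak-signal synthetic dataset (assumed parameters, for illustration only)
X, y = make_classification(n_samples=700, n_features=20, n_informative=4,
                           flip_y=0.3, random_state=0)

test_aucs = []
for seed in range(40):                       # forty shuffles, as in the study
    X_tr, X_te, y_tr, y_te = train_test_split(
        X, y, train_size=400, test_size=300, random_state=seed, stratify=y)
    clf = LogisticRegression(C=1.0, max_iter=1000).fit(X_tr, y_tr)
    test_aucs.append(roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1]))

print(f"test AUC: min {min(test_aucs):.3f}, max {max(test_aucs):.3f}, "
      f"spread {max(test_aucs) - min(test_aucs):.3f}")
```

Even with an identical modeling pipeline, the test AUC moves noticeably from split to split, which is the performance bias the abstract warns about when drawing conclusions from a single split of a limited dataset.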
Stress and diurnal cortisol among Latino/a college students: A multi-risk model approach
The transition to college is a time of increased opportunity and stress spanning multiple domains. Adolescents who encounter significant stress during this transition may be vulnerable to adverse outcomes due to a “wear and tear” of the hypothalamic pituitary adrenal (HPA) axis. Latino/a students may be particularly at-risk for heightened stress exposure due to experiences of both minority-specific and general life stress. Despite this, little is known regarding the cumulative impact of multiple stressors on Latino/a students’ HPA axis functioning. The present study employed a “multi-risk model” approach to examine additive, common, and cumulative effects of multiple stress forms (general, academic, social, financial, bicultural, ethnic/racial discrimination) on diurnal cortisol in a sample of first-year Latino/a college students ( N = 196; 64.4% female; M age = 18.95). Results indicated that no stress forms were additively associated with the cortisol awakening response (CAR), but general stress was associated with a flatter diurnal cortisol slope (DCS) and bicultural stress was linked with a steeper DCS. A college stress latent factor was associated with a lower CAR, whereas a latent factor of discrimination was not associated with diurnal cortisol. Cumulative risk was linked with a lower CAR. Findings highlight the physiological correlates of various stressors experienced by Latino/a college students.
Benefits and Risks of Mammography Screening in Women Ages 40 to 49 Years
Breast cancer screening in the United States is complicated by conflicting recommendations from professional and governmental organizations. The benefits and risks of breast cancer screening differ by age, however, which should influence shared decision-making discussions. Compared to older women, women ages 40 to 49 years have a lower risk of breast cancer, but the types of breast cancer that develop are often more aggressive with a poorer prognosis. Furthermore, younger women have a longer life expectancy and fewer comorbidities. The primary benefits of screening for women in their 40s are a reduction in breast cancer mortality, years of life lost to breast cancer, and morbidity of breast cancer treatment by detecting cancers at an earlier stage. Compared to older women, the risks of breast cancer screening in women ages 40 to 49 years include more false positive recalls and biopsies as well as transient anxiety. Concerns regarding radiation-induced malignancy and overdiagnosis are minimal in this age group. The shorter lead time of breast cancer in women ages 40 to 49 years also favors shorter screening intervals. This information should help inform providers in their shared decision-making discussions with patients.