1,576 result(s) for "Radiation Injuries - diagnosis"
Presymptomatic microRNA-based biomarker signatures for the prognosis of localized radiation injury in mice
The threat of nuclear or radiological events requires early diagnostic tools for radiation-induced health effects. Localized radiation injuries (LRI) are severe outcomes of such events, characterized by a latent presymptomatic phase followed by symptom onset ranging from erythema and edema to ulceration and tissue necrosis. Early diagnosis is crucial for effective triage and adapted treatment, potentially through minimally invasive biomarkers including circulating microRNAs (miRNAs), which have been correlated with tissue injuries and radiation exposure, suggesting their potential in diagnosing LRI. In this study, we sought to identify early miRNA signatures for LRI severity prognosis before clinical symptoms appear. Using a mouse model of hindlimb irradiation at 0, 20, 40, or 80 Gy, previously shown to lead to localized injuries of different severities, we performed broad-spectrum plasma miRNA profiling at two latency stages (day 1 and 7 post-irradiation). The identified candidate miRNAs were then challenged in two independent mouse cohorts to refine the miRNA signatures. Through sparse partial least squares discriminant analysis (sPLS-DA), signatures of 14 and 16 plasma miRNAs segregated animals according to dose group at day 1 and day 7, respectively. Interestingly, these signatures shared 9 miRNAs, including miR-19a-3p, miR-93-5p, and miR-140-3p, previously associated with inflammation, radiation response, and tissue damage. In addition, Bayesian latent variable modeling confirmed significant correlations between these prognostic miRNA signatures and day 14 clinical and functional outcomes from unrelated mice. This study identified plasma miRNA signatures that might be used throughout the latency phase for the prognosis of LRI severity. These results suggest that miRNA profiling could be a powerful tool for early LRI diagnosis, thereby improving patient management and treatment outcomes in radiological emergencies.
Radiation necrosis after a combination of external beam radiotherapy and iodine-125 brachytherapy in gliomas
Purpose The frequency and risk profile of radiation necrosis (RN) in patients with glioma undergoing either upfront stereotactic brachytherapy (SBT) and additional salvage external beam radiotherapy (EBRT) after tumor recurrence, or vice versa, remain unknown. Methods Patients with glioma treated with low-activity temporary iodine-125 SBT at the University of Munich between 1999 and 2016 who had either additional upfront or salvage EBRT were included. Biologically effective doses (BED) were calculated. RN was diagnosed using stereotactic biopsy and/or metabolic imaging. The rate of RN was estimated with the Kaplan-Meier method. Risk factors were obtained from logistic regression models. Results Eighty-six patients (49 male, 37 female, median age 47 years) were included. Thirty-eight patients had low-grade and 48 high-grade glioma. Median follow-up was 15 months after second treatment. Fifty-eight patients received upfront EBRT (median total dose: 60 Gy), and 28 upfront SBT (median reference dose: 54 Gy, median dose rate: 10.0 cGy/h). Median time interval between treatments was 19 months. RN was diagnosed in 8/75 patients. The 1- and 2-year risk of RN was 5.1% and 11.7%, respectively. Tumor volume and irradiation time of SBT, number of implanted seeds, and salvage EBRT were risk factors for RN. Neither the BED values nor the time interval between both treatments had prognostic influence. Conclusion The combination of upfront EBRT and salvage SBT or vice versa is feasible for glioma patients. The risk of RN is mainly determined by the treatment volume but not by the interval between therapies.
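The biologically effective doses mentioned in this abstract come from the linear-quadratic model; for fractionated EBRT the textbook form is BED = nd(1 + d/(α/β)). A minimal sketch of that standard formula follows — note that the paper's low-dose-rate SBT doses would additionally require a protracted-irradiation (dose-rate) correction, which is beyond this illustration, and the example values are generic, not taken from the study.

```python
def bed_fractionated(n_fractions, dose_per_fraction_gy, alpha_beta_gy):
    """Biologically effective dose (BED) for fractionated radiotherapy,
    per the linear-quadratic model: BED = n*d * (1 + d / (alpha/beta))."""
    total_dose = n_fractions * dose_per_fraction_gy
    return total_dose * (1 + dose_per_fraction_gy / alpha_beta_gy)

# A conventional 30 x 2 Gy course with alpha/beta = 10 Gy (typical tumor value)
# gives 60 * (1 + 2/10) = 72 Gy BED.
bed = bed_fractionated(30, 2.0, 10.0)
```

The α/β ratio encodes tissue radiosensitivity: late-responding normal tissues (α/β ≈ 2–3 Gy) accumulate proportionally more biological damage per fraction than tumors (α/β ≈ 10 Gy), which is why BED rather than physical dose is compared across different treatment schedules.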
Imaging Radiation-Induced Normal Tissue Injury
Technological developments in radiation therapy and other cancer therapies have led to a progressive increase in five-year survival rates over the last few decades. Although acute effects have been largely minimized by both technical advances and medical interventions, late effects remain a concern. Indeed, the need to identify those individuals who will develop radiation-induced late effects, and to develop interventions to prevent or ameliorate these late effects, is a critical area of radiobiology research. In the last two decades, preclinical studies have clearly established that late radiation injury can be prevented/ameliorated by pharmacological therapies aimed at modulating the cascade of events leading to the clinical expression of radiation-induced late effects. These insights have been accompanied by significant technological advances in imaging that are moving radiation oncology and normal tissue radiobiology from disciplines driven by anatomy and macrostructure to ones in which important quantitative functional, microstructural, and metabolic data can be noninvasively and serially determined. In the current article, we review the use of positron emission tomography (PET), single photon emission computed tomography (SPECT), magnetic resonance (MR) imaging and MR spectroscopy to generate pathophysiological and functional data in the central nervous system, lung, and heart that offer the promise of (1) identifying individuals who are at risk of developing radiation-induced late effects, and (2) monitoring the efficacy of interventions to prevent/ameliorate them.
The broadening scope of oral mucositis and oral ulcerative mucosal toxicities of anticancer therapies
Oral mucositis (OM) is a common, highly symptomatic complication of cancer therapy that affects patients' function, quality of life, and ability to tolerate treatment. In certain patients with cancer, OM is associated with increased mortality. Research on the management of OM is ongoing. Oral mucosal toxicities are also reported in targeted and immune checkpoint inhibitor therapies. The objective of this article is to present current knowledge about the epidemiology, pathogenesis, assessment, risk prediction, and current and developing intervention strategies for OM and other ulcerative mucosal toxicities caused by both conventional and evolving forms of cancer therapy.
Recommendations for cardiomyopathy surveillance for survivors of childhood cancer: a report from the International Late Effects of Childhood Cancer Guideline Harmonization Group
Survivors of childhood cancer treated with anthracycline chemotherapy or chest radiation are at an increased risk of developing congestive heart failure. In this population, congestive heart failure is well recognised as a progressive disorder, with a variable period of asymptomatic cardiomyopathy that precedes signs and symptoms. As a result, several clinical practice guidelines have been developed independently to help with detection and treatment of asymptomatic cardiomyopathy. These guidelines differ with regards to definitions of at-risk populations, surveillance modality and frequency, and recommendations for interventions. Differences between these guidelines could hinder the effective implementation of these recommendations. We report on the results of an international collaboration to harmonise existing cardiomyopathy surveillance recommendations using an evidence-based approach that relied on standardised definitions for outcomes of interest and transparent presentation of the quality of the evidence. The resultant recommendations were graded according to the quality of the evidence and the potential benefit gained from early detection and intervention.
Radiotherapy toxicity
Radiotherapy is used in >50% of patients with cancer, both for curative and palliative purposes. Radiotherapy uses ionizing radiation to target and kill tumour tissue, but normal tissue can also be damaged, leading to toxicity. Modern and precise radiotherapy techniques, such as intensity-modulated radiotherapy, may prevent toxicity, but some patients still experience adverse effects. The physiopathology of toxicity depends on many parameters, such as the location of irradiation or the functional status of organs at risk. Knowledge of the mechanisms leads to a more rational approach to controlling radiotherapy toxicity, which may result in improved symptom control and quality of life for patients. This improved quality of life is particularly important in paediatric patients, who may live for many years with the long-term effects of radiotherapy. Notably, signs and symptoms occurring after radiotherapy may not be due to the treatment but to an exacerbation of existing conditions or to the development of new diseases. Although differential diagnosis may be difficult, it has important consequences for patients. This Primer summarizes the mechanisms by which normal tissues are affected by irradiation, the techniques to mitigate such damage and how to treat the symptoms of radiotherapy toxicity.
Differentiating Radiation-Induced Necrosis from Recurrent Brain Tumor Using MR Perfusion and Spectroscopy: A Meta-Analysis
This meta-analysis examined the roles of several metabolites in differentiating recurrent tumor from necrosis in patients with brain tumors using MR perfusion and spectroscopy. Medline, Cochrane, EMBASE, and Google Scholar were searched for studies using perfusion MRI and/or MR spectroscopy published up to March 4, 2015 which differentiated between recurrent tumor vs. necrosis in patients with primary brain tumors or brain metastasis. Only two-armed, prospective or retrospective studies were included. A meta-analysis was performed on the difference in relative cerebral blood volume (rCBV), ratios of choline/creatine (Cho/Cr) and/or choline/N-acetyl aspartate (Cho/NAA) between participants undergoing MRI evaluation. A χ2-based test of homogeneity was performed using Cochran's Q statistic and I2. Of 397 patients in 13 studies who were analyzed, the majority had tumor recurrence. As there was evidence of heterogeneity among 10 of the studies which used rCBV for evaluation (Q statistic = 31.634, I2 = 97.11%, P < 0.0001), a random-effects analysis was applied. The pooled difference in means (2.18, 95%CI = 0.85 to 3.50) indicated that the average rCBV in a contrast-enhancing lesion was significantly higher in tumor recurrence compared with radiation injury (P = 0.001). Based on a fixed-effect model of analysis encompassing the six studies which used Cho/Cr ratios for evaluation (Q statistic = 8.388, I2 = 40.39%, P = 0.137), the pooled difference in means (0.77, 95%CI = 0.57 to 0.98) of the average Cho/Cr ratio was significantly higher in tumor recurrence than in tumor necrosis (P = 0.001). There was a significant difference in ratios of Cho to NAA between recurrent tumor and necrosis (1.02, 95%CI = 0.03 to 2.00, P = 0.044). MR spectroscopy and MR perfusion using Cho/NAA and Cho/Cr ratios and rCBV may increase the accuracy of differentiating necrosis from recurrent tumor in patients with primary brain tumors or metastases.
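The random-effects pooling this abstract applies when Cochran's Q indicates heterogeneity is most commonly the DerSimonian-Laird estimator. A self-contained sketch of that estimator follows; the study's exact software and per-study inputs are not stated, so any numbers fed to this function are illustrative.

```python
import math

def dersimonian_laird(effects, variances):
    """Pool study-level effects with the DerSimonian-Laird random-effects model.

    effects: per-study effect sizes (e.g. differences in mean rCBV);
    variances: their within-study variances.
    Returns (pooled effect, 95% CI lower, 95% CI upper, tau-squared)."""
    w = [1.0 / v for v in variances]                              # fixed-effect weights
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed) ** 2 for wi, e in zip(w, effects))   # Cochran's Q
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                                 # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]                # random-effects weights
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se = math.sqrt(1.0 / sum(w_star))
    return pooled, pooled - 1.96 * se, pooled + 1.96 * se, tau2
```

When the studies are homogeneous, tau-squared collapses to zero and the result equals the fixed-effect (inverse-variance) estimate; heterogeneity inflates the weights' denominators and widens the confidence interval, which is why the pooled rCBV difference above carries a wide CI despite a significant P value.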
Radiation-induced intestinal damage: latest molecular and clinical developments
To systematically review the prophylactic and therapeutic interventions for reducing the incidence or severity of intestinal symptoms among cancer patients receiving radiotherapy. A literature search was conducted in the PubMed database using various search terms, including ‘radiation enteritis’, ‘radiation enteropathy’, ‘radiation-induced intestinal disease’, ‘radiation-induced intestinal damage’ and ‘radiation mucositis’. The search was limited to studies, clinical trials and meta-analyses published in English with no limitation on publication date. Other relevant literature was identified based on the reference lists of selected studies. The pathogenesis of acute and chronic radiation-induced intestinal damage as well as the prevention and treatment approaches were reviewed. There is inadequate evidence to strongly support the use of a particular strategy to reduce radiation-induced intestinal damage. More high-quality randomized controlled trials are required for interventions with limited evidence suggestive of potential benefits.
The Diagnosis and Treatment of Pseudoprogression, Radiation Necrosis and Brain Tumor Recurrence
Radiation therapy is an important modality used in the treatment of patients with brain metastatic disease and malignant gliomas. Post-treatment surveillance often involves serial magnetic resonance imaging. A challenge faced by clinicians is in the diagnosis and management of a suspicious gadolinium-enhancing lesion found on imaging. The suspicious lesion may represent post-treatment radiation effects (PTRE) such as pseudoprogression, radiation necrosis or tumor recurrence. Significant progress has been made in diagnostic imaging modalities to assist in differentiating these entities. Surgical and medical interventions have also been developed to treat PTRE. In this review, we discuss the pathophysiology, clinical presentation, diagnostic imaging modalities and provide an algorithm for the management of pseudoprogression, radiation necrosis and tumor recurrence.
Machine-learning based MRI radiomics models for early detection of radiation-induced brain injury in nasopharyngeal carcinoma
Background Early radiation-induced temporal lobe injury (RTLI) diagnosis in nasopharyngeal carcinoma (NPC) is clinically challenging, and prediction models of RTLI are lacking. Hence, we aimed to develop radiomic models for early detection of RTLI. Methods We retrospectively included a total of 242 NPC patients who underwent regular follow-up magnetic resonance imaging (MRI) examinations, including contrast-enhanced T1-weighted and T2-weighted imaging. For each MRI sequence, four non-texture and 10,320 texture features were extracted from the medial temporal lobe, gray matter, and white matter, respectively. The Relief and 0.632+ bootstrap algorithms were applied for initial and subsequent feature selection, respectively. The random forest method was used to construct the prediction model. Three models (models 1, 2 and 3) were developed to predict the results of the last three follow-up MRI scans at different times before RTLI onset. The area under the curve (AUC) was used to evaluate the performance of the models. Results Of the 242 patients, 171 (70.7%) were men, and the mean age of all the patients was 48.5 ± 10.4 years. The median follow-up and latency from radiotherapy until RTLI were 46 and 41 months, respectively. In the testing cohort, models 1, 2, and 3, with 20 texture features derived from the medial temporal lobe, yielded mean AUCs of 0.830 (95% CI: 0.823–0.837), 0.773 (95% CI: 0.763–0.782), and 0.716 (95% CI: 0.699–0.733), respectively. Conclusion The three developed radiomic models can dynamically predict RTLI in advance, enabling early detection and allowing clinicians to take preventive measures to stop or slow down the deterioration of RTLI.
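The modeling step this abstract describes — a random-forest classifier over a selected set of radiomic texture features, scored by AUC — can be sketched as below. The feature extraction and Relief/bootstrap selection are assumed to have already produced a feature matrix; synthetic data stands in for it here, and the cohort size and feature count are borrowed from the abstract purely for illustration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# 242 "patients" x 20 selected texture features; label depends on two of them.
X = rng.normal(size=(242, 20))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=242) > 0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)
auc = roc_auc_score(y_te, clf.predict_proba(X_te)[:, 1])  # held-out AUC
```

In a real radiomics pipeline the feature selection must be nested inside the train/test split (or cross-validation folds) to avoid optimistic AUC estimates; selecting features on the full cohort before splitting leaks information into the test set.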