125 result(s) for "Abend, M."
Impact of Inter-Individual Variance in the Expression of a Radiation-Responsive Gene Panel Used for Triage
In previous studies we determined a gene expression signature in baboons for predicting the severity of hematological acute radiation syndrome. We subsequently validated a set of eight of these genes in leukemia patients undergoing total-body irradiation. In the current study, we addressed the effect of inter-individual variability on the basal expression level of those eight radiation-responsive genes identified previously, by examining baseline levels in 200 unexposed healthy human donors (122 males and 88 females with an average age of 46 years) using real-time PCR. In addition to the eight candidate genes (DAGLA, WNT3, CD177, PLA2G16, WLS, POU2AF1, STAT4 and PRF1), we examined two more genes (FDXR and DDB2) widely used in ex vivo whole blood experiments. Although significant sex-dependent (seven genes) and age-dependent (two genes) differences in expression were found, the fold changes ranged only between 1.1 and 1.6. These were well within the twofold differences in gene expression generally considered to represent control values. Age and sex contributed less than 20–30% to the complete inter-individual variance, which is calculated as the fold change between the lowest (reference) and the highest Ct value, termed the minimum–maximum fold change (min–max FC). Min–max FCs between 10 and 17 were observed for most genes; however, for three genes, min–max FCs of the complete inter-individual variance were found to be 37.1 (WNT3), 51.4 (WLS) and 1,627.8 (CD177). In addition, to determine whether discrimination between healthy and diseased baboons might be altered by replacing the published gene expression data of the 18 healthy baboons with that of the 200 healthy humans, we employed logistic regression analysis and calculated the area under the receiver operating characteristic (ROC) curve. The additional inter-individual variance of the human data set had no or only marginal impact on the ROC area, since gene expression differences of up to 32-fold were observed between healthy and diseased baboons.
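A minimal sketch of the two calculations named above, the min–max fold change from qPCR Ct values (one Ct cycle corresponds to a factor of two in expression) and the ROC area from a logistic regression. All values below are invented for illustration, not the study's data:

```python
# Sketch: min-max fold change from Ct values and ROC area from logistic
# regression. All numbers are illustrative, not study data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Hypothetical baseline Ct values for one gene across 200 donors
# (a higher Ct means lower expression).
ct = rng.normal(loc=27.0, scale=1.8, size=200)
min_max_fc = 2.0 ** (ct.max() - ct.min())  # lowest vs. highest expressor
print(f"min-max FC: {min_max_fc:.1f}")

# Hypothetical healthy-vs-diseased comparison scored by the ROC area.
healthy_ct = rng.normal(27.0, 1.8, size=200)
diseased_ct = rng.normal(24.0, 1.8, size=18)  # ~8-fold higher expression
X = np.concatenate([healthy_ct, diseased_ct]).reshape(-1, 1)
y = np.concatenate([np.zeros(200), np.ones(18)])
auc = roc_auc_score(y, LogisticRegression().fit(X, y).predict_proba(X)[:, 1])
print(f"ROC area: {auc:.2f}")
```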
Examining Radiation-Induced In Vivo and In Vitro Gene Expression Changes of the Peripheral Blood in Different Laboratories for Biodosimetry Purposes: First RENEB Gene Expression Study
The risk of a large-scale event leading to acute radiation exposure necessitates the development of high-throughput methods for providing rapid individual dose estimates. Our work addresses three goals, which align with the directive of the European Union's Realizing the European Network of Biodosimetry project (EU-RENB): 1. To examine the suitability of different gene expression platforms for biodosimetry purposes; 2. To perform this examination using blood samples collected from prostate cancer patients (in vivo) and from healthy donors (in vitro); and 3. To compare radiation-induced gene expression changes of the in vivo with the in vitro blood samples. For the in vitro part of this study, EDTA-treated whole blood was irradiated immediately after venipuncture using single X-ray doses (1 Gy/min dose rate, 100 keV). Blood samples used to generate calibration curves as well as 10 coded (blinded) samples (0–4 Gy dose range) were incubated for 24 h in vitro, lysed and shipped on wet ice. For the in vivo part of the study, PAXgene tubes were used and peripheral blood (2.5 ml) was collected from prostate cancer patients before and 24 h after the first fractionated 2 Gy dose of localized radiotherapy to the pelvis [linear accelerator (LINAC), 580 MU/min, exposure 1–1.5 min]. Assays were run in each laboratory according to locally established protocols using either microarray platforms (two laboratories) or qRT-PCR (two laboratories). Report times on dose estimates were documented. The mean absolute difference of estimated doses relative to the true doses (Gy) was calculated. Doses were also merged into binary categories reflecting aspects of clinical/diagnostic relevance. For the in vitro part of the study, the earliest report time on dose estimates was 7 h for qRT-PCR and 35 h for microarrays. Methodological variance of gene expression measurements (CV ≤10% for technical replicates) and interindividual variance (≤twofold for all genes) were low. Dose estimates based on one gene, ferredoxin reductase (FDXR), using qRT-PCR were as precise as dose estimates based on multiple genes using microarrays, but the precision decreased at doses ≥2 Gy. Binary dose categories comprising, for example, unexposed compared with exposed samples could be completely discriminated with most of our methods. Exposed prostate cancer blood samples (n = 4) could be completely discriminated from unexposed blood samples (n = 4, P < 0.03, two-sided Fisher's exact test) without individual controls. This was achieved by introducing an in vitro-to-in vivo correction factor for FDXR, which varied among the laboratories. After that, the in vitro calibration curves could be used for dose estimation of the in vivo exposed prostate cancer blood samples within an accuracy window of ±0.5 Gy in both contributing qRT-PCR laboratories. In conclusion, early and precise dose estimates can be performed, in particular at doses ≤2 Gy in vitro. Blood samples of prostate cancer patients exposed to 0.017–0.09 Gy could be completely discriminated from pre-exposure blood samples, with the doses successfully estimated using adjusted in vitro calibration curves.
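How a dose estimate can be read off an in vitro calibration curve is sketched below. The curve shape, the numbers and the in vitro-to-in vivo correction factor are all placeholder assumptions, since the abstract does not report them:

```python
# Sketch: dose estimation from an in vitro calibration curve with a
# laboratory-specific in vitro-to-in vivo correction factor for FDXR.
import numpy as np

# Hypothetical calibration data: dose (Gy) vs. log2 fold change of FDXR.
doses = np.array([0.0, 0.25, 0.5, 1.0, 2.0, 4.0])
log2_fc = np.array([0.0, 0.8, 1.5, 2.3, 3.4, 4.2])

slope, intercept = np.polyfit(doses, log2_fc, 1)  # linear calibration fit

def estimate_dose(observed_log2_fc, in_vitro_to_in_vivo=1.0):
    """Invert the calibration curve; the correction factor rescales an
    in vivo measurement onto the in vitro curve (lab-specific, assumed)."""
    corrected = observed_log2_fc * in_vitro_to_in_vivo
    return (corrected - intercept) / slope

# A hypothetical in vivo sample taken 24 h after a 2 Gy pelvic fraction:
print(f"estimated dose: {estimate_dose(1.1, in_vitro_to_in_vivo=3.0):.2f} Gy")
```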
Examining potential confounding factors in gene expression analysis of human saliva and identifying potential housekeeping genes
Isolation of RNA from whole saliva, a non-invasive and easily accessible biofluid that is an attractive alternative to blood for high-throughput biodosimetry of radiological/nuclear victims, might be of clinical significance for the prediction and diagnosis of disease. In a previous analysis of 12 human samples we identified two challenges to measuring gene expression from total RNA: (1) the fraction of human RNA in whole saliva was low and (2) the bacterial contamination was overwhelming. To overcome these challenges, we performed selective cDNA synthesis for human RNA species only, by employing poly(A)+-tail primers followed by qRT-PCR. In the current study, this approach was independently validated on 91 samples from 61 healthy donors. Additionally, we used the ratio of human to bacterial RNA to adjust the input RNA to include equal amounts of human RNA across all samples before cDNA synthesis, which ensured comparable analysis using the same base human input material. Furthermore, we examined relative levels of ten known housekeeping genes, and assessed inter- and intra-individual differences in 61 salivary RNA isolates, while considering effects of demographic factors (e.g. sex, age), epidemiological factors comprising social habits (e.g. alcohol and cigarette consumption), oral hygiene (e.g. flossing, mouthwash), previous radiological diagnostic procedures (e.g. number of CT scans) and saliva collection time (circadian periodicity). Total human RNA amounts appeared significantly associated with age only (P ≤ 0.02). None of the chosen housekeeping genes showed significant circadian periodicity, and they were either not associated or only weakly associated with the 24 confounders examined, with one exception: 60% of genes were altered by mouthwash. ATP6, ACTB and B2M represented the genes with the highest mean baseline expression (Ct values ≤ 30) and were detected in all samples. Combining these housekeeping genes for normalization purposes did not decrease inter-individual variance, but it increased robustness. In summary, our work addresses critical confounders and provides important information for the successful examination of gene expression in human whole saliva.
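The two normalization steps described, equalizing the human RNA input across samples and normalizing a target against combined housekeeping genes, might look as follows; the target input amount and all Ct values are assumptions, not the study's figures:

```python
# Sketch of the two normalization steps; all numbers are invented.
import numpy as np

def total_rna_needed(human_fraction, target_human_ng=10.0):
    """Total RNA to load so that every sample contributes the same amount
    of human RNA to cDNA synthesis (target amount is an assumption)."""
    return target_human_ng / human_fraction

# A sample whose RNA is only 5% human needs 20x more total input.
print(f"input needed: {total_rna_needed(0.05):.0f} ng total RNA")

# Normalizing a target gene against combined housekeeping genes: averaging
# Ct values is equivalent to a geometric mean on the expression scale.
hk_ct = {"ATP6": 26.1, "ACTB": 27.4, "B2M": 28.0}  # invented Ct values
target_ct = 31.2
delta_ct = target_ct - np.mean(list(hk_ct.values()))
print(f"relative expression: {2.0 ** -delta_ct:.4f}")
```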
First Generation Gene Expression Signature for Early Prediction of Late Occurring Hematological Acute Radiation Syndrome in Baboons
We implemented a two-stage study to predict late occurring hematologic acute radiation syndrome (HARS) in a baboon model based on gene expression changes measured in peripheral blood within the first two days after irradiation. Eighteen baboons were irradiated to simulate different patterns of partial-body and total-body exposure, corresponding to an equivalent dose of 2.5 or 5 Gy. According to changes in blood cell counts, the surviving baboons (n = 17) exhibited mild (H1–2, n = 4) or more severe (H2–3, n = 13) HARS. Blood samples taken before irradiation served as unexposed controls (H0, n = 17). For stage I of this study, a whole genome screen (mRNA microarrays) was performed using a portion of the samples (H0, n = 5; H1–2, n = 4; H2–3, n = 5). For stage II, validation was performed on the remaining samples, using the more sensitive methodology qRT-PCR, for candidate genes that were differentially up- or down-regulated during the first two days after irradiation. Differential gene expression was defined as significant (P < 0.05) with greater than or equal to a twofold difference relative to the H0 classification. Of approximately 20,000 genes, on average 46% appeared to be expressed. On day 1 postirradiation, approximately 2–3 times more genes appeared up-regulated (1,418 vs. 550) or down-regulated (1,603 vs. 735) for H2–3 compared to H1–2. This pattern became more pronounced at day 2, while the number of differentially expressed genes decreased. The specific genes showed an enrichment of biological processes coding for immune system processes, natural killer cell activation and immune response (P = 1 × 10^-6 up to 9 × 10^-14). Based on the P values, magnitude and sustained differential gene expression over time, we selected 89 candidate genes for validation using qRT-PCR. Ultimately, 22 genes were confirmed for identification of H1–3 classifications and seven genes for identification of H2–3 classifications. For H1–3 classifications, most genes were constantly three- to fivefold down-regulated relative to H0 over both days, but some genes appeared 10.3-fold (VSIG4) or even 30.7-fold (CD177) up-regulated over H0. For H2–3, some genes appeared four- to sevenfold up-regulated relative to H0 (RNASE3, DAGLA, ARG2), but other genes showed a strong 14- to 33-fold down-regulation relative to H0 (WNT3, POU2AF1, CCR7). All of these genes allowed an almost complete separation among the HARS categories. In summary, clinically relevant HARS can be independently predicted with all 29 validated genes examined in the peripheral blood of irradiated baboons within the first two days postirradiation. While further studies are needed to confirm these findings, this model shows potential relevance for the prediction of clinical outcomes in exposed humans and as an aid in prioritizing medical treatment.
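The stage I selection criterion (P < 0.05 and at least a twofold difference relative to H0) can be illustrated on simulated log2 expression values; the gene counts and effect sizes below are invented:

```python
# Sketch: the two-part differential expression filter on simulated data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_genes = 1000
h0 = rng.normal(10.0, 1.0, size=(n_genes, 17))   # log2 values, unexposed
h23 = rng.normal(10.0, 1.0, size=(n_genes, 13))  # log2 values, H2-3
h23[:150] += 2.0                                 # spike in 150 up-regulated genes

pvals = stats.ttest_ind(h23, h0, axis=1).pvalue
log2_fc = h23.mean(axis=1) - h0.mean(axis=1)
# Significant AND at least twofold (|log2 FC| >= 1).
candidates = (pvals < 0.05) & (np.abs(log2_fc) >= 1.0)
print(f"{candidates.sum()} candidate genes selected")
```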
A comparison of thyroidal protection by iodine and perchlorate against radioiodine exposure in Caucasians and Japanese
Radioactive iodine released in nuclear accidents may accumulate in the thyroid and, by irradiation, enhance the risk of cancer. Radioiodine uptake into the gland can be inhibited by large doses of stable iodine or perchlorate. Daily nutritional iodine intake may affect thyroid physiology, so that the radiological doses absorbed by the thyroid as well as the thyroid blocking efficacy may differ in Japanese, with a very iodine-rich diet, compared to Caucasians. Based on established biokinetic–dosimetric models for the thyroid, we derived the parameters for Caucasians and Japanese to quantitatively compare the effects of radioiodine exposure and the protective efficacy of thyroid blocking by stable iodine at the officially recommended dosages (100 mg in Germany, 76 mg in Japan) or perchlorate. The maximum transport capacity for iodine uptake into the thyroid is lower in Japanese compared to Caucasians. For the same radioiodine exposure pattern, the radiological equivalent thyroid dose is substantially lower in Japanese in the absence of thyroid blocking treatments. In the case of acute radioiodine exposure, stable iodine is less potent in Japanese (ED50 = 41.6 mg) than in Caucasians (ED50 = 2.7 mg) and confers less thyroid protection at the recommended dosages because of a delayed responsiveness to iodine saturation of the gland (Wolff–Chaikoff effect). Perchlorate (ED50 = 10 mg in Caucasians) at a dose of 1000 mg has roughly the same thyroid blocking effect as 100 mg iodine in Caucasians, whereas it confers much better protection than 76 mg iodine in Japanese. For prolonged exposures, a single dose of iodine offers substantially lower protection than after acute radioiodine exposure in both groups. Repetitive daily iodine administrations improve efficacy without reaching the levels seen after acute radioiodine exposure and achieve only slightly better protection in Japanese than in Caucasians. However, in the case of continuous radioiodine exposure, daily doses of 1000 mg perchlorate achieve a high protective efficacy in Caucasians as well as Japanese (>0.98). In Caucasians, iodine (100 mg) and perchlorate (1000 mg) at the recommended dosages seem to be alternatives in the case of acute radioiodine exposure, whereas perchlorate has a higher protective efficacy in the case of longer lasting radioiodine exposures. In Japanese, considering protective efficacy, preference should be given to perchlorate in acute as well as prolonged radioiodine exposure scenarios.
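The ED50 values quoted above come from dose-effect curves. A sketch of how such a curve can be fitted with a Hill-type model follows; the efficacy data points are invented, not the model's outputs:

```python
# Sketch: recovering an ED50 for thyroid blocking from a sigmoidal
# dose-effect curve. Data points are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def hill(dose, ed50, n):
    """Fractional protective efficacy as a function of the blocking dose."""
    return dose**n / (ed50**n + dose**n)

doses = np.array([0.5, 1.0, 2.0, 5.0, 10.0, 25.0, 50.0, 100.0])  # mg iodine
efficacy = np.array([0.10, 0.22, 0.40, 0.64, 0.79, 0.90, 0.95, 0.98])

(ed50, n), _ = curve_fit(hill, doses, efficacy, p0=[5.0, 1.0])
print(f"ED50 = {ed50:.1f} mg, Hill coefficient = {n:.2f}")
```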
Simulations of radioiodine exposure and protective thyroid blocking in a new biokinetic model of the mother–fetus unit at different pregnancy ages
In the case of nuclear incidents, radioiodine may be released. After incorporation, it accumulates in the thyroid and enhances the risk of thyroidal dysfunction and cancer by internal irradiation. Pregnant women and children are particularly vulnerable. Therefore, thyroidal protection by administering a large dose of stable (non-radioactive) iodine, blocking radioiodide uptake into the gland, is essential in these subpopulations. However, a quantitative estimation of the protection conferred to the maternal and fetal thyroids in the different stages of pregnancy is difficult. We started from an established biokinetic model for radioiodine in pregnancy using first-order kinetics. As the uptake of iodide into the thyroid and several other tissues is mediated by a saturable active transport, we integrated an uptake mechanism described by Michaelis–Menten kinetics. This permits simulating the competition between stable and radioactive iodide at the membrane carrier site, one of the protective mechanisms. The Wolff–Chaikoff effect, the other protective mechanism, was simulated by adding a total net uptake block for iodide into the thyroid, becoming active when the gland is saturated with iodine. The model's validity was confirmed by comparing predicted values with results from other models and sparse empirical data. According to our model, in the case of radioiodine exposure without thyroid blocking, the thyroid equivalent dose in the maternal gland increases by about 45% within the first weeks of pregnancy and remains in the same range until term. Beginning in the 12th pregnancy week, the equivalent dose in the fetal thyroid disproportionately increases over time and amounts to three times the dose of the maternal gland at term. The protection of the maternal and fetal glands increases concomitantly with the amount of stable iodine administered to the mother simultaneously with acute radioiodine exposure. The dose–effect curves reflecting the combined thyroidal protection by the competition at the membrane carrier site and the Wolff–Chaikoff effect in the mother are characterized by a mean effective dose (ED50) of roughly 1.5 mg throughout pregnancy. In the case of the fetal thyroid, the mean effective doses for thyroid blocking, taking into account only the competition at the carrier site, are numerically lower than in the mother. Additionally taking into account the Wolff–Chaikoff effect, the dose–effect curves for thyroidal protection in the fetus show a shift to the left over time, with a mean effective dose of 12.9 mg in the 12th week of pregnancy decreasing to 0.5 mg at term. In any case, according to our model, the usually recommended dose of 100 mg stable iodine given at the time of acute radioiodine exposure confers a very high level of thyroidal protection to the maternal and fetal glands over pregnancy. For ethical reasons, the possibilities of experimental studies on thyroid blocking in pregnant women are extremely limited. Furthermore, results from animal studies are associated with the uncertainties related to the translation of the data to humans. Thus, model-based simulations may be a valuable tool for gaining better insight into the efficacy of thyroidal protection and for improving preparedness planning for uncommon nuclear or radiological emergencies.
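A toy, single-compartment fragment of such a model is sketched below: Michaelis–Menten uptake shared competitively by radioactive and stable iodide, plus a crude all-or-nothing Wolff–Chaikoff block. All parameters are placeholders, not the published model's values:

```python
# Sketch: competitive Michaelis-Menten iodide uptake with a saturation
# block, integrated as an ODE. Parameters are arbitrary placeholders.
import numpy as np
from scipy.integrate import solve_ivp

VMAX, KM = 5.0, 2.0  # carrier capacity and affinity (arbitrary units)
BLOCK_AT = 80.0      # gland content triggering the Wolff-Chaikoff block

def rhs(t, y):
    radio, stable, gland = y  # plasma radioiodide, plasma stable iodide, gland
    total = radio + stable
    # Both isotopes compete for one saturable carrier; uptake stops
    # entirely once the gland is saturated (crude Wolff-Chaikoff).
    if gland >= BLOCK_AT or total <= 0.0:
        uptake = 0.0
    else:
        uptake = VMAX * total / (KM + total)
    u_radio = uptake * (radio / total) if total > 0.0 else 0.0
    u_stable = uptake - u_radio
    return [-u_radio, -u_stable, u_radio + u_stable]

# 1 unit of radioiodide plus a large stable iodine dose at t = 0; 48 h run.
sol = solve_ivp(rhs, (0.0, 48.0), [1.0, 100.0, 0.0])
print(f"radioiodide taken up by the gland: {1.0 - sol.y[0, -1]:.3f} units")
```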
CLIP2 as radiation biomarker in papillary thyroid carcinoma
A substantial increase in papillary thyroid carcinoma (PTC) among children exposed to the radioiodine fallout has been one of the main consequences of the Chernobyl reactor accident. Recently, the investigation of PTCs from a cohort of young patients exposed to the post-Chernobyl radioiodine fallout at very young age and a matched nonexposed control group revealed a radiation-specific DNA copy number gain on chromosomal band 7q11.23 and the radiation-associated mRNA overexpression of CLIP2. In this study, we investigated the potential role of CLIP2 as a radiation marker to be used for the individual classification of PTCs into CLIP2-positive and -negative cases, a prerequisite for the integration of CLIP2 into epidemiological modelling of the risk of radiation-induced PTC. We were able to validate the radiation-associated CLIP2 overexpression at the protein level by immunohistochemistry (IHC) followed by relative quantification using digital image analysis software (P = 0.0149). Furthermore, we developed a standardized workflow for the determination of CLIP2-positive and -negative cases that combines visual CLIP2 IHC scoring and CLIP2 genomic copy number status. In addition to the discovery cohort (n = 33), two independent validation cohorts of PTCs (n = 115) were investigated. High sensitivity and specificity rates for all three investigated cohorts were obtained, demonstrating robustness of the developed workflow. To analyse the function of CLIP2 in radiation-associated PTC, the CLIP2 gene regulatory network was reconstructed using global mRNA expression data from PTC patient samples. The genes comprising the first neighbourhood of CLIP2 (BAG2, CHST3, KIF3C, NEURL1, PPIL3 and RGS4) suggest the involvement of CLIP2 in the fundamental carcinogenic processes including apoptosis, mitogen-activated protein kinase signalling and genomic instability. In our study, we successfully developed and independently validated a workflow for the typing of PTC clinical samples into CLIP2-positive and CLIP2-negative and provided first insights into the CLIP2 interactome in the context of radiation-associated PTC.
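The typing workflow combines a visual IHC score with the CLIP2 copy number status. A rule of this shape is sketched below, with a scoring scale and thresholds that are assumptions rather than the published cut-offs:

```python
# Sketch: a two-input typing rule of the kind described; the 0-3 IHC
# scale and the decision thresholds are illustrative assumptions.
def clip2_status(ihc_score: int, copy_number_gain: bool) -> str:
    """Type a PTC sample as CLIP2-positive or -negative from a visual
    IHC score and the 7q11.23 copy number status."""
    if ihc_score >= 2 or (ihc_score == 1 and copy_number_gain):
        return "CLIP2-positive"
    return "CLIP2-negative"

for ihc, gain in [(3, False), (1, True), (1, False)]:
    print(f"IHC {ihc}, gain {gain}: {clip2_status(ihc, gain)}")
```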
Rapid Prediction of Hematologic Acute Radiation Syndrome in Radiation Injury Patients Using Peripheral Blood Cell Counts
Rapid clinical triage of radiation injury patients is essential for determining appropriate diagnostic and therapeutic interventions. We examined the utility of blood cell counts (BCCs) in the first three days postirradiation to predict clinical outcome, specifically for hematologic acute radiation syndrome (HARS). We analyzed BCC test samples from radiation accident victims (n = 135) along with their clinical outcome HARS severity scores (H1–4) using the System for Evaluation and Archiving of Radiation Accidents based on Case Histories (SEARCH) database. Data from nonirradiated individuals (H0, n = 132) were collected from an outpatient facility. We created binary categories for severity scores, i.e., 1 (H0 vs. H1–4), 2 (H0–1 vs. H2–4) and 3 (H0–2 vs. H3–4), to assess the discrimination ability of BCCs using unconditional logistic regression analysis. The test sample contained 454 BCCs from 267 individuals. We validated the discrimination ability on a second independent group composed of 275 BCCs from 252 individuals originating from SEARCH (HARS 1–4), an outpatient facility (H0) and hospitals (e.g., leukemia patients, H4). Individuals with a score of H0 were easily separated from exposed individuals based on developing lymphopenia and granulocytosis. The separation of H0 and H1–4 became more prominent with increasing hematologic severity scores and time. On day 1, lymphocyte counts were most predictive for discriminating the binary categories, followed by granulocytes and thrombocytes. For days 2 and 3, an almost complete separation was achieved when BCCs from different days were combined, supporting the measurement of sequential BCCs. We found an almost complete discrimination of H0 vs. irradiated individuals during model validation (negative predictive value, NPV > 94%) for all three days, while the correct prediction of exposed individuals increased from day 1 (positive predictive value, PPV 78–89%) to day 3 (PPV > 90%). The models were unable to provide predictions for 10.9% of the test samples, because the PPVs or NPVs did not reach a 95% likelihood, defined as the lower limit for a prediction. We developed a prediction model spreadsheet to provide early and prompt diagnostic predictions and therapeutic recommendations, including identification of the worried well, the requirement of hospitalization or the development of severe hematopoietic syndrome. These results improve the provisional classification of HARS. For the final diagnosis, further procedures (sequential diagnosis, retrospective dosimetry, clinical follow-up, etc.) must be taken into account. The clinical outcome of radiation injury patients can thus be rapidly predicted within the first three days postirradiation using peripheral BCCs.
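A sketch of the prediction approach: logistic regression on blood cell counts with a 95% likelihood floor below which no call is made. The simulated counts and the fitted coefficients are illustrative, not the study's model:

```python
# Sketch: binary HARS-category prediction from blood cell counts with a
# 95% likelihood floor. Counts are simulated, not SEARCH data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
# Simulated day-1 counts [lymphocytes, granulocytes] in 10^9 cells/l.
h0 = rng.normal([2.5, 4.0], [0.5, 1.0], size=(132, 2))   # unexposed
h14 = rng.normal([0.8, 8.0], [0.4, 2.0], size=(135, 2))  # lymphopenia + granulocytosis
X = np.vstack([h0, h14])
y = np.r_[np.zeros(132), np.ones(135)]
model = LogisticRegression().fit(X, y)

def classify(counts, floor=0.95):
    """Return a category only if its likelihood reaches the 95% floor."""
    p_exposed = model.predict_proba(np.atleast_2d(counts))[0, 1]
    if p_exposed >= floor:
        return "exposed (H1-4)"
    if p_exposed <= 1.0 - floor:
        return "unexposed (H0)"
    return "no prediction"

print(classify([0.6, 9.5]))  # strong lymphopenia with granulocytosis
print(classify([2.0, 5.0]))  # borderline counts
```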
miRNA Expression Patterns Differ by Total- or Partial-Body Radiation Exposure in Baboons
In a radiation exposure event, a likely scenario may include either total-body irradiation (TBI) or different partial-body irradiation (PBI) patterns. Knowledge of the exposure pattern is expected to improve the prediction of clinical outcome. We examined miRNA species in 17 irradiated baboons receiving upper-body, left hemibody or total-body irradiation of 2.5 or 5 Gy. Blood samples were taken before irradiation and at 1, 2, 7, 28 and 75–106 days after irradiation. Using a qRT-PCR platform for simultaneous detection of 667 miRNAs, we identified 55 miRNAs over all time points. Candidate miRNAs, such as miR-17, miR-128 or miR-15b, significantly discriminated TBI from the different PBI exposure patterns, with 5-to-10-fold changes in gene expression observed among the groups. A total of 22 miRNAs (including miR-17) revealed significant linear associations of gene expression changes with the percentage of the exposed body area (P < 0.0001). All these changes were primarily observed at day 7 postirradiation, and almost no miRNAs were detected either before or after day 7. The significantly greater reduction of lymphocyte counts in TBI compared with PBI animals corresponded with the number of miRNA candidates, suggesting that our target miRNAs predominantly originated from irradiated lymphocytes. In summary, gene expression changes in the peripheral blood provided an indication of the exposure pattern and of the percentage of the exposed body area.
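The linear association between expression change and exposed body area reported for miR-17 and others can be tested as sketched below, on invented data points:

```python
# Sketch: linear regression of a miRNA's day-7 expression change on the
# percentage of exposed body area. Data points are invented.
import numpy as np
from scipy import stats

body_area_pct = np.array([40, 40, 50, 50, 100, 100, 100])     # % body exposed
log2_fc_day7 = np.array([1.1, 1.4, 1.9, 1.7, 3.2, 2.8, 3.4])  # miRNA change

fit = stats.linregress(body_area_pct, log2_fc_day7)
print(f"slope: {fit.slope:.3f} log2 units per % body area, "
      f"P = {fit.pvalue:.5f}")
```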
Laboratory Intercomparison of Gene Expression Assays
The possibility of a large-scale acute radiation exposure necessitates the development of new methods that could provide rapid individual dose estimates with high sample throughput. The focus of this study was an intercomparison of the laboratories' dose-assessment performance using gene expression assays. Lithium-heparinized whole blood from one healthy donor was irradiated (240 kVp, 1 Gy/min) immediately after venipuncture at approximately 37°C using single X-ray doses. Blood samples to establish calibration curves (0.25–4 Gy) as well as 10 blinded test samples (0.1–6.4 Gy) were incubated for 24 h at 37°C, supplemented with an equal volume of medium and 10% fetal calf serum. For quantitative reverse transcription polymerase chain reaction (qRT-PCR), samples were lysed, stored at −20°C and shipped on ice. For the chemical ligation-dependent probe amplification methodology (CLPA), aliquots were incubated in 2 ml CLPA reaction buffer (DxTerity), mixed and shipped at room temperature. Assays were run in each laboratory according to locally established protocols. The mean absolute difference (MAD) of estimated doses relative to the true doses (in Gy) was calculated. We also merged doses into binary categories reflecting aspects of clinical/diagnostic relevance and examined accuracy, sensitivity and specificity. The earliest reported time on dose estimates was <8 h. The standard deviation of technical replicate measurements was below 11% in 75% of all measurements. MAD values of 0.3–0.5 Gy and 0.8–1.3 Gy divided the laboratories' contributions into two groups. These fourfold differences in accuracy could be primarily explained by unexpected variances of the housekeeping gene (P = 0.0008) and by performance differences in the processing of calibration and blinded test samples by half of the contributing laboratories. Gene expression dose estimates aggregated into binary categories generally showed accuracies and sensitivities of 93–100% and 76–100% for the groups with low and high MAD, respectively. In conclusion, gene expression-based dose estimates were reported quickly, and for laboratories with MADs of 0.3–0.5 Gy, binary dose categories of clinical significance could be discriminated with an accuracy and sensitivity comparable to established cytogenetic assays.
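The two evaluation metrics used here, MAD against true doses and accuracy/sensitivity after binarizing at a clinically motivated cut-off, are computed below on invented dose estimates; the 2 Gy cut-off is an assumed example:

```python
# Sketch: MAD and binary-category metrics on invented dose estimates.
import numpy as np

true_gy = np.array([0.1, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 6.4])
estimated_gy = np.array([0.0, 0.8, 1.2, 1.1, 2.4, 1.9, 4.5, 5.9])

mad = np.mean(np.abs(estimated_gy - true_gy))
print(f"MAD: {mad:.2f} Gy")

cutoff = 2.0  # assumed clinical threshold, e.g. "needs hospitalization"
truth = true_gy >= cutoff
calls = estimated_gy >= cutoff
accuracy = np.mean(truth == calls)
sensitivity = np.mean(calls[truth])  # fraction of exposed correctly called
print(f"accuracy {accuracy:.0%}, sensitivity {sensitivity:.0%}")
```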