27 result(s) for "Herodin, F."
Comparison of Established and Emerging Biodosimetry Assays
Rapid biodosimetry tools are required to assist with triage in the case of a large-scale radiation incident. Here, we aimed to determine the dose-assessment accuracy of the well-established dicentric chromosome assay (DCA) and cytokinesis-block micronucleus assay (CBMN) in comparison to the emerging γ-H2AX foci and gene expression assays for triage-mode biodosimetry and radiation injury assessment. Coded blood samples exposed to 10 X-ray doses (240 kVp, 1 Gy/min) of up to 6.4 Gy were sent to participants for dose estimation. Report times were documented for each laboratory and assay. The mean absolute difference (MAD) of estimated doses relative to the true doses was calculated. We also merged doses into binary dose categories of clinical relevance and examined the accuracy, sensitivity and specificity of the assays. Dose estimates were reported by the first laboratories within 0.3–0.4 days of receipt of samples for the γ-H2AX and gene expression assays, compared to 2.4 and 4 days for the DCA and CBMN assays, respectively. Irrespective of the assay, we found a 2.5–4-fold variation in interlaboratory accuracy, with the lowest MAD values for the DCA (0.16 Gy), followed by the CBMN (0.34 Gy), gene expression (0.34 Gy) and γ-H2AX foci (0.45 Gy) assays. Binary categories of dose estimates could be discriminated with equal efficiency for all assays, but at doses ≥1.5 Gy a 10% decrease in efficiency was observed for the foci assay, which was still comparable to the CBMN assay. In conclusion, the DCA has been confirmed as the gold-standard biodosimetry method, but in situations where speed and throughput are more important than ultimate accuracy, the emerging rapid molecular assays have the potential to become useful triage tools.
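The accuracy metric used throughout these intercomparison studies, the mean absolute difference (MAD) of estimated doses relative to the true doses, is simple to reproduce. A minimal Python sketch with invented illustrative doses (not data from the study):

```python
def mean_absolute_difference(true_doses, estimated_doses):
    """MAD: average of |estimated - true| over all test samples (Gy)."""
    diffs = [abs(est - tru) for tru, est in zip(true_doses, estimated_doses)]
    return sum(diffs) / len(diffs)

# Hypothetical blinded test samples (doses in Gy).
true_doses = [0.5, 1.0, 2.0, 4.0]
estimated = [0.6, 0.9, 2.1, 4.2]
mad = mean_absolute_difference(true_doses, estimated)  # ≈ 0.125 Gy
```

A lower MAD means the laboratory's blinded estimates tracked the delivered doses more closely, which is how the assays above are ranked.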
First Generation Gene Expression Signature for Early Prediction of Late Occurring Hematological Acute Radiation Syndrome in Baboons
We implemented a two-stage study to predict late-occurring hematologic acute radiation syndrome (HARS) in a baboon model based on gene expression changes measured in peripheral blood within the first two days after irradiation. Eighteen baboons were irradiated to simulate different patterns of partial-body and total-body exposure, corresponding to an equivalent dose of 2.5 or 5 Gy. According to changes in blood cell counts, the surviving baboons (n = 17) exhibited mild (H1–2, n = 4) or more severe (H2–3, n = 13) HARS. Blood samples taken before irradiation served as unexposed controls (H0, n = 17). For stage I of this study, a whole-genome screen (mRNA microarrays) was performed using a portion of the samples (H0, n = 5; H1–2, n = 4; H2–3, n = 5). For stage II, validation was performed with the remaining samples and the more sensitive qRT-PCR methodology on candidate genes that were differentially up- or down-regulated during the first two days after irradiation. Differential gene expression was defined as significant (P < 0.05) with at least a twofold difference relative to the H0 classification. Of approximately 20,000 genes, on average 46% appeared to be expressed. On day 1 postirradiation for H2–3, approximately 2–3 times more genes appeared up-regulated (1,418 vs. 550) or down-regulated (1,603 vs. 735) compared to H1–2. This pattern became more pronounced at day 2, while the number of differentially expressed genes decreased. The specific genes showed an enrichment of biological processes coding for immune system processes, natural killer cell activation and immune response (P = 1 × 10⁻⁶ up to 9 × 10⁻¹⁴). Based on the P values, magnitude and sustained differential gene expression over time, we selected 89 candidate genes for validation using qRT-PCR. Ultimately, 22 genes were confirmed for identification of H1–3 classifications and seven genes for identification of H2–3 classifications using qRT-PCR.
For H1–3 classifications, most genes were consistently three- to fivefold down-regulated relative to H0 over both days, but some genes appeared 10.3-fold (VSIG4) or even 30.7-fold (CD177) up-regulated over H0. For H2–3, some genes appeared four- to sevenfold up-regulated relative to H0 (RNASE3, DAGLA, ARG2), but other genes showed a strong 14- to 33-fold down-regulation relative to H0 (WNT3, POU2AF1, CCR7). All of these genes allowed an almost complete separation among the HARS categories. In summary, clinically relevant HARS can be independently predicted with each of the 29 genes examined in the peripheral blood of baboons within the first two days postirradiation. While further studies are needed to confirm these findings, this model shows potential relevance for the prediction of clinical outcomes in exposed humans and as an aid in prioritizing medical treatment.
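The screening criterion used in the study above, significance at P < 0.05 combined with at least a twofold change relative to H0, can be expressed as a simple filter. A sketch with invented fold changes and p-values (gene names reused from the abstract purely for illustration, plus hypothetical GENE_X/GENE_Y):

```python
import math

def is_differentially_expressed(fold_change, p_value,
                                min_fold=2.0, alpha=0.05):
    """Flag a gene as differentially expressed if it changes at least
    `min_fold` in either direction and the change is significant."""
    # A 2-fold down-regulation appears as fold_change = 0.5, so test
    # the magnitude on a log2 scale: |log2(FC)| >= log2(min_fold).
    return p_value < alpha and abs(math.log2(fold_change)) >= math.log2(min_fold)

# Hypothetical genes: (name, fold change vs. H0, p-value).
genes = [("CD177", 30.7, 0.001),   # strongly up-regulated
         ("WNT3", 0.05, 0.002),    # strongly down-regulated
         ("GENE_X", 1.4, 0.0001),  # significant but under 2-fold
         ("GENE_Y", 3.0, 0.20)]    # large change, not significant
candidates = [name for name, fc, p in genes
              if is_differentially_expressed(fc, p)]
# candidates == ["CD177", "WNT3"]
```

Working on a log2 scale treats a 2-fold up- and a 2-fold down-regulation symmetrically, which is why both CD177 and WNT3 pass while GENE_X and GENE_Y are excluded.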
Laboratory Intercomparison of Gene Expression Assays
The possibility of a large-scale acute radiation exposure necessitates the development of new methods that could provide rapid individual dose estimates with high sample throughput. The focus of this study was an intercomparison of laboratories' dose-assessment performance using gene expression assays. Lithium-heparinized whole blood from one healthy donor was irradiated (240 kVp, 1 Gy/min) immediately after venipuncture at approximately 37°C using single X-ray doses. Blood samples to establish calibration curves (0.25–4 Gy) as well as 10 blinded test samples (0.1–6.4 Gy) were incubated for 24 h at 37°C, supplemented with an equal volume of medium and 10% fetal calf serum. For quantitative reverse transcription polymerase chain reaction (qRT-PCR), samples were lysed, stored at −20°C and shipped on ice. For the chemical ligation-dependent probe amplification (CLPA) methodology, aliquots were incubated in 2 ml CLPA reaction buffer (DxTerity), mixed and shipped at room temperature. Assays were run in each laboratory according to locally established protocols. The mean absolute difference (MAD) of estimated doses relative to the true doses (in Gy) was calculated. We also merged doses into binary categories reflecting aspects of clinical/diagnostic relevance and examined accuracy, sensitivity and specificity. The earliest time to reported dose estimates was <8 h. The standard deviation of technical replicate measurements was below 11% in 75% of all measurements. MAD values of 0.3–0.5 Gy and 0.8–1.3 Gy divided the laboratories' contributions into two groups. These fourfold differences in accuracy could be explained primarily by unexpected variances of the housekeeping gene (P = 0.0008) and by performance differences in the processing of calibration and blinded test samples by half of the contributing laboratories.
Reported gene expression dose estimates aggregated into binary categories in general showed accuracies and sensitivities of 93–100% and 76–100% for the low-MAD and high-MAD groups, respectively. In conclusion, gene expression-based dose estimates were reported quickly, and for laboratories with MADs of 0.3–0.5 Gy, binary dose categories of clinical significance could be discriminated with an accuracy and sensitivity comparable to established cytogenetic assays.
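The accuracy, sensitivity and specificity figures for binary dose categories follow standard confusion-matrix definitions. A sketch assuming a hypothetical 1 Gy clinical threshold and invented true/estimated dose pairs:

```python
def binary_triage_metrics(true_doses, est_doses, threshold=1.0):
    """Classify each sample as above/below a clinical dose threshold
    and score the estimates against the ground truth."""
    tp = fp = tn = fn = 0
    for tru, est in zip(true_doses, est_doses):
        actual, predicted = tru >= threshold, est >= threshold
        if actual and predicted:
            tp += 1
        elif actual and not predicted:
            fn += 1
        elif predicted:
            fp += 1
        else:
            tn += 1
    return {"accuracy": (tp + tn) / (tp + fp + tn + fn),
            "sensitivity": tp / (tp + fn),   # exposed cases caught
            "specificity": tn / (tn + fp)}   # low-dose cases cleared

# Hypothetical set of 10 blinded samples (doses in Gy).
true_d = [0.1, 0.25, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0, 6.4]
est_d  = [0.2, 0.3,  1.1, 0.9, 1.4, 2.2, 2.8, 4.3, 4.9, 5.5]
metrics = binary_triage_metrics(true_d, est_d)
```

For triage purposes, sensitivity (not missing a significantly exposed person) is usually weighted more heavily than specificity, which is why both figures are reported separately above.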
miRNA Expression Patterns Differ by Total- or Partial-Body Radiation Exposure in Baboons
In a radiation exposure event, a likely scenario may include either total-body irradiation (TBI) or different partial-body irradiation (PBI) patterns. Knowledge of the exposure pattern is expected to improve prediction of clinical outcome. We examined miRNA species in 17 irradiated baboons receiving an upper-body, left-hemibody or total-body irradiation of 2.5 or 5 Gy. Blood samples were taken before irradiation and at 1, 2, 7, 28 and 75–106 days after irradiation. Using a qRT-PCR platform for simultaneous detection of 667 miRNAs, we identified 55 miRNAs over all time points. Candidate miRNAs, such as miR-17, miR-128 or miR-15b, significantly discriminated TBI from different PBI exposure patterns, and 5- to 10-fold changes in gene expression were observed among the groups. A total of 22 miRNAs (including miR-17) revealed significant linear associations of gene expression changes with the percentage of the exposed body area (P < 0.0001). All these changes were primarily observed at day 7 postirradiation, and almost no miRNAs were detected either before or after day 7. A significant association in the reduction of lymphocyte counts in TBI compared to PBI animals corresponded with the number of miRNA candidates. This finding suggests that our target miRNAs predominantly originated from irradiated lymphocytes. In summary, gene expression changes in the peripheral blood provided indications of the exposure pattern and a suggestion of the percentage of the exposed body area.
Validating Baboon Ex Vivo and In Vivo Radiation-Related Gene Expression with Corresponding Human Data
The search for high-throughput diagnostic tests for victims of radio/nuclear incidents remains ongoing. In this context, we previously identified candidate genes that predict the risk of late-occurring hematologic acute radiation syndrome (HARS) in a baboon model. The goal of the current study was to validate these genes after radiation exposure in humans. We also examined ex vivo relative to in vivo measurements in both species and describe dose-response relationships. Eighteen baboons were irradiated in vivo to simulate different patterns of partial- or total-body irradiation (TBI), corresponding to an equivalent dose of 2.5 or 5 Sv. Human in vivo blood samples were obtained from patients exposed to different dose ranges: diagnostic computerized tomography (CT; 0.004–0.018 Sv); radiotherapy for prostate cancer (0.25–0.3 Sv); and TBI of leukemia patients (2 × 1.5 or 2 × 2 Sv, five patients each). Peripheral whole blood from another five baboons and human samples from five healthy donors were cultivated ex vivo and irradiated with 0–4 Sv. RNA was isolated pairwise before and 24 h after irradiation and converted into cDNA. Gene expression of six promising candidate genes found previously by us in the baboon model (WNT3, POU2AF1, CCR7, ARG2, CD177, WLS), as well as three genes commonly used in ex vivo whole blood experiments (FDXR, PCNA, DDB2), was measured using qRT-PCR. We confirmed the six baboon candidate genes in leukemia patients. However, expression of FDXR showed an inverse relationship, as it was downregulated in baboons and upregulated in human samples. Comparisons among the in vivo and ex vivo experiments revealed the same pattern in both species and indicated peripheral blood cells to represent the radiation-responsive targets causing the WNT3 and POU2AF1 gene expression changes. CCR7, ARG2, CD177 and WLS appeared to be altered due to radiation-responsive targets other than the whole blood cells.
Linear dose-response relationships of FDXR, WNT3 and POU2AF1 in human ex vivo samples corresponded with those in human in vivo samples, suggesting that ex vivo models can be used for in vivo dose estimates over a wide dose range (0.001–5 Sv for POU2AF1). In summary, we validated six baboon candidate genes in humans, but the FDXR measurements underscored the importance of independent assessments even when candidates from animal models have striking gene sequence homology to humans. Since whole blood cells represented the same radiation-responsive targets for the FDXR, WNT3 and POU2AF1 gene expression changes, ex vivo cell culture models can be utilized for in vivo dose estimates over a dose range covering up to 3.5 log scales. These findings may be a step forward in the development of a gene expression-based high-throughput diagnostic test for populations involved in large-scale radio/nuclear incidents.
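Using an ex vivo calibration curve for in vivo dose estimation is, in its simplest linear form, a fit-and-invert computation. A sketch with an invented, perfectly linear calibration (real calibration data would carry scatter and uncertainty, and gene expression dose responses are often fitted on a log scale):

```python
def fit_linear(doses, responses):
    """Ordinary least-squares slope/intercept for response ≈ a*dose + b."""
    n = len(doses)
    mx, my = sum(doses) / n, sum(responses) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(doses, responses))
             / sum((x - mx) ** 2 for x in doses))
    return slope, my - slope * mx

def estimate_dose(response, slope, intercept):
    """Invert the calibration curve: dose = (response - intercept) / slope."""
    return (response - intercept) / slope

# Hypothetical ex vivo calibration: dose (Sv) vs. log2 fold change.
cal_doses = [0.0, 1.0, 2.0, 4.0]
cal_responses = [0.0, 1.5, 3.0, 6.0]   # perfectly linear for illustration
slope, intercept = fit_linear(cal_doses, cal_responses)
dose_estimate = estimate_dose(4.5, slope, intercept)  # → 3.0 Sv
```

The validity of the inversion rests on exactly the point the abstract makes: the ex vivo dose response must correspond to the in vivo one for the estimated dose to be meaningful.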
Laboratory Intercomparison of the Dicentric Chromosome Analysis Assay
The study design and results represent an intercomparison of laboratories performing dose assessment using dicentric chromosome analysis (DCA) as a diagnostic triage tool for individual radiation dose assessment. Homogeneously X-irradiated (240 kVp, 1 Gy/min) blood samples for establishing calibration data (0.25–5 Gy), as well as blind samples (0.1–6.4 Gy), were sent to the participants. DCA was performed according to established protocols. The time taken to report dose estimates was documented for each laboratory. Additional information concerning laboratory organization/characteristics as well as assay performance was collected. The mean absolute difference (MAD) was calculated, and radiation doses were merged into four triage categories reflecting clinical aspects to calculate accuracy, sensitivity and specificity. The earliest report time was 2.4 days after sample arrival. DCA dose estimates were reported with high and comparable accuracy, with MAD values ranging from 0.16 to 0.5 Gy for both manual and automated scoring. No significant differences were found for dose estimates based on 20, 30, 40 or 50 cells, suggesting that the number of scored cells can be reduced from 50 to 20 without loss of precision in triage dose estimates, at least for homogeneous exposure scenarios. Triage categories of clinical significance could be discriminated efficiently using both scoring procedures.
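Triage dose estimation from dicentric counts conventionally inverts a linear-quadratic calibration curve, Y = c + αD + βD², where Y is the dicentric yield per cell. A sketch with illustrative coefficients (assumed for this example, not taken from the study's calibration data):

```python
import math

def dose_from_yield(y, c=0.001, alpha=0.02, beta=0.06):
    """Solve y = c + alpha*D + beta*D**2 for dose D (Gy), taking the
    positive root of the quadratic. Coefficients are illustrative only."""
    disc = alpha ** 2 - 4 * beta * (c - y)
    return (-alpha + math.sqrt(disc)) / (2 * beta)

# Hypothetical triage scoring: 12 dicentrics observed in 20 metaphases.
dicentrics_per_cell = 12 / 20
dose = dose_from_yield(dicentrics_per_cell)   # just under 3 Gy
```

The finding that 20 cells suffice for triage reflects this arithmetic: with a steep quadratic term, even a coarse yield estimate from few cells pins the dose to within the width of a triage category.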
Laboratory Intercomparison of the Cytokinesis-Block Micronucleus Assay
The focus of this study was an intercomparison of laboratories' dose-assessment performance using the cytokinesis-block micronucleus (CBMN) assay as a diagnostic triage tool for individual radiation dose assessment. Homogeneously X-irradiated (240 kVp, 1 Gy/min) blood samples for establishing calibration data (0.25–5 Gy), as well as blind samples (0.1–6.4 Gy), were sent to the participants. The CBMN assay was performed according to protocols individually established by, and varying among, the participating laboratories. The time taken to report dose estimates was documented for each laboratory. Additional information concerning laboratory organization/characteristics as well as assay performance was collected. The mean absolute difference (MAD) was calculated, and radiation doses were merged into four triage categories reflecting clinical aspects to calculate accuracy, sensitivity and specificity. The earliest report time was 4 days after sample arrival. The CBMN dose estimates were reported with high accuracy (MAD values of 0.20–0.50 Gy at doses below 6.4 Gy for both manual and automated scoring procedures), but the assay showed a limitation at the 6.4 Gy dose point, which resulted in a clear dose underestimation in all cases. The MAD values (excluding 6.4 Gy) differed significantly (P = 0.03) between manual (0.25 Gy, SEM = 0.06, n = 4) and automated scoring procedures (0.37 Gy, SEM = 0.08, n = 5), but the lowest MADs were equal (0.2 Gy) for both scoring procedures. Likewise, both scoring procedures led to the same allocation of dose estimates to triage categories of clinical significance (about 83% accuracy and up to 100% specificity).
Short-term sonic-hedgehog gene therapy to mitigate myelosuppression in highly irradiated monkeys: hype or reality?
The protection of hematopoietic stem and progenitor cells and their environment is required for recovery from radiation-induced (RI) myelosuppression. To achieve this goal, we propose a new gene therapy strategy based on local and short-term synthesis and expression of the Sonic hedgehog morphogen (Shh) at the niche level. We investigated the hematopoietic response of 8 Gy gamma-irradiated monkeys to a single intra-osseous injection of multipotent mesenchymal stem cells (adipocyte-derived stem cells/ASCs) transduced with a Shh pIRES2 plasmid (3 ± 0.4 × 10⁶ cells/kg on day (D) 2; n = 4). Control animals were injected with mock-ASCs (n = 4). Two controls died from radiation toxicity on D19 and D196, whereas all Shh-ASC-treated monkeys fully recovered. The durations of thrombocytopenia (4.75 ± 1.8 vs. 10 ± 2.2 days, platelet count < 20 × 10⁹/L), neutropenia (14.2 ± 1 vs. 17.7 ± 2.6 days, ANC < 0.5 × 10⁹/L) and anemia (15.5 ± 3.6 vs. 50.7 ± 31 days, Hb < 10 g/dL) were reduced in Shh-ASC animals. The areas under the curve for platelets (P < 0.05), ANCs (P = 0.06) and RBC/Hb between D0 and D30 were higher in Shh-ASC-injected animals. Overall, this study suggests that Shh may represent a new factor to counteract RI myelosuppression.
Using Clinical Signs and Symptoms for Medical Management of Radiation Casualties – 2015 NATO Exercise
The utility of early-phase (≤5 days) radiation-induced clinical signs and symptoms (e.g., vomiting, diarrhea, erythema and changes in blood cell counts) was examined for the prediction of later-occurring acute radiation syndrome (ARS) severity and the development of medical management strategies. The medical treatment protocols for radiation accident victims (METREPOL) were used to grade ARS severities, which were assigned response categories (RCs). Data on individuals (n = 191) with mild (RC1, n = 45), moderate (RC2, n = 19), severe (RC3, n = 20) and fatal (RC4, n = 18) ARS, as well as nonexposed individuals (RC0, n = 89), were generated using either METREPOL (n = 167) or the system for evaluation and archiving of radiation accidents based on case histories (SEARCH) database (n = 24), the latter comprising real-case descriptions. These data were converted into tables reflecting clinical signs and symptoms and submitted to eight teams representing five participating countries. The teams were composed of medical doctors, biologists and pharmacists with subject matter expertise. The tables comprised cumulative clinical data from days 1–3 and days 1–5 postirradiation. While it would have reflected a more realistic scenario to provide the data to the teams over the course of a 3- or 5-day period, the logistics of doing so proved too challenging. In addition, the team members participating in this exercise chose to receive the cumulative reports for days 1–3 and 1–5. The teams were tasked with predicting ARS incidence, ARS severity and the requirement for hospitalization for multiple cases, as well as providing the certainty of their diagnosis. Five of the teams also performed dose estimates. The teams did not employ harmonized methodologies, and the expertise among the members varied, as did the tools used and the means of analyzing the clinical data. The earliest report time was 3 h after the tables were sent to the team members.
The majority of cases developing ARS (89.6% ± 3.3 SD) and requiring hospitalization (88.8% ± 4.6 SD) were correctly identified by all teams. Determination of ARS severity was particularly challenging for RC2–3, which was systematically overestimated. However, RC4 was correctly predicted at 94–100% by all teams. RC0 and RC1 ARS severities were more difficult to discriminate. When reported RCs (0–1 and 3–4) were merged, on average 89.6% (±3.3 SD) of all cases could be correctly classified. Comparisons of frequency distributions revealed no statistically significant differences among the following: 1. reported ARS from different teams (P > 0.2); 2. cases generated based on METREPOL or SEARCH (P > 0.5); or 3. results reported at days 3 and 5 postirradiation (P > 0.1). Dose estimates from all teams increased significantly with ARS severity (P < 0.0001) as well as with dose estimates generated from dicentric chromosome aberration measurements available for SEARCH cases (P < 0.0001). In summary, early-phase radiation-induced clinical signs and symptoms proved useful for rapid and accurate assessment, with minor limitations, toward predicting life-threatening ARS severity and developing treatment management strategies.
Cutaneous challenge with chemical warfare agents in the SKH-1 hairless mouse. (I) Development of a model for screening studies in skin decontamination and protection
Exposure to lethal chemical warfare agents (CWAs) is no longer only a military issue, due to the terrorist threat. Among the CWAs of concern are the organophosphorus nerve agent O-ethyl-S-(2-[di-isopropylamino]ethyl)methylphosphonothioate (VX) and the vesicant sulfur mustard (SM). Although efficient means of decontamination are available, most of them lose their efficacy when decontamination is delayed after exposure of the bare skin. Alternatively, CWA skin penetration can be prevented by topical skin protectants. Active research in skin protection and decontamination is thus paramount. In vivo screening of decontaminants or skin protectants is usually time-consuming and may be expensive depending on the animal species used. We were thus looking for a suitable, scientifically sound and cost-effective model that is easy to handle. The euthymic hairless mouse Crl:SKH-1 (hr/hr) BR is widely used in skin studies and has previously been described as suitable for some experiments involving SM or SM analogs. To evaluate the response of this species, we studied the consequences of exposing anaesthetized male SKH-1 mice to either liquid VX or SM, the latter used in liquid form or as saturated vapours. Long-term effects of SM burns were also evaluated. The model was then used in the companion paper (Taysse et al. [1]).