32 result(s) for "Lamkowski, A."
A comparison of the chemo- and radiotoxicity of thorium and uranium at different enrichment grades
Uranium and thorium are heavy metals, and all of their isotopes are radioactive, so it is impossible to study chemical effects entirely independent of the radiation effects. In the present study, we tried to compare the chemo- and radiotoxicity of both metals, taking into account deterministic radiation damage reflected by acute radiation sickness and stochastic radiation damage leading to long-term health impairments (e.g., tumor induction). We first performed a literature search on the acute median lethal doses that may be expected from chemical effects, as even acute radiation sickness, as a manifestation of acute radiotoxicity, occurs with latency. By simulations based on the biokinetic models of the International Commission on Radiological Protection and using the Integrated Modules for Bioassay Analysis software, we determined the amounts of uranium at different enrichment grades and of thorium-232 leading to a short-term red bone marrow equivalent dose of 3.5 Sv, considered to cause 50% lethality in humans. Different intake pathways for incorporation were considered, and the values were compared to the mean lethal doses by chemotoxicity. To assess stochastic radiotoxicity, we calculated the uranium and thorium amounts leading to a committed effective dose of 200 mSv, which is often considered critical. Mean lethal values for uranium and thorium are of the same order of magnitude, so the data give no evidence of substantial differences in acute chemical toxicity. When comparing radiotoxicity, the reference units (activity in Bq or weight in g) must always be taken into account. The mean lethal equivalent dose to the red bone marrow of 3.5 Sv is reached by lower activities of thorium compared to uranium in soluble compounds. However, for uranium as well as thorium-232, acute radiation sickness is expected only after incorporation of amounts exceeding the mean lethal doses by chemotoxicity. Thus, acute radiation sickness is not a relevant clinical issue for either metal. Concerning stochastic radiation damage, thorium-232 is more radiotoxic than uranium if the same activities are incorporated. Comparison based on weight units shows that for soluble compounds, thorium-232 is more radiotoxic than low-enriched uranium in the case of ingestion, and even more radiotoxic than high-enriched uranium after inhalation or intravenous administration. For insoluble compounds, the situation differs, as the stochastic radiotoxicity of thorium-232 ranges between that of depleted and natural uranium. For acute effects, the chemotoxicity of uranium, even at high enrichment grades, as well as that of thorium-232, exceeds deterministic radiotoxicity. Simulations show that thorium-232 is more radiotoxic than uranium when expressed in activity units. If the comparison is based on weight units, the ranking depends on the uranium enrichment grade and the route of intake.
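The unit-conversion chain underlying such simulations (mass → activity via specific activity → equivalent dose via a dose coefficient) can be sketched as follows. The half-lives are physical constants, but the dose coefficient below is a placeholder standing in for the ICRP biokinetic model output, not a tabulated value:

```python
import math

AVOGADRO = 6.022e23        # atoms per mole
SECONDS_PER_YEAR = 3.156e7

def specific_activity(half_life_years: float, molar_mass_g: float) -> float:
    """Specific activity in Bq per gram: A = lambda * N."""
    decay_const = math.log(2) / (half_life_years * SECONDS_PER_YEAR)  # 1/s
    atoms_per_gram = AVOGADRO / molar_mass_g
    return decay_const * atoms_per_gram

# Half-lives and molar masses of the parent nuclides (physical constants).
A_TH232 = specific_activity(1.405e10, 232.0)  # ~4.1e3 Bq/g
A_U238 = specific_activity(4.468e9, 238.0)    # ~1.2e4 Bq/g

# Placeholder dose coefficient (Sv of red bone marrow equivalent dose per Bq
# incorporated) -- it stands in for the ICRP biokinetic model output and is
# NOT a tabulated value.
DOSE_COEFF_SV_PER_BQ = 1.0e-6
TARGET_DOSE_SV = 3.5  # red bone marrow equivalent dose assumed 50% lethal

def mass_for_target_dose(spec_act_bq_per_g: float) -> float:
    """Mass (g) whose activity would deliver the target dose."""
    activity_needed_bq = TARGET_DOSE_SV / DOSE_COEFF_SV_PER_BQ
    return activity_needed_bq / spec_act_bq_per_g

print(f"Th-232: {mass_for_target_dose(A_TH232):,.0f} g")
print(f"U-238 : {mass_for_target_dose(A_U238):,.0f} g")
```

Because thorium-232's specific activity is about a third of uranium-238's, a larger mass of thorium is needed to reach the same activity, which is why the weight-based and activity-based rankings can diverge.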
Simulations of radioiodine exposure and protective thyroid blocking in a new biokinetic model of the mother–fetus unit at different pregnancy ages
In the case of nuclear incidents, radioiodine may be released. After incorporation, it accumulates in the thyroid and enhances the risk of thyroidal dysfunction and cancer occurrence by internal irradiation. Pregnant women and children are particularly vulnerable. Therefore, thyroidal protection by administering a large dose of stable (non-radioactive) iodine, blocking radioiodide uptake into the gland, is essential in these subpopulations. However, a quantitative estimation of the protection conferred to the maternal and fetal thyroids in the different stages of pregnancy is difficult. We departed from an established biokinetic model for radioiodine in pregnancy using first-order kinetics. As the uptake of iodide into the thyroid and several other tissues is mediated by a saturable active transport, we integrated an uptake mechanism described by Michaelis–Menten kinetics. This permits simulating the competition between stable and radioactive iodide at the membrane carrier site, one of the protective mechanisms. The Wolff–Chaikoff effect, as the other protective mechanism, was simulated by adding a total net uptake block for iodide into the thyroid, becoming active when the gland is saturated with iodine. The model's validity was confirmed by comparing predicted values with results from other models and sparse empirical data. According to our model, in the case of radioiodine exposure without thyroid blocking, the thyroid equivalent dose in the maternal gland increases by about 45% within the first weeks of pregnancy and remains in the same range until term. Beginning in the 12th week of pregnancy, the equivalent dose in the fetal thyroid disproportionately increases over time and amounts to three times the dose of the maternal gland at term. The protection of the maternal and fetal glands increases concomitantly with the amount of stable iodine administered to the mother simultaneously with acute radioiodine exposure. The dose–effect curves reflecting the combined thyroidal protection by the competition at the membrane carrier site and the Wolff–Chaikoff effect in the mother are characterized by a mean effective dose (ED50) of roughly 1.5 mg throughout pregnancy. In the case of the fetal thyroid, the mean effective doses for thyroid blocking, taking into account only the competition at the carrier site, are numerically lower than in the mother. Taking the Wolff–Chaikoff effect into account additionally, the dose–effect curves for thyroidal protection in the fetus show a shift to the left over time, with a mean effective dose of 12.9 mg in the 12th week of pregnancy decreasing to 0.5 mg at term. In any case, according to our model, the usually recommended dose of 100 mg stable iodine given at the time of acute radioiodine exposure confers a very high level of thyroidal protection to the maternal and fetal glands throughout pregnancy. For ethical reasons, the possibilities of experimental studies on thyroid blocking in pregnant women are extremely limited. Furthermore, results from animal studies are associated with the uncertainties related to the translation of the data to humans. Thus, model-based simulations may be a valuable tool for gaining better insight into the efficacy of thyroidal protection and for improving preparedness planning for uncommon nuclear or radiological emergencies.
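The carrier competition the model integrates can be written as the textbook two-substrate Michaelis–Menten rate; because stable and radioactive iodide are chemically identical, they can be assumed to share the same Km (a generic formulation, not the paper's exact parameterization):

$$v_{I^{*}} = \frac{V_{\max}\,[I^{*}]}{K_m + [I^{*}] + [I]}$$

Here [I*] is radioiodide and [I] is stable iodide: raising [I] lowers the uptake rate of radioiodide without any change to the carrier itself, which is the competitive protection mechanism.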
A comparison of thyroidal protection by iodine and perchlorate against radioiodine exposure in Caucasians and Japanese
Radioactive iodine released in nuclear accidents may accumulate in the thyroid and, by irradiation, enhances the risk of cancer. Radioiodine uptake into the gland can be inhibited by large doses of stable iodine or perchlorate. Nutritional daily iodine intake may impact thyroid physiology, so that the radiological doses absorbed by the thyroid, as well as thyroid blocking efficacy, may differ in Japanese with a very iodine-rich diet compared to Caucasians. Based on established biokinetic–dosimetric models for the thyroid, we derived the parameters for Caucasians and Japanese to quantitatively compare the effects of radioiodine exposure and the protective efficacy of thyroid blocking by stable iodine at the officially recommended dosages (100 mg in Germany, 76 mg in Japan) or perchlorate. The maximum transport capacity for iodine uptake into the thyroid is lower in Japanese compared to Caucasians. For the same radioiodine exposure pattern, the radiological equivalent thyroid dose is substantially lower in Japanese in the absence of thyroid blocking treatments. In the case of acute radioiodine exposure, stable iodine is less potent in Japanese (ED50 = 41.6 mg) than in Caucasians (ED50 = 2.7 mg) and confers less thyroid protection at the recommended dosages because of a delayed responsiveness to iodine saturation of the gland (Wolff–Chaikoff effect). Perchlorate (ED50 = 10 mg in Caucasians) at a dose of 1000 mg has roughly the same thyroid blocking effect as 100 mg iodine in Caucasians, whereas it confers much better protection than 76 mg iodine in Japanese. For prolonged exposures, a single dose of iodine offers substantially lower protection than after acute radioiodine exposure in both groups. Repetitive daily iodine administrations improve efficacy without reaching the levels seen after acute radioiodine exposure and achieve only slightly better protection in Japanese than in Caucasians. However, in the case of continuous radioiodine exposure, daily doses of 1000 mg perchlorate achieve a high protective efficacy in Caucasians as well as Japanese (> 0.98). In Caucasians, iodine (100 mg) and perchlorate (1000 mg) at the recommended dosages seem to be alternatives in the case of acute radioiodine exposure, whereas perchlorate has a higher protective efficacy in the case of longer-lasting radioiodine exposures. In Japanese, considering protective efficacy, preference should be given to perchlorate in acute as well as prolonged radioiodine exposure scenarios.
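As a reading aid, a protective-efficacy curve with the ED50 values quoted above can be sketched with a simple sigmoid; the Hill slope of 1 is an assumption, so the printed numbers are illustrative, not the study's results:

```python
# Sigmoid dose-effect curve for thyroid blocking. The ED50 values are the
# ones quoted in the abstract; the Hill slope h = 1 is an assumption.

def protective_efficacy(dose_mg: float, ed50_mg: float, h: float = 1.0) -> float:
    """Fraction of thyroid protection at a given stable-iodine dose."""
    return dose_mg**h / (ed50_mg**h + dose_mg**h)

ED50_CAUCASIANS_MG = 2.7   # acute exposure, from the abstract
ED50_JAPANESE_MG = 41.6    # acute exposure, from the abstract

# Officially recommended single doses (100 mg Germany, 76 mg Japan):
print(f"Caucasians, 100 mg: {protective_efficacy(100, ED50_CAUCASIANS_MG):.2f}")
print(f"Japanese,    76 mg: {protective_efficacy(76, ED50_JAPANESE_MG):.2f}")
```

With these assumptions the recommended dose sits far up the curve for Caucasians but much closer to the ED50 for Japanese, which mirrors the qualitative conclusion above.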
First Generation Gene Expression Signature for Early Prediction of Late Occurring Hematological Acute Radiation Syndrome in Baboons
We implemented a two-stage study to predict late-occurring hematologic acute radiation syndrome (HARS) in a baboon model based on gene expression changes measured in peripheral blood within the first two days after irradiation. Eighteen baboons were irradiated to simulate different patterns of partial-body and total-body exposure, corresponding to an equivalent dose of 2.5 or 5 Gy. According to changes in blood cell counts, the surviving baboons (n = 17) exhibited mild (H1–2, n = 4) or more severe (H2–3, n = 13) HARS. Blood samples taken before irradiation served as unexposed controls (H0, n = 17). For stage I of this study, a whole-genome screen (mRNA microarrays) was performed using a portion of the samples (H0, n = 5; H1–2, n = 4; H2–3, n = 5). For stage II, validation was performed on the remaining samples, using the more sensitive qRT-PCR methodology, for candidate genes that were differentially up- or down-regulated during the first two days after irradiation. Differential gene expression was defined as significant (P < 0.05) with at least a twofold difference relative to the H0 classification. Of approximately 20,000 genes, on average 46% appeared to be expressed. On day 1 postirradiation for H2–3, approximately 2–3 times more genes appeared up-regulated (1,418 vs. 550) or down-regulated (1,603 vs. 735) compared to H1–2. This pattern became more pronounced at day 2, while the number of differentially expressed genes decreased. The specific genes showed an enrichment of biological processes coding for immune system processes, natural killer cell activation and immune response (P = 1 × 10^-6 up to 9 × 10^-14). Based on the P values, magnitude and sustained differential gene expression over time, we selected 89 candidate genes for validation using qRT-PCR. Ultimately, 22 genes were confirmed for identification of H1–3 classifications and seven genes for identification of H2–3 classifications using qRT-PCR. For H1–3 classifications, most genes were constantly three- to fivefold down-regulated relative to H0 over both days, but some genes appeared 10.3-fold (VSIG4) or even 30.7-fold (CD177) up-regulated over H0. For H2–3, some genes appeared four- to sevenfold up-regulated relative to H0 (RNASE3, DAGLA, ARG2), but other genes showed a strong 14- to 33-fold down-regulation relative to H0 (WNT3, POU2AF1, CCR7). All of these genes allowed an almost complete separation among the HARS categories. In summary, clinically relevant HARS can be independently predicted with all 29 genes examined in the peripheral blood of irradiated baboons within the first two days postirradiation. While further studies are needed to confirm these findings, this model shows potential relevance for the prediction of clinical outcomes in exposed humans and as an aid in prioritizing medical treatment.
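A minimal sketch of the stage-I screening criterion (significant at P < 0.05 and at least a twofold change relative to H0), using synthetic expression values rather than study data:

```python
import numpy as np
from scipy import stats

def is_candidate(h0: np.ndarray, hx: np.ndarray,
                 alpha: float = 0.05, min_fold: float = 2.0) -> bool:
    """Stage-I style filter: P < alpha AND a >= twofold change vs. H0."""
    _, p_value = stats.ttest_ind(h0, hx)
    fold = hx.mean() / h0.mean()
    return p_value < alpha and (fold >= min_fold or fold <= 1.0 / min_fold)

# Synthetic example: a gene roughly fourfold up-regulated after irradiation.
rng = np.random.default_rng(0)
h0_expr = rng.normal(100, 10, size=5)   # unexposed baseline samples
h23_expr = rng.normal(400, 40, size=5)  # H2-3 samples, day 2 postirradiation
print(is_candidate(h0_expr, h23_expr))  # True
```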
Using Clinical Signs and Symptoms for Medical Management of Radiation Casualties – 2015 NATO Exercise
The utility of early-phase (≤5 days) radiation-induced clinical signs and symptoms (e.g., vomiting, diarrhea, erythema and changes in blood cell counts) was examined for the prediction of later-occurring acute radiation syndrome (ARS) severity and the development of medical management strategies. The medical treatment protocols for radiation accident victims (METREPOL) were used to grade ARS severities, which were assigned response categories (RCs). Data on individuals (n = 191) with mild (RC1, n = 45), moderate (RC2, n = 19), severe (RC3, n = 20) and fatal (RC4, n = 18) ARS, as well as nonexposed individuals (RC0, n = 89), were generated using either METREPOL (n = 167) or the system for evaluation and archiving of radiation accidents based on case histories (SEARCH) database (n = 24), the latter comprising real-case descriptions. These data were converted into tables reflecting clinical signs and symptoms, and submitted to eight teams representing five participating countries. The teams comprised medical doctors, biologists and pharmacists with subject matter expertise. The tables comprised cumulated clinical data from days 1–3 and days 1–5 postirradiation. While it would have reflected a more realistic scenario to provide the data to the teams over the course of a 3- or 5-day period, the logistics of doing so proved too challenging. In addition, the team members participating in this exercise chose to receive the cumulated reports of days 1–3 and 1–5. The teams were tasked with predicting ARS incidence, ARS severity and the requirement for hospitalization for multiple cases, as well as providing the certainty of their diagnosis. Five of the teams also performed dose estimates. The teams did not employ harmonized methodologies, and the expertise among the members varied, as did the tools used and the means of analyzing the clinical data. The earliest report time was 3 h after the tables were sent to the team members. The majority of cases developing ARS (89.6% ± 3.3 SD) and requiring hospitalization (88.8% ± 4.6 SD) were correctly identified by all teams. Determination of ARS severity was particularly challenging for RC2–3, which was systematically overestimated. However, RC4 was correctly predicted at 94–100% by all teams. RC0 and RC1 ARS severities were more difficult to discriminate. When the reported RCs (0–1 and 3–4) were merged, on average 89.6% (±3.3 SD) of all cases could be correctly classified. Comparisons of frequency distributions revealed no statistically significant differences among the following: 1. reported ARS from different teams (P > 0.2); 2. cases generated based on METREPOL or SEARCH (P > 0.5); or 3. results reported at days 3 and 5 postirradiation (P > 0.1). Dose estimates of all teams increased significantly with ARS severity (P < 0.0001) as well as with dose estimates generated from dicentric chromosomal-aberration measurements available for SEARCH cases (P < 0.0001). In summary, early-phase radiation-induced clinical signs and symptoms proved to be useful for rapid and accurate assessment, with minor limitations, toward predicting life-threatening ARS severity and developing treatment management strategies.
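The merging step (pooling RC0–1 and RC3–4 before scoring agreement) amounts to relabeling and counting matches; a minimal sketch with made-up predictions:

```python
def merge_rc(rc: int) -> str:
    """Pool response categories as in the analysis: RC0-1, RC2, RC3-4."""
    if rc <= 1:
        return "RC0-1"
    if rc == 2:
        return "RC2"
    return "RC3-4"

def merged_agreement(true_rcs, predicted_rcs) -> float:
    """Fraction of cases whose merged categories match."""
    hits = sum(merge_rc(t) == merge_rc(p) for t, p in zip(true_rcs, predicted_rcs))
    return hits / len(true_rcs)

# Made-up example: confusing RC0 with RC1 no longer counts as an error
# after merging, which is why the pooled agreement rises.
truth = [0, 1, 2, 3, 4, 4]
preds = [1, 0, 3, 3, 4, 3]
print(f"merged agreement: {merged_agreement(truth, preds):.2f}")  # 0.83
```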
Rapid High-Throughput Diagnostic Triage after a Mass Radiation Exposure Event Using Early Gene Expression Changes
Radiological exposure scenarios involving large numbers of people require a rapid and high-throughput method to identify the unexposed and those exposed to low- and high-dose radiation. Those with high-dose exposure, e.g., >2 Gy and depending on host characteristics, may develop severe hematological acute radiation syndrome (HARS), requiring hospitalization and treatment. Previously, we identified a set of genes that discriminated these clinically relevant groups. In the current work, we examined the utility of gene expression changes to classify 1,000 split blood samples into HARS severity scores of H0, H1 and H2–4, with the latter indicating likely hospitalization. In several previous radiation dose experiments, we determined that these HARS categories corresponded, respectively, to doses of 0 Gy (unexposed), 0.5 Gy and 5 Gy. The main purpose of this work was to assess the rapidity of blood sample processing using targeted next-generation sequencing (NGS). Peripheral blood samples from two healthy donors were X-ray irradiated in vitro and incubated at 37°C for 24 h. A total of 1,000 samples were evaluated by laboratory personnel blinded to the radiation dose. Changes in gene expression of FDXR, DDB2, POU2AF1 and WNT3 were examined with qRT-PCR as positive controls. Targeted NGS (TREX) was used on all samples for the same four genes. Agreement between the two methods was almost 78%. Using NGS, all 1,000 samples were processed within 30 h. Classification of the HARS severity categories corresponding to radiation dose had an overall agreement ranging from 90% to 97%. Depending on the end point, either a combination of all genes or FDXR alone (H0 HARS or unexposed) provided the best classification. Using this optimized automated methodology, we assessed 100× more samples approximately three times faster compared to standard cytogenetic studies. We showed that a small set of genes, rather than a complex constellation of genes, provided robust positive (97%) and negative (97%) predictive values for HARS categories and radiation doses of 0, 0.5 and 5 Gy. The findings of this study support the potential utility of early radiation-induced gene expression changes for high-throughput biodosimetry and rapid identification of irradiated persons in need of hospitalization.
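The positive and negative predictive values quoted above follow directly from confusion counts; the counts in this sketch are hypothetical, chosen only to illustrate the arithmetic:

```python
def predictive_values(tp: int, fp: int, tn: int, fn: int) -> tuple[float, float]:
    """PPV = TP / (TP + FP); NPV = TN / (TN + FN)."""
    return tp / (tp + fp), tn / (tn + fn)

# Hypothetical triage outcome for 1,000 split samples (not the study's
# actual confusion counts): "positive" = classified H2-4, i.e., likely
# needing hospitalization.
ppv, npv = predictive_values(tp=320, fp=10, tn=650, fn=20)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")
```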
From genes to landscapes: Pattern formation and self‐regulation in raised bogs with an example from Tierra del Fuego
We studied a pristine, prominently patterned raised bog in Tierra del Fuego, Argentina, to disentangle the complex interactions among plants, water and peat. The studied bog lacks the complicating features often posed by other bogs. It is completely dominated by Sphagnum magellanicum, which covers all niches and growth forms and is joined by only a dozen higher plant species; it is entirely ombrotrophic, with very sharp borders to the surrounding fen; it has only one type of peat, which shows only a limited range in degree of decomposition; and it is situated in a very even climate with minimal differences in rainfall and temperature over the year. We present detailed measurements along a 498-m-long transect crossing the bog, including water table measurements (n = 498), contiguous vegetation relevés (n = 248), hydraulic conductivity just below the water table (n = 246), and hydraulic conductivity in 11 depth profiles (n = 291); the degree of humification of the corresponding peat was assessed in conjunction with the hydraulic conductivity measurements (n = 537). Sphagnum magellanicum moss samples were also collected every 2 m along this transect and genotyped (n = 242). In addition, along short, 26-m-long transects crossing strings and flarks, the water table and the hydraulic conductivity just below the water table were measured every meter. Sphagnum growth forms were assessed, and the vegetation of the entire bog was mapped in 10 × 10-m relevés (n = 3322). A simulation model was applied to a generalized shape of the bog and produced surface patterns that closely matched those seen in the field. The results were integrated with information from the literature and discussed in the framework of a self-regulating and self-organizing raised bog. We identified 19 hydrological feedback mechanisms. We found that the various mechanisms overlap in both space and time, which means there is redundancy in the self-regulation of the system. Raised bogs in a natural state are among the most resilient ecosystems known; this resilience is provided by feedbacks and by backup systems to these feedbacks.
A comparison of thyroidal protection by stable iodine or perchlorate in the case of acute or prolonged radioiodine exposure
In the case of a nuclear power plant accident, repetitive or prolonged radioiodine release may occur. Radioiodine accumulates in the thyroid and, by irradiation, enhances the risk of cancer. Large doses of non-radioactive iodine may protect the thyroid by inhibiting radioiodine uptake into the gland (iodine blockade). Protection is based on competition at the active carrier site in the cellular membrane and on the Wolff–Chaikoff effect, the latter being, however, only transient (24–48 h). Perchlorate may alternatively provide protection by the carrier competition mechanism only. Perchlorate has, however, a stronger affinity for the carrier than iodide. Based on an established biokinetic–dosimetric model developed to study iodine blockade, and after its extension to describe perchlorate pharmacokinetics and the inhibition of iodine transport through the carrier, we computed the protective efficacies that can be achieved by stable iodine or perchlorate in the case of acute or prolonged radioiodine exposure. In the case of acute radioiodine exposure, perchlorate is less potent than stable iodine considering its ED50. A dose of 100 mg stable iodine has roughly the same protective efficacy as 1000 mg perchlorate. For prolonged exposures, single doses of protective agents, whether stable iodine or perchlorate, offer substantially lower protection than after acute radioiodine exposure, and thus repetitive administrations seem necessary. In the case of prolonged exposure, the higher affinity of perchlorate for the carrier, in combination with the fading Wolff–Chaikoff effect of iodine, gives perchlorate a higher protective efficacy compared to stable iodine. Taking into account the frequency and seriousness of adverse effects, iodine and perchlorate at equieffective dosages seem to be alternatives in the case of short-term acute radioiodine exposure, whereas preference should be given to perchlorate in view of its higher protective efficacy in the case of longer-lasting radioiodine exposures.
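One way to see why repeated dosing matters for prolonged exposure is to let the blocker concentration decay and track the carrier occupancy it provides; every parameter below (half-life, inhibition constant, dose size) is a hypothetical placeholder, not taken from the paper's model:

```python
import math

# All parameters are hypothetical placeholders, not taken from the paper.
HALF_LIFE_H = 8.0   # assumed elimination half-life of the blocking agent
KI = 5.0            # assumed concentration giving 50% carrier occupancy
DECAY = math.log(2) / HALF_LIFE_H

def concentration(t_h: float, dose_times_h: list[float], dose: float) -> float:
    """Superpose exponentially decaying contributions from each dose."""
    return sum(dose * math.exp(-DECAY * (t_h - t0))
               for t0 in dose_times_h if t_h >= t0)

def carrier_block(conc: float) -> float:
    """Fraction of the carrier occupied by the blocker: c / (Ki + c)."""
    return conc / (KI + conc)

# Mean carrier blockade over a 7-day exposure: single dose vs. daily dosing.
hours = range(7 * 24)
single = sum(carrier_block(concentration(t, [0.0], 100.0)) for t in hours) / len(hours)
daily = sum(carrier_block(concentration(t, [24.0 * d for d in range(7)], 100.0))
            for t in hours) / len(hours)
print(f"single dose: {single:.2f}, daily dosing: {daily:.2f}")
```

Under these assumptions a single dose protects well only for the first day or so, while daily re-dosing keeps the mean occupancy high across the whole exposure window.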
Analyzing the impact of 900 MHz EMF short-term exposure to the expression of 667 miRNAs in human peripheral blood cells
More than ever before, people around the world are frequently exposed to different sections of the electromagnetic spectrum, mainly emitted from modern wireless communication technologies. In particular, the level of knowledge on non-thermal biological EMF effects remains controversial. New technologies allow a more detailed detection of non-coding RNAs, which affect post-transcriptional control. Such a method was applied in this work to investigate the response of human blood cells to electromagnetic irradiation. In this ex vivo/in vitro study, we exposed peripheral blood cells from 5 male donors to a continuous wave of 900 MHz EMF for 0, 30, 60 and 90 min. Significant microRNA (miRNA) expression changes (p ≤ 0.05) above or below the SHAM-exposed samples were evaluated using a quantitative real-time PCR platform for simultaneous detection of 667 miRNAs, called a low density array. Only significant miRNA expression changes that were detectable in at least 60% of the samples per exposure group were analyzed. The results were compared with data from room temperature + 2 °C (RT + 2 °C) samples (here referred to as hyperthermia) to exclude miRNA expression altered by hyperthermia. A validation study using the same donors and study design was performed after an interval of 2 years. When analyzing a total of 667 miRNAs during the screening study, 2 promising candidate miRNAs were identified, which were down-regulated almost twofold and showed a complete separation from the unexposed control group (miR-194 at 30 min and miR-939 at 60 min). The p-values even survived the Bonferroni correction for multiple comparisons (p = 0.0007 and p = 0.004, respectively). Neither of these miRNAs was differentially expressed at a second time point after EMF exposure. Following an alternative analysis approach, we searched for miRNAs showing a significant association of differential miRNA expression with the dose–time EMF exposure product, separately for each donor. Donors 2 and 3 revealed 11 and 10 miRNA species significantly associated with EMF exposure, which differed significantly from the other donors, who showed a smaller number of differentially expressed miRNAs; this identified donors 2 and 3 as particularly EMF-responsive. The measurements were repeated after 2 years. The number of expressed/non-expressed miRNAs was almost identical (97.4%), but neither the number nor the previously differentially expressed miRNAs could be reproduced. Our data support neither evidence of early changes at the miRNA expression level in human whole blood cells after 900 MHz EMF exposure nor the identification of EMF-responsive individuals.
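The Bonferroni step mentioned above reduces to comparing each raw p-value against α divided by the number of tests; how many tests enter the correction (all 667 assays or only an expressed subset) changes the verdict, so m is left as a parameter in this sketch:

```python
def survives_bonferroni(p_value: float, m: int, alpha: float = 0.05) -> bool:
    """Reject only if the raw p-value stays below alpha / m."""
    return p_value <= alpha / m

# The two candidate p-values quoted above, tested against two possible
# choices of m (a smaller expressed subset vs. all 667 assays):
for p in (0.0007, 0.004):
    print(p, survives_bonferroni(p, m=50), survives_bonferroni(p, m=667))
```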
Correlation of retinal sensitivity in microperimetry with vascular density in optical coherence tomography angiography in primary open-angle glaucoma
To evaluate the correlation between retinal sensitivity in microperimetry (MP) and vessel density (VD) using optical coherence tomography angiography (OCTA) in primary open-angle glaucoma (POAG). We enrolled 30 participants (52 eyes) with POAG and 15 participants (23 eyes) in the healthy control group. All participants were examined for retinal structure using OCTA to assess VD and spectral-domain OCT (SD-OCT) to assess ganglion cell complex (GCC) and peripapillary retinal nerve fiber layer (pRNFL) thickness. Retinal sensitivity was tested with MP and standard automatic perimetry (SAP). The VD in moderate/severe POAG was lower than that in mild POAG and healthy controls in the macular superficial vascular plexus (SVP) (38.7±6.3% vs. 42.9±5.2% and 49.7±2.6%, respectively, P<0.001) and in the peripapillary radial peripapillary capillaries (pRPC) (36.4±5.7% vs. 43.6±6.6% and 49.1±2.4%, respectively, P<0.05). The relationship between microvascular damage in the whole macular SVP and the decrease in MP average sensitivity threshold is stronger than that for the pRNFL thickness measurements and SAP parameters. OCTA and MP are valuable techniques that allow clinical monitoring of structural and functional changes in glaucomatous eyes.
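The correlation analysis reduces to pairing each eye's MP mean sensitivity with its VD and computing a correlation coefficient; the values in this sketch are synthetic placeholders, not patient data:

```python
import numpy as np
from scipy import stats

# Synthetic per-eye measurements (placeholders, not study data):
mp_sensitivity_db = np.array([24.1, 22.3, 18.9, 15.4, 12.8, 10.2])  # microperimetry
vd_svp_percent = np.array([49.5, 46.2, 42.8, 39.1, 37.0, 35.5])     # OCTA vessel density

r, p = stats.pearsonr(mp_sensitivity_db, vd_svp_percent)
print(f"Pearson r = {r:.2f}, P = {p:.4f}")
```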