71 result(s) for "Janssen, Eric D."
Assessment of Seed Germinability after Prolonged Seed Storage for Synthyris bullii (Plantaginaceae), a Rare Endemic of the Midwestern U.S.A
Synthyris bullii (Plantaginaceae; Besseya bullii; kittentails) is a rare endemic of the Midwestern U.S.A. Although seed germination studies have been conducted for this species, limited information is available regarding seed viability after prolonged seed storage. The main goal of this study was to re-assess the germinability of Synthyris bullii seeds collected from 2008 to 2011 across its range. Overall, we found that, regardless of seed age, locality (i.e., site and state), and maternal environment (i.e., open, semi-shaded, and shaded), seeds are still viable. Although values for seed germination and seedling survivorship were low, the results of this study are encouraging and point to seed harvesting and storage as one way to preserve this species.
Tracking species recovery status to improve U.S. endangered species act decisions
Currently 1677 species are listed under the U.S. Endangered Species Act (ESA), yet only a small percentage have been delisted due to recovery. In the fall of 2021, the U.S. Fish and Wildlife Service proposed delisting 23 species due to extinction. Tracking changes in species' recovery status over time is critical to understanding species' statuses, informing adaptive management strategies, and assessing the performance of the ESA in preventing further species loss. In this paper, we describe four key obstacles to tracking species recovery status under the ESA. First, ESA 5-year reviews lack a standardized format and clear documentation. Second, despite having been listed for decades, many species still suffer major data gaps in their biology and threats, rendering it difficult if not impossible to track progress towards recovery. Third, many species have continued declining after listing, yet given the above (1 & 2), understanding potential causes (proximate and/or ultimate) can be difficult. Fourth, many species currently have no clear path to recovery, which represents a potential failing of the process. We conclude with a discussion of potential policy responses that could be pursued to enhance the efficacy of the ESA. The U.S. Endangered Species Act protects about 1700 species, but no concise, standardized metrics exist for assessing changes in species recovery status. We helped develop and test novel metrics that track changes in recovery status using six components. Our testing also revealed several key challenges to species recovery.
Treatment of HCV Infection by Targeting MicroRNA
In this phase 2 trial, an antisense oligonucleotide was tested in the treatment of chronic hepatitis C virus infection. The oligonucleotide was designed to bind to and sequester a microRNA required for HCV replication. Approximately 170 million persons worldwide are chronically infected with the hepatitis C virus (HCV).1 Chronic HCV infection is a major cause of liver cirrhosis, liver failure, and hepatocellular carcinoma and is the leading indication for liver transplantation in many Western countries.2 Sustained eradication of HCV infection has been associated with a reduced risk of liver-related morbidity and all-cause mortality.3–5 Despite the recent registration of protease inhibitors for the treatment of chronic HCV genotype 1 infection, current therapeutic regimens remain dependent on the administration of pegylated interferon and ribavirin for 24 to 48 weeks.6,7 Thus, anti-HCV therapy continues to …
High School Football and Late-Life Risk of Neurodegenerative Syndromes, 1956-1970
To assess whether athletes who played American varsity high school football between 1956 and 1970 have an increased risk of neurodegenerative diseases later in life. We identified all male varsity football players between 1956 and 1970 in the public high schools of Rochester, Minnesota, and non–football-playing male varsity swimmers, wrestlers, and basketball players. Using the medical records linkage system of the Rochester Epidemiology Project, we ascertained the incidence of late-life neurodegenerative diseases: dementia, parkinsonism, and amyotrophic lateral sclerosis. We also recorded medical record–documented head trauma during high school years. We identified 296 varsity football players and 190 athletes engaging in other sports. Football players had an increased risk of medically documented head trauma, especially if they played football for more than 1 year. Compared with nonfootball athletes, football players did not have an increased risk of neurodegenerative disease overall or of the individual conditions of dementia, parkinsonism, and amyotrophic lateral sclerosis. In this community-based study, varsity high school football players from 1956 to 1970 did not have an increased risk of neurodegenerative diseases compared with athletes engaged in other varsity sports. This was from an era when there was a generally nihilistic view of concussion dangers, less protective equipment, and no prohibition of spearing (head-first tackling). However, the size and strength of players from previous eras may not be comparable with that of current high school athletes.
Prednisolone Attenuates Improvement of Cardiac and Skeletal Contractile Function and Histopathology by Lisinopril and Spironolactone in the mdx Mouse Model of Duchenne Muscular Dystrophy
Duchenne muscular dystrophy (DMD) is an inherited disease that causes striated muscle weakness. Recently, we showed therapeutic effects of the combination of lisinopril (L), an angiotensin converting enzyme (ACE) inhibitor, and spironolactone (S), an aldosterone antagonist, in mice lacking dystrophin and haploinsufficient for utrophin (utrn(+/-);mdx, het mice); both cardiac and skeletal muscle function and histology were improved when these mice were treated early with LS. It was unknown to what extent LS treatment is effective in the most commonly used DMD murine model, the mdx mouse. In addition, current standard-of-care treatment for DMD is limited to corticosteroids. Therefore, potentially useful alternative or additive drugs need to be both compared directly to corticosteroids and tested in the presence of corticosteroids. We evaluated the effectiveness of this LS combination in the mdx mouse model, both in comparison with corticosteroid treatment (prednisolone, P) and in combination with it (LSP). We also tested an additional combinatorial treatment containing the angiotensin II receptor blocker losartan (T), which is widely used to halt and treat the developing cardiac dysfunction in DMD patients as an alternative to an ACE inhibitor. Peak myocardial strain rate, assessed by magnetic resonance imaging, showed a negative impact of P, whereas contractile function in both diaphragm and extensor digitorum longus (EDL) muscle was not significantly impaired by P. Histologically, P generally increased cardiac damage, estimated by percentage area infiltrated by IgG as well as by collagen staining. In general, groups that only differed in the presence or absence of P (i.e. mdx vs. P, LS vs. LSP, and TS vs. TSP) demonstrated a significant detrimental impact of P on many assessed parameters, with the most profound impact on cardiac pathology.
Longitudinal associations of circadian eating patterns with sleep quality, fatigue and inflammation in colorectal cancer survivors up to 24 months post-treatment
Fatigue and insomnia, potentially induced by inflammation, are distressing symptoms experienced by colorectal cancer (CRC) survivors. Emerging evidence suggests that, besides nutritional quality and quantity, the timing, frequency and regularity of dietary intake (chrono-nutrition) could also be important for alleviating these symptoms. We investigated longitudinal associations of circadian eating patterns with sleep quality, fatigue and inflammation in CRC survivors. In a prospective cohort of 459 stage I-III CRC survivors, four repeated measurements were performed between 6 weeks and 24 months post-treatment. Chrono-nutrition variables included meal energy contribution, frequency (a maximum of six meals could be reported each day), irregularity and time window (TW) of energetic intake, operationalised based on 7-d dietary records. Outcomes included sleep quality, fatigue and plasma concentrations of inflammatory markers. Longitudinal associations of chrono-nutrition variables with outcomes from 6 weeks until 24 months post-treatment were analysed by confounder-adjusted linear mixed models, including hybrid models to disentangle intra-individual changes from inter-individual differences over time. An hour longer TW of energetic intake between individuals was associated with less fatigue (β: −6·1; 95 % CI (−8·8, −3·3)) and insomnia (β: −4·8; 95 % CI (−7·4, −2·1)). A higher meal frequency of on average 0·6 meals/d between individuals was associated with less fatigue (β: −3·7; 95 % CI (−6·6, −0·8)). An hour increase in TW of energetic intake within individuals was associated with less insomnia (β: −3·0; 95 % CI (−5·2, −0·8)) and inflammation (β: −0·1; 95 % CI (−0·1, 0·0)). Our results suggest that longer TWs of energetic intake and higher meal frequencies may be associated with less fatigue, insomnia and inflammation among CRC survivors. Future studies with larger contrasts in chrono-nutrition variables are needed to confirm these findings.
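The hybrid mixed-model approach mentioned in the abstract above (separating within-person changes from between-person differences) is commonly implemented by person-mean centering the time-varying predictor. Below is a minimal sketch with simulated data; the variable names, effect sizes, and the choice of statsmodels are illustrative assumptions, not the study's actual analysis code:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_persons, n_visits = 60, 4

# Simulated repeated measures: eating time window (h) and a fatigue score
df = pd.DataFrame({
    "id": np.repeat(np.arange(n_persons), n_visits),
    "tw": rng.normal(11, 2, n_persons * n_visits),
})
df["fatigue"] = 50 - 3 * df["tw"] + rng.normal(0, 5, len(df))

# Hybrid decomposition: between-person mean vs. within-person deviation
df["tw_between"] = df.groupby("id")["tw"].transform("mean")
df["tw_within"] = df["tw"] - df["tw_between"]

# Random-intercept model with both components as fixed effects
model = smf.mixedlm("fatigue ~ tw_between + tw_within",
                    df, groups=df["id"]).fit()
print(model.params[["tw_between", "tw_within"]])
```

With this decomposition, the `tw_between` coefficient captures inter-individual differences and `tw_within` captures intra-individual change over time, mirroring the two kinds of associations reported in the abstract.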
Longitudinal Associations of Adherence to the Dietary World Cancer Research Fund/American Institute for Cancer Research (WCRF/AICR) and Dutch Healthy Diet (DHD) Recommendations with Plasma Kynurenines in Colorectal Cancer Survivors after Treatment
The tryptophan-kynurenine pathway has been linked to cancer aetiology and survivorship, and diet potentially affects metabolites of this pathway, but evidence to date is scarce. Among 247 stage I-III CRC survivors, repeated measurements were performed at 6 weeks, 6 months, and 1 year post-treatment. Adherence to the World Cancer Research Fund/American Institute for Cancer Research (WCRF/AICR) and Dutch Healthy Diet (DHD) recommendations was operationalized using seven-day dietary records. Nine plasma kynurenine-pathway metabolites were analysed. Longitudinal associations of adherence to these dietary patterns with plasma kynurenines were analysed using confounder-adjusted linear mixed models. In general, higher adherence to the dietary WCRF/AICR and DHD recommendations was associated with lower concentrations of kynurenines with pro-oxidative, pro-inflammatory, and neurotoxic properties (3-hydroxykynurenine (HK) and quinolinic acid (QA)), and higher concentrations of kynurenines with anti-oxidative, anti-inflammatory, and neuroprotective properties (kynurenic acid (KA) and picolinic acid (Pic)), but associations were weak and not statistically significant. Statistically significant positive associations between individual recommendations and kynurenines were observed for: nuts with kynurenic-acid-to-quinolinic-acid ratio (KA/QA); alcohol with KA/QA, KA, and xanthurenic acid (XA); red meat with XA; and cheese with XA. Statistically significant inverse associations were observed for: nuts with kynurenine-to-tryptophan ratio (KTR) and hydroxykynurenine ratio; alcohol with KTR; red meat with 3-hydroxyanthranilic-to-3-hydroxykynurenine ratio; ultra-processed foods with XA and KA/QA; and sweetened beverages with KA/QA.
Our findings suggest that CRC survivors might benefit from adhering to the dietary WCRF/AICR and DHD recommendations in the first year after treatment, as higher adherence to these dietary patterns is generally, though weakly, associated with more favourable concentrations of kynurenines and their ratios. These results need to be validated in other studies.
A medical-toxicological view of tattooing
Long perceived as a form of exotic self-expression in some social fringe groups, tattoos have left their maverick image behind and become mainstream, particularly for young people. Historically, tattoo-related health and safety regulations have focused on rules of hygiene and prevention of infections. Meanwhile, the increasing popularity of tattooing has led to the development of many new colours, allowing tattoos to be more spectacular than ever before. However, little is known about the toxicological risks of the ingredients used. For risk assessment, safe intradermal application of these pigments needs data for toxicity and biokinetics and increased knowledge about the removal of tattoos. Other concerns are the potential for phototoxicity, substance migration, and the possible metabolic conversion of tattoo ink ingredients into toxic substances. Similar considerations apply to cleavage products that are formed during laser-assisted tattoo removal. In this Review, we summarise the issues of concern, putting them into context, and provide perspectives for the assessment of the acute and chronic health effects associated with tattooing.
Diurnal variability, photochemical production and loss processes of hydrogen peroxide in the boundary layer over Europe
Hydrogen peroxide (H2O2) plays a significant role in the oxidizing capacity of the atmosphere. It is an efficient oxidant in the liquid phase and serves as a temporary reservoir for the hydroxyl radical (OH), the most important oxidizing agent in the gas phase. Due to its high solubility, removal of H2O2 due to wet and dry deposition is efficient, being a sink of HOx (OH+HO2) radicals. In the continental boundary layer, the H2O2 budget is controlled by photochemistry, transport and deposition processes. Here we use in situ observations of H2O2 and account for chemical source and removal mechanisms to study the interplay between these processes. The data were obtained during five ground-based field campaigns across Europe from 2008 to 2014 and bring together observations in a boreal forest, two mountainous sites in Germany, and coastal sites in Spain and Cyprus. Most campaigns took place in the summer, while the measurements in the south-west of Spain took place in early winter. Diel variations in H2O2 are strongly site-dependent and indicate a significant altitude dependence. While boundary-layer mixing ratios of H2O2 at low-level sites show classical diel cycles with the lowest values in the early morning and maxima around local noon, diel profiles are reversed on mountainous sites due to transport from the nocturnal residual layer and the free troposphere. The concentration of hydrogen peroxide is largely governed by its main precursor, the hydroperoxy radical (HO2), and shows significant anti-correlation with nitrogen oxides (NOx) that remove HO2. A budget calculation indicates that in all campaigns, the noontime photochemical production rate through the self-reaction of HO2 radicals was much larger than photochemical loss due to reaction with OH and photolysis, and that dry deposition is the dominant loss mechanism. 
Estimated dry deposition velocities varied between approximately 1 and 6 cm s⁻¹, with relatively high values observed during the day in forested regions, indicating enhanced uptake of H2O2 by vegetation. In order to reproduce the change in H2O2 mixing ratios between sunrise and midday, a variable contribution from transport (10 %–100 %) is required to balance net photochemical production and deposition loss. Transport is most likely related to entrainment from the residual layer above the nocturnal boundary layer during the growth of the boundary layer in the morning.
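The budget calculation described in the abstract above can be summarized schematically as production by the HO2 self-reaction balanced against loss by reaction with OH, photolysis, and dry deposition, plus a transport term. The notation below is generic, not necessarily that of the study:

```latex
\frac{d[\mathrm{H_2O_2}]}{dt} =
  k_{\mathrm{HO_2+HO_2}}\,[\mathrm{HO_2}]^2
  - \left( k_{\mathrm{OH}}\,[\mathrm{OH}]
         + j_{\mathrm{H_2O_2}}
         + \frac{v_d}{h} \right) [\mathrm{H_2O_2}]
  + T
```

Here \(v_d\) is the dry deposition velocity, \(h\) the boundary-layer height, \(j_{\mathrm{H_2O_2}}\) the photolysis frequency, and \(T\) the net transport contribution (e.g. entrainment from the residual layer).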
Treatment interval in curative treatment of colon cancer, does it impact (cancer free) survival? A non-inferiority analysis
Background In treatment of colon cancer, strict waiting-time targets are enforced, leaving professionals no room to lengthen treatment intervals when advisable, for instance to optimise a patient's health status by means of prehabilitation. Good-quality studies supporting these targets are lacking. With this study we aim to establish whether a prolonged treatment interval is associated with a clinically relevant deterioration in overall and cancer free survival. Methods This retrospective multicenter non-inferiority study includes all consecutive patients who underwent elective oncological resection of a biopsy-proven primary non-metastatic colon carcinoma between 2010 and 2016 in six hospitals in the Southern Netherlands. Treatment interval was defined as the time between diagnosis and surgical treatment. Cut-off points for treatment interval were ≤35 days and ≤49 days. Findings 3376 patients were included. Cancer recurred in 505 patients (15.0%). For cancer free survival, a treatment interval >35 days and >49 days was non-inferior to a treatment interval ≤35 days. Results for overall survival were inconclusive, but no association was found. Conclusion For cancer free survival, a prolonged treatment interval, even over 49 days, is non-inferior to the currently set waiting-time target of ≤35 days. Therefore, the waiting-time targets set as a fundamental objective in current treatment guidelines should become directional instead of strict targets.
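A non-inferiority conclusion of the kind reported above is typically reached by checking that the confidence interval for the effect estimate (e.g. a hazard ratio for recurrence in the longer-interval group) stays below a pre-specified margin. A minimal sketch with made-up numbers; the estimate, standard error, and margin are illustrative assumptions, not figures from the study:

```python
import math

# Illustrative inputs (not from the study): hazard ratio on the log scale
log_hr, se = math.log(1.05), 0.08   # point estimate and its standard error
margin = 1.25                       # pre-specified non-inferiority margin

# Upper bound of the two-sided 95% CI for the hazard ratio
upper = math.exp(log_hr + 1.96 * se)

# Non-inferiority is declared if the whole CI lies below the margin
non_inferior = upper < margin
print(round(upper, 3), non_inferior)
```

The same logic applies whether the margin is expressed on a hazard-ratio or an absolute-difference scale; only the pre-specified margin and the estimate change.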