63 result(s) for "Kirk, Michele"
Serum Neurofilament Light in American Football Athletes over the Course of a Season
Despite being underreported, American football has the highest incidence of concussion among all team sports, likely due to exposure to head impacts that vary in number and magnitude over the season. This study compared a biological marker of head trauma in American football athletes with non-contact sport athletes and examined changes over the course of a season. Baseline serum neurofilament light polypeptide (NFL) was measured after 9 weeks of no contact and compared with a non-contact sport. Serum NFL was then measured over the course of the entire season at eight time-points coincident with expected changes in the likelihood of increased head impacts. Data were compared between starters (n = 11) and non-starters (n = 9). Compared with non-starters (mean ± standard deviation, 7.30 ± 3.57 pg·mL⁻¹) and controls (6.75 ± 1.68 pg·mL⁻¹), serum NFL in starters (8.45 ± 5.90 pg·mL⁻¹) was higher at baseline (mean difference; ±90% confidence interval: 1.69; ±1.96 pg·mL⁻¹ and 1.15; ±1.40 pg·mL⁻¹, respectively). Over the course of the season, an increase (effect size [ES] = 1.8; p < 0.001) was observed post-camp relative to baseline (1.52 ± 1.18 pg·mL⁻¹), which remained elevated until conference play, when a second increase was observed (ES = 2.6; p = 0.008) over baseline (4.82 ± 2.64 pg·mL⁻¹). A lack of change in non-starters resulted in substantial differences between starters and non-starters over the course of the season. These data suggest that a season of collegiate American football is associated with elevations in serum NFL, indicative of axonal injury, as a result of head impacts.
A Season of American Football Is Not Associated with Changes in Plasma Tau
American football athletes are routinely exposed to sub-concussive impacts over the course of the season. This study sought to examine the effect of a season of American football on plasma tau, a potential marker of axonal damage. Nineteen National Collegiate Athletic Association (NCAA) football athletes underwent serial blood sampling over the course of the 2014–2015 season at those times in which the number and magnitude of head impacts likely changed. Non-contact sport controls (NCAA men's swim athletes; n = 19) provided a single plasma sample for comparison. No significant differences were observed between control swim athletes and football athletes following a period of non-contact (p = 0.569) or a period of contact (p = 0.076). Football athletes categorized as starters (n = 11) had higher tau concentrations than non-starters (n = 8) following a period of non-contact (p = 0.039) and contact (p = 0.036), but not higher than swimmers (p = 1.000 and p = 1.000, respectively). No difference was noted over the course of the season in football athletes, irrespective of starter status. Despite routine head impacts common to the sport of American football, no changes were observed over the course of the season in football athletes, irrespective of starter status. Further, no difference was observed between football athletes and non-contact control swim athletes following a period of non-contact or contact. These data suggest that plasma tau is not sensitive enough to detect damage associated with repetitive sub-concussive impacts sustained by collegiate-level football athletes.
Prevalence and clinical implications of persistent or exertional cardiopulmonary symptoms following SARS-CoV-2 infection in 3597 collegiate athletes: a study from the Outcomes Registry for Cardiac Conditions in Athletes (ORCCA)
Objective: To assess the prevalence and clinical implications of persistent or exertional cardiopulmonary symptoms in young competitive athletes following SARS-CoV-2 infection. Methods: This observational cohort study from the Outcomes Registry for Cardiac Conditions in Athletes included 3597 US collegiate athletes after SARS-CoV-2 infection. Clinical characteristics, advanced diagnostic testing and SARS-CoV-2-associated sequelae were compared between athletes with persistent symptoms >3 weeks, exertional symptoms on return to exercise and those without persistent or exertional symptoms. Results: Among 3597 athletes (mean age 20 years (SD, 1 year), 34% female), data on persistent and exertional symptoms were reported in 3529 and 3393 athletes, respectively. Persistent symptoms >3 weeks were present in 44/3529 (1.2%) athletes, with 2/3529 (0.06%) reporting symptoms >12 weeks. Exertional cardiopulmonary symptoms were present in 137/3393 (4.0%) athletes. Clinical evaluation and diagnostic testing led to the diagnosis of SARS-CoV-2-associated sequelae in 12/137 (8.8%) athletes with exertional symptoms (five cardiac involvement, two pneumonia, two inappropriate sinus tachycardia, two postural orthostatic tachycardia syndrome and one pleural effusion). No SARS-CoV-2-associated sequelae were identified in athletes with isolated persistent symptoms. Of athletes with chest pain on return to exercise who underwent cardiac MRI (CMR), 5/24 (20.8%) had probable or definite cardiac involvement. In contrast, no athlete with exertional symptoms without chest pain who underwent CMR (0/20) was diagnosed with probable or definite SARS-CoV-2 cardiac involvement. Conclusion: Collegiate athletes with SARS-CoV-2 infection have a low prevalence of persistent or exertional symptoms on return to exercise. Exertional cardiopulmonary symptoms, specifically chest pain, warrant a comprehensive evaluation.
Time from Start of Quarantine to SARS-CoV-2 Positive Test Among Quarantined College and University Athletes — 17 States, June–October 2020
To safely resume sports, college and university athletic programs and regional athletic conferences created plans to mitigate transmission of SARS-CoV-2, the virus that causes coronavirus disease 2019 (COVID-19). Mitigation measures included physical distancing, universal masking, and maximizing outdoor activity during training; routine testing; 10-day isolation of persons with COVID-19; and 14-day quarantine of athletes identified as close contacts* of persons with confirmed COVID-19. Regional athletic conferences created testing and quarantine policies based on National Collegiate Athletic Association (NCAA) guidance (1); testing policies varied by conference, school, and sport. To improve compliance with quarantine and reduce the personal and economic burden of quarantine adherence, the quarantine period has been reduced in several countries from 14 days to as few as 5 days with testing (2) or 10 days without testing (3). Data on quarantined athletes participating in NCAA sports were used to characterize COVID-19 exposures and assess the amount of time between quarantine start and first positive SARS-CoV-2 test result. Despite the potential risk for transmission from frequent, close contact associated with athletic activities (4), more athletes reported exposure to COVID-19 at social gatherings (40.7%) and from roommates (31.7%) than they did from exposures associated with athletic activities (12.7%). Among 1,830 quarantined athletes, 458 (25%) received positive reverse transcription-polymerase chain reaction (RT-PCR) test results during the 14-day quarantine, with a mean of 3.8 days from quarantine start (range = 0-14 days) until the positive test result. Among athletes who had not received a positive test result by quarantine day 5, the probability of having a positive test result decreased from 27% after day 5 to <5% after day 10. 
These findings support new guidance from CDC (5) in which different options are provided to shorten quarantine for persons such as collegiate athletes, especially if doing so will increase compliance, balancing the reduced duration of quarantine against a small but nonzero risk for postquarantine transmission. Improved adherence to mitigation measures (e.g., universal masking, physical distancing, and hand hygiene) at all times could further reduce exposures to SARS-CoV-2 and disruptions to athletic activities because of infections and quarantine (1,6).
On Thin Ice: Bureaucratic Processes of Monetary Sanctions and Job Insecurity
Research on court-imposed monetary sanctions has not yet fully examined the impact that processes used to manage court debt have on individuals' lives. Drawing from both interviews and ethnographic data in Illinois and Washington State, we examine how the court's management of justice-related debt affects labor market experiences. We conceptualize these managerial practices as procedural pressure points, or mechanisms embedded within these processes that strain individuals' ability to access and maintain stable employment. We find that, as a result, courts undermine their own goal of recouping costs and trap individuals in a cycle of court surveillance.
Networks and landscapes: a framework for setting goals and evaluating performance at the large landscape scale
The objective of large landscape conservation is to mitigate complex ecological problems through interventions at multiple and overlapping scales. Implementation requires coordination among a diverse network of individuals and organizations to integrate local‐scale conservation activities with broad‐scale goals. This requires an understanding of the governance options and how governance regimes achieve objectives or provide performance evaluation across both space and time. However, empirical assessments measuring network‐governance performance in large landscape conservation are limited. We describe a well‐established large landscape conservation network in North America, the Roundtable on the Crown of the Continent, to explore the application of a social–ecological performance evaluation framework. Systematic approaches to setting goals, tracking progress, and collecting data for feedback can help guide adaptation. Applying the established framework to our case study provides a means of evaluating the effectiveness of network governance in large landscape conservation.
A Simple Genetic Architecture Underlies Morphological Variation in Dogs
Domestic dogs exhibit tremendous phenotypic diversity, including a greater variation in body size than any other terrestrial mammal. Here, we generate a high-density map of canine genetic variation by genotyping 915 dogs from 80 domestic dog breeds, 83 wild canids, and 10 outbred African shelter dogs across 60,968 single-nucleotide polymorphisms (SNPs). Coupling this genomic resource with external measurements from breed standards and individuals as well as skeletal measurements from museum specimens, we identify 51 regions of the dog genome associated with phenotypic variation among breeds in 57 traits. The complex traits include average breed body size and external body dimensions and cranial, dental, and long bone shape and size with and without allometric scaling. In contrast to the results from association mapping of quantitative traits in humans and domesticated plants, we find that across dog breeds, a small number of quantitative trait loci (≤3) explain the majority of phenotypic variation for most of the traits we studied. In addition, many genomic regions show signatures of recent selection, with most of the highly differentiated regions being associated with breed-defining traits such as body size, coat characteristics, and ear floppiness. Our results demonstrate the efficacy of mapping multiple traits in the domestic dog using a database of genotyped individuals and highlight the important role human-directed selection has played in altering the genetic architecture of key traits in this important species.
Height-reducing variants and selection for short stature in Sardinia
Francesco Cucca, David Schlessinger, John Novembre, Gonçalo Abecasis and colleagues present sequencing-based whole-genome association analyses for stature in Sardinia and identify two variants that lead to reduced height. Their findings suggest that shorter stature was selected for in Sardinia. We report sequencing-based whole-genome association analyses to evaluate the impact of rare and founder variants on stature in 6,307 individuals on the island of Sardinia. We identify two variants with large effects. One variant, which introduces a stop codon in the GHR gene, is relatively frequent in Sardinia (0.87% versus <0.01% elsewhere) and in the homozygous state causes Laron syndrome involving short stature. We find that this variant reduces height in heterozygotes by an average of 4.2 cm (−0.64 s.d.). The other variant, in the imprinted KCNQ1 gene (minor allele frequency (MAF) = 7.7% in Sardinia versus <1% elsewhere) reduces height by an average of 1.83 cm (−0.31 s.d.) when maternally inherited. Additionally, polygenic scores indicate that known height-decreasing alleles are at systematically higher frequencies in Sardinians than would be expected by genetic drift. The findings are consistent with selection for shorter stature in Sardinia and a suggestive human example of the proposed 'island effect' reducing the size of large mammals.