9,610 results for "Football - statistics"
Does reducing the height of the tackle through law change in elite men’s rugby union (The Championship, England) reduce the incidence of concussion? A controlled study in 126 games
Objectives: Most concussions in rugby union occur during tackles. We investigated whether legislation to lower maximum tackle height would change tackle behaviour and reduce concussion incidence rate. Methods: In an observational evaluation using a prospective cohort design, 12 elite men’s teams played in two competitions during the 2018/2019 season. The Championship (90 games) retained standard Laws of Rugby for the tackle; the Championship Cup (36 games) used revised laws: the maximum tackle height was lowered from the line of the shoulders on the ball carrier to the line of the armpits. Videos of tackles were analysed for ball carrier and tackler behaviour. Injury data were collected using standardised methods. Results: In the lowered tackle height setting, there was a significantly lower proportion of tackles: (1) in which ball carriers (rate ratio (RR) 0.83, 95% CI 0.79 to 0.86) and tacklers (RR 0.80, 95% CI 0.76 to 0.84) were upright, (2) in which the tackler’s initial contact was to the ball carrier’s head or neck (RR 0.70, 95% CI 0.58 to 0.84) and (3) in which initial contact was above the line of the ball carrier’s armpit (RR 0.84, 95% CI 0.80 to 0.88). Concussion incidence rate did not differ between conditions (RR 1.31, 95% CI 0.85 to 2.01). Unexpectedly, compared with the standard tackle height setting, tacklers in the lowered tackle height setting were themselves concussed at a higher rate as measured by: (1) incidence (RR 1.90, 95% CI 1.05 to 3.45) and (2) concussions per 1000 tackles (2.09, 95% CI 1.15 to 3.80). Conclusions: Legislating to lower the height of the tackle meant that tacklers made contact with the ball carrier’s head and neck 30% less often. This did not influence concussion incidence rates. Tacklers in the lowered tackle height setting suffered more concussions than did tacklers in the standard tackle height setting.
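The rate ratios and confidence intervals quoted in abstracts like this one follow the standard log-normal approximation for Poisson incidence rates; a minimal sketch with hypothetical event counts and exposures (the abstract reports only the ratios, so the inputs below are illustrative):

```python
import math

def rate_ratio(events_a, exposure_a, events_b, exposure_b, z=1.96):
    """Incidence rate ratio with a log-normal 95% CI.

    SE(log RR) = sqrt(1/events_a + 1/events_b), the usual Poisson
    approximation used for injury-incidence comparisons.
    """
    rr = (events_a / exposure_a) / (events_b / exposure_b)
    se = math.sqrt(1 / events_a + 1 / events_b)
    lo = rr * math.exp(-z * se)
    hi = rr * math.exp(z * se)
    return rr, lo, hi

# Hypothetical counts: 20 concussions in 1000 player-hours versus
# 15 concussions in 980 player-hours (illustrative only).
rr, lo, hi = rate_ratio(20, 1000, 15, 980)
print(f"RR {rr:.2f}, 95% CI {lo:.2f} to {hi:.2f}")
```

A confidence interval that spans 1.0, as in the overall concussion comparison above (RR 1.31, 95% CI 0.85 to 2.01), indicates no detectable difference between conditions.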
The Faroe Islands COVID-19 Recreational Football Study: Player-to-Player Distance, Body-to-Body Contact, Body-to-Ball Contact and Exercise Intensity during Various Types of Football Training for Both Genders and Various Age Groups
We determined player-to-player distance, body-to-ball contact, and exercise intensity during three training modalities in various football populations. 213 participants were recruited, ranging from 9-year-old boys to young men and 11-year-old girls to middle-aged women. All groups were analysed with video-filming and GPS-based Polar Pro monitors during three types of football training for 20 min, i.e., COVID-19-modified training (CMT) with >2-metre player-to-player distance, small-sided games (SSG), and simulated match-play with normal rules (SMP), in randomised order. Time spent in a danger zone (1.5 m) per-percent-infected-player (DZ PPIP) ranged from 0.015 to 0.279% of playing time. DZ PPIP for SSG was higher (P<0.05) than CMT and SMP. The average number of contacts (within 1.5 m) with a potentially infected player ranged from 12 to 73 contacts/hour. SSG had more (P<0.05) contacts than CMT and SMP, with SMP having a higher (P<0.05) number of contacts than CMT. Time/contact ranged from 0.87 to 3.00 seconds for the groups. No player-to-player or body-to-ball touches were registered for CMT. Total player-to-player contacts were 264% higher (P<0.05) in SSG than SMP, ranging from 80 to 170 and 25 to 56 touches, respectively. In all groups, a greater total distance was covered during SMP compared to CMT (38–114%; P<0.05). All groups performed more high-intensity running (33–54%; P<0.05) and had higher heart rates during SMP compared to CMT. All types of football training appear to carry only a minor COVID-19 infection risk; however, COVID-19-modified training may be safer than both small-sided game training and match-play. In contrast, exercise intensity is lower during COVID-19-modified training than match-play.
Does player time-in-game affect tackle technique in elite level rugby union?
It has been hypothesised that fatigue may be a major factor in tackle-related injury risk in rugby union and hence more injuries occur in the later stages of a game. The aim of this study is to identify changes in ball carrier or tackler proficiency characteristics, using elite level match video data, as player time-in-game increases. Qualitative observational cohort study. Three 2014/15 European Rugby Champions Cup games were selected for ball carrier and tackler proficiency analysis. Analysis was only conducted on players who started and remained on the field for the entire game. A separate analysis was conducted on 10 randomly selected 2014/15 European Rugby Champions Cup/Pro 12 games to assess the time distribution of tackles throughout a game. A Chi-square test and one-way ANOVA with post-hoc testing were conducted to identify significant differences (p<0.05) for proficiency characteristics and tackle counts between quarters in the game, respectively. Player time-in-game did not affect tackle proficiency for either the ball carrier or the tackler. Any results that showed statistical significance did not indicate a trend of deterioration in proficiency with increased player time-in-game. The time distribution analysis indicated that more tackles occurred in the final quarter of the game than in the first (p=0.04) and second (p<0.01) quarters. It appears that player time-in-game does not affect tackler or ball carrier tackle technique proficiency at the elite level. More tackles occurring in the final quarter of a game provides an alternative explanation for more tackle-related injuries occurring at this stage.
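The quarter-by-quarter tackle-count comparison described above is a chi-square goodness-of-fit test against a uniform expectation; a short sketch with hypothetical per-quarter counts (the abstract does not report the raw numbers, so these are illustrative):

```python
def chi_square_uniform(observed):
    """Chi-square goodness-of-fit statistic against equal
    expected counts in each category (here, game quarters)."""
    expected = sum(observed) / len(observed)
    return sum((o - expected) ** 2 / expected for o in observed)

# Hypothetical tackle counts for quarters 1-4 (illustrative only).
tackles_per_quarter = [100, 110, 130, 160]
stat = chi_square_uniform(tackles_per_quarter)

# The critical value for df = 3 at p = 0.05 is 7.815; a statistic
# above it indicates tackles are not spread evenly across quarters.
print(f"chi-square = {stat:.2f}, df = 3")
```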
The association between adolescent football participation and early adulthood depression
Concerned about potentially increased risk of neurodegenerative disease, several health professionals and policy makers have proposed limiting or banning youth participation in American-style tackle football. Given the large affected population (over 1 million boys play high school football annually), careful estimation of the long-term health effects of playing football is necessary for developing effective public health policy. Unfortunately, existing attempts to estimate these effects tend not to generalize to current participants because they either studied a much older cohort or, more seriously, failed to account for potential confounding. We leverage data from a nationally representative cohort of American men who were in grades 7-12 in the 1994-95 school year to estimate the effect of playing football in adolescence on depression in early adulthood. We control for several potential confounders related to subjects' health, behavior, educational experience, family background, and family health history through matching and regression adjustment. We found no evidence of even a small harmful effect of football participation on scores on a version of the Center for Epidemiological Studies Depression scale (CES-D), nor did we find evidence of adverse associations with several secondary outcomes including anxiety disorder diagnosis or alcohol dependence in early adulthood. For men who were in grades 7-12 in the 1994-95 school year, participating or intending to participate in school football does not appear to be a major risk factor for early adulthood depression.
High School Football and Late-Life Risk of Neurodegenerative Syndromes, 1956-1970
To assess whether athletes who played American varsity high school football between 1956 and 1970 have an increased risk of neurodegenerative diseases later in life. We identified all male varsity football players between 1956 and 1970 in the public high schools of Rochester, Minnesota, and non–football-playing male varsity swimmers, wrestlers, and basketball players. Using the medical records linkage system of the Rochester Epidemiology Project, we ascertained the incidence of late-life neurodegenerative diseases: dementia, parkinsonism, and amyotrophic lateral sclerosis. We also recorded medical record–documented head trauma during high school years. We identified 296 varsity football players and 190 athletes engaging in other sports. Football players had an increased risk of medically documented head trauma, especially if they played football for more than 1 year. Compared with nonfootball athletes, football players did not have an increased risk of neurodegenerative disease overall or of the individual conditions of dementia, parkinsonism, and amyotrophic lateral sclerosis. In this community-based study, varsity high school football players from 1956 to 1970 did not have an increased risk of neurodegenerative diseases compared with athletes engaged in other varsity sports. This was from an era when there was a generally nihilistic view of concussion dangers, less protective equipment, and no prohibition of spearing (head-first tackling). However, the size and strength of players from previous eras may not be comparable with that of current high school athletes.
Social media perceptions of college football performance and season length 2019–2023
College football carries strong social, cultural, and economic importance in the United States, with teams, universities, the public, and others leveraging social media and the online sphere to promote and discuss events and happenings. In our study, we quantify the mentions and underlying public sentiment of online posts related to American college football originating in the 2019–2023 seasons on social and news media in the United States. We complement this with an analysis of how team performance, clustered into conference levels, relates to social media data using ordinary least squares. We find the impact of the 2020 season, which had pandemic-induced schedule adjustments, was felt across the sports landscape but affected different conferences in diverse ways. We further observe that public sentiment during the season tends to be higher in Power Five conferences that have a lower winning percentage. Additionally, our results suggest that the COVID season corresponds to decreased mentions. The effect on sentiment is less clear, but we more generally find that winning percentage positively predicts sentiment.
Socioeconomic Status and Race Outperform Concussion History and Sport Participation in Predicting Collegiate Athlete Baseline Neurocognitive Scores
Objectives: The purpose of this study was to assess the contribution of socioeconomic status (SES) and other multivariate predictors to baseline neurocognitive functioning in collegiate athletes. Methods: Data were obtained from the Concussion Assessment, Research and Education (CARE) Consortium. Immediate Post-Concussion Assessment and Cognitive Testing (ImPACT) baseline assessments for 403 University of Florida student-athletes (202 males; age range: 18–23) from the 2014–2015 and 2015–2016 seasons were analyzed. ImPACT composite scores were consolidated into one memory and one speed composite score. Hierarchical linear regressions were used for analyses. Results: In the overall sample, history of learning disability (β=−0.164; p=.001) and attention deficit–hyperactivity disorder (β=−0.102; p=.038) significantly predicted worse memory and speed performance, respectively. Older age predicted better speed performance (β=.176; p<.001). Black/African American race predicted worse memory (β=−0.113; p=.026) and speed performance (β=−.242; p<.001). In football players, higher maternal SES predicted better memory performance (β=0.308; p=.007); older age predicted better speed performance (β=0.346; p=.001); while Black/African American race predicted worse speed performance (β=−0.397; p<.001). Conclusions: Baseline memory and speed scores are significantly influenced by history of neurodevelopmental disorder, age, and race. In football players, specifically, maternal SES independently predicted baseline memory scores, but concussion history and years exposed to sport were not predictive. SES, race, and medical history beyond exposure to brain injury or subclinical brain trauma are important factors when interpreting variability in cognitive scores among collegiate athletes. 
Additionally, sport-specific differences in the proportional representation of various demographic variables (e.g., SES and race) may also be an important consideration within the broader biopsychosocial attributional model. (JINS, 2018, 24, 1–10)
Career duration and later-life health conditions among former professional American-style football players
Objectives: Career duration is often used as a metric of neurotrauma exposure in studies of elite athletes. However, as a proxy metric, career length may not accurately represent causal factors, and associations with health outcomes may be susceptible to selection effects. To date, relationships between professional American-style football (ASF) career length and postcareer health remain incompletely characterised. Methods: We conducted a survey-based cross-sectional cohort study of former professional ASF players. Flexible regression methods measured associations between self-reported career duration and four self-reported health conditions: pain, arthritis, mood and cognitive symptoms. We also measured associations between career duration and four self-reported ASF exposures: prior concussion signs and symptoms (CSS), performance enhancing drugs, intracareer surgeries and average snaps per game. Models were adjusted for age and race. Results: Among 4189 former players (52±14 years of age, 39% black, 34% lineman position), the average career length was 6.7±3.9 professional seasons (range=1–20+). We observed inverted U-shaped relationships between career duration and outcomes (all p<0.001), indicating that adverse health effects were more common among men with intermediate career durations than those with shorter or longer careers. Similar findings were observed for play-related exposures (eg, CSS and snaps). Conclusions: Relationships between ASF career duration and subsequent health status are non-linear. Attenuation of the associations among longer career players may reflect selection effects and suggest career length may serve as a poor proxy for true causal factors. Findings highlight the need for cautious use of career duration as a proxy exposure metric in studies of former athletes.
Cumulative Head Impact Burden in High School Football
Impacts to the head are common in collision sports such as football. Emerging research has begun to elucidate concussion tolerance levels, but sub-concussive impacts that do not result in clinical signs or symptoms of concussion are much more common, and are speculated to lead to alterations in cerebral structure and function later in life. We investigated the cumulative number of head impacts and their associated acceleration burden in 95 high school football players across four seasons of play using the Head Impact Telemetry System (HITS). The 4-year investigation resulted in 101,994 impacts collected across 190 practice sessions and 50 games. The number of impacts per 14-week season varied by playing position and starting status, with the average player sustaining 652 impacts. Linemen sustained the highest number of impacts per season (868); followed by tight ends, running backs, and linebackers (619); then quarterbacks (467); and receivers, cornerbacks, and safeties (372). Post-impact accelerations of the head also varied by playing position and starting status, with a seasonal linear acceleration burden of 16,746.1 g, while the rotational acceleration and HIT severity profile burdens were 1,090,697.7 rad/s² and 10,021, respectively. The adolescent athletes in this study clearly sustained a large number of impacts to the head, with an impressive associated acceleration burden as a direct result of football participation. These findings raise concern about the relationship between sub-concussive head impacts incurred during football participation and late-life cerebral pathogenesis, and justify consideration of ways to best minimize impacts and mitigate cognitive declines.
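The seasonal burdens reported above can be converted into per-impact averages; a quick arithmetic sketch, assuming the quoted burdens refer to the average player's 652-impact season:

```python
# Seasonal totals reported for the average player (652 impacts).
impacts_per_season = 652
linear_burden_g = 16746.1       # cumulative linear acceleration, in g
rotational_burden = 1090697.7   # cumulative rotational acceleration, rad/s^2

mean_linear_g = linear_burden_g / impacts_per_season
mean_rotational = rotational_burden / impacts_per_season

print(f"mean linear acceleration per impact: {mean_linear_g:.1f} g")
print(f"mean rotational acceleration per impact: {mean_rotational:.1f} rad/s^2")
```

On these assumptions the average impact works out to roughly 25.7 g, i.e., the large cumulative burden is built from many individually sub-concussive impacts rather than a few severe ones.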
How Much Rugby is Too Much? A Seven-Season Prospective Cohort Study of Match Exposure and Injury Risk in Professional Rugby Union Players
Introduction: Numerous studies have documented the incidence and nature of injuries in professional rugby union, but few have identified specific risk factors for injury in this population using appropriate statistical methods. In particular, little is known about the role of previous short-term or longer-term match exposures in current injury risk in this setting. Objectives: Our objective was to investigate the influence that match exposure has upon injury risk in rugby union. Method: We conducted a seven-season (2006/7–2012/13) prospective cohort study of time-loss injuries in 1253 English premiership professional players. Players’ 12-month match exposure (number of matches a player was involved in for ≥20 min in the preceding 12 months) and 1-month match exposure (number of full-game equivalent [FGE] matches in preceding 30 days) were assessed as risk factors for injury using a nested frailty model and magnitude-based inferences. Results: The 12-month match exposure was associated with injury risk in a non-linear fashion; players who had been involved in fewer than ≈15 or more than ≈35 matches over the preceding 12-month period were more susceptible to injury. Monthly match exposure was linearly associated with injury risk (hazard ratio [HR]: 1.14 per 2 standard deviation [3.2 FGE] increase, 90% confidence interval [CI] 1.08–1.20; likely harmful), although this effect was substantially attenuated for players in the upper quartile for 12-month match exposures (>28 matches). Conclusion: A player’s accumulated (12-month) and recent (1-month) match exposure substantially influences their current injury risk. Careful attention should be paid to planning the workloads and monitoring the responses of players involved in: (1) a high (>≈35) number of matches in the previous year, (2) a low (<≈15) number of matches in the previous year, and (3) a low-moderate number of matches in the previous year but who have played intensively in the recent past.
These findings make a major contribution to evidence-based policy decisions regarding match workload limits in professional rugby union.