17 results for "Groth, Caroline P"
Socioeconomic deprivation and suicide in Appalachia: The use of three socioeconomic deprivation indices to explain county-level suicide rates
West Virginia's (WV) suicide rate is 50% higher than the national average and is the highest in the Appalachian Region. Appalachia has several social factors that have contributed to greater socioeconomic deprivation, a known contributor to suicide. Given WV's high prevalence of suicide and poverty, the current study aims to examine the relationship between socioeconomic deprivation and suicide rates in WV. The Townsend Deprivation Index (TDI), Social Deprivation Index (SDI), and Social Vulnerability Index (SVI) measured socioeconomic deprivation. Negative binomial regression models assessed the relationship between socioeconomic deprivation scores, individual index items, and suicide rates. Model comparisons evaluated the indices' ability to assess suicide rates. A backward selection strategy identified additional key items for examining suicide rates. There was a significant increase in suicide rates for every 10% increase in TDI (β = 0.04; p < 0.01), SDI (β = 0.03; p = 0.04), and SVI scores (β = 0.05; p < 0.01). Household overcrowding and unemployment had a positive linear relationship with suicide in the TDI (β = 0.04, p = 0.02; β = 0.07, p = 0.01), SDI (β = 0.10, p = 0.02; β = 0.01, p < 0.01), and SVI (β = 0.10, p = 0.02; β = 0.03, p < 0.01). The backward selection strategy identified additional key items included by the SVI when assessing suicide. Greater socioeconomic deprivation, measured by the TDI, SDI, and SVI, was significantly associated with higher suicide rates. Expanding unemployment benefits and increasing the availability of affordable housing, especially in rural areas, may be useful in reducing suicide rates. Our results suggest racial and ethnic minorities and adults living with a disability may benefit from targeted suicide prevention strategies.
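As an illustrative sketch only, a negative binomial rate model of the kind described above could be fit as follows; the data and column names are synthetic, not the study's.

```python
# Hypothetical sketch: negative binomial model of county suicide counts
# on a deprivation index score, with population as an exposure offset.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
counties = pd.DataFrame({
    "suicides": rng.poisson(12, size=55),            # synthetic county counts
    "population": rng.integers(5_000, 200_000, 55),  # synthetic county population
    "tdi_score": rng.normal(0, 1, size=55),          # synthetic deprivation score
})

model = smf.glm(
    "suicides ~ tdi_score",
    data=counties,
    family=sm.families.NegativeBinomial(),
    offset=np.log(counties["population"]),  # models the rate rather than the count
).fit()
print(model.params["tdi_score"])  # log rate ratio per 1-unit increase in the score
```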
Comparing the effects of decreasing prescription opioid shipments and the release of an abuse deterrent OxyContin formulation on opioid overdose fatalities in WV: an interrupted time series study
Introduction The 2010 release of an abuse deterrent formulation (ADF) of OxyContin, a brand name prescription opioid, has been cited as a major driver of the reduction in prescription drug misuse and of the associated increases in illicit opioid use and overdose rates. However, studies of this topic often do not account for changes in supplies of other prescription opioids that were widely prescribed before and after the ADF OxyContin release, including generic oxycodone formulations and hydrocodone. We therefore sought to compare the impact of the ADF OxyContin release to that of decreasing prescription opioid supplies in West Virginia (WV). Methods Opioid tablet shipment and overdose data were extracted from The Washington Post ARCOS dataset (2006–2014) and the WV Forensic Drug Database (2005–2020), respectively. Locally estimated scatterplot smoothing (LOESS) was used to estimate the point when shipments of prescription opioids to WV began decreasing, measured via dosage units and morphine milligram equivalents (MMEs). Interrupted time series analysis (ITSA) was used to compare the impact that the LOESS-identified changes in prescription supply and the ADF OxyContin release had on prescription (oxycodone and hydrocodone) and illicit (heroin, fentanyl, and fentanyl analogues) opioid overdose deaths in WV. Model fit was compared using the Akaike Information Criterion (AIC). Results The majority of opioid tablets shipped to WV from 2006 to 2014 were generic oxycodone or hydrocodone, not OxyContin. After accounting for a 6-month lag, the ITSA model using the LOESS-identified change in prescription opioid shipments measured via dosage units (2011 Q3) as the intervention start date resulted in the lowest AIC for both prescription (AIC = -188.6) and illicit opioid-involved overdoses (AIC = -189.4), indicating that this intervention start date produced the preferred model. The second lowest AIC was for models using the ADF OxyContin release as the intervention start date. Discussion We found that illicit opioid overdoses in WV began increasing closer to when prescription opioid shipments to the state began decreasing than to when the ADF OxyContin release occurred. Likewise, the majority of opioid tablets shipped to the state from 2006 to 2014 were generic oxycodone or hydrocodone. This may indicate that diminishing prescription supplies had a larger impact on opioid overdose patterns in WV than the ADF OxyContin release.
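As a rough sketch of the segmented-regression form of ITSA described above, one could compare candidate intervention dates by AIC as below; the quarterly series, change points, and column names are invented for illustration.

```python
# Hypothetical ITSA sketch: segmented regression with a level and slope change
# at a candidate intervention quarter, with candidates compared by AIC.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 64  # synthetic quarterly observations
df = pd.DataFrame({"time": np.arange(n),
                   "deaths": rng.poisson(20, size=n).astype(float)})

def itsa_aic(df, start):
    """Fit deaths ~ time + post + time-since-start and return the model AIC."""
    d = df.assign(post=(df["time"] >= start).astype(int),
                  since=np.clip(df["time"] - start, 0, None))
    return smf.ols("deaths ~ time + post + since", data=d).fit().aic

# Compare two illustrative intervention start quarters.
for label, start in {"supply change point (illustrative)": 26,
                     "ADF release date (illustrative)": 22}.items():
    print(label, round(itsa_aic(df, start), 1))
```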
Characterization of Cleaning and Disinfection Product Use, Glove Use, and Skin Disorders by Healthcare Occupations in a Midwestern Healthcare Facility
Healthcare facility staff use a wide variety of cleaning and disinfecting products during their daily operations, many of which are associated with respiratory or skin irritation or sensitization with repeated exposure. The objective of this study was to characterize the prevalence of cleaning and disinfection product use, glove use during cleaning and disinfection, and skin/allergy symptoms by occupation and to identify the factors influencing glove use among healthcare facility staff. A questionnaire eliciting information on cleaning and disinfection product use, glove use during cleaning and disinfection, skin/allergy symptoms, and other demographic characteristics was administered to current employees at a midwestern Veterans Affairs healthcare facility, and responses were summarized by occupation. Central supply/environmental service workers (2% of the total survey population), nurses (26%), nurse assistants (3%), and laboratory technicians (5%) had the highest prevalence of using cleaning or disinfecting products, specifically quaternary ammonium compounds, bleach, and alcohol. Glove use while using these products was common in both patient care and non-patient care occupations. The factors associated with glove use included using bleach or quaternary ammonium compounds and using cleaning products 2–3 or 4–5 days per week. A high frequency of glove use (≥75%) was reported by workers in most occupations when using quaternary ammonium compounds or bleach. The use of alcohol, bleach, and quaternary ammonium compounds was associated with skin disorders (p < 0.05). These findings indicate that although workers from most occupations report a high frequency of glove use when using cleaning and disinfection products, there is room for improvement, especially among administrative, maintenance, and nursing workers. These groups may represent populations that could benefit from the implementation of workplace interventions and further training regarding the use of personal protective equipment and the potential health hazards of exposure to cleaning and disinfecting chemicals.
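The abstract does not name the model used to identify factors associated with glove use; purely as one plausible illustration, a logistic regression on survey responses might look like the sketch below, with every variable name and value synthetic.

```python
# Hypothetical sketch: logistic regression of glove use during cleaning on
# product type and frequency of use. Not the paper's actual model or data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 300
survey = pd.DataFrame({
    "glove_use": rng.integers(0, 2, n),                   # 1 = reports glove use
    "uses_bleach": rng.integers(0, 2, n),
    "uses_quats": rng.integers(0, 2, n),                   # quaternary ammonium compounds
    "days_per_week": rng.choice(["0-1", "2-3", "4-5"], n), # cleaning frequency category
})

fit = smf.logit("glove_use ~ uses_bleach + uses_quats + C(days_per_week)",
                data=survey).fit(disp=False)
print(np.exp(fit.params))  # odds ratios for each factor
```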
Temporal trends in occupational injuries treated in US emergency departments, 2012–2019
Background Evidence suggests that rates of occupational injuries in the US are decreasing. As several different occupational injury surveillance systems are used in the US, more detailed investigation of this trend is merited. Furthermore, studies of this decrease remain descriptive and do not use inferential statistics. The aim of this study was to provide both descriptive and inferential statistics on temporal trends of occupational injuries treated in US emergency departments (EDs) from 2012 to 2019. Methods Monthly non-fatal occupational injury rates from 2012 to 2019 were estimated using the National Electronic Injury Surveillance System—Occupational Supplement (NEISS-Work) dataset, a nationally representative sample of ED-treated occupational injuries. Rates were generated for all injuries and by injury event type using monthly full-time worker equivalent (FTE) data from the US Current Population Survey as a denominator. Seasonality indices were used to detect seasonal variation in monthly injury rates. Trend analysis using linear regression adjusted for seasonality was conducted to quantify changes in injury rates from 2012 to 2019. Results Occupational injuries occurred at an average rate of 176.2 (95% CI = ±30.9) per 10,000 FTE during the study period. Rates were highest in 2012 and declined to their lowest level in 2019. All injury event types occurred at their highest rate in summer months (July or August) apart from falls, slips, and trips, which occurred at their highest rate in January. Trend analyses indicated that total injury rates decreased significantly throughout the study period (−18.5%; 95% CI = ±14.5%). Significant decreases were also detected for injuries associated with contact with foreign objects and equipment (−26.9%; 95% CI = ±10.5%), transportation incidents (−23.2%; 95% CI = ±14.7%), and falls, slips, and trips (−18.1%; 95% CI = ±8.9%). Conclusions This study supports evidence that occupational injuries treated in US EDs have decreased since 2012. Potential contributors to this decrease include increased workplace mechanization and automation, as well as changing patterns in US employment and health insurance access.
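A small sketch of a seasonality-adjusted linear trend model of the type described, using calendar-month indicators and a linear time term on a synthetic monthly rate series; none of the numbers below come from the study.

```python
# Hypothetical sketch: linear trend in monthly injury rates per 10,000 FTE,
# adjusted for seasonality with calendar-month indicator variables.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

months = pd.date_range("2012-01-01", "2019-12-01", freq="MS")
rng = np.random.default_rng(3)
rates = pd.DataFrame({
    "rate": 180 - 0.2 * np.arange(len(months)) + rng.normal(0, 5, len(months)),
    "t": np.arange(len(months)),           # months since start of series
    "month": months.month.astype(str),     # seasonal indicator
})

trend = smf.ols("rate ~ t + C(month)", data=rates).fit()
print(trend.params["t"])  # average change in rate per month, net of seasonality
```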
Volatile Hydrocarbon Exposures and Incident Coronary Heart Disease Events: Up to Ten Years of Follow-up among Deepwater Horizon Oil Spill Workers
During the 2010 Deepwater Horizon (DWH) disaster, response and cleanup workers were potentially exposed to toxic volatile components of crude oil. However, to our knowledge, no study has examined exposure to individual oil spill-related chemicals in relation to cardiovascular outcomes among oil spill workers. Our aim was to investigate the association of several spill-related chemicals [benzene, toluene, ethylbenzene, xylene, n-hexane (BTEX-H)] and total hydrocarbons (THC) with incident coronary heart disease (CHD) events among workers enrolled in a prospective cohort. Cumulative exposures to THC and BTEX-H across the cleanup period were estimated via a job-exposure matrix that linked air measurement data with self-reported spill work histories. We ascertained CHD events following each worker's last day of cleanup work as the first self-reported physician-diagnosed myocardial infarction (MI) or a fatal CHD event. We estimated hazard ratios (HR) and 95% confidence intervals for the associations of exposure quintiles (Q) with risk of CHD. We applied inverse probability weights to account for bias due to confounding and loss to follow-up. We used quantile g-computation to assess the joint effect of the BTEX-H mixture. Among 22,655 workers with no previous MI diagnoses, 509 experienced an incident CHD event through December 2019. Workers in higher quintiles of each exposure agent had increased CHD risks in comparison with the referent group (Q1) of that agent, with the strongest associations observed in Q5. However, most associations were nonsignificant, and there was no evidence of exposure-response trends. We observed stronger associations among ever smokers and within certain education and body mass index subgroups. No apparent positive association was observed for the BTEX-H mixture. Higher exposures to volatile components of crude oil were associated with modest increases in risk of CHD among oil spill workers, although we did not observe exposure-response trends. https://doi.org/10.1289/EHP11859.
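As a simplified sketch of the hazard-ratio estimation described, exposure quintiles and precomputed inverse probability weights could be combined in a Cox model as below; the lifelines package is one way to do this, and all data and weights are synthetic placeholders.

```python
# Hypothetical sketch: Cox model of CHD events on exposure quintile,
# with precomputed inverse probability weights. Synthetic data only.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 2000
workers = pd.DataFrame({
    "followup_years": rng.exponential(8, n),
    "chd_event": rng.integers(0, 2, n),
    "exposure_q": rng.integers(1, 6, n),  # quintile of cumulative exposure
    "ipw": rng.uniform(0.5, 2.0, n),      # inverse probability weight
})
workers = pd.get_dummies(workers, columns=["exposure_q"], drop_first=True, dtype=float)

cph = CoxPHFitter()
cph.fit(workers, duration_col="followup_years", event_col="chd_event",
        weights_col="ipw", robust=True)  # robust SEs recommended with weights
print(cph.hazard_ratios_)                # HRs for Q2-Q5 vs. the Q1 referent
```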
Development of a total hydrocarbon ordinal job-exposure matrix for workers responding to the Deepwater Horizon disaster: The GuLF STUDY
The GuLF STUDY is a cohort study investigating the health of workers who responded to the Deepwater Horizon oil spill in the Gulf of Mexico in 2010. The objective of this effort was to develop an ordinal job-exposure matrix (JEM) of airborne total hydrocarbons (THC), dispersants, and particulates to estimate study participants' exposures. Information was collected on participants' spill-related tasks. A JEM of exposure groups (EGs) was developed from tasks and THC air measurements taken during and after the spill using relevant exposure determinants. THC arithmetic means were developed for the EGs, assigned ordinal values, and linked to the participants using determinants from the questionnaire. Different approaches were taken for combining exposures across EGs. EGs for dispersants and particulates were based on questionnaire responses. Considerable differences in THC exposure levels were found among EGs. Based on the maximum THC level participants experienced across any job held, ∼14% of the subjects were identified in the highest exposure category. Approximately 10% of the cohort was exposed to dispersants or particulates. Considerable exposure differences were found across the various EGs, facilitating investigation of exposure-response relationships. The JEM is flexible to allow for different assumptions about several possibly relevant exposure metrics.
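As a schematic illustration of the JEM-building step, the toy sketch below averages THC measurements by exposure group, bins the means into ordinal categories, and links them back to participants; the groups, cut points, and column names are invented, not the study's.

```python
# Hypothetical sketch: build an ordinal THC job-exposure matrix from air
# measurements and link it to participants by exposure group. Toy data only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(5)
measurements = pd.DataFrame({
    "exposure_group": rng.choice(["boom handling", "decontamination", "support"], 300),
    "thc_ppm": rng.lognormal(mean=0.5, sigma=1.0, size=300),
})

# Arithmetic mean THC per exposure group, binned into ordinal categories.
jem = measurements.groupby("exposure_group", as_index=False)["thc_ppm"].mean()
jem["thc_ordinal"] = pd.cut(jem["thc_ppm"], bins=[0, 1, 3, np.inf], labels=[1, 2, 3])

participants = pd.DataFrame({
    "participant_id": [101, 102, 103],
    "exposure_group": ["support", "boom handling", "decontamination"],
})
print(participants.merge(jem, on="exposure_group"))  # ordinal exposure per participant
```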
Fine Particulate Matter and Lung Function among Burning-Exposed Deepwater Horizon Oil Spill Workers
During the 2010 Deepwater Horizon (DWH) disaster, controlled burning was conducted to remove oil from the water. Workers near combustion sites were potentially exposed to increased levels of fine particulate matter [PM2.5, with aerodynamic diameter ≤2.5 μm]. Exposure to PM2.5 has been linked to decreased lung function, but to our knowledge, no study has examined PM2.5 exposure encountered in an oil spill cleanup. We investigated the association between PM2.5 estimated only from burning/flaring of oil/gas and lung function measured 1–3 y after the disaster. We included workers who participated in response and cleanup activities on the water during the disaster and had lung function measured at a subsequent home visit. PM2.5 concentrations were estimated using a Gaussian plume dispersion model and linked to work histories via a job-exposure matrix. We evaluated forced expiratory volume in 1 s (FEV1; milliliters), forced vital capacity (FVC; milliliters), and their ratio (FEV1/FVC; %) in relation to average and cumulative daily maximum PM2.5 exposures using multivariable linear regressions. We observed significant exposure-response trends associating higher cumulative daily maximum PM2.5 exposure with lower FEV1 and FEV1/FVC. In comparison with the referent group (workers not involved in or near the burning), those with higher cumulative exposures had lower FEV1 and FEV1/FVC. We also saw nonsignificant reductions in FVC (high exposure vs. referent). Similar associations were seen for average daily maximum exposure. Inverse associations were also observed in analyses stratified by smoking and time from exposure to spirometry and when we restricted to workers without prespill lung disease. Among oil spill workers, exposure to PM2.5 specifically from controlled burning of oil/gas was associated with significantly lower FEV1 and FEV1/FVC when compared with workers not involved in burning. https://doi.org/10.1289/EHP8930.
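A bare-bones sketch of the kind of multivariable linear model described for lung function, regressing FEV1 on an exposure category with simple covariate adjustment; all variables, categories, and values below are stand-ins, not the study's.

```python
# Hypothetical sketch: FEV1 (mL) regressed on a cumulative PM2.5 exposure
# category with simple covariate adjustment. Synthetic data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(6)
n = 1500
spiro = pd.DataFrame({
    "fev1_ml": rng.normal(3800, 500, n),
    "exposure_cat": rng.choice(["referent", "low", "high"], n),
    "age": rng.integers(20, 65, n),
    "height_cm": rng.normal(172, 9, n),
    "ever_smoker": rng.integers(0, 2, n),
})

fit = smf.ols("fev1_ml ~ C(exposure_cat, Treatment('referent')) + age + height_cm + ever_smoker",
              data=spiro).fit()
print(fit.params.filter(like="exposure_cat"))  # mean FEV1 difference vs. the referent group
```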
Comparing Accelerometer and Self-Reported Treatment Effects in a Technology-Supported Physical Activity Intervention
Background and Aims To estimate and compare the change in moderate-to-vigorous physical activity (MVPA) between an accelerometer and a technology-supported physical activity (PA) log across a 3-week PA intervention. Method Participants (N = 204, 77% female, age = 33 ± 11 years, body mass index = 28.2 ± 7.1 kg/m2) were randomized to one of two activity-related intervention arms: (1) an intervention to increase MVPA or (2) an active control to decrease sedentary behavior. Participants wore an accelerometer while simultaneously completing a technology-based PA log every day for 5 weeks: a 2-week baseline assessment phase and a 3-week intervention phase. Bivariate linear mixed-effects models and correlations were used to characterize the relationship of MVPA between measurement methods throughout the intervention. Effect sizes were calculated to determine the intervention effect by measurement method. Results At baseline, PA log MVPA was 28 minutes greater than accelerometer-based minutes of MVPA in the active control group. This difference was 35 minutes (95% CI [23.7, 46.1]) greater at follow-up than at baseline in the MVPA intervention group. In the active control group, there was a significant 16-minute (95% CI [6.0, 26.5]) increase in the difference between the two measures from baseline to follow-up. The intervention effect size based on the PA log was 0.27 (95% CI [0.14, 0.39]) and 0.42 (95% CI [0.28, 0.56]) when based on the accelerometer. Discussion and Conclusions Our results indicate that PA log MVPA and accelerometer MVPA yield significantly different estimates of MVPA minutes per day. Researchers should therefore use caution when comparing MVPA intervention outcomes derived from different measurement methods.
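A minimal sketch of one way to compare MVPA minutes across the two measurement methods over the intervention with a linear mixed-effects model (a simplification of the bivariate models described); the data, group sizes, and variable names are illustrative only.

```python
# Hypothetical sketch: mixed model of daily MVPA minutes with a
# method-by-phase interaction and a random intercept per participant.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
rows = []
for pid in range(60):
    for phase in ["baseline", "intervention"]:
        for method in ["accelerometer", "pa_log"]:
            rows.append({"participant": pid, "phase": phase, "method": method,
                         "mvpa_min": rng.normal(40 if method == "pa_log" else 25, 10)})
daily = pd.DataFrame(rows)

mixed = smf.mixedlm("mvpa_min ~ method * phase", data=daily,
                    groups=daily["participant"]).fit()
print(mixed.params)  # interaction term: does the gap between methods change at follow-up?
```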
Excess US Firearm Mortality During the COVID-19 Pandemic Stratified by Intent and Urbanization
This cross-sectional study used time series forecasting to estimate excess firearm mortality in the US during the COVID-19 pandemic.
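As a conceptual sketch of excess-mortality estimation by time series forecasting, one could fit a seasonal model to pre-pandemic monthly counts, forecast forward, and take observed minus expected; the series below is synthetic and the model choice is only an example.

```python
# Hypothetical sketch: forecast expected monthly firearm deaths from
# pre-2020 data, then compute excess = observed - expected. Synthetic data.
import numpy as np
import pandas as pd
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(8)
idx = pd.date_range("2015-01-01", "2020-12-01", freq="MS")
deaths = pd.Series(3000 + 200 * np.sin(2 * np.pi * idx.month / 12)
                   + rng.normal(0, 50, len(idx)), index=idx)

train = deaths[:"2019-12-01"]                           # pre-pandemic months only
model = SARIMAX(train, order=(1, 0, 0), seasonal_order=(1, 0, 0, 12)).fit(disp=False)
expected = model.get_forecast(steps=12).predicted_mean  # forecast for 2020
excess = deaths["2020-01-01":] - expected
print(excess.sum())  # estimated excess deaths over the forecast window
```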
Maternal Nanomaterial Inhalation Exposure: Critical Gestational Period in the Uterine Microcirculation is Angiotensin II Dependent
Maternal inhalation exposure to engineered nanomaterials (ENM) has been associated with microvascular dysfunction and adverse cardiovascular responses. Pregnancy requires coordinated vascular adaptation and growth that are imperative for survival. Key events, such as implantation, spiral artery remodeling, placentation, and trophoblast invasion, hallmark distinct periods of gestation. Angiotensin II (Ang II) is a critical vasoactive mediator responsible for these adaptations and is implicated in the pathology of preeclampsia. If perturbations occur during gestation, such as those caused by ENM inhalation exposure, then maternal–fetal health consequences may occur. Our study aimed to identify the period of gestation in which maternal microvascular function and fetal health are most vulnerable. Additionally, we wanted to determine whether Ang II sensitivity and receptor density are altered by exposure. Dams were exposed to ENM aerosols (nano-titanium dioxide) during three gestational windows: early (EE, gestational day (GD) 2–6), mid (ME, GD 8–12), or late (LE, GD 15–19). Within the EE group, dry pup mass decreased by 16.3% and uterine radial artery wall-to-lumen ratio (WLR) increased by 25.9%. Uterine radial artery sensitivity to Ang II increased by 40.5% in the EE group. Ang II receptor density was altered in the EE and LE groups, with decreased levels of AT2R. We conclude that early gestational maternal inhalation exposures resulted in altered vascular anatomy and physiology. Exposure during this period resulted in altered vascular reactivity and changes to uterine radial artery WLR, leading to decreased perfusion to the fetus and lower pup mass.