705 result(s) for "Reynolds, Steven"
Meta-Analysis of Nitrogen Removal in Riparian Buffers
Riparian buffers, the vegetated region adjacent to streams and wetlands, are thought to be effective at intercepting and reducing nitrogen loads entering water bodies. Riparian buffer width is thought to be positively related to nitrogen removal effectiveness by influencing nitrogen retention or removal. We surveyed the scientific literature containing data on riparian buffers and nitrogen concentration in streams and groundwater to identify trends between nitrogen removal effectiveness and buffer width, hydrological flow path, and vegetative cover. Nitrogen removal effectiveness varied widely. Wide buffers (>50 m) more consistently removed significant portions of nitrogen entering a riparian zone than narrow buffers (0-25 m). Buffers of various vegetation types were equally effective at removing nitrogen but buffers composed of herbaceous and forest/herbaceous vegetation were more effective when wider. Subsurface removal of nitrogen was efficient, but did not appear to be related to buffer width, while surface removal of nitrogen was partly related to buffer width. The mass of nitrate nitrogen removed per unit length of buffer did not differ by buffer width, flow path, or buffer vegetation type. Our meta-analysis suggests that buffer width is an important consideration in managing nitrogen in watersheds. However, the inconsistent effects of buffer width and vegetation on nitrogen removal suggest that soil type, subsurface hydrology (e.g., soil saturation, groundwater flow paths), and subsurface biogeochemistry (organic carbon supply, nitrate inputs) also are important factors governing nitrogen removal in buffers.
Systematic review of cognitive impairment and brain insult after mechanical ventilation
We conducted a systematic review following the PRISMA protocol primarily to identify publications that assessed any links between mechanical ventilation (MV) and either cognitive impairment or brain insult, independent of underlying medical conditions. Secondary objectives were to identify possible gaps in the literature that can be used to inform future studies and move toward a better understanding of this complex problem. The preclinical literature suggests that MV is associated with neuroinflammation, cognitive impairment, and brain insult, reporting higher neuroinflammatory markers, greater evidence of brain injury markers, and lower cognitive scores in subjects that were ventilated longer, compared to those ventilated less, and to never-ventilated subjects. The clinical literature suggests an association between MV and delirium, and that delirium in mechanically ventilated patients may be associated with greater likelihood of long-term cognitive impairment; our systematic review found no clinical study that demonstrated a causal link between MV, cognitive dysfunction, and brain insult. More studies should be designed to investigate ventilation-induced brain injury pathways as well as any causative linkage between MV, cognitive impairment, and brain insult.
Moral distress in intensive care unit professionals is associated with profession, age, and years of experience
To determine which demographic characteristics are associated with moral distress in intensive care unit (ICU) professionals. We distributed a self-administered, validated survey to measure moral distress to all clinical personnel in 13 ICUs in British Columbia, Canada. Each respondent to the survey also reported their age, sex, and years of experience in the ICU where they were working. We used multivariate, hierarchical regression to analyze relationships between demographic characteristics and moral distress scores, and to analyze the relationship between moral distress and tendency to leave the workplace. Response rates to the surveys were the following: nurses—428/870 (49%); other health professionals (not nurses or physicians)—211/452 (47%); physicians—30/68 (44%). Nurses and other health professionals had higher moral distress scores than physicians. Highest ranked items associated with moral distress were related to cost constraints and end-of-life controversies. Multivariate analyses showed that age is inversely associated with moral distress, but only in other health professionals (rate ratio [95% confidence interval]: −7.3 [−13.4, −1.2]); years of experience is directly associated with moral distress, but only in nurses (rate ratio (95% confidence interval):10.8 [2.6, 18.9]). The moral distress score is directly related to the tendency to leave the ICU job, in both the past and present, but only for nurses and other non-physician health professionals. Moral distress is higher in ICU nurses and other non-physician professionals than in physicians, is lower with older age for other non-physician professionals but greater with more years of experience in nurses, and is associated with tendency to leave the job.
HIV testing in a South African Emergency Department: A missed opportunity
South Africa has the largest HIV epidemic in the world, with 19% of the global number of people living with HIV, 15% of new infections and 11% of AIDS-related deaths. Even though HIV testing is mandated in all hospital-based facilities in South Africa (SA), it is rarely implemented in the Emergency Department (ED). The ED provides episodic care to large volumes of undifferentiated patients who present with unplanned injury or illness. Thus, the ED may provide an opportunity to capture patients with undiagnosed HIV infection missed by clinic-based screening programs. In this prospective exploratory study, we implemented the National South African HIV testing guidelines (counselor-initiated non-targeted universal screening with rapid point-of-care testing) 24 hours a day at Frere Hospital in the Eastern Cape from September 1st to November 30th, 2016. The purpose of our study was to quantify the burden of undiagnosed HIV infection in a South African ED setting. Furthermore, we sought to evaluate the effectiveness of the nationally recommended HIV testing strategy in the ED. All patients who presented for care in the ED during the study period, and who were clinically stable and fully conscious, were eligible to be approached by HIV counseling and testing (HCT) staff to receive a rapid point-of-care HIV test. A total of 2355 of the 9583 (24.6%) patients who presented to the ED for care during the study period were approached by the HCT staff, of whom 1714 (72.8%) accepted HIV testing. There was a high uptake of HIV testing (78.6%) among a predominantly male (58%) patient group who mostly presented with traumatic injuries (70.8%). Four hundred (21.6%) patients were HIV positive, including 115 (6.2%) with newly diagnosed HIV infection. The overall prevalence of HIV infection was twice as high in females (29.8%) compared to males (15.4%). Both sexes had a similar prevalence of newly diagnosed HIV infection (6.0% for all females and 6.4% for all males) in the ED.
Overall there was high HIV testing acceptance by ED patients. A non-targeted testing approach revealed a high HIV prevalence with a significant burden of undiagnosed HIV infection in the ED. Unfortunately, a counselor-driven HIV testing approach fell short of meeting the testing needs in this setting, with over 75% of ED patients not approached by HCT staff.
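As a quick sanity check, the testing-cascade proportions reported above (share approached, testing uptake, and the share never approached) can be reproduced from the raw counts given in the abstract; the sketch below assumes only those counts and standard rounding to one decimal place.

```python
# Recompute the ED testing-cascade proportions from the counts reported in
# the abstract: 9583 presented, 2355 approached by HCT staff, 1714 tested.
presented = 9583
approached = 2355
tested = 1714

approached_pct = round(100 * approached / presented, 1)            # share of ED patients approached
uptake_pct = round(100 * tested / approached, 1)                   # uptake among those approached
missed_pct = round(100 * (presented - approached) / presented, 1)  # never approached by HCT staff

print(approached_pct, uptake_pct, missed_pct)  # 24.6 72.8 75.4
```

The recomputed figures match the abstract's 24.6% approached and 72.8% uptake, and confirm the closing claim that over 75% of ED patients were never approached.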
Locking solutions for prevention of central venous access device complications in the adult critical care population: A systematic review
The objective of this systematic review is to determine the extent and quality of evidence for use of different types of locking fluids to prevent central venous access device complications in adult critical care patients. Specifically, rates of catheter-related bloodstream infection, colonization, and occlusion were considered. All types of devices were included in the review: central venous catheters, peripherally-inserted central catheters and hemodialysis catheters. Eligibility criteria. Papers had to include adult (>18 years old) critical care patients, be experimental trials conducted in North America and Europe, and be published in peer-reviewed journals from 2010 onwards. Information sources. A search of Medline and EMBASE databases was performed. The search is current as of November 28th, 2022. Risk of bias. The Cochrane Risk of Bias 2 and the Risk of Bias In Non-Randomized Studies of Intervention tools were used to assess the risk of bias in included studies. Included studies. A total of 240 paper titles and abstracts underwent review; of these, seven studies met the final criteria for quality appraisal. A total of three studies earned a low risk of bias quality appraisal. Limitations of evidence. Due to heterogeneity of the types of locking fluids investigated and the small number of studies identified, meta-analysis of results was not possible. Interpretation. Of all fluids investigated, only citrate 46.7% was found to significantly reduce central venous access device complication rates. This systematic review has also identified a gap in the literature regarding studies of locking fluids that are adequately powered in this patient population. Future research should include investigation and use of novel locking fluids with more effective properties against complications. It is imperative that future studies are adequately powered, randomized controlled trials in this patient population to facilitate optimal evidence-based care.
Clinical characteristics and primary management of patients diagnosed with prostate cancer between 2015 and 2019 at the Uganda Cancer Institute
Prostate cancer is the second most common cancer among men in Uganda, with over 2086 incident cases in 2018. This study's objective was to report the clinical characteristics and primary management of men diagnosed with prostate cancer at the Uganda Cancer Institute from 1st January 2015 to 31st December 2019. Records from all men diagnosed with prostate cancer at the Uganda Cancer Institute during this period were reviewed. Clinical characteristics and primary treatment were recorded. Risk categorization was done using the European Society for Medical Oncology prostate cancer risk group classification. A total of 874 medical records for men diagnosed with prostate cancer were retrieved. The median age was 70 years (interquartile range 64-77). In this study, 501 (57.32%) patients had localized disease. Among patients with localized disease, 2 (0.23%) were classified as low-risk, 5 (0.53%) as intermediate-risk, and 494 (56.52%) as high-risk. Three hundred seventy-three (373) patients had metastatic disease at diagnosis. Among patients with distant metastases, the most common site of metastases was bone (143; 16.36%), followed by spinal cord (54; 6.18%), abdomen (22; 2.52%), and lungs (14; 1.60%). Regarding primary treatment, the majority of patients were on chemotherapy (384; 43.94%), followed by hormonal therapy (336; 38.44%) and radiotherapy (127; 14.53%). The majority of patients diagnosed with prostate cancer at the Uganda Cancer Institute presented with advanced disease. The primary treatments were mostly chemotherapy, hormonal therapy, and radiotherapy. There is a need to improve prostate cancer screening in regional health care facilities and the communities to enhance early detection and management of prostate cancer.
Quantifying prevalence and risk factors of HIV multiple infection in Uganda from population-based deep-sequence data
People living with HIV can acquire secondary infections through a process called superinfection, giving rise to simultaneous infection with genetically distinct variants (multiple infection). Multiple infection provides the necessary conditions for the generation of novel recombinant forms of HIV and may worsen clinical outcomes and increase the rate of transmission to HIV seronegative sexual partners. To date, studies of HIV multiple infection have relied on insensitive bulk-sequencing, labor intensive single genome amplification protocols, or deep-sequencing of short genome regions. Here, we identified multiple infections in whole-genome or near whole-genome HIV RNA deep-sequence data generated from plasma samples of 2,029 people living with viremic HIV who participated in the population-based Rakai Community Cohort Study (RCCS). We estimated individual- and population-level probabilities of being multiply infected and assessed epidemiological risk factors using the novel Bayesian deep-phylogenetic multiple infection model (deep-phyloMI), which accounts for bias due to partial sequencing success and false-negative and false-positive detection rates. We estimated that between 2010 and 2020, 4.09% (95% highest posterior density interval (HPD) 2.95%–5.45%) of RCCS participants with viremic HIV harbored a multiple infection at time of sampling. Participants living in high-HIV prevalence communities along Lake Victoria were 2.33-fold (95% HPD 1.3–3.7) more likely to harbor a multiple infection compared to individuals in lower prevalence neighboring communities. This work introduces a high-throughput surveillance framework for identifying people with multiple HIV infections and quantifying population-level prevalence and risk factors of multiple infection for clinical and epidemiological investigations.
Comparison of different cardiovascular risk tools used in HIV patient cohorts in sub-Saharan Africa; do we need to include laboratory tests?
Cardiovascular disease (CVD) is the leading cause of death globally, representing 31% of all global deaths. HIV and long-term anti-retroviral therapy (ART) are risk factors for development of CVD in populations of people living with HIV (PLHIV). CVD risk assessment tools are currently being applied to SSA populations, but there are questions about accuracy as well as implementation challenges of these tools in lower resource setting populations. We aimed to assess the level of agreement between the various cardiovascular screening tools (Data collection on Adverse effects of anti-HIV Drugs (D:A:D), Framingham risk score, WHO risk score and The Atherosclerotic Cardiovascular Disease Score) when applied to an ART-experienced HIV population in Sub-Saharan Africa. This study was undertaken in an Anti-Retroviral Long Term (ALT) Cohort of 1000 PLHIV in care who had been on ART for at least 10 years in urban Uganda. A systematic review was undertaken to find the most frequently used screening tools from SSA PLHIV populations; these were applied to the ALT cohort. Levels of agreement between the resulting scores (those including lipids and non-lipids based, as well as HIV-specific and non-HIV specific) as applied to our cohort were compared. Prevalence Bias Adjusted Kappa was used to evaluate agreement between tools. Overall, PLHIV in the ALT cohort had a median 1.1-1.4% risk of a CVD event over 5 years and 1.7-2.5% risk of a CVD event over 10 years. There was no statistical difference in the risk scores obtained for this population when comparing the different tools, including comparisons of those with lipids and non-lipids, and HIV specific vs non-HIV specific. The various tools yielded similar results, but those not including lipids are more feasible to apply in our setting. Long-term cohorts of PLHIV in SSA should in future provide longitudinal data to evaluate existing CVD risk prediction tools for these populations.
Inclusion of HIV and ART history factors to existing scoring systems may improve accuracy without adding the expense and technical difficulty of lipid testing.
Effect of Peer Health Workers on AIDS Care in Rakai, Uganda: A Cluster-Randomized Trial
Human resource limitations are a challenge to the delivery of antiretroviral therapy (ART) in low-resource settings. We conducted a cluster randomized trial to assess the effect of community-based peer health workers (PHW) on AIDS care of adults in Rakai, Uganda. 15 AIDS clinics were randomized 2:1 to receive the PHW intervention (n = 10) or control (n = 5). PHW tasks included clinic- and home-based provision of counseling, clinical support, ART adherence support, and social support. Primary outcomes were adherence and cumulative risk of virologic failure (>400 copies/mL). Secondary outcomes were virologic failure at each 24 week time point up to 192 weeks of ART. Analysis was by intention to treat. From May 2006 to July 2008, 1336 patients were followed. 444 (33%) of these patients were already on ART at the start of the study. No significant differences were found in lack of adherence (<95% pill count adherence risk ratio [RR] 0.55, 95% confidence interval [CI] 0.23-1.35; <100% adherence RR 1.10, 95% CI 0.94-1.30), cumulative risk of virologic failure (RR 0.81, 95% CI 0.61-1.08) or in shorter-term virologic outcomes (24 week virologic failure RR 0.93, 95% CI 0.65-1.32; 48 week, RR 0.83, 95% CI 0.47-1.48; 72 week, RR 0.81, 95% CI 0.44-1.49). However, virologic failure rates ≥96 weeks into ART were significantly decreased in the intervention arm compared to the control arm (96 week failure RR 0.50, 95% CI 0.31-0.81; 120 week, RR 0.59, 95% CI 0.22-1.60; 144 week, RR 0.39, 95% CI 0.16-0.95; 168 week, RR 0.30, 95% CI 0.097-0.92; 192 week, RR 0.067, 95% CI 0.0065-0.71). A PHW intervention was associated with decreased virologic failure rates occurring 96 weeks and longer into ART, but did not affect cumulative risk of virologic failure, adherence measures, or shorter-term virologic outcomes. PHWs may be an effective intervention to sustain long-term ART in low-resource settings. ClinicalTrials.gov NCT00675389.
Mitigation of Ventilator-induced Diaphragm Atrophy by Transvenous Phrenic Nerve Stimulation
Ventilator-induced diaphragm dysfunction is a significant contributor to weaning difficulty in ventilated critically ill patients. It has been hypothesized that electrically pacing the diaphragm during mechanical ventilation could reduce diaphragm dysfunction. We tested a novel, central line catheter-based, transvenous phrenic nerve pacing therapy for protecting the diaphragm in sedated and ventilated pigs. Eighteen Yorkshire pigs were studied. Six pigs were sedated and mechanically ventilated for 2.5 days with pacing on alternate breaths at intensities that reduced the ventilator pressure-time product by 20-30%. Six matched subjects were similarly sedated and ventilated but were not paced. Six pigs served as never-ventilated, never-paced control animals. Cumulative duration of pacing therapy ranged from 19.7 to 35.7 hours. Diaphragm thickness assessed by ultrasound and normalized to initial value showed a significant decline in ventilated-not-paced but not in ventilated-paced subjects (0.84 [interquartile range (IQR), 0.78-0.89] vs. 1.10 [IQR, 1.02-1.24]; P = 0.001). Compared with control animals (24.6 μm²/kg; IQR, 21.6-26.0), median myofiber cross-sectional areas normalized to weight and sarcomere length were significantly smaller in the ventilated-not-paced (17.9 μm²/kg; IQR, 15.3-23.7; P = 0.005) but not in the ventilated-paced group (24.9 μm²/kg; IQR, 16.6-27.3; P = 0.351). After 60 hours of mechanical ventilation, all six ventilated-paced subjects tolerated 8 minutes of intense phrenic stimulation, whereas three of six ventilated-not-paced subjects did not (P = 0.055). There was a nonsignificant decrease in diaphragm tetanic force production over the experiment in the ventilated-paced and ventilated-not-paced groups. These results suggest that early transvenous phrenic nerve pacing may mitigate ventilator-induced diaphragm dysfunction.