29 result(s) for "Lompo, F."
Fertilization practices alter microbial nutrient limitations after alleviation of carbon limitation in a Ferric Acrisol
Microbial nutrient limitation was investigated in a 53-year-old field experiment in the Central-West of Burkina Faso under sorghum–cowpea rotation, comparing three fertilization practices: mineral fertilizer (MIN), mineral fertilizer and farmyard manure (MINFYM), and a non-fertilized control (CON). We assessed microbial N and P limitation after removal of C limitation by (i) determining microbial N and P, (ii) assessing respiration kinetics in incubated soil samples amended with easily available C (glucose) alone or in combination with N and/or P, or not amended, and (iii) evaluating changes in microbial biomass and community composition at the peak of microbial respiration by microbial P and phospholipid fatty acid (PLFA) analyses. Microbial N and P were very low in all fertilization practices, but greater in MINFYM than in CON. Easily available C was the first factor limiting microorganisms in all fertilization practices. After removal of C limitation, most indicators suggested N and P co-limitation in CON. In contrast, respiration kinetics in MINFYM and MIN were only N-limited, while biomass formation in MINFYM was also P-limited. PLFA analyses indicated preferential fungal growth on the added C, and P limitation of changes in microbial community composition in MIN. Long-term application of fertilizers mostly alleviated secondary microbial nutrient limitation by P but not by N, and C always remained the primary limiting factor for microbial growth.
Soil properties and not inputs control carbon : nitrogen : phosphorus ratios in cropped soils in the long term
Stoichiometric approaches have been applied to understand the relationship between soil organic matter dynamics and biological nutrient transformations. However, very few studies have explicitly considered the effects of agricultural management practices on the soil C : N : P ratio. The aim of this study was to assess how different input types and rates would affect the C : N : P molar ratios of bulk soil, organic matter and microbial biomass in cropped soils in the long term. Thus, we analysed the C, N, and P inputs and budgets as well as soil properties in three long-term experiments established on different soil types: the Saria soil fertility trial (Burkina Faso), the Wagga Wagga rotation/stubble management/soil preparation trial (Australia), and the DOK (bio-Dynamic, bio-Organic, and “Konventionell”) cropping system trial (Switzerland). In each of these trials, there was a large range of C, N, and P inputs which had a strong impact on element concentrations in soils. However, although C : N : P ratios of the inputs were highly variable, they had only weak effects on soil C : N : P ratios. At Saria, a positive correlation was found between the N : P ratio of inputs and microbial biomass, while no relation was observed between the nutrient ratios of inputs and soil organic matter. At Wagga Wagga, the C : P ratio of inputs was significantly correlated to total soil C : P, N : P, and C : N ratios, but had no impact on the elemental composition of microbial biomass. In the DOK trial, a positive correlation was found between the C budget and the C to organic P ratio in soils, while the nutrient ratios of inputs were not related to those in the microbial biomass. We argue that these responses are due to differences in soil properties among sites. At Saria, the soil is dominated by quartz and some kaolinite, has a coarse texture, a fragile structure, and a low nutrient content. Thus, microorganisms feed on inputs (plant residues, manure). 
In contrast, the soil at Wagga Wagga contains illite and haematite, is richer in clay and nutrients, and has a stable structure. Thus, organic matter is protected from mineralization and can therefore accumulate, allowing microorganisms to feed on soil nutrients and to keep a constant C : N : P ratio. The DOK soil represents an intermediate situation, with high nutrient concentrations, but a rather fragile soil structure, where organic matter does not accumulate. We conclude that the study of C, N, and P ratios is important to understand the functioning of cropped soils in the long term, but that it must be coupled with a precise assessment of element inputs and budgets in the system and a good understanding of the ability of soils to stabilize C, N, and P compounds.
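The molar C : N : P ratios discussed above are obtained by converting measured element concentrations from mass to moles. A minimal sketch of that conversion, using standard atomic masses; the function name and the input concentrations are illustrative, not values from any of the three trials:

```python
# Sketch: convert mass concentrations of soil C, N and P (mg per kg of soil)
# into a molar C:N:P ratio normalised to P = 1.
ATOMIC_MASS = {"C": 12.011, "N": 14.007, "P": 30.974}  # g/mol

def molar_cnp_ratio(c_mg_kg, n_mg_kg, p_mg_kg):
    """Return (C, N, P) as a molar ratio normalised so that P = 1."""
    c_mol = c_mg_kg / ATOMIC_MASS["C"]
    n_mol = n_mg_kg / ATOMIC_MASS["N"]
    p_mol = p_mg_kg / ATOMIC_MASS["P"]
    return c_mol / p_mol, n_mol / p_mol, 1.0

# Purely illustrative concentrations (mg/kg), not data from the study:
c, n, p = molar_cnp_ratio(10000.0, 900.0, 300.0)
```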
Influence de la rotation culturale, de la fertilisation et du labour sur les populations de nématodes phytoparasites du sorgho (Sorghum bicolor (L.) Moench)
Influence of crop rotation, fertilization and tillage on populations of plant-parasitic nematodes of sorghum (Sorghum bicolor (L.) Moench). The soil nematofauna of three long-term trials (established in 1960, 1980 and 1990) representing sorghum (Sorghum bicolor (L.) Moench) production under different agricultural practices (rotation, tillage and fertilization) in the Central-West of Burkina Faso was surveyed shortly after harvest during the 2007/2008 cropping season. The objective was to identify these nematodes and to study the influence of agricultural practices on the nematofauna. Nematodes were extracted with a Seinhorst elutriator. The plant-parasitic nematodes identified were Pratylenchus brachyurus, Tylenchorhynchus martini, Helicotylenchus multicinctus, Scutellonema cavenessi, Criconemoides curvatum, Telotylenchus indicus and Xiphinema sp. The first three species accounted for approximately 98% of the individuals recorded. On the first site, treatments combining mineral fertilizer with recycling of sorghum straw were more favorable for nematode control than treatments involving manure. As for rotations, sorghum monoculture was more heavily infested by nematodes than the sorghum–cowpea and sorghum–cotton rotations. On the second site, nitrogen inputs increased infestation by the two major nematodes compared with treatments without nitrogen, with the exception of the treatment incorporating anaerobic compost. On the third site, deep ploughing was unfavorable to the main sorghum nematode compared with shallow tillage. The nematofauna under fallow was more diversified than in cultivated plots, and P. brachyurus, the main nematode associated with sorghum, declined sharply under fallow.
Spatially explicit multi-threat assessment of food tree species in Burkina Faso: A fine-scale approach
Over the last decades, agroforestry parklands in Burkina Faso have come under increasing demographic as well as climatic pressures, which are threatening indigenous tree species that contribute substantially to income generation and nutrition in rural households. Analyzing the threats as well as the species' vulnerability to them is fundamental for priority setting in conservation planning. Guided by literature and local experts, we selected 16 important food tree species (Acacia macrostachya, Acacia senegal, Adansonia digitata, Annona senegalensis, Balanites aegyptiaca, Bombax costatum, Boscia senegalensis, Detarium microcarpum, Lannea microcarpa, Parkia biglobosa, Sclerocarya birrea, Strychnos spinosa, Tamarindus indica, Vitellaria paradoxa, Ximenia americana, Ziziphus mauritiana) and six key threats to them (overexploitation, overgrazing, fire, cotton production, mining and climate change). We developed a species-specific and spatially explicit approach combining freely accessible datasets, species distribution models (SDMs), climate models and expert survey results to predict, at fine scale, where these threats are likely to have the greatest impact. We find that all species face serious threats throughout much of their distribution in Burkina Faso and that climate change is predicted to be the most prevalent threat in the long term, whereas overexploitation and cotton production are the most important short-term threats. Tree populations growing in areas designated as 'highly threatened' due to climate change should be used as seed sources for ex situ conservation and for planting in areas where future climate conditions are predicted to provide suitable habitat. Assisted regeneration is suggested for populations in areas where suitable habitat under future climate conditions coincides with high threat levels due to short-term threats.
In the case of Vitellaria paradoxa, we suggest collecting seed along the northern margins of its distribution and considering assisted regeneration in the central part, where the current threat level is high due to overexploitation. In the same way, population-specific recommendations can be derived from the individual and combined threat maps of the other 15 food tree species. The approach can easily be transferred to other countries and can be used to analyze general and species-specific threats at finer, more local scales as well as at broader (continental) scales in order to plan more selective and efficient conservation actions in time. The concept can be applied anywhere, as long as appropriate spatial data and knowledgeable experts are available.
Diagnostic accuracy of cervical cancer screening and screening–triage strategies among women living with HIV-1 in Burkina Faso and South Africa: A cohort study
Cervical cancer screening strategies using visual inspection or cytology may have suboptimal diagnostic accuracy for detection of precancer in women living with HIV (WLHIV). The optimal screen and screen-triage strategy, age to initiate, and frequency of screening for WLHIV remain unclear. This study evaluated the sensitivity, specificity, and positive predictive value of different cervical cancer screening strategies in WLHIV in Africa. WLHIV aged 25-50 years attending HIV treatment centres in Burkina Faso (BF) and South Africa (SA) from 5 December 2011 to 30 October 2012 were enrolled in a prospective evaluation study of visual inspection using acetic acid (VIA) or visual inspection using Lugol's iodine (VILI), high-risk human papillomavirus DNA testing (Hybrid Capture 2 [HC2] or careHPV), and cytology for histology-verified high-grade cervical intraepithelial neoplasia (CIN2+/CIN3+) at baseline and endline, a median of 16 months later. Among 1,238 women (BF: 615; SA: 623), median age was 36 and 34 years (p < 0.001), 28.6% and 49.6% had ever had prior cervical cancer screening (p < 0.001), and 69.9% and 64.2% were taking ART at enrolment (p = 0.045) in BF and SA, respectively. CIN2+ prevalence was 5.8% and 22.4% in BF and SA (p < 0.001), respectively. VIA had low sensitivity for CIN2+ (44.7%, 95% confidence interval [CI] 36.9%-52.7%) and CIN3+ (56.1%, 95% CI 43.3%-68.3%) in both countries, with specificity for ≤CIN1 of 78.7% (95% CI 76.0%-81.3%). HC2 had sensitivity of 88.8% (95% CI 82.9%-93.2%) for CIN2+ and 86.4% (95% CI 75.7%-93.6%) for CIN3+. Specificity for ≤CIN1 was 55.4% (95% CI 52.2%-58.6%), and screen positivity was 51.3%. Specificity was higher with a restricted-genotype (HPV16/18/31/33/35/45/52/58) approach (73.5%, 95% CI 70.6%-76.2%), with lower screen positivity (33.7%), although sensitivity for CIN3+ was lower (77.3%, 95% CI 65.3%-86.7%).
In BF, HC2 was more sensitive for CIN2+/CIN3+ than VIA/VILI (relative sensitivity for CIN2+ = 1.72, 95% CI 1.28-2.32; CIN3+: 1.18, 95% CI 0.94-1.49). Triage of HC2-positive women with VIA/VILI reduced the number of colposcopy referrals, with a loss in sensitivity for CIN2+ (58.1%) but not for CIN3+ (84.6%). In SA, cytology at the high-grade squamous intraepithelial lesion or greater (HSIL+) threshold had the best combination of sensitivity (CIN2+: 70.1%, 95% CI 61.3%-77.9%; CIN3+: 80.8%, 95% CI 67.5%-90.4%) and specificity (81.6%, 95% CI 77.6%-85.1%). HC2 had similar sensitivity for CIN3+ (83.0%, 95% CI 70.2%-91.9%) but lower specificity compared to HSIL+ (42.7%, 95% CI 38.4%-47.1%; relative specificity = 0.57, 95% CI 0.52-0.63), resulting in almost twice as many referrals. Compared to HC2 alone, triage of HC2-positive women with HSIL+ resulted in a 40% reduction in colposcopy referrals but was associated with some loss in sensitivity. CIN2+ incidence over a median of 16 months was highest among women with a negative VIA baseline screen (2.2%, 95% CI 1.3%-3.7%) and women who were baseline double-negative with HC2 and VIA (2.1%, 95% CI 1.3%-3.5%), and lowest among women with a negative HC2 baseline screen (0.5%, 95% CI 0.1%-1.8%). Limitations of our study are that the WLHIV included may not reflect a contemporary cohort of WLHIV initiating ART in the universal ART era and that we did not evaluate the HPV tests available in study settings today. In this cohort study among WLHIV in Africa, a human papillomavirus (HPV) test targeting 14 high-risk (HR) types had higher sensitivity to detect CIN2+ than visual inspection but had low specificity, although a restricted-genotype approach targeting 8 HR types decreased the number of unnecessary colposcopy referrals. Cytology at the HSIL+ threshold had optimal performance for CIN2+/CIN3+ detection in SA. Triage of HPV-positive women with HSIL+ maintained high specificity but with some loss in sensitivity compared to HC2 alone.
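The accuracy measures reported throughout this abstract (sensitivity, specificity, positive predictive value) all derive from a standard 2×2 comparison of screening result against histology. A minimal sketch of those definitions; the counts below are invented for illustration and are not data from the study:

```python
def screening_metrics(tp, fp, fn, tn):
    """Diagnostic accuracy from a 2x2 table of screening vs. reference result.
    tp/fp = true/false positives, fn/tn = false/true negatives."""
    sensitivity = tp / (tp + fn)   # proportion of true disease detected
    specificity = tn / (tn + fp)   # proportion of disease-free correctly negative
    ppv = tp / (tp + fp)           # probability of disease given a positive screen
    return sensitivity, specificity, ppv

# Invented counts for illustration only:
sens, spec, ppv = screening_metrics(tp=80, fp=45, fn=20, tn=155)
```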
Etiology of severe invasive infections in young infants in rural settings in sub-Saharan Africa
Serious invasive infections in newborns are a major cause of death. The lack of data on etiological causes hampers progress towards reducing mortality. This study aimed to identify the pathogens responsible for such infections in young infants in sub-Saharan Africa and to describe their antibiotic resistance profiles. Between September 2016 and April 2018, we implemented an observational study in two rural sites in Burkina Faso and Tanzania, enrolling young infants aged 0-59 days with serious invasive infection. Blood samples underwent blood culture and molecular testing. In total, 634 infants with a clinical diagnosis of serious invasive infection were enrolled, and 4.2% of the infants had a positive blood culture. The most frequent pathogens identified by blood culture were Klebsiella pneumoniae and Staphylococcus aureus, followed by Escherichia coli. Gram-negative isolates were only partially susceptible to the first-line WHO-recommended treatment for neonatal sepsis at community level. A total of 18.6% of the infants were PCR-positive for at least one pathogen, and Escherichia coli and Staphylococcus aureus were the most common bacteria detected. Among the infants enrolled, 60/634 (9.5%) died. A positive blood culture, but not a positive PCR, was associated with risk of death. For most deaths, no pathogen was identified by either blood culture or molecular testing, and hence the causal agent remained unclear. Mortality was associated with low body temperature, tachycardia, respiratory symptoms, convulsions, history of difficult feeding, movement only when stimulated or reduced level of consciousness, and diarrhea and/or vomiting. While Klebsiella pneumoniae, Staphylococcus aureus and Escherichia coli were the pathogens most frequently identified in infants with clinical suspicion of serious invasive infection, most cases remained without a definite diagnosis, making more accurate diagnostic tools urgently needed.
Resistance to first-line antibiotics is an increasing challenge, even in rural Africa.
C-reactive protein, white blood cell counts and sequential interpretation of PfHRP2/pLDH for antibiotic stewardship in children under 5 years of age
Background In sub-Saharan Africa, the management of non-malaria acute febrile illnesses remains challenging in peripheral health centers without laboratory facilities. This study retrospectively assessed diagnostic approaches constructed through the integration of biological data and clinical information from an established database to assess their potential impact on antibiotic prescribing practices within an antimicrobial stewardship framework in malaria-endemic areas. Methods Data from 396 febrile children (axillary temperature ≥37.5 °C) under 5 years of age, collected between April and December 2016, were retrospectively analyzed. Diagnostic approaches integrating malaria RDT results (sequential interpretation of PfHRP2/pLDH and PfHRP2-only), C-reactive protein (CRP), white blood cell (WBC) counts, microbiological findings (blood, stool, and urine) and recorded antibiotic prescriptions were assessed. Associations between malaria diagnostic results and clinical/biological data were assessed using logistic regression, adjusted for age, sex, axillary temperature, CRP value, WBC count and microbiological findings. Sequential malaria diagnostic results with the PfHRP2/pLDH RDT were interpreted and reported as either (i) positive when the T2-pLDH line appears, regardless of the result of the T1-HRP2 line, (ii) negative when neither the T1-HRP2 nor the T2-pLDH line appears, or (iii) undetermined when only the T1-HRP2 line appears. Results Using the sequential malaria diagnostic approach, logistic regression of malaria-negative or undetermined results showed a negative correlation with axillary temperature >38.5 °C (aOR 0.37; 95% CI 0.24–0.58; p < 0.001) and CRP ≥10 mg/L (aOR 0.13; 95% CI 0.06–0.26; p < 0.001), but a positive correlation with WBC counts >14 × 10³/µL (aOR 2.57; 95% CI 1.39–4.79; p = 0.003), compared with malaria-positive results.
In children with malaria-negative or undetermined sequential results, negative malaria tests showed a positive correlation with CRP ≥10 mg/L (aOR 3.46; 95% CI 1.51–8.82; p = 0.005), but a negative correlation with WBC counts >14 × 10³/µL (aOR 0.38; 95% CI 0.15–0.93; p = 0.037), compared with malaria-undetermined results. The optimal diagnostic approach combining PfHRP2-only results with CRP values and WBC counts predicted the need for antibiotic prescription in 18.7% (74/396) of children, potentially identifying 9/23 sepsis cases. When the sequential malaria diagnostic approach was combined with CRP values and WBC counts, antibiotic need was predicted in 25.5% (101/396), potentially identifying 18/23 sepsis cases. Conclusion Integrating sequential malaria diagnostics with CRP, WBC counts, and clinical information improves differentiation of febrile illnesses and supports more targeted antibiotic use in malaria-endemic settings. Clinical trial Not applicable.
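The three-way reading rule for the combination RDT described above can be written directly as code. A small sketch; the function name is ours, but the logic follows the rule stated in the abstract (positive if the pLDH line appears, negative if neither line appears, undetermined if only the HRP2 line appears):

```python
def interpret_sequential_rdt(t1_hrp2_line, t2_pldh_line):
    """Sequential interpretation of a PfHRP2/pLDH combination RDT:
    - positive if the T2-pLDH line appears, regardless of T1-HRP2;
    - negative if neither test line appears;
    - undetermined if only the T1-HRP2 line appears."""
    if t2_pldh_line:
        return "positive"
    if t1_hrp2_line:
        return "undetermined"
    return "negative"
```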
Prevalence and factors associated with severe illness in West African children under 5 years of age detected with routine pulse oximetry in primary care
Background The Integrated Management of Childhood Illness (IMCI) guidelines are symptom-based algorithms used to identify critically ill children under five in primary health centres (PHCs) in resource-limited countries. Hypoxaemia, a life-threatening event, is clinically underdiagnosed. The Amélioration de l'Identification des détresses Respiratoires de l'Enfant/Improving Identification of Respiratory Distress in Children (AIRE) project implemented the routine use of pulse oximetry (PO) within IMCI consultations to improve the diagnosis and management of severe hypoxaemia (pulse blood oxygen saturation <90%) at PHC level in Burkina Faso, Guinea, Mali and Niger. In this context, we measured the prevalence of severe cases and their associated social and structural factors among outpatients. Methods In the 16 AIRE research PHCs (4/country), all children under five attending IMCI consultations, except those aged 2–59 months classified as simple cases without cough or breathing difficulties, were eligible for the use of PO and enrolled in a cross-sectional study with parental consent. Severe IMCI+PO cases were IMCI severe cases or those with severe hypoxaemia. Results From June 2021 to June 2022, 968 neonates (0–59 days) and 14 868 children (2–59 months) were included. Prevalence of severe IMCI+PO cases was heterogeneous between countries: 5.0% in Burkina Faso, 6.1% in Niger, 18.9% in Mali and 44.6% in Guinea. Among neonates, 21.9% (95% CI 19.3 to 24.6) were severe cases versus 12.0% (95% CI 11.4 to 12.5) in older children, of whom 3.3% versus 0.8%, respectively (p<0.001), had severe hypoxaemia. The adjusted social and structural factors associated with disease severity common to all four countries were: age <2 months or >2 years, IMCI consultation delay >2 days, and home-to-PHC travel time >30 min. Conclusion The prevalence of seriously ill children under five, including severe hypoxaemia, was high in PHCs, particularly among neonates.
The high between-country heterogeneity may be explained by differences in case definitions (Guinea) and structural factors (accessibility). Improving early access to primary care could be an actionable lever to improve the health of West African children.
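Prevalence estimates with 95% confidence intervals like those quoted above are commonly computed with the Wilson score interval, which behaves well for proportions near 0 or 1. A generic sketch, not the authors' code (the abstract does not state which interval method was used); the 212/968 example is chosen only because it is close to the 21.9% neonatal figure:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score confidence interval for a proportion (z = 1.96 for 95%)."""
    p = successes / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return centre - half, centre + half

# Illustrative: a proportion near the 21.9% of 968 neonates quoted above.
lo, hi = wilson_ci(212, 968)
```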
Efficacy and safety of RTS,S/AS01 malaria vaccine with or without a booster dose in infants and children in Africa: final results of a phase 3, individually randomised, controlled trial
The efficacy and safety of the RTS,S/AS01 candidate malaria vaccine during 18 months of follow-up have been published previously. Herein, we report the final results from the same trial, including the efficacy of a booster dose. From March 27, 2009, until Jan 31, 2011, children (age 5–17 months) and young infants (age 6–12 weeks) were enrolled at 11 centres in seven countries in sub-Saharan Africa. Participants were randomly assigned (1:1:1) at first vaccination by block randomisation with minimisation by centre to receive three doses of RTS,S/AS01 at months 0, 1, and 2 and a booster dose at month 20 (R3R group); three doses of RTS,S/AS01 and a dose of comparator vaccine at month 20 (R3C group); or a comparator vaccine at months 0, 1, 2, and 20 (C3C [control group]). Participants were followed up until Jan 31, 2014. Cases of clinical and severe malaria were captured through passive case detection. Serious adverse events (SAEs) were recorded. Analyses were by modified intention to treat and per protocol. The coprimary endpoints were the occurrence of malaria over 12 months after dose 3 in each age category. In this final analysis, we present data for the efficacy of the booster on the occurrence of malaria. Vaccine efficacy (VE) against clinical malaria was analysed by negative binomial regression and against severe malaria by relative risk reduction. This trial is registered with ClinicalTrials.gov, number NCT00866619. 8922 children and 6537 young infants were included in the modified intention-to-treat analyses. Children were followed up for a median of 48 months (IQR 39–50) and young infants for 38 months (34–41) after dose 1. 
From month 0 until study end, compared with 9585 episodes of clinical malaria that met the primary case definition in children in the C3C group, 6616 episodes occurred in the R3R group (VE 36·3%, 95% CI 31·8–40·5) and 7396 occurred in the R3C group (28·3%, 23·3–32·9); compared with 171 children who experienced at least one episode of severe malaria in the C3C group, 116 children experienced at least one episode of severe malaria in the R3R group (32·2%, 13·7 to 46·9) and 169 in the R3C group (1·1%, −23·0 to 20·5). In young infants, compared with 6170 episodes of clinical malaria that met the primary case definition in the C3C group, 4993 episodes occurred in the R3R group (VE 25·9%, 95% CI 19·9–31·5) and 5444 occurred in the R3C group (18·3%, 11·7–24·4); and compared with 116 infants who experienced at least one episode of severe malaria in the C3C group, 96 infants experienced at least one episode of severe malaria in the R3R group (17·3%, 95% CI −9·4 to 37·5) and 104 in the R3C group (10·3%, −17·9 to 31·8). In children, 1774 cases of clinical malaria were averted per 1000 children (95% CI 1387–2186) in the R3R group and 1363 per 1000 children (995–1797) in the R3C group. The numbers of cases averted per 1000 young infants were 983 (95% CI 592–1337) in the R3R group and 558 (158–926) in the R3C group. The frequency of SAEs overall was balanced between groups. However, meningitis was reported as a SAE in 22 children: 11 in the R3R group, ten in the R3C group, and one in the C3C group. The incidence of generalised convulsive seizures within 7 days of RTS,S/AS01 booster was 2·2 per 1000 doses in young infants and 2·5 per 1000 doses in children. RTS,S/AS01 prevented a substantial number of cases of clinical malaria over a 3–4 year period in young infants and children when administered with or without a booster dose. Efficacy was enhanced by the administration of a booster dose in both age categories. 
Thus, the vaccine has the potential to make a substantial contribution to malaria control when used in combination with other effective control measures, especially in areas of high transmission. GlaxoSmithKline Biologicals SA and the PATH Malaria Vaccine Initiative.
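The efficacy figures in this abstract come from negative binomial regression of episode counts over follow-up; the underlying quantity is 1 minus the incidence-rate ratio. A simplified, unadjusted sketch of that quantity (the trial's model additionally handles overdispersion and covariates, so this will not reproduce the published estimates; the example numbers are invented):

```python
def crude_vaccine_efficacy(cases_vax, person_time_vax, cases_ctl, person_time_ctl):
    """Crude vaccine efficacy = 1 - incidence rate ratio (vaccinated vs control)."""
    rate_vax = cases_vax / person_time_vax
    rate_ctl = cases_ctl / person_time_ctl
    return 1.0 - rate_vax / rate_ctl

# Invented example: 50 episodes over 1000 child-years in the vaccinated group
# versus 100 episodes over 1000 child-years in controls -> 50% crude efficacy.
ve = crude_vaccine_efficacy(50, 1000, 100, 1000)
```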
Acceptability of the routine use of pulse oximetry by healthcare workers and caregivers within primary healthcare in West Africa: mixed-methods study
Introduction To better identify severe hypoxaemia as a major risk factor for specific illnesses in children aged under 5 years, the Améliorer l’Identification des détresses Respiratoires chez l’Enfant (AIRE) project implemented routine use of pulse oximetry within the implementation of Integrated Management of Childhood Illness (IMCI) guidelines at primary health centres (PHCs) in Burkina Faso, Guinea, Mali and Niger. We aimed to measure and understand the acceptability of pulse oximeter (PO) use among healthcare workers (HCWs) and children’s families (CFs). Methods Based on an original conceptual framework, we conducted a convergent mixed-methods study to assess acceptability. We conducted repeated cross-sectional surveys among all HCWs on duty within the 202 PHCs involved in the AIRE project, using quantitative Likert-scale questionnaires. These were administered at four key time points: (1) just before the PO use training, (2) immediately after the training, (3) 6 months after the introduction of PO devices in PHCs and (4) 2 months after the completion of all AIRE project activities. We also conducted semistructured interviews with HCWs (n=100) and CFs (n=59). Quantitative data were analysed using descriptive statistics and multivariable ordinal logistic regression. Qualitative data were thematically analysed with NVivo, and both were interpreted in light of the conceptual framework to explore convergence and divergence across acceptability dimensions. Results From March 2021 to December 2022, 486, 537, 538 and 476 HCWs completed the four acceptability surveys, respectively. Before the training, 31% of HCWs had mixed feelings about PO use, 46% found it somewhat acceptable and 23% strongly acceptable. At the end of the project, the figures were 15%, 34% and 51%, respectively. PO training was consistently associated with greater HCW acceptability. HCWs reported many advantages of using PO, such as more accurate diagnosis and a boost in their confidence in sick-child management.
Nevertheless, challenges reported by HCWs included a perceived increase in workload and consultation time, as well as difficulties in referring children to hospital. CFs did not necessarily understand the device’s purpose, but their opinions of the technology were generally positive. Conclusion PO use, integrated into IMCI consultations, was reported to be accepted by HCWs and CFs, although challenges to sustainable implementation remain.