Catalogue Search | MBRL
27 result(s) for "Seegers, Henri"
Rapid identification and quantification of Campylobacter coli and Campylobacter jejuni by real-time PCR in pure cultures and in complex samples
by Denis, Martine; Seegers, Henri; Belloc, Catherine
in Animal Feed - microbiology; Animals; Bacterial Load - methods
2011
Campylobacter spp., especially Campylobacter jejuni (C. jejuni) and Campylobacter coli (C. coli), are recognized as the leading human foodborne pathogens in developed countries. Livestock animals carrying Campylobacter pose an important risk for human contamination. Pigs are known to be frequently colonized with Campylobacter, especially C. coli, and to excrete high numbers of this pathogen in their faeces. Molecular tools, notably real-time PCR, provide an effective, rapid, and sensitive alternative to culture-based methods for the detection of C. coli and C. jejuni in various substrates. In order to serve as a diagnostic tool supporting Campylobacter epidemiology, we developed a quantitative real-time PCR method for species-specific detection and quantification of C. coli and C. jejuni directly in faecal, feed, and environmental samples.
With a sensitivity of 10 genome copies and a linear range of seven to eight orders of magnitude, the C. coli and C. jejuni real-time PCR assays allowed a precise quantification of purified DNA from C. coli and C. jejuni. The assays were highly specific and showed a 6-log-linear dynamic range of quantification with a quantitative detection limit of approximately 2.5 × 10² CFU/g of faeces, 1.3 × 10² CFU/g of feed, and 1.0 × 10³ CFU/m² for the environmental samples. Compared to the results obtained by culture, both C. coli and C. jejuni real-time PCR assays exhibited a specificity of 96.2% with a kappa of 0.94 and 0.89 respectively. For faecal samples of experimentally infected pigs, the coefficients of correlation between the C. coli or C. jejuni real-time PCR assay and culture enumeration were R² = 0.90 and R² = 0.93 respectively.
The C. coli and C. jejuni real-time quantitative PCR assays developed in this study provide a method capable of directly detecting and quantifying C. coli and C. jejuni in faeces, feed, and environmental samples. These assays represent a new diagnostic tool for studying the epidemiology of Campylobacter by, for instance, investigating the carriage and excretion of C. coli and C. jejuni by pigs from conventional herds.
Journal Article
Prevention of Coxiella burnetii shedding in infected dairy herds using a phase I C. burnetii inactivated vaccine
2008
The main objective of this study was to assess the efficacy of a monovalent inactivated vaccine containing phase I Coxiella burnetii, compared with a placebo, in preventing Coxiella shedding in susceptible dairy cows within infected herds. A total of 336 dairy cows and heifers, from six spontaneously infected herds, were followed over a 1-year period. Before treatment (i.e. vaccination or placebo), the C. burnetii infection status of the cows was determined on the basis of PCR results on milk, vaginal mucus and faeces, and of serological analyses performed 2 weeks apart. A cow was considered susceptible (i.e. non-infected) when all results were negative, and infected otherwise. Treatments were allocated randomly within pregnant and non-pregnant cows. After treatment (D0), the animals were subjected to systematic sampling (milk, vaginal mucus and faeces) on D90, D180, D270 and D360 to detect putative shedding. In addition, the same samples were taken within 15 days after calving. An animal was considered a shedder at a given time t if, at t, it was found PCR-positive in at least one of the samples (milk, vaginal mucus or faeces). The effect of the treatment on the probability of an initially susceptible animal becoming a shedder was assessed using survival analysis techniques (Cox regression model). Almost all heifers were detected as susceptible before treatment. When vaccinated while not pregnant, an animal had a five times lower probability of becoming a shedder than an animal receiving the placebo. An animal vaccinated while pregnant had a probability of becoming a shedder similar to that of an animal receiving the placebo. There was no significant farm effect in this multicentric trial. These results highlight the value of implementing vaccination, if possible, in non-infected herds. In infected herds, vaccination should be implemented in nearly all presumably susceptible animals, i.e. at least the heifers.
The vaccination of the dairy cows should be performed when the within-herd seroprevalence is low, i.e. in herds where the infection has not spread widely yet.
Journal Article
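The treatment effect above was estimated with a Cox regression model. As a lighter illustration of the underlying survival-analysis machinery (time to first shedding, with censoring at the end of follow-up), here is a plain-Python Kaplan-Meier estimator; the function and any data fed to it are illustrative, not the study's:

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimate.
    times: follow-up time for each animal (e.g. days to first shedding).
    events: 1 if the animal became a shedder at that time, 0 if censored.
    Returns the survival curve as a list of (time, probability) steps."""
    n = len(times)
    order = sorted(range(n), key=lambda i: times[i])  # subjects by time
    at_risk = n
    surv = 1.0
    curve = []
    i = 0
    while i < n:
        t = times[order[i]]
        # Count events at this exact time (censored animals stay at risk here).
        events_at_t = 0
        k = i
        while k < n and times[order[k]] == t:
            events_at_t += events[order[k]]
            k += 1
        if events_at_t:
            surv *= 1 - events_at_t / at_risk
            curve.append((t, surv))
        at_risk -= k - i
        i = k
    return curve
```

A Cox model goes further by relating the hazard underlying this curve to covariates (here, vaccination and pregnancy status), but the risk-set bookkeeping is the same.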
How Much Can Diptera-Borne Viruses Persist over Unfavourable Seasons?
by Ezanno, Pauline; Balenghien, Thomas; Charron, Maud V. P.
in Analysis; Animals; Biodiversity and Ecology
2013
Diptera are vectors of major human and animal pathogens worldwide, such as dengue, West Nile or bluetongue viruses. In seasonal environments, vector-borne disease occurrence varies with the seasonal variations of vector abundance. We aimed to understand how diptera-borne viruses can persist for years under seasonal climates even though vectors overwinter, which should stop pathogen transmission during winter. Modelling is a relevant integrative approach for investigating the large panel of persistence mechanisms evidenced through experimental and observational studies on specific biological systems. Inter-seasonal persistence of the virus may occur in hosts, due to viremia duration, chronic infection or vertical transmission; in vector resistance stages; or through a low level of continuous transmission in winter. Using a generic stochastic modelling framework, we determined the parameter ranges under which virus persistence could occur via these different mechanisms. The parameter ranges vary according to the host demographic regime: for a high host population turnover, persistence increases with the mechanism parameter, whereas for a low turnover, persistence is maximal over an optimal range of the parameter. Persistence in hosts, due to a long viremia duration in a few hosts or to vertical transmission, is an effective strategy for the virus to overwinter. Unexpectedly, a low level of continuous transmission during winter does not guarantee persistence, which barely occurs for a low turnover of the susceptible population. We propose a generic framework adaptable to most diptera-borne diseases. This framework allows one to assess the plausibility of each persistence mechanism in real epidemiological situations and to compare the range of parameter values theoretically allowing persistence with the range of values determined experimentally.
Journal Article
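One of the persistence mechanisms above (long viremia in a few hosts) can be made concrete with a Monte Carlo estimate of the probability that at least one infected host is still viremic when transmission resumes in spring. This is a deliberately minimal sketch of a single mechanism, not the authors' model; all parameter values are hypothetical:

```python
import random

def persistence_prob(n_hosts, winter_days, mean_viremia_days,
                     n_runs=10_000, seed=1):
    """Monte Carlo estimate of the probability that at least one of
    n_hosts infected at the start of winter is still viremic after
    winter_days days, assuming a daily clearance probability of
    1/mean_viremia_days (exponentially distributed viremia)."""
    rng = random.Random(seed)
    # Per-host probability of staying viremic through the whole winter.
    q = (1 - 1 / mean_viremia_days) ** winter_days
    hits = 0
    for _ in range(n_runs):
        # The virus persists if any host survives the winter viremic.
        if any(rng.random() < q for _ in range(n_hosts)):
            hits += 1
    return hits / n_runs
```

With, say, 100 infected hosts, a 120-day winter and a 30-day mean viremia, each host has under a 2% chance of bridging the gap, yet the virus persists in roughly four runs out of five, illustrating why long viremia in even a few hosts is an effective overwintering strategy.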
Using Animal Performance Data to Evidence the Under-Reporting of Case Herds during an Epizootic: Application to an Outbreak of Bluetongue in Cattle
by Seegers, Henri; Fourichon, Christine; Monestiez, Pascal
in Animals; Biology; Biology and Life Sciences
2014
Following the emergence of Bluetongue virus serotype 8 (BTV-8) in France in 2006, a surveillance system (both passive and active) was implemented to detect the epizootic wave early and follow its progression. This system did not allow a precise estimation of the extent of the epizootic. Infection by BTV-8 is associated with a decrease in fertility. The objective of this study was to evaluate whether a decrease in fertility can be used to evidence the under-reporting of cases during an epizootic, and to quantify to what extent non-reported cases contribute to the total burden of the epizootic. Cow fertility in herds in the outbreak area (reported or not) was monitored around the date of clinical signs. A geostatistical interpolation method was used to estimate a date of clinical signs for non-reported herds. This interpolation was based on the spatiotemporal dynamics of confirmed case herds reported in 2007. Decreases in fertility were evidenced for both types of herds around the date of clinical signs. In non-reported herds, the decrease in fertility was large (60% of the effect in reported herds), suggesting that some of these herds had been infected by the virus during 2007. Production losses in non-reported infected herds could thus account for an important part of the total burden of the epizootic. Overall, the results indicate that performance data can be used to evidence under-reporting during an epizootic. This approach could be generalized to pathogens that affect cattle performance, including zoonotic agents such as Coxiella burnetii or Rift Valley fever virus.
Journal Article
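The interpolation step can be illustrated with a much simpler stand-in for the geostatistical (kriging-type) method used in the study: inverse-distance weighting of the clinical-sign dates of reported case herds. The `idw_estimate` helper, its coordinates and its dates are hypothetical:

```python
def idw_estimate(x, y, known, power=2):
    """Inverse-distance-weighted estimate of the date of clinical signs
    at herd location (x, y), from confirmed case herds.
    known: list of (x, y, day) tuples for reported herds."""
    num = den = 0.0
    for kx, ky, day in known:
        d2 = (x - kx) ** 2 + (y - ky) ** 2
        if d2 == 0:
            return day  # coincident with a reported herd
        w = 1 / d2 ** (power / 2)  # closer herds weigh more
        num += w * day
        den += w
    return num / den
```

Kriging additionally models the spatial covariance of the epizootic wave, but the idea is the same: a non-reported herd inherits a plausible clinical-signs date from the reported herds around it, around which its fertility can then be examined.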
Visually undetected fever episodes in newly received beef bulls at a fattening operation: Occurrence, duration, and impact on performance
by BAREILLE, N; ASSIE, S; LEHEBEL, A
in Agricultural sciences; Animal Husbandry; Animal production studies
2011
Monitoring the body temperature of newly received cattle allows for the identification of fever episodes not visually detected by feedlot personnel (FENO). Information concerning the occurrence, duration, and impact on performance of these FENO is not available in the literature. Such information is crucial to assess the potential benefit of identifying and treating FENO. Therefore, the objectives of this study were to describe the occurrence and duration of FENO and to evaluate their impact on ADG. One hundred twelve beef bulls (initial BW = 346 ± 36 kg) were studied for 40 d after arrival at 3 French fattening operations. On d 1, each animal was orally administered a reticulo-rumen bolus, which allowed continuous measurement and recording of reticulo-rumen temperature. Animals were weighed on d 1 and 40. Bulls were observed twice daily by personnel for visual signs of apparent disease. Bulls that appeared ill, had a rectal temperature ≥39.7°C, and demonstrated symptoms consistent with bovine respiratory disease (BRD) were treated with antibiotics. After d 40, data obtained from the boluses were retrospectively analyzed using a cumulative sum test to detect significant increases in reticulo-rumen temperature, considered as fever episodes. Numerous fever episodes (n = 449) were retrospectively detected in 110 bulls. Of these 449 fever episodes, 74% were not associated with any visually detected clinical signs of disease and thus were identified as FENO. These FENO were often transitory (75% lasted less than 47 h); however, 25% lasted from 47 to 263 h. Of the 112 bulls, 88 were treated for BRD, with 20 and 7 animals treated 2 and 3 times, respectively. In treated animals, fever episodes began 4 to 177 h (mean = 50 h) before BRD treatment. The duration of FENO was associated (P = 0.002) with a lesser ADG (d 1 to 40): −33 g/d per day of FENO duration.
Our results demonstrated that FENO occurred frequently in bulls during the first weeks after entrance into a fattening operation and can last up to 11 d. The impact of FENO observed on ADG in this study indicated a potential benefit of treating affected animals, specifically those with FENO of long duration. However, further research is needed to determine the medical and economic relevance of such treatment.
Journal Article
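The retrospective detection step above relies on a one-sided cumulative-sum (CUSUM) test, which flags sustained deviations above a reference temperature while ignoring isolated noise. A minimal sketch; the target, slack and threshold values are illustrative tuning choices, not those of the study:

```python
def cusum_alarms(temps, target=39.0, slack=0.2, threshold=1.5):
    """One-sided CUSUM over a series of reticulo-rumen temperatures (°C).
    Accumulates excursions above target + slack; an alarm fires when the
    cumulative excess crosses threshold. Returns the alarm indices."""
    s = 0.0
    alarms = []
    for i, t in enumerate(temps):
        # Deviations below target + slack drain the statistic toward zero.
        s = max(0.0, s + (t - target - slack))
        if s > threshold:
            alarms.append(i)
            s = 0.0  # restart accumulation after an alarm
    return alarms
```

The slack term is what makes brief, small excursions invisible while a fever that persists for several readings accumulates past the threshold; this is the trade-off that separates transitory FENO from the longer episodes that depressed ADG.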
Within-herd biosecurity and Salmonella seroprevalence in slaughter pigs: A simulation study
by Ezanno, Pauline; Hoch, Thierry; ONIRIS-INRA - Umr 1300 Bioagression, Epidémiologie et Analyse de Risque ; École nationale vétérinaire, agroalimentaire et de l'alimentation Nantes-Atlantique (ONIRIS)
in Abattoirs; age at slaughter; Agricultural sciences
2011
In Europe, on-farm biosecurity measures, involving a strict all-in/all-out batch-management system and decontamination of the rearing rooms between consecutive batches, are recommended to control Salmonella infection in growing pigs. However, implementation of these measures is often relaxed under common farming conditions. Therefore, this study was conducted to assess the relative contributions of the batch-management system and room decontamination efficacy to Salmonella seroprevalence for different growing rates and subsequent slaughter ages of pigs. Because the impact of these factors cannot be easily evaluated by an observational approach in commercial farms, a stochastic simulation model representing the population dynamics, herd management, and Salmonella infection within a farrow-to-finish pig herd was used. Realistic levels were set for each factor under study (3 for batch-management system and slaughter age; 4 for room decontamination) to generate 54 simulation scenarios. Salmonella shedding prevalence in groups of slaughter pigs was then compared. A sensitivity analysis was performed to rank the impacts of the 3 factors on the output. The batch-management system had little effect. In contrast, room decontamination efficacy had the greatest impact on Salmonella prevalence in pigs at slaughter. A drop in decontamination efficacy from 100% to 50%, with a strict all-in/all-out batch-management system and for all slaughter ages tested, noticeably increased (P < 0.001) the prevalence and almost doubled it for the reference slaughter age. Our results suggest that the control of Salmonella in pig herds should primarily focus on room decontamination efficacy. Provided that a good level of room decontamination is ensured, some flexibility in batch management, in terms of pig mixing, would be acceptable to limit the number of underweight pigs delivered to the slaughterhouse.
Journal Article
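The simulation scenarios above come from fully crossing the levels of the three factors. A sketch of how such a scenario grid can be generated with `itertools.product`; the factor names and levels shown are hypothetical and do not reproduce the study's actual levels or scenario count:

```python
from itertools import product

# Hypothetical factor levels echoing the factorial design of the study
# (the actual levels behind its 54 scenarios are not reproduced here).
batch_management = ["strict all-in/all-out", "partial mixing", "continuous flow"]
slaughter_age_d = [170, 182, 194]             # days, illustrative
decontamination = [1.00, 0.90, 0.75, 0.50]    # room decontamination efficacy

# Full factorial crossing: one simulation scenario per combination of levels.
scenarios = [
    {"batch": b, "age_d": a, "decon": d}
    for b, a, d in product(batch_management, slaughter_age_d, decontamination)
]
```

Running the stochastic herd model once (or many replicates) per entry of `scenarios` is what allows the sensitivity analysis to rank the factors by their effect on slaughter-pig prevalence.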
Review and critical discussion of assumptions and modelling options to study the spread of the bovine viral diarrhoea virus (BVDV) within a cattle herd
by SEEGERS, H.; VIET, A.-F.; FOURICHON, C.
in Animals; Biological and medical sciences; Bovine viral diarrhea virus
2007
The relevance of epidemiological models depends on assumptions about population structure and dynamics, about the biology of the host–parasite interaction, and on the mathematical modelling choices. In this paper we reviewed published models of bovine viral diarrhoea virus (BVDV) spread within a herd. Modelling options and assumptions on herd dynamics and BVDV transmission were discussed. A cattle herd is a population with a controlled size. Animals are separated into subgroups according to their age or physiological status, inducing heterogeneity in horizontal transmission. Model complexity results from: (1) horizontal and vertical virus transmission, (2) the birth of persistently infected animals, (3) excretion by both transiently and persistently infected animals. Areas where there was a lack of knowledge were identified. Assumptions on the force of infection used to model horizontal virus transmission were presented and discussed. We proposed possible ways of improving models (e.g. force of infection, validation) and essential model features for further BVDV models.
Journal Article
Modelling batch farrowing management within a farrow-to-finish pig herd: influence of management on contact structure and pig delivery to the slaughterhouse
by Touzeau, S.; Fourichon, C.; Seegers, H.
in Agricultural sciences; animal growth; Animal production studies
2008
Pathogen spread within pig host populations can vary depending on within-herd interactions among pigs, also called the contact structure. The recommended batch farrowing management, in which groups of sows of equal size (batches) are mated at fixed intervals, leads to an all-in/all-out management of pigs in which animals in different batches have no contact. To maintain profitable pig delivery, producers have to deliver groups of pigs at a given weight, which sometimes requires herd-management adaptations. However, the adaptations producers use to avoid delivering pigs below slaughter weight (out-of-range pigs) increase contact between animals from different batches. To study the influence of herd management on contact structure and on pig delivery, a stochastic mathematical model representing population dynamics within a farrow-to-finish herd was developed. Sixteen management systems were represented, combining (or not) the all-in/all-out management system with producers' decisions: batch mixing, use of an extra room, suppression of the drying period, and sale of post-weaning batches. Two types of contact were considered: via the animals themselves, when batch mixing occurred; and via the room, when decontamination was incomplete. The impact of producers' decisions on contact structure and on pig delivery differed radically between normal and slow pig growth (i.e. mean age at slaughter weight increased by 20%). When pig growth was normal, the all-in/all-out management prevented contact both via the animals and via the room, but resulted in 9% of pigs delivered out of range. The use of an extra room or batch mixing decreased this percentage, the latter resulting in very frequent contact between batches via the animals. When pig growth was slow, the all-in/all-out management led to a very high percentage of pigs delivered out of range (almost 80%).
The suppression of the drying period at the end of the finishing period and the sale of post-weaning batches induced a significant decrease in this percentage (down to 2% to 20%), the latter also reducing the percentage of batches in contact via the room (40% instead of 80%). This pig herd model helped clarify the trade-off for producers between implementing internal biosecurity and maintaining profitable pig delivery. Our results show that there was no single optimal system and that efficient producers' decisions (for biosecurity and delivery) may differ depending on pig growth.
Journal Article
Influence of the transmission function on a simulated pathogen spread within a population
by VIET, A.-F.; HOCH, T.; FOURICHON, C.
in Biological and medical sciences; Coefficients; Communicable Diseases - transmission
2008
The mathematical function for the horizontal transmission of a pathogen is a driving force of epidemiological models. This paper aims to study the influence of different transmission functions on a simulated pathogen spread. These functions were taken from the literature and their biological relevance is discussed. A theoretical SIR (Susceptible–Infectious–Recovered) model was used to study the effect of the chosen function on simulated results. With a constant total population size, different equilibrium values for the number of infectious individuals (NI) were reached, depending on the transmission function used. With an increasing population size, the transmission functions behaved either as density-dependent (DD) functions, for which an equilibrium was obtained, or as frequency-dependent (FD) functions, with an exponential increase in NI. An analytical study corroborated the simulated results. In conclusion, the choice between the different transmission functions, particularly between DD and FD, must be carefully considered for a varying population size.
Journal Article
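The DD/FD contrast discussed above can be made concrete with a single discrete-time SIR step in which only the force-of-infection term changes. This is a sketch of the general contrast, not the authors' exact model; all parameter values are illustrative:

```python
def sir_step(S, I, R, beta, gamma, N_new, frequency_dependent):
    """One discrete time step of a deterministic SIR model with
    recruitment of N_new susceptibles, contrasting density-dependent
    (beta*S*I) and frequency-dependent (beta*S*I/N) transmission."""
    N = S + I + R
    # Force of infection: per-susceptible rate of becoming infected.
    foi = beta * I / N if frequency_dependent else beta * I
    new_inf = foi * S
    return S - new_inf + N_new, I + new_inf - gamma * I, R + gamma * I
```

If beta is calibrated so that the two forms coincide at some reference size (e.g. 0.5 for FD and 0.0005 for DD at N = 1000), they produce identical steps at that size; but as recruitment grows the population, the FD force of infection depends only on the prevalence I/N, while the DD force grows with the absolute number of infectious individuals, which is what separates the equilibrium and exponential-growth regimes described above.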