272 results for "Weller, Daniel"
Listeria monocytogenes Prevalence Varies More within Fields Than between Fields or over Time on Conventionally Farmed New York Produce Fields
Past studies have shown that the on-farm distribution of Listeria monocytogenes is affected by environmental factors (e.g., weather). However, most studies were conducted at large scales (e.g., across farms), whereas few studies examined drivers of L. monocytogenes prevalence at smaller scales (e.g., within a single field). This study was performed to address this knowledge gap by (i) tracking L. monocytogenes distribution in two fields on one farm over a growing season and (ii) identifying factors associated with L. monocytogenes isolation from drag swab, soil, and agricultural water samples. Overall, L. monocytogenes was detected in 78% (21 of 27), 19% (7 of 36), and 8% (37 of 486) of water, drag swab, and soil samples, respectively. All isolates were characterized by pulsed-field gel electrophoresis. Of the 43 types identified, 14 were isolated on multiple sampling visits and/or from multiple sample types, indicating persistence in or repeated introduction into the farm environment during the study. Our findings also suggest that L. monocytogenes prevalence, even at the small spatial scale studied here, (i) was not uniform and (ii) varied more within fields than between fields or over time. This is illustrated by plot (in-field variation), field (between-field variation), and sampling visit (time) accounting for 18%, 2%, and 3% of the variance in the odds of isolating L. monocytogenes, respectively. Moreover, according to random forest analysis, water-related factors were among the top-ranked factors associated with L. monocytogenes isolation from all sample types. For example, the likelihood of isolating L. monocytogenes from drag and soil samples increased monotonically as rainfall increased. Overall, findings from this single-farm study suggest that mitigation strategies for L. monocytogenes in produce fields should focus on water-associated risk factors (e.g., rain and distance to water) and be tailored to specific high-risk in-field areas.
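As a rough illustration of the random forest analysis mentioned in this abstract (this is a minimal sketch, not the authors' code), factor importance for L. monocytogenes isolation could be ranked as below. The feature names and the synthetic stand-in data are hypothetical.

    # Sketch only: rank hypothetical sample-level factors by random forest importance.
    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier

    rng = np.random.default_rng(0)
    n = 500
    samples = pd.DataFrame({
        "rainfall_mm": rng.gamma(2.0, 5.0, n),        # rain before sampling (stand-in)
        "dist_to_water_m": rng.uniform(0, 300, n),    # distance to surface water (stand-in)
        "soil_moisture": rng.uniform(0.1, 0.6, n),
    })
    # Stand-in outcome: isolation more likely with more rain and closer water
    p = 1 / (1 + np.exp(-(0.05 * samples["rainfall_mm"] - 0.01 * samples["dist_to_water_m"])))
    samples["lm_isolated"] = rng.binomial(1, p)

    rf = RandomForestClassifier(n_estimators=500, random_state=0)
    rf.fit(samples.drop(columns="lm_isolated"), samples["lm_isolated"])

    # Rank factors by importance, analogous to the "top-ranked factors" in the abstract
    importance = pd.Series(rf.feature_importances_, index=samples.columns[:-1])
    print(importance.sort_values(ascending=False))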
Simulated changes in extent of Georgian Bay low-marsh habitat under multiple lake levels
The extent of coastal wetlands in Georgian Bay is controlled primarily by the water level of Lake Huron, which directly affects the amount of critical habitat available for fish and wildlife communities. Lake levels have historically fluctuated by nearly 2 m, and that range could increase in the future. This prompted us to investigate how the quantity and quality of wetland habitat in Georgian Bay may be affected by different lake-level scenarios. The extent of low-marsh habitat was modeled with a generalized linear model that used hydrogeomorphic features (i.e. depth, slope, and exposure) as predictors. We simulated lake levels between 175.5 m and 177.5 m at 0.5-m increments, and found that the total area of low marsh peaked at 176.0 m (7113 ha) and declined sharply as lake levels increased or decreased. In contrast, low-marsh volume was highest at 176.5 m (3.84 × 10⁷ m³) but remained relatively stable across all modeled lake levels. We derived an average elevation profile for low-marsh habitat across the study area that showed a shallow “step” between 175.5 and 176.0 m, flanked by steeper upslope and downslope sections. At historically low lake levels, low-marsh habitat would have been dominated by shallow water (< 0.5 m), whereas at higher lake levels it would have been dominated by deeper (0.5–2.0 m) water. The geomorphology at low lake levels (i.e. 176.0 m) appears to favour large areas of shallow habitat at the expense of deeper habitats that could have supported more structurally complex, submersed aquatic vegetation.
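A minimal sketch of the modelling approach this abstract describes: a GLM of low-marsh presence on depth, slope, and exposure, re-applied after shifting depths to represent a different lake level. The data, column names, and binomial family are assumptions (the abstract does not state the GLM family), so this is illustrative only, not the authors' model.

    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)
    n = 2000
    cells = pd.DataFrame({
        "depth": rng.uniform(-1.0, 3.0, n),     # m below current lake level (stand-in)
        "slope": rng.uniform(0.0, 10.0, n),     # percent (stand-in)
        "exposure": rng.uniform(0.0, 1.0, n),   # relative wave exposure (stand-in)
    })
    # Stand-in outcome: low marsh favoured in shallow, gently sloping, sheltered cells
    logit = 2.0 - 2.0 * abs(cells["depth"] - 0.3) - 0.2 * cells["slope"] - 1.0 * cells["exposure"]
    cells["low_marsh"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    # Binomial GLM with hydrogeomorphic predictors (assumed family)
    model = smf.glm("low_marsh ~ depth + slope + exposure",
                    data=cells, family=sm.families.Binomial()).fit()

    # Raise the lake by 0.5 m (every cell becomes 0.5 m deeper) and re-predict
    scenario = cells.assign(depth=cells["depth"] + 0.5)
    print("predicted low-marsh cells now:", int((model.predict(cells) > 0.5).sum()))
    print("predicted low-marsh cells at +0.5 m:", int((model.predict(scenario) > 0.5).sum()))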
Spatially resolved spectroscopy of alkali metal vapour diffusing inside hollow-core photonic crystal fibres
We present a new type of compact, all-glass vapour cell integrating hollow-core photonic crystal fibres. The absence of metals, as in a traditional vacuum chamber, and the much more compact geometry allow for fast and homogeneous heating. As a consequence, we can fill the fibres on much shorter timescales, ranging from minutes to hours. Additionally, the all-glass design ensures optical access along the fibre. This allows live monitoring of the diffusion of rubidium atoms inside the hollow core by measuring the frequency-dependent fluorescence from the atoms. The atomic density is numerically retrieved using a five-level system of Bloch equations.
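The paper retrieves the atomic density from a five-level Bloch-equation model; as a much-reduced illustration only, a two-level steady-state approximation already gives the shape of the frequency-dependent fluorescence. All parameter values below are placeholders, not the paper's.

    # Two-level steady-state fluorescence vs. laser detuning (illustrative sketch)
    import numpy as np

    gamma = 2 * np.pi * 6.07e6      # Rb D2 natural linewidth (rad/s)
    omega_rabi = 0.3 * gamma        # assumed Rabi frequency
    detuning = np.linspace(-10, 10, 401) * gamma

    # Steady-state excited-state population of a driven, damped two-level atom
    rho_ee = (omega_rabi**2 / 4) / (detuning**2 + gamma**2 / 4 + omega_rabi**2 / 2)

    # Scattering (fluorescence) rate per atom; the measured signal scales with the
    # atomic density, which is what the experiment retrieves from the spectrum.
    fluorescence = gamma * rho_ee
    print("peak scattering rate (photons/s per atom):", fluorescence.max())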
Nationwide genomic atlas of soil-dwelling Listeria reveals effects of selection and population ecology on pangenome evolution
Natural bacterial populations can display enormous genomic diversity, primarily in the form of gene content variation caused by the frequent exchange of DNA with the local environment. However, the ecological drivers of genomic variability and the role of selection remain controversial. Here, we address this gap by developing a nationwide atlas of 1,854 Listeria isolates, collected systematically from soils across the contiguous United States. We found that Listeria was present across a wide range of environmental parameters, being mainly controlled by soil moisture, molybdenum and salinity concentrations. Whole-genome data from 594 representative strains allowed us to decompose Listeria diversity into 12 phylogroups, each with large differences in habitat breadth and endemism. ‘Cosmopolitan’ phylogroups, prevalent across many different habitats, had more open pangenomes and displayed weaker linkage disequilibrium, reflecting higher rates of gene gain and loss, and allele exchange than phylogroups with narrow habitat ranges. Cosmopolitan phylogroups also had a large fraction of genes affected by positive selection. The effect of positive selection was more pronounced in the phylogroup-specific core genome, suggesting that lineage-specific core genes are important drivers of adaptation. These results indicate that genome flexibility and recombination are the consequence of selection to survive in variable environments. A population genomic analysis of 1,854 Listeria soil isolates collected across the contiguous United States identifies geographically prevalent phylogroups with increased pangenome openness and recombination, as a result of adaptation to variable environments.
County-Level COVID-19 Vaccination Coverage and Social Vulnerability — United States, December 14, 2020–March 1, 2021
The U.S. COVID-19 vaccination program began in December 2020, and ensuring equitable COVID-19 vaccine access remains a national priority.* COVID-19 has disproportionately affected racial/ethnic minority groups and those who are economically and socially disadvantaged (1,2). Thus, achieving not just vaccine equality (i.e., allocation of vaccine supply proportional to population across jurisdictions) but equity (i.e., preferential access and administration for those who have been most affected by COVID-19 disease) is an important goal. The CDC social vulnerability index (SVI) uses 15 indicators grouped into four themes that comprise an overall SVI measure, resulting in 20 metrics, each of which has national and state-specific county rankings. The 20 metric-specific rankings were each divided into tertiles, from lowest to highest, to categorize counties as low, moderate, or high social vulnerability counties. These tertiles were combined with vaccine administration data for 49,264,338 U.S. residents in 49 states and the District of Columbia (DC) who received at least one COVID-19 vaccine dose during December 14, 2020–March 1, 2021. Nationally, for the overall SVI measure, vaccination coverage was higher in low social vulnerability counties (15.8%) than in high social vulnerability counties (13.9%), with the largest coverage disparity in the socioeconomic status theme (2.5 percentage points higher coverage in low than in high vulnerability counties). Wide state variations in equity across SVI metrics were found. Whereas in the majority of states vaccination coverage was higher in low vulnerability counties, some states had equitable coverage at the county level. CDC, state, and local jurisdictions should continue to monitor vaccination coverage by SVI metrics to focus public health interventions on achieving equitable coverage with COVID-19 vaccine.
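As a sketch of the county-level comparison described above (not CDC's code), coverage by SVI tertile reduces to a population-weighted ratio per tertile. The column names and synthetic stand-in data are hypothetical.

    import numpy as np
    import pandas as pd

    rng = np.random.default_rng(2)
    n = 3000
    counties = pd.DataFrame({
        "svi_overall_rank": rng.uniform(0, 1, n),            # higher = more vulnerable (stand-in)
        "population": rng.integers(1_000, 1_000_000, n),
    })
    counties["persons_vaccinated"] = (counties["population"]
                                      * rng.uniform(0.10, 0.20, n)).astype(int)

    # Divide counties into low / moderate / high vulnerability tertiles
    counties["svi_tertile"] = pd.qcut(counties["svi_overall_rank"], 3,
                                      labels=["low", "moderate", "high"])

    grouped = counties.groupby("svi_tertile", observed=True)
    coverage = grouped["persons_vaccinated"].sum() / grouped["population"].sum() * 100
    print(coverage)                     # population-weighted one-dose coverage per tertile
    print("low-minus-high disparity (pct points):", coverage["low"] - coverage["high"])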
Patterns in COVID-19 Vaccination Coverage, by Social Vulnerability and Urbanicity — United States, December 14, 2020–May 1, 2021
Disparities in vaccination coverage by social vulnerability, defined as social and structural factors associated with adverse health outcomes, were noted during the first 2.5 months of the U.S. COVID-19 vaccination campaign, which began during mid-December 2020 (1). As vaccine eligibility and availability continue to expand, assuring equitable coverage for disproportionately affected communities remains a priority. CDC examined COVID-19 vaccine administration and 2018 CDC social vulnerability index (SVI) data to ascertain whether inequities in COVID-19 vaccination coverage with respect to county-level SVI have persisted, overall and by urbanicity. Vaccination coverage was defined as the number of persons aged ≥18 years (adults) who had received ≥1 dose of any Food and Drug Administration (FDA)-authorized COVID-19 vaccine divided by the total adult population in a specified SVI category.† SVI was examined overall and by its four themes (socioeconomic status, household composition and disability, racial/ethnic minority status and language, and housing type and transportation). Counties were categorized into SVI quartiles, in which quartile 1 (Q1) represented the lowest level of vulnerability and quartile 4 (Q4), the highest. Trends in vaccination coverage were assessed by SVI quartile and urbanicity, which was categorized as large central metropolitan, large fringe metropolitan (areas surrounding large cities, e.g., suburban), medium and small metropolitan, and nonmetropolitan counties.§ During December 14, 2020–May 1, 2021, disparities in vaccination coverage by SVI increased, especially in large fringe metropolitan (e.g., suburban) and nonmetropolitan counties. By May 1, 2021, vaccination coverage was lower among adults living in counties with the highest overall SVI; differences were most pronounced in large fringe metropolitan (Q4 coverage = 45.0% versus Q1 coverage = 61.7%) and nonmetropolitan (Q4 = 40.6% versus Q1 = 52.9%) counties. Vaccination coverage disparities were largest for two SVI themes: socioeconomic status (Q4 = 44.3% versus Q1 = 61.0%) and household composition and disability (Q4 = 42.0% versus Q1 = 60.1%). Outreach efforts, including expanding public health messaging tailored to local populations and increasing vaccination access, could help increase vaccination coverage in high-SVI counties.
Heat flows solubilize apatite to boost phosphate availability for prebiotic chemistry
Phosphorus is an essential building block of life, likely since its beginning. Despite this importance for prebiotic chemistry, phosphorus was scarce in Earth’s rock record and mainly bound in poorly soluble minerals, with the calcium-phosphate mineral apatite as a key example. While specific chemical boundary conditions have been considered to address this so-called phosphate problem, a fundamental process that solubilizes and enriches phosphate from geological sources remains elusive. Here, we show that ubiquitous heat flows through rock cracks can liberate phosphate from apatite by the selective removal of calcium. Phosphate’s strong thermophoresis not only achieves its 100-fold up-concentration in aqueous solution but also boosts its solubility by two orders of magnitude. We show that the heat-flow-solubilized phosphate can feed the synthesis of trimetaphosphate, increasing the conversion 260-fold compared to thermal equilibrium. Heat flows thus enhance solubility to unlock apatites as a phosphate source for prebiotic chemistry, providing a key to early life’s phosphate problem. Heat flows through rock cracks drive the permanent solubilization of phosphate from apatite and its accumulation from phosphate-bearing geomaterials, offering a non-equilibrium pathway to approach the phosphate problem at the emergence of life.
Simple motion correction strategy reduces respiratory-induced motion artifacts for k-t accelerated and compressed-sensing cardiovascular magnetic resonance perfusion imaging
Background Cardiovascular magnetic resonance (CMR) stress perfusion imaging provides important diagnostic and prognostic information in coronary artery disease (CAD). Current clinical sequences have limited temporal and/or spatial resolution, and incomplete heart coverage. Techniques such as k-t principal component analysis (PCA) or k-t sparsity and low-rank structure (SLR), which rely on the high degree of spatiotemporal correlation in first-pass perfusion data, can significantly accelerate image acquisition, mitigating these problems. However, in the presence of respiratory motion, these techniques can suffer from significant degradation of image quality. A number of techniques based on non-rigid registration have been developed. However, to first approximation, breathing motion predominantly results in rigid motion of the heart. To this end, a simple robust motion correction strategy is proposed for k-t accelerated and compressed sensing (CS) perfusion imaging. Methods A simple respiratory motion compensation (MC) strategy for k-t accelerated and compressed-sensing CMR perfusion imaging to selectively correct respiratory motion of the heart was implemented based on linear k-space phase shifts derived from rigid motion registration of a region of interest (ROI) encompassing the heart. A variable-density Poisson disk acquisition strategy was used to minimize coherent aliasing in the presence of respiratory motion, and images were reconstructed using k-t PCA and k-t SLR with or without motion correction. The strategy was evaluated in a CMR extended cardiac-torso (XCAT) digital phantom and in prospectively acquired first-pass perfusion studies in 12 subjects undergoing clinically ordered CMR studies. Phantom studies were assessed using the structural similarity index (SSIM) and root mean square error (RMSE). In patient studies, image quality was scored in a blinded fashion by two experienced cardiologists. Results In the phantom experiments, images reconstructed with the MC strategy had higher SSIM (p < 0.01) and lower RMSE (p < 0.01) in the presence of respiratory motion. For patient studies, the MC strategy improved k-t PCA and k-t SLR reconstruction image quality (p < 0.01). The performance of k-t SLR without motion correction demonstrated improved image quality as compared to k-t PCA in the setting of respiratory motion (p < 0.01), while with motion correction there was a trend toward better performance for k-t SLR as compared with motion-corrected k-t PCA. Conclusions Our simple and robust rigid motion compensation strategy greatly reduces motion artifacts and improves image quality for standard k-t PCA and k-t SLR techniques in the setting of respiratory motion due to imperfect breath-holding.
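The linear k-space phase shifts described here follow from the Fourier shift theorem. Below is a minimal 2D, translation-only sketch (not the paper's pipeline) of removing an in-plane shift of the heart, assumed to have been estimated from rigid registration of a cardiac ROI; the k-space frame is a synthetic stand-in.

    import numpy as np

    def shift_kspace(kspace, dy, dx):
        """Apply the linear phase ramp equivalent to shifting the image by (dy, dx) pixels."""
        ny, nx = kspace.shape
        ky = np.fft.fftfreq(ny)[:, None]      # cycles per pixel along y (DC at index 0)
        kx = np.fft.fftfreq(nx)[None, :]      # cycles per pixel along x
        phase = np.exp(-2j * np.pi * (ky * dy + kx * dx))
        return kspace * phase

    # Example: undo a +3-pixel respiratory drift in y estimated for this frame
    frame = np.random.randn(128, 128) + 1j * np.random.randn(128, 128)   # stand-in k-space
    corrected = shift_kspace(frame, dy=-3.0, dx=0.0)
    image = np.fft.ifft2(corrected)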
Development and illustration of a framework for computational thinking practices in introductory physics
Physics classes with computation integrated into the curriculum are a fitting setting for investigating computational thinking. In this paper, we present a framework for exploring this topic in introductory physics courses. The framework, which was developed by reviewing relevant literature and acquiring video data from high school classrooms, comprises 14 practices that students could engage in when working with GlowScript VPython activities. For every practice, we provide in-class video data to exemplify the practice. In doing this work, we hope to provide ways for teachers to assess their students’ development of computational thinking and give physics education researchers a foundation to study the topic in greater depth.
Survival of Escherichia coli on Lettuce under Field Conditions Encountered in the Northeastern United States
Although wildlife intrusion and untreated manure have been associated with microbial contamination of produce, relatively few studies have examined the survival of Escherichia coli on produce under field conditions following contamination (e.g., via splash from wildlife feces). This experimental study was performed to estimate the die-off rate of E. coli on preharvest lettuce following contamination with a fecal slurry. During August 2015, field-grown lettuce was inoculated via pipette with a fecal slurry that was spiked with a three-strain cocktail of rifampin-resistant nonpathogenic E. coli. Ten lettuce heads were harvested at each of 13 time points following inoculation (0, 2.5, 5, and 24 h after inoculation and every 24 h thereafter until day 10). The most probable number (MPN) of E. coli on each lettuce head was determined, and die-off rates were estimated. The relationship between sample time and the log MPN of E. coli per head was modeled using a segmented linear model. This model had a breakpoint at 106 h (95% confidence interval = 69, 142 h) after inoculation, with a daily decrease of 0.70 and 0.19 log MPN for 0 to 106 h and 106 to 240 h following inoculation, respectively. These findings are consistent with die-off rates obtained in similar studies that assessed E. coli survival on produce following irrigation. Overall, these findings provide die-off rates for E. coli on lettuce that can be used in future quantitative risk assessments.
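The fitted segmented model described in this abstract reduces to a simple piecewise-linear prediction of log MPN over time. The sketch below evaluates such a model using the reported breakpoint (106 h) and slopes (0.70 and 0.19 log MPN per day); the intercept (log MPN at t = 0) is not given in the abstract, so the value used here is an assumption.

    import numpy as np

    def predicted_log_mpn(t_hours, log10_mpn0=6.0, breakpoint_h=106.0,
                          slope1_per_day=0.70, slope2_per_day=0.19):
        """Piecewise-linear die-off: fast decline before the breakpoint, slower after."""
        t = np.asarray(t_hours, dtype=float)
        before = log10_mpn0 - slope1_per_day * t / 24.0
        at_break = log10_mpn0 - slope1_per_day * breakpoint_h / 24.0
        after = at_break - slope2_per_day * (t - breakpoint_h) / 24.0
        return np.where(t <= breakpoint_h, before, after)

    # Predicted decline over the 240 h study period, sampled daily
    times = np.arange(0, 241, 24)
    print(dict(zip(times.tolist(), np.round(predicted_log_mpn(times), 2))))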