1,133 result(s) for "Jeffrey Walker"
Error-correcting barcoded primers for pyrosequencing hundreds of samples in multiplex
We constructed error-correcting DNA barcodes that allow one run of a massively parallel pyrosequencer to process up to 1,544 samples simultaneously. Using these barcodes we processed bacterial 16S rRNA gene sequences representing microbial communities in 286 environmental samples, corrected 92% of sample assignment errors, and thus characterized nearly as many 16S rRNA genes as have been sequenced to date by Sanger sequencing.
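The barcode-correction idea above can be sketched in a few lines (illustrative Python, not the authors' code; the barcode sequences and error threshold are invented). If every pair of barcodes differs in at least three positions, any single sequencing error can be corrected by assigning the read to the closest barcode:

```python
def hamming(a, b):
    """Number of positions at which two equal-length strings differ."""
    return sum(x != y for x, y in zip(a, b))

def assign_sample(read_barcode, barcodes, max_errors=1):
    """Return the barcode closest to the observed one, or None if no
    barcode is within the correctable distance."""
    best = min(barcodes, key=lambda bc: hamming(read_barcode, bc))
    return best if hamming(read_barcode, best) <= max_errors else None

barcodes = ["ACGTACGT", "TGCATGCA", "GGTTCCAA"]  # pairwise distance >= 3
print(assign_sample("ACGAACGT", barcodes))  # one error -> "ACGTACGT"
print(assign_sample("TTTTTTTT", barcodes))  # too distant -> None
```

Reads whose barcode is too far from every known barcode are discarded rather than misassigned, which is what drives the reported recovery of sample-assignment errors.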
Unlocking Biomarker Discovery: Large Scale Application of Aptamer Proteomic Technology for Early Detection of Lung Cancer
Lung cancer is the leading cause of cancer deaths worldwide. New diagnostics are needed to detect early stage lung cancer because it may be cured with surgery. However, most cases are diagnosed too late for curative surgery. Here we present a comprehensive clinical biomarker study of lung cancer and the first large-scale clinical application of a new aptamer-based proteomic technology to discover blood protein biomarkers in disease. We conducted a multi-center case-control study in archived serum samples from 1,326 subjects from four independent studies of non-small cell lung cancer (NSCLC) in long-term tobacco-exposed populations. Sera were collected and processed under uniform protocols. Case sera were collected from 291 patients within 8 weeks of the first biopsy-proven lung cancer and prior to tumor removal by surgery. Control sera were collected from 1,035 asymptomatic study participants with ≥ 10 pack-years of cigarette smoking. We measured 813 proteins in each sample with a new aptamer-based proteomic technology, identified 44 candidate biomarkers, and developed a 12-protein panel (cadherin-1, CD30 ligand, endostatin, HSP90α, LRIG3, MIP-4, pleiotrophin, PRKCI, RGM-C, SCF-sR, sL-selectin, and YES) that discriminates NSCLC from controls with 91% sensitivity and 84% specificity in cross-validated training and 89% sensitivity and 83% specificity in a separate verification set, with similar performance for early and late stage NSCLC. This study is a significant advance in clinical proteomics in an area of high unmet clinical need. Our analysis exceeds the breadth and dynamic range of the proteome interrogated in previously published clinical studies of broad serum proteome profiling platforms, including mass spectrometry, antibody arrays, and autoantibody arrays. The sensitivity and specificity of our 12-biomarker panel improve upon published protein and gene expression panels. Separate verification of classifier performance provides evidence against over-fitting and is encouraging for the next development phase, independent validation. This careful study provides a solid foundation to develop tests sorely needed to identify early stage lung cancer.
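The sensitivity and specificity figures quoted for the 12-protein panel are the standard confusion-matrix ratios; a minimal sketch with invented labels (1 = cancer, 0 = control):

```python
def sens_spec(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp / (tp + fn), tn / (tn + fp)

y_true = [1, 1, 1, 1, 0, 0, 0, 0, 0, 0]   # made-up case/control labels
y_pred = [1, 1, 1, 0, 0, 0, 0, 0, 1, 0]   # made-up panel calls
print(sens_spec(y_true, y_pred))  # sensitivity 0.75, specificity ~0.83
```

Reporting both on a held-out verification set, as the study does, guards against the optimism of training-set estimates.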
Upscaling sparse ground-based soil moisture observations for the validation of coarse-resolution satellite soil moisture products
The contrast between the point‐scale nature of current ground‐based soil moisture instrumentation and the ground resolution (typically >10² km²) of satellites used to retrieve soil moisture poses a significant challenge for the validation of data products from current and upcoming soil moisture satellite missions. Given typical levels of observed spatial variability in soil moisture fields, this mismatch confounds mission validation goals by introducing significant sampling uncertainty in footprint‐scale soil moisture estimates obtained from sparse ground‐based observations. During validation activities based on comparisons between ground observations and satellite retrievals, this sampling error can be misattributed to retrieval uncertainty and spuriously degrade the perceived accuracy of satellite soil moisture products. This review paper describes the magnitude of the soil moisture upscaling problem and measurement density requirements for ground‐based soil moisture networks. Since many large‐scale networks do not meet these requirements, it also summarizes a number of existing soil moisture upscaling strategies which may reduce the detrimental impact of spatial sampling errors on the reliability of satellite soil moisture validation using spatially sparse ground‐based observations. Key Points: satellite soil moisture retrievals are obtained at coarse spatial resolutions; it is difficult to validate them using point‐scale ground observations; credible soil moisture upscaling strategies exist to address the problem.
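The sampling uncertainty the abstract describes is, in its simplest form, the standard error of a footprint mean estimated from a handful of point sensors (illustrative sketch; the sensor values are invented):

```python
import statistics

point_obs = [0.21, 0.27, 0.18, 0.31, 0.24]  # hypothetical sensors, m^3/m^3
n = len(point_obs)
footprint_estimate = statistics.mean(point_obs)
# Standard error of the mean: the sampling uncertainty that can be
# misattributed to satellite retrieval error during validation.
sampling_se = statistics.stdev(point_obs) / n ** 0.5
print(f"estimate = {footprint_estimate:.3f} +/- {sampling_se:.3f}")
```

With only a few sensors in a heterogeneous footprint, this term can rival the retrieval error budget itself, which is why the review emphasizes measurement density requirements.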
Detecting the effect of water level fluctuations on water quality impacting endangered fish in a shallow, hypereutrophic lake using long-term monitoring data
Water level fluctuations (WLFs) affect phytoplankton dynamics, water quality, and fish populations in lakes and reservoirs around the world. However, such effects are system-specific and vary due to interactions with other external factors such as solar radiation, air temperature, wind, and external phosphorus loading. Utilizing data from a long-term monitoring program (1990–2016), we developed an approach using cross-tabulation contour and conditional probability analyses to detect the effects of WLFs on the frequency of poor water quality impacting native fish in a large, shallow, hypereutrophic lake. Through the incorporation of long-term inter-annual and seasonal variability in climatic factors and cyanobacterial bloom periodicity, our approach detected non-linear WLF effects whereby both high and low lake levels were associated with higher probabilities of poor water quality conditions stressful to fish, including high pH, high un-ionized ammonia, and low dissolved oxygen. Although lake level management may not prevent poor water quality in any given year due to other external factors such as temperature or wind, we determined that seasonally based intermediate lake levels bracketing the long-term median afforded the best water quality conditions for endangered fish during the summer period when poor water quality is most common.
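The conditional-probability side of the analysis amounts to cross-tabulating lake-level classes against water-quality outcomes and estimating P(poor quality | level class). A toy sketch with invented records (not the monitoring data):

```python
from collections import Counter

# (lake_level_class, poor_water_quality) pairs, hypothetical
records = [("low", True), ("low", True), ("low", False),
           ("mid", False), ("mid", False), ("mid", True), ("mid", False),
           ("high", True), ("high", True), ("high", False)]

totals = Counter(level for level, _ in records)
poor = Counter(level for level, bad in records if bad)
for level in ("low", "mid", "high"):
    print(level, poor[level] / totals[level])
```

In this made-up example both extremes carry a higher conditional probability of poor conditions than the intermediate class, mirroring the non-linear pattern the study reports.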
A dynamic subset of network interactions underlies tuning to natural movements in marmoset sensorimotor cortex
Mechanisms of computation in sensorimotor cortex must be flexible and robust to support skilled motor behavior. Patterns of neuronal coactivity emerge as a result of computational processes. Pairwise spike-time statistical relationships, across the population, can be summarized as a functional network (FN) which retains single-unit properties. We record populations of single-unit neural activity in marmoset forelimb sensorimotor cortex during prey capture and spontaneous behavior and use an encoding model incorporating kinematic trajectories and network features to predict single-unit activity during forelimb movements. The contribution of network features depends on structured connectivity within strongly connected functional groups. We identify a context-specific functional group that is highly tuned to kinematics and reorganizes its connectivity between spontaneous and prey capture movements. In the remaining context-invariant group, interactions are comparatively stable across behaviors and units are less tuned to kinematics. This suggests different roles in producing natural forelimb movements and contextualizes single-unit tuning properties within population dynamics. By studying cortical activity patterns during prey capture and spontaneous behavior in marmosets, the authors identify distinct subnetworks defined by reliable spike timing correlations. These subnetworks emerge during prey capture, with each potentially playing different roles in controlling reaching movements.
Phylogenetic stratigraphy in the Guerrero Negro hypersaline microbial mat
The microbial mats of Guerrero Negro (GN), Baja California Sur, Mexico historically were considered a simple environment, dominated by cyanobacteria and sulfate-reducing bacteria. Culture-independent rRNA community profiling instead revealed these microbial mats as among the most phylogenetically diverse environments known. A preliminary molecular survey of the GN mat based on only ∼1500 small subunit rRNA gene sequences discovered several new phylum-level groups in the bacterial phylogenetic domain and many previously undetected lower-level taxa. We determined an additional ∼119 000 nearly full-length sequences and 28 000 >200 nucleotide 454 reads from a 10-layer depth profile of the GN mat. With this unprecedented coverage of long sequences from one environment, we confirm the mat is phylogenetically stratified, presumably corresponding to light and geochemical gradients throughout the depth of the mat. Previous shotgun metagenomic data from the same depth profile show the same stratified pattern and suggest that metagenome properties may be predictable from rRNA gene sequences. We verify previously identified novel lineages and identify new phylogenetic diversity at lower taxonomic levels, for example, thousands of operational taxonomic units at the family-genus levels differ considerably from known sequences. The new sequences populate parts of the bacterial phylogenetic tree that previously were poorly described, but indicate that any comprehensive survey of GN diversity has only begun. Finally, we show that taxonomic conclusions are generally congruent between Sanger and 454 sequencing technologies, with the taxonomic resolution achieved dependent on the abundance of reference sequences in the relevant region of the rRNA tree of life.
Version 4 of the SMAP Level-4 Soil Moisture Algorithm and Data Product
The NASA Soil Moisture Active Passive (SMAP) mission Level-4 Soil Moisture (L4_SM) product provides global, 3-hourly, 9-km resolution estimates of surface (0-5 cm) and root-zone (0-100 cm) soil moisture with a mean latency of ~2.5 days. The underlying L4_SM algorithm assimilates SMAP radiometer brightness temperature (Tb) observations into the NASA Catchment land surface model using a spatially-distributed ensemble Kalman filter. Version 4 of the L4_SM modeling system includes a reduction in the upward recharge of surface soil moisture from below under non-equilibrium conditions, resulting in reduced bias and improved dynamic range of L4_SM surface soil moisture compared to earlier versions. This change and additional technical modifications to the system reduce the mean and standard deviation of the observation-minus-forecast Tb residuals and overall soil moisture analysis increments while maintaining the skill of the L4_SM soil moisture estimates versus independent in situ measurements; the average, bias-adjusted RMSE in Version 4 is 0.039 m³ m⁻³ for surface and 0.026 m³ m⁻³ for root-zone soil moisture. Moreover, the coverage of assimilated SMAP observations in Version 4 is near-global owing to the use of additional satellite Tb records for algorithm calibration. L4_SM soil moisture uncertainty estimates are biased low (by 0.01-0.02 m³ m⁻³) against actual errors (computed versus in situ measurements). L4_SM runoff estimates, an additional product of the L4_SM algorithm, are biased low (by 35 mm yr⁻¹) against streamflow measurements. Compared to Version 3, bias in Version 4 is reduced by 46% for surface soil moisture uncertainty estimates and by 33% for runoff estimates.
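The core update step of an ensemble Kalman filter can be shown for a single scalar state (illustrative only; the L4_SM system assimilates brightness temperatures into a distributed land surface model, and all numbers here are invented):

```python
import random
import statistics

random.seed(0)
ensemble = [random.gauss(0.25, 0.04) for _ in range(100)]  # forecast members
obs, obs_err = 0.29, 0.02          # observation and its error std dev

var_f = statistics.variance(ensemble)
gain = var_f / (var_f + obs_err ** 2)   # Kalman gain, direct scalar observation
# Stochastic (perturbed-observation) EnKF analysis update per member:
analysis = [x + gain * (random.gauss(obs, obs_err) - x) for x in ensemble]

print(f"forecast mean {statistics.mean(ensemble):.3f} -> "
      f"analysis mean {statistics.mean(analysis):.3f}")
```

The gain weights the forecast and observation by their respective uncertainties, so the analysis ensemble mean moves toward the observation and the ensemble spread contracts, which is the mechanism behind the "analysis increments" discussed above.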
Assimilation of Wheat and Soil States into the APSIM-Wheat Crop Model: A Case Study
Optimised farm crop productivity requires careful management in response to the spatial and temporal variability of yield. Accordingly, combination of crop simulation models and remote sensing data provides a pathway for providing the spatially variable information needed on current crop status and the expected yield. An ensemble Kalman filter (EnKF) data assimilation framework was developed to assimilate plant and soil observations into a prediction model to improve crop development and yield forecasting. Specifically, this study explored the performance of assimilating state observations into the APSIM-Wheat model using a dataset collected during the 2018/19 wheat season at a farm near Cora Lynn in Victoria, Australia. The assimilated state variables include (1) ground-based measurements of Leaf Area Index (LAI), soil moisture throughout the profile, biomass, and soil nitrate-nitrogen; and (2) remotely sensed observations of LAI and surface soil moisture. In a baseline scenario, an unconstrained (open-loop) simulation greatly underestimated the wheat grain with a relative difference (RD) of −38.3%, while the assimilation constrained simulations using ground-based LAI, ground-based biomass, and remotely sensed LAI were all found to improve the RD, reducing it to −32.7%, −9.4%, and −7.6%, respectively. Further improvements in yield estimation were found when: (1) wheat states were assimilated in phenological stages 4 and 5 (end of juvenile to flowering), (2) plot-specific remotely sensed LAI was used instead of the field average, and (3) wheat phenology was constrained by ground observations. Even when using parameters that were not accurately calibrated or measured, the assimilation of LAI and biomass still provided improved yield estimation over that from an open-loop simulation.
Theory-guided machine learning to predict density evolution of sand dynamically compacted under Ko condition
This paper introduces a theory-guided machine learning (TGML) framework, which combines a theoretical model (TM) and a machine learning (ML) algorithm to predict compaction density under cyclic loading. Several 1-D tests were conducted on uniformly graded fine sand compacted at varying moisture contents w, stress levels σz and loading frequencies f, simulating the field compaction of materials using a vibratory roller. The laboratory compaction data were first analysed using a revised TM and an artificial neural network (ANN), and their performance was measured using mean absolute error (MAE). Next, the data were analysed using the TGML framework, which involves three different techniques. TGML1 increased the ML’s ability to extrapolate (MAE improved from 2.2 × 10−3 to 1.2 × 10−3); TGML2 ensured ML and TM complemented each other to model observations better (MAE improved from 2.3 × 10−3 to 7.9 × 10−4); and TGML3 assisted in regularising the ML with an additional loss function which ensured the model followed the mechanistic understandings of the underlying physics (MAE improved from 9.2 × 10−3 to 2.7 × 10−3). Considering TGML3 during modelling is essential when dealing with noisy field datasets, and this is the highlight of this paper. TGML frameworks showed less error and lower model uncertainty, estimated using the novel Monte Carlo dropout technique. Furthermore, the developed TGML framework was used to demonstrate a termination criterion, i.e. the number of cycles of roller movement required to achieve the desired degree of compaction. Finally, an approach is proposed by which a simplified TM and ML model can estimate field compaction behaviour during roller movement.
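The TGML3 technique, a physics-based regularization term added to the data loss, can be sketched generically (hedged: the constraint, model values, and weight below are invented placeholders, not the paper's model; the illustrative constraint is that density should not decrease with load cycles):

```python
def tgml_loss(pred, target, lam=0.1):
    """Data MAE plus a penalty for violating monotonic densification."""
    mae = sum(abs(p - t) for p, t in zip(pred, target)) / len(pred)
    # Physics penalty: positive whenever predicted density at cycle i+1
    # falls below the prediction at cycle i.
    physics = sum(max(0.0, pred[i] - pred[i + 1])
                  for i in range(len(pred) - 1))
    return mae + lam * physics

target = [1.50, 1.55, 1.58, 1.60]   # measured density over cycles (invented)
good = [1.49, 1.54, 1.59, 1.61]     # monotone prediction: no penalty
bad = [1.49, 1.60, 1.54, 1.61]      # dips mid-sequence: penalized
print(tgml_loss(good, target) < tgml_loss(bad, target))  # -> True
```

Minimizing such a loss steers the network toward mechanistically plausible predictions even where the training data are noisy, which is why the paper highlights TGML3 for field datasets.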
THE EFFECT OF UNMEASURED CONFOUNDERS ON THE ABILITY TO ESTIMATE A TRUE PERFORMANCE OR SELECTION GRADIENT (AND OTHER PARTIAL REGRESSION COEFFICIENTS)
Multiple regression of observational data is frequently used to infer causal effects. Partial regression coefficients are biased estimates of causal effects if unmeasured confounders are not in the regression model. The sensitivity of partial regression coefficients to omitted confounders is investigated with a Monte-Carlo simulation. A subset of causal traits is \"measured\" and their effects are estimated using ordinary least squares regression and compared to their expected values. Three major results are: (1) the error due to confounding is much larger than that due to sampling, especially with large samples, (2) confounding error shrinks trivially with sample size, and (3) small true effects are frequently estimated as large effects. Consequently, confidence intervals from regression are poor guides to the true intervals, especially with large sample sizes. The addition of a confounder to the model improves estimates only 55% of the time. Results are improved with complete knowledge of the rank order of causal effects but even with this omniscience, measured intervals are poor proxies for true intervals if there are many unmeasured confounders. The results suggest that only under very limited conditions can we have much confidence in the magnitude of partial regression coefficients as estimates of causal effects.