4,828 result(s) for "Cell Tracking"
3DeeCellTracker, a deep learning-based pipeline for segmenting and tracking cells in 3D time lapse images
Despite recent improvements in microscope technologies, segmenting and tracking cells in three-dimensional time-lapse images (3D + T images) to extract their dynamic positions and activities remains a considerable bottleneck in the field. We developed a deep learning-based software pipeline, 3DeeCellTracker, by integrating multiple existing and new techniques including deep learning for tracking. With only one volume of training data, one initial correction, and a few parameter changes, 3DeeCellTracker successfully segmented and tracked ~100 cells in both semi-immobilized and ‘straightened’ freely moving worm's brain, in a naturally beating zebrafish heart, and ~1000 cells in a 3D cultured tumor spheroid. While these datasets were imaged with highly divergent optical systems, our method tracked 90–100% of the cells in most cases, which is comparable or superior to previous results. These results suggest that 3DeeCellTracker could pave the way for revealing dynamic cell activities in image datasets that have been difficult to analyze.

Microscopes have been used to decrypt the tiny details of life since the 17th century. Now, the advent of 3D microscopy allows scientists to build up detailed pictures of living cells and tissues. In that effort, automation is becoming increasingly important so that scientists can analyze the resulting images and understand how bodies grow, heal and respond to changes such as drug therapies. In particular, algorithms can help to spot cells in the picture (called cell segmentation), and then to follow these cells over time across multiple images (known as cell tracking). However, performing these analyses on 3D images over a given period has been quite challenging. In addition, the algorithms that have already been created are often not user-friendly, and they can only be applied to a specific dataset gathered through a particular scientific method. As a response, Wen et al. developed a new program called 3DeeCellTracker, which runs on a desktop computer and uses a type of artificial intelligence known as deep learning to produce consistent results. Crucially, 3DeeCellTracker can be used to analyze various types of images taken using different types of cutting-edge microscope systems. And indeed, the algorithm was then harnessed to track the activity of nerve cells in moving microscopic worms, of beating heart cells in a young small fish, and of cancer cells grown in the lab. This versatile tool can now be used across biology, medical research and drug development to help monitor cell activities.
Tracking cell lineages in 3D by incremental deep learning
Deep learning is emerging as a powerful approach for bioimage analysis. Its use in cell tracking is limited by the scarcity of annotated data for the training of deep-learning models. Moreover, annotation, training, prediction, and proofreading currently lack a unified user interface. We present ELEPHANT, an interactive platform for 3D cell tracking that addresses these challenges by taking an incremental approach to deep learning. ELEPHANT provides an interface that seamlessly integrates cell track annotation, deep learning, prediction, and proofreading. This enables users to implement cycles of incremental learning starting from a few annotated nuclei. Successive prediction-validation cycles enrich the training data, leading to rapid improvements in tracking performance. We test the software's performance against state-of-the-art methods and track lineages spanning the entire course of leg regeneration in a crustacean over 1 week (504 timepoints). ELEPHANT yields accurate, fully-validated cell lineages with a modest investment in time and effort.
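The incremental prediction-validation cycle described above can be sketched in outline. This is a schematic in Python only, not ELEPHANT's actual API: the `annotate`, `train`, and `predict` callables are hypothetical stand-ins for the platform's interactive annotation, retraining, and prediction steps.

```python
def incremental_tracking_loop(model, volumes, annotate, train, predict):
    """Schematic of an ELEPHANT-style incremental learning cycle
    (illustrative, not the tool's implementation): start from a few
    annotated nuclei, then alternate prediction, proofreading, and
    retraining so that each cycle enlarges the training data.
    """
    annotations = annotate(volumes[0])    # begin with a few annotated nuclei
    for vol in volumes:
        predicted = predict(model, vol)   # propose tracks on this volume
        corrected = annotate(predicted)   # user proofreads the predictions
        annotations += corrected          # corrections enrich the training set
        model = train(model, annotations) # incremental retraining
    return model, annotations
```

The point of the loop is that proofreading effort is front-loaded: as the model improves with each cycle, later volumes need fewer corrections.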
Segmentation, tracking and cell cycle analysis of live-cell imaging data with Cell-ACDC
Background: High-throughput live-cell imaging is a powerful tool to study dynamic cellular processes in single cells but creates a bottleneck at the stage of data analysis, due to the large amount of data generated and limitations of analytical pipelines. Recent progress on deep learning dramatically improved cell segmentation and tracking. Nevertheless, manual data validation and correction is typically still required and tools spanning the complete range of image analysis are still needed.

Results: We present Cell-ACDC, an open-source user-friendly GUI-based framework written in Python, for segmentation, tracking and cell cycle annotations. We included state-of-the-art deep learning models for single-cell segmentation of mammalian and yeast cells alongside cell tracking methods and an intuitive, semi-automated workflow for cell cycle annotation of single cells. Using Cell-ACDC, we found that mTOR activity in hematopoietic stem cells is largely independent of cell volume. By contrast, smaller cells exhibit higher p38 activity, consistent with a role of p38 in regulation of cell size. Additionally, we show that, in S. cerevisiae, histone Htb1 concentrations decrease with replicative age.

Conclusions: Cell-ACDC provides a framework for the application of state-of-the-art deep learning models to the analysis of live cell imaging data without programming knowledge. Furthermore, it allows for visualization and correction of segmentation and tracking errors as well as annotation of cell cycle stages. We embedded several smart algorithms that make the correction and annotation process fast and intuitive. Finally, the open-source and modularized nature of Cell-ACDC will enable simple and fast integration of new deep learning-based and traditional methods for cell segmentation, tracking, and downstream image analysis. Source code: https://github.com/SchmollerLab/Cell_ACDC
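Frame-to-frame linking of labeled segmentation masks is a core step in pipelines of this kind. Below is a minimal IoU-based linker as a generic baseline sketch; it is not Cell-ACDC's actual tracker, and the `min_iou` threshold is an illustrative choice.

```python
import numpy as np

def iou_link(labels_prev, labels_curr, min_iou=0.3):
    """Link labeled cell masks between two frames by maximum IoU.

    Generic baseline, NOT Cell-ACDC's implementation: each cell in the
    current frame inherits the ID of the previous-frame cell it overlaps
    most, provided the intersection-over-union exceeds `min_iou`.
    """
    links = {}
    for curr_id in np.unique(labels_curr):
        if curr_id == 0:  # 0 is background
            continue
        curr_mask = labels_curr == curr_id
        best_iou, best_id = 0.0, None
        # Only previous-frame labels that actually overlap are candidates.
        for prev_id in np.unique(labels_prev[curr_mask]):
            if prev_id == 0:
                continue
            prev_mask = labels_prev == prev_id
            inter = np.logical_and(curr_mask, prev_mask).sum()
            union = np.logical_or(curr_mask, prev_mask).sum()
            iou = inter / union
            if iou > best_iou:
                best_iou, best_id = iou, prev_id
        if best_iou >= min_iou:
            links[int(curr_id)] = int(best_id)
    return links
```

Cells with no sufficiently overlapping predecessor (new cells, or large displacements) are simply left unlinked, which is where the manual correction tools described in the abstract come in.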
Monitoring single-cell gene regulation under dynamically controllable conditions with integrated microfluidics and software
Much is still not understood about how gene regulatory interactions control cell fate decisions in single cells, in part due to the difficulty of directly observing gene regulatory processes in vivo. We introduce here a novel integrated setup consisting of a microfluidic chip and accompanying analysis software that enable long-term quantitative tracking of growth and gene expression in single cells. The dual-input Mother Machine (DIMM) chip enables controlled and continuous variation of external conditions, allowing direct observation of gene regulatory responses to changing conditions in single cells. The Mother Machine Analyzer (MoMA) software achieves unprecedented accuracy in segmenting and tracking cells, and streamlines high-throughput curation with a novel leveraged editing procedure. We demonstrate the power of the method by uncovering several novel features of an iconic gene regulatory program: the induction of Escherichia coli’s lac operon in response to a switch from glucose to lactose.

How gene regulatory pathways control cell fate decisions in single cells is not fully understood. Here the authors present an integrated dual-input microfluidic chip and a linked analysis software, enabling tracking of gene regulatory responses of single bacterial cells to changing conditions.
Surface-enhanced Raman scattering holography
Nanometric probes based on surface-enhanced Raman scattering (SERS) are promising candidates for all-optical environmental, biological and technological sensing applications with intrinsic quantitative molecular specificity. However, the effectiveness of SERS probes depends on a delicate trade-off between particle size, stability and brightness that has so far hindered their wide application in SERS imaging methodologies. In this Article, we introduce holographic Raman microscopy, which allows single-shot three-dimensional single-particle localization. We validate our approach by simultaneously performing Fourier transform Raman spectroscopy of individual SERS nanoparticles and Raman holography, using shearing interferometry to extract both the phase and the amplitude of wide-field Raman images and ultimately localize and track single SERS nanoparticles inside living cells in three dimensions. Our results represent a step towards multiplexed single-shot three-dimensional concentration mapping in many different scenarios, including live cell and tissue interrogation and complex anti-counterfeiting applications.

Holography of incoherent emission from SERS probes allows multiplexed single-particle localization in three dimensions in one shot using a wide-field microscope.
Nanoparticle-based Cell Trackers for Biomedical Applications
The continuous or real-time tracking of biological processes using biocompatible contrast agents over a certain period of time is vital for precise diagnosis and treatment, such as monitoring tissue regeneration after stem cell transplantation, understanding the genesis, development, invasion and metastasis of cancer and so on. The rationally designed nanoparticles, including aggregation-induced emission (AIE) dots, inorganic quantum dots (QDs), nanodiamonds, superparamagnetic iron oxide nanoparticles (SPIONs), and semiconducting polymer nanoparticles (SPNs), have been explored to meet this urgent need. In this review, the development and application of these nanoparticle-based cell trackers for a variety of imaging technologies, including fluorescence imaging, photoacoustic imaging, magnetic resonance imaging, magnetic particle imaging, positron emission tomography and single photon emission computed tomography are discussed in detail. Moreover, the further therapeutic treatments using multi-functional trackers endowed with photodynamic and photothermal modalities are also introduced to provide a comprehensive perspective in this promising research field.
Cancer Immunoimaging with Smart Nanoparticles
Dynamic immunoimaging in vivo is crucial in patient-tailored immunotherapies to identify patients who will benefit from immunotherapies, monitor therapeutic efficacy post treatment, and determine alternative strategies for nonresponders. Nanoparticles have played a major role in the immunotherapy landscape. In this review, we summarize recent findings in immunoimaging where smart nanoparticles target, detect, stimulate, and deliver therapeutic dose in vivo. Nanoparticles interfaced with an immunoimaging toolbox enable the use of multiple modalities and achieve depth-resolved whole-body tracking of immunomarkers with high accuracy both before and after treatment. We highlight how functional nanoparticles track T cells, dendritic cells (DCs), tumor-associated macrophages (TAMs), and immune checkpoint receptors (ICRs), and facilitate image-guided interventions.

Dynamic imaging of immune cells that have a crucial role in the progression or suppression of tumors has transformed the landscape of cancer immunotherapies, allowing clinicians to predict the best treatment strategies and monitor treatment response.

Tracking immune cells in vivo with smart nanoparticles is beneficial because multiple imaging modalities have been successfully integrated on a single nanoscale platform, enabling depth-resolved whole-body detection with high sensitivity and specificity of tumors that overexpress immunomarkers.

Smart nanoparticles have also enabled image-guided interventions, where nanoparticles with both diagnostic and therapeutic ability were labeled to immune cells and tracked in vivo, followed by antigen delivery. Such theranostic nanovaccines elicit robust and persistent antigen-specific immune responses with dynamic visualization of therapeutic efficacy.
Single-molecule localization microscopy and tracking with red-shifted states of conventional BODIPY conjugates in living cells
Single-molecule localization microscopy (SMLM) is a rapidly evolving technique to resolve subcellular structures and single-molecule dynamics at the nanoscale. Here, we employ conventional BODIPY conjugates for live-cell SMLM via their previously reported red-shifted ground-state dimers (DII), which transiently form through bi-molecular encounters and emit bright single-molecule fluorescence. We employ the versatility of DII-state SMLM to resolve the nanoscopic spatial regulation and dynamics of single fatty acid analogs (FAas) and lipid droplets (LDs) in living yeast and mammalian cells with two colors. In fed cells, FAas localize to the endoplasmic reticulum and LDs of ~125 nm diameter. Upon fasting, however, FAas form dense, non-LD clusters of ~100 nm diameter at the plasma membrane and transition from free diffusion to confined immobilization. Our reported SMLM capability of conventional BODIPY conjugates is further demonstrated by imaging lysosomes in mammalian cells and enables simple and versatile live-cell imaging of sub-cellular structures at the nanoscale.

Single-molecule localization microscopy (SMLM) requires the use of fluorophores with specific sets of properties. Here the authors employ conventional BODIPY dyes as SMLM fluorophores by making use of rarely reported red-shifted ground state BODIPY dimers to image fatty acids, lipid droplets and lysosomes at single-molecule resolution.
FloCyT: A Flow-Aware Centroid Tracker for Cell Analysis in High-Speed Capillary-Driven Microfluidic Flow
Capillary-driven microfluidic chips have emerged as promising platforms for point-of-care diagnostics, offering portable, inexpensive, and pump-free operation. Accurate tracking of cell flow in these systems is vital for quantitative applications such as on-chip cytometry, cell counting, and biomechanical analysis. However, tracking in capillary-driven devices is challenging due to rapid cell displacements, flow instabilities, and visually similar cells. Under these conditions, conventional tracking algorithms such as TrackPy, TrackMate, SORT, and DeepSORT exhibit frequent identity switches and trajectory fragmentation. Here, we introduce FloCyT, a robust, high-speed centroid tracking tool specifically designed for capillary-driven and microfluidic flow. FloCyT leverages microchannel geometry for tracking and uses anisotropic gating for association, global flow-aware track initialisation, and channel-specific association. This enables precise tracking even under challenging conditions of capillary-driven flow. FloCyT was evaluated on 12 simulated and 4 real patient datasets using standard multi-object tracking metrics, including IDF1 and MOTA, ID switches, and the percentage of mostly tracked objects. The results demonstrate that FloCyT outperforms both standard and flow-aware-modified versions of TrackPy and SORT, achieving higher accuracy, more complete trajectories, and fewer identity switches. By enabling accurate and automated cell tracking in capillary-driven microfluidic devices, FloCyT enhances the quantitative sensing capability of image-based microfluidic diagnostics, supporting novel, low-cost, and portable cytometry applications.
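The anisotropic gating idea described above, a wide association gate along the flow direction and a narrow one across it, can be illustrated as follows. This is a sketch of the concept only; the gate semi-axes and the function itself are hypothetical, not FloCyT's code.

```python
import numpy as np

def anisotropic_gate(pred, detections, flow_dir, along=40.0, across=8.0):
    """Gate candidate detections with an ellipse elongated along the flow.

    Illustrative of the idea, not FloCyT's implementation: cells move fast
    along the channel but barely drift across it, so the association gate
    allows a large displacement along `flow_dir` (semi-axis `along`, px)
    but only a small one perpendicular to it (semi-axis `across`, px).
    """
    flow_dir = np.asarray(flow_dir, float)
    flow_dir = flow_dir / np.linalg.norm(flow_dir)
    perp = np.array([-flow_dir[1], flow_dir[0]])  # unit normal to the flow
    keep = []
    for i, det in enumerate(np.asarray(detections, float)):
        d = det - np.asarray(pred, float)
        u = np.dot(d, flow_dir)  # displacement along the flow
        v = np.dot(d, perp)      # displacement across the flow
        if (u / along) ** 2 + (v / across) ** 2 <= 1.0:
            keep.append(i)
    return keep
```

With an isotropic (circular) gate, admitting the same along-flow displacement would also admit equally large cross-flow jumps, which is exactly what produces identity switches between visually similar cells in neighbouring channels.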
Longitudinal tracking of neuronal activity from the same cells in the developing brain using Track2p
Understanding cortical circuit development requires tracking neuronal activity across days in the growing brain. While in vivo calcium imaging now enables such longitudinal studies, automated tools for reliably tracking large populations of neurons across sessions remain limited. Here, we present a novel cell tracking method based on sequential image registration, validated on calcium imaging data from the barrel cortex of mouse pups over 1 postnatal week. Our approach enables robust long-term analysis of several hundred individual neurons, allowing quantification of neuronal dynamics and representational stability over time. Using this method, we identified a key developmental transition in neuronal activity statistics, marking the emergence of arousal state modulation. Beyond this key finding, our method provides an essential tool for tracking developmental trajectories of individual neurons, which could help identify potential deviations associated with neurodevelopmental disorders.

In the weeks and months after birth, the brain experiences rapid growth and restructuring through carefully timed developmental sequences. These processes rely heavily on coordinated neuronal activity for the brain to develop healthily. Understanding these processes requires recording neural activity from the same cells across different stages of development. However, this has proven technically challenging because both the anatomy and cellular organization of the brain change dramatically during early life. Two-photon calcium imaging, which uses fluorescent markers to visualize neuronal activity in living animals, has emerged as a powerful method for studying neural circuits. In adult animals, automated techniques can track large neuronal populations across days. However, the rapid brain growth and morphological changes during development make it difficult to track neurons during this stage.
Recording from the same developing neurons would provide a more dynamic view of healthy brain maturation and reveal how it diverges in neurodevelopmental conditions. Majnik et al. investigated whether it was possible to automatically and reliably track the same neurons across consecutive days during early postnatal development of mice. They developed Track2p, an open-source algorithm that can track developing neurons across days using a two-step procedure. It first corrects for brain growth using image registration and then matches neurons across days using this growth-corrected alignment. Applying Track2p to calcium imaging data from the mouse barrel cortex during the second postnatal week, Majnik et al. found that the algorithm robustly tracked hundreds of neurons despite substantial brain growth. Benchmarking against manually tracked neurons confirmed high accuracy. Analysis of the tracked population’s activity properties revealed an increase in overall activity rates and a decrease in firing synchrony. Moreover, around postnatal day 11, neuronal activity patterns shifted from highly synchronized and spatially organized population events to more decorrelated, behavior-dependent firing. Majnik et al. demonstrate that Track2p overcomes key technical barriers and reveal new principles of early postnatal brain maturation. The tool enables longitudinal analysis of neural circuits, allowing researchers to measure how individual neurons develop over time. Tracking cells across days will be particularly useful for understanding how genetic mutations or environmental factors influence developmental trajectories. Beyond development, Track2p can also examine long-term properties of adult circuits, such as learning or representational stability. Ultimately, applying Track2p to disease models may identify early circuit-level biomarkers of neurodevelopmental disorders.
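The two-step procedure described above (growth-correcting registration, then matching neurons on the aligned coordinates) can be sketched as below. The registration output is assumed to be given as an affine transform, and the greedy nearest-neighbour matching is an illustrative stand-in, not Track2p's implementation.

```python
import numpy as np

def match_across_days(cells_day1, cells_day2, transform, max_dist=10.0):
    """Match cell centroids across two imaging days.

    Schematic of the registration-then-matching idea, not Track2p's code:
    `transform` = (A, t) is an affine map assumed to come from the
    growth-correcting image registration step; matching is a greedy
    closest-pairs-first assignment on the aligned centroids.
    """
    A, t = transform
    aligned = (np.asarray(cells_day1, float) @ np.asarray(A, float).T
               + np.asarray(t, float))
    day2 = np.asarray(cells_day2, float)
    # Pairwise distances between growth-corrected day-1 and day-2 centroids.
    dists = np.linalg.norm(aligned[:, None, :] - day2[None, :, :], axis=2)
    matches, used1, used2 = {}, set(), set()
    for flat in np.argsort(dists, axis=None):  # closest pairs first
        i, j = (int(k) for k in np.unravel_index(flat, dists.shape))
        if dists[i, j] > max_dist:
            break  # all remaining pairs are even farther apart
        if i in used1 or j in used2:
            continue
        matches[i] = j
        used1.add(i)
        used2.add(j)
    return matches
```

Correcting for growth first is what makes a simple distance-based matcher viable here: without the transform, a uniformly expanding brain would push every centroid past any fixed distance threshold.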