Search Results
63 results for "Cessac, Bruno"
Retinal Processing: Insights from Mathematical Modelling
The retina is the entrance to the visual system. Although based on common biophysical principles, the dynamics of retinal neurons are quite different from their cortical counterparts, raising interesting problems for modellers. In this paper, I address some mathematically stated questions in this spirit, discussing in particular: (1) How could lateral amacrine cell connectivity shape the spatio-temporal spike response of retinal ganglion cells? (2) How could spatio-temporal stimulus correlations and retinal network dynamics shape the spike train correlations at the output of the retina? These questions are addressed, first, by introducing a mathematically tractable model of the layered retina, integrating amacrine cells' lateral connectivity and piecewise linear rectification, which allows computing the receptive fields of retinal ganglion cells together with the voltage and spike correlations induced by the amacrine cell network. Then, I review some recent results showing how the concepts of spatio-temporal Gibbs distributions and linear response theory can be used to characterize the collective spike response of a set of retinal ganglion cells, coupled via effective interactions corresponding to the amacrine cell network, to a spatio-temporal stimulus. On these bases, I briefly discuss several potential consequences of these results at the cortical level.
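As an illustration of the kind of layered model described in this abstract, here is a minimal sketch, not the paper's actual model: a 1D array of bipolar cells with assumed amacrine-like lateral inhibition and piecewise-linear rectification, pooled by a single ganglion cell; probing it with small impulses shows how the lateral coupling sculpts the effective receptive field. All parameter values are arbitrary.

```python
# A minimal sketch, not the paper's model: a layered retina in which
# amacrine-like lateral inhibition plus piecewise-linear rectification
# shapes the effective receptive field of a pooling ganglion cell.
import numpy as np

n = 20                                    # bipolar cells on a 1D lattice
d = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
W = -0.15 * ((d >= 2) & (d <= 3))         # assumed lateral inhibitory coupling
w_pool = np.exp(-0.5 * ((np.arange(n) - n // 2) / 1.5) ** 2)   # Gaussian pooling by the ganglion cell

def rectify(x):
    """Piecewise-linear rectification: N(x) = max(x, 0)."""
    return np.maximum(x, 0.0)

def ganglion(stim):
    """Stimulus -> bipolar voltage with lateral term -> rectified -> pooled."""
    return w_pool @ rectify(stim + W @ stim)

baseline = 0.5 * np.ones(n)               # keeps the units in the linear part of the rectifier
r0 = ganglion(baseline)
rf = np.array([ganglion(baseline + np.eye(n)[i]) - r0 for i in range(n)])
print(np.round(rf, 3))                    # centre excitation with an inhibitory surround
```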
Temporal pattern recognition in retinal ganglion cells is mediated by dynamical inhibitory synapses
A fundamental task for the brain is to generate predictions of future sensory inputs, and signal errors in these predictions. Many neurons have been shown to signal omitted stimuli during periodic stimulation, even in the retina. However, the mechanisms of this error signaling are unclear. Here we show that depressing inhibitory synapses shape the timing of the response to an omitted stimulus in the retina. While ganglion cells, the retinal output, responded to an omitted flash with a constant latency over many frequencies of the flash sequence, we found that this was not the case once inhibition was blocked. We built a simple circuit model and showed that depressing inhibitory synapses were a necessary component to reproduce our experimental findings. A new prediction of our model is that the accuracy of the constant latency requires a sufficient number of flashes in the stimulus, which we could confirm experimentally. Depressing inhibitory synapses could thus be a key component to generate the predictive responses observed in the retina, and potentially in many brain areas. The retina is known to strongly respond to omitted stimuli in periodic patterns. Here the authors propose that depressing inhibitory synapses shape the timing of such a response and are key to performing temporal pattern recognition in neural networks.
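A schematic sketch of the core ingredient, not the authors' circuit model: a Tsodyks-Markram-style resource variable for a depressing inhibitory synapse driven by periodic flashes. The steady level of depression depends on the flash frequency, which is the quantity the paper links to the timing of the omitted-stimulus response. The parameters (u, tau_rec) are illustrative, not fitted.

```python
# Short-term depression of an inhibitory synapse under periodic flashes:
# each flash consumes part of the synaptic resource, which recovers
# exponentially until the next flash.
import numpy as np

def depressed_inhibition(flash_hz, n_flashes=20, u=0.5, tau_rec=0.5):
    """Return the synaptic resource available at each flash (1 = undepressed)."""
    dt = 1.0 / flash_hz              # inter-flash interval in seconds
    x, trace = 1.0, []
    for _ in range(n_flashes):
        trace.append(x)
        x -= u * x                   # each flash consumes a fraction u of the resource
        x += (1.0 - x) * (1.0 - np.exp(-dt / tau_rec))   # recovery until the next flash
    return np.array(trace)

for hz in (2, 6, 12):
    print(f"{hz:>2} Hz flashes -> steady-state inhibitory strength "
          f"{depressed_inhibition(hz)[-1]:.2f}")
```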
Linear Response of General Observables in Spiking Neuronal Network Models
We establish a general linear response relation for spiking neuronal networks, based on chains with unbounded memory. This relation allows us to predict the influence of a weak-amplitude, time-dependent external stimulus on spatio-temporal spike correlations, from the spontaneous statistics (without stimulus), in a general context where the memory in spike dynamics can extend arbitrarily far into the past. Using this approach, we show how the linear response is explicitly related to the collective effect of the stimulus, intrinsic neuronal dynamics, and network connectivity on spike train statistics. We illustrate our results with numerical simulations performed on a discrete-time integrate-and-fire model.
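A minimal sketch of the linear-response idea, assuming a generic discrete-time noisy integrate-and-fire network rather than the paper's exact model or its explicit formula: a weak sinusoidal stimulus is applied and the modulation of the population firing rate at the stimulus frequency is measured; for weak amplitudes it grows roughly in proportion to the amplitude, which is what a first-order (linear) response predicts. Network parameters are arbitrary.

```python
# Weak time-dependent stimulus applied to a discrete-time noisy
# integrate-and-fire network with random weights; the rate modulation at
# the stimulus frequency scales approximately linearly with the amplitude.
import numpy as np

rng = np.random.default_rng(1)
N, T, gamma, theta, sigma = 50, 100_000, 0.7, 1.0, 0.4
J = rng.normal(0.0, 0.2 / np.sqrt(N), (N, N))        # random synaptic weights

def modulation(eps, period=100):
    """Amplitude of the population-rate modulation at the stimulus frequency."""
    V = np.zeros(N)
    spikes = np.zeros(N)
    rate = np.empty(T)
    for t in range(T):
        stim = eps * np.sin(2 * np.pi * t / period)
        V = gamma * V * (1 - spikes) + J @ spikes + stim + sigma * rng.normal(size=N)
        spikes = (V >= theta).astype(float)          # fire, then reset via the (1 - spikes) factor
        rate[t] = spikes.mean()
    phase = np.exp(-2j * np.pi * np.arange(T) / period)
    return 2.0 * np.abs(rate @ phase) / T

for eps in (0.0, 0.05, 0.10, 0.20):
    print(f"eps={eps:.2f}  modulation = {modulation(eps):.4f}")
# For weak stimuli the modulation grows roughly in proportion to eps,
# as a linear-response description predicts.
```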
Thermodynamic Formalism in Neuronal Dynamics and Spike Train Statistics
The Thermodynamic Formalism provides a rigorous mathematical framework for studying quantitative and qualitative aspects of dynamical systems. At its core, there is a variational principle that corresponds, in its simplest form, to the Maximum Entropy principle. It is used as a statistical inference procedure to represent, by specific probability measures (Gibbs measures), the collective behaviour of complex systems. This framework has found applications in different domains of science. In particular, it has been fruitful and influential in neurosciences. In this article, we review how the Thermodynamic Formalism can be exploited in the field of theoretical neuroscience, as a conceptual and operational tool, in order to link the dynamics of interacting neurons and the statistics of action potentials from either experimental data or mathematical models. We comment on perspectives and open problems in theoretical neuroscience that could be addressed within this formalism.
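A small self-contained illustration of the formalism on a toy example (not taken from the paper): for a range-2 potential on the spike patterns of two neurons, the topological pressure is the logarithm of the leading eigenvalue of the transfer matrix, the Gibbs measure is the associated Markov chain, and the variational principle (entropy rate plus mean potential equals pressure) is attained at that measure. The potential's coefficients are arbitrary.

```python
# Transfer-matrix computation of the pressure and the Gibbs measure for a
# range-2 potential on spike patterns of two neurons, and a numerical check
# of the variational principle: entropy rate + <phi> = pressure.
import itertools
import numpy as np

states = list(itertools.product([0, 1], repeat=2))     # spike patterns of 2 neurons
h = np.array([-0.4, -0.8])                             # arbitrary single-neuron biases
J, memory = 0.6, 0.3                                   # instantaneous and one-step couplings

def phi(prev, cur):
    """Range-2 potential phi(omega_{t-1}, omega_t)."""
    return h[0] * cur[0] + h[1] * cur[1] + J * cur[0] * cur[1] + memory * prev[0] * cur[0]

M = np.array([[np.exp(phi(a, b)) for b in states] for a in states])   # transfer matrix
w, v = np.linalg.eig(M)
k = np.argmax(w.real)
lam, r = w[k].real, np.abs(v[:, k].real)
pressure = np.log(lam)                                 # topological pressure P(phi)

P = M * r[None, :] / (lam * r[:, None])                # Gibbs measure as a Markov kernel
w2, v2 = np.linalg.eig(P.T)
pi = np.abs(v2[:, np.argmax(w2.real)].real)
pi /= pi.sum()                                         # stationary distribution

phi_mat = np.array([[phi(a, b) for b in states] for a in states])
entropy_rate = -np.sum(pi[:, None] * P * np.log(P))
mean_phi = np.sum(pi[:, None] * P * phi_mat)
print(f"pressure             = {pressure:.6f}")
print(f"entropy rate + <phi> = {entropy_rate + mean_phi:.6f}")   # equal: variational principle attained
```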
A biophysical model explains the spontaneous bursting behavior in the developing retina
During early development, waves of activity propagate across the retina and play a key role in the proper wiring of the early visual system. During a particular phase of retinal development (stage II), these waves are triggered by a transient network of neurons, called Starburst Amacrine Cells (SACs), showing a bursting activity which disappears upon further maturation. The underlying mechanisms of the spontaneous bursting and the transient excitability of immature SACs are not yet completely clear. While several models have attempted to reproduce retinal waves, none of them is able to mimic the rhythmic autonomous bursting of individual SACs and reveal how these cells change their intrinsic properties during development. Here, we introduce a mathematical model, grounded in biophysics, which enables us to reproduce the bursting activity of SACs and to propose a plausible, generic and robust mechanism that generates it. The core parameters controlling repetitive firing are fast depolarizing voltage-gated calcium channels and hyperpolarizing voltage-gated potassium channels. The quiescent phase of bursting is controlled by a slow afterhyperpolarization (sAHP), mediated by calcium-dependent potassium channels. Based on a bifurcation analysis, we show how biophysical parameters regulating calcium and potassium activity control the spontaneously occurring fast oscillatory activity followed by long refractory periods in individual SACs. We make a testable experimental prediction on the role of voltage-dependent potassium channels in the excitability properties of SACs and in the evolution of this excitability during development. We also propose an explanation of how SACs can exhibit a large variability in their bursting periods, as observed experimentally within a SAC network as well as across different species, yet based on a single, simple mechanism. As we discuss, these observations at the cellular level have a deep impact on the description of retinal waves.
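The following is a stand-in sketch rather than the paper's SAC model: the classical Hindmarsh-Rose equations, which likewise combine fast spike-generating variables with one slow hyperpolarizing variable (playing a role loosely analogous to the calcium-gated sAHP current described above), integrated with forward Euler. The wide spread of inter-spike intervals in the output signals bursts separated by long quiescent phases.

```python
# Hindmarsh-Rose bursting (stand-in, not the paper's biophysical SAC model):
# fast depolarizing/recovery variables plus one slow hyperpolarizing variable.
import numpy as np

I_ext, r, s, x_rest = 2.0, 0.006, 4.0, -1.6     # classic bursting regime
dt, n_steps = 0.01, 400_000
x, y, z = -1.0, 0.0, 2.0
xs = np.empty(n_steps)

for i in range(n_steps):
    dx = y + 3 * x**2 - x**3 - z + I_ext        # fast depolarizing dynamics
    dy = 1 - 5 * x**2 - y                       # fast recovery (K-like) dynamics
    dz = r * (s * (x - x_rest) - z)             # slow hyperpolarizing (sAHP-like) variable
    x, y, z = x + dt * dx, y + dt * dy, z + dt * dz
    xs[i] = x

# Inter-spike intervals: short ones inside bursts, long gaps between bursts.
spike_idx = np.where((xs[1:] >= 1.0) & (xs[:-1] < 1.0))[0]
isi = np.diff(spike_idx) * dt
print("ISI min / median / max:",
      np.round([isi.min(), np.median(isi), isi.max()], 1))
```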
Parameter Estimation for Spatio-Temporal Maximum Entropy Distributions: Application to Neural Spike Trains
We propose a numerical method to learn maximum entropy (MaxEnt) distributions with spatio-temporal constraints from experimental spike trains. This is an extension of two papers, [10] and [4], which proposed the estimation of parameters where only spatial constraints were taken into account. The extension we propose allows one to properly handle memory effects in spike statistics for large neural networks.
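A toy sketch of the general idea, not the paper's algorithm: fit a MaxEnt distribution over two consecutive time bins of three neurons, with firing rates and one delayed pairwise correlation as constraints, by gradient ascent that matches model averages to empirical ones. Treating consecutive bins as independent blocks ignores the chain structure that the paper handles properly; the synthetic data and learning rate are arbitrary.

```python
# MaxEnt fitting by moment matching: adjust Lagrange multipliers until the
# model's expected constraint values equal the empirical ones.
import itertools
import numpy as np

rng = np.random.default_rng(2)
N, T = 3, 50_000

# Synthetic "recording": neuron 1 tends to fire one step after neuron 0.
spikes = (rng.random((T, N)) < 0.15).astype(float)
spikes[1:, 1] = np.maximum(spikes[1:, 1], (rng.random(T - 1) < 0.5) * spikes[:-1, 0])

def features(prev, cur):
    """Constraint observables: three rates and one delayed pairwise term."""
    return np.array([cur[0], cur[1], cur[2], prev[0] * cur[1]])

empirical = np.mean([features(spikes[t], spikes[t + 1]) for t in range(T - 1)], axis=0)

blocks = [np.array(b, float).reshape(2, N) for b in itertools.product([0, 1], repeat=2 * N)]
F = np.array([features(b[0], b[1]) for b in blocks])      # features of all 64 two-bin blocks

lam = np.zeros(4)
for _ in range(3_000):                                    # gradient ascent on the log-likelihood
    p = np.exp(F @ lam)
    p /= p.sum()
    lam += 0.5 * (empirical - p @ F)                      # empirical minus model averages

print("empirical:", np.round(empirical, 3))
print("model    :", np.round(p @ F, 3))
print("lambda   :", np.round(lam, 3))
```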
A constructive mean-field analysis of multi-population neural networks with random synaptic weights and stochastic inputs
We deal with the problem of bridging the gap between two scales in neuronal modeling. At the first (microscopic) scale, neurons are considered individually and their behavior described by stochastic differential equations that govern the time variations of their membrane potentials. They are coupled by synaptic connections acting on their resulting activity, a nonlinear function of their membrane potential. At the second (mesoscopic) scale, interacting populations of neurons are described individually by similar equations. The equations describing the dynamical and the stationary mean-field behaviors are considered as functional equations on a set of stochastic processes. Using this new point of view allows us to prove that these equations are well-posed on any finite time interval and to provide a constructive method for effectively computing their unique solution. This method is proved to converge to the unique solution and we characterize its complexity and convergence rate. We also provide partial results for the stationary problem on infinite time intervals. These results shed some new light on such neural mass models as the one of Jansen and Rit (1995): their dynamics appears as a coarse approximation of the much richer dynamics that emerges from our analysis. Our numerical experiments confirm that the framework we propose and the numerical methods we derive from it provide a new and powerful tool for the exploration of neural behaviors at different scales.
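A toy numerical sketch, not the paper's construction or proof: a two-population network of noisy voltage equations with random synaptic weights is integrated with Euler-Maruyama and compared against a naive mean-field closure for the population means. The closure is only a crude first-order stand-in for the mean-field limit analysed in the paper; it agrees here because the noise and the weight disorder are kept small.

```python
# Finite network (Euler-Maruyama) vs. a naive two-population mean-field
# closure dm_p/dt = -m_p + sum_q Jbar[p,q] S(m_q) + I_p.
import numpy as np

rng = np.random.default_rng(3)
S = np.tanh                                   # firing-rate nonlinearity
n_per_pop, dt, T_steps, sigma = 400, 0.01, 2000, 0.05
Jbar = np.array([[1.2, -1.0],                 # population-level mean couplings (arbitrary)
                 [0.8, -0.4]])
I_ext = np.array([0.3, 0.1])

pop = np.repeat([0, 1], n_per_pop)            # population label of each neuron
N = 2 * n_per_pop
J = Jbar[np.ix_(pop, pop)] / n_per_pop + rng.normal(0, 0.05 / np.sqrt(N), (N, N))

V = rng.normal(0, 0.1, N)                     # network state
m = np.zeros(2)                               # mean-field state
for _ in range(T_steps):
    V += dt * (-V + J @ S(V) + I_ext[pop]) + sigma * np.sqrt(dt) * rng.normal(size=N)
    m += dt * (-m + Jbar @ S(m) + I_ext)

for p in (0, 1):
    print(f"population {p}:  network mean = {V[pop == p].mean():+.3f},  "
          f"mean-field = {m[p]:+.3f}")
```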
PRANAS: A New Platform for Retinal Analysis and Simulation
The retina encodes visual scenes by trains of action potentials that are sent to the brain via the optic nerve. In this paper, we describe new, freely accessible, user-end software that helps to better understand this coding. It is called PRANAS (https://pranas.inria.fr), standing for Platform for Retinal ANalysis And Simulation. PRANAS targets neuroscientists and modelers by providing a unique set of retina-related tools. It integrates a retina simulator allowing large-scale simulations while maintaining strong biological plausibility, as well as a toolbox for the analysis of spike train population statistics. The statistical method (entropy maximization under constraints) takes both spatial and temporal correlations into account as constraints, allowing one to analyze the effects of memory on statistics. PRANAS also integrates a tool for computing and representing receptive fields in 3D (time-space). All these tools are accessible through a user-friendly graphical user interface. The most CPU-costly of them have been implemented to run in parallel.
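As a generic illustration of one of the computations mentioned (this is not the PRANAS API): estimating a 3D, time-space receptive field by spike-triggered averaging of a white-noise stimulus, with spikes generated from a known linear-nonlinear ground truth so the recovery can be checked.

```python
# Spike-triggered average of a spatio-temporal white-noise stimulus,
# recovering a known space-time receptive field.
import numpy as np

rng = np.random.default_rng(4)
T, nx, ny, depth = 20_000, 8, 8, 12            # time bins, stimulus grid, temporal depth

stimulus = rng.normal(size=(T, nx, ny))
true_rf = np.zeros((depth, nx, ny))
true_rf[2:6, 3:5, 3:5] = 1.0                   # a small, delayed ON subunit (ground truth)

# Generate spikes from a linear-nonlinear model driven by this receptive field.
drive = np.array([np.sum(true_rf * stimulus[t - depth:t][::-1])
                  for t in range(depth, T)])
rate = np.clip(0.05 * np.exp(0.3 * drive), 0, 1)
spikes = rng.poisson(rate)

# Spike-triggered average: mean stimulus block preceding each spike.
sta = np.zeros((depth, nx, ny))
for i, n_sp in enumerate(spikes):
    if n_sp:
        t = i + depth
        sta += n_sp * stimulus[t - depth:t][::-1]
print("peak of recovered RF at (lag, x, y):",
      np.unravel_index(np.argmax(sta), sta.shape))   # falls inside the ground-truth subunit
```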
On the potential role of lateral connectivity in retinal anticipation
We analyse the potential effects of lateral connectivity (amacrine cells and gap junctions) on motion anticipation in the retina. Our main result is that lateral connectivity can, under conditions analysed in the paper, trigger a wave of activity enhancing the anticipation mechanism provided by local gain control (Berry et al. in Nature 398(6725):334–338, 1999; Chen et al. in J. Neurosci. 33(1):120–132, 2013). We illustrate these predictions with two examples studied in the experimental literature: differential motion sensitive cells (Baccus and Meister in Neuron 36(5):909–919, 2002) and direction sensitive cells whose direction sensitivity is inherited from an asymmetry in gap junction connectivity (Trenholm et al. in Nat. Neurosci. 16:154–156, 2013). We finally present reconstructions of retinal responses to 2D visual inputs to assess the ability of our model to anticipate motion in the case of three different 2D stimuli.
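A toy 1D sketch of anticipation by local gain control (the Berry et al. mechanism cited above), not the paper's full model with amacrine cells and gap junctions: cells are driven by a moving Gaussian bar through a low-pass filter that mimics processing delay, and an adaptation variable divisively reduces the gain of cells the bar has already driven, pushing the population response peak toward the leading edge. Parameters are arbitrary.

```python
# Motion anticipation by gain control: adaptation suppresses cells behind
# and under the bar, so the response peak moves toward the leading edge.
import numpy as np

n_cells, v, sigma_bar = 300, 1.0, 12.0
tau_filter, tau_adapt, theta = 8.0, 25.0, 0.4
x = np.arange(n_cells)

def run(gain_control):
    drive = np.zeros(n_cells)       # low-pass filtered input (processing delay)
    adapt = np.zeros(n_cells)       # activity history used for gain control
    for t in range(1, 220):
        bar = np.exp(-0.5 * ((x - v * t) / sigma_bar) ** 2)
        drive += (bar - drive) / tau_filter
        gain = 1.0 / (1.0 + (adapt / theta) ** 4) if gain_control else 1.0
        response = gain * drive
        adapt += (drive - adapt) / tau_adapt
    return float(x[np.argmax(response)]) - v * t   # peak position minus bar position

print(f"peak relative to bar, no gain control  : {run(False):+.0f}")
print(f"peak relative to bar, with gain control: {run(True):+.0f}")
# Gain control shifts the response peak toward (or past) the bar's leading edge.
```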