Catalogue Search | MBRL
73 result(s) for "Tetzlaff, Tom"
Sequence learning, prediction, and replay in networks of spiking neurons
by Wouters, Dirk J.; Tetzlaff, Tom; Bouhadjar, Younes
in Biology and Life Sciences; Computer and Information Sciences; Medicine and Health Sciences
2022
Sequence learning, prediction, and replay have been proposed to constitute the universal computations performed by the neocortex. The Hierarchical Temporal Memory (HTM) algorithm realizes these forms of computation. It learns sequences in an unsupervised and continuous manner using local learning rules, permits a context-specific prediction of future sequence elements, and generates mismatch signals when predictions are not met. While the HTM algorithm accounts for a number of biological features such as topographic receptive fields, nonlinear dendritic processing, and sparse connectivity, it is based on abstract discrete-time neuron and synapse dynamics, as well as on plasticity mechanisms that can only partly be related to known biological mechanisms. Here, we devise a continuous-time implementation of the temporal-memory (TM) component of the HTM algorithm, which is based on a recurrent network of spiking neurons with biophysically interpretable variables and parameters. The model learns high-order sequences by means of a structural Hebbian synaptic plasticity mechanism supplemented with a rate-based homeostatic control. In combination with nonlinear dendritic input integration and local inhibitory feedback, this type of plasticity leads to the dynamic self-organization of narrow sequence-specific subnetworks. These subnetworks provide the substrate for a faithful propagation of sparse, synchronous activity, and, thereby, for a robust, context-specific prediction of future sequence elements as well as for the autonomous replay of previously learned sequences. By strengthening the link to biology, our implementation facilitates the evaluation of the TM hypothesis based on experimentally accessible quantities. The continuous-time implementation of the TM algorithm permits, in particular, an investigation of the role of sequence timing for sequence learning, prediction, and replay. We demonstrate this aspect by studying the effect of the sequence speed on the sequence learning performance and on the speed of autonomous sequence replay.
Journal Article
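The predict-then-verify loop this abstract describes can be illustrated with a toy, discrete-time sketch. This is not the authors' spiking TM model, only a minimal first-order predictor with local, count-based updates and an explicit mismatch signal (all names are hypothetical):

```python
from collections import defaultdict

# Toy first-order sequence predictor with mismatch signals -- a minimal
# stand-in for the predict/verify logic, not the spiking TM model itself.
counts = defaultdict(lambda: defaultdict(int))

def observe(prev, cur):
    # Predict the most frequent successor of `prev` seen so far.
    pred = max(counts[prev], key=counts[prev].get) if counts[prev] else None
    mismatch = pred is not None and pred != cur  # mismatch signal
    counts[prev][cur] += 1                       # local, unsupervised update
    return pred, mismatch

seq = list("ABCDABCDABXD")
for prev, cur in zip(seq, seq[1:]):
    pred, mismatch = observe(prev, cur)
    print(f"{prev} -> {cur}: predicted {pred}, mismatch={mismatch}")
```

A first-order predictor like this cannot disambiguate the high-order sequences the model targets, which is precisely why the TM component maintains context-specific states.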
The Correlation Structure of Local Neuronal Networks Intrinsically Results from Recurrent Dynamics
2014
Correlated neuronal activity is a natural consequence of network connectivity and shared inputs to pairs of neurons, but the task-dependent modulation of correlations in relation to behavior also hints at a functional role. Correlations influence the gain of postsynaptic neurons, the amount of information encoded in the population activity and decoded by readout neurons, and synaptic plasticity. Further, they affect the power and spatial reach of extracellular signals such as the local field potential. A theory of correlated neuronal activity accounting for recurrent connectivity as well as fluctuating external sources is currently lacking. In particular, it is unclear how the recently found mechanism of active decorrelation by negative feedback on the population level affects the network response to externally applied correlated stimuli. Here, we present such an extension of the theory of correlations in stochastic binary networks. We show that (1) for homogeneous external input, the structure of correlations is mainly determined by the local recurrent connectivity, (2) homogeneous external inputs provide an additive, unspecific contribution to the correlations, (3) inhibitory feedback effectively decorrelates neuronal activity, even if neurons receive identical external inputs, and (4) identical synaptic input statistics to excitatory and to inhibitory cells increase intrinsically generated fluctuations and pairwise correlations. We further demonstrate how the accuracy of mean-field predictions can be improved by self-consistently including correlations. As a byproduct, we show that the cancellation of correlations between the summed inputs to pairs of neurons does not originate from the fast tracking of external input, but from the suppression of fluctuations on the population level by the local network. This suppression is a necessary constraint, but not sufficient to determine the structure of correlations; specifically, the structure observed at finite network size differs from the prediction based on perfect tracking, even though perfect tracking implies suppression of population fluctuations.
Journal Article
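Results (1) and (2) of this abstract can be summarized schematically (the notation below is ours, not the paper's): for a pair of neurons $(i,j)$,

\[
c_{ij} \;\approx\; c_{ij}^{\mathrm{rec}} + c^{\mathrm{ext}},
\]

where the structured term $c_{ij}^{\mathrm{rec}}$ is set by the local recurrent connectivity and $c^{\mathrm{ext}}$ is the additive, unspecific offset contributed by the homogeneous external input.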
Firing rate homeostasis counteracts changes in stability of recurrent neural networks caused by synapse loss in Alzheimer’s disease
by Tetzlaff, Tom; Morrison, Abigail; Duarte, Renato
in Alzheimer Disease - physiopathology; Alzheimer's disease; Biology and Life Sciences
2020
The impairment of cognitive function in Alzheimer's disease is clearly correlated with synapse loss. However, the mechanisms underlying this correlation are only poorly understood. Here, we investigate how the loss of excitatory synapses in sparsely connected random networks of spiking excitatory and inhibitory neurons alters their dynamical characteristics. Beyond the effects on the activity statistics, we find that the loss of excitatory synapses on excitatory neurons reduces the network's sensitivity to small perturbations. This decrease in sensitivity can be considered as an indication of a reduction of computational capacity. A full recovery of the network's dynamical characteristics and sensitivity can be achieved by firing rate homeostasis, here implemented by an up-scaling of the remaining excitatory-excitatory synapses. Mean-field analysis reveals that the stability of the linearised network dynamics is, in good approximation, uniquely determined by the firing rate, and thereby explains why firing rate homeostasis preserves not only the firing rate but also the network's sensitivity to small perturbations.
Journal Article
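The compensatory up-scaling described above can be sketched in a few lines. This variant simply preserves each neuron's mean total excitatory input after synapse loss; the paper's homeostatic controller is driven by the firing rate itself, and all parameters below are arbitrary illustrations:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy excitatory weight matrix (rows: targets, cols: sources).
W = rng.exponential(scale=1.0, size=(100, 100))

p_loss = 0.3                            # fraction of E->E synapses lost
mask = rng.random(W.shape) > p_loss     # surviving synapses
W_lesioned = W * mask
W_homeo = W_lesioned / (1.0 - p_loss)   # up-scale the survivors

print("mean input, intact:  ", W.sum(axis=1).mean())
print("mean input, lesioned:", W_lesioned.sum(axis=1).mean())
print("mean input, rescaled:", W_homeo.sum(axis=1).mean())
```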
Decorrelation of Neural-Network Activity by Inhibitory Feedback
by Tetzlaff, Tom; Helias, Moritz; Diesmann, Markus
in Action Potentials - physiology; Animals; Biology
2012
Correlations in spike-train ensembles can seriously impair the encoding of information in their spatio-temporal structure. An inevitable source of correlation in finite neural networks is common presynaptic input to pairs of neurons. Recent studies demonstrate that spike correlations in recurrent neural networks are considerably smaller than expected based on the amount of shared presynaptic input. Here, we explain this observation by means of a linear network model and simulations of networks of leaky integrate-and-fire neurons. We show that inhibitory feedback efficiently suppresses pairwise correlations and, hence, population-rate fluctuations, thereby assigning inhibitory neurons the new role of active decorrelation. We quantify this decorrelation by comparing the responses of the intact recurrent network (feedback system) and systems where the statistics of the feedback channel are perturbed (feedforward system). Manipulations of the feedback statistics can lead to a significant increase in the power and coherence of the population response. In particular, neglecting correlations within the ensemble of feedback channels or between the external stimulus and the feedback amplifies population-rate fluctuations by orders of magnitude. The fluctuation suppression in homogeneous inhibitory networks is explained by a negative feedback loop in the one-dimensional dynamics of the compound activity. Similarly, a change of coordinates exposes an effective negative feedback loop in the compound dynamics of stable excitatory-inhibitory networks. The suppression of input correlations in finite networks is explained by the population-averaged correlations in the linear network model: In purely inhibitory networks, shared-input correlations are canceled by negative spike-train correlations. In excitatory-inhibitory networks, spike-train correlations are typically positive. Here, the suppression of input correlations is not a result of the mere existence of correlations between excitatory (E) and inhibitory (I) neurons, but a consequence of a particular structure of correlations among the three possible pairings (EE, EI, II).
Journal Article
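The one-dimensional negative-feedback picture invoked here can be written down schematically. Assuming a linearized population activity $a(t)$ with effective self-coupling $w<0$ and external drive $\xi(t)$ (our notation, not the paper's derivation):

\[
\tau\,\dot a(t) = -(1-w)\,a(t) + \xi(t)
\quad\Longrightarrow\quad
S_a(f) = \frac{S_\xi(f)}{(1-w)^2 + (2\pi f \tau)^2},
\]

so strong negative feedback ($w \ll 0$) suppresses population-rate fluctuations, most strongly at low frequencies where the $(1-w)^2$ term dominates the denominator.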
Coherent noise enables probabilistic sequence replay in spiking neuronal networks
by Wouters, Dirk J.; Tetzlaff, Tom; Bouhadjar, Younes
in Action Potentials - physiology; Ambiguity; Analysis
2023
Animals rely on different decision strategies when faced with ambiguous or uncertain cues. Depending on the context, decisions may be biased towards events that were most frequently experienced in the past, or be more explorative. A particular type of decision making central to cognition is sequential memory recall in response to ambiguous cues. A previously developed spiking neuronal network implementation of sequence prediction and recall learns complex, high-order sequences in an unsupervised manner by local, biologically inspired plasticity rules. In response to an ambiguous cue, the model deterministically recalls the sequence shown most frequently during training. Here, we present an extension of the model enabling a range of different decision strategies. In this model, explorative behavior is generated by supplying neurons with noise. As the model relies on population encoding, uncorrelated noise averages out, and the recall dynamics remain effectively deterministic. In the presence of locally correlated noise, the averaging effect is avoided without impairing the model performance, and without the need for large noise amplitudes. We investigate two forms of correlated noise occurring in nature: shared synaptic background inputs, and random locking of the stimulus to spatiotemporal oscillations in the network activity. Depending on the noise characteristics, the network adopts various recall strategies. This study thereby provides potential mechanisms explaining how the statistics of learned sequences affect decision making, and how decision strategies can be adjusted after learning.
Journal Article
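The population-averaging argument in this abstract is easy to check numerically: independent noise cancels in the population mean roughly as $1/\sqrt{N}$, while a shared component survives averaging. A minimal sketch (population size and amplitudes are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
N, T = 200, 10_000          # neurons per population, time steps

# Independent noise: fluctuations of the population mean shrink ~ 1/sqrt(N).
indep = rng.normal(size=(N, T))

# Shared (perfectly correlated) noise: the common term survives averaging.
shared = np.tile(rng.normal(size=(1, T)), (N, 1))

print("std of population mean, independent:", indep.mean(axis=0).std())
print("std of population mean, shared:    ", shared.mean(axis=0).std())
```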
Dynamical Characteristics of Recurrent Neuronal Networks Are Robust Against Low Synaptic Weight Resolution
by Tetzlaff, Tom; Dasbach, Stefan; Diesmann, Markus
in activity statistics; Brain; Computational neuroscience
2021
The representation of the natural-density, heterogeneous connectivity of neuronal network models at relevant spatial scales remains a challenge for computational neuroscience and neuromorphic computing. In particular, the memory demands imposed by the vast number of synapses in brain-scale network simulations constitute a major obstacle. Limiting the numerical resolution of synaptic weights appears to be a natural strategy to reduce memory and compute load. In this study, we investigate the effects of a limited synaptic-weight resolution on the dynamics of recurrent spiking neuronal networks resembling local cortical circuits and develop strategies for minimizing deviations from the dynamics of networks with high-resolution synaptic weights. We mimic the effect of a limited synaptic weight resolution by replacing normally distributed synaptic weights with weights drawn from a discrete distribution, and compare the resulting statistics characterizing firing rates, spike-train irregularity, and correlation coefficients with the reference solution. We show that a naive discretization of synaptic weights generally leads to a distortion of the spike-train statistics. If the weights are discretized such that the mean and the variance of the total synaptic input currents are preserved, the firing statistics remain unaffected for the types of networks considered in this study. For networks with sufficiently heterogeneous in-degrees, the firing statistics can be preserved even if all synaptic weights are replaced by the mean of the weight distribution. We conclude that even for simple networks with non-plastic neurons and synapses, a discretization of synaptic weights can lead to substantial deviations in the firing statistics unless the discretization is performed with care and guided by a rigorous validation process. For the network model used in this study, the synaptic weights can be replaced by low-resolution weights without affecting its macroscopic dynamical characteristics, thereby saving substantial amounts of memory.
Journal Article
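The distinction drawn here between naive and moment-preserving discretization can be made concrete with a small sketch. The two-point scheme below (mean ± standard deviation with equal probability) is one simple way to preserve both moments of the weight distribution; the study itself considers more general discrete distributions, and all parameters are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

# High-resolution reference weights (normally distributed, as in the study).
mu, sigma = 0.1, 0.01
w = rng.normal(mu, sigma, size=100_000)

# Naive discretization: rounding to a coarse grid preserves the mean here
# but distorts the variance of the weights.
grid = 0.05
w_naive = np.round(w / grid) * grid

# Moment-preserving two-point discretization: mu +/- sigma with probability
# 1/2 each preserves both the mean and the variance of the weights.
w_two = mu + sigma * rng.choice([-1.0, 1.0], size=w.size)

for name, ww in [("reference", w), ("naive grid", w_naive), ("two-point", w_two)]:
    print(f"{name:10s} mean={ww.mean():.4f}  var={ww.var():.6f}")
```

With fixed in-degrees, preserving the mean and variance of the weights also preserves the mean and variance of the summed synaptic input, which is the condition the abstract identifies as sufficient here.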
Frequency Dependence of Signal Power and Spatial Reach of the Local Field Potential
2013
Despite its century-old use, the interpretation of local field potentials (LFPs), the low-frequency part of electrical signals recorded in the brain, is still debated. In cortex the LFP appears to mainly stem from transmembrane neuronal currents following synaptic input, and obvious questions regarding the 'locality' of the LFP are: What is the size of the signal-generating region, i.e., the spatial reach, around a recording contact? How far does the LFP signal extend outside a synaptically activated neuronal population? And how do the answers depend on the temporal frequency of the LFP signal? Experimental inquiries have given conflicting results, and we here pursue a modeling approach based on a well-established biophysical forward-modeling scheme incorporating detailed reconstructed neuronal morphologies in precise calculations of population LFPs including thousands of neurons. The two key factors determining the frequency dependence of LFP are the spatial decay of the single-neuron LFP contribution and the conversion of synaptic input correlations into correlations between single-neuron LFP contributions. Both factors are seen to give low-pass filtering of the LFP signal power. For uncorrelated input only the first factor is relevant, and here a modest reduction (<50%) in the spatial reach is observed for higher frequencies (>100 Hz) compared to the near-DC ([Formula: see text]) value of about [Formula: see text]. Much larger frequency-dependent effects are seen when populations of pyramidal neurons receive correlated and spatially asymmetric inputs: the low-frequency ([Formula: see text]) LFP power can here be an order of magnitude or more larger than at 60 Hz. Moreover, the low-frequency LFP components have larger spatial reach and extend further outside the active population than high-frequency components. Further, the spatial LFP profiles for such populations typically span the full vertical extent of the dendrites of neurons in the population. Our numerical findings are backed up by an intuitive simplified model for the generation of population LFP.
Journal Article
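The second of the two key factors named here, the conversion of input correlations into correlations between single-neuron LFP contributions, follows from a standard variance decomposition (our notation): for $N$ single-neuron contributions $\phi_i$ with variance $\sigma^2$ and mean pairwise correlation $c_\phi$,

\[
\operatorname{Var}\!\Bigl[\sum_{i=1}^{N}\phi_i\Bigr] = N\sigma^2\bigl(1+(N-1)\,c_\phi\bigr),
\]

so even weak correlations make the compound LFP power scale as $N^2$ rather than $N$, consistent with the abstract's finding that correlated input can raise the low-frequency LFP power by an order of magnitude or more.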
Metadata practices for simulation workflows
2025
Computer simulations are an essential pillar of knowledge generation in science. Exploring, understanding, reproducing, and sharing the results of simulations relies on tracking and organizing the metadata describing the numerical experiments. The models used to understand real-world systems, and the computational machinery required to simulate them, are typically complex, and produce large amounts of heterogeneous metadata. Here, we present general practices for acquiring and handling metadata that are agnostic to software and hardware, and highly flexible for the user. These consist of two steps: 1) recording and storing raw metadata, and 2) selecting and structuring metadata. As a proof of concept, we develop the Archivist, a Python tool to help with the second step, and use it to apply our practices to distinct high-performance computing use cases from neuroscience and hydrology. Our practices and the Archivist can readily be applied to existing workflows without the need for substantial restructuring. They support sustainable numerical workflows, fostering replicability, reproducibility, data exploration, and data sharing in simulation-based research.
Journal Article
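The two-step practice described above can be illustrated generically. The sketch below is not the Archivist API (which the paper develops for step 2); it only shows the shape of the idea, and every field name is a hypothetical example:

```python
import json, platform, subprocess, time
from pathlib import Path

# Step 1: record and store raw metadata alongside the simulation output.
try:
    commit = subprocess.run(["git", "rev-parse", "HEAD"],
                            capture_output=True, text=True).stdout.strip()
except FileNotFoundError:
    commit = "unknown"

raw = {
    "timestamp": time.strftime("%Y-%m-%dT%H:%M:%S"),
    "hostname": platform.node(),
    "python": platform.python_version(),
    "git_commit": commit or "unknown",
    "parameters": {"n_neurons": 10_000, "t_sim_ms": 1000.0},  # hypothetical
}
Path("raw_metadata.json").write_text(json.dumps(raw, indent=2))

# Step 2: select and structure the subset needed for a given analysis.
selected = {k: raw[k] for k in ("git_commit", "parameters")}
Path("structured_metadata.json").write_text(json.dumps(selected, indent=2))
```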
Power Laws from Linear Neuronal Cable Theory: Power Spectral Densities of the Soma Potential, Soma Membrane Current and Single-Neuron Contribution to the EEG
by Tetzlaff, Tom; Lindén, Henrik; Pettersen, Klas H.
in Biology and Life Sciences; Computational Biology; Computer Simulation
2014
Power laws, that is, power spectral densities (PSDs) exhibiting 1/f^α behavior for large frequencies f, have been observed both in microscopic (neural membrane potentials and currents) and macroscopic (electroencephalography; EEG) recordings. While complex network behavior has been suggested to be at the root of this phenomenon, we here demonstrate a possible origin of such power laws in the biophysical properties of single neurons described by the standard cable equation. Taking advantage of the analytical tractability of the so-called ball-and-stick neuron model, we derive general expressions for the PSD transfer functions for a set of measures of neuronal activity: the soma membrane current, the current-dipole moment (corresponding to the single-neuron EEG contribution), and the soma membrane potential. These PSD transfer functions relate the PSDs of the respective measurements to the PSDs of the noisy input currents. With homogeneously distributed input currents across the neuronal membrane, we find that all PSD transfer functions express asymptotic high-frequency 1/f^α power laws with power-law exponents analytically identified as α∞(I) = 1/2 for the soma membrane current, α∞(p) = 3/2 for the current-dipole moment, and α∞(V) = 2 for the soma membrane potential. Comparison with available data suggests that the apparent power laws observed in the high-frequency end of the PSD spectra may stem from uncorrelated current sources which are homogeneously distributed across the neural membranes and themselves exhibit pink (1/f) noise distributions. While the PSD noise spectra at low frequencies may be dominated by synaptic noise, our findings suggest that the high-frequency power laws may originate in noise from intrinsic ion channels. The significance of this finding goes beyond neuroscience, as it demonstrates how 1/f^α power laws with a wide range of values for the power-law exponent α may arise from a simple, linear partial differential equation.
Journal Article
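Restating the abstract's central relations in equation form (the symbol $H_X$ for the PSD transfer function is our notation): for a measurement $X \in \{I, p, V\}$,

\[
P_X(f) \;=\; H_X(f)\,P_{\mathrm{in}}(f),
\qquad
H_X(f) \sim f^{-\alpha_\infty(X)} \quad \text{for large } f,
\]

with $\alpha_\infty(I) = 1/2$, $\alpha_\infty(p) = 3/2$, and $\alpha_\infty(V) = 2$. If the input currents themselves carry pink noise, $P_{\mathrm{in}}(f) \sim 1/f$, the measured high-frequency exponents shift to $\alpha_\infty(X) + 1$, consistent with the interpretation the abstract offers for the observed spectra.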
A Modular Workflow for Performance Benchmarking of Neuronal Network Simulations
by Pronold, Jari; Kurth, Anno Christopher; Patronis, Alexander
in Benchmarks; Computational neuroscience; Efficiency
2022
Modern computational neuroscience strives to develop complex network models to explain dynamics and function of brains in health and disease. This process goes hand in hand with advancements in the theory of neuronal networks and increasing availability of detailed anatomical data on brain connectivity. Large-scale models that study interactions between multiple brain areas with intricate connectivity and investigate phenomena on long time scales such as system-level learning require progress in simulation speed. The corresponding development of state-of-the-art simulation engines relies on information provided by benchmark simulations which assess the time-to-solution for scientifically relevant, complementary network models using various combinations of hardware and software revisions. However, maintaining comparability of benchmark results is difficult due to a lack of standardized specifications for measuring the scaling performance of simulators on high-performance computing (HPC) systems. Motivated by the challenging complexity of benchmarking, we define a generic workflow that decomposes the endeavor into unique segments consisting of separate modules. As a reference implementation for the conceptual workflow, we develop beNNch: an open-source software framework for the configuration, execution, and analysis of benchmarks for neuronal network simulations. The framework records benchmarking data and metadata in a unified way to foster reproducibility. For illustration, we measure the performance of various versions of the NEST simulator across network models with different levels of complexity on a contemporary HPC system, demonstrating how performance bottlenecks can be identified, ultimately guiding the development toward more efficient simulation technology.
Journal Article
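The execution-and-recording segment of such a workflow can be sketched generically. This is not beNNch itself, and run_simulation is a hypothetical stand-in for launching a NEST (or other) simulation:

```python
import json, time

# Run the same model at several scales and record time-to-solution
# together with identifying metadata, in the spirit of the workflow above.
def run_simulation(n_nodes: int) -> None:
    time.sleep(0.01 * n_nodes)   # placeholder for the actual simulation

results = []
for n_nodes in (1, 2, 4, 8):
    t0 = time.perf_counter()
    run_simulation(n_nodes)
    results.append({
        "n_nodes": n_nodes,
        "time_to_solution_s": time.perf_counter() - t0,
        "simulator": "NEST",        # hypothetical metadata fields
        "model": "example-network",
    })

with open("benchmark_results.json", "w") as f:
    json.dump(results, f, indent=2)
```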