MBRL Search Results

Filters
  • Discipline
  • Is Peer Reviewed
  • Item Type
  • Subject
  • Year (From – To)
  • More Filters
10 results for "Taher, Halgurd"
Exact neural mass model for synaptic-based working memory
A synaptic theory of Working Memory (WM) has been developed in the last decade as a possible alternative to the persistent spiking paradigm. In this context, we have developed a neural mass model able to reproduce exactly the dynamics of heterogeneous spiking neural networks encompassing realistic cellular mechanisms for short-term synaptic plasticity. This population model reproduces the macroscopic dynamics of the network in terms of the firing rate and the mean membrane potential. The latter quantity allows us to gain insight into the Local Field Potential and electroencephalographic signals measured during WM tasks to characterize the brain activity. More specifically, synaptic facilitation and depression integrate each other to efficiently mimic WM operations via either synaptic reactivation or persistent activity. Memory access and loading are related to stimulus-locked transient oscillations followed by a steady-state activity in the β–γ band, thus resembling what is observed in the cortex during vibrotactile stimuli in humans and object recognition in monkeys. Memory juggling and competition emerge already by loading only two items. However, more items can be stored in WM by considering neural architectures composed of multiple excitatory populations and a common inhibitory pool. Memory capacity depends strongly on the presentation rate of the items and is maximal within an optimal frequency range. In particular, we provide an analytic expression for the maximal memory capacity. Furthermore, the mean membrane potential turns out to be a suitable proxy to measure the memory load, analogously to event-driven potentials in experiments on humans. Finally, we show that the γ power increases with the number of loaded items, as reported in many experiments, while θ and β power reveal non-monotonic behaviours. In particular, β and γ rhythms are crucially sustained by the inhibitory activity, while the θ rhythm is controlled by excitatory synapses.
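The exact mean-field reduction referred to in this abstract is commonly written, following the reduction of Montbrió, Pazó and Roxin (2015), as two equations for the population firing rate r and mean membrane potential v, here extended with Tsodyks–Markram facilitation (u) and depression (x) variables. The sketch below integrates such a single-population system; the parameter values and the stimulus protocol are illustrative assumptions, not those used in the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Illustrative parameters (NOT the paper's values): membrane time constant (s),
# heterogeneity width, mean excitability, recurrent coupling, and the
# depression/facilitation time constants plus baseline release probability.
tau_m, delta, eta_bar, J = 15e-3, 0.25, -1.0, 15.0
tau_d, tau_f, U0 = 0.2, 1.5, 0.2

def stimulus(t):
    """Brief excitatory pulse meant to 'load' an item."""
    return 2.0 if 1.0 <= t < 1.35 else 0.0

def rhs(t, y):
    r, v, x, u = y
    dr = (delta / (np.pi * tau_m) + 2.0 * r * v) / tau_m
    dv = (v**2 + eta_bar + J * tau_m * u * x * r
          - (np.pi * tau_m * r)**2 + stimulus(t)) / tau_m
    dx = (1.0 - x) / tau_d - u * x * r            # short-term depression
    du = (U0 - u) / tau_f + U0 * (1.0 - u) * r    # short-term facilitation
    return [dr, dv, dx, du]

sol = solve_ivp(rhs, (0.0, 6.0), [0.5, -2.0, 1.0, U0], max_step=1e-3, rtol=1e-6)
r, u = sol.y[0], sol.y[3]
print(f"firing rate at t = 6 s: {r[-1]:.2f} Hz, facilitation u: {u[-1]:.2f}")
```

In this toy run the facilitation variable u raised by the pulse decays only on the slow timescale τ_f, which is the "synaptic" memory trace that the abstract contrasts with persistent spiking.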
Homeodynamic feedback inhibition control in whole-brain simulations
Simulations of large-scale brain dynamics are often impacted by overexcitation resulting from heavy-tailed structural network distributions, leading to biologically implausible simulation results. We implement a homeodynamic plasticity mechanism, known from other modeling work, in the widely used Jansen-Rit neural mass model for The Virtual Brain (TVB) simulation framework. We aim at heterogeneously adjusting the inhibitory coupling weights to reach desired dynamic regimes in each brain region. We show that, by using this dynamic approach, we can control the target activity level to obtain biologically plausible brain simulations, including post-synaptic potentials and blood-oxygen-level-dependent functional magnetic resonance imaging (fMRI) activity. We demonstrate that the derived dynamic Feedback Inhibitory Control (dFIC) can be used to enable increased variability of model dynamics. We derive the conditions under which the simulated brain activity converges to a predefined target level analytically and via simulations. We highlight the benefits of dFIC in the context of fitting the TVB model to static and dynamic measures of fMRI empirical data, accounting for global synchronization across the whole brain. The proposed novel method helps computational neuroscientists, especially TVB users, to easily “tune” brain models to desired dynamical regimes depending on the specific requirements of each study. The presented method is a steppingstone towards increased biological realism in brain network models and a valuable tool to better understand their underlying behavior.
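As a rough illustration of the feedback-inhibition idea described above, the toy script below slowly adjusts a per-region inhibitory weight until a generic rate network settles at a prescribed activity level. It is not the Jansen-Rit model and not TVB code; the update rule, the network, and all constants are assumptions made only for this sketch.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 8                                                  # toy network, far smaller than a connectome
W = rng.lognormal(mean=-1.0, sigma=1.0, size=(N, N))   # heavy-tailed structural weights
np.fill_diagonal(W, 0.0)

target, eta = 0.3, 0.5       # desired regional activity level and adaptation rate (arbitrary units)
dt, T = 1e-3, 120.0

x = np.full(N, 0.1)          # regional activity
K = np.ones(N)               # per-region inhibitory weights, adapted online

for _ in range(int(T / dt)):
    drive = W @ x - K * x                                          # network excitation minus local inhibition
    x = np.clip(x + dt * (-x + np.tanh(drive) + 0.2), 0.0, None)   # toy rate dynamics
    K = np.clip(K + dt * eta * (x - target), 0.0, None)            # raise inhibition where activity is too high

print("regional activity:", np.round(x, 3))       # should sit near the target level
print("adapted inhibitory weights:", np.round(K, 2))
```

The heavier a region's incoming weights, the larger its adapted inhibitory weight ends up, which is the heterogeneous adjustment the abstract describes.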
Bursting in a next generation neural mass model with synaptic dynamics: a slow–fast approach
We report a detailed analysis on the emergence of bursting in a recently developed neural mass model that includes short-term synaptic plasticity. Neural mass models can mimic the collective dynamics of large-scale neuronal populations in terms of a few macroscopic variables like mean membrane potential and firing rate. The present one is particularly important, as it represents an exact mean-field limit of synaptically coupled quadratic integrate and fire (QIF) neurons. Without synaptic dynamics, a periodic external current with slow frequency ε can lead to burst-like dynamics. The firing patterns can be understood using singular perturbation theory, specifically slow–fast dissection. With synaptic dynamics, timescale separation leads to a variety of slow–fast phenomena and their role for bursting becomes inordinately more intricate. Canards are crucial to understand the route to bursting. They describe trajectories evolving nearby repelling locally invariant sets of the system and exist at the transition between subthreshold dynamics and bursting. Near the singular limit ε = 0, we report peculiar jump-on canards, which block a continuous transition to bursting. In the biologically more plausible ε-regime, this transition becomes continuous and bursts emerge via consecutive spike-adding transitions. The onset of bursting is complex and involves mixed-type-like torus canards, which form the very first spikes of the burst and follow fast-subsystem repelling limit cycles. We numerically evidence the same mechanisms to be responsible for bursting emergence in the QIF network with plastic synapses. The main conclusions apply for the network, owing to the exactness of the mean-field limit.
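For the case without synaptic dynamics mentioned above, burst-like behaviour can be illustrated by driving the two-dimensional QIF mean-field (in the Montbrió–Pazó–Roxin form) with a slow periodic current. The sketch below does this; ε, the forcing amplitude, and the other constants are illustrative choices rather than values taken from the paper.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Dimensionless constants placing the autonomous mean-field in a bistable regime
# (illustrative, not the paper's parameter set).
tau, delta, eta_bar, J = 1.0, 1.0, -5.0, 15.0
eps, A = 0.01, 8.0        # slow forcing frequency and amplitude

def rhs(t, y):
    r, v = y
    I_slow = A * np.sin(2.0 * np.pi * eps * t)   # slowly varying external current
    dr = (delta / (np.pi * tau) + 2.0 * r * v) / tau
    dv = (v**2 + eta_bar + J * tau * r + I_slow - (np.pi * tau * r)**2) / tau
    return [dr, dv]

sol = solve_ivp(rhs, (0.0, 3.0 / eps), [0.01, -2.0], max_step=0.05, rtol=1e-8)
r = sol.y[0]
print(f"rate range over three slow cycles: [{r.min():.3f}, {r.max():.3f}]")
```

With these constants the forcing sweeps the system in and out of its bistable range, so the trajectory alternates between a quiescent phase and fast damped oscillations around the high-activity focus; those fast transients are the burst-like events, and the canard analysis in the paper concerns how such transitions are organised.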
Exact neural mass model for synaptic-based working memory
A synaptic theory of Working Memory (WM) has been developed in the last decade as a possible alternative to the persistent spiking paradigm. In this context, we have developed a neural mass model able to reproduce exactly the dynamics of heterogeneous spiking neural networks encompassing realistic cellular mechanisms for short-term synaptic plasticity. This population model reproduces the macroscopic dynamics of the network in terms of the firing rate and the mean membrane potential. The latter quantity allows us to gain insight into the Local Field Potential and electroencephalographic signals measured during WM tasks to characterize the brain activity. More specifically, synaptic facilitation and depression integrate each other to efficiently mimic WM operations via either synaptic reactivation or persistent activity. Memory access and loading are associated with stimulus-locked transient oscillations followed by a steady-state activity in the β–γ band, thus resembling what is observed in the cortex during vibrotactile stimuli in humans and object recognition in monkeys. Memory juggling and competition emerge already by loading only two items. However, more items can be stored in WM by considering neural architectures composed of multiple excitatory populations and a common inhibitory pool. Memory capacity depends strongly on the presentation rate of the items and is maximal within an optimal frequency range. In particular, we provide an analytic expression for the maximal memory capacity. Furthermore, the mean membrane potential turns out to be a suitable proxy to measure the memory load, analogously to event-driven potentials in experiments on humans. Finally, we show that the γ power increases with the number of loaded items, as reported in many experiments, while θ and β power reveal non-monotonic behaviours.
Enhancing power grid synchronization and stability through time-delayed feedback control
We study the synchronization and stability of power grids within the Kuramoto phase-oscillator model with inertia, using a bimodal frequency distribution representing the generators and the loads. We identify critical nodes through solitary frequency deviations and Lyapunov vectors corresponding to unstable Lyapunov exponents. To cure dangerous deviations from synchronization, we propose time-delayed feedback control, an efficient control concept in nonlinear dynamic systems. Different control strategies are tested and compared with respect to the minimum number of controlled nodes required to achieve synchronization and Lyapunov stability. As a proof of principle, this fast-acting control method is demonstrated using a model of the German power transmission grid.
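A minimal sketch of this kind of setup: a second-order (inertial) Kuramoto network with balanced positive and negative power injections, where a Pyragas-style delayed feedback term G·(ω_i(t−τ) − ω_i(t)) is added at a subset of nodes. The topology, gains, delay, and choice of controlled nodes are placeholders and do not reproduce the German transmission grid or the control strategies compared in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10
A = np.triu((rng.random((N, N)) < 0.4).astype(float), 1)
A = A + A.T                                             # random undirected topology (toy)
P = np.where(np.arange(N) % 2 == 0, 1.0, -1.0)          # bimodal injections: generators (+) and loads (-)
M, D, K = 1.0, 0.5, 6.0                                 # inertia, damping, coupling (illustrative)
G, delay = 1.5, 2.0                                     # feedback gain and delay tau (illustrative)
controlled = [0, 1]                                     # nodes receiving delayed feedback

dt, T = 1e-3, 50.0
buf = int(delay / dt)
theta = rng.uniform(-0.5, 0.5, N)
omega = np.zeros(N)
omega_hist = np.zeros((buf, N))                         # ring buffer holding omega(t - tau)

for k in range(int(T / dt)):
    coupling = K * np.sum(A * np.sin(theta[None, :] - theta[:, None]), axis=1)
    ctrl = np.zeros(N)
    past = omega_hist[k % buf]                          # frequencies delayed by tau
    ctrl[controlled] = G * (past[controlled] - omega[controlled])
    omega_hist[k % buf] = omega                         # store current state before updating
    theta = theta + dt * omega
    omega = omega + dt * (-D * omega + P + coupling + ctrl) / M

print(f"frequency spread at t = {T}: {omega.max() - omega.min():.4f}")
```

Varying which nodes are in `controlled` mimics, in miniature, the comparison of control strategies by the minimum number of controlled nodes.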
Exact neural mass model for synaptic-based working memory
A synaptic theory of Working Memory (WM) has been developed in the last decade as a possible alternative to the persistent spiking paradigm. In this context, we have developed a neural mass model able to reproduce exactly the dynamics of heterogeneous spiking neural networks encompassing realistic cellular mechanisms for short-term synaptic plasticity. This population model reproduces the macroscopic dynamics of the network in terms of the firing rate and the mean membrane potential. The latter quantity allows us to gain insight into the Local Field Potential and electroencephalographic signals measured during WM tasks to characterize the brain activity. More specifically, synaptic facilitation and depression integrate each other to efficiently mimic WM operations via either synaptic reactivation or persistent activity. Memory access and loading are associated with stimulus-locked transient oscillations followed by a steady-state activity in the β–γ band, thus resembling what is observed in the cortex during vibrotactile stimuli in humans and object recognition in monkeys. Memory juggling and competition emerge already by loading only two items. However, more items can be stored in WM by considering neural architectures composed of multiple excitatory populations and a common inhibitory pool. Memory capacity depends strongly on the presentation rate of the items and is maximal within an optimal frequency range. In particular, we provide an analytic expression for the maximal memory capacity. Furthermore, the mean membrane potential turns out to be a suitable proxy to measure the memory load, analogously to event-driven potentials in experiments on humans. Finally, we show that the γ power increases with the number of loaded items, as reported in many experiments, while θ and β power reveal non-monotonic behaviours. In particular, β and γ rhythms are crucially sustained by the inhibitory activity, while the θ rhythm is controlled by excitatory synapses. Author summary: Working Memory (WM) is the ability to temporarily store and manipulate stimuli representations that are no longer available to the senses. We have developed an innovative coarse-grained population model able to mimic several operations associated with WM. The novelty of the model consists in reproducing exactly the dynamics of spiking neural networks with realistic synaptic plasticity composed of hundreds of thousands of neurons in terms of a few macroscopic variables. These variables give access to experimentally measurable quantities such as local field potentials and electroencephalographic signals. Memory operations are accompanied by sustained or transient oscillations emerging in different frequency bands, in accordance with experimental results for primates and humans performing WM tasks. We have designed an architecture composed of many excitatory populations and a common inhibitory pool able to store and retain several memory items. The capacity of our multi-item architecture is around 3-5 items, a value corresponding to the WM capacities measured in many experiments. Furthermore, the maximal capacity is achievable only for presentation rates within an optimal frequency range. Finally, we have defined a measure of the memory load analogous to the event-related potentials employed to test humans' WM capacity during visual memory tasks.
Patient-specific network connectivity combined with a next generation neural mass model to test clinical hypothesis of seizure propagation
Dynamics underlying epileptic seizures span multiple scales in space and time; therefore, understanding seizure mechanisms requires identifying the relations between seizure components within and across these scales, together with the analysis of their dynamical repertoire. In this view, mathematical models have been developed, ranging from single neuron to neural population. In this study we consider a neural mass model able to exactly reproduce the dynamics of heterogeneous spiking neural networks. We combine the mathematical modelling with structural information from non-invasive brain imaging, thus building large-scale brain network models to explore emergent dynamics and test clinical hypotheses. We provide a comprehensive study on the effect of external drives on neuronal networks exhibiting multistability, in order to investigate the role played by the neuroanatomical connectivity matrices in shaping the emergent dynamics. In particular, we systematically investigate the conditions under which the network displays a transition from a low activity regime to a high activity state, which we identify with a seizure-like event. This approach allows us to study the biophysical parameters and variables leading to multiple recruitment events at the network level. We further exploit topological network measures to explain the differences and analogies among the subjects and their brain regions in showing recruitment events at different parameter values. We demonstrate, using the example of diffusion-weighted magnetic resonance imaging (MRI) connectomes of 20 healthy subjects and 15 epileptic patients, that individual variations in structural connectivity, when linked with mathematical dynamic models, have the capacity to explain changes in the spatiotemporal organization of brain dynamics, as observed in network-based brain disorders. In particular, for epileptic patients, by integrating the clinical hypotheses on the epileptogenic zone (EZ), i.e. the local network where highly synchronous seizures originate, we have identified the sequence of recruitment events and discussed their links with the topological properties of the specific connectomes. The predictions made on the basis of the implemented set of exact mean-field equations turn out to be in line with the clinical pre-surgical evaluation of the recruited secondary networks.
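To make the recruitment scenario concrete, the toy script below couples generic bistable units through a random weighted matrix that stands in for a structural connectome, perturbs one hypothetical epileptogenic-zone node, and reports which nodes end up recruited into the high-activity state. It is a caricature of the approach, not the paper's exact mean-field equations, and no patient data are involved.

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20
C = rng.lognormal(-2.0, 1.0, size=(N, N))      # stand-in "connectome" weights (not patient data)
np.fill_diagonal(C, 0.0)

def drift(x):
    # Generic double-well node dynamics: stable low state at 0, stable high
    # ("seizure-like") state at 1, threshold at 0.5.
    return -x * (x - 0.5) * (x - 1.0)

g = 0.15                   # global coupling scaling (illustrative)
ez = 3                     # hypothetical epileptogenic-zone node
x = np.zeros(N)
dt, T = 1e-2, 200.0

for step in range(int(T / dt)):
    t = step * dt
    stim = np.zeros(N)
    if 10.0 <= t < 12.0:
        stim[ez] = 1.5     # transient perturbation kicking the EZ into the high state
    x = x + dt * (drift(x) + g * (C @ x) + stim)

print("recruited nodes (high-activity state):", np.where(x > 0.7)[0].tolist())
```

How far the recruitment cascade spreads depends on the coupling scale and on how strongly the perturbed node projects to the rest of the network, which is the kind of topology dependence the study examines on real connectomes.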
The Virtual Brain Ontology: A Digital Knowledge Framework for Reproducible Brain Network Modeling
Computational models of brain network dynamics offer mechanistic insights into brain function and disease, and are utilized for hypothesis generation, data interpretation, and the creation of personalized digital brain twins. However, results remain difficult to reproduce and compare because equations, parameters, networks, and numerical settings are reported inconsistently across the literature, and shared code is often not fully documented, standardized, or executable. We introduce The Virtual Brain Ontology (TVB-O), a semantic knowledge base, minimal metadata standard, and Python toolbox that simplifies the description, execution, and sharing of network simulations. TVB-O offers 1) a common vocabulary and ontology for core concepts and axioms representing current domain knowledge for simulating brain network dynamics, 2) a minimal, human- and machine-readable metadata specification for the information needed to reproduce an experiment, 3) a curated database of published models, brain networks, and study configurations, and 4) software that generates executable code for various simulation platforms and programming languages, including The Virtual Brain, JAX, or Julia. FAIR metadata and provenance-aware reports can be exported from TVB-O's model specification. It thereby enables a flexible framework for adopting new models and enhances reproducibility, comparability, and portability across simulators, while making assumptions explicit and linking models to biomedical knowledge and observation pathways. By reducing technical barriers and standardizing workflows, TVB-O broadens access to computational neuroscience and establishes a foundation for transparent, shareable "digital brain twins" that integrate with clinical pipelines and large-scale data resources.
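As an indication of what a minimal, machine-readable record of one simulation might look like, here is a small Python sketch. The field names and values are hypothetical and do not follow the actual TVB-O vocabulary or metadata specification.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class SimulationRecord:
    """Hypothetical minimal metadata for one brain-network simulation run.

    The fields below are illustrative only and are NOT the TVB-O schema."""
    model: str
    model_parameters: dict
    connectome: str              # reference (DOI, file hash, ...) of the structural network
    integrator: str
    step_size_ms: float
    duration_ms: float
    noise_seed: int
    software: dict = field(default_factory=dict)

record = SimulationRecord(
    model="JansenRit",
    model_parameters={"A": 3.25, "B": 22.0, "v0": 6.0},   # standard Jansen-Rit constants (mV)
    connectome="placeholder-connectome-identifier",
    integrator="stochastic_heun",
    step_size_ms=0.1,
    duration_ms=60000.0,
    noise_seed=42,
    software={"simulator": "TVB", "version": "unspecified"},
)

# A provenance-aware report could then be exported as plain JSON.
print(json.dumps(asdict(record), indent=2))
```

The point of such a record is that another group (or another simulator backend) can regenerate the run from the metadata alone, which is the portability goal the abstract describes.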
NER-RoBERTa: Fine-Tuning RoBERTa for Named Entity Recognition (NER) within low-resource languages
Nowadays, Natural Language Processing (NLP) is an important tool for most people's daily life routines, ranging from understanding speech, translation, named entity recognition (NER), and text categorization, to generative text models such as ChatGPT. Due to the existence of big data and consequently large corpora for widely used languages like English, Spanish, Turkish, Persian, and many more, these applications have been developed to a high level of accuracy. However, the Kurdish language still requires more corpora and large datasets to be included in NLP applications. This is because Kurdish has a rich linguistic structure, varied dialects, and limited datasets, which poses unique challenges for Kurdish NLP (KNLP) application development. While several studies have been conducted in KNLP for various applications, Kurdish NER (KNER) remains a challenge for many KNLP tasks, including text analysis and classification. In this work, we address this limitation by proposing a methodology for fine-tuning the pre-trained RoBERTa model for KNER. To this end, we first create a Kurdish corpus, followed by designing a modified model architecture and implementing the training procedures. To evaluate the trained model, a set of experiments is conducted to demonstrate the performance of the KNER model using different tokenization methods and trained models. The experimental results show that fine-tuned RoBERTa with the SentencePiece tokenization method substantially improves KNER performance, achieving a 12.8% improvement in F1-score compared to traditional models, and consequently establishes a new benchmark for KNLP.
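For orientation, the sketch below shows a standard Hugging Face recipe for fine-tuning a RoBERTa-family encoder for token classification; xlm-roberta-base is chosen here only because it uses a SentencePiece tokenizer, and the dataset files, tag set, and hyperparameters are placeholders rather than the resources built in the paper.

```python
from datasets import load_dataset
from transformers import (AutoTokenizer, AutoModelForTokenClassification,
                          DataCollatorForTokenClassification, TrainingArguments, Trainer)

labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]   # assumed tag set
tokenizer = AutoTokenizer.from_pretrained("xlm-roberta-base")
model = AutoModelForTokenClassification.from_pretrained("xlm-roberta-base", num_labels=len(labels))

# Hypothetical JSON files with "tokens" (word lists) and "ner_tags" (label indices) per example.
ds = load_dataset("json", data_files={"train": "kner_train.json", "validation": "kner_dev.json"})

def tokenize_and_align(batch):
    enc = tokenizer(batch["tokens"], is_split_into_words=True, truncation=True)
    all_labels = []
    for i, tags in enumerate(batch["ner_tags"]):
        word_ids, prev, lab = enc.word_ids(batch_index=i), None, []
        for wid in word_ids:
            # Label only the first sub-token of each word; mask the rest with -100.
            lab.append(-100 if wid is None or wid == prev else tags[wid])
            prev = wid
        all_labels.append(lab)
    enc["labels"] = all_labels
    return enc

tokenized = ds.map(tokenize_and_align, batched=True, remove_columns=ds["train"].column_names)

args = TrainingArguments("kner-roberta", learning_rate=2e-5, num_train_epochs=3,
                         per_device_train_batch_size=16)
trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized["train"], eval_dataset=tokenized["validation"],
                  data_collator=DataCollatorForTokenClassification(tokenizer))
trainer.train()
```

Swapping the checkpoint or the tokenizer in this recipe is how different tokenization methods, as compared in the paper, would be evaluated against each other.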
A Novel Poisoned Water Detection Method Using Smartphone Embedded Wi-Fi Technology and Machine Learning Algorithms
Water is a fluid necessary to the human body, and automatic checking of its quality and cleanliness is an ongoing area of research. One such approach is to expose the liquid to various types of signals and make the amount of signal attenuation an indication of the liquid category. In this article, we have utilized the Wi-Fi signal to distinguish clean water from poisoned water by training different machine learning algorithms. The Wi-Fi access point (WAP) signal is acquired via smartphone-embedded Wi-Fi chipsets, and then Channel State Information (CSI) measures are extracted and converted into feature vectors to be used as input for machine learning classification algorithms. The measured amplitude and phase of the CSI data are selected as input features for four classifiers: k-NN, SVM, LSTM, and Ensemble. The experimental results show that the models are able to differentiate poisoned water from clean water, with a classification accuracy of 89% when LSTM is applied, while 92% classification accuracy is achieved when the AdaBoost-Ensemble classifier is applied.
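The classical-classifier part of such a pipeline can be sketched with scikit-learn as follows: per-subcarrier amplitude and unwrapped phase of the CSI form the feature vector, which is fed to k-NN, an SVM, and an AdaBoost ensemble. The CSI array below is synthetic, the LSTM branch is omitted, and nothing here reproduces the paper's measurements or reported accuracies.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import AdaBoostClassifier

rng = np.random.default_rng(3)
n_samples, n_subcarriers = 400, 30
csi = rng.normal(size=(n_samples, n_subcarriers)) + 1j * rng.normal(size=(n_samples, n_subcarriers))
y = rng.integers(0, 2, n_samples)          # 0 = clean water, 1 = poisoned (synthetic labels)
csi[y == 1] *= 0.8                          # toy attenuation effect for the "poisoned" class

# Feature vector: per-subcarrier amplitude and unwrapped phase of the CSI.
X = np.hstack([np.abs(csi), np.unwrap(np.angle(csi), axis=1)])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

for name, clf in [("k-NN", KNeighborsClassifier(n_neighbors=5)),
                  ("SVM", SVC(kernel="rbf")),
                  ("AdaBoost", AdaBoostClassifier(n_estimators=100))]:
    pipe = make_pipeline(StandardScaler(), clf)
    pipe.fit(X_tr, y_tr)
    print(f"{name} accuracy on the synthetic split: {pipe.score(X_te, y_te):.3f}")
```

Real CSI from a smartphone chipset would simply replace the synthetic `csi` array; the feature construction and classifier comparison stay the same.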