Catalogue Search | MBRL
Explore the vast range of titles available.
158,358 result(s) for "COMPUTERS / Neural Networks"
Neural control engineering : the emerging intersection between control theory and neuroscience
2012,2011
How powerful new methods in nonlinear control engineering can be applied to neuroscience, from fundamental model formulation to advanced medical applications. Over the past sixty years, powerful methods of model-based control engineering have been responsible for such dramatic advances in engineering systems as autolanding aircraft, autonomous vehicles, and even weather forecasting. Over those same decades, our models of the nervous system have evolved from single-cell membranes to neuronal networks to large-scale models of the human brain. Yet until recently control theory was completely inapplicable to the types of nonlinear models being developed in neuroscience. The revolution in nonlinear control engineering in the late 1990s has made the intersection of control theory and neuroscience possible. In Neural Control Engineering, Steven Schiff seeks to bridge the two fields, examining the application of new methods in nonlinear control engineering to neuroscience. After presenting extensive material on formulating computational neuroscience models in a control environment, including some fundamentals of the algorithms helpful in crossing the divide from intuition to effective application, Schiff examines a range of applications, including brain-machine interfaces and neural stimulation. He reports on research that he and his colleagues have undertaken showing that nonlinear control theory methods can be applied to models of single cells, small neuronal networks, and large-scale networks in disease states such as Parkinson's disease and epilepsy. With Neural Control Engineering the reader acquires a working knowledge of the fundamentals of control theory and computational neuroscience sufficient not only to understand the literature in this transdisciplinary area but also to begin working to advance the field. The book will serve as an essential guide for scientists in either biology or engineering and for physicians who wish to gain expertise in these areas.
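The core technique the book develops, model-based state estimation for nonlinear neuron models, can be made concrete with a small sketch. The example below is not taken from the book (Schiff works mainly with the unscented Kalman filter); it is a minimal, hypothetical illustration that simulates a FitzHugh-Nagumo neuron and uses a basic extended Kalman filter, in Python with NumPy, to track the unobserved recovery variable from noisy voltage measurements. All parameter values are assumed for the example.

```python
# Minimal sketch (not from the book): an extended Kalman filter tracking a
# FitzHugh-Nagumo neuron from noisy voltage observations only.
import numpy as np

dt, eps, a, b, I_ext = 0.1, 0.08, 0.7, 0.8, 0.5   # assumed model parameters

def step(x):
    """One Euler step of the FitzHugh-Nagumo dynamics."""
    v, w = x
    dv = v - v**3 / 3.0 - w + I_ext
    dw = eps * (v + a - b * w)
    return np.array([v + dt * dv, w + dt * dw])

def jacobian(x):
    """Jacobian of the Euler step with respect to the state (v, w)."""
    v, _ = x
    return np.eye(2) + dt * np.array([[1.0 - v**2, -1.0],
                                      [eps,        -eps * b]])

rng = np.random.default_rng(0)
H = np.array([[1.0, 0.0]])     # only the voltage v is measured
Q = 1e-4 * np.eye(2)           # process noise covariance (assumed)
R = 0.05                       # measurement noise variance (assumed)

x_true = np.array([-1.0, 1.0]) # simulated neuron
x_est = np.zeros(2)            # filter's initial guess
P = np.eye(2)

for _ in range(2000):
    x_true = step(x_true)
    y = x_true[0] + rng.normal(scale=np.sqrt(R))   # noisy voltage sample

    # Predict: propagate the estimate and its covariance through the model.
    F = jacobian(x_est)
    x_est = step(x_est)
    P = F @ P @ F.T + Q

    # Correct: fold in the voltage measurement; the hidden w is updated too.
    S = (H @ P @ H.T).item() + R
    K = (P @ H.T).ravel() / S
    x_est = x_est + K * (y - x_est[0])
    P = (np.eye(2) - np.outer(K, H.ravel())) @ P

print("true (v, w):", np.round(x_true, 3), " estimated:", np.round(x_est, 3))
```

In the book's setting the same predict-and-correct loop is run with an unscented transform in place of the Jacobian linearization, which copes better with strongly nonlinear neuron dynamics.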
Advanced Deep Learning with TensorFlow 2 and Keras
by
Atienza, Rowel
in
COM004000 COMPUTERS / Intelligence (AI) & Semantics
,
COMPUTERS / Natural Language Processing
,
COMPUTERS / Neural Networks
2020
A second edition of the bestselling guide to exploring and mastering deep learning with Keras, updated to include TensorFlow 2.x with new chapters on object detection, semantic segmentation, and unsupervised learning using mutual information.
A Proof that Artificial Neural Networks Overcome the Curse of Dimensionality in the Numerical Approximation of Black–Scholes Partial Differential Equations
by
Hornung, Fabian
,
von Wurstemberger, Philippe
,
Grohs, Philipp
in
Approximation theory
,
Differential equations, Partial -- Numerical solutions
,
Neural networks (Computer science)
2023
Artificial neural networks (ANNs) have very successfully been used in numerical simulations for a series of computational problems ranging from image classification/image recognition, speech recognition, time series analysis, game intelligence, and computational advertising to numerical approximations of partial differential equations (PDEs). Such numerical simulations suggest that ANNs have the capacity to very efficiently approximate high-dimensional functions and, especially, indicate that ANNs seem to admit the fundamental power to overcome the curse of dimensionality when approximating the high-dimensional functions appearing in the above-named computational problems. There are a series of rigorous mathematical approximation results for ANNs in the scientific literature. Some of them prove convergence without convergence rates, and some of these mathematical results even rigorously establish convergence rates, but there are only a few special cases where mathematical results can rigorously explain the empirical success of ANNs when approximating high-dimensional functions. The key contribution of this article is to disclose that ANNs can efficiently approximate high-dimensional functions in the case of numerical approximations of Black-Scholes PDEs. More precisely, this work reveals that the number of required parameters of an ANN to approximate the solution of the Black-Scholes PDE grows at most polynomially in both the reciprocal of the prescribed approximation accuracy and the dimension of the PDE.
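The headline statement is about scaling: the parameter count needed for a given accuracy grows at most polynomially in the reciprocal of that accuracy and in the PDE dimension, rather than exponentially in the dimension. Purely as an illustration of the object being approximated (this is not the paper's construction, which is a theoretical argument built on Monte Carlo approximations), the sketch below fits a small fully connected network to the closed-form one-dimensional Black-Scholes call price; the strike, rate, and volatility values, and the use of scikit-learn's MLPRegressor, are assumptions of the example.

```python
# Toy illustration (not the paper's construction): fit a small neural network
# to the one-dimensional Black-Scholes call price as a function of spot price
# and time to maturity, with fixed strike, rate and volatility (assumed).
import numpy as np
from scipy.stats import norm
from sklearn.neural_network import MLPRegressor

K, r, sigma = 1.0, 0.02, 0.3   # strike, risk-free rate, volatility (assumed)

def bs_call(s, t):
    """Closed-form Black-Scholes price of a European call with maturity t."""
    d1 = (np.log(s / K) + (r + 0.5 * sigma**2) * t) / (sigma * np.sqrt(t))
    d2 = d1 - sigma * np.sqrt(t)
    return s * norm.cdf(d1) - K * np.exp(-r * t) * norm.cdf(d2)

rng = np.random.default_rng(0)
spots = rng.uniform(0.5, 1.5, 20000)
times = rng.uniform(0.1, 2.0, 20000)
X = np.column_stack([spots, times])
y = bs_call(spots, times)

# A small fully connected network; its parameter count is the quantity the
# paper bounds as accuracy and dimension vary.
net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
net.fit(X, y)

test = np.array([[1.0, 1.0], [1.2, 0.5]])
print("network:", net.predict(test))
print("exact:  ", bs_call(test[:, 0], test[:, 1]))
```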
New directions in statistical signal processing : from systems to brain
by
Sejnowski, Terrence J. (Terrence Joseph)
,
McWhirter, John
,
Príncipe, J. C. (José C.)
in
Neural computers
,
Neural networks (Computer science)
,
Neural networks (Neurobiology)
2007,2006
Signal processing and neural computation have separately and significantly influenced many disciplines, but the cross-fertilization of the two fields has begun only recently. Research now shows that each has much to teach the other, as we see highly sophisticated kinds of signal processing and elaborate hierarchical levels of neural computation performed side by side in the brain. In New Directions in Statistical Signal Processing, leading researchers from both signal processing and neural computation present new work that aims to promote interaction between the two disciplines. The book's 14 chapters, almost evenly divided between signal processing and neural computation, begin with the brain and move on to communication, signal processing, and learning systems. They examine such topics as how computational models help us understand the brain's information processing, how an intelligent machine could solve the "cocktail party problem" with "active audition" in a noisy environment, graphical and network structure modeling approaches, uncertainty in network communications, the geometric approach to blind signal processing, game-theoretic learning algorithms, and observable operator models (OOMs) as an alternative to hidden Markov models (HMMs).
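One topic singled out above, the "cocktail party problem", is the standard showcase for blind source separation. The following sketch is a generic illustration rather than material from the book: two synthetic source signals are mixed by an assumed mixing matrix and then recovered with FastICA from scikit-learn, up to the usual scale and ordering ambiguity.

```python
# Generic cocktail-party illustration (not from the book): separate two mixed
# signals with independent component analysis (FastICA).
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 4000)

# Two independent "speakers": a sine tone and a sawtooth-like signal.
s1 = np.sin(2 * np.pi * 1.0 * t)
s2 = 2 * (t * 3 % 1) - 1
S = np.column_stack([s1, s2])

# Two "microphones", each recording a different mixture of the sources.
A = np.array([[1.0, 0.5],
              [0.4, 1.0]])          # assumed mixing matrix
X = S @ A.T + 0.02 * rng.standard_normal(S.shape)

ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(X)    # estimated sources, up to scale and order

# Correlate recovered components with the true sources to check the match.
for i in range(2):
    corr = [abs(np.corrcoef(recovered[:, i], S[:, j])[0, 1]) for j in range(2)]
    print(f"component {i}: correlation with sources = {np.round(corr, 3)}")
```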
Blind Equalization in Neural Networks
by
Tsinghua University Press, Tsinghua University
,
Zhang, Liyi
in
COM004000 COMPUTERS / Intelligence (AI) & Semantics
,
COMPUTERS / Neural Networks
,
COMPUTERS / Programming / Algorithms
2017
The book begins with an introduction to blind equalization theory and its application in neural networks, then discusses the algorithms used in recurrent networks, fuzzy networks, and other frequently studied neural networks.
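To make the problem concrete: blind equalization adapts a filter to undo an unknown channel using only the received signal's statistics, with no training sequence. The sketch below is not code from the book, and it uses the classical linear constant modulus algorithm (CMA) rather than the neural-network equalizers the book covers; the channel taps, step size, and QPSK source are assumptions of the example.

```python
# Generic blind equalization sketch (not from the book): the constant modulus
# algorithm (CMA) adapting an FIR equalizer for QPSK sent over an unknown channel.
import numpy as np

rng = np.random.default_rng(0)
n = 20000
symbols = (rng.choice([-1.0, 1.0], n) + 1j * rng.choice([-1.0, 1.0], n)) / np.sqrt(2)

channel = np.array([1.0, 0.35 + 0.2j, -0.15j])        # assumed channel taps
received = np.convolve(symbols, channel)[:n]
received += 0.01 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

taps = 11
w = np.zeros(taps, dtype=complex)
w[taps // 2] = 1.0                  # centre-spike initialisation
mu, R2 = 1e-3, 1.0                  # step size (assumed) and QPSK modulus

for k in range(taps, n):
    x = received[k - taps:k][::-1]                      # most recent sample first
    y = np.dot(w, x)                                    # equalizer output
    w -= mu * (np.abs(y) ** 2 - R2) * y * np.conj(x)    # CMA stochastic gradient step

# After adaptation the equalized symbols should sit close to the unit circle.
tail = list(range(n - 2000, n))
out = np.array([np.dot(w, received[k - taps:k][::-1]) for k in tail])
print("mean | |y|^2 - 1 | after equalization: ", np.mean(np.abs(np.abs(out) ** 2 - R2)))
print("mean | |r|^2 - 1 | before equalization:", np.mean(np.abs(np.abs(received[tail]) ** 2 - R2)))
```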
Information processing by biochemical systems
2010,2009
Describes fully delineated biochemical systems, organized as neural network-type assemblies. It explains the relationship between these two apparently unrelated fields, revealing how biochemical systems have the advantage of using the "language" of the physiological processes and, therefore, can be organized into neural network-type assemblies, much in the way that natural biosystems are.
Information processing in echo state networks at the edge of chaos
by
Lizier, Joseph T.
,
Mayer, N. Michael
,
Asada, Minoru
in
Bioinformatics
,
Biology
,
Biomedical and Life Sciences
2012
We investigate information processing in randomly connected recurrent neural networks. It has been shown previously that the computational capabilities of these networks are maximized when the recurrent layer is close to the border between a stable and an unstable dynamics regime, the so-called edge of chaos. The reasons, however, for this maximized performance are not completely understood. We adopt an information-theoretical framework and are for the first time able to quantify the computational capabilities between elements of these networks directly as they undergo the phase transition to chaos. Specifically, we present evidence that both information transfer and storage in the recurrent layer are maximized close to this phase transition, providing an explanation for why guiding the recurrent layer toward the edge of chaos is computationally useful. As a consequence, our study suggests self-organized ways of improving performance in recurrent neural networks, driven by input data. Moreover, the networks we study share important features with biological systems, such as feedback connections and online computation on input streams. A key example is the cerebral cortex, which was shown to also operate close to the edge of chaos. Consequently, the behavior of model systems as studied here is likely to shed light on reasons why biological systems are tuned into this specific regime.
Journal Article
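The setting of the article can be made concrete with a small echo state network: a fixed random recurrent layer whose spectral radius controls how close the dynamics sit to the stable/unstable border, and a linear readout that is the only trained part. The sketch below is a generic NumPy illustration, not the authors' code; the delayed-recall task, the reservoir size, and the spectral radius of 0.95 are assumptions of the example.

```python
# Generic echo state network sketch (not the authors' code): a random recurrent
# reservoir with its spectral radius scaled toward the "edge of chaos", and a
# linear readout trained to reproduce the input delayed by a few steps.
import numpy as np

rng = np.random.default_rng(0)
n_reservoir, n_steps, delay = 300, 3000, 5
spectral_radius = 0.95              # close to 1: near the stability border (assumed)

# Random input weights and recurrent weights, rescaled to the chosen radius.
W_in = rng.uniform(-0.5, 0.5, size=(n_reservoir, 1))
W = rng.standard_normal((n_reservoir, n_reservoir))
W *= spectral_radius / np.max(np.abs(np.linalg.eigvals(W)))

u = rng.uniform(-1, 1, n_steps)     # random input stream
states = np.zeros((n_steps, n_reservoir))
x = np.zeros(n_reservoir)
for t in range(n_steps):
    x = np.tanh(W @ x + W_in[:, 0] * u[t])
    states[t] = x

# Train a ridge-regression readout to output u(t - delay) from the state x(t).
washout = 200
X = states[washout:-1]
y = u[washout - delay:-1 - delay]
ridge = 1e-6
W_out = np.linalg.solve(X.T @ X + ridge * np.eye(n_reservoir), X.T @ y)

pred = X @ W_out
print("readout NRMSE:", np.sqrt(np.mean((pred - y) ** 2)) / np.std(y))
```

Sweeping a reservoir parameter of this kind through the transition to chaos, and measuring information transfer and storage along the way, is the type of experiment the abstract describes.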
Machine Learning for Future Wireless Communications
by
Luo, Fa-Long
in
Communication, Networking and Broadcast Technologies
,
Computing and Processing
,
Machine learning
2020
A comprehensive review of the theory, applications, and research of machine learning for future wireless communications. In one single volume, Machine Learning for Future Wireless Communications provides a comprehensive and highly accessible treatment of the theory, applications, and current research developments in machine learning for wireless communications and networks. The technology development of machine learning for wireless communications has grown explosively and is one of the biggest trends in the related academic, research, and industry communities. Deep neural network-based machine learning is a promising tool for attacking the big challenges in wireless communications and networks imposed by increasing demands in terms of capacity, coverage, latency, efficiency, flexibility, compatibility, quality of experience, and silicon convergence. The author, a noted expert on the topic, covers a wide range of topics including system architecture and optimization, physical-layer and cross-layer processing, air interface and protocol design, beamforming and antenna configuration, network coding and slicing, cell acquisition and handover, scheduling and rate adaptation, radio access control, smart proactive caching, and adaptive resource allocation. Uniquely organized into three categories, Spectrum Intelligence, Transmission Intelligence, and Network Intelligence, this important resource offers a comprehensive review of the theory, applications, and current developments of machine learning for wireless communications and networks; covers a range of topics from architecture and optimization to adaptive resource allocation; reviews state-of-the-art machine learning based solutions for network coverage; includes an overview of the applications of machine learning algorithms in future wireless networks; and explores flexible backhaul and front-haul, cross-layer optimization and coding, full-duplex radio, digital front-end (DFE), and radio-frequency (RF) processing. Written for professional engineers, researchers, scientists, manufacturers, network operators, software developers, and graduate students, Machine Learning for Future Wireless Communications presents in 21 chapters a comprehensive review of the topic authored by an expert in the field.