A New Perspective on Memorization in Recurrent Networks of Spiking Neurons

Author: Patrick Murer
Publisher: BoD – Books on Demand
ISBN: 3866287585
Category: Computers
Languages: en
Pages: 230

Book Description
This thesis studies the capability of spiking recurrent neural network models to memorize dynamical pulse patterns (or firing signals). In the first part, discrete-time firing signals (or firing sequences) are considered. A recurrent network model, consisting of neurons with bounded disturbance, is introduced to analyze (simple) local learning. Two modes of learning/memorization are considered: the first mode is strictly online, with a single pass through the data, while the second mode uses multiple passes through the data. In both modes, the learning is strictly local (quasi-Hebbian): at any given time step, only the weights between the neurons firing (or supposed to be firing) at the previous time step and those firing (or supposed to be firing) at the present time step are modified. The main result is an upper bound on the probability that the single-pass memorization is not perfect. It follows that the memorization capacity in this mode asymptotically scales like that of the classical Hopfield model (which, in contrast, memorizes static patterns). However, multi-pass memorization is shown to achieve a higher capacity, with an asymptotically nonvanishing number of bits per connection/synapse. These mathematical findings may be helpful for understanding the functionality of short-term and long-term memory in neuroscience. In the second part, continuous-time firing signals are studied. It is shown how firing signals containing firings only on a regular time grid can be (robustly) memorized with a recurrent network model. In principle, the corresponding weights are obtained by supervised (quasi-Hebbian) multi-pass learning. As in the discrete-time case, the asymptotic memorization capacity is a nonvanishing number of bits per connection/synapse. Furthermore, the timing robustness of the memorized firing signals is investigated for different disturbance models. The regime of disturbances in which the relative occurrence times of the firings are preserved over a long time span is characterized for the various disturbance models. The proposed models have the potential for energy-efficient, self-timed neuromorphic hardware implementations.
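
The local learning rule lends itself to a toy simulation. The sketch below is a minimal illustration under stated assumptions, not the thesis's model (which uses neurons with bounded disturbance and derives error-probability bounds): it uses a clipped-Hebbian variant in which, at each step, only the weights from neurons firing at the previous time step to neurons firing at the present time step are switched on, and recall replays the sequence from its first pattern.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, k = 400, 30, 20              # neurons, sequence length, firings per step

# Target firing sequence: at each time step, exactly k of N neurons fire.
X = np.zeros((T, N))
for t in range(T):
    X[t, rng.choice(N, size=k, replace=False)] = 1.0

# Single-pass, strictly local learning (clipped-Hebbian variant):
# W[i, j] is switched on only when j fires at step t-1 and i fires at step t.
W = np.zeros((N, N))
for t in range(1, T):
    W = np.maximum(W, np.outer(X[t], X[t - 1]))

# Recall: seed with the first pattern and let the network run;
# at each step the k most strongly driven neurons fire.
x = X[0]
errors = 0
for t in range(1, T):
    drive = W @ x
    x = np.zeros(N)
    x[np.argsort(drive)[-k:]] = 1.0
    errors += int(np.any(x != X[t]))
print(f"imperfectly recalled steps: {errors}/{T - 1}")
```

At these toy dimensions the replay is typically exact; increasing T until errors appear gives a feel for the capacity limits that the thesis quantifies.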

Composite NUV Priors and Applications

Author: Raphael Urs Keusch
Publisher: BoD – Books on Demand
ISBN: 3866287682
Category: Computers
Languages: en
Pages: 275

Book Description
Normal with unknown variance (NUV) priors are a central idea of sparse Bayesian learning and allow variational representations of non-Gaussian priors. More specifically, such variational representations can be seen as parameterized Gaussians, wherein the parameters are generally unknown. The advantage is apparent: for fixed parameters, NUV priors are Gaussian, and hence computationally compatible with Gaussian models. Moreover, working with (linear-)Gaussian models is particularly attractive since the Gaussian distribution is closed under affine transformations, marginalization, and conditioning. Interestingly, the variational representation proves to be universal rather than restrictive: many common sparsity-promoting priors (among them, in particular, the Laplace prior) can be represented in this manner. In estimation problems, parameters or variables of the underlying model are often subject to constraints (e.g., discrete-level constraints). Such constraints cannot adequately be represented by linear-Gaussian models and generally require special treatment. To handle such constraints within a linear-Gaussian setting, we extend the idea of NUV priors beyond its original use for sparsity. In particular, we study compositions of existing NUV priors, referred to as composite NUV priors, and show that many commonly used model constraints can be represented in this way.
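
To make the Laplace example concrete, the standard variational identity behind its NUV representation can be stated in one line (notation ours, not necessarily the book's): the absolute-value penalty is a minimum over quadratic, i.e., Gaussian, penalties.

```latex
% Laplace penalty as a minimum over Gaussian (quadratic) penalties:
\lambda \lvert x \rvert
  = \min_{\sigma^2 > 0}
    \left( \frac{x^2}{2\sigma^2} + \frac{\lambda^2 \sigma^2}{2} \right),
\qquad \text{with minimizer } \hat{\sigma}^2 = \frac{\lvert x \rvert}{\lambda}.
```

For fixed \sigma^2 the bracketed term is a Gaussian negative log-density in x (up to a constant), so alternating between updating \sigma^2 and solving the resulting linear-Gaussian estimation problem keeps every step computationally Gaussian.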

Using Local State Space Model Approximation for Fundamental Signal Analysis Tasks

Author: Elizabeth Ren
Publisher: BoD – Books on Demand
ISBN: 3866287925
Category: Computers
Languages: en
Pages: 288

Book Description
With increasing availability of computation power, digital signal analysis algorithms have the potential of evolving from the common framewise method of operation to samplewise operation, which offers more precision in time. This thesis discusses a set of methods with samplewise operations: local signal approximation via Recursive Least Squares (RLS), where a mathematical model is fit to the signal within a sliding window at each sample. Both the signal models and the cost windows are generated by Autonomous Linear State Space Models (ALSSMs). The modeling capability of ALSSMs is vast, as they can model exponential, polynomial, and sinusoidal functions as well as any linear and multiplicative combination thereof. The fitting method offers efficient recursions, subsample precision by way of the signal model, and additional goodness-of-fit measures based on the recursively computed fitting cost. Classical methods such as standard Savitzky-Golay (SG) smoothing filters and the Short-Time Fourier Transform (STFT) are united under a common framework. First, we complete the existing framework. The ALSSM parameterization and RLS recursions are provided for a general function. The solutions for the fit parameters under different constraints are reviewed. Moreover, feature extraction from both the fit parameters and the cost is detailed, along with examples of their use. In particular, we introduce terminology to analyze the fitting problem from the perspective of projection onto a local Hilbert space and as a linear filter. Analytical rules are given for computing the equivalent filter response and the steady-state precision matrix of the cost. After establishing the local approximation framework, we discuss two classes of signal models in particular, namely polynomial and sinusoidal functions. The two signal models are complementary: by nature, polynomials are suited to time-domain description of signals, while sinusoids are suited to the frequency domain. For local approximation by polynomials, we derive analytical expressions for the steady-state covariance matrix and the linear filter of the coefficients, based on the theory of orthogonal polynomial bases. We then discuss the fundamental application of smoothing filters based on local polynomial approximation. We generalize standard SG filters to any ALSSM window and introduce a novel class of smoothing filters based on polynomial fitting to running sums.
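
As a concrete instance of local approximation with a polynomial signal model, the sketch below implements the classical Savitzky-Golay idea: a least-squares polynomial fit in a symmetric sliding window, evaluated at the window center. It uses a plain batch fit rather than the ALSSM windows and recursions of the thesis; all names and parameters are illustrative.

```python
import numpy as np

def local_poly_smooth(y, half_width=10, degree=2):
    """Savitzky-Golay-style smoother: fit a degree-`degree` polynomial
    to y in a window of 2*half_width+1 samples around each sample and
    return the fit evaluated at the window center."""
    t = np.arange(-half_width, half_width + 1)
    A = np.vander(t, degree + 1)        # columns t^degree, ..., t, 1
    # Row of the pseudoinverse that yields the constant term c0,
    # i.e., the least-squares fit evaluated at t = 0:
    h = np.linalg.pinv(A)[degree]
    ypad = np.pad(y, half_width, mode="edge")
    # Correlate each window with h (flip h because convolve flips):
    return np.convolve(ypad, h[::-1], mode="valid")

# Example: smooth a noisy sinusoid.
x = np.linspace(0, 4 * np.pi, 500)
y = np.sin(x) + 0.3 * np.random.default_rng(1).standard_normal(500)
y_smooth = local_poly_smooth(y, half_width=15, degree=3)
```

Because the fit is linear in the window samples, the whole smoother collapses into the fixed filter h, which is the "equivalent filter response" perspective mentioned above.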

The Role of Synaptic Tagging and Capture for Memory Dynamics in Spiking Neural Networks

Author: Jannik Luboeinski
Publisher:
ISBN:
Category: Science
Languages: en
Pages: 201

Book Description
Memory serves to process and store information about experiences such that this information can be used in future situations. The transfer from transient storage into long-term memory, which retains information for hours, days, and even years, is called consolidation. In brains, information is primarily stored via alteration of synapses, so-called synaptic plasticity. While these changes are at first in a transient early phase, they can be transferred to a late phase, meaning that they become stabilized over the course of several hours. This stabilization has been explained by so-called synaptic tagging and capture (STC) mechanisms. To store and recall memory representations, emergent dynamics arise from the synaptic structure of recurrent networks of neurons. This happens through so-called cell assemblies, which feature particularly strong synapses. It has been proposed that the stabilization of such cell assemblies by STC corresponds to so-called synaptic consolidation, which is observed in humans and other animals in the first hours after acquiring a new memory. The exact connection between the physiological mechanisms of STC and memory consolidation remains, however, unclear. It is equally unknown what influence STC mechanisms exert on further cognitive functions that guide behavior. On timescales of minutes to hours (that is, the timescales of STC), such functions include memory improvement, modification of memories, interference and enhancement of similar memories, and transient priming of certain memories. Thus, diverse memory dynamics may be linked to STC, which can be investigated by employing theoretical methods based on experimental data from the neuronal and the behavioral level. In this thesis, we present a theoretical model of STC-based memory consolidation in recurrent networks of spiking neurons, which are particularly suited to reproduce biologically realistic dynamics. Furthermore, we combine the STC mechanisms with calcium dynamics, which have been found to guide the major processes of early-phase synaptic plasticity in vivo. In three included research articles as well as additional sections, we develop this model and investigate how it can account for a variety of behavioral effects. We find that the model enables the robust implementation of the cognitive memory functions mentioned above. The main steps to this are: 1. demonstrating the formation, consolidation, and improvement of memories represented by cell assemblies; 2. showing that neuromodulator-dependent STC can retroactively control whether information is stored in a temporal or rate-based neural code; and 3. examining the interaction of multiple cell assemblies with transient and attractor dynamics in different organizational paradigms. In summary, we demonstrate several ways by which STC controls the late-phase synaptic structure of cell assemblies. Linking these structures to functional dynamics, we show that our STC-based model implements functionality that can be related to long-term memory. Thereby, we provide a basis for the mechanistic explanation of various neuropsychological effects. Keywords: synaptic plasticity; synaptic tagging and capture; spiking recurrent neural networks; memory consolidation; long-term memory
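
For readers who want the gist of the STC mechanism in executable form, here is a deliberately simplified sketch with hypothetical parameters, not the calibrated spiking model of the thesis: an early-phase weight change decays within hours, sets a tag while it is large, and is captured into a stable late-phase weight while plasticity-related proteins remain available.

```python
# Simplified tagging-and-capture dynamics (hypothetical parameters).
# h: transient early-phase weight change; z: stable late-phase weight;
# p: plasticity-related protein pool triggered by the strong stimulus.
dt = 1.0                            # time step in seconds
steps = int(8 * 3600 / dt)          # simulate 8 hours
tau_h, tau_p = 2 * 3600.0, 3600.0   # decay time constants (s)
theta_tag = 0.5                     # tag is set while |h| exceeds this

h, z, p = 1.0, 0.0, 1.0             # strong potentiation at t = 0
for _ in range(steps):
    tag = abs(h) > theta_tag
    z += 0.001 * tag * p * (1.0 - z) * dt   # capture while tagged
    h -= (h / tau_h) * dt                   # early phase decays
    p -= (p / tau_p) * dt                   # protein pool decays

# The early-phase change has decayed, but the captured late-phase
# component persists: the synapse has been consolidated.
print(f"after 8 h: early phase h = {h:.3f}, late phase z = {z:.3f}")
```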

Spike-timing dependent plasticity

Author: Henry Markram
Publisher: Frontiers E-books
ISBN: 2889190439
Category:
Languages: en
Pages: 575

Book Description
Hebb's postulate provided a crucial framework to understand synaptic alterations underlying learning and memory. Hebb's theory proposed that neurons that fire together also wire together, which provided the logical framework for the strengthening of synapses. Weakening of synapses, however, was addressed only by "not being strengthened", and it was only later that the active decrease of synaptic strength was introduced, through the discovery of long-term depression caused by low-frequency stimulation of the presynaptic neuron. In 1994, it was found that the precise relative timing of pre- and postsynaptic spikes determined not only the magnitude, but also the direction of synaptic alterations when two neurons are active together. Neurons that fire together may therefore not necessarily wire together if the precise timing of the spikes involved is not tightly correlated. In the subsequent 15 years, Spike Timing Dependent Plasticity (STDP) has been found in multiple brain regions and in many different species. The size and shape of the time windows in which positive and negative changes can be made vary across brain regions, but the core principle of spike-timing dependent changes remains. A large number of theoretical studies have also been conducted during this period that explore the computational function of this driving principle, and STDP algorithms have become the main learning algorithm when modeling neural networks. This Research Topic will bring together all the key experimental and theoretical research on STDP.
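
The pair-based exponential learning window that emerged from this line of work is simple enough to state in a few lines of code; the sketch below uses generic textbook parameter values (the collection itself surveys how the size and shape of these windows vary across brain regions).

```python
import numpy as np

# Pair-based STDP: a presynaptic spike shortly *before* a postsynaptic
# spike strengthens the synapse (LTP); the reverse order weakens it (LTD).
A_plus, A_minus = 0.01, 0.012       # learning rates (generic values)
tau_plus, tau_minus = 20.0, 20.0    # window time constants in ms

def stdp_dw(dt_ms):
    """Weight change for one pre/post spike pair, dt_ms = t_post - t_pre."""
    if dt_ms > 0:                   # pre before post -> potentiation
        return A_plus * np.exp(-dt_ms / tau_plus)
    return -A_minus * np.exp(dt_ms / tau_minus)  # post before pre -> depression

# Accumulate over all spike pairs of two trains (times in ms):
pre_spikes = np.array([10.0, 50.0, 90.0])
post_spikes = np.array([15.0, 45.0, 100.0])
dw = sum(stdp_dw(tp - tq) for tp in post_spikes for tq in pre_spikes)
print(f"net weight change: {dw:+.4f}")
```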

Artificial Neural Networks and Machine Learning -- ICANN 2013

Author: Valeri Mladenov
Publisher: Springer
ISBN: 3642407285
Category: Computers
Languages: en
Pages: 660

Book Description
The book constitutes the proceedings of the 23rd International Conference on Artificial Neural Networks, ICANN 2013, held in Sofia, Bulgaria, in September 2013. The 78 papers included in the proceedings were carefully reviewed and selected from 128 submissions. The papers focus on the following topics: neurofinance, graphical network models, brain-machine interfaces, evolutionary neural networks, neurodynamics, complex systems, neuroinformatics, neuroengineering, hybrid systems, computational biology, neural hardware, bioinspired embedded systems, and collective intelligence.

Neuromorphic Cognitive Systems

Author: Qiang Yu
Publisher: Springer
ISBN: 3319553100
Category: Technology & Engineering
Languages: en
Pages: 180

Book Description
This book presents neuromorphic cognitive systems from a learning and memory-centered perspective. It illustrates how to build a network of spiking neurons to perform spike-based information processing, computing, and high-level cognitive tasks. It is beneficial to a wide spectrum of readers, including undergraduate and postgraduate students and researchers who are interested in neuromorphic computing and neuromorphic engineering, as well as engineers and professionals in industry who are involved in the design and applications of neuromorphic cognitive systems, neuromorphic sensors and processors, and cognitive robotics. The book formulates a systematic framework, from the basic mathematical and computational methods of spike-based neural encoding and learning in both single- and multi-layered networks, up to a near-cognitive level composed of memory and cognition. Since the mechanisms by which spiking neurons integrate to produce cognitive functions as in the brain are little understood, studies of neuromorphic cognitive systems are urgently needed. The topics covered in this book range from the neuronal level to the system level. At the neuronal level, synaptic adaptation plays an important role in learning patterns. In order to perform higher-level cognitive functions such as recognition and memory, spiking neurons with learning abilities are consistently integrated, building a system with encoding, learning, and memory functionalities. The book describes these aspects in detail.

Improving Associative Memory in a Network of Spiking Neurons

Author: Russell I. Hunter
Publisher:
ISBN:
Category:
Languages: en
Pages:

Book Description
In this thesis we use computational neural network models to examine the dynamics and functionality of the CA3 region of the mammalian hippocampus. The emphasis of the project is to investigate how the dynamic control structures provided by inhibitory circuitry and cellular modification may affect the CA3 region during the recall of previously stored information. The CA3 region is commonly thought to work as a recurrent auto-associative neural network due to its neurophysiological characteristics, such as recurrent collaterals, strong and sparse synapses from external inputs, and plasticity between coactive cells. Associative memory models have been developed using various configurations of mathematical artificial neural networks, first developed over 40 years ago. Within these models we can store information via changes in the strength of connections between simplified two-state model neurons. These memories can be recalled when a cue (noisy or partial) is instantiated upon the net. The type of information they can store is quite limited due to restrictions caused by the simplicity of the hard-limiting nodes, which are commonly associated with a binary activation threshold. We build a much more biologically plausible model with complex spiking cell models and realistic synaptic properties between cells. This model is based upon some of the many details we now know of the neuronal circuitry of the CA3 region. We implemented the model in computer software using Neuron and Matlab and tested it by running simulations of storage and recall in the network. By building this model we gain new insights into how different types of neurons, and the complex circuits they form, actually work. The mammalian brain consists of complex resistive-capacitive electrical circuitry formed by the interconnection of large numbers of neurons. A principal cell type is the pyramidal cell within the cortex, which is the main information processor in our neural networks. Pyramidal cells are surrounded by diverse populations of interneurons, which are proportionally fewer in number than the pyramidal cells and form connections with pyramidal cells and other inhibitory cells. By building detailed computational models of recurrent neural circuitry we explore how these microcircuits of interneurons control the flow of information through pyramidal cells and regulate the efficacy of the network. We also explore the effect of cellular modification due to neuronal activity and the effect of incorporating spatially dependent connectivity on the network during recall of previously stored information. In particular we implement a spiking neural network proposed by Sommer and Wennekers (2001). We consider methods for improving associative memory recall, inspired by the work of Graham and Willshaw (1995), who applied mathematical transforms to an artificial neural network to improve the recall quality within the network. The networks tested contain either 100 or 1000 pyramidal cells with 10% connectivity applied, a partial cue instantiated, and global pseudo-inhibition. We investigate three methods. Firstly, applying localised disynaptic inhibition, which proportionalises the excitatory postsynaptic potentials and provides a fast-acting reversal potential; this should help to reduce the variability in signal propagation between cells and provide further inhibition to help synchronise the network activity. Secondly, implementing a persistent sodium channel in the cell body, which acts to non-linearise the activation threshold: beyond a given membrane potential, the amplitude of the excitatory postsynaptic potential (EPSP) is boosted, pushing cells which receive slightly more excitation (most likely high units) over the firing threshold. Finally, implementing spatial characteristics of the dendritic tree, which allows a greater probability of a modified synapse existing after 10% random connectivity has been applied throughout the network. We apply spatial characteristics by scaling the conductance weights of excitatory synapses to simulate the loss in potential in synapses found in the outer dendritic regions due to increased resistance. To further increase the biological plausibility of the network we remove the pseudo-inhibition and apply realistic basket cell models with differing configurations for a global inhibitory circuit. The networks are configured with: a single basket cell providing feedback inhibition; 10% basket cells providing feedback inhibition, where 10 pyramidal cells connect to each basket cell; and, finally, 100% basket cells providing feedback inhibition. These networks are compared and contrasted for efficacy of recall quality and effect on network behaviour. We have found promising results from applying biologically plausible recall strategies and network configurations, which suggests that the role of inhibition and cellular dynamics is pivotal in learning and memory.
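
The classical two-state associative memory that this description takes as its starting point can be sketched compactly. The following is a generic Willshaw-style auto-associative net with clipped Hebbian storage and recall from a partial cue; the sizes are illustrative, and it is not the NEURON/Matlab spiking model of the thesis.

```python
import numpy as np

rng = np.random.default_rng(2)
N, k, M = 1000, 50, 40              # units, active units per pattern, patterns

# Store M sparse binary patterns with clipped Hebbian learning:
# a weight is switched on whenever two units are coactive in a pattern.
patterns = np.zeros((M, N))
for m in range(M):
    patterns[m, rng.choice(N, size=k, replace=False)] = 1.0
W = np.zeros((N, N))
for x in patterns:
    W = np.maximum(W, np.outer(x, x))
np.fill_diagonal(W, 0.0)

# Recall pattern 0 from a partial cue containing half its active units.
cue = patterns[0].copy()
cue[np.flatnonzero(cue)[: k // 2]] = 0.0
drive = W @ cue
recalled = np.zeros(N)
recalled[np.argsort(drive)[-k:]] = 1.0   # k winners take all
overlap = recalled @ patterns[0] / k
print(f"overlap with stored pattern: {overlap:.2f}")
```

The winners-take-all thresholding here stands in for the inhibitory circuitry of the spiking network: it decides which cells fire, which is why the recall-quality methods of the thesis center on inhibition.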

Brain-Inspired Computing: From Neuroscience to Neuromorphic Electronics driving new forms of Artificial Intelligence

Author: Jonathan Mapelli
Publisher: Frontiers Media SA
ISBN: 2889746089
Category: Science
Languages: en
Pages: 139

Book Description


Brain-inspired Cognition and Understanding for Next-generation AI: Computational Models, Architectures and Learning Algorithms

Author: Chenwei Deng
Publisher: Frontiers Media SA
ISBN: 2832521169
Category: Science
Languages: en
Pages: 223

Book Description