WorldWideScience

Sample records for neural spike recording

  1. A Low Noise Amplifier for Neural Spike Recording Interfaces

    Directory of Open Access Journals (Sweden)

    Jesus Ruiz-Amaya

    2015-09-01

    Full Text Available This paper presents a Low Noise Amplifier (LNA) for neural spike recording applications. The proposed topology, based on a capacitive feedback network using a two-stage OTA, efficiently solves the triple trade-off between power, area and noise. Additionally, this work introduces a novel transistor-level synthesis methodology for LNAs tailored for the minimization of their noise efficiency factor under area and noise constraints. The proposed LNA has been implemented in a 130 nm CMOS technology and occupies 0.053 mm². Experimental results show that the LNA offers a noise efficiency factor of 2.16 and an input referred noise of 3.8 μVrms for a 1.2 V power supply. It provides a gain of 46 dB over a nominal bandwidth of 192 Hz–7.4 kHz and consumes 1.92 μW. The performance of the proposed LNA has been validated through in vivo experiments with animal models.

  2. A Low Noise Amplifier for Neural Spike Recording Interfaces.

    Science.gov (United States)

    Ruiz-Amaya, Jesus; Rodriguez-Perez, Alberto; Delgado-Restituto, Manuel

    2015-09-30

    This paper presents a Low Noise Amplifier (LNA) for neural spike recording applications. The proposed topology, based on a capacitive feedback network using a two-stage OTA, efficiently solves the triple trade-off between power, area and noise. Additionally, this work introduces a novel transistor-level synthesis methodology for LNAs tailored for the minimization of their noise efficiency factor under area and noise constraints. The proposed LNA has been implemented in a 130 nm CMOS technology and occupies 0.053 mm². Experimental results show that the LNA offers a noise efficiency factor of 2.16 and an input referred noise of 3.8 μVrms for a 1.2 V power supply. It provides a gain of 46 dB over a nominal bandwidth of 192 Hz-7.4 kHz and consumes 1.92 μW. The performance of the proposed LNA has been validated through in vivo experiments with animal models.

  3. Spike detection from noisy neural data in linear-probe recordings.

    Science.gov (United States)

    Takekawa, Takashi; Ota, Keisuke; Murayama, Masanori; Fukai, Tomoki

    2014-06-01

    Simultaneous recordings of multiple neuron activities with multi-channel extracellular electrodes are widely used for studying information processing by the brain's neural circuits. In this method, the recorded signals containing the spike events of a number of adjacent or distant neurons must be correctly sorted into spike trains of individual neurons, and a variety of methods have been proposed for this spike sorting. However, spike sorting is computationally difficult because the recorded signals are often contaminated by biological noise. Here, we propose a novel method for spike detection, which is the first stage of spike sorting and hence crucially determines overall sorting performance. Our method utilizes a model of extracellular recording data that takes into account variations in spike waveforms, such as the widths and amplitudes of spikes, by detecting the peaks of band-pass-filtered data. We show that the new method significantly improves the cost-performance of multi-channel electrode recordings by increasing the number of cleanly sorted neurons. © 2014 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
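
    The detection step sketched above (peak picking on band-pass-filtered data, with tolerance for variable spike widths and amplitudes) can be illustrated with a minimal SciPy sketch; the filter band, the robust noise estimate and the threshold multiple below are illustrative assumptions, not the authors' implementation.

    ```python
    # Minimal sketch of band-pass filtering + peak detection for spike candidates.
    # Assumes a 1-D extracellular trace sampled at `fs` Hz; thresholds and band
    # edges are illustrative, not the values used in the paper.
    import numpy as np
    from scipy.signal import butter, filtfilt, find_peaks

    def detect_spike_peaks(trace, fs, band=(300.0, 6000.0), k=4.0):
        """Return sample indices of candidate spike peaks."""
        b, a = butter(3, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
        filtered = filtfilt(b, a, trace)                 # zero-phase band-pass
        sigma = np.median(np.abs(filtered)) / 0.6745     # robust noise estimate
        # Negative-going peaks exceeding k * sigma; enforce a 1 ms refractory gap.
        peaks, _ = find_peaks(-filtered, height=k * sigma, distance=int(1e-3 * fs))
        return peaks, filtered

    # Example with synthetic noise:
    # fs = 20000.0
    # peaks, _ = detect_spike_peaks(np.random.randn(int(fs)), fs)
    ```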

  4. Simultaneous Measurement of Neural Spike Recordings and Multi-Photon Calcium Imaging in Neuroblastoma Cells

    Directory of Open Access Journals (Sweden)

    Jeehyun Kim

    2012-11-01

    Full Text Available This paper proposes the design and implementation of a micro-electrode array (MEA) for neuroblastoma cell culturing. It also explains the implementation of a multi-photon microscope (MPM) customized for neuroblastoma cell excitation and imaging under ambient light. Electrical signals and fluorescence images were simultaneously acquired from the neuroblastoma cells on the MEA. MPM calcium images of the cultured neuroblastoma cells on the MEA are presented, and the neural activity was acquired through the MEA recording. A calcium green-1 (CG-1) dextran conjugate with a molecular weight of 10,000 Da was used in this experiment for calcium imaging. This study also evaluated the calcium oscillations and neural spike recording of neuroblastoma cells in an epileptic condition. Based on our observation of neural spikes in neuroblastoma cells with our proposed imaging modality, we report that neuroblastoma cells can be an important model for epileptic activity studies.

  5. A 128-channel 6 mW wireless neural recording IC with spike feature extraction and UWB transmitter.

    Science.gov (United States)

    Chae, Moo Sung; Yang, Zhi; Yuce, Mehmet R; Hoang, Linh; Liu, Wentai

    2009-08-01

    This paper reports a 128-channel neural recording integrated circuit (IC) with on-the-fly spike feature extraction and wireless telemetry. The chip consists of eight 16-channel front-end recording blocks, a spike detection and feature extraction digital signal processor (DSP), an ultra wideband (UWB) transmitter, and on-chip bias generators. Each recording channel has amplifiers with programmable gain and bandwidth to accommodate different types of biological signals. An analog-to-digital converter (ADC) shared by 16 amplifiers through time-multiplexing results in a balanced trade-off between power consumption and chip area. A nonlinear energy operator (NEO) based spike detector is implemented for identifying spikes, which are further processed by a digital frequency-shaping filter. The computationally efficient spike detection and feature extraction algorithms enable a compact on-chip DSP implementation. UWB telemetry is designed to wirelessly transfer raw data from 128 recording channels at a data rate of 90 Mbit/s. The chip is realized in a 0.35 μm complementary metal-oxide-semiconductor (CMOS) process with an area of 8.8 × 7.2 mm² and consumes 6 mW by employing a sequential turn-on architecture that selectively powers off idle analog circuit blocks. The chip has been tested for electrical specifications and verified in an ex vivo biological environment.
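
    The nonlinear energy operator (NEO) used for spike detection on this chip has a simple software analogue; the following sketch applies ψ[n] = x[n]² − x[n−1]·x[n+1] to a sampled trace, with a smoothing window and threshold rule that are illustrative choices rather than the chip's logic.

    ```python
    # Sketch of an NEO-based spike detector: psi[n] = x[n]^2 - x[n-1]*x[n+1].
    # The threshold (a multiple of the mean NEO value) is an illustrative choice.
    import numpy as np

    def neo(x):
        psi = np.zeros(len(x), dtype=float)
        psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
        return psi

    def neo_detect(x, fs, c=8.0, win_ms=1.0):
        psi = neo(np.asarray(x, dtype=float))
        # Optional smoothing with a short Bartlett window, as is common for NEO detectors.
        w = np.bartlett(max(3, int(win_ms * 1e-3 * fs)))
        psi = np.convolve(psi, w / w.sum(), mode="same")
        threshold = c * psi.mean()
        return np.flatnonzero(psi > threshold), psi
    ```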

  6. Closed-loop optical neural stimulation based on a 32-channel low-noise recording system with online spike sorting

    Science.gov (United States)

    Nguyen, T. K. T.; Navratilova, Z.; Cabral, H.; Wang, L.; Gielen, G.; Battaglia, F. P.; Bartic, C.

    2014-08-01

    Objective. Closed-loop operation of neuro-electronic systems is desirable for both scientific and clinical (neuroprosthesis) applications. Integrating optical stimulation with recording capability further enhances the selectivity of neural stimulation. We have developed a system enabling the local delivery of optical stimuli and the simultaneous electrical measuring of the neural activities in a closed-loop approach. Approach. The signal analysis is performed online through the implementation of a template matching algorithm. The system performance is demonstrated with the recorded data and in awake rats. Main results. Specifically, the neural activities are simultaneously recorded, detected, classified online (through spike sorting) from 32 channels, and used to trigger a light emitting diode light source using generated TTL signals. Significance. A total processing time of 8 ms is achieved, suitable for optogenetic studies of brain mechanisms online.

  7. Training Deep Spiking Neural Networks Using Backpropagation.

    Science.gov (United States)

    Lee, Jun Haeng; Delbruck, Tobi; Pfeiffer, Michael

    2016-01-01

    Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation. However, training such networks is difficult due to the non-differentiable nature of spike events. In this paper, we introduce a novel technique, which treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks, but works directly on spike signals and membrane potentials. Compared with previous methods relying on indirect training and conversion, our technique has the potential to capture the statistics of spikes more precisely. We evaluate the proposed framework on artificially generated events from the original MNIST handwritten digit benchmark, and also on the N-MNIST benchmark recorded with an event-based dynamic vision sensor, in which the proposed method reduces the error rate by a factor of more than three compared to the best previous SNN, and also achieves a higher accuracy than a conventional convolutional neural network (CNN) trained and tested on the same data. We demonstrate in the context of the MNIST task that thanks to their event-driven operation, deep SNNs (both fully connected and convolutional) trained with our method achieve accuracy equivalent with conventional neural networks. In the N-MNIST example, equivalent accuracy is achieved with about five times fewer computational operations.
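
    Treating membrane potentials as differentiable signals and spike-time discontinuities as noise is closely related to surrogate-gradient training; the PyTorch sketch below illustrates that general idea with a hard-threshold forward pass and a smooth backward pass, and is not the authors' exact formulation.

    ```python
    # Sketch of a spiking nonlinearity whose backward pass uses a smooth surrogate,
    # so that errors can be backpropagated through spiking units. This illustrates
    # the general surrogate-gradient idea, not the paper's exact derivation.
    import torch

    class SurrogateSpike(torch.autograd.Function):
        @staticmethod
        def forward(ctx, v_minus_th):
            ctx.save_for_backward(v_minus_th)
            return (v_minus_th > 0).float()          # hard threshold in the forward pass

        @staticmethod
        def backward(ctx, grad_output):
            (v_minus_th,) = ctx.saved_tensors
            # Smooth pseudo-derivative of the step function (fast-sigmoid style).
            surrogate = 1.0 / (1.0 + 10.0 * v_minus_th.abs()) ** 2
            return grad_output * surrogate

    spike = SurrogateSpike.apply
    # Usage inside a network: s = spike(membrane_potential - threshold)
    ```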

  8. A Real-Time Spike Classification Method Based on Dynamic Time Warping for Extracellular Enteric Neural Recording with Large Waveform Variability

    Science.gov (United States)

    Cao, Yingqiu; Rakhilin, Nikolai; Gordon, Philip H.; Shen, Xiling; Kan, Edwin C.

    2015-01-01

    Background: Computationally efficient spike recognition methods are required for real-time analysis of extracellular neural recordings. The enteric nervous system (ENS) is important to human health but less well understood, with few appropriate spike recognition algorithms due to large waveform variability. New Method: Here we present a method based on dynamic time warping (DTW) with high tolerance to variability in time and magnitude. Adaptive temporal gridding for fastDTW in the similarity calculation significantly reduces the computational cost. The automated threshold selection allows for real-time classification of extracellular recordings. Results: Our method is first evaluated on synthesized data at different noise levels, improving both classification accuracy and computational complexity over the conventional cross-correlation based template-matching method (CCTM) and PCA + k-means clustering without time warping. Our method is then applied to analyze mouse enteric neural recordings with mechanical and chemical stimuli. Successful classification of biphasic and monophasic spikes is achieved even when the spike variability is larger than a millisecond in width and a millivolt in magnitude. Comparison with Existing Method(s): In comparison with conventional template matching and clustering methods, the fastDTW method is computationally efficient with high tolerance to waveform variability. Conclusions: We have developed an adaptive fastDTW algorithm for real-time spike classification of ENS recordings with large waveform variability against colonic motility, ambient changes and cellular heterogeneity. PMID:26719239
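
    The core of a DTW-based template matcher can be sketched in a few lines; this uses a plain O(NM) DTW rather than the adaptive-grid fastDTW and automated thresholding of the paper, and the function names are illustrative.

    ```python
    # Sketch of template matching with dynamic time warping (DTW).
    # A snippet is assigned to the template with the smallest warped distance.
    import numpy as np

    def dtw_distance(a, b):
        n, m = len(a), len(b)
        D = np.full((n + 1, m + 1), np.inf)
        D[0, 0] = 0.0
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                cost = abs(a[i - 1] - b[j - 1])
                D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
        return D[n, m]

    def classify_spike(snippet, templates):
        """Return the index of the best-matching template and all distances."""
        dists = [dtw_distance(snippet, t) for t in templates]
        return int(np.argmin(dists)), dists
    ```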

  9. An overview of Bayesian methods for neural spike train analysis.

    Science.gov (United States)

    Chen, Zhe

    2013-01-01

    Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed.

  10. An Overview of Bayesian Methods for Neural Spike Train Analysis

    Directory of Open Access Journals (Sweden)

    Zhe Chen

    2013-01-01

    Full Text Available Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed.

  11. High-density microelectrode array recordings and real-time spike sorting for closed-loop experiments: an emerging technology to study neural plasticity.

    Science.gov (United States)

    Franke, Felix; Jäckel, David; Dragas, Jelena; Müller, Jan; Radivojevic, Milos; Bakkum, Douglas; Hierlemann, Andreas

    2012-01-01

    Understanding plasticity of neural networks is a key to comprehending their development and function. A powerful technique to study neural plasticity includes recording and control of pre- and post-synaptic neural activity, e.g., by using simultaneous intracellular recording and stimulation of several neurons. Intracellular recording is, however, a demanding technique and has its limitations in that only a small number of neurons can be stimulated and recorded from at the same time. Extracellular techniques offer the possibility to simultaneously record from larger numbers of neurons with relative ease, at the expense of increased effort to sort out single neuronal activities from the recorded mixture, which is a time-consuming and error-prone step referred to as spike sorting. In this mini-review, we describe recent technological developments in two separate fields, namely CMOS-based high-density microelectrode arrays, which also allow for extracellular stimulation of neurons, and real-time spike sorting. We argue that these techniques, when combined, will provide a powerful tool to study plasticity in neural networks consisting of several thousand neurons in vitro.

  12. Inferring oscillatory modulation in neural spike trains.

    Directory of Open Access Journals (Sweden)

    Kensuke Arai

    2017-10-01

    Full Text Available Oscillations are observed at various frequency bands in continuous-valued neural recordings like the electroencephalogram (EEG) and local field potential (LFP) in bulk brain matter, and analysis of spike-field coherence reveals that spiking of single neurons often occurs at certain phases of the global oscillation. Oscillatory modulation has been examined in relation to continuous-valued oscillatory signals, and independently from the spike train alone, but behavior- or stimulus-triggered firing-rate modulation, spiking sparseness, presence of slow modulation not locked to stimuli and irregular oscillations with large variability in oscillatory periods present challenges to searching for temporal structures present in the spike train. In order to study oscillatory modulation in real data collected under a variety of experimental conditions, we describe a flexible point-process framework we call the Latent Oscillatory Spike Train (LOST) model to decompose the instantaneous firing rate into biologically and behaviorally relevant factors: spiking refractoriness, event-locked firing rate non-stationarity, and trial-to-trial variability accounted for by baseline offset and a stochastic oscillatory modulation. We also extend the LOST model to accommodate changes in the modulatory structure over the duration of the experiment, and thereby discover trial-to-trial variability in the spike-field coherence of a rat primary motor cortical neuron to the LFP theta rhythm. Because LOST incorporates a latent stochastic auto-regressive term, LOST is able to detect oscillations when the firing rate is low, the modulation is weak, and when the modulating oscillation has a broad spectral peak.

  13. Inferring oscillatory modulation in neural spike trains.

    Science.gov (United States)

    Arai, Kensuke; Kass, Robert E

    2017-10-01

    Oscillations are observed at various frequency bands in continuous-valued neural recordings like the electroencephalogram (EEG) and local field potential (LFP) in bulk brain matter, and analysis of spike-field coherence reveals that spiking of single neurons often occurs at certain phases of the global oscillation. Oscillatory modulation has been examined in relation to continuous-valued oscillatory signals, and independently from the spike train alone, but behavior- or stimulus-triggered firing-rate modulation, spiking sparseness, presence of slow modulation not locked to stimuli and irregular oscillations with large variability in oscillatory periods present challenges to searching for temporal structures present in the spike train. In order to study oscillatory modulation in real data collected under a variety of experimental conditions, we describe a flexible point-process framework we call the Latent Oscillatory Spike Train (LOST) model to decompose the instantaneous firing rate into biologically and behaviorally relevant factors: spiking refractoriness, event-locked firing rate non-stationarity, and trial-to-trial variability accounted for by baseline offset and a stochastic oscillatory modulation. We also extend the LOST model to accommodate changes in the modulatory structure over the duration of the experiment, and thereby discover trial-to-trial variability in the spike-field coherence of a rat primary motor cortical neuron to the LFP theta rhythm. Because LOST incorporates a latent stochastic auto-regressive term, LOST is able to detect oscillations when the firing rate is low, the modulation is weak, and when the modulating oscillation has a broad spectral peak.

  14. A Multi-Channel Low-Power System-on-Chip for in Vivo Recording and Wireless Transmission of Neural Spikes

    Directory of Open Access Journals (Sweden)

    Alessandro Sottocornola Spinelli

    2012-09-01

    Full Text Available This paper reports a multi-channel neural spike recording system-on-chip with digital data compression and wireless telemetry. The circuit consists of 16 amplifiers, an analog time-division multiplexer, a single 8 bit analog-to-digital converter, a digital signal compression unit and a wireless transmitter. Although only 16 amplifiers are integrated in our current die version, the whole system is designed to work with 64, demonstrating the feasibility of digital processing and narrowband wireless transmission of 64 neural recording channels. Compression of the raw data is achieved by detecting the action potentials (APs) and storing 20 samples for each spike waveform. This compression method retains sufficiently high data quality to allow for single neuron identification (spike sorting). The 400 MHz transmitter employs a Manchester-Coded Frequency Shift Keying (MC-FSK) modulator with low modulation index. In this way, a 1.25 Mbit/s data rate is delivered within a limited band of about 3 MHz. The chip is realized in a 0.35 μm AMS CMOS process featuring a 3 V power supply with an area of 3.1 × 2.7 mm². The achieved transmission range is over 10 m with an overall power consumption for 64 channels of 17.2 mW. This figure translates into a power budget of 269 μW per channel, in line with published results but allowing a larger transmission distance and more efficient bandwidth occupation of the wireless link. The integrated circuit was mounted on a small and light board to be used during neuroscience experiments with freely-behaving rats. Powered by 2 AAA batteries, the system can continuously work for more than 100 hours allowing for long-lasting neural spike recordings.
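
    The compression idea described above (detect an action potential, keep a short window of samples) can be sketched as follows; the threshold rule, the 5-sample pre-trigger alignment and the function name are illustrative assumptions, not the chip's exact logic.

    ```python
    # Sketch of the "detect and keep 20 samples per spike" compression idea.
    # Threshold rule and window alignment are illustrative assumptions.
    import numpy as np

    def compress_channel(samples, threshold, n_keep=20, pre=5):
        packets = []          # (spike_time, 20-sample waveform) pairs
        i = pre
        while i < len(samples) - (n_keep - pre):
            if samples[i] > threshold:
                packets.append((i, np.array(samples[i - pre:i - pre + n_keep])))
                i += n_keep   # skip past the stored window
            else:
                i += 1
        return packets
    ```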

  15. Epileptiform spike detection via convolutional neural networks

    DEFF Research Database (Denmark)

    Johansen, Alexander Rosenberg; Jin, Jing; Maszczyk, Tomasz

    2016-01-01

    The EEG of epileptic patients often contains sharp waveforms called "spikes", occurring between seizures. Detecting such spikes is crucial for diagnosing epilepsy. In this paper, we develop a convolutional neural network (CNN) for detecting spikes in EEG of epileptic patients in an automated...

  16. Robust spike classification based on frequency domain neural waveform features.

    Science.gov (United States)

    Yang, Chenhui; Yuan, Yuan; Si, Jennie

    2013-12-01

    We introduce a new spike classification algorithm based on frequency domain features of the spike snippets. The goal for the algorithm is to provide high classification accuracy, low false misclassification, ease of implementation, robustness to signal degradation, and objectivity in classification outcomes. In this paper, we propose a spike classification algorithm based on frequency domain features (CFDF). It makes use of frequency domain contents of the recorded neural waveforms for spike classification. The self-organizing map (SOM) is used as a tool to determine the cluster number intuitively and directly by viewing the SOM output map. After that, spike classification can be easily performed using clustering algorithms such as the k-Means. In conjunction with our previously developed multiscale correlation of wavelet coefficient (MCWC) spike detection algorithm, we show that the MCWC and CFDF detection and classification system is robust when tested on several sets of artificial and real neural waveforms. The CFDF is comparable to or outperforms some popular automatic spike classification algorithms with artificial and real neural data. The detection and classification of neural action potentials or neural spikes is an important step in single-unit-based neuroscientific studies and applications. After the detection of neural snippets potentially containing neural spikes, a robust classification algorithm is applied for the analysis of the snippets to (1) extract similar waveforms into one class for them to be considered coming from one unit, and to (2) remove noise snippets if they do not contain any features of an action potential. Usually, a snippet is a small 2 or 3 ms segment of the recorded waveform, and differences in neural action potentials can be subtle from one unit to another. Therefore, a robust, high performance classification system like the CFDF is necessary. In addition, the proposed algorithm does not require any assumptions on statistical
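
    A minimal sketch of frequency-domain spike classification is shown below: FFT magnitudes of each snippet are used as features and clustered with k-means. The SOM step that the paper uses to choose the cluster count is omitted, and scikit-learn is assumed for clustering; this is an illustration of the general idea, not the CFDF implementation.

    ```python
    # Sketch of frequency-domain spike classification: FFT magnitudes as features,
    # followed by k-means with a user-supplied cluster count.
    import numpy as np
    from sklearn.cluster import KMeans

    def classify_snippets_fft(snippets, n_clusters):
        """snippets: array of shape (n_spikes, n_samples)."""
        feats = np.abs(np.fft.rfft(snippets, axis=1))          # magnitude spectra
        feats /= np.linalg.norm(feats, axis=1, keepdims=True)  # amplitude normalization
        labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(feats)
        return labels, feats
    ```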

  17. Vectorized algorithms for spiking neural network simulation.

    Science.gov (United States)

    Brette, Romain; Goodman, Dan F M

    2011-06-01

    High-level languages (Matlab, Python) are popular in neuroscience because they are flexible and accelerate development. However, for simulating spiking neural networks, the cost of interpretation is a bottleneck. We describe a set of algorithms to simulate large spiking neural networks efficiently with high-level languages using vector-based operations. These algorithms constitute the core of Brian, a spiking neural network simulator written in the Python language. Vectorized simulation makes it possible to combine the flexibility of high-level languages with the computational efficiency usually associated with compiled languages.
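
    The vectorization idea, updating the state of all neurons with array operations rather than per-neuron loops, can be sketched in NumPy as below; the LIF parameters and random connectivity are illustrative, and this is not Brian's internal code.

    ```python
    # Sketch of a vectorized leaky integrate-and-fire (LIF) network update:
    # state update, threshold test and spike propagation are all array operations.
    import numpy as np

    N, dt, T = 1000, 1e-3, 1.0                 # neurons, time step (s), duration (s)
    tau, v_th, v_reset = 20e-3, 1.0, 0.0
    W = 0.02 * (np.random.rand(N, N) < 0.02)   # sparse random weights
    v = np.random.rand(N)
    spike_counts = np.zeros(N)

    for _ in range(int(T / dt)):
        I_ext = 1.5 * np.random.rand(N)        # external drive
        v += dt / tau * (-v + I_ext)           # vectorized state update
        fired = v >= v_th                      # threshold test for all neurons at once
        v[fired] = v_reset
        v += W[:, fired].sum(axis=1)           # propagate all spikes in one operation
        spike_counts += fired
    ```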

  18. Spiking Neural Networks based on OxRAM Synapses for Real-time Unsupervised Spike Sorting

    Directory of Open Access Journals (Sweden)

    Thilo Werner

    2016-11-01

    Full Text Available In this paper, we present an alternative approach to perform spike sorting of complex brain signals based on spiking neural networks (SNN). The proposed architecture is suitable for hardware implementation by using RRAM technology for the implementation of synapses whose low latency (< 1 μs) enables real-time spike sorting. This offers promising advantages over conventional spike sorting techniques for brain-computer interface and neural prosthesis applications. Moreover, the ultralow power consumption of the RRAM synapses of the spiking neural network (nW range) may enable the design of autonomous implantable devices for rehabilitation purposes. We demonstrate an original methodology to use Oxide-based RRAM (OxRAM) as easy to program and low power (< 75 pJ) synapses. Synaptic weights are modulated through the application of an online learning strategy inspired by biological Spike Timing Dependent Plasticity. Real spiking data have been recorded both intra- and extracellularly from an in-vitro preparation of the Crayfish sensory-motor system and used for validation of the proposed OxRAM based SNN. This artificial SNN is able to identify, learn, recognize and distinguish between different spike shapes in the input signal with a recognition rate of about 90% without any supervision.
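
    A pair-based STDP rule of the kind that inspires the online learning strategy above can be sketched as follows; the time constant, amplitudes and bounds are illustrative software parameters, not OxRAM programming conditions.

    ```python
    # Sketch of a pair-based spike-timing-dependent plasticity (STDP) update.
    # Pre-before-post potentiates the weight; post-before-pre depresses it.
    import numpy as np

    def stdp_update(w, dt_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0,
                    w_min=0.0, w_max=1.0):
        """dt_ms = t_post - t_pre, in milliseconds."""
        if dt_ms > 0:
            w += a_plus * np.exp(-dt_ms / tau_ms)
        else:
            w -= a_minus * np.exp(dt_ms / tau_ms)
        return float(np.clip(w, w_min, w_max))

    # Example: pre fires 5 ms before post -> the weight increases.
    # w_new = stdp_update(0.5, dt_ms=5.0)
    ```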

  19. Spiking neural P systems with multiple channels.

    Science.gov (United States)

    Peng, Hong; Yang, Jinyu; Wang, Jun; Wang, Tao; Sun, Zhang; Song, Xiaoxiao; Luo, Xiaohui; Huang, Xiangnian

    2017-11-01

    Spiking neural P systems (SNP systems, in short) are a class of distributed parallel computing systems inspired from the neurophysiological behavior of biological spiking neurons. In this paper, we investigate a new variant of SNP systems in which each neuron has one or more synaptic channels, called spiking neural P systems with multiple channels (SNP-MC systems, in short). The spiking rules with channel label are introduced to handle the firing mechanism of neurons, where the channel labels indicate synaptic channels of transmitting the generated spikes. The computation power of SNP-MC systems is investigated. Specifically, we prove that SNP-MC systems are Turing universal as both number generating and number accepting devices. Copyright © 2017 Elsevier Ltd. All rights reserved.

  20. Implementing Signature Neural Networks with Spiking Neurons.

    Science.gov (United States)

    Carrillo-Medina, José Luis; Latorre, Roberto

    2016-01-01

    Spiking Neural Networks constitute the most promising approach to develop realistic Artificial Neural Networks (ANNs). Unlike traditional firing rate-based paradigms, information coding in spiking models is based on the precise timing of individual spikes. It has been demonstrated that spiking ANNs can be successfully and efficiently applied to multiple realistic problems solvable with traditional strategies (e.g., data classification or pattern recognition). In recent years, major breakthroughs in neuroscience research have discovered new relevant computational principles in different living neural systems. Could ANNs benefit from some of these recent findings providing novel elements of inspiration? This is an intriguing question for the research community and the development of spiking ANNs including novel bio-inspired information coding and processing strategies is gaining attention. From this perspective, in this work, we adapt the core concepts of the recently proposed Signature Neural Network paradigm-i.e., neural signatures to identify each unit in the network, local information contextualization during the processing, and multicoding strategies for information propagation regarding the origin and the content of the data-to be employed in a spiking neural network. To the best of our knowledge, none of these mechanisms have been used yet in the context of ANNs of spiking neurons. This paper provides a proof-of-concept for their applicability in such networks. Computer simulations show that a simple network model like the one discussed here exhibits complex self-organizing properties. The combination of multiple simultaneous encoding schemes allows the network to generate coexisting spatio-temporal patterns of activity encoding information in different spatio-temporal spaces. As a function of the network and/or intra-unit parameters shaping the corresponding encoding modality, different forms of competition among the evoked patterns can emerge even in the absence

  1. Implementing Signature Neural Networks with Spiking Neurons

    Science.gov (United States)

    Carrillo-Medina, José Luis; Latorre, Roberto

    2016-01-01

    Spiking Neural Networks constitute the most promising approach to develop realistic Artificial Neural Networks (ANNs). Unlike traditional firing rate-based paradigms, information coding in spiking models is based on the precise timing of individual spikes. It has been demonstrated that spiking ANNs can be successfully and efficiently applied to multiple realistic problems solvable with traditional strategies (e.g., data classification or pattern recognition). In recent years, major breakthroughs in neuroscience research have discovered new relevant computational principles in different living neural systems. Could ANNs benefit from some of these recent findings providing novel elements of inspiration? This is an intriguing question for the research community and the development of spiking ANNs including novel bio-inspired information coding and processing strategies is gaining attention. From this perspective, in this work, we adapt the core concepts of the recently proposed Signature Neural Network paradigm—i.e., neural signatures to identify each unit in the network, local information contextualization during the processing, and multicoding strategies for information propagation regarding the origin and the content of the data—to be employed in a spiking neural network. To the best of our knowledge, none of these mechanisms have been used yet in the context of ANNs of spiking neurons. This paper provides a proof-of-concept for their applicability in such networks. Computer simulations show that a simple network model like the one discussed here exhibits complex self-organizing properties. The combination of multiple simultaneous encoding schemes allows the network to generate coexisting spatio-temporal patterns of activity encoding information in different spatio-temporal spaces. As a function of the network and/or intra-unit parameters shaping the corresponding encoding modality, different forms of competition among the evoked patterns can emerge even in the

  2. iSpike: a spiking neural interface for the iCub robot.

    Science.gov (United States)

    Gamez, D; Fidjeland, A K; Lazdins, E

    2012-06-01

    This paper presents iSpike: a C++ library that interfaces between spiking neural network simulators and the iCub humanoid robot. It uses a biologically inspired approach to convert the robot's sensory information into spikes that are passed to the neural network simulator, and it decodes output spikes from the network into motor signals that are sent to control the robot. Applications of iSpike range from embodied models of the brain to the development of intelligent robots using biologically inspired spiking neural networks. iSpike is an open source library that is available for free download under the terms of the GPL.

  3. Spike Neural Models Part II: Abstract Neural Models

    OpenAIRE

    Johnson, Melissa G.; Chartier, Sylvain

    2018-01-01

    Neurons are complex cells that require a lot of time and resources to model completely. In spiking neural networks (SNN) though, not all that complexity is required. Therefore simple, abstract models are often used. These models save time, use fewer computer resources, and are easier to understand. This tutorial presents two such models: Izhikevich's model, which is biologically realistic in the resulting spike trains but not in the parameters, and the Leaky Integrate and Fire (LIF) model whic...
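
    Both abstract models named above can be simulated in a few lines; the sketch below uses Izhikevich's regular-spiking parameters and a generic LIF neuron, with illustrative input currents and step sizes.

    ```python
    # Sketch of the two abstract neuron models named above: Izhikevich
    # (regular-spiking parameters) and leaky integrate-and-fire (LIF).
    # Units follow Izhikevich's original formulation (mV, ms).
    import numpy as np

    def izhikevich(I, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
        v, u, spikes = -65.0, b * -65.0, []
        for t in range(len(I)):
            v += dt * (0.04 * v * v + 5 * v + 140 - u + I[t])
            u += dt * a * (b * v - u)
            if v >= 30.0:                       # spike cutoff and reset
                spikes.append(t * dt)
                v, u = c, u + d
        return spikes

    def lif(I, dt=0.5, tau=20.0, v_rest=-65.0, v_th=-50.0, v_reset=-65.0, R=10.0):
        v, spikes = v_rest, []
        for t in range(len(I)):
            v += dt / tau * (-(v - v_rest) + R * I[t])
            if v >= v_th:
                spikes.append(t * dt)
                v = v_reset
        return spikes

    # Example: constant input currents (arbitrary units) for 1000 steps.
    # print(izhikevich(np.full(1000, 10.0)), lif(np.full(1000, 2.0)))
    ```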

  4. Spiking Neural P Systems with Communication on Request.

    Science.gov (United States)

    Pan, Linqiang; Păun, Gheorghe; Zhang, Gexiang; Neri, Ferrante

    2017-12-01

    Spiking Neural P Systems are Neural System models characterized by the fact that each neuron mimics a biological cell and the communication between neurons is based on spikes. In the Spiking Neural P systems investigated so far, the application of evolution rules depends on the contents of a neuron (checked by means of a regular expression). In these P systems, a specified number of spikes are consumed and a specified number of spikes are produced, and then sent to each of the neurons linked by a synapse to the evolving neuron. In the present work, a novel communication strategy among neurons of Spiking Neural P Systems is proposed. In the resulting models, called Spiking Neural P Systems with Communication on Request, the spikes are requested from neighboring neurons, depending on the contents of the neuron (still checked by means of a regular expression). Unlike the traditional Spiking Neural P systems, no spikes are consumed or created: the spikes are only moved along synapses and replicated (when two or more neurons request the contents of the same neuron). The Spiking Neural P Systems with Communication on Request are proved to be computationally universal, that is, equivalent with Turing machines as long as two types of spikes are used. Following this work, further research questions are listed to be open problems.

  5. Spiking modular neural networks: A neural network modeling approach for hydrological processes

    National Research Council Canada - National Science Library

    Kamban Parasuraman; Amin Elshorbagy; Sean K. Carey

    2006-01-01

    .... In this study, a novel neural network model called the spiking modular neural networks (SMNNs) is proposed. An SMNN consists of an input layer, a spiking layer, and an associator neural network layer...

  6. Phase Diagram of Spiking Neural Networks

    Directory of Open Access Journals (Sweden)

    Hamed Seyed-Allaei

    2015-03-01

    Full Text Available In computer simulations of spiking neural networks, it is often assumed that every two neurons of the network are connected with a probability of 2%, that 20% of neurons are inhibitory and 80% are excitatory. These common values are based on experiments and observations, but here I take a different perspective, inspired by evolution. I simulate many networks, each with a different set of parameters, and then I try to figure out what makes the common values desirable by nature. Networks which are configured according to the common values have the best dynamic range in response to an impulse, and their dynamic range is more robust with respect to synaptic weights. In fact, evolution has favored networks of best dynamic range. I present a phase diagram that shows the dynamic ranges of different networks with different parameters. This phase diagram gives an insight into the space of parameters: excitatory to inhibitory ratio, sparseness of connections and synaptic weights. It may serve as a guideline to decide about the values of parameters in a simulation of a spiking neural network.

  7. Automatic fitting of spiking neuron models to electrophysiological recordings

    Directory of Open Access Journals (Sweden)

    Cyrille Rossant

    2010-03-01

    Full Text Available Spiking models can accurately predict the spike trains produced by cortical neurons in response to somatically injected currents. Since the specific characteristics of the model depend on the neuron, a computational method is required to fit models to electrophysiological recordings. The fitting procedure can be very time consuming both in terms of computer simulations and in terms of code writing. We present algorithms to fit spiking models to electrophysiological data (time-varying input and spike trains) that can run in parallel on graphics processing units (GPUs). The model fitting library is interfaced with Brian, a neural network simulator in Python. If a GPU is present it uses just-in-time compilation to translate model equations into optimized code. Arbitrary models can then be defined at script level and run on the graphics card. This tool can be used to obtain empirically validated spiking models of neurons in various systems. We demonstrate its use on public data from the INCF Quantitative Single-Neuron Modeling 2009 competition by comparing the performance of a number of neuron spiking models.

  8. Neural cache: a low-power online digital spike-sorting architecture.

    Science.gov (United States)

    Peng, Chung-Ching; Sabharwal, Pawan; Bashirullah, Rizwan

    2008-01-01

    Transmitting large amounts of data sensed by multi-electrode array devices is widely considered to be a challenging problem in designing implantable neural recording systems. Spike sorting is an important step in reducing the data bandwidth before wireless data transmission. The feasibility of spike sorting algorithms in scaled CMOS technologies, which typically operate on low frequency neural spikes, is determined largely by their power consumption, a dominant portion of which is leakage power. Our goal is to explore energy-saving architectures for online spike sorting without sacrificing performance. In the face of non-stationary neural data, training is not always a feasible option. We present a low-power architecture for real-time online spike sorting that does not require any training period and has the capability to quickly respond to changing spike shapes.

  9. A supervised multi-spike learning algorithm based on gradient descent for spiking neural networks.

    Science.gov (United States)

    Xu, Yan; Zeng, Xiaoqin; Han, Lixin; Yang, Jing

    2013-07-01

    We use a supervised multi-spike learning algorithm for spiking neural networks (SNNs) with temporal encoding to simulate the learning mechanism of biological neurons in which the SNN output spike trains are encoded by firing times. We first analyze why existing gradient-descent-based learning methods for SNNs have difficulty in achieving multi-spike learning. We then propose a new multi-spike learning method for SNNs based on gradient descent that solves the problems of error function construction and interference among multiple output spikes during learning. The method could be widely applied to single spiking neurons to learn desired output spike trains and to multilayer SNNs to solve classification problems. By overcoming learning interference among multiple spikes, our method has high learning accuracy when there are a relatively large number of output spikes in need of learning. We also develop an output encoding strategy with respect to multiple spikes for classification problems. This effectively improves the classification accuracy of multi-spike learning compared to that of single-spike learning. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Spatio-temporal conditional inference and hypothesis tests for neural ensemble spiking precision

    Science.gov (United States)

    Harrison, Matthew T.; Amarasingham, Asohan; Truccolo, Wilson

    2014-01-01

    The collective dynamics of neural ensembles create complex spike patterns with many spatial and temporal scales. Understanding the statistical structure of these patterns can help resolve fundamental questions about neural computation and neural dynamics. Spatio-temporal conditional inference (STCI) is introduced here as a semiparametric statistical framework for investigating the nature of precise spiking patterns from collections of neurons that is robust to arbitrarily complex and nonstationary coarse spiking dynamics. The main idea is to focus statistical modeling and inference, not on the full distribution of the data, but rather on families of conditional distributions of precise spiking given different types of coarse spiking. The framework is then used to develop families of hypothesis tests for probing the spatio-temporal precision of spiking patterns. Relationships among different conditional distributions are used to improve multiple hypothesis testing adjustments and to design novel Monte Carlo spike resampling algorithms. Of special note are algorithms that can locally jitter spike times while still preserving the instantaneous peri-stimulus time histogram (PSTH) or the instantaneous total spike count from a group of recorded neurons. The framework can also be used to test whether first-order maximum entropy models with possibly random and time-varying parameters can account for observed patterns of spiking. STCI provides a detailed example of the generic principle of conditional inference, which may be applicable in other areas of neurostatistical analysis. PMID:25380339
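
    A minimal version of the jitter idea is sketched below: spike times are re-drawn uniformly within fixed, non-overlapping windows, which preserves each spike count per window (and hence the PSTH at that window resolution). The paper's algorithms, which preserve the instantaneous PSTH or the instantaneous ensemble spike count, are more involved; this only illustrates the resampling principle, and the function name is an assumption.

    ```python
    # Sketch of window jitter: each spike is re-drawn uniformly within its own
    # fixed window, so per-window spike counts (and the coarse PSTH) are preserved.
    import numpy as np

    def window_jitter(spike_times, window, rng=None):
        rng = np.random.default_rng() if rng is None else rng
        spike_times = np.asarray(spike_times, dtype=float)
        bins = np.floor(spike_times / window)                  # window index per spike
        return np.sort(bins * window + rng.uniform(0.0, window, size=len(spike_times)))

    # Example: jitter spike times (in seconds) within 25 ms windows.
    # surrogate = window_jitter([0.012, 0.018, 0.140, 0.141], window=0.025)
    ```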

  11. Sparse Data Analysis Strategy for Neural Spike Classification

    Directory of Open Access Journals (Sweden)

    Vincent Vigneron

    2014-01-01

    Full Text Available Many multichannel extracellular recordings of neural activity are analyzed by attempting to sort spikes on the basis of shared characteristics with some feature detection techniques, so that spikes can be sorted into distinct clusters. There are in general two main statistical issues: firstly, spike sorting can result in well-sorted units, but by no means can one be sure that one is dealing with single units, due to the number of neurons adjacent to the recording electrode. Secondly, the waveform dimensionality is reduced to a small subset of discriminating features. This dimensionality reduction was introduced as an aid to visualization and manual clustering, but also to reduce the computational complexity of automatic classification. We introduce a metric based on common neighbourhood to introduce sparsity in the dataset and separate data into more homogeneous subgroups. The approach is particularly well suited for clustering when the individual clusters are elongated (that is, nonspherical). In addition, it does not need to select the number of clusters, it is very efficient for visualizing clusters in a dataset, it is robust to noise, it can handle imbalanced data, and it is fully automatic and deterministic.

  12. Phase diagram of spiking neural networks.

    Science.gov (United States)

    Seyed-Allaei, Hamed

    2015-01-01

    In computer simulations of spiking neural networks, it is often assumed that every two neurons of the network are connected with a probability of 2%, that 20% of neurons are inhibitory and 80% are excitatory. These common values are based on experiments, observations, and trial and error, but here I take a different perspective, inspired by evolution: I systematically simulate many networks, each with a different set of parameters, and then I try to figure out what makes the common values desirable. I stimulate networks with pulses and then measure their dynamic range, the dominant frequency of population activities, the total duration of activities, the maximum population rate and the occurrence time of the maximum rate. The results are organized in a phase diagram. This phase diagram gives an insight into the space of parameters: excitatory to inhibitory ratio, sparseness of connections and synaptic weights. This phase diagram can be used to decide the parameters of a model. The phase diagrams show that networks which are configured according to the common values have a good dynamic range in response to an impulse, their dynamic range is robust with respect to synaptic weights, and for some synaptic weights they oscillate at α or β frequencies, independent of external stimuli.
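
    The parameter regime discussed above (2% connection probability, 80% excitatory and 20% inhibitory neurons) can be set up directly; the weight magnitudes in the sketch below are illustrative placeholders, not values from the paper.

    ```python
    # Sketch of the network construction described above: 2% connection
    # probability, 80% excitatory / 20% inhibitory neurons.
    import numpy as np

    def build_ei_network(n=1000, p=0.02, frac_exc=0.8, w_exc=0.1, w_inh=-0.4, seed=0):
        rng = np.random.default_rng(seed)
        is_exc = rng.random(n) < frac_exc                    # ~80% excitatory neurons
        mask = rng.random((n, n)) < p                        # ~2% of possible synapses
        np.fill_diagonal(mask, False)                        # no self-connections
        W = np.where(is_exc[None, :], w_exc, w_inh) * mask   # sign set by presynaptic type
        return W, is_exc
    ```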

  13. Fast and Efficient Asynchronous Neural Computation with Adapting Spiking Neural Networks

    NARCIS (Netherlands)

    D. Zambrano (Davide); S.M. Bohte (Sander)

    2016-01-01

    Biological neurons communicate with a sparing exchange of pulses - spikes. It is an open question how real spiking neurons produce the kind of powerful neural computation that is possible with deep artificial neural networks, using only so very few spikes to communicate. Building on

  14. Supervised Learning in Spiking Neural Networks for Precise Temporal Encoding

    National Research Council Canada - National Science Library

    Gardner, Brian; Grüning, André

    2016-01-01

    Precise spike timing as a means to encode information in neural networks is biologically supported, and is advantageous over frequency-based codes by processing input features on a much shorter time-scale...

  15. Enhancement of Spike-Timing-Dependent Plasticity in Spiking Neural Systems with Noise.

    Science.gov (United States)

    Nobukawa, Sou; Nishimura, Haruhiko

    2016-08-01

    Synaptic plasticity is widely recognized to support adaptable information processing in the brain. Spike-timing-dependent plasticity, one subtype of plasticity, can lead to synchronous spike propagation with temporal spiking coding information. Recently, it was reported that in a noisy environment, like the actual brain, the spike-timing-dependent plasticity may be made efficient by the effect of stochastic resonance. In the stochastic resonance, the presence of noise helps a nonlinear system in amplifying a weak (under barrier) signal. However, previous studies have ignored the full variety of spiking patterns and many relevant factors in neural dynamics. Thus, in order to prove the physiological possibility for the enhancement of spike-timing-dependent plasticity by stochastic resonance, it is necessary to demonstrate that this stochastic resonance arises in realistic cortical neural systems. In this study, we evaluate this stochastic resonance phenomenon in the realistic cortical neural system described by the Izhikevich neuron model and compare the characteristics of typical spiking patterns of regular spiking, intrinsically bursting and chattering experimentally observed in the cortex.

  16. Neuronal spike sorting based on radial basis function neural networks

    Directory of Open Access Journals (Sweden)

    Taghavi Kani M

    2011-02-01

    Full Text Available Background: Studying the behavior of a society of neurons, extracting the communication mechanisms of the brain with other tissues, finding treatments for some nervous system diseases and designing neuroprosthetic devices require an algorithm to sort neural spikes automatically. However, sorting neural spikes is a challenging task because of the low signal-to-noise ratio (SNR) of the spikes. The main purpose of this study was to design an automatic algorithm for classifying neuronal spikes that are emitted from a specific region of the nervous system. Methods: The spike sorting process usually consists of three stages: detection, feature extraction and sorting. We initially used signal statistics to detect neural spikes. Then, we chose a limited number of typical spikes as features and finally used them to train a radial basis function (RBF) neural network to sort the spikes. In most spike sorting devices, these signals are not linearly discriminative. In order to solve this problem, the aforesaid RBF neural network was used. Results: After the learning process, our proposed algorithm classified any arbitrary spike. The obtained results showed that even though the proposed Radial Basis Spike Sorter (RBSS) reached the same error as the previous methods, the computational costs were much lower compared to other algorithms. Moreover, the competitive points of the proposed algorithm were its good speed and low computational complexity. Conclusion: Regarding the results of this study, the proposed algorithm seems to serve the purpose of procedures that require real-time processing and spike sorting.

  17. Financial time series prediction using spiking neural networks.

    Directory of Open Access Journals (Sweden)

    David Reid

    Full Text Available In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as this. The performance of the spiking neural network was benchmarked against three systems: two "traditional", rate-encoded, neural networks; a Multi-Layer Perceptron neural network and a Dynamic Ridge Polynomial neural network, and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data; US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-Step ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrated the applicability of the Polychronous Spiking Network to financial data forecasting and this in turn indicates the potential of using such networks over traditional systems in difficult to manage non-stationary environments.

  18. Inherently stochastic spiking neurons for probabilistic neural computation

    KAUST Repository

    Al-Shedivat, Maruan

    2015-04-01

    Neuromorphic engineering aims to design hardware that efficiently mimics neural circuitry and provides the means for emulating and studying neural systems. In this paper, we propose a new memristor-based neuron circuit that uniquely complements the scope of neuron implementations and follows the stochastic spike response model (SRM), which plays a cornerstone role in spike-based probabilistic algorithms. We demonstrate that the switching of the memristor is akin to the stochastic firing of the SRM. Our analysis and simulations show that the proposed neuron circuit satisfies a neural computability condition that enables probabilistic neural sampling and spike-based Bayesian learning and inference. Our findings constitute an important step towards memristive, scalable and efficient stochastic neuromorphic platforms. © 2015 IEEE.

  19. Efficient sequential Bayesian inference method for real-time detection and sorting of overlapped neural spikes.

    Science.gov (United States)

    Haga, Tatsuya; Fukayama, Osamu; Takayama, Yuzo; Hoshino, Takayuki; Mabuchi, Kunihiko

    2013-09-30

    Overlapping of extracellularly recorded neural spike waveforms causes the original spike waveforms to become hidden and merged, confounding the real-time detection and sorting of these spikes. Methods proposed for solving this problem include using a multi-trode or placing a restriction on the complexity of overlaps. In this paper, we propose a rapid sequential method for the robust detection and sorting of arbitrarily overlapped spikes recorded with arbitrary types of electrodes. In our method, the probabilities of possible spike trains, including those that are overlapping, are evaluated by sequential Bayesian inference based on probabilistic models of spike-train generation and extracellular voltage recording. To reduce the high computational cost inherent in an exhaustive evaluation, candidates with low probabilities are considered as impossible candidates and are abolished at each sampling time to limit the number of candidates in the next evaluation. In addition, the data from a few subsequent sampling times are considered and used to calculate the "look-ahead probability", resulting in improved calculation efficiency due to a more rapid elimination of candidates. These sufficiently reduce computational time to enable real-time calculation without impairing performance. We assessed the performance of our method using simulated neural signals and actual neural signals recorded in primary cortical neurons cultured on a multi-electrode array. Our results demonstrated that our computational method could be applied in real-time with a delay of less than 10 ms. The estimation accuracy was higher than that of a conventional spike sorting method, particularly for signals with multiple overlapping spikes. Copyright © 2013 Elsevier B.V. All rights reserved.

  20. Efficiently passing messages in distributed spiking neural network simulation.

    Science.gov (United States)

    Thibeault, Corey M; Minkovich, Kirill; O'Brien, Michael J; Harris, Frederick C; Srinivasa, Narayan

    2013-01-01

    Efficiently passing spiking messages in a neural model is an important aspect of high-performance simulation. As the scale of networks has increased so has the size of the computing systems required to simulate them. In addition, the information exchange of these resources has become more of an impediment to performance. In this paper we explore spike message passing using different mechanisms provided by the Message Passing Interface (MPI). A specific implementation, MVAPICH, designed for high-performance clusters with Infiniband hardware is employed. The focus is on providing information about these mechanisms for users of commodity high-performance spiking simulators. In addition, a novel hybrid method for spike exchange was implemented and benchmarked.
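
    The per-step exchange pattern being benchmarked, in which every rank shares the spikes it generated with all other ranks, can be sketched with mpi4py; this illustrates the communication pattern only and is not tied to the MVAPICH-specific mechanisms measured in the paper. The script name and rates are hypothetical.

    ```python
    # Sketch of an all-to-all spike exchange: each rank gathers the spike
    # identifiers produced by every other rank at each simulation step.
    # Run with e.g.:  mpiexec -n 4 python spike_exchange.py   (hypothetical file name)
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()
    rng = np.random.default_rng(rank)

    neurons_per_rank = 250
    for step in range(100):
        # Spikes generated locally this step (global IDs of neurons owned by this rank).
        local = rank * neurons_per_rank + np.flatnonzero(rng.random(neurons_per_rank) < 0.02)
        # Pickle-based allgather handles variable-length spike lists per rank.
        all_spikes = np.concatenate(comm.allgather(local))
        # ... deliver `all_spikes` to local synapses here ...
    ```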

  1. An Efficient Supervised Training Algorithm for Multilayer Spiking Neural Networks.

    Science.gov (United States)

    Xie, Xiurui; Qu, Hong; Liu, Guisong; Zhang, Malu; Kurths, Jürgen

    2016-01-01

    The spiking neural networks (SNNs) are the third generation of neural networks and perform remarkably well in cognitive tasks such as pattern recognition. The spike emitting and information processing mechanisms found in biological cognitive systems motivate the application of the hierarchical structure and temporal encoding mechanism in spiking neural networks, which have exhibited strong computational capability. However, the hierarchical structure and temporal encoding approach require neurons to process information serially in space and time, respectively, which reduces the training efficiency significantly. For training hierarchical SNNs, most existing methods are based on the traditional back-propagation algorithm, inheriting its drawbacks of gradient diffusion and sensitivity to parameters. To keep the powerful computation capability of the hierarchical structure and temporal encoding mechanism, but to overcome the low efficiency of the existing algorithms, a new training algorithm, the Normalized Spiking Error Back Propagation (NSEBP), is proposed in this paper. In the feedforward calculation, the output spike times are calculated by solving the quadratic function in the spike response model instead of detecting postsynaptic voltage states at all time points as in traditional algorithms. Besides, in the feedback weight modification, the computational error is propagated to previous layers by the presynaptic spike jitter instead of the gradient descent rule, which realizes layer-wise training. Furthermore, our algorithm investigates the mathematical relation between the weight variation and voltage error change, which makes the normalization in the weight modification applicable. Adopting these strategies, our algorithm outperforms the traditional SNN multi-layer algorithms in terms of learning efficiency and parameter sensitivity, as is also demonstrated by the comprehensive experimental results in this paper.

  2. Effects of Spike Anticipation on the Spiking Dynamics of Neural Networks

    Directory of Open Access Journals (Sweden)

    Daniel de Santos-Sierra

    2015-11-01

    Full Text Available Synchronization is one of the central phenomena involved in information processing in living systems. It is known that the nervous system requires the coordinated activity of both local and distant neural populations. Such an interplay allows different information modalities to be merged into a single process supporting high-level mental skills such as understanding, memory and abstraction. Though the biological processes underlying synchronization in the brain are not fully understood, a variety of mechanisms supporting different types of synchronization have been reported at both the theoretical and experimental levels. One of the more intriguing of these phenomena is anticipating synchronization, which has recently been reported in a pair of unidirectionally coupled artificial neurons under simple conditions [Pyragas], where the slave neuron is able to anticipate in time the behaviour of the master one. In this paper we explore the effect of spike anticipation on the information processing performed by a neural network at the functional and structural level. We show that the introduction of intermediary neurons in the network enhances spike anticipation and analyse how these variations in spike anticipation can significantly change the firing regime of the neural network according to its functional and structural properties. In addition we show that the interspike interval (ISI), one of the main features of the neural response associated with information coding, can be closely related to the anticipation of each spike, and how synaptic plasticity can be modulated through that relationship. This study has been performed through numerical simulation of a coupled system of Hindmarsh-Rose neurons.

  3. A Granger causality measure for point process models of ensemble neural spiking activity.

    Directory of Open Access Journals (Sweden)

    Sanggyun Kim

    2011-03-01

    Full Text Available The ability to identify directional interactions that occur among multiple neurons in the brain is crucial to an understanding of how groups of neurons cooperate in order to generate specific brain functions. However, an optimal method of assessing these interactions has not been established. Granger causality has proven to be an effective method for the analysis of the directional interactions between multiple sets of continuous-valued data, but cannot be applied to neural spike train recordings due to their discrete nature. This paper proposes a point process framework that enables Granger causality to be applied to point process data such as neural spike trains. The proposed framework uses the point process likelihood function to relate a neuron's spiking probability to possible covariates, such as its own spiking history and the concurrent activity of simultaneously recorded neurons. Granger causality is assessed based on the relative reduction of the point process likelihood of one neuron obtained excluding one of its covariates compared to the likelihood obtained using all of its covariates. The method was tested on simulated data, and then applied to neural activity recorded from the primary motor cortex (MI) of a Felis catus subject. The interactions present in the simulated data were predicted with a high degree of accuracy, and when applied to the real neural data, the proposed method identified causal relationships between many of the recorded neurons. This paper proposes a novel method that successfully applies Granger causality to point process data, and has the potential to provide unique physiological insights when applied to neural spike trains.

  4. A Granger causality measure for point process models of ensemble neural spiking activity.

    Science.gov (United States)

    Kim, Sanggyun; Putrino, David; Ghosh, Soumya; Brown, Emery N

    2011-03-01

    The ability to identify directional interactions that occur among multiple neurons in the brain is crucial to an understanding of how groups of neurons cooperate in order to generate specific brain functions. However, an optimal method of assessing these interactions has not been established. Granger causality has proven to be an effective method for the analysis of the directional interactions between multiple sets of continuous-valued data, but cannot be applied to neural spike train recordings due to their discrete nature. This paper proposes a point process framework that enables Granger causality to be applied to point process data such as neural spike trains. The proposed framework uses the point process likelihood function to relate a neuron's spiking probability to possible covariates, such as its own spiking history and the concurrent activity of simultaneously recorded neurons. Granger causality is assessed based on the relative reduction of the point process likelihood of one neuron obtained excluding one of its covariates compared to the likelihood obtained using all of its covariates. The method was tested on simulated data, and then applied to neural activity recorded from the primary motor cortex (MI) of a Felis catus subject. The interactions present in the simulated data were predicted with a high degree of accuracy, and when applied to the real neural data, the proposed method identified causal relationships between many of the recorded neurons. This paper proposes a novel method that successfully applies Granger causality to point process data, and has the potential to provide unique physiological insights when applied to neural spike trains.
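
    The likelihood-ratio idea described in the two records above can be sketched with a Poisson GLM on binned spike counts, where the covariates are the lagged spiking histories of the target and of a candidate source neuron. The code below is a minimal illustration, not the authors' implementation; the lag count, optimizer and synthetic data are all assumptions.

```python
import numpy as np
from scipy.optimize import minimize

def fit_poisson_glm(X, y):
    """Fit a Poisson GLM (log link) by maximum likelihood; return weights and log-likelihood."""
    X = np.column_stack([np.ones(len(y)), X])              # add an intercept column
    def nll(w):
        eta = np.clip(X @ w, -20, 20)                      # clip to keep exp() well behaved
        return np.sum(np.exp(eta) - y * eta)               # negative Poisson log-likelihood (up to a constant)
    res = minimize(nll, np.zeros(X.shape[1]), method="L-BFGS-B")
    return res.x, -res.fun

def history_matrix(counts, lags):
    """Lagged copies of a binned spike-count sequence, used as GLM covariates."""
    return np.column_stack([np.roll(counts, lag) for lag in range(1, lags + 1)])[lags:]

def granger_measure(target, source, lags=5):
    """Log-likelihood ratio: full model (own + source history) minus reduced model (own history only)."""
    y = target[lags:]
    own, other = history_matrix(target, lags), history_matrix(source, lags)
    _, ll_full = fit_poisson_glm(np.hstack([own, other]), y)
    _, ll_reduced = fit_poisson_glm(own, y)
    return ll_full - ll_reduced                            # > 0 suggests the source helps predict the target

rng = np.random.default_rng(0)
src = rng.poisson(0.2, 5000)                               # binned spike counts of the putative driver
tgt = rng.poisson(0.1 + 0.5 * np.roll(src, 2))             # target modulated by the source at a 2-bin lag
print(granger_measure(tgt, src), granger_measure(src, tgt))
```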

  5. Dual roles for spike signaling in cortical neural populations

    Directory of Open Access Journals (Sweden)

    Dana Ballard

    2011-06-01

    Full Text Available A prominent feature of signaling in cortical neurons is that of randomness in the action potential. The output of a typical pyramidal cell can be well fit with a Poisson model, and variations in the Poisson rate have repeatedly been shown to be correlated with stimuli. However, while the rate provides a very useful characterization of neural spike data, it may not be the most fundamental description of the signaling code. Recent data showing multi-cell action potential correlations in the γ frequency range, together with spike-timing-dependent plasticity, are spurring a re-examination of the classical model, since precise timing codes imply that the generation of spikes is essentially deterministic. Could the observed Poisson randomness and timing determinism reflect two separate modes of communication, or do they somehow derive from a single process? We investigate in a timing-based model whether the apparent incompatibility between these probabilistic and deterministic observations may be resolved by examining how spikes could be used in the underlying neural circuits. The crucial component of this model draws on dual roles for spike signaling. In learning receptive fields from ensembles of inputs, spikes need to behave probabilistically, whereas for fast signaling of individual stimuli, the spikes need to behave deterministically. Our simulations show that this combination is possible if deterministic signals using γ latency coding are probabilistically routed through different members of a cortical cell population at different times. This model exhibits standard features characteristic of Poisson models, such as orientation tuning, post-stimulus histograms and exponential interval histograms. In addition, it makes testable predictions that follow from the γ latency coding.

  6. Spiking neural network-based control chart pattern recognition

    Directory of Open Access Journals (Sweden)

    Medhat H.A. Awadalla

    2012-03-01

    Full Text Available Due to increasing competition among products, consumers have become more critical in choosing products, and product quality has become more important. Statistical Process Control (SPC) is usually used to improve the quality of products. Control charting plays the most important role in SPC. Control charts help to monitor the behavior of a process to determine whether it is stable or not. Unnatural patterns in control charts indicate that there are unnatural causes for variation. Spiking neural networks (SNNs) are the third generation of artificial neural networks that consider time as an important feature for information representation and processing. In this paper, a spiking neural network architecture is proposed for control chart pattern recognition (CCPR). Furthermore, enhancements to the SpikeProp learning algorithm are proposed. These enhancements provide additional learning rules for the synaptic delays, time constants and neuron thresholds. Simulated experiments have been conducted, and the achieved results show a remarkable improvement in overall performance compared with artificial neural networks.

  7. Exact computation of the Maximum Entropy Potential of spiking neural networks models

    CERN Document Server

    Cofre, Rodrigo

    2014-01-01

    Understanding how stimuli and synaptic connectivity influence the statistics of spike patterns in neural networks is a central question in computational neuroscience. The Maximum Entropy approach has been successfully used to characterize the statistical response of simultaneously recorded spiking neurons responding to stimuli. However, in spite of good performance in terms of prediction, the fitting parameters do not explain the underlying mechanistic causes of the observed correlations. On the other hand, mathematical models of spiking neurons (neuro-mimetic models) provide a probabilistic mapping between stimulus, network architecture and spike patterns in terms of conditional probabilities. In this paper we build an exact analytical mapping between neuro-mimetic and Maximum Entropy models.

  8. Classification of epileptiform and wicket spike of EEG pattern using backpropagation neural network

    Science.gov (United States)

    Puspita, Juni Wijayanti; Jaya, Agus Indra; Gunadharma, Suryani

    2017-03-01

    Epilepsy is characterized by recurrent seizures resulting from permanent brain abnormalities. One of the tools used to support the diagnosis of epilepsy is the electroencephalogram (EEG), which records the electrical activity of the brain. Abnormal EEG patterns in epilepsy patients consist of Spike and Sharp waves. Besides these two waves, there is a normal pattern that is sometimes misinterpreted as epileptiform by the electroencephalographer (EEGer), namely the Wicket Spike. The main difference between the three waves lies in their time duration, which is related to their frequency. In this study, we propose a method to classify an EEG wave into the Sharp wave, Spike wave or Wicket spike group using a backpropagation neural network based on the frequency and amplitude of each wave. The results show that the proposed method classifies the three groups of waves with good accuracy.
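
    As a rough illustration of the classification setup described above (two features per wave, frequency and amplitude, and three target classes), the sketch below trains a small backpropagation network using scikit-learn's MLPClassifier. The synthetic feature ranges and network size are invented for the example and are not taken from the study.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)

def make_waves(n, freq_range, amp_range, label):
    """Synthetic (frequency [Hz], amplitude [uV]) feature pairs for one wave class (ranges invented)."""
    freq = rng.uniform(*freq_range, n)
    amp = rng.uniform(*amp_range, n)
    return np.column_stack([freq, amp]), np.full(n, label)

# Hypothetical, roughly separable feature ranges for the three classes (for the demo only).
X_spike, y_spike = make_waves(200, (14, 50), (60, 200), 0)   # Spike wave
X_sharp, y_sharp = make_waves(200, (5, 14), (60, 200), 1)    # Sharp wave
X_wicket, y_wicket = make_waves(200, (6, 11), (20, 50), 2)   # Wicket spike

X = np.vstack([X_spike, X_sharp, X_wicket])
y = np.concatenate([y_spike, y_sharp, y_wicket])
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# Scale the two features, then train a small backpropagation (MLP) classifier.
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
model.fit(X_tr, y_tr)
print("test accuracy:", round(model.score(X_te, y_te), 3))
```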

  9. SpikingLab: modelling agents controlled by Spiking Neural Networks in Netlogo.

    Science.gov (United States)

    Jimenez-Romero, Cristian; Johnson, Jeffrey

    2017-01-01

    The scientific interest attracted by Spiking Neural Networks (SNN) has led to the development of tools for the simulation and study of neuronal dynamics, ranging from phenomenological models to the more sophisticated and biologically accurate Hodgkin-and-Huxley-based and multi-compartmental models. However, despite the multiple features offered by neural modelling tools, their integration with environments for the simulation of robots and agents can be challenging and time consuming. The implementation of artificial neural circuits to control robots generally involves the following tasks: (1) understanding the simulation tools, (2) creating the neural circuit in the neural simulator, (3) linking the simulated neural circuit with the environment of the agent and (4) programming the appropriate interface in the robot or agent to use the neural controller. The accomplishment of the above-mentioned tasks can be challenging, especially for undergraduate students or novice researchers. This paper presents an alternative tool which facilitates the simulation of simple SNN circuits using the multi-agent simulation and programming environment Netlogo (educational software that simplifies the study and experimentation of complex systems). The engine proposed and implemented in Netlogo for the simulation of a functional model of SNN is a simplification of integrate-and-fire (I&F) models. The characteristics of the engine (including neuronal dynamics, STDP learning and synaptic delay) are demonstrated through the implementation of an agent representing an artificial insect controlled by a simple neural circuit. The setup of the experiment and its outcomes are described in this work.
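
    SpikingLab itself runs inside NetLogo; the following Python sketch only illustrates the kind of simplified integrate-and-fire dynamics the record describes (leaky integration, threshold, reset, refractory period and a fixed synaptic delay). All constants are invented for the illustration and do not correspond to SpikingLab's defaults.

```python
from collections import deque

class LIFNeuron:
    """Minimal leaky integrate-and-fire unit (illustrative constants, not SpikingLab's)."""
    def __init__(self, threshold=1.0, v_rest=0.0, leak=0.1, refractory=3):
        self.threshold, self.v_rest, self.leak = threshold, v_rest, leak
        self.refractory, self.refrac_left = refractory, 0
        self.v = v_rest

    def step(self, input_current):
        if self.refrac_left > 0:                 # absolute refractory period
            self.refrac_left -= 1
            return False
        self.v += input_current - self.leak * (self.v - self.v_rest)
        if self.v >= self.threshold:             # fire, then reset
            self.v = self.v_rest
            self.refrac_left = self.refractory
            return True
        return False

# Two neurons connected by a delayed excitatory synapse (delay counted in time steps).
pre, post = LIFNeuron(), LIFNeuron()
delay, weight = 4, 1.2
pipeline = deque([0.0] * delay, maxlen=delay)    # models the axonal/synaptic delay

for t in range(100):
    drive = 0.25 if t % 10 < 5 else 0.0          # simple periodic input to the presynaptic cell
    pre_spiked = pre.step(drive)
    delayed = pipeline.popleft()
    pipeline.append(weight if pre_spiked else 0.0)
    post_spiked = post.step(delayed)
    if pre_spiked or post_spiked:
        print(f"t={t:3d}  pre={'*' if pre_spiked else '.'}  post={'*' if post_spiked else '.'}")
```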

  10. Spatiotemporal receptive field properties of epiretinally recorded spikes and local electroretinograms in cats

    Directory of Open Access Journals (Sweden)

    Marcus Wilms

    2005-08-01

    Full Text Available Background Receptive fields of retinal neural signals of different origin can be determined from extracellular microelectrode recordings at the inner retinal surface. However, the locations and types of neural processes generating the different signal components are difficult to separate and identify. We here report epiretinal receptive fields (RFs) from simultaneously recorded spikes and local electroretinograms (LERGs) using a semi-chronic multi-electrode in vivo recording technique in cats. Broadband recordings were filtered to yield LERG and multi-unit as well as single-unit spike signals. RFs were calculated from responses to multifocal pseudo-random spatiotemporal visual stimuli registered at the retinal surface by a 7-electrode array. Results LERGs exhibit spatially unimodal RFs always centered at the location of the electrode tip. Spike-RFs are either congruent with LERG-RFs (N = 26/61) or shifted distally (N = 35/61), but never proximally, with respect to the optic disk. LERG-RFs appear at shorter latencies (11.9 ms ± 0.5 ms, N = 18) than those of spikes (18.6 ms ± 0.4 ms, N = 53). Furthermore, OFF-center spike-RFs precede and have shorter response rise times than ON-center spike-RFs. Our results indicate that displaced spike-RFs result from action potentials of ganglion cell axons passing the recording electrode en route to the optic disk, while LERG-RFs are related to superimposed postsynaptic potentials of cells near the electrode tip. Conclusion Besides contributing to the understanding of retinal function, we demonstrate the caveats that come with recordings from the retinal surface, i.e., the likelihood of recording from mixed sets of retinal neurons. Implications for the design of an epiretinal visual implant are discussed.

  11. Spiking neural circuits with dendritic stimulus processors : encoding, decoding, and identification in reproducing kernel Hilbert spaces.

    Science.gov (United States)

    Lazar, Aurel A; Slutskiy, Yevgeniy B

    2015-02-01

    We present a multi-input multi-output neural circuit architecture for nonlinear processing and encoding of stimuli in the spike domain. In this architecture a bank of dendritic stimulus processors implements nonlinear transformations of multiple temporal or spatio-temporal signals such as spike trains or auditory and visual stimuli in the analog domain. Dendritic stimulus processors may act on both individual stimuli and on groups of stimuli, thereby executing complex computations that arise as a result of interactions between concurrently received signals. The results of the analog-domain computations are then encoded into a multi-dimensional spike train by a population of spiking neurons modeled as nonlinear dynamical systems. We investigate general conditions under which such circuits faithfully represent stimuli and demonstrate algorithms for (i) stimulus recovery, or decoding, and (ii) identification of dendritic stimulus processors from the observed spikes. Taken together, our results demonstrate a fundamental duality between the identification of the dendritic stimulus processor of a single neuron and the decoding of stimuli encoded by a population of neurons with a bank of dendritic stimulus processors. This duality result enabled us to derive lower bounds on the number of experiments to be performed and the total number of spikes that need to be recorded for identifying a neural circuit.

  12. Evolving Spiking Neural Networks for Control of Artificial Creatures

    Directory of Open Access Journals (Sweden)

    Arash Ahmadi

    2013-10-01

    Full Text Available To understand and analyse the behavior of complicated and intelligent organisms, scientists apply bio-inspired concepts, including evolution and learning, to mathematical models and analyses. Researchers utilize these insights in different applications, searching for improved methods and approaches for modern computational systems. This paper presents a genetic-algorithm-based evolution framework in which the Spiking Neural Networks (SNNs) of artificial creatures are evolved for a higher chance of survival in a virtual environment. The artificial creatures are composed of randomly connected Izhikevich spiking reservoir neural networks using population activity rate coding. Inspired by biological neurons, the neuronal connections are modelled with different axonal conduction delays. Simulation results show that the evolutionary algorithm is capable of finding or synthesizing artificial creatures which can survive successfully in the environment.
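
    The Izhikevich model used for the reservoir neurons above has a well-known two-variable form. The sketch below simulates a single regular-spiking Izhikevich neuron with Euler updates; the reservoir wiring, conduction delays, rate coding and genetic algorithm of the paper are not reproduced, and the input current is illustrative.

```python
import numpy as np

def izhikevich(T_ms=500, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0):
    """Simulate a single Izhikevich neuron (regular-spiking parameters) and return its spike times (ms)."""
    v, u = -65.0, b * -65.0
    spikes = []
    for step in range(int(T_ms / dt)):
        v += dt * (0.04 * v**2 + 5 * v + 140 - u + I)   # membrane potential update
        u += dt * a * (b * v - u)                        # recovery variable update
        if v >= 30.0:                                    # spike cutoff and after-spike reset
            spikes.append(step * dt)
            v, u = c, u + d
    return np.array(spikes)

print(izhikevich()[:10])   # first few spike times in ms
```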

  13. Real-time cerebellar neuroprosthetic system based on a spiking neural network model of motor learning.

    Science.gov (United States)

    Xu, Tao; Xiao, Na; Zhai, Xiaolong; Kwan Chan, Pak; Tin, Chung

    2018-02-01

    Damage to the brain, as a result of various medical conditions, impacts the everyday life of patients and there is still no complete cure to neurological disorders. Neuroprostheses that can functionally replace the damaged neural circuit have recently emerged as a possible solution to these problems. Here we describe the development of a real-time cerebellar neuroprosthetic system to substitute neural function in cerebellar circuitry for learning delay eyeblink conditioning (DEC). The system was empowered by a biologically realistic spiking neural network (SNN) model of the cerebellar neural circuit, which considers the neuronal population and anatomical connectivity of the network. The model simulated synaptic plasticity critical for learning DEC. This SNN model was carefully implemented on a field programmable gate array (FPGA) platform for real-time simulation. This hardware system was interfaced in in vivo experiments with anesthetized rats and it used neural spikes recorded online from the animal to learn and trigger conditioned eyeblink in the animal during training. This rat-FPGA hybrid system was able to process neuronal spikes in real-time with an embedded cerebellum model of ~10 000 neurons and reproduce learning of DEC with different inter-stimulus intervals. Our results validated that the system performance is physiologically relevant at both the neural (firing pattern) and behavioral (eyeblink pattern) levels. This integrated system provides sufficient computational power for mimicking the cerebellar circuit in real time. The system interacts with the biological system naturally at the spike level and can be generalized for including other neural components (neuron types and plasticity) and neural functions for potential neuroprosthetic applications.

  14. Real-time cerebellar neuroprosthetic system based on a spiking neural network model of motor learning

    Science.gov (United States)

    Xu, Tao; Xiao, Na; Zhai, Xiaolong; Chan, Pak Kwan; Tin, Chung

    2018-02-01

    Objective. Damage to the brain, as a result of various medical conditions, impacts the everyday life of patients and there is still no complete cure to neurological disorders. Neuroprostheses that can functionally replace the damaged neural circuit have recently emerged as a possible solution to these problems. Here we describe the development of a real-time cerebellar neuroprosthetic system to substitute neural function in cerebellar circuitry for learning delay eyeblink conditioning (DEC). Approach. The system was empowered by a biologically realistic spiking neural network (SNN) model of the cerebellar neural circuit, which considers the neuronal population and anatomical connectivity of the network. The model simulated synaptic plasticity critical for learning DEC. This SNN model was carefully implemented on a field programmable gate array (FPGA) platform for real-time simulation. This hardware system was interfaced in in vivo experiments with anesthetized rats and it used neural spikes recorded online from the animal to learn and trigger conditioned eyeblink in the animal during training. Main results. This rat-FPGA hybrid system was able to process neuronal spikes in real-time with an embedded cerebellum model of ~10 000 neurons and reproduce learning of DEC with different inter-stimulus intervals. Our results validated that the system performance is physiologically relevant at both the neural (firing pattern) and behavioral (eyeblink pattern) levels. Significance. This integrated system provides sufficient computational power for mimicking the cerebellar circuit in real time. The system interacts with the biological system naturally at the spike level and can be generalized for including other neural components (neuron types and plasticity) and neural functions for potential neuroprosthetic applications.

  15. Supervised Learning in Spiking Neural Networks for Precise Temporal Encoding.

    Science.gov (United States)

    Gardner, Brian; Grüning, André

    2016-01-01

    Precise spike timing as a means to encode information in neural networks is biologically supported, and is advantageous over frequency-based codes by processing input features on a much shorter time-scale. For these reasons, much recent attention has been focused on the development of supervised learning rules for spiking neural networks that utilise a temporal coding scheme. However, despite significant progress in this area, there is still a lack of rules that have a theoretical basis and yet can be considered biologically relevant. Here we examine the general conditions under which synaptic plasticity most effectively takes place to support the supervised learning of a precise temporal code. As part of our analysis we examine two spike-based learning methods: one that relies on an instantaneous error signal to modify synaptic weights in a network (INST rule), and another that relies on a filtered error signal for smoother synaptic weight modifications (FILT rule). We test the accuracy of the solutions provided by each rule with respect to their temporal encoding precision, and then measure the maximum number of input patterns they can learn to memorise using the precise timings of individual spikes as an indication of their storage capacity. Our results demonstrate the high performance of the FILT rule in most cases, underpinned by the rule's error-filtering mechanism, which is predicted to provide smooth convergence towards a desired solution during learning. We also find the FILT rule to be most efficient at performing input pattern memorisations, and most noticeably when patterns are identified using spikes with sub-millisecond temporal precision. In comparison with existing work, we determine the performance of the FILT rule to be consistent with that of the highly efficient E-learning Chronotron rule, but with the distinct advantage that our FILT rule is also implementable as an online method for increased biological realism.

  16. A 41 μW real-time adaptive neural spike classifier

    NARCIS (Netherlands)

    Zjajo, A.; van Leuken, T.G.R.M.

    2016-01-01

    A robust, power- and area-efficient spike classifier, capable of accurately identifying neural spikes even at low SNR, is a prerequisite for a real-time, implantable, closed-loop brain-machine interface. In this paper, we propose an easily-scalable, 128-channel, programmable, neural spike

  17. Fixed latency on-chip interconnect for hardware spiking neural network architectures

    NARCIS (Netherlands)

    Pande, Sandeep; Morgan, Fearghal; Smit, Gerardus Johannes Maria; Bruintjes, Tom; Rutgers, J.H.; Cawley, Seamus; Harkin, Jim; McDaid, Liam

    Information in a Spiking Neural Network (SNN) is encoded as the relative timing between spikes. Distortion in spike timings can impact the accuracy of SNN operation by modifying the precise firing time of neurons within the SNN. Maintaining the integrity of spike timings is crucial for reliable

  18. Event-driven processing for hardware-efficient neural spike sorting.

    Science.gov (United States)

    Liu, Yan; L Pereira, João; Constandinou, Timothy

    2017-10-05

    The prospect of real-time and on-node spike sorting provides a genuine opportunity to push the envelope for large-scale integration of neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can here provide a new efficient means for hardware implementation that is completely activity dependent. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. We first compare signals (using synthetic neural datasets) that are encoded using this technique against conventional sampling. It is observed that considerably lower data rates are achievable when utilising 7 bits or less to represent the signals, whilst maintaining the signal fidelity. We then show how such a representation can be directly exploited by extracting simple time-domain features from the bitstream to perform neural spike sorting. The proposed method is implemented on a low-power FPGA platform to demonstrate its hardware viability. Results obtained using both MATLAB and reconfigurable logic (FPGA) hardware indicate that feature extraction and spike sorting can be achieved with comparable or better accuracy than reference methods whilst requiring relatively low hardware cost. Creative Commons Attribution license.
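
    The level-crossing (event-driven) representation discussed above can be sketched in a few lines: an event is emitted each time the signal moves by one quantization step relative to the last emitted level. The quantization step and test signal below are invented for the illustration.

```python
import numpy as np

def level_crossing_encode(x, delta):
    """Emit (sample index, direction) events whenever the signal moves by one quantization step delta."""
    events, last = [], x[0]
    for i, sample in enumerate(x[1:], start=1):
        while sample - last >= delta:        # one or more upward level crossings
            last += delta
            events.append((i, +1))
        while last - sample >= delta:        # one or more downward level crossings
            last -= delta
            events.append((i, -1))
    return events

t = np.arange(0, 0.02, 1e-4)                              # 200 samples at an assumed 10 kHz rate
signal = np.exp(-((t - 0.01) / 0.001) ** 2)               # a smooth spike-like bump
events = level_crossing_encode(signal, delta=0.1)
print(len(events), "events instead of", len(signal), "samples")
```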

  19. A new EC-PC threshold estimation method for in vivo neural spike detection

    Science.gov (United States)

    Yang, Zhi; Liu, Wentai; Keshtkaran, Mohammad Reza; Zhou, Yin; Xu, Jian; Pikov, Victor; Guan, Cuntai; Lian, Yong

    2012-08-01

    This paper models in vivo neural signals and noise for extracellular spike detection. Although the recorded data approximately follow a Gaussian distribution, they clearly deviate from white Gaussian noise due to neuronal synchronization and the sparse distribution of spike energy. Our study predicts the coexistence of two components embedded in neural data dynamics, one in exponential form (noise) and the other in power form (neural spikes). The prediction of the two components has been confirmed in experiments on in vivo sequences recorded from the hippocampus, cortical surface, and spinal cord; in both acute and long-term recordings; and in sleep and awake states. These two components are further used as references for threshold estimation. In contrast to the conventional wisdom of setting a threshold at 3×RMS, the estimated threshold exhibits significant variation. When our algorithm was tested on synthesized sequences with different signal-to-noise ratios and on/off firing dynamics, the inferred threshold statistics tracked the benchmarks well. We envision that this work may be applied to a wide range of experiments as a front-end data analysis tool.
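
    The record contrasts its EC-PC estimator with the conventional 3×RMS rule. The sketch below shows only that conventional baseline on synthetic data; the EC-PC decomposition itself is not reproduced here, and the noise level, spike shape and sampling rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 24_000                                    # assumed sampling rate (samples/s)
data = rng.normal(0, 5.0, fs)                  # 1 s of background noise, sigma = 5 uV
for t0 in rng.integers(100, fs - 100, 30):     # add 30 synthetic spikes of ~40 uV peak
    data[t0:t0 + 20] += 40 * np.hanning(20)

threshold = 3 * np.sqrt(np.mean(data ** 2))    # conventional 3 x RMS threshold
crossings = np.flatnonzero((data[1:] > threshold) & (data[:-1] <= threshold))
print(f"threshold = {threshold:.1f} uV, detected {len(crossings)} upward crossings")
```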

  20. Spike neural models (part I: The Hodgkin-Huxley model

    Directory of Open Access Journals (Sweden)

    Johnson, Melissa G.

    2017-05-01

    Full Text Available Artificial neural networks, or ANNs, have developed considerably since their inception in the 1940s. Yet through all these changes, one of the most important components of neural networks remains the node, which represents the neuron. Within spiking neural networks, the node is especially important because it contains the functions and properties of neurons that are necessary for their network. One important aspect of neurons is the ionic flow which produces action potentials, or spikes. Forces of diffusion and electrostatic pressure work together with the physical properties of the cell to move ions around, changing the cell membrane potential and ultimately producing the action potential. This tutorial reviews the Hodgkin-Huxley model and shows how it simulates the ionic flow of the giant squid axon via four differential equations. The model is implemented in Matlab using Euler's method to approximate the differential equations. Using Euler's method introduces an extra parameter, the time step, which must be chosen carefully or the results of the node may be inaccurate.
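
    The tutorial implements the model in Matlab; the sketch below is an equivalent Euler integration in Python using the standard squid-axon constants. As the record notes, the time step must be kept small for the approximation to remain accurate.

```python
import numpy as np

def hh_euler(I_ext=10.0, T=50.0, dt=0.01):
    """Hodgkin-Huxley membrane model integrated with Euler's method (standard squid-axon constants)."""
    C_m = 1.0                                   # membrane capacitance (uF/cm^2)
    g_Na, g_K, g_L = 120.0, 36.0, 0.3           # maximal conductances (mS/cm^2)
    E_Na, E_K, E_L = 50.0, -77.0, -54.387       # reversal potentials (mV)

    # Rate functions (V in mV, rates in 1/ms); removable singularities at V = -40 and -55 mV are ignored here.
    a_m = lambda V: 0.1 * (V + 40) / (1 - np.exp(-(V + 40) / 10))
    b_m = lambda V: 4.0 * np.exp(-(V + 65) / 18)
    a_h = lambda V: 0.07 * np.exp(-(V + 65) / 20)
    b_h = lambda V: 1.0 / (1 + np.exp(-(V + 35) / 10))
    a_n = lambda V: 0.01 * (V + 55) / (1 - np.exp(-(V + 55) / 10))
    b_n = lambda V: 0.125 * np.exp(-(V + 65) / 80)

    V, m, h, n = -65.0, 0.05, 0.6, 0.32
    trace = []
    for _ in range(int(T / dt)):
        I_Na = g_Na * m**3 * h * (V - E_Na)
        I_K = g_K * n**4 * (V - E_K)
        I_L = g_L * (V - E_L)
        V += dt * (I_ext - I_Na - I_K - I_L) / C_m     # Euler update of the membrane potential
        m += dt * (a_m(V) * (1 - m) - b_m(V) * m)      # gating variable updates
        h += dt * (a_h(V) * (1 - h) - b_h(V) * h)
        n += dt * (a_n(V) * (1 - n) - b_n(V) * n)
        trace.append(V)
    return np.array(trace)

v = hh_euler()
print("peak membrane potential (mV):", v.max())        # spikes overshoot well above 0 mV
```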

  1. Topological analysis of chaos in neural spike train bursts

    Energy Technology Data Exchange (ETDEWEB)

    Gilmore, R. (Department of Physics and Atmospheric Science, Drexel University, Philadelphia, Pennsylvania 19104 (United States)); Pei, X.; Moss, F. (Center for Neurodynamics, University of Missouri, St. Louis, Missouri 63121 (United States))

    1999-09-01

    We show how a topological model which describes the stretching and squeezing mechanisms responsible for creating chaotic behavior can be extracted from the neural spike train data. The mechanism we have identified is the same one ("gateau roulé", or jelly-roll) which has previously been identified in the Duffing oscillator [Gilmore and McCallum, Phys. Rev. E 51, 935 (1995)] and in a YAG laser [Boulant et al., Phys. Rev. E 55, 5082 (1997)]. © 1999 American Institute of Physics.

  2. Serial correlation in neural spike trains: experimental evidence, stochastic modeling, and single neuron variability.

    Science.gov (United States)

    Farkhooi, Farzad; Strube-Bloss, Martin F; Nawrot, Martin P

    2009-02-01

    The activity of spiking neurons is frequently described by renewal point process models that assume the statistical independence and identical distribution of the intervals between action potentials. However, the assumption of independent intervals must be questioned for many different types of neurons. We review experimental studies that reported the feature of a negative serial correlation of neighboring intervals, commonly observed in neurons in the sensory periphery as well as in central neurons, notably in the mammalian cortex. In our experiments we observed the same short-lived negative serial dependence of intervals in the spontaneous activity of mushroom body extrinsic neurons in the honeybee. To model serial interval correlations of arbitrary lags, we suggest a family of autoregressive point processes. Its marginal interval distribution is described by the generalized gamma model, which includes as special cases the log-normal and gamma distributions, which have been widely used to characterize regular spiking neurons. In numeric simulations we investigated how serial correlation affects the variance of the neural spike count. We show that the experimentally confirmed negative correlation reduces single-neuron variability, as quantified by the Fano factor, by up to 50%, which favors the transmission of a rate code. We argue that the feature of a negative serial correlation is likely to be common to the class of spike-frequency-adapting neurons and that it might have been largely overlooked in extracellular single-unit recordings due to spike sorting errors.

  3. Serial correlation in neural spike trains: Experimental evidence, stochastic modeling, and single neuron variability

    Science.gov (United States)

    Farkhooi, Farzad; Strube-Bloss, Martin F.; Nawrot, Martin P.

    2009-02-01

    The activity of spiking neurons is frequently described by renewal point process models that assume the statistical independence and identical distribution of the intervals between action potentials. However, the assumption of independent intervals must be questioned for many different types of neurons. We review experimental studies that reported the feature of a negative serial correlation of neighboring intervals, commonly observed in neurons in the sensory periphery as well as in central neurons, notably in the mammalian cortex. In our experiments we observed the same short-lived negative serial dependence of intervals in the spontaneous activity of mushroom body extrinsic neurons in the honeybee. To model serial interval correlations of arbitrary lags, we suggest a family of autoregressive point processes. Its marginal interval distribution is described by the generalized gamma model, which includes as special cases the log-normal and gamma distributions, which have been widely used to characterize regular spiking neurons. In numeric simulations we investigated how serial correlation affects the variance of the neural spike count. We show that the experimentally confirmed negative correlation reduces single-neuron variability, as quantified by the Fano factor, by up to 50%, which favors the transmission of a rate code. We argue that the feature of a negative serial correlation is likely to be common to the class of spike-frequency-adapting neurons and that it might have been largely overlooked in extracellular single-unit recordings due to spike sorting errors.
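
    The two statistics discussed in these records, the serial (lag-1) correlation of interspike intervals and the Fano factor of the spike count, are straightforward to compute; the sketch below does so on a surrogate gamma-interval spike train (the surrogate and the counting window are invented for the demonstration).

```python
import numpy as np

rng = np.random.default_rng(0)

# Surrogate spike train: gamma-distributed ISIs (shape > 1 gives fairly regular spiking).
isis = rng.gamma(shape=4.0, scale=25.0, size=5000)        # mean ISI = 100 ms
spike_times = np.cumsum(isis)

# Lag-1 serial correlation of interspike intervals (negative values indicate adaptation-like dependence).
rho1 = np.corrcoef(isis[:-1], isis[1:])[0, 1]

# Fano factor of spike counts in fixed windows: variance / mean.
window = 1000.0                                            # ms
counts, _ = np.histogram(spike_times, bins=np.arange(0, spike_times[-1], window))
fano = counts.var() / counts.mean()

print(f"lag-1 ISI correlation: {rho1:+.3f}   Fano factor: {fano:.3f}")
```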

  4. Brian: a simulator for spiking neural networks in Python

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2008-11-01

    Full Text Available Brian is a new simulator for spiking neural networks, written in Python (http://brian.di.ens.fr). It is an intuitive and highly flexible tool for rapidly developing new models, especially networks of single-compartment neurons. In addition to using standard types of neuron models, users can define models by writing arbitrary differential equations in ordinary mathematical notation. Python scientific libraries can also be used for defining models and analysing data. Vectorisation techniques allow efficient simulations despite the overheads of an interpreted language. Brian will be especially valuable for working on non-standard neuron models not easily covered by existing software, and as an alternative to using Matlab or C for simulations. With its easy and intuitive syntax, Brian is also very well suited for teaching computational neuroscience.
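
    A minimal example in the spirit of this record is shown below. It is written against the Brian 2 API, the current descendant of the simulator described here, so the exact syntax should be treated as an assumption rather than as the original Brian interface.

```python
from brian2 import NeuronGroup, SpikeMonitor, run, ms

tau = 10 * ms
eqs = 'dv/dt = (1.2 - v) / tau : 1'                       # leaky drive toward 1.2 (dimensionless units)
group = NeuronGroup(5, eqs, threshold='v > 1', reset='v = 0', method='exact')
monitor = SpikeMonitor(group)
run(100 * ms)
print(monitor.i[:], monitor.t[:])                         # neuron indices and spike times
```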

  5. SPANNER: A Self-Repairing Spiking Neural Network Hardware Architecture.

    Science.gov (United States)

    Liu, Junxiu; Harkin, Jim; Maguire, Liam P; McDaid, Liam J; Wade, John J

    2017-03-06

    Recent research has shown that a type of glial cell, the astrocyte, underpins a self-repair mechanism in the human brain, where spiking neurons provide direct and indirect feedback to presynaptic terminals. This feedback modulates the synaptic transmission probability of release (PR). When synaptic faults occur, the neuron becomes silent or near silent due to the low PR of its synapses; the PRs of the remaining healthy synapses are then increased by the indirect feedback from the astrocyte cell. In this paper, a novel hardware architecture of a Self-rePAiring spiking Neural NEtwoRk (SPANNER) is proposed, which mimics this self-repairing capability of the human brain. This paper demonstrates that the hardware can self-detect and self-repair synaptic faults without conventional fault-detection and fault-repair components. Experimental results show that SPANNER can maintain the system performance with fault densities of up to 40%, and, more importantly, SPANNER suffers only a 20% performance degradation when the self-repairing architecture is significantly damaged at a fault density of 80%.

  6. Knowledge extraction from evolving spiking neural networks with rank order population coding.

    Science.gov (United States)

    Soltic, Snjezana; Kasabov, Nikola

    2010-12-01

    This paper demonstrates how knowledge can be extracted from evolving spiking neural networks with rank order population coding. Knowledge discovery is a very important feature of intelligent systems. Yet, a disproportionately small amount of research is centered on the issue of knowledge extraction from spiking neural networks, which are considered to be the third generation of artificial neural networks. The lack of knowledge representation compatibility is becoming a major detriment to end users of these networks. We show that high-level knowledge can be obtained from evolving spiking neural networks. More specifically, we propose a method for fuzzy rule extraction from an evolving spiking network with rank order population coding. The proposed method was used for knowledge discovery on two benchmark taste recognition problems, where the knowledge learnt by an evolving spiking neural network was extracted in the form of zero-order Takagi-Sugeno fuzzy IF-THEN rules.

  7. A case for spiking neural network simulation based on configurable multiple-FPGA systems.

    Science.gov (United States)

    Yang, Shufan; Wu, Qiang; Li, Renfa

    2011-09-01

    Recent neuropsychological research has begun to reveal that neurons encode information in the timing of spikes. Spiking neural network simulations are a flexible and powerful method for investigating the behaviour of neuronal systems. However, software simulation of spiking neural networks cannot rapidly generate output spikes for large-scale networks. An alternative approach, hardware implementation of such systems, offers the possibility of generating independent spikes precisely and simultaneously outputting spike waves in real time, provided that the spiking neural network can take full advantage of the inherent parallelism of the hardware. In this work we introduce a configurable FPGA-oriented hardware platform for spiking neural network simulation. We aim to use this platform to combine the speed of dedicated hardware with the programmability of software, so that it might allow neuroscientists to put together sophisticated computational experiments with their own models. A feed-forward hierarchical network is developed as a case study to describe the operation of biological neural systems (such as orientation selectivity in visual cortex) and computational models of such systems. This model demonstrates how a feed-forward neural network constructs the circuitry required for orientation selectivity and provides a platform for reaching a deeper understanding of the primate visual system. In the future, larger-scale models based on this framework can be used to replicate the actual architecture of visual cortex, leading to more detailed predictions and insights into visual perception phenomena.

  8. Neuromorphic implementations of neurobiological learning algorithms for spiking neural networks.

    Science.gov (United States)

    Walter, Florian; Röhrbein, Florian; Knoll, Alois

    2015-12-01

    The application of biologically inspired methods in design and control has a long tradition in robotics. Unlike previous approaches in this direction, the emerging field of neurorobotics not only mimics biological mechanisms at a relatively high level of abstraction but employs highly realistic simulations of actual biological nervous systems. Even today, carrying out these simulations efficiently at appropriate timescales is challenging. Neuromorphic chip designs specially tailored to this task therefore offer an interesting perspective for neurorobotics. Unlike Von Neumann CPUs, these chips cannot be simply programmed with a standard programming language. Like real brains, their functionality is determined by the structure of neural connectivity and synaptic efficacies. Enabling higher cognitive functions for neurorobotics consequently requires the application of neurobiological learning algorithms to adjust synaptic weights in a biologically plausible way. In this paper, we therefore investigate how to program neuromorphic chips by means of learning. First, we provide an overview over selected neuromorphic chip designs and analyze them in terms of neural computation, communication systems and software infrastructure. On the theoretical side, we review neurobiological learning techniques. Based on this overview, we then examine on-die implementations of these learning algorithms on the considered neuromorphic chips. A final discussion puts the findings of this work into context and highlights how neuromorphic hardware can potentially advance the field of autonomous robot systems. The paper thus gives an in-depth overview of neuromorphic implementations of basic mechanisms of synaptic plasticity which are required to realize advanced cognitive capabilities with spiking neural networks. Copyright © 2015 Elsevier Ltd. All rights reserved.

  9. Weighted spiking neural P systems with structural plasticity working in sequential mode based on maximum spike number

    Science.gov (United States)

    Sun, Mingming; Qu, Jianhua

    2017-10-01

    Spiking neural P systems (SNP systems, in short) are a group of parallel and distributed computing devices inspired by the function and structure of spiking neurons. Recently, a new variant of SNP systems, called SNP systems with structural plasticity (SNPSP systems, in short), was proposed. In SNPSP systems, neurons can use plasticity rules to create and delete synapses. In this work, we consider sequentiality restrictions on SNPSP systems: (1) the neuron with the maximum number of spikes is chosen to fire; (2) weighted synapses are used. Specifically, we investigate the computational power of weighted SNPSP systems working in the sequential mode based on maximum spike number (WSNPSPM systems, in short), and we prove that SNPSP systems with these new restrictions are universal as generating devices.

  10. A stimulus-dependent spike threshold is an optimal neural coder

    Directory of Open Access Journals (Sweden)

    Douglas L Jones

    2015-06-01

    Full Text Available A neural code based on sequences of spikes can consume a significant portion of the brain’s energy budget. Thus, energy considerations would dictate that spiking activity be kept as low as possible. However, a high spike-rate improves the coding and representation of signals in spike trains, particularly in sensory systems. These are competing demands, and selective pressure has presumably worked to optimize coding by apportioning a minimum number of spikes so as to maximize coding fidelity. The mechanisms by which a neuron generates spikes while maintaining a fidelity criterion are not known. Here, we show that a signal-dependent neural threshold, similar to a dynamic or adapting threshold, optimizes the trade-off between spike generation (encoding) and fidelity (decoding). The threshold mimics a post-synaptic membrane (a low-pass filter) and serves as an internal decoder. Further, it sets the average firing rate (the energy constraint). The decoding process provides an internal copy of the coding error to the spike-generator which emits a spike when the error equals or exceeds a spike threshold. When optimized, the trade-off leads to a deterministic spike firing-rule that generates optimally timed spikes so as to maximize fidelity. The optimal coder is derived in closed-form in the limit of high spike-rates, when the signal can be approximated as a piece-wise constant signal. The predicted spike-times are close to those obtained experimentally in the primary electrosensory afferent neurons of weakly electric fish (Apteronotus leptorhynchus) and pyramidal neurons from the somatosensory cortex of the rat. We suggest that KCNQ/Kv7 channels (underlying the M-current) are good candidates for the decoder. They are widely coupled to metabolic processes and do not inactivate. We conclude that the neural threshold is optimized to generate an energy-efficient and high-fidelity neural code.
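
    The encode/decode loop described above can be sketched directly: a leaky (low-pass) filter reconstructs the signal from past spikes, and a spike is emitted whenever the coding error reaches a threshold. The time constant, threshold and test signal below are illustrative, not the values derived in the paper.

```python
import numpy as np

def threshold_coder(signal, dt=1e-4, tau=0.01, theta=0.05):
    """Emit a spike when (signal - decoded estimate) >= theta; decode with a leaky (low-pass) filter."""
    decoded, spikes = 0.0, []
    estimate = np.empty_like(signal)
    for i, s in enumerate(signal):
        decoded += dt / tau * (-decoded)          # leaky decay of the internal reconstruction
        if s - decoded >= theta:                  # coding error reached the spike threshold
            decoded += theta                      # each spike adds one quantum to the estimate
            spikes.append(i * dt)
        estimate[i] = decoded
    return np.array(spikes), estimate

t = np.arange(0, 0.5, 1e-4)
x = 0.5 * (1 + np.sin(2 * np.pi * 4 * t))         # slowly varying, positive input signal
spike_times, x_hat = threshold_coder(x)
print(len(spike_times), "spikes; reconstruction RMSE:",
      np.sqrt(np.mean((x - x_hat) ** 2)).round(3))
```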

  11. A stationary wavelet transform and a time-frequency based spike detection algorithm for extracellular recorded data

    Science.gov (United States)

    Lieb, Florian; Stark, Hans-Georg; Thielemann, Christiane

    2017-06-01

    Objective. Spike detection from extracellular recordings is a crucial preprocessing step when analyzing neuronal activity. The decision of whether a specific part of the signal is a spike or not is important for all subsequent preprocessing steps, such as spike sorting or burst detection, in order to reduce the number of erroneously identified spikes. Many spike detection algorithms have already been suggested, all working reasonably well whenever the signal-to-noise ratio is large enough. When the noise level is high, however, these algorithms have a poor performance. Approach. In this paper we present two new spike detection algorithms. The first is based on a stationary wavelet energy operator and the second is based on the time-frequency representation of spikes. Both algorithms are more reliable than the most commonly used methods. Main results. The performance of the algorithms is confirmed using simulated data resembling original data recorded from cortical neurons with multielectrode arrays. In order to demonstrate that the performance of the algorithms is not restricted to one specific set of data, we also verify the performance using a simulated, publicly available data set. We show that both proposed algorithms have the best performance among all tested methods, regardless of the signal-to-noise ratio, in both data sets. Significance. This contribution will benefit electrophysiological investigations of human cells. In particular, the spatial and temporal analysis of neural network communication is improved by using the proposed spike detection algorithms.

  12. A non-parametric Bayesian approach for clustering and tracking non-stationarities of neural spikes.

    Science.gov (United States)

    Shalchyan, Vahid; Farina, Dario

    2014-02-15

    Neural spikes from multiple neurons recorded in a multi-unit signal are usually separated by clustering. Drifts in the position of the recording electrode relative to the neurons over time cause gradual changes in the position and shapes of the clusters, challenging the clustering task. Bayesian tracking of the clusters based on a Gaussian cluster model, with the data divided into short time intervals, has been proposed previously. However, the Gaussian cluster model is often not verified for neural spikes. We present a Bayesian clustering approach that makes no assumptions on the distribution of the clusters and uses kernel-based density estimation of the clusters in every time interval as a prior for Bayesian classification of the data in the subsequent time interval. The proposed method was tested and compared to the Gaussian model-based approach for cluster tracking using both simulated and experimental datasets. The results showed that the proposed non-parametric kernel-based density estimation of the clusters outperformed sequential Gaussian model fitting in both the simulated and the experimental data tests. Using non-parametric kernel-density-based clustering that makes no assumptions on the distribution of the clusters enhances the ability to track cluster non-stationarity over time compared with the Gaussian cluster modeling approach. Copyright © 2013 Elsevier B.V. All rights reserved.
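
    The kernel-density idea can be sketched with scipy's gaussian_kde: densities estimated from the clusters of one time interval serve as class-conditional likelihoods for assigning the spikes of the next interval. The drift model, equal priors and feature data below are invented for the illustration and do not reproduce the full Bayesian tracking scheme of the paper.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(0)

def make_interval(centers, n=300, spread=0.4):
    """Two 2-D spike-feature clusters with given centers (e.g. first two principal components)."""
    pts = [rng.normal(c, spread, size=(n, 2)) for c in centers]
    labels = np.repeat(np.arange(len(centers)), n)
    return np.vstack(pts), labels

# Interval 1: known clusters. Interval 2: the same clusters after a slow electrode drift.
X1, y1 = make_interval(centers=[(0, 0), (3, 3)])
X2, y2 = make_interval(centers=[(0.4, 0.2), (3.3, 3.4)])

# Fit a non-parametric density per cluster on interval 1 (gaussian_kde expects shape (d, n)).
kdes = [gaussian_kde(X1[y1 == k].T) for k in np.unique(y1)]

# Classify interval-2 spikes by maximum estimated density (equal priors assumed).
densities = np.vstack([kde(X2.T) for kde in kdes])       # shape (n_clusters, n_spikes)
assignments = densities.argmax(axis=0)
print("tracking accuracy on the drifted interval:", (assignments == y2).mean())
```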

  13. A Hardware-Efficient Scalable Spike Sorting Neural Signal Processor Module for Implantable High-Channel-Count Brain Machine Interfaces.

    Science.gov (United States)

    Yang, Yuning; Boling, Sam; Mason, Andrew J

    2017-08-01

    Next-generation brain machine interfaces demand a high-channel-count neural recording system to wirelessly monitor activities of thousands of neurons. A hardware efficient neural signal processor (NSP) is greatly desirable to ease the data bandwidth bottleneck for a fully implantable wireless neural recording system. This paper demonstrates a complete multichannel spike sorting NSP module that incorporates all of the necessary spike detector, feature extractor, and spike classifier blocks. To meet high-channel-count and implantability demands, each block was designed to be highly hardware efficient and scalable while sharing resources efficiently among multiple channels. To process multiple channels in parallel, scalability analysis was performed, and the utilization of each block was optimized according to its input data statistics and the power, area and/or speed of each block. Based on this analysis, a prototype 32-channel spike sorting NSP scalable module was designed and tested on an FPGA using synthesized datasets over a wide range of signal to noise ratios. The design was mapped to 130 nm CMOS to achieve 0.75 μW power and 0.023 mm2 area consumptions per channel based on post synthesis simulation results, which permits scalability of digital processing to 690 channels on a 4×4 mm2 electrode array.

  14. NeuroFlow: A General Purpose Spiking Neural Network Simulation Platform using Customizable Processors

    National Research Council Canada - National Science Library

    Cheung, Kit; Schultz, Simon R; Luk, Wayne

    2015-01-01

    NeuroFlow is a scalable spiking neural network simulation platform for off-the-shelf high performance computing systems using customizable hardware processors such as Field-Programmable Gate Arrays (FPGAs...

  15. An FPGA Implementation of a Polychronous Spiking Neural Network with Delay Adaptation

    Science.gov (United States)

    Wang, Runchun; Cohen, Gregory; Stiefel, Klaus M.; Hamilton, Tara Julia; Tapson, Jonathan; van Schaik, André

    2013-01-01

    We present an FPGA implementation of a re-configurable, polychronous spiking neural network with a large capacity for spatial-temporal patterns. The proposed neural network generates delay paths de novo, so that only connections that actually appear in the training patterns will be created. This allows the proposed network to use all the axons (variables) to store information. Spike Timing Dependent Delay Plasticity is used to fine-tune and add dynamics to the network. We use a time multiplexing approach allowing us to achieve 4096 (4k) neurons and up to 1.15 million programmable delay axons on a Virtex 6 FPGA. Test results show that the proposed neural network is capable of successfully recalling more than 95% of all spikes for 96% of the stored patterns. The tests also show that the neural network is robust to noise from random input spikes. PMID:23408739

  16. Spotting neural spike patterns using an adversary background model.

    Science.gov (United States)

    Gat, I; Tishby, N

    2001-12-01

    The detection of a specific stochastic pattern embedded in an unknown background noise is a difficult pattern recognition problem, encountered in many applications such as word spotting in speech. A similar problem emerges when trying to detect a multineural spike pattern in a single electrical recording, embedded in the complex cortical activity of a behaving animal. Solving this problem is crucial for the identification of neuronal code words with specific meaning. The technical difficulty of this detection is due to the lack of a good statistical model for the background activity, which rapidly changes with the recording conditions and the activity of the animal. This work introduces the use of an adversary background model. This model assumes that the background "knows" the pattern sought, up to first-order statistics, and this "knowledge" creates a background composed of all the permutations of the pattern. We show that this background model is tightly connected to the type-based information-theoretic approach. Furthermore, we show that computing the likelihood ratio amounts to decomposing the log-likelihood distribution according to the types of the empirical counts. We demonstrate the application of this method to the detection of reward patterns in the basal ganglia of behaving monkeys, yielding some unexpected biological results.

  17. Neural recording and modulation technologies

    Science.gov (United States)

    Chen, Ritchie; Canales, Andres; Anikeeva, Polina

    2017-01-01

    In the mammalian nervous system, billions of neurons connected by quadrillions of synapses exchange electrical, chemical and mechanical signals. Disruptions to this network manifest as neurological or psychiatric conditions. Despite decades of neuroscience research, our ability to treat or even to understand these conditions is limited by the capability of tools to probe the signalling complexity of the nervous system. Although orders of magnitude smaller and computationally faster than neurons, conventional substrate-bound electronics do not recapitulate the chemical and mechanical properties of neural tissue. This mismatch results in a foreign-body response and the encapsulation of devices by glial scars, suggesting that the design of an interface between the nervous system and a synthetic sensor requires additional materials innovation. Advances in genetic tools for manipulating neural activity have fuelled the demand for devices that are capable of simultaneously recording and controlling individual neurons at unprecedented scales. Recently, flexible organic electronics and bio- and nanomaterials have been developed for multifunctional and minimally invasive probes for long-term interaction with the nervous system. In this Review, we discuss the design lessons from the quarter-century-old field of neural engineering, highlight recent materials-driven progress in neural probes and look at emergent directions inspired by the principles of neural transduction.

  18. Stochastic spike synchronization in a small-world neural network with spike-timing-dependent plasticity.

    Science.gov (United States)

    Kim, Sang-Yoon; Lim, Woochang

    2018-01-01

    We consider the Watts-Strogatz small-world network (SWN) consisting of subthreshold neurons which exhibit noise-induced spikings. This neuronal network has adaptive dynamic synaptic strengths governed by the spike-timing-dependent plasticity (STDP). In previous works without STDP, stochastic spike synchronization (SSS) between noise-induced spikings of subthreshold neurons was found to occur in a range of intermediate noise intensities. Here, we investigate the effect of additive STDP on the SSS by varying the noise intensity. The occurrence of a "Matthew" effect in synaptic plasticity is found, due to a positive feedback process. As a result, good synchronization gets better via long-term potentiation of synaptic strengths, while bad synchronization gets worse via long-term depression. The emergence of long-term potentiation and long-term depression of synaptic strengths is investigated intensively via microscopic studies based on the pair-correlations between the pre- and the post-synaptic IISRs (instantaneous individual spike rates) as well as the distributions of time delays between the pre- and the post-synaptic spike times. Furthermore, the effects of multiplicative STDP (which depends on states) on the SSS are studied and discussed in comparison with the case of additive STDP (independent of states). These effects of STDP on the SSS in the SWN are also compared with those in the regular lattice and the random graph. Copyright © 2017 Elsevier Ltd. All rights reserved.
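
    A minimal sketch of an additive pair-based STDP update of the kind referenced above is given below (potentiation when a presynaptic spike precedes a postsynaptic one, depression otherwise, with hard bounds on the weight). The amplitudes and time constants are illustrative, not those used in the study.

```python
import numpy as np

def additive_stdp(pre_times, post_times, w0=0.5,
                  a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0,
                  w_min=0.0, w_max=1.0):
    """Update one synaptic weight from all-to-all pre/post spike pairings (times in ms)."""
    w = w0
    for t_post in post_times:
        for t_pre in pre_times:
            dt = t_post - t_pre
            if dt > 0:                                     # pre before post -> potentiation (LTP)
                w += a_plus * np.exp(-dt / tau_plus)
            elif dt < 0:                                   # post before pre -> depression (LTD)
                w -= a_minus * np.exp(dt / tau_minus)
    return float(np.clip(w, w_min, w_max))                 # additive rule: hard bounds on the weight

pre = np.array([10.0, 60.0, 110.0])
post_correlated = pre + 5.0                                # post lags pre by 5 ms -> net potentiation
post_anticorrelated = pre - 5.0                            # post leads pre by 5 ms -> net depression
print(additive_stdp(pre, post_correlated), additive_stdp(pre, post_anticorrelated))
```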

  19. A 0.7 V, 40 nW Compact, Current-Mode Neural Spike Detector in 65 nm CMOS.

    Science.gov (United States)

    Yao, Enyi; Chen, Yi; Basu, Arindam

    2016-04-01

    In this paper, we describe a novel low power, compact, current-mode spike detector circuit for real-time neural recording systems where neural spikes or action potentials (AP) are of interest. Such a circuit can enable massive compression of data, facilitating wireless transmission. This design can generate a high signal-to-noise ratio (SNR) output by approximating the popularly used nonlinear energy operator (NEO) through standard analog blocks. We show that a low pass filter after the NEO can be used for two functions: (i) estimate and cancel low frequency interference and (ii) estimate the threshold for spike detection. The circuit is implemented in a 65 nm CMOS process and occupies 200 μm × 150 μm of chip area. Operating from a 0.7 V power supply, it consumes about 30 nW of static power and 7 nW of dynamic power for a 100 Hz input spike rate, making it the lowest power consuming spike detector reported so far.
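
    The nonlinear energy operator approximated by this circuit has a simple discrete form, psi[n] = x[n]^2 - x[n-1]*x[n+1]. The software sketch below applies NEO-based detection with a low-pass-smoothed output and a mean-based threshold; the smoothing window, threshold factor and synthetic data are assumptions, and the chip's analog implementation is of course different.

```python
import numpy as np

def neo(x):
    """Discrete nonlinear energy operator: psi[n] = x[n]**2 - x[n-1]*x[n+1]."""
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

def neo_detect(x, smooth=8, k=8.0):
    """Smooth the NEO output (low-pass) and flag excursions above k times its mean."""
    window = np.bartlett(smooth)
    psi = np.convolve(neo(x), window / window.sum(), mode='same')
    above = psi > k * psi.mean()
    return np.flatnonzero(above[1:] & ~above[:-1])        # rising edges = detected events

rng = np.random.default_rng(2)
x = rng.normal(0.0, 1.0, 20_000)                          # unit-variance background noise
for t0 in rng.integers(100, 19_800, 25):                  # inject 25 short biphasic test spikes
    x[t0:t0 + 8] += 6 * np.sin(np.linspace(0, 2 * np.pi, 8))
print("detected events:", len(neo_detect(x)))
```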

  20. Point process modeling and estimation: Advances in the analysis of dynamic neural spiking data

    Science.gov (United States)

    Deng, Xinyi

    2016-08-01

    population spiking data. Lastly, we proposed a general three-step paradigm that allows us to relate behavioral outcomes of various tasks to simultaneously recorded neural activity across multiple brain areas, which is a step towards closed-loop therapies for psychological diseases using real-time neural stimulation. These methods are suitable for real-time implementation for content-based feedback experiments.

  1. Adaptive inverse control of neural spatiotemporal spike patterns with a reproducing kernel Hilbert space (RKHS) framework.

    Science.gov (United States)

    Li, Lin; Park, Il Memming; Brockmeier, Austin; Chen, Badong; Seth, Sohan; Francis, Joseph T; Sanchez, Justin C; Príncipe, José C

    2013-07-01

    The precise control of spiking in a population of neurons via applied electrical stimulation is a challenge due to the sparseness of spiking responses and neural system plasticity. We pose neural stimulation as a system control problem where the system input is a multidimensional time-varying signal representing the stimulation, and the output is a set of spike trains; the goal is to drive the output such that the elicited population spiking activity is as close as possible to some desired activity, where closeness is defined by a cost function. If the neural system can be described by a time-invariant (homogeneous) model, then offline procedures can be used to derive the control procedure; however, for arbitrary neural systems this is not tractable. Furthermore, standard control methodologies are not suited to directly operate on spike trains that represent both the target and elicited system response. In this paper, we propose a multiple-input multiple-output (MIMO) adaptive inverse control scheme that operates on spike trains in a reproducing kernel Hilbert space (RKHS). The control scheme uses an inverse controller to approximate the inverse of the neural circuit. The proposed control system takes advantage of the precise timing of the neural events by using a Schoenberg kernel defined directly in the space of spike trains. The Schoenberg kernel maps the spike train to an RKHS and allows a linear algorithm to control the nonlinear neural system without the danger of converging to local minima. During operation, the adaptation of the controller minimizes a difference defined in the spike train RKHS between the system and the target response and keeps the inverse controller close to the inverse of the current neural circuit, which enables adaptation to neural perturbations. The results on a realistic synthetic neural circuit show that the inverse controller based on the Schoenberg kernel outperforms the decoding accuracy of other models based on the conventional rate

  2. Extraction and characterization of essential discharge patterns from multisite recordings of spiking ongoing activity.

    Directory of Open Access Journals (Sweden)

    Riccardo Storchi

    Full Text Available BACKGROUND: Neural activation patterns often proceed by schemes or motifs distributed across the involved cortical networks. Because neurons are correlated, estimating all possible dependencies quickly becomes intractable. The complex nesting of different oscillation frequencies and their strong non-stationarity further hamper any quantitative evaluation of spiking network activities. The problem is exacerbated by the intrinsic variability of neural patterns. METHODOLOGY/PRINCIPAL FINDINGS: Our technique introduces two important novelties and makes it possible to isolate essential patterns in larger sets of spiking neurons and brain activity regimes. First, the sampling procedure over N units is based on a fixed spike number k in order to detect N-dimensional arrays (k-sequences) whose sum over all dimensions is k. Then, k-sequence variability is greatly reduced by a hierarchical separative clustering that assigns large numbers of distinct k-sequences to a few classes. Iterative separations are stopped when the size of each cluster falls below a certain threshold. As threshold tuning critically impacts the number of classes extracted, we developed an effective cost criterion to select the shortest possible description of our dataset. Finally, we describe three indices (C, S, R) to evaluate the average pattern complexity, the structure of essential classes and their stability in time. CONCLUSIONS/SIGNIFICANCE: We validated this algorithm with four kinds of surrogate activity, ranging from random to very regularly patterned. We then characterized a selection of ongoing activity recordings. Using the S index we identified unstable, moderately stable and strongly stable patterns, while the C and R indices evidenced their non-random structure. Our algorithm appears able to extract interesting and non-trivial spatial dynamics from multisource neuronal recordings of ongoing and potentially stimulated activity. Combined with time-frequency analysis of
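
    To make the fixed-spike-number sampling concrete, the sketch below cuts a merged population spike train into windows of k consecutive spikes and counts, per unit, how many of those k spikes each neuron contributed, yielding N-dimensional arrays that sum to k. Non-overlapping windows and the function names are simplifying assumptions; the clustering and cost-criterion stages of the paper are not shown.

```python
import numpy as np

def k_sequences(spike_units, spike_times, n_units, k=10):
    """Per-unit spike counts over windows of k consecutive population spikes."""
    order = np.argsort(spike_times)            # merge spikes in time order
    units = np.asarray(spike_units)[order]
    seqs = []
    for start in range(0, len(units) - k + 1, k):   # non-overlapping windows
        window = units[start:start + k]
        counts = np.bincount(window, minlength=n_units)
        seqs.append(counts)                         # each counts vector sums to k
    return np.array(seqs)

rng = np.random.default_rng(1)
units = rng.integers(0, 5, size=200)        # 200 spikes from 5 units
times = np.sort(rng.uniform(0, 10, size=200))
print(k_sequences(units, times, n_units=5, k=10)[:2])
```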

  3. On the Computational Power of Spiking Neural P Systems with Self-Organization

    Science.gov (United States)

    Wang, Xun; Song, Tao; Gong, Faming; Zheng, Pan

    2016-06-01

    Neural-like computing models are versatile computing mechanisms in the field of artificial intelligence. Spiking neural P systems (SN P systems for short) are one of the recently developed spiking neural network models inspired by the way neurons communicate. Communication among neurons is essentially achieved by spikes, i.e., short electrical pulses. In terms of motivation, SN P systems fall into the third generation of neural network models. In this study, a novel variant of SN P systems, namely SN P systems with self-organization, is introduced, and the computational power of the system is investigated and evaluated. It is proved that SN P systems with self-organization are capable of computing and accepting the family of sets of Turing-computable natural numbers. Moreover, with 87 neurons the system can compute any Turing-computable recursive function, thus achieving Turing universality. These results are promising initial steps toward solving an open problem raised by Gh. Păun.

  4. Synthesis of neural networks for spatio-temporal spike pattern recognition and processing

    Directory of Open Access Journals (Sweden)

    Jonathan C Tapson

    2013-08-01

    Full Text Available The advent of large-scale neural computational platforms has highlighted the lack of algorithms for synthesizing neural structures to perform predefined cognitive tasks. The Neural Engineering Framework offers one such synthesis, but it is most effective for a spike-rate representation of neural information, and it requires a large number of neurons to implement simple functions. We describe a neural network synthesis method that generates synaptic connectivity for neurons which process time-encoded neural signals, and which makes very sparse use of neurons. The method allows the user to specify arbitrary neuronal characteristics, such as axonal and dendritic delays and synaptic transfer functions, and then solves for the optimal input-output relationship using computed dendritic weights. The method may be used for batch or online learning and has an extremely fast optimization process. We demonstrate its use in generating a network to recognize speech which is sparsely encoded as spike times.

  5. Testing of information condensation in a model reverberating spiking neural network.

    Science.gov (United States)

    Vidybida, Alexander

    2011-06-01

    Information about the external world is delivered to the brain in the form of temporally structured spike trains. During further processing in higher areas, this information is subjected to a certain condensation process, which results in the formation of abstract conceptual images of the external world, apparently represented as fairly uniform spiking activity that is partially independent of the details of the input spike trains. A possible physical mechanism of condensation at the level of an individual neuron was discussed recently. In a reverberating spiking neural network, this mechanism should cause the dynamics to settle into the same uniform/periodic activity in response to a set of various inputs. Since the same periodic activity may correspond to different input spike trains, we interpret this as a possible candidate for an information condensation mechanism in a network. Our purpose is to test this possibility in a network model consisting of five fully connected neurons, in particular the influence of the network's geometric size on its ability to condense information. The dynamics of 20 spiking neural networks of different geometric sizes are modelled by computer simulation. Each network was propelled into reverberating dynamics by applying various initial input spike trains, and the dynamics were run until they became periodic. Shannon's formula is used to calculate the amount of information in any input spike train and in any periodic state found. As a result, we obtain an explicit estimate of the degree of information condensation in the networks, and conclude that it depends strongly on the network's geometric size.
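
    The information estimate referred to in the abstract is the standard Shannon entropy H = -sum_i p_i log2 p_i over the observed patterns. The sketch below applies it to a toy case where many distinct inputs collapse onto a few periodic states, so the entropy drop serves as a rough measure of condensation; the symbol encoding is an assumption for illustration, not the paper's encoding of spike trains.

```python
import math
from collections import Counter

def shannon_entropy(symbols):
    """Shannon entropy (bits) of a sequence of discrete symbols."""
    counts = Counter(symbols)
    n = len(symbols)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Toy example: 16 distinct input patterns (4 bits) collapsing onto
# only 2 periodic attractor states (1 bit) -> strong condensation.
inputs = [f"in{i}" for i in range(16)]
attractors = ["periodic_A", "periodic_B"] * 8
print(shannon_entropy(inputs), shannon_entropy(attractors))
```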

  6. A 128-Channel FPGA-Based Real-Time Spike-Sorting Bidirectional Closed-Loop Neural Interface System.

    Science.gov (United States)

    Park, Jongkil; Kim, Gookhwa; Jung, Sang-Don

    2017-12-01

    A multichannel neural interface system is an important tool for various types of neuroscientific studies. For an electrical interface with a biological system, high-precision, high-speed data recording and various types of stimulation capability are required. In addition, real-time signal processing is an important feature in the implementation of a real-time closed-loop system without substantial unwanted delay in the feedback stimulation. Online spike sorting, the process of assigning neural spikes to an identified group of neurons or clusters, is a necessary step for closing the loop in real time, but massive memory requirements commonly limit hardware implementations. Here, we present a 128-channel field-programmable gate array (FPGA)-based real-time closed-loop bidirectional neural interface system. The system supports 128 channels for simultaneous signal recording and eight selectable channels for stimulation. A modular 64-channel analog front-end (AFE) provides scalability, and the parameterized specification of the AFE supports the recording of various electrophysiological signal types with 1.59 ± 0.76 root-mean-square noise. The stimulator supports both voltage-controlled and current-controlled arbitrarily shaped waveforms with programmable pulse amplitude and duration. An empirical algorithm for online real-time spike sorting is implemented in an FPGA. Spike sorting is performed by template matching, and templates are created by an online real-time unsupervised learning process. A memory-saving technique, called dynamic cache organizing, is proposed to reduce the memory requirement to 6 kbit per channel, and the modular implementation improves scalability for further extensions.
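
    The template-matching step can be illustrated in a few lines: each detected waveform is assigned to the nearest stored template by Euclidean distance, with a distance ceiling for unclassified events. This is only the matching idea; the paper's online unsupervised template learning and dynamic cache organizing are not reproduced here, and max_dist is an arbitrary placeholder.

```python
import numpy as np

def match_spike(waveform, templates, max_dist=2.0):
    """Assign a spike waveform to the closest template (Euclidean distance)."""
    dists = [np.linalg.norm(waveform - t) for t in templates]
    best = int(np.argmin(dists))
    return best if dists[best] <= max_dist else -1   # -1: unclassified

templates = [np.sin(np.linspace(0, np.pi, 32)),       # unit 0
             -np.sin(np.linspace(0, np.pi, 32))]      # unit 1
spike = templates[1] + 0.1 * np.random.randn(32)
print(match_spike(spike, templates))
```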

  7. Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons.

    Directory of Open Access Journals (Sweden)

    Lars Buesing

    2011-11-01

    Full Text Available The organization of computations in networks of spiking neurons in the brain is still largely unknown, in particular in view of the inherently stochastic features of their firing activity and the experimentally observed trial-to-trial variability of neural systems in the brain. In principle there exists a powerful computational framework for stochastic computations, probabilistic inference by sampling, which can explain a large number of macroscopic experimental data in neuroscience and cognitive science. But it has turned out to be surprisingly difficult to create a link between these abstract models for stochastic computations and more detailed models of the dynamics of networks of spiking neurons. Here we create such a link and show that under some conditions the stochastic firing activity of networks of spiking neurons can be interpreted as probabilistic inference via Markov chain Monte Carlo (MCMC) sampling. Since common methods for MCMC sampling in distributed systems, such as Gibbs sampling, are inconsistent with the dynamics of spiking neurons, we introduce a different approach based on non-reversible Markov chains that is able to reflect inherent temporal processes of spiking neuronal activity through a suitable choice of random variables. We propose a neural network model and show by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, both for the case of discrete and continuous time. This provides a step towards closing the gap between abstract functional models of cortical computation and more detailed models of networks of spiking neurons.

  8. Neural dynamics as sampling: a model for stochastic computation in recurrent networks of spiking neurons.

    Science.gov (United States)

    Buesing, Lars; Bill, Johannes; Nessler, Bernhard; Maass, Wolfgang

    2011-11-01

    The organization of computations in networks of spiking neurons in the brain is still largely unknown, in particular in view of the inherently stochastic features of their firing activity and the experimentally observed trial-to-trial variability of neural systems in the brain. In principle there exists a powerful computational framework for stochastic computations, probabilistic inference by sampling, which can explain a large number of macroscopic experimental data in neuroscience and cognitive science. But it has turned out to be surprisingly difficult to create a link between these abstract models for stochastic computations and more detailed models of the dynamics of networks of spiking neurons. Here we create such a link and show that under some conditions the stochastic firing activity of networks of spiking neurons can be interpreted as probabilistic inference via Markov chain Monte Carlo (MCMC) sampling. Since common methods for MCMC sampling in distributed systems, such as Gibbs sampling, are inconsistent with the dynamics of spiking neurons, we introduce a different approach based on non-reversible Markov chains that is able to reflect inherent temporal processes of spiking neuronal activity through a suitable choice of random variables. We propose a neural network model and show by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, both for the case of discrete and continuous time. This provides a step towards closing the gap between abstract functional models of cortical computation and more detailed models of networks of spiking neurons.

  9. Neural control of computer cursor velocity by decoding motor cortical spiking activity in humans with tetraplegia

    Science.gov (United States)

    Kim, Sung-Phil; Simeral, John D.; Hochberg, Leigh R.; Donoghue, John P.; Black, Michael J.

    2008-12-01

    Computer-mediated connections between human motor cortical neurons and assistive devices promise to improve or restore lost function in people with paralysis. Recently, a pilot clinical study of an intracortical neural interface system demonstrated that a tetraplegic human was able to obtain continuous two-dimensional control of a computer cursor using neural activity recorded from his motor cortex. This control, however, was not sufficiently accurate for reliable use in many common computer control tasks. Here, we studied several central design choices for such a system including the kinematic representation for cursor movement, the decoding method that translates neuronal ensemble spiking activity into a control signal and the cursor control task used during training for optimizing the parameters of the decoding method. In two tetraplegic participants, we found that controlling a cursor's velocity resulted in more accurate closed-loop control than controlling its position directly and that cursor velocity control was achieved more rapidly than position control. Control quality was further improved over conventional linear filters by using a probabilistic method, the Kalman filter, to decode human motor cortical activity. Performance assessment based on standard metrics used for the evaluation of a wide range of pointing devices demonstrated significantly improved cursor control with velocity rather than position decoding. Disclosure. JPD is the Chief Scientific Officer and a director of Cyberkinetics Neurotechnology Systems (CYKN); he holds stock and receives compensation. JDS has been a consultant for CYKN. LRH receives clinical trial support from CYKN.
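
    A textbook Kalman filter applied to cursor-velocity decoding looks roughly like the sketch below: the hidden state is the 2-D velocity, the observation is a vector of binned firing rates, and the filter alternates prediction and measurement updates. All matrices and the toy tuning model are illustrative assumptions, not parameters fitted in the study.

```python
import numpy as np

def kalman_decode(rates, A, W, H, Q, x0, P0):
    """Decode 2-D cursor velocity from binned firing rates with a Kalman filter."""
    x, P = x0.copy(), P0.copy()
    out = []
    for z in rates:
        # Predict the next velocity state
        x = A @ x
        P = A @ P @ A.T + W
        # Update with the observed firing-rate vector z
        S = H @ P @ H.T + Q
        K = P @ H.T @ np.linalg.inv(S)
        x = x + K @ (z - H @ x)
        P = (np.eye(len(x)) - K @ H) @ P
        out.append(x.copy())
    return np.array(out)

# Toy setup: 4 "neurons" linearly tuned to a 2-D velocity.
rng = np.random.default_rng(0)
H = rng.normal(size=(4, 2))
A = 0.95 * np.eye(2)
true_v = np.cumsum(rng.normal(scale=0.1, size=(50, 2)), axis=0)
rates = true_v @ H.T + rng.normal(scale=0.2, size=(50, 4))
v_hat = kalman_decode(rates, A, 0.01 * np.eye(2), H, 0.04 * np.eye(4),
                      np.zeros(2), np.eye(2))
print(v_hat[-1])
```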

  10. Computational modeling of spiking neural network with learning rules from STDP and intrinsic plasticity

    Science.gov (United States)

    Li, Xiumin; Wang, Wei; Xue, Fangzheng; Song, Yongduan

    2018-02-01

    Recently there has been continuously increasing interest in building computational models of spiking neural networks (SNN), such as the Liquid State Machine (LSM). Biologically inspired self-organized neural networks with neural plasticity can enhance computational performance; their characteristic dynamical memory and recurrent connection cycles distinguish them from the more widely used feedforward neural networks. Although a variety of computational models for brain-like learning and information processing have been proposed, the modeling of self-organized neural networks with multiple forms of neural plasticity is still an important open challenge. The main difficulties lie in the interplay among different neural plasticity rules and in understanding how the structure and dynamics of neural networks shape computational performance. In this paper, we propose a novel approach to developing LSM models with a biologically inspired self-organizing network based on two neural plasticity learning rules. The connectivity among excitatory neurons is adapted by spike-timing-dependent plasticity (STDP) learning; meanwhile, the degree of neuronal excitability is regulated to maintain a moderate average activity level by another learning rule, intrinsic plasticity (IP). Our study shows that LSM with STDP+IP performs better than LSM with a random SNN or an SNN obtained by STDP alone. The noticeable improvement with the proposed method arises because the developed SNN model better reflects competition among neurons and, through its learning and self-organizing mechanism, encodes and processes the relevant dynamic information more effectively. This result gives insight into the optimization of computational models of spiking neural networks with neural plasticity.
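
    As a rough sketch of how the two plasticity rules could be combined, the snippet below implements a pair-based exponential STDP weight update together with a simple homeostatic intrinsic-plasticity rule that nudges a neuron's threshold toward a target firing rate. The amplitudes, time constant, target rate and function names are illustrative assumptions rather than the paper's settings.

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=0.02):
    """Pair-based STDP weight change for dt = t_post - t_pre (seconds)."""
    if dt > 0:    # pre before post: potentiation
        return a_plus * np.exp(-dt / tau)
    else:         # post before pre: depression
        return -a_minus * np.exp(dt / tau)

def intrinsic_plasticity(threshold, rate, target_rate=5.0, eta=1e-3):
    """Simple IP rule: raise the firing threshold if a neuron fires above
    its target rate, lower it otherwise (homeostatic regulation)."""
    return threshold + eta * (rate - target_rate)

print(stdp_dw(0.005), stdp_dw(-0.005), intrinsic_plasticity(1.0, 8.0))
```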

  11. A Spiking Neural Network Model of the Lateral Geniculate Nucleus on the SpiNNaker Machine

    Directory of Open Access Journals (Sweden)

    Basabdatta Sen-Bhattacharya

    2017-08-01

    Full Text Available We present a spiking neural network model of the thalamic Lateral Geniculate Nucleus (LGN) developed on SpiNNaker, a state-of-the-art digital neuromorphic hardware built with very-low-power ARM processors. The parallel, event-based data processing in SpiNNaker makes it viable for building massively parallel neuro-computational frameworks. The LGN model has 140 neurons representing a "basic building block" for larger modular architectures. The motivation of this work is to simulate biologically plausible LGN dynamics on SpiNNaker. The synaptic layout of the model is consistent with biology. The model response is validated against existing literature reporting entrainment in steady-state visually evoked potentials (SSVEP), brain oscillations corresponding to periodic visual stimuli recorded via electroencephalography (EEG). Periodic stimulus to the model is provided by: a synthetic spike train with inter-spike intervals in the range 10–50 Hz at a resolution of 1 Hz; and the spike-train output of a state-of-the-art electronic retina subjected to a light-emitting diode flashing at 10, 20, and 40 Hz, simulating a real-world visual stimulus to the model. The resolution of the simulation is 0.1 ms to ensure solution accuracy for the underlying differential equations defining Izhikevich's neuron model. Under this constraint, 1 s of model simulation time is executed in 10 s of real time on SpiNNaker, because simulations on SpiNNaker run in real time only for time-steps dt ⩾ 1 ms. The model output shows entrainment with both sets of input and contains harmonic components of the fundamental frequency. However, suppressing the feed-forward inhibition in the circuit produces subharmonics within the gamma band (>30 Hz), implying reduced information transmission fidelity. These model predictions agree with recent lumped-parameter computational model-based predictions obtained using conventional computers. Scalability of the framework is demonstrated by a multi
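
    The neuron model mentioned in the abstract is the standard two-variable Izhikevich model, dv/dt = 0.04v^2 + 5v + 140 - u + I and du/dt = a(bv - u), with a reset when v reaches 30 mV. The sketch below integrates it with a simple Euler step at dt = 0.1 ms; the regular-spiking parameters and the constant input are illustrative choices, not the LGN model's configuration.

```python
import numpy as np

def izhikevich(I, dt=0.1, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Simulate a single Izhikevich neuron (regular-spiking parameters).

    dt is in ms and I is an input-current array sampled at the same step.
    """
    v, u = -65.0, b * -65.0
    spikes = []
    for n, i_ext in enumerate(I):
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
        u += dt * a * (b * v - u)
        if v >= 30.0:              # spike: reset membrane and recovery variable
            spikes.append(n * dt)
            v, u = c, u + d
    return spikes

I = np.full(10000, 10.0)           # 1 s of constant input at dt = 0.1 ms
print(len(izhikevich(I)))
```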

  12. Integration of Continuous-Time Dynamics in a Spiking Neural Network Simulator.

    Science.gov (United States)

    Hahne, Jan; Dahmen, David; Schuecker, Jannis; Frommer, Andreas; Bolten, Matthias; Helias, Moritz; Diesmann, Markus

    2017-01-01

    Contemporary modeling approaches to the dynamics of neural networks include two important classes of models: biologically grounded spiking neuron models and functionally inspired rate-based units. We present a unified simulation framework that supports the combination of the two for multi-scale modeling, enables the quantitative validation of mean-field approaches by spiking network simulations, and provides an increase in reliability by usage of the same simulation code and the same network model specifications for both model classes. While most spiking simulations rely on the communication of discrete events, rate models require time-continuous interactions between neurons. Exploiting the conceptual similarity to the inclusion of gap junctions in spiking network simulations, we arrive at a reference implementation of instantaneous and delayed interactions between rate-based models in a spiking network simulator. The separation of rate dynamics from the general connection and communication infrastructure ensures flexibility of the framework. In addition to the standard implementation we present an iterative approach based on waveform-relaxation techniques to reduce communication and increase performance for large-scale simulations of rate-based models with instantaneous interactions. Finally we demonstrate the broad applicability of the framework by considering various examples from the literature, ranging from random networks to neural-field models. The study provides the prerequisite for interactions between rate-based and spiking models in a joint simulation.

  13. Structured chaos shapes spike-response noise entropy in balanced neural networks

    Directory of Open Access Journals (Sweden)

    Guillaume eLajoie

    2014-10-01

    Full Text Available Large networks of sparsely coupled excitatory and inhibitory cells occur throughout the brain. For many models of these networks, a striking feature is that their dynamics are chaotic and thus sensitive to small perturbations. How does this chaos manifest in the neural code? Specifically, how variable are the spike patterns that such a network produces in response to an input signal? To answer this, we derive a bound for a general measure of variability, the spike-train entropy. This leads to important insights into the variability of multi-cell spike pattern distributions in large recurrent networks of spiking neurons responding to fluctuating inputs. The analysis is based on results from random dynamical systems theory and is complemented by detailed numerical simulations. We find that the spike pattern entropy is an order of magnitude lower than what would be extrapolated from single cells. This holds despite the fact that network coupling becomes vanishingly sparse as network size grows, a phenomenon that depends on "extensive chaos", as previously discovered for balanced networks without stimulus drive. Moreover, we show how spike pattern entropy is controlled by temporal features of the inputs. Our findings provide insight into how neural networks may encode stimuli in the presence of inherently chaotic dynamics.

  14. Macroscopic phase-resetting curves for spiking neural networks

    Science.gov (United States)

    Dumont, Grégory; Ermentrout, G. Bard; Gutkin, Boris

    2017-10-01

    The study of brain rhythms is an open-ended, and challenging, subject of interest in neuroscience. One of the best tools for the understanding of oscillations at the single neuron level is the phase-resetting curve (PRC). Synchronization in networks of neurons, effects of noise on the rhythms, effects of transient stimuli on the ongoing rhythmic activity, and many other features can be understood by the PRC. However, most macroscopic brain rhythms are generated by large populations of neurons, and so far it has been unclear how the PRC formulation can be extended to these more common rhythms. In this paper, we describe a framework to determine a macroscopic PRC (mPRC) for a network of spiking excitatory and inhibitory neurons that generate a macroscopic rhythm. We take advantage of a thermodynamic approach combined with a reduction method to simplify the network description to a small number of ordinary differential equations. From this simplified but exact reduction, we can compute the mPRC via the standard adjoint method. Our theoretical findings are illustrated with and supported by numerical simulations of the full spiking network. Notably our mPRC framework allows us to predict the difference between effects of transient inputs to the excitatory versus the inhibitory neurons in the network.

  15. Spikes not slots: noise in neural populations limits working memory.

    Science.gov (United States)

    Bays, Paul M

    2015-08-01

    This opinion article argues that noise (randomness) in neural activity is the limiting factor in visual working memory (WM), determining how accurately we can maintain stable internal representations of external stimuli. Sharing of a fixed amount of neural activity between items in memory explains why WM can be successfully described as a continuous resource. This contrasts with the popular conception of WM as comprising a limited number of memory slots, each holding a representation of one stimulus; I argue that this view is challenged by computational theory and the latest neurophysiological evidence. Copyright © 2015 Elsevier Ltd. All rights reserved.

  16. Hybrid Spintronic-CMOS Spiking Neural Network with On-Chip Learning: Devices, Circuits, and Systems

    Science.gov (United States)

    Sengupta, Abhronil; Banerjee, Aparajita; Roy, Kaushik

    2016-12-01

    Over the past decade, spiking neural networks (SNNs) have emerged as one of the popular architectures to emulate the brain. In SNNs, information is temporally encoded and communication between neurons is accomplished by means of spikes. In such networks, spike-timing-dependent plasticity mechanisms require the online programming of synapses based on the temporal information of spikes transmitted by spiking neurons. In this work, we propose a spintronic synapse with decoupled spike-transmission and programming-current paths. The spintronic synapse consists of a ferromagnet-heavy-metal heterostructure in which the programming current through the heavy metal generates spin-orbit torque to modulate the device conductance. Low programming energy and fast programming times demonstrate the efficacy of the proposed device as a nanoelectronic synapse. We perform a simulation study based on an experimentally benchmarked device-simulation framework to demonstrate the interfacing of such spintronic synapses with CMOS neurons and learning circuits operating in the transistor subthreshold region to form a network of spiking neurons that can be utilized for pattern-recognition problems.

  17. Biological modelling of a computational spiking neural network with neuronal avalanches

    Science.gov (United States)

    Li, Xiumin; Chen, Qing; Xue, Fangzheng

    2017-05-01

    In recent years, an increasing number of studies have demonstrated that networks in the brain can self-organize into a critical state where dynamics exhibit a mixture of ordered and disordered patterns. This critical branching phenomenon is termed neuronal avalanches. It has been hypothesized that the homeostatic level balanced between stability and plasticity of this critical state may be the optimal state for performing diverse neural computational tasks. However, the critical region for high performance is narrow and sensitive for spiking neural networks (SNNs). In this paper, we investigated the role of the critical state in neural computations based on liquid-state machines, a biologically plausible computational neural network model for real-time computing. The computational performance of an SNN when operating at the critical state and, in particular, with spike-timing-dependent plasticity for updating synaptic weights is investigated. The network is found to show the best computational performance when it is subjected to critical dynamic states. Moreover, the active-neuron-dominant structure refined from synaptic learning can remarkably enhance the robustness of the critical state and further improve computational accuracy. These results may have important implications in the modelling of spiking neural networks with optimal computational performance. This article is part of the themed issue 'Mathematical methods in medicine: neuroscience, cardiology and pathology'.

  18. FPGA Implementation of Self-Organized Spiking Neural Network Controller for Mobile Robots

    Directory of Open Access Journals (Sweden)

    Fangzheng Xue

    2014-06-01

    Full Text Available A spiking neural network, a computational model that uses spikes to process information, is a good candidate for a mobile robot controller. In this paper, we present a novel mechanism for controlling mobile robots based on a self-organized spiking neural network (SOSNN) and introduce a method for FPGA implementation of this SOSNN. The spiking neuron we used is the Izhikevich model. A key feature of this controller is that it can simulate the processes of unconditioned reflex (avoiding obstacles using infrared sensor signals) and conditioned reflex (making the right choices in a multiple T-maze) via spike-timing-dependent plasticity (STDP) learning and dopamine-receptor modulation. Experimental results show that the proposed controller is effective and easy to implement. The FPGA implementation method aims to build up a specific network using generic blocks designed in the MATLAB Simulink environment. The main characteristics of this original solution are: on-chip learning algorithm implementation, high reconfiguration capability, and operation under real-time constraints. An extended analysis has been carried out on the hardware resources used to implement the whole SOSNN network, as well as each individual component block.

  19. An FPGA hardware/software co-design towards evolvable spiking neural networks for robotics application.

    Science.gov (United States)

    Johnston, S P; Prasad, G; Maguire, L; McGinnity, T M

    2010-12-01

    This paper presents an approach that permits the effective hardware realization of a novel Evolvable Spiking Neural Network (ESNN) paradigm on Field Programmable Gate Arrays (FPGAs). The ESNN possesses a hybrid learning algorithm that consists of a Spike Timing Dependent Plasticity (STDP) mechanism fused with a Genetic Algorithm (GA). The design and implementation direction utilizes the latest advancements in FPGA technology to provide a partitioned hardware/software co-design solution. The approach achieves the maximum FPGA flexibility obtainable for the ESNN paradigm. The algorithm was applied as an embedded intelligent system robotic controller to solve an autonomous navigation and obstacle avoidance problem.

  20. Neural coordination can be enhanced by occasional interruption of normal firing patterns: a self-optimizing spiking neural network model.

    Science.gov (United States)

    Woodward, Alexander; Froese, Tom; Ikegami, Takashi

    2015-02-01

    The state space of a conventional Hopfield network typically exhibits many different attractors of which only a small subset satisfies constraints between neurons in a globally optimal fashion. It has recently been demonstrated that combining Hebbian learning with occasional alterations of normal neural states avoids this problem by means of self-organized enlargement of the best basins of attraction. However, so far it is not clear to what extent this process of self-optimization is also operative in real brains. Here we demonstrate that it can be transferred to more biologically plausible neural networks by implementing a self-optimizing spiking neural network model. In addition, by using this spiking neural network to emulate a Hopfield network with Hebbian learning, we attempt to make a connection between rate-based and temporal coding based neural systems. Although further work is required to make this model more realistic, it already suggests that the efficacy of the self-optimizing process is independent from the simplifying assumptions of a conventional Hopfield network. We also discuss natural and cultural processes that could be responsible for occasional alteration of neural firing patterns in actual brains. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Force sensor in simulated skin and neural model mimic tactile SAI afferent spiking response to ramp and hold stimuli

    Directory of Open Access Journals (Sweden)

    Kim Elmer K

    2012-07-01

    Full Text Available Abstract Background The next generation of prosthetic limbs will restore sensory feedback to the nervous system by mimicking how skin mechanoreceptors, innervated by afferents, produce trains of action potentials in response to compressive stimuli. Prior work has addressed building sensors within skin substitutes for robotics, modeling skin mechanics and neural dynamics of mechanotransduction, and predicting response timing of action potentials for vibration. The effort here is unique because it accounts for skin elasticity by measuring force within simulated skin, utilizes few free model parameters for parsimony, and separates parameter fitting and model validation. Additionally, the ramp-and-hold, sustained stimuli used in this work capture the essential features of the everyday task of contacting and holding an object. Methods This systems integration effort computationally replicates the neural firing behavior of a slowly adapting type I (SAI) afferent in its temporally varying response to both the intensity and rate of indentation force by combining a physical force sensor, housed in a skin-like substrate, with a mathematical model of neuronal spiking, the leaky integrate-and-fire. Comparison experiments were then conducted using ramp-and-hold stimuli on both the spiking-sensor model and mouse SAI afferents. The model parameters were iteratively fit against recorded SAI interspike intervals (ISI) before validating the model to assess its performance. Results Model-predicted spike firing compares favorably with that observed for single SAI afferents. As indentation magnitude increases (1.2, 1.3, to 1.4 mm), mean ISI decreases from 98.81 ± 24.73, 54.52 ± 6.94, to 41.11 ± 6.11 ms. Moreover, as the rate of ramp-up increases, ISI during ramp-up decreases from 21.85 ± 5.33, 19.98 ± 3.10, to 15.42 ± 2.41 ms. Considering first spikes, the predicted latencies exhibited a decreasing trend as stimulus rate increased, as is
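
    The neuronal stage of the model is a leaky integrate-and-fire unit driven by the measured force. The sketch below shows the generic form of such a model responding to a ramp-and-hold input; the membrane time constant, gain and threshold are placeholders, not the parameters fitted to the mouse SAI data.

```python
import numpy as np

def lif_spikes(stimulus, dt=1e-3, tau=0.03, r=1.0, v_th=1.0, v_reset=0.0):
    """Leaky integrate-and-fire response to an input drive (e.g. sensed force)."""
    v = 0.0
    spike_times = []
    for n, s in enumerate(stimulus):
        v += dt / tau * (-v + r * s)          # leaky integration of the input
        if v >= v_th:                          # threshold crossing -> spike
            spike_times.append(n * dt)
            v = v_reset
    return spike_times

# Ramp-and-hold "force" input: 0.2 s ramp followed by 0.8 s hold.
dt = 1e-3
ramp = np.linspace(0.0, 2.0, int(0.2 / dt))
hold = np.full(int(0.8 / dt), 2.0)
spikes = lif_spikes(np.concatenate([ramp, hold]), dt=dt)
isis = np.diff(spikes)                         # interspike intervals (s)
print(len(spikes), isis[:3])
```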

  2. Firing rate dynamics in recurrent spiking neural networks with intrinsic and network heterogeneity.

    Science.gov (United States)

    Ly, Cheng

    2015-12-01

    Heterogeneity of neural attributes has recently gained a lot of attention and is increasingly recognized as a crucial feature of neural processing. Despite its importance, this physiological feature has traditionally been neglected in theoretical studies of cortical neural networks. Thus, much remains unknown about the consequences of cellular and circuit heterogeneity in spiking neural networks. In particular, the combination of network or synaptic heterogeneity and intrinsic heterogeneity has yet to be considered systematically, despite the fact that both are known to exist and likely play significant roles in neural network dynamics. In a canonical recurrent spiking neural network model, we study how these two forms of heterogeneity lead to different distributions of excitatory firing rates. To analytically characterize how these types of heterogeneity affect the network, we employ a dimension reduction method that relies on a combination of Monte Carlo simulations and probability density function equations. We find that the relationship between intrinsic and network heterogeneity has a strong effect on the overall level of heterogeneity of the firing rates. Specifically, this relationship can lead to amplification or attenuation of firing rate heterogeneity, and these effects depend on whether the recurrent network is firing asynchronously or rhythmically. These observations are captured with the aforementioned reduction method, and simpler analytic descriptions based on this dimension reduction are also developed. The final analytic descriptions provide compact formulas for how the relationship between intrinsic and network heterogeneity determines the firing rate heterogeneity dynamics in various settings.

  3. Decoding Lower Limb Muscle Activity and Kinematics from Cortical Neural Spike Trains during Monkey Performing Stand and Squat Movements

    Science.gov (United States)

    Ma, Xuan; Ma, Chaolin; Huang, Jian; Zhang, Peng; Xu, Jiang; He, Jiping

    2017-01-01

    An extensive literature has described approaches for decoding upper limb kinematics or muscle activity using multichannel cortical spike recordings for brain-machine interface (BMI) applications. However, similar work on the lower limb remains relatively scarce. We previously reported a system for training monkeys to perform visually guided stand and squat tasks. The current study, as a follow-up extension, investigates whether lower limb kinematics and muscle activity, characterized by electromyography (EMG) signals recorded while monkeys perform stand/squat movements, can be accurately decoded from neural spike trains in primary motor cortex (M1). Two monkeys were used in this study. Subdermal intramuscular EMG electrodes were implanted in eight right leg/thigh muscles. With ample data collected from neurons over a large brain area, we performed a spike-triggered average (SpTA) analysis and obtained a series of density contours revealing the spatial distributions of the muscle-innervating neurons corresponding to each given muscle. Guided by these results, we identified the locations optimal for chronic electrode implantation and subsequently carried out chronic neural recordings. A recursive Bayesian estimation framework was proposed for decoding EMG signals together with kinematics from M1 spike trains. Two specific algorithms were implemented: a standard Kalman filter and an unscented Kalman filter; for the latter, an artificial neural network was incorporated to deal with the nonlinearity in neural tuning. High correlation coefficients and signal-to-noise ratios between the predicted and the actual data were achieved for both EMG signals and kinematics in both monkeys. Higher decoding accuracy and a faster convergence rate could be achieved with the unscented Kalman filter. These results demonstrate that lower limb EMG signals and kinematics during monkey stand/squat can be accurately decoded from a group of M1 neurons with the proposed

  4. Recording Spikes Activity in Cultured Hippocampal Neurons Using Flexible or Transparent Graphene Transistors

    Directory of Open Access Journals (Sweden)

    Farida Veliev

    2017-08-01

    Full Text Available The application of nanoelectronics to neural interfaces emerged a few decades ago and aims to provide new tools for replacing or restoring disabled functions of the nervous system, as well as for further understanding the evolution of such a complex organization. At the same time, graphene and other 2D materials have offered new possibilities for integrating micro- and nano-devices on flexible, transparent, and biocompatible substrates, promising for bio- and neuro-electronics. In addition to the many bio-suitable features of a graphene interface, such as chemical inertness and anti-corrosive properties, its optical transparency enables a multimodal approach to neuron-based systems, the electrical layer being compatible with additional microfluidic and optical manipulation ports. The convergence of these fields will provide a next generation of neural interfaces for the reliable detection of single spikes and high-fidelity recording of the activity patterns of neural networks. Here, we report on the fabrication of graphene field-effect transistors (G-FETs) on various substrates (silicon, sapphire, glass coverslips, and polyimide deposited onto Si/SiO2 substrates), exhibiting high sensitivity (4 mS/V, close to the Dirac point at VLG < VD) and a low noise level (10−22 A2/Hz at VLG = 0 V). We demonstrate the in vitro detection of the spontaneous activity of hippocampal neurons grown in situ on top of the graphene sensors over several weeks in a millimeter-size PDMS fluidic chamber (8 mm wide). These results represent an advance toward the realization of biocompatible devices for reliable and high spatio-temporal sensing of neuronal activity for both in vitro and in vivo applications.

  5. Silicon synaptic transistor for hardware-based spiking neural network and neuromorphic system

    Science.gov (United States)

    Kim, Hyungjin; Hwang, Sungmin; Park, Jungjin; Park, Byung-Gook

    2017-10-01

    Brain-inspired neuromorphic systems have attracted much attention as new computing paradigms for power-efficient computation. Here, we report a silicon synaptic transistor with two electrically independent gates to realize a hardware-based neural network system without any switching components. The spike-timing-dependent plasticity characteristics of the synaptic devices are measured and analyzed. With the help of a device model based on the measured data, the pattern recognition capability of the hardware-based spiking neural network system is demonstrated using the Modified National Institute of Standards and Technology (MNIST) handwritten dataset. By comparing systems with and without the inhibitory synapse part, it is confirmed that the inhibitory synapse part is an essential element for obtaining effective and high pattern classification capability.

  6. [Research on UKF control of epileptic-form spikes in neural mass models].

    Science.gov (United States)

    Liu, Xian; Ma, Baiwang; Ji, June; Li, Xiaoli

    2013-12-01

    Neural mass models are able to produce epileptic electroencephalogram (EEG) signals corresponding to different stages of seizures. These models play important roles in studying the mechanisms and control of epileptic seizures. In this study, closed-loop feedback control was used to suppress the epileptic-form spikes in neural mass models, with the aim of providing a theoretical basis for the choice of stimulation site and parameters in clinical treatment. Taking the influence of measurement noise into account, an unscented Kalman filter (UKF) was added to the feedback loop to estimate the system state, and a UKF controller was constructed from the estimated state. In simulations, the control action was imposed on the hyper-excitable population and on all populations, respectively. Both UKF control schemes suppressed the epileptic-form spikes in the model; however, the control energy needed in the latter scheme was less than in the former.

  7. Different propagation speeds of recalled sequences in plastic spiking neural networks

    Science.gov (United States)

    Huang, Xuhui; Zheng, Zhigang; Hu, Gang; Wu, Si; Rasch, Malte J.

    2015-03-01

    Neural networks can generate spatiotemporal patterns of spike activity. Learning and retrieval of sequential activity have been observed in many brain areas and are crucial, for example, for coding episodic memory in the hippocampus or for generating temporal patterns during song production in birds. In a recent study, a sequential activity pattern was directly entrained onto the neural activity of the primary visual cortex (V1) of rats and subsequently successfully recalled by a local and transient trigger. It was observed that the speed of activity propagation, in coordinates of the retinotopically organized neural tissue, was constant during retrieval regardless of how the speed of the light stimulation sweeping across the visual field during training was varied. It is well known that spike-timing-dependent plasticity (STDP) is a potential mechanism for embedding temporal sequences into neural network activity. How training and retrieval speeds relate to each other and how network and learning parameters influence retrieval speeds, however, are not well described. We here theoretically analyze sequential activity learning and retrieval in a recurrent neural network with realistic synaptic short-term dynamics and STDP. Testing multiple STDP rules, we confirm that sequence learning can be achieved by STDP. However, we found that a multiplicative nearest-neighbor (NN) weight update rule generated weight distributions and recall activities that best matched the experiments in V1. Using network simulations and mean-field analysis, we further investigated the learning mechanisms and the influence of network parameters on recall speeds. Our analysis suggests that a multiplicative STDP rule with dominant NN spike interaction might be implemented in V1, since recall speed was almost constant in an NMDA-dominant regime. Interestingly, in an AMPA-dominant regime, neural circuits might exhibit recall speeds that instead follow the change in stimulus speeds. This prediction could be tested in

  8. CMOS-based Stochastically Spiking Neural Network for Optimization under Uncertainties

    Science.gov (United States)

    2017-03-01

    A 'scenario generation' circuit is discussed that non-parametrically estimates and emulates the statistics of uncertain costs/constraints. The discussed mixed-signal, CMOS-based architecture of a stochastically spiking neural network minimizes the area/power of each cell and

  9. [Robustness analysis of adaptive neural network model based on spike timing-dependent plasticity].

    Science.gov (United States)

    Chen, Yunzhi; Xu, Guizhi; Zhou, Qian; Guo, Miaomiao; Guo, Lei; Wan, Xiaowei

    2015-02-01

    To explore the self-organizing robustness of biological neural networks, and thus to provide new ideas and methods for electromagnetic bionic protection, we studied both the information transmission mechanism of neural networks and the spike-timing-dependent plasticity (STDP) mechanism, and then investigated the relationship between synaptic plasticity and the adaptive characteristics of biological systems. A feedforward neural network combining the Izhikevich neuron model with the STDP mechanism was then constructed, and the adaptive robustness of the network was analyzed. Simulation results showed that the neural network based on the STDP mechanism had good robustness, and that this characteristic is closely related to the STDP mechanism. Building on this simulation work, cell circuits with neurons and synapses that can emulate the information-processing mechanisms of the biological nervous system will be built, and electronic circuits with adaptive robustness will then be designed based on these cell circuits.

  10. PAX: A mixed hardware/software simulation platform for spiking neural networks.

    Science.gov (United States)

    Renaud, S; Tomas, J; Lewis, N; Bornat, Y; Daouzli, A; Rudolph, M; Destexhe, A; Saïghi, S

    2010-09-01

    Many hardware-based solutions now exist for the simulation of bio-like neural networks. Less conventional than software-based systems, these simulators generally combine digital and analog forms of computation. In this paper we present a mixed hardware-software platform, specifically designed for the simulation of spiking neural networks, using conductance-based models of neurons and synaptic connections with dynamic adaptation rules (Spike-Timing-Dependent Plasticity). The neurons and networks are configurable and are computed in 'biological real time', by which we mean that the difference between simulated time and simulation time is guaranteed to be lower than 50 μs. After presenting the issues and context involved in the design and use of hardware-based spiking neural networks, we describe the analog neuromimetic integrated circuits which form the core of the platform. We then explain the organization and computation principles of the modules within the platform, and present experimental results which validate the system. Designed as a tool for computational neuroscience, the platform is exploited in collaborative research projects together with neurobiology and computer science partners. Copyright 2010 Elsevier Ltd. All rights reserved.

  11. Towards a theory of cortical columns: From spiking neurons to interacting neural populations of finite size.

    Directory of Open Access Journals (Sweden)

    Tilo Schwalger

    2017-04-01

    Full Text Available Neural population equations such as neural mass or field models are widely used to study brain activity on a large scale. However, the relation of these models to the properties of single neurons is unclear. Here we derive an equation for several interacting populations at the mesoscopic scale starting from a microscopic model of randomly connected generalized integrate-and-fire neuron models. Each population consists of 50-2000 neurons of the same type but different populations account for different neuron types. The stochastic population equations that we find reveal how spike-history effects in single-neuron dynamics such as refractoriness and adaptation interact with finite-size fluctuations on the population level. Efficient integration of the stochastic mesoscopic equations reproduces the statistical behavior of the population activities obtained from microscopic simulations of a full spiking neural network model. The theory describes nonlinear emergent dynamics such as finite-size-induced stochastic transitions in multistable networks and synchronization in balanced networks of excitatory and inhibitory neurons. The mesoscopic equations are employed to rapidly integrate a model of a cortical microcircuit consisting of eight neuron types, which allows us to predict spontaneous population activities as well as evoked responses to thalamic input. Our theory establishes a general framework for modeling finite-size neural population dynamics based on single cell and synapse parameters and offers an efficient approach to analyzing cortical circuits and computations.

  12. Stochastic Spiking Neural Networks Enabled by Magnetic Tunnel Junctions: From Nontelegraphic to Telegraphic Switching Regimes

    Science.gov (United States)

    Liyanagedera, Chamika M.; Sengupta, Abhronil; Jaiswal, Akhilesh; Roy, Kaushik

    2017-12-01

    Stochastic spiking neural networks based on nanoelectronic spin devices can be a possible pathway to achieving "brainlike" compact and energy-efficient cognitive intelligence. Such computational models attempt to exploit the intrinsic device stochasticity of nanoelectronic synaptic or neural components to perform learning or inference. However, there has been limited analysis of the scaling effect of stochastic spin devices and its impact on the operation of such stochastic networks at the system level. This work explores the design space and analyzes the performance of nanomagnet-based stochastic neuromorphic computing architectures for magnets with different barrier heights. We illustrate how the underlying network architecture must be modified to account for the random telegraphic switching behavior displayed by magnets with low barrier heights as they are scaled into the superparamagnetic regime. We perform a device-to-system-level analysis of a deep neural-network architecture for a digit-recognition problem on the MNIST data set.

  13. Spiking Neural Network With Distributed Plasticity Reproduces Cerebellar Learning in Eye Blink Conditioning Paradigms.

    Science.gov (United States)

    Antonietti, Alberto; Casellato, Claudia; Garrido, Jesús A; Luque, Niceto R; Naveros, Francisco; Ros, Eduardo; D' Angelo, Egidio; Pedrocchi, Alessandra

    2016-01-01

    In this study, we defined a realistic cerebellar model through the use of artificial spiking neural networks, testing it in computational simulations that reproduce associative motor tasks over multiple sessions of acquisition and extinction. Using evolutionary algorithms, we tuned the cerebellar microcircuit to find the near-optimal plasticity mechanism parameters that best reproduced human-like behavior in eye blink classical conditioning, one of the most extensively studied paradigms related to the cerebellum. We used two models: one with only the cortical plasticity and another including two additional plasticity sites at the nuclear level. Both spiking cerebellar models were able to reproduce real human behavior well, in terms of both "timing" and "amplitude", expressing rapid acquisition, stable late acquisition, rapid extinction, and faster reacquisition of an associative motor task. Even though the model with only the cortical plasticity site showed good learning capabilities, the model with distributed plasticity produced faster and more stable acquisition of conditioned responses in the reacquisition phase. This behavior is explained by the effect of the nuclear plasticities, which have slow dynamics and can express memory consolidation and saving. We showed how the spiking dynamics of multiple interacting neural mechanisms implicitly drive multiple essential components of complex learning processes. This study presents a very advanced computational model, developed jointly by biomedical engineers, computer scientists, and neuroscientists. Given its realistic features, the proposed model can provide confirmations of and suggestions about neurophysiological and pathological hypotheses and can be used in challenging clinical applications.

  14. Unsupervised Learning in an Ensemble of Spiking Neural Networks Mediated by ITDP.

    Science.gov (United States)

    Shim, Yoonsik; Philippides, Andrew; Staras, Kevin; Husbands, Phil

    2016-10-01

    We propose a biologically plausible architecture for unsupervised ensemble learning in a population of spiking neural network classifiers. A mixture of experts type organisation is shown to be effective, with the individual classifier outputs combined via a gating network whose operation is driven by input timing dependent plasticity (ITDP). The ITDP gating mechanism is based on recent experimental findings. An abstract, analytically tractable model of the ITDP driven ensemble architecture is derived from a logical model based on the probabilities of neural firing events. A detailed analysis of this model provides insights that allow it to be extended into a full, biologically plausible, computational implementation of the architecture which is demonstrated on a visual classification task. The extended model makes use of a style of spiking network, first introduced as a model of cortical microcircuits, that is capable of Bayesian inference, effectively performing expectation maximization. The unsupervised ensemble learning mechanism, based around such spiking expectation maximization (SEM) networks whose combined outputs are mediated by ITDP, is shown to perform the visual classification task well and to generalize to unseen data. The combined ensemble performance is significantly better than that of the individual classifiers, validating the ensemble architecture and learning mechanisms. The properties of the full model are analysed in the light of extensive experiments with the classification task, including an investigation into the influence of different input feature selection schemes and a comparison with a hierarchical STDP based ensemble architecture.

  15. A Spiking Neural Network Based Cortex-Like Mechanism and Application to Facial Expression Recognition

    Directory of Open Access Journals (Sweden)

    Si-Yao Fu

    2012-01-01

    Full Text Available In this paper, we present a quantitative, highly structured cortex-simulated model, which can be described as a feedforward, hierarchical simulation of the ventral stream of the visual cortex using a biologically plausible, computationally convenient spiking neural network system. The motivation comes directly from recent pioneering work on detailed functional decomposition analysis of the feedforward pathway of the ventral stream of the visual cortex and from developments in artificial spiking neural networks (SNNs). By combining the logical structure of the cortical hierarchy with the computing power of the spiking neuron model, a practical framework is presented. As a proof of principle, we demonstrate our system on several facial expression recognition tasks. The proposed cortex-like feedforward hierarchy is capable of dealing with complicated pattern recognition problems. This suggests that combining cognitive models with modern neurocomputational approaches has the potential to extend our knowledge of the brain mechanisms underlying cognitive analysis and to advance theoretical models of how we recognize faces or, more specifically, perceive other people's facial expressions in a rich, dynamic, and complex environment, providing a new starting point for improved models of visual-cortex-like mechanisms.

  16. Unsupervised Learning in an Ensemble of Spiking Neural Networks Mediated by ITDP.

    Directory of Open Access Journals (Sweden)

    Yoonsik Shim

    2016-10-01

    Full Text Available We propose a biologically plausible architecture for unsupervised ensemble learning in a population of spiking neural network classifiers. A mixture of experts type organisation is shown to be effective, with the individual classifier outputs combined via a gating network whose operation is driven by input timing dependent plasticity (ITDP). The ITDP gating mechanism is based on recent experimental findings. An abstract, analytically tractable model of the ITDP driven ensemble architecture is derived from a logical model based on the probabilities of neural firing events. A detailed analysis of this model provides insights that allow it to be extended into a full, biologically plausible, computational implementation of the architecture which is demonstrated on a visual classification task. The extended model makes use of a style of spiking network, first introduced as a model of cortical microcircuits, that is capable of Bayesian inference, effectively performing expectation maximization. The unsupervised ensemble learning mechanism, based around such spiking expectation maximization (SEM) networks whose combined outputs are mediated by ITDP, is shown to perform the visual classification task well and to generalize to unseen data. The combined ensemble performance is significantly better than that of the individual classifiers, validating the ensemble architecture and learning mechanisms. The properties of the full model are analysed in the light of extensive experiments with the classification task, including an investigation into the influence of different input feature selection schemes and a comparison with a hierarchical STDP based ensemble architecture.

  17. Evolving spiking neural networks: a novel growth algorithm exhibits unintelligent design

    Science.gov (United States)

    Schaffer, J. David

    2015-06-01

    Spiking neural networks (SNNs) have drawn considerable excitement because of their computational properties, believed to be superior to conventional von Neumann machines, and sharing properties with living brains. Yet progress building these systems has been limited because we lack a design methodology. We present a gene-driven network growth algorithm that enables a genetic algorithm (evolutionary computation) to generate and test SNNs. The genome for this algorithm grows O(n) where n is the number of neurons; n is also evolved. The genome not only specifies the network topology, but all its parameters as well. Experiments show the algorithm producing SNNs that effectively produce a robust spike bursting behavior given tonic inputs, an application suitable for central pattern generators. Even though evolution did not include perturbations of the input spike trains, the evolved networks showed remarkable robustness to such perturbations. In addition, the output spike patterns retain evidence of the specific perturbation of the inputs, a feature that could be exploited by network additions that could use this information for refined decision making if required. On a second task, a sequence detector, a discriminating design was found that might be considered an example of "unintelligent design"; extra non-functional neurons were included that, while inefficient, did not hamper its proper functioning.

  18. Analog frontend for multichannel neuronal recording system with spike and LFP separation.

    Science.gov (United States)

    Perelman, Yevgeny; Ginosar, Ran

    2006-05-15

    A 0.35 μm CMOS integrated circuit for multi-channel neuronal recording with twelve true-differential channels, band separation and digital offset calibration is presented. The measured signal is separated into a low-frequency local field potential and high-frequency spike data. Digitally programmable gains of up to 60 and 80 dB for the local field potential and spike bands are provided. DC offsets are compensated on both bands by means of digitally programmable DACs. Spike band is limited by a second order low-pass filter with digitally programmable cutoff frequency. The IC has been fabricated and tested. 3 μV input referred noise on the spike data band was measured.
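
    As an illustration of the band split that the frontend performs in analog circuitry, an equivalent digital separation can be sketched as follows; the corner frequencies and sampling rate are assumptions chosen for illustration, not the chip's programmed settings.

```python
# Digital sketch of the LFP/spike band separation done by the analog frontend.
# Cutoffs (300 Hz, 5 kHz) and the sampling rate are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 30000.0                       # assumed sampling rate in Hz
raw = np.random.randn(int(fs))     # stand-in for one second of raw extracellular data

# Local field potential band: low-pass below ~300 Hz
b_lfp, a_lfp = butter(2, 300.0 / (fs / 2), btype="low")
lfp = filtfilt(b_lfp, a_lfp, raw)

# Spike band: second-order band-pass, here ~300 Hz to 5 kHz
b_spk, a_spk = butter(2, [300.0 / (fs / 2), 5000.0 / (fs / 2)], btype="band")
spike_band = filtfilt(b_spk, a_spk, raw)
print(lfp.shape, spike_band.shape)
```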

  19. Reliability of directional information in unsorted spikes and local field potentials recorded in human motor cortex.

    Science.gov (United States)

    Perge, János A; Zhang, Shaomin; Malik, Wasim Q; Homer, Mark L; Cash, Sydney; Friehs, Gerhard; Eskandar, Emad N; Donoghue, John P; Hochberg, Leigh R

    2014-08-01

    Action potentials and local field potentials (LFPs) recorded in primary motor cortex contain information about the direction of movement. LFPs are assumed to be more robust to signal instabilities than action potentials, which makes LFPs, along with action potentials, a promising signal source for brain-computer interface applications. Still, relatively little research has directly compared the utility of LFPs to action potentials in decoding movement direction in human motor cortex. We conducted intracortical multi-electrode recordings in motor cortex of two persons (T2 and [S3]) as they performed a motor imagery task. We then compared the offline decoding performance of LFPs and spiking extracted from the same data recorded across a one-year period in each participant. We obtained offline prediction accuracy of movement direction and endpoint velocity in multiple LFP bands, with the best performance in the highest (200-400 Hz) LFP frequency band, presumably also containing low-pass filtered action potentials. Cross-frequency correlations of preferred directions and directional modulation index showed high similarity of directional information between action potential firing rates (spiking) and high frequency LFPs (70-400 Hz), and increasing disparity with lower frequency bands (0-7, 10-40 and 50-65 Hz). Spikes predicted the direction of intended movement more accurately than any individual LFP band, however combined decoding of all LFPs was statistically indistinguishable from spike-based performance. As the quality of spiking signals (i.e. signal amplitude) and the number of significantly modulated spiking units decreased, the offline decoding performance decreased 3.6[5.65]%/month (for T2 and [S3] respectively). The decrease in the number of significantly modulated LFP signals and their decoding accuracy followed a similar trend (2.4[2.85]%/month, ANCOVA, p = 0.27[0.03]). Field potentials provided comparable offline decoding performance to unsorted spikes. Thus
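
    A minimal offline sketch of the kind of analysis described, band-limited LFP power feeding a linear decoder of movement kinematics, is given below. The data are synthetic, the band edges follow the abstract, and ridge regression stands in for whatever decoder the authors actually used.

```python
# Offline decoding sketch: band-limited LFP power -> 2D endpoint velocity,
# using ridge regression on synthetic data (the study's decoder may differ).
import numpy as np
from scipy.signal import welch

rng = np.random.default_rng(0)
fs, n_trials, n_chan = 1000, 200, 16
bands = [(0, 7), (10, 40), (50, 65), (70, 200), (200, 400)]   # Hz, from the abstract

lfp = rng.standard_normal((n_trials, n_chan, fs))             # 1 s of synthetic LFP per trial
vel = rng.standard_normal((n_trials, 2))                      # synthetic 2D velocity targets

# Features: per-trial, per-channel power in each LFP band
f, pxx = welch(lfp, fs=fs, nperseg=256, axis=-1)
feats = np.stack([pxx[..., (f >= lo) & (f < hi)].mean(-1) for lo, hi in bands], -1)
X = feats.reshape(n_trials, -1)
X = (X - X.mean(0)) / X.std(0)

# Ridge regression in closed form, then in-sample R^2 per velocity axis
lam = 1.0
W = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ vel)
pred = X @ W
r2 = 1 - ((vel - pred) ** 2).sum(0) / ((vel - vel.mean(0)) ** 2).sum(0)
print("in-sample R^2 per axis:", r2)
```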

  20. Informational lesions: optical perturbation of spike timing and neural synchrony via microbial opsin gene fusions

    Directory of Open Access Journals (Sweden)

    Xue Han

    2009-08-01

    Full Text Available Synchronous neural activity occurs throughout the brain in association with normal and pathological brain functions. Despite theoretical work exploring how such neural coordination might facilitate neural computation and be corrupted in disease states, it has proven difficult to test experimentally the causal role of synchrony in such phenomena. Attempts to manipulate neural synchrony often alter other features of neural activity such as firing rate. Here we evaluate a single gene which encodes for the blue-light gated cation channel channelrhodopsin-2 and the yellow-light driven chloride pump halorhodopsin from N. pharaonis, linked by a ‘self-cleaving’ 2A peptide. This fusion enables proportional expression of both opsins, sensitizing neurons to being bi-directionally controlled with blue and yellow light, facilitating proportional optical spike insertion and deletion upon delivery of trains of precisely-timed blue and yellow light pulses. Such approaches may enable more detailed explorations of the causal role of specific features of the neural code.

  1. NeuroFlow: A General Purpose Spiking Neural Network Simulation Platform using Customizable Processors

    Directory of Open Access Journals (Sweden)

    Kit eCheung

    2016-01-01

    Full Text Available NeuroFlow is a scalable spiking neural network simulation platform for off-the-shelf high performance computing systems using customizable hardware processors such as Field-Programmable Gate Arrays (FPGAs). Unlike multi-core processors and application-specific integrated circuits, the processor architecture of NeuroFlow can be redesigned and reconfigured to suit a particular simulation to deliver optimised performance, such as the degree of parallelism to employ. The compilation process supports using PyNN, a simulator-independent neural network description language, to configure the processor. NeuroFlow supports a number of commonly used current or conductance based neuronal models such as integrate-and-fire and Izhikevich models, and the spike-timing-dependent plasticity (STDP) rule for learning. A 6-FPGA system can simulate a network of up to approximately 600,000 neurons and can achieve a real-time performance of 400,000 neurons. Using one FPGA, NeuroFlow delivers a speedup of up to 33.6 times the speed of an 8-core processor, or 2.83 times the speed of GPU-based platforms. With high flexibility and throughput, NeuroFlow provides a viable environment for large-scale neural network simulation.
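
    Since NeuroFlow is configured through PyNN, a network of the kind it consumes can be described as in the following sketch. The NeuroFlow backend itself is not assumed to be installed here, so the standard NEST backend is used instead, and all sizes, weights and parameters are illustrative only.

```python
# Minimal PyNN description of the kind of network NeuroFlow accepts.
# Run here on the generic NEST backend; all values are illustrative.
import pyNN.nest as sim

sim.setup(timestep=0.1)                       # ms

exc = sim.Population(800, sim.IF_cond_exp(tau_m=20.0, v_thresh=-50.0))
inh = sim.Population(200, sim.IF_cond_exp(tau_m=20.0, v_thresh=-50.0))
noise = sim.Population(800, sim.SpikeSourcePoisson(rate=20.0))

stdp = sim.STDPMechanism(
    timing_dependence=sim.SpikePairRule(tau_plus=20.0, tau_minus=20.0),
    weight_dependence=sim.AdditiveWeightDependence(w_min=0.0, w_max=0.01),
    weight=0.002, delay=1.0)

sim.Projection(noise, exc, sim.OneToOneConnector(),
               synapse_type=sim.StaticSynapse(weight=0.02, delay=1.0))
sim.Projection(exc, exc, sim.FixedProbabilityConnector(0.1), synapse_type=stdp)
sim.Projection(exc, inh, sim.FixedProbabilityConnector(0.1),
               synapse_type=sim.StaticSynapse(weight=0.01, delay=1.0))
sim.Projection(inh, exc, sim.FixedProbabilityConnector(0.1),
               synapse_type=sim.StaticSynapse(weight=0.05, delay=1.0),
               receptor_type="inhibitory")

exc.record("spikes")
sim.run(1000.0)                               # ms
spike_data = exc.get_data("spikes")
sim.end()
```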

  2. NeuroFlow: A General Purpose Spiking Neural Network Simulation Platform using Customizable Processors.

    Science.gov (United States)

    Cheung, Kit; Schultz, Simon R; Luk, Wayne

    2015-01-01

    NeuroFlow is a scalable spiking neural network simulation platform for off-the-shelf high performance computing systems using customizable hardware processors such as Field-Programmable Gate Arrays (FPGAs). Unlike multi-core processors and application-specific integrated circuits, the processor architecture of NeuroFlow can be redesigned and reconfigured to suit a particular simulation to deliver optimized performance, such as the degree of parallelism to employ. The compilation process supports using PyNN, a simulator-independent neural network description language, to configure the processor. NeuroFlow supports a number of commonly used current or conductance based neuronal models such as integrate-and-fire and Izhikevich models, and the spike-timing-dependent plasticity (STDP) rule for learning. A 6-FPGA system can simulate a network of up to ~600,000 neurons and can achieve a real-time performance of 400,000 neurons. Using one FPGA, NeuroFlow delivers a speedup of up to 33.6 times the speed of an 8-core processor, or 2.83 times the speed of GPU-based platforms. With high flexibility and throughput, NeuroFlow provides a viable environment for large-scale neural network simulation.

  3. NeuroFlow: A General Purpose Spiking Neural Network Simulation Platform using Customizable Processors

    Science.gov (United States)

    Cheung, Kit; Schultz, Simon R.; Luk, Wayne

    2016-01-01

    NeuroFlow is a scalable spiking neural network simulation platform for off-the-shelf high performance computing systems using customizable hardware processors such as Field-Programmable Gate Arrays (FPGAs). Unlike multi-core processors and application-specific integrated circuits, the processor architecture of NeuroFlow can be redesigned and reconfigured to suit a particular simulation to deliver optimized performance, such as the degree of parallelism to employ. The compilation process supports using PyNN, a simulator-independent neural network description language, to configure the processor. NeuroFlow supports a number of commonly used current or conductance based neuronal models such as integrate-and-fire and Izhikevich models, and the spike-timing-dependent plasticity (STDP) rule for learning. A 6-FPGA system can simulate a network of up to ~600,000 neurons and can achieve a real-time performance of 400,000 neurons. Using one FPGA, NeuroFlow delivers a speedup of up to 33.6 times the speed of an 8-core processor, or 2.83 times the speed of GPU-based platforms. With high flexibility and throughput, NeuroFlow provides a viable environment for large-scale neural network simulation. PMID:26834542

  4. Learning by stimulation avoidance: A principle to control spiking neural networks dynamics.

    Science.gov (United States)

    Sinapayen, Lana; Masumori, Atsushi; Ikegami, Takashi

    2017-01-01

    Learning based on networks of real neurons, and learning based on biologically inspired models of neural networks, have yet to find general learning rules leading to widespread applications. In this paper, we argue for the existence of a principle that allows steering the dynamics of a biologically inspired neural network. Using carefully timed external stimulation, the network can be driven towards a desired dynamical state. We term this principle "Learning by Stimulation Avoidance" (LSA). We demonstrate through simulation that the minimal sufficient conditions leading to LSA in artificial networks are also sufficient to reproduce learning results similar to those obtained in biological neurons by Shahaf and Marom, and in addition explain synaptic pruning. We examined the underlying mechanism by simulating a small network of 3 neurons, then scaled it up to a hundred neurons. We show that LSA has a higher explanatory power than existing hypotheses about the response of biological neural networks to external stimulation, and can be used as a learning rule for an embodied application: learning of wall avoidance by a simulated robot. In other works, reinforcement learning with spiking networks can be obtained through global reward signals akin to simulating the dopamine system; we believe that this is the first project demonstrating sensory-motor learning with random spiking networks through Hebbian learning relying on environmental conditions without a separate reward system.
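
    The LSA principle can be caricatured with two stochastic binary units: a Hebbian pre-before-post rule strengthens the pathway whose activation switches the external stimulation off. The sketch below is only such a caricature, not the authors' spiking-network implementation, and its probabilities, learning rates and pause length are arbitrary choices.

```python
# Toy illustration of Learning by Stimulation Avoidance: a Hebbian rule
# strengthens the A->B pathway because B firing removes the stimulation.
# Two stochastic binary units only; not the paper's spiking-network model.
import numpy as np

rng = np.random.default_rng(1)
w, lr = 0.1, 0.05             # synaptic weight A -> B and learning rate
pause = 0                     # steps remaining with stimulation switched off
history_w, stim_log = [], []

for step in range(2000):
    stim_on = pause == 0
    a = rng.random() < (0.9 if stim_on else 0.1)     # input unit driven by stimulation
    b = rng.random() < min(0.95, 0.05 + w * a)       # output unit driven by A through w
    if a and b:                                      # pre-and-post coincidence: potentiate
        w = min(1.0, w + lr)
    elif a and not b:                                # pre without post: mild depression
        w = max(0.0, w - 0.05 * lr)
    if b:                                            # firing of B switches stimulation off
        pause = 5
    elif pause > 0:
        pause -= 1
    history_w.append(w)
    stim_log.append(stim_on)

print("weight start/end:", history_w[0], history_w[-1])
print("fraction of steps under stimulation, first vs last 500:",
      np.mean(stim_log[:500]), np.mean(stim_log[-500:]))
```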

  5. Recent advances in neural recording microsystems.

    Science.gov (United States)

    Gosselin, Benoit

    2011-01-01

    The accelerating pace of research in neuroscience has created a considerable demand for neural interfacing microsystems capable of monitoring the activity of large groups of neurons. These emerging tools have revealed a tremendous potential for the advancement of knowledge in brain research and for the development of useful clinical applications. They can extract the relevant control signals directly from the brain enabling individuals with severe disabilities to communicate their intentions to other devices, like computers or various prostheses. Such microsystems are self-contained devices composed of a neural probe attached with an integrated circuit for extracting neural signals from multiple channels, and transferring the data outside the body. The greatest challenge facing development of such emerging devices into viable clinical systems involves addressing their small form factor and low-power consumption constraints, while providing superior resolution. In this paper, we survey the recent progress in the design and the implementation of multi-channel neural recording Microsystems, with particular emphasis on the design of recording and telemetry electronics. An overview of the numerous neural signal modalities is given and the existing microsystem topologies are covered. We present energy-efficient sensory circuits to retrieve weak signals from neural probes and we compare them. We cover data management and smart power scheduling approaches, and we review advances in low-power telemetry. Finally, we conclude by summarizing the remaining challenges and by highlighting the emerging trends in the field.

  6. Recent Advances in Neural Recording Microsystems

    Directory of Open Access Journals (Sweden)

    Benoit Gosselin

    2011-04-01

    Full Text Available The accelerating pace of research in neuroscience has created a considerable demand for neural interfacing microsystems capable of monitoring the activity of large groups of neurons. These emerging tools have revealed a tremendous potential for the advancement of knowledge in brain research and for the development of useful clinical applications. They can extract the relevant control signals directly from the brain enabling individuals with severe disabilities to communicate their intentions to other devices, like computers or various prostheses. Such microsystems are self-contained devices composed of a neural probe attached with an integrated circuit for extracting neural signals from multiple channels, and transferring the data outside the body. The greatest challenge facing development of such emerging devices into viable clinical systems involves addressing their small form factor and low-power consumption constraints, while providing superior resolution. In this paper, we survey the recent progress in the design and the implementation of multi-channel neural recording Microsystems, with particular emphasis on the design of recording and telemetry electronics. An overview of the numerous neural signal modalities is given and the existing microsystem topologies are covered. We present energy-efficient sensory circuits to retrieve weak signals from neural probes and we compare them. We cover data management and smart power scheduling approaches, and we review advances in low-power telemetry. Finally, we conclude by summarizing the remaining challenges and by highlighting the emerging trends in the field.

  7. Self-organized noise resistance of oscillatory neural networks with spike timing-dependent plasticity.

    Science.gov (United States)

    Popovych, Oleksandr V; Yanchuk, Serhiy; Tass, Peter A

    2013-10-11

    Intuitively one might expect independent noise to be a powerful tool for desynchronizing a population of synchronized neurons. We here show that, intriguingly, for oscillatory neural populations with adaptive synaptic weights governed by spike timing-dependent plasticity (STDP) the opposite is true. We found that the mean synaptic coupling in such systems increases dynamically in response to the increase of the noise intensity, and there is an optimal noise level, where the amount of synaptic coupling gets maximal in a resonance-like manner as found for the stochastic or coherence resonances, although the mechanism in our case is different. This constitutes a noise-induced self-organization of the synaptic connectivity, which effectively counteracts the desynchronizing impact of independent noise over a wide range of the noise intensity. Given the attempts to counteract neural synchrony underlying tinnitus with noisers and maskers, our results may be of clinical relevance.

  8. Spike-frequency adapting neural ensembles: beyond mean adaptation and renewal theories.

    Science.gov (United States)

    Muller, Eilif; Buesing, Lars; Schemmel, Johannes; Meier, Karlheinz

    2007-11-01

    We propose a Markov process model for spike-frequency adapting neural ensembles that synthesizes existing mean-adaptation approaches, population density methods, and inhomogeneous renewal theory, resulting in a unified and tractable framework that goes beyond renewal and mean-adaptation theories by accounting for correlations between subsequent interspike intervals. A method for efficiently generating inhomogeneous realizations of the proposed Markov process is given, numerical methods for solving the population equation are presented, and an expression for the first-order interspike interval correlation is derived. Further, we show that the full five-dimensional master equation for a conductance-based integrate-and-fire neuron with spike-frequency adaptation and a relative refractory mechanism driven by Poisson spike trains can be reduced to a two-dimensional generalization of the proposed Markov process by an adiabatic elimination of fast variables. For static and dynamic stimulation, negative serial interspike interval correlations and transient population responses, respectively, of Monte Carlo simulations of the full five-dimensional system can be accurately described by the proposed two-dimensional Markov process.
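
    The negative serial interspike-interval correlations that motivate going beyond renewal theory can be reproduced with a minimal adaptive integrate-and-fire simulation, sketched below with illustrative parameters rather than those of the paper.

```python
# A noise-driven leaky integrate-and-fire neuron with spike-frequency
# adaptation shows the negative serial ISI correlation discussed above.
# All parameter values are illustrative, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
dt, T = 1e-4, 100.0                    # time step and duration (s)
tau_m, tau_a = 20e-3, 100e-3           # membrane and adaptation time constants (s)
v_th, v_reset, dA = 1.0, 0.0, 0.3      # threshold, reset, adaptation increment
mu, sigma = 1.2, 0.7                   # mean drive and noise strength

v, a, spikes = 0.0, 0.0, []
for i in range(int(T / dt)):
    v += dt / tau_m * (mu - v - a) + sigma * np.sqrt(dt / tau_m) * rng.standard_normal()
    a -= dt / tau_a * a
    if v >= v_th:                      # spike: reset and increment adaptation current
        v = v_reset
        a += dA
        spikes.append(i * dt)

isi = np.diff(spikes)
rho1 = np.corrcoef(isi[:-1], isi[1:])[0, 1]    # lag-1 serial ISI correlation
print(f"{isi.size} ISIs, lag-1 serial correlation = {rho1:.3f} (expected < 0)")
```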

  9. Effect of Heterogeneity on Decorrelation Mechanisms in Spiking Neural Networks: A Neuromorphic-Hardware Study

    Directory of Open Access Journals (Sweden)

    Thomas Pfeil

    2016-05-01

    Full Text Available High-level brain function, such as memory, classification, or reasoning, can be realized by means of recurrent networks of simplified model neurons. Analog neuromorphic hardware constitutes a fast and energy-efficient substrate for the implementation of such neural computing architectures in technical applications and neuroscientific research. The functional performance of neural networks is often critically dependent on the level of correlations in the neural activity. In finite networks, correlations are typically inevitable due to shared presynaptic input. Recent theoretical studies have shown that inhibitory feedback, abundant in biological neural networks, can actively suppress these shared-input correlations and thereby enable neurons to fire nearly independently. For networks of spiking neurons, the decorrelating effect of inhibitory feedback has so far been explicitly demonstrated only for homogeneous networks of neurons with linear subthreshold dynamics. Theory, however, suggests that the effect is a general phenomenon, present in any system with sufficient inhibitory feedback, irrespective of the details of the network structure or the neuronal and synaptic properties. Here, we investigate the effect of network heterogeneity on correlations in sparse, random networks of inhibitory neurons with nonlinear, conductance-based synapses. Emulations of these networks on the analog neuromorphic-hardware system Spikey allow us to test the efficiency of decorrelation by inhibitory feedback in the presence of hardware-specific heterogeneities. The configurability of the hardware substrate enables us to modulate the extent of heterogeneity in a systematic manner. We selectively study the effects of shared input and recurrent connections on correlations in membrane potentials and spike trains. Our results confirm that shared-input correlations are actively suppressed by inhibitory feedback also in highly heterogeneous networks exhibiting broad

  10. Functional connectivity among spike trains in neural assemblies during rat working memory task.

    Science.gov (United States)

    Xie, Jiacun; Bai, Wenwen; Liu, Tiaotiao; Tian, Xin

    2014-11-01

    Working memory refers to a brain system that provides temporary storage to manipulate information for complex cognitive tasks. As the brain is a complex, dynamic and interwoven network of connections and interactions, two questions are raised here: how to investigate the mechanism of working memory from the view of functional connectivity in the brain network? How to present the most characteristic features of functional connectivity in a low-dimensional network? To address these questions, we recorded the spike trains in prefrontal cortex with multi-electrodes when rats performed a working memory task in Y-maze. The functional connectivity matrix among spike trains was calculated via maximum likelihood estimation (MLE). The average connectivity value Cc, mean of the matrix, was calculated and used to describe connectivity strength quantitatively. The spike network was constructed from the functional connectivity matrix. The information transfer efficiency Eglob was calculated and used to present the features of the network. In order to establish a low-dimensional spike network, the active neurons with higher firing rates than the average rate were selected based on sparse coding. The results show that the connectivity Cc and the network transfer efficiency Eglob varied with time during the task. The maximum values of Cc and Eglob occurred prior to the working memory behavior reference point. Compared with the results in the original network, the feature network could present more characteristic features of functional connectivity. Copyright © 2014 Elsevier B.V. All rights reserved.
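
    Given any estimated connectivity matrix, the two summary measures used in the study can be computed as sketched below; the MLE estimation step itself is not reproduced, and a random sparse matrix stands in for its output.

```python
# Sketch of the two summary measures: the average connectivity value Cc
# (mean of the functional connectivity matrix) and the global
# information-transfer efficiency Eglob of the induced directed graph.
import numpy as np
from scipy.sparse.csgraph import shortest_path

rng = np.random.default_rng(0)
n = 20
C = rng.random((n, n)) * (rng.random((n, n)) < 0.2)   # stand-in for the MLE estimate
np.fill_diagonal(C, 0.0)

Cc = C.mean()                                          # average connectivity value

# Treat strong connections as short distances; zero entries mean "no edge"
dist = np.zeros_like(C)
dist[C > 0] = 1.0 / C[C > 0]
D = shortest_path(dist, directed=True)
off = ~np.eye(n, dtype=bool)
Eglob = np.mean(1.0 / D[off])                          # unreachable pairs contribute 0
print(f"Cc = {Cc:.3f}, Eglob = {Eglob:.3f}")
```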

  11. A simple method to simultaneously detect and identify spikes from raw extracellular recordings

    Directory of Open Access Journals (Sweden)

    Panagiotis ePetrantonakis

    2015-12-01

    Full Text Available The ability to track when and which neurons fire in the vicinity of an electrode, in an efficient and reliable manner can revolutionize the neuroscience field. The current bottleneck lies in spike sorting algorithms; existing methods for detecting and discriminating the activity of multiple neurons rely on inefficient, multi-step processing of extracellular recordings. In this work, we show that a single-step processing of raw (unfiltered) extracellular signals is sufficient for both the detection and identification of active neurons, thus greatly simplifying and optimizing the spike sorting approach. The efficiency and reliability of our method is demonstrated in both real and simulated data.
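
    For contrast, the conventional multi-step baseline that this single-step method simplifies away, band-pass filtering followed by amplitude thresholding against a robust noise estimate, looks roughly as follows; the 4-sigma threshold and filter corners are common conventions, not the authors' algorithm.

```python
# Conventional multi-step detection baseline: band-pass filter, then threshold
# at 4x a robust noise estimate; shown only for comparison with the
# single-step approach described above.
import numpy as np
from scipy.signal import butter, filtfilt

def detect_spikes(raw, fs, refractory_ms=1.0):
    b, a = butter(2, [300.0 / (fs / 2), 5000.0 / (fs / 2)], btype="band")
    x = filtfilt(b, a, raw)
    sigma = np.median(np.abs(x)) / 0.6745          # robust noise estimate
    crossings = np.flatnonzero((x[1:] < -4 * sigma) & (x[:-1] >= -4 * sigma))
    keep, last = [], -np.inf                       # enforce a refractory period
    for c in crossings:
        if c - last > refractory_ms * 1e-3 * fs:
            keep.append(c)
            last = c
    return np.asarray(keep)

fs = 24000.0
raw = np.random.randn(int(fs))                     # stand-in for one second of raw data
print("detected events:", detect_spikes(raw, fs).size)
```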

  12. Auto-deleting brain machine interface: Error detection using spiking neural activity in the motor cortex.

    Science.gov (United States)

    Even-Chen, Nir; Stavisky, Sergey D; Kao, Jonathan C; Ryu, Stephen I; Shenoy, Krishna V

    2015-01-01

    Brain machine interfaces (BMIs) aim to assist people with paralysis by increasing their independence and ability to communicate, e.g., by using a cursor-based virtual keyboard. Current BMI clinical trials are hampered by modest performance that causes selection of wrong characters (errors) and thus reduces achieved typing rate. If it were possible to detect these errors without explicit knowledge of the task goal, this could be used to automatically "undo" wrong selections or even prevent upcoming wrong selections. We decoded imminent or recent errors during closed-loop BMI control from intracortical spiking neural activity. In our experiment, a non-human primate controlled a neurally-driven BMI cursor to acquire targets on a grid, which simulates a virtual keyboard. In offline analyses of this closed-loop BMI control data, we identified motor cortical neural signals indicative of task error occurrence. We were able to detect task outcomes (97% accuracy) and even predict upcoming task outcomes (86% accuracy) using neural activity alone. This novel strategy may help increase the performance and clinical viability of BMIs.

  13. Synchronous digital implementation of the AER communication scheme for emulating large-scale spiking neural networks models

    OpenAIRE

    Moreno Aróstegui, Juan Manuel; Madrenas Boadas, Jordi; Kotynia, L.

    2009-01-01

    In this paper we shall present a fully synchronous digital implementation of the Address Event Representation (AER) communication scheme that has been used in the PERPLEXUS chip in order to permit the emulation of large-scale biologically inspired spiking neural networks models. By introducing specific commands in the AER protocol it is possible to distribute the AER bus among a large number of chips where the functionality of the spiking neurons is being emulated. A c...

  14. Unsupervised discrimination of patterns in spiking neural networks with excitatory and inhibitory synaptic plasticity.

    Science.gov (United States)

    Srinivasa, Narayan; Cho, Youngkwan

    2014-01-01

    A spiking neural network model is described for learning to discriminate among spatial patterns in an unsupervised manner. The network anatomy consists of source neurons that are activated by external inputs, a reservoir that resembles a generic cortical layer with an excitatory-inhibitory (EI) network and a sink layer of neurons for readout. Synaptic plasticity in the form of STDP is imposed on all the excitatory and inhibitory synapses at all times. While long-term excitatory STDP enables sparse and efficient learning of the salient features in inputs, inhibitory STDP enables this learning to be stable by establishing a balance between excitatory and inhibitory currents at each neuron in the network. The synaptic weights between source and reservoir neurons form a basis set for the input patterns. The neural trajectories generated in the reservoir due to input stimulation and lateral connections between reservoir neurons can be readout by the sink layer neurons. This activity is used for adaptation of synapses between reservoir and sink layer neurons. A new measure called the discriminability index (DI) is introduced to compute if the network can discriminate between old patterns already presented in an initial training session. The DI is also used to compute if the network adapts to new patterns without losing its ability to discriminate among old patterns. The final outcome is that the network is able to correctly discriminate between all patterns-both old and new. This result holds as long as inhibitory synapses employ STDP to continuously enable current balance in the network. The results suggest a possible direction for future investigation into how spiking neural networks could address the stability-plasticity question despite having continuous synaptic plasticity.

  15. Neural Spike-Train Analyses of the Speech-Based Envelope Power Spectrum Model

    Directory of Open Access Journals (Sweden)

    Varsha H. Rallapalli

    2016-10-01

    Full Text Available Diagnosing and treating hearing impairment is challenging because people with similar degrees of sensorineural hearing loss (SNHL) often have different speech-recognition abilities. The speech-based envelope power spectrum model (sEPSM) has demonstrated that the signal-to-noise ratio (SNRENV) from a modulation filter bank provides a robust speech-intelligibility measure across a wider range of degraded conditions than many long-standing models. In the sEPSM, noise (N) is assumed to: (a) reduce S + N envelope power by filling in dips within clean speech (S) and (b) introduce an envelope noise floor from intrinsic fluctuations in the noise itself. While the promise of SNRENV has been demonstrated for normal-hearing listeners, it has not been thoroughly extended to hearing-impaired listeners because of limited physiological knowledge of how SNHL affects speech-in-noise envelope coding relative to noise alone. Here, envelope coding to speech-in-noise stimuli was quantified from auditory-nerve model spike trains using shuffled correlograms, which were analyzed in the modulation-frequency domain to compute modulation-band estimates of neural SNRENV. Preliminary spike-train analyses show strong similarities to the sEPSM, demonstrating feasibility of neural SNRENV computations. Results suggest that individual differences can occur based on differential degrees of outer- and inner-hair-cell dysfunction in listeners currently diagnosed into the single audiological SNHL category. The predicted acoustic-SNR dependence in individual differences suggests that the SNR-dependent rate of susceptibility could be an important metric in diagnosing individual differences. Future measurements of the neural SNRENV in animal studies with various forms of SNHL will provide valuable insight for understanding individual differences in speech-in-noise intelligibility.
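
    The neural SNRENV referred to above follows the envelope-domain signal-to-noise ratio of the sEPSM. As commonly written in that literature (notation paraphrased here, with k indexing the modulation filter):

```latex
\mathrm{SNR}_{\mathrm{env},k} \;=\; \frac{P_{\mathrm{env},k}^{\,S+N} - P_{\mathrm{env},k}^{\,N}}{P_{\mathrm{env},k}^{\,N}},
\qquad
\mathrm{SNR}_{\mathrm{env}} \;=\; \Bigl(\sum_{k} \mathrm{SNR}_{\mathrm{env},k}^{2}\Bigr)^{1/2}
```

    where $P_{\mathrm{env},k}$ is the envelope power of the noisy speech ($S+N$) or of the noise alone ($N$) at the output of modulation filter $k$; in the neural variant these powers are estimated from the modulation-domain analysis of the shuffled correlograms.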

  16. Unsupervised Discrimination of Patterns in Spiking Neural Networks with Excitatory and Inhibitory Synaptic Plasticity

    Directory of Open Access Journals (Sweden)

    Narayan eSrinivasa

    2014-12-01

    Full Text Available A spiking neural network model is described for learning to discriminate among spatial patterns in an unsupervised manner. The network anatomy consists of source neurons that are activated by external inputs, a reservoir that resembles a generic cortical layer with an excitatory-inhibitory (EI) network and a sink layer of neurons for readout. Synaptic plasticity in the form of STDP is imposed on all the excitatory and inhibitory synapses at all times. While long-term excitatory STDP enables sparse and efficient learning of the salient features in inputs, inhibitory STDP enables this learning to be stable by establishing a balance between excitatory and inhibitory currents at each neuron in the network. The synaptic weights between source and reservoir neurons form a basis set for the input patterns. The neural trajectories generated in the reservoir due to input stimulation and lateral connections between reservoir neurons can be readout by the sink layer neurons. This activity is used for adaptation of synapses between reservoir and sink layer neurons. A new measure called the discriminability index (DI) is introduced to compute if the network can discriminate between old patterns already presented in an initial training session. The DI is also used to compute if the network adapts to new patterns without losing its ability to discriminate among old patterns. The final outcome is that the network is able to correctly discriminate between all patterns – both old and new. This result holds as long as inhibitory synapses employ STDP to continuously enable current balance in the network. The results suggest a possible direction for future investigation into how spiking neural networks could address the stability-plasticity question despite having continuous synaptic plasticity.

  17. A spiking neural network model of model-free reinforcement learning with high-dimensional sensory input and perceptual ambiguity.

    Science.gov (United States)

    Nakano, Takashi; Otsuka, Makoto; Yoshimoto, Junichiro; Doya, Kenji

    2015-01-01

    A theoretical framework of reinforcement learning plays an important role in understanding action selection in animals. Spiking neural networks provide a theoretically grounded means to test computational hypotheses on neurally plausible algorithms of reinforcement learning through numerical simulation. However, most of these models cannot handle observations which are noisy, or occurred in the past, even though these are inevitable and constraining features of learning in real environments. This class of problem is formally known as partially observable reinforcement learning (PORL) problems. It provides a generalization of reinforcement learning to partially observable domains. In addition, observations in the real world tend to be rich and high-dimensional. In this work, we use a spiking neural network model to approximate the free energy of a restricted Boltzmann machine and apply it to the solution of PORL problems with high-dimensional observations. Our spiking network model solves maze tasks with perceptually ambiguous high-dimensional observations without knowledge of the true environment. An extended model with working memory also solves history-dependent tasks. The way spiking neural networks handle PORL problems may provide a glimpse into the underlying laws of neural information processing which can only be discovered through such a top-down approach.

  18. Learning Pitch with STDP: A Computational Model of Place and Temporal Pitch Perception Using Spiking Neural Networks.

    Directory of Open Access Journals (Sweden)

    Nafise Erfanian Saeedi

    2016-04-01

    Full Text Available Pitch perception is important for understanding speech prosody, music perception, recognizing tones in tonal languages, and perceiving speech in noisy environments. The two principal pitch perception theories consider the place of maximum neural excitation along the auditory nerve and the temporal pattern of the auditory neurons' action potentials (spikes) as pitch cues. This paper describes a biophysical mechanism by which fine-structure temporal information can be extracted from the spikes generated at the auditory periphery. Deriving meaningful pitch-related information from spike times requires neural structures specialized in capturing synchronous or correlated activity from amongst neural events. The emergence of such pitch-processing neural mechanisms is described through a computational model of auditory processing. Simulation results show that a correlation-based, unsupervised, spike-based form of Hebbian learning can explain the development of neural structures required for recognizing the pitch of simple and complex tones, with or without the fundamental frequency. The temporal code is robust to variations in the spectral shape of the signal and thus can explain the phenomenon of pitch constancy.
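
    The spike-based Hebbian learning referred to is typically the pairwise exponential STDP window; a generic form is sketched below, with placeholder amplitudes and time constants since the paper's values are not reproduced here.

```python
# Generic pairwise exponential STDP window; amplitudes and time constants are
# placeholders, not the study's fitted parameters.
import numpy as np

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Weight change for a spike pair separated by dt = t_post - t_pre (ms)."""
    dt_ms = np.asarray(dt_ms, dtype=float)
    return np.where(dt_ms >= 0,
                    a_plus * np.exp(-dt_ms / tau_plus),     # pre before post: potentiation
                    -a_minus * np.exp(dt_ms / tau_minus))   # post before pre: depression

print(stdp_dw([-40, -5, 5, 40]))
```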

  19. Anti-correlations in the degree distribution increase stimulus detection performance in noisy spiking neural networks.

    Science.gov (United States)

    Martens, Marijn B; Houweling, Arthur R; E Tiesinga, Paul H

    2017-02-01

    Neuronal circuits in the rodent barrel cortex are characterized by stable low firing rates. However, recent experiments show that short spike trains elicited by electrical stimulation in single neurons can induce behavioral responses. Hence, the underlying neural networks provide stability against internal fluctuations in the firing rate, while simultaneously making the circuits sensitive to small external perturbations. Here we studied whether stability and sensitivity are affected by the connectivity structure in recurrently connected spiking networks. We found that anti-correlation between the number of afferent (in-degree) and efferent (out-degree) synaptic connections of neurons increases stability against pathological bursting, relative to networks where the degrees were either positively correlated or uncorrelated. In the stable network state, stimulation of a few cells could lead to a detectable change in the firing rate. To quantify the ability of networks to detect the stimulation, we used a receiver operating characteristic (ROC) analysis. For a given level of background noise, networks with anti-correlated degrees displayed the lowest false positive rates, and consequently had the highest stimulus detection performance. We propose that anti-correlation in the degree distribution may be a computational strategy employed by sensory cortices to increase the detectability of external stimuli. We show that networks with anti-correlated degrees can in principle be formed by applying learning rules comprised of a combination of spike-timing dependent plasticity, homeostatic plasticity and pruning to networks with uncorrelated degrees. To test our prediction we suggest a novel experimental method to estimate correlations in the degree distribution.
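
    Degree sequences with the anti-correlation studied here can be constructed, for example, by rank-reversed pairing of two independently drawn sequences; the sketch below is one such illustrative construction (a perfect rank anti-correlation; weaker anti-correlations follow from partially shuffling the pairing).

```python
# Generate in-/out-degree sequences with a negative correlation by pairing the
# largest in-degrees with the smallest out-degrees. Illustrative construction
# only; the paper's network-generation procedure may differ.
import numpy as np

rng = np.random.default_rng(0)
n, mean_deg = 1000, 100
k_in = rng.gamma(shape=10.0, scale=mean_deg / 10.0, size=n).round().astype(int)
k_out = np.sort(rng.gamma(shape=10.0, scale=mean_deg / 10.0, size=n)).round().astype(int)

order = np.argsort(k_in)[::-1]          # neurons ranked by decreasing in-degree
k_out_anti = np.empty_like(k_out)
k_out_anti[order] = k_out               # ascending out-degrees assigned in that order

rho = np.corrcoef(k_in, k_out_anti)[0, 1]
print(f"in/out degree correlation: {rho:.2f} (negative by construction)")
```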

  20. FPGA IMPLEMENTATION OF ADAPTIVE INTEGRATED SPIKING NEURAL NETWORK FOR EFFICIENT IMAGE RECOGNITION SYSTEM

    Directory of Open Access Journals (Sweden)

    T. Pasupathi

    2014-05-01

    Full Text Available Image recognition is a technology which can be used in various applications such as medical image recognition systems, security, defense video tracking, and factory automation. In this paper we present a novel pipelined architecture of an adaptive integrated Artificial Neural Network for image recognition. In our proposed work we have combined the feature of spiking neuron concept with ANN to achieve the efficient architecture for image recognition. The set of training images are trained by ANN and target output has been identified. Real time videos are captured and then converted into frames for testing purpose and the image were recognized. The machine can operate at up to 40 frames/sec using images acquired from the camera. The system has been implemented on XC3S400 SPARTAN-3 Field Programmable Gate Arrays.

  1. A stochastic-field description of finite-size spiking neural networks.

    Science.gov (United States)

    Dumont, Grégory; Payeur, Alexandre; Longtin, André

    2017-08-01

    Neural network dynamics are governed by the interaction of spiking neurons. Stochastic aspects of single-neuron dynamics propagate up to the network level and shape the dynamical and informational properties of the population. Mean-field models of population activity disregard the finite-size stochastic fluctuations of network dynamics and thus offer a deterministic description of the system. Here, we derive a stochastic partial differential equation (SPDE) describing the temporal evolution of the finite-size refractory density, which represents the proportion of neurons in a given refractory state at any given time. The population activity-the density of active neurons per unit time-is easily extracted from this refractory density. The SPDE includes finite-size effects through a two-dimensional Gaussian white noise that acts both in time and along the refractory dimension. For an infinite number of neurons the standard mean-field theory is recovered. A discretization of the SPDE along its characteristic curves allows direct simulations of the activity of large but finite spiking networks; this constitutes the main advantage of our approach. Linearizing the SPDE with respect to the deterministic asynchronous state allows the theoretical investigation of finite-size activity fluctuations. In particular, analytical expressions for the power spectrum and autocorrelation of activity fluctuations are obtained. Moreover, our approach can be adapted to incorporate multiple interacting populations and quasi-renewal single-neuron dynamics.

  2. Spiking neural network model for memorizing sequences with forward and backward recall.

    Science.gov (United States)

    Borisyuk, Roman; Chik, David; Kazanovich, Yakov; da Silva Gomes, João

    2013-06-01

    We present an oscillatory network of conductance based spiking neurons of Hodgkin-Huxley type as a model of memory storage and retrieval of sequences of events (or objects). The model is inspired by psychological and neurobiological evidence on sequential memories. The building block of the model is an oscillatory module which contains excitatory and inhibitory neurons with all-to-all connections. The connection architecture comprises two layers. A lower layer represents consecutive events during their storage and recall. This layer is composed of oscillatory modules. Plastic excitatory connections between the modules are implemented using an STDP type learning rule for sequential storage. Excitatory neurons in the upper layer project star-like modifiable connections toward the excitatory lower layer neurons. These neurons in the upper layer are used to tag sequences of events represented in the lower layer. Computer simulations demonstrate good performance of the model including difficult cases when different sequences contain overlapping events. We show that the model with STDP type or anti-STDP type learning rules can be applied for the simulation of forward and backward replay of neural spikes respectively. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  3. Lateral Information Processing by Spiking Neurons: A Theoretical Model of the Neural Correlate of Consciousness

    Directory of Open Access Journals (Sweden)

    Marc Ebner

    2011-01-01

    Full Text Available Cognitive brain functions, for example, sensory perception, motor control and learning, are understood as computation by axonal-dendritic chemical synapses in networks of integrate-and-fire neurons. Cognitive brain functions may occur either consciously or nonconsciously (on “autopilot”). Conscious cognition is marked by gamma synchrony EEG, mediated largely by dendritic-dendritic gap junctions, sideways connections in input/integration layers. Gap-junction-connected neurons define a sub-network within a larger neural network. A theoretical model (the “conscious pilot”) suggests that as gap junctions open and close, a gamma-synchronized subnetwork, or zone moves through the brain as an executive agent, converting nonconscious “auto-pilot” cognition to consciousness, and enhancing computation by coherent processing and collective integration. In this study we implemented sideways “gap junctions” in a single-layer artificial neural network to perform figure/ground separation. The set of neurons connected through gap junctions form a reconfigurable resistive grid or sub-network zone. In the model, outgoing spikes are temporally integrated and spatially averaged using the fixed resistive grid set up by neurons of similar function which are connected through gap-junctions. This spatial average, essentially a feedback signal from the neuron's output, determines whether particular gap junctions between neurons will open or close. Neurons connected through open gap junctions synchronize their output spikes. We have tested our gap-junction-defined sub-network in a one-layer neural network on artificial retinal inputs using real-world images. Our system is able to perform figure/ground separation where the laterally connected sub-network of neurons represents a perceived object. Even though we only show results for visual stimuli, our approach should generalize to other modalities. The system demonstrates a moving sub-network zone of

  4. Recording human cortical population spikes non-invasively--An EEG tutorial.

    Science.gov (United States)

    Waterstraat, Gunnar; Fedele, Tommaso; Burghoff, Martin; Scheer, Hans-Jürgen; Curio, Gabriel

    2015-07-30

    Non-invasively recorded somatosensory high-frequency oscillations (sHFOs) evoked by electric nerve stimulation are markers of human cortical population spikes. Previously, their analysis was based on massive averaging of EEG responses. Advanced neurotechnology and optimized off-line analysis can enhance the signal-to-noise ratio of sHFOs, eventually enabling single-trial analysis. The rationale for developing dedicated low-noise EEG technology for sHFOs is unfolded. Detailed recording procedures and tailored analysis principles are explained step-by-step. Source codes in Matlab and Python are provided as supplementary material online. Combining synergistic hardware and analysis improvements, evoked sHFOs at around 600 Hz ('σ-bursts') can be studied in single-trials. Additionally, optimized spatial filters increase the signal-to-noise ratio of components at about 1 kHz ('κ-bursts') enabling their detection in non-invasive surface EEG. sHFOs offer a unique possibility to record evoked human cortical population spikes non-invasively. The experimental approaches and algorithms presented here enable also non-specialized EEG laboratories to combine measurements of conventional low-frequency EEG with the analysis of concomitant cortical population spike responses. Copyright © 2014 Elsevier B.V. All rights reserved.
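
    The core signal-processing step, isolating the burst around 600 Hz from stimulus-locked EEG epochs, can be sketched as follows; the filter corners and sampling rate are assumptions for illustration, not the optimized pipeline from the paper's supplementary code.

```python
# Isolate the ~600 Hz sigma-burst band from trial-averaged somatosensory
# evoked responses. Filter corners (450-750 Hz) and the sampling rate are
# illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 5000.0                                        # assumed EEG sampling rate in Hz
epochs = np.random.randn(500, int(0.1 * fs))       # 500 stimulus-locked trials, 100 ms each

evoked = epochs.mean(axis=0)                       # conventional low-frequency SEP
b, a = butter(4, [450.0 / (fs / 2), 750.0 / (fs / 2)], btype="band")
sigma_burst = filtfilt(b, a, evoked)               # high-frequency burst riding on the SEP

# Single-trial variant: filter each epoch, then inspect trial-to-trial variability
single_trial = filtfilt(b, a, epochs, axis=1)
print(sigma_burst.shape, single_trial.shape)
```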

  5. Functional recordings from awake, behaving rodents through a microchannel based regenerative neural interface

    Science.gov (United States)

    Gore, Russell K.; Choi, Yoonsu; Bellamkonda, Ravi; English, Arthur

    2015-02-01

    Objective. Neural interface technologies could provide controlling connections between the nervous system and external technologies, such as limb prosthetics. The recording of efferent, motor potentials is a critical requirement for a peripheral neural interface, as these signals represent the user-generated neural output intended to drive external devices. Our objective was to evaluate structural and functional neural regeneration through a microchannel neural interface and to characterize potentials recorded from electrodes placed within the microchannels in awake and behaving animals. Approach. Female rats were implanted with muscle EMG electrodes and, following unilateral sciatic nerve transection, the cut nerve was repaired either across a microchannel neural interface or with end-to-end surgical repair. During a 13 week recovery period, direct muscle responses to nerve stimulation proximal to the transection were monitored weekly. In two rats repaired with the neural interface, four wire electrodes were embedded in the microchannels and recordings were obtained within microchannels during proximal stimulation experiments and treadmill locomotion. Main results. In these proof-of-principle experiments, we found that axons from cut nerves were capable of functional reinnervation of distal muscle targets, whether regenerating through a microchannel device or after direct end-to-end repair. Discrete stimulation-evoked and volitional potentials were recorded within interface microchannels in a small group of awake and behaving animals and their firing patterns correlated directly with intramuscular recordings during locomotion. Of 38 potentials extracted, 19 were identified as motor axons reinnervating tibialis anterior or soleus muscles using spike triggered averaging. Significance. These results are evidence for motor axon regeneration through microchannels and are the first report of in vivo recordings from regenerated motor axons within microchannels in a small
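
    Spike-triggered averaging of the EMG around each recorded spike, the procedure used to attribute microchannel potentials to reinnervating motor axons, can be sketched as follows on synthetic data.

```python
# Spike-triggered averaging of an EMG trace around spike times recorded in the
# microchannels; the data here are synthetic stand-ins.
import numpy as np

def spike_triggered_average(emg, spike_idx, fs, pre_ms=10.0, post_ms=30.0):
    pre, post = int(pre_ms * 1e-3 * fs), int(post_ms * 1e-3 * fs)
    windows = [emg[i - pre:i + post] for i in spike_idx
               if i - pre >= 0 and i + post <= emg.size]
    return np.mean(windows, axis=0)

fs = 10000.0
emg = np.random.randn(int(10 * fs))                          # 10 s of synthetic EMG
spike_idx = np.sort(np.random.choice(emg.size, 200, False))  # microchannel spike samples
sta = spike_triggered_average(emg, spike_idx, fs)
print("STA window length (samples):", sta.size)
```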

  6. A neuromorphic implementation of multiple spike-timing synaptic plasticity rules for large-scale neural networks

    Directory of Open Access Journals (Sweden)

    Runchun Mark Wang

    2015-05-01

    Full Text Available We present a neuromorphic implementation of multiple synaptic plasticity learning rules, which include both Spike Timing Dependent Plasticity (STDP) and Spike Timing Dependent Delay Plasticity (STDDP). We present a fully digital implementation as well as a mixed-signal implementation, both of which use a novel dynamic-assignment time-multiplexing approach and support up to 2^26 (64M) synaptic plasticity elements. Rather than implementing dedicated synapses for particular types of synaptic plasticity, we implemented a more generic synaptic plasticity adaptor array that is separate from the neurons in the neural network. Each adaptor performs synaptic plasticity according to the arrival times of the pre- and post-synaptic spikes assigned to it, and sends out a weighted and/or delayed pre-synaptic spike to the target synapse in the neural network. This strategy provides great flexibility for building complex large-scale neural networks, as a neural network can be configured for multiple synaptic plasticity rules without changing its structure. We validate the proposed neuromorphic implementations with measurement results and illustrate that the circuits are capable of performing both STDP and STDDP. We argue that it is practical to scale the work presented here up to 2^36 (64G) synaptic adaptors on a current high-end FPGA platform.

  7. A neuromorphic implementation of multiple spike-timing synaptic plasticity rules for large-scale neural networks.

    Science.gov (United States)

    Wang, Runchun M; Hamilton, Tara J; Tapson, Jonathan C; van Schaik, André

    2015-01-01

    We present a neuromorphic implementation of multiple synaptic plasticity learning rules, which include both Spike Timing Dependent Plasticity (STDP) and Spike Timing Dependent Delay Plasticity (STDDP). We present a fully digital implementation as well as a mixed-signal implementation, both of which use a novel dynamic-assignment time-multiplexing approach and support up to 2^26 (64M) synaptic plasticity elements. Rather than implementing dedicated synapses for particular types of synaptic plasticity, we implemented a more generic synaptic plasticity adaptor array that is separate from the neurons in the neural network. Each adaptor performs synaptic plasticity according to the arrival times of the pre- and post-synaptic spikes assigned to it, and sends out a weighted or delayed pre-synaptic spike to the post-synaptic neuron in the neural network. This strategy provides great flexibility for building complex large-scale neural networks, as a neural network can be configured for multiple synaptic plasticity rules without changing its structure. We validate the proposed neuromorphic implementations with measurement results and illustrate that the circuits are capable of performing both STDP and STDDP. We argue that it is practical to scale the work presented here up to 2^36 (64G) synaptic adaptors on a current high-end FPGA platform.

  8. Using Stochastic Spiking Neural Networks on SpiNNaker to Solve Constraint Satisfaction Problems

    Science.gov (United States)

    Fonseca Guerra, Gabriel A.; Furber, Steve B.

    2017-01-01

    Constraint satisfaction problems (CSP) are at the core of numerous scientific and technological applications. However, CSPs belong to the NP-complete complexity class, for which the existence (or not) of efficient algorithms remains a major unsolved question in computational complexity theory. In the face of this fundamental difficulty heuristics and approximation methods are used to approach instances of NP (e.g., decision and hard optimization problems). The human brain efficiently handles CSPs both in perception and behavior using spiking neural networks (SNNs), and recent studies have demonstrated that the noise embedded within an SNN can be used as a computational resource to solve CSPs. Here, we provide a software framework for the implementation of such noisy neural solvers on the SpiNNaker massively parallel neuromorphic hardware, further demonstrating their potential to implement a stochastic search that solves instances of P and NP problems expressed as CSPs. This facilitates the exploration of new optimization strategies and the understanding of the computational abilities of SNNs. We demonstrate the basic principles of the framework by solving difficult instances of the Sudoku puzzle and of the map color problem, and explore its application to spin glasses. The solver works as a stochastic dynamical system, which is attracted by the configuration that solves the CSP. The noise allows an optimal exploration of the space of configurations, looking for the satisfiability of all the constraints; if applied discontinuously, it can also force the system to leap to a new random configuration effectively causing a restart.

  9. Using Stochastic Spiking Neural Networks on SpiNNaker to Solve Constraint Satisfaction Problems

    Directory of Open Access Journals (Sweden)

    Gabriel A. Fonseca Guerra

    2017-12-01

    Full Text Available Constraint satisfaction problems (CSP) are at the core of numerous scientific and technological applications. However, CSPs belong to the NP-complete complexity class, for which the existence (or not) of efficient algorithms remains a major unsolved question in computational complexity theory. In the face of this fundamental difficulty heuristics and approximation methods are used to approach instances of NP (e.g., decision and hard optimization problems). The human brain efficiently handles CSPs both in perception and behavior using spiking neural networks (SNNs), and recent studies have demonstrated that the noise embedded within an SNN can be used as a computational resource to solve CSPs. Here, we provide a software framework for the implementation of such noisy neural solvers on the SpiNNaker massively parallel neuromorphic hardware, further demonstrating their potential to implement a stochastic search that solves instances of P and NP problems expressed as CSPs. This facilitates the exploration of new optimization strategies and the understanding of the computational abilities of SNNs. We demonstrate the basic principles of the framework by solving difficult instances of the Sudoku puzzle and of the map color problem, and explore its application to spin glasses. The solver works as a stochastic dynamical system, which is attracted by the configuration that solves the CSP. The noise allows an optimal exploration of the space of configurations, looking for the satisfiability of all the constraints; if applied discontinuously, it can also force the system to leap to a new random configuration effectively causing a restart.

  10. Bayesian filtering in spiking neural networks: noise, adaptation, and multisensory integration.

    Science.gov (United States)

    Bobrowski, Omer; Meir, Ron; Eldar, Yonina C

    2009-05-01

    A key requirement facing organisms acting in uncertain dynamic environments is the real-time estimation and prediction of environmental states, based on which effective actions can be selected. While it is becoming evident that organisms employ exact or approximate Bayesian statistical calculations for these purposes, it is far less clear how these putative computations are implemented by neural networks in a strictly dynamic setting. In this work, we make use of rigorous mathematical results from the theory of continuous time point process filtering and show how optimal real-time state estimation and prediction may be implemented in a general setting using simple recurrent neural networks. The framework is applicable to many situations of common interest, including noisy observations, non-Poisson spike trains (incorporating adaptation), multisensory integration, and state prediction. The optimal network properties are shown to relate to the statistical structure of the environment, and the benefits of adaptation are studied and explicitly demonstrated. Finally, we recover several existing results as appropriate limits of our general setting.

  11. A customizable stochastic state point process filter (SSPPF) for neural spiking activity.

    Science.gov (United States)

    Xin, Yao; Li, Will X Y; Min, Biao; Han, Yan; Cheung, Ray C C

    2013-01-01

    Stochastic State Point Process Filter (SSPPF) is effective for adaptive signal processing. In particular, it has been successfully applied to neural signal coding/decoding in recent years. Recent work has proven its efficiency in non-parametric coefficients tracking in modeling of mammal nervous system. However, existing SSPPF has only been realized in commercial software platforms which limit their computational capability. In this paper, the first hardware architecture of SSPPF has been designed and successfully implemented on field-programmable gate array (FPGA), proving a more efficient means for coefficient tracking in a well-established generalized Laguerre-Volterra model for mammalian hippocampal spiking activity research. By exploring the intrinsic parallelism of the FPGA, the proposed architecture is able to process matrices or vectors with random size, and is efficiently scalable. Experimental result shows its superior performance comparing to the software implementation, while maintaining the numerical precision. This architecture can also be potentially utilized in the future hippocampal cognitive neural prosthesis design.

  12. The effects of dynamical synapses on firing rate activity: a spiking neural network model.

    Science.gov (United States)

    Khalil, Radwa; Moftah, Marie Z; Moustafa, Ahmed A

    2017-11-01

    Accumulating evidence relates the fine-tuning of synaptic maturation and regulation of neural network activity to several key factors, including GABA A signaling and a lateral spread length between neighboring neurons (i.e., local connectivity). Furthermore, a number of studies consider short-term synaptic plasticity (STP) as an essential element in the instant modification of synaptic efficacy in the neuronal network and in modulating responses to sustained ranges of external Poisson input frequency (IF). Nevertheless, evaluating the firing activity in response to the dynamical interaction between STP (triggered by ranges of IF) and these key parameters in vitro remains elusive. Therefore, we designed a spiking neural network (SNN) model in which we incorporated the following parameters: local density of arbor essences and a lateral spread length between neighboring neurons. We also created several network scenarios based on these key parameters. Then, we implemented two classes of STP: (1) short-term synaptic depression (STD) and (2) short-term synaptic facilitation (STF). Each class has two differential forms based on the parametric value of its synaptic time constant (either for depressing or facilitating synapses). Lastly, we compared the neural firing responses before and after the treatment with STP. We found that dynamical synapses (STP) have a critical differential role on evaluating and modulating the firing rate activity in each network scenario. Moreover, we investigated the impact of changing the balance between excitation (E) and inhibition (I) on stabilizing this firing activity. © 2017 Federation of European Neuroscience Societies and John Wiley & Sons Ltd.
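
    The two STP classes contrasted in the model, short-term depression and facilitation, are commonly described by Tsodyks-Markram dynamics; an event-based sketch with textbook-style placeholder parameters is given below, and is not the study's fitted model.

```python
# Event-based Tsodyks-Markram dynamics for short-term depression (STD) and
# facilitation (STF); parameter values are illustrative placeholders.
import numpy as np

def stp_amplitudes(spike_times, U, tau_d, tau_f):
    """Relative synaptic efficacy at each presynaptic spike time (seconds)."""
    x, u, t_last, amps = 1.0, U, None, []
    for t in spike_times:
        if t_last is not None:
            dt = t - t_last
            x = 1.0 - (1.0 - x) * np.exp(-dt / tau_d)   # resources recover toward 1
            u = U + (u - U) * np.exp(-dt / tau_f)        # utilization decays toward U
        u = u + U * (1.0 - u)                            # facilitation jump at the spike
        amps.append(u * x)                               # fraction of resources released
        x = x - u * x                                    # depletion after release
        t_last = t
    return np.array(amps)

train = np.arange(10) * 0.05                             # 20 Hz presynaptic train
print("depressing  :", stp_amplitudes(train, U=0.5, tau_d=0.8, tau_f=0.05).round(2))
print("facilitating:", stp_amplitudes(train, U=0.1, tau_d=0.1, tau_f=1.0).round(2))
```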

  13. An Event-Driven Classifier for Spiking Neural Networks Fed with Synthetic or Dynamic Vision Sensor Data

    Directory of Open Access Journals (Sweden)

    Evangelos Stromatias

    2017-06-01

    Full Text Available This paper introduces a novel methodology for training an event-driven classifier within a Spiking Neural Network (SNN) System capable of yielding good classification results when using both synthetic input data and real data captured from Dynamic Vision Sensor (DVS) chips. The proposed supervised method uses the spiking activity provided by an arbitrary topology of prior SNN layers to build histograms and train the classifier in the frame domain using the stochastic gradient descent algorithm. In addition, this approach can cope with leaky integrate-and-fire neuron models within the SNN, a desirable feature for real-world SNN applications, where neural activation must fade away after some time in the absence of inputs. Consequently, this way of building histograms captures the dynamics of spikes immediately before the classifier. We tested our method on the MNIST data set using different synthetic encodings and real DVS sensory data sets such as N-MNIST, MNIST-DVS, and Poker-DVS using the same network topology and feature maps. We demonstrate the effectiveness of our approach by achieving the highest classification accuracy reported on the N-MNIST (97.77%) and Poker-DVS (100%) real DVS data sets to date with a spiking convolutional network. Moreover, by using the proposed method we were able to retrain the output layer of a previously reported spiking neural network and increase its performance by 2%, suggesting that the proposed classifier can be used as the output layer in works where features are extracted using unsupervised spike-based learning methods. In addition, we also analyze SNN performance figures such as total event activity and network latencies, which are relevant for eventual hardware implementations. In summary, the paper aggregates unsupervised-trained SNNs with a supervised-trained SNN classifier, combining and applying them to heterogeneous sets of benchmarks, both synthetic and from real DVS chips.

  14. An Event-Driven Classifier for Spiking Neural Networks Fed with Synthetic or Dynamic Vision Sensor Data.

    Science.gov (United States)

    Stromatias, Evangelos; Soto, Miguel; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabé

    2017-01-01

    This paper introduces a novel methodology for training an event-driven classifier within a Spiking Neural Network (SNN) System capable of yielding good classification results when using both synthetic input data and real data captured from Dynamic Vision Sensor (DVS) chips. The proposed supervised method uses the spiking activity provided by an arbitrary topology of prior SNN layers to build histograms and train the classifier in the frame domain using the stochastic gradient descent algorithm. In addition, this approach can cope with leaky integrate-and-fire neuron models within the SNN, a desirable feature for real-world SNN applications, where neural activation must fade away after some time in the absence of inputs. Consequently, this way of building histograms captures the dynamics of spikes immediately before the classifier. We tested our method on the MNIST data set using different synthetic encodings and real DVS sensory data sets such as N-MNIST, MNIST-DVS, and Poker-DVS using the same network topology and feature maps. We demonstrate the effectiveness of our approach by achieving the highest classification accuracy reported on the N-MNIST (97.77%) and Poker-DVS (100%) real DVS data sets to date with a spiking convolutional network. Moreover, by using the proposed method we were able to retrain the output layer of a previously reported spiking neural network and increase its performance by 2%, suggesting that the proposed classifier can be used as the output layer in works where features are extracted using unsupervised spike-based learning methods. In addition, we also analyze SNN performance figures such as total event activity and network latencies, which are relevant for eventual hardware implementations. In summary, the paper aggregates unsupervised-trained SNNs with a supervised-trained SNN classifier, combining and applying them to heterogeneous sets of benchmarks, both synthetic and from real DVS chips.
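
    The classifier-training step of this method can be sketched as follows: accumulate per-neuron spike-count histograms from the final SNN layer for each example, then fit a linear readout with stochastic gradient descent in the frame domain. The SNN feature-extraction layers and DVS handling are not reproduced; the spike counts below are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(0)

def spike_histogram(spike_events, n_neurons):
    """Count spikes per output neuron for one example (event list -> histogram)."""
    hist = np.zeros(n_neurons)
    for neuron_id, _timestamp in spike_events:
        hist[neuron_id] += 1
    return hist

# Synthetic stand-in for the activity of the last SNN layer: 64 neurons, 2 classes
# that differ in which half of the neurons fires most.
n_neurons, n_classes, n_examples = 64, 2, 400
X = np.zeros((n_examples, n_neurons))
y = rng.integers(0, n_classes, n_examples)
for i in range(n_examples):
    rates = np.full(n_neurons, 2.0)
    rates[y[i] * 32:(y[i] + 1) * 32] = 8.0        # class-specific hot half
    events = [(n, t) for n in range(n_neurons)
              for t in range(rng.poisson(rates[n]))]
    X[i] = spike_histogram(events, n_neurons)

# Linear softmax readout trained in the "frame domain" with plain SGD.
W = np.zeros((n_neurons, n_classes))
lr = 0.01
for epoch in range(20):
    for i in rng.permutation(n_examples):
        logits = X[i] @ W
        p = np.exp(logits - logits.max()); p /= p.sum()
        grad = np.outer(X[i], p - np.eye(n_classes)[y[i]])
        W -= lr * grad
acc = np.mean(np.argmax(X @ W, axis=1) == y)
print(f"training accuracy of the histogram readout: {acc:.2f}")
```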

  15. An optical microsystem for wireless neural recording.

    Science.gov (United States)

    Wei, P; Ziaie, B

    2009-01-01

    In this paper, we describe an optical microsystem for wireless neural recording. The system incorporated recording electrodes, integrated electronics, surface-mount LEDs, and a CCD camera. The components were mounted on a PCB platform with a total dimension of 2.2 x 2.2 cm², carrying 4 integrated biopotential amplifiers (IBAs) and 16 LEDs. The IBAs, with a bandwidth of 0.1-93.5 Hz and a midband gain of 38 dB, were fabricated using AMI 1.6 µm technology. The simulated local field potentials (LFPs) were amplified and used to drive the LEDs. A CCD camera with a temporal resolution of 30 FPS was used to capture the image and retrieve the signal.

  16. Design of silicon brains in the nano-CMOS era: spiking neurons, learning synapses and neural architecture optimization.

    Science.gov (United States)

    Cassidy, Andrew S; Georgiou, Julius; Andreou, Andreas G

    2013-09-01

    We present a design framework for neuromorphic architectures in the nano-CMOS era. Our approach to the design of spiking neurons and STDP learning circuits relies on parallel computational structures where neurons are abstracted as digital arithmetic logic units and communication processors. Using this approach, we have developed arrays of silicon neurons that scale to millions of neurons in a single state-of-the-art Field Programmable Gate Array (FPGA). We demonstrate the validity of the design methodology through the implementation of cortical development in a circuit of spiking neurons, STDP synapses, and neural architecture optimization. Copyright © 2013 Elsevier Ltd. All rights reserved.

  17. Feasibility Study of Extended-Gate-Type Silicon Nanowire Field-Effect Transistors for Neural Recording

    Science.gov (United States)

    Kang, Hongki; Kim, Jee-Yeon; Choi, Yang-Kyu; Nam, Yoonkey

    2017-01-01

    In this research, a high-performance silicon nanowire field-effect transistor (with transconductance as high as 34 µS and sensitivity of 84 nS/mV) is extensively studied and directly compared with planar passive microelectrode arrays for neural recording applications. Electrical and electrochemical characteristics are carefully characterized in a well-controlled manner. We especially focused on the signal amplification capability and intrinsic noise of the transistors. A neural recording system using both a silicon nanowire field-effect transistor-based active-type microelectrode array and a platinum black microelectrode-based passive-type microelectrode array is implemented, and the two are compared. An artificial neural spike signal is supplied as input to both arrays through a buffer solution and recorded simultaneously. The signal intensity recorded by the silicon nanowire transistor was precisely determined by an electrical characteristic of the transistor, its transconductance. The signal-to-noise ratio was found to be strongly dependent upon the intrinsic 1/f noise of the silicon nanowire transistor. We show how the signal strength is determined and how the intrinsic noise of the transistor determines the signal-to-noise ratio of the recorded neural signals. This study provides an in-depth understanding of the overall neural recording mechanism using silicon nanowire transistors and a solid design guideline for further improvement and development. PMID:28350370

  18. Learning expectation in insects: a recurrent spiking neural model for spatio-temporal representation.

    Science.gov (United States)

    Arena, Paolo; Patané, Luca; Termini, Pietro Savio

    2012-08-01

    Insects are becoming a reference point in Neuroscience for the study of biological aspects at the basis of cognitive processes. These animals have much simpler brains than higher animals, while showing, at the same time, an impressive capability to react adaptively and take decisions when facing complex environmental situations. In this paper we propose a neural model inspired by the insect olfactory system, with particular attention to the fruit fly Drosophila melanogaster. This architecture is a multilayer spiking network, where each layer is inspired by the structures of the insect brain mainly involved in olfactory information processing, namely the Mushroom Bodies, the Lateral Horns and the Antennal Lobes. In the Antennal Lobes layer olfactory signals lead to a competition among sets of neurons, resulting in a pattern which is projected to the Mushroom Bodies layer. Here a competitive reaction-diffusion process leads to the spontaneous emergence of clusters. The Lateral Horns have been modeled as a delayed input-triggered resetting system. Using plastic recurrent connections, with the addition of simple learning mechanisms, the structure is able to realize a top-down modulation at the input level. This leads to the emergence of an attentional loop as well as to the emergence of basic expectation behaviors for subsequently presented stimuli. Simulation results and an analysis of the biological plausibility of the architecture are provided, and the role of noise in the network is reported. Copyright © 2012 Elsevier Ltd. All rights reserved.

  19. Self-organization of spiking neural network that generates autonomous behavior in a real mobile robot.

    Science.gov (United States)

    Alnajjar, Fady; Murase, Kazuyuki

    2006-08-01

    In this paper, we propose a self-organization algorithm for spiking neural networks (SNNs) applicable to autonomous robots for the generation of adaptive and goal-directed behavior. First, we formulated an SNN model whose inputs and outputs were analog and whose hidden units were interconnected with each other. Next, we implemented it in a miniature mobile robot, the Khepera. In order to see whether or not a solution for the given tasks exists with the SNN, the robot was evolved with a genetic algorithm in the environment. The robot acquired the obstacle avoidance and navigation tasks successfully, demonstrating that a solution exists. After that, a self-organization algorithm based on use-dependent synaptic potentiation and depotentiation at the synapses from the input layer to the hidden layer and from the hidden layer to the output layer was formulated and implemented in the robot. In the environment, the robot incrementally organized the network and the given tasks were successfully performed. The time needed to acquire the desired adaptive and goal-directed behavior using the proposed self-organization method was much less than that with genetic evolution, approximately one fifth.

  20. Robust working memory in an asynchronously spiking neural network realized in neuromorphic VLSI

    Directory of Open Access Journals (Sweden)

    Massimiliano eGiulioni

    2012-02-01

    Full Text Available We demonstrate bistable attractor dynamics in a spiking neural network implemented with neuromorphic VLSI hardware. The on-chip network consists of three interacting populations (two excitatory, one inhibitory) of integrate-and-fire (LIF) neurons. One excitatory population is distinguished by strong synaptic self-excitation, which sustains meta-stable states of ‘high’ and ‘low’-firing activity. Depending on the overall excitability, transitions to the ‘high’ state may be evoked by external stimulation, or may occur spontaneously due to random activity fluctuations. In the former case, the ‘high’ state retains a working memory of a stimulus until well after its release. In the latter case, ‘high’ states remain stable for seconds, three orders of magnitude longer than the largest time-scale implemented in the circuitry. Evoked and spontaneous transitions form a continuum and may exhibit a wide range of latencies, depending on the strength of external stimulation and of recurrent synaptic excitation. In addition, we investigated corrupted ‘high’ states comprising neurons of both excitatory populations. Within a basin of attraction, the network dynamics corrects such states and re-establishes the prototypical ‘high’ state. We conclude that, with effective theoretical guidance, full-fledged attractor dynamics can be realized with comparatively small populations of neuromorphic hardware neurons.

  1. Acceleration of spiking neural network based pattern recognition on NVIDIA graphics processors.

    Science.gov (United States)

    Han, Bing; Taha, Tarek M

    2010-04-01

    There is currently a strong push in the research community to develop biological-scale implementations of neuron-based vision models. Systems at this scale are computationally demanding and generally utilize more accurate neuron models, such as the Izhikevich and Hodgkin-Huxley models, rather than the more popular integrate-and-fire model. We examine the feasibility of using graphics processing units (GPUs) to accelerate a spiking neural network based character recognition network to enable such large-scale systems. Two versions of the network utilizing the Izhikevich and Hodgkin-Huxley models are implemented. Three NVIDIA general-purpose (GP) GPU platforms are examined, including the GeForce 9800 GX2, the Tesla C1060, and the Tesla S1070. Our results show that the GPGPUs can provide significant speedup over conventional processors. In particular, the fastest GPGPU utilized, the Tesla S1070, provided speedups of 5.6 and 84.4 over highly optimized implementations on the fastest central processing unit (CPU) tested, a quadcore 2.67 GHz Xeon processor, for the Izhikevich and the Hodgkin-Huxley models, respectively. The CPU implementation utilized all four cores and the vector data parallelism offered by the processor. The results indicate that GPUs are well suited for this application domain.
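
    For orientation, the per-neuron arithmetic that such GPU kernels parallelize is small; the sketch below is a vectorized Izhikevich update over a population (the model equations only, not the authors' GPU code).

```python
import numpy as np

def izhikevich_step(v, u, I, a, b, c, d, dt=1.0):
    """One Izhikevich update (dt in ms) for a whole population at once.

    dv/dt = 0.04 v^2 + 5 v + 140 - u + I
    du/dt = a (b v - u)
    spike when v >= 30 mV, then v <- c and u <- u + d.
    """
    v = v + dt * (0.04 * v**2 + 5.0 * v + 140.0 - u + I)
    u = u + dt * a * (b * v - u)
    fired = v >= 30.0
    v = np.where(fired, c, v)
    u = np.where(fired, u + d, u)
    return v, u, fired

# Regular-spiking parameters, 1000 neurons, 1 s of noisy drive.
rng = np.random.default_rng(0)
n, steps = 1000, 1000
a, b, c, d = 0.02, 0.2, -65.0, 8.0
v, u = np.full(n, -65.0), np.full(n, -65.0 * b)
spike_count = 0
for t in range(steps):
    I = 5.0 + 5.0 * rng.normal(size=n)
    v, u, fired = izhikevich_step(v, u, I, a, b, c, d)
    spike_count += fired.sum()
print(f"mean firing rate: {spike_count / n:.1f} Hz")
```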

  2. Event management for large scale event-driven digital hardware spiking neural networks.

    Science.gov (United States)

    Caron, Louis-Charles; D'Haene, Michiel; Mailhot, Frédéric; Schrauwen, Benjamin; Rouat, Jean

    2013-09-01

    The interest in brain-like computation has led to the design of a plethora of innovative neuromorphic systems. Individually, spiking neural networks (SNNs), event-driven simulation and digital hardware neuromorphic systems get a lot of attention. Despite the popularity of event-driven SNNs in software, very few digital hardware architectures are found. This is because existing hardware solutions for event management scale badly with the number of events. This paper introduces the structured heap queue, a pipelined digital hardware data structure, and demonstrates its suitability for event management. The structured heap queue scales gracefully with the number of events, allowing the efficient implementation of large scale digital hardware event-driven SNNs. The scaling is linear for memory, logarithmic for logic resources and constant for processing time. The use of the structured heap queue is demonstrated on a field-programmable gate array (FPGA) with an image segmentation experiment and a SNN of 65,536 neurons and 513,184 synapses. Events can be processed at the rate of 1 every 7 clock cycles and a 406×158 pixel image is segmented in 200 ms. Copyright © 2013 Elsevier Ltd. All rights reserved.
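
    In software, the role of the structured heap queue is usually played by a binary heap; the sketch below uses Python's heapq as a functional stand-in to show how delayed spike events are queued and delivered in order (the pipelining that gives the hardware its constant processing time is not captured).

```python
import heapq

class EventQueue:
    """Ordered delivery of (time, target_neuron, weight) spike events."""
    def __init__(self):
        self._heap = []
    def push(self, t, target, weight):
        heapq.heappush(self._heap, (t, target, weight))
    def pop_due(self, now):
        """Yield all events whose delivery time has been reached."""
        while self._heap and self._heap[0][0] <= now:
            yield heapq.heappop(self._heap)

# Minimal event-driven loop: neuron 0 spikes at t=0 and fans out with delays.
queue = EventQueue()
fanout = {0: [(1, 1.5, 0.4), (2, 2.0, 0.7)],   # target, delay (ms), weight
          1: [(2, 1.0, 0.9)],
          2: []}
potential = {0: 0.0, 1: 0.0, 2: 0.0}
threshold = 0.5

queue.push(0.0, 0, 1.0)                          # external kick to neuron 0
for now in [0.0, 1.0, 2.0, 3.0, 4.0]:            # coarse outer time loop
    for t, target, w in queue.pop_due(now):
        potential[target] += w
        if potential[target] >= threshold:       # fire and schedule deliveries
            potential[target] = 0.0
            for tgt, delay, weight in fanout[target]:
                queue.push(t + delay, tgt, weight)
print("final potentials:", potential)
```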

  3. Topological dynamics in spike-timing dependent plastic model neural networks

    Directory of Open Access Journals (Sweden)

    David B. Stone

    2013-04-01

    Full Text Available Spike-timing dependent plasticity (STDP is a biologically constrained unsupervised form of learning that potentiates or depresses synaptic connections based on the precise timing of pre-synaptic and post-synaptic firings. The effects of on-going STDP on the topology of evolving model neural networks were assessed in 50 unique simulations which modeled two hours of activity. After a period of stabilization, a number of global and local topological features were monitored periodically to quantify on-going changes in network structure. Global topological features included the total number of remaining synapses, average synaptic strengths, and average number of synapses per neuron (degree. Under a range of different input regimes and initial network configurations, each network maintained a robust and highly stable global structure across time. Local topology was monitored by assessing state changes of all three-neuron subgraphs (triads present in the networks. Overall counts and the range of triad configurations varied little across the simulations; however, a substantial set of individual triads continued to undergo rapid state changes and revealed a dynamic local topology. In addition, specific small-world properties also fluctuated across time. These findings suggest that on-going STDP provides an efficient means of selecting and maintaining a stable yet flexible network organization.

  4. Physical principles for scalable neural recording.

    Science.gov (United States)

    Marblestone, Adam H; Zamft, Bradley M; Maguire, Yael G; Shapiro, Mikhail G; Cybulski, Thaddeus R; Glaser, Joshua I; Amodei, Dario; Stranges, P Benjamin; Kalhor, Reza; Dalrymple, David A; Seo, Dongjin; Alon, Elad; Maharbiz, Michel M; Carmena, Jose M; Rabaey, Jan M; Boyden, Edward S; Church, George M; Kording, Konrad P

    2013-01-01

    Simultaneously measuring the activities of all neurons in a mammalian brain at millisecond resolution is a challenge beyond the limits of existing techniques in neuroscience. Entirely new approaches may be required, motivating an analysis of the fundamental physical constraints on the problem. We outline the physical principles governing brain activity mapping using optical, electrical, magnetic resonance, and molecular modalities of neural recording. Focusing on the mouse brain, we analyze the scalability of each method, concentrating on the limitations imposed by spatiotemporal resolution, energy dissipation, and volume displacement. Based on this analysis, all existing approaches require orders of magnitude improvement in key parameters. Electrical recording is limited by the low multiplexing capacity of electrodes and their lack of intrinsic spatial resolution, optical methods are constrained by the scattering of visible light in brain tissue, magnetic resonance is hindered by the diffusion and relaxation timescales of water protons, and the implementation of molecular recording is complicated by the stochastic kinetics of enzymes. Understanding the physical limits of brain activity mapping may provide insight into opportunities for novel solutions. For example, unconventional methods for delivering electrodes may enable unprecedented numbers of recording sites, embedded optical devices could allow optical detectors to be placed within a few scattering lengths of the measured neurons, and new classes of molecularly engineered sensors might obviate cumbersome hardware architectures. We also study the physics of powering and communicating with microscale devices embedded in brain tissue and find that, while radio-frequency electromagnetic data transmission suffers from a severe power-bandwidth tradeoff, communication via infrared light or ultrasound may allow high data rates due to the possibility of spatial multiplexing. The use of embedded local recording and

  5. A Reconfigurable and Biologically Inspired Paradigm for Computation Using Network-On-Chip and Spiking Neural Networks

    Directory of Open Access Journals (Sweden)

    Jim Harkin

    2009-01-01

    Full Text Available FPGA devices have emerged as a popular platform for the rapid prototyping of biological Spiking Neural Network (SNN) applications, offering the key requirement of reconfigurability. However, FPGAs do not efficiently realise the biologically plausible neuron and synaptic models of SNNs, and current FPGA routing structures cannot accommodate the high levels of interneuron connectivity inherent in complex SNNs. This paper highlights and discusses the current challenges of implementing scalable SNNs on reconfigurable FPGAs. The paper proposes a novel field programmable neural network architecture (EMBRACE), incorporating low-power analogue spiking neurons, interconnected using a Network-on-Chip architecture. Results on the evaluation of the EMBRACE architecture using the XOR benchmark problem are presented, and the performance of the architecture is discussed. The paper also discusses the adaptability of the EMBRACE architecture in supporting fault tolerant computing.

  6. Comparison of a spiking neural network and an MLP for robust identification of generator dynamics in a multimachine power system.

    Science.gov (United States)

    Johnson, Cameron; Venayagamoorthy, Ganesh Kumar; Mitra, Pinaki

    2009-01-01

    The applications of a spiking neural network (SNN) and a multi-layer perceptron (MLP) for online identification of generator dynamics in a multimachine power system are compared in this paper. An integrate-and-fire model of an SNN which communicates information via the inter-spike interval is applied. The neural network identifiers are used to predict, one time-step ahead, the speed and terminal voltage deviations of generators in a multimachine power system. The SNN is developed in two steps: (i) neuron centers determined by offline k-means clustering and (ii) output weights obtained by online training. The sensitivity of the SNN to the neuron centers determined in the first step is evaluated on generators of different ratings and parameters. The performances of the SNN and MLP are compared to evaluate robustness in identifying generator dynamics under small and large disturbances, and to illustrate that SNNs are capable of learning nonlinear dynamics of complex systems.

  7. PySpike - A Python library for analyzing spike train synchrony

    CERN Document Server

    Mulansky, Mario

    2016-01-01

    Understanding how the brain functions is one of the biggest challenges of our time. The analysis of experimentally recorded neural firing patterns (spike trains) plays a crucial role in addressing this problem. Here we introduce the PySpike library, a Python package for spike train analysis that provides parameter-free and time-scale-independent measures of spike train synchrony. It allows the computation of bi- and multivariate dissimilarity profiles, averaged values and bivariate matrices. Although mainly focused on neuroscience, PySpike can also be applied in other contexts such as climate research or the social sciences. The package is available as open source on GitHub and PyPI.
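
    A typical session with the library looks like the sketch below, assuming the interface described in the PySpike documentation (function names may differ between versions):

```python
# pip install pyspike
import numpy as np
import pyspike as spk

# Two spike trains defined on a common observation interval [0, 4000] ms.
st1 = spk.SpikeTrain(np.array([100.0, 650.0, 1200.0, 2500.0, 3900.0]), edges=(0, 4000))
st2 = spk.SpikeTrain(np.array([120.0, 700.0, 1250.0, 2400.0, 3800.0]), edges=(0, 4000))

# Time-resolved dissimilarity profile and averaged (distance) values.
isi_prof = spk.isi_profile(st1, st2)
print("ISI-distance    :", isi_prof.avrg())
print("SPIKE-distance  :", spk.spike_distance(st1, st2))
print("SPIKE-synchrony :", spk.spike_sync(st1, st2))

# Bivariate distance matrix over a small population of trains.
trains = [st1, st2, spk.SpikeTrain(np.array([500.0, 1500.0, 3000.0]), edges=(0, 4000))]
print("pairwise SPIKE-distance matrix:\n", spk.spike_distance_matrix(trains))
```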

  8. The role of cortical oscillations in a spiking neural network model of the basal ganglia.

    Directory of Open Access Journals (Sweden)

    Zafeirios Fountas

    Full Text Available Although brain oscillations involving the basal ganglia (BG) have been the target of extensive research, the main focus lies disproportionally on oscillations generated within the BG circuit rather than other sources, such as cortical areas. We remedy this here by investigating the influence of various cortical frequency bands on the intrinsic effective connectivity of the BG, as well as the role of the latter in regulating cortical behaviour. To do this, we construct a detailed neural model of the complete BG circuit based on fine-tuned spiking neurons, with both electrical and chemical synapses as well as short-term plasticity between structures. As a measure of effective connectivity, we estimate information transfer between nuclei by means of transfer entropy. Our model successfully reproduces firing and oscillatory behaviour found in both the healthy and Parkinsonian BG. We found that, indeed, effective connectivity changes dramatically for different cortical frequency bands and phase offsets, which are able to modulate (or even block) information flow in the three major BG pathways. In particular, alpha (8-12Hz) and beta (13-30Hz) oscillations activate the direct BG pathway, and favour the modulation of the indirect and hyper-direct pathways via the subthalamic nucleus-globus pallidus loop. In contrast, gamma (30-90Hz) frequencies block the information flow from the cortex completely through activation of the indirect pathway. Finally, below alpha, all pathways decay gradually and the system gives rise to spontaneous activity generated in the globus pallidus. Our results indicate the existence of a multimodal gating mechanism at the level of the BG that can be entirely controlled by cortical oscillations, and provide evidence for the hypothesis of cortically-entrained but locally-generated subthalamic beta activity. These two findings suggest new insights into the pathophysiology of specific BG disorders.

  9. The role of cortical oscillations in a spiking neural network model of the basal ganglia.

    Science.gov (United States)

    Fountas, Zafeirios; Shanahan, Murray

    2017-01-01

    Although brain oscillations involving the basal ganglia (BG) have been the target of extensive research, the main focus lies disproportionally on oscillations generated within the BG circuit rather than other sources, such as cortical areas. We remedy this here by investigating the influence of various cortical frequency bands on the intrinsic effective connectivity of the BG, as well as the role of the latter in regulating cortical behaviour. To do this, we construct a detailed neural model of the complete BG circuit based on fine-tuned spiking neurons, with both electrical and chemical synapses as well as short-term plasticity between structures. As a measure of effective connectivity, we estimate information transfer between nuclei by means of transfer entropy. Our model successfully reproduces firing and oscillatory behaviour found in both the healthy and Parkinsonian BG. We found that, indeed, effective connectivity changes dramatically for different cortical frequency bands and phase offsets, which are able to modulate (or even block) information flow in the three major BG pathways. In particular, alpha (8-12Hz) and beta (13-30Hz) oscillations activate the direct BG pathway, and favour the modulation of the indirect and hyper-direct pathways via the subthalamic nucleus-globus pallidus loop. In contrast, gamma (30-90Hz) frequencies block the information flow from the cortex completely through activation of the indirect pathway. Finally, below alpha, all pathways decay gradually and the system gives rise to spontaneous activity generated in the globus pallidus. Our results indicate the existence of a multimodal gating mechanism at the level of the BG that can be entirely controlled by cortical oscillations, and provide evidence for the hypothesis of cortically-entrained but locally-generated subthalamic beta activity. These two findings suggest new insights into the pathophysiology of specific BG disorders.

  10. A configurable simulation environment for the efficient simulation of large-scale spiking neural networks on graphics processors.

    Science.gov (United States)

    Nageswaran, Jayram Moorkanikara; Dutt, Nikil; Krichmar, Jeffrey L; Nicolau, Alex; Veidenbaum, Alexander V

    2009-01-01

    Neural network simulators that take into account the spiking behavior of neurons are useful for studying brain mechanisms and for various neural engineering applications. Spiking Neural Network (SNN) simulators have been traditionally simulated on large-scale clusters, super-computers, or on dedicated hardware architectures. Alternatively, Compute Unified Device Architecture (CUDA) Graphics Processing Units (GPUs) can provide a low-cost, programmable, and high-performance computing platform for simulation of SNNs. In this paper we demonstrate an efficient, biologically realistic, large-scale SNN simulator that runs on a single GPU. The SNN model includes Izhikevich spiking neurons, detailed models of synaptic plasticity and variable axonal delay. We allow user-defined configuration of the GPU-SNN model by means of a high-level programming interface written in C++ but similar to the PyNN programming interface specification. PyNN is a common programming interface developed by the neuronal simulation community to allow a single script to run on various simulators. The GPU implementation (on NVIDIA GTX-280 with 1 GB of memory) is up to 26 times faster than a CPU version for the simulation of 100K neurons with 50 Million synaptic connections, firing at an average rate of 7 Hz. For simulation of 10 Million synaptic connections and 100K neurons, the GPU SNN model is only 1.5 times slower than real-time. Further, we present a collection of new techniques related to parallelism extraction, mapping of irregular communication, and network representation for effective simulation of SNNs on GPUs. The fidelity of the simulation results was validated on CPU simulations using firing rate, synaptic weight distribution, and inter-spike interval analysis. Our simulator is publicly available to the modeling community so that researchers will have easy access to large-scale SNN simulations.

  11. Neural recording front-end IC using action potential detection and analog buffer with digital delay for data compression.

    Science.gov (United States)

    Liu, Lei; Yao, Lei; Zou, Xiaodan; Goh, Wang Ling; Je, Minkyu

    2013-01-01

    This paper presents a neural recording analog front-end IC intended for simultaneous neural recording with action potential (AP) detection for data compression in wireless multichannel neural implants. The proposed neural recording front-end IC detects the neural spikes and sends only the preserved AP information for wireless transmission in order to reduce the overall power consumption of the neural implant. The IC consists of a low-noise neural amplifier, an AP detection circuit and an analog buffer with digital delay. The neural amplifier makes use of a current-reuse technique to maximize the transconductance efficiency for attaining a good noise efficiency factor. The AP detection circuit uses an adaptive threshold voltage to generate an enable signal for the subsequent functional blocks. The analog buffer with digital delay is implemented using a finite impulse response (FIR) filter, which preserves the AP waveform preceding the enable signal and also provides low-pass filtering. The neural recording front-end IC has been designed using standard 0.18-µm CMOS technology, occupying a core area of 220 µm by 820 µm.
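
    The behaviour of the AP-detection block can be emulated in software with a median-based adaptive threshold, a common convention in spike detection; this is a behavioural sketch, not the analog circuit itself, and all constants are illustrative.

```python
import numpy as np

def detect_action_potentials(x, fs, k=4.5, window_ms=1.0):
    """Adaptive-threshold spike detection on a band-passed trace x (volts).

    The threshold adapts to the noise level via the robust estimate
    sigma = median(|x|) / 0.6745 (Quiroga et al. 2004).
    """
    sigma = np.median(np.abs(x)) / 0.6745
    thr = k * sigma
    above = np.abs(x) > thr
    # Rising edges of the enable signal are the detected events.
    onsets = np.flatnonzero(above[1:] & ~above[:-1]) + 1
    # Enforce a refractory window so one AP is not counted twice.
    min_gap = int(window_ms * 1e-3 * fs)
    keep, last = [], -min_gap
    for i in onsets:
        if i - last >= min_gap:
            keep.append(i)
            last = i
    return np.array(keep), thr

# Synthetic trace: Gaussian noise plus a few embedded spike waveforms.
rng = np.random.default_rng(0)
fs = 30_000
x = 5e-6 * rng.normal(size=fs)                  # 1 s of ~5 uV rms noise
true_times = np.array([3000, 9000, 15000, 21000, 27000])
spike_shape = 60e-6 * np.hanning(30) * np.sin(np.linspace(0, 2 * np.pi, 30))
for t in true_times:
    x[t:t + 30] += spike_shape
idx, thr = detect_action_potentials(x, fs)
print(f"threshold = {thr * 1e6:.1f} uV, detected at samples: {idx}")
```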

  12. Effects of bursting dynamic features on the generation of multi-clustered structure of neural network with symmetric spike-timing-dependent plasticity learning rule.

    Science.gov (United States)

    Liu, Hui; Song, Yongduan; Xue, Fangzheng; Li, Xiumin

    2015-11-01

    In this paper, the generation of a multi-clustered structure in self-organized neural networks with different neuronal firing patterns, i.e., bursting or spiking, is investigated. An initially all-to-all-connected spiking neural network or bursting neural network can self-organize into a clustered structure through symmetric spike-timing-dependent plasticity learning for both bursting and spiking neurons. However, the time consumed by this clustering procedure is much shorter for the burst-based self-organized neural network (BSON) than for the spike-based self-organized neural network (SSON). Our results show that the BSON network has more obvious small-world properties, i.e., a higher clustering coefficient and a smaller shortest path length than the SSON network. Also, the larger structure entropy and activity entropy of the BSON network demonstrate that this network has higher topological complexity and dynamical diversity, which benefits information transmission in neural circuits. Hence, we conclude that burst firing can significantly enhance the efficiency of the clustering procedure, and the emergent clustered structure renders the whole network more synchronous and therefore more sensitive to weak input. This result is further confirmed by its improved performance on stochastic resonance. Therefore, we believe that the multi-clustered neural network which self-organizes from bursting dynamics has high efficiency in information processing.

  13. Effects of bursting dynamic features on the generation of multi-clustered structure of neural network with symmetric spike-timing-dependent plasticity learning rule

    Energy Technology Data Exchange (ETDEWEB)

    Liu, Hui; Song, Yongduan; Xue, Fangzheng; Li, Xiumin, E-mail: xmli@cqu.edu.cn [Key Laboratory of Dependable Service Computing in Cyber Physical Society of Ministry of Education, Chongqing University, Chongqing 400044 (China); College of Automation, Chongqing University, Chongqing 400044 (China)

    2015-11-15

    In this paper, the generation of a multi-clustered structure in self-organized neural networks with different neuronal firing patterns, i.e., bursting or spiking, is investigated. An initially all-to-all-connected spiking neural network or bursting neural network can self-organize into a clustered structure through symmetric spike-timing-dependent plasticity learning for both bursting and spiking neurons. However, the time consumed by this clustering procedure is much shorter for the burst-based self-organized neural network (BSON) than for the spike-based self-organized neural network (SSON). Our results show that the BSON network has more obvious small-world properties, i.e., a higher clustering coefficient and a smaller shortest path length than the SSON network. Also, the larger structure entropy and activity entropy of the BSON network demonstrate that this network has higher topological complexity and dynamical diversity, which benefits information transmission in neural circuits. Hence, we conclude that burst firing can significantly enhance the efficiency of the clustering procedure, and the emergent clustered structure renders the whole network more synchronous and therefore more sensitive to weak input. This result is further confirmed by its improved performance on stochastic resonance. Therefore, we believe that the multi-clustered neural network which self-organizes from bursting dynamics has high efficiency in information processing.

  14. Refinement and Pattern Formation in Neural Circuits by the Interaction of Traveling Waves with Spike-Timing Dependent Plasticity

    Science.gov (United States)

    Bennett, James E. M.; Bair, Wyeth

    2015-01-01

    Traveling waves in the developing brain are a prominent source of highly correlated spiking activity that may instruct the refinement of neural circuits. A candidate mechanism for mediating such refinement is spike-timing dependent plasticity (STDP), which translates correlated activity patterns into changes in synaptic strength. To assess the potential of these phenomena to build useful structure in developing neural circuits, we examined the interaction of wave activity with STDP rules in simple, biologically plausible models of spiking neurons. We derive an expression for the synaptic strength dynamics showing that, by mapping the time dependence of STDP into spatial interactions, traveling waves can build periodic synaptic connectivity patterns into feedforward circuits with a broad class of experimentally observed STDP rules. The spatial scale of the connectivity patterns increases with wave speed and STDP time constants. We verify these results with simulations and demonstrate their robustness to likely sources of noise. We show how this pattern formation ability, which is analogous to solutions of reaction-diffusion systems that have been widely applied to biological pattern formation, can be harnessed to instruct the refinement of postsynaptic receptive fields. Our results hold for rich, complex wave patterns in two dimensions and over several orders of magnitude in wave speeds and STDP time constants, and they provide predictions that can be tested under existing experimental paradigms. Our model generalizes across brain areas and STDP rules, allowing broad application to the ubiquitous occurrence of traveling waves and to wave-like activity patterns induced by moving stimuli. PMID:26308406

  15. Accuracy evaluation of numerical methods used in state-of-the-art simulators for spiking neural networks.

    Science.gov (United States)

    Henker, Stephan; Partzsch, Johannes; Schüffny, René

    2012-04-01

    With the various simulators for spiking neural networks developed in recent years, a variety of numerical solution methods for the underlying differential equations are available. In this article, we introduce an approach to systematically assess the accuracy of these methods. In contrast to previous investigations, our approach focuses on a completely deterministic comparison and uses an analytically solved model as a reference. This enables the identification of typical sources of numerical inaccuracies in state-of-the-art simulation methods. In particular, with our approach we can separate the error of the numerical integration from the timing error of spike detection and propagation, the latter being prominent in simulations with fixed timestep. To verify the correctness of the testing procedure, we relate the numerical deviations to theoretical predictions for the employed numerical methods. Finally, we give an example of the influence of simulation artefacts on network behaviour and spike-timing-dependent plasticity (STDP), underlining the importance of spike-time accuracy for the simulation of STDP.
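
    The idea of testing against an analytically solved reference can be illustrated with the simplest case: a leaky integrate-and-fire neuron under constant current, whose threshold-crossing time is known in closed form. The sketch below compares forward Euler on a fixed grid against that reference; it illustrates the testing principle only, not the article's full procedure.

```python
import numpy as np

tau, R, I, V_th = 0.010, 1e7, 2e-9, 0.015     # 10 ms, 10 MOhm, 2 nA, 15 mV
V_inf = R * I                                  # 20 mV steady-state voltage

# Analytical reference: V(t) = V_inf (1 - exp(-t/tau)), exact threshold crossing.
t_exact = -tau * np.log(1.0 - V_th / V_inf)

for dt in [1e-3, 1e-4, 1e-5]:
    V, t = 0.0, 0.0
    while V < V_th:
        V += dt * (-V + V_inf) / tau           # forward Euler subthreshold step
        t += dt
    # The grid-constrained spike time is the end of the step that crossed threshold.
    print(f"dt = {dt:.0e} s: Euler spike-time error = {abs(t - t_exact) * 1e3:.3f} ms")
print(f"exact spike time: {t_exact * 1e3:.3f} ms")
```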

  16. The dynamic brain: from spiking neurons to neural masses and cortical fields.

    Directory of Open Access Journals (Sweden)

    Gustavo Deco

    2008-08-01

    Full Text Available The cortex is a complex system, characterized by its dynamics and architecture, which underlie many functions such as action, perception, learning, language, and cognition. Its structural architecture has been studied for more than a hundred years; however, its dynamics have been addressed much less thoroughly. In this paper, we review and integrate, in a unifying framework, a variety of computational approaches that have been used to characterize the dynamics of the cortex, as evidenced at different levels of measurement. Computational models at different space-time scales help us understand the fundamental mechanisms that underpin neural processes and relate these processes to neuroscience data. Modeling at the single neuron level is necessary because this is the level at which information is exchanged between the computing elements of the brain; the neurons. Mesoscopic models tell us how neural elements interact to yield emergent behavior at the level of microcolumns and cortical columns. Macroscopic models can inform us about whole brain dynamics and interactions between large-scale neural systems such as cortical regions, the thalamus, and brain stem. Each level of description relates uniquely to neuroscience data, from single-unit recordings, through local field potentials to functional magnetic resonance imaging (fMRI, electroencephalogram (EEG, and magnetoencephalogram (MEG. Models of the cortex can establish which types of large-scale neuronal networks can perform computations and characterize their emergent properties. Mean-field and related formulations of dynamics also play an essential and complementary role as forward models that can be inverted given empirical data. This makes dynamic models critical in integrating theory and experiments. We argue that elaborating principled and informed models is a prerequisite for grounding empirical neuroscience in a cogent theoretical framework, commensurate with the achievements in the

  17. Towards a magnetoresistive platform for neural signal recording

    Science.gov (United States)

    Sharma, P. P.; Gervasoni, G.; Albisetti, E.; D'Ercoli, F.; Monticelli, M.; Moretti, D.; Forte, N.; Rocchi, A.; Ferrari, G.; Baldelli, P.; Sampietro, M.; Benfenati, F.; Bertacco, R.; Petti, D.

    2017-05-01

    A promising strategy to get deeper insight on brain functionalities relies on the investigation of neural activities at the cellular and sub-cellular level. In this framework, methods for recording neuron electrical activity have gained interest over the years. Main technological challenges are associated to finding highly sensitive detection schemes, providing considerable spatial and temporal resolution. Moreover, the possibility to perform non-invasive assays would constitute a noteworthy benefit. In this work, we present a magnetoresistive platform for the detection of the action potential propagation in neural cells. Such platform allows, in perspective, the in vitro recording of neural signals arising from single neurons, neural networks and brain slices.

  18. Learning to Generate Sequences with Combination of Hebbian and Non-hebbian Plasticity in Recurrent Spiking Neural Networks.

    Science.gov (United States)

    Panda, Priyadarshini; Roy, Kaushik

    2017-01-01

    Synaptic Plasticity, the foundation for learning and memory formation in the human brain, manifests in various forms. Here, we combine standard spike-timing-correlation-based Hebbian plasticity with a non-Hebbian synaptic decay mechanism for training a recurrent spiking neural model to generate sequences. We show that inclusion of the adaptive decay of synaptic weights with standard STDP helps learn stable contextual dependencies between temporal sequences, while reducing the strong attractor states that emerge in recurrent models due to feedback loops. Furthermore, we show that the combined learning scheme suppresses the chaotic activity in the recurrent model substantially, thereby enhancing its ability to generate sequences consistently even in the presence of perturbations.
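
    The combined rule can be written compactly as pair-based STDP driven by exponential pre/post traces plus an activity-independent exponential decay of the weight; the sketch below applies it to a single synapse with illustrative constants, not those of the cited model.

```python
import numpy as np

def stdp_with_decay(pre_spikes, post_spikes, w0=0.5, A_plus=0.01, A_minus=0.012,
                    tau_stdp=0.020, tau_decay=5.0, dt=1e-3, T=2.0):
    """Pair-based STDP combined with a non-Hebbian exponential weight decay."""
    steps = int(T / dt)
    pre = np.zeros(steps, bool); pre[(np.asarray(pre_spikes) / dt).astype(int)] = True
    post = np.zeros(steps, bool); post[(np.asarray(post_spikes) / dt).astype(int)] = True
    x_pre = x_post = 0.0                       # exponential spike traces
    w = w0
    for t in range(steps):
        x_pre *= np.exp(-dt / tau_stdp)
        x_post *= np.exp(-dt / tau_stdp)
        if pre[t]:
            x_pre += 1.0
            w -= A_minus * x_post              # post-before-pre -> depression
        if post[t]:
            x_post += 1.0
            w += A_plus * x_pre                # pre-before-post -> potentiation
        w -= dt * w / tau_decay                # non-Hebbian synaptic decay
        w = np.clip(w, 0.0, 1.0)
    return w

# Pre leads post by 5 ms (potentiating pairing), repeated at 10 Hz for 2 s.
pre = np.arange(0.1, 2.0, 0.1)
post = pre + 0.005
print("weight after causal pairing  :", round(stdp_with_decay(pre, post), 3))
print("weight after reversed pairing:", round(stdp_with_decay(post, pre), 3))
```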

  19. Technology-aware algorithm design for neural spike detection, feature extraction, and dimensionality reduction.

    Science.gov (United States)

    Gibson, Sarah; Judy, Jack W; Marković, Dejan

    2010-10-01

    Applications such as brain-machine interfaces require hardware spike sorting in order to 1) obtain single-unit activity and 2) perform data reduction for wireless data transmission. Such systems must be low-power, low-area, high-accuracy, automatic, and able to operate in real time. Several detection, feature-extraction, and dimensionality-reduction algorithms for spike sorting are described and evaluated in terms of accuracy versus complexity. The nonlinear energy operator is chosen as the optimal spike-detection algorithm, being the most robust to noise and relatively simple. The discrete-derivatives method is chosen as the optimal feature-extraction method, maintaining high accuracy across signal-to-noise ratios with a complexity orders of magnitude lower than that of traditional methods such as principal-component analysis. We introduce the maximum-difference algorithm, which is shown to be the best dimensionality-reduction method for hardware spike sorting.
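
    The two algorithm choices highlighted here are short enough to state directly: the nonlinear energy operator for detection and discrete derivatives for feature extraction. The sketch below applies both to a synthetic trace; the smoothing window, threshold factor and spike shape are illustrative.

```python
import numpy as np

def neo(x):
    """Nonlinear energy operator: psi[n] = x[n]^2 - x[n-1] * x[n+1]."""
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

def detect_neo(x, c=8.0, smooth=7):
    """Detect events where the smoothed NEO output exceeds c times its mean."""
    w = np.bartlett(smooth)
    psi = np.convolve(neo(x), w / w.sum(), mode="same")
    above = psi > c * psi.mean()
    return np.flatnonzero(above[1:] & ~above[:-1]) + 1    # rising edges

def discrete_derivative_features(snippet, deltas=(1, 3, 7)):
    """Discrete derivatives dd_delta[n] = s[n] - s[n - delta], concatenated."""
    return np.concatenate([snippet[d:] - snippet[:-d] for d in deltas])

# Synthetic band-passed trace (unit-variance noise) with three embedded spikes.
rng = np.random.default_rng(0)
x = rng.normal(size=30_000)
spike = 10.0 * np.sin(np.linspace(0.0, 2.0 * np.pi, 12))
for t0 in (5_000, 12_000, 22_000):
    x[t0:t0 + 12] += spike
onsets = detect_neo(x)
print("detected event onsets:", onsets)
snippet = x[onsets[0] - 10:onsets[0] + 22]          # 32-sample waveform snippet
print("feature vector length:", discrete_derivative_features(snippet).size)
```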

  20. Exact subthreshold integration with continuous spike times in discrete-time neural network simulations.

    Science.gov (United States)

    Morrison, Abigail; Straube, Sirko; Plesser, Hans Ekkehard; Diesmann, Markus

    2007-01-01

    Very large networks of spiking neurons can be simulated efficiently in parallel under the constraint that spike times are bound to an equidistant time grid. Within this scheme, the subthreshold dynamics of a wide class of integrate-and-fire-type neuron models can be integrated exactly from one grid point to the next. However, the loss in accuracy caused by restricting spike times to the grid can have undesirable consequences, which has led to interest in interpolating spike times between the grid points to retrieve an adequate representation of network dynamics. We demonstrate that the exact integration scheme can be combined naturally with off-grid spike events found by interpolation. We show that by exploiting the existence of a minimal synaptic propagation delay, the need for a central event queue is removed, so that the precision of event-driven simulation on the level of single neurons is combined with the efficiency of time-driven global scheduling. Further, for neuron models with linear subthreshold dynamics, even local event queuing can be avoided, resulting in much greater efficiency on the single-neuron level. These ideas are exemplified by two implementations of a widely used neuron model. We present a measure for the efficiency of network simulations in terms of their integration error and show that for a wide range of input spike rates, the novel techniques we present are both more accurate and faster than standard techniques.
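
    For linear subthreshold dynamics, the grid-to-grid update reduces to multiplication by a fixed propagator matrix. The sketch below (using SciPy's matrix exponential) builds that propagator for a leaky integrate-and-fire neuron with an exponential synaptic current and checks one exact step against many small Euler steps; it illustrates the principle of exact integration, not the implementation described in the article.

```python
import numpy as np
from scipy.linalg import expm

tau_m, tau_s, R, h = 0.010, 0.002, 1e7, 1e-3      # 10 ms, 2 ms, 10 MOhm, 1 ms grid

# Linear subthreshold dynamics: d/dt [I_syn, V] = A [I_syn, V]
A = np.array([[-1.0 / tau_s, 0.0],
              [R / tau_m,   -1.0 / tau_m]])
P = expm(A * h)                                    # exact grid-to-grid propagator

z = np.array([1e-9, 0.0])                          # 1 nA synaptic kick, V = 0
z_exact = P @ z                                    # one exact step of size h

# Reference: many tiny forward-Euler steps across the same interval.
n_sub = 100_000
z_euler = z.copy()
for _ in range(n_sub):
    z_euler = z_euler + (h / n_sub) * (A @ z_euler)

print("exact step  V(t+h) = %.6f mV" % (z_exact[1] * 1e3))
print("fine Euler  V(t+h) = %.6f mV" % (z_euler[1] * 1e3))
```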

  1. Sound Source Localization Through 8 MEMS Microphones Array Using a Sand-Scorpion-Inspired Spiking Neural Network

    Directory of Open Access Journals (Sweden)

    Christoph Beck

    2016-10-01

    Full Text Available Sand-scorpions and many other arachnids perceive their environment by using their feet to sense ground waves. They are able to detect amplitudes as small as the size of an atom and to locate acoustic stimuli to within 13°, based on their neuronal anatomy. We present here a prototype sound source localization system inspired by this impressive performance. The system utilizes custom-built hardware with eight MEMS microphones, one for each foot, to acquire the acoustic scene, and a spiking neural model to localize the sound source. The current implementation shows a smaller localization error than that observed in nature.

  2. A carbon-fiber electrode array for long-term neural recording.

    Science.gov (United States)

    Guitchounts, Grigori; Markowitz, Jeffrey E; Liberti, William A; Gardner, Timothy J

    2013-08-01

    Chronic neural recording in behaving animals is an essential method for studies of neural circuit function. However, stable recording from small, densely packed neurons remains challenging, particularly over time-scales relevant for learning. We describe an assembly method for a 16-channel electrode array consisting of carbon fibers. Carbon fiber arrays were tested in HVC (used as a proper name), a song motor nucleus, of singing zebra finches, where individual neurons discharge with temporally precise patterns. Previous reports of activity in this population of neurons have required the use of high-impedance electrodes on movable microdrives. Here, the carbon fiber electrodes provided stable multi-unit recordings over time-scales of months. Spike-sorting indicated that the multi-unit signals were dominated by one, or a small number of, cells. Stable firing patterns during singing confirmed the stability of these clusters over time-scales of months. In addition, from a total of 10 surgeries, 16 projection neurons were found. This cell type is characterized by sparse stereotyped firing patterns, providing unambiguous confirmation of single-cell recordings. Carbon fiber electrode bundles may provide a scalable solution for long-term neural recordings of densely packed neurons.

  3. Circuitry for a Wireless Microsystem for Neural Recording Microprobes

    National Research Council Canada - National Science Library

    Yu, Hao

    2001-01-01

    .... Recorded neural signals are amplified, multiplexed, digitized using a 2nd-order sigma-delta modulator, and then transmitted to the outside world by an on-chip transmitter. The circuit is designed using a standard...

  4. A compound memristive synapse model for statistical learning through STDP in spiking neural networks

    Directory of Open Access Journals (Sweden)

    Johannes eBill

    2014-12-01

    Full Text Available Memristors have recently emerged as promising circuit elements to mimic the function of biological synapses in neuromorphic computing. The fabrication of reliable nanoscale memristive synapses, that feature continuous conductance changes based on the timing of pre- and postsynaptic spikes, has however turned out to be challenging. In this article, we propose an alternative approach, the compound memristive synapse, that circumvents this problem by the use of memristors with binary memristive states. A compound memristive synapse employs multiple bistable memristors in parallel to jointly form one synapse, thereby providing a spectrum of synaptic efficacies. We investigate the computational implications of synaptic plasticity in the compound synapse by integrating the recently observed phenomenon of stochastic filament formation into an abstract model of stochastic switching. Using this abstract model, we first show how standard pulsing schemes give rise to spike-timing dependent plasticity (STDP) with a stabilizing weight dependence in compound synapses. In a next step, we study unsupervised learning with compound synapses in networks of spiking neurons organized in a winner-take-all architecture. Our theoretical analysis reveals that compound-synapse STDP implements generalized Expectation-Maximization in the spiking network. Specifically, the emergent synapse configuration represents the most salient features of the input distribution in a Mixture-of-Gaussians generative model. Furthermore, the network’s spike response to spiking input streams approximates a well-defined Bayesian posterior distribution. We show in computer simulations how such networks learn to represent high-dimensional distributions over images of handwritten digits with high fidelity even in presence of substantial device variations and under severe noise conditions. Therefore, the compound memristive synapse may provide a synaptic design principle for future neuromorphic
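
    The core idea, several bistable devices jointly forming one graded synapse, can be captured with a few lines of stochastic switching; the switching probabilities and device count below are arbitrary illustrative numbers, not fitted to any memristor model.

```python
import numpy as np

class CompoundMemristiveSynapse:
    """One synapse made of n binary memristors that switch stochastically.

    The effective weight is the fraction of devices in the conducting state.
    A causal pre-before-post pairing potentiates (switch ON with p_on);
    the reverse pairing depresses (switch OFF with p_off).
    """
    def __init__(self, n=32, p_on=0.1, p_off=0.1, rng=None):
        self.state = np.zeros(n, dtype=bool)
        self.p_on, self.p_off = p_on, p_off
        self.rng = rng or np.random.default_rng()

    @property
    def weight(self):
        return self.state.mean()

    def pair(self, dt_post_minus_pre):
        """Apply one pulse pairing with the given spike-timing difference (s)."""
        if dt_post_minus_pre > 0:       # causal pairing -> potentiation pulses
            flips = self.rng.random(self.state.size) < self.p_on
            self.state |= flips
        else:                           # anti-causal pairing -> depression pulses
            flips = self.rng.random(self.state.size) < self.p_off
            self.state &= ~flips

syn = CompoundMemristiveSynapse(rng=np.random.default_rng(0))
trace = []
for _ in range(30):
    syn.pair(+5e-3)                     # repeated causal pairings
    trace.append(syn.weight)
print("weight after 10/20/30 causal pairings:", trace[9], trace[19], trace[29])
for _ in range(30):
    syn.pair(-5e-3)                     # repeated anti-causal pairings
print("weight after 30 depressing pairings  :", syn.weight)
```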

  5. Stimulation and recording electrodes for neural prostheses

    CERN Document Server

    Pour Aryan, Naser; Rothermel, Albrecht

    2015-01-01

    This book provides readers with basic principles of the electrochemistry of the electrodes used in modern, implantable neural prostheses. The authors discuss the boundaries and conditions in which the electrodes continue to function properly for long time spans, which are required when designing neural stimulator devices for long-term in vivo applications. Two kinds of electrode materials, titanium nitride and iridium are discussed extensively, both qualitatively and quantitatively. The influence of the counter electrode on the safety margins and electrode lifetime in a two electrode system is explained. Electrode modeling is handled in a final chapter.

  6. Delayed afterdepolarization and spontaneous secondary spiking in a simple model of neural activity

    Science.gov (United States)

    Klinshov, V. V.; Nekorkin, V. I.

    2012-03-01

    In this paper we suggest a new dynamical model of neuron excitability. It is based on the classical FitzHugh-Nagumo model, to which we add a third variable representing an additional ionic current. Using the method of fast and slow motions, we study afterdepolarization, spontaneous secondary spiking and tonic spiking effects. We map the regions of parameter space that correspond to different dynamical regimes. The obtained results may be important for various problems in neuroscience, e.g. the problem of working memory.
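
    A generic way to reproduce the qualitative picture, a fast FitzHugh-Nagumo subsystem plus a slow third current, is sketched below; the form and parameters of the third variable are illustrative assumptions, not the authors' equations.

```python
import numpy as np

def simulate_fhn3(T=200.0, dt=0.01, I_ext=0.0, pulse=(20.0, 21.0, 0.8),
                  eps=0.08, a=0.7, b=0.8, mu=0.005, k=0.9):
    """Forward-Euler integration of FitzHugh-Nagumo with a slow third current z.

    dv/dt = v - v^3/3 - w + z + I(t)
    dw/dt = eps (v + a - b w)
    dz/dt = mu  (k (v - v_rest) - z)      # slow, voltage-activated current (assumed form)
    """
    steps = int(T / dt)
    v, w, z = -1.2, -0.6, 0.0
    v_rest = -1.2
    vs = np.empty(steps)
    for i in range(steps):
        t = i * dt
        I = I_ext + (pulse[2] if pulse[0] <= t < pulse[1] else 0.0)
        dv = v - v**3 / 3.0 - w + z + I
        dw = eps * (v + a - b * w)
        dz = mu * (k * (v - v_rest) - z)
        v, w, z = v + dt * dv, w + dt * dw, z + dt * dz
        vs[i] = v
    return vs

vs = simulate_fhn3()
print("number of suprathreshold excursions (v > 1):",
      int(np.sum((vs[1:] > 1.0) & (vs[:-1] <= 1.0))))
```

    With these values a brief pulse elicits a single spike followed by a slowly decaying afterdepolarization carried by z; strengthening the coupling k in this toy model increases that afterdepolarization, which is the kind of mechanism the paper analyses for spontaneous secondary spiking.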

  7. A spiking neural network model of 3D perception for event-based neuromorphic stereo vision systems

    Science.gov (United States)

    Osswald, Marc; Ieng, Sio-Hoi; Benosman, Ryad; Indiveri, Giacomo

    2017-01-01

    Stereo vision is an important feature that enables machine vision systems to perceive their environment in 3D. While machine vision has spawned a variety of software algorithms to solve the stereo-correspondence problem, their implementation and integration in small, fast, and efficient hardware vision systems remains a difficult challenge. Recent advances made in neuromorphic engineering offer a possible solution to this problem, with the use of a new class of event-based vision sensors and neural processing devices inspired by the organizing principles of the brain. Here we propose a radically novel model that solves the stereo-correspondence problem with a spiking neural network that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. We validate the model with experimental results, highlighting features that are in agreement with both computational neuroscience stereo vision theories and experimental findings. We demonstrate its features with a prototype neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological stereo vision processing systems.

  8. A spiking neural network model of 3D perception for event-based neuromorphic stereo vision systems.

    Science.gov (United States)

    Osswald, Marc; Ieng, Sio-Hoi; Benosman, Ryad; Indiveri, Giacomo

    2017-01-12

    Stereo vision is an important feature that enables machine vision systems to perceive their environment in 3D. While machine vision has spawned a variety of software algorithms to solve the stereo-correspondence problem, their implementation and integration in small, fast, and efficient hardware vision systems remains a difficult challenge. Recent advances made in neuromorphic engineering offer a possible solution to this problem, with the use of a new class of event-based vision sensors and neural processing devices inspired by the organizing principles of the brain. Here we propose a radically novel model that solves the stereo-correspondence problem with a spiking neural network that can be directly implemented with massively parallel, compact, low-latency and low-power neuromorphic engineering devices. We validate the model with experimental results, highlighting features that are in agreement with both computational neuroscience stereo vision theories and experimental findings. We demonstrate its features with a prototype neuromorphic hardware system and provide testable predictions on the role of spike-based representations and temporal dynamics in biological stereo vision processing systems.

  9. Spiking Neural Classifier with Lumped Dendritic Nonlinearity and Binary Synapses: A Current Mode VLSI Implementation and Analysis.

    Science.gov (United States)

    Bhaduri, Aritra; Banerjee, Amitava; Roy, Subhrajit; Kar, Sougata; Basu, Arindam

    2017-12-08

    We present a neuromorphic current mode implementation of a spiking neural classifier with a lumped square-law dendritic nonlinearity. It has been shown previously in software simulations that such a system with binary synapses can be trained with structural plasticity algorithms to achieve comparable classification accuracy with fewer synaptic resources than conventional algorithms. We show that even in real analog systems with manufacturing imperfections (CV of 23.5% and 14.4% for dendritic branch gains and leaks respectively), this network is able to produce comparable results with fewer synaptic resources. The chip, fabricated in [Formula: see text]m complementary metal oxide semiconductor, has eight dendrites per cell and uses two opposing cells per class to cancel common-mode inputs. The chip can operate down to [Formula: see text] V and dissipates 19 nW of static power per neuronal cell and [Formula: see text] 125 pJ/spike. For two-class classification problems of high-dimensional rate encoded binary patterns, the hardware achieves performance comparable to a software implementation of the same, with only about a 0.5% reduction in accuracy. On two UCI data sets, the integrated circuit (IC) has classification accuracy comparable to standard machine learners such as support vector machines and extreme learning machines while using two to five times binary synapses. We also show that the system can operate on mean rate encoded spike patterns, as well as short bursts of spikes. To the best of our knowledge, this is the first attempt in hardware to perform classification exploiting dendritic properties and binary synapses.

  10. A Modified Izhikevich Model For Circuit Implementation of Spiking Neural Networks

    OpenAIRE

    Ahmadi, Arash; Zwolinski, Mark

    2010-01-01

    The Izhikevich neuron model reproduces the spiking and bursting behaviour of certain types of cortical neurons. This model has a second order nonlinearity that makes it difficult to implement in hardware. We propose a simplified version of the model that has a piecewise-linear relationship. This modification simplifies the hardware implementation but demonstrates similar dynamic behaviour.

  11. Emergence of slow collective oscillations in neural networks with spike-timing dependent plasticity

    DEFF Research Database (Denmark)

    Mikkelsen, Kaare; Imparato, Alberto; Torcini, Alessandro

    2013-01-01

    The collective dynamics of excitatory pulse coupled neurons with spike timing dependent plasticity (STDP) is studied. The introduction of STDP induces persistent irregular oscillations between strongly and weakly synchronized states, reminiscent of brain activity during slow-wave sleep. We explain...

  12. Spike-timing computation properties of a feed-forward neural network model

    Directory of Open Access Journals (Sweden)

    Drew Benjamin Sinha

    2014-01-01

    Full Text Available Brain function is characterized by dynamical interactions among networks of neurons. These interactions are mediated by network topology at many scales ranging from microcircuits to brain areas. Understanding how networks operate can be aided by understanding how the transformation of inputs depends upon network connectivity patterns, e.g. serial and parallel pathways. To tractably determine how single synapses or groups of synapses in such pathways shape transformations, we modeled feed-forward networks of 7-22 neurons in which synaptic strength changed according to a spike-timing dependent plasticity rule. We investigated how activity varied when dynamics were perturbed by an activity-dependent electrical stimulation protocol (spike-triggered stimulation; STS) in networks of different topologies and background input correlations. STS can successfully reorganize functional brain networks in vivo, but with a variability in effectiveness that may derive partially from the underlying network topology. In a simulated network with a single disynaptic pathway driven by uncorrelated background activity, structured spike-timing relationships between polysynaptically connected neurons were not observed. When background activity was correlated or parallel disynaptic pathways were added, however, robust polysynaptic spike timing relationships were observed, and application of STS yielded predictable changes in synaptic strengths and spike-timing relationships. These observations suggest that precise input-related or topologically induced temporal relationships in network activity are necessary for polysynaptic signal propagation. Such constraints for polysynaptic computation suggest potential roles for higher-order topological structure in network organization, such as maintaining polysynaptic correlation in the face of relatively weak synapses.

  13. Neural Code-Neural Self-information Theory on How Cell-Assembly Code Rises from Spike Time and Neuronal Variability.

    Science.gov (United States)

    Li, Meng; Tsien, Joe Z

    2017-01-01

    A major stumbling block to cracking the real-time neural code is neuronal variability - neurons discharge spikes with enormous variability not only across trials within the same experiments but also in resting states. Such variability is widely regarded as noise that is often deliberately averaged out during data analyses. In contrast to such a dogma, we put forth the Neural Self-Information Theory that neural coding is operated based on the self-information principle under which variability in the time durations of inter-spike-intervals (ISI), or neuronal silence durations, is self-tagged with discrete information. As the self-information processor, each ISI carries a certain amount of information based on its variability-probability distribution; higher-probability ISIs which reflect the balanced excitation-inhibition ground state convey minimal information, whereas lower-probability ISIs which signify rare-occurrence surprisals in the form of extremely transient or prolonged silence carry most information. These variable silence durations are naturally coupled with intracellular biochemical cascades, energy equilibrium and dynamic regulation of protein and gene expression levels. As such, this silence variability-based self-information code is completely intrinsic to the neurons themselves, with no need for outside observers to set any reference point as typically used in the rate code, population code and temporal code models. Moreover, temporally coordinated ISI surprisals across cell populations can inherently give rise to robust real-time cell-assembly codes which can be readily sensed by the downstream neural clique assemblies. One immediate utility of this self-information code is a general decoding strategy to uncover a variety of cell-assembly patterns underlying external and internal categorical or continuous variables in an unbiased manner.

  14. Multiple-color optical activation, silencing, and desynchronization of neural activity, with single-spike temporal resolution.

    Directory of Open Access Journals (Sweden)

    Xue Han

    Full Text Available The quest to determine how precise neural activity patterns mediate computation, behavior, and pathology would be greatly aided by a set of tools for reliably activating and inactivating genetically targeted neurons, in a temporally precise and rapidly reversible fashion. Having earlier adapted a light-activated cation channel, channelrhodopsin-2 (ChR2), for allowing neurons to be stimulated by blue light, we searched for a complementary tool that would enable optical neuronal inhibition, driven by light of a second color. Here we report that targeting the codon-optimized form of the light-driven chloride pump halorhodopsin from the archaebacterium Natronomonas pharaonis (hereafter abbreviated Halo) to genetically-specified neurons enables them to be silenced reliably, and reversibly, by millisecond-timescale pulses of yellow light. We show that trains of yellow and blue light pulses can drive high-fidelity sequences of hyperpolarizations and depolarizations in neurons simultaneously expressing yellow light-driven Halo and blue light-driven ChR2, allowing for the first time manipulations of neural synchrony without perturbation of other parameters such as spiking rates. The Halo/ChR2 system thus constitutes a powerful toolbox for multichannel photoinhibition and photostimulation of virally or transgenically targeted neural circuits without need for exogenous chemicals, enabling systematic analysis and engineering of the brain, and quantitative bioengineering of excitable cells.

  15. [Multi-channel in vivo recording techniques: analysis of phase coupling between spikes and rhythmic oscillations of local field potentials].

    Science.gov (United States)

    Wang, Ce-Qun; Chen, Qiang; Zhang, Lu; Xu, Jia-Min; Lin, Long-Nian

    2014-12-25

    The purpose of this article is to introduce the measurements of phase coupling between spikes and rhythmic oscillations of local field potentials (LFPs). Multi-channel in vivo recording techniques allow us to record ensemble neuronal activity and LFPs simultaneously from the same sites in the brain. Neuronal activity is generally characterized by temporal spike sequences, while LFPs contain oscillatory rhythms in different frequency ranges. Phase coupling analysis can reveal the temporal relationships between neuronal firing and LFP rhythms. As the first step, the instantaneous phase of the LFP rhythm is calculated using the Hilbert transform; then, for each time-stamped spike that occurred during an oscillatory epoch, the instantaneous phase of the LFP at that time stamp is marked. Finally, the phase relationship between neuronal firing and the LFP rhythm is determined by examining the distribution of firing phases: phase-locked spikes are revealed by a non-random distribution of spike phase. Theta phase precession is a unique phase relationship between neuronal firing and LFPs and one of the basic features of hippocampal place cells. Place cells show rhythmic burst firing following the theta oscillation within a place field, and phase precession refers to the systematic shift of this burst firing during traversal of the field, moving progressively forward on each theta cycle. This relation between phase and position can be described by a linear model, and phase precession is commonly quantified with a circular-linear coefficient. Phase coupling analysis helps us to better understand temporal information coding between neuronal firing and LFPs.
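
    As a minimal sketch of the procedure described above, assuming an LFP trace sampled at fs Hz and a list of spike times in seconds (hypothetical variable names), the phase extraction and a Rayleigh test for phase locking could look like the following; the theta band limits and the test approximation are common choices rather than the authors' specific settings.

      import numpy as np
      from scipy.signal import butter, filtfilt, hilbert

      def spike_lfp_phases(lfp, spike_times, fs, band=(6.0, 10.0)):
          """Instantaneous phase of the band-passed LFP at each spike time (radians)."""
          b, a = butter(3, [band[0] / (fs / 2.0), band[1] / (fs / 2.0)], btype="band")
          phase = np.angle(hilbert(filtfilt(b, a, lfp)))        # phase in [-pi, pi]
          idx = np.round(np.asarray(spike_times) * fs).astype(int)
          return phase[idx[(idx >= 0) & (idx < len(lfp))]]

      def rayleigh_test(phases):
          """Rayleigh test for a non-uniform (phase-locked) circular distribution."""
          n = len(phases)
          r = np.abs(np.mean(np.exp(1j * phases)))              # mean resultant length
          z = n * r ** 2
          p = np.exp(-z) * (1.0 + (2.0 * z - z ** 2) / (4.0 * n))   # first-order approximation
          return r, float(np.clip(p, 0.0, 1.0))

      # toy check: spikes locked to a fixed phase of an 8 Hz "theta" oscillation
      fs, dur = 1000, 10.0
      t = np.arange(0.0, dur, 1.0 / fs)
      lfp = np.sin(2 * np.pi * 8.0 * t) + 0.3 * np.random.randn(len(t))
      spike_times = np.arange(0.5, dur, 1.0 / 8.0) + 0.0625
      print(rayleigh_test(spike_lfp_phases(lfp, spike_times, fs)))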

  16. A Spiking Neural Simulator Integrating Event-Driven and Time-Driven Computation Schemes Using Parallel CPU-GPU Co-Processing: A Case Study.

    Science.gov (United States)

    Naveros, Francisco; Luque, Niceto R; Garrido, Jesús A; Carrillo, Richard R; Anguita, Mancia; Ros, Eduardo

    2015-07-01

    Time-driven simulation methods in traditional CPU architectures perform well and precisely when simulating small-scale spiking neural networks. Nevertheless, they still have drawbacks when simulating large-scale systems. Conversely, event-driven simulation methods in CPUs and time-driven simulation methods in graphic processing units (GPUs) can outperform CPU time-driven methods under certain conditions. With this performance improvement in mind, we have developed an event-and-time-driven spiking neural network simulator suitable for a hybrid CPU-GPU platform. Our neural simulator is able to efficiently simulate bio-inspired spiking neural networks consisting of different neural models, which can be distributed heterogeneously in both small layers and large layers or subsystems. For the sake of efficiency, the low-activity parts of the neural network can be simulated in CPU using event-driven methods while the high-activity subsystems can be simulated in either CPU (a few neurons) or GPU (thousands or millions of neurons) using time-driven methods. In this brief, we have undertaken a comparative study of these different simulation methods. For benchmarking the different simulation methods and platforms, we have used a cerebellar-inspired neural-network model consisting of a very dense granular layer and a Purkinje layer with a smaller number of cells (according to biological ratios). Thus, this cerebellar-like network includes a dense diverging neural layer (increasing the dimensionality of its internal representation and sparse coding) and a converging neural layer (integration) similar to many other biologically inspired and also artificial neural networks.
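
    The contrast between the two scheduling families can be made concrete with a single leaky integrate-and-fire neuron: a time-driven loop updates the membrane on a fixed grid, while an event-driven loop jumps between input spikes using the closed-form exponential decay. The sketch below is a generic illustration under assumed parameters, not the simulator described in this record.

      import numpy as np

      TAU, V_TH, V_RESET, W = 20.0, 1.0, 0.0, 0.25      # ms and arbitrary units (assumed)

      def time_driven(input_times, t_end, dt=0.1):
          """Time-driven: iterate the membrane equation on a fixed grid."""
          kicks = {int(round(t / dt)) for t in input_times}
          v, spikes = 0.0, []
          for k in range(int(t_end / dt)):
              v -= dt * v / TAU                          # explicit Euler leak
              if k in kicks:
                  v += W                                 # delta-shaped synaptic input
              if v >= V_TH:
                  spikes.append(round(k * dt, 3))
                  v = V_RESET
          return spikes

      def event_driven(input_times, t_end):
          """Event-driven: jump between inputs using the exact exponential decay."""
          v, t_last, spikes = 0.0, 0.0, []
          for t in sorted(t for t in input_times if t <= t_end):
              v = v * np.exp(-(t - t_last) / TAU) + W    # decay since the last event, then kick
              t_last = t
              if v >= V_TH:
                  spikes.append(t)
                  v = V_RESET
          return spikes

      inputs = list(np.arange(1.0, 100.0, 2.0))
      print(time_driven(inputs, 100.0))                  # the two agree up to discretisation error
      print(event_driven(inputs, 100.0))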

  17. Using strategic movement to calibrate a neural compass: a spiking network for tracking head direction in rats and robots.

    Directory of Open Access Journals (Sweden)

    Peter Stratton

    Full Text Available The head direction (HD) system in mammals contains neurons that fire to represent the direction the animal is facing in its environment. The ability of these cells to reliably track head direction even after the removal of external sensory cues implies that the HD system is calibrated to function effectively using just internal (proprioceptive and vestibular) inputs. Rat pups and other infant mammals display stereotypical warm-up movements prior to locomotion in novel environments, and similar warm-up movements are seen in adult mammals with certain brain lesion-induced motor impairments. In this study we propose that synaptic learning mechanisms, in conjunction with appropriate movement strategies based on warm-up movements, can calibrate the HD system so that it functions effectively even in darkness. To examine the link between physical embodiment and neural control, and to determine that the system is robust to real-world phenomena, we implemented the synaptic mechanisms in a spiking neural network and tested it on a mobile robot platform. Results show that the combination of the synaptic learning mechanisms and warm-up movements is able to reliably calibrate the HD system so that it accurately tracks real-world head direction, and that calibration breaks down in systematic ways if certain movements are omitted. This work confirms that targeted, embodied behaviour can be used to calibrate neural systems, demonstrates that 'grounding' of modelled biological processes in the real world can reveal underlying functional principles (supporting the importance of robotics to biology), and proposes a functional role for stereotypical behaviours seen in infant mammals and those animals with certain motor deficits. We conjecture that these calibration principles may extend to the calibration of other neural systems involved in motion tracking and the representation of space, such as grid cells in entorhinal cortex.

  18. Stable learning of functional maps in self-organizing spiking neural networks with continuous synaptic plasticity.

    Science.gov (United States)

    Srinivasa, Narayan; Jiang, Qin

    2013-01-01

    This study describes a spiking model that self-organizes for stable formation and maintenance of orientation and ocular dominance maps in the visual cortex (V1). This self-organization process simulates three development phases: an early experience-independent phase, a late experience-independent phase and a subsequent refinement phase during which experience acts to shape the map properties. The ocular dominance maps that emerge accommodate the two sets of monocular inputs that arise from the lateral geniculate nucleus (LGN) to layer 4 of V1. The orientation selectivity maps that emerge feature well-developed iso-orientation domains and fractures. During the last two phases of development the orientation preferences at some locations appear to rotate continuously through ±180° along circular paths, forming what are referred to as pinwheel-like patterns, but without any corresponding point discontinuities in the orientation gradient maps. The formation of these functional maps is driven by balanced excitatory and inhibitory currents that are established via synaptic plasticity based on spike timing for both excitatory and inhibitory synapses. The stability and maintenance of the formed maps with continuous synaptic plasticity is enabled by homeostasis caused by inhibitory plasticity. However, a prolonged exposure to repeated stimuli does alter the formed maps over time due to plasticity. The results from this study suggest that continuous synaptic plasticity in both excitatory neurons and interneurons could play a critical role in the formation, stability, and maintenance of functional maps in the cortex.

  19. Fractal character of the neural spike train in the visual system of the cat.

    Science.gov (United States)

    Teich, M C; Heneghan, C; Lowen, S B; Ozaki, T; Kaplan, E

    1997-03-01

    We used a variety of statistical measures to identify the point process that describes the maintained discharge of retinal ganglion cells (RGC's) and neurons in the lateral geniculate nucleus (LGN) of the cat. These measures are based on both interevent intervals and event counts and include the interevent-interval histogram, rescaled range analysis, the event-number histogram, the Fano factor, the Allan factor, and the periodogram. In addition, we applied these measures to surrogate versions of the data, generated by random shuffling of the order of interevent intervals. The counting statistics reveal 1/f-type fluctuations in the data (long-duration power-law correlation), which are not present in the shuffled data. Estimates of the fractal exponents measured for RGC- and their target LGN-spike trains are similar in value, indicating that the fractal behavior either is transmitted from one cell to the other or has a common origin. The gamma-r renewal process model, often used in the analysis of visual-neuron interevent intervals, describes certain short-term features of the RGC and LGN data reasonably well but fails to account for the long-duration correlation. We present a new model for visual-system nerve-spike firings: a gamma-r renewal process whose mean is modulated by fractal binomial noise. This fractal, doubly stochastic point process characterizes the statistical behavior of both RGC and LGN data sets remarkably well.
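
    Two of the count-based measures listed above are simple to reproduce from a list of spike times. The sketch below (window sizes and the surrogate process are arbitrary choices) computes the Fano factor and the Allan factor as functions of the counting window; for fractal spike trains these grow as a power law of the window length instead of staying near one.

      import numpy as np

      def counts(spike_times, t_end, window):
          edges = np.arange(0.0, t_end + window, window)
          return np.histogram(spike_times, bins=edges)[0]

      def fano_factor(spike_times, t_end, window):
          """Variance-to-mean ratio of counts in non-overlapping windows of length `window`."""
          c = counts(spike_times, t_end, window)
          return c.var() / c.mean()

      def allan_factor(spike_times, t_end, window):
          """Allan factor: mean squared difference of adjacent counts over twice the mean count."""
          c = counts(spike_times, t_end, window)
          return np.mean(np.diff(c) ** 2) / (2.0 * c.mean())

      # a homogeneous Poisson-like surrogate keeps both factors near 1 at every window size;
      # 1/f-type (fractal) firing instead shows power-law growth with the window length
      t_end, n_spikes = 200.0, 4000
      surrogate = np.sort(np.random.uniform(0.0, t_end, n_spikes))
      for w in (0.1, 1.0, 10.0):
          print(w, round(fano_factor(surrogate, t_end, w), 2), round(allan_factor(surrogate, t_end, w), 2))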

  20. Wireless gigabit data telemetry for large-scale neural recording.

    Science.gov (United States)

    Kuan, Yen-Cheng; Lo, Yi-Kai; Kim, Yanghyo; Chang, Mau-Chung Frank; Liu, Wentai

    2015-05-01

    Implantable wireless neural recording from a large ensemble of simultaneously acting neurons is a critical component to thoroughly investigate neural interactions and brain dynamics from freely moving animals. Recent research has shown the feasibility of simultaneously recording from hundreds of neurons and suggests that the ability to record a larger number of neurons results in better signal quality. This massive recording inevitably demands a large amount of data transfer. For example, recording 2000 neurons while keeping the signal fidelity ( > 12 bit, > 40 KS/s per neuron) needs approximately a 1-Gb/s data link. Designing a wireless data telemetry system to support such (or higher) data rate while aiming to lower the power consumption of an implantable device imposes a grand challenge on the neuroscience community. In this paper, we present a wireless gigabit data telemetry for future large-scale neural recording interfaces. This telemetry comprises a pair of low-power gigabit transmitter and receiver operating at 60 GHz, and establishes a short-distance wireless link to transfer the massive amount of neural signals outward from the implanted device. The transmission distance of the received neural signal can be further extended by an externally rendezvous wireless transceiver, which is less power/heat-constrained since it is not at the immediate proximity of the cortex and its radiated signal is not seriously attenuated by the lossy tissue. The gigabit data link has been demonstrated to achieve a high data rate of 6 Gb/s with a bit-error-rate of 10^-12 at a transmission distance of 6 mm, an applicable separation between transmitter and receiver. This high data rate is able to support thousands of recording channels while ensuring a low energy cost per bit of 2.08 pJ/b.
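
    The quoted bandwidth follows directly from the per-channel specification; a back-of-the-envelope check using the numbers in the abstract is given below.

      channels = 2000                 # simultaneously recorded neurons
      sample_rate = 40e3              # samples per second per neuron (> 40 KS/s)
      bits_per_sample = 12            # > 12-bit resolution

      raw_rate_bps = channels * sample_rate * bits_per_sample
      print(f"raw data rate: {raw_rate_bps / 1e9:.2f} Gb/s")          # ~0.96 Gb/s, i.e. roughly 1 Gb/s

      # power implied by the reported 2.08 pJ/b at the demonstrated 6 Gb/s link rate
      print(f"link power at 6 Gb/s: {6e9 * 2.08e-12 * 1e3:.1f} mW")    # ~12.5 mW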

  1. Streaming Parallel GPU Acceleration of Large-Scale filter-based Spiking Neural Networks

    NARCIS (Netherlands)

    L.P. Slazynski (Leszek); S.M. Bohte (Sander)

    2012-01-01

    The arrival of graphics processing (GPU) cards suitable for massively parallel computing promises affordable large-scale neural network simulation previously only available at supercomputing facilities. While the raw numbers suggest that GPUs may outperform CPUs by at least an order of

  2. Diverse synaptic plasticity mechanisms orchestrated to form and retrieve memories in spiking neural networks.

    Science.gov (United States)

    Zenke, Friedemann; Agnes, Everton J; Gerstner, Wulfram

    2015-04-21

    Synaptic plasticity, the putative basis of learning and memory formation, manifests in various forms and across different timescales. Here we show that the interaction of Hebbian homosynaptic plasticity with rapid non-Hebbian heterosynaptic plasticity is, when complemented with slower homeostatic changes and consolidation, sufficient for assembly formation and memory recall in a spiking recurrent network model of excitatory and inhibitory neurons. In the model, assemblies were formed during repeated sensory stimulation and characterized by strong recurrent excitatory connections. Even days after formation, and despite ongoing network activity and synaptic plasticity, memories could be recalled through selective delay activity following the brief stimulation of a subset of assembly neurons. Blocking any component of plasticity prevented stable functioning as a memory network. Our modelling results suggest that the diversity of plasticity phenomena in the brain is orchestrated towards achieving common functional goals.

  3. Diverse synaptic plasticity mechanisms orchestrated to form and retrieve memories in spiking neural networks

    Science.gov (United States)

    Zenke, Friedemann; Agnes, Everton J.; Gerstner, Wulfram

    2015-01-01

    Synaptic plasticity, the putative basis of learning and memory formation, manifests in various forms and across different timescales. Here we show that the interaction of Hebbian homosynaptic plasticity with rapid non-Hebbian heterosynaptic plasticity is, when complemented with slower homeostatic changes and consolidation, sufficient for assembly formation and memory recall in a spiking recurrent network model of excitatory and inhibitory neurons. In the model, assemblies were formed during repeated sensory stimulation and characterized by strong recurrent excitatory connections. Even days after formation, and despite ongoing network activity and synaptic plasticity, memories could be recalled through selective delay activity following the brief stimulation of a subset of assembly neurons. Blocking any component of plasticity prevented stable functioning as a memory network. Our modelling results suggest that the diversity of plasticity phenomena in the brain is orchestrated towards achieving common functional goals. PMID:25897632

  4. Chronic multichannel neural recordings from soft regenerative microchannel electrodes during gait

    Science.gov (United States)

    Musick, Katherine M.; Rigosa, Jacopo; Narasimhan, Shreya; Wurth, Sophie; Capogrosso, Marco; Chew, Daniel J.; Fawcett, James W.; Micera, Silvestro; Lacour, Stéphanie P.

    2015-09-01

    Reliably interfacing a nerve with an electrode array is one of the approaches to restore motor and sensory functions after an injury to the peripheral nerve. Accomplishing this with current technologies is challenging as the electrode-neuron interface often degrades over time, and surrounding myoelectric signals contaminate the neuro-signals in awake, moving animals. The purpose of this study was to evaluate the potential of microchannel electrode implants to monitor, over time and in freely moving animals, neural activity from regenerating nerves. We designed and fabricated implants with silicone rubber and elastic thin-film metallization. Each implant carries an eight-by-twelve matrix of parallel microchannels (of 120 × 110 μm2 cross-section and 4 mm length) and gold thin-film electrodes embedded in the floor of ten of the microchannels. After sterilization, the soft, multi-lumen electrode implant is sutured between the stumps of the sciatic nerve. Over a period of three months and in four rats, the microchannel electrodes recorded spike activity from the regenerating sciatic nerve. Histology indicates mini-nerves formed of axons and supporting cells regenerate robustly in the implants. Analysis of the recorded spikes and gait kinematics over the ten-week period suggests firing patterns collected with the microchannel electrode implant can be associated with different phases of gait.

  5. A decision-making model based on a spiking neural circuit and synaptic plasticity.

    Science.gov (United States)

    Wei, Hui; Bu, Yijie; Dai, Dawei

    2017-10-01

    To adapt to the environment and survive, most animals can control their behaviors by making decisions. The process of decision-making and responding according to cues in the environment is stable, sustainable, and learnable. Understanding how behaviors are regulated by neural circuits and the encoding and decoding mechanisms from stimuli to responses are important goals in neuroscience. From results observed in Drosophila experiments, the underlying decision-making process is discussed, and a neural circuit that implements a two-choice decision-making model is proposed to explain and reproduce the observations. Compared with previous two-choice decision making models, our model uses synaptic plasticity to explain changes in decision output given the same environment. Moreover, biological meanings of parameters of our decision-making model are discussed. In this paper, we explain at the micro-level (i.e., neurons and synapses) how observable decision-making behavior at the macro-level is acquired and achieved.

  6. Automatic spike detection via an artificial neural network using raw EEG data: effects of data preparation and implications in the limitations of online recognition.

    Science.gov (United States)

    Ko, C W; Chung, H W

    2000-03-01

    Automatic detection of epileptic EEG spikes via an artificial neural network has been reported to be feasible using raw EEG data as input. This study re-investigated its suitability by further exploring the effects of data preparation on classification performance testing. Six hundred EEG files (300 spikes and 300 non-spikes) taken from 20 patients were included in this study. Raw EEG data were sent to the neural network using the architecture reported to give best performance (30 input-layer and 6 hidden-layer neurons). Significantly larger weighting of the 10th input-layer neuron was found after training with prepared raw EEG data. The classification process was thus dominated by the peak location. Subsequent analysis showed that online spike detection with an erroneously trained network yielded an area less than 0.5 under the receiver-operating-characteristic curve, and hence performed inferiorly to random assignments. Networks trained and tested using the same unprepared EEG data achieved no better than about 87% true classification rate at equal sensitivity and specificity. The high true classification rate reported previously is believed to be an artifact arising from erroneous data preparation and off-line validation. Spike detection using raw EEG data as input is unlikely to be feasible under current computer technology.

  7. Decision making under uncertainty in a spiking neural network model of the basal ganglia.

    Science.gov (United States)

    Héricé, Charlotte; Khalil, Radwa; Moftah, Marie; Boraud, Thomas; Guthrie, Martin; Garenne, André

    2016-12-01

    The mechanisms of decision-making and action selection are generally thought to be under the control of parallel cortico-subcortical loops connecting back to distinct areas of cortex through the basal ganglia and processing motor, cognitive and limbic modalities of decision-making. We have used these properties to develop and extend a connectionist model at a spiking neuron level based on a previous rate model approach. This model is demonstrated on decision-making tasks that have been studied in primates and the electrophysiology interpreted to show that the decision is made in two steps. To model this, we have used two parallel loops, each of which performs decision-making based on interactions between positive and negative feedback pathways. This model is able to perform two-level decision-making as in primates. We show here that, before learning, synaptic noise is sufficient to drive the decision-making process and that, after learning, the decision is based on the choice that has proven most likely to be rewarded. The model is then submitted to lesion tests, reversal learning and extinction protocols. We show that, under these conditions, it behaves in a consistent manner and provides predictions in accordance with observed experimental data.

  8. A two-stage neural spiking model of visual contrast detection in perimetry

    Science.gov (United States)

    Gardiner, SK; Swanson, WH; Demirel, S; McKendrick, AM; Turpin, A; Johnson, CA

    2008-01-01

    Perimetry is a commonly used clinical test for visual function, limited by high variability. The sources of this variability need to be better understood. In this paper, we investigate whether noise intrinsic to neural firing could explain the variability in normal subjects. We present the most physiologically accurate model to date for stimulus detection in perimetry combining knowledge of the physiology of components of the visual system with signal detection theory, and show that it requires that detection be mediated by multiple cortical cells in order to give predictions consistent with psychometric functions measured in human observers. PMID:18602414

  9. Advancing interconnect density for spiking neural network hardware implementations using traffic-aware adaptive network-on-chip routers.

    Science.gov (United States)

    Carrillo, Snaider; Harkin, Jim; McDaid, Liam; Pande, Sandeep; Cawley, Seamus; McGinley, Brian; Morgan, Fearghal

    2012-09-01

    The brain is highly efficient in how it processes information and tolerates faults. Arguably, the basic processing units are neurons and synapses that are interconnected in a complex pattern. Computer scientists and engineers aim to harness this efficiency and build artificial neural systems that can emulate the key information processing principles of the brain. However, existing approaches cannot provide the dense interconnect for the billions of neurons and synapses that are required. Recently a reconfigurable and biologically inspired paradigm based on network-on-chip (NoC) and spiking neural networks (SNNs) has been proposed as a new method of realising an efficient, robust computing platform. However, the use of the NoC as an interconnection fabric for large-scale SNNs demands a good trade-off between scalability, throughput, neuron/synapse ratio and power consumption. This paper presents a novel traffic-aware, adaptive NoC router, which forms part of a proposed embedded mixed-signal SNN architecture called EMBRACE (EMulating Biologically-inspiRed ArChitectures in hardwarE). The proposed adaptive NoC router provides the inter-neuron connectivity for EMBRACE, maintaining router communication and avoiding dropped router packets by adapting to router traffic congestion. Results are presented on throughput, power and area performance analysis of the adaptive router using a 90 nm CMOS technology which outperforms existing NoCs in this domain. The adaptive behaviour of the router is also verified on a Stratix II FPGA implementation of a 4 × 2 router array with real-time traffic congestion. The presented results demonstrate the feasibility of using the proposed adaptive NoC router within the EMBRACE architecture to realise large-scale SNNs on embedded hardware. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. SNAVA-A real-time multi-FPGA multi-model spiking neural network simulation architecture.

    Science.gov (United States)

    Sripad, Athul; Sanchez, Giovanny; Zapata, Mireya; Pirrone, Vito; Dorta, Taho; Cambria, Salvatore; Marti, Albert; Krishnamourthy, Karthikeyan; Madrenas, Jordi

    2018-01-01

    Spiking Neural Networks (SNN) for Versatile Applications (SNAVA) simulation platform is a scalable and programmable parallel architecture that supports real-time, large-scale, multi-model SNN computation. This parallel architecture is implemented in modern Field-Programmable Gate Arrays (FPGAs) devices to provide high performance execution and flexibility to support large-scale SNN models. Flexibility is defined in terms of programmability, which allows easy synapse and neuron implementation. This has been achieved by using a special-purpose Processing Elements (PEs) for computing SNNs, and analyzing and customizing the instruction set according to the processing needs to achieve maximum performance with minimum resources. The parallel architecture is interfaced with customized Graphical User Interfaces (GUIs) to configure the SNN's connectivity, to compile the neuron-synapse model and to monitor SNN's activity. Our contribution intends to provide a tool that allows to prototype SNNs faster than on CPU/GPU architectures but significantly cheaper than fabricating a customized neuromorphic chip. This could be potentially valuable to the computational neuroscience and neuromorphic engineering communities. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Epileptic focus localization based on resting state interictal MEG recordings is feasible irrespective of the presence or absence of spikes.

    Science.gov (United States)

    Krishnan, B; Vlachos, I; Wang, Z I; Mosher, J; Najm, I; Burgess, R; Iasemidis, L; Alexopoulos, A V

    2015-04-01

    To investigate whether epileptogenic focus localization is possible based on resting state connectivity analysis of magnetoencephalographic (MEG) data. A multivariate autoregressive (MVAR) model was constructed using the sensor space data and was projected to the source space using lead field and inverse matrix. The generalized partial directed coherence was estimated from the MVAR model in the source space. The dipole with the maximum information inflow was hypothesized to be within the epileptogenic focus. Applying the focus localization algorithm (FLA) to the interictal MEG recordings from five patients with neocortical epilepsy, who underwent presurgical evaluation for the identification of epileptogenic focus, we were able to correctly localize the focus, on the basis of maximum interictal information inflow in the presence or absence of interictal epileptic spikes in the data, with three out of five patients undergoing resective surgery and being seizure free since. Our preliminary results suggest that accurate localization of the epileptogenic focus may be accomplished using noninvasive spontaneous "resting-state" recordings of relatively brief duration and without the need to capture definite interictal and/or ictal abnormalities. Epileptogenic focus localization is possible through connectivity analysis of resting state MEG data irrespective of the presence/absence of spikes. Copyright © 2015. Published by Elsevier Ireland Ltd.

  12. Real-time radionuclide identification in γ-emitter mixtures based on spiking neural network.

    Science.gov (United States)

    Bobin, C; Bichler, O; Lourenço, V; Thiam, C; Thévenin, M

    2016-03-01

    Portal radiation monitors dedicated to the prevention of illegal traffic of nuclear materials at international borders need to deliver as fast as possible a radionuclide identification of a potential radiological threat. Spectrometry techniques applied to identify the radionuclides contributing to γ-emitter mixtures are usually performed using off-line spectrum analysis. As an alternative to these usual methods, a real-time processing based on an artificial neural network and Bayes' rule is proposed for fast radionuclide identification. The validation of this real-time approach was carried out using γ-emitter spectra ((241)Am, (133)Ba, (207)Bi, (60)Co, (137)Cs) obtained with a high-efficiency well-type NaI(Tl). The first tests showed that the proposed algorithm enables a fast identification of each γ-emitting radionuclide using the information given by the whole spectrum. Based on an iterative process, the on-line analysis only needs low-statistics spectra without energy calibration to identify the nature of a radiological threat. Copyright © 2015 Elsevier Ltd. All rights reserved.
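
    The record does not spell out the algorithmic details, but the coupling of a per-event classifier with Bayes' rule can be illustrated generically as in the sketch below; the class list comes from the abstract, while the uniform prior, stopping threshold, and toy likelihood stream are assumptions.

      import numpy as np

      NUCLIDES = ["Am-241", "Ba-133", "Bi-207", "Co-60", "Cs-137"]

      def bayes_step(prior, likelihoods):
          """One recursive update: posterior is proportional to likelihood times prior."""
          post = prior * likelihoods
          return post / post.sum()

      def identify(event_likelihoods, threshold=0.95):
          """Accumulate per-event class likelihoods (e.g., classifier outputs) until
          one nuclide's posterior exceeds the threshold."""
          posterior = np.full(len(NUCLIDES), 1.0 / len(NUCLIDES))     # uniform prior
          for n_events, lik in enumerate(event_likelihoods, start=1):
              posterior = bayes_step(posterior, np.asarray(lik, float))
              if posterior.max() >= threshold:
                  break
          return NUCLIDES[int(posterior.argmax())], posterior.max(), n_events

      # toy stream of noisy per-event likelihoods mildly favouring Cs-137
      rng = np.random.default_rng(0)
      stream = (np.clip(rng.normal([0.15, 0.15, 0.15, 0.15, 0.40], 0.05), 1e-3, None)
                for _ in range(500))
      print(identify(stream))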

  13. Artificial earthquake record generation using cascade neural network

    Directory of Open Access Journals (Sweden)

    Bani-Hani Khaldoon A.

    2017-01-01

    Full Text Available This paper presents the results of using artificial neural networks (ANN) in an inverse mapping problem for earthquake accelerogram generation. The study comprises two parts. The first is a 1-D site response analysis performed for the Emirate of Dubai, UAE, where eight earthquake records are selected and spectral matching is performed against the Dubai response spectrum using SeismoMatch software. Site classification of Dubai soil is considered for two classes, C and D, based on the shear wave velocity of soil profiles. Amplification factors are estimated to quantify the Dubai soil effect, and Dubai’s design response spectra are developed for site classes C and D according to the International Building Code (IBC-2012). In the second part, an ANN is employed to solve the inverse mapping problem of generating a time history earthquake record. Thirty earthquake records and their design response spectra with 5% damping are used to train two cascade forward backward neural networks (ANN1, ANN2). ANN1 is trained to map the design response spectrum to a time history and ANN2 is trained to map time history records to the design response spectrum. Generalized time history earthquake records are generated using ANN1 for Dubai’s site classes C and D, and ANN2 is used to evaluate the performance of ANN1.

  14. Mapping spikes to sensations

    Directory of Open Access Journals (Sweden)

    Maik Christopher Stüttgen

    2011-11-01

    Full Text Available Single-unit recordings conducted during perceptual decision-making tasks have yielded tremendous insights into the neural coding of sensory stimuli. In such experiments, detection or discrimination behavior (the psychometric data) is observed in parallel with spike trains in sensory neurons (the neurometric data). Frequently, candidate neural codes for information read-out are pitted against each other by transforming the neurometric data in some way and asking which code’s performance most closely approximates the psychometric performance. The code that matches the psychometric performance best is retained as a viable candidate and the others are rejected. In following this strategy, psychometric data are often considered to provide an unbiased measure of perceptual sensitivity. It is rarely acknowledged that psychometric data result from a complex interplay of sensory and non-sensory processes and that neglect of these processes may result in misestimating psychophysical sensitivity. This again may lead to erroneous conclusions regarding the adequacy of neural candidate codes. In this review, we first discuss requirements on the neural data for a subsequent neurometric-psychometric comparison. We then focus on different psychophysical tasks for the assessment of detection and discrimination performance and the cognitive processes that may underlie their execution. We discuss further factors that may compromise psychometric performance and how they can be detected or avoided. We believe that these considerations point to shortcomings in our understanding of the processes underlying perceptual decisions, and therefore offer potential for future research.

  15. Polychronization: computation with spikes.

    Science.gov (United States)

    Izhikevich, Eugene M

    2006-02-01

    We present a minimal spiking network that can polychronize, that is, exhibit reproducible time-locked but not synchronous firing patterns with millisecond precision, as in synfire braids. The network consists of cortical spiking neurons with axonal conduction delays and spike-timing-dependent plasticity (STDP); a ready-to-use MATLAB code is included. It exhibits sleeplike oscillations, gamma (40 Hz) rhythms, conversion of firing rates to spike timings, and other interesting regimes. Due to the interplay between the delays and STDP, the spiking neurons spontaneously self-organize into groups and generate patterns of stereotypical polychronous activity. To our surprise, the number of coexisting polychronous groups far exceeds the number of neurons in the network, resulting in an unprecedented memory capacity of the system. We speculate on the significance of polychrony to the theory of neuronal group selection (TNGS, neural Darwinism), cognitive neural computations, binding and gamma rhythm, mechanisms of attention, and consciousness as "attention to memories."
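
    The paper ships its own MATLAB implementation; purely as a language-neutral illustration of the two ingredients it combines, axonal conduction delays and pair-based STDP, a toy weight update over explicit spike lists might look as follows (amplitudes, time constants, and the example spike times are assumptions).

      import numpy as np

      A_PLUS, A_MINUS = 0.01, 0.012          # STDP amplitudes (assumed)
      TAU_PLUS = TAU_MINUS = 20.0            # STDP time constants in ms (assumed)

      def stdp_dw(dt):
          """Weight change for one pre/post pair, dt = t_post - t_pre_arrival (ms)."""
          if dt >= 0.0:                                  # pre arrives before post: potentiation
              return A_PLUS * np.exp(-dt / TAU_PLUS)
          return -A_MINUS * np.exp(dt / TAU_MINUS)       # post before pre arrival: depression

      def update_weights(weights, delays, pre_spikes, post_spikes):
          """Pair-based STDP over explicit spike lists; each presynaptic spike is
          shifted by its axonal conduction delay before being paired with the
          postsynaptic spikes."""
          new_weights = []
          for w, d, spikes in zip(weights, delays, pre_spikes):
              dw = sum(stdp_dw(t_post - (t_pre + d))
                       for t_pre in spikes for t_post in post_spikes)
              new_weights.append(float(np.clip(w + dw, 0.0, 1.0)))
          return new_weights

      weights = [0.5, 0.5]
      delays = [1.0, 10.0]                   # ms; only the short delay pairs "causally" with the post spikes
      pre_spikes = [[5.0, 45.0], [5.0, 45.0]]
      post_spikes = [8.0, 48.0]
      print(update_weights(weights, delays, pre_spikes, post_spikes))   # first weight grows, second shrinks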

  16. EEG in the classroom: Synchronised neural recordings during video presentation

    Science.gov (United States)

    Poulsen, Andreas Trier; Kamronn, Simon; Dmochowski, Jacek; Parra, Lucas C.; Hansen, Lars Kai

    2017-03-01

    We performed simultaneous recordings of electroencephalography (EEG) from multiple students in a classroom, and measured the inter-subject correlation (ISC) of activity evoked by a common video stimulus. The neural reliability, as quantified by ISC, has been linked to engagement and attentional modulation in earlier studies that used high-grade equipment in laboratory settings. Here we reproduce many of the results from these studies using portable low-cost equipment, focusing on the robustness of using ISC for subjects experiencing naturalistic stimuli. The present data shows that stimulus-evoked neural responses, known to be modulated by attention, can be tracked for groups of students with synchronized EEG acquisition. This is a step towards real-time inference of engagement in the classroom.

  17. Fractal dimension analysis for spike detection in low SNR extracellular signals.

    Science.gov (United States)

    Salmasi, Mehrdad; Büttner, Ulrich; Glasauer, Stefan

    2016-06-01

    Many algorithms have been suggested for detection and sorting of spikes in extracellular recording. Nevertheless, it is still challenging to detect spikes in low signal-to-noise ratios (SNR). We propose a spike detection algorithm that is based on the fractal properties of extracellular signals and can detect spikes in low SNR regimes. Semi-intact spikes are low-amplitude spikes whose shapes are almost preserved. The detection of these spikes can significantly enhance the performance of multi-electrode recording systems. Semi-intact spikes are simulated by adding three noise components to a spike train: thermal noise, inter-spike noise, and spike-level noise. We show that simulated signals have fractal properties which make them proper candidates for fractal analysis. Then we use fractal dimension as the main core of our spike detection algorithm and call it fractal detector. The performance of the fractal detector is compared with three frequently used spike detectors. We demonstrate that in low SNR, the fractal detector has the best performance and results in the highest detection probability. It is shown that, in contrast to the other three detectors, the performance of the fractal detector is independent of inter-spike noise power and that variations in spike shape do not alter its performance. Finally, we use the fractal detector for spike detection in experimental data and similar to simulations, it is shown that the fractal detector has the best performance in low SNR regimes. The detection of low-amplitude spikes provides more information about the neural activity in the vicinity of the recording electrodes. Our results suggest using the fractal detector as a reliable and robust method for detecting semi-intact spikes in low SNR extracellular signals.
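
    The record does not state which fractal-dimension estimator is used. As one concrete possibility, a sliding-window Higuchi dimension can flag spike-containing segments, since a transient waveform lowers the local dimension relative to broadband noise; the window length, k_max, and threshold in the sketch below are assumptions.

      import numpy as np

      def higuchi_fd(x, k_max=8):
          """Higuchi fractal dimension of a 1-D signal segment."""
          n = len(x)
          lk = []
          for k in range(1, k_max + 1):
              lengths = []
              for m in range(k):
                  idx = np.arange(m, n, k)
                  if len(idx) < 2:
                      continue
                  lm = np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / ((len(idx) - 1) * k)
                  lengths.append(lm / k)
              lk.append(np.mean(lengths))
          slope, _ = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(lk), 1)
          return slope

      def detect_spike_windows(signal, fs, win_ms=2.0, threshold=1.8):
          """Flag window start times whose fractal dimension falls below threshold
          (a spike-like transient is 'smoother', hence lower-dimensional, than noise)."""
          win = int(win_ms * 1e-3 * fs)
          hits = []
          for start in range(0, len(signal) - win, win // 2):    # 50% overlap
              if higuchi_fd(signal[start:start + win]) < threshold:
                  hits.append(start / fs)
          return hits

      # toy trace: white noise with two injected spike-like deflections
      fs = 30000
      trace = 0.2 * np.random.randn(fs // 10)
      for t0 in (0.02, 0.07):
          i = int(t0 * fs)
          trace[i:i + 30] -= 2.0 * np.hanning(30)
      print(detect_spike_windows(trace, fs))     # windows near 0.02 s and 0.07 s are expected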

  18. Time Multiplexed Active Neural Probe with 1356 Parallel Recording Sites

    Directory of Open Access Journals (Sweden)

    Bogdan C. Raducanu

    2017-10-01

    Full Text Available We present a high electrode density and high channel count CMOS (complementary metal-oxide-semiconductor) active neural probe containing 1344 neuron-sized recording pixels (20 µm × 20 µm) and 12 reference pixels (20 µm × 80 µm), densely packed on a 50 µm thick, 100 µm wide, and 8 mm long shank. The active electrodes or pixels consist of dedicated in-situ circuits for signal source amplification, which are directly located under each electrode. The probe supports the simultaneous recording of all 1356 electrodes with sufficient signal to noise ratio for typical neuroscience applications. For enhanced performance, further noise reduction can be achieved while using half of the electrodes (678). Both of these numbers considerably surpass the state-of-the-art active neural probes in both electrode count and number of recording channels. The measured input referred noise in the action potential band is 12.4 µVrms, while using 678 electrodes, with just 3 µW power dissipation per pixel and 45 µW per read-out channel (including data transmission).

  19. Polymer neural interface with dual-sided electrodes for neural stimulation and recording.

    Science.gov (United States)

    Tooker, Angela; Tolosa, Vanessa; Shah, Kedar G; Sheth, Heeral; Felix, Sarah; Delima, Terri; Pannu, Satinderpall

    2012-01-01

    We present here a demonstration of a dual-sided, 4-layer metal, polyimide-based electrode array suitable for neural stimulation and recording. The fabrication process outlined here utilizes simple polymer and metal deposition and etching steps, with no potentially harmful backside etches or long exposures to extremely toxic chemicals. These polyimide-based electrode arrays have been tested to ensure they are fully biocompatible and suitable for long-term implantation; their flexibility minimizes the injury and glial scarring that can occur at the implantation site. The creation of dual-side electrode arrays with more than two layers of trace metal enables the fabrication of neural probes with more electrodes without a significant increase in probe size. This allows for more stimulation/recording sites without inducing additional injury and glial scarring.

  20. Spike sorting for polytrodes: a divide and conquer approach

    Directory of Open Access Journals (Sweden)

    Nicholas V. Swindale

    2014-02-01

    Full Text Available In order to determine patterns of neural activity, spike signals recorded by extracellular electrodes have to be clustered (sorted) with the aim of ensuring that each cluster represents all the spikes generated by an individual neuron. Many methods for spike sorting have been proposed but few are easily applicable to recordings from polytrodes which may have 16 or more recording sites. As with tetrodes, these are spaced sufficiently closely that signals from single neurons will usually be recorded on several adjacent sites. Although this offers a better chance of distinguishing neurons with similarly shaped spikes, sorting is difficult in such cases because of the high dimensionality of the space in which the signals must be classified. This report details a method for spike sorting based on a divide and conquer approach. Clusters are initially formed by assigning each event to the channel on which it is largest. Each channel-based cluster is then sub-divided into as many distinct clusters as possible. These are then recombined on the basis of pairwise tests into a final set of clusters. Pairwise tests are also performed to establish how distinct each cluster is from the others. A modified gradient ascent clustering (GAC) algorithm is used to do the clustering. The method can sort spikes with minimal user input in times comparable to real time for recordings lasting up to 45 minutes. Our results illustrate some of the difficulties inherent in spike sorting, including changes in spike shape over time. We show that some physiologically distinct units may have very similar spike shapes. We show that RMS measures of spike shape similarity are not sensitive enough to discriminate clusters that can otherwise be separated by principal components analysis. Hence spike sorting based on least-squares matching to templates may be unreliable. Our methods should be applicable to tetrodes and scaleable to larger multi-electrode arrays (MEAs).
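
    A highly simplified sketch of the first two stages, assigning each event to the channel on which it is largest and then clustering within each channel group, is given below. The PCA and k-means steps merely stand in for the paper's gradient ascent clustering and pairwise merge tests, and all settings and the toy data are assumptions.

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      def first_pass_sort(waveforms, n_components=3, max_units_per_channel=3):
          """Divide-and-conquer first pass over event waveforms.

          waveforms : array (n_events, n_channels, n_samples)
          returns   : array (n_events,) of provisional unit labels
          """
          # 1) assign every event to the channel carrying its largest deflection
          peak_channels = np.abs(waveforms).max(axis=2).argmax(axis=1)
          labels = np.full(len(waveforms), -1)
          next_label = 0
          # 2) cluster each channel group separately in a low-dimensional feature space
          for ch in np.unique(peak_channels):
              idx = np.where(peak_channels == ch)[0]
              feats = waveforms[idx].reshape(len(idx), -1)
              if len(idx) > n_components:
                  feats = PCA(n_components=n_components).fit_transform(feats)
              k = min(max_units_per_channel, len(idx))
              labels[idx] = KMeans(n_clusters=k, n_init=10).fit_predict(feats) + next_label
              next_label += k
          return labels

      # toy data: 200 events on a 4-channel probe, largest trough on a random channel
      rng = np.random.default_rng(1)
      wf = rng.normal(size=(200, 4, 32)) * 0.1
      wf[np.arange(200), rng.integers(0, 4, size=200), 10] -= 3.0
      print(np.unique(first_pass_sort(wf)))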

  1. Pontine-ventral respiratory column interactions through raphe circuits detected using multi-array spike train recordings.

    Science.gov (United States)

    Nuding, Sarah C; Segers, Lauren S; Baekey, David M; Dick, Thomas E; Solomon, Irene C; Shannon, Roger; Morris, Kendall F; Lindsey, Bruce G

    2009-06-01

    Recently, Segers et al. identified functional connectivity between the ventrolateral respiratory column (VRC) and the pontine respiratory group (PRG). The apparent sparseness of detected paucisynaptic interactions motivated consideration of other potential functional pathways between these two regions. We report here evidence for "indirect" serial functional linkages between the PRG and VRC via intermediary brain stem midline raphé neurons. Arrays of microelectrodes were used to record sets of spike trains from a total of 145 PRG, 282 VRC, and 340 midline neurons in 11 decerebrate, vagotomized, neuromuscularly blocked, ventilated cats. Spike trains of 13,843 pairs of neurons that included at least one raphé cell were screened for respiratory modulation and short-time scale correlations. Significant correlogram features were detected in 7.2% of raphé-raphé (291/4,021), 4.3% of VRC-raphé (292/6,755), and 4.0% of the PRG-raphé (124/3,067) neuron pairs. Central peaks indicative of shared influences were the most common feature in correlations between pairs of raphé neurons, whereas correlated raphé-PRG and raphé-VRC neuron pairs displayed predominantly offset peaks and troughs, features suggesting a paucisynaptic influence of one neuron on the other. Overall, offset correlogram features provided evidence for 33 VRC-to-raphé-to-PRG and 45 PRG-to-raphé-to-VRC correlational linkage chains with one or two intermediate raphé neurons. The results support a respiratory network architecture with parallel VRC-to-PRG and PRG-to-VRC links operating through intervening midline circuits, and suggest that raphé neurons contribute to the respiratory modulation of PRG neurons and shape the respiratory motor pattern through coordinated divergent actions on both the PRG and VRC.
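
    Offset peaks and troughs of the kind described are read from spike-train cross-correlograms; a minimal construction from two spike-time lists (bin width, lag range, and the toy data are arbitrary) is sketched below.

      import numpy as np

      def cross_correlogram(ref_spikes, target_spikes, max_lag=0.05, bin_width=0.001):
          """Histogram of target spike times relative to every reference spike.
          An offset peak suggests a (pauci)synaptic influence of one train on the
          other; a central peak suggests a shared input."""
          edges = np.arange(-max_lag, max_lag + bin_width, bin_width)
          counts = np.zeros(len(edges) - 1)
          target = np.asarray(target_spikes)
          for t_ref in ref_spikes:
              d = target - t_ref
              counts += np.histogram(d[(d >= -max_lag) & (d < max_lag)], bins=edges)[0]
          return 0.5 * (edges[:-1] + edges[1:]), counts

      # toy example: the target fires ~5 ms after ~30% of the reference spikes
      rng = np.random.default_rng(2)
      ref = np.sort(rng.uniform(0.0, 100.0, 2000))
      mask = rng.random(len(ref)) < 0.3
      follow = ref[mask] + 0.005 + rng.normal(0.0, 0.001, size=mask.sum())
      target = np.sort(np.concatenate([follow, rng.uniform(0.0, 100.0, 1500)]))
      lags, counts = cross_correlogram(ref, target)
      print(f"peak at {lags[counts.argmax()] * 1e3:.1f} ms")      # expected near +5 ms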

  2. FNS: an event-driven spiking neural network framework for efficient simulations of large-scale brain models

    OpenAIRE

    Susi, Gianluca; Garces, Pilar; Cristini, Alessandro; Paracone, Emanuele; Salerno, Mario; Maestu, Fernando; Pereda, Ernesto

    2018-01-01

    Limitations in processing capabilities and memory of today's computers make spiking neuron-based (human) whole-brain simulations inevitably characterized by a compromise between bio-plausibility and computational cost. This translates into brain models composed of a reduced number of neurons and simplified neuron mathematical models. Taking advantage of the sparse character of brain-like computation, the event-driven technique allows us to carry out efficient simulation of large-scale Spiking Neu...

  3. Amorphous silicon carbide ultramicroelectrode arrays for neural stimulation and recording.

    Science.gov (United States)

    Deku, Felix; Cohen, Yarden; Joshi-Imre, Alexandra; Kanneganti, Aswini; Gardner, Timothy; Cogan, Stuart

    2017-09-27

    Foreign body response to indwelling cortical microelectrodes limits the reliability of neural stimulation and recording, particularly for extended chronic applications in behaving animals. The extent to which this response compromises the chronic stability of neural devices depends on many factors including the materials used in the electrode construction, the size, and geometry of the indwelling structure. Here, we report on the development of microelectrode arrays (MEAs) based on amorphous silicon carbide (a-SiC). This technology utilizes a-SiC for its chronic stability and employs semiconductor manufacturing processes to create MEAs with small shank dimensions. The a-SiC films were deposited by plasma enhanced chemical vapor deposition and patterned by thin-film photolithographic techniques. To improve stimulation and recording capabilities with small contact areas, we investigated low impedance coatings on the electrode sites. The assembled devices were characterized in phosphate buffered saline for their electrochemical properties. MEAs utilizing a-SiC as both the primary structural element and encapsulation were fabricated successfully. These a-SiC MEAs had 16 penetrating shanks. Each shank has a cross-sectional area less than 60 µm2 and electrode sites with a geometric surface area varying from 20-200 μm2. Electrode coatings of TiN and SIROF reduced 1 kHz electrode impedance to less than 100 kΩ from ~2.8 MΩ for 100 µm2 Au electrode sites and increased the charge injection capacities to values greater than 3 mC/cm2. Finally, we demonstrated functionality by recording neural activity from basal ganglia nucleus of Zebra Finches and motor cortex of rat. The a-SiC MEAs provide a significant advancement in the development of microelectrodes that over the years has relied on silicon platforms for device manufacture. These flexible a-SiC MEAs have the potential for decreased tissue damage and reduced foreign body

  4. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks.

    Science.gov (United States)

    Naveros, Francisco; Garrido, Jesus A; Carrillo, Richard R; Ros, Eduardo; Luque, Niceto R

    2017-01-01

    Modeling and simulating the neural structures which make up our central neural system is instrumental for deciphering the computational neural cues beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from neural to behavioral levels. This paper focuses on overcoming the simulation problems (accuracy and performance) derived from using higher levels of mathematical complexity at a neural level. This study proposes different techniques for simulating neural models that hold incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranged from low to high neural complexity). The studied techniques are classified into two main families depending on how the neural-model dynamic evaluation is computed: the event-driven or the time-driven families. Whilst event-driven techniques pre-compile and store the neural dynamics within look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation time. We propose two modifications for the event-driven family: a look-up table recombination to better cope with the incremental neural complexity together with a better handling of the synchronous input activity. Regarding the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running in CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications which constitute the main contribution of this study systematically outperform the traditional event- and time-driven techniques under

  5. Event- and Time-Driven Techniques Using Parallel CPU-GPU Co-processing for Spiking Neural Networks

    Science.gov (United States)

    Naveros, Francisco; Garrido, Jesus A.; Carrillo, Richard R.; Ros, Eduardo; Luque, Niceto R.

    2017-01-01

    Modeling and simulating the neural structures which make up our central neural system is instrumental for deciphering the computational neural cues beneath. Higher levels of biological plausibility usually impose higher levels of complexity in mathematical modeling, from neural to behavioral levels. This paper focuses on overcoming the simulation problems (accuracy and performance) derived from using higher levels of mathematical complexity at a neural level. This study proposes different techniques for simulating neural models that hold incremental levels of mathematical complexity: leaky integrate-and-fire (LIF), adaptive exponential integrate-and-fire (AdEx), and Hodgkin-Huxley (HH) neural models (ranged from low to high neural complexity). The studied techniques are classified into two main families depending on how the neural-model dynamic evaluation is computed: the event-driven or the time-driven families. Whilst event-driven techniques pre-compile and store the neural dynamics within look-up tables, time-driven techniques compute the neural dynamics iteratively during the simulation time. We propose two modifications for the event-driven family: a look-up table recombination to better cope with the incremental neural complexity together with a better handling of the synchronous input activity. Regarding the time-driven family, we propose a modification in computing the neural dynamics: the bi-fixed-step integration method. This method automatically adjusts the simulation step size to better cope with the stiffness of the neural model dynamics running in CPU platforms. One version of this method is also implemented for hybrid CPU-GPU platforms. Finally, we analyze how the performance and accuracy of these modifications evolve with increasing levels of neural complexity. We also demonstrate how the proposed modifications which constitute the main contribution of this study systematically outperform the traditional event- and time-driven techniques under

  6. Low-cost wireless neural recording system and software.

    Science.gov (United States)

    Gregory, Jeffrey A; Borna, Amir; Roy, Sabyasachi; Wang, Xiaoqin; Lewandowski, Brian; Schmidt, Marc; Najafi, Khalil

    2009-01-01

    We describe a flexible wireless neural recording system, which comprises a 15-channel analog FM transmitter, a digital receiver, and custom user interface software for data acquisition. The analog front-end is constructed from commercial off-the-shelf (COTS) components, weighs 6.3 g (including batteries), and is capable of transmitting for over 24 hours at a range of over 3 m with a 25 µVrms in-vivo noise floor. The Software Defined Radio (SDR) and the acquisition software provide a data acquisition platform with real-time data display and can be customized based on the specifications of various experiments. The described system was characterized with in-vitro and in-vivo experiments and the results are presented.

  7. Two multichannel integrated circuits for neural recording and signal processing.

    Science.gov (United States)

    Obeid, Iyad; Morizio, James C; Moxon, Karen A; Nicolelis, Miguel A L; Wolf, Patrick D

    2003-02-01

    We have developed, manufactured, and tested two analog CMOS integrated circuit "neurochips" for recording from arrays of densely packed neural electrodes. Device A is a 16-channel buffer consisting of parallel noninverting amplifiers with a gain of 2 V/V. Device B is a 16-channel two-stage analog signal processor with differential amplification and high-pass filtering. It features selectable gains of 250 and 500 V/V as well as reference channel selection. The resulting amplifiers on Device A had a mean gain of 1.99 V/V with an equivalent input noise of 10 μVrms. Those on Device B had mean gains of 53.4 and 47.4 dB with a high-pass filter pole at 211 Hz and an equivalent input noise of 4.4 μVrms. Both devices were tested in vivo with electrode arrays implanted in the somatosensory cortex.

  8. A fast L(p) spike alignment metric.

    Science.gov (United States)

    Dubbs, Alexander J; Seiler, Brad A; Magnasco, Marcelo O

    2010-11-01

    The metrization of the space of neural responses is an ongoing research program seeking to find natural ways to describe, in geometrical terms, the sets of possible activities in the brain. One component of this program is spike metrics: notions of distance between two spike trains recorded from a neuron. Alignment spike metrics work by identifying "equivalent" spikes in both trains. We present an alignment spike metric having L(p) underlying geometrical structure; the L(2) version is Euclidean and is suitable for further embedding in Euclidean spaces by multidimensional scaling methods or related procedures. We show how to implement a fast algorithm for the computation of this metric based on bipartite graph matching theory.
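
    The following sketch illustrates the bipartite-matching idea behind such alignment metrics in Python, in a deliberately simplified form: it assumes the two trains contain the same number of spikes and uses |t_i - s_j|^p matching costs, whereas the published metric also handles unmatched spikes through insertion/deletion penalties.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def lp_alignment_distance(train_a, train_b, p=2):
        a, b = np.asarray(train_a, float), np.asarray(train_b, float)
        assert len(a) == len(b), "this simplified sketch assumes equal spike counts"
        cost = np.abs(a[:, None] - b[None, :]) ** p    # pairwise alignment costs
        rows, cols = linear_sum_assignment(cost)       # minimum-cost bipartite matching
        return cost[rows, cols].sum() ** (1.0 / p)

    print(lp_alignment_distance([0.01, 0.05, 0.12], [0.02, 0.06, 0.10], p=2))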

  9. Brainlab: a Python toolkit to aid in the design, simulation, and analysis of spiking neural networks with the NeoCortical Simulator

    Directory of Open Access Journals (Sweden)

    Richard P Drewes

    2009-05-01

    Full Text Available Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS environment in particular. Brainlab is an integrated model building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS (the NeoCortical Simulator).

  10. Brainlab: A Python Toolkit to Aid in the Design, Simulation, and Analysis of Spiking Neural Networks with the NeoCortical Simulator.

    Science.gov (United States)

    Drewes, Rich; Zou, Quan; Goodman, Philip H

    2009-01-01

    Neuroscience modeling experiments often involve multiple complex neural network and cell model variants, complex input stimuli and input protocols, followed by complex data analysis. Coordinating all this complexity becomes a central difficulty for the experimenter. The Python programming language, along with its extensive library packages, has emerged as a leading "glue" tool for managing all sorts of complex programmatic tasks. This paper describes a toolkit called Brainlab, written in Python, that leverages Python's strengths for the task of managing the general complexity of neuroscience modeling experiments. Brainlab was also designed to overcome the major difficulties of working with the NCS (NeoCortical Simulator) environment in particular. Brainlab is an integrated model-building, experimentation, and data analysis environment for the powerful parallel spiking neural network simulator system NCS.

  11. Traditional waveform based spike sorting yields biased rate code estimates.

    Science.gov (United States)

    Ventura, Valérie

    2009-04-28

    Much of neuroscience has to do with relating neural activity and behavior or environment. One common measure of this relationship is the firing rates of neurons as functions of behavioral or environmental parameters, often called tuning functions and receptive fields. Firing rates are estimated from the spike trains of neurons recorded by electrodes implanted in the brain. Individual neurons' spike trains are not typically readily available, because the signal collected at an electrode is often a mixture of activities from different neurons and noise. Extracting individual neurons' spike trains from voltage signals, which is known as spike sorting, is one of the most important data analysis problems in neuroscience, because it has to be undertaken prior to any analysis of neurophysiological data in which more than one neuron is believed to be recorded on a single electrode. All current spike-sorting methods consist of clustering the characteristic spike waveforms of neurons. The sequence of first spike sorting based on waveforms, then estimating tuning functions, has long been the accepted way to proceed. Here, we argue that the covariates that modulate tuning functions also contain information about spike identities, and that if tuning information is ignored for spike sorting, the resulting tuning function estimates are biased and inconsistent, unless spikes can be classified with perfect accuracy. This means, for example, that the commonly used peristimulus time histogram is a biased estimate of the firing rate of a neuron that is not perfectly isolated. We further argue that the correct conceptual way to view the problem is to note that spike sorting provides information about rate estimation and vice versa, so that the two relationships should be considered simultaneously rather than sequentially. Indeed, we show that when spike sorting and tuning-curve estimation are performed in parallel, unbiased estimates of tuning curves can be recovered even from

  12. A 500 year sediment lake record of anthropogenic and natural inputs to Windermere (English Lake District) using double-spike lead isotopes, radiochronology, and sediment microanalysis

    OpenAIRE

    Miller, Helen; Croudace, Ian W.; Bull, Jonathan M.; Cotterill, Carol J.; Dix, Justin K.; Taylor, Rex N.

    2014-01-01

    A high-resolution record of pollution is preserved in recent sediments from Windermere, the largest lake in the English Lake District. Data derived from X-ray core scanning (validated against wavelength dispersive X-ray fluorescence), radiochronological techniques (210Pb and 137Cs) and ultrahigh precision, double-spike mass spectrometry for lead isotopes are combined to decipher the anthropogenic inputs to the lake. The sediment record suggests that while most element concentrations have been...

  13. Real-time simulation of a spiking neural network model of the basal ganglia circuitry using general purpose computing on graphics processing units.

    Science.gov (United States)

    Igarashi, Jun; Shouno, Osamu; Fukai, Tomoki; Tsujino, Hiroshi

    2011-11-01

    Real-time simulation of a biologically realistic spiking neural network is necessary for evaluation of its capacity to interact with real environments. However, the real-time simulation of such a neural network is difficult due to its high computational costs that arise from two factors: (1) vast network size and (2) the complicated dynamics of biologically realistic neurons. In order to address these problems, mainly the latter, we chose to use general purpose computing on graphics processing units (GPGPUs) for simulation of such a neural network, taking advantage of the powerful computational capability of a graphics processing unit (GPU). As a target for real-time simulation, we used a model of the basal ganglia that has been developed according to electrophysiological and anatomical knowledge. The model consists of heterogeneous populations of 370 spiking model neurons, including computationally heavy conductance-based models, connected by 11,002 synapses. Simulation of the model has not yet been performed in real-time using a general computing server. By parallelization of the model on the NVIDIA Geforce GTX 280 GPU in data-parallel and task-parallel fashion, faster-than-real-time simulation was robustly realized with only one-third of the GPU's total computational resources. Furthermore, we used the GPU's full computational resources to perform faster-than-real-time simulation of three instances of the basal ganglia model; these instances consisted of 1100 neurons and 33,006 synapses and were synchronized at each calculation step. Finally, we developed software for simultaneous visualization of faster-than-real-time simulation output. These results suggest the potential power of GPGPU techniques in real-time simulation of realistic neural networks. Copyright © 2011 Elsevier Ltd. All rights reserved.

  14. Towards statistical summaries of spike train data.

    Science.gov (United States)

    Wu, Wei; Srivastava, Anuj

    2011-01-30

    Statistical inference has an important role in analysis of neural spike trains. While current approaches are mostly model-based, and designed for capturing the temporal evolution of the underlying stochastic processes, we focus on a data-driven approach where statistics are defined and computed in function spaces where individual spike trains are viewed as points. The first contribution of this paper is to endow spike train space with a parameterized family of metrics that takes into account different time warpings and generalizes several currently used metrics. These metrics are essentially penalized L(p) norms, involving appropriate functions of spike trains, with penalties associated with time-warpings. The second contribution of this paper is to derive a notion of a mean spike train in the case when p=2. We present an efficient recursive algorithm, termed Matching-Minimization algorithm, to compute the sample mean of a set of spike trains. The proposed metrics as well as the mean computations are demonstrated using an experimental recording from the motor cortex. Copyright © 2010 Elsevier B.V. All rights reserved.

  15. Magnetic Tunnel Junction Based Long-Term Short-Term Stochastic Synapse for a Spiking Neural Network with On-Chip STDP Learning.

    Science.gov (United States)

    Srinivasan, Gopalakrishnan; Sengupta, Abhronil; Roy, Kaushik

    2016-07-13

    Spiking Neural Networks (SNNs) have emerged as a powerful neuromorphic computing paradigm to carry out classification and recognition tasks. Nevertheless, the general purpose computing platforms and the custom hardware architectures implemented using standard CMOS technology have been unable to rival the power efficiency of the human brain. Hence, there is a need for novel nanoelectronic devices that can efficiently model the neurons and synapses constituting an SNN. In this work, we propose a heterostructure composed of a Magnetic Tunnel Junction (MTJ) and a heavy metal as a stochastic binary synapse. Synaptic plasticity is achieved by the stochastic switching of the MTJ conductance states, based on the temporal correlation between the spiking activities of the interconnecting neurons. Additionally, we present a significance-driven long-term short-term stochastic synapse comprising two unique binary synaptic elements, in order to improve the synaptic learning efficiency. We demonstrate the efficacy of the proposed synaptic configurations and the stochastic learning algorithm on an SNN trained to classify handwritten digits from the MNIST dataset, using a device-to-system-level simulation framework. The power efficiency of the proposed neuromorphic system stems from the ultra-low programming energy of the spintronic synapses.

  16. Magnetic Tunnel Junction Based Long-Term Short-Term Stochastic Synapse for a Spiking Neural Network with On-Chip STDP Learning

    Science.gov (United States)

    Srinivasan, Gopalakrishnan; Sengupta, Abhronil; Roy, Kaushik

    2016-07-01

    Spiking Neural Networks (SNNs) have emerged as a powerful neuromorphic computing paradigm to carry out classification and recognition tasks. Nevertheless, the general purpose computing platforms and the custom hardware architectures implemented using standard CMOS technology have been unable to rival the power efficiency of the human brain. Hence, there is a need for novel nanoelectronic devices that can efficiently model the neurons and synapses constituting an SNN. In this work, we propose a heterostructure composed of a Magnetic Tunnel Junction (MTJ) and a heavy metal as a stochastic binary synapse. Synaptic plasticity is achieved by the stochastic switching of the MTJ conductance states, based on the temporal correlation between the spiking activities of the interconnecting neurons. Additionally, we present a significance-driven long-term short-term stochastic synapse comprising two unique binary synaptic elements, in order to improve the synaptic learning efficiency. We demonstrate the efficacy of the proposed synaptic configurations and the stochastic learning algorithm on an SNN trained to classify handwritten digits from the MNIST dataset, using a device-to-system-level simulation framework. The power efficiency of the proposed neuromorphic system stems from the ultra-low programming energy of the spintronic synapses.
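
    A minimal behavioral sketch of such a stochastic binary synapse is given below; the device physics of the MTJ/heavy-metal stack is abstracted into a switching probability that follows an exponential STDP window, and every constant (peak probability, time constant) is an illustrative assumption rather than a value from the paper.

    import numpy as np

    rng = np.random.default_rng(0)

    def switching_probability(delta_t, p_max=0.1, tau=20e-3):
        """Switching probability for spike-time difference delta_t = t_post - t_pre (illustrative)."""
        return p_max * np.exp(-abs(delta_t) / tau)

    def update_binary_synapse(state, t_pre, t_post):
        """Stochastically potentiate (state=1) or depress (state=0) the binary synapse."""
        delta_t = t_post - t_pre
        if rng.random() < switching_probability(delta_t):
            state = 1 if delta_t > 0 else 0   # causal pairing potentiates, acausal depresses
        return state

    state = 0
    for t_pre, t_post in [(0.010, 0.015), (0.040, 0.038), (0.070, 0.072)]:
        state = update_binary_synapse(state, t_pre, t_post)
        print(round(t_post - t_pre, 3), state)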

  17. Statistical technique for analysing functional connectivity of multiple spike trains.

    Science.gov (United States)

    Masud, Mohammad Shahed; Borisyuk, Roman

    2011-03-15

    A new statistical technique, the Cox method, used for analysing functional connectivity of simultaneously recorded multiple spike trains is presented. This method is based on the theory of modulated renewal processes and it estimates a vector of influence strengths from multiple spike trains (called reference trains) to the selected (target) spike train. Selecting another target spike train and repeating the calculation of the influence strengths from the reference spike trains enables researchers to find all functional connections among multiple spike trains. In order to study functional connectivity an "influence function" is identified. This function recognises the specificity of neuronal interactions and reflects the dynamics of postsynaptic potential. In comparison to existing techniques, the Cox method has the following advantages: it does not use bins (binless method); it is applicable to cases where the sample size is small; it is sufficiently sensitive such that it estimates weak influences; it supports the simultaneous analysis of multiple influences; it is able to identify a correct connectivity scheme in difficult cases of "common source" or "indirect" connectivity. The Cox method has been thoroughly tested using multiple sets of data generated by the neural network model of the leaky integrate and fire neurons with a prescribed architecture of connections. The results suggest that this method is highly successful for analysing functional connectivity of simultaneously recorded multiple spike trains. Copyright © 2011 Elsevier B.V. All rights reserved.

  18. Glutamate gated spiking Neuron Model.

    Science.gov (United States)

    Deka, Krisha M; Roy, Soumik

    2014-01-01

    Biological neuron models mainly analyze the behavior of neural networks. Neurons are described in terms of firing rates, viz., an analog signal. The Izhikevich neuron model is an efficient, powerful model of a spiking neuron. This model is a reduction of the Hodgkin-Huxley model to a two-variable system and is capable of producing rich firing patterns for many biological neurons. In this paper, the Regular Spiking (RS) neuron firing pattern is used to simulate the spiking of a glutamate-gated postsynaptic membrane. Simulation is done in the MATLAB environment for the excitatory action of synapses. Analogous simulation of the spiking of the excitatory postsynaptic membrane potential is obtained.
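
    The regular-spiking behaviour referred to above can be reproduced with the standard Izhikevich equations, as in the Python sketch below (the paper's MATLAB simulation is not reproduced here); the step current simply stands in for a glutamatergic excitatory input, and the RS parameters a=0.02, b=0.2, c=-65, d=8 are the usual published values.

    def izhikevich_rs(i_inj, t_stop=1000.0, dt=0.5, a=0.02, b=0.2, c=-65.0, d=8.0):
        """Forward-Euler simulation of an Izhikevich regular-spiking neuron (time in ms)."""
        v, u, spikes = -65.0, b * -65.0, []
        for step in range(int(t_stop / dt)):
            t = step * dt
            v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_inj(t))
            u += dt * a * (b * v - u)
            if v >= 30.0:              # spike peak reached: record and reset
                spikes.append(t)
                v, u = c, u + d
        return spikes

    spikes = izhikevich_rs(lambda t: 10.0 if t > 100.0 else 0.0)   # step input at t = 100 ms
    print(len(spikes), "spikes, first at", spikes[0], "ms")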

  19. Unbiased and robust quantification of synchronization between spikes and local field potential.

    Science.gov (United States)

    Li, Zhaohui; Cui, Dong; Li, Xiaoli

    2016-08-30

    In neuroscience, relating the spiking activity of individual neurons to the local field potential (LFP) of neural ensembles is an increasingly useful approach for studying rhythmic neuronal synchronization. Many methods have been proposed to measure the strength of the association between spikes and rhythms in the LFP recordings, and most existing measures are dependent upon the total number of spikes. In the present work, we introduce a robust approach for quantifying spike-LFP synchronization which performs reliably for limited samples of data. The measure is termed spike-triggered correlation matrix synchronization (SCMS), which takes LFP segments centered on each spike as multi-channel signals and calculates the index of spike-LFP synchronization by constructing a correlation matrix. The simulation based on artificial data shows that the SCMS output is almost independent of the sample size. This property is of crucial importance when making comparisons between different experimental conditions. When applied to actual neuronal data recorded from the monkey primary visual cortex, it is found that the spike-LFP synchronization strength shows orientation selectivity to drifting gratings. In comparison to another unbiased method, pairwise phase consistency (PPC), the proposed SCMS behaves better for noisy spike trains by means of numerical simulations. This study demonstrates the basic idea and calculating process of the SCMS method. Considering its unbiasedness and robustness, the measure is of great advantage to characterize the synchronization between spike trains and rhythms present in LFP. Copyright © 2016 Elsevier B.V. All rights reserved.
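
    A schematic Python sketch of the spike-triggered correlation matrix idea follows: LFP segments centred on each spike are stacked, their pairwise correlation matrix is formed, and a single index is derived from its largest eigenvalue. The exact index defined in the paper may differ; (lambda_max - 1)/(N - 1) is used here as one common summary, and the toy 10 Hz example is invented.

    import numpy as np

    def scms_index(lfp, spike_indices, fs=1000.0, half_win=0.05):
        half = int(half_win * fs)
        segs = np.array([lfp[i - half:i + half] for i in spike_indices
                         if i - half >= 0 and i + half <= len(lfp)])
        corr = np.corrcoef(segs)                      # correlations between spike-triggered segments
        lam_max = np.linalg.eigvalsh(corr)[-1]        # largest eigenvalue
        return (lam_max - 1.0) / (len(segs) - 1.0)    # near 0 = unrelated, near 1 = tightly locked

    # Toy example: spikes locked to a 10 Hz oscillation yield an index close to 1.
    fs = 1000.0
    t = np.arange(0, 10, 1 / fs)
    lfp = np.sin(2 * np.pi * 10 * t) + 0.2 * np.random.default_rng(1).standard_normal(t.size)
    locked_spikes = np.arange(100, 9900, 100)         # one spike per oscillation cycle
    print(scms_index(lfp, locked_spikes, fs))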

  20. iRaster: a novel information visualization tool to explore spatiotemporal patterns in multiple spike trains.

    Science.gov (United States)

    Somerville, J; Stuart, L; Sernagor, E; Borisyuk, R

    2010-12-15

    Over the last few years, simultaneous recordings of multiple spike trains have become widely used by neuroscientists. Therefore, it is important to develop new tools for analysing multiple spike trains in order to gain new insight into the function of neural systems. This paper describes how techniques from the field of visual analytics can be used to reveal specific patterns of neural activity. An interactive raster plot called iRaster has been developed. This software incorporates a selection of statistical procedures for visualization and flexible manipulations with multiple spike trains. For example, there are several procedures for the re-ordering of spike trains which can be used to unmask activity propagation, spiking synchronization, and many other important features of multiple spike train activity. Additionally, iRaster includes a rate representation of neural activity, a combined representation of rate and spikes, spike train removal and time interval removal. Furthermore, it provides multiple coordinated views, time and spike train zooming windows, a fisheye lens distortion, and dissemination facilities. iRaster is a user friendly, interactive, flexible tool which supports a broad range of visual representations. This tool has been successfully used to analyse both synthetic and experimentally recorded datasets. In this paper, the main features of iRaster are described and its performance and effectiveness are demonstrated using various types of data including experimental multi-electrode array recordings from the ganglion cell layer in mouse retina. iRaster is part of an ongoing research project called VISA (Visualization of Inter-Spike Associations) at the Visualization Lab in the University of Plymouth. The overall aim of the VISA project is to provide neuroscientists with the ability to freely explore and analyse their data. The software is freely available from the Visualization Lab website (see www.plymouth.ac.uk/infovis). Copyright © 2010
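
    The sketch below is not the iRaster tool itself, only a minimal matplotlib illustration of one manipulation it supports: re-ordering spike trains (here by first-spike latency) before drawing them as a raster, which can unmask propagating activity. The synthetic trains are invented for the example.

    import numpy as np
    import matplotlib.pyplot as plt

    rng = np.random.default_rng(2)
    # 20 synthetic trains whose onset latency depends on an arbitrary "position" index.
    trains = [np.sort(rng.uniform(0.05 * k, 1.0, size=30)) for k in rng.permutation(20)]

    order = np.argsort([tr[0] for tr in trains])      # sort rows by first-spike time
    plt.eventplot([trains[i] for i in order], colors="k", linelengths=0.8)
    plt.xlabel("Time (s)")
    plt.ylabel("Spike train (re-ordered)")
    plt.tight_layout()
    plt.savefig("raster.png")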

  1. Using pulse width modulation for wireless transmission of neural signals in multichannel neural recording systems.

    Science.gov (United States)

    Yin, Ming; Ghovanloo, Maysam

    2009-08-01

    We have used a well-known technique in wireless communication, pulse width modulation (PWM) of time division multiplexed (TDM) signals, within the architecture of a novel wireless integrated neural recording (WINeR) system. We have evaluated the performance of the PWM-based architecture and indicated its accuracy and potential sources of error through detailed theoretical analysis, simulations, and measurements on a setup consisting of a 15-channel WINeR prototype as the transmitter and two types of receivers: an Agilent 89600 vector signal analyzer and a custom wideband receiver, with 36 and 75 MHz of maximum bandwidth, respectively. Furthermore, we present simulation results from a realistic MATLAB-Simulink model of the entire WINeR system to observe the system behavior in response to changes in various parameters. We have concluded that the 15-ch WINeR prototype, which is fabricated in a 0.5-μm standard CMOS process and consumes 4.5 mW from ±1.5 V supplies, can acquire and wirelessly transmit up to 320 k-samples/s to a 75-MHz receiver with 8.4 bits of resolution, which is equivalent to a wireless data rate of approximately 2.56 Mb/s.
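
    The encoding principle (not the WINeR circuit itself) can be illustrated with a short simulation: samples from several channels share a frame of fixed time slots, each sample sets the pulse width in its own slot, and the receiver recovers the samples by measuring pulse widths. Slot length and resolution below are arbitrary choices.

    import numpy as np

    SLOT = 100                 # timer ticks per channel slot

    def pwm_tdm_encode(samples):
        """samples in [0, 1) -> one binary frame with one slot per channel."""
        frame = np.zeros(SLOT * len(samples), dtype=int)
        for ch, s in enumerate(samples):
            width = int(s * SLOT)
            frame[ch * SLOT:ch * SLOT + width] = 1   # pulse width encodes the sample
        return frame

    def pwm_tdm_decode(frame, n_channels):
        slots = frame.reshape(n_channels, SLOT)
        return slots.sum(axis=1) / SLOT              # width measurement per slot

    samples = np.array([0.12, 0.57, 0.93])
    print(pwm_tdm_decode(pwm_tdm_encode(samples), 3))   # approximately the original samples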

  2. Dopamine-signalled reward predictions generated by competitive excitation and inhibition in a spiking neural network model

    Directory of Open Access Journals (Sweden)

    Paul eChorley

    2011-05-01

    Full Text Available Dopaminergic neurons in the mammalian substantia nigra display characteristic phasic responses to stimuli which reliably predict the receipt of primary rewards. These responses have been suggested to encode reward prediction-errors similar to those used in reinforcement learning. Here, we propose a model of dopaminergic activity in which prediction error signals are generated by the joint action of short-latency excitation and long-latency inhibition, in a network undergoing dopaminergic neuromodulation of both spike-timing dependent synaptic plasticity and neuronal excitability. In contrast to previous models, sensitivity to recent events is maintained by the selective modification of specific striatal synapses, efferent to cortical neurons exhibiting stimulus-specific, temporally extended activity patterns. Our model shows, in the presence of significant background activity, (i) a shift in dopaminergic response from reward to reward-predicting stimuli, (ii) preservation of a response to unexpected rewards, and (iii) a precisely-timed below-baseline dip in activity observed when expected rewards are omitted.

  3. A Multiple-Plasticity Spiking Neural Network Embedded in a Closed-Loop Control System to Model Cerebellar Pathologies.

    Science.gov (United States)

    Geminiani, Alice; Casellato, Claudia; Antonietti, Alberto; D'Angelo, Egidio; Pedrocchi, Alessandra

    2017-01-10

    The cerebellum plays a crucial role in sensorimotor control, and cerebellar disorders compromise adaptation and learning of motor responses. However, the link between alterations at the network level and cerebellar dysfunction is still unclear. In principle, this understanding would benefit from the development of an artificial system embedding the salient neuronal and plastic properties of the cerebellum and operating in closed loop. To this aim, we have exploited a realistic spiking computational model of the cerebellum to analyze the network correlates of cerebellar impairment. The model was modified to reproduce three different damages of the cerebellar cortex: (i) a loss of the main output neurons (Purkinje Cells), (ii) a lesion to the main cerebellar afferents (Mossy Fibers), and (iii) a damage to a major mechanism of synaptic plasticity (Long Term Depression). The modified network models were challenged with an Eye-Blink Classical Conditioning test, a standard learning paradigm used to evaluate cerebellar impairment, in which the outcome was compared to reference results obtained in human or animal experiments. In all cases, the model reproduced the partial and delayed conditioning typical of the pathologies, indicating that an intact cerebellar cortex functionality is required to accelerate learning by transferring acquired information to the cerebellar nuclei. Interestingly, depending on the type of lesion, the redistribution of synaptic plasticity and response timing varied greatly, generating specific adaptation patterns. Thus, the present work not only extends the generalization capabilities of the cerebellar spiking model to pathological cases, but also predicts how changes at the neuronal level are distributed across the network, making it usable to infer cerebellar circuit alterations occurring in cerebellar pathologies.

  4. A soft magnetic underlayer with negative uniaxial magnetocrystalline anisotropy for suppression of spike noise and wide adjacent track erasure in perpendicular recording media

    Science.gov (United States)

    Hashimoto, Atsushi; Saito, Shin; Takahashi, Migaku

    2006-04-01

    The suppression of spike noise and wide adjacent track erasure (WATE) are important technical issues in the development of a perpendicular recording medium (PRM). As a solution to both of these problems, this paper presents a type of soft magnetic underlayer (SUL) with negative uniaxial perpendicular magnetic anisotropy. The magnetic anisotropy is achieved by employing a material with negative uniaxial magnetocrystalline anisotropy (Ku^grain). WATE is suppressed in the SUL by realizing wide distribution of magnetic flux below the edge of the return yoke, while spike noise is eliminated by ensuring the formation of a Néel wall instead of a Bloch wall in SUL domains. CoIr with the disordered hcp structure is selected as a negative Ku^grain material, and c-plane-oriented CoIr films with various Ir contents are prepared for experimental evaluation. Among the films tested, the CoIr film with 22 at. % Ir is found to provide the minimum Ku^grain value of -6×10^6 ergs/cm^3. Under a field applied parallel to the film plane, this film exhibits soft magnetic properties, attributable to the high crystallographic symmetry of the c-plane sheet texture. A PRM fabricated using the CoIr SUL is confirmed to display substantially lower spike noise and WATE compared to conventional structures.

  5. Benchmarking Spike-Based Visual Recognition: A Dataset and Evaluation.

    Science.gov (United States)

    Liu, Qian; Pineda-García, Garibaldi; Stromatias, Evangelos; Serrano-Gotarredona, Teresa; Furber, Steve B

    2016-01-01

    Today, increasing attention is being paid to research into spike-based neural computation both to gain a better understanding of the brain and to explore biologically-inspired computation. Within this field, the primate visual pathway and its hierarchical organization have been extensively studied. Spiking Neural Networks (SNNs), inspired by the understanding of observed biological structure and function, have been successfully applied to visual recognition and classification tasks. In addition, implementations on neuromorphic hardware have enabled large-scale networks to run in (or even faster than) real time, making spike-based neural vision processing accessible on mobile robots. Neuromorphic sensors such as silicon retinas are able to feed such mobile systems with real-time visual stimuli. A new set of vision benchmarks for spike-based neural processing are now needed to measure progress quantitatively within this rapidly advancing field. We propose that a large dataset of spike-based visual stimuli is needed to provide meaningful comparisons between different systems, and a corresponding evaluation methodology is also required to measure the performance of SNN models and their hardware implementations. In this paper we first propose an initial NE (Neuromorphic Engineering) dataset based on standard computer vision benchmarks and that uses digits from the MNIST database. This dataset is compatible with the state of current research on spike-based image recognition. The corresponding spike trains are produced using a range of techniques: rate-based Poisson spike generation, rank order encoding, and recorded output from a silicon retina with both flashing and oscillating input stimuli. In addition, a complementary evaluation methodology is presented to assess both model-level and hardware-level performance. Finally, we demonstrate the use of the dataset and the evaluation methodology using two SNN models to validate the performance of the models and their hardware
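
    Rate-based Poisson spike generation, the first of the encodings listed above, can be sketched in a few lines of Python: each pixel intensity is mapped to a firing rate and an independent Poisson train is drawn per pixel. The 100 Hz peak rate and 1 ms bin are illustrative values, not parameters of the benchmark dataset.

    import numpy as np

    def poisson_encode(image, duration=0.1, dt=1e-3, max_rate=100.0, seed=0):
        """image: 2-D array scaled to [0, 1]; returns a (pixels, bins) boolean spike array."""
        rng = np.random.default_rng(seed)
        rates = image.reshape(-1) * max_rate                 # firing rate in Hz per pixel
        n_bins = int(duration / dt)
        # A spike occurs in a bin with probability rate * dt (valid while rate * dt << 1).
        return rng.random((rates.size, n_bins)) < rates[:, None] * dt

    image = np.random.default_rng(1).random((28, 28))        # stand-in for an MNIST digit
    spikes = poisson_encode(image)
    print(spikes.shape, int(spikes.sum()), "spikes in 100 ms")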

  6. Joint analysis of spikes and local field potentials using copula.

    Science.gov (United States)

    Hu, Meng; Li, Mingyao; Li, Wu; Liang, Hualou

    2016-06-01

    Recent technological advances, which allow for simultaneous recording of spikes and local field potentials (LFPs) at multiple sites in a given cortical area or across different areas, have greatly increased our understanding of signal processing in brain circuits. Joint analysis of simultaneously collected spike and LFP signals is an important step to explicate how the brain orchestrates information processing. In this contribution, we present a novel statistical framework based on Gaussian copula to jointly model spikes and LFP. In our approach, we use copula to link separate, marginal regression models to construct a joint regression model, in which the binary-valued spike train data are modeled using generalized linear model (GLM) and the continuous-valued LFP data are modeled using linear regression. Model parameters can be efficiently estimated via maximum-likelihood. In particular, we show that our model offers a means to statistically detect directional influence between spikes and LFP, akin to Granger causality measure, and that we are able to assess its statistical significance by conducting a Wald test. Through extensive simulations, we also show that our method is able to reliably recover the true model used to generate the data. To demonstrate the effectiveness of our approach in real setting, we further apply the method to a mixed neural dataset, consisting of spikes and LFP simultaneously recorded from the visual cortex of a monkey performing a contour detection task. Copyright © 2016 Elsevier Inc. All rights reserved.
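
    The toy construction below shows how a Gaussian copula couples a Bernoulli spike margin with a Gaussian LFP margin: a latent bivariate normal supplies the dependence and each coordinate is pushed through its own marginal. It is a sampling illustration only, not the authors' estimation code, and rho, the spike probability, and the LFP moments are made-up values.

    import numpy as np
    from scipy.stats import norm

    def sample_spike_lfp(n, rho=0.6, p_spike=0.2, lfp_mean=0.0, lfp_sd=1.0, seed=0):
        rng = np.random.default_rng(seed)
        cov = np.array([[1.0, rho], [rho, 1.0]])
        z = rng.multivariate_normal([0.0, 0.0], cov, size=n)   # latent Gaussian copula
        spikes = (z[:, 0] < norm.ppf(p_spike)).astype(int)      # Bernoulli margin
        lfp = lfp_mean + lfp_sd * z[:, 1]                       # Gaussian margin
        return spikes, lfp

    spikes, lfp = sample_spike_lfp(10000)
    # Dependence survives the marginal transforms: LFP is lower on spike bins here.
    print(lfp[spikes == 1].mean(), lfp[spikes == 0].mean())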

  7. Simultaneous recording of brain extracellular glucose, spike and local field potential in real time using an implantable microelectrode array with nano-materials.

    Science.gov (United States)

    Wei, Wenjing; Song, Yilin; Fan, Xinyi; Zhang, Song; Wang, Li; Xu, Shengwei; Wang, Mixia; Cai, Xinxia

    2016-03-18

    Glucose is the main substrate for neurons in the central nervous system. In order to efficiently characterize the brain glucose mechanism, it is desirable to determine the extracellular glucose dynamics as well as the corresponding neuroelectrical activity in vivo. In the present study, we fabricated an implantable microelectrode array (MEA) probe composed of platinum electrochemical and electrophysiology microelectrodes by standard microelectromechanical system (MEMS) processes. The MEA probe was modified with nano-materials and implanted in a urethane-anesthetized rat for simultaneous recording of striatal extracellular glucose, local field potential (LFP) and spike on the same spatiotemporal scale when the rat was in normoglycemia, hypoglycemia and hyperglycemia. During these dual-mode recordings, we observed that increase of extracellular glucose enhanced the LFP power and spike firing rate, while decrease of glucose had an opposite effect. This dual-mode MEA probe is capable of examining specific spatiotemporal relationships between electrical and chemical signaling in the brain, which will contribute significantly to improve our understanding of the neuron physiology.

  8. Intracellular Neural Recording with Pure Carbon Nanotube Probes.

    Directory of Open Access Journals (Sweden)

    Inho Yoon

    Full Text Available The computational complexity of the brain depends in part on a neuron's capacity to integrate electrochemical information from vast numbers of synaptic inputs. The measurements of synaptic activity that are crucial for mechanistic understanding of brain function are also challenging, because they require intracellular recording methods to detect and resolve millivolt-scale synaptic potentials. Although glass electrodes are widely used for intracellular recordings, novel electrodes with superior mechanical and electrical properties are desirable, because they could extend intracellular recording methods to challenging environments, including long term recordings in freely behaving animals. Carbon nanotubes (CNTs) can theoretically deliver this advance, but the difficulty of assembling CNTs has limited their application to a coating layer or assembly on a planar substrate, resulting in electrodes that are more suitable for in vivo extracellular recording or extracellular recording from isolated cells. Here we show that a novel, yet remarkably simple, millimeter-long electrode with a sub-micron tip, fabricated from self-entangled pure CNTs, can be used to obtain intracellular and extracellular recordings from vertebrate neurons in vitro and in vivo. This fabrication technology provides a new method for assembling intracellular electrodes from CNTs, affording a promising opportunity to harness nanotechnology for neuroscience applications.

  9. Test statistics for the identification of assembly neurons in parallel spike trains.

    Science.gov (United States)

    Picado Muiño, David; Borgelt, Christian

    2015-01-01

    In recent years numerous improvements have been made in multiple-electrode recordings (i.e., parallel spike-train recordings) and spike sorting to the extent that nowadays it is possible to monitor the activity of up to hundreds of neurons simultaneously. Due to these improvements it is now potentially possible to identify assembly activity (roughly understood as significant synchronous spiking of a group of neurons) from these recordings, which-if it can be demonstrated reliably-would significantly improve our understanding of neural activity and neural coding. However, several methodological problems remain when trying to do so and, among them, a principal one is the combinatorial explosion that one faces when considering all potential neuronal assemblies, since in principle every subset of the recorded neurons constitutes a candidate set for an assembly. We present several statistical tests to identify assembly neurons (i.e., neurons that participate in a neuronal assembly) from parallel spike trains with the aim of reducing the set of neurons to a relevant subset of them and this way ease the task of identifying neuronal assemblies in further analyses. These tests are an improvement of those introduced in the work by Berger et al. (2010) based on additional features like spike weight or pairwise overlap and on alternative ways to identify spike coincidences (e.g., by avoiding time binning, which tends to lose information).

  10. Test Statistics for the Identification of Assembly Neurons in Parallel Spike Trains

    Directory of Open Access Journals (Sweden)

    David Picado Muiño

    2015-01-01

    Full Text Available In recent years numerous improvements have been made in multiple-electrode recordings (i.e., parallel spike-train recordings) and spike sorting to the extent that nowadays it is possible to monitor the activity of up to hundreds of neurons simultaneously. Due to these improvements it is now potentially possible to identify assembly activity (roughly understood as significant synchronous spiking of a group of neurons) from these recordings, which—if it can be demonstrated reliably—would significantly improve our understanding of neural activity and neural coding. However, several methodological problems remain when trying to do so and, among them, a principal one is the combinatorial explosion that one faces when considering all potential neuronal assemblies, since in principle every subset of the recorded neurons constitutes a candidate set for an assembly. We present several statistical tests to identify assembly neurons (i.e., neurons that participate in a neuronal assembly) from parallel spike trains with the aim of reducing the set of neurons to a relevant subset of them and this way ease the task of identifying neuronal assemblies in further analyses. These tests are an improvement of those introduced in the work by Berger et al. (2010) based on additional features like spike weight or pairwise overlap and on alternative ways to identify spike coincidences (e.g., by avoiding time binning, which tends to lose information).

  11. Mapping, Learning, Visualization, Classification, and Understanding of fMRI Data in the NeuCube Evolving Spatiotemporal Data Machine of Spiking Neural Networks.

    Science.gov (United States)

    Kasabov, Nikola K; Doborjeh, Maryam Gholami; Doborjeh, Zohreh Gholami

    2017-04-01

    This paper introduces a new methodology for dynamic learning, visualization, and classification of functional magnetic resonance imaging (fMRI) as spatiotemporal brain data. The method is based on an evolving spatiotemporal data machine of evolving spiking neural networks (SNNs) exemplified by the NeuCube architecture [1]. The method consists of several steps: mapping spatial coordinates of fMRI data into a 3-D SNN cube (SNNc) that represents a brain template; input data transformation into trains of spikes; deep, unsupervised learning in the 3-D SNNc of spatiotemporal patterns from data; supervised learning in an evolving SNN classifier; parameter optimization; and 3-D visualization and model interpretation. Two benchmark case study problems and data are used to illustrate the proposed methodology: fMRI data collected from subjects when reading affirmative or negative sentences, and another on reading a sentence or seeing a picture. The learned connections in the SNNc represent dynamic spatiotemporal relationships derived from the fMRI data. They can reveal new information about the brain functions under different conditions. The proposed methodology allows, for the first time, the analysis of dynamic functional and structural connectivity of a learned SNN model from fMRI data. This can be used for a better understanding of brain activities and also for online generation of appropriate neurofeedback to subjects for improved brain functions. For example, in this paper, tracing the 3-D SNN model connectivity enabled us for the first time to capture prominent brain functional pathways evoked in language comprehension. We found stronger spatiotemporal interaction between the left dorsolateral prefrontal cortex and the left temporal area while reading a negated sentence. This observation is obviously distinguishable from the patterns generated by either reading affirmative sentences or seeing pictures. The proposed NeuCube-based methodology also offers superior classification accuracy

  12. Circuit models and experimental noise measurements of micropipette amplifiers for extracellular neural recordings from live animals.

    Science.gov (United States)

    Chen, Chang Hao; Pun, Sio Hang; Mak, Peng Un; Vai, Mang I; Klug, Achim; Lei, Tim C

    2014-01-01

    Glass micropipettes are widely used to record neural activity from single neurons or clusters of neurons extracellularly in live animals. However, to date, there has been no comprehensive study of noise in extracellular recordings with glass micropipettes. The purpose of this work was to assess various noise sources that affect extracellular recordings and to create model systems in which novel micropipette neural amplifier designs can be tested. An equivalent circuit of the glass micropipette and the noise model of this circuit, which accurately describe the various noise sources involved in extracellular recordings, have been developed. Measurement schemes using dead brain tissue as well as extracellular recordings from neurons in the inferior colliculus, an auditory brain nucleus of an anesthetized gerbil, were used to characterize noise performance and amplification efficacy of the proposed micropipette neural amplifier. According to our model, the major noise sources which influence the signal to noise ratio are the intrinsic noise of the neural amplifier and the thermal noise from distributed pipette resistance. These two types of noise were calculated and measured and were shown to be the dominating sources of background noise for in vivo experiments.

  13. Circuit Models and Experimental Noise Measurements of Micropipette Amplifiers for Extracellular Neural Recordings from Live Animals

    Directory of Open Access Journals (Sweden)

    Chang Hao Chen

    2014-01-01

    Full Text Available Glass micropipettes are widely used to record neural activity from single neurons or clusters of neurons extracellularly in live animals. However, to date, there has been no comprehensive study of noise in extracellular recordings with glass micropipettes. The purpose of this work was to assess various noise sources that affect extracellular recordings and to create model systems in which novel micropipette neural amplifier designs can be tested. An equivalent circuit of the glass micropipette and the noise model of this circuit, which accurately describe the various noise sources involved in extracellular recordings, have been developed. Measurement schemes using dead brain tissue as well as extracellular recordings from neurons in the inferior colliculus, an auditory brain nucleus of an anesthetized gerbil, were used to characterize noise performance and amplification efficacy of the proposed micropipette neural amplifier. According to our model, the major noise sources which influence the signal to noise ratio are the intrinsic noise of the neural amplifier and the thermal noise from distributed pipette resistance. These two types of noise were calculated and measured and were shown to be the dominating sources of background noise for in vivo experiments.
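
    The thermal contribution of the distributed pipette resistance, one of the two dominant sources named above, can be estimated with the Johnson-Nyquist formula v_rms = sqrt(4 k_B T R B); the 10 MOhm resistance and 10 kHz bandwidth below are illustrative values for a recording micropipette, not figures from the paper.

    import math

    def thermal_noise_vrms(resistance_ohm, bandwidth_hz, temperature_k=310.0):
        k_b = 1.380649e-23                     # Boltzmann constant, J/K
        return math.sqrt(4.0 * k_b * temperature_k * resistance_ohm * bandwidth_hz)

    v_rms = thermal_noise_vrms(10e6, 10e3)     # 10 MOhm pipette, 10 kHz bandwidth
    print(round(v_rms * 1e6, 1), "uVrms")      # roughly 41 uVrms at body temperature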

  14. Neural stimulation and recording with bidirectional, soft carbon nanotube fiber microelectrodes.

    Science.gov (United States)

    Vitale, Flavia; Summerson, Samantha R; Aazhang, Behnaam; Kemere, Caleb; Pasquali, Matteo

    2015-01-01

    The development of microelectrodes capable of safely stimulating and recording neural activity is a critical step in the design of many prosthetic devices, brain-machine interfaces, and therapies for neurologic or nervous-system-mediated disorders. Metal electrodes are inadequate prospects for the miniaturization needed to attain neuronal-scale stimulation and recording because of their poor electrochemical properties, high stiffness, and propensity to fail due to bending fatigue. Here we demonstrate neural recording and stimulation using carbon nanotube (CNT) fiber electrodes. In vitro characterization shows that the tissue contact impedance of CNT fibers is remarkably lower than that of state-of-the-art metal electrodes, making them suitable for recording single-neuron activity without additional surface treatments. In vivo chronic studies in parkinsonian rodents show that CNT fiber microelectrodes stimulate neurons as effectively as metal electrodes with 10 times larger surface area, while eliciting a significantly reduced inflammatory response. The same CNT fiber microelectrodes can record neural activity for weeks, paving the way for the development of novel multifunctional and dynamic neural interfaces with long-term stability.

  15. Spatiotemporal Spike Coding of Behavioral Adaptation in the Dorsal Anterior Cingulate Cortex.

    Directory of Open Access Journals (Sweden)

    Laureline Logiaco

    2015-08-01

    Full Text Available The frontal cortex controls behavioral adaptation in environments governed by complex rules. Many studies have established the relevance of firing rate modulation after informative events signaling whether and how to update the behavioral policy. However, whether the spatiotemporal features of these neuronal activities contribute to encoding imminent behavioral updates remains unclear. We investigated this issue in the dorsal anterior cingulate cortex (dACC) of monkeys while they adapted their behavior based on their memory of feedback from past choices. We analyzed spike trains of both single units and pairs of simultaneously recorded neurons using an algorithm that emulates different biologically plausible decoding circuits. This method permits the assessment of the performance of both spike-count and spike-timing sensitive decoders. In response to the feedback, single neurons emitted stereotypical spike trains whose temporal structure identified informative events with higher accuracy than mere spike count. The optimal decoding time scale was in the range of 70-200 ms, which is significantly shorter than the memory time scale required by the behavioral task. Importantly, the temporal spiking patterns of single units were predictive of the monkeys' behavioral response time. Furthermore, some features of these spiking patterns often varied between jointly recorded neurons. Altogether, our results suggest that dACC drives behavioral adaptation through complex spatiotemporal spike coding. They also indicate that downstream networks, which decode dACC feedback signals, are unlikely to act as mere neural integrators.

  16. Conflict Resolution as Near-Threshold Decision-Making: A Spiking Neural Circuit Model with Two-Stage Competition for Antisaccadic Task.

    Science.gov (United States)

    Lo, Chung-Chuan; Wang, Xiao-Jing

    2016-08-01

    Automatic responses enable us to react quickly and effortlessly, but they often need to be inhibited so that an alternative, voluntary action can take place. To investigate the brain mechanism of controlled behavior, we investigated a biologically-based network model of spiking neurons for inhibitory control. In contrast to a simple race between pro- versus anti-response, our model incorporates a sensorimotor remapping module, and an action-selection module endowed with a "Stop" process through tonic inhibition. Both are under the modulation of rule-dependent control. We tested the model by applying it to the well known antisaccade task in which one must suppress the urge to look toward a visual target that suddenly appears, and shift the gaze diametrically away from the target instead. We found that the two-stage competition is crucial for reproducing the complex behavior and neuronal activity observed in the antisaccade task across multiple brain regions. Notably, our model demonstrates two types of errors: fast and slow. Fast errors result from failing to inhibit the quick automatic responses and therefore exhibit very short response times. Slow errors, in contrast, are due to incorrect decisions in the remapping process and exhibit long response times comparable to those of correct antisaccade responses. The model thus reveals a circuit mechanism for the empirically observed slow errors and broad distributions of erroneous response times in antisaccade. Our work suggests that selecting between competing automatic and voluntary actions in behavioral control can be understood in terms of near-threshold decision-making, sharing a common recurrent (attractor) neural circuit mechanism with discrimination in perception.

  17. Olfactory learning without the mushroom bodies: Spiking neural network models of the honeybee lateral antennal lobe tract reveal its capacities in odour memory tasks of varied complexities.

    Science.gov (United States)

    MaBouDi, HaDi; Shimazaki, Hideaki; Giurfa, Martin; Chittka, Lars

    2017-06-01

    The honeybee olfactory system is a well-established model for understanding functional mechanisms of learning and memory. Olfactory stimuli are first processed in the antennal lobe, and then transferred to the mushroom body and lateral horn through dual pathways termed medial and lateral antennal lobe tracts (m-ALT and l-ALT). Recent studies reported that honeybees can perform elemental learning by associating an odour with a reward signal even after lesions in m-ALT or blocking the mushroom bodies. To test the hypothesis that the lateral pathway (l-ALT) is sufficient for elemental learning, we modelled local computation within glomeruli in antennal lobes with axons of projection neurons connecting to a decision neuron (LHN) in the lateral horn. We show that inhibitory spike-timing dependent plasticity (modelling non-associative plasticity by exposure to different stimuli) in the synapses from local neurons to projection neurons decorrelates the projection neurons' outputs. The strength of the decorrelations is regulated by global inhibitory feedback within antennal lobes to the projection neurons. By additionally modelling octopaminergic modification of synaptic plasticity among local neurons in the antennal lobes and projection neurons to LHN connections, the model can discriminate and generalize olfactory stimuli. Although positive patterning can be accounted for by the l-ALT model, negative patterning requires further processing and mushroom body circuits. Thus, our model explains several, but not all, types of associative olfactory learning and generalization by a few neural layers of odour processing in the l-ALT. As an outcome of the combination between non-associative and associative learning, the modelling approach allows us to link changes in structural organization of honeybees' antennal lobes with their behavioural performances over the course of their life.

  18. Assembly and operation of the autopatcher for automated intracellular neural recording in vivo

    OpenAIRE

    Kodandaramaiah, Suhasa B.; Holst, Gregory L.; Wickersham, Ian R.; Singer, Annabelle C.; Franzesi, Giovanni Talei; McKinnon, Michael L; Forest, Craig R.; Boyden, Edward S.

    2016-01-01

    Whole-cell patch clamping in vivo is an important neuroscience technique that uniquely provides access to both suprathreshold spiking and subthreshold synaptic events of single neurons in the brain. This article describes how to set up and use the autopatcher, which is a robot for automatically obtaining high-yield and high-quality whole-cell patch clamp recordings in vivo. By following this protocol, a functional experimental rig for automated whole-cell patch clamping can be set up in 1 wee...

  19. A 2.1 μW/channel current-mode integrated neural recording interface

    NARCIS (Netherlands)

    Zjajo, A.; van Leuken, T.G.R.M.

    2016-01-01

    In this paper, we present a neural recording interface circuit for biomedical implantable devices, which includes low-noise signal amplification, band-pass filtering, and current-mode successive approximation A/D signal conversion. The integrated interface circuit is realized in a 65 nm CMOS

  20. Unsupervised clustering with spiking neurons by sparse temporal coding and multi-layer RBF networks

    NARCIS (Netherlands)

    S.M. Bohte (Sander); J.A. La Poutré (Han); J.N. Kok (Joost)

    2000-01-01

    We demonstrate that spiking neural networks encoding information in spike times are capable of computing and learning clusters from realistic data. We show how a spiking neural network based on spike-time coding and Hebbian learning can successfully perform unsupervised clustering on

  1. Stress-induced impairment of a working memory task: role of spiking rate and spiking history predicted discharge.

    Science.gov (United States)

    Devilbiss, David M; Jenison, Rick L; Berridge, Craig W

    2012-01-01

    Stress, pervasive in society, contributes to over half of all workplace accidents each year and, over time, can contribute to a variety of psychiatric disorders including depression, schizophrenia, and post-traumatic stress disorder. Stress impairs higher cognitive processes, dependent on the prefrontal cortex (PFC), that involve maintenance and integration of information over extended periods, including working memory and attention. Substantial evidence has demonstrated a relationship between patterns of PFC neuron spiking activity (action-potential discharge) and components of delayed-response tasks used to probe PFC-dependent cognitive function in rats and monkeys. During delay periods of these tasks, persistent spiking activity is posited to be essential for the maintenance of information for working memory and attention. However, the degree to which stress-induced impairment in PFC-dependent cognition involves changes in task-related spiking rates or the ability for PFC neurons to retain information over time remains unknown. In the current study, spiking activity was recorded from the medial PFC of rats performing a delayed-response task of working memory during acute noise stress (93 dB). Spike history-predicted discharge (SHPD) for PFC neurons was quantified as a measure of the degree to which ongoing neuronal discharge can be predicted by past spiking activity and reflects the degree to which past information is retained by these neurons over time. We found that PFC neuron discharge is predicted by their past spiking patterns for nearly one second. Acute stress impaired SHPD, selectively during delay intervals of the task, and simultaneously impaired task performance. Despite the reduction in delay-related SHPD, stress increased delay-related spiking rates. These findings suggest that neural codes utilizing SHPD within PFC networks likely reflect an additional important neurophysiological mechanism for maintenance of past information over time. Stress
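
    A much simplified stand-in for spike-history-predicted discharge is sketched below: a logistic regression predicts spiking in the current 1 ms bin from the spikes at the preceding lags, so that negative weights at short lags reflect refractoriness. The paper's estimator differs in detail; the toy spike train and all parameters are invented for illustration.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def simulate_refractory_train(n_bins=20000, base_p=0.04, refractory=3, seed=3):
        """Toy 1 ms-binned spike train whose recent history matters (refractoriness)."""
        rng = np.random.default_rng(seed)
        spikes = np.zeros(n_bins, dtype=int)
        last = -10**9
        for t in range(n_bins):
            if t - last > refractory and rng.random() < base_p:
                spikes[t] = 1
                last = t
        return spikes

    def history_design(spike_bins, n_lags=20):
        """Rows = time bins, columns = spikes at lags 1..n_lags before that bin."""
        x = np.zeros((len(spike_bins) - n_lags, n_lags))
        for lag in range(1, n_lags + 1):
            x[:, lag - 1] = spike_bins[n_lags - lag:len(spike_bins) - lag]
        return x, spike_bins[n_lags:]

    x, y = history_design(simulate_refractory_train())
    model = LogisticRegression(max_iter=1000).fit(x, y)
    print(model.coef_[0][:5])      # strongly negative weights at the refractory lags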

  2. Computing with Spiking Neuron Networks

    NARCIS (Netherlands)

    H. Paugam-Moisy; S.M. Bohte (Sander); G. Rozenberg; T.H.W. Baeck (Thomas); J.N. Kok (Joost)

    2012-01-01

    Spiking Neuron Networks (SNNs) are often referred to as the 3rd generation of neural networks. Highly inspired from natural computing in the brain and recent advances in neurosciences, they derive their strength and interest from an accurate modeling of synaptic interactions

  3. A hybrid hardware and software approach for cancelling stimulus artifacts during same-electrode neural stimulation and recording.

    Science.gov (United States)

    Culaclii, Stanislav; Kim, Brian; Yi-Kai Lo; Wentai Liu

    2016-08-01

    Recovering neural responses from electrode recordings is fundamental for understanding the dynamics of neural networks. This effort is often obscured by stimulus artifacts in the recordings, which result from stimuli injected into the electrode-tissue interface. Stimulus artifacts, which can be orders of magnitude larger than the neural responses of interest, can mask short-latency evoked responses. Furthermore, simultaneous neural stimulation and recording on the same electrode generates artifacts with larger amplitudes compared to a separate electrode setup, which inevitably overwhelm the amplifier operation and cause unrecoverable neural signal loss. This paper proposes an end-to-end system combining hardware and software techniques for actively cancelling stimulus artifacts, avoiding amplifier saturation, and recovering neural responses during current-controlled in-vivo neural stimulation and recording. The proposed system is tested in-vitro under various stimulation settings by stimulating and recording on the same electrode with a superimposed pre-recorded neural signal. Experimental results show that neural responses can be recovered with minimal distortion even during stimulus artifacts that are several orders greater in magnitude.

  4. An implantable wireless neural interface for recording cortical circuit dynamics in moving primates.

    Science.gov (United States)

    Borton, David A; Yin, Ming; Aceros, Juan; Nurmikko, Arto

    2013-04-01

    Neural interface technology suitable for clinical translation has the potential to significantly impact the lives of amputees, spinal cord injury victims and those living with severe neuromotor disease. Such systems must be chronically safe, durable and effective. We have designed and implemented a neural interface microsystem, housed in a compact, subcutaneous and hermetically sealed titanium enclosure. The implanted device interfaces the brain with a 510k-approved, 100-element silicon-based microelectrode array via a custom hermetic feedthrough design. Full spectrum neural signals were amplified (0.1 Hz to 7.8 kHz, 200× gain) and multiplexed by a custom application specific integrated circuit, digitized and then packaged for transmission. The neural data (24 Mbps) were transmitted by a wireless data link carried on a frequency-shift-key-modulated signal at 3.2 and 3.8 GHz to a receiver 1 m away by design as a point-to-point communication link for human clinical use. The system was powered by an embedded medical grade rechargeable Li-ion battery for 7 h continuous operation between recharge via an inductive transcutaneous wireless power link at 2 MHz. Device verification and early validation were performed in both swine and non-human primate freely-moving animal models and showed that the wireless implant was electrically stable, effective in capturing and delivering broadband neural data, and safe for over one year of testing. In addition, we have used the multichannel data from these mobile animal models to demonstrate the ability to decode neural population dynamics associated with motor activity. We have developed an implanted wireless broadband neural recording device evaluated in non-human primate and swine. The use of this new implantable neural interface technology can provide insight into how to advance human neuroprostheses beyond the present early clinical trials. Further, such tools enable mobile patient use, have the potential for wider diagnosis of

  5. A 515 nW, 0-18 dB Programmable Gain Analog-to-Digital Converter for In-Channel Neural Recording Interfaces.

    Science.gov (United States)

    Rodriguez-Perez, Alberto; Delgado-Restituto, Manuel; Medeiro, Fernando

    2014-06-01

    This paper presents a low-area low-power Switched-Capacitor (SC)-based Programmable-Gain Analog-to-Digital Converter (PG-ADC) suitable for in-channel neural recording applications. The PG-ADC uses a novel implementation of the binary search algorithm that is complemented with adaptive biasing techniques for power saving. It has been fabricated in a standard CMOS 130 nm technology and only occupies 0.0326 mm(2). The PG-ADC has been optimized to operate under two different sampling modes, 27 kS/s and 90 kS/s. The former is tailored for raw data conversion of neural activity, whereas the latter is used for the on-the-fly feature extraction of neural spikes. Experimental results show that, under a voltage supply of 1.2 V, the PG-ADC obtains an ENOB of 7.56 bit (8-bit output) for both sampling modes, regardless of the gain setting. The amplification gain can be programmed from 0 to 18 dB. The power consumption of the PG-ADC at 90 kS/s is 1.52 μW with a FoM of 89.49 fJ/conv, whereas at 27 kS/s it consumes 515 nW and obtains a FoM of 98.31 fJ/conv.

  6. A 500 year sediment lake record of anthropogenic and natural inputs to Windermere (English Lake District) using double-spike lead isotopes, radiochronology, and sediment microanalysis.

    Science.gov (United States)

    Miller, Helen; Croudace, Ian W; Bull, Jonathan M; Cotterill, Carol J; Dix, Justin K; Taylor, Rex N

    2014-07-01

    A high-resolution record of pollution is preserved in recent sediments from Windermere, the largest lake in the English Lake District. Data derived from X-ray core scanning (validated against wavelength dispersive X-ray fluorescence), radiochronological techniques ((210)Pb and (137)Cs) and ultrahigh precision, double-spike mass spectrometry for lead isotopes are combined to decipher the anthropogenic inputs to the lake. The sediment record suggests that while most element concentrations have been stable, there has been a significant increase in lead, zinc, and copper concentrations since the 1930s. Lead isotope down-core variations identify three major contributory sources of anthropogenic (industrial) lead, comprising gasoline lead, coal combustion lead (most likely source is coal-fired steam ships), and lead derived from Carboniferous Pb-Zn mineralization (mining activities). Periods of metal workings do not correlate with peaks in heavy metals due to the trapping efficiency of up-system lakes in the catchment. Heavy metal increases could be due to flood-induced metal inwash after the cessation of mining and the weathering of bedrock in the catchment. The combination of sediment analysis techniques used provides new insights into the pollutant depositional history of Windermere and could be similarly applied to other lake systems to determine the timing and scale of anthropogenic inputs.

  7. Spike sorting of heterogeneous neuron types by multimodality-weighted PCA and explicit robust variational Bayes

    Directory of Open Access Journals (Sweden)

    Takashi eTakekawa

    2012-03-01

    Full Text Available This study introduces a new spike sorting method that classifies spike waveforms from multiunit recordings into spike trains of individual neurons. In particular, we develop a method to sort a spike mixture generated by a heterogeneous neural population. Such spike sorting has significant practical value but was previously difficult. The method combines a feature extraction method, which we may term multimodality-weighted principal component analysis (mPCA), and a clustering method by variational Bayes for Student’s t mixture model (SVB). The performance of the proposed method was compared with that of other conventional methods for simulated and experimental data sets. We found that the mPCA efficiently extracts highly informative features as clusters clearly separable in a relatively low-dimensional feature space. The SVB was implemented explicitly without relying on Maximum-A-Posteriori (MAP) inference for the degree-of-freedom parameters. The explicit SVB is faster than the conventional SVB derived with MAP inference and works more reliably over various data sets that include spiking patterns difficult to sort. For instance, spikes of a single bursting neuron may be separated incorrectly into multiple clusters, whereas those of a sparsely firing neuron tend to be merged into clusters for other neurons. Our method showed significantly improved performance in spike sorting of these difficult neurons. A parallelized implementation of the proposed algorithm (EToS version 3) is available as open-source code at http://etos.sourceforge.net/.
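
    The two-stage structure of the method (feature extraction followed by mixture-model clustering) can be sketched as below; scikit-learn's PCA and GaussianMixture are used here only as generic stand-ins for the paper's mPCA and Student's-t variational Bayes, so this is an illustration of the pipeline shape rather than the EToS implementation.

```python
# Minimal two-stage spike-sorting sketch: PCA features + mixture-model clustering.
# PCA/GaussianMixture stand in for the paper's mPCA and SVB; illustrative only.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.mixture import GaussianMixture

def sort_spikes(waveforms, n_components=3, n_units=4):
    """waveforms: (n_spikes, n_samples) array of aligned spike snippets."""
    features = PCA(n_components=n_components).fit_transform(waveforms)
    labels = GaussianMixture(n_components=n_units,
                             covariance_type='full').fit_predict(features)
    return features, labels
```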

  8. Simultaneous in vivo recording of local brain temperature and electrophysiological signals with a novel neural probe

    Science.gov (United States)

    Fekete, Z.; Csernai, M.; Kocsis, K.; Horváth, Á. C.; Pongrácz, A.; Barthó, P.

    2017-06-01

    Objective. Temperature is an important factor for neural function both in normal and pathological states; nevertheless, simultaneous monitoring of local brain temperature and neuronal activity has not yet been undertaken. Approach. In our work, we propose an implantable, calibrated multimodal biosensor that facilitates the complex investigation of thermal changes in both cortical and deep brain regions and records multiunit activity of neuronal populations in mice. The fabricated neural probe contains four electrical recording sites and a platinum temperature sensor filament integrated on the same probe shaft within a distance of 30 µm from the closest recording site. The feasibility of the simultaneous functionality is presented in in vivo studies. The probe was tested in the thalamus of anesthetized mice while manipulating the core temperature of the animals. Main results. We obtained multiunit and local field recordings along with measurement of local brain temperature with an accuracy of 0.14 °C. Brain temperature generally followed core body temperature, but also showed superimposed fluctuations corresponding to epochs of increased local neural activity. With the application of higher currents, we increased the local temperature by several degrees, within the 34-39 °C range, without observable tissue damage. Significance. The proposed multifunctional tool is envisioned to broaden our knowledge on the role of the thermal modulation of neuronal activity in both cortical and deeper brain regions.

  9. Constructing Precisely Computing Networks with Biophysical Spiking Neurons.

    Science.gov (United States)

    Schwemmer, Michael A; Fairhall, Adrienne L; Denéve, Sophie; Shea-Brown, Eric T

    2015-07-15

    While spike timing has been shown to carry detailed stimulus information at the sensory periphery, its possible role in network computation is less clear. Most models of computation by neural networks are based on population firing rates. In equivalent spiking implementations, firing is assumed to be random such that averaging across populations of neurons recovers the rate-based approach. Recently, however, Denéve and colleagues have suggested that the spiking behavior of neurons may be fundamental to how neuronal networks compute, with precise spike timing determined by each neuron's contribution to producing the desired output (Boerlin and Denéve, 2011; Boerlin et al., 2013). By postulating that each neuron fires to reduce the error in the network's output, it was demonstrated that linear computations can be performed by networks of integrate-and-fire neurons that communicate through instantaneous synapses. This left open, however, the possibility that realistic networks, with conductance-based neurons with subthreshold nonlinearity and the slower timescales of biophysical synapses, may not fit into this framework. Here, we show how the spike-based approach can be extended to biophysically plausible networks. We then show that our network reproduces a number of key features of cortical networks including irregular and Poisson-like spike times and a tight balance between excitation and inhibition. Lastly, we discuss how the behavior of our model scales with network size or with the number of neurons "recorded" from a larger computing network. These results significantly increase the biological plausibility of the spike-based approach to network computation. We derive a network of neurons with standard spike-generating currents and synapses with realistic timescales that computes based upon the principle that the precise timing of each spike is important for the computation. We then show that our network reproduces a number of key features of cortical networks

  10. Multi-modal decoding: longitudinal coherency changes between spike trains, local field potentials and electrocorticogram signals.

    Science.gov (United States)

    Balasubramanian, Karthikeyan; Takahashi, Kazutaka; Slutzky, Marc; Hatsopoulos, Nicholas G

    2014-01-01

    Neural information degeneracy in chronic implants due to signal instabilities affects optimal performance of brain-machine interfaces (BMIs). Spike decoders are more vulnerable than those using LFP and ECoG signals. In order for BMIs to perform reliably across years, decoders should be able to use neural information contained in various signal modalities. Hence, it is important to identify information redundancy among signal types. In this work, spikes, LFPs and ECoGs were recorded simultaneously from the motor cortex of a rhesus monkey while the animal was learning to control a multi-DOF robot with a spike decoder. As the behavioral performance increased, the linear association among the signal types increased. Coherency of these signals increased in specific frequency bands as learning occurred. These results suggest the possibility of substituting information lost in one modality with information from another.
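
    Coherency between modalities of the kind reported here is typically computed as magnitude-squared coherence; the following sketch, using scipy and an assumed common sampling rate, illustrates one plausible way to do this. Binning a spike train into a rate signal is an assumption for illustration, not the authors' procedure.

```python
# Sketch: magnitude-squared coherence between two simultaneously recorded
# signals (e.g. an LFP and an ECoG channel), with a binned spike train treated
# as a continuous rate signal. Illustrative only; not the paper's analysis code.
import numpy as np
from scipy.signal import coherence

FS = 1000.0                                   # assumed common sampling rate (Hz)

def band_coherence(x, y, fmin, fmax, fs=FS):
    f, cxy = coherence(x, y, fs=fs, nperseg=1024)
    band = (f >= fmin) & (f <= fmax)
    return cxy[band].mean()                   # mean coherence in the band

def spikes_to_rate(spike_times, duration, fs=FS):
    counts, _ = np.histogram(spike_times, bins=int(duration * fs),
                             range=(0, duration))
    return counts.astype(float)
```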

  11. Characterization of the causality between spike trains with permutation conditional mutual information

    Science.gov (United States)

    Li, Zhaohui; Ouyang, Gaoxiang; Li, Duan; Li, Xiaoli

    2011-08-01

    Uncovering the causal relationship between spike train recordings from different neurons is a key issue for understanding neural coding. This paper presents a method, called permutation conditional mutual information (PCMI), for characterizing the causality between a pair of neurons. The performance of this method is demonstrated with spike trains generated by a Poisson point process model and the Izhikevich neuronal model, including estimation of the directionality index and detection of the temporal dynamics of the causal link. Simulations show that the PCMI method is superior to the transfer entropy and causal entropy methods at identifying the coupling direction between the spike trains. The advantages of PCMI are twofold: it can estimate the directionality index under weak coupling, and it is robust against missing and extra spikes.
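
    A simplified reading of the PCMI idea — symbolize each signal with ordinal (permutation) patterns and plug the symbols into a conditional mutual information estimate — is sketched below; the embedding dimension, lag and plug-in estimator are assumptions for illustration, not the authors' implementation.

```python
# Sketch of a permutation conditional mutual information estimate between two
# binned spike trains (illustrative simplification of the PCMI idea).
import numpy as np
from itertools import permutations

def ordinal_symbols(x, m=3):
    """Map each length-m window of x to the index of its ordinal (rank) pattern."""
    patterns = {p: i for i, p in enumerate(permutations(range(m)))}
    return np.array([patterns[tuple(np.argsort(x[i:i + m]))]
                     for i in range(len(x) - m + 1)])

def entropy(*seqs):
    """Joint Shannon entropy (bits) of one or more discrete symbol sequences."""
    joint = np.stack(seqs).T
    _, counts = np.unique(joint, axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def pcmi(rate_x, rate_y, lag=1, m=3):
    """Directional measure from X to Y: I(X_t ; Y_{t+lag} | Y_t) on ordinal symbols."""
    sx, sy = ordinal_symbols(rate_x, m), ordinal_symbols(rate_y, m)
    n = min(len(sx), len(sy)) - lag
    x, y_future, y_now = sx[:n], sy[lag:lag + n], sy[:n]
    # I(X;Y|Z) = H(X,Z) + H(Y,Z) - H(X,Y,Z) - H(Z)
    return (entropy(x, y_now) + entropy(y_future, y_now)
            - entropy(x, y_future, y_now) - entropy(y_now))
```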

  12. The Right Delay: Detecting Specific Spike Patterns with STDP and

    NARCIS (Netherlands)

    Datadien, A.H.R.; Haselager, W.F.G.; Sprinkhuizen-Kuyper, I.G.; Dobnikar, A.; Lotric, U.; Ster, B.

    2011-01-01

    Axonal conduction delays should not be ignored in simulations of spiking neural networks. Here it is shown that by using axonal conduction delays, neurons can display sensitivity to a specific spatio-temporal spike pattern. By using delays that complement the firing times in a pattern, spikes can

  13. Unsupervised decoding of long-term, naturalistic human neural recordings with automated video and audio annotations

    Directory of Open Access Journals (Sweden)

    Nancy X.R. Wang

    2016-04-01

    Full Text Available Fully automated decoding of human activities and intentions from direct neural recordings is a tantalizing challenge in brain-computer interfacing. Implementing Brain Computer Interfaces (BCIs) outside carefully controlled experiments in laboratory settings requires adaptive and scalable strategies with minimal supervision. Here we describe an unsupervised approach to decoding neural states from naturalistic human brain recordings. We analyzed continuous, long-term electrocorticography (ECoG) data recorded over many days from the brain of subjects in a hospital room, with simultaneous audio and video recordings. We discovered coherent clusters in high-dimensional ECoG recordings using hierarchical clustering and automatically annotated them using speech and movement labels extracted from audio and video. To our knowledge, this represents the first time techniques from computer vision and speech processing have been used for natural ECoG decoding. Interpretable behaviors were decoded from ECoG data, including moving, speaking and resting; the results were assessed by comparison with manual annotation. Discovered clusters were projected back onto the brain revealing features consistent with known functional areas, opening the door to automated functional brain mapping in natural settings.
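
    The unsupervised step described above can be approximated with standard hierarchical clustering; the sketch below assumes per-window feature vectors (e.g. band powers) have already been computed, and is only illustrative of that one step rather than the authors' full pipeline.

```python
# Sketch of the unsupervised clustering step: agglomerative (hierarchical)
# clustering of ECoG feature vectors. Feature extraction and the audio/video
# annotation step are assumed to happen elsewhere; parameters are illustrative.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

def cluster_ecog_features(features, n_clusters=5):
    """features: (n_windows, n_features) array; returns a cluster label per window."""
    Z = linkage(features, method='ward')          # build the dendrogram
    return fcluster(Z, t=n_clusters, criterion='maxclust')
```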

  14. Extracting spatial-temporal coherent patterns in large-scale neural recordings using dynamic mode decomposition.

    Science.gov (United States)

    Brunton, Bingni W; Johnson, Lise A; Ojemann, Jeffrey G; Kutz, J Nathan

    2016-01-30

    There is a broad need in neuroscience to understand and visualize large-scale recordings of neural activity, big data acquired by tens or hundreds of electrodes recording dynamic brain activity over minutes to hours. Such datasets are characterized by coherent patterns across both space and time, yet existing computational methods are typically restricted to analysis either in space or in time separately. Here we report the adaptation of dynamic mode decomposition (DMD), an algorithm originally developed for studying fluid physics, to large-scale neural recordings. DMD is a modal decomposition algorithm that describes high-dimensional dynamic data using coupled spatial-temporal modes. The algorithm is robust to variations in noise and subsampling rate; it scales easily to very large numbers of simultaneously acquired measurements. We first validate the DMD approach on sub-dural electrode array recordings from human subjects performing a known motor task. Next, we combine DMD with unsupervised clustering, developing a novel method to extract spindle networks during sleep. We uncovered several distinct sleep spindle networks identifiable by their stereotypical cortical distribution patterns, frequency, and duration. DMD is closely related to principal components analysis (PCA) and discrete Fourier transform (DFT). We may think of DMD as a rotation of the low-dimensional PCA space such that each basis vector has coherent dynamics. The resulting analysis combines key features of performing PCA in space and power spectral analysis in time, making it particularly suitable for analyzing large-scale neural recordings. Copyright © 2015 Elsevier B.V. All rights reserved.
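
    Exact DMD itself is compact enough to sketch; the following minimal implementation of the core decomposition (without the clustering or spindle-specific extensions) follows the standard SVD-based formulation and is not the authors' code.

```python
# Minimal exact-DMD sketch: fit coupled spatial-temporal modes to a
# multichannel recording X (channels x time samples). Illustrative only.
import numpy as np

def dmd(X, r=10):
    """X: (n_channels, n_times) data matrix; r: truncation rank.
    Returns DMD modes (columns) and their complex eigenvalues."""
    X1, X2 = X[:, :-1], X[:, 1:]                      # time-shifted snapshot pairs
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, V = U[:, :r], s[:r], Vh[:r].conj().T
    Atilde = U.conj().T @ X2 @ V @ np.diag(1.0 / s)   # reduced linear operator
    eigvals, W = np.linalg.eig(Atilde)
    modes = X2 @ V @ np.diag(1.0 / s) @ W             # exact DMD modes
    return modes, eigvals
```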

  15. Setting up and using the autopatcher for automated intracellular neural recording in vivo

    Science.gov (United States)

    Kodandaramaiah, Suhasa B.; Holst, Gregory L.; Wickersham, Ian R.; Singer, Annabelle C.; Franzesi, Giovanni Talei; McKinnon, Michael L.; Forest, Craig R.; Boyden, Edward S.

    2016-01-01

    Whole cell patch clamping in vivo is an important neuroscience technique that uniquely provides access to both supra-threshold spiking and sub-threshold synaptic events of single neurons in the brain. This article describes how to set up and use the autopatcher, a robot for automatically obtaining high yield and high quality whole cell patch clamp recordings in vivo. Following this protocol, a functional experimental rig for automated whole cell patch clamping can be set up in one week. High quality surgical preparation of mice takes approximately 1 hour, and each autopatching experiment can be carried out over periods lasting several hours. Autopatching should enable in vivo intracellular investigations to be accessible by a significant number of neuroscience labs, and enable labs already doing in vivo patch clamp to scale up their efforts by reducing training time for new lab members and increasing experimental durations by handling mentally intensive tasks automatically. PMID:26938115

  16. 3D probe array integrated with a front-end 100-channel neural recording ASIC

    Science.gov (United States)

    Cheng, Ming-Yuan; Yao, Lei; Tan, Kwan Ling; Lim, Ruiqi; Li, Peng; Chen, Weiguo

    2014-12-01

    Brain-machine interface technology can improve the lives of spinal cord injury victims and amputees. A neural interface system, consisting of a 3D probe array and a custom low-power (1 mW) 100-channel (100-ch) neural recording application-specific integrated circuit (ASIC), was designed and implemented to monitor neural activity. In this study, a microassembly 3D probe array method using a novel lead transfer technique was proposed to overcome the bonding plane mismatch encountered during orthogonal assembly. The proposed lead transfer technique can be completed using standard micromachining and packaging processes. The ASIC can be stacking-integrated with the probe array, minimizing the form factor of the assembled module. To minimize trauma to brain cells, the profile of the integrated probe array was controlled within 730 μm. The average impedance of the assembled probe was approximately 0.55 MΩ at 1 kHz. To verify the functionality of the integrated neural probe array, bench-top signal acquisitions were performed and discussed.

  17. An implantable wireless neural interface for recording cortical circuit dynamics in moving primates

    Science.gov (United States)

    Borton, David A.; Yin, Ming; Aceros, Juan; Nurmikko, Arto

    2013-04-01

    Objective. Neural interface technology suitable for clinical translation has the potential to significantly impact the lives of amputees, spinal cord injury victims and those living with severe neuromotor disease. Such systems must be chronically safe, durable and effective. Approach. We have designed and implemented a neural interface microsystem, housed in a compact, subcutaneous and hermetically sealed titanium enclosure. The implanted device interfaces the brain with a 510k-approved, 100-element silicon-based microelectrode array via a custom hermetic feedthrough design. Full spectrum neural signals were amplified (0.1 Hz to 7.8 kHz, 200× gain) and multiplexed by a custom application specific integrated circuit, digitized and then packaged for transmission. The neural data (24 Mbps) were transmitted by a wireless data link carried on a frequency-shift-key-modulated signal at 3.2 and 3.8 GHz to a receiver 1 m away by design as a point-to-point communication link for human clinical use. The system was powered by an embedded medical grade rechargeable Li-ion battery for 7 h continuous operation between recharge via an inductive transcutaneous wireless power link at 2 MHz. Main results. Device verification and early validation were performed in both swine and non-human primate freely-moving animal models and showed that the wireless implant was electrically stable, effective in capturing and delivering broadband neural data, and safe for over one year of testing. In addition, we have used the multichannel data from these mobile animal models to demonstrate the ability to decode neural population dynamics associated with motor activity. Significance. We have developed an implanted wireless broadband neural recording device evaluated in non-human primate and swine. The use of this new implantable neural interface technology can provide insight into how to advance human neuroprostheses beyond the present early clinical trials. Further, such tools enable mobile

  18. A multichannel integrated circuit for electrical recording of neural activity, with independent channel programmability.

    Science.gov (United States)

    Mora Lopez, Carolina; Prodanov, Dimiter; Braeken, Dries; Gligorijevic, Ivan; Eberle, Wolfgang; Bartic, Carmen; Puers, Robert; Gielen, Georges

    2012-04-01

    For several decades, micro-fabricated neural probes have been used, together with microelectronic interfaces, to gain more insight into the activity of neuronal networks. The need for higher temporal and spatial recording resolutions imposes new challenges on the design of integrated neural interfaces with respect to power consumption, data handling and versatility. In this paper, we present an integrated acquisition system for in vitro and in vivo recording of neural activity. The ASIC consists of 16 low-noise, fully-differential input channels with independently programmable amplification (from 100 to 6000 V/V) and filtering (1-6000 Hz range). Each channel is AC-coupled and implements a fourth-order band-pass filter in order to steeply attenuate out-of-band noise and DC input offsets. The system achieves an input-referred noise density of 37 nV/√Hz, a NEF of 5.1, a CMRR > 60 dB, and a low THD.

  19. Toward a distributed free-floating wireless implantable neural recording system.

    Science.gov (United States)

    Pyungwoo Yeon; Xingyuan Tong; Byunghun Lee; Mirbozorgi, Abdollah; Ash, Bruce; Eckhardt, Helmut; Ghovanloo, Maysam

    2016-08-01

    To understand the complex correlations between neural networks across different regions in the brain and their functions at high spatiotemporal resolution, a tool is needed for obtaining long-term single unit activity (SUA) across the entire brain area. The concept and preliminary design of a distributed free-floating wireless implantable neural recording (FF-WINeR) system are presented, which enables SUA acquisition by implanting, in a dispersed fashion, tens to hundreds of untethered 1 mm³ neural recording probes that float with the brain and operate wirelessly across the cortical surface. For powering FF-WINeR probes, a 3-coil link with an intermediate high-Q resonator provides a minimum S21 of -22.22 dB (in the body medium) and -21.23 dB (in air) at 2.8 cm coil separation, which translates to 0.76%/759 μW and 0.6%/604 μW of power transfer efficiency (PTE) / power delivered to a 9 kΩ load (PDL), in body and air, respectively. A mock-up FF-WINeR is implemented to explore a microassembly method for the 1×1 mm² micromachined silicon die with a bonding wire-wound coil and a tungsten micro-wire electrode. Circuit design methods that fit the active circuitry into only 0.96 mm² of die area in a 130 nm standard CMOS process and satisfy the strict power and performance requirements (in simulations) are discussed.

  20. Detecting dependencies between spike trains of pairs of neurons through copulas.

    Science.gov (United States)

    Sacerdote, Laura; Tamborrino, Massimiliano; Zucca, Cristina

    2012-01-24

    The dynamics of a neuron are influenced by the connections with the network where it lies. Recorded spike trains exhibit patterns due to the interactions between neurons. However, the structure of the network is not known. A challenging task is to investigate it from the analysis of simultaneously recorded spike trains. We develop a non-parametric method based on copulas, which we apply to simulated data generated according to different bivariate Leaky Integrate and Fire models. The method discerns dependencies determined by the surrounding network from those determined by direct interactions between the two neurons. Furthermore, the method recognizes the presence of delays in the spike propagation. This article is part of a Special Issue entitled "Neural Coding". Copyright © 2011 Elsevier B.V. All rights reserved.
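
    One simple way to work in copula space, sketched below, is to bin two simultaneously recorded spike trains, rank-transform the counts into pseudo-observations and summarize their dependence with Kendall's tau; this illustrates the rank-based idea only and is not the authors' non-parametric method (bin size and names are assumptions).

```python
# Sketch: empirical-copula style dependence check between two spike trains.
# Bin both trains into counts, rank-transform to pseudo-observations on (0,1)
# and summarize dependence with Kendall's tau. Illustrative only.
import numpy as np
from scipy.stats import rankdata, kendalltau

def binned_counts(spike_times, duration, bin_size=0.05):
    bins = np.arange(0.0, duration + bin_size, bin_size)
    counts, _ = np.histogram(spike_times, bins=bins)
    return counts

def copula_dependence(train_a, train_b, duration):
    ca, cb = binned_counts(train_a, duration), binned_counts(train_b, duration)
    u = rankdata(ca) / (len(ca) + 1)              # pseudo-observations
    v = rankdata(cb) / (len(cb) + 1)
    tau, p_value = kendalltau(u, v)
    return tau, p_value
```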

  1. An Integrated Circuit for Simultaneous Extracellular Electrophysiology Recording and Optogenetic Neural Manipulation.

    Science.gov (United States)

    Chen, Chang Hao; McCullagh, Elizabeth A; Pun, Sio Hang; Mak, Peng Un; Vai, Mang I; Mak, Pui In; Klug, Achim; Lei, Tim C

    2017-03-01

    The ability to record and to control action potential firing in neuronal circuits is critical to understand how the brain functions. The objective of this study is to develop a monolithic integrated circuit (IC) to record action potentials and simultaneously control action potential firing using optogenetics. A low-noise and high input impedance (or low input capacitance) neural recording amplifier is combined with a high current laser/light-emitting diode (LED) driver in a single IC. The low input capacitance of the amplifier (9.7 pF) was achieved by adding a dedicated unity gain stage optimized for high impedance metal electrodes. The input referred noise of the amplifier is [Formula: see text], which is lower than the estimated thermal noise of the metal electrode. Thus, the action potentials originating from a single neuron can be recorded with a signal-to-noise ratio of at least 6.6. The LED/laser current driver delivers a maximum current of 330 mA, which is adequate for optogenetic control. The functionality of the IC was tested with an anesthetized Mongolian gerbil and auditory stimulated action potentials were recorded from the inferior colliculus. Spontaneous firings of fifth (trigeminal) nerve fibers were also inhibited using the optogenetic protein Halorhodopsin. Moreover, a noise model of the system was derived to guide the design. A single IC to measure and control action potentials using optogenetic proteins is realized so that more complicated behavioral neuroscience research and the translational neural disorder treatments become possible in the future.

  2. Neural signal registration and analysis of axons grown in microchannels

    Science.gov (United States)

    Pigareva, Y.; Malishev, E.; Gladkov, A.; Kolpakov, V.; Bukatin, A.; Mukhina, I.; Kazantsev, V.; Pimashkin, A.

    2016-08-01

    Registration of neuronal bioelectrical signals remains one of the main physical tools to study fundamental mechanisms of signal processing in the brain. Neurons generate spiking patterns that propagate through a complex map of neural network connectivity. Extracellular recording of isolated axons grown in microchannels provides amplification of the signal for detailed study of spike propagation. In this study we used neuronal hippocampal cultures grown in microfluidic devices combined with microelectrode arrays to investigate changes in electrical activity during neural network development. We found that, five days in vitro after plating, spiking activity appears first in the microchannels and only 2-3 days later on the electrodes of the overall neural network. We conclude that such an approach provides a convenient method to study neural signal processing and functional structure development at the single-cell and network levels of the neuronal culture.

  3. Spike-train variability of auditory neurons in vivo: dynamic responses follow predictions from constant stimuli.

    Science.gov (United States)

    Schaette, Roland; Gollisch, Tim; Herz, Andreas V M

    2005-06-01

    Reliable accounts of the variability observed in neural spike trains are a prerequisite for the proper interpretation of neural dynamics and coding principles. Models that accurately describe neural variability over a wide range of stimulation and response patterns are therefore highly desirable, especially if they can explain this variability in terms of basic neural observables and parameters such as firing rate and refractory period. In this work, we analyze the response variability recorded in vivo from locust auditory receptor neurons under acoustic stimulation. In agreement with results from other systems, our data suggest that neural refractoriness has a strong influence on spike-train variability. We therefore explore a stochastic model of spike generation that includes refractoriness through a recovery function. Because our experimental data are consistent with a renewal process, the recovery function can be derived from a single interspike-interval histogram obtained under constant stimulation. The resulting description yields quantitatively accurate predictions of the response variability over the whole range of firing rates for constant-intensity as well as amplitude-modulated sound stimuli. Model parameters obtained from constant stimulation can be used to predict the variability in response to dynamic stimuli. These results demonstrate that key ingredients of the stochastic response dynamics of a sensory neuron are faithfully captured by a simple stochastic model framework.
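
    The model's two ingredients — a recovery function estimated from an interspike-interval histogram and a firing rate modulated by that function after each spike — can be sketched as follows; the crude hazard-based recovery estimate, the normalization and the discretization are simplifying assumptions for illustration, not the authors' fitted model.

```python
# Sketch of a rate-with-recovery spike generator: after each spike the
# instantaneous rate is the free rate times a recovery weight w(t - t_last)
# derived from an interspike-interval (ISI) histogram. Illustrative only.
import numpy as np

def recovery_from_isis(isis, dt=0.001, t_max=0.05):
    """Crude recovery estimate from the ISI hazard, scaled to [0, 1]."""
    edges = np.arange(0.0, t_max + dt, dt)
    hist, _ = np.histogram(isis, bins=edges)
    survivor = hist[::-1].cumsum()[::-1] + 1e-12   # ISIs at least this long
    hazard = hist / (survivor * dt)
    return hazard / max(hazard.max(), 1e-12)

def simulate_spikes(rate, recovery, dt=0.001):
    """rate: array of free firing rates (Hz), one per time bin of width dt."""
    spikes, t_last = [], -1e9                      # no previous spike yet
    for i, r in enumerate(rate):
        idx = int((i * dt - t_last) / dt)
        w = recovery[idx] if idx < len(recovery) else 1.0
        if np.random.rand() < r * w * dt:
            spikes.append(i * dt)
            t_last = i * dt
    return np.array(spikes)
```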

  4. A wireless recording system that utilizes Bluetooth technology to transmit neural activity in freely moving animals.

    Science.gov (United States)

    Hampson, Robert E; Collins, Vernell; Deadwyler, Sam A

    2009-09-15

    A new wireless transceiver is described for recording individual neuron firing from behaving rats utilizing Bluetooth transmission technology and a processor onboard for discrimination of neuronal waveforms and associated time stamps. This universal brain activity transmitter (UBAT) is attached to rodents via a backpack and amplifier headstage and can transmit 16 channels of captured neuronal firing data via a Bluetooth transceiver chip over very large and unconstrained distances. The onboard microprocessor of the UBAT allows flexible online control over waveform isolation criteria via transceiver instruction and the two-way communication capacity allows for closed-loop applications between neural events and behavioral or physiological processes which can be modified by transceiver instructions. A detailed description of the multiplexer processing of channel data as well as examples of neuronal recordings in different behavioral testing contexts is provided to demonstrate the capacity for robust transmission within almost any laboratory environment. A major advantage of the UBAT is the long transmission range and lack of object-based line of sight interference afforded by Bluetooth technology, allowing flexible recording capabilities within multiple experimental paradigms without interruption. Continuous recordings over very large distance separations from the monitor station are demonstrated providing experimenters with recording advantages not previously available with other telemetry devices.

  5. Assessing artificial neural networks and statistical methods for infilling missing soil moisture records

    Science.gov (United States)

    Dumedah, Gift; Walker, Jeffrey P.; Chik, Li

    2014-07-01

    Soil moisture information is critically important for water management operations including flood forecasting, drought monitoring, and groundwater recharge estimation. While an accurate and continuous record of soil moisture is required for these applications, the available soil moisture data, in practice, is typically fraught with missing values. A wide range of methods is available for infilling hydrologic variables, but a thorough inter-comparison between statistical methods and artificial neural networks has not been made. This study examines five statistical methods, including monthly averages, the weighted Pearson correlation coefficient, a method based on temporal stability of soil moisture, and a weighted merging of the three methods, together with a method based on the concept of rough sets. Additionally, nine artificial neural networks are examined, broadly categorized into feedforward, dynamic, and radial basis networks. These 14 infilling methods were used to estimate missing soil moisture records and subsequently validated against known values for 13 soil moisture monitoring stations at three different soil layer depths in the Yanco region in southeast Australia. The evaluation results show that the top three highest performing methods are the nonlinear autoregressive neural network, the rough sets method, and monthly replacement. A high estimation accuracy (root mean square error (RMSE) of about 0.03 m/m) was found for the nonlinear autoregressive network, due to its regression-based dynamic architecture, which allows feedback connections through discrete-time estimation. A similarly high accuracy (0.05 m/m RMSE) in the rough sets procedure illustrates the important role of temporal persistence of soil moisture, with the capability to account for different soil moisture conditions.

  6. Long-term neural recordings using MEMS based moveable microelectrodes in the brain

    Directory of Open Access Journals (Sweden)

    Nathan Jackson

    2010-06-01

    Full Text Available One of the critical requirements of the emerging class of neural prosthetic devices is to maintain good quality neural recordings over long time periods. We report here a novel MEMS (Micro-ElectroMechanical Systems) based technology that can move microelectrodes in the event of deterioration in the neural signal to sample a new set of neurons. Microscale electro-thermal actuators are used to controllably move microelectrodes post-implantation in steps of approximately 9 µm. In this study, a total of 12 moveable microelectrode chips were individually implanted in adult rats. Two of the 12 moveable microelectrode chips were not moved over a period of 3 weeks and were treated as control experiments. During the first three weeks of implantation, moving the microelectrodes led to an improvement in the average SNR from 14.61 ± 5.21 dB before movement to 18.13 ± 4.99 dB after movement across all microelectrodes and all days. However, the average RMS values of noise amplitudes were similar at 2.98 ± 1.22 µV and 3.01 ± 1.16 µV before and after microelectrode movement. Beyond three weeks, the primary observed failure mode was biological rejection of the PMMA (dental cement) based skull mount, resulting in the device loosening and eventually falling from the skull. Additionally, the average SNR for functioning devices beyond three weeks was 11.88 ± 2.02 dB before microelectrode movement and was significantly different (p<0.01) from the average SNR of 13.34 ± 0.919 dB after movement. The results of this study demonstrate that MEMS based technologies can move microelectrodes in rodent brains in long-term experiments resulting in improvements in signal quality. Further improvements in packaging and surgical techniques will potentially enable movable microelectrodes to record cortical neuronal activity in chronic experiments.

  7. Mesoscale-duration activated states gate spiking in response to fast rises in membrane voltage in the awake brain.

    Science.gov (United States)

    Singer, Annabelle C; Talei Franzesi, Giovanni; Kodandaramaiah, Suhasa B; Flores, Francisco J; Cohen, Jeremy D; Lee, Albert K; Borgers, Christoph; Forest, Craig R; Kopell, Nancy J; Boyden, Edward S

    2017-08-01

    Seconds-scale network states, affecting many neurons within a network, modulate neural activity by complementing fast integration of neuron-specific inputs that arrive in the milliseconds before spiking. Nonrhythmic subthreshold dynamics at intermediate timescales, however, are less well characterized. We found, using automated whole cell patch clamping in vivo, that spikes recorded in CA1 and barrel cortex in awake mice are often preceded not only by monotonic voltage rises lasting milliseconds but also by more gradual (lasting tens to hundreds of milliseconds) depolarizations. The latter exert a gating function on spiking, in a fashion that depends on the gradual rise duration: the probability of spiking was higher for longer gradual rises, even when controlled for the amplitude of the gradual rises. Barrel cortex double-autopatch recordings show that gradual rises are shared across some, but not all, neurons. The gradual rises may represent a new kind of state, intermediate both in timescale and in proportion of neurons participating, which gates a neuron's ability to respond to subsequent inputs.NEW & NOTEWORTHY We analyzed subthreshold activity preceding spikes in hippocampus and barrel cortex of awake mice. Aperiodic voltage ramps extending over tens to hundreds of milliseconds consistently precede and facilitate spikes, in a manner dependent on both their amplitude and their duration. These voltage ramps represent a "mesoscale" activated state that gates spike production in vivo. Copyright © 2017 the American Physiological Society.

  8. A Robustness Comparison of Two Algorithms Used for EEG Spike Detection

    OpenAIRE

    Chaibi, Sahbi; Lajnef, Tarek; Ghrob, Abdelbacet; Samet, Mounir; Kachouri, Abdennaceur

    2015-01-01

    Spikes and sharp waves recorded on scalp EEG may play an important role in identifying the epileptogenic network as well as in understanding the central nervous system. Therefore, several automatic and semi-automatic methods have been implemented to detect these two neural transients. A consistent gold standard associated with a high degree of agreement among neuroscientists is required to measure relevant performance of different methods. In fact, scalp EEG data can often be corrupted by a s...

  9. Memristors Empower Spiking Neurons With Stochasticity

    KAUST Repository

    Al-Shedivat, Maruan

    2015-06-01

    Recent theoretical studies have shown that probabilistic spiking can be interpreted as learning and inference in cortical microcircuits. This interpretation creates new opportunities for building neuromorphic systems driven by probabilistic learning algorithms. However, such systems must have two crucial features: 1) the neurons should follow a specific behavioral model, and 2) stochastic spiking should be implemented efficiently for it to be scalable. This paper proposes a memristor-based stochastically spiking neuron that fulfills these requirements. First, the analytical model of the memristor is enhanced so it can capture the behavioral stochasticity consistent with experimentally observed phenomena. The switching behavior of the memristor model is demonstrated to be akin to the firing of the stochastic spike response neuron model, the primary building block for probabilistic algorithms in spiking neural networks. Furthermore, the paper proposes a neural soma circuit that utilizes the intrinsic nondeterminism of memristive switching for efficient spike generation. The simulations and analysis of the behavior of a single stochastic neuron and a winner-take-all network built of such neurons and trained on handwritten digits confirm that the circuit can be used for building probabilistic sampling and pattern adaptation machinery in spiking networks. The findings constitute an important step towards scalable and efficient probabilistic neuromorphic platforms. © 2011 IEEE.
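
    The neuron abstraction targeted by the circuit, a stochastically spiking unit whose firing probability follows its membrane potential, can be sketched as below; the sigmoidal escape rate and its parameters are illustrative assumptions, and the memristor device model itself is not reproduced here.

```python
# Sketch of a stochastic spike-response style neuron: in each time step the
# unit fires with a probability set by an escape rate that is a sigmoidal
# function of its membrane potential. Illustrative abstraction only.
import numpy as np

def stochastic_neuron(potential, dt=0.001, rho0=100.0, beta=5.0, theta=0.0,
                      rng=np.random):
    """potential: membrane potential per time step; returns boolean spike array."""
    rate = rho0 / (1.0 + np.exp(-beta * (potential - theta)))  # sigmoidal intensity
    return rng.rand(len(potential)) < rate * dt
```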

  10. Flavopiridol reduces the impedance of neural prostheses in vivo without affecting recording quality.

    Science.gov (United States)

    Purcell, Erin K; Thompson, David E; Ludwig, Kip A; Kipke, Daryl R

    2009-10-15

    We hypothesized that re-entry into the cell cycle may be associated with reactive gliosis surrounding neural prostheses, and that administration of a cell cycle inhibitor (flavopiridol) at the time of surgery would reduce this effect. We investigated the effects of flavopiridol on recording quality and impedance over a 28-day time period and conducted histology at 3 and 28 days post-implantation. Flavopiridol reduced the expression of a cell cycle protein (cyclin D1) in microglia surrounding probes at the 3-day time point. Impedance at 1 kHz was decreased by drug administration across the study period compared to vehicle controls. Correlations between recording (SNR, units) and impedance metrics revealed a small, but statistically significant, inverse relationship between these variables. However, the relationship between impedance and recording quality was not sufficiently strong for flavopiridol to result in an improvement in SNR or the number of units detected. Our data indicate that flavopiridol is an effective, easily administered treatment for reducing impedance in vivo, potentially through inhibiting microglial encapsulation of implanted devices. This strategy may be useful in stimulation applications, where reduced impedance is desirable for achieving activation thresholds and prolonging the lifetime of the implanted power supply. While improvements in recording quality were not observed, a combination of flavopiridol with a second strategy which enhances neuronal signal detection may enhance these results in future studies.

  11. Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.

    Science.gov (United States)

    Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon

    2016-01-01

    Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze the simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process, implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods, which govern the spike probability of the oncoming action potential based on the time of the last spike, or the bursting behavior, which is characterized by short epochs of rapid action potentials, followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates the spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that compared to the distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of significance of joint spike events seem to be inadequate.
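
    The Monte Carlo idea can be illustrated with a minimal surrogate test: generate pairs of history-dependent spike trains and accumulate the null distribution of coincidence counts. The gamma-ISI renewal surrogate and the coincidence window below are assumptions for illustration; the study itself considers richer non-renewal processes.

```python
# Monte Carlo sketch of a coincidence-count null distribution for a pair of
# non-Poisson (gamma-ISI) spike trains. Illustrative only.
import numpy as np

def gamma_train(rate, duration, shape=2.0, rng=np.random):
    """Renewal train with gamma ISIs whose mean is 1/rate."""
    isis = rng.gamma(shape, 1.0 / (rate * shape),
                     size=int(rate * duration * 2) + 10)
    t = np.cumsum(isis)
    return t[t < duration]

def coincidences(a, b, window=0.005):
    """Number of spikes in a that have a spike in b within +/- window seconds."""
    return sum(np.any(np.abs(b - t) <= window) for t in a)

def null_distribution(rate_a, rate_b, duration, n_surrogates=1000):
    return np.array([coincidences(gamma_train(rate_a, duration),
                                  gamma_train(rate_b, duration))
                     for _ in range(n_surrogates)])
```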

  12. Improved SpikeProp for Using Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Falah Y. H. Ahmed

    2013-01-01

    Full Text Available A spiking neural network encodes information in the timing of individual spikes. A novel supervised learning rule for SpikeProp is derived to overcome the discontinuities introduced by the spiking thresholding. This algorithm is based on an error-backpropagation learning rule suited for supervised learning of spiking neurons that use exact spike time coding. SpikeProp demonstrates that spiking neurons can perform complex nonlinear classification using fast temporal coding. This study proposes enhancements of the SpikeProp learning algorithm for supervised training of spiking networks that can deal with complex patterns. The proposed methods include SpikeProp with particle swarm optimization (PSO) and an angle-driven dependency learning rate. These methods are applied to the SpikeProp network for multilayer learning enhancement and weight optimization. Input and output patterns are encoded as spike trains of precisely timed spikes, and the network learns to transform the input trains into target output trains. With these enhancements, our proposed methods outperformed other conventional neural network architectures.

  13. Stimulus-dependent spiking relationships with the EEG

    Science.gov (United States)

    Snyder, Adam C.

    2015-01-01

    The development and refinement of noninvasive techniques for imaging neural activity is of paramount importance for human neuroscience. Currently, the most accessible and popular technique is electroencephalography (EEG). However, nearly all of what we know about the neural events that underlie EEG signals is based on inference, because of the dearth of studies that have simultaneously paired EEG recordings with direct recordings of single neurons. From the perspective of electrophysiologists there is growing interest in understanding how spiking activity coordinates with large-scale cortical networks. Evidence from recordings at both scales highlights that sensory neurons operate in very distinct states during spontaneous and visually evoked activity, which appear to form extremes in a continuum of coordination in neural networks. We hypothesized that individual neurons have idiosyncratic relationships to large-scale network activity indexed by EEG signals, owing to the neurons' distinct computational roles within the local circuitry. We tested this by recording neuronal populations in visual area V4 of rhesus macaques while we simultaneously recorded EEG. We found substantial heterogeneity in the timing and strength of spike-EEG relationships and that these relationships became more diverse during visual stimulation compared with the spontaneous state. The visual stimulus apparently shifts V4 neurons from a state in which they are relatively uniformly embedded in large-scale network activity to a state in which their distinct roles within the local population are more prominent, suggesting that the specific way in which individual neurons relate to EEG signals may hold clues regarding their computational roles. PMID:26108954

  14. Identifying functional connectivity in large-scale neural ensemble recordings: a multiscale data mining approach.

    Science.gov (United States)

    Eldawlatly, Seif; Jin, Rong; Oweiss, Karim G

    2009-02-01

    Identifying functional connectivity between neuronal elements is an essential first step toward understanding how the brain orchestrates information processing at the single-cell and population levels to carry out biological computations. This letter suggests a new approach to identify functional connectivity between neuronal elements from their simultaneously recorded spike trains. In particular, we identify clusters of neurons that exhibit functional interdependency over variable spatial and temporal patterns of interaction. We represent neurons as objects in a graph and connect them using arbitrarily defined similarity measures calculated across multiple timescales. We then use a probabilistic spectral clustering algorithm to cluster the neurons in the graph by solving a minimum graph cut optimization problem. Using point process theory to model population activity, we demonstrate the robustness of the approach in tracking a broad spectrum of neuronal interaction, from synchrony to rate co-modulation, by systematically varying the length of the firing history interval and the strength of the connecting synapses that govern the discharge pattern of each neuron. We also demonstrate how activity-dependent plasticity can be tracked and quantified in multiple network topologies built to mimic distinct behavioral contexts. We compare the performance to classical approaches to illustrate the substantial gain in performance.
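
    A stripped-down version of the graph-based pipeline — build a similarity matrix from binned spike counts and cluster it spectrally — is sketched below; the correlation-based similarity and scikit-learn's SpectralClustering are stand-ins for the multiscale similarity measures and the probabilistic spectral clustering used in the paper.

```python
# Sketch: functional grouping of neurons by spectral clustering on a
# similarity graph built from pairwise correlations of binned spike counts.
# Illustrative stand-in for the paper's multiscale, probabilistic approach.
import numpy as np
from sklearn.cluster import SpectralClustering

def cluster_neurons(binned_counts, n_clusters=3):
    """binned_counts: (n_neurons, n_bins) spike-count matrix."""
    corr = np.corrcoef(binned_counts)                 # pairwise similarity
    affinity = np.clip(corr, 0.0, 1.0)                # keep positive similarities
    np.fill_diagonal(affinity, 1.0)
    sc = SpectralClustering(n_clusters=n_clusters, affinity='precomputed')
    return sc.fit_predict(affinity)
```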

  15. Minimally-Invasive Neural Interface for Distributed Wireless Electrocorticogram Recording Systems

    Directory of Open Access Journals (Sweden)

    Sun-Il Chang

    2018-01-01

    Full Text Available This paper presents a minimally-invasive neural interface for distributed wireless electrocorticogram (ECoG) recording systems. The proposed interface equips all necessary components for ECoG recording, such as the high performance front-end integrated circuits, a fabricated flexible microelectrode array, and wireless communication inside a miniaturized custom-made platform. The multiple units of the interface systems can be deployed to cover a broad range of the target brain region and transmit signals via a built-in intra-skin communication (ISCOM) module. The core integrated circuit (IC) consists of 16-channel, low-power push-pull double-gated preamplifiers, in-channel successive approximation register analog-to-digital converters (SAR ADC) with a single-clocked bootstrapping switch and a time-delayed control unit, an ISCOM module for wireless data transfer through the skin instead of a power-hungry RF wireless transmitter, and a monolithic voltage/current reference generator to support the aforementioned analog and mixed-signal circuit blocks. The IC was fabricated using 250 nm CMOS processes in an area of 3.2 × 0.9 mm² and achieved the low-power operation of 2.5 µW per channel. Input-referred noise was measured as 5.62 µVrms for 10 Hz to 10 kHz and ENOB of 7.21 at 31.25 kS/s. The implemented system successfully recorded multi-channel neural activities in vivo from a primate and demonstrated modular expandability using the ISCOM with power consumption of 160 µW.

  16. Minimally-Invasive Neural Interface for Distributed Wireless Electrocorticogram Recording Systems.

    Science.gov (United States)

    Chang, Sun-Il; Park, Sung-Yun; Yoon, Euisik

    2018-01-17

    This paper presents a minimally-invasive neural interface for distributed wireless electrocorticogram (ECoG) recording systems. The proposed interface equips all necessary components for ECoG recording, such as the high performance front-end integrated circuits, a fabricated flexible microelectrode array, and wireless communication inside a miniaturized custom-made platform. The multiple units of the interface systems can be deployed to cover a broad range of the target brain region and transmit signals via a built-in intra-skin communication (ISCOM) module. The core integrated circuit (IC) consists of 16-channel, low-power push-pull double-gated preamplifiers, in-channel successive approximation register analog-to-digital converters (SAR ADC) with a single-clocked bootstrapping switch and a time-delayed control unit, an ISCOM module for wireless data transfer through the skin instead of a power-hungry RF wireless transmitter, and a monolithic voltage/current reference generator to support the aforementioned analog and mixed-signal circuit blocks. The IC was fabricated using 250 nm CMOS processes in an area of 3.2 × 0.9 mm² and achieved the low-power operation of 2.5 µW per channel. Input-referred noise was measured as 5.62 µVrms for 10 Hz to 10 kHz and ENOB of 7.21 at 31.25 kS/s. The implemented system successfully recorded multi-channel neural activities in vivo from a primate and demonstrated modular expandability using the ISCOM with power consumption of 160 µW.

  17. Comparison of MLP neural network and neuro-fuzzy system in transcranial Doppler signals recorded from the cerebral vessels.

    Science.gov (United States)

    Hardalaç, Firat

    2008-04-01

    Transcranial Doppler signals recorded from the cerebral vessels of 110 patients were transferred to a personal computer using a 16-bit sound card. Spectral analyses of the Transcranial Doppler signals were performed to determine the inputs to the Multi Layer Perceptron (MLP) neural network and the neuro-fuzzy system. To support interpretation and rapid diagnosis, the FFT parameters of the Transcranial Doppler signals were classified using the MLP neural network and the neuro-fuzzy system. Our findings demonstrated that a 92% correct classification rate was obtained with the MLP neural network and an 86% correct classification rate with the neuro-fuzzy system.

  18. Visually evoked spiking evolves while spontaneous ongoing dynamics persist

    Directory of Open Access Journals (Sweden)

    Raoul eHuys

    2016-01-01

    Full Text Available Neurons in the primary visual cortex spontaneously spike even when there are no visual stimuli. It is unknown whether the spiking evoked by visual stimuli is just a modification of the spontaneous ongoing cortical spiking dynamics or whether the spontaneous spiking state disappears and is replaced by evoked spiking. This study of laminar recordings of spontaneous spiking and visually evoked spiking of neurons in the ferret primary visual cortex shows that the spiking dynamics does not change: the spontaneous spiking as well as evoked spiking is controlled by a stable and persisting fixed point attractor. Its existence guarantees that evoked spiking returns to the spontaneous state. However, the spontaneous ongoing spiking state and the visually evoked spiking states are qualitatively different and are separated by a threshold (separatrix). The functional advantage of this organization is that it avoids the need for a system reorganization following visual stimulation, and impedes the transition of spontaneous spiking to evoked spiking and the propagation of spontaneous spiking from layer 4 to layers 2-3.

  19. A sub-microwatt low-noise amplifier for neural recording.

    Science.gov (United States)

    Holleman, Jeremy; Otis, Brian

    2007-01-01

    In this paper we present a pre-amplifier designed for neural recording applications. Extremely low power dissipation is achieved by operating in an open-loop configuration, restricting the circuit to a single current branch, and reusing current to improve noise performance. Our amplifier exhibits 3.5 µVrms of input-referred noise and has a digitally-controlled gain between 36 dB and 44 dB. The amplifier is AC-coupled, with a pass-band from 0.3 Hz to 4.7 kHz. The circuit is implemented in a 0.5 µm SOI Bi-CMOS process and consumes 805 nA from a 1.0 V supply, corresponding to a noise efficiency factor (NEF) of 1.8, which is the lowest reported NEF to date.
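
    The noise efficiency factor quoted in this and several other records above is conventionally defined as below, where V_rms,in is the input-referred noise, I_tot the total supply current, U_T the thermal voltage, k Boltzmann's constant, T the absolute temperature and BW the amplifier bandwidth.

```latex
% Conventional definition of the noise efficiency factor (NEF)
\mathrm{NEF} \;=\; V_{\mathrm{rms,in}}
\sqrt{\frac{2\, I_{\mathrm{tot}}}{\pi \cdot U_T \cdot 4kT \cdot \mathrm{BW}}}
```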

  20. Deep Spiking Networks

    NARCIS (Netherlands)

    O'Connor, P.; Welling, M.

    2016-01-01

    We introduce an algorithm to do backpropagation on a spiking network. Our network is "spiking" in the sense that our neurons accumulate their activation into a potential over time, and only send out a signal (a "spike") when this potential crosses a threshold and the neuron is reset. Neurons only

  1. A wireless and batteryless neural headstage with optical stimulation and electrophysiological recording.

    Science.gov (United States)

    Ameli, Reza; Mirbozorgi, Abdollah; Neron, Jean-Luc; Lechasseur, Yoan; Gosselin, Benoit

    2013-01-01

    This paper presents a miniature optogenetics headstage for wirelessly stimulating the brain of rodents with an implanted LED while recording electrophysiological data from a two-channel custom readout. The headstage is powered wirelessly using an inductive link, and is built using inexpensive commercial off-the-shelf electronic components, including an RF microcontroller and a printed antenna. This device has the capability to drive one light-stimulating LED and, at the same time, capture and send back neural signals recorded from two microelectrode readout channels. Light stimulation uses flexible patterns that allow for easy tuning of light intensity and stimulation periods. To drive the LED, a low-pass-filtered, digitally generated PWM signal provides a flexible pulse generation method that alleviates the need for D/A converters. The proposed device can be powered wirelessly into an animal chamber using inductive energy transfer, which enables compact, light-weight and cost-effective smart animal research systems. The device dimensions are 15×25×17 mm; it weighs 7.4 grams and has a data transmission range of more than 2 meters. Different types of LEDs with different power consumptions can be used for this system. The power consumption of the system without the LED is 94.52 mW.

  2. Recent developments in wireless recording from the nervous system with ultrasonic neural dust (Conference Presentation)

    Science.gov (United States)

    Maharbiz, Michel M.

    2017-05-01

    The emerging field of bioelectronic medicine seeks methods for deciphering and modulating electrophysiological activity in the body to attain therapeutic effects at target organs. Current approaches to interfacing with peripheral nerves and muscles rely heavily on wires, creating problems for chronic use, while emerging wireless approaches lack the size scalability necessary to interrogate small-diameter nerves. Furthermore, conventional electrode-based technologies lack the capability to record from nerves with high spatial resolution or to record independently from many discrete sites within a nerve bundle. We recently demonstrated (Seo et al., arXiv, 2013; Seo et al., Neuron, 2016) "neural dust," a wireless and scalable ultrasonic backscatter system for powering and communicating with implanted bioelectronics. There, we showed that ultrasound is effective at delivering power to mm-scale devices in tissue; likewise, passive, battery-less communication using backscatter enabled high-fidelity transmission of electromyogram (EMG) and electroneurogram (ENG) signals from anesthetized rats. In this talk, I will review recent developments from my group and collaborators in this area.

  3. Multineuronal Spike Sequences Repeat with Millisecond Precision

    Directory of Open Access Journals (Sweden)

    Koki Matsumoto

    2013-06-01

    Full Text Available Cortical microcircuits are nonrandomly wired by neurons. As a natural consequence, spikes emitted by microcircuits are also nonrandomly patterned in time and space. One of the prominent spike organizations is a repetition of fixed patterns of spike series across multiple neurons. However, several questions remain unsolved, including how precisely spike sequences repeat, how the sequences are spatially organized, how many neurons participate in sequences, and how different sequences are functionally linked. To address these questions, we monitored spontaneous spikes of hippocampal CA3 neurons ex vivo using a high-speed functional multineuron calcium imaging technique that allowed us to monitor spikes with millisecond resolution and to record the location of spiking and nonspiking neurons. Multineuronal spike sequences were overrepresented in spontaneous activity compared to the statistical chance level. Approximately 75% of neurons participated in at least one sequence during our observation period. The participants were sparsely dispersed and did not show specific spatial organization. The number of sequences relative to the chance level decreased when larger time frames were used to detect sequences. Thus, sequences were precise at the millisecond level. Sequences often shared common spikes with other sequences; parts of sequences were subsequently relayed by following sequences, generating complex chains of multiple sequences.

  4. Hybrid spiking models.

    Science.gov (United States)

    Izhikevich, Eugene M

    2010-11-13

    I review a class of hybrid models of neurons that combine continuous spike-generation mechanisms and a discontinuous 'after-spike' reset of state variables. Unlike Hodgkin-Huxley-type conductance-based models, the hybrid spiking models have a few parameters derived from the bifurcation theory; instead of matching neuronal electrophysiology, they match neuronal dynamics. I present a method of after-spike resetting suitable for hardware implementation of such models, and a hybrid numerical method for simulations of large-scale biological spiking networks.
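
    As a concrete illustration of the hybrid "continuous dynamics plus after-spike reset" structure described above, here is a minimal Python sketch of the widely used Izhikevich two-variable model; the regular-spiking parameter set, step current and Euler time step are illustrative defaults rather than values from the review.

        import numpy as np

        def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, T=500.0, dt=0.5):
            """Hybrid model: continuous (v, u) dynamics plus a discontinuous after-spike reset."""
            n = int(T / dt)
            v, u = -65.0, b * -65.0
            spikes = []
            for i in range(n):
                # continuous spike-generation dynamics
                v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
                u += dt * a * (b * v - u)
                # discontinuous after-spike reset of the state variables
                if v >= 30.0:
                    spikes.append(i * dt)
                    v, u = c, u + d
            return np.array(spikes)

        spike_times = izhikevich()
        print(f"{len(spike_times)} spikes in 500 ms of simulated time")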

  5. Controlling selective stimulations below a spinal cord hemisection using brain recordings with a neural interface system approach

    Science.gov (United States)

    Panetsos, Fivos; Sanchez-Jimenez, Abel; Torets, Carlos; Largo, Carla; Micera, Silvestro

    2011-08-01

    In this work we address the use of real-time cortical recordings for the generation of coherent, reliable and robust motor activity in spinal-lesioned animals through selective intraspinal microstimulation (ISMS). The spinal cord of adult rats was hemisectioned and groups of multielectrodes were implanted in both the central nervous system (CNS) and the spinal cord below the lesion level to establish a neural system interface (NSI). To test the reliability of this new NSI connection, highly repeatable neural responses recorded from the CNS were used as a pattern generator of an open-loop control strategy for selective ISMS of the spinal motoneurons. Our experimental procedure avoided the spontaneous non-controlled and non-repeatable neural activity that could have generated spurious ISMS and the consequent undesired muscle contractions. Combinations of complex CNS patterns generated precisely coordinated, reliable and robust motor actions.

  6. Energy efficient low-noise neural recording amplifier with enhanced noise efficiency factor.

    Science.gov (United States)

    Majidzadeh, V; Schmid, A; Leblebici, Y

    2011-06-01

    This paper presents a neural recording amplifier array suitable for large-scale integration with multielectrode arrays in very low-power microelectronic cortical implants. The proposed amplifier is one of the most energy-efficient structures reported to date, which theoretically achieves an effective noise efficiency factor (NEF) smaller than the limit that can be achieved by any existing amplifier topology, which utilizes a differential pair input stage. The proposed architecture, which is referred to as a partial operational transconductance amplifier sharing architecture, results in a significant reduction of power dissipation as well as silicon area, in addition to the very low NEF. The effect of mismatch on crosstalk between channels and the tradeoff between noise and crosstalk are theoretically analyzed. Moreover, a mathematical model of the nonlinearity of the amplifier is derived, and its accuracy is confirmed by simulations and measurements. For an array of four neural amplifiers, measurement results show a midband gain of 39.4 dB and a -3-dB bandwidth ranging from 10 Hz to 7.2 kHz. The input-referred noise integrated from 10 Hz to 100 kHz is measured at 3.5 μVrms and the power consumption is 7.92 μW from a 1.8-V supply, which corresponds to NEF = 3.35. The worst-case crosstalk and common-mode rejection ratio within the desired bandwidth are - 43.5 dB and 70.1 dB, respectively, and the active silicon area of each amplifier is 256 μm × 256 μm in 0.18-μm complementary metal-oxide semiconductor technology.

  7. Which model to use for cortical spiking neurons?

    Science.gov (United States)

    Izhikevich, Eugene M

    2004-09-01

    We discuss the biological plausibility and computational efficiency of some of the most useful models of spiking and bursting neurons. We compare their applicability to large-scale simulations of cortical neural networks.

  8. Graph-Laplacian features for neural waveform classification.

    Science.gov (United States)

    Ghanbari, Yasser; Papamichalis, Panos E; Spence, Larry

    2011-05-01

    Analysis of extracellular recordings of neural action potentials (known as spikes) is highly dependent upon the accuracy of neural waveform classification, commonly referred to as spike sorting. Feature extraction is an important stage of this process because it can limit the quality of clustering that is performed in the feature space. Principal components analysis (PCA) is the most commonly used feature extraction method employed for neural spike recordings. To improve upon PCA's feature extraction performance for neural spike sorting, we revisit the PCA procedure to analyze its weaknesses and describe an improved feature extraction method. This paper proposes a linear feature extraction technique that we call graph-Laplacian features, which simultaneously minimizes the graph Laplacian and maximizes variance. The algorithm's performance is compared with PCA and a wavelet-coefficient-based feature extraction algorithm on simulated single-electrode neural data. A cluster-quality metric is proposed to quantitatively measure the algorithm performance. The results show that the proposed algorithm produces more compact and well-separated clusters compared to the other approaches. © 2011 IEEE
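
    For context, the sketch below implements the conventional PCA feature extraction that the proposed graph-Laplacian method is compared against, applied to synthetic spike waveforms; it is a generic baseline, not the algorithm introduced in the paper, and the waveform templates are made up for illustration.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic data: two spike waveform classes plus noise (stand-ins for real snippets)
        t = np.linspace(0, 1, 48)
        proto_a = np.exp(-((t - 0.3) / 0.05) ** 2) - 0.4 * np.exp(-((t - 0.45) / 0.1) ** 2)
        proto_b = 0.8 * np.exp(-((t - 0.35) / 0.07) ** 2) - 0.6 * np.exp(-((t - 0.55) / 0.12) ** 2)
        waveforms = np.vstack([proto_a + 0.05 * rng.standard_normal(48) for _ in range(200)] +
                              [proto_b + 0.05 * rng.standard_normal(48) for _ in range(200)])

        # Conventional PCA feature extraction: project onto the top eigenvectors
        # of the waveform covariance matrix.
        X = waveforms - waveforms.mean(axis=0)
        cov = X.T @ X / (X.shape[0] - 1)
        eigvals, eigvecs = np.linalg.eigh(cov)          # eigenvalues in ascending order
        pcs = eigvecs[:, ::-1][:, :2]                   # top-2 principal components
        features = X @ pcs                              # 2-D feature space for clustering
        print(features.shape)                           # (400, 2)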

  9. A Fully Integrated Wireless Compressed Sensing Neural Signal Acquisition System for Chronic Recording and Brain Machine Interface.

    Science.gov (United States)

    Liu, Xilin; Zhang, Milin; Xiong, Tao; Richardson, Andrew G; Lucas, Timothy H; Chin, Peter S; Etienne-Cummings, Ralph; Tran, Trac D; Van der Spiegel, Jan

    2016-07-18

    Reliable, multi-channel neural recording is critical to neuroscience research and clinical treatment. However, most hardware development of fully integrated, multi-channel wireless neural recorders to date is still in the proof-of-concept stage. To be ready for practical use, the trade-offs between performance, power consumption, device size, robustness, and compatibility need to be carefully taken into account. This paper presents an optimized wireless compressed sensing neural signal recording system. The system takes advantage of both custom integrated circuits and universally compatible wireless solutions. The proposed system includes an implantable wireless system-on-chip (SoC) and an external wireless relay. The SoC integrates 16-channel low-noise neural amplifiers, programmable filters and gain stages, a SAR ADC, a real-time compressed sensing module, and a near-field wireless power and data transmission link. The external relay integrates a 32-bit low-power microcontroller with a Bluetooth 4.0 wireless module, a programming interface, and an inductive charging unit. The SoC achieves high signal recording quality with minimized power consumption, while reducing the risk of infection from through-skin connectors. The external relay maximizes compatibility and programmability. The proposed compressed sensing module is highly configurable, featuring an SNDR of 9.78 dB at a compression ratio of 8×. The SoC has been fabricated in a 180 nm standard CMOS technology, occupying a 2.1 mm × 0.6 mm silicon area. A pre-implantable system has been assembled to demonstrate the proposed paradigm. The developed system has been successfully used for long-term wireless neural recording in a freely behaving rhesus monkey.
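
    The on-chip compression and off-line reconstruction split described above can be illustrated with a toy compressed sensing example: a random Bernoulli measurement matrix reduces a synthetic sparse vector by the 8× ratio quoted in the abstract, and a plain iterative soft-thresholding loop recovers it. Signal length, sparsity and the threshold parameter are illustrative assumptions, not the chip's actual design.

        import numpy as np

        rng = np.random.default_rng(1)

        N, ratio = 256, 8                 # signal length, compression ratio (8x as in the abstract)
        M = N // ratio                    # number of measurements kept
        x = np.zeros(N)
        x[rng.choice(N, size=5, replace=False)] = rng.standard_normal(5)   # sparse "signal"

        Phi = rng.choice([-1.0, 1.0], size=(M, N)) / np.sqrt(M)            # Bernoulli sensing matrix
        y = Phi @ x                                                         # compressed samples

        # Iterative soft-thresholding (ISTA) for the l1-regularized recovery problem
        step = 1.0 / np.linalg.norm(Phi, 2) ** 2
        lam = 0.01
        x_hat = np.zeros(N)
        for _ in range(1000):
            x_hat = x_hat + step * Phi.T @ (y - Phi @ x_hat)
            x_hat = np.sign(x_hat) * np.maximum(np.abs(x_hat) - lam * step, 0.0)

        print(f"relative error: {np.linalg.norm(x_hat - x) / np.linalg.norm(x):.3f}")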

  10. How adaptation shapes spike rate oscillations in recurrent neuronal networks

    Directory of Open Access Journals (Sweden)

    Moritz Augustin

    2013-02-01

    Full Text Available Neural mass signals from in-vivo recordings often show oscillations with frequencies ranging from <1 Hz to 100 Hz. Fast rhythmic activity in the beta and gamma range can be generated by network-based mechanisms such as recurrent synaptic excitation-inhibition loops. Slower oscillations might instead depend on neuronal adaptation currents whose timescales range from tens of milliseconds to seconds. Here we investigate how the dynamics of such adaptation currents contribute to spike rate oscillations and resonance properties in recurrent networks of excitatory and inhibitory neurons. Based on a network of sparsely coupled spiking model neurons with two types of adaptation current and conductance-based synapses with heterogeneous strengths and delays, we use a mean-field approach to analyze oscillatory network activity. For constant external input, we find that spike-triggered adaptation currents provide a mechanism to generate slow oscillations over a wide range of adaptation timescales as long as recurrent synaptic excitation is sufficiently strong. Faster rhythms occur when recurrent inhibition is slower than excitation and oscillation frequency increases with the strength of inhibition. Adaptation facilitates such network-based oscillations for fast synaptic inhibition and leads to decreased frequencies. For oscillatory external input, adaptation currents amplify a narrow band of frequencies and cause phase advances for low frequencies in addition to phase delays at higher frequencies. Our results therefore identify the different key roles of neuronal adaptation dynamics for rhythmogenesis and selective signal propagation in recurrent networks.
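
    The spike-triggered adaptation mechanism at the heart of the study is easy to see in a single model cell. Below is a minimal sketch of a leaky integrate-and-fire neuron with a spike-triggered adaptation current; all parameter values are illustrative placeholders, not the network parameters used in the paper.

        import numpy as np

        def adaptive_lif(I=1.5, T=1.0, dt=1e-4, tau_m=0.02, tau_w=0.2, b=0.5,
                         v_th=1.0, v_reset=0.0):
            """Leaky integrate-and-fire neuron with a spike-triggered adaptation current w."""
            n = int(T / dt)
            v, w = 0.0, 0.0
            spikes = []
            for i in range(n):
                v += dt * (-v + I - w) / tau_m     # membrane with adaptation feedback
                w += dt * (-w / tau_w)             # adaptation decays between spikes
                if v >= v_th:
                    spikes.append(i * dt)
                    v = v_reset
                    w += b                          # spike-triggered increment
            return np.array(spikes)

        isis = np.diff(adaptive_lif())
        # early intervals lengthen as w accumulates, then settle to a steady value
        print("first ISIs (s):", np.round(isis[:5], 4))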

  11. Optical and electrical recording of neural activity evoked by graded contrast visual stimulus

    Directory of Open Access Journals (Sweden)

    Bulf Luca

    2007-07-01

    Full Text Available Background: Brain activity has been investigated by several methods with different principles, notably optical ones. Each method may offer information on distinct physiological or pathological aspects of brain function. The ideal instrument to measure brain activity should include complementary techniques and integrate the resultant information. As a "low cost" approach towards this objective, we combined the well-grounded electroencephalography technique with the newer near-infrared spectroscopy methods to investigate human visual function. Methods: The article describes an embedded instrument combining a continuous-wave near-infrared spectroscopy system and an electroencephalography system to simultaneously monitor functional hemodynamics and electrical activity. The near-infrared spectroscopy (NIRS) signal depends on the light absorption spectra of haemoglobin and measures the blood volume and blood oxygenation regulation supporting the neural activity. The NIRS and visual evoked potential (VEP) signals are concurrently acquired during steady-state visual stimulation, at 8 Hz, with a b/w "windmill" pattern, in nine human subjects. The pattern contrast is varied (1%, 10%, 100%) according to a stimulation protocol. Results: In this study, we present the measuring system; the results consist of concurrent recordings of hemodynamic changes and evoked potential responses emerging from different contrast levels of a patterned stimulus. The concentration of [HbO2] increases and that of [HHb] decreases after the onset of the stimulus. Their variation shows a clear relationship with the contrast value: high contrast produces a large concentration difference, while low contrast produces a small one. This behaviour is similar to the already known relationship between VEP response amplitude and contrast. Conclusion: The simultaneous recording and analysis of NIRS and VEP signals in humans during visual stimulation with a b/w pattern at variable

  12. A high performing brain-machine interface driven by low-frequency local field potentials alone and together with spikes

    Science.gov (United States)

    Stavisky, Sergey D.; Kao, Jonathan C.; Nuyujukian, Paul; Ryu, Stephen I.; Shenoy, Krishna V.

    2015-06-01

    Objective. Brain-machine interfaces (BMIs) seek to enable people with movement disabilities to directly control prosthetic systems with their neural activity. Current high performance BMIs are driven by action potentials (spikes), but access to this signal often diminishes as sensors degrade over time. Decoding local field potentials (LFPs) as an alternative or complementary BMI control signal may improve performance when there is a paucity of spike signals. To date only a small handful of LFP decoding methods have been tested online; there remains a need to test different LFP decoding approaches and improve LFP-driven performance. There has also not been a reported demonstration of a hybrid BMI that decodes kinematics from both LFP and spikes. Here we first evaluate a BMI driven by the local motor potential (LMP), a low-pass filtered time-domain LFP amplitude feature. We then combine decoding of both LMP and spikes to implement a hybrid BMI. Approach. Spikes and LFP were recorded from two macaques implanted with multielectrode arrays in primary and premotor cortex while they performed a reaching task. We then evaluated closed-loop BMI control using biomimetic decoders driven by LMP, spikes, or both signals together. Main results. LMP decoding enabled quick and accurate cursor control which surpassed previously reported LFP BMI performance. Hybrid decoding of both spikes and LMP improved performance when spikes signal quality was mediocre to poor. Significance. These findings show that LMP is an effective BMI control signal which requires minimal power to extract and can substitute for or augment impoverished spikes signals. Use of this signal may lengthen the useful lifespan of BMIs and is therefore an important step towards clinically viable BMIs.
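
    The local motor potential feature used above is essentially a low-pass filtered, time-domain LFP amplitude averaged within decoder bins. The sketch below shows that preprocessing step on a toy LFP trace with a zero-phase Butterworth filter; the sampling rate, cutoff and bin width are illustrative choices, not the study's exact pipeline.

        import numpy as np
        from scipy.signal import butter, filtfilt

        fs = 1000.0                                # assumed LFP sampling rate (Hz)
        t = np.arange(0, 10, 1 / fs)
        rng = np.random.default_rng(0)
        lfp = np.sin(2 * np.pi * 2 * t) + 0.5 * rng.standard_normal(t.size)   # toy LFP trace

        # Local motor potential (LMP)-style feature: low-pass filter the raw LFP,
        # then average within decoder bins.
        b, a = butter(4, 5.0 / (fs / 2), btype="low")   # 4th-order low-pass, 5 Hz cutoff
        lmp = filtfilt(b, a, lfp)                       # zero-phase filtering

        bin_ms = 50
        samples_per_bin = int(fs * bin_ms / 1000)
        n_bins = lmp.size // samples_per_bin
        lmp_bins = lmp[: n_bins * samples_per_bin].reshape(n_bins, samples_per_bin).mean(axis=1)
        print(lmp_bins.shape)                           # one LMP value per 50 ms decoder bin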

  13. A high performing brain-machine interface driven by low-frequency local field potentials alone and together with spikes.

    Science.gov (United States)

    Stavisky, Sergey D; Kao, Jonathan C; Nuyujukian, Paul; Ryu, Stephen I; Shenoy, Krishna V

    2015-06-01

    Brain-machine interfaces (BMIs) seek to enable people with movement disabilities to directly control prosthetic systems with their neural activity. Current high performance BMIs are driven by action potentials (spikes), but access to this signal often diminishes as sensors degrade over time. Decoding local field potentials (LFPs) as an alternative or complementary BMI control signal may improve performance when there is a paucity of spike signals. To date only a small handful of LFP decoding methods have been tested online; there remains a need to test different LFP decoding approaches and improve LFP-driven performance. There has also not been a reported demonstration of a hybrid BMI that decodes kinematics from both LFP and spikes. Here we first evaluate a BMI driven by the local motor potential (LMP), a low-pass filtered time-domain LFP amplitude feature. We then combine decoding of both LMP and spikes to implement a hybrid BMI. Spikes and LFP were recorded from two macaques implanted with multielectrode arrays in primary and premotor cortex while they performed a reaching task. We then evaluated closed-loop BMI control using biomimetic decoders driven by LMP, spikes, or both signals together. LMP decoding enabled quick and accurate cursor control which surpassed previously reported LFP BMI performance. Hybrid decoding of both spikes and LMP improved performance when spikes signal quality was mediocre to poor. These findings show that LMP is an effective BMI control signal which requires minimal power to extract and can substitute for or augment impoverished spikes signals. Use of this signal may lengthen the useful lifespan of BMIs and is therefore an important step towards clinically viable BMIs.

  14. An externally head-mounted wireless neural recording device for laboratory animal research and possible human clinical use.

    Science.gov (United States)

    Yin, Ming; Li, Hao; Bull, Christopher; Borton, David A; Aceros, Juan; Larson, Lawrence; Nurmikko, Arto V

    2013-01-01

    In this paper we present a new type of head-mounted wireless neural recording device in a highly compact package, dedicated to untethered laboratory animal research and designed for future mobile human clinical use. The device, which takes its input from an array of intracortical microelectrode arrays (MEAs), has ninety-seven broadband parallel neural recording channels and was integrated onto two custom-designed printed circuit boards. These house several low-power, custom integrated circuits, including a preamplifier ASIC, a controller ASIC, plus two SAR ADCs, a 3-axis accelerometer, a 48 MHz clock source, and a Manchester encoder. Another ultralow-power RF chip supports an OOK transmitter with the center frequency tunable from 3 GHz to 4 GHz, mounted on a separate low-loss dielectric board together with a 3 V LDO, with output fed to a UWB chip antenna. The IC boards were interconnected and packaged in a polyether ether ketone (PEEK) enclosure which is compatible with both animal and human use (e.g. sterilizable). The entire system consumes 17 mA from a 1.2 Ah, 3.6 V Li-SOCl2 1/2AA battery, which operates the device for more than 2 days. The overall system includes custom RF receiver electronics which are designed to directly interface with any number of commercial (or custom) neural signal processors for multi-channel broadband neural recording. Bench-top measurements and in vivo testing of the device in rhesus macaques are presented to demonstrate the performance of the wireless neural interface.

  15. Evoking prescribed spike times in stochastic neurons

    Science.gov (United States)

    Doose, Jens; Lindner, Benjamin

    2017-09-01

    Single cell stimulation in vivo is a powerful tool to investigate the properties of single neurons and their functionality in neural networks. We present a method to determine a cell-specific stimulus that reliably evokes a prescribed spike train with high temporal precision of action potentials. We test the performance of this stimulus in simulations for two different stochastic neuron models. For a broad range of parameters and a neuron firing with intermediate firing rates (20-40 Hz) the reliability in evoking the prescribed spike train is close to its theoretical maximum that is mainly determined by the level of intrinsic noise.

  16. Relationships between spike-free local field potentials and spike timing in human temporal cortex.

    Science.gov (United States)

    Zanos, Stavros; Zanos, Theodoros P; Marmarelis, Vasilis Z; Ojemann, George A; Fetz, Eberhard E

    2012-04-01

    Intracortical recordings comprise both fast events, action potentials (APs), and slower events, known as local field potentials (LFPs). Although it is believed that LFPs mostly reflect local synaptic activity, it is unclear which of their signal components are most closely related to synaptic potentials and would therefore be causally related to the occurrence of individual APs. This issue is complicated by the significant contribution from AP waveforms, especially at higher LFP frequencies. In recordings of single-cell activity and LFPs from the human temporal cortex, we computed quantitative, nonlinear, causal dynamic models for the prediction of AP timing from LFPs, at millisecond resolution, before and after removing AP contributions to the LFP. In many cases, the timing of a significant number of single APs could be predicted from spike-free LFPs at different frequencies. Not surprisingly, model performance was superior when spikes were not removed. Cells whose activity was predicted by the spike-free LFP models generally fell into one of two groups: in the first group, neuronal spike activity was associated with specific phases of low LFP frequencies, lower spike activity at high LFP frequencies, and a stronger linear component in the spike-LFP model; in the second group, neuronal spike activity was associated with larger amplitude of high LFP frequencies, less frequent phase locking, and a stronger nonlinear model component. Spike timing in the first group was better predicted by the sign and level of the LFP preceding the spike, whereas spike timing in the second group was better predicted by LFP power during a certain time window before the spike.

  17. Data on copula modeling of mixed discrete and continuous neural time series

    Directory of Open Access Journals (Sweden)

    Meng Hu

    2016-06-01

    Full Text Available Copula is an important tool for modeling neural dependence. Recent work on copula has been expanded to jointly model mixed time series in neuroscience (Hu et al., 2016, “Joint Analysis of Spikes and Local Field Potentials using Copula” [1]). Here we present further data for the joint analysis of spike and local field potential (LFP) data with copula modeling. In particular, the details of different model orders and the influence of possible spike contamination in LFP data from the same and different electrode recordings are presented. To further facilitate the use of our copula model for the analysis of mixed data, we provide the Matlab code, together with example data.
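
    To make the idea of a copula coupling a discrete spike count to a continuous LFP feature concrete, the sketch below samples from a simple Gaussian copula with Poisson and Gaussian margins; it is a generic illustration, not the model or the Matlab code distributed with the data in [1], and the correlation and rate parameters are arbitrary.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        rho = 0.6                       # latent (copula) correlation, illustrative
        n = 5000

        # 1) Draw correlated standard-normal pairs (the Gaussian copula's latent layer).
        cov = np.array([[1.0, rho], [rho, 1.0]])
        z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
        u = stats.norm.cdf(z)           # uniform marginals coupled by the copula

        # 2) Map the uniforms through inverse CDFs: a discrete margin (spike count)
        #    and a continuous margin (an LFP feature).
        spike_counts = stats.poisson.ppf(u[:, 0], mu=3.0)            # discrete
        lfp_feature = stats.norm.ppf(u[:, 1], loc=0.0, scale=1.0)    # continuous

        r, p_value = stats.spearmanr(spike_counts, lfp_feature)
        print(f"sample dependence (Spearman): {r:.3f}")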

  18. Tracing 'driver' versus 'modulator' information flow throughout large-scale, task-related neural circuitry.

    Science.gov (United States)

    Hermer-Vazquez, Linda

    2008-04-01

    PRIMARY OBJECTIVE: To determine the relative uses of neural action potential ('spike') data versus local field potentials (LFPs) for modeling information flow through complex brain networks. HYPOTHESIS: The common use of LFP data, which are continuous and therefore more mathematically suited for spectral information-flow modeling techniques such as Granger causality analysis, can lead to spurious inferences about whether a given brain area 'drives' the spiking in a downstream area. EXPERIMENT: We recorded spikes and LFPs from the forelimb motor cortex (M1) and the magnocellular red nucleus (mRN), which receives axon collaterals from M1 projection cells onto its distal dendrites, but not onto its perisomatic regions, as rats performed a skilled reaching task. RESULTS AND IMPLICATIONS: As predicted, Granger causality analysis on the LFPs-which are mainly composed of vector-summed dendritic currents-produced results that if conventionally interpreted would suggest that the M1 cells drove spike firing in the mRN, whereas analyses of spiking in the two recorded regions revealed no significant correlations. These results suggest that mathematical models of information flow should treat the sampled dendritic activity as more likely to reflect intrinsic dendritic and input-related processing in neural networks, whereas spikes are more likely to provide information about the output of neural network processing.

  19. Neural network connectivity and response latency modelled by stochastic processes

    DEFF Research Database (Denmark)

    Tamborrino, Massimiliano

    Stochastic processes and their first passage times have been widely used to describe the membrane potential dynamics of single neurons and to reproduce neuronal spikes, respectively. However, the cerebral cortex in human brains is estimated to contain 10-20 billion neurons, and each of them is connected to thousands of other neurons. The first question is: how to model neural networks through stochastic processes? A multivariate Ornstein-Uhlenbeck process, obtained as a diffusion approximation of a jump process, is the proposed answer. Obviously, dependencies between neurons imply dependencies between their spike times. Therefore, the second question is: how to detect neural network connectivity from simultaneously recorded spike trains? Answering this question corresponds to investigating the joint distribution of sequences of first passage times. A non-parametric method based on copulas...

  20. Solving constraint satisfaction problems with networks of spiking neurons

    Directory of Open Access Journals (Sweden)

    Zeno Jonke

    2016-03-01

    Full Text Available Networks of neurons in the brain apply – unlike processors in our current generation of computer hardware – an event-based processing strategy, where short pulses (spikes) are emitted sparsely by neurons to signal the occurrence of an event at a particular point in time. Such spike-based computations promise to be substantially more power-efficient than traditional clocked processing schemes. However, it turned out to be surprisingly difficult to design networks of spiking neurons that can solve difficult computational problems on the level of single spikes (rather than rates of spikes). We present here a new method for designing networks of spiking neurons via an energy function. Furthermore, we show how the energy function of a network of stochastically firing neurons can be shaped in a quite transparent manner by composing the networks of simple stereotypical network motifs. We show that this design approach enables networks of spiking neurons to produce approximate solutions to difficult (NP-hard) constraint satisfaction problems from the domains of planning/optimization and verification/logical inference. The resulting networks employ noise as a computational resource. Nevertheless, the timing of spikes (rather than just spike rates) plays an essential role in their computations. Furthermore, networks of spiking neurons carry out for the Traveling Salesman Problem a more efficient stochastic search for good solutions compared with stochastic artificial neural networks (Boltzmann machines) and Gibbs sampling.

  1. Neuronal communication: firing spikes with spikes.

    Science.gov (United States)

    Brecht, Michael

    2012-08-21

    Spikes of single cortical neurons can exert powerful effects even though most cortical synapses are too weak to fire postsynaptic neurons. A recent study combining single-cell stimulation with population imaging has visualized in vivo postsynaptic firing in genetically identified target cells. The results confirm predictions from in vitro work and might help to understand how the brain reads single-neuron activity. Copyright © 2012 Elsevier Ltd. All rights reserved.

  2. Nonsmooth dynamics in spiking neuron models

    Science.gov (United States)

    Coombes, S.; Thul, R.; Wedgwood, K. C. A.

    2012-11-01

    Large scale studies of spiking neural networks are a key part of modern approaches to understanding the dynamics of biological neural tissue. One approach in computational neuroscience has been to consider the detailed electrophysiological properties of neurons and build vast computational compartmental models. An alternative has been to develop minimal models of spiking neurons with a reduction in the dimensionality of both parameter and variable space that facilitates more effective simulation studies. In this latter case the single neuron model of choice is often a variant of the classic integrate-and-fire model, which is described by a nonsmooth dynamical system. In this paper we review some of the more popular spiking models of this class and describe the types of spiking pattern that they can generate (ranging from tonic to burst firing). We show that a number of techniques originally developed for the study of impact oscillators are directly relevant to their analysis, particularly those for treating grazing bifurcations. Importantly we highlight one particular single neuron model, capable of generating realistic spike trains, that is both computationally cheap and analytically tractable. This is a planar nonlinear integrate-and-fire model with a piecewise linear vector field and a state dependent reset upon spiking. We call this the PWL-IF model and analyse it at both the single neuron and network level. The techniques and terminology of nonsmooth dynamical systems are used to flesh out the bifurcation structure of the single neuron model, as well as to develop the notion of Lyapunov exponents. We also show how to construct the phase response curve for this system, emphasising that techniques in mathematical neuroscience may also translate back to the field of nonsmooth dynamical systems. The stability of periodic spiking orbits is assessed using a linear stability analysis of spiking times. At the network level we consider linear coupling between voltage

  3. Control strategies for underactuated neural ensembles driven by optogenetic stimulation

    Directory of Open Access Journals (Sweden)

    ShiNung Ching

    2013-04-01

    Full Text Available Motivated by experiments employing optogenetic stimulation of cortical regions, we consider spike control strategies for ensembles of uncoupled integrate and fire neurons with a common conductance input. We construct strategies for control of spike patterns, that is, multineuron trains of action potentials, up to some maximal spike rate determined by the neural biophysics. We emphasize a constructive role for parameter heterogeneity, and find a simple rule for controllability in pairs of neurons. In particular, we determine parameters for which common drive is not limited to inducing synchronous spiking. For large ensembles, we determine how the number of controllable neurons varies with the number of observed (recorded) neurons, and what collateral spiking occurs in the full ensemble during control of the subensemble. While complete control of spiking in every neuron is not possible with a single input, we find that a degree of subensemble control is made possible by exploiting dynamical heterogeneity. As most available technologies for neural stimulation are underactuated, in the sense that the number of target neurons far exceeds the number of independent channels of stimulation, these results suggest partial control strategies that may be important in the development of sensory neuroprosthetics and other neurocontrol applications.

  4. Control strategies for underactuated neural ensembles driven by optogenetic stimulation.

    Science.gov (United States)

    Ching, ShiNung; Ritt, Jason T

    2013-01-01

    Motivated by experiments employing optogenetic stimulation of cortical regions, we consider spike control strategies for ensembles of uncoupled integrate and fire neurons with a common conductance input. We construct strategies for control of spike patterns, that is, multineuron trains of action potentials, up to some maximal spike rate determined by the neural biophysics. We emphasize a constructive role for parameter heterogeneity, and find a simple rule for controllability in pairs of neurons. In particular, we determine parameters for which common drive is not limited to inducing synchronous spiking. For large ensembles, we determine how the number of controllable neurons varies with the number of observed (recorded) neurons, and what collateral spiking occurs in the full ensemble during control of the subensemble. While complete control of spiking in every neuron is not possible with a single input, we find that a degree of subensemble control is made possible by exploiting dynamical heterogeneity. As most available technologies for neural stimulation are underactuated, in the sense that the number of target neurons far exceeds the number of independent channels of stimulation, these results suggest partial control strategies that may be important in the development of sensory neuroprosthetics and other neurocontrol applications.

  5. A CMOS IC–based multisite measuring system for stimulation and recording in neural preparations in vitro

    Directory of Open Access Journals (Sweden)

    Takashi Tateno

    2014-10-01

    Full Text Available In this report, we describe the system integration of a complementary metal oxide semiconductor (CMOS) integrated circuit (IC) chip, capable of both stimulation and recording of neurons or neural tissues, to investigate electrical signal propagation within cellular networks in vitro. The overall system consisted of three major subunits: a 5.0 mm × 5.0 mm CMOS IC chip, a reconfigurable logic device (field-programmable gate array, FPGA), and a PC. To test the system, microelectrode arrays (MEAs) were used to extracellularly measure the activity of cultured rat cortical neurons and mouse cortical slices. The MEA had 64 bidirectional (stimulation and recording) electrodes. In addition, the CMOS IC chip was equipped with dedicated analog filters, amplification stages, and a stimulation buffer. Signals from the electrodes were sampled at 15.6 kHz with 16-bit resolution. The measured input-referred circuitry noise was 10.1 μV root mean square (10 Hz to 100 kHz), which allowed reliable detection of neural signals ranging from several millivolts down to approximately 33 μVpp. Experiments were performed involving the stimulation of neurons with several spatiotemporal patterns and the recording of the triggered activity. An advantage over current MEAs, as demonstrated by our experiments, includes the ability to stimulate (voltage stimulation, 5-bit resolution) spatiotemporal patterns in arbitrary subsets of electrodes. Furthermore, the fast stimulation reset mechanism allowed us to record neuronal signals from a stimulating electrode around 3 ms after stimulation. We demonstrate that the system can be directly applied to, for example, auditory neural prostheses in conjunction with an acoustic sensor and a sound processing system.

  6. Characterization of Early Cortical Neural Network Development in Multiwell Microelectrode Array Plates

    Science.gov (United States)

    We examined the development of neural network activity using microelectrode array (MEA) recordings made in multi-well MEA plates (mwMEAs) over the first 12 days in vitro (DIV). In primary cortical cultures made from postnatal rats, action potential spiking activity was essentiall...

  7. Spike-based population coding and working memory.

    Directory of Open Access Journals (Sweden)

    Martin Boerlin

    2011-02-01

    Full Text Available Compelling behavioral evidence suggests that humans can make optimal decisions despite the uncertainty inherent in perceptual or motor tasks. A key question in neuroscience is how populations of spiking neurons can implement such probabilistic computations. In this article, we develop a comprehensive framework for optimal, spike-based sensory integration and working memory in a dynamic environment. We propose that probability distributions are inferred spike-per-spike in recurrently connected networks of integrate-and-fire neurons. As a result, these networks can combine sensory cues optimally, track the state of a time-varying stimulus and memorize accumulated evidence over periods much longer than the time constant of single neurons. Importantly, we propose that population responses and persistent working memory states represent entire probability distributions and not only single stimulus values. These memories are reflected by sustained, asynchronous patterns of activity which make relevant information available to downstream neurons within their short time window of integration. Model neurons act as predictive encoders, only firing spikes which account for new information that has not yet been signaled. Thus, spike times signal deterministically a prediction error, contrary to rate codes in which spike times are considered to be random samples of an underlying firing rate. As a consequence of this coding scheme, a multitude of spike patterns can reliably encode the same information. This results in weakly correlated, Poisson-like spike trains that are sensitive to initial conditions but robust to even high levels of external neural noise. This spike train variability reproduces the one observed in cortical sensory spike trains, but cannot be equated to noise. On the contrary, it is a consequence of optimal spike-based inference. In contrast, we show that rate-based models perform poorly when implemented with stochastically spiking neurons.

  8. Spiking Neuron Network Helmholtz Machine

    Directory of Open Access Journals (Sweden)

    Pavel Sountsov

    2015-04-01

    Full Text Available An increasing amount of behavioral and neurophysiological data suggests that the brain performs optimal (or near-optimal) probabilistic inference and learning during perception and other tasks. Although many machine learning algorithms exist that perform inference and learning in an optimal way, the complete description of how one of those algorithms (or a novel algorithm) can be implemented in the brain is currently incomplete. There have been many proposed solutions that address how neurons can perform optimal inference but the question of how synaptic plasticity can implement optimal learning is rarely addressed. This paper aims to unify the two fields of probabilistic inference and synaptic plasticity by using a neuronal network of realistic model spiking neurons to implement a well studied computational model called the Helmholtz Machine. The Helmholtz Machine is amenable to neural implementation as the algorithm it uses to learn its parameters, called the wake-sleep algorithm, uses a local delta learning rule. Our spiking-neuron network implements both the delta rule and a small example of a Helmholtz machine. This neuronal network can learn an internal model of continuous-valued training data sets without supervision. The network can also perform inference on the learned internal models. We show how various biophysical features of the neural implementation constrain the parameters of the wake-sleep algorithm, such as the duration of the wake and sleep phases of learning and the minimal sample duration. We examine the deviations from optimal performance and tie them to the properties of the synaptic plasticity rule.

  9. Comparison of neuronal spike exchange methods on a Blue Gene/P supercomputer

    Directory of Open Access Journals (Sweden)

    Michael Hines

    2011-11-01

    Full Text Available The performance of several spike exchange methods using a Blue Gene/P supercomputer has been tested with 8K to 128K cores using randomly connected networks of up to 32M cells with 1k connections per cell and 4M cells with 10k connections per cell. The spike exchange methods used are the standard Message Passing Interface collective, MPI_Allgather, and several variants of the non-blocking multisend method either implemented via non-blocking MPI_Isend, or exploiting the possibility of very low overhead direct memory access communication available on the Blue Gene/P. In all cases the worst performing method was that using MPI_Isend due to the high overhead of initiating a spike communication. The two best performing methods --- the persistent multisend method using the Record-Replay feature of the Deep Computing Messaging Framework DCMF_Multicast; and a two-phase multisend in which a DCMF_Multicast is used to first send to a subset of phase 1 destination cores which then pass it on to their subset of phase 2 destination cores --- had similar performance with very low overhead for the initiation of spike communication. Departure from ideal scaling for the multisend methods is almost completely due to load imbalance caused by the large variation in the number of cells that fire on each processor in the interval between synchronization. Spike exchange time itself is negligible since transmission overlaps with computation and is handled by a direct memory access controller. We conclude that ideal performance scaling will be ultimately limited by imbalance between incoming processor spikes between synchronization intervals. Thus, counterintuitively, maximization of load balance requires that the distribution of cells on processors should not reflect neural net architecture but be randomly distributed so that sets of cells which are burst firing together should be on different processors with their targets on as large a set of processors as possible.
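
    The baseline collective in the comparison above, MPI_Allgather, maps directly onto a few lines of mpi4py. The sketch below exchanges synthetically generated (cell id, spike time) pairs among all ranks once per synchronization interval; it only illustrates the baseline collective, not the DCMF multisend variants, and the script name in the comment is hypothetical.

        # Run with, e.g.: mpiexec -n 4 python spike_exchange.py
        import numpy as np
        from mpi4py import MPI

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()
        rng = np.random.default_rng(rank)

        min_delay = 1.0   # ms; spikes are exchanged once per minimum synaptic delay interval
        for interval in range(5):
            # Spikes fired by cells on this rank during the interval: (global cell id, time)
            n_local = rng.integers(0, 10)
            local_spikes = [(rank * 1000 + int(c), interval * min_delay + float(t))
                            for c, t in zip(rng.integers(0, 1000, n_local),
                                            rng.uniform(0, min_delay, n_local))]

            # Baseline exchange: every rank receives every other rank's spikes.
            all_spikes = [s for per_rank in comm.allgather(local_spikes) for s in per_rank]

            if rank == 0:
                print(f"interval {interval}: {len(all_spikes)} spikes gathered from {size} ranks")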

  10. Spiking in auditory cortex following thalamic stimulation is dominated by cortical network activity

    Science.gov (United States)

    Krause, Bryan M.; Raz, Aeyal; Uhlrich, Daniel J.; Smith, Philip H.; Banks, Matthew I.

    2014-01-01

    The state of the sensory cortical network can have a profound impact on neural responses and perception. In rodent auditory cortex, sensory responses are reported to occur in the context of network events, similar to brief UP states, that produce “packets” of spikes and are associated with synchronized synaptic input (Bathellier et al., 2012; Hromadka et al., 2013; Luczak et al., 2013). However, traditional models based on data from visual and somatosensory cortex predict that ascending sensory thalamocortical (TC) pathways sequentially activate cells in layers 4 (L4), L2/3, and L5. The relationship between these two spatio-temporal activity patterns is unclear. Here, we used calcium imaging and electrophysiological recordings in murine auditory TC brain slices to investigate the laminar response pattern to stimulation of TC afferents. We show that although monosynaptically driven spiking in response to TC afferents occurs, the vast majority of spikes fired following TC stimulation occurs during brief UP states and outside the context of the L4>L2/3>L5 activation sequence. Specifically, monosynaptic subthreshold TC responses with similar latencies were observed throughout layers 2–6, presumably via synapses onto dendritic processes located in L3 and L4. However, monosynaptic spiking was rare, and occurred primarily in L4 and L5 non-pyramidal cells. By contrast, during brief, TC-induced UP states, spiking was dense and occurred primarily in pyramidal cells. These network events always involved infragranular layers, whereas involvement of supragranular layers was variable. During UP states, spike latencies were comparable between infragranular and supragranular cells. These data are consistent with a model in which activation of auditory cortex, especially supragranular layers, depends on internally generated network events that represent a non-linear amplification process, are initiated by infragranular cells and tightly regulated by feed-forward inhibitory

  11. A unified description of cerebellar inter-spike interval distributions and variabilities using summation of Gaussians.

    Science.gov (United States)

    Chen, Yanqing; Nitz, Douglas A

    2011-01-01

    Neuronal inter-spike intervals (ISIs) have previously been described as Poisson, Gamma, inverse Gaussian or other unimodal distributions. We analyzed ISIs of rhythmic and arrhythmic neuronal spike trains in cerebellum recorded from freely behaving rats, and found that their distributions can be described as the summation or integration of multiple Gaussian distributions. The ISIs of rhythmic cerebellar Purkinje cells have a main Gaussian peak at a basic firing interval and exponentially reduced peaks at multiples of this firing period. ISIs of arrhythmic Purkinje cells can be modeled as the integration of multiple Gaussian distributions centered at continuous intervals with exponentially reduced peak amplitudes. The sources of variability are directly related to the relative timing of action potentials between neighboring cells since we show that irregularities of discharge in one cell are associated with the previous history of its discharge in time relative to another cell. Through relative phase analyses, we demonstrate that the shape and the mathematical form of the ISI distributions in cerebellum are direct result of dynamic interactions in the nearby neuronal network, in addition to intrinsic firing properties. The analysis in this paper provides a unified description of cerebellar inter-spike interval distributions which deviate from the usual Poisson assumptions. Our results suggest the existence of an intrinsic rhythmicity in cells exhibiting arrhythmic spike trains in cerebellum, and may identify an important source of variability in neuronal firing patterns that is relevant to the mechanism of neural computation in cerebellum.
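
    The "summation of Gaussians" description above can be written down directly: a rhythmic cell's ISI density is a set of Gaussian peaks at integer multiples of a basic firing interval with exponentially decaying weights. The sketch below evaluates such a density; the basic interval, jitter and decay constant are illustrative, not fitted values from the paper.

        import numpy as np
        from scipy import stats

        def rhythmic_isi_density(t, base_interval=0.02, sigma=0.002, decay=1.5, n_peaks=6):
            """ISI density as a weighted sum of Gaussians centred at k * base_interval."""
            weights = np.exp(-decay * np.arange(1, n_peaks + 1))
            weights /= weights.sum()                       # normalize the mixture weights
            density = np.zeros_like(t)
            for k, w in enumerate(weights, start=1):
                density += w * stats.norm.pdf(t, loc=k * base_interval, scale=sigma)
            return density

        t = np.linspace(0, 0.15, 1500)
        p = rhythmic_isi_density(t)
        print("density integrates to ~1:", round(float(p.sum() * (t[1] - t[0])), 3))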

  12. Cellular and circuit mechanisms maintain low spike co-variability and enhance population coding in somatosensory cortex

    Directory of Open Access Journals (Sweden)

    Cheng Ly

    2012-03-01

    Full Text Available The responses of cortical neurons are highly variable across repeated presentations of a stimulus. Understanding this variability is critical for theories of both sensory and motor processing, since response variance affects the accuracy of neural codes. Despite this influence, the cellular and circuit mechanisms that shape the trial-to-trial variability of population responses remain poorly understood. We used a combination of experimental and computational techniques to uncover the mechanisms underlying response variability of populations of pyramidal (E) cells in layer 2/3 of rat whisker barrel cortex. Spike trains recorded from pairs of E-cells during either spontaneous activity or whisker-deflection responses show similarly low levels of spiking co-variability, despite large differences in network activation between the two states. We developed network models that show how spike threshold nonlinearities dilute E-cell spiking co-variability during spontaneous activity and low-velocity whisker deflections. In contrast, during high-velocity whisker deflections, cancelation mechanisms mediated by feedforward inhibition maintain low E-cell pairwise co-variability. Thus, the combination of these two mechanisms ensures low E-cell population variability over a wide range of whisker deflection velocities. Finally, we show how this active decorrelation of population variability leads to a drastic increase in the population information about whisker velocity. The canonical cellular and circuit components of our study suggest that low network variability over a broad range of neural states may generalize across the nervous system.

  13. Stream-based Hebbian eigenfilter for real-time neuronal spike discrimination

    Directory of Open Access Journals (Sweden)

    Yu Bo

    2012-04-01

    Full Text Available Background: Principal component analysis (PCA) has been widely employed for automatic neuronal spike sorting. Calculating principal components (PCs) is computationally expensive, and requires complex numerical operations and large memory resources. Substantial hardware resources are therefore needed for hardware implementations of PCA. The general Hebbian algorithm (GHA) has been proposed for calculating PCs of neuronal spikes in our previous work, which eliminates the need for the computationally expensive covariance analysis and eigenvalue decomposition of conventional PCA algorithms. However, large memory resources are still inherently required for storing a large volume of aligned spikes for training PCs. The large memory consumes substantial hardware resources and contributes significant power dissipation, which makes GHA difficult to implement in portable or implantable multi-channel recording micro-systems. Method: In this paper, we present a new algorithm for PCA-based spike sorting based on GHA, namely the stream-based Hebbian eigenfilter, which eliminates the inherent memory requirements of GHA while keeping the accuracy of spike sorting by utilizing the pseudo-stationarity of neuronal spikes. Because of the reduction of large hardware storage requirements, the proposed algorithm can lead to ultra-low hardware resource usage and power consumption in hardware implementations, which is critical for future multi-channel micro-systems. Both clinical and synthetic neural recording data sets were employed for evaluating the accuracy of the stream-based Hebbian eigenfilter. The performance of spike sorting using the stream-based eigenfilter and the computational complexity of the eigenfilter were rigorously evaluated and compared with conventional PCA algorithms. Field-programmable gate arrays (FPGAs) were employed to implement the proposed algorithm, evaluate the hardware implementations and demonstrate the reduction in both power consumption and
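
    The generalized Hebbian update underlying the eigenfilter can be stated in a few lines: each incoming aligned spike updates the principal-component estimates directly, with no stored covariance matrix. The sketch below is a plain Sanger's-rule (GHA) update on synthetic waveforms, not the stream-based variant proposed in the paper; the waveform template and learning rate are made up.

        import numpy as np

        rng = np.random.default_rng(0)

        # Synthetic aligned spike waveforms (stand-ins for a recording stream)
        t = np.linspace(0, 1, 48)
        template = np.exp(-((t - 0.3) / 0.05) ** 2) - 0.5 * np.exp(-((t - 0.5) / 0.1) ** 2)
        stream = (np.outer(rng.standard_normal(2000), template)
                  + 0.05 * rng.standard_normal((2000, 48)))

        n_pc, eta = 2, 0.01
        W = 0.01 * rng.standard_normal((n_pc, 48))        # rows track the leading PCs

        for x in stream:                                  # one update per incoming spike
            y = W @ x
            # Sanger's rule (generalized Hebbian algorithm):
            # dW = eta * (y x^T - lower_triangular(y y^T) W)
            W += eta * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)

        print("PC norms after streaming:", np.round(np.linalg.norm(W, axis=1), 3))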

  14. A 110-nW in-channel sigma-delta converter for large-scale neural recording implants.

    Science.gov (United States)

    Rezaei, M; Maghsoudloo, E; Sawan, M; Gosselin, B

    2016-08-01

    Advancements in wireless and microsystems technology have ushered in new devices that can directly interface with the central nervous system for stimulating and/or monitoring neural circuitry. In this paper, we present an ultra low-power sigma-delta analog-to-digital converter (ADC) intended for use in large-scale multi-channel neural recording implants. This proposed design, which provides a resolution of 9 bits using a one-bit oversampled ADC, presents several desirable features that allow for an in-channel ADC scheme, where one sigma-delta converter is provided for each channel, enabling development of scalable systems that can interface with different types of high-density neural microprobes. The proposed circuit, which has been fabricated in a TSMC 180-nm CMOS process, employs a first-order noise-shaping topology with a passive integrator and a low supply voltage of 0.6 V to achieve ultra low-power consumption and small size. The proposed ADC clearly outperforms other designs with a power consumption as low as 110 nW for a precision of 9 bits (11 fJ per conversion), a silicon area of only 82 μm × 84 μm and one of the best reported figures of merit among recently published data converters utilized in similar applications.
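
    The first-order noise-shaping loop described above can be modelled behaviourally as an integrator, a 1-bit quantizer and a feedback path, followed by decimation. The sketch below is an idealized discrete-time model with an arbitrary oversampling ratio and simple block averaging as the decimation filter; it is not a description of the reported circuit.

        import numpy as np

        def sigma_delta_1st_order(x, osr=256):
            """Idealized first-order sigma-delta: integrator + 1-bit quantizer + feedback DAC."""
            x_os = np.repeat(x, osr)              # oversampled (zero-order hold) input in [-1, 1]
            bits = np.empty_like(x_os)
            integrator, feedback = 0.0, 0.0
            for i, sample in enumerate(x_os):
                integrator += sample - feedback   # accumulate the difference (noise shaping)
                bits[i] = 1.0 if integrator >= 0.0 else -1.0
                feedback = bits[i]                # 1-bit DAC output fed back to the input
            return bits

        # Decimate the 1-bit stream by block averaging to recover multi-bit samples.
        x = np.linspace(-0.8, 0.8, 32)            # slowly varying test input
        osr = 256
        bits = sigma_delta_1st_order(x, osr=osr)
        recon = bits.reshape(x.size, osr).mean(axis=1)
        print("max reconstruction error:", round(float(np.max(np.abs(recon - x))), 4))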

  15. Spike train encoding by regular-spiking cells of the visual cortex.

    Science.gov (United States)

    Carandini, M; Mechler, F; Leonard, C S; Movshon, J A

    1996-11-01

    1. To study the encoding of input currents into output spike trains by regular-spiking cells, we recorded intracellularly from slices of the guinea pig visual cortex while injecting step, sinusoidal, and broadband noise currents. 2. When measured with sinusoidal currents, the frequency tuning of the spike responses was markedly band-pass. The preferred frequency was between 8 and 30 Hz, and grew with stimulus amplitude and mean intensity. 3. Stimulation with broadband noise currents dramatically enhanced the gain of the spike responses at low and high frequencies, yielding an essentially flat frequency tuning between 0.1 and 130 Hz. 4. The averaged spike responses to sinusoidal currents exhibited two nonlinearities: rectification and spike synchronization. By contrast, no nonlinearity was evident in the averaged responses to broadband noise stimuli. 5. These properties of the spike responses were not present in the membrane potential responses. The latter were roughly linear, and their frequency tuning was low-pass and well fit by a single-compartment passive model of the cell membrane composed of a resistance and a capacitance in parallel (RC circuit). 6. To account for the spike responses, we used a "sandwich model" consisting of a low-pass linear filter (the RC circuit), a rectification nonlinearity, and a high-pass linear filter. The model is described by six parameters and predicts analog firing rates rather than discrete spikes. It provided satisfactory fits to the firing rate responses to steps, sinusoids, and broadband noise currents. 7. The properties of spike encoding are consistent with temporal nonlinearities of the visual responses in V1, such as the dependence of response frequency tuning and latency on stimulus contrast and bandwidth. We speculate that one of the roles of the high-frequency membrane potential fluctuations observed in vivo could be to amplify and linearize the responses to lower, stimulus-related frequencies.
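
    The "sandwich model" described above is a linear-nonlinear-linear cascade. The sketch below chains the three stages (RC low-pass, static rectification, high-pass) to turn an injected current into a predicted analog firing rate; the time constants, threshold and test input are placeholders rather than the fitted parameters from the study.

        import numpy as np
        from scipy.signal import lfilter

        fs = 10_000.0                      # simulation rate (Hz)
        t = np.arange(0, 2.0, 1 / fs)
        current = 0.5 + 0.5 * np.sin(2 * np.pi * 4 * t)      # injected current (arbitrary units)

        def first_order_filter(x, tau, fs, highpass=False):
            """Discrete first-order RC filter (low-pass, or high-pass if requested)."""
            alpha = (1.0 / fs) / (tau + 1.0 / fs)
            low = lfilter([alpha], [1.0, alpha - 1.0], x)
            return x - low if highpass else low

        # Stage 1: passive membrane (RC low-pass) turns current into membrane potential.
        vm = first_order_filter(current, tau=0.020, fs=fs)
        # Stage 2: static rectification around a "threshold".
        rect = np.maximum(vm - 0.4, 0.0)
        # Stage 3: high-pass linear filter shapes the predicted analog firing rate.
        rate = first_order_filter(rect, tau=0.100, fs=fs, highpass=True)

        print("predicted rate range:",
              round(float(rate.min()), 3), "to", round(float(rate.max()), 3))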

  16. NeuroQuest: a comprehensive analysis tool for extracellular neural ensemble recordings.

    Science.gov (United States)

    Kwon, Ki Yong; Eldawlatly, Seif; Oweiss, Karim

    2012-02-15

    Analyzing the massive amounts of neural data collected using microelectrodes to extract biologically relevant information is a major challenge. Many scientific findings rest on the ability to overcome these challenges and to standardize experimental analysis across labs. This can be facilitated in part through comprehensive, efficient and practical software tools disseminated to the community at large. We have developed a comprehensive, MATLAB-based software package - entitled NeuroQuest - that bundles together a number of advanced neural signal processing algorithms in a user-friendly environment. Results demonstrate the efficiency and reliability of the software compared to other software packages, and versatility over a wide range of experimental conditions. Copyright © 2011 Elsevier B.V. All rights reserved.

  17. Extending transfer entropy improves identification of effective connectivity in a spiking cortical network model.

    Directory of Open Access Journals (Sweden)

    Shinya Ito

    Full Text Available Transfer entropy (TE) is an information-theoretic measure which has received recent attention in neuroscience for its potential to identify effective connectivity between neurons. Calculating TE for large ensembles of spiking neurons is computationally intensive, and has caused most investigators to probe neural interactions at only a single time delay and at a message length of only a single time bin. This is problematic, as synaptic delays between cortical neurons, for example, range from one to tens of milliseconds. In addition, neurons produce bursts of spikes spanning multiple time bins. To address these issues, here we introduce a free software package that allows TE to be measured at multiple delays and message lengths. To assess performance, we applied these extensions of TE to a spiking cortical network model (Izhikevich, 2006) with known connectivity and a range of synaptic delays. For comparison, we also investigated single-delay TE, at a message length of one bin (D1TE), and cross-correlation (CC) methods. We found that D1TE could identify 36% of true connections when evaluated at a false positive rate of 1%. For extended versions of TE, this dramatically improved to 73% of true connections. In addition, the connections correctly identified by extended versions of TE accounted for 85% of the total synaptic weight in the network. Cross correlation methods generally performed more poorly than extended TE, but were useful when data length was short. A computational performance analysis demonstrated that the algorithm for extended TE, when used on currently available desktop computers, could extract effective connectivity from 1 hr recordings containing 200 neurons in ∼5 min. We conclude that extending TE to multiple delays and message lengths improves its ability to assess effective connectivity between spiking neurons. These extensions to TE soon could become practical tools for experimentalists who record hundreds of spiking neurons.

  18. Extending transfer entropy improves identification of effective connectivity in a spiking cortical network model.

    Science.gov (United States)

    Ito, Shinya; Hansen, Michael E; Heiland, Randy; Lumsdaine, Andrew; Litke, Alan M; Beggs, John M

    2011-01-01

    Transfer entropy (TE) is an information-theoretic measure which has received recent attention in neuroscience for its potential to identify effective connectivity between neurons. Calculating TE for large ensembles of spiking neurons is computationally intensive, and has caused most investigators to probe neural interactions at only a single time delay and at a message length of only a single time bin. This is problematic, as synaptic delays between cortical neurons, for example, range from one to tens of milliseconds. In addition, neurons produce bursts of spikes spanning multiple time bins. To address these issues, here we introduce a free software package that allows TE to be measured at multiple delays and message lengths. To assess performance, we applied these extensions of TE to a spiking cortical network model (Izhikevich, 2006) with known connectivity and a range of synaptic delays. For comparison, we also investigated single-delay TE, at a message length of one bin (D1TE), and cross-correlation (CC) methods. We found that D1TE could identify 36% of true connections when evaluated at a false positive rate of 1%. For extended versions of TE, this dramatically improved to 73% of true connections. In addition, the connections correctly identified by extended versions of TE accounted for 85% of the total synaptic weight in the network. Cross correlation methods generally performed more poorly than extended TE, but were useful when data length was short. A computational performance analysis demonstrated that the algorithm for extended TE, when used on currently available desktop computers, could extract effective connectivity from 1 hr recordings containing 200 neurons in ∼5 min. We conclude that extending TE to multiple delays and message lengths improves its ability to assess effective connectivity between spiking neurons. These extensions to TE soon could become practical tools for experimentalists who record hundreds of spiking neurons.
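
    A bare-bones version of delay-resolved transfer entropy on binned spike trains follows directly from the definition. The sketch below scans candidate delays for a pair of binary trains with one-bin histories; it is a toy estimator on synthetic data, not the software package described in the abstract.

        import numpy as np

        def transfer_entropy(x, y, delay):
            """Transfer entropy x -> y (bits), one-bin histories, binary spike trains."""
            t0 = max(1, delay)
            y_t, y_prev = y[t0:], y[t0 - 1:-1]
            x_del = x[t0 - delay:len(x) - delay]
            # Joint distribution over the 8 binary states (y_t, y_prev, x_delayed)
            p = np.bincount(y_t * 4 + y_prev * 2 + x_del, minlength=8).astype(float)
            p /= p.sum()
            te = 0.0
            for yt in (0, 1):
                for yp in (0, 1):
                    for xd in (0, 1):
                        p_all = p[yt * 4 + yp * 2 + xd]
                        if p_all == 0.0:
                            continue
                        p_yp_xd = p[yp * 2 + xd] + p[4 + yp * 2 + xd]      # p(y_prev, x_del)
                        p_yp = sum(p[a * 4 + yp * 2 + b] for a in (0, 1) for b in (0, 1))
                        p_yt_given_both = p_all / p_yp_xd                  # p(y_t | y_prev, x_del)
                        p_yt_given_past = (p[yt * 4 + yp * 2] + p[yt * 4 + yp * 2 + 1]) / p_yp
                        te += p_all * np.log2(p_yt_given_both / p_yt_given_past)
            return te

        # Synthetic pair: x drives y with a three-bin delay
        rng = np.random.default_rng(0)
        n = 20000
        x = (rng.random(n) < 0.1).astype(int)
        drive = np.roll(x, 3)
        drive[:3] = 0
        y = ((rng.random(n) < 0.05) | ((drive == 1) & (rng.random(n) < 0.5))).astype(int)

        for d in range(1, 6):
            print(f"delay {d} bins: TE = {transfer_entropy(x, y, d):.4f} bits")   # peaks near delay 3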

  19. Stochastic optimal control of single neuron spike trains

    DEFF Research Database (Denmark)

    Iolov, Alexandre; Ditlevsen, Susanne; Longtin, André

    2014-01-01

    Objective. External control of spike times in single neurons can reveal important information about a neuron's sub-threshold dynamics that lead to spiking, and has the potential to improve brain–machine interfaces and neural prostheses. The goal of this paper is the design of optimal electrical...... stimulation of a neuron to achieve a target spike train under the physiological constraint to not damage tissue. Approach. We pose a stochastic optimal control problem to precisely specify the spike times in a leaky integrate-and-fire (LIF) model of a neuron with noise assumed to be of intrinsic or synaptic...... of control degrades with increasing intensity of the noise. Simulations show that our algorithms produce the desired results for the LIF model, but also for the case where the neuron dynamics are given by more complex models than the LIF model. This is illustrated explicitly using the Morris–Lecar spiking...

  20. On the relation between encoding and decoding of neuronal spikes.

    Science.gov (United States)

    Koyama, Shinsuke

    2012-06-01

    Neural coding is a field of study that concerns how sensory information is represented in the brain by networks of neurons. The link between external stimulus and neural response can be studied from two parallel points of view. The first, neural encoding, refers to the mapping from stimulus to response. It focuses primarily on understanding how neurons respond to a wide variety of stimuli and constructing models that accurately describe the stimulus-response relationship. Neural decoding refers to the reverse mapping, from response to stimulus, where the challenge is to reconstruct a stimulus from the spikes it evokes. Since neuronal response is stochastic, a one-to-one mapping of stimuli into neural responses does not exist, causing a mismatch between the two viewpoints of neural coding. Here we use these two perspectives to investigate the question of what rate coding is, in the simple setting of a single stationary stimulus parameter and a single stationary spike train represented by a renewal process. We show that when rate codes are defined in terms of encoding, that is, the stimulus parameter is mapped onto the mean firing rate, the rate decoder given by spike counts or the sample mean does not always efficiently decode the rate codes, but it can improve efficiency in reading certain rate codes when correlations within a spike train are taken into account.
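
    As a small illustration of the spike-count (sample-mean) rate decoder discussed above, the sketch below estimates a constant firing rate from a single stationary spike train modelled as a renewal process with gamma-distributed intervals. Parameter values are arbitrary and serve only the example.

```python
# Spike-count rate decoding of a stationary renewal (gamma-interval) spike train.
import numpy as np

rng = np.random.default_rng(1)
true_rate = 20.0          # spikes per second
shape = 4.0               # gamma shape; >1 gives more regular ISIs than a Poisson process
duration = 30.0           # seconds of observation

# Draw inter-spike intervals with mean 1/true_rate and accumulate spike times.
isi = rng.gamma(shape, 1.0 / (shape * true_rate), size=5000)
spike_times = np.cumsum(isi)
spike_times = spike_times[spike_times < duration]

# Rate decoder 1: spike count divided by observation time.
rate_from_count = len(spike_times) / duration
# Rate decoder 2: reciprocal of the sample-mean ISI (equivalent up to edge effects).
rate_from_mean_isi = 1.0 / np.diff(spike_times).mean()

print(f"count-based: {rate_from_count:.1f} Hz, mean-ISI-based: {rate_from_mean_isi:.1f} Hz")
```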

  1. In Vivo Recording of Neural and Behavioral Correlates of Anesthesia Induction, Reversal, and Euthanasia in Cephalopod Molluscs

    Directory of Open Access Journals (Sweden)

    Hanna M. Butler-Struben

    2018-02-01

    Full Text Available Cephalopod molluscs are among the most behaviorally and neurologically complex invertebrates. As they are now included in research animal welfare regulations in many countries, humane and effective anesthesia is required during invasive procedures. However, currently there is no evidence that agents believed to act as anesthetics produce effects beyond immobility. In this study we demonstrate, for the first time, that two of the most commonly used agents in cephalopod general anesthesia, magnesium chloride and ethanol, are capable of producing strong and reversible blockade of afferent and efferent neural signal; thus they are genuine anesthetics, rather than simply sedating agents that render animals immobile but not insensible. Additionally, we demonstrate that injected magnesium chloride and lidocaine are effective local anesthetic agents. This represents a considerable advance for cephalopod welfare. Using a reversible, minimally invasive recording procedure, we measured activity in the pallial nerve of cuttlefish (Sepia bandensis) and octopus (Abdopus aculeatus, Octopus bocki) during induction and reversal for five putative general anesthetic and two local anesthetic agents. We describe the temporal relationship between loss of behavioral responses (immobility), loss of efferent neural signal (loss of “consciousness”), and loss of afferent neural signal (anesthesia) for general anesthesia, and loss of afferent signal for local anesthesia. Both ethanol and magnesium chloride were effective as bath-applied general anesthetics, causing immobility, complete loss of behavioral responsiveness and complete loss of afferent and efferent neural signal. Cold seawater, diethyl ether, and MS-222 (tricaine) were ineffective. Subcutaneous injection of either lidocaine or magnesium chloride blocked behavioral and neural responses to pinch in the injected area, and we conclude that both are effective local anesthetic agents for cephalopods. Lastly, we

  2. A low power multichannel analog front end for portable neural signal recordings.

    Science.gov (United States)

    Obeid, Iyad; Nicolelis, Miguel A L; Wolf, Patrick D

    2004-02-15

    We present the design and testing of a 16-channel analog amplifier for processing neural signals. Each channel has the following features: (1) variable gain (70-94 dB), (2) four high pass Bessel filter poles (f(-3 dB)=445 Hz), (3) five low pass Bessel filter poles (f(-3 dB)=6.6 kHz), and (4) differential amplification with a user selectable reference channel to reject common mode background biological noise. Processed signals are time division multiplexed and sampled by an on-board 12-bit analog to digital converter at up to 62.5k samples/s per channel. The board is powered by two low dropout voltage regulators which may be supplied by a single battery. The board measures 8.1 cm x 9.9 cm, weighs 50 g, and consumes up to 130 mW. Its low input-referred noise (1.0 microV(RMS)) makes it possible to process low amplitude neural signals; the board was successfully tested in vivo to process cortically derived extracellular action potentials in primates. Signals processed by this board were compared to those generated by a commercially available system and were found to be nearly identical. Background noise generated by mastication was substantially attenuated by the selectable reference circuit. The described circuit is light weight and low power and is used as a component of a wearable multichannel neural telemetry system.

  3. Seismic Design Value Evaluation Based on Checking Records and Site Geological Conditions Using Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Tienfuan Kerh

    2013-01-01

    Full Text Available This study proposes an improved computational neural network model that uses three seismic parameters (i.e., local magnitude, epicentral distance, and epicenter depth) and two geological conditions (i.e., shear wave velocity and standard penetration test value) as the inputs for predicting peak ground acceleration—the key element for evaluating earthquake response. Initial comparison results show that a neural network model with three neurons in the hidden layer can achieve relatively better performance based on the evaluation index of correlation coefficient or mean square error. This study further develops a new weight-based neural network model for estimating peak ground acceleration at unchecked sites. Four locations identified to have higher estimated peak ground accelerations than that of the seismic design value in the 24 subdivision zones are investigated in Taiwan. Finally, this study develops a new equation for the relationship of horizontal peak ground acceleration and focal distance by the curve fitting method. This equation represents seismic characteristics in Taiwan region more reliably and reasonably. The results of this study provide an insight into this type of nonlinear problem, and the proposed method may be applicable to other areas of interest around the world.
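
    A hedged sketch of the kind of small feed-forward network described above: five inputs (local magnitude, epicentral distance, epicenter depth, shear wave velocity, and standard penetration test value) feeding a three-neuron hidden layer that predicts peak ground acceleration. The data, target function, and hyperparameters below are synthetic placeholders, not the study's seismic records.

```python
# Five-input, three-hidden-neuron regression network on synthetic seismic-like data.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
# Columns: magnitude, epicentral distance (km), depth (km), Vs (m/s), SPT value.
X = rng.uniform([3.0, 5.0, 1.0, 150.0, 5.0],
                [7.5, 150.0, 40.0, 760.0, 50.0], size=(500, 5))
# Hypothetical attenuation-like target, only to exercise the model.
pga = 0.4 * np.exp(0.9 * X[:, 0]) / (X[:, 1] + 10.0) ** 1.3 + rng.normal(0, 0.01, 500)

Xs = StandardScaler().fit_transform(X)
model = MLPRegressor(hidden_layer_sizes=(3,), activation="tanh",
                     max_iter=5000, random_state=0).fit(Xs, pga)
print("R^2 on training data:", round(model.score(Xs, pga), 3))
```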

  4. Successful Reconstruction of a Physiological Circuit with Known Connectivity from Spiking Activity Alone

    Science.gov (United States)

    Gerhard, Felipe; Kispersky, Tilman; Gutierrez, Gabrielle J.; Marder, Eve; Kramer, Mark; Eden, Uri

    2013-01-01

    Identifying the structure and dynamics of synaptic interactions between neurons is the first step to understanding neural network dynamics. The presence of synaptic connections is traditionally inferred through the use of targeted stimulation and paired recordings or by post-hoc histology. More recently, causal network inference algorithms have been proposed to deduce connectivity directly from electrophysiological signals, such as extracellularly recorded spiking activity. Usually, these algorithms have not been validated on a neurophysiological data set for which the actual circuitry is known. Recent work has shown that traditional network inference algorithms based on linear models typically fail to identify the correct coupling of a small central pattern generating circuit in the stomatogastric ganglion of the crab Cancer borealis. In this work, we show that point process models of observed spike trains can guide inference of relative connectivity estimates that match the known physiological connectivity of the central pattern generator up to a choice of threshold. We elucidate the necessary steps to derive faithful connectivity estimates from a model that incorporates the spike train nature of the data. We then apply the model to measure changes in the effective connectivity pattern in response to two pharmacological interventions, which affect both intrinsic neural dynamics and synaptic transmission. Our results provide the first successful application of a network inference algorithm to a circuit for which the actual physiological synapses between neurons are known. The point process methodology presented here generalizes well to larger networks and can describe the statistics of neural populations. In general we show that advanced statistical models allow for the characterization of effective network structure, deciphering underlying network dynamics and estimating information-processing capabilities. PMID:23874181
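
    The sketch below illustrates the general idea rather than the authors' point-process model: fit a GLM for each neuron (here a Bernoulli/logistic model of binned spikes) with the lagged activity of all neurons as covariates, and read relative coupling strengths off the fitted coefficients. The data, lag, and imposed connection are synthetic assumptions.

```python
# GLM-style connectivity estimate from binned spike trains (illustrative only).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(11)
n_bins, n_neurons, lag = 20000, 3, 1
spikes = (rng.random((n_bins, n_neurons)) < 0.02).astype(int)
# Impose one true coupling: neuron 0 drives neuron 2 at a one-bin lag.
spikes[1:, 2] = np.maximum(spikes[1:, 2],
                           spikes[:-1, 0] * (rng.random(n_bins - 1) < 0.5))

coupling = np.zeros((n_neurons, n_neurons))
for target in range(n_neurons):
    X = spikes[:-lag, :]                      # lagged activity of all neurons
    y = spikes[lag:, target]                  # current activity of the target neuron
    model = LogisticRegression(max_iter=1000).fit(X, y)
    coupling[:, target] = model.coef_[0]      # column = inferred inputs to the target
np.fill_diagonal(coupling, 0.0)               # discard self-history terms for display
print(np.round(coupling, 2))                  # largest entry expected at (0, 2)
```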

  5. Successful reconstruction of a physiological circuit with known connectivity from spiking activity alone.

    Directory of Open Access Journals (Sweden)

    Felipe Gerhard

    Full Text Available Identifying the structure and dynamics of synaptic interactions between neurons is the first step to understanding neural network dynamics. The presence of synaptic connections is traditionally inferred through the use of targeted stimulation and paired recordings or by post-hoc histology. More recently, causal network inference algorithms have been proposed to deduce connectivity directly from electrophysiological signals, such as extracellularly recorded spiking activity. Usually, these algorithms have not been validated on a neurophysiological data set for which the actual circuitry is known. Recent work has shown that traditional network inference algorithms based on linear models typically fail to identify the correct coupling of a small central pattern generating circuit in the stomatogastric ganglion of the crab Cancer borealis. In this work, we show that point process models of observed spike trains can guide inference of relative connectivity estimates that match the known physiological connectivity of the central pattern generator up to a choice of threshold. We elucidate the necessary steps to derive faithful connectivity estimates from a model that incorporates the spike train nature of the data. We then apply the model to measure changes in the effective connectivity pattern in response to two pharmacological interventions, which affect both intrinsic neural dynamics and synaptic transmission. Our results provide the first successful application of a network inference algorithm to a circuit for which the actual physiological synapses between neurons are known. The point process methodology presented here generalizes well to larger networks and can describe the statistics of neural populations. In general we show that advanced statistical models allow for the characterization of effective network structure, deciphering underlying network dynamics and estimating information-processing capabilities.

  6. Dynamic balance of excitation and inhibition rapidly modulates spike probability and precision in feed-forward hippocampal circuits.

    Science.gov (United States)

    Wahlstrom-Helgren, Sarah; Klyachko, Vitaly A

    2016-12-01

    Feed-forward inhibitory (FFI) circuits are important for many information-processing functions. FFI circuit operations critically depend on the balance and timing between the excitatory and inhibitory components, which undergo rapid dynamic changes during neural activity due to short-term plasticity (STP) of both components. How dynamic changes in excitation/inhibition (E/I) balance during spike trains influence FFI circuit operations remains poorly understood. In the current study we examined the role of STP in the FFI circuit functions in the mouse hippocampus. Using a coincidence detection paradigm with simultaneous activation of two Schaffer collateral inputs, we found that the spiking probability in the target CA1 neuron was increased while spike precision concomitantly decreased during high-frequency bursts compared with a single spike. Blocking inhibitory synaptic transmission revealed that dynamics of inhibition predominately modulates the spike precision but not the changes in spiking probability, whereas the latter is modulated by the dynamics of excitation. Further analyses combining whole cell recordings and simulations of the FFI circuit suggested that dynamics of the inhibitory circuit component may influence spiking behavior during bursts by broadening the width of excitatory postsynaptic responses and that the strength of this modulation depends on the basal E/I ratio. We verified these predictions using a mouse model of fragile X syndrome, which has an elevated E/I ratio, and found a strongly reduced modulation of postsynaptic response width during bursts. Our results suggest that changes in the dynamics of excitatory and inhibitory circuit components due to STP play important yet distinct roles in modulating the properties of FFI circuits. Copyright © 2016 the American Physiological Society.

  7. Impact of morphometry, myelinization and synaptic current strength on spike conduction in human and cat spiral ganglion neurons.

    Directory of Open Access Journals (Sweden)

    Frank Rattay

    Full Text Available Our knowledge about the neural code in the auditory nerve is based to a large extent on experiments on cats. Several anatomical differences between auditory neurons in human and cat are expected to lead to functional differences in speed and safety of spike conduction. Confocal microscopy was used to systematically evaluate peripheral and central process diameters, commonness of myelination and morphology of spiral ganglion neurons (SGNs) along the cochlea of three humans and three cats. Based on these morphometric data, model analysis reveals that spike conduction in SGNs is characterized by four phases: a postsynaptic delay, constant velocity in the peripheral process, a presomatic delay and constant velocity in the central process. The majority of SGNs are type I, connecting the inner hair cells with the brainstem. In contrast to those of humans, type I neurons of the cat are entirely myelinated. Biophysical model evaluation showed delayed and weak spikes in the human soma region as a consequence of a lack of myelin. The simulated spike conduction times are in accordance with normal interwave latencies from auditory brainstem response recordings from man and cat. Simulated 400 pA postsynaptic currents from inner hair cell ribbon synapses were 15 times above threshold. They enforced quick and synchronous spiking. Neither of these properties was present in type II cells as they receive fewer and much weaker (∼26 pA) synaptic stimuli. Wasting synaptic energy boosts spike initiation, which guarantees the rapid transmission of temporal fine structure of auditory signals. However, a lack of myelin in the soma regions of human type I neurons causes a large delay in spike conduction in comparison with cat neurons. The absent myelin, in combination with a longer peripheral process, causes quantitative differences of temporal parameters in the electrically stimulated human cochlea compared to the cat cochlea.

  8. Decoding of grasping information from neural signals recorded using peripheral intrafascicular interfaces

    Directory of Open Access Journals (Sweden)

    Cipriani Christian

    2011-09-01

    Full Text Available Abstract Background The restoration of complex hand functions by creating a novel bidirectional link between the nervous system and a dexterous hand prosthesis is currently pursued by several research groups. This connection must be fast, intuitive, with a high success rate and quite natural to allow an effective bidirectional flow of information between the user's nervous system and the smart artificial device. This goal can be achieved with several approaches and among them, the use of implantable interfaces connected with the peripheral nervous system, namely intrafascicular electrodes, is considered particularly interesting. Methods Thin-film longitudinal intra-fascicular electrodes were implanted in the median and ulnar nerves of an amputee's stump during a four-week trial. The possibility of decoding motor commands suitable to control a dexterous hand prosthesis was investigated for the first time in this research field by implementing a spike sorting and classification algorithm. Results The results showed that motor information (e.g., grip types and single finger movements) could be extracted with classification accuracy around 85% (for three classes plus rest) and that the user could improve his ability to govern motor commands over time as shown by the improved discrimination ability of our classification algorithm. Conclusions These results open up new and promising possibilities for the development of a neuro-controlled hand prosthesis.

  9. Novel four-sided neural probe fabricated by a thermal lamination process of polymer films.

    Science.gov (United States)

    Shin, Soowon; Kim, Jae-Hyun; Jeong, Joonsoo; Gwon, Tae Mok; Lee, Seung-Hee; Kim, Sung June

    2017-02-15

    Ideally, neural probes should have channels with a three-dimensional (3-D) configuration to record the activities of 3-D neural circuits. Many types of 3-D neural probes have been developed; however, most of them were designed as an array of multiple shanks with electrodes located along one side of the shanks. We developed a novel liquid crystal polymer (LCP)-based neural probe with four-sided electrodes. This probe has electrodes on four sides of the shank, i.e., the front, back and two sidewalls. To generate the proposed configuration of the electrodes, we used a thermal lamination process involving LCP films and laser micromachining. The proposed novel four-sided neural probe, was used to successfully perform in vivo multichannel neural recording in the mouse primary somatosensory cortex. The multichannel neural recording showed that the proposed four-sided neural probe can record spiking activities from a more diverse neuronal population than single-sided probes. This was confirmed by a pairwise Pearson correlation coefficient (Pearson's r) analysis and a cross-correlation analysis. The developed four-sided neural probe can be used to record various signals from a complex neural network. Copyright © 2016 Elsevier B.V. All rights reserved.
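
    A minimal sketch of the pairwise Pearson correlation analysis mentioned above, under the assumption that spike times from each recording site are binned into counts before being correlated. The spike trains and channel count below are synthetic placeholders.

```python
# Pairwise Pearson correlation between binned spike trains from several channels.
import numpy as np

rng = np.random.default_rng(3)
duration, bin_size = 60.0, 0.05                        # seconds
edges = np.arange(0.0, duration + bin_size, bin_size)

# Fake spike trains for four channels (e.g., front/back/sidewall sites).
spike_trains = [np.sort(rng.uniform(0, duration, rng.integers(200, 600)))
                for _ in range(4)]
binned = np.vstack([np.histogram(st, bins=edges)[0] for st in spike_trains])

pearson_r = np.corrcoef(binned)                        # 4 x 4 matrix of r values
print(np.round(pearson_r, 2))
```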

  10. Wireless hippocampal neural recording via a multiple input RF receiver to construct place-specific firing fields.

    Science.gov (United States)

    Lee, Seung Bae; Manns, Joseph R; Ghovanloo, Maysam

    2012-01-01

    This paper reports scientifically meaningful in vivo experiments using a 32-channel wireless neural recording system (WINeR). The WINeR system is divided into transmitter (Tx) and receiver (Rx) parts. On the Tx side, we had WINeR-6, a system-on-a-chip (SoC) that operated based on time division multiplexing (TDM) of pulse width modulated (PWM) samples. The chip was fabricated in a 0.5-µm CMOS process, occupying 4.9 × 3.3 mm(2) and consuming 15 mW from ±1.5V supplies. The Rx used two antennas with separate pathways to down-convert the RF signal from a large area. A time-to-digital converter (TDC) in an FPGA converted the PWM pulses into digitized samples. In order to further increase the wireless coverage area and eliminate blind spots within a large experimental arena, two receivers were synchronized. The WINeR system was used to record epileptic activities from a rat that was injected with tetanus toxin (TT) in the dorsal hippocampus. In a different in vivo experiment, place-specific firing fields of place cells, which are parts of the hippocampal-dependent memory, were mapped from a series of behavioral experiments from a rat running in a circular track. Results from the same animal were compared against a commercial hard-wired recording system to evaluate the quality of the wireless recordings.

  11. An implantable two axis micromanipulator made with a 3D printer for recording neural activity in free-swimming fish.

    Science.gov (United States)

    Rogers, Loranzie S; Van Wert, Jacey C; Mensinger, Allen F

    2017-08-15

    Chronically implanted electrodes allow monitoring neural activity from freely moving animals. While a wide variety of implanted headstages, microdrives and electrodes exist for terrestrial animals, few have been developed for use with aquatic animals. A two-axis micromanipulator was fabricated with a Formlabs 3D printer for implanting electrodes into free-swimming oyster toadfish (Opsanus tau). The five piece manipulator consisted of a base, body, electrode holder, manual screw drive and locking nut. The manipulator measured approximately 25×20×30 mm (l×w×h) and weighed 5.28 g after hand assembly. Microwire electrodes were inserted successfully with the manipulator to record high fidelity signals from the anterior lateral line nerve of the toadfish. The micromanipulator allowed the chronically implanted electrodes to be repositioned numerous times to record from multiple sites and extended successful recording time in the toadfish by several days. Three dimensional printing allowed an inexpensive (<US$5 in materials), two-axis micromanipulator to be printed relatively rapidly (<2 h) to successfully record from multiple sites in the anterior lateral line nerve of free-swimming toadfish. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. Biological channel modeling and implantable UWB antenna design for neural recording systems.

    Science.gov (United States)

    Bahrami, Hadi; Mirbozorgi, S Abdollah; Rusch, Leslie A; Gosselin, Benoit

    2015-01-01

    Ultrawideband (UWB) short-range communication systems have proved to be valuable in medical technology, particularly for implanted devices, due to their low-power consumption, low cost, small size, and high data rates. Neural activity monitoring in the brain requires high data rate (800 kb/s per neural sensor), and we target a system supporting a large number of sensors, in particular, aggregate transmission above 430 Mb/s (∼512 sensors). Knowledge of channel behavior is required to determine the maximum allowable power to 1) respect ANSI guidelines for avoiding tissue damage, and 2) respect FCC guidelines on unlicensed transmissions. We utilize a realistic model of the biological channel to inform the design of antennas for the implanted transmitter and the external receiver under these requirements. Antennas placement is examined under two scenarios having contrasting power constraints. Performance of the system within the biological tissues is examined via simulation and experiment. Our miniaturized antennas, 12 mm ×12 mm, need worst-case receiver sensitivities of -38 and -30.5 dBm for the first and second scenarios, respectively. These sensitivities allow us to successfully detect signals transmitted through tissues in the 3.1-10.6-GHz UWB band.

  13. Development of an ex Vivo Method for Multi-unit Recording of Microbiota-Colonic-Neural Signaling in Real Time

    Directory of Open Access Journals (Sweden)

    Maria M. Buckley

    2018-02-01

    Full Text Available Background and Objectives: Bidirectional signaling between the gastrointestinal tract and the brain is vital for maintaining whole-body homeostasis. Moreover, emerging evidence implicates vagal afferent signaling in the modulation of host physiology by microbes, which are most abundant in the colon. This study aims to optimize and advance dissection and recording techniques to facilitate real-time recordings of afferent neural signals originating in the distal colon. New Protocol: This paper describes a dissection technique, which facilitates extracellular electrophysiological recordings from visceral pelvic, spinal and vagal afferent neurons in response to stimulation of the distal colon. Examples of Application: Focal application of 75 mM KCl to a section of distal colon with exposed submucosal or myenteric nerve cell bodies and sensory nerve endings evoked activity in the superior mesenteric plexus and the vagal nerve. Noradrenaline stimulated nerve activity in the superior mesenteric plexus, whereas application of carbachol stimulated vagal nerve activity. Exposure of an ex vivo section of distal colon with an intact colonic mucosa to peptidoglycan, but not lipopolysaccharide, evoked vagal nerve firing. Discussion: Previous studies have recorded vagal signaling evoked by bacteria in the small intestine. The technical advances of this dissection and recording technique facilitate recording of afferent nerve signals evoked in extrinsic sensory pathways by neuromodulatory reagents applied to the distal colon. Moreover, we have demonstrated vagal afferent activation evoked by bacterial products applied to the distal colonic mucosa. This protocol may contribute to our understanding of functional bowel disorders where gut-brain communication is dysfunctional, and facilitate real-time interrogation of microbiota-gut-brain signaling.

  14. Response Features Determining Spike Times

    Directory of Open Access Journals (Sweden)

    Barry J. Richmond

    1999-01-01

    redundant with that carried by the coarse structure. Thus, the existence of precisely timed spike patterns carrying stimulus-related information does not imply control of spike timing at precise time scales.

  15. Critical Branching Neural Networks

    Science.gov (United States)

    Kello, Christopher T.

    2013-01-01

    It is now well-established that intrinsic variations in human neural and behavioral activity tend to exhibit scaling laws in their fluctuations and distributions. The meaning of these scaling laws is an ongoing matter of debate between isolable causes versus pervasive causes. A spiking neural network model is presented that self-tunes to critical…

  16. Comparison of spike sorting and thresholding of voltage waveforms for intracortical brain-machine interface performance

    Science.gov (United States)

    Christie, Breanne P.; Tat, Derek M.; Irwin, Zachary T.; Gilja, Vikash; Nuyujukian, Paul; Foster, Justin D.; Ryu, Stephen I.; Shenoy, Krishna V.; Thompson, David E.; Chestek, Cynthia A.

    2015-02-01

    Objective. For intracortical brain-machine interfaces (BMIs), action potential voltage waveforms are often sorted to separate out individual neurons. If these neurons contain independent tuning information, this process could increase BMI performance. However, the sorting of action potentials (‘spikes’) requires high sampling rates and is computationally expensive. To explicitly define the difference between spike sorting and alternative methods, we quantified BMI decoder performance when using threshold-crossing events versus sorted action potentials. Approach. We used data sets from 58 experimental sessions from two rhesus macaques implanted with Utah arrays. Data were recorded while the animals performed a center-out reaching task with seven different angles. For spike sorting, neural signals were sorted into individual units by using a mixture of Gaussians to cluster the first four principal components of the waveforms. For thresholding events, spikes that simply crossed a set threshold were retained. We decoded the data offline using both a Naïve Bayes classifier for reaching direction and a linear regression to evaluate hand position. Main results. We found the highest performance for thresholding when placing a threshold between -3 and -4.5 × Vrms. Spike sorted data outperformed thresholded data for one animal but not the other. The mean Naïve Bayes classification accuracy for sorted data was 88.5% and changed by 5% on average when data were thresholded. The mean correlation coefficient for sorted data was 0.92, and changed by 0.015 on average when thresholded. Significance. For prosthetics applications, these results imply that when thresholding is used instead of spike sorting, only a small amount of performance may be lost. The utilization of threshold-crossing events may significantly extend the lifetime of a device because these events are often still detectable once single neurons are no longer isolated.
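
    A minimal sketch of the threshold-crossing alternative to spike sorting compared above: compute the RMS voltage of the trace, place the threshold at a multiple of it (here -4 × Vrms, within the -3 to -4.5 × Vrms range reported as best), and keep the first sample of each downward crossing. The signal below is synthetic.

```python
# Threshold-crossing event detection on a synthetic extracellular trace.
import numpy as np

rng = np.random.default_rng(4)
fs = 30000                                   # samples per second
signal = rng.normal(0, 10e-6, fs)            # 1 s of ~10 uVrms noise
signal[::3000] -= 60e-6                      # inject ten fake spike troughs

vrms = np.sqrt(np.mean(signal ** 2))
threshold = -4.0 * vrms
below = signal < threshold
# A crossing is a sample below threshold whose predecessor was above it.
crossings = np.flatnonzero(below[1:] & ~below[:-1]) + 1
print(f"threshold = {threshold * 1e6:.1f} uV, {len(crossings)} events detected")
```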

  17. A CMOS power-efficient low-noise current-mode front-end amplifier for neural signal recording.

    Science.gov (United States)

    Wu, Chung-Yu; Chen, Wei-Ming; Kuo, Liang-Ting

    2013-04-01

    In this paper, a new current-mode front-end amplifier (CMFEA) for neural signal recording systems is proposed. In the proposed CMFEA, a current-mode preamplifier with an active feedback loop operated at very low frequency is designed as the first gain stage to bypass any dc offset current generated by the electrode-tissue interface and to achieve a low high-pass cutoff frequency below 0.5 Hz. No reset signal or ultra-large pseudo resistor is required. The current-mode preamplifier has low dc operation current to enhance low-noise performance and decrease power consumption. A programmable current gain stage is adopted to provide adjustable gain for adaptive signal scaling. A following current-mode filter is designed to adjust the low-pass cutoff frequency for different neural signals. The proposed CMFEA is designed and fabricated in 0.18-μm CMOS technology and the area of the core circuit is 0.076 mm(2). The measured high-pass cutoff frequency is as low as 0.3 Hz and the low-pass cutoff frequency is adjustable from 1 kHz to 10 kHz. The measured maximum current gain is 55.9 dB. The measured input-referred current noise density is 153 fA /√Hz , and the power consumption is 13 μW at 1-V power supply. The fabricated CMFEA has been successfully applied to the animal test for recording the seizure ECoG of Long-Evan rats.

  18. Inverse current source density method in two dimensions: inferring neural activation from multielectrode recordings.

    Science.gov (United States)

    Łęski, Szymon; Pettersen, Klas H; Tunstall, Beth; Einevoll, Gaute T; Gigg, John; Wójcik, Daniel K

    2011-12-01

    The recent development of large multielectrode recording arrays has made it affordable for an increasing number of laboratories to record from multiple brain regions simultaneously. The development of analytical tools for array data, however, lags behind these technological advances in hardware. In this paper, we present a method based on forward modeling for estimating current source density from electrophysiological signals recorded on a two-dimensional grid using multi-electrode rectangular arrays. This new method, which we call two-dimensional inverse Current Source Density (iCSD 2D), is based upon and extends our previous one- and three-dimensional techniques. We test several variants of our method, both on surrogate data generated from a collection of Gaussian sources, and on model data from a population of layer 5 neocortical pyramidal neurons. We also apply the method to experimental data from the rat subiculum. The main advantages of the proposed method are the explicit specification of its assumptions, the possibility to include system-specific information as it becomes available, the ability to estimate CSD at the grid boundaries, and lower reconstruction errors when compared to the traditional approach. These features make iCSD 2D a substantial improvement over the approaches used so far and a powerful new tool for the analysis of multielectrode array data. We also provide a free GUI-based MATLAB toolbox to analyze and visualize our test data as well as user datasets.
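
    For context, the sketch below implements the traditional CSD estimate that iCSD improves upon, i.e. the negative second spatial derivative of the potential scaled by tissue conductivity, applied to a synthetic one-dimensional potential profile. It is not the iCSD 2D method or the authors' MATLAB toolbox; the conductivity and spacing values are assumptions.

```python
# Traditional CSD estimate: negative discrete second spatial derivative of the LFP.
import numpy as np

sigma = 0.3          # extracellular conductivity, S/m (assumed)
h = 100e-6           # inter-electrode spacing, m

# Fake LFP snapshot on a 1-D laminar probe: a dipole-like potential profile.
z = np.arange(16) * h
lfp = 1e-4 * (np.exp(-((z - 0.6e-3) / 0.2e-3) ** 2)
              - np.exp(-((z - 1.0e-3) / 0.2e-3) ** 2))   # volts

# CSD ~ -sigma * d^2(phi)/dz^2, estimated with a central difference.
csd = -sigma * (lfp[2:] - 2 * lfp[1:-1] + lfp[:-2]) / h ** 2
print(csd.round(1))       # A/m^3; positive values = sources, negative = sinks
```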

  19. Linking structure and activity in nonlinear spiking networks.

    Directory of Open Access Journals (Sweden)

    Gabriel Koch Ocker

    2017-06-01

    Full Text Available Recent experimental advances are producing an avalanche of data on both neural connectivity and neural activity. To take full advantage of these two emerging datasets we need a framework that links them, revealing how collective neural activity arises from the structure of neural connectivity and intrinsic neural dynamics. This problem of structure-driven activity has drawn major interest in computational neuroscience. Existing methods for relating activity and architecture in spiking networks rely on linearizing activity around a central operating point and thus fail to capture the nonlinear responses of individual neurons that are the hallmark of neural information processing. Here, we overcome this limitation and present a new relationship between connectivity and activity in networks of nonlinear spiking neurons by developing a diagrammatic fluctuation expansion based on statistical field theory. We explicitly show how recurrent network structure produces pairwise and higher-order correlated activity, and how nonlinearities impact the networks' spiking activity. Our findings open new avenues to investigating how single-neuron nonlinearities-including those of different cell types-combine with connectivity to shape population activity and function.

  20. A Neuron Model for FPGA Spiking Neuronal Network Implementation

    Directory of Open Access Journals (Sweden)

    BONTEANU, G.

    2011-11-01

    Full Text Available We propose a neuron model, able to reproduce the basic elements of the neuronal dynamics, optimized for digital implementation of Spiking Neural Networks. Its architecture is structured in two major blocks, a datapath and a control unit. The datapath consists of a membrane potential circuit, which emulates the neuronal dynamics at the soma level, and a synaptic circuit used to update the synaptic weight according to the spike timing dependent plasticity (STDP) mechanism. The proposed model is implemented into a Cyclone II-Altera FPGA device. Our results indicate the neuron model can be used to build up 1K Spiking Neural Networks on reconfigurable logic support, to explore various network topologies.
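
    A software sketch (Python rather than HDL) of the two datapath blocks described above, assuming a leaky integrate-and-fire membrane-potential update and a pair-based, trace-driven STDP weight update. All constants are illustrative and not taken from the paper.

```python
# Leaky integrate-and-fire update plus pair-based STDP, mirroring the two datapath blocks.
import numpy as np

dt, tau_m, v_th, v_reset = 1.0, 20.0, 1.0, 0.0            # ms, ms, a.u., a.u.
tau_pre, tau_post, a_plus, a_minus = 20.0, 20.0, 0.01, 0.012

rng = np.random.default_rng(5)
v, w = 0.0, 0.5
x_pre, x_post = 0.0, 0.0                                    # STDP traces

for step in range(1000):
    pre_spike = rng.random() < 0.05                         # Bernoulli input spikes
    # Membrane-potential block: leaky integration of the weighted input.
    v += dt / tau_m * (-v) + w * pre_spike
    post_spike = v >= v_th
    if post_spike:
        v = v_reset
    # Synaptic block: exponentially decaying traces and pair-based STDP.
    x_pre  += -dt / tau_pre * x_pre + pre_spike
    x_post += -dt / tau_post * x_post + post_spike
    if pre_spike:                                           # pre after post -> depression
        w = max(0.0, w - a_minus * x_post)
    if post_spike:                                          # post after pre -> potentiation
        w = min(1.0, w + a_plus * x_pre)

print(f"final weight: {w:.3f}")
```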

  1. Emergent dynamics of spiking neurons with fluctuating threshold

    Science.gov (United States)

    Bhattacharjee, Anindita; Das, M. K.

    2017-05-01

    The role of a fluctuating threshold in neuronal dynamics is investigated. The threshold function is assumed to follow a normal probability distribution. The standard deviation of the inter-spike interval of the response is computed as an indicator of irregularity in spike emission. It has been observed that the irregularity in spiking increases as the threshold variation increases. A significant change in the modal characteristics of Inter Spike Intervals (ISI) is seen to occur as a function of the fluctuation parameter. The investigation is further carried out for a coupled system of neurons. Cooperative dynamics of coupled neurons are discussed in view of synchronization. Total and partial synchronization regimes are depicted with the help of contour plots of the synchrony measure under various conditions. Results of this investigation may provide a basis for exploring the complexities of neural communication and brain functioning.
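
    A minimal sketch of the setup described above, assuming a leaky integrate-and-fire neuron whose threshold is redrawn from a normal distribution after every spike, with the ISI standard deviation as the irregularity measure. Parameters are illustrative, not taken from the paper.

```python
# ISI irregularity of a leaky integrate-and-fire neuron with a fluctuating threshold.
import numpy as np

def isi_std(threshold_sd, n_steps=200000, dt=0.1, tau=10.0, drive=2.0,
            base_threshold=1.0, seed=0):
    rng = np.random.default_rng(seed)
    v, theta = 0.0, base_threshold
    spike_times, t = [], 0.0
    for _ in range(n_steps):
        t += dt
        v += dt / tau * (drive - v)                # deterministic leaky integration
        if v >= theta:
            spike_times.append(t)
            v = 0.0
            theta = rng.normal(base_threshold, threshold_sd)   # fluctuating threshold
    return np.diff(spike_times).std()

for sd in (0.0, 0.05, 0.1, 0.2):
    print(f"threshold SD {sd:.2f} -> ISI SD {isi_std(sd):.2f} ms")
```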

  2. An integrated multichannel neural recording analog front-end ASIC with area-efficient driven right leg circuit.

    Science.gov (United States)

    Tao Tang; Wang Ling Goh; Lei Yao; Jia Hao Cheong; Yuan Gao

    2017-07-01

    This paper describes an integrated multichannel neural recording analog front end (AFE) with a novel area-efficient driven right leg (DRL) circuit to improve the system common mode rejection ratio (CMRR). The proposed AFE consists of an AC-coupled low-noise programmable-gain amplifier, an area-efficient DRL block and a 10-bit SAR ADC. Compared to a conventional DRL circuit, the proposed capacitor-less DRL design achieves 90% chip area reduction with enhanced CMRR performance, making it ideal for multichannel biomedical recording applications. The AFE circuit has been designed in a standard 0.18-μm CMOS process. Post-layout simulation results show that the AFE provides two gain settings of 54 dB/60 dB while consuming 1 μA per channel under a supply voltage of 1 V. The input-referred noise of the AFE integrated from 1 Hz to 10 kHz is only 4 μVrms and the CMRR is 110 dB.

  3. Neck collar for restraining head and body movements in rats for behavioral task performance and simultaneous neural activity recording.

    Science.gov (United States)

    Tateyama, Yukina; Oyama, Kei; Lo, Cheuk Wa Christopher; Iijima, Toshio; Tsutsui, Ken-Ichiro

    2016-04-01

    Head fixation has been one of the major methods in behavioral neurophysiology because it allows precision in stimulus application and behavioral assessment. Most neural recordings in awake monkeys have been obtained under head fixation, which is nowadays also being used in awake rodents. However, head fixation devices in rats often become unstable within several months, which increases risks for inflammation, infection, and necrosis of the bone and surrounding tissue. In this study we developed a novel non-invasive "neck collar system" for restraining the head and body movements of behaving rats. The attachment of the neck collar for 2-3 months did not affect the animals' health and welfare. Rats under neck-collar fixation could learn a behavioral task (standard delayed licking task) with the same efficiency as those under standard head fixation. They could also learn a more complicated task (delayed pro/anti-licking task) under neck-collar fixation and afterwards transfer their learning to the task under standard head fixation. Furthermore, we were able to record single-unit activity in rats under neck-collar fixation during the performance of the standard delayed licking task. This system consists of economical materials and is easily constructed, and it enables head-restraint without surgery, thus eliminating the risk of inflammation or infection. We consider the neck-collar fixation developed in this study would be useful for restraining the head of a behaving rodent. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Reading Neural Encodings using Phase Space Methods

    OpenAIRE

    Abarbanel, Henry D. I.; Tumer, Evren C.

    2003-01-01

    Environmental signals sensed by nervous systems are often represented in spike trains carried from sensory neurons to higher neural functions where decisions and functional actions occur. Information about the environmental stimulus is contained (encoded) in the train of spikes. We show how to "read" the encoding using state space methods of nonlinear dynamics. We create a mapping from spike signals which are output from the neural processing system back to an estimate of the analog input sig...

  5. Correcting for the sampling bias problem in spike train information measures.

    Science.gov (United States)

    Panzeri, Stefano; Senatore, Riccardo; Montemurro, Marcelo A; Petersen, Rasmus S

    2007-09-01

    Information Theory enables the quantification of how much information a neuronal response carries about external stimuli and is hence a natural analytic framework for studying neural coding. The main difficulty in its practical application to spike train analysis is that estimates of neuronal information from experimental data are prone to a systematic error (called "bias"). This bias is an inevitable consequence of the limited number of stimulus-response samples that it is possible to record in a real experiment. In this paper, we first explain the origin and the implications of the bias problem in spike train analysis. We then review and evaluate some recent general-purpose methods to correct for sampling bias: the Panzeri-Treves, Quadratic Extrapolation, Best Universal Bound, Nemenman-Shafee-Bialek procedures, and a recently proposed shuffling bias reduction procedure. Finally, we make practical recommendations for the accurate computation of information from spike trains. Our main recommendation is to estimate information using the shuffling bias reduction procedure in combination with one of the other four general purpose bias reduction procedures mentioned in the preceding text. This provides information estimates with acceptable variance and which are unbiased even when the number of trials per stimulus is as small as the number of possible discrete neuronal responses.
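
    The sketch below illustrates one of the bias-correction strategies reviewed above, quadratic extrapolation: compute the plug-in mutual information on the full data set and on random halves and quarters, then fit I(N) = I_inf + a/N + b/N^2 and report the extrapolated value. It is a simplified illustration on discrete stimulus/response labels, not the authors' toolbox.

```python
# Plug-in mutual information with quadratic-extrapolation bias correction.
import numpy as np

def plugin_mi(stim, resp):
    """Plug-in (maximum-likelihood) mutual information in bits."""
    joint = np.zeros((stim.max() + 1, resp.max() + 1))
    np.add.at(joint, (stim, resp), 1)
    p = joint / joint.sum()
    ps, pr = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return float((p[nz] * np.log2(p[nz] / (ps @ pr)[nz])).sum())

def mi_quadratic_extrapolation(stim, resp, seed=0):
    rng = np.random.default_rng(seed)
    n = len(stim)
    inv_n, estimates = [], []
    for frac in (1, 2, 4):                          # full data, halves, quarters
        vals = [plugin_mi(*(arr[rng.choice(n, n // frac, replace=False)]
                            for arr in (stim, resp))) for _ in range(frac)]
        inv_n.append(frac / n)
        estimates.append(np.mean(vals))
    b, a, i_inf = np.polyfit(inv_n, estimates, 2)    # I(1/N) = b/N^2 + a/N + I_inf
    return i_inf

# Independent stimulus/response: true information is 0, but the plug-in estimate is biased up.
rng = np.random.default_rng(6)
stim = rng.integers(0, 4, 200)
resp = rng.integers(0, 8, 200)
print("plug-in:", round(plugin_mi(stim, resp), 3),
      "extrapolated:", round(mi_quadratic_extrapolation(stim, resp), 3))
```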

  6. Basal forebrain activation enhances between-trial reliability of low-frequency local field potentials (LFP) and spiking activity in tree shrew primary visual cortex (V1).

    Science.gov (United States)

    De Luna, Paolo; Veit, Julia; Rainer, Gregor

    2017-12-01

    Brain state has profound effects on neural processing and stimulus encoding in sensory cortices. While the synchronized state is dominated by low-frequency local field potential (LFP) activity, low-frequency LFP power is suppressed in the desynchronized state, where a concurrent enhancement in gamma power is observed. Recently, it has been shown that cortical desynchronization co-occurs with enhanced between-trial reliability of spiking activity in sensory neurons, but it is currently unclear whether this effect is also evident in LFP signals. Here, we address this question by recording both spike trains and LFP in primary visual cortex during natural movie stimulation, and using isoflurane anesthesia and basal forebrain (BF) electrical activation as proxies for synchronized and desynchronized brain states. We show that indeed, low-frequency LFP modulations ("LFP events") also occur more reliably following BF activation. Interestingly, while being more reliable, these LFP events are smaller in amplitude compared to those generated in the synchronized brain state. We further demonstrate that differences in reliability of spiking activity between cortical states can be linked to amplitude and probability of LFP events. The correlated temporal dynamics between low-frequency LFP and spiking response reliability in visual cortex suggests that these effects may both be the result of the same neural circuit activation triggered by BF stimulation, which facilitates switching between processing of incoming sensory information in the desynchronized and reverberation of internal signals in the synchronized state.

  7. Neural network connectivity and response latency modelled by stochastic processes

    DEFF Research Database (Denmark)

    Tamborrino, Massimiliano

    is connected to thousands of other neurons. The first question is: how to model neural networks through stochastic processes? A multivariate Ornstein-Uhlenbeck process, obtained as a diffusion approximation of a jump process, is the proposed answer. Obviously, dependencies between neurons imply dependencies...... between their spike times. Therefore, the second question is: how to detect neural network connectivity from simultaneously recorded spike trains? Answering this question corresponds to investigating the joint distribution of sequences of first passage times. A non-parametric method based on copulas...... generation of spikes. When a stimulus is applied to the network, the spontaneous firings may prevail and hamper detection of the effects of the stimulus. Therefore, the spontaneous firings cannot be ignored and the response latency has to be detected on top of a background signal. Everything becomes more difficult...

  8. Improving the quality of combined EEG-TMS neural recordings: Introducing the coil spacer.

    Science.gov (United States)

    Ruddy, K L; Woolley, D G; Mantini, D; Balsters, J H; Enz, N; Wenderoth, N

    2017-11-06

    In the last decade, interest in combined transcranial magnetic stimulation (TMS) and electroencephalography (EEG) approaches has grown substantially. Aside from the obvious artifacts induced by the magnetic pulses themselves, separate and more sinister signal disturbances arise as a result of contact between the TMS coil and EEG electrodes. Here we profile the characteristics of these artifacts and introduce a simple device - the coil spacer - to provide a platform allowing physical separation between the coil and electrodes during stimulation. EEG data revealed high amplitude signal disturbances when the TMS coil was in direct contact with the EEG electrodes, well within the physiological range of viable EEG signals. The largest artifacts were located in the Delta and Theta frequency range, and standard data cleanup using independent components analysis (ICA) was ineffective due to the artifact's similarity to real brain oscillations. While the current best practice is to use a large coil holding apparatus to fixate the coil 'hovering' over the head with an air gap, the spacer provides a simpler solution that ensures this distance is kept constant throughout testing. The results strongly suggest that data collected from combined TMS-EEG studies with the coil in direct contact with the EEG cap are polluted with low frequency artifacts that are indiscernible from physiological brain signals. The coil spacer provides a cheap and simple solution to this problem and is recommended for use in future simultaneous TMS-EEG recordings. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.

  9. Spike-timing theory of working memory.

    Science.gov (United States)

    Szatmáry, Botond; Izhikevich, Eugene M

    2010-08-19

    Working memory (WM) is the part of the brain's memory system that provides temporary storage and manipulation of information necessary for cognition. Although WM has limited capacity at any given time, it has vast memory content in the sense that it acts on the brain's nearly infinite repertoire of lifetime long-term memories. Using simulations, we show that large memory content and WM functionality emerge spontaneously if we take the spike-timing nature of neuronal processing into account. Here, memories are represented by extensively overlapping groups of neurons that exhibit stereotypical time-locked spatiotemporal spike-timing patterns, called polychronous patterns; and synapses forming such polychronous neuronal groups (PNGs) are subject to associative synaptic plasticity in the form of both long-term and short-term spike-timing dependent plasticity. While long-term potentiation is essential in PNG formation, we show how short-term plasticity can temporarily strengthen the synapses of selected PNGs and lead to an increase in the spontaneous reactivation rate of these PNGs. This increased reactivation rate, consistent with in vivo recordings during WM tasks, results in high interspike interval variability and irregular, yet systematically changing, elevated firing rate profiles within the neurons of the selected PNGs. Additionally, our theory explains the relationship between such slowly changing firing rates and precisely timed spikes, and it reveals a novel relationship between WM and the perception of time on the order of seconds.

  10. Spike-timing theory of working memory.

    Directory of Open Access Journals (Sweden)

    Botond Szatmáry

    Full Text Available Working memory (WM) is the part of the brain's memory system that provides temporary storage and manipulation of information necessary for cognition. Although WM has limited capacity at any given time, it has vast memory content in the sense that it acts on the brain's nearly infinite repertoire of lifetime long-term memories. Using simulations, we show that large memory content and WM functionality emerge spontaneously if we take the spike-timing nature of neuronal processing into account. Here, memories are represented by extensively overlapping groups of neurons that exhibit stereotypical time-locked spatiotemporal spike-timing patterns, called polychronous patterns; and synapses forming such polychronous neuronal groups (PNGs) are subject to associative synaptic plasticity in the form of both long-term and short-term spike-timing dependent plasticity. While long-term potentiation is essential in PNG formation, we show how short-term plasticity can temporarily strengthen the synapses of selected PNGs and lead to an increase in the spontaneous reactivation rate of these PNGs. This increased reactivation rate, consistent with in vivo recordings during WM tasks, results in high interspike interval variability and irregular, yet systematically changing, elevated firing rate profiles within the neurons of the selected PNGs. Additionally, our theory explains the relationship between such slowly changing firing rates and precisely timed spikes, and it reveals a novel relationship between WM and the perception of time on the order of seconds.

  11. Design and measurements of 64-channel ASIC for neural signal recording.

    Science.gov (United States)

    Kmon, P; Zoladz, M; Grybos, P; Szczygiel, R

    2009-01-01

    This paper presents the design and measurements of a low noise multi-channel front-end electronics for recording extra-cellular neuronal signals using microelectrode arrays. The integrated circuit contains 64 readout channels and was fabricated in CMOS 0.18 microm technology. A single readout channel is built of an AC coupling circuit at the input, a low noise preamplifier, a band-pass filter and a second amplifier. In order to reduce the number of output lines, the 64 analog signals from readout channels are multiplexed to a single output by an analog multiplexer. The chip is optimized for low noise and matching performance with the possibility of cut-off frequencies tuning. The low cut-off frequency can be tuned in the 1 Hz-60 Hz range and the high cut-off frequency can be tuned in the 3.5 kHz-15 kHz range. For the nominal gain setting at 44 dB and power dissipation per single channel of 220 microW the equivalent input noise is in the range from 6 microV-11 microV rms depending on the band-pass filter settings. The chip has good uniformity concerning the spread of its electrical parameters from channel to channel. The spread of gain calculated as standard deviation to mean value is about 4.4% and the spread of the low cut-off frequency is on the same level. The chip occupies 5x2.3 mm(2) of silicon area.

  12. Binary Associative Memories as a Benchmark for Spiking Neuromorphic Hardware

    Directory of Open Access Journals (Sweden)

    Andreas Stöckel

    2017-08-01

    Full Text Available Large-scale neuromorphic hardware platforms, specialized computer systems for energy efficient simulation of spiking neural networks, are being developed around the world, for example as part of the European Human Brain Project (HBP). Due to conceptual differences, a universal performance analysis of these systems in terms of runtime, accuracy and energy efficiency is non-trivial, yet indispensable for further hard- and software development. In this paper we describe a scalable benchmark based on a spiking neural network implementation of the binary neural associative memory. We treat neuromorphic hardware and software simulators as black-boxes and execute exactly the same network description across all devices. Experiments on the HBP platforms under varying configurations of the associative memory show that the presented method allows one to test the quality of the neuron model implementation, and to explain significant deviations from the expected reference output.
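
    A non-spiking reference sketch of the binary (Willshaw-style) associative memory that the benchmark above builds on, under the assumption of sparse binary patterns: store by OR-ing binary outer products into the weight matrix, recall by thresholding the dendritic sums. Pattern sizes are arbitrary; this is not the spiking implementation used on the HBP platforms.

```python
# Binary (clipped Hebbian) associative memory: store and recall sparse patterns.
import numpy as np

rng = np.random.default_rng(7)
n_in, n_out, k, n_patterns = 256, 256, 8, 40      # k active bits per pattern

def sparse_pattern(n, k):
    p = np.zeros(n, dtype=int)
    p[rng.choice(n, k, replace=False)] = 1
    return p

inputs  = [sparse_pattern(n_in, k) for _ in range(n_patterns)]
outputs = [sparse_pattern(n_out, k) for _ in range(n_patterns)]

# Storage: clipped Hebbian learning (binary weights).
W = np.zeros((n_out, n_in), dtype=int)
for x, y in zip(inputs, outputs):
    W = np.maximum(W, np.outer(y, x))

# Recall: threshold the dendritic sums at the number of active input bits.
recalled = [(W @ x >= x.sum()).astype(int) for x in inputs]
errors = sum(np.sum(r != y) for r, y in zip(recalled, outputs))
print(f"total bit errors across {n_patterns} recalls: {errors}")
```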

  13. Towards building hybrid biological/in silico neural networks for motor neuroprosthetic control

    Directory of Open Access Journals (Sweden)

    Mehmet Kocaturk

    2015-08-01

    Full Text Available In this article, we introduce the Bioinspired Neuroprosthetic Design Environment (BNDE) as a practical platform for the development of novel brain machine interface (BMI) controllers which are based on spiking model neurons. We built the BNDE around a hard real-time system so that it is capable of creating simulated synapses from extracellularly recorded neurons to model neurons. In order to evaluate the practicality of the BNDE for neuroprosthetic control experiments, a novel, adaptive BMI controller was developed and tested using real-time closed-loop simulations. The present controller consists of two in silico medium spiny neurons which receive simulated synaptic inputs from recorded motor cortical neurons. In the closed-loop simulations, the recordings from the cortical neurons were imitated using an external, hardware-based neural signal synthesizer. By implementing a reward-modulated spike timing-dependent plasticity rule, the controller achieved perfect target reach accuracy for a two target reaching task in one dimensional space. The BNDE combines the flexibility of software-based spiking neural network (SNN) simulations with powerful online data visualization tools and is a low-cost, PC-based and all-in-one solution for developing neurally-inspired BMI controllers. We believe the BNDE is the first implementation which is capable of creating hybrid biological/in silico neural networks for motor neuroprosthetic control and utilizes multiple CPU cores for computationally intensive real-time SNN simulations.

  14. [Multi-channel in vivo recording techniques: signal processing of action potentials and local field potentials].

    Science.gov (United States)

    Xu, Jia-Min; Wang, Ce-Qun; Lin, Long-Nian

    2014-06-25

    Multi-channel in vivo recording techniques are used to record ensemble neuronal activity and local field potentials (LFP) simultaneously. One of the key points for the technique is how to process these two sets of recorded neural signals properly so that data accuracy can be assured. We intend to introduce data processing approaches for action potentials and LFP based on the original data collected through multi-channel recording system. Action potential signals are high-frequency signals, hence high sampling rate of 40 kHz is normally chosen for recording. Based on waveforms of extracellularly recorded action potentials, tetrode technology combining principal component analysis can be used to discriminate neuronal spiking signals from differently spatially distributed neurons, in order to obtain accurate single neuron spiking activity. LFPs are low-frequency signals (lower than 300 Hz), hence the sampling rate of 1 kHz is used for LFPs. Digital filtering is required for LFP analysis to isolate different frequency oscillations including theta oscillation (4-12 Hz), which is dominant in active exploration and rapid-eye-movement (REM) sleep, gamma oscillation (30-80 Hz), which is accompanied by theta oscillation during cognitive processing, and high frequency ripple oscillation (100-250 Hz) in awake immobility and slow wave sleep (SWS) state in rodent hippocampus. For the obtained signals, common data post-processing methods include inter-spike interval analysis, spike auto-correlation analysis, spike cross-correlation analysis, power spectral density analysis, and spectrogram analysis.
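
    A minimal sketch of the filtering steps described above: split a wideband extracellular trace into a spike band and the LFP sub-bands (theta, gamma, ripple) with zero-phase Butterworth filters. The trace below is synthetic noise, and the filter orders are assumptions.

```python
# Split a wideband trace into spike-band and LFP-band components.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 40000                                                  # wideband sampling rate, Hz
wideband = np.random.default_rng(8).normal(size=fs * 2)     # 2 s of fake data

def bandpass(x, low, high, fs, order=3):
    sos = butter(order, [low, high], btype="band", fs=fs, output="sos")
    return sosfiltfilt(sos, x)                              # zero-phase filtering

spike_band = bandpass(wideband, 300, 6000, fs)    # action potentials
theta      = bandpass(wideband, 4, 12, fs)        # exploration / REM sleep
gamma      = bandpass(wideband, 30, 80, fs)       # cognitive processing
ripple     = bandpass(wideband, 100, 250, fs)     # awake immobility / SWS

print([round(band.std(), 4) for band in (spike_band, theta, gamma, ripple)])
```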

  15. Effects of nicotine stimulation on spikes, theta frequency oscillations, and spike-theta oscillation relationship in rat medial septum diagonal band Broca slices

    Science.gov (United States)

    Wen, Dong; Peng, Ce; Ou-yang, Gao-xiang; Henderson, Zainab; Li, Xiao-li; Lu, Cheng-biao

    2013-01-01

    Aim: Spiking activities and neuronal network oscillations in the theta frequency range have been found in many cortical areas during information processing. The aim of this study is to determine whether nicotinic acetylcholine receptors (nAChRs) mediate neuronal network activity in rat medial septum diagonal band Broca (MSDB) slices. Methods: Extracellular field potentials were recorded in the slices using an Axoprobe 1A amplifier. Data analysis was performed off-line. Spike sorting and local field potential (LFP) analyses were performed using Spike2 software. The role of spiking activity in the generation of LFP oscillations in the slices was determined by analyzing the phase-time relationship between the spikes and LFP oscillations. Circular statistic analysis based on the Rayleigh test was used to determine the significance of phase relationships between the spikes and LFP oscillations. The timing relationship was examined by quantifying the spike-field coherence (SFC). Results: Application of nicotine (250 nmol/L) induced prominent LFP oscillations in the theta frequency band and both small- and large-amplitude population spiking activity in the slices. These spikes were phase-locked to theta oscillations at specific phases. The Rayleigh test showed a statistically significant relationship in phase-locking between the spikes and theta oscillations. Larger changes in the SFC were observed for large-amplitude spikes, indicating an accurate timing relationship between this type of spike and LFP oscillations. The nicotine-induced spiking activity (large-amplitude population spikes) was suppressed by the nAChR antagonist dihydro-β-erythroidine (0.3 μmol/L). Conclusion: The results demonstrate that large-amplitude spikes are phase-locked to theta oscillations and have a high spike-timing accuracy, which are likely a main contributor to the theta oscillations generated in MSDB during nicotine receptor activation. PMID:23474704
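
    A hedged sketch of the spike/theta phase relationship analysis described above: band-pass the LFP in the theta range, take its instantaneous phase from a Hilbert transform, read off the phase at each spike, and assess phase locking with the mean resultant length and the standard large-sample Rayleigh approximation p ≈ exp(-nR²). The data are synthetic and the analysis is generic, not the Spike2 workflow used in the paper.

```python
# Spike-theta phase locking with a Hilbert transform and an approximate Rayleigh test.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

fs, duration = 1000, 20.0                           # Hz, s
t = np.arange(0, duration, 1 / fs)
rng = np.random.default_rng(9)
lfp = np.sin(2 * np.pi * 8 * t) + 0.5 * rng.normal(size=t.size)   # 8 Hz theta + noise

sos = butter(3, [4, 12], btype="band", fs=fs, output="sos")
theta_phase = np.angle(hilbert(sosfiltfilt(sos, lfp)))

# Fake spikes biased toward one theta phase.
spike_idx = np.flatnonzero((np.cos(theta_phase) < -0.8) & (rng.random(t.size) < 0.05))
phases = theta_phase[spike_idx]

n = len(phases)
R = np.abs(np.mean(np.exp(1j * phases)))            # mean resultant length
preferred = np.angle(np.mean(np.exp(1j * phases)))  # preferred phase, radians
p_rayleigh = np.exp(-n * R ** 2)                    # approximate Rayleigh p-value
print(f"n={n}, R={R:.2f}, preferred phase={preferred:.2f} rad, p~{p_rayleigh:.1e}")
```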

  16. Validating silicon polytrodes with paired juxtacellular recordings: method and dataset.

    Science.gov (United States)

    Neto, Joana P; Lopes, Gonçalo; Frazão, João; Nogueira, Joana; Lacerda, Pedro; Baião, Pedro; Aarts, Arno; Andrei, Alexandru; Musa, Silke; Fortunato, Elvira; Barquinha, Pedro; Kampff, Adam R

    2016-08-01

    Cross-validating new methods for recording neural activity is necessary to accurately interpret and compare the signals they measure. Here we describe a procedure for precisely aligning two probes for in vivo "paired-recordings" such that the spiking activity of a single neuron is monitored with both a dense extracellular silicon polytrode and a juxtacellular micropipette. Our new method allows for efficient, reliable, and automated guidance of both probes to the same neural structure with micrometer resolution. We also describe a new dataset of paired-recordings, which is available online. We propose that our novel targeting system, and ever expanding cross-validation dataset, will be vital to the development of new algorithms for automatically detecting/sorting single-units, characterizing new electrode materials/designs, and resolving nagging questions regarding the origin and nature of extracellular neural signals. Copyright © 2016 the American Physiological Society.

  17. Event-Driven Contrastive Divergence for Spiking Neuromorphic Systems

    Directory of Open Access Journals (Sweden)

    Emre eNeftci

    2014-01-01

    Full Text Available Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissipation and real-time interfacing with the environment. However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetic, which do not directly map onto a dynamical neural substrate. Here, we present an event-driven variation of CD to train an RBM constructed with Integrate & Fire (I&F) neurons, which is constrained by the limitations of existing and near-future neuromorphic hardware platforms. Our strategy is based on neural sampling, which allows us to synthesize a spiking neural network that samples from a target Boltzmann distribution. The reverberating activity of the network replaces the discrete steps of the CD algorithm, while Spike Time Dependent Plasticity (STDP) carries out the weight updates in an online, asynchronous fashion. We demonstrate our approach by training an RBM composed of leaky I&F neurons with STDP synapses to learn a generative model of the MNIST hand-written digit dataset, and by testing it in recognition, generation and cue integration tasks. Our results contribute to a machine learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality.
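    The approach above relies on STDP to carry out the weight updates online. The sketch below illustrates only that ingredient, using a generic pair-based, trace-driven STDP rule; it is not the event-driven contrastive-divergence procedure itself, and the time constants, amplitudes, and test spike trains are assumptions.

      # Minimal sketch of an online, pair-based STDP update using exponential traces.
      # Illustrative only; all parameter values are assumptions.
      import numpy as np

      TAU_PLUS, TAU_MINUS = 20e-3, 20e-3   # trace time constants (s)
      A_PLUS, A_MINUS = 0.01, 0.012        # potentiation / depression amplitudes

      def run_stdp(pre_spikes, post_spikes, dt=1e-3, T=1.0, w0=0.5):
          """Evolve one synaptic weight from pre/post spike times given in seconds."""
          steps = int(T / dt)
          pre = np.zeros(steps, bool)
          post = np.zeros(steps, bool)
          pre[(np.asarray(pre_spikes) / dt).astype(int)] = True
          post[(np.asarray(post_spikes) / dt).astype(int)] = True

          x = y = 0.0   # presynaptic / postsynaptic traces
          w = w0
          for k in range(steps):
              x += dt * (-x / TAU_PLUS) + (1.0 if pre[k] else 0.0)
              y += dt * (-y / TAU_MINUS) + (1.0 if post[k] else 0.0)
              if post[k]:        # post after pre -> potentiation proportional to x
                  w += A_PLUS * x
              if pre[k]:         # pre after post -> depression proportional to y
                  w -= A_MINUS * y
          return w

      # Pre leads post by 5 ms on every pairing -> net potentiation of the weight
      pre_times = np.arange(0.1, 0.9, 0.05)
      print(run_stdp(pre_times, pre_times + 0.005))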

  18. Radioxenon spiked air.

    Science.gov (United States)

    Watrous, Matthew G; Delmore, James E; Hague, Robert K; Houghton, Tracy P; Jenson, Douglas D; Mann, Nick R

    2015-12-01

    Four of the radioactive xenon isotopes ((131m)Xe, (133m)Xe, (133)Xe and (135)Xe) with half-lives ranging from 9 h to 12 days are produced from nuclear fission and can be detected from days to weeks following their production and release. Being inert gases, they are readily transported through the atmosphere. Sources for release of radioactive xenon isotopes include operating nuclear reactors via leaks in fuel rods, medical isotope production facilities, and nuclear weapons' detonations. They are not normally released from fuel reprocessing due to the short half-lives. The Comprehensive Nuclear-Test-Ban Treaty has led to creation of the International Monitoring System. The International Monitoring System, when fully implemented, will consist of one component with 40 stations monitoring radioactive xenon around the globe. Monitoring these radioactive xenon isotopes is important to the Comprehensive Nuclear-Test-Ban Treaty in determining whether a seismically detected event is or is not a nuclear detonation. A variety of radioactive xenon quality control check standards, quantitatively spiked into various gas matrices, could be used to demonstrate that these stations are operating on the same basis in order to bolster defensibility of data across the International Monitoring System. This paper focuses on Idaho National Laboratory's capability to produce three of the xenon isotopes in pure form and the use of the four xenon isotopes in various combinations to produce radioactive xenon spiked air samples that could be subsequently distributed to participating facilities. Copyright © 2015 Elsevier Ltd. All rights reserved.

  19. A simultaneous comparison of acupuncture needle and insulated needle sphenoidal electrodes for detection of anterior temporal spikes.

    Science.gov (United States)

    Chu, N S

    1992-01-01

    Uninsulated acupuncture needles have been used as sphenoidal electrodes, but the issue of insulation has not been adequately addressed. In this report, acupuncture needles and insulated needle sphenoidal electrodes were simultaneously used to compare the rate of spike detection, spike amplitude and distribution of maximal spikes from eight spike foci in seven patients with temporal lobe epilepsy. When compared to the insulated needle electrode, the acupuncture needle electrode was equally effective in spike detection, but spike amplitudes tended to be smaller and maximal spikes were less frequently encountered. Thus, insulation has an influence on the spikes recorded by the acupuncture needle sphenoidal electrode. However, the overall effect appears to be not sufficiently different from the insulated needle electrode for the purpose of detecting anterior temporal spikes in outpatient EEG recordings for the diagnosis of temporal lobe epilepsy.

  20. A Robustness Comparison of Two Algorithms Used for EEG Spike Detection.

    Science.gov (United States)

    Chaibi, Sahbi; Lajnef, Tarek; Ghrob, Abdelbacet; Samet, Mounir; Kachouri, Abdennaceur

    2015-01-01

    Spikes and sharp waves recorded on scalp EEG may play an important role in identifying the epileptogenic network as well as in understanding the central nervous system. Therefore, several automatic and semi-automatic methods have been implemented to detect these two neural transients. A consistent gold standard associated with a high degree of agreement among neuroscientists is required to measure the relevant performance of different methods. In practice, scalp EEG data are often corrupted by artifacts and cannot always serve as gold-standard data. For this reason, intracerebral EEG data mixed with Gaussian noise best resemble scalp EEG output and can serve as a consistent gold standard. In the present framework, we test the robustness of two important methods that have previously been used for the automatic detection of epileptiform transients (spikes and sharp waves), based respectively on the Discrete Wavelet Transform (DWT) and the Continuous Wavelet Transform (CWT). Our purpose is to provide a comparative study of sensitivity and selectivity as the signal-to-noise ratio (SNR) decreases from 10 dB down to -10 dB. The results demonstrate that the DWT approach is more stable in terms of sensitivity, successfully detecting relevant spikes as the SNR decreases, whereas the CWT-based approach is more stable in terms of selectivity, performing better at rejecting false spikes than the DWT approach.
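    The evaluation protocol above (adding noise at a controlled SNR and scoring sensitivity and selectivity against known spike times) can be sketched as follows. A plain amplitude-threshold detector stands in for the DWT/CWT detectors of the paper, and the synthetic trace, spike shape, threshold, and matching tolerance are assumptions.

      # Sketch of the SNR sweep described above with a stand-in threshold detector.
      import numpy as np

      def add_noise(signal, snr_db, rng):
          """Add white Gaussian noise so that the result has the requested SNR in dB."""
          p_sig = np.mean(signal**2)
          p_noise = p_sig / (10 ** (snr_db / 10))
          return signal + rng.standard_normal(signal.size) * np.sqrt(p_noise)

      def detect(signal, thresh):
          """Return sample indices where the signal first crosses above the threshold."""
          above = signal > thresh
          return np.flatnonzero(above & ~np.roll(above, 1))

      def score(detected, truth, tol=10):
          """Sensitivity and selectivity with a +/- tol sample matching window."""
          tp = sum(any(abs(d - t) <= tol for d in detected) for t in truth)
          fp = sum(all(abs(d - t) > tol for t in truth) for d in detected)
          sensitivity = tp / len(truth)
          selectivity = tp / (tp + fp) if tp + fp else 0.0
          return sensitivity, selectivity

      rng = np.random.default_rng(1)
      fs, dur = 1000, 10                        # synthetic 1 kHz, 10 s reference trace
      clean = np.zeros(fs * dur)
      truth = rng.choice(fs * dur - 20, 40, replace=False)
      for t in truth:                           # insert crude triangular "spikes"
          clean[t:t + 10] += np.bartlett(10) * 5

      for snr_db in (10, 0, -10):               # SNR swept as in the abstract
          noisy = add_noise(clean, snr_db, rng)
          sens, sel = score(detect(noisy, thresh=2.5), truth)
          print(f"SNR {snr_db:>3} dB: sensitivity={sens:.2f}, selectivity={sel:.2f}")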

  1. Estimating short-term synaptic plasticity from pre- and postsynaptic spiking.

    Directory of Open Access Journals (Sweden)

    Abed Ghanbari

    2017-09-01

    Full Text Available Short-term synaptic plasticity (STP) critically affects the processing of information in neuronal circuits by reversibly changing the effective strength of connections between neurons on time scales from milliseconds to a few seconds. STP is traditionally studied using intracellular recordings of postsynaptic potentials or currents evoked by presynaptic spikes. However, STP also affects the statistics of postsynaptic spikes. Here we present two model-based approaches for estimating synaptic weights and short-term plasticity from pre- and postsynaptic spike observations alone. We extend a generalized linear model (GLM) that predicts postsynaptic spiking as a function of the observed pre- and postsynaptic spikes and allow the connection strength (coupling term in the GLM) to vary as a function of time based on the history of presynaptic spikes. Our first model assumes that STP follows a Tsodyks-Markram description of vesicle depletion and recovery. In a second model, we introduce a functional description of STP where we estimate the coupling term as a biophysically unrestrained function of the presynaptic inter-spike intervals. To validate the models, we test the accuracy of STP estimation using the spiking of pre- and postsynaptic neurons with known synaptic dynamics. We first test our models using the responses of layer 2/3 pyramidal neurons to simulated presynaptic input with different types of STP, and then use simulated spike trains to examine the effects of spike-frequency adaptation, stochastic vesicle release, spike sorting errors, and common input. We find that, using only spike observations, both model-based methods can accurately reconstruct the time-varying synaptic weights of presynaptic inputs for different types of STP. Our models also capture the differences in postsynaptic spike responses to presynaptic spikes following short vs long inter-spike intervals, similar to results reported for thalamocortical connections. These models may

  2. Estimating short-term synaptic plasticity from pre- and postsynaptic spiking

    Science.gov (United States)

    Malyshev, Aleksey; Stevenson, Ian H.

    2017-01-01

    Short-term synaptic plasticity (STP) critically affects the processing of information in neuronal circuits by reversibly changing the effective strength of connections between neurons on time scales from milliseconds to a few seconds. STP is traditionally studied using intracellular recordings of postsynaptic potentials or currents evoked by presynaptic spikes. However, STP also affects the statistics of postsynaptic spikes. Here we present two model-based approaches for estimating synaptic weights and short-term plasticity from pre- and postsynaptic spike observations alone. We extend a generalized linear model (GLM) that predicts postsynaptic spiking as a function of the observed pre- and postsynaptic spikes and allow the connection strength (coupling term in the GLM) to vary as a function of time based on the history of presynaptic spikes. Our first model assumes that STP follows a Tsodyks-Markram description of vesicle depletion and recovery. In a second model, we introduce a functional description of STP where we estimate the coupling term as a biophysically unrestrained function of the presynaptic inter-spike intervals. To validate the models, we test the accuracy of STP estimation using the spiking of pre- and postsynaptic neurons with known synaptic dynamics. We first test our models using the responses of layer 2/3 pyramidal neurons to simulated presynaptic input with different types of STP, and then use simulated spike trains to examine the effects of spike-frequency adaptation, stochastic vesicle release, spike sorting errors, and common input. We find that, using only spike observations, both model-based methods can accurately reconstruct the time-varying synaptic weights of presynaptic inputs for different types of STP. Our models also capture the differences in postsynaptic spike responses to presynaptic spikes following short vs long inter-spike intervals, similar to results reported for thalamocortical connections. These models may thus be useful
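    The first model above assumes that STP follows a Tsodyks-Markram description of vesicle depletion and recovery. The sketch below evaluates that description at presynaptic spike times only; it is not the authors' GLM estimation code, and the utilization parameter, time constants, and test spike train are assumptions.

      # Minimal sketch of the Tsodyks-Markram short-term plasticity update evaluated
      # at presynaptic spike times; parameter values are illustrative assumptions.
      import numpy as np

      def tm_amplitudes(spike_times, U=0.3, tau_rec=0.5, tau_facil=0.1, w=1.0):
          """Return the effective synaptic amplitude w*u*R at each presynaptic spike."""
          u, R = U, 1.0
          amps, last = [], None
          for t in spike_times:
              if last is not None:
                  dt = t - last
                  R = 1.0 - (1.0 - R) * np.exp(-dt / tau_rec)   # resource recovery
                  u = U + (u - U) * np.exp(-dt / tau_facil)     # facilitation decay
              u = u + U * (1.0 - u)        # facilitation jump at the spike
              amps.append(w * u * R)       # effective coupling for this spike
              R = R - u * R                # vesicle depletion
              last = t
          return np.array(amps)

      # A 20 Hz train: with these parameters the amplitudes depress across the train
      print(tm_amplitudes(np.arange(0.0, 0.5, 0.05)))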

  3. Simple model of spiking neurons.

    Science.gov (United States)

    Izhikevich, E M

    2003-01-01

    A model is presented that reproduces spiking and bursting behavior of known types of cortical neurons. The model combines the biological plausibility of Hodgkin-Huxley-type dynamics and the computational efficiency of integrate-and-fire neurons. Using this model, one can simulate tens of thousands of spiking cortical neurons in real time (1 ms resolution) using a desktop PC.
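    For reference, a minimal sketch of the two-variable model described above, integrated with a simple Euler scheme, is given below. The parameter values correspond to the commonly quoted regular-spiking setting; the time step and the constant input current are assumptions of this sketch.

      # Sketch of the Izhikevich (2003) neuron model: v' = 0.04v^2 + 5v + 140 - u + I,
      # u' = a(bv - u), with reset v <- c, u <- u + d whenever v reaches 30 mV.
      import numpy as np

      def izhikevich(I=10.0, a=0.02, b=0.2, c=-65.0, d=8.0, T=1000.0, dt=0.25):
          """Simulate one neuron for T ms; return spike times (ms) and the voltage trace."""
          steps = int(T / dt)
          v, u = -65.0, b * -65.0
          vs, spikes = np.empty(steps), []
          for k in range(steps):
              v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
              u += dt * a * (b * v - u)
              if v >= 30.0:            # spike cutoff and reset
                  vs[k] = 30.0
                  v, u = c, u + d
                  spikes.append(k * dt)
              else:
                  vs[k] = v
          return np.array(spikes), vs

      spikes, _ = izhikevich()
      print(spikes.size, "spikes in 1 s of simulated time")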

  4. Coronavirus spike-receptor interactions

    NARCIS (Netherlands)

    Mou, H.

    2015-01-01

    Coronaviruses cause important diseases in humans and animals. Coronavirus infection starts with the virus binding with its spike proteins to molecules present on the surface of host cells that act as receptors. This spike-receptor interaction is highly specific and determines the virus’ cell, tissue

  5. Spike Pattern Structure Influences Synaptic Efficacy Variability Under STDP and Synaptic Homeostasis. I: Spike Generating Models on Converging Motifs

    Directory of Open Access Journals (Sweden)

    Zedong eBi

    2016-02-01

    Full Text Available In neural systems, synaptic plasticity is usually driven by spike trains. Due to the inherent noise of neurons and synapses as well as the randomness of connection details, spike trains typically exhibit variability such as spatial randomness and temporal stochasticity, resulting in variability of synaptic changes under plasticity, which we call efficacy variability. How the variability of spike trains influences the efficacy variability of synapses remains unclear. In this paper, we try to understand this influence under pair-wise additive spike-timing dependent plasticity (STDP) when the mean strength of plastic synapses into a neuron is bounded (synaptic homeostasis). Specifically, we systematically study, analytically and numerically, how four aspects of statistical features, i.e. synchronous firing, burstiness/regularity, heterogeneity of rates and heterogeneity of cross-correlations, as well as their interactions, influence the efficacy variability in converging motifs (simple networks in which one neuron receives input from many other neurons). Neurons (including the post-synaptic neuron) in a converging motif generate spikes according to statistical models with tunable parameters. In this way, we can explicitly control the statistics of the spike patterns and investigate their influence on the efficacy variability, without worrying about feedback from synaptic changes onto the dynamics of the post-synaptic neuron. We separate efficacy variability into two parts: the drift part (DriftV), induced by the heterogeneity of change rates of different synapses, and the diffusion part (DiffV), induced by weight diffusion caused by the stochasticity of spike trains. Our main findings are: (1) synchronous firing and burstiness tend to increase DiffV, (2) heterogeneity of rates induces DriftV when potentiation and depression in STDP are not balanced, and (3) heterogeneity of cross-correlations induces DriftV together with heterogeneity of rates. We anticipate our

  6. Power-law inter-spike interval distributions infer a conditional maximization of entropy in cortical neurons.

    Directory of Open Access Journals (Sweden)

    Yasuhiro Tsubo

    Full Text Available The brain is considered to use a relatively small amount of energy for its efficient information processing. Under a severe restriction on energy consumption, the maximization of mutual information (MMI), which is adequate for designing artificial processing machines, may not suit the brain. The MMI attempts to send information as accurately as possible, and this usually requires a sufficient energy supply for establishing clearly discretized communication bands. Here, we derive an alternative hypothesis for the neural code from neuronal activities recorded juxtacellularly in the sensorimotor cortex of behaving rats. Our hypothesis states that in vivo cortical neurons maximize the entropy of neuronal firing under two constraints, one limiting the energy consumption (as assumed previously) and one restricting the uncertainty in output spike sequences at a given firing rate. Thus, the conditional maximization of firing-rate entropy (CMFE) solves a tradeoff between the energy cost and noise in the neuronal response. In short, the CMFE sends a rich variety of information through broader communication bands (i.e., widely distributed firing rates) at the cost of accuracy. We demonstrate that the CMFE is reflected in the long-tailed, typically power-law, distributions of inter-spike intervals obtained for the majority of recorded neurons. In other words, the power-law tails are more consistent with the CMFE than with the MMI. Thus, we propose a mathematical principle by which cortical neurons may represent information about synaptic input in their output spike trains.

  7. Reliability of spike and burst firing in thalamocortical relay cells.

    Science.gov (United States)

    Zeldenrust, Fleur; Chameau, Pascal J P; Wadman, Wytse J

    2013-12-01

    The reliability and precision of spike timing in a spike train is an important aspect of neuronal coding. We investigated reliability in thalamocortical relay (TCR) cells in the acute slice and in a Morris-Lecar model with several extensions. A frozen Gaussian noise current, superimposed on a DC current, was injected into the TCR cell soma. The neuron responded with spike trains that showed trial-to-trial variability due to, among other factors, slow changes in its internal state and the experimental setup. The DC current allowed the neuron to be brought into different states, characterized by a well-defined membrane voltage (between -80 and -50 mV) and by a specific firing regime that, on depolarization, gradually shifted from a predominantly bursting regime to a tonic spiking regime. The filtered frozen white noise generated a spike pattern output with a broad spike interval distribution. The coincidence factor and the Hunter and Milton measure were used as reliability measures of the output spike train. In the experimental TCR cell as well as in the Morris-Lecar model cell, the reliability depends on the shape (steepness) of the current input versus spike frequency output curve. The model also allowed us to study the contribution of three relevant ionic membrane currents to reliability: a T-type calcium current, a cation-selective h-current and a calcium-dependent potassium current, in order to allow bursting, investigate the consequences of a more complex current-frequency relation and produce realistic firing rates. The reliability of the output of the TCR cell increases with depolarization. In hyperpolarized states, bursts are more reliable than single spikes. The analytically derived relations were capable of predicting several of the experimentally recorded spike features.

  8. When less is more: non-monotonic spike sequence processing in neurons.

    Directory of Open Access Journals (Sweden)

    Hinrich Arnoldt

    2015-02-01

    Full Text Available Fundamental response properties of neurons centrally underlie the computational capabilities of both individual nerve cells and neural networks. Most studies on neuronal input-output relations have focused on continuous-time inputs such as constant or noisy sinusoidal currents. Yet most neurons communicate via exchanging action potentials (spikes) at discrete times. Here, we systematically analyze the stationary spiking response to regular spiking inputs and reveal that it is generically non-monotonic. Our theoretical analysis shows that the underlying mechanism relies solely on a combination of the discrete nature of communication by spikes, the capability of locking output to input spikes, and the limited resources required for spike processing. Numerical simulations of mathematically idealized and biophysically detailed models, as well as neurophysiological experiments, confirm and illustrate our theoretical predictions.

  9. Short-Range Temporal Interactions in Sleep; Hippocampal Spike Avalanches Support a Large Milieu of Sequential Activity Including Replay.

    Directory of Open Access Journals (Sweden)

    J Matthew Mahoney

    Full Text Available Hippocampal neural systems consolidate multiple complex behaviors into memory. However, the temporal structure of neural firing supporting complex memory consolidation is unknown. Replay of hippocampal place cells during sleep supports the view that a simple repetitive behavior modifies sleep firing dynamics, but does not explain how multiple episodes could be integrated into associative networks for recollection during future cognition. Here we decode the sequential firing structure within spike avalanches of all pyramidal cells recorded in sleeping rats after running on a circular track. We find that short sequences that combine into multiple long sequences capture the majority of the sequential structure during sleep, including replay of hippocampal place cells. The ensemble, however, is not optimized for maximally producing the behavior-enriched episode. Thus, behavioral programming of sequential correlations occurs at the level of short-range interactions, not whole behavioral sequences, and these short sequences are assembled into a large and complex milieu that could support complex memory consolidation.

  10. The visual development of hand-centered receptive fields in a neural network model of the primate visual system trained with experimentally recorded human gaze changes

    OpenAIRE

    Galeazzi, Juan M.; Navajas, Joaquin; Mender, Bedeho M. W.; Quian Quiroga, Rodrigo; Minini, Loredana; Stringer, Simon M.

    2016-01-01

    ABSTRACT Neurons have been found in the primate brain that respond to objects in specific locations in hand-centered coordinates. A key theoretical challenge is to explain how such hand-centered neuronal responses may develop through visual experience. In this paper we show how hand-centered visual receptive fields can develop using an artificial neural network model, VisNet, of the primate visual system when driven by gaze changes recorded from human test subjects as they completed a jigsaw....

  11. Changes in complex spike activity during classical conditioning

    Directory of Open Access Journals (Sweden)

    Anders eRasmussen

    2014-08-01

    Full Text Available The cerebellar cortex is necessary for adaptively timed conditioned responses (CRs) in eyeblink conditioning. During conditioning, Purkinje cells acquire pause responses or Purkinje cell CRs to the conditioned stimuli (CS), resulting in disinhibition of the cerebellar nuclei (CN), allowing them to activate motor nuclei that control eyeblinks. This disinhibition also causes inhibition of the inferior olive (IO) via the nucleo-olivary pathway (N-O). Activation of the IO, which relays the unconditional stimulus (US) to the cortex, elicits characteristic complex spikes in Purkinje cells. Although Purkinje cell activity, as well as stimulation of the CN, is known to influence IO activity, much remains to be learned about the way that learned changes in simple spike firing affect the IO. In the present study, we analyzed changes in simple and complex spike firing in extracellular Purkinje cell records from the C3 zone of decerebrate ferrets undergoing training in a conditioning paradigm. In agreement with the N-O feedback hypothesis, acquisition resulted in a gradual decrease in complex spike activity during the conditioned stimulus, with a delay that is consistent with the long N-O latency. Also supporting the feedback hypothesis, training with a short interstimulus interval (ISI), which does not lead to acquisition of a Purkinje cell CR, did not cause a suppression of complex spike activity. In contrast, the observations that extinction did not lead to a recovery in complex spike activity, and the irregular patterns of simple and complex spike activity after the conditioned stimulus, are less conclusive.

  12. A Simple Deep Learning Method for Neuronal Spike Sorting

    Science.gov (United States)

    Yang, Kai; Wu, Haifeng; Zeng, Yu

    2017-10-01

    Spike sorting is one of the key techniques for understanding brain activity. With the development of modern electrophysiology technology, recent multi-electrode technologies have become able to record the spiking activity of thousands of neurons simultaneously. In this case, spike sorting increases the computational complexity of conventional sorting algorithms. In this paper, we focus on how to reduce this complexity and introduce a deep learning algorithm, the principal component analysis network (PCANet), for spike sorting. The introduced method starts from a conventional model and establishes a Toeplitz matrix. From the column vectors of the matrix, we train a PCANet, from which eigenvalue vectors of the spikes are extracted. Finally, a support vector machine (SVM) is used to sort the spikes. In experiments, we choose two groups of simulated data from publicly available databases and compare the introduced method with conventional methods. The results indicate that the introduced method indeed has lower complexity with the same sorting errors as the conventional methods.
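    As a simplified stand-in for the pipeline above, the sketch below projects spike waveforms onto ordinary PCA components and classifies them with an SVM using scikit-learn. It is not PCANet itself, and the synthetic two-unit waveforms and labels are assumptions.

      # Simplified stand-in: PCA features + SVM classification of spike waveforms.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.svm import SVC
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      t = np.linspace(0, 1, 48)                                     # 48-sample snippets
      proto = np.vstack([np.exp(-((t - 0.3) / 0.05) ** 2),          # unit A template
                         0.6 * np.exp(-((t - 0.5) / 0.10) ** 2)])   # unit B template
      labels = rng.integers(0, 2, 600)
      waveforms = proto[labels] + 0.15 * rng.standard_normal((600, t.size))

      X = PCA(n_components=3).fit_transform(waveforms)   # low-dimensional spike features
      Xtr, Xte, ytr, yte = train_test_split(X, labels, test_size=0.3, random_state=0)
      clf = SVC(kernel="rbf").fit(Xtr, ytr)
      print("sorting accuracy:", clf.score(Xte, yte))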

  13. Neuronify: An Educational Simulator for Neural Circuits.

    Science.gov (United States)

    Dragly, Svenn-Arne; Hobbi Mobarhan, Milad; Våvang Solbrå, Andreas; Tennøe, Simen; Hafreager, Anders; Malthe-Sørenssen, Anders; Fyhn, Marianne; Hafting, Torkel; Einevoll, Gaute T

    2017-01-01

    Educational software (apps) can improve science education by providing an interactive way of learning about complicated topics that are hard to explain with text and static illustrations. However, few educational apps are available for the simulation of neural networks. Here, we describe an educational app, Neuronify, allowing the user to easily create and explore neural networks in a plug-and-play simulation environment. The user can pick network elements with adjustable parameters from a menu, i.e., synaptically connected neurons modelled as integrate-and-fire neurons and various stimulators (current sources, spike generators, visual, and touch) and recording devices (voltmeter, spike detector, and loudspeaker). We aim to provide a low entry point to simulation-based neuroscience by allowing students with no programming experience to create and simulate neural networks. To facilitate the use of Neuronify in teaching, a set of premade common network motifs is provided, performing functions such as input summation, gain control by inhibition, and detection of the direction of stimulus movement. Neuronify is developed in C++ and QML using the cross-platform application framework Qt and runs on smart phones (Android, iOS) and tablet computers as well as personal computers (Windows, Mac, Linux).

  14. Intra-day signal instabilities affect decoding performance in an intracortical neural interface system

    Science.gov (United States)

    Perge, János A.; Homer, Mark L.; Malik, Wasim Q.; Cash, Sydney; Eskandar, Emad; Friehs, Gerhard; Donoghue, John P.; Hochberg, Leigh R.

    2013-06-01

    Objective. Motor neural interface systems (NIS) aim to convert neural signals into motor prosthetic or assistive device control, allowing people with paralysis to regain movement or control over their immediate environment. Effector or prosthetic control can degrade if the relationship between recorded neural signals and intended motor behavior changes. Therefore, characterizing both biological and technological sources of signal variability is important for a reliable NIS. Approach. To address the frequency and causes of neural signal variability in a spike-based NIS, we analyzed within-day fluctuations in spiking activity and action potential amplitude recorded with silicon microelectrode arrays implanted in the motor cortex of three people with tetraplegia (BrainGate pilot clinical trial, IDE). Main results. 84% of the recorded units showed a statistically significant change in apparent firing rate (3.8 ± 8.71 Hz or 49% of the mean rate) across several-minute epochs of tasks performed on a single session, and 74% of the units showed a significant change in spike amplitude (3.7 ± 6.5 µV or 5.5% of mean spike amplitude). 40% of the recording sessions showed a significant correlation in the occurrence of amplitude changes across electrodes, suggesting array micro-movement. Despite the relatively frequent amplitude changes, only 15% of the observed within-day rate changes originated from recording artifacts such as spike amplitude change or electrical noise, while 85% of the rate changes most likely emerged from physiological mechanisms. Computer simulations confirmed that systematic rate changes of individual neurons could produce a directional ‘bias’ in the decoded neural cursor movements. Instability in apparent neuronal spike rates indeed yielded a directional bias in 56% of all performance assessments in participant cursor control (n = 2 participants, 108 and 20 assessments over two years), resulting in suboptimal performance in these sessions

  15. The Mechanisms of Repetitive Spike Generation in an Axonless Retinal Interneuron

    Directory of Open Access Journals (Sweden)

    Mark S. Cembrowski

    2012-02-01

    Full Text Available Several types of retinal interneurons exhibit spikes but lack axons. One such neuron is the AII amacrine cell, in which spikes recorded at the soma exhibit small amplitudes and brief durations. Here, we used electrophysiological recordings and computational analysis to examine the mechanisms underlying this atypical spiking. We found that somatic spikes likely represent large, brief action potential-like events initiated in a single, electrotonically distal dendritic compartment. In this same compartment, spiking undergoes slow modulation, likely by an M-type K conductance. The structural correlate of this compartment is a thin neurite that extends from the primary dendritic tree: local application of TTX to this neurite, or excision of it, eliminates spiking. Thus, the physiology of the axonless AII is much more complex than would be anticipated from morphological descriptions and somatic recordings; in particular, the AII possesses a single dendritic structure that controls its firing pattern.

  16. Population activity statistics dissect subthreshold and spiking variability in V1.

    Science.gov (United States)

    Bányai, Mihály; Koman, Zsombor; Orbán, Gergő

    2017-07-01

    Response variability, as measured by fluctuating responses upon repeated performance of trials, is a major component of neural responses, and its characterization is key to interpret high dimensional population recordings. Response variability and covariability display predictable changes upon changes in stimulus and cognitive or behavioral state, providing an opportunity to test the predictive power of models of neural variability. Still, there is little agreement on which model to use as a building block for population-level analyses, and models of variability are often treated as a subject of choice. We investigate two competing models, the doubly stochastic Poisson (DSP) model assuming stochasticity at spike generation, and the rectified Gaussian (RG) model tracing variability back to membrane potential variance, to analyze stimulus-dependent modulation of both single-neuron and pairwise response statistics. Using a pair of model neurons, we demonstrate that the two models predict similar single-cell statistics. However, DSP and RG models have contradicting predictions on the joint statistics of spiking responses. To test the models against data, we build a population model to simulate stimulus change-related modulations in pairwise response statistics. We use single-unit data from the primary visual cortex (V1) of monkeys to show that while model predictions for variance are qualitatively similar to experimental data, only the RG model's predictions are compatible with joint statistics. These results suggest that models using Poisson-like variability might fail to capture important properties of response statistics. We argue that membrane potential-level modeling of stochasticity provides an efficient strategy to model correlations.NEW & NOTEWORTHY Neural variability and covariability are puzzling aspects of cortical computations. For efficient decoding and prediction, models of information encoding in neural populations hinge on an appropriate model of

  17. A multi-channel low-power system-on-chip for single-unit recording and narrowband wireless transmission of neural signal.

    Science.gov (United States)

    Bonfanti, A; Ceravolo, M; Zambra, G; Gusmeroli, R; Spinelli, A S; Lacaita, A L; Angotzi, G N; Baranauskas, G; Fadiga, L

    2010-01-01

    This paper reports a multi-channel neural recording system-on-chip (SoC) with digital data compression and wireless telemetry. The circuit consists of 16 amplifiers, an analog time-division multiplexer, an 8-bit SAR AD converter, a digital signal processor (DSP) and a wireless narrowband 400-MHz binary FSK transmitter. Even though only 16 amplifiers are present in the current die version, the whole system is designed to work with 64 channels, demonstrating the feasibility of digital processing and narrowband wireless transmission of 64 neural recording channels. Digital data compression, based on the detection of action potentials and storage of the corresponding waveforms, allows the use of a 1.25-Mbit/s binary FSK wireless transmission. This moderate bit rate and a low-frequency-deviation, Manchester-coded modulation are crucial for exploiting a narrowband wireless link and an efficient embeddable antenna. The chip is realized in a 0.35-μm CMOS process with a power consumption of 105 μW per channel (269 μW per channel with an extended transmission range of 4 m) and an area of 3.1 × 2.7 mm². The transmitted signal is captured by a digital TV tuner and demodulated by a wideband phase-locked loop (PLL), and then sent to a PC via an FPGA module. The system has been tested for electrical specifications and its functionality verified in in vivo neural recording experiments.

  18. Estimating the correlation between bursty spike trains and local field potentials.

    Science.gov (United States)

    Li, Zhaohui; Ouyang, Gaoxiang; Yao, Li; Li, Xiaoli

    2014-09-01

    To further understand rhythmic neuronal synchronization, an increasingly useful method is to determine the relationship between the spiking activity of individual neurons and the local field potentials (LFPs) of neural ensembles. Spike field coherence (SFC) is a widely used method for measuring the synchronization between spike trains and LFPs. However, due to the strong dependency of SFC on the burst index, it is not suitable for analyzing the relationship between bursty spike trains and LFPs, particularly in high frequency bands. To address this issue, we developed a method called weighted spike field correlation (WSFC), which uses the first spike in each burst multiple times to estimate the relationship. In the calculation, the number of times that the first spike is used is equal to the spike count per burst. The performance of this method was demonstrated using simulated bursty spike trains and LFPs, which comprised sinusoids with different frequencies, amplitudes, and phases. This method was also used to estimate the correlation between pyramidal cells in the hippocampus and gamma oscillations in rats performing behaviors. Analyses using simulated and real data demonstrated that the WSFC method is a promising measure for estimating the correlation between bursty spike trains and high frequency LFPs. Copyright © 2014 Elsevier Ltd. All rights reserved.
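    The weighting idea described above (reusing the first spike of each burst in proportion to that burst's spike count) can be illustrated with a weighted spike-triggered average of the LFP, as sketched below. This shows only the weighting step, not the full WSFC or coherence computation, and the synthetic LFP, burst onsets, and per-burst spike counts are assumptions.

      # Weighted spike-triggered average: each burst-onset LFP segment is weighted by
      # the number of spikes in that burst (a sketch of the weighting idea only).
      import numpy as np

      def weighted_sta(lfp, burst_onsets, spikes_per_burst, half_win):
          """Weighted average of LFP segments centered on burst-onset samples."""
          segs, weights = [], []
          for onset, count in zip(burst_onsets, spikes_per_burst):
              if onset - half_win < 0 or onset + half_win >= lfp.size:
                  continue
              segs.append(lfp[onset - half_win:onset + half_win])
              weights.append(count)      # first spike of the burst reused `count` times
          return np.average(segs, axis=0, weights=weights)

      rng = np.random.default_rng(2)
      fs = 1000
      t = np.arange(0, 20, 1 / fs)
      lfp = np.sin(2 * np.pi * 60 * t) + 0.5 * rng.standard_normal(t.size)  # 60 Hz "gamma"
      onsets = rng.integers(100, t.size - 100, 50)      # burst-onset sample indices
      counts = rng.integers(1, 6, 50)                   # spikes per burst
      print(weighted_sta(lfp, onsets, counts, half_win=50).shape)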

  19. Characterization of Early Cortical Neural Network ...

    Science.gov (United States)

    We examined the development of neural network activity using microelectrode array (MEA) recordings made in multi-well MEA plates (mwMEAs) over the first 12 days in vitro (DIV). In primary cortical cultures made from postnatal rats, action potential spiking activity was essentially absent on DIV 2 and developed rapidly between DIV 5 and 12. Spiking activity was primarily sporadic and unorganized at early DIV, and became progressively more organized with time in culture, with bursting parameters, synchrony and network bursting increasing between DIV 5 and 12. We selected 12 features to describe network activity and principal components analysis using these features demonstrated a general segregation of data by age at both the well and plate levels. Using a combination of random forest classifiers and Support Vector Machines, we demonstrated that 4 features (CV of within burst ISI, CV of IBI, network spike rate and burst rate) were sufficient to predict the age (either DIV 5, 7, 9 or 12) of each well recording with >65% accuracy. When restricting the classification problem to a binary decision, we found that classification improved dramatically, e.g. 95% accuracy for discriminating DIV 5 vs DIV 12 wells. Further, we present a novel resampling approach to determine the number of wells that might be needed for conducting comparisons of different treatments using mwMEA plates. Overall, these results demonstrate that network development on mwMEA plates is similar to
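    The classification step described above can be sketched with scikit-learn as follows: a random forest predicts the in vitro age of a recording from a handful of network-activity features. The feature values here are synthetic stand-ins generated to drift with age; they are not the MEA data, and the class sizes and noise levels are assumptions.

      # Sketch of age classification from network-activity features (synthetic data).
      import numpy as np
      from sklearn.ensemble import RandomForestClassifier
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      ages = np.repeat([5, 7, 9, 12], 48)          # DIV label per simulated well
      # Four features per well (stand-ins for CV of within-burst ISI, CV of IBI,
      # network spike rate, burst rate), drawn so that they drift with age plus noise.
      X = np.column_stack([
          1.5 - 0.05 * ages + 0.2 * rng.standard_normal(ages.size),
          2.0 - 0.08 * ages + 0.3 * rng.standard_normal(ages.size),
          0.5 * ages + 2.0 * rng.standard_normal(ages.size),
          0.3 * ages + 1.0 * rng.standard_normal(ages.size),
      ])

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      print("cross-validated accuracy:", cross_val_score(clf, X, ages, cv=5).mean())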

  20. Tight neurovascular coupling in a rat model of quasi-periodic interictal spiking using multispectral optical imaging

    Science.gov (United States)

    Pouliot, Philippe; Truong, Van Tri; Zhang, Cong; Dubeau, Simon; Lesage, Frédéric

    2012-10-01

    The hemodynamic responses to 4-aminopyridine (4-AP) induced focal epileptic spikes and to electrical stimulations are compared in a rat model. Nonlinearities are quantified with biophysical models, and supranormal oxygen consumption from epileptic spikes is inferred. In one recording, interictal spikes followed an almost periodic pattern. Such rhythmic spiking is a well-documented phenomenon in electrophysiological studies, but its hemodynamic correlates have been less studied. Spikes occurred every 12.5 +/- 1.0 s. Peaks in total hemoglobin (HbT), a proxy for regional cerebral blood volume, followed spikes by 2.6 +/- 0.3 s, and troughs in HbT preceded spikes by 1.68 +/- 1.2 s. The narrowness of this distribution is surprising: from it, one might infer a significant but paradoxical fall in HbT several seconds before each spike, but this decrease in HbT is better interpreted as being due to the interictal spike that occurred before.

  1. Distinct temporal spike and local field potential activities in the thalamic parafascicular nucleus of parkinsonian rats during rest and limb movement.

    Science.gov (United States)

    Wang, Min; Qu, Qingyang; He, Tingting; Li, Min; Song, Zhimin; Chen, Feiyu; Zhang, Xiao; Xie, Jinlu; Geng, Xiwen; Yang, Maoquan; Wang, Xiusong; Lei, Chengdong; Hou, Yabing

    2016-08-25

    Several studies have suggested that the thalamic centromedian-parafascicular complex (CM/PF, or PF in rodents) is implicated in the pathophysiology of Parkinson's disease (PD). However, inconsistent changes in neuronal firing rate and pattern have been reported in parkinsonian animals. To investigate the impact of a dopaminergic cell lesion on PF extracellular discharge in behaving rats, PF neural activity in spikes and local field potentials (LFPs) was recorded in unilaterally 6-hydroxydopamine- (6-OHDA) lesioned and neurologically intact control rats during rest and limb movement. During rest, the two PF neuronal subtypes were less spontaneously active, with no difference in spike firing rates between the control and lesioned rats; only the lesioned rats reshaped their spike firing pattern. Furthermore, the simultaneously recorded LFP in the lesioned rats exhibited a significant increase in power at 12-35 Hz and 35-70 Hz and a decrease in power at 0.7-12 Hz. During the execution of voluntary movement, two subtypes of PF neurons were identified by a rapid increase in discharge activity in both the control and lesioned rats. However, dopamine lesioning was associated with a decrease in neuronal firing rate and a reshaping of the firing pattern in the PF. The simultaneously recorded LFP activity exhibited a significant increase in power at 12-35 Hz and a decrease in power at 0.7-12 Hz compared with the control rats. These findings indicate that 6-OHDA induces modifications in PF spike and LFP activities in rats during rest and movement, and suggest that PF dysfunction may be an important contributor to the pathophysiology of parkinsonian motor impairment. Copyright © 2016 IBRO. Published by Elsevier Ltd. All rights reserved.

  2. Spiking neuron network Helmholtz machine

    National Research Council Canada - National Science Library

    Sountsov, Pavel; Miller, Paul

    2015-01-01

    .... This paper aims to unify the two fields of probabilistic inference and synaptic plasticity by using a neuronal network of realistic model spiking neurons to implement a well-studied computational...

  3. Wavelet analysis of epileptic spikes

    CERN Document Server

    Latka, M; Kozik, A; West, B J; Latka, Miroslaw; Was, Ziemowit; Kozik, Andrzej; West, Bruce J.

    2003-01-01

    Interictal spikes and sharp waves in the human EEG are characteristic signatures of epilepsy. These potentials originate from the synchronous, pathological discharge of many neurons. The reliable detection of such potentials has been a long-standing problem in EEG analysis, especially since long-term monitoring became common in the investigation of epileptic patients. The traditional definition of a spike is based on its amplitude, duration, sharpness, and emergence from its background. However, spike detection systems built solely around this definition are not reliable due to the presence of numerous transients and artifacts. We use the wavelet transform to analyze the properties of EEG manifestations of epilepsy. We demonstrate that the behavior of the wavelet transform of epileptic spikes across scales can constitute the foundation of a relatively simple yet effective detection algorithm.
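    A generic sketch of such a wavelet-based detector is given below: the EEG is convolved with Ricker ("Mexican hat") wavelets at several scales, and samples whose coefficients stand out from the background at every scale are flagged. This is not the authors' algorithm; the synthetic EEG, the scales, and the threshold are assumptions.

      # Generic multi-scale wavelet spike detector (a sketch, not the paper's method).
      import numpy as np

      def ricker(points, a):
          """Ricker (Mexican hat) wavelet of width parameter `a`, `points` samples long."""
          t = np.arange(points) - (points - 1) / 2.0
          A = 2.0 / (np.sqrt(3.0 * a) * np.pi ** 0.25)
          return A * (1.0 - (t / a) ** 2) * np.exp(-(t ** 2) / (2.0 * a ** 2))

      def cwt_detect(x, scales=(2, 4, 8), z_thresh=4.0):
          """Indices where |CWT coefficient| exceeds z_thresh robust SDs at every scale."""
          hits = np.ones(x.size, dtype=bool)
          for a in scales:
              c = np.convolve(x, ricker(int(10 * a), a), mode="same")
              mad = np.median(np.abs(c - np.median(c))) / 0.6745   # robust SD estimate
              hits &= np.abs(c) > z_thresh * mad
          return np.flatnonzero(hits)

      rng = np.random.default_rng(3)
      eeg = rng.standard_normal(5000)
      eeg[2500:2520] += 10.0 * ricker(20, 4)      # embed one spike-like transient
      print(cwt_detect(eeg))                      # detections cluster around sample 2510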

  4. Millisecond precision spike timing shapes tactile perception.

    Science.gov (United States)

    Mackevicius, Emily L; Best, Matthew D; Saal, Hannes P; Bensmaia, Sliman J

    2012-10-31

    In primates, the sense of touch has traditionally been considered to be a spatial modality, drawing an analogy to the visual system. In this view, stimuli are encoded in spatial patterns of activity over the sheet of receptors embedded in the skin. We propose that the spatial processing mode is complemented by a temporal one. Indeed, the transduction and processing of complex, high-frequency skin vibrations have been shown to play an important role in tactile texture perception, and the frequency composition of vibrations shapes the evoked percept. Mechanoreceptive afferents innervating the glabrous skin exhibit temporal patterning in their responses, but the importance and behavioral relevance of spike timing, particularly for naturalistic stimuli, remains to be elucidated. Based on neurophysiological recordings from Rhesus macaques, we show that spike timing conveys information about the frequency composition of skin vibrations, both for individual afferents and for afferent populations, and that the temporal fidelity varies across afferent class. Furthermore, the perception of skin vibrations, measured in human subjects, is better predicted when spike timing is taken into account, and the resolution that predicts perception best matches the optimal resolution of the respective afferent classes. In light of these results, the peripheral representation of complex skin vibrations draws a powerful analogy with the auditory and vibrissal systems.

  5. Spike count, spike timing and temporal information in the cortex of awake, freely moving rats

    Science.gov (United States)

    Scaglione, Alessandro; Foffani, Guglielmo; Moxon, Karen A.

    2014-08-01

    Objective. Sensory processing of peripheral information is not stationary but is, in general, a dynamic process related to the behavioral state of the animal. Yet the link between the state of the behavior and the encoding properties of neurons is unclear. This report investigates the impact of the behavioral state on the encoding mechanisms used by cortical neurons for both detection and discrimination of somatosensory stimuli in awake, freely moving, rats. Approach. Neuronal activity was recorded from the primary somatosensory cortex of five rats under two different behavioral states (quiet versus whisking) while electrical stimulation of increasing stimulus strength was delivered to the mystacial pad. Information theoretical measures were then used to measure the contribution of different encoding mechanisms to the information carried by neurons in response to the whisker stimulation. Main results. We found that the behavioral state of the animal modulated the total amount of information conveyed by neurons and that the timing of individual spikes increased the information compared to the total count of spikes alone. However, the temporal information, i.e. information exclusively related to when the spikes occur, was not modulated by behavioral state. Significance. We conclude that information about somatosensory stimuli is modulated by the behavior of the animal and this modulation is mainly expressed in the spike count while the temporal information is more robust to changes in behavioral state.

  6. Paired Stimulation for Spike-Timing-Dependent Plasticity in Primate Sensorimotor Cortex

    Science.gov (United States)

    Seeman, Stephanie C.

    2017-01-01

    Classic in vitro studies have described spike-timing-dependent plasticity (STDP) at a synapse: the connection from neuron A to neuron B is strengthened (or weakened) when A fires before (or after) B within an optimal time window. Accordingly, more recent in vivo work has demonstrated behavioral effects consistent with an STDP mechanism; however, many studies relied on single-unit recordings. The ability to modify cortical connections becomes useful in the context of injury, when connectivity and associated behavior are compromised. To avoid the need for long-term, stable isolation of single units, one could control timed activation of two cortical sites with paired electrical stimulation. We tested the hypothesis that STDP could be induced via prolonged paired stimulation, as quantified by cortical evoked potentials (EPs) in the sensorimotor cortex of awake, behaving monkeys. Paired stimulation between two interconnected sites produced robust effects in EPs consistent with STDP, but only at 2/15 tested pairs. The stimulation protocol often produced increases in global network excitability or depression of the conditioned pair. Together, these results suggest that paired stimulation in vivo is a viable method to induce STDP between cortical populations, but that factors beyond activation timing must be considered to produce conditioning effects. SIGNIFICANCE STATEMENT Plasticity of neural connections is important for development, learning, memory, and recovery from injury. Cellular mechanisms underlying spike-timing-dependent plasticity have been studied extensively in vitro. Recent in vivo work has demonstrated results consistent with the previously defined cellular mechanisms; however, the output measure in these studies was typically an indirect assessment of plasticity at the neural level. Here, we show direct plasticity in recordings of neuronal populations in awake, behaving nonhuman primates induced by paired electrical stimulation. In contrast to in vitro studies

  7. Unbiased estimation of precise temporal correlations between spike trains.

    Science.gov (United States)

    Stark, Eran; Abeles, Moshe

    2009-04-30

    A key issue in systems neuroscience is the contribution of precise temporal inter-neuronal interactions to information processing in the brain, and the main analytical tool used for studying pair-wise interactions is the cross-correlation histogram (CCH). Although simple to generate, a CCH is influenced by multiple factors in addition to precise temporal correlations between two spike trains, thus complicating its interpretation. A Monte-Carlo-based technique, the jittering method, has been suggested to isolate the contribution of precise temporal interactions to neural information processing. Here, we show that jittering spike trains is equivalent to convolving the CCH derived from the original trains with a finite window and using a Poisson distribution to estimate probabilities. Both procedures over-fit the original spike trains and therefore the resulting statistical tests are biased and have low power. We devise an alternative method, based on convolving the CCH with a partially hollowed window, and illustrate its utility using artificial and real spike trains. The modified convolution method is unbiased, has high power, and is computationally fast. We recommend caution in the use of the jittering method and in the interpretation of results based on it, and suggest using the modified convolution method for detecting precise temporal correlations between spike trains.
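    The modified convolution method described above can be sketched as follows: build the cross-correlation histogram (CCH), estimate each bin's expected count by convolving the CCH with a partially hollowed smoothing window, and score the observed counts against a Poisson assumption. The window shape, hollow fraction, bin size, and simulated spike trains are illustrative assumptions, not the authors' exact procedure.

      # Sketch of CCH construction plus a partially hollowed convolution predictor.
      import numpy as np
      from scipy.stats import poisson

      def cch(ref, tgt, bin_s=0.001, max_lag_s=0.05):
          """Counts of target spikes at signed lags relative to each reference spike."""
          edges = np.arange(-max_lag_s, max_lag_s + bin_s, bin_s)
          lags = np.concatenate([tgt - t for t in ref])
          lags = lags[np.abs(lags) <= max_lag_s]
          counts, _ = np.histogram(lags, edges)
          return counts, 0.5 * (edges[:-1] + edges[1:])

      def hollowed_predictor(counts, sd_bins=5, hollow=0.6):
          """Smooth the CCH with a Gaussian window whose central tap is partially hollowed."""
          k = np.arange(-3 * sd_bins, 3 * sd_bins + 1)
          win = np.exp(-0.5 * (k / sd_bins) ** 2)
          win[k == 0] *= (1.0 - hollow)    # partially remove each bin's own contribution
          win /= win.sum()
          return np.convolve(counts, win, mode="same")

      rng = np.random.default_rng(4)
      ref = np.sort(rng.uniform(0, 100, 2000))              # 20 Hz reference train (100 s)
      tgt = np.sort(np.concatenate([rng.uniform(0, 100, 2000),
                                    ref[::4] + 0.002]))     # excess spikes 2 ms after ref
      counts, lag_centers = cch(ref, tgt)
      expected = hollowed_predictor(counts)
      pvals = poisson.sf(counts - 1, expected)              # P(count >= observed)
      print("most significant lag (ms):", 1000 * lag_centers[np.argmin(pvals)])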

  8. Using Tweedie distributions for fitting spike count data.

    Science.gov (United States)

    Moshitch, Dina; Nelken, Israel

    2014-03-30

    The nature of spike count distributions is of great practical concern for the analysis of neural data. These distributions often have a tendency for 'failures' and a long tail of large counts, and may show a strong dependence of variance on the mean. Furthermore, spike count distributions often show multiplicative rather than additive effects of covariates. We analyzed the responses of neurons in primary auditory cortex to transposed stimuli as a function of interaural time differences (ITD). In more than half of the cases, the variance of neuronal responses showed a supralinear dependence on the mean spike count. We explored the use of the Tweedie family of distributions, which has a supralinear dependence of means on variances. To quantify the effects of ITD on neuronal responses, we used generalized linear models (GLMs), and developed methods for significance testing under the Tweedie assumption. We found the Tweedie distribution to be generally a better fit to the data than the Poisson distribution for over-dispersed responses. Standard analysis of variance wrongly assumes Gaussian distributions with fixed variance and additive effects, but even generalized models under Poisson assumptions may be hampered by the over-dispersion of spike counts. The use of GLMs assuming Tweedie distributions increased the reliability of tests of sensitivity to ITD in our data. When spike count variance depends strongly on the mean, the use of Tweedie distributions for analyzing the data is advised. Copyright © 2014 Elsevier B.V. All rights reserved.
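    A minimal sketch of fitting over-dispersed spike counts with a Tweedie GLM is shown below, using the Tweedie family available in statsmodels (default log link). The simulated ITD-tuned counts and the chosen variance power (1.5) are assumptions; in practice the variance power would be selected from the data.

      # Sketch: Tweedie GLM for over-dispersed, ITD-tuned spike counts (synthetic data).
      import numpy as np
      import statsmodels.api as sm

      rng = np.random.default_rng(5)
      itd = np.repeat(np.linspace(-0.5, 0.5, 9), 40)   # ITD covariate (ms), 40 trials each
      mu = np.exp(1.0 + 1.5 * itd)                     # multiplicative ITD effect on rate
      # Gamma-mixed Poisson counts: variance grows faster than the mean (over-dispersion)
      counts = rng.poisson(rng.gamma(shape=2.0, scale=mu / 2.0))

      X = sm.add_constant(itd)
      res = sm.GLM(counts, X, family=sm.families.Tweedie(var_power=1.5)).fit()
      print(res.params)       # intercept and ITD slope on the log scale
      print(res.pvalues[1])   # significance of the ITD effect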

  9. Data-driven model comparing the effects of glial scarring and interface interactions on chronic neural recordings in non-human primates.

    Science.gov (United States)

    Malaga, Karlo A; Schroeder, Karen E; Patel, Paras R; Irwin, Zachary T; Thompson, David E; Nicole Bentley, J; Lempka, Scott F; Chestek, Cynthia A; Patil, Parag G

    2016-02-01

    We characterized electrode stability over twelve weeks of impedance and neural recording data from four chronically-implanted Utah arrays in two rhesus macaques, and investigated the effects of glial scarring and interface interactions at the electrode recording site on signal quality using a computational model. A finite-element model of a Utah array microelectrode in neural tissue was coupled with a multi-compartmental model of a neuron to quantify the effects of encapsulation thickness, encapsulation resistivity, and interface resistivity on electrode impedance and waveform amplitude. The coupled model was then reconciled with the in vivo data. Histology was obtained seventeen weeks post-implantation to measure gliosis. From week 1-3, mean impedance and amplitude increased at rates of 115.8 kΩ/week and 23.1 μV/week, respectively. This initial ramp up in impedance and amplitude was observed across all arrays, and is consistent with biofouling (increasing interface resistivity) and edema clearing (increasing tissue resistivity), respectively, in the model. Beyond week 3, the trends leveled out. Histology showed that thin scars formed around the electrodes. In the model, scarring could not match the in vivo data. However, a thin interface layer at the electrode tip could. Despite having a large effect on impedance, interface resistivity did not have a noticeable effect on amplitude. This study suggests that scarring does not cause an electrical problem with regard to signal quality since it does not appear to be the main contributor to increasing impedance or significantly affect amplitude unless it displaces neurons. This, in turn, suggests that neural signals can be obtained reliably despite scarring as long as the recording site has sufficiently low impedance after accumulating a thin layer of biofouling. Therefore, advancements in microelectrode technology may be expedited by focusing on improvements to the recording site-tissue interface rather than

  10. Data-driven model comparing the effects of glial scarring and interface interactions on chronic neural recordings in non-human primates

    Science.gov (United States)

    Malaga, Karlo A.; Schroeder, Karen E.; Patel, Paras R.; Irwin, Zachary T.; Thompson, David E.; Bentley, J. Nicole; Lempka, Scott F.; Chestek, Cynthia A.; Patil, Parag G.

    2016-02-01

    Objective. We characterized electrode stability over twelve weeks of impedance and neural recording data from four chronically-implanted Utah arrays in two rhesus macaques, and investigated the effects of glial scarring and interface interactions at the electrode recording site on signal quality using a computational model. Approach. A finite-element model of a Utah array microelectrode in neural tissue was coupled with a multi-compartmental model of a neuron to quantify the effects of encapsulation thickness, encapsulation resistivity, and interface resistivity on electrode impedance and waveform amplitude. The coupled model was then reconciled with the in vivo data. Histology was obtained seventeen weeks post-implantation to measure gliosis. Main results. From week 1-3, mean impedance and amplitude increased at rates of 115.8 kΩ/week and 23.1 μV/week, respectively. This initial ramp up in impedance and amplitude was observed across all arrays, and is consistent with biofouling (increasing interface resistivity) and edema clearing (increasing tissue resistivity), respectively, in the model. Beyond week 3, the trends leveled out. Histology showed that thin scars formed around the electrodes. In the model, scarring could not match the in vivo data. However, a thin interface layer at the electrode tip could. Despite having a large effect on impedance, interface resistivity did not have a noticeable effect on amplitude. Significance. This study suggests that scarring does not cause an electrical problem with regard to signal quality since it does not appear to be the main contributor to increasing impedance or significantly affect amplitude unless it displaces neurons. This, in turn, suggests that neural signals can be obtained reliably despite scarring as long as the recording site has sufficiently low impedance after accumulating a thin layer of biofouling. Therefore, advancements in microelectrode technology may be expedited by focusing on improvements to the

  11. Slow moving neural source in the epileptic hippocampus can mimic progression of human seizures.

    Science.gov (United States)

    Chiang, Chia-Chu; Wei, Xile; Ananthakrishnan, Arvind Keshav; Shivacharan, Rajat S; Gonzalez-Reyes, Luis E; Zhang, Mingming; Durand, Dominique M

    2018-01-24

    Fast and slow neural waves have been observed to propagate in the human brain during seizures. Yet the nature of these waves is difficult to study in a surgical setting. Here, we report an observation of two different traveling waves propagating in the in-vitro epileptic hippocampus at speeds similar to those in the human brain. A fast traveling spike and a slow moving wave were recorded simultaneously with a genetically encoded voltage-sensitive fluorescent protein (VSFP Butterfly 1.2) and a high-speed camera. The results of this study indicate that the fast traveling spike is NMDA-sensitive but the slow moving wave is not. Image analysis and model simulation demonstrate that the slow wave propagates slowly, generates the fast traveling spike, and is therefore a moving source of the epileptiform activity. This slow moving wave is associated with a propagating neural calcium wave detected with calcium dye (OGB-1), but is independent of NMDA receptors, not related to ATP release, and much faster than previously recorded potassium waves. Computer modeling suggests that the slow moving wave can propagate by the ephaptic effect, like epileptiform activity. These findings provide an alternative explanation for slow propagating seizure wavefronts associated with fast propagating spikes.

  12. Robust Spike Sorting of Retinal Ganglion Cells Tuned to Spot Stimuli*

    OpenAIRE

    Ghahari, Alireza; Badea, Tudor C

    2016-01-01

    We propose an automatic spike sorting approach for the data recorded from a microelectrode array during visual stimulation of wild type retinas with 25 spot stimuli. The approach first detects individual spikes per electrode by their signature local minima. With the mixture probability distribution of the local minima estimated afterwards, it applies a minimum-squared-error clustering algorithm to sort the spikes into different clusters. A template waveform for each cluster per electrode is d...

  13. Neurodynamics of biased competition and cooperation for attention: a model with spiking neurons.

    Science.gov (United States)

    Deco, Gustavo; Rolls, Edmund T

    2005-07-01

    Recent neurophysiological experiments have led to a promising "biased competition hypothesis" of the neural basis of attention. According to this hypothesis, attention appears as a sometimes nonlinear property that results from a top-down biasing effect that influences the competitive and cooperative interactions that work both within cortical areas and between cortical areas. In this paper we describe a detailed dynamical analysis of the synaptic and neuronal spiking mechanisms underlying biased competition. We perform a detailed analysis of the dynamical capabilities of the system by exploring the stationary attractors in the parameter space by a mean-field reduction consistent with the underlying synaptic and spiking dynamics. The nonstationary dynamical behavior, as measured in neuronal recording experiments, is studied by an integrate-and-fire model with realistic dynamics. This elucidates the role of cooperation and competition in the dynamics of biased competition and shows why feedback connections between cortical areas need optimally to be weaker by a factor of about 2.5 than the feedforward connections in an attentional network. We modeled the interaction between top-down attention and bottom-up stimulus contrast effects found neurophysiologically and showed that top-down attentional effects can be explained by external attention inputs biasing neurons to move to different parts of their nonlinear activation functions. Further, it is shown that, although NMDA nonlinear effects may be useful in attention, they are not necessary, with nonlinear effects (which may appear multiplicative) being produced in the way just described.

  14. Bifurcation and chaos in the spontaneously firing spike train of cultured neuronal network

    Science.gov (United States)

    Chen, Wenjuan; Li, Xiangning; Zhu, Geng; Zhou, Wei; Zeng, Shaoqun; Luo, Qingming

    2008-02-01

    Both neuroscience and nonlinear science have focused attention on the dynamics of neural networks. However, little is known about the electrical activity of cultured neuronal networks because of their high complexity and rapid, moment-to-moment change. Instead of traditional methods, we use chaotic time series analysis and temporal coding to analyze spontaneous firing spike trains recorded from hippocampal neuronal networks cultured on multi-electrode arrays. When analyzing interspike interval series of different firing patterns, we found that when single spikes and bursts alternate, the largest Lyapunov exponent of the interspike interval (ISI) series is positive, suggesting that chaos is present. Furthermore, a nonlinear bifurcation phenomenon is found in the ISI vs. number histogram. This indicates that the complex firing patterns of the neurons and the irregular ISI series result from deterministic factors and that chaos exists in the cultured network. These results suggest that chaotic time series analysis and temporal coding provide effective methods to investigate the roles played by deterministic and stochastic components in neuronal information coding, but further research is needed because of the high complexity and considerable noise of the electrical activity.
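
    A minimal sketch of the kind of analysis described here, assuming spike times are available as a sorted array: it forms the ISI series and computes a Rosenstein-style estimate of the largest Lyapunov exponent (a generic estimator chosen for illustration, not necessarily the one used by the authors; a positive slope is read as a hint of chaos). Function names, embedding parameters and step counts are invented for this sketch.

    import numpy as np

    def isi_series(spike_times):
        # Interspike intervals from a sorted array of spike times (seconds).
        return np.diff(np.asarray(spike_times, dtype=float))

    def largest_lyapunov(x, dim=3, lag=1, min_sep=5, n_steps=10):
        # Rosenstein-style estimate on a scalar series x (here an ISI sequence):
        # delay-embed, find each point's nearest neighbour (excluding points that
        # are close in time), and fit the slope of the mean log-divergence curve.
        x = np.asarray(x, dtype=float)
        n = len(x) - (dim - 1) * lag
        emb = np.column_stack([x[i * lag:i * lag + n] for i in range(dim)])
        dists = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
        idx = np.arange(n)
        dists[np.abs(idx[:, None] - idx[None, :]) < min_sep] = np.inf
        nn = np.argmin(dists, axis=1)
        divergence = []
        for k in range(1, n_steps + 1):
            valid = (idx + k < n) & (nn + k < n)
            sep = np.linalg.norm(emb[idx[valid] + k] - emb[nn[valid] + k], axis=1)
            sep = sep[sep > 0]
            if sep.size:
                divergence.append(np.log(sep).mean())
        steps = np.arange(1, len(divergence) + 1)
        return np.polyfit(steps, divergence, 1)[0]   # > 0 suggests chaotic divergence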

  15. NeuralWISP: A Wirelessly Powered Neural Interface With 1-m Range.

    Science.gov (United States)

    Yeager, D J; Holleman, J; Prasad, R; Smith, J R; Otis, B P

    2009-12-01

    We present the NeuralWISP, a wireless neural interface operating from far-field radio-frequency (RF) energy. The NeuralWISP is compatible with commercial RF identification readers and operates at a range of up to 1 m. It includes a custom low-noise, low-power amplifier integrated circuit for processing the neural signal and an analog spike detection circuit for reducing digital computational requirements and communications bandwidth. Our system monitors the neural signal and periodically transmits the spike density in a user-programmable time window. The entire system draws an average of 20 μA from the harvested 1.8-V supply.

  16. The visual development of hand-centered receptive fields in a neural network model of the primate visual system trained with experimentally recorded human gaze changes.

    Science.gov (United States)

    Galeazzi, Juan M; Navajas, Joaquín; Mender, Bedeho M W; Quian Quiroga, Rodrigo; Minini, Loredana; Stringer, Simon M

    2016-01-01

    Neurons have been found in the primate brain that respond to objects in specific locations in hand-centered coordinates. A key theoretical challenge is to explain how such hand-centered neuronal responses may develop through visual experience. In this paper we show how hand-centered visual receptive fields can develop using an artificial neural network model, VisNet, of the primate visual system when driven by gaze changes recorded from human test subjects as they completed a jigsaw. A camera mounted on the head captured images of the hand and jigsaw, while eye movements were recorded using an eye-tracking device. This combination of data allowed us to reconstruct the retinal images seen as humans undertook the jigsaw task. These retinal images were then fed into the neural network model during self-organization of its synaptic connectivity using a biologically plausible trace learning rule. A trace learning mechanism encourages neurons in the model to learn to respond to input images that tend to occur in close temporal proximity. In the data recorded from human subjects, we found that the participant's gaze often shifted through a sequence of locations around a fixed spatial configuration of the hand and one of the jigsaw pieces. In this case, trace learning should bind these retinal images together onto the same subset of output neurons. The simulation results consequently confirmed that some cells learned to respond selectively to the hand and a jigsaw piece in a fixed spatial configuration across different retinal views.
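
    The trace learning rule referred to here can be written compactly; the sketch below is a generic trace rule of the kind used in such models, with the learning rate, trace decay and weight normalization chosen arbitrarily rather than taken from the VisNet implementation.

    import numpy as np

    def trace_rule_update(w, x, y, y_trace, eta=0.01, trace_decay=0.8):
        # One step of a trace learning rule: the postsynaptic trace blends the
        # current firing y with its own recent history, so retinal images that
        # occur close together in time strengthen onto the same output neurons.
        y_trace = (1.0 - trace_decay) * y + trace_decay * y_trace
        w = w + eta * np.outer(y_trace, x)             # dw_ij = eta * ybar_i * x_j
        w = w / (np.linalg.norm(w, axis=1, keepdims=True) + 1e-12)  # keep weights bounded
        return w, y_trace

    Each presented retinal image would update x (presynaptic rates), y (postsynaptic rates) and the trace in sequence as the gaze shifts around a fixed hand-object configuration.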

  17. Recurrent Spiking Networks Solve Planning Tasks.

    Science.gov (United States)

    Rueckert, Elmar; Kappel, David; Tanneberg, Daniel; Pecevski, Dejan; Peters, Jan

    2016-02-18

    A recurrent spiking neural network is proposed that implements planning as probabilistic inference for finite and infinite horizon tasks. The architecture splits this problem into two parts: The stochastic transient firing of the network embodies the dynamics of the planning task. With appropriate injected input this dynamics is shaped to generate high-reward state trajectories. A general class of reward-modulated plasticity rules for these afferent synapses is presented. The updates optimize the likelihood of getting a reward through a variant of an Expectation Maximization algorithm and learning is guaranteed to converge to a local maximum. We find that the network dynamics are qualitatively similar to transient firing patterns during planning and foraging in the hippocampus of awake behaving rats. The model extends classical attractor models and provides a testable prediction on identifying modulating contextual information. In a real robot-arm reaching and obstacle-avoidance task, the ability to represent multiple task solutions is investigated. The neural planning method with its local update rules provides the basis for future neuromorphic hardware implementations with promising potential, such as large-scale data processing and the early initiation of strategies to avoid dangerous situations in robot co-worker scenarios.

  18. Eliminating thermal violin spikes from LIGO noise

    Energy Technology Data Exchange (ETDEWEB)

    Santamore, D. H.; Levin, Yuri

    2001-08-15

    We have developed a scheme for reducing LIGO suspension thermal noise close to violin-mode resonances. The idea is to monitor directly the thermally induced motion of a small portion of (a 'point' on) each suspension fiber, thereby recording the random forces driving the test-mass motion close to each violin-mode frequency. One can then suppress the thermal noise by optimally subtracting the recorded fiber motions from the measured motion of the test mass, i.e., from the LIGO output. The proposed method is a modification of an analogous but more technically difficult scheme by Braginsky, Levin and Vyatchanin for reducing broad-band suspension thermal noise. The efficiency of our method is limited by the sensitivity of the sensor used to monitor the fiber motion. If the sensor has no intrinsic noise (i.e. has unlimited sensitivity), then our method allows, in principle, a complete removal of violin spikes from the thermal-noise spectrum. We find that in LIGO-II interferometers, in order to suppress violin spikes below the shot-noise level, the intrinsic noise of the sensor must be less than approximately 2 × 10^-13 cm/Hz. This sensitivity is two orders of magnitude greater than that of currently available sensors.

  19. Sums of Spike Waveform Features for Motor Decoding

    Directory of Open Access Journals (Sweden)

    Jie Li

    2017-07-01

    Full Text Available Traditionally, the key step before decoding motor intentions from cortical recordings is spike sorting, the process of identifying which neuron was responsible for an action potential. Recently, researchers have started investigating approaches to decoding which omit the spike sorting step, by directly using information about action potentials' waveform shapes in the decoder, though this approach is not yet widespread. Particularly, one recent approach involves computing the moments of waveform features and using these moment values as inputs to decoders. This computationally inexpensive approach was shown to be comparable in accuracy to traditional spike sorting. In this study, we use offline data recorded from two Rhesus monkeys to further validate this approach. We also modify this approach by using sums of exponentiated features of spikes, rather than moments. Our results show that using waveform feature sums facilitates significantly higher hand movement reconstruction accuracy than using waveform feature moments, though the magnitudes of differences are small. We find that using the sums of one simple feature, the spike amplitude, allows better offline decoding accuracy than traditional spike sorting by expert (correlation of 0.767, 0.785 vs. 0.744, 0.738, respectively, for two monkeys, average 16% reduction in mean-squared-error), as well as unsorted threshold crossings (0.746, 0.776; average 9% reduction in mean-squared-error). Our results suggest that the sums-of-features framework has potential as an alternative to both spike sorting and using unsorted threshold crossings, if developed further. Also, we present data comparing sorted vs. unsorted spike counts in terms of offline decoding accuracy. Traditional sorted spike counts do not include waveforms that do not match any template (“hash”), but threshold crossing counts do include this hash. On our data and in previous work, hash contributes to decoding accuracy. Thus, using the

  20. Sums of Spike Waveform Features for Motor Decoding.

    Science.gov (United States)

    Li, Jie; Li, Zheng

    2017-01-01

    Traditionally, the key step before decoding motor intentions from cortical recordings is spike sorting, the process of identifying which neuron was responsible for an action potential. Recently, researchers have started investigating approaches to decoding which omit the spike sorting step, by directly using information about action potentials' waveform shapes in the decoder, though this approach is not yet widespread. Particularly, one recent approach involves computing the moments of waveform features and using these moment values as inputs to decoders. This computationally inexpensive approach was shown to be comparable in accuracy to traditional spike sorting. In this study, we use offline data recorded from two Rhesus monkeys to further validate this approach. We also modify this approach by using sums of exponentiated features of spikes, rather than moments. Our results show that using waveform feature sums facilitates significantly higher hand movement reconstruction accuracy than using waveform feature moments, though the magnitudes of differences are small. We find that using the sums of one simple feature, the spike amplitude, allows better offline decoding accuracy than traditional spike sorting by expert (correlation of 0.767, 0.785 vs. 0.744, 0.738, respectively, for two monkeys, average 16% reduction in mean-squared-error), as well as unsorted threshold crossings (0.746, 0.776; average 9% reduction in mean-squared-error). Our results suggest that the sums-of-features framework has potential as an alternative to both spike sorting and using unsorted threshold crossings, if developed further. Also, we present data comparing sorted vs. unsorted spike counts in terms of offline decoding accuracy. Traditional sorted spike counts do not include waveforms that do not match any template ("hash"), but threshold crossing counts do include this hash. On our data and in previous work, hash contributes to decoding accuracy. Thus, using the comparison between
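
    A sketch of the decoder input described here, under the assumption that each detected threshold crossing on an electrode comes with a timestamp and one waveform feature (e.g., peak amplitude); the bin width and exponent are illustrative choices, not the paper's exact settings.

    import numpy as np

    def feature_sums(spike_times, amplitudes, t_start, t_stop, bin_width, power=1.0):
        # Per-bin sums of an exponentiated waveform feature (here spike amplitude),
        # computed without any spike sorting; power=1.0 gives plain amplitude sums.
        edges = np.arange(t_start, t_stop + bin_width, bin_width)
        sums, _ = np.histogram(spike_times, bins=edges,
                               weights=np.abs(np.asarray(amplitudes)) ** power)
        return sums   # one feature vector per electrode, fed to the decoder

    Concatenating these per-electrode vectors across the array would then take the place of sorted spike counts as the decoder's regressors.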

  1. Input-output relation and energy efficiency in the neuron with different spike threshold dynamics

    Directory of Open Access Journals (Sweden)

    Guo-Sheng eYi

    2015-05-01

    Full Text Available A neuron encodes and transmits information by generating sequences of output spikes, which is a highly energy-consuming process. A spike is initiated when membrane depolarization reaches a threshold voltage. In many neurons, the threshold is dynamic and depends on the rate of membrane depolarization (dV/dt) preceding a spike. Identifying the metabolic energy involved in neural coding and its relationship to threshold dynamics is critical to understanding neuronal function and evolution. Here, we use a modified Morris-Lecar model to investigate neuronal input-output properties and energy efficiency associated with different spike threshold dynamics. We find that neurons with a dynamic threshold sensitive to dV/dt generate a discontinuous frequency-current curve and a type II phase response curve (PRC) through a Hopf bifurcation, and weak noise can prohibit spiking when the bifurcation has just occurred. A threshold that is insensitive to dV/dt instead results in a continuous frequency-current curve, a type I PRC and a saddle-node-on-invariant-circle bifurcation, and in this case weak noise cannot inhibit spiking. It is also shown that the bifurcation, frequency-current curve and PRC type associated with different threshold dynamics arise from distinct subthreshold interactions of membrane currents. Further, we observe that the energy consumption of the neuron is related to its firing characteristics. Depolarization of the spike threshold improves neuronal energy efficiency by reducing the overlap of Na+ and K+ currents during an action potential. High energy efficiency is achieved at more depolarized spike thresholds and high stimulus currents. These results provide a fundamental biophysical connection linking spike threshold dynamics, input-output relations, energetics and spike initiation, which could help uncover neural encoding mechanisms.

  2. A generative spike train model with time-structured higher order correlations

    Directory of Open Access Journals (Sweden)

    James eTrousdale

    2013-07-01

    Full Text Available Emerging technologies are revealing the spiking activity in ever larger neural ensembles. Frequently, this spiking is far from independent, with correlations in the spike times of different cells. Understanding how such correlations impact the dynamics and function of neural ensembles remains an important open problem. Here we describe a new, generative model for correlated spike trains that can exhibit many of the features observed in data. Extending prior work in mathematical finance, this generalized thinning and shift (GTaS) model creates marginally Poisson spike trains with diverse temporal correlation structures. We give several examples which highlight the model's flexibility and utility. For instance, we use it to examine how a neural network responds to highly structured patterns of inputs. We then show that the GTaS model is analytically tractable, and derive cumulant densities of all orders in terms of model parameters. The GTaS framework can therefore be an important tool in the experimental and theoretical exploration of neural dynamics.
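
    A toy construction in the spirit of the thinning-and-shift idea (not the authors' full GTaS model): spikes of a "mother" Poisson process are copied to each cell with some probability and shifted by a per-cell offset, producing marginally Poisson trains with structured temporal correlations. All rates, probabilities and shifts below are arbitrary.

    import numpy as np

    rng = np.random.default_rng(0)

    def thin_and_shift(n_cells, mother_rate, duration, p_keep, shifts):
        # Mother Poisson process on [0, duration).
        n_mother = rng.poisson(mother_rate * duration)
        mother = np.sort(rng.uniform(0.0, duration, n_mother))
        trains = []
        for i in range(n_cells):
            keep = rng.random(n_mother) < p_keep[i]        # independent thinning
            t = mother[keep] + shifts[i]                   # per-cell time shift
            trains.append(np.sort(t[(t >= 0.0) & (t < duration)]))
        return trains   # each train is Poisson with rate p_keep[i] * mother_rate

    trains = thin_and_shift(3, mother_rate=20.0, duration=10.0,
                            p_keep=[0.6, 0.6, 0.6], shifts=[0.0, 0.005, 0.010])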

  3. Nonvisual complex spike signals in the rabbit cerebellar flocculus

    NARCIS (Netherlands)

    Winkelman, B.H.; Belton, Tim; Suh, Minah; Coesmans, Michiel; Morpurgo, Menno M; Simpson, John I

    2014-01-01

    In addition to the well-known signals of retinal image slip, floccular complex spikes (CSs) also convey nonvisual signals. We recorded eye movement and CS activity from Purkinje cells in awake rabbits sinusoidally oscillated in the dark on a vestibular turntable. The stimulus frequency ranged from

  4. Regulation of granule cell excitability by a low-threshold calcium spike in turtle olfactory bulb

    DEFF Research Database (Denmark)

    Pinato, Giulietta; Midtgaard, Jens

    2003-01-01

    Granule cell excitability in the turtle olfactory bulb was analyzed using whole cell recordings in current- and voltage-clamp mode. Low-threshold spikes (LTSs) were evoked at potentials that are subthreshold for Na spikes in normal medium. The LTSs were evoked from rest, but hyperpolarization...

  5. DFAspike: a new computational proposition for efficient recognition of epileptic spike in EEG.

    Science.gov (United States)

    Keshri, Anup Kumar; Sinha, Rakesh Kumar; Singh, Aishwarya; Nand Das, Barda

    2011-07-01

    An automated method has been presented for the detection of epileptic spikes in the electroencephalogram (EEG) using a deterministic finite automaton (DFA) and has been named DFAspike. EEG data files (sampled at 256 Hz) are the inputs to DFAspike. DFAspike was tested with different data files containing epileptic spikes. The obtained recognition rate for epileptic spikes was 99.13% on average. The system does not require any prior training or human intervention. The results show that the designed system can be used very effectively for the detection of spikes present in recorded EEG signals. Copyright © 2011 Elsevier Ltd. All rights reserved.
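
    A crude illustration of scanning an EEG trace with a deterministic finite automaton: sample-to-sample differences are mapped to symbols, and a spike is accepted when a run of steep rises is directly followed by a run of steep falls. The states, thresholds and symbol alphabet are invented for this sketch and are not the DFAspike design from the paper.

    import numpy as np

    def symbolize(x, rise=8.0, fall=-8.0):
        # Map successive sample differences to 'R' (steep rise), 'F' (steep fall), 'o' (other).
        d = np.diff(np.asarray(x, dtype=float))
        return ['R' if v > rise else 'F' if v < fall else 'o' for v in d]

    def dfa_detect(symbols, min_run=2):
        # Accept a "spike" whenever >= min_run rises are immediately followed by >= min_run falls.
        detections, state, rises, falls = [], 'idle', 0, 0
        for i, s in enumerate(symbols):
            if state == 'idle':
                if s == 'R':
                    state, rises = 'rising', 1
            elif state == 'rising':
                if s == 'R':
                    rises += 1
                elif s == 'F' and rises >= min_run:
                    state, falls = 'falling', 1
                else:
                    state = 'idle'
            elif state == 'falling':
                if s == 'F':
                    falls += 1
                else:
                    if falls >= min_run:
                        detections.append(i)   # sample index where the downstroke ends
                    state = 'rising' if s == 'R' else 'idle'
                    rises = 1 if s == 'R' else 0
        return detections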

  6. Decoding Local Field Potentials for Neural Interfaces.

    Science.gov (United States)

    Jackson, Andrew; Hall, Thomas M

    2017-10-01

    The stability and frequency content of local field potentials (LFPs) offer key advantages for long-term, low-power neural interfaces. However, interpreting LFPs may require new signal processing techniques which should be informed by a scientific understanding of how these recordings arise from the coordinated activity of underlying neuronal populations. We review current approaches to decoding LFPs for brain-machine interface (BMI) applications, and suggest several directions for future research. To facilitate an improved understanding of the relationship between LFPs and spike activity, we share a dataset of multielectrode recordings from monkey motor cortex, and describe two unsupervised analysis methods we have explored for extracting a low-dimensional feature space that is amenable to biomimetic decoding and biofeedback training.

  7. Patterns and pauses in Purkinje cell simple spike trains: experiments, modeling and theory.

    Science.gov (United States)

    De Schutter, E; Steuber, V

    2009-09-01

    We review our recent experimental and modeling results on how cerebellar Purkinje cells encode information in their simple spike trains and present a theory of the function of pauses and regular spiking patterns. The regular spiking patterns were discovered in extracellular recordings of simple spikes in awake and anesthetized rodents, where it was shown that more than half of the spontaneous activity consists of short epochs of regular spiking. These periods of regular spiking are interrupted by pauses, which can be tightly synchronized among nearby Purkinje cells, while the spikes in the regular patterns are not. Interestingly, pauses are affected by long-term depression of the parallel fiber synapses. Both in modeling and slice experiments it was demonstrated that long-term depression causes a decrease in the duration of pauses, leading to an increase of the spike output of the neuron. Based on these results we propose that pauses in the simple spike train form a temporal code which can lead to a rebound burst in the target deep cerebellar nucleus neurons. Conversely, the regular spike patterns may be a rate code, which presets the amplitude of future rebound bursts.

  8. Multi-array silicon probes with integrated optical fibers: light-assisted perturbation and recording of local neural circuits in the behaving animal.

    Science.gov (United States)

    Royer, Sébastien; Zemelman, Boris V; Barbic, Mladen; Losonczy, Attila; Buzsáki, György; Magee, Jeffrey C

    2010-06-01

    Recordings of large neuronal ensembles and neural stimulation of high spatial and temporal precision are important requisites for studying the real-time dynamics of neural networks. Multiple-shank silicon probes enable large-scale monitoring of individual neurons. Optical stimulation of genetically targeted neurons expressing light-sensitive channels or other fast (milliseconds) actuators offers the means for controlled perturbation of local circuits. Here we describe a method to equip the shanks of silicon probes with micron-scale light guides for allowing the simultaneous use of the two approaches. We then show illustrative examples of how these compact hybrid electrodes can be used in probing local circuits in behaving rats and mice. A key advantage of these devices is the enhanced spatial precision of stimulation that is achieved by delivering light close to the recording sites of the probe. When paired with the expression of light-sensitive actuators within genetically specified neuronal populations, these devices allow the relatively straightforward and interpretable manipulation of network activity.

  9. Iterative learning control algorithm for spiking behavior of neuron model

    Science.gov (United States)

    Li, Shunan; Li, Donghui; Wang, Jiang; Yu, Haitao

    2016-11-01

    Controlling neurons to generate a desired or normal spiking behavior is a fundamental building block of the treatment of many neurologic diseases. The objective of this work is to develop a novel control method, a closed-loop proportional-integral (PI)-type iterative learning control (ILC) algorithm, to control the spiking behavior of model neurons. To verify the feasibility and effectiveness of the proposed method, two single-compartment standard models of different neuronal excitability are considered: the Hodgkin-Huxley (HH) model for class 1 neural excitability and the Morris-Lecar (ML) model for class 2 neural excitability. ILC has remarkable advantages for inherently repetitive processes. To further highlight the superiority of the proposed method, the performance of the iterative learning controller is compared to that of a classical PI controller. In both the classical PI control and the PI control combined with ILC, appropriate background noise is added to the neuron models to approach the problem under more realistic biophysical conditions. Simulation results show that the controller performance is more favorable when ILC is included, regardless of the neuron's excitability class and of the kind of firing pattern in the desired trajectory. The error between the real and desired output is much smaller under the ILC control signal, which suggests that ILC of the neuron's spiking behavior is more accurate.
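
    The control law can be summarized by a generic PI-type ILC update over trials (a textbook form assumed here for illustration; the paper's gains and implementation details may differ): the stimulation waveform for the next trial is the current waveform corrected by proportional and integral terms of the current trial's tracking error.

    import numpy as np

    def pi_ilc_update(u_k, e_k, dt, kp=0.8, ki=0.2):
        # u_{k+1}(t) = u_k(t) + kp * e_k(t) + ki * integral_0^t e_k(s) ds,
        # where e_k = desired membrane trajectory - measured trajectory on trial k.
        return u_k + kp * e_k + ki * np.cumsum(e_k) * dt

    # Trial loop (simulate_neuron is a hypothetical stand-in for the HH or ML model):
    # for k in range(n_trials):
    #     v_k = simulate_neuron(u)
    #     u = pi_ilc_update(u, v_desired - v_k, dt)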

  10. Enhanced polychronisation in a spiking network with metaplasticity

    Directory of Open Access Journals (Sweden)

    Mira eGuise

    2015-02-01

    Full Text Available Computational models of metaplasticity have usually focused on the modeling of single synapses (Shouval et al., 2002). In this paper we study the effect of metaplasticity on network behavior. Our guiding assumption is that the primary purpose of metaplasticity is to regulate synaptic plasticity, by increasing it when input is low and decreasing it when input is high. For our experiments we adopt a model of metaplasticity that demonstrably has this effect for a single synapse; our primary interest is in how metaplasticity thus defined affects network-level phenomena. We focus on a network-level phenomenon called polychronicity, that has a potential role in representation and memory. A network with polychronicity has the ability to produce non-synchronous but precisely timed sequences of neural firing events that can arise from strongly connected groups of neurons called polychronous neural groups (Izhikevich et al., 2004; Izhikevich, 2006a). Polychronous groups (PNGs) develop readily when spiking networks are exposed to repeated spatio-temporal stimuli under the influence of spike-timing-dependent plasticity (STDP), but are sensitive to changes in synaptic weight distribution. We use a technique we have recently developed called Response Fingerprinting to show that PNGs formed in the presence of metaplasticity are significantly larger than those with no metaplasticity. A potential mechanism for this enhancement is proposed that links an inherent property of integrator type neurons called spike latency to an increase in the tolerance of PNG neurons to jitter in their inputs.

  11. Enhanced polychronization in a spiking network with metaplasticity.

    Science.gov (United States)

    Guise, Mira; Knott, Alistair; Benuskova, Lubica

    2015-01-01

    Computational models of metaplasticity have usually focused on the modeling of single synapses (Shouval et al., 2002). In this paper we study the effect of metaplasticity on network behavior. Our guiding assumption is that the primary purpose of metaplasticity is to regulate synaptic plasticity, by increasing it when input is low and decreasing it when input is high. For our experiments we adopt a model of metaplasticity that demonstrably has this effect for a single synapse; our primary interest is in how metaplasticity thus defined affects network-level phenomena. We focus on a network-level phenomenon called polychronicity, that has a potential role in representation and memory. A network with polychronicity has the ability to produce non-synchronous but precisely timed sequences of neural firing events that can arise from strongly connected groups of neurons called polychronous neural groups (Izhikevich et al., 2004). Polychronous groups (PNGs) develop readily when spiking networks are exposed to repeated spatio-temporal stimuli under the influence of spike-timing-dependent plasticity (STDP), but are sensitive to changes in synaptic weight distribution. We use a technique we have recently developed called Response Fingerprinting to show that PNGs formed in the presence of metaplasticity are significantly larger than those with no metaplasticity. A potential mechanism for this enhancement is proposed that links an inherent property of integrator type neurons called spike latency to an increase in the tolerance of PNG neurons to jitter in their inputs.

  12. Spike-timing-based computation in sound localization.

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2010-11-01

    Full Text Available Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.

  13. Conduction Delay Learning Model for Unsupervised and Supervised Classification of Spatio-Temporal Spike Patterns

    Directory of Open Access Journals (Sweden)

    Takashi Matsubara

    2017-11-01

    Full Text Available Precise spike timing is considered to play a fundamental role in communications and signal processing in biological neural networks. Understanding the mechanism of spike timing adjustment would deepen our understanding of biological systems and enable advanced engineering applications such as efficient computational architectures. However, the biological mechanisms that adjust and maintain spike timing remain unclear. Existing algorithms adopt a supervised approach, which adjusts the axonal conduction delay and synaptic efficacy until the spike timings approximate the desired timings. This study proposes a spike timing-dependent learning model that adjusts the axonal conduction delay and synaptic efficacy in both unsupervised and supervised manners. The proposed learning algorithm approximates the Expectation-Maximization algorithm, and classifies the input data encoded into spatio-temporal spike patterns. Even in the supervised classification, the algorithm requires no external spikes indicating the desired spike timings unlike existing algorithms. Furthermore, because the algorithm is consistent with biological models and hypotheses found in existing biological studies, it could capture the mechanism underlying biological delay learning.

  14. Joint Probability-Based Neuronal Spike Train Classification

    Directory of Open Access Journals (Sweden)

    Yan Chen

    2009-01-01

    Full Text Available Neuronal spike trains are used by the nervous system to encode and transmit information. Euclidean distance-based methods (EDBMs) have been applied to quantify the similarity between temporally-discretized spike trains and model responses. In this study, using the same discretization procedure, we developed and applied a joint probability-based method (JPBM) to classify individual spike trains of slowly adapting pulmonary stretch receptors (SARs). The activity of individual SARs was recorded in anaesthetized, paralysed adult male rabbits, which were artificially ventilated at a constant rate and at one of three different volumes. Two-thirds of the responses to the 600 stimuli presented at each volume were used to construct three response models (one for each stimulus volume), each consisting of a series of time bins with spike probabilities. The remaining one-third of the responses were used as test responses to be classified into one of the three model responses. This was done by computing the joint probability of observing the same series of events (spikes or no spikes), as dictated by the test response, in a given model, and determining which of the three probabilities was highest. The JPBM generally produced better classification accuracy than the EDBM, and both performed well above chance. Both methods were similarly affected by variations in discretization parameters, response epoch duration, and two different response alignment strategies. Increasing bin widths increased classification accuracy, which also improved with increased observation time, but primarily during periods of increasing lung inflation. Thus, the JPBM is a simple and effective method for spike train classification.
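
    A compact sketch of the classification scheme described here, assuming independent bins and using log-probabilities for numerical stability; the bin width, clipping constant and variable names are illustrative choices rather than the paper's settings.

    import numpy as np

    def binarize(spike_times, t_start, t_stop, bin_width):
        # Spike / no-spike indicator per time bin for one response epoch.
        edges = np.arange(t_start, t_stop + bin_width, bin_width)
        counts, _ = np.histogram(spike_times, bins=edges)
        return (counts > 0).astype(float)

    def fit_model(binary_trials, eps=1e-3):
        # Per-bin spike probabilities from training responses (rows = trials),
        # clipped so one unexpected spike cannot zero out the joint probability.
        return np.clip(np.mean(binary_trials, axis=0), eps, 1.0 - eps)

    def classify(test_binary, models):
        # Choose the stimulus model under which the observed spike/no-spike
        # series has the highest joint (log) probability.
        scores = [np.sum(test_binary * np.log(p) + (1 - test_binary) * np.log(1 - p))
                  for p in models]
        return int(np.argmax(scores))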

  15. Detecting multineuronal temporal patterns in parallel spike trains

    Directory of Open Access Journals (Sweden)

    Kai S. Gansel

    2012-05-01

    Full Text Available We present a non-parametric and computationally efficient method that detects spatiotemporal firing patterns and pattern sequences in parallel spike trains and tests whether the observed numbers of repeating patterns and sequences on a given timescale are significantly different from those expected by chance. The method is generally applicable and uncovers coordinated activity with arbitrary precision by comparing it to appropriate surrogate data. The analysis of coherent patterns of spatially and temporally distributed spiking activity on various timescales enables the immediate tracking of diverse qualities of coordinated firing related to neuronal state changes and information processing. We apply the method to simulated data and multineuronal recordings from rat visual cortex and show that it reliably discriminates between data sets with random pattern occurrences and with additional exactly repeating spatiotemporal patterns and pattern sequences. Multineuronal cortical spiking activity appears to be precisely coordinated and exhibits a sequential organization beyond the cell assembly concept.

  16. Neural and response correlations to natural complex sounds in the auditory midbrain

    Directory of Open Access Journals (Sweden)

    Dominika Lyzwa

    2016-11-01

    Full Text Available How natural communication sounds are spatially represented across the inferior colliculus, the main center of convergence for auditory information in the midbrain, is not known. The neural representation of the acoustic stimuli results from the interplay of locally differing input and the organization of spectral and temporal neural preferences that change gradually across the nucleus. This raises the question how similar the neural representation of the communication sounds is across these gradients of neural preferences, and whether it also changes gradually. The analyzed neural recordings were multi-unit cluster spike trains from guinea pigs presented with a spectrotemporally rich set of eleven species-specific communication sounds. Using cross-correlation, we analyzed the response similarity of spiking activity across a broad frequency range for neurons of similar and different frequency tuning. Furthermore, we separated the contribution of the stimulus to the correlations to investigate whether similarity is only attributable to the stimulus, or whether interactions exist between the multi-unit clusters that lead to neural correlations, and whether these follow the same representation as the response correlations. We found that similarity of responses is dependent on the neurons' spatial distance for similarly and differently frequency-tuned neurons, and that similarity decreases gradually with spatial distance. Significant neural correlations exist, and contribute to the total response similarity. Our findings suggest that for multi-unit clusters in the mammalian inferior colliculus, the gradual response similarity with spatial distance to natural complex sounds is shaped by neural interactions and the gradual organization of neural preferences.
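
    One standard way to separate the stimulus-locked part of a pairwise correlation is a shift (or shuffle) predictor; the sketch below assumes binned spike trains from repeated presentations of the same call and is only meant to illustrate the idea, not the exact correction used in the paper.

    import numpy as np

    def correlogram(a, b, max_lag):
        # Cross-correlogram of two binned spike trains from the same trial.
        full = np.correlate(a, b, mode='full')          # lags -(N-1) .. (N-1)
        mid = len(b) - 1                                # index of zero lag
        lags = np.arange(-max_lag, max_lag + 1)
        return lags, full[mid - max_lag: mid + max_lag + 1]

    def shift_predictor(trials_a, trials_b, max_lag):
        # Stimulus-locked component: correlate cluster a on trial t with cluster b
        # on trial t+1, so only the common stimulus drive survives the average.
        acc = 0.0
        for t in range(len(trials_a) - 1):
            _, c = correlogram(trials_a[t], trials_b[t + 1], max_lag)
            acc = acc + c
        return acc / (len(trials_a) - 1)

    # neural correlation ~ average same-trial correlogram - shift predictor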

  17. Inferring Neuronal Network Connectivity from Spike Data: A Temporal Data Mining Approach

    Directory of Open Access Journals (Sweden)

    Debprakash Patnaik

    2008-01-01

    Full Text Available Understanding the functioning of a neural system in terms of its underlying circuitry is an important problem in neuroscience. Recent developments in electrophysiology and imaging allow one to simultaneously record activities of hundreds of neurons. Inferring the underlying neuronal connectivity patterns from such multi-neuronal spike train data streams is a challenging statistical and computational problem. This task involves finding significant temporal patterns from vast amounts of symbolic time series data. In this paper we show that the frequent episode mining methods from the field of temporal data mining can be very useful in this context. In the frequent episode discovery framework, the data is viewed as a sequence of events, each of which is characterized by an event type and its time of occurrence and episodes are certain types of temporal patterns in such data. Here we show that, using the set of discovered frequent episodes from multi-neuronal data, one can infer different types of connectivity patterns in the neural system that generated it. For this purpose, we introduce the notion of mining for frequent episodes under certain temporal constraints; the structure of these temporal constraints is motivated by the application. We present algorithms for discovering serial and parallel episodes under these temporal constraints. Through extensive simulation studies we demonstrate that these methods are useful for unearthing patterns of neuronal network connectivity.
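
    A much-simplified stand-in for the frequent-episode idea: count non-overlapped occurrences of the serial episode A -> B, i.e. a spike on neuron B within a maximum delay after a spike on neuron A. Real episode mining handles longer episodes and significance testing; the function name and delay constraint here are illustrative.

    def count_serial_episodes(times_a, times_b, max_delay):
        # times_a, times_b: sorted spike-time lists; returns the number of
        # non-overlapped occurrences of "A then B within max_delay".
        count, j = 0, 0
        for t_a in times_a:
            while j < len(times_b) and times_b[j] <= t_a:
                j += 1
            if j < len(times_b) and times_b[j] - t_a <= max_delay:
                count += 1
                j += 1      # consume this B spike so occurrences do not overlap
        return count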

  18. Prolonging the postcomplex spike pause speeds eyeblink conditioning.

    Science.gov (United States)

    Maiz, Jaione; Karakossian, Movses H; Pakaprot, Narawut; Robleto, Karla; Thompson, Richard F; Otis, Thomas S

    2012-10-09

    Climbing fiber input to the cerebellum is believed to serve as a teaching signal during associative, cerebellum-dependent forms of motor learning. However, it is not understood how this neural pathway coordinates changes in cerebellar circuitry during learning. Here, we use pharmacological manipulations to prolong the postcomplex spike pause, a component of the climbing fiber signal in Purkinje neurons, and show that these manipulations enhance the rate of learning in classical eyelid conditioning. Our findings elucidate an unappreciated aspect of the climbing fiber teaching signal, and are consistent with a model in which convergent postcomplex spike pauses drive learning-related plasticity in the deep cerebellar nucleus. They also suggest a physiological mechanism that could modulate motor learning rates.

  19. Spin-orbit torque induced spike-timing dependent plasticity

    Energy Technology Data Exchange (ETDEWEB)

    Sengupta, Abhronil, E-mail: asengup@purdue.edu; Al Azim, Zubair; Fong, Xuanyao; Roy, Kaushik [School of Electrical and Computer Engineering, Purdue University, West Lafayette, Indiana 47907 (United States)

    2015-03-02

    Nanoelectronic devices that mimic the functionality of synapses are a crucial requirement for performing cortical simulations of the brain. In this work, we propose a ferromagnet-heavy metal heterostructure that employs spin-orbit torque to implement spike-timing dependent plasticity. The proposed device offers the advantage of decoupled spike transmission and programming current paths, thereby leading to reliable operation during online learning. Possible arrangement of such devices in a crosspoint architecture can pave the way for ultra-dense neural networks. Simulation studies indicate that the device has the potential of achieving pico-Joule level energy consumption (maximum 2 pJ per synaptic event) which is comparable to the energy consumption for synaptic events in biological synapses.

  20. Robust spike sorting of retinal ganglion cells tuned to spot stimuli.

    Science.gov (United States)

    Ghahari, Alireza; Badea, Tudor C

    2016-08-01

    We propose an automatic spike sorting approach for the data recorded from a microelectrode array during visual stimulation of wild type retinas with tiled spot stimuli. The approach first detects individual spikes per electrode by their signature local minima. With the mixture probability distribution of the local minima estimated afterwards, it applies a minimum-squared-error clustering algorithm to sort the spikes into different clusters. A template waveform for each cluster per electrode is defined, and a number of reliability tests are performed on it and its corresponding spikes. Finally, a divisive hierarchical clustering algorithm is used to deal with the correlated templates per cluster type across all the electrodes. According to the measures of performance of the spike sorting approach, it is robust even in the cases of recordings with low signal-to-noise ratio.
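
    A generic sketch of the two core steps, local-minima detection against a noise-scaled threshold and minimum-squared-error (k-means) clustering of waveform snippets; the robust noise estimate, refractory window and cluster count are choices made for this sketch, and the reliability tests and cross-electrode template merging from the paper are omitted.

    import numpy as np

    def detect_minima(trace, fs, thresh_sd=4.0, refractory_ms=1.0):
        # Candidate spikes = local minima crossing a negative threshold derived
        # from a robust noise estimate (median of |signal| / 0.6745, assuming a
        # zero-mean band-passed trace).
        sigma = np.median(np.abs(trace)) / 0.6745
        thresh = -thresh_sd * sigma
        refr = int(refractory_ms * 1e-3 * fs)
        peaks, last = [], -refr
        for i in range(1, len(trace) - 1):
            if (trace[i] < thresh and trace[i] <= trace[i - 1]
                    and trace[i] < trace[i + 1] and i - last >= refr):
                peaks.append(i)
                last = i
        return np.array(peaks)

    def kmeans_sort(waveforms, k=3, n_iter=50, seed=0):
        # Minimum-squared-error clustering of spike waveforms (rows, float array).
        rng = np.random.default_rng(seed)
        centers = waveforms[rng.choice(len(waveforms), size=k, replace=False)].copy()
        for _ in range(n_iter):
            d = np.linalg.norm(waveforms[:, None, :] - centers[None, :, :], axis=2)
            labels = np.argmin(d, axis=1)
            for j in range(k):
                if np.any(labels == j):
                    centers[j] = waveforms[labels == j].mean(axis=0)
        return labels, centers   # cluster means serve as per-cluster templates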

  1. Robust Spike Sorting of Retinal Ganglion Cells Tuned to Spot Stimuli*

    Science.gov (United States)

    Ghahari, Alireza; Badea, Tudor C.

    2017-01-01

    We propose an automatic spike sorting approach for the data recorded from a microelectrode array during visual stimulation of wild type retinas with 25 spot stimuli. The approach first detects individual spikes per electrode by their signature local minima. With the mixture probability distribution of the local minima estimated afterwards, it applies a minimum-squared-error clustering algorithm to sort the spikes into different clusters. A template waveform for each cluster per electrode is defined, and a number of reliability tests are performed on it and its corresponding spikes. Finally, a divisive hierarchical clustering algorithm is used to deal with the correlated templates per cluster type across all the electrodes. According to the measures of performance of the spike sorting approach, it is very robust even in the cases of recordings with low signal-to-noise ratio. PMID:28268664

  2. Prospective Coding by Spiking Neurons.

    Directory of Open Access Journals (Sweden)

    Johanni Brea

    2016-06-01

    Full Text Available Animals learn to make predictions, such as associating the sound of a bell with upcoming feeding or predicting a movement that a motor command is eliciting. How predictions are realized on the neuronal level and what plasticity rule underlies their learning is not well understood. Here we propose a biologically plausible synaptic plasticity rule to learn predictions on a single neuron level on a timescale of seconds. The learning rule allows a spiking two-compartment neuron to match its current firing rate to its own expected future discounted firing rate. For instance, if an originally neutral event is repeatedly followed by an event that elevates the firing rate of a neuron, the originally neutral event will eventually also elevate the neuron's firing rate. The plasticity rule is a form of spike timing dependent plasticity in which a presynaptic spike followed by a postsynaptic spike leads to potentiation. Even if the plasticity window has a width of 20 milliseconds, associations on the time scale of seconds can be learned. We illustrate prospective coding with three examples: learning to predict a time varying input, learning to predict the next stimulus in a delayed paired-associate task and learning with a recurrent network to reproduce a temporally compressed version of a sequence. We discuss the potential role of the learning mechanism in classical trace conditioning. In the special case that the signal to be predicted encodes reward, the neuron learns to predict the discounted future reward and learning is closely related to the temporal difference learning algorithm TD(λ).

  3. An investigation on the role of spike latency in an artificial olfactory system.

    Science.gov (United States)

    Martinelli, Eugenio; Polese, Davide; Dini, Francesca; Paolesse, Roberto; Filippini, Daniel; Lundström, Ingemar; Di Natale, Corrado

    2011-01-01

    Experimental studies have shown that reactions to external stimuli may appear only a few hundred milliseconds after the physical interaction of the stimulus with the appropriate receptor. This behavior suggests that neurons transmit the largest meaningful part of their signal in the first spikes, and thus that spike latency is a good descriptor of the information content in biological neural networks. In this paper this property is investigated in an artificial sensory system in which a single layer of spiking neurons is trained with data generated by an artificial olfactory platform based on a large array of chemical sensors. The capability to discriminate between distinct chemicals and mixtures of them was studied with spiking neural networks with and without lateral inhibition, considering as output features of the network both the spike latency and the average firing rate. Results show that the average firing rate of the output spike sequences gives the best separation among the tested vapors; however, the latency code correctly discriminates all the tested volatile compounds in a shorter time. This behavior is qualitatively similar to that recently found in natural olfaction and, notably, it provides practical suggestions for tailoring the measurement conditions of artificial olfactory systems, defining for each specific case a proper measurement time.
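
    The two output codes compared here are easy to state concretely: for each output unit, either the time of its first spike after stimulus onset (latency code) or its mean firing rate over an observation window (rate code). The sketch below assumes spike times per unit are already available; window lengths and the silent-unit convention are arbitrary.

    import numpy as np

    def latency_features(spike_times_per_unit, t_onset, t_max):
        # First-spike latency per unit after stimulus onset; units that stay
        # silent within t_max are assigned the maximum latency.
        feats = []
        for st in spike_times_per_unit:
            st = np.asarray(st)
            after = st[(st >= t_onset) & (st < t_onset + t_max)]
            feats.append(after[0] - t_onset if after.size else t_max)
        return np.array(feats)

    def rate_features(spike_times_per_unit, t_onset, window):
        # Mean firing rate per unit over an observation window (spikes / s).
        return np.array([np.sum((np.asarray(st) >= t_onset) &
                                (np.asarray(st) < t_onset + window)) / window
                         for st in spike_times_per_unit])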

  4. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE

    Directory of Open Access Journals (Sweden)

    Pietro Quaglio

    2017-05-01

    Full Text Available Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.

  5. Identifying neural drivers with functional MRI: an electrophysiological validation.

    Directory of Open Access Journals (Sweden)

    Olivier David

    2008-12-01

    Full Text Available Whether functional magnetic resonance imaging (fMRI) allows the identification of neural drivers remains an open question of particular importance to refine physiological and neuropsychological models of the brain, and/or to understand neurophysiopathology. Here, in a rat model of absence epilepsy showing spontaneous spike-and-wave discharges originating from the first somatosensory cortex (S1BF), we performed simultaneous electroencephalographic (EEG) and fMRI measurements, and subsequent intracerebral EEG (iEEG) recordings in regions strongly activated in fMRI (S1BF, thalamus, and striatum). fMRI connectivity was determined from fMRI time series directly and from hidden state variables using a measure of Granger causality and Dynamic Causal Modelling that relates synaptic activity to fMRI. fMRI connectivity was compared to directed functional coupling estimated from iEEG using asymmetry in generalised synchronisation metrics. The neural driver of spike-and-wave discharges was estimated in S1BF from iEEG, and from fMRI only when hemodynamic effects were explicitly removed. Functional connectivity analysis applied directly on fMRI signals failed because hemodynamics varied between regions, rendering temporal precedence irrelevant. This paper provides the first experimental substantiation of the theoretical possibility to improve interregional coupling estimation from hidden neural states of fMRI. As such, it has important implications for future studies on brain connectivity using functional neuroimaging.

  6. Detecting dependencies between spike trains of pairs of neurons through copulas

    OpenAIRE

    Sacerdote, Laura; Tamborrino, Massimiliano; Zucca, Cristina

    2011-01-01

    The dynamics of a neuron are influenced by the connections with the network where it lies. Recorded spike trains exhibit patterns due to the interactions between neurons. However, the structure of the network is not known. A challenging task is to investigate it from the analysis of simultaneously recorded spike trains. We develop a non-parametric method based on copulas, that we apply to simulated data according to different bivariate Leaky Integrate and Fire models. The method discerns de...

  7. Stochastic description of complex and simple spike firing in cerebellar Purkinje cells.

    Science.gov (United States)

    Shin, Soon-Lim; Rotter, Stefan; Aertsen, Ad; De Schutter, Erik

    2007-02-01

    Cerebellar Purkinje cells generate two distinct types of spikes, complex and simple spikes, both of which have conventionally been considered to be highly irregular, suggestive of certain types of stochastic processes as underlying mechanisms. Interestingly, however, the interspike interval structures of complex spikes have not been carefully studied so far. We showed in a previous study that simple spike trains are actually composed of regular patterns and single interspike intervals, a mixture that could not be explained by a simple rate-modulated Poisson process. In the present study, we systematically investigated the interspike interval structures of separated complex and simple spike trains recorded in anaesthetized rats, and derived an appropriate stochastic model. We found that: (i) complex spike trains do not exhibit any serial correlations, so they can effectively be generated by a renewal process, (ii) the distribution of intervals between complex spikes exhibits two narrow bands, possibly caused by two oscillatory bands (0.5-1 and 4-8 Hz) in the input to Purkinje cells and (iii) the regularity of regular patterns and single interspike intervals in simple spike trains can be represented by gamma processes whose orders are themselves drawn from gamma distributions, suggesting that multiple sources modulate the regularity of simple spike trains.
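
    The regularity classes described here can be mimicked with a gamma renewal process: order 1 reproduces Poisson-like irregular firing, while higher orders give progressively more regular trains. The sketch below draws ISIs directly; drawing the order itself from a gamma distribution, as the authors suggest, would be one extra sampling step. Rates, orders and durations are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)

    def gamma_renewal_train(rate, order, duration):
        # ISIs ~ Gamma(shape=order, mean=1/rate); order=1 is Poisson, larger
        # orders produce more regular spike trains.
        n_guess = int(3 * rate * duration) + 10
        isis = rng.gamma(shape=order, scale=1.0 / (order * rate), size=n_guess)
        t = np.cumsum(isis)
        return t[t < duration]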

  8. Dendritic spikes are enhanced by cooperative network activity in the intact hippocampus.

    Science.gov (United States)

    Kamondi, A; Acsády, L; Buzsáki, G

    1998-05-15

    In vitro experiments suggest that dendritic fast action potentials may influence the efficacy of concurrently active synapses by enhancing Ca2+ influx into the dendrites. However, the exact circumstances leading to these effects in the intact brain are not known. We have addressed these issues by performing intracellular sharp electrode recordings from morphologically identified sites in the apical dendrites of CA1 pyramidal neurons in vivo while simultaneously monitoring extracellular population activity. The amplitude of spontaneous fast action potentials in dendrites decreased as a function of distance from the soma, suggesting that dendritic propagation of fast action potentials is strongly attenuated in vivo. Whereas the amplitude variability of somatic action potentials was very small, the amplitude of fast spikes varied substantially in distal dendrites. Large-amplitude fast spikes in dendrites occurred during population discharges of CA3-CA1 neurons concurrent with field sharp waves. The large-amplitude fast spikes were associated with bursts of smaller-amplitude action potentials and putative Ca2+ spikes. Both current pulse-evoked and spontaneously occurring Ca2+ spikes were always preceded by large-amplitude fast spikes. More spikes were observed in the dendrites during sharp waves than in the soma, suggesting that local dendritic spikes may be generated during this behaviorally relevant population pattern. Because not all dendritic spikes produce somatic action potentials, they may be functionally distinct from action potentials that signal via the axon.

  9. Wireless transmission of neural signals using entropy and mutual information compression.

    Science.gov (United States)

    Craciun, Stefan; Cheney, David; Gugel, Karl; Sanchez, Justin C; Principe, Jose C

    2011-02-01

    Two of the most critical tasks when designing a portable wireless neural recording system are to limit power consumption and to efficiently use the limited bandwidth. It is known that for most wireless devices the majority of power is consumed by the wireless transmitter and it often represents the bottleneck of the overall design. This paper compares two compression techniques that take advantage of the sparseness of the neural spikes in neural recordings using an information theoretic formalism to enhance the well-established vector quantization (VQ) algorithm. The two discriminative VQ algorithms are applied to neuronal recordings proving their ability to accurately reconstruct action potential (AP) regions of the neuronal signal while compressing background activity without using thresholds. The two operational modes presented offer distinct characteristics to lossy compression. The first approach requires no preprocessing or prior knowledge of the signal while the second requires a training set of spikes to obtain AP templates. The compression algorithms are implemented on an on-board digital signal processor (DSP) and results show that power consumption is decreased while the bandwidth is more efficiently utilized. The compression algorithms have been tested in real time on a hardware platform (PICO DSP) enhanced with the DSP which runs the algorithm before sending the compressed data to a wireless transmitter. The compression ratios obtained range from 70:1 to 40:1 depending on the signal-to-noise ratio (SNR) of the input signal. The spike sorting accuracy in the reconstructed data is 95%, comparable to that of the original neural data.

  10. LIF and Simplified SRM Neurons Encode Signals Into Spikes via a Form of Asynchronous Pulse Sigma-Delta Modulation.

    Science.gov (United States)

    Yoon, Young C

    2017-05-01

    We show how two spiking neuron models encode continuous-time signals into spikes (action potentials, time-encoded pulses, or point processes) using a special form of sigma-delta modulation (SDM). In particular, we show that the well-known leaky integrate-and-fire (LIF) neuron and the simplified spike response model (SRM0) neuron encode the continuous-time signals into spikes via a proposed asynchronous pulse SDM (APSDM) scheme. The encoder is clock-free, using level-crossing sampling with a single-level quantizer, unipolar signaling, differential coding, and pulse-shaping filters. The decoder, in the form of a low-pass filter or bandpass smoothing filter, can be fed with the spikes to reconstruct an estimate of the signal. The density of the spikes reflects the amplitude of the encoded signal. Numerical examples illustrating the concepts and the signaling efficiency of APSDM vis-à-vis SDM for comparable reconstruction accuracies are presented. We anticipate these results will facilitate the design of spiking neurons and spiking neural networks, as well as cross-fertilization between the fields of neural coding and SDM.
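
    A minimal sketch of the encode/decode loop described here, assuming a non-negative input signal, a simple leaky integrate-to-threshold-and-reset neuron, and an exponential low-pass filter as the reconstruction filter; time constants and the threshold are arbitrary and the full APSDM signaling chain is not reproduced.

    import numpy as np

    def lif_encode(signal, dt, tau=0.02, threshold=1.0):
        # Leaky integration of the input; a spike is emitted and the membrane is
        # reset whenever the threshold is reached. Spike density tracks amplitude.
        v, spikes = 0.0, []
        for i, x in enumerate(signal):
            v += dt * (-v / tau + x)
            if v >= threshold:
                spikes.append(i * dt)
                v = 0.0
        return np.array(spikes)

    def lowpass_decode(spike_times, t, tau_smooth=0.05):
        # Reconstruction by causal exponential smoothing of the spike train.
        est = np.zeros_like(t, dtype=float)
        for s in spike_times:
            mask = t >= s
            est[mask] += np.exp(-(t[mask] - s) / tau_smooth) / tau_smooth
        return est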

  11. Novel Spiking Neuron-Astrocyte Networks based on nonlinear transistor-like models of tripartite synapses.

    Science.gov (United States)

    Valenza, Gaetano; Tedesco, Luciano; Lanata, Antonio; De Rossi, Danilo; Scilingo, Enzo Pasquale

    2013-01-01

    In this paper, a novel and efficient computational implementation of a Spiking Neuron-Astrocyte Network (SNAN) is reported. Neurons are modeled according to the Izhikevich formulation, and the neuron-astrocyte interactions are treated as tripartite synapses, modeled with the previously proposed nonlinear transistor-like model. Concerning the learning rules, the original spike-timing-dependent plasticity is used for the neural part of the SNAN, whereas an ad hoc rule is proposed for the astrocyte part. SNAN performance is compared with that of a standard spiking neural network (SNN) and evaluated using the polychronization concept, i.e., the number of co-existing groups that spontaneously generate patterns of polychronous activity. The astrocyte-to-neuron ratio is set to the biologically inspired value of 1.5. The proposed SNAN shows a higher number of polychronous groups than the SNN, an advantage maintained for the whole duration of the simulation (24 hours).
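    For reference, a minimal simulation of the Izhikevich neuron model cited above is sketched below with regular-spiking parameters and a constant input current; the astrocyte and tripartite-synapse components of the SNAN are not reproduced here.

```python
# Sketch of the Izhikevich neuron model used for the neural part of such
# networks; the astrocyte/tripartite-synapse dynamics are not reproduced.
# Regular-spiking parameters and a constant input current are assumed.
a, b, c, d = 0.02, 0.2, -65.0, 8.0       # regular-spiking cortical neuron
dt, T, I = 0.5, 1000.0, 10.0             # time step (ms), duration (ms), input current

v, u = -65.0, b * (-65.0)
spike_times = []
for step in range(int(T / dt)):
    v += dt * (0.04 * v * v + 5 * v + 140 - u + I)
    u += dt * a * (b * v - u)
    if v >= 30.0:                        # spike cutoff followed by after-spike reset
        spike_times.append(step * dt)
        v, u = c, u + d

print(f"{len(spike_times)} spikes in {T:.0f} ms")
```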

  12. Open Ephys electroencephalography (Open Ephys + EEG): a modular, low-cost, open-source solution to human neural recording.

    Science.gov (United States)

    Black, Christopher; Voigts, Jakob; Agrawal, Uday; Ladow, Max; Santoyo, Juan; Moore, Christopher; Jones, Stephanie

    2017-06-01

    Electroencephalography (EEG) offers a unique opportunity to study human neural activity non-invasively, with millisecond resolution, using minimal equipment in or outside of a lab setting. EEG can be combined with a number of techniques for closed-loop experiments, in which external devices are driven by specific neural signals. However, reliable, commercially available EEG systems are expensive, often making them impractical for individual use and research development. Moreover, by design, the majority of these systems cannot be easily altered to the specifications needed by the end user. We focused on mitigating these issues by using open-source tools to develop a new EEG platform that drives down research costs and promotes collaboration and innovation. Here, we present methods to expand the open-source electrophysiology system Open Ephys (www.openephys.org) to include human EEG recordings. We describe the equipment and protocol necessary to interface various EEG caps with the Open Ephys acquisition board, and detail methods for processing the data. We present applications of Open Ephys + EEG as a research tool and discuss how this EEG technology lays a framework for improved closed-loop paradigms and novel brain-computer interface experiments. The Open Ephys + EEG system can record reliable human EEG data, as well as human EMG data. A side-by-side comparison of eyes-closed 8-14 Hz activity between the Open Ephys + EEG system and the Brainvision ActiCHamp EEG system showed similar average power and signal-to-noise ratio. Open Ephys + EEG enables users to acquire high-quality human EEG data comparable to that of commercially available systems, while maintaining the price point and extensibility inherent to open-source systems.
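    The kind of comparison reported here (eyes-closed 8-14 Hz power) can be sketched with a Welch power-spectral-density estimate; the synthetic signal, sampling rate, and segment length below are stand-ins for recorded data rather than the authors' analysis pipeline.

```python
# Illustrative only: estimate 8-14 Hz band power with a Welch PSD. The
# synthetic signal, sampling rate, and segment length are assumptions standing
# in for recorded EEG, not the authors' analysis pipeline.
import numpy as np
from scipy.signal import welch

fs = 1000                                    # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)                 # 60 s of data
eeg = 10e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(len(t))

f, pxx = welch(eeg, fs=fs, nperseg=4 * fs)   # 4 s segments -> 0.25 Hz resolution
df = f[1] - f[0]
alpha = pxx[(f >= 8) & (f <= 14)].sum() * df     # integrated 8-14 Hz power
broad = pxx[(f >= 1) & (f <= 40)].sum() * df     # integrated 1-40 Hz power
print(f"8-14 Hz power: {alpha:.3e} V^2 | fraction of 1-40 Hz power: {alpha / broad:.2f}")
```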

  13. Open Ephys electroencephalography (Open Ephys + EEG): a modular, low-cost, open-source solution to human neural recording

    Science.gov (United States)

    Black, Christopher; Voigts, Jakob; Agrawal, Uday; Ladow, Max; Santoyo, Juan; Moore, Christopher; Jones, Stephanie

    2017-06-01

    Objective. Electroencephalography (EEG) offers a unique opportunity to study human neural activity non-invasively, with millisecond resolution, using minimal equipment in or outside of a lab setting. EEG can be combined with a number of techniques for closed-loop experiments, in which external devices are driven by specific neural signals. However, reliable, commercially available EEG systems are expensive, often making them impractical for individual use and research development. Moreover, by design, the majority of these systems cannot be easily altered to the specifications needed by the end user. We focused on mitigating these issues by using open-source tools to develop a new EEG platform that drives down research costs and promotes collaboration and innovation. Approach. Here, we present methods to expand the open-source electrophysiology system Open Ephys (www.openephys.org) to include human EEG recordings. We describe the equipment and protocol necessary to interface various EEG caps with the Open Ephys acquisition board, and detail methods for processing the data. We present applications of Open Ephys + EEG as a research tool and discuss how this EEG technology lays a framework for improved closed-loop paradigms and novel brain-computer interface experiments. Main results. The Open Ephys + EEG system can record reliable human EEG data, as well as human EMG data. A side-by-side comparison of eyes-closed 8-14 Hz activity between the Open Ephys + EEG system and the Brainvision ActiCHamp EEG system showed similar average power and signal-to-noise ratio. Significance. Open Ephys + EEG enables users to acquire high-quality human EEG data comparable to that of commercially available systems, while maintaining the price point and extensibility inherent to open-source systems.

  14. Modeling spike-wave discharges by a complex network of neuronal oscillators

    NARCIS (Netherlands)

    Medvedeva, T.M.; Sysoeva, M.V.; Luijtelaar, E.L.J.M. van; Sysoev, I.V.

    2018-01-01

    Purpose: The organization of the neural networks and the mechanisms that generate the spike-wave discharges (SWDs) highly stereotypical for absence epilepsy are heavily debated. Here we describe a model which can reproduce both the characteristics of SWDs and the dynamics of coupling between brain

  15. Neural Dynamics and Information Representation in Microcircuits of Motor Cortex

    Directory of Open Access Journals (Sweden)

    Yasuhiro eTsubo

    2013-05-01

    The brain has to analyze and respond to external events that can change rapidly from moment to moment, suggesting that information processing by the brain may be essentially dynamic rather than static. The dynamical features of neural computation are of significant importance in motor cortex, which governs the process of movement generation and learning. In this paper, we discuss these features based primarily on our recent findings on neural dynamics and information coding in the microcircuit of rat motor cortex. Cortical neurons show a variety of dynamical behaviors, from rhythmic activity in various frequency bands to highly irregular spike firing. Of particular interest are the similarities and dissimilarities of the neuronal response properties in different layers of motor cortex. Using electrophysiological recordings in slice preparations, we report the phase response curves of neurons in different cortical layers to demonstrate their layer-dependent synchronization properties. We then study how motor cortex recruits task-related neurons in different layers for voluntary arm movements, using simultaneous juxtacellular and multiunit recordings from behaving rats. The results suggest an interesting difference in the spectrum of functional activity between the superficial and deep layers. Furthermore, the task-related activities recorded from the various layers exhibited power-law distributions of inter-spike intervals (ISIs), in contrast to the general belief that ISIs in cortical neurons obey Poisson or gamma distributions. We present a theoretical argument that this power law in in vivo neurons may represent the maximization of the entropy of the firing rate under limited energy consumption for spike generation. Though further studies are required to fully clarify the functional implications of this coding principle, it may shed new light on information representation by neurons and circuits in motor cortex.
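    The contrast drawn above between power-law and Poisson inter-spike-interval statistics can be illustrated numerically: ISIs sampled from a Pareto-type power law have a far heavier tail than exponential ISIs with the same mean. The exponent and refractory floor below are illustrative assumptions, not fitted values from the recordings.

```python
# Numerical illustration: power-law (Pareto-type) ISIs versus exponential
# (Poisson) ISIs with the same mean. The exponent and 5 ms minimum ISI are
# illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
alpha, isi_min = 2.5, 5e-3                  # power-law exponent, minimum ISI (s)

# Inverse-transform sampling of p(x) ~ x^(-alpha) for x >= isi_min.
u = rng.random(n)
isi_power = isi_min * (1 - u) ** (-1 / (alpha - 1))
isi_poisson = rng.exponential(isi_power.mean(), n)

for tail in (0.1, 0.5, 1.0):                # tail thresholds in seconds
    print(f"P(ISI > {tail:3.1f} s): power law {np.mean(isi_power > tail):.4f}, "
          f"Poisson {np.mean(isi_poisson > tail):.6f}")
```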

  16. A novel bio-mimicking, planar nano-edge microelectrode enables enhanced long-term neural recording

    Science.gov (United States)

    Wijdenes, Pierre; Ali, Hasan; Armstrong, Ryden; Zaidi, Wali; Dalton, Colin; Syed, Naweed I.

    2016-10-01

    Our inability to accurately monitor individual neurons and their synaptic activity precludes a fundamental understanding of brain function under normal and various pathological conditions. However, recent breakthroughs in micro- and nano-scale fabrication processes have advanced the development of neuro-electronic hybrid technology. Among such devices are three-dimensional and planar electrodes, offering the advantages of high fidelity or longer-term recordings, respectively. Here, we present the next generation of planar microelectrode arrays with “nano-edges” that enable long-term (≥1 month) and high-fidelity recordings at a resolution 15 times higher than that of traditional planar electrodes. This technology enables a better understanding of brain function and offers a tremendous opportunity for the development of future bionic hybrids and drug discovery devices.

  17. ViSAPy: a Python tool for biophysics-based generation of virtual spiking activity for evaluation of spike-sorting algorithms.

    Science.gov (United States)

    Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T

    2015-04-30

    New silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility of recording spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable, automated methods for spike sorting are developed, which in turn requires benchmarking data sets with known ground-truth spike times. We here present a general simulation tool, ViSAPy (Virtual Spiking Activity in Python), for computing benchmarking data for the evaluation of spike-sorting algorithms. The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and recording-multielectrode geometries. Three example benchmarking data sets are generated: tetrode and polytrode data mimicking in vivo cortical recordings, and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized benchmarking data mimic salient features of typical experimental recordings, for example, spike waveforms that depend on the interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, and finite-sized electrode contacts, and it allows for inhomogeneous electrical conductivities. ViSAPy is optimized for the generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool that can be generalized to produce benchmarking data for arbitrary recording-electrode geometries and with various levels of complexity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.
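    The biophysical forward-modeling scheme such tools build on can be sketched, under the common point-source approximation in a homogeneous medium, by summing the contributions of compartmental transmembrane currents at an electrode position; the snippet below is a conceptual illustration with made-up values and does not use ViSAPy's actual API.

```python
# Conceptual sketch of the forward-modeling step (not ViSAPy's API): in a
# homogeneous medium of conductivity sigma, a point transmembrane current I at
# distance r contributes I / (4*pi*sigma*r) to the extracellular potential.
import numpy as np

sigma = 0.3                                  # extracellular conductivity (S/m)

def extracellular_potential(currents, sources, electrode):
    """Sum point-source contributions of compartment currents (A) at an electrode (m)."""
    r = np.linalg.norm(sources - electrode, axis=1)
    return np.sum(currents / (4 * np.pi * sigma * r))

# Toy example: three compartments of one neuron and a nearby electrode contact.
sources = np.array([[0.0, 0e-6, 0.0],
                    [0.0, 50e-6, 0.0],
                    [0.0, 100e-6, 0.0]])         # compartment positions (m)
currents = np.array([-1.0e-9, 0.6e-9, 0.4e-9])   # net transmembrane currents, summing to zero
electrode = np.array([30e-6, 20e-6, 0.0])
phi = extracellular_potential(currents, sources, electrode)
print(f"extracellular potential: {phi * 1e6:.2f} uV")
```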

  18. The Criticality Hypothesis in Neural Systems

    Science.gov (United States)

    Karimipanah, Yahya

    There is mounting evidence that neural networks of the cerebral cortex exhibit scale-invariant dynamics. At larger scales, fMRI recordings have shown evidence for spatiotemporal long-range correlations. At smaller scales, this scale invariance is marked by power-law distributions of the size and duration of spontaneous bursts of activity, which are referred to as neuronal avalanches. The existence of such avalanches has been conf
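    The avalanche statistics referred to here are conventionally computed by binning the population spike train and treating each run of consecutive non-empty bins, bounded by empty bins, as one avalanche whose size is its total spike count; the sketch below applies that definition to synthetic spike counts, which stand in for recorded data.

```python
# Sketch of the standard avalanche definition: bin the population spike train
# and treat each run of consecutive non-empty bins, bounded by empty bins, as
# one avalanche whose size is its total spike count. The Poisson counts below
# are synthetic placeholders for recorded data.
import numpy as np

rng = np.random.default_rng(0)
counts = rng.poisson(0.7, 100_000)           # spikes per time bin (toy data)

sizes, current = [], 0
for c in counts:
    if c > 0:
        current += c                         # avalanche continues; accumulate spikes
    elif current > 0:
        sizes.append(current)                # an empty bin ends the avalanche
        current = 0
if current > 0:
    sizes.append(current)

sizes = np.array(sizes)
# A log-log histogram of `sizes` (or a maximum-likelihood exponent fit) is what
# one would inspect for the power-law signature discussed in this literature.
print("avalanches:", len(sizes), "| largest size:", int(sizes.max()))
```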