WorldWideScience

Sample records for parallel spike trains

  1. Detection and Evaluation of Spatio-Temporal Spike Patterns in Massively Parallel Spike Train Data with SPADE

    Directory of Open Access Journals (Sweden)

    Pietro Quaglio

    2017-05-01

    Full Text Available Repeated, precise sequences of spikes are largely considered a signature of activation of cell assemblies. These repeated sequences are commonly known under the name of spatio-temporal patterns (STPs). STPs are hypothesized to play a role in the communication of information in the computational process operated by the cerebral cortex. A variety of statistical methods for the detection of STPs have been developed and applied to electrophysiological recordings, but such methods scale poorly with the current size of available parallel spike train recordings (more than 100 neurons). In this work, we introduce a novel method capable of overcoming the computational and statistical limits of existing analysis techniques in detecting repeating STPs within massively parallel spike trains (MPST). We employ advanced data mining techniques to efficiently extract repeating sequences of spikes from the data. Then, we introduce and compare two alternative approaches to distinguish statistically significant patterns from chance sequences. The first approach uses a measure known as conceptual stability, of which we investigate a computationally cheap approximation for applications to such large data sets. The second approach is based on the evaluation of pattern statistical significance. In particular, we provide an extension to STPs of a method we recently introduced for the evaluation of statistical significance of synchronous spike patterns. The performance of the two approaches is evaluated in terms of computational load and statistical power on a variety of artificial data sets that replicate specific features of experimental data. Both methods provide an effective and robust procedure for detection of STPs in MPST data. The method based on significance evaluation shows the best overall performance, although at a higher computational cost. We name the novel procedure the spatio-temporal Spike PAttern Detection and Evaluation (SPADE) analysis.
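
The mining step described above can be illustrated with a toy sketch (this is not the SPADE implementation; all function names, the bin width, and the dithering surrogate are our own illustrative choices): discretize the parallel trains into bins, count how often each subset of co-active neurons repeats, and compare the counts against spike-dithered surrogate data that destroy fine temporal structure.

```python
import itertools
from collections import Counter

def bin_spike_trains(spike_trains, bin_width, t_stop):
    """Discretize each train; return, per time bin, the set of active neurons."""
    n_bins = int(t_stop / bin_width)
    bins = [set() for _ in range(n_bins)]
    for neuron, train in enumerate(spike_trains):
        for t in train:
            b = int(t / bin_width)
            if b < n_bins:
                bins[b].add(neuron)
    return bins

def count_patterns(bins, min_size=2):
    """Count how often each subset of co-active neurons repeats across bins."""
    counts = Counter()
    for active in bins:
        for size in range(min_size, len(active) + 1):
            for pattern in itertools.combinations(sorted(active), size):
                counts[pattern] += 1
    return counts

def dither(train, amount, rng):
    """Surrogate: jitter each spike uniformly, destroying precise patterns."""
    return sorted(max(0.0, t + rng.uniform(-amount, amount)) for t in train)

# Toy data: neurons 0 and 1 fire together repeatedly; neuron 2 fires alone.
trains = [[10, 50, 90], [10.2, 50.1, 90.3], [30, 70]]
counts = count_patterns(bin_spike_trains(trains, bin_width=1.0, t_stop=100))
print(counts[(0, 1)])  # 3: the synchronous pair repeats three times
```

In a real analysis the pattern counts would be compared to the count distribution over many dithered surrogates to assess significance; real spike-pattern mining also needs far more efficient data structures (e.g. frequent itemset mining) than this exhaustive enumeration.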

  2. ASSET: Analysis of Sequences of Synchronous Events in Massively Parallel Spike Trains

    Science.gov (United States)

    Canova, Carlos; Denker, Michael; Gerstein, George; Helias, Moritz

    2016-01-01

    With the ability to observe the activity from large numbers of neurons simultaneously using modern recording technologies, the chance to identify sub-networks involved in coordinated processing increases. Sequences of synchronous spike events (SSEs) constitute one type of such coordinated spiking that propagates activity in a temporally precise manner. The synfire chain was proposed as one potential model for such network processing. Previous work introduced a method for visualization of SSEs in massively parallel spike trains, based on an intersection matrix that contains in each entry the degree of overlap of active neurons in two corresponding time bins. Repeated SSEs are reflected in the matrix as diagonal structures of high overlap values. The method as such, however, leaves the task of identifying these diagonal structures to visual inspection rather than to a quantitative analysis. Here we present ASSET (Analysis of Sequences of Synchronous EvenTs), an improved, fully automated method which determines diagonal structures in the intersection matrix by a robust mathematical procedure. The method consists of a sequence of steps that i) assess which entries in the matrix potentially belong to a diagonal structure, ii) cluster these entries into individual diagonal structures and iii) determine the neurons composing the associated SSEs. We employ parallel point processes generated by stochastic simulations as test data to demonstrate the performance of the method under a wide range of realistic scenarios, including different types of non-stationarity of the spiking activity and different correlation structures. Finally, the ability of the method to discover SSEs is demonstrated on complex data from large network simulations with embedded synfire chains. Thus, ASSET represents an effective and efficient tool to analyze massively parallel spike data for temporal sequences of synchronous activity. PMID:27420734
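
The intersection matrix at the heart of the method above is simple to sketch (a minimal illustration with our own toy data; the actual ASSET procedure adds probabilistic assessment and clustering of the diagonal structures):

```python
def intersection_matrix(bins):
    """Entry (i, j): number of neurons active in both time bin i and time bin j.
    `bins` is a list of sets of active neuron ids, one set per time bin."""
    return [[len(bi & bj) for bj in bins] for bi in bins]

# Toy SSE: the same sequence of neuron groups fires at bins 0-2 and again at
# bins 5-7, so high-overlap entries form a diagonal stripe at offset 5.
bins = [{0, 1}, {2, 3}, {4, 5}, set(), set(), {0, 1}, {2, 3}, {4, 5}]
M = intersection_matrix(bins)
print(M[0][5], M[1][6], M[2][7])  # 2 2 2: the repeated sequence
print(M[0][6])                    # 0: entries off the stripe stay low
```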

  3. Carotid chemoreceptors tune breathing via multipath routing: reticular chain and loop operations supported by parallel spike train correlations.

    Science.gov (United States)

    Morris, Kendall F; Nuding, Sarah C; Segers, Lauren S; Iceman, Kimberly E; O'Connor, Russell; Dean, Jay B; Ott, Mackenzie M; Alencar, Pierina A; Shuman, Dale; Horton, Kofi-Kermit; Taylor-Clark, Thomas E; Bolser, Donald C; Lindsey, Bruce G

    2018-02-01

    We tested the hypothesis that carotid chemoreceptors tune breathing through parallel circuit paths that target distinct elements of an inspiratory neuron chain in the ventral respiratory column (VRC). Microelectrode arrays were used to monitor neuronal spike trains simultaneously in the VRC, peri-nucleus tractus solitarius (p-NTS)-medial medulla, the dorsal parafacial region of the lateral tegmental field (FTL-pF), and medullary raphe nuclei together with phrenic nerve activity during selective stimulation of carotid chemoreceptors or transient hypoxia in 19 decerebrate, neuromuscularly blocked, and artificially ventilated cats. Of 994 neurons tested, 56% had a significant change in firing rate. A total of 33,422 cell pairs were evaluated for signs of functional interaction; 63% of chemoresponsive neurons were elements of at least one pair with correlational signatures indicative of paucisynaptic relationships. We detected evidence for postinspiratory neuron inhibition of rostral VRC I-Driver (pre-Bötzinger) neurons, an interaction predicted to modulate breathing frequency, and for reciprocal excitation between chemoresponsive p-NTS neurons and more downstream VRC inspiratory neurons for control of breathing depth. Chemoresponsive pericolumnar tonic expiratory neurons, proposed to amplify inspiratory drive by disinhibition, were correlationally linked to afferent and efferent "chains" of chemoresponsive neurons extending to all monitored regions. The chains included coordinated clusters of chemoresponsive FTL-pF neurons with functional links to widespread medullary sites involved in the control of breathing. The results support long-standing concepts on brain stem network architecture and a circuit model for peripheral chemoreceptor modulation of breathing with multiple circuit loops and chains tuned by tegmental field neurons with quasi-periodic discharge patterns. NEW & NOTEWORTHY We tested the long-standing hypothesis that carotid chemoreceptors tune the

  4. Fitting neuron models to spike trains

    Directory of Open Access Journals (Sweden)

    Cyrille eRossant

    2011-02-01

    Full Text Available Computational modeling is increasingly used to understand the function of neural circuits in systems neuroscience. These studies require models of individual neurons with realistic input-output properties. Recently, it was found that spiking models can accurately predict the precisely timed spike trains produced by cortical neurons in response to somatically injected currents, if properly fitted. This requires fitting techniques that are efficient and flexible enough to easily test different candidate models. We present a generic solution, based on the Brian simulator (a neural network simulator in Python), which allows the user to define and fit arbitrary neuron models to electrophysiological recordings. It relies on vectorization and parallel computing techniques to achieve efficiency. We demonstrate its use on neural recordings in the barrel cortex and in the auditory brainstem, and confirm that simple adaptive spiking models can accurately predict the response of cortical neurons. Finally, we show how a complex multicompartmental model can be reduced to a simple effective spiking model.
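
The shape of such a fitting loop can be sketched without Brian (a deliberately tiny stand-in: a forward-Euler leaky integrate-and-fire model and a grid search over one parameter; all names and parameter values are our own, and real model fitting optimizes spike-coincidence measures over many parameters rather than matching spike counts):

```python
def lif_spike_times(current, dt, tau, threshold, t_stop):
    """Forward-Euler leaky integrate-and-fire driven by a constant current."""
    v, t, spikes = 0.0, 0.0, []
    while t < t_stop:
        v += dt * (-v / tau + current)
        if v >= threshold:
            spikes.append(t)
            v = 0.0
        t += dt
    return spikes

def fit_threshold(target_spikes, current, candidates):
    """Toy fitting loop: pick the threshold whose simulated train best matches
    the target spike count."""
    def error(th):
        model = lif_spike_times(current, 0.001, 0.02, th, 1.0)
        return abs(len(model) - len(target_spikes))
    return min(candidates, key=error)

# Generate a 'recording' from a known threshold, then recover it.
target = lif_spike_times(current=1.5, dt=0.001, tau=0.02, threshold=0.02, t_stop=1.0)
best = fit_threshold(target, current=1.5, candidates=[0.01, 0.02, 0.04])
print(best)  # 0.02: the generating threshold is recovered
```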

  5. Statistical properties of superimposed stationary spike trains.

    Science.gov (United States)

    Deger, Moritz; Helias, Moritz; Boucsein, Clemens; Rotter, Stefan

    2012-06-01

    The Poisson process is an often employed model for the activity of neuronal populations. It is known, though, that superpositions of realistic, non-Poisson spike trains are not in general Poisson processes, not even for large numbers of superimposed processes. Here we construct superimposed spike trains from intracellular in vivo recordings from rat neocortex neurons and compare their statistics to specific point process models. The constructed superimposed spike trains reveal strong deviations from the Poisson model. We find that superpositions of model spike trains that take the effective refractoriness of the neurons into account yield a much better description. A minimal model of this kind is the Poisson process with dead-time (PPD). For this process, and for superpositions thereof, we obtain analytical expressions for some second-order statistical quantities, such as the count variability, inter-spike interval (ISI) variability, and ISI correlations, and demonstrate the match with the in vivo data. We conclude that effective refractoriness is the key property that shapes the statistical properties of the superposition spike trains. We present new, efficient algorithms to generate superpositions of PPDs and of gamma processes that can be used to provide more realistic background input in simulations of networks of spiking neurons. Using these generators, we show in simulations that neurons which receive superimposed spike trains as input are highly sensitive to the statistical effects induced by neuronal refractoriness.
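
A PPD generator is short to write (a minimal sketch with our own parameter choices, not the paper's optimized algorithm): each ISI is a fixed dead-time plus an exponential variate, with the exponential rate chosen so the train hits the target mean rate, and the superposition is just a merge-sort of the individual trains.

```python
import random
import statistics

def ppd_train(rate, dead_time, t_stop, rng):
    """Poisson process with dead-time: each ISI is dead_time + Exp(lam), with
    lam chosen so the train's mean rate equals `rate` (needs rate*dead_time < 1)."""
    lam = 1.0 / (1.0 / rate - dead_time)
    t, spikes = 0.0, []
    while True:
        t += dead_time + rng.expovariate(lam)
        if t >= t_stop:
            return spikes
        spikes.append(t)

def superpose(trains):
    """Pool several spike trains into one sorted train."""
    return sorted(t for train in trains for t in train)

rng = random.Random(42)
trains = [ppd_train(rate=10.0, dead_time=0.02, t_stop=200.0, rng=rng)
          for _ in range(5)]
pooled = superpose(trains)
isis = [b - a for a, b in zip(pooled, pooled[1:])]
cv = statistics.stdev(isis) / statistics.mean(isis)
print(round(cv, 2))
```

With only a handful of superimposed refractory units, the pooled ISI statistics still deviate from the Poisson reference (CV of 1), which is the effect the abstract describes.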

  6. Training spiking neural networks to associate spatio-temporal input-output spike patterns

    OpenAIRE

    Mohemmed, A; Schliebs, S; Matsuda, S; Kasabov, N

    2013-01-01

    In a previous work (Mohemmed et al., Method for training a spiking neuron to associate input–output spike trains) [1] we have proposed a supervised learning algorithm based on temporal coding to train a spiking neuron to associate input spatiotemporal spike patterns to desired output spike patterns. The algorithm is based on the conversion of spike trains into analogue signals and the application of the Widrow–Hoff learning rule. In this paper we present a mathematical formulation of the prop...

  7. Local Variation of Hashtag Spike Trains and Popularity in Twitter

    Science.gov (United States)

    Sanlı, Ceyda; Lambiotte, Renaud

    2015-01-01

    We draw a parallel between hashtag time series and neuron spike trains. In each case, the process presents complex dynamic patterns including temporal correlations, burstiness, and other types of nonstationarity. We propose the adoption of the so-called local variation in order to uncover salient dynamical properties, while properly detrending for the time-dependent features of a signal. The methodology is tested on both real and randomized hashtag spike trains, and finds that popular hashtags present more regular and less bursty behavior, suggesting the potential use of local variation for predicting online popularity in social media. PMID:26161650
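
The local variation used here has a simple closed form over adjacent inter-event intervals, which is what makes it insensitive to slow rate changes (a direct implementation of the standard Lv formula; the toy event times are our own):

```python
def local_variation(spike_times):
    """Local variation Lv = 3/(n-1) * sum(((I_k - I_{k+1})/(I_k + I_{k+1}))^2):
    ~0 for regular trains, ~1 for Poisson, >1 for bursty trains. Comparing only
    adjacent intervals detrends slow changes in rate."""
    isis = [b - a for a, b in zip(spike_times, spike_times[1:])]
    terms = [((a - b) / (a + b)) ** 2 for a, b in zip(isis, isis[1:])]
    return 3.0 * sum(terms) / len(terms)

regular = list(range(1, 101))  # perfectly periodic events
bursty = [t for burst in range(10)
          for t in (burst * 10.0, burst * 10.0 + 0.1, burst * 10.0 + 0.2)]
print(local_variation(regular))        # 0.0
print(local_variation(bursty) > 1.0)   # True: bursts inflate Lv
```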

  8. Training Deep Spiking Neural Networks Using Backpropagation.

    Science.gov (United States)

    Lee, Jun Haeng; Delbruck, Tobi; Pfeiffer, Michael

    2016-01-01

    Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation. However, training such networks is difficult due to the non-differentiable nature of spike events. In this paper, we introduce a novel technique, which treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks, but works directly on spike signals and membrane potentials. Compared with previous methods relying on indirect training and conversion, our technique has the potential to capture the statistics of spikes more precisely. We evaluate the proposed framework on artificially generated events from the original MNIST handwritten digit benchmark, and also on the N-MNIST benchmark recorded with an event-based dynamic vision sensor, in which the proposed method reduces the error rate by a factor of more than three compared to the best previous SNN, and also achieves a higher accuracy than a conventional convolutional neural network (CNN) trained and tested on the same data. We demonstrate in the context of the MNIST task that thanks to their event-driven operation, deep SNNs (both fully connected and convolutional) trained with our method achieve accuracy equivalent to conventional neural networks. In the N-MNIST example, equivalent accuracy is achieved with about five times fewer computational operations.

  9. Inferring oscillatory modulation in neural spike trains.

    Science.gov (United States)

    Arai, Kensuke; Kass, Robert E

    2017-10-01

    Oscillations are observed at various frequency bands in continuous-valued neural recordings like the electroencephalogram (EEG) and local field potential (LFP) in bulk brain matter, and analysis of spike-field coherence reveals that spiking of single neurons often occurs at certain phases of the global oscillation. Oscillatory modulation has been examined in relation to continuous-valued oscillatory signals, and independently from the spike train alone, but behavior- or stimulus-triggered firing-rate modulation, spiking sparseness, the presence of slow modulation not locked to stimuli, and irregular oscillations with large variability in oscillatory periods present challenges to searching for temporal structure in the spike train. In order to study oscillatory modulation in real data collected under a variety of experimental conditions, we describe a flexible point-process framework we call the Latent Oscillatory Spike Train (LOST) model to decompose the instantaneous firing rate into biologically and behaviorally relevant factors: spiking refractoriness, event-locked firing rate non-stationarity, and trial-to-trial variability accounted for by baseline offset and a stochastic oscillatory modulation. We also extend the LOST model to accommodate changes in the modulatory structure over the duration of the experiment, and thereby discover trial-to-trial variability in the spike-field coherence of a rat primary motor cortical neuron to the LFP theta rhythm. Because LOST incorporates a latent stochastic auto-regressive term, LOST is able to detect oscillations when the firing rate is low, the modulation is weak, and when the modulating oscillation has a broad spectral peak.

  10. Temporal Correlations and Neural Spike Train Entropy

    International Nuclear Information System (INIS)

    Schultz, Simon R.; Panzeri, Stefano

    2001-01-01

    Sampling considerations limit the experimental conditions under which information theoretic analyses of neurophysiological data yield reliable results. We develop a procedure for computing the full temporal entropy and information of ensembles of neural spike trains, which performs reliably for limited samples of data. This approach also yields insight into the role of correlations between spikes in temporal coding mechanisms. The method, when applied to recordings from complex cells of the monkey primary visual cortex, results in lower rms error information estimates in comparison to a 'brute force' approach.
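
The baseline such procedures improve upon is the naive "plug-in" entropy of binned spike words, which is straightforward but biased downward for small samples (a minimal sketch; the toy word list is our own):

```python
import math
from collections import Counter

def word_entropy(binary_words):
    """Plug-in entropy estimate, in bits, of the empirical distribution of
    spike 'words' (tuples of 0/1 bin occupancies). Downward-biased for small
    samples, which is what bias-corrected estimators address."""
    counts = Counter(binary_words)
    n = len(binary_words)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Two equiprobable words carry exactly 1 bit.
words = [(0, 1), (1, 0)] * 50
print(word_entropy(words))  # 1.0
```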

  11. An Overview of Bayesian Methods for Neural Spike Train Analysis

    Directory of Open Access Journals (Sweden)

    Zhe Chen

    2013-01-01

    Full Text Available Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed.

  12. Impact of spike train autostructure on probability distribution of joint spike events.

    Science.gov (United States)

    Pipa, Gordon; Grün, Sonja; van Vreeswijk, Carl

    2013-05-01

    The discussion whether temporally coordinated spiking activity really exists and whether it is relevant has been heated over the past few years. To investigate this issue, several approaches have been taken to determine whether synchronized events occur significantly above chance, that is, whether they occur more often than expected if the neurons fire independently. Most investigations ignore or destroy the autostructure of the spiking activity of individual cells or assume Poissonian spiking as a model. Such methods that ignore the autostructure can significantly bias the coincidence statistics. Here, we study the influence of the autostructure on the probability distribution of coincident spiking events between tuples of mutually independent non-Poisson renewal processes. In particular, we consider two types of renewal processes that were suggested as appropriate models of experimental spike trains: a gamma and a log-normal process. For a gamma process, we characterize the shape of the distribution analytically with the Fano factor (FFc). In addition, we perform Monte Carlo estimations to derive the full shape of the distribution and the probability for false positives if a different process type is assumed than the one actually present. We also determine how manipulations of such spike trains (here, dithering), used for the generation of surrogate data, change the distribution of coincident events and influence the significance estimation. We find, first, that the width of the coincidence count distribution and its FFc depend critically and in a nontrivial way on the detailed properties of the structure of the spike trains as characterized by the coefficient of variation (CV). Second, the dependence of the FFc on the CV is complex and mostly nonmonotonic. Third, spike dithering, even if as small as a fraction of the interspike interval, can falsify the inference on coordinated firing.
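
The Monte Carlo side of such a study is easy to sketch (our own toy setup, not the paper's: two independent gamma renewal processes are simulated repeatedly, chance coincidences are counted per trial, and the Fano factor of the count distribution is estimated):

```python
import random
import statistics

def gamma_train(order, rate, t_stop, rng):
    """Gamma renewal process of given integer order; order=1 is Poisson,
    higher orders are more regular (ISI CV = 1/sqrt(order))."""
    t, spikes = 0.0, []
    while True:
        t += rng.gammavariate(order, 1.0 / (rate * order))
        if t >= t_stop:
            return spikes
        spikes.append(t)

def coincidence_count(a, b, bin_width):
    """Number of bins in which both trains have at least one spike."""
    bins_a = {int(t / bin_width) for t in a}
    bins_b = {int(t / bin_width) for t in b}
    return len(bins_a & bins_b)

rng = random.Random(1)
counts = [coincidence_count(gamma_train(4, 20.0, 10.0, rng),
                            gamma_train(4, 20.0, 10.0, rng),
                            bin_width=0.005)
          for _ in range(200)]
ff = statistics.variance(counts) / statistics.mean(counts)
print(round(ff, 2))
```

Repeating this while varying the gamma order (i.e., the CV) traces out the dependence of the count Fano factor on the autostructure that the abstract describes.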

  13. Detection of bursts in neuronal spike trains by the mean inter-spike interval method

    Institute of Scientific and Technical Information of China (English)

    Lin Chen; Yong Deng; Weihua Luo; Zhen Wang; Shaoqun Zeng

    2009-01-01

    Bursts are electrical spikes firing with a high frequency, which are the most important property in synaptic plasticity and information processing in the central nervous system. However, bursts are difficult to identify because bursting activities or patterns vary with physiological conditions or external stimuli. In this paper, a simple method to automatically detect bursts in spike trains is described. This method auto-adaptively sets a parameter (mean inter-spike interval) according to intrinsic properties of the detected burst spike trains, without any arbitrary choices or any operator judgment. When the mean value of several successive inter-spike intervals is not larger than the parameter, a burst is identified. By this method, bursts can be automatically extracted from different bursting patterns of cultured neurons on multi-electrode arrays, as accurately as by visual inspection. Furthermore, significant changes of burst variables caused by electrical stimuli have been found in the spontaneous activity of neuronal networks. These results suggest that the mean inter-spike interval method is robust for detecting changes in burst patterns and characteristics induced by environmental alterations.
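
A simplified variant of this idea fits in a few lines (our own sketch: the threshold is the train's own mean ISI, and for brevity each single ISI, rather than the mean of several successive ISIs, is compared against it):

```python
def detect_bursts(spike_times, n_successive=3):
    """Mean-ISI burst detector: the threshold adapts to the train's own mean
    inter-spike interval; a run of >= n_successive spikes whose ISIs stay at
    or below it is marked as a burst."""
    isis = [b - a for a, b in zip(spike_times, spike_times[1:])]
    threshold = sum(isis) / len(isis)
    bursts, current = [], [spike_times[0]]
    for spike, isi in zip(spike_times[1:], isis):
        if isi <= threshold:
            current.append(spike)
        else:
            if len(current) >= n_successive:
                bursts.append(current)
            current = [spike]
    if len(current) >= n_successive:
        bursts.append(current)
    return bursts

# Two 4-spike bursts separated by long pauses, plus one isolated spike.
train = [0.0, 0.01, 0.02, 0.03, 5.0, 5.01, 5.02, 5.03, 10.0]
bursts = detect_bursts(train)
print(len(bursts), [len(b) for b in bursts])  # 2 [4, 4]
```

Because the threshold is derived from the data, the same code works on trains with very different overall rates, which is the auto-adaptive property the abstract emphasizes.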

  14. SPIKY: a graphical user interface for monitoring spike train synchrony.

    Science.gov (United States)

    Kreuz, Thomas; Mulansky, Mario; Bozanic, Nebojsa

    2015-05-01

    Techniques for recording large-scale neuronal spiking activity are developing very fast. This leads to an increasing demand for algorithms capable of analyzing large amounts of experimental spike train data. One of the most crucial and demanding tasks is the identification of similarity patterns with a very high temporal resolution and across different spatial scales. To address this task, in recent years three time-resolved measures of spike train synchrony have been proposed, the ISI-distance, the SPIKE-distance, and event synchronization. The Matlab source codes for calculating and visualizing these measures have been made publicly available. However, due to the many different possible representations of the results the use of these codes is rather complicated and their application requires some basic knowledge of Matlab. Thus it became desirable to provide a more user-friendly and interactive interface. Here we address this need and present SPIKY, a graphical user interface that facilitates the application of time-resolved measures of spike train synchrony to both simulated and real data. SPIKY includes implementations of the ISI-distance, the SPIKE-distance, and the SPIKE-synchronization (an improved and simplified extension of event synchronization) that have been optimized with respect to computation speed and memory demand. It also comprises a spike train generator and an event detector that makes it capable of analyzing continuous data. Finally, the SPIKY package includes additional complementary programs aimed at the analysis of large numbers of datasets and the estimation of significance levels. Copyright © 2015 the American Physiological Society.
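
The core idea behind one of these measures, the ISI-distance, can be sketched compactly (a rough, unoptimized illustration with simplified edge handling; function names and toy trains are our own, and SPIKY's implementations are far more careful):

```python
def current_isi(spikes, t):
    """Length of the inter-spike interval containing time t (edges clipped)."""
    prev = max((s for s in spikes if s <= t), default=spikes[0])
    nxt = min((s for s in spikes if s > t), default=spikes[-1])
    return max(nxt - prev, 1e-12)

def isi_distance(a, b, times):
    """Time-resolved ISI-distance averaged over sample times: 0 for identical
    interval structure, approaching 1 for very dissimilar rates."""
    total = 0.0
    for t in times:
        xa, xb = current_isi(a, t), current_isi(b, t)
        total += abs(xa - xb) / max(xa, xb)
    return total / len(times)

a = [0, 1, 2, 3, 4, 5]
b = [0, 1, 2, 3, 4, 5]
c = [0, 2, 4]          # half the rate
times = [0.5, 1.5, 2.5, 3.5]
print(isi_distance(a, b, times))  # 0.0
print(isi_distance(a, c, times))  # 0.5: ISIs differ by a factor of two
```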

  15. Joint Probability-Based Neuronal Spike Train Classification

    Directory of Open Access Journals (Sweden)

    Yan Chen

    2009-01-01

    Full Text Available Neuronal spike trains are used by the nervous system to encode and transmit information. Euclidean distance-based methods (EDBMs) have been applied to quantify the similarity between temporally-discretized spike trains and model responses. In this study, using the same discretization procedure, we developed and applied a joint probability-based method (JPBM) to classify individual spike trains of slowly adapting pulmonary stretch receptors (SARs). The activity of individual SARs was recorded in anaesthetized, paralysed adult male rabbits, which were artificially ventilated at a constant rate and one of three different volumes. Two-thirds of the responses to the 600 stimuli presented at each volume were used to construct three response models (one for each stimulus volume) consisting of a series of time bins, each with spike probabilities. The remaining one-third of the responses were used as test responses to be classified into one of the three model responses. This was done by computing the joint probability of observing the same series of events (spikes or no spikes) dictated by the test response in a given model, and determining which of the three probabilities was highest. The JPBM generally produced better classification accuracy than the EDBM, and both performed well above chance. Both methods were similarly affected by variations in discretization parameters, response epoch duration, and two different response alignment strategies. Increasing bin widths increased classification accuracy, which also improved with increased observation time, but primarily during periods of increasing lung inflation. Thus, the JPBM is a simple and effective method for performing spike train classification.
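
The classification rule described above amounts to a product of per-bin Bernoulli probabilities, compared across models (a minimal sketch with our own toy responses; the probability floor `eps` is our assumption to keep unseen events from zeroing the product):

```python
import math

def fit_model(responses, n_bins):
    """Per-bin spike probability model from a set of binary responses."""
    eps = 1e-3  # floor/ceiling so unseen events keep nonzero probability (assumption)
    probs = []
    for b in range(n_bins):
        p = sum(r[b] for r in responses) / len(responses)
        probs.append(min(max(p, eps), 1 - eps))
    return probs

def log_joint_probability(model, response):
    """Joint log-probability of the observed spike/no-spike series under a model."""
    return sum(math.log(p if x else 1 - p) for p, x in zip(model, response))

def classify(models, response):
    """Assign the response to the model under which it is most probable."""
    scores = [log_joint_probability(m, response) for m in models]
    return scores.index(max(scores))

# Toy example: two stimulus 'volumes' with different firing profiles.
model_a = fit_model([(1, 1, 0, 0), (1, 0, 0, 0), (1, 1, 0, 0)], n_bins=4)
model_b = fit_model([(0, 0, 1, 1), (0, 0, 1, 0), (0, 1, 1, 1)], n_bins=4)
print(classify([model_a, model_b], (1, 1, 0, 0)))  # 0
print(classify([model_a, model_b], (0, 0, 1, 1)))  # 1
```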

  16. An Efficient Supervised Training Algorithm for Multilayer Spiking Neural Networks.

    Science.gov (United States)

    Xie, Xiurui; Qu, Hong; Liu, Guisong; Zhang, Malu; Kurths, Jürgen

    2016-01-01

    The spiking neural networks (SNNs) are the third generation of neural networks and perform remarkably well in cognitive tasks such as pattern recognition. The spike emitting and information processing mechanisms found in biological cognitive systems motivate the application of the hierarchical structure and temporal encoding mechanism in spiking neural networks, which have exhibited strong computational capability. However, the hierarchical structure and temporal encoding approach require neurons to process information serially in space and time, respectively, which reduces training efficiency significantly. For training hierarchical SNNs, most existing methods are based on the traditional back-propagation algorithm, inheriting its drawbacks of gradient diffusion and sensitivity to parameters. To keep the powerful computational capability of the hierarchical structure and temporal encoding mechanism, but to overcome the low efficiency of existing algorithms, a new training algorithm, the Normalized Spiking Error Back Propagation (NSEBP), is proposed in this paper. In the feedforward calculation, the output spike times are calculated by solving the quadratic function in the spike response model, instead of detecting postsynaptic voltage states at all time points as in traditional algorithms. In the feedback weight modification, the computational error is propagated to previous layers by the presynaptic spike jitter instead of the gradient descent rule, which realizes layer-wise training. Furthermore, our algorithm investigates the mathematical relation between the weight variation and the voltage error change, which makes normalization in the weight modification applicable. Adopting these strategies, our algorithm outperforms traditional SNN multi-layer algorithms in terms of learning efficiency and parameter sensitivity, as demonstrated by the comprehensive experimental results in this paper.

  17. Stochastic optimal control of single neuron spike trains

    DEFF Research Database (Denmark)

    Iolov, Alexandre; Ditlevsen, Susanne; Longtin, Andrë

    2014-01-01

    stimulation of a neuron to achieve a target spike train under the physiological constraint to not damage tissue. Approach. We pose a stochastic optimal control problem to precisely specify the spike times in a leaky integrate-and-fire (LIF) model of a neuron with noise assumed to be of intrinsic or synaptic...... origin. In particular, we allow for the noise to be of arbitrary intensity. The optimal control problem is solved using dynamic programming when the controller has access to the voltage (closed-loop control), and using a maximum principle for the transition density when the controller only has access...... to the spike times (open-loop control). Main results. We have developed a stochastic optimal control algorithm to obtain precise spike times. It is applicable in both the supra-threshold and sub-threshold regimes, under open-loop and closed-loop conditions and with an arbitrary noise intensity; the accuracy...

  18. Noise-enhanced coding in phasic neuron spike trains.

    Science.gov (United States)

    Ly, Cheng; Doiron, Brent

    2017-01-01

    The stochastic nature of neuronal response has led to conjectures about the impact of input fluctuations on neural coding. For the most part, low-pass membrane integration and spike threshold dynamics have been the primary features assumed in the transfer from synaptic input to output spiking. Phasic neurons are a common, but understudied, neuron class characterized by a subthreshold negative feedback that suppresses spike train responses to low-frequency signals. Past work has shown that when a low-frequency signal is accompanied by moderate-intensity broadband noise, phasic neuron spike trains are well locked to the signal. We extend these results with a simple, reduced model of phasic activity that demonstrates that a non-Markovian spike train structure caused by the negative feedback produces a noise-enhanced coding. Further, this enhancement is sensitive to the timescales, as opposed to the intensity, of a driving signal. Reduced hazard function models show that noise-enhanced phasic codes are both novel and separate from classical stochastic resonance reported in non-phasic neurons. The general features of our theory suggest that noise-enhanced codes in excitable systems with subthreshold negative feedback are a particularly rich framework to study.

  19. Neural Spike Train Synchronisation Indices: Definitions, Interpretations and Applications.

    Science.gov (United States)

    Halliday, D M; Rosenberg, J R

    2017-04-24

    A comparison of previously defined spike train synchronization indices is undertaken within a stochastic point process framework. The second-order cumulant density (covariance density) is shown to be common to all the indices. Simulation studies were used to investigate the sampling variability of a single index based on the second-order cumulant. The simulations used a paired motoneurone model and a paired regular-spiking cortical neurone model. The sampling variability of spike trains generated under identical conditions from the paired motoneurone model varied from 50% to 160% of the estimated value. On theoretical grounds, and on the basis of simulated data, a rate dependence is present in all synchronization indices. The application of coherence and pooled coherence estimates to the issue of synchronization indices is considered. This alternative frequency-domain approach allows an arbitrary number of spike train pairs to be evaluated for statistically significant differences, and combined into a single population measure. The pooled coherence framework allows pooled time-domain measures to be derived; the application of this to the simulated data is illustrated. Data from the cortical neurone model are generated over a wide range of firing rates (1 to 250 spikes/s). The pooled coherence framework correctly characterizes the sampling variability as not significant over this wide operating range. The broader applicability of this approach to multi-electrode array data is briefly discussed.
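
The second-order cumulant density that unifies these indices has a simple binned estimator (our own minimal sketch for binary binned trains: the cross-product rate at each lag minus the product of the mean rates, so independent trains fluctuate around zero):

```python
def cumulant_density(a_bins, b_bins, max_lag):
    """Binned estimate of the second-order cumulant (covariance) density
    between two 0/1 spike trains, per lag in bins."""
    n = len(a_bins)
    pa = sum(a_bins) / n
    pb = sum(b_bins) / n
    out = {}
    for lag in range(-max_lag, max_lag + 1):
        pairs = [(a_bins[i], b_bins[i + lag]) for i in range(n)
                 if 0 <= i + lag < n]
        out[lag] = sum(x * y for x, y in pairs) / len(pairs) - pa * pb
    return out

# Perfectly synchronous binary trains: a sharp positive peak at lag 0.
a = [1, 0, 0, 1, 0, 1, 0, 0, 1, 0] * 10
q = cumulant_density(a, a, max_lag=2)
print(q[0] > 0, abs(q[1]) < q[0])  # True True
```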

  20. Supervised learning in spiking neural networks with FORCE training.

    Science.gov (United States)

    Nicola, Wilten; Clopath, Claudia

    2017-12-20

    Populations of neurons display an extraordinary diversity in the behaviors they affect and display. Machine learning techniques have recently emerged that allow us to create networks of model neurons that display behaviors of similar complexity. Here we demonstrate the direct applicability of one such technique, the FORCE method, to spiking neural networks. We train these networks to mimic dynamical systems, classify inputs, and store discrete sequences that correspond to the notes of a song. Finally, we use FORCE training to create two biologically motivated model circuits. One is inspired by the zebra finch and successfully reproduces songbird singing. The second network is motivated by the hippocampus and is trained to store and replay a movie scene. FORCE trained networks reproduce behaviors comparable in complexity to their inspired circuits and yield information not easily obtainable with other techniques, such as behavioral responses to pharmacological manipulations and spike timing statistics.

  1. A metric space approach to the information capacity of spike trains

    OpenAIRE

    HOUGHTON, CONOR JAMES; GILLESPIE, JAMES

    2010-01-01

    PUBLISHED Classical information theory can be either discrete or continuous, corresponding to discrete or continuous random variables. However, although spike times in a spike train are described by continuous variables, the information content is usually calculated using discrete information theory. This is because the number of spikes, and hence, the number of variables, varies from spike train to spike train, making the continuous theory difficult to apply. It is possible to avoid ...

  2. Note on the coefficient of variations of neuronal spike trains.

    Science.gov (United States)

    Lengler, Johannes; Steger, Angelika

    2017-08-01

    It is known that many neurons in the brain show spike trains with a coefficient of variation (CV) of the interspike times of approximately 1, thus resembling the properties of Poisson spike trains. Computational studies have been able to reproduce this phenomenon. However, the underlying models were too complex to be examined analytically. In this paper, we offer a simple model that shows the same effect but is accessible to an analytic treatment. The model is a random walk model with a reflecting barrier; we give explicit formulas for the CV in the regime of excess inhibition. We also analyze the effect of probabilistic synapses in our model and show that it resembles previous findings that were obtained by simulation.
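    A small simulation in the spirit of this record's model: a random walk with a reflecting barrier at 0 and an absorbing spiking threshold, in the excess-inhibition regime (downward drift). All parameters are illustrative, not the paper's; the point is only that the resulting interspike intervals are nearly exponential, giving a CV close to 1.

```python
import numpy as np

rng = np.random.default_rng(2)

threshold, p_up = 20, 0.45   # p_up < 0.5: excess inhibition (downward drift)
isis = []
for _ in range(1000):
    v, t = 0, 0
    while v < threshold:
        step = 1 if rng.random() < p_up else -1
        v = max(0, v + step)   # reflecting barrier at 0
        t += 1                 # one time step per synaptic event
    isis.append(t)             # first-passage time = interspike interval

isis = np.asarray(isis, float)
cv = isis.std() / isis.mean()  # coefficient of variation of the ISIs
```

Threshold crossing is a rare event here, so the first-passage times are approximately exponentially distributed and the measured CV comes out near 1, mimicking Poisson-like firing.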

  3. Spike train generation and current-to-frequency conversion in silicon diodes

    Science.gov (United States)

    Coon, D. D.; Perera, A. G. U.

    1989-01-01

    A device physics model is developed to analyze spontaneous neuron-like spike train generation in current driven silicon p(+)-n-n(+) devices in cryogenic environments. The model is shown to explain the very high dynamic range (10 to the 7th) current-to-frequency conversion and experimental features of the spike train frequency as a function of input current. The devices are interesting components for implementation of parallel asynchronous processing adjacent to cryogenically cooled focal planes because of their extremely low current and power requirements, their electronic simplicity, and their pulse coding capability, and could be used to form the hardware basis for neural networks which employ biologically plausible means of information coding.

  4. Stochastic models for spike trains of single neurons

    CERN Document Server

    Sampath, G

    1977-01-01

    1 Some basic neurophysiology. 1.1 The neuron. 1.1.1 The axon. 1.1.2 The synapse. 1.1.3 The soma. 1.1.4 The dendrites. 1.2 Types of neurons. 2 Signals in the nervous system. 2.1 Action potentials as point events - point processes in the nervous system. 2.2 Spontaneous activity in neurons. 3 Stochastic modelling of single neuron spike trains. 3.1 Characteristics of a neuron spike train. 3.2 The mathematical neuron. 4 Superposition models. 4.1 Superposition of renewal processes. 4.2 Superposition of stationary point processes - limiting behaviour. 4.2.1 Palm functions. 4.2.2 Asymptotic behaviour of n stationary point processes superposed. 4.3 Superposition models of neuron spike trains. 4.3.1 Model 4.1. 4.3.2 Model 4.2 - A superposition model with two input channels. 4.3.3 Model 4.3. 4.4 Discussion. 5 Deletion models. 5.1 Deletion models with independent interaction of excitatory and inhibitory sequences. 5.1.1 Model 5.1 - The basic de...

  5. Training Spiking Neural Models Using Artificial Bee Colony

    Science.gov (United States)

    Vazquez, Roberto A.; Garro, Beatriz A.

    2015-01-01

    Spiking neurons are models designed to simulate, in a realistic manner, the behavior of biological neurons. Recently, it has been proven that this type of neuron can be applied to solve pattern recognition problems with great efficiency. However, the lack of learning strategies for training these models prevents their use in many pattern recognition problems. On the other hand, several bioinspired algorithms have been proposed in recent years for solving a broad range of optimization problems, including those related to the field of artificial neural networks (ANNs). Artificial bee colony (ABC) is a novel algorithm based on the behavior of bees in the task of exploring their environment to find a food source. In this paper, we describe how the ABC algorithm can be used as a learning strategy to train a spiking neuron aiming to solve pattern recognition problems. Finally, the proposed approach is tested on several pattern recognition problems. It is important to remark that, to demonstrate the power of this type of model, only one neuron is used. In addition, we analyze how the performance of these models is improved using this kind of learning strategy. PMID:25709644

  6. Measures of spike train synchrony for data with multiple time scales

    NARCIS (Netherlands)

    Satuvuori, Eero; Mulansky, Mario; Bozanic, Nebojsa; Malvestio, Irene; Zeldenrust, Fleur; Lenk, Kerstin; Kreuz, Thomas

    2017-01-01

    Background Measures of spike train synchrony are widely used in both experimental and computational neuroscience. Time-scale independent and parameter-free measures, such as the ISI-distance, the SPIKE-distance and SPIKE-synchronization, are preferable to time scale parametric measures, since by ...

  7. Multineuron spike train analysis with R-convolution linear combination kernel.

    Science.gov (United States)

    Tezuka, Taro

    2018-06-01

    A spike train kernel provides an effective way of decoding information represented by a spike train. Some spike train kernels have been extended to multineuron spike trains, which are simultaneously recorded spike trains obtained from multiple neurons. However, most of these multineuron extensions were carried out in a kernel-specific manner. In this paper, a general framework is proposed for extending any single-neuron spike train kernel to multineuron spike trains, based on the R-convolution kernel. Special subclasses of the proposed R-convolution linear combination kernel are explored. These subclasses have a smaller number of parameters and make optimization tractable when the size of data is limited. The proposed kernel was evaluated using Gaussian process regression for multineuron spike trains recorded from an animal brain. It was compared with the sum kernel and the population Spikernel, which are existing ways of decoding multineuron spike trains using kernels. The results showed that the proposed approach performs better than these kernels and also other commonly used neural decoding methods. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. On the Spike Train Variability Characterized by Variance-to-Mean Power Relationship.

    Science.gov (United States)

    Koyama, Shinsuke

    2015-07-01

    We propose a statistical method for modeling the non-Poisson variability of spike trains observed in a wide range of brain regions. Central to our approach is the assumption that the variance and the mean of interspike intervals are related by a power function characterized by two parameters: the scale factor and exponent. It is shown that this single assumption allows the variability of spike trains to have an arbitrary scale and various dependencies on the firing rate in the spike count statistics, as well as in the interval statistics, depending on the two parameters of the power function. We also propose a statistical model for spike trains that exhibits the variance-to-mean power relationship. Based on this, a maximum likelihood method is developed for inferring the parameters from rate-modulated spike trains. The proposed method is illustrated on simulated and experimental spike trains.
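    A toy illustration of the variance-to-mean power relationship Var(ISI) = phi * Mean(ISI)^alpha at the heart of this record. Gamma-distributed ISIs are generated so that the relationship holds by construction (phi = 2.0, alpha = 1.5 are arbitrary illustrative values, not the paper's), and the two parameters are then recovered by log-log regression rather than the paper's maximum likelihood method.

```python
import numpy as np

rng = np.random.default_rng(3)

phi, alpha = 2.0, 1.5
means = np.array([5.0, 10.0, 20.0, 40.0, 80.0])   # mean ISI per rate condition

est_mean, est_var = [], []
for m in means:
    var = phi * m**alpha           # impose Var = phi * Mean^alpha
    shape = m**2 / var             # gamma: mean = k*theta, var = k*theta^2
    scale = var / m
    isi = rng.gamma(shape, scale, 200_000)
    est_mean.append(isi.mean())
    est_var.append(isi.var())

# Recover the parameters: log Var = log phi + alpha * log Mean
slope, intercept = np.polyfit(np.log(est_mean), np.log(est_var), 1)
alpha_hat, phi_hat = slope, np.exp(intercept)
```

The fitted exponent and scale factor land close to the generating values, showing how the two parameters separate scale from rate dependence in the interval statistics.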

  9. Parallel Evolutionary Optimization for Neuromorphic Network Training

    Energy Technology Data Exchange (ETDEWEB)

    Schuman, Catherine D [ORNL; Disney, Adam [University of Tennessee (UT); Singh, Susheela [North Carolina State University (NCSU), Raleigh; Bruer, Grant [University of Tennessee (UT); Mitchell, John Parker [University of Tennessee (UT); Klibisz, Aleksander [University of Tennessee (UT); Plank, James [University of Tennessee (UT)

    2016-01-01

    One of the key impediments to the success of current neuromorphic computing architectures is the issue of how best to program them. Evolutionary optimization (EO) is one promising programming technique; in particular, its wide applicability makes it especially attractive for neuromorphic architectures, which can have many different characteristics. In this paper, we explore different facets of EO on a spiking neuromorphic computing model called DANNA. We focus on the performance of EO in the design of our DANNA simulator, and on how to structure EO on both multicore and massively parallel computing systems. We evaluate how our parallel methods impact the performance of EO on Titan, the U.S.'s largest open science supercomputer, and BOB, a Beowulf-style cluster of Raspberry Pi's. We also focus on how to improve the EO by evaluating commonality in higher performing neural networks, and present the result of a study that evaluates the EO performed by Titan.

  10. Impact of substance P on the correlation of spike train evoked by electro acupuncture

    International Nuclear Information System (INIS)

    Jin, Chen; Zhang, Xuan; Wang, Jiang; Guo, Yi; Zhao, Xue; Guo, Yong-Ming

    2016-01-01

    Highlights: • We analyze spike trains induced by EA before and after inhibiting SP in the PC6 area. • Inhibiting SP leads to an increase of the spiking rate of the median nerve. • SP may modulate membrane potential to affect the spiking rate. • SP has an influence on the long-range correlation of spike trains evoked by EA. • SP plays an important role in EA-induced neural spiking and encoding. - Abstract: Substance P (SP) participates in the neural signal transmission evoked by electro-acupuncture (EA). This paper investigates the impact of SP on the correlation of spike trains in the median nerve evoked by EA at the 'Neiguan' acupoint (PC6). It shows that the spiking rate and interspike interval (ISI) distribution change obviously after inhibiting SP. This variation of spiking activity indicates that SP affects the temporal structure of the spike train through modulating the action potential on median nerve filaments. Further, the correlation coefficient and scaling exponent are considered to measure the correlation of the spike train. The Scaled Windowed Variance (SWV) method is applied to calculate the scaling exponent, which quantifies the long-range correlation of the neural electrical signals. It is found that the correlation coefficients of the ISIs increase after inhibiting SP release. In addition, the scaling exponents of the neuronal spike trains differ significantly before and after inhibiting SP. These findings demonstrate that SP has an influence on the long-range correlation of spike trains. Our results indicate that SP may play an important role in EA-induced neural spiking and encoding.

  11. Advanced correlation grid: Analysis and visualisation of functional connectivity among multiple spike trains.

    Science.gov (United States)

    Masud, Mohammad Shahed; Borisyuk, Roman; Stuart, Liz

    2017-07-15

    This study analyses multiple spike trains (MST) data, defines its functional connectivity and subsequently visualises an accurate diagram of connections. This is a challenging problem. For example, it is difficult to distinguish the common input and the direct functional connection of two spike trains. The new method presented in this paper is based on the traditional pairwise cross-correlation function (CCF) and a new combination of statistical techniques. First, the CCF is used to create the Advanced Correlation Grid (ACG), where both the significant peak of the CCF and the corresponding time delay are used for detailed analysis of connectivity. Second, these two features of functional connectivity are used to classify connections. Finally, the visualization technique is used to represent the topology of functional connections. Examples are presented in the paper to demonstrate the new Advanced Correlation Grid method and to show how it enables discrimination between (i) influence from one spike train to another through an intermediate spike train and (ii) influence from one common spike train to another pair of analysed spike trains. The ACG method enables scientists to automatically distinguish direct connections from spurious ones, such as common-source and indirect connections, whereas existing methods require in-depth analysis to identify such connections. The ACG is a new and effective method for studying functional connectivity of multiple spike trains. This method can identify accurately all the direct connections and can distinguish common source and indirect connections automatically. Copyright © 2017 Elsevier B.V. All rights reserved.
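    A minimal sketch of the ACG's first step: locate the significant peak of the pairwise cross-correlation function and read off its time delay. The two trains below are hypothetical (1 ms bins, a 5-bin propagation delay, 60% transmission probability); the significance check against the off-peak spread is a crude stand-in for the paper's statistical techniques.

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical pair: y copies x with a 5-bin delay plus independent spikes.
T, delay = 20_000, 5
x = (rng.random(T) < 0.05).astype(float)
y = np.zeros(T)
y[delay:] = x[:-delay] * (rng.random(T - delay) < 0.6)  # 60% transmission
y += (rng.random(T) < 0.01)                             # independent spikes
y = np.minimum(y, 1.0)

max_lag = 20
lags = np.arange(-max_lag, max_lag + 1)
xc, yc = x - x.mean(), y - y.mean()
# cross-covariance at lag l: mean of x(t) * y(t + l)
ccf = np.array([np.mean(xc[max(0, -l):T - max(0, l)] *
                        yc[max(0, l):T - max(0, -l)]) for l in lags])

peak_idx = np.argmax(np.abs(ccf))
peak_lag, peak_val = lags[peak_idx], ccf[peak_idx]
# crude significance: peak must stand out from the off-peak spread
noise = np.delete(ccf, peak_idx)
significant = abs(peak_val) > noise.mean() + 4 * noise.std()
```

The recovered peak lag matches the imposed propagation delay, which is exactly the feature the ACG uses to classify a connection as direct and directed.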

  12. iRaster: a novel information visualization tool to explore spatiotemporal patterns in multiple spike trains.

    Science.gov (United States)

    Somerville, J; Stuart, L; Sernagor, E; Borisyuk, R

    2010-12-15

    Over the last few years, simultaneous recordings of multiple spike trains have become widely used by neuroscientists. Therefore, it is important to develop new tools for analysing multiple spike trains in order to gain new insight into the function of neural systems. This paper describes how techniques from the field of visual analytics can be used to reveal specific patterns of neural activity. An interactive raster plot called iRaster has been developed. This software incorporates a selection of statistical procedures for visualization and flexible manipulations with multiple spike trains. For example, there are several procedures for the re-ordering of spike trains which can be used to unmask activity propagation, spiking synchronization, and many other important features of multiple spike train activity. Additionally, iRaster includes a rate representation of neural activity, a combined representation of rate and spikes, spike train removal and time interval removal. Furthermore, it provides multiple coordinated views, time and spike train zooming windows, a fisheye lens distortion, and dissemination facilities. iRaster is a user-friendly, interactive, flexible tool which supports a broad range of visual representations. This tool has been successfully used to analyse both synthetic and experimentally recorded datasets. In this paper, the main features of iRaster are described and its performance and effectiveness are demonstrated using various types of data including experimental multi-electrode array recordings from the ganglion cell layer in mouse retina. iRaster is part of an ongoing research project called VISA (Visualization of Inter-Spike Associations) at the Visualization Lab in the University of Plymouth. The overall aim of the VISA project is to provide neuroscientists with the ability to freely explore and analyse their data. The software is freely available from the Visualization Lab website (see www.plymouth.ac.uk/infovis). Copyright © 2010

  13. Detecting dependencies between spike trains of pairs of neurons through copulas

    DEFF Research Database (Denmark)

    Sacerdote, Laura; Tamborrino, Massimiliano; Zucca, Cristina

    2011-01-01

    The dynamics of a neuron are influenced by the connections with the network where it lies. Recorded spike trains exhibit patterns due to the interactions between neurons. However, the structure of the network is not known. A challenging task is to investigate it from the analysis of simultaneously...... the two neurons. Furthermore, the method recognizes the presence of delays in the spike propagation....

  14. Using Matrix and Tensor Factorizations for the Single-Trial Analysis of Population Spike Trains.

    Directory of Open Access Journals (Sweden)

    Arno Onken

    2016-11-01

    Full Text Available Advances in neuronal recording techniques are leading to ever larger numbers of simultaneously monitored neurons. This poses the important analytical challenge of how to capture compactly all sensory information that neural population codes carry in their spatial dimension (differences in stimulus tuning across neurons at different locations, in their temporal dimension (temporal neural response variations, or in their combination (temporally coordinated neural population firing. Here we investigate the utility of tensor factorizations of population spike trains along space and time. These factorizations decompose a dataset of single-trial population spike trains into spatial firing patterns (combinations of neurons firing together, temporal firing patterns (temporal activation of these groups of neurons and trial-dependent activation coefficients (strength of recruitment of such neural patterns on each trial. We validated various factorization methods on simulated data and on populations of ganglion cells simultaneously recorded in the salamander retina. We found that single-trial tensor space-by-time decompositions provided low-dimensional data-robust representations of spike trains that capture efficiently both their spatial and temporal information about sensory stimuli. Tensor decompositions with orthogonality constraints were the most efficient in extracting sensory information, whereas non-negative tensor decompositions worked well even on non-independent and overlapping spike patterns, and retrieved informative firing patterns expressed by the same population in response to novel stimuli. Our method showed that populations of retinal ganglion cells carried information in their spike timing on the ten-millisecond scale about spatial details of natural images. This information could not be recovered from the spike counts of these cells. First-spike latencies carried the majority of information provided by the whole spike train about fine ...
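    A toy numpy illustration of the space-by-time idea: single-trial binned spike counts (trials x time x neurons) are approximated by a product of a temporal pattern and a spatial pattern. For simplicity this sketch uses an SVD of the trial-averaged matrix rather than the constrained tensor factorizations the paper evaluates; the data, dimensions, and patterns are all synthetic.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic population: one temporal module (a bump) recruits one spatial
# module (the first 6 of 12 neurons) on every trial.
trials, T, N = 50, 30, 12
t_pattern = np.exp(-((np.arange(T) - 10) ** 2) / 20.0)   # temporal module
s_pattern = (np.arange(N) < 6).astype(float)             # spatial module
lam = 0.1 + 2.0 * np.outer(t_pattern, s_pattern)         # firing rate, time x neurons
X = rng.poisson(lam, size=(trials, T, N)).astype(float)  # single-trial spike counts

# Rank-1 space-by-time approximation of the trial-averaged response.
U, S, Vt = np.linalg.svd(X.mean(axis=0), full_matrices=False)
rank1 = S[0] * np.outer(U[:, 0], Vt[0])     # (temporal pattern) x (spatial pattern)
explained = S[0] ** 2 / np.sum(S ** 2)      # variance captured by the first mode
```

Because the data were built from a single space-by-time module, the first mode captures most of the structure; per-trial activation coefficients would be obtained by projecting each trial onto these patterns.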

  15. A unified approach to linking experimental, statistical and computational analysis of spike train data.

    Directory of Open Access Journals (Sweden)

    Liang Meng

    Full Text Available A fundamental issue in neuroscience is how to identify the multiple biophysical mechanisms through which neurons generate observed patterns of spiking activity. In previous work, we proposed a method for linking observed patterns of spiking activity to specific biophysical mechanisms based on a state space modeling framework and a sequential Monte Carlo, or particle filter, estimation algorithm. We have shown, in simulation, that this approach is able to identify a space of simple biophysical models that were consistent with observed spiking data (and included the model that generated the data, but have yet to demonstrate the application of the method to identify realistic currents from real spike train data. Here, we apply the particle filter to spiking data recorded from rat layer V cortical neurons, and correctly identify the dynamics of a slow intrinsic current. The underlying intrinsic current is successfully identified in four distinct neurons, even though the cells exhibit two distinct classes of spiking activity: regular spiking and bursting. This approach, linking statistical, computational, and experimental neuroscience, provides an effective technique to constrain detailed biophysical models to specific mechanisms consistent with observed spike train data.

  16. Interspike Interval Based Filtering of Directional Selective Retinal Ganglion Cells Spike Trains

    Directory of Open Access Journals (Sweden)

    Aurel Vasile Martiniuc

    2012-01-01

    Full Text Available The information regarding visual stimulus is encoded in spike trains at the output of retina by retinal ganglion cells (RGCs). Among these, the directional selective cells (DSRGC) are signaling the direction of stimulus motion. DSRGCs' spike trains show accentuated periods of short interspike intervals (ISIs framed by periods of isolated spikes. Here we use two types of visual stimulus, white noise and drifting bars, and show that short ISI spikes of DSRGC spike trains are more often correlated to their preferred stimulus feature (that is, the direction of stimulus motion and carry more information than longer ISI spikes. First, our results show that correlation between stimulus and recorded neuronal response is best at short ISI spiking activity and decreases as ISI becomes larger. We then used the grating bars stimulus and found that as ISI becomes shorter the directional selectivity is better and information rates are higher. Interestingly, for the less encountered type of DSRGC, known as ON-DSRGC, short ISI distribution and information rates revealed consistent differences when compared with the other directional selective cell type, the ON-OFF DSRGC. Taken together, these findings suggest that ISI-based temporal filtering integrates a mechanism for visual information processing at the output of retina toward higher stages within the early visual system.
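    A minimal sketch of the ISI-based filtering this record describes: split a spike train into "short-ISI" spikes (interval to the preceding or following spike below a threshold) and isolated spikes. The spike times and the 5 ms threshold below are made-up illustrative values.

```python
import numpy as np

# Hypothetical spike times in ms; a burst around 10-12, one around 80-82,
# and two isolated spikes.
spike_times = np.array([10.0, 11.5, 12.0, 40.0, 80.0, 81.0, 81.8, 120.0])
isi_max = 5.0   # ms, illustrative short-ISI threshold

isi = np.diff(spike_times)
# A spike is "short-ISI" if the interval to its predecessor OR successor is short.
short_prev = np.concatenate(([False], isi < isi_max))
short_next = np.concatenate((isi < isi_max, [False]))

short_isi_spikes = spike_times[short_prev | short_next]   # burst-like spikes
isolated_spikes = spike_times[~(short_prev | short_next)] # isolated spikes
```

Information rates and directional selectivity would then be computed separately on the two filtered trains, as in the study.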

  17. Voltage-spike analysis for a free-running parallel inverter

    Science.gov (United States)

    Lee, F. C. Y.; Wilson, T. G.

    1974-01-01

    Unwanted and sometimes damaging high-amplitude voltage spikes occur during each half cycle in many transistor saturable-core inverters at the moment when the core saturates and the transistors switch. The analysis shows that spikes are an intrinsic characteristic of certain types of inverters even with negligible leakage inductance and purely resistive load. The small but unavoidable after-saturation inductance of the saturable-core transformer plays an essential role in creating these undesired high-voltage spikes. State-plane analysis provides insight into the complex interaction between core and transistors, and shows the circuit parameters upon which the magnitude of these spikes depends.

  18. Which spike train distance is most suitable for distinguishing rate and temporal coding?

    Science.gov (United States)

    Satuvuori, Eero; Kreuz, Thomas

    2018-04-01

    It is commonly assumed in neuronal coding that repeated presentations of a stimulus to a coding neuron elicit similar responses. One common way to assess similarity is through spike train distances. These can be divided into spike-resolved, such as the Victor-Purpura and the van Rossum distance, and time-resolved, e.g. the ISI-, the SPIKE- and the RI-SPIKE-distance. We use independent steady-rate Poisson processes as surrogates for spike trains with fixed rate and no timing information to address two basic questions: How does the sensitivity of the different spike train distances to temporal coding depend on the rates of the two processes and how do the distances deal with very low rates? Spike-resolved distances always contain rate information even for parameters indicating time coding. This is an issue for reasonably high rates but beneficial for very low rates. In contrast, the operational range for detecting time coding of time-resolved distances is superior at normal rates, but these measures produce artefacts at very low rates. The RI-SPIKE-distance is the only measure that is sensitive to timing information only. While our results on rate-dependent expectation values for the spike-resolved distances agree with Chicharro et al. (2011), we here go one step further and specifically investigate applicability for very low rates. The most appropriate measure depends on the rates of the data being analysed. Accordingly, we summarize our results in one table that allows an easy selection of the preferred measure for any kind of data. Copyright © 2018 The Authors. Published by Elsevier B.V. All rights reserved.
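    A compact dynamic-programming sketch of one of the spike-resolved distances named above, the Victor-Purpura distance: an edit distance with unit cost for inserting or deleting a spike and cost q*|dt| for shifting one. The example trains and the choice q = 1.0 per ms are illustrative.

```python
import numpy as np

def victor_purpura(t1, t2, q):
    """Victor-Purpura spike train distance via dynamic programming.

    Insert/delete a spike: cost 1. Shift a spike by dt: cost q * |dt|.
    """
    n, m = len(t1), len(t2)
    D = np.zeros((n + 1, m + 1))
    D[:, 0] = np.arange(n + 1)   # delete all spikes of t1
    D[0, :] = np.arange(m + 1)   # insert all spikes of t2
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            D[i, j] = min(D[i - 1, j] + 1,                          # delete
                          D[i, j - 1] + 1,                          # insert
                          D[i - 1, j - 1] + q * abs(t1[i - 1] - t2[j - 1]))  # shift
    return D[n, m]

a = [10.0, 20.0, 30.0]   # spike times in ms (made-up)
b = [10.0, 22.0, 30.0]

d_time = victor_purpura(a, b, q=1.0)   # shifting 20 -> 22 costs 2
d_rate = victor_purpura(a, b, q=0.0)   # q = 0: only the spike count matters
```

The parameter q sets the time scale of the comparison: at q = 0 the distance reduces to a pure rate (count) comparison, which is exactly the rate contamination discussed in this record.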

  19. Model-based decoding, information estimation, and change-point detection techniques for multineuron spike trains.

    Science.gov (United States)

    Pillow, Jonathan W; Ahmadian, Yashar; Paninski, Liam

    2011-01-01

    One of the central problems in systems neuroscience is to understand how neural spike trains convey sensory information. Decoding methods, which provide an explicit means for reading out the information contained in neural spike responses, offer a powerful set of tools for studying the neural coding problem. Here we develop several decoding methods based on point-process neural encoding models, or forward models that predict spike responses to stimuli. These models have concave log-likelihood functions, which allow efficient maximum-likelihood model fitting and stimulus decoding. We present several applications of the encoding model framework to the problem of decoding stimulus information from population spike responses: (1) a tractable algorithm for computing the maximum a posteriori (MAP) estimate of the stimulus, the most probable stimulus to have generated an observed single- or multiple-neuron spike train response, given some prior distribution over the stimulus; (2) a Gaussian approximation to the posterior stimulus distribution that can be used to quantify the fidelity with which various stimulus features are encoded; (3) an efficient method for estimating the mutual information between the stimulus and the spike trains emitted by a neural population; and (4) a framework for the detection of change-point times (the time at which the stimulus undergoes a change in mean or variance) by marginalizing over the posterior stimulus distribution. We provide several examples illustrating the performance of these estimators with simulated and real neural data.

  20. Robustness and versatility of a nonlinear interdependence method for directional coupling detection from spike trains

    Science.gov (United States)

    Malvestio, Irene; Kreuz, Thomas; Andrzejak, Ralph G.

    2017-08-01

    The detection of directional couplings between dynamics based on measured spike trains is a crucial problem in the understanding of many different systems. In particular, in neuroscience it is important to assess the connectivity between neurons. One of the approaches that can estimate directional coupling from the analysis of point processes is the nonlinear interdependence measure L . Although its efficacy has already been demonstrated, it still needs to be tested under more challenging and realistic conditions prior to an application to real data. Thus, in this paper we use the Hindmarsh-Rose model system to test the method in the presence of noise and for different spiking regimes. We also examine the influence of different parameters and spike train distances. Our results show that the measure L is versatile and robust to various types of noise, and thus suitable for application to experimental data.

  1. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    Directory of Open Access Journals (Sweden)

    Rodrigo Cofré

    2018-01-01

    Full Text Available The spiking activity of neuronal networks follows laws that are not time-reversal symmetric; the notion of pre-synaptic and post-synaptic neurons, stimulus correlations and noise correlations have a clear time order. Therefore, a biologically realistic statistical model for the spiking activity should be able to capture some degree of time irreversibility. We use the thermodynamic formalism to build a framework in the context of maximum entropy models to quantify the degree of time irreversibility, providing an explicit formula for the information entropy production of the inferred maximum entropy Markov chain. We provide examples to illustrate our results and discuss the importance of time irreversibility for modeling the spike train statistics.

  2. Parametric models to relate spike train and LFP dynamics with neural information processing.

    Science.gov (United States)

    Banerjee, Arpan; Dean, Heather L; Pesaran, Bijan

    2012-01-01

    Spike trains and local field potentials (LFPs) resulting from extracellular current flows provide a substrate for neural information processing. Understanding the neural code from simultaneous spike-field recordings and subsequent decoding of information processing events will have widespread applications. One way to demonstrate an understanding of the neural code, with particular advantages for the development of applications, is to formulate a parametric statistical model of neural activity and its covariates. Here, we propose a set of parametric spike-field models (unified models) that can be used with existing decoding algorithms to reveal the timing of task or stimulus specific processing. Our proposed unified modeling framework captures the effects of two important features of information processing: time-varying stimulus-driven inputs and ongoing background activity that occurs even in the absence of environmental inputs. We have applied this framework for decoding neural latencies in simulated and experimentally recorded spike-field sessions obtained from the lateral intraparietal area (LIP) of awake, behaving monkeys performing cued look-and-reach movements to spatial targets. Using both simulated and experimental data, we find that estimates of trial-by-trial parameters are not significantly affected by the presence of ongoing background activity. However, including background activity in the unified model improves goodness of fit for predicting individual spiking events. Uncovering the relationship between the model parameters and the timing of movements offers new ways to test hypotheses about the relationship between neural activity and behavior. We obtained significant spike-field onset time correlations from single trials using a previously published data set where significantly strong correlation was only obtained through trial averaging. We also found that unified models extracted a stronger relationship between neural response latency and trial ...

  3. When the Ostrich-Algorithm Fails: Blanking Method Affects Spike Train Statistics

    Directory of Open Access Journals (Sweden)

    Kevin Joseph

    2018-04-01

    Full Text Available Modern electroceuticals are bound to employ the usage of electrical high frequency (130–180 Hz) stimulation carried out under closed loop control, most prominent in the case of movement disorders. However, particular challenges are faced when electrical recordings of neuronal tissue are carried out during high frequency electrical stimulation, both in-vivo and in-vitro. This stimulation produces undesired artifacts and can render the recorded signal only partially useful. The extent of these artifacts is often reduced by temporarily grounding the recording input during stimulation pulses. In the following study, we quantify the effects of this method, “blanking,” on the spike count and spike train statistics. Starting from a theoretical standpoint, we calculate a loss in the absolute number of action potentials, depending on: width of the blanking window, frequency of stimulation, and intrinsic neuronal activity. These calculations were then corroborated by actual high signal to noise ratio (SNR) single cell recordings. We state that, for clinically relevant frequencies of 130 Hz (used for movement disorders) and realistic blanking windows of 2 ms, up to 27% of actual existing spikes are lost. We strongly advise cautious use of the blanking method when spike rate quantification is attempted. Impact statement: Blanking (artifact removal by temporarily grounding the input), depending on recording parameters, can lead to significant spike loss. Very careful use of blanking circuits is advised.
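    A back-of-the-envelope version of the loss calculation in this record: if spiking is independent of the stimulus, a blanking window of width w after each pulse hides that fraction of the recording time, and hence roughly that fraction of spikes. With the paper's stated parameters (130 Hz, 2 ms) this simple bound lands at 26%, in line with the "up to 27%" the authors report.

```python
# Expected fraction of spikes hidden by blanking, assuming spiking is
# independent of the stimulus (first-order estimate only).
f_stim = 130.0       # Hz, clinically relevant stimulation frequency
w_blank = 2e-3       # s, realistic blanking window width

fraction_lost = f_stim * w_blank        # fraction of recording time blanked
percent_lost = 100 * fraction_lost      # ~26% of spikes, to first order
```

The full calculation in the paper additionally accounts for the neuron's intrinsic activity pattern, which is why their figure can exceed this uniform-time estimate.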

  4. When the Ostrich-Algorithm Fails: Blanking Method Affects Spike Train Statistics.

    Science.gov (United States)

    Joseph, Kevin; Mottaghi, Soheil; Christ, Olaf; Feuerstein, Thomas J; Hofmann, Ulrich G

    2018-01-01

    Modern electroceuticals are bound to employ the usage of electrical high frequency (130-180 Hz) stimulation carried out under closed loop control, most prominent in the case of movement disorders. However, particular challenges are faced when electrical recordings of neuronal tissue are carried out during high frequency electrical stimulation, both in-vivo and in-vitro . This stimulation produces undesired artifacts and can render the recorded signal only partially useful. The extent of these artifacts is often reduced by temporarily grounding the recording input during stimulation pulses. In the following study, we quantify the effects of this method, "blanking," on the spike count and spike train statistics. Starting from a theoretical standpoint, we calculate a loss in the absolute number of action potentials, depending on: width of the blanking window, frequency of stimulation, and intrinsic neuronal activity. These calculations were then corroborated by actual high signal to noise ratio (SNR) single cell recordings. We state that, for clinically relevant frequencies of 130 Hz (used for movement disorders) and realistic blanking windows of 2 ms, up to 27% of actual existing spikes are lost. We strongly advise cautious use of the blanking method when spike rate quantification is attempted. Blanking (artifact removal by temporarily grounding input), depending on recording parameters, can lead to significant spike loss. Very careful use of blanking circuits is advised.

  5. Learning to Recognize Actions From Limited Training Examples Using a Recurrent Spiking Neural Model

    Science.gov (United States)

    Panda, Priyadarshini; Srinivasa, Narayan

    2018-01-01

    A fundamental challenge in machine learning today is to build a model that can learn from few examples. Here, we describe a reservoir-based spiking neural model for learning to recognize actions with a limited number of labeled videos. First, we propose a novel encoding, inspired by how microsaccades influence visual perception, to extract spike information from raw video data while preserving the temporal correlation across different frames. Using this encoding, we show that the reservoir generalizes its rich dynamical activity toward signature action/movements enabling it to learn from few training examples. We evaluate our approach on the UCF-101 dataset. Our experiments demonstrate that our proposed reservoir achieves 81.3/87% Top-1/Top-5 accuracy, respectively, on the 101-class data while requiring just 8 video examples per class for training. Our results establish a new benchmark for action recognition from limited video examples for spiking neural models while yielding competitive accuracy with respect to state-of-the-art non-spiking neural models. PMID:29551962

  6. Detection of bursts in extracellular spike trains using hidden semi-Markov point process models.

    Science.gov (United States)

    Tokdar, Surya; Xi, Peiyi; Kelly, Ryan C; Kass, Robert E

    2010-08-01

    Neurons in vitro and in vivo have epochs of bursting or "up state" activity during which firing rates are dramatically elevated. Various methods of detecting bursts in extracellular spike trains have appeared in the literature, the most widely used apparently being Poisson Surprise (PS). A natural description of the phenomenon assumes (1) there are two hidden states, which we label "burst" and "non-burst," (2) the neuron evolves stochastically, switching at random between these two states, and (3) within each state the spike train follows a time-homogeneous point process. If in (2) the transitions from non-burst to burst and burst to non-burst states are memoryless, this becomes a hidden Markov model (HMM). For HMMs, the dwell times in each state follow exponential distributions and are therefore highly irregular. Because observed bursting may in some cases be fairly regular-exhibiting inter-burst intervals with small variation-we relaxed this assumption. When more general probability distributions are used to describe the state transitions the two-state point process model becomes a hidden semi-Markov model (HSMM). We developed an efficient Bayesian computational scheme to fit HSMMs to spike train data. Numerical simulations indicate the method can perform well, sometimes yielding very different results than those based on PS.
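
    Poisson Surprise, the baseline the HSMM is compared against, scores a candidate burst by how improbable its spike count is under a homogeneous Poisson assumption. A minimal sketch follows; the natural logarithm is used here (some implementations use log10), and the example rates are hypothetical.

```python
import math

def poisson_surprise(n_spikes, interval_s, rate_hz):
    """Poisson Surprise of a candidate burst: negative log of the
    probability of observing >= n_spikes in interval_s given the
    neuron's baseline firing rate."""
    mu = rate_hz * interval_s
    # P(N < n) under Poisson(mu), summed term by term
    p_lt = sum(math.exp(-mu) * mu**k / math.factorial(k)
               for k in range(n_spikes))
    return -math.log(max(1.0 - p_lt, 1e-300))

# 10 spikes in 200 ms from a neuron with a 5 Hz baseline is highly
# surprising; a single spike in the same window is not
print(poisson_surprise(10, 0.2, 5.0), poisson_surprise(1, 0.2, 5.0))
```

    Burst detectors based on this score slide over the train looking for windows that maximize the surprise, which is exactly where the memoryless Poisson assumption criticized above enters.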

  7. Streaming Parallel GPU Acceleration of Large-Scale filter-based Spiking Neural Networks

    NARCIS (Netherlands)

    L.P. Slazynski (Leszek); S.M. Bohte (Sander)

    2012-01-01

    The arrival of graphics processing unit (GPU) cards suitable for massively parallel computing promises affordable large-scale neural network simulation previously only available at supercomputing facilities. While the raw numbers suggest that GPUs may outperform CPUs by at least an order of

  8. Dynamics and spike trains statistics in conductance-based integrate-and-fire neural networks with chemical and electric synapses

    International Nuclear Information System (INIS)

    Cofré, Rodrigo; Cessac, Bruno

    2013-01-01

    We investigate the effect of electric synapses (gap junctions) on collective neuronal dynamics and spike statistics in a conductance-based integrate-and-fire neural network, driven by Brownian noise, where conductances depend upon spike history. We compute explicitly the time evolution operator and show that, given the spike history of the network and the membrane potentials at a given time, the further dynamical evolution can be written in a closed form. We show that spike train statistics is described by a Gibbs distribution whose potential can be approximated with an explicit formula, when the noise is weak. This potential form encompasses existing models for spike train statistics analysis such as maximum entropy models or generalized linear models (GLMs). We also discuss the different types of correlations: those induced by a shared stimulus and those induced by neuron interactions.

  9. Parallel optical control of spatiotemporal neuronal spike activity using high-frequency digital light processing technology

    Directory of Open Access Journals (Sweden)

    Jason Jerome

    2011-08-01

    Full Text Available Neurons in the mammalian neocortex receive inputs from and communicate back to thousands of other neurons, creating complex spatiotemporal activity patterns. The experimental investigation of these parallel dynamic interactions has been limited due to the technical challenges of monitoring or manipulating neuronal activity at that level of complexity. Here we describe a new massively parallel photostimulation system that can be used to control action potential firing in in vitro brain slices with high spatial and temporal resolution while performing extracellular or intracellular electrophysiological measurements. The system uses Digital-Light-Processing (DLP) technology to generate 2-dimensional (2D) stimulus patterns with >780,000 independently controlled photostimulation sites that operate at high spatial (5.4 µm) and temporal (>13 kHz) resolution. Light is projected through the quartz-glass bottom of the perfusion chamber providing access to a large area (2.76 x 2.07 mm²) of the slice preparation. This system has the unique capability to induce temporally precise action potential firing in large groups of neurons distributed over a wide area covering several cortical columns. Parallel photostimulation opens up new opportunities for the in vitro experimental investigation of spatiotemporal neuronal interactions at a broad range of anatomical scales.

  10. Strength Training Parallel with Plyometric and Cross training Influences on Speed Endurance

    OpenAIRE

    C.C.Chandra Obul Reddy; Dr. K. Rama Subba Reddy

    2017-01-01

    The purpose of the study was to find out the influence of weight training parallel with plyometric and cross training on speed endurance. To achieve this purpose of the study, forty-five male students studying at CSSR & SRRM Degree College, Kamalapuram, YSR (D), Andhra Pradesh, India were randomly selected as subjects during the year 2015-2016. They were divided into three equal groups of fifteen subjects each. Group I underwent weight training parallel with plyometric training for three sessions...

  11. Stochastic resonance of ensemble neurons for transient spike trains: Wavelet analysis

    International Nuclear Information System (INIS)

    Hasegawa, Hideo

    2002-01-01

    By using the wavelet transformation (WT), I have analyzed the response of an ensemble of N (=1, 10, 100, and 500) Hodgkin-Huxley neurons to transient M-pulse spike trains (M=1 to 3) with independent Gaussian noises. The cross correlation between the input and output signals is expressed in terms of the WT expansion coefficients. The signal-to-noise ratio (SNR) is evaluated by using the denoising method within the WT, by which the noise contribution is extracted from the output signals. Although the response of a single (N=1) neuron to subthreshold transient signals with noises is quite unreliable, the transmission fidelity assessed by the cross correlation and SNR is shown to be much improved by increasing the value of N: a population of neurons plays an indispensable role in the stochastic resonance (SR) for transient spike inputs. It is also shown that in a large-scale ensemble, the transmission fidelity for suprathreshold transient spikes is not significantly degraded by a weak noise which is responsible for SR for subthreshold inputs.

  12. Fractal characterization of acupuncture-induced spike trains of rat WDR neurons

    International Nuclear Information System (INIS)

    Chen, Yingyuan; Guo, Yi; Wang, Jiang; Hong, Shouhai; Wei, Xile; Yu, Haitao; Deng, Bin

    2015-01-01

    Highlights: •Fractal analysis is a valuable tool for measuring MA-induced neural activities. •In the course of the experiments, the spike trains display different fractal properties. •The fractal properties reflect the long-term modulation of MA on WDR neurons. •The results may explain the long-lasting effects induced by acupuncture. -- Abstract: Experimental and clinical studies have shown that manual acupuncture (MA) can evoke multiple responses in various neural regions. Characterising the neuronal activities in these regions may provide deeper insights into acupuncture mechanisms. This paper used fractal analysis to investigate MA-induced spike trains of Wide Dynamic Range (WDR) neurons in the rat spinal dorsal horn, an important relay station and integral component in processing acupuncture information. The Allan factor and the Fano factor were utilized to test whether the spike trains were fractal, and the Allan factor was used to evaluate the scaling exponents and Hurst exponents. It was found that these two fractal exponents before and during MA differed significantly. During MA, the scaling exponents of WDR neurons were regulated within a small range, indicating a special fractal pattern. The neuronal activities were long-range correlated over multiple time scales. The scaling exponents during and after MA were similar, suggesting that the long-range correlations not only appeared during MA, but also extended after withdrawal of the needle. Our results show that fractal analysis is a useful tool for measuring acupuncture effects. MA modulates neuronal activities whose fractal properties change as time proceeds. This evolution of fractal dynamics over the course of MA experiments may explain, at the level of the neuron, why the effects of MA observed experimentally and clinically are complex, time-evolving, and long-range, even lasting for some time after stimulation.
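
    The Fano-factor test used here is straightforward to sketch: count spikes in windows of width T and compare the variance of the counts to their mean. A Poisson train gives F(T) ≈ 1 at every T, while a fractal train shows F(T) growing as a power law whose log-log slope is the scaling exponent. The following is a generic illustration with assumed parameters, not the paper's analysis pipeline.

```python
import random

def fano_factor(spikes, window_s, duration_s):
    """Fano factor F(T) = Var(N_T) / Mean(N_T) of spike counts in
    windows of width T; F ~ 1 at all T for a Poisson train, while for
    fractal trains F(T) grows as a power law in T."""
    nbins = int(duration_s / window_s)
    counts = [0] * nbins
    for t in spikes:
        b = int(t / window_s)
        if b < nbins:
            counts[b] += 1
    mean = sum(counts) / nbins
    var = sum((c - mean) ** 2 for c in counts) / nbins
    return var / mean

rng = random.Random(1)
duration = 400.0
train = sorted(rng.uniform(0.0, duration) for _ in range(8000))  # ~Poisson
# a flat, near-1 Fano profile across window sizes indicates no fractality
print(round(fano_factor(train, 0.5, duration), 2),
      round(fano_factor(train, 2.0, duration), 2))
```

    The Allan factor used in the paper is a related count-based statistic (built on differences of adjacent window counts) that is better behaved for estimating the exponents, but the windowing logic is the same.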

  13. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    Science.gov (United States)

    Cofré, Rodrigo; Maldonado, Cesar

    2018-01-01

    We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. We review large deviations techniques useful in this context to describe properties of accuracy and convergence in terms of sampling size. We use these results to study the statistical fluctuation of correlations, distinguishability and irreversibility of maximum entropy Markov chains. We illustrate these applications using simple examples where the large deviation rate function is explicitly obtained for maximum entropy models of relevance in this field.
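
    For a stationary Markov chain the information entropy production mentioned here has a closed form: it is the sum over transitions of pi_i P_ij log(pi_i P_ij / (pi_j P_ji)), which vanishes exactly when detailed balance holds and is positive for irreversible dynamics. A small self-contained sketch (the example chains are invented for illustration):

```python
import math

def stationary(P, iters=5000):
    """Stationary distribution of a row-stochastic matrix by power iteration."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(iters):
        pi = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
    return pi

def entropy_production(P):
    """Entropy production rate sum_ij pi_i P_ij ln(pi_i P_ij / (pi_j P_ji));
    zero iff the chain satisfies detailed balance (is reversible)."""
    pi = stationary(P)
    n = len(P)
    return sum(pi[i] * P[i][j] * math.log(pi[i] * P[i][j] / (pi[j] * P[j][i]))
               for i in range(n) for j in range(n)
               if P[i][j] > 0 and P[j][i] > 0)

# symmetric chain: reversible, zero entropy production
P_rev = [[0.5, 0.25, 0.25], [0.25, 0.5, 0.25], [0.25, 0.25, 0.5]]
# cyclically biased chain (0 -> 1 -> 2 -> 0): irreversible
P_irr = [[0.1, 0.8, 0.1], [0.1, 0.1, 0.8], [0.8, 0.1, 0.1]]
print(round(entropy_production(P_rev), 6), round(entropy_production(P_irr), 3))
```

    In the paper's setting P is not hand-written but inferred from spike data under maximum entropy constraints, and the large deviations machinery quantifies how this quantity fluctuates with finite sampling.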

  14. Temporal Pattern of Online Communication Spike Trains in Spreading a Scientific Rumor: How Often, Who Interacts with Whom?

    Directory of Open Access Journals (Sweden)

    Ceyda Sanli

    2015-09-01

    Full Text Available We study complex time series (spike trains) of online user communication while spreading messages about the discovery of the Higgs boson on Twitter. We focus on online social interactions among users such as retweet, mention, and reply, and construct different types of active (performing an action) and passive (receiving an action) spike trains for each user. The spike trains are analyzed by means of local variation, to quantify the temporal behavior of active and passive users, as a function of their activity and popularity. We show that the active spike trains are bursty, independently of their activation frequency. For passive spike trains, in contrast, the local variation of popular users presents uncorrelated (Poisson) random dynamics. We further characterize the correlations of the local variation in different interactions. We obtain high values of correlation, and thus consistent temporal behavior, between retweets and mentions, but only for popular users, indicating that creating online attention suggests an alignment in the dynamics of the two interactions.
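
    The local variation statistic used in this record compares each inter-event interval with its successor, which makes it robust to slow rate changes (unlike the coefficient of variation). A minimal sketch of the standard formula, with illustrative surrogate data:

```python
import random

def local_variation(intervals):
    """Local variation Lv of a sequence of inter-event intervals:
    Lv = 3/(n-1) * sum ((T_i - T_{i+1}) / (T_i + T_{i+1}))^2.
    Roughly 0 for regular, 1 for Poisson, and >1 for bursty trains."""
    terms = [((a - b) / (a + b)) ** 2
             for a, b in zip(intervals[:-1], intervals[1:])]
    return 3.0 * sum(terms) / len(terms)

rng = random.Random(0)
poisson_isi = [rng.expovariate(1.0) for _ in range(5000)]
regular_isi = [1.0] * 5000
print(round(local_variation(regular_isi), 2),
      round(local_variation(poisson_isi), 2))
```

    Applied to a user's retweet or mention timestamps, Lv > 1 is what the authors call bursty behavior, while the Poisson-like Lv ≈ 1 characterizes the passive trains of popular users.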

  15. Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks.

    Science.gov (United States)

    Pena, Rodrigo F O; Vellmer, Sebastian; Bernardi, Davide; Roque, Antonio C; Lindner, Benjamin

    2018-01-01

    Recurrent networks of spiking neurons can be in an asynchronous state characterized by low or absent cross-correlations and spike statistics which resemble those of cortical neurons. Although spatial correlations are negligible in this state, neurons can show pronounced temporal correlations in their spike trains that can be quantified by the autocorrelation function or the spike-train power spectrum. Depending on cellular and network parameters, correlations display diverse patterns (ranging from simple refractory-period effects and stochastic oscillations to slow fluctuations) and it is generally not well-understood how these dependencies come about. Previous work has explored how the single-cell correlations in a homogeneous network (excitatory and inhibitory integrate-and-fire neurons with nearly balanced mean recurrent input) can be determined numerically from an iterative single-neuron simulation. Such a scheme is based on the fact that every neuron is driven by the network noise (i.e., the input currents from all its presynaptic partners) but also contributes to the network noise, leading to a self-consistency condition for the input and output spectra. Here we first extend this scheme to homogeneous networks with strong recurrent inhibition and a synaptic filter, in which instabilities of the previous scheme are avoided by an averaging procedure. We then extend the scheme to heterogeneous networks in which (i) different neural subpopulations (e.g., excitatory and inhibitory neurons) have different cellular or connectivity parameters; (ii) the number and strength of the input connections are random (Erdős-Rényi topology) and thus different among neurons. In all heterogeneous cases, neurons are lumped in different classes each of which is represented by a single neuron in the iterative scheme; in addition, we make a Gaussian approximation of the input current to the neuron. These approximations seem to be justified over a broad range of parameters as

  16. Self-Consistent Scheme for Spike-Train Power Spectra in Heterogeneous Sparse Networks

    Directory of Open Access Journals (Sweden)

    Rodrigo F. O. Pena

    2018-03-01

    Full Text Available Recurrent networks of spiking neurons can be in an asynchronous state characterized by low or absent cross-correlations and spike statistics which resemble those of cortical neurons. Although spatial correlations are negligible in this state, neurons can show pronounced temporal correlations in their spike trains that can be quantified by the autocorrelation function or the spike-train power spectrum. Depending on cellular and network parameters, correlations display diverse patterns (ranging from simple refractory-period effects and stochastic oscillations to slow fluctuations) and it is generally not well-understood how these dependencies come about. Previous work has explored how the single-cell correlations in a homogeneous network (excitatory and inhibitory integrate-and-fire neurons with nearly balanced mean recurrent input) can be determined numerically from an iterative single-neuron simulation. Such a scheme is based on the fact that every neuron is driven by the network noise (i.e., the input currents from all its presynaptic partners) but also contributes to the network noise, leading to a self-consistency condition for the input and output spectra. Here we first extend this scheme to homogeneous networks with strong recurrent inhibition and a synaptic filter, in which instabilities of the previous scheme are avoided by an averaging procedure. We then extend the scheme to heterogeneous networks in which (i) different neural subpopulations (e.g., excitatory and inhibitory neurons) have different cellular or connectivity parameters; (ii) the number and strength of the input connections are random (Erdős-Rényi topology) and thus different among neurons. In all heterogeneous cases, neurons are lumped in different classes each of which is represented by a single neuron in the iterative scheme; in addition, we make a Gaussian approximation of the input current to the neuron. These approximations seem to be justified over a broad range of

  17. Self-consistent determination of the spike-train power spectrum in a neural network with sparse connectivity

    Directory of Open Access Journals (Sweden)

    Benjamin Dummer

    2014-09-01

    Full Text Available A major source of random variability in cortical networks is the quasi-random arrival of presynaptic action potentials from many other cells. In network studies as well as in the study of the response properties of single cells embedded in a network, synaptic background input is often approximated by Poissonian spike trains. However, the output statistics of the cells is in most cases far from being Poisson. This is inconsistent with the assumption of similar spike-train statistics for pre- and postsynaptic cells in a recurrent network. Here we tackle this problem for the popular class of integrate-and-fire neurons and study a self-consistent statistics of input and output spectra of neural spike trains. Instead of actually using a large network, we use an iterative scheme, in which we simulate a single neuron over several generations. In each of these generations, the neuron is stimulated with surrogate stochastic input that has a similar statistics as the output of the previous generation. For the surrogate input, we employ two distinct approximations: (i) a superposition of renewal spike trains with the same interspike interval density as observed in the previous generation and (ii) a Gaussian current with a power spectrum proportional to that observed in the previous generation. For input parameters that correspond to balanced input in the network, both the renewal and the Gaussian iteration procedure converge quickly and yield comparable results for the self-consistent spike-train power spectrum. We compare our results to large-scale simulations of a random sparsely connected network of leaky integrate-and-fire neurons (Brunel, J. Comp. Neurosci. 2000) and show that in the asynchronous regime close to a state of balanced synaptic input from the network, our iterative schemes provide excellent approximations to the autocorrelation of spike trains in the recurrent network.

  18. Breaking Bad News Training Program Based on Video Reviews and SPIKES Strategy: What do Perinatology Residents Think about It?

    Science.gov (United States)

    Setubal, Maria Silvia Vellutini; Gonçalves, Andrea Vasconcelos; Rocha, Sheyla Ribeiro; Amaral, Eliana Martorano

    2017-10-01

    Objective  Resident doctors usually face the task of communicating bad news in perinatology without any formal training. The impact on parents can be disastrous. The objective of this paper is to analyze the perception of residents regarding a training program in communicating bad news in perinatology based on video reviews and the setting, perception, invitation, knowledge, emotion, and summary (SPIKES) strategy. Methods  We performed the analysis of complementary data collected from participants in a randomized controlled intervention study to evaluate the efficacy of a training program on improving residents' skills to communicate bad news. Data were collected using a Likert scale. Through a thematic content analysis we tried to apprehend the meanings, feelings and experiences expressed by resident doctors in their comments as a response to an open-ended question. Half of the group received training, consisting of discussions of video reviews of participants' simulated encounters communicating a perinatal loss to a "mother" based on the SPIKES strategy. We also offered training sessions to the control group after they completed participation. Twenty-eight residents who were randomized to intervention and 16 from the control group received training. Twenty written comments were analyzed. Results  The majority of the residents evaluated training highly as an education activity to help increase knowledge, ability and understanding about breaking bad news in perinatology. Three big categories emerged from residents' comments: SPIKES training effects; bad news communication in medical training; and doctors' feelings and relationship with patients. Conclusions  Residents took SPIKES training as a guide to systematize the communication of bad news and to amplify perceptions of the emotional needs of the patients. They suggested the insertion of a similar training in their residency programs curricula. Thieme Revinter Publicações Ltda Rio de Janeiro, Brazil.

  19. Optimal decision making on the basis of evidence represented in spike trains.

    Science.gov (United States)

    Zhang, Jiaxiang; Bogacz, Rafal

    2010-05-01

    Experimental data indicate that perceptual decision making involves integration of sensory evidence in certain cortical areas. Theoretical studies have proposed that the computation in neural decision circuits approximates statistically optimal decision procedures (e.g., sequential probability ratio test) that maximize the reward rate in sequential choice tasks. However, these previous studies assumed that the sensory evidence was represented by continuous values from Gaussian distributions with the same variance across alternatives. In this article, we make a more realistic assumption that sensory evidence is represented in spike trains described by Poisson processes, which naturally satisfy the mean-variance relationship observed in sensory neurons. We show that for such a representation, the neural circuits involving cortical integrators and basal ganglia can approximate the optimal decision procedures for two and multiple alternative choice tasks.
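
    With Poisson-coded evidence, the sequential probability ratio test becomes simple: each spike on the train favoring alternative A shifts the log-likelihood ratio by +log(r_hi/r_lo), each spike on the other train by the negative of that, and a decision is made when a threshold is crossed. The sketch below is a small-bin approximation with invented rates and thresholds, not the circuit model of the paper.

```python
import math, random

def sprt_poisson(rate_hi, rate_lo, true_is_a, threshold, dt=0.001,
                 seed=0, max_t=60.0):
    """Sequential probability ratio test on two Poisson spike trains.
    Under H_A train 1 fires at rate_hi and train 2 at rate_lo; under
    H_B the rates are swapped. Returns (choice, decision_time_s)."""
    rng = random.Random(seed)
    r1, r2 = (rate_hi, rate_lo) if true_is_a else (rate_lo, rate_hi)
    step = math.log(rate_hi / rate_lo)  # evidence carried by one spike
    llr, t = 0.0, 0.0
    while t < max_t:
        n1 = rng.random() < r1 * dt  # at most one spike per small bin
        n2 = rng.random() < r2 * dt
        llr += (int(n1) - int(n2)) * step
        t += dt
        if llr >= threshold:
            return "A", t
        if llr <= -threshold:
            return "B", t
    return "undecided", t

choice, t = sprt_poisson(40.0, 20.0, True, threshold=4.0)
print(choice, round(t, 3))
```

    Raising the threshold trades slower decisions for a lower error rate (Wald's bound gives roughly exp(-threshold) errors for symmetric thresholds), which is the speed-accuracy trade-off the reward-rate analysis optimizes.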

  20. Neural Spike-Train Analyses of the Speech-Based Envelope Power Spectrum Model

    Science.gov (United States)

    Rallapalli, Varsha H.

    2016-01-01

    Diagnosing and treating hearing impairment is challenging because people with similar degrees of sensorineural hearing loss (SNHL) often have different speech-recognition abilities. The speech-based envelope power spectrum model (sEPSM) has demonstrated that the signal-to-noise ratio (SNRENV) from a modulation filter bank provides a robust speech-intelligibility measure across a wider range of degraded conditions than many long-standing models. In the sEPSM, noise (N) is assumed to: (a) reduce S + N envelope power by filling in dips within clean speech (S) and (b) introduce an envelope noise floor from intrinsic fluctuations in the noise itself. While the promise of SNRENV has been demonstrated for normal-hearing listeners, it has not been thoroughly extended to hearing-impaired listeners because of limited physiological knowledge of how SNHL affects speech-in-noise envelope coding relative to noise alone. Here, envelope coding to speech-in-noise stimuli was quantified from auditory-nerve model spike trains using shuffled correlograms, which were analyzed in the modulation-frequency domain to compute modulation-band estimates of neural SNRENV. Preliminary spike-train analyses show strong similarities to the sEPSM, demonstrating feasibility of neural SNRENV computations. Results suggest that individual differences can occur based on differential degrees of outer- and inner-hair-cell dysfunction in listeners currently diagnosed into the single audiological SNHL category. The predicted acoustic-SNR dependence in individual differences suggests that the SNR-dependent rate of susceptibility could be an important metric in diagnosing individual differences. Future measurements of the neural SNRENV in animal studies with various forms of SNHL will provide valuable insight for understanding individual differences in speech-in-noise intelligibility.

  1. Reconstruction of sparse connectivity in neural networks from spike train covariances

    International Nuclear Information System (INIS)

    Pernice, Volker; Rotter, Stefan

    2013-01-01

    The inference of causation from correlation is in general highly problematic. Correspondingly, it is difficult to infer the existence of physical synaptic connections between neurons from correlations in their activity. Covariances in neural spike trains and their relation to network structure have been the subject of intense research, both experimentally and theoretically. The influence of recurrent connections on covariances can be characterized directly in linear models, where connectivity in the network is described by a matrix of linear coupling kernels. However, as indirect connections also give rise to covariances, the inverse problem of inferring network structure from covariances can generally not be solved unambiguously. Here we study to what degree this ambiguity can be resolved if the sparseness of neural networks is taken into account. To reconstruct a sparse network, we determine the minimal set of linear couplings consistent with the measured covariances by minimizing the L1 norm of the coupling matrix under appropriate constraints. Contrary to intuition, after stochastic optimization of the coupling matrix, the resulting estimate of the underlying network is directed, despite the fact that a symmetric matrix of count covariances is used for inference. The performance of the new method is best if connections are neither exceedingly sparse, nor too dense, and it is easily applicable for networks of a few hundred nodes. Full coupling kernels can be obtained from the matrix of full covariance functions. We apply our method to networks of leaky integrate-and-fire neurons in an asynchronous–irregular state, where spike train covariances are well described by a linear model. (paper)
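
    The L1-minimization step can be illustrated with a generic iterative soft-thresholding (ISTA) solver for min ||Ax - y||² + λ||x||₁: the soft-threshold operator zeroes out small couplings, which is what makes the recovered network sparse. This is a stand-in sketch on a toy problem with made-up dimensions, not the authors' constrained formulation over covariance matrices.

```python
import math, random

def ista_l1(A, y, lam, step, iters):
    """Minimize ||A x - y||^2 + lam * ||x||_1 by iterative soft
    thresholding; the L1 penalty drives most couplings exactly to zero."""
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(iters):
        # residual and gradient of the quadratic term
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        g = [2.0 * sum(A[i][j] * r[i] for i in range(m)) for j in range(n)]
        # gradient step followed by soft thresholding
        x = [math.copysign(max(abs(v) - step * lam, 0.0), v)
             for v in [x[j] - step * g[j] for j in range(n)]]
    return x

rng = random.Random(0)
m, n = 30, 10
A = [[rng.gauss(0.0, 1.0) / math.sqrt(m) for _ in range(n)] for _ in range(m)]
x_true = [0.0] * n
x_true[2], x_true[7] = 1.5, -1.0        # sparse "connectivity" vector
y = [sum(A[i][j] * x_true[j] for j in range(n)) for i in range(m)]
x_hat = ista_l1(A, y, lam=0.05, step=0.1, iters=2000)
print([round(v, 2) for v in x_hat])
```

    In the paper's setting the unknown is the full coupling matrix and the constraints come from the measured count covariances, but the role of the L1 norm is the same: among all couplings consistent with the data, pick the sparsest.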

  2. Segmental Bayesian estimation of gap-junctional and inhibitory conductance of inferior olive neurons from spike trains with complicated dynamics

    Directory of Open Access Journals (Sweden)

    Huu Hoang

    2015-05-01

    Full Text Available The inverse problem of estimating model parameters from brain spike data is ill-posed because of a huge mismatch in system complexity between the model and the brain, as well as the brain's non-stationary dynamics, and needs a stochastic approach that finds the most likely solution among many possible solutions. In the present study, we developed a segmental Bayesian method to estimate the two parameters of interest, the gap-junctional (gc) and inhibitory (gi) conductances, from inferior olive spike data. Feature vectors were estimated for the spike data in a segment-wise fashion to compensate for the non-stationary firing dynamics. Hierarchical Bayesian estimation was conducted to estimate gc and gi for every spike segment using a forward model constructed in the principal component analysis (PCA) space of the feature vectors, and to merge the segmental estimates into single estimates for every neuron. The segmental Bayesian estimation gave smaller fitting errors than the conventional Bayesian inference, which finds the estimates once across the entire spike data, or the minimum error method, which directly finds the closest match in the PCA space. The segmental Bayesian inference has the potential to overcome the problem of non-stationary dynamics and resolve the ill-posedness of the inverse problem caused by the mismatch between the model and the brain under the given constraints, and it is a useful tool to evaluate parameters of interest for neuroscience from experimental spike train data.

  3. Estimation of parameters in Shot-Noise-Driven Doubly Stochastic Poisson processes using the EM algorithm--modeling of pre- and postsynaptic spike trains.

    Science.gov (United States)

    Mino, H

    2007-01-01

    We estimate the parameters, namely the impulse response (IR) functions of the linear time-invariant systems generating the intensity processes, of Shot-Noise-Driven Doubly Stochastic Poisson Processes (SND-DSPPs), under the assumption that multivariate presynaptic spike trains and postsynaptic spike trains can be modeled by SND-DSPPs. An explicit formula for estimating the IR functions from observations of the multivariate input processes of the linear systems and the corresponding counting process (output process) is derived utilizing the expectation maximization (EM) algorithm. The validity of the estimation formula was verified through Monte Carlo simulations in which two presynaptic spike trains and one postsynaptic spike train were assumed to be observable. The IR functions estimated on the basis of the proposed identification method were close to the true IR functions. The proposed method will play an important role in identifying the input-output relationship of pre- and postsynaptic neural spike trains in practical situations.

  4. Method for stationarity-segmentation of spike train data with application to the Pearson cross-correlation.

    Science.gov (United States)

    Quiroga-Lombard, Claudio S; Hass, Joachim; Durstewitz, Daniel

    2013-07-01

    Correlations among neurons are supposed to play an important role in computation and information coding in the nervous system. Empirically, functional interactions between neurons are most commonly assessed by cross-correlation functions. Recent studies have suggested that pairwise correlations may indeed be sufficient to capture most of the information present in neural interactions. Many applications of correlation functions, however, implicitly tend to assume that the underlying processes are stationary. This assumption will usually fail for real neurons recorded in vivo since their activity during behavioral tasks is heavily influenced by stimulus-, movement-, or cognition-related processes as well as by more general processes like slow oscillations or changes in state of alertness. To address the problem of nonstationarity, we introduce a method for assessing stationarity empirically and then "slicing" spike trains into stationary segments according to the statistical definition of weak-sense stationarity. We examine pairwise Pearson cross-correlations (PCCs) under both stationary and nonstationary conditions and identify another source of covariance that can be differentiated from the covariance of the spike times and emerges as a consequence of residual nonstationarities after the slicing process: the covariance of the firing rates defined on each segment. Based on this, a correction of the PCC is introduced that accounts for the effect of segmentation. We probe these methods both on simulated data sets and on in vivo recordings from the prefrontal cortex of behaving rats. Beyond removing nonstationarities, the present method may also be used for detecting significant events in spike trains.
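
    The core pitfall this record addresses is easy to demonstrate: two neurons that never interact, but share a common rate change, show a spuriously positive pooled Pearson correlation, while per-segment correlations on (weak-sense) stationary stretches stay near zero. A toy illustration with invented rates and bin counts, not the authors' slicing algorithm:

```python
import random

def pcc(x, y):
    """Pearson cross-correlation of two binned spike-count sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y)) / n
    vx = sum((a - mx) ** 2 for a in x) / n
    vy = sum((b - my) ** 2 for b in y) / n
    return cov / (vx * vy) ** 0.5

rng = random.Random(0)

def bins(p, k):
    """k bins of Bernoulli spike counts with per-bin probability p."""
    return [int(rng.random() < p) for _ in range(k)]

# two INDEPENDENT neurons whose firing rates both jump in the second half
x = bins(0.1, 1000) + bins(0.4, 1000)
y = bins(0.1, 1000) + bins(0.4, 1000)
pooled = pcc(x, y)  # inflated by the shared, nonstationary rate step
sliced = 0.5 * (pcc(x[:1000], y[:1000]) + pcc(x[1000:], y[1000:]))
print(round(pooled, 2), round(sliced, 2))
```

    The rate-step contribution that inflates `pooled` is exactly the "covariance of the firing rates defined on each segment" that the proposed PCC correction removes.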

  5. Spatio-temporal spike train analysis for large scale networks using the maximum entropy principle and Monte Carlo method

    International Nuclear Information System (INIS)

    Nasser, Hassan; Cessac, Bruno; Marre, Olivier

    2013-01-01

    Understanding the dynamics of neural networks is a major challenge in experimental neuroscience. For that purpose, a modelling of the recorded activity that reproduces the main statistics of the data is required. In the first part, we present a review on recent results dealing with spike train statistics analysis using maximum entropy models (MaxEnt). Most of these studies have focused on modelling synchronous spike patterns, leaving aside the temporal dynamics of the neural activity. However, the maximum entropy principle can be generalized to the temporal case, leading to Markovian models where memory effects and time correlations in the dynamics are properly taken into account. In the second part, we present a new method based on Monte Carlo sampling which is suited for the fitting of large-scale spatio-temporal MaxEnt models. The formalism and the tools presented here will be essential to fit MaxEnt spatio-temporal models to large neural ensembles. (paper)

  6. A prolongation of the postspike afterhyperpolarization following spike trains can partly explain the lower firing rates at derecruitment than those at recruitment

    DEFF Research Database (Denmark)

    Wienecke, Jacob; Zhang, Mengliang; Hultborn, Hans

    2009-01-01

    rates at derecruitment correlated with a change in the postspike afterhyperpolarization (AHP) after preceding spike trains? This question was investigated by intracellular recordings from cat motor neurons in both unanesthetized and anesthetized preparations. The firing frequencies at recruitment...... for the lower frequencies at derecruitment. This was independent of whether the current injection had activated persistent inward current (PIC; plateau potentials, secondary range firing). It was found that a preceding spike train could prolong the AHP duration following a subsequent spike. The lower rate...... from AHP duration in fast motoneurons and higher than expected in slow motoneurons. It is suggested that these deviations are explained by the presence of synaptic noise as well as recruitment of PICs below firing threshold. Thus synaptic noise may allow spike discharge even after the end of the AHP...

  7. Parallelization of Neural Network Training for NLP with Hogwild!

    Directory of Open Access Journals (Sweden)

    Deyringer Valentin

    2017-10-01

    Neural networks are prevalent in today's NLP research. Despite their success on different tasks, training times are relatively long. We use Hogwild! to counteract this and show that it is a suitable method for speeding up the training of neural networks of different architectures and complexity. For POS tagging and translation we report considerable training speedups, especially for the latter. We show that Hogwild! can be an important tool for training complex NLP architectures.
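
Hogwild! runs stochastic gradient descent from several workers that read and write a shared parameter vector without locks. A minimal sketch on least-squares regression (illustrative only: CPython's GIL prevents true numeric parallelism, but the lock-free update pattern is the same):

```python
import numpy as np
from threading import Thread

def hogwild_sgd(X, y, n_threads=4, epochs=5, lr=0.05):
    """Lock-free parallel SGD in the spirit of Hogwild!: worker threads
    update a shared weight vector without synchronization. Illustrative
    sketch; in CPython the GIL serializes most numeric work."""
    w = np.zeros(X.shape[1])              # shared, unprotected state
    def worker(idx):
        for _ in range(epochs):
            for i in idx:
                grad = (X[i] @ w - y[i]) * X[i]
                w[:] -= lr * grad         # racy in-place update, no lock
    chunks = np.array_split(np.random.default_rng(0).permutation(len(y)),
                            n_threads)
    threads = [Thread(target=worker, args=(c,)) for c in chunks]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return w
```

The key Hogwild! observation is that sparse, overlapping updates still converge despite the races, so no locking is needed.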

  8. Spike Train Auto-Structure Impacts Post-Synaptic Firing and Timing-Based Plasticity

    Science.gov (United States)

    Scheller, Bertram; Castellano, Marta; Vicente, Raul; Pipa, Gordon

    2011-01-01

    Cortical neurons are typically driven by several thousand synapses. The precise spatiotemporal pattern formed by these inputs can modulate the response of a post-synaptic cell. In this work, we explore how the temporal structure of pre-synaptic inhibitory and excitatory inputs impacts the post-synaptic firing of a conductance-based integrate-and-fire neuron. Both the excitatory and inhibitory inputs were modeled by renewal gamma processes with varying shape factors, covering both regular and temporally random (Poisson) activity. We demonstrate that the temporal structure of mutually independent inputs affects the post-synaptic firing, with the strength of the effect depending on the firing rates of both the excitatory and inhibitory inputs. In a second step, we explore the effect of the temporal structure of mutually independent inputs on a simple version of Hebbian learning, i.e., hard-bound spike-timing-dependent plasticity. We explore both the equilibrium weight distribution and the speed of the transient weight dynamics for different mutually independent gamma processes. We find that both the equilibrium distribution of the synaptic weights and the speed of synaptic changes are modulated by the temporal structure of the input. Finally, we highlight that the sensitivity of both the post-synaptic firing and the spike-timing-dependent plasticity to the auto-structure of a neuron's input could be used to modulate the learning rate of synaptic modification. PMID:22203800
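
The renewal gamma inputs described above can be generated by cumulatively summing gamma-distributed interspike intervals; shape 1 recovers a Poisson process, while larger shapes give more regular trains at the same mean rate. A sketch (parameterization assumed):

```python
import numpy as np

def gamma_spike_train(rate, shape, t_max, seed=0):
    """Renewal spike train with gamma-distributed ISIs.
    shape=1 reproduces a Poisson process; larger shapes give more
    regular firing (ISI CV of about 1/sqrt(shape)). The mean ISI is
    held at 1/rate regardless of shape."""
    rng = np.random.default_rng(seed)
    # draw more ISIs than needed, then truncate to the window
    n = int(2 * rate * t_max * (1 + 3 / np.sqrt(shape))) + 10
    isis = rng.gamma(shape, scale=1.0 / (rate * shape), size=n)
    spikes = np.cumsum(isis)
    return spikes[spikes < t_max]
```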

  9. Continuous detection of weak sensory signals in afferent spike trains: the role of anti-correlated interspike intervals in detection performance.

    Science.gov (United States)

    Goense, J B M; Ratnam, R

    2003-10-01

    An important problem in sensory processing is deciding whether fluctuating neural activity encodes a stimulus or is due to variability in baseline activity. Neurons that subserve detection must examine incoming spike trains continuously, and quickly and reliably differentiate signals from baseline activity. Here we demonstrate that a neural integrator can perform continuous signal detection, with performance exceeding that of trial-based procedures, where spike counts in signal and baseline windows are compared. The procedure was applied to data from electrosensory afferents of weakly electric fish (Apteronotus leptorhynchus), where weak perturbations generated by small prey add approximately 1 spike to a baseline of approximately 300 spikes s⁻¹. The hypothetical postsynaptic neuron, modeling an electrosensory lateral line lobe cell, could detect an added spike within 10-15 ms, achieving near-ideal detection performance (80-95%) at false alarm rates of 1-2 Hz, while trial-based testing resulted in only 30-35% correct detections at that false alarm rate. The performance improvement was due to anti-correlations in the afferent spike train, which reduced both the amplitude and duration of fluctuations in postsynaptic membrane activity, and so decreased the number of false alarms. Anti-correlations can be exploited to improve detection performance only if there is memory of prior decisions.
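
A minimal sketch of the neural-integrator detector described above: the binned spike count is leaky-integrated and a detection is flagged at each upward threshold crossing. The time constant and threshold here are illustrative, not the paper's fitted values:

```python
import numpy as np

def leaky_detect(counts, tau, threshold, dt=0.001):
    """Continuous detection with a leaky integrator: the binned spike
    count is low-pass filtered (membrane-like exponential decay) and a
    detection is flagged on each upward threshold crossing.
    Parameter values are illustrative, not taken from the paper."""
    decay = np.exp(-dt / tau)
    v, above, hits = 0.0, False, []
    for t, c in enumerate(counts):
        v = v * decay + c
        if v > threshold and not above:
            hits.append(t)               # upward crossing: one detection
        above = v > threshold
    return hits
```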

  10. Simulations of drastically reduced SBS with laser pulses composed of a Spike Train of Uneven Duration and Delay (STUD pulses)

    International Nuclear Information System (INIS)

    Hueller, S.; Afeyan, B.

    2013-01-01

    By comparing the impact of established laser smoothing techniques like Random Phase Plates (RPP) and Smoothing by Spectral Dispersion (SSD) to the concept of 'Spike Trains of Uneven Duration and Delay' (STUD pulses) on the amplification of parametric instabilities in laser-produced plasmas, we show with the help of numerical simulations, that STUD pulses can drastically reduce instability growth by orders of magnitude. The simulation results, obtained with the code Harmony in a nonuniformly flowing mm-size plasma for the Stimulated Brillouin Scattering (SBS) instability, show that the efficiency of the STUD pulse technique is due to the fact that successive re-amplification in space and time of parametrically excited plasma waves inside laser hot spots is minimized. An overall mean fluctuation level of ion acoustic waves at low amplitude is established because of the frequent change of the speckle pattern in successive spikes. This level stays orders of magnitude below the levels of ion acoustic waves excited in hot spots of RPP and SSD laser beams. (authors)

  11. Spike-train acquisition, analysis and real-time experimental control using a graphical programming language (LabView).

    Science.gov (United States)

    Nordstrom, M A; Mapletoft, E A; Miles, T S

    1995-11-01

    A solution is described for the acquisition on a personal computer of standard pulses derived from neuronal discharge, measurement of neuronal discharge times, real-time control of stimulus delivery based on specified inter-pulse interval conditions in the neuronal spike train, and on-line display and analysis of the experimental data. The hardware consisted of an Apple Macintosh IIci computer and a plug-in card (National Instruments NB-MIO16) that supports A/D, D/A, digital I/O and timer functions. The software was written in the object-oriented graphical programming language LabView. Essential elements of the source code of the LabView program are presented and explained. The use of the system is demonstrated in an experiment in which the reflex responses to muscle stretch are assessed for a single motor unit in the human masseter muscle.

  12. Digital parallel-to-series pulse-train converter

    Science.gov (United States)

    Hussey, J.

    1971-01-01

    The circuit converts a number, represented as a two-level signal on n bit lines, to a series of pulses on one of two output lines, selected according to the sign of the number. The converter accepts parallel binary input data and produces a number of output pulses equal to the number represented by the input data.
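
A software model of the converter's behavior, assuming a two's-complement input encoding (the report does not specify the number representation beyond sign handling; the encoding here is an assumption):

```python
def parallel_to_pulses(bits):
    """Model of a digital parallel-to-series pulse-train converter:
    interpret a bit vector (MSB first, two's complement assumed) and
    emit |value| pulses on a 'plus' or 'minus' line according to sign.
    Illustrative software stand-in for the hardware circuit."""
    n = len(bits)
    value = -bits[0] * 2 ** (n - 1) + sum(
        b * 2 ** (n - 1 - i) for i, b in enumerate(bits[1:], 1))
    line = 'plus' if value >= 0 else 'minus'
    return line, [1] * abs(value)
```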

  13. Neuronal spike-train responses in the presence of threshold noise.

    Science.gov (United States)

    Coombes, S; Thul, R; Laudanski, J; Palmer, A R; Sumner, C J

    2011-03-01

    The variability of neuronal firing has been an intense topic of study for many years. From a modelling perspective it has often been studied in conductance based spiking models with the use of additive or multiplicative noise terms to represent channel fluctuations or the stochastic nature of neurotransmitter release. Here we propose an alternative approach using a simple leaky integrate-and-fire model with a noisy threshold. Initially, we develop a mathematical treatment of the neuronal response to periodic forcing using tools from linear response theory and use this to highlight how a noisy threshold can enhance downstream signal reconstruction. We further develop a more general framework for understanding the responses to large amplitude forcing based on a calculation of first passage times. This is ideally suited to understanding stochastic mode-locking, for which we numerically determine the Arnol'd tongue structure. An examination of data from regularly firing stellate neurons within the ventral cochlear nucleus, responding to sinusoidally amplitude modulated pure tones, shows tongue structures consistent with these predictions and highlights that stochastic, as opposed to deterministic, mode-locking is utilised at the level of the single stellate cell to faithfully encode periodic stimuli.
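
The noisy-threshold mechanism can be sketched with a leaky integrate-and-fire neuron whose threshold is redrawn from a Gaussian after every spike; all parameter values below are illustrative:

```python
import numpy as np

def noisy_threshold_lif(I, tau=0.02, v_reset=0.0, theta_mean=1.0,
                        theta_sd=0.1, dt=0.001, seed=0):
    """Leaky integrate-and-fire neuron with a stochastic threshold:
    after each spike the threshold is redrawn from N(theta_mean,
    theta_sd^2). Euler integration; parameters are illustrative."""
    rng = np.random.default_rng(seed)
    v = 0.0
    theta = rng.normal(theta_mean, theta_sd)
    spikes = []
    for t, i_t in enumerate(I):
        v += dt * (-v / tau + i_t)       # leaky integration step
        if v >= theta:
            spikes.append(t * dt)
            v = v_reset
            theta = rng.normal(theta_mean, theta_sd)  # new noisy threshold
    return np.array(spikes)
```

Even under constant drive, the redrawn threshold jitters the interspike intervals, which is the ingredient that enables stochastic (rather than deterministic) mode-locking under periodic forcing.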

  14. Kinetics of fast short-term depression are matched to spike train statistics to reduce noise.

    Science.gov (United States)

    Khanbabaie, Reza; Nesse, William H; Longtin, Andre; Maler, Leonard

    2010-06-01

    Short-term depression (STD) is observed at many synapses of the CNS and is important for diverse computations. We have discovered a form of fast STD (FSTD) in the synaptic responses of pyramidal cells evoked by stimulation of their electrosensory afferent fibers (P-units). The dynamics of the FSTD are matched to the mean and variance of natural P-unit discharge. FSTD exhibits switch-like behavior in that it is immediately activated with stimulus intervals near the mean interspike interval (ISI) of P-units (approximately 5 ms) and recovers immediately after stimulation with the slightly longer intervals (>7.5 ms) that also occur during P-unit natural and evoked discharge patterns. Remarkably, the magnitude of evoked excitatory postsynaptic potentials appears to depend only on the duration of the previous ISI. Our theoretical analysis suggests that FSTD can serve as a mechanism for noise reduction. Because the kinetics of depression are as fast as the natural spike statistics, this role is distinct from previously ascribed functional roles of STD in gain modulation, synchrony detection or as a temporal filter.
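
The switch-like dependence on the previous ISI alone can be modeled as a piecewise-linear amplitude function, depressed for ISIs below about 5 ms and fully recovered above about 7.5 ms; the depressed amplitude value is assumed for illustration:

```python
import numpy as np

def fstd_amplitudes(spike_times, depressed=0.4, full=1.0,
                    low=0.005, high=0.0075):
    """Fast short-term depression where EPSP amplitude depends only on
    the previous ISI: depressed for ISIs below `low` (near the mean
    P-unit ISI), fully recovered above `high`, linear in between.
    The amplitude values are assumed for illustration."""
    isis = np.diff(spike_times)
    # np.interp clamps outside [low, high], giving the switch-like shape
    amps = np.interp(isis, [low, high], [depressed, full])
    # the first spike has no preceding ISI, so it is undepressed
    return np.concatenate([[full], amps])
```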

  15. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings

    Directory of Open Access Journals (Sweden)

    Logothetis Nikos K

    2009-07-01

    Background: Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Results: Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. Conclusion: The new toolbox presented here implements fast

  16. A toolbox for the fast information analysis of multiple-site LFP, EEG and spike train recordings.

    Science.gov (United States)

    Magri, Cesare; Whittingstall, Kevin; Singh, Vanessa; Logothetis, Nikos K; Panzeri, Stefano

    2009-07-16

    Information theory is an increasingly popular framework for studying how the brain encodes sensory information. Despite its widespread use for the analysis of spike trains of single neurons and of small neural populations, its application to the analysis of other types of neurophysiological signals (EEGs, LFPs, BOLD) has remained relatively limited so far. This is due to the limited-sampling bias which affects calculation of information, to the complexity of the techniques to eliminate the bias, and to the lack of publicly available fast routines for the information analysis of multi-dimensional responses. Here we introduce a new C- and Matlab-based information theoretic toolbox, specifically developed for neuroscience data. This toolbox implements a novel computationally-optimized algorithm for estimating many of the main information theoretic quantities and bias correction techniques used in neuroscience applications. We illustrate and test the toolbox in several ways. First, we verify that these algorithms provide accurate and unbiased estimates of the information carried by analog brain signals (i.e. LFPs, EEGs, or BOLD) even when using limited amounts of experimental data. This test is important since existing algorithms were so far tested primarily on spike trains. Second, we apply the toolbox to the analysis of EEGs recorded from a subject watching natural movies, and we characterize the electrode locations, frequencies and signal features carrying the most visual information. Third, we explain how the toolbox can be used to break down the information carried by different features of the neural signal into distinct components reflecting different ways in which correlations between parts of the neural signal contribute to coding. We illustrate this breakdown by analyzing LFPs recorded from primary visual cortex during presentation of naturalistic movies. The new toolbox presented here implements fast and data-robust computations of the most relevant
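
The limited-sampling bias the toolbox addresses can be seen in the simplest estimator: plug-in mutual information with a first-order (Miller-Madow-style) correction applied per entropy term. This is only a sketch of the problem; the toolbox itself implements stronger corrections:

```python
import numpy as np

def mi_plugin(s, r, corrected=True):
    """Plug-in mutual information (bits) between discrete stimulus and
    response sequences, with the first-order Miller-Madow bias
    correction added to each entropy term. Sketch of the bias issue
    only; not the toolbox's actual estimators."""
    def H(x):
        _, c = np.unique(x, return_counts=True)
        p = c / len(x)
        h = -np.sum(p * np.log2(p))
        if corrected:
            # first-order bias of the plug-in entropy estimate
            h += (len(c) - 1) / (2 * len(x) * np.log(2))
        return h
    joint = [f"{a}|{b}" for a, b in zip(s, r)]
    return H(s) + H(r) - H(joint)
```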

  17. Tuning of spinal networks to frequency components of spike trains in individual afferents.

    Science.gov (United States)

    Koerber, H R; Seymour, A W; Mendell, L M

    1991-10-01

    Cord dorsum potentials (CDPs) evoked by primary afferent fiber stimulation reflect the response of postsynaptic dorsal horn neurons. The properties of these CDPs have been shown to vary in accordance with the type of primary afferent fiber stimulated. The purpose of the present study was to determine the relationships between frequency modulation of the afferent input trains, the amplitude modulation of the evoked CDPs, and the type of primary afferent stimulated. The somata of individual primary afferent fibers were impaled in the L7 dorsal root ganglion of alpha-chloralose-anesthetized cats. Action potentials (APs) were evoked in single identified afferents via the intracellular microelectrode while simultaneously recording the response of dorsal horn neurons as CDPs, or activity of individual target interneurons recorded extracellularly or intracellularly. APs were evoked in afferents using temporal patterns identical to the responses of selected afferents to natural stimulation of their receptive fields. Two such physiologically realistic trains, one recorded from a hair follicle and the other from a slowly adapting type 1 receptor, were chosen as standard test trains. Modulation of CDP amplitude in response to this frequency-modulated afferent activity varied according to the type of peripheral mechanoreceptor innervated. Dorsal horn networks driven by A beta afferents innervating hair follicles, rapidly adapting pad (Krause end bulb), and field receptors seemed "tuned" to amplify the onset of activity in single afferents. Networks driven by afferents innervating down hair follicles and pacinian corpuscles required more high-frequency activity to elicit their peak response. Dorsal horn networks driven by afferents innervating slowly adapting receptors including high-threshold mechanoreceptors exhibited some sensitivity to the instantaneous frequency, but in general they reproduced the activity in the afferent fiber much more faithfully. Responses of

  18. Mapping spikes to sensations

    Directory of Open Access Journals (Sweden)

    Maik Christopher Stüttgen

    2011-11-01

    Single-unit recordings conducted during perceptual decision-making tasks have yielded tremendous insights into the neural coding of sensory stimuli. In such experiments, detection or discrimination behavior (the psychometric data is observed in parallel with spike trains in sensory neurons (the neurometric data. Frequently, candidate neural codes for information read-out are pitted against each other by transforming the neurometric data in some way and asking which code’s performance most closely approximates the psychometric performance. The code that matches the psychometric performance best is retained as a viable candidate and the others are rejected. In following this strategy, psychometric data is often considered to provide an unbiased measure of perceptual sensitivity. It is rarely acknowledged that psychometric data result from a complex interplay of sensory and non-sensory processes and that neglect of these processes may result in misestimating psychophysical sensitivity. This again may lead to erroneous conclusions regarding the adequacy of neural candidate codes. In this review, we first discuss requirements on the neural data for a subsequent neurometric-psychometric comparison. We then focus on different psychophysical tasks for the assessment of detection and discrimination performance and the cognitive processes that may underlie their execution. We discuss further factors that may compromise psychometric performance and how they can be detected or avoided. We believe that these considerations point to shortcomings in our understanding of the processes underlying perceptual decisions, and therefore offer potential for future research.

  19. Automatic fitting of spiking neuron models to electrophysiological recordings

    Directory of Open Access Journals (Sweden)

    Cyrille Rossant

    2010-03-01

    Spiking models can accurately predict the spike trains produced by cortical neurons in response to somatically injected currents. Since the specific characteristics of the model depend on the neuron, a computational method is required to fit models to electrophysiological recordings. The fitting procedure can be very time consuming both in terms of computer simulations and in terms of code writing. We present algorithms to fit spiking models to electrophysiological data (time-varying input and spike trains) that can run in parallel on graphics processing units (GPUs). The model fitting library is interfaced with Brian, a neural network simulator in Python. If a GPU is present it uses just-in-time compilation to translate model equations into optimized code. Arbitrary models can then be defined at script level and run on the graphics card. This tool can be used to obtain empirically validated spiking models of neurons in various systems. We demonstrate its use on public data from the INCF Quantitative Single-Neuron Modeling 2009 competition by comparing the performance of a number of neuron spiking models.
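
Fitting criteria in this setting compare model and recorded spike trains; one common choice in the INCF competition is the coincidence-based gamma factor. A simplified rendering (the chance-correction and normalization follow the usual published definition, but the competition's exact convention should be checked):

```python
import numpy as np

def gamma_factor(model, data, delta=0.004, duration=10.0):
    """Coincidence-based similarity between a model and a recorded
    spike train: coincidences within +/-delta, corrected for the
    number expected by chance from a Poisson train at the model's
    rate. Simplified rendering of the 'gamma factor' used in the
    INCF competition; 1 means perfect match, ~0 means chance."""
    n_coinc = sum(np.any(np.abs(data - t) <= delta) for t in model)
    rate_model = len(model) / duration
    expected = 2 * delta * rate_model * len(data)   # chance coincidences
    norm = 0.5 * (len(model) + len(data))
    return (n_coinc - expected) / (norm * (1 - 2 * delta * rate_model))
```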

  20. SuperSpike: Supervised Learning in Multilayer Spiking Neural Networks.

    Science.gov (United States)

    Zenke, Friedemann; Ganguli, Surya

    2018-04-13

    A vast majority of computation in the brain is performed by spiking neural networks. Despite the ubiquity of such spiking, we currently lack an understanding of how biological spiking neural circuits learn and compute in vivo, as well as how we can instantiate such capabilities in artificial spiking circuits in silico. Here we revisit the problem of supervised learning in temporally coding multilayer spiking neural networks. First, by using a surrogate gradient approach, we derive SuperSpike, a nonlinear voltage-based three-factor learning rule capable of training multilayer networks of deterministic integrate-and-fire neurons to perform nonlinear computations on spatiotemporal spike patterns. Second, inspired by recent results on feedback alignment, we compare the performance of our learning rule under different credit assignment strategies for propagating output errors to hidden units. Specifically, we test uniform, symmetric, and random feedback, finding that simpler tasks can be solved with any type of feedback, while more complex tasks require symmetric feedback. In summary, our results open the door to obtaining a better scientific understanding of learning and computation in spiking neural networks by advancing our ability to train them to solve nonlinear problems involving transformations between different spatiotemporal spike time patterns.
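
The core of the surrogate gradient approach is to replace the undefined derivative of the hard spike nonlinearity with a smooth surrogate; SuperSpike uses a fast sigmoid of the membrane potential's distance from threshold. A sketch of the surrogate and the three-factor update it enters (the constants here are assumptions, not the paper's values):

```python
import numpy as np

def surrogate_grad(v, theta=1.0, beta=10.0):
    """SuperSpike-style surrogate derivative of the spike nonlinearity:
    a fast sigmoid of the membrane potential's distance from threshold,
    used in place of the ill-defined derivative of the hard spike step.
    beta controls the steepness around threshold."""
    return 1.0 / (1.0 + beta * np.abs(v - theta)) ** 2

def three_factor_update(w, lr, error, v, pre_trace):
    """Three-factor rule sketch: weight change = global error signal
    x surrogate derivative at the postsynaptic potential
    x presynaptic eligibility trace."""
    return w + lr * error * surrogate_grad(v) * pre_trace
```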

  1. A parallel neural network training algorithm for control of discrete dynamical systems.

    Energy Technology Data Exchange (ETDEWEB)

    Gordillo, J. L.; Hanebutte, U. R.; Vitela, J. E.

    1998-01-20

    In this work we present a parallel neural network controller training code that uses MPI, a portable message passing environment. A comprehensive performance analysis is reported which compares results of a performance model with actual measurements. The analysis is made for three different load assignment schemes: block distribution, strip mining and a sliding-average bin packing (best-fit) algorithm. Such analysis is crucial since optimal load balance cannot be achieved because the workload information is not available a priori. The speedup results obtained with the above schemes are compared with those corresponding to the bin packing load balance scheme with perfect load prediction based on a priori knowledge of the computing effort. Two multiprocessor platforms, an SGI/Cray Origin 2000 and an IBM SP, were utilized for this study. It is shown that for the best load balance scheme a parallel efficiency of over 50% for the entire computation is achieved by 17 processors of either parallel computer.
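
The first two load assignment schemes compared in the paper can be sketched directly; the sliding-average bin-packing scheme additionally tracks measured task costs and is omitted here:

```python
import numpy as np

def block_distribution(n_tasks, n_procs):
    """Contiguous block assignment of task indices to processors:
    each processor gets one slab of consecutive tasks."""
    return np.array_split(np.arange(n_tasks), n_procs)

def strip_mining(n_tasks, n_procs):
    """Cyclic (strip-mined) assignment: task i goes to processor
    i mod p, which balances load when the per-task cost varies
    smoothly with the task index."""
    return [np.arange(n_tasks)[p::n_procs] for p in range(n_procs)]
```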

  2. Method of Parallel-Hierarchical Network Self-Training and its Application for Pattern Classification and Recognition

    Directory of Open Access Journals (Sweden)

    TIMCHENKO, L.

    2012-11-01

    Propositions necessary for the development of parallel-hierarchical (PH) network training methods are discussed in this article. Unlike known structures of artificial neural networks, where non-normalized (absolute) similarity criteria are used for comparison, the suggested structure uses a normalized criterion. Based on an analysis of training rules, we conclude that two supervised training methods are optimal for PH network training: error-correction-based training and memory-based training. Mathematical models of training and a combined method of PH network training for recognition of static and dynamic patterns are developed.

  3. Spiking Neurons for Analysis of Patterns

    Science.gov (United States)

    Huntsberger, Terrance

    2008-01-01

    Artificial neural networks comprising spiking neurons of a novel type have been conceived as improved pattern-analysis and pattern-recognition computational systems. These neurons are represented by a mathematical model denoted the state-variable model (SVM), which among other things, exploits a computational parallelism inherent in spiking-neuron geometry. Networks of SVM neurons offer advantages of speed and computational efficiency, relative to traditional artificial neural networks. The SVM also overcomes some of the limitations of prior spiking-neuron models. There are numerous potential pattern-recognition, tracking, and data-reduction (data preprocessing) applications for these SVM neural networks on Earth and in exploration of remote planets. Spiking neurons imitate biological neurons more closely than do the neurons of traditional artificial neural networks. A spiking neuron includes a central cell body (soma) surrounded by a tree-like interconnection network (dendrites). Spiking neurons are so named because they generate trains of output pulses (spikes) in response to inputs received from sensors or from other neurons. They gain their speed advantage over traditional neural networks by using the timing of individual spikes for computation, whereas traditional artificial neurons use averages of activity levels over time. Moreover, spiking neurons use the delays inherent in dendritic processing in order to efficiently encode the information content of incoming signals. Because traditional artificial neurons fail to capture this encoding, they have less processing capability, and so it is necessary to use more gates when implementing traditional artificial neurons in electronic circuitry. Such higher-order functions as dynamic tasking are effected by use of pools (collections) of spiking neurons interconnected by spike-transmitting fibers. The SVM includes adaptive thresholds and submodels of transport of ions (in imitation of such transport in biological

  4. Academic training: From Evolution Theory to Parallel and Distributed Genetic Programming

    CERN Multimedia

    2007-01-01

    2006-2007 ACADEMIC TRAINING PROGRAMME LECTURE SERIES 15, 16 March From 11:00 to 12:00 - Main Auditorium, bldg. 500 From Evolution Theory to Parallel and Distributed Genetic Programming F. FERNANDEZ DE VEGA / Univ. of Extremadura, SP Lecture No. 1: From Evolution Theory to Evolutionary Computation Evolutionary computation is a subfield of artificial intelligence (more particularly computational intelligence) involving combinatorial optimization problems, which are based to some degree on the evolution of biological life in the natural world. In this tutorial we will review the source of inspiration for this metaheuristic and its capability for solving problems. We will show the main flavours within the field, and different problems that have been successfully solved employing this kind of techniques. Lecture No. 2: Parallel and Distributed Genetic Programming The successful application of Genetic Programming (GP, one of the available Evolutionary Algorithms) to optimization problems has encouraged an ...

  5. Academic Training Lectures | Introduction to Parallelism, Concurrency and Acceleration | 19-20 January

    CERN Multimedia

    2016-01-01

    Please note that the next series of Academic Training Lectures will take place on 19 and 20 January 2016. The lectures will be given by Andrzej Nowak (TIK Services, Switzerland).   An Introduction to Parallelism, Concurrency and Acceleration (1/2) on Tuesday, 19 January from 11 a.m. to 12 noon https://indico.cern.ch/event/404682/ An Introduction to Parallelism, Concurrency and Acceleration (2/2) on Wednesday, 20 January from 11 a.m. to 12 noon https://indico.cern.ch/event/404683/ at CERN IT Amphitheatre (31-3-004) Description: Concurrency and parallelism are firm elements of any modern computing infrastructure, made even more prominent by the emergence of accelerators. These lectures offer an introduction to these important concepts. We will begin with a brief refresher of recent hardware offerings to modern-day programmers. We will then open the main discu...

  6. Information transmission with spiking Bayesian neurons

    International Nuclear Information System (INIS)

    Lochmann, Timm; Deneve, Sophie

    2008-01-01

    Spike trains of cortical neurons resulting from repeated presentations of a stimulus are variable and exhibit Poisson-like statistics. Many models of neural coding therefore assume that sensory information is contained in instantaneous firing rates, not spike times. Here, we ask how much information about time-varying stimuli can be transmitted by spiking neurons with such input and output variability. In particular, does this variability imply that spike generation is intrinsically stochastic? We consider a model neuron that optimally estimates the current state of a time-varying binary variable (e.g. the presence of a stimulus) by integrating incoming spikes. The unit signals its current estimate to other units with spikes whenever the estimate has increased by a fixed amount. As shown previously, this computation results in integrate-and-fire dynamics with Poisson-like output spike trains. This output variability is entirely due to the stochastic input rather than noisy spike generation. As a result, such a deterministic neuron can transmit most of the information about the time-varying stimulus. This contrasts with a standard model of sensory neurons, the linear-nonlinear Poisson (LNP) model, which assumes that most variability in output spike trains is due to stochastic spike generation. Although it yields the same firing statistics, we found that such noisy firing results in the loss of most information. Finally, we use this framework to compare potential effects of top-down attention versus bottom-up saliency on information transfer with spiking neurons
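
The described computation — integrate weighted input spikes into a log-odds estimate and spike whenever the estimate has grown by a fixed amount since the last output spike — can be sketched as follows (the weight, leak and step size are assumed, not the paper's values):

```python
def bayesian_neuron(input_spikes, w, leak=0.0, eta=0.5):
    """Sketch of a Deneve-style Bayesian neuron: it accumulates the
    log-odds L of a hidden binary variable from weighted input spikes
    and emits a spike each time the estimate has grown by a fixed
    amount eta since the last output spike. Fully deterministic: any
    output variability is inherited from the input. Simplified; the
    original model also includes prior transition dynamics (here only
    a `leak` placeholder)."""
    L, G, out = 0.0, 0.0, []
    for t, s in enumerate(input_spikes):
        L += -leak * L + w * s        # integrate evidence from inputs
        if L - G > eta:               # estimate grew by eta since last spike
            out.append(t)
            G += eta                  # record what has been signaled
    return out
```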

  7. Improved SpikeProp for Using Particle Swarm Optimization

    Directory of Open Access Journals (Sweden)

    Falah Y. H. Ahmed

    2013-01-01

    A spiking neural network encodes information in the timing of individual spikes. A novel supervised learning rule for SpikeProp is derived to overcome the discontinuities introduced by spike thresholding. This algorithm is based on an error-backpropagation learning rule suited for supervised learning of spiking neurons that use exact spike-time coding. SpikeProp demonstrates that spiking neurons can perform complex nonlinear classification with fast temporal coding. This study proposes enhancements of the SpikeProp learning algorithm for supervised training of spiking networks that can deal with complex patterns. The proposed methods include SpikeProp with particle swarm optimization (PSO) and an angle-driven dependency learning rate. These methods are applied to the SpikeProp network for multilayer learning enhancement and weight optimization. Input and output patterns are encoded as spike trains of precisely timed spikes, and the network learns to transform the input trains into target output trains. With these enhancements, the proposed methods outperformed other conventional neural network architectures.
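
The PSO component used for weight optimization follows the generic particle swarm scheme; a sketch with typical default hyperparameters (not the paper's) is:

```python
import numpy as np

def pso_minimize(f, dim, n_particles=20, n_iter=100, seed=0,
                 w=0.7, c1=1.5, c2=1.5):
    """Generic particle swarm optimization sketch of the kind used to
    tune SpikeProp weights: particles track personal and global best
    positions; the velocity mixes inertia, a cognitive pull toward the
    personal best and a social pull toward the global best.
    Hyperparameters are typical defaults, not taken from the paper."""
    rng = np.random.default_rng(seed)
    x = rng.uniform(-1, 1, (n_particles, dim))
    v = np.zeros((n_particles, dim))
    pbest, pval = x.copy(), np.array([f(p) for p in x])
    g = pbest[pval.argmin()].copy()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (g - x)
        x = x + v
        fx = np.array([f(p) for p in x])
        better = fx < pval
        pbest[better], pval[better] = x[better], fx[better]
        g = pbest[pval.argmin()].copy()
    return g, pval.min()
```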

  8. Periodically-modulated inhibition of living pacemaker neurons--III. The heterogeneity of the postsynaptic spike trains, and how control parameters affect it.

    Science.gov (United States)

    Segundo, J P; Vibert, J F; Stiber, M

    1998-11-01

    Codings involving spike trains at synapses with inhibitory postsynaptic potentials on pacemakers were examined in crayfish stretch receptor organs by modulating presynaptic instantaneous rates periodically (triangles or sines; frequencies, slopes and depths under, respectively, 5.0 Hz, 40.0/s/s and 25.0/s). Timings were described by interspike and cross-intervals ("phases"); patterns (dispersions, sequences) and forms (timing classes) were identified using pooled graphs (instant along the cycle when a spike occurs vs preceding interval) and return maps (plots of successive intervals). A remarkable heterogeneity of postsynaptic intervals and phases characterizes each modulation. All cycles separate into the same portions: each contains a particular form and switches abruptly to the next. Forms differ in irregularity and predictability: they are (see text) "p:q alternations", "intermittent", "phase walk-throughs", "messy erratic" and "messy stammering". Postsynaptic cycles are asymmetric (hysteresis). This contrasts with the presynaptic homogeneity, smoothness and symmetry. All control parameters are, individually and jointly, strongly influential. Presynaptic slopes, say, act through a postsynaptic sensitivity to their magnitude and sign; when increasing, hysteresis augments and forms change or disappear. Appropriate noise attenuates between-train contrasts, providing modulations are under 0.5 Hz. Postsynaptic natural intervals impose critical time bases, separating presynaptic intervals (around, above or below them) with dissimilar consequences. Coding rules are numerous and have restricted domains; generalizations are misleading. Modulation-driven forms are trendy pacemaker-driven forms. However, dissimilarities, slight when patterns are almost pacemaker, increase as inhibition departs from pacemaker and incorporate unpredictable features. Physiological significance-(1) Pacemaker-driven forms, simple and ubiquitous, appear to be elementary building blocks of

  9. Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.

    Science.gov (United States)

    Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon

    2016-01-01

Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether the observed synchronized events are significantly different from those expected by chance. To analyze the simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process, implying that the generation of each spike is independent of all the other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods, which govern the spike probability of the oncoming action potential based on the time of the last spike, or the bursting behavior, which is characterized by short epochs of rapid action potentials, followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates the spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that compared to the distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of significance of joint spike events seem to be inadequate.

  10. Application of cross-correlated delay shift rule in spiking neural networks for interictal spike detection.

    Science.gov (United States)

    Lilin Guo; Zhenzhong Wang; Cabrerizo, Mercedes; Adjouadi, Malek

    2016-08-01

This study proposes a Cross-Correlated Delay Shift (CCDS) supervised learning rule to train neurons with associated spatiotemporal patterns to classify spike patterns. The objective of this study was to evaluate the feasibility of using the CCDS rule to automate the detection of interictal spikes in electroencephalogram (EEG) data on patients with epilepsy. Encoding is the initial yet essential step for spiking neurons to process EEG patterns. A new encoding method is utilized to convert the EEG signal into spike patterns. The simulation results show that the proposed algorithm identified 69 spikes out of 82 spikes, an 84% detection rate, which is quite high considering the subtleties of interictal spikes and the tediousness of monitoring long EEG records. The CCDS rule is also benchmarked against ReSuMe on the same task.
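The abstract does not detail its encoding method, but a common way to convert a continuous signal such as EEG into spike patterns is latency encoding; the sketch below is a generic, hypothetical encoder (function name and parameters are illustrative, not the paper's scheme).

```python
import numpy as np

def latency_encode(window, t_max=0.02):
    """Map each sample's amplitude to one spike time per input neuron:
    larger amplitudes fire earlier (latency code). Hypothetical encoder,
    not the CCDS paper's exact method."""
    w = np.asarray(window, float)
    a = (w - w.min()) / (np.ptp(w) + 1e-12)   # normalize amplitudes to [0, 1]
    return (1.0 - a) * t_max                   # max amplitude -> earliest spike

eeg_window = np.sin(np.linspace(0, np.pi, 8))  # toy 8-sample EEG window
spike_times = latency_encode(eeg_window)
print(spike_times)
```

The largest sample in the window fires first; a downstream spiking classifier then operates on these precise spike times.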

  11. Stochastic Variational Learning in Recurrent Spiking Networks

    Directory of Open Access Journals (Sweden)

    Danilo eJimenez Rezende

    2014-04-01

Full Text Available The ability to learn and perform statistical inference with biologically plausible recurrent networks of spiking neurons is an important step towards understanding perception and reasoning. Here we derive and investigate a new learning rule for recurrent spiking networks with hidden neurons, combining principles from variational learning and reinforcement learning. Our network defines a generative model over spike train histories and the derived learning rule has the form of a local Spike Timing Dependent Plasticity rule modulated by global factors (neuromodulators) conveying information about "novelty" on a statistically rigorous ground. Simulations show that our model is able to learn both stationary and non-stationary patterns of spike trains. We also propose one experiment that could potentially be performed with animals in order to test the dynamics of the predicted novelty signal.

  12. Stochastic variational learning in recurrent spiking networks.

    Science.gov (United States)

    Jimenez Rezende, Danilo; Gerstner, Wulfram

    2014-01-01

    The ability to learn and perform statistical inference with biologically plausible recurrent networks of spiking neurons is an important step toward understanding perception and reasoning. Here we derive and investigate a new learning rule for recurrent spiking networks with hidden neurons, combining principles from variational learning and reinforcement learning. Our network defines a generative model over spike train histories and the derived learning rule has the form of a local Spike Timing Dependent Plasticity rule modulated by global factors (neuromodulators) conveying information about "novelty" on a statistically rigorous ground. Simulations show that our model is able to learn both stationary and non-stationary patterns of spike trains. We also propose one experiment that could potentially be performed with animals in order to test the dynamics of the predicted novelty signal.

  13. Spiking neural P systems with multiple channels.

    Science.gov (United States)

    Peng, Hong; Yang, Jinyu; Wang, Jun; Wang, Tao; Sun, Zhang; Song, Xiaoxiao; Luo, Xiaohui; Huang, Xiangnian

    2017-11-01

Spiking neural P systems (SNP systems, in short) are a class of distributed parallel computing systems inspired by the neurophysiological behavior of biological spiking neurons. In this paper, we investigate a new variant of SNP systems in which each neuron has one or more synaptic channels, called spiking neural P systems with multiple channels (SNP-MC systems, in short). Spiking rules with channel labels are introduced to handle the firing mechanism of neurons, where the channel labels indicate the synaptic channels through which the generated spikes are transmitted. The computation power of SNP-MC systems is investigated. Specifically, we prove that SNP-MC systems are Turing universal as both number generating and number accepting devices. Copyright © 2017 Elsevier Ltd. All rights reserved.

  14. Supervised Learning Based on Temporal Coding in Spiking Neural Networks.

    Science.gov (United States)

    Mostafa, Hesham

    2017-08-01

Gradient descent training techniques are remarkably successful in training analog-valued artificial neural networks (ANNs). Such training techniques, however, do not transfer easily to spiking networks due to the hard nonlinearity of spike generation and the discrete nature of spike communication. We show that in a feedforward spiking network that uses a temporal coding scheme where information is encoded in spike times instead of spike rates, the network input-output relation is differentiable almost everywhere. Moreover, this relation is piecewise linear after a transformation of variables. Methods for training ANNs thus carry over directly to the training of such spiking networks, as we show when training on the permutation invariant MNIST task. In contrast to rate-based spiking networks that are often used to approximate the behavior of ANNs, the networks we present spike much more sparsely and their behavior cannot be directly approximated by conventional ANNs. Our results highlight a new approach for controlling the behavior of spiking networks with realistic temporal dynamics, opening up the potential for using these networks to process spike patterns with complex temporal information.
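The piecewise-linear spike-time relation can be illustrated with a simplified model: a non-leaky integrate-and-fire neuron whose input spikes each switch on a constant current. This is an analogy only; the paper itself uses exponentially decaying synaptic kernels, so treat the sketch as a minimal stand-in under that assumption.

```python
import numpy as np

def spike_time(ts, ws, theta=1.0):
    """Output spike time of a non-leaky IF neuron where input spike i
    switches on a constant current w_i at time t_i. Within a fixed
    causal set C, t_out = (theta + sum_{i in C} w_i t_i) / sum_{i in C} w_i,
    i.e. linear in the input spike times."""
    order = np.argsort(ts)
    ts = np.asarray(ts, float)[order]
    ws = np.asarray(ws, float)[order]
    for k in range(len(ts)):                    # try causal set = first k+1 inputs
        W = ws[:k + 1].sum()
        if W <= 0:
            continue
        t_out = (theta + np.dot(ws[:k + 1], ts[:k + 1])) / W
        nxt = ts[k + 1] if k + 1 < len(ts) else np.inf
        if ts[k] <= t_out <= nxt + 1e-12:       # fires before the next input
            return t_out
    return np.inf

t0 = spike_time([0.0, 1.0, 2.0], [0.3, 0.4, 0.5])
t1 = spike_time([0.0, 0.9, 2.0], [0.3, 0.4, 0.5])  # shift one input by -0.1
print(t0, t1)
```

Within a fixed causal set, ∂t_out/∂t_i = w_i / Σ_j w_j, so shifting the second input by 0.1 shifts the output by 0.1 · 0.4/0.7, exactly the linear response the abstract describes; crossing into a different causal set switches to another linear piece.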

  15. Bayesian population decoding of spiking neurons.

    Science.gov (United States)

    Gerwinn, Sebastian; Macke, Jakob; Bethge, Matthias

    2009-01-01

    The timing of action potentials in spiking neurons depends on the temporal dynamics of their inputs and contains information about temporal fluctuations in the stimulus. Leaky integrate-and-fire neurons constitute a popular class of encoding models, in which spike times depend directly on the temporal structure of the inputs. However, optimal decoding rules for these models have only been studied explicitly in the noiseless case. Here, we study decoding rules for probabilistic inference of a continuous stimulus from the spike times of a population of leaky integrate-and-fire neurons with threshold noise. We derive three algorithms for approximating the posterior distribution over stimuli as a function of the observed spike trains. In addition to a reconstruction of the stimulus we thus obtain an estimate of the uncertainty as well. Furthermore, we derive a 'spike-by-spike' online decoding scheme that recursively updates the posterior with the arrival of each new spike. We use these decoding rules to reconstruct time-varying stimuli represented by a Gaussian process from spike trains of single neurons as well as neural populations.

  16. Bayesian population decoding of spiking neurons

    Directory of Open Access Journals (Sweden)

    Sebastian Gerwinn

    2009-10-01

    Full Text Available The timing of action potentials in spiking neurons depends on the temporal dynamics of their inputs and contains information about temporal fluctuations in the stimulus. Leaky integrate-and-fire neurons constitute a popular class of encoding models, in which spike times depend directly on the temporal structure of the inputs. However, optimal decoding rules for these models have only been studied explicitly in the noiseless case. Here, we study decoding rules for probabilistic inference of a continuous stimulus from the spike times of a population of leaky integrate-and-fire neurons with threshold noise. We derive three algorithms for approximating the posterior distribution over stimuli as a function of the observed spike trains. In addition to a reconstruction of the stimulus we thus obtain an estimate of the uncertainty as well. Furthermore, we derive a `spike-by-spike' online decoding scheme that recursively updates the posterior with the arrival of each new spike. We use these decoding rules to reconstruct time-varying stimuli represented by a Gaussian process from spike trains of single neurons as well as neural populations.

  17. Spiking irregularity and frequency modulate the behavioral report of single-neuron stimulation.

    Science.gov (United States)

    Doron, Guy; von Heimendahl, Moritz; Schlattmann, Peter; Houweling, Arthur R; Brecht, Michael

    2014-02-05

    The action potential activity of single cortical neurons can evoke measurable sensory effects, but it is not known how spiking parameters and neuronal subtypes affect the evoked sensations. Here, we examined the effects of spike train irregularity, spike frequency, and spike number on the detectability of single-neuron stimulation in rat somatosensory cortex. For regular-spiking, putative excitatory neurons, detectability increased with spike train irregularity and decreasing spike frequencies but was not affected by spike number. Stimulation of single, fast-spiking, putative inhibitory neurons led to a larger sensory effect compared to regular-spiking neurons, and the effect size depended only on spike irregularity. An ideal-observer analysis suggests that, under our experimental conditions, rats were using integration windows of a few hundred milliseconds or more. Our data imply that the behaving animal is sensitive to single neurons' spikes and even to their temporal patterning. Copyright © 2014 Elsevier Inc. All rights reserved.

  18. Deep Spiking Networks

    NARCIS (Netherlands)

    O'Connor, P.; Welling, M.

    2016-01-01

    We introduce an algorithm to do backpropagation on a spiking network. Our network is "spiking" in the sense that our neurons accumulate their activation into a potential over time, and only send out a signal (a "spike") when this potential crosses a threshold and the neuron is reset. Neurons only

  19. ViSAPy: a Python tool for biophysics-based generation of virtual spiking activity for evaluation of spike-sorting algorithms.

    Science.gov (United States)

    Hagen, Espen; Ness, Torbjørn V; Khosrowshahi, Amir; Sørensen, Christina; Fyhn, Marianne; Hafting, Torkel; Franke, Felix; Einevoll, Gaute T

    2015-04-30

New, silicon-based multielectrodes comprising hundreds or more electrode contacts offer the possibility to record spike trains from thousands of neurons simultaneously. This potential cannot be realized unless accurate, reliable automated methods for spike sorting are developed, in turn requiring benchmarking data sets with known ground-truth spike times. We here present a general simulation tool for computing benchmarking data for evaluation of spike-sorting algorithms entitled ViSAPy (Virtual Spiking Activity in Python). The tool is based on a well-established biophysical forward-modeling scheme and is implemented as a Python package built on top of the neuronal simulator NEURON and the Python tool LFPy. ViSAPy allows for arbitrary combinations of multicompartmental neuron models and geometries of recording multielectrodes. Three example benchmarking data sets are generated, i.e., tetrode and polytrode data mimicking in vivo cortical recordings and microelectrode array (MEA) recordings of in vitro activity in salamander retinas. The synthesized example benchmarking data mimics salient features of typical experimental recordings, for example, spike waveforms depending on interspike interval. ViSAPy goes beyond existing methods as it includes biologically realistic model noise, synaptic activation by recurrent spiking networks, finite-sized electrode contacts, and allows for inhomogeneous electrical conductivities. ViSAPy is optimized to allow for generation of long time series of benchmarking data, spanning minutes of biological time, by parallel execution on multi-core computers. ViSAPy is an open-ended tool as it can be generalized to produce benchmarking data for arbitrary recording-electrode geometries and with various levels of complexity. Copyright © 2015 The Authors. Published by Elsevier B.V. All rights reserved.

  20. Neuronal coding and spiking randomness

    Czech Academy of Sciences Publication Activity Database

    Košťál, Lubomír; Lánský, Petr; Rospars, J. P.

    2007-01-01

Vol. 26, No. 10 (2007), pp. 2693-2988 ISSN 0953-816X R&D Projects: GA MŠk(CZ) LC554; GA AV ČR(CZ) 1ET400110401; GA AV ČR(CZ) KJB100110701 Grant - others:ECO-NET(FR) 112644PF Institutional research plan: CEZ:AV0Z50110509 Keywords: spike train * variability * neuroscience Subject RIV: FH - Neurology Impact factor: 3.673, year: 2007

  1. Simple and robust generation of ultrafast laser pulse trains using polarization-independent parallel-aligned thin films

    Science.gov (United States)

    Wang, Andong; Jiang, Lan; Li, Xiaowei; Wang, Zhi; Du, Kun; Lu, Yongfeng

    2018-05-01

Ultrafast laser pulse temporal shaping has been widely applied in various important applications such as laser materials processing, coherent control of chemical reactions, and ultrafast imaging. However, temporal pulse shaping has remained an in-lab-only technique due to its high cost, low damage threshold, and polarization dependence. Herein we propose a novel ultrafast laser pulse train generation device, which consists of multiple polarization-independent parallel-aligned thin films. Various pulse trains with controllable temporal profiles can be generated flexibly by multiple reflections within the splitting films. Compared with other pulse train generation techniques, this method has the advantages of compact structure, low cost, high damage threshold, and polarization independence. These advantages endow it with high potential for broad utilization in ultrafast applications.

  2. Decoding spikes in a spiking neuronal network

    Energy Technology Data Exchange (ETDEWEB)

    Feng Jianfeng [Department of Informatics, University of Sussex, Brighton BN1 9QH (United Kingdom); Ding, Mingzhou [Department of Mathematics, Florida Atlantic University, Boca Raton, FL 33431 (United States)

    2004-06-04

We investigate how to reliably decode the input information from the output of a spiking neuronal network. A maximum likelihood estimator of the input signal, together with its Fisher information, is rigorously calculated. The advantage of the maximum likelihood estimation over the 'brute-force rate coding' estimate is clearly demonstrated. It is pointed out that the ergodic assumption in neuroscience, i.e. that a temporal average is equivalent to an ensemble average, is in general not true. Averaging over an ensemble of neurons usually gives a biased estimate of the input information. A method to compensate for the bias is proposed. Reconstruction of dynamical input signals with a group of spiking neurons is extensively studied and our results show that less than a spike is sufficient to accurately decode dynamical inputs.

  3. Decoding spikes in a spiking neuronal network

    International Nuclear Information System (INIS)

    Feng Jianfeng; Ding, Mingzhou

    2004-01-01

We investigate how to reliably decode the input information from the output of a spiking neuronal network. A maximum likelihood estimator of the input signal, together with its Fisher information, is rigorously calculated. The advantage of the maximum likelihood estimation over the 'brute-force rate coding' estimate is clearly demonstrated. It is pointed out that the ergodic assumption in neuroscience, i.e. that a temporal average is equivalent to an ensemble average, is in general not true. Averaging over an ensemble of neurons usually gives a biased estimate of the input information. A method to compensate for the bias is proposed. Reconstruction of dynamical input signals with a group of spiking neurons is extensively studied and our results show that less than a spike is sufficient to accurately decode dynamical inputs.

  4. Spike Timing Matters in Novel Neuronal Code Involved in Vibrotactile Frequency Perception.

    Science.gov (United States)

    Birznieks, Ingvars; Vickery, Richard M

    2017-05-22

    Skin vibrations sensed by tactile receptors contribute significantly to the perception of object properties during tactile exploration [1-4] and to sensorimotor control during object manipulation [5]. Sustained low-frequency skin vibration (perception of frequency is still unknown. Measures based on mean spike rates of neurons in the primary somatosensory cortex are sufficient to explain performance in some frequency discrimination tasks [7-11]; however, there is emerging evidence that stimuli can be distinguished based also on temporal features of neural activity [12, 13]. Our study's advance is to demonstrate that temporal features are fundamental for vibrotactile frequency perception. Pulsatile mechanical stimuli were used to elicit specified temporal spike train patterns in tactile afferents, and subsequently psychophysical methods were employed to characterize human frequency perception. Remarkably, the most salient temporal feature determining vibrotactile frequency was not the underlying periodicity but, rather, the duration of the silent gap between successive bursts of neural activity. This burst gap code for frequency represents a previously unknown form of neural coding in the tactile sensory system, which parallels auditory pitch perception mechanisms based on purely temporal information where longer inter-pulse intervals receive higher perceptual weights than short intervals [14]. Our study also demonstrates that human perception of stimuli can be determined exclusively by temporal features of spike trains independent of the mean spike rate and without contribution from population response factors. Copyright © 2017 Elsevier Ltd. All rights reserved.

  5. Span: spike pattern association neuron for learning spatio-temporal spike patterns.

    Science.gov (United States)

    Mohemmed, Ammar; Schliebs, Stefan; Matsuda, Satoshi; Kasabov, Nikola

    2012-08-01

    Spiking Neural Networks (SNN) were shown to be suitable tools for the processing of spatio-temporal information. However, due to their inherent complexity, the formulation of efficient supervised learning algorithms for SNN is difficult and remains an important problem in the research area. This article presents SPAN - a spiking neuron that is able to learn associations of arbitrary spike trains in a supervised fashion allowing the processing of spatio-temporal information encoded in the precise timing of spikes. The idea of the proposed algorithm is to transform spike trains during the learning phase into analog signals so that common mathematical operations can be performed on them. Using this conversion, it is possible to apply the well-known Widrow-Hoff rule directly to the transformed spike trains in order to adjust the synaptic weights and to achieve a desired input/output spike behavior of the neuron. In the presented experimental analysis, the proposed learning algorithm is evaluated regarding its learning capabilities, its memory capacity, its robustness to noisy stimuli and its classification performance. Differences and similarities of SPAN regarding two related algorithms, ReSuMe and Chronotron, are discussed.
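The core idea, convolving spike trains into analog signals and then applying the Widrow-Hoff rule, can be sketched as follows. A linear readout of the convolved traces stands in for the spiking neuron's output here, so this is a simplification of SPAN rather than a faithful reimplementation; all parameter values are illustrative.

```python
import numpy as np

dt, T, tau = 1e-3, 0.5, 0.01
t = np.arange(0, T, dt)

def kernel_trace(spike_times):
    """Convolve a spike train with an alpha kernel -> analog signal."""
    s = np.zeros_like(t)
    for ts in spike_times:
        m = t >= ts
        u = (t[m] - ts) / tau
        s[m] += u * np.exp(1.0 - u)           # alpha kernel, peak 1 at ts + tau
    return s

rng = np.random.default_rng(1)
n_in = 30
inputs = [np.sort(rng.uniform(0, T, 5)) for _ in range(n_in)]
x = np.array([kernel_trace(s) for s in inputs])   # analog input traces
yd = kernel_trace([0.1, 0.25, 0.4])               # desired output spike trace

w = rng.normal(0, 0.1, n_in)
mse0 = np.mean((yd - w @ x) ** 2)
for _ in range(200):
    w += 0.5 * dt * x @ (yd - w @ x)              # Widrow-Hoff on the traces
mse = np.mean((yd - w @ x) ** 2)
print(mse0, mse)
```

Because the kernel conversion turns spike timing into ordinary signals, the weight update is plain least-mean-squares; the trained weights push the readout toward producing activity at the desired output spike times.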

  6. Automatic EEG spike detection.

    Science.gov (United States)

    Harner, Richard

    2009-10-01

Since the 1970s, advances in science and technology during each succeeding decade have renewed the expectation of efficient, reliable automatic epileptiform spike detection (AESD). But even when reinforced with better, faster tools, clinically reliable unsupervised spike detection remains beyond our reach. Expert-selected spike parameters were the first and are still the most widely used for AESD. Thresholds for amplitude, duration, sharpness, rise-time, fall-time, after-coming slow waves, background frequency, and more have been used. It is still unclear which of these wave parameters are essential, beyond peak-to-peak amplitude and duration. Wavelet parameters are very appropriate to AESD but need to be combined with other parameters to achieve desired levels of spike detection efficiency. Artificial Neural Network (ANN) and expert-system methods may have reached peak efficiency. Support Vector Machine (SVM) technology focuses on outliers rather than centroids of spike and nonspike data clusters and should improve AESD efficiency. An exemplary spike/nonspike database is suggested as a tool for assessing parameters and methods for AESD and is available in CSV or Matlab formats from the author at brainvue@gmail.com. Exploratory Data Analysis (EDA) is presented as a graphic method for finding better spike parameters and for the step-wise evaluation of the spike detection process.
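A toy version of such a parameter-based detector can be written in a few lines, using only two of the parameters listed above, amplitude and duration (threshold values are illustrative, not clinically tuned, and the sketch assumes the record starts and ends below threshold):

```python
import numpy as np

def detect_spikes(sig, fs, amp_th=3.5, dur_ms=(8.0, 70.0)):
    """Flag peaks whose robust z-score exceeds amp_th and whose
    supra-threshold run has an epileptiform-like duration."""
    med = np.median(sig)
    mad = np.median(np.abs(sig - med)) + 1e-12
    z = (sig - med) / (1.4826 * mad)              # robust z-score
    above = z > amp_th
    edges = np.flatnonzero(np.diff(above.astype(int)))
    events = []
    for s, e in zip(edges[::2] + 1, edges[1::2] + 1):
        dur = (e - s) / fs * 1000.0               # run duration in ms
        if dur_ms[0] <= dur <= dur_ms[1]:
            events.append(s + int(np.argmax(sig[s:e])))
    return events

fs = 250
rng = np.random.default_rng(2)
eeg = rng.normal(0.0, 1.0, 10 * fs)               # 10 s of background noise
for ts in (2.0, 5.0, 8.0):                        # inject three sharp transients
    i = int(ts * fs)
    eeg[i:i + 12] += 10.0 * np.hanning(12)
events = detect_spikes(eeg, fs)
print(events)
```

Real AESD systems add sharpness, rise/fall times, and after-coming slow waves on top of this skeleton, which is exactly where the parameter-selection problem discussed in the abstract begins.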

  7. Memristors Empower Spiking Neurons With Stochasticity

    KAUST Repository

    Al-Shedivat, Maruan

    2015-06-01

    Recent theoretical studies have shown that probabilistic spiking can be interpreted as learning and inference in cortical microcircuits. This interpretation creates new opportunities for building neuromorphic systems driven by probabilistic learning algorithms. However, such systems must have two crucial features: 1) the neurons should follow a specific behavioral model, and 2) stochastic spiking should be implemented efficiently for it to be scalable. This paper proposes a memristor-based stochastically spiking neuron that fulfills these requirements. First, the analytical model of the memristor is enhanced so it can capture the behavioral stochasticity consistent with experimentally observed phenomena. The switching behavior of the memristor model is demonstrated to be akin to the firing of the stochastic spike response neuron model, the primary building block for probabilistic algorithms in spiking neural networks. Furthermore, the paper proposes a neural soma circuit that utilizes the intrinsic nondeterminism of memristive switching for efficient spike generation. The simulations and analysis of the behavior of a single stochastic neuron and a winner-take-all network built of such neurons and trained on handwritten digits confirm that the circuit can be used for building probabilistic sampling and pattern adaptation machinery in spiking networks. The findings constitute an important step towards scalable and efficient probabilistic neuromorphic platforms. © 2011 IEEE.

  8. SpikeTemp: An Enhanced Rank-Order-Based Learning Approach for Spiking Neural Networks With Adaptive Structure.

    Science.gov (United States)

    Wang, Jinling; Belatreche, Ammar; Maguire, Liam P; McGinnity, Thomas Martin

    2017-01-01

    This paper presents an enhanced rank-order-based learning algorithm, called SpikeTemp, for spiking neural networks (SNNs) with a dynamically adaptive structure. The trained feed-forward SNN consists of two layers of spiking neurons: 1) an encoding layer which temporally encodes real-valued features into spatio-temporal spike patterns and 2) an output layer of dynamically grown neurons which perform spatio-temporal classification. Both Gaussian receptive fields and square cosine population encoding schemes are employed to encode real-valued features into spatio-temporal spike patterns. Unlike the rank-order-based learning approach, SpikeTemp uses the precise times of the incoming spikes for adjusting the synaptic weights such that early spikes result in a large weight change and late spikes lead to a smaller weight change. This removes the need to rank all the incoming spikes and, thus, reduces the computational cost of SpikeTemp. The proposed SpikeTemp algorithm is demonstrated on several benchmark data sets and on an image recognition task. The results show that SpikeTemp can achieve better classification performance and is much faster than the existing rank-order-based learning approach. In addition, the number of output neurons is much smaller when the square cosine encoding scheme is employed. Furthermore, SpikeTemp is benchmarked against a selection of existing machine learning algorithms, and the results demonstrate the ability of SpikeTemp to classify different data sets after just one presentation of the training samples with comparable classification performance.
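The contrast with rank-order learning can be sketched directly: the weight change depends on each spike's precise latency rather than on its rank, so no sorting of the incoming spikes is needed. The exponential decay below is an assumed shape for illustration; the key property is only that earlier spikes get larger changes.

```python
import numpy as np

def spiketemp_weights(spike_times, t_ref=None, tau=5.0, w_max=1.0):
    """SpikeTemp-style update: weight change decays with the precise
    spike latency. Hypothetical decay shape; no ranking is required."""
    t = np.asarray(spike_times, float)
    if t_ref is None:
        t_ref = t.min()                   # earliest spike -> largest change
    return w_max * np.exp(-(t - t_ref) / tau)

times = [1.0, 3.0, 9.0, 2.0]              # ms, deliberately unsorted
dw = spiketemp_weights(times)
print(dw)
```

Computing these changes is O(n) in the number of spikes, whereas a rank-order rule needs an O(n log n) sort, which is the computational saving the abstract refers to.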

  9. Evoking prescribed spike times in stochastic neurons

    Science.gov (United States)

    Doose, Jens; Lindner, Benjamin

    2017-09-01

Single cell stimulation in vivo is a powerful tool to investigate the properties of single neurons and their functionality in neural networks. We present a method to determine a cell-specific stimulus that reliably evokes a prescribed spike train with high temporal precision of action potentials. We test the performance of this stimulus in simulations for two different stochastic neuron models. For a broad range of parameters and a neuron firing at intermediate rates (20-40 Hz), the reliability in evoking the prescribed spike train is close to its theoretical maximum, which is mainly determined by the level of intrinsic noise.

  10. Effect of physical training on urinary incontinence: a randomized parallel group trial in nursing homes

    Directory of Open Access Journals (Sweden)

    Vinsnes AG

    2012-02-01

Full Text Available Anne G Vinsnes,1 Jorunn L Helbostad,2 Signe Nyrønning,3 Gene E Harkless,1,4 Randi Granbo,5 Arnfinn Seim6 (1Faculty of Nursing, Sør-Trøndelag University College; 2Department of Neuroscience, Norwegian University of Science and Technology; 3Søbstad Community Hospital and Teaching Nursing Home, Trondheim, Norway; 4University of New Hampshire, College of Health and Social Services, Nursing Faculty, Durham, New Hampshire, USA; 5Department of Physiotherapy, Sør-Trøndelag University College; 6Department of Public Health and General Practice, Norwegian University of Science and Technology, Trondheim, Norway) Background: Residents in nursing homes (NHs) are often frail older persons who have impaired physical activity. Urinary incontinence (UI) is a common complaint for residents in NHs. Reduced functional ability and residence in NHs are documented to be risk factors for UI. Objective: To investigate if an individualized training program designed to improve activity of daily living (ADL) and physical capacity among residents in nursing homes has any impact on UI. Materials and methods: This randomized controlled trial was a substudy of a Nordic multicenter study. Participants had to be >65 years, have stayed in the NH for more than 3 months and in need of assistance in at least one ADL. A total of 98 residents were randomly allocated to either a training group (n = 48) or a control group (n = 50) after baseline registrations. The training program lasted for 3 months and included accommodated physical activity and ADL training. Personal treatment goals were elicited for each subject. The control group received their usual care. The main outcome measure was UI as measured by a 24-hour pad-weighing test. There was no statistically significant difference between the groups on this measure at baseline (P = 0.15). Changes were calculated from baseline to 3 months after the end of the intervention. Results: Altogether, 68 participants were included in the analysis

  11. Spike Code Flow in Cultured Neuronal Networks.

    Science.gov (United States)

    Tamura, Shinichi; Nishitani, Yoshi; Hosokawa, Chie; Miyoshi, Tomomitsu; Sawai, Hajime; Kamimura, Takuya; Yagi, Yasushi; Mizuno-Matsumoto, Yuko; Chen, Yen-Wei

    2016-01-01

We observed spike trains produced by one-shot electrical stimulation with 8 × 8 multielectrodes in cultured neuronal networks. Each electrode accepted spikes from several neurons. We extracted the short codes from spike trains and obtained a code spectrum with a nominal time accuracy of 1%. We then constructed code flow maps as movies of the electrode array to observe the code flow of "1101" and "1011," which are typical pseudorandom sequences such as those we often encountered in the literature and in our experiments. They seemed to flow from one electrode to a neighboring one and maintained their shape to some extent. To quantify the flow, we calculated the "maximum cross-correlations" among neighboring electrodes to find the direction of maximum flow of the codes with lengths less than 8. Normalized maximum cross-correlations were almost constant irrespective of the code. Furthermore, if the spike trains were shuffled in interval order or across electrodes, the correlations became significantly smaller. Thus, the analysis suggested that local codes of approximately constant shape propagated and conveyed information across the network. Hence, the codes can serve as visible and trackable marks of propagating spike waves, as well as a means of evaluating information flow in the neuronal network.
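A minimal sketch of the maximum-cross-correlation idea on synthetic data: the electrode layout and code extraction are abstracted away, and each time series simply marks when a code occurs at an electrode. In this toy setup, electrode B repeats electrode A's activity two time bins later, so the peak lagged correlation should point from A to B.

```python
import numpy as np

rng = np.random.default_rng(3)

def norm_xcorr_max(a, b, max_lag=5):
    """Peak normalized cross-correlation over positive lags (a leading b)."""
    a = (a - a.mean()) / (a.std() + 1e-12)
    b = (b - b.mean()) / (b.std() + 1e-12)
    n = len(a)
    return max(np.dot(a[:n - k], b[k:]) / (n - k) for k in range(1, max_lag + 1))

# toy "code occurrence" series on two neighboring electrodes
a = (rng.random(400) < 0.2).astype(float)
b = np.roll(a, 2) + (rng.random(400) < 0.05)   # A's codes, delayed, plus noise
fwd = norm_xcorr_max(a, b)                     # flow direction A -> B
bwd = norm_xcorr_max(b, a)                     # flow direction B -> A
print(fwd, bwd)
```

Repeating this for every electrode against each of its neighbors, and taking the direction of the largest peak, yields the kind of flow map the study builds over the 8 × 8 array.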

  12. Spike Code Flow in Cultured Neuronal Networks

    Directory of Open Access Journals (Sweden)

    Shinichi Tamura

    2016-01-01

Full Text Available We observed spike trains produced by one-shot electrical stimulation with 8 × 8 multielectrodes in cultured neuronal networks. Each electrode accepted spikes from several neurons. We extracted the short codes from spike trains and obtained a code spectrum with a nominal time accuracy of 1%. We then constructed code flow maps as movies of the electrode array to observe the code flow of “1101” and “1011,” which are typical pseudorandom sequences such as those we often encountered in the literature and in our experiments. They seemed to flow from one electrode to a neighboring one and maintained their shape to some extent. To quantify the flow, we calculated the “maximum cross-correlations” among neighboring electrodes to find the direction of maximum flow of the codes with lengths less than 8. Normalized maximum cross-correlations were almost constant irrespective of the code. Furthermore, if the spike trains were shuffled in interval order or across electrodes, the correlations became significantly smaller. Thus, the analysis suggested that local codes of approximately constant shape propagated and conveyed information across the network. Hence, the codes can serve as visible and trackable marks of propagating spike waves, as well as a means of evaluating information flow in the neuronal network.

  13. Spike-based population coding and working memory.

    Directory of Open Access Journals (Sweden)

    Martin Boerlin

    2011-02-01

Full Text Available Compelling behavioral evidence suggests that humans can make optimal decisions despite the uncertainty inherent in perceptual or motor tasks. A key question in neuroscience is how populations of spiking neurons can implement such probabilistic computations. In this article, we develop a comprehensive framework for optimal, spike-based sensory integration and working memory in a dynamic environment. We propose that probability distributions are inferred spike-per-spike in recurrently connected networks of integrate-and-fire neurons. As a result, these networks can combine sensory cues optimally, track the state of a time-varying stimulus, and memorize accumulated evidence over periods much longer than the time constant of single neurons. Importantly, we propose that population responses and persistent working memory states represent entire probability distributions and not only single stimulus values. These memories are reflected by sustained, asynchronous patterns of activity which make relevant information available to downstream neurons within their short time window of integration. Model neurons act as predictive encoders, only firing spikes which account for new information that has not yet been signaled. Thus, spike times deterministically signal a prediction error, contrary to rate codes in which spike times are considered to be random samples of an underlying firing rate. As a consequence of this coding scheme, a multitude of spike patterns can reliably encode the same information. This results in weakly correlated, Poisson-like spike trains that are sensitive to initial conditions but robust to even high levels of external neural noise. This spike train variability reproduces that observed in cortical sensory spike trains, but cannot be equated to noise. On the contrary, it is a consequence of optimal spike-based inference. In contrast, we show that rate-based models perform poorly when implemented with stochastically spiking neurons.

  14. Communication partner training for health care professionals in an inpatient rehabilitation setting: A parallel randomised trial.

    Science.gov (United States)

    Heard, Renee; O'Halloran, Robyn; McKinley, Kathryn

    2017-06-01

The purpose of this study is to determine if the E-Learning Plus communication partner training (CPT) programme is as effective as the Supported Conversation for Adults with Aphasia (SCA™) CPT programme in improving healthcare professionals' confidence and knowledge in communicating with patients with aphasia. Forty-eight healthcare professionals working in inpatient rehabilitation participated. Participants were randomised to one of the CPT programmes. The three outcome measures were a self-rating of confidence, a self-rating of knowledge, and a test of knowledge of aphasia. Measures were taken pre-, immediately post-, and 3-4 months post-training. Data were analysed using mixed between-within ANOVAs. Homogeneity of variance was adequate for the self-rating of confidence and the test of knowledge of aphasia, allowing the analysis to continue. There was a statistically significant change in self-rated confidence and knowledge of aphasia for both interventions across time. No statistically significant difference was found between the two interventions. Both CPT interventions were associated with an increase in healthcare professionals' confidence and knowledge of aphasia, but neither programme was superior. As the E-Learning Plus CPT programme is more accessible and sustainable in the Australian healthcare context, further work will continue on this CPT programme.

  15. Beyond Silence: A Randomized, Parallel-Group Trial Exploring the Impact of Workplace Mental Health Literacy Training with Healthcare Employees.

    Science.gov (United States)

    Moll, Sandra E; Patten, Scott; Stuart, Heather; MacDermid, Joy C; Kirsh, Bonnie

    2018-01-01

This study sought to evaluate whether a contact-based workplace education program was more effective than standard mental health literacy training in promoting early intervention and support for healthcare employees with mental health issues. A parallel-group, randomised trial was conducted with employees in 2 multi-site Ontario hospitals, with the evaluators blinded to the groups. Participants were randomly assigned to 1 of 2 group-based education programs: Beyond Silence (comprising 6 in-person, 2-h sessions plus 5 online sessions, co-led by employees who had personally experienced mental health issues) or Mental Health First Aid (a standardised 2-day training program led by a trained facilitator). Participants completed baseline, post-group, and 3-month follow-up surveys to explore perceived changes in mental health knowledge, stigmatized beliefs, and help-seeking/help-outreach behaviours. An intent-to-treat analysis was completed with 192 participants. Differences were assessed using multi-level mixed models accounting for site, group, and repeated measurement. Neither program led to significant increases in help-seeking or help-outreach behaviours. Both programs increased mental health literacy, improved attitudes towards seeking treatment, and decreased stigmatized beliefs, with sustained changes in stigmatized beliefs more prominent in the Beyond Silence group. Beyond Silence, a new contact-based education program customised for healthcare workers, was not superior to standard mental health literacy training in improving mental health help-seeking or help-outreach behaviours in the workplace. The only difference was a greater reduction in stigmatized beliefs over time. Additional research is needed to explore the factors that lead to behaviour change.

  16. Accelerated spike resampling for accurate multiple testing controls.

    Science.gov (United States)

    Harrison, Matthew T

    2013-02-01

    Controlling for multiple hypothesis tests using standard spike resampling techniques often requires prohibitive amounts of computation. Importance sampling techniques can be used to accelerate the computation. The general theory is presented, along with specific examples for testing differences across conditions using permutation tests and for testing pairwise synchrony and precise lagged-correlation between many simultaneously recorded spike trains using interval jitter.
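A plain Monte Carlo version of the interval-jitter test mentioned above can be sketched as follows (the paper's contribution, importance sampling to accelerate exactly this kind of computation, is not reproduced here; all window widths, rates, and tolerances are illustrative assumptions):

```python
import numpy as np

def interval_jitter(spikes, delta, rng):
    """Move each spike uniformly within its own window [k*delta, (k+1)*delta),
    preserving the per-window spike counts."""
    k = np.floor(spikes / delta)
    return (k + rng.random(len(spikes))) * delta

def sync_count(a, b, eps):
    """Number of spikes in a with a partner in b closer than eps."""
    return int(np.sum(np.min(np.abs(a[:, None] - b[None, :]), axis=1) < eps))

rng = np.random.default_rng(1)
a = np.sort(rng.uniform(0.0, 10.0, 40))
b = np.sort(a + rng.normal(0, 0.002, 40))   # second train tightly locked to a
observed = sync_count(a, b, eps=0.005)
null = [sync_count(interval_jitter(a, 0.1, rng), b, eps=0.005)
        for _ in range(200)]
p_value = (1 + sum(n >= observed for n in null)) / (1 + len(null))
```

Because jittering preserves slow rate co-modulation (the window counts) but destroys millisecond alignment, a small `p_value` points specifically at fine-timescale synchrony; repeating this for many neuron pairs is what makes the multiple-testing correction, and hence the acceleration, necessary.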

  17. Surfing a spike wave down the ventral stream.

    Science.gov (United States)

    VanRullen, Rufin; Thorpe, Simon J

    2002-10-01

Numerous theories of neural processing, often motivated by experimental observations, have explored the computational properties of neural codes based on the absolute or relative timing of spikes in spike trains. Spiking neuron models and theories, however, as well as their experimental counterparts, have generally been limited to the simulation or observation of isolated neurons, isolated spike trains, or reduced neural populations. Such theories would therefore seem inappropriate to capture the properties of a neural code relying on temporal spike patterns distributed across large neuronal populations. Here we report a range of computer simulations and theoretical considerations that were designed to explore the possibilities of one such code and its relevance for visual processing. In a unified framework where the relation between stimulus saliency and relative spike timing plays the central role, we describe how the ventral stream of the visual system could process natural input scenes and extract meaningful information, both rapidly and reliably. The first wave of spikes generated in the retina in response to visual stimulation carries information explicitly in its spatio-temporal structure: the most salient information is represented by the first spikes over the population. This spike wave, propagating through a hierarchy of visual areas, is regenerated at each processing stage, where its temporal structure can be modified by (i) the selectivity of the cortical neurons, (ii) lateral interactions, and (iii) top-down attentional influences from higher-order cortical areas. The resulting model could account for the remarkable efficiency and rapidity of processing observed in the primate visual system.
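The core encoding step, saliency mapped to first-spike latency so that the earliest spikes carry the most salient information, can be sketched in a few lines (a toy illustration under assumed parameters, not the authors' simulation code):

```python
import numpy as np

def to_latencies(intensities, t_max=50.0):
    """Latency code: the most salient (strongest) inputs fire first."""
    s = np.asarray(intensities, dtype=float)
    s = (s - s.min()) / (np.ptp(s) + 1e-12)   # normalize saliency to [0, 1]
    return t_max * (1.0 - s)                  # high saliency -> short latency

stimulus = np.array([0.9, 0.1, 0.5, 1.0])     # e.g. local contrast at 4 positions
latencies = to_latencies(stimulus)
wave_order = np.argsort(latencies)            # the spike wave: neurons by firing time
```

Reading the population in `wave_order` recovers a saliency ranking from the very first spikes, which is why downstream areas can respond before every neuron has fired even once.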

  18. A new supervised learning algorithm for spiking neurons.

    Science.gov (United States)

    Xu, Yan; Zeng, Xiaoqin; Zhong, Shuiming

    2013-06-01

The purpose of supervised learning with temporal encoding for spiking neurons is to make the neurons emit a specific spike train encoded by the precise firing times of its spikes. If only running time is considered, supervised learning for a spiking neuron is equivalent to distinguishing the desired output spike times from all other times during the running process of the neuron by adjusting synaptic weights, which can be regarded as a classification problem. Based on this idea, this letter proposes a new supervised learning method for spiking neurons with temporal encoding; it first transforms the supervised learning task into a classification problem and then solves the problem using the perceptron learning rule. The experimental results show that the proposed method achieves higher learning accuracy and efficiency than existing learning methods, so it is more powerful for solving complex and real-time problems.
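The reduction to classification can be sketched as follows: each time bin is a sample whose features are the filtered (PSP-like) afferent inputs, desired spike times are the positive class, and the perceptron rule adjusts the weights. This is a simplified stand-in for the letter's method (the neuron model, filter, and all parameters here are assumptions):

```python
import numpy as np

rng = np.random.default_rng(2)
T, n_in = 100, 30                            # time bins, afferent neurons
x = (rng.random((T, n_in)) < 0.1) * 1.0      # afferent spike trains
psp = np.zeros_like(x)                       # exponentially filtered inputs (PSP-like)
for t in range(1, T):
    psp[t] = 0.8 * psp[t - 1] + x[t]

desired = np.zeros(T, dtype=bool)
desired[[20, 55, 80]] = True                 # target output spike times
w, theta, eta = np.zeros(n_in), 1.0, 0.05

for _ in range(200):                         # perceptron rule, bin by bin
    for t in range(T):
        fired = psp[t] @ w > theta
        if desired[t] and not fired:
            w += eta * psp[t]                # missed a desired spike: potentiate
        elif fired and not desired[t]:
            w -= eta * psp[t]                # spurious spike: depress
accuracy = np.mean((psp @ w > theta) == desired)
```

When the desired and non-desired bins are linearly separable in the PSP feature space, the perceptron convergence argument is what guarantees that the neuron learns to fire at exactly the target times.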

  19. Consensus-Based Sorting of Neuronal Spike Waveforms.

    Science.gov (United States)

    Fournier, Julien; Mueller, Christian M; Shein-Idelson, Mark; Hemberger, Mike; Laurent, Gilles

    2016-01-01

Optimizing spike-sorting algorithms is difficult because sorted clusters can rarely be checked against independently obtained "ground truth" data. In most spike-sorting algorithms in use today, the optimality of a clustering solution is assessed relative to some assumption on the distribution of the spike shapes associated with a particular single unit (e.g., Gaussianity) and by visual inspection of the clustering solution followed by manual validation. When the spatiotemporal waveforms of spikes from different cells overlap, the decision as to whether two spikes should be assigned to the same source can be quite subjective unless it is based on reliable quantitative measures. We propose a new approach, whereby spike clusters are identified from the most consensual partition across an ensemble of clustering solutions. Using the variability of the clustering solutions across successive iterations of the same clustering algorithm (template matching based on K-means clusters), we estimate the probability of spikes being clustered together and identify groups of spikes that are not statistically distinguishable from one another. Thus, we identify spikes that are most likely to be clustered together and therefore correspond to consistent spike clusters. This method has the potential advantage that it does not rely on any model of the spike shapes. It also provides estimates of the proportion of misclassified spikes for each of the identified clusters. We tested our algorithm on several datasets for which there exists a ground truth (simultaneous intracellular data), and show that it performs close to the optimum reached by a support vector machine trained on the ground truth. We also show that the estimated rate of misclassification matches the proportion of misclassified spikes measured from the ground truth data.
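The consensus idea, estimating co-clustering probabilities across repeated runs of the same clustering algorithm, can be sketched with a co-association matrix (a minimal stand-in, assuming synthetic 2-D "waveform features" and a bare-bones k-means instead of the paper's template-matching pipeline):

```python
import numpy as np

def kmeans_labels(X, k, rng, iters=20):
    """A bare-bones k-means, standing in for one clustering solution."""
    C = X[rng.choice(len(X), size=k, replace=False)].copy()
    for _ in range(iters):
        lab = np.argmin(((X[:, None] - C[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(lab == j):
                C[j] = X[lab == j].mean(axis=0)
    return lab

rng = np.random.default_rng(3)
# two well-separated clouds of "spike waveform features", 30 spikes each
X = np.vstack([rng.normal(0.0, 0.3, (30, 2)), rng.normal(4.0, 0.3, (30, 2))])
co = np.zeros((60, 60))
for _ in range(50):                          # ensemble of clustering solutions
    lab = kmeans_labels(X, 2, rng)
    co += lab[:, None] == lab[None, :]
co /= 50                                     # estimated co-clustering probability
consensus = co > 0.5                         # most consensual partition
```

Because the co-association matrix only asks "were these two spikes grouped together?", it is insensitive to label permutations across runs, and intermediate values of `co` directly estimate the proportion of ambiguously classified spikes.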

  20. A model-based spike sorting algorithm for removing correlation artifacts in multi-neuron recordings.

    Science.gov (United States)

    Pillow, Jonathan W; Shlens, Jonathon; Chichilnisky, E J; Simoncelli, Eero P

    2013-01-01

    We examine the problem of estimating the spike trains of multiple neurons from voltage traces recorded on one or more extracellular electrodes. Traditional spike-sorting methods rely on thresholding or clustering of recorded signals to identify spikes. While these methods can detect a large fraction of the spikes from a recording, they generally fail to identify synchronous or near-synchronous spikes: cases in which multiple spikes overlap. Here we investigate the geometry of failures in traditional sorting algorithms, and document the prevalence of such errors in multi-electrode recordings from primate retina. We then develop a method for multi-neuron spike sorting using a model that explicitly accounts for the superposition of spike waveforms. We model the recorded voltage traces as a linear combination of spike waveforms plus a stochastic background component of correlated Gaussian noise. Combining this measurement model with a Bernoulli prior over binary spike trains yields a posterior distribution for spikes given the recorded data. We introduce a greedy algorithm to maximize this posterior that we call "binary pursuit". The algorithm allows modest variability in spike waveforms and recovers spike times with higher precision than the voltage sampling rate. This method substantially corrects cross-correlation artifacts that arise with conventional methods, and substantially outperforms clustering methods on both real and simulated data. Finally, we develop diagnostic tools that can be used to assess errors in spike sorting in the absence of ground truth.
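The greedy flavor of this approach can be sketched with a toy template-subtraction loop: place a spike wherever doing so most reduces the squared error of the fit, peel it off, and repeat. This is only in the spirit of binary pursuit (no Bernoulli prior, noise whitening, or sub-sample timing; the waveform and all parameters are assumptions):

```python
import numpy as np

rng = np.random.default_rng(4)
T = 400
tmpl = np.exp(-np.arange(20) / 3.0) - 0.5 * np.exp(-np.arange(20) / 6.0)
true_times = [50, 120, 121, 300]            # includes two near-synchronous spikes
v = rng.normal(0, 0.05, T)                  # background noise
for t in true_times:
    v[t:t + 20] += tmpl                     # superimposed spike waveforms

found, resid = [], v.copy()
e = float((tmpl ** 2).sum())
for _ in range(10):
    corr = np.correlate(resid, tmpl, mode="valid")
    gain = corr - 0.5 * e                   # half the drop in squared error
    t = int(np.argmax(gain))
    if gain[t] <= 0:                        # no placement improves the fit: stop
        break
    resid[t:t + 20] -= tmpl                 # peel off one spike, keep the rest
    found.append(t)
```

Note that the overlapping spikes at bins 120 and 121 are both recovered: after the first is subtracted, the residual still correlates strongly with the template at the second time, which is exactly the failure mode of threshold/cluster sorters that the superposition model addresses.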

  1. The variational spiked oscillator

    International Nuclear Information System (INIS)

    Aguilera-Navarro, V.C.; Ullah, N.

    1992-08-01

A variational analysis of the spiked harmonic oscillator Hamiltonian -d²/dx² + x² + δ/x^(5/2), δ > 0, is reported in this work. A trial function satisfying Dirichlet boundary conditions is suggested. The results are excellent for a large range of values of the coupling parameter. (author)
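A numerical version of such a variational calculation can be sketched as follows (the trial function, grid, and δ value are illustrative assumptions, not the authors' choice): with ψ_a(x) = x^a e^(-x²/2), which vanishes at x = 0 as Dirichlet conditions require, the energy functional ⟨ψ|H|ψ⟩/⟨ψ|ψ⟩ is evaluated by quadrature and minimized over the variational parameter a.

```python
import numpy as np

def trap(y, x):
    """Simple trapezoidal quadrature."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x)) / 2)

# trial function psi_a(x) = x**a * exp(-x**2 / 2); psi_a(0) = 0 (Dirichlet)
x = np.linspace(1e-6, 12.0, 200001)

def energy(a, delta):
    psi = x ** a * np.exp(-x ** 2 / 2)
    dpsi = (a * x ** (a - 1) - x ** (a + 1)) * np.exp(-x ** 2 / 2)
    num = trap(dpsi ** 2 + (x ** 2 + delta / x ** 2.5) * psi ** 2, x)
    return num / trap(psi ** 2, x)

E_unspiked = energy(1.0, 0.0)        # delta -> 0 limit; exact value is 3
E_spiked = min(energy(a, 0.1) for a in np.linspace(0.9, 2.0, 45))
```

At δ = 0 the Dirichlet ground state of -d²/dx² + x² has energy 3 (the first odd oscillator state), which the a = 1 trial function reproduces exactly; the spike term δ/x^(5/2) then raises the variational bound above 3.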

  2. Synchronous Spike Patterns in Macaque Motor Cortex during an Instructed-Delay Reach-to-Grasp Task.

    Science.gov (United States)

    Torre, Emiliano; Quaglio, Pietro; Denker, Michael; Brochier, Thomas; Riehle, Alexa; Grün, Sonja

    2016-08-10

    The computational role of spike time synchronization at millisecond precision among neurons in the cerebral cortex is hotly debated. Studies performed on data of limited size provided experimental evidence that low-order correlations occur in relation to behavior. Advances in electrophysiological technology to record from hundreds of neurons simultaneously provide the opportunity to observe coordinated spiking activity of larger populations of cells. We recently published a method that combines data mining and statistical evaluation to search for significant patterns of synchronous spikes in massively parallel spike trains (Torre et al., 2013). The method solves the computational and multiple testing problems raised by the high dimensionality of the data. In the current study, we used our method on simultaneous recordings from two macaque monkeys engaged in an instructed-delay reach-to-grasp task to determine the emergence of spike synchronization in relation to behavior. We found a multitude of synchronous spike patterns aligned in both monkeys along a preferential mediolateral orientation in brain space. The occurrence of the patterns is highly specific to behavior, indicating that different behaviors are associated with the synchronization of different groups of neurons ("cell assemblies"). However, pooled patterns that overlap in neuronal composition exhibit no specificity, suggesting that exclusive cell assemblies become active during different behaviors, but can recruit partly identical neurons. These findings are consistent across multiple recording sessions analyzed across the two monkeys. Neurons in the brain communicate via electrical impulses called spikes. How spikes are coordinated to process information is still largely unknown. Synchronous spikes are effective in triggering a spike emission in receiving neurons and have been shown to occur in relation to behavior in a number of studies on simultaneous recordings of few neurons. 
We recently published a method that combines data mining and statistical evaluation to detect such patterns in massively parallel spike trains (Torre et al., 2013).

  3. Precise-spike-driven synaptic plasticity: learning hetero-association of spatiotemporal spike patterns.

    Directory of Open Access Journals (Sweden)

    Qiang Yu

Full Text Available A new learning rule (Precise-Spike-Driven (PSD) Synaptic Plasticity) is proposed for processing and memorizing spatiotemporal patterns. PSD is a supervised learning rule that is analytically derived from the traditional Widrow-Hoff rule and can be used to train neurons to associate an input spatiotemporal spike pattern with a desired spike train. Synaptic adaptation is driven by the error between the desired and the actual output spikes, with positive errors causing long-term potentiation and negative errors causing long-term depression. The amount of modification is proportional to an eligibility trace that is triggered by afferent spikes. The PSD rule is both computationally efficient and biologically plausible. The properties of this learning rule are investigated extensively through experimental simulations, including its learning performance, its generality to different neuron models, its robustness against noisy conditions, its memory capacity, and the effects of its learning parameters. Experimental results show that the PSD rule is capable of spatiotemporal pattern classification, and can even outperform a well-studied benchmark algorithm with the proposed relative confidence criterion. The PSD rule is further validated on a practical example of an optical character recognition problem. The results again show that it can achieve a good recognition performance with a proper encoding. Finally, a detailed discussion is provided about the PSD rule and several related algorithms including tempotron, SPAN, Chronotron and ReSuMe.

  4. Precise-spike-driven synaptic plasticity: learning hetero-association of spatiotemporal spike patterns.

    Science.gov (United States)

    Yu, Qiang; Tang, Huajin; Tan, Kay Chen; Li, Haizhou

    2013-01-01

A new learning rule (Precise-Spike-Driven (PSD) Synaptic Plasticity) is proposed for processing and memorizing spatiotemporal patterns. PSD is a supervised learning rule that is analytically derived from the traditional Widrow-Hoff rule and can be used to train neurons to associate an input spatiotemporal spike pattern with a desired spike train. Synaptic adaptation is driven by the error between the desired and the actual output spikes, with positive errors causing long-term potentiation and negative errors causing long-term depression. The amount of modification is proportional to an eligibility trace that is triggered by afferent spikes. The PSD rule is both computationally efficient and biologically plausible. The properties of this learning rule are investigated extensively through experimental simulations, including its learning performance, its generality to different neuron models, its robustness against noisy conditions, its memory capacity, and the effects of its learning parameters. Experimental results show that the PSD rule is capable of spatiotemporal pattern classification, and can even outperform a well-studied benchmark algorithm with the proposed relative confidence criterion. The PSD rule is further validated on a practical example of an optical character recognition problem. The results again show that it can achieve a good recognition performance with a proper encoding. Finally, a detailed discussion is provided about the PSD rule and several related algorithms including tempotron, SPAN, Chronotron and ReSuMe.
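The structure of a PSD-style update, output error times an eligibility trace of filtered afferent spikes, can be sketched as follows (a simplified threshold neuron stands in for the spiking model; all constants and the stopping criterion are assumptions):

```python
import numpy as np

def psd_update(w, elig, desired, actual, eta=0.05):
    """PSD-style change: error (desired - actual output spikes) times each
    afferent's eligibility trace, summed over time. Positive error -> LTP,
    negative error -> LTD."""
    return w + eta * (desired - actual) @ elig

rng = np.random.default_rng(5)
T, n_in, tau = 200, 50, 5.0
pre = (rng.random((T, n_in)) < 0.05) * 1.0   # input spatiotemporal spike pattern
elig = np.zeros_like(pre)                    # eligibility: filtered afferent spikes
for t in range(1, T):
    elig[t] = elig[t - 1] * np.exp(-1.0 / tau) + pre[t]

desired = np.zeros(T)
desired[[40, 110, 170]] = 1.0                # desired output spike train
w = np.zeros(n_in)
theta = 1.0
best_w, best_m = w.copy(), float("inf")
for _ in range(500):
    actual = (elig @ w > theta) * 1.0        # simplified threshold neuron
    m = float(np.abs(desired - actual).sum())
    if m < best_m:
        best_m, best_w = m, w.copy()
    if m == 0:
        break
    w = psd_update(w, elig, desired, actual)
```

The sign structure matches the abstract: a missed desired spike (positive error) potentiates exactly the synapses with recent afferent activity, a spurious spike depresses them, and synapses with no recent spikes (zero eligibility) are untouched.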

  5. Spiking neural network for recognizing spatiotemporal sequences of spikes

    International Nuclear Information System (INIS)

    Jin, Dezhe Z.

    2004-01-01

Sensory neurons in many brain areas spike with precise timing to stimuli with temporal structures, and encode temporally complex stimuli into spatiotemporal spikes. How the downstream neurons read out such a neural code is an important unsolved problem. In this paper, we describe a decoding scheme using a spiking recurrent neural network. The network consists of excitatory neurons that form a synfire chain, and two globally inhibitory interneurons of different types that provide delayed feedforward and fast feedback inhibition, respectively. The network signals recognition of a specific spatiotemporal sequence when the last excitatory neuron down the synfire chain spikes, which happens if and only if that sequence was present in the input spike stream. The recognition scheme is invariant to variations in the intervals between input spikes within some range. The computation of the network can be mapped into that of a finite state machine. Our network provides a simple way to decode spatiotemporal spikes with diverse types of neurons.
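Since the abstract notes the network's computation maps onto a finite state machine, the equivalent recognizer can be sketched directly (a simplification: it advances on matching spikes and ignores non-matching ones, without the restart dynamics a real synfire network would have; pattern and tolerance values are illustrative):

```python
def make_recognizer(pattern, tol):
    """Finite-state recognizer for a spatiotemporal spike sequence.
    pattern: list of (neuron_id, interval_to_previous_spike);
    intervals may vary by +/- tol, mirroring the network's invariance."""
    def recognize(spikes):                    # spikes: list of (time, neuron_id)
        state, t_prev = 0, None
        for t, nid in spikes:
            want_id, want_dt = pattern[state]
            ok_time = state == 0 or abs((t - t_prev) - want_dt) <= tol
            if nid == want_id and ok_time:
                state, t_prev = state + 1, t
                if state == len(pattern):
                    return True               # "last neuron in the chain" fired
        return False
    return recognize

rec = make_recognizer([(3, 0.0), (7, 10.0), (1, 5.0)], tol=2.0)
hit = rec([(0.0, 3), (9.0, 7), (14.5, 1)])    # intervals 9.0, 5.5: within tolerance
miss = rec([(0.0, 3), (20.0, 7), (25.0, 1)])  # first interval far too long
```

Each state corresponds to one link of the synfire chain; the tolerance window plays the role of the delayed feedforward inhibition that bounds acceptable inter-spike intervals.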

  6. A method for decoding the neurophysiological spike-response transform.

    Science.gov (United States)

    Stern, Estee; García-Crescioni, Keyla; Miller, Mark W; Peskin, Charles S; Brezina, Vladimir

    2009-11-15

Many physiological responses elicited by neuronal spikes (intracellular calcium transients, synaptic potentials, muscle contractions) are built up of discrete, elementary responses to each spike. However, the spikes occur in trains of arbitrary temporal complexity, and each elementary response not only sums with previous ones, but can itself be modified by the previous history of the activity. A basic goal in system identification is to characterize the spike-response transform in terms of a small number of functions (the elementary response kernel and additional kernels or functions that describe the dependence on previous history) that will predict the response to any arbitrary spike train. Here we do this by developing further and generalizing the "synaptic decoding" approach of Sen et al. (1996). Given the spike times in a train and the observed overall response, we use least-squares minimization to construct the best estimated response and at the same time best estimates of the elementary response kernel and the other functions that characterize the spike-response transform. We avoid the need for any specific initial assumptions about these functions by using techniques of mathematical analysis and linear algebra that allow us to solve simultaneously for all of the numerical function values treated as independent parameters. The functions are such that they may be interpreted mechanistically. We examine the performance of the method as applied to synthetic data. We then use the method to decode real synaptic and muscle contraction transforms.
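The purely linear special case of this estimation, recovering the elementary response kernel by least squares with its sample values treated as free parameters, can be sketched on synthetic data (the history-dependence kernels of the full method are omitted; all sizes and noise levels are assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)
T, K = 2000, 40                               # samples, kernel length
true_kernel = np.exp(-np.arange(K) / 8.0)     # elementary response to one spike
spikes = (rng.random(T) < 0.05) * 1.0         # spike train; responses overlap
response = np.convolve(spikes, true_kernel)[:T] + rng.normal(0, 0.02, T)

# design matrix: column j is the spike train delayed by j samples,
# so response[t] ~= sum_j kernel[j] * spikes[t - j]
X = np.column_stack([np.concatenate([np.zeros(j), spikes[:T - j]])
                     for j in range(K)])
kernel_hat, *_ = np.linalg.lstsq(X, response, rcond=None)
max_err = float(np.max(np.abs(kernel_hat - true_kernel)))
```

No functional form is assumed for the kernel: each of its K sample values is an independent unknown, and the normal equations pool every occurrence of a spike, overlapping or not, which is what lets the method handle trains of arbitrary temporal complexity.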

  7. Unsupervised spike sorting based on discriminative subspace learning.

    Science.gov (United States)

    Keshtkaran, Mohammad Reza; Yang, Zhi

    2014-01-01

    Spike sorting is a fundamental preprocessing step for many neuroscience studies which rely on the analysis of spike trains. In this paper, we present two unsupervised spike sorting algorithms based on discriminative subspace learning. The first algorithm simultaneously learns the discriminative feature subspace and performs clustering. It uses histogram of features in the most discriminative projection to detect the number of neurons. The second algorithm performs hierarchical divisive clustering that learns a discriminative 1-dimensional subspace for clustering in each level of the hierarchy until achieving almost unimodal distribution in the subspace. The algorithms are tested on synthetic and in-vivo data, and are compared against two widely used spike sorting methods. The comparative results demonstrate that our spike sorting methods can achieve substantially higher accuracy in lower dimensional feature space, and they are highly robust to noise. Moreover, they provide significantly better cluster separability in the learned subspace than in the subspace obtained by principal component analysis or wavelet transform.

  8. A supervised learning rule for classification of spatiotemporal spike patterns.

    Science.gov (United States)

    Lilin Guo; Zhenzhong Wang; Adjouadi, Malek

    2016-08-01

This study introduces a novel supervised learning algorithm for spiking neurons that takes into consideration the synaptic and axonal delays associated with weights. It can be utilized for both classification and association and uses several biologically influenced properties, such as axonal and synaptic delays. This algorithm also takes into consideration spike-timing-dependent plasticity, as in the Remote Supervised Method (ReSuMe). This paper focuses on the classification aspect alone. Spiking neurons trained according to the proposed learning rule are capable of classifying different categories by the associated sequences of precisely timed spikes. Simulation results have shown that the proposed learning method greatly improves classification accuracy when compared to the Spike Pattern Association Neuron (SPAN) and the Tempotron learning rule.

  9. Neuronal spike sorting based on radial basis function neural networks

    Directory of Open Access Journals (Sweden)

    Taghavi Kani M

    2011-02-01

Full Text Available Background: Studying the behavior of a society of neurons, extracting the communication mechanisms of the brain with other tissues, finding treatments for some nervous system diseases and designing neuroprosthetic devices require an algorithm to sort neural spikes automatically. However, sorting neural spikes is a challenging task because of the low signal-to-noise ratio (SNR) of the spikes. The main purpose of this study was to design an automatic algorithm for classifying neuronal spikes that are emitted from a specific region of the nervous system. Methods: The spike sorting process usually consists of three stages: detection, feature extraction and sorting. We initially used signal statistics to detect neural spikes. Then, we chose a limited number of typical spikes as features and finally used them to train a radial basis function (RBF) neural network to sort the spikes. In most spike sorting devices, these signals are not linearly separable. In order to solve this problem, the aforesaid RBF neural network was used. Results: After the learning process, our proposed algorithm classified any arbitrary spike. The obtained results showed that even though the proposed Radial Basis Spike Sorter (RBSS) reached the same error rate as previous methods, its computational costs were much lower. Moreover, the competitive points of the proposed algorithm were its good speed and low computational complexity. Conclusion: Regarding the results of this study, the proposed algorithm seems well suited to procedures that require real-time processing and spike sorting.
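The core idea, a few typical spikes serving as RBF centers with a trained readout on top, can be sketched on synthetic waveforms (this is a generic RBF classifier under assumed shapes, noise, and σ, not the RBSS implementation):

```python
import numpy as np

rng = np.random.default_rng(7)
# two synthetic spike shapes (32 samples each) plus recording noise
shapes = np.vstack([np.sin(np.linspace(0, np.pi, 32)),
                    np.sin(np.linspace(0, 2 * np.pi, 32))])
X = np.vstack([shapes[i] + rng.normal(0, 0.1, (50, 32)) for i in (0, 1)])
y = np.repeat([0, 1], 50)

centers = X[[0, 25, 50, 75]]            # a few typical spikes act as RBF centers

def rbf_layer(A, sigma=1.0):
    """Gaussian similarity of each waveform to each center."""
    d2 = ((A[:, None] - centers[None]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

H = rbf_layer(X)
W, *_ = np.linalg.lstsq(H, np.eye(2)[y], rcond=None)  # linear readout, one-hot
pred = np.argmax(rbf_layer(X) @ W, axis=1)
train_acc = float((pred == y).mean())
```

The Gaussian layer maps waveforms into a similarity space where a linear readout suffices, which is how an RBF network handles spike classes that are not linearly separable in the raw sample space.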

  10. Causal Inference and Explaining Away in a Spiking Network

    Science.gov (United States)

    Moreno-Bote, Rubén; Drugowitsch, Jan

    2015-01-01

    While the brain uses spiking neurons for communication, theoretical research on brain computations has mostly focused on non-spiking networks. The nature of spike-based algorithms that achieve complex computations, such as object probabilistic inference, is largely unknown. Here we demonstrate that a family of high-dimensional quadratic optimization problems with non-negativity constraints can be solved exactly and efficiently by a network of spiking neurons. The network naturally imposes the non-negativity of causal contributions that is fundamental to causal inference, and uses simple operations, such as linear synapses with realistic time constants, and neural spike generation and reset non-linearities. The network infers the set of most likely causes from an observation using explaining away, which is dynamically implemented by spike-based, tuned inhibition. The algorithm performs remarkably well even when the network intrinsically generates variable spike trains, the timing of spikes is scrambled by external sources of noise, or the network is mistuned. This type of network might underlie tasks such as odor identification and classification. PMID:26621426
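The optimization problem the network is said to solve, a quadratic objective under non-negativity constraints, can be stated and solved conventionally for comparison (a projected-gradient sketch of the problem itself, not of the spiking implementation; dictionary and causes are synthetic assumptions):

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.normal(size=(20, 5))                   # observation pattern per candidate cause
x_true = np.array([0.0, 2.0, 0.0, 1.0, 0.0])   # sparse, non-negative causes
b = A @ x_true                                 # observation to explain

# projected gradient descent on ||A x - b||^2 subject to x >= 0;
# the non-negativity constraint is what produces "explaining away":
# once some causes account for b, the rest are pushed to exactly zero
x = np.zeros(5)
step = 1.0 / np.linalg.norm(A.T @ A, 2)        # 1/L, L = largest eigenvalue
for _ in range(2000):
    x = np.maximum(0.0, x - step * (A.T @ (A @ x - b)))
```

In the paper's framing, the spiking network reaches the same kind of non-negative solution, with tuned inhibition implementing the suppression of redundant causes that the projection onto x ≥ 0 performs here.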

  11. Predictive coding of dynamical variables in balanced spiking networks.

    Science.gov (United States)

    Boerlin, Martin; Machens, Christian K; Denève, Sophie

    2013-01-01

Two observations about the cortex have puzzled neuroscientists for a long time. First, neural responses are highly variable. Second, the level of excitation and inhibition received by each neuron is tightly balanced at all times. Here, we demonstrate that both properties are necessary consequences of neural networks that represent information efficiently in their spikes. We illustrate this insight with spiking networks that represent dynamical variables. Our approach is based on two assumptions: We assume that information about dynamical variables can be read out linearly from neural spike trains, and we assume that neurons only fire a spike if that improves the representation of the dynamical variables. Based on these assumptions, we derive a network of leaky integrate-and-fire neurons that is able to implement arbitrary linear dynamical systems. We show that the membrane voltage of the neurons is equivalent to a prediction error about a common population-level signal. Among other things, our approach allows us to construct an integrator network of spiking neurons that is robust against many perturbations. Most importantly, neural variability in our networks cannot be equated to noise. Despite exhibiting the same single unit properties as widely used population code models (e.g., tuning curves, Poisson-distributed spike trains), balanced networks are orders of magnitude more reliable. Our approach suggests that spikes do matter when considering how the brain computes, and that the reliability of cortical representations could have been strongly underestimated.
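The two assumptions, a linear readout of the spike trains and a spike fired only when it improves the representation, can be illustrated with a stripped-down greedy coder (a caricature of the framework: a 1-D signal, no recurrent connectivity or membrane dynamics; weights, time constants, and the signal are assumptions):

```python
import numpy as np

# N neurons track a 1-D signal x(t). Each spike adds the neuron's decoding
# weight to a leaky readout xhat. A neuron "fires" only when the spike would
# reduce the squared readout error, i.e. when its prediction error is large.
N, T, dt, tau = 10, 2000, 0.001, 0.1
w = np.linspace(-1, 1, N) * 0.1              # decoding weights
t_ax = np.arange(T) * dt
x = np.sin(2 * np.pi * t_ax)                 # signal to represent
xhat = np.zeros(T)
spikes = np.zeros((T, N))
for t in range(1, T):
    xhat[t] = xhat[t - 1] * (1 - dt / tau)   # leaky readout decay
    err = x[t] - xhat[t]
    for _ in range(50):                      # fire greedily while any spike helps
        gain = err * w - 0.5 * w ** 2        # drop in 0.5*err^2 after one spike
        i = int(np.argmax(gain))
        if gain[i] <= 0:
            break
        xhat[t] += w[i]
        spikes[t, i] += 1
        err = x[t] - xhat[t]
rmse = float(np.sqrt(np.mean((x - xhat) ** 2)))
```

Because any neuron whose weight reduces the error may fire, which spikes occur on a given run is highly degenerate, yet the readout tracks the signal tightly; this is the sense in which trial-to-trial spike variability need not be noise.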

  12. Analysis of connectivity in NeuCube spiking neural network models trained on EEG data for the understanding of functional changes in the brain: A case study on opiate dependence treatment.

    Science.gov (United States)

    Capecci, Elisa; Kasabov, Nikola; Wang, Grace Y

    2015-08-01

The paper presents a methodology for the analysis of functional changes in brain activity across different conditions and different groups of subjects. This analysis is based on the recently proposed NeuCube spiking neural network (SNN) framework and, more specifically, on the analysis of the connectivity of a NeuCube model trained with electroencephalography (EEG) data. The case study data used to illustrate this method are EEG data collected from three groups: subjects with opiate addiction, patients undertaking methadone maintenance treatment, and a non-drug-using healthy control group. The proposed method classifies the EEG data more accurately than traditional statistical and artificial intelligence (AI) methods and can be used to predict response to treatment and dose-related drug effects. More importantly, the method can be used to compare the functional brain activities of different subjects and the changes in these activities as a result of treatment, which is a step towards a better understanding of both the EEG data and the brain processes that generated it. The method can also be used for a wide range of applications, such as a better understanding of disease progression or aging.

  13. Spatiotemporal Dynamics and Reliable Computations in Recurrent Spiking Neural Networks

    Science.gov (United States)

    Pyle, Ryan; Rosenbaum, Robert

    2017-01-01

    Randomly connected networks of excitatory and inhibitory spiking neurons provide a parsimonious model of neural variability, but are notoriously unreliable for performing computations. We show that this difficulty is overcome by incorporating the well-documented dependence of connection probability on distance. Spatially extended spiking networks exhibit symmetry-breaking bifurcations and generate spatiotemporal patterns that can be trained to perform dynamical computations under a reservoir computing framework.

  14. Spatiotemporal Dynamics and Reliable Computations in Recurrent Spiking Neural Networks.

    Science.gov (United States)

    Pyle, Ryan; Rosenbaum, Robert

    2017-01-06

    Randomly connected networks of excitatory and inhibitory spiking neurons provide a parsimonious model of neural variability, but are notoriously unreliable for performing computations. We show that this difficulty is overcome by incorporating the well-documented dependence of connection probability on distance. Spatially extended spiking networks exhibit symmetry-breaking bifurcations and generate spatiotemporal patterns that can be trained to perform dynamical computations under a reservoir computing framework.

  15. Coincidence Detection Using Spiking Neurons with Application to Face Recognition

    Directory of Open Access Journals (Sweden)

    Fadhlan Kamaruzaman

    2015-01-01

    We elucidate the practical implementation of a Spiking Neural Network (SNN) as local ensembles of classifiers. The synaptic time constant τs is used as a learning parameter representing the variations learned from a set of training data at the classifier level. This classifier uses a coincidence detection (CD) strategy trained in a supervised manner using a novel supervised learning method called τs Prediction, which adjusts the precise timing of output spikes towards the desired spike timing through iterative adaptation of τs. This paper also discusses the approximation of spike timing in the Spike Response Model (SRM) for the purpose of coincidence detection, which significantly speeds up the whole process of learning and classification. Performance evaluations with face datasets such as the AR, FERET, JAFFE, and CK+ datasets show that the proposed method delivers better face classification performance than a network trained with Supervised Synaptic-Time Dependent Plasticity (STDP). We also found that the proposed method delivers better classification accuracy than k-nearest neighbor, ensembles of kNN, and Support Vector Machines. Evaluation on several types of spike coding also reveals that latency coding delivers the best result for face classification as well as for classification of other multivariate datasets.
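
To illustrate the role of the synaptic time constant: in a Spike Response Model, each input spike contributes a PSP kernel whose shape depends on τs, so changing τs moves the time at which the summed potential first crosses threshold. A toy sketch with an alpha-function kernel and made-up spike times, weights, and threshold (this is not the paper's τs Prediction rule, only the mechanism it exploits):

```python
import numpy as np

def alpha_psp(t, tau_s):
    """Alpha-function PSP kernel, a common choice in Spike Response Models."""
    t = np.maximum(t, 0.0)                       # causal: zero before the spike
    return (t / tau_s) * np.exp(1.0 - t / tau_s)

def first_output_spike(input_times, weights, tau_s, theta=1.5, dt=0.1, t_max=50.0):
    """Time at which the summed PSPs first reach threshold theta (None if never)."""
    ts = np.arange(0.0, t_max, dt)
    v = np.zeros_like(ts)
    for t_i, w in zip(input_times, weights):
        v += w * alpha_psp(ts - t_i, tau_s)
    above = np.nonzero(v >= theta)[0]
    return float(ts[above[0]]) if above.size else None

inputs = [2.0, 3.0, 4.0]      # hypothetical presynaptic spike times (ms)
weights = [1.0, 1.0, 1.0]
for tau_s in (1.0, 2.0, 4.0):
    print("tau_s =", tau_s, "-> first output spike at", first_output_spike(inputs, weights, tau_s))
```

In this toy setting a larger τs delays the first threshold crossing, which is the handle a τs-based learning rule can use to pull output spike times toward desired ones.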

  16. Coronavirus spike-receptor interactions

    NARCIS (Netherlands)

    Mou, H.

    2015-01-01

    Coronaviruses cause important diseases in humans and animals. Coronavirus infection starts with the virus binding with its spike proteins to molecules present on the surface of host cells that act as receptors. This spike-receptor interaction is highly specific and determines the virus’ cell, tissue

  17. Lateralized odor preference training in rat pups reveals an enhanced network response in anterior piriform cortex to olfactory input that parallels extended memory.

    Science.gov (United States)

    Fontaine, Christine J; Harley, Carolyn W; Yuan, Qi

    2013-09-18

    The present study examines synaptic plasticity in the anterior piriform cortex (aPC) using ex vivo slices from rat pups given lateralized odor preference training. In the early odor preference learning model, a brief 10 min training session yields 24 h memory, while four daily sessions yield 48 h memory. Odor preference memory can be lateralized through naris occlusion as the anterior commissure is not yet functional. AMPA receptor-mediated postsynaptic responses in the aPC to lateral olfactory tract input, shown to be enhanced at 24 h, are no longer enhanced 48 h after a single training session. Following four spaced lateralized trials, the AMPA receptor-mediated fEPSP is enhanced in the trained aPC at 48 h. Calcium imaging of aPC pyramidal cells within 48 h revealed decreased firing thresholds in the pyramidal cell network. Thus multiday odor preference training induced increased odor input responsiveness in previously weakly activated aPC cells. These results support the hypothesis that increased synaptic strength in olfactory input networks mediates odor preference memory. The increase in aPC network activation parallels behavioral memory.

  18. Bursts generate a non-reducible spike-pattern code

    Directory of Open Access Journals (Sweden)

    Hugo G Eyherabide

    2009-05-01

    Full Text Available On the single-neuron level, precisely timed spikes can either constitute firing-rate codes or spike-pattern codes that utilize the relative timing between consecutive spikes. There has been little experimental support for the hypothesis that such temporal patterns contribute substantially to information transmission. Using grasshopper auditory receptors as a model system, we show that correlations between spikes can be used to represent behaviorally relevant stimuli. The correlations reflect the inner structure of the spike train: a succession of burst-like patterns. We demonstrate that bursts with different spike counts encode different stimulus features, such that about 20% of the transmitted information corresponds to discriminating between different features, and the remaining 80% is used to allocate these features in time. In this spike-pattern code, the "what" and the "when" of the stimuli are encoded in the duration of each burst and the time of burst onset, respectively. Given the ubiquity of burst firing, we expect similar findings also for other neural systems.

  19. A Simple Deep Learning Method for Neuronal Spike Sorting

    Science.gov (United States)

    Yang, Kai; Wu, Haifeng; Zeng, Yu

    2017-10-01

    Spike sorting is one of the key techniques for understanding brain activity. With the development of modern electrophysiology, recent multi-electrode technologies have been able to record the activity of thousands of neurons simultaneously, which increases the computational complexity of conventional sorting algorithms. In this paper, we focus on how to reduce this complexity, and introduce a deep learning algorithm, the principal component analysis network (PCANet), for spike sorting. The introduced method starts from a conventional model and establishes a Toeplitz matrix. From the column vectors of the matrix, we train a PCANet that extracts eigenvector features of the spikes. Finally, a support vector machine (SVM) is used to sort the spikes. In experiments, we choose two groups of simulated data from publicly available databases and compare the introduced method with conventional methods. The results indicate that the introduced method indeed has lower complexity with the same sorting errors as the conventional methods.
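
The pipeline described, PCA-style feature extraction followed by a classifier, can be sketched in a few lines. The snippet below substitutes plain PCA on synthetic waveforms and a nearest-centroid decision for the paper's PCANet and SVM stages, so it only conveys the overall structure, not the actual method:

```python
import numpy as np

rng = np.random.default_rng(1)

# Two synthetic spike shapes (stand-ins for two units seen on one electrode).
t = np.linspace(0, 1, 32)
shape_a = np.exp(-((t - 0.3) ** 2) / 0.005)
shape_b = np.exp(-((t - 0.5) ** 2) / 0.02)
labels = rng.integers(0, 2, 300)
spikes = np.where(labels[:, None] == 0, shape_a, shape_b)
spikes += 0.05 * rng.normal(size=spikes.shape)      # recording noise

# PCA feature extraction: project each waveform onto the top eigenvectors
# of the spike covariance (the kind of features a PCANet stage produces).
X = spikes - spikes.mean(axis=0)
_, eigvecs = np.linalg.eigh(X.T @ X)
features = X @ eigvecs[:, -2:]                      # top-2 components

# Nearest-centroid sorting (the paper uses an SVM; this is a simple stand-in).
c0 = features[labels == 0].mean(axis=0)
c1 = features[labels == 1].mean(axis=0)
pred = (np.linalg.norm(features - c1, axis=1)
        < np.linalg.norm(features - c0, axis=1)).astype(int)
print("sorting accuracy:", (pred == labels).mean())
```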

  20. Capturing spike variability in noisy Izhikevich neurons using point process generalized linear models

    DEFF Research Database (Denmark)

    Østergaard, Jacob; Kramer, Mark A.; Eden, Uri T.

    2018-01-01

    … are separately applied; understanding the relationships between these modeling approaches remains an area of active research. In this letter, we examine this relationship using simulation. To do so, we first generate spike train data from a well-known dynamical model, the Izhikevich neuron, with a noisy input current. We then fit these spike train data with a statistical model (a generalized linear model, GLM, with multiplicative influences of past spiking). For different levels of noise, we show how the GLM captures both the deterministic features of the Izhikevich neuron and the variability driven by the noise. We conclude that the GLM captures essential features of the simulated spike trains, but for near-deterministic spike trains, goodness-of-fit analyses reveal that the model does not fit very well in a statistical sense; the essential random part of the GLM is not captured.
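
The modeling step described, fitting a GLM with multiplicative influences of past spiking, amounts to logistic regression of each bin's spike indicator on the recent spike history. A self-contained sketch on synthetic data with refractoriness (a crude stand-in for the Izhikevich neuron; all parameters are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulate a binned spike train with refractoriness: the baseline spike
# probability is suppressed right after each spike.
T, base_p, history = 5000, 0.1, 3
y = np.zeros(T)
for t in range(T):
    p = base_p * (0.05 if y[max(0, t - history):t].any() else 1.0)
    y[t] = rng.random() < p

# Design matrix: intercept plus the last `history` bins of past spiking.
X = np.ones((T, history + 1))
for lag in range(1, history + 1):
    X[lag:, lag] = y[:-lag]

# Bernoulli GLM with logit link, fit by plain gradient ascent on the
# log-likelihood (stats packages would do this with IRLS).
w = np.zeros(history + 1)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w += 0.1 * X.T @ (y - p) / T

print("history weights:", w[1:])   # negative => multiplicative suppression
```

The negative history weights recovered here are the GLM's way of expressing refractoriness as a multiplicative effect on the firing rate.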

  1. Decreased hippocampal homoarginine and increased nitric oxide and nitric oxide synthase levels in rats parallel training in a radial arm maze.

    Science.gov (United States)

    Sase, Ajinkya; Nawaratna, Gayan; Hu, Shengdi; Wu, Guoyao; Lubec, Gert

    2016-09-01

    L-homoarginine (hArg) is derived from enzymatic guanidination of lysine. It has been demonstrated that hArg is a substrate for nitric oxide (NO) synthesis, blocks lysine transport, inhibits the uptake of arginine into synaptosomes, and modulates GABA responses ex vivo. As there is limited information on its physiological roles in the brain, the aim of the study was to show whether hippocampal or frontal lobe (FL) hArg parallels training in the radial arm maze (RAM) or NO formation. Hippocampi and FL were taken from male Sprague-Dawley rats trained or yoked in a RAM. hArg and its metabolites, NO, and NO synthase (NOS) were then determined by standard methods. The animals learned the task in the RAM, showing a significant reduction of working memory errors. hArg showed decreased levels in both brain regions of trained animals as compared to yoked animals. Nitrate plus nitrite (NOx) concentrations and NOS activity were significantly increased in hippocampi, F(1,36) = 170.5; P ≤ 0.0001, and FL, F(1,36) = 74.67; P ≤ 0.0001, of trained animals as compared to yoked animals. Levels of hArg were negatively correlated with NOx in the hippocampus (r = -0.6355; P = 0.0483) but not in the FL, and with lysine in the FL (r = -0.6650; P = 0.0358). NOx levels were positively correlated with NOS in both the hippocampus (r = 0.7474; P = 0.0129) and FL (r = 0.9563; P ≤ 0.0001). These novel findings indicate that hArg is linked to NO formation in the hippocampus but not in the FL and parallels spatial memory in the RAM.

  2. Spike Pattern Structure Influences Synaptic Efficacy Variability Under STDP and Synaptic Homeostasis. I: Spike Generating Models on Converging Motifs

    Directory of Open Access Journals (Sweden)

    Zedong eBi

    2016-02-01

    In neural systems, synaptic plasticity is usually driven by spike trains. Due to the inherent noise of neurons and synapses as well as the randomness of connection details, spike trains typically exhibit variability such as spatial randomness and temporal stochasticity, resulting in variability of synaptic changes under plasticity, which we call efficacy variability. How the variability of spike trains influences the efficacy variability of synapses remains unclear. In this paper, we try to understand this influence under pair-wise additive spike-timing-dependent plasticity (STDP) when the mean strength of plastic synapses into a neuron is bounded (synaptic homeostasis). Specifically, we systematically study, analytically and numerically, how four aspects of statistical features, i.e., synchronous firing, burstiness/regularity, heterogeneity of rates and heterogeneity of cross-correlations, as well as their interactions, influence the efficacy variability in converging motifs (simple networks in which one neuron receives input from many other neurons). Neurons (including the post-synaptic neuron) in a converging motif generate spikes according to statistical models with tunable parameters. In this way, we can explicitly control the statistics of the spike patterns and investigate their influence on the efficacy variability, without worrying about the feedback from synaptic changes onto the dynamics of the post-synaptic neuron. We separate efficacy variability into two parts: the drift part (DriftV) induced by the heterogeneity of change rates of different synapses, and the diffusion part (DiffV) induced by weight diffusion caused by the stochasticity of spike trains. Our main findings are: (1) synchronous firing and burstiness tend to increase DiffV, (2) heterogeneity of rates induces DriftV when potentiation and depression in STDP are not balanced, and (3) heterogeneity of cross-correlations induces DriftV together with heterogeneity of rates. We anticipate our …

  3. A Fully Automated Approach to Spike Sorting.

    Science.gov (United States)

    Chung, Jason E; Magland, Jeremy F; Barnett, Alex H; Tolosa, Vanessa M; Tooker, Angela C; Lee, Kye Y; Shah, Kedar G; Felix, Sarah H; Frank, Loren M; Greengard, Leslie F

    2017-09-13

    Understanding the detailed dynamics of neuronal networks will require the simultaneous measurement of spike trains from hundreds of neurons (or more). Currently, approaches to extracting spike times and labels from raw data are time consuming, lack standardization, and involve manual intervention, making it difficult to maintain data provenance and assess the quality of scientific results. Here, we describe an automated clustering approach and associated software package that addresses these problems and provides novel cluster quality metrics. We show that our approach has accuracy comparable to or exceeding that achieved using manual or semi-manual techniques with desktop central processing unit (CPU) runtimes faster than acquisition time for up to hundreds of electrodes. Moreover, a single choice of parameters in the algorithm is effective for a variety of electrode geometries and across multiple brain regions. This algorithm has the potential to enable reproducible and automated spike sorting of larger scale recordings than is currently possible. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. SPAN: spike pattern association neuron for learning spatio-temporal sequences

    OpenAIRE

    Mohemmed, A; Schliebs, S; Matsuda, S; Kasabov, N

    2012-01-01

    Spiking Neural Networks (SNN) were shown to be suitable tools for the processing of spatio-temporal information. However, due to their inherent complexity, the formulation of efficient supervised learning algorithms for SNN is difficult and remains an important problem in the research area. This article presents SPAN — a spiking neuron that is able to learn associations of arbitrary spike trains in a supervised fashion allowing the processing of spatio-temporal information encoded in the prec...

  5. Wavelet analysis of epileptic spikes

    Science.gov (United States)

    Latka, Miroslaw; Was, Ziemowit; Kozik, Andrzej; West, Bruce J.

    2003-05-01

    Interictal spikes and sharp waves in human EEG are characteristic signatures of epilepsy. These potentials originate as a result of synchronous pathological discharge of many neurons. The reliable detection of such potentials has been a long-standing problem in EEG analysis, especially after long-term monitoring became common in the investigation of epileptic patients. The traditional definition of a spike is based on its amplitude, duration, sharpness, and emergence from its background. However, spike detection systems built solely around this definition are not reliable due to the presence of numerous transients and artifacts. We use the wavelet transform to analyze the properties of EEG manifestations of epilepsy. We demonstrate that the behavior of the wavelet transform of epileptic spikes across scales can constitute the foundation of a relatively simple yet effective detection algorithm.
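
The idea of examining wavelet-transform behavior across scales can be sketched directly: an epileptiform spike produces large coefficients at every scale, while noise and background transients rarely do. A toy illustration with a Ricker (Mexican-hat) wavelet on synthetic data (the detection rule below, taking the minimum magnitude across scales, is a simplification for illustration, not the authors' algorithm):

```python
import numpy as np

def mexican_hat(scale, length=None):
    """Mexican-hat (Ricker) wavelet sampled at unit spacing."""
    length = length or int(10 * scale)
    t = np.arange(length) - length / 2
    x = t / scale
    return (1 - x**2) * np.exp(-x**2 / 2)

rng = np.random.default_rng(3)
n = 1000
eeg = rng.normal(0, 1, n)                     # background activity
spike_pos = 400
eeg[spike_pos - 2:spike_pos + 3] += np.array([2, 6, 10, 6, 2])  # sharp transient

# Wavelet coefficients at several scales: an epileptiform spike stays
# prominent across scales, whereas noise does not.
scales = [2, 4, 8]
coeffs = np.array([np.convolve(eeg, mexican_hat(s), mode="same") for s in scales])
score = np.abs(coeffs).min(axis=0)            # large only if large at ALL scales
print("detected index:", int(np.argmax(score)))
```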

  6. Wavelet analysis of epileptic spikes

    CERN Document Server

    Latka, M; Kozik, A; West, B J; Latka, Miroslaw; Was, Ziemowit; Kozik, Andrzej; West, Bruce J.

    2003-01-01

    Interictal spikes and sharp waves in human EEG are characteristic signatures of epilepsy. These potentials originate as a result of synchronous, pathological discharge of many neurons. The reliable detection of such potentials has been a long-standing problem in EEG analysis, especially after long-term monitoring became common in the investigation of epileptic patients. The traditional definition of a spike is based on its amplitude, duration, sharpness, and emergence from its background. However, spike detection systems built solely around this definition are not reliable due to the presence of numerous transients and artifacts. We use the wavelet transform to analyze the properties of EEG manifestations of epilepsy. We demonstrate that the behavior of the wavelet transform of epileptic spikes across scales can constitute the foundation of a relatively simple yet effective detection algorithm.

  7. Comparison of Classifier Architectures for Online Neural Spike Sorting.

    Science.gov (United States)

    Saeed, Maryam; Khan, Amir Ali; Kamboh, Awais Mehmood

    2017-04-01

    High-density, intracranial recordings from micro-electrode arrays need to undergo spike sorting in order to associate the recorded neuronal spikes with particular neurons. This involves spike detection, feature extraction, and classification. To reduce data transmission and power requirements, on-chip real-time processing is becoming very popular. However, high computational resources are required for classifiers in on-chip spike sorters, making scalability a great challenge. In this review paper, we analyze several popular classifiers to propose five new hardware architectures using the off-chip training with on-chip classification approach. These include support vector classification, fuzzy C-means classification, self-organizing maps classification, moving-centroid K-means classification, and cosine distance classification. The performance of these architectures is analyzed in terms of accuracy and resource requirements. We establish that the neural-network-based Self-Organizing Maps classifier offers the most viable solution. A spike sorter based on the Self-Organizing Maps classifier requires only 7.83% of the computational resources of the best-reported spike sorter, hierarchical adaptive means, while offering 3% better accuracy at 7 dB SNR.
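
The off-chip-training / on-chip-classification split means the implant only evaluates a cheap distance to precomputed class templates for each incoming spike. A sketch of the cosine-distance variant (templates and features below are made up; real templates would be learned off-chip from sorted spike features):

```python
import numpy as np

# Hypothetical per-unit templates, computed off-chip from labeled spike features.
templates = {0: np.array([1.0, 0.2, 0.0]), 1: np.array([0.1, 1.0, 0.4])}

def classify_cosine(feature):
    """On-chip step: nearest template by cosine distance (1 - cosine similarity)."""
    best, best_d = None, np.inf
    for label, tmpl in templates.items():
        d = 1.0 - feature @ tmpl / (np.linalg.norm(feature) * np.linalg.norm(tmpl))
        if d < best_d:
            best, best_d = label, d
    return best

print(classify_cosine(np.array([0.9, 0.3, 0.1])))   # feature near template 0
```

Only dot products and norms are needed per spike, which is why distance-based classifiers map so cheaply onto hardware.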

  8. Multiplexed Spike Coding and Adaptation in the Thalamus

    Directory of Open Access Journals (Sweden)

    Rebecca A. Mease

    2017-05-01

    High-frequency “burst” clusters of spikes are a generic output pattern of many neurons. While bursting is a ubiquitous computational feature of different nervous systems across animal species, the encoding of synaptic inputs by bursts is not well understood. We find that bursting neurons in the rodent thalamus employ “multiplexing” to differentially encode low- and high-frequency stimulus features associated with either T-type calcium “low-threshold” or fast sodium spiking events, respectively, and these events adapt differently. Thus, thalamic bursts encode disparate information in three channels: (1) burst size, (2) burst onset time, and (3) precise spike timing within bursts. Strikingly, this latter “intraburst” encoding channel shows millisecond-level feature selectivity and adapts across statistical contexts to maintain stable information encoded per spike. Consequently, calcium events both encode low-frequency stimuli and, in parallel, gate a transient window for high-frequency, adaptive stimulus encoding by sodium spike timing, allowing bursts to efficiently convey fine-scale temporal information.

  9. Real-time computing platform for spiking neurons (RT-spike).

    Science.gov (United States)

    Ros, Eduardo; Ortigosa, Eva M; Agís, Rodrigo; Carrillo, Richard; Arnold, Michael

    2006-07-01

    A computing platform is described for simulating arbitrary networks of spiking neurons in real time. A hybrid computing scheme is adopted that uses both software and hardware components to manage the tradeoff between flexibility and computational power; the neuron model is implemented in hardware and the network model and the learning are implemented in software. The incremental transition of the software components into hardware is supported. We focus on a spike response model (SRM) for a neuron where the synapses are modeled as input-driven conductances. The temporal dynamics of the synaptic integration process are modeled with a synaptic time constant that results in a gradual injection of charge. This type of model is computationally expensive and is not easily amenable to existing software-based event-driven approaches. As an alternative we have designed an efficient time-based computing architecture in hardware, where the different stages of the neuron model are processed in parallel. Further improvements occur by computing multiple neurons in parallel using multiple processing units. This design is tested using reconfigurable hardware and its scalability and performance evaluated. Our overall goal is to investigate biologically realistic models for the real-time control of robots operating within closed action-perception loops, and so we evaluate the performance of the system on simulating a model of the cerebellum where the emulation of the temporal dynamics of the synaptic integration process is important.
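
The per-neuron, per-step computation being parallelized can be sketched as follows: each input spike increments a synaptic conductance, which then decays exponentially and injects charge gradually through the driving force. A minimal time-driven sketch (illustrative parameters; output spike generation and reset are omitted):

```python
# One conductance-based synapse driving a leaky membrane, updated time-step by
# time-step -- the kind of loop the hardware pipelines and replicates per neuron.
dt, tau_syn, tau_m = 0.1, 2.0, 10.0        # ms
E_syn, v_rest = 0.0, -70.0                 # mV (excitatory reversal, rest)
g, v = 0.0, v_rest
input_steps = {50, 60, 70, 80}             # presynaptic spikes at 5, 6, 7, 8 ms

trace = []
for step in range(400):
    if step in input_steps:
        g += 0.3                           # input-driven conductance jump
    g -= dt * g / tau_syn                  # conductance decays exponentially
    # gradual charge injection: current scales with the driving force (E_syn - v)
    v += dt * ((v_rest - v) + g * (E_syn - v)) / tau_m
    trace.append(v)

print("peak membrane potential (mV):", max(trace))
```

Because the update depends only on local state, many such loops can run in parallel processing units, which is the scalability argument made above.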

  10. An online supervised learning method based on gradient descent for spiking neurons.

    Science.gov (United States)

    Xu, Yan; Yang, Jing; Zhong, Shuiming

    2017-09-01

    The purpose of supervised learning with temporal encoding for spiking neurons is to make the neurons emit a specific spike train encoded by the precise firing times of its spikes. Gradient-descent-based (GDB) learning methods are widely used and verified in current research. Although the existing GDB multi-spike learning (or spike sequence learning) methods have good performance, they work in an offline manner and still have some limitations. This paper proposes an online GDB spike sequence learning method for spiking neurons that is based on the online adjustment mechanism of real biological neuron synapses. The method constructs an error function and calculates the adjustment of synaptic weights as soon as the neuron emits a spike during its running process. We analyze and synthesize desired and actual output spikes to select appropriate input spikes for the calculation of the weight adjustment. The experimental results show that our method clearly improves learning performance compared with the offline learning manner and has a certain advantage in learning accuracy compared with other learning methods. This stronger learning ability means that the method has a large pattern storage capacity. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Spiking the expectancy profiles

    DEFF Research Database (Denmark)

    Hansen, Niels Chr.; Loui, Psyche; Vuust, Peter

    Melodic expectations have long been quantified using expectedness ratings. Motivated by statistical learning and sharper key profiles in musicians, we model musical learning as a process of reducing the relative entropy between listeners' prior expectancy profiles and probability distributions of a given musical style or of stimuli used in short-term experiments. Five previous probe-tone experiments with musicians and non-musicians are revisited. Exp. 1-2 used jazz, classical and hymn melodies. Exp. 3-5 collected ratings before and after exposure to 5, 15 or 400 novel melodies generated from a finite-state grammar using the Bohlen-Pierce scale. We find group differences in entropy corresponding to degree and relevance of musical training and within-participant decreases after short-term exposure. Thus, whereas inexperienced listeners make high-entropy predictions by default, statistical…
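
The learning measure described, the relative entropy between a listener's expectancy profile and a style's probability distribution, is just a KL divergence between two discrete distributions. A toy sketch with made-up tone probabilities (the direction of the divergence used here is one of several reasonable conventions):

```python
import numpy as np

def relative_entropy(p, q):
    """D_KL(p || q) in bits, for discrete distributions on the same support."""
    p, q = np.asarray(p, float), np.asarray(q, float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log2(p[mask] / q[mask])))

style = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # hypothetical tone probabilities
novice = np.full(5, 0.20)                          # flat, high-entropy expectations
expert = np.array([0.28, 0.26, 0.19, 0.16, 0.11])  # profile closer to the style

print("novice mismatch:", relative_entropy(style, novice))
print("expert mismatch:", relative_entropy(style, expert))
```

Learning, on this account, is the novice profile drifting toward the style distribution, driving the divergence toward zero.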

  12. Event-Driven Contrastive Divergence for Spiking Neuromorphic Systems

    Directory of Open Access Journals (Sweden)

    Emre eNeftci

    2014-01-01

    Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissipation and real-time interfacing with the environment. However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetics which do not directly map onto a dynamical neural substrate. Here, we present an event-driven variation of CD to train an RBM constructed with Integrate & Fire (I&F) neurons, that is constrained by the limitations of existing and near-future neuromorphic hardware platforms. Our strategy is based on neural sampling, which allows us to synthesize a spiking neural network that samples from a target Boltzmann distribution. The reverberating activity of the network replaces the discrete steps of the CD algorithm, while Spike Time Dependent Plasticity (STDP) carries out the weight updates in an online, asynchronous fashion. We demonstrate our approach by training an RBM composed of leaky I&F neurons with STDP synapses to learn a generative model of the MNIST hand-written digit dataset, and by testing it in recognition, generation and cue integration tasks. Our results contribute to a machine learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality.

  13. Event-driven contrastive divergence for spiking neuromorphic systems.

    Science.gov (United States)

    Neftci, Emre; Das, Srinjoy; Pedroni, Bruno; Kreutz-Delgado, Kenneth; Cauwenberghs, Gert

    2013-01-01

    Restricted Boltzmann Machines (RBMs) and Deep Belief Networks have been demonstrated to perform efficiently in a variety of applications, such as dimensionality reduction, feature learning, and classification. Their implementation on neuromorphic hardware platforms emulating large-scale networks of spiking neurons can have significant advantages from the perspectives of scalability, power dissipation and real-time interfacing with the environment. However, the traditional RBM architecture and the commonly used training algorithm known as Contrastive Divergence (CD) are based on discrete updates and exact arithmetics which do not directly map onto a dynamical neural substrate. Here, we present an event-driven variation of CD to train a RBM constructed with Integrate & Fire (I&F) neurons, that is constrained by the limitations of existing and near future neuromorphic hardware platforms. Our strategy is based on neural sampling, which allows us to synthesize a spiking neural network that samples from a target Boltzmann distribution. The recurrent activity of the network replaces the discrete steps of the CD algorithm, while Spike Time Dependent Plasticity (STDP) carries out the weight updates in an online, asynchronous fashion. We demonstrate our approach by training an RBM composed of leaky I&F neurons with STDP synapses to learn a generative model of the MNIST hand-written digit dataset, and by testing it in recognition, generation and cue integration tasks. Our results contribute to a machine learning-driven approach for synthesizing networks of spiking neurons capable of carrying out practical, high-level functionality.
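
The STDP weight updates referred to can be sketched with the standard pair-wise exponential kernel: potentiation when the presynaptic spike precedes the postsynaptic one, depression otherwise. The parameters below are illustrative, not those used in the paper:

```python
import numpy as np

# Pair-wise STDP kernel: the sign and size of the weight change depend on the
# pre/post spike time difference; values chosen for illustration only.
A_plus, A_minus, tau = 0.01, 0.012, 20.0   # amplitudes, time constant (ms)

def stdp_dw(t_pre, t_post):
    dt = t_post - t_pre
    if dt >= 0:
        return A_plus * np.exp(-dt / tau)      # pre before post: potentiate
    return -A_minus * np.exp(dt / tau)         # post before pre: depress

w = 0.5
pairs = [(10.0, 15.0), (30.0, 28.0), (50.0, 51.0)]   # (pre, post) spike times
for t_pre, t_post in pairs:
    w += stdp_dw(t_pre, t_post)                      # online, per-event update
print("weight after updates:", w)
```

Because each update is triggered by a single spike pair, the rule runs online and asynchronously, which is exactly the property the event-driven CD scheme relies on.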

  14. An information-theoretic approach to motor action decoding with a reconfigurable parallel architecture.

    Science.gov (United States)

    Craciun, Stefan; Brockmeier, Austin J; George, Alan D; Lam, Herman; Príncipe, José C

    2011-01-01

    Methods for decoding movements from neural spike counts using adaptive filters often rely on minimizing the mean-squared error. However, for non-Gaussian distribution of errors, this approach is not optimal for performance. Therefore, rather than using probabilistic modeling, we propose an alternate non-parametric approach. In order to extract more structure from the input signal (neuronal spike counts) we propose using minimum error entropy (MEE), an information-theoretic approach that minimizes the error entropy as part of an iterative cost function. However, the disadvantage of using MEE as the cost function for adaptive filters is the increase in computational complexity. In this paper we present a comparison between the decoding performance of the analytic Wiener filter and a linear filter trained with MEE, which is then mapped to a parallel architecture in reconfigurable hardware tailored to the computational needs of the MEE filter. We observe considerable speedup from the hardware design. The adaptation of filter weights for the multiple-input, multiple-output linear filters, necessary in motor decoding, is a highly parallelizable algorithm. It can be decomposed into many independent computational blocks with a parallel architecture readily mapped to a field-programmable gate array (FPGA) and scales to large numbers of neurons. By pipelining and parallelizing independent computations in the algorithm, the proposed parallel architecture has sublinear increases in execution time with respect to both window size and filter order.
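
The MEE cost mentioned above replaces mean-squared error with Renyi's quadratic entropy of the errors, estimated with a Parzen window; training ascends the corresponding "information potential" computed over all pairs of errors, and it is this O(N²) pairwise structure that the FPGA design decomposes into independent blocks. A small numpy sketch on a toy decoding problem (kernel width, step size, and data are all illustrative):

```python
import numpy as np

rng = np.random.default_rng(4)

def info_potential(e, sigma=2.0):
    """Parzen estimate of the quadratic information potential of the errors.
    MEE maximizes this, which minimizes Renyi's quadratic error entropy."""
    d = e[:, None] - e[None, :]
    return float(np.mean(np.exp(-d**2 / (4 * sigma**2))))

# Toy decoding problem: a linear filter from binned spike counts to a movement
# signal, with heavy-tailed (non-Gaussian) noise where MSE is not optimal.
n, dim, sigma = 200, 5, 2.0
X = rng.poisson(3.0, (n, dim)).astype(float)   # neuronal spike counts
w_true = rng.normal(0, 1, dim)
y = X @ w_true + rng.standard_t(3, n)          # heavy-tailed noise

w = np.zeros(dim)
for _ in range(500):
    e = y - X @ w
    d = e[:, None] - e[None, :]
    k = np.exp(-d**2 / (4 * sigma**2)) * d     # kernel-weighted error pairs
    grad = (k[:, :, None] * (X[:, None, :] - X[None, :, :])).mean(axis=(0, 1))
    w += 0.5 * grad / (2 * sigma**2)           # ascend the information potential

print("information potential rose to:", info_potential(y - X @ w))
```

Every iteration touches every pair of samples, so the per-pair terms are independent and can be evaluated in parallel, which is the property the reconfigurable-hardware mapping exploits.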

  15. Adrenalectomy eliminates the extinction spike in autoshaping with rats.

    Science.gov (United States)

    Thomas, B L; Papini, M R

    2001-03-01

    Experiment 1, using rats, investigated the effect of adrenalectomy (ADX) on the invigoration of lever-contact performance that occurs in the autoshaping situation after a shift from acquisition to extinction (called the extinction spike). Groups of rats with ADX or sham operations were trained under spaced and massed conditions [average intertrial intervals (ITI) of either 15 or 90 s] for 10 sessions and then shifted to extinction. ADX did not affect acquisition training but it eliminated the extinction spike. Plasma corticosterone levels during acquisition were shown in Experiment 2 to be similar in rats trained under spaced or massed conditions. Adrenal participation in the emotional arousal induced by conditions of surprising nonreward (e.g., extinction) is discussed.

  16. Spike Pattern Recognition for Automatic Collimation Alignment

    CERN Document Server

    Azzopardi, Gabriella; Salvachua Ferrando, Belen Maria; Mereghetti, Alessio; Redaelli, Stefano; CERN. Geneva. ATS Department

    2017-01-01

    The LHC makes use of a collimation system to protect its sensitive equipment by intercepting potentially dangerous beam halo particles. The collimator settings that protect the machine against beam losses rely on a very precise alignment of all the collimators with respect to the beam. The beam center at each collimator is found by touching the beam halo using an alignment procedure. Until now, in order to determine whether a collimator is aligned with the beam or not, a user was required to follow the collimator's BLM loss data and detect spikes. A machine learning (ML) model was trained to automatically recognize spikes when a collimator is aligned. The model was loosely integrated with the alignment implementation to determine its classification performance and reliability, without affecting the alignment process itself. The model was tested on a number of collimators during this MD and was able to output the classifications in real time.

  17. Computing with Spiking Neuron Networks

    NARCIS (Netherlands)

    H. Paugam-Moisy; S.M. Bohte (Sander); G. Rozenberg; T.H.W. Baeck (Thomas); J.N. Kok (Joost)

    2012-01-01

    Spiking Neuron Networks (SNNs) are often referred to as the 3rd generation of neural networks. Highly inspired by natural computing in the brain and recent advances in neuroscience, they derive their strength and interest from an accurate modeling of synaptic interactions

  18. Learning Universal Computations with Spikes

    Science.gov (United States)

    Thalmeier, Dominik; Uhlmann, Marvin; Kappen, Hilbert J.; Memmesheimer, Raoul-Martin

    2016-01-01

    Providing the neurobiological basis of information processing in higher animals, spiking neural networks must be able to learn a variety of complicated computations, including the generation of appropriate, possibly delayed reactions to inputs and the self-sustained generation of complex activity patterns, e.g. for locomotion. Many such computations require previous building of intrinsic world models. Here we show how spiking neural networks may solve these different tasks. Firstly, we derive constraints under which classes of spiking neural networks lend themselves to substrates of powerful general purpose computing. The networks contain dendritic or synaptic nonlinearities and have a constrained connectivity. We then combine such networks with learning rules for outputs or recurrent connections. We show that this allows learning of even difficult benchmark tasks such as the self-sustained generation of desired low-dimensional chaotic dynamics or memory-dependent computations. Furthermore, we show how spiking networks can build models of external world systems and use the acquired knowledge to control them. PMID:27309381

  19. Parallel rendering

    Science.gov (United States)

    Crockett, Thomas W.

    1995-01-01

    This article provides a broad introduction to the subject of parallel rendering, encompassing both hardware and software systems. The focus is on the underlying concepts and the issues which arise in the design of parallel rendering algorithms and systems. We examine the different types of parallelism and how they can be applied in rendering applications. Concepts from parallel computing, such as data decomposition, task granularity, scalability, and load balancing, are considered in relation to the rendering problem. We also explore concepts from computer graphics, such as coherence and projection, which have a significant impact on the structure of parallel rendering algorithms. Our survey covers a number of practical considerations as well, including the choice of architectural platform, communication and memory requirements, and the problem of image assembly and display. We illustrate the discussion with numerous examples from the parallel rendering literature, representing most of the principal rendering methods currently used in computer graphics.

  20. Time Resolution Dependence of Information Measures for Spiking Neurons: Scaling and Universality

    Directory of Open Access Journals (Sweden)

    James P Crutchfield

    2015-08-01

    The mutual information between stimulus and spike-train response is commonly used to monitor neural coding efficiency, but neuronal computation broadly conceived requires more refined and targeted information measures of input-output joint processes. A first step towards that larger goal is to develop information measures for individual output processes, including information generation (entropy rate), stored information (statistical complexity), predictable information (excess entropy), and active information accumulation (bound information rate). We calculate these for spike trains generated by a variety of noise-driven integrate-and-fire neurons as a function of time resolution and for alternating renewal processes. We show that their time-resolution dependence reveals coarse-grained structural properties of interspike interval statistics; e.g., $\tau$-entropy rates that diverge less quickly than the firing rate indicate interspike interval correlations. We also find evidence that the excess entropy and regularized statistical complexity of different types of integrate-and-fire neurons are universal in the continuous-time limit in the sense that they do not depend on mechanism details. This suggests a surprising simplicity in the spike trains generated by these model neurons. Interestingly, neurons with gamma-distributed ISIs and neurons whose spike trains are alternating renewal processes do not fall into the same universality class. These results lead to two conclusions. First, the dependence of information measures on time resolution reveals mechanistic details about spike train generation. Second, information measures can be used as model selection tools for analyzing spike train processes.
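    The time-resolution dependence described above can be illustrated with a minimal plug-in estimate: discretize a spike train at resolution tau, binarize each bin (spike / no spike), and divide the per-bin Shannon entropy by tau. This is not the paper's estimator; the helper name tau_entropy_rate and all parameter values below are illustrative.

```python
import numpy as np

def tau_entropy_rate(spike_times, t_total, tau):
    # Discretize at resolution tau, binarize each bin (spike / no spike),
    # and return the per-bin Shannon entropy divided by tau (bits per second).
    n_bins = int(t_total / tau)
    counts, _ = np.histogram(spike_times, bins=n_bins, range=(0.0, n_bins * tau))
    p = (counts > 0).mean()                        # probability of an occupied bin
    if p in (0.0, 1.0):
        return 0.0
    h = -p * np.log2(p) - (1 - p) * np.log2(1 - p)
    return h / tau

rng = np.random.default_rng(0)
rate, t_total = 20.0, 200.0                        # 20 Hz Poisson train, 200 s
spikes = np.sort(rng.uniform(0.0, t_total, rng.poisson(rate * t_total)))

# The estimate grows as the resolution gets finer, as expected for a
# memoryless Poisson process.
for tau in (0.001, 0.005, 0.02):
    print(tau, round(tau_entropy_rate(spikes, t_total, tau), 1))
```

    For a Poisson train this naive estimate diverges logarithmically as tau shrinks; deviations from that behavior are exactly the kind of structural signature (e.g. interspike interval correlations) the abstract refers to.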

  1. Automatic spike sorting using tuning information.

    Science.gov (United States)

    Ventura, Valérie

    2009-09-01

    Current spike sorting methods focus on clustering neurons' characteristic spike waveforms. The resulting spike-sorted data are typically used to estimate how covariates of interest modulate the firing rates of neurons. However, when these covariates do modulate the firing rates, they provide information about spikes' identities, which thus far have been ignored for the purpose of spike sorting. This letter describes a novel approach to spike sorting, which incorporates both waveform information and tuning information obtained from the modulation of firing rates. Because it efficiently uses all the available information, this spike sorter yields lower spike misclassification rates than traditional automatic spike sorters. This theoretical result is verified empirically on several examples. The proposed method does not require additional assumptions; only its implementation is different. It essentially consists of performing spike sorting and tuning estimation simultaneously rather than sequentially, as is currently done. We used an expectation-maximization maximum likelihood algorithm to implement the new spike sorter. We present the general form of this algorithm and provide a detailed implementable version under the assumptions that neurons are independent and spike according to Poisson processes. Finally, we uncover a systematic flaw of spike sorting based on waveform information only.
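    The gain from tuning information can be conveyed with a toy sketch: two hypothetical units whose waveform features overlap but whose cosine tuning curves differ, classified by a posterior that optionally includes the covariate-dependent rate term. This is a simplified stand-in for the letter's EM algorithm, not its implementation; all names and numbers are invented.

```python
import numpy as np

rng = np.random.default_rng(1)

# Two hypothetical units with overlapping waveform features but distinct
# cosine tuning to a covariate x (e.g. movement direction, in radians).
mu_w, sigma_w = np.array([1.0, 1.4]), 0.5

def rate(unit, x):
    # Firing rate of the unit as a function of the covariate:
    # unit 0 prefers x = 0, unit 1 prefers x = pi.
    return 5.0 + 4.0 * np.cos(x - np.pi * np.asarray(unit))

# Simulate labelled spikes (accept/reject so spikes are more frequent where
# the emitting unit's rate is high), then try to sort them back.
unit = rng.integers(0, 2, 2000)
x = rng.uniform(0.0, 2 * np.pi, 2000)
keep = rng.uniform(0.0, 9.0, 2000) < rate(unit, x)
unit, x = unit[keep], x[keep]
w = rng.normal(mu_w[unit], sigma_w)

def posterior(w, x, use_tuning):
    # Soft assignment: Gaussian waveform likelihood, optionally weighted by
    # the covariate-dependent firing rates (the "tuning information").
    lik = np.exp(-((w[:, None] - mu_w) ** 2) / (2 * sigma_w ** 2))
    if use_tuning:
        lik = lik * np.stack([rate(0, x), rate(1, x)], axis=1)
    return lik / lik.sum(axis=1, keepdims=True)

for use_tuning in (False, True):
    acc = (posterior(w, x, use_tuning).argmax(axis=1) == unit).mean()
    print("tuning" if use_tuning else "waveform only", round(acc, 3))
```

    The tuning-aware posterior misclassifies fewer spikes than the waveform-only one, mirroring the letter's theoretical result on a deliberately ambiguous waveform pair.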

  2. Parallel computations

    CERN Document Server

    1982-01-01

    Parallel Computations focuses on parallel computation, with emphasis on algorithms used in a variety of numerical and physical applications and for many different types of parallel computers. Topics covered range from vectorization of fast Fourier transforms (FFTs) and of the incomplete Cholesky conjugate gradient (ICCG) algorithm on the Cray-1 to calculation of table lookups and piecewise functions. Single tridiagonal linear systems and vectorized computation of reactive flow are also discussed. Comprised of 13 chapters, this volume begins by classifying parallel computers and describing techn

  3. Neural Parallel Engine: A toolbox for massively parallel neural signal processing.

    Science.gov (United States)

    Tam, Wing-Kin; Yang, Zhi

    2018-05-01

    Large-scale neural recordings provide detailed information on neuronal activities and can help elicit the underlying neural mechanisms of the brain. However, the computational burden is also formidable when we try to process the huge data stream generated by such recordings. In this study, we report the development of Neural Parallel Engine (NPE), a toolbox for massively parallel neural signal processing on graphical processing units (GPUs). It offers a selection of the most commonly used routines in neural signal processing such as spike detection and spike sorting, including advanced algorithms such as exponential-component-power-component (EC-PC) spike detection and binary pursuit spike sorting. We also propose a new method for detecting peaks in parallel through a parallel compact operation. Our toolbox is able to offer a 5× to 110× speedup compared with its CPU counterparts depending on the algorithms. A user-friendly MATLAB interface is provided to allow easy integration of the toolbox into existing workflows. Previous efforts on GPU neural signal processing only focus on a few rudimentary algorithms, are not well-optimized and often do not provide a user-friendly programming interface to fit into existing workflows. There is a strong need for a comprehensive toolbox for massively parallel neural signal processing. A new toolbox for massively parallel neural signal processing has been created. It can offer significant speedup in processing signals from large-scale recordings up to thousands of channels. Copyright © 2018 Elsevier B.V. All rights reserved.
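    The parallel peak detection mentioned above can be sketched in a data-parallel style: each sample is tested independently against its neighbourhood (the per-sample test is what would map to one GPU thread), and a final compaction collapses the boolean mask into a dense index list. This NumPy emulation is illustrative only and is not NPE code; detect_peaks and its parameters are assumptions.

```python
import numpy as np

def detect_peaks(signal, threshold, half_window=3):
    # Data-parallel peak test: a sample is a peak if it crosses the threshold
    # and is the maximum of its +/- half_window neighbourhood. Every sample
    # is evaluated independently, so the test vectorizes (or maps to one
    # GPU thread per sample).
    n = len(signal)
    idx = np.arange(n)
    offsets = np.arange(-half_window, half_window + 1)
    # gather each sample's neighbourhood, clamping indices at the edges
    neigh = signal[np.clip(idx[:, None] + offsets, 0, n - 1)]
    is_peak = (signal > threshold) & (signal == neigh.max(axis=1))
    # "compact" step: collapse the boolean mask into a dense index list
    return idx[is_peak]

t = np.linspace(0.0, 1.0, 1000)
sig = np.sin(2 * np.pi * 5 * t)        # five positive lobes in one second
print(detect_peaks(sig, 0.5))
```

    On a GPU the final compaction would be a stream-compact primitive (e.g. a prefix-sum scatter); here NumPy's boolean indexing plays that role.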

  4. Spatiotemporal Spike Coding of Behavioral Adaptation in the Dorsal Anterior Cingulate Cortex.

    Directory of Open Access Journals (Sweden)

    Laureline Logiaco

    2015-08-01

    The frontal cortex controls behavioral adaptation in environments governed by complex rules. Many studies have established the relevance of firing rate modulation after informative events signaling whether and how to update the behavioral policy. However, whether the spatiotemporal features of these neuronal activities contribute to encoding imminent behavioral updates remains unclear. We investigated this issue in the dorsal anterior cingulate cortex (dACC) of monkeys while they adapted their behavior based on their memory of feedback from past choices. We analyzed spike trains of both single units and pairs of simultaneously recorded neurons using an algorithm that emulates different biologically plausible decoding circuits. This method permits the assessment of the performance of both spike-count and spike-timing sensitive decoders. In response to the feedback, single neurons emitted stereotypical spike trains whose temporal structure identified informative events with higher accuracy than mere spike count. The optimal decoding time scale was in the range of 70-200 ms, which is significantly shorter than the memory time scale required by the behavioral task. Importantly, the temporal spiking patterns of single units were predictive of the monkeys' behavioral response time. Furthermore, some features of these spiking patterns often varied between jointly recorded neurons. Altogether, our results suggest that dACC drives behavioral adaptation through complex spatiotemporal spike coding. They also indicate that downstream networks, which decode dACC feedback signals, are unlikely to act as mere neural integrators.

  5. Spatiotemporal Spike Coding of Behavioral Adaptation in the Dorsal Anterior Cingulate Cortex.

    Science.gov (United States)

    Logiaco, Laureline; Quilodran, René; Procyk, Emmanuel; Arleo, Angelo

    2015-08-01

    The frontal cortex controls behavioral adaptation in environments governed by complex rules. Many studies have established the relevance of firing rate modulation after informative events signaling whether and how to update the behavioral policy. However, whether the spatiotemporal features of these neuronal activities contribute to encoding imminent behavioral updates remains unclear. We investigated this issue in the dorsal anterior cingulate cortex (dACC) of monkeys while they adapted their behavior based on their memory of feedback from past choices. We analyzed spike trains of both single units and pairs of simultaneously recorded neurons using an algorithm that emulates different biologically plausible decoding circuits. This method permits the assessment of the performance of both spike-count and spike-timing sensitive decoders. In response to the feedback, single neurons emitted stereotypical spike trains whose temporal structure identified informative events with higher accuracy than mere spike count. The optimal decoding time scale was in the range of 70-200 ms, which is significantly shorter than the memory time scale required by the behavioral task. Importantly, the temporal spiking patterns of single units were predictive of the monkeys' behavioral response time. Furthermore, some features of these spiking patterns often varied between jointly recorded neurons. Altogether, our results suggest that dACC drives behavioral adaptation through complex spatiotemporal spike coding. They also indicate that downstream networks, which decode dACC feedback signals, are unlikely to act as mere neural integrators.

  6. Multichannel interictal spike activity detection using time-frequency entropy measure.

    Science.gov (United States)

    Thanaraj, Palani; Parvathavarthini, B

    2017-06-01

    Localization of interictal spikes is an important clinical step in the pre-surgical assessment of pharmacoresistant epileptic patients. The manual selection of interictal spike periods is cumbersome and involves a considerable analysis workload for the physician. The primary focus of this paper is to automate the detection of interictal spikes for clinical applications in epilepsy localization. The epilepsy localization procedure involves detection of spikes in a multichannel EEG epoch. Therefore, a multichannel Time-Frequency (T-F) entropy measure is proposed to extract features related to the interictal spike activity. A least squares support vector machine is trained on the proposed feature to classify the EEG epochs as either normal or interictal spike periods. The proposed T-F entropy measure, when validated with an epilepsy dataset of 15 patients, shows an interictal spike classification accuracy of 91.20%, a sensitivity of 100% and a specificity of 84.23%. Moreover, an area under the curve of the Receiver Operating Characteristics plot of 0.9339 shows the superior classification performance of the proposed T-F entropy measure. The results of this paper show good spike detection accuracy without any prior information about the spike morphology.
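    A single-channel toy version of a time-frequency entropy feature can clarify the idea: compute a short-time power spectrum, normalize it to a probability distribution over time-frequency cells, and take its Shannon entropy. A brief high-amplitude discharge concentrates power in a few cells and lowers the entropy, whereas diffuse background activity keeps it high. This is a simplified stand-in for the paper's multichannel measure; tf_entropy and all constants below are illustrative.

```python
import numpy as np

def tf_entropy(epoch, nperseg=64):
    # Shannon entropy (bits) of the normalized time-frequency power of one
    # epoch: low when power is concentrated in a few T-F cells (as during a
    # spike discharge), high for diffuse background activity.
    segs = epoch[: len(epoch) // nperseg * nperseg].reshape(-1, nperseg)
    power = np.abs(np.fft.rfft(segs, axis=1)) ** 2   # power per (time, freq) cell
    p = power / power.sum()
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

rng = np.random.default_rng(2)
background = rng.normal(0.0, 1.0, 512)               # diffuse ongoing EEG
spiky = background.copy()
spiky[240:256] += 10.0                               # brief high-amplitude discharge

print(round(tf_entropy(background), 2), round(tf_entropy(spiky), 2))
```

    A classifier such as the LS-SVM in the paper would then be trained on this scalar (one per channel) rather than on thresholded entropy values directly.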

  7. A neuro-inspired spike-based PID motor controller for multi-motor robots with low cost FPGAs.

    Science.gov (United States)

    Jimenez-Fernandez, Angel; Jimenez-Moreno, Gabriel; Linares-Barranco, Alejandro; Dominguez-Morales, Manuel J; Paz-Vicente, Rafael; Civit-Balcells, Anton

    2012-01-01

    In this paper we present a neuro-inspired spike-based closed-loop controller written in VHDL and implemented on FPGAs. This controller is focused on controlling DC motor speed, but uses only spikes for information representation, processing and DC motor driving. It could be applied to other motors with proper driver adaptation. This controller architecture represents one of the latest layers in a Spiking Neural Network (SNN), which implements a bridge between robotic actuators and spike-based processing layers and sensors. The presented control system fuses actuation and sensor information as spike streams, processing these spikes in hard real time and implementing a massively parallel information processing system through specialized spike-based circuits. This spike-based closed-loop controller has been implemented on an AER platform, designed in our labs, that allows direct control of DC motors: the AER-Robot. Experimental results show the viability of the implementation of spike-based controllers, and hardware synthesis indicates low hardware requirements that allow replicating this controller in a high number of parallel controllers working together to achieve real-time robot control.
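    Setting the spike-based signal representation aside, the underlying control loop is a conventional PID acting on the motor speed error. The sketch below shows only that classical loop on a first-order motor model, not the VHDL spike-stream arithmetic of the paper; the PID gains and motor constants are invented for illustration.

```python
# Classical discrete PID loop driving a first-order DC motor model,
# speed' = (K * u - speed) / tau_m. All gains and constants are invented.
class PID:
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral, self.prev_err = 0.0, 0.0

    def step(self, err):
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

dt, tau_m, K = 0.001, 0.05, 2.0
pid, speed, target = PID(1.2, 8.0, 0.002, dt), 0.0, 100.0
for _ in range(2000):                  # 2 s of simulated closed-loop control
    u = pid.step(target - speed)
    speed += dt * (K * u - speed) / tau_m

print(round(speed, 2))                 # settles at the target speed
```

    In the paper's hardware, the error, integral and derivative terms are computed directly on spike rates by dedicated circuits, which is what makes the design massively parallel and hard real-time.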

  8. A Neuro-Inspired Spike-Based PID Motor Controller for Multi-Motor Robots with Low Cost FPGAs

    Directory of Open Access Journals (Sweden)

    Anton Civit-Balcells

    2012-03-01

    In this paper we present a neuro-inspired spike-based closed-loop controller written in VHDL and implemented on FPGAs. This controller is focused on controlling DC motor speed, but uses only spikes for information representation, processing and DC motor driving. It could be applied to other motors with proper driver adaptation. This controller architecture represents one of the latest layers in a Spiking Neural Network (SNN), which implements a bridge between robotic actuators and spike-based processing layers and sensors. The presented control system fuses actuation and sensor information as spike streams, processing these spikes in hard real time and implementing a massively parallel information processing system through specialized spike-based circuits. This spike-based closed-loop controller has been implemented on an AER platform, designed in our labs, that allows direct control of DC motors: the AER-Robot. Experimental results show the viability of the implementation of spike-based controllers, and hardware synthesis indicates low hardware requirements that allow replicating this controller in a high number of parallel controllers working together to achieve real-time robot control.

  9. The chronotron: a neuron that learns to fire temporally precise spike patterns.

    Directory of Open Access Journals (Sweden)

    Răzvan V Florian

    In many cases, neurons process information carried by the precise timings of spikes. Here we show how neurons can learn to generate specific temporally precise output spikes in response to input patterns of spikes having precise timings, thus processing and memorizing information that is entirely temporally coded, both as input and as output. We introduce two new supervised learning rules for spiking neurons with temporal coding of information (chronotrons): one that provides high memory capacity (E-learning), and one that has a higher biological plausibility (I-learning). With I-learning, the neuron learns to fire the target spike trains through synaptic changes that are proportional to the synaptic currents at the timings of real and target output spikes. We study these learning rules in computer simulations where we train integrate-and-fire neurons. Both learning rules allow neurons to fire at the desired timings, with sub-millisecond precision. We show how chronotrons can learn to classify their inputs, by firing identical, temporally precise spike trains for different inputs belonging to the same class. When the input is noisy, the classification also leads to noise reduction. We compute lower bounds for the memory capacity of chronotrons and explore the influence of various parameters on chronotrons' performance. The chronotrons can model neurons that encode information in the time of the first spike relative to the onset of salient stimuli or neurons in oscillatory networks that encode information in the phases of spikes relative to the background oscillation. Our results show that firing one spike per cycle optimizes memory capacity in neurons encoding information in the phase of firing relative to a background rhythm.

  10. Spike and burst coding in thalamocortical relay cells.

    Directory of Open Access Journals (Sweden)

    Fleur Zeldenrust

    2018-02-01

    Mammalian thalamocortical relay (TCR) neurons switch their firing activity between a tonic spiking and a bursting regime. In a combined experimental and computational study, we investigated the features in the input signal that single spikes and bursts in the output spike train represent and how this code is influenced by the membrane voltage state of the neuron. Identical frozen Gaussian noise current traces were injected into TCR neurons in rat brain slices as well as in a validated three-compartment TCR model cell. The resulting membrane voltage traces and spike trains were analyzed by calculating the coherence and impedance. Reverse correlation techniques gave the Event-Triggered Average (ETA) and the Event-Triggered Covariance (ETC). This demonstrated that the feature selectivity started relatively long before the events (up to 300 ms) and showed a clear distinction between spikes (selective for fluctuations) and bursts (selective for integration). The model cell was fine-tuned to mimic the frozen noise initiated spike and burst responses to within experimental accuracy, especially for the mixed mode regimes. The information content carried by the various types of events in the signal as well as by the whole signal was calculated. Bursts phase-lock to and transfer information at lower frequencies than single spikes. On depolarization the neuron transits smoothly from the predominantly bursting regime to a spiking regime, in which it is more sensitive to high-frequency fluctuations. The model was then used to elucidate properties that could not be assessed experimentally, in particular the role of two important subthreshold voltage-dependent currents: the low threshold activated calcium current (IT) and the cyclic nucleotide modulated h current (Ih). The ETAs of those currents and their underlying activation/inactivation states not only explained the state dependence of the firing regime but also the long-lasting concerted dynamic action of the two

  11. Epileptiform spike detection via convolutional neural networks

    DEFF Research Database (Denmark)

    Johansen, Alexander Rosenberg; Jin, Jing; Maszczyk, Tomasz

    2016-01-01

    The EEG of epileptic patients often contains sharp waveforms called "spikes", occurring between seizures. Detecting such spikes is crucial for diagnosing epilepsy. In this paper, we develop a convolutional neural network (CNN) for detecting spikes in EEG of epileptic patients in an automated...

  12. iSpike: a spiking neural interface for the iCub robot

    International Nuclear Information System (INIS)

    Gamez, D; Fidjeland, A K; Lazdins, E

    2012-01-01

    This paper presents iSpike: a C++ library that interfaces between spiking neural network simulators and the iCub humanoid robot. It uses a biologically inspired approach to convert the robot’s sensory information into spikes that are passed to the neural network simulator, and it decodes output spikes from the network into motor signals that are sent to control the robot. Applications of iSpike range from embodied models of the brain to the development of intelligent robots using biologically inspired spiking neural networks. iSpike is an open source library that is available for free download under the terms of the GPL. (paper)

  13. Parallel algorithms

    CERN Document Server

    Casanova, Henri; Robert, Yves

    2008-01-01

    "…The authors of the present book, who have extensive credentials in both research and instruction in the area of parallelism, present a sound, principled treatment of parallel algorithms. … This book is very well written and extremely well designed from an instructional point of view. … The authors have created an instructive and fascinating text. The book will serve researchers as well as instructors who need a solid, readable text for a course on parallelism in computing. Indeed, for anyone who wants an understandable text from which to acquire a current, rigorous, and broad vi

  14. Spike timing regulation on the millisecond scale by distributed synaptic plasticity at the cerebellum input stage: a simulation study

    Directory of Open Access Journals (Sweden)

    Jesus A Garrido

    2013-05-01

    The way long-term synaptic plasticity regulates neuronal spike patterns is not completely understood. This issue is especially relevant for the cerebellum, which is endowed with several forms of long-term synaptic plasticity and has been predicted to operate as a timing and a learning machine. Here we have used a computational model to simulate the impact of multiple distributed synaptic weights in the cerebellar granular layer network. In response to mossy fiber bursts, synaptic weights at multiple connections played a crucial role in regulating spike number and positioning in granule cells. The weight at mossy fiber to granule cell synapses regulated the delay of the first spike, and the weight at mossy fiber and parallel fiber to Golgi cell synapses regulated the duration of the time window during which the first spike could be emitted. Moreover, the weights of synapses controlling Golgi cell activation regulated the intensity of granule cell inhibition and therefore the number of spikes that could be emitted. First spike timing was regulated with millisecond precision and the number of spikes ranged from 0 to 3. Interestingly, different combinations of synaptic weights optimized either first-spike timing precision or spike number, efficiently controlling transmission and filtering properties. These results predict that distributed synaptic plasticity regulates the emission of quasi-digital spike patterns on the millisecond time scale and allows the cerebellar granular layer to flexibly control burst transmission along the mossy fiber pathway.

  15. Learning of Precise Spike Times with Homeostatic Membrane Potential Dependent Synaptic Plasticity.

    Directory of Open Access Journals (Sweden)

    Christian Albers

    Precise spatio-temporal patterns of neuronal action potentials underlie e.g. sensory representations and control of muscle activities. However, it is not known how the synaptic efficacies in the neuronal networks of the brain adapt such that they can reliably generate spikes at specific points in time. Existing activity-dependent plasticity rules like Spike-Timing-Dependent Plasticity are agnostic to the goal of learning spike times. On the other hand, the existing formal and supervised learning algorithms perform a temporally precise comparison of projected activity with the target, but there is no known biologically plausible implementation of this comparison. Here, we propose a simple and local unsupervised synaptic plasticity mechanism that is derived from the requirement of a balanced membrane potential. Since the relevant signal for synaptic change is the postsynaptic voltage rather than spike times, we call the plasticity rule Membrane Potential Dependent Plasticity (MPDP). Combining our plasticity mechanism with spike after-hyperpolarization causes a sensitivity of synaptic change to pre- and postsynaptic spike times which can reproduce Hebbian spike timing dependent plasticity for inhibitory synapses as was found in experiments. In addition, the sensitivity of MPDP to the time course of the voltage when generating a spike allows MPDP to distinguish between weak (spurious) and strong (teacher) spikes, which therefore provides a neuronal basis for the comparison of actual and target activity. For spatio-temporal input spike patterns our conceptually simple plasticity rule achieves a surprisingly high storage capacity for spike associations. The sensitivity of the MPDP to the subthreshold membrane potential during training allows robust memory retrieval after learning even in the presence of activity corrupted by noise. We propose that MPDP represents a biophysically plausible mechanism to learn temporal target activity patterns.

  16. Learning of Precise Spike Times with Homeostatic Membrane Potential Dependent Synaptic Plasticity.

    Science.gov (United States)

    Albers, Christian; Westkott, Maren; Pawelzik, Klaus

    2016-01-01

    Precise spatio-temporal patterns of neuronal action potentials underlie e.g. sensory representations and control of muscle activities. However, it is not known how the synaptic efficacies in the neuronal networks of the brain adapt such that they can reliably generate spikes at specific points in time. Existing activity-dependent plasticity rules like Spike-Timing-Dependent Plasticity are agnostic to the goal of learning spike times. On the other hand, the existing formal and supervised learning algorithms perform a temporally precise comparison of projected activity with the target, but there is no known biologically plausible implementation of this comparison. Here, we propose a simple and local unsupervised synaptic plasticity mechanism that is derived from the requirement of a balanced membrane potential. Since the relevant signal for synaptic change is the postsynaptic voltage rather than spike times, we call the plasticity rule Membrane Potential Dependent Plasticity (MPDP). Combining our plasticity mechanism with spike after-hyperpolarization causes a sensitivity of synaptic change to pre- and postsynaptic spike times which can reproduce Hebbian spike timing dependent plasticity for inhibitory synapses as was found in experiments. In addition, the sensitivity of MPDP to the time course of the voltage when generating a spike allows MPDP to distinguish between weak (spurious) and strong (teacher) spikes, which therefore provides a neuronal basis for the comparison of actual and target activity. For spatio-temporal input spike patterns our conceptually simple plasticity rule achieves a surprisingly high storage capacity for spike associations. The sensitivity of the MPDP to the subthreshold membrane potential during training allows robust memory retrieval after learning even in the presence of activity corrupted by noise. We propose that MPDP represents a biophysically plausible mechanism to learn temporal target activity patterns.

  17. A memristive spiking neuron with firing rate coding

    Directory of Open Access Journals (Sweden)

    Marina Ignatov

    2015-10-01

    Perception, decisions, and sensations are all encoded into trains of action potentials in the brain. The relation between stimulus strength and the all-or-nothing spiking of neurons is widely believed to be the basis of this coding. This initiated the development of spiking neuron models, one of today's most powerful conceptual tools for the analysis and emulation of neural dynamics. The success of electronic circuit models and their physical realization within silicon field-effect transistor circuits led to elegant technical approaches. Recently, the spectrum of electronic devices for neural computing has been extended by memristive devices, mainly used to emulate static synaptic functionality. Their capabilities for emulations of neural activity were recently demonstrated using a memristive neuristor circuit, while a memristive neuron circuit has so far been elusive. Here, a spiking neuron model is experimentally realized in a compact circuit comprising memristive and memcapacitive devices based on the strongly correlated electron material vanadium dioxide (VO2) and on the chemical electromigration cell Ag/TiO2-x/Al. The circuit can emulate dynamical spiking patterns in response to an external stimulus, including adaptation, which is at the heart of firing rate coding as first observed by E.D. Adrian in 1926.

  18. Deep Learning with Dynamic Spiking Neurons and Fixed Feedback Weights.

    Science.gov (United States)

    Samadi, Arash; Lillicrap, Timothy P; Tweed, Douglas B

    2017-03-01

    Recent work in computer science has shown the power of deep learning driven by the backpropagation algorithm in networks of artificial neurons. But real neurons in the brain are different from most of these artificial ones in at least three crucial ways: they emit spikes rather than graded outputs, their inputs and outputs are related dynamically rather than by piecewise-smooth functions, and they have no known way to coordinate arrays of synapses in separate forward and feedback pathways so that they change simultaneously and identically, as they do in backpropagation. Given these differences, it is unlikely that current deep learning algorithms can operate in the brain, but we show that these problems can be solved by two simple devices: learning rules can approximate dynamic input-output relations with piecewise-smooth functions, and a variation on the feedback alignment algorithm can train deep networks without having to coordinate forward and feedback synapses. Our results also show that deep spiking networks learn much better if each neuron computes an intracellular teaching signal that reflects that cell's nonlinearity. With this mechanism, networks of spiking neurons show useful learning in synapses at least nine layers upstream from the output cells and perform well compared to other spiking networks in the literature on the MNIST digit recognition task.

  19. Spike Neural Models Part II: Abstract Neural Models

    Directory of Open Access Journals (Sweden)

    Johnson, Melissa G.

    2018-02-01

    Neurons are complex cells that require a lot of time and resources to model completely. In spiking neural networks (SNNs), though, not all of that complexity is required. Therefore simple, abstract models are often used. These models save time, use fewer computer resources, and are easier to understand. This tutorial presents two such models: Izhikevich's model, which is biologically realistic in the resulting spike trains but not in the parameters, and the Leaky Integrate and Fire (LIF) model, which is not biologically realistic but does quickly and easily integrate input to produce spikes. Izhikevich's model is based on the Hodgkin-Huxley model but simplified such that it uses only two differential equations and four parameters to produce various realistic spike patterns. LIF is based on a standard electrical circuit and contains one equation. Either of these two models, or any of the many other models in the literature, can be used in an SNN. Choosing a neural model is an important task that depends on the goal of the research and the resources available. Once a model is chosen, network decisions such as connectivity, delay, and sparseness need to be made. Understanding neural models and how they are incorporated into the network is the first step in creating an SNN.
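    The LIF model mentioned above is compact enough to state in full: integrate tau * dv/dt = -(v - v_rest) + R*I (with R = 1) by forward Euler and reset on threshold crossing. The sketch below uses typical textbook parameter values, not values taken from the tutorial; a constant suprathreshold input yields the expected regular spike train.

```python
import numpy as np

def lif(I, dt=0.1, tau=10.0, v_rest=-65.0, v_th=-50.0, v_reset=-70.0):
    # Leaky Integrate-and-Fire: tau * dv/dt = -(v - v_rest) + I (R = 1),
    # integrated with forward Euler; spike and reset on threshold crossing.
    # Times in ms, voltages in mV.
    v, spike_times = v_rest, []
    for step in range(len(I)):
        v += dt / tau * (-(v - v_rest) + I[step])
        if v >= v_th:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

I = np.full(10000, 20.0)               # 1 s of constant suprathreshold input
spikes = lif(I)
print(len(spikes), [round(t, 1) for t in spikes[:3]])
```

    Izhikevich's model adds a second (recovery) variable and a quadratic voltage term, which is what lets it reproduce bursting, chattering, and other realistic patterns that a one-equation LIF cannot.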

  20. Learning Spatiotemporally Encoded Pattern Transformations in Structured Spiking Neural Networks.

    Science.gov (United States)

    Gardner, Brian; Sporea, Ioana; Grüning, André

    2015-12-01

    Information encoding in the nervous system is supported through the precise spike timings of neurons; however, an understanding of the underlying processes by which such representations are formed in the first place remains an open question. Here we examine how multilayered networks of spiking neurons can learn to encode for input patterns using a fully temporal coding scheme. To this end, we introduce a new supervised learning rule, MultilayerSpiker, that can train spiking networks containing hidden layer neurons to perform transformations between spatiotemporal input and output spike patterns. The performance of the proposed learning rule is demonstrated in terms of the number of pattern mappings it can learn, the complexity of network structures it can be used on, and its classification accuracy when using multispike-based encodings. In particular, the learning rule displays robustness against input noise and can generalize well on an example data set. Our approach contributes to both a systematic understanding of how computations might take place in the nervous system and a learning rule that displays strong technical capability.

  1. Characterizing neural activities evoked by manual acupuncture through spiking irregularity measures

    International Nuclear Information System (INIS)

    Xue Ming; Wang Jiang; Deng Bin; Wei Xi-Le; Yu Hai-Tao; Chen Ying-Yuan

    2013-01-01

    The neural system characterizes information in external stimulations by different spiking patterns. In order to examine how neural spiking patterns are related to acupuncture manipulations, experiments are designed in such a way that different types of manual acupuncture (MA) manipulations are applied at the ‘Zusanli’ point of experimental rats, and the induced electrical signals in the spinal dorsal root ganglion are detected and recorded. The interspike interval (ISI) statistical histogram is fitted by the gamma distribution, which has two parameters: one is the time-dependent firing rate and the other is a shape parameter characterizing the spiking irregularity. The shape parameter is a measure of spiking irregularity and can be used to identify the type of MA manipulation. The coefficient of variation is most commonly used to measure spike time irregularity, but it overestimates the irregularity in the case of pronounced firing rate changes. Since experiments show that each acupuncture manipulation leads to changes in the firing rate, we combine four relatively rate-independent measures to study the irregularity of spike trains evoked by different types of MA manipulations. Results suggest that the MA manipulations possess unique spiking statistics and characteristics and can be distinguished according to the spiking irregularity measures. These studies offer new insights into the coding processes and information transfer of acupuncture. (interdisciplinary physics and related areas of science and technology)
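The gamma shape parameter used above as an irregularity measure can be estimated, for illustration, by the method of moments (shape ≈ mean²/variance of the ISIs); this is a sketch, not the paper's fitting procedure:

```python
import numpy as np

def gamma_shape(isis):
    """Method-of-moments estimate of the gamma shape parameter of an ISI sample."""
    isis = np.asarray(isis, dtype=float)
    return float(isis.mean() ** 2 / isis.var())

rng = np.random.default_rng(1)
# Poisson-like (irregular) firing: exponential ISIs, shape near 1.
shape_irregular = gamma_shape(rng.exponential(20.0, size=5000))
# Regular firing: narrow gamma ISIs, large shape (here, near 20).
shape_regular = gamma_shape(rng.gamma(20.0, 1.0, size=5000))
```

A small shape value indicates Poisson-like irregular spiking, while a large value indicates a regular spike train, which is why the shape parameter can separate manipulation types even when firing rates differ.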

  2. A Markovian event-based framework for stochastic spiking neural networks.

    Science.gov (United States)

    Touboul, Jonathan D; Faugeras, Olivier D

    2011-11-01

    In spiking neural networks, information is conveyed by the spike times, which depend on the intrinsic dynamics of each neuron, the input it receives, and the connections between neurons. In this article we study the Markovian nature of the sequence of spike times in stochastic neural networks, and in particular the ability to deduce from a spike train the next spike time, and therefore to produce a description of the network activity based only on the spike times, regardless of the membrane potential process. To study this question in a rigorous manner, we introduce and study an event-based description of networks of noisy integrate-and-fire neurons, i.e., one based on the computation of the spike times. We show that the firing times of the neurons in the network constitute a Markov chain, whose transition probability is related to the probability distribution of the interspike intervals of the neurons in the network. In the cases where the Markovian model can be developed, the transition probability is explicitly derived for such classical neural network models as linear integrate-and-fire neurons with excitatory and inhibitory interactions, for different types of synapses, possibly featuring noisy synaptic integration, transmission delays, and absolute and relative refractory periods. This covers most of the cases that have been investigated in the event-based description of spiking deterministic neural networks.
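The Markov property described above means the next spike time can be sampled directly from the interspike-interval distribution given the current state, with no membrane-potential simulation between events. A toy sketch under the assumption of a gamma ISI distribution (the shape and scale values are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

def next_spike_time(t_last, shape=4.0, scale=5.0):
    """Markov transition: the next spike time depends only on the last spike time."""
    return t_last + rng.gamma(shape, scale)

# Event-based simulation: jump from spike to spike, never integrating in between.
t, times = 0.0, []
for _ in range(1000):
    t = next_spike_time(t)
    times.append(t)

mean_isi = float(np.diff([0.0] + times).mean())   # expected value: shape * scale = 20
```

This is the computational payoff of the event-based description: the cost scales with the number of spikes rather than with the simulated time resolution.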

  3. The dynamic relationship between cerebellar Purkinje cell simple spikes and the spikelet number of complex spikes.

    Science.gov (United States)

    Burroughs, Amelia; Wise, Andrew K; Xiao, Jianqiang; Houghton, Conor; Tang, Tianyu; Suh, Colleen Y; Lang, Eric J; Apps, Richard; Cerminara, Nadia L

    2017-01-01

    Purkinje cells are the sole output of the cerebellar cortex and fire two distinct types of action potential: simple spikes and complex spikes. Previous studies have mainly considered complex spikes as unitary events, even though the waveform is composed of varying numbers of spikelets. The extent to which differences in spikelet number affect simple spike activity (and vice versa) remains unclear. We found that complex spikes with greater numbers of spikelets are preceded by higher simple spike firing rates but, following the complex spike, simple spikes are reduced in a manner that is graded with spikelet number. This dynamic interaction has important implications for cerebellar information processing, and suggests that complex spike spikelet number may maintain Purkinje cells within their operational range. Purkinje cells are central to cerebellar function because they form the sole output of the cerebellar cortex. They exhibit two distinct types of action potential: simple spikes and complex spikes. It is widely accepted that interaction between these two types of impulse is central to cerebellar cortical information processing. Previous investigations of the interactions between simple spikes and complex spikes have mainly considered complex spikes as unitary events. However, complex spikes are composed of an initial large spike followed by a number of secondary components, termed spikelets. The number of spikelets within individual complex spikes is highly variable and the extent to which differences in complex spike spikelet number affects simple spike activity (and vice versa) remains poorly understood. In anaesthetized adult rats, we have found that Purkinje cells recorded from the posterior lobe vermis and hemisphere have high simple spike firing frequencies that precede complex spikes with greater numbers of spikelets. This finding was also evident in a small sample of Purkinje cells recorded from the posterior lobe hemisphere in awake cats. In addition

  4. Motor control by precisely timed spike patterns

    DEFF Research Database (Denmark)

    Srivastava, Kyle H; Holmes, Caroline M; Vellema, Michiel

    2017-01-01

    A fundamental problem in neuroscience is understanding how sequences of action potentials ("spikes") encode information about sensory signals and motor outputs. Although traditional theories assume that this information is conveyed by the total number of spikes fired within a specified time interval (spike rate), recent studies have shown that additional information is carried by the millisecond-scale timing patterns of action potentials (spike timing). However, it is unknown whether or how subtle differences in spike timing drive differences in perception or behavior, leaving it unclear whether the information in spike timing actually plays a role in brain function. By examining the activity of individual motor units (the muscle fibers innervated by a single motor neuron) and manipulating patterns of activation of these neurons, we provide both correlative and causal evidence...

  5. Multineuronal Spike Sequences Repeat with Millisecond Precision

    Directory of Open Access Journals (Sweden)

    Koki eMatsumoto

    2013-06-01

    Full Text Available Cortical microcircuits are nonrandomly wired networks of neurons. As a natural consequence, the spikes emitted by microcircuits are also nonrandomly patterned in time and space. One prominent form of spike organization is the repetition of fixed patterns of spike series across multiple neurons. However, several questions remain unresolved, including how precisely spike sequences repeat, how the sequences are spatially organized, how many neurons participate in sequences, and how different sequences are functionally linked. To address these questions, we monitored spontaneous spikes of hippocampal CA3 neurons ex vivo using a high-speed functional multineuron calcium imaging technique that allowed us to monitor spikes with millisecond resolution and to record the locations of spiking and nonspiking neurons. Multineuronal spike sequences were overrepresented in spontaneous activity compared to the statistical chance level. Approximately 75% of neurons participated in at least one sequence during our observation period. The participants were sparsely dispersed and did not show specific spatial organization. The number of sequences relative to the chance level decreased when larger time frames were used to detect sequences. Thus, sequences were precise at the millisecond level. Sequences often shared common spikes with other sequences; parts of sequences were subsequently relayed by following sequences, generating complex chains of multiple sequences.

  6. The Second Spiking Threshold: Dynamics of Laminar Network Spiking in the Visual Cortex

    DEFF Research Database (Denmark)

    Forsberg, Lars E.; Bonde, Lars H.; Harvey, Michael A.

    2016-01-01

    Most neurons have a threshold separating the silent non-spiking state and the state of producing temporal sequences of spikes. But neurons in vivo also have a second threshold, found recently in granular layer neurons of the primary visual cortex, separating spontaneous ongoing spiking from visually evoked spiking driven by sharp transients. Here we examine whether this second threshold exists outside the granular layer and examine details of transitions between spiking states in ferrets exposed to moving objects. We found the second threshold, separating spiking states evoked by stationary and moving visual stimuli from the spontaneous ongoing spiking state, in all layers and zones of areas 17 and 18, indicating that the second threshold is a property of the network. Spontaneous and evoked spiking can thus easily be distinguished. In addition, the trajectories of spontaneous ongoing states...

  7. Supervised learning with decision margins in pools of spiking neurons.

    Science.gov (United States)

    Le Mouel, Charlotte; Harris, Kenneth D; Yger, Pierre

    2014-10-01

    Learning to categorise sensory inputs by generalising from a few examples whose category is precisely known is a crucial step for the brain to produce appropriate behavioural responses. At the neuronal level, this may be performed by adaptation of synaptic weights under the influence of a training signal, in order to group spiking patterns impinging on the neuron. Here we describe a framework that allows spiking neurons to perform such "supervised learning", using principles similar to the Support Vector Machine, a well-established and robust classifier. Using a hinge-loss error function, we show that requesting a margin similar to that of the SVM improves performance on linearly non-separable problems. Moreover, we show that using pools of neurons to discriminate categories can also increase the performance by sharing the load among neurons.
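The SVM-style margin idea can be sketched as a hinge-loss update rule: weights change only while the response to a training pattern falls inside the requested margin. The toy linearly separable task below is an illustrative assumption, not the authors' spiking implementation:

```python
import numpy as np

rng = np.random.default_rng(2)
X = rng.standard_normal((300, 10))
w_true = rng.standard_normal(10)
y = np.sign(X @ w_true)              # labels in {-1, +1}

w = np.zeros(10)
lr, margin = 0.1, 1.0
for _ in range(50):                  # epochs over the training set
    for xi, yi in zip(X, y):
        if yi * (w @ xi) < margin:   # hinge loss is nonzero inside the margin
            w += lr * yi * xi        # push the response past the margin

accuracy = float(np.mean(np.sign(X @ w) == y))
```

Requesting a margin (rather than merely a correct sign) is what gives the SVM-like robustness the abstract reports; with `margin = 0` this reduces to the plain perceptron rule.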

  8. Integration of balance and strength training into daily life activity to reduce rate of falls in older people (the LiFE study): randomised parallel trial.

    Science.gov (United States)

    Clemson, Lindy; Fiatarone Singh, Maria A; Bundy, Anita; Cumming, Robert G; Manollaras, Kate; O'Loughlin, Patricia; Black, Deborah

    2012-08-07

    To determine whether a lifestyle integrated approach to balance and strength training is effective in reducing the rate of falls in older, high risk people living at home. Three arm, randomised parallel trial; assessments at baseline and after six and 12 months. Randomisation done by computer generated random blocks, stratified by sex and fall history and concealed by an independent secure website. Residents in metropolitan Sydney, Australia. Participants aged 70 years or older who had two or more falls or one injurious fall in past 12 months, recruited from Veteran's Affairs databases and general practice databases. Exclusion criteria were moderate to severe cognitive problems, inability to ambulate independently, neurological conditions that severely influenced gait and mobility, resident in a nursing home or hostel, or any unstable or terminal illness that would affect ability to do exercises. Three home based interventions: Lifestyle integrated Functional Exercise (LiFE) approach (n=107; taught principles of balance and strength training and integrated selected activities into everyday routines), structured programme (n=105; exercises for balance and lower limb strength, done three times a week), sham control programme (n=105; gentle exercise). LiFE and structured groups received five sessions with two booster visits and two phone calls; controls received three home visits and six phone calls. Assessments made at baseline and after six and 12 months. Primary measure: rate of falls over 12 months, collected by self report. Secondary measures: static and dynamic balance; ankle, knee and hip strength; balance self efficacy; daily living activities; participation; habitual physical activity; quality of life; energy expenditure; body mass index; and fat free mass. After 12 months' follow-up, we recorded 172, 193, and 224 falls in the LiFE, structured exercise, and control groups, respectively. The overall incidence of falls in the LiFE programme was 1.66 per person

  9. Spike voltage topography in temporal lobe epilepsy.

    Science.gov (United States)

    Asadi-Pooya, Ali A; Asadollahi, Marjan; Shimamoto, Shoichi; Lorenzo, Matthew; Sperling, Michael R

    2016-07-15

    We investigated the voltage topography of interictal spikes in patients with temporal lobe epilepsy (TLE) to see whether topography was related to the etiology of TLE. Adults with TLE, who had epilepsy surgery for drug-resistant seizures from 2011 until 2014 at Jefferson Comprehensive Epilepsy Center, were selected. Two groups of patients were studied: patients with mesial temporal sclerosis (MTS) on MRI and those with other MRI findings. The voltage topography maps of the interictal spikes at the peak were created using BESA software. We classified the interictal spikes as polar, basal, lateral, or others. Thirty-four patients were studied, and the characteristics of 340 spikes were investigated. The most common type of spike orientation was others (186 spikes; 54.7%), followed by lateral (146; 42.9%), polar (5; 1.5%), and basal (3; 0.9%). Characteristics of the voltage topography maps of the spikes between the two groups of patients were somewhat different. Five spikes in patients with MTS had polar orientation, but none of the spikes in patients with other MRI findings had polar orientation (odds ratio=6.98, 95% confidence interval=0.38 to 127.38; p=0.07). Scalp topographic mapping of interictal spikes has the potential to offer different information than visual inspection alone. The present results do not allow an immediate clinical application of our findings; however, detecting a polar spike in a patient with TLE may increase the possibility of mesial temporal sclerosis as the underlying etiology. Copyright © 2016 Elsevier B.V. All rights reserved.

  10. Parallel computation

    International Nuclear Information System (INIS)

    Jejcic, A.; Maillard, J.; Maurel, G.; Silva, J.; Wolff-Bacha, F.

    1997-01-01

    Work in the field of parallel processing has developed through research activities using several numerical Monte Carlo simulations related to basic and applied problems of nuclear and particle physics. For applications utilizing the GEANT code, development and improvement work was done on the parts simulating low-energy physical phenomena such as radiation, transport and interaction. The problem of actinide burning by means of accelerators was approached using a simulation with the GEANT code. A program for neutron tracking from low energies down to the thermal region has been developed. It is coupled to the GEANT code and permits, in a single pass, the simulation of a hybrid reactor core receiving a proton burst. Other work in this field concerns simulations for nuclear medicine applications such as the development of biological probes, the evaluation and characterization of gamma cameras (collimators, crystal thickness), and methods for dosimetric calculations. These calculations are particularly suited to a geometrical parallelization approach especially adapted to parallel machines of the TN310 type. Other work in the same field concerns the simulation of electron channelling in crystals and of the beam-beam interaction effect in colliders. The GEANT code was also used to simulate the operation of germanium detectors designed for monitoring natural and artificial radioactivity in the environment.

  11. Linking investment spikes and productivity growth

    NARCIS (Netherlands)

    Geylani, P.C.; Stefanou, S.E.

    2013-01-01

    We investigate the relationship between productivity growth and investment spikes using the Census Bureau's plant-level dataset for the U.S. food manufacturing industry. There are differences in productivity growth and investment spike patterns across different sub-industries and food manufacturing

  12. Mimickers of generalized spike and wave discharges.

    Science.gov (United States)

    Azzam, Raed; Bhatt, Amar B

    2014-06-01

    Overinterpretation of benign EEG variants is a common problem that can lead to the misdiagnosis of epilepsy. We review four normal patterns that mimic generalized spike and wave discharges: phantom spike-and-wave, hyperventilation hypersynchrony, hypnagogic/hypnopompic hypersynchrony, and mitten patterns.

  13. Noise-robust speech recognition through auditory feature detection and spike sequence decoding.

    Science.gov (United States)

    Schafer, Phillip B; Jin, Dezhe Z

    2014-03-01

    Speech recognition in noisy conditions is a major challenge for computer systems, but the human brain performs it routinely and accurately. Automatic speech recognition (ASR) systems that are inspired by neuroscience can potentially bridge the performance gap between humans and machines. We present a system for noise-robust isolated word recognition that works by decoding sequences of spikes from a population of simulated auditory feature-detecting neurons. Each neuron is trained to respond selectively to a brief spectrotemporal pattern, or feature, drawn from the simulated auditory nerve response to speech. The neural population conveys the time-dependent structure of a sound by its sequence of spikes. We compare two methods for decoding the spike sequences--one using a hidden Markov model-based recognizer, the other using a novel template-based recognition scheme. In the latter case, words are recognized by comparing their spike sequences to template sequences obtained from clean training data, using a similarity measure based on the length of the longest common sub-sequence. Using isolated spoken digits from the AURORA-2 database, we show that our combined system outperforms a state-of-the-art robust speech recognizer at low signal-to-noise ratios. Both the spike-based encoding scheme and the template-based decoding offer gains in noise robustness over traditional speech recognition methods. Our system highlights potential advantages of spike-based acoustic coding and provides a biologically motivated framework for robust ASR development.
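The template-based decoder above compares spike sequences by the length of their longest common subsequence (LCS). A sketch of the similarity measure, using toy character sequences in place of real spike-label data:

```python
def lcs_length(a, b):
    """Dynamic-programming longest common subsequence length of two sequences."""
    dp = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i, x in enumerate(a):
        for j, y in enumerate(b):
            dp[i + 1][j + 1] = dp[i][j] + 1 if x == y else max(dp[i][j + 1], dp[i + 1][j])
    return dp[-1][-1]

def similarity(seq, template):
    # Normalize by the longer length so scores are comparable across words.
    return lcs_length(seq, template) / max(len(seq), len(template))

template = "ABCDEFG"          # clean training sequence for one word
noisy = "ABXDEG"              # test sequence: one substitution, one deletion
score = similarity(noisy, template)
```

Because the LCS tolerates insertions, deletions, and substitutions, a noisy test sequence still scores highest against the template of the correct word, which is what gives the scheme its noise robustness. The normalization choice here is an assumption; the paper's exact similarity measure may differ.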

  14. Spiking neural networks for handwritten digit recognition-Supervised learning and network optimization.

    Science.gov (United States)

    Kulkarni, Shruti R; Rajendran, Bipin

    2018-07-01

    We demonstrate supervised learning in Spiking Neural Networks (SNNs) for the problem of handwritten digit recognition using the spike triggered Normalized Approximate Descent (NormAD) algorithm. Our network, which employs neurons operating at sparse biological spike rates below 300 Hz, achieves a classification accuracy of 98.17% on the MNIST test database with four times fewer parameters compared to the state-of-the-art. We present several insights from extensive numerical experiments regarding optimization of learning parameters and network configuration to improve its accuracy. We also describe a number of strategies to optimize the SNN for implementation in memory- and energy-constrained hardware, including approximations in computing the neuronal dynamics and reduced precision in storing the synaptic weights. Experiments reveal that even with 3-bit synaptic weights, the classification accuracy of the designed SNN does not degrade beyond 1% compared to the floating-point baseline. Further, the proposed SNN, which is trained on precise spike timing information, outperforms an equivalent non-spiking artificial neural network (ANN) trained using backpropagation, especially at low bit precision. Thus, our study shows the potential for realizing efficient neuromorphic systems that use spike-based information encoding and learning for real-world applications. Copyright © 2018 Elsevier Ltd. All rights reserved.
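The reduced-precision idea can be illustrated by uniformly quantizing a weight vector to a 3-bit grid (2³ = 8 levels); the weight range and quantization scheme here are assumptions for illustration, not the paper's hardware mapping:

```python
import numpy as np

def quantize(w, bits=3):
    """Uniformly quantize weights to 2**bits levels spanning their range."""
    levels = 2 ** bits
    lo, hi = w.min(), w.max()
    step = (hi - lo) / (levels - 1)
    return lo + np.round((w - lo) / step) * step

rng = np.random.default_rng(3)
w = rng.uniform(-1.0, 1.0, size=1000)     # illustrative float weights
w_q = quantize(w, bits=3)
max_err = float(np.abs(w - w_q).max())    # bounded by half a quantization step
```

The worst-case rounding error is half a step, so the finding that accuracy drops by less than 1% at 3 bits says the network's decision boundaries tolerate per-weight perturbations of this size.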

  15. Inference of neuronal network spike dynamics and topology from calcium imaging data

    Directory of Open Access Journals (Sweden)

    Henry eLütcke

    2013-12-01

    Full Text Available Two-photon calcium imaging enables functional analysis of neuronal circuits by inferring action potential (AP) occurrence ('spike trains') from cellular fluorescence signals. It remains unclear how experimental parameters such as signal-to-noise ratio (SNR) and acquisition rate affect spike inference and whether additional information about network structure can be extracted. Here we present a simulation framework for quantitatively assessing how well spike dynamics and network topology can be inferred from noisy calcium imaging data. For simulated AP-evoked calcium transients in neocortical pyramidal cells, we analyzed the quality of spike inference as a function of SNR and data acquisition rate using a recently introduced peeling algorithm. Given experimentally attainable values of SNR and acquisition rate, neural spike trains could be reconstructed accurately and with up to millisecond precision. We then applied statistical neuronal network models to explore how remaining uncertainties in spike inference affect estimates of network connectivity and topological features of network organization. We define the experimental conditions suitable for inferring whether the network has a scale-free structure and determine how well hub neurons can be identified. Our findings provide a benchmark for future calcium imaging studies that aim to reliably infer neuronal network properties.

  16. Fluid-thermal analysis of aerodynamic heating over spiked blunt body configurations

    Science.gov (United States)

    Qin, Qihao; Xu, Jinglei; Guo, Shuai

    2017-03-01

    When flying at hypersonic speeds, the spiked blunt body is constantly subjected to severe aerodynamic heating. To illustrate the thermal response of different configurations and the relevant flow field variation, a loosely-coupled fluid-thermal analysis is performed in this paper. The Mesh-based parallel Code Coupling Interface (MpCCI) is adopted to implement the data exchange between the fluid solver and the thermal solver. The results indicate that increases in spike diameter and length result in a sharp decline of the wall temperature along the spike, and the overall heat flux is remarkably reduced to less than 300 W/cm2 with the aerodome mounted at the spike tip. Moreover, the presence and evolution of small vortices within the recirculation zone are observed and proved to be induced by the stagnation effect of reattachment points on the spike. In addition, the drag coefficient of the configuration with a doubled spike length presents a maximum drop of 4.59% due to the elevated wall temperature, and the difference in drag coefficient grows further during the acceleration process.

  17. Biologically-Inspired Spike-Based Automatic Speech Recognition of Isolated Digits Over a Reproducing Kernel Hilbert Space

    Directory of Open Access Journals (Sweden)

    Kan Li

    2018-04-01

    Full Text Available This paper presents a novel real-time dynamic framework for quantifying time-series structure in spoken words using spikes. Audio signals are converted into multi-channel spike trains using a biologically-inspired leaky integrate-and-fire (LIF) spike generator. These spike trains are mapped into a function space of infinite dimension, i.e., a Reproducing Kernel Hilbert Space (RKHS), using point-process kernels, where a state-space model learns the dynamics of the multidimensional spike input using gradient descent learning. This kernelized recurrent system is very parsimonious and achieves the necessary memory depth via feedback of its internal states when trained discriminatively, utilizing the full context of the phoneme sequence. A main advantage of modeling nonlinear dynamics using state-space trajectories in the RKHS is that it imposes no restriction on the relationship between the exogenous input and its internal state. We are free to choose the input representation with an appropriate kernel, and changing the kernel does not impact the system nor the learning algorithm. Moreover, we show that this novel framework can outperform both traditional hidden Markov model (HMM) speech processing as well as neuromorphic implementations based on spiking neural networks (SNN), yielding accurate and ultra-low power word spotters. As a proof of concept, we demonstrate its capabilities using the benchmark TI-46 digit corpus for isolated-word automatic speech recognition (ASR) or keyword spotting. Compared to HMM using Mel-frequency cepstral coefficient (MFCC) front-end without time-derivatives, our MFCC-KAARMA offered improved performance. For the spike-train front-end, spike-KAARMA also outperformed state-of-the-art SNN solutions. Furthermore, compared to MFCCs, spike trains provided enhanced noise robustness in certain low signal-to-noise ratio (SNR) regimes.

  18. Biologically-Inspired Spike-Based Automatic Speech Recognition of Isolated Digits Over a Reproducing Kernel Hilbert Space.

    Science.gov (United States)

    Li, Kan; Príncipe, José C

    2018-01-01

    This paper presents a novel real-time dynamic framework for quantifying time-series structure in spoken words using spikes. Audio signals are converted into multi-channel spike trains using a biologically-inspired leaky integrate-and-fire (LIF) spike generator. These spike trains are mapped into a function space of infinite dimension, i.e., a Reproducing Kernel Hilbert Space (RKHS) using point-process kernels, where a state-space model learns the dynamics of the multidimensional spike input using gradient descent learning. This kernelized recurrent system is very parsimonious and achieves the necessary memory depth via feedback of its internal states when trained discriminatively, utilizing the full context of the phoneme sequence. A main advantage of modeling nonlinear dynamics using state-space trajectories in the RKHS is that it imposes no restriction on the relationship between the exogenous input and its internal state. We are free to choose the input representation with an appropriate kernel, and changing the kernel does not impact the system nor the learning algorithm. Moreover, we show that this novel framework can outperform both traditional hidden Markov model (HMM) speech processing as well as neuromorphic implementations based on spiking neural network (SNN), yielding accurate and ultra-low power word spotters. As a proof of concept, we demonstrate its capabilities using the benchmark TI-46 digit corpus for isolated-word automatic speech recognition (ASR) or keyword spotting. Compared to HMM using Mel-frequency cepstral coefficient (MFCC) front-end without time-derivatives, our MFCC-KAARMA offered improved performance. For spike-train front-end, spike-KAARMA also outperformed state-of-the-art SNN solutions. Furthermore, compared to MFCCs, spike trains provided enhanced noise robustness in certain low signal-to-noise ratio (SNR) regime.

  19. Decoding spatiotemporal spike sequences via the finite state automata dynamics of spiking neural networks

    International Nuclear Information System (INIS)

    Jin, Dezhe Z

    2008-01-01

    Temporally complex stimuli are encoded into spatiotemporal spike sequences of neurons in many sensory areas. Here, we describe how downstream neurons with dendritic bistable plateau potentials can be connected to decode such spike sequences. Driven by feedforward inputs from the sensory neurons and controlled by feedforward inhibition and lateral excitation, the neurons transit between UP and DOWN states of the membrane potential. The neurons spike only in the UP states. A decoding neuron spikes at the end of an input to signal the recognition of specific spike sequences. The transition dynamics is equivalent to that of a finite state automaton. A connection rule for the networks guarantees that any finite state automaton can be mapped into the transition dynamics, demonstrating the equivalence in computational power between the networks and finite state automata. The decoding mechanism is capable of recognizing an arbitrary number of spatiotemporal spike sequences, and is insensitive to variations in the spike timings within the sequences.
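The recognition scheme above can be sketched as a finite state automaton whose state advances on each expected spike and resets otherwise; the decoder "fires" only if the full target sequence has been seen by the end of the input. This simplified matcher (the target sequence and neuron labels are illustrative) ignores the overlapping-prefix subtleties that a full FSA construction would handle:

```python
def make_sequence_recognizer(target):
    """Return a recognizer that accepts iff `target` appears by the end of the input."""
    def run(spikes):
        state = 0                     # number of target spikes matched so far
        for neuron in spikes:
            if state < len(target) and neuron == target[state]:
                state += 1            # advance to the next state (next UP pattern)
            elif neuron == target[0]:
                state = 1             # restart on the first target symbol
            else:
                state = 0             # fall back to the resting state
        return state == len(target)   # decoding neuron spikes at the end of input
    return run

recognize = make_sequence_recognizer(["n1", "n3", "n2"])
accepted = recognize(["n1", "n3", "n2"])          # exact target sequence
embedded = recognize(["n4", "n1", "n3", "n2"])    # target at the end of a longer train
rejected = recognize(["n1", "n2", "n3"])          # same spikes, wrong order
```

Each integer state here plays the role of one UP/DOWN configuration of the decoding network, and the input symbol (which neuron spiked) selects the transition, mirroring the automaton equivalence stated in the abstract.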

  20. Contribution to the optimal design of an hybrid parallel power-train: choice of a battery model; Contribution a la conception optimale d'une motorisation hybride parallele. Choix d'un modele d'accumulateur

    Energy Technology Data Exchange (ETDEWEB)

    Kuhn, E.

    2004-09-15

    This work deals with the dynamic and energetic modeling of a 42 V NiMH battery, whose model is taken into account in a control law for a hybrid electric vehicle. Based on an inventory of the electrochemical phenomena, an equivalent electrical circuit has been established. In this model, diffusion phenomena were represented using non-integer derivatives. This tool gives a very good approximation of diffusion phenomena; nevertheless, such a purely mathematical approach cannot represent the energy losses inside the battery. Consequently, a second model, made of a series of electrical circuits, has been proposed to represent the energy transfers. This second model has been used to determine a control law that guarantees autonomous management of the electrical energy on board a parallel hybrid electric vehicle and prevents deep discharge of the battery. (author)

  1. Parallel R

    CERN Document Server

    McCallum, Ethan

    2011-01-01

    It's tough to argue with R as a high-quality, cross-platform, open source statistical software product, unless you're in the business of crunching Big Data. This concise book introduces you to several strategies for using R to analyze large datasets. You'll learn the basics of Snow, Multicore, Parallel, and some Hadoop-related tools, including how to find them, how to use them, when they work well, and when they don't. With these packages, you can overcome R's single-threaded nature by spreading work across multiple CPUs, or offloading work to multiple machines to address R's memory barrier.

  2. Reliability of MEG source imaging of anterior temporal spikes: analysis of an intracranially characterized spike focus.

    Science.gov (United States)

    Wennberg, Richard; Cheyne, Douglas

    2014-05-01

    To assess the reliability of MEG source imaging (MSI) of anterior temporal spikes through detailed analysis of the localization and orientation of source solutions obtained for a large number of spikes that were separately confirmed by intracranial EEG to be focally generated within a single, well-characterized spike focus. MSI was performed on 64 identical right anterior temporal spikes from an anterolateral temporal neocortical spike focus. The effects of different volume conductors (sphere and realistic head model), removal of noise with low frequency filters (LFFs) and averaging multiple spikes were assessed in terms of the reliability of the source solutions. MSI of single spikes resulted in scattered dipole source solutions that showed reasonable reliability for localization at the lobar level, but only for solutions with a goodness-of-fit exceeding 80% using an LFF of 3 Hz. Reliability at a finer level of intralobar localization was limited. Spike averaging significantly improved the reliability of source solutions, and averaging 8 or more spikes reduced dependency on goodness-of-fit and data filtering. MSI performed on topographically identical individual spikes from an intracranially defined classical anterior temporal lobe spike focus was limited by low reliability (i.e., scattered source solutions) in terms of fine, sublobar localization within the ipsilateral temporal lobe. Spike averaging significantly improved reliability. MSI performed on individual anterior temporal spikes is limited by low reliability. Reduction of background noise through spike averaging significantly improves the reliability of MSI solutions. Copyright © 2013 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
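Why spike averaging helps can be sketched numerically: averaging N repeats of the same waveform suppresses independent background noise roughly as 1/sqrt(N). The triangular "spike" and noise level below are synthetic, purely for illustration:

```python
# Sketch: averaging repeated noisy spike waveforms reduces the
# residual noise, which is why averaged spikes yield more stable
# source solutions than single spikes.
import random

random.seed(0)

CLEAN = [max(0.0, 10.0 - abs(i - 25)) for i in range(50)]  # idealized spike

def noisy_spike(noise_sd=1.0):
    """One synthetic spike: clean waveform plus Gaussian noise."""
    return [c + random.gauss(0.0, noise_sd) for c in CLEAN]

def average_spikes(spikes):
    n = len(spikes)
    return [sum(vals) / n for vals in zip(*spikes)]

def rms_error(observed):
    return (sum((o - c) ** 2 for o, c in zip(observed, CLEAN)) / len(CLEAN)) ** 0.5

single = noisy_spike()
avg64 = average_spikes([noisy_spike() for _ in range(64)])
# rms_error(avg64) is roughly 8x smaller than rms_error(single)
```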

  3. Visually Evoked Spiking Evolves While Spontaneous Ongoing Dynamics Persist

    DEFF Research Database (Denmark)

    Huys, Raoul; Jirsa, Viktor K; Darokhan, Ziauddin

    2016-01-01

    attractor. Its existence guarantees that evoked spiking returns to the spontaneous state. However, the spontaneous ongoing spiking state and the visual evoked spiking states are qualitatively different and are separated by a threshold (separatrix). The functional advantage of this organization...

  4. Emergent properties of interacting populations of spiking neurons

    Directory of Open Access Journals (Sweden)

    Stefano eCardanobile

    2011-12-01

    Full Text Available Dynamic neuronal networks are a key paradigm of increasing importance in brain research, concerned with the functional analysis of biological neuronal networks and, at the same time, with the synthesis of artificial brain-like systems. In this context, neuronal network models serve as mathematical tools to understand the function of brains, but they might as well develop into future tools for enhancing certain functions of our nervous system. Here, we discuss our recent achievements in developing multiplicative point processes into a viable mathematical framework for spiking network modeling. The perspective is that the dynamic behavior of these neuronal networks on the population level is faithfully reflected by a set of non-linear rate equations, describing all interactions on this level. These equations, in turn, are similar in structure to the Lotka-Volterra equations, well known for their use in modeling predator-prey relationships in population biology, but abundant applications to economic theory have also been described. We present a number of biologically relevant examples of spiking network function, which can be studied with the help of the aforementioned correspondence between spike trains and specific systems of non-linear coupled ordinary differential equations. We claim that, enabled by the use of multiplicative point processes, we can make essential contributions to a more thorough understanding of the dynamical properties of neural populations.

  5. Emergent properties of interacting populations of spiking neurons.

    Science.gov (United States)

    Cardanobile, Stefano; Rotter, Stefan

    2011-01-01

    Dynamic neuronal networks are a key paradigm of increasing importance in brain research, concerned with the functional analysis of biological neuronal networks and, at the same time, with the synthesis of artificial brain-like systems. In this context, neuronal network models serve as mathematical tools to understand the function of brains, but they might as well develop into future tools for enhancing certain functions of our nervous system. Here, we present and discuss our recent achievements in developing multiplicative point processes into a viable mathematical framework for spiking network modeling. The perspective is that the dynamic behavior of these neuronal networks is faithfully reflected by a set of non-linear rate equations, describing all interactions on the population level. These equations are similar in structure to Lotka-Volterra equations, well known by their use in modeling predator-prey relations in population biology, but abundant applications to economic theory have also been described. We present a number of biologically relevant examples for spiking network function, which can be studied with the help of the aforementioned correspondence between spike trains and specific systems of non-linear coupled ordinary differential equations. We claim that, enabled by the use of multiplicative point processes, we can make essential contributions to a more thorough understanding of the dynamical properties of interacting neuronal populations.
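The Lotka-Volterra correspondence described above can be illustrated with a two-population rate model, one excitatory and one inhibitory. The coupling constants and initial rates below are arbitrary illustrative choices, not taken from the paper:

```python
# Sketch of Lotka-Volterra-style rate equations for two coupled
# populations: the excitatory rate grows and is suppressed by
# inhibition; the inhibitory rate is driven by excitation and
# decays otherwise. Integrated with a simple explicit Euler step.

def simulate(r_e=1.0, r_i=0.5, steps=5000, dt=0.001):
    rates = []
    for _ in range(steps):
        # Multiplicative (LV-type) interaction terms
        dr_e = r_e * (1.0 - 0.5 * r_i)
        dr_i = r_i * (0.8 * r_e - 1.0)
        r_e += dt * dr_e
        r_i += dt * dr_i
        rates.append((r_e, r_i))
    return rates

trajectory = simulate()  # rates oscillate around the fixed point, staying positive
```

As in predator-prey models, the rates cycle around a coexistence fixed point rather than settling or diverging, the kind of population-level behavior the rate equations make analytically accessible.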

  6. A Reinforcement Learning Framework for Spiking Networks with Dynamic Synapses

    Directory of Open Access Journals (Sweden)

    Karim El-Laithy

    2011-01-01

    Full Text Available An integration of both Hebbian-based and reinforcement learning (RL) rules is presented for dynamic synapses. The proposed framework permits the Hebbian rule to update the hidden synaptic model parameters regulating the synaptic response rather than the synaptic weights. This is performed using both the value and the sign of the temporal difference in the reward signal after each trial. Applying this framework, a spiking network with spike-timing-dependent synapses is tested to learn the exclusive-OR computation on a temporally coded basis. Reward values are calculated from the distance between the output spike train of the network and a reference target train. Results show that the network is able to capture the required dynamics and that the proposed framework can indeed yield an integrated version of Hebbian and RL learning. The proposed framework is tractable and less computationally expensive. The framework is applicable to a wide class of synaptic models and is not restricted to the neural representation used here. This generality, along with the reported results, supports adopting the introduced approach to benefit from biologically plausible synaptic models in a wide range of intuitive signal processing tasks.
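Computing a reward from the distance between an output spike train and a reference target train can be sketched as follows. The nearest-neighbour timing distance used here is an illustrative choice, not necessarily the metric used in the paper:

```python
# Sketch: reward signal derived from a spike-train distance.
# Smaller timing mismatch between output and target -> larger reward.

def spike_train_distance(output, target):
    """Mean distance from each target spike to the nearest output spike."""
    if not output or not target:
        return float("inf")
    total = sum(min(abs(t - o) for o in output) for t in target)
    return total / len(target)

def reward(output, target, scale=10.0):
    """Map distance to a bounded reward in (0, 1]."""
    return 1.0 / (1.0 + spike_train_distance(output, target) / scale)

target = [10.0, 25.0, 40.0]     # target spike times (ms)
good = [11.0, 24.0, 41.0]       # close match
bad = [5.0, 60.0, 90.0]         # poor match
print(reward(good, target) > reward(bad, target))  # True
```

In an RL loop of the kind described above, the change in this reward between trials (its value and sign) would drive the parameter updates.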

  7. The Use Of Spikes Protocol In Cancer: An Integrative Review

    Directory of Open Access Journals (Sweden)

    Fernando Henrique de Sousa

    2017-03-01

    Full Text Available This is an integrative review that aimed to evaluate the use of the SPIKES protocol in oncology. We selected articles published in the Medline and CINAHL databases between 2005 and 2015, in English, with descriptors defined by the Medical Subject Headings (MeSH): cancer, neoplasms, plus the uncontrolled descriptor SPIKES protocol. Six articles met the inclusion criteria and were analyzed in full; three thematic categories were established: aspects inherent to the health care professional, aspects related to the patient, and aspects related to the protocol. The main effect of the steps of the SPIKES protocol can be to strengthen ties between health professionals and patients and to ensure the maintenance and quality of this relationship. The results indicate an important limiting factor for an effective doctor-patient relationship: the little training in communicating bad news provided to medical professionals, verified by the difficulty at such moments reported in interviews in the analyzed studies.

  8. Spike Bursts from an Excitable Optical System

    Science.gov (United States)

    Rios Leite, Jose R.; Rosero, Edison J.; Barbosa, Wendson A. S.; Tredicce, Jorge R.

    Diode lasers with double optical feedback are shown to present power-drop spikes whose statistical distribution is controllable by the ratio of the two feedback times. The average time between spikes and the variance within long time series are studied. The system is shown to be excitable and to present bursting of spikes created with specific feedback time ratios and strengths. A rate equation model, extending the Lang-Kobayashi single-feedback model for semiconductor lasers, proves to match the experimental observations. Potential applications to construct networks that mimic neural systems having controlled bursting properties in each unit will be discussed. Brazilian Agency CNPQ.

  9. PARALLEL MOVING MECHANICAL SYSTEMS

    Directory of Open Access Journals (Sweden)

    Florian Ion Tiberius Petrescu

    2014-09-01

    Full Text Available Moving mechanical systems with parallel structures are solid, fast, and accurate. Among parallel systems, the Stewart platforms are to be noticed as the oldest such systems: fast, solid and precise. The work outlines a few main elements of Stewart platforms, beginning with the platform geometry and its kinematic elements, and then presenting a few items of dynamics. The primary dynamic element is the determination of the kinetic energy of the entire Stewart platform mechanism. The kinematics of the mobile platform is then recorded by a rotation-matrix method. If a structural motor element consists of two elements in relative translation, for the drive train and especially for the dynamics it is more convenient to represent the motor element as a single moving component. We thus have seven moving parts (the six motor elements, or feet, to which the mobile platform is added as the seventh) and one fixed part.

  10. The Second Spiking Threshold: Dynamics of Laminar Network Spiking in the Visual Cortex

    Science.gov (United States)

    Forsberg, Lars E.; Bonde, Lars H.; Harvey, Michael A.; Roland, Per E.

    2016-01-01

    Most neurons have a threshold separating the silent non-spiking state and the state of producing temporal sequences of spikes. But neurons in vivo also have a second threshold, found recently in granular layer neurons of the primary visual cortex, separating spontaneous ongoing spiking from visually evoked spiking driven by sharp transients. Here we examine whether this second threshold exists outside the granular layer and examine details of transitions between spiking states in ferrets exposed to moving objects. We found the second threshold, separating spiking states evoked by stationary and moving visual stimuli from the spontaneous ongoing spiking state, in all layers and zones of areas 17 and 18, indicating that the second threshold is a property of the network. Spontaneous and evoked spiking can thus easily be distinguished. In addition, the trajectories of spontaneous ongoing states were slow, frequently changing direction. In single trials, sharp as well as smooth and slow transients transform the trajectories to be outward directed, fast, and crossing the threshold to become evoked. Although the speeds of the evolution of the evoked states differ, the same domain of the state space is explored, indicating uniformity of the evoked states. All evoked states return to the spontaneous ongoing spiking state, as in a typical mono-stable dynamical system. In single trials, neither the original spiking rates nor the temporal evolution in state space could distinguish simple visual scenes. PMID:27582693

  11. Parallel Lines

    Directory of Open Access Journals (Sweden)

    James G. Worner

    2017-05-01

    Full Text Available James Worner is an Australian-based writer and scholar currently pursuing a PhD at the University of Technology Sydney. His research seeks to expose masculinities lost in the shadow of Australia’s Anzac hegemony while exploring new opportunities for contemporary historiography. He is the recipient of the Doctoral Scholarship in Historical Consciousness at the university’s Australian Centre of Public History and will be hosted by the University of Bologna during 2017 on a doctoral research writing scholarship.   ‘Parallel Lines’ is one of a collection of stories, The Shapes of Us, exploring liminal spaces of modern life: class, gender, sexuality, race, religion and education. It looks at lives, like lines, that do not meet but which travel in proximity, simultaneously attracted and repelled. James’ short stories have been published in various journals and anthologies.

  12. Spike persistence and normalization in benign epilepsy with centrotemporal spikes - Implications for management.

    Science.gov (United States)

    Kim, Hunmin; Kim, Soo Yeon; Lim, Byung Chan; Hwang, Hee; Chae, Jong-Hee; Choi, Jieun; Kim, Ki Joong; Dlugos, Dennis J

    2018-05-10

    This study was performed 1) to determine the timing of spike normalization in patients with benign epilepsy with centrotemporal spikes (BECTS); 2) to identify relationships between age of seizure onset, age of spike normalization, years of spike persistence and treatment; and 3) to assess final outcomes between groups of patients with or without spikes at the time of medication tapering. Retrospective analysis of BECTS patients confirmed by clinical data, including age of onset, seizure semiology and serial electroencephalography (EEG) from diagnosis to remission. Age at spike normalization, years of spike persistence, and time from treatment onset to spike normalization were assessed. Final seizure and EEG outcome were compared between the groups with or without spikes at the time of antiepileptic drug (AED) tapering. One hundred and thirty-four patients were included. Mean age at seizure onset was 7.52 ± 2.11 years. Mean age at spike normalization was 11.89 ± 2.11 (range: 6.3-16.8) years. Mean time from treatment onset to spike normalization was 4.11 ± 2.13 (range: 0.24-10.08) years. Younger age of seizure onset was correlated with longer duration of spike persistence (r = -0.41, p < 0.001). In treated patients, spikes persisted for 4.1 ± 1.95 years, compared with 2.9 ± 1.97 years in untreated patients. No patients had recurrent seizures after AED was discontinued, regardless of the presence/absence of spikes at the time of AED tapering. Years of spike persistence were longer in early-onset BECTS patients. Treatment with AEDs did not shorten years of spike persistence. Persistence of spikes at the time of treatment withdrawal was not associated with seizure recurrence. Copyright © 2018 The Japanese Society of Child Neurology. Published by Elsevier B.V. All rights reserved.
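The onset-age/persistence relationship above (r = -0.41) is a plain Pearson correlation. A minimal sketch of that statistic, on synthetic data chosen only to show a negative correlation (not the study's data):

```python
# Sketch: Pearson correlation coefficient, the statistic used to
# relate age of seizure onset to years of spike persistence.

def pearson_r(xs, ys):
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

onset_age = [4, 5, 6, 7, 8, 9, 10, 11]      # years, synthetic
persistence = [7, 6, 6, 5, 4, 4, 3, 2]      # years, synthetic
r = pearson_r(onset_age, persistence)        # negative: later onset, shorter persistence
```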

  13. Memristors Empower Spiking Neurons With Stochasticity

    KAUST Repository

    Al-Shedivat, Maruan; Naous, Rawan; Cauwenberghs, Gert; Salama, Khaled N.

    2015-01-01

    Recent theoretical studies have shown that probabilistic spiking can be interpreted as learning and inference in cortical microcircuits. This interpretation creates new opportunities for building neuromorphic systems driven by probabilistic learning

  14. Frequency of Rolandic Spikes in ADHD

    Directory of Open Access Journals (Sweden)

    J Gordon Millichap

    2003-10-01

    Full Text Available The frequency of rolandic spikes in nonepileptic children with attention deficit hyperactivity disorder (ADHD) was compared with that in a control group of normal school-aged children in a study at the University of Frankfurt, Germany.

  15. THE POLITICAL CRITIQUE OF SPIKE Lee's Bamboozled

    African Journals Online (AJOL)

    Admin

    CONTEMPORARY AMERICAN MEDIA: THE POLITICAL CRITIQUE OF SPIKE ... KEYWORDS: Blackface Minstrelsy, Racist Stereotypes and American Media. INTRODUCTION ... of a difference that is itself a process of disavowal.” In this ...

  16. Restoration of Muscle Mitochondrial Function and Metabolic Flexibility in Type 2 Diabetes by Exercise Training Is Paralleled by Increased Myocellular Fat Storage and Improved Insulin Sensitivity

    NARCIS (Netherlands)

    Meex, R.C.R.; Schrauwen-Hinderling, V.B.; Moonen-Kornips, E.; Schaart, G.; Mensink, M.R.; Phielix, E.; Weijer, van de T.; Sels, J.P.; Schrauwen, P.; Hesselink, M.K.C.

    2010-01-01

    OBJECTIVE-Mitochondrial dysfunction and fat accumulation in skeletal muscle (increased intramyocellular lipid [IMCL]) have been linked to development of type 2 diabetes. We examined whether exercise training could restore mitochondrial function and insulin sensitivity in patients with type 2

  17. The DEMO trial: a randomized, parallel-group, observer-blinded clinical trial of strength versus aerobic versus relaxation training for patients with mild to moderate depression

    DEFF Research Database (Denmark)

    Krogh, Jesper; Saltin, Bengt; Gluud, Christian

    2009-01-01

    OBJECTIVE: To assess the benefit and harm of exercise training in adults with clinical depression. METHOD: The DEMO trial is a randomized pragmatic trial for patients with unipolar depression conducted from January 2005 through July 2007. Patients were referred from general practitioners or psych......: Our findings do not support a biologically mediated effect of exercise on symptom severity in depressed patients, but they do support a beneficial effect of strength training on work capacity. TRIAL REGISTRATION: (ClinicalTrials.gov) Identifier: NCT00103415....

  18. Anticipating Activity in Social Media Spikes

    OpenAIRE

    Higham, Desmond J.; Grindrod, Peter; Mantzaris, Alexander V.; Otley, Amanda; Laflin, Peter

    2014-01-01

    We propose a novel mathematical model for the activity of microbloggers during an external, event-driven spike. The model leads to a testable prediction of who would become most active if a spike were to take place. This type of information is of great interest to commercial organisations, governments and charities, as it identifies key players who can be targeted with information in real time when the network is most receptive. The model takes account of the fact that dynamic interactions ev...

  19. Building functional networks of spiking model neurons.

    Science.gov (United States)

    Abbott, L F; DePasquale, Brian; Memmesheimer, Raoul-Martin

    2016-03-01

    Most of the networks used by computer scientists and many of those studied by modelers in neuroscience represent unit activities as continuous variables. Neurons, however, communicate primarily through discontinuous spiking. We review methods for transferring our ability to construct interesting networks that perform relevant tasks from the artificial continuous domain to more realistic spiking network models. These methods raise a number of issues that warrant further theoretical and experimental study.

  20. Hierarchical Adaptive Means (HAM) clustering for hardware-efficient, unsupervised and real-time spike sorting.

    Science.gov (United States)

    Paraskevopoulou, Sivylla E; Wu, Di; Eftekhar, Amir; Constandinou, Timothy G

    2014-09-30

    This work presents a novel unsupervised algorithm for real-time adaptive clustering of neural spike data (spike sorting). The proposed Hierarchical Adaptive Means (HAM) clustering method combines centroid-based clustering with hierarchical cluster connectivity to classify incoming spikes using groups of clusters. It is described how the proposed method can adaptively track the incoming spike data without requiring any past history, iteration or training, and autonomously determines the number of spike classes. Its performance (classification accuracy) has been tested using multiple datasets (both simulated and recorded), achieving near-identical accuracy compared to k-means (using 10 iterations and provided with the number of spike classes). Also, its robustness in applying to different feature extraction methods has been demonstrated by achieving classification accuracies above 80% across multiple datasets. Last, but crucially, its low complexity, which has been quantified through both memory and computation requirements, makes this method highly attractive for future hardware implementation. Copyright © 2014 Elsevier B.V. All rights reserved.
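The centroid-based, history-free idea can be sketched as online clustering: assign each incoming spike feature vector to its nearest centroid if close enough, otherwise spawn a new cluster, and track each centroid as a running mean. The fixed distance threshold and update rule below are illustrative simplifications, not the published HAM algorithm:

```python
# Sketch of online adaptive centroid clustering in the spirit of
# spike sorting without past history: the number of clusters is
# determined autonomously as data arrive.

def online_cluster(points, threshold=2.0):
    centroids = []   # running mean of each cluster
    counts = []      # samples seen per cluster
    labels = []
    for p in points:
        if centroids:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) ** 0.5
                     for c in centroids]
            k = min(range(len(dists)), key=dists.__getitem__)
        if not centroids or dists[k] > threshold:
            centroids.append(list(p))   # too far from everything: new cluster
            counts.append(1)
            labels.append(len(centroids) - 1)
        else:
            counts[k] += 1              # incremental running-mean update
            centroids[k] = [c + (a - c) / counts[k]
                            for c, a in zip(centroids[k], p)]
            labels.append(k)
    return labels, centroids

pts = [(0.0, 0.0), (0.2, 0.1), (5.0, 5.0), (5.1, 4.9), (0.1, -0.1)]
labels, centroids = online_cluster(pts)  # two clusters emerge
```

Note the O(1) per-spike memory and the absence of any iteration over past data, the properties that make this style of algorithm attractive for hardware.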

  1. An Investigation on the Role of Spike Latency in an Artificial Olfactory System

    Directory of Open Access Journals (Sweden)

    Corrado eDi Natale

    2011-12-01

    Full Text Available Experimental studies have shown that reactions to external stimuli may appear only a few hundred milliseconds after the physical interaction of the stimulus with the proper receptor. This behavior suggests that neurons transmit the largest meaningful part of their signal in the first spikes, and thus that spike latency is a good descriptor of the information content in biological neural networks. In this paper, this property has been investigated in an artificial sensorial system where a single layer of spiking neurons is trained with data generated by an artificial olfactory platform based on a large array of chemical sensors. The capability to discriminate between distinct chemicals and mixtures of them was studied with spiking neural networks with and without lateral inhibition, considering as output features of the network both the spike latency and the average firing rate. Results show that the average firing rate of the output spike sequences gives the best separation among the experienced vapors; however, the latency code is able to correctly discriminate all the tested volatile compounds in a shorter time. This behavior is qualitatively similar to that recently found in natural olfaction and, noteworthy, it provides practical suggestions for tailoring the measurement conditions of artificial olfactory systems, defining for each specific case a proper measurement time.
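The two output features compared above can be sketched directly: first-spike latency is available as soon as the first spike arrives, whereas the average firing rate needs the full observation window. Spike times and window length below are purely illustrative:

```python
# Sketch: latency code vs. rate code extracted from one spike train.

def first_spike_latency(spike_times, stimulus_onset=0.0):
    """Time from stimulus onset to the first spike (None if no spike)."""
    after = [t for t in spike_times if t >= stimulus_onset]
    return min(after) - stimulus_onset if after else None

def mean_firing_rate(spike_times, window=1.0):
    """Spikes per second over the whole observation window."""
    return len(spike_times) / window

train = [0.012, 0.105, 0.230, 0.480, 0.910]   # spike times in seconds
latency = first_spike_latency(train)   # known after only 12 ms
rate = mean_firing_rate(train)         # known only after the 1 s window
```

The asymmetry in when each feature becomes available is exactly the trade-off the paper reports: the rate separates vapors best, but the latency decides sooner.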

  2. Structured chaos shapes spike-response noise entropy in balanced neural networks

    Directory of Open Access Journals (Sweden)

    Guillaume eLajoie

    2014-10-01

    Full Text Available Large networks of sparsely coupled, excitatory and inhibitory cells occur throughout the brain. For many models of these networks, a striking feature is that their dynamics are chaotic and thus, are sensitive to small perturbations. How does this chaos manifest in the neural code? Specifically, how variable are the spike patterns that such a network produces in response to an input signal? To answer this, we derive a bound for a general measure of variability -- spike-train entropy. This leads to important insights on the variability of multi-cell spike pattern distributions in large recurrent networks of spiking neurons responding to fluctuating inputs. The analysis is based on results from random dynamical systems theory and is complemented by detailed numerical simulations. We find that the spike pattern entropy is an order of magnitude lower than what would be extrapolated from single cells. This holds despite the fact that network coupling becomes vanishingly sparse as network size grows -- a phenomenon that depends on "extensive chaos", as previously discovered for balanced networks without stimulus drive. Moreover, we show how spike pattern entropy is controlled by temporal features of the inputs. Our findings provide insight into how neural networks may encode stimuli in the presence of inherently chaotic dynamics.
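A minimal sketch of the quantity being bounded: spike trains are binned into binary "words" and the Shannon entropy of the word distribution is computed. The words below are synthetic, and real analyses need the sampling-bias corrections omitted here:

```python
# Sketch: plug-in estimate of spike-pattern (word) entropy.
# Each word is a binned binary snippet of a spike train.
from collections import Counter
from math import log2

def word_entropy(binary_words):
    """Shannon entropy (bits per word) of the empirical word distribution."""
    counts = Counter(binary_words)
    n = len(binary_words)
    return -sum((c / n) * log2(c / n) for c in counts.values())

words = ["010", "010", "110", "010", "001", "110", "010", "010"]
h = word_entropy(words)  # bits per word; lower = less variable patterns
```

The paper's finding that network pattern entropy is far lower than the single-cell extrapolation corresponds, in this picture, to the joint word distribution being much more concentrated than independence would predict.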

  3. Bio-inspired spiking neural network for nonlinear systems control.

    Science.gov (United States)

    Pérez, Javier; Cabrera, Juan A; Castillo, Juan J; Velasco, Juan M

    2018-08-01

    Spiking neural networks (SNNs) are the third generation of artificial neural networks and the closest approximation to biological neural networks. SNNs use temporal spike trains for inputs and outputs, allowing faster and more complex computation. As demonstrated by biological organisms, they are a potentially good approach to designing controllers for highly nonlinear dynamic systems in which the performance of controllers developed by conventional techniques is unsatisfactory or difficult to implement. SNN-based controllers exploit their ability for online learning and self-adaptation to evolve when transferred from simulations to the real world. The inherently binary and temporal way SNNs encode information facilitates their hardware implementation compared to analog neurons. Such networks often require fewer neurons than other controllers based on artificial neural networks. In this work, these neuronal systems are imitated to perform the control of nonlinear dynamic systems. For this purpose, a control structure based on spiking neural networks has been designed. Particular attention has been paid to optimizing the structure and size of the neural network. The proposed structure is able to control dynamic systems with a reduced number of neurons and connections. A supervised learning process using evolutionary algorithms has been carried out to train the controller. The efficiency of the proposed network has been verified in two examples of dynamic system control. Simulations show that the proposed SNN-based control exhibits superior performance compared to other approaches based on neural networks and SNNs. Copyright © 2018 Elsevier Ltd. All rights reserved.

  4. Unsupervised neural spike sorting for high-density microelectrode arrays with convolutive independent component analysis.

    Science.gov (United States)

    Leibig, Christian; Wachtler, Thomas; Zeck, Günther

    2016-09-15

    Unsupervised identification of action potentials in multi-channel extracellular recordings, in particular from high-density microelectrode arrays with thousands of sensors, is an unresolved problem. While independent component analysis (ICA) achieves rapid unsupervised sorting, it ignores the convolutive structure of extracellular data, thus limiting the unmixing to a subset of neurons. Here we present a spike sorting algorithm based on convolutive ICA (cICA) to retrieve a larger number of accurately sorted neurons than with instantaneous ICA while accounting for signal overlaps. Spike sorting was applied to datasets with varying signal-to-noise ratios (SNR: 3-12) and 27% spike overlaps, sampled at either 11.5 or 23 kHz on 4365 electrodes. We demonstrate how the instantaneity assumption in ICA-based algorithms has to be relaxed in order to improve the spike sorting performance for high-density microelectrode array recordings. Reformulating the convolutive mixture as an instantaneous mixture by modeling several delayed samples jointly is necessary to increase the signal-to-noise ratio. Our results emphasize that different cICA algorithms are not equivalent. Spike sorting performance was assessed with ground-truth data generated from experimentally derived templates. The presented spike sorter was able to extract ≈90% of the true spike trains with an error rate below 2%. It was superior to two alternative (c)ICA methods (≈80% accurately sorted neurons) and comparable to a supervised sorting. Our new algorithm represents a fast solution to overcome the current bottleneck in spike sorting of large datasets generated by simultaneous recording with thousands of electrodes. Copyright © 2016 Elsevier B.V. All rights reserved.
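The reformulation step described above, turning a convolutive mixture into an instantaneous one by modeling delayed samples jointly, amounts to a delay embedding of the channel data. A minimal sketch of just that embedding (the ICA itself is omitted, and the tiny signals are illustrative):

```python
# Sketch: delay embedding for convolutive ICA. Each channel is
# stacked with L delayed copies of itself, so that an instantaneous
# unmixing of the stacked matrix can absorb propagation delays.

def delay_embed(channels, n_delays):
    """channels: list of equal-length signals. Returns the stacked rows:
    for each channel, the signal delayed by 0..n_delays samples
    (zero-padded at the start)."""
    t = len(channels[0])
    embedded = []
    for sig in channels:
        for d in range(n_delays + 1):
            embedded.append([0.0] * d + list(sig[: t - d]))
    return embedded

x = [[1.0, 2.0, 3.0, 4.0], [5.0, 6.0, 7.0, 8.0]]
rows = delay_embed(x, 2)
# 2 channels x 3 lags (0, 1, 2) -> 6 rows, each of length 4
```

The cost of the trick is visible here: the effective dimensionality grows by a factor of (n_delays + 1), which is part of why cICA is heavier than instantaneous ICA.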

  5. Anti-correlations in the degree distribution increase stimulus detection performance in noisy spiking neural networks

    NARCIS (Netherlands)

    Martens, M.B. (Marijn B.); A.R. Houweling (Arthur); E. Tiesinga, P.H. (Paul H.)

    2017-01-01

    textabstractNeuronal circuits in the rodent barrel cortex are characterized by stable low firing rates. However, recent experiments show that short spike trains elicited by electrical stimulation in single neurons can induce behavioral responses. Hence, the underlying neural networks provide

  6. The DEMO trial: a randomized, parallel-group, observer-blinded clinical trial of strength versus aerobic versus relaxation training for patients with mild to moderate depression

    DEFF Research Database (Denmark)

    Krogh, Jesper; Saltin, Bengt; Gluud, Christian

    2009-01-01

    OBJECTIVE: To assess the benefit and harm of exercise training in adults with clinical depression. METHOD: The DEMO trial is a randomized pragmatic trial for patients with unipolar depression conducted from January 2005 through July 2007. Patients were referred from general practitioners......: Our findings do not support a biologically mediated effect of exercise on symptom severity in depressed patients, but they do support a beneficial effect of strength training on work capacity. TRIAL REGISTRATION: (ClinicalTrials.gov) Identifier: NCT00103415....... or psychiatrists and were eligible if they fulfilled the International Classification of Diseases, Tenth Revision, criteria for unipolar depression and were aged between 18 and 55 years. Patients (N = 165) were allocated to supervised strength, aerobic, or relaxation training during a 4-month period. The primary...

  7. Restoration of muscle mitochondrial function and metabolic flexibility in type 2 diabetes by exercise training is paralleled by increased myocellular fat storage and improved insulin sensitivity.

    Science.gov (United States)

    Meex, Ruth C R; Schrauwen-Hinderling, Vera B; Moonen-Kornips, Esther; Schaart, Gert; Mensink, Marco; Phielix, Esther; van de Weijer, Tineke; Sels, Jean-Pierre; Schrauwen, Patrick; Hesselink, Matthijs K C

    2010-03-01

    Mitochondrial dysfunction and fat accumulation in skeletal muscle (increased intramyocellular lipid [IMCL]) have been linked to development of type 2 diabetes. We examined whether exercise training could restore mitochondrial function and insulin sensitivity in patients with type 2 diabetes. Eighteen male type 2 diabetic and 20 healthy male control subjects of comparable body weight, BMI, age, and VO2max participated in a 12-week combined progressive training program (three times per week and 45 min per session). In vivo mitochondrial function (assessed via magnetic resonance spectroscopy), insulin sensitivity (clamp), metabolic flexibility (indirect calorimetry), and IMCL content (histochemically) were measured before and after training. Mitochondrial function was lower in type 2 diabetic compared with control subjects (P = 0.03), improved by training in control subjects (28% increase; P = 0.02), and restored to control values in type 2 diabetic subjects (48% increase). Insulin-mediated glucose disposal improved in type 2 diabetic subjects (delta Rd 63% increase), and metabolic flexibility in type 2 diabetic subjects was restored (delta respiratory exchange ratio 63% increase; P = 0.01) but was unchanged in control subjects (delta respiratory exchange ratio 7% increase; P = 0.22). Starting with comparable pretraining IMCL levels, training tended to increase IMCL content in type 2 diabetic subjects (27% increase; P = 0.10), especially in type 2 muscle fibers. Exercise training restored in vivo mitochondrial function in type 2 diabetic subjects. Insulin-mediated glucose disposal and metabolic flexibility improved in type 2 diabetic subjects in the face of near-significantly increased IMCL content. This indicates that an increased capacity to store IMCL and restoration of mitochondrial function contribute to improved muscle insulin sensitivity.

  8. Visually Evoked Spiking Evolves While Spontaneous Ongoing Dynamics Persist

    Science.gov (United States)

    Huys, Raoul; Jirsa, Viktor K.; Darokhan, Ziauddin; Valentiniene, Sonata; Roland, Per E.

    2016-01-01

    Neurons in the primary visual cortex spontaneously spike even when there are no visual stimuli. It is unknown whether the spiking evoked by visual stimuli is just a modification of the spontaneous ongoing cortical spiking dynamics or whether the spontaneous spiking state disappears and is replaced by evoked spiking. This study of laminar recordings of spontaneous and visually evoked spiking of neurons in the ferret primary visual cortex shows that the spiking dynamics do not change: both the spontaneous and the evoked spiking are controlled by a stable and persisting fixed point attractor. Its existence guarantees that evoked spiking returns to the spontaneous state. However, the spontaneous ongoing spiking state and the visually evoked spiking states are qualitatively different and are separated by a threshold (separatrix). The functional advantage of this organization is that it avoids the need for a system reorganization following visual stimulation, and impedes both the transition of spontaneous spiking to evoked spiking and the propagation of spontaneous spiking from layer 4 to layers 2–3. PMID:26778982

  9. Non-orthogonally transitive G2 spike solution

    International Nuclear Information System (INIS)

    Lim, Woei Chet

    2015-01-01

    We generalize the orthogonally transitive (OT) G2 spike solution to the non-OT G2 case. This is achieved by applying Geroch’s transformation on a Kasner seed. The new solution contains two more parameters than the OT G2 spike solution. Unlike the OT G2 spike solution, the new solution always resolves its spike. (fast track communication)

  10. Reconstructing stimuli from the spike-times of leaky integrate and fire neurons

    Directory of Open Access Journals (Sweden)

    Sebastian eGerwinn

    2011-02-01

    Full Text Available Reconstructing stimuli from the spike-trains of neurons is an important approach for understanding the neural code. One of the difficulties associated with this task is that signals which are varying continuously in time are encoded into sequences of discrete events or spikes. An important problem is to determine how much information about the continuously varying stimulus can be extracted from the time-points at which spikes were observed, especially if these time-points are subject to some sort of randomness. For the special case of spike trains generated by leaky integrate and fire neurons, noise can be introduced by allowing variations in the threshold every time a spike is released. A simple decoding algorithm previously derived for the noiseless case can be extended to the stochastic case, but turns out to be biased. Here, we review a solution to this problem, by presenting a simple yet efficient algorithm which greatly reduces the bias, and therefore leads to better decoding performance in the stochastic case.
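
The encoding step described above (a leaky integrate-and-fire neuron whose threshold is redrawn after every spike) can be sketched as follows. This is a minimal toy model, not the authors' decoder; the function name and parameter values are our own illustration.

```python
import numpy as np

def lif_spike_times(stim, dt=1e-3, tau=0.02, v_reset=0.0,
                    theta=1.0, theta_jitter=0.0, rng=None):
    """Leaky integrate-and-fire encoder: returns sample indices of spikes.

    After each spike the threshold is redrawn as theta plus Gaussian
    jitter, modeling the stochastic-threshold noise discussed above.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    v = 0.0
    thresh = theta
    spikes = []
    for i, s in enumerate(stim):
        v += dt * (-v / tau + s)      # leaky integration of the stimulus
        if v >= thresh:
            spikes.append(i)
            v = v_reset
            thresh = theta + theta_jitter * rng.standard_normal()
    return np.array(spikes)

stim = np.full(2000, 100.0)           # constant stimulus, 2 s at 1 ms steps
noiseless = lif_spike_times(stim)                     # regular spiking
noisy = lif_spike_times(stim, theta_jitter=0.1)       # jittered thresholds
print(len(noiseless), len(noisy))
```

With zero jitter the inter-spike intervals are perfectly regular; adding threshold jitter is exactly the randomness in the observed spike times that makes the naive decoder biased.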

  11. Statistical characteristics of climbing fiber spikes necessary for efficient cerebellar learning.

    Science.gov (United States)

    Kuroda, S; Yamamoto, K; Miyamoto, H; Doya, K; Kawato, M

    2001-03-01

    Mean firing rates (MFRs), with analogue values, have thus far been used as the information carriers of neurons in most brain theories of learning. However, neurons transmit signals by spikes, which are discrete events. The climbing fibers (CFs), which are known to be essential for cerebellar motor learning, fire at ultra-low rates (around 1 Hz), and it is not yet understood theoretically how high-frequency information can be conveyed and how learning of smooth and fast movements can be achieved. Here we address whether cerebellar learning can be achieved by CF spikes instead of the conventional MFR in an eye movement task, such as the ocular following response (OFR), and in an arm movement task. There are two major afferents into cerebellar Purkinje cells: the parallel fiber (PF) and the CF, and the synaptic weights between PFs and Purkinje cells have been shown to be modulated by the stimulation of both types of fiber. The modulation of the synaptic weights is regulated by cerebellar synaptic plasticity. In this study we simulated cerebellar learning using CF signals as spikes instead of the conventional MFR. To generate the spikes we used the following four spike generation models: (1) a Poisson model, in which the spike interval probability follows a Poisson distribution; (2) a gamma model, in which the spike interval probability follows a gamma distribution; (3) a max model, in which a spike is generated when the synaptic input reaches its maximum; and (4) a threshold model, in which a spike is generated when the input crosses a certain small threshold. We found that, in an OFR task with a constant visual velocity, learning was successful with the stochastic models (Poisson and gamma) but not with the deterministic models (max and threshold). In an OFR task with a stepwise velocity change and in an arm movement task, learning could be achieved only with the Poisson model. In addition, for efficient cerebellar learning, the distribution of CF spike
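
The two stochastic spike-generation models from the list above can be sketched directly from their interval distributions. The function names and the 1 Hz climbing-fiber rate below are illustrative; the gamma model keeps the mean inter-spike interval at 1/rate so both models match in mean rate.

```python
import numpy as np

rng = np.random.default_rng(42)

def poisson_spike_train(rate_hz, duration_s, rng):
    """Model (1): exponential inter-spike intervals (a Poisson process)."""
    isis = rng.exponential(1.0 / rate_hz, size=int(rate_hz * duration_s * 3) + 10)
    times = np.cumsum(isis)
    return times[times < duration_s]

def gamma_spike_train(rate_hz, order, duration_s, rng):
    """Model (2): gamma-distributed intervals with shape `order`;
    the scale is chosen so the mean ISI stays at 1/rate_hz."""
    isis = rng.gamma(order, 1.0 / (order * rate_hz),
                     size=int(rate_hz * duration_s * 3) + 10)
    times = np.cumsum(isis)
    return times[times < duration_s]

# A climbing fiber fires at roughly 1 Hz; simulate 1000 s of each model.
p = poisson_spike_train(1.0, 1000.0, rng)
g = gamma_spike_train(1.0, 4, 1000.0, rng)
print(len(p), len(g))  # both counts near 1000
```

A gamma order above 1 makes the train more regular than Poisson while keeping the same mean rate, which is the axis along which the paper's stochastic models differ.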

  12. Automatic online spike sorting with singular value decomposition and fuzzy C-mean clustering.

    Science.gov (United States)

    Oliynyk, Andriy; Bonifazzi, Claudio; Montani, Fernando; Fadiga, Luciano

    2012-08-08

    Understanding how neurons contribute to perception, motor functions and cognition requires the reliable detection of spiking activity of individual neurons during a number of different experimental conditions. An important problem in computational neuroscience is thus to develop algorithms to automatically detect and sort the spiking activity of individual neurons from extracellular recordings. While many algorithms for spike sorting exist, the problem of accurate and fast online sorting still remains a challenging issue. Here we present a novel software tool, called FSPS (Fuzzy SPike Sorting), which is designed to optimize: (i) fast and accurate detection, (ii) offline sorting and (iii) online classification of neuronal spikes with very limited or null human intervention. The method is based on a combination of Singular Value Decomposition for fast and highly accurate pre-processing of spike shapes, unsupervised Fuzzy C-mean, high-resolution alignment of extracted spike waveforms, optimal selection of the number of features to retain, automatic identification of the number of clusters, and quantitative quality assessment of resulting clusters independent of their size. After being trained on a short testing data stream, the method can reliably perform supervised online classification and monitoring of single neuron activity. The generalized procedure has been implemented in our FSPS spike sorting software (available free for non-commercial academic applications at the address: http://www.spikesorting.com) using LabVIEW (National Instruments, USA). We evaluated the performance of our algorithm both on benchmark simulated datasets with different levels of background noise and on real extracellular recordings from premotor cortex of Macaque monkeys. The results of these tests showed an excellent accuracy in discriminating low-amplitude and overlapping spikes under strong background noise. The performance of our method is competitive with respect to other robust spike
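
The SVD pre-processing step named above can be sketched as follows: project detected spike waveforms onto the leading right-singular vectors of the (centered) waveform matrix. This is a generic PCA-style sketch on synthetic data, not the FSPS implementation; the two "units" and noise level are invented for illustration.

```python
import numpy as np

def svd_features(waveforms, n_components=3):
    """Project spike waveforms onto the top right-singular vectors
    of the centered waveform matrix (SVD-based dimensionality reduction)."""
    X = waveforms - waveforms.mean(axis=0)    # center each sample point
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return X @ Vt[:n_components].T            # low-dimensional features

# Two synthetic "units" with different spike shapes plus noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 32)
shape_a = -np.exp(-((t - 0.3) ** 2) / 0.005)          # narrow, deep spike
shape_b = -0.5 * np.exp(-((t - 0.5) ** 2) / 0.02)     # wide, shallow spike
waves = np.vstack([shape_a + 0.05 * rng.standard_normal((100, 32)),
                   shape_b + 0.05 * rng.standard_normal((100, 32))])
feats = svd_features(waves)
print(feats.shape)  # (200, 3)
```

In this low-dimensional feature space the two waveform classes separate cleanly along the first component, which is what makes the subsequent fuzzy C-means clustering tractable.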

  13. Automatic online spike sorting with singular value decomposition and fuzzy C-mean clustering

    Directory of Open Access Journals (Sweden)

    Oliynyk Andriy

    2012-08-01

    Full Text Available Abstract Background Understanding how neurons contribute to perception, motor functions and cognition requires the reliable detection of spiking activity of individual neurons during a number of different experimental conditions. An important problem in computational neuroscience is thus to develop algorithms to automatically detect and sort the spiking activity of individual neurons from extracellular recordings. While many algorithms for spike sorting exist, the problem of accurate and fast online sorting still remains a challenging issue. Results Here we present a novel software tool, called FSPS (Fuzzy SPike Sorting), which is designed to optimize: (i) fast and accurate detection, (ii) offline sorting and (iii) online classification of neuronal spikes with very limited or null human intervention. The method is based on a combination of Singular Value Decomposition for fast and highly accurate pre-processing of spike shapes, unsupervised Fuzzy C-mean, high-resolution alignment of extracted spike waveforms, optimal selection of the number of features to retain, automatic identification of the number of clusters, and quantitative quality assessment of resulting clusters independent of their size. After being trained on a short testing data stream, the method can reliably perform supervised online classification and monitoring of single neuron activity. The generalized procedure has been implemented in our FSPS spike sorting software (available free for non-commercial academic applications at the address: http://www.spikesorting.com) using LabVIEW (National Instruments, USA). We evaluated the performance of our algorithm both on benchmark simulated datasets with different levels of background noise and on real extracellular recordings from premotor cortex of Macaque monkeys. The results of these tests showed an excellent accuracy in discriminating low-amplitude and overlapping spikes under strong background noise. The performance of our method is

  14. The DEMO trial: a randomized, parallel-group, observer-blinded clinical trial of strength versus aerobic versus relaxation training for patients with mild to moderate depression

    DEFF Research Database (Denmark)

    Krogh, Jesper; Saltin, Bengt; Gluud, Christian

    2009-01-01

    OBJECTIVE: To assess the benefit and harm of exercise training in adults with clinical depression. METHOD: The DEMO trial is a randomized pragmatic trial for patients with unipolar depression conducted from January 2005 through July 2007. Patients were referred from general practitioners or psychiatrists and were eligible if they fulfilled the International Classification of Diseases, Tenth Revision, criteria for unipolar depression and were aged between 18 and 55 years. Patients (N = 165) were allocated to supervised strength, aerobic, or relaxation training during a 4-month period. CONCLUSIONS: Our findings do not support a biologically mediated effect of exercise on symptom severity in depressed patients, but they do support a beneficial effect of strength training on work capacity. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT00103415.

  15. Parallel changes in the onset of blood lactate accumulation (OBLA) and threshold of psychomotor performance deterioration during incremental exercise after training in athletes.

    Science.gov (United States)

    Chmura, Jan; Nazar, Krystyna

    2010-03-01

    During aerobic exercise of increasing intensity, choice reaction time (CRT) progressively shortens up to 60-80% of maximal workload, and then rapidly increases. The aim of this study was to determine whether the workload associated with the shortest CRT, operationally called "the psychomotor fatigue threshold", is related to the metabolic response to exercise. Thirteen male soccer players (aged 23.3 ± 1.0 years) participated in this study. Before and after 6 weeks of training in the pre-competition period they underwent a treadmill test at 0% grade with running speed increasing every 3 min by 2 km/h, starting from 6 km/h, until exhaustion. At each stage of exercise, CRT, heart rate, respiratory gas exchange and blood lactate [LA] were measured, and the workload corresponding to [LA] of 4 mmol/l (OBLA) was recorded. After training, CRT was significantly shortened at rest (mean ± SEM, from 345 ± 12 to 317 ± 12 ms) and during exercise (from 304 ± 10 to 285 ± 11 ms at the psychomotor fatigue threshold, and from 359 ± 13 to 331 ± 13 ms). A close correlation was found between the changes in OBLA occurring during training and those in the psychomotor fatigue threshold (r = 0.88). It is concluded that endurance training not only increases exercise tolerance through its influence on metabolism but also facilitates psychomotor performance during heavy exercise. Copyright 2010 Elsevier B.V. All rights reserved.

  16. Spiking Neural Networks Based on OxRAM Synapses for Real-Time Unsupervised Spike Sorting.

    Science.gov (United States)

    Werner, Thilo; Vianello, Elisa; Bichler, Olivier; Garbin, Daniele; Cattaert, Daniel; Yvert, Blaise; De Salvo, Barbara; Perniola, Luca

    2016-01-01

    In this paper, we present an alternative approach to performing spike sorting of complex brain signals based on spiking neural networks (SNN). The proposed architecture is suitable for hardware implementation, using resistive random access memory (RRAM) technology to implement synapses whose low latency makes them well suited to spike sorting. This offers promising advantages over conventional spike sorting techniques for brain-computer interface (BCI) and neural prosthesis applications. Moreover, the ultra-low power consumption of the RRAM synapses of the spiking neural network (nW range) may enable the design of autonomous implantable devices for rehabilitation purposes. We demonstrate an original methodology that uses Oxide-based RRAM (OxRAM) as easy-to-program, low-energy synapses implementing Spike Timing Dependent Plasticity. Real spiking data have been recorded both intra- and extracellularly from an in-vitro preparation of the crayfish sensory-motor system and used for validation of the proposed OxRAM-based SNN. This artificial SNN is able to identify, learn, recognize and distinguish between different spike shapes in the input signal with a recognition rate of about 90% without any supervision.
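
The Spike Timing Dependent Plasticity rule mentioned above, in its standard pair-based form, can be sketched as a simple function of the pre/post spike-time difference. This is the textbook rule, not a model of the OxRAM device; the amplitudes and time constant below are conventional illustrative values.

```python
import numpy as np

def stdp_dw(dt_ms, a_plus=0.01, a_minus=0.012, tau_ms=20.0):
    """Pair-based STDP weight change for dt = t_post - t_pre (ms).

    Pre-before-post (dt > 0) potentiates the synapse; post-before-pre
    (dt <= 0) depresses it; both effects decay exponentially with |dt|.
    """
    dt_ms = np.asarray(dt_ms, dtype=float)
    return np.where(dt_ms > 0,
                    a_plus * np.exp(-dt_ms / tau_ms),
                    -a_minus * np.exp(dt_ms / tau_ms))

print(stdp_dw([5.0, -5.0]))  # small potentiation, small depression
```

Repeated application of such a rule is what lets the network's synapses latch onto recurring input spike shapes without supervision.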

  17. Spike rate and spike timing contributions to coding taste quality information in rat periphery

    Directory of Open Access Journals (Sweden)

    Vernon eLawhern

    2011-05-01

    Full Text Available There is emerging evidence that individual sensory neurons in the rodent brain rely on temporal features of the discharge pattern to code differences in taste quality information. In contrast, investigations of individual sensory neurons in the periphery have focused on analysis of spike rate and mostly disregarded spike timing as a taste quality coding mechanism. The purpose of this work was to determine the contribution of spike timing to taste quality coding by rat geniculate ganglion neurons using computational methods that have been applied successfully in other systems. We recorded the discharge patterns of narrowly-tuned and broadly-tuned neurons in the rat geniculate ganglion to representatives of the five basic taste qualities. We used mutual information to determine significant responses and the van Rossum metric to characterize their temporal features. While our findings show that spike timing contributes a significant part of the message, spike rate contributes the largest portion of the message relayed by afferent neurons from rat fungiform taste buds to the brain. Thus, spike rate and spike timing together are more effective than spike rate alone in coding stimulus quality information to a single basic taste in the periphery for both narrowly-tuned specialist and broadly-tuned generalist neurons.
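
The van Rossum metric used above admits a short direct implementation: convolve each spike train with a causal exponential kernel and take the L2 distance between the filtered traces. This is a straightforward discretized sketch with an illustrative time constant, not the authors' code.

```python
import numpy as np

def van_rossum_distance(times_a, times_b, tau=0.01, dt=1e-4, t_max=1.0):
    """Van Rossum distance between two spike trains (times in seconds):
    filter each train with exp(-t/tau), then take the normalized L2 norm
    of the difference of the filtered traces."""
    t = np.arange(0.0, t_max, dt)

    def filtered(times):
        trace = np.zeros_like(t)
        for s in times:
            m = t >= s
            trace[m] += np.exp(-(t[m] - s) / tau)
        return trace

    diff = filtered(times_a) - filtered(times_b)
    return np.sqrt(np.sum(diff ** 2) * dt / tau)

same = van_rossum_distance([0.1, 0.3], [0.1, 0.3])        # identical trains
shifted = van_rossum_distance([0.1, 0.3], [0.1, 0.35])    # one spike moved
print(same, shifted)
```

Small tau makes the metric sensitive to precise spike timing; large tau makes it behave like a rate comparison, which is exactly the rate-versus-timing axis the study probes.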

  18. Robustness of spiking Deep Belief Networks to noise and reduced bit precision of neuro-inspired hardware platforms.

    Science.gov (United States)

    Stromatias, Evangelos; Neil, Daniel; Pfeiffer, Michael; Galluppi, Francesco; Furber, Steve B; Liu, Shih-Chii

    2015-01-01

    Increasingly large deep learning architectures, such as Deep Belief Networks (DBNs) are the focus of current machine learning research and achieve state-of-the-art results in different domains. However, both training and execution of large-scale Deep Networks require vast computing resources, leading to high power requirements and communication overheads. The on-going work on design and construction of spike-based hardware platforms offers an alternative for running deep neural networks with significantly lower power consumption, but has to overcome hardware limitations in terms of noise and limited weight precision, as well as noise inherent in the sensor signal. This article investigates how such hardware constraints impact the performance of spiking neural network implementations of DBNs. In particular, the influence of limited bit precision during execution and training, and the impact of silicon mismatch in the synaptic weight parameters of custom hybrid VLSI implementations is studied. Furthermore, the network performance of spiking DBNs is characterized with regard to noise in the spiking input signal. Our results demonstrate that spiking DBNs can tolerate very low levels of hardware bit precision down to almost two bits, and show that their performance can be improved by at least 30% through an adapted training mechanism that takes the bit precision of the target platform into account. Spiking DBNs thus present an important use-case for large-scale hybrid analog-digital or digital neuromorphic platforms such as SpiNNaker, which can execute large but precision-constrained deep networks in real time.
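
The limited-bit-precision constraint studied above can be illustrated by uniformly quantizing a weight vector to a given number of bits and measuring the resulting error. This is a generic symmetric-quantization sketch (the function and grid are our own illustration, not the SpiNNaker tool chain).

```python
import numpy as np

def quantize_weights(w, n_bits, w_max=None):
    """Uniformly quantize weights to a signed n_bits grid, mimicking the
    limited synaptic weight precision of neuromorphic hardware."""
    w = np.asarray(w, dtype=float)
    w_max = np.max(np.abs(w)) if w_max is None else w_max
    levels = 2 ** (n_bits - 1) - 1          # signed representation
    step = w_max / levels
    return np.clip(np.round(w / step), -levels, levels) * step

rng = np.random.default_rng(1)
w = rng.normal(0, 0.5, size=10000)          # toy trained weights
for bits in (8, 4, 2):
    err = np.sqrt(np.mean((w - quantize_weights(w, bits)) ** 2))
    print(bits, round(err, 4))              # RMS error grows as bits shrink
```

The paper's point is that the network-level accuracy degrades far more gracefully than this raw weight error suggests, especially when training is made aware of the target precision.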

  19. Gradient Learning in Spiking Neural Networks by Dynamic Perturbation of Conductances

    International Nuclear Information System (INIS)

    Fiete, Ila R.; Seung, H. Sebastian

    2006-01-01

    We present a method of estimating the gradient of an objective function with respect to the synaptic weights of a spiking neural network. The method works by measuring the fluctuations in the objective function in response to dynamic perturbation of the membrane conductances of the neurons. It is compatible with recurrent networks of conductance-based model neurons with dynamic synapses. The method can be interpreted as a biologically plausible synaptic learning rule, if the dynamic perturbations are generated by a special class of 'empiric' synapses driven by random spike trains from an external source
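
The core idea above (estimate a gradient by correlating random perturbations with the resulting fluctuations of the objective) can be demonstrated on a toy objective. This sketch perturbs parameters directly rather than membrane conductances, so it shows the estimator's principle, not the paper's neural implementation.

```python
import numpy as np

def perturbation_gradient(f, w, sigma=1e-3, n_trials=2000, rng=None):
    """Estimate grad f(w) by correlating Gaussian perturbations xi with
    the change in the objective: E[xi * (f(w + xi) - f(w))] / sigma^2."""
    rng = np.random.default_rng(0) if rng is None else rng
    g = np.zeros_like(w)
    f0 = f(w)
    for _ in range(n_trials):
        xi = sigma * rng.standard_normal(w.shape)
        g += xi * (f(w + xi) - f0)
    return g / (n_trials * sigma ** 2)

f = lambda w: np.sum((w - 1.0) ** 2)     # toy objective; true grad = 2(w - 1)
w = np.array([0.0, 2.0])
print(perturbation_gradient(f, w))        # approximately [-2., 2.]
```

The estimate converges to the true gradient as the number of perturbation trials grows, which is why the rule can be read as a biologically plausible, if slow, form of gradient learning.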

  20. The electric potential of tripolar spikes

    Energy Technology Data Exchange (ETDEWEB)

    Nocera, L. [Theoretical Plasma Physics, IPCF-CNR, Via Moruzzi 1, I-56124 Pisa (Italy)

    2010-02-22

    We present an analytical formula for the waveform of the electric potential associated with a tripolar spike in a plasma. This formula is based on the construction and on the subsequent solution of a differential equation for the waveform. We work out this equation as a direct consequence of the morphological and functional properties of the observed waveform, without making any reference to the velocity distributions of the electrons and of the ions which sustain the spike. In the approximation of small potential amplitudes, we solve this equation by quadrature. In particular, in the second order approximation, the solution of this equation is given in terms of elementary functions. This analytical solution is able to reproduce the potential waveforms associated with electron holes, ion holes, monotonic and nonmonotonic double layers and tripolar spikes, in excellent agreement with observations.

  1. The electric potential of tripolar spikes

    International Nuclear Information System (INIS)

    Nocera, L.

    2010-01-01

    We present an analytical formula for the waveform of the electric potential associated with a tripolar spike in a plasma. This formula is based on the construction and on the subsequent solution of a differential equation for the waveform. We work out this equation as a direct consequence of the morphological and functional properties of the observed waveform, without making any reference to the velocity distributions of the electrons and of the ions which sustain the spike. In the approximation of small potential amplitudes, we solve this equation by quadrature. In particular, in the second order approximation, the solution of this equation is given in terms of elementary functions. This analytical solution is able to reproduce the potential waveforms associated with electron holes, ion holes, monotonic and nonmonotonic double layers and tripolar spikes, in excellent agreement with observations.

  2. Trace element ink spiking for signature authentication

    International Nuclear Information System (INIS)

    Hatzistavros, V.S.; Kallithrakas-Kontos, N.G.

    2008-01-01

    Signature authentication is a critical question in forensic document examination. In recent years, the evolution of personal computers has made signature copying quite easy, so the development of new ways to authenticate signatures is crucial. In the present work a commercial ink was spiked with many trace elements in various concentrations. Inorganic and organometallic ink-soluble compounds were used as spiking agents, while the ink retained its initial properties. The spiked inks were used for writing on paper, and the documents were analyzed by a non-destructive method, energy dispersive X-ray fluorescence. The thin-target model proved valid for quantitative analysis, and a very good linear relationship of intensity (X-ray signal) against concentration was estimated for all elements used. Intensity ratios between different elements in the same ink gave very stable results, independent of writing alterations. The impact of time on both the written documents and the prepared inks was also investigated. (author)

  3. A Novel and Simple Spike Sorting Implementation.

    Science.gov (United States)

    Petrantonakis, Panagiotis C; Poirazi, Panayiota

    2017-04-01

    Monitoring the activity of multiple individual neurons that fire spikes in the vicinity of an electrode, namely performing a Spike Sorting (SS) procedure, comprises one of the most important tools of contemporary neuroscience for reverse-engineering the brain. As recording-electrode technology rapidly evolves, integrating thousands of electrodes in a confined spatial setting, the algorithms used to monitor individual neurons from recorded signals have to become even more reliable and computationally efficient. In this work, we propose a novel framework for the SS approach in which a single-step processing of the raw (unfiltered) extracellular signal is sufficient for both the detection and sorting of the activity of individual neurons. Despite its simplicity, the proposed approach exhibits performance comparable with state-of-the-art approaches, especially for spike detection in noisy signals, and paves the way for a new family of SS algorithms with the potential for multi-recording, fast, on-chip implementations.

  4. Spike timing precision of neuronal circuits.

    Science.gov (United States)

    Kilinc, Deniz; Demir, Alper

    2018-04-17

    Spike timing is believed to be a key factor in sensory information encoding and computations performed by the neurons and neuronal circuits. However, the considerable noise and variability, arising from the inherently stochastic mechanisms that exist in the neurons and the synapses, degrade spike timing precision. Computational modeling can help decipher the mechanisms utilized by the neuronal circuits in order to regulate timing precision. In this paper, we utilize semi-analytical techniques, which were adapted from previously developed methods for electronic circuits, for the stochastic characterization of neuronal circuits. These techniques, which are orders of magnitude faster than traditional Monte Carlo type simulations, can be used to directly compute the spike timing jitter variance, power spectral densities, correlation functions, and other stochastic characterizations of neuronal circuit operation. We consider three distinct neuronal circuit motifs: Feedback inhibition, synaptic integration, and synaptic coupling. First, we show that both the spike timing precision and the energy efficiency of a spiking neuron are improved with feedback inhibition. We unveil the underlying mechanism through which this is achieved. Then, we demonstrate that a neuron can improve on the timing precision of its synaptic inputs, coming from multiple sources, via synaptic integration: The phase of the output spikes of the integrator neuron has the same variance as that of the sample average of the phases of its inputs. Finally, we reveal that weak synaptic coupling among neurons, in a fully connected network, enables them to behave like a single neuron with a larger membrane area, resulting in an improvement in the timing precision through cooperation.
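
The synaptic-integration result above (the output spike phase has the same variance as the sample average of the input phases) can be checked numerically. A minimal sketch, assuming independent Gaussian phase jitter on each of N inputs:

```python
import numpy as np

# Averaging N independent, unit-variance input phases should yield an
# output phase variance of 1/N, matching the integrator-neuron claim.
rng = np.random.default_rng(7)
n_inputs, n_cycles = 10, 100000
phases = rng.normal(0.0, 1.0, size=(n_cycles, n_inputs))  # input phase jitter
avg = phases.mean(axis=1)                                  # "integrated" phase
print(round(np.var(avg), 3))  # close to 1/10
```

This 1/N variance reduction is the statistical mechanism behind the timing-precision improvement attributed to synaptic integration.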

  5. Implementing Signature Neural Networks with Spiking Neurons.

    Science.gov (United States)

    Carrillo-Medina, José Luis; Latorre, Roberto

    2016-01-01

    Spiking Neural Networks constitute the most promising approach to developing realistic Artificial Neural Networks (ANNs). Unlike traditional firing rate-based paradigms, information coding in spiking models is based on the precise timing of individual spikes. It has been demonstrated that spiking ANNs can be successfully and efficiently applied to multiple realistic problems solvable with traditional strategies (e.g., data classification or pattern recognition). In recent years, major breakthroughs in neuroscience research have uncovered new relevant computational principles in different living neural systems. Could ANNs benefit from some of these recent findings providing novel elements of inspiration? This is an intriguing question for the research community, and the development of spiking ANNs including novel bio-inspired information coding and processing strategies is gaining attention. From this perspective, in this work we adapt the core concepts of the recently proposed Signature Neural Network paradigm (neural signatures to identify each unit in the network, local information contextualization during processing, and multicoding strategies for information propagation regarding the origin and the content of the data) to be employed in a spiking neural network. To the best of our knowledge, none of these mechanisms has yet been used in the context of ANNs of spiking neurons. This paper provides a proof-of-concept for their applicability in such networks. Computer simulations show that a simple network model like the one discussed here exhibits complex self-organizing properties. The combination of multiple simultaneous encoding schemes allows the network to generate coexisting spatio-temporal patterns of activity encoding information in different spatio-temporal spaces. As a function of the network and/or intra-unit parameters shaping the corresponding encoding modality, different forms of competition among the evoked patterns can emerge even in the absence

  6. Parallelization of TMVA Machine Learning Algorithms

    CERN Document Server

    Hajili, Mammad

    2017-01-01

    This report reflects my work on Parallelization of TMVA Machine Learning Algorithms integrated into the ROOT Data Analysis Framework during a summer internship at CERN. The report consists of four important parts: the data sets used in training and validation, the algorithms to which multiprocessing was applied, the parallelization techniques, and the resulting changes in execution time as a function of the number of workers.

  7. Noise-robust unsupervised spike sorting based on discriminative subspace learning with outlier handling.

    Science.gov (United States)

    Keshtkaran, Mohammad Reza; Yang, Zhi

    2017-06-01

    Spike sorting is a fundamental preprocessing step for many neuroscience studies which rely on the analysis of spike trains. Most of the feature extraction and dimensionality reduction techniques that have been used for spike sorting give a projection subspace which is not necessarily the most discriminative one. Therefore, the clusters which appear inherently separable in some discriminative subspace may overlap if projected using conventional feature extraction approaches, leading to a poor sorting accuracy, especially when the noise level is high. In this paper, we propose a noise-robust and unsupervised spike sorting algorithm based on learning discriminative spike features for clustering. The proposed algorithm uses discriminative subspace learning to extract low-dimensional and most discriminative features from the spike waveforms and performs clustering with automatic detection of the number of clusters. The core part of the algorithm involves iterative subspace selection using linear discriminant analysis and clustering using a Gaussian mixture model with outlier detection. A statistical test in the discriminative subspace is proposed to automatically detect the number of clusters. Comparative results on publicly available simulated and real in vivo datasets demonstrate that our algorithm achieves substantially improved cluster distinction, leading to higher sorting accuracy and more reliable detection of clusters which are highly overlapping and not detectable using conventional feature extraction techniques such as principal component analysis or wavelets. By providing more accurate information about the activity of a larger number of individual neurons with high robustness to neural noise and outliers, the proposed unsupervised spike sorting algorithm facilitates more detailed and accurate analysis of single- and multi-unit activities in neuroscience and brain machine interface studies.

  8. Noise-robust unsupervised spike sorting based on discriminative subspace learning with outlier handling

    Science.gov (United States)

    Keshtkaran, Mohammad Reza; Yang, Zhi

    2017-06-01

    Objective. Spike sorting is a fundamental preprocessing step for many neuroscience studies which rely on the analysis of spike trains. Most of the feature extraction and dimensionality reduction techniques that have been used for spike sorting give a projection subspace which is not necessarily the most discriminative one. Therefore, the clusters which appear inherently separable in some discriminative subspace may overlap if projected using conventional feature extraction approaches, leading to a poor sorting accuracy, especially when the noise level is high. In this paper, we propose a noise-robust and unsupervised spike sorting algorithm based on learning discriminative spike features for clustering. Approach. The proposed algorithm uses discriminative subspace learning to extract low-dimensional and most discriminative features from the spike waveforms and performs clustering with automatic detection of the number of clusters. The core part of the algorithm involves iterative subspace selection using linear discriminant analysis and clustering using a Gaussian mixture model with outlier detection. A statistical test in the discriminative subspace is proposed to automatically detect the number of clusters. Main results. Comparative results on publicly available simulated and real in vivo datasets demonstrate that our algorithm achieves substantially improved cluster distinction, leading to higher sorting accuracy and more reliable detection of clusters which are highly overlapping and not detectable using conventional feature extraction techniques such as principal component analysis or wavelets. Significance. By providing more accurate information about the activity of a larger number of individual neurons with high robustness to neural noise and outliers, the proposed unsupervised spike sorting algorithm facilitates more detailed and accurate analysis of single- and multi-unit activities in neuroscience and brain machine interface studies.

  9. Rotational Angles and Velocities During Down the Line and Diagonal Across Court Volleyball Spikes

    Directory of Open Access Journals (Sweden)

    Justin R. Brown

    2014-05-01

    Full Text Available The volleyball spike is an explosive movement that is frequently used to end a rally and earn a point. High velocity spikes are an important skill for a successful volleyball offense. Although the influence of vertical jump height and arm velocity on spiked ball velocity (SBV) has been investigated, little is known about the relationship of shoulder and hip angular kinematics with SBV. Other sport skills, like the baseball pitch, share similar movement patterns and suggest trunk rotation is important for such movements. The purpose of this study was to examine the relationship of both shoulder and hip angular kinematics with ball velocity during the volleyball spike. Methods: Fourteen Division I collegiate female volleyball players executed down the line (DL) and diagonally across-court (DAC) spikes in a laboratory setting to measure shoulder and hip angular kinematics and velocities. Each spike was analyzed using a 10 Camera Raptor-E Digital Real Time Camera System.  Results: DL SBV was significantly greater than DAC SBV (17.54±2.35 vs. 15.97±2.36 m/s, p<0.05).  The Shoulder Hip Separation Angle (S-HSA), Shoulder Angular Velocity (SAV), and Hip Angular Velocity (HAV) were all significantly correlated with DAC SBV. S-HSA was the most significant predictor of DAC SBV as determined by regression analysis.  Conclusions: This study provides support for a relationship between a greater S-HSA and SBV. Future research should continue to 1) examine the influence of core training exercises and rotational skill drills on SBV and 2) examine trunk angular velocities during various types of spikes during play.
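The correlation and regression analysis reported above is standard; a minimal sketch of the machinery (Pearson r and a one-predictor least-squares fit of the kind used to relate S-HSA to SBV) is given below. The numbers in the test are made up for illustration, not the study's data.

```python
import numpy as np

def pearson_r(x, y):
    # Pearson product-moment correlation between two samples
    x, y = np.asarray(x, float), np.asarray(y, float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def fit_line(x, y):
    # least-squares slope and intercept for y ~ a*x + b
    a, b = np.polyfit(x, y, 1)
    return float(a), float(b)
```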

  10. On the robustness of EC-PC spike detection method for online neural recording.

    Science.gov (United States)

    Zhou, Yin; Wu, Tong; Rastegarnia, Amir; Guan, Cuntai; Keefer, Edward; Yang, Zhi

    2014-09-30

    Online spike detection is an important step to compress neural data and perform real-time neural information decoding. An unsupervised, automatic, yet robust signal processing method is strongly desired, so that it can support a wide range of applications. We have developed a novel spike detection algorithm called "exponential component-polynomial component" (EC-PC) spike detection. We first evaluate the robustness of the EC-PC spike detector under different firing rates and SNRs. Secondly, we show that the detection precision can be quantitatively derived without requiring additional user input parameters. We have realized the algorithm (including training) into a 0.13 μm CMOS chip, where an unsupervised, nonparametric operation has been demonstrated. Both simulated data and real data are used to evaluate the method under different firing rates (FRs) and SNRs. The results show that the EC-PC spike detector is the most robust in comparison with some popular detectors. Moreover, the EC-PC detector can track changes in the background noise due to the ability to re-estimate the neural data distribution. Both real and synthesized data have been used for testing the proposed algorithm in comparison with other methods, including the absolute thresholding detector (AT), median absolute deviation detector (MAD), nonlinear energy operator detector (NEO), and continuous wavelet detector (CWD). Comparative testing results reveal that the EC-PC detection algorithm performs better than the other algorithms regardless of recording conditions. The EC-PC spike detector can be considered an unsupervised and robust online spike detection method. It is also suitable for hardware implementation. Copyright © 2014 Elsevier B.V. All rights reserved.
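The EC-PC statistics themselves are not reproduced here; as a hedged illustration, the sketch below implements two of the baseline detectors the paper compares against: an amplitude threshold set from the median absolute deviation (Quiroga's robust noise estimate) and the nonlinear energy operator. The signal, spike shape, and all parameters are synthetic placeholders.

```python
import numpy as np

rng = np.random.default_rng(1)

# synthetic trace: 1 s of unit-variance noise with three inserted "spikes"
fs = 24000
x = rng.normal(0.0, 1.0, fs)
spike_times = [2000, 9000, 17000]
for t in spike_times:
    x[t:t + 10] += 8 * np.hanning(10)   # crude positive spike bump

def mad_threshold(x, k=5.0):
    # robust noise estimate: sigma ~ median(|x|) / 0.6745
    sigma = np.median(np.abs(x)) / 0.6745
    return k * sigma

def detect(x, thr, refractory=30):
    # threshold crossings with a refractory period; returns sample indices
    out, last = [], -refractory
    for i in np.flatnonzero(x > thr):
        if i - last >= refractory:
            out.append(int(i))
            last = i
    return out

def neo(x):
    # nonlinear energy operator: psi[n] = x[n]^2 - x[n-1]*x[n+1]
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

det = detect(x, mad_threshold(x))
```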

  11. Spike-timing computation properties of a feed-forward neural network model

    Directory of Open Access Journals (Sweden)

    Drew Benjamin Sinha

    2014-01-01

    Full Text Available Brain function is characterized by dynamical interactions among networks of neurons. These interactions are mediated by network topology at many scales ranging from microcircuits to brain areas. Understanding how networks operate can be aided by understanding how the transformation of inputs depends upon network connectivity patterns, e.g. serial and parallel pathways. To tractably determine how single synapses or groups of synapses in such pathways shape transformations, we modeled feed-forward networks of 7-22 neurons in which synaptic strength changed according to a spike-timing dependent plasticity rule. We investigated how activity varied when dynamics were perturbed by an activity-dependent electrical stimulation protocol (spike-triggered stimulation; STS) in networks of different topologies and background input correlations. STS can successfully reorganize functional brain networks in vivo, but with a variability in effectiveness that may derive partially from the underlying network topology. In a simulated network with a single disynaptic pathway driven by uncorrelated background activity, structured spike-timing relationships between polysynaptically connected neurons were not observed. When background activity was correlated or parallel disynaptic pathways were added, however, robust polysynaptic spike timing relationships were observed, and application of STS yielded predictable changes in synaptic strengths and spike-timing relationships. These observations suggest that precise input-related or topologically induced temporal relationships in network activity are necessary for polysynaptic signal propagation. Such constraints for polysynaptic computation suggest potential roles for higher-order topological structure in network organization, such as maintaining polysynaptic correlation in the face of relatively weak synapses.
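The spike-timing dependent plasticity rule such models typically use can be written as an exponential pair-based window: pre-before-post pairs potentiate, post-before-pre pairs depress. The sketch below is a generic textbook form with placeholder amplitudes and time constant, not this paper's exact rule.

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    # pair-based exponential STDP window, dt = t_post - t_pre (ms):
    # dt > 0 (pre before post) potentiates, dt < 0 depresses
    return np.where(dt > 0, a_plus * np.exp(-dt / tau),
                    -a_minus * np.exp(dt / tau))

def apply_stdp(w, pre, post, w_min=0.0, w_max=1.0):
    # accumulate the weight change over all pre/post spike pairs,
    # then clip the weight to its allowed range
    dw = sum(stdp_dw(tp - tq) for tq in pre for tp in post)
    return float(np.clip(w + dw, w_min, w_max))
```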

  12. Evolving Spiking Neural Networks for Recognition of Aged Voices.

    Science.gov (United States)

    Silva, Marco; Vellasco, Marley M B R; Cataldo, Edson

    2017-01-01

    The aging of the voice, known as presbyphonia, is a natural process that can cause great change in vocal quality of the individual. This is a relevant problem to those people who use their voices professionally, and its early identification can help determine a suitable treatment to avoid its progress or even to eliminate the problem. This work focuses on the development of a new model for the identification of aging voices (independently of their chronological age), using as input attributes parameters extracted from the voice and glottal signals. The proposed model, named Quantum binary-real evolving Spiking Neural Network (QbrSNN), is based on spiking neural networks (SNNs), with an unsupervised training algorithm, and a Quantum-Inspired Evolutionary Algorithm that automatically determines the most relevant attributes and the optimal parameters that configure the SNN. The QbrSNN model was evaluated in a database composed of 120 records, containing samples from three groups of speakers. The results obtained indicate that the proposed model provides better accuracy than other approaches, with fewer input attributes. Copyright © 2017 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  13. Physics of volleyball: Spiking with a purpose

    Science.gov (United States)

    Behroozi, F.

    1998-05-01

    A few weeks ago our volleyball coach telephoned me with a problem: How high should a player jump to "spike" a "set" ball so it would clear the net and land at a known distance on the other side of the net?
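The coach's question reduces to drag-free projectile motion. As an illustration only, the sketch below assumes the ball leaves the hand horizontally (no drag, no spin) at height h and speed v, a distance d_net behind the net; the net height and all numbers in the test are placeholders, not values from the article.

```python
import math

g = 9.81        # m/s^2
H_NET = 2.43    # m, men's net height (assumption)

def clears_net(h, v, d_net):
    # ball height when it reaches the plane of the net
    t = d_net / v
    return h - 0.5 * g * t ** 2 > H_NET

def landing_distance(h, v, d_net):
    # horizontal distance past the net where the ball lands
    t = math.sqrt(2 * h / g)
    return v * t - d_net

def min_contact_height(v, d_net):
    # smallest contact height that just clears the net at this speed
    return H_NET + 0.5 * g * (d_net / v) ** 2
```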

  14. An Unsupervised Online Spike-Sorting Framework.

    Science.gov (United States)

    Knieling, Simeon; Sridharan, Kousik S; Belardinelli, Paolo; Naros, Georgios; Weiss, Daniel; Mormann, Florian; Gharabaghi, Alireza

    2016-08-01

    Extracellular neuronal microelectrode recordings can include action potentials from multiple neurons. To separate spikes from different neurons, they can be sorted according to their shape, a procedure referred to as spike-sorting. Several algorithms have been reported to solve this task. However, when clustering outcomes are unsatisfactory, most of them are difficult to adjust to achieve the desired results. We present an online spike-sorting framework that uses feature normalization and weighting to maximize the distinctiveness between different spike shapes. Furthermore, multiple criteria are applied to either facilitate or prevent cluster fusion, thereby enabling experimenters to fine-tune the sorting process. We compare our method to established unsupervised offline (Wave_Clus (WC)) and online (OSort (OS)) algorithms by examining their performance in sorting various test datasets using two different scoring systems (AMI and the Adamos metric). Furthermore, we evaluate sorting capabilities on intra-operative recordings using established quality metrics. Compared to WC and OS, our algorithm achieved comparable or higher scores on average and produced more convincing sorting results for intra-operative datasets. Thus, the presented framework is suitable for both online and offline analysis and could substantially improve the quality of microelectrode-based data evaluation for research and clinical application.
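Feature normalization and weighting of the kind the framework uses can be illustrated minimally: z-score each feature column, then scale it by an importance weight so that distinctive features dominate the clustering metric. The weighting values here are placeholders, not the authors' criteria.

```python
import numpy as np

def normalize_and_weight(F, weights):
    # F: (n_spikes, n_features) feature matrix.
    # z-score each column, then scale by a per-feature importance weight
    Fz = (F - F.mean(0)) / (F.std(0) + 1e-12)
    return Fz * np.asarray(weights, float)
```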

  15. Spike-timing theory of working memory.

    Directory of Open Access Journals (Sweden)

    Botond Szatmáry

    Full Text Available Working memory (WM) is the part of the brain's memory system that provides temporary storage and manipulation of information necessary for cognition. Although WM has limited capacity at any given time, it has vast memory content in the sense that it acts on the brain's nearly infinite repertoire of lifetime long-term memories. Using simulations, we show that large memory content and WM functionality emerge spontaneously if we take the spike-timing nature of neuronal processing into account. Here, memories are represented by extensively overlapping groups of neurons that exhibit stereotypical time-locked spatiotemporal spike-timing patterns, called polychronous patterns; and synapses forming such polychronous neuronal groups (PNGs) are subject to associative synaptic plasticity in the form of both long-term and short-term spike-timing dependent plasticity. While long-term potentiation is essential in PNG formation, we show how short-term plasticity can temporarily strengthen the synapses of selected PNGs and lead to an increase in the spontaneous reactivation rate of these PNGs. This increased reactivation rate, consistent with in vivo recordings during WM tasks, results in high interspike interval variability and irregular, yet systematically changing, elevated firing rate profiles within the neurons of the selected PNGs. Additionally, our theory explains the relationship between such slowly changing firing rates and precisely timed spikes, and it reveals a novel relationship between WM and the perception of time on the order of seconds.

  16. A compound memristive synapse model for statistical learning through STDP in spiking neural networks

    Directory of Open Access Journals (Sweden)

    Johannes eBill

    2014-12-01

    Full Text Available Memristors have recently emerged as promising circuit elements to mimic the function of biological synapses in neuromorphic computing. The fabrication of reliable nanoscale memristive synapses that feature continuous conductance changes based on the timing of pre- and postsynaptic spikes has however turned out to be challenging. In this article, we propose an alternative approach, the compound memristive synapse, that circumvents this problem by the use of memristors with binary memristive states. A compound memristive synapse employs multiple bistable memristors in parallel to jointly form one synapse, thereby providing a spectrum of synaptic efficacies. We investigate the computational implications of synaptic plasticity in the compound synapse by integrating the recently observed phenomenon of stochastic filament formation into an abstract model of stochastic switching. Using this abstract model, we first show how standard pulsing schemes give rise to spike-timing dependent plasticity (STDP) with a stabilizing weight dependence in compound synapses. In a next step, we study unsupervised learning with compound synapses in networks of spiking neurons organized in a winner-take-all architecture. Our theoretical analysis reveals that compound-synapse STDP implements generalized Expectation-Maximization in the spiking network. Specifically, the emergent synapse configuration represents the most salient features of the input distribution in a Mixture-of-Gaussians generative model. Furthermore, the network’s spike response to spiking input streams approximates a well-defined Bayesian posterior distribution. We show in computer simulations how such networks learn to represent high-dimensional distributions over images of handwritten digits with high fidelity even in presence of substantial device variations and under severe noise conditions. Therefore, the compound memristive synapse may provide a synaptic design principle for future neuromorphic architectures.
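The compound-synapse idea, many bistable devices in parallel whose ON-fraction is the synaptic efficacy, can be sketched abstractly as follows. Device count and switching probabilities are illustrative; note how a stabilizing weight dependence emerges for free, since only OFF devices can still switch ON and vice versa.

```python
import numpy as np

rng = np.random.default_rng(42)

class CompoundSynapse:
    # N bistable memristors in parallel; efficacy = fraction switched ON.
    # Pulses flip each device stochastically, an abstract stand-in for
    # stochastic filament formation.
    def __init__(self, n=64, p_on=0.1, p_off=0.1):
        self.state = np.zeros(n, dtype=bool)
        self.p_on, self.p_off = p_on, p_off

    def weight(self):
        return float(self.state.mean())

    def potentiate(self):
        # each OFF device switches ON with probability p_on
        off = np.flatnonzero(~self.state)
        self.state[off[rng.random(off.size) < self.p_on]] = True

    def depress(self):
        # each ON device switches OFF with probability p_off
        on = np.flatnonzero(self.state)
        self.state[on[rng.random(on.size) < self.p_off]] = False
```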

  17. Preparation and validation of a large size dried spike: Batch SAL-9924

    International Nuclear Information System (INIS)

    Bagliano, G.; Cappis, J.; Doubek, N.; Jammet, G.; Raab, W.; Zoigner, A.

    1989-12-01

    To determine uranium and plutonium concentration using isotope dilution mass spectrometry, weighed aliquands of a synthetic mixture containing 2 to 4 mg of Pu (with a 239Pu abundance of about 97%) and 40 to 200 mg of U (with a 235U enrichment of about 18%) can be advantageously used to spike a concentrated spent fuel solution with a high burn up and a low 235U enrichment. This will simplify the conditioning of the sample by 1) reduced time of preparation (from more than one day for the conventional technique to 2-3 hours); 2) reduced burden for the operator, while the inspector can easily witness the entire procedure (accurate dilution of the spent fuel sample before spiking being no longer necessary). Furthermore this type of spike could be used as a common spike for the operator and the inspector. The source materials are available in sufficient quantity and are sufficiently cheaper than the commonly used 233U and 242Pu or 244Pu tracers that the costs of the overall Operator-Inspector procedures will be reduced. Certified Reference Materials Pu-NBL-126, natural U-NBS-960 and 93% enriched U-NBL-116 were used to prepare a stock solution containing 1.7 mg/ml of Pu and 68 mg/ml of 17.5% enriched U. Before shipment to the Reprocessing Plant, aliquands of the stock solution must be dried to give Large Size Dried Spikes which resist shocks encountered during transportation, so that they can readily be recovered quantitatively at the plant. This paper describes the preparation and the validation of the Large Size Dried Spike. Proof of usefulness in the field will be established at a later date in parallel with analysis by the conventional technique. Refs and tabs
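The isotope-dilution arithmetic underlying such spikes is standard: from the isotope ratios (e.g. 235U/238U) measured in spike, sample, and spiked blend, one recovers the sample's amount of the reference isotope. The sketch below uses the single-IDMS equation with made-up isotope amounts, not the batch's certified values.

```python
# single isotope-dilution equation (a textbook sketch, not the plant
# procedure): n238_sample = n238_spike * (R_spike - R_blend) /
# (R_blend - R_sample), where R = n(235U) / n(238U)

def idms_amount(n_ref_spike, r_spike, r_sample, r_blend):
    # returns the sample's amount of the reference isotope (238U)
    return n_ref_spike * (r_spike - r_blend) / (r_blend - r_sample)
```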

  18. A compound memristive synapse model for statistical learning through STDP in spiking neural networks.

    Science.gov (United States)

    Bill, Johannes; Legenstein, Robert

    2014-01-01

    Memristors have recently emerged as promising circuit elements to mimic the function of biological synapses in neuromorphic computing. The fabrication of reliable nanoscale memristive synapses, that feature continuous conductance changes based on the timing of pre- and postsynaptic spikes, has however turned out to be challenging. In this article, we propose an alternative approach, the compound memristive synapse, that circumvents this problem by the use of memristors with binary memristive states. A compound memristive synapse employs multiple bistable memristors in parallel to jointly form one synapse, thereby providing a spectrum of synaptic efficacies. We investigate the computational implications of synaptic plasticity in the compound synapse by integrating the recently observed phenomenon of stochastic filament formation into an abstract model of stochastic switching. Using this abstract model, we first show how standard pulsing schemes give rise to spike-timing dependent plasticity (STDP) with a stabilizing weight dependence in compound synapses. In a next step, we study unsupervised learning with compound synapses in networks of spiking neurons organized in a winner-take-all architecture. Our theoretical analysis reveals that compound-synapse STDP implements generalized Expectation-Maximization in the spiking network. Specifically, the emergent synapse configuration represents the most salient features of the input distribution in a Mixture-of-Gaussians generative model. Furthermore, the network's spike response to spiking input streams approximates a well-defined Bayesian posterior distribution. We show in computer simulations how such networks learn to represent high-dimensional distributions over images of handwritten digits with high fidelity even in presence of substantial device variations and under severe noise conditions. Therefore, the compound memristive synapse may provide a synaptic design principle for future neuromorphic architectures.

  19. Transformation-invariant visual representations in self-organizing spiking neural networks.

    Science.gov (United States)

    Evans, Benjamin D; Stringer, Simon M

    2012-01-01

    The ventral visual pathway achieves object and face recognition by building transformation-invariant representations from elementary visual features. In previous computer simulation studies with rate-coded neural networks, the development of transformation-invariant representations has been demonstrated using either of two biologically plausible learning mechanisms, Trace learning and Continuous Transformation (CT) learning. However, it has not previously been investigated how transformation-invariant representations may be learned in a more biologically accurate spiking neural network. A key issue is how the synaptic connection strengths in such a spiking network might self-organize through Spike-Time Dependent Plasticity (STDP) where the change in synaptic strength is dependent on the relative times of the spikes emitted by the presynaptic and postsynaptic neurons rather than simply correlated activity driving changes in synaptic efficacy. Here we present simulations with conductance-based integrate-and-fire (IF) neurons using a STDP learning rule to address these gaps in our understanding. It is demonstrated that with the appropriate selection of model parameters and training regime, the spiking network model can utilize either Trace-like or CT-like learning mechanisms to achieve transform-invariant representations.

  20. Transform-invariant visual representations in self-organizing spiking neural networks

    Directory of Open Access Journals (Sweden)

    Benjamin eEvans

    2012-07-01

    Full Text Available The ventral visual pathway achieves object and face recognition by building transform-invariant representations from elementary visual features. In previous computer simulation studies with rate-coded neural networks, the development of transform-invariant representations has been demonstrated using either of two biologically plausible learning mechanisms, Trace learning and Continuous Transformation (CT) learning. However, it has not previously been investigated how transform-invariant representations may be learned in a more biologically accurate spiking neural network. A key issue is how the synaptic connection strengths in such a spiking network might self-organize through Spike-Time Dependent Plasticity (STDP) where the change in synaptic strength is dependent on the relative times of the spikes emitted by the pre- and postsynaptic neurons rather than simply correlated activity driving changes in synaptic efficacy. Here we present simulations with conductance-based integrate-and-fire (IF) neurons using a STDP learning rule to address these gaps in our understanding. It is demonstrated that with the appropriate selection of model parameters and training regime, the spiking network model can utilize either Trace-like or CT-like learning mechanisms to achieve transform-invariant representations.

  1. Mechanical design of a free-wheel clutch for the thermal engine of a parallel hybrid vehicle with thermal and electrical power-train; Conception mecanique d'un accouplement a roue libre pour le moteur thermique d'un vehicule hybride parallele thermique et electrique

    Energy Technology Data Exchange (ETDEWEB)

    Santin, J.J.

    2001-07-01

    This thesis deals with the design of a free-wheel clutch. This unit is intended to replace the automated dry single-plate clutch of a parallel hybrid car with thermal and electric power-train. Furthermore, the car is a single shaft zero emission vehicle fitted with a controlled gearbox. Chapter one focuses on the type of hybrid vehicle studied. It shows the need to isolate the engine from the rest of the drive train, depending on the driving conditions. Chapter two presents and compares the two alternatives: automated clutch and free-wheel. In order to develop the free-wheel option, the torsional vibrations in the automotive drive line had to be closely studied. It required the design of a specific modular tool, as presented in chapter three, with the help of MATLAB SIMULINK. Lastly, chapter four shows how this tool was used during the design stage and specifies the way to build it. The free-wheel is then to be fitted to a prototype hybrid vehicle, constructed by both the LAMIH and PSA. (author)

  2. Spiking Neural P Systems with Communication on Request.

    Science.gov (United States)

    Pan, Linqiang; Păun, Gheorghe; Zhang, Gexiang; Neri, Ferrante

    2017-12-01

    Spiking Neural P Systems are Neural System models characterized by the fact that each neuron mimics a biological cell and the communication between neurons is based on spikes. In the Spiking Neural P systems investigated so far, the application of evolution rules depends on the contents of a neuron (checked by means of a regular expression). In these P systems, a specified number of spikes are consumed and a specified number of spikes are produced, and then sent to each of the neurons linked by a synapse to the evolving neuron. In the present work, a novel communication strategy among neurons of Spiking Neural P Systems is proposed. In the resulting models, called Spiking Neural P Systems with Communication on Request, the spikes are requested from neighboring neurons, depending on the contents of the neuron (still checked by means of a regular expression). Unlike the traditional Spiking Neural P systems, no spikes are consumed or created: the spikes are only moved along synapses and replicated (when two or more neurons request the contents of the same neuron). The Spiking Neural P Systems with Communication on Request are proved to be computationally universal, that is, equivalent to Turing machines as long as two types of spikes are used. Following this work, further research questions are listed as open problems.
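The communication-on-request semantics can be illustrated with a toy single-requester step: spike contents are encoded as unary strings checked by a regular expression, and spikes are moved along a synapse, never consumed or created. Replication among several simultaneous requesters is omitted for brevity, and all names here are invented.

```python
import re

def step(spikes, rules):
    # spikes: {neuron: spike count}.
    # rules: list of (neuron, regex, source, how_many). A neuron whose
    # contents (as a unary string, "aaa" = 3 spikes) matches its regex
    # requests how_many spikes from its source neighbor; spikes are
    # moved, not consumed or created. Single requester per source only.
    requests = [(n, src, k) for (n, rx, src, k) in rules
                if re.fullmatch(rx, "a" * spikes[n])]
    new = dict(spikes)
    for n, src, k in requests:
        k = min(k, spikes[src])   # cannot request more than the source holds
        new[src] -= k             # moved along the synapse...
        new[n] += k               # ...into the requesting neuron
    return new

# n1 holds exactly two spikes, so its rule fires and pulls 3 from n2
s = step({"n1": 2, "n2": 5}, [("n1", "aa", "n2", 3)])
```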

  3. The stochastic properties of input spike trains control neuronal arithmetic

    Czech Academy of Sciences Publication Activity Database

    Bureš, Zbyněk

    2012-01-01

    Roč. 106, č. 2 (2012), s. 111-122 ISSN 0340-1200 R&D Projects: GA ČR(CZ) GAP303/12/1347; GA ČR(CZ) GAP304/12/1342; GA ČR(CZ) GBP304/12/G069 Grant - others:GA MŠk(CZ) M00176 Institutional research plan: CEZ:AV0Z50390512 Institutional support: RVO:68378041 Keywords : aerosol * simulation of human breathing * porcine lung equivalent Subject RIV: ED - Physiology Impact factor: 2.067, year: 2012

  4. Real-time classification and sensor fusion with a spiking deep belief network.

    Science.gov (United States)

    O'Connor, Peter; Neil, Daniel; Liu, Shih-Chii; Delbruck, Tobi; Pfeiffer, Michael

    2013-01-01

    Deep Belief Networks (DBNs) have recently shown impressive performance on a broad range of classification problems. Their generative properties allow better understanding of the performance, and provide a simpler solution for sensor fusion tasks. However, because of their inherent need for feedback and parallel update of large numbers of units, DBNs are expensive to implement on serial computers. This paper proposes a method based on the Siegert approximation for Integrate-and-Fire neurons to map an offline-trained DBN onto an efficient event-driven spiking neural network suitable for hardware implementation. The method is demonstrated in simulation and by a real-time implementation of a 3-layer network with 2694 neurons used for visual classification of MNIST handwritten digits with input from a 128 × 128 Dynamic Vision Sensor (DVS) silicon retina, and sensory-fusion using additional input from a 64-channel AER-EAR silicon cochlea. The system is implemented through the open-source software in the jAER project and runs in real-time on a laptop computer. It is demonstrated that the system can recognize digits in the presence of distractions, noise, scaling, translation and rotation, and that the degradation of recognition performance by using an event-based approach is less than 1%. Recognition is achieved in an average of 5.8 ms after the onset of the presentation of a digit. By cue integration from both silicon retina and cochlea outputs we show that the system can be biased to select the correct digit from otherwise ambiguous input.
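The Siegert approximation itself is involved; as a much simpler stand-in, the sketch below uses the closed-form f-I curve of a noise-free leaky integrate-and-fire neuron, which is the kind of rate function an offline unit's activation gets matched against in such mappings. The time constants are assumptions, not the paper's values.

```python
import math

def lif_rate(i, tau_m=0.02, t_ref=0.002, v_th=1.0):
    # firing rate (Hz) of a noise-free LIF neuron under constant input i
    # (expressed in units of the threshold); zero below threshold
    if i <= v_th:
        return 0.0
    return 1.0 / (t_ref + tau_m * math.log(i / (i - v_th)))
```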

  5. Design of Spiking Central Pattern Generators for Multiple Locomotion Gaits in Hexapod Robots by Christiansen Grammar Evolution.

    Science.gov (United States)

    Espinal, Andres; Rostro-Gonzalez, Horacio; Carpio, Martin; Guerra-Hernandez, Erick I; Ornelas-Rodriguez, Manuel; Sotelo-Figueroa, Marco

    2016-01-01

    This paper presents a method to design Spiking Central Pattern Generators (SCPGs) to achieve locomotion at different frequencies on legged robots. It is validated by embedding its designs into a Field-Programmable Gate Array (FPGA) and implementing them on a real hexapod robot. The SCPGs are automatically designed by means of a Christiansen Grammar Evolution (CGE)-based methodology. The CGE evolves a configuration (synaptic weights and connections) for each neuron in the SCPG. This is carried out through the indirect representation of candidate solutions that evolve to replicate a specific spike train for a locomotion pattern (gait); the similarity between spike trains is measured with the SPIKE distance, which leads the search to a correct configuration. By using this evolutionary approach, several SCPG design specifications can be explicitly added into the SPIKE distance-based fitness function, such as looking for Spiking Neural Networks (SNNs) with minimal connectivity or a Central Pattern Generator (CPG) able to generate different locomotion gaits only by changing the initial input stimuli. The SCPG designs have been successfully implemented on a Spartan 6 FPGA board and a real-time validation on a 12 Degrees Of Freedom (DOF) hexapod robot is presented.
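The SPIKE distance itself is not reproduced here; as a hedged stand-in, the sketch implements the van Rossum spike-train distance, a simpler metric that plays the same fitness role of scoring how closely a candidate spike train matches a target one.

```python
import numpy as np

def van_rossum(t1, t2, tau=10.0, dt=0.1, T=200.0):
    # van Rossum distance: convolve each spike train with a causal
    # exponential kernel of time constant tau, then take the L2 norm
    # of the difference of the filtered signals (discretized on [0, T))
    ts = np.arange(0.0, T, dt)

    def filtered(train):
        f = np.zeros_like(ts)
        for s in train:
            m = ts >= s
            f[m] += np.exp(-(ts[m] - s) / tau)
        return f

    diff = filtered(t1) - filtered(t2)
    return float(np.sqrt(np.sum(diff ** 2) * dt / tau))
```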

  6. Adaptive robotic control driven by a versatile spiking cerebellar network.

    Directory of Open Access Journals (Sweden)

    Claudia Casellato

    Full Text Available The cerebellum is involved in a large number of different neural processes, especially in associative learning and in fine motor control. To develop a comprehensive theory of sensorimotor learning and control, it is crucial to determine the neural basis of coding and plasticity embedded into the cerebellar neural circuit and how they are translated into behavioral outcomes in learning paradigms. Learning has to be inferred from the interaction of an embodied system with its real environment, and the same cerebellar principles derived from cell physiology have to be able to drive a variety of tasks of different nature, calling for complex timing and movement patterns. We have coupled a realistic cerebellar spiking neural network (SNN) with a real robot and challenged it in multiple diverse sensorimotor tasks. Encoding and decoding strategies based on neuronal firing rates were applied. Adaptive motor control protocols with acquisition and extinction phases have been designed and tested, including an associative Pavlovian task (Eye blinking classical conditioning), a vestibulo-ocular task and a perturbed arm reaching task operating in closed-loop. The SNN processed in real-time mossy fiber inputs as arbitrary contextual signals, irrespective of whether they conveyed a tone, a vestibular stimulus or the position of a limb. A bidirectional long-term plasticity rule implemented at parallel fibers-Purkinje cell synapses modulated the output activity in the deep cerebellar nuclei. In all tasks, the neurorobot learned to adjust timing and gain of the motor responses by tuning its output discharge. It succeeded in reproducing how human biological systems acquire, extinguish and express knowledge of a noisy and changing world. By varying stimuli and perturbation patterns, real-time control robustness and generalizability were validated. The implicit spiking dynamics of the cerebellar model fulfill timing, prediction and learning functions.

  7. Adaptive robotic control driven by a versatile spiking cerebellar network.

    Science.gov (United States)

    Casellato, Claudia; Antonietti, Alberto; Garrido, Jesus A; Carrillo, Richard R; Luque, Niceto R; Ros, Eduardo; Pedrocchi, Alessandra; D'Angelo, Egidio

    2014-01-01

    The cerebellum is involved in a large number of different neural processes, especially in associative learning and in fine motor control. To develop a comprehensive theory of sensorimotor learning and control, it is crucial to determine the neural basis of coding and plasticity embedded into the cerebellar neural circuit and how they are translated into behavioral outcomes in learning paradigms. Learning has to be inferred from the interaction of an embodied system with its real environment, and the same cerebellar principles derived from cell physiology have to be able to drive a variety of tasks of different nature, calling for complex timing and movement patterns. We have coupled a realistic cerebellar spiking neural network (SNN) with a real robot and challenged it in multiple diverse sensorimotor tasks. Encoding and decoding strategies based on neuronal firing rates were applied. Adaptive motor control protocols with acquisition and extinction phases have been designed and tested, including an associative Pavlovian task (Eye blinking classical conditioning), a vestibulo-ocular task and a perturbed arm reaching task operating in closed-loop. The SNN processed in real-time mossy fiber inputs as arbitrary contextual signals, irrespective of whether they conveyed a tone, a vestibular stimulus or the position of a limb. A bidirectional long-term plasticity rule implemented at parallel fibers-Purkinje cell synapses modulated the output activity in the deep cerebellar nuclei. In all tasks, the neurorobot learned to adjust timing and gain of the motor responses by tuning its output discharge. It succeeded in reproducing how human biological systems acquire, extinguish and express knowledge of a noisy and changing world. By varying stimuli and perturbations patterns, real-time control robustness and generalizability were validated. The implicit spiking dynamics of the cerebellar model fulfill timing, prediction and learning functions.

  8. Influence of soil γ-irradiation and spiking on sorption of p,p'-DDE and soil organic matter chemistry.

    Science.gov (United States)

    Škulcová, Lucia; Scherr, Kerstin E; Chrást, Lukáš; Hofman, Jakub; Bielská, Lucie

    2018-07-15

    The fate of organic chemicals and their metabolites in soils is often investigated in model matrices having undergone various pre-treatment steps that may qualitatively or quantitatively interfere with the results. Presently, effects associated with soil sterilization by γ-irradiation and soil spiking using an organic solvent were studied in one freshly spiked soil (sterilization prior to contamination) and its field-contaminated (sterilization after contamination) counterpart for the model organic compound 1,1-Dichloro-2,2-bis(4-chlorophenyl)ethene (p,p'-DDE). Changes in the sorption and potential bioavailability of spiked and native p,p'-DDE were measured by supercritical fluid extraction (SFE), XAD-assisted extraction (XAD), and solid-phase microextraction (SPME) and linked to qualitative changes in soil organic matter (SOM) chemistry measured by diffuse reflectance infrared Fourier-transform (DRIFT) spectroscopy. Reduced sorption of p,p´-DDE detected with XAD and SPME was associated more clearly with spiking than with sterilization, but SFE showed a negligible impact. Spiking resulted in an increase of the DRIFT-derived hydrophobicity index, but irradiation did not. Spectral peak height ratio descriptors indicated increasing hydrophobicity and hydrophilicity in pristine soil following sterilization, and a greater reduction of hydrophobic over hydrophilic groups as a consequence of spiking. In parallel, reduced sorption of p,p´-DDE upon spiking was observed. Based on the present samples, γ-irradiation appears to alter soil sorptive properties to a lesser extent when compared to common laboratory processes such as spiking with organic solvents. Copyright © 2018. Published by Elsevier Inc.

  9. Parallel Programming with Intel Parallel Studio XE

    CERN Document Server

    Blair-Chappell , Stephen

    2012-01-01

    Optimize code for multi-core processors with Intel's Parallel Studio Parallel programming is rapidly becoming a "must-know" skill for developers. Yet, where to start? This teach-yourself tutorial is an ideal starting point for developers who already know Windows C and C++ and are eager to add parallelism to their code. With a focus on applying tools, techniques, and language extensions to implement parallelism, this essential resource teaches you how to write programs for multicore and leverage the power of multicore in your programs. Sharing hands-on case studies and real-world examples, the

  10. Dynamic evolving spiking neural networks for on-line spatio- and spectro-temporal pattern recognition.

    Science.gov (United States)

    Kasabov, Nikola; Dhoble, Kshitij; Nuntalid, Nuttapod; Indiveri, Giacomo

    2013-05-01

    On-line learning and recognition of spatio- and spectro-temporal data (SSTD) is a very challenging task and an important one for the future development of autonomous machine learning systems with broad applications. Models based on spiking neural networks (SNN) have already proved their potential in capturing spatial and temporal data. One class of them, the evolving SNN (eSNN), uses a one-pass rank-order learning mechanism and a strategy to evolve a new spiking neuron and new connections to learn new patterns from incoming data. So far these networks have been mainly used for fast image and speech frame-based recognition. Alternative spike-time learning methods, such as Spike-Timing Dependent Plasticity (STDP) and its variant Spike Driven Synaptic Plasticity (SDSP), can also be used to learn spatio-temporal representations, but they usually require many iterations in an unsupervised or semi-supervised mode of learning. This paper introduces a new class of eSNN, dynamic eSNN, that utilise both rank-order learning and dynamic synapses to learn SSTD in a fast, on-line mode. The paper also introduces a new model called deSNN, that utilises rank-order learning and SDSP spike-time learning in unsupervised, supervised, or semi-supervised modes. The SDSP learning is used to evolve dynamically the network changing connection weights that capture spatio-temporal spike data clusters both during training and during recall. The new deSNN model is first illustrated on simple examples and then applied on two case study applications: (1) moving object recognition using address-event representation (AER) with data collected using a silicon retina device; (2) EEG SSTD recognition for brain-computer interfaces. The deSNN models resulted in a superior performance in terms of accuracy and speed when compared with other SNN models that use either rank-order or STDP learning. 
The reason is that the deSNN makes use of both the information contained in the order of the first input spikes
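
To make the rank-order mechanism concrete, here is a minimal sketch of how an eSNN-style rule assigns initial synaptic weights from the order in which the first input spikes arrive. This is not code from the paper; the modulation factor `mod` and the function name are illustrative:

```python
import numpy as np

def rank_order_weights(first_spike_times, mod=0.8):
    """eSNN-style rank-order initialisation: the input that fires first
    receives weight mod**0 = 1, the second mod**1, and so on (0 < mod < 1),
    so earlier spikes contribute more strongly to the new neuron."""
    ranks = np.argsort(np.argsort(first_spike_times))  # rank of each input
    return mod ** ranks.astype(float)
```

For first-spike times [3.0, 1.0, 2.0] ms this yields weights of roughly [0.64, 1.0, 0.8]: the earliest input gets the largest weight, which is what makes one-pass learning from the first spikes possible.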

  11. Grain price spikes and beggar-thy-neighbor policy responses

    DEFF Research Database (Denmark)

    Jensen, Hans Grinsted; Anderson, Kym

    2017-01-01

    When prices spike in international grain markets, national governments often reduce the extent to which that spike affects their domestic food markets. Those actions exacerbate the price spike and international welfare transfer associated with that terms of trade change. Several recent analyses...

  12. Barbed micro-spikes for micro-scale biopsy

    Science.gov (United States)

    Byun, Sangwon; Lim, Jung-Min; Paik, Seung-Joon; Lee, Ahra; Koo, Kyo-in; Park, Sunkil; Park, Jaehong; Choi, Byoung-Doo; Seo, Jong Mo; Kim, Kyung-ah; Chung, Hum; Song, Si Young; Jeon, Doyoung; Cho, Dongil

    2005-06-01

    Single-crystal silicon planar micro-spikes with protruding barbs are developed for micro-scale biopsy and the feasibility of using the micro-spike as a micro-scale biopsy tool is evaluated for the first time. The fabrication process utilizes a deep silicon etch to define the micro-spike outline, resulting in protruding barbs of various shapes. Shanks of the fabricated micro-spikes are 3 mm long, 100 µm thick and 250 µm wide. Barbs protruding from micro-spike shanks facilitate the biopsy procedure by tearing off and retaining samples from target tissues. Micro-spikes with barbs successfully extracted tissue samples from the small intestines of the anesthetized pig, whereas micro-spikes without barbs failed to obtain a biopsy sample. Parylene coating can be applied to improve the biocompatibility of the micro-spike without deteriorating the biopsy function of the micro-spike. In addition, to show that the biopsy with the micro-spike can be applied to tissue analysis, samples obtained by micro-spikes were examined using immunofluorescent staining. Nuclei and F-actin of cells which are extracted by the micro-spike from a transwell were clearly visualized by immunofluorescent staining.

  13. The Mutation Frequency in Different Spike Categories in Barley

    DEFF Research Database (Denmark)

    Frydenberg, O.; Doll, Hans; Sandfær, J.

    1964-01-01

    After gamma irradiation of barley seeds, a comparison has been made between the chlorophyll-mutant frequencies in X1 spikes that had multicellular bud meristems in the seeds at the time of treatment (denoted as pre-formed spikes) and X1 spikes having no recognizable meristems at the time...

  14. Error-backpropagation in temporally encoded networks of spiking neurons

    NARCIS (Netherlands)

    S.M. Bohte (Sander); J.A. La Poutré (Han); J.N. Kok (Joost)

    2000-01-01

    For a network of spiking neurons that encodes information in the timing of individual spike-times, we derive a supervised learning rule, SpikeProp, akin to traditional error-backpropagation, and show how to overcome the discontinuities introduced by thresholding. With this algorithm,

  15. A real-time spike sorting method based on the embedded GPU.

    Science.gov (United States)

    Zelan Yang; Kedi Xu; Xiang Tian; Shaomin Zhang; Xiaoxiang Zheng

    2017-07-01

    Microelectrode arrays with hundreds of channels have been widely used to acquire neuron population signals in neuroscience studies. Online spike sorting is becoming one of the most important challenges for high-throughput neural signal acquisition systems. Graphics processing units (GPUs), with their high parallel computing capability, may provide an alternative solution for meeting the increasing real-time computational demands of spike sorting. This study reported a method of real-time spike sorting implemented through the Compute Unified Device Architecture (CUDA) on an embedded GPU (NVIDIA JETSON Tegra K1, TK1). The sorting approach is based on principal component analysis (PCA) and K-means. By analyzing the parallelism of each processing step, the method was further optimized within the thread-memory model of the GPU. Our results showed that the GPU-based classifier on the TK1 is 37.92 times faster than the MATLAB-based classifier on a PC, while achieving the same accuracy. The high-performance computing features of embedded GPUs demonstrated in our studies suggest that they provide a promising platform for real-time neural signal processing.
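
The PCA-plus-K-means pipeline described above can be sketched in a few lines of NumPy. This is a generic CPU illustration of the approach; the cluster count, component count and the farthest-point initialisation are illustrative choices, not details from the paper:

```python
import numpy as np

def sort_spikes(waveforms, n_clusters=2, n_components=2, n_iter=50):
    """Cluster aligned spike waveforms into putative units via PCA + K-means.

    waveforms: (n_spikes, n_samples) array of detected spike snippets.
    Returns (labels, pca_features).
    """
    # PCA via SVD of the mean-centred waveform matrix.
    centred = waveforms - waveforms.mean(axis=0)
    _, _, vt = np.linalg.svd(centred, full_matrices=False)
    features = centred @ vt[:n_components].T        # (n_spikes, n_components)
    # Deterministic farthest-point initialisation, then Lloyd iterations.
    centroids = [features[0]]
    for _ in range(1, n_clusters):
        d = np.min([np.linalg.norm(features - c, axis=1) for c in centroids],
                   axis=0)
        centroids.append(features[d.argmax()])
    centroids = np.array(centroids)
    for _ in range(n_iter):
        dists = np.linalg.norm(features[:, None, :] - centroids[None], axis=2)
        labels = dists.argmin(axis=1)
        for k in range(n_clusters):
            if np.any(labels == k):
                centroids[k] = features[labels == k].mean(axis=0)
    return labels, features
```

The per-spike projections and distance computations are independent of one another, which is exactly the kind of data parallelism a CUDA implementation can exploit across GPU threads.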

  16. Hardware implementation of stochastic spiking neural networks.

    Science.gov (United States)

    Rosselló, Josep L; Canals, Vincent; Morro, Antoni; Oliver, Antoni

    2012-08-01

    Spiking neural networks, the latest generation of artificial neural networks, are characterized by their bio-inspired nature and by a higher computational capacity than other neural models. In real biological neurons, stochastic processes represent an important mechanism of neural behavior and are responsible for their special arithmetic capabilities. In this work we present a simple hardware implementation of spiking neurons that takes this probabilistic nature into account. The advantage of the proposed implementation is that it is fully digital and therefore can be massively implemented in field-programmable gate arrays. The high computational capabilities of the proposed model are demonstrated by the study of both feed-forward and recurrent networks that are able to implement high-speed signal filtering and to solve complex systems of linear equations.

  17. Evolving spiking networks with variable resistive memories.

    Science.gov (United States)

    Howard, Gerard; Bull, Larry; de Lacy Costello, Ben; Gale, Ella; Adamatzky, Andrew

    2014-01-01

    Neuromorphic computing is a brainlike information processing paradigm that requires adaptive learning mechanisms. A spiking neuro-evolutionary system is used for this purpose; plastic resistive memories are implemented as synapses in spiking neural networks. The evolutionary design process exploits parameter self-adaptation and allows the topology and synaptic weights to be evolved for each network in an autonomous manner. Variable resistive memories are the focus of this research; each synapse has its own conductance profile which modifies the plastic behaviour of the device and may be altered during evolution. These variable resistive networks are evaluated on a noisy robotic dynamic-reward scenario against two static resistive memories and a system containing standard connections only. The results indicate that the extra behavioural degrees of freedom available to the networks incorporating variable resistive memories enable them to outperform the comparative synapse types.

  18. Visualizing spikes in source-space

    DEFF Research Database (Denmark)

    Beniczky, Sándor; Duez, Lene; Scherg, Michael

    2016-01-01

    OBJECTIVE: Reviewing magnetoencephalography (MEG) recordings is time-consuming: signals from the 306 MEG-sensors are typically reviewed divided into six arrays of 51 sensors each, thus browsing each recording six times in order to evaluate all signals. A novel method of reconstructing the MEG...... signals in source-space was developed using a source-montage of 29 brain-regions and two spatial components to remove magnetocardiographic (MKG) artefacts. Our objective was to evaluate the accuracy of reviewing MEG in source-space. METHODS: In 60 consecutive patients with epilepsy, we prospectively...... evaluated the accuracy of reviewing the MEG signals in source-space as compared to the classical method of reviewing them in sensor-space. RESULTS: All 46 spike-clusters identified in sensor-space were also identified in source-space. Two additional spike-clusters were identified in source-space. As 29...

  19. Spiked instantons from intersecting D-branes

    Directory of Open Access Journals (Sweden)

    Nikita Nekrasov

    2017-01-01

    The moduli space of spiked instantons that arises in the context of the BPS/CFT correspondence [22] is realised as the moduli space of classical vacua, i.e. low-energy open string field configurations, of a certain stack of intersecting D1-branes and D5-branes in Type IIB string theory. The presence of a constant B-field induces an interesting dynamics involving the tachyon condensation.

  20. Stochastic synchronization in finite size spiking networks

    Science.gov (United States)

    Doiron, Brent; Rinzel, John; Reyes, Alex

    2006-09-01

    We study a stochastic synchronization of spiking activity in feedforward networks of integrate-and-fire model neurons. A stochastic mean field analysis shows that synchronization occurs only when the network size is sufficiently small. This gives evidence that the dynamics, and hence processing, of finite size populations can be drastically different from that observed in the infinite size limit. Our results agree with experimentally observed synchrony in cortical networks, and further strengthen the link between synchrony and propagation in cortical systems.
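
A minimal simulation of one layer of such a feedforward network illustrates the setup: uncoupled integrate-and-fire neurons driven by a mixture of shared and independent noise, whose spike raster can then be inspected for synchrony. The parameter values and the shared/private noise split are illustrative, not taken from the study:

```python
import numpy as np

def simulate_lif_layer(n_neurons=50, t_steps=2000, dt=0.1, tau=10.0,
                       v_thresh=1.0, v_reset=0.0, shared_frac=0.5, seed=0):
    """Euler simulation of N uncoupled leaky integrate-and-fire neurons
    driven by shared plus independent Gaussian noise.  Returns a
    (t_steps, n_neurons) boolean spike raster."""
    rng = np.random.default_rng(seed)
    v = np.zeros(n_neurons)
    raster = np.zeros((t_steps, n_neurons), dtype=bool)
    for t in range(t_steps):
        common = rng.standard_normal()             # input shared by all cells
        private = rng.standard_normal(n_neurons)   # independent per-cell input
        drive = 0.15 + 0.5 * (shared_frac * common
                              + (1 - shared_frac) * private)
        v += dt * (-v / tau + drive)               # leaky integration
        spiked = v >= v_thresh
        raster[t] = spiked
        v[spiked] = v_reset                        # reset after a spike
    return raster
```

Sweeping `n_neurons` and `shared_frac` in a sketch like this is one way to probe how population size and input correlation shape the synchrony the abstract describes.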

  1. Non-singular spiked harmonic oscillator

    International Nuclear Information System (INIS)

    Aguilera-Navarro, V.C.; Guardiola, R.

    1990-01-01

    A perturbative study of a class of non-singular spiked harmonic oscillators defined by the Hamiltonian H = -d^2/dr^2 + r^2 + λ/r^α on the domain [0,∞) is carried out, in the two extremes of the weak coupling and strong coupling regimes. A path has been found to connect both expansions for α near 2. (author)
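
Reading the abstract's operator as H = -d^2/dr^2 + r^2 + λ/r^α with a Dirichlet condition at r = 0, the lowest eigenvalue can be checked numerically by a finite-difference discretisation. This is an independent sketch, not the perturbative method of the paper:

```python
import numpy as np

def ground_state_energy(lam, alpha, r_max=10.0, n=1000):
    """Lowest eigenvalue of H = -d^2/dr^2 + r^2 + lam / r**alpha on (0, r_max]
    with Dirichlet boundary conditions, via second-order finite differences."""
    h = r_max / (n + 1)
    r = h * np.arange(1, n + 1)                 # interior grid points
    diag = 2.0 / h**2 + r**2 + lam / r**alpha   # potential + FD diagonal
    off = np.full(n - 1, -1.0 / h**2)           # FD off-diagonal
    H = np.diag(diag) + np.diag(off, 1) + np.diag(off, -1)
    return float(np.linalg.eigvalsh(H)[0])
```

For λ = 0 the boundary condition at r = 0 selects the odd harmonic-oscillator states, so the lowest eigenvalue approaches 3; any positive spike term λ/r^α raises it.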

  2. The transfer function of neuron spike.

    Science.gov (United States)

    Palmieri, Igor; Monteiro, Luiz H A; Miranda, Maria D

    2015-08-01

    The mathematical modeling of neuronal signals is a relevant problem in neuroscience. The complexity of the neuron behavior, however, makes this problem a particularly difficult task. Here, we propose a discrete-time linear time-invariant (LTI) model with a rational function in order to represent the neuronal spike detected by an electrode located in the surroundings of the nerve cell. The model is presented as a cascade association of two subsystems: one that generates an action potential from an input stimulus, and one that represents the medium between the cell and the electrode. The suggested approach employs system identification and signal processing concepts, and is dissociated from any considerations about the biophysical processes of the neuronal cell, providing a low-complexity alternative to model the neuronal spike. The model is validated by using in vivo experimental readings of intracellular and extracellular signals. A computational simulation of the model is presented in order to assess its proximity to the neuronal signal and to observe the variability of the estimated parameters. The implications of the results are discussed in the context of spike sorting. Copyright © 2015 Elsevier Ltd. All rights reserved.
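
A cascade of two rational discrete-time subsystems of the kind described can be sketched as follows. The filter coefficients are purely illustrative placeholders (a resonant second-order "cell" stage and a lossy first-order "medium" stage), not the identified parameters from the paper:

```python
import numpy as np

def iir_filter(b, a, x):
    """Direct-form difference equation for a rational transfer function:
    y[n] = (sum_k b[k] x[n-k] - sum_{k>=1} a[k] y[n-k]) / a[0]."""
    y = np.zeros_like(x, dtype=float)
    for n in range(len(x)):
        acc = sum(b[k] * x[n - k] for k in range(len(b)) if n - k >= 0)
        acc -= sum(a[k] * y[n - k] for k in range(1, len(a)) if n - k >= 0)
        y[n] = acc / a[0]
    return y

# Hypothetical coefficients: damped-resonance "action potential" stage
# cascaded with a first-order "extracellular medium" stage.
b_cell, a_cell = [1.0], [1.0, -1.8, 0.85]   # stable poles at 0.9 +/- 0.2j
b_med, a_med = [0.3], [1.0, -0.7]           # lossy attenuating medium

stimulus = np.zeros(100)
stimulus[5] = 1.0                           # impulse input at sample 5
spike = iir_filter(b_med, a_med, iir_filter(b_cell, a_cell, stimulus))
```

The impulse response of the cascade is a damped oscillation, which is the sense in which a low-order LTI model can mimic an extracellularly recorded spike waveform.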

  3. Basalt FRP Spike Repairing of Wood Beams

    Directory of Open Access Journals (Sweden)

    Luca Righetti

    2015-08-01

    This article describes aspects of an experimental program aimed at improving the structural performance of cracked solid fir-wood beams repaired with Basalt Fiber Reinforced Polymer (BFRP) spikes. Fir wood is characterized by its low density, low compression strength, and high level of defects; it is likely to distort when dried and tends to fail under tension due to the presence of cracks, knots, or grain deviation. The proposed repair technique consists of the insertion of BFRP spikes into timber beams to restore the continuity of cracked sections. The experimental effort deals with the evaluation of the bending strength and deformation properties of 24 timber beams. Artificial cracking was simulated by cutting the wood beams in half or notching them. The results obtained for the repaired beams were compared with those of solid undamaged and damaged beams; increases in beam capacity, bending strength, and modulus of elasticity were discussed, together with an analysis of the failure modes. For notched beams, the application of the BFRP spikes was able to restore the original bending capacity of undamaged beams, while only a small part of the original capacity was recovered for beams that were cut in half.

  4. A network of spiking neurons that can represent interval timing: mean field analysis.

    Science.gov (United States)

    Gavornik, Jeffrey P; Shouval, Harel Z

    2011-04-01

    Despite the vital importance of our ability to accurately process and encode temporal information, the underlying neural mechanisms are largely unknown. We have previously described a theoretical framework that explains how temporal representations, similar to those reported in the visual cortex, can form in locally recurrent cortical networks as a function of reward modulated synaptic plasticity. This framework allows networks of both linear and spiking neurons to learn the temporal interval between a stimulus and paired reward signal presented during training. Here we use a mean field approach to analyze the dynamics of non-linear stochastic spiking neurons in a network trained to encode specific time intervals. This analysis explains how recurrent excitatory feedback allows a network structure to encode temporal representations.

  5. Principal cell spiking, postsynaptic excitation, and oxygen consumption in the rat cerebellar cortex

    DEFF Research Database (Denmark)

    Thomsen, Kirsten; Piilgaard, Henning; Gjedde, Albert

    2009-01-01

    excitatory synaptic input. Subsequent inhibition of action potential propagation and neurotransmission by blocking voltage-gated Na+-channels eliminated the increases in CMRO2 due to PF stimulation and increased PC spiking, but left a large fraction of CMRO2, i.e., basal CMRO2, intact. In conclusion, whereas......) of postsynaptic excitation and PC spiking during evoked and ongoing neuronal activity in the rat. By inhibiting excitatory synaptic input using ionotropic glutamate receptor blockers, we found that the increase in CMRO2 evoked by parallel fiber (PF) stimulation depended entirely on postsynaptic excitation...... basal CMRO2 in anesthetized animals did not seem to be related to neurosignaling, increases in CMRO2 could be induced by all aspects of neurosignaling. Our findings imply that CMRO2 responses cannot a priori be assigned to specific neuronal activities....

  6. Nicotine-Mediated ADP to Spike Transition: Double Spiking in Septal Neurons.

    Science.gov (United States)

    Kodirov, Sodikdjon A; Wehrmeister, Michael; Colom, Luis

    2016-04-01

    The majority of neurons in the lateral septum (LS) are electrically silent at resting membrane potential. Nicotine transiently excites a subset of neurons and occasionally leads to long-lasting bursting activity upon longer applications. We observed simultaneous changes in the frequencies and amplitudes of spontaneous action potentials (APs) in the presence of nicotine. During prolonged exposure, nicotine increased the number of spikes within a burst. One of the hallmarks of nicotine's effects was the occurrence of double spikes (also known as bursting). Alignment of 51 spontaneous spikes, triggered upon continuous application of nicotine, revealed that the slope of the after-depolarizing potential gradually increased (1.4 vs. 3 mV/ms) and the neuron fired a second AP, termed double spiking. A transition from a single AP to double spikes increased the amplitude of the after-hyperpolarizing potential. The amplitude of the second (premature) AP was smaller than that of the first, and this correlation persisted with regard to their duration (half-width). To our knowledge, a similar bursting activity in the presence of nicotine has not previously been reported in the septal structure in general or in the LS in particular.

  7. Spikes and matter inhomogeneities in massless scalar field models

    International Nuclear Information System (INIS)

    Coley, A A; Lim, W C

    2016-01-01

    We shall discuss the general relativistic generation of spikes in a massless scalar field or stiff perfect fluid model. We first investigate orthogonally transitive (OT) G2 stiff fluid spike models both heuristically and numerically, and give a new exact OT G2 stiff fluid spike solution. We then present a new two-parameter family of non-OT G2 stiff fluid spike solutions, obtained by the generalization of non-OT G2 vacuum spike solutions to the stiff fluid case by applying Geroch’s transformation on a Jacobs seed. The dynamics of these new stiff fluid spike solutions is qualitatively different from that of the vacuum spike solutions in that the matter (stiff fluid) feels the spike directly and the stiff fluid spike solution can end up with a permanent spike. We then derive the evolution equations of non-OT G2 stiff fluid models, including a second perfect fluid, in full generality, and briefly discuss some of their qualitative properties and their potential numerical analysis. Finally, we discuss how a fluid, and especially a stiff fluid or massless scalar field, affects the physics of the generation of spikes. (paper)

  8. Introduction to spiking neural networks: Information processing, learning and applications.

    Science.gov (United States)

    Ponulak, Filip; Kasinski, Andrzej

    2011-01-01

    The concept that neural information is encoded in the firing rate of neurons has been the dominant paradigm in neurobiology for many years. This paradigm has also been adopted by the theory of artificial neural networks. Recent physiological experiments demonstrate, however, that in many parts of the nervous system, neural code is founded on the timing of individual action potentials. This finding has given rise to the emergence of a new class of neural models, called spiking neural networks. In this paper we summarize basic properties of spiking neurons and spiking networks. Our focus is, specifically, on models of spike-based information coding, synaptic plasticity and learning. We also survey real-life applications of spiking models. The paper is meant to be an introduction to spiking neural networks for scientists from various disciplines interested in spike-based neural processing.
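
As a concrete illustration of the spike-timing-dependent plasticity rule surveyed above, the classic pair-based STDP window can be written down directly; the amplitudes and time constants below are conventional textbook-style values, not parameters from this survey:

```python
import numpy as np

def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau_plus=20.0, tau_minus=20.0):
    """Pair-based STDP weight change as a function of dt = t_post - t_pre (ms).
    Pre-before-post pairings (dt > 0) potentiate the synapse;
    post-before-pre pairings (dt < 0) depress it, both decaying
    exponentially with the spike-time difference."""
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0,
                    a_plus * np.exp(-dt / tau_plus),
                    -a_minus * np.exp(dt / tau_minus))
```

Applying this curve to every pre/post spike pair (and clipping weights to a valid range) is the basic unsupervised learning loop the paper refers to.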

  9. A spike sorting toolbox for up to thousands of electrodes validated with ground truth recordings in vitro and in vivo

    Science.gov (United States)

    Lefebvre, Baptiste; Deny, Stéphane; Gardella, Christophe; Stimberg, Marcel; Jetter, Florian; Zeck, Guenther; Picaud, Serge; Duebel, Jens

    2018-01-01

    In recent years, multielectrode arrays and large silicon probes have been developed to record simultaneously between hundreds and thousands of electrodes packed with a high density. However, they require novel methods to extract the spiking activity of large ensembles of neurons. Here, we developed a new toolbox to sort spikes from these large-scale extracellular data. To validate our method, we performed simultaneous extracellular and loose patch recordings in rodents to obtain ‘ground truth’ data, where the solution to this sorting problem is known for one cell. The performance of our algorithm was always close to the best expected performance, over a broad range of signal-to-noise ratios, in vitro and in vivo. The algorithm is entirely parallelized and has been successfully tested on recordings with up to 4225 electrodes. Our toolbox thus offers a generic solution to sort accurately spikes for up to thousands of electrodes. PMID:29557782

  10. Practical parallel computing

    CERN Document Server

    Morse, H Stephen

    1994-01-01

    Practical Parallel Computing provides information pertinent to the fundamental aspects of high-performance parallel processing. This book discusses the development of parallel applications on a variety of equipment.Organized into three parts encompassing 12 chapters, this book begins with an overview of the technology trends that converge to favor massively parallel hardware over traditional mainframes and vector machines. This text then gives a tutorial introduction to parallel hardware architectures. Other chapters provide worked-out examples of programs using several parallel languages. Thi

  11. Parallel sorting algorithms

    CERN Document Server

    Akl, Selim G

    1985-01-01

    Parallel Sorting Algorithms explains how to use parallel algorithms to sort a sequence of items on a variety of parallel computers. The book reviews the sorting problem, the parallel models of computation, parallel algorithms, and the lower bounds on the parallel sorting problems. The text also presents twenty different algorithms, such as linear arrays, mesh-connected computers, cube-connected computers. Another example where algorithm can be applied is on the shared-memory SIMD (single instruction stream multiple data stream) computers in which the whole sequence to be sorted can fit in the

  12. An Event-Driven Classifier for Spiking Neural Networks Fed with Synthetic or Dynamic Vision Sensor Data

    Directory of Open Access Journals (Sweden)

    Evangelos Stromatias

    2017-06-01

    This paper introduces a novel methodology for training an event-driven classifier within a Spiking Neural Network (SNN) System capable of yielding good classification results when using both synthetic input data and real data captured from Dynamic Vision Sensor (DVS) chips. The proposed supervised method uses the spiking activity provided by an arbitrary topology of prior SNN layers to build histograms and train the classifier in the frame domain using the stochastic gradient descent algorithm. In addition, this approach can cope with leaky integrate-and-fire neuron models within the SNN, a desirable feature for real-world SNN applications, where neural activation must fade away after some time in the absence of inputs. Consequently, this way of building histograms captures the dynamics of spikes immediately before the classifier. We tested our method on the MNIST data set using different synthetic encodings and real DVS sensory data sets such as N-MNIST, MNIST-DVS, and Poker-DVS using the same network topology and feature maps. We demonstrate the effectiveness of our approach by achieving the highest classification accuracy reported on the N-MNIST (97.77%) and Poker-DVS (100%) real DVS data sets to date with a spiking convolutional network. Moreover, by using the proposed method we were able to retrain the output layer of a previously reported spiking neural network and increase its performance by 2%, suggesting that the proposed classifier can be used as the output layer in works where features are extracted using unsupervised spike-based learning methods. In addition, we also analyze SNN performance figures such as total event activity and network latencies, which are relevant for eventual hardware implementations. In summary, the paper aggregates unsupervised-trained SNNs with a supervised-trained SNN classifier, combining and applying them to heterogeneous sets of benchmarks, both synthetic and from real DVS chips.

  13. An Event-Driven Classifier for Spiking Neural Networks Fed with Synthetic or Dynamic Vision Sensor Data.

    Science.gov (United States)

    Stromatias, Evangelos; Soto, Miguel; Serrano-Gotarredona, Teresa; Linares-Barranco, Bernabé

    2017-01-01

    This paper introduces a novel methodology for training an event-driven classifier within a Spiking Neural Network (SNN) System capable of yielding good classification results when using both synthetic input data and real data captured from Dynamic Vision Sensor (DVS) chips. The proposed supervised method uses the spiking activity provided by an arbitrary topology of prior SNN layers to build histograms and train the classifier in the frame domain using the stochastic gradient descent algorithm. In addition, this approach can cope with leaky integrate-and-fire neuron models within the SNN, a desirable feature for real-world SNN applications, where neural activation must fade away after some time in the absence of inputs. Consequently, this way of building histograms captures the dynamics of spikes immediately before the classifier. We tested our method on the MNIST data set using different synthetic encodings and real DVS sensory data sets such as N-MNIST, MNIST-DVS, and Poker-DVS using the same network topology and feature maps. We demonstrate the effectiveness of our approach by achieving the highest classification accuracy reported on the N-MNIST (97.77%) and Poker-DVS (100%) real DVS data sets to date with a spiking convolutional network. Moreover, by using the proposed method we were able to retrain the output layer of a previously reported spiking neural network and increase its performance by 2%, suggesting that the proposed classifier can be used as the output layer in works where features are extracted using unsupervised spike-based learning methods. In addition, we also analyze SNN performance figures such as total event activity and network latencies, which are relevant for eventual hardware implementations. In summary, the paper aggregates unsupervised-trained SNNs with a supervised-trained SNN classifier, combining and applying them to heterogeneous sets of benchmarks, both synthetic and from real DVS chips.

  14. Comparison of electrodialytic removal of Cu from spiked kaolinite, spiked soil and industrially polluted soil

    DEFF Research Database (Denmark)

    Ottosen, Lisbeth M.; Lepkova, Katarina; Kubal, Martin

    2006-01-01

    Electrokinetic remediation methods for removal of heavy metals from polluted soils have been the subject of quite intense research in recent years, since these methods are well suited to fine-grained soils where other remediation methods fail. Electrodialytic remediation is an electrokinetic...... remediation method which is based on applying a DC electric field and the use of ion exchange membranes that ensure that the main transport of heavy metals is out of the polluted soil. An experimental investigation was made of electrodialytic removal of Cu from spiked kaolinite, spiked soil and industrially...... polluted soil under the same operational conditions (constant current density 0.2 mA/cm2 and duration 28 days). The results of the present paper show that caution must be taken when generalising results obtained on spiked kaolinite to the remediation of industrially polluted soils, as it was shown......

  15. Synchronous spikes are necessary but not sufficient for a synchrony code in populations of spiking neurons.

    Science.gov (United States)

    Grewe, Jan; Kruscha, Alexandra; Lindner, Benjamin; Benda, Jan

    2017-03-07

    Synchronous activity in populations of neurons potentially encodes special stimulus features. Selective readout of either synchronous or asynchronous activity allows formation of two streams of information processing. Theoretical work predicts that such a synchrony code is a fundamental feature of populations of spiking neurons if they operate in specific noise and stimulus regimes. Here we experimentally test the theoretical predictions by quantifying and comparing neuronal response properties in tuberous and ampullary electroreceptor afferents of the weakly electric fish Apteronotus leptorhynchus. These related systems show similar levels of synchronous activity, but only in the more irregularly firing tuberous afferents is a synchrony code established, whereas in the more regularly firing ampullary afferents it is not. The mere existence of synchronous activity is thus not sufficient for a synchrony code. Single-cell features such as the irregularity of spiking and the frequency dependence of the neuron's transfer function determine whether synchronous spikes possess a distinct meaning for the encoding of time-dependent signals.
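
A simple way to quantify the kind of population synchrony discussed above is to bin spike times and ask what fraction of spikes fall into bins where several neurons fire together. The function below is an illustrative readout with hypothetical parameter names, not the analysis used in the study:

```python
import numpy as np

def population_synchrony(spike_times, t_stop, window=0.005, min_cells=2):
    """Bin each neuron's spike times into `window`-wide bins and return the
    fraction of the population's spikes falling in bins where at least
    `min_cells` different neurons fired (a crude synchrony readout).

    spike_times: list of per-neuron spike-time arrays (seconds).
    """
    n_bins = int(np.ceil(t_stop / window))
    counts = np.zeros((len(spike_times), n_bins))
    for i, times in enumerate(spike_times):
        idx = np.minimum((np.asarray(times) / window).astype(int), n_bins - 1)
        np.add.at(counts[i], idx, 1)      # unbuffered: handles repeated bins
    active = (counts > 0).sum(axis=0)     # neurons firing in each bin
    sync_bins = active >= min_cells
    return counts[:, sync_bins].sum() / counts.sum()
```

Two neurons firing at identical times give a synchrony fraction of 1.0, while fully disjoint spike trains give 0.0; the paper's point is that a high value of such a measure alone does not guarantee that the synchronous spikes carry a distinct code.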

  16. Analog memristive synapse in spiking networks implementing unsupervised learning

    Directory of Open Access Journals (Sweden)

    Erika Covi

    2016-10-01

    Emerging brain-inspired architectures call for devices that can emulate the functionality of biological synapses in order to implement new efficient computational schemes able to solve ill-posed problems. Various devices and solutions are still under investigation and, in this respect, a challenge is open to researchers in the field. Indeed, the optimal candidate is a device able to reproduce the complete functionality of a synapse, i.e. the typical synaptic process underlying learning in biological systems (activity-dependent synaptic plasticity). This implies a device able to change its resistance (synaptic strength, or weight) upon proper electrical stimuli (synaptic activity) and showing several stable resistive states throughout its dynamic range (analog behavior). Moreover, it should be able to perform spike timing dependent plasticity (STDP), an associative homosynaptic plasticity learning rule based on the delay time between the two firing neurons the synapse is connected to. This rule is a fundamental learning protocol in state-of-the-art networks, because it allows unsupervised learning. Notwithstanding this fact, STDP-based unsupervised learning has been proposed several times, mainly for binary synapses rather than multilevel synapses composed of many binary memristors. This paper proposes an HfO2-based analog memristor as a synaptic element which performs STDP within a small spiking neuromorphic network operating unsupervised learning for character recognition. The trained network is able to recognize five characters even when incomplete or noisy characters are displayed, and it is robust to a device-to-device variability of up to ±30%.

  17. Analog Memristive Synapse in Spiking Networks Implementing Unsupervised Learning.

    Science.gov (United States)

    Covi, Erika; Brivio, Stefano; Serb, Alexander; Prodromakis, Themis; Fanciulli, Marco; Spiga, Sabina

    2016-01-01

    Emerging brain-inspired architectures call for devices that can emulate the functionality of biological synapses in order to implement new efficient computational schemes able to solve ill-posed problems. Various devices and solutions are still under investigation and, in this respect, a challenge is opened to the researchers in the field. Indeed, the optimal candidate is a device able to reproduce the complete functionality of a synapse, i.e., the typical synaptic process underlying learning in biological systems (activity-dependent synaptic plasticity). This implies a device able to change its resistance (synaptic strength, or weight) upon proper electrical stimuli (synaptic activity) and showing several stable resistive states throughout its dynamic range (analog behavior). Moreover, it should be able to perform spike timing dependent plasticity (STDP), an associative homosynaptic plasticity learning rule based on the delay time between the two firing neurons the synapse is connected to. This rule is a fundamental learning protocol in state-of-art networks, because it allows unsupervised learning. Notwithstanding this fact, STDP-based unsupervised learning has been proposed several times mainly for binary synapses rather than multilevel synapses composed of many binary memristors. This paper proposes an HfO 2 -based analog memristor as a synaptic element which performs STDP within a small spiking neuromorphic network operating unsupervised learning for character recognition. The trained network is able to recognize five characters even in case incomplete or noisy images are displayed and it is robust to a device-to-device variability of up to ±30%.
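
The pair-based STDP rule described in these two records can be sketched as a weight update that potentiates causal pre-before-post pairings and depresses the reverse, with the weight quantized to a finite number of stable levels to mimic the analog memristor's discrete resistive states. All parameter values (amplitudes, time constant, 64 levels) are illustrative assumptions, not the device's measured characteristics:

```python
import math

def stdp_update(w, dt, a_plus=0.05, a_minus=0.05, tau=0.020, levels=64):
    """Pair-based STDP: dt = t_post - t_pre (seconds).
    dt > 0 (pre precedes post) potentiates; dt <= 0 depresses.
    The weight is clipped to [0, 1] and snapped to one of `levels`
    stable values, mimicking an analog memristor's finite state set."""
    if dt > 0:
        w += a_plus * math.exp(-dt / tau)
    else:
        w -= a_minus * math.exp(dt / tau)
    w = min(1.0, max(0.0, w))
    return round(w * (levels - 1)) / (levels - 1)

w_pot = stdp_update(0.5, +0.005)   # causal pairing -> potentiation
w_dep = stdp_update(0.5, -0.005)   # anti-causal pairing -> depression
```

The quantization step is what distinguishes this multilevel-synapse sketch from an idealized continuous-weight STDP rule.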

  18. Introduction to parallel programming

    CERN Document Server

    Brawer, Steven

    1989-01-01

    Introduction to Parallel Programming focuses on the techniques, processes, methodologies, and approaches involved in parallel programming. The book first offers information on Fortran, hardware and operating system models, and processes, shared memory, and simple parallel programs. Discussions focus on processes and processors, joining processes, shared memory, time-sharing with multiple processors, hardware, loops, passing arguments in function/subroutine calls, program structure, and arithmetic expressions. The text then elaborates on basic parallel programming techniques, barriers and race

  19. Parallel computing works!

    CERN Document Server

    Fox, Geoffrey C; Messina, Guiseppe C

    2014-01-01

    A clear illustration of how parallel computers can be successfully applied to large-scale scientific computations. This book demonstrates how a variety of applications in physics, biology, mathematics and other sciences were implemented on real parallel computers to produce new scientific results. It investigates issues of fine-grained parallelism relevant for future supercomputers with particular emphasis on hypercube architecture. The authors describe how they used an experimental approach to configure different massively parallel machines, design and implement basic system software, and develop

  20. A Three-Dimensional Movement Analysis of the Spike in Fistball

    Directory of Open Access Journals (Sweden)

    Andreas Bund

    2016-12-01

    Full Text Available Due to its relevance to point scoring, the spike is considered one of the most important skills in fistball. Biomechanical analyses of this sport are very rare. In the present study, we performed a three-dimensional kinematic analysis of the fistball spike, which helps to specify performance parameters on a descriptive level. Recorded by four synchronized cameras (120 Hz) linked to the motion capture software Simi Motion® 5.0, three female fistball players of the second German league (24–26 years, 1.63–1.69 m) performed several spikes under standardized conditions. Results show that the segment velocities of the arm reached their maximum successively from proximal to distal, following the principle of temporal coordination of single impulses. The wrist shows maximum speed when the fist hits the ball. The elbow joint angle performs a rapid transition from strong flexion to (almost) full extension; however, the extension is completed after the moment of ball impact. In contrast, the shoulder joint angle increases almost linearly until fistball contact and decreases afterward. The findings can be used to optimize the training of the spike.
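
The joint angles tracked in such a 3D kinematic analysis are typically computed as the angle between the two limb-segment vectors meeting at a marker. A minimal sketch (marker names and coordinates are invented for illustration, not taken from the study):

```python
import math

def joint_angle(p_prox, p_joint, p_dist):
    """Angle (degrees) at p_joint between the segments to p_prox and p_dist,
    e.g. the elbow angle from shoulder-elbow-wrist motion-capture markers."""
    v1 = [a - b for a, b in zip(p_prox, p_joint)]
    v2 = [a - b for a, b in zip(p_dist, p_joint)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# A fully extended arm gives 180 degrees; a right-angle flexion gives 90.
print(joint_angle((0, 0, 0), (1, 0, 0), (2, 0, 0)))  # 180.0
print(joint_angle((0, 0, 0), (1, 0, 0), (1, 1, 0)))  # 90.0
```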

  1. Evolving spiking neural networks: a novel growth algorithm exhibits unintelligent design

    Science.gov (United States)

    Schaffer, J. David

    2015-06-01

    Spiking neural networks (SNNs) have drawn considerable excitement because of their computational properties, believed to be superior to those of conventional von Neumann machines and shared with living brains. Yet progress in building these systems has been limited because we lack a design methodology. We present a gene-driven network growth algorithm that enables a genetic algorithm (evolutionary computation) to generate and test SNNs. The genome for this algorithm grows as O(n), where n is the number of neurons; n is also evolved. The genome specifies not only the network topology, but all its parameters as well. Experiments show the algorithm producing SNNs that exhibit robust spike-bursting behavior given tonic inputs, an application suitable for central pattern generators. Even though evolution did not include perturbations of the input spike trains, the evolved networks showed remarkable robustness to such perturbations. In addition, the output spike patterns retain evidence of the specific perturbation of the inputs, a feature that could be exploited by network additions that use this information for refined decision making if required. On a second task, a sequence detector, a discriminating design was found that might be considered an example of "unintelligent design": extra non-functional neurons were included that, while inefficient, did not hamper proper functioning.

  2. Solving constraint satisfaction problems with networks of spiking neurons

    Directory of Open Access Journals (Sweden)

    Zeno eJonke

    2016-03-01

    Full Text Available Networks of neurons in the brain apply – unlike processors in our current generation of computer hardware – an event-based processing strategy, where short pulses (spikes) are emitted sparsely by neurons to signal the occurrence of an event at a particular point in time. Such spike-based computations promise to be substantially more power-efficient than traditional clocked processing schemes. However, it turned out to be surprisingly difficult to design networks of spiking neurons that can solve difficult computational problems on the level of single spikes (rather than rates of spikes). We present here a new method for designing networks of spiking neurons via an energy function. Furthermore, we show how the energy function of a network of stochastically firing neurons can be shaped in a quite transparent manner by composing the networks of simple stereotypical network motifs. We show that this design approach enables networks of spiking neurons to produce approximate solutions to difficult (NP-hard) constraint satisfaction problems from the domains of planning/optimization and verification/logical inference. The resulting networks employ noise as a computational resource. Nevertheless, the timing of spikes (rather than just spike rates) plays an essential role in their computations. Furthermore, networks of spiking neurons carry out for the Traveling Salesman Problem a more efficient stochastic search for good solutions compared with stochastic artificial neural networks (Boltzmann machines) and Gibbs sampling.

  3. Limits to high-speed simulations of spiking neural networks using general-purpose computers.

    Science.gov (United States)

    Zenke, Friedemann; Gerstner, Wulfram

    2014-01-01

    To understand how the central nervous system performs computations using recurrent neuronal circuitry, simulations have become an indispensable tool for theoretical neuroscience. To study neuronal circuits and their ability to self-organize, increasing attention has been directed toward synaptic plasticity. In particular, spike-timing-dependent plasticity (STDP) creates specific demands for simulations of spiking neural networks. On the one hand, a high temporal resolution is required to capture the millisecond timescale of typical STDP windows. On the other hand, network simulations have to evolve over hours up to days, to capture the timescale of long-term plasticity. To do this efficiently, fast simulation speed is the crucial ingredient rather than large neuron numbers. Using different medium-sized network models consisting of several thousands of neurons and off-the-shelf hardware, we compare the simulation speed of the simulators Brian, NEST and Neuron, as well as our own simulator Auryn. Our results show that real-time simulations of different plastic network models are possible in parallel simulations in which numerical precision is not a primary concern. Even so, the speed-up margin of parallelism is limited and boosting simulation speeds beyond one tenth of real-time is difficult. By profiling simulation code we show that the run times of typical plastic network simulations encounter a hard boundary. This limit is partly due to latencies in the inter-process communications and thus cannot be overcome by increased parallelism. Overall, these results show that to study plasticity in medium-sized spiking neural networks, adequate simulation tools are readily available which run efficiently on small clusters. However, to run simulations substantially faster than real-time, special hardware is a prerequisite.
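
The real-time factor the abstract compares simulators on (simulated seconds per wall-clock second) can be sketched with a single forward-Euler leaky integrate-and-fire neuron; a real benchmark would run thousands of plastic neurons. All parameters here are illustrative assumptions:

```python
import time

def simulate_lif(n_steps, dt=1e-4, tau=0.02, v_th=1.0, i_ext=60.0):
    """Forward-Euler leaky integrate-and-fire neuron at 0.1 ms resolution,
    the kind of temporal grain STDP windows demand; returns the spike count."""
    v, spikes = 0.0, 0
    for _ in range(n_steps):
        v += dt * (-v / tau + i_ext)
        if v >= v_th:       # threshold crossing: emit a spike and reset
            v = 0.0
            spikes += 1
    return spikes

n_steps = 100_000                      # 10 s of simulated time
t0 = time.perf_counter()
spikes = simulate_lif(n_steps)
wall = time.perf_counter() - t0
rtf = (n_steps * 1e-4) / wall          # real-time factor: sim s per wall s
```

On off-the-shelf hardware this loop illustrates why precision and speed trade off: a coarser `dt` raises the real-time factor but blurs millisecond spike timing.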

  4. Supervised spike-timing-dependent plasticity: a spatiotemporal neuronal learning rule for function approximation and decisions.

    Science.gov (United States)

    Franosch, Jan-Moritz P; Urban, Sebastian; van Hemmen, J Leo

    2013-12-01

    How can an animal learn from experience? How can it train sensors, such as the auditory or tactile system, based on other sensory input such as the visual system? Supervised spike-timing-dependent plasticity (supervised STDP) is a possible answer. Supervised STDP trains one modality using input from another one as "supervisor." Quite complex time-dependent relationships between the senses can be learned. Here we prove that under very general conditions, supervised STDP converges to a stable configuration of synaptic weights leading to a reconstruction of primary sensory input.

  5. Parallel Atomistic Simulations

    Energy Technology Data Exchange (ETDEWEB)

    HEFFELFINGER,GRANT S.

    2000-01-18

    Algorithms developed to enable the use of atomistic molecular simulation methods with parallel computers are reviewed. Methods appropriate for bonded as well as non-bonded (and charged) interactions are included. While strategies for obtaining parallel molecular simulations have been developed for the full variety of atomistic simulation methods, molecular dynamics and Monte Carlo have received the most attention. Three main types of parallel molecular dynamics simulations have been developed: the replicated data decomposition, the spatial decomposition, and the force decomposition. For Monte Carlo simulations, parallel algorithms have been developed which can be divided into two categories: those which require a modified Markov chain and those which do not. Parallel algorithms developed for other simulation methods such as Gibbs ensemble Monte Carlo, grand canonical molecular dynamics, and Monte Carlo methods for protein structure determination are also reviewed, and issues such as how to measure parallel efficiency, especially in the case of parallel Monte Carlo algorithms with modified Markov chains, are discussed.

  6. Eliminating thermal violin spikes from LIGO noise

    Energy Technology Data Exchange (ETDEWEB)

    Santamore, D. H.; Levin, Yuri

    2001-08-15

    We have developed a scheme for reducing LIGO suspension thermal noise close to violin-mode resonances. The idea is to monitor directly the thermally induced motion of a small portion of (a 'point' on) each suspension fiber, thereby recording the random forces driving the test-mass motion close to each violin-mode frequency. One can then suppress the thermal noise by optimally subtracting the recorded fiber motions from the measured motion of the test mass, i.e., from the LIGO output. The proposed method is a modification of an analogous but more technically difficult scheme by Braginsky, Levin and Vyatchanin for reducing broad-band suspension thermal noise. The efficiency of our method is limited by the sensitivity of the sensor used to monitor the fiber motion. If the sensor has no intrinsic noise (i.e. has unlimited sensitivity), then our method allows, in principle, a complete removal of violin spikes from the thermal-noise spectrum. We find that in LIGO-II interferometers, in order to suppress violin spikes below the shot-noise level, the intrinsic noise of the sensor must be less than ∼2 × 10⁻¹³ cm/Hz. This sensitivity is two orders of magnitude greater than that of currently available sensors.

  7. Eliminating thermal violin spikes from LIGO noise

    International Nuclear Information System (INIS)

    Santamore, D. H.; Levin, Yuri

    2001-01-01

    We have developed a scheme for reducing LIGO suspension thermal noise close to violin-mode resonances. The idea is to monitor directly the thermally induced motion of a small portion of (a 'point' on) each suspension fiber, thereby recording the random forces driving the test-mass motion close to each violin-mode frequency. One can then suppress the thermal noise by optimally subtracting the recorded fiber motions from the measured motion of the test mass, i.e., from the LIGO output. The proposed method is a modification of an analogous but more technically difficult scheme by Braginsky, Levin and Vyatchanin for reducing broad-band suspension thermal noise. The efficiency of our method is limited by the sensitivity of the sensor used to monitor the fiber motion. If the sensor has no intrinsic noise (i.e. has unlimited sensitivity), then our method allows, in principle, a complete removal of violin spikes from the thermal-noise spectrum. We find that in LIGO-II interferometers, in order to suppress violin spikes below the shot-noise level, the intrinsic noise of the sensor must be less than ∼2 × 10⁻¹³ cm/Hz. This sensitivity is two orders of magnitude greater than that of currently available sensors.
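
The optimal subtraction described in these records can be illustrated in its simplest time-domain form: fit one coefficient by least squares and subtract the scaled sensor record from the output. This single-coefficient sketch (with invented toy data) stands in for the frequency-dependent optimal filter an actual implementation would use:

```python
def subtract_recorded_noise(output, sensor):
    """Remove the sensor-correlated component from the measured output:
    fit c minimizing sum((output - c*sensor)^2), then form output - c*sensor."""
    num = sum(o * s for o, s in zip(output, sensor))
    den = sum(s * s for s in sensor)
    c = num / den
    return [o - c * s for o, s in zip(output, sensor)]

# Toy case: the output is pure fiber noise coupled with gain 2,
# and the sensor records that noise alone, so subtraction cancels it fully.
noise = [0.3, -0.1, 0.4, -0.2]
output = [2.0 * n for n in noise]
cleaned = subtract_recorded_noise(output, noise)
```

With intrinsic sensor noise added to `sensor`, the cancellation becomes imperfect, which is the sensitivity limit the abstract quantifies.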

  8. Phase Diagram of Spiking Neural Networks

    Directory of Open Access Journals (Sweden)

    Hamed eSeyed-Allaei

    2015-03-01

    Full Text Available In computer simulations of spiking neural networks, it is often assumed that every two neurons of the network are connected with a probability of 2%, that 20% of neurons are inhibitory, and that 80% are excitatory. These common values are based on experiments and observations. But here, I take a different perspective, inspired by evolution. I simulate many networks, each with a different set of parameters, and then I try to figure out what makes the common values desirable by nature. Networks which are configured according to the common values have the best dynamic range in response to an impulse, and their dynamic range is more robust with respect to synaptic weights. In fact, evolution has favored networks of best dynamic range. I present a phase diagram that shows the dynamic ranges of different networks with different parameters. This phase diagram gives an insight into the space of parameters – excitatory to inhibitory ratio, sparseness of connections and synaptic weights. It may serve as a guideline to decide about the values of parameters in a simulation of a spiking neural network.
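
Generating one point of such a parameter sweep means building a random network with a given connection probability and excitatory/inhibitory split. A minimal sketch (function name, network size, and seed are illustrative assumptions):

```python
import random

def build_network(n, p_conn=0.02, frac_inhibitory=0.2, seed=1):
    """Random directed network: each ordered pair (i, j), i != j, is
    connected with probability p_conn; the first frac_inhibitory
    fraction of neurons is designated inhibitory, the rest excitatory."""
    rng = random.Random(seed)
    n_inh = int(n * frac_inhibitory)
    edges = [(i, j) for i in range(n) for j in range(n)
             if i != j and rng.random() < p_conn]
    inhibitory = set(range(n_inh))
    return edges, inhibitory

edges, inh = build_network(200)
density = len(edges) / (200 * 199)   # empirical connection probability
```

Sweeping `p_conn` and `frac_inhibitory` over a grid, and measuring the dynamic range of each resulting network, yields the kind of phase diagram the abstract describes.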

  9. Communication through resonance in spiking neuronal networks.

    Science.gov (United States)

    Hahn, Gerald; Bujan, Alejandro F; Frégnac, Yves; Aertsen, Ad; Kumar, Arvind

    2014-08-01

    The cortex processes stimuli through a distributed network of specialized brain areas. This processing requires mechanisms that can route neuronal activity across weakly connected cortical regions. Routing models proposed thus far are either limited to propagation of spiking activity across strongly connected networks or require distinct mechanisms that create local oscillations and establish their coherence between distant cortical areas. Here, we propose a novel mechanism which explains how synchronous spiking activity propagates across weakly connected brain areas supported by oscillations. In our model, oscillatory activity unleashes network resonance that amplifies feeble synchronous signals and promotes their propagation along weak connections ("communication through resonance"). The emergence of coherent oscillations is a natural consequence of synchronous activity propagation and therefore the assumption of different mechanisms that create oscillations and provide coherence is not necessary. Moreover, the phase-locking of oscillations is a side effect of communication rather than its requirement. Finally, we show how the state of ongoing activity could affect the communication through resonance and propose that modulations of the ongoing activity state could influence information processing in distributed cortical networks.

  10. Robustness of spiking Deep Belief Networks to noise and reduced bit precision of neuro-inspired hardware platforms

    Directory of Open Access Journals (Sweden)

    Evangelos eStromatias

    2015-07-01

    Full Text Available Increasingly large deep learning architectures, such as Deep Belief Networks (DBNs), are the focus of current machine learning research and achieve state-of-the-art results in different domains. However, both training and execution of large-scale Deep Networks require vast computing resources, leading to high power requirements and communication overheads. The on-going work on design and construction of spike-based hardware platforms offers an alternative for running deep neural networks with significantly lower power consumption, but has to overcome hardware limitations in terms of noise and limited weight precision, as well as noise inherent in the sensor signal. This article investigates how such hardware constraints impact the performance of spiking neural network implementations of DBNs. In particular, the influence of limited bit precision during execution and training, and the impact of silicon mismatch in the synaptic weight parameters of custom hybrid VLSI implementations is studied. Furthermore, the network performance of spiking DBNs is characterized with regard to noise in the spiking input signal. Our results demonstrate that spiking DBNs can tolerate very low levels of hardware bit precision, down to almost 2 bits, and show that their performance can be improved by at least 30% through an adapted training mechanism that takes the bit precision of the target platform into account. Spiking DBNs thus present an important use-case for large-scale hybrid analog-digital or digital neuromorphic platforms such as SpiNNaker, which can execute large but precision-constrained deep networks in real time.

  11. A spiking neuron circuit based on a carbon nanotube transistor

    International Nuclear Information System (INIS)

    Chen, C-L; Kim, K; Truong, Q; Shen, A; Li, Z; Chen, Y

    2012-01-01

    A spiking neuron circuit based on a carbon nanotube (CNT) transistor is presented in this paper. The spiking neuron circuit has a crossbar architecture in which the transistor gates are connected to its row electrodes and the transistor sources are connected to its column electrodes. An electrochemical cell is incorporated in the gate of the transistor by sandwiching a hydrogen-doped poly(ethylene glycol)methyl ether (PEG) electrolyte between the CNT channel and the top gate electrode. An input spike applied to the gate triggers a dynamic drift of the hydrogen ions in the PEG electrolyte, resulting in a post-synaptic current (PSC) through the CNT channel. Spikes input into the rows trigger PSCs through multiple CNT transistors, and the PSCs accumulate in the columns and integrate into a ‘soma’ circuit to trigger output spikes based on an integrate-and-fire mechanism. The spiking neuron circuit can potentially emulate biological neuron networks and their intelligent functions. (paper)
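
The crossbar integrate-and-fire behavior described here can be sketched abstractly: row spikes drive currents through the synaptic weights, each column sums them onto a membrane variable, and a threshold crossing emits an output spike. The weight values and threshold below are illustrative assumptions, not device measurements:

```python
def crossbar_step(weights, input_spikes, v, v_th=1.0):
    """One time step of a crossbar 'soma': for each column, sum the
    post-synaptic currents weights[row][col] of all spiking rows,
    integrate onto the membrane potential v[col], and fire on threshold."""
    n_cols = len(weights[0])
    out = []
    for col in range(n_cols):
        psc = sum(weights[row][col]
                  for row, s in enumerate(input_spikes) if s)
        v[col] += psc
        if v[col] >= v_th:   # integrate-and-fire: spike, then reset
            out.append(1)
            v[col] = 0.0
        else:
            out.append(0)
    return out

W = [[0.6, 0.1],
     [0.5, 0.2]]
v = [0.0, 0.0]
spikes = crossbar_step(W, [1, 1], v)   # both rows spike this step
```

Here column 0 accumulates 1.1 and fires, while column 1 accumulates only 0.3 and keeps integrating.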

  12. The local field potential reflects surplus spike synchrony

    DEFF Research Database (Denmark)

    Denker, Michael; Roux, Sébastien; Lindén, Henrik

    2011-01-01

    While oscillations of the local field potential (LFP) are commonly attributed to the synchronization of neuronal firing rate on the same time scale, their relationship to coincident spiking in the millisecond range is unknown. Here, we present experimental evidence to reconcile the notions...... of synchrony at the level of spiking and at the mesoscopic scale. We demonstrate that only in time intervals of significant spike synchrony that cannot be explained on the basis of firing rates, coincident spikes are better phase locked to the LFP than predicted by the locking of the individual spikes....... This effect is enhanced in periods of large LFP amplitudes. A quantitative model explains the LFP dynamics by the orchestrated spiking activity in neuronal groups that contribute the observed surplus synchrony. From the correlation analysis, we infer that neurons participate in different constellations...

  13. Effects of Spike Anticipation on the Spiking Dynamics of Neural Networks.

    Science.gov (United States)

    de Santos-Sierra, Daniel; Sanchez-Jimenez, Abel; Garcia-Vellisca, Mariano A; Navas, Adrian; Villacorta-Atienza, Jose A

    2015-01-01

    Synchronization is one of the central phenomena involved in information processing in living systems. It is known that the nervous system requires the coordinated activity of both local and distant neural populations. Such an interplay allows different information modalities to be merged into a whole processing supporting high-level mental skills such as understanding, memory, abstraction, etc. Though the biological processes underlying synchronization in the brain are not fully understood, a variety of mechanisms supporting different types of synchronization have been reported at both the theoretical and experimental level. One of the more intriguing of these phenomena is anticipating synchronization, which has recently been reported in a pair of unidirectionally coupled artificial neurons under simple conditions (Pyragiene and Pyragas, 2013), where the slave neuron is able to anticipate in time the behavior of the master one. In this paper, we explore the effect of spike anticipation on the information processing performed by a neural network at the functional and structural level. We show that the introduction of intermediary neurons in the network enhances spike anticipation and analyse how these variations in spike anticipation can significantly change the firing regime of the neural network according to its functional and structural properties. In addition, we show that the interspike interval (ISI), one of the main features of the neural response associated with information coding, can be closely related to spike anticipation by each spike, and how synaptic plasticity can be modulated through that relationship. This study has been performed through numerical simulation of a coupled system of Hindmarsh-Rose neurons.

  14. Effects of Spike Anticipation on the Spiking Dynamics of Neural Networks

    Directory of Open Access Journals (Sweden)

    Daniel ede Santos-Sierra

    2015-11-01

    Full Text Available Synchronization is one of the central phenomena involved in information processing in living systems. It is known that the nervous system requires the coordinated activity of both local and distant neural populations. Such an interplay allows different information modalities to be merged into a whole processing supporting high-level mental skills such as understanding, memory, abstraction, etc. Though the biological processes underlying synchronization in the brain are not fully understood, a variety of mechanisms supporting different types of synchronization have been reported at both the theoretical and experimental level. One of the more intriguing of these phenomena is anticipating synchronization, which has recently been reported in a pair of unidirectionally coupled artificial neurons under simple conditions (Pyragiene and Pyragas, 2013), where the slave neuron is able to anticipate in time the behaviour of the master one. In this paper we explore the effect of spike anticipation on the information processing performed by a neural network at the functional and structural level. We show that the introduction of intermediary neurons in the network enhances spike anticipation and analyse how these variations in spike anticipation can significantly change the firing regime of the neural network according to its functional and structural properties. In addition we show that the interspike interval (ISI), one of the main features of the neural response associated with information coding, can be closely related to spike anticipation by each spike, and how synaptic plasticity can be modulated through that relationship. This study has been performed through numerical simulation of a coupled system of Hindmarsh-Rose neurons.
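
The interspike interval (ISI) these two records single out is simply the sequence of gaps between consecutive spike times. A minimal sketch (spike times are invented toy data):

```python
def interspike_intervals(spike_times):
    """Inter-spike intervals of a time-sorted list of spike times."""
    return [t1 - t0 for t0, t1 in zip(spike_times, spike_times[1:])]

def mean_isi(spike_times):
    """Mean ISI, the reciprocal of the mean firing rate over the train."""
    isis = interspike_intervals(spike_times)
    return sum(isis) / len(isis)

times = [0.00, 0.02, 0.05, 0.09]       # seconds
isis = interspike_intervals(times)     # gaps of 20, 30, 40 ms
```

Comparing the ISI sequences of master and slave neurons, spike by spike, is one way to quantify how far ahead the slave anticipates.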

  15. Multimodal imaging of spike propagation: a technical case report.

    Science.gov (United States)

    Tanaka, N; Grant, P E; Suzuki, N; Madsen, J R; Bergin, A M; Hämäläinen, M S; Stufflebeam, S M

    2012-06-01

    We report an 11-year-old boy with intractable epilepsy, who had cortical dysplasia in the right superior frontal gyrus. Spatiotemporal source analysis of MEG and EEG spikes demonstrated a similar time course of spike propagation from the superior to inferior frontal gyri, as observed on intracranial EEG. The tractography reconstructed from DTI showed a fiber connection between these areas. Our multimodal approach demonstrates spike propagation and a white matter tract guiding the propagation.

  16. Generalized activity equations for spiking neural network dynamics

    Directory of Open Access Journals (Sweden)

    Michael A Buice

    2013-11-01

    Full Text Available Much progress has been made in uncovering the computational capabilities of spiking neural networks. However, spiking neurons will always be more expensive to simulate compared to rate neurons because of the inherent disparity in time scales - the spike duration time is much shorter than the inter-spike time, which is much shorter than any learning time scale. In numerical analysis, this is a classic stiff problem. Spiking neurons are also much more difficult to study analytically. One possible approach to making spiking networks more tractable is to augment mean field activity models with some information about spiking correlations. For example, such a generalized activity model could carry information about spiking rates and correlations between spikes self-consistently. Here, we will show how this can be accomplished by constructing a complete formal probabilistic description of the network and then expanding around a small parameter such as the inverse of the number of neurons in the network. The mean field theory of the system gives a rate-like description. The first order terms in the perturbation expansion keep track of covariances.

  17. In-reactor creep of zirconium alloys by thermal spikes

    International Nuclear Information System (INIS)

    Ibrahim, E.F.

    1975-01-01

    The size and duration of thermal spikes from fast neutrons have been calculated for zirconium alloys, showing that spikes up to 1.8 nm radius may exist for 2 × 10⁻¹¹ s above the melting point, at an ambient temperature of 570 K. Creep rates have been calculated assuming that the elastic strain from the applied stress relaxes in the volume of the spikes (by preferential loop alignment or modification of an existing dislocation network). The calculated rates are consistent with strain rates observed in long-term in-reactor tests, if spike lifetimes are 2 to 2.5 × 10⁻¹¹ s. (Auth.)

  18. Solving Constraint Satisfaction Problems with Networks of Spiking Neurons.

    Science.gov (United States)

    Jonke, Zeno; Habenschuss, Stefan; Maass, Wolfgang

    2016-01-01

    Networks of neurons in the brain apply, unlike processors in our current generation of computer hardware, an event-based processing strategy, where short pulses (spikes) are emitted sparsely by neurons to signal the occurrence of an event at a particular point in time. Such spike-based computations promise to be substantially more power-efficient than traditional clocked processing schemes. However, it turns out to be surprisingly difficult to design networks of spiking neurons that can solve difficult computational problems on the level of single spikes, rather than rates of spikes. We present here a new method for designing networks of spiking neurons via an energy function. Furthermore, we show how the energy function of a network of stochastically firing neurons can be shaped in a transparent manner by composing the networks of simple stereotypical network motifs. We show that this design approach enables networks of spiking neurons to produce approximate solutions to difficult (NP-hard) constraint satisfaction problems from the domains of planning/optimization and verification/logical inference. The resulting networks employ noise as a computational resource. Nevertheless, the timing of spikes plays an essential role in their computations. Furthermore, networks of spiking neurons carry out for the Traveling Salesman Problem a more efficient stochastic search for good solutions compared with stochastic artificial neural networks (Boltzmann machines) and Gibbs sampling.
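
The core idea in these two records, noise-driven descent of an energy function counting violated constraints, can be caricatured without any neuron model: random state flips stand in for stochastic firing, and occasional uphill acceptances let the search escape local minima. Everything below (the toy XOR constraints, step count, 10% noise acceptance) is an illustrative assumption, not the paper's network construction:

```python
import random

def stochastic_search(constraints, n_vars, steps=5000, seed=0):
    """Noise-driven search for a Boolean assignment minimizing the
    'energy' = number of violated constraints. Each constraint is a
    predicate over the assignment. Random flips play the role the
    abstract assigns to stochastically firing neurons."""
    rng = random.Random(seed)
    x = [rng.randint(0, 1) for _ in range(n_vars)]
    energy = lambda a: sum(1 for c in constraints if not c(a))
    best, best_e = list(x), energy(x)
    for _ in range(steps):
        i = rng.randrange(n_vars)
        x[i] ^= 1                                # propose a single-bit flip
        e = energy(x)
        if e <= best_e or rng.random() < 0.1:    # downhill, or noisy uphill
            if e < best_e:
                best, best_e = list(x), e
        else:
            x[i] ^= 1                            # reject: undo the flip
    return best, best_e

# Tiny toy CSP: three XOR constraints over four Boolean variables,
# satisfied exactly by the alternating assignments 0101 and 1010.
cons = [lambda a: a[0] ^ a[1], lambda a: a[1] ^ a[2], lambda a: a[2] ^ a[3]]
sol, violated = stochastic_search(cons, 4)
```

The paper's contribution is realizing such a search with the timing of single spikes in motif-composed networks; this sketch only shows why noise is a computational resource for it.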

  19. 2D co-ordinate transformation based on a spike timing-dependent plasticity learning mechanism.

    Science.gov (United States)

    Wu, QingXiang; McGinnity, Thomas Martin; Maguire, Liam; Belatreche, Ammar; Glackin, Brendan

    2008-11-01

    In order to plan accurate motor actions, the brain needs to build an integrated spatial representation associated with visual stimuli and haptic stimuli. Since visual stimuli are represented in retina-centered co-ordinates and haptic stimuli are represented in body-centered co-ordinates, co-ordinate transformations must occur between the retina-centered co-ordinates and body-centered co-ordinates. A spiking neural network (SNN) model, which is trained with spike-timing-dependent-plasticity (STDP), is proposed to perform a 2D co-ordinate transformation of the polar representation of an arm position to a Cartesian representation, to create a virtual image map of a haptic input. Through the visual pathway, a position signal corresponding to the haptic input is used to train the SNN with STDP synapses such that after learning the SNN can perform the co-ordinate transformation to generate a representation of the haptic input with the same co-ordinates as a visual image. The model can be applied to explain co-ordinate transformation in spiking neuron based systems. The principle can be used in artificial intelligent systems to process complex co-ordinate transformations represented by biological stimuli.
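
The target mapping the SNN in this record learns, from a polar arm representation to Cartesian co-ordinates, is the standard transformation below; the function name and example values are illustrative, and the paper's contribution is learning this map with STDP rather than computing it in closed form:

```python
import math

def polar_to_cartesian(r, theta):
    """Transform an arm position given in polar form (radius r, angle
    theta in radians) into 2D Cartesian co-ordinates (x, y)."""
    return r * math.cos(theta), r * math.sin(theta)

# An arm of length 1 pointing straight 'up' (angle pi/2) maps to (0, 1).
x, y = polar_to_cartesian(1.0, math.pi / 2)
```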

  20. Weak noise in neurons may powerfully inhibit the generation of repetitive spiking but not its propagation.

    Directory of Open Access Journals (Sweden)

    Henry C Tuckwell

    2010-05-01

    Full Text Available Many neurons have epochs in which they fire action potentials in an approximately periodic fashion. To see what effects noise of relatively small amplitude has on such repetitive activity, we recently examined the response of the Hodgkin-Huxley (HH) space-clamped system to such noise as the mean and variance of the applied current vary, near the bifurcation to periodic firing. This article is concerned with a more realistic neuron model which includes spatial extent. Employing the Hodgkin-Huxley partial differential equation system, the deterministic component of the input current is restricted to a small segment, whereas the stochastic component extends over a region which may or may not overlap the deterministic component. For mean values below, near and above the critical values for repetitive spiking, the effects of weak noise of increasing strength are ascertained by simulation. As in the point model, small-amplitude noise near the critical value dampens the spiking activity and leads to a minimum as the noise level increases. This was the case for both additive noise and conductance-based noise. Uniform noise along the whole neuron is only marginally more effective in silencing the cell than noise which occurs near the region of excitation. In fact, it is found that if signal and noise overlap in spatial extent, then weak noise may inhibit spiking. If, however, signal and noise are applied on disjoint intervals, then the noise has no effect on the spiking activity, no matter how large its region of application, though the trajectories are naturally altered slightly by noise. Such effects could not be discerned in a point model and are important for real neuron behavior. Interference with the spike train does nevertheless occur when the noise amplitude is larger, even when noise and signal do not overlap, being due to the instigation of secondary noise-induced wave phenomena rather than switching the system from one attractor (firing regularly to

  1. AMORE Mo-99 Spike Test Results

    Energy Technology Data Exchange (ETDEWEB)

    Youker, Amanda J. [Argonne National Lab. (ANL), Argonne, IL (United States); Krebs, John F. [Argonne National Lab. (ANL), Argonne, IL (United States); Quigley, Kevin J. [Argonne National Lab. (ANL), Argonne, IL (United States); Byrnes, James P. [Argonne National Lab. (ANL), Argonne, IL (United States); Rotsch, David A [Argonne National Lab. (ANL), Argonne, IL (United States); Brossard, Thomas [Argonne National Lab. (ANL), Argonne, IL (United States); Wesolowski, Kenneth [Argonne National Lab. (ANL), Argonne, IL (United States); Alford, Kurt [Argonne National Lab. (ANL), Argonne, IL (United States); Chemerisov, Sergey [Argonne National Lab. (ANL), Argonne, IL (United States); Vandegrift, George F. [Argonne National Lab. (ANL), Argonne, IL (United States)

    2017-09-27

    With funding from the National Nuclear Security Administration's Material Management and Minimization Office, Argonne National Laboratory (Argonne) is providing technical assistance to help accelerate the U.S. production of Mo-99 using a non-highly enriched uranium (non-HEU) source. A potential Mo-99 production pathway is by accelerator-initiated fissioning in a subcritical uranyl sulfate solution containing low enriched uranium (LEU). As part of the Argonne development effort, we are undertaking the AMORE (Argonne Molybdenum Research Experiment) project, which is essentially a pilot facility for all phases of Mo-99 production, recovery, and purification. Production of Mo-99 and other fission products in the subcritical target solution is initiated by putting an electron beam on a depleted uranium (DU) target; the fast neutrons produced in the DU target are thermalized and lead to fissioning of U-235. At the end of irradiation, Mo is recovered from the target solution and separated from uranium and most of the fission products by using a titania column. The Mo is stripped from the column with an alkaline solution. After acidification of the Mo product solution from the recovery column, the Mo is concentrated (and further purified) in a second titania column. The strip solution from the concentration column is then purified with the LEU Modified Cintichem process. A full description of the process can be found elsewhere [1–3]. The initial commissioning steps for the AMORE project include performing a Mo-99 spike test with pH 1 sulfuric acid in the target vessel without a beam on the target to demonstrate the initial Mo separation-and-recovery process, followed by the concentration column process. All glovebox operations were tested with cold solutions prior to performing the Mo-99 spike tests. Two Mo-99 spike tests with pH 1 sulfuric acid have been performed to date. Figure 1 shows the flow diagram for the remotely operated Mo-recovery system for the AMORE project

  2. Parallelization in Modern C++

    CERN Multimedia

    CERN. Geneva

    2016-01-01

    The traditionally used and well established parallel programming models OpenMP and MPI are both targeting lower level parallelism and are meant to be as language agnostic as possible. For a long time, those models were the only widely available portable options for developing parallel C++ applications beyond using plain threads. This has strongly limited the optimization capabilities of compilers, has inhibited extensibility and genericity, and has restricted the use of those models together with other, modern higher level abstractions introduced by the C++11 and C++14 standards. The recent revival of interest in the industry and wider community for the C++ language has also spurred a remarkable amount of standardization proposals and technical specifications being developed. Those efforts however have so far failed to build a vision on how to seamlessly integrate various types of parallelism, such as iterative parallel execution, task-based parallelism, asynchronous many-task execution flows, continuation s...

  3. Parallelism in matrix computations

    CERN Document Server

    Gallopoulos, Efstratios; Sameh, Ahmed H

    2016-01-01

    This book is primarily intended as a research monograph that could also be used in graduate courses for the design of parallel algorithms in matrix computations. It assumes general but not extensive knowledge of numerical linear algebra, parallel architectures, and parallel programming paradigms. The book consists of four parts: (I) Basics; (II) Dense and Special Matrix Computations; (III) Sparse Matrix Computations; and (IV) Matrix functions and characteristics. Part I deals with parallel programming paradigms and fundamental kernels, including reordering schemes for sparse matrices. Part II is devoted to dense matrix computations such as parallel algorithms for solving linear systems, linear least squares, the symmetric algebraic eigenvalue problem, and the singular-value decomposition. It also deals with the development of parallel algorithms for special linear systems such as banded, Vandermonde, Toeplitz, and block Toeplitz systems. Part III addresses sparse matrix computations: (a) the development of pa...

  4. A Parallel Supercomputer Implementation of a Biological Inspired Neural Network and its use for Pattern Recognition

    International Nuclear Information System (INIS)

    De Ladurantaye, Vincent; Lavoie, Jean; Bergeron, Jocelyn; Parenteau, Maxime; Lu Huizhong; Pichevar, Ramin; Rouat, Jean

    2012-01-01

    A parallel implementation of a large spiking neural network is proposed and evaluated. The neural network implements the binding by synchrony process using the Oscillatory Dynamic Link Matcher (ODLM). Scalability, speed and performance are compared for 2 implementations: Message Passing Interface (MPI) and Compute Unified Device Architecture (CUDA) running on clusters of multicore supercomputers and NVIDIA graphical processing units respectively. A global spiking list that represents at each instant the state of the neural network is described. This list indexes each neuron that fires during the current simulation time so that the influence of their spikes is simultaneously processed on all computing units. Our implementation shows a good scalability for very large networks. A complex and large spiking neural network has been implemented in parallel with success, thus paving the road towards real-life applications based on networks of spiking neurons. MPI offers a better scalability than CUDA, while the CUDA implementation on a GeForce GTX 285 gives the best cost to performance ratio. When running the neural network on the GTX 285, the processing speed is comparable to the MPI implementation on RQCHP's Mammouth parallel with 64 nodes (128 cores).
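    The global spiking list described above can be illustrated with a minimal serial sketch (a synthetic random network, not the authors' MPI/CUDA code): at each step the indices of all neurons crossing threshold are collected into one list, and their combined influence is applied to the whole network in a single vectorized operation, standing in for the simultaneous processing across computing units. Network size, weights, threshold, and drive below are illustrative assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n = 500                                   # toy network size (illustrative)
    weights = rng.normal(0.0, 0.05, (n, n))   # synaptic weight matrix
    v = rng.uniform(0.0, 1.0, n)              # membrane state per neuron
    threshold = 0.95

    for step in range(100):
        # Global spiking list: indices of every neuron firing this instant.
        spiking = np.flatnonzero(v >= threshold)
        # The influence of all listed spikes is applied to the whole network
        # at once; this is the part that runs on all computing units in parallel.
        v += weights[:, spiking].sum(axis=1)
        v[spiking] = 0.0                      # reset the neurons that fired
        v += 0.02                             # constant external drive
        np.clip(v, 0.0, None, out=v)          # keep states non-negative
    ```
    
    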

  5. Self-control with spiking and non-spiking neural networks playing games.

    Science.gov (United States)

    Christodoulou, Chris; Banfield, Gaye; Cleanthous, Aristodemos

    2010-01-01

    Self-control can be defined as choosing a large delayed reward over a small immediate reward, while precommitment is the making of a choice with the specific aim of denying oneself future choices. Humans recognise that they have self-control problems and attempt to overcome them by applying precommitment. Problems in exercising self-control suggest a conflict between cognition and motivation, which has been linked to competition between higher and lower brain functions (representing the frontal lobes and the limbic system respectively). This premise of an internal process conflict led to a behavioural model being proposed, based on which we implemented a computational model for studying and explaining self-control through precommitment behaviour. Our model consists of two neural networks, initially non-spiking and then spiking ones, representing the higher and lower brain systems viewed as cooperating for the benefit of the organism. The non-spiking neural networks are of simple feed-forward multilayer type with reinforcement learning: one with a selective bootstrap weight update rule, which is seen as myopic, representing the lower brain, and the other with the temporal difference weight update rule, which is seen as far-sighted, representing the higher brain. The spiking neural networks are implemented with leaky integrate-and-fire neurons with learning based on stochastic synaptic transmission. The differentiating element between the two brain centres in this implementation is based on the memory of past actions determined by an eligibility trace time constant. As the structure of the self-control problem can be likened to the Iterated Prisoner's Dilemma (IPD) game, in that cooperation is to defection what self-control is to impulsiveness or what compromising is to insisting, we implemented the neural networks as two players, learning simultaneously but independently, competing in the IPD game. With a technique resembling the precommitment effect, whereby the

  6. [Wide QRS tachycardia preceded by pacemaker spikes].

    Science.gov (United States)

    Romero, M; Aranda, A; Gómez, F J; Jurado, A

    2014-04-01

    The differential diagnosis and therapeutic management of wide QRS tachycardia preceded by pacemaker spikes is presented. Pacemaker-mediated tachycardia, fibrillo-flutter tachycardia in patients with pacemakers, and runaway pacemakers have a similar surface electrocardiogram, but respond to different therapeutic measures. The response of the tachycardia to the application of a magnet over the pacemaker can help in the differential diagnosis, and in some cases will be therapeutic, as in the case of a pacemaker-mediated tachycardia. Although these conditions are diagnosed and treated in hospitals with catheterization laboratories, using the programmer applied over the pacemaker, patients presenting in primary care clinics and emergency departments force us to make a diagnosis and treat the haemodynamically unstable patient prior to referral. Copyright © 2012 Sociedad Española de Médicos de Atención Primaria (SEMERGEN). Publicado por Elsevier España. All rights reserved.

  7. A propane price spike nails users

    International Nuclear Information System (INIS)

    Milke, M.

    1997-01-01

    The increase in price for propane was discussed. In 1993, propane cost about 5 cents per litre; by December 1996, the price had risen to 27 cents wholesale, while retail prices for auto propane reached 40 cents per litre. As a result, farmers and fleet operators are considering switching to an alternative energy supply. The five factors which may have played a role in the propane price spike were described. These included a cold winter which lowered inventories, a Pemex gas plant in Mexico which had been damaged by fire, forcing Mexico to import natural gas and natural gas liquids from the USA, the failure of propane distributors to restock during the summer months in the hope of lower prices, and increased cost of competing fuels in the face of increased demand. It was noted that these factors are transitory, which could mean better prices this summer

  8. Spike sorting for polytrodes: a divide and conquer approach

    Directory of Open Access Journals (Sweden)

    Nicholas V. Swindale

    2014-02-01

    Full Text Available In order to determine patterns of neural activity, spike signals recorded by extracellular electrodes have to be clustered (sorted) with the aim of ensuring that each cluster represents all the spikes generated by an individual neuron. Many methods for spike sorting have been proposed but few are easily applicable to recordings from polytrodes, which may have 16 or more recording sites. As with tetrodes, these are spaced sufficiently closely that signals from single neurons will usually be recorded on several adjacent sites. Although this offers a better chance of distinguishing neurons with similarly shaped spikes, sorting is difficult in such cases because of the high dimensionality of the space in which the signals must be classified. This report details a method for spike sorting based on a divide and conquer approach. Clusters are initially formed by assigning each event to the channel on which it is largest. Each channel-based cluster is then sub-divided into as many distinct clusters as possible. These are then recombined on the basis of pairwise tests into a final set of clusters. Pairwise tests are also performed to establish how distinct each cluster is from the others. A modified gradient ascent clustering (GAC) algorithm is used to do the clustering. The method can sort spikes with minimal user input in times comparable to real time for recordings lasting up to 45 minutes. Our results illustrate some of the difficulties inherent in spike sorting, including changes in spike shape over time. We show that some physiologically distinct units may have very similar spike shapes. We show that RMS measures of spike shape similarity are not sensitive enough to discriminate clusters that can otherwise be separated by principal components analysis. Hence spike sorting based on least-squares matching to templates may be unreliable. Our methods should be applicable to tetrodes and scalable to larger multi-electrode arrays (MEAs).
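    The "divide" step of this approach — assigning each detected event to the channel on which it is largest before any per-channel clustering — can be sketched as follows. The synthetic 16-site waveforms, array shapes, and the Gaussian spike template are illustrative assumptions, not the authors' data or code.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy data: 200 detected events, each a 16-channel x 32-sample waveform snippet.
    n_events, n_channels, n_samples = 200, 16, 32
    waveforms = rng.normal(0.0, 1.0, size=(n_events, n_channels, n_samples))
    # Plant each event's spike on a known channel so the assignment is checkable.
    true_channel = rng.integers(0, n_channels, size=n_events)
    template = 10.0 * np.exp(-0.5 * ((np.arange(n_samples) - 16) / 3.0) ** 2)
    for i, c in enumerate(true_channel):
        waveforms[i, c] += template

    # Divide step: assign each event to the channel with the largest peak amplitude.
    peak_amplitude = np.abs(waveforms).max(axis=2)     # (n_events, n_channels)
    assigned_channel = peak_amplitude.argmax(axis=1)   # (n_events,)

    # Channel-based clusters: events grouped by dominant channel; each group would
    # then be sub-divided and recombined via pairwise tests, as the report describes.
    clusters = {c: np.flatnonzero(assigned_channel == c) for c in range(n_channels)}
    ```
    
    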

  9. A parallel buffer tree

    DEFF Research Database (Denmark)

    Sitchinava, Nodar; Zeh, Norbert

    2012-01-01

    We present the parallel buffer tree, a parallel external memory (PEM) data structure for batched search problems. This data structure is a non-trivial extension of Arge's sequential buffer tree to a private-cache multiprocessor environment and reduces the number of I/O operations by the number of...... in the optimal O(psort(N) + K/PB) parallel I/O complexity, where K is the size of the output reported in the process and psort(N) is the parallel I/O complexity of sorting N elements using P processors.

  10. Parallel MR imaging.

    Science.gov (United States)

    Deshmane, Anagha; Gulani, Vikas; Griswold, Mark A; Seiberlich, Nicole

    2012-07-01

    Parallel imaging is a robust method for accelerating the acquisition of magnetic resonance imaging (MRI) data, and has made possible many new applications of MR imaging. Parallel imaging works by acquiring a reduced amount of k-space data with an array of receiver coils. These undersampled data can be acquired more quickly, but the undersampling leads to aliased images. One of several parallel imaging algorithms can then be used to reconstruct artifact-free images from either the aliased images (SENSE-type reconstruction) or from the undersampled data (GRAPPA-type reconstruction). The advantages of parallel imaging in a clinical setting include faster image acquisition, which can be used, for instance, to shorten breath-hold times resulting in fewer motion-corrupted examinations. In this article the basic concepts behind parallel imaging are introduced. The relationship between undersampling and aliasing is discussed and two commonly used parallel imaging methods, SENSE and GRAPPA, are explained in detail. Examples of artifacts arising from parallel imaging are shown and ways to detect and mitigate these artifacts are described. Finally, several current applications of parallel imaging are presented and recent advancements and promising research in parallel imaging are briefly reviewed. Copyright © 2012 Wiley Periodicals, Inc.
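    The relationship between undersampling and aliasing described above can be demonstrated numerically (a toy numpy phantom, not a clinical reconstruction; SENSE and GRAPPA themselves need coil sensitivity information that this sketch omits): keeping only every other k-space line halves the field of view, folding the object onto itself in the naive inverse-FFT image.

    ```python
    import numpy as np

    # Toy "image": a bright square on a dark background (hypothetical phantom).
    N = 64
    image = np.zeros((N, N))
    image[24:40, 24:40] = 1.0

    # Fully sampled k-space via 2D FFT.
    kspace = np.fft.fftshift(np.fft.fft2(image))

    # Parallel-imaging-style undersampling: keep every other phase-encode line
    # (acceleration factor R = 2), zero-filling the skipped lines.
    undersampled = np.zeros_like(kspace)
    undersampled[::2, :] = kspace[::2, :]

    # A plain inverse FFT of the undersampled data yields an aliased image in
    # which the object folds onto itself at half intensity; a SENSE/GRAPPA-type
    # reconstruction would use the coil array to undo this fold-over.
    aliased = np.abs(np.fft.ifft2(np.fft.ifftshift(undersampled)))
    ```

    Skipping every other line convolves the image with two half-strength impulses a half field of view apart, so each pixel holds the average of itself and the pixel N/2 rows away.
    
    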

  11. Parallel Algorithms and Patterns

    Energy Technology Data Exchange (ETDEWEB)

    Robey, Robert W. [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-06-16

    This is a PowerPoint presentation on parallel algorithms and patterns. A parallel algorithm is a well-defined, step-by-step computational procedure that emphasizes concurrency to solve a problem. Examples of problems include: sorting, searching, optimization, matrix operations. A parallel pattern is a computational step in a sequence of independent, potentially concurrent operations that occurs in diverse scenarios with some frequency. Examples are: reductions, prefix scans, ghost cell updates. We only touch on parallel patterns in this presentation. It really deserves its own detailed discussion, which Gabe Rockefeller would like to develop.
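    The reduction and prefix-scan patterns named above can be illustrated in a few lines: a serial emulation of the data-parallel Hillis-Steele doubling scheme, in which each vectorized sweep stands in for one concurrent step on a parallel machine.

    ```python
    import numpy as np

    def prefix_scan(a):
        """Inclusive prefix sum via the Hillis-Steele doubling scheme.

        Each of the log2(n) sweeps is a fully data-parallel step: every
        element reads the value `offset` positions to its left and adds it.
        Here the sweeps run serially, but each vectorized add corresponds
        to one simultaneous step across processing elements.
        """
        a = np.asarray(a, dtype=np.int64).copy()
        offset = 1
        while offset < len(a):
            shifted = np.zeros_like(a)
            shifted[offset:] = a[:-offset]
            a = a + shifted
            offset *= 2
        return a

    values = np.arange(1, 9)       # 1..8
    scanned = prefix_scan(values)  # running totals
    total = scanned[-1]            # the reduction falls out of the scan
    ```
    
    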

  12. Application Portable Parallel Library

    Science.gov (United States)

    Cole, Gary L.; Blech, Richard A.; Quealy, Angela; Townsend, Scott

    1995-01-01

    Application Portable Parallel Library (APPL) computer program is subroutine-based message-passing software library intended to provide consistent interface to variety of multiprocessor computers on market today. Minimizes effort needed to move application program from one computer to another. User develops application program once and then easily moves application program from parallel computer on which created to another parallel computer. ("Parallel computer" here also includes heterogeneous collection of networked computers). Written in C language with one FORTRAN 77 subroutine for UNIX-based computers and callable from application programs written in C language or FORTRAN 77.

  13. Clustering predicts memory performance in networks of spiking and non-spiking neurons

    Directory of Open Access Journals (Sweden)

    Weiliang eChen

    2011-03-01

    Full Text Available The problem we address in this paper is that of finding effective and parsimonious patterns of connectivity in sparse associative memories. This problem must be addressed in real neuronal systems, so that results in artificial systems could throw light on real systems. We show that there are efficient patterns of connectivity and that these patterns are effective in models with either spiking or non-spiking neurons. This suggests that there may be some underlying general principles governing good connectivity in such networks. We also show that the clustering of the network, measured by Clustering Coefficient, has a strong linear correlation to the performance of associative memory. This result is important since a purely static measure of network connectivity appears to determine an important dynamic property of the network.
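    The clustering coefficient used above as a static predictor of memory performance can be computed directly from an adjacency matrix. Below is a hand-rolled numpy version on a toy 4-node graph; real associative-memory networks would be far larger and sparser.

    ```python
    import numpy as np

    def clustering_coefficient(adj):
        """Mean local clustering coefficient of an undirected graph.

        For each node: the fraction of pairs of its neighbours that are
        themselves connected. The number of triangles through node i is
        the i-th diagonal entry of A^3 divided by 2.
        """
        adj = np.asarray(adj, dtype=np.int64)
        degree = adj.sum(axis=1)
        triangles = np.diagonal(adj @ adj @ adj) / 2.0
        possible = degree * (degree - 1) / 2.0
        with np.errstate(divide="ignore", invalid="ignore"):
            local = np.where(possible > 0, triangles / possible, 0.0)
        return local.mean()

    # A triangle plus a pendant node: 0-1-2 fully connected, node 3 tied to 0.
    adj = np.array([
        [0, 1, 1, 1],
        [1, 0, 1, 0],
        [1, 1, 0, 0],
        [1, 0, 0, 0],
    ])
    cc = clustering_coefficient(adj)  # (1/3 + 1 + 1 + 0) / 4 = 7/12
    ```
    
    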

  14. Symbol manipulation and rule learning in spiking neuronal networks.

    Science.gov (United States)

    Fernando, Chrisantha

    2011-04-21

    It has been claimed that the productivity, systematicity and compositionality of human language and thought necessitate the existence of a physical symbol system (PSS) in the brain. Recent discoveries about temporal coding suggest a novel type of neuronal implementation of a physical symbol system. Furthermore, learning classifier systems provide a plausible algorithmic basis by which symbol re-write rules could be trained to undertake behaviors exhibiting systematicity and compositionality, using a kind of natural selection of re-write rules in the brain. We show how the core operation of a learning classifier system, namely, the replication with variation of symbol re-write rules, can be implemented using spike-time-dependent plasticity based supervised learning. As a whole, the aim of this paper is to integrate an algorithmic and an implementation level description of a neuronal symbol system capable of sustaining systematic and compositional behaviors. Previously proposed neuronal implementations of symbolic representations are compared with this new proposal. Copyright © 2011 Elsevier Ltd. All rights reserved.

  15. Spikes and memory in (Nord Pool) electricity price spot prices

    DEFF Research Database (Denmark)

    Proietti, Tommaso; Haldrup, Niels; Knapik, Oskar

    Electricity spot prices are subject to transitory sharp movements commonly referred to as spikes. The paper aims at assessing their effects on model based inferences and predictions, with reference to the Nord Pool power exchange. We identify a spike as a price value which deviates substantially...

  16. The Nature of Power Spikes: a regime-switch approach

    NARCIS (Netherlands)

    C.M. de Jong (Cyriel)

    2005-01-01

    textabstractDue to its non-storable nature, electricity is a commodity with probably the most volatile spot prices, exemplified by occasional spikes. Appropriate pricing, portfolio, and risk management models have to incorporate these characteristics, and the spikes in particular. We investigate the

  17. No WIMP mini-spikes in dwarf spheroidal galaxies

    NARCIS (Netherlands)

    Wanders, M.; Bertone, G.; Volonteri, M.; Weniger, C.

    2015-01-01

    The formation of black holes inevitably affects the distribution of dark and baryonic matter in their vicinity, leading to an enhancement of the dark matter density, called spike, and if dark matter is made of WIMPs, to a strong enhancement of the dark matter annihilation rate. Spikes at the center

  18. Spiking and bursting patterns of fractional-order Izhikevich model

    Science.gov (United States)

    Teka, Wondimu W.; Upadhyay, Ranjit Kumar; Mondal, Argha

    2018-03-01

    Bursting and spiking oscillations play major roles in processing and transmitting information in the brain through cortical neurons that respond differently to the same signal. These oscillations display complex dynamics that might be produced by using neuronal models and varying many model parameters. Recent studies have shown that models with fractional order can produce several types of history-dependent neuronal activities without the adjustment of several parameters. We studied the fractional-order Izhikevich model and analyzed different kinds of oscillations that emerge from the fractional dynamics. The model produces a wide range of neuronal spike responses, including regular spiking, fast spiking, intrinsic bursting, mixed mode oscillations, regular bursting and chattering, by adjusting only the fractional order. Both the active and silent phases of the burst increase when the fractional-order model further deviates from the classical model. For smaller fractional order, the model produces memory-dependent spiking activity after the pulse signal is turned off. This special spiking activity and other properties of the fractional-order model are caused by the memory trace that emerges from the fractional-order dynamics and integrates all the past activities of the neuron. On the network level, the response of the neuronal network shifts from random to scale-free spiking. Our results suggest that the complex dynamics of spiking and bursting can be the result of the long-term dependence and interaction of intracellular and extracellular ionic currents.
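    For reference, the classical (integer-order) Izhikevich model that the fractional-order version generalizes can be simulated in a few lines. The parameters below are the standard regular-spiking set, and the sketch uses plain Euler steps rather than the memory-bearing fractional integrator studied in the paper; the drive current and step size are illustrative choices.

    ```python
    def izhikevich(a=0.02, b=0.2, c=-65.0, d=8.0, I=10.0, T=1000.0, dt=0.5):
        """Classical Izhikevich neuron (regular-spiking parameters), Euler steps.

        dv/dt = 0.04 v^2 + 5 v + 140 - u + I;  du/dt = a (b v - u);
        on v >= 30 mV: record a spike, reset v -> c and u -> u + d.
        The fractional-order variant replaces the first-order derivatives
        with fractional ones, giving history dependence this sketch lacks.
        """
        v, u = -65.0, b * -65.0
        spikes = []
        for step in range(int(T / dt)):
            v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + I)
            u += dt * a * (b * v - u)
            if v >= 30.0:                 # spike cutoff and reset
                spikes.append(step * dt)
                v, u = c, u + d
        return spikes

    spike_times = izhikevich()  # tonic (regular) spiking under constant drive
    ```
    
    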

  19. Spectral components of cytosolic [Ca2+] spiking in neurons

    DEFF Research Database (Denmark)

    Kardos, J; Szilágyi, N; Juhász, G

    1998-01-01

    . Delayed complex responses of large [Ca2+]c spiking observed in cells from a different set of cultures were synthesized by a set of frequencies within the range 0.018-0.117 Hz. Differential frequency patterns are suggested as characteristics of the [Ca2+]c spiking responses of neurons under different...

  20. Cytoplasmic tail of coronavirus spike protein has intracellular

    Indian Academy of Sciences (India)

    https://www.ias.ac.in/article/fulltext/jbsc/042/02/0231-0244. Keywords. Coronavirus spike protein trafficking; cytoplasmic tail signal; endoplasmic reticulum–Golgi intermediate complex; lysosome. Abstract. Intracellular trafficking and localization studies of spike protein from SARS and OC43 showed that SARS spike protein is ...

  1. Distributed Cerebellar Motor Learning; a Spike-Timing-Dependent Plasticity Model

    Directory of Open Access Journals (Sweden)

    Niceto Rafael Luque

    2016-03-01

    Full Text Available Deep cerebellar nuclei neurons receive both inhibitory (GABAergic) synaptic currents from Purkinje cells (within the cerebellar cortex) and excitatory (glutamatergic) synaptic currents from mossy fibres. Those two deep cerebellar nucleus inputs are also thought to be adaptive, embedding interesting properties in the framework of accurate movements. We show that distributed spike-timing-dependent plasticity (STDP) mechanisms located at different cerebellar sites (parallel fibres to Purkinje cells, mossy fibres to deep cerebellar nucleus cells, and Purkinje cells to deep cerebellar nucleus cells) in closed-loop simulations provide an explanation for the complex learning properties of the cerebellum in motor learning. Concretely, we propose a new mechanistic cerebellar spiking model. In this new model, deep cerebellar nuclei embed a dual functionality: deep cerebellar nuclei acting as a gain adaptation mechanism and as a facilitator for the slow memory consolidation at mossy fibres to deep cerebellar nucleus synapses. Equipping the cerebellum with excitatory (e-STDP) and inhibitory (i-STDP) mechanisms at deep cerebellar nuclei afferents allows the accommodation of synaptic memories that were formed at parallel fibres to Purkinje cells synapses and then transferred to mossy fibres to deep cerebellar nucleus synapses. These adaptive mechanisms also contribute to modulating the deep-cerebellar-nucleus-output firing rate (output gain modulation) towards optimising its working range.
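    The sign convention of a pair-based STDP rule — the building block these distributed mechanisms share — can be sketched as follows. The exponential windows and constants are illustrative textbook choices, not the paper's fitted cerebellar kernels.

    ```python
    import numpy as np

    def stdp_dw(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
        """Pair-based STDP weight change for delta_t = t_post - t_pre (ms).

        Pre-before-post (delta_t > 0) potentiates, post-before-pre
        (delta_t < 0) depresses, each with an exponential window of
        time constant tau; amplitudes here are illustrative only.
        """
        delta_t = np.asarray(delta_t, dtype=float)
        return np.where(
            delta_t >= 0,
            a_plus * np.exp(-delta_t / tau),
            -a_minus * np.exp(delta_t / tau),
        )

    dw_pot = stdp_dw(10.0)    # pre fires 10 ms before post -> potentiation
    dw_dep = stdp_dw(-10.0)   # post fires 10 ms before pre -> depression
    ```

    An excitatory (e-STDP) and an inhibitory (i-STDP) site would apply such windows to different synapse populations, which is what lets the model both adapt the nuclear output gain and consolidate memories transferred between sites.
    
    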

  2. Temporally coordinated spiking activity of human induced pluripotent stem cell-derived neurons co-cultured with astrocytes.

    Science.gov (United States)

    Kayama, Tasuku; Suzuki, Ikuro; Odawara, Aoi; Sasaki, Takuya; Ikegaya, Yuji

    2018-01-01

    In culture conditions, human induced pluripotent stem cell (hiPSC)-derived neurons form synaptic connections with other cells and establish neuronal networks, which are expected to be an in vitro model system for drug discovery screening and toxicity testing. While early studies demonstrated effects of co-culture of hiPSC-derived neurons with astroglial cells on survival and maturation of hiPSC-derived neurons, the population spiking patterns of such hiPSC-derived neurons have not been fully characterized. In this study, we analyzed temporal spiking patterns of hiPSC-derived neurons recorded by a multi-electrode array system. We discovered that specific sets of hiPSC-derived neurons co-cultured with astrocytes showed more frequent and highly coherent non-random synchronized spike trains and more dynamic changes in overall spike patterns over time. These temporally coordinated spiking patterns are physiological signs of organized circuits of hiPSC-derived neurons and suggest benefits of co-culture of hiPSC-derived neurons with astrocytes. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Parallel discrete event simulation

    NARCIS (Netherlands)

    Overeinder, B.J.; Hertzberger, L.O.; Sloot, P.M.A.; Withagen, W.J.

    1991-01-01

    In simulating applications for execution on specific computing systems, the simulation performance figures must be known in a short period of time. One basic approach to the problem of reducing the required simulation time is the exploitation of parallelism. However, in parallelizing the simulation

  4. Parallel reservoir simulator computations

    International Nuclear Information System (INIS)

    Hemanth-Kumar, K.; Young, L.C.

    1995-01-01

    The adaptation of a reservoir simulator for parallel computations is described. The simulator was originally designed for vector processors. It performs approximately 99% of its calculations in vector/parallel mode and relative to scalar calculations it achieves speedups of 65 and 81 for black oil and EOS simulations, respectively, on the CRAY C-90

  5. Pressurized water reactor iodine spiking behavior under power transient conditions

    International Nuclear Information System (INIS)

    Ho, J.C.

    1992-01-01

    The most accepted theory explaining the cause of pressurized water reactor iodine spiking is steam formation and condensation in damaged fuel rods. The phase transformation of the primary coolant from water to steam and back again is believed to cause the iodine spiking phenomenon. But due to the complex nature of the phenomenon, a comprehensive model of the behavior has not yet been successfully developed. This paper presents a new model based on an empirical approach, which gives a first-order estimation of the peak iodine spiking magnitude. Based on the proposed iodine spiking model, it appears feasible to derive a correlation using the plant operating database to monitor and control the peak iodine spiking magnitude

  6. Recent progress in multi-electrode spike sorting methods.

    Science.gov (United States)

    Lefebvre, Baptiste; Yger, Pierre; Marre, Olivier

    2016-11-01

    In recent years, arrays of extracellular electrodes have been developed and manufactured to record simultaneously from hundreds of electrodes packed with a high density. These recordings should allow neuroscientists to reconstruct the individual activity of the neurons spiking in the vicinity of these electrodes, with the help of signal processing algorithms. Algorithms need to solve a source separation problem, also known as spike sorting. However, these new devices challenge the classical way to do spike sorting. Here we review different methods that have been developed to sort spikes from these large-scale recordings. We describe the common properties of these algorithms, as well as their main differences. Finally, we outline the issues that remain to be solved by future spike sorting algorithms. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Totally parallel multilevel algorithms

    Science.gov (United States)

    Frederickson, Paul O.

    1988-01-01

    Four totally parallel algorithms for the solution of a sparse linear system have common characteristics which become quite apparent when they are implemented on a highly parallel hypercube such as the CM2. These four algorithms are Parallel Superconvergent Multigrid (PSMG) of Frederickson and McBryan, Robust Multigrid (RMG) of Hackbusch, the FFT-based Spectral Algorithm, and Parallel Cyclic Reduction. In fact, all four can be formulated as particular cases of the same totally parallel multilevel algorithm, which is referred to as TPMA. In certain cases the spectral radius of TPMA is zero, and it is recognized to be a direct algorithm. In many other cases the spectral radius, although not zero, is small enough that a single iteration per timestep keeps the local error within the required tolerance.

  8. Parallel computing works

    Energy Technology Data Exchange (ETDEWEB)

    1991-10-23

    An account of the Caltech Concurrent Computation Program (C{sup 3}P), a five year project that focused on answering the question: ``Can parallel computers be used to do large-scale scientific computations?'' As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C{sup 3}P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high performance computing facility based exclusively on parallel computers. While the initial focus of C{sup 3}P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  9. Massively parallel mathematical sieves

    Energy Technology Data Exchange (ETDEWEB)

    Montry, G.R.

    1989-01-01

    The Sieve of Eratosthenes is a well-known algorithm for finding all prime numbers in a given subset of integers. A parallel version of the Sieve is described that produces computational speedups over 800 on a hypercube with 1,024 processing elements for problems of fixed size. Computational speedups as high as 980 are achieved when the problem size per processor is fixed. The method of parallelization generalizes to other sieves and will be efficient on any ensemble architecture. We investigate two highly parallel sieves using scattered decomposition and compare their performance on a hypercube multiprocessor. A comparison of different parallelization techniques for the sieve illustrates the trade-offs necessary in the design and implementation of massively parallel algorithms for large ensemble computers.
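    The scattered/blocked decomposition the abstract describes can be emulated serially: a small serial sieve produces the base primes up to √N as shared read-only input, and each segment of the range is then sieved independently — the step that parallelizes across processing elements. Segment size and layout below are illustrative choices.

    ```python
    import math

    def base_primes(limit):
        """Serial sieve up to sqrt(N): shared read-only input for every worker."""
        flags = bytearray([1]) * (limit + 1)
        flags[0:2] = b"\x00\x00"
        for p in range(2, int(math.isqrt(limit)) + 1):
            if flags[p]:
                flags[p * p :: p] = bytearray(len(flags[p * p :: p]))
        return [i for i, f in enumerate(flags) if f]

    def sieve_segment(lo, hi, primes):
        """Mark composites in [lo, hi); independent of every other segment,
        so segments can be handed to separate processors (here run serially)."""
        flags = bytearray([1]) * (hi - lo)
        for p in primes:
            start = max(p * p, ((lo + p - 1) // p) * p)  # first multiple >= lo
            for m in range(start, hi, p):
                flags[m - lo] = 0
        return [lo + i for i, f in enumerate(flags) if f and lo + i > 1]

    N = 100
    primes = base_primes(int(math.isqrt(N)))
    segments = [(lo, min(lo + 25, N + 1)) for lo in range(2, N + 1, 25)]
    all_primes = sorted(p for lo, hi in segments for p in sieve_segment(lo, hi, primes))
    ```

    With the problem size per processor fixed (each worker keeping its own segment), the segments never communicate after receiving the base primes, which is what makes the decomposition scale.
    
    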

  10. Noisy Spiking in Visual Area V2 of Amblyopic Monkeys.

    Science.gov (United States)

    Wang, Ye; Zhang, Bin; Tao, Xiaofeng; Wensveen, Janice M; Smith, Earl L; Chino, Yuzo M

    2017-01-25

    Interocular decorrelation of input signals in developing visual cortex can cause impaired binocular vision and amblyopia. Although increased intrinsic noise is thought to be responsible for a range of perceptual deficits in amblyopic humans, the neural basis for the elevated perceptual noise in amblyopic primates is not known. Here, we tested the idea that perceptual noise is linked to the neuronal spiking noise (variability) resulting from developmental alterations in cortical circuitry. To assess spiking noise, we analyzed the contrast-dependent dynamics of spike counts and spiking irregularity by calculating the square of the coefficient of variation in interspike intervals (CV²) and the trial-to-trial fluctuations in spiking, or mean matched Fano factor (m-FF) in visual area V2 of monkeys reared with chronic monocular defocus. In amblyopic neurons, the contrast versus response functions and the spike count dynamics exhibited significant deviations from comparable data for normal monkeys. The CV² was pronounced in amblyopic neurons for high-contrast stimuli and the m-FF was abnormally high in amblyopic neurons for low-contrast gratings. The spike count, CV², and m-FF of spontaneous activity were also elevated in amblyopic neurons. These contrast-dependent spiking irregularities were correlated with the level of binocular suppression in these V2 neurons and with the severity of perceptual loss for individual monkeys. Our results suggest that the developmental alterations in normalization mechanisms resulting from early binocular suppression can explain much of these contrast-dependent spiking abnormalities in V2 neurons and the perceptual performance of our amblyopic monkeys. Amblyopia is a common developmental vision disorder in humans. Despite the extensive animal studies on how amblyopia emerges, we know surprisingly little about the neural basis of amblyopia in humans and nonhuman primates. Although the vision of amblyopic humans is often described as
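    The two variability measures the study relies on have compact definitions. A minimal sketch (not the authors' code; the mean-matching step of the m-FF is omitted, so the second function is the plain Fano factor):

```python
import numpy as np

def cv2_of_isis(spike_times):
    """Squared coefficient of variation of inter-spike intervals:
    CV^2 = var(ISI) / mean(ISI)^2. It is ~1 for a Poisson process
    and ~0 for a perfectly regular spike train."""
    isi = np.diff(np.sort(np.asarray(spike_times, dtype=float)))
    return isi.var() / isi.mean() ** 2

def fano_factor(counts_per_trial):
    """Trial-to-trial spike-count variability: var(count) / mean(count).
    The paper's mean-matched Fano factor adds a rate-matching step across
    conditions, which this plain version omits."""
    c = np.asarray(counts_per_trial, dtype=float)
    return c.var() / c.mean()
```

    For a regular spike train the CV² is near zero, while a Poisson train gives CV² and Fano factor both near one, which is the usual baseline against which "noisy spiking" is judged.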

  11. Asynchronous Rate Chaos in Spiking Neuronal Circuits.

    Directory of Open Access Journals (Sweden)

    Omri Harish

    2015-07-01

    Full Text Available The brain exhibits temporally complex patterns of activity with features similar to those of chaotic systems. Theoretical studies over the last twenty years have described various computational advantages for such regimes in neuronal systems. Nevertheless, it still remains unclear whether chaos requires specific cellular properties or network architectures, or whether it is a generic property of neuronal circuits. We investigate the dynamics of networks of excitatory-inhibitory (EI) spiking neurons with random sparse connectivity operating in the regime of balance of excitation and inhibition. Combining Dynamical Mean-Field Theory with numerical simulations, we show that chaotic, asynchronous firing rate fluctuations emerge generically for sufficiently strong synapses. Two different mechanisms can lead to these chaotic fluctuations. One mechanism relies on slow I-I inhibition which gives rise to slow subthreshold voltage and rate fluctuations. The decorrelation time of these fluctuations is proportional to the time constant of the inhibition. The second mechanism relies on the recurrent E-I-E feedback loop. It requires slow excitation but the inhibition can be fast. In the corresponding dynamical regime all neurons exhibit rate fluctuations on the time scale of the excitation. Another feature of this regime is that the population-averaged firing rate is substantially smaller in the excitatory population than in the inhibitory population. This is not necessarily the case in the I-I mechanism. Finally, we discuss the neurophysiological and computational significance of our results.

  12. Memory recall and spike-frequency adaptation

    Science.gov (United States)

    Roach, James P.; Sander, Leonard M.; Zochowski, Michal R.

    2016-05-01

    The brain can reproduce memories from partial data; this ability is critical for memory recall. The process of memory recall has been studied using autoassociative networks such as the Hopfield model. This kind of model reliably converges to stored patterns that contain the memory. However, it is unclear how the behavior is controlled by the brain so that after convergence to one configuration, it can proceed with recognition of another one. In the Hopfield model, this happens only through unrealistic changes of an effective global temperature that destabilizes all stored configurations. Here we show that spike-frequency adaptation (SFA), a common mechanism affecting neuron activation in the brain, can provide state-dependent control of pattern retrieval. We demonstrate this in a Hopfield network modified to include SFA, and also in a model network of biophysical neurons. In both cases, SFA allows for selective stabilization of attractors with different basins of attraction, and also for temporal dynamics of attractor switching that is not possible in standard autoassociative schemes. The dynamics of our models give a plausible account of different sorts of memory retrieval.
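    The mechanism can be illustrated with a toy Hopfield network in which a per-unit adaptation variable accumulates while a unit stays active and is subtracted from its input, destabilizing the currently retrieved attractor. This is a hedged sketch of the general idea, not the paper's model; the network size, adaptation gain, and time constant are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 200, 3
patterns = rng.choice([-1, 1], size=(P, N))   # stored memories
W = (patterns.T @ patterns) / N               # Hebbian weights
np.fill_diagonal(W, 0.0)

def run(s, theta_gain=0.0, tau=20.0, steps=1200):
    """Asynchronous updates with a per-unit adaptation variable `theta`
    that builds up while a unit stays active and is subtracted from its
    input, a toy stand-in for spike-frequency adaptation."""
    theta = np.zeros(N)
    for _ in range(steps):
        i = rng.integers(N)
        h = W[i] @ s - theta_gain * theta[i]
        s[i] = 1 if h >= 0 else -1
        theta += (s > 0) / tau                # adaptation grows on active units
    return s

# Without adaptation (theta_gain=0) the network settles into the nearest
# stored pattern; with theta_gain > 0 the retrieved attractor is destabilized.
cue = patterns[0].copy()
cue[:40] *= -1                                # corrupt 20% of the bits
recalled = run(cue.copy(), theta_gain=0.0)
```

    With the adaptation gain at zero this is the standard autoassociative recall; raising it lets the network leave a retrieved configuration without any global temperature change, which is the paper's central point.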

  13. Phase diagram of spiking neural networks.

    Science.gov (United States)

    Seyed-Allaei, Hamed

    2015-01-01

    In computer simulations of spiking neural networks, it is often assumed that any two neurons of the network are connected with a probability of 2%, and that 20% of neurons are inhibitory and 80% are excitatory. These common values are based on experiments, observations, and trial and error. Here, I take a different perspective, inspired by evolution: I systematically simulate many networks, each with a different set of parameters, and then try to determine what makes the common values desirable. I stimulate networks with pulses and measure their dynamic range, the dominant frequency of population activity, the total duration of activity, the maximum population rate, and the time at which that maximum occurs. The results are organized in a phase diagram. This phase diagram gives insight into the space of parameters (excitatory-to-inhibitory ratio, sparseness of connections, and synaptic weights) and can be used to decide the parameters of a model. The phase diagrams show that networks configured according to the common values have a good dynamic range in response to an impulse, that this dynamic range is robust with respect to synaptic weights, and that for some synaptic weights they oscillate at α or β frequencies, independent of external stimuli.
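    The "common values" the abstract starts from translate directly into a connectivity matrix. A minimal sketch (the relative inhibitory strength `g` is an assumption of the sketch, not a value from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)
N, p_conn = 1000, 0.02          # 2% connection probability (the "common value")
n_exc = int(0.8 * N)            # 80% excitatory, 20% inhibitory

# Random sparse connectivity: conn[i, j] is a synapse from neuron j onto i.
conn = rng.random((N, N)) < p_conn
np.fill_diagonal(conn, False)   # no self-connections

# Signed weights: columns of excitatory neurons positive, inhibitory negative.
g = 4.0                         # relative inhibitory strength (illustrative)
W = conn.astype(float)
W[:, n_exc:] *= -g
```

    Sweeping `p_conn`, the excitatory fraction, and the weight scale over a grid of such matrices is the kind of systematic parameter exploration from which the paper's phase diagrams are built.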

  14. Asynchronous Rate Chaos in Spiking Neuronal Circuits

    Science.gov (United States)

    Harish, Omri; Hansel, David

    2015-01-01

    The brain exhibits temporally complex patterns of activity with features similar to those of chaotic systems. Theoretical studies over the last twenty years have described various computational advantages for such regimes in neuronal systems. Nevertheless, it still remains unclear whether chaos requires specific cellular properties or network architectures, or whether it is a generic property of neuronal circuits. We investigate the dynamics of networks of excitatory-inhibitory (EI) spiking neurons with random sparse connectivity operating in the regime of balance of excitation and inhibition. Combining Dynamical Mean-Field Theory with numerical simulations, we show that chaotic, asynchronous firing rate fluctuations emerge generically for sufficiently strong synapses. Two different mechanisms can lead to these chaotic fluctuations. One mechanism relies on slow I-I inhibition which gives rise to slow subthreshold voltage and rate fluctuations. The decorrelation time of these fluctuations is proportional to the time constant of the inhibition. The second mechanism relies on the recurrent E-I-E feedback loop. It requires slow excitation but the inhibition can be fast. In the corresponding dynamical regime all neurons exhibit rate fluctuations on the time scale of the excitation. Another feature of this regime is that the population-averaged firing rate is substantially smaller in the excitatory population than in the inhibitory population. This is not necessarily the case in the I-I mechanism. Finally, we discuss the neurophysiological and computational significance of our results. PMID:26230679

  15. Automated spike preparation system for Isotope Dilution Mass Spectrometry (IDMS)

    International Nuclear Information System (INIS)

    Maxwell, S.L. III; Clark, J.P.

    1990-01-01

    Isotope Dilution Mass Spectrometry (IDMS) is a method frequently employed to measure dissolved, irradiated nuclear materials. A known quantity of a unique isotope of the element to be measured (referred to as the ''spike'') is added to the solution containing the analyte. The resulting solution is chemically purified and then analyzed by mass spectrometry. By measuring the magnitude of the response for each isotope, including the ''unique spike'', and relating these to the known quantity of the ''spike'', the quantity of the nuclear material can be determined. An automated spike preparation system was developed at the Savannah River Site (SRS) to dispense spikes for use in IDMS analytical methods. Prior to this development, technicians weighed each individual spike manually to achieve the accuracy required. This procedure was time-consuming and subjected the master stock solution to evaporation. The new system employs a high precision SMI Model 300 Unipump dispenser interfaced with an electronic balance and a portable Epson HX-20 notebook computer to automate spike preparation.
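    The quantitation step described above reduces to a simple ratio calculation. A deliberately simplified sketch (not the SRS procedure; real IDMS additionally corrects for isotopic abundances, blanks, and mass bias):

```python
def idms_amount(spike_amount, r_analyte, r_spike):
    """Toy isotope-dilution estimate. Assuming the mass spectrometer's
    response for each isotope is proportional to its amount (equal
    sensitivity for both isotopes), the analyte amount follows from the
    measured response ratio:

        analyte = spike_amount * (r_analyte / r_spike)

    The abundance, blank, and mass-bias corrections of real IDMS are
    omitted from this illustrative version."""
    if r_spike <= 0:
        raise ValueError("spike isotope response must be positive")
    return spike_amount * (r_analyte / r_spike)
```

    For example, under these assumptions a 1.0 mg spike whose isotope peak reads 50 units alongside an analyte peak of 250 units implies 5.0 mg of analyte, which is why the accuracy of the dispensed spike quantity matters so much.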

  16. Geomagnetic spikes on the core-mantle boundary

    Science.gov (United States)

    Davies, C. J.; Constable, C.

    2017-12-01

    Extreme variations of Earth's magnetic field occurred in the Levantine region around 1000 BC, where the field intensity rose and fell by a factor of 2-3 over a short time and confined spatial region. There is presently no coherent link between this intensity spike and the generating processes in Earth's liquid core. Here we test the attribution of a surface spike to a flux patch visible on the core-mantle boundary (CMB), calculating geometric and energetic bounds on resulting surface geomagnetic features. We show that the Levantine intensity high must span at least 60 degrees in longitude. Models providing the best trade-off between matching surface spike intensity, minimizing L1 and L2 misfit to the available data and satisfying core energy constraints produce CMB spikes 8-22 degrees wide with peak values of O(100) mT. We propose that the Levantine spike grew in place before migrating northward and westward, contributing to the growth of the axial dipole field seen in Holocene field models. Estimates of Ohmic dissipation suggest that diffusive processes, which are often neglected, likely govern the ultimate decay of geomagnetic spikes. Using these results, we search for the presence of spike-like features in geodynamo simulations.

  17. An Efficient Hardware Circuit for Spike Sorting Based on Competitive Learning Networks

    Directory of Open Access Journals (Sweden)

    Huan-Yuan Chen

    2017-09-01

    Full Text Available This study aims to present an effective VLSI circuit for multi-channel spike sorting. The circuit supports the spike detection, feature extraction and classification operations. The detection circuit is implemented in accordance with the nonlinear energy operator algorithm. Both the peak detection and area computation operations are adopted for the realization of the hardware architecture for feature extraction. The resulting feature vectors are classified by a circuit for competitive learning (CL) neural networks. The CL circuit supports both online training and classification. In the proposed architecture, all the channels share the same detection, feature extraction, learning and classification circuits for a low area cost hardware implementation. The clock-gating technique is also employed for reducing the power dissipation. To evaluate the performance of the architecture, an application-specific integrated circuit (ASIC) implementation is presented. Experimental results demonstrate that the proposed circuit exhibits the advantages of a low chip area, a low power dissipation and a high classification success rate for spike sorting.
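    The nonlinear energy operator used by the detection stage has a compact definition that is easy to state in software, even though the paper implements it in hardware. A sketch (the threshold rule and the constant `k` are common heuristics assumed here, not the circuit's exact rule):

```python
import numpy as np

def neo(x):
    """Nonlinear energy operator: psi[n] = x[n]^2 - x[n-1] * x[n+1].
    It is large only where the signal is simultaneously high in amplitude
    and in frequency, which is why it highlights spikes over background
    noise better than plain amplitude thresholding."""
    x = np.asarray(x, dtype=float)
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1] ** 2 - x[:-2] * x[2:]
    return psi

def detect_spikes(x, k=8.0):
    """Return sample indices where the NEO output exceeds k times its mean
    (an assumed scaling heuristic for the detection threshold)."""
    psi = neo(x)
    return np.flatnonzero(psi > k * psi.mean())
```

    The operator needs only two multiplications and a subtraction per sample, which is what makes it attractive for a shared low-area detection circuit.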

  18. An Efficient Hardware Circuit for Spike Sorting Based on Competitive Learning Networks

    Science.gov (United States)

    Chen, Huan-Yuan; Chen, Chih-Chang

    2017-01-01

    This study aims to present an effective VLSI circuit for multi-channel spike sorting. The circuit supports the spike detection, feature extraction and classification operations. The detection circuit is implemented in accordance with the nonlinear energy operator algorithm. Both the peak detection and area computation operations are adopted for the realization of the hardware architecture for feature extraction. The resulting feature vectors are classified by a circuit for competitive learning (CL) neural networks. The CL circuit supports both online training and classification. In the proposed architecture, all the channels share the same detection, feature extraction, learning and classification circuits for a low area cost hardware implementation. The clock-gating technique is also employed for reducing the power dissipation. To evaluate the performance of the architecture, an application-specific integrated circuit (ASIC) implementation is presented. Experimental results demonstrate that the proposed circuit exhibits the advantages of a low chip area, a low power dissipation and a high classification success rate for spike sorting. PMID:28956859

  19. Breaking HIV News to Clients: SPIKES Strategy in Post-Test Counseling Session

    Directory of Open Access Journals (Sweden)

    Hamid Emadi-Koochak

    2016-05-01

    Full Text Available Breaking bad news is one of the most burdensome tasks physicians face in their everyday practice. It becomes even more challenging in the context of HIV+ patients because of stigma and discrimination. The aim of the current study is to evaluate the quality of giving HIV seroconversion news according to the SPIKES protocol. A total of 154 consecutive HIV+ patients from the Imam Khomeini Hospital testing and counseling center were enrolled in this study. Patients were asked how they were given the HIV news and whether or not they received pre- and post-test counseling sessions. Around 51% of them were men, 80% had high school education, and 56% were employed. Regarding marital status, 32% were single, and 52% were married at the time of the interview. Among them, 31% had received the HIV news in a counseling center, and only 29% had pre-test counseling. SPIKES criteria were significantly met when the HIV news was given in an HIV counseling and testing center (P < 0.05). Low coverage of HIV counseling services was observed in the study. SPIKES criteria were significantly met when the HIV seroconversion news was given in a counseling center. The need to further train staff to deliver HIV news seems a priority in the field of HIV care and treatment.

  20. Multi-layer network utilizing rewarded spike time dependent plasticity to learn a foraging task.

    Directory of Open Access Journals (Sweden)

    Pavel Sanda

    2017-09-01

    Full Text Available Neural networks with a single plastic layer employing reward modulated spike time dependent plasticity (STDP) are capable of learning simple foraging tasks. Here we demonstrate advanced pattern discrimination and continuous learning in a network of spiking neurons with multiple plastic layers. The network utilized both reward modulated and non-reward modulated STDP and implemented multiple mechanisms for homeostatic regulation of synaptic efficacy, including heterosynaptic plasticity, gain control, output balancing, activity normalization of rewarded STDP and hard limits on synaptic strength. We found that addition of a hidden layer of neurons employing non-rewarded STDP created neurons that responded to the specific combinations of inputs and thus performed basic classification of the input patterns. When combined with a following layer of neurons implementing rewarded STDP, the network was able to learn, despite the absence of labeled training data, discrimination between rewarding patterns and the patterns designated as punishing. Synaptic noise allowed for trial-and-error learning that helped to identify the goal-oriented strategies which were effective in task solving. The study predicts a critical set of properties of the spiking neuronal network with STDP that was sufficient to solve a complex foraging task involving pattern classification and decision making.
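    The core rule, rewarded STDP with hard limits on synaptic strength, can be sketched for a single synapse: pair-based STDP events feed an eligibility trace, and the weight only moves when a reward signal arrives. All constants are illustrative rather than the paper's, and the homeostatic mechanisms (heterosynaptic plasticity, gain control, output balancing) are omitted.

```python
import numpy as np

class RewardSTDPSynapse:
    """Toy reward-modulated STDP synapse. STDP pairings charge an
    eligibility trace; a (possibly delayed) reward converts the trace
    into an actual weight change, clipped to hard bounds."""

    def __init__(self, w=0.5, a_plus=0.01, a_minus=0.012,
                 tau=20.0, tau_e=200.0, lr=1.0, w_max=1.0):
        self.w, self.a_plus, self.a_minus = w, a_plus, a_minus
        self.tau, self.tau_e, self.lr, self.w_max = tau, tau_e, lr, w_max
        self.x_pre = self.x_post = self.elig = 0.0

    def step(self, dt, pre_spike, post_spike, reward=0.0):
        # decay the pre/post spike traces and the eligibility trace
        self.x_pre *= np.exp(-dt / self.tau)
        self.x_post *= np.exp(-dt / self.tau)
        self.elig *= np.exp(-dt / self.tau_e)
        if pre_spike:                       # pre arriving after post: depress
            self.elig -= self.a_minus * self.x_post
            self.x_pre += 1.0
        if post_spike:                      # post arriving after pre: potentiate
            self.elig += self.a_plus * self.x_pre
            self.x_post += 1.0
        # reward gates the actual weight update (hard limits on w)
        self.w = float(np.clip(self.w + self.lr * reward * self.elig,
                               0.0, self.w_max))
        return self.w
```

    Because the trace decays slowly relative to the spike traces, a reward arriving well after the causal pairing can still credit the right synapse, which is what lets trial-and-error learning work without labeled training data.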

  1. DL-ReSuMe: A Delay Learning-Based Remote Supervised Method for Spiking Neurons.

    Science.gov (United States)

    Taherkhani, Aboozar; Belatreche, Ammar; Li, Yuhua; Maguire, Liam P

    2015-12-01

    Recent research has shown the potential capability of spiking neural networks (SNNs) to model complex information processing in the brain. There is biological evidence to prove the use of the precise timing of spikes for information coding. However, the exact learning mechanism in which the neuron is trained to fire at precise times remains an open problem. The majority of the existing learning methods for SNNs are based on weight adjustment. However, there is also biological evidence that the synaptic delay is not constant. In this paper, a learning method for spiking neurons, called delay learning remote supervised method (DL-ReSuMe), is proposed to merge the delay shift approach and ReSuMe-based weight adjustment to enhance the learning performance. DL-ReSuMe uses more biologically plausible properties, such as delay learning, and needs less weight adjustment than ReSuMe. Simulation results have shown that the proposed DL-ReSuMe approach achieves learning accuracy and learning speed improvements compared with ReSuMe.

  2. One weird trick for parallelizing convolutional neural networks

    OpenAIRE

    Krizhevsky, Alex

    2014-01-01

    I present a new way to parallelize the training of convolutional neural networks across multiple GPUs. The method scales significantly better than all alternatives when applied to modern convolutional neural networks.

  3. Algorithms for parallel computers

    International Nuclear Information System (INIS)

    Churchhouse, R.F.

    1985-01-01

    Until relatively recently almost all the algorithms for use on computers had been designed on the (usually unstated) assumption that they were to be run on single processor, serial machines. With the introduction of vector processors, array processors and interconnected systems of mainframes, minis and micros, however, various forms of parallelism have become available. The advantage of parallelism is that it offers increased overall processing speed, but it also raises some fundamental questions, including: (i) Which, if any, of the existing 'serial' algorithms can be adapted for use in the parallel mode? (ii) How close to optimal can such adapted algorithms be and, where relevant, what are the convergence criteria? (iii) How can we design new algorithms specifically for parallel systems? (iv) For multi-processor systems, how can we handle the software aspects of the interprocessor communications? Aspects of these questions, illustrated by examples, are considered in these lectures. (orig.)

  4. Parallelism and array processing

    International Nuclear Information System (INIS)

    Zacharov, V.

    1983-01-01

    Modern computing, as well as the historical development of computing, has been dominated by sequential monoprocessing. Yet there is the alternative of parallelism, where several processes may be in concurrent execution. This alternative is discussed in a series of lectures, in which the main developments involving parallelism are considered, both from the standpoint of computing systems and that of applications that can exploit such systems. The lectures seek to discuss parallelism in a historical context, and to identify all the main aspects of concurrency in computation right up to the present time. Included will be consideration of the important question as to what use parallelism might be in the field of data processing. (orig.)

  5. Boobs, Boxing, and Bombs: Problematizing the Entertainment of Spike TV

    OpenAIRE

    Walton, Gerald; Potvin, L.

    2009-01-01

    Spike is the only television network in North America “for men.” Its motto, “Get more action,” is suggestive of pursuits of various forms of violence. We conceptualize Spike not as trivial entertainment, but rather as a form of pop culture that erodes the gains of feminists who have challenged the prevalence of normalized hegemonic masculinity (HM). Our paper highlights themes of Spike content, and connects those themes to the literature on HM. Moreover, we validate the identities and lives ...

  6. Parallel magnetic resonance imaging

    International Nuclear Information System (INIS)

    Larkman, David J; Nunes, Rita G

    2007-01-01

    Parallel imaging has been the single biggest innovation in magnetic resonance imaging in the last decade. The use of multiple receiver coils to augment the time consuming Fourier encoding has reduced acquisition times significantly. This increase in speed comes at a time when other approaches to acquisition time reduction were reaching engineering and human limits. A brief summary of spatial encoding in MRI is followed by an introduction to the problem parallel imaging is designed to solve. There are a large number of parallel reconstruction algorithms; this article reviews a cross-section, SENSE, SMASH, g-SMASH and GRAPPA, selected to demonstrate the different approaches. Theoretical (the g-factor) and practical (coil design) limits to acquisition speed are reviewed. The practical implementation of parallel imaging is also discussed, in particular coil calibration. How to recognize potential failure modes and their associated artefacts are shown. Well-established applications including angiography, cardiac imaging and applications using echo planar imaging are reviewed and we discuss what makes a good application for parallel imaging. Finally, active research areas where parallel imaging is being used to improve data quality by repairing artefacted images are also reviewed. (invited topical review)
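    Of the reviewed algorithms, SENSE has the most self-contained description: with acceleration factor R, each pixel of the folded image is a coil-weighted sum of R true pixels, and the true pixels are recovered by a small per-pixel least-squares solve against the coil sensitivity maps. A hedged 1-D sketch (real reconstructions work with measured complex data, noise decorrelation and regularization, all omitted here):

```python
import numpy as np

def sense_unfold(aliased, sens):
    """Unfold an R-times accelerated 1-D image.
    aliased: (n_coils, n // R) folded coil images
    sens:    (n_coils, n) coil sensitivity maps"""
    n_coils, n = sens.shape
    fold = aliased.shape[1]
    R = n // fold
    out = np.zeros(n, dtype=complex)   # MRI data are complex in general
    for p in range(fold):
        cols = [p + r * fold for r in range(R)]  # true pixels folded onto p
        S = sens[:, cols]                        # (n_coils, R) encoding matrix
        out[cols] = np.linalg.lstsq(S, aliased[:, p], rcond=None)[0]
    return out
```

    The conditioning of each small matrix S is exactly where the g-factor noise penalty mentioned above comes from: nearly parallel sensitivity columns make the solve ill-conditioned and amplify noise.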

  7. Real-time cerebellar neuroprosthetic system based on a spiking neural network model of motor learning

    Science.gov (United States)

    Xu, Tao; Xiao, Na; Zhai, Xiaolong; Chan, Pak Kwan; Tin, Chung

    2018-02-01

    Objective. Damage to the brain, as a result of various medical conditions, impacts the everyday life of patients and there is still no complete cure to neurological disorders. Neuroprostheses that can functionally replace the damaged neural circuit have recently emerged as a possible solution to these problems. Here we describe the development of a real-time cerebellar neuroprosthetic system to substitute neural function in cerebellar circuitry for learning delay eyeblink conditioning (DEC). Approach. The system was empowered by a biologically realistic spiking neural network (SNN) model of the cerebellar neural circuit, which considers the neuronal population and anatomical connectivity of the network. The model simulated synaptic plasticity critical for learning DEC. This SNN model was carefully implemented on a field programmable gate array (FPGA) platform for real-time simulation. This hardware system was interfaced in in vivo experiments with anesthetized rats and it used neural spikes recorded online from the animal to learn and trigger conditioned eyeblink in the animal during training. Main results. This rat-FPGA hybrid system was able to process neuronal spikes in real-time with an embedded cerebellum model of ~10 000 neurons and reproduce learning of DEC with different inter-stimulus intervals. Our results validated that the system performance is physiologically relevant at both the neural (firing pattern) and behavioral (eyeblink pattern) levels. Significance. This integrated system provides the sufficient computation power for mimicking the cerebellar circuit in real-time. The system interacts with the biological system naturally at the spike level and can be generalized for including other neural components (neuron types and plasticity) and neural functions for potential neuroprosthetic applications.

  8. Spike propagation through the dorsal root ganglia in an unmyelinated sensory neuron: a modeling study.

    Science.gov (United States)

    Sundt, Danielle; Gamper, Nikita; Jaffe, David B

    2015-12-01

    Unmyelinated C-fibers are a major type of sensory neurons conveying pain information. Action potential conduction is regulated by the bifurcation (T-junction) of sensory neuron axons within the dorsal root ganglia (DRG). Understanding how C-fiber signaling is influenced by the morphology of the T-junction and the local expression of ion channels is important for understanding pain signaling. In this study we used biophysical computer modeling to investigate the influence of axon morphology within the DRG and various membrane conductances on the reliability of spike propagation. As expected, calculated input impedance and the amplitude of propagating action potentials were both lowest at the T-junction. Propagation reliability for single spikes was highly sensitive to the diameter of the stem axon and the density of voltage-gated Na(+) channels. A model containing only fast voltage-gated Na(+) and delayed-rectifier K(+) channels conducted trains of spikes up to frequencies of 110 Hz. The addition of slowly activating KCNQ channels (i.e., KV7 or M-channels) to the model reduced the following frequency to 30 Hz. Hyperpolarization produced by addition of a much slower conductance, such as a Ca(2+)-dependent K(+) current, was needed to reduce the following frequency to 6 Hz. Attenuation of driving force due to ion accumulation or hyperpolarization produced by a Na(+)-K(+) pump had no effect on following frequency but could influence the reliability of spike propagation mutually with the voltage shift generated by a Ca(2+)-dependent K(+) current. These simulations suggest how specific ion channels within the DRG may contribute toward therapeutic treatments for chronic pain. Copyright © 2015 the American Physiological Society.

  9. Probabilistic Decision Making with Spikes: From ISI Distributions to Behaviour via Information Gain.

    Directory of Open Access Journals (Sweden)

    Javier A Caballero

    Full Text Available Computational theories of decision making in the brain usually assume that sensory 'evidence' is accumulated supporting a number of hypotheses, and that the first accumulator to reach threshold triggers a decision in favour of its associated hypothesis. However, the evidence is often assumed to occur as a continuous process whose origins are somewhat abstract, with no direct link to the neural signals - action potentials or 'spikes' - that must ultimately form the substrate for decision making in the brain. Here we introduce a new variant of the well-known multi-hypothesis sequential probability ratio test (MSPRT) for decision making whose evidence observations consist of the basic unit of neural signalling - the inter-spike interval (ISI) - and which is based on a new form of the likelihood function. We dub this mechanism s-MSPRT and show its precise form for a range of realistic ISI distributions with positive support. In this way we show that, at the level of spikes, the refractory period may actually facilitate shorter decision times, and that the mechanism is robust against poor choice of the hypothesized data distribution. We show that s-MSPRT performance is related to the Kullback-Leibler divergence (KLD) or information gain between ISI distributions, through which we are able to link neural signalling to psychophysical observation at the behavioural level. Thus, we find the mean information needed for a decision is constant, thereby offering an account of Hick's law (relating decision time to the number of choices). Further, the mean decision time of s-MSPRT shows a power law dependence on the KLD offering an account of Piéron's law (relating reaction time to stimulus intensity). These results show the foundations for a research programme in which spike train analysis can be made the basis for predictions about behavior in multi-alternative choice tasks.

  10. Probabilistic Decision Making with Spikes: From ISI Distributions to Behaviour via Information Gain.

    Science.gov (United States)

    Caballero, Javier A; Lepora, Nathan F; Gurney, Kevin N

    2015-01-01

    Computational theories of decision making in the brain usually assume that sensory 'evidence' is accumulated supporting a number of hypotheses, and that the first accumulator to reach threshold triggers a decision in favour of its associated hypothesis. However, the evidence is often assumed to occur as a continuous process whose origins are somewhat abstract, with no direct link to the neural signals - action potentials or 'spikes' - that must ultimately form the substrate for decision making in the brain. Here we introduce a new variant of the well-known multi-hypothesis sequential probability ratio test (MSPRT) for decision making whose evidence observations consist of the basic unit of neural signalling - the inter-spike interval (ISI) - and which is based on a new form of the likelihood function. We dub this mechanism s-MSPRT and show its precise form for a range of realistic ISI distributions with positive support. In this way we show that, at the level of spikes, the refractory period may actually facilitate shorter decision times, and that the mechanism is robust against poor choice of the hypothesized data distribution. We show that s-MSPRT performance is related to the Kullback-Leibler divergence (KLD) or information gain between ISI distributions, through which we are able to link neural signalling to psychophysical observation at the behavioural level. Thus, we find the mean information needed for a decision is constant, thereby offering an account of Hick's law (relating decision time to the number of choices). Further, the mean decision time of s-MSPRT shows a power law dependence on the KLD offering an account of Piéron's law (relating reaction time to stimulus intensity). These results show the foundations for a research programme in which spike train analysis can be made the basis for predictions about behavior in multi-alternative choice tasks.
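    The idea of running a multi-hypothesis sequential test directly on inter-spike intervals can be sketched concretely. This is a simplification, not the paper's s-MSPRT: it hard-codes exponential (Poisson-firing) ISI densities, whereas the paper's contribution is a likelihood form for general ISI distributions with positive support.

```python
import math

def msprt_isi(isis, rates, threshold=0.99):
    """Multi-hypothesis sequential probability ratio test whose observations
    are inter-spike intervals. Each hypothesis i is Poisson firing at
    rates[i], i.e. exponential ISI density f(x; r) = r * exp(-r * x).
    Decides as soon as one hypothesis's normalized likelihood crosses the
    threshold; assumes at least one ISI is supplied."""
    logL = [0.0] * len(rates)
    for n, isi in enumerate(isis, start=1):
        for i, r in enumerate(rates):
            logL[i] += math.log(r) - r * isi      # log of exponential density
        m = max(logL)
        post = [math.exp(l - m) for l in logL]    # likelihoods, rescaled
        z = sum(post)
        best = max(range(len(rates)), key=lambda i: post[i])
        if post[best] / z >= threshold:
            return best, n                        # hypothesis and sample count
    return best, n
```

    The number of ISIs consumed before the threshold is crossed plays the role of decision time, and its dependence on how far apart the hypothesized ISI distributions are is where the KLD-based accounts of Hick's and Piéron's laws enter.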

  11. Real-time cerebellar neuroprosthetic system based on a spiking neural network model of motor learning.

    Science.gov (United States)

    Xu, Tao; Xiao, Na; Zhai, Xiaolong; Kwan Chan, Pak; Tin, Chung

    2018-02-01

    Damage to the brain, as a result of various medical conditions, impacts the everyday life of patients and there is still no complete cure to neurological disorders. Neuroprostheses that can functionally replace the damaged neural circuit have recently emerged as a possible solution to these problems. Here we describe the development of a real-time cerebellar neuroprosthetic system to substitute neural function in cerebellar circuitry for learning delay eyeblink conditioning (DEC). The system was empowered by a biologically realistic spiking neural network (SNN) model of the cerebellar neural circuit, which considers the neuronal population and anatomical connectivity of the network. The model simulated synaptic plasticity critical for learning DEC. This SNN model was carefully implemented on a field programmable gate array (FPGA) platform for real-time simulation. This hardware system was interfaced in in vivo experiments with anesthetized rats and it used neural spikes recorded online from the animal to learn and trigger conditioned eyeblink in the animal during training. This rat-FPGA hybrid system was able to process neuronal spikes in real-time with an embedded cerebellum model of ~10 000 neurons and reproduce learning of DEC with different inter-stimulus intervals. Our results validated that the system performance is physiologically relevant at both the neural (firing pattern) and behavioral (eyeblink pattern) levels. This integrated system provides the sufficient computation power for mimicking the cerebellar circuit in real-time. The system interacts with the biological system naturally at the spike level and can be generalized for including other neural components (neuron types and plasticity) and neural functions for potential neuroprosthetic applications.

  12. Integrated workflows for spiking neuronal network simulations

    Directory of Open Access Journals (Sweden)

    Ján eAntolík

    2013-12-01

Full Text Available The increasing availability of computational resources is enabling more detailed, realistic modelling in computational neuroscience, resulting in a shift towards more heterogeneous models of neuronal circuits, and employment of complex experimental protocols. This poses a challenge for existing tool chains, as the set of tools involved in a typical modeller's workflow is expanding concomitantly, with growing complexity in the metadata flowing between them. For many parts of the workflow, a range of tools is available; however, numerous areas lack dedicated tools, while integration of existing tools is limited. This forces modellers to either handle the workflow manually, leading to errors, or to write substantial amounts of code to automate parts of the workflow, in both cases reducing their productivity. To address these issues, we have developed Mozaik: a workflow system for spiking neuronal network simulations written in Python. Mozaik integrates model, experiment and stimulation specification, simulation execution, data storage, data analysis and visualisation into a single automated workflow, ensuring that all relevant metadata are available to all workflow components. It is based on several existing tools, including PyNN, Neo and Matplotlib. It offers a declarative way to specify models and recording configurations using hierarchically organised configuration files. Mozaik automatically records all data together with all relevant metadata about the experimental context, allowing automation of the analysis and visualisation stages. Mozaik has a modular architecture, and the existing modules are designed to be extensible with minimal programming effort.
Mozaik increases the productivity of running virtual experiments on highly structured neuronal networks by automating the entire experimental cycle, while increasing the reliability of modelling studies by relieving the user from manual handling of the flow of metadata between the individual

  13. Heterogeneity of Purkinje cell simple spike-complex spike interactions: zebrin- and non-zebrin-related variations.

    Science.gov (United States)

    Tang, Tianyu; Xiao, Jianqiang; Suh, Colleen Y; Burroughs, Amelia; Cerminara, Nadia L; Jia, Linjia; Marshall, Sarah P; Wise, Andrew K; Apps, Richard; Sugihara, Izumi; Lang, Eric J

    2017-08-01

    Cerebellar Purkinje cells (PCs) generate two types of action potentials, simple and complex spikes. Although they are generated by distinct mechanisms, interactions between the two spike types exist. Zebrin staining produces alternating positive and negative stripes of PCs across most of the cerebellar cortex. Thus, here we compared simple spike-complex spike interactions both within and across zebrin populations. Simple spike activity undergoes a complex modulation preceding and following a complex spike. The amplitudes of the pre- and post-complex spike modulation phases were correlated across PCs. On average, the modulation was larger for PCs in zebrin positive regions. Correlations between aspects of the complex spike waveform and simple spike activity were found, some of which varied between zebrin positive and negative PCs. The implications of the results are discussed with regard to hypotheses that complex spikes are triggered by rises in simple spike activity for either motor learning or homeostatic functions. Purkinje cells (PCs) generate two types of action potentials, called simple and complex spikes (SSs and CSs). We first investigated the CS-associated modulation of SS activity and its relationship to the zebrin status of the PC. The modulation pattern consisted of a pre-CS rise in SS activity, and then, following the CS, a pause, a rebound, and finally a late inhibition of SS activity for both zebrin positive (Z+) and negative (Z-) cells, though the amplitudes of the phases were larger in Z+ cells. Moreover, the amplitudes of the pre-CS rise with the late inhibitory phase of the modulation were correlated across PCs. In contrast, correlations between modulation phases across CSs of individual PCs were generally weak. Next, the relationship between CS spikelets and SS activity was investigated. The number of spikelets/CS correlated with the average SS firing rate only for Z+ cells. 
In contrast, correlations across CSs between spikelet numbers and the

  14. Perceptron learning rule derived from spike-frequency adaptation and spike-time-dependent plasticity.

    Science.gov (United States)

    D'Souza, Prashanth; Liu, Shih-Chii; Hahnloser, Richard H R

    2010-03-09

    It is widely believed that sensory and motor processing in the brain is based on simple computational primitives rooted in cellular and synaptic physiology. However, many gaps remain in our understanding of the connections between neural computations and biophysical properties of neurons. Here, we show that synaptic spike-time-dependent plasticity (STDP) combined with spike-frequency adaptation (SFA) in a single neuron together approximate the well-known perceptron learning rule. Our calculations and integrate-and-fire simulations reveal that delayed inputs to a neuron endowed with STDP and SFA precisely instruct neural responses to earlier arriving inputs. We demonstrate this mechanism on a developmental example of auditory map formation guided by visual inputs, as observed in the external nucleus of the inferior colliculus (ICX) of barn owls. The interplay of SFA and STDP in model ICX neurons precisely transfers the tuning curve from the visual modality onto the auditory modality, demonstrating a useful computation for multimodal and sensory-guided processing.
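The perceptron learning rule that the STDP/SFA combination is shown to approximate can be stated in a few lines; a minimal sketch (the function name and constants are illustrative, not from the paper):

```python
def perceptron_step(w, x, target, lr=0.1):
    """Classic perceptron rule: weights change only on classification
    errors, by an amount proportional to the input."""
    y = 1 if sum(wi * xi for wi, xi in zip(w, x)) > 0 else 0
    return [wi + lr * (target - y) * xi for wi, xi in zip(w, x)]
```

On linearly separable data (e.g. OR with a constant bias input), repeated application converges to a separating weight vector.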

  15. Timing intervals using population synchrony and spike timing dependent plasticity

    Directory of Open Access Journals (Sweden)

    Wei Xu

    2016-12-01

    Full Text Available We present a computational model by which ensembles of regularly spiking neurons can encode different time intervals through synchronous firing. We show that a neuron responding to a large population of convergent inputs has the potential to learn to produce an appropriately-timed output via spike-time dependent plasticity. We explain why temporal variability of this population synchrony increases with increasing time intervals. We also show that the scalar property of timing and its violation at short intervals can be explained by the spike-wise accumulation of jitter in the inter-spike intervals of timing neurons. We explore how the challenge of encoding longer time intervals can be overcome and conclude that this may involve a switch to a different population of neurons with lower firing rate, with the added effect of producing an earlier bias in response. Experimental data on human timing performance show features in agreement with the model’s output.
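The spike-wise accumulation of inter-spike-interval jitter can be illustrated numerically; a minimal sketch (parameter values are arbitrary, and this toy version assumes independent Gaussian jitter on each interval, which gives square-root-of-n growth in the spread of the n-th spike time):

```python
import random
import statistics

def nth_spike_sd(n, mean_isi=0.1, jitter_sd=0.01, trials=2000, seed=0):
    """Standard deviation of the time of the n-th spike when each
    inter-spike interval carries independent Gaussian jitter."""
    rng = random.Random(seed)
    totals = [sum(rng.gauss(mean_isi, jitter_sd) for _ in range(n))
              for _ in range(trials)]
    return statistics.pstdev(totals)
```

Quadrupling the number of intervals roughly doubles the spread; the model's account of the scalar property and its violation at short intervals rests on this kind of interval-by-interval accumulation.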

  16. Inherently stochastic spiking neurons for probabilistic neural computation

    KAUST Repository

    Al-Shedivat, Maruan; Naous, Rawan; Neftci, Emre; Cauwenberghs, Gert; Salama, Khaled N.

    2015-01-01

Our analysis and simulations show that the proposed neuron circuit satisfies a neural computability condition that enables probabilistic neural sampling and spike-based Bayesian learning and inference. Our findings constitute an important step towards

  17. A novel unsupervised spike sorting algorithm for intracranial EEG.

    Science.gov (United States)

    Yadav, R; Shah, A K; Loeb, J A; Swamy, M N S; Agarwal, R

    2011-01-01

    This paper presents a novel, unsupervised spike classification algorithm for intracranial EEG. The method combines template matching and principal component analysis (PCA) for building a dynamic patient-specific codebook without a priori knowledge of the spike waveforms. The problem of misclassification due to overlapping classes is resolved by identifying similar classes in the codebook using hierarchical clustering. Cluster quality is visually assessed by projecting inter- and intra-clusters onto a 3D plot. Intracranial EEG from 5 patients was utilized to optimize the algorithm. The resulting codebook retains 82.1% of the detected spikes in non-overlapping and disjoint clusters. Initial results suggest a definite role of this method for both rapid review and quantitation of interictal spikes that could enhance both clinical treatment and research studies on epileptic patients.
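The codebook-building step can be sketched as correlation-based template matching: a spike joins the best-matching class, or opens a new codebook entry if nothing matches. This is a simplified stand-in for the paper's combined template/PCA method; the threshold value is illustrative:

```python
import math

def corr(a, b):
    """Pearson correlation between two equal-length waveforms."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = math.sqrt(sum((x - ma) ** 2 for x in a))
    db = math.sqrt(sum((y - mb) ** 2 for y in b))
    return num / (da * db) if da and db else 0.0

def assign(spike, codebook, threshold=0.9):
    """Match a spike to its best template, or open a new class."""
    best, best_r = None, threshold
    for i, tpl in enumerate(codebook):
        r = corr(spike, tpl)
        if r >= best_r:
            best, best_r = i, r
    if best is None:
        codebook.append(list(spike))
        return len(codebook) - 1
    return best
```

A hierarchical-clustering pass over the resulting codebook, as in the paper, would then merge overlapping classes.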

  18. Emergent dynamics of spiking neurons with fluctuating threshold

    Science.gov (United States)

    Bhattacharjee, Anindita; Das, M. K.

    2017-05-01

The role of a fluctuating threshold in neuronal dynamics is investigated. The threshold function is assumed to follow a normal probability distribution, and the standard deviation of the inter-spike intervals of the response is computed as an indicator of irregularity in spike emission. It is observed that spiking becomes more irregular as the threshold variation increases. A significant change in the modal characteristics of the inter-spike intervals (ISIs) occurs as a function of the fluctuation parameter. The investigation is then extended to a coupled system of neurons, whose cooperative dynamics are discussed in view of synchronization. Total and partial synchronization regimes are depicted with contour plots of a synchrony measure under various conditions. These results may provide a basis for exploring the complexities of neural communication and brain function.
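A leaky integrate-and-fire toy model makes the central claim concrete: redrawing the threshold from a normal distribution after every spike increases ISI irregularity with the fluctuation parameter. All parameter values below are illustrative, not taken from the paper:

```python
import random
import statistics

def isi_cv(threshold_sd, drive=2.0, dt=0.01, n_spikes=500, seed=1):
    """Coefficient of variation of inter-spike intervals for a leaky
    integrator whose threshold is redrawn from N(1, threshold_sd)
    (clamped to a safe range) after every spike."""
    rng = random.Random(seed)

    def draw():
        return min(max(rng.gauss(1.0, threshold_sd), 0.05), 1.5)

    v, t, last, theta, isis = 0.0, 0.0, 0.0, draw(), []
    while len(isis) < n_spikes:
        v += dt * (drive - v)          # leaky integration, tau = 1
        t += dt
        if v >= theta:                 # threshold crossing: spike
            isis.append(t - last)
            last, v, theta = t, 0.0, draw()
    return statistics.pstdev(isis) / statistics.mean(isis)
```

With a fixed threshold the CV is zero; it grows monotonically with the threshold standard deviation, matching the qualitative result.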

  19. Higher Order Spike Synchrony in Prefrontal Cortex during visual memory

    Directory of Open Access Journals (Sweden)

    Gordon ePipa

    2011-06-01

Full Text Available Precise temporal synchrony of spike firing has been postulated as an important neuronal mechanism for signal integration and the induction of plasticity in neocortex. As prefrontal cortex plays an important role in organizing memory and executive functions, the convergence of multiple visual pathways onto PFC predicts that neurons should preferentially synchronize their spiking when stimulus information is processed. Furthermore, synchronous spike firing should intensify if memory processes require the induction of neuronal plasticity, even if only for the short term. Here we show with multiple simultaneously recorded units in ventral prefrontal cortex that neurons participate in 3 ms precise synchronous discharges distributed across multiple sites separated by at least 500 µm. The frequency of synchronous firing is modulated by behavioral performance and is specific for the memorized visual stimuli. In particular, during the memory period in which activity is not stimulus driven, larger groups of up to 7 sites exhibit performance-dependent modulation of their spike synchronization.
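Counting such millisecond-precise synchronous events across sites can be sketched as a simple coincidence count over pooled spike times; a rough sketch only (real analyses of this kind additionally require surrogate-based significance testing, which is omitted here; spike trains are assumed sorted):

```python
from bisect import bisect_left, bisect_right
from collections import Counter

def coincidence_sizes(trains, window=0.003):
    """For every spike time pooled across sites, count how many
    distinct sites fire within +/- window/2 of it; returns a
    histogram of synchronous-group sizes. Each train must be sorted."""
    sizes = Counter()
    pooled = sorted(t for train in trains for t in train)
    for t in pooled:
        n_sites = sum(
            1 for train in trains
            if bisect_right(train, t + window / 2)
               > bisect_left(train, t - window / 2)
        )
        sizes[n_sites] += 1
    return sizes
```

Higher-order events, groups spanning many sites as in the study, show up as counts at large group sizes.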

  20. The STAPL Parallel Graph Library

    KAUST Repository

    Harshvardhan,; Fidel, Adam; Amato, Nancy M.; Rauchwerger, Lawrence

    2013-01-01

    This paper describes the stapl Parallel Graph Library, a high-level framework that abstracts the user from data-distribution and parallelism details and allows them to concentrate on parallel graph algorithm development. It includes a customizable

  1. ANNarchy: a code generation approach to neural simulations on parallel hardware

    Science.gov (United States)

    Vitay, Julien; Dinkelbach, Helge Ü.; Hamker, Fred H.

    2015-01-01

Many modern neural simulators focus on the simulation of networks of spiking neurons on parallel hardware. Another important framework in computational neuroscience, rate-coded neural networks, is mostly difficult or impossible to implement using these simulators. We present here the ANNarchy (Artificial Neural Networks architect) neural simulator, which allows one to easily define and simulate rate-coded and spiking networks, as well as combinations of both. The interface in Python has been designed to be close to the PyNN interface, while the definition of neuron and synapse models can be specified using an equation-oriented mathematical description similar to the Brian neural simulator. This information is used to generate C++ code that will efficiently perform the simulation on the chosen parallel hardware (multi-core system or graphical processing unit). Several numerical methods are available to transform ordinary differential equations into efficient C++ code. We compare the parallel performance of the simulator to existing solutions. PMID:26283957

  2. Constructing Precisely Computing Networks with Biophysical Spiking Neurons.

    Science.gov (United States)

    Schwemmer, Michael A; Fairhall, Adrienne L; Denéve, Sophie; Shea-Brown, Eric T

    2015-07-15

    While spike timing has been shown to carry detailed stimulus information at the sensory periphery, its possible role in network computation is less clear. Most models of computation by neural networks are based on population firing rates. In equivalent spiking implementations, firing is assumed to be random such that averaging across populations of neurons recovers the rate-based approach. Recently, however, Denéve and colleagues have suggested that the spiking behavior of neurons may be fundamental to how neuronal networks compute, with precise spike timing determined by each neuron's contribution to producing the desired output (Boerlin and Denéve, 2011; Boerlin et al., 2013). By postulating that each neuron fires to reduce the error in the network's output, it was demonstrated that linear computations can be performed by networks of integrate-and-fire neurons that communicate through instantaneous synapses. This left open, however, the possibility that realistic networks, with conductance-based neurons with subthreshold nonlinearity and the slower timescales of biophysical synapses, may not fit into this framework. Here, we show how the spike-based approach can be extended to biophysically plausible networks. We then show that our network reproduces a number of key features of cortical networks including irregular and Poisson-like spike times and a tight balance between excitation and inhibition. Lastly, we discuss how the behavior of our model scales with network size or with the number of neurons "recorded" from a larger computing network. These results significantly increase the biological plausibility of the spike-based approach to network computation. We derive a network of neurons with standard spike-generating currents and synapses with realistic timescales that computes based upon the principle that the precise timing of each spike is important for the computation. We then show that our network reproduces a number of key features of cortical networks
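The core principle, that each neuron fires only when doing so reduces the error in the network's output, can be illustrated in a scalar toy version with one neuron, instantaneous synapses, and a leaky readout. This is far simpler than the paper's conductance-based networks, and every parameter below is illustrative:

```python
import math

def greedy_spike_code(signal, dt=0.001, tau=0.02, kernel=1.0):
    """Track a target signal with spikes: the readout decays between
    spikes, and a spike (adding `kernel` to the readout) is emitted
    only when it brings the readout closer to the target."""
    readout, spikes, trace = 0.0, [], []
    for i, x in enumerate(signal):
        readout *= math.exp(-dt / tau)        # leaky decay
        # greedy error-reduction rule: spike iff it shrinks the error
        if abs(x - (readout + kernel)) < abs(x - readout):
            readout += kernel
            spikes.append(i * dt)
        trace.append(readout)
    return spikes, trace
```

For a constant target, the readout settles into a sawtooth around the signal: the time-averaged output tracks the target, with precisely placed rather than random spikes.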

  3. A novel automated spike sorting algorithm with adaptable feature extraction.

    Science.gov (United States)

    Bestel, Robert; Daus, Andreas W; Thielemann, Christiane

    2012-10-15

To study the electrophysiological properties of neuronal networks, in vitro studies based on microelectrode arrays have become a viable tool for analysis. Although the field is in constant progress, a challenging task remains: the development of an efficient spike sorting algorithm that allows accurate signal analysis at the single-cell level. Most sorting algorithms currently available extract only a specific feature type, such as the principal components or wavelet coefficients of the measured spike signals, in order to separate different spike shapes generated by different neurons. However, due to the great variety in the obtained spike shapes, the derivation of an optimal feature set is still a very complex issue that current algorithms struggle with. To address this problem, we propose a novel algorithm that (i) extracts a variety of geometric, wavelet and principal-component-based features and (ii) automatically derives the feature subset most suitable for sorting an individual set of spike signals. The approach evaluates the probability distribution of the obtained spike features and from it determines the candidates most suitable for the actual spike sorting. These candidates can be formed into an individually adjusted set of spike features, allowing a separation of the various shapes present in the obtained neuronal signal by a subsequent expectation maximisation clustering algorithm. Test results with simulated data files and data obtained from chick embryonic neurons cultured on microelectrode arrays showed an excellent classification result, indicating the superior performance of the described algorithm. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. Nonlinear evolution of single spike in Richtmyer-Meshkov instability

    International Nuclear Information System (INIS)

    Fukuda, Y.; Nishihara, K.; Wouchuk, J.G.

    2000-01-01

Nonlinear evolution of the single-spike structure and vortex in the Richtmyer-Meshkov instability is investigated with a two-dimensional hydrodynamic code. It is shown that a singularity appears in the vorticity left by the transmitted and reflected shocks at a corrugated interface. This singularity produces vorticity of opposite sign along the interface, which causes the double-spiral structure of the spike. (authors)

  5. Monte Carlo point process estimation of electromyographic envelopes from motor cortical spikes for brain-machine interfaces

    Science.gov (United States)

    Liao, Yuxi; She, Xiwei; Wang, Yiwen; Zhang, Shaomin; Zhang, Qiaosheng; Zheng, Xiaoxiang; Principe, Jose C.

    2015-12-01

Objective. Representation of movement in the motor cortex (M1) has been widely studied in brain-machine interfaces (BMIs). The electromyogram (EMG) has greater bandwidth than the conventional kinematic variables (such as position and velocity), and is functionally related to the discharge of cortical neurons. As the stochastic information of EMG is derived from the explicit spike time structure, point process (PP) methods will be a good solution for decoding EMG directly from neural spike trains. Previous studies usually assume linear or exponential tuning curves between neural firing and EMG, which may not be true. Approach. In our analysis, we estimate the tuning curves in a data-driven way and find both the traditional functional-excitatory and functional-inhibitory neurons, which are widely found across a rat’s motor cortex. To accurately decode EMG envelopes from M1 neural spike trains, the Monte Carlo point process (MCPP) method is implemented based on such nonlinear tuning properties. Main results. Better reconstruction of EMG signals is shown on baseline and extreme high peaks, as our method can better preserve the nonlinearity of the neural tuning during decoding. The MCPP improves the prediction accuracy (the normalized mean squared error, NMSE) by 57% and 66% on average compared with the adaptive point process filter using linear and exponential tuning curves respectively, for all 112 data segments across six rats. Compared to a Wiener filter using spike rates with an optimal window size of 50 ms, MCPP decoding EMG from a point process improves the NMSE by 59% on average. Significance. These results suggest that neural tuning is constantly changing during task execution and therefore the use of spike-timing methodologies and estimation of appropriate tuning curves needs to be undertaken for better EMG decoding in motor BMIs.
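Estimating a tuning curve "in a data-driven way", rather than assuming a linear or exponential form, can be as simple as conditioning spike counts on binned EMG amplitude; a rough, non-parametric sketch (bin count and variable names are illustrative):

```python
def tuning_curve(emg, spikes, n_bins=5):
    """Mean spike count per time bin, conditioned on binned EMG
    amplitude; makes no parametric assumption about the tuning."""
    lo, hi = min(emg), max(emg)
    width = (hi - lo) / n_bins or 1.0
    totals = [[0.0, 0] for _ in range(n_bins)]
    for e, s in zip(emg, spikes):
        b = min(int((e - lo) / width), n_bins - 1)  # clamp top edge
        totals[b][0] += s
        totals[b][1] += 1
    return [c / n if n else 0.0 for c, n in totals]
```

A functional-inhibitory neuron in this picture shows a monotonically decreasing curve; a functional-excitatory one, an increasing curve.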

  6. GABAergic activities control spike timing- and frequency-dependent long-term depression at hippocampal excitatory synapses

    Directory of Open Access Journals (Sweden)

    Makoto Nishiyama

    2010-06-01

Full Text Available GABAergic interneuronal network activities in the hippocampus control a variety of neural functions, including learning and memory, by regulating θ and γ oscillations. How these GABAergic activities at pre- and post-synaptic sites of hippocampal CA1 pyramidal cells differentially contribute to synaptic function and plasticity during their repetitive pre- and post-synaptic spiking at θ and γ oscillations is largely unknown. We show here that activities mediated by postsynaptic GABAARs and presynaptic GABABRs determine, respectively, the spike timing- and frequency-dependence of activity-induced synaptic modifications at Schaffer collateral-CA1 excitatory synapses. We demonstrate that both feedforward and feedback GABAAR-mediated inhibition in the postsynaptic cell controls the spike timing-dependent long-term depression of excitatory inputs (“e-LTD”) at the θ frequency. We also show that feedback postsynaptic inhibition specifically causes e-LTD of inputs that induce small postsynaptic currents (<70 pA) with LTP timing, thus enforcing the requirement of cooperativity for induction of long-term potentiation at excitatory inputs (“e-LTP”). Furthermore, under spike-timing protocols that induce e-LTP and e-LTD at excitatory synapses, we observed parallel induction of LTP and LTD at inhibitory inputs (“i-LTP” and “i-LTD”) to the same postsynaptic cells. Finally, we show that presynaptic GABABR-mediated inhibition plays a major role in the induction of frequency-dependent e-LTD at α and β frequencies. These observations demonstrate the critical influence of GABAergic interneuronal network activities in regulating the spike timing and frequency dependences of long-term synaptic modifications in the hippocampus.

  7. Orthobunyavirus ultrastructure and the curious tripodal glycoprotein spike.

    Directory of Open Access Journals (Sweden)

    Thomas A Bowden

    Full Text Available The genus Orthobunyavirus within the family Bunyaviridae constitutes an expanding group of emerging viruses, which threaten human and animal health. Despite the medical importance, little is known about orthobunyavirus structure, a prerequisite for understanding virus assembly and entry. Here, using electron cryo-tomography, we report the ultrastructure of Bunyamwera virus, the prototypic member of this genus. Whilst Bunyamwera virions are pleomorphic in shape, they display a locally ordered lattice of glycoprotein spikes. Each spike protrudes 18 nm from the viral membrane and becomes disordered upon introduction to an acidic environment. Using sub-tomogram averaging, we derived a three-dimensional model of the trimeric pre-fusion glycoprotein spike to 3-nm resolution. The glycoprotein spike consists mainly of the putative class-II fusion glycoprotein and exhibits a unique tripod-like arrangement. Protein-protein contacts between neighbouring spikes occur at membrane-proximal regions and intra-spike contacts at membrane-distal regions. This trimeric assembly deviates from previously observed fusion glycoprotein arrangements, suggesting a greater than anticipated repertoire of viral fusion glycoprotein oligomerization. Our study provides evidence of a pH-dependent conformational change that occurs during orthobunyaviral entry into host cells and a blueprint for the structure of this group of emerging pathogens.

  8. The Omega-Infinity Limit of Single Spikes

    CERN Document Server

    Axenides, Minos; Linardopoulos, Georgios

A new infinite-size limit of strings in RxS2 is presented. The limit is obtained from single spike strings by letting their angular velocity omega become infinite. We derive the energy-momenta relation of omega-infinity single spikes as their linear velocity v → 1 and their angular momentum J → 1. Generally, the v → 1, J → 1 limit of single spikes is singular and has to be excluded from the spectrum and be studied separately. We discover that the dispersion relation of omega-infinity single spikes contains logarithms in the limit J → 1. This result is somewhat surprising, since the logarithmic behavior in the string spectra is typically associated with their motion in non-compact spaces such as AdS. Omega-infinity single spikes seem to completely cover the surface of the 2-sphere they occupy, so that they may essentially be viewed as some sort of "brany strings". A proof of the sphere-filling property of omega-infinity single spikes is given in the appendix.

  9. Absolute Ca Isotopic Measurement Using an Improved Double Spike Technique

    Directory of Open Access Journals (Sweden)

    Jason Jiun-San Shen

    2009-01-01

Full Text Available A new vector analytical method has been developed in order to obtain the true isotopic composition of the 42Ca-48Ca double spike. This is achieved by using two different sample-spike mixtures combined with the double spike and natural Ca data. Because the natural sample, the two mixtures, and the spike should all lie on a single mixing line, we are able to constrain the true isotopic composition of our double spike using this new approach. Once the isotopic composition of the Ca double spike is established, we are able to obtain the true Ca isotopic composition of the NIST Ca standard SRM915a: 40Ca/44Ca = 46.537 ± 2 (2sm, n = 55), 42Ca/44Ca = 0.31031 ± 1, 43Ca/44Ca = 0.06474 ± 1, and 48Ca/44Ca = 0.08956 ± 1. Despite an offset of 1.3% in 40Ca/44Ca between our result and the previously reported value (Russell et al. 1978), our data indicate an offset of 1.89‰ in 40Ca/44Ca between SRM915a and seawater, entirely consistent with the published results.

  10. Massively parallel multicanonical simulations

    Science.gov (United States)

    Gross, Jonathan; Zierenberg, Johannes; Weigel, Martin; Janke, Wolfhard

    2018-03-01

Generalized-ensemble Monte Carlo simulations such as the multicanonical method and similar techniques are among the most efficient approaches for simulations of systems undergoing discontinuous phase transitions or with rugged free-energy landscapes. As Markov chain methods, they are inherently serial computationally. It was demonstrated recently, however, that a combination of independent simulations that communicate weight updates at variable intervals allows for the efficient utilization of parallel computational resources for multicanonical simulations. Implementing this approach for the many-thread architecture provided by current generations of graphics processing units (GPUs), we show how it can be efficiently employed with of the order of 10^4 parallel walkers and beyond, thus constituting a versatile tool for Monte Carlo simulations in the era of massively parallel computing. We provide the fully documented source code for the approach applied to the paradigmatic example of the two-dimensional Ising model as starting point and reference for practitioners in the field.
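The shared-weight idea, independent walkers sampling with a common weight table that is periodically updated from the pooled histogram, can be sketched on a toy one-dimensional "energy ladder". This sketch uses a Wang-Landau-style shrinking modification factor rather than the paper's multicanonical weight recursion, and a serial loop standing in for the GPU walkers; everything here is illustrative:

```python
import math
import random

def parallel_flat_histogram(n_walkers=8, n_states=8, beta=1.0, seed=3):
    """Walkers on states s = 0..n_states-1 with energy E(s) = s share
    one log-weight table lng; weights are adapted until the pooled
    histogram is flat, so that lng[s] ~ -beta*s + const."""
    rng = random.Random(seed)
    lng = [0.0] * n_states            # shared log-weight estimate
    lnf = 1.0                         # shrinking modification factor
    states = [0] * n_walkers
    blocks = 0
    while lnf > 1e-3 and blocks < 200:
        blocks += 1
        hist = [0] * n_states         # pooled histogram for this block
        for _ in range(400):
            for w in range(n_walkers):
                s = states[w]
                s2 = s + rng.choice((-1, 1))
                if not 0 <= s2 < n_states:
                    s2 = s            # reject moves off the ladder
                # accept with prob min(1, exp(dlog)): target is flat in E
                dlog = (-beta * s2 - lng[s2]) - (-beta * s - lng[s])
                if math.log(max(rng.random(), 1e-300)) < dlog:
                    states[w] = s = s2
                lng[s] += lnf         # penalize the visited state
                hist[s] += 1
        if min(hist) > 0.8 * sum(hist) / n_states:
            lnf /= 2.0                # histogram flat: refine weights
    return lng
```

At convergence the weight differences recover the Boltzmann factor across the ladder, the quantity a multicanonical run needs to cross energy barriers.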

  11. Reinforcement learning of targeted movement in a spiking neuronal model of motor cortex.

    Directory of Open Access Journals (Sweden)

    George L Chadderdon

Full Text Available Sensorimotor control has traditionally been considered from a control theory perspective, without relation to neurobiology. In contrast, here we utilized a spiking-neuron model of motor cortex and trained it to perform a simple movement task, which consisted of rotating a single-joint "forearm" to a target. Learning was based on a reinforcement mechanism analogous to that of the dopamine system. This provided a global reward or punishment signal in response to decreasing or increasing distance from hand to target, respectively. Output was partially driven by Poisson motor babbling, creating stochastic movements that could then be shaped by learning. The virtual forearm consisted of a single segment rotated around an elbow joint, controlled by flexor and extensor muscles. The model consisted of 144 excitatory and 64 inhibitory event-based neurons, each with AMPA, NMDA, and GABA synapses. Proprioceptive cell input to this model encoded the 2 muscle lengths. Plasticity was only enabled in feedforward connections between input and output excitatory units, using spike-timing-dependent eligibility traces for synaptic credit or blame assignment. Learning resulted from a global 3-valued signal: reward (+1), no learning (0), or punishment (-1), corresponding to phasic increases, lack of change, or phasic decreases of dopaminergic cell firing, respectively. Successful learning only occurred when both reward and punishment were enabled. In this case, 5 target angles were learned successfully within 180 s of simulation time, with a median error of 8 degrees. Motor babbling allowed exploratory learning, but decreased the stability of the learned behavior, since the hand continued moving after reaching the target. Our model demonstrated that a global reinforcement signal, coupled with eligibility traces for synaptic plasticity, can train a spiking sensorimotor network to perform goal-directed motor behavior.

  12. Reinforcement learning of targeted movement in a spiking neuronal model of motor cortex.

    Science.gov (United States)

    Chadderdon, George L; Neymotin, Samuel A; Kerr, Cliff C; Lytton, William W

    2012-01-01

    Sensorimotor control has traditionally been considered from a control theory perspective, without relation to neurobiology. In contrast, here we utilized a spiking-neuron model of motor cortex and trained it to perform a simple movement task, which consisted of rotating a single-joint "forearm" to a target. Learning was based on a reinforcement mechanism analogous to that of the dopamine system. This provided a global reward or punishment signal in response to decreasing or increasing distance from hand to target, respectively. Output was partially driven by Poisson motor babbling, creating stochastic movements that could then be shaped by learning. The virtual forearm consisted of a single segment rotated around an elbow joint, controlled by flexor and extensor muscles. The model consisted of 144 excitatory and 64 inhibitory event-based neurons, each with AMPA, NMDA, and GABA synapses. Proprioceptive cell input to this model encoded the 2 muscle lengths. Plasticity was only enabled in feedforward connections between input and output excitatory units, using spike-timing-dependent eligibility traces for synaptic credit or blame assignment. Learning resulted from a global 3-valued signal: reward (+1), no learning (0), or punishment (-1), corresponding to phasic increases, lack of change, or phasic decreases of dopaminergic cell firing, respectively. Successful learning only occurred when both reward and punishment were enabled. In this case, 5 target angles were learned successfully within 180 s of simulation time, with a median error of 8 degrees. Motor babbling allowed exploratory learning, but decreased the stability of the learned behavior, since the hand continued moving after reaching the target. Our model demonstrated that a global reinforcement signal, coupled with eligibility traces for synaptic plasticity, can train a spiking sensorimotor network to perform goal-directed motor behavior.
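The reward-gated plasticity scheme, pre/post coincidences mark per-synapse eligibility traces and a global scalar later converts them into weight changes, can be sketched per time step. This is a simplified stand-in for the paper's spike-timing-dependent traces; the function name and constants are illustrative:

```python
def reward_modulated_step(weights, traces, pre, post, reward,
                          lr=0.05, decay=0.9):
    """One time step: a pre/post coincidence increments that synapse's
    eligibility trace; the global reward signal (+1, 0 or -1) turns
    the accumulated traces into weight changes."""
    for i in range(len(weights)):
        traces[i] = decay * traces[i] + (1.0 if pre[i] and post[i] else 0.0)
        weights[i] += lr * reward * traces[i]
    return weights, traces
```

Because the trace decays, a delayed reward still credits the synapses active shortly before it, which is the credit-assignment role the abstract describes.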

  13. Different propagation speeds of recalled sequences in plastic spiking neural networks

    Science.gov (United States)

    Huang, Xuhui; Zheng, Zhigang; Hu, Gang; Wu, Si; Rasch, Malte J.

    2015-03-01

Neural networks can generate spatiotemporal patterns of spike activity. Sequential activity learning and retrieval have been observed in many brain areas, and are crucial, for example, for coding episodic memory in the hippocampus and for generating temporal patterns during song production in birds. In a recent study, a sequential activity pattern was directly entrained onto the neural activity of the primary visual cortex (V1) of rats and subsequently successfully recalled by a local and transient trigger. It was observed that the speed of activity propagation in coordinates of the retinotopically organized neural tissue was constant during retrieval regardless of how the speed of light stimulation sweeping across the visual field during training was varied. It is well known that spike-timing-dependent plasticity (STDP) is a potential mechanism for embedding temporal sequences into neural network activity. How training and retrieval speeds relate to each other and how network and learning parameters influence retrieval speeds, however, is not well described. We here theoretically analyze sequential activity learning and retrieval in a recurrent neural network with realistic synaptic short-term dynamics and STDP. Testing multiple STDP rules, we confirm that sequence learning can be achieved by STDP. However, we found that a multiplicative nearest-neighbor (NN) weight update rule generated weight distributions and recall activities that best matched the experiments in V1. Using network simulations and mean-field analysis, we further investigated the learning mechanisms and the influence of network parameters on recall speeds. Our analysis suggests that a multiplicative STDP rule with dominant NN spike interaction might be implemented in V1 since recall speed was almost constant in an NMDA-dominant regime. Interestingly, in an AMPA-dominant regime, neural circuits might exhibit recall speeds that instead follow the change in stimulus speeds. This prediction could be tested in
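A multiplicative nearest-neighbour STDP update can be written for a single pre/post spike pair; this is a generic form of the rule class the analysis favours, with constants that are illustrative rather than fitted to the V1 data:

```python
import math

def nn_stdp_mult(w, dt, a_plus=0.05, a_minus=0.055, tau=0.02, w_max=1.0):
    """Multiplicative nearest-neighbour STDP for one spike pair with
    lag dt = t_post - t_pre: potentiation scales with the remaining
    headroom (w_max - w), depression with the current weight w."""
    if dt > 0:      # pre before post: potentiate
        return w + a_plus * (w_max - w) * math.exp(-dt / tau)
    return w - a_minus * w * math.exp(dt / tau)
```

The multiplicative scaling keeps weights bounded in [0, w_max] without hard clipping, which is one reason such rules produce the unimodal weight distributions mentioned in the abstract.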

  14. Successful reconstruction of a physiological circuit with known connectivity from spiking activity alone.

    Directory of Open Access Journals (Sweden)

    Felipe Gerhard

    Full Text Available Identifying the structure and dynamics of synaptic interactions between neurons is the first step to understanding neural network dynamics. The presence of synaptic connections is traditionally inferred through the use of targeted stimulation and paired recordings or by post-hoc histology. More recently, causal network inference algorithms have been proposed to deduce connectivity directly from electrophysiological signals, such as extracellularly recorded spiking activity. Usually, these algorithms have not been validated on a neurophysiological data set for which the actual circuitry is known. Recent work has shown that traditional network inference algorithms based on linear models typically fail to identify the correct coupling of a small central pattern generating circuit in the stomatogastric ganglion of the crab Cancer borealis. In this work, we show that point process models of observed spike trains can guide inference of relative connectivity estimates that match the known physiological connectivity of the central pattern generator up to a choice of threshold. We elucidate the necessary steps to derive faithful connectivity estimates from a model that incorporates the spike train nature of the data. We then apply the model to measure changes in the effective connectivity pattern in response to two pharmacological interventions, which affect both intrinsic neural dynamics and synaptic transmission. Our results provide the first successful application of a network inference algorithm to a circuit for which the actual physiological synapses between neurons are known. The point process methodology presented here generalizes well to larger networks and can describe the statistics of neural populations. In general we show that advanced statistical models allow for the characterization of effective network structure, deciphering underlying network dynamics and estimating information-processing capabilities.
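
The point-process idea can be illustrated in miniature: bin the spike trains, then fit a Bernoulli (logistic) model in which one neuron's past spiking is a covariate for another's firing probability. This is a deliberately simplified stand-in for the authors' model, with invented simulation parameters:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate two binned spike trains in which neuron A excites neuron B
T = 5000
spikes_a = (rng.random(T) < 0.1).astype(float)
p_b = 0.02 + 0.3 * np.roll(spikes_a, 1)   # B fires more one bin after A
p_b[0] = 0.02
spikes_b = (rng.random(T) < p_b).astype(float)

# Design matrix: intercept + A's spiking one bin in the past
X = np.column_stack([np.ones(T - 1), spikes_a[:-1]])
y = spikes_b[1:]

# Fit a Bernoulli point-process GLM by gradient ascent on the log-likelihood
beta = np.zeros(2)
for _ in range(2000):
    p = 1.0 / (1.0 + np.exp(-X @ beta))
    beta += 0.5 * X.T @ (y - p) / len(y)

coupling = beta[1]   # positive -> putative excitatory influence of A on B
```

Thresholding such fitted coupling terms across all ordered pairs of neurons yields the relative connectivity estimates the abstract describes.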

  15. Stress-Induced Impairment of a Working Memory Task: Role of Spiking Rate and Spiking History Predicted Discharge

    Science.gov (United States)

    Devilbiss, David M.; Jenison, Rick L.; Berridge, Craig W.

    2012-01-01

    Stress, pervasive in society, contributes to over half of all workplace accidents a year and, over time, can contribute to a variety of psychiatric disorders including depression, schizophrenia, and post-traumatic stress disorder. Stress impairs higher cognitive processes that depend on the prefrontal cortex (PFC) and that involve maintenance and integration of information over extended periods, including working memory and attention. Substantial evidence has demonstrated a relationship between patterns of PFC neuron spiking activity (action-potential discharge) and components of delayed-response tasks used to probe PFC-dependent cognitive function in rats and monkeys. During delay periods of these tasks, persistent spiking activity is posited to be essential for the maintenance of information for working memory and attention. However, the degree to which stress-induced impairment in PFC-dependent cognition involves changes in task-related spiking rates or in the ability of PFC neurons to retain information over time remains unknown. In the current study, spiking activity was recorded from the medial PFC of rats performing a delayed-response task of working memory during acute noise stress (93 dB). Spike history-predicted discharge (SHPD) for PFC neurons was quantified as a measure of the degree to which ongoing neuronal discharge can be predicted by past spiking activity, and it reflects the degree to which past information is retained by these neurons over time. We found that PFC neuron discharge is predicted by their past spiking patterns for nearly one second. Acute stress impaired SHPD, selectively during delay intervals of the task, and simultaneously impaired task performance. Despite the reduction in delay-related SHPD, stress increased delay-related spiking rates. These findings suggest that neural codes utilizing SHPD within PFC networks likely reflect an additional important neurophysiological mechanism for maintenance of past information over time. Stress

  16. Spike Pattern Structure Influences Synaptic Efficacy Variability Under STDP and Synaptic Homeostasis. II: Spike Shuffling Methods on LIF Networks

    Directory of Open Access Journals (Sweden)

    Zedong Bi

    2016-08-01

    Full Text Available Synapses may undergo variable changes during plasticity because of the variability of spike patterns, such as temporal stochasticity and spatial randomness. Here, we call the variability of synaptic weight changes during plasticity the efficacy variability. In this paper, we investigate how four aspects of spike pattern statistics (i.e., synchronous firing, burstiness/regularity, heterogeneity of rates, and heterogeneity of cross-correlations) influence the efficacy variability under pair-wise additive spike-timing-dependent plasticity (STDP) and synaptic homeostasis (the mean strength of the plastic synapses into a neuron is bounded), by applying spike shuffling methods to spike patterns self-organized by a network of excitatory and inhibitory leaky integrate-and-fire (LIF) neurons. With the increase of the decay time scale of the inhibitory synaptic currents, the LIF network undergoes a transition from an asynchronous state to a weakly synchronous state and then to a synchronous bursting state. We first shuffle these spike patterns using a variety of methods, each designed to markedly change a specific pattern statistic, and then investigate the change of the efficacy variability of the synapses under STDP and synaptic homeostasis when the neurons in the network fire according to the spike patterns before and after being treated by a shuffling method. In this way, we can understand how the change of pattern statistics may cause the change of efficacy variability. Our results are consistent with those of our previous study, which implements spike-generating models on converging motifs. We also find that burstiness/regularity is important in determining the efficacy variability under asynchronous states, while heterogeneity of cross-correlations is the main factor causing efficacy variability when the network moves into synchronous bursting states (the states observed in epilepsy).
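
Two of the simplest shuffles of this general kind can be sketched as follows (illustrative stand-ins, not the paper's exact procedures): permuting inter-spike intervals perturbs temporal order and burstiness/regularity while preserving the rate and ISI distribution, and circularly shifting whole trains destroys cross-correlations between neurons while preserving each train's own statistics:

```python
import numpy as np

rng = np.random.default_rng(1)

def shuffle_isis(spike_times):
    """Permute the inter-spike intervals of one train: preserves the
    rate and ISI distribution, destroys the temporal ordering."""
    isis = np.diff(spike_times, prepend=0.0)
    return np.cumsum(rng.permutation(isis))

def circular_shift(spike_times, duration):
    """Rotate one train by a random offset on a circular time axis:
    preserves single-train statistics, destroys cross-correlations."""
    shift = rng.uniform(0, duration)
    return np.sort((spike_times + shift) % duration)

train = np.sort(rng.uniform(0, 10.0, size=50))   # one toy spike train (s)
isi_shuffled = shuffle_isis(train)
shifted = circular_shift(train, 10.0)
```

Comparing a plasticity outcome on the original versus shuffled patterns then isolates the contribution of the statistic that the shuffle removed.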

  17. Does arousal interfere with operant conditioning of spike-wave discharges in genetic epileptic rats?

    Science.gov (United States)

    Osterhagen, Lasse; Breteler, Marinus; van Luijtelaar, Gilles

    2010-06-01

    One of the ways in which brain-computer interfaces can be used is neurofeedback (NF). Subjects use their brain activation to control an external device, and with this technique it is also possible to learn to control aspects of brain activity by operant conditioning. Beneficial effects of NF training on seizure occurrence have been described in epileptic patients. Little research has been done on differentiating NF effectiveness by type of epilepsy, particularly on whether idiopathic generalized seizures are susceptible to NF. In this experiment, seizures that manifest themselves as spike-wave discharges (SWDs) in the EEG were reinforced during 10 sessions in 6 rats of the WAG/Rij strain, an animal model for absence epilepsy. EEGs were recorded before and after the training sessions. Reinforcing SWDs led to decreased SWD occurrence during training; however, the changes during training were not persistent in the post-training sessions. Because behavioural states are known to influence the occurrence of SWDs, it is proposed that the reinforcement situation increased arousal, which resulted in fewer SWDs. Additional tests supported this hypothesis. The outcomes have implications for the possibility of training SWDs with operant learning techniques. Copyright (c) 2010 Elsevier B.V. All rights reserved.

  18. SPINning parallel systems software

    International Nuclear Information System (INIS)

    Matlin, O.S.; Lusk, E.; McCune, W.

    2002-01-01

    We describe our experiences in using Spin to verify parts of the Multi Purpose Daemon (MPD) parallel process management system. MPD is a distributed collection of processes connected by Unix network sockets. MPD is dynamic: processes and connections among them are created and destroyed as MPD is initialized, runs user processes, recovers from faults, and terminates. This dynamic nature is easily expressible in the Spin/Promela framework but poses performance and scalability challenges. We present here the results of expressing some of the parallel algorithms of MPD and executing both simulation and verification runs with Spin.

  19. Parallel programming with Python

    CERN Document Server

    Palach, Jan

    2014-01-01

    A fast, easy-to-follow and clear tutorial to help you develop parallel computing systems using Python. Along with explaining the fundamentals, the book will also introduce you to slightly advanced concepts and will help you in implementing these techniques in the real world. If you are an experienced Python programmer and are willing to utilize the available computing resources by parallelizing applications in a simple way, then this book is for you. You are required to have a basic knowledge of Python development to get the most out of this book.

  20. Scaling up spike-and-slab models for unsupervised feature learning.

    Science.gov (United States)

    Goodfellow, Ian J; Courville, Aaron; Bengio, Yoshua

    2013-08-01

    We describe the use of two spike-and-slab models for modeling real-valued data, with an emphasis on their applications to object recognition. The first model, which we call spike-and-slab sparse coding (S3C), is a preexisting model for which we introduce a faster approximate inference algorithm. We introduce a deep variant of S3C, which we call the partially directed deep Boltzmann machine (PD-DBM), and extend our S3C inference algorithm for use on this model. We describe learning procedures for each. We demonstrate that our inference procedure for S3C enables scaling the model to unprecedentedly large problem sizes, and that using S3C as a feature extractor results in very good object recognition performance, particularly when the number of labeled examples is low. We show that the PD-DBM generates better samples than its shallow counterpart, and that, unlike DBMs or DBNs, the PD-DBM may be trained successfully without greedy layerwise training.
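
The generative idea behind spike-and-slab models can be sketched as a factorial prior in which a binary "spike" gates a Gaussian "slab", so most latent coefficients are exactly zero. The hyperparameters below are illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)

def spike_and_slab_sample(n_features, p_active=0.1, slab_std=1.0):
    """Draw one latent code from a factorial spike-and-slab prior:
    h_i ~ Bernoulli(p_active), s_i ~ N(0, slab_std^2), x_i = h_i * s_i."""
    spikes = rng.random(n_features) < p_active        # binary gates
    slabs = rng.normal(0.0, slab_std, n_features)     # real-valued magnitudes
    return spikes * slabs

code = spike_and_slab_sample(1000, p_active=0.1)
sparsity = np.mean(code == 0.0)   # ~0.9 of coefficients exactly zero
```

It is this mixture of a point mass at zero with a continuous slab that gives S3C its sparse, real-valued feature codes.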

  1. On the sample complexity of learning for networks of spiking neurons with nonlinear synaptic interactions.

    Science.gov (United States)

    Schmitt, Michael

    2004-09-01

    We study networks of spiking neurons that use the timing of pulses to encode information. Nonlinear interactions model the spatial groupings of synapses on the neural dendrites and describe the computations performed at local branches. Within a theoretical framework of learning we analyze the question of how many training examples these networks must receive to be able to generalize well. Bounds for this sample complexity of learning can be obtained in terms of a combinatorial parameter known as the pseudodimension. This dimension characterizes the computational richness of a neural network and is given in terms of the number of network parameters. Two types of feedforward architectures are considered: constant-depth networks and networks of unconstrained depth. We derive asymptotically tight bounds for each of these network types. Constant-depth networks are shown to have an almost linear pseudodimension, whereas the pseudodimension of general networks is quadratic. Networks of spiking neurons that use temporal coding are becoming increasingly important in practical tasks such as computer vision, speech recognition, and motor control. The question of how well these networks generalize from a given set of training examples is a central issue for their successful application as adaptive systems. The results show that, although coding and computation in these networks are quite different and in many cases more powerful, their generalization capabilities are at least as good as those of traditional neural network models.
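
Schematically, with W denoting the number of network parameters, the two asymptotically tight bounds described in the abstract can be written as follows (the Θ(W log W) reading of "almost linear" is an interpretation, not a quotation from the paper):

```latex
\operatorname{Pdim}\bigl(\mathcal{N}_{\text{const-depth}}\bigr) = \Theta(W \log W),
\qquad
\operatorname{Pdim}\bigl(\mathcal{N}_{\text{general}}\bigr) = \Theta(W^{2})
```

Since sample-complexity bounds grow with the pseudodimension, constant-depth spiking architectures need markedly fewer training examples than unconstrained-depth ones as W grows.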

  2. SPICODYN: A Toolbox for the Analysis of Neuronal Network Dynamics and Connectivity from Multi-Site Spike Signal Recordings.

    Science.gov (United States)

    Pastore, Vito Paolo; Godjoski, Aleksandar; Martinoia, Sergio; Massobrio, Paolo

    2018-01-01

    We implemented an automated and efficient open-source software package for the analysis of multi-site neuronal spike signals. The package, named SPICODYN, has been developed as a standalone Windows GUI application, using the C# programming language with Microsoft Visual Studio based on the .NET Framework 4.5 development environment. Accepted input data formats are HDF5, level 5 MAT, and text files containing recorded or generated time-series spike-signal data. SPICODYN processes such electrophysiological signals focusing on spiking and bursting dynamics and functional-effective connectivity analysis. In particular, for inferring network connectivity, a new implementation of the transfer entropy method is presented that deals with multiple time delays (temporal extension) and with multiple binary patterns (high-order extension). SPICODYN is specifically tailored to process data coming from different Multi-Electrode Array setups, guaranteeing, in those specific cases, automated processing. The optimized implementation of the Delayed Transfer Entropy and High-Order Transfer Entropy algorithms allows performing accurate and rapid analysis on multiple spike trains from thousands of electrodes.
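
A plug-in estimate of delayed transfer entropy on binary spike trains, with one bin of target history and a scanned delay on the source, can be sketched as follows. This is a textbook-style illustration of the quantity, not SPICODYN's optimized implementation (which also adds the high-order pattern extension):

```python
import numpy as np

def delayed_transfer_entropy(x, y, delay=1):
    """Plug-in TE(X -> Y) in bits for binary trains, with one bin of
    Y-history and candidate delay d on X:
    TE = sum p(y_t, y_{t-1}, x_{t-d}) *
         log2[ p(y_t | y_{t-1}, x_{t-d}) / p(y_t | y_{t-1}) ]."""
    t0 = max(delay, 1)
    yt, yp = y[t0:], y[t0 - 1:len(y) - 1]
    xd = x[t0 - delay:len(x) - delay]
    te = 0.0
    for a in (0, 1):            # y_t
        for b in (0, 1):        # y_{t-1}
            for c in (0, 1):    # x_{t-d}
                p_abc = np.mean((yt == a) & (yp == b) & (xd == c))
                if p_abc == 0.0:
                    continue    # zero-probability state contributes nothing
                p_bc = np.mean((yp == b) & (xd == c))
                p_ab = np.mean((yt == a) & (yp == b))
                p_b = np.mean(yp == b)
                te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
    return te

# Toy data: y copies x two bins later, plus a little noise
rng = np.random.default_rng(3)
x = (rng.random(3000) < 0.3).astype(int)
y = np.roll(x, 2) | (rng.random(3000) < 0.05).astype(int)
y[:2] = 0

te_d2 = delayed_transfer_entropy(x, y, delay=2)   # true delay
te_d1 = delayed_transfer_entropy(x, y, delay=1)   # wrong delay
```

Scanning the delay and keeping the maximum TE per directed pair is the essence of the temporal extension described above.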

  3. Cellular and circuit mechanisms maintain low spike co-variability and enhance population coding in somatosensory cortex

    Directory of Open Access Journals (Sweden)

    Cheng eLy

    2012-03-01

    Full Text Available The responses of cortical neurons are highly variable across repeated presentations of a stimulus. Understanding this variability is critical for theories of both sensory and motor processing, since response variance affects the accuracy of neural codes. Despite this influence, the cellular and circuit mechanisms that shape the trial-to-trial variability of population responses remain poorly understood. We used a combination of experimental and computational techniques to uncover the mechanisms underlying the response variability of populations of pyramidal (E) cells in layer 2/3 of rat whisker barrel cortex. Spike trains recorded from pairs of E-cells during either spontaneous activity or whisker-deflection responses show similarly low levels of spiking co-variability, despite large differences in network activation between the two states. We developed network models that show how spike-threshold nonlinearities dilute E-cell spiking co-variability during spontaneous activity and low-velocity whisker deflections. In contrast, during high-velocity whisker deflections, cancellation mechanisms mediated by feedforward inhibition maintain low E-cell pairwise co-variability. Thus, the combination of these two mechanisms ensures low E-cell population variability over a wide range of whisker deflection velocities. Finally, we show how this active decorrelation of population variability leads to a drastic increase in the population information about whisker velocity. The canonical cellular and circuit components of our study suggest that low network variability over a broad range of neural states may generalize across the nervous system.

  4. Unsupervised clustering with spiking neurons by sparse temporal coding and multi-layer RBF networks

    NARCIS (Netherlands)

    S.M. Bohte (Sander); J.A. La Poutré (Han); J.N. Kok (Joost)

    2000-01-01

    We demonstrate that spiking neural networks encoding information in spike times are capable of computing and learning clusters from realistic data. We show how a spiking neural network based on spike-time coding and Hebbian learning can successfully perform unsupervised clustering on

  5. NeuroFlow: A General Purpose Spiking Neural Network Simulation Platform using Customizable Processors.

    Science.gov (United States)

    Cheung, Kit; Schultz, Simon R; Luk, Wayne

    2015-01-01

    NeuroFlow is a scalable spiking neural network simulation platform for off-the-shelf high-performance computing systems using customizable hardware processors such as Field-Programmable Gate Arrays (FPGAs). Unlike multi-core processors and application-specific integrated circuits, the processor architecture of NeuroFlow can be redesigned and reconfigured to suit a particular simulation and to deliver optimized performance, for example in the degree of parallelism to employ. The compilation process supports using PyNN, a simulator-independent neural network description language, to configure the processor. NeuroFlow supports a number of commonly used current- or conductance-based neuronal models such as the integrate-and-fire and Izhikevich models, and the spike-timing-dependent plasticity (STDP) rule for learning. A 6-FPGA system can simulate a network of up to ~600,000 neurons and can achieve real-time performance for 400,000 neurons. Using one FPGA, NeuroFlow delivers a speedup of up to 33.6 times the speed of an 8-core processor, or 2.83 times the speed of GPU-based platforms. With high flexibility and throughput, NeuroFlow provides a viable environment for large-scale neural network simulation.
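
The integrate-and-fire model mentioned above is simple enough to sketch in software; the leaky variant below uses forward-Euler integration with invented but physiologically plausible parameter values:

```python
import numpy as np

def simulate_lif(i_input, dt=0.1, tau_m=20.0, v_rest=-70.0,
                 v_thresh=-50.0, v_reset=-70.0, r_m=10.0):
    """Leaky integrate-and-fire neuron (times in ms, voltages in mV):
    tau_m * dV/dt = -(V - v_rest) + r_m * I(t);
    on crossing v_thresh, record a spike and reset V to v_reset."""
    v = v_rest
    spike_times = []
    for step, i_t in enumerate(i_input):
        v += dt * (-(v - v_rest) + r_m * i_t) / tau_m   # Euler step
        if v >= v_thresh:
            spike_times.append(step * dt)
            v = v_reset
    return spike_times

# Constant drive for 1 s: 2.5 units is suprathreshold (steady state -45 mV),
# 1.5 units is subthreshold (steady state -55 mV, below the -50 mV threshold)
spikes_supra = simulate_lif(np.full(10000, 2.5))
spikes_sub = simulate_lif(np.full(10000, 1.5))
```

Hardware platforms like the one described implement exactly this kind of per-step update, replicated across many physical processing pipelines.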

  6. An Investigation into Spike-Based Neuromorphic Approaches for Artificial Olfactory Systems

    Directory of Open Access Journals (Sweden)

    Anup Vanarse

    2017-11-01

    Full Text Available The implementation of neuromorphic methods has delivered promising results for vision and auditory sensors. These methods focus on mimicking the neuro-biological architecture to generate and process spike-based information with minimal power consumption. With increasing interest in developing low-power and robust chemical sensors, the application of neuromorphic engineering concepts for electronic noses has provided an impetus for research focusing on improving these instruments. While conventional e-noses apply computationally expensive and power-consuming data-processing strategies, neuromorphic olfactory sensors implement the biological olfaction principles found in humans and insects to simplify the handling of multivariate sensory data by generating and processing spike-based information. Over the last decade, research on neuromorphic olfaction has established the capability of these sensors to tackle problems that plague the current e-nose implementations such as drift, response time, portability, power consumption and size. This article brings together the key contributions in neuromorphic olfaction and identifies future research directions to develop near-real-time olfactory sensors that can be implemented for a range of applications such as biosecurity and environmental monitoring. Furthermore, we aim to expose the computational parallels between neuromorphic olfaction and gustation for future research focusing on the correlation of these senses.

  7. Controlling market power and price spikes in electricity networks: Demand-side bidding.

    Science.gov (United States)

    Rassenti, Stephen J; Smith, Vernon L; Wilson, Bart J

    2003-03-04

    In this article we report an experiment that examines how demand-side bidding can discipline generators in a market for electric power. First we develop a treatment without demand-side bidding; two large firms are allocated baseload and intermediate cost generators such that either firm might unilaterally withhold the capacity of its intermediate cost generators from the market to benefit from the supracompetitive prices that would result from only selling its baseload units. In a converse treatment, ownership of some of the intermediate cost generators is transferred from each of these firms to two other firms such that no one firm could unilaterally restrict output to spawn supracompetitive prices. Having established a well controlled data set with price spikes paralleling those observed in the naturally occurring economy, we also extend the design to include demand-side bidding. We find that demand-side bidding completely neutralizes the exercise of market power and eliminates price spikes even in the presence of structural market power.

  8. FPGA IMPLEMENTATION OF ADAPTIVE INTEGRATED SPIKING NEURAL NETWORK FOR EFFICIENT IMAGE RECOGNITION SYSTEM

    Directory of Open Access Journals (Sweden)

    T. Pasupathi

    2014-05-01

    Full Text Available Image recognition is a technology which can be used in various applications such as medical image recognition systems, security, defense video tracking, and factory automation. In this paper we present a novel pipelined architecture of an adaptive integrated artificial neural network for image recognition. In our proposed work we have combined the spiking-neuron concept with an ANN to achieve an efficient architecture for image recognition. The set of training images is trained by the ANN and the target output is identified. Real-time video is captured and then converted into frames for testing, and the images are recognized. The machine can operate at up to 40 frames/sec using images acquired from the camera. The system has been implemented on an XC3S400 SPARTAN-3 Field Programmable Gate Array.

  9. Dual roles for spike signaling in cortical neural populations

    Directory of Open Access Journals (Sweden)

    Dana eBallard

    2011-06-01

    Full Text Available A prominent feature of signaling in cortical neurons is that of randomness in the action potential. The output of a typical pyramidal cell can be well fit with a Poisson model, and variations in the Poisson rate repeatedly have been shown to be correlated with stimuli. However, while the rate provides a very useful characterization of neural spike data, it may not be the most fundamental description of the signaling code. Recent data showing γ-frequency-range multi-cell action potential correlations, together with spike-timing-dependent plasticity, are spurring a re-examination of the classical model, since precise timing codes imply that the generation of spikes is essentially deterministic. Could the observed Poisson randomness and timing determinism reflect two separate modes of communication, or do they somehow derive from a single process? We investigate in a timing-based model whether the apparent incompatibility between these probabilistic and deterministic observations may be resolved by examining how spikes could be used in the underlying neural circuits. The crucial component of this model draws on dual roles for spike signaling. In learning receptive fields from ensembles of inputs, spikes need to behave probabilistically, whereas for fast signaling of individual stimuli, the spikes need to behave deterministically. Our simulations show that this combination is possible if deterministic signals using γ latency coding are probabilistically routed through different members of a cortical cell population at different times. This model exhibits standard features characteristic of Poisson models, such as orientation tuning, post-stimulus histograms, and exponential interval histograms. In addition, it makes testable predictions that follow from the γ latency coding.
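
The Poisson signatures referred to above (exponential inter-spike intervals, coefficient of variation near 1) are easy to reproduce for a homogeneous Poisson spike train; the rate and duration below are arbitrary illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(4)

def poisson_spike_train(rate_hz, duration_s):
    """Homogeneous Poisson process: i.i.d. exponential inter-spike
    intervals with mean 1/rate; draw a surplus of ISIs, then truncate."""
    isis = rng.exponential(1.0 / rate_hz, size=int(rate_hz * duration_s * 2))
    times = np.cumsum(isis)
    return times[times < duration_s]

train = poisson_spike_train(20.0, 100.0)   # 20 Hz for 100 s -> ~2000 spikes
isis = np.diff(train)
cv = isis.std() / isis.mean()   # coefficient of variation; ~1 for Poisson
```

A deterministic latency code, by contrast, would show a CV well below 1; the model above must reproduce the Poisson value despite its underlying determinism.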

  10. Expressing Parallelism with ROOT

    Energy Technology Data Exchange (ETDEWEB)

    Piparo, D. [CERN; Tejedor, E. [CERN; Guiraud, E. [CERN; Ganis, G. [CERN; Mato, P. [CERN; Moneta, L. [CERN; Valls Pla, X. [CERN; Canal, P. [Fermilab

    2017-11-22

    The need for processing the ever-increasing amount of data generated by the LHC experiments in a more efficient way has motivated ROOT to further develop its support for parallelism. Such support is being tackled both for shared-memory and distributed-memory environments. The incarnations of the aforementioned parallelism are multi-threading, multi-processing and cluster-wide executions. In the area of multi-threading, we discuss the new implicit parallelism and related interfaces, as well as the new building blocks to safely operate with ROOT objects in a multi-threaded environment. Regarding multi-processing, we review the new MultiProc framework, comparing it with similar tools (e.g. multiprocessing module in Python). Finally, as an alternative to PROOF for cluster-wide executions, we introduce the efforts on integrating ROOT with state-of-the-art distributed data processing technologies like Spark, both in terms of programming model and runtime design (with EOS as one of the main components). For all the levels of parallelism, we discuss, based on real-life examples and measurements, how our proposals can increase the productivity of scientists.

  11. Expressing Parallelism with ROOT

    Science.gov (United States)

    Piparo, D.; Tejedor, E.; Guiraud, E.; Ganis, G.; Mato, P.; Moneta, L.; Valls Pla, X.; Canal, P.

    2017-10-01

    The need for processing the ever-increasing amount of data generated by the LHC experiments in a more efficient way has motivated ROOT to further develop its support for parallelism. Such support is being tackled both for shared-memory and distributed-memory environments. The incarnations of the aforementioned parallelism are multi-threading, multi-processing and cluster-wide executions. In the area of multi-threading, we discuss the new implicit parallelism and related interfaces, as well as the new building blocks to safely operate with ROOT objects in a multi-threaded environment. Regarding multi-processing, we review the new MultiProc framework, comparing it with similar tools (e.g. multiprocessing module in Python). Finally, as an alternative to PROOF for cluster-wide executions, we introduce the efforts on integrating ROOT with state-of-the-art distributed data processing technologies like Spark, both in terms of programming model and runtime design (with EOS as one of the main components). For all the levels of parallelism, we discuss, based on real-life examples and measurements, how our proposals can increase the productivity of scientists.

  12. Parallel Fast Legendre Transform

    NARCIS (Netherlands)

    Alves de Inda, M.; Bisseling, R.H.; Maslen, D.K.

    1998-01-01

    We discuss a parallel implementation of a fast algorithm for the discrete polynomial Legendre transform. We give an introduction to the Driscoll-Healy algorithm using polynomial arithmetic and present experimental results on the efficiency and accuracy of our implementation. The algorithms were

  13. Practical parallel programming

    CERN Document Server

    Bauer, Barr E

    2014-01-01

    This is the book that will teach programmers to write faster, more efficient code for parallel processors. The reader is introduced to a vast array of procedures and paradigms on which actual coding may be based. Examples and real-life simulations using these devices are presented in C and FORTRAN.

  14. Parallel hierarchical radiosity rendering

    Energy Technology Data Exchange (ETDEWEB)

    Carter, Michael [Iowa State Univ., Ames, IA (United States)

    1993-07-01

    In this dissertation, the step-by-step development of a scalable parallel hierarchical radiosity renderer is documented. First, a new look is taken at the traditional radiosity equation, and a new form is presented in which the matrix of linear system coefficients is transformed into a symmetric matrix, thereby simplifying the problem and enabling a new solution technique to be applied. Next, the state-of-the-art hierarchical radiosity methods are examined for their suitability to parallel implementation, and scalability. Significant enhancements are also discovered which both improve their theoretical foundations and improve the images they generate. The resultant hierarchical radiosity algorithm is then examined for sources of parallelism, and for an architectural mapping. Several architectural mappings are discussed. A few key algorithmic changes are suggested during the process of making the algorithm parallel. Next, the performance, efficiency, and scalability of the algorithm are analyzed. The dissertation closes with a discussion of several ideas which have the potential to further enhance the hierarchical radiosity method, or provide an entirely new forum for the application of hierarchical methods.

  15. Parallel universes beguile science

    CERN Multimedia

    2007-01-01

    A staple of mind-bending science fiction, the possibility of multiple universes has long intrigued hard-nosed physicists, mathematicians and cosmologists too. We may not be able -- at least not yet -- to prove they exist, many serious scientists say, but there are plenty of reasons to think that parallel dimensions are more than figments of eggheaded imagination.

  16. Parallel k-means++

    Energy Technology Data Exchange (ETDEWEB)

    2017-04-04

    A parallelization of the k-means++ seed selection algorithm on three distinct hardware platforms: GPU, multicore CPU, and multithreaded architecture. K-means++ was developed by David Arthur and Sergei Vassilvitskii in 2007 as an extension of the k-means data clustering technique. These algorithms allow people to cluster multidimensional data, by attempting to minimize the mean distance of data points within a cluster. K-means++ improved upon traditional k-means by using a more intelligent approach to selecting the initial seeds for the clustering process. While k-means++ has become a popular alternative to traditional k-means clustering, little work has been done to parallelize this technique. We have developed original C++ code for parallelizing the algorithm on three unique hardware architectures: GPU using NVidia's CUDA/Thrust framework, multicore CPU using OpenMP, and the Cray XMT multithreaded architecture. By parallelizing the process for these platforms, we are able to perform k-means++ clustering much more quickly than it could be done before.
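
The sequential seeding step that the record describes parallelizing can be sketched directly from Arthur and Vassilvitskii's description: the first seed is uniform at random, and each subsequent seed is drawn with probability proportional to its squared distance to the nearest seed chosen so far. The toy data below are an illustrative assumption:

```python
import numpy as np

rng = np.random.default_rng(5)

def kmeans_pp_seeds(points, k):
    """k-means++ seeding: sample each new seed with probability
    proportional to D(x)^2, the squared distance from x to the
    closest already-chosen seed."""
    n = len(points)
    seeds = [points[rng.integers(n)]]           # first seed: uniform
    for _ in range(k - 1):
        d2 = np.min([np.sum((points - s) ** 2, axis=1) for s in seeds],
                    axis=0)                     # D(x)^2 for every point
        seeds.append(points[rng.choice(n, p=d2 / d2.sum())])
    return np.array(seeds)

# Three well-separated blobs: seeding should pick one seed per blob
blobs = np.concatenate([rng.normal(c, 0.1, size=(50, 2))
                        for c in ((0, 0), (10, 0), (0, 10))])
seeds = kmeans_pp_seeds(blobs, 3)
```

The D(x)^2 computation over all points is the embarrassingly parallel part, which is what the GPU, OpenMP, and Cray XMT versions distribute.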

  17. Parallel plate detectors

    International Nuclear Information System (INIS)

    Gardes, D.; Volkov, P.

    1981-01-01

    A 5x3 cm² (timing only) and a 15x5 cm² (timing and position) parallel plate avalanche counter (PPAC) are considered. The theory of operation and timing resolution is given. The measurement set-up and the curves of experimental results illustrate the possibilities of the two counters.

  18. Parallel hierarchical global illumination

    Energy Technology Data Exchange (ETDEWEB)

    Snell, Quinn O. [Iowa State Univ., Ames, IA (United States)

    1997-10-08

    Solving the global illumination problem is equivalent to determining the intensity of every wavelength of light in all directions at every point in a given scene. The complexity of the problem has led researchers to use approximation methods for solving the problem on serial computers. Rather than using an approximation method, such as backward ray tracing or radiosity, the authors have chosen to solve the Rendering Equation by direct simulation of light transport from the light sources. This paper presents an algorithm that solves the Rendering Equation to any desired accuracy, and can be run in parallel on distributed memory or shared memory computer systems with excellent scaling properties. It appears superior in both speed and physical correctness to recent published methods involving bidirectional ray tracing or hybrid treatments of diffuse and specular surfaces. Like progressive radiosity methods, it dynamically refines the geometry decomposition where required, but does so without the excessive storage requirements for ray histories. The algorithm, called Photon, produces a scene which converges to the global illumination solution. This amounts to a huge task for a 1997-vintage serial computer, but using the power of a parallel supercomputer significantly reduces the time required to generate a solution. Currently, Photon can be run on most parallel environments from a shared memory multiprocessor to a parallel supercomputer, as well as on clusters of heterogeneous workstations.
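
The Rendering Equation that Photon solves by direct light-transport simulation can be written in its standard form (quoted from the literature, not from this paper):

```latex
% Outgoing radiance at surface point x in direction omega_o:
L_o(x, \omega_o) = L_e(x, \omega_o)
  + \int_{\Omega} f_r(x, \omega_i, \omega_o)\, L_i(x, \omega_i)\,
    (\omega_i \cdot n)\, \mathrm{d}\omega_i
```

Here L_e is emitted radiance, f_r is the surface BRDF, and the integral accumulates incoming radiance over the hemisphere Ω about the surface normal n; simulating photons from the light sources estimates this integral without the diffuse-only restriction of radiosity.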

  19. Worker flexibility in a parallel dual resource constrained job shop

    NARCIS (Netherlands)

    Yue, H.; Slomp, J.; Molleman, E.; van der Zee, D.J.

    2008-01-01

    In this paper we investigate cross-training policies in a dual resource constraint (DRC) parallel job shop where new part types are frequently introduced into the system. Each new part type introduction induces the need for workers to go through a learning curve. A cross-training policy relates to

  20. Spike morphology in blast-wave-driven instability experiments

    International Nuclear Information System (INIS)

    Kuranz, C. C.; Drake, R. P.; Grosskopf, M. J.; Fryxell, B.; Budde, A.; Hansen, J. F.; Miles, A. R.; Plewa, T.; Hearn, N.; Knauer, J.

    2010-01-01

    The laboratory experiments described in the present paper observe the blast-wave-driven Rayleigh-Taylor instability with three-dimensional (3D) initial conditions. About 5 kJ of energy from the Omega laser creates conditions similar to those of the He-H interface during the explosion phase of a supernova. The experimental target is a 150 μm thick plastic disk followed by a low-density foam. The plastic piece has an embedded, 3D perturbation. The basic structure of the pattern is two orthogonal sine waves where each sine wave has an amplitude of 2.5 μm and a wavelength of 71 μm. In some experiments, an additional wavelength is added to explore the interaction of modes. In experiments with 3D initial conditions the spike morphology differs from what has been observed in other Rayleigh-Taylor experiments and simulations. Under certain conditions, experimental radiographs show some mass extending from the interface to the shock front. Current simulations show neither the spike morphology nor the spike penetration observed in the experiments. The amount of mass reaching the shock front is analyzed and potential causes for the spike morphology and the spikes reaching the shock are discussed. One such hypothesis is that these phenomena may be caused by magnetic pressure, generated by an azimuthal magnetic field produced by the plasma dynamics.

  1. Financial time series prediction using spiking neural networks.

    Science.gov (United States)

    Reid, David; Hussain, Abir Jaafar; Tawfik, Hissam

    2014-01-01

    In this paper a novel application of a particular type of spiking neural network, a Polychronous Spiking Network, was used for financial time series prediction. It is argued that the inherent temporal capabilities of this type of network are suited to non-stationary data such as these. The performance of the spiking neural network was benchmarked against three systems: two "traditional" rate-encoded neural networks, a Multi-Layer Perceptron and a Dynamic Ridge Polynomial neural network, and a standard Linear Predictor Coefficients model. For this comparison three non-stationary and noisy time series were used: IBM stock data, US/Euro exchange rate data, and the price of Brent crude oil. The experiments demonstrated favourable prediction results for the Spiking Neural Network in terms of Annualised Return and prediction error for 5-step-ahead predictions. These results were also supported by other relevant metrics such as Maximum Drawdown and Signal-To-Noise ratio. This work demonstrated the applicability of the Polychronous Spiking Network to financial data forecasting, and this in turn indicates the potential of using such networks over traditional systems in difficult-to-manage non-stationary environments.

  2. Scaling of spiking and humping in keyhole welding

    Energy Technology Data Exchange (ETDEWEB)

    Wei, P S; Chuang, K C [Department of Mechanical and Electro-Mechanical Engineering, National Sun Yat-Sen University, Kaohsiung, Taiwan (China); DebRoy, T [Department of Materials Science and Engineering, The Pennsylvania State University, University Park, PA 16802 (United States); Ku, J S, E-mail: pswei@mail.nsysu.edu.tw, E-mail: cielo.zhuang@gmail.com, E-mail: rtd1@psu.edu, E-mail: jsku@mail.nsysu.edu.tw [Institute of Materials Science and Engineering, National Sun Yat-Sen University, Kaohsiung, Taiwan (China)

    2011-06-22

    Spiking, rippling and humping seriously reduce the strength of welds. The effects of beam focusing, volatile alloying element concentration and welding velocity on spiking, coarse rippling and humping in keyhole mode electron-beam welding are examined through scale analysis. Although these defects have been studied in the past, the mechanisms for their formation are not fully understood. This work relates the average amplitudes of spikes to fusion zone depth for the welding of Al 6061, SS 304 and carbon steel, and Al 5083. The scale analysis introduces welding and melting efficiencies and an appropriate power distribution to account for the focusing effects, and the energy which is reflected and escapes through the keyhole opening to the surroundings. The frequency of humping and spiking can also be predicted from the scale analysis. The analysis also reveals the interrelation between coarse rippling and humping. The data and the mechanistic findings reported in this study are useful for understanding and preventing spiking and humping during keyhole mode electron and laser beam welding.

  3. Parallel grid population

    Science.gov (United States)

    Wald, Ingo; Ize, Santiago

    2015-07-28

    Parallel population of a grid with a plurality of objects using a plurality of processors. One example embodiment is a method for parallel population of a grid with a plurality of objects using a plurality of processors. The method includes a first act of dividing a grid into n distinct grid portions, where n is the number of processors available for populating the grid. The method also includes acts of dividing a plurality of objects into n distinct sets of objects, assigning a distinct set of objects to each processor such that each processor determines by which distinct grid portion(s) each object in its distinct set of objects is at least partially bounded, and assigning a distinct grid portion to each processor such that each processor populates its distinct grid portion with any objects that were previously determined to be at least partially bounded by its distinct grid portion.
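
    A sequential Python sketch of the two-phase method described above; the per-object and per-portion loops are the parts that would run on separate processors, and the 1-D grid with interval objects is an illustrative simplification, not the patent's embodiment:

```python
# Two-phase grid population sketch: objects are (lo, hi) intervals on a
# 1-D grid split into n equal portions, one portion per "processor".

def populate_grid(objects, n, grid_min=0.0, grid_max=1.0):
    """Return, for each grid portion, the ids of objects that overlap it."""
    width = (grid_max - grid_min) / n
    bounded_by = [[] for _ in range(n)]
    # Phase 1: each processor takes a distinct set of objects and determines
    # which grid portion(s) at least partially bound each of its objects.
    for obj_id, (lo, hi) in enumerate(objects):
        first = max(0, int((lo - grid_min) / width))
        last = min(n - 1, int((hi - grid_min) / width))
        # Phase 2: the processor owning each portion then populates it with
        # the objects determined to be bounded by that portion.
        for portion in range(first, last + 1):
            bounded_by[portion].append(obj_id)
    return bounded_by

cells = populate_grid([(0.05, 0.15), (0.4, 0.9)], n=4)
# object 0 lands only in portion 0; object 1 spans portions 1-3
```

    Splitting the work this way means no two processors ever write to the same grid portion in the second phase, which is what removes the need for locking.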

  4. Ultrascalable petaflop parallel supercomputer

    Science.gov (United States)

    Blumrich, Matthias A [Ridgefield, CT; Chen, Dong [Croton On Hudson, NY; Chiu, George [Cross River, NY; Cipolla, Thomas M [Katonah, NY; Coteus, Paul W [Yorktown Heights, NY; Gara, Alan G [Mount Kisco, NY; Giampapa, Mark E [Irvington, NY; Hall, Shawn [Pleasantville, NY; Haring, Rudolf A [Cortlandt Manor, NY; Heidelberger, Philip [Cortlandt Manor, NY; Kopcsay, Gerard V [Yorktown Heights, NY; Ohmacht, Martin [Yorktown Heights, NY; Salapura, Valentina [Chappaqua, NY; Sugavanam, Krishnan [Mahopac, NY; Takken, Todd [Brewster, NY

    2010-07-20

    A massively parallel supercomputer of petaOPS-scale includes node architectures based upon System-On-a-Chip technology, where each processing node comprises a single Application Specific Integrated Circuit (ASIC) having up to four processing elements. The ASIC nodes are interconnected by multiple independent networks that optimally maximize the throughput of packet communications between nodes with minimal latency. The multiple networks may include three high-speed networks for parallel algorithm message passing including a Torus, collective network, and a Global Asynchronous network that provides global barrier and notification functions. These multiple independent networks may be collaboratively or independently utilized according to the needs or phases of an algorithm for optimizing algorithm processing performance. The use of a DMA engine is provided to facilitate message passing among the nodes without the expenditure of processing resources at the node.

  5. More parallel please

    DEFF Research Database (Denmark)

    Gregersen, Frans; Josephson, Olle; Kristoffersen, Gjert

    Abstract [en] More parallel, please is the result of the work of an Inter-Nordic group of experts on language policy financed by the Nordic Council of Ministers 2014-17. The book presents all that is needed to plan, practice and revise a university language policy which takes as its point of departure that English may be used in parallel with the various local, in this case Nordic, languages. As such, the book integrates the challenge of internationalization faced by any university with the wish to improve quality in research, education and administration based on the local language(s). There are three layers in the text: First, you may read the extremely brief version of the in total 11 recommendations for best practice. Second, you may acquaint yourself with the extended version of the recommendations and finally, you may study the reasoning behind each of them. At the end of the text, we give...

  6. Xyce parallel electronic simulator.

    Energy Technology Data Exchange (ETDEWEB)

    Keiter, Eric R; Mei, Ting; Russo, Thomas V.; Rankin, Eric Lamont; Schiek, Richard Louis; Thornquist, Heidi K.; Fixel, Deborah A.; Coffey, Todd S; Pawlowski, Roger P; Santarelli, Keith R.

    2010-05-01

    This document is a reference guide to the Xyce Parallel Electronic Simulator, and is a companion document to the Xyce Users Guide. The focus of this document is to list, as exhaustively as possible, device parameters, solver options, parser options, and other usage details of Xyce. This document is not intended to be a tutorial. Users who are new to circuit simulation are better served by the Xyce Users Guide.

  7. Stability of parallel flows

    CERN Document Server

    Betchov, R

    2012-01-01

    Stability of Parallel Flows provides information pertinent to hydrodynamical stability. This book explores the stability problems that occur in various fields, including electronics, mechanics, oceanography, administration, economics, as well as naval and aeronautical engineering. Organized into two parts encompassing 10 chapters, this book starts with an overview of the general equations of a two-dimensional incompressible flow. This text then explores the stability of a laminar boundary layer and presents the equation of the inviscid approximation. Other chapters present the general equation

  8. Algorithmically specialized parallel computers

    CERN Document Server

    Snyder, Lawrence; Gannon, Dennis B

    1985-01-01

    Algorithmically Specialized Parallel Computers focuses on the concept and characteristics of an algorithmically specialized computer.This book discusses the algorithmically specialized computers, algorithmic specialization using VLSI, and innovative architectures. The architectures and algorithms for digital signal, speech, and image processing and specialized architectures for numerical computations are also elaborated. Other topics include the model for analyzing generalized inter-processor, pipelined architecture for search tree maintenance, and specialized computer organization for raster

  9. Comparison of spike-sorting algorithms for future hardware implementation.

    Science.gov (United States)

    Gibson, Sarah; Judy, Jack W; Markovic, Dejan

    2008-01-01

    Applications such as brain-machine interfaces require hardware spike sorting in order to (1) obtain single-unit activity and (2) perform data reduction for wireless transmission of data. Such systems must be low-power, low-area, high-accuracy, automatic, and able to operate in real time. Several detection and feature extraction algorithms for spike sorting are described briefly and evaluated in terms of accuracy versus computational complexity. The nonlinear energy operator method is chosen as the optimal spike detection algorithm, being most robust to noise and relatively simple. The discrete derivatives method [1] is chosen as the optimal feature extraction method, maintaining high accuracy across SNRs with a complexity orders of magnitude less than that of traditional methods such as PCA.
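
    The nonlinear energy operator chosen above is simple enough to sketch in a few lines of Python; the test signal and threshold here are illustrative assumptions, not the paper's data (real spike sorters operate on sampled extracellular voltage):

```python
# Nonlinear energy operator (NEO) spike detection sketch:
# psi[n] = x[n]^2 - x[n-1] * x[n+1], thresholded to find spikes.

def neo(x):
    """Nonlinear energy operator of a 1-D signal (list of floats)."""
    return [x[n] ** 2 - x[n - 1] * x[n + 1] for n in range(1, len(x) - 1)]

def detect_spikes(x, threshold):
    """Return indices (into x) where the NEO output exceeds the threshold."""
    return [n + 1 for n, e in enumerate(neo(x)) if e > threshold]

# A flat trace with one large deflection at index 4:
signal = [0.0, 0.1, 0.0, -0.1, 3.0, -0.2, 0.1, 0.0]
spikes = detect_spikes(signal, threshold=1.0)  # -> [4]
```

    Because NEO rewards a large instantaneous amplitude flanked by small neighbours, isolated sharp deflections stand out even when the background noise has comparable energy spread over time.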

  10. Grain price spikes and beggar-thy-neighbor policy responses

    DEFF Research Database (Denmark)

    Jensen, Hans Grinsted; Anderson, Kym

    When prices spike in international grain markets, national governments often reduce the extent to which that spike affects their domestic food markets. Those actions exacerbate the price spike and international welfare transfer associated with that terms of trade change. Several recent analyses have assessed the extent to which those policies contributed to the 2006-08 international price rise, but only by focusing on one commodity or using a back-of-the-envelope (BOTE) method. This paper provides a more comprehensive analysis using a global economy-wide model that is able to take account of the interactions between markets for farm products that are closely related in production and/or consumption, and able to estimate the impacts of those insulating policies on grain prices and on the grain trade and economic welfare of the world's various countries. Our results support the conclusion from earlier...

  11. Character recognition from trajectory by recurrent spiking neural networks.

    Science.gov (United States)

    Jiangrong Shen; Kang Lin; Yueming Wang; Gang Pan

    2017-07-01

    Spiking neural networks are biologically plausible and power-efficient on neuromorphic hardware, while recurrent neural networks have been proven to be efficient on time series data. However, how to use the recurrent property to improve the performance of spiking neural networks is still a problem. This paper proposes a recurrent spiking neural network for character recognition using trajectories. In the network, a new encoding method is designed, in which varying time ranges of input streams are used in different recurrent layers. This is able to improve the generalization ability of our model compared with general encoding methods. The experiments are conducted on four groups of the character data set from University of Edinburgh. The results show that our method can achieve a higher average recognition accuracy than existing methods.

  12. Evaluation of the uranium double spike technique for environmental monitoring

    International Nuclear Information System (INIS)

    Hemberger, P.H.; Rokop, D.J.; Efurd, D.W.; Roensch, F.R.; Smith, D.H.; Turner, M.L.; Barshick, C.M.; Bayne, C.K.

    1998-01-01

    Use of a uranium double spike in analysis of environmental samples showed that a ²³⁵U enrichment of 1% (²³⁵U/²³⁸U = 0.00732) can be distinguished from natural (²³⁵U/²³⁸U = 0.00725). Experiments performed jointly at Los Alamos National Laboratory (LANL) and Oak Ridge National Laboratory (ORNL) used a carefully calibrated double spike of ²³³U and ²³⁶U to obtain much better precision than is possible using conventional analytical techniques. A variety of different sampling media (vegetation and swipes) showed that, provided sufficient care is exercised in choice of sample type, relative standard deviations of less than ±0.5% can be routinely obtained. This ability, unavailable without use of the double spike, has enormous potential significance in the detection of undeclared nuclear facilities.
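
    A back-of-the-envelope check in Python, using only the ratios and precision quoted in the abstract, of why ±0.5% precision suffices to resolve a 1% enrichment; the double-spike correction algebra itself is not reproduced here:

```python
# Quick distinguishability check using the abstract's numbers only.
natural = 0.00725   # 235U/238U ratio, natural uranium
enriched = 0.00732  # 235U/238U ratio, 1% enriched sample
rsd = 0.005         # 0.5% relative standard deviation (reported precision)

rel_diff = (enriched - natural) / natural  # ~0.97% relative difference
resolvable = rel_diff > rsd                # True: ~1% shift vs 0.5% RSD
```

    The shift between the two ratios is roughly twice the routinely achieved relative standard deviation, which is why the enriched sample is distinguishable, whereas conventional techniques with several-percent precision could not resolve it.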

  13. A Hybrid Setarx Model for Spikes in Tight Electricity Markets

    Directory of Open Access Journals (Sweden)

    Carlo Lucheroni

    2012-01-01

    The paper discusses a simple-looking but highly nonlinear regime-switching, self-excited threshold model for hourly electricity prices in continuous and discrete time. The regime structure of the model is linked to organizational features of the market. In continuous time, the model can include spikes without using jumps, by defining stochastic orbits. In passing from continuous time to discrete time, the stochastic orbits survive discretization and can be identified again as spikes. A calibration technique suitable for the discrete version of this model, which does not need deseasonalization or spike filtering, is developed, tested and applied to market data. The discussion of the properties of the model uses phase-space analysis, an approach uncommon in econometrics. (original abstract)

  14. Inherently stochastic spiking neurons for probabilistic neural computation

    KAUST Repository

    Al-Shedivat, Maruan

    2015-04-01

    Neuromorphic engineering aims to design hardware that efficiently mimics neural circuitry and provides the means for emulating and studying neural systems. In this paper, we propose a new memristor-based neuron circuit that uniquely complements the scope of neuron implementations and follows the stochastic spike response model (SRM), which plays a cornerstone role in spike-based probabilistic algorithms. We demonstrate that the switching of the memristor is akin to the stochastic firing of the SRM. Our analysis and simulations show that the proposed neuron circuit satisfies a neural computability condition that enables probabilistic neural sampling and spike-based Bayesian learning and inference. Our findings constitute an important step towards memristive, scalable and efficient stochastic neuromorphic platforms. © 2015 IEEE.

  15. A Cross-Correlated Delay Shift Supervised Learning Method for Spiking Neurons with Application to Interictal Spike Detection in Epilepsy.

    Science.gov (United States)

    Guo, Lilin; Wang, Zhenzhong; Cabrerizo, Mercedes; Adjouadi, Malek

    2017-05-01

    This study introduces a novel learning algorithm for spiking neurons, called CCDS, which is able to learn and reproduce arbitrary spike patterns in a supervised fashion, allowing the processing of spatiotemporal information encoded in the precise timing of spikes. Unlike the Remote Supervised Method (ReSuMe), synapse delays and axonal delays in CCDS are variables that are modulated together with weights during learning. The CCDS rule is both biologically plausible and computationally efficient. The properties of this learning rule are investigated extensively through experimental evaluations in terms of reliability, adaptive learning performance, generality to different neuron models, learning in the presence of noise, effects of its learning parameters and classification performance. Results presented show that the CCDS learning method achieves learning accuracy and learning speed comparable with ReSuMe, but improves classification accuracy when compared to both the Spike Pattern Association Neuron (SPAN) learning rule and the Tempotron learning rule. The merit of the CCDS rule is further validated on a practical example involving the automated detection of interictal spikes in EEG records of patients with epilepsy. Results again show that with proper encoding, the CCDS rule achieves good recognition performance.

  16. Effects of a new parallel primary healthcare centre and on-campus training programme on history taking, physical examination skills and medical students’ preparedness: a prospective comparative study in Taiwan

    Science.gov (United States)

    Yang, Ying-Ying; Wang, Shuu-Jiun; Yang, Ling-Yu; Lirng, Jiing-Feng; Huang, Chia-Chang; Liang, Jen-Feng; Lee, Fa-Yauh; Hwang, Shinn-Jang; Huang, Chin-Chou; Kirby, Ralph

    2017-01-01

    Objectives The primary healthcare centre (PHCC) is the first place where medical students experience patient contact. Usually, medical students are frustrated by a lack of proper skills training for on-campus history taking (HT), physical examination (PE) and self-directed learning (SDL) to prepare for their PHCC and in-hospital patient contact. For pre-clerks, this study aims to compare the effectiveness of PHCC training and PHCC training in combination with on-campus HT and PE training modules (PHCC+on-campus) on their clerkship preparedness. Design This comparative study utilised prospective, consecutive, end of pre-clerkship group objective structured clinical examination (GOSCE), beginning of clerkship OSCE and self-administered Preparation for Hospital Practice Questionnaire (PHPQ). Setting/participants 128 pre-clinical clerk volunteers (64 each year) receiving PHCC training (7 week PHCC training in addition to 7 week assignment-based group learning, academic year 2014, controls) and PHCC training in combination with on-campus module training (academic year 2015, 7 week PHCC training in addition to 7 week on-campus sessions) were sequentially assessed before the module (week 1), at the end of the module (week 14) and at the beginning of clerkship (week 25). Results For overall HT and PE skills, both PHCC and PHCC+on-campus module trained pre-clerks performed better on OSCE than GOSCE. Additionally, the improvement was accompanied by higher self-reported PHPQ scores in ‘confidence/coping’ and ‘SDL’ domains. At the end of the pre-clerkship and the beginning of the clerkship stages, the degree of improvement in preparedness in ‘confidence/coping’ and ‘SDL’ domains was higher for those in the PHCC+on-campus group than for those in the PHCC group. Among the PHCC+on-campus module participants, a positive association was observed between high mean PHPQ-SDL scores and high OSCE scores. Conclusions Our study suggests that the PHCC+on-campus module

  17. Effects of a new parallel primary healthcare centre and on-campus training programme on history taking, physical examination skills and medical students' preparedness: a prospective comparative study in Taiwan.

    Science.gov (United States)

    Yang, Ying-Ying; Wang, Shuu-Jiun; Yang, Ling-Yu; Lirng, Jiing-Feng; Huang, Chia-Chang; Liang, Jen-Feng; Lee, Fa-Yauh; Hwang, Shinn-Jang; Huang, Chin-Chou; Kirby, Ralph

    2017-09-25

    The primary healthcare centre (PHCC) is the first place where medical students experience patient contact. Usually, medical students are frustrated by a lack of proper skills training for on-campus history taking (HT), physical examination (PE) and self-directed learning (SDL) to prepare for their PHCC and in-hospital patient contact. For pre-clerks, this study aims to compare the effectiveness of PHCC training and PHCC training in combination with on-campus HT and PE training modules (PHCC+on-campus) on their clerkship preparedness. This comparative study utilised prospective, consecutive, end of pre-clerkship group objective structured clinical examination (GOSCE), beginning of clerkship OSCE and self-administered Preparation for Hospital Practice Questionnaire (PHPQ). 128 pre-clinical clerk volunteers (64 each year) receiving PHCC training (7 week PHCC training in addition to 7 week assignment-based group learning, academic year 2014, controls) and PHCC training in combination with on-campus module training (academic year 2015, 7 week PHCC training in addition to 7 week on-campus sessions) were sequentially assessed before the module (week 1), at the end of the module (week 14) and at the beginning of clerkship (week 25). For overall HT and PE skills, both PHCC and PHCC+on-campus module trained pre-clerks performed better on OSCE than GOSCE. Additionally, the improvement was accompanied by higher self-reported PHPQ scores in 'confidence/coping' and 'SDL' domains. At the end of the pre-clerkship and the beginning of the clerkship stages, the degree of improvement in preparedness in 'confidence/coping' and 'SDL' domains was higher for those in the PHCC+on-campus group than for those in the PHCC group. Among the PHCC+on-campus module participants, a positive association was observed between high mean PHPQ-SDL scores and high OSCE scores. Our study suggests that the PHCC+on-campus module, which is paired faculty led and pre-trained dyad student assisted, is

  18. Sleep deprivation and spike-wave discharges in epileptic rats

    OpenAIRE

    Drinkenburg, W.H.I.M.; Coenen, A.M.L.; Vossen, J.M.H.; Luijtelaar, E.L.J.M. van

    1995-01-01

    The effects of sleep deprivation were studied on the occurrence of spike-wave discharges in the electroencephalogram of rats of the epileptic WAG/Rij strain, a model for absence epilepsy. This was done before, during and after a period of 12 hours of near total sleep deprivation. A substantial increase in the number of spike-wave discharges was found during the first 4 hours of the deprivation period, whereas in the following deprivation hours epileptic activity returned to baseline values. I...

  19. Spike propagation in driven chain networks with dominant global inhibition

    International Nuclear Information System (INIS)

    Chang Wonil; Jin, Dezhe Z.

    2009-01-01

    Spike propagation in chain networks is usually studied in the synfire regime, in which successive groups of neurons are synaptically activated sequentially through the unidirectional excitatory connections. Here we study the dynamics of chain networks with dominant global feedback inhibition that prevents the synfire activity. Neural activity is driven by suprathreshold external inputs. We analytically and numerically demonstrate that spike propagation along the chain is a unique dynamical attractor in a wide parameter regime. The strong inhibition permits a robust winner-take-all propagation in the case of multiple chains competing via the inhibition.

  20. Spiking neuron devices consisting of single-flux-quantum circuits

    International Nuclear Information System (INIS)

    Hirose, Tetsuya; Asai, Tetsuya; Amemiya, Yoshihito

    2006-01-01

    Single-flux-quantum (SFQ) circuits can be used for making spiking neuron devices, which are useful elements for constructing intelligent, brain-like computers. The device we propose is based on the leaky integrate-and-fire neuron (IFN) model and uses an SFQ pulse as an action signal or a spike of neurons. The operation of the neuron device is confirmed by computer simulation. It can operate with a short delay of 100 ps or less and is the highest-speed neuron device ever reported.
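
    A minimal discrete-time Python sketch of the leaky integrate-and-fire model the device implements; the leak factor, threshold and input pulse train are illustrative assumptions, not the 100 ps SFQ circuit's figures:

```python
# Leaky integrate-and-fire (LIF) neuron sketch: leaky integration of input
# pulses, fire-and-reset when the membrane variable crosses threshold.

def lif_spike_times(inputs, leak=0.9, threshold=1.0):
    """Return the time steps at which the neuron fires."""
    v, spikes = 0.0, []
    for t, i in enumerate(inputs):
        v = leak * v + i          # leaky integration of the input pulse
        if v >= threshold:        # threshold crossing: fire ...
            spikes.append(t)
            v = 0.0               # ... and reset the membrane variable
    return spikes

times = lif_spike_times([0.4, 0.4, 0.4, 0.0, 0.4, 0.8])  # -> [2, 5]
```

    Three moderate pulses must accumulate before the first spike, while a pause lets the potential leak away, which is the integrate-and-fire behaviour the SFQ pulses reproduce in hardware.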

  1. STDP-based spiking deep convolutional neural networks for object recognition.

    Science.gov (United States)

    Kheradpisheh, Saeed Reza; Ganjtabesh, Mohammad; Thorpe, Simon J; Masquelier, Timothée

    2018-03-01

    Previous studies have shown that spike-timing-dependent plasticity (STDP) can be used in spiking neural networks (SNN) to extract visual features of low or intermediate complexity in an unsupervised manner. These studies, however, used relatively shallow architectures, and only one layer was trainable. Another line of research has demonstrated - using rate-based neural networks trained with back-propagation - that having many layers increases the recognition robustness, an approach known as deep learning. We thus designed a deep SNN, comprising several convolutional (trainable with STDP) and pooling layers. We used a temporal coding scheme where the most strongly activated neurons fire first, and less activated neurons fire later or not at all. The network was exposed to natural images. Thanks to STDP, neurons progressively learned features corresponding to prototypical patterns that were both salient and frequent. Only a few tens of examples per category were required and no labels were needed. After learning, the complexity of the extracted features increased along the hierarchy, from edge detectors in the first layer to object prototypes in the last layer. Coding was very sparse, with only a few thousand spikes per image, and in some cases the object category could be reasonably well inferred from the activity of a single higher-order neuron. More generally, the activity of a few hundred such neurons contained robust category information, as demonstrated using a classifier on the Caltech 101, ETH-80, and MNIST databases. We also demonstrate the superiority of STDP over other unsupervised techniques such as random crops (HMAX) or auto-encoders. Taken together, our results suggest that the combination of STDP with latency coding may be a key to understanding the way that the primate visual system learns, its remarkable processing speed and its low energy consumption. These mechanisms are also interesting for artificial vision systems, particularly for hardware
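
    The latency coding scheme described above (strongest activations fire first, weak ones later or not at all) can be sketched in Python; the linear time mapping and the cutoff value are assumptions for illustration, not the paper's exact scheme:

```python
# Latency (temporal) encoding sketch: map each activation in [0, 1] to a
# spike time, with stronger activations firing earlier and activations
# below a cutoff producing no spike at all.

def latency_encode(activations, t_max=10.0, cutoff=0.1):
    """Return one spike time per input; None means the unit stays silent."""
    return [
        None if a < cutoff else t_max * (1.0 - a)  # strong -> early
        for a in activations
    ]

spike_times = latency_encode([1.0, 0.5, 0.05, 0.8])
# full activation fires at t=0, half activation at t=5, 0.05 never fires
```

    Encoding each input with at most one spike is what makes the coding so sparse: downstream STDP neurons can learn from the order of first spikes alone.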

  2. Anti-correlations in the degree distribution increase stimulus detection performance in noisy spiking neural networks.

    Science.gov (United States)

    Martens, Marijn B; Houweling, Arthur R; E Tiesinga, Paul H

    2017-02-01

    Neuronal circuits in the rodent barrel cortex are characterized by stable low firing rates. However, recent experiments show that short spike trains elicited by electrical stimulation in single neurons can induce behavioral responses. Hence, the underlying neural networks provide stability against internal fluctuations in the firing rate, while simultaneously making the circuits sensitive to small external perturbations. Here we studied whether stability and sensitivity are affected by the connectivity structure in recurrently connected spiking networks. We found that anti-correlation between the number of afferent (in-degree) and efferent (out-degree) synaptic connections of neurons increases stability against pathological bursting, relative to networks where the degrees were either positively correlated or uncorrelated. In the stable network state, stimulation of a few cells could lead to a detectable change in the firing rate. To quantify the ability of networks to detect the stimulation, we used a receiver operating characteristic (ROC) analysis. For a given level of background noise, networks with anti-correlated degrees displayed the lowest false positive rates, and consequently had the highest stimulus detection performance. We propose that anti-correlation in the degree distribution may be a computational strategy employed by sensory cortices to increase the detectability of external stimuli. We show that networks with anti-correlated degrees can in principle be formed by applying learning rules that combine spike-timing dependent plasticity, homeostatic plasticity and pruning to networks with uncorrelated degrees. To test our prediction we suggest a novel experimental method to estimate correlations in the degree distribution.
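
    One simple way to construct the anti-correlated in-/out-degree pairing described above is to match a sorted degree sequence against its reverse; this Python sketch is an illustrative construction, not the paper's plasticity-based mechanism:

```python
# Anti-correlated degree sketch: neurons with many incoming connections
# get few outgoing ones, by pairing a sorted degree list with its reverse.

def anticorrelated_degrees(degrees):
    """Pair each in-degree with an out-degree from the reversed ranking."""
    in_deg = sorted(degrees)
    out_deg = list(reversed(in_deg))
    return list(zip(in_deg, out_deg))

def degree_covariance(pairs):
    """Covariance between in- and out-degrees (negative = anti-correlated)."""
    n = len(pairs)
    mi = sum(i for i, _ in pairs) / n
    mo = sum(o for _, o in pairs) / n
    return sum((i - mi) * (o - mo) for i, o in pairs) / n

pairs = anticorrelated_degrees([3, 7, 5, 9, 1])  # (1,9), (3,7), ..., (9,1)
cov = degree_covariance(pairs)                   # negative by construction
```

    The reversed pairing keeps the total in-degree equal to the total out-degree, which any valid directed network requires, while making the covariance between the two strictly negative for any non-constant degree sequence.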

  3. Unsupervised Learning of Digit Recognition Using Spike-Timing-Dependent Plasticity

    Directory of Open Access Journals (Sweden)

    Peter U. Diehl

    2015-08-01

    In order to understand how the mammalian neocortex is performing computations, two things are necessary: we need to have a good understanding of the available neuronal processing units and mechanisms, and we need to gain a better understanding of how those mechanisms are combined to build functioning systems. Therefore, in recent years there has been increasing interest in how spiking neural networks (SNN) can be used to perform complex computations or solve pattern recognition tasks. However, it remains a challenging task to design SNNs which use biologically plausible mechanisms (especially for learning new patterns), since most such SNN architectures rely on training in a rate-based network and subsequent conversion to a SNN. We present a SNN for digit recognition which is based on mechanisms with increased biological plausibility, i.e. conductance-based instead of current-based synapses, spike-timing-dependent plasticity with time-dependent weight change, lateral inhibition, and an adaptive spiking threshold. Unlike most other systems, we do not use a teaching signal and do not present any class labels to the network. Using this unsupervised learning scheme, our architecture achieves 95% accuracy on the MNIST benchmark, which is better than previous SNN implementations without supervision. The fact that we used no domain-specific knowledge points toward the general applicability of our network design. Also, the performance of our network scales well with the number of neurons used and shows similar performance for four different learning rules, indicating robustness of the full combination of mechanisms, which suggests applicability in heterogeneous biological neural networks.
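
    A minimal pair-based STDP update in Python, in the spirit of the time-dependent weight change mentioned above; the exponential windows and all constants are illustrative assumptions, not the paper's conductance-based parameterization:

```python
# Pair-based STDP sketch: the weight change for a pre/post spike pair
# decays exponentially with the time difference between the two spikes.

import math

def stdp_dw(t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Weight change for one pre/post spike pair (times in ms)."""
    dt = t_post - t_pre
    if dt >= 0:   # pre before post: potentiation (strengthen the synapse)
        return a_plus * math.exp(-dt / tau)
    else:         # post before pre: depression (weaken the synapse)
        return -a_minus * math.exp(dt / tau)

dw_pot = stdp_dw(t_pre=10.0, t_post=15.0)  # positive: causal pairing
dw_dep = stdp_dw(t_pre=15.0, t_post=10.0)  # negative: acausal pairing
```

    The sign of the update depends only on the spike order, and its magnitude shrinks as the pairing interval grows, which is what lets a synapse learn precise causal timing relationships without any teaching signal.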

  4. A defined network of fast-spiking interneurons in orbitofrontal cortex: responses to behavioral contingencies and ketamine administration

    Directory of Open Access Journals (Sweden)

    Michael C Quirk

    2009-11-01

    Full Text Available Orbitofrontal cortex (OFC) is a region of prefrontal cortex implicated in the motivational control of behavior and in related abnormalities seen in psychosis and depression. It has been hypothesized that a critical mechanism in these disorders is the dysfunction of GABAergic interneurons that normally regulate prefrontal information processing. Here, we studied a subclass of interneurons isolated in rat OFC using extracellular waveform and spike train analysis. During performance of a goal-directed behavioral task, the firing of this class of putative fast-spiking (FS) interneurons showed robust temporal correlations indicative of a functionally coherent network. FS cell activity also co-varied with behavioral response latency, a key indicator of motivational state. Systemic administration of ketamine, a drug that can mimic psychosis, preferentially inhibited this cell class. Together, these results support the idea that OFC-FS interneurons form a critical link in the regulation of motivation by prefrontal circuits during normal and abnormal brain and behavioral states.

  5. Learning to Generate Sequences with Combination of Hebbian and Non-hebbian Plasticity in Recurrent Spiking Neural Networks.

    Science.gov (United States)

    Panda, Priyadarshini; Roy, Kaushik

    2017-01-01

    Synaptic plasticity, the foundation for learning and memory formation in the human brain, manifests in various forms. Here, we combine the standard spike-timing-correlation-based Hebbian plasticity with a non-Hebbian synaptic decay mechanism for training a recurrent spiking neural model to generate sequences. We show that inclusion of the adaptive decay of synaptic weights with standard STDP helps learn stable contextual dependencies between temporal sequences, while reducing the strong attractor states that emerge in recurrent models due to feedback loops. Furthermore, we show that the combined learning scheme suppresses the chaotic activity in the recurrent model substantially, thereby enhancing its ability to generate sequences consistently even in the presence of perturbations.
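The combination of a Hebbian term with a non-Hebbian adaptive decay can be sketched as a per-synapse update rule; the decay rate, bounds, and linear-decay form below are illustrative assumptions, not the paper's exact scheme:

```python
def update_weight(w, hebbian_dw, decay_rate=0.001, w_min=0.0, w_max=1.0):
    """One update step: a Hebbian (STDP-like) term plus an activity-independent
    decay that pulls the weight toward zero, limiting the runaway attractor
    states that feedback loops create in recurrent models."""
    w = w + hebbian_dw - decay_rate * w
    return max(w_min, min(w_max, w))

# With no Hebbian events, the weight decays toward zero
w = 0.5
for _ in range(100):
    w = update_weight(w, 0.0)
```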

  6. Temporal sequence learning in winner-take-all networks of spiking neurons demonstrated in a brain-based device.

    Science.gov (United States)

    McKinstry, Jeffrey L; Edelman, Gerald M

    2013-01-01

    Animal behavior often involves a temporally ordered sequence of actions learned from experience. Here we describe simulations of interconnected networks of spiking neurons that learn to generate patterns of activity in correct temporal order. The simulation consists of large-scale networks of thousands of excitatory and inhibitory neurons that exhibit short-term synaptic plasticity and spike-timing dependent synaptic plasticity. The neural architecture within each area is arranged to evoke winner-take-all (WTA) patterns of neural activity that persist for tens of milliseconds. In order to generate and switch between consecutive firing patterns in correct temporal order, a reentrant exchange of signals between these areas was necessary. To demonstrate the capacity of this arrangement, we used the simulation to train a brain-based device responding to visual input by autonomously generating temporal sequences of motor actions.

  7. Components of action potential repolarization in cerebellar parallel fibres.

    Science.gov (United States)

    Pekala, Dobromila; Baginskas, Armantas; Szkudlarek, Hanna J; Raastad, Morten

    2014-11-15

    Repolarization of the presynaptic action potential is essential for transmitter release, excitability and energy expenditure. Little is known about repolarization in thin, unmyelinated axons forming en passant synapses, which represent the most common type of axons in the mammalian brain's grey matter. We used rat cerebellar parallel fibres, an example of typical grey matter axons, to investigate the effects of K(+) channel blockers on repolarization. We show that repolarization is composed of a fast tetraethylammonium (TEA)-sensitive component, determining the width and amplitude of the spike, and a slow margatoxin (MgTX)-sensitive depolarized after-potential (DAP). These two components could be recorded at the granule cell soma as antidromic action potentials and from the axons with a newly developed miniaturized grease-gap method. A considerable proportion of fast repolarization remained in the presence of TEA, MgTX, or both. This residual was abolished by the addition of quinine. The importance of proper control of fast repolarization was demonstrated by somatic recordings of antidromic action potentials. In these experiments, the relatively broad K(+) channel blocker 4-aminopyridine reduced the fast repolarization, resulting in bursts of action potentials forming on top of the DAP. We conclude that repolarization of the action potential in parallel fibres is supported by at least three groups of K(+) channels. Differences in their temporal profiles allow relatively independent control of the spike and the DAP, whereas overlap of their temporal profiles provides robust control of axonal bursting properties.

  8. SpikingLab: modelling agents controlled by Spiking Neural Networks in Netlogo.

    Science.gov (United States)

    Jimenez-Romero, Cristian; Johnson, Jeffrey

    2017-01-01

    The scientific interest attracted by Spiking Neural Networks (SNN) has led to the development of tools for the simulation and study of neuronal dynamics ranging from phenomenological models to the more sophisticated and biologically accurate Hodgkin-and-Huxley-based and multi-compartmental models. However, despite the multiple features offered by neural modelling tools, their integration with environments for the simulation of robots and agents can be challenging and time-consuming. The implementation of artificial neural circuits to control robots generally involves the following tasks: (1) understanding the simulation tools, (2) creating the neural circuit in the neural simulator, (3) linking the simulated neural circuit with the environment of the agent and (4) programming the appropriate interface in the robot or agent to use the neural controller. The accomplishment of the above-mentioned tasks can be challenging, especially for undergraduate students or novice researchers. This paper presents an alternative tool which facilitates the simulation of simple SNN circuits using the multi-agent simulation and programming environment Netlogo (educational software that simplifies the study and experimentation of complex systems). The engine proposed and implemented in Netlogo for the simulation of a functional model of SNN is a simplification of integrate-and-fire (I&F) models. The characteristics of the engine (including neuronal dynamics, STDP learning and synaptic delay) are demonstrated through the implementation of an agent representing an artificial insect controlled by a simple neural circuit. The setup of the experiment and its outcomes are described in this work.
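The engine described is a simplification of integrate-and-fire models; a minimal Python analogue might look as follows (threshold, leak, and reset values are assumed here, and SpikingLab's actual Netlogo implementation will differ):

```python
class IFNeuron:
    """Minimal integrate-and-fire unit: input current accumulates on the
    membrane potential, a constant leak pulls it back toward the reset value,
    and crossing the threshold emits a spike and resets the potential."""
    def __init__(self, threshold=1.0, leak=0.05, reset=0.0):
        self.threshold, self.leak, self.reset = threshold, leak, reset
        self.v = reset

    def step(self, input_current):
        self.v = max(self.reset, self.v - self.leak) + input_current
        if self.v >= self.threshold:
            self.v = self.reset
            return True   # spike emitted
        return False

# Constant drive produces a regular spike train
n = IFNeuron()
spikes = [n.step(0.3) for _ in range(10)]
```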

  9. Toxicity of nickel-spiked freshwater sediments to benthic invertebrates-Spiking methodology, species sensitivity, and nickel bioavailability

    Science.gov (United States)

    Besser, John M.; Brumbaugh, William G.; Kemble, Nile E.; Ivey, Chris D.; Kunz, James L.; Ingersoll, Christopher G.; Rudel, David

    2011-01-01

    This report summarizes data from studies of the toxicity and bioavailability of nickel in nickel-spiked freshwater sediments. The goal of these studies was to generate toxicity and chemistry data to support development of broadly applicable sediment quality guidelines for nickel. The studies were conducted as three tasks, which are presented here as three chapters: Task 1, Development of methods for preparation and toxicity testing of nickel-spiked freshwater sediments; Task 2, Sensitivity of benthic invertebrates to toxicity of nickel-spiked freshwater sediments; and Task 3, Effect of sediment characteristics on nickel bioavailability. Appendices with additional methodological details and raw chemistry and toxicity data for the three tasks are available online at http://pubs.usgs.gov/sir/2011/5225/downloads/.

  10. Google Searches for "Cheap Cigarettes" Spike at Tax Increases: Evidence from an Algorithm to Detect Spikes in Time Series Data.

    Science.gov (United States)

    Caputi, Theodore L

    2018-05-03

    Online cigarette dealers have lower prices than brick-and-mortar retailers and advertise tax-free status.1-8 Previous studies show smokers search out these online alternatives at the time of a cigarette tax increase.9,10 However, these studies rely upon researchers' decision to consider a specific date and preclude the possibility that researchers focus on the wrong date. The purpose of this study is to introduce an unbiased methodology to the field of observing search patterns and to use this methodology to determine whether smokers search Google for "cheap cigarettes" at cigarette tax increases and, if so, whether the increased level of searches persists. Publicly available data from Google Trends is used to observe standardized search volumes for the term, "cheap cigarettes". Seasonal Hybrid Extreme Studentized Deviate and E-Divisive with Means tests were performed to observe spikes and mean level shifts in search volume. Of the twelve cigarette tax increases studied, ten showed spikes in searches for "cheap cigarettes" within two weeks of the tax increase. However, the mean level shifts did not occur for any cigarette tax increase. Searches for "cheap cigarettes" spike around the time of a cigarette tax increase, but the mean level of searches does not shift in response to a tax increase. The SHESD and EDM tests are unbiased methodologies that can be used to identify spikes and mean level shifts in time series data without an a priori date to be studied. SHESD and EDM affirm spikes in interest are related to tax increases. • Applies improved statistical techniques (SHESD and EDM) to Google search data related to cigarettes, reducing bias and increasing power • Contributes to the body of evidence that state and federal tax increases are associated with spikes in searches for cheap cigarettes and may be good dates for increased online health messaging related to tobacco.
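A much simplified stand-in for detecting spikes in a search-volume series is a robust z-score test against the median; this sketch is not the SHESD or EDM algorithm, just the underlying idea of flagging outliers relative to a robust baseline:

```python
def detect_spikes(series, z_thresh=3.5):
    """Flag points whose MAD-based robust z-score exceeds a threshold;
    a rough stand-in for the seasonal ESD spike test."""
    s = sorted(series)
    n = len(s)
    median = s[n // 2] if n % 2 else 0.5 * (s[n // 2 - 1] + s[n // 2])
    mad = sorted(abs(x - median) for x in series)[n // 2]  # rough MAD
    if mad == 0:
        return []
    return [i for i, x in enumerate(series)
            if 0.6745 * abs(x - median) / mad > z_thresh]

# Hypothetical weekly search volumes with a spike at a tax increase
volume = [10, 11, 9, 10, 12, 10, 48, 11, 10, 9]
hits = detect_spikes(volume)
```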

  11. Pelvic floor muscle training versus watchful waiting or pessary treatment for pelvic organ prolapse (POPPS) : Design and participant baseline characteristics of two parallel pragmatic randomized controlled trials in primary care

    NARCIS (Netherlands)

    Wiegersma, Marian; Panman, Chantal M. C. R.; Kollen, Boudewijn J.; Vermeulen, Karin M.; Schram, Aaltje J.; Messelink, Embert J.; Berger, Marjolein Y.; Lisman-Van Leeuwen, Yvonne; Dekker, Janny H.

    Pelvic floor muscle training (PFMT) and pessaries are commonly used in the conservative treatment of pelvic organ prolapse (POP). Because there is a lack of evidence regarding the optimal choice between these two interventions, we designed the "Pelvic Organ prolapse in primary care: effects of

  12. Resistor Combinations for Parallel Circuits.

    Science.gov (United States)

    McTernan, James P.

    1978-01-01

    To help simplify both teaching and learning of parallel circuits, a high school electricity/electronics teacher presents and illustrates the use of tables of values for parallel resistive circuits in which total resistances are whole numbers. (MF)
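The tables described rest on the reciprocal formula for parallel resistance, which yields whole-number totals for suitably chosen resistor pairs (for example, 12 Ω and 6 Ω in parallel give 4 Ω):

```python
def parallel_resistance(*resistors):
    """Total resistance of resistors in parallel: 1/R_t = sum of 1/R_i."""
    return 1.0 / sum(1.0 / r for r in resistors)

r_total = parallel_resistance(12.0, 6.0)  # a whole-number result: 4.0 ohms
```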

  13. SOFTWARE FOR DESIGNING PARALLEL APPLICATIONS

    Directory of Open Access Journals (Sweden)

    M. K. Bouza

    2017-01-01

    Full Text Available The object of research is the tools to support the development of parallel programs in C/C++. The methods and software which automate the process of designing parallel applications are proposed.

  14. Parallel External Memory Graph Algorithms

    DEFF Research Database (Denmark)

    Arge, Lars Allan; Goodrich, Michael T.; Sitchinava, Nodari

    2010-01-01

    In this paper, we study parallel I/O efficient graph algorithms in the Parallel External Memory (PEM) model, one of the private-cache chip multiprocessor (CMP) models. We study the fundamental problem of list ranking which leads to efficient solutions to problems on trees, such as computing lowest...... an optimal speedup of Θ(P) in parallel I/O complexity and parallel computation time, compared to the single-processor external memory counterparts....

  15. VLSI implementation of a bio-inspired olfactory spiking neural network.

    Science.gov (United States)

    Hsieh, Hung-Yi; Tang, Kea-Tiong

    2012-07-01

    This paper presents a low-power, neuromorphic spiking neural network (SNN) chip that can be integrated in an electronic nose system to classify odor. The proposed SNN takes advantage of sub-threshold oscillation and onset-latency representation to reduce power consumption and chip area, providing a more distinct output for each odor input. The synaptic weights between the mitral and cortical cells are modified according to a spike-timing-dependent plasticity learning rule. During the experiment, the odor data are sampled by a commercial electronic nose (Cyranose 320) and are normalized before training and testing to ensure that the classification result is only caused by learning. Measurement results show that the circuit only consumed an average power of approximately 3.6 μW with a 1-V power supply to discriminate odor data. The SNN has either a high or low output response for a given input odor, making it easy to determine whether the circuit has made the correct decision. The measurement results of the SNN chip and some well-known algorithms (support vector machine and the K-nearest neighbor program) are compared to demonstrate the classification performance of the proposed SNN chip. The mean testing accuracy is 87.59% for the data used in this paper.

  16. Cochlear spike synchronization and neuron coincidence detection model

    Science.gov (United States)

    Bader, Rolf

    2018-02-01

    Coincidence detection of a spike pattern fed from the cochlea into a single neuron is investigated using a physical Finite-Difference model of the cochlea and a physiologically motivated neuron model. Previous studies have shown experimental evidence of increased spike synchronization in the nucleus cochlearis and the trapezoid body [Joris et al., J. Neurophysiol. 71(3), 1022-1036 and 1037-1051 (1994)] and models show tone partial phase synchronization at the transition from mechanical waves on the basilar membrane into spike patterns [Ch. F. Babbs, J. Biophys. 2011, 435135]. Still, the traveling speed of waves on the basilar membrane causes a frequency-dependent time delay of simultaneously incoming sound wavefronts of up to 10 ms. The present model shows nearly perfect synchronization of multiple spike inputs as neuron outputs with interspike intervals (ISI) at the periodicity of the incoming sound for frequencies from about 30 to 300 Hz for two different amounts of afferent nerve fiber neuron inputs. Coincidence detection serves here as a fusion of multiple inputs into one single event, enhancing pitch periodicity detection for low frequencies, impulse detection, or increased sound or speech intelligibility due to dereverberation.
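The fusion of multiple inputs into one single event can be sketched as a toy detector that fires when enough input spikes fall within a common window; the window length, spike-count threshold, and refractory handling below are illustrative assumptions, not the paper's neuron model:

```python
def coincidence_detector(spike_trains, window=1.0, threshold=3):
    """Emit an output spike time whenever at least `threshold` input spikes
    (pooled over all afferent trains) fall within `window` ms, with a simple
    refractory rule preventing repeated outputs for the same volley."""
    events = sorted(t for train in spike_trains for t in train)
    out = []
    for t in events:
        n_in_window = sum(1 for u in events if t <= u < t + window)
        if n_in_window >= threshold and (not out or t >= out[-1] + window):
            out.append(t)
    return out

# Three afferent fibers nearly coincide at ~5 ms; later spikes are isolated
trains = [[5.0, 20.0], [5.2, 30.0], [5.4, 40.0]]
out_spikes = coincidence_detector(trains)
```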

  17. Proficiency test on incurred and spiked pesticide residues in cereals

    DEFF Research Database (Denmark)

    Poulsen, Mette Erecius; Christensen, Hanne Bjerre; Herrmann, Susan Strange

    2009-01-01

    A proficiency test on incurred and spiked pesticide residues in wheat was organised in 2008. The test material was grown in 2007 and treated in the field with 14 pesticides formulations containing the active substances, alpha-cypermethrin, bifentrin, carbendazim, chlormequat, chlorpyrifos...

  18. Spike sorting based upon machine learning algorithms (SOMA).

    Science.gov (United States)

    Horton, P M; Nicol, A U; Kendrick, K M; Feng, J F

    2007-02-15

    We have developed a spike sorting method, using a combination of various machine learning algorithms, to analyse electrophysiological data and automatically determine the number of sampled neurons from an individual electrode, and discriminate their activities. We discuss extensions to a standard unsupervised learning algorithm (Kohonen), as using a simple application of this technique would only identify a known number of clusters. Our extra techniques automatically identify the number of clusters within the dataset, and their sizes, thereby reducing the chance of misclassification. We also discuss a new pre-processing technique, which transforms the data into a higher dimensional feature space revealing separable clusters. Using principal component analysis (PCA) alone may not achieve this. Our new approach appends the features acquired using PCA with features describing the geometric shapes that constitute a spike waveform. To validate our new spike sorting approach, we have applied it to multi-electrode array datasets acquired from the rat olfactory bulb, and from the sheep infero-temporal cortex, and using simulated data. The SOMA software is available at http://www.sussex.ac.uk/Users/pmh20/spikes.
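The idea of appending shape descriptors to PCA features can be sketched as follows; the particular shape features chosen here (peak, trough, peak position) are illustrative assumptions rather than the exact geometric features used by SOMA:

```python
import numpy as np

def spike_features(waveforms, n_pc=2):
    """Project spike waveforms onto their top principal components and append
    simple shape descriptors (peak amplitude, trough amplitude, peak sample
    index), echoing the idea of augmenting PCA with geometric features."""
    X = np.asarray(waveforms, dtype=float)
    Xc = X - X.mean(axis=0)
    cov = np.cov(Xc, rowvar=False)
    vals, vecs = np.linalg.eigh(cov)               # ascending eigenvalues
    pcs = vecs[:, np.argsort(vals)[::-1][:n_pc]]   # top n_pc components
    proj = Xc @ pcs
    shape = np.column_stack([X.max(axis=1), X.min(axis=1),
                             np.argmax(X, axis=1)])
    return np.hstack([proj, shape])

# Two similar positive spikes and one inverted waveform
waves = [[0, 1, 5, 1, 0], [0, 1, 4, 1, 0], [0, -3, -6, -2, 0]]
feats = spike_features(waves)
```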

  19. Spike-timing-based computation in sound localization.

    Directory of Open Access Journals (Sweden)

    Dan F M Goodman

    2010-11-01

    Full Text Available Spike timing is precise in the auditory system and it has been argued that it conveys information about auditory stimuli, in particular about the location of a sound source. However, beyond simple time differences, the way in which neurons might extract this information is unclear and the potential computational advantages are unknown. The computational difficulty of this task for an animal is to locate the source of an unexpected sound from two monaural signals that are highly dependent on the unknown source signal. In neuron models consisting of spectro-temporal filtering and spiking nonlinearity, we found that the binaural structure induced by spatialized sounds is mapped to synchrony patterns that depend on source location rather than on source signal. Location-specific synchrony patterns would then result in the activation of location-specific assemblies of postsynaptic neurons. We designed a spiking neuron model which exploited this principle to locate a variety of sound sources in a virtual acoustic environment using measured human head-related transfer functions. The model was able to accurately estimate the location of previously unknown sounds in both azimuth and elevation (including front/back discrimination) in a known acoustic environment. We found that multiple representations of different acoustic environments could coexist as sets of overlapping neural assemblies which could be associated with spatial locations by Hebbian learning. The model demonstrates the computational relevance of relative spike timing to extract spatial information about sources independently of the source signal.
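At its simplest, extracting location from binaural timing reduces to finding the interaural delay that best aligns the two ear signals; this toy cross-correlation sketch is far simpler than the synchrony-pattern model described above, but shows the underlying timing cue:

```python
def best_delay(left, right, max_delay=5):
    """Return the delay (in samples) of the left signal that maximizes its
    correlation with the right signal; a crude interaural-time-difference
    estimator."""
    def corr(a, b):
        return sum(x * y for x, y in zip(a, b))
    scores = {d: corr(left[d:], right) for d in range(max_delay + 1)}
    return max(scores, key=scores.get)

# The same waveform arrives at the right ear two samples earlier
left = [0, 0, 0, 1, 2, 1, 0, 0, 0, 0]
right = [0, 1, 2, 1, 0, 0, 0, 0, 0, 0]
delay = best_delay(left, right)
```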

  20. Thermal spike analysis of highly charged ion tracks

    International Nuclear Information System (INIS)

    Karlušić, M.; Jakšić, M.

    2012-01-01

    The irradiation of material using swift heavy ion or highly charged ion causes excitation of the electron subsystem at nanometer scale along the ion trajectory. According to the thermal spike model, energy deposited into the electron subsystem leads to temperature increase due to electron–phonon coupling. If ion-induced excitation is sufficiently intensive, then melting of the material can occur, and permanent damage (i.e., ion track) can be formed upon rapid cooling. We present an extension of the analytical thermal spike model of Szenes for the analysis of surface ion track produced after the impact of highly charged ion. By applying the model to existing experimental data, more than 60% of the potential energy of the highly charged ion was shown to be retained in the material during the impact and transformed into the energy of the thermal spike. This value is much higher than 20–40% of the transferred energy into the thermal spike by swift heavy ion. Thresholds for formation of highly charged ion track in different materials show uniform behavior depending only on few material parameters.

  1. Bayesian Inference for Structured Spike and Slab Priors

    DEFF Research Database (Denmark)

    Andersen, Michael Riis; Winther, Ole; Hansen, Lars Kai

    2014-01-01

    Sparse signal recovery addresses the problem of solving underdetermined linear inverse problems subject to a sparsity constraint. We propose a novel prior formulation, the structured spike and slab prior, which allows to incorporate a priori knowledge of the sparsity pattern by imposing a spatial...

  2. Effect of Rolandic Spikes on ADHD Impulsive Behavior

    Directory of Open Access Journals (Sweden)

    J Gordon Millichap

    2007-01-01

    Full Text Available The association of Rolandic spikes with the neuropsychological profile of children with attention deficit hyperactivity disorder (ADHD) was studied in a total of 48 patients at JW Goethe-University, Frankfurt/Main; and Central Institute of Mental Health, Mannheim, Germany.

  3. Sleep deprivation and spike-wave discharges in epileptic rats

    NARCIS (Netherlands)

    Drinkenburg, W.H.I.M.; Coenen, A.M.L.; Vossen, J.M.H.; Luijtelaar, E.L.J.M. van

    1995-01-01

    The effects of sleep deprivation were studied on the occurrence of spike-wave discharges in the electroencephalogram of rats of the epileptic WAG/Rij strain, a model for absence epilepsy. This was done before, during and after a period of 12 hours of near total sleep deprivation. A substantial

  4. Fast computation with spikes in a recurrent neural network

    International Nuclear Information System (INIS)

    Jin, Dezhe Z.; Seung, H. Sebastian

    2002-01-01

    Neural networks with recurrent connections are sometimes regarded as too slow at computation to serve as models of the brain. Here we analytically study a counterexample, a network consisting of N integrate-and-fire neurons with self-excitation, all-to-all inhibition, instantaneous synaptic coupling, and constant external driving inputs. When the inhibition and/or excitation are large enough, the network performs a winner-take-all computation for all possible external inputs and initial states of the network. The computation is done very quickly: As soon as the winner spikes once, the computation is completed since no other neurons will spike. For some initial states, the winner is the first neuron to spike, and the computation is done at the first spike of the network. In general, there are M potential winners, corresponding to the top M external inputs. When the external inputs are close in magnitude, M tends to be larger. If M>1, the selection of the actual winner is strongly influenced by the initial states. If a special relation between the excitation and inhibition is satisfied, the network always selects the neuron with the maximum external input as the winner.
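The winner-take-all behavior described (computation completed as soon as the winner spikes once) can be sketched as an integrator race in which the first neuron to reach threshold ends the computation; the parameters and the simplified non-leaky dynamics are assumptions, not the paper's analytical model:

```python
def winner_take_all(inputs, threshold=1.0, dt=0.01, max_steps=10000):
    """Integrator race: each neuron charges at its constant input rate; the
    first to reach threshold spikes, and all-to-all inhibition silences the
    rest, so the computation ends at that first spike."""
    v = [0.0] * len(inputs)
    for _ in range(max_steps):
        for i, inp in enumerate(inputs):
            v[i] += inp * dt
            if v[i] >= threshold:
                return i  # winner found; inhibition stops all other neurons
    return None

winner = winner_take_all([0.3, 0.9, 0.5])  # largest input wins
```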

  5. Dynamics of directional coupling underlying spike-wave discharges

    NARCIS (Netherlands)

    Sysoeva, M.V.; Luttjohann, A.K.; Luijtelaar, E.L.J.M. van; Sysoev, I.V.

    2016-01-01

    Purpose: Spike and wave discharges (SWDs), generated within cortico-thalamo-cortical networks, are the electroencephalographic biomarker of absence epilepsy. The current work aims to identify mechanisms of SWD initiation, maintenance and termination by the analyses of dynamics and directionality of

  6. Breathing, spiking and chaos in a laser with injected signal

    Energy Technology Data Exchange (ETDEWEB)

    Lugiato, L A; Narducci, L M

    1983-06-01

    The behavior of a laser driven by an injected cw field detuned from the operating laser frequency is considered. The analysis covers the entire range of incident power levels from zero to the injection locking threshold. In this domain, the output intensity exhibits regular and chaotic oscillations, a period doubling cascade in reverse order, envelope breathing and spiking.

  7. Inhibitory Synaptic Plasticity - Spike timing dependence and putative network function.

    Directory of Open Access Journals (Sweden)

    Tim P Vogels

    2013-07-01

    Full Text Available While the plasticity of excitatory synaptic connections in the brain has been widely studied, the plasticity of inhibitory connections is much less understood. Here, we present recent experimental and theoretical findings concerning the rules of spike timing-dependent inhibitory plasticity and their putative network function. This is a summary of a workshop at the COSYNE conference 2012.

  8. Spiking Activity of a LIF Neuron in Distributed Delay Framework

    Directory of Open Access Journals (Sweden)

    Saket Kumar Choudhary

    2016-06-01

    Full Text Available The evolution of membrane potential and spiking activity for a single leaky integrate-and-fire (LIF) neuron in a distributed delay framework (DDF) is investigated. DDF provides a mechanism to incorporate a memory element, in terms of a delay (kernel) function, into single neuron models. This investigation includes the LIF neuron model with two different kinds of delay kernel functions, namely, a gamma-distributed delay kernel function and a hypo-exponential distributed delay kernel function. The evolution of membrane potential for the considered models is studied in terms of the stationary state probability distribution (SPD). The stationary state probability distributions of membrane potential (SPDV) for the considered neuron models are found to be asymptotically similar, namely Gaussian distributed. In order to investigate the effect of membrane potential delay, a rate code scheme for neuronal information processing is applied. The firing rate and Fano factor for the considered neuron models are calculated, and the standard LIF model is used for comparative study. It is noticed that distributed delay increases the spiking activity of a neuron. The increase in spiking activity in DDF is larger for the hypo-exponential distributed delay function than for the gamma-distributed delay function. Moreover, in the case of the hypo-exponential delay function, a LIF neuron generates spikes with a Fano factor less than 1.
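The Fano factor used above to characterize spiking activity is simply the variance-to-mean ratio of spike counts across trials; a value of 1 corresponds to a Poisson process, and values below 1 indicate more regular firing:

```python
def fano_factor(spike_counts):
    """Fano factor F = Var(N) / E[N] of spike counts across trials or windows.
    F = 1 for a Poisson process; F < 1 indicates more regular firing."""
    n = len(spike_counts)
    mean = sum(spike_counts) / n
    var = sum((c - mean) ** 2 for c in spike_counts) / n
    return var / mean

regular = [5, 5, 5, 5, 5, 5]     # perfectly regular firing -> F = 0
variable = [2, 8, 3, 9, 1, 7]    # highly variable counts -> F > 1
```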

  9. Parallel inter channel interaction mechanisms

    International Nuclear Information System (INIS)

    Jovic, V.; Afgan, N.; Jovic, L.

    1995-01-01

    Parallel channel interactions are examined. Results of phenomenon analysis and mechanisms of parallel channel interaction, obtained from experimental research on nonstationary flow regimes in three parallel vertical channels, are presented for adiabatic conditions of single-phase fluid and two-phase mixture flow. (author)

  10. Massively Parallel QCD

    International Nuclear Information System (INIS)

    Soltz, R; Vranas, P; Blumrich, M; Chen, D; Gara, A; Giampap, M; Heidelberger, P; Salapura, V; Sexton, J; Bhanot, G

    2007-01-01

    The theory of the strong nuclear force, Quantum Chromodynamics (QCD), can be numerically simulated from first principles on massively-parallel supercomputers using the method of Lattice Gauge Theory. We describe the special programming requirements of lattice QCD (LQCD) as well as the optimal supercomputer hardware architectures that it suggests. We demonstrate these methods on the BlueGene massively-parallel supercomputer and argue that LQCD and the BlueGene architecture are a natural match. This can be traced to the simple fact that LQCD is a regular lattice discretization of space into lattice sites while the BlueGene supercomputer is a discretization of space into compute nodes, and that both are constrained by requirements of locality. This simple relation is both technologically important and theoretically intriguing. The main result of this paper is the speedup of LQCD using up to 131,072 CPUs on the largest BlueGene/L supercomputer. The speedup is perfect with sustained performance of about 20% of peak. This corresponds to a maximum of 70.5 sustained TFlop/s. At these speeds LQCD and BlueGene are poised to produce the next generation of strong interaction physics theoretical results.

  11. A Parallel Butterfly Algorithm

    KAUST Repository

    Poulson, Jack; Demanet, Laurent; Maxwell, Nicholas; Ying, Lexing

    2014-01-01

    The butterfly algorithm is a fast algorithm which approximately evaluates a discrete analogue of the integral transform (Equation Presented.) at large numbers of target points when the kernel, K(x, y), is approximately low-rank when restricted to subdomains satisfying a certain simple geometric condition. In d dimensions with O(Nd) quasi-uniformly distributed source and target points, when each appropriate submatrix of K is approximately rank-r, the running time of the algorithm is at most O(r2Nd logN). A parallelization of the butterfly algorithm is introduced which, assuming a message latency of α and per-process inverse bandwidth of β, executes in at most (Equation Presented.) time using p processes. This parallel algorithm was then instantiated in the form of the open-source DistButterfly library for the special case where K(x, y) = exp(iΦ(x, y)), where Φ(x, y) is a black-box, sufficiently smooth, real-valued phase function. Experiments on Blue Gene/Q demonstrate impressive strong-scaling results for important classes of phase functions. Using quasi-uniform sources, hyperbolic Radon transforms, and an analogue of a three-dimensional generalized Radon transform were, respectively, observed to strong-scale from 1-node/16-cores up to 1024-nodes/16,384-cores with greater than 90% and 82% efficiency, respectively. © 2014 Society for Industrial and Applied Mathematics.

  13. Fast parallel event reconstruction

    CERN Multimedia

    CERN. Geneva

    2010-01-01

    On-line processing of large data volumes produced in modern HEP experiments requires using the maximum capabilities of modern and future many-core CPU and GPU architectures. One such powerful feature is a SIMD instruction set, which allows packing several data items in one register and operating on all of them at once, thus achieving more operations per clock cycle. Motivated by the idea of using the SIMD unit of modern processors, the KF-based track fit has been adapted for parallelism, including memory optimization, numerical analysis, vectorization with inline operator overloading, and optimization using SDKs. The speed of the algorithm has been increased by a factor of 120,000, to 0.1 ms/track, running in parallel on 16 SPEs of a Cell Blade computer. Running on a Nehalem CPU with 8 cores it shows a processing speed of 52 ns/track using the Intel Threading Building Blocks. The same KF algorithm running on an Nvidia GTX 280 in the CUDA framework provi...

  14. Supervised Learning in Spiking Neural Networks for Precise Temporal Encoding.

    Science.gov (United States)

    Gardner, Brian; Grüning, André

    2016-01-01

    Precise spike timing as a means to encode information in neural networks is biologically supported, and is advantageous over frequency-based codes by processing input features on a much shorter time-scale. For these reasons, much recent attention has been focused on the development of supervised learning rules for spiking neural networks that utilise a temporal coding scheme. However, despite significant progress in this area, rules that both have a theoretical basis and can be considered biologically relevant are still lacking. Here we examine the general conditions under which synaptic plasticity most effectively takes place to support the supervised learning of a precise temporal code. As part of our analysis we examine two spike-based learning methods: one relying on an instantaneous error signal to modify synaptic weights in a network (the INST rule), and the other relying on a filtered error signal for smoother synaptic weight modifications (the FILT rule). We test the accuracy of the solutions provided by each rule with respect to their temporal encoding precision, and then measure the maximum number of input patterns they can learn to memorise using the precise timings of individual spikes as an indication of their storage capacity. Our results demonstrate the high performance of the FILT rule in most cases, underpinned by the rule's error-filtering mechanism, which is predicted to provide smooth convergence towards a desired solution during learning. We also find the FILT rule to be most efficient at performing input pattern memorisations, most noticeably when patterns are identified using spikes with sub-millisecond temporal precision. In comparison with existing work, we determine the performance of the FILT rule to be consistent with that of the highly efficient E-learning Chronotron rule, but with the distinct advantage that our FILT rule is also implementable as an online method for increased biological realism.
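
    The INST/FILT distinction can be sketched loosely (with made-up parameters, not the authors' formulation): an instantaneous error signal is nonzero only at the mismatched spike times, while exponentially filtering that same error spreads the weight modification smoothly over time.

```python
import numpy as np

# Toy INST vs FILT error signals for one output neuron (illustrative only).
dt, tau = 1.0, 10.0          # time step (ms) and error-filter time constant
T = 100
target = np.zeros(T); target[40] = 1.0   # desired spike at t = 40 ms
actual = np.zeros(T); actual[55] = 1.0   # emitted spike at t = 55 ms

err_inst = target - actual   # INST: instantaneous error, nonzero at 2 steps

# FILT: leaky integration (exponential filtering) of the same error signal
err_filt = np.zeros(T)
for t in range(1, T):
    err_filt[t] = err_filt[t - 1] * np.exp(-dt / tau) + err_inst[t]

# err_filt drives many small, smooth weight updates instead of two abrupt
# ones, which is the intuition behind the FILT rule's smoother convergence.
```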

  15. Adaptive training of neural networks for control of autonomous mobile robots

    NARCIS (Netherlands)

    Steur, E.; Vromen, T.; Nijmeijer, H.; Fossen, T.I.; Nijmeijer, H.; Pettersen, K.Y.

    2017-01-01

    We present an adaptive training procedure for a spiking neural network, which is used for control of a mobile robot. Because of manufacturing tolerances, any hardware implementation of a spiking neural network has non-identical nodes, which limit the performance of the controller. The adaptive

  16. Assembly of spikes into coronavirus particles is mediated by the carboxy-terminal domain of the spike protein

    NARCIS (Netherlands)

    Godeke, G J; de Haan, Cornelis A M; Rossen, J W; Vennema, H; Rottier, P J

    The type I glycoprotein S of coronavirus, trimers of which constitute the typical viral spikes, is assembled into virions through noncovalent interactions with the M protein. Here we demonstrate that incorporation is mediated by the short carboxy-terminal segment comprising the transmembrane and

  17. Parallel Computing in SCALE

    International Nuclear Information System (INIS)

    DeHart, Mark D.; Williams, Mark L.; Bowman, Stephen M.

    2010-01-01

    The SCALE computational architecture has remained basically the same since its inception 30 years ago, although constituent modules and capabilities have changed significantly. This SCALE concept was intended to provide a framework whereby independent codes can be linked to provide a more comprehensive capability than possible with the individual programs - allowing flexibility to address a wide variety of applications. However, the current system was designed originally for mainframe computers with a single CPU and with significantly less memory than today's personal computers. It has been recognized that the present SCALE computation system could be restructured to take advantage of modern hardware and software capabilities, while retaining many of the modular features of the present system. Preliminary work is being done to define specifications and capabilities for a more advanced computational architecture. This paper describes the state of current SCALE development activities and plans for future development. With the release of SCALE 6.1 in 2010, a new phase of evolutionary development will be available to SCALE users within the TRITON and NEWT modules. The SCALE (Standardized Computer Analyses for Licensing Evaluation) code system developed by Oak Ridge National Laboratory (ORNL) provides a comprehensive and integrated package of codes and nuclear data for a wide range of applications in criticality safety, reactor physics, shielding, isotopic depletion and decay, and sensitivity/uncertainty (S/U) analysis. Over the last three years, since the release of version 5.1 in 2006, several important new codes have been introduced within SCALE, and significant advances applied to existing codes. Many of these new features became available with the release of SCALE 6.0 in early 2009. However, beginning with SCALE 6.1, a first generation of parallel computing is being introduced. In addition to near-term improvements, a plan for longer term SCALE enhancement

  18. On the Universality and Non-Universality of Spiking Neural P Systems With Rules on Synapses.

    Science.gov (United States)

    Song, Tao; Xu, Jinbang; Pan, Linqiang

    2015-12-01

    Spiking neural P systems with rules on synapses are a new variant of spiking neural P systems. In these systems the neurons contain only spikes, while the spiking/forgetting rules are moved onto the synapses. It was previously shown that such a system with 30 neurons (using extended spiking rules) or with 39 neurons (using standard spiking rules) is Turing universal. In this work, this number is improved to 6. Specifically, we construct a Turing universal spiking neural P system with rules on synapses having 6 neurons, which can generate any set of Turing-computable natural numbers. It is also shown that spiking neural P systems with rules on synapses having at most two neurons are not Turing universal: (i) such systems having one neuron can characterize the family of finite sets of natural numbers; (ii) the family of sets of numbers generated by systems having two neurons is included in the family of semi-linear sets of natural numbers.

  19. Diallel analysis to study the genetic makeup of spike and yield ...

    African Journals Online (AJOL)

    African Journal of Biotechnology ... Five wheat genotypes were crossed in complete diallel fashion for gene ... by pursuing pedigree method while heterosis can be exploited for spike length, grain weight per spike and grain yield per plant.

  20. Parallel Polarization State Generation.

    Science.gov (United States)

    She, Alan; Capasso, Federico

    2016-05-17

    The control of polarization, an essential property of light, is of wide scientific and technological interest. The general problem of generating arbitrary time-varying states of polarization (SOP) has always been mathematically formulated by a series of linear transformations, i.e. a product of matrices, imposing a serial architecture. Here we show a parallel architecture described by a sum of matrices. The theory is experimentally demonstrated by using a digital micromirror device to modulate spatially separated polarization components of a laser, which are subsequently beam-combined. This method greatly expands the parameter space for engineering devices that control polarization. Consequently, performance characteristics, such as speed, stability, and spectral range, are entirely dictated by the technologies of optical intensity modulation, including absorption, reflection, emission, and scattering. This opens up important prospects for polarization state generation (PSG) with unique performance characteristics, with applications in spectroscopic ellipsometry, spectropolarimetry, communications, imaging, and security.
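
    A minimal numerical sketch of the contrast drawn in this abstract (serial = product of matrices, parallel = weighted sum), using Jones matrices; the matrices, input state, and weights below are illustrative assumptions, with the weights standing in for the per-path intensity modulators.

```python
import numpy as np

# Parallel architecture: split the beam, weight each path's intensity,
# recombine -> output SOP = (a1*J1 + a2*J2) @ E_in, a SUM of matrices,
# instead of the serial product (Jn ... J2 J1) @ E_in.
H = np.array([[1, 0], [0, 0]], dtype=complex)        # horizontal polariser
V = np.array([[0, 0], [0, 1]], dtype=complex)        # vertical polariser
E_in = np.array([1, 1], dtype=complex) / np.sqrt(2)  # 45-degree linear input

def parallel_sop(a1, a2):
    """Output state from a weighted sum of Jones matrices."""
    return (a1 * H + a2 * V) @ E_in

E_h = parallel_sop(1.0, 0.0)    # keep only the horizontal component
E_45 = parallel_sop(1.0, 1.0)   # equal weights reproduce the input state
```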

  1. Parallel imaging microfluidic cytometer.

    Science.gov (United States)

    Ehrlich, Daniel J; McKenna, Brian K; Evans, James G; Belkina, Anna C; Denis, Gerald V; Sherr, David H; Cheung, Man Ching

    2011-01-01

    By adding an additional degree of freedom from multichannel flow, the parallel microfluidic cytometer (PMC) combines some of the best features of fluorescence-activated flow cytometry (FCM) and microscope-based high-content screening (HCS). The PMC (i) lends itself to fast processing of large numbers of samples, (ii) adds a 1D imaging capability for intracellular localization assays (HCS), (iii) has a high rare-cell sensitivity, and (iv) has an unusual capability for time-synchronized sampling. An inability to practically handle large sample numbers has restricted applications of conventional flow cytometers and microscopes in combinatorial cell assays, network biology, and drug discovery. The PMC promises to relieve a bottleneck in these previously constrained applications. The PMC may also be a powerful tool for finding rare primary cells in the clinic. The multichannel architecture of current PMC prototypes allows 384 unique samples for a cell-based screen to be read out in ∼6-10 min, about 30 times the speed of most current FCM systems. In 1D intracellular imaging, the PMC can obtain protein localization using HCS marker strategies at many times the sample throughput of charge-coupled device (CCD)-based microscopes or CCD-based single-channel flow cytometers. The PMC also permits the signal integration time to be varied over a larger range than is practical in conventional flow cytometers. The signal-to-noise advantages are useful, for example, in counting rare positive cells in the most difficult early stages of genome-wide screening. We review the status of parallel microfluidic cytometry and discuss some of the directions the new technology may take. Copyright © 2011 Elsevier Inc. All rights reserved.

  2. About Parallel Programming: Paradigms, Parallel Execution and Collaborative Systems

    Directory of Open Access Journals (Sweden)

    Loredana MOCEAN

    2009-01-01

    In recent years, efforts have been made to delineate a stable and unified framework in which the problems of parallel logical processing can find solutions, at least at the level of imperative languages. The results obtained so far are not commensurate with the effort invested. This paper aims to make a small contribution to these efforts. We propose an overview of parallel programming, parallel execution and collaborative systems.

  3. Copula Regression Analysis of Simultaneously Recorded Frontal Eye Field and Inferotemporal Spiking Activity during Object-Based Working Memory

    Science.gov (United States)

    Hu, Meng; Clark, Kelsey L.; Gong, Xiajing; Noudoost, Behrad; Li, Mingyao; Moore, Tirin

    2015-01-01

    Inferotemporal (IT) neurons are known to exhibit persistent, stimulus-selective activity during the delay period of object-based working memory tasks. Frontal eye field (FEF) neurons show robust, spatially selective delay period activity during memory-guided saccade tasks. We present a copula regression paradigm to examine neural interaction of these two types of signals between areas IT and FEF of the monkey during a working memory task. This paradigm is based on copula models that can account for both marginal distribution over spiking activity of individual neurons within each area and joint distribution over ensemble activity of neurons between areas. Considering the popular GLMs as marginal models, we developed a general and flexible likelihood framework that uses the copula to integrate separate GLMs into a joint regression analysis. Such joint analysis essentially leads to a multivariate analog of the marginal GLM theory and hence efficient model estimation. In addition, we show that Granger causality between spike trains can be readily assessed via the likelihood ratio statistic. The performance of this method is validated by extensive simulations, and compared favorably to the widely used GLMs. When applied to spiking activity of simultaneously recorded FEF and IT neurons during working memory task, we observed significant Granger causality influence from FEF to IT, but not in the opposite direction, suggesting the role of the FEF in the selection and retention of visual information during working memory. The copula model has the potential to provide unique neurophysiological insights about network properties of the brain. PMID:26063909

  4. Power-law inter-spike interval distributions infer a conditional maximization of entropy in cortical neurons.

    Directory of Open Access Journals (Sweden)

    Yasuhiro Tsubo

    The brain is considered to use a relatively small amount of energy for its efficient information processing. Under a severe restriction on energy consumption, the maximization of mutual information (MMI), which is adequate for designing artificial processing machines, may not suit the brain. The MMI attempts to send information as accurately as possible, and this usually requires a sufficient energy supply for establishing clearly discretized communication bands. Here, we derive an alternative hypothesis for the neural code from neuronal activities recorded juxtacellularly in the sensorimotor cortex of behaving rats. Our hypothesis states that in vivo cortical neurons maximize the entropy of neuronal firing under two constraints, one limiting the energy consumption (as assumed previously) and one restricting the uncertainty in output spike sequences at a given firing rate. Thus, the conditional maximization of firing-rate entropy (CMFE) solves a tradeoff between the energy cost and noise in the neuronal response. In short, the CMFE sends a rich variety of information through broader communication bands (i.e., widely distributed firing rates) at the cost of accuracy. We demonstrate that the CMFE is reflected in the long-tailed, typically power-law, distributions of inter-spike intervals obtained for the majority of recorded neurons. In other words, the power-law tails are more consistent with the CMFE than with the MMI. Thus, we propose a mathematical principle by which cortical neurons may represent information about synaptic input in their output spike trains.
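
    The long-tailed-versus-exponential contrast in ISI distributions can be illustrated numerically (purely illustrative parameters, not fitted to the recordings): a classical-Pareto power law with the same mean as an exponential distribution places far more probability mass in the far tail.

```python
import numpy as np

# Compare exponential ISIs (the max-entropy distribution at a fixed mean)
# with classical-Pareto power-law ISIs of the same mean.
rng = np.random.default_rng(1)
n = 100_000
mean_isi = 1.0                              # arbitrary units

isi_exp = rng.exponential(mean_isi, n)

alpha = 3.0                                 # assumed power-law exponent
xm = mean_isi * (alpha - 1) / alpha         # scale chosen so the means match
isi_pl = (rng.pareto(alpha, n) + 1) * xm    # classical Pareto samples

tail_exp = (isi_exp > 10 * mean_isi).mean() # exponential tail: ~e^{-10}
tail_pl = (isi_pl > 10 * mean_isi).mean()   # power-law tail: much heavier
```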

  5. α-Oscillations in the monkey sensorimotor network influence discrimination performance by rhythmical inhibition of neuronal spiking.

    Science.gov (United States)

    Haegens, Saskia; Nácher, Verónica; Luna, Rogelio; Romo, Ranulfo; Jensen, Ole

    2011-11-29

    Extensive work in humans using magneto- and electroencephalography strongly suggests that decreased oscillatory α-activity (8-14 Hz) facilitates processing in a given region, whereas increased α-activity serves to actively suppress irrelevant or interfering processing. However, little work has been done to understand how α-activity is linked to neuronal firing. Here, we simultaneously recorded local field potentials and spikes from somatosensory, premotor, and motor regions while a trained monkey performed a vibrotactile discrimination task. In the local field potentials we observed strong activity in the α-band, which decreased in the sensorimotor regions during the discrimination task. This α-power decrease predicted better discrimination performance. Furthermore, the α-oscillations demonstrated a rhythmic relation with the spiking, such that firing was highest at the trough of the α-cycle. Firing rates increased with a decrease in α-power. These findings suggest that α-oscillations exercise a strong inhibitory influence on both spike timing and firing rate. Thus, the pulsed inhibition by α-oscillations plays an important functional role in the extended sensorimotor system.
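
    The phase relation described above (firing highest at the trough of the α-cycle) can be sketched with a simulated spike train whose firing probability is modulated by a 10 Hz cycle; all numbers below are illustrative, not the monkey data.

```python
import numpy as np

rng = np.random.default_rng(1)
fs, f_alpha, dur = 1000, 10.0, 10.0              # Hz, Hz, seconds
t = np.arange(0, dur, 1 / fs)
phase = (2 * np.pi * f_alpha * t) % (2 * np.pi)  # 0 = peak, pi = trough

# firing rate modulated by the alpha cycle: 5 Hz at the peak, 25 Hz at the trough
rate = 5 + 20 * (1 - np.cos(phase)) / 2
spikes = rng.random(t.size) < rate / fs          # Bernoulli spike train

# spike probability per sample near the trough vs. near the peak
near_trough = spikes[np.abs(phase - np.pi) < 0.5].mean()
near_peak = spikes[(phase < 0.5) | (phase > 2 * np.pi - 0.5)].mean()
```

    Binning real spikes by simultaneously recorded LFP phase in this way is the standard first step for quantifying such phase-locking.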

  6. Fast and Efficient Asynchronous Neural Computation with Adapting Spiking Neural Networks

    NARCIS (Netherlands)

    D. Zambrano (Davide); S.M. Bohte (Sander)

    2016-01-01

    textabstractBiological neurons communicate with a sparing exchange of pulses - spikes. It is an open question how real spiking neurons produce the kind of powerful neural computation that is possible with deep artificial neural networks, using only so very few spikes to communicate. Building on

  7. Spikes matter for phase-locked bursting in inhibitory neurons

    Science.gov (United States)

    Jalil, Sajiya; Belykh, Igor; Shilnikov, Andrey

    2012-03-01

    We show that inhibitory networks composed of two endogenously bursting neurons can robustly display several coexistent phase-locked states in addition to stable antiphase and in-phase bursting. This work complements and enhances our recent result [Jalil, Belykh, and Shilnikov, Phys. Rev. E 81, 045201(R) (2010)] that fast reciprocal inhibition can synchronize bursting neurons due to spike interactions. We reveal the role of spikes in generating multiple phase-locked states and demonstrate that this multistability is generic by analyzing diverse models of bursting networks with various fast inhibitory synapses; the individual cell models include the reduced leech heart interneuron, the Sherman model for pancreatic beta cells, and the Purkinje neuron model.

  8. Reflex reading epilepsy: effect of linguistic characteristics on spike frequency.

    Science.gov (United States)

    Safi, Dima; Lassonde, Maryse; Nguyen, Dang Khoa; Denault, Carole; Macoir, Joël; Rouleau, Isabelle; Béland, Renée

    2011-04-01

    Reading epilepsy is a rare reflex epilepsy in which seizures are provoked by reading. Several cases have been described in the literature, but the pathophysiological processes vary widely and remain unclear. We describe a 42-year-old male patient with reading epilepsy evaluated using clinical assessments and continuous video/EEG recordings. We administered verbal, nonverbal, and reading tasks to determine factors precipitating seizures. Linguistic characteristics of the words were manipulated. Results indicated that reading-induced seizures were significantly more numerous than those observed during verbal and nonverbal tasks. In reading tasks, spike frequency significantly increased with involvement of the phonological reading route. Spikes were recorded predominantly in left parasagittal regions. Future cerebral imaging studies will enable us to visualize the spatial localization and temporal course of reading-induced seizures and brain activity involved in reading. A better understanding of reading epilepsy is crucial for reading rehabilitation in these patients. Copyright © 2011 Elsevier Inc. All rights reserved.

  9. Negotiating Multicollinearity with Spike-and-Slab Priors.

    Science.gov (United States)

    Ročková, Veronika; George, Edward I

    2014-08-01

    In multiple regression under the normal linear model, the presence of multicollinearity is well known to lead to unreliable and unstable maximum likelihood estimates. This can be particularly troublesome for the problem of variable selection where it becomes more difficult to distinguish between subset models. Here we show how adding a spike-and-slab prior mitigates this difficulty by filtering the likelihood surface into a posterior distribution that allocates the relevant likelihood information to each of the subset model modes. For identification of promising high posterior models in this setting, we consider three EM algorithms, the fast closed form EMVS version of Rockova and George (2014) and two new versions designed for variants of the spike-and-slab formulation. For a multimodal posterior under multicollinearity, we compare the regions of convergence of these three algorithms. Deterministic annealing versions of the EMVS algorithm are seen to substantially mitigate this multimodality. A single simple running example is used for illustration throughout.
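
    As a sketch of the spike-and-slab idea itself (generic normal mixture components, not the paper's EMVS formulation), the posterior probability that a coefficient belongs to the slab follows directly from the two mixture densities; the variances and prior weight below are assumed values.

```python
import numpy as np

def normal_pdf(x, var):
    """Density of N(0, var) at x."""
    return np.exp(-0.5 * x * x / var) / np.sqrt(2 * np.pi * var)

def slab_prob(beta, v0=0.001, v1=10.0, theta=0.5):
    """Posterior probability that coefficient beta was drawn from the
    diffuse 'slab' N(0, v1) rather than the narrow 'spike' N(0, v0),
    given prior inclusion probability theta (an E-step-style quantity)."""
    slab = theta * normal_pdf(beta, v1)
    spike = (1 - theta) * normal_pdf(beta, v0)
    return slab / (slab + spike)

# large coefficients are attributed to the slab, tiny ones to the spike
p_large = slab_prob(2.0)
p_small = slab_prob(0.0)
```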

  10. Method for spiking soil samples with organic compounds

    DEFF Research Database (Denmark)

    Brinch, Ulla C; Ekelund, Flemming; Jacobsen, Carsten S

    2002-01-01

    We examined the harmful side effects on indigenous soil microorganisms of two organic solvents, acetone and dichloromethane, that are normally used for spiking of soil with polycyclic aromatic hydrocarbons for experimental purposes. The solvents were applied in two contamination protocols to either...... higher than in control soil, probably due mainly to release of predation from indigenous protozoa. In order to minimize solvent effects on indigenous soil microorganisms when spiking native soil samples with compounds having a low water solubility, we propose a common protocol in which the contaminant...... tagged with luxAB::Tn5. For both solvents, application to the whole sample resulted in severe side effects on both indigenous protozoa and bacteria. Application of dichloromethane to the whole soil volume immediately reduced the number of protozoa to below the detection limit. In one of the soils...

  11. Voltage spike detection in high field superconducting accelerator magnets

    Energy Technology Data Exchange (ETDEWEB)

    Orris, D.F.; Carcagno, R.; Feher, S.; Makulski, A.; Pischalnikov, Y.M.; /Fermilab

    2004-12-01

    A measurement system for the detection of small magnetic flux changes in superconducting magnets, which are due to either mechanical motion of the conductor or flux jump, has been developed at Fermilab. These flux changes are detected as small-amplitude, short-duration voltage spikes, which are ∼15 mV in magnitude and last for ∼30 μs. The detection system combines an analog circuit for the signal conditioning of two coil segments and a fast data acquisition system for digitizing the results, performing threshold detection, and storing the resultant data. The design of the spike detection system along with the modeling results and noise analysis will be presented. Data from tests of high field Nb3Sn magnets at currents up to ∼20 kA will also be shown.

  12. Voltage spike detection in high field superconducting accelerator magnets

    International Nuclear Information System (INIS)

    Orris, D.F.; Carcagno, R.; Feher, S.; Makulski, A.; Pischalnikov, Y.M.

    2004-01-01

    A measurement system for the detection of small magnetic flux changes in superconducting magnets, which are due to either mechanical motion of the conductor or flux jump, has been developed at Fermilab. These flux changes are detected as small-amplitude, short-duration voltage spikes, which are ∼15 mV in magnitude and last for ∼30 μs. The detection system combines an analog circuit for the signal conditioning of two coil segments and a fast data acquisition system for digitizing the results, performing threshold detection, and storing the resultant data. The design of the spike detection system along with the modeling results and noise analysis will be presented. Data from tests of high field Nb3Sn magnets at currents up to ∼20 kA will also be shown.
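
    The detection logic described in both records (threshold crossing on a conditioned signal, then grouping crossings into events) can be mimicked on synthetic data; the amplitude, duration, and threshold below simply echo the quoted ∼15 mV / ∼30 μs figures and are otherwise invented.

```python
import numpy as np

rng = np.random.default_rng(2)
fs = 1_000_000                            # 1 MHz digitizer (illustrative)
t = np.arange(0, 0.01, 1 / fs)            # 10 ms window
signal = rng.normal(0.0, 0.001, t.size)   # ~1 mV noise floor

# inject a ~15 mV, ~30 us voltage spike at t = 5 ms
mask = (t >= 0.005) & (t < 0.005 + 30e-6)
signal[mask] += 0.015

threshold = 0.005                         # 5 mV threshold (5 sigma here)
crossings = np.flatnonzero(signal > threshold)
# group contiguous threshold crossings into discrete spike events
events = np.split(crossings, np.flatnonzero(np.diff(crossings) > 1) + 1)
events = [e for e in events if e.size]
longest = max(events, key=len)
duration_us = longest.size / fs * 1e6     # recovered spike duration, ~30 us
```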

  13. Past, present and future of spike sorting techniques.

    Science.gov (United States)

    Rey, Hernan Gonzalo; Pedreira, Carlos; Quian Quiroga, Rodrigo

    2015-10-01

    Spike sorting is a crucial step to extract information from extracellular recordings. With new recording opportunities provided by the development of new electrodes that allow monitoring hundreds of neurons simultaneously, the scenario for the new generation of algorithms is both exciting and challenging. However, this will require a new approach to the problem and the development of a common reference framework to quickly assess the performance of new algorithms. In this work, we review the basic concepts of spike sorting, including the requirements for different applications, together with the problems faced by presently available algorithms. We conclude by proposing a roadmap stressing the crucial points to be addressed to support the neuroscientific research of the near future. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.

  14. Analysis of voltage spikes in superconducting Nb3Sn magnets

    International Nuclear Information System (INIS)

    Rahimzadeh-Kalaleh, S.; Ambrosio, G.; Chlachidze, G.; Donnelly, C.

    2008-01-01

    Fermi National Accelerator Laboratory has been developing a new generation of superconducting accelerator magnets based on niobium tin (Nb3Sn). The performance of these magnets is influenced by thermo-magnetic instabilities, known as flux jumps, which can lead to premature trips of the quench detection system due to large voltage transients, or to quenches at low current. In an effort to better characterize and understand these instabilities, a system for capturing fast voltage transients was developed and used in recent tests of R and D model magnets. A new automated voltage spike analysis program was developed for the analysis of large amounts of voltage-spike data. We report results from the analysis of large-statistics data samples for short model magnets that were constructed using MJR and RRP strands having different sub-element size and structure. We then assess the implications for quench protection of Nb3Sn magnets.

  15. EPILEPTIC ENCEPHALOPATHY WITH CONTINUOUS SPIKES-WAVES ACTIVITY DURING SLEEP

    Directory of Open Access Journals (Sweden)

    E. D. Belousova

    2012-01-01

    The author presents a review and discussion of the current scientific literature devoted to epileptic encephalopathy with continuous spikes-waves activity during sleep, a special form of partly reversible, age-dependent epileptic encephalopathy characterized by a triad of symptoms: continuous prolonged epileptiform (spike-wave) activity on the EEG during sleep, epileptic seizures and cognitive disorders. The author describes aspects of the classification, pathogenesis and etiology, prevalence, clinical picture and diagnostics of this disorder, including the characteristic anomalies on the EEG. Special attention is given to approaches to the treatment of epileptic encephalopathy with continuous spikes-waves activity during sleep. The efficacy of valproates, corticosteroid hormones and antiepileptic drugs of other groups is considered. The author also presents personal experience of treating this disorder with corticosteroids, a scheme of therapy, and an assessment of its efficacy.

  16. The Ripple Pond: Enabling Spiking Networks to See

    Directory of Open Access Journals (Sweden)

    Saeed eAfshar

    2013-11-01

    We present the biologically inspired Ripple Pond Network (RPN), a simply connected spiking neural network which performs a transformation converting two dimensional images to one dimensional temporal patterns suitable for recognition by temporal coding learning and memory networks. The RPN has been developed as a hardware solution linking previously implemented neuromorphic vision and memory structures such as frameless vision sensors and neuromorphic temporal coding spiking neural networks. Working together, such systems are potentially capable of delivering end-to-end high-speed, low-power and low-resolution recognition for mobile and autonomous applications where slow, highly sophisticated and power-hungry signal processing solutions are ineffective. Key aspects of the proposed approach include utilising the spatial properties of physically embedded neural networks and propagating waves of activity therein for information processing, using dimensional collapse of imagery information into amenable temporal patterns, and the use of asynchronous frames for information binding.

  17. The ripple pond: enabling spiking networks to see.

    Science.gov (United States)

    Afshar, Saeed; Cohen, Gregory K; Wang, Runchun M; Van Schaik, André; Tapson, Jonathan; Lehmann, Torsten; Hamilton, Tara J

    2013-01-01

    We present the biologically inspired Ripple Pond Network (RPN), a simply connected spiking neural network which performs a transformation converting two dimensional images to one dimensional temporal patterns (TP) suitable for recognition by temporal coding learning and memory networks. The RPN has been developed as a hardware solution linking previously implemented neuromorphic vision and memory structures such as frameless vision sensors and neuromorphic temporal coding spiking neural networks. Working together such systems are potentially capable of delivering end-to-end high-speed, low-power and low-resolution recognition for mobile and autonomous applications where slow, highly sophisticated and power hungry signal processing solutions are ineffective. Key aspects in the proposed approach include utilizing the spatial properties of physically embedded neural networks and propagating waves of activity therein for information processing, using dimensional collapse of imagery information into amenable TP and the use of asynchronous frames for information binding.

  18. Binary Associative Memories as a Benchmark for Spiking Neuromorphic Hardware

    Directory of Open Access Journals (Sweden)

    Andreas Stöckel

    2017-08-01

    Large-scale neuromorphic hardware platforms, specialized computer systems for the energy-efficient simulation of spiking neural networks, are being developed around the world, for example as part of the European Human Brain Project (HBP). Due to conceptual differences, a universal performance analysis of these systems in terms of runtime, accuracy and energy efficiency is non-trivial, yet indispensable for further hardware and software development. In this paper we describe a scalable benchmark based on a spiking neural network implementation of the binary neural associative memory. We treat neuromorphic hardware and software simulators as black boxes and execute exactly the same network description across all devices. Experiments on the HBP platforms under varying configurations of the associative memory show that the presented method makes it possible to test the quality of the neuron model implementation, and to explain significant deviations from the expected reference output.

  19. A Theory of Material Spike Formation in Flow Separation

    Science.gov (United States)

    Serra, Mattia; Haller, George

    2017-11-01

    We develop a frame-invariant theory of material spike formation during flow separation over a no-slip boundary in two-dimensional flows with arbitrary time dependence. This theory identifies both fixed and moving separation, is effective also over short-time intervals, and admits a rigorous instantaneous limit. Our theory is based on topological properties of material lines, combining objectively stretching- and rotation-based kinematic quantities. The separation profile identified here serves as the theoretical backbone for the material spike from its birth to its fully developed shape, and remains hidden to existing approaches. Finally, our theory can be used to rigorously explain the perception of off-wall separation in unsteady flows, and more importantly, provide the conditions under which such a perception is justified. We illustrate our results in several examples including steady, time-periodic and unsteady analytic velocity fields with flat and curved boundaries, and an experimental dataset.

  20. Planning Annualised hours when spike in demand exists

    Directory of Open Access Journals (Sweden)

    MR Sureshkumar

    2012-04-01

    Manpower planning using annualised hours is an effective tool where seasonal demand for staff exists in industry. Under annualised hours (AH), workers are contracted to work for a certain number of hours per year. Workers are associated with relative efficiencies for different types of tasks. This paper proposes a Mixed Integer Linear Programming (MILP) model to solve an annualised working hours planning problem when spikes in demand exist. The holiday weeks for the workers are considered to be partially individualised. If a worker is assigned more than one type of working week in a single week, this is compensated with one or more holiday weeks. The performance of the model is demonstrated with an example. This type of modelling helps to meet spikes in demand with less capacity shortage than restricting each worker to a single working-week type throughout.
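
    The MILP itself is not given in this summary, but the flavor of the problem (annual hour caps, a menu of working-week types, a demand spike, minimizing capacity shortage) can be sketched by brute-force enumeration on a deliberately tiny instance; every number below is invented for illustration.

```python
import itertools

# Toy annualised-hours instance (NOT the paper's MILP): 2 workers over a
# 4-week horizon, each capped at 80 hours; a week is one of three types
# (0 hours = holiday week); week 3 carries a demand spike.
week_types = [0, 20, 40]        # hours contributed by each week type
demand = [40, 40, 120, 40]      # hours required each week (spike in week 3)
annual_cap = 80                 # per-worker hour budget over the horizon

best = None
for plan_a in itertools.product(week_types, repeat=4):
    if sum(plan_a) > annual_cap:
        continue
    for plan_b in itertools.product(week_types, repeat=4):
        if sum(plan_b) > annual_cap:
            continue
        # total unmet demand across the horizon
        shortage = sum(max(0, d - a - b)
                       for d, a, b in zip(demand, plan_a, plan_b))
        if best is None or shortage < best[0]:
            best = (shortage, plan_a, plan_b)
```

    Total demand (240 h) exceeds total capacity (160 h), so the optimal plan spreads both workers' hours to cover the spike as far as possible, leaving a shortage of exactly 80 h; a real MILP solver replaces this enumeration for realistic problem sizes.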