WorldWideScience

Sample records for neural chemorepellent sema3a

  1. Sema3A chemorepellant regulates the timing and patterning of dental nerves during development of incisor tooth germ.

    Science.gov (United States)

    Shrestha, Anjana; Moe, Kyaw; Luukko, Keijo; Taniguchi, Masahiko; Kettunen, Paivi

    2014-07-01

    Semaphorin 3A (Sema3A) axon repellant serves multiple developmental functions. Sema3A mRNAs are expressed in epithelial and mesenchymal components of the developing incisor in a dynamic manner. Here, we investigate the functions of Sema3A during development of incisors using Sema3A-deficient mice. We analyze histomorphogenesis and innervation of mandibular incisors using immunohistochemistry as well as computed tomography and thick tissue confocal imaging. Whereas no apparent disturbances in histomorphogenesis or hard tissue formation of Sema3A (-/-) incisors were observed, nerve fibers were prematurely seen in the presumptive dental mesenchyme of the bud stage Sema3A (-/-) tooth germ. Later, nerves were ectopically present in the Sema3A (-/-) dental papilla mesenchyme during the cap and bell stages, whereas in the Sema3A (+/+) mice the first nerve fibers were seen in the pulp after the onset of dental hard tissue formation. However, no apparent topographic differences in innervation pattern or nerve fasciculation were seen inside the pulp between postnatal and adult Sema3A (+/+) or Sema3A (-/-) incisors. In contrast, an abnormally large number of nerves and arborizations were observed in the Sema3A (-/-) developing dental follicle target field and periodontium and, unlike in the wild-type mice, nerve fibers were abundant in the labial periodontium. Of note, the observed defects appeared to be mostly corrected in the adult incisors. The expressions of Ngf and Gdnf neurotrophins and their receptors were not altered in the Sema3A (-/-) postnatal incisor or trigeminal ganglion, respectively. Thus, Sema3A is an essential, locally produced chemorepellant, which by creating mesenchymal exclusion areas, regulates the timing and patterning of the dental nerves during the development of incisor tooth germ.

  2. Inflammatory milieu cultivated Sema3A signaling promotes chondrocyte apoptosis in knee osteoarthritis.

    Science.gov (United States)

    Sun, Jie; Wei, Xuelei; Wang, Zengliang; Liu, Yunjiao; Lu, Jie; Lu, Yandong; Cui, Meng; Zhang, Xi; Li, Fangguo

    2018-03-01

    Osteoarthritis (OA) is the leading degenerative joint disease and is characterized by articular cartilage destruction, in which chondrocyte apoptosis plays a critical role. Semaphorin-3A (Sema3A) has been implicated in OA chondrocyte physiology. In this study we aimed to uncover how Sema3A signaling is regulated in chondrocytes and to investigate its role in OA chondrocyte survival. Here, we report that Sema3A and its receptor neuropilin-1 (Nrp1) are synchronously upregulated in cartilage chondrocytes of knee OA patients. Their expression in chondrocytes could be induced by stimulation with the proinflammatory cytokines IL-1β and TNF-α and subsequent transcriptional activation orchestrated by C/EBPβ. The resulting excessive Sema3A signaling promotes chondrocyte apoptosis by impairing PI3K/Akt prosurvival signaling. These findings indicate a regulatory mechanism and a proapoptotic function of aberrant Sema3A signaling in OA chondrocytes, and suggest that targeting Sema3A signaling might interfere with OA pathogenesis. © 2017 Wiley Periodicals, Inc.

  3. Diabetes Perturbs Bone Microarchitecture and Bone Strength through Regulation of Sema3A/IGF-1/β-Catenin in Rats.

    Science.gov (United States)

    Ma, Rufeng; Wang, Lili; Zhao, Baosheng; Liu, Chenyue; Liu, Haixia; Zhu, Ruyuan; Chen, Beibei; Li, Lin; Zhao, Dandan; Mo, Fangfang; Li, Yu; Niu, Jianzhao; Jiang, Guangjian; Fu, Min; Bromme, Dieter; Gao, Sihua; Zhang, Dongwei

    2017-01-01

    Increasing evidence supports the involvement of semaphorin 3A (Sema3A), insulin-like growth factor (IGF)-1 and β-catenin in the development of osteoporosis and diabetes. This study aimed to evaluate whether Sema3A/IGF-1/β-catenin signaling is directly involved in the alterations of bone microarchitecture and bone strength in diabetic rats. Diabetes was induced in rats by streptozotocin and high-fat diet exposure. Bone microarchitecture and strength in the femurs were evaluated by micro-CT scanning, three-point bending tests, and staining with HE, alizarin red S and safranin O/fast green, respectively. The alterations of lumbar spine microarchitecture were also determined by micro-CT scanning. Western blot and immunohistochemical analyses were used to examine the expression of Sema3A, β-catenin, IGF-1, peroxisome proliferator-activated receptor γ (PPARγ) and cathepsin K in rat tibias. Diabetic rats exhibited decreased trabecular numbers and bone formation, but increased trabecular separation, in the femurs and lumbar spines. Moreover, increased bone fragility and decreased bone stiffness were evident in the femurs of diabetic rats. Diabetic rats also exhibited a pronounced bone phenotype, manifested by decreased expression of Sema3A, IGF-1 and β-catenin, as well as increased expression of cathepsin K and PPARγ. This study suggests that diabetes could promote bone loss through the Sema3A/IGF-1/β-catenin pathway. Sema3A deficiency in bone may contribute to upregulation of PPARγ and cathepsin K expression, which further disrupts bone remodeling in diabetic rats. © 2017 The Author(s) Published by S. Karger AG, Basel.

  4. [Expression of Sema3A and NP-1 in spinal cord and spared dorsal root ganglion after partial dorsal root rhizotomy of cat].

    Science.gov (United States)

    Ding, Yan; Zhang, Wei; Huang, Qian; Peng, Jin; Zhao, Xiu-jun; Xu, Ai-li; Zhou, Xue

    2008-01-01

    To investigate the expression of semaphorin 3A (Sema3A) and neuropilin 1 (NP-1) in the spinal cord and dorsal root ganglion after partial dorsal root rhizotomy. Fifteen adult cats were divided into 3 groups: a normal control group and 7-day and 14-day postoperative groups undergoing unilateral partial dorsal root rhizotomy. The L3, L5 and L6 segments of the spinal cord and the L6 dorsal root ganglia (DRG) on the operated side were cut into frozen sections. Using the immunohistochemical ABC method, spinal cord sections were stained with a specific Sema3A antibody and L6 DRG sections with an NP-1 antibody. The mean optical density (OD) of Sema3A immunoreactivity in the dorsal horn was measured, and the number of NP-1-positive medium- and small-sized neurons in the spared DRG was counted. After partial dorsal root rhizotomy, Sema3A expression in the L3 segment was decreased in the 7-day group (0.25 +/- 0.14) compared with the normal group (0.37 +/- 0.87), and the number of NP-1-positive medium- and small-sized neurons in the spared DRG (30.85 +/- 10.26) was decreased in the 7-day group. These changes in Sema3A in the dorsal horn and NP-1 in the L6 DRG after partial root rhizotomy may be involved in collateral sprouting of the spared root in the superficial laminae.

  5. A secreted protein is an endogenous chemorepellant in Dictyostelium discoideum

    OpenAIRE

    Jonathan E Phillips; Gomer, Richard H.

    2012-01-01

    Chemorepellants may play multiple roles in physiological and pathological processes. However, few endogenous chemorepellants have been identified, and how they function is unclear. We found that the autocrine signal AprA, which is produced by growing Dictyostelium discoideum cells and inhibits their proliferation, also functions as a chemorepellant. Wild-type cells at the edge of a colony show directed movement outward from the colony, whereas cells lacking AprA do not. Cells show directed mo...

  6. Expression of a Mutant SEMA3A Protein with Diminished Signalling Capacity Does Not Alter ALS-Related Motor Decline, or Confer Changes in NMJ Plasticity after BotoxA-Induced Paralysis of Male Gastrocnemic Muscle.

    Directory of Open Access Journals (Sweden)

    Elizabeth B Moloney

    Full Text Available Terminal Schwann cells (TSCs) are specialized cells that envelop the motor nerve terminal and play a role in the maintenance and regeneration of neuromuscular junctions (NMJs). The chemorepulsive protein semaphorin 3A (SEMA3A) is selectively up-regulated in TSCs on fast-fatigable muscle fibers following experimental denervation of the muscle (BotoxA-induced paralysis or crush injury to the sciatic nerve) or in the motor neuron disease amyotrophic lateral sclerosis (ALS). Re-expression of SEMA3A in this subset of TSCs is thought to play a role in the selective plasticity of nerve terminals as observed in ALS and following BotoxA-induced paralysis. Using a mouse model expressing a mutant SEMA3A with diminished signaling capacity, we studied the influence of SEMA3A signaling at the NMJ with two denervation paradigms: a motor neuron disease model (the G93A-hSOD1 ALS mouse line) and an injury model (BotoxA-induced paralysis). ALS mice that expressed either 1 or 2 mutant SEMA3A alleles demonstrated no difference in ALS-induced decline in motor behavior. We also investigated the effects of BotoxA-induced paralysis on the sprouting capacity of NMJs in the K108N-SEMA3A mutant mouse, and observed no change in the differential neuronal plasticity found at NMJs on fast-fatigable or slow muscle fibers due to the presence of the SEMA3A mutant protein. Our data may be explained by the residual repulsive activity of the mutant SEMA3A, or it may imply that SEMA3A alone is not a key component of the molecular signature affecting NMJ plasticity in ALS or BotoxA-induced paralysis. Interestingly, we did observe a sex difference in motor neuron sprouting behavior after BotoxA-induced paralysis in WT mice, which we speculate may be an important factor in the sex-dimorphic differences seen in ALS.

  7. A secreted protein is an endogenous chemorepellant in Dictyostelium discoideum.

    Science.gov (United States)

    Phillips, Jonathan E; Gomer, Richard H

    2012-07-03

    Chemorepellants may play multiple roles in physiological and pathological processes. However, few endogenous chemorepellants have been identified, and how they function is unclear. We found that the autocrine signal AprA, which is produced by growing Dictyostelium discoideum cells and inhibits their proliferation, also functions as a chemorepellant. Wild-type cells at the edge of a colony show directed movement outward from the colony, whereas cells lacking AprA do not. Cells show directed movement away from a source of recombinant AprA and dialyzed conditioned media from wild-type cells, but not dialyzed conditioned media from aprA(-) cells. The secreted protein CfaD, the G protein Gα8, and the kinase QkgA are necessary for the chemorepellant activity of AprA as well as its proliferation-inhibiting activity, whereas the putative transcription factor BzpN is dispensable for the chemorepellant activity of AprA but necessary for inhibition of proliferation. Phospholipase C and PI3 kinases 1 and 2, which are necessary for the activity of at least one other chemorepellant in Dictyostelium, are not necessary for recombinant AprA chemorepellant activity. Starved cells are not repelled by recombinant AprA, suggesting that aggregation-phase cells are not sensitive to the chemorepellant effect. Cell tracking indicates that AprA affects the directional bias of cell movement, but not cell velocity or the persistence of cell movement. Together, our data indicate that the endogenous signal AprA acts as an autocrine chemorepellant for Dictyostelium cells.

  8. Expression of a Mutant SEMA3A Protein with Diminished Signalling Capacity Does Not Alter ALS-Related Motor Decline, or Confer Changes in NMJ Plasticity after BotoxA-Induced Paralysis of Male Gastrocnemic Muscle

    NARCIS (Netherlands)

    Moloney, E.; Hobo, B.; De Winter, Fred; Verhaagen, J.

    2017-01-01

    Terminal Schwann cells (TSCs) are specialized cells that envelop the motor nerve terminal, and play a role in the maintenance and regeneration of neuromuscular junctions (NMJs). The chemorepulsive protein semaphorin 3A (SEMA3A) is selectively up-regulated in TSCs on fast-fatigable muscle fibers

  9. Netrin-1 Peptide Is a Chemorepellent in Tetrahymena thermophila

    Directory of Open Access Journals (Sweden)

    Heather Kuruvilla

    2016-01-01

    Full Text Available Netrin-1 is a highly conserved, pleiotropic signaling molecule that can serve as a neuronal chemorepellent during vertebrate development. In vertebrates, chemorepellent signaling is mediated through the tyrosine kinase, src-1, and the tyrosine phosphatase, shp-2. Tetrahymena thermophila has been used as a model system for chemorepellent signaling because its avoidance response is easily characterized under a light microscope. Our experiments showed that netrin-1 peptide is a chemorepellent in T. thermophila at micromolar concentrations. T. thermophila adapts to netrin-1 over a time course of about 10 minutes. Netrin-adapted cells still avoid GTP, PACAP-38, and nociceptin, suggesting that netrin does not use the same signaling machinery as any of these other repellents. Avoidance of netrin-1 peptide was effectively eliminated by the addition of the tyrosine kinase inhibitor, genistein, to the assay buffer; however, immunostaining using an anti-phosphotyrosine antibody showed similar fluorescence levels in control and netrin-1 exposed cells, suggesting that tyrosine phosphorylation is not required for signaling to occur. In addition, ELISA indicates that a netrin-like peptide is present in both whole cell extract and secreted protein obtained from Tetrahymena thermophila. Further study will be required in order to fully elucidate the signaling mechanism of netrin-1 peptide in this organism.

  10. CHD7, the gene mutated in CHARGE syndrome, regulates genes involved in neural crest cell guidance.

    Science.gov (United States)

    Schulz, Yvonne; Wehner, Peter; Opitz, Lennart; Salinas-Riester, Gabriela; Bongers, Ernie M H F; van Ravenswaaij-Arts, Conny M A; Wincent, Josephine; Schoumans, Jacqueline; Kohlhase, Jürgen; Borchers, Annette; Pauli, Silke

    2014-08-01

    Heterozygous loss-of-function mutations in CHD7 (chromodomain helicase DNA-binding protein 7) lead to CHARGE syndrome, a complex developmental disorder affecting craniofacial structures, cranial nerves and several organ systems. Recently, it was demonstrated that CHD7 is essential for the formation of multipotent migratory neural crest cells, which migrate from the neural tube to many regions of the embryo, where they differentiate into various tissues including craniofacial and heart structures. So far, only a few CHD7 target genes involved in neural crest cell development have been identified, and the roles of CHD7 in neural crest cell guidance and in the regulation of the mesenchymal-to-epithelial transition remain unknown. Therefore, we undertook a genome-wide microarray expression analysis on wild-type and CHD7-deficient (Chd7 (Whi/+) and Chd7 (Whi/Whi)) mouse embryos at day 9.5, a time point of neural crest cell migration. We identified 98 differentially expressed genes between wild-type and Chd7 (Whi/Whi) embryos. Interestingly, many misregulated genes are involved in neural crest cell and axon guidance, such as semaphorins and ephrin receptors. By performing knockdown experiments for Chd7 in Xenopus laevis embryos, we found abnormalities in the expression pattern of Sema3a, a protein involved in the pathogenesis of Kallmann syndrome, in vivo. In addition, we detected non-synonymous SEMA3A variations in 3 out of 45 CHD7-negative CHARGE patients. In summary, we discovered for the first time that Chd7 regulates genes involved in neural crest cell guidance, demonstrating a new aspect in the pathogenesis of CHARGE syndrome. Furthermore, we showed for Sema3a a conserved regulatory mechanism across different species, highlighting its significance during development. Although we postulated that the non-synonymous SEMA3A variants which we found in CHD7-negative CHARGE patients alone are not sufficient to produce the phenotype, we suggest an important modifier role for SEMA3A in the

  11. Evidence for a role of the chemorepellent semaphorin III and its receptor neuropilin-1 in the regeneration of primary olfactory axons

    NARCIS (Netherlands)

    Pasterkamp, R Jeroen; De Winter, F; Holtmaat, Anthony J D G; Verhaagen, J

    1998-01-01

    To explore a role for chemorepulsive axon guidance mechanisms in the regeneration of primary olfactory axons, we examined the expression of the chemorepellent semaphorin III (sema III), its receptor neuropilin-1, and collapsin response mediator protein-2 (CRMP-2) during regeneration of the olfactory

  12. Slit/Robo1 signaling regulates neural tube development by balancing neuroepithelial cell proliferation and differentiation

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Guang; Li, Yan; Wang, Xiao-yu [Key Laboratory for Regenerative Medicine of The Ministry of Education, Department of Histology and Embryology, School of Medicine, Jinan University, Guangzhou 510632 (China); Han, Zhe [Institute of Vascular Biological Sciences, Guangdong Pharmaceutical University, Guangzhou 510224 (China); Chuai, Manli [College of Life Sciences Biocentre, University of Dundee, Dundee DD1 5EH (United Kingdom); Wang, Li-jing [Institute of Vascular Biological Sciences, Guangdong Pharmaceutical University, Guangzhou 510224 (China); Ho Lee, Kenneth Ka [Stem Cell and Regeneration Thematic Research Programme, School of Biomedical Sciences, Chinese University of Hong Kong, Shatin (Hong Kong); Geng, Jian-guo, E-mail: jgeng@umich.edu [Institute of Vascular Biological Sciences, Guangdong Pharmaceutical University, Guangzhou 510224 (China); Department of Biologic and Materials Sciences, University of Michigan School of Dentistry, Ann Arbor, MI 48109 (United States); Yang, Xuesong, E-mail: yang_xuesong@126.com [Key Laboratory for Regenerative Medicine of The Ministry of Education, Department of Histology and Embryology, School of Medicine, Jinan University, Guangzhou 510632 (China)

    2013-05-01

    Formation of the neural tube is the morphological hallmark for development of the embryonic central nervous system (CNS). Therefore, neural tube development is a crucial step in the neurulation process. Slit/Robo signaling was initially identified as a chemorepellent that regulated axon growth cone elongation, but its role in controlling neural tube development is currently unknown. To address this issue, we investigated Slit/Robo1 signaling in the development of the chick neural tube and in transgenic mice over-expressing Slit2. We disrupted Slit/Robo1 signaling by injecting R5 monoclonal antibodies into HH10 neural tubes to block the Robo1 receptor. This inhibited the normal development of the ventral body curvature and caused the spinal cord to curl up into an S-shape. Next, Slit/Robo1 signaling on one half-side of the chick embryo neural tube was disturbed by electroporation in ovo. We found that the morphology of the neural tube was dramatically abnormal after we interfered with Slit/Robo1 signaling. Furthermore, we established that silencing Robo1 inhibited cell proliferation while over-expressing Robo1 enhanced cell proliferation. We also investigated the effects of altering Slit/Robo1 expression on Sonic Hedgehog (Shh) and Pax7 expression in the developing neural tube. We demonstrated that over-expressing Robo1 down-regulated Shh expression in the ventral neural tube and resulted in the production of fewer HNK-1+ migrating neural crest cells (NCCs). In addition, Robo1 over-expression enhanced Pax7 expression in the dorsal neural tube and increased the number of Slug+ pre-migratory NCCs. Conversely, silencing Robo1 expression resulted in enhanced Shh expression and more HNK-1+ migrating NCCs, but reduced Pax7 expression and fewer Slug+ pre-migratory NCCs were observed. In conclusion, we propose that Slit/Robo1 signaling is involved in regulating neural tube

  13. Brief Report: Robo1 Regulates the Migration of Human Subventricular Zone Neural Progenitor Cells During Development.

    Science.gov (United States)

    Guerrero-Cazares, Hugo; Lavell, Emily; Chen, Linda; Schiapparelli, Paula; Lara-Velazquez, Montserrat; Capilla-Gonzalez, Vivian; Clements, Anna Christina; Drummond, Gabrielle; Noiman, Liron; Thaler, Katrina; Burke, Anne; Quiñones-Hinojosa, Alfredo

    2017-07-01

    Human neural progenitor cell (NPC) migration within the subventricular zone (SVZ) of the lateral ganglionic eminence is an active process throughout early brain development. The migration of human NPCs from the SVZ to the olfactory bulb during fetal stages resembles what occurs in adult rodents. As the human brain develops during infancy, this migratory stream is drastically reduced in cell number and becomes barely evident in adults. The mechanisms regulating human NPC migration are unknown. The Slit-Robo signaling pathway has been defined as a chemorepulsive cue involved in axon guidance and neuroblast migration in rodents. Slit and Robo proteins expressed in the rodent brain help guide neuroblast migration from the SVZ through the rostral migratory stream to the olfactory bulb. Here, we present the first study on the role that Slit and Robo proteins play in human-derived fetal neural progenitor cell migration (hfNPC). We describe that Robo1 and Robo2 isoforms are expressed in the human fetal SVZ. Furthermore, we demonstrate that Slit2 is able to induce a chemorepellent effect on the migration of hfNPCs derived from the human fetal SVZ. In addition, when Robo1 expression is inhibited, hfNPCs are unable to migrate to the olfactory bulb of mice when injected in the anterior SVZ. Our findings indicate that the migration of human NPCs from the SVZ is partially regulated by the Slit-Robo axis. This pathway could be regulated to direct the migration of NPCs in human endogenous neural cell therapy. Stem Cells 2017;35:1860-1865. © 2017 AlphaMed Press.

  14. Evolvable synthetic neural system

    Science.gov (United States)

    Curtis, Steven A. (Inventor)

    2009-01-01

    An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.

  15. Neural Networks

    Directory of Open Access Journals (Sweden)

    Schwindling Jerome

    2010-04-01

    Full Text Available This course presents an overview of the concepts of neural networks and their application in the framework of high-energy physics analyses. After a brief introduction to the concept of neural networks, the concept is explained in the frame of neurobiology, introducing the multi-layer perceptron, learning, and its use as a data classifier. The concept is then presented in a second part using in more detail the mathematical approach, focusing on typical use cases faced in particle physics. Finally, the last part presents the best way to use such statistical tools in view of event classifiers, putting the emphasis on the setup of the multi-layer perceptron. The full article (15 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.
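
    The multi-layer perceptron mentioned above can be made concrete with a minimal sketch. The layer sizes, learning rate and XOR toy data below are illustrative assumptions, not details taken from the lecture:

        # Minimal multi-layer perceptron trained by backpropagation (illustrative sketch).
        import numpy as np

        rng = np.random.default_rng(0)
        X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])   # inputs
        y = np.array([[0.], [1.], [1.], [0.]])                   # XOR targets

        W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)           # hidden layer (4 units)
        W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)           # output layer
        sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

        for step in range(5000):                     # plain gradient descent on squared error
            h = sigmoid(X @ W1 + b1)                 # hidden activations
            out = sigmoid(h @ W2 + b2)               # network output
            d_out = (out - y) * out * (1 - out)      # output-layer delta
            d_h = (d_out @ W2.T) * h * (1 - h)       # hidden-layer delta (backpropagation)
            W2 -= 0.5 * h.T @ d_out;  b2 -= 0.5 * d_out.sum(0)
            W1 -= 0.5 * X.T @ d_h;    b1 -= 0.5 * d_h.sum(0)

        h = sigmoid(X @ W1 + b1)
        print(sigmoid(h @ W2 + b2).round(2).ravel()) # typically approaches [0, 1, 1, 0]

    Used as a data classifier, the final sigmoid output is simply thresholded at 0.5.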

  16. NRP1 and NRP2 cooperate to regulate gangliogenesis, axon guidance and target innervation in the sympathetic nervous system

    Science.gov (United States)

    Maden, Charlotte H.; Gomes, John; Schwarz, Quenten; Davidson, Kathryn; Tinker, Andrew; Ruhrberg, Christiana

    2012-01-01

    The sympathetic nervous system (SNS) arises from neural crest (NC) cells during embryonic development and innervates the internal organs of vertebrates to modulate their stress response. NRP1 and NRP2 are receptors for guidance cues of the class 3 semaphorin (SEMA) family and are expressed in partially overlapping patterns in sympathetic NC cells and their progeny. By comparing the phenotypes of mice lacking NRP1 or its ligand SEMA3A with mice lacking NRP1 in the sympathetic versus vascular endothelial cell lineages, we demonstrate that SEMA3A signalling through NRP1 has multiple cell-autonomous roles in SNS development. These roles include neuronal cell body positioning, neuronal aggregation and axon guidance, first during sympathetic chain assembly and then to regulate the innervation of the heart and aorta. Loss of NRP2 or its ligand SEMA3F impaired sympathetic gangliogenesis more mildly than loss of SEMA3A/NRP1 signalling, but caused ectopic neurite extension along the embryonic aorta. The analysis of compound mutants lacking SEMA3A and SEMA3F or NRP1 and NRP2 in the SNS demonstrated that both signalling pathways cooperate to organise the SNS. We further show that abnormal sympathetic development in mice lacking NRP1 in the sympathetic lineage has functional consequences, as it causes sinus bradycardia, similar to mice lacking SEMA3A. PMID:22790009

  17. Neural Tube Defects

    Science.gov (United States)

    ... vitamin, before and during pregnancy prevents most neural tube defects. Neural tube defects are usually diagnosed before the infant is ... or imaging tests. There is no cure for neural tube defects. The nerve damage and loss of function ...

  18. [Neural repair].

    Science.gov (United States)

    Kitada, Masaaki; Dezawa, Mari

    2008-05-01

    Recent progress in stem cell biology gives us hope for neural repair. We have established methods to specifically induce functional Schwann cells and neurons from bone marrow stromal cells (MSCs). The effectiveness of these induced cells was evaluated by grafting them into peripheral nerve injury, spinal cord injury, or Parkinson's disease animal models. MSC-derived Schwann cells supported axonal regeneration and reconstructed myelin to facilitate functional recovery in peripheral nerve and spinal cord injury. MSC-derived dopaminergic neurons integrated into the host striatum and contributed to behavioral recovery. In this review, we introduce the differentiation potential of MSCs and finally discuss the benefits and drawbacks of these induction systems for cell-based therapy in neuro-traumatic and neuro-degenerative diseases.

  19. Introduction to neural networks

    CERN Document Server

    James, Frederick E

    1994-02-02

    1. Introduction and overview of Artificial Neural Networks. 2-3. The feed-forward network as an inverse problem, and results on the computational complexity of network training. 4. Physics applications of neural networks.

  20. Morphological neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Ritter, G.X.; Sussner, P. [Univ. of Florida, Gainesville, FL (United States)

    1996-12-31

    The theory of artificial neural networks has been successfully applied to a wide variety of pattern recognition problems. In this theory, the first step in computing the next state of a neuron or in performing the next layer neural network computation involves the linear operation of multiplying neural values by their synaptic strengths and adding the results. Thresholding usually follows the linear operation in order to provide for nonlinearity of the network. In this paper we introduce a novel class of neural networks, called morphological neural networks, in which the operations of multiplication and addition are replaced by addition and maximum (or minimum), respectively. By taking the maximum (or minimum) of sums instead of the sum of products, morphological network computation is nonlinear before thresholding. As a consequence, the properties of morphological neural networks are drastically different from those of traditional neural network models. In this paper we consider some of these differences and provide some particular examples of morphological neural networks.
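
    The substitution the abstract describes (sums of products replaced by maxima of sums) can be shown in a few lines; the input and weight values are arbitrary illustrations:

        # Classical neuron versus morphological (max-of-sums) neuron, before thresholding.
        import numpy as np

        x = np.array([1.0, 2.0, 0.5])      # input values
        w = np.array([0.2, -1.0, 0.7])     # synaptic weights

        classical = np.sum(w * x)          # multiply-and-add: linear before thresholding
        morphological = np.max(w + x)      # add-and-maximum: nonlinear before thresholding

        print(classical, morphological)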

  1. Evolvable Neural Software System

    Science.gov (United States)

    Curtis, Steven A.

    2009-01-01

    The Evolvable Neural Software System (ENSS) is composed of sets of Neural Basis Functions (NBFs), which can be totally autonomously created and removed according to the changing needs and requirements of the software system. The resulting structure is both hierarchical and self-similar in that a given set of NBFs may have a ruler NBF, which in turn communicates with other sets of NBFs. These sets of NBFs may function as nodes to a ruler node, which are also NBF constructs. In this manner, the synthetic neural system can exhibit the complexity, three-dimensional connectivity, and adaptability of biological neural systems. An added advantage of ENSS over a natural neural system is its ability to modify its core genetic code in response to environmental changes as reflected in needs and requirements. The neural system is fully adaptive and evolvable and is trainable before release. It continues to rewire itself while on the job. The NBF is a unique, bilevel intelligence neural system composed of a higher-level heuristic neural system (HNS) and a lower-level, autonomic neural system (ANS). Taken together, the HNS and the ANS give each NBF the complete capabilities of a biological neural system to match sensory inputs to actions. Another feature of the NBF is the Evolvable Neural Interface (ENI), which links the HNS and ANS. The ENI solves the interface problem between these two systems by actively adapting and evolving from a primitive initial state (a Neural Thread) to a complicated, operational ENI and successfully adapting to a training sequence of sensory input. This simulates the adaptation of a biological neural system in a developmental phase. Within the greater multi-NBF and multi-node ENSS, self-similar ENIs provide the basis for inter-NBF and inter-node connectivity.
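
    The hierarchical, self-similar NBF organization described above can be pictured with a toy data structure. The class and field names below are hypothetical illustrations, not identifiers from the actual software:

        # Toy sketch of nested Neural Basis Functions under a ruler NBF (hypothetical names).
        from dataclasses import dataclass, field
        from typing import List

        @dataclass
        class NeuralBasisFunction:
            name: str
            heuristic_level: str = "HNS"    # higher-level heuristic neural system
            autonomic_level: str = "ANS"    # lower-level autonomic neural system
            ruled: List["NeuralBasisFunction"] = field(default_factory=list)

        ruler = NeuralBasisFunction("ruler-node")
        ruler.ruled.append(NeuralBasisFunction("sensor-nbf"))
        ruler.ruled.append(NeuralBasisFunction("actuator-nbf"))
        print(ruler)                        # each child could itself rule further NBF sets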

  2. Consciousness and neural plasticity

    DEFF Research Database (Denmark)

    In contemporary consciousness studies the phenomenon of neural plasticity has received little attention despite the fact that neural plasticity is of ever-increasing interest in neuroscience. We will, however, argue that neural plasticity could be of great importance to consciousness studies. If consciousness is related to neural processes it seems, at least prima facie, that the ability of the neural structures to change should be reflected in a theory of this relationship. "Neural plasticity" refers to the fact that the brain can change due to its own activity. The brain is not static but rather a dynamic entity whose physical structure changes according to its use and environment. This change may take the form of growth of new neurons, the creation of new networks and structures, and change within network structures, that is, changes in synaptic strengths. Plasticity raises questions about...

  3. Fuzzy and neural control

    Science.gov (United States)

    Berenji, Hamid R.

    1992-01-01

    Fuzzy logic and neural networks provide new methods for designing control systems. Fuzzy logic controllers do not require a complete analytical model of a dynamic system and can provide knowledge-based heuristic controllers for ill-defined and complex systems. Neural networks can be used for learning control. In this chapter, we discuss hybrid methods using fuzzy logic and neural networks which can start with an approximate control knowledge base and refine it through reinforcement learning.
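
    A minimal single-input fuzzy controller of the kind alluded to above might look as follows; the membership functions, rule base and output values are invented for illustration and are not from the chapter:

        # Minimal fuzzy logic controller: maps an error signal to a control action.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function with support [a, c] and peak at b."""
            return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

        def fuzzy_control(error):
            # Fuzzify: degrees of membership in three linguistic sets.
            neg  = tri(error, -2.0, -1.0, 0.0)
            zero = tri(error, -1.0,  0.0, 1.0)
            pos  = tri(error,  0.0,  1.0, 2.0)
            # Rule base: IF error is NEG THEN output +1; IF ZERO THEN 0; IF POS THEN -1.
            strengths = np.array([neg, zero, pos])
            outputs   = np.array([1.0, 0.0, -1.0])
            # Defuzzify with a weighted (centroid-like) average of the rule outputs.
            return float(strengths @ outputs / (strengths.sum() + 1e-9))

        print(fuzzy_control(0.4))   # small positive error -> small negative correction

    A neural network could then tune the membership parameters or rule strengths from reinforcement signals, which is the hybrid idea the chapter discusses.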

  4. What Is Neural Plasticity?

    Science.gov (United States)

    von Bernhardi, Rommy; Bernhardi, Laura Eugenín-von; Eugenín, Jaime

    2017-01-01

    "Neural plasticity" refers to the capacity of the nervous system to modify itself, functionally and structurally, in response to experience and injury. As the various chapters in this volume show, plasticity is a key component of neural development and normal functioning of the nervous system, as well as a response to the changing environment, aging, or pathological insult. This chapter discusses how plasticity is necessary not only for neural networks to acquire new functional properties, but also for them to remain robust and stable. The article also reviews the seminal proposals developed over the years that have driven experiments and strongly influenced concepts of neural plasticity.

  5. Neural Systems Laboratory

    Data.gov (United States)

    Federal Laboratory Consortium — As part of the Electrical and Computer Engineering Department and The Institute for System Research, the Neural Systems Laboratory studies the functionality of the...

  6. A neural flow estimator

    DEFF Research Database (Denmark)

    Jørgensen, Ivan Harald Holger; Bogason, Gudmundur; Bruun, Erik

    1995-01-01

    This paper proposes a new way to estimate the flow in a micromechanical flow channel. A neural network is used to estimate the delay of random temperature fluctuations induced in a fluid. The design and implementation of a hardware efficient neural flow estimator is described. The system is implemented using switched-current technique and is capable of estimating flow in the μl/s range. The neural estimator is built around a multiplierless neural network, containing 96 synaptic weights which are updated using the LMS1-algorithm. An experimental chip has been designed that operates at 5 V...
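
    The delay-estimation idea can be sketched in software: an adaptive FIR filter trained with LMS learns to map the upstream temperature signal to the downstream one, and the index of its largest weight indicates the transit delay (and hence the flow). The filter length, step size and simulated signals below are illustrative assumptions, not the authors' switched-current hardware:

        # LMS-adapted FIR filter estimating the transport delay between two sensor signals.
        import numpy as np

        rng = np.random.default_rng(1)
        n, true_delay = 5000, 7                    # samples and simulated transit delay (taps)
        upstream = rng.normal(size=n)              # random temperature fluctuations
        downstream = np.roll(upstream, true_delay) # downstream sensor sees a delayed copy
        downstream[:true_delay] = 0.0

        taps, mu = 16, 0.01                        # filter length and LMS step size
        w = np.zeros(taps)                         # adaptive weights ("synapses")

        for k in range(taps, n):
            x = upstream[k - taps + 1:k + 1][::-1] # most recent samples first
            e = downstream[k] - w @ x              # prediction error
            w += 2 * mu * e * x                    # LMS weight update

        print("estimated delay (taps):", int(np.argmax(np.abs(w))))   # prints 7 here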

  7. Neural Networks: Implementations and Applications

    NARCIS (Netherlands)

    Vonk, E.; Veelenturf, L.P.J.; Jain, L.C.

    1996-01-01

    Artificial neural networks, also called neural networks, have been used successfully in many fields including engineering, science and business. This paper presents the implementation of several neural network simulators and their applications in character recognition and other engineering areas

  8. Critical Branching Neural Networks

    Science.gov (United States)

    Kello, Christopher T.

    2013-01-01

    It is now well-established that intrinsic variations in human neural and behavioral activity tend to exhibit scaling laws in their fluctuations and distributions. The meaning of these scaling laws is an ongoing matter of debate between isolable causes versus pervasive causes. A spiking neural network model is presented that self-tunes to critical…
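
    The critical regime referred to above is a branching ratio of one: each spike triggers, on average, one descendant spike. The toy simulation below is not the authors' model; it only shows a generic homeostatic rule nudging a probabilistically spiking population toward that ratio, with all sizes and rates invented:

        # Toy self-tuning branching process: drive the branching ratio toward 1 (criticality).
        import numpy as np

        rng = np.random.default_rng(2)
        n_neurons, p = 100, 0.002          # transmission probability (starts subcritical)
        eta, target = 1e-4, 1.0            # adaptation rate and critical branching ratio

        ratios = []
        for trial in range(2000):
            ancestors = rng.random(n_neurons) < 0.05        # externally driven spikes
            n_anc = ancestors.sum()
            # Each ancestor spike tries to trigger every neuron independently with prob p.
            hits = rng.random((n_anc, n_neurons)) < p
            n_desc = hits.any(axis=0).sum()
            sigma = n_desc / max(n_anc, 1)                  # observed branching ratio
            p = min(max(p + eta * (target - sigma), 0.0), 1.0)   # homeostatic adjustment
            ratios.append(sigma)

        print("mean branching ratio, last 200 trials:", np.mean(ratios[-200:]))  # ~1.0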

  9. Kunstige neurale net

    DEFF Research Database (Denmark)

    Hørning, Annette

    1994-01-01

    The article deals with the possibility of using artificial neural networks for computational processing of natural language, in particular automatic speech recognition.

  10. Dynamics of neural cryptography.

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Kanter, Ido

    2007-05-01

    Synchronization of neural networks has been used for public channel protocols in cryptography. In the case of tree parity machines the dynamics of both bidirectional synchronization and unidirectional learning is driven by attractive and repulsive stochastic forces. Thus it can be described well by a random walk model for the overlap between participating neural networks. For that purpose transition probabilities and scaling laws for the step sizes are derived analytically. Both these calculations as well as numerical simulations show that bidirectional interaction leads to full synchronization on average. In contrast, successful learning is only possible by means of fluctuations. Consequently, synchronization is much faster than learning, which is essential for the security of the neural key-exchange protocol. However, this qualitative difference between bidirectional and unidirectional interaction vanishes if tree parity machines with more than three hidden units are used, so that those neural networks are not suitable for neural cryptography. In addition, the effective number of keys which can be generated by the neural key-exchange protocol is calculated using the entropy of the weight distribution. As this quantity increases exponentially with the system size, brute-force attacks on neural cryptography can easily be made unfeasible.
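
    The tree parity machine dynamics summarized above can be sketched directly; the sizes K and N and the weight bound L below are illustrative choices:

        # Bidirectional synchronization of two tree parity machines (illustrative sketch).
        import numpy as np

        K, N, L = 3, 10, 3                          # hidden units, inputs per unit, weight bound
        rng = np.random.default_rng(3)
        wA = rng.integers(-L, L + 1, size=(K, N))   # party A's secret weights
        wB = rng.integers(-L, L + 1, size=(K, N))   # party B's secret weights

        def output(w, x):
            sigma = np.sign((w * x).sum(axis=1))
            sigma[sigma == 0] = -1                  # break ties
            return sigma, int(np.prod(sigma))       # hidden outputs and overall parity

        steps = 0
        while not np.array_equal(wA, wB):
            x = rng.choice([-1, 1], size=(K, N))    # public random input
            sA, tauA = output(wA, x)
            sB, tauB = output(wB, x)
            if tauA == tauB:                        # update only when public outputs agree
                for k in range(K):                  # Hebbian rule on agreeing hidden units
                    if sA[k] == tauA:
                        wA[k] = np.clip(wA[k] + tauA * x[k], -L, L)
                    if sB[k] == tauB:
                        wB[k] = np.clip(wB[k] + tauB * x[k], -L, L)
            steps += 1

        print("synchronized after", steps, "exchanged inputs")  # shared weights form the key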

  11. ANT Advanced Neural Tool

    Energy Technology Data Exchange (ETDEWEB)

    Labrador, I.; Carrasco, R.; Martinez, L.

    1996-07-01

    This paper describes a practical introduction to the use of Artificial Neural Networks. Artificial neural nets are often used as an alternative to the traditional symbolic manipulation and first-order logic used in Artificial Intelligence, owing to the high degree of difficulty of solving problems that cannot be handled by programmers using algorithmic strategies. As a particular case of neural net, a Multilayer Perceptron developed in the C language on the OS9 real-time operating system is presented. A detailed description of the program structure and its practical use is included. Finally, several application examples that have been treated with the tool are presented, along with some suggestions about hardware implementations. (Author) 15 refs.

  12. Hidden neural networks

    DEFF Research Database (Denmark)

    Krogh, Anders Stærmose; Riis, Søren Kamaric

    1999-01-01

    A general framework for hybrids of hidden Markov models (HMMs) and neural networks (NNs) called hidden neural networks (HNNs) is described. The article begins by reviewing standard HMMs and estimation by conditional maximum likelihood, which is used by the HNN. In the HNN, the usual HMM probability...... parameters are replaced by the outputs of state-specific neural networks. As opposed to many other hybrids, the HNN is normalized globally and therefore has a valid probabilistic interpretation. All parameters in the HNN are estimated simultaneously according to the discriminative conditional maximum...... likelihood criterion. The HNN can be viewed as an undirected probabilistic independence network (a graphical model), where the neural networks provide a compact representation of the clique functions. An evaluation of the HNN on the task of recognizing broad phoneme classes in the TIMIT database shows clear...

  13. [Neural codes for perception].

    Science.gov (United States)

    Romo, R; Salinas, E; Hernández, A; Zainos, A; Lemus, L; de Lafuente, V; Luna, R

    This article describes experiments designed to reveal the neural codes associated with the perception and processing of tactile information. The results of these experiments have shown neural activity correlated with tactile perception. The neurones of the primary somatosensory cortex (S1) represent the physical attributes of tactile stimuli, and we found that these representations correlate with tactile perception. By means of intracortical microstimulation we demonstrated a causal relationship between S1 activity and tactile perception. In the motor areas of the frontal lobe lies the connection between sensory and motor representations while decisions are being made. S1 generates neural representations of somatosensory stimuli which seem to be sufficient for tactile perception. These neural representations are subsequently processed by areas central to S1 and appear useful in perception, memory and decision making.

  14. Neural Oscillators Programming Simplified

    Directory of Open Access Journals (Sweden)

    Patrick McDowell

    2012-01-01

    Full Text Available The neurological mechanism used for generating rhythmic patterns for functions such as swallowing, walking, and chewing has been modeled computationally by the neural oscillator. It has been widely studied by biologists to model various aspects of organisms and by computer scientists and robotics engineers as a method for controlling and coordinating the gaits of walking robots. Although there has been significant study in this area, it is difficult to find basic guidelines for programming neural oscillators. In this paper, the authors approach neural oscillators from a programmer’s point of view, providing background and examples for developing neural oscillators to generate rhythmic patterns that can be used in biological modeling and robotics applications.
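
    One common formulation, widely used in the robotics applications mentioned above, is Matsuoka's two-neuron mutual-inhibition oscillator; the parameter values below are illustrative choices that produce alternating bursts, not values from the article:

        # Two mutually inhibiting Matsuoka neurons generating a rhythmic pattern (sketch).
        import numpy as np

        tau_u, tau_v = 0.25, 0.5        # membrane and adaptation time constants
        beta, w_inh, s = 2.5, 2.5, 1.0  # adaptation gain, mutual inhibition, tonic drive
        dt, steps = 0.005, 4000

        u = np.array([0.1, 0.0])        # membrane states (slightly asymmetric start)
        v = np.zeros(2)                 # adaptation (fatigue) states
        trace = []

        for _ in range(steps):
            y = np.maximum(u, 0.0)                              # firing rates
            du = (-u - beta * v - w_inh * y[::-1] + s) / tau_u  # inhibition from the other neuron
            dv = (-v + y) / tau_v
            u += dt * du
            v += dt * dv
            trace.append(y.copy())

        trace = np.array(trace)
        print("output ranges:", trace.min(axis=0), trace.max(axis=0))  # the two outputs alternate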

  15. Neural cryptography with feedback.

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Shacham, Lanir; Kanter, Ido

    2004-04-01

    Neural cryptography is based on a competition between attractive and repulsive stochastic forces. A feedback mechanism is added to neural cryptography which increases the repulsive forces. Using numerical simulations and an analytic approach, the probability of a successful attack is calculated for different model parameters. Scaling laws are derived which show that feedback improves the security of the system. In addition, a network with feedback generates a pseudorandom bit sequence which can be used to encrypt and decrypt a secret message.

  16. Neural cryptography with feedback

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Shacham, Lanir; Kanter, Ido

    2004-04-01

    Neural cryptography is based on a competition between attractive and repulsive stochastic forces. A feedback mechanism is added to neural cryptography which increases the repulsive forces. Using numerical simulations and an analytic approach, the probability of a successful attack is calculated for different model parameters. Scaling laws are derived which show that feedback improves the security of the system. In addition, a network with feedback generates a pseudorandom bit sequence which can be used to encrypt and decrypt a secret message.

  17. Neural network applications

    Science.gov (United States)

    Padgett, Mary L.; Desai, Utpal; Roppel, T.A.; White, Charles R.

    1993-01-01

    A design procedure is suggested for neural networks which accommodates the inclusion of such knowledge-based systems techniques as fuzzy logic and pairwise comparisons. The use of these procedures in the design of applications combines qualitative and quantitative factors with empirical data to yield a model with justifiable design and parameter selection procedures. The procedure is especially relevant to areas of back-propagation neural network design which are highly responsive to the use of precisely recorded expert knowledge.

  18. Building Neural Net Software

    OpenAIRE

    Neto, João Pedro; Costa, José Félix

    1999-01-01

    In a recent paper [Neto et al. 97] we showed that programming languages can be translated onto recurrent (analog, rational-weighted) neural nets. The goal was not efficiency but simplicity. Indeed we used a number-theoretic approach to machine programming, where (integer) numbers were coded in a unary fashion, introducing an exponential slowdown in the computations with respect to a two-symbol tape Turing machine. Implementation of programming languages in neural nets turns out to be not only theo...

  19. NEMEFO: NEural MEteorological FOrecast

    Energy Technology Data Exchange (ETDEWEB)

    Pasero, E.; Moniaci, W.; Meindl, T.; Montuori, A. [Polytechnic of Turin (Italy). Dept. of Electronics

    2004-07-01

    Artificial Neural Systems are a well-known technique used to classify and recognize objects. Introducing the time dimension they can be used to forecast numerical series. NEMEFO is a ''nowcasting'' tool, which uses both statistical and neural systems to forecast meteorological data in a restricted area close to a meteorological weather station in a short time range (3 hours). Ice, fog, rain are typical events which can be anticipated by NEMEFO. (orig.)

  20. Conducting Polymers for Neural Prosthetic and Neural Interface Applications

    Science.gov (United States)

    2015-01-01

    Neural interfacing devices are an artificial mechanism for restoring or supplementing the function of the nervous system lost as a result of injury or disease. Conducting polymers (CPs) are gaining significant attention due to their capacity to meet the performance criteria of a number of neuronal therapies including recording and stimulating neural activity, the regeneration of neural tissue and the delivery of bioactive molecules for mediating device-tissue interactions. CPs form a flexible platform technology that enables the development of tailored materials for a range of neuronal diagnostic and treatment therapies. In this review the application of CPs for neural prostheses and other neural interfacing devices are discussed, with a specific focus on neural recording, neural stimulation, neural regeneration, and therapeutic drug delivery. PMID:26414302

  1. Hyperbolic Hopfield neural networks.

    Science.gov (United States)

    Kobayashi, M

    2013-02-01

    In recent years, several neural networks using Clifford algebra have been studied. Clifford algebra is also called geometric algebra. Complex-valued Hopfield neural networks (CHNNs) are the most popular neural networks using Clifford algebra. The aim of this brief is to construct hyperbolic HNNs (HHNNs) as an analog of CHNNs. Hyperbolic algebra is a Clifford algebra based on Lorentzian geometry. In this brief, a hyperbolic neuron is defined in a manner analogous to a phasor neuron, which is a typical complex-valued neuron model. HHNNs share common concepts with CHNNs, such as the angle and energy. However, HHNNs and CHNNs are different in several aspects. The states of hyperbolic neurons do not form a circle, and, therefore, the start and end states are not identical. In the quantized version, unlike complex-valued neurons, hyperbolic neurons have an infinite number of states.

  2. Neural Semantic Encoders.

    Science.gov (United States)

    Munkhdalai, Tsendsuren; Yu, Hong

    2017-04-01

    We present a memory augmented neural network for natural language understanding: Neural Semantic Encoders. NSE is equipped with a novel memory update rule and has a variable sized encoding memory that evolves over time and maintains the understanding of input sequences through read, compose and write operations. NSE can also access multiple and shared memories. In this paper, we demonstrated the effectiveness and the flexibility of NSE on five different natural language tasks: natural language inference, question answering, sentence classification, document sentiment analysis and machine translation, where NSE achieved state-of-the-art performance when evaluated on publicly available benchmarks. For example, our shared-memory model showed an encouraging result on neural machine translation, improving an attention-based baseline by approximately 1.0 BLEU.

  3. The neural crest and neural crest cells: discovery and significance ...

    Indian Academy of Sciences (India)

    In this paper I provide a brief overview of the major phases of investigation into the neural crest and the major players involved, discuss how the origin of the neural crest relates to the origin of the nervous system in vertebrate embryos, discuss the impact on the germ-layer theory of the discovery of the neural crest and of ...

  4. Introduction to Artificial Neural Networks

    DEFF Research Database (Denmark)

    Larsen, Jan

    1999-01-01

    The note addresses an introduction to signal analysis and classification based on artificial feed-forward neural networks.

  5. Deconvolution using a neural network

    Energy Technology Data Exchange (ETDEWEB)

    Lehman, S.K.

    1990-11-15

    Viewing one-dimensional deconvolution as a matrix inversion problem, we compare a neural network backpropagation matrix inverse with LMS and pseudo-inverse methods. This is largely an exercise in understanding how our neural network code works. 1 ref.
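
    The matrix-inversion view can be made concrete: convolution with a kernel is multiplication by a Toeplitz matrix, so deconvolution amounts to (pseudo-)inverting that matrix. The kernel and signal below are illustrative; a backpropagation network, as in the report, would instead be trained to approximate the same inverse map:

        # One-dimensional deconvolution posed as matrix inversion (pseudo-inverse baseline).
        import numpy as np

        rng = np.random.default_rng(4)
        h = np.array([1.0, 0.6, 0.2])              # blurring kernel
        x = rng.normal(size=32)                    # unknown input signal
        y = np.convolve(x, h)                      # observed, blurred signal (length 34)

        # Convolution as y = H @ x with a banded Toeplitz matrix H of shape (len(y), len(x)).
        H = np.zeros((len(y), len(x)))
        for i, hk in enumerate(h):
            H += hk * np.eye(len(y), len(x), k=-i)

        x_hat = np.linalg.pinv(H) @ y              # deconvolution via the pseudo-inverse
        print("max reconstruction error:", np.abs(x_hat - x).max())   # near machine precision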

  6. Neural Network Ensembles

    DEFF Research Database (Denmark)

    Hansen, Lars Kai; Salamon, Peter

    1990-01-01

    We propose several means for improving the performance and training of neural networks for classification. We use cross-validation as a tool for optimizing network parameters and architecture. We show further that the remaining generalization error can be reduced by invoking ensembles of similar networks.
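
    The ensemble part of the idea can be illustrated with a small sketch: train several simple classifiers (stand-ins for the similar networks) on bootstrap resamples and average their outputs. The data, member count and training details are invented for illustration:

        # Averaging an ensemble of independently trained classifiers (illustrative sketch).
        import numpy as np

        rng = np.random.default_rng(5)
        n = 400
        X = np.vstack([rng.normal(-1, 1.2, size=(n // 2, 2)),
                       rng.normal(+1, 1.2, size=(n // 2, 2))])
        y = np.repeat([0, 1], n // 2)
        train, test = np.arange(0, n, 2), np.arange(1, n, 2)        # simple split

        def fit(Xb, yb, steps=300, lr=0.1, seed=0):
            """One tiny 'network': a logistic unit trained by gradient descent."""
            w = np.random.default_rng(seed).normal(size=3) * 0.1
            Xb1 = np.hstack([Xb, np.ones((len(Xb), 1))])
            for _ in range(steps):
                p = 1 / (1 + np.exp(-Xb1 @ w))
                w -= lr * Xb1.T @ (p - yb) / len(yb)
            return w

        def predict(w, Xq):
            Xq1 = np.hstack([Xq, np.ones((len(Xq), 1))])
            return 1 / (1 + np.exp(-Xq1 @ w))

        members = []
        for m in range(7):                                          # ensemble of 7 members
            idx = rng.choice(train, size=len(train), replace=True)  # bootstrap resample
            members.append(fit(X[idx], y[idx], seed=m))

        single = ((predict(members[0], X[test]) > 0.5) == y[test]).mean()
        avg_prob = np.mean([predict(w, X[test]) for w in members], axis=0)
        ensemble = ((avg_prob > 0.5) == y[test]).mean()
        print("single accuracy:", single, " ensemble accuracy:", ensemble)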

  7. Neural systems for control

    National Research Council Canada - National Science Library

    Omidvar, Omid; Elliott, David L

    1997-01-01

    ... is reprinted with permission from A. Barto, "Reinforcement Learning," Handbook of Brain Theory and Neural Networks, M.A. Arbib, ed.. The MIT Press, Cambridge, MA, pp. 804-809, 1995. Chapter 4, Figures 4-5 and 7-9 and Tables 2-5, are reprinted with permission, from S. Cho, "Map Formation in Proprioceptive Cortex," International Jour...

  8. Neural Tube Defects

    Science.gov (United States)

    ... pregnancies each year in the United States. A baby’s neural tube normally develops into the brain and spinal cord. ... fluid in the brain. This is called hydrocephalus. Babies with this condition are treated with surgery to insert a tube (called a shunt) into the brain. The shunt ...

  9. Mutational spectrum of semaphorin 3A and semaphorin 3D genes in Spanish Hirschsprung patients.

    Directory of Open Access Journals (Sweden)

    Berta Luzón-Toro

    Full Text Available Hirschsprung disease (HSCR, OMIM 142623) is a developmental disorder characterized by the absence of ganglion cells along variable lengths of the distal gastrointestinal tract, which results in tonic contraction of the aganglionic colon segment and functional intestinal obstruction. The RET proto-oncogene is the major gene associated with HSCR, with differential contributions of its rare and common, coding and noncoding mutations to the multifactorial nature of this pathology. In addition, many other genes have been described to be associated with this pathology, including the class III semaphorin genes SEMA3A (7p12.1) and SEMA3D (7q21.11), through SNP array analyses and next-generation sequencing technologies. Semaphorins are guidance cues for developing neurons implicated in axonal projections and in the determination of the migratory pathway for neural-crest derived neural precursors during enteric nervous system development. In addition, it has been described that increased SEMA3A expression may be a risk factor for HSCR through the upregulation of the gene in the aganglionic smooth muscle layer of the colon in HSCR patients. Here we present the results of a comprehensive analysis of SEMA3A and SEMA3D in a series of 200 Spanish HSCR patients by mutational screening of their coding sequences, which has led to the identification of a number of potentially deleterious variants. RET mutations have also been detected in some of those patients carrying SEMA variants. We have evaluated the A131T-SEMA3A, S598G-SEMA3A and E198K-SEMA3D mutations using colon tissue sections of these patients by immunohistochemistry. All mutants presented increased protein expression in the smooth muscle layer of ganglionic segments. Moreover, A131T-SEMA3A also maintained higher protein levels in the aganglionic muscle layers. These findings strongly suggest that these mutants have a pathogenic effect on the disease. Furthermore, because of their coexistence with RET mutations, our data

  10. Bioprinting for Neural Tissue Engineering.

    Science.gov (United States)

    Knowlton, Stephanie; Anand, Shivesh; Shah, Twisha; Tasoglu, Savas

    2018-01-01

    Bioprinting is a method by which a cell-encapsulating bioink is patterned to create complex tissue architectures. Given the potential impact of this technology on neural research, we review the current state-of-the-art approaches for bioprinting neural tissues. While 2D neural cultures are ubiquitous for studying neural cells, 3D cultures can more accurately replicate the microenvironment of neural tissues. By bioprinting neuronal constructs, one can precisely control the microenvironment by specifically formulating the bioink for neural tissues, and by spatially patterning cell types and scaffold properties in three dimensions. We review a range of bioprinted neural tissue models and discuss how they can be used to observe how neurons behave, understand disease processes, develop new therapies and, ultimately, design replacement tissues. Copyright © 2017 Elsevier Ltd. All rights reserved.

  11. Neural network technologies

    Science.gov (United States)

    Villarreal, James A.

    1991-01-01

    A whole new arena of computer technologies is now beginning to form. Still in its infancy, neural network technology is a biologically inspired methodology which draws on nature's own cognitive processes. The Software Technology Branch has provided a software tool, Neural Execution and Training System (NETS), to industry, government, and academia to facilitate and expedite the use of this technology. NETS is written in the C programming language and can be executed on a variety of machines. Once a network has been debugged, NETS can produce a C source code which implements the network. This code can then be incorporated into other software systems. Described here are various software projects currently under development with NETS and the anticipated future enhancements to NETS and the technology.

  12. Analysis of neural data

    CERN Document Server

    Kass, Robert E; Brown, Emery N

    2014-01-01

    Continual improvements in data collection and processing have had a huge impact on brain research, producing data sets that are often large and complicated. By emphasizing a few fundamental principles, and a handful of ubiquitous techniques, Analysis of Neural Data provides a unified treatment of analytical methods that have become essential for contemporary researchers. Throughout the book ideas are illustrated with more than 100 examples drawn from the literature, ranging from electrophysiology, to neuroimaging, to behavior. By demonstrating the commonality among various statistical approaches the authors provide the crucial tools for gaining knowledge from diverse types of data. Aimed at experimentalists with only high-school level mathematics, as well as computationally-oriented neuroscientists who have limited familiarity with statistics, Analysis of Neural Data serves as both a self-contained introduction and a reference work.

  13. Neural tube defects

    Directory of Open Access Journals (Sweden)

    M.E. Marshall

    1981-09-01

    Full Text Available Neural tube defects refer to any defect in the morphogenesis of the neural tube, the most common types being spina bifida and anencephaly. Spina bifida has been recognised in skeletons found in north-eastern Morocco and estimated to have an age of almost 12 000 years. It was also known to the ancient Greek and Arabian physicians who thought that the bony defect was due to the tumour. The term spina bifida was first used by Professor Nicolai Tulp of Amsterdam in 1652. Many other terms have been used to describe this defect, but spina bifida remains the most useful general term, as it describes the separation of the vertebral elements in the midline.

  14. Neural networks for triggering

    Energy Technology Data Exchange (ETDEWEB)

    Denby, B. (Fermi National Accelerator Lab., Batavia, IL (USA)); Campbell, M. (Michigan Univ., Ann Arbor, MI (USA)); Bedeschi, F. (Istituto Nazionale di Fisica Nucleare, Pisa (Italy)); Chriss, N.; Bowers, C. (Chicago Univ., IL (USA)); Nesti, F. (Scuola Normale Superiore, Pisa (Italy))

    1990-01-01

    Two types of neural network beauty trigger architectures, based on identification of electrons in jets and recognition of secondary vertices, have been simulated in the environment of the Fermilab CDF experiment. The efficiencies for B's and rejection of background obtained are encouraging. If hardware tests are successful, the electron identification architecture will be tested in the 1991 run of CDF. 10 refs., 5 figs., 1 tab.

  15. Artificial neural network modelling

    CERN Document Server

    Samarasinghe, Sandhya

    2016-01-01

    This book covers theoretical aspects as well as recent innovative applications of Artificial Neural Networks (ANNs) in natural, environmental, biological, social, industrial and automated systems. It presents recent results of ANNs in modelling small, large and complex systems under three categories, namely, 1) Networks, Structure Optimisation, Robustness and Stochasticity, 2) Advances in Modelling Biological and Environmental Systems, and 3) Advances in Modelling Social and Economic Systems. The book aims at serving undergraduates, postgraduates and researchers in ANN computational modelling.

  16. Neurally-mediated syncope.

    Science.gov (United States)

    Can, I; Cytron, J; Jhanjee, R; Nguyen, J; Benditt, D G

    2009-08-01

    Syncope is a syndrome characterized by a relatively sudden, temporary and self-terminating loss of consciousness; the causes may vary, but they have in common a temporary inadequacy of cerebral nutrient flow, usually due to a fall in systemic arterial pressure. However, while syncope is a common problem, it is only one explanation for episodic transient loss of consciousness (TLOC). Consequently, diagnostic evaluation should start with a broad consideration of real or seemingly real TLOC. Among those patients in whom TLOC is deemed to be due to ''true syncope'', the focus may then reasonably turn to assessing the various possible causes; in this regard, the neurally-mediated syncope syndromes are among the most frequently encountered. There are three common variations: vasovagal syncope (often termed the ''common'' faint), carotid sinus syndrome, and the so-called ''situational faints''. Defining whether the cause is due to a neurally-mediated reflex relies heavily on careful history taking and selected testing (e.g., tilt-test, carotid massage). These steps are important. Despite the fact that neurally-mediated faints are usually relatively benign from a mortality perspective, they are nevertheless only infrequently an isolated event; neurally-mediated syncope tends to recur, and physical injury resulting from falls or accidents, diminished quality-of-life, and possible restriction from employment or avocation are real concerns. Consequently, defining the specific form and developing an effective treatment strategy are crucial. In every case the goal should be to determine the cause of syncope with sufficient confidence to provide patients and family members with a reliable assessment of prognosis, recurrence risk, and treatment options.

  17. The Neural Noisy Channel

    OpenAIRE

    Yu, Lei; Blunsom, Phil; Dyer, Chris; Grefenstette, Edward; Kocisky, Tomas

    2016-01-01

    We formulate sequence to sequence transduction as a noisy channel decoding problem and use recurrent neural networks to parameterise the source and channel models. Unlike direct models which can suffer from explaining-away effects during training, noisy channel models must produce outputs that explain their inputs, and their component models can be trained with not only paired training samples but also unpaired samples from the marginal output distribution. Using a latent variable to control ...
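
    As a minimal sketch of the noisy-channel decomposition, the snippet below scores a candidate output y for an input x as log p(x|y) + log p(y); the two component scorers are toy placeholders standing in for the recurrent network channel and source models described in the abstract.

        def channel_logprob(x_tokens, y_tokens):
            # Placeholder for log p(x | y) from a trained channel model (hypothetical values).
            return -0.5 * abs(len(x_tokens) - len(y_tokens)) - 0.1 * len(x_tokens)

        def source_logprob(y_tokens):
            # Placeholder for log p(y) from a trained language model (hypothetical values).
            return -0.3 * len(y_tokens)

        def noisy_channel_score(x_tokens, y_tokens):
            # Bayes' rule up to a constant: p(y | x) is proportional to p(x | y) p(y).
            return channel_logprob(x_tokens, y_tokens) + source_logprob(y_tokens)

        candidates = [["a", "short", "output"], ["a", "much", "longer", "candidate"]]
        x = ["some", "input", "tokens"]
        print(max(candidates, key=lambda y: noisy_channel_score(x, y)))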

  18. Neural Based Orthogonal Data Fitting The EXIN Neural Networks

    CERN Document Server

    Cirrincione, Giansalvo

    2008-01-01

    Written by three leaders in the field of neural based algorithms, Neural Based Orthogonal Data Fitting proposes several neural networks, all endowed with a complete theory which not only explains their behavior, but also compares them with the existing neural and traditional algorithms. The algorithms are studied from different points of view, including: as a differential geometry problem, as a dynamic problem, as a stochastic problem, and as a numerical problem. All algorithms have also been analyzed on real time problems (large dimensional data matrices) and have shown accurate solutions. Wh

  19. Neural Correlates of Stimulus Reportability

    OpenAIRE

    Hulme, Oliver J.; Friston, Karl F.; Zeki, Semir

    2009-01-01

    Most experiments on the “neural correlates of consciousness” employ stimulus reportability as an operational definition of what is consciously perceived. The interpretation of such experiments therefore depends critically on understanding the neural basis of stimulus reportability. Using a high volume of fMRI data, we investigated the neural correlates of stimulus reportability using a partial report object detection paradigm. Subjects were presented with a random array of circularly arranged...

  20. Symbolic processing in neural networks

    OpenAIRE

    Neto, João Pedro; Hava T Siegelmann; Costa,J.Félix

    2003-01-01

    In this paper we show that programming languages can be translated into recurrent (analog, rational weighted) neural nets. Implementation of programming languages in neural nets turns out to be not only theoretically exciting, but also has practical implications for the recent efforts to merge symbolic and sub-symbolic computation. To be of some use, it should be carried out in a context of bounded resources. Herein, we show how to use resource bounds to speed up computations over neural nets, thro...

  1. [Artificial neural networks in Neurosciences].

    Science.gov (United States)

    Porras Chavarino, Carmen; Salinas Martínez de Lecea, José María

    2011-11-01

    This article shows that artificial neural networks are used for confirming the relationships between physiological and cognitive changes. Specifically, we explore the influence of a decrease of neurotransmitters on the behaviour of old people in recognition tasks. This artificial neural network recognizes learned patterns. When we change the threshold of activation in some units, the artificial neural network simulates the experimental results of old people in recognition tasks. However, the main contributions of this paper are the design of an artificial neural network whose operation is inspired by the nervous system, the way the inputs are coded, and the process of orthogonalization of patterns.

  2. Neural Correlates of Face Detection

    National Research Council Canada - National Science Library

    Xu, Xiaokun; Biederman, Irving

    2014-01-01

    Although face detection likely played an essential adaptive role in our evolutionary past and in contemporary social interactions, there have been few rigorous studies investigating its neural correlates...

  3. Optics in neural computation

    Science.gov (United States)

    Levene, Michael John

    In all attempts to emulate the considerable powers of the brain, one is struck by its immense size, parallelism, and complexity. While the fields of neural networks, artificial intelligence, and neuromorphic engineering have all attempted to simplify this considerable complexity, all three can benefit from the inherent scalability and parallelism of optics. This thesis looks at specific aspects of three modes in which optics, and particularly volume holography, can play a part in neural computation. First, holography serves as the basis of highly-parallel correlators, which are the foundation of optical neural networks. The huge input capability of optical neural networks makes them most useful for image processing and image recognition and tracking. These tasks benefit from the shift invariance of optical correlators. In this thesis, I analyze the capacity of correlators, and then present several techniques for controlling the amount of shift invariance. Of particular interest is the Fresnel correlator, in which the hologram is displaced from the Fourier plane. In this case, the amount of shift invariance is limited not just by the thickness of the hologram, but by the distance of the hologram from the Fourier plane. Second, volume holography can provide the huge storage capacity and high speed, parallel read-out necessary to support large artificial intelligence systems. However, previous methods for storing data in volume holograms have relied on awkward beam-steering or on as-yet non-existent cheap, wide-bandwidth, tunable laser sources. This thesis presents a new technique, shift multiplexing, which is capable of very high densities, but which has the advantage of a very simple implementation. In shift multiplexing, the reference wave consists of a focused spot a few millimeters in front of the hologram. Multiplexing is achieved by simply translating the hologram a few tens of microns or less. This thesis describes the theory for how shift

  4. Analysis of neural networks

    CERN Document Server

    Heiden, Uwe

    1980-01-01

    The purpose of this work is a unified and general treatment of activity in neural networks from a mathematical point of view. Possible applications of the theory presented are indicated throughout the text. However, they are not explored in detail for two reasons: first, the universal character of neural activity in nearly all animals requires some type of a general approach; secondly, the mathematical perspicuity would suffer if too many experimental details and empirical peculiarities were interspersed among the mathematical investigation. A guide to many applications is supplied by the references concerning a variety of specific issues. Of course the theory does not aim at covering all individual problems. Moreover there are other approaches to neural network theory (see e.g. Poggio-Torre, 1978) based on the different levels at which the nervous system may be viewed. The theory is a deterministic one reflecting the average behavior of neurons or neuron pools. In this respect the essay is writt...

  5. Artificial Neural Networks·

    Indian Academy of Sciences (India)

    differences between biological neural networks (BNNs) of the brain and ANNs. A thorough understanding of ... neurons. Artificial neural models are loosely based on biology since a complete understanding of the .... A learning scheme for updating a neuron's connections (weights) was proposed by Donald Hebb in 1949.

  6. Neural Networks for Optimal Control

    DEFF Research Database (Denmark)

    Sørensen, O.

    1995-01-01

    Two neural networks are trained to act as an observer and a controller, respectively, to control a non-linear, multi-variable process.

  7. The Neural Support Vector Machine

    NARCIS (Netherlands)

    Wiering, Marco; van der Ree, Michiel; Embrechts, Mark; Stollenga, Marijn; Meijster, Arnold; Nolte, A; Schomaker, Lambertus

    2013-01-01

    This paper describes a new machine learning algorithm for regression and dimensionality reduction tasks. The Neural Support Vector Machine (NSVM) is a hybrid learning algorithm consisting of neural networks and support vector machines (SVMs). The output of the NSVM is given by SVMs that take a

  8. Neural fields theory and applications

    CERN Document Server

    Graben, Peter; Potthast, Roland; Wright, James

    2014-01-01

    With this book, the editors present the first comprehensive collection in neural field studies, authored by leading scientists in the field - among them are two of the founding-fathers of neural field theory. Up to now, research results in the field have been disseminated across a number of distinct journals from mathematics, computational neuroscience, biophysics, cognitive science and others. Starting with a tutorial for novices in neural field studies, the book comprises chapters on emergent patterns, their phase transitions and evolution, on stochastic approaches, cortical development, cognition, robotics and computation, large-scale numerical simulations, the coupling of neural fields to the electroencephalogram and phase transitions in anesthesia. The intended readership are students and scientists in applied mathematics, theoretical physics, theoretical biology, and computational neuroscience. Neural field theory and its applications have a long-standing tradition in the mathematical and computational ...

  9. The Neural Correlates of Race

    Science.gov (United States)

    Ito, Tiffany A.; Bartholow, Bruce D.

    2009-01-01

    Behavioral analyses are a natural choice for understanding the wide-ranging behavioral consequences of racial stereotyping and prejudice. However, neuroimaging and electrophysiological research has recently considered the neural mechanisms that underlie racial categorization and the activation and application of racial stereotypes and prejudice, revealing exciting new insights. Work reviewed here points to the importance of neural structures previously associated with face processing, semantic knowledge activation, evaluation, and self-regulatory behavioral control, allowing for the specification of a neural model of race processing. We show how research on the neural correlates of race can serve to link otherwise disparate lines of evidence on the neural underpinnings of a broad array of social-cognitive phenomena, and consider implications for effecting change in race relations. PMID:19896410

  10. Neural Networks in Control Applications

    DEFF Research Database (Denmark)

    Sørensen, O.

    The intention of this report is to make a systematic examination of the possibilities of applying neural networks in those technical areas, which are familiar to a control engineer. In other words, the potential of neural networks in control applications is given higher priority than a detailed...... examined, and it appears that considering 'normal' neural network models with, say, 500 samples, the problem of over-fitting is negligible, and therefore it is not taken into consideration afterwards. Numerous model types, often met in control applications, are implemented as neural network models...... Kalman filter) representing state space description. The potentials of neural networks for control of non-linear processes are also examined, focusing on three different groups of control concepts, all considered as generalizations of known linear control concepts to handle also non-linear processes...

  11. Chemoattractants and chemorepellents act by inducing opposite polarity in phospholipase C and PI3-kinase signaling

    NARCIS (Netherlands)

    Keizer-Gunnink, Ineke; Kortholt, Arjan; Van Haastert, Peter J. M.

    2007-01-01

    During embryonic development, cell movement is orchestrated by a multitude of attractants and repellents. Chemoattractants applied as a gradient, such as cAMP with Dictyostelium discoideum or fMLP with neutrophils, induce the activation of phospholipase C ( PLC) and phosphoinositide 3 (PI3)-kinase

  12. An Optoelectronic Neural Network

    Science.gov (United States)

    Neil, Mark A. A.; White, Ian H.; Carroll, John E.

    1990-02-01

    We describe and present results of an optoelectronic neural network processing system. The system uses an algorithm based on the Hebbian learning rule to memorise a set of associated vector pairs. Recall occurs by the processing of the input vector with these stored associations in an incoherent optical vector multiplier using optical polarisation rotating liquid crystal spatial light modulators to store the vectors and an optical polarisation shadow casting technique to perform multiplications. Results are detected on a photodiode array and thresholded electronically by a controlling microcomputer. The processor is shown to work in autoassociative and heteroassociative modes with up to 10 stored memory vectors of length 64 (equivalent to 64 neurons) and a cycle time of 50ms. We discuss the limiting factors at work in this system, how they affect its scalability and the general applicability of its principles to other systems.
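
    A minimal software sketch of the Hebbian storage and recall scheme the abstract describes (here in numpy rather than optics): associated bipolar vector pairs are stored as a sum of outer products, and recall is a matrix-vector product followed by a hard threshold. The vector length and number of stored pairs are arbitrary choices for this example.

        import numpy as np

        rng = np.random.default_rng(0)
        n, n_pairs = 64, 10                             # 64 "neurons", 10 stored associations

        keys = rng.choice([-1, 1], size=(n_pairs, n))   # bipolar key patterns
        vals = rng.choice([-1, 1], size=(n_pairs, n))   # bipolar value patterns

        # Hebbian storage: sum of outer products of each associated pair
        W = sum(np.outer(v, k) for k, v in zip(keys, vals))

        def recall(key):
            # Heteroassociative recall: weight the key, then threshold each component.
            return np.where(W @ key >= 0, 1, -1)

        # recall from a corrupted version of the first key (flip a few components)
        noisy = keys[0].copy()
        noisy[:5] *= -1
        print("bits recalled correctly:", int(np.sum(recall(noisy) == vals[0])), "/", n)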

  13. Neural Darwinism and consciousness.

    Science.gov (United States)

    Seth, Anil K; Baars, Bernard J

    2005-03-01

    Neural Darwinism (ND) is a large scale selectionist theory of brain development and function that has been hypothesized to relate to consciousness. According to ND, consciousness is entailed by reentrant interactions among neuronal populations in the thalamocortical system (the 'dynamic core'). These interactions, which permit high-order discriminations among possible core states, confer selective advantages on organisms possessing them by linking current perceptual events to a past history of value-dependent learning. Here, we assess the consistency of ND with 16 widely recognized properties of consciousness, both physiological (for example, consciousness is associated with widespread, relatively fast, low amplitude interactions in the thalamocortical system), and phenomenal (for example, consciousness involves the existence of a private flow of events available only to the experiencing subject). While no theory accounts fully for all of these properties at present, we find that ND and its recent extensions fare well.

  14. Cortical neural prosthetics.

    Science.gov (United States)

    Schwartz, Andrew B

    2004-01-01

    Control of prostheses using cortical signals is based on three elements: chronic microelectrode arrays, extraction algorithms, and prosthetic effectors. Arrays of microelectrodes are permanently implanted in cerebral cortex. These arrays must record populations of single- and multiunit activity indefinitely. Information containing position and velocity correlates of animate movement needs to be extracted continuously in real time from the recorded activity. Prosthetic arms, the current effectors used in this work, need to have the agility and configuration of natural arms. Demonstrations using closed-loop control show that subjects change their neural activity to improve performance with these devices. Adaptive-learning algorithms that capitalize on these improvements show that this technology has the capability of restoring much of the arm movement lost with immobilizing deficits.

  15. Cooperating attackers in neural cryptography.

    Science.gov (United States)

    Shacham, Lanir N; Klein, Einat; Mislovaty, Rachel; Kanter, Ido; Kinzel, Wolfgang

    2004-06-01

    A successful attack strategy in neural cryptography is presented. The neural cryptosystem, based on synchronization of neural networks by mutual learning, has been recently shown to be secure under different attack strategies. The success of the advanced attacker presented here, called the "majority-flipping attacker," does not decay with the parameters of the model. This attacker's outstanding success is due to its using a group of attackers which cooperate throughout the synchronization process, unlike any other attack strategy known. An analytical description of this attack is also presented, and fits the results of simulations.

  16. Cooperating attackers in neural cryptography

    Science.gov (United States)

    Shacham, Lanir N.; Klein, Einat; Mislovaty, Rachel; Kanter, Ido; Kinzel, Wolfgang

    2004-06-01

    A successful attack strategy in neural cryptography is presented. The neural cryptosystem, based on synchronization of neural networks by mutual learning, has been recently shown to be secure under different attack strategies. The success of the advanced attacker presented here, called the “majority-flipping attacker,” does not decay with the parameters of the model. This attacker’s outstanding success is due to its using a group of attackers which cooperate throughout the synchronization process, unlike any other attack strategy known. An analytical description of this attack is also presented, and fits the results of simulations.

  17. Neural Computations in Binaural Hearing

    Science.gov (United States)

    Wagner, Hermann

    Binaural hearing helps humans and animals to localize and unmask sounds. Here, binaural computations in the barn owl's auditory system are discussed. Barn owls use the interaural time difference (ITD) for azimuthal sound localization, and they use the interaural level difference (ILD) for elevational sound localization. ITD and ILD and their precursors are processed in separate neural pathways, the time pathway and the intensity pathway, respectively. Representation of ITD involves four main computational steps, while the representation of ILD is accomplished in three steps. In the discussion, neural processing in the owl's auditory system is compared with neural computations present in mammals.
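
    As a rough sketch of the ITD computation in the time pathway, the snippet below simply picks the delay that maximizes the cross-correlation between the two ears' signals. The sampling rate, the 150-microsecond delay, and the broadband test signal are invented for illustration.

        import numpy as np

        fs = 48_000                                # samples per second (assumed)
        true_itd = 150e-6                          # 150 microseconds (hypothetical)
        rng = np.random.default_rng(1)

        left = rng.standard_normal(960)            # 20 ms of broadband sound
        shift = int(round(true_itd * fs))          # 7 samples at this rate
        right = np.roll(left, shift)               # delayed copy at the other ear

        # cross-correlate over a small window of candidate delays
        lags = np.arange(-20, 21)
        corr = [np.dot(left, np.roll(right, -lag)) for lag in lags]
        est_itd = lags[int(np.argmax(corr))] / fs
        print(f"estimated ITD: {est_itd * 1e6:.0f} microseconds")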

  18. Neural Manifolds for the Control of Movement.

    Science.gov (United States)

    Gallego, Juan A; Perich, Matthew G; Miller, Lee E; Solla, Sara A

    2017-06-07

    The analysis of neural dynamics in several brain cortices has consistently uncovered low-dimensional manifolds that capture a significant fraction of neural variability. These neural manifolds are spanned by specific patterns of correlated neural activity, the "neural modes." We discuss a model for neural control of movement in which the time-dependent activation of these neural modes is the generator of motor behavior. This manifold-based view of motor cortex may lead to a better understanding of how the brain controls movement. Copyright © 2017 Elsevier Inc. All rights reserved.
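
    A small sketch of the manifold idea using principal component analysis: many neurons' firing rates are generated from a few latent signals, and the leading principal components (the "neural modes") recover a low-dimensional subspace capturing most of the variance. The synthetic data below stand in for recorded population activity.

        import numpy as np

        rng = np.random.default_rng(0)
        n_neurons, n_timepoints, n_modes = 100, 500, 3

        # latent low-dimensional dynamics mixed into many neurons, plus noise
        latents = rng.standard_normal((n_modes, n_timepoints))
        mixing = rng.standard_normal((n_neurons, n_modes))
        rates = mixing @ latents + 0.1 * rng.standard_normal((n_neurons, n_timepoints))

        # PCA via the singular value decomposition of mean-centred activity
        centred = rates - rates.mean(axis=1, keepdims=True)
        U, S, _ = np.linalg.svd(centred, full_matrices=False)
        explained = S**2 / np.sum(S**2)
        print("variance captured by 3 modes:", round(float(explained[:3].sum()), 3))

        neural_modes = U[:, :3]                    # directions spanning the manifold
        projection = neural_modes.T @ centred      # activity expressed in the manifold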

  19. Epidemiology of neural tube defects

    National Research Council Canada - National Science Library

    Seidahmed, Mohammed Z; Abdelbasit, Omar B; Shaheed, Meeralebbae M; Alhussein, Khalid A; Miqdad, Abeer M; Khalil, Mohamed I; Al-Enazy, Naif M; Salih, Mustafa A

    2014-01-01

    To find the prevalence of neural tube defects (NTDs), and compare the findings with local and international data, and highlight the important role of folic acid supplementation and flour fortification with folic acid in preventing NTDs...

  20. Neural Networks in Control Applications

    DEFF Research Database (Denmark)

    Sørensen, O.

    The intention of this report is to make a systematic examination of the possibilities of applying neural networks in those technical areas, which are familiar to a control engineer. In other words, the potential of neural networks in control applications is given higher priority than a detailed...... study of the networks themselves. With this end in view the following restrictions have been made: - Amongst numerous neural network structures, only the Multi Layer Perceptron (a feed-forward network) is applied. - Amongst numerous training algorithms, only four algorithms are examined, all...... in a recursive form (sample updating). The simplest is the Back Propagation Error Algorithm, and the most complex is the recursive Prediction Error Method using a Gauss-Newton search direction. - Over-fitting is often considered to be a serious problem when training neural networks. This problem is specifically...

  1. Memristor-based neural networks

    Science.gov (United States)

    Thomas, Andy

    2013-03-01

    The synapse is a crucial element in biological neural networks, but a simple electronic equivalent has been absent. This complicates the development of hardware that imitates biological architectures in the nervous system. Now, the recent progress in the experimental realization of memristive devices has renewed interest in artificial neural networks. The resistance of a memristive system depends on its past states and exactly this functionality can be used to mimic the synaptic connections in a (human) brain. After a short introduction to memristors, we present and explain the relevant mechanisms in a biological neural network, such as long-term potentiation and spike time-dependent plasticity, and determine the minimal requirements for an artificial neural network. We review the implementations of these processes using basic electric circuits and more complex mechanisms that either imitate biological systems or could act as a model system for them.

  2. Neural Networks in Control Applications

    DEFF Research Database (Denmark)

    Sørensen, O.

    simulated process and compared. The closing chapter describes some practical experiments, where the different control concepts and training methods are tested on the same practical process operating in very noisy environments. All tests confirm that neural networks also have the potential to be trained......The intention of this report is to make a systematic examination of the possibilities of applying neural networks in those technical areas, which are familiar to a control engineer. In other words, the potential of neural networks in control applications is given higher priority than a detailed...... study of the networks themselves. With this end in view the following restrictions have been made: - Amongst numerous neural network structures, only the Multi Layer Perceptron (a feed-forward network) is applied. - Amongst numerous training algorithms, only four algorithms are examined, all...

  3. Neural components of altruistic punishment

    Directory of Open Access Journals (Sweden)

    Emily eDu

    2015-02-01

    Full Text Available Altruistic punishment, which occurs when an individual incurs a cost to punish in response to unfairness or a norm violation, may play a role in perpetuating cooperation. The neural correlates underlying costly punishment have only recently begun to be explored. Here we review the current state of research on the neural basis of altruism from the perspectives of costly punishment, emphasizing the importance of characterizing elementary neural processes underlying a decision to punish. In particular, we emphasize three cognitive processes that contribute to the decision to altruistically punish in most scenarios: inequity aversion, cost-benefit calculation, and social reference frame to distinguish self from others. Overall, we argue for the importance of understanding the neural correlates of altruistic punishment with respect to the core computations necessary to achieve a decision to punish.

  4. Complex-Valued Neural Networks

    CERN Document Server

    Hirose, Akira

    2012-01-01

    This book is the second enlarged and revised edition of the first successful monograph on complex-valued neural networks (CVNNs) published in 2006, which lends itself to graduate and undergraduate courses in electrical engineering, informatics, control engineering, mechanics, robotics, bioengineering, and other relevant fields. In the second edition the recent trends in CVNNs research are included, resulting in e.g. almost a doubled number of references. The parametron invented in 1954 is also referred to with discussion on analogy and disparity. Also various additional arguments on the advantages of the complex-valued neural networks enhancing the difference to real-valued neural networks are given in various sections. The book is useful for those beginning their studies, for instance, in adaptive signal processing for highly functional sensing and imaging, control in unknown and changing environment, robotics inspired by human neural systems, and brain-like information processing, as well as interdisciplina...

  5. CHARGEd with neural crest defects.

    Science.gov (United States)

    Pauli, Silke; Bajpai, Ruchi; Borchers, Annette

    2017-10-30

    Neural crest cells are highly migratory pluripotent cells that give rise to diverse derivatives including cartilage, bone, smooth muscle, pigment, and endocrine cells as well as neurons and glia. Abnormalities in neural crest-derived tissues contribute to the etiology of CHARGE syndrome, a complex malformation disorder that encompasses clinical symptoms like coloboma, heart defects, atresia of the choanae, retarded growth and development, genital hypoplasia, ear anomalies, and deafness. Mutations in the chromodomain helicase DNA-binding protein 7 (CHD7) gene are causative of CHARGE syndrome and loss-of-function data in different model systems have firmly established a role of CHD7 in neural crest development. Here, we will summarize our current understanding of the function of CHD7 in neural crest development and discuss possible links of CHARGE syndrome to other developmental disorders. © 2017 Wiley Periodicals, Inc.

  6. Neural components of altruistic punishment.

    Science.gov (United States)

    Du, Emily; Chang, Steve W C

    2015-01-01

    Altruistic punishment, which occurs when an individual incurs a cost to punish in response to unfairness or a norm violation, may play a role in perpetuating cooperation. The neural correlates underlying costly punishment have only recently begun to be explored. Here we review the current state of research on the neural basis of altruism from the perspectives of costly punishment, emphasizing the importance of characterizing elementary neural processes underlying a decision to punish. In particular, we emphasize three cognitive processes that contribute to the decision to altruistically punish in most scenarios: inequity aversion, cost-benefit calculation, and social reference frame to distinguish self from others. Overall, we argue for the importance of understanding the neural correlates of altruistic punishment with respect to the core computations necessary to achieve a decision to punish.

  7. Pansharpening by Convolutional Neural Networks

    National Research Council Canada - National Science Library

    Masi, Giuseppe; Cozzolino, Davide; Verdoliva, Luisa; Scarpa, Giuseppe

    2016-01-01

    A new pansharpening method is proposed, based on convolutional neural networks. We adapt a simple and effective three-layer architecture recently proposed for super-resolution to the pansharpening problem...

  8. What are artificial neural networks?

    DEFF Research Database (Denmark)

    Krogh, Anders

    2008-01-01

    Artificial neural networks have been applied to problems ranging from speech recognition to prediction of protein secondary structure, classification of cancers and gene prediction. How do they work and what might they be good for? Publication date: 2008-Feb.

  9. Indices for Testing Neural Codes

    OpenAIRE

    Jonathan D. Victor; Nirenberg, Sheila

    2008-01-01

    One of the most critical challenges in systems neuroscience is determining the neural code. A principled framework for addressing this can be found in information theory. With this approach, one can determine whether a proposed code can account for the stimulus-response relationship. Specifically, one can compare the transmitted information between the stimulus and the hypothesized neural code with the transmitted information between the stimulus and the behavioral response. If the former is ...

  10. Biologically Inspired Modular Neural Networks

    OpenAIRE

    Azam, Farooq

    2000-01-01

    This dissertation explores modular learning in artificial neural networks, driven mainly by inspiration from the neurobiological basis of human learning. The presented modularization approaches to neural network design and learning are inspired by engineering, complexity, psychological and neurobiological aspects. The main theme of this dissertation is to explore the organization and functioning of the brain to discover new structural and learning ...

  11. Neural-like growing networks

    Science.gov (United States)

    Yashchenko, Vitaliy A.

    2000-03-01

    Based on an analysis of scientific ideas about the structure and functioning of the biological structures of the brain, together with an analysis and synthesis of knowledge developed across several directions in computer science, the foundations of a theory of a new class of neural-like growing networks, with no analogue in world practice, were developed. Neural-like growing networks rest on a synthesis of the knowledge developed by two classical theories: semantic networks and neural networks. The first makes it possible to represent meaning as objects and the connections between them, in accordance with the construction of the network; each meaning thereby becomes a separate component of the network, a node connected to other nodes. This broadly corresponds to the structure reflected in the brain, where each explicit concept is represented by a certain structure and has a designating symbol. Second, such a network gains increased semantic clarity because not only the connections between neural elements are formed, but also the elements themselves; that is, the network is not simply constructed by placing semantic structures in an environment of neural elements, but by creating that environment itself, as an equivalent of a memory environment. Neural-like growing networks therefore provide a convenient apparatus for modeling the mechanisms of teleological thinking as the fulfillment of certain psychophysiological functions.

  12. Flexibility of neural stem cells

    Directory of Open Access Journals (Sweden)

    Eumorphia eRemboutsika

    2011-04-01

    Full Text Available Embryonic cortical neural stem cells are self-renewing progenitors that can differentiate into neurons and glia. We generated neurospheres from the developing cerebral cortex using a mouse genetic model that allows for lineage selection and found that the self-renewing neural stem cells are restricted to Sox2 expressing cells. Under normal conditions, embryonic cortical neurospheres are heterogeneous with regard to Sox2 expression and contain astrocytes, neural stem cells and neural progenitor cells sufficiently plastic to give rise to neural crest cells when transplanted into the hindbrain of E1.5 chick and E8 mouse embryos. However, when neurospheres are maintained under lineage selection, such that all cells express Sox2, neural stem cells maintain their Pax6+ cortical radial glia identity and exhibit a more restricted fate in vitro and after transplantation. These data demonstrate that Sox2 preserves the cortical identity and regulates the plasticity of self-renewing Pax6+ radial glia cells.

  13. Spiking modular neural networks: A neural network modeling approach for hydrological processes

    National Research Council Canada - National Science Library

    Kamban Parasuraman; Amin Elshorbagy; Sean K. Carey

    2006-01-01

    .... In this study, a novel neural network model called the spiking modular neural networks (SMNNs) is proposed. An SMNN consists of an input layer, a spiking layer, and an associator neural network layer...

  14. Influence of neural adaptation on dynamics and equilibrium state of neural activities in a ring neural network

    Science.gov (United States)

    Takiyama, Ken

    2017-12-01

    How neural adaptation affects neural information processing (i.e. the dynamics and equilibrium state of neural activities) is a central question in computational neuroscience. In my previous works, I analytically clarified the dynamics and equilibrium state of neural activities in a ring-type neural network model that is widely used to model the visual cortex, motor cortex, and several other brain regions. The neural dynamics and the equilibrium state in the neural network model corresponded to a Bayesian computation and statistically optimal multiple information integration, respectively, under a biologically inspired condition. These results were revealed in an analytically tractable manner; however, adaptation effects were not considered. Here, I analytically reveal how the dynamics and equilibrium state of neural activities in a ring neural network are influenced by spike-frequency adaptation (SFA). SFA is an adaptation that causes gradual inhibition of neural activity when a sustained stimulus is applied, and the strength of this inhibition depends on neural activities. I reveal that SFA plays three roles: (1) SFA amplifies the influence of external input in neural dynamics; (2) SFA allows the history of the external input to affect neural dynamics; and (3) the equilibrium state corresponds to the statistically optimal multiple information integration independent of the existence of SFA. In addition, the equilibrium state in a ring neural network model corresponds to the statistically optimal integration of multiple information sources under biologically inspired conditions, independent of the existence of SFA.
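
    A coarse sketch of rate dynamics on a ring with spike-frequency adaptation: each unit is tuned to an angle, receives local excitation and broad inhibition from the rest of the ring, and accumulates an adaptation variable that is subtracted from its drive. The connectivity profile and all constants are illustrative choices, not taken from the paper.

        import numpy as np

        n = 100
        theta = np.linspace(0, 2 * np.pi, n, endpoint=False)
        # local excitation, broad inhibition on the ring
        W = 2.0 * np.exp(np.cos(theta[:, None] - theta[None, :]) - 1.0) - 1.0

        stimulus = np.exp(np.cos(theta - np.pi))   # bump-shaped external input
        r = np.zeros(n)                            # firing rates
        a = np.zeros(n)                            # adaptation variables

        dt, tau_r, tau_a, g = 0.1, 1.0, 10.0, 0.5
        for _ in range(2000):
            drive = W @ r / n + stimulus - g * a
            r += dt / tau_r * (-r + np.maximum(drive, 0.0))   # rectified rate dynamics
            a += dt / tau_a * (-a + r)                        # SFA tracks the firing rate

        print("activity bump peaks at angle:", round(float(theta[np.argmax(r)]), 2))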

  15. Fractional Hopfield Neural Networks: Fractional Dynamic Associative Recurrent Neural Networks.

    Science.gov (United States)

    Pu, Yi-Fei; Yi, Zhang; Zhou, Ji-Liu

    2017-10-01

    This paper mainly discusses a novel conceptual framework: fractional Hopfield neural networks (FHNN). As is commonly known, fractional calculus has been incorporated into artificial neural networks, mainly because of its long-term memory and nonlocality. Some researchers have made interesting attempts at fractional neural networks and gained competitive advantages over integer-order neural networks. Therefore, it naturally makes one ponder how to generalize the first-order Hopfield neural networks to the fractional-order ones, and how to implement FHNN by means of fractional calculus. We propose to introduce a novel mathematical method: fractional calculus to implement FHNN. First, we implement the fractor in the form of an analog circuit. Second, we implement FHNN by utilizing the fractor and the fractional steepest descent approach, construct its Lyapunov function, and further analyze its attractors. Third, we perform experiments to analyze the stability and convergence of FHNN, and further discuss its applications to the defense against chip cloning attacks for anticounterfeiting. The main contribution of our work is to propose FHNN in the form of an analog circuit by utilizing a fractor and the fractional steepest descent approach, construct its Lyapunov function, prove its Lyapunov stability, analyze its attractors, and apply FHNN to the defense against chip cloning attacks for anticounterfeiting. A significant advantage of FHNN is that its attractors essentially relate to the neuron's fractional order. FHNN possesses the fractional-order-stability and fractional-order-sensitivity characteristics.
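
    For orientation, here is a sketch of the ordinary integer-order Hopfield network that FHNN generalizes: bipolar units, symmetric Hebbian weights, and asynchronous updates that never increase the network energy. The fractional-order machinery of the paper (the fractor circuit and the fractional steepest descent approach) is not reproduced here.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100
        patterns = rng.choice([-1, 1], size=(5, n))          # 5 stored memories

        W = sum(np.outer(p, p) for p in patterns) / n
        np.fill_diagonal(W, 0.0)                             # no self-connections

        def energy(s):
            return -0.5 * s @ W @ s

        def recall(s, sweeps=10):
            s = s.copy()
            for _ in range(sweeps):
                for i in rng.permutation(n):                 # asynchronous updates
                    s[i] = 1 if W[i] @ s >= 0 else -1
            return s

        probe = patterns[0].copy()
        probe[:15] *= -1                                     # corrupt 15 of 100 bits
        out = recall(probe)
        print("overlap with stored pattern:", int(out @ patterns[0]), "/", n)
        print("energy before/after:", round(float(energy(probe)), 2), round(float(energy(out)), 2))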

  16. Myelin plasticity, neural activity, and traumatic neural injury.

    Science.gov (United States)

    Kondiles, Bethany R; Horner, Philip J

    2018-02-01

    The possibility that adult organisms exhibit myelin plasticity has recently become a topic of great interest. Many researchers are exploring the role of myelin growth and adaptation in daily functions such as memory and motor learning. Here we consider evidence for three different potential categories of myelin plasticity: the myelination of previously bare axons, remodeling of existing sheaths, and the removal of a sheath with replacement by a new internode. We also review evidence that points to the importance of neural activity as a mechanism by which oligodendrocyte precursor cells (OPCs) are cued to differentiate into myelinating oligodendrocytes, which may potentially be an important component of myelin plasticity. Finally, we discuss demyelination in the context of traumatic neural injury and present an argument for altering neural activity as a potential therapeutic target for remyelination following injury. © 2017 Wiley Periodicals, Inc. Develop Neurobiol 78: 108-122, 2018. © 2017 Wiley Periodicals, Inc.

  17. Multigradient for Neural Networks for Equalizers

    Directory of Open Access Journals (Sweden)

    Chulhee Lee

    2003-06-01

    Full Text Available Recently, a new training algorithm, multigradient, has been published for neural networks, and it is reported that multigradient outperforms backpropagation when neural networks are used as classifiers. When neural networks are used as equalizers in communications, they can be viewed as classifiers. In this paper, we apply the multigradient algorithm to train neural networks that are used as equalizers. Experiments show that the neural networks trained using multigradient noticeably outperform the neural networks trained by backpropagation.
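
    A sketch of the equalization setting only: a small feed-forward network trained with plain backpropagation (the baseline the abstract compares against) learns to recover BPSK symbols from a channel with intersymbol interference and noise. The channel taps, noise level, and network size are arbitrary, and the multigradient update itself is not reproduced here because its details are not given in the abstract.

        import numpy as np

        rng = np.random.default_rng(0)
        bits = rng.choice([-1.0, 1.0], size=5000)            # transmitted BPSK symbols
        channel = np.array([0.3, 0.9, 0.3])                  # intersymbol interference taps
        received = np.convolve(bits, channel, mode="same") + 0.2 * rng.standard_normal(bits.size)

        # input = a 5-sample window of the received signal, target = the centre bit
        X = np.stack([received[i - 2:i + 3] for i in range(2, bits.size - 2)])
        y = bits[2:-2]

        W1 = 0.1 * rng.standard_normal((5, 8))
        b1 = np.zeros(8)
        w2 = 0.1 * rng.standard_normal(8)
        b2 = 0.0
        lr = 0.01

        for epoch in range(5):
            for xi, yi in zip(X, y):
                h = np.tanh(xi @ W1 + b1)                    # hidden layer
                out = np.tanh(h @ w2 + b2)                   # equalizer output
                # backpropagation of the squared error
                d_out = (out - yi) * (1 - out**2)
                d_h = d_out * w2 * (1 - h**2)
                w2 -= lr * d_out * h
                b2 -= lr * d_out
                W1 -= lr * np.outer(xi, d_h)
                b1 -= lr * d_h

        pred = np.sign(np.tanh(np.tanh(X @ W1 + b1) @ w2 + b2))
        print("bit error rate:", float(np.mean(pred != y)))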

  18. Understanding perception through neural "codes".

    Science.gov (United States)

    Freeman, Walter J

    2011-07-01

    A major challenge for cognitive scientists is to deduce and explain the neural mechanisms of the rapid transposition between stimulus energy and recalled memory-between the specific (sensation) and the generic (perception)-in both material and mental aspects. Researchers are attempting three explanations in terms of neural codes. The microscopic code: cellular neurobiologists correlate stimulus properties with the rates and frequencies of trains of action potentials induced by stimuli and carried by topologically organized axons. The mesoscopic code: cognitive scientists formulate symbolic codes in trains of action potentials from feature-detector neurons of phonemes, lines, odorants, vibrations, faces, etc., that object-detector neurons bind into representations of stimuli. The macroscopic code: neurodynamicists extract neural correlates of stimuli and associated behaviors in spatial patterns of oscillatory fields of dendritic activity, which self-organize and evolve on trajectories through high-dimensional brain state space. This multivariate code is expressed in landscapes of chaotic attractors. Unlike other scientific codes, such as DNA and the periodic table, these neural codes have no alphabet or syntax. They are epistemological metaphors that experimentalists need to measure neural activity and engineers need to model brain functions. My aim is to describe the main properties of the macroscopic code and the grand challenge it poses: how do very large patterns of textured synchronized oscillations form in cortex so quickly? © 2010 IEEE

  19. Neural correlates of stimulus reportability.

    Science.gov (United States)

    Hulme, Oliver J; Friston, Karl F; Zeki, Semir

    2009-08-01

    Most experiments on the "neural correlates of consciousness" employ stimulus reportability as an operational definition of what is consciously perceived. The interpretation of such experiments therefore depends critically on understanding the neural basis of stimulus reportability. Using a high volume of fMRI data, we investigated the neural correlates of stimulus reportability using a partial report object detection paradigm. Subjects were presented with a random array of circularly arranged disc-stimuli and were cued, after variable delays (following stimulus offset), to report the presence or absence of a disc at the cued location, using variable motor actions. By uncoupling stimulus processing, decision, and motor response, we were able to use signal detection theory to deconstruct the neural basis of stimulus reportability. We show that retinotopically specific responses in the early visual cortex correlate with stimulus processing but not decision or report; a network of parietal/temporal regions correlates with decisions but not stimulus presence, whereas classical motor regions correlate with report. These findings provide a basic framework for understanding the neural basis of stimulus reportability without the theoretical burden of presupposing a relationship between reportability and consciousness.
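
    The central signal-detection quantity behind such an analysis can be written compactly: sensitivity d' is the difference between the z-transformed hit rate and false-alarm rate. The counts in the example below are invented and are not the study's data.

        from statistics import NormalDist

        def dprime(hits, misses, false_alarms, correct_rejections):
            # d' = z(hit rate) - z(false-alarm rate); larger means better detection
            z = NormalDist().inv_cdf
            hit_rate = hits / (hits + misses)
            fa_rate = false_alarms / (false_alarms + correct_rejections)
            return z(hit_rate) - z(fa_rate)

        print(round(dprime(hits=80, misses=20, false_alarms=10, correct_rejections=90), 2))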

  20. Neural Approaches to Machine Consciousness

    Science.gov (United States)

    Aleksander, Igor; Eng., F. R.

    2008-10-01

    `Machine Consciousness', which some years ago might have been suppressed as an inappropriate pursuit, has come out of the closet and is now a legitimate area of research concern. This paper briefly surveys the last few years of worldwide research in this area which divides into rule-based and neural approaches and then reviews the work of the author's laboratory during the last ten years. The paper develops a fresh perspective on this work: it is argued that neural approaches, in this case, digital neural systems, can address phenomenological consciousness. Important clarifications of phenomenology and virtuality which enter this modelling are explained in the early parts of the paper. In neural models, phenomenology is a form of depictive inner representation that has five specific axiomatic features: a sense of self-presence in an external world; a sense of imagination of past experience and fiction; a sense of attention; a capacity for planning; a sense of emotion-based volition that influences planning. It is shown that these five features have separate but integrated support in dynamic neural systems.

  1. Neural recording and modulation technologies

    Science.gov (United States)

    Chen, Ritchie; Canales, Andres; Anikeeva, Polina

    2017-01-01

    In the mammalian nervous system, billions of neurons connected by quadrillions of synapses exchange electrical, chemical and mechanical signals. Disruptions to this network manifest as neurological or psychiatric conditions. Despite decades of neuroscience research, our ability to treat or even to understand these conditions is limited by the capability of tools to probe the signalling complexity of the nervous system. Although orders of magnitude smaller and computationally faster than neurons, conventional substrate-bound electronics do not recapitulate the chemical and mechanical properties of neural tissue. This mismatch results in a foreign-body response and the encapsulation of devices by glial scars, suggesting that the design of an interface between the nervous system and a synthetic sensor requires additional materials innovation. Advances in genetic tools for manipulating neural activity have fuelled the demand for devices that are capable of simultaneously recording and controlling individual neurons at unprecedented scales. Recently, flexible organic electronics and bio- and nanomaterials have been developed for multifunctional and minimally invasive probes for long-term interaction with the nervous system. In this Review, we discuss the design lessons from the quarter-century-old field of neural engineering, highlight recent materials-driven progress in neural probes and look at emergent directions inspired by the principles of neural transduction.

  2. Seeding neural progenitor cells on silicon-based neural probes.

    Science.gov (United States)

    Azemi, Erdrin; Gobbel, Glenn T; Cui, Xinyan Tracy

    2010-09-01

    Chronically implanted neural electrode arrays have the potential to be used as neural prostheses in patients with various neurological disorders. While these electrodes perform well in acute recordings, they often fail to function reliably in clinically relevant chronic settings because of glial encapsulation and the loss of neurons. Surface modification of these implants may provide a means of improving their biocompatibility and integration within host brain tissue. The authors proposed a method of improving the brain-implant interface by seeding the implant's surface with a layer of neural progenitor cells (NPCs) derived from adult murine subependyma. Neural progenitor cells may reduce the foreign body reaction by presenting a tissue-friendly surface and repair implant-induced injury and inflammation by releasing neurotrophic factors. In this study, the authors evaluated the growth and differentiation of NPCs on laminin-immobilized probe surfaces and explored the potential impact on transplant survival of these cells. Laminin protein was successfully immobilized on the silicon surface via covalent binding using silane chemistry. The growth, adhesion, and differentiation of NPCs expressing green fluorescent protein (GFP) on laminin-modified silicon surfaces were characterized in vitro by using immunocytochemical techniques. Shear forces were applied to NPC cultures in growth medium to evaluate their shearing properties. In addition, neural probes seeded with GFP-labeled NPCs cultured in growth medium for 14 days were implanted in murine cortex. The authors assessed the adhesion properties of these cells during implantation conditions. Moreover, the tissue response around NPC-seeded implants was observed after 1 and 7 days postimplantation. Significantly improved NPC attachment and growth was found on the laminin-immobilized surface compared with an unmodified control before and after shear force application. The NPCs grown on the laminin-immobilized surface

  3. Neural networks and statistical learning

    CERN Document Server

    Du, Ke-Lin

    2014-01-01

    Providing a broad but in-depth introduction to neural network and machine learning in a statistical framework, this book provides a single, comprehensive resource for study and further research. All the major popular neural network models and statistical learning approaches are covered with examples and exercises in every chapter to develop a practical working understanding of the content. Each of the twenty-five chapters includes state-of-the-art descriptions and important research results on the respective topics. The broad coverage includes the multilayer perceptron, the Hopfield network, associative memory models, clustering models and algorithms, the radial basis function network, recurrent neural networks, principal component analysis, nonnegative matrix factorization, independent component analysis, discriminant analysis, support vector machines, kernel methods, reinforcement learning, probabilistic and Bayesian networks, data fusion and ensemble learning, fuzzy sets and logic, neurofuzzy models, hardw...

  4. Multiprocessor Neural Network in Healthcare.

    Science.gov (United States)

    Godó, Zoltán Attila; Kiss, Gábor; Kocsis, Dénes

    2015-01-01

    A possible way of creating a multiprocessor artificial neural network is by the use of microcontrollers. The RISC processors' high performance and the large number of I/O ports mean they are well suited for creating such a system. During our research, we wanted to see if it is possible to efficiently create interaction between the artificial neural network and the natural nervous system. To achieve as much analogy to the living nervous system as possible, we created a frequency-modulated analog connection between the units. Our system is connected to the living nervous system through 128 microelectrodes. Two-way communication is provided through A/D transformation, which is even capable of testing psychopharmacons. The microcontroller-based analog artificial neural network can play a great role in medical signal processing, such as ECG and EEG.

  5. Principles of neural information processing

    CERN Document Server

    Seelen, Werner v

    2016-01-01

    In this fundamental book the authors devise a framework that describes the working of the brain as a whole. It presents a comprehensive introduction to the principles of Neural Information Processing as well as recent and authoritative research. The book's guiding principles are the main purpose of neural activity, namely, to organize behavior to ensure survival, as well as the understanding of the evolutionary genesis of the brain. Among the developed principles and strategies belong self-organization of neural systems, flexibility, the active interpretation of the world by means of construction and prediction as well as their embedding into the world, all of which form the framework of the presented description. Since, in brains, their partial self-organization, the lifelong adaptation and their use of various methods of processing incoming information are all interconnected, the authors have chosen not only neurobiology and evolution theory as a basis for the elaboration of such a framework, but also syst...

  6. Performance sustaining intracortical neural prostheses

    Science.gov (United States)

    Nuyujukian, Paul; Kao, Jonathan C.; Fan, Joline M.; Stavisky, Sergey D.; Ryu, Stephen I.; Shenoy, Krishna V.

    2014-12-01

    Objective. Neural prostheses, or brain-machine interfaces, aim to restore efficient communication and movement ability to those suffering from paralysis. A major challenge these systems face is robust performance, particularly with aging signal sources. The aim in this study was to develop a neural prosthesis that could sustain high performance in spite of signal instability while still minimizing retraining time. Approach. We trained two rhesus macaques implanted with intracortical microelectrode arrays 1-4 years prior to this study to acquire targets with a neurally-controlled cursor. We measured their performance via achieved bitrate (bits per second, bps). This task was repeated over contiguous days to evaluate the sustained performance across time. Main results. We found that in the monkey with a younger (i.e., two year old) implant and better signal quality, a fixed decoder could sustain performance for a month at a rate of 4 bps, the highest achieved communication rate reported to date. This fixed decoder was evaluated across 22 months and experienced a performance decline at a rate of 0.24 bps yr-1. In the monkey with the older (i.e., 3.5 year old) implant and poorer signal quality, a fixed decoder could not sustain performance for more than a few days. Nevertheless, performance in this monkey was maintained for two weeks without requiring additional online retraining time by utilizing prior days’ experimental data. Upon analysis of the changes in channel tuning, we found that this stability appeared partially attributable to the cancelling-out of neural tuning fluctuations when projected to two-dimensional cursor movements. Significance. The findings in this study (1) document the highest-performing communication neural prosthesis in monkeys, (2) confirm and extend prior reports of the stability of fixed decoders, and (3) demonstrate a protocol for system stability under conditions where fixed decoders would otherwise fail. These improvements to decoder

  7. Neural Decoder for Topological Codes

    Science.gov (United States)

    Torlai, Giacomo; Melko, Roger G.

    2017-07-01

    We present an algorithm for error correction in topological codes that exploits modern machine learning techniques. Our decoder is constructed from a stochastic neural network called a Boltzmann machine, of the type extensively used in deep learning. We provide a general prescription for the training of the network and a decoding strategy that is applicable to a wide variety of stabilizer codes with very little specialization. We demonstrate the neural decoder numerically on the well-known two-dimensional toric code with phase-flip errors.

  8. The neural cell adhesion molecule

    DEFF Research Database (Denmark)

    Berezin, V; Bock, E; Poulsen, F M

    2000-01-01

    During the past year, the understanding of the structure and function of neural cell adhesion has advanced considerably. The three-dimensional structures of several of the individual modules of the neural cell adhesion molecule (NCAM) have been determined, as well as the structure of the complex...... between two identical fragments of the NCAM. Also during the past year, a link between homophilic cell adhesion and several signal transduction pathways has been proposed, connecting the event of cell surface adhesion to cellular responses such as neurite outgrowth. Finally, the stimulation of neurite...

  9. Generalization performance of regularized neural network models

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai

    1994-01-01

    Architecture optimization is a fundamental problem of neural network modeling. The optimal architecture is defined as the one which minimizes the generalization error. This paper addresses estimation of the generalization performance of regularized, complete neural network models. Regularization...
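
    In its simplest form, the object under study is a training cost with a weight-decay penalty whose strength trades training error against generalization error. The sketch below uses a linear model with a closed-form ridge solution purely as a stand-in for a regularized network model; the data are synthetic.

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.standard_normal((50, 10))                    # 50 examples, 10 inputs
        y = X @ rng.standard_normal(10) + 0.5 * rng.standard_normal(50)

        def regularized_cost(w, lam):
            # mean squared error plus a weight-decay penalty lam * ||w||^2
            residual = X @ w - y
            return float(np.mean(residual**2) + lam * np.dot(w, w))

        # the ridge solution minimizes the regularized cost in closed form
        lam = 0.1
        w_hat = np.linalg.solve(X.T @ X + lam * len(y) * np.eye(10), X.T @ y)
        print("regularized training cost:", round(regularized_cost(w_hat, lam), 3))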

  10. Voltage compensation using artificial neural network

    African Journals Online (AJOL)

    Offor Theophilos

    VOLTAGE COMPENSATION USING ARTIFICIAL NEURAL NETWORK: A CASE STUDY OF RUMUOLA ... using artificial neural network (ANN) controller-based dynamic voltage restorer (DVR). ... substation by simulating with sample of average voltage for Omerelu, Waterlines, Rumuola, Shell Industrial and Barracks.

  11. Plant Growth Models Using Artificial Neural Networks

    Science.gov (United States)

    Bubenheim, David

    1997-01-01

    In this paper, we describe our motivation and approach to developing models and the neural network architecture. Initial use of the artificial neural network for modeling the single plant process of transpiration is presented.

  12. Neural circulatory control in vasovagal syncope

    NARCIS (Netherlands)

    van Lieshout, J. J.; Wieling, W.; Karemaker, J. M.

    1997-01-01

    The orthostatic volume displacement associated with the upright position necessitates effective neural cardiovascular modulation. Neural control of cardiac chronotropy and inotropy, and vasomotor tone aims at maintaining venous return, thus opposing gravitational pooling of blood in the lower part

  13. Neural overlap in processing music and speech

    Science.gov (United States)

    Peretz, Isabelle; Vuvan, Dominique; Lagrois, Marie-Élaine; Armony, Jorge L.

    2015-01-01

    Neural overlap in processing music and speech, as measured by the co-activation of brain regions in neuroimaging studies, may suggest that parts of the neural circuitries established for language may have been recycled during evolution for musicality, or vice versa that musicality served as a springboard for language emergence. Such a perspective has important implications for several topics of general interest besides evolutionary origins. For instance, neural overlap is an important premise for the possibility of music training to influence language acquisition and literacy. However, neural overlap in processing music and speech does not entail sharing neural circuitries. Neural separability between music and speech may occur in overlapping brain regions. In this paper, we review the evidence and outline the issues faced in interpreting such neural data, and argue that converging evidence from several methodologies is needed before neural overlap is taken as evidence of sharing. PMID:25646513

  14. The neural crest and neural crest cells: discovery and significance ...

    Indian Academy of Sciences (India)

    PRAKASH KUMAR

    such as sea urchins, flies, fish and humans. (ii) Embryos (and so larvae and adults) form by differentiation from these germ layers. (iii) Homologous structures in different animals arise from the same germ layers. The germ-layer theory exerted a profound influence on those claiming a neural crest — that is, an ectodermal.

  15. MEMBRAIN NEURAL NETWORK FOR VISUAL PATTERN RECOGNITION

    Directory of Open Access Journals (Sweden)

    Artur Popko

    2013-06-01

    Full Text Available Recognition of visual patterns is one of the significant applications of Artificial Neural Networks, which partially emulate human thinking in the domain of artificial intelligence. In the paper, a simplified neural approach to recognition of visual patterns is portrayed and discussed. This paper is dedicated to investigators in visual pattern recognition, Artificial Neural Networking and related disciplines. The document also describes the MemBrain application environment as a powerful and easy-to-use neural network editor and simulator supporting ANN.

  16. Radioactive fallout and neural tube defects

    African Journals Online (AJOL)

    Nejat Akar

    2015-07-10

    Jul 10, 2015 ... Neural tube defects; Anencephaly; Spina bifida. Abstract: A possible link between radioactivity and the occurrence of neural tube defects is a long-lasting debate ... Neural tube defects are one of the common congenital malformations ... ent cities of Turkey (İzmir/Aegean Region; Trabzon/Black Sea region ...

  17. Analysis of neural networks through base functions

    NARCIS (Netherlands)

    van der Zwaag, B.J.; Slump, Cornelis H.; Spaanenburg, L.

    Problem statement. Despite their success-story, neural networks have one major disadvantage compared to other techniques: the inability to explain comprehensively how a trained neural network reaches its output; neural networks are not only (incorrectly) seen as a "magic tool" but possibly even more

  18. Simplified LQG Control with Neural Networks

    DEFF Research Database (Denmark)

    Sørensen, O.

    1997-01-01

    A new neural network application for non-linear state control is described. One neural network is modelled to form a Kalman predictor and trained to act as an optimal state observer for a non-linear process. Another neural network is modelled to form a state controller and trained to produce...
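
    A minimal sketch of the observer half of this two-network scheme, under assumed conditions: a toy non-linear plant and a small network trained as a one-step-ahead (Kalman-like) predictor of the plant output; the paper's controller training is not reproduced, and all names and sizes are illustrative.

        import numpy as np

        # Assumed toy non-linear plant: x(k+1) = 0.8*sin(x(k)) + 0.5*u(k), y = x + noise
        rng = np.random.default_rng(1)
        N = 500
        u = rng.uniform(-1, 1, N)
        x = np.zeros(N + 1)
        for k in range(N):
            x[k + 1] = 0.8 * np.sin(x[k]) + 0.5 * u[k]
        y = x[:-1] + 0.05 * rng.standard_normal(N)

        # Observer network: a one-step-ahead predictor of the plant output from
        # (current measured output, current input), i.e. a Kalman-like state observer.
        H = 8
        W1 = 0.3 * rng.standard_normal((2, H)); b1 = np.zeros(H)
        W2 = 0.3 * rng.standard_normal((H, 1)); b2 = np.zeros(1)
        Z = np.column_stack([y[:-1], u[:-1]])       # network inputs
        t = y[1:].reshape(-1, 1)                    # targets: next measured output

        lr = 0.05
        for _ in range(3000):
            h = np.tanh(Z @ W1 + b1)
            p = h @ W2 + b2
            e = p - t
            gW2 = h.T @ e / len(Z); gb2 = e.mean(0)
            dh = (e @ W2.T) * (1 - h ** 2)
            gW1 = Z.T @ dh / len(Z); gb1 = dh.mean(0)
            W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

        print("observer one-step prediction MSE:", float(np.mean((p - t) ** 2)))
        # A second network, mapping the observer's estimate to a control signal and
        # trained against a quadratic (LQG-type) cost, would play the controller role.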

  19. Novel quantum inspired binary neural network algorithm

    Indian Academy of Sciences (India)

    In this paper, a quantum based binary neural network algorithm is proposed, named as novel quantum binary neural network algorithm (NQ-BNN). It forms a neural network structure by deciding weights and separability parameter in quantum based manner. Quantum computing concept represents solution probabilistically ...

  20. Degenerate coding in neural systems.

    Science.gov (United States)

    Leonardo, Anthony

    2005-11-01

    When the dimensionality of a neural circuit is substantially larger than the dimensionality of the variable it encodes, many different degenerate network states can produce the same output. In this review I will discuss three different neural systems that are linked by this theme. The pyloric network of the lobster, the song control system of the zebra finch, and the odor encoding system of the locust, while different in design, all contain degeneracies between their internal parameters and the outputs they encode. Indeed, although the dynamics of song generation and odor identification are quite different, computationally, odor recognition can be thought of as running the song generation circuitry backwards. In both of these systems, degeneracy plays a vital role in mapping a sparse neural representation devoid of correlations onto external stimuli (odors or song structure) that are strongly correlated. I argue that degeneracy between input and output states is an inherent feature of many neural systems, which can be exploited as a fault-tolerant method of reliably learning, generating, and discriminating closely related patterns.

  1. Optoelectronic Implementation of Neural Networks

    Indian Academy of Sciences (India)

    Optoelectronic Implementation of Neural Networks - Use of Optics in Computing. R Ramachandran. General Article, Resonance – Journal of Science Education, Volume 3, Issue 9, September 1998, pp. 45-55.

  2. Aphasia Classification Using Neural Networks

    DEFF Research Database (Denmark)

    Axer, H.; Jantzen, Jan; Berks, G.

    2000-01-01

    A web-based software model (http://fuzzy.iau.dtu.dk/aphasia.nsf) was developed as an example for classification of aphasia using neural networks. Two multilayer perceptrons were used to classify the type of aphasia (Broca, Wernicke, anomic, global) according to the results in some subtests...

  3. Memory Storage and Neural Systems.

    Science.gov (United States)

    Alkon, Daniel L.

    1989-01-01

    Investigates memory storage and molecular nature of associative-memory formation by analyzing Pavlovian conditioning in marine snails and rabbits. Presented is the design of a computer-based memory system (neural networks) using the rules acquired in the investigation. Reports that the artificial network recognized patterns well. (YP)

  4. Nanomaterial-enabled neural stimulation

    Directory of Open Access Journals (Sweden)

    Yongchen eWang

    2016-03-01

    Full Text Available Neural stimulation is a critical technique in treating neurological diseases and investigating brain functions. Traditional electrical stimulation uses electrodes to directly create intervening electric fields in the immediate vicinity of neural tissues. Second-generation stimulation techniques directly use light, magnetic fields or ultrasound in a non-contact manner. An emerging generation of non- or minimally invasive neural stimulation techniques is enabled by nanotechnology to achieve a high spatial resolution and cell-type specificity. In these techniques, a nanomaterial converts a remotely transmitted primary stimulus such as a light, magnetic or ultrasonic signal to a localized secondary stimulus such as an electric field or heat to stimulate neurons. The ease of surface modification and bio-conjugation of nanomaterials facilitates cell-type-specific targeting, designated placement and highly localized membrane activation. This review focuses on nanomaterial-enabled neural stimulation techniques primarily involving opto-electric, opto-thermal, magneto-electric, magneto-thermal and acousto-electric transduction mechanisms. Stimulation techniques based on other possible transduction schemes and general consideration for these emerging neurotechnologies are also discussed.

  5. Neural Control of the Circulation

    Science.gov (United States)

    Thomas, Gail D.

    2011-01-01

    The purpose of this brief review is to highlight key concepts about the neural control of the circulation that graduate and medical students should be expected to incorporate into their general knowledge of human physiology. The focus is largely on the sympathetic nerves, which have a dominant role in cardiovascular control due to their effects to…

  6. Classification using Bayesian neural nets

    NARCIS (Netherlands)

    J.C. Bioch (Cor); O. van der Meer; R. Potharst (Rob)

    1995-01-01

    Recently, Bayesian methods have been proposed for neural networks to solve regression and classification problems. These methods claim to overcome some difficulties encountered in the standard approach such as overfitting. However, an implementation of the full Bayesian approach to

  7. Vitamins and neural tube defects

    OpenAIRE

    Harris, Rodney

    1988-01-01

    The use of vitamin supplements by women around the time of conception was examined and compared in those having babies with neural tube defects, those with stillbirths or some other type of malformation, and in women who had normal babies.

  8. Neural Mechanisms of Conceptual Relations

    Science.gov (United States)

    Lewis, Gwyneth A.

    2017-01-01

    An over-arching goal in neurolinguistic research is to characterize the neural bases of semantic representation. A particularly relevant goal concerns whether we represent features and events (a) together in a generalized semantic hub or (b) separately in distinct but complementary systems. While the left anterior temporal lobe (ATL) is strongly…

  9. Neural mechanisms for voice recognition

    NARCIS (Netherlands)

    Andics, A.V.; McQueen, J.M.; Petersson, K.M.; Gal, V.; Rudas, G.; Vidnyanszky, Z.

    2010-01-01

    We investigated neural mechanisms that support voice recognition in a training paradigm with fMRI. The same listeners were trained on different weeks to categorize the mid-regions of voice-morph continua as an individual's voice. Stimuli implicitly defined a voice-acoustics space, and training

  10. Serotonin, neural markers and memory

    Directory of Open Access Journals (Sweden)

    Alfredo eMeneses

    2015-07-01

    Full Text Available Diverse neuropsychiatric disorders present dysfunctional memory, and no effective treatment exists for them, likely as a result of the absence of neural markers associated with memory. Neurotransmitter systems and signaling pathways have been implicated in memory and dysfunctional memory; however, their role is poorly understood. Hence, neural markers and cerebral functions and dysfunctions are reviewed. To our knowledge, no previous systematic works have been published addressing these issues. The interactions among behavioral tasks, control groups, and molecular changes and/or pharmacological effects are mentioned. Neurotransmitter receptors and signaling pathways during normal and abnormally functioning memory are reviewed, with an emphasis on the behavioral aspects of memory. The focus is on serotonin, since it is a well-characterized neurotransmitter with multiple pharmacological tools and well-characterized downstream signaling in mammalian species. 5-HT1A, 5-HT4, 5-HT5, 5-HT6 and 5-HT7 receptors, as well as SERT (the serotonin transporter), seem to be useful neural markers and/or therapeutic targets. Certainly, if the mentioned evidence is replicated, then the translatability from preclinical and clinical studies to neural changes might be confirmed. Hypotheses and theories might provide appropriate limits and perspectives of evidence

  11. Non-invasive neural stimulation

    Science.gov (United States)

    Tyler, William J.; Sanguinetti, Joseph L.; Fini, Maria; Hool, Nicholas

    2017-05-01

    Neurotechnologies for non-invasively interfacing with neural circuits have been evolving from those capable of sensing neural activity to those capable of restoring and enhancing human brain function. Generally referred to as non-invasive neural stimulation (NINS) methods, these neuromodulation approaches rely on electrical, magnetic, photonic, and acoustic or ultrasonic energy to influence nervous system activity, brain function, and behavior. Evidence that has been mounting for decades shows that advanced neural engineering of NINS technologies will indeed transform the way humans treat diseases, interact with information, communicate, and learn. The physics underlying the ability of various NINS methods to modulate nervous system activity can be quite different from one another depending on the energy modality used, as we briefly discuss. For members of commercial and defense industry sectors that have not traditionally engaged in neuroscience research and development, the science, engineering and technology required to advance NINS methods beyond the state-of-the-art presents tremendous opportunities. Within the past few years alone there have been large increases in global investments made by federal agencies, foundations, private investors and multinational corporations to develop advanced applications of NINS technologies. Driven by these efforts, NINS methods and devices have recently been introduced to mass markets via the consumer electronics industry. Further, NINS continues to be explored in a growing number of defense applications focused on enhancing human dimensions. The present paper provides a brief introduction to the field of non-invasive neural stimulation by highlighting some of the more common methods in use or under current development today.

  12. Neural networks and applications tutorial

    Science.gov (United States)

    Guyon, I.

    1991-09-01

    The importance of neural networks has grown dramatically during this decade. While only a few years ago they were primarily of academic interest, now dozens of companies and many universities are investigating the potential use of these systems and products are beginning to appear. The idea of building a machine whose architecture is inspired by that of the brain has roots which go far back in history. Nowadays, technological advances of computers and the availability of custom integrated circuits permit simulations of hundreds or even thousands of neurons. In conjunction, the growing interest in learning machines, non-linear dynamics and parallel computation spurred renewed attention to artificial neural networks. Many tentative applications have been proposed, including decision systems (associative memories, classifiers, data compressors and optimizers), or parametric models for signal processing purposes (system identification, automatic control, noise canceling, etc.). While they do not always outperform standard methods, neural network approaches are already used in some real world applications for pattern recognition and signal processing tasks. The tutorial is divided into six lectures, which were presented at the Third Graduate Summer Course on Computational Physics (September 3-7, 1990) on Parallel Architectures and Applications, organized by the European Physical Society: (1) Introduction: machine learning and biological computation. (2) Adaptive artificial neurons (perceptron, ADALINE, sigmoid units, etc.): learning rules and implementations. (3) Neural network systems: architectures, learning algorithms. (4) Applications: pattern recognition, signal processing, etc. (5) Elements of learning theory: how to build networks which generalize. (6) A case study: a neural network for on-line recognition of handwritten alphanumeric characters.
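
    A minimal sketch of the classical perceptron learning rule listed under lecture (2); the toy linearly separable data, learning rate and epoch count are illustrative assumptions.

        import numpy as np

        # Toy linearly separable data: class is the sign of x1 + x2
        rng = np.random.default_rng(2)
        X = rng.uniform(-1, 1, size=(100, 2))
        t = np.where(X.sum(axis=1) > 0, 1, -1)

        w = np.zeros(2)
        b = 0.0
        eta = 0.1                          # learning rate (assumed)
        for _ in range(20):                # epochs
            for xi, ti in zip(X, t):
                yi = 1 if xi @ w + b > 0 else -1
                if yi != ti:               # perceptron rule: correct only on mistakes
                    w += eta * ti * xi
                    b += eta * ti

        print("training accuracy:", float(np.mean(np.sign(X @ w + b) == t)))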

  13. Dynamic properties of cellular neural networks

    Directory of Open Access Journals (Sweden)

    Angela Slavova

    1993-01-01

    Full Text Available The dynamic behavior of a new class of information-processing systems called Cellular Neural Networks is investigated. In this paper we introduce a small parameter in the state equation of a cellular neural network and seek periodic phenomena. A new approach is used for proving stability of a cellular neural network by constructing Lyapunov's majorizing equations. This algorithm is helpful for finding a map from the initial continuous state space of a cellular neural network into a discrete output. A comparison between cellular neural networks and cellular automata is made.
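
    A minimal sketch of the standard cellular neural network state equation, dx/dt = -x + A*y(x) + B*u + z with the piecewise-linear output y(x) = 0.5(|x+1| - |x-1|), integrated with simple Euler steps; the templates A and B, the bias z and the random input image are illustrative assumptions, not the networks analyzed in the paper.

        import numpy as np

        def out(x):
            # Standard piecewise-linear CNN output function
            return 0.5 * (np.abs(x + 1) - np.abs(x - 1))

        def conv3x3(img, T):
            # 'Same'-size correlation with a 3x3 template, zero padding
            p = np.pad(img, 1)
            r = np.zeros_like(img)
            for i in range(3):
                for j in range(3):
                    r += T[i, j] * p[i:i + img.shape[0], j:j + img.shape[1]]
            return r

        rng = np.random.default_rng(3)
        u = rng.uniform(-1, 1, (16, 16))                              # input image
        x = np.zeros_like(u)                                          # cell states
        A = np.array([[0, 0.1, 0], [0.1, 1.0, 0.1], [0, 0.1, 0]])     # feedback template (assumed)
        B = np.array([[0, 0, 0], [0, 1.0, 0], [0, 0, 0]])             # control template (assumed)
        z = 0.0                                                       # cell bias

        dt = 0.05
        for _ in range(400):        # Euler integration of dx/dt = -x + A*y(x) + B*u + z
            x += dt * (-x + conv3x3(out(x), A) + conv3x3(u, B) + z)

        print("state range after settling:", float(x.min()), float(x.max()))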

  14. Micro- and Nanotechnologies for Optical Neural Interfaces

    Science.gov (United States)

    Pisanello, Ferruccio; Sileo, Leonardo; De Vittorio, Massimo

    2016-01-01

    In the last decade, the possibility of optically interfacing with the mammalian brain in vivo has allowed unprecedented investigation of the functional connectivity of neural circuitry. Together with new genetic and molecular techniques to optically trigger and monitor neural activity, a new generation of optical neural interfaces is being developed, mainly thanks to the exploitation of both bottom-up and top-down nanofabrication approaches. This review highlights the role of nanotechnologies for optical neural interfaces, with particular emphasis on new devices and methodologies for optogenetic control of neural activity and unconventional methods for detection and triggering of action potentials using optically-active colloidal nanoparticles. PMID:27013939

  15. Spike Neural Models Part II: Abstract Neural Models

    OpenAIRE

    Johnson, Melissa G.; Chartier, Sylvain

    2018-01-01

    Neurons are complex cells that require a lot of time and resources to model completely. In spiking neural networks (SNN) though, not all that complexity is required. Therefore simple, abstract models are often used. These models save time, use less computer resources, and are easier to understand. This tutorial presents two such models: Izhikevich's model, which is biologically realistic in the resulting spike trains but not in the parameters, and the Leaky Integrate and Fire (LIF) model whic...

  16. Genetic attack on neural cryptography.

    Science.gov (United States)

    Ruttor, Andreas; Kinzel, Wolfgang; Naeh, Rivka; Kanter, Ido

    2006-03-01

    Different scaling properties for the complexity of bidirectional synchronization and unidirectional learning are essential for the security of neural cryptography. Incrementing the synaptic depth of the networks increases the synchronization time only polynomially, but the success of the geometric attack is reduced exponentially and it clearly fails in the limit of infinite synaptic depth. This method is improved by adding a genetic algorithm, which selects the fittest neural networks. The probability of a successful genetic attack is calculated for different model parameters using numerical simulations. The results show that scaling laws observed in the case of other attacks hold for the improved algorithm, too. The number of networks needed for an effective attack grows exponentially with increasing synaptic depth. In addition, finite-size effects caused by Hebbian and anti-Hebbian learning are analyzed. These learning rules converge to the random walk rule if the synaptic depth is small compared to the square root of the system size.
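
    A minimal sketch of the tree parity machines and Hebbian mutual learning that neural cryptography is built on; the sizes K, N and the synaptic depth L are illustrative assumptions, and neither the geometric nor the genetic attack is implemented here.

        import numpy as np

        K, N, L = 3, 10, 3           # hidden units, inputs per unit, synaptic depth (assumed)
        rng = np.random.default_rng(4)

        def output(W, x):
            # sigma: sign of each hidden unit's local field; tau: their product
            sigma = np.sign(np.sum(W * x, axis=1))
            sigma[sigma == 0] = -1
            return sigma, int(np.prod(sigma))

        def hebbian(W, x, sigma, tau):
            # Update only the hidden units that agree with the common output tau
            for k in range(K):
                if sigma[k] == tau:
                    W[k] = np.clip(W[k] + tau * x[k], -L, L)

        WA = rng.integers(-L, L + 1, (K, N))       # party A's weights
        WB = rng.integers(-L, L + 1, (K, N))       # party B's weights

        for steps in range(1, 200001):
            x = rng.choice([-1, 1], (K, N))        # public random input vector
            sA, tA = output(WA, x)
            sB, tB = output(WB, x)
            if tA == tB:                           # both parties learn only when outputs agree
                hebbian(WA, x, sA, tA)
                hebbian(WB, x, sB, tB)
            if np.array_equal(WA, WB):
                print("weights synchronized after", steps, "exchanged inputs")
                break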

  17. Neural plasticity across the lifespan.

    Science.gov (United States)

    Power, Jonathan D; Schlaggar, Bradley L

    2017-01-01

    An essential feature of the brain is its capacity to change. Neuroscientists use the term 'plasticity' to describe the malleability of neuronal connectivity and circuitry. How does plasticity work? A review of current data suggests that plasticity encompasses many distinct phenomena, some of which operate across most or all of the lifespan, and others that operate exclusively in early development. This essay surveys some of the key concepts related to neural plasticity, beginning with how current patterns of neural activity (e.g., as you read this essay) come to impact future patterns of activity (e.g., your memory of this essay), and then extending this framework backward into more development-specific mechanisms of plasticity. WIREs Dev Biol 2017, 6:e216. doi: 10.1002/wdev.216 For further resources related to this article, please visit the WIREs website. © 2016 Wiley Periodicals, Inc.

  18. Neural Networks Methodology and Applications

    CERN Document Server

    Dreyfus, Gérard

    2005-01-01

    Neural networks represent a powerful data processing technique that has reached maturity and broad application. When clearly understood and appropriately used, they are a mandatory component in the toolbox of any engineer who wants to make the best use of the available data, in order to build models, make predictions, mine data, recognize shapes or signals, etc. Ranging from theoretical foundations to real-life applications, this book is intended to provide engineers and researchers with clear methodologies for taking advantage of neural networks in industrial, financial or banking applications, many instances of which are presented in the book. For the benefit of readers wishing to gain deeper knowledge of the topics, the book features appendices that provide theoretical details for greater insight, and algorithmic details for efficient programming and implementation. The chapters have been written by experts and seamlessly edited to present a coherent and comprehensive, yet not redundant, practically-oriented...

  19. Autonomic neural functions in space.

    Science.gov (United States)

    Mano, T

    2005-08-01

    Autonomic neural functions are important to regulate vital functions in the living body. There are different methods to evaluate, indirectly and directly, autonomic (sympathetic and parasympathetic) neural functions of the human body. Among these methods, microneurography is a technique to directly evaluate sympathetic neural functions in humans. Using this technique, sympathetic neural traffic leading to skeletal muscles (muscle sympathetic nerve activity; MSNA) can be recorded from human peripheral nerves in situ. MSNA plays essentially important roles in maintaining blood pressure homeostasis against gravity. Orthostatic intolerance is an important autonomic dysfunction encountered after exposure of human beings to microgravity. There exist at least two different types of sympathetic neural responses, low and high responders to orthostatic stress, in the orthostatic hypotension seen in neurological disorders. To answer the question of whether post-spaceflight orthostatic intolerance is induced by low or high MSNA responses to orthostatic stress, MSNA was microneurographically recorded for the first time before, during and after spaceflight in 1998 under the Neurolab international research project. The same activity has been recorded during and/or after ground-based short- and long-term simulations of microgravity. MSNA was rather enhanced on the 12th and 13th days of spaceflight and just after landing day. Postflight MSNA response to head-up tilt was well preserved in astronauts who were orthostatically well tolerant. MSNA was suppressed during short-term simulation of microgravity of less than 2 hours but was enhanced after long-term simulation of microgravity of more than 3 days. Orthostatic intolerance after exposure to long-term simulation of microgravity was associated with reduced MSNA response to orthostatic stress with impaired baroreflex functions. These findings obtained from MSNA recordings in subjects exposed to space as well as short- and long-term simulations of

  20. Adaptive Regularization of Neural Classifiers

    DEFF Research Database (Denmark)

    Andersen, Lars Nonboe; Larsen, Jan; Hansen, Lars Kai

    1997-01-01

    We present a regularization scheme which iteratively adapts the regularization parameters by minimizing the validation error. It is suggested to use the adaptive regularization scheme in conjunction with optimal brain damage pruning to optimize the architecture and to avoid overfitting. Furthermore......, we propose an improved neural classification architecture eliminating an inherent redundancy in the widely used SoftMax classification network. Numerical results demonstrate the viability of the method...
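
    A minimal sketch of the general idea of adapting a regularization parameter to reduce validation error; the closed-form ridge fit used as a stand-in for network training and the simple multiplicative search are assumptions, not the authors' scheme.

        import numpy as np

        rng = np.random.default_rng(5)
        # Toy regression data in which most inputs are irrelevant
        X = rng.standard_normal((120, 20))
        w_true = np.concatenate([rng.standard_normal(5), np.zeros(15)])
        y = X @ w_true + 0.5 * rng.standard_normal(120)
        Xtr, ytr, Xva, yva = X[:80], y[:80], X[80:], y[80:]

        def fit(lam):
            # Closed-form weight-decay (ridge) solution, standing in for network training
            d = Xtr.shape[1]
            return np.linalg.solve(Xtr.T @ Xtr + lam * np.eye(d), Xtr.T @ ytr)

        def val_err(w):
            return float(np.mean((Xva @ w - yva) ** 2))

        lam = 1.0
        best = val_err(fit(lam))
        for _ in range(30):
            # Try scaling the regularization parameter up and down;
            # keep whichever change reduces the validation error.
            for factor in (0.7, 1.4):
                cand = lam * factor
                e = val_err(fit(cand))
                if e < best:
                    lam, best = cand, e

        print("adapted weight decay:", lam, "validation MSE:", best)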

  1. Functional neural anatomy of talent.

    Science.gov (United States)

    Kalbfleisch, M Layne

    2004-03-01

    The terms gifted, talented, and intelligent all have meanings that suggest an individual's highly proficient or exceptional performance in one or more specific areas of strength. Other than Spearman's g, which theorizes about a general elevated level of potential or ability, more contemporary theories of intelligence are based on theoretical models that define ability or intelligence according to a priori categories of specific performance. Recent studies in cognitive neuroscience report on the neural basis of g from various perspectives such as the neural speed theory and the efficiency of prefrontal function. Exceptional talent is the result of interactions between goal-directed behavior and nonvolitional perceptual processes in the brain that have yet to be fully characterized and understood by the fields of psychology and cognitive neuroscience. Some developmental studies report differences in region-specific neural activation, recruitment patterns, and reaction times in subjects who are identified with high IQ scores according to traditional scales of assessment such as the WISC-III or Stanford-Binet. Although as cases of savants and prodigies illustrate, talent is not synonymous with high IQ. This review synthesizes information from the fields of psychometrics and gifted education, with findings from the neurosciences on the neural basis of intelligence, creativity, profiles of expert performers, cognitive function, and plasticity to suggest a paradigm for investigating talent as the maximal and productive use of either or both of one's high level of general intelligence or domain-specific ability. Anat Rec (Part B: New Anat) 277B:21-36, 2004. Copyright 2004 Wiley-Liss, Inc.

  2. Handbook on neural information processing

    CERN Document Server

    Maggini, Marco; Jain, Lakhmi

    2013-01-01

    This handbook presents some of the most recent topics in neural information processing, covering both theoretical concepts and practical applications. The contributions include: deep architectures; recurrent, recursive, and graph neural networks; cellular neural networks; Bayesian networks; approximation capabilities of neural networks; semi-supervised learning; statistical relational learning; kernel methods for structured data; multiple classifier systems; self-organisation and modal learning; applications to ...

  3. [Glutamate signaling and neural plasticity].

    Science.gov (United States)

    Watanabe, Masahiko

    2013-07-01

    Proper functioning of the nervous system relies on the precise formation of neural circuits during development. At birth, neurons have redundant synaptic connections not only to their proper targets but also to other neighboring cells. Then, functional neural circuits are formed during early postnatal development by the selective strengthening of necessary synapses and weakening of surplus connections. Synaptic connections are also modified so that projection fields of active afferents expand at the expense of lesser ones. We have studied the molecular mechanisms underlying these activity-dependent prunings and the plasticity of synaptic circuitry using gene-engineered mice defective in the glutamatergic signaling system. NMDA-type glutamate receptors are critically involved in the establishment of the somatosensory pathway ascending from the brainstem trigeminal nucleus to the somatosensory cortex. Without NMDA receptors, whisker-related patterning fails to develop, whereas lesion-induced plasticity occurs normally during the critical period. In contrast, mice lacking the glutamate transporters GLAST or GLT1 are selectively impaired in the lesion-induced critical plasticity of cortical barrels, although whisker-related patterning itself develops normally. In the developing cerebellum, multiple climbing fibers initially innervating given Purkinje cells are eliminated one by one until mono-innervation is achieved. In this pruning process, P/Q-type Ca2+ channels expressed on Purkinje cells are critically involved by the selective strengthening of single main climbing fibers against other lesser afferents. Therefore, the activation of glutamate receptors that leads to an activity-dependent increase in the intracellular Ca2+ concentration plays a key role in the pruning of immature synaptic circuits into functional circuits. On the other hand, glutamate transporters appear to control activity-dependent plasticity among afferent fields, presumably through adjusting

  4. Neural prostheses and brain plasticity

    Science.gov (United States)

    Fallon, James B.; Irvine, Dexter R. F.; Shepherd, Robert K.

    2009-12-01

    The success of modern neural prostheses is dependent on a complex interplay between the devices' hardware and software and the dynamic environment in which the devices operate: the patient's body or 'wetware'. Over 120 000 severe/profoundly deaf individuals presently receive information enabling auditory awareness and speech perception from cochlear implants. The cochlear implant therefore provides a useful case study for a review of the complex interactions between hardware, software and wetware, and of the important role of the dynamic nature of wetware. In the case of neural prostheses, the most critical component of that wetware is the central nervous system. This paper will examine the evidence of changes in the central auditory system that contribute to changes in performance with a cochlear implant, and discuss how these changes relate to electrophysiological and functional imaging studies in humans. The relationship between the human data and evidence from animals of the remarkable capacity for plastic change of the central auditory system, even into adulthood, will then be examined. Finally, we will discuss the role of brain plasticity in neural prostheses in general.

  5. Neural correlates of paediatric dysgraphia.

    Science.gov (United States)

    Van Hoorn, Jessika F; Maathuis, Carel G B; Hadders-Algra, Mijna

    2013-11-01

    Writing is an important skill that is related both to school performance and to psychosocial outcomes such as the child's self-esteem. Deficits in handwriting performance are frequently encountered in children with developmental coordination disorder. This review focuses on what is known about the neural correlates of atypical handwriting in children. Knowledge of the neural correlates is derived from studies using clinical case designs (e.g. lesion studies), studies using neuroimaging, and assessment of minor neurological dysfunction. The two functional imaging studies suggest a contribution of cortical areas and the cerebellum. The largest study indicated that cortical areas in all regions of the brain are involved (frontal, temporal, parietal, and occipital). The two lesion studies confirmed cerebellar involvement. The findings of the study on minor neurological dysfunction in children with writing problems correspond to the imaging results. The limited data on the neural substrate of paediatric dysgraphia suggest that at least a subset of the children with dysgraphia have dysfunctions in extensive supraspinal networks. In others, dysfunction may be restricted to either the cerebellum or specific cortical sites. © The Authors. Developmental Medicine & Child Neurology © 2013 Mac Keith Press.

  6. Three dimensional living neural networks

    Science.gov (United States)

    Linnenberger, Anna; McLeod, Robert R.; Basta, Tamara; Stowell, Michael H. B.

    2015-08-01

    We investigate holographic optical tweezing combined with step-and-repeat maskless projection micro-stereolithography for fine control of 3D positioning of living cells within a 3D microstructured hydrogel grid. Samples were fabricated using three different cell lines: PC12, NT2/D1 and iPSC. PC12 cells are a rat cell line capable of differentiation into neuron-like cells. NT2/D1 cells are a human cell line that exhibits biochemical and developmental properties similar to those of an early embryo; when exposed to retinoic acid, the cells differentiate into human neurons useful for studies of human neurological disease. Finally, induced pluripotent stem cells (iPSC) were utilized with the goal of future studies of neural networks fabricated from human iPSC-derived neurons. Cells are positioned in the monomer solution with holographic optical tweezers at 1064 nm and then are encapsulated by photopolymerization of polyethylene glycol (PEG) hydrogels formed by thiol-ene photo-click chemistry via projection of a 512x512 spatial light modulator (SLM) illuminated at 405 nm. Fabricated samples are incubated in differentiation media such that cells cease to divide and begin to form axons or axon-like structures. By controlling the position of the cells within the encapsulating hydrogel structure the formation of the neural circuits is controlled. The samples fabricated with this system are a useful model for future studies of neural circuit formation, neurological disease, cellular communication, plasticity, and repair mechanisms.

  7. Neural mechanisms of social dominance

    Directory of Open Access Journals (Sweden)

    Noriya eWatanabe

    2015-06-01

    Full Text Available In a group setting, individuals’ perceptions of their own level of dominance or of the dominance level of others, and the ability to adequately control their behavior based on these perceptions are crucial for living within a social environment. Recent advances in neural imaging and molecular technology have enabled researchers to investigate the neural substrates that support the perception of social dominance and the formation of a social hierarchy in humans. At the systems’ level, recent studies showed that dominance perception is represented in broad brain regions which include the amygdala, hippocampus, striatum, and various cortical networks such as the prefrontal, and parietal cortices. Additionally, neurotransmitter systems such as the dopaminergic and serotonergic systems, modulate and are modulated by the formation of the social hierarchy in a group. While these monoamine systems have a wide distribution and multiple functions, it was recently found that the Neuropeptide B/W contributes to the perception of dominance and is present in neurons that have a limited projection primarily to the amygdala. The present review discusses the specific roles of these neural regions and neurotransmitter systems in the perception of dominance and in hierarchy formation.

  8. Neural Correlates of Predictive Saccades.

    Science.gov (United States)

    Lee, Stephen M; Peltsch, Alicia; Kilmade, Maureen; Brien, Donald C; Coe, Brian C; Johnsrude, Ingrid S; Munoz, Douglas P

    2016-08-01

    Every day we generate motor responses that are timed with external cues. This phenomenon of sensorimotor synchronization has been simplified and studied extensively using finger tapping sequences that are executed in synchrony with auditory stimuli. The predictive saccade paradigm closely resembles the finger tapping task. In this paradigm, participants follow a visual target that "steps" between two fixed locations on a visual screen at predictable ISIs. Eventually, the time from target appearance to saccade initiation (i.e., saccadic RT) becomes predictive with values nearing 0 msec. Unlike the finger tapping literature, neural control of predictive behavior described within the eye movement literature has not been well established and is inconsistent, especially between neuroimaging and patient lesion studies. To resolve these discrepancies, we used fMRI to investigate the neural correlates of predictive saccades by contrasting brain areas involved with behavior generated from the predictive saccade task with behavior generated from a reactive saccade task (saccades are generated toward targets that are unpredictably timed). We observed striking differences in neural recruitment between reactive and predictive conditions: Reactive saccades recruited oculomotor structures, as predicted, whereas predictive saccades recruited brain structures that support timing in motor responses, such as the crus I of the cerebellum, and structures commonly associated with the default mode network. Therefore, our results were more consistent with those found in the finger tapping literature.

  9. Central neural pathways for thermoregulation

    Science.gov (United States)

    Morrison, Shaun F.; Nakamura, Kazuhiro

    2010-01-01

    Central neural circuits orchestrate a homeostatic repertoire to maintain body temperature during environmental temperature challenges and to alter body temperature during the inflammatory response. This review summarizes the functional organization of the neural pathways through which cutaneous thermal receptors alter thermoregulatory effectors: the cutaneous circulation for heat loss, the brown adipose tissue, skeletal muscle and heart for thermogenesis and species-dependent mechanisms (sweating, panting and saliva spreading) for evaporative heat loss. These effectors are regulated by parallel but distinct, effector-specific neural pathways that share a common peripheral thermal sensory input. The thermal afferent circuits include cutaneous thermal receptors, spinal dorsal horn neurons and lateral parabrachial nucleus neurons projecting to the preoptic area to influence warm-sensitive, inhibitory output neurons which control thermogenesis-promoting neurons in the dorsomedial hypothalamus that project to premotor neurons in the rostral ventromedial medulla, including the raphe pallidus, that descend to provide the excitation necessary to drive thermogenic thermal effectors. A distinct population of warm-sensitive preoptic neurons controls heat loss through an inhibitory input to raphe pallidus neurons controlling cutaneous vasoconstriction. PMID:21196160

  10. Fuzzy neural networks: theory and applications

    Science.gov (United States)

    Gupta, Madan M.

    1994-10-01

    During recent years, significant advances have been made in two distinct technological areas: fuzzy logic and computational neural networks. The theory of fuzzy logic provides a mathematical framework to capture the uncertainties associated with human cognitive processes, such as thinking and reasoning. It also provides a mathematical morphology to emulate certain perceptual and linguistic attributes associated with human cognition. On the other hand, the computational neural network paradigms have evolved in the process of understanding the incredible learning and adaptive features of neuronal mechanisms inherent in certain biological species. Computational neural networks replicate, on a small scale, some of the computational operations observed in biological learning and adaptation. The integration of these two fields, fuzzy logic and neural networks, has given birth to an emerging technological field -- fuzzy neural networks. Fuzzy neural networks have the potential to capture the benefits of these two fascinating fields, fuzzy logic and neural networks, into a single framework. The intent of this tutorial paper is to describe the basic notions of biological and computational neuronal morphologies, and to describe the principles and architectures of fuzzy neural networks. Towards this goal, we develop a fuzzy neural architecture based upon the notion of T-norm and T-conorm connectives. An error-based learning scheme is described for this neural structure.
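
    A minimal sketch of a fuzzy neuron built from min as a T-norm and max as a T-conorm; the triangular membership functions, inputs and weights are illustrative assumptions, not the architecture developed in the paper.

        import numpy as np

        def tri(x, a, b, c):
            # Triangular membership function on [a, c] peaking at b
            return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        def fuzzy_neuron(memberships, weights):
            # Combine each input with its weight by min (T-norm),
            # then aggregate across inputs by max (T-conorm).
            return np.max(np.minimum(memberships, weights))

        # Two crisp inputs mapped to fuzzy membership degrees
        x1, x2 = 0.3, 0.7
        mu = np.array([tri(x1, 0.0, 0.25, 0.5),     # degree of "x1 is low-medium"
                       tri(x2, 0.5, 0.75, 1.0)])    # degree of "x2 is high"
        w = np.array([0.8, 0.6])                    # connection weights in [0, 1] (assumed)

        print("membership degrees:", mu)
        print("fuzzy neuron output:", float(fuzzy_neuron(mu, w)))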

  11. Satellite image analysis using neural networks

    Science.gov (United States)

    Sheldon, Roger A.

    1990-01-01

    The tremendous backlog of unanalyzed satellite data necessitates the development of improved methods for data cataloging and analysis. Ford Aerospace has developed an image analysis system, SIANN (Satellite Image Analysis using Neural Networks) that integrates the technologies necessary to satisfy NASA's science data analysis requirements for the next generation of satellites. SIANN will enable scientists to train a neural network to recognize image data containing scenes of interest and then rapidly search data archives for all such images. The approach combines conventional image processing technology with recent advances in neural networks to provide improved classification capabilities. SIANN allows users to proceed through a four step process of image classification: filtering and enhancement, creation of neural network training data via application of feature extraction algorithms, configuring and training a neural network model, and classification of images by application of the trained neural network. A prototype experimentation testbed was completed and applied to climatological data.

  12. Neural plasticity of development and learning.

    Science.gov (United States)

    Galván, Adriana

    2010-06-01

    Development and learning are powerful agents of change across the lifespan that induce robust structural and functional plasticity in neural systems. An unresolved question in developmental cognitive neuroscience is whether development and learning share the same neural mechanisms associated with experience-related neural plasticity. In this article, I outline the conceptual and practical challenges of this question, review insights gleaned from adult studies, and describe recent strides toward examining this topic across development using neuroimaging methods. I suggest that development and learning are not two completely separate constructs and instead, that they exist on a continuum. While progressive and regressive changes are central to both, the behavioral consequences associated with these changes are closely tied to the existing neural architecture of maturity of the system. Eventually, a deeper, more mechanistic understanding of neural plasticity will shed light on behavioral changes across development and, more broadly, about the underlying neural basis of cognition. (c) 2010 Wiley-Liss, Inc.

  13. Neurosecurity: security and privacy for neural devices.

    Science.gov (United States)

    Denning, Tamara; Matsuoka, Yoky; Kohno, Tadayoshi

    2009-07-01

    An increasing number of neural implantable devices will become available in the near future due to advances in neural engineering. This discipline holds the potential to improve many patients' lives dramatically by offering improved, and in some cases entirely new, forms of rehabilitation for conditions ranging from missing limbs to degenerative cognitive diseases. The use of standard engineering practices, medical trials, and neuroethical evaluations during the design process can create systems that are safe and that follow ethical guidelines; unfortunately, none of these disciplines currently ensure that neural devices are robust against adversarial entities trying to exploit these devices to alter, block, or eavesdrop on neural signals. The authors define "neurosecurity", a version of computer science security principles and methods applied to neural engineering, and discuss why neurosecurity should be a critical consideration in the design of future neural devices.

  14. Flexible and Organic Neural Interfaces: A Review

    Directory of Open Access Journals (Sweden)

    Nicolò Lago

    2017-12-01

    Full Text Available Neural interfaces are a fundamental tool to interact with neurons and to study neural networks by transducing cellular signals into electronic signals and vice versa. State-of-the-art technologies allow both in vivo and in vitro recording of neural activity. However, they are mainly made of stiff inorganic materials that can limit the long-term stability of the implant due to infection and/or glial scar formation. In the last decade, organic electronics has been digging its way into the field of bioelectronics, and researchers have started to develop neural interfaces based on organic semiconductors, creating more flexible and conformable neural interfaces that can be intrinsically biocompatible. In this manuscript, we review the latest achievements in flexible and organic neural interfaces for the recording of neuronal activity.

  15. Rod-Shaped Neural Units for Aligned 3D Neural Network Connection.

    Science.gov (United States)

    Kato-Negishi, Midori; Onoe, Hiroaki; Ito, Akane; Takeuchi, Shoji

    2017-08-01

    This paper proposes neural tissue units with aligned nerve fibers (called rod-shaped neural units) that connect neural networks with aligned neurons. To make the proposed units, 3D fiber-shaped neural tissues covered with a calcium alginate hydrogel layer are prepared with a microfluidic system and are cut in an accurate and reproducible manner. These units have aligned nerve fibers inside the hydrogel layer and connectable points on both ends. By connecting the units with a poly(dimethylsiloxane) guide, 3D neural tissues can be constructed and maintained for more than two weeks of culture. In addition, neural networks can be formed between the different neural units via synaptic connections. Experimental results indicate that the proposed rod-shaped neural units are effective tools for the construction of spatially complex connections with aligned nerve fibers in vitro. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  16. Person Movement Prediction Using Neural Networks

    OpenAIRE

    Vintan, Lucian; Gellert, Arpad; Petzold, Jan; Ungerer, Theo

    2006-01-01

    Ubiquitous systems use context information to adapt appliance behavior to human needs. Even more convenience is reached if the appliance foresees the user's desires and acts proactively. This paper proposes neural prediction techniques to anticipate a person's next movement. We focus on neural predictors (multi-layer perceptron with back-propagation learning) with and without pre-training. The optimal configuration of the neural network is determined by evaluating movement sequences of real p...
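
    A minimal sketch of next-movement prediction with a small multi-layer perceptron trained by batch back-propagation; the synthetic four-room movement sequence, the two-step input window and the network size are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(6)
        rooms = 4
        # Synthetic habit: the person mostly cycles room 0 -> 1 -> 2 -> 3 -> 0 ...
        seq = []
        for t in range(600):
            skip = int(rng.random() < 0.1)          # occasional deviation from the habit
            seq.append((t + skip) % rooms)

        def onehot(i):
            v = np.zeros(rooms); v[i] = 1.0; return v

        # Input: the last two visited rooms (one-hot), target: the next room
        X = np.array([np.concatenate([onehot(seq[t - 2]), onehot(seq[t - 1])])
                      for t in range(2, len(seq))])
        T = np.array([onehot(seq[t]) for t in range(2, len(seq))])

        H = 12
        W1 = 0.3 * rng.standard_normal((2 * rooms, H)); b1 = np.zeros(H)
        W2 = 0.3 * rng.standard_normal((H, rooms)); b2 = np.zeros(rooms)

        def softmax(a):
            e = np.exp(a - a.max(axis=1, keepdims=True))
            return e / e.sum(axis=1, keepdims=True)

        lr = 0.1
        for _ in range(2000):                       # plain batch back-propagation
            h = np.tanh(X @ W1 + b1)
            p = softmax(h @ W2 + b2)
            e = (p - T) / len(X)                    # cross-entropy gradient at the output
            gW2 = h.T @ e; gb2 = e.sum(0)
            dh = (e @ W2.T) * (1 - h ** 2)
            gW1 = X.T @ dh; gb1 = dh.sum(0)
            W1 -= lr * gW1; b1 -= lr * gb1; W2 -= lr * gW2; b2 -= lr * gb2

        h = np.tanh(X @ W1 + b1)
        p = softmax(h @ W2 + b2)
        print("next-room training accuracy:", float(np.mean(p.argmax(1) == T.argmax(1))))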

  17. Neural crest contributions to the lamprey head

    Science.gov (United States)

    McCauley, David W.; Bronner-Fraser, Marianne

    2003-01-01

    The neural crest is a vertebrate-specific cell population that contributes to the facial skeleton and other derivatives. We have performed focal DiI injection into the cranial neural tube of the developing lamprey in order to follow the migratory pathways of discrete groups of cells from origin to destination and to compare neural crest migratory pathways in a basal vertebrate to those of gnathostomes. The results show that the general pathways of cranial neural crest migration are conserved throughout the vertebrates, with cells migrating in streams analogous to the mandibular and hyoid streams. Caudal branchial neural crest cells migrate ventrally as a sheet of cells from the hindbrain and super-pharyngeal region of the neural tube and form a cylinder surrounding a core of mesoderm in each pharyngeal arch, similar to that seen in zebrafish and axolotl. In addition to these similarities, we also uncovered important differences. Migration into the presumptive caudal branchial arches of the lamprey involves both rostral and caudal movements of neural crest cells that have not been described in gnathostomes, suggesting that barriers that constrain rostrocaudal movement of cranial neural crest cells may have arisen after the agnathan/gnathostome split. Accordingly, neural crest cells from a single axial level contributed to multiple arches and there was extensive mixing between populations. There was no apparent filling of neural crest derivatives in a ventral-to-dorsal order, as has been observed in higher vertebrates, nor did we find evidence of a neural crest contribution to cranial sensory ganglia. These results suggest that migratory constraints and additional neural crest derivatives arose later in gnathostome evolution.

  18. Pediatric Nutritional Requirements Determination with Neural Networks

    OpenAIRE

    Karlık, Bekir; Ece, Aydın

    1998-01-01

    To calculate daily nutritional requirements of children, a computer program has been developed based upon a neural network. Three parameters, daily protein, energy and water requirements, were calculated through trained artificial neural networks using a database of 312 children. The results were compared with those calculated from the dietary requirements tables of the World Health Organisation. No significant difference was found between the two calculations. In conclusion, a simple neural network may ...

  19. Adaptive optimization and control using neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Mead, W.C.; Brown, S.K.; Jones, R.D.; Bowling, P.S.; Barnes, C.W.

    1993-10-22

    Recent work has demonstrated the ability of neural-network-based controllers to optimize and control machines with complex, non-linear, relatively unknown control spaces. We present a brief overview of neural networks via a taxonomy illustrating some capabilities of different kinds of neural networks. We present some successful control examples, particularly the optimization and control of a small-angle negative ion source.

  20. Initialization of multilayer forecasting artifical neural networks

    OpenAIRE

    Bochkarev, Vladimir V.; Maslennikova, Yulia S.

    2014-01-01

    In this paper, a new method was developed for initialising artificial neural networks predicting dynamics of time series. Initial weighting coefficients were determined for neurons analogously to the case of a linear prediction filter. Moreover, to improve the accuracy of the initialization method for a multilayer neural network, some variants of decomposition of the transformation matrix corresponding to the linear prediction filter were suggested. The efficiency of the proposed neural netwo...
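
    A minimal sketch of the initialization idea: linear-prediction (AR) coefficients are fitted to a toy series by least squares and used as the starting weights of a linear forecasting neuron, which ordinary training would then refine; the lag order and series are assumptions, and the paper's matrix decomposition for deeper layers is not reproduced.

        import numpy as np

        rng = np.random.default_rng(7)
        # Toy time series: a noisy second-order oscillation
        n, p = 400, 4                              # series length and lag order (assumed)
        s = np.zeros(n)
        for t in range(2, n):
            s[t] = 1.6 * s[t - 1] - 0.9 * s[t - 2] + 0.1 * rng.standard_normal()

        # Lagged design matrix: predict s[t] from the previous p samples
        X = np.array([s[t - p:t] for t in range(p, n)])
        y = s[p:n]

        # Linear prediction filter coefficients by least squares
        a, *_ = np.linalg.lstsq(X, y, rcond=None)

        # Use the filter coefficients as the initial weights of a linear output neuron
        w_init = a.copy()
        w_rand = 0.1 * rng.standard_normal(p)
        print("initial MSE, LP-based weights:", float(np.mean((X @ w_init - y) ** 2)))
        print("initial MSE, random weights:  ", float(np.mean((X @ w_rand - y) ** 2)))
        # Back-propagation training would start from w_init; the paper's idea of
        # decomposing the filter could analogously seed hidden layers.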

  1. Texture Based Image Analysis With Neural Nets

    Science.gov (United States)

    Ilovici, Irina S.; Ong, Hoo-Tee; Ostrander, Kim E.

    1990-03-01

    In this paper, we combine direct image statistics and spatial frequency domain techniques with a neural net model to analyze texture based images. The resultant optimal texture features obtained from the direct and transformed image form the exemplar pattern of the neural net. The proposed approach introduces an automated texture analysis applied to metallography for determining the cooling rate and mechanical working of the materials. The results suggest that the proposed method enhances the practical applications of neural nets and texture extraction features.
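
    A minimal sketch of combining direct image statistics with spatial-frequency (FFT) features into a texture feature vector of the kind that could serve as an exemplar pattern for a neural net; the particular statistics and synthetic patches are assumptions, and the classifier itself is omitted.

        import numpy as np

        def texture_features(patch):
            # Direct statistics of the grey-level patch
            direct = [patch.mean(), patch.std(), np.abs(np.diff(patch, axis=1)).mean()]
            # Spatial-frequency features: energy in low and high radial bands of the FFT
            F = np.abs(np.fft.fftshift(np.fft.fft2(patch))) ** 2
            h, w = patch.shape
            yy, xx = np.mgrid[0:h, 0:w]
            r = np.hypot(yy - h / 2, xx - w / 2)
            low, high = F[r < h / 8].sum(), F[r >= h / 8].sum()
            return np.array(direct + [low / (low + high), high / (low + high)])

        rng = np.random.default_rng(8)
        smooth = rng.standard_normal((32, 32)).cumsum(axis=1) / 6    # coarse texture
        rough = rng.standard_normal((32, 32))                        # fine texture
        print("coarse patch features:", np.round(texture_features(smooth), 3))
        print("fine patch features:  ", np.round(texture_features(rough), 3))
        # Feature vectors like these would serve as the exemplar patterns of the net.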

  2. Cognitive Control Signals for Neural Prosthetics

    National Research Council Canada - National Science Library

    S. Musallam; B. D. Corneil; B. Greger; H. Scherberger; R. A. Andersen

    2004-01-01

    Recent development of neural prosthetics for assisting paralyzed patients has focused on decoding intended hand trajectories from motor cortical neurons and using this signal to control external devices...

  3. NeuroMEMS: Neural Probe Microtechnologies

    Directory of Open Access Journals (Sweden)

    Sam Musallam

    2008-10-01

    Full Text Available Neural probe technologies have already had a significant positive effect on our understanding of the brain by revealing the functioning of networks of biological neurons. Probes are implanted in different areas of the brain to record and/or stimulate specific sites in the brain. Neural probes are currently used in many clinical settings for diagnosis of brain diseases such as seizers, epilepsy, migraine, Alzheimer’s, and dementia. We find these devices assisting paralyzed patients by allowing them to operate computers or robots using their neural activity. In recent years, probe technologies were assisted by rapid advancements in microfabrication and microelectronic technologies and thus are enabling highly functional and robust neural probes which are opening new and exciting avenues in neural sciences and brain machine interfaces. With a wide variety of probes that have been designed, fabricated, and tested to date, this review aims to provide an overview of the advances and recent progress in the microfabrication techniques of neural probes. In addition, we aim to highlight the challenges faced in developing and implementing ultralong multi-site recording probes that are needed to monitor neural activity from deeper regions in the brain. Finally, we review techniques that can improve the biocompatibility of the neural probes to minimize the immune response and encourage neural growth around the electrodes for long term implantation studies.

  4. Neural repair in the adult brain

    Science.gov (United States)

    Jessberger, Sebastian

    2016-01-01

    Acute or chronic injury to the adult brain often results in substantial loss of neural tissue and subsequent permanent functional impairment. Over the last two decades, a number of approaches have been developed to harness the regenerative potential of neural stem cells and the existing fate plasticity of neural cells in the nervous system to prevent tissue loss or to enhance structural and functional regeneration upon injury. Here, we review recent advances of stem cell-associated neural repair in the adult brain, discuss current challenges and limitations, and suggest potential directions to foster the translation of experimental stem cell therapies into the clinic. PMID:26918167

  5. Neural network based system for equipment surveillance

    Science.gov (United States)

    Vilim, R.B.; Gross, K.C.; Wegerich, S.W.

    1998-04-28

    A method and system are disclosed for performing surveillance of transient signals of an industrial device to ascertain the operating state. The method and system involve the steps of reading training data into a memory and determining neural network weighting values until the neural network outputs are close to the target outputs. If the target outputs are inadequate, wavelet parameters are determined to yield neural network outputs close to the desired set of target outputs; signals characteristic of an industrial process are then provided and the neural network output is compared to the industrial process signals to evaluate the operating state of the industrial process. 33 figs.
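
    A minimal sketch of the surveillance idea, with a single adaptive linear predictor standing in for the trained network and no wavelet step: the predictor is adapted on known-normal data, and large residuals between its prediction and the measured signal are then flagged. The toy signal, injected fault and threshold are illustrative assumptions.

        import numpy as np

        rng = np.random.default_rng(9)
        t = np.arange(2000) * 0.01
        signal = np.sin(2 * np.pi * 0.5 * t) + 0.02 * rng.standard_normal(len(t))
        signal[1500:] += 0.5                 # injected fault: a step change in the sensor

        p = 8                                # past samples used for each prediction
        w = np.zeros(p)
        lr = 0.05
        residual = np.zeros(len(t))
        for k in range(p, len(t)):
            xk = signal[k - p:k]
            residual[k] = signal[k] - w @ xk
            if k < 1200:                     # adapt only on the known-normal portion
                w += lr * residual[k] * xk   # LMS / delta-rule update

        thresh = 5 * residual[800:1200].std()          # alarm threshold from normal data
        monitor = np.abs(residual[1200:]) > thresh
        alarm = 1200 + int(np.argmax(monitor)) if monitor.any() else None
        print("first alarm at sample:", alarm, "(fault injected at sample 1500)")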

  6. Fuzzy neural network theory and application

    CERN Document Server

    Liu, Puyin

    2004-01-01

    This book systematically synthesizes research achievements in the field of fuzzy neural networks in recent years. It also provides a comprehensive presentation of the developments in fuzzy neural networks, with regard to theory as well as their application to system modeling and image restoration. Special emphasis is placed on the fundamental concepts and architecture analysis of fuzzy neural networks. The book is unique in treating all kinds of fuzzy neural networks and their learning algorithms and universal approximations, and employing simulation examples which are carefully designed to he

  7. Neural reflexes in inflammation and immunity

    National Research Council Canada - National Science Library

    Andersson, Ulf; Tracey, Kevin J

    2012-01-01

    .... Development of advanced neurophysiological and immunological techniques recently enabled the study of reflex neural circuits that maintain immunological homeostasis, and are essential for health in mammals...

  8. Neural network optimization, components, and design selection

    Science.gov (United States)

    Weller, Scott W.

    1991-01-01

    Neural Networks are part of a revived technology which has received a lot of hype in recent years. As is apt to happen in any hyped technology, jargon and predictions make its assimilation and application difficult. Nevertheless, Neural Networks have found use in a number of areas, working on non-trivial and non-contrived problems. For example, one net has been trained to "read", translating English text into phoneme sequences. Other applications of Neural Networks include data base manipulation and the solving of routing and classification types of optimization problems. It was their use in optimization that got me involved with Neural Networks. As it turned out, "optimization" used in this context was somewhat misleading, because while some network configurations could indeed solve certain kinds of optimization problems, the configuring or "training" of a Neural Network itself is an optimization problem, and most of the literature which talked about Neural Nets and optimization in the same breath did not speak to my goal of using Neural Nets to help solve lens optimization problems. I did eventually apply Neural Network to lens optimization, and I will touch on those results. The application of Neural Nets to the problem of lens selection was much more successful, and those results will dominate this paper.

  9. Practical neural network recipes in C++

    CERN Document Server

    Masters

    2014-01-01

    This text serves as a cookbook for neural network solutions to practical problems using C++. It will enable those with moderate programming experience to select a neural network model appropriate to solving a particular problem, and to produce a working program implementing that network. The book provides guidance along the entire problem-solving path, including designing the training set, preprocessing variables, training and validating the network, and evaluating its performance. Though the book is not intended as a general course in neural networks, no background in neural networks is assum

  10. The Neural Crest in Cardiac Congenital Anomalies

    Science.gov (United States)

    Keyte, Anna; Hutson, Mary Redmond

    2012-01-01

    This review discusses the function of neural crest as they relate to cardiovascular defects. The cardiac neural crest cells are a subpopulation of cranial neural crest discovered nearly 30 years ago by ablation of premigratory neural crest. The cardiac neural crest cells are necessary for normal cardiovascular development. We begin with a description of the crest cells in normal development, including their function in remodeling the pharyngeal arch arteries, outflow tract septation, valvulogenesis, and development of the cardiac conduction system. The cells are also responsible for modulating signaling in the caudal pharynx, including the second heart field. Many of the molecular pathways that are known to influence specification, migration, patterning and final targeting of the cardiac neural crest cells are reviewed. The cardiac neural crest cells play a critical role in the pathogenesis of various human cardiocraniofacial syndromes such as DiGeorge, Velocardiofacial, CHARGE, Fetal Alcohol, Alagille, LEOPARD, and Noonan syndromes, as well as Retinoic Acid Embryopathy. The loss of neural crest cells or their dysfunction may not always directly cause abnormal cardiovascular development, but are involved secondarily because crest cells represent a major component in the complex tissue interactions in the head, pharynx and outflow tract. Thus many of the human syndromes linking defects in the heart, face and brain can be better understood when considered within the context of a single cardiocraniofacial developmental module with the neural crest being a key cell type that interconnects the regions. PMID:22595346

  11. Sequential neural models with stochastic layers

    DEFF Research Database (Denmark)

    Fraccaro, Marco; Sønderby, Søren Kaae; Paquet, Ulrich

    2016-01-01

    How can we efficiently propagate uncertainty in a latent state representation with recurrent neural networks? This paper introduces stochastic recurrent neural networks which glue a deterministic recurrent neural network and a state space model together to form a stochastic and sequential neural ...... the uncertainty in a latent path, like a state space model, we improve the state of the art results on the Blizzard and TIMIT speech modeling data sets by a large margin, while achieving comparable performances to competing methods on polyphonic music modeling....
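
    The following Python (PyTorch) sketch illustrates the general idea of gluing a deterministic recurrent network to a stochastic latent state; the layer sizes, Gaussian prior/posterior parameterization, and the single-sample ELBO-style objective are simplifying assumptions and do not reproduce the paper's exact SRNN.

        # Simplified sketch of pairing a deterministic RNN with a stochastic latent
        # state; dimensions, priors, and the crude objective are illustrative
        # assumptions, not the paper's exact SRNN model.
        import torch
        import torch.nn as nn

        class TinyStochasticRNN(nn.Module):
            def __init__(self, x_dim=4, h_dim=16, z_dim=8):
                super().__init__()
                self.gru = nn.GRUCell(x_dim, h_dim)               # deterministic path
                self.prior = nn.Linear(h_dim, 2 * z_dim)          # p(z_t | h_t)
                self.post = nn.Linear(h_dim + x_dim, 2 * z_dim)   # q(z_t | h_t, x_t)
                self.dec = nn.Linear(h_dim + z_dim, x_dim)        # reconstruction

            def forward(self, x):                                  # x: (T, B, x_dim)
                T, B, _ = x.shape
                h = x.new_zeros(B, self.gru.hidden_size)
                recon, kl = [], 0.0
                for t in range(T):
                    h = self.gru(x[t], h)
                    p_mu, p_logvar = self.prior(h).chunk(2, dim=-1)
                    q_mu, q_logvar = self.post(torch.cat([h, x[t]], -1)).chunk(2, -1)
                    z = q_mu + torch.randn_like(q_mu) * (0.5 * q_logvar).exp()
                    recon.append(self.dec(torch.cat([h, z], -1)))
                    # KL between diagonal Gaussians q and p, summed over dimensions.
                    kl = kl + 0.5 * (p_logvar - q_logvar
                                     + (q_logvar.exp() + (q_mu - p_mu) ** 2) / p_logvar.exp()
                                     - 1).sum()
                return torch.stack(recon), kl

        model = TinyStochasticRNN()
        x = torch.randn(10, 2, 4)
        recon, kl = model(x)
        loss = ((recon - x) ** 2).sum() + kl                       # crude ELBO-style objective
        loss.backward()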

  12. Cultured neural networks: Optimisation of patterned network adhesiveness and characterisation of their neural activity

    NARCIS (Netherlands)

    Rutten, Wim; Ruardij, T.G.; Marani, Enrico; Roelofsen, B.H.

    2006-01-01

    One type of future, improved neural interface is the "cultured probe". It is a hybrid type of neural information transducer or prosthesis, for stimulation and/or recording of neural activity. It would consist of a microelectrode array (MEA) on a planar substrate, each electrode being covered and

  13. The LILARTI neural network system

    Energy Technology Data Exchange (ETDEWEB)

    Allen, J.D. Jr.; Schell, F.M.; Dodd, C.V.

    1992-10-01

    The material of this Technical Memorandum is intended to provide the reader with conceptual and technical background information on the LILARTI neural network system in detail sufficient to confer an understanding of the LILARTI method as it is presently applied and to facilitate application of the method to problems beyond the scope of this document. Of particular importance in this regard are the descriptive sections and the Appendices, which include operating instructions, partial listings of program output and data files, and network construction information.

  14. Phantom limbs and neural plasticity.

    Science.gov (United States)

    Ramachandran, V S; Rogers-Ramachandran, D

    2000-03-01

    The study of phantom limbs has received tremendous impetus from recent studies linking changes in cortical topography with perceptual experience. Systematic psychophysical testing and functional imaging studies on patients with phantom limbs provide 2 unique opportunities. First, they allow us to demonstrate neural plasticity in the adult human brain. Second, by tracking perceptual changes (such as referred sensations) and changes in cortical topography in individual patients, we can begin to explore how the activity of sensory maps gives rise to conscious experience. Finally, phantom limbs also allow us to explore intersensory effects and the manner in which the brain constructs and updates a "body image" throughout life.

  15. Neural processing of natural sounds.

    Science.gov (United States)

    Theunissen, Frédéric E; Elie, Julie E

    2014-06-01

    We might be forced to listen to a high-frequency tone at our audiologist's office or we might enjoy falling asleep with a white-noise machine, but the sounds that really matter to us are the voices of our companions or music from our favourite radio station. The auditory system has evolved to process behaviourally relevant natural sounds. Research has shown not only that our brain is optimized for natural hearing tasks but also that using natural sounds to probe the auditory system is the best way to understand the neural computations that enable us to comprehend speech or appreciate music.

  16. Optical implementation of neural networks

    Science.gov (United States)

    Yu, Francis T. S.; Guo, Ruyan

    2002-12-01

    An adaptive optical neuro-computing (ONC) system using inexpensive pocket-size liquid crystal televisions (LCTVs) has been developed by graduate students in the Electro-Optics Laboratory at The Pennsylvania State University. Although this neuro-computing system has only 8×8=64 neurons, it can easily be extended to 16×20=320 neurons. The major advantages of this LCTV architecture, as compared with other reported ONCs, are its low cost and operational flexibility. To test its performance, several neural net models are used: Interpattern Association, Hetero-association, and unsupervised learning algorithms. System design considerations and experimental demonstrations are also included.

  17. Neural Nets for Scene Analysis

    Science.gov (United States)

    1992-09-01

    Decision boundaries produced for the artificial database when prototypes are selected from a reduced training set.

  18. Pansharpening by Convolutional Neural Networks

    Directory of Open Access Journals (Sweden)

    Giuseppe Masi

    2016-07-01

    Full Text Available A new pansharpening method is proposed, based on convolutional neural networks. We adapt a simple and effective three-layer architecture recently proposed for super-resolution to the pansharpening problem. Moreover, to improve performance without increasing complexity, we augment the input by including several maps of nonlinear radiometric indices typical of remote sensing. Experiments on three representative datasets show the proposed method to provide very promising results, largely competitive with the current state of the art in terms of both full-reference and no-reference metrics, and also at a visual inspection.
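
    A minimal sketch of the kind of three-layer, super-resolution-style network described above, with the input augmented by an extra radiometric-index map; the channel counts, kernel sizes, and NDVI-like index are illustrative assumptions rather than the authors' exact configuration.

        # Sketch of a three-layer, SRCNN-style network for pansharpening.
        # Channel counts, kernel sizes, and the added index map are assumptions
        # in the spirit of the abstract, not the authors' exact network.
        import torch
        import torch.nn as nn

        class PanSharpenCNN(nn.Module):
            def __init__(self, ms_bands=4, extra_maps=1):
                super().__init__()
                in_ch = ms_bands + 1 + extra_maps          # upsampled MS + PAN + index maps
                self.net = nn.Sequential(
                    nn.Conv2d(in_ch, 64, kernel_size=9, padding=4), nn.ReLU(),
                    nn.Conv2d(64, 32, kernel_size=5, padding=2), nn.ReLU(),
                    nn.Conv2d(32, ms_bands, kernel_size=5, padding=2),
                )

            def forward(self, ms_up, pan, index_maps):
                return self.net(torch.cat([ms_up, pan, index_maps], dim=1))

        # Toy call: 4 upsampled MS bands, 1 PAN band, 1 NDVI-like index map, 64x64 tiles.
        ms_up = torch.rand(2, 4, 64, 64)
        pan = torch.rand(2, 1, 64, 64)
        ndvi = (ms_up[:, 3:4] - ms_up[:, 0:1]) / (ms_up[:, 3:4] + ms_up[:, 0:1] + 1e-6)
        out = PanSharpenCNN()(ms_up, pan, ndvi)
        print(out.shape)   # torch.Size([2, 4, 64, 64])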

  19. Neural networks and perceptual learning

    Science.gov (United States)

    Tsodyks, Misha; Gilbert, Charles

    2005-01-01

    Sensory perception is a learned trait. The brain strategies we use to perceive the world are constantly modified by experience. With practice, we subconsciously become better at identifying familiar objects or distinguishing fine details in our environment. Current theoretical models simulate some properties of perceptual learning, but neglect the underlying cortical circuits. Future neural network models must incorporate the top-down alteration of cortical function by expectation or perceptual tasks. These newly found dynamic processes are challenging earlier views of static and feedforward processing of sensory information. PMID:15483598

  20. Optimization with Potts Neural Networks

    Science.gov (United States)

    Söderberg, Bo

    The Potts Neural Network approach to non-binary discrete optimization problems is described. It applies to problems that can be described as a set of elementary 'multiple choice' options. Instead of the conventional binary (Ising) neurons, mean field Potts neurons, having several available states, are used to describe the elementary degrees of freedom of such problems. The dynamics consists of iterating the mean field equations with annealing until convergence. Due to its deterministic character, the method is quite fast. When applied to problems of Graph Partition and scheduling types, it produces very good solutions also for problems of considerable size.
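
    A minimal sketch of mean-field Potts annealing applied to a toy graph-partition problem; the energy terms, balance penalty, and annealing schedule are illustrative assumptions in the spirit of the abstract.

        # Mean-field Potts annealing sketch for graph partitioning.
        # The balance penalty and temperature schedule are assumptions.
        import numpy as np

        def potts_partition(adj, k=2, t_start=2.0, t_end=0.05, steps=200, alpha=1.0):
            n = adj.shape[0]
            rng = np.random.default_rng(0)
            v = rng.dirichlet(np.ones(k), size=n)        # mean-field Potts neurons, rows sum to 1
            for temp in np.geomspace(t_start, t_end, steps):
                load = v.sum(axis=0)                      # expected partition sizes
                # Local "field": favour the states of neighbours, penalise oversized partitions.
                field = -adj @ v + alpha * (load - n / k)
                e = np.exp(-field / temp)
                v = e / e.sum(axis=1, keepdims=True)      # mean-field (softmax) update
            return v.argmax(axis=1)

        # Two obvious clusters of three nodes joined by one edge.
        A = np.zeros((6, 6))
        for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
            A[i, j] = A[j, i] = 1.0
        print(potts_partition(A))   # e.g. [0 0 0 1 1 1], up to label permutation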

  1. Mir-29b Mediates the Neural Tube versus Neural Crest Fate Decision during Embryonic Stem Cell Neural Differentiation.

    Science.gov (United States)

    Xi, Jiajie; Wu, Yukang; Li, Guoping; Ma, Li; Feng, Ke; Guo, Xudong; Jia, Wenwen; Wang, Guiying; Yang, Guang; Li, Ping; Kang, Jiuhong

    2017-08-08

    During gastrulation, the neuroectoderm cells form the neural tube and neural crest. The nervous system contains significantly more microRNAs than other tissues, but the role of microRNAs in controlling the differentiation of neuroectodermal cells into neural tube epithelial (NTE) cells and neural crest cells (NCCs) remains unknown. Using embryonic stem cell (ESC) neural differentiation systems, we found that miR-29b was upregulated in NTE cells and downregulated in NCCs. MiR-29b promoted the differentiation of ESCs into NTE cells and inhibited their differentiation into NCCs. Accordingly, the inhibition of miR-29b significantly inhibited the differentiation of NTE cells. A mechanistic study revealed that miR-29b targets DNA methyltransferase 3a (Dnmt3a) to regulate neural differentiation. Moreover, miR-29b mediated the function of Pou3f1, a critical neural transcription factor. Therefore, our study showed that the Pou3f1-miR-29b-Dnmt3a regulatory axis was active at the initial stage of neural differentiation and regulated the determination of cell fate. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  2. NeuralWISP: A Wirelessly Powered Neural Interface With 1-m Range.

    Science.gov (United States)

    Yeager, D J; Holleman, J; Prasad, R; Smith, J R; Otis, B P

    2009-12-01

    We present the NeuralWISP, a wireless neural interface operating from far-field radio-frequency (RF) energy. The NeuralWISP is compatible with commercial RF identification readers and operates at a range up to 1 m. It includes a custom low-noise, low-power amplifier integrated circuit for processing the neural signal and an analog spike detection circuit for reducing digital computational requirements and communications bandwidth. Our system monitors the neural signal and periodically transmits the spike density in a user-programmable time window. The entire system draws an average 20 μA from the harvested 1.8-V supply.

  3. Neural network modeling of emotion

    Science.gov (United States)

    Levine, Daniel S.

    2007-03-01

    This article reviews the history and development of computational neural network modeling of cognitive and behavioral processes that involve emotion. The exposition starts with models of classical conditioning dating from the early 1970s. Then it proceeds toward models of interactions between emotion and attention. Then models of emotional influences on decision making are reviewed, including some speculative (not and not yet simulated) models of the evolution of decision rules. Through the late 1980s, the neural networks developed to model emotional processes were mainly embodiments of significant functional principles motivated by psychological data. In the last two decades, network models of these processes have become much more detailed in their incorporation of known physiological properties of specific brain regions, while preserving many of the psychological principles from the earlier models. Most network models of emotional processes so far have dealt with positive and negative emotion in general, rather than specific emotions such as fear, joy, sadness, and anger. But a later section of this article reviews a few models relevant to specific emotions: one family of models of auditory fear conditioning in rats, and one model of induced pleasure enhancing creativity in humans. Then models of emotional disorders are reviewed. The article concludes with philosophical statements about the essential contributions of emotion to intelligent behavior and the importance of quantitative theories and models to the interdisciplinary enterprise of understanding the interactions of emotion, cognition, and behavior.

  4. Clustering: a neural network approach.

    Science.gov (United States)

    Du, K-L

    2010-01-01

    Clustering is a fundamental data analysis method. It is widely used for pattern recognition, feature extraction, vector quantization (VQ), image segmentation, function approximation, and data mining. As an unsupervised classification technique, clustering identifies some inherent structures present in a set of objects based on a similarity measure. Clustering methods can be based on statistical model identification (McLachlan & Basford, 1988) or competitive learning. In this paper, we give a comprehensive overview of competitive learning based clustering methods. Importance is attached to a number of competitive learning based clustering neural networks such as the self-organizing map (SOM), the learning vector quantization (LVQ), the neural gas, and the ART model, and clustering algorithms such as the C-means, mountain/subtractive clustering, and fuzzy C-means (FCM) algorithms. Associated topics such as the under-utilization problem, fuzzy clustering, robust clustering, clustering based on non-Euclidean distance measures, supervised clustering, hierarchical clustering as well as cluster validity are also described. Two examples are given to demonstrate the use of the clustering methods.
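
    As a concrete example of the competitive-learning family surveyed above, the sketch below implements a bare-bones winner-take-all clustering rule; the number of prototypes, learning rate, and toy data are assumptions.

        # Bare-bones competitive ("winner-take-all") clustering sketch.
        import numpy as np

        def competitive_clustering(data, k=3, epochs=20, lr=0.1, seed=0):
            rng = np.random.default_rng(seed)
            prototypes = data[rng.choice(len(data), k, replace=False)].copy()
            for _ in range(epochs):
                for x in data[rng.permutation(len(data))]:
                    winner = np.argmin(np.linalg.norm(prototypes - x, axis=1))
                    prototypes[winner] += lr * (x - prototypes[winner])   # move the winner toward x
            labels = np.argmin(np.linalg.norm(data[:, None] - prototypes[None], axis=2), axis=1)
            return prototypes, labels

        # Three Gaussian blobs as toy data.
        rng = np.random.default_rng(1)
        blobs = np.vstack([rng.normal(c, 0.2, size=(50, 2)) for c in [(0, 0), (3, 0), (0, 3)]])
        centers, labels = competitive_clustering(blobs)
        print(np.round(centers, 2))   # prototypes settle near the blob centres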

  5. Imaging Posture Veils Neural Signals

    Directory of Open Access Journals (Sweden)

    Robert T Thibault

    2016-10-01

    Full Text Available Whereas modern brain imaging often demands holding body positions incongruent with everyday life, posture governs both neural activity and cognitive performance. Humans commonly perform while upright; yet, many neuroimaging methodologies require participants to remain motionless and adhere to non-ecological comportments within a confined space. This inconsistency between ecological postures and imaging constraints undermines the transferability and generalizability of many a neuroimaging assay. Here we highlight the influence of posture on brain function and behavior. Specifically, we challenge the tacit assumption that brain processes and cognitive performance are comparable across a spectrum of positions. We provide an integrative synthesis regarding the increasingly prominent influence of imaging postures on autonomic function, mental capacity, sensory thresholds, and neural activity. Arguing that neuroimagers and cognitive scientists could benefit from considering the influence posture wields on both general functioning and brain activity, we examine existing imaging technologies and the potential of portable and versatile imaging devices (e.g., functional near infrared spectroscopy). Finally, we discuss ways that accounting for posture may help unveil the complex brain processes of everyday cognition.

  6. Radiation Behavior of Analog Neural Network Chip

    Science.gov (United States)

    Langenbacher, H.; Zee, F.; Daud, T.; Thakoor, A.

    1996-01-01

    A neural network experiment was conducted for the Space Technology Research Vehicle (STRV-1) 1-b, launched in June 1994. Identical sets of analog feed-forward neural network chips were used to study and compare the effects of space and ground radiation on the chips. Three failure mechanisms are noted.

  7. Deciphering the Cognitive and Neural Mechanisms Underlying ...

    International Development Research Centre (IDRC) Digital Library (Canada)

    Deciphering the Cognitive and Neural Mechanisms Underlying Auditory Learning. This project seeks to understand the brain mechanisms necessary for people to learn to perceive sounds. Neural circuits and learning. The research team will test people with and without musical training to evaluate their capacity to learn ...

  8. Neural network approach to parton distributions fitting

    CERN Document Server

    Piccione, Andrea; Forte, Stefano; Latorre, Jose I.; Rojo, Joan; Piccione, Andrea; Rojo, Joan

    2006-01-01

    We will show an application of neural networks to extract information on the structure of hadrons. A Monte Carlo over experimental data is performed to correctly reproduce data errors and correlations. A neural network is then trained on each Monte Carlo replica via a genetic algorithm. Results on the proton and deuteron structure functions, and on the nonsinglet parton distribution will be shown.

  9. NEURAL METHODS FOR THE FINANCIAL PREDICTION

    Directory of Open Access Journals (Sweden)

    Jerzy Balicki

    2016-06-01

    Full Text Available Artificial neural networks can be used to predict share investment on the stock market, assess the reliability of a credit client, or predict banking crises. Moreover, this paper discusses the principles of combining neural network algorithms with evolutionary methods and support vector machines. In addition, reference is made to other methods of artificial intelligence that are used in financial prediction.

  10. Self-organization of neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Clark, J.W.; Winston, J.V.; Rafelski, J.

    1984-05-14

    The plastic development of a neural-network model operating autonomously in discrete time is described by the temporal modification of interneuronal coupling strengths according to momentary neural activity. A simple algorithm (brainwashing) is found which, applied to nets with initially quasirandom connectivity, leads to model networks with properties conducive to the simulation of memory and learning phenomena. 18 references, 2 figures.
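
    A generic sketch of activity-dependent modification of coupling strengths in a discrete-time network; this uses a plain Hebbian-style rule and crude normalization for illustration and is not the specific "brainwashing" algorithm referred to in the abstract.

        # Activity-dependent coupling modification in a discrete-time network.
        # Plain Hebbian-style rule for illustration only; not the paper's algorithm.
        import numpy as np

        rng = np.random.default_rng(0)
        n = 50
        W = rng.normal(0, 1 / np.sqrt(n), (n, n))          # quasirandom initial couplings
        np.fill_diagonal(W, 0.0)
        state = (rng.random(n) < 0.5).astype(float)         # momentary neural activity (0/1)

        eta = 0.01
        for t in range(500):
            state = (W @ state > 0.0).astype(float)         # synchronous threshold update
            a = state - state.mean()                        # centred activity
            W += eta * np.outer(a, a)                       # Hebbian coupling change
            np.fill_diagonal(W, 0.0)
            W *= 1.0 / max(1.0, np.abs(W).max())            # crude normalisation keeps W bounded

        print("mean activity:", state.mean(), "coupling spread:", W.std().round(3))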

  11. Medical image analysis with artificial neural networks.

    Science.gov (United States)

    Jiang, J; Trundle, P; Ren, J

    2010-12-01

    Given that neural networks have been widely reported in the research community of medical imaging, we provide a focused literature survey on recent neural network developments in computer-aided diagnosis, medical image segmentation and edge detection towards visual content analysis, and medical image registration for its pre-processing and post-processing, with the aims of increasing awareness of how neural networks can be applied to these areas and to provide a foundation for further research and practical development. Representative techniques and algorithms are explained in detail to provide inspiring examples illustrating: (i) how a known neural network with fixed structure and training procedure could be applied to resolve a medical imaging problem; (ii) how medical images could be analysed, processed, and characterised by neural networks; and (iii) how neural networks could be expanded further to resolve problems relevant to medical imaging. In the concluding section, a highlight of comparisons among many neural network applications is included to provide a global view on computational intelligence with neural networks in medical imaging. Copyright © 2010 Elsevier Ltd. All rights reserved.

  12. Cockayne syndrome b maintains neural precursor function.

    Science.gov (United States)

    Sacco, Raffaele; Tamblyn, Laura; Rajakulendran, Nishani; Bralha, Fernando N; Tropepe, Vincent; Laposa, Rebecca R

    2013-02-01

    Neurodevelopmental defects are observed in the hereditary disorder Cockayne syndrome (CS). The gene most frequently mutated in CS, Cockayne Syndrome B (CSB), is required for the repair of bulky DNA adducts in transcribed genes during transcription-coupled nucleotide excision repair. CSB also plays a role in chromatin remodeling and mitochondrial function. The role of CSB in neural development is poorly understood. Here we report that the abundance of neural progenitors is normal in Csb(-/-) mice and the frequency of apoptotic cells in the neurogenic niche of the adult subependymal zone is similar in Csb(-/-) and wild type mice. Both embryonic and adult Csb(-/-) neural precursors exhibited defective self-renewal in the neurosphere assay. In Csb(-/-) neural precursors, self-renewal progressively decreased in serially passaged neurospheres. The data also indicate that Csb and the nucleotide excision repair protein Xpa preserve embryonic neural stem cell self-renewal after UV DNA damage. Although Csb(-/-) neural precursors do not exhibit altered neuronal lineage commitment after low-dose UV (1J/m(2)) in vitro, neurons differentiated in vitro from Csb(-/-) neural precursors that had been irradiated with 1J/m(2) UV exhibited defective neurite outgrowth. These findings identify a function for Csb in neural precursors. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Hidden neural networks: application to speech recognition

    DEFF Research Database (Denmark)

    Riis, Søren Kamaric

    1998-01-01

    We evaluate the hidden neural network HMM/NN hybrid on two speech recognition benchmark tasks; (1) task independent isolated word recognition on the Phonebook database, and (2) recognition of broad phoneme classes in continuous speech from the TIMIT database. It is shown how hidden neural networks...

  14. Genetic Algorithm Optimized Neural Networks Ensemble as ...

    African Journals Online (AJOL)

    Improvements in neural network calibration models by a novel approach using neural network ensemble (NNE) for the simultaneous spectrophotometric multicomponent analysis are suggested, with a study on the estimation of the components of an antihypertensive combination, namely, atenolol and losartan potassium.

  15. Neural Networks for Non-linear Control

    DEFF Research Database (Denmark)

    Sørensen, O.

    1994-01-01

    This paper describes how a neural network, structured as a Multi Layer Perceptron, is trained to predict, simulate and control a non-linear process.

  16. Application of Neural Networks for Energy Reconstruction

    CERN Document Server

    Damgov, Jordan

    2002-01-01

    The possibility to use Neural Networks for reconstruction of the energy deposited in the calorimetry system of the CMS detector is investigated. It is shown that using a feed-forward neural network, good linearity, Gaussian energy distribution and good energy resolution can be achieved. Significant improvement of the energy resolution and linearity is reached in comparison with other weighting methods for energy reconstruction.

  17. Neural Network to Solve Concave Games

    OpenAIRE

    Zixin Liu; Nengfa Wang

    2014-01-01

    This paper is concerned with a neural network method for solving concave games. By combining variational inequalities, the Ky Fan inequality, and projection equations, concave games are transformed into a neural network model. On the basis of Lyapunov stability theory, some stability results are also given. Finally, simulation results for two classic games are given to illustrate the theoretical results.

  18. Recognizing changing seasonal patterns using neural networks

    NARCIS (Netherlands)

    Ph.H.B.F. Franses (Philip Hans); G. Draisma (Gerrit)

    1997-01-01

    In this paper we propose a graphical method based on an artificial neural network model to investigate how and when seasonal patterns in macroeconomic time series change over time. Neural networks are useful since the hidden layer units may become activated only in certain seasons or

  19. Neural plasticity in pancreatitis and pancreatic cancer.

    Science.gov (United States)

    Demir, Ihsan Ekin; Friess, Helmut; Ceyhan, Güralp O

    2015-11-01

    Pancreatic nerves undergo prominent alterations during the evolution and progression of human chronic pancreatitis and pancreatic cancer. Intrapancreatic nerves increase in size (neural hypertrophy) and number (increased neural density). The proportion of autonomic and sensory fibres is switched (neural remodelling), and nerves are infiltrated by perineural inflammatory cells (pancreatic neuritis) or invaded by pancreatic cancer cells (neural invasion). These neuropathic alterations also correlate with neuropathic pain. Instead of being mere histopathological manifestations of disease progression, pancreatic neural plasticity synergizes with the enhanced excitability of sensory neurons, with Schwann cell recruitment toward cancer and with central nervous system alterations. These alterations maintain a bidirectional interaction between nerves and non-neural pancreatic cells, as demonstrated by tissue and neural damage inducing neuropathic pain, and activated neurons releasing mediators that modulate inflammation and cancer growth. Owing to the prognostic effects of pain and neural invasion in pancreatic cancer, dissecting the mechanism of pancreatic neuroplasticity holds major translational relevance. However, current in vivo models of pancreatic cancer and chronic pancreatitis contain many discrepancies from human disease that overshadow their translational value. The present Review discusses novel possibilities for mechanistically uncovering the role of the nervous system in pancreatic disease progression.

  20. Neural Plasticity in Speech Acquisition and Learning

    Science.gov (United States)

    Zhang, Yang; Wang, Yue

    2007-01-01

    Neural plasticity in speech acquisition and learning is concerned with the timeline trajectory and the mechanisms of experience-driven changes in the neural circuits that support or disrupt linguistic function. In this selective review, we discuss the role of phonetic learning in language acquisition, the "critical period" of learning, the agents…

  1. Neural plasticity: the biological substrate for neurorehabilitation.

    Science.gov (United States)

    Warraich, Zuha; Kleim, Jeffrey A

    2010-12-01

    Decades of basic science have clearly demonstrated the capacity of the central nervous system (CNS) to structurally and functionally adapt in response to experience. The field of neurorehabilitation has begun to use this body of work to develop neurobiologically informed therapies that harness the key behavioral and neural signals that drive neural plasticity. The present review describes how neural plasticity supports both learning in the intact CNS and functional improvement in the damaged or diseased CNS. A pragmatic, interdisciplinary definition of neural plasticity is presented that may be used by both clinical and basic scientists studying neurorehabilitation. Furthermore, a description of how neural plasticity may act to drive different neural strategies underlying functional improvement after CNS injury or disease is provided. The understanding of the relationship between these different neural strategies, mechanisms of neural plasticity, and changes in behavior may facilitate the development of novel, more effective rehabilitation interventions. Copyright © 2010 American Academy of Physical Medicine and Rehabilitation. Published by Elsevier Inc. All rights reserved.

  2. Neural constructivism or self-organization?

    NARCIS (Netherlands)

    van der Maas, H.L.J.; Molenaar, P.C.M.

    2000-01-01

    Comments on the article by S. R. Quartz et al (see record 1998-00749-001) which discussed the constructivist perspective of interaction between cognition and neural processes during development and consequences for theories of learning. Three arguments are given to show that neural constructivism

  3. Adaptive Neurons For Artificial Neural Networks

    Science.gov (United States)

    Tawel, Raoul

    1990-01-01

    Training time decreases dramatically. In improved mathematical model of neural-network processor, temperature of neurons (in addition to connection strengths, also called weights, of synapses) varied during supervised-learning phase of operation according to mathematical formalism and not heuristic rule. Evidence that biological neural networks also process information at neuronal level.

  4. A Fuzzy Neural Tree for Possibilistic Reliability

    NARCIS (Netherlands)

    Ciftcioglu, O.

    2008-01-01

    An innovative neural fuzzy system is considered for possibilistic reliability using a neural tree structure with nodes of neuronal type. The total tree structure works effectively as a fuzzy logic system where the possibility theory plays important role with Gaussian possibility distribution at the

  5. Neural Adaptive Sensory Processing for Undersea Sonar

    Science.gov (United States)

    1992-10-01

    Cited reports include "Neural target locator" (Naval Ocean Systems Center technical document, 1990) and "Sonar scene analysis using neurobionic ...".

  6. Neural bases of accented speech perception

    OpenAIRE

    Patti Adank; Nuttall, Helen E.; Briony Banks; Dan Kennedy-Higgins

    2015-01-01

    The recognition of unfamiliar regional and foreign accents represents a challenging task for the speech perception system (Adank, Evans, Stuart-Smith, & Scott, 2009; Floccia, Goslin, Girard, & Konopczynski, 2006). Despite the frequency with which we encounter such accents, the neural mechanisms supporting successful perception of accented speech are poorly understood. Nonetheless, candidate neural substrates involved in processing speech in challenging listening conditions, including accented...

  7. Learning from large scale neural simulations

    DEFF Research Database (Denmark)

    Serban, Maria

    2017-01-01

    -possibly explanations of neural and cognitive phenomena, or they can be used to test explanatory hypotheses and identify sufficient causal factors in specific neurobiological mechanisms, or yet again they can be deployed exploratorily to evaluate the hierarchy of multiple neural and cognitive models that are put...

  8. A new perspective on behavioral inconsistency and neural noise in aging: Compensatory speeding of neural communication

    Directory of Open Access Journals (Sweden)

    S. Lee Hong

    2012-09-01

    Full Text Available This paper seeks to present a new perspective on the aging brain. Here, we make connections between two key phenomena of brain aging: (1) increased neural noise or random background activity; and (2) slowing of brain activity. Our perspective proposes the possibility that the slowing of neural processing due to decreasing nerve conduction velocities leads to a compensatory speeding of neuron firing rates. These increased firing rates lead to a broader distribution of power in the frequency spectrum of neural oscillations, which, we propose, can just as easily be interpreted as neural noise. Compensatory speeding of neural activity, as we present it, is constrained by the (a) availability of metabolic energy sources and (b) competition for frequency bandwidth needed for neural communication. We propose that these constraints lead to the eventual inability to compensate for age-related declines in neural function that are manifested clinically as deficits in cognition, affect, and motor behavior.

  9. International Conference on Artificial Neural Networks (ICANN)

    CERN Document Server

    Mladenov, Valeri; Kasabov, Nikola; Artificial Neural Networks : Methods and Applications in Bio-/Neuroinformatics

    2015-01-01

    The book reports on the latest theories on artificial neural networks, with a special emphasis on bio-neuroinformatics methods. It includes twenty-three papers selected from among the best contributions on bio-neuroinformatics-related issues, which were presented at the International Conference on Artificial Neural Networks, held in Sofia, Bulgaria, on September 10-13, 2013 (ICANN 2013). The book covers a broad range of topics concerning the theory and applications of artificial neural networks, including recurrent neural networks, super-Turing computation and reservoir computing, double-layer vector perceptrons, nonnegative matrix factorization, bio-inspired models of cell communities, Gestalt laws, embodied theory of language understanding, saccadic gaze shifts and memory formation, and new training algorithms for Deep Boltzmann Machines, as well as dynamic neural networks and kernel machines. It also reports on new approaches to reinforcement learning, optimal control of discrete time-delay systems, new al...

  10. Neural network signal understanding for instrumentation

    DEFF Research Database (Denmark)

    Pau, L. F.; Johansen, F. S.

    1990-01-01

    A report is presented on the use of neural signal interpretation theory and techniques for the purpose of classifying the shapes of a set of instrumentation signals, in order to calibrate devices, diagnose anomalies, generate tuning/settings, and interpret the measurement results. Neural signal...... understanding research is surveyed, and the selected implementation and its performance in terms of correct classification rates and robustness to noise are described. Formal results on neural net training time and sensitivity to weights are given. A theory for neural control using functional link nets is given......, and an explanation facility designed to help neural signal understanding is described. The results are compared to those obtained with a knowledge-based signal interpretation system using the same instrument and data...

  11. Neural scaling laws for an uncertain world

    CERN Document Server

    Howard, Marc W

    2016-01-01

    The Weber-Fechner law describes the form of psychological space in many behavioral experiments involving perception of one-dimensional physical quantities. If the physical quantity is expressed using multiple neural receptors, then placing receptive fields evenly along a logarithmic scale naturally leads to the psychological Weber-Fechner law. In the visual system, the spacing and width of extrafoveal receptive fields are consistent with logarithmic scaling. Other sets of neural "receptors" appear to show the same qualitative properties, suggesting that this form of neural scaling reflects a solution to a very general problem. This paper argues that these neural scaling laws enable the brain to represent information about the world efficiently without making any assumptions about the statistics of the world. This analysis suggests that the organization of neural scales to represent one-dimensional quantities, including more abstract quantities such as numerosity, time, and allocentric space, should have a uni...
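
    The sketch below illustrates the receptive-field layout discussed above: centres spaced evenly on a logarithmic axis with widths proportional to their centres, so that stimulus pairs with equal ratios produce roughly equal shifts in the population response (a Weber-Fechner-like code). The particular receptor counts and stimulus values are assumptions.

        # Logarithmically spaced receptive fields with width proportional to centre.
        import numpy as np

        n_receptors = 30
        centres = np.geomspace(0.1, 100.0, n_receptors)      # even spacing on a log scale
        widths = 0.3 * centres                                # width grows with the centre

        def population_response(x):
            return np.exp(-0.5 * ((x - centres) / widths) ** 2)

        # Two stimulus pairs with the same *ratio* evoke nearly the same shift of the
        # peak across the population, which is the Weber-Fechner property.
        shift_small = np.argmax(population_response(2.0)) - np.argmax(population_response(1.0))
        shift_large = np.argmax(population_response(20.0)) - np.argmax(population_response(10.0))
        print(shift_small, shift_large)    # expected to be (approximately) equal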

  12. Neural networks with discontinuous/impact activations

    CERN Document Server

    Akhmet, Marat

    2014-01-01

    This book presents as its main subject new models in mathematical neuroscience. A wide range of neural network models with discontinuities are discussed, including impulsive differential equations, differential equations with piecewise constant arguments, and models of mixed type. These models involve discontinuities, which are natural because huge velocities and short distances are usually observed in devices modeling the networks. A discussion of the models, appropriate for the proposed applications, is also provided. This book also: explores questions related to the biological underpinning for models of neural networks; considers neural network modeling using differential equations with impulsive and piecewise constant argument discontinuities; and provides all necessary mathematical basics for application to the theory of neural networks. Neural Networks with Discontinuous/Impact Activations is an ideal book for researchers and professionals in the field of engineering mathematics who have an interest in app...

  13. 22nd Italian Workshop on Neural Nets

    CERN Document Server

    Bassis, Simone; Esposito, Anna; Morabito, Francesco

    2013-01-01

    This volume collects a selection of contributions which were presented at the 22nd Italian Workshop on Neural Networks, the yearly meeting of the Italian Society for Neural Networks (SIREN). The conference was held in Vietri sul Mare (Salerno), Italy, during May 17-19, 2012. The annual meeting of SIREN is sponsored by the International Neural Network Society (INNS), the European Neural Network Society (ENNS) and the IEEE Computational Intelligence Society (CIS). The book, as well as the workshop, is organized in three main components, two special sessions and a group of regular sessions featuring different aspects and points of view of artificial neural networks and natural intelligence, also including applications of present compelling interest.

  14. Decentralized neural control application to robotics

    CERN Document Server

    Garcia-Hernandez, Ramon; Sanchez, Edgar N; Alanis, Alma y; Ruz-Hernandez, Jose A

    2017-01-01

    This book provides a decentralized approach for the identification and control of robotics systems. It also presents recent research in decentralized neural control and includes applications to robotics. Decentralized control is free from difficulties due to complexity in design, debugging, data gathering and storage requirements, making it preferable for interconnected systems. Furthermore, as opposed to the centralized approach, it can be implemented with parallel processors. This approach deals with four decentralized control schemes, which are able to identify the robot dynamics. The training of each neural network is performed on-line using an extended Kalman filter (EKF). The first indirect decentralized control scheme applies the discrete-time block control approach, to formulate a nonlinear sliding manifold. The second direct decentralized neural control scheme is based on the backstepping technique, approximated by a high order neural network. The third control scheme applies a decentralized neural i...

  15. Flexible body control using neural networks

    Science.gov (United States)

    Mccullough, Claire L.

    1992-01-01

    Progress is reported on the control of the Control Structures Interaction (CSI) suitcase demonstrator (a flexible structure) using neural networks and fuzzy logic. It is concluded that while control by neural nets alone (i.e., allowing the net to design a controller with no human intervention) has yielded less than optimal results, the neural net trained to emulate the existing fuzzy logic controller does produce acceptable system responses for the initial conditions examined. Also, a neural net was found to be very successful in performing the emulation step necessary for the anticipatory fuzzy controller for the CSI suitcase demonstrator. The fuzzy neural hybrid, which exhibits good robustness and noise rejection properties, shows promise as a controller for practical flexible systems, and should be further evaluated.

  16. Neural network topology design for nonlinear control

    Science.gov (United States)

    Haecker, Jens; Rudolph, Stephan

    2001-03-01

    Neural networks, especially in nonlinear system identification and control applications, are typically considered to be black boxes which are difficult to analyze and understand mathematically. For this reason, an in-depth mathematical analysis offering insight into the different neural network transformation layers based on a theoretical transformation scheme is desired, but up to now neither available nor known. In previous works it has been shown how proven engineering methods such as dimensional analysis and the Laplace transform may be used to construct a neural controller topology for time-invariant systems. Using the knowledge of the neural correspondences of these two classical methods, the internal nodes of the network could also be successfully interpreted after training. As a further extension of these works, the paper describes the latest state of a theoretical interpretation framework describing the neural network transformation sequences in nonlinear system identification and control. This is achieved by incorporating the method of exact input-output linearization into the two transform sequences mentioned above, dimensional analysis and the Laplace transformation. Based on these three theoretical considerations, neural network topologies may be designed in special situations by pure translation, in the sense of a structural compilation of the known classical solutions into their corresponding neural topology. Based on known exemplary results, the paper synthesizes the proposed approach into the visionary goal of a structural compiler for neural networks. This structural compiler for neural networks is intended to automatically convert classical control formulations into their equivalent neural network structure, based on the principles of equivalence between formula and operator, and between operator and structure, which are discussed in detail in this work.

  17. Neural Network and Letter Recognition.

    Science.gov (United States)

    Lee, Hue Yeon

    Neural net architectures and learning algorithms that recognize 36 handwritten alphanumeric characters are studied. Thin-line input patterns written in a 32 x 32 binary array are used. The system is composed of two major components, viz. a preprocessing unit and a recognition unit. The preprocessing unit in turn consists of three layers of neurons: the U-layer, the V-layer, and the C-layer. The function of the U-layer is to extract local features by template matching. The correlations between the detected local features are considered. By correlating neurons in a plane with their neighboring neurons, the V-layer thickens the on-cells, or lines, that are groups of on-cells of the previous layer. These two correlations yield some deformation tolerance and some of the rotational tolerance of the system. The C-layer then compresses data through the 'Gabor' transform. Pattern-dependent choice of the centers and wavelengths of the 'Gabor' filters is the source of the shift and scale tolerance of the system. Three different learning schemes were investigated in the recognition unit, namely error back-propagation learning with hidden units, simple perceptron learning, and competitive learning. Their performances were analyzed and compared. Since the network sometimes fails to distinguish between two letters that are inherently similar, additional ambiguity-resolving neural nets are introduced on top of the main neural net. The two-dimensional Fourier transform is used as the preprocessing and the perceptron as the recognition unit of the ambiguity resolver. Handwriting sets from one hundred different people were collected. Some of these are used as training sets and the remainder are used as test sets. The correct recognition rate of the system increases with the number of training sets and eventually saturates at a certain value. Similar recognition rates are obtained for the three different learning algorithms. The minimum error
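
    A small sketch of the Gabor-filter preprocessing step mentioned above, applied to a 32 x 32 binary pattern; the filter sizes, wavelengths, and orientations are illustrative assumptions rather than the thesis's actual parameters.

        # A tiny bank of 2-D Gabor filters applied to a 32x32 binary character.
        import numpy as np

        def gabor_kernel(size=9, wavelength=4.0, theta=0.0, sigma=2.5):
            half = size // 2
            y, x = np.mgrid[-half:half + 1, -half:half + 1]
            xr = x * np.cos(theta) + y * np.sin(theta)
            yr = -x * np.sin(theta) + y * np.cos(theta)
            return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / wavelength)

        def conv2d_same(img, k):
            half = k.shape[0] // 2
            padded = np.pad(img, half)
            out = np.zeros_like(img, dtype=float)
            for i in range(img.shape[0]):
                for j in range(img.shape[1]):
                    out[i, j] = np.sum(padded[i:i + k.shape[0], j:j + k.shape[1]] * k)
            return out

        pattern = np.zeros((32, 32)); pattern[8:24, 15:17] = 1.0   # a thin vertical stroke
        features = [conv2d_same(pattern, gabor_kernel(theta=t))
                    for t in (0.0, np.pi / 4, np.pi / 2, 3 * np.pi / 4)]
        print([round(float(np.abs(f).sum()), 1) for f in features])  # orientation-tuned energies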

  18. Complex-valued Neural Networks

    Science.gov (United States)

    Hirose, Akira

    This paper reviews the features and applications of complex-valued neural networks (CVNNs). First we list the present application fields, and describe the advantages of the CVNNs in two application examples, namely, an adaptive plastic-landmine visualization system and an optical frequency-domain-multiplexed learning logic circuit. Then we briefly discuss the features of complex number itself to find that the phase rotation is the most significant concept, which is very useful in processing the information related to wave phenomena such as lightwave and electromagnetic wave. The CVNNs will also be an indispensable framework of the future microelectronic information-processing hardware where the quantum electron wave plays the principal role.
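
    A minimal sketch of a single complex-valued neuron of the kind surveyed above: complex weights rotate the phases of the inputs, and an amplitude-phase activation squashes the magnitude while preserving the phase. The particular weights and activation are illustrative assumptions.

        # One complex-valued neuron with an amplitude-phase activation.
        import numpy as np

        def cvnn_neuron(x, w):
            s = np.sum(w * x)                          # complex weighted sum (phase rotations add)
            return np.tanh(np.abs(s)) * np.exp(1j * np.angle(s))   # squash amplitude, keep phase

        x = np.array([1.0 + 0.0j, 0.0 + 1.0j])         # two inputs with phases 0 and 90 degrees
        w = np.array([np.exp(1j * np.pi / 4), 1.0])    # first weight rotates its input by 45 degrees
        y = cvnn_neuron(x, w)
        print(abs(y), np.degrees(np.angle(y)))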

  19. Neural correlates of viewing paintings

    DEFF Research Database (Denmark)

    Vartanian, Oshin; Skov, Martin

    2014-01-01

    Many studies involving functional magnetic resonance imaging (fMRI) have exposed participants to paintings under varying task demands. To isolate neural systems that are activated reliably across fMRI studies in response to viewing paintings regardless of variation in task demands, a quantitative...... meta-analysis of fifteen experiments using the activation likelihood estimation (ALE) method was conducted. As predicted, viewing paintings was correlated with activation in a distributed system including the occipital lobes, temporal lobe structures in the ventral stream involved in object (fusiform...... gyrus) and scene (parahippocampal gyrus) perception, and the anterior insula-a key structure in experience of emotion. In addition, we also observed activation in the posterior cingulate cortex bilaterally-part of the brain's default network. These results suggest that viewing paintings engages not only...

  20. Neural remodeling in retinal degeneration.

    Science.gov (United States)

    Marc, Robert E; Jones, Bryan W; Watt, Carl B; Strettoi, Enrica

    2003-09-01

    Mammalian retinal degenerations initiated by gene defects in rods, cones or the retinal pigmented epithelium (RPE) often trigger loss of the sensory retina, effectively leaving the neural retina deafferented. The neural retina responds to this challenge by remodeling, first by subtle changes in neuronal structure and later by large-scale reorganization. Retinal degenerations in the mammalian retina generally progress through three phases. Phase 1 initiates with expression of a primary insult, followed by phase 2 photoreceptor death that ablates the sensory retina via initial photoreceptor stress, phenotype deconstruction, irreversible stress and cell death, including bystander effects or loss of trophic support. The loss of cones heralds phase 3: a protracted period of global remodeling of the remnant neural retina. Remodeling resembles the responses of many CNS assemblies to deafferentation or trauma, and includes neuronal cell death, neuronal and glial migration, elaboration of new neurites and synapses, rewiring of retinal circuits, glial hypertrophy and the evolution of a fibrotic glial seal that isolates the remnant neural retina from the surviving RPE and choroid. In early phase 2, stressed photoreceptors sprout anomalous neurites that often reach the inner plexiform and ganglion cell layers. As death of rods and cones progresses, bipolar and horizontal cells are deafferented and retract most of their dendrites. Horizontal cells develop anomalous axonal processes and dendritic stalks that enter the inner plexiform layer. Dendrite truncation in rod bipolar cells is accompanied by revision of their macromolecular phenotype, including the loss of functioning mGluR6 transduction. After ablation of the sensory retina, Müller cells increase intermediate filament synthesis, forming a dense fibrotic layer in the remnant subretinal space. This layer invests the remnant retina and seals it from access via the choroidal route. Evidence of bipolar cell death begins in

  1. Primary neural leprosy: systematic review

    Directory of Open Access Journals (Sweden)

    Jose Antonio Garbino

    2013-06-01

    Full Text Available The authors proposed a systematic review on the current concepts of primary neural leprosy by consulting the following online databases: MEDLINE, Lilacs/SciELO, and Embase. Selected studies were classified based on the degree of recommendation and levels of scientific evidence according to the “Oxford Centre for Evidence-based Medicine”. The following aspects were reviewed: cutaneous clinical and laboratorial investigations, i.e. skin clinical exam, smears, and biopsy, and Mitsuda's reaction; neurological investigation (anamnesis, electromyography and nerve biopsy; serological investigation and molecular testing, i.e. serological testing for the detection of the phenolic glycolipid 1 (PGL-I and the polymerase chain reaction (PCR; and treatment (classification criteria for the definition of specific treatment, steroid treatment, and cure criteria.

  2. Primary neural leprosy: systematic review.

    Science.gov (United States)

    Garbino, José Antonio; Marques, Wilson; Barreto, Jaison Antonio; Heise, Carlos Otto; Rodrigues, Márcia Maria Jardim; Antunes, Sérgio L; Soares, Cleverson Teixeira; Floriano, Marcos Cesar; Nery, José Augusto; Trindade, Maria Angela Bianconcini; Carvalho, Noêmia Barbosa; Andrada, Nathália Carvalho de; Barreira, Amilton Antunes; Virmond, Marcos da Cunha Lopes

    2013-06-01

    The authors proposed a systematic review on the current concepts of primary neural leprosy by consulting the following online databases: MEDLINE, Lilacs/SciELO, and Embase. Selected studies were classified based on the degree of recommendation and levels of scientific evidence according to the "Oxford Centre for Evidence-based Medicine". The following aspects were reviewed: cutaneous clinical and laboratorial investigations, i.e. skin clinical exam, smears, and biopsy, and Mitsuda's reaction; neurological investigation (anamnesis, electromyography and nerve biopsy); serological investigation and molecular testing, i.e. serological testing for the detection of the phenolic glycolipid 1 (PGL-I) and the polymerase chain reaction (PCR); and treatment (classification criteria for the definition of specific treatment, steroid treatment, and cure criteria).

  3. Neural correlates of face detection.

    Science.gov (United States)

    Xu, Xiaokun; Biederman, Irving

    2014-06-01

    Although face detection likely played an essential adaptive role in our evolutionary past and in contemporary social interactions, there have been few rigorous studies investigating its neural correlates. MJH, a prosopagnosic with bilateral lesions to the ventral temporal-occipital cortices encompassing the posterior face areas (fusiform and occipital face areas), expresses no subjective difficulty in face detection, suggesting that these posterior face areas do not mediate face detection exclusively. Despite his normal contrast sensitivity and visual acuity in foveal vision, the present study nevertheless revealed significant face detection deficits in MJH. Compared with controls, MJH showed a lower tolerance to noise in the phase spectrum for faces (vs. cars), reflected in his higher detection threshold for faces. MJH's lesions in bilateral occipito-temporal cortices thus appear to have produced a deficit not only in face individuation, but also in face detection.

  4. Collision avoidance using neural networks

    Science.gov (United States)

    Sugathan, Shilpa; Sowmya Shree, B. V.; Warrier, Mithila R.; Vidhyapathi, C. M.

    2017-11-01

    Nowadays, accidents on roads are caused by the negligence of drivers and pedestrians or by unexpected obstacles that come into the vehicle's path. In this paper, a model (robot) is developed to assist drivers in travelling smoothly without accidents. It reacts to real-time obstacles on the four critical sides of the vehicle and takes the necessary action. The sensor used for detecting obstacles was an IR proximity sensor. A single-layer perceptron neural network is used to train and test all possible combinations of sensor readings using Matlab (offline). A microcontroller (ARM Cortex-M3 LPC1768) is used to control the vehicle through the output data received from Matlab via serial communication. Hence, the vehicle becomes capable of reacting to any combination of real-time obstacles.
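
    A sketch of the offline training step described above: a single-layer perceptron trained on all combinations of four binary proximity-sensor readings. The labeling rule (stop if the front sensor or at least two sensors are active) is an invented toy rule, and the sketch is in Python rather than Matlab.

        # Single-layer perceptron trained on all 16 combinations of four sensors.
        import numpy as np

        # Inputs: 4 IR proximity sensors (1 = obstacle). Toy label (assumption):
        # stop if the front sensor or at least two sensors are active.
        X = np.array([[(i >> bit) & 1 for bit in range(4)] for i in range(16)], dtype=float)
        y = np.array([1 if row[0] == 1 or row.sum() >= 2 else 0 for row in X])

        w = np.zeros(4); b = 0.0
        for _ in range(100):                    # classic perceptron updates
            for xi, yi in zip(X, y):
                pred = 1 if xi @ w + b > 0 else 0
                w += (yi - pred) * xi
                b += (yi - pred)

        print(all((X @ w + b > 0).astype(int) == y))   # should print True once training converges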

  5. Bayesian regularization of neural networks.

    Science.gov (United States)

    Burden, Frank; Winkler, Dave

    2008-01-01

    Bayesian regularized artificial neural networks (BRANNs) are more robust than standard back-propagation nets and can reduce or eliminate the need for lengthy cross-validation. Bayesian regularization is a mathematical process that converts a nonlinear regression into a "well-posed" statistical problem in the manner of a ridge regression. The advantage of BRANNs is that the models are robust and the validation process, which scales as O(N2) in normal regression methods, such as back propagation, is unnecessary. These networks provide solutions to a number of problems that arise in QSAR modeling, such as choice of model, robustness of model, choice of validation set, size of validation effort, and optimization of network architecture. They are difficult to overtrain, since evidence procedures provide an objective Bayesian criterion for stopping training. They are also difficult to overfit, because the BRANN calculates and trains on a number of effective network parameters or weights, effectively turning off those that are not relevant. This effective number is usually considerably smaller than the number of weights in a standard fully connected back-propagation neural net. Automatic relevance determination (ARD) of the input variables can be used with BRANNs, and this allows the network to "estimate" the importance of each input. The ARD method ensures that irrelevant or highly correlated indices used in the modeling are neglected as well as showing which are the most important variables for modeling the activity data. This chapter outlines the equations that define the BRANN method plus a flowchart for producing a BRANN-QSAR model. Some results of the use of BRANNs on a number of data sets are illustrated and compared with other linear and nonlinear models.
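
    The sketch below shows the evidence-based re-estimation loop at the heart of Bayesian regularization, illustrated on a linear-in-parameters model for brevity (MacKay-style alpha/beta updates with the effective number of parameters gamma); the data and polynomial basis are assumptions, and a full BRANN applies analogous updates to the network weights.

        # Evidence-based ("Bayesian") regularization on a linear-in-parameters model:
        # the weight-penalty precision alpha and noise precision beta are re-estimated
        # from the effective number of parameters gamma (MacKay-style updates).
        import numpy as np

        rng = np.random.default_rng(0)
        x = np.linspace(-1, 1, 40)
        t = np.sin(np.pi * x) + 0.1 * rng.normal(size=x.size)
        Phi = np.vander(x, 8, increasing=True)           # polynomial basis (design matrix)

        alpha, beta = 1.0, 1.0
        for _ in range(50):
            A = alpha * np.eye(Phi.shape[1]) + beta * Phi.T @ Phi
            m = beta * np.linalg.solve(A, Phi.T @ t)     # posterior mean weights
            gamma = Phi.shape[1] - alpha * np.trace(np.linalg.inv(A))   # effective parameters
            alpha = gamma / (m @ m)                      # re-estimate prior precision
            beta = (len(t) - gamma) / np.sum((t - Phi @ m) ** 2)        # re-estimate noise precision

        print(round(gamma, 2), round(alpha, 3), round(beta, 1))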

  6. Tampa Electric Neural Network Sootblowing

    Energy Technology Data Exchange (ETDEWEB)

    Mark A. Rhode

    2003-12-31

    Boiler combustion dynamics change continuously due to several factors including coal quality, boiler loading, ambient conditions, changes in slag/soot deposits and the condition of plant equipment. NO{sub x} formation, Particulate Matter (PM) emissions, and boiler thermal performance are directly affected by the sootblowing practices on a unit. As part of its Power Plant Improvement Initiative program, the US DOE is providing cofunding (DE-FC26-02NT41425) and NETL is the managing agency for this project at Tampa Electric's Big Bend Station. This program serves to co-fund projects that have the potential to increase thermal efficiency and reduce emissions from coal-fired utility boilers. A review of the Big Bend units helped identify intelligent sootblowing as a suitable application to achieve the desired objectives. The existing sootblower control philosophy uses sequential schemes, whose frequency is either dictated by the control room operator or is timed based. The intent of this project is to implement a neural network based intelligent soot-blowing system, in conjunction with state-of-the-art controls and instrumentation, to optimize the operation of a utility boiler and systematically control boiler fouling. Utilizing unique, on-line, adaptive technology, operation of the sootblowers can be dynamically controlled based on real-time events and conditions within the boiler. This could be an extremely cost-effective technology, which has the ability to be readily and easily adapted to virtually any pulverized coal fired boiler. Through unique on-line adaptive technology, Neural Network-based systems optimize the boiler operation by accommodating equipment performance changes due to wear and maintenance activities, adjusting to fluctuations in fuel quality, and improving operating flexibility. The system dynamically adjusts combustion setpoints and bias settings in closed-loop supervisory control to simultaneously reduce NO{sub x} emissions and improve heat

  7. Tampa Electric Neural Network Sootblowing

    Energy Technology Data Exchange (ETDEWEB)

    Mark A. Rhode

    2004-09-30

    Boiler combustion dynamics change continuously due to several factors including coal quality, boiler loading, ambient conditions, changes in slag/soot deposits and the condition of plant equipment. NOx formation, Particulate Matter (PM) emissions, and boiler thermal performance are directly affected by the sootblowing practices on a unit. As part of its Power Plant Improvement Initiative program, the US DOE is providing cofunding (DE-FC26-02NT41425) and NETL is the managing agency for this project at Tampa Electric's Big Bend Station. This program serves to co-fund projects that have the potential to increase thermal efficiency and reduce emissions from coal-fired utility boilers. A review of the Big Bend units helped identify intelligent sootblowing as a suitable application to achieve the desired objectives. The existing sootblower control philosophy uses sequential schemes, whose frequency is either dictated by the control room operator or is timed based. The intent of this project is to implement a neural network based intelligent sootblowing system, in conjunction with state-of-the-art controls and instrumentation, to optimize the operation of a utility boiler and systematically control boiler fouling. Utilizing unique, on-line, adaptive technology, operation of the sootblowers can be dynamically controlled based on real-time events and conditions within the boiler. This could be an extremely cost-effective technology, which has the ability to be readily and easily adapted to virtually any pulverized coal fired boiler. Through unique on-line adaptive technology, Neural Network-based systems optimize the boiler operation by accommodating equipment performance changes due to wear and maintenance activities, adjusting to fluctuations in fuel quality, and improving operating flexibility. The system dynamically adjusts combustion setpoints and bias settings in closed-loop supervisory control to simultaneously reduce NO{sub x} emissions and improve heat rate

  8. Tampa Electric Neural Network Sootblowing

    Energy Technology Data Exchange (ETDEWEB)

    Mark A. Rhode

    2004-03-31

    Boiler combustion dynamics change continuously due to several factors including coal quality, boiler loading, ambient conditions, changes in slag/soot deposits and the condition of plant equipment. NOx formation, Particulate Matter (PM) emissions, and boiler thermal performance are directly affected by the sootblowing practices on a unit. As part of its Power Plant Improvement Initiative program, the US DOE is providing co-funding (DE-FC26-02NT41425) and NETL is the managing agency for this project at Tampa Electric's Big Bend Station. This program serves to co-fund projects that have the potential to increase thermal efficiency and reduce emissions from coal-fired utility boilers. A review of the Big Bend units helped identify intelligent sootblowing as a suitable application to achieve the desired objectives. The existing sootblower control philosophy uses sequential schemes, whose frequency is either dictated by the control room operator or is timed based. The intent of this project is to implement a neural network based intelligent sootblowing system, in conjunction with state-of-the-art controls and instrumentation, to optimize the operation of a utility boiler and systematically control boiler fouling. Utilizing unique, on-line, adaptive technology, operation of the sootblowers can be dynamically controlled based on real-time events and conditions within the boiler. This could be an extremely cost-effective technology, which has the ability to be readily and easily adapted to virtually any pulverized coal fired boiler. Through unique on-line adaptive technology, Neural Network-based systems optimize the boiler operation by accommodating equipment performance changes due to wear and maintenance activities, adjusting to fluctuations in fuel quality, and improving operating flexibility. The system dynamically adjusts combustion setpoints and bias settings in closed-loop supervisory control to simultaneously reduce NO{sub x} emissions and improve heat rate
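
    The three project reports above describe the supervisory scheme only qualitatively. A minimal sketch of the closed-loop idea, with the surrogate model, the synthetic plant data, the variable names and the cost weighting all assumed purely for illustration rather than taken from the Big Bend project, might look like this in Python:

      # Illustrative only: a toy closed-loop supervisory step in the spirit of
      # neural-network-based intelligent sootblowing.  All data, feature names
      # and the cost weighting are assumptions, not values from the project.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)

      # Synthetic history: [load, coal quality, hours since last blow, blower group]
      # -> [heat rate, NOx].  Real data would come from plant instrumentation.
      X = rng.uniform(0, 1, size=(2000, 4))
      heat_rate = 9000 + 600 * X[:, 2] - 400 * X[:, 3] * X[:, 2] + 50 * rng.standard_normal(2000)
      nox = 0.30 + 0.05 * X[:, 1] - 0.02 * X[:, 3] + 0.01 * rng.standard_normal(2000)
      Y = np.column_stack([(heat_rate - heat_rate.mean()) / heat_rate.std(),
                           (nox - nox.mean()) / nox.std()])        # normalise both targets

      model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0).fit(X, Y)

      def choose_blower_action(state, n_groups=4, w_nox=1.0):
          """Greedy supervisory step: pick the sootblower group whose predicted
          effect minimises a combined (normalised) heat-rate + NOx cost."""
          candidates = np.array([np.r_[state, g / (n_groups - 1)] for g in range(n_groups)])
          pred = model.predict(candidates)                         # columns: heat rate, NOx
          return int(np.argmin(pred[:, 0] + w_nox * pred[:, 1]))

      print(choose_blower_action([0.8, 0.5, 0.9]))                 # e.g. 3

    In an actual installation the surrogate would be retrained on plant data and the selected action would be passed to the sootblower sequencer as a setpoint or bias adjustment rather than printed.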

  9. Neural networks for nuclear spectroscopy

    Energy Technology Data Exchange (ETDEWEB)

    Keller, P.E.; Kangas, L.J.; Hashem, S.; Kouzes, R.T. [Pacific Northwest Lab., Richland, WA (United States)] [and others]

    1995-12-31

    In this paper two applications of artificial neural networks (ANNs) in nuclear spectroscopy analysis are discussed. In the first application, an ANN assigns quality coefficients to alpha particle energy spectra. These spectra are used to detect plutonium contamination in the work environment. The quality coefficients represent the levels of spectral degradation caused by miscalibration and foreign matter affecting the instruments. A set of spectra was labeled with quality coefficients by an expert and used to train the ANN expert system. Our investigation shows that the expert knowledge of spectral quality can be transferred to an ANN system. The second application combines a portable gamma-ray spectrometer with an ANN. In this system the ANN is used to automatically identify radioactive isotopes in real time from their gamma-ray spectra. Two neural network paradigms are examined: the linear perceptron and the optimal linear associative memory (OLAM). A comparison of the two paradigms shows that OLAM is superior to the linear perceptron for this application. Both networks have a linear response and are useful in determining the composition of an unknown sample when the spectrum of the unknown is a linear superposition of known spectra. One feature of this technique is that it uses the whole spectrum in the identification process instead of only the individual photo-peaks. For this reason, it is potentially more useful for processing data from lower-resolution gamma-ray spectrometers. This approach has been tested with data generated by Monte Carlo simulations and with field data from sodium iodide and germanium detectors. With the ANN approach, the intense computation takes place during the training process. Once the network is trained, normal operation consists of propagating the data through the network, which results in rapid identification of samples. This approach is useful in situations that require fast response where precise quantification is less important.
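
    The abstract notes that both paradigms are linear and estimate the composition of an unknown sample as a linear superposition of known spectra, using the whole spectrum rather than individual photo-peaks. A whole-spectrum least-squares decomposition in that spirit (not the authors' OLAM training procedure, and with synthetic spectra standing in for measured ones) can be sketched as:

      # Sketch of whole-spectrum isotope identification as a linear decomposition.
      # The library spectra and the mixture below are synthetic placeholders.
      import numpy as np

      channels = np.arange(1024)

      def peak(center, width, amplitude):
          return amplitude * np.exp(-0.5 * ((channels - center) / width) ** 2)

      # "Known" library spectra (one row per isotope).
      library = np.vstack([
          peak(300, 8, 1.0) + peak(700, 10, 0.4),    # isotope A
          peak(150, 6, 1.0),                          # isotope B
          peak(500, 9, 1.0) + peak(900, 12, 0.2),     # isotope C
      ])

      # Unknown sample: a linear superposition of the library plus noise.
      true_mix = np.array([0.7, 0.1, 0.5])
      unknown = true_mix @ library + 0.01 * np.random.default_rng(1).standard_normal(1024)

      # Least-squares estimate of the composition, the role played by the linear
      # perceptron / OLAM outputs in the paper.
      coeffs, *_ = np.linalg.lstsq(library.T, unknown, rcond=None)
      print(np.round(coeffs, 3))                      # close to [0.7, 0.1, 0.5]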

  10. Nonequilibrium landscape theory of neural networks

    Science.gov (United States)

    Yan, Han; Zhao, Lei; Hu, Liang; Wang, Xidi; Wang, Erkang; Wang, Jin

    2013-01-01

    The brain map project aims to map out the neuron connections of the human brain. Even with all of the wirings mapped out, the global and physical understandings of the function and behavior are still challenging. Hopfield quantified the learning and memory process of symmetrically connected neural networks globally through equilibrium energy. The energy basins of attractions represent memories, and the memory retrieval dynamics is determined by the energy gradient. However, the realistic neural networks are asymmetrically connected, and oscillations cannot emerge from symmetric neural networks. Here, we developed a nonequilibrium landscape–flux theory for realistic asymmetrically connected neural networks. We uncovered the underlying potential landscape and the associated Lyapunov function for quantifying the global stability and function. We found the dynamics and oscillations in human brains responsible for cognitive processes and physiological rhythm regulations are determined not only by the landscape gradient but also by the flux. We found that the flux is closely related to the degrees of the asymmetric connections in neural networks and is the origin of the neural oscillations. The neural oscillation landscape shows a closed-ring attractor topology. The landscape gradient attracts the network down to the ring. The flux is responsible for coherent oscillations on the ring. We suggest the flux may provide the driving force for associations among memories. We applied our theory to rapid-eye movement sleep cycle. We identified the key regulation factors for function through global sensitivity analysis of landscape topography against wirings, which are in good agreements with experiments. PMID:24145451
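
    In this landscape-flux picture the deterministic driving force of the network dynamics is commonly decomposed into a gradient of the potential plus a curl-like flux contribution; schematically, for a constant diffusion matrix (notation assumed here rather than copied from the paper),

      \[
      \mathbf{F}(\mathbf{x}) \;=\; -\,\mathbf{D}\,\nabla U(\mathbf{x}) \;+\; \frac{\mathbf{J}_{ss}(\mathbf{x})}{P_{ss}(\mathbf{x})},
      \qquad
      U(\mathbf{x}) \;=\; -\ln P_{ss}(\mathbf{x}),
      \]

    where P_ss is the steady-state probability distribution, U the potential landscape, D the diffusion matrix, and J_ss the steady-state probability flux. A nonzero, divergence-free J_ss breaks detailed balance and is what drives the coherent oscillations on the ring attractor described above.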

  11. Genetic algorithm for neural networks optimization

    Science.gov (United States)

    Setyawati, Bina R.; Creese, Robert C.; Sahirman, Sidharta

    2004-11-01

    This paper examines the forecasting performance of multi-layer feed-forward neural networks in modeling a particular foreign exchange rate, i.e. Japanese Yen/US Dollar. The effects of two learning methods, Back Propagation and Genetic Algorithm, with the neural network topology and other parameters held fixed, were investigated. The early results indicate that the application of this hybrid system seems to be well suited for the forecasting of foreign exchange rates. The Neural Networks and Genetic Algorithm were programmed using MATLAB.
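
    As a concrete, deliberately tiny illustration of the second learning method, the weights of a fixed-topology feed-forward network can be evolved with a genetic algorithm instead of back propagation. Everything below (the stand-in target series, the population size, and the selection, crossover and mutation settings) is an assumption for the sketch, not a detail reported in the paper:

      # Toy genetic algorithm that evolves the weights of a fixed one-hidden-layer
      # network (topology held fixed, as in the paper).  Dataset and GA settings
      # are illustrative assumptions.
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.uniform(-1, 1, size=(200, 3))
      y = np.sin(X[:, 0]) + 0.5 * X[:, 1] - 0.2 * X[:, 2]      # stand-in target series

      N_HIDDEN, N_IN = 8, 3
      N_W = N_HIDDEN * (N_IN + 1) + (N_HIDDEN + 1)             # weights plus biases

      def predict(w, X):
          W1 = w[:N_HIDDEN * N_IN].reshape(N_IN, N_HIDDEN)
          b1 = w[N_HIDDEN * N_IN:N_HIDDEN * (N_IN + 1)]
          W2, b2 = w[N_HIDDEN * (N_IN + 1):-1], w[-1]
          return np.tanh(X @ W1 + b1) @ W2 + b2

      def fitness(w):
          return -np.mean((predict(w, X) - y) ** 2)            # negative MSE

      pop = rng.standard_normal((60, N_W))
      for gen in range(200):
          scores = np.array([fitness(w) for w in pop])
          parents = pop[np.argsort(scores)[-20:]]              # truncation selection
          children = []
          for _ in range(len(pop)):
              a, b = parents[rng.integers(20)], parents[rng.integers(20)]
              mask = rng.random(N_W) < 0.5                     # uniform crossover
              children.append(np.where(mask, a, b) + 0.05 * rng.standard_normal(N_W))
          pop = np.array(children)

      best = pop[np.argmax([fitness(w) for w in pop])]
      print("MSE of best individual:", -fitness(best))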

  12. Estimation of Conditional Quantile using Neural Networks

    DEFF Research Database (Denmark)

    Kulczycki, P.; Schiøler, Henrik

    1999-01-01

    The problem of estimating conditional quantiles using neural networks is investigated here. A basic structure is developed using the methodology of kernel estimation, and a theory guaranteeing consistency on a mild set of assumptions is provided. The constructed structure constitutes a basis for the design of a variety of different neural networks, some of which are considered in detail. The task of estimating conditional quantiles is related to Bayes point estimation whereby a broad range of applications within engineering, economics and management can be suggested. Numerical results illustrating the capabilities of the elaborated neural network are also given.
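
    The record is truncated, so the kernel-based construction itself is not reproduced here. A common way to make a network estimate a conditional quantile, shown only as a generic illustration of the task, is to train it under the pinball (check) loss; the data, network size and learning-rate choices below are assumptions:

      # Estimating a conditional quantile with the pinball (check) loss, using a
      # tiny one-hidden-layer network trained by plain (sub)gradient descent.
      # This illustrates the task, not the paper's kernel-based construction.
      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(0, 1, size=(500, 1))
      y = np.sin(2 * np.pi * x[:, 0]) + (0.2 + 0.3 * x[:, 0]) * rng.standard_normal(500)

      tau = 0.9                                   # target quantile
      W1 = 0.5 * rng.standard_normal((1, 20)); b1 = np.zeros(20)
      w2 = 0.5 * rng.standard_normal(20);      b2 = 0.0

      lr = 0.05
      for _ in range(3000):
          h = np.tanh(x @ W1 + b1)                # forward pass
          q = h @ w2 + b2
          e = y - q
          # subgradient of the pinball loss w.r.t. q: -tau if e > 0, else 1 - tau
          g = np.where(e > 0, -tau, 1 - tau) / len(y)
          gh = np.outer(g, w2) * (1 - h ** 2)     # backpropagate through tanh
          W1 -= lr * (x.T @ gh); b1 -= lr * gh.sum(axis=0)
          w2 -= lr * (h.T @ g);  b2 -= lr * g.sum()

      print("fraction of data below the fitted curve:", np.mean(y <= q))  # roughly tau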

  13. Vectorized algorithms for spiking neural network simulation.

    Science.gov (United States)

    Brette, Romain; Goodman, Dan F M

    2011-06-01

    High-level languages (Matlab, Python) are popular in neuroscience because they are flexible and accelerate development. However, for simulating spiking neural networks, the cost of interpretation is a bottleneck. We describe a set of algorithms to simulate large spiking neural networks efficiently with high-level languages using vector-based operations. These algorithms constitute the core of Brian, a spiking neural network simulator written in the Python language. Vectorized simulation makes it possible to combine the flexibility of high-level languages with the computational efficiency usually associated with compiled languages.
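
    The flavour of such vector-based operations can be conveyed by a leaky integrate-and-fire update in which each time step touches all neurons at once through array operations, with no per-neuron Python loop. The parameters are generic and the code is an illustration of the idea rather than an excerpt from Brian:

      # Vectorized leaky integrate-and-fire update: one numpy operation per state
      # variable per time step.  Parameters are generic, not taken from the paper.
      import numpy as np

      N, dt, T = 10000, 0.1e-3, 0.5             # neurons, time step (s), duration (s)
      tau, v_rest, v_reset, v_thresh = 20e-3, -70e-3, -65e-3, -50e-3

      rng = np.random.default_rng(0)
      v = np.full(N, v_rest)
      I = 25e-3 * rng.random(N)                 # constant input drive per neuron (V)
      spike_counts = np.zeros(N, dtype=int)

      for _ in range(int(T / dt)):
          v += dt / tau * (v_rest - v + I)      # leaky integration, all neurons at once
          fired = v >= v_thresh                 # boolean mask of spiking neurons
          spike_counts += fired
          v[fired] = v_reset                    # vectorized reset

      print("mean firing rate (Hz):", spike_counts.mean() / T)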

  14. Convolutional Neural Network for Image Recognition

    CERN Document Server

    Seifnashri, Sahand

    2015-01-01

    The aim of this project is to use machine learning techniques, especially Convolutional Neural Networks, for image processing. These techniques can be used for Quark-Gluon discrimination using calorimeter data, but unfortunately I didn’t manage to get the calorimeter data and I just used the Jet data from miniaodsim (ak4 chs). The Jet data was not good enough for a Convolutional Neural Network, which is designed for ’image’ recognition. This report is made of two main parts: part one is mainly about implementing a Convolutional Neural Network on unphysical data such as MNIST digits and the CIFAR-10 dataset, and part two is about the Jet data.
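
    For readers unfamiliar with the architecture mentioned above, a minimal convolutional network of the kind typically applied to MNIST-sized images is sketched below in PyTorch; the framework choice and layer sizes are illustrative and are not specified in the report:

      # Minimal convolutional network for small images (e.g. 28x28 MNIST digits).
      # Layer sizes and the framework are illustrative choices.
      import torch
      import torch.nn as nn

      class SmallCNN(nn.Module):
          def __init__(self, n_classes=10):
              super().__init__()
              self.features = nn.Sequential(
                  nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
              )
              self.classifier = nn.Linear(32 * 7 * 7, n_classes)

          def forward(self, x):
              x = self.features(x)                  # (batch, 32, 7, 7)
              return self.classifier(x.flatten(1))

      model = SmallCNN()
      logits = model(torch.randn(8, 1, 28, 28))     # a dummy batch of 8 images
      print(logits.shape)                           # torch.Size([8, 10])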

  15. Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis

    Science.gov (United States)

    Chernoded, Andrey; Dudko, Lev; Myagkov, Igor; Volkov, Petr

    2017-10-01

    Most of the modern analyses in high energy physics use signal-versus-background classification techniques of machine learning methods, and neural networks in particular. The deep learning neural network is the most promising modern technique to separate signal and background and nowadays can be widely and successfully implemented as a part of physics analysis. In this article we compare the application of deep learning and Bayesian neural networks as classifiers in an instance of top quark analysis.

  16. Deep Learning Neural Networks and Bayesian Neural Networks in Data Analysis

    Directory of Open Access Journals (Sweden)

    Chernoded Andrey

    2017-01-01

    Full Text Available Most of the modern analyses in high energy physics use signal-versus-background classification techniques of machine learning methods, and neural networks in particular. The deep learning neural network is the most promising modern technique to separate signal and background and nowadays can be widely and successfully implemented as a part of physics analysis. In this article we compare the application of deep learning and Bayesian neural networks as classifiers in an instance of top quark analysis.

  17. LPA is a chemorepellent for B16 melanoma cells: action through the cAMP-elevating LPA5 receptor.

    Directory of Open Access Journals (Sweden)

    Maikel Jongsma

    Full Text Available Lysophosphatidic acid (LPA), a lipid mediator enriched in serum, stimulates cell migration, proliferation and other functions in many cell types. LPA acts on six known G protein-coupled receptors, termed LPA(1-6), showing both overlapping and distinct signaling properties. Here we show that, unexpectedly, LPA and serum almost completely inhibit the transwell migration of B16 melanoma cells, with alkyl-LPA(18:1) being 10-fold more potent than acyl-LPA(18:1). The anti-migratory response to LPA is highly polarized and dependent on protein kinase A (PKA) but not Rho kinase activity; it is associated with a rapid increase in intracellular cAMP levels and PIP3 depletion from the plasma membrane. B16 cells express LPA(2), LPA(5) and LPA(6) receptors. We show that LPA-induced chemorepulsion is mediated specifically by the alkyl-LPA-preferring LPA(5) receptor (GPR92), which raises intracellular cAMP via a noncanonical pathway. Our results define LPA(5) as an anti-migratory receptor and they implicate the cAMP-PKA pathway, along with reduced PIP3 signaling, as an effector of chemorepulsion in B16 melanoma cells.

  18. Neural processing of auditory signals and modular neural control for sound tropism of walking machines

    DEFF Research Database (Denmark)

    Manoonpong, Poramate; Pasemann, Frank; Fischer, Joern

    2005-01-01

    A neural preprocessing system together with a modular neural controller is used to generate a sound tropism of a four-legged walking machine. The neural preprocessing network acts as a low-pass filter and is followed by a network which discerns between signals coming from the left or the right. The parameters of these networks are optimized by an evolutionary algorithm. In addition, a simple modular neural controller then generates the desired different walking patterns such that the machine walks straight, then turns towards a switched-on sound source, and then stops near to it.

  19. Folate receptor 1 is necessary for neural plate cell apical constriction during Xenopus neural tube formation.

    Science.gov (United States)

    Balashova, Olga A; Visina, Olesya; Borodinsky, Laura N

    2017-04-15

    Folate supplementation prevents up to 70% of neural tube defects (NTDs), which result from a failure of neural tube closure during embryogenesis. The elucidation of the mechanisms underlying folate action has been challenging. This study introduces Xenopus laevis as a model to determine the cellular and molecular mechanisms involved in folate action during neural tube formation. We show that knockdown of folate receptor 1 (Folr1; also known as FRα) impairs neural tube formation and leads to NTDs. Folr1 knockdown in neural plate cells only is necessary and sufficient to induce NTDs. Folr1-deficient neural plate cells fail to constrict, resulting in widening of the neural plate midline and defective neural tube closure. Pharmacological inhibition of folate action by methotrexate during neurulation induces NTDs by inhibiting folate interaction with its uptake systems. Our findings support a model in which the folate receptor interacts with cell adhesion molecules, thus regulating the apical cell membrane remodeling and cytoskeletal dynamics necessary for neural plate folding. Further studies in this organism could unveil novel cellular and molecular events mediated by folate and lead to new ways of preventing NTDs. © 2017. Published by The Company of Biologists Ltd.

  20. Neural bases of accented speech perception

    Directory of Open Access Journals (Sweden)

    Patti eAdank

    2015-10-01

    Full Text Available The recognition of unfamiliar regional and foreign accents represents a challenging task for the speech perception system (Adank, Evans, Stuart-Smith, & Scott, 2009; Floccia, Goslin, Girard, & Konopczynski, 2006). Despite the frequency with which we encounter such accents, the neural mechanisms supporting successful perception of accented speech are poorly understood. Nonetheless, candidate neural substrates involved in processing speech in challenging listening conditions, including accented speech, are beginning to be identified. This review will outline neural bases associated with perception of accented speech in the light of current models of speech perception, and compare these data to brain areas associated with processing other speech distortions. We will subsequently evaluate competing models of speech processing with regards to neural processing of accented speech. See Cristia et al. (2012) for an in-depth overview of behavioural aspects of accent processing.

  1. [Medical use of artificial neural networks].

    Science.gov (United States)

    Molnár, B; Papik, K; Schaefer, R; Dombóvári, Z; Fehér, J; Tulassay, Z

    1998-01-04

    The main aim of research in medical diagnostics is to develop more exact, cost-effective and convenient systems, procedures and methods for supporting clinicians. In their paper the authors introduce a method that has recently come into focus, referred to as artificial neural networks. Based on the literature of the past 5-6 years they give a brief review--highlighting the most important works--showing the idea behind neural networks and what they are used for in the medical field. The definition, structure and operation of neural networks are discussed. In the application part they collect examples in order to give an insight into neural network application research. It is emphasised that in the near future fundamentally new diagnostic equipment can be developed based on this new technology in the field of ECG, EEG and macroscopic and microscopic image analysis systems.

  2. Screening for Open Neural Tube Defects.

    Science.gov (United States)

    Krantz, David A; Hallahan, Terrence W; Carmichael, Jonathan B

    2016-06-01

    Biochemical prenatal screening was initiated with the use of maternal serum alpha fetoprotein to screen for open neural tube defects. Screening now includes multiple marker and sequential screening protocols involving serum and ultrasound markers to screen for aneuploidy. Recently cell-free DNA screening for aneuploidy has been initiated, but does not screen for neural tube defects. Although ultrasound is highly effective in identifying neural tube defects in high-risk populations, in decentralized health systems maternal serum screening still plays a significant role. Abnormal maternal serum alpha fetoprotein alone or in combination with other markers may indicate adverse pregnancy outcome in the absence of open neural tube defects. Copyright © 2016 Elsevier Inc. All rights reserved.

  3. Hindcasting of storm waves using neural networks

    Digital Repository Service at National Institute of Oceanography (India)

    Rao, S.; Mandal, S.

    The absence of any exogenous input requirement makes the network attractive. A neural network is an information processing system modeled on the structure of the human brain. Its merit is the ability to deal with fuzzy information whose interrelation is ambiguous...

  4. Application of neural networks in coastal engineering

    Digital Repository Service at National Institute of Oceanography (India)

    Mandal, S.

    methods. That is why it is becoming popular in various fields including coastal engineering. Waves and tides play important roles in coastal erosion and accretion. This paper briefly describes back-propagation neural networks and their application...

  5. Additive Feed Forward Control with Neural Networks

    DEFF Research Database (Denmark)

    Sørensen, O.

    1999-01-01

    This paper demonstrates a method to control a non-linear, multivariable, noisy process using trained neural networks. The basis for the method is a trained neural network controller acting as the inverse process model. A training method for obtaining such an inverse process model is applied. A suitable 'shaped' (low-pass filtered) reference is used to overcome problems with excessive control action when using a controller acting as the inverse process model. The control concept is Additive Feed Forward Control, where the trained neural network controller, acting as the inverse process model, is placed in a supplementary pure feed-forward path to an existing feedback controller. This concept benefits from the fact that an existing, traditionally designed feedback controller can be retained without any modifications, and after training the connection of the neural network feed-forward controller...
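
    The additive structure described above can be illustrated with a toy loop: an existing feedback controller is kept unchanged while a trained inverse-model network adds a feed-forward term computed from the low-pass-filtered ('shaped') reference. The plant, the filter constant and all gains below are assumptions made for the sketch, not values from the paper:

      # Toy illustration of Additive Feed Forward Control: an unchanged feedback
      # (here PI) controller plus a neural feed-forward term from an approximate
      # inverse plant model applied to a low-pass-filtered reference.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      def plant(y, u):                            # assumed first-order nonlinear plant
          return 0.8 * y + 0.4 * np.tanh(u)

      # Train an inverse model u = f(y_next, y) from input/output data.
      rng = np.random.default_rng(0)
      u_data = rng.uniform(-3, 3, 5000)
      y_data = np.zeros(5001)
      for k in range(5000):
          y_data[k + 1] = plant(y_data[k], u_data[k])
      inv_model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000, random_state=0)
      inv_model.fit(np.column_stack([y_data[1:], y_data[:-1]]), u_data)

      # Closed loop: PI feedback (untouched) plus neural feed-forward on the shaped reference.
      y, integ, r_shaped, alpha = 0.0, 0.0, 0.0, 0.2   # alpha: low-pass filter constant
      for k in range(200):
          r = 1.0                                  # step reference
          r_shaped += alpha * (r - r_shaped)       # 'shaped' (low-pass filtered) reference
          u_ff = inv_model.predict([[r_shaped, y]])[0]
          e = r_shaped - y
          integ += e
          u_fb = 0.5 * e + 0.05 * integ            # existing feedback controller, unchanged
          y = plant(y, u_fb + u_ff)                # additive combination of both terms
      print("final output:", round(y, 3))          # should approach the reference 1.0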

  6. Hardware Acceleration of Adaptive Neural Algorithms.

    Energy Technology Data Exchange (ETDEWEB)

    James, Conrad D. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-11-01

    As traditional numerical computing has faced challenges, researchers have turned towards alternative computing approaches to reduce power-per-computation metrics and improve algorithm performance. Here, we describe an approach towards non-conventional computing that strengthens the connection between machine learning and neuroscience concepts. The Hardware Acceleration of Adaptive Neural Algorithms (HAANA) project has developed neural machine learning algorithms and hardware for applications in image processing and cybersecurity. While machine learning methods are effective at extracting relevant features from many types of data, the effectiveness of these algorithms degrades when subjected to real-world conditions. Our team has generated novel neural-inspired approaches to improve the resiliency and adaptability of machine learning algorithms. In addition, we have also designed and fabricated hardware architectures and microelectronic devices specifically tuned towards the training and inference operations of neural-inspired algorithms. Finally, our multi-scale simulation framework allows us to assess the impact of microelectronic device properties on algorithm performance.

  7. Optimal neural computations require analog processors

    Energy Technology Data Exchange (ETDEWEB)

    Beiu, V.

    1998-12-31

    This paper discusses some of the limitations of hardware implementations of neural networks. The authors start by presenting neural structures and their biological inspirations, while mentioning the simplifications leading to artificial neural networks. Further, the focus will be on hardware-imposed constraints. They will present recent results for three different alternatives of parallel implementations of neural networks: digital circuits, threshold gate circuits, and analog circuits. The area and the delay will be related to the neurons' fan-in and to the precision of their synaptic weights. The main conclusion is that hardware-efficient solutions require analog computations, and it suggests the following two alternatives: (i) cope with the limitations imposed by silicon by speeding up the computation of the elementary silicon neurons; (ii) investigate solutions which would allow the use of the third dimension (e.g. using optical interconnections).

  8. Blood glucose prediction using neural network

    Science.gov (United States)

    Soh, Chit Siang; Zhang, Xiqin; Chen, Jianhong; Raveendran, P.; Soh, Phey Hong; Yeo, Joon Hock

    2008-02-01

    We used a neural network for blood glucose level determination in this study. The data set was collected using a non-invasive blood glucose monitoring system with six laser diodes, each operating at a distinct near-infrared wavelength between 1500 nm and 1800 nm. The neural network is specifically used to determine the blood glucose level of one individual who participated in an oral glucose tolerance test (OGTT) session. Partial least squares regression is also used for blood glucose level determination for the purpose of comparison with the neural network model. The neural network model performs better in the prediction of blood glucose level than the partial least squares model.
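
    The comparison described above (a small neural network versus partial least squares regression, each mapping six near-infrared readings to a glucose value) can be sketched with scikit-learn; the spectra below are synthetic stand-ins rather than OGTT measurements:

      # Sketch of the comparison: a small neural network versus partial least
      # squares regression on six "wavelength" features.  Data are synthetic.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import train_test_split
      from sklearn.neural_network import MLPRegressor
      from sklearn.metrics import mean_absolute_error

      rng = np.random.default_rng(0)
      glucose = rng.uniform(4, 12, 300)                          # mmol/L
      X = np.column_stack([np.log1p(glucose) * w + 0.05 * rng.standard_normal(300)
                           for w in (0.8, 1.0, 1.1, 0.9, 1.2, 0.7)])   # six channels

      X_tr, X_te, y_tr, y_te = train_test_split(X, glucose, random_state=0)

      nn = MLPRegressor(hidden_layer_sizes=(16,), max_iter=5000, random_state=0).fit(X_tr, y_tr)
      pls = PLSRegression(n_components=3).fit(X_tr, y_tr)

      print("NN  MAE:", mean_absolute_error(y_te, nn.predict(X_te)))
      print("PLS MAE:", mean_absolute_error(y_te, pls.predict(X_te).ravel()))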

  9. Opioid Use and Neural Tube Defects

    Science.gov (United States)

    Key Findings: Opioid Use and Neural Tube Defects ... new study that looked at the use of opioids during pregnancy and their relationship to having a ...

  10. Theory of mind: a neural prediction problem.

    Science.gov (United States)

    Koster-Hale, Jorie; Saxe, Rebecca

    2013-09-04

    Predictive coding posits that neural systems make forward-looking predictions about incoming information. Neural signals contain information not about the currently perceived stimulus, but about the difference between the observed and the predicted stimulus. We propose to extend the predictive coding framework from high-level sensory processing to the more abstract domain of theory of mind; that is, to inferences about others' goals, thoughts, and personalities. We review evidence that, across brain regions, neural responses to depictions of human behavior, from biological motion to trait descriptions, exhibit a key signature of predictive coding: reduced activity to predictable stimuli. We discuss how future experiments could distinguish predictive coding from alternative explanations of this response profile. This framework may provide an important new window on the neural computations underlying theory of mind. Copyright © 2013 Elsevier Inc. All rights reserved.

  11. Neural plasticity after spinal cord injury.

    Science.gov (United States)

    Liu, Jian; Yang, Xiaoyu; Jiang, Lianying; Wang, Chunxin; Yang, Maoguang

    2012-02-15

    Plasticity changes of uninjured nerves can result in a novel neural circuit after spinal cord injury, which can restore sensory and motor functions to different degrees. Although processes of neural plasticity have been studied, the mechanism and treatment to effectively improve neural plasticity changes remain controversial. The present study reviewed studies regarding plasticity of the central nervous system and methods for promoting plasticity to improve repair of injured central nerves. The results showed that synaptic reorganization, axonal sprouting, and neurogenesis are critical factors for neural circuit reconstruction. Directed functional exercise, neurotrophic factor and transplantation of nerve-derived and non-nerve-derived tissues and cells can effectively ameliorate functional disturbances caused by spinal cord injury and improve quality of life for patients.

  12. FOREX PREDICTION USING A NEURAL NETWORK MODEL

    Directory of Open Access Journals (Sweden)

    R. Hadapiningradja Kusumodestoni

    2015-11-01

    Full Text Available ABSTRACT Prediction is one of the most important techniques in running a forex business. The decision made in prediction is critical, because prediction makes it possible to know the forex value at a given time in the future and thus reduce the risk of loss. The aim of this research is to predict the forex market using a neural network model with per-1-minute time series data, in order to determine the prediction accuracy and thereby reduce the risk of running a forex business. The research method consists of data collection followed by training, learning and testing using a neural network. After evaluation, the results show that the Neural Network algorithm is able to predict forex with a prediction accuracy of 0.431 +/- 0.096, so this prediction can help reduce the risk of running a forex business. Keywords: prediction, forex, neural network.

  13. Neural adaptations to electrical stimulation strength training

    NARCIS (Netherlands)

    Hortobagyi, Tibor; Maffiuletti, Nicola A.

    2011-01-01

    This review provides evidence for the hypothesis that electrostimulation strength training (EST) increases the force of a maximal voluntary contraction (MVC) through neural adaptations in healthy skeletal muscle. Although electrical stimulation and voluntary effort activate muscle differently, there

  14. Using Neural Networks in Diagnosing Breast Cancer

    National Research Council Canada - National Science Library

    Fogel, David

    1997-01-01

    .... In the current study, evolutionary programming is used to train neural networks and linear discriminant models to detect breast cancer in suspicious and microcalcifications using radiographic features and patient age...

  15. Neural Networks in Mobile Robot Motion

    Directory of Open Access Journals (Sweden)

    Danica Janglová

    2004-03-01

    Full Text Available This paper deals with path planning and intelligent control of an autonomous robot which should move safely in a partially structured environment. This environment may involve any number of obstacles of arbitrary shape and size; some of them are allowed to move. We describe our approach to solving the motion-planning problem in mobile robot control using a neural-network-based technique. Our method for constructing a collision-free path for the moving robot among obstacles is based on two neural networks. The first neural network is used to determine the “free” space using ultrasound range finder data. The second neural network “finds” a safe direction for the next section of the robot's path in the workspace while avoiding the nearest obstacles. Simulation examples of paths generated with the proposed techniques will be presented.

  16. Isolated Speech Recognition Using Artificial Neural Networks

    National Research Council Canada - National Science Library

    Polur, Prasad

    2001-01-01

    .... A small size vocabulary containing the words YES and NO is chosen. Spectral features using cepstral analysis are extracted per frame and imported to a feedforward neural network which uses a backpropagation with momentum training algorithm...

  17. Control of autonomous robot using neural networks

    Science.gov (United States)

    Barton, Adam; Volna, Eva

    2017-07-01

    The aim of the article is to design a method of control of an autonomous robot using artificial neural networks. The introductory part describes control issues from the perspective of autonomous robot navigation and the current mobile robots controlled by neural networks. The core of the article is the design of the controlling neural network, and generation and filtration of the training set using ART1 (Adaptive Resonance Theory). The outcome of the practical part is an assembled Lego Mindstorms EV3 robot solving the problem of avoiding obstacles in space. To verify models of an autonomous robot behavior, a set of experiments was created as well as evaluation criteria. The speed of each motor was adjusted by the controlling neural network with respect to the situation in which the robot was found.

  18. Neural Networks in Mobile Robot Motion

    Directory of Open Access Journals (Sweden)

    Danica Janglova

    2008-11-01

    Full Text Available This paper deals with path planning and intelligent control of an autonomous robot which should move safely in a partially structured environment. This environment may involve any number of obstacles of arbitrary shape and size; some of them are allowed to move. We describe our approach to solving the motion-planning problem in mobile robot control using a neural-network-based technique. Our method for constructing a collision-free path for the moving robot among obstacles is based on two neural networks. The first neural network is used to determine the "free" space using ultrasound range finder data. The second neural network "finds" a safe direction for the next section of the robot's path in the workspace while avoiding the nearest obstacles. Simulation examples of paths generated with the proposed techniques will be presented.

  19. Artificial neural networks a practical course

    CERN Document Server

    da Silva, Ivan Nunes; Andrade Flauzino, Rogerio; Liboni, Luisa Helena Bartocci; dos Reis Alves, Silas Franco

    2017-01-01

    This book provides comprehensive coverage of neural networks, their evolution, their structure, the problems they can solve, and their applications. The first half of the book looks at theoretical investigations on artificial neural networks and addresses the key architectures that are capable of implementation in various application scenarios. The second half is designed specifically for the production of solutions using artificial neural networks to solve practical problems arising from different areas of knowledge. It also describes the various implementation details that were taken into account to achieve the reported results. These aspects contribute to the maturation and improvement of experimental techniques to specify the neural network architecture that is most appropriate for a particular application scope. The book is appropriate for students in graduate and upper undergraduate courses in addition to researchers and professionals.

  20. Neural bases of accented speech perception.

    Science.gov (United States)

    Adank, Patti; Nuttall, Helen E; Banks, Briony; Kennedy-Higgins, Daniel

    2015-01-01

    The recognition of unfamiliar regional and foreign accents represents a challenging task for the speech perception system (Floccia et al., 2006; Adank et al., 2009). Despite the frequency with which we encounter such accents, the neural mechanisms supporting successful perception of accented speech are poorly understood. Nonetheless, candidate neural substrates involved in processing speech in challenging listening conditions, including accented speech, are beginning to be identified. This review will outline neural bases associated with perception of accented speech in the light of current models of speech perception, and compare these data to brain areas associated with processing other speech distortions. We will subsequently evaluate competing models of speech processing with regards to neural processing of accented speech. See Cristia et al. (2012) for an in-depth overview of behavioral aspects of accent processing.

  1. Constructive autoassociative neural network for facial recognition.

    Directory of Open Access Journals (Sweden)

    Bruno J T Fernandes

    Full Text Available Autoassociative artificial neural networks have been used in many different computer vision applications. However, it is difficult to define the most suitable neural network architecture because this definition is based on previous knowledge and depends on the problem domain. To address this problem, we propose a constructive autoassociative neural network called CANet (Constructive Autoassociative Neural Network. CANet integrates the concepts of receptive fields and autoassociative memory in a dynamic architecture that changes the configuration of the receptive fields by adding new neurons in the hidden layer, while a pruning algorithm removes neurons from the output layer. Neurons in the CANet output layer present lateral inhibitory connections that improve the recognition rate. Experiments in face recognition and facial expression recognition show that the CANet outperforms other methods presented in the literature.

  2. Genetic Algorithm Optimized Neural Networks Ensemble as ...

    African Journals Online (AJOL)

    NJD

    Genetic Algorithm Optimized Neural Networks Ensemble as Calibration Model for Simultaneous Spectrophotometric Estimation of Atenolol and Losartan Potassium in Tablets. Dondeti Satyanarayana*, Kamarajan Kannan and Rajappan Manavalan. Department of Pharmacy, Annamalai University, Annamalainagar, Tamil ...

  3. Removing Epistemological Bias From Empirical Observation of Neural Networks

    OpenAIRE

    Waldron, Ronan

    1994-01-01

    Also in Proceedings of the International Joint Conference on Neural Networks, Nagoya, Japan. This paper addresses the application of neural network research to a theory of autonomous systems. Neural networks, while enjoying considerable success in autonomous systems applications, have failed to provide a firm theoretical underpinning to neural systems embedded in their natural ecological context. This paper proposes a stochastic formulation of such an embedding. A neural sys...

  4. Methodology of Neural Design: Applications in Microwave Engineering

    OpenAIRE

    Z. Raida; P. Pomenka

    2006-01-01

    In the paper, an original methodology for the automatic creation of neural models of microwave structures is proposed and verified. Following the methodology, neural models of the prescribed accuracy are built within the minimum CPU time. Validity of the proposed methodology is verified by developing neural models of selected microwave structures. Functionality of neural models is verified in a design - a neural model is joined with a genetic algorithm to find a global minimum of a formulat...

  5. CDMA and TDMA based neural nets.

    Science.gov (United States)

    Herrero, J C

    2001-06-01

    CDMA and TDMA telecommunication techniques were established a long time ago, but they have acquired a renewed presence due to the rapidly increasing demand for mobile phones. In this paper, we are going to see that they are suitable for neural nets if we abandon the concept of a "connection" between processing units and adopt the concept of "messages" exchanged between them. This may open the door to neural nets with a higher number of processing units and a flexible configuration.

  6. Culture of Mouse Neural Stem Cell Precursors

    OpenAIRE

    Currle, D. Spencer; Hu, Jia Sheng; Kolski-Andreaco, Aaron; Monuki, Edwin S.

    2007-01-01

    Primary neural stem cell cultures are useful for studying the mechanisms underlying central nervous system development. Stem cell research will increase our understanding of the nervous system and may allow us to develop treatments for currently incurable brain diseases and injuries. In addition, stem cells should be used for stem cell research aimed at the detailed study of mechanisms of neural differentiation and transdifferentiation and the genetic and environmental signals that direct the...

  7. Learning Processes of Layered Neural Networks

    OpenAIRE

    Fujiki, Sumiyoshi; FUJIKI, Nahomi, M.

    1995-01-01

    A positive reinforcement type learning algorithm is formulated for a stochastic feed-forward neural network, and a learning equation similar to that of the Boltzmann machine algorithm is obtained. By applying a mean field approximation to the same stochastic feed-forward neural network, a deterministic analog feed-forward network is obtained and the back-propagation learning rule is re-derived.

  8. Drift chamber tracking with neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Lindsey, C.S.; Denby, B.; Haggerty, H.

    1992-10-01

    We discuss drift chamber tracking with a commercial analog VLSI neural network chip. Voltages proportional to the drift times in a 4-layer drift chamber were presented to the Intel ETANN chip. The network was trained to provide the intercept and slope of straight tracks traversing the chamber. The outputs were recorded and later compared off line to conventional track fits. Two types of network architectures were studied. Applications of neural network tracking to high energy physics detector triggers are discussed.
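
    The mapping learned by the ETANN chip (four drift-time signals to the intercept and slope of a straight track) can be imitated offline with a small regression network. The chamber geometry, drift-time model and noise level below are invented for the sketch, and the chamber's left-right ambiguity is ignored:

      # Offline imitation of the tracking task: a small network maps four drift
      # times to the intercept and slope of a straight track.  Geometry, drift
      # model and noise are invented for this sketch.
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)
      layer_y = np.array([0.0, 1.0, 2.0, 3.0])            # wire-plane positions (cm)

      n = 5000
      intercept = rng.uniform(-1, 1, n)
      slope = rng.uniform(-0.5, 0.5, n)
      x_hit = intercept[:, None] + slope[:, None] * layer_y
      # Toy "drift times": proportional to the signed distance to the wire;
      # the real chamber's left-right ambiguity is ignored here.
      drift_time = x_hit + 0.02 * rng.standard_normal((n, 4))

      net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
      net.fit(drift_time, np.column_stack([intercept, slope]))

      test_track = (0.3 + 0.2 * layer_y)[None, :]          # intercept 0.3, slope 0.2
      print(net.predict(test_track))                        # roughly [[0.3, 0.2]]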

  9. Can neural machine translation do simultaneous translation?

    OpenAIRE

    Cho, Kyunghyun; Esipova, Masha

    2016-01-01

    We investigate the potential of attention-based neural machine translation in simultaneous translation. We introduce a novel decoding algorithm, called simultaneous greedy decoding, that allows an existing neural machine translation model to begin translating before a full source sentence is received. This approach is unique from previous works on simultaneous translation in that segmentation and translation are done jointly to maximize the translation quality and that translating each segmen...

  10. Targeting the Neural Microenvironment in Prostate Cancer

    Science.gov (United States)

    2017-10-01

    Award Number: W81XWH-14-1-0505. Title: Targeting the Neural Microenvironment in Prostate Cancer. Principal Investigator: Michael Ittmann, MD, PhD. Abstract: Prostate cancer (PCa) remains the most common malignancy and the second leading cause of cancer-related death for men in the United States. Recent

  11. Reading Neural Encodings using Phase Space Methods

    OpenAIRE

    Abarbanel, Henry D. I.; Tumer, Evren C.

    2003-01-01

    Environmental signals sensed by nervous systems are often represented in spike trains carried from sensory neurons to higher neural functions where decisions and functional actions occur. Information about the environmental stimulus is contained (encoded) in the train of spikes. We show how to "read" the encoding using state space methods of nonlinear dynamics. We create a mapping from spike signals which are output from the neural processing system back to an estimate of the analog input sig...

  12. Human Neural Cell-Based Biosensor

    Science.gov (United States)

    2013-05-28

    Orlando R, Stice SL. Membrane proteomic signatures of karyotypically normal and abnormal human embryonic stem cell lines and derivatives. Proteomics. 2011. ... format (96-, 384-well) assays, 2) grow as adherent monolayers, and 3) possess a stable karyotype for multiple (>10) passages with a doubling time of ~36 ... derived neural progenitor cell line working stock has been amplified, characterized for karyotype and evaluated for the expression of neural progenitor

  13. Neural crest development in fetal alcohol syndrome.

    Science.gov (United States)

    Smith, Susan M; Garic, Ana; Flentke, George R; Berres, Mark E

    2014-09-01

    Fetal alcohol spectrum disorder (FASD) is a leading cause of neurodevelopmental disability. Some affected individuals possess distinctive craniofacial deficits, but many more lack overt facial changes. An understanding of the mechanisms underlying these deficits would inform their diagnostic utility. Our understanding of these mechanisms is challenged because ethanol lacks a single receptor when redirecting cellular activity. This review summarizes our current understanding of how ethanol alters neural crest development. Ample evidence shows that ethanol causes the "classic" fetal alcohol syndrome (FAS) face (short palpebral fissures, elongated upper lip, deficient philtrum) because it suppresses prechordal plate outgrowth, thereby reducing neuroectoderm and neural crest induction and causing holoprosencephaly. Prenatal alcohol exposure (PAE) at premigratory stages elicits a different facial appearance, indicating FASD may represent a spectrum of facial outcomes. PAE at this premigratory period initiates a calcium transient that activates CaMKII and destabilizes transcriptionally active β-catenin, thereby initiating apoptosis within neural crest populations. Contributing to neural crest vulnerability are their low antioxidant responses. Ethanol-treated neural crest produce reactive oxygen species and free radical scavengers attenuate their production and prevent apoptosis. Ethanol also significantly impairs neural crest migration, causing cytoskeletal rearrangements that destabilize focal adhesion formation; their directional migratory capacity is also lost. Genetic factors further modify vulnerability to ethanol-induced craniofacial dysmorphology and include genes important for neural crest development, including shh signaling, PDFGA, vangl2, and ribosomal biogenesis. Because facial and brain development are mechanistically and functionally linked, research into ethanol's effects on neural crest also informs our understanding of ethanol's CNS pathologies. © 2014

  14. Applications of Pulse-Coupled Neural Networks

    CERN Document Server

    Ma, Yide; Wang, Zhaobin

    2011-01-01

    "Applications of Pulse-Coupled Neural Networks" explores the fields of image processing, including image filtering, image segmentation, image fusion, image coding, image retrieval, and biometric recognition, and the role of pulse-coupled neural networks in these fields. This book is intended for researchers and graduate students in artificial intelligence, pattern recognition, electronic engineering, and computer science. Prof. Yide Ma conducts research on intelligent information processing, biomedical image processing, and embedded system development at the School of Information Sci

  15. Radioactive fallout and neural tube defects

    Directory of Open Access Journals (Sweden)

    Nejat Akar

    2015-10-01

    Full Text Available The possible link between radioactivity and the occurrence of neural tube defects has been a long-lasting debate since the Chernobyl nuclear fallout in 1986. A recent report on the incidence of neural tube defects on the west coast of the USA following the Fukushima disaster provided further evidence for an effect of radioactive fallout on the occurrence of NTDs. Here, a literature review focusing on this subject was performed.

  16. Neural networks as models of psychopathology.

    Science.gov (United States)

    Aakerlund, L; Hemmingsen, R

    1998-04-01

    Neural network modeling is situated between neurobiology, cognitive science, and neuropsychology. The structural and functional resemblance with biological computation has made artificial neural networks (ANN) useful for exploring the relationship between neurobiology and computational performance, i.e., cognition and behavior. This review provides an introduction to the theory of ANN and how they have linked theories from neurobiology and psychopathology in schizophrenia, affective disorders, and dementia.

  17. Neural Reranking for Named Entity Recognition

    OpenAIRE

    Yang, Jie; Zhang, Yue; Dong, Fei

    2017-01-01

    We propose a neural reranking system for named entity recognition (NER). The basic idea is to leverage recurrent neural network models to learn sentence-level patterns that involve named entity mentions. In particular, given an output sentence produced by a baseline NER model, we replace all entity mentions, such as Barack Obama, with their entity types, such as PER. The resulting sentence patterns contain direct output information, yet are less sparse without specific named ...

  18. Microtubules, polarity and vertebrate neural tube morphogenesis.

    Science.gov (United States)

    Cearns, Michael D; Escuin, Sarah; Alexandre, Paula; Greene, Nicholas D E; Copp, Andrew J

    2016-07-01

    Microtubules (MTs) are key cellular components, long known to participate in morphogenetic events that shape the developing embryo. However, the links between the cellular functions of MTs, their effects on cell shape and polarity, and their role in large-scale morphogenesis remain poorly understood. Here, these relationships were examined with respect to two strategies for generating the vertebrate neural tube: bending and closure of the mammalian neural plate; and cavitation of the teleost neural rod. The latter process has been compared with 'secondary' neurulation that generates the caudal spinal cord in mammals. MTs align along the apico-basal axis of the mammalian neuroepithelium early in neural tube closure, participating functionally in interkinetic nuclear migration, which indirectly impacts on cell shape. Whether MTs play other functional roles in mammalian neurulation remains unclear. In the zebrafish, MTs are important for defining the neural rod midline prior to its cavitation, both by localizing apical proteins at the tissue midline and by orienting cell division through a mirror-symmetric MT apparatus that helps to further define the medial localization of apical polarity proteins. Par proteins have been implicated in centrosome positioning in neuroepithelia as well as in the control of polarized morphogenetic movements in the neural rod. Understanding of MT functions during early nervous system development has so far been limited, partly by techniques that fail to distinguish 'cause' from 'effect'. Future developments will likely rely on novel ways to selectively impair MT function in order to investigate the roles they play. © 2016 Anatomical Society.

  19. Modular representation of layered neural networks.

    Science.gov (United States)

    Watanabe, Chihiro; Hiramatsu, Kaoru; Kashino, Kunio

    2018-01-01

    Layered neural networks have greatly improved the performance of various applications including image processing, speech recognition, natural language processing, and bioinformatics. However, it is still difficult to discover or interpret knowledge from the inference provided by a layered neural network, since its internal representation has many nonlinear and complex parameters embedded in hierarchical layers. Therefore, it becomes important to establish a new methodology by which layered neural networks can be understood. In this paper, we propose a new method for extracting a global and simplified structure from a layered neural network. Based on network analysis, the proposed method detects communities or clusters of units with similar connection patterns. We show its effectiveness by applying it to three use cases. (1) Network decomposition: it can decompose a trained neural network into multiple small independent networks thus dividing the problem and reducing the computation time. (2) Training assessment: the appropriateness of a trained result with a given hyperparameter or randomly chosen initial parameters can be evaluated by using a modularity index. And (3) data analysis: in practical data it reveals the community structure in the input, hidden, and output layers, which serves as a clue for discovering knowledge from a trained neural network. Copyright © 2017 Elsevier Ltd. All rights reserved.
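
    The kind of decomposition described above, clustering units by their connection patterns, can be approximated by building a weighted graph from the absolute values of the trained weights and running an off-the-shelf community detection routine. The layer sizes and weights below are random placeholders rather than a trained model, and the modularity-based algorithm is one possible choice rather than necessarily the authors' method:

      # Approximate illustration of extracting a modular structure from a layered
      # network: build a graph whose edge weights are absolute connection weights,
      # then detect communities of units.
      import numpy as np
      import networkx as nx
      from networkx.algorithms.community import greedy_modularity_communities

      rng = np.random.default_rng(0)
      layer_sizes = [6, 8, 4]                        # input, hidden, output units
      weights = [rng.standard_normal((6, 8)), rng.standard_normal((8, 4))]

      G = nx.Graph()
      offsets = np.cumsum([0] + layer_sizes)         # global unit indices per layer
      for layer, W in enumerate(weights):
          for i in range(W.shape[0]):
              for j in range(W.shape[1]):
                  G.add_edge(offsets[layer] + i, offsets[layer + 1] + j, weight=abs(W[i, j]))

      communities = greedy_modularity_communities(G, weight="weight")
      for k, c in enumerate(communities):
          print(f"community {k}: units {sorted(c)}")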

  20. The Laplacian spectrum of neural networks

    Science.gov (United States)

    de Lange, Siemon C.; de Reus, Marcel A.; van den Heuvel, Martijn P.

    2014-01-01

    The brain is a complex network of neural interactions, both at the microscopic and macroscopic level. Graph theory is well suited to examine the global network architecture of these neural networks. Many popular graph metrics, however, encode average properties of individual network elements. Complementing these “conventional” graph metrics, the eigenvalue spectrum of the normalized Laplacian describes a network's structure directly at a systems level, without referring to individual nodes or connections. In this paper, the Laplacian spectra of the macroscopic anatomical neuronal networks of the macaque and cat, and the microscopic network of the Caenorhabditis elegans were examined. Consistent with conventional graph metrics, analysis of the Laplacian spectra revealed an integrative community structure in neural brain networks. Extending previous findings of overlap of network attributes across species, similarity of the Laplacian spectra across the cat, macaque and C. elegans neural networks suggests a certain level of consistency in the overall architecture of the anatomical neural networks of these species. Our results further suggest a specific network class for neural networks, distinct from conceptual small-world and scale-free models as well as several empirical networks. PMID:24454286
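
    For reference, the normalized Laplacian whose eigenvalue spectrum is examined here is, in standard notation, for an adjacency (connectivity) matrix A with degree matrix D,

      \[
      \mathcal{L} \;=\; I - D^{-1/2} A\, D^{-1/2},
      \qquad
      0 = \lambda_1 \le \lambda_2 \le \cdots \le \lambda_n \le 2 ,
      \]

    and it is the distribution of the eigenvalues lambda_i (for instance, the weight near zero, which reflects how easily the network separates into weakly coupled modules, and the shape of the bulk) that summarizes network structure at a systems level without reference to individual nodes or connections.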

  1. Logarithmic learning for generalized classifier neural network.

    Science.gov (United States)

    Ozyildirim, Buse Melis; Avci, Mutlu

    2014-12-01

    The generalized classifier neural network is introduced as an efficient classifier among others. Unless the initial smoothing parameter value is close to the optimal one, the generalized classifier neural network suffers from a convergence problem and requires quite a long time to converge. In this work, to overcome this problem, a logarithmic learning approach is proposed. The proposed method uses a logarithmic cost function instead of the squared error. Minimization of this cost function reduces the number of iterations needed to reach the minimum. The proposed method is tested on 15 different data sets and the performance of the logarithmic learning generalized classifier neural network is compared with that of the standard one. Thanks to the operating range of the radial basis function included in the generalized classifier neural network, the proposed logarithmic approach and its derivative have continuous values. This makes it possible to exploit the fast convergence of the logarithmic cost function in the proposed learning method. Due to this fast convergence, training time is decreased by up to 99.2%. In addition to the decrease in training time, classification performance may also be improved by up to 60%. According to the test results, while the proposed method provides a solution to the time requirement problem of the generalized classifier neural network, it may also improve the classification accuracy. The proposed method can be considered an efficient way of reducing the time requirement of the generalized classifier neural network. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Diabetic retinopathy screening using deep neural network.

    Science.gov (United States)

    Ramachandran, Nishanthan; Hong, Sheng Chiong; Sime, Mary J; Wilson, Graham A

    2017-09-07

    There is a burgeoning interest in the use of deep neural networks in diabetic retinal screening. To determine whether a deep neural network could satisfactorily detect diabetic retinopathy that requires referral to an ophthalmologist from a local diabetic retinal screening programme and an international database. Retrospective audit. Diabetic retinal photos from the Otago database photographed during October 2016 (485 photos), and 1200 photos from the Messidor international database. Receiver operating characteristic curve to illustrate the ability of a deep neural network to identify referable diabetic retinopathy (moderate or worse diabetic retinopathy or exudates within one disc diameter of the fovea). Area under the receiver operating characteristic curve, sensitivity and specificity. For detecting referable diabetic retinopathy, the deep neural network had an area under the receiver operating characteristic curve of 0.901 (95% confidence interval 0.807-0.995), with 84.6% sensitivity and 79.7% specificity for Otago, and 0.980 (95% confidence interval 0.973-0.986), with 96.0% sensitivity and 90.0% specificity for Messidor. This study has shown that a deep neural network can detect referable diabetic retinopathy with sensitivities and specificities close to or better than 80% from both an international and a domestic (New Zealand) database. We believe that deep neural networks can be integrated into community screening once they can successfully detect both diabetic retinopathy and diabetic macular oedema. © 2017 Royal Australian and New Zealand College of Ophthalmologists.
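
    The figures quoted above (area under the ROC curve together with sensitivity and specificity at an operating point) can be computed from any classifier's per-image scores with scikit-learn; the labels and scores below are dummy values standing in for grader labels and network outputs:

      # Computing ROC AUC, sensitivity and specificity from per-image scores.
      # Labels and scores here are dummies, not data from the study.
      import numpy as np
      from sklearn.metrics import roc_auc_score, roc_curve

      rng = np.random.default_rng(0)
      y_true = rng.integers(0, 2, 500)                   # 1 = referable retinopathy
      scores = np.clip(0.3 * rng.standard_normal(500) + 0.5 * y_true + 0.25, 0, 1)

      auc = roc_auc_score(y_true, scores)
      fpr, tpr, thresholds = roc_curve(y_true, scores)
      op = np.argmax(tpr - fpr)                          # Youden-index operating point
      print(f"AUC={auc:.3f}  sensitivity={tpr[op]:.2f}  specificity={1 - fpr[op]:.2f}")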

  3. Recent advances in neural recording microsystems.

    Science.gov (United States)

    Gosselin, Benoit

    2011-01-01

    The accelerating pace of research in neuroscience has created a considerable demand for neural interfacing microsystems capable of monitoring the activity of large groups of neurons. These emerging tools have revealed a tremendous potential for the advancement of knowledge in brain research and for the development of useful clinical applications. They can extract the relevant control signals directly from the brain enabling individuals with severe disabilities to communicate their intentions to other devices, like computers or various prostheses. Such microsystems are self-contained devices composed of a neural probe attached with an integrated circuit for extracting neural signals from multiple channels, and transferring the data outside the body. The greatest challenge facing development of such emerging devices into viable clinical systems involves addressing their small form factor and low-power consumption constraints, while providing superior resolution. In this paper, we survey the recent progress in the design and the implementation of multi-channel neural recording Microsystems, with particular emphasis on the design of recording and telemetry electronics. An overview of the numerous neural signal modalities is given and the existing microsystem topologies are covered. We present energy-efficient sensory circuits to retrieve weak signals from neural probes and we compare them. We cover data management and smart power scheduling approaches, and we review advances in low-power telemetry. Finally, we conclude by summarizing the remaining challenges and by highlighting the emerging trends in the field.

  4. Brain and language: evidence for neural multifunctionality.

    Science.gov (United States)

    Cahana-Amitay, Dalia; Albert, Martin L

    2014-01-01

    This review paper presents converging evidence from studies of brain damage and longitudinal studies of language in aging which supports the following thesis: the neural basis of language can best be understood by the concept of neural multifunctionality. In this paper the term "neural multifunctionality" refers to incorporation of nonlinguistic functions into language models of the intact brain, reflecting a multifunctional perspective whereby a constant and dynamic interaction exists among neural networks subserving cognitive, affective, and praxic functions with neural networks specialized for lexical retrieval, sentence comprehension, and discourse processing, giving rise to language as we know it. By way of example, we consider effects of executive system functions on aspects of semantic processing among persons with and without aphasia, as well as the interaction of executive and language functions among older adults. We conclude by indicating how this multifunctional view of brain-language relations extends to the realm of language recovery from aphasia, where evidence of the influence of nonlinguistic factors on the reshaping of neural circuitry for aphasia rehabilitation is clearly emerging.

  5. Aging affects neural precision of speech encoding.

    Science.gov (United States)

    Anderson, Samira; Parbery-Clark, Alexandra; White-Schwoch, Travis; Kraus, Nina

    2012-10-10

    Older adults frequently report they can hear what is said but cannot understand the meaning, especially in noise. This difficulty may arise from the inability to process rapidly changing elements of speech. Aging is accompanied by a general slowing of neural processing and decreased neural inhibition, both of which likely interfere with temporal processing in auditory and other sensory domains. Age-related reductions in inhibitory neurotransmitter levels and delayed neural recovery can contribute to decreases in the temporal precision of the auditory system. Decreased precision may lead to neural timing delays, reductions in neural response magnitude, and a disadvantage in processing the rapid acoustic changes in speech. The auditory brainstem response (ABR), a scalp-recorded electrical potential, is known for its ability to capture precise neural synchrony within subcortical auditory nuclei; therefore, we hypothesized that a loss of temporal precision results in subcortical timing delays and decreases in response consistency and magnitude. To assess this hypothesis, we recorded ABRs to the speech syllable /da/ in normal hearing younger (18-30 years old) and older (60-67 years old) adult humans. Older adults had delayed ABRs, especially in response to the rapidly changing formant transition, and greater response variability. We also found that older adults had decreased phase locking and smaller response magnitudes than younger adults. Together, our results support the theory that older adults have a loss of temporal precision in the subcortical encoding of sound, which may account, at least in part, for their difficulties with speech perception.

  6. Research of The Deeper Neural Networks

    Directory of Open Access Journals (Sweden)

    Xiao You Rong

    2016-01-01

    Full Text Available Neural networks (NNs) have powerful computational abilities and can be used in a variety of applications; however, training these networks is still a difficult problem. Many neural models with different network structures have been constructed. In this report, a deeper neural networks (DNNs) architecture is proposed. The training algorithm of the deeper neural network involves searching for the global optimum on the actual error surface. Before the training algorithm is designed, the error surface of the deeper neural network is analyzed, from simple cases to complicated ones, and the features of the error surface are obtained. Based on these characteristics, the initialization method and training algorithm of DNNs are designed. For the initialization, a block-uniform design method is proposed, which separates the error surface into blocks and finds the optimal block using the uniform design method. For the training algorithm, an improved gradient-descent method is proposed, which adds a penalty term to the cost function of the standard gradient-descent method. This algorithm gives the network a strong approximation ability and keeps the network state stable. All of these improve the practicality of the neural network.
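
    The abstract's training idea, gradient descent on a cost function augmented with a penalty term, can be shown on a toy problem. The sketch below applies it to a plain least-squares cost with an L2 penalty using only NumPy; it is a generic illustration of penalized gradient descent under assumed toy data and settings, not the authors' block-uniform initialization or their exact algorithm.

      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.normal(size=(100, 3))                 # toy inputs
      w_true = np.array([1.5, -2.0, 0.5])
      y = X @ w_true + 0.1 * rng.normal(size=100)   # toy targets

      w = np.zeros(3)
      lr, lam = 0.05, 0.1                           # learning rate, penalty weight
      for _ in range(500):
          err = X @ w - y
          # gradient of 0.5*||Xw - y||^2 / n  +  lam * 0.5*||w||^2
          grad = X.T @ err / len(y) + lam * w
          w -= lr * grad

      print("estimated weights:", np.round(w, 3))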

  7. Recent Advances in Neural Recording Microsystems

    Directory of Open Access Journals (Sweden)

    Benoit Gosselin

    2011-04-01

    Full Text Available The accelerating pace of research in neuroscience has created a considerable demand for neural interfacing microsystems capable of monitoring the activity of large groups of neurons. These emerging tools have revealed a tremendous potential for the advancement of knowledge in brain research and for the development of useful clinical applications. They can extract the relevant control signals directly from the brain, enabling individuals with severe disabilities to communicate their intentions to other devices, like computers or various prostheses. Such microsystems are self-contained devices composed of a neural probe attached to an integrated circuit for extracting neural signals from multiple channels, and transferring the data outside the body. The greatest challenge facing development of such emerging devices into viable clinical systems involves addressing their small form factor and low-power consumption constraints, while providing superior resolution. In this paper, we survey the recent progress in the design and the implementation of multi-channel neural recording microsystems, with particular emphasis on the design of recording and telemetry electronics. An overview of the numerous neural signal modalities is given and the existing microsystem topologies are covered. We present energy-efficient sensory circuits to retrieve weak signals from neural probes and we compare them. We cover data management and smart power scheduling approaches, and we review advances in low-power telemetry. Finally, we conclude by summarizing the remaining challenges and by highlighting the emerging trends in the field.

  8. Noise shaping in neural populations.

    Science.gov (United States)

    Avila Akerberg, Oscar; Chacron, Maurice J

    2009-01-01

    Many neurons display intrinsic interspike interval correlations in their spike trains. However, the effects of such correlations on information transmission in neural populations are not well understood. We quantified signal processing using linear response theory supported by numerical simulations in networks composed of two different models: one model generates a renewal process, where interspike intervals are not correlated, while the other generates a nonrenewal process, where subsequent interspike intervals are negatively correlated. Our results show that the fractional rate of increase in information rate as a function of network size and stimulus intensity is lower for the nonrenewal model than for the renewal one. We show that this is mostly due to the lower amount of effective noise in the nonrenewal model. We also show the surprising result that coupling has opposite effects in renewal and nonrenewal networks: excitatory (inhibitory) coupling will decrease (increase) the information rate in renewal networks, while inhibitory (excitatory) coupling will decrease (increase) the information rate in nonrenewal networks. We discuss these results and their applicability to other classes of excitable systems.
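
    The renewal/nonrenewal distinction above can be made concrete with a toy interspike-interval (ISI) simulation: a renewal train has independent ISIs, whereas a nonrenewal train has negatively correlated successive ISIs. The NumPy sketch below is an assumed, generic construction (gamma-distributed ISIs for the renewal case, a mean-reverting sequence for the nonrenewal case), not the paper's neuron models; it simply reports the lag-1 serial correlation of each train.

      import numpy as np

      rng = np.random.default_rng(1)
      n, mu = 5000, 10.0                          # number of ISIs, mean ISI (ms)

      # Renewal: independent, gamma-distributed ISIs.
      renewal = rng.gamma(shape=2.0, scale=mu / 2.0, size=n)

      # Nonrenewal: each ISI is pushed back toward the mean relative to the previous
      # one, which yields a negative correlation between successive intervals.
      nonrenewal = np.empty(n)
      nonrenewal[0] = mu
      for i in range(1, n):
          nonrenewal[i] = mu - 0.5 * (nonrenewal[i - 1] - mu) + rng.normal(0, 1.0)

      def lag1_corr(isi):
          return np.corrcoef(isi[:-1], isi[1:])[0, 1]

      print("renewal lag-1 ISI correlation:   ", round(lag1_corr(renewal), 3))
      print("nonrenewal lag-1 ISI correlation:", round(lag1_corr(nonrenewal), 3))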

  9. Cochlear Implant Using Neural Prosthetics

    Science.gov (United States)

    Gupta, Shweta; Singh, Shashi kumar; Dubey, Pratik Kumar

    2012-10-01

    This research is based on neural prosthetic devices. The oldest and most widely used of these electrical, and often computerized, devices is the cochlear implant, which has provided hearing to thousands of congenitally deaf people in this country. Recently, the use of the cochlear implant is expanding to the elderly, who frequently suffer major hearing loss. More cutting edge are artificial retinas, which are helping dozens of blind people see, and "smart" artificial arms and legs that amputees can maneuver by thoughts alone, and that feel more like real limbs. Research driven by curiosity about frog legs dancing during thunderstorms, a snail-shaped organ in the inner ear, and how various eye cells react to light has fostered an understanding of how to "talk" to the nervous system. That understanding, combined with the miniaturization of electronics and enhanced computer processing, has enabled prosthetic devices that often can bridge the gap in nerve signaling that is caused by disease or injury.

  10. Neural correlates of rhythmic expectancy

    Directory of Open Access Journals (Sweden)

    Theodore P. Zanto

    2006-01-01

    Full Text Available Temporal expectancy is thought to play a fundamental role in the perception of rhythm. This review summarizes recent studies that investigated rhythmic expectancy by recording neuroelectric activity with high temporal resolution during the presentation of rhythmic patterns. Prior event-related brain potential (ERP) studies have uncovered auditory evoked responses that reflect detection of onsets, offsets, sustains, and abrupt changes in acoustic properties such as frequency, intensity, and spectrum, in addition to indexing higher-order processes such as auditory sensory memory and the violation of expectancy. In our studies of rhythmic expectancy, we measured emitted responses - a type of ERP that occurs when an expected event is omitted from a regular series of stimulus events - in simple rhythms with temporal structures typical of music. Our observations suggest that middle-latency gamma band (20-60 Hz) activity (GBA) plays an essential role in auditory rhythm processing. Evoked (phase-locked) GBA occurs in the presence of physically presented auditory events and reflects the degree of accent. Induced (non-phase-locked) GBA reflects temporally precise expectancies for strongly and weakly accented events in sound patterns. Thus far, these findings support theories of rhythm perception that posit temporal expectancies generated by active neural processes.

  11. Neural architectures for stereo vision.

    Science.gov (United States)

    Parker, Andrew J; Smith, Jackson E T; Krug, Kristine

    2016-06-19

    Stereoscopic vision delivers a sense of depth based on binocular information but additionally acts as a mechanism for achieving correspondence between patterns arriving at the left and right eyes. We analyse quantitatively the cortical architecture for stereoscopic vision in two areas of macaque visual cortex. For primary visual cortex V1, the result is consistent with a module that is isotropic in cortical space with a diameter of at least 3 mm in surface extent. This implies that the module for stereo is larger than the repeat distance between ocular dominance columns in V1. By contrast, in the extrastriate cortical area V5/MT, which has a specialized architecture for stereo depth, the module for representation of stereo is about 1 mm in surface extent, so the representation of stereo in V5/MT is more compressed than in V1 in terms of neural wiring of the neocortex. The surface extent estimated for stereo in V5/MT is consistent with measurements of its specialized domains for binocular disparity. Within V1, we suggest that long-range horizontal, anatomical connections form functional modules that serve both binocular and monocular pattern recognition: this common function may explain the distortion and disruption of monocular pattern vision observed in amblyopia. This article is part of the themed issue 'Vision in our three-dimensional world'. © 2016 The Authors.

  12. MANF Promotes Differentiation and Migration of Neural Progenitor Cells with Potential Neural Regenerative Effects in Stroke

    DEFF Research Database (Denmark)

    Tseng, Kuan-Yin; Anttila, Jenni E; Khodosevich, Konstantin

    2018-01-01

    Cerebral ischemia activates endogenous reparative processes, such as increased proliferation of neural stem cells (NSCs) in the subventricular zone (SVZ) and migration of neural progenitor cells (NPCs) toward the ischemic area. However, this reparative process is limited because most of the NPCs...

  13. Neural Schematics as a unified formal graphical representation of large-scale Neural Network Structures

    Directory of Open Access Journals (Sweden)

    Matthias eEhrlich

    2013-10-01

    Full Text Available One of the major outcomes of neuroscientific research are models of Neural Network Structures. Descriptions of these models usually consist of a non-standardized mixture of text, figures, and other means of visual information communication in print media. However, as neuroscience is an interdisciplinary domain by nature, a standardized way of consistently representing models of Neural Network Structures is required. While generic descriptions of such models in textual form have recently been developed, a formalized way of schematically expressing them does not exist to date. Hence, in this paper we present Neural Schematics as a concept inspired by similar approaches from other disciplines for a generic two dimensional representation of said structures. After introducing Neural Network Structures in general, a set of current visualizations of models of Neural Network Structures is reviewed and analyzed for what information they convey and how their elements are rendered. This analysis then allows for the definition of general items and symbols to consistently represent these models as Neural Schematics on a two dimensional plane. We will illustrate the possibilities an agreed upon standard can yield on sampled diagrams transformed into Neural Schematics and an example application for the design and modeling of large-scale Neural Network Structures.

  14. Fast and Efficient Asynchronous Neural Computation with Adapting Spiking Neural Networks

    NARCIS (Netherlands)

    D. Zambrano (Davide); S.M. Bohte (Sander)

    2016-01-01

    textabstractBiological neurons communicate with a sparing exchange of pulses - spikes. It is an open question how real spiking neurons produce the kind of powerful neural computation that is possible with deep artificial neural networks, using only so very few spikes to communicate. Building on

  15. Fin-and-tube condenser performance evaluation using neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Zhao, Ling-Xiao [Institute of Refrigeration and Cryogenics, Shanghai Jiaotong University, Shanghai 200240 (China); Zhang, Chun-Lu [China R and D Center, Carrier Corporation, No. 3239 Shen Jiang Road, Shanghai 201206 (China)

    2010-05-15

    The paper presents a neural network approach to performance evaluation of the fin-and-tube air-cooled condensers which are widely used in air-conditioning and refrigeration systems. Inputs of the neural network include refrigerant and air-flow rates, refrigerant inlet temperature and saturated temperature, and entering air dry-bulb temperature. Outputs of the neural network consist of the heating capacity and the pressure drops on both refrigerant and air sides. The multi-input multi-output (MIMO) neural network is separated into multi-input single-output (MISO) neural networks for training. Afterwards, the trained MISO neural networks are combined into a MIMO neural network, which indicates that the number of training data sets is determined by the biggest MISO neural network, not the whole MIMO network. Compared with a validated first-principle model, the standard deviations of the neural network models are less than 1.9%, and all errors fall within ±5%. (author)
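
    The training strategy described above, splitting a multi-input multi-output (MIMO) mapping into several multi-input single-output (MISO) models, training each separately, then recombining them, can be sketched with simple stand-in regressors. The code below uses ordinary least squares per output purely to show that structure; the condenser inputs and outputs named in the comments, and the synthetic data, are placeholders and not the authors' data or network.

      import numpy as np

      rng = np.random.default_rng(2)
      # Placeholder inputs: refrigerant flow, air flow, inlet temp, saturation temp, air dry-bulb temp.
      X = rng.normal(size=(200, 5))
      # Placeholder outputs: heating capacity, refrigerant-side pressure drop, air-side pressure drop.
      Y = X @ rng.normal(size=(5, 3)) + 0.05 * rng.normal(size=(200, 3))

      # Train one MISO model per output column (here: plain least squares as a stand-in).
      miso_weights = [np.linalg.lstsq(X, Y[:, k], rcond=None)[0] for k in range(Y.shape[1])]

      # Recombine the MISO models into a single MIMO predictor.
      W_mimo = np.column_stack(miso_weights)      # shape (n_inputs, n_outputs)
      Y_hat = X @ W_mimo

      rel_err = np.abs(Y_hat - Y) / (np.abs(Y) + 1e-9)
      print("mean relative error per output:", np.round(rel_err.mean(axis=0), 3))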

  16. Neural systems for tactual memories.

    Science.gov (United States)

    Bonda, E; Petrides, M; Evans, A

    1996-04-01

    1. The aim of this study was to investigate the neural systems involved in the memory processing of experiences through touch. 2. Regional cerebral blood flow was measured with positron emission tomography by means of the water bolus H2(15)O methodology in human subjects as they performed tasks involving different levels of tactual memory. In one of the experimental tasks, the subjects had to palpate nonsense shapes to match each one to a previously learned set, thus requiring constant reference to long-term memory. The other experimental task involved judgements of the recent recurrence of shapes during the scanning period. A set of three control tasks was used to control for the type of exploratory movements and sensory processing inherent in the two experimental tasks. 3. Comparisons of the distribution of activity between the experimental and the control tasks were carried out by means of the subtraction method. In relation to the control conditions, the two experimental tasks requiring memory resulted in significant changes within the posteroventral insula and the central opercular region. In addition, the task requiring recall from long-term memory yielded changes in the perirhinal cortex. 4. The above findings demonstrated that a ventrally directed parietoinsular pathway, leading to the posteroventral insula and the perirhinal cortex, constitutes a system by which long-lasting representations of tactual experiences are formed. It is proposed that the posteroventral insula is involved in tactual feature analysis, by analogy with the similar role of the inferotemporal cortex in vision, whereas the perirhinal cortex is further involved in the integration of these features into long-lasting representations of somatosensory experiences.

  17. EDITORIAL: Focus on the neural interface Focus on the neural interface

    Science.gov (United States)

    Durand, Dominique M.

    2009-10-01

    The possibility of an effective connection between neural tissue and computers has inspired scientists and engineers to develop new ways of controlling and obtaining information from the nervous system. These applications range from `brain hacking' to neural control of artificial limbs with brain signals. Notwithstanding the significant advances in neural prosthetics in the last few decades and the success of some stimulation devices such as cochlear prosthesis, neurotechnology remains below its potential for restoring neural function in patients with nervous system disorders. One of the reasons for this limited impact can be found at the neural interface and close attention to the integration between electrodes and tissue should improve the possibility of successful outcomes. The neural interfaces research community consists of investigators working in areas such as deep brain stimulation, functional neuromuscular/electrical stimulation, auditory prostheses, cortical prostheses, neuromodulation, microelectrode array technology, brain-computer/machine interfaces. Following the success of previous neuroprostheses and neural interfaces workshops, funding (from NIH) was obtained to establish a biennial conference in the area of neural interfaces. The first Neural Interfaces Conference took place in Cleveland, OH in 2008 and several topics from this conference have been selected for publication in this special section of the Journal of Neural Engineering. Three `perspectives' review the areas of neural regeneration (Corredor and Goldberg), cochlear implants (O'Leary et al) and neural prostheses (Anderson). Seven articles focus on various aspects of neural interfacing. One of the most popular of these areas is the field of brain-computer interfaces. Fraser et al, report on a method to generate robust control with simple signal processing algorithms of signals obtained with electrodes implanted in the brain. One problem with implanted electrode arrays, however, is that

  18. Neural oscillations and information flow associated with synaptic plasticity.

    Science.gov (United States)

    Zhang, Tao

    2011-10-25

    As a rhythmic neural activity, neural oscillation exists all over the nervous system, in structures as diverse as the cerebral cortex, hippocampus, subcortical nuclei and sense organs. This review first presents evidence that synchronous neural oscillations in the theta and gamma bands reveal much about the origin and nature of cognitive processes such as learning and memory. It then introduces a novel analysis algorithm for neural oscillations, a directionality index of neural information flow (NIF), as a measure of synaptic plasticity. An example application of this algorithm is provided.

  19. Fate Specification of Neural Plate Border by Canonical Wnt Signaling and Grhl3 is Crucial for Neural Tube Closure.

    Science.gov (United States)

    Kimura-Yoshida, Chiharu; Mochida, Kyoko; Ellwanger, Kristina; Niehrs, Christof; Matsuo, Isao

    2015-06-01

    During primary neurulation, the separation of a single-layered ectodermal sheet into the surface ectoderm (SE) and neural tube specifies SE and neural ectoderm (NE) cell fates. The mechanisms underlying fate specification in conjunction with neural tube closure are poorly understood. Here, by comparing expression profiles between SE and NE lineages, we observed that uncommitted progenitor cells, expressing stem cell markers, are present in the neural plate border/neural fold prior to neural tube closure. Our results also demonstrated that canonical Wnt and its antagonists, DKK1/KREMEN1, progressively specify these progenitors into SE or NE fates in accord with the progress of neural tube closure. Additionally, SE specification of the neural plate border via canonical Wnt signaling is directed by the grainyhead-like 3 (Grhl3) transcription factor. Thus, we propose that the fate specification of uncommitted progenitors in the neural plate border by canonical Wnt signaling and its downstream effector Grhl3 is crucial for neural tube closure. This study suggests that failures of critical genetic factors controlling fate specification of progenitor cells in the neural plate border/neural fold, coordinated with neural tube closure, may be potential causes of human neural tube defects.

  20. Prototype-Incorporated Emotional Neural Network.

    Science.gov (United States)

    Oyedotun, Oyebade K; Khashman, Adnan

    2017-08-15

    Artificial neural networks (ANNs) aim to simulate the biological neural activities. Interestingly, many "engineering" prospects in ANN have relied on motivations from cognition and psychology studies. So far, two important learning theories that have been subject of active research are the prototype and adaptive learning theories. The learning rules employed for ANNs can be related to adaptive learning theory, where several examples of the different classes in a task are supplied to the network for adjusting internal parameters. Conversely, the prototype-learning theory uses prototypes (representative examples); usually, one prototype per class of the different classes contained in the task. These prototypes are supplied for systematic matching with new examples so that class association can be achieved. In this paper, we propose and implement a novel neural network algorithm based on modifying the emotional neural network (EmNN) model to unify the prototype- and adaptive-learning theories. We refer to our new model as "prototype-incorporated EmNN". Furthermore, we apply the proposed model to two real-life challenging tasks, namely, static hand-gesture recognition and face recognition, and compare the result to those obtained using the popular back-propagation neural network (BPNN), emotional BPNN (EmNN), deep networks, an exemplar classification model, and k-nearest neighbor.
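
    Prototype learning, as contrasted with adaptive learning above, can be illustrated with a minimal nearest-prototype classifier: one representative vector per class, with new examples assigned to the class whose prototype is closest. The NumPy sketch below is a generic illustration of that idea on made-up 2-D data, not the prototype-incorporated EmNN itself.

      import numpy as np

      rng = np.random.default_rng(3)
      # Two toy classes in a 2-D feature space.
      class0 = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(50, 2))
      class1 = rng.normal(loc=[2.0, 2.0], scale=0.5, size=(50, 2))

      # One prototype per class (here: the class mean).
      prototypes = np.stack([class0.mean(axis=0), class1.mean(axis=0)])

      def classify(x):
          # Assign x to the class of the nearest prototype (Euclidean distance).
          dists = np.linalg.norm(prototypes - x, axis=1)
          return int(np.argmin(dists))

      print(classify(np.array([0.2, -0.1])))   # expected: 0
      print(classify(np.array([1.8, 2.3])))    # expected: 1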

  1. The anatomy of the intralingual neural interconnections.

    Science.gov (United States)

    Păduraru, Dumitru; Rusu, Mugurel Constantin

    2013-08-01

    The intrinsic lingual neural interconnections are overlooked. It was hypothesized that intralingual anatomically well defined anastomoses interconnect the somatic and autonomic neural systems of the tongue. It was thus aimed to evaluate the intralingual neural scaffold in human tongues. Human tongue samples (ten adult and one pediatric) were microdissected (4.5 magnification). In the interstitium between the genioglossus and hyoglossus muscles, the branches of the lingual nerve (LN) and the medial trunk of the hypoglossal nerve (HN) had a layered disposition of the outer and inner side, respectively, of the lingual artery with its periarterial plexus. Anastomoses of these three distinctive neural suppliers of tongue were recorded, as also were those of the LN with the lateral trunk of the HN and the anastomoses between successive terminal branches of the LN. Successive ansae linguales were joining the LN branches and the medial trunk of the HN. The intrinsic neural system of the tongue supports integrative functions and allows a better retrospective understanding of various experimental studies. The topographical pattern is useful for an accurate diagnosis of intralingual nerves on microscopic slides.

  2. Folate receptors and neural tube closure.

    Science.gov (United States)

    Saitsu, Hirotomo

    2017-09-01

    Neural tube defects (NTD) are among the most common human congenital malformations, affecting 0.5-8.0/1000 of live births. Human clinical trials have shown that periconceptional folate supplementation significantly decreases the occurrence of NTD in offspring. However, the mechanism by which folate acts on NTD remains largely unknown. Folate receptor (Folr) is one of the three membrane proteins that mediate cellular uptake of folates. Recent studies suggest that mouse Folr1 (formerly referred to as Fbp1) is essential for neural tube closure. Therefore, we examined spatial and temporal expression patterns of Folr1 in developing mouse embryos, showing a close association between Folr1 and anterior neural tube closure. Transient transgenic analysis was performed using lacZ as a reporter; we identified a 1.1-kb enhancer that directs lacZ expression in the neural tube and optic vesicle in a manner that is similar to endogenous Folr1. The 1.1-kb enhancer sequences were highly conserved between humans and mice, suggesting that human FOLR1 is associated with anterior neural tube closure in humans. Several experimental studies in mice and human epidemiological and genetics studies have suggested that folate receptor abnormalities are involved in a portion of human NTDs, although the solo defect of FOLR1 did not cause NTD. © 2017 Japanese Teratology Society.

  3. Tuning Neural Phase Entrainment to Speech.

    Science.gov (United States)

    Falk, Simone; Lanzilotti, Cosima; Schön, Daniele

    2017-08-01

    Musical rhythm positively impacts on subsequent speech processing. However, the neural mechanisms underlying this phenomenon are so far unclear. We investigated whether carryover effects from a preceding musical cue to a speech stimulus result from a continuation of neural phase entrainment to periodicities that are present in both music and speech. Participants listened and memorized French metrical sentences that contained (quasi-)periodic recurrences of accents and syllables. Speech stimuli were preceded by a rhythmically regular or irregular musical cue. Our results show that the presence of a regular cue modulates neural response as estimated by EEG power spectral density, intertrial coherence, and source analyses at critical frequencies during speech processing compared with the irregular condition. Importantly, intertrial coherences for regular cues were indicative of the participants' success in memorizing the subsequent speech stimuli. These findings underscore the highly adaptive nature of neural phase entrainment across fundamentally different auditory stimuli. They also support current models of neural phase entrainment as a tool of predictive timing and attentional selection across cognitive domains.

  4. Weather forecasting based on hybrid neural model

    Science.gov (United States)

    Saba, Tanzila; Rehman, Amjad; AlGhamdi, Jarallah S.

    2017-02-01

    Making deductions and forecasts about the climate has been a challenge throughout human history, and accurate meteorological guidance helps to foresee and handle problems well in time. Different strategies using various machine learning techniques have been investigated in reported forecasting systems. The current research treats climate as a major challenge for machine information mining and deduction. Accordingly, this paper presents a hybrid neural model (MLP and RBF) to enhance the accuracy of weather forecasting. The proposed hybrid model ensures precise forecasting owing to the specific characteristics of climate-forecasting frameworks. The study concentrates on data representing Saudi Arabian weather forecasting. The main input features employed to train the individual and hybrid neural networks include average dew point, minimum temperature, maximum temperature, mean temperature, average relative humidity, precipitation, normal wind speed, high wind speed and average cloudiness. The output layer is composed of two neurons representing rainy and dry weather. Moreover, a trial-and-error approach is adopted to select an appropriate number of inputs to the hybrid neural network. The correlation coefficient, RMSE and scatter index are the standard yardsticks adopted for measuring forecast accuracy. On an individual standing, the MLP forecasting results are better than those of the RBF; however, the proposed simplified hybrid neural model achieves better forecasting accuracy than both individual networks. Additionally, the results are better than those reported in the state of the art, using a simple neural structure that reduces training time and complexity.
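
    One simple reading of an MLP-plus-RBF hybrid is to train both models on the same inputs and combine their predictions, for example by averaging. The sketch below does this with scikit-learn stand-ins (an MLPRegressor and an RBF-kernel ridge regressor) on synthetic data; the feature list in the comment, the averaging rule, and all model settings are illustrative assumptions rather than the paper's configuration.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.kernel_ridge import KernelRidge

      rng = np.random.default_rng(4)
      # Placeholder weather features (e.g., dew point, min/max/mean temp, humidity, wind, cloudiness).
      X = rng.normal(size=(300, 9))
      y = np.sin(X[:, 0]) + 0.3 * X[:, 1] + 0.1 * rng.normal(size=300)   # synthetic target

      mlp = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, y)
      rbf = KernelRidge(kernel="rbf", alpha=1.0, gamma=0.1).fit(X, y)

      # "Hybrid" prediction: simple average of the two models (one possible combination rule).
      y_hybrid = 0.5 * (mlp.predict(X) + rbf.predict(X))

      rmse = np.sqrt(np.mean((y_hybrid - y) ** 2))
      print("in-sample RMSE of the hybrid:", round(float(rmse), 3))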

  5. On sparsely connected optimal neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Beiu, V. [Los Alamos National Lab., NM (United States); Draghici, S. [Wayne State Univ., Detroit, MI (United States)

    1997-10-01

    This paper uses two different approaches to show that VLSI- and size-optimal discrete neural networks are obtained for small fan-in values. These have applications to hardware implementations of neural networks, but also reveal an intrinsic limitation of digital VLSI technology: its inability to cope with highly connected structures. The first approach is based on implementing F_{n,m} functions. The authors show that this class of functions can be implemented in VLSI-optimal (i.e., minimizing AT^2) neural networks of small constant fan-ins. In order to estimate the area (A) and the delay (T) of such networks, the following cost functions will be used: (i) the connectivity and the number-of-bits for representing the weights and thresholds--for good estimates of the area; and (ii) the fan-ins and the length of the wires--for good approximations of the delay. The second approach is based on implementing Boolean functions for which the classical Shannon's decomposition can be used. Such a solution has already been used to prove bounds on the size of fan-in 2 neural networks. They will generalize the result presented there to arbitrary fan-in, and prove that the size is minimized by small fan-in values. Finally, a size-optimal neural network of small constant fan-ins will be suggested for F_{n,m} functions.

  6. Neural substrates of decision-making.

    Science.gov (United States)

    Broche-Pérez, Y; Herrera Jiménez, L F; Omar-Martínez, E

    2016-06-01

    Decision-making is the process of selecting a course of action from among 2 or more alternatives by considering the potential outcomes of selecting each option and estimating its consequences in the short, medium and long term. The prefrontal cortex (PFC) has traditionally been considered the key neural structure in decision-making process. However, new studies support the hypothesis that describes a complex neural network including both cortical and subcortical structures. The aim of this review is to summarise evidence on the anatomical structures underlying the decision-making process, considering new findings that support the existence of a complex neural network that gives rise to this complex neuropsychological process. Current evidence shows that the cortical structures involved in decision-making include the orbitofrontal cortex (OFC), anterior cingulate cortex (ACC), and dorsolateral prefrontal cortex (DLPFC). This process is assisted by subcortical structures including the amygdala, thalamus, and cerebellum. Findings to date show that both cortical and subcortical brain regions contribute to the decision-making process. The neural basis of decision-making is a complex neural network of cortico-cortical and cortico-subcortical connections which includes subareas of the PFC, limbic structures, and the cerebellum. Copyright © 2014 Sociedad Española de Neurología. Published by Elsevier España, S.L.U. All rights reserved.

  7. Anechoic aquarium for ultrasonic neural telemetry.

    Science.gov (United States)

    Mensinger, A F; Deffenbaugh, M

    2000-09-29

    An acoustic neural telemetry tag has been developed for recording from free-swimming aquatic animals. Microwire electrodes were implanted into the VIIIth nerve of the toadfish, Opsanus tau, and interfaced to the subdermally implanted tag. The telemetry tag frequency modulates the neural signal, converting it into a varying frequency, which is amplified and transmitted acoustically (centre frequency of 90 kHz and a 20 kHz bandwidth). This acoustic signal is detected by a receiver hydrophone, and the receiver reconstructs the full neural waveform from the acoustic signal. However, due to the multipath environment in the experimental aquarium, the acoustic signal is quickly degraded as the hydrophone is moved away from the source. In order to receive the signal independent of fish position, an anechoic aquarium was designed. Streams of microbubbles (ca. 70 µm diameter) were generated to produce a curtain of sound-absorptive material along the walls and water surface of the aquarium. Microbubble generation significantly reduced the multipath artefacts, and allowed signal discrimination independent of fish and hydrophone position. The anechoic aquarium will allow the recording of neural activity from free-swimming fishes in quasi-natural habitats, thus allowing better understanding of the neural mechanisms of behaviour.

  8. Representations in neural network based empirical potentials

    Science.gov (United States)

    Cubuk, Ekin D.; Malone, Brad D.; Onat, Berk; Waterland, Amos; Kaxiras, Efthimios

    2017-07-01

    Many structural and mechanical properties of crystals, glasses, and biological macromolecules can be modeled from the local interactions between atoms. These interactions ultimately derive from the quantum nature of electrons, which can be prohibitively expensive to simulate. Machine learning has the potential to revolutionize materials modeling due to its ability to efficiently approximate complex functions. For example, neural networks can be trained to reproduce results of density functional theory calculations at a much lower cost. However, how neural networks reach their predictions is not well understood, which has led to them being used as a "black box" tool. This lack of understanding is not desirable especially for applications of neural networks in scientific inquiry. We argue that machine learning models trained on physical systems can be used as more than just approximations since they had to "learn" physical concepts in order to reproduce the labels they were trained on. We use dimensionality reduction techniques to study in detail the representation of silicon atoms at different stages in a neural network, which provides insight into how a neural network learns to model atomic interactions.
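
    The dimensionality-reduction analysis described above, inspecting how atoms are represented at an intermediate layer of a network, can be sketched generically: collect the hidden-layer activations for a batch of inputs and project them onto their leading principal components. The network weights and atomic descriptors below are random stand-ins, not the trained interatomic potential from the paper.

      import numpy as np

      rng = np.random.default_rng(5)
      X = rng.normal(size=(500, 30))                      # stand-in atomic descriptors

      # A random one-hidden-layer network as a stand-in for a trained potential.
      W1, b1 = rng.normal(size=(30, 64)), rng.normal(size=64)
      H = np.tanh(X @ W1 + b1)                            # hidden-layer activations

      # PCA via SVD of the centred activations.
      Hc = H - H.mean(axis=0)
      U, S, Vt = np.linalg.svd(Hc, full_matrices=False)
      coords_2d = Hc @ Vt[:2].T                           # projection onto the top 2 PCs

      explained = (S[:2] ** 2) / (S ** 2).sum()
      print("2-D embedding shape:", coords_2d.shape)
      print("variance explained by first two PCs:", np.round(explained, 3))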

  9. Weather forecasting based on hybrid neural model

    Science.gov (United States)

    Saba, Tanzila; Rehman, Amjad; AlGhamdi, Jarallah S.

    2017-11-01

    Making deductions and forecasts about the climate has been a challenge throughout human history, and accurate meteorological guidance helps to foresee and handle problems well in time. Different strategies using various machine learning techniques have been investigated in reported forecasting systems. The current research treats climate as a major challenge for machine information mining and deduction. Accordingly, this paper presents a hybrid neural model (MLP and RBF) to enhance the accuracy of weather forecasting. The proposed hybrid model ensures precise forecasting owing to the specific characteristics of climate-forecasting frameworks. The study concentrates on data representing Saudi Arabian weather forecasting. The main input features employed to train the individual and hybrid neural networks include average dew point, minimum temperature, maximum temperature, mean temperature, average relative humidity, precipitation, normal wind speed, high wind speed and average cloudiness. The output layer is composed of two neurons representing rainy and dry weather. Moreover, a trial-and-error approach is adopted to select an appropriate number of inputs to the hybrid neural network. The correlation coefficient, RMSE and scatter index are the standard yardsticks adopted for measuring forecast accuracy. On an individual standing, the MLP forecasting results are better than those of the RBF; however, the proposed simplified hybrid neural model achieves better forecasting accuracy than both individual networks. Additionally, the results are better than those reported in the state of the art, using a simple neural structure that reduces training time and complexity.

  10. Artificial neural network intelligent method for prediction

    Science.gov (United States)

    Trifonov, Roumen; Yoshinov, Radoslav; Pavlova, Galya; Tsochev, Georgi

    2017-09-01

    Accounting and financial classification and prediction problems are highly challenging, and researchers use different methods to solve them. Methods and instruments for short-term prediction of financial operations using artificial neural networks are considered. The methods used for prediction of financial data, as well as the developed forecasting system with a neural network, are described in the paper. The architecture of a neural network that uses four different technical indicators, based on the raw data and the current day of the week, is presented. The network developed is used for forecasting the movement of stock prices one day ahead and consists of an input layer, one hidden layer and an output layer. The training method is the error back-propagation algorithm. The main advantage of the developed system is self-determination of the optimal topology of the neural network, due to which it becomes flexible and more precise. The proposed system with a neural network is universal and can be applied to various financial instruments using only basic technical indicators as input data.
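
    The network shape described above (an input layer of technical indicators plus the day of the week, one hidden layer, one output, trained by error back-propagation) can be sketched from scratch in a few lines. Everything below, the indicator values, layer sizes, learning rate and labels, is a made-up illustration of back-propagation on synthetic data, not the authors' system or topology-selection procedure.

      import numpy as np

      rng = np.random.default_rng(6)
      X = rng.normal(size=(400, 5))                     # 4 placeholder indicators + day-of-week code
      y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)   # synthetic up/down label

      def sigmoid(z):
          return 1.0 / (1.0 + np.exp(-z))

      W1, b1 = rng.normal(scale=0.5, size=(5, 8)), np.zeros(8)
      W2, b2 = rng.normal(scale=0.5, size=(8, 1)), np.zeros(1)
      lr = 0.1

      for _ in range(2000):
          # Forward pass.
          H = np.tanh(X @ W1 + b1)
          p = sigmoid(H @ W2 + b2).ravel()
          # Backward pass (cross-entropy loss with a sigmoid output unit).
          d_out = (p - y)[:, None] / len(y)
          dW2, db2 = H.T @ d_out, d_out.sum(axis=0)
          d_hid = (d_out @ W2.T) * (1.0 - H ** 2)
          dW1, db1 = X.T @ d_hid, d_hid.sum(axis=0)
          for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
              param -= lr * grad

      accuracy = ((p > 0.5) == y.astype(bool)).mean()
      print("training accuracy:", round(float(accuracy), 3))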

  11. Moral transgressions corrupt neural representations of value.

    Science.gov (United States)

    Crockett, Molly J; Siegel, Jenifer Z; Kurth-Nelson, Zeb; Dayan, Peter; Dolan, Raymond J

    2017-06-01

    Moral systems universally prohibit harming others for personal gain. However, we know little about how such principles guide moral behavior. Using a task that assesses the financial cost participants ascribe to harming others versus themselves, we probed the relationship between moral behavior and neural representations of profit and pain. Most participants displayed moral preferences, placing a higher cost on harming others than themselves. Moral preferences correlated with neural responses to profit, where participants with stronger moral preferences had lower dorsal striatal responses to profit gained from harming others. Lateral prefrontal cortex encoded profit gained from harming others, but not self, and tracked the blameworthiness of harmful choices. Moral decisions also modulated functional connectivity between lateral prefrontal cortex and the profit-sensitive region of dorsal striatum. The findings suggest moral behavior in our task is linked to a neural devaluation of reward realized by a prefrontal modulation of striatal value representations.

  12. Electrospun Nanofibrous Materials for Neural Tissue Engineering

    Directory of Open Access Journals (Sweden)

    Yee-Shuan Lee

    2011-02-01

    Full Text Available The use of biomaterials processed by the electrospinning technique has gained considerable interest for neural tissue engineering applications. The tissue engineering strategy is to facilitate the regrowth of nerves by combining an appropriate cell type with the electrospun scaffold. Electrospinning can generate fibrous meshes having fiber diameter dimensions at the nanoscale and these fibers can be nonwoven or oriented to facilitate neurite extension via contact guidance. This article reviews studies evaluating the effect of the scaffold’s architectural features such as fiber diameter and orientation on neural cell function and neurite extension. Electrospun meshes made of natural polymers, proteins and compositions having electrical activity in order to enhance neural cell function are also discussed.

  13. Dysfunction of Rapid Neural Adaptation in Dyslexia.

    Science.gov (United States)

    Perrachione, Tyler K; Del Tufo, Stephanie N; Winter, Rebecca; Murtagh, Jack; Cyr, Abigail; Chang, Patricia; Halverson, Kelly; Ghosh, Satrajit S; Christodoulou, Joanna A; Gabrieli, John D E

    2016-12-21

    Identification of specific neurophysiological dysfunctions resulting in selective reading difficulty (dyslexia) has remained elusive. In addition to impaired reading development, individuals with dyslexia frequently exhibit behavioral deficits in perceptual adaptation. Here, we assessed neurophysiological adaptation to stimulus repetition in adults and children with dyslexia for a wide variety of stimuli: spoken words, written words, visual objects, and faces. For every stimulus type, individuals with dyslexia exhibited significantly diminished neural adaptation compared to controls in stimulus-specific cortical areas. Better reading skills in adults and children with dyslexia were associated with greater repetition-induced neural adaptation. These results highlight a dysfunction of rapid neural adaptation as a core neurophysiological difference in dyslexia that may underlie impaired reading development. Reduced neurophysiological adaptation may relate to prior reports of reduced behavioral adaptation in dyslexia and may reveal a difference in brain functions that ultimately results in a specific reading impairment. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Neural network for sonogram gap filling

    DEFF Research Database (Denmark)

    Klebæk, Henrik; Jensen, Jørgen Arendt; Hansen, Lars Kai

    1995-01-01

    a neural network for predicting mean frequency of the velocity signal and its variance. The neural network then predicts the evolution of the mean and variance in the gaps, and the sonogram and audio signal are reconstructed from these. The technique is applied on in-vivo data from the carotid artery...... in the sonogram and in the audio signal, rendering the audio signal useless, thus making diagnosis difficult. The current goal for ultrasound scanners is to maintain a high refresh rate for the B-mode image and at the same time attain a high maximum velocity in the sonogram display. This precludes the intermixing...... series, and is shown to yield better results, i.e., the variances of the predictions are lower. The ability of the neural predictor to reconstruct both the sonogram and the audio signal, when only 50% of the time is used for velocity data acquisition, is demonstrated for the in-vivo data...

  15. A quantum-implementable neural network model

    Science.gov (United States)

    Chen, Jialin; Wang, Lingli; Charbon, Edoardo

    2017-10-01

    A quantum-implementable neural network, namely a quantum probability neural network (QPNN) model, is proposed in this paper. QPNN can use quantum parallelism to trace all possible network states to improve the result. Due to its unique quantum nature, this model is robust to several quantum noises under certain conditions, and it can be efficiently implemented by the qubus quantum computer. Another advantage is that QPNN can be used as memory to retrieve the most relevant data and even to generate new data. The MATLAB experimental results of Iris data classification and MNIST handwriting recognition show that far fewer neuron resources are required in QPNN to obtain a good result than in the classical feedforward neural network. The proposed QPNN model indicates that quantum effects are useful for real-life classification tasks.

  16. Low endogenous neural noise in autism.

    Science.gov (United States)

    Davis, Greg; Plaisted-Grant, Kate

    2015-04-01

    'Heuristic' theories of autism postulate that a single mechanism or process underpins the diverse psychological features of autism spectrum disorder. Although no such theory can offer a comprehensive account, the parsimonious descriptions they provide are powerful catalysts to autism research. One recent proposal holds that 'noisy' neuronal signalling explains not only some deficits in autism spectrum disorder, but also some superior abilities, due to 'stochastic resonance'. Here, we discuss three distinct actions of noise in neural networks, arguing in each case that autism spectrum disorder symptoms reflect too little, rather than too much, neural noise. Such reduced noise, perhaps a function of atypical brainstem activation, would enhance detection and discrimination in autism spectrum disorder but at significant cost, foregoing the widespread benefits of noise in neural networks. © The Author(s) 2014.

  17. Neural correlates of forethought in ADHD.

    Science.gov (United States)

    Poissant, Hélène; Mendrek, Adrianna; Senhadji, Noureddine

    2014-04-01

    The purpose of the present investigation was to delineate the neural correlates of forethought in the ADHD children relative to typically developing (TD) children. In all, 21 TD and 23 ADHD adolescents underwent functional magnetic resonance imaging (fMRI) while performing a forethought task. The participants had to identify congruent and incongruent stimuli from cartoon stories representing sequences of action. The findings revealed significantly greater activation in the bilateral prefrontal cortex (PFC) in TD versus ADHD children, and more activation in the cerebellar vermis in the adolescents with ADHD versus TD, during performance of the incongruent relative to congruent condition. The inverse pattern of activation of the PFC and the cerebellar vermis in both groups could reflect a compensatory role played by the cerebellum or suggest the malfunction of the neural network between those regions in ADHD. Further research of the neural correlates of forethought in ADHD is warranted.

  18. Digital Neural Networks for New Media

    Science.gov (United States)

    Spaanenburg, Lambert; Malki, Suleyman

    Neural Networks perform computationally intensive tasks offering smart solutions for many new media applications. A number of analog and mixed digital/analog implementations have been proposed to smooth the algorithmic gap. But gradually, the digital implementation has become feasible, and the dedicated neural processor is on the horizon. A notable example is the Cellular Neural Network (CNN). The analog direction has matured for low-power, smart vision sensors; the digital direction is gradually being shaped into an IP-core for algorithm acceleration, especially for use in FPGA-based high-performance systems. The chapter discusses the next step towards a flexible and scalable multi-core engine using Application-Specific Integrated Processors (ASIP). This topographic engine can serve many new media tasks, as illustrated by novel applications in Homeland Security. We conclude with a view on the CNN kaleidoscope for the year 2020.

  19. Identification and characterization of secondary neural tube-derived embryonic neural stem cells in vitro.

    Science.gov (United States)

    Shaker, Mohammed R; Kim, Joo Yeon; Kim, Hyun; Sun, Woong

    2015-05-15

    Secondary neurulation is an embryonic process that gives rise to the secondary neural tube, the precursor of the lower spinal cord region. The secondary neural tube is derived from aggregated Sox2-expressing neural cells at the dorsal region of the tail bud, which eventually forms rosette or tube-like structures to give rise to neural tissues in the tail bud. We addressed whether the embryonic tail contains neural stem cells (NSCs), namely secondary NSCs (sNSCs), with the potential for self-renewal in vitro. Using in vitro neurosphere assays, neurospheres readily formed at the rosette and neural-tube levels, but less frequently at the tail bud tip level. Furthermore, we identified that sNSC-generated neurospheres were significantly smaller in size compared with cortical neurospheres. Interestingly, various cell cycle analyses revealed that this difference was not due to a reduction in the proliferation rate of NSCs, but rather the neuronal commitment of sNSCs, as sNSC-derived neurospheres contain more committed neuronal progenitor cells, even in the presence of epidermal growth factor (EGF) and basic fibroblast growth factor (bFGF). These results suggest that the higher tendency for sNSCs to spontaneously differentiate into progenitor cells may explain the limited expansion of the secondary neural tube during embryonic development.

  20. Computational modeling of neural plasticity for self-organization of neural networks.

    Science.gov (United States)

    Chrol-Cannon, Joseph; Jin, Yaochu

    2014-11-01

    Self-organization in biological nervous systems during the lifetime is known to largely occur through a process of plasticity that is dependent upon the spike-timing activity in connected neurons. In the field of computational neuroscience, much effort has been dedicated to building up computational models of neural plasticity to replicate experimental data. Most recently, increasing attention has been paid to understanding the role of neural plasticity in functional and structural neural self-organization, as well as its influence on the learning performance of neural networks for accomplishing machine learning tasks such as classification and regression. Although many ideas and hypotheses have been suggested, the relationship between the structure, dynamics and learning performance of neural networks remains elusive. The purpose of this article is to review the most important computational models for neural plasticity and discuss various ideas about neural plasticity's role. Finally, we suggest a few promising research directions, in particular those along the line that combines findings in computational neuroscience and systems biology, and their synergetic roles in understanding learning, memory and cognition, thereby bridging the gap between computational neuroscience, systems biology and computational intelligence. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
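
    Spike-timing-dependent plasticity, the kind of activity-dependent rule the review refers to, is commonly written as an exponential window: a synapse is strengthened when the presynaptic spike precedes the postsynaptic spike and weakened otherwise. The sketch below implements that textbook pair-based rule with illustrative parameters; it is a generic example, not any specific model from the review.

      import numpy as np

      # Textbook pair-based STDP window (parameter values are illustrative).
      A_plus, A_minus = 0.01, 0.012       # potentiation / depression amplitudes
      tau_plus, tau_minus = 20.0, 20.0    # time constants in ms

      def stdp_dw(t_pre, t_post):
          """Weight change for a single pre/post spike pair."""
          dt = t_post - t_pre
          if dt > 0:     # pre before post -> potentiation
              return A_plus * np.exp(-dt / tau_plus)
          else:          # post before (or with) pre -> depression
              return -A_minus * np.exp(dt / tau_minus)

      w = 0.5
      for t_pre, t_post in [(10.0, 15.0), (40.0, 38.0), (60.0, 61.0)]:
          w += stdp_dw(t_pre, t_post)
      print("weight after three spike pairs:", round(w, 4))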

  1. Neurale Netværk anvendt indenfor Proceskontrol. Neural Network for Process Control

    DEFF Research Database (Denmark)

    Madsen, Per Printz

    This project concerns the application of neural network models to process control. Neural network models are simple models of the processes that take place in the biological neural network. The biological neural network is the network of nerve cells that together form the central nervous system of... describable input signals. The biological neural network, i.e. the brain, is thus able, through learning, to learn how to control and regulate on the basis of these input signals so that the desired result is achieved. It is therefore natural to investigate whether neural networks are applicable... within process control in general. By applications to process control is meant here applications to prediction, simulation and regulation of dynamic systems. To test whether neural networks are applicable to prediction and simulation, a three-stage superheater simulator has been used to...

  2. Cotton genotypes selection through artificial neural networks.

    Science.gov (United States)

    Júnior, E G Silva; Cardoso, D B O; Reis, M C; Nascimento, A F O; Bortolin, D I; Martins, M R; Sousa, L B

    2017-09-27

    Breeding programs currently use statistical analysis to assist in the identification of superior genotypes at various stages of a cultivar's development. In contrast to these analyses, the computational intelligence approach has been little explored in the genetic improvement of cotton. Thus, this study was carried out with the objective of presenting the use of artificial neural networks as auxiliary tools in cotton breeding to improve fiber quality. To demonstrate the applicability of this approach, this research was carried out using the evaluation data of 40 genotypes. In order to classify the genotypes for fiber quality, the artificial neural networks were trained with replicate data of 20 cotton genotypes evaluated in the 2013/14 and 2014/15 harvests, regarding fiber length, uniformity of length, fiber strength, micronaire index, elongation, short fiber index, maturity index, reflectance degree, and fiber quality index. This quality index was estimated by means of a weighted average of the score (1 to 5) assigned to each HVI characteristic evaluated, according to industry standards. The artificial neural networks showed a high capacity for correctly classifying the 20 selected genotypes based on the fiber quality index; when fiber length was combined with the short fiber index, fiber maturity, and micronaire index, the networks gave better results than when using fiber length alone or the previous associations. It was also observed that submitting mean data of new genotypes to networks trained with replicate data provides better classification of the genotypes. The results obtained in the present study indicate that artificial neural networks have great potential for use at the different stages of a cotton genetic improvement program aimed at improving the fiber quality of future cultivars.

  3. Neural network approaches for noisy language modeling.

    Science.gov (United States)

    Li, Jun; Ouazzane, Karim; Kazemian, Hassan B; Afzal, Muhammad Sajid

    2013-11-01

    Text entry from people is not only grammatical and distinct, but also noisy. For example, a user's typing stream contains all the information about the user's interaction with the computer using a QWERTY keyboard, which may include the user's typing mistakes as well as specific vocabulary, typing habits, and typing performance. In particular, these features are obvious in disabled users' typing streams. This paper proposes a new concept called noisy language modeling by further developing information theory and applies neural networks to one of its specific applications, the typing stream. This paper experimentally uses a neural network approach to analyze disabled users' typing streams, both in general and specific ways, to identify their typing behaviors and subsequently to make typing predictions and typing corrections. In this paper, a focused time-delay neural network (FTDNN) language model, a time gap model, a prediction model based on time gap, and a probabilistic neural network model (PNN) are developed. A 38% first hitting rate (HR) and a 53% first-three HR in symbol prediction are obtained based on the analysis of a user's typing history through the FTDNN language modeling, while the modeling results using the time gap prediction model and the PNN model demonstrate that the correction rates lie predominantly between 65% and 90% with the current testing samples, and that 70% of all test scores are above the basic correction rates, respectively. The modeling process demonstrates that a neural network is a suitable and robust language modeling tool for analyzing the noisy language stream. The research also paves the way for practical application development in areas such as informational analysis, text prediction, and error correction by providing a theoretical basis of neural network approaches for noisy language modeling.

  4. Foetal ECG recovery using dynamic neural networks.

    Science.gov (United States)

    Camps-Valls, Gustavo; Martínez-Sober, Marcelino; Soria-Olivas, Emilio; Magdalena-Benedito, Rafael; Calpe-Maravilla, Javier; Guerrero-Martínez, Juan

    2004-07-01

    Non-invasive electrocardiography has proven to be a very interesting method for obtaining information about the foetus state and thus to assure its well-being during pregnancy. One of the main applications in this field is foetal electrocardiogram (ECG) recovery by means of automatic methods. Evident problems found in the literature are the limited number of available registers, the lack of performance indicators, and the limited use of non-linear adaptive methods. In order to circumvent these problems, we first introduce the generation of synthetic registers and discuss the influence of different kinds of noise to the modelling. Second, a method which is based on numerical (correlation coefficient) and statistical (analysis of variance, ANOVA) measures allows us to select the best recovery model. Finally, finite impulse response (FIR) and gamma neural networks are included in the adaptive noise cancellation (ANC) scheme in order to provide highly non-linear, dynamic capabilities to the recovery model. Neural networks are benchmarked with classical adaptive methods such as the least mean squares (LMS) and the normalized LMS (NLMS) algorithms in simulated and real registers and some conclusions are drawn. For synthetic registers, the most determinant factor in the identification of the models is the foetal-maternal signal-to-noise ratio (SNR). In addition, as the electromyogram contribution becomes more relevant, neural networks clearly outperform the LMS-based algorithm. From the ANOVA test, we found statistical differences between LMS-based models and neural models when complex situations (high foetal-maternal and foetal-noise SNRs) were present. These conclusions were confirmed after doing robustness tests on synthetic registers, visual inspection of the recovered signals and calculation of the recognition rates of foetal R-peaks for real situations. Finally, the best compromise between model complexity and outcomes was provided by the FIR neural network. Both
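
    The LMS algorithm used as a baseline above can be sketched in an adaptive-noise-cancellation (ANC) arrangement: a reference input correlated with the interference (standing in for the maternal component) is filtered adaptively and subtracted from the abdominal mixture, so that the error signal approximates the foetal contribution. All signals, filter settings and the step size below are synthetic placeholders, not real ECG recordings or the paper's configuration.

      import numpy as np

      rng = np.random.default_rng(7)
      n, fs = 4000, 500.0                          # samples, sampling rate (Hz)
      t = np.arange(n) / fs

      maternal = np.sin(2 * np.pi * 1.2 * t)                     # stand-in maternal component
      foetal = 0.2 * np.sin(2 * np.pi * 2.4 * t + 0.5)           # weaker, faster foetal component
      primary = maternal + foetal + 0.02 * rng.normal(size=n)    # abdominal mixture
      reference = maternal + 0.02 * rng.normal(size=n)           # reference correlated with interference

      M, mu = 16, 0.01                             # filter length, LMS step size
      w = np.zeros(M)
      error = np.zeros(n)
      for i in range(M, n):
          x = reference[i - M:i][::-1]             # most recent reference samples
          y = w @ x                                # adaptive estimate of the interference
          error[i] = primary[i] - y                # error = foetal-signal estimate
          w += 2 * mu * error[i] * x               # LMS weight update

      residual_rms = float(np.sqrt(np.mean((error[M:] - foetal[M:]) ** 2)))
      foetal_rms = float(np.sqrt(np.mean(foetal[M:] ** 2)))
      print("residual RMS vs true foetal RMS:", round(residual_rms, 4), round(foetal_rms, 4))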

  5. File list: ALL.Neu.20.AllAg.Induced_neural_progenitors [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available ALL.Neu.20.AllAg.Induced_neural_progenitors mm9 All antigens Neural Induced neural progeni....biosciencedbc.jp/kyushu-u/mm9/assembled/ALL.Neu.20.AllAg.Induced_neural_progenitors.bed ...

  6. File list: ALL.Neu.10.AllAg.Induced_neural_progenitors [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available ALL.Neu.10.AllAg.Induced_neural_progenitors mm9 All antigens Neural Induced neural progeni....biosciencedbc.jp/kyushu-u/mm9/assembled/ALL.Neu.10.AllAg.Induced_neural_progenitors.bed ...

  7. File list: ALL.Neu.05.AllAg.Induced_neural_progenitors [Chip-atlas[Archive

    Lifescience Database Archive (English)

    Full Text Available ALL.Neu.05.AllAg.Induced_neural_progenitors mm9 All antigens Neural Induced neural progeni....biosciencedbc.jp/kyushu-u/mm9/assembled/ALL.Neu.05.AllAg.Induced_neural_progenitors.bed ...

  8. The Effects of GABAergic Polarity Changes on Episodic Neural Network Activity in Developing Neural Systems

    Directory of Open Access Journals (Sweden)

    Wilfredo Blanco

    2017-09-01

    Full Text Available Early in development, neural systems have primarily excitatory coupling, where even GABAergic synapses are excitatory. Many of these systems exhibit spontaneous episodes of activity that have been characterized through both experimental and computational studies. As development progresses, the neural system goes through many changes, including synaptic remodeling, intrinsic plasticity in the ion channel expression, and a transformation of GABAergic synapses from excitatory to inhibitory. What effect each of these, and other, changes has on the network behavior is hard to know from experimental studies since they all happen in parallel. One advantage of a computational approach is that one has the ability to study developmental changes in isolation. Here, we examine the effects of GABAergic synapse polarity change on the spontaneous activity of both a mean field and a neural network model that has both glutamatergic and GABAergic coupling, representative of a developing neural network. We find some intuitive behavioral changes as the GABAergic neurons go from excitatory to inhibitory, shared by both models, such as a decrease in the duration of episodes. We also find some paradoxical changes in the activity that are only present in the neural network model. In particular, we find that during early development the inter-episode durations become longer on average, while later in development they become shorter. In addressing this unexpected finding, we uncover a priming effect that is particularly important for a small subset of neurons, called the “intermediate neurons.” We characterize these neurons and demonstrate why they are crucial to episode initiation, and why the paradoxical behavioral change results from priming of these neurons. The study illustrates how even arguably the simplest of developmental changes that occur in neural systems can present non-intuitive behaviors. It also makes predictions about neural network behavioral changes
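
    A bare-bones way to see how the polarity parameter enters such models is a single-population rate equation in which the GABAergic feedback gain is swept from positive (excitatory) to negative (inhibitory). The sketch below does only that; it deliberately omits the slow synaptic depression that produces the episodic activity analysed in the article, and every parameter value is illustrative.

        # Rate-model sketch of a GABA polarity switch (no episodic dynamics).
        import numpy as np

        def steady_state_activity(g_gaba, w_glu=6.0, w_gaba=3.0, steps=5000, dt=0.01):
            f = lambda x: 1.0 / (1.0 + np.exp(-(x - 2.0)))    # population gain function
            a = 0.1                                            # mean activity variable
            for _ in range(steps):
                drive = w_glu * a + g_gaba * w_gaba * a        # glutamatergic + GABAergic feedback
                a += dt * (f(drive) - a)                       # relax towards the gain curve
            return a

        for g in np.linspace(1.0, -1.0, 5):                    # excitatory -> inhibitory GABA
            print(f"GABA polarity {g:+.1f}  ->  steady-state activity {steady_state_activity(g):.3f}")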

  9. Neural simulations on multi-core architectures

    Directory of Open Access Journals (Sweden)

    Hubert Eichner

    2009-07-01

    Full Text Available Neuroscience is witnessing increasing knowledge about the anatomy and electrophysiological properties of neurons and their connectivity, leading to an ever-increasing computational complexity of neural simulations. At the same time, a rather radical change in personal computer technology emerges with the establishment of multi-cores: high-density, explicitly parallel processor architectures for both high-performance and standard desktop computers. This work introduces strategies for the parallelization of biophysically realistic neural simulations based on the compartmental modeling technique and results of such an implementation, with a strong focus on multi-core architectures and automation, i.e., user-transparent load balancing.
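
    A toy picture of the parallelization strategy is to hand each worker process one cell (or a block of cells) to integrate, letting the operating system spread the workers over the available cores. The passive-cable update, cell counts and parameters below are invented, and a real compartmental simulator would additionally have to exchange spikes and synaptic currents between processes at every time step, which is not shown.

        # Distributing independent per-cell compartmental updates over CPU cores.
        import numpy as np
        from multiprocessing import Pool

        def simulate_cell(seed, n_comp=200, steps=4000, dt=0.025):
            rng = np.random.default_rng(seed)
            v = np.full(n_comp, -65.0)                       # membrane potential per compartment
            g_axial, g_leak, e_leak = 0.5, 0.1, -65.0
            i_inj = rng.normal(0.0, 0.5, n_comp)             # static noisy input current
            for _ in range(steps):
                axial = g_axial * (np.roll(v, 1) + np.roll(v, -1) - 2 * v)   # 1-D cable coupling
                axial[0] = g_axial * (v[1] - v[0])           # sealed ends
                axial[-1] = g_axial * (v[-2] - v[-1])
                v += dt * (axial + g_leak * (e_leak - v) + i_inj)
            return v.mean()

        if __name__ == "__main__":
            with Pool() as pool:                             # one worker per available core
                mean_potentials = pool.map(simulate_cell, range(32))   # 32 independent cells
            print(np.round(mean_potentials[:4], 2))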

  10. Visual Servoing from Deep Neural Networks

    OpenAIRE

    Bateux, Quentin; Marchand, Eric; Leitner, Jürgen; Chaumette, Francois; Corke, Peter

    2017-01-01

    International audience; We present a deep neural network-based method to perform high-precision, robust and real-time 6 DOF visual servoing. The paper describes how to create a dataset simulating various perturbations (occlusions and lighting conditions) from a single real-world image of the scene. A convolutional neural network is fine-tuned using this dataset to estimate the relative pose between two images of the same scene. The output of the network is then employed in a visual servoing c...

  11. Design of Robust Neural Network Classifiers

    DEFF Research Database (Denmark)

    Larsen, Jan; Andersen, Lars Nonboe; Hintz-Madsen, Mads

    1998-01-01

    This paper addresses a new framework for designing robust neural network classifiers. The network is optimized using the maximum a posteriori technique, i.e., the cost function is the sum of the log-likelihood and a regularization term (prior). In order to perform robust classification, we present...... a modified likelihood function which incorporates the potential risk of outliers in the data. This leads to the introduction of a new parameter, the outlier probability. Designing the neural classifier involves optimization of network weights as well as outlier probability and regularization parameters. We...
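
    The core of the robustness idea can be written down compactly: with an outlier probability eps, the class posterior used in the likelihood is mixed with a uniform distribution over classes, so no single mislabelled example can drive the cost to infinity. The sketch below shows only this modified negative log-likelihood for a fixed softmax layer on made-up data; in the paper the outlier probability is adapted jointly with the network weights and the regularization parameters.

        # Outlier-robust negative log-likelihood via mixing with a uniform distribution.
        import numpy as np

        rng = np.random.default_rng(1)
        n, d, c = 200, 5, 3
        X = rng.normal(size=(n, d))
        y = rng.integers(0, c, size=n)
        W = rng.normal(scale=0.1, size=(d, c))               # stand-in "network": one softmax layer

        def class_posteriors(X, W):
            z = X @ W
            z -= z.max(axis=1, keepdims=True)
            p = np.exp(z)
            return p / p.sum(axis=1, keepdims=True)

        def robust_nll(X, y, W, eps):
            """Negative log-likelihood with outlier probability eps."""
            p = class_posteriors(X, W)
            p_robust = (1.0 - eps) * p + eps / c             # mix with the uniform distribution
            return -np.mean(np.log(p_robust[np.arange(len(y)), y]))

        print("standard NLL (eps = 0):   ", robust_nll(X, y, W, eps=0.0).round(3))
        print("robust NLL   (eps = 0.05):", robust_nll(X, y, W, eps=0.05).round(3))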

  12. Artificial neural network in cosmic landscape

    Science.gov (United States)

    Liu, Junyu

    2017-12-01

    In this paper we propose that artificial neural networks, the basis of machine learning, are useful for generating the inflationary landscape from a cosmological point of view. Traditional numerical simulations of a global cosmic landscape typically require exponential complexity when the number of fields is large. However, a basic application of an artificial neural network could solve the problem, based on the universal approximation theorem of the multilayer perceptron. A toy model of inflation with multiple light fields is investigated numerically as an example of such an application.

  13. Program Aids Simulation Of Neural Networks

    Science.gov (United States)

    Baffes, Paul T.

    1990-01-01

    Computer program NETS - Tool for Development and Evaluation of Neural Networks - provides simulation of neural-network algorithms plus software environment for development of such algorithms. Enables user to customize patterns of connections between layers of network, and provides features for saving weight values of network, providing for more precise control over learning process. Consists of translating problem into format using input/output pairs, designing network configuration for problem, and finally training network with input/output pairs until acceptable error reached. Written in C.

  14. Electronic device aspects of neural network memories

    Science.gov (United States)

    Lambe, J.; Moopenn, A.; Thakoor, A. P.

    1985-01-01

    The basic issues related to the electronic implementation of the neural network model (NNM) for content addressable memories are examined. A brief introduction to the principles of the NNM is followed by an analysis of the information storage of the neural network in the form of a binary connection matrix and the recall capability of such matrix memories based on a hardware simulation study. In addition, materials and device architecture issues involved in the future realization of such networks in VLSI-compatible ultrahigh-density memories are considered. A possible space application of such devices would be in the area of large-scale information storage without mechanical devices.
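
    The storage and recall behaviour described here can be reproduced in a few lines: bipolar patterns are written into a symmetric connection matrix with a Hebbian outer-product rule and recalled from a corrupted cue by iterated thresholding. Pattern and network sizes below are arbitrary, and this numerical sketch says nothing about the VLSI implementation issues the article is concerned with.

        # Content-addressable memory with a binary/bipolar connection matrix.
        import numpy as np

        rng = np.random.default_rng(2)
        n_units, n_patterns = 100, 5
        patterns = rng.choice([-1, 1], size=(n_patterns, n_units))

        W = (patterns.T @ patterns) / n_units          # Hebbian connection matrix
        np.fill_diagonal(W, 0.0)                       # no self-connections

        def recall(cue, steps=20):
            s = cue.copy()
            for _ in range(steps):                     # synchronous threshold updates
                s = np.where(W @ s >= 0, 1, -1)
            return s

        cue = patterns[0].copy()
        flipped = rng.choice(n_units, size=15, replace=False)
        cue[flipped] *= -1                             # corrupt 15% of the stored bits
        print("overlap with the stored pattern:", (recall(cue) @ patterns[0]) / n_units)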

  15. Top tagging with deep neural networks [Vidyo

    CERN Multimedia

    CERN. Geneva

    2017-01-01

    Recent literature on deep neural networks for top tagging has focused on image-based techniques or multivariate approaches using high-level jet substructure variables. Here, we take a sequential approach to this task by using an ordered sequence of energy deposits as training inputs. Unlike previous approaches, this strategy does not result in a loss of information during pixelization or the calculation of high-level features. We also propose new preprocessing methods that do not alter key physical quantities such as jet mass. We compare the performance of this approach to standard tagging techniques and present results evaluating the robustness of the neural network to pileup.

  16. A neural network simulation package in CLIPS

    Science.gov (United States)

    Bhatnagar, Himanshu; Krolak, Patrick D.; Mcgee, Brenda J.; Coleman, John

    1990-01-01

    The intrinsic similarity between the firing of a rule and the firing of a neuron has been captured in this research to provide a neural network development system within an existing production system (CLIPS). A very important by-product of this research has been the emergence of an integrated technique of using rule-based systems in conjunction with neural networks to solve complex problems. The system provides a toolkit for the integrated use of the two techniques and is also extensible to accommodate other AI techniques such as semantic networks, connectionist networks, and even Petri nets. This integrated technique can be very useful in solving complex AI problems.

  17. Automatic identification of species with neural networks.

    Science.gov (United States)

    Hernández-Serna, Andrés; Jiménez-Segura, Luz Fernanda

    2014-01-01

    A new automatic identification system using photographic images has been designed to recognize fish, plant, and butterfly species from Europe and South America. The automatic classification system integrates multiple image processing tools to extract the geometry, morphology, and texture of the images. Artificial neural networks (ANNs) were used as the pattern recognition method. We tested a data set that included 740 species and 11,198 individuals. Our results show that the system performed with high accuracy, reaching 91.65% of true positive fish identifications, 92.87% of plants and 93.25% of butterflies. Our results highlight how the neural networks are complementary to species identification.

  18. Automatic identification of species with neural networks

    Directory of Open Access Journals (Sweden)

    Andrés Hernández-Serna

    2014-11-01

    Full Text Available A new automatic identification system using photographic images has been designed to recognize fish, plant, and butterfly species from Europe and South America. The automatic classification system integrates multiple image processing tools to extract the geometry, morphology, and texture of the images. Artificial neural networks (ANNs) were used as the pattern recognition method. We tested a data set that included 740 species and 11,198 individuals. Our results show that the system performed with high accuracy, reaching 91.65% of true positive fish identifications, 92.87% of plants and 93.25% of butterflies. Our results highlight how the neural networks are complementary to species identification.

  19. Stimulation and recording electrodes for neural prostheses

    CERN Document Server

    Pour Aryan, Naser; Rothermel, Albrecht

    2015-01-01

    This book provides readers with basic principles of the electrochemistry of the electrodes used in modern, implantable neural prostheses. The authors discuss the boundaries and conditions in which the electrodes continue to function properly for long time spans, which are required when designing neural stimulator devices for long-term in vivo applications. Two kinds of electrode materials, titanium nitride and iridium are discussed extensively, both qualitatively and quantitatively. The influence of the counter electrode on the safety margins and electrode lifetime in a two electrode system is explained. Electrode modeling is handled in a final chapter.

  20. Neural activation in stress-related exhaustion

    DEFF Research Database (Denmark)

    Gavelin, Hanna Malmberg; Neely, Anna Stigsdotter; Andersson, Micael

    2017-01-01

    The primary purpose of this study was to investigate the association between burnout and neural activation during working memory processing in patients with stress-related exhaustion. Additionally, we investigated the neural effects of cognitive training as part of stress rehabilitation. Fifty-five patients with clinical diagnosis of exhaustion disorder were administered the n-back task during fMRI scanning at baseline. Ten patients completed a 12-week cognitive training intervention, as an addition to stress rehabilitation. Eleven patients served as a treatment-as-usual control group. At baseline...

  1. Pulse image recognition using fuzzy neural network.

    Science.gov (United States)

    Xu, L S; Meng, Max Q -H; Wang, K Q

    2007-01-01

    The automatic recognition of pulse images is the key issue in research on computerized pulse diagnosis. In order to automatically differentiate pulse patterns using small samples, a fuzzy neural network was designed to classify pulse images based on the knowledge of experts in traditional Chinese pulse diagnosis. The designed classifier can make both hard and soft decisions, identifying 18 patterns of pulse images with an accuracy of 91%, which is better than the results achieved by a back-propagation neural network.

  2. Assessing Landslide Hazard Using Artificial Neural Network

    DEFF Research Database (Denmark)

    Farrokhzad, Farzad; Choobbasti, Asskar Janalizadeh; Barari, Amin

    2011-01-01

    failure" which is main concentration of the current research and "liquefaction failure". Shear failures along shear planes occur when the shear stress along the sliding surfaces exceed the effective shear strength. These slides have been referred to as landslide. An expert system based on artificial...... neural network has been developed for use in the stability evaluation of slopes under various geological conditions and engineering requirements. The Artificial neural network model of this research uses slope characteristics as input and leads to the output in form of the probability of failure...

  3. Neural networks advances and applications 2

    CERN Document Server

    Gelenbe, E

    1992-01-01

    The present volume is a natural follow-up to Neural Networks: Advances and Applications which appeared one year previously. As the title indicates, it combines the presentation of recent methodological results concerning computational models and results inspired by neural networks, and of well-documented applications which illustrate the use of such models in the solution of difficult problems. The volume is balanced with respect to these two orientations: it contains six papers concerning methodological developments and five papers concerning applications and examples illustrating the theoret

  4. Neural Variability Quenching Predicts Individual Perceptual Abilities.

    Science.gov (United States)

    Arazi, Ayelet; Censor, Nitzan; Dinstein, Ilan

    2017-01-04

    Neural activity during repeated presentations of a sensory stimulus exhibits considerable trial-by-trial variability. Previous studies have reported that trial-by-trial neural variability is reduced (quenched) by the presentation of a stimulus. However, the functional significance and behavioral relevance of variability quenching and the potential physiological mechanisms that may drive it have been studied only rarely. Here, we recorded neural activity with EEG as subjects performed a two-interval forced-choice contrast discrimination task. Trial-by-trial neural variability was quenched by ∼40% after the presentation of the stimulus relative to the variability apparent before stimulus presentation, yet there were large differences in the magnitude of variability quenching across subjects. Individual magnitudes of quenching predicted individual discrimination capabilities such that subjects who exhibited larger quenching had smaller contrast discrimination thresholds and steeper psychometric function slopes. Furthermore, the magnitude of variability quenching was strongly correlated with a reduction in broadband EEG power after stimulus presentation. Our results suggest that neural variability quenching is achieved by reducing the amplitude of broadband neural oscillations after sensory input, which yields relatively more reproducible cortical activity across trials and enables superior perceptual abilities in individuals who quench more. Variability quenching is a phenomenon in which neural variability across trials is reduced by the presentation of a stimulus. Although this phenomenon has been reported across a variety of animal and human studies, its functional significance and behavioral relevance have been examined only rarely. Here, we report novel empirical evidence from humans revealing that variability quenching differs dramatically across individual subjects and explains to a certain degree why some individuals exhibit better perceptual abilities than

  5. Neural Dynamics Underlying Event-Related Potentials

    Science.gov (United States)

    Shah, Ankoor S.; Bressler, Steven L.; Knuth, Kevin H.; Ding, Ming-Zhou; Mehta, Ashesh D.; Ulbert, Istvan; Schroeder, Charles E.

    2003-01-01

    There are two opposing hypotheses about the brain mechanisms underlying sensory event-related potentials (ERPs). One holds that sensory ERPs are generated by phase resetting of ongoing electroencephalographic (EEG) activity, and the other that they result from signal averaging of stimulus-evoked neural responses. We tested several contrasting predictions of these hypotheses by direct intracortical analysis of neural activity in monkeys. Our findings clearly demonstrate evoked response contributions to the sensory ERP in the monkey, and they suggest the likelihood that a mixed (Evoked/Phase Resetting) model may account for the generation of scalp ERPs in humans.

  6. Human Face Recognition Using Convolutional Neural Networks

    Directory of Open Access Journals (Sweden)

    Răzvan-Daniel Albu

    2009-10-01

    Full Text Available In this paper, I present a novel hybrid face recognition approach based on a convolutional neural architecture, designed to robustly detect highly variable face patterns. The convolutional network extracts successively larger features in a hierarchical set of layers. The weights of the trained neural networks are used to create kernel windows for feature extraction in a 3-stage algorithm. I present experimental results illustrating the efficiency of the proposed approach. I use a database of 796 images of 159 individuals from Reims University, which contains quite a high degree of variability in expression, pose, and facial details.

  7. SAR ATR Based on Convolutional Neural Network

    Directory of Open Access Journals (Sweden)

    Tian Zhuangzhuang

    2016-06-01

    Full Text Available This study presents a new method of Synthetic Aperture Radar (SAR image target recognition based on a convolutional neural network. First, we introduce a class separability measure into the cost function to improve this network’s ability to distinguish between categories. Then, we extract SAR image features using the improved convolutional neural network and classify these features using a support vector machine. Experimental results using moving and stationary target acquisition and recognition SAR datasets prove the validity of this method.

  8. Neural Activity Reveals Preferences Without Choices

    Science.gov (United States)

    Smith, Alec; Bernheim, B. Douglas; Camerer, Colin

    2014-01-01

    We investigate the feasibility of inferring the choices people would make (if given the opportunity) based on their neural responses to the pertinent prospects when they are not engaged in actual decision making. The ability to make such inferences is of potential value when choice data are unavailable, or limited in ways that render standard methods of estimating choice mappings problematic. We formulate prediction models relating choices to “non-choice” neural responses and use them to predict out-of-sample choices for new items and for new groups of individuals. The predictions are sufficiently accurate to establish the feasibility of our approach. PMID:25729468

  9. Astrocytes: Tailored to Support the Demand of Neural Circuits?

    DEFF Research Database (Denmark)

    Rasmussen, Rune

    2017-01-01

    Anatomy, physiology, proteomics, and genomics reveal the prospect of distinct highly specialized astrocyte subtypes within neural circuits.

  10. Emerging trends in neuro engineering and neural computation

    CERN Document Server

    Lee, Kendall; Garmestani, Hamid; Lim, Chee

    2017-01-01

    This book focuses on neuro-engineering and neural computing, a multi-disciplinary field of research attracting considerable attention from engineers, neuroscientists, microbiologists and material scientists. It explores a range of topics concerning the design and development of innovative neural and brain interfacing technologies, as well as novel information acquisition and processing algorithms to make sense of the acquired data. The book also highlights emerging trends and advances regarding the applications of neuro-engineering in real-world scenarios, such as neural prostheses, diagnosis of neural degenerative diseases, deep brain stimulation, biosensors, real neural network-inspired artificial neural networks (ANNs) and the predictive modeling of information flows in neuronal networks. The book is broadly divided into three main sections including: current trends in technological developments, neural computation techniques to make sense of the neural behavioral data, and application of these technologie...

  11. Parameter Identification by Bayes Decision and Neural Networks

    DEFF Research Database (Denmark)

    Kulczycki, P.; Schiøler, Henrik

    1994-01-01

    The problem of parameter identification by Bayes point estimation using neural networks is investigated.

  12. On The Comparison of Artificial Neural Network (ANN) and ...

    African Journals Online (AJOL)

    West African Journal of Industrial and Academic Research ... This work presented the results of an experimental comparison of two models: Multinomial Logistic Regression (MLR) and Artificial Neural Network (ANN) for ... Keywords: Multinomial Logistic Regression, Artificial Neural Network, Correct classification rate.

  13. A NEURAL OSCILLATOR-NETWORK MODEL OF TEMPORAL PATTERN GENERATION

    NARCIS (Netherlands)

    Schomaker, Lambert

    Most contemporary neural network models deal with essentially static, perceptual problems of classification and transformation. Models such as multi-layer feedforward perceptrons generally do not incorporate time as an essential dimension, whereas biological neural networks are inherently temporal

  14. Activity Patterns of Cultured Neural Networks on Micro Electrode Arrays

    National Research Council Canada - National Science Library

    Rutten, Wim

    2001-01-01

    A hybrid neuro-electronic interface is a cell-cultured micro electrode array, acting as a neural information transducer for stimulation and/or recording of neural activity in the brain or the spinal cord...

  15. Real-time decision fusion for multimodal neural prosthetic devices

    National Research Council Canada - National Science Library

    White, James Robert; Levy, Todd; Bishop, William; Beaty, James D

    2010-01-01

    ...) through which neural activity is decoded into movements. A natural extension of current research is the incorporation of neural activity from multiple modalities to more accurately estimate the user's intent...

  16. Unveiling neural coupling within the sensorimotor system : directionality and nonlinearity

    NARCIS (Netherlands)

    Yang, Y.; Dewald, J.P.A.; van der Helm, F.C.T.; Schouten, A.C.

    2017-01-01

    Neural coupling between the central nervous system and the periphery is essential for the neural control of movement. Corticomuscular coherence is a popular linear technique to assess synchronised oscillatory activity in the sensorimotor system. This oscillatory coupling originates from ascending

  17. Equivalence of Conventional and Modified Network of Generalized Neural Elements

    Directory of Open Access Journals (Sweden)

    E. V. Konovalov

    2016-01-01

    Full Text Available The article is devoted to the analysis of neural networks consisting of generalized neural elements. The first part of the article proposes a new neural network model: a modified network of generalized neural elements (MGNE-network). This network develops the model of the generalized neural element, whose formal description contains some flaws. In the model of the MGNE-network these drawbacks are overcome. A neural network is introduced all at once, without a preliminary description of the model of a single neural element and of the method by which such elements interact. The description of the neural network's mathematical model is simplified and makes it relatively easy to construct a simulation model on its basis to conduct numerical experiments. The model of the MGNE-network is universal, uniting properties of networks consisting of neurons-oscillators and neurons-detectors. In the second part of the article we prove the equivalence of the dynamics of the two considered neural networks: the network consisting of classical generalized neural elements, and the MGNE-network. We introduce the definition of equivalence in the functioning of the generalized neural element and of the MGNE-network consisting of a single element. Then we introduce the definition of the equivalence of the dynamics of the two neural networks in general. The correlation between the parameters of the two considered neural network models is determined. We discuss the issue of matching the initial conditions of the two considered neural network models. We prove a theorem about the equivalence of the dynamics of the two considered neural networks. This theorem allows us to apply all previously obtained results for networks consisting of classical generalized neural elements to the MGNE-network.

  18. Robust Adaptive Control via Neural Linearization and Compensation

    Directory of Open Access Journals (Sweden)

    Roberto Carmona Rodríguez

    2012-01-01

    Full Text Available We propose a new type of neural adaptive control via dynamic neural networks. For a class of unknown nonlinear systems, a neural identifier-based feedback linearization controller is first used. Dead-zone and projection techniques are applied to assure the stability of the neural identification. Four types of compensators are then addressed. The stability of the closed-loop system is also proven.

  19. Optimizing neural network models: motivation and case studies

    OpenAIRE

    Harp, S A; T. Samad

    2012-01-01

    Practical successes have been achieved  with neural network models in a variety of domains, including energy-related industry. The large, complex design space presented by neural networks is only minimally explored in current practice. The satisfactory results that nevertheless have been obtained testify that neural networks are a robust modeling technology; at the same time, however, the lack of a systematic design approach implies that the best neural network models generally  rem...

  20. Dynamic Object Identification with SOM-based neural networks

    Directory of Open Access Journals (Sweden)

    Aleksey Averkin

    2014-03-01

    Full Text Available In this article a number of neural networks based on self-organizing maps that can be successfully used for dynamic object identification are described. Unique SOM-based modular neural networks with vector quantized associative memory and recurrent self-organizing maps as modules are presented. The structured algorithms of learning and operation of such SOM-based neural networks are described in detail; some experimental results and a comparison with other neural networks are also given.
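
    The building block of all these architectures is the self-organizing map itself. The sketch below fits a small SOM to two-dimensional data with the classic winner-take-all update and a shrinking Gaussian neighbourhood; grid size, schedules and data are arbitrary placeholders rather than anything from the article.

        # Compact self-organizing map training loop.
        import numpy as np

        rng = np.random.default_rng(3)
        data = rng.normal(size=(1000, 2))                       # stand-in for identification data
        grid = np.array([(i, j) for i in range(10) for j in range(10)], dtype=float)
        codebook = rng.normal(size=(100, 2))                    # one code vector per grid node

        n_iter = 5000
        for t in range(n_iter):
            x = data[rng.integers(len(data))]
            winner = np.argmin(((codebook - x) ** 2).sum(axis=1))      # best matching unit
            lr = 0.5 * (0.01 / 0.5) ** (t / n_iter)                    # decaying learning rate
            radius = 5.0 * (0.5 / 5.0) ** (t / n_iter)                 # decaying neighbourhood radius
            dist2 = ((grid - grid[winner]) ** 2).sum(axis=1)           # grid distance to the winner
            h = np.exp(-dist2 / (2 * radius ** 2))                     # neighbourhood function
            codebook += lr * h[:, None] * (x - codebook)

        print("codebook spread:", codebook.min(axis=0).round(2), codebook.max(axis=0).round(2))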

  1. Stock Price Prediction Based on Procedural Neural Networks

    OpenAIRE

    Jiuzhen Liang; Wei Song; Mei Wang

    2011-01-01

    We present a spatiotemporal model, namely, procedural neural networks for stock price prediction. Compared with some successful traditional models for simulating the stock market, such as BNN (backpropagation neural network), HMM (hidden Markov model) and SVM (support vector machine), the procedural neural network model processes both spatial and temporal information synchronously without a sliding time window, which is typically used in the well-known recurrent neural networks. Two differen...

  2. Analysis of neural networks in terms of domain functions

    NARCIS (Netherlands)

    van der Zwaag, B.J.; Slump, Cornelis H.; Spaanenburg, Lambert

    Despite their success-story, artificial neural networks have one major disadvantage compared to other techniques: the inability to explain comprehensively how a trained neural network reaches its output; neural networks are not only (incorrectly) seen as a "magic tool" but possibly even more as a

  3. Extracting knowledge from supervised neural networks in image processing

    NARCIS (Netherlands)

    van der Zwaag, B.J.; Slump, Cornelis H.; Spaanenburg, Lambert; Jain, R.; Abraham, A.; Faucher, C.; van der Zwaag, B.J.

    Despite their success-story, artificial neural networks have one major disadvantage compared to other techniques: the inability to explain comprehensively how a trained neural network reaches its output; neural networks are not only (incorrectly) seen as a "magic tool" but possibly even more as a

  4. neural network based load frequency control for restructuring power

    African Journals Online (AJOL)

    2012-03-01

    Mar 1, 2012 ... Abstract. In this study, an artificial neural network (ANN) application of load frequency control (LFC) of a multi-area power system using a neural network controller is presented. The comparison between a conventional Proportional Integral (PI) controller and the proposed artificial neural networks ...

  5. Artificial Neural Network Modeling of an Inverse Fluidized Bed ...

    African Journals Online (AJOL)

    The application of neural networks to model a laboratory scale inverse fluidized bed reactor has been studied. A Radial Basis Function neural network has been successfully employed for the modeling of the inverse fluidized bed reactor. In the proposed model, the trained neural network represents the kinetics of biological ...
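
    The essence of an RBF model of a static input-output mapping fits in a short script: centres are taken from the training inputs, Gaussian activations form the hidden layer, and the linear output weights are solved by least squares. The data below is synthetic and only stands in for measured reactor inputs and outputs; nothing here reproduces the article's actual kinetics.

        # Minimal radial basis function network fitted by linear least squares.
        import numpy as np

        rng = np.random.default_rng(4)
        x = rng.uniform(0, 1, size=(200, 1))                        # operating condition (synthetic)
        y = np.sin(6 * x[:, 0]) + 0.05 * rng.normal(size=200)       # measured response (synthetic)

        n_centres, width = 15, 0.15
        centres = x[rng.choice(len(x), n_centres, replace=False)]

        def design_matrix(x):
            d2 = ((x[:, None, :] - centres[None, :, :]) ** 2).sum(axis=2)
            return np.exp(-d2 / (2 * width ** 2))                   # Gaussian hidden-layer activations

        H = design_matrix(x)
        w, *_ = np.linalg.lstsq(H, y, rcond=None)                   # output-layer weights

        x_test = np.linspace(0, 1, 5)[:, None]
        print("RBF prediction:", (design_matrix(x_test) @ w).round(2))
        print("true function: ", np.sin(6 * x_test[:, 0]).round(2))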

  6. Artificial Neural Network Analysis of Xinhui Pericarpium Citri ...

    African Journals Online (AJOL)

    Purpose: To develop an effective analytical method to distinguish old peels of Xinhui Pericarpium citri reticulatae (XPCR) stored for > 3 years from new peels stored for < 3 years. Methods: Artificial neural networks (ANN) models, including general regression neural network (GRNN) and multi-layer feedforward neural ...

  7. Time series prediction with simple recurrent neural networks ...

    African Journals Online (AJOL)

    Simple recurrent neural networks are widely used in time series prediction. Most researchers and application developers often choose arbitrarily between Elman or Jordan simple recurrent neural networks for their applications. A hybrid of the two called Elman-Jordan (or Multi-recurrent) neural network is also being used.

  8. Application of radial basis neural network for state estimation of ...

    African Journals Online (AJOL)

    user

    An original application of a radial basis function (RBF) neural network for power system state estimation is proposed in this paper. The property of massive parallelism of neural networks is employed for this. The application of the RBF neural network for state estimation is investigated by testing its applicability on an IEEE 14-bus ...

  9. New Neural Network Methods for Forecasting Regional Employment

    NARCIS (Netherlands)

    Patuelli, R.; Reggiani, A; Nijkamp, P.; Blien, U.

    2006-01-01

    In this paper, a set of neural network (NN) models is developed to compute short-term forecasts of regional employment patterns in Germany. Neural networks are modern statistical tools based on learning algorithms that are able to process large amounts of data. Neural networks are enjoying

  10. The Artifical Neural Network as means for modeling Nonlinear Systems

    OpenAIRE

    Drábek Oldřich; Taufer Ivan

    1998-01-01

    The paper deals with nonlinear system identification based on neural networks. The topic of this publication is the simulation of training and testing a neural network. The contribution is aimed at technologists who are familiar with classical identification problems but whose knowledge of neural-network-based identification is still at the stage of theoretical foundations.

  11. The Artifical Neural Network as means for modeling Nonlinear Systems

    Directory of Open Access Journals (Sweden)

    Drábek Oldřich

    1998-12-01

    Full Text Available The paper deals with nonlinear system identification based on neural networks. The topic of this publication is the simulation of training and testing a neural network. The contribution is aimed at technologists who are familiar with classical identification problems but whose knowledge of neural-network-based identification is still at the stage of theoretical foundations.

  12. Parametric Identification of Aircraft Loads: An Artificial Neural Network Approach

    Science.gov (United States)

    2016-03-30

    Parametric Identification of Aircraft Loads: An Artificial Neural Network Approach...monitoring, flight parameter, nonlinear modeling, Artificial Neural Network, typical loadcase. Aircraft load monitoring is an... Neural Networks (ANN), i.e. the BP network and Kohonen Clustering Network, are applied and revised by Kalman Filter and Genetic Algorithm to build

  13. Algorithm For A Self-Growing Neural Network

    Science.gov (United States)

    Cios, Krzysztof J.

    1996-01-01

    CID3 algorithm simulates self-growing neural network. Constructs decision trees equivalent to hidden layers of neural network. Based on ID3 algorithm, which dynamically generates decision tree while minimizing entropy of information. CID3 algorithm generates feedforward neural network by use of either crisp or fuzzy measure of entropy.

  14. Identification and Position Control of Marine Helm using Artificial Neural Network

    Directory of Open Access Journals (Sweden)

    Hui ZHU

    2008-02-01

    Full Text Available If nonlinearities such as saturation of the amplifier gain and motor torque, gear backlash, and shaft compliances, just to name a few, are considered in the position control system of a marine helm, traditional control methods are no longer sufficient to improve the performance of the system. In this paper an alternative approach to traditional control methods, a neural network reference controller, is proposed to establish adaptive control of the position of the marine helm so that the controlled variable reaches the commanded position. This neural network controller comprises two neural networks. One is the plant model network, used to identify the nonlinear system; the other is the controller network, used to make the output follow the reference model. The experimental results demonstrate that this adaptive neural network reference controller has much better control performance than is obtained with traditional controllers.

  15. Neural Processing of Auditory Signals and Modular Neural Control for Sound Tropism of Walking Machines

    Directory of Open Access Journals (Sweden)

    Hubert Roth

    2008-11-01

    Full Text Available The specialized hairs and slit sensillae of spiders (Cupiennius salei) can sense airflow and auditory signals in a low-frequency range. They provide the sensor information for reactive behavior, such as capturing prey. In analogy, in this paper a setup is described where two microphones and a neural preprocessing system together with a modular neural controller are used to generate a sound tropism of a four-legged walking machine. The neural preprocessing network acts as a low-pass filter and is followed by a network that discerns whether signals come from the left or the right. The parameters of these networks are optimized by an evolutionary algorithm. In addition, a simple modular neural controller then generates the desired different walking patterns such that the machine walks straight, then turns towards a switched-on sound source, and then stops near it.

  16. Neural classifiers using one-time updating.

    Science.gov (United States)

    Diamantaras, K I; Strintzis, M G

    1998-01-01

    The linear threshold element (LTE), or perceptron, is a linear classifier with limited capabilities due to the problems arising when the input pattern set is linearly nonseparable. Assuming that the patterns are presented in a sequential fashion, we derive a theory for the detection of linear nonseparability as soon as it appears in the pattern set. This theory is based on the precise determination of the solution region in the weight space with the help of a special set of vectors. For this region, called the solution cone, we present a recursive computation procedure which allows immediate detection of nonseparability. The separability-violating patterns may be skipped so that, at the end, we derive a totally separable subset of the original pattern set along with its solution cone. The intriguing aspect of this algorithm is that it can be directly cast into a simple neural-network implementation. In this model the synaptic weights are committed (they are updated only once, and the only change that may happen after that is their destruction). This bears resemblance to the behavior of biological neural networks, and it is a feature unlike those of most other artificial neural techniques. Finally, by combining many such neural models we develop a learning procedure capable of separating convex classes.

  17. WATER DEMAND PREDICTION USING ARTIFICIAL NEURAL ...

    African Journals Online (AJOL)

    This paper presents hourly water demand prediction at the demand nodes of a water distribution network using the NeuNet Pro 2.3 neural network software, together with the monitoring and control of water distribution using supervisory control. The case study is the Laminga Water Treatment Plant and its water distribution network, Jos.

  18. Neural entrainment to speech modulates speech intelligibility

    NARCIS (Netherlands)

    Riecke, Lars; Formisano, Elia; Sorger, Bettina; Başkent, Deniz; Gaudrain, Etienne

    2018-01-01

    Speech is crucial for communication in everyday life. Speech-brain entrainment, the alignment of neural activity to the slow temporal fluctuations (envelope) of acoustic speech input, is a ubiquitous element of current theories of speech processing. Associations between speech-brain entrainment and

  19. Alexia and the Neural Basis of Reading.

    Science.gov (United States)

    Benson, D. Frank

    1984-01-01

    The historical background of alexia (loss or impairment of the ability to comprehend written or printed language based on damage to the brain) is reviewed, classification and symptomatology considered, theories on the involvement of right hemisphere reading are noted, and the neural basis of reading is postulated. (CL)

  20. The neural basis of phantom limb pain.

    Science.gov (United States)

    Flor, Herta; Diers, Martin; Andoh, Jamila

    2013-07-01

    A recent study suggests that brain changes in amputees may be pain-induced, questioning maladaptive plasticity as a neural basis of phantom pain. These findings add valuable information on cortical reorganization after amputation. We suggest further lines of research to clarify the mechanisms that underlie phantom pain. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Tumor Diagnosis Using Backpropagation Neural Network Method

    Science.gov (United States)

    Ma, Lixing; Looney, Carl; Sukuta, Sydney; Bruch, Reinhard; Afanasyeva, Natalia

    1998-05-01

    For characterization of skin cancer, an artificial neural network (ANN) method has been developed to diagnose normal tissue, benign tumor and melanoma. The pattern recognition is based on a three-layer neural network fuzzy learning system. In this study, the input neuron data set is the Fourier Transform infrared (FT-IR) spectrum obtained by a new Fiberoptic Evanescent Wave Fourier Transform Infrared (FEW-FTIR) spectroscopy method in the range of 1480 to 1850 cm-1. Ten input features are extracted from the absorbance values in this region. A single hidden layer of neural nodes with sigmoid activation functions clusters the feature space into small subclasses, and the output nodes are separated into different nonconvex classes to permit nonlinear discrimination of disease states. The output is classified into three classes: normal tissue, benign tumor and melanoma. The results obtained from the neural network pattern recognition are shown to be consistent with traditional medical diagnosis. Input features have also been extracted from the absorbance spectra using chemical factor analysis. These abstract features or factors are also used in the classification.

  2. Computational capabilities of graph neural networks.

    Science.gov (United States)

    Scarselli, Franco; Gori, Marco; Tsoi, Ah Chung; Hagenbuchner, Markus; Monfardini, Gabriele

    2009-01-01

    In this paper, we will consider the approximation properties of a recently introduced neural network model called the graph neural network (GNN), which can be used to process structured data inputs, e.g., acyclic graphs, cyclic graphs, and directed or undirected graphs. This class of neural networks implements a function τ(G, n) ∈ R^m that maps a graph G and one of its nodes n onto an m-dimensional Euclidean space. We characterize the functions that can be approximated by GNNs, in probability, up to any prescribed degree of precision. This set contains the maps that satisfy a property called preservation of the unfolding equivalence, and includes most of the practically useful functions on graphs; the only known exception is when the input graph contains particular patterns of symmetries for which unfolding equivalence may not be preserved. The result can be considered an extension of the universal approximation property established for the classic feedforward neural networks (FNNs). Some experimental examples are used to show the computational capabilities of the proposed model.
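
    The computation scheme behind τ(G, n) can be sketched directly: every node state is updated from its neighbours' states by a shared transition function until the states settle (a fixed point exists when the update is a contraction), after which a readout maps each state into R^m. The graph, dimensions and weights below are random toys, and no training of the transition or readout functions is shown.

        # Fixed-point message passing in the spirit of the original GNN model.
        import numpy as np

        rng = np.random.default_rng(5)
        edges = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3)]       # small undirected graph
        n_nodes, d_state, m_out = 4, 8, 2
        neighbours = [[] for _ in range(n_nodes)]
        for a, b in edges:
            neighbours[a].append(b)
            neighbours[b].append(a)

        A = rng.normal(size=(d_state, d_state))
        A *= 0.5 / np.linalg.norm(A, 2)                        # scaled so the update stays contractive
        B = rng.normal(size=(d_state, d_state))
        C = rng.normal(size=(m_out, d_state))                  # readout weights
        labels = rng.normal(size=(n_nodes, d_state))           # node labels / features

        state = np.zeros((n_nodes, d_state))
        for _ in range(50):                                    # iterate towards the fixed point
            new = np.zeros_like(state)
            for u in range(n_nodes):
                agg = np.mean([state[v] for v in neighbours[u]], axis=0)   # neighbour average
                new[u] = np.tanh(A @ agg + B @ labels[u])
            state = new

        tau = state @ C.T                                      # tau(G, n) for every node n
        print(tau.round(3))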

  3. Parameter estimation using compensatory neural networks

    Indian Academy of Sciences (India)

    Proposed here is a new neuron model, a basis for Compensatory Neural Network Architecture (CNNA), which not only reduces the total number of interconnections among neurons but also reduces the total computing time for training. The suggested model has properties of the basic neuron model as well as the higher ...

  4. Neural and behavioral investigations into timbre perception

    Directory of Open Access Journals (Sweden)

    Stephen Michael Town

    2013-11-01

    Full Text Available Timbre is the attribute that distinguishes sounds of equal pitch, loudness and duration. It contributes to our perception and discrimination of different vowels and consonants in speech, instruments in music and environmental sounds. Here we begin by reviewing human timbre perception and the spectral and temporal acoustic features that give rise to timbre in speech, musical and environmental sounds. We also consider the perception of timbre by animals, both in the case of human vowels and non-human vocalizations. We then explore the neural representation of timbre, first within the peripheral auditory system and later at the level of the auditory cortex. We examine the neural networks that are implicated in timbre perception and the computations that may be performed in auditory cortex to enable listeners to extract information about timbre. We consider whether single neurons in auditory cortex are capable of representing spectral timbre independently of changes in other perceptual attributes and the mechanisms that may shape neural sensitivity to timbre. Finally, we conclude by outlining some of the questions that remain about the role of neural mechanisms in behavior and consider some potentially fruitful avenues for future research.

  5. Domestic Heat Demand Prediction using Neural Networks

    NARCIS (Netherlands)

    Bakker, Vincent; Molderink, Albert; Hurink, Johann L.; Smit, Gerardus Johannes Maria

    2008-01-01

    By combining a cluster of microCHP appliances, a virtual power plant can be formed. To use such a virtual power plant, a good heat demand prediction of individual households is needed since the heat demand determines the production capacity. In this paper we present the results of using neural

  6. Based on BP Neural Network Stock Prediction

    Science.gov (United States)

    Liu, Xiangwei; Ma, Xin

    2012-01-01

    The stock market has high-profit and high-risk features, and research on stock market analysis and prediction has therefore attracted considerable attention. The stock price trend is a complex nonlinear function, so the price has a certain degree of predictability. This article mainly uses an improved BP neural network (BPNN) to set up a stock market prediction model, and…

  7. Epileptiform spike detection via convolutional neural networks

    DEFF Research Database (Denmark)

    Johansen, Alexander Rosenberg; Jin, Jing; Maszczyk, Tomasz

    2016-01-01

    The EEG of epileptic patients often contains sharp waveforms called "spikes", occurring between seizures. Detecting such spikes is crucial for diagnosing epilepsy. In this paper, we develop a convolutional neural network (CNN) for detecting spikes in EEG of epileptic patients in an automated...

  8. Artificial neural networks and support vector mac

    Indian Academy of Sciences (India)

    Quantitative structure-property relationships of electroluminescent materials: Artificial neural networks and support vector machines to predict electroluminescence of organic molecules. ALANA FERNANDES GOLIN and RICARDO STEFANI. ∗. Laboratório de Estudos de Materiais (LEMAT), Instituto de Ciências Exatas e da ...

  9. Neural Networks for protein Structure Prediction

    DEFF Research Database (Denmark)

    Bohr, Henrik

    1998-01-01

    This is a review about neural network applications in bioinformatics. Especially the applications to protein structure prediction, e.g. prediction of secondary structures, prediction of surface structure, fold class recognition and prediction of the 3-dimensional structure of protein backbones...

  10. Training Deep Spiking Neural Networks Using Backpropagation.

    Science.gov (United States)

    Lee, Jun Haeng; Delbruck, Tobi; Pfeiffer, Michael

    2016-01-01

    Deep spiking neural networks (SNNs) hold the potential for improving the latency and energy efficiency of deep neural networks through data-driven event-based computation. However, training such networks is difficult due to the non-differentiable nature of spike events. In this paper, we introduce a novel technique, which treats the membrane potentials of spiking neurons as differentiable signals, where discontinuities at spike times are considered as noise. This enables an error backpropagation mechanism for deep SNNs that follows the same principles as in conventional deep networks, but works directly on spike signals and membrane potentials. Compared with previous methods relying on indirect training and conversion, our technique has the potential to capture the statistics of spikes more precisely. We evaluate the proposed framework on artificially generated events from the original MNIST handwritten digit benchmark, and also on the N-MNIST benchmark recorded with an event-based dynamic vision sensor, in which the proposed method reduces the error rate by a factor of more than three compared to the best previous SNN, and also achieves a higher accuracy than a conventional convolutional neural network (CNN) trained and tested on the same data. We demonstrate in the context of the MNIST task that thanks to their event-driven operation, deep SNNs (both fully connected and convolutional) trained with our method achieve accuracy equivalent with conventional neural networks. In the N-MNIST example, equivalent accuracy is achieved with about five times fewer computational operations.

  11. Towards semen quality assessment using neural networks

    DEFF Research Database (Denmark)

    Linneberg, Christian; Salamon, P.; Svarer, C.

    1994-01-01

    The paper presents the methodology and results from a neural net based classification of human sperm head morphology. The methodology uses a preprocessing scheme in which invariant Fourier descriptors are lumped into “energy” bands. The resulting networks are pruned using optimal brain damage...

  12. Estimating Conditional Distributions by Neural Networks

    DEFF Research Database (Denmark)

    Kulczycki, P.; Schiøler, Henrik

    1998-01-01

    Neural Networks for estimating conditional distributions and their associated quantiles are investigated in this paper. A basic network structure is developed on the basis of kernel estimation theory, and a consistency property is considered from a mild set of assumptions. A number of applications...

  13. Convolutional Neural Networks for SAR Image Segmentation

    DEFF Research Database (Denmark)

    Malmgren-Hansen, David; Nobel-Jørgensen, Morten

    2015-01-01

    Segmentation of Synthetic Aperture Radar (SAR) images has several uses, but it is a difficult task due to a number of properties related to SAR images. In this article we show how Convolutional Neural Networks (CNNs) can easily be trained for SAR image segmentation with good results. Besides...

  14. Convolutional Neural Networks - Generalizability and Interpretations

    DEFF Research Database (Denmark)

    Malmgren-Hansen, David

    from data despite it being limited in amount or context representation. Within Machine Learning this thesis focuses on Convolutional Neural Networks for Computer Vision. The research aims to answer how to explore a model's generalizability to the whole population of data samples and how to interpret...

  15. Visualization of neural networks using saliency maps

    DEFF Research Database (Denmark)

    Mørch, Niels J.S.; Kjems, Ulrik; Hansen, Lars Kai

    1995-01-01

    The saliency map is proposed as a new method for understanding and visualizing the nonlinearities embedded in feedforward neural networks, with emphasis on the ill-posed case, where the dimensionality of the input-field by far exceeds the number of examples. Several levels of approximations...
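
    For a small feedforward network the saliency map has a closed form: it is simply the gradient of a chosen output unit with respect to the input. The sketch below computes it for a one-hidden-layer tanh network with random weights and checks it against finite differences; it illustrates only the quantity being visualised, not the approximation levels discussed in the paper.

        # Saliency map (input gradient) for a tiny one-hidden-layer network.
        import numpy as np

        rng = np.random.default_rng(6)
        d_in, d_hidden, d_out = 12, 8, 3
        W1 = rng.normal(scale=0.5, size=(d_hidden, d_in))
        W2 = rng.normal(scale=0.5, size=(d_out, d_hidden))

        def forward(x):
            h = np.tanh(W1 @ x)
            return W2 @ h, h

        def saliency(x, out_unit):
            """d y[out_unit] / d x for y = W2 tanh(W1 x)."""
            _, h = forward(x)
            return W1.T @ ((1.0 - h ** 2) * W2[out_unit])

        x = rng.normal(size=d_in)
        s = saliency(x, out_unit=0)
        print("most salient input dimensions:", np.argsort(np.abs(s))[::-1][:3])

        eps = 1e-5                                        # finite-difference check of dimension 0
        e0 = np.eye(d_in)[0]
        numeric = (forward(x + eps * e0)[0][0] - forward(x - eps * e0)[0][0]) / (2 * eps)
        print("analytic vs numeric:", s[0].round(6), numeric.round(6))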

  16. Separable explanations of neural network decisions

    DEFF Research Database (Denmark)

    Rieger, Laura

    2017-01-01

    Deep Taylor Decomposition is a method used to explain neural network decisions. When applying this method to non-dominant classifications, the resulting explanation does not reflect important features for the chosen classification. We propose that this is caused by the dense layers and propose...

  17. Fast Fingerprint Classification with Deep Neural Network

    DEFF Research Database (Denmark)

    Michelsanti, Daniel; Guichi, Yanis; Ene, Andreea-Daniela

    2017-01-01

    In this work we evaluate the performance of two pre-trained convolutional neural networks fine-tuned on the NIST SD4 benchmark database. The obtained results show that this approach is comparable with other results in the literature, with the advantage of a fast feature extraction stage....

  18. Empirical generalization assessment of neural network models

    DEFF Research Database (Denmark)

    Larsen, Jan; Hansen, Lars Kai

    1995-01-01

    This paper addresses the assessment of generalization performance of neural network models by use of empirical techniques. We suggest to use the cross-validation scheme combined with a resampling technique to obtain an estimate of the generalization performance distribution of a specific model...
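
    One simple way to turn the idea into numbers is to repeat cross-validation on bootstrap resamples of the data, which yields a whole distribution of estimated test errors rather than a single figure. In the sketch below a closed-form ridge regressor stands in for the neural network model and the data is synthetic; the particular resampling scheme is an illustrative choice, not necessarily the one used in the paper.

        # Cross-validation combined with bootstrap resampling of the data.
        import numpy as np

        rng = np.random.default_rng(7)
        n, d = 120, 6
        X = rng.normal(size=(n, d))
        w_true = rng.normal(size=d)
        y = X @ w_true + 0.3 * rng.normal(size=n)

        def cv_error(X, y, k=5, lam=1e-2):
            idx = rng.permutation(len(y))
            folds = np.array_split(idx, k)
            errs = []
            for fold in folds:
                train = np.setdiff1d(idx, fold)
                A = X[train].T @ X[train] + lam * np.eye(d)          # ridge normal equations
                w = np.linalg.solve(A, X[train].T @ y[train])
                errs.append(np.mean((X[fold] @ w - y[fold]) ** 2))
            return np.mean(errs)

        estimates = []
        for _ in range(200):                                         # bootstrap resamples
            boot = rng.integers(0, n, size=n)
            estimates.append(cv_error(X[boot], y[boot]))

        print("mean %.3f, 5-95%% interval [%.3f, %.3f]" %
              (np.mean(estimates), np.quantile(estimates, 0.05), np.quantile(estimates, 0.95)))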

  19. Neural Network for Estimating Conditional Distribution

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Kulczycki, P.

    Neural networks for estimating conditional distributions and their associated quantiles are investigated in this paper. A basic network structure is developed on the basis of kernel estimation theory, and consistency is proved from a mild set of assumptions. A number of applications within statistics, decision theory and signal processing are suggested, and a numerical example illustrating the capabilities of the elaborated network is given...

  20. Genetic control of active neural circuits

    Directory of Open Access Journals (Sweden)

    Leon Reijmers

    2009-12-01

    Full Text Available The use of molecular tools to study the neurobiology of complex behaviors has been hampered by an inability to target the desired changes to relevant groups of neurons. Specific memories and specific sensory representations are sparsely encoded by a small fraction of neurons embedded in a sea of morphologically and functionally similar cells. In this review we discuss genetic techniques that are being developed to address this difficulty. In several studies, promoter elements that are responsive to neural activity have been used to drive long-lasting genetic alterations into neural ensembles that are activated by natural environmental stimuli. This approach has been used to examine neural activity patterns during learning and retrieval of a memory, to examine the regulation of receptor trafficking following learning, and to functionally manipulate a specific memory trace. We suggest that these techniques will provide a general approach to experimentally investigate the link between patterns of environmentally activated neural firing and cognitive processes such as perception and memory.

  1. Localizing Tortoise Nests by Neural Networks.

    Directory of Open Access Journals (Sweden)

    Roberto Barbuti

    Full Text Available The goal of this research is to recognize the nest digging activity of tortoises using a device mounted atop the tortoise carapace. The device classifies tortoise movements in order to discriminate between nest digging and non-digging activity (specifically walking and eating). Accelerometer data was collected from devices attached to the carapace of a number of tortoises during their two-month nesting period. Our system uses an accelerometer and an activity recognition system (ARS), which is modularly structured using an artificial neural network and an output filter. For the purpose of experiment and comparison, and with the aim of minimizing the computational cost, the artificial neural network has been modelled according to three different architectures based on the input delay neural network (IDNN). We show that the ARS can achieve very high accuracy on segments of data sequences, with an extremely small neural network that can be embedded in programmable low power devices. Given that digging is typically a long activity (up to two hours), the application of ARS on data segments can be repeated over time to set up a reliable and efficient system, called Tortoise@, for digging activity recognition.
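
    The two-stage structure (a per-window classifier followed by an output filter) can be sketched on synthetic accelerometer data. Below, a simple variance-threshold rule stands in for the paper's input-delay neural network, and the output filter is a majority vote over neighbouring windows; signal shapes, window length and the threshold are all invented.

        # Window classifier + output filter on synthetic accelerometer data.
        import numpy as np

        rng = np.random.default_rng(8)
        fs, win = 50, 100                                        # 50 Hz sampling, 2 s windows
        walking = 0.8 * np.sin(2 * np.pi * 2.0 * np.arange(30 * fs) / fs) + 0.1 * rng.normal(size=30 * fs)
        digging = 0.3 * rng.normal(size=30 * fs)                 # low-amplitude irregular movement
        signal = np.concatenate([walking, digging, walking])
        truth = np.concatenate([np.zeros(30 * fs), np.ones(30 * fs), np.zeros(30 * fs)])

        windows = signal[: len(signal) // win * win].reshape(-1, win)
        labels = truth[: len(truth) // win * win].reshape(-1, win).mean(axis=1) > 0.5
        raw_pred = windows.var(axis=1) < 0.25                    # placeholder per-window classifier

        # output filter: majority vote over a 5-window neighbourhood
        filtered = np.array([np.mean(raw_pred[max(0, i - 2): i + 3]) > 0.5
                             for i in range(len(raw_pred))])
        print("window accuracy before filter:", np.mean(raw_pred == labels).round(3))
        print("window accuracy after filter: ", np.mean(filtered == labels).round(3))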

  2. Neural and Behavioral Correlates of Song Prosody

    Science.gov (United States)

    Gordon, Reyna Leigh

    2010-01-01

    This dissertation studies the neural basis of song, a universal human behavior. The relationship of words and melodies in the perception of song at phonological, semantic, melodic, and rhythmic levels of processing was investigated using the fine temporal resolution of Electroencephalography (EEG). The observations reported here may shed light on…

  3. Medical Text Classification using Convolutional Neural Networks

    OpenAIRE

    Hughes, Mark; Li, Irene; Kotoulas, Spyros; Suzumura, Toyotaro

    2017-01-01

    We present an approach to automatically classify clinical text at a sentence level. We are using deep convolutional neural networks to represent complex features. We train the network on a dataset providing a broad categorization of health information. Through a detailed evaluation, we demonstrate that our method outperforms several approaches widely used in natural language processing tasks by about 15%.

  4. Medical Text Classification Using Convolutional Neural Networks.

    Science.gov (United States)

    Hughes, Mark; Li, Irene; Kotoulas, Spyros; Suzumura, Toyotaro

    2017-01-01

    We present an approach to automatically classify clinical text at a sentence level. We are using deep convolutional neural networks to represent complex features. We train the network on a dataset providing a broad categorization of health information. Through a detailed evaluation, we demonstrate that our method outperforms several approaches widely used in natural language processing tasks by about 15%.
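
    A minimal sketch of one common way to build such a sentence-level classifier (a Kim-style convolutional architecture with max-over-time pooling; the vocabulary size, filter widths, class count, and random weights are illustrative assumptions, and training on a labeled corpus is omitted):

      import numpy as np

      rng = np.random.default_rng(0)
      vocab, emb_dim, n_classes = 5000, 50, 6
      E = rng.normal(scale=0.1, size=(vocab, emb_dim))            # word embedding table

      def conv_max_pool(x, filters):
          """x: (seq_len, emb_dim); filters: (n_f, width, emb_dim) -> max-over-time features."""
          n_f, width, _ = filters.shape
          seq_len = x.shape[0]
          out = np.empty((n_f, seq_len - width + 1))
          for i in range(seq_len - width + 1):
              window = x[i:i + width]                             # (width, emb_dim)
              out[:, i] = np.tensordot(filters, window, axes=([1, 2], [0, 1]))
          return np.maximum(out, 0.0).max(axis=1)                 # ReLU then max pooling over time

      filters3 = rng.normal(scale=0.1, size=(32, 3, emb_dim))     # trigram filters
      filters4 = rng.normal(scale=0.1, size=(32, 4, emb_dim))     # 4-gram filters
      W = rng.normal(scale=0.1, size=(64, n_classes))             # softmax layer

      sentence = rng.integers(0, vocab, size=20)                  # token ids for one sentence
      x = E[sentence]
      features = np.concatenate([conv_max_pool(x, filters3), conv_max_pool(x, filters4)])
      logits = features @ W
      probs = np.exp(logits - logits.max()); probs /= probs.sum()
      print("predicted class:", int(probs.argmax()))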

  5. Feature to prototype transition in neural networks

    Science.gov (United States)

    Krotov, Dmitry; Hopfield, John

    Models of associative memory with higher order (higher than quadratic) interactions, and their relationship to neural networks used in deep learning are discussed. Associative memory is conventionally described by recurrent neural networks with dynamical convergence to stable points. Deep learning typically uses feedforward neural nets without dynamics. However, a simple duality relates these two different views when applied to problems of pattern classification. From the perspective of associative memory such models deserve attention because they make it possible to store a much larger number of memories, compared to the quadratic case. In the dual description, these models correspond to feedforward neural networks with one hidden layer and unusual activation functions transmitting the activities of the visible neurons to the hidden layer. These activation functions are rectified polynomials of a higher degree rather than the rectified linear functions used in deep learning. The network learns representations of the data in terms of features for rectified linear functions, but as the power in the activation function is increased there is a gradual shift to a prototype-based representation, the two extreme regimes of pattern recognition known in cognitive psychology. Simons Center for Systems Biology.
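
    A minimal sketch of the dense-associative-memory idea with a rectified polynomial interaction function F(x) = max(x, 0)**n (the network size, number of stored patterns, and degree n are illustrative assumptions): asynchronous energy-descent updates recover a stored binary pattern from a corrupted cue.

      import numpy as np

      rng = np.random.default_rng(0)
      N, K, n = 100, 20, 3                               # neurons, stored patterns, polynomial degree
      xi = rng.choice([-1.0, 1.0], size=(K, N))          # stored memories

      def energy(sigma):
          return -np.sum(np.maximum(xi @ sigma, 0.0) ** n)

      sigma = xi[0].copy()                               # start from a corrupted version of pattern 0
      flip = rng.choice(N, size=15, replace=False)
      sigma[flip] *= -1

      for _ in range(5):                                 # a few asynchronous update sweeps
          for i in rng.permutation(N):
              for s in (+1.0, -1.0):
                  trial = sigma.copy(); trial[i] = s
                  if energy(trial) < energy(sigma):
                      sigma = trial

      print("overlap with stored pattern:", float(sigma @ xi[0]) / N)   # 1.0 means perfect recall

    Raising the degree n sharpens the basins of attraction, which is what lets such models store far more patterns than the quadratic (n = 2) case described by the standard Hopfield energy.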

  6. Adaptive Neurotechnology for Making Neural Circuits Functional.

    Science.gov (United States)

    Jung, Ranu

    2008-03-01

    Two of the most important trends in recent technological developments are that technology is increasingly integrated with biological systems and that it is increasingly adaptive in its capabilities. Neuroprosthetic systems that provide lost sensorimotor function after a neural disability offer a platform to investigate this interplay between biological and engineered systems. Adaptive neurotechnology (hardware and software) could be designed to be biomimetic, guided by the physical and programmatic constraints observed in biological systems, and allow for real-time learning, stability, and error correction. An example will present biomimetic neural-network hardware that can be interfaced with the isolated spinal cord of a lower vertebrate to allow phase-locked real-time neural control. Another will present adaptive neural network control algorithms for functional electrical stimulation of the peripheral nervous system to provide desired movements of paralyzed limbs in rodents or people. Ultimately, the frontier lies in being able to utilize the adaptive neurotechnology to promote neuroplasticity in the living system on a long-time scale under co-adaptive conditions.

  7. Atypical Neural Self-Representation in Autism

    Science.gov (United States)

    Lombardo, Michael V.; Chakrabarti, Bhismadev; Bullmore, Edward T.; Sadek, Susan A.; Pasco, Greg; Wheelwright, Sally J.; Suckling, John; Baron-Cohen, Simon

    2010-01-01

    The "self" is a complex multidimensional construct deeply embedded and in many ways defined by our relations with the social world. Individuals with autism are impaired in both self-referential and other-referential social cognitive processing. Atypical neural representation of the self may be a key to understanding the nature of such impairments.…

  8. IMPLEMENTATION OF NEURAL - CRYPTOGRAPHIC SYSTEM USING FPGA

    Directory of Open Access Journals (Sweden)

    KARAM M. Z. OTHMAN

    2011-08-01

    Full Text Available Modern cryptography techniques are virtually unbreakable. As the Internet and other forms of electronic communication become more prevalent, electronic security is becoming increasingly important. Cryptography is used to protect e-mail messages, credit card information, and corporate data. The design of the cryptography system is a conventional cryptography that uses one key for encryption and decryption process. The chosen cryptography algorithm is stream cipher algorithm that encrypt one bit at a time. The central problem in the stream-cipher cryptography is the difficulty of generating a long unpredictable sequence of binary signals from short and random key. Pseudo random number generators (PRNG have been widely used to construct this key sequence. The pseudo random number generator was designed using the Artificial Neural Networks (ANN. The Artificial Neural Networks (ANN providing the required nonlinearity properties that increases the randomness statistical properties of the pseudo random generator. The learning algorithm of this neural network is backpropagation learning algorithm. The learning process was done by software program in Matlab (software implementation to get the efficient weights. Then, the learned neural network was implemented using field programmable gate array (FPGA.
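
    A minimal illustrative sketch of the stream-cipher idea (not the trained, FPGA-implemented design described above: here the weights are simply derived from the key, the generator is not cryptographically vetted, and all sizes are assumptions): a small network is iterated from a short key to emit a keystream that is XORed with the plaintext.

      import numpy as np

      def keystream(key_bits, n_bits, hidden=16):
          # derive fixed network weights from the key (illustration only, not a vetted design)
          rng = np.random.default_rng(int("".join(map(str, key_bits)), 2))
          W1 = rng.uniform(-2.0, 2.0, size=(len(key_bits), hidden))
          W2 = rng.uniform(-2.0, 2.0, size=(hidden, len(key_bits)))
          state = np.array(key_bits, dtype=float)
          bits = []
          for _ in range(n_bits):
              state = np.tanh(np.tanh(state @ W1) @ W2)      # iterate the network on its own output
              bits.append(int(abs(state.sum()) * 1e6) & 1)   # harvest one low-order bit per step
          return np.array(bits, dtype=np.uint8)

      key = [1, 0, 1, 1, 0, 0, 1, 0]
      plaintext = np.frombuffer(b"attack at dawn", dtype=np.uint8)
      pt_bits = np.unpackbits(plaintext)
      ks = keystream(key, pt_bits.size)
      ciphertext = pt_bits ^ ks                          # encryption: XOR with the keystream
      recovered = np.packbits(ciphertext ^ ks).tobytes() # decryption: XOR with the same keystream
      print(recovered)                                   # b'attack at dawn'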

  9. Optimal Decision Making in Neural Inhibition Models

    Science.gov (United States)

    van Ravenzwaaij, Don; van der Maas, Han L. J.; Wagenmakers, Eric-Jan

    2012-01-01

    In their influential "Psychological Review" article, Bogacz, Brown, Moehlis, Holmes, and Cohen (2006) discussed optimal decision making as accomplished by the drift diffusion model (DDM). The authors showed that neural inhibition models, such as the leaky competing accumulator model (LCA) and the feedforward inhibition model (FFI), can mimic the…

  10. Neutron spectrometry with artificial neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Vega C, H.R.; Hernandez D, V.M.; Manzanares A, E.; Rodriguez, J.M.; Mercado S, G.A. [Universidad Autonoma de Zacatecas, A.P. 336, 98000 Zacatecas (Mexico); Iniguez de la Torre Bayo, M.P. [Universidad de Valladolid, Valladolid (Spain); Barquero, R. [Hospital Universitario Rio Hortega, Valladolid (Spain); Arteaga A, T. [Envases de Zacatecas, S.A. de C.V., Zacatecas (Mexico)]. e-mail: rvega@cantera.reduaz.mx

    2005-07-01

    An artificial neural network has been designed to obtain the neutron spectra from the Bonner spheres spectrometer's count rates. The neural network was trained using 129 neutron spectra. These include isotopic neutron sources; reference and operational spectra from accelerators and nuclear reactors, spectra from mathematical functions as well as few energy groups and monoenergetic spectra. The spectra were transformed from lethargy to energy distribution and were re-binned to 31 energy groups using the MCNP 4C code. Re-binned spectra and UTA4 response matrix were used to calculate the expected count rates in Bonner spheres spectrometer. These count rates were used as input and the respective spectrum was used as output during neural network training. After training the network was tested with the Bonner spheres count rates produced by a set of neutron spectra. This set contains data used during network training as well as data not used. Training and testing was carried out in the Matlab program. To verify the network unfolding performance the original and unfolded spectra were compared using the χ²-test and the total fluence ratios. The use of Artificial Neural Networks to unfold neutron spectra in neutron spectrometry is an alternative procedure that overcomes the drawbacks associated with this ill-conditioned problem. (Author)

  11. Neutron spectrometry using artificial neural networks

    Energy Technology Data Exchange (ETDEWEB)

    Vega-Carrillo, Hector Rene [Unidad Academica de Estudios Nucleares, Universidad Autonoma de Zacatecas, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico)]|[Unidad Academica de Ing. Electrica, Universidad Autonoma de Zacatecas, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico)]|[Unidad Academica de Matematicas, Universidad Autonoma de Zacatecas, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico)]. E-mail: fermineutron@yahoo.com; Martin Hernandez-Davila, Victor [Unidad Academica de Estudios Nucleares, Universidad Autonoma de Zacatecas, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico)]|[Unidad Academica de Ing. Electrica, Universidad Autonoma de Zacatecas, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico); Manzanares-Acuna, Eduardo [Unidad Academica de Estudios Nucleares, Universidad Autonoma de Zacatecas, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico); Mercado Sanchez, Gema A. [Unidad Academica de Matematicas, Universidad Autonoma de Zacatecas, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico); Pilar Iniguez de la Torre, Maria [Depto. Fisica Teorica, Molecular y Nuclear, Universidad de Valladolid, Valladolid (Spain); Barquero, Raquel [Hospital Universitario Rio Hortega, Valladolid (Spain); Palacios, Francisco; Mendez Villafane, Roberto [Depto. Fisica Teorica, Molecular y Nuclear, Universidad de Valladolid, Valladolid (Spain)]|[Universidad Europea Miguel de Cervantes, C. Padre Julio Chevalier No. 2, 47012 Valladolid (Spain); Arteaga Arteaga, Tarcicio [Unidad Academica de Estudios Nucleares, Universidad Autonoma de Zacatecas, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico)]|[Envases de Zacatecas, SA de CV, Parque Industrial de Calera de Victor Rosales, Zac. (Mexico); Manuel Ortiz Rodriguez, Jose [Unidad Academica de Estudios Nucleares, Universidad Autonoma de Zacatecas, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico)]|[Unidad Academica de Ing. Electrica, Universidad Autonoma de Zacatecas, Apdo. Postal 336, 98000 Zacatecas, Zac. (Mexico)

    2006-04-15

    An artificial neural network has been designed to obtain neutron spectra from Bonner spheres spectrometer count rates. The neural network was trained using 129 neutron spectra. These include spectra from isotopic neutron sources; reference and operational spectra from accelerators and nuclear reactors, spectra based on mathematical functions as well as few energy groups and monoenergetic spectra. The spectra were transformed from lethargy to energy distribution and were re-binned to 31 energy groups using the MCNP 4C code. The re-binned spectra and the UTA4 response matrix were used to calculate the expected count rates in Bonner spheres spectrometer. These count rates were used as input and their respective spectra were used as output during the neural network training. After training, the network was tested with the Bonner spheres count rates produced by folding a set of neutron spectra with the response matrix. This set contains data used during network training as well as data not used. Training and testing was carried out using the Matlab® program. To verify the network unfolding performance, the original and unfolded spectra were compared using the root mean square error. The use of artificial neural networks to unfold neutron spectra in neutron spectrometry is an alternative procedure that overcomes the drawbacks associated with this ill-conditioned problem.
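
    A minimal sketch of the unfolding step (the response matrix, training spectra, and network size below are synthetic stand-ins; the work described above uses the UTA4 response matrix and 129 reference spectra): a small network is trained to map Bonner-sphere count rates to a 31-group spectrum and is then applied to new count rates.

      import numpy as np

      rng = np.random.default_rng(0)
      n_spheres, n_groups = 7, 31
      R = rng.uniform(0.1, 1.0, size=(n_spheres, n_groups))        # stand-in response matrix

      def random_spectrum():
          s = rng.gamma(shape=2.0, size=n_groups)
          return s / s.sum()

      spectra = np.stack([random_spectrum() for _ in range(200)])  # "training" spectra
      counts = spectra @ R.T                                       # folded count rates

      W1 = rng.normal(scale=0.1, size=(n_spheres, 64)); b1 = np.zeros(64)
      W2 = rng.normal(scale=0.1, size=(64, n_groups)); b2 = np.zeros(n_groups)

      for _ in range(3000):                                        # batch gradient descent on the MSE
          h = np.tanh(counts @ W1 + b1)
          pred = h @ W2 + b2
          g = 2.0 * (pred - spectra) / spectra.size
          dh = g @ W2.T * (1.0 - h ** 2)
          W2 -= 0.1 * (h.T @ g); b2 -= 0.1 * g.sum(0)
          W1 -= 0.1 * (counts.T @ dh); b1 -= 0.1 * dh.sum(0)

      test = random_spectrum()                                     # unseen spectrum folded into count rates
      unfolded = np.tanh((test @ R.T) @ W1 + b1) @ W2 + b2
      print("sum of squared errors:", float(((unfolded - test) ** 2).sum()))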

  12. Memory-optimal neural network approximation

    Science.gov (United States)

    Bölcskei, Helmut; Grohs, Philipp; Kutyniok, Gitta; Petersen, Philipp

    2017-08-01

    We summarize the main results of a recent theory-developed by the authors-establishing fundamental lower bounds on the connectivity and memory requirements of deep neural networks as a function of the complexity of the function class to be approximated by the network. These bounds are shown to be achievable. Specifically, all function classes that are optimally approximated by a general class of representation systems-so-called affine systems-can be approximated by deep neural networks with minimal connectivity and memory requirements. Affine systems encompass a wealth of representation systems from applied harmonic analysis such as wavelets, shearlets, ridgelets, α-shearlets, and more generally α-molecules. This result elucidates a remarkable universality property of deep neural networks and shows that they achieve the optimum approximation properties of all affine systems combined. Finally, we present numerical experiments demonstrating that the standard stochastic gradient descent algorithm generates deep neural networks which provide close-to-optimal approximation rates at minimal connectivity. Moreover, stochastic gradient descent is found to actually learn approximations that are sparse in the representation system optimally sparsifying the function class the network is trained on.

  13. Artificial astrocytes improve neural network performance.

    Science.gov (United States)

    Porto-Pazos, Ana B; Veiguela, Noha; Mesejo, Pablo; Navarrete, Marta; Alvarellos, Alberto; Ibáñez, Oscar; Pazos, Alejandro; Araque, Alfonso

    2011-04-19

    Compelling evidence indicates the existence of bidirectional communication between astrocytes and neurons. Astrocytes, a type of glial cell classically considered to be passive supportive cells, have been recently demonstrated to be actively involved in the processing and regulation of synaptic information, suggesting that brain function arises from the activity of neuron-glia networks. However, the actual impact of astrocytes in neural network function is largely unknown and its application in artificial intelligence remains untested. We have investigated the consequences of including artificial astrocytes, which present the biologically defined properties involved in astrocyte-neuron communication, on artificial neural network performance. Using connectionist systems and evolutionary algorithms, we have compared the performance of artificial neural networks (NN) and artificial neuron-glia networks (NGN) to solve classification problems. We show that the degree of success of NGN is superior to NN. Analyses of the performance of NNs with different numbers of neurons or different architectures indicate that the effects of NGN cannot be accounted for by an increased number of network elements, but rather are specifically due to astrocytes. Furthermore, the relative efficacy of NGN vs. NN increases as the complexity of the network increases. These results indicate that artificial astrocytes improve neural network performance, and establish the concept of Artificial Neuron-Glia Networks, which represents a novel concept in Artificial Intelligence with implications in computational science as well as in the understanding of brain function.

  14. Tinnitus and neural plasticity of the brain

    NARCIS (Netherlands)

    Bartels, Hilke; Staal, Michiel J.; Albers, Frans W. J.

    Objective: To describe the current ideas about the manifestations of neural plasticity in generating tinnitus. Data Sources: Recently published source articles were identified using MEDLINE, PubMed, and Cochrane Library according to the key words mentioned below. Study Selection: Review articles and

  15. Neural Classifier Construction using Regularization, Pruning

    DEFF Research Database (Denmark)

    Hintz-Madsen, Mads; Hansen, Lars Kai; Larsen, Jan

    1998-01-01

    In this paper we propose a method for construction of feed-forward neural classifiers based on regularization and adaptive architectures. Using a penalized maximum likelihood scheme, we derive a modified form of the entropic error measure and an algebraic estimate of the test error. In conjunction...

  16. Applying Artificial Neural Networks for Face Recognition

    Directory of Open Access Journals (Sweden)

    Thai Hoang Le

    2011-01-01

    Full Text Available This paper introduces some novel models for all steps of a face recognition system. In the step of face detection, we propose a hybrid model combining AdaBoost and Artificial Neural Network (ABANN) to solve the process efficiently. In the next step, labeled faces detected by ABANN will be aligned by Active Shape Model and Multi Layer Perceptron. In this alignment step, we propose a new 2D local texture model based on Multi Layer Perceptron. The classifier of the model significantly improves the accuracy and the robustness of local searching on faces with expression variation and ambiguous contours. In the feature extraction step, we describe a methodology for improving the efficiency by the association of two methods: geometric feature based method and Independent Component Analysis method. In the face matching step, we apply a model combining many Neural Networks for matching geometric features of human face. The model links many Neural Networks together, so we call it Multi Artificial Neural Network. The MIT + CMU database is used for evaluating our proposed methods for face detection and alignment. Finally, the experimental results of all steps on the Caltech database show the feasibility of our proposed model.

  17. drinking water treatment using artificial neural network

    African Journals Online (AJOL)

    ogwueleka

    synaptic weights are used to store the knowledge.” The neural network approach is a branch of artificial intelligence. The ANN is based on a model of the human neurological system that consists of basic computing elements (called neurons) interconnected together (Figure 1). The model used for all classification attempts.

  18. Cognitive and Neural Sciences Division 1991 Programs

    Science.gov (United States)

    1991-08-01

    techniques on a mobile robotic deriveter. Approach: NETROLOGiC will capitalize on its research programs in applying neural networks to problems in pattern...and association fiber differences in STP in piriform cortex. J. Neurophysiol. 64: 179-190. 217 TITLE: Nonlinear Neurodynamics of Biological Pattern

  19. Neural modeling of prefrontal executive function

    Energy Technology Data Exchange (ETDEWEB)

    Levine, D.S. [Univ. of Texas, Arlington, TX (United States)

    1996-12-31

    Brain executive function is based in a distributed system whereby prefrontal cortex is interconnected with other cortical. and subcortical loci. Executive function is divided roughly into three interacting parts: affective guidance of responses; linkage among working memory representations; and forming complex behavioral schemata. Neural network models of each of these parts are reviewed and fit into a preliminary theoretical framework.

  20. Artificial Neural Networks and Instructional Technology.

    Science.gov (United States)

    Carlson, Patricia A.

    1991-01-01

    Artificial neural networks (ANN), part of artificial intelligence, are discussed. Such networks are fed sample cases (training sets), learn how to recognize patterns in the sample data, and use this experience in handling new cases. Two cognitive roles for ANNs (intelligent filters and spreading, associative memories) are examined. Prototypes…

  1. Cognitive And Neural Sciences Division 1992 Programs

    Science.gov (United States)

    1992-08-01

    Neuronal Micronets as Nodal Elements PRINCIPAL INVESTIGATOR: Thomas H. Brown Yale University Department of Psychology (203) 432-7008 R&T PROJECT CODE...of neural nets, and to develop a micronet architecture which captures the computations in neurons. Approach: Simulations will be conducted of the

  2. Artificial neural networks in neutron dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Vega C, H.R.; Hernandez D, V.M.; Manzanares A, E.; Mercado, G.A.; Perales M, W.A.; Robles R, J.A. [Unidades Academicas de Estudios Nucleares, UAZ, A.P. 336, 98000 Zacatecas (Mexico); Gallego, E.; Lorente, A. [Depto. de Ingenieria Nuclear, Universidad Politecnica de Madrid, (Spain)

    2005-07-01

    An artificial neural network has been designed to obtain the neutron doses using only the Bonner spheres spectrometer's count rates. Ambient, personal and effective neutron doses were included. 187 neutron spectra were utilized to calculate the Bonner count rates and the neutron doses. The spectra were transformed from lethargy to energy distribution and were re-binned to 31 energy groups using the MCNP 4C code. Re-binned spectra, UTA4 response matrix and fluence-to-dose coefficients were used to calculate the count rates in Bonner spheres spectrometer and the doses. Count rates were used as input and the respective doses were used as output during neural network training. Training and testing was carried out in the Matlab environment. The artificial neural network performance was evaluated using the χ²-test, where the original and calculated doses were compared. The use of Artificial Neural Networks in neutron dosimetry is an alternative procedure that overcomes the drawbacks associated with this ill-conditioned problem. (Author)

  3. A Neural Region of Abstract Working Memory

    Science.gov (United States)

    Cowan, Nelson; Li, Dawei; Moffitt, Amanda; Becker, Theresa M.; Martin, Elizabeth A.; Saults, J. Scott; Christ, Shawn E.

    2011-01-01

    Over 350 years ago, Descartes proposed that the neural basis of consciousness must be a brain region in which sensory inputs are combined. Using fMRI, we identified at least one such area for working memory, the limited information held in mind, described by William James as the trailing edge of consciousness. Specifically, a region in the left…

  4. Neural predictive control for active buffet alleviation

    Science.gov (United States)

    Pado, Lawrence E.; Lichtenwalner, Peter F.; Liguore, Salvatore L.; Drouin, Donald

    1998-06-01

    The adaptive neural control of aeroelastic response (ANCAR) and the affordable loads and dynamics independent research and development (IRAD) programs at the Boeing Company jointly examined using neural network based active control technology for alleviating undesirable vibration and aeroelastic response in a scale model aircraft vertical tail. The potential benefits of adaptive control include reducing aeroelastic response associated with buffet and atmospheric turbulence, increasing flutter margins, and reducing response associated with nonlinear phenomena like limit cycle oscillations. By reducing vibration levels and thus loads, aircraft structures can have lower acquisition cost, reduced maintenance, and extended lifetimes. Wind tunnel tests were undertaken on a rigid 15% scale aircraft in Boeing's mini-speed wind tunnel, which is used for testing at very low air speeds up to 80 mph. The model included a dynamically scaled flexible tail consisting of an aluminum spar with balsa wood cross sections with a hydraulically powered rudder. Neural predictive control was used to actuate the vertical tail rudder in response to strain gauge feedback to alleviate buffeting effects. First mode RMS strain reduction of 50% was achieved. The neural predictive control system was developed and implemented by the Boeing Company to provide an intelligent, adaptive control architecture for smart structures applications with automated synthesis, self-optimization, real-time adaptation, nonlinear control, and fault tolerance capabilities. It is designed to solve complex control problems through a process of automated synthesis, eliminating costly control design and surpassing it in many instances by accounting for real world non-linearities.

  5. Epigenetic learning in non-neural organisms

    Indian Academy of Sciences (India)

    2008-09-19

    Sep 19, 2008 ... some physical traces of the relation persist and can later be the basis of a more effective response. Using toy models we show that this characterization applies not only to the paradigmatic case of neural learning, but also to cellular responses that are based on epigenetic mechanisms of cell memory.

  6. Neural constraints and flexibility in language processing.

    Science.gov (United States)

    Huyck, Christian R

    2016-01-01

    Humans process language with their neurons. Memory in neurons is supported by neural firing and by short- and long-term synaptic weight change; the emergent behaviour of neurons, synchronous firing, and cell assembly dynamics is also a form of memory. As the language signal moves to later stages, it is processed with different mechanisms that are slower but more persistent.

  7. Neural networks for sign language translation

    Science.gov (United States)

    Wilson, Beth J.; Anspach, Gretel

    1993-09-01

    A neural network is used to extract relevant features of sign language from video images of a person communicating in American Sign Language or Signed English. The key features are hand motion, hand location with respect to the body, and handshape. A modular hybrid design is under way to apply various techniques, including neural networks, in the development of a translation system that will facilitate communication between deaf and hearing people. One of the neural networks described here is used to classify video images of handshapes into their linguistic counterpart in American Sign Language. The video image is preprocessed to yield Fourier descriptors that encode the shape of the hand silhouette. These descriptors are then used as inputs to a neural network that classifies their shapes. The network is trained with various examples from different signers and is tested with new images from new signers. The results have shown that for coarse handshape classes, the network is invariant to the type of camera used to film the various signers and to the segmentation technique.

  8. The Neural Substrates of Infant Speech Perception

    Science.gov (United States)

    Homae, Fumitaka; Watanabe, Hama; Taga, Gentaro

    2014-01-01

    Infants often pay special attention to speech sounds, and they appear to detect key features of these sounds. To investigate the neural foundation of speech perception in infants, we measured cortical activation using near-infrared spectroscopy. We presented the following three types of auditory stimuli while 3-month-old infants watched a silent…

  9. Neural Representations of Location Outside the Hippocampus

    Science.gov (United States)

    Knierim, James J.

    2006-01-01

    Place cells of the rat hippocampus are a dominant model system for understanding the role of the hippocampus in learning and memory at the level of single-unit and neural ensemble responses. A complete understanding of the information processing and computations performed by the hippocampus requires detailed knowledge about the properties of the…

  10. Learning chaotic attractors by neural networks

    NARCIS (Netherlands)

    Bakker, R; Schouten, JC; Giles, CL; Takens, F; van den Bleek, CM

    2000-01-01

    An algorithm is introduced that trains a neural network to identify chaotic dynamics from a single measured time series. During training, the algorithm learns to short-term predict the time series. At the same time a criterion, developed by Diks, van Zwet, Takens, and de Goede (1996) is monitored
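
    A minimal sketch of the prediction part of such a scheme (the Diks-criterion monitoring mentioned above is omitted; the logistic map stands in for a measured series, and the network size and learning rate are assumptions): a small network is fitted to one-step-ahead prediction and then iterated freely on its own output.

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.empty(2000); x[0] = 0.3
      for t in range(1999):
          x[t + 1] = 3.9 * x[t] * (1.0 - x[t])           # "measured" chaotic time series

      X, y = x[:-1, None], x[1:, None]
      W1 = rng.normal(scale=0.5, size=(1, 10)); b1 = np.zeros(10)
      W2 = rng.normal(scale=0.5, size=(10, 1)); b2 = np.zeros(1)

      for _ in range(5000):                              # fit a one-step predictor by gradient descent
          h = np.tanh(X @ W1 + b1)
          pred = h @ W2 + b2
          g = 2.0 * (pred - y) / len(X)
          dh = g @ W2.T * (1.0 - h ** 2)
          W2 -= 0.05 * (h.T @ g); b2 -= 0.05 * g.sum(0)
          W1 -= 0.05 * (X.T @ dh); b1 -= 0.05 * dh.sum(0)

      z = np.empty(500); z[0] = 0.3                      # free-run the model to generate its own attractor
      for t in range(499):
          z[t + 1] = (np.tanh(np.array([[z[t]]]) @ W1 + b1) @ W2 + b2).item()
      print("free-run range of the model:", float(z.min()), float(z.max()))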

  11. Neural Control of the Lower Urinary Tract

    Science.gov (United States)

    de Groat, William C.; Griffiths, Derek; Yoshimura, Naoki

    2015-01-01

    This article summarizes anatomical, neurophysiological, pharmacological, and brain imaging studies in humans and animals that have provided insights into the neural circuitry and neurotransmitter mechanisms controlling the lower urinary tract. The functions of the lower urinary tract to store and periodically eliminate urine are regulated by a complex neural control system in the brain, spinal cord, and peripheral autonomic ganglia that coordinates the activity of smooth and striated muscles of the bladder and urethral outlet. The neural control of micturition is organized as a hierarchical system in which spinal storage mechanisms are in turn regulated by circuitry in the rostral brain stem that initiates reflex voiding. Input from the forebrain triggers voluntary voiding by modulating the brain stem circuitry. Many neural circuits controlling the lower urinary tract exhibit switch-like patterns of activity that turn on and off in an all-or-none manner. The major component of the micturition switching circuit is a spinobulbospinal parasympathetic reflex pathway that has essential connections in the periaqueductal gray and pontine micturition center. A computer model of this circuit that mimics the switching functions of the bladder and urethra at the onset of micturition is described. Micturition occurs involuntarily in infants and young children until the age of 3 to 5 years, after which it is regulated voluntarily. Diseases or injuries of the nervous system in adults can cause the re-emergence of involuntary micturition, leading to urinary incontinence. Neuroplasticity underlying these developmental and pathological changes in voiding function is discussed. PMID:25589273

  12. Nonlinear Time Series Analysis via Neural Networks

    Science.gov (United States)

    Volná, Eva; Janošek, Michal; Kocian, Václav; Kotyrba, Martin

    This article deals with a time series analysis based on neural networks for effective forex market pattern recognition [Moore and Roche, J. Int. Econ. 58, 387-411 (2002)]. Our goal is to find and recognize important patterns which repeatedly appear in the market history and to adapt our trading system behaviour based on them.

  13. Review: Neural correlates of consciousness | Negrao | African ...

    African Journals Online (AJOL)

    A basic understanding of consciousness and its neural correlates is of major importance for all clinicians, especially those involved with patients with altered states of consciousness. In this paper it is shown that consciousness is dependent on the brainstem and thalamus for arousal; that basic cognition is supported by ...

  14. Neural networks, penalty logic and optimality theory

    NARCIS (Netherlands)

    Blutner, R.; Benz, A.; Blutner, R.

    2009-01-01

    Ever since the discovery of neural networks, there has been a controversy between two modes of information processing. On the one hand, symbolic systems have proven indispensable for our understanding of higher intelligence, especially when cognitive domains like language and reasoning are examined.

  15. Image inpainting using a neural network

    Directory of Open Access Journals (Sweden)

    Gapon Nikolay

    2017-01-01

    Full Text Available The paper describes a new method of two-dimensional signals reconstruction by restoring static images. A new method of spatial reconstruction of static images based on a geometric model using a neural network is proposed, it is based on the search for similar blocks and copying them into the region of distorted or missing pixel values.

  16. mTORC1 is a critical mediator of oncogenic Semaphorin3A signaling

    Energy Technology Data Exchange (ETDEWEB)

    Yamada, Daisuke; Kawahara, Kohichi; Maeda, Takehiko, E-mail: maeda@nupals.ac.jp

    2016-08-05

    Aberration of signaling pathways by genetic mutations or alterations in the surrounding tissue environments can result in tumor development or metastasis. However, signaling molecules responsible for these processes have not been completely elucidated. Here, we used mouse Lewis lung carcinoma cells (LLC) to explore the mechanism by which the oncogenic activity of Semaphorin3A (Sema3A) signaling is regulated. Sema3A knockdown by shRNA did not affect apoptosis, but decreased cell proliferation in LLCs; both the mammalian target of rapamycin complex 1 (mTORC1) level and glycolytic activity were also decreased. In addition, Sema3A knockdown sensitized cells to inhibition of oxidative phosphorylation by oligomycin, but conferred resistance to decreased cell viability induced by glucose starvation. Furthermore, recombinant SEMA3A rescued the attenuation of cell proliferation and glycolytic activity in LLCs after Sema3A knockdown, whereas mTORC1 inhibition by rapamycin completely counteracted this effect. These results demonstrate that Sema3A signaling exerts its oncogenic effect by promoting an mTORC1-mediated metabolic shift from oxidative phosphorylation to aerobic glycolysis. -- Highlights: •Sema3A knockdown decreased proliferation of Lewis lung carcinoma cells (LLCs). •Sema3A knockdown decreased mTORC1 levels and glycolytic activity in LLCs. •Sema3A knockdown sensitized cells to inhibition of oxidative phosphorylation. •Sema3A promotes shift from oxidative phosphorylation to aerobic glycolysis via mTORC1.

  17. Implementing Signature Neural Networks with Spiking Neurons.

    Science.gov (United States)

    Carrillo-Medina, José Luis; Latorre, Roberto

    2016-01-01

    Spiking Neural Networks constitute the most promising approach to develop realistic Artificial Neural Networks (ANNs). Unlike traditional firing rate-based paradigms, information coding in spiking models is based on the precise timing of individual spikes. It has been demonstrated that spiking ANNs can be successfully and efficiently applied to multiple realistic problems solvable with traditional strategies (e.g., data classification or pattern recognition). In recent years, major breakthroughs in neuroscience research have discovered new relevant computational principles in different living neural systems. Could ANNs benefit from some of these recent findings providing novel elements of inspiration? This is an intriguing question for the research community and the development of spiking ANNs including novel bio-inspired information coding and processing strategies is gaining attention. From this perspective, in this work, we adapt the core concepts of the recently proposed Signature Neural Network paradigm-i.e., neural signatures to identify each unit in the network, local information contextualization during the processing, and multicoding strategies for information propagation regarding the origin and the content of the data-to be employed in a spiking neural network. To the best of our knowledge, none of these mechanisms have been used yet in the context of ANNs of spiking neurons. This paper provides a proof-of-concept for their applicability in such networks. Computer simulations show that a simple network model like the discussed here exhibits complex self-organizing properties. The combination of multiple simultaneous encoding schemes allows the network to generate coexisting spatio-temporal patterns of activity encoding information in different spatio-temporal spaces. As a function of the network and/or intra-unit parameters shaping the corresponding encoding modality, different forms of competition among the evoked patterns can emerge even in the absence

  18. Implementing Signature Neural Networks with Spiking Neurons

    Science.gov (United States)

    Carrillo-Medina, José Luis; Latorre, Roberto

    2016-01-01

    Spiking Neural Networks constitute the most promising approach to develop realistic Artificial Neural Networks (ANNs). Unlike traditional firing rate-based paradigms, information coding in spiking models is based on the precise timing of individual spikes. It has been demonstrated that spiking ANNs can be successfully and efficiently applied to multiple realistic problems solvable with traditional strategies (e.g., data classification or pattern recognition). In recent years, major breakthroughs in neuroscience research have discovered new relevant computational principles in different living neural systems. Could ANNs benefit from some of these recent findings providing novel elements of inspiration? This is an intriguing question for the research community and the development of spiking ANNs including novel bio-inspired information coding and processing strategies is gaining attention. From this perspective, in this work, we adapt the core concepts of the recently proposed Signature Neural Network paradigm—i.e., neural signatures to identify each unit in the network, local information contextualization during the processing, and multicoding strategies for information propagation regarding the origin and the content of the data—to be employed in a spiking neural network. To the best of our knowledge, none of these mechanisms have been used yet in the context of ANNs of spiking neurons. This paper provides a proof-of-concept for their applicability in such networks. Computer simulations show that a simple network model like the discussed here exhibits complex self-organizing properties. The combination of multiple simultaneous encoding schemes allows the network to generate coexisting spatio-temporal patterns of activity encoding information in different spatio-temporal spaces. As a function of the network and/or intra-unit parameters shaping the corresponding encoding modality, different forms of competition among the evoked patterns can emerge even in the

  19. Semaphorin3A regulates axon growth independently of growth cone repulsion via modulation of TrkA signaling.

    Science.gov (United States)

    Ben-Zvi, A; Ben-Gigi, L; Yagil, Z; Lerman, O; Behar, O

    2008-03-01

    Regulation of axon growth is a critical event in neuronal development. Nerve growth factor (NGF) is a strong inducer of axon growth and survival in the dorsal root ganglia (DRG). Paradoxically, high concentrations of NGF are present in the target region where axon growth must slow down for axons to accurately identify their correct targets. Semaphorin3A (Sema3A), a powerful axonal repellent molecule for DRG neurons, is also situated in their target regions. NGF is a modulator of Sema3A-induced repulsion and death. We show that Sema3A is a regulator of NGF-induced neurite outgrowth via the TrkA receptor, independent of its growth cone repulsion activity. First, neurite outgrowth of DRG neurons is more sensitive to Sema3A than repulsion. Second, at concentrations sufficient to significantly inhibit Sema3A-induced repulsion, NGF has no effect on Sema3A-induced axon growth inhibition. Third, Sema3A-induced outgrowth inhibition, but not repulsion activity, is dependent on NGF stimulation. Fourth, Sema3A attenuates TrkA-mediated growth signaling, but not survival signaling, and over-expression of constitutively active TrkA blocks Sema3A-induced axon growth inhibition, suggesting that Sema3A activity is mediated via regulation of NGF/TrkA-induced growth. Finally, quantitative analysis of axon growth in vivo supports the possibility that Sema3A affects axon growth, in addition to its well-documented role in axon guidance. We suggest a model whereby NGF at high concentrations in the target region is important for survival, attraction and inhibition of Sema3A-induced repulsion, while Sema3A inhibits its growth-promoting activity. The combined and cross-modulatory effects of these two signaling molecules ensure the accuracy of the final stages in axon targeting.

  20. Neural and Neural Gray-Box Modeling for Entry Temperature Prediction in a Hot Strip Mill

    Science.gov (United States)

    Barrios, José Angel; Torres-Alvarado, Miguel; Cavazos, Alberto; Leduc, Luis

    2011-10-01

    In hot strip mills, initial controller set points have to be calculated before the steel bar enters the mill. Calculations rely on the good knowledge of rolling variables. Measurements are available only after the bar has entered the mill, and therefore they have to be estimated. Estimation of process variables, particularly that of temperature, is of crucial importance for the bar front section to fulfill quality requirements, and the same must be performed in the shortest possible time to preserve heat. Currently, temperature estimation is performed by physical modeling; however, it is highly affected by measurement uncertainties, variations in the incoming bar conditions, and final product changes. In order to overcome these problems, artificial intelligence techniques such as artificial neural networks and fuzzy logic have been proposed. In this article, neural network-based systems, including neural-based Gray-Box models, are applied to estimate scale breaker entry temperature, given its importance, and their performance is compared to that of the physical model used in plant. Several neural systems and several neural-based Gray-Box models are designed and tested with real data. Taking advantage of the flexibility of neural networks for input incorporation, several factors which are believed to have influence on the process are also tested. The systems proposed in this study were proven to have better performance indexes and hence better prediction capabilities than the physical models currently used in plant.
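
    A minimal sketch of one common gray-box arrangement (an assumption about the general approach, not the plant's actual models: a crude heat-loss law stands in for the physical model, and the synthetic data, network size, and learning rate are illustrative): a small network is trained on the residual between measured temperatures and the physical prediction, and the gray-box estimate is the sum of the two.

      import numpy as np

      rng = np.random.default_rng(0)

      def physical_model(T_furnace, transfer_time):
          return T_furnace - 0.8 * transfer_time                   # crude linear heat-loss law

      # synthetic "plant" data: the true process has an extra nonlinear loss the physics misses
      T_f = rng.uniform(1100.0, 1250.0, size=400)                  # furnace exit temperature, deg C
      dt = rng.uniform(30.0, 90.0, size=400)                       # transfer time, s
      T_true = T_f - 0.8 * dt - 0.002 * dt ** 2 + rng.normal(scale=2.0, size=400)

      X = np.column_stack([(T_f - 1175.0) / 75.0, (dt - 60.0) / 30.0])   # normalized inputs
      residual = (T_true - physical_model(T_f, dt))[:, None]

      W1 = rng.normal(scale=0.3, size=(2, 12)); b1 = np.zeros(12)
      W2 = rng.normal(scale=0.3, size=(12, 1)); b2 = np.zeros(1)
      for _ in range(4000):                                        # fit the residual by gradient descent
          h = np.tanh(X @ W1 + b1); pred = h @ W2 + b2
          g = 2.0 * (pred - residual) / len(X)
          dh = g @ W2.T * (1.0 - h ** 2)
          W2 -= 0.05 * (h.T @ g); b2 -= 0.05 * g.sum(0)
          W1 -= 0.05 * (X.T @ dh); b1 -= 0.05 * dh.sum(0)

      T_hat = physical_model(T_f, dt) + (np.tanh(X @ W1 + b1) @ W2 + b2).ravel()
      print("RMS error, physics only:", float(np.sqrt(((physical_model(T_f, dt) - T_true) ** 2).mean())))
      print("RMS error, gray-box    :", float(np.sqrt(((T_hat - T_true) ** 2).mean())))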

  1. Genetics and development of neural tube defects

    Science.gov (United States)

    Copp, Andrew J.; Greene, Nicholas D. E.

    2014-01-01

    Congenital defects of neural tube closure (neural tube defects; NTDs) are among the commonest and most severe disorders of the fetus and newborn. Disturbance of any of the sequential events of embryonic neurulation produces NTDs, with the phenotype (e.g. anencephaly, spina bifida) varying depending on the region of neural tube that remains open. While mutation of more than 200 genes is known to cause NTDs in mice, the pattern of occurrence in humans suggests a multifactorial polygenic or oligogenic aetiology. This emphasises the importance of gene-gene and gene-environment interactions in the origin of these defects. A number of cell biological functions are essential for neural tube closure, with defects of the cytoskeleton, cell cycle and molecular regulation of cell viability prominent among the mouse NTD mutants. Many transcriptional regulators and proteins that affect chromatin structure are also required for neural tube closure, although the downstream molecular pathways regulated by these proteins are unknown. Some key signalling pathways for NTDs have been identified: over-activation of sonic hedgehog signalling and loss of function in the planar cell polarity (non-canonical Wnt) pathway are potent causes of NTD, with requirements also for retinoid and inositol signalling. Folic acid supplementation is an effective method for primary prevention of a proportion of NTDs, in both humans and mice, although the embryonic mechanism of folate action remains unclear. Folic acid-resistant cases can be prevented by inositol supplementation in mice, raising the possibility that this could lead to an additional preventive strategy for human NTDs in future. PMID:19918803

  2. Neural network optimization, components, and design selection

    Science.gov (United States)

    Weller, Scott W.

    1990-07-01

    Neural Networks are part of a revived technology which has received a lot of hype in recent years. As is apt to happen in any hyped technology, jargon and predictions make its assimilation and application difficult. Nevertheless, Neural Networks have found use in a number of areas, working on non-trivial and noncontrived problems. For example, one net has been trained to "read", translating English text into phoneme sequences. Other applications of Neural Networks include database manipulation and the solving of routing and classification types of optimization problems. Neural Networks are constructed from neurons, which in electronics or software attempt to model but are not constrained by the real thing, i.e., neurons in our gray matter. Neurons are simple processing units connected to many other neurons over pathways which modify the incoming signals. A single synthetic neuron typically sums its weighted inputs, runs this sum through a non-linear function, and produces an output. In the brain, neurons are connected in a complex topology: in hardware/software the topology is typically much simpler, with neurons lying side by side, forming layers of neurons which connect to the layer of neurons which receive their outputs. This simplistic model is much easier to construct than the real thing, and yet can solve real problems. The information in a network, or its "memory", is completely contained in the weights on the connections from one neuron to another. Establishing these weights is called "training" the network. Some networks are trained by design -- once constructed no further learning takes place. Other types of networks require iterative training once wired up, but are not trainable once taught. Still other types of networks can continue to learn after initial construction. The main benefit to using Neural Networks is their ability to work with conflicting or incomplete ("fuzzy") data sets. This ability and its usefulness will become evident in the following

  3. Fuzzy logic and neural networks basic concepts & application

    CERN Document Server

    Alavala, Chennakesava R

    2008-01-01

    About the Book: The primary purpose of this book is to provide the student with a comprehensive knowledge of basic concepts of fuzzy logic and neural networks. The hybridization of fuzzy logic and neural networks is also included. No previous knowledge of fuzzy logic and neural networks is required. Fuzzy logic and neural networks have been discussed in detail through illustrative examples, methods and generic applications. Extensive and carefully selected references is an invaluable resource for further study of fuzzy logic and neural networks. Each chapter is followed by a question bank

  4. MBVCNN: Joint convolutional neural networks method for image recognition

    Science.gov (United States)

    Tong, Tong; Mu, Xiaodong; Zhang, Li; Yi, Zhaoxiang; Hu, Pei

    2017-05-01

    To address the mismatch between objects in images, which are generally bounded by rectangular regions, and the square inputs expected by convolutional neural networks, an object recognition model is proposed that uses the BING method for object-proposal estimation and a vectorized convolutional network input so that images of multiple sizes can be fed to the network, yielding a joint convolutional neural network. Experiments verified that the accuracy of multi-object image recognition was improved by 6.70% compared with a single vectorized convolutional neural network. The joint convolutional neural network method can therefore enhance image recognition accuracy, especially for rectangular targets.

  5. Energy coding in neural network with inhibitory neurons.

    Science.gov (United States)

    Wang, Ziyin; Wang, Rubin; Fang, Ruiyan

    2015-04-01

    This paper aimed at assessing and comparing the effects of inhibitory neurons on the neural energy distribution of a neural network, and the network activities in the absence of inhibitory neurons, in order to understand the nature of neural energy distribution and neural energy coding. Under stimulation, synchronous oscillation differs significantly between neural networks with and without inhibitory neurons, and this difference can be quantitatively evaluated by the characteristic energy distribution. In addition, the difference in synchronous oscillation of the neural activity can be quantitatively described by the change of the energy distribution as the network parameters are gradually adjusted. Compared with the traditional method of correlation coefficient analysis, quantitative indicators based on the neural energy distribution characteristics are more effective in reflecting the dynamic features of the neural network activities. Meanwhile, this coding method, which takes a global perspective on neural activity, avoids the defects and enormous difficulties encountered by current neural encoding and decoding theory. Our studies have shown that neural energy coding is a new coding theory with high efficiency and great potential.

  6. Genetic, epigenetic, and environmental contributions to neural tube closure.

    Science.gov (United States)

    Wilde, Jonathan J; Petersen, Juliette R; Niswander, Lee

    2014-01-01

    The formation of the embryonic brain and spinal cord begins as the neural plate bends to form the neural folds, which meet and adhere to close the neural tube. The neural ectoderm and surrounding tissues also coordinate proliferation, differentiation, and patterning. This highly orchestrated process is susceptible to disruption, leading to neural tube defects (NTDs), a common birth defect. Here, we highlight genetic and epigenetic contributions to neural tube closure. We describe an online database we created as a resource for researchers, geneticists, and clinicians. Neural tube closure is sensitive to environmental influences, and we discuss disruptive causes, preventative measures, and possible mechanisms. New technologies will move beyond candidate genes in small cohort studies toward unbiased discoveries in sporadic NTD cases. This will uncover the genetic complexity of NTDs and critical gene-gene interactions. Animal models can reveal the causative nature of genetic variants, the genetic interrelationships, and the mechanisms underlying environmental influences.

  7. A Projection Neural Network for Constrained Quadratic Minimax Optimization.

    Science.gov (United States)

    Liu, Qingshan; Wang, Jun

    2015-11-01

    This paper presents a projection neural network described by a dynamic system for solving constrained quadratic minimax programming problems. Sufficient conditions based on a linear matrix inequality are provided for global convergence of the proposed neural network. Compared with some of the existing neural networks for quadratic minimax optimization, the proposed neural network in this paper is capable of solving more general constrained quadratic minimax optimization problems, and the designed neural network does not include any parameter. Moreover, the neural network has lower model complexities, the number of state variables of which is equal to that of the dimension of the optimization problems. The simulation results on numerical examples are discussed to demonstrate the effectiveness and characteristics of the proposed neural network.
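
    A minimal sketch of projection-type saddle-point dynamics for a box-constrained quadratic minimax problem (the matrices, constraint box, and Euler discretization below are illustrative assumptions; the exact dynamic system and convergence conditions of the proposed network may differ): the state is driven toward a point where both players' projected gradient steps leave it unchanged.

      import numpy as np

      # min over x, max over y of f(x, y) = 0.5 x'Ax + x'By - 0.5 y'Cy, with box constraints
      A = np.array([[2.0, 0.5], [0.5, 1.5]])        # positive definite -> convex in x
      C = np.array([[1.0, 0.2], [0.2, 2.0]])        # positive definite -> concave in y
      B = np.array([[1.0, -1.0], [0.5, 0.3]])
      lo, hi = -1.0, 1.0

      def proj(v):                                  # projection onto the box constraints
          return np.clip(v, lo, hi)

      x = np.array([0.9, -0.9]); y = np.array([0.5, 0.5])
      h = 0.01                                      # Euler step size for the dynamic system
      for _ in range(20000):
          gx = A @ x + B @ y                        # df/dx
          gy = B.T @ x - C @ y                      # df/dy
          x = x + h * (proj(x - gx) - x)            # x' = P(x - df/dx) - x
          y = y + h * (proj(y + gy) - y)            # y' = P(y + df/dy) - y

      print("approximate saddle point x*:", x, "y*:", y)
      print("stationarity residuals:",
            float(np.linalg.norm(x - proj(x - (A @ x + B @ y)))),
            float(np.linalg.norm(y - proj(y + (B.T @ x - C @ y)))))

    At a saddle point each state equals its own projected gradient step, so both printed residuals should be close to zero.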

  8. Automated Modeling of Microwave Structures by Enhanced Neural Networks

    Directory of Open Access Journals (Sweden)

    Z. Raida

    2006-12-01

    Full Text Available The paper describes the methodology of the automated creation of neural models of microwave structures. During the creation process, artificial neural networks are trained using a combination of particle swarm optimization and the quasi-Newton method to avoid critical training problems of conventional neural nets. In the paper, neural networks are used to approximate the behavior of a planar microwave filter (moment method, Zeland IE3D). In order to evaluate the efficiency of neural modeling, global optimizations are performed using numerical models and neural ones. Both approaches are compared from the viewpoint of CPU-time demands and accuracy. In conclusion, methodological recommendations for including neural networks in microwave design are formulated.
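
    A minimal sketch of the global-search half of such a hybrid training strategy (particle swarm optimization over the weights of a tiny network fitted to a smooth stand-in response curve; the swarm parameters and the target function are assumptions, and the quasi-Newton refinement stage is omitted for brevity):

      import numpy as np

      rng = np.random.default_rng(0)
      x = np.linspace(-1.0, 1.0, 50)[:, None]
      y = np.sin(3.0 * x)                                 # stand-in for a simulated frequency response

      n_hidden = 8
      dim = n_hidden + n_hidden + n_hidden + 1            # all weights and biases of a 1-8-1 network

      def loss(w):
          W1 = w[:n_hidden].reshape(1, n_hidden)
          b1 = w[n_hidden:2 * n_hidden]
          W2 = w[2 * n_hidden:3 * n_hidden].reshape(n_hidden, 1)
          b2 = w[3 * n_hidden:]
          pred = np.tanh(x @ W1 + b1) @ W2 + b2
          return float(((pred - y) ** 2).mean())

      n_particles = 30
      pos = rng.uniform(-1.0, 1.0, size=(n_particles, dim))
      vel = np.zeros_like(pos)
      pbest = pos.copy(); pbest_val = np.array([loss(p) for p in pos])
      gbest = pbest[pbest_val.argmin()].copy()

      for _ in range(300):                                # standard PSO velocity/position update
          r1, r2 = rng.random((2, n_particles, dim))
          vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
          pos = pos + vel
          vals = np.array([loss(p) for p in pos])
          better = vals < pbest_val
          pbest[better] = pos[better]; pbest_val[better] = vals[better]
          gbest = pbest[pbest_val.argmin()].copy()

      print("best training MSE found by the swarm:", float(pbest_val.min()))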

  9. Neural crest cells: from developmental biology to clinical interventions.

    Science.gov (United States)

    Noisa, Parinya; Raivio, Taneli

    2014-09-01

    Neural crest cells are multipotent cells, which are specified in embryonic ectoderm in the border of neural plate and epiderm during early development by interconnection of extrinsic stimuli and intrinsic factors. Neural crest cells are capable of differentiating into various somatic cell types, including melanocytes, craniofacial cartilage and bone, smooth muscle, and peripheral nervous cells, which supports their promise for cell therapy. In this work, we provide a comprehensive review of wide aspects of neural crest cells from their developmental biology to applicability in medical research. We provide a simplified model of neural crest cell development and highlight the key external stimuli and intrinsic regulators that determine the neural crest cell fate. Defects of neural crest cell development leading to several human disorders are also mentioned, with the emphasis of using human induced pluripotent stem cells to model neurocristopathic syndromes. © 2014 Wiley Periodicals, Inc.

  10. Differentiation state determines neural effects on microvascular endothelial cells

    Energy Technology Data Exchange (ETDEWEB)

    Muffley, Lara A., E-mail: muffley@u.washington.edu [University of Washington, Campus Box 359796, 300 9th Avenue, Seattle, WA 98104 (United States); Pan, Shin-Chen, E-mail: pansc@mail.ncku.edu.tw [University of Washington, Campus Box 359796, 300 9th Avenue, Seattle, WA 98104 (United States); Smith, Andria N., E-mail: gnaunderwater@gmail.com [University of Washington, Campus Box 359796, 300 9th Avenue, Seattle, WA 98104 (United States); Ga, Maricar, E-mail: marga16@uw.edu [University of Washington, Campus Box 359796, 300 9th Avenue, Seattle, WA 98104 (United States); Hocking, Anne M., E-mail: ahocking@u.washington.edu [University of Washington, Campus Box 359796, 300 9th Avenue, Seattle, WA 98104 (United States); Gibran, Nicole S., E-mail: nicoleg@u.washington.edu [University of Washington, Campus Box 359796, 300 9th Avenue, Seattle, WA 98104 (United States)

    2012-10-01

    Growing evidence indicates that nerves and capillaries interact paracrinely in uninjured skin and cutaneous wounds. Although mature neurons are the predominant neural cell in the skin, neural progenitor cells have also been detected in uninjured adult skin. The aim of this study was to characterize differential paracrine effects of neural progenitor cells and mature sensory neurons on dermal microvascular endothelial cells. Our results suggest that neural progenitor cells and mature sensory neurons have unique secretory profiles and distinct effects on dermal microvascular endothelial cell proliferation, migration, and nitric oxide production. Neural progenitor cells and dorsal root ganglion neurons secrete different proteins related to angiogenesis. Specific to neural progenitor cells were dipeptidyl peptidase-4, IGFBP-2, pentraxin-3, serpin f1, TIMP-1, TIMP-4 and VEGF. In contrast, endostatin, FGF-1, MCP-1 and thrombospondin-2 were specific to dorsal root ganglion neurons. Microvascular endothelial cell proliferation was inhibited by dorsal root ganglion neurons but unaffected by neural progenitor cells. In contrast, microvascular endothelial cell migration in a scratch wound assay was inhibited by neural progenitor cells and unaffected by dorsal root ganglion neurons. In addition, nitric oxide production by microvascular endothelial cells was increased by dorsal root ganglion neurons but unaffected by neural progenitor cells. -- Highlights: • Dorsal root ganglion neurons, not neural progenitor cells, regulate microvascular endothelial cell proliferation. • Neural progenitor cells, not dorsal root ganglion neurons, regulate microvascular endothelial cell migration. • Neural progenitor cells and dorsal root ganglion neurons do not affect microvascular endothelial tube formation. • Dorsal root ganglion neurons, not neural progenitor cells, regulate

  11. The Criticality Hypothesis in Neural Systems

    Science.gov (United States)

    Karimipanah, Yahya

    There is mounting evidence that neural networks of the cerebral cortex exhibit scale invariant dynamics. At the larger scale, fMRI recordings have shown evidence for spatiotemporal long range correlations. On the other hand, at the smaller scales this scale invariance is marked by the power law distribution of the size and duration of spontaneous bursts of activity, which are referred to as neuronal avalanches. The existence of such avalanches has been confirmed by several studies in vitro and in vivo, among different species and across multiple scales, from the spatial scale of MEG and EEG down to single cell resolution. This prevalent scale free nature of cortical activity suggests the hypothesis that the cortex resides at a critical state between two phases of order (short-lasting activity) and disorder (long-lasting activity). In addition, it has been shown, both theoretically and experimentally, that being at criticality brings about certain functional advantages for information processing. However, despite the abundance of evidence and the plausibility of the neural criticality hypothesis, still very little is known about how the brain may leverage such criticality to facilitate neural coding. Moreover, the emergent functions that may arise from critical dynamics are poorly understood. In the first part of this thesis, we review several pieces of evidence for the neural criticality hypothesis at different scales, as well as some of the most popular theories of self-organized criticality (SOC). Thereafter, we will focus on the most prominent evidence from small scales, namely neuronal avalanches. We will explore the effect of adaptation and how it can maintain scale free dynamics even in the presence of external stimuli. Using calcium imaging we also experimentally demonstrate the existence of scale free activity at the cellular resolution in vivo. Moreover, by exploring the subsampling issue in neural data, we will find some fundamental constraints of the conventional methods

  12. Investment Valuation Analysis with Artificial Neural Networks

    Directory of Open Access Journals (Sweden)

    Hüseyin İNCE

    2017-07-01

    Full Text Available This paper shows that discounted cash flow and net present value, which are traditional investment valuation models, can be combined with artificial neural network forecasting. The main inputs for the valuation models, such as revenue, costs, capital expenditure, and their growth rates, are heavily related to sector dynamics and macroeconomics. The growth rates of those inputs are related to inflation and exchange rates. Therefore, predicting inflation and exchange rates is a critical issue for the valuation output. In this paper, the Turkish economy’s inflation rate and the USD/TRY exchange rate are forecast by artificial neural networks and incorporated into the discounted cash flow model. Finally, the results are benchmarked against conventional practices.
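
    As a rough illustration of the combination described above, the sketch below feeds forecast inflation into cash-flow growth and discounts the result to a net present value; the forecast values, the growth link, and the discount rate are hypothetical stand-ins for the neural network outputs and company data used in the paper.

      # Minimal sketch: forecast inflation (which the ANN would supply) drives nominal
      # cash-flow growth inside a discounted cash flow / net present value calculation.
      base_cash_flow = 100.0                                 # year-0 cash flow (hypothetical)
      discount_rate = 0.18                                   # required return (hypothetical)
      forecast_inflation = [0.11, 0.10, 0.09, 0.09, 0.08]    # per-year ANN forecasts (hypothetical)
      real_growth = 0.03                                     # assumed real growth of cash flows

      npv = 0.0
      cash_flow = base_cash_flow
      for year, infl in enumerate(forecast_inflation, start=1):
          cash_flow *= (1.0 + real_growth) * (1.0 + infl)    # nominal growth driven by inflation
          npv += cash_flow / (1.0 + discount_rate) ** year   # discount back to present value

      print(f"NPV of the 5-year cash flow stream: {npv:.2f}")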

  13. Continual Learning through Evolvable Neural Turing Machines

    DEFF Research Database (Denmark)

    Lüders, Benno; Schläger, Mikkel; Risi, Sebastian

    2016-01-01

    Continual learning, i.e. the ability to sequentially learn tasks without catastrophic forgetting of previously learned ones, is an important open challenge in machine learning. In this paper we take a step in this direction by showing that the recently proposed Evolving Neural Turing Machine (ENTM) approach is able to perform one-shot learning in a reinforcement learning task without catastrophic forgetting of previously stored associations.

  14. Beyond Pattern Recognition With Neural Nets

    Science.gov (United States)

    Arsenault, Henri H.; Macukow, Bohdan

    1989-02-01

    Neural networks are finding many areas of application. Although they are particularly well-suited for applications related to associative recall such as content-addressable memories, neural nets can perform many other applications ranging from logic operations to the solution of optimization problems. The training of a recently introduced model to perform boolean logical operations such as XOR is described. Such simple systems can be combined to perform any complex boolean operation. Any complex task consisting of parallel and serial operations including fuzzy logic that can be described in terms of input-output relations can be accomplished by combining modules such as the ones described here. The fact that some modules can carry out their functions even when their inputs contain erroneous data, and the fact that each module can carry out its functions in parallel with itself and other modules promises some interesting applications.
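
    The XOR training mentioned above can be reproduced with a tiny two-layer network; the sketch below uses an ordinary sigmoid multilayer perceptron trained by gradient descent rather than the specific model introduced in the paper, and the layer sizes and learning rate are assumptions.

      import numpy as np

      # A minimal 2-4-1 sigmoid network trained on XOR with plain backpropagation.
      rng = np.random.default_rng(1)
      X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
      y = np.array([[0], [1], [1], [0]], dtype=float)

      W1 = rng.normal(size=(2, 4)); b1 = np.zeros(4)
      W2 = rng.normal(size=(4, 1)); b2 = np.zeros(1)
      sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

      lr = 1.0
      for _ in range(5000):
          h = sigmoid(X @ W1 + b1)               # hidden layer
          out = sigmoid(h @ W2 + b2)             # output layer
          d_out = (out - y) * out * (1 - out)    # chain rule through squared error and sigmoid
          d_h = (d_out @ W2.T) * h * (1 - h)
          W2 -= lr * (h.T @ d_out); b2 -= lr * d_out.sum(axis=0)
          W1 -= lr * (X.T @ d_h);   b1 -= lr * d_h.sum(axis=0)

      print(np.round(out.ravel(), 2))            # typically converges to ~[0, 1, 1, 0]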

  15. Neural networks: Application to medical imaging

    Science.gov (United States)

    Clarke, Laurence P.

    1994-01-01

    The research mission is the development of computer assisted diagnostic (CAD) methods for improved diagnosis of medical images including digital x-ray sensors and tomographic imaging modalities. The CAD algorithms include advanced methods for adaptive nonlinear filters for image noise suppression, hybrid wavelet methods for feature segmentation and enhancement, and high convergence neural networks for feature detection and VLSI implementation of neural networks for real time analysis. Other missions include (1) implementation of CAD methods on hospital based picture archiving computer systems (PACS) and information networks for central and remote diagnosis and (2) collaboration with defense and medical industry, NASA, and federal laboratories in the area of dual use technology conversion from defense or aerospace to medicine.

  16. Evaluating neural networks and artificial intelligence systems

    Science.gov (United States)

    Alberts, David S.

    1994-02-01

    Systems have no intrinsic value in and of themselves, but rather derive value from the contributions they make to the missions, decisions, and tasks they are intended to support. The estimation of the cost-effectiveness of systems is a prerequisite for rational planning, budgeting, and investment documents. Neural network and expert system applications, although similar in their incorporation of a significant amount of decision-making capability, differ from each other in ways that affect the manner in which they can be evaluated. Both these types of systems are, by definition, evolutionary systems, which also impacts their evaluation. This paper discusses key aspects of neural network and expert system applications and their impact on the evaluation process. A practical approach or methodology for evaluating a certain class of expert systems that are particularly difficult to measure using traditional evaluation approaches is presented.

  17. Fuzzy logic and neural network technologies

    Science.gov (United States)

    Villarreal, James A.; Lea, Robert N.; Savely, Robert T.

    1992-01-01

    Applications of fuzzy logic technologies in NASA projects are reviewed to examine their advantages in the development of neural networks for aerospace and commercial expert systems and control. Examples of fuzzy-logic applications include a 6-DOF spacecraft controller, collision-avoidance systems, and reinforcement-learning techniques. The commercial applications examined include a fuzzy autofocusing system, an air conditioning system, and an automobile transmission application. The practical use of fuzzy logic is set in the theoretical context of artificial neural systems (ANSs) to give the background for an overview of ANS research programs at NASA. The research and application programs include the Network Execution and Training Simulator and faster training algorithms such as the Difference Optimized Training Scheme. The networks are well suited for pattern-recognition applications such as predicting sunspots, controlling posture maintenance, and conducting adaptive diagnoses.

  18. Artificial Neural Network for Displacement Vectors Determination

    Directory of Open Access Journals (Sweden)

    P. Bohmann

    1997-09-01

    Full Text Available An artificial neural network (NN) for displacement vector (DV) determination is presented in this paper. DVs are computed in areas which are essential for image analysis and computer vision, i.e. areas containing edges, lines, corners, etc. These special features are found by edge operators followed by a filtration step, performed by a threshold function. The next step is DV computation by a 2D Hamming artificial neural network. The method of DV computation is based on full-search block matching algorithms. The pre-processing (edge finding) is the reason why the correlation function is very simple, the process of DV determination needs less computation, and the structure of the NN is simpler.
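
    The matching step underneath is ordinary full-search block matching; a plain sum-of-squared-differences version, without the edge pre-processing or the Hamming-network acceleration described in the paper, might look like the sketch below. The frames, block size, and search range are hypothetical.

      import numpy as np

      def full_search_dv(prev, curr, y, x, block=8, search=7):
          """Displacement vector of the block at (y, x) in `prev`, searched in `curr` (SSD criterion)."""
          ref = prev[y:y + block, x:x + block].astype(float)
          best, best_dv = np.inf, (0, 0)
          for dy in range(-search, search + 1):
              for dx in range(-search, search + 1):
                  yy, xx = y + dy, x + dx
                  if yy < 0 or xx < 0 or yy + block > curr.shape[0] or xx + block > curr.shape[1]:
                      continue
                  cand = curr[yy:yy + block, xx:xx + block].astype(float)
                  ssd = np.sum((ref - cand) ** 2)
                  if ssd < best:
                      best, best_dv = ssd, (dy, dx)
          return best_dv

      # Hypothetical frames: the second frame is the first shifted by (2, 3) pixels.
      rng = np.random.default_rng(0)
      prev = rng.integers(0, 256, size=(64, 64))
      curr = np.roll(np.roll(prev, 2, axis=0), 3, axis=1)
      print(full_search_dv(prev, curr, 24, 24))   # expected: (2, 3)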

  19. A Chip for an Implantable Neural Stimulator

    DEFF Research Database (Denmark)

    Gudnason, Gunnar; Bruun, Erik; Haugland, Morten

    2000-01-01

    This paper describes a chip for a multichannel neural stimulator for functional electrical stimulation (FES). The purpose of FES is to restore muscular control in disabled patients. The chip performs all the signal processing required in an implanted neural stimulator. The power and digital data transmission to the stimulator passes through a 5 MHz inductive link. From the signals transmitted to the stimulator, the chip is able to generate charge-balanced current pulses with a controllable length up to 256 µs and an amplitude up to 2 mA, for stimulation of nerve fibers. The quiescent current consumption of the chip is approx. 650 µA at supply voltages of 6–12 V, and its size is 3.9×3.5 mm2. It has 4 output channels for use in a multipolar cuff electrode.
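
    The charge-balanced biphasic pulses the chip generates can be sketched numerically as a cathodic phase followed by an equal-and-opposite anodic phase; the amplitude, phase width, and time step below are hypothetical values chosen within the ranges quoted above.

      import numpy as np

      amplitude_ma = 1.0     # pulse amplitude in mA (the chip supports up to 2 mA)
      width_us = 200.0       # duration of each phase in microseconds (up to 256 us)
      dt_us = 1.0            # time step in microseconds

      n = int(width_us / dt_us)
      pulse = np.concatenate([-amplitude_ma * np.ones(n),    # cathodic (stimulating) phase
                              +amplitude_ma * np.ones(n)])   # anodic (charge-recovery) phase

      charge_nc = np.sum(pulse) * dt_us                      # net charge in nanocoulombs (mA * us = nC)
      print(f"net injected charge: {charge_nc:.3f} nC")      # ~0 for a charge-balanced pulse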

  20. A canonical neural mechanism for behavioral variability

    Science.gov (United States)

    Darshan, Ran; Wood, William E.; Peters, Susan; Leblois, Arthur; Hansel, David

    2017-05-01

    The ability to generate variable movements is essential for learning and adjusting complex behaviours. This variability has been linked to the temporal irregularity of neuronal activity in the central nervous system. However, how neuronal irregularity actually translates into behavioural variability is unclear. Here we combine modelling, electrophysiological and behavioural studies to address this issue. We demonstrate that a model circuit comprising topographically organized and strongly recurrent neural networks can autonomously generate irregular motor behaviours. Simultaneous recordings of neurons in singing finches reveal that neural correlations increase across the circuit driving song variability, in agreement with the model predictions. Analysing behavioural data, we find remarkable similarities in the babbling statistics of 5-6-month-old human infants and juveniles from three songbird species and show that our model naturally accounts for these `universal' statistics.

  1. Spiking neural P systems with multiple channels.

    Science.gov (United States)

    Peng, Hong; Yang, Jinyu; Wang, Jun; Wang, Tao; Sun, Zhang; Song, Xiaoxiao; Luo, Xiaohui; Huang, Xiangnian

    2017-11-01

    Spiking neural P systems (SNP systems, in short) are a class of distributed parallel computing systems inspired from the neurophysiological behavior of biological spiking neurons. In this paper, we investigate a new variant of SNP systems in which each neuron has one or more synaptic channels, called spiking neural P systems with multiple channels (SNP-MC systems, in short). The spiking rules with channel label are introduced to handle the firing mechanism of neurons, where the channel labels indicate synaptic channels of transmitting the generated spikes. The computation power of SNP-MC systems is investigated. Specifically, we prove that SNP-MC systems are Turing universal as both number generating and number accepting devices. Copyright © 2017 Elsevier Ltd. All rights reserved.

  2. A Topological Perspective of Neural Network Structure

    Science.gov (United States)

    Sizemore, Ann; Giusti, Chad; Cieslak, Matthew; Grafton, Scott; Bassett, Danielle

    The wiring patterns of white matter tracts between brain regions inform functional capabilities of the neural network. Indeed, densely connected and cyclically arranged cognitive systems may communicate and thus perform distinctly. However, previously employed graph theoretical statistics are local in nature and thus insensitive to such global structure. Here we present an investigation of the structural neural network in eight healthy individuals using persistent homology. An extension of homology to weighted networks, persistent homology records both circuits and cliques (all-to-all connected subgraphs) through a repetitive thresholding process, thus perceiving structural motifs. We report structural features found across patients and discuss brain regions responsible for these patterns, finally considering the implications of such motifs in relation to cognitive function.
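
    Persistent homology itself requires a specialized topology library, but the repetitive-thresholding idea can be illustrated with a plain maximal-clique count at a sequence of edge-weight thresholds; this is only a simplified stand-in for the filtration used in the study, and the random connectivity matrix below is hypothetical.

      import numpy as np
      import networkx as nx

      # Hypothetical weighted structural connectivity matrix (symmetric, zero diagonal).
      rng = np.random.default_rng(2)
      n = 20
      W = rng.random((n, n)); W = np.triu(W, 1); W = W + W.T

      # Sweep a decreasing weight threshold (as in a filtration) and count the maximal
      # cliques of the binary graph kept at each step. Persistent homology would, in
      # addition, track how circuits (cavities) are born and die across this sweep.
      for thr in [0.9, 0.8, 0.7, 0.6, 0.5]:
          G = nx.from_numpy_array((W >= thr).astype(int))
          cliques = list(nx.find_cliques(G))
          largest = max(len(c) for c in cliques)
          print(f"threshold {thr:.1f}: {len(cliques)} maximal cliques, largest of size {largest}")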

  3. Learning Topologies with the Growing Neural Forest.

    Science.gov (United States)

    Palomo, Esteban José; López-Rubio, Ezequiel

    2016-06-01

    In this work, a novel self-organizing model called growing neural forest (GNF) is presented. It is based on the growing neural gas (GNG), which learns a general graph with no special provisions for datasets with separated clusters. On the contrary, the proposed GNF learns a set of trees so that each tree represents a connected cluster of data. High dimensional datasets often contain large empty regions among clusters, so this proposal is better suited to them than other self-organizing models because it represents these separated clusters as connected components made of neurons. Experimental results are reported which show the self-organization capabilities of the model. Moreover, its suitability for unsupervised clustering and foreground detection applications is demonstrated. In particular, the GNF is shown to correctly discover the connected component structure of some datasets. Moreover, it outperforms some well-known foreground detectors both in quantitative and qualitative terms.

  4. Neural Network Program Package for Prosody Modeling

    Directory of Open Access Journals (Sweden)

    J. Santarius

    2004-04-01

    Full Text Available This contribution describes the programme for one part of automatic Text-to-Speech (TTS) synthesis. Some experiments (for example [14]) documented the considerable improvement of the naturalness of synthetic speech, but this approach requires completing the input feature values by hand. This completion takes a lot of time for big files. We need to improve the prosody by other approaches which use only automatically classified features (input parameters). The artificial neural network (ANN) approach is used for the modeling of prosody parameters. The program package contains all modules necessary for text and speech signal pre-processing, neural network training, sensitivity analysis, result processing, and a module for the creation of the input data protocol for the Czech speech synthesizer ARTIC [1].

  5. Spatiotemporal canards in neural field equations

    Science.gov (United States)

    Avitabile, D.; Desroches, M.; Knobloch, E.

    2017-04-01

    Canards are special solutions to ordinary differential equations that follow invariant repelling slow manifolds for long time intervals. In realistic biophysical single-cell models, canards are responsible for several complex neural rhythms observed experimentally, but their existence and role in spatially extended systems is largely unexplored. We identify and describe a type of coherent structure in which a spatial pattern displays temporal canard behavior. Using interfacial dynamics and geometric singular perturbation theory, we classify spatiotemporal canards and give conditions for the existence of folded-saddle and folded-node canards. We find that spatiotemporal canards are robust to changes in the synaptic connectivity and firing rate. The theory correctly predicts the existence of spatiotemporal canards with octahedral symmetry in a neural field model posed on the unit sphere.

  6. A neural theory of visual attention

    DEFF Research Database (Denmark)

    Bundesen, Claus; Habekost, Thomas; Kyllingsbæk, Søren

    2005-01-01

    A neural theory of visual attention (NTVA) is presented. NTVA is a neural interpretation of C. Bundesen's (1990) theory of visual attention (TVA). In NTVA, visual processing capacity is distributed across stimuli by dynamic remapping of receptive fields of cortical cells such that more processing resources (cells) are devoted to behaviorally important objects than to less important ones. By use of the same basic equations used in TVA, NTVA accounts for a wide range of known attentional effects in human performance (reaction times and error rates) and a wide range of effects observed in firing rates of single cells in the primate visual system. NTVA provides a mathematical framework to unify the 2 fields of research--formulas bridging cognition and neurophysiology.

  7. Supervised Sequence Labelling with Recurrent Neural Networks

    CERN Document Server

    Graves, Alex

    2012-01-01

    Supervised sequence labelling is a vital area of machine learning, encompassing tasks such as speech, handwriting and gesture recognition, protein secondary structure prediction and part-of-speech tagging. Recurrent neural networks are powerful sequence learning tools—robust to input noise and distortion, able to exploit long-range contextual information—that would seem ideally suited to such problems. However their role in large-scale sequence labelling systems has so far been auxiliary.    The goal of this book is a complete framework for classifying and transcribing sequential data with recurrent neural networks only. Three main innovations are introduced in order to realise this goal. Firstly, the connectionist temporal classification output layer allows the framework to be trained with unsegmented target sequences, such as phoneme-level speech transcriptions; this is in contrast to previous connectionist approaches, which were dependent on error-prone prior segmentation. Secondly, multidimensional...

  8. Hierarchical Neural Network Structures for Phoneme Recognition

    CERN Document Server

    Vasquez, Daniel; Minker, Wolfgang

    2013-01-01

    In this book, hierarchical structures based on neural networks are investigated for automatic speech recognition. These structures are evaluated on the phoneme recognition task where a Hybrid Hidden Markov Model/Artificial Neural Network paradigm is used. The baseline hierarchical scheme consists of two levels, each based on a Multilayer Perceptron. Additionally, the output of the first level serves as the second-level input. The computational speed of the phoneme recognizer can be substantially increased by removing redundant information still contained in the first-level output. Several techniques based on temporal and phonetic criteria have been investigated to remove this redundant information. The computational time could be reduced by 57% whilst keeping the system accuracy comparable to the baseline hierarchical approach.

  9. The neural bases for valuing social equality.

    Science.gov (United States)

    Aoki, Ryuta; Yomogida, Yukihito; Matsumoto, Kenji

    2015-01-01

    The neural basis of how humans value and pursue social equality has become a major topic in social neuroscience research. Although recent studies have identified a set of brain regions and possible mechanisms that are involved in the neural processing of equality of outcome between individuals, how the human brain processes equality of opportunity remains unknown. In this review article, first we describe the importance of the distinction between equality of outcome and equality of opportunity, which has been emphasized in philosophy and economics. Next, we discuss possible approaches for empirical characterization of human valuation of equality of opportunity vs. equality of outcome. Understanding how these two concepts are distinct and interact with each other may provide a better explanation of complex human behaviors concerning fairness and social equality. Copyright © 2014 Elsevier Ireland Ltd and the Japan Neuroscience Society. All rights reserved.

  10. Identifying bilingual semantic neural representations across languages

    Science.gov (United States)

    Buchweitz, Augusto; Shinkareva, Svetlana V.; Mason, Robert A.; Mitchell, Tom M.; Just, Marcel Adam

    2015-01-01

    The goal of the study was to identify the neural representation of a noun's meaning in one language based on the neural representation of that same noun in another language. Machine learning methods were used to train classifiers to identify which individual noun bilingual participants were thinking about in one language based solely on their brain activation in the other language. The study shows reliable (p < .05) identification of individual nouns across languages. It also shows that the stable voxels used to classify the brain activation were located in areas associated with encoding information about semantic dimensions of the words in the study. The identification of the semantic trace of individual nouns from the pattern of cortical activity demonstrates the existence of a multi-voxel pattern of activation across the cortex for a single noun common to both languages in bilinguals. PMID:21978845

  11. Temporal-pattern learning in neural models

    CERN Document Server

    Genís, Carme Torras

    1985-01-01

    While the ability of animals to learn rhythms is an unquestionable fact, the underlying neurophysiological mechanisms are still no more than conjectures. This monograph explores the requirements of such mechanisms, reviews those previously proposed and postulates a new one based on a direct electric coding of stimulation frequencies. Experimental support for the option taken is provided both at the single neuron and neural network levels. More specifically, the material presented divides naturally into four parts: a description of the experimental and theoretical framework where this work becomes meaningful (Chapter 2), a detailed specification of the pacemaker neuron model proposed together with its validation through simulation (Chapter 3), an analytic study of the behavior of this model when submitted to rhythmic stimulation (Chapter 4) and a description of the neural network model proposed for learning, together with an analysis of the simulation results obtained when varying several factors r...

  12. Modelling Framework of a Neural Object Recognition

    Directory of Open Access Journals (Sweden)

    Aswathy K S

    2016-02-01

    Full Text Available In many industrial, medical and scientific image processing applications, various feature and pattern recognition techniques are used to match specific features in an image with a known template. Despite the capabilities of these techniques, some applications require simultaneous analysis of multiple, complex, and irregular features within an image, as in semiconductor wafer inspection. In wafer inspection, discovered defects are often complex and irregular and demand more human-like inspection techniques to recognize irregularities. By incorporating neural network techniques, such image processing systems can be trained on large numbers of images until the system eventually learns to recognize irregularities. The aim of this project is to develop the framework of a machine-learning system that can classify objects of different categories. The framework utilizes toolboxes in Matlab such as the Computer Vision Toolbox and the Neural Network Toolbox.

  13. Neural Network Solves "Traveling-Salesman" Problem

    Science.gov (United States)

    Thakoor, Anilkumar P.; Moopenn, Alexander W.

    1990-01-01

    Experimental electronic neural network solves "traveling-salesman" problem. Plans round trip of minimum distance among N cities, visiting every city once and only once (without backtracking). This problem is paradigm of many problems of global optimization (e.g., routing or allocation of resources) occurring in industry, business, and government. Applied to large number of cities (or resources), circuits of this kind expected to solve problem faster and more cheaply.

  14. Permutation parity machines for neural cryptography.

    Science.gov (United States)

    Reyes, Oscar Mauricio; Zimmermann, Karl-Heinz

    2010-06-01

    Recently, synchronization was proved for permutation parity machines, multilayer feed-forward neural networks proposed as a binary variant of the tree parity machines. This ability was already used in the case of tree parity machines to introduce a key-exchange protocol. In this paper, a protocol based on permutation parity machines is proposed and its performance against common attacks (simple, geometric, majority and genetic) is studied.
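
    Permutation parity machines are a binary variant of the tree parity machines mentioned above; the sketch below shows the classic tree parity machine synchronization step (a Hebbian update applied only when the two parties' outputs agree), with the sizes K, N and the weight bound L chosen as hypothetical examples rather than the paper's configuration.

      import numpy as np

      K, N, L = 3, 10, 3            # hidden units, inputs per unit, weight bound (hypothetical)
      rng = np.random.default_rng(3)

      def output(w, x):
          sigma = np.sign(np.sum(w * x, axis=1))
          sigma[sigma == 0] = -1    # break ties deterministically
          return sigma, int(np.prod(sigma))

      def hebbian(w, x, sigma, tau):
          # Update only the hidden units that agree with the network output, clip to [-L, L].
          for k in range(K):
              if sigma[k] == tau:
                  w[k] = np.clip(w[k] + tau * x[k], -L, L)

      wA = rng.integers(-L, L + 1, size=(K, N))
      wB = rng.integers(-L, L + 1, size=(K, N))

      steps = 0
      while not np.array_equal(wA, wB):
          x = rng.choice([-1, 1], size=(K, N))   # common public input
          sA, tauA = output(wA, x)
          sB, tauB = output(wB, x)
          if tauA == tauB:                       # learn only when the outputs agree
              hebbian(wA, x, sA, tauA)
              hebbian(wB, x, sB, tauB)
          steps += 1

      print(f"synchronized after {steps} rounds; shared key material starts with {wA.ravel()[:8]}")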

  15. Hindcasting cyclonic waves using neural networks

    Digital Repository Service at National Institute of Oceanography (India)

    Mandal, S.; Rao, S.; Chakravarty, N.V.

    the backpropagation networks with updated algorithms are used in this paper. A brief description of the working of a backpropagation neural network and three updated algorithms is given below. Backpropagation learning: Backpropagation is the most widely used algorithm for supervised learning with multi-layer feed-forward networks. The idea of the backpropagation learning algorithm is the repeated application of the chain rule to compute the influence of each weight in the network with respect to an arbitrary...
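
    In practice a feed-forward network of this kind can be fitted with any standard implementation; the sketch below uses scikit-learn's MLPRegressor on hypothetical cyclone predictors (wind speed, pressure drop, fetch) and a synthetic significant-wave-height target, which merely stand in for the measured data used in the paper.

      import numpy as np
      from sklearn.neural_network import MLPRegressor
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import train_test_split

      # Hypothetical predictors: wind speed (m/s), pressure drop (hPa), fetch (km),
      # mapped to significant wave height (m) by an assumed synthetic relation.
      rng = np.random.default_rng(4)
      X = np.column_stack([rng.uniform(20, 70, 500),
                           rng.uniform(10, 80, 500),
                           rng.uniform(50, 500, 500)])
      y = 0.1 * X[:, 0] + 0.02 * X[:, 1] + 0.005 * X[:, 2] + rng.normal(0, 0.3, 500)

      X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
      model = make_pipeline(StandardScaler(),
                            MLPRegressor(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
      model.fit(X_train, y_train)
      print(f"hindcast R^2 on held-out storms: {model.score(X_test, y_test):.2f}")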

  16. Language and Cognition Interaction Neural Mechanisms

    Science.gov (United States)

    2011-06-01

    Leonid Perlovsky, Harvard University and Air Force Research Laboratory. How language ...

  17. Character Recognition Using Genetically Trained Neural Networks

    Energy Technology Data Exchange (ETDEWEB)

    Diniz, C.; Stantz, K.M.; Trahan, M.W.; Wagner, J.S.

    1998-10-01

    Computationally intelligent recognition of characters and symbols addresses a wide range of applications including foreign language translation and chemical formula identification. The combination of intelligent learning and optimization algorithms with layered neural structures offers powerful techniques for character recognition. These techniques were originally developed by Sandia National Laboratories for pattern and spectral analysis; however, their ability to optimize vast amounts of data makes them ideal for character recognition. An adaptation of the Neural Network Designer software allows the user to create a neural network (NN) trained by a genetic algorithm (GA) that correctly identifies multiple distinct characters. The initial successful recognition of standard capital letters can be expanded to include chemical and mathematical symbols and alphabets of foreign languages, especially Arabic and Chinese. The NN model constructed for this project uses a three-layer feed-forward architecture. To facilitate the input of characters and symbols, a graphic user interface (GUI) has been developed to convert the traditional representation of each character or symbol to a bitmap. The 8 x 8 bitmap representations used for these tests are mapped onto the input nodes of the feed-forward neural network (FFNN) in a one-to-one correspondence. The input nodes feed forward into a hidden layer, and the hidden layer feeds into five output nodes correlated to possible character outcomes. During the training period the GA optimizes the weights of the NN until it can successfully recognize distinct characters. Systematic deviations from the base design test the network's range of applicability. Increasing capacity, the number of letters to be recognized, requires a nonlinear increase in the number of hidden layer neurodes. Optimal character recognition performance necessitates a minimum threshold for the number of cases when genetically training the net. And, the
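
    A stripped-down version of the genetic training loop might look like the sketch below: each genome is the flattened weight vector of a 64-input, five-output feed-forward net, fitness is classification accuracy on a few 8 x 8 bitmaps, and truncation selection, single-point crossover, and mutation act on the flat vectors. The bitmaps, layer sizes, and GA settings are hypothetical, not those of the Sandia system.

      import numpy as np

      rng = np.random.default_rng(5)
      N_IN, N_HID, N_OUT = 64, 8, 5                  # 8x8 bitmap -> hidden layer -> 5 characters
      GENOME = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT

      # Hypothetical training set: one random 8x8 bitmap per character class.
      X = rng.integers(0, 2, size=(N_OUT, N_IN)).astype(float)
      labels = np.arange(N_OUT)

      def forward(genome, x):
          i = 0
          W1 = genome[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
          b1 = genome[i:i + N_HID]; i += N_HID
          W2 = genome[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
          b2 = genome[i:i + N_OUT]
          return np.tanh(x @ W1 + b1) @ W2 + b2

      def fitness(genome):
          return np.mean(np.argmax(forward(genome, X), axis=1) == labels)

      pop = rng.normal(scale=0.5, size=(40, GENOME))
      for gen in range(200):
          scores = np.array([fitness(g) for g in pop])
          if scores.max() == 1.0:
              break
          parents = pop[np.argsort(scores)[::-1][:10]]            # truncation selection
          children = []
          for _ in range(len(pop)):
              a, b = parents[rng.integers(10)], parents[rng.integers(10)]
              cut = rng.integers(GENOME)                           # single-point crossover
              child = np.concatenate([a[:cut], b[cut:]])
              child += rng.normal(scale=0.1, size=GENOME) * (rng.random(GENOME) < 0.05)  # mutation
              children.append(child)
          pop = np.array(children)

      print(f"best training accuracy {scores.max():.2f} after {gen + 1} generations")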

  18. Patterns of neural differentiation in melanomas

    Directory of Open Access Journals (Sweden)

    Singh Avantika V

    2010-11-01

    Full Text Available Abstract Background Melanomas, highly malignant tumors, arise from melanocytes, which originate as multipotent neural crest cells during neural tube genesis. The purpose of this study is to assess the pattern of neural differentiation in relation to angiogenesis in VGP melanomas using the tumor as a three-dimensional system. Methods Tumor-vascular complexes [TVC] are formed at the tumor-stroma interface by tumor cells ensheathing angiogenic vessels to proliferate into a mantle of 5 to 6 layers [L1 to L5] forming a perivascular mantle zone [PMZ]. The pattern of neural differentiation, assessed by immunopositivity for HMB45, GFAP, NFP and synaptophysin, has been compared in: [a] the general tumor, [b] tumor-vascular complexes and [c] the perimantle zone [PC] on serial frozen and paraffin sections. Statistical Analysis: ANOVA: Kruskal-Wallis One Way Analysis of Variance; All Pairwise Multiple Comparison Procedures [Tukey Test]. Results The cells abutting on the basement membrane acquire GFAP positivity and extend processes. New layers of tumor cells show a transition between L2 to L3 followed by NFP and Syn positivity in L4&L5. The level of GFAP +vity in L1&L2 is directly proportional to the percentage of NFP/Syn +vity in L4&L5, on comparing pigmented PMZ with poorly pigmented PMZ. Tumor cells in the perimantle zone show high NFP [65%] and Syn [35.4%] positivity with very low GFAP [6.9%], correlating with the positivity in the outer layers. Discussion From this study it is seen that melanoma cells revert to the embryonic pattern of differentiation, with radial glial-like cells [GFAP +ve] which further differentiate into neuronal-positive cells [NFP&Syn +ve] during angiogenic tumor-vascular interaction, as seen during neurogenesis, to populate the tumor substance.

  19. Learning in Neural Networks: VLSI Implementation Strategies

    Science.gov (United States)

    Duong, Tuan Anh

    1995-01-01

    Fully-parallel hardware neural network implementations may be applied to high-speed recognition, classification, and mapping tasks in areas such as vision, or can be used as low-cost self-contained units for tasks such as error detection in mechanical systems (e.g. autos). Learning is required not only to satisfy application requirements, but also to overcome hardware-imposed limitations such as reduced dynamic range of connections.

  20. Convolutional Neural Networks for Font Classification

    OpenAIRE

    Tensmeyer, Chris; Saunders, Daniel; Martinez, Tony

    2017-01-01

    Classifying pages or text lines into font categories aids transcription because single font Optical Character Recognition (OCR) is generally more accurate than omni-font OCR. We present a simple framework based on Convolutional Neural Networks (CNNs), where a CNN is trained to classify small patches of text into predefined font classes. To classify page or line images, we average the CNN predictions over densely extracted patches. We show that this method achieves state-of-the-art performance...
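
    The patch-averaging scheme is independent of the particular CNN; a sketch of the aggregation step is shown below, with a hypothetical stub classifier standing in for the trained network and the patch size and stride chosen arbitrarily.

      import numpy as np

      def classify_patch(patch):
          """Placeholder for the trained CNN: returns a probability vector over font classes.
          This stub scores four hypothetical classes from simple intensity statistics."""
          logits = np.array([patch.mean(), patch.std(), patch.max(), 1.0 - patch.mean()])
          e = np.exp(logits - logits.max())
          return e / e.sum()

      def classify_line_image(image, patch_size=32, stride=16):
          """Densely extract patches, classify each, and average the predictions."""
          h, w = image.shape
          probs = []
          for y in range(0, h - patch_size + 1, stride):
              for x in range(0, w - patch_size + 1, stride):
                  probs.append(classify_patch(image[y:y + patch_size, x:x + patch_size]))
          return np.mean(probs, axis=0)       # averaged class probabilities for the whole line

      rng = np.random.default_rng(6)
      line_image = rng.random((64, 256))      # hypothetical text-line image
      avg = classify_line_image(line_image)
      print("predicted font class:", int(np.argmax(avg)), "probabilities:", np.round(avg, 2))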

  1. The neural correlates of temporal reward discounting.

    Science.gov (United States)

    Scheres, Anouk; de Water, Erik; Mies, Gabry W

    2013-09-01

    Temporal reward discounting (TD) refers to the decrease in subjective value of a reward when the delay to that reward increases. In recent years, a growing number of studies on the neural correlates of temporal reward discounting have been conducted. This article focuses on functional magnetic resonance imaging (fMRI) studies on TD in humans. First, we describe the different types of tasks (also from behavioral studies) and the dependent variables. Subsequently, we discuss the evidence for three neurobiological models of TD: the dual-systems model, the single-system model and the self-control model. Further, studies in which nontraditional tasks (e.g., with nonmonetary rewards) were used to study TD are reviewed. Finally, we discuss the neural correlates of individual differences in discounting, and its development across the lifespan. We conclude that the evidence for each of the three neurobiological models of TD is mixed, in that all models receive (partial) support, and several studies provide support for multiple models. Because of large differences between studies in task design and analytical approach, it is difficult to draw a firm conclusion regarding which model provides the best explanation of the neural correlates of temporal discounting. We propose that some components of these models can complement each other, and future studies should test the predictions offered by different models against each other. Several future research directions are suggested, including studying the connectivity between brain regions in relation to discounting, and directly comparing the neural mechanisms involved in discounting of monetary and primary rewards. WIREs Cogn Sci 2013, 4:523-545. doi: 10.1002/wcs.1246 CONFLICT OF INTEREST: The authors have declared no conflicts of interest for this article. For further resources related to this article, please visit the WIREs website. © 2013 John Wiley & Sons, Ltd.

  2. Flexible neural interfaces with integrated stiffening shank

    Energy Technology Data Exchange (ETDEWEB)

    Tooker, Angela C.; Felix, Sarah H.; Pannu, Satinderpall S.; Shah, Kedar G.; Sheth, Heeral; Tolosa, Vanessa

    2017-10-17

    A neural interface includes a first dielectric material having at least one first opening for a first electrical conducting material, a first electrical conducting material in the first opening, and at least one first interconnection trace electrical conducting material connected to the first electrical conducting material. A stiffening shank material is located adjacent the first dielectric material, the first electrical conducting material, and the first interconnection trace electrical conducting material.

  3. Neural Plasticity in the Gustatory System

    OpenAIRE

    Hill, David L.

    2004-01-01

    Sensory systems adapt to changing environmental influences by coordinated alterations in structure and function. These alterations are referred to as plastic changes. The gustatory system displays numerous plastic changes even in receptor cells. This review focuses on the plasticity of gustatory structures through the first synaptic relay in the brain. Unlike other sensory systems, there is a remarkable amount of environmentally induced changes in these peripheral-most neural structures. The ...

  4. Deep Learning in Neural Networks: An Overview

    OpenAIRE

    Schmidhuber, Juergen

    2014-01-01

    In recent years, deep artificial neural networks (including recurrent ones) have won numerous contests in pattern recognition and machine learning. This historical survey compactly summarises relevant work, much of it from the previous millennium. Shallow and deep learners are distinguished by the depth of their credit assignment paths, which are chains of possibly learnable, causal links between actions and effects. I review deep supervised learning (also recapitulating the history of backpr...

  5. Neural networks of human nature and nurture

    Directory of Open Access Journals (Sweden)

    Daniel S. Levine

    2009-11-01

    Full Text Available Neural network methods have facilitated the unification of several unfortunate splits in psychology, including nature versus nurture. We review the contributions of this methodology and then discuss tentative network theories of caring behavior, of uncaring behavior, and of how the frontal lobes are involved in the choices between them. The implications of our theory are optimistic about the prospects of society to encourage the human potential for caring.

  6. A Dynamic Neural Network Approach to CBM

    Science.gov (United States)

    2011-03-15

    Therefore post-processing is needed to extract the time difference between corresponding events from which to calculate the crankshaft rotational speed ... potentially already available from existing sensors (such as a crankshaft timing device) and a Neural Network processor to carry out the calculation. As ... files are designated with the “_genmod” suffix. These files were the sources for the training and testing sets and made the extraction process easy

  7. Artificial neural network cardiopulmonary modeling and diagnosis

    Science.gov (United States)

    Kangas, Lars J.; Keller, Paul E.

    1997-01-01

    The present invention is a method of diagnosing a cardiopulmonary condition in an individual by comparing data from a progressive multi-stage test for the individual to a non-linear multi-variate model, preferably a recurrent artificial neural network having sensor fusion. The present invention relies on a cardiovascular model developed from physiological measurements of an individual. Any differences between the modeled parameters and the parameters of an individual at a given time are used for diagnosis.

  8. A neural basis for general intelligence

    OpenAIRE

    Duncan, J.; Seitz, R.J.; Kolodny, J.; Bor, D.; Herzog, H; Ahmed, A.; Newell, F. N.; Emslie, H

    2000-01-01

    Universal positive correlations between different cognitive tests motivate the concept of "general intelligence" or Spearman's g. Here the neural basis for g is investigated by means of positron emission tomography. Spatial, verbal, and perceptuo-motor tasks with high-g involvement are compared with matched low-g control tasks. In contrast to the common view that g reflects a broad sample of major cognitive functions, high-g tasks do not show diffuse recruitment of multiple brain regions. Ins...

  9. Neural mechanisms of hypnosis and meditation.

    Science.gov (United States)

    De Benedittis, Giuseppe

    2015-12-01

    Hypnosis has been an elusive concept for science for a long time. However, the explosive advances in neuroscience in the last few decades have provided a "bridge of understanding" between classical neurophysiological studies and psychophysiological studies. These studies have shed new light on the neural basis of the hypnotic experience. Furthermore, an ambitious new area of research is focusing on mapping the core processes of psychotherapy and the neurobiology underlying them. Hypnosis research offers powerful techniques to isolate psychological processes in ways that allow their neural bases to be mapped. The Hypnotic Brain can serve as a way to tap neurocognitive questions and our cognitive assays can in turn shed new light on the neural bases of hypnosis. This cross-talk should enhance research and clinical applications. An increasing body of evidence provides insight into the neural mechanisms of the Meditative Brain. Discrete meditative styles are likely to target different neurodynamic patterns. Recent findings emphasize increased attentional resources activating the attentional and salience networks with coherent perception. Cognitive and emotional equanimity gives rise to an eudaimonic state, made of calm, resilience and stability, readiness to express compassion and empathy, a main goal of Buddhist practices. Structural changes in gray matter of key areas of the brain involved in learning processes suggest that these skills can be learned through practice. Hypnosis and Meditation represent two important, historical and influential landmarks of Western and Eastern civilization and culture respectively. Neuroscience is beginning to provide a better understanding of the mechanisms of both the Hypnotic and Meditative Brain, outlining similarities but also differences between the two states and processes. It is important not to view either the Eastern or the Western system as superior to the other. Cross-fertilization of the ancient Eastern meditation techniques

  10. Neural Cross-Frequency Coupling Functions

    Directory of Open Access Journals (Sweden)

    Tomislav Stankovski

    2017-06-01

    Full Text Available Although neural interactions are usually characterized only by their coupling strength and directionality, there is often a need to go beyond this by establishing the functional mechanisms of the interaction. We introduce the use of dynamical Bayesian inference for estimation of the coupling functions of neural oscillations in the presence of noise. By grouping the partial functional contributions, the coupling is decomposed into its functional components and its most important characteristics—strength and form—are quantified. The method is applied to characterize the δ-to-α phase-to-phase neural coupling functions from electroencephalographic (EEG) data of the human resting state, and the differences that arise when the eyes are either open (EO) or closed (EC) are evaluated. The δ-to-α phase-to-phase coupling functions were reconstructed, quantified, compared, and followed as they evolved in time. Using phase-shuffled surrogates to test for significance, we show how the strength of the direct coupling, and the similarity and variability of the coupling functions, characterize the EO and EC states for different regions of the brain. We confirm an earlier observation that the direct coupling is stronger during EC, and we show for the first time that the coupling function is significantly less variable. Given the current understanding of the effects of e.g., aging and dementia on δ-waves, as well as the effect of cognitive and emotional tasks on α-waves, one may expect that new insights into the neural mechanisms underlying certain diseases will be obtained from studies of coupling functions. In principle, any pair of coupled oscillations could be studied in the same way as those shown here.
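
    A much simplified, non-Bayesian way to expose a δ-to-α phase-to-phase coupling function is to extract both phases with the Hilbert transform and regress the α phase velocity on a low-order Fourier basis of the two phases; the synthetic signal, filter bands, and basis order below are assumptions, and the procedure is only a stand-in for the dynamical Bayesian inference used in the paper.

      import numpy as np
      from scipy.signal import hilbert, butter, sosfiltfilt

      fs = 250.0
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(7)

      # Synthetic "EEG": a delta (2 Hz) component and an alpha (10 Hz) component whose
      # phase is weakly modulated by the delta phase, plus noise. Real data would replace this.
      phi_d_true = 2 * np.pi * 2 * t
      alpha_comp = np.sin(2 * np.pi * 10 * t + 0.4 * np.sin(phi_d_true))
      sig = np.sin(phi_d_true) + alpha_comp + 0.2 * rng.standard_normal(t.size)

      def band_phase(x, lo, hi):
          sos = butter(3, [lo / (fs / 2), hi / (fs / 2)], btype="band", output="sos")
          return np.angle(hilbert(sosfiltfilt(sos, x)))

      phi_d = band_phase(sig, 1, 4)      # delta phase
      phi_a = band_phase(sig, 8, 12)     # alpha phase

      # Regress the alpha phase velocity on a first-order Fourier basis of both phases.
      dphi_a = np.diff(np.unwrap(phi_a)) * fs
      basis = np.column_stack([np.ones(t.size - 1),
                               np.sin(phi_d[:-1]), np.cos(phi_d[:-1]),
                               np.sin(phi_a[:-1]), np.cos(phi_a[:-1]),
                               np.sin(phi_d[:-1] - phi_a[:-1]), np.cos(phi_d[:-1] - phi_a[:-1])])
      coeffs, *_ = np.linalg.lstsq(basis, dphi_a, rcond=None)
      print("fitted coupling-function coefficients:", np.round(coeffs, 2))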

  11. Antagonistic neural networks underlying differentiated leadership roles

    OpenAIRE

    Richard Eleftherios Boyatzis; Kylie eRochford; Anthony Ian Jack

    2014-01-01

    The emergence of two distinct leadership roles, the task leader and the socio-emotional leader, has been documented in the leadership literature since the 1950’s. Recent research in neuroscience suggests that the division between task oriented and socio-emotional oriented roles derives from a fundamental feature of our neurobiology: an antagonistic relationship between two large-scale cortical networks -- the Task Positive Network (TPN) and the Default Mode Network (DMN). Neural activity in ...

  12. Neural Correlates of Verb Argument Structure Processing

    OpenAIRE

    Thompson, Cynthia K.; Bonakdarpour, Borna; Fix, Stephen C.; Blumenfeld, Henrike K.; Parrish, Todd B.; Gitelman, Darren R.; Mesulam, M.-Marsel

    2007-01-01

    Neuroimaging and lesion studies suggest that processing of word classes, such as verbs and nouns, is associated with distinct neural mechanisms. Such studies also suggest that subcategories within these broad word class categories are differentially processed in the brain. Within the class of verbs, argument structure provides one linguistic dimension that distinguishes among verb exemplars, with some requiring more complex argument structure entries than others. This study examined the neura...

  13. Neural correlates of verb argument structure processing.

    Science.gov (United States)

    Thompson, Cynthia K; Bonakdarpour, Borna; Fix, Stephen C; Blumenfeld, Henrike K; Parrish, Todd B; Gitelman, Darren R; Mesulam, M-Marsel

    2007-11-01

    Neuroimaging and lesion studies suggest that processing of word classes, such as verbs and nouns, is associated with distinct neural mechanisms. Such studies also suggest that subcategories within these broad word class categories are differentially processed in the brain. Within the class of verbs, argument structure provides one linguistic dimension that distinguishes among verb exemplars, with some requiring more complex argument structure entries than others. This study examined the neural instantiation of verbs by argument structure complexity: one-, two-, and three-argument verbs. Stimuli of each type, along with nouns and pseudowords, were presented for lexical decision using an event-related functional magnetic resonance imaging design. Results for 14 young normal participants indicated largely overlapping activation maps for verbs and nouns, with no areas of significant activation for verbs compared to nouns, or vice versa. Pseudowords also engaged neural tissue overlapping with that for both word classes, with more widespread activation noted in visual, motor, and peri-sylvian regions. Examination of verbs by argument structure revealed activation of the supramarginal and angular gyri, limited to the left hemisphere only when verbs with two obligatory arguments were compared to verbs with a single argument. However, bilateral activation was noted when both two- and three-argument verbs were compared to one-argument verbs. These findings suggest that posterior peri-sylvian regions are engaged for processing argument structure information associated with verbs, with increasing neural tissue in the inferior parietal region associated with increasing argument structure complexity. These findings are consistent with processing accounts, which suggest that these regions are crucial for semantic integration.

  14. Identifying Tracks Duplicates via Neural Network

    CERN Document Server

    Sunjerga, Antonio; CERN. Geneva. EP Department

    2017-01-01

    The goal of the project is to study the feasibility of state-of-the-art machine learning techniques in track reconstruction. Machine learning techniques provide promising ways to speed up the pattern recognition of tracks by adding more intelligence to the algorithms. The implementation of a neural network for identifying duplicate tracks will be discussed. Different approaches are shown and results are compared to the method that is currently in use.

  15. Statistical Physics, Neural Networks, Brain Studies

    OpenAIRE

    TOULOUSE, Gérard

    2014-01-01

    An overview of some aspects of a vast domain, located at the crossroads of physics, biology and computer science is presented: 1) During the last fifteen years, physicists advancing along various pathways have come into contact with biology (computational neurosciences) and engineering (formal neural nets). 2) This move may actually be viewed as one component in a larger picture. A prominent trend of recent years, observable over many countries, has been the establishment of interdis...

  16. Epigenomic Landscapes of hESC-Derived Neural Rosettes: Modeling Neural Tube Formation and Diseases.

    Science.gov (United States)

    Valensisi, Cristina; Andrus, Colin; Buckberry, Sam; Doni Jayavelu, Naresh; Lund, Riikka J; Lister, Ryan; Hawkins, R David

    2017-08-08

    We currently lack a comprehensive understanding of the mechanisms underlying neural tube formation and their contributions to neural tube defects (NTDs). Developing a model to study such a complex morphogenetic process, especially one that models human-specific aspects, is critical. Three-dimensional, human embryonic stem cell (hESC)-derived neural rosettes (NRs) provide a powerful resource for in vitro modeling of human neural tube formation. Epigenomic maps reveal enhancer elements unique to NRs relative to 2D systems. A master regulatory network illustrates that key NR properties are related to their epigenomic landscapes. We found that folate-associated DNA methylation changes were enriched within NR regulatory elements near genes involved in neural tube formation and metabolism. Our comprehensive regulatory maps offer insights into the mechanisms by which folate may prevent NTDs. Lastly, our distal regulatory maps provide a better understanding of the potential role of neurological-disorder-associated SNPs. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  17. At the interface: convergence of neural regeneration and neural prostheses for restoration of function.

    Science.gov (United States)

    Grill, W M; McDonald, J W; Peckham, P H; Heetderks, W; Kocsis, J; Weinrich, M

    2001-01-01

    The rapid pace of recent advances in development and application of electrical stimulation of the nervous system and in neural regeneration has created opportunities to combine these two approaches to restoration of function. This paper relates the discussion on this topic from a workshop at the International Functional Electrical Stimulation Society. The goals of this workshop were to discuss the current state of interaction between the fields of neural regeneration and neural prostheses and to identify potential areas of future research that would have the greatest impact on achieving the common goal of restoring function after neurological damage. Identified areas include enhancement of axonal regeneration with applied electric fields, development of hybrid neural interfaces combining synthetic silicon and biologically derived elements, and investigation of the role of patterned neural activity in regulating various neuronal processes and neurorehabilitation. Increased communication and cooperation between the two communities and recognition by each field that the other has something to contribute to their efforts are needed to take advantage of these opportunities. In addition, creative grants combining the two approaches and more flexible funding mechanisms to support the convergence of their perspectives are necessary to achieve common objectives.

  18. Neural substrates of sublexical processing for spelling.

    Science.gov (United States)

    DeMarco, Andrew T; Wilson, Stephen M; Rising, Kindle; Rapcsak, Steven Z; Beeson, Pélagie M

    2017-01-01

    We used fMRI to examine the neural substrates of sublexical phoneme-grapheme conversion during spelling in a group of healthy young adults. Participants performed a writing-to-dictation task involving irregular words (e.g., choir), plausible nonwords (e.g., kroid), and a control task of drawing familiar geometric shapes (e.g., squares). Written production of both irregular words and nonwords engaged a left-hemisphere perisylvian network associated with reading/spelling and phonological processing skills. Effects of lexicality, manifested by increased activation during nonword relative to irregular word spelling, were noted in anterior perisylvian regions (posterior inferior frontal gyrus/operculum/precentral gyrus/insula), and in left ventral occipito-temporal cortex. In addition to enhanced neural responses within domain-specific components of the language network, the increased cognitive demands associated with spelling nonwords engaged domain-general frontoparietal cortical networks involved in selective attention and executive control. These results elucidate the neural substrates of sublexical processing during written language production and complement lesion-deficit correlation studies of phonological agraphia. Copyright © 2016 Elsevier Inc. All rights reserved.

  19. Neural Network Control of Asymmetrical Multilevel Converters

    Directory of Open Access Journals (Sweden)

    Patrice WIRA

    2009-12-01

    Full Text Available This paper proposes a neural implementation of a harmonic elimination strategy (HES) to control a Uniform Step Asymmetrical Multilevel Inverter (USAMI). The mapping between the modulation rate and the required switching angles is learned and approximated with a Multi-Layer Perceptron (MLP) neural network. After learning, appropriate switching angles can be determined with the neural network, leading to a low-computational-cost neural controller which is well suited for real-time applications. This technique can be applied to multilevel inverters with any number of levels. As an example, a nine-level inverter and an eleven-level inverter are considered and the optimum switching angles are calculated on-line. Comparisons to the well-known sinusoidal pulse-width modulation (SPWM) have been carried out in order to evaluate the performance of the proposed approach. Simulation results demonstrate the technical advantages of the proposed neural implementation over the conventional method (SPWM) in eliminating harmonics while controlling a nine-level and eleven-level USAMI. This neural approach is applied for the supply of an asynchronous machine and results show that it ensures a high-quality torque by efficiently canceling the harmonics generated by the inverters.
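
    The offline step behind such a controller is solving the selective harmonic elimination equations for each modulation rate, after which the MLP is trained on the resulting (modulation rate, switching angles) table. A hedged sketch for four switching angles (matching the fundamental and cancelling the 5th, 7th and 11th harmonics) is shown below; the equation form, initial guess, and modulation values are assumptions of this illustration, not the paper's exact formulation.

      import numpy as np
      from scipy.optimize import fsolve

      def she_equations(theta, m):
          """Selective harmonic elimination for four angles and equal DC sources: match the
          fundamental to the modulation rate m and cancel the 5th, 7th and 11th harmonics."""
          t1, t2, t3, t4 = theta
          return [np.cos(t1) + np.cos(t2) + np.cos(t3) + np.cos(t4) - 4 * m,
                  np.cos(5 * t1) + np.cos(5 * t2) + np.cos(5 * t3) + np.cos(5 * t4),
                  np.cos(7 * t1) + np.cos(7 * t2) + np.cos(7 * t3) + np.cos(7 * t4),
                  np.cos(11 * t1) + np.cos(11 * t2) + np.cos(11 * t3) + np.cos(11 * t4)]

      # Build a small (modulation rate -> switching angles) table; an MLP would be trained on it.
      guess = np.array([0.2, 0.5, 0.9, 1.3])    # radians, ascending (hypothetical starting point)
      for m in [0.70, 0.75, 0.80, 0.85]:
          angles = fsolve(she_equations, guess, args=(m,))
          guess = angles                        # warm-start the next solve
          print(f"m = {m:.2f}: angles (deg) = {np.round(np.degrees(np.sort(angles)), 1)}")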

  20. A neural model of hierarchical reinforcement learning.

    Science.gov (United States)

    Rasmussen, Daniel; Voelker, Aaron; Eliasmith, Chris

    2017-01-01

    We develop a novel, biologically detailed neural model of reinforcement learning (RL) processes in the brain. This model incorporates a broad range of biological features that pose challenges to neural RL, such as temporally extended action sequences, continuous environments involving unknown time delays, and noisy/imprecise computations. Most significantly, we expand the model into the realm of hierarchical reinforcement learning (HRL), which divides the RL process into a hierarchy of actions at different levels of abstraction. Here we implement all the major components of HRL in a neural model that captures a variety of known anatomical and physiological properties of the brain. We demonstrate the performance of the model in a range of different environments, in order to emphasize the aim of understanding the brain's general reinforcement learning ability. These results show that the model compares well to previous modelling work and demonstrates improved performance as a result of its hierarchical ability. We also show that the model's behaviour is consistent with available data on human hierarchical RL, and generate several novel predictions.