WorldWideScience

Sample records for release probability ltp

  1. Cholinergic induction of input-specific late-phase LTP via localized Ca2+ release in the visual cortex.

    Science.gov (United States)

    Cho, Kwang-Hyun; Jang, Hyun-Jong; Jo, Yang-Hyeok; Singer, Wolf; Rhie, Duck-Joo

    2012-03-28

    Acetylcholine facilitates long-term potentiation (LTP) and long-term depression (LTD), substrates of learning, memory, and sensory processing, in which acetylcholine also plays a crucial role. Ca(2+) ions serve as a canonical regulator of LTP/LTD, but little is known about the effect of acetylcholine on intracellular Ca(2+) dynamics. Here, we investigated dendritic Ca(2+) dynamics evoked by synaptic stimulation and the resulting LTP/LTD in layer 2/3 pyramidal neurons of the rat visual cortex. Under muscarinic stimulation, single-shock electrical stimulation (SES) inducing an ∼20 mV EPSP, applied via a glass electrode located ∼10 μm from the basal dendrite, evoked NMDA receptor-dependent fast Ca(2+) transients and subsequent Ca(2+) release from the inositol 1,4,5-trisphosphate (IP(3))-sensitive stores. These secondary dendritic Ca(2+) transients were highly localized within 10 μm of the center (SD = 5.0 μm). The dendritic release of Ca(2+) was a prerequisite for input-specific muscarinic LTP (LTPm). Without the secondary Ca(2+) release, only muscarinic LTD (LTDm) was induced. D(-)-2-amino-5-phosphonopentanoic acid and intracellular heparin blocked LTPm as well as dendritic Ca(2+) release. A single burst consisting of 3 EPSPs at weak stimulus intensities instead of the SES also induced secondary Ca(2+) release and LTPm. LTPm and LTDm were protein synthesis-dependent. Furthermore, LTPm was confined to specific dendritic compartments and was not inducible in distal apical dendrites. Thus, cholinergic activation selectively facilitated compartment-specific induction of late-phase LTP through IP(3)-dependent Ca(2+) release.

  2. Multivesicular release underlies short term synaptic potentiation independent of release probability change in the supraoptic nucleus.

    Directory of Open Access Journals (Sweden)

    Michelle E Quinlan

    Full Text Available Magnocellular neurons of the supraoptic nucleus receive glutamatergic excitatory inputs that regulate the firing activity and hormone release of these neurons. A strong, brief activation of these excitatory inputs induces a lingering barrage of tetrodotoxin-resistant miniature EPSCs (mEPSCs) that lasts for tens of minutes. This is known to accompany an immediate increase in large-amplitude mEPSCs. However, it remains unknown how long this amplitude increase can last and whether it is simply a byproduct of greater release probability. Using in vitro patch clamp recording on acute rat brain slices, we found that a brief, high frequency stimulation (HFS) of afferents induced a potentiation of mEPSC amplitude lasting up to 20 min. This amplitude potentiation did not correlate with changes in mEPSC frequency, suggesting that it does not reflect changes in presynaptic release probability. Nonetheless, neither a postsynaptic calcium chelator nor an NMDA receptor antagonist blocked the potentiation. Together with the known calcium dependency of HFS-induced potentiation of mEPSCs, our results imply that the mEPSC amplitude increase requires presynaptic calcium. Further analysis showed a multimodal distribution of mEPSC amplitudes, suggesting that large mEPSCs were due to multivesicular glutamate release, even late after HFS when the frequency was no longer elevated. In conclusion, high frequency activation of excitatory synapses induces lasting multivesicular release in the SON, which is independent of changes in release probability. This represents a novel form of synaptic plasticity that may contribute to the prolonged excitatory tone necessary for generation of burst firing of magnocellular neurons.

  3. Thermal diagnostics for LTP

    International Nuclear Information System (INIS)

    Lobo, Alberto; Nofrarias, M; Sanjuan, J

    2005-01-01

    This is a short note reporting on the current state of development of the temperature sensors which are part of the LTP Diagnostics Subsystem on board the LISA Pathfinder mission (LPF). A thermal insulator has been designed which ensures sufficient stability of a set of eight NTC sensors (negative temperature coefficient resistors, or thermistors), and the front-end electronics has also been designed and manufactured. Tests have been performed which nearly approach the goal of a global stability of 10^-5 K Hz^-1/2.

  4. LISA and the LTP

    International Nuclear Information System (INIS)

    Jennrich, O.

    2002-01-01

    The primary objective of the LISA (Laser Interferometer Space Antenna) mission is the detection and observation of gravitational waves from massive black holes (MBH) and galactic binaries in the frequency range 10^-4 Hz to 10^-1 Hz. This low-frequency range is inaccessible to ground-based interferometers due to the background of local gravitational noise and because ground-based interferometers are limited in length to a few kilometres. LISA is envisaged as an ESA/NASA collaborative project, selected as an ESA cornerstone mission and included in NASA's strategic plan, with a nominal launch date in 2011. SMART-2 is primarily intended to demonstrate the key technologies for the ESA/NASA collaborative LISA cornerstone mission. The synergy with the technology being used for Darwin motivates the utilization of SMART-2 for both missions. To this end, SMART-2 will accommodate a LISA Technology Package (LTP), provided by European institutes and industry, and possibly also a Disturbance Reduction System (DRS) that is very similar to the LTP and has the same goals but is provided by US institutes and industry.

  5. Upregulation of transmitter release probability improves a conversion of synaptic analogue signals into neuronal digital spikes

    Science.gov (United States)

    2012-01-01

    Action potentials in neurons and graded signals at synapses are the primary codes in the brain. Regarding their functional interaction, previous studies focused on the influence of presynaptic spike patterns on synaptic activity. How synapse dynamics quantitatively regulates the encoding of postsynaptic digital spikes remains unclear. We investigated this question at unitary glutamatergic synapses onto cortical GABAergic neurons, focusing on the quantitative influence of release probability on synapse dynamics and neuronal encoding. Glutamate release probability and synaptic strength are proportionally upregulated by presynaptic sequential spikes. The upregulation of release probability, and the efficiency of probability-driven synaptic facilitation, are strengthened by elevating presynaptic spike frequency and Ca2+. The upregulation of release probability improves spike capacity and timing precision at the postsynaptic neuron. These results suggest that the upregulation of presynaptic glutamate release facilitates the conversion of synaptic analogue signals into digital spikes in postsynaptic neurons, i.e., a functional compatibility between presynaptic and postsynaptic partners. PMID:22852823
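
    The frequency dependence reported above (stronger upregulation of release probability at higher presynaptic spike rates) is the signature of short-term facilitation. A minimal sketch, assuming a hypothetical facilitation model with made-up parameters u0, U and tau_f (not values from the study):

```python
import math

def facilitated_release_prob(spike_times, u0=0.1, U=0.2, tau_f=0.2):
    """Toy facilitation model: release probability jumps toward 1 at each
    spike and decays back to the resting value u0 between spikes.

    spike_times: spike times in seconds (ascending)
    u0: resting release probability; U: per-spike increment fraction
    tau_f: facilitation decay time constant in seconds
    Returns the Pr experienced by each spike (before its own increment).
    """
    pr, seen, last_t = u0, [], None
    for t in spike_times:
        if last_t is not None:
            # facilitated component decays back toward u0 between spikes
            pr = u0 + (pr - u0) * math.exp(-(t - last_t) / tau_f)
        seen.append(pr)
        pr += U * (1.0 - pr)  # spike-driven increment toward Pr = 1
        last_t = t
    return seen

# Less decay between closely spaced spikes -> stronger facilitation at 20 Hz
low = facilitated_release_prob([i * 0.5 for i in range(5)])    # 2 Hz train
high = facilitated_release_prob([i * 0.05 for i in range(5)])  # 20 Hz train
```

    In this toy model the release probability seen by the fifth spike of the 20 Hz train is substantially higher than in the 2 Hz train, mirroring the frequency dependence described above.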

  6. Kepler Planet Reliability Metrics: Astrophysical Positional Probabilities for Data Release 25

    Science.gov (United States)

    Bryson, Stephen T.; Morton, Timothy D.

    2017-01-01

    This document is very similar to KSCI-19092-003, Planet Reliability Metrics: Astrophysical Positional Probabilities, which describes the previous release of the astrophysical positional probabilities for Data Release 24. The important changes for Data Release 25 are: (1) the computation of the astrophysical positional probabilities uses the Data Release 25 processed pixel data for all Kepler Objects of Interest; (2) computed probabilities now have associated uncertainties, whose computation is described in §4.1.3; (3) the scene modeling described in §4.1.2 uses background stars detected via ground-based high-resolution imaging, described in §5.1, that are not in the Kepler Input Catalog or UKIRT catalog. These newly detected stars are presented in Appendix B. Otherwise, the text describing the algorithms and examples is largely unchanged from KSCI-19092-003.

  7. Probabilistic Approach to Conditional Probability of Release of Hazardous Materials from Railroad Tank Cars during Accidents

    Science.gov (United States)

    2009-10-13

    This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...
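
    The Monte Carlo approach described above can be caricatured in a few lines. Everything in this sketch (the impact-severity proxy, the uniform speed range, the Gaussian shell-resistance term) is a hypothetical placeholder, not the calibrated tank car model of the paper:

```python
import random

def estimate_release_probability(n_trials=100_000, seed=1):
    """Monte Carlo sketch of P(release | derailment) for one tank car.

    A release is counted when a random impact-severity proxy exceeds a
    random shell-resistance proxy; all distributions are illustrative.
    """
    random.seed(seed)
    releases = 0
    for _ in range(n_trials):
        speed = random.uniform(10, 60)                   # derailment speed, mph
        impact = random.lognormvariate(0, 0.5) * speed   # impact severity proxy
        resistance = random.gauss(40, 8)                 # shell resistance proxy
        if impact > resistance:
            releases += 1
    return releases / n_trials

p_release = estimate_release_probability()
```

    The estimate converges as 1/√n_trials, which is why Monte Carlo suits conditional-probability questions where the damage mechanics are easier to simulate than to integrate analytically.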

  8. Probability for human intake of an atom randomly released into ground, rivers, oceans and air

    Energy Technology Data Exchange (ETDEWEB)

    Cohen, B L

    1984-08-01

    Numerical estimates are developed for the probability that an atom randomly released in the top ground layers, in a river, or in the oceans will be ingested orally by a human, and that an atom emitted from an industrial source will be inhaled by a human. Estimates are obtained both for the probability per year and for the total eventual probability. Results vary considerably for different elements, but typical values for total probabilities are: ground, 3 × 10^-3; oceans, 3 × 10^-4; rivers, 1.7 × 10^-4; and air, 5 × 10^-6. Probabilities per year are typically 1 × 10^-7 for releases into the ground and 5 × 10^-8 for releases into the oceans. These results indicate that for material with very long-lasting toxicity, it is important to include the pathways from the ground and from the oceans.
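
    The quoted per-year and total eventual probabilities are consistent if an atom remains available for intake over an effective environmental residence time. A minimal sketch, assuming a constant per-year hazard and a hypothetical ~3 × 10^4-year effective residence time for the ground pathway (a value chosen only to reproduce the order of the figures above):

```python
def total_intake_probability(p_per_year, residence_years):
    """Eventual intake probability for a constant annual intake probability
    acting over an effective residence time (geometric survival form).

    For small p_per_year this reduces to p_per_year * residence_years.
    """
    return 1.0 - (1.0 - p_per_year) ** residence_years

# Ground pathway: 1e-7 per year over an assumed 3e4-year effective residence
# time gives a total probability on the order of the quoted 3e-3.
p_total = total_intake_probability(1e-7, 30_000)
```

    The geometric form is always slightly below the naive product p_per_year × residence_years, since an atom ingested once is no longer available.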

  9. The LTP interferometer and phasemeter

    International Nuclear Information System (INIS)

    Heinzel, G; Wand, V; GarcIa, A; Jennrich, O; Braxmaier, C; Robertson, D; Middleton, K; Hoyland, D; Ruediger, A; Schilling, R; Johann, U; Danzmann, K

    2004-01-01

    The LISA Technology Package (LTP), to be launched by ESA in 2006/2007, is a technology demonstration mission in preparation for the LISA space-borne gravitational wave detector. A central part of the LTP is the optical metrology package (heterodyne interferometer with phasemeter), which monitors the distance between two test masses with a noise level of 10 pm Hz^-1/2 between 3 mHz and 30 mHz. It has a dynamic range of >100 μm without any actuators for the pathlength. In addition to the longitudinal measurements, it provides alignment measurements with an expected noise level of … Hz^-1/2. While the basic design has been described previously by Heinzel et al (2003 Class. Quantum Grav. 20 S153-61), this paper gives new details on the laser stabilization, the phasemeter and recent prototype results.

  10. Probability

    CERN Document Server

    Shiryaev, A N

    1996-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, ergodic theory, weak convergence of probability measures, stationary stochastic processes, and the Kalman-Bucy filter. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for self-study. This new edition contains substantial revisions and updated references. The reader will find a deeper study of topics such as the distance between probability measures, metrization of weak convergence, and contiguity of probability measures. Proofs for a number of important results which were merely stated in the first edition have been added. The author has included new material on the probability of large deviations and on the central limit theorem for sums of dependent random variables.

  11. LTP after Stress: Up or Down?

    Directory of Open Access Journals (Sweden)

    Marian Joëls

    2007-01-01

    Full Text Available When an organism is exposed to a stressful situation, corticosteroid levels in the brain rise. This rise has consequences for behavioral performance, including memory formation. Over the past decades, it has become clear that a rise in corticosteroid level is also accompanied by a reduction in hippocampal long-term potentiation (LTP. Recent studies, however, indicate that stress does not lead to a universal suppression of LTP. Many factors, including the type of stress, the phase of the stress response, the area of investigation, type of LTP, and the life history of the organism determine in which direction LTP will be changed.

  12. Radioactivity release vs probability for a steam generator tube rupture accident

    International Nuclear Information System (INIS)

    Buslik, A.J.; Hall, R.E.

    1978-01-01

    A calculation of the probability of obtaining various radioactivity releases from a steam generator tube rupture (SGTR) is presented. The only radioactive isotopes considered are Iodine-131 and Xe-133. The particular accident path considered consists of a double-ended guillotine SGTR followed by loss of offsite power (LOSP). If there is no loss of offsite power, and no system fault other than the SGTR, it is judged that the consequences will be minimal, since the amount of iodine released through the condenser air ejector is expected to be quite small; this is a consequence of the fact that the concentration of iodine in the vapor released from the condenser air ejector is very small compared to that dissolved in the condensate water. In addition, in some plants the condenser air ejector flow is automatically diverted to containment on a high-activity alarm. The analysis presented here is for a typical Westinghouse PWR such as that described in RESAR-3S.

  13. Molecular machines regulating the release probability of synaptic vesicles at the active zone.

    Directory of Open Access Journals (Sweden)

    Christoph Koerber

    2016-03-01

    Full Text Available The fusion of synaptic vesicles (SVs) with the plasma membrane of the active zone (AZ) upon arrival of an action potential (AP) at the presynaptic compartment is a tightly regulated probabilistic process crucial for information transfer. The probability of an SV releasing its transmitter content in response to an AP, termed release probability (Pr), is highly diverse both at the level of entire synapses and of individual SVs at a given synapse. Differences in Pr exist between different types of synapses, between synapses of the same type, between synapses originating from the same axon, and even between different SV subpopulations within the same presynaptic terminal. The Pr of SVs at the AZ is set by a complex interplay of presynaptic properties, including the availability of release-ready SVs, the location of the SVs relative to the voltage-gated calcium channels (VGCCs) at the AZ, the magnitude of calcium influx upon arrival of the AP, the buffering of calcium ions, and the identity and sensitivity of the calcium sensor. These properties are not only interconnected, but can also be regulated dynamically to match the requirements of activity patterns mediated by the synapse. Here, we review recent advances in identifying molecules and molecular machines taking part in the determination of vesicular Pr at the AZ.
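
    The combined effect of calcium influx magnitude and sensor sensitivity on Pr is commonly summarized by a cooperative, Hill-type dose-response curve. A minimal sketch with an assumed Hill coefficient of 4 and arbitrary concentration units (illustrative values, not parameters from this review):

```python
def release_probability(ca, k_half=10.0, hill=4.0, p_max=1.0):
    """Hill-type sketch of vesicular release probability vs local Ca2+.

    ca: local calcium concentration (arbitrary units)
    k_half: concentration giving half-maximal Pr; hill: cooperativity
    """
    return p_max * ca**hill / (ca**hill + k_half**hill)

# Cooperativity: doubling the local Ca2+ near threshold raises Pr far more
# than twofold, one way channel-to-vesicle distance can diversify Pr.
low = release_probability(5.0)    # SV far from the VGCC cluster
high = release_probability(10.0)  # SV close to the VGCC cluster
```

    With a Hill coefficient of 4, halving the channel-to-vesicle coupling distance (roughly doubling local Ca2+) changes Pr almost tenfold in this sketch, which is why SV position relative to VGCCs is such a strong determinant of Pr.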

  14. Probability analysis of multiple-tank-car release incidents in railway hazardous materials transportation

    International Nuclear Information System (INIS)

    Liu, Xiang; Saat, Mohd Rapik; Barkan, Christopher P.L.

    2014-01-01

    Railroads play a key role in the transportation of hazardous materials in North America. Rail transport differs from highway transport in several aspects, an important one being that rail transport involves trains in which many railcars carrying hazardous materials travel together. By contrast to truck accidents, it is possible that a train accident may involve multiple hazardous materials cars derailing and releasing contents with consequently greater potential impact on human health, property and the environment. In this paper, a probabilistic model is developed to estimate the probability distribution of the number of tank cars releasing contents in a train derailment. Principal operational characteristics considered include train length, derailment speed, accident cause, position of the first car derailed, number and placement of tank cars in a train and tank car safety design. The effect of train speed, tank car safety design and tank car positions in a train were evaluated regarding the number of cars that release their contents in a derailment. This research provides insights regarding the circumstances affecting multiple-tank-car release incidents and potential strategies to reduce their occurrences. The model can be incorporated into a larger risk management framework to enable better local, regional and national safety management of hazardous materials transportation by rail
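
    The kind of model described above can be sketched with a few lines of Monte Carlo. The block-derailment geometry, tank car placement, block-length distribution, and conditional release probability below are illustrative assumptions, not the calibrated values of the paper:

```python
import random

def release_count_distribution(n_trials=50_000, train_len=100,
                               tank_positions=frozenset(range(40, 60)),
                               p_release=0.1, seed=2):
    """Sketch of a multiple-tank-car release model: a derailment starts at a
    random car and derails a contiguous block of random length; each derailed
    tank car then releases independently with probability p_release.
    Returns the estimated probability distribution of the release count."""
    random.seed(seed)
    counts = [0] * (len(tank_positions) + 1)
    for _ in range(n_trials):
        first = random.randrange(train_len)   # position of first car derailed
        block = random.randint(1, 20)         # number of cars derailed
        derailed = range(first, min(first + block, train_len))
        n_rel = sum(1 for pos in derailed
                    if pos in tank_positions and random.random() < p_release)
        counts[n_rel] += 1
    return [c / n_trials for c in counts]

dist = release_count_distribution()
```

    Varying tank_positions or p_release in such a sketch is the toy analogue of the paper's evaluation of tank car placement and safety design.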

  16. APP Homodimers Transduce an Amyloid-β-Mediated Increase in Release Probability at Excitatory Synapses

    Directory of Open Access Journals (Sweden)

    Hilla Fogel

    2014-06-01

    Full Text Available Accumulation of amyloid-β peptides (Aβ), the proteolytic products of the amyloid precursor protein (APP), induces a variety of synaptic dysfunctions ranging from hyperactivity to depression that are thought to cause cognitive decline in Alzheimer's disease. While depression of synaptic transmission has been extensively studied, the mechanisms underlying synaptic hyperactivity remain unknown. Here, we show that Aβ40 monomers and dimers augment release probability through local fine-tuning of APP-APP interactions at excitatory hippocampal boutons. Aβ40 binds to the APP, increases the APP homodimer fraction at the plasma membrane, and promotes APP-APP interactions. The APP activation induces structural rearrangements in the APP/Gi/o-protein complex, boosting presynaptic calcium flux and vesicle release. The APP growth-factor-like domain (GFLD) mediates APP-APP conformational changes and presynaptic enhancement. Thus, the APP homodimer constitutes a presynaptic receptor that transduces signal from Aβ40 to glutamate release. Excessive APP activation may initiate a positive feedback loop, contributing to hippocampal hyperactivity in Alzheimer's disease.

  17. A statistical model for deriving probability distributions of contamination for accidental releases

    International Nuclear Information System (INIS)

    ApSimon, H.M.; Davison, A.C.

    1986-01-01

    Results generated from a detailed long-range transport model, MESOS, simulating dispersal of a large number of hypothetical releases of radionuclides in a variety of meteorological situations over Western Europe have been used to derive a simpler statistical model, MESOSTAT. This model may be used to generate probability distributions of different levels of contamination at a receptor point 100-1000 km or so from the source (for example, across a frontier in another country) without considering individual release and dispersal scenarios. The model is embodied in a series of equations involving parameters which are determined from such factors as distance between source and receptor, nuclide decay and deposition characteristics, release duration, and geostrophic windrose at the source. Suitable geostrophic windrose data have been derived for source locations covering Western Europe. Special attention has been paid to the relatively improbable extreme values of contamination at the top end of the distribution. The MESOSTAT model and its development are described, with illustrations of its use and comparison with the original more detailed modelling techniques. (author)

  18. Studies in Optimal Configuration of the LTP

    International Nuclear Information System (INIS)

    McKinney, Wayne R.; Anders, Mark; Barber, Samuel K.; Domning, Edward E.; Lou, Yunian; Morrison, Gregory Y.; Salmassi, Farhad; Smith, Brian V.; Yashchuk, Valeriy V.

    2010-01-01

    Brightness preservation requirements for ever brighter synchrotron radiation and free electron laser beamlines require surface slope tolerances of x-ray optics on the order of 0.2 μrad, or better. Hence, the accuracy of dedicated surface slope metrology must be 0.1 μrad, or even less. Achieving this level of measurement accuracy with the flagship instrument at synchrotron radiation metrology laboratories, the Long Trace Profiler (LTP), requires all significant sources of systematic, random, and instrumental drift errors to be identified, and reduced or eliminated. In this respect, the performance of certain components of the Advanced Light Source LTP-II design [Kirschman, et al., Proc. SPIE, 7077, 70770A-12 (2008)] is analyzed, considering the principal justification for inclusion of each component, possible systematic error due to the quality of its optical material, and drift effects due to generated heat, etc. We investigate the effects of replacing the existing diode laser with a fiber-coupled laser light source, and demonstrate that reducing the number of components by using a single beam on the surface under test (SUT), rather than the original double beam, maintains or even improves the accuracy of measurement with our LTP. Based on the performance of the upgraded LTP, we outline further steps for improving the LTP optical system.

  19. Parameter estimation techniques for LTP system identification

    Science.gov (United States)

    Nofrarias Serra, Miquel

    LISA Pathfinder (LPF) is the precursor mission of LISA (Laser Interferometer Space Antenna) and the first step towards gravitational wave detection in space. The main instrument onboard the mission is the LTP (LISA Technology Package), whose scientific goal is to test LISA's drag-free control loop by reaching a differential acceleration noise level between two masses in geodesic motion of 3 × 10^-14 m s^-2 Hz^-1/2 in the millihertz band. The mission is challenging not only in terms of technology readiness but also in terms of data analysis. As with any gravitational wave detector, attaining the instrument performance goals will require an extensive noise hunting campaign to measure all contributions with high accuracy. But, in contrast to on-ground experiments, LTP characterisation will only be possible by setting parameters via telecommands and getting a selected amount of information through the available telemetry downlink. These two conditions, high accuracy and high reliability, are the main restrictions that the LTP data analysis must overcome. A dedicated object-oriented Matlab toolbox (LTPDA) has been set up by the LTP analysis team for this purpose. Among the different toolbox methods, an essential part for the mission are the parameter estimation tools that will be used for system identification during operations: Linear Least Squares, Non-linear Least Squares and Markov chain Monte Carlo methods have been implemented as LTPDA methods. The data analysis team has been testing those methods in a series of mock data exercises with the following objectives: to cross-check parameter estimation methods and compare the achievable accuracy for each of them, and to develop the best strategies to describe the physics underlying a complex controlled experiment like the LTP. In this contribution we describe how these methods were tested with simulated LTP-like data to recover the parameters of the model, and we report on the latest results of these mock data exercises.
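
    Of the three estimators mentioned (linear least squares, non-linear least squares, Markov chain Monte Carlo), the linear case shows the principle most directly: if telemetry is modeled as a linear combination of known basis signals, the parameters drop out of a single least-squares solve. The toy model and numbers below are a generic illustration, not the actual LTP dynamics model or LTPDA code:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "telemetry": y(t) = a*x1(t) + b*x2(t) + noise, a = 2.5, b = -1.2
t = np.linspace(0.0, 10.0, 500)
x1, x2 = np.sin(t), t
y = 2.5 * x1 - 1.2 * x2 + 0.01 * rng.standard_normal(t.size)

# Linear least squares: the basis signals become columns of the design matrix
A = np.column_stack([x1, x2])
params, *_ = np.linalg.lstsq(A, y, rcond=None)
a_hat, b_hat = params
```

    Non-linear least squares iterates this solve on a locally linearized model, and MCMC replaces it with posterior sampling; per the abstract, all three are implemented as LTPDA methods.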

  20. The upgraded LTP-V at SLS

    Energy Technology Data Exchange (ETDEWEB)

    Flechsig, U., E-mail: uwe.flechsig@psi.ch [Paul Scherrer Institut, Swiss Light Source, 5232 Villigen-PSI (Switzerland); Jaggi, A.; Krempaský, J.; Spielmann, S.; Thominet, V. [Paul Scherrer Institut, Swiss Light Source, 5232 Villigen-PSI (Switzerland)

    2013-05-11

    Since 2005 the Swiss Light Source (SLS) has been operating a Long Trace Profiler (LTP)-V from Ocean Optics in its metrology laboratory to measure the synchrotron optics for the SLS. In 2012 we finished a significant upgrade to improve the accuracy, reliability and measurement efficiency, in particular for the calibration of adaptive optics. Folding mirrors with figure errors <λ/100 and an additional linear encoder have been installed, the 1D CCD detector with 2048 pixels has been replaced by a 16 megapixel CCD camera with a gigabit Ethernet (GigE) interface, and the monolithic software has been replaced by a modular, fully EPICS-compatible system based on a new LTP plugin for the areaDetector image-processing software. The plugin allows slope determination in real time, i.e., per frame.

  1. LTP interferometer-noise sources and performance

    International Nuclear Information System (INIS)

    Robertson, David; Killow, Christian; Ward, Harry; Hough, Jim; Heinzel, Gerhard; Garcia, Antonio; Wand, Vinzenz; Johann, Ulrich; Braxmaier, Claus

    2005-01-01

    The LISA Technology Package (LTP) uses laser interferometry to measure the changes in relative displacement between two inertial test masses. The goals of the mission require a displacement measuring precision of 10 pm Hz^-1/2 at frequencies in the 3-30 mHz band. We report on progress with a prototype LTP interferometer optical bench in which fused silica mirrors and beamsplitters are fixed to a ZERODUR® substrate using hydroxide catalysis bonding to form a rigid interferometer. The couplings to displacement noise of two expected noise sources, laser frequency noise and ambient temperature fluctuations, have been investigated for this interferometer, and an additional, unexpected noise source has been identified. The additional noise is due to small amounts of signal at the heterodyne frequency arriving at the photodiode preamplifiers with a phase that quasistatically changes with respect to the optical signal. The phase shift is caused by differential changes in the external optical paths the beams travel before they reach the rigid interferometer. Two different external path length stabilization systems have been demonstrated, and these allowed the performance of the overall system to meet the LTP displacement noise requirement.

  2. The ltp gene of temperate Streptococcus thermophilus phage TP-J34 confers superinfection exclusion to Streptococcus thermophilus and Lactococcus lactis

    International Nuclear Information System (INIS)

    Sun Xingmin; Goehler, Andre; Heller, Knut J.; Neve, Horst

    2006-01-01

    The ltp gene, located within the lysogeny module of temperate Streptococcus thermophilus phage TP-J34, has been shown to be expressed in the lysogenic strain S. thermophilus J34. It codes for a lipoprotein, as demonstrated by inhibition of cleavage of the signal sequence by globomycin. Exposure of Ltp on the surface of Lactococcus lactis protoplasts bearing a plasmid-encoded copy of ltp has been demonstrated by immunogold labeling and electron microscopy. Expression of ltp in prophage- and plasmid-cured S. thermophilus J34-6f interfered with TP-J34 infection. While plating efficiency was reduced by a factor of about 40 and lysis of strain J34-6f in liquid medium was delayed considerably, phage adsorption was not affected at all. Intracellular accumulation of phage DNA was shown to be inhibited by Ltp. This indicates interference of Ltp with infection at the stage of triggering DNA release and injection into the cell, indicating a role of Ltp in superinfection exclusion. Expression of ltp in L. lactis Bu2-60 showed that the same superinfection exclusion mechanism was strongly effective against phage P008, a member of the lactococcal 936 phage species: no plaque formation was detectable even with 10^9 phage per ml applied, and lysis in liquid medium did not occur. In Lactococcus also, Ltp apparently inhibited phage DNA release and/or injection. Ltp appears to be a member of a family of small, secreted proteins with a 42-amino-acid repeat structure encoded by genes of Gram-positive bacteria. Some of these homologous genes are part of the genomes of prophages.

  3. Maintained LTP and Memory Are Lost by Zn2+ Influx into Dentate Granule Cells, but Not Ca2+ Influx.

    Science.gov (United States)

    Takeda, Atsushi; Tamano, Haruna; Hisatsune, Marie; Murakami, Taku; Nakada, Hiroyuki; Fujii, Hiroaki

    2018-02-01

    The idea that maintained LTP and memory are lost by either an increase in intracellular Zn2+ in dentate granule cells or an increase in intracellular Ca2+ was examined, to clarify the significance of the increases induced by excess synapse excitation. Both maintained LTP and spatial memory were impaired by injection of high K+ into the dentate gyrus, but rescued by co-injection of CaEDTA, which blocked the high K+-induced increase in intracellular Zn2+ but not the high K+-induced increase in intracellular Ca2+. High K+-induced disturbances of LTP and intracellular Zn2+ were rescued by co-injection of 6-cyano-7-nitroquinoxaline-2,3-dione, an α-amino-3-hydroxy-5-methyl-4-isoxazolepropionate (AMPA) receptor antagonist, but not by co-injection of blockers of NMDA receptors, metabotropic glutamate receptors, or voltage-dependent calcium channels. Furthermore, AMPA impaired maintained LTP, and the impairment was also rescued by co-injection of CaEDTA, which blocked the increase in intracellular Zn2+ but not the increase in intracellular Ca2+. NMDA and glucocorticoid, which induced Zn2+ release from internal stores, did not impair maintained LTP. The present study indicates that Zn2+ influx into dentate granule cells through AMPA receptors abolishes maintained LTP and memory. Regulation of Zn2+ influx into dentate granule cells is more critical than that of Ca2+ influx, not only for memory acquisition but also for memory retention.

  4. Conductive plastics: comparing alternative nanotechnologies by performance and life cycle release probability

    Energy Technology Data Exchange (ETDEWEB)

    Neubauer, Nicole; Wohlleben, Wendel, E-mail: wendel.wohlleben@basf.com [Material Physics, GMC/R, BASF SE (Germany); Tomović, Željko, E-mail: zeljko.tomovic@basf.com [BASF Polyurethanes GmbH, GMP/LS (Germany)

    2017-03-15

    Nanocomposites can be considered safe during their life cycle as long as the nanofillers remain embedded in the matrix. Therefore, a possible release of nanofillers has to be assessed before commercialization. This report addresses possible life cycle release scenarios for carbon nanotubes (CNT), graphene, and carbon black (CB) from a thermoplastic polyurethane (TPU) matrix. The content of each nanofiller was adjusted to achieve the same conductivity level. The nanofillers reduced the rate of nanoscale releases during mechanical processing, with release decreasing in the order neat TPU, TPU-CNT, TPU-graphene, TPU-CB. Released fragments were dominated by the polymer matrix with embedded or surface-protruding nanofillers. During electron microscopy analysis, free CB was observed; however, no free CNT or graphene was found. Quantitatively, the presence of free nanofillers remained below the detection limit of <0.01% of generated dust. Further, both the production process and the type of mechanical processing had a significant impact, with higher release rates for injection-molded than for extruded materials, and for sanded than for drilled materials. Due to its optimal performance for further development, extruded TPU-CNT was investigated in a combined, stepwise worst-case scenario (mechanical processing after weathering). After weathering by simulated sunlight and rain, CNT were visible at the surface of the nanocomposite; after additional sanding, fragments showed protruding CNT, but free CNT were not detected. In summary, this preliminary exposure assessment showed no indication that recommended occupational exposure limits for carbonaceous nanomaterials can be exceeded during the life cycle of the specific TPU nanocomposites and conditions investigated in this study.

  5. LTP fibre injector qualification and status

    International Nuclear Information System (INIS)

    Bogenstahl, J; Cunningham, L; Fitzsimons, E D; Hough, J; Killow, C J; Perreur-Lloyd, M; Robertson, D; Rowan, S; Ward, H

    2009-01-01

    This paper presents the current state of the LISA Technology Package (LTP) fibre injector qualification project in terms of vibration and shock tests. The fibre injector is a custom-built part and therefore must undergo a full space qualification process. The mounting structure and method for the sinusoidal vibration, random vibration and shock tests are presented. Furthermore, a proposal is made to use the fibre injector pair qualification model to build an optical prototype bench. The optical prototype bench is a full-scale model of the flight model. It will be used for development and rehearsal of all assembly stages of the flight model and will provide an on-ground simulator for investigations, serving as an updated engineering model.

  6. Involvement of intracellular Zn2+ signaling in LTP at perforant pathway-CA1 pyramidal cell synapse.

    Science.gov (United States)

    Tamano, Haruna; Nishio, Ryusuke; Takeda, Atsushi

    2017-07-01

    The physiological significance of synaptic Zn2+ signaling was examined at perforant pathway-CA1 pyramidal cell synapses. In vivo long-term potentiation (LTP) at these synapses was induced using a recording electrode attached to a microdialysis probe, and the recording region was locally perfused with artificial cerebrospinal fluid (ACSF) via the probe. Perforant pathway LTP was not attenuated under perfusion with CaEDTA (10 mM), an extracellular Zn2+ chelator, but was attenuated under perfusion with ZnAF-2DA (50 μM), an intracellular Zn2+ chelator, suggesting that intracellular Zn2+ signaling is required for perforant pathway LTP. Even in rat brain slices bathed in CaEDTA-containing ACSF, the intracellular Zn2+ level, measured with intracellular ZnAF-2, was increased after tetanic stimulation in the stratum lacunosum-moleculare, which contains the perforant pathway-CA1 pyramidal cell synapses. These results suggest that intracellular Zn2+ signaling, which originates in internal stores/proteins, is involved in LTP at perforant pathway-CA1 pyramidal cell synapses. Because the influx of extracellular Zn2+, which originates in presynaptic Zn2+ release, is involved in LTP at Schaffer collateral-CA1 pyramidal cell synapses, synapse-dependent Zn2+ dynamics may be involved in the plasticity of postsynaptic CA1 pyramidal cells. © 2017 Wiley Periodicals, Inc.

  7. LTP - LISA technology package: Development challenges of a spaceborne fundamental physics experiment

    International Nuclear Information System (INIS)

    Gerndt, R

    2009-01-01

    The LISA Technology Package (LTP) is the main payload onboard the LISA Pathfinder spacecraft. The LTP instrument, together with the Drag-Free Attitude Control System (DFACS) and the respective LTP and DFACS operational software, forms the LTP experiment. It is completed by the FEEPs of the LPF spacecraft, which are controlled by DFACS to control the spacecraft's attitude in accordance with the experiment's needs. This article concentrates on aspects of the industrial development of the LTP instrument items and on essential performance issues of the LTP. Examples of investigations of specific issues highlight the kinds of special problems that have to be solved for the LTP in close cooperation with the scientific community.

  8. Optimised purification and characterisation of lipid transfer protein 1 (LTP1) and its lipid-bound isoform LTP1b from barley malt.

    Science.gov (United States)

    Nieuwoudt, Melanie; Lombard, Nicolaas; Rautenbach, Marina

    2014-08-15

    In beer brewing, brewers worldwide strive for product consistency in terms of flavour, colour and foam. Important proteins contributing to beer foam are lipid transfer proteins (LTPs), in particular LTP1 and its lipid-bound isoform LTP1b, which are known to transport lipids in vivo and to prevent lipids from destabilising the beer foam. LTP1 and LTP1b were successfully purified in only five purification steps with a high purified-protein yield (160 mg of LTP1 and LTP1b from 200 g of barley). Circular dichroism of LTP1 and LTP1b confirmed that both proteins are highly tolerant to high temperatures (>90 °C) and are pH stable, particularly at neutral to more basic pH. Only LTP1 exhibited antiyeast and thermostable lytic activity, while LTP1b was inactive, indicating that the fatty acid moiety compromises the antimicrobial activity of LTP1. This lack of antiyeast activity, together with the positive foam properties of LTP1b, would benefit beer fermentation and quality. Copyright © 2014 Elsevier Ltd. All rights reserved.

  9. Involvement of glucocorticoid-mediated Zn2+ signaling in attenuation of hippocampal CA1 LTP by acute stress.

    Science.gov (United States)

    Takeda, Atsushi; Suzuki, Miki; Tamano, Haruna; Takada, Shunsuke; Ide, Kazuki; Oku, Naoto

    2012-03-01

    Glucocorticoid-glutamatergic interactions have been proposed as a potential model to explain stress-mediated impairment of cognition. However, it is unknown whether glucocorticoid-zincergic interactions are involved in this impairment. Histochemically reactive zinc (Zn(2+)) is co-released with glutamate from zincergic neurons. In the present study, the involvement of synaptic Zn(2+) in stress-induced attenuation of CA1 LTP was examined in hippocampal slices from young rats after exposure to tail suspension stress for 30 s, which significantly increased serum corticosterone. Stress-induced attenuation of CA1 LTP was ameliorated by administration of clioquinol, a membrane-permeable zinc chelator, to rats prior to stress exposure, implying that the reduction of synaptic Zn(2+) by clioquinol participates in this amelioration. To pursue the involvement of corticosterone-mediated Zn(2+) signals in the stress-attenuated CA1 LTP, the dynamics of synaptic Zn(2+) were examined in hippocampal slices exposed to corticosterone. Corticosterone dose-dependently increased extracellular Zn(2+) levels, measured with ZnAF-2, as well as intracellular Ca(2+) levels, measured with calcium orange AM, suggesting that corticosterone excites zincergic neurons in the hippocampus and increases Zn(2+) release from their terminals. Intracellular Zn(2+) levels, measured with ZnAF-2DA, were also increased dose-dependently, but not in the presence of CaEDTA, a membrane-impermeable zinc chelator, suggesting that intracellular Zn(2+) levels are increased by the influx of extracellular Zn(2+). Furthermore, corticosterone-induced attenuation of CA1 LTP was abolished in the presence of CaEDTA. The present study suggests that a corticosterone-mediated increase in the postsynaptic Zn(2+) signal in the cytosolic compartment is involved in the attenuation of CA1 LTP after exposure to acute stress. Copyright © 2012 Elsevier Ltd. All rights reserved.

  10. Manifestations and possible sources of lunar transient phenomena (LTP)

    International Nuclear Information System (INIS)

    Cameron, W.S.

    1975-01-01

    Several different manifestations of lunar transient phenomena (LTP) have been reported. These include: (1) brightenings, both sudden and slow; (2) reddish coloration, both bright and dull; (3) bluish coloration, both bright and dull; (4) fairly abrupt dimmings or darkenings; and (5) obscurations, which may be accompanied by any of the other four manifestations. Approximately 200 lunar features exhibiting such anomalies have been reported at least once, but 80% of all observations are found in fewer than a dozen sites and 60% in about half a dozen sites. An observing program is being conducted for the Association of Lunar and Planetary Observers which is designed to monitor the LTP sites, the seismic epicenter sites and non-LTP comparison sites. It addresses the "brightenings" category of observations and is designed to establish the normal brightness of each observed feature for all phases of a lunation. It also seeks to establish a quantified "seeing" scale. About half a dozen observers have reported albedo measures (estimated from an albedo scale set up by each observer). The most extensive new data on albedo versus age (phase of Moon) are for the crater Dawes, in which several LTP effects have been discerned. In addition, seeing estimates based on the behavior of a star's diffraction disk provided some unexpected results when disk behavior is compared with other subjective estimates of seeing.

  11. Interferometry for the LISA technology package LTP: an update

    International Nuclear Information System (INIS)

    Heinzel, G; Bogenstahl, J; Braxmaier, C; Danzmann, K; Garcia, A; Guzman, F; Hough, J; Hoyland, D; Jennrich, O; Killow, C; Robertson, D; Sodnik, Z; Steier, F; Ward, H; Wand, V

    2006-01-01

    This paper gives an update on the status of the LISA technology package (LTP), which is to be launched in 2009 by ESA as a technology demonstration mission for the spaceborne gravitational wave observatory LISA. The dominant noise source in the interferometer prototype has been investigated and reduced such that it is now comfortably below its budget at all frequencies.

  12. Biochemical principles underlying the stable maintenance of LTP by the CaMKII/NMDAR complex.

    Science.gov (United States)

    Lisman, John; Raghavachari, Sridhar

    2015-09-24

    Memory involves the storage of information at synapses by an LTP-like process. This information storage is synapse specific and can endure for years despite the turnover of all synaptic proteins. There must, therefore, be special principles that underlie the stability of LTP. Recent experimental results suggest that LTP is maintained by the complex of CaMKII with the NMDAR. Here we consider the specifics of the CaMKII/NMDAR molecular switch, with the goal of understanding the biochemical principles that underlie stable information storage by synapses. Consideration of a variety of experimental results suggests that multiple principles are involved. One switch requirement is to prevent spontaneous transitions from the off to the on state. The highly cooperative nature of CaMKII autophosphorylation by Ca(2+) (Hill coefficient of 8) and the fact that formation of the CaMKII/NMDAR complex requires release of CaMKII from actin are mechanisms that stabilize the off state. The stability of the on state depends critically on intersubunit autophosphorylation, a process that restores any loss of pT286 due to phosphatase activity. Intersubunit autophosphorylation is also important in explaining why on-state stability is not compromised by protein turnover. Recent evidence suggests that turnover occurs by subunit exchange; thus, stability could be achieved if a newly inserted unphosphorylated subunit were autophosphorylated by a neighboring subunit. Based on other recent work, we posit a novel mechanism that enhances the stability of the on state by protecting pT286 from phosphatases: the binding of the NMDAR to CaMKII forces pT286 into the catalytic site of a neighboring subunit, thereby shielding pT286 from phosphatases. A final principle concerns the role of structural changes. The binding of CaMKII to the NMDAR may act as a tag that organizes the binding of further proteins producing the synapse enlargement that underlies late LTP. We argue that these
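
    The off-state stability argument above (a Hill coefficient of 8 for Ca(2+)-dependent autophosphorylation) can be sketched numerically. This is an illustration only, not code from the paper; the constant K and the Ca(2+) levels are hypothetical and expressed in units of K.

```python
# Illustrative only: a Hill activation curve showing why the high
# cooperativity cited above (Hill coefficient n = 8) suppresses spontaneous
# off -> on transitions. K and the Ca2+ values below are hypothetical.

def hill_activation(ca, K=1.0, n=8):
    """Fractional activation at ligand level `ca` (same units as K)."""
    return ca**n / (K**n + ca**n)

# Basal Ca2+ well below K barely activates the switch, while a brief spike
# above K activates it almost fully; a shallow (n = 1) curve would leak.
basal = hill_activation(0.3)        # ~6.6e-5 with n = 8
spike = hill_activation(2.0)        # ~0.996 with n = 8
leaky = hill_activation(0.3, n=1)   # ~0.23 with n = 1: a leaky off state
print(basal, spike, leaky)
```

    The steeper the curve, the more the switch behaves like a clean threshold element, which is the property the authors invoke to explain why basal Ca(2+) fluctuations do not flip the off state.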

  13. Combining scenarios in a calculation of the overall probability distribution of cumulative releases of radioactivity from the Waste Isolation Pilot Plant, southeastern New Mexico

    International Nuclear Information System (INIS)

    Tierney, M.S.

    1991-11-01

    The Waste Isolation Pilot Plant (WIPP), in southeastern New Mexico, is a research and development facility intended to demonstrate the safe disposal of defense-generated transuranic waste. The US Department of Energy will designate WIPP as a disposal facility if it meets the US Environmental Protection Agency's standard for disposal of such waste; the standard includes a requirement that estimates of cumulative releases of radioactivity to the accessible environment be incorporated in an overall probability distribution. The WIPP Project has chosen an approach to the calculation of this overall probability distribution that employs the concept of scenarios for release and transport of radioactivity to the accessible environment. This report reviews the use of Monte Carlo methods in the calculation of an overall probability distribution and presents a logical and mathematical foundation for the use of the scenario concept in such calculations. The report also draws preliminary conclusions regarding the shape of the probability distribution for the WIPP system, based on the possible occurrence of three events and the presence of one feature: the events "attempted boreholes over rooms and drifts," "mining alters ground-water regime," and "water-withdrawal wells provide alternate pathways," and the feature "brine pocket below room or drift." The overall probability distribution is calculated for only five of the sixteen possible scenario classes that can be obtained by combining the four postulated events and features.
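
    As a toy illustration of the scenario-based Monte Carlo approach described in this record (not the WIPP calculation itself; the event probabilities and release magnitudes below are invented), each trial samples which independent events occur and sums their release contributions, and the empirical complementary cumulative distribution of releases is then read off the samples:

```python
# Toy Monte Carlo sketch of the scenario approach: hypothetical events,
# hypothetical P(event) and release size if it occurs. Not WIPP data.
import random

random.seed(0)
EVENTS = {
    "borehole over room":        (0.10, 5.0),
    "mining alters groundwater": (0.05, 2.0),
    "withdrawal-well pathway":   (0.02, 8.0),
}

def sample_release():
    """One trial: sum the release of each event that happens to occur."""
    return sum(size for p, size in EVENTS.values() if random.random() < p)

samples = [sample_release() for _ in range(100_000)]

def ccdf(level):
    """Empirical P(cumulative release > level)."""
    return sum(1 for s in samples if s > level) / len(samples)

print(ccdf(0.0), ccdf(4.0))  # P(any release), P(release exceeds 4 units)
```

    With independent events, each subset of occurring events is one "scenario class" (here 2^3 = 8 classes; the report works with 16), and the CCDF aggregates over all of them.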

  14. LTP - LISA technology package: Development challenges of a spaceborne fundamental physics experiment

    Energy Technology Data Exchange (ETDEWEB)

    Gerndt, R, E-mail: ruediger.gerndt@astrium.eads.ne [Astrium Satellites GmbH, Claude-Dornier-Str., 88090 Immenstaad (Germany)

    2009-03-01

    The LISA Technology Package (LTP) is the main payload onboard the LISA Pathfinder spacecraft. The LTP instrument, together with the Drag-Free Attitude Control System (DFACS) and the respective LTP and DFACS operational software, forms the LTP experiment. It is completed by the FEEPs of the LPF spacecraft, which are controlled by DFACS to control the spacecraft's attitude in accordance with the experiment's needs. This article concentrates on aspects of the industrial development of the LTP instrument items and on essential performance issues of the LTP. Examples of investigations of specific issues highlight the kinds of special problems that have to be solved for the LTP in close cooperation with the scientific community.

  15. Dopamine Induces LTP Differentially in Apical and Basal Dendrites through BDNF and Voltage-Dependent Calcium Channels

    Science.gov (United States)

    Navakkode, Sheeja; Sajikumar, Sreedharan; Korte, Martin; Soong, Tuck Wah

    2012-01-01

    The dopaminergic modulation of long-term potentiation (LTP) has been studied extensively, but the mechanism by which dopamine induces LTP (DA-LTP) in CA1 pyramidal neurons is unknown. Here, we report that DA-LTP in basal dendrites depends on the activation of L-type voltage-gated calcium channels (VDCC), while in apical dendrites it is independent of them.…

  16. Weakened Intracellular Zn2+-Buffering in the Aged Dentate Gyrus and Its Involvement in Erasure of Maintained LTP.

    Science.gov (United States)

    Takeda, Atsushi; Tamano, Haruna; Murakami, Taku; Nakada, Hiroyuki; Minamino, Tatsuya; Koike, Yuta

    2018-05-01

    Memory is impaired by an increased influx of extracellular Zn2+ into neurons. It is possible that intracellular Zn2+ dynamics are modified along with aging, even at non-zincergic medial perforant pathway-dentate granule cell synapses, and that vulnerability to this modification is linked to age-related cognitive decline. To examine these possibilities, the vulnerability of long-term potentiation (LTP) maintenance, which underlies memory retention, to modification of synaptic Zn2+ dynamics was compared between young and aged rats. The influx of extracellular Zn2+ into dentate granule cells was increased in aged rats after injection of high K+ into the dentate gyrus, but not in young rats. This increase impaired maintained LTP in aged rats. However, the impairment was rescued by co-injection of CaEDTA, an extracellular Zn2+ chelator, or CNQX, an AMPA receptor antagonist, either of which suppressed the Zn2+ influx. Maintained LTP was also impaired in aged rats, but not in young rats, after injection into the dentate gyrus of ZnAF-2DA, which chelates intracellular Zn2+. Interestingly, the capacity for chelating intracellular Zn2+ with intracellular ZnAF-2 was almost lost in the aged dentate gyrus 2 h after injection of ZnAF-2DA into the dentate gyrus, suggesting that intracellular Zn2+ buffering is weakened in the aged dentate gyrus compared to the young dentate gyrus. In the dentate gyrus of aged rats, maintained LTP is more vulnerable to modification of intracellular Zn2+ dynamics than in young rats, probably because of weakened intracellular Zn2+ buffering.

  17. "Hand surgeons probably don't starve": Patients' perceptions of physician reimbursements for performing an open carpal tunnel release.

    Science.gov (United States)

    Kokko, Kyle P; Lipman, Adam J; Sapienza, Anthony; Capo, John T; Barfield, William R; Paksima, Nader

    2015-12-01

    The purpose of this study was to evaluate patients' perceptions of physician reimbursement for the most commonly performed surgery on the hand, a carpal tunnel release (CTR). Anonymous physician reimbursement surveys were given to patients and non-patients in the waiting rooms of orthopaedic hand physicians' offices and certified hand therapists' offices. The survey consisted of 13 questions. Respondents were asked (1) what they thought a surgeon should be paid to perform a carpal tunnel release, (2) to estimate how much Medicare reimburses the surgeon, and (3) how health care dollars should be divided among the surgeon, the anesthesiologist, and the hospital or surgery center. Descriptive subject data included age, gender, income, educational background, and insurance type. Patients thought that hand surgeons should receive $5030 for performing a CTR and that health care funds should be distributed primarily to the hand surgeon (56%), followed by the anesthesiologist (23%) and then the hospital/surgery center (21%). They estimated that Medicare reimburses the hand surgeon $2685 for a CTR. Most patients (86%) stated that Medicare reimbursement was "lower" or "much lower" than what it should be. Respondents believed that hand surgeons should be reimbursed at more than 12 times the Medicare reimbursement rate of approximately $412 and that the physicians (surgeon and anesthesiologist) should command most of the health care funds allocated to this treatment. This study highlights the discrepancy between patients' perceptions and actual physician reimbursement as it relates to federal health care. Efforts should be made to educate patients about this discrepancy.

  18. Learning, memory and hippocampal LTP in genetically obese rodents

    Institute of Scientific and Technical Information of China (English)

    2001-01-01

    We have found that leptin, at a physiological concentration of 10(-12) mol/L, facilitates learning and memory and LTP maintenance in Wistar rats. To explore the role of leptin receptors in learning, memory and synaptic plasticity, experiments were carried out using Zucker rats (Z), db/db mice (db), and ob/ob mice (ob). The former two have defects in leptin receptors, and the latter cannot produce normal leptin. Unlike the effects observed in normal rats, high- or low-frequency stimulation of Schaffer collateral-CA1 synapses in hippocampal slices prepared from Z, db and ob animals failed to induce the learning- and memory-relevant long-term potentiation or depression in CA1 neurons. However, LTP in ob CA1 synapses was facilitated by leptin at a 10(-12) mol/L concentration. Moreover, the paired-pulse facilitation of CA1 synaptic potentials and the intracellularly recorded postsynaptic responses to the neurotransmitters AMPA, NMDA and GABA, applied electrophoretically to the apical dendrites of CA1 neurons, were approximately the same as in the control lean animals. In addition, unlike the second-messenger responses observed in Wistar rats, calmodulin kinase II activity in the CA1 area of Z and db animals was not activated after tetanic stimulation of the Schaffer collaterals. It has been shown that all three strains, Z, db and ob, display impaired spatial learning and memory when tested in the Morris water maze. The results of these experiments indicate a close relationship between spatial learning and memory, facilitation of LTP, and calmodulin kinase II activity.

  19. Neuromodulatory neurotransmitters influence LTP-like plasticity in human cortex: a pharmaco-TMS study.

    Science.gov (United States)

    Korchounov, Alexei; Ziemann, Ulf

    2011-08-01

    Long-term potentiation (LTP) of synaptic efficacy is considered a fundamental mechanism of learning and memory. At the cellular level, a large body of evidence demonstrates that the major neuromodulatory neurotransmitters dopamine (DA), norepinephrine (NE), and acetylcholine (ACh) influence LTP magnitude. Noninvasive brain stimulation protocols provide the opportunity to study LTP-like plasticity at the systems level of human cortex. Here we applied paired associative stimulation (PAS) to induce LTP-like plasticity in the primary motor cortex of eight healthy subjects. In a double-blind, randomized, placebo-controlled, crossover design, the acute effects of a single oral dose of the neuromodulatory drugs cabergoline (DA agonist), haloperidol (DA antagonist), methylphenidate (indirect NE agonist), prazosin (NE antagonist), tacrine (ACh agonist), and biperiden (ACh antagonist) on PAS-induced LTP-like plasticity were examined. The antagonists haloperidol, prazosin, and biperiden significantly depressed the PAS-induced LTP-like plasticity observed under placebo, whereas the agonists cabergoline, methylphenidate, and tacrine had no effect. The findings demonstrate that antagonists of the major neuromodulatory neurotransmitter systems suppress LTP-like plasticity at the systems level of human cortex, in accord with evidence of their modulatory action on LTP at the cellular level. This provides further supportive evidence for the known detrimental effects of these drugs on LTP-dependent mechanisms such as learning and memory.

  20. Temperate Streptococcus thermophilus phages expressing superinfection exclusion proteins of the Ltp type

    Directory of Open Access Journals (Sweden)

    Yahya Ali

    2014-03-01

    Lipoprotein Ltp encoded by the temperate Streptococcus thermophilus phage TP-J34 is the prototype of a widespread family of host cell surface-exposed lipoproteins involved in superinfection exclusion. When screening for other S. thermophilus phages expressing this type of lipoprotein, three temperate phages - TP-EW, TP-DSM20617 and TP-778 - were isolated. In this communication we present the total nucleotide sequences of TP-J34 and TP-778L. For TP-EW, a phage almost identical to TP-J34, besides the ltp gene only the two regions of deviation from TP-J34 DNA were analyzed: the gene encoding the tail protein causing an assembly defect in TP-J34 and the gene encoding the lysin, which in TP-EW contains an intron. For TP-DSM20617 only the sequence of the lysogeny module containing the ltp gene was determined; this region showed high homology to the same region of TP-778. For TP-778 we could show that absence of the attR region resulted in aberrant excision of phage DNA. The amino acid sequence of mature LtpTP-EW was shown to be identical to that of mature LtpTP-J34, whereas the amino acid sequence of mature LtpTP-778 differed from that of mature LtpTP-J34 at eight amino acid positions. LtpTP-DSM20617 differed from LtpTP-778 at just one amino acid position. In contrast to LtpTP-J34, LtpTP-778 did not affect infection by the lactococcal phage P008; instead, increased activity against phage P001 was noticed.

  1. Optimal Release Time and Sensitivity Analysis Using a New NHPP Software Reliability Model with Probability of Fault Removal Subject to Operating Environments

    Directory of Open Access Journals (Sweden)

    Kwang Yoon Song

    2018-05-01

    With the latest technological developments, the software industry is at the center of the fourth industrial revolution. In today's complex and rapidly changing environment, where software applications must be developed quickly and easily, software must keep pace with rapidly changing information technology. The basic goal of software engineering is to produce high-quality software at low cost. However, because of the complexity of software systems, software development can be time consuming and expensive. Software reliability models (SRMs) are used to estimate and predict the reliability, the number of remaining faults, the failure intensity, and the total and development cost of software. Additionally, it is very important to decide when, how, and at what cost to release the software to users. In this study, we propose a new nonhomogeneous Poisson process (NHPP) SRM with a fault detection rate function affected by the probability of fault removal on failure, subject to operating environments, and we discuss the optimal release time and software reliability under the new NHPP SRM. The example results show a good fit of the proposed model, and we propose an optimal release time for a given change in the proposed model.
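
    The release-time logic this record describes can be sketched with the classic Goel-Okumoto NHPP, a simpler relative of the proposed model (the paper's fault detection rate function and operating-environment factor are not reproduced here, and all parameter values below are illustrative):

```python
# Hedged sketch: Goel-Okumoto NHPP, not the paper's model. m(t) is the
# expected cumulative number of faults found by time t; the conditional
# reliability R(x | T) = exp(-(m(T+x) - m(T))) is the probability of no
# failure in a mission of length x after releasing at time T. We scan a
# time grid for the earliest T meeting a reliability target.
import math

def m(t, a=100.0, b=0.05):
    """Mean value function: a = expected total faults, b = detection rate."""
    return a * (1.0 - math.exp(-b * t))

def reliability(x, T, a=100.0, b=0.05):
    """P(no failure in (T, T + x] given release at time T)."""
    return math.exp(-(m(T + x, a, b) - m(T, a, b)))

def optimal_release(x=1.0, target=0.95, step=0.5):
    """Earliest grid time T at which the reliability target is met."""
    T = 0.0
    while reliability(x, T) < target:
        T += step
    return T

print(optimal_release())
```

    A full release-time analysis would also weigh testing cost against the cost of post-release failures; the reliability-target scan above is only the simplest version of that trade-off.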

  2. Probability of liquid radionuclide release of a near surface repository; Probabilidade de liberacao liquida de radionuclideos de um repositorio proximo a superficie

    Energy Technology Data Exchange (ETDEWEB)

    Aguiar, Lais A.; Melo, P.F. Frutuoso e [Universidade Federal, Rio de Janeiro, RJ (Brazil). Coordenacao dos Programas de Pos-graduacao de Engenharia. Programa de Engenharia Nuclear]. E-mail: lais@con.ufrj.br; frutuoso@con.ufrj.br; Passos, Erivaldo; Alves, Antonio Sergio [ELETRONUCLEAR, Rio de Janeiro, RJ (Brazil). Div. de Seguranca Nuclear]. E-mail: epassos@eletronuclear.gov.br; asergi@eletronuclear.gov.br

    2005-07-01

    The safety analysis of a near-surface repository for medium- and low-activity wastes leads to the investigation of accident scenarios related to water infiltration. The probability of radionuclide release through infiltrating water can be estimated with the aid of suitable probabilistic models. For the analysis, the repository system is divided into two subsystems: the first comprises the barriers against water infiltration (backfill material and container), and the second the barriers against the leaching of radionuclides to the biosphere (solid matrix and geosphere). The components (barriers) of the repository system are assumed to work in an active parallel mode. The probability of system failure is obtained from the logical structure of a fault tree. The study was based on the Probabilistic Safety Assessment (PSA) technique applied to the most significant radionuclides within the low- and medium-activity radioactive waste packages, yielding the probability of system failure for each radionuclide during the period of institutional control. (author)
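
    A minimal sketch of the fault-tree logic described: with independent barriers in active parallel, release requires every barrier in both subsystems to fail, so the top event is an AND gate over the basic events. The failure probabilities below are invented for illustration, not taken from the study.

```python
# AND-gate fault tree over independent barriers (illustrative numbers only).

def and_gate(probs):
    """P(all independent basic events occur) -- product of probabilities."""
    p = 1.0
    for q in probs:
        p *= q
    return p

# Subsystem 1: barriers against water infiltration (backfill, container).
p_infiltration = and_gate([0.05, 0.10])
# Subsystem 2: barriers against leaching (solid matrix, geosphere).
p_leaching = and_gate([0.20, 0.01])
# Top event: liquid release requires both subsystems to fail.
p_release = and_gate([p_infiltration, p_leaching])
print(p_release)
```

    Repeating the calculation with radionuclide-specific barrier failure probabilities gives the per-radionuclide release probabilities the abstract refers to.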

  3. Interferometry for the LISA technology package (LTP) aboard SMART-2

    International Nuclear Information System (INIS)

    Heinzel, G; Braxmaier, C; Schilling, R; Ruediger, A; Robertson, D; Plate, M te; Wand, V; Arai, K; Johann, U; Danzmann, K

    2003-01-01

    The interferometer of the LISA technology package (LTP) on SMART-2 is needed to verify the performance of the gravitational sensors by monitoring the distance between two test masses with a noise level of 10 pm Hz^(-1/2) between 3 mHz and 30 mHz. It must continuously track the motion of the test mass distance while that distance changes by many μm at speeds of up to 20 μm s^(-1), without losing track of the sign of the motion and without exerting any influence on the test masses that might lead to a motion above that level. As a result of a detailed comparison study, a heterodyne Mach-Zehnder interferometer was selected as the baseline for the SMART-2 mission. Its design and expected performance are described in this paper.
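
    For orientation, the 10 pm Hz^(-1/2) requirement can be translated into interferometer phase noise, assuming the usual single-bounce geometry in which a test-mass displacement x changes the optical path by 2x, and a Nd:YAG wavelength of 1064 nm; both are standard for LTP-class interferometers but are stated here as assumptions rather than taken from this abstract.

```python
# Back-of-envelope phase/displacement conversion (not flight code).
import math

WAVELENGTH = 1064e-9  # m, Nd:YAG laser (assumed)

def displacement_from_phase(phi_rad):
    """Test-mass displacement for a measured phase shift, single bounce:
    x = phi * lambda / (4 * pi), since the optical path changes by 2x."""
    return phi_rad * WAVELENGTH / (4 * math.pi)

# Phase noise corresponding to the 10 pm/sqrt(Hz) displacement requirement:
req_phase = 10e-12 * 4 * math.pi / WAVELENGTH   # ~1.2e-4 rad/sqrt(Hz)
print(displacement_from_phase(req_phase))       # recovers the 10 pm figure
```

    Resolving roughly a ten-thousandth of a radian per root hertz while the fringe sweeps by many full cycles (μm-scale motion) is what makes continuous phase tracking, rather than fringe counting, the natural choice for a heterodyne readout.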

  4. Long-term potentiation (LTP) of human sensory-evoked potentials.

    Science.gov (United States)

    Kirk, Ian J; McNair, Nicolas A; Hamm, Jeffrey P; Clapp, Wesley C; Mathalon, Daniel H; Cavus, Idil; Teyler, Timothy J

    2010-09-01

    Long-term potentiation (LTP) is the principal candidate synaptic mechanism underlying learning and memory, and has been studied extensively at the cellular and molecular level in laboratory animals. Inquiry into the functional significance of LTP has been hindered by the absence of a human model: until recently, LTP had only been directly demonstrated in humans in isolated cortical tissue obtained from patients undergoing surgery, where it displays properties identical to those seen in non-human preparations. In this brief review, we describe the results of paradigms recently developed in our laboratory for inducing LTP-like changes in visual- and auditory-evoked potentials. We describe how rapid, repetitive presentation of sensory stimuli leads to a persistent enhancement of components of the sensory-evoked potential in normal humans. Experiments to date, investigating the locus, stimulus specificity, and NMDA receptor dependence of these LTP-like changes, suggest that they have the essential characteristics of LTP seen in experimental animals. The ability to elicit LTP from non-surgical patients will provide a human model system allowing the detailed examination of synaptic plasticity in normal subjects and may have future clinical applications in the assessment of cognitive disorders. Copyright © 2010 John Wiley & Sons, Ltd. For further resources related to this article, please visit the WIREs website.

  5. Lentiviral Modulation of Wnt/β-Catenin Signaling Affects In Vivo LTP.

    Science.gov (United States)

    Ivanova, Olga Ya; Dobryakova, Yulia V; Salozhin, Sergey V; Aniol, Viktor A; Onufriev, Mikhail V; Gulyaeva, Natalia V; Markevich, Vladimir A

    2017-10-01

    Wnt signaling is involved in hippocampal development and synaptogenesis. Numerous recent studies have focused on the role of Wnt ligands in the regulation of synaptic plasticity. Inhibitors and activators of canonical Wnt signaling have been demonstrated to decrease or increase, respectively, in vitro long-term potentiation (LTP) maintenance in hippocampal slices (Chen et al. in J Biol Chem 281:11910-11916, 2006; Vargas et al. in J Neurosci 34:2191-2202, 2014; Vargas et al. in Exp Neurol 264:14-25, 2015). Using a lentiviral approach to down- and up-regulate canonical Wnt signaling, we explored whether Wnt/β-catenin signaling is critical for in vivo LTP. Chronic suppression of Wnt signaling induced an impairment of in vivo LTP expression 14 days after injection of the lentiviral suspension, while overexpression of Wnt3 was associated with a transient enhancement of in vivo LTP magnitude. Both effects were related to the early phase of LTP and did not affect LTP maintenance. The loss-of-function study demonstrated a decreased initial paired-pulse facilitation ratio and decreased β-catenin and phGSK-3β levels. The gain-of-function study revealed not only an increase in PSD-95, β-catenin, and Cyclin D1 protein levels, but also a reduced phGSK-3β level and enhanced GSK-3β kinase activity. These results suggest a predominantly presynaptic dysfunction underlying the LTP impairment, while postsynaptic modifications are primarily involved in the transient LTP amplification. This study is the first demonstration of the involvement of Wnt/β-catenin signaling in the regulation of synaptic plasticity in an in vivo LTP model.

  6. CaMKII Requirement for in Vivo Insular Cortex LTP Maintenance and CTA Memory Persistence

    Directory of Open Access Journals (Sweden)

    Yectivani Juárez-Muñoz

    2017-11-01

Calcium/calmodulin-dependent protein kinase II (CaMKII) plays an essential role in LTP induction, but since it has the capacity to remain persistently activated even after the decay of external stimuli, it has been proposed that it can also be necessary for LTP maintenance and therefore for memory persistence. It has been shown that basolateral amygdaloid nucleus (Bla) stimulation induces long-term potentiation (LTP) in the insular cortex (IC), a neocortical region implicated in the acquisition and retention of conditioned taste aversion (CTA). Our previous studies have demonstrated that induction of LTP in the Bla-IC pathway before CTA training increased the retention of this task. Although it is known that IC-LTP induction and CTA consolidation share similar molecular mechanisms, little is known about the molecular actors that underlie their maintenance. The purpose of the present study was to evaluate the role of CaMKII in the maintenance of in vivo Bla-IC LTP as well as in the persistence of CTA long-term memory (LTM). Our results show that acute microinfusion of myr-CaMKIINtide, a selective inhibitor of CaMKII, in the IC of adult rats during the late phase of in vivo Bla-IC LTP blocked its maintenance. Moreover, intracortical inhibition of CaMKII 24 h after CTA acquisition impairs CTA-LTM persistence. Together, these results indicate that CaMKII is a key component for the maintenance of neocortical synaptic plasticity as well as for the persistence of CTA-LTM.

  7. Detection of some safe plant-derived foods for LTP-allergic patients.

    Science.gov (United States)

    Asero, Riccardo; Mistrello, Gianni; Roncarolo, Daniela; Amato, Stefano

    2007-01-01

    Lipid transfer protein (LTP) is a widely cross-reacting plant pan-allergen. Adverse reactions to Rosaceae, tree nuts, peanut, beer, maize, mustard, asparagus, grapes, mulberry, cabbage, dates, orange, fig, kiwi, lupine, fennel, celery, tomato, eggplant, lettuce, chestnut and pineapple have been recorded. To detect vegetable foods to be regarded as safe for LTP-allergic patients. Tolerance/intolerance to a large spectrum of vegetable foods other than Rosaceae, tree nuts and peanut was assessed by interview in 49 subjects monosensitized to LTP and in three distinct groups of controls monosensitized to Bet v 1 (n = 24) or Bet v 2 (n = 18), or sensitized to both LTP and birch pollen (n = 16), all with a history of vegetable food allergy. Patients and controls underwent skin prick test (SPT) with a large spectrum of vegetable foods. The absence of IgE reactivity to foods that were negative in both clinical history and SPT was confirmed by immunoblot analysis and their clinical tolerance was finally assessed by open oral challenge (50 g per food). All patients reported tolerance and showed negative SPT to carrot, potato, banana and melon; these foods scored positive in SPT and elicited clinical symptoms in a significant proportion of patients from all three control groups. All patients tolerated these four foods on oral challenge. Immunoblot analysis confirmed the lack of IgE reactivity to these foods by LTP-allergic patients. Carrot, potato, banana and melon seem safe for LTP-allergic patients. This finding may be helpful for a better management of allergy to LTP.

  8. The design and analysis of salmonid tagging studies in the Columbia basin. Volume 8: A new model for estimating survival probabilities and residualization from a release-recapture study of fall chinook salmon (Oncorhynchus tschawytscha) smolts in the Snake River

    International Nuclear Information System (INIS)

    Lowther, A.B.; Skalski, J.

    1997-09-01

Standard release-recapture analysis using Cormack-Jolly-Seber (CJS) models to estimate survival probabilities between hydroelectric facilities for Snake River fall chinook salmon (Oncorhynchus tschawytscha) ignores the possibility of individual fish residualizing and completing their migration in the year following tagging. These models do not utilize the available capture-history data from this second year and thus produce negatively biased estimates of survival probabilities. A new multinomial likelihood model was developed that results in biologically relevant, unbiased estimates of survival probabilities using the full two years of capture-history data. This model was applied to 1995 Snake River fall chinook hatchery releases to estimate the true survival probability from one of three upstream release points (Asotin, Billy Creek, and Pittsburgh Landing) to Lower Granite Dam. In the data analyzed here, residualization is not a common physiological response, and thus the use of CJS models did not produce appreciably different results than the true survival probability obtained using the new multinomial likelihood model.
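The bias described above can be illustrated with a toy simulation (not the paper's actual model): if some surviving fish residualize and pass the dam the following year, an estimator that uses only first-year detections underestimates survival. All parameter names and values below are hypothetical, and detection probability is assumed known for simplicity.

```python
import random

random.seed(42)

# Hypothetical toy parameters -- illustrative only.
N = 20000      # tagged fish released
phi = 0.7      # true survival probability from release to the dam
r = 0.2        # probability a surviving fish residualizes (passes in year 2)
p = 0.8        # detection probability at the dam (assumed known here)

n_year1 = n_year2 = 0
for _ in range(N):
    if random.random() < phi:            # fish survives to the dam
        residualized = random.random() < r
        if random.random() < p:          # detected in whichever year it passes
            if residualized:
                n_year2 += 1
            else:
                n_year1 += 1

# The naive estimate (year-1 detections only) actually estimates phi*(1 - r):
phi_naive = n_year1 / (N * p)
# Including year-2 detections recovers the full survival probability:
phi_full = (n_year1 + n_year2) / (N * p)
print(round(phi_naive, 3), round(phi_full, 3))
```

With these assumed values the naive estimator converges to phi*(1 - r) = 0.56 rather than the true phi = 0.7, mirroring the negative bias the authors attribute to ignoring residualized fish.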

  9. Motor learning interference is proportional to occlusion of LTP-like plasticity.

    Science.gov (United States)

    Cantarero, Gabriela; Tang, Byron; O'Malley, Rebecca; Salas, Rachel; Celnik, Pablo

    2013-03-13

    Learning interference occurs when learning something new causes forgetting of an older memory (retrograde interference) or when learning a new task disrupts learning of a second subsequent task (anterograde interference). This phenomenon, described in cognitive, sensory, and motor domains, limits our ability to learn multiple tasks in close succession. It has been suggested that the source of interference is competition of neural resources, although the neuronal mechanisms are unknown. Learning induces long-term potentiation (LTP), which can ultimately limit the ability to induce further LTP, a phenomenon known as occlusion. In humans we quantified the magnitude of occlusion of anodal transcranial direct current stimulation-induced increased excitability after learning a skill task as an index of the amount of LTP-like plasticity used. We found that retention of a newly acquired skill, as reflected by performance in the second day of practice, is proportional to the magnitude of occlusion. Moreover, the degree of behavioral interference was correlated with the magnitude of occlusion. Individuals with larger occlusion after learning the first skill were (1) more resilient to retrograde interference and (2) experienced larger anterograde interference when training a second task, as expressed by decreased performance of the learned skill in the second day of practice. This effect was not observed if sufficient time elapsed between training the two skills and LTP-like occlusion was not present. These findings suggest competition of LTP-like plasticity is a factor that limits the ability to remember multiple tasks trained in close succession.

  10. Discovering long-term potentiation (LTP) - recollections and reflections on what came after.

    Science.gov (United States)

    Lømo, T

    2018-02-01

    Chance events led me to a lifelong career in scientific research. They paved the way for being the first to see long-term potentiation of synaptic efficiency (LTP) in Per Andersen's laboratory in Oslo in 1966. Here I describe my way to this discovery and the experiments with Tim Bliss in 1968-1969 that led to Bliss and Lømo, 1973. Surprisingly, we later failed to reproduce these results. I discuss possible reasons for this failure, which made us both leave LTP research, in my case for good, in Tim's case for several years. After 30 years of work in a different field, I renewed my interest in the hippocampus and LTP in the early 2000s and published, for the first time, results that I had obtained 40 years earlier. Here I present my take on how interest in and research on LTP evolved after the early years. This includes a discussion of the functions of hippocampus as seen in those early days, the case of patient H.M., Donald Hebb's place in the story, the search for 'memory molecules' such as PKMζ, and the primary site for LTP expression (pre- and/or post-synaptic?). Throughout, I reflect on my life in science, how science is done and what drives it. The reflections are quite personal and I admit to mixed feelings about broadcasting them. © 2017 Scandinavian Physiological Society. Published by John Wiley & Sons Ltd.

  11. MGluR5 mediates the interaction between late-LTP, network activity, and learning.

    Directory of Open Access Journals (Sweden)

    Arthur Bikbaev

    2008-05-01

Hippocampal synaptic plasticity and learning are strongly regulated by metabotropic glutamate receptors (mGluRs), and particularly by mGluR5. Here, we investigated the mechanisms underlying mGluR5 modulation of these phenomena. Prolonged pharmacological blockade of mGluR5 with MPEP produced a profound impairment of spatial memory. Effects were associated with (1) a reduction of mGluR1a expression in the dentate gyrus; (2) impaired dentate gyrus LTP; (3) enhanced CA1 LTP; and (4) suppressed theta (5-10 Hz) and gamma (30-100 Hz) oscillations in the dentate gyrus. Allosteric potentiation of mGluR1 after mGluR5 blockade significantly ameliorated dentate gyrus LTP, as well as the suppression of gamma oscillatory activity. CA3 lesioning prevented MPEP effects on CA1 LTP, suggesting that plasticity levels in CA1 are driven by mGluR5-dependent synaptic and network activity in the dentate gyrus. These data support the hypothesis that prolonged mGluR5 inactivation causes altered hippocampal LTP levels and network activity, mediated in part by impaired mGluR1 expression in the dentate gyrus. The consequence is impairment of long-term learning.

  12. Enhanced sensitivity to ethanol-induced inhibition of LTP in CA1 pyramidal neurons of socially isolated C57BL/6J mice: role of neurosteroids

    Directory of Open Access Journals (Sweden)

    Giuseppe eTalani

    2011-10-01

Ethanol (EtOH)-induced impairment of long-term potentiation (LTP) in the rat hippocampus is prevented by the 5α-reductase inhibitor finasteride, suggesting that this effect of EtOH is dependent on the increased local release of neurosteroids such as 3α,5α-THP that promote GABA-mediated transmission. Given that social isolation (SI) in rodents is associated with altered plasma and brain levels of such neurosteroids as well as with an enhanced neurosteroidogenic action of EtOH, we examined whether the inhibitory effect of EtOH on LTP at CA3-CA1 hippocampal excitatory synapses is altered in C57BL/6J mice subjected to SI for 6 weeks in comparison with group-housed (GH) animals. Extracellular recording of fEPSPs as well as patch-clamp analysis were performed in hippocampal slices prepared from both SI and GH mice. Consistent with previous observations, recording of fEPSPs revealed that the extent of LTP induced in the CA1 region of SI mice was significantly reduced compared with that in GH animals. EtOH (40 mM) inhibited LTP in slices from SI mice but not in those from GH mice, and this effect of EtOH was abolished by co-application of 1 µM finasteride. Current-clamp analysis of CA1 pyramidal neurons revealed a decrease in action potential frequency and an increase in the intensity of injected current required to evoke the first action potential in SI mice compared with GH mice, indicative of a decrease in neuronal excitability associated with SI. Together, our data suggest that SI results in reduced levels of neuronal excitability and synaptic plasticity in the hippocampus. Furthermore, the increased sensitivity to the neurosteroidogenic effect of EtOH associated with SI likely accounts for the greater inhibitory effect of EtOH on LTP in SI mice. The increase in EtOH sensitivity induced by SI may be important for the changes in the effects of EtOH on anxiety and on learning and memory associated with the prolonged stress attributable to social isolation.

  13. Alignment and Testing of the GPRM as Part of the LTP Caging Mechanism

    Science.gov (United States)

    Koker, I.; Rozemeijer, H.; Stary, F.; Reichenberger, K.

    2013-09-01

The GPRM (Grabbing, Position and Release Mechanism) is part of the Caging Mechanism (CM) and its electrical control unit (CCU) of the LISA Technology Package (LTP) on board ESA's LISA Pathfinder spacecraft (LPF). The GPRM had only been tested at sub-assembly level (one half of the mechanism), but never in assembled configuration at system level with the flight electronics (CCU); the developing company (RUAG Space, CH) was contracted only for these limited activities. The GPRM EQM was successfully tested in 2008 and the two flight models were delivered in 2009. Due to the design evolution of the CM, the flight GPRMs could not be tested in assembled configuration directly after their delivery, and these GPRM system tests needed to be incorporated into the upgraded CM design. In addition, an alternative integration and alignment approach was developed, taking advantage of the experience gained to date, which also resulted in an optimised schedule. As a consequence of this starting point and the evolution of the Caging Mechanism, the interface to the CM, an alternative alignment concept and verification approach for the GPRM needed to be developed and implemented by MAGNA Steyr Aerospace in close cooperation with all involved parties. The TV and functional test set-up was refurbished and pre-tests were performed so that the requirements could be verified. Vibration testing of the GPRM in its assembled and aligned configuration differed because of the new test and verification approach, and new FEM models for the GPRM vibration test needed to be established and verified. Handling and operating the flight hardware, establishing the new alignment approach and upgrading the test equipment for it were the major challenges of this verification programme. This paper presents the alignment and testing activities of the GPRM together with its control electronics, the CCU.

  14. Hippocampal NPY gene transfer attenuates seizures without affecting epilepsy-induced impairment of LTP

    DEFF Research Database (Denmark)

    Sørensen, Andreas T; Nikitidou, Litsa; Ledri, Marco

    2009-01-01

    (TLE). However, our previous studies show that recombinant adeno-associated viral (rAAV)-NPY treatment in naive rats attenuates long-term potentiation (LTP) and transiently impairs hippocampal learning process, indicating that negative effect on memory function could be a potential side effect of NPY...... is significantly attenuated in vitro. Importantly, transgene NPY overexpression has no effect on short-term synaptic plasticity, and does not further compromise LTP in kindled animals. These data suggest that epileptic seizure-induced impairment of memory function in the hippocampus may not be further affected...... injected with rAAV-NPY, we show that rapid kindling-induced hippocampal seizures in vivo are effectively suppressed as compared to rAAV-empty injected (control) rats. Six to nine weeks later, basal synaptic transmission and short-term synaptic plasticity are unchanged after rapid kindling, while LTP...

  15. Bidirectional modulation of taste aversion extinction by insular cortex LTP and LTD.

    Science.gov (United States)

    Rodríguez-Durán, Luis F; Martínez-Moreno, Araceli; Escobar, Martha L

    2017-07-01

The history of activity of a given neuron has been proposed to bidirectionally influence its future response to synaptic inputs. In particular, induction of synaptic plasticity expressions such as long-term potentiation (LTP) and long-term depression (LTD) modifies the performance of several behavioral tasks. Our previous studies in the insular cortex (IC), a neocortical region that has been related to acquisition and retention of conditioned taste aversion (CTA), have demonstrated that induction of LTP in the basolateral amygdaloid nucleus (Bla)-IC pathway before CTA training enhances the retention of this task. In addition, we reported that CTA training triggers a persistent impairment in the ability to induce in vivo LTP in the IC. The aim of the present study was to investigate whether LTD can be induced in the Bla-IC projection in vivo, as well as whether the extinction of CTA is bidirectionally modified by previous synaptic plasticity induction in this pathway. Thus, rats received 900 pulse trains (five 250 μs pulses at 250 Hz) delivered at 1 Hz in the Bla-IC projection in order to induce LTD, or 10 trains of 100 Hz/1 s with an intertrain interval of 20 s in order to induce LTP. Seven days after surgery, rats were trained in the CTA task, including the extinction trials. Our results show that the Bla-IC pathway is able to express in vivo LTD in an N-methyl-D-aspartate (NMDA) receptor-dependent manner. Induction of LTD in the Bla-IC projection previous to CTA training facilitates the extinction of this task. Conversely, LTP induction enhances CTA retention. The present results show the bidirectional modulation of CTA extinction in response to IC LTP and LTD, providing evidence of the homeostatic adaptation of taste learning. Copyright © 2016 Elsevier Inc. All rights reserved.

  16. Screening of agrochemicals in foodstuffs using low-temperature plasma (LTP) ambient ionization mass spectrometry.

    Science.gov (United States)

    Wiley, Joshua S; García-Reyes, Juan F; Harper, Jason D; Charipar, Nicholas A; Ouyang, Zheng; Cooks, R Graham

    2010-05-01

Low-temperature plasma (LTP) permits direct ambient ionization and mass analysis of samples in their native environment with minimal or no prior preparation. LTP utilizes dielectric barrier discharges (DBDs) to create a low-power plasma which is guided by gas flow onto the sample, from which analytes are desorbed and ionized. In this study, the potential of LTP-MS for the detection of pesticide residues in food is demonstrated. Thirteen multi-class agricultural chemicals were studied (ametryn, amitraz, atrazine, buprofezin, DEET, diphenylamine, ethoxyquin, imazalil, isofenphos-methyl, isoproturon, malathion, parathion-ethyl and terbuthylazine). To evaluate the potential of the proposed approach, LTP-MS experiments were performed directly on fruit peels as well as on fruit/vegetable extracts. Most of the agrochemicals examined displayed remarkable sensitivity in the positive ion mode, giving limits of detection (LOD) for the direct measurement in the low picogram range. Tandem mass spectrometry (MS/MS) was used to confirm the identification of selected pesticides, using spiked fruit/vegetable extracts (prepared with QuEChERS, a standard sample-treatment protocol) at levels as low as 1 pg, absolute, for some of the analytes. Comparisons of the data obtained by direct LTP-MS were made with the slower but more accurate conventional LC-MS/MS procedure. Herbicides spiked in aqueous solutions were detectable at LODs as low as 0.5 μg/L without the need for any sample preparation. The results demonstrate that ambient LTP-MS can be applied for the detection and confirmation of traces of agrochemicals in actual market-purchased produce and in natural water samples. Quantitative analysis was also performed in a few selected cases and displayed a relatively high degree of linearity over four orders of magnitude.

  17. Activity-dependent volume transmission by transgene NPY attenuates glutamate release and LTP in the subiculum

    DEFF Research Database (Denmark)

    Sørensen, Andreas T; Kanter-Schlifke, Irene; Lin, En-Ju D

    2008-01-01

    Neuropeptide Y (NPY) gene transduction of the brain using viral vectors in epileptogenic regions can effectively suppress seizures in animals, and is being considered as a promising alternative treatment strategy for epilepsy. Therefore, it is fundamental to understand the detailed mechanisms...

  18. Afferent input selects NMDA receptor subtype to determine the persistency of hippocampal LTP in freely behaving mice

    Directory of Open Access Journals (Sweden)

    Jesús Javier Ballesteros

    2016-10-01

The glutamatergic N-methyl-D-aspartate receptor (NMDAR) is critically involved in many forms of hippocampus-dependent memory that may be enabled by synaptic plasticity. Behavioral studies with NMDAR antagonists and NMDAR subunit (GluN2) mutants revealed distinct contributions from GluN2A- and GluN2B-containing NMDARs to rapidly and slowly acquired memory performance. Furthermore, studies of synaptic plasticity in genetically modified mice in vitro suggest that GluN2A and GluN2B may contribute in different ways to the induction and longevity of synaptic plasticity. In contrast to the hippocampal slice preparation, in behaving mice the afferent frequencies that induce synaptic plasticity are very restricted and specific. In fact, it is the stimulus pattern, and not variations in afferent frequency, that determines the longevity of long-term potentiation (LTP). Here, we explored the contribution of GluN2A and GluN2B to LTP of differing magnitudes and persistencies in freely behaving mice. We applied differing high-frequency stimulation (HFS) patterns at 100 Hz to the hippocampal CA1 region to induce NMDAR-dependent LTP in wild-type (WT) mice that endured for 24 h (late LTP, L-LTP). In GluN2A-KO mice, E-LTP (HFS, 50 pulses) was significantly reduced in magnitude and duration, whereas LTP (HFS, 2 x 50 pulses) and L-LTP (HFS, 4 x 50 pulses) were unaffected compared to responses in WT animals. By contrast, pharmacological antagonism of GluN2B in WT had no effect on E-LTP but significantly prevented LTP. E-LTP and LTP were significantly impaired by GluN2B antagonism in GluN2A-KO mice. These data indicate that the pattern of afferent stimulation is decisive for the recruitment of distinct GluN2A and GluN2B signaling pathways, which in turn determine the persistency of hippocampal LTP. Whereas brief bursts of patterned stimulation preferentially recruit GluN2A and lead to weak and short-lived forms of LTP, prolonged, more intense afferent activation recruits GluN2B.

  19. ASIC-dependent LTP at multiple glutamatergic synapses in amygdala network is required for fear memory.

    Science.gov (United States)

    Chiang, Po-Han; Chien, Ta-Chun; Chen, Chih-Cheng; Yanagawa, Yuchio; Lien, Cheng-Chang

    2015-05-19

    Genetic variants in the human ortholog of acid-sensing ion channel-1a subunit (ASIC1a) gene are associated with panic disorder and amygdala dysfunction. Both fear learning and activity-induced long-term potentiation (LTP) of cortico-basolateral amygdala (BLA) synapses are impaired in ASIC1a-null mice, suggesting a critical role of ASICs in fear memory formation. In this study, we found that ASICs were differentially expressed within the amygdala neuronal population, and the extent of LTP at various glutamatergic synapses correlated with the level of ASIC expression in postsynaptic neurons. Importantly, selective deletion of ASIC1a in GABAergic cells, including amygdala output neurons, eliminated LTP in these cells and reduced fear learning to the same extent as that found when ASIC1a was selectively abolished in BLA glutamatergic neurons. Thus, fear learning requires ASIC-dependent LTP at multiple amygdala synapses, including both cortico-BLA input synapses and intra-amygdala synapses on output neurons.

  20. Encoding of Discriminative Fear Memory by Input-Specific LTP in the Amygdala.

    Science.gov (United States)

    Kim, Woong Bin; Cho, Jun-Hyeong

    2017-08-30

    In auditory fear conditioning, experimental subjects learn to associate an auditory conditioned stimulus (CS) with an aversive unconditioned stimulus. With sufficient training, animals fear conditioned to an auditory CS show fear response to the CS, but not to irrelevant auditory stimuli. Although long-term potentiation (LTP) in the lateral amygdala (LA) plays an essential role in auditory fear conditioning, it is unknown whether LTP is induced selectively in the neural pathways conveying specific CS information to the LA in discriminative fear learning. Here, we show that postsynaptically expressed LTP is induced selectively in the CS-specific auditory pathways to the LA in a mouse model of auditory discriminative fear conditioning. Moreover, optogenetically induced depotentiation of the CS-specific auditory pathways to the LA suppressed conditioned fear responses to the CS. Our results suggest that input-specific LTP in the LA contributes to fear memory specificity, enabling adaptive fear responses only to the relevant sensory cue. VIDEO ABSTRACT. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Acupuncture Prevents the Impairment of Hippocampal LTP Through β1-AR in Vascular Dementia Rats.

    Science.gov (United States)

    Xiao, Ling-Yong; Wang, Xue-Rui; Yang, Jing-Wen; Ye, Yang; Zhu, Wen; Cao, Yan; Ma, Si-Ming; Liu, Cun-Zhi

    2018-02-13

It is widely accepted that synaptic dysfunction and synapse loss contribute to the cognitive deficits of vascular dementia (VD) patients. We have previously reported that acupuncture improved cognitive function in rats with VD. However, the mechanisms by which acupuncture improves cognitive ability remain to be elucidated. The present study aims to investigate the pathways and molecules involved in the neuroprotective effect of acupuncture. We assessed the effects of acupuncture on hippocampal long-term potentiation (LTP), the most prominent cellular model of memory formation. Acupuncture enhanced LTP and norepinephrine (NE) levels in the hippocampus. Inhibition of the β-adrenergic receptor (AR), but not the α-AR, was able to block the effects of acupuncture on hippocampal LTP. Furthermore, inhibition of β1-AR, but not β2-AR, abolished the enhanced LTP induced by acupuncture. Expression analysis revealed a significant upregulation of β1-AR and unchanged β2-AR with acupuncture, which supported the above findings. Specifically, the increased β1-ARs in the dentate gyrus were expressed exclusively on neurons. Taken together, the present data support a beneficial role of acupuncture in preserving synaptic plasticity challenged by VD. A likely mechanism is the increase of NE and activation of β1-AR in the hippocampus.

  2. Methamphetamine reduces LTP and increases baseline synaptic transmission in the CA1 region of mouse hippocampus.

    Directory of Open Access Journals (Sweden)

    Jarod Swant

    2010-06-01

Methamphetamine (METH) is an addictive psychostimulant whose societal impact is on the rise. Emerging evidence suggests that psychostimulants alter synaptic plasticity in the brain, which may partly account for their adverse effects. While it is known that METH increases the extracellular concentration of the monoamines dopamine, serotonin, and norepinephrine, it is not clear how METH alters glutamatergic transmission. Within this context, the aim of the present study was to investigate the effects of acute and systemic METH on basal synaptic transmission and long-term potentiation (LTP; an activity-induced increase in synaptic efficacy) in the CA1 subfield of the hippocampus. Both the acute ex vivo application of METH to hippocampal slices and systemic administration of METH decreased LTP. Interestingly, the acute ex vivo application of METH at a concentration of 30 or 60 microM increased baseline synaptic transmission as well as decreased LTP. Pretreatment with eticlopride (a D2-like receptor antagonist) did not alter the effects of METH on synaptic transmission or LTP. In contrast, pretreatment with the D1/D5 dopamine receptor antagonist SCH23390 or the 5-HT1A receptor antagonist NAN-190 abrogated the effect of METH on synaptic transmission. Furthermore, METH did not increase baseline synaptic transmission in D1 dopamine receptor haploinsufficient mice. Our findings suggest that METH affects excitatory synaptic transmission via activation of dopamine and serotonin receptor systems in the hippocampus. This modulation may contribute to synaptic maladaptation induced by METH addiction and/or METH-mediated cognitive dysfunction.

  3. Expression of NMDA receptor-dependent LTP in the hippocampus: bridging the divide

    Directory of Open Access Journals (Sweden)

    Bliss Tim VP

    2013-01-01

A consensus has famously yet to emerge on the locus and mechanisms underlying the expression of the canonical NMDA receptor-dependent form of LTP. An objective assessment of the evidence leads us to conclude that both presynaptic and postsynaptic expression mechanisms contribute to this type of synaptic plasticity.

  4. Human sensory Long-Term Potentiation (LTP) predicts visual memory performance and is modulated by the brain-derived neurotrophic factor (BDNF) Val66Met polymorphism

    OpenAIRE

    King, Rohan; Moreau, David; Russell, Bruce; Kirk, Ian; Wu, Carolyn; Antia, Ushtana; Lamb, Yvette; Spriggs, Meg; Thompson, Chris; Mckay, Nicole; Shelling, Andrew; Waldie, Karen; Teyler, Tim; Hamm, Jeff; Mcnair, Nicolas

    2018-01-01

    Background: Long-Term Potentiation (LTP) is recognised as a core neuronal process underlying long-term memory. However, a direct relationship between LTP and human memory performance is yet to be demonstrated. The first aim of the current study was thus to assess the relationship between LTP and human long-term memory performance. With this also comes an opportunity to explore factors thought to mediate the relationship between LTP and long-term memory, and to gain additional insight into var...

  5. Blockade of intracellular Zn2+ signaling in the basolateral amygdala affects object recognition memory via attenuation of dentate gyrus LTP.

    Science.gov (United States)

    Fujise, Yuki; Kubota, Mitsuyasu; Suzuki, Miki; Tamano, Haruna; Takeda, Atsushi

    2017-09-01

Hippocampus-dependent memory is modulated by the amygdala. However, it is unknown whether intracellular Zn2+ signaling in the amygdala is involved in hippocampus-dependent memory. On the basis of the evidence that intracellular Zn2+ signaling in dentate granule cells (DGC) is necessary for object recognition memory via LTP at medial perforant pathway (PP)-DGC synapses, the present study examined whether intracellular Zn2+ signaling in the amygdala influences object recognition memory via modulation of LTP at medial PP-DGC synapses. When ZnAF-2DA (100 μM, 2 μl) was injected into the basolateral amygdala (BLA), intracellular ZnAF-2 locally chelated intracellular Zn2+ in the amygdala. Recognition memory was affected when training in the object recognition test was performed 20 min after ZnAF-2DA injection into the BLA. Twenty minutes after injection of ZnAF-2DA into the BLA, LTP induction at medial PP-DGC synapses was attenuated, while LTP induction at PP-BLA synapses was potentiated and LTP induction at BLA-DGC synapses was attenuated. These results suggest that intracellular Zn2+ signaling in the BLA is involved in BLA-associated LTP and modulates LTP at medial PP-DGC synapses, followed by modulation of object recognition memory. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. The effects of lindane and long-term potentiation (LTP) on pyramidal cell excitability in the rat hippocampal slice.

    Science.gov (United States)

    Albertson, T E; Walby, W F; Stark, L G; Joy, R M

    1997-01-01

An in vitro orthodromic stimulation technique was used to examine the effects of lindane and long-term potentiation (LTP)-inducing stimuli, alone or in combination, on the excitatory afferent terminals of CA1 pyramidal cells and on recurrent collateral-evoked inhibition in the rat hippocampal slice model. Hippocampal slices of 400 micron thickness were perfused with oxygenated artificial cerebrospinal fluid. Stimulation of Schaffer collateral/commissural fibers produced extracellular excitatory postsynaptic potential (EPSP) and/or population spike (PS) responses recorded from electrodes in the CA1 region. A paired-pulse technique was used to measure gamma-aminobutyric acid (GABAA)-mediated recurrent inhibition before and after treatments. After both lindane and LTP, larger PS amplitudes were seen for a given stimulus intensity. The resulting leftward shift in the curve of PS amplitude versus stimulus intensity was larger after LTP than after 25 microM lindane. Both lindane and LTP treatments reduced PS thresholds and reduced or eliminated recurrent inhibition as measured by paired-pulse stimulation at the 15 msec interval. The reduction of recurrent inhibition after both treatments was more pronounced at lower stimulus intensities. When LTP stimuli were applied after lindane exposure, a further large leftward shift was seen in the PS amplitude versus stimulus intensity curve. A smaller leftward shift, seen only at the higher stimulus intensities, occurred when lindane exposure followed LTP. When the first LTP stimulus was followed by a second LTP stimulus, further augmentation of PS amplitudes was seen only at low stimulus intensities. Thus, previous exposure to 25 microM lindane does not block the development of a further robust LTP in this in vitro model.

  7. Ruin probabilities

    DEFF Research Database (Denmark)

    Asmussen, Søren; Albrecher, Hansjörg

    The book gives a comprehensive treatment of classical and modern ruin probability theory. Some of the topics are Lundberg's inequality, the Cramér-Lundberg approximation, exact solutions, other approximations (e.g., for heavy-tailed claim size distributions), finite horizon ruin probabilities, extensions of the classical compound Poisson model to allow for reserve-dependent premiums, Markov-modulation, periodicity, change of measure techniques, phase-type distributions as a computational vehicle, and the connection to other applied probability areas, like queueing theory. In this substantially updated and extended second version, new topics include stochastic control, fluctuation theory for Lévy processes, Gerber–Shiu functions and dependence.
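
The finite-horizon ruin probabilities discussed above rarely have closed forms and are often estimated by simulation. A minimal Monte Carlo sketch for the classical compound Poisson (Cramér-Lundberg) model, assuming exponential claim sizes; the function name is illustrative, not from the book:

```python
import random

def ruin_probability(u, c, lam, mean_claim, horizon, n_paths=20000, seed=1):
    """Monte Carlo estimate of the finite-horizon ruin probability in the
    Cramér-Lundberg model: initial reserve u, premium rate c, claims arriving
    as a Poisson process with rate lam, exponential claim sizes."""
    rng = random.Random(seed)
    ruined = 0
    for _ in range(n_paths):
        t, claims = 0.0, 0.0
        while True:
            t += rng.expovariate(lam)               # time of the next claim
            if t > horizon:
                break                               # survived the horizon
            claims += rng.expovariate(1.0 / mean_claim)
            if u + c * t - claims < 0:              # reserve dips below zero
                ruined += 1
                break
    return ruined / n_paths
```

With a positive safety loading (premium rate above the expected claim rate), the estimate decays quickly in the initial reserve u, in line with Lundberg's exponential bound.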

  8. Two-Stage Translational Control of Dentate Gyrus LTP Consolidation Is Mediated by Sustained BDNF-TrkB Signaling to MNK

    Directory of Open Access Journals (Sweden)

    Debabrata Panja

    2014-11-01

    Full Text Available BDNF signaling contributes to protein-synthesis-dependent synaptic plasticity, but the dynamics of TrkB signaling and mechanisms of translation have not been defined. Here, we show that long-term potentiation (LTP) consolidation in the dentate gyrus of live rodents requires sustained (hours) BDNF-TrkB signaling. Surprisingly, this sustained activation maintains an otherwise labile signaling pathway from TrkB to MAP-kinase-interacting kinase (MNK). MNK activity promotes eIF4F translation initiation complex formation and protein synthesis in mechanistically distinct early and late stages. In early-stage translation, MNK triggers release of the CYFIP1/FMRP repressor complex from the 5′-mRNA cap. In late-stage translation, MNK regulates the canonical translational repressor 4E-BP2 in a synapse-compartment-specific manner. This late stage is coupled to MNK-dependent enhanced dendritic mRNA translation. We conclude that LTP consolidation in the dentate gyrus is mediated by sustained BDNF signaling to MNK and MNK-dependent regulation of translation in two functionally and mechanistically distinct stages.

  9. Generalized Probability-Probability Plots

    NARCIS (Netherlands)

    Mushkudiani, N.A.; Einmahl, J.H.J.

    2004-01-01

    We introduce generalized Probability-Probability (P-P) plots in order to study the one-sample goodness-of-fit problem and the two-sample problem, for real-valued data. These plots, which are constructed by indexing with the class of closed intervals, globally preserve the properties of classical P-P plots.
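
A classical one-sample P-P plot, which the generalized version above extends by indexing with closed intervals, compares the hypothesized CDF evaluated at the order statistics to the empirical CDF. A minimal sketch (function names are illustrative):

```python
import math

def pp_plot_points(sample, cdf):
    """Return the points (F(x_(i)), i/n) of a classical one-sample P-P plot.
    Under the null hypothesis that the data come from F, the points should
    lie close to the diagonal of the unit square."""
    xs = sorted(sample)
    n = len(xs)
    return [(cdf(x), (i + 1) / n) for i, x in enumerate(xs)]

# Example hypothesized CDF: standard normal, via the error function.
std_normal_cdf = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))
```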

  10. Probability-1

    CERN Document Server

    Shiryaev, Albert N

    2016-01-01

    This book contains a systematic treatment of probability from the ground up, starting with intuitive ideas and gradually developing more sophisticated subjects, such as random walks, martingales, Markov chains, the measure-theoretic foundations of probability theory, weak convergence of probability measures, and the central limit theorem. Many examples are discussed in detail, and there are a large number of exercises. The book is accessible to advanced undergraduates and can be used as a text for independent study. To accommodate the greatly expanded material in the third edition of Probability, the book is now divided into two volumes. This first volume contains updated references and substantial revisions of the first three chapters of the second edition. In particular, new material has been added on generating functions, the inclusion-exclusion principle, theorems on monotonic classes (relying on a detailed treatment of “π-λ” systems), and the fundamental theorems of mathematical statistics.
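
Among the newly added topics, the inclusion-exclusion principle lends itself to a short worked example. A sketch computing the probability of a union of events over an equally likely finite sample space (the function name is illustrative):

```python
from itertools import combinations

def union_probability(events, n_outcomes):
    """P(A1 ∪ ... ∪ An) over an equally likely sample space of n_outcomes,
    computed by inclusion-exclusion: add single-event probabilities, subtract
    pairwise intersections, add triple intersections, and so on."""
    total = 0.0
    for k in range(1, len(events) + 1):
        for combo in combinations(events, k):
            inter = set.intersection(*map(set, combo))
            total += (-1) ** (k + 1) * len(inter) / n_outcomes
    return total
```

For example, with A = {1, 2} and B = {2, 3} on a four-point space, the formula gives 2/4 + 2/4 - 1/4 = 3/4.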

  11. Ignition Probability

    Data.gov (United States)

    Earth Data Analysis Center, University of New Mexico — USFS, State Forestry, BLM, and DOI fire occurrence point locations from 1987 to 2008 were combined and converted into a fire occurrence probability or density grid...

  12. Comparison of LTI and LTP Models for Stability Analysis of Grid Converters

    DEFF Research Database (Denmark)

    Kwon, Jun Bum; Wang, Xiongfei; Blaabjerg, Frede

    2016-01-01

    The stability analysis of grid-connected converters has attracted increasing attention, due to the oscillations arising in wind power plants, micro-grids, and other emerging power-electronics-based power systems. The modeling of converters thus becomes essential to faithfully reveal oscillations without any hidden regions. This paper presents a detailed comparison of two linearized modeling methods, developed in the Linear Time-Invariant (LTI) and the Linear Time-Periodic (LTP) frameworks, respectively. The LTP model can capture the frequency-coupling dynamics that arise from time-varying behavior, which the conventional LTI model cannot. The advantages and limits of the two models are then illustrated with examples, and the compared results are verified in both the frequency domain and the time domain.
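
The frequency-coupling point can be illustrated with a toy example far simpler than the converter models in the paper: for a scalar linear time-periodic system, Floquet theory decides stability from the gain product over one period, while a naive LTI model built from the averaged gain can get the answer wrong. A hedged sketch (the system and function names are illustrative, not from the paper):

```python
def ltp_stable(gains):
    """Stability of the scalar time-periodic system x[k+1] = a[k % N] * x[k]:
    by Floquet theory it is asymptotically stable iff the monodromy
    (the product of the gains over one period) has magnitude below 1."""
    m = 1.0
    for a in gains:
        m *= a
    return abs(m) < 1.0

def lti_stable(gains):
    """Naive LTI approximation: freeze the periodic gain at its average
    and test that constant-gain system instead."""
    avg = sum(gains) / len(gains)
    return abs(avg) < 1.0
```

For gains [0.5, 1.8] the periodic system is stable (monodromy 0.9), yet the averaged LTI model predicts instability (average gain 1.15): a toy analogue of the hidden regions an LTI model can miss.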

  13. Excess influx of Zn(2+) into dentate granule cells affects object recognition memory via attenuated LTP.

    Science.gov (United States)

    Suzuki, Miki; Fujise, Yuki; Tsuchiya, Yuka; Tamano, Haruna; Takeda, Atsushi

    2015-08-01

    The influx of extracellular Zn(2+) into dentate granule cells is nonessential for dentate gyrus long-term potentiation (LTP), and the physiological significance of extracellular Zn(2+) dynamics in the dentate gyrus is unknown. An excess increase in extracellular Zn(2+) in the hippocampal CA1, which is induced by excitation of zincergic neurons, induces memory deficit via excess influx of Zn(2+) into CA1 pyramidal cells. In the present study, it was examined whether extracellular Zn(2+) induces object recognition memory deficit via excess influx of Zn(2+) into dentate granule cells. KCl (100 mM, 2 µl) was locally injected into the dentate gyrus. The increase in intracellular Zn(2+) in dentate granule cells induced with high K(+) was blocked by co-injection of CaEDTA and CNQX, an extracellular Zn(2+) chelator and an AMPA receptor antagonist, respectively, suggesting that high K(+) increases the influx of Zn(2+) into dentate granule cells via AMPA receptor activation. Dentate gyrus LTP induction was attenuated 1 h after KCl injection into the dentate gyrus and also attenuated when KCl was injected 5 min after the induction. Memory deficit was induced when training in the object recognition test was performed 1 h after KCl injection into the dentate gyrus and also when KCl was injected 5 min after the training. High K(+)-induced impairments of LTP and memory were rescued by co-injection of CaEDTA. These results indicate that excess influx of Zn(2+) into dentate granule cells via AMPA receptor activation affects object recognition memory via attenuated LTP induction. Even in the dentate gyrus, which is scarcely innervated by zincergic neurons, it is likely that extracellular Zn(2+) homeostasis is strictly regulated for cognition. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Quantum Probabilities as Behavioral Probabilities

    Directory of Open Access Journals (Sweden)

    Vyacheslav I. Yukalov

    2017-03-01

    Full Text Available We demonstrate that behavioral probabilities of human decision makers share many common features with quantum probabilities. This does not imply that humans are some quantum objects, but just shows that the mathematics of quantum theory is applicable to the description of human decision making. The applicability of quantum rules for describing decision making is connected with the nontrivial process of making decisions in the case of composite prospects under uncertainty. Such a process involves deliberations of a decision maker when making a choice. In addition to the evaluation of the utilities of considered prospects, real decision makers also appreciate their respective attractiveness. Therefore, human choice is not based solely on the utility of prospects, but includes the necessity of resolving the utility-attraction duality. In order to justify that human consciousness really functions similarly to the rules of quantum theory, we develop an approach defining human behavioral probabilities as the probabilities determined by quantum rules. We show that quantum behavioral probabilities of humans do not merely explain qualitatively how human decisions are made, but they predict quantitative values of the behavioral probabilities. Analyzing a large set of empirical data, we find good quantitative agreement between theoretical predictions and observed experimental data.

  15. Risk Probabilities

    DEFF Research Database (Denmark)

    Rojas-Nandayapa, Leonardo

    Tail probabilities of sums of heavy-tailed random variables are of major importance in various branches of Applied Probability, such as Risk Theory, Queueing Theory, and Financial Management, and are subject to intense research nowadays. To understand their relevance, one just needs to think of the lack of an analytic expression for the distribution function of a sum of random variables. The presence of heavy-tailed random variables complicates the problem even more. The objective of this dissertation is to provide better approximations by means of sharp asymptotic expressions and Monte Carlo estimators...

  16. Activation of PAF-synthesizing enzymes in rat brain stem slices after LTP induction in the medial vestibular nuclei.

    Science.gov (United States)

    Francescangeli, Ermelinda; Grassi, Silvarosa; Pettorossi, Vito E; Goracci, Gianfrancesco

    2002-11-01

    LysoPAF acetyltransferase (lysoPAF-AT) and PAF-synthesizing phosphocholinetransferase (PAF-PCT) are the two enzymes which catalyze the final reactions in the synthesis of PAF. Their activities, assayed in the homogenate of rat brain stem slices under their optimal conditions, increased 5 min after high-frequency stimulation of vestibular afferents inducing LTP in the medial vestibular nuclei. The activity of phosphatidylcholine-synthesizing phosphocholinetransferase was not affected. Sixty minutes after the induction of LTP, PAF-PCT activity, but not that of lysoPAF-AT, was still significantly higher with respect to the 5 min test-stimulated control. We used AP-5 to verify whether this increase was strictly dependent upon LTP induction, which requires NMDA receptor activation. In AP-5-treated slices, lysoPAF-AT and PAF-PCT activities increased, but they were reduced after high-frequency stimulation under AP-5. In conclusion, we have demonstrated that the activities of PAF-synthesizing enzymes increase soon after the induction of LTP and that this effect is linked to the activation of NMDA receptors. We suggest that the enzyme activation by AP-5, which prevents LTP, might be due to glutamate enhancement; but in neurons showing LTP under normal conditions, the activation of potentiation mechanisms is critical for the enhancement of enzyme activities.

  17. Probability tales

    CERN Document Server

    Grinstead, Charles M; Snell, J Laurie

    2011-01-01

    This book explores four real-world topics through the lens of probability theory. It can be used to supplement a standard text in probability or statistics. Most elementary textbooks present the basic theory and then illustrate the ideas with some neatly packaged examples. Here the authors assume that the reader has seen, or is learning, the basic theory from another book and concentrate in some depth on the following topics: streaks, the stock market, lotteries, and fingerprints. This extended format allows the authors to present multiple approaches to problems and to pursue promising side discussions in ways that would not be possible in a book constrained to cover a fixed set of topics. To keep the main narrative accessible, the authors have placed the more technical mathematical details in appendices. The appendices can be understood by someone who has taken one or two semesters of calculus.
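
The streaks topic has a classic computational counterpart: the probability of observing a run of k successes in n Bernoulli trials, which has no simple closed form but yields to a short dynamic program. A sketch (the function name is illustrative):

```python
def prob_run(n, k, p=0.5):
    """Exact probability of at least one run of k successes in n
    Bernoulli(p) trials, by dynamic programming over the length of the
    current trailing run of successes."""
    # state[r] = probability that the trailing run has length r and no
    # run of k has occurred yet
    state = [0.0] * k
    state[0] = 1.0
    hit = 0.0
    for _ in range(n):
        new = [0.0] * k
        for r, pr in enumerate(state):
            if pr == 0.0:
                continue
            new[0] += pr * (1 - p)      # a failure resets the run
            if r + 1 == k:
                hit += pr * p           # a run of k is completed
            else:
                new[r + 1] += pr * p
        state = new
    return hit
```

For instance, a run of five heads somewhere in 100 fair coin flips has probability roughly 0.81, which is why streaks in real data are less surprising than intuition suggests.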

  18. Probability theory

    CERN Document Server

    Dorogovtsev, A Ya; Skorokhod, A V; Silvestrov, D S; Skorokhod, A V

    1997-01-01

    This book of problems is intended for students in pure and applied mathematics. There are problems in traditional areas of probability theory and problems in the theory of stochastic processes, which has wide applications in the theory of automatic control, queuing and reliability theories, and in many other modern science and engineering fields. Answers to most of the problems are given, and the book provides hints and solutions for more complicated problems.

  19. Elevation of endogenous anandamide impairs LTP, learning, and memory through CB1 receptor signaling in mice.

    Science.gov (United States)

    Basavarajappa, Balapal S; Nagre, Nagaraja N; Xie, Shan; Subbanna, Shivakumar

    2014-07-01

    In rodents, many exogenous and endogenous cannabinoids, such as anandamide (AEA) and 2-arachidonyl glycerol (2-AG), have been shown to play an important role in certain hippocampal memory processes. However, the mechanisms by which endogenous AEA regulates these processes are not well understood. Here the effects of AEA on long-term potentiation (LTP), hippocampal-dependent learning and memory tasks, and pERK1/2, pCaMKIV, and pCREB signaling events in both cannabinoid receptor type 1 (CB1R) wild-type (WT) and knockout (KO) mice were assessed following administration of URB597, an inhibitor of fatty acid amide hydrolase (FAAH). Acute administration of URB597 enhanced AEA levels without affecting the levels of 2-AG or CB1R in the hippocampus and neocortex as compared to vehicle. In hippocampal slices, URB597 impaired LTP in CB1R WT but not in KO littermates. URB597 impaired object recognition, spontaneous alternation and spatial memory in the Y-maze test in CB1R WT mice but not in KO mice. Furthermore, URB597 enhanced ERK phosphorylation in WT mice without affecting total ERK levels in WT or KO mice. URB597 impaired CaMKIV and CREB phosphorylation in WT but not in KO mice. CB1R KO mice have a lower pCaMKIV/CaMKIV ratio and a higher pCREB/CREB ratio as compared to WT littermates. Our results indicate that pharmacologically elevated AEA impairs LTP, learning and memory and inhibits CaMKIV and CREB phosphorylation via the activation of CB1Rs. Collectively, these findings also suggest that pharmacological elevation of AEA beyond normal concentrations is detrimental for the underlying physiological responses. © 2014 Wiley Periodicals, Inc.

  20. Collision Probability Analysis

    DEFF Research Database (Denmark)

    Hansen, Peter Friis; Pedersen, Preben Terndrup

    1998-01-01

    It is the purpose of this report to apply a rational model for prediction of ship-ship collision probabilities as a function of the ship and crew characteristics and the navigational environment, for MS Dextra sailing on a route between Cadiz and the Canary Islands. The most important ship and crew characteristics are: ship speed, ship manoeuvrability, the layout of the navigational bridge, the radar system, the number and the training of navigators, the presence of a look-out, etc. The main parameters affecting the navigational environment are ship traffic density, probability distributions of wind speeds ... probability, i.e. a study of the navigator's role in resolving critical situations; a causation factor is derived as a second step. The report documents the first step in a probabilistic collision damage analysis. Future work will include calculation of the energy released for crushing of structures, giving ...

  1. LTP-like plasticity in the visual system and in the motor system appear related in young and healthy subjects

    Directory of Open Access Journals (Sweden)

    Stefan eKlöppel

    2015-09-01

    Full Text Available LTP-like plasticity measured by visual evoked potentials (VEPs) can be induced in the intact human brain by presenting checkerboard reversals. Also associated with LTP-like plasticity, around two thirds of participants respond to transcranial magnetic stimulation with a paired associative stimulation (PAS) protocol with a potentiation of their motor evoked potentials. LTP-like processes are also required for verbal and motor learning tasks. We compared effect sizes, responder rates, and intercorrelations, as well as the potential influence of attention, between these four assessments in a group of 37 young and healthy volunteers. We observed a potentiation effect of the N75 and P100 VEP components which positively correlated with plasticity induced by PAS. Subjects with better subjective alertness were more likely to show PAS and VEP potentiation. No correlation was found between the other assessments. Effect sizes and responder rates of VEP potentiation were higher compared to PAS. Our results indicate a high variability of LTP-like effects and no evidence for a system-specific nature. As a consequence, studies wishing to assess individual levels of LTP-like plasticity should employ a combination of multiple assessments.

  2. Metaplasticity at CA1 Synapses by Homeostatic Control of Presynaptic Release Dynamics

    Directory of Open Access Journals (Sweden)

    Cary Soares

    2017-10-01

    Full Text Available Summary: Hebbian and homeostatic forms of plasticity operate on different timescales to regulate synaptic strength. The degree of mechanistic overlap between these processes and their mutual influence are still incompletely understood. Here, we report that homeostatic synaptic strengthening induced by prolonged network inactivity compromised the ability of CA1 synapses to exhibit LTP. This effect could not be accounted for by an obvious deficit in the postsynaptic capacity for LTP expression, since neither the fraction of silent synapses nor the ability to induce LTP by two-photon glutamate uncaging were reduced by the homeostatic process. Rather, optical quantal analysis reveals that homeostatically strengthened synapses display a reduced capacity to maintain glutamate release fidelity during repetitive stimulation, ultimately impeding the induction, and thus expression, of LTP. By regulating the short-term dynamics of glutamate release, the homeostatic process thus influences key aspects of dynamic network function and exhibits features of metaplasticity. Several forms of synaptic plasticity operating over distinct spatiotemporal scales have been described at hippocampal synapses. Whether these distinct plasticity mechanisms interact and influence one another remains incompletely understood. Here, Soares et al. show that homeostatic plasticity induced by network silencing influences short-term release dynamics and Hebbian plasticity rules at hippocampal synapses. Keywords: synapse, LTP, homeostatic plasticity, metaplasticity, iGluSnFR

  3. Performance of the upgraded LTP-II at the ALS Optical Metrology Laboratory

    International Nuclear Information System (INIS)

    Advanced Light Source; Yashchuk, Valeriy V; Kirschman, Jonathan L.; Domning, Edward E.; McKinney, Wayne R.; Morrison, Gregory Y.; Smith, Brian V.; Yashchuk, Valeriy V.

    2008-01-01

    The next generation of synchrotron and free electron laser facilities requires x-ray optical systems with extremely high performance, generally of diffraction-limited quality. Fabrication and use of such optics require adequate, highly accurate metrology and dedicated instrumentation. Previously, we suggested ways to improve the performance of the Long Trace Profiler (LTP), a slope-measuring instrument widely used to characterize x-ray optics at long spatial wavelengths. The main improvement is the use of a CCD detector and a corresponding technique for calibration of photo-response non-uniformity [J. L. Kirschman, et al., Proceedings of SPIE 6704, 67040J (2007)]. The present work focuses on the performance and characteristics of the upgraded LTP-II at the ALS Optical Metrology Laboratory. This includes a review of the overall aspects of the design, the control system, and the movement and measurement regimes for the stage, as well as an analysis of the performance by a slope measurement of a highly curved super-quality substrate with less than 0.3 microradian (rms) slope variation.

  4. Lack of LTP-like plasticity in primary motor cortex in Parkinson's disease.

    Science.gov (United States)

    Suppa, A; Marsili, L; Belvisi, D; Conte, A; Iezzi, E; Modugno, N; Fabbrini, G; Berardelli, A

    2011-02-01

    In this study in patients with Parkinson's disease (PD), off and on dopaminergic therapy, with and without L-dopa-induced dyskinesias (LIDs), we tested intermittent theta-burst stimulation (iTBS), a technique currently used for non-invasively inducing long-term potentiation (LTP)-like plasticity in primary motor cortex (M1). The study group comprised 20 PD patients on and off dopaminergic therapy (11 patients without and 9 patients with LIDs), and 14 age-matched healthy subjects. Patients had mild-to-moderate PD and no additional neuropsychiatric disorders. We clinically evaluated patients using the Unified Parkinson's Disease Rating Scale (UPDRS) and the Unified Dyskinesia Rating Scale (UDysRS). The left M1 was conditioned with iTBS at 80% active motor threshold intensity. Twenty motor evoked potentials (MEPs) were recorded from the right first interosseous muscle before and at 5, 15 and 30 min after iTBS. Between-group analysis of variance (ANOVA) testing healthy subjects versus patients with and without LIDs, on and off therapy, showed a significant interaction between the factors "Group" and "Time". After iTBS, MEP amplitudes in healthy subjects increased significantly at 5, 15 and 30 min, whereas in PD patients iTBS failed to increase MEP responses. This finding suggests a lack of iTBS-induced LTP-like plasticity in M1 in PD regardless of patients' clinical features. Copyright © 2010 Elsevier Inc. All rights reserved.

  5. Extinction of aversive taste memory homeostatically prevents the maintenance of in vivo insular cortex LTP: Calcineurin participation.

    Science.gov (United States)

    Rivera-Olvera, Alejandro; Nelson-Mora, Janikua; Gonsebatt, María E; Escobar, Martha L

    2018-04-06

    Accumulating evidence indicates that homeostatic plasticity mechanisms dynamically adjust synaptic strength to promote stability that is crucial for memory storage. Our previous studies have shown that prior training in conditioned taste aversion (CTA) prevents the subsequent induction of long-term potentiation (LTP) in the projection from the basolateral nucleus of the amygdala (Bla) to the insular cortex (IC) in vivo. We have also reported that induction of LTP in the Bla-IC pathway modifies CTA extinction. Memory extinction involves the formation of a new associative memory that inhibits a previously conditioned association. The aim of the present study was to analyze the effect of CTA extinction on the ability to induce subsequent LTP in the Bla-IC projection in vivo. Thus, 48 h after CTA extinction animals received high frequency stimulation in order to induce IC-LTP. Our results show that extinction training allows the induction but not the maintenance of IC-LTP. In addition, with the purpose of exploring part of the mechanisms involved in this process, and since a body of evidence suggests that the protein phosphatase calcineurin (CaN) is involved in the extinction of some behavioral tasks, we analyzed the participation of this phosphatase. The present results show that extinction training increases CaN expression in the IC, and that inhibition of this phosphatase reverts the effects of CTA extinction on IC-LTP. These findings reveal that CTA extinction promotes a homeostatic regulation of subsequent IC synaptic plasticity maintenance through increases in CaN levels. Copyright © 2018 Elsevier Inc. All rights reserved.

  6. BraLTP1, a lipid transfer protein gene involved in epicuticular wax deposition, cell proliferation and flower development in Brassica napus.

    Directory of Open Access Journals (Sweden)

    Fang Liu

    Full Text Available Plant non-specific lipid transfer proteins (nsLTPs) constitute large multigene families with complex physiological functions, many of which remain unclear. This study isolated and characterized the function of a lipid transfer protein gene, BraLTP1 from Brassica rapa, in the important oilseed crop Brassica napus. BraLTP1 encodes a predicted secretory protein in the little-known class VI of nsLTP families. Overexpression of BnaLTP1 in B. napus caused abnormal green coloration and reduced wax deposition on leaves, and detailed wax analysis revealed 17-80% reductions in various major wax components, which resulted in significant water loss relative to wild type. BnaLTP1-overexpressing leaves exhibited morphological disfiguration and abaxially curled leaf edges, and leaf cross-sections revealed cell overproliferation that was correlated with increased cytokinin levels (tZ, tZR, iP, and iPR) in leaves and high expression of the cytokinin biosynthesis gene IPT3. BnaLTP1-overexpressing plants also displayed morphological disfiguration of flowers, with early-onset and elongated carpel development and outwardly curled stamens. This was consistent with altered expression of a number of ABC-model genes related to flower development. Together, these results suggest that BraLTP1 is a new nsLTP gene involved in wax production or deposition, with additional direct or indirect effects on cell division and flower development.

  7. Overexpression of wheat lipid transfer protein gene TaLTP5 increases resistances to Cochliobolus sativus and Fusarium graminearum in transgenic wheat.

    Science.gov (United States)

    Zhu, Xiuliang; Li, Zhao; Xu, Huijun; Zhou, Miaoping; Du, Lipu; Zhang, Zengyan

    2012-08-01

    The fungus Cochliobolus sativus is the main pathogen of common root rot, a serious soil-borne disease of wheat (Triticum aestivum L.). The fungus Fusarium graminearum is the primary pathogen of Fusarium head blight, a devastating disease of wheat worldwide. In this study, the wheat lipid transfer protein gene TaLTP5 was cloned and evaluated for its ability to suppress disease development in transgenic wheat. TaLTP5 expression was induced after C. sativus infection. The TaLTP5 expression vector, pA25-TaLTP5, was constructed and bombarded into the Chinese wheat variety Yangmai 18. Six TaLTP5 transgenic wheat lines were established and characterized. PCR and Southern blot analyses indicated that the introduced TaLTP5 gene was integrated into the genomes of the six transgenic wheat lines in distinct patterns and was heritable. RT-PCR and real-time quantitative RT-PCR revealed that the TaLTP5 gene was over-expressed in the transgenic wheat lines compared to segregants lacking the transgene and wild-type wheat plants. Following challenge with C. sativus or F. graminearum, all six transgenic lines overexpressing TaLTP5 exhibited significantly enhanced resistance to both common root rot and Fusarium head blight compared to the untransformed wheat Yangmai 18.

  8. Barley lipid transfer protein, LTP1, contains a new type of lipid-like post-translational modification

    DEFF Research Database (Denmark)

    Lindorff-Larsen, Kresten; Lerche, Mathilde H.; Poulsen, Flemming Martin

    2001-01-01

    in which an aspartic acid in LTP1 is bound to the modification through what most likely is an ester bond. The chemical structure of the modification has been characterized by means of two-dimensional homo- and heteronuclear nuclear magnetic resonance spectroscopy as well as mass spectrometry and is found...

  9. D1/D5 Receptors and Histone Deacetylation Mediate the Gateway Effect of LTP in Hippocampal Dentate Gyrus

    Science.gov (United States)

    Huang, Yan-You; Lavine, Amir; Kandel, Denise B.; Yin, Deqi; Colnaghi, Luca; Drisaldi, Bettina; Kandel, Eric R.

    2014-01-01

    The dentate gyrus (DG) of the hippocampus is critical for spatial memory and is also thought to be involved in the formation of drug-related associative memory. Here, we attempt to test an aspect of the Gateway Hypothesis, by studying the effect of consecutive exposure to nicotine and cocaine on long-term synaptic potentiation (LTP) in the DG. We…

  10. Glutamate Receptor GluA1 Subunit Is Implicated in Capsaicin Induced Modulation of Amygdala LTP but Not LTD

    Science.gov (United States)

    Gebhardt, Christine; Albrecht, Doris

    2018-01-01

    Capsaicin has been shown to modulate synaptic plasticity in various brain regions including the amygdala. Whereas in the lateral amygdala the modulatory effect of capsaicin on long-term potentiation (LA-LTP) is mediated by TRPV1 channels, we have recently shown that capsaicin-induced enhancement of long term depression (LA-LTD) is mediated by…

  11. LTP in Hippocampal Area CA1 Is Induced by Burst Stimulation over a Broad Frequency Range Centered around Delta

    Science.gov (United States)

    Grover, Lawrence M.; Kim, Eunyoung; Cooke, Jennifer D.; Holmes, William R.

    2009-01-01

    Long-term potentiation (LTP) is typically studied using either continuous high-frequency stimulation or theta burst stimulation. Previous studies emphasized the physiological relevance of theta frequency; however, synchronized hippocampal activity occurs over a broader frequency range. We therefore tested burst stimulation at intervals from 100…

  12. Blockade of intracellular Zn2+ signaling in the dentate gyrus erases recognition memory via impairment of maintained LTP.

    Science.gov (United States)

    Tamano, Haruna; Minamino, Tatsuya; Fujii, Hiroaki; Takada, Shunsuke; Nakamura, Masatoshi; Ando, Masaki; Takeda, Atsushi

    2015-08-01

    There is no evidence on the precise role of synaptic Zn2+ signaling in the retention and recall of recognition memory. On the basis of the findings that intracellular Zn2+ signaling in the dentate gyrus is required for short-term object recognition memory, the present study deals with the effect of spatiotemporally blocking Zn2+ signaling in the dentate gyrus after LTP induction and learning. Three-day-maintained LTP was impaired 1 day after injection of clioquinol into the dentate gyrus, which transiently reduced intracellular Zn2+ signaling in the dentate gyrus. The irreversible impairment was rescued not only by co-injection of ZnCl2, which ameliorated the loss of Zn2+ signaling, but also by pre-injection of Jasplakinolide, a stabilizer of F-actin, prior to clioquinol injection. Simultaneously, 3-day-old space recognition memory was impaired 1 day after injection of clioquinol into the dentate gyrus, but not by pre-injection of Jasplakinolide. Jasplakinolide also rescued both impairments of 3-day-maintained LTP and 3-day-old memory after injection of ZnAF-2DA into the dentate gyrus, which blocked intracellular Zn2+ signaling in the dentate gyrus. The present paper indicates that the blockade and/or loss of intracellular Zn2+ signaling in the dentate gyrus coincidently impairs maintained LTP and recognition memory. The mechanism maintaining LTP via intracellular Zn2+ signaling in dentate granule cells, which may involve the formation of F-actin, may retain space recognition memory. © 2015 Wiley Periodicals, Inc.

  13. Redistribution of ionotropic glutamate receptors detected by laser microdissection of the rat dentate gyrus 48 h following LTP induction in vivo.

    Directory of Open Access Journals (Sweden)

    Jeremy T T Kennard

    Full Text Available The persistence and input specificity of long-term potentiation (LTP make it attractive as a mechanism of information storage. In its initial phase, both in vivo and in vitro studies have shown that LTP is associated with increased membrane localization of AMPA receptor subunits, but the molecular basis of LTP maintenance over the long-term is still unclear. We have previously shown that expression of AMPA and NMDA receptor subunits is elevated in whole homogenates prepared from dentate gyrus 48 h after LTP induction in vivo. In the present study, we utilized laser microdissection (LMD techniques to determine whether AMPA and NMDA receptor upregulation occurs specifically in the stimulated regions of the dentate gyrus dendritic arbor. Receptor proteins GluN1, GluA1 and GluA2, as well as postsynaptic density protein of 95 kDa and tubulin were detected by Western blot analysis in microdissected samples. Gradients of expression were observed for GluN1 and GluA2, decreasing from the inner to the outer zones of the molecular layer, and were independent of LTP. When induced at medial perforant path synapses, LTP was associated with an apparent specific redistribution of GluA1 and GluN1 to the middle molecular layer that contains these synapses. These data indicate that glutamate receptor proteins are delivered specifically to dendritic regions possessing LTP-expressing synapses, and that these changes are preserved for at least 48 h.

  14. Calcium microdomains near R-type calcium channels control the induction of presynaptic LTP at parallel fiber to Purkinje cell synapses

    Science.gov (United States)

    Myoga, Michael H.; Regehr, Wade G.

    2011-01-01

    R-type calcium channels in postsynaptic spines signal through functional calcium microdomains to regulate a calcium-calmodulin sensitive potassium channel that in turn regulates postsynaptic hippocampal LTP. Here we ask whether R-type calcium channels in presynaptic terminals also signal through calcium microdomains to control presynaptic LTP. We focus on presynaptic LTP at parallel fiber to Purkinje cell synapses in the cerebellum (PF-LTP), which is mediated by calcium/calmodulin-stimulated adenylyl cyclases. Although most presynaptic calcium influx is through N-type and P/Q-type calcium channels, blocking these channels does not disrupt PF-LTP, but blocking R-type calcium channels does. Moreover, global calcium signaling cannot account for the calcium dependence of PF-LTP because R-type channels contribute modestly to overall calcium entry. These findings indicate that within presynaptic terminals, R-type calcium channels produce calcium microdomains that evoke presynaptic LTP at moderate frequencies that do not greatly increase global calcium levels. PMID:21471358

  15. Probability Aggregates in Probability Answer Set Programming

    OpenAIRE

    Saad, Emad

    2013-01-01

    Probability answer set programming is a declarative programming paradigm that has been shown effective for representing and reasoning about a variety of probability reasoning tasks. However, the lack of probability aggregates, e.g. expected values, in the language of disjunctive hybrid probability logic programs (DHPP) disallows the natural and concise representation of many interesting problems. In this paper, we extend DHPP to allow arbitrary probability aggregates. We introduce two types of p...

  16. Advantages of the in-situ LTP distortion profile test on high-heat-load mirrors and applications

    International Nuclear Information System (INIS)

    Qian, S.; Jark, W.; Sostero, G.; Gambitta, A.; Mazzolini, F.; Savoia, A.

    1996-01-01

    The first in-situ distortion profile measurement of a high-heat-load mirror by use of the penta-prism LTP is presented. A maximum height distortion of 0.47 micron in the tangential direction over a length of 180 mm was measured for an internally water-cooled mirror of an undulator beam line at ELETTRA while exposed to a total emitted power of 600 W (undulator gap 30 mm and current 180 mA). The experiment has an accuracy and repeatability of 0.04 micron. The test schematic and the test equipment are presented. Two measuring methods, with the scanning penta-prism installed either outside or inside the vacuum chamber, are introduced. Advantages and some possible applications of adopting the penta-prism LTP for in-situ profile tests are explained.

  17. Scaling Qualitative Probability

    OpenAIRE

    Burgin, Mark

    2017-01-01

    There are different approaches to qualitative probability, which include subjective probability. We developed a representation of qualitative probability based on relational systems, which allows modeling uncertainty by probability structures and is more coherent than existing approaches. This setting makes it possible to prove that any comparative probability is induced by some probability structure (Theorem 2.1), that classical probability is a probability structure (Theorem 2.2) and that i...

  18. Wild-Type, but Not Mutant N296H, Human Tau Restores Aβ-Mediated Inhibition of LTP in Tau−/− mice

    Directory of Open Access Journals (Sweden)

    Mariana Vargas-Caballero

    2017-04-01

    Full Text Available Microtubule associated protein tau (MAPT) is involved in the pathogenesis of Alzheimer's disease and many forms of frontotemporal dementia (FTD). We recently reported that Aβ-mediated inhibition of hippocampal long-term potentiation (LTP) in mice requires tau. Here, we asked whether expression of human MAPT can restore Aβ-mediated inhibition on a mouse Tau−/− background and whether human tau with an FTD-causing mutation (N296H) can interfere with Aβ-mediated inhibition of LTP. We used transgenic mouse lines each expressing the full human MAPT locus using bacterial artificial chromosome technology. These lines expressed all six human tau protein isoforms on a Tau−/− background. We found that the human wild-type MAPT H1 locus was able to restore Aβ42-mediated impairment of LTP. In contrast, Aβ42 did not reduce LTP in slices from two independently generated transgenic lines expressing tau protein with the mutation N296H associated with frontotemporal dementia (FTD). Basal phosphorylation of tau, measured as the ratio of AT8/Tau5 immunoreactivity, was significantly reduced in N296H mutant hippocampal slices. Our data show that human MAPT is able to restore Aβ42-mediated inhibition of LTP in Tau−/− mice. These results provide further evidence that tau protein is central to Aβ-induced LTP impairment and provide a valuable tool for further analysis of the links between Aβ, human tau and impairment of synaptic function.

  19. Nitric oxide-induced calcium release: activation of type 1 ryanodine receptor by endogenous nitric oxide.

    Science.gov (United States)

    Kakizawa, Sho; Yamazawa, Toshiko; Iino, Masamitsu

    2013-01-01

    Ryanodine receptors (RyRs), located in the sarcoplasmic/endoplasmic reticulum (SR/ER) membrane, are required for intracellular Ca2+ release that is involved in a wide range of cellular functions. In addition to Ca2+-induced Ca2+ release in cardiac cells and voltage-induced Ca2+ release in skeletal muscle cells, we recently identified another mode of intracellular Ca2+ mobilization mediated by RyR, i.e., nitric oxide-induced Ca2+ release (NICR), in cerebellar Purkinje cells. NICR is evoked by neuronal activity, is dependent on S-nitrosylation of type 1 RyR (RyR1) and is involved in the induction of long-term potentiation (LTP) of cerebellar synapses. In this addendum, we examined whether peroxynitrite, which is produced by the reaction of nitric oxide with superoxide, may also have an effect on the Ca2+ release via RyR1 and the cerebellar LTP. We found that scavengers of peroxynitrite have no significant effect either on the Ca2+ release via RyR1 or on the cerebellar LTP. We also found that an application of a high concentration of peroxynitrite does not reproduce neuronal activity-dependent Ca2+ release in Purkinje cells. These results support the conclusion that NICR is induced by endogenous nitric oxide produced by neuronal activity through S-nitrosylation of RyR1.

  20. Mimicking Neurotransmitter Release in Chemical Synapses via Hysteresis Engineering in MoS2 Transistors.

    Science.gov (United States)

    Arnold, Andrew J; Razavieh, Ali; Nasr, Joseph R; Schulman, Daniel S; Eichfeld, Chad M; Das, Saptarshi

    2017-03-28

    Neurotransmitter release in chemical synapses is fundamental to diverse brain functions such as motor action, learning, cognition, emotion, perception, and consciousness. Moreover, improper functioning or abnormal release of neurotransmitter is associated with numerous neurological disorders such as epilepsy, sclerosis, schizophrenia, Alzheimer's disease, and Parkinson's disease. We have utilized hysteresis engineering in a back-gated MoS2 field effect transistor (FET) in order to mimic such neurotransmitter release dynamics in chemical synapses. All three essential features, i.e., the quantal, stochastic, and excitatory or inhibitory nature of neurotransmitter release, were accurately captured in our experimental demonstration. We also mimicked an important phenomenon called long-term potentiation (LTP), which forms the basis of human memory. Finally, we demonstrated how to engineer the LTP time by operating the MoS2 FET in different regimes. Our findings could provide a critical component toward the design of next-generation smart and intelligent human-like machines and human-machine interfaces.
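The quantal and stochastic release features captured by the device are conventionally described by the binomial model of synaptic transmission: N independent release sites, each releasing one vesicle of quantal size q with probability p per stimulus. A minimal simulation sketch of that model (the function name and parameter values are illustrative, not taken from the paper):

```python
import random

def evoked_response(n_sites=10, p_release=0.3, quantal_size=1.0, rng=random):
    """Simulate one stimulus: each of n_sites releases a vesicle
    independently with probability p_release (binomial model)."""
    released = sum(1 for _ in range(n_sites) if rng.random() < p_release)
    return quantal_size * released  # response is quantized in units of quantal_size

random.seed(0)
trials = [evoked_response() for _ in range(10_000)]
mean_resp = sum(trials) / len(trials)
# Theoretical mean is n_sites * p_release * quantal_size = 3.0
print(round(mean_resp, 2))
```

Trial-to-trial variability around that mean is what makes the release stochastic, and the integer multiples of quantal_size are what make it quantal.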

  1. On Probability Leakage

    OpenAIRE

    Briggs, William M.

    2012-01-01

    The probability leakage of model M with respect to evidence E is defined. Probability leakage is a kind of model error. It occurs when M implies that events y, which are impossible given E, have positive probability. Leakage does not imply model falsification. Models with probability leakage cannot be calibrated empirically. Regression models, which are ubiquitous in statistical practice, often evince probability leakage.
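The regression example can be made concrete: if evidence E says the response cannot be negative (a weight, a duration), a fitted normal regression model M still places probability mass below zero, and that mass is leakage. A minimal sketch with illustrative values assumed for the fitted mean and standard deviation:

```python
import math

def normal_cdf(x, mu, sigma):
    """P(Y <= x) for Y ~ Normal(mu, sigma), via the error function."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

# Suppose a fitted normal regression predicts mu = 2.0, sigma = 1.5 at some
# covariate value, while evidence E says y < 0 is impossible. The leaked mass:
leakage = normal_cdf(0.0, mu=2.0, sigma=1.5)
print(f"P(y < 0 | M) = {leakage:.4f}")  # positive, though E rules y < 0 out
```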

  2. Reduction in LFP cross-frequency coupling between theta and gamma rhythms associated with impaired STP and LTP in a rat model of brain ischemia

    Directory of Open Access Journals (Sweden)

    Tao eZhang

    2013-04-01

    Full Text Available The theta-gamma cross-frequency coupling (CFC) in the hippocampus was reported to reflect memory processes. In this study, we measured the CFC of hippocampal local field potentials (LFPs) in a two-vessel occlusion (2VO) rat model, combined with both amplitude and phase properties and associated with short- and long-term plasticity indicating memory function. Male Wistar rats were used and a 2VO model was established. STP and LTP were recorded in the hippocampal CA3-CA1 pathway after LFPs were collected in both CA3 and CA1. Based on the data of relative power spectra and phase synchronization, it is suggested that both the amplitude and phase coupling of either theta or gamma rhythm were involved in modulating the neural network in 2VO rats. In order to determine whether the CFC was also implicated in neural impairment in 2VO rats, the coupling of CA3 theta–CA1 gamma was measured by both phase-phase coupling (n:m phase synchronization) and phase-amplitude coupling. The attenuated CFC strength in 2VO rats implied impaired neural communication in the coordination of the theta-gamma entraining process. Moreover, in addition to the modulation index (MI), a novel algorithm named cross-frequency conditional mutual information (CF-CMI) was developed to focus on the coupling between theta phase and the phase of the gamma amplitude. The results suggest that the reduced CFC strength was probably attributable to the disruption of the phase of the CA1 gamma envelope. In conclusion, the phase coupling and CFC of hippocampal theta and gamma played an important role in supporting the functions of the neural network. Furthermore, synaptic plasticity in the CA3-CA1 pathway was reduced in line with the decreased CFC strength from CA3 to CA1. This partly supports our hypothesis that a directional CFC indicator might be used as a measure of synaptic plasticity.
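The phase-amplitude side of this comparison can be illustrated with the widely used Tort-style modulation index: bin the gamma amplitude by theta phase and measure how far the binned distribution departs from uniform (KL divergence normalized by the log of the bin count). This is a generic sketch of MI on synthetic data, not the authors' CF-CMI algorithm or code:

```python
import numpy as np

def modulation_index(phase, amplitude, n_bins=18):
    """Tort-style phase-amplitude modulation index: KL divergence of the
    phase-binned mean-amplitude distribution from uniform, normalized by log(n_bins)."""
    edges = np.linspace(0, 2 * np.pi, n_bins + 1)
    bins = np.digitize(phase % (2 * np.pi), edges) - 1
    mean_amp = np.array([amplitude[bins == b].mean() for b in range(n_bins)])
    p = mean_amp / mean_amp.sum()
    return np.sum(p * np.log(p * n_bins)) / np.log(n_bins)

# Synthetic example: a 6 Hz theta phase modulates the gamma envelope.
fs = 1000.0
t = np.arange(0, 10, 1 / fs)
theta_phase = 2 * np.pi * 6 * t                   # instantaneous theta phase
coupled_amp = 1.0 + 0.8 * np.cos(theta_phase)     # gamma envelope locked to theta
uncoupled_amp = np.ones_like(t)                   # flat envelope, no coupling

print(modulation_index(theta_phase, coupled_amp))    # clearly > 0
print(modulation_index(theta_phase, uncoupled_amp))  # ~ 0
```

In practice the phase and envelope would come from band-pass filtering and a Hilbert transform of the recorded LFP rather than being constructed analytically as here.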

  3. Probability 1/e

    Science.gov (United States)

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
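The record does not reproduce the article's three problems, but a classic member of the 1/e family is the matching (derangement) problem: the probability that a random permutation of n items leaves no item in its original position tends to 1/e ≈ 0.3679 as n grows. A quick Monte Carlo check (illustrative, not taken from the article):

```python
import math
import random

def no_fixed_point_probability(n=10, trials=100_000, seed=42):
    """Estimate the probability that a random permutation of n items
    has no fixed point (i.e., is a derangement)."""
    rng = random.Random(seed)
    items = list(range(n))
    hits = 0
    for _ in range(trials):
        perm = items[:]
        rng.shuffle(perm)
        if all(perm[i] != i for i in range(n)):
            hits += 1
    return hits / trials

estimate = no_fixed_point_probability()
print(estimate, 1 / math.e)  # the estimate should be close to 0.3679
```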

  4. Probability an introduction

    CERN Document Server

    Goldberg, Samuel

    1960-01-01

    Excellent basic text covers set theory, probability theory for finite sample spaces, binomial theorem, probability distributions, means, standard deviations, probability function of binomial distribution, more. Includes 360 problems with answers for half.

  5. Quantum probability measures and tomographic probability densities

    NARCIS (Netherlands)

    Amosov, GG; Man'ko, [No Value

    2004-01-01

    Using a simple relation of the Dirac delta-function to the generalized theta-function, the relationship between the tomographic probability approach and the quantum probability measure approach to the description of quantum states is discussed. The quantum state tomogram expressed in terms of the

  6. A Voltage-Based STDP Rule Combined with Fast BCM-Like Metaplasticity Accounts for LTP and Concurrent "Heterosynaptic" LTD in the Dentate Gyrus In Vivo.

    Directory of Open Access Journals (Sweden)

    Peter Jedlicka

    2015-11-01

    Full Text Available Long-term potentiation (LTP) and long-term depression (LTD) are widely accepted to be synaptic mechanisms involved in learning and memory. It remains uncertain, however, which particular activity rules are utilized by hippocampal neurons to induce LTP and LTD in behaving animals. Recent experiments in the dentate gyrus of freely moving rats revealed an unexpected pattern of LTP and LTD from high-frequency perforant path stimulation. While 400 Hz theta-burst stimulation (400-TBS) and 400 Hz delta-burst stimulation (400-DBS) elicited substantial LTP of the tetanized medial path input and, concurrently, LTD of the non-tetanized lateral path input, 100 Hz theta-burst stimulation (100-TBS), a normally efficient LTP protocol for in vitro preparations, produced only weak LTP and concurrent LTD. Here we show in a biophysically realistic compartmental granule cell model that this pattern of results can be accounted for by a voltage-based spike-timing-dependent plasticity (STDP) rule combined with a relatively fast Bienenstock-Cooper-Munro (BCM)-like homeostatic metaplasticity rule, all on a background of ongoing spontaneous activity in the input fibers. Our results suggest that, at least for dentate granule cells, the interplay of STDP-BCM plasticity rules and ongoing pre- and postsynaptic background activity determines not only the degree of input-specific LTP elicited by various plasticity-inducing protocols, but also the degree of associated LTD in neighboring non-tetanized inputs, as generated by the ongoing constitutive activity at these synapses.
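The BCM-like metaplasticity component can be sketched in isolation from the compartmental model: the sign of the weight change depends on whether postsynaptic activity y exceeds a modification threshold theta, and theta itself slides toward a running average of y². A minimal rate-based Euler-step sketch with illustrative parameters (not those of the published granule cell model):

```python
def bcm_step(w, x, y, theta, eta=0.01, tau_theta=50.0, y0=1.0):
    """One Euler step of a BCM-like rule with a sliding modification threshold:
    dw = eta * x * y * (y - theta);  dtheta = (y^2 / y0 - theta) / tau_theta."""
    w_new = max(0.0, w + eta * x * y * (y - theta))  # weights clamped non-negative
    theta_new = theta + (y * y / y0 - theta) / tau_theta
    return w_new, theta_new

w, theta = 0.5, 1.0
signs = []
# Sustained strong postsynaptic activity (y = 2): initially y > theta, so the
# input potentiates; theta then slides toward y^2 = 4, and once theta > y the
# very same activity level produces depression instead.
for _ in range(600):
    w, theta = bcm_step(w, x=1.0, y=2.0, theta=theta)
    signs.append(1 if 2.0 > theta else -1)
print(signs[0], signs[-1], round(theta, 2))  # early potentiation, late depression
```

The sliding threshold is what lets one protocol potentiate the tetanized input while concurrently depressing inputs whose activity stays below the raised threshold.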

  7. NF-κB p50 subunit knockout impairs late LTP and alters long term memory in the mouse hippocampus

    Directory of Open Access Journals (Sweden)

    Oikawa Kensuke

    2012-07-01

    Full Text Available Abstract Background Nuclear factor kappa B (NF-κB) is a transcription factor typically expressed with two specific subunits (p50, p65). Investigators have reported that NF-κB is activated during the induction of in vitro long term potentiation (LTP), a paradigm of synaptic plasticity and correlate of memory, suggesting that NF-κB may be necessary for some aspects of memory encoding. Furthermore, NF-κB has been implicated as a potential requirement in behavioral tests of memory. Unfortunately, very little work has been done to explore the effects of deleting specific NF-κB subunits on memory. Studies have shown that NF-κB p50 subunit deletion (p50−/−) leads to memory deficits, however some recent studies suggest the contrary, where p50−/− mice show enhanced memory in the Morris water maze (MWM). To more critically explore the role of the NF-κB p50 subunit in synaptic plasticity and memory, we assessed long term spatial memory in vivo using the MWM, and synaptic plasticity in vitro utilizing high frequency stimuli capable of eliciting LTP in slices from the hippocampus of NF-κB p50−/− versus their controls (p50+/+). Results We found that the lack of the NF-κB p50 subunit led to significant decreases in late LTP and to selective but significant alterations in MWM tests (i.e., some improvements during acquisition, but deficits during retention). Conclusions These results support the hypothesis that the NF-κB p50 subunit is required for long term spatial memory in the hippocampus.

  8. Pomegranate ( Punica granatum L.) expresses several nsLTP isoforms characterized by different immunoglobulin E-binding properties.

    Science.gov (United States)

    Bolla, Michela; Zenoni, Sara; Scheurer, Stephan; Vieths, Stefan; San Miguel Moncin, Maria Del Mar; Olivieri, Mario; Antico, Andrea; Ferrer, Marta; Berroa, Felicia; Enrique, Ernesto; Avesani, Linda; Marsano, Francesco; Zoccatelli, Gianni

    2014-01-01

    Pomegranate allergy is associated with sensitization to non-specific lipid transfer proteins (nsLTPs). Our aim was to identify and characterize the nsLTPs expressed in pomegranate at the molecular level and to study their allergenic properties in terms of immunoglobulin E (IgE)-binding and cross-reactivity with peach nsLTP (Pru p 3). A non-equilibrium two-dimensional (2-D) electrophoretic approach based on acid-urea PAGE and sodium dodecyl sulfate PAGE was set up to separate pomegranate nsLTPs. Their immunoreactivity was tested by immunoblotting carried out with anti-Pru p 3 polyclonal antibodies and sera from pomegranate-allergic patients. For final identification, pomegranate nsLTPs were purified by chromatography and subjected to trypsin digestion and mass spectrometry (MS) analysis. For this purpose, the sequences obtained by cDNA cloning of three pomegranate nsLTPs were integrated into the database that was subsequently searched for MS data interpretation. Four nsLTPs were identified by 2-D immunoblotting. The detected proteins showed different IgE-binding capacity and partial cross-reactivity with Pru p 3. cDNA cloning and MS analyses led to the identification of three nsLTP isoforms with 66-68% amino acid sequence identity, named Pun g 1.0101, Pun g 1.0201 and Pun g 1.0301. By 2-D electrophoresis, we could separate different nsLTP isoforms possessing different IgE-binding properties, which might reflect distinct allergenic potencies. The contribution of Pru p 3 to prime sensitization is not as central as in other plant nsLTPs. © 2014 S. Karger AG, Basel.

  9. Deficits in LTP induction by 5-HT2A receptor antagonist in a mouse model for fragile X syndrome.

    Directory of Open Access Journals (Sweden)

    Zhao-hui Xu

    Full Text Available Fragile X syndrome is a common inherited form of mental retardation caused by the lack of fragile X mental retardation protein (FMRP) because of Fmr1 gene silencing. Serotonin (5-HT) is significantly increased in the null mutants of Drosophila Fmr1, and elevated 5-HT brain levels result in cognitive and behavioral deficits in human patients. The serotonin type 2A receptor (5-HT2AR) is highly expressed in the cerebral cortex; it acts on pyramidal cells and GABAergic interneurons to modulate cortical functions. 5-HT2AR and FMRP both regulate synaptic plasticity. Therefore, the lack of FMRP may affect serotoninergic activity. In this study, we determined the involvement of FMRP in the 5-HT modulation of synaptic potentiation with the use of primary cortical neuron culture and brain slice recording. Pharmacological inhibition of 5-HT2AR by R-96544 or ketanserin facilitated long-term potentiation (LTP) in the anterior cingulate cortex (ACC) of WT mice. The prefrontal LTP induction was dependent on the activation of NMDARs and elevation of postsynaptic Ca2+ concentrations. By contrast, inhibition of 5-HT2AR could not restore the induction of LTP in the ACC of Fmr1 knock-out mice. Furthermore, 5-HT2AR inhibition induced AMPA receptor GluR1 subtype surface insertion in the cultured ACC neurons of Fmr1 WT mice; however, GluR1 surface insertion by inhibition of 5-HT2AR was impaired in the neurons of Fmr1 KO mice. These findings suggest that FMRP is involved in serotonin receptor signaling and contributes to GluR1 surface expression induced by 5-HT2AR inactivation.

  10. Toward a generalized probability theory: conditional probabilities

    International Nuclear Information System (INIS)

    Cassinelli, G.

    1979-01-01

    The main mathematical object of interest in the quantum logic approach to the foundations of quantum mechanics is the orthomodular lattice and a set of probability measures, or states, defined by the lattice. This mathematical structure is studied per se, independently from the intuitive or physical motivation of its definition, as a generalized probability theory. It is thought that the building-up of such a probability theory could eventually throw light on the mathematical structure of Hilbert-space quantum mechanics as a particular concrete model of the generalized theory. (Auth.)

  11. Characterization of a new antifungal non-specific lipid transfer protein (nsLTP) from sugar beet leaves

    DEFF Research Database (Denmark)

    Kristensen, A K; Brunstedt, J; Madsen, M T

    2000-01-01

    A novel protein (IWF5) comprising 92 amino acids has been purified from the intercellular washing fluid of sugar beet leaves using cation exchange chromatography and reversed phase high performance liquid chromatography. Based on amino acid sequence homology, including the presence of eight cysteines at conserved positions, the protein can be classified as a member of the plant family of non-specific lipid transfer proteins (nsLTPs). The protein is 47% identical to IWF1, an antifungal nsLTP previously isolated from leaves of sugar beet. A potential site for N-linked glycosylation present

  12. Ischemic long-term-potentiation (iLTP): perspectives to set the threshold of neural plasticity toward therapy

    Directory of Open Access Journals (Sweden)

    Maximilian Lenz

    2015-01-01

    Full Text Available The precise role of neural plasticity under pathological conditions remains poorly understood. It appears to be well accepted, however, that changes in the ability of neurons to express plasticity accompany neurological diseases. Here, we discuss recent experimental evidence which suggests that synaptic plasticity induced by a pathological stimulus, i.e., ischemic long-term-potentiation (iLTP) of excitatory synapses, could play an important role in post-stroke recovery by influencing the post-lesional reorganization of surviving neuronal networks.

  13. The effect of acute swim stress and training in the water maze on hippocampal synaptic activity as well as plasticity in the dentate gyrus of freely moving rats: revisiting swim-induced LTP reinforcement.

    Science.gov (United States)

    Tabassum, Heena; Frey, Julietta U

    2013-12-01

    Hippocampal long-term potentiation (LTP) is a cellular model of learning and memory. An early form of LTP (E-LTP) can be reinforced into its late form (L-LTP) by various behavioral interactions within a specific time window ("behavioral LTP-reinforcement"). Depending on the type and procedure used, various studies have shown that stress differentially affects synaptic plasticity. Under low stress, such as novelty detection or mild foot shocks, E-LTP can be transformed into L-LTP in the rat dentate gyrus (DG). A reinforcing effect of a 2-min swim, however, has so far been shown only in a few studies (Korz and Frey (2003) J Neurosci 23:7281-7287; Korz and Frey (2005) J Neurosci 25:7393-7400; Ahmed et al. (2006) J Neurosci 26:3951-3958; Sajikumar et al. (2007) J Physiol 584.2:389-400). We have reinvestigated these studies using both the original and an improved recording technique, which allowed the recording of field excitatory postsynaptic potentials (fEPSP) and the population spike amplitude (PSA) at their places of generation in freely moving rats. We show that acute swim stress led to a long-term depression (LTD) in baseline values of PSA and partially of fEPSP. In contrast to the earlier studies, an LTP-reinforcement by swimming could never be reproduced. Our results indicate that 2-min swim stress negatively influenced both synaptic potentials and E-LTP. Copyright © 2013 Wiley Periodicals, Inc.

  14. Determining probabilities of geologic events and processes

    International Nuclear Information System (INIS)

    Hunter, R.L.; Mann, C.J.; Cranwell, R.M.

    1985-01-01

    The Environmental Protection Agency has recently published a probabilistic standard for releases of high-level radioactive waste from a mined geologic repository. The standard sets limits for contaminant releases with more than one chance in 100 of occurring within 10,000 years, and less strict limits for releases of lower probability. The standard offers no methods for determining probabilities of geologic events and processes, and no consensus exists in the waste-management community on how to do this. Sandia National Laboratories is developing a general method for determining probabilities of a given set of geologic events and processes. In addition, we will develop a repeatable method for dealing with events and processes whose probability cannot be determined. 22 refs., 4 figs
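A standard of this two-tier form is naturally checked against a complementary cumulative distribution function (CCDF) over scenarios: each scenario contributes its probability to every release level it exceeds, and compliance requires the exceedance probability at each limit to stay below the corresponding bound. A toy compliance check with made-up scenario probabilities, releases, and limit values (not the EPA rule's actual numbers):

```python
def exceedance_probability(scenarios, release_limit):
    """Sum the probabilities of all scenarios whose release exceeds the limit;
    this is one point on the CCDF of normalized release."""
    return sum(p for p, release in scenarios if release > release_limit)

# (probability over 10,000 years, normalized release) -- illustrative values only
scenarios = [(0.90, 0.01), (0.08, 0.5), (0.015, 2.0), (0.005, 12.0)]

# Illustrative two-tier limits: P(release > 1) <= 0.1 and P(release > 10) <= 0.001
print(exceedance_probability(scenarios, 1.0))   # 0.02  -> within a 0.1 bound
print(exceedance_probability(scenarios, 10.0))  # 0.005 -> violates a 0.001 bound
```

The hard part the abstract identifies is, of course, not this bookkeeping but assigning the scenario probabilities in the first place.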

  15. Boosting the LTP-like plasticity effect of intermittent theta-burst stimulation using gamma transcranial alternating current stimulation.

    Science.gov (United States)

    Guerra, Andrea; Suppa, Antonio; Bologna, Matteo; D'Onofrio, Valentina; Bianchini, Edoardo; Brown, Peter; Di Lazzaro, Vincenzo; Berardelli, Alfredo

    2018-03-24

    Transcranial Alternating Current Stimulation (tACS) consists of delivering electric current to the brain using an oscillatory pattern that may entrain the rhythmic activity of cortical neurons. When delivered at gamma frequency, tACS modulates motor performance and GABA-A-ergic interneuron activity. Since interneuronal discharges play a crucial role in brain plasticity phenomena, here we co-stimulated the primary motor cortex (M1) in healthy subjects by means of tACS during intermittent theta-burst stimulation (iTBS), a transcranial magnetic stimulation paradigm known to induce long-term potentiation (LTP)-like plasticity. We measured and compared motor evoked potentials before and after gamma, beta and sham tACS-iTBS. While we delivered gamma-tACS, we also measured short-interval intracortical inhibition (SICI) to detect any changes in GABA-A-ergic neurotransmission. Gamma, but not beta or sham tACS, significantly boosted and prolonged the iTBS-induced after-effects. Interestingly, the extent of the gamma tACS-iTBS after-effects correlated directly with SICI changes. Overall, our findings point to a link between gamma oscillations, interneuronal GABA-A-ergic activity and LTP-like plasticity in the human M1. Gamma tACS-iTBS co-stimulation might represent a new strategy to enhance and prolong responses to plasticity-inducing protocols, thereby lending itself to future applications in the neurorehabilitation setting. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Characteristic of Extracellular Zn2+ Influx in the Middle-Aged Dentate Gyrus and Its Involvement in Attenuation of LTP.

    Science.gov (United States)

    Takeda, Atsushi; Koike, Yuta; Osaw, Misa; Tamano, Haruna

    2018-03-01

    An increased influx of extracellular Zn2+ into neurons is a cause of cognitive decline. The influx of extracellular Zn2+ into dentate granule cells was compared between young and middle-aged rats because of the vulnerability of the dentate gyrus to aging. The influx of extracellular Zn2+ into dentate granule cells was increased in middle-aged rats after injection of AMPA and high K+ into the dentate gyrus, but not in young rats. Simultaneously, high K+-induced attenuation of LTP was observed in middle-aged rats, but not in young rats. The attenuation was rescued by co-injection of CaEDTA, an extracellular Zn2+ chelator. Intracellular Zn2+ in dentate granule cells was also increased in middle-aged slices with high K+, in which the increase in extracellular Zn2+ was the same as in young slices with high K+, suggesting that the ability of extracellular Zn2+ to flow into dentate granule cells is greater in middle-aged rats. Furthermore, the extracellular zinc concentration in the hippocampus increased age-dependently. The present study suggests that the influx of extracellular Zn2+ into dentate granule cells is more readily increased in middle-aged rats and that this increase is a cause of age-related attenuation of LTP in the dentate gyrus.

  17. Philosophical theories of probability

    CERN Document Server

    Gillies, Donald

    2000-01-01

    The Twentieth Century has seen a dramatic rise in the use of probability and statistics in almost all fields of research. This has stimulated many new philosophical ideas on probability. Philosophical Theories of Probability is the first book to present a clear, comprehensive and systematic account of these various theories and to explain how they relate to one another. Gillies also offers a distinctive version of the propensity theory of probability, and the intersubjective interpretation, which develops the subjective theory.

  18. Non-Archimedean Probability

    NARCIS (Netherlands)

    Benci, Vieri; Horsten, Leon; Wenmackers, Sylvia

    We propose an alternative approach to probability theory closely related to the framework of numerosity theory: non-Archimedean probability (NAP). In our approach, unlike in classical probability theory, all subsets of an infinite sample space are measurable and only the empty set gets assigned

  19. Interpretations of probability

    CERN Document Server

    Khrennikov, Andrei

    2009-01-01

    This is the first fundamental book devoted to non-Kolmogorov probability models. It provides a mathematical theory of negative probabilities, with numerous applications to quantum physics, information theory, complexity, biology and psychology. The book also presents an interesting model of cognitive information reality with flows of information probabilities, describing the process of thinking, social, and psychological phenomena.

  20. A mathematical model for predicting the probability of acute mortality in a human population exposed to accidentally released airborne radionuclides. Final report for Phase I of the project: early effects of inhaled radionuclides

    International Nuclear Information System (INIS)

    Filipy, R.E.; Borst, F.J.; Cross, F.T.; Park, J.F.; Moss, O.R.

    1980-06-01

    The report presents a mathematical model for the purpose of predicting the fraction of human population which would die within 1 year of an accidental exposure to airborne radionuclides. The model is based on data from laboratory experiments with rats, dogs and baboons, and from human epidemiological data. Doses from external, whole-body irradiation and from inhaled, alpha- and beta-emitting radionuclides are calculated for several organs. The probabilities of death from radiation pneumonitis and from bone marrow irradiation are predicted from doses accumulated within 30 days of exposure to the radioactive aerosol. The model is compared with existing similar models under hypothetical exposure conditions. Suggestions for further experiments with inhaled radionuclides are included
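When early death can arise from either of two roughly independent modes, as with radiation pneumonitis and bone marrow failure here, the per-mode probabilities are commonly combined via the complement of joint survival, P(death) = 1 − Π(1 − p_i). A generic sketch of that combination step with illustrative probabilities (not the report's actual dose-response functions):

```python
def combined_mortality(p_modes):
    """Overall probability of death when each failure mode kills independently:
    P(death) = 1 - product over modes of (1 - p_mode)."""
    survival = 1.0
    for p in p_modes:
        survival *= 1.0 - p
    return 1.0 - survival

# Illustrative per-mode probabilities derived from 30-day committed doses
p_pneumonitis, p_marrow = 0.20, 0.10
print(combined_mortality([p_pneumonitis, p_marrow]))  # ~0.28, not 0.30:
# simple addition double-counts the cases where both modes would kill
```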

  1. Foundations of probability

    International Nuclear Information System (INIS)

    Fraassen, B.C. van

    1979-01-01

    The interpretation of probabilities in physical theories, whether quantum or classical, is considered. The following points are discussed: 1) the functions P(μ, Q), in terms of which states and propositions can be represented, are classical (Kolmogoroff) probabilities, formally speaking; 2) these probabilities are generally interpreted as themselves conditional, and the conditions are mutually incompatible where the observables are maximal; and 3) testing of the theory typically takes the form of confronting the expectation values of observable Q calculated with probability measures P(μ, Q) for states μ; hence, of comparing the probabilities P(μ, Q)(E) with the frequencies of occurrence of the corresponding events. It seems that even the interpretation of quantum mechanics, in so far as it concerns what the theory says about the empirical (i.e. actual, observable) phenomena, deals with the confrontation of classical probability measures with observable frequencies. This confrontation is studied. (Auth./C.F.)

  2. The quantum probability calculus

    International Nuclear Information System (INIS)

    Jauch, J.M.

    1976-01-01

    The Wigner anomaly (1932) for the joint distribution of noncompatible observables is an indication that the classical probability calculus is not applicable for quantum probabilities. It should, therefore, be replaced by another, more general calculus, which is specifically adapted to quantal systems. In this article this calculus is exhibited and its mathematical axioms and the definitions of the basic concepts such as probability field, random variable, and expectation values are given. (B.R.H)

  3. Choice Probability Generating Functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel L; Bierlaire, Michel

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications.

  4. Probability of satellite collision

    Science.gov (United States)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.
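The abstract does not give the formula, but a common first-order approach to this kind of estimate (not necessarily the one used in the paper) treats debris encounters as a Poisson process: with object flux F per unit area and time, collision cross-section A, and mission duration T, the collision probability is P = 1 - exp(-F*A*T). A minimal sketch, with made-up numbers:

```python
import math

def collision_probability(flux_per_m2_yr, cross_section_m2, years):
    """Poisson encounter model: P = 1 - exp(-flux * area * time)."""
    expected_hits = flux_per_m2_yr * cross_section_m2 * years
    return 1.0 - math.exp(-expected_hits)

# Hypothetical numbers for illustration only (not from the paper).
p = collision_probability(flux_per_m2_yr=1e-6, cross_section_m2=200.0, years=10.0)
print(f"{p:.6f}")  # → 0.001998
```

For small expected hit counts, P is close to F*A*T itself, which is why such estimates scale roughly linearly in both cross-section and mission duration.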

  5. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2013-01-01

    This paper considers discrete choice, with choice probabilities coming from maximization of preferences from a random utility field perturbed by additive location shifters (ARUM). Any ARUM can be characterized by a choice-probability generating function (CPGF) whose gradient gives the choice probabilities, and every CPGF is consistent with an ARUM. We relate CPGF to multivariate extreme value distributions, and review and extend methods for constructing CPGF for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended...

  6. Handbook of probability

    CERN Document Server

    Florescu, Ionut

    2013-01-01

    THE COMPLETE COLLECTION NECESSARY FOR A CONCRETE UNDERSTANDING OF PROBABILITY Written in a clear, accessible, and comprehensive manner, the Handbook of Probability presents the fundamentals of probability with an emphasis on the balance of theory, application, and methodology. Utilizing basic examples throughout, the handbook expertly transitions between concepts and practice to allow readers an inclusive introduction to the field of probability. The book provides a useful format with self-contained chapters, allowing the reader easy and quick reference. Each chapter includes an introduction...

  7. Real analysis and probability

    CERN Document Server

    Ash, Robert B; Lukacs, E

    1972-01-01

    Real Analysis and Probability provides the background in real analysis needed for the study of probability. Topics covered range from measure and integration theory to functional analysis and basic concepts of probability. The interplay between measure theory and topology is also discussed, along with conditional probability and expectation, the central limit theorem, and strong laws of large numbers with respect to martingale theory.Comprised of eight chapters, this volume begins with an overview of the basic concepts of the theory of measure and integration, followed by a presentation of var

  8. Beta-Adrenergic Receptor Activation during Distinct Patterns of Stimulation Critically Modulates the PKA-Dependence of LTP in the Mouse Hippocampus

    Science.gov (United States)

    Gelinas, Jennifer N.; Tenorio, Gustavo; Lemon, Neal; Abel, Ted; Nguyen, Peter V.

    2008-01-01

    Activation of Beta-adrenergic receptors (Beta-ARs) enhances hippocampal memory consolidation and long-term potentiation (LTP), a likely mechanism for memory storage. One signaling pathway linked to Beta-AR activation is the cAMP-PKA pathway. PKA is critical for the consolidation of hippocampal long-term memory and for the expression of some forms…

  9. DEVELOPMENTAL LEAD (PB) EXPOSURE REDUCES THE ABILITY OF THE NMDA ANTAGONIST MK801 TO SUPPRESS LONG-TERM POTENTIATION (LTP) IN THE RAT DENTATE GYRUS, IN VIVO

    Science.gov (United States)

    Chronic developmental lead (Pb) exposure increases the threshold and enhances decay of long-term potentiation (LTP) in the dentate gyrus of the hippocampal formation. MK-801 and other antagonists of the N-methyl-D-aspartate (NMDA) glutamate receptor subtype impair induction of LT...

  10. Functional optical probing of the hippocampal trisynaptic circuit in vitro: network dynamics, filter properties, and polysynaptic induction of CA1 LTP.

    Science.gov (United States)

    Stepan, Jens; Dine, Julien; Eder, Matthias

    2015-01-01

    Decades of brain research have identified various parallel loops linking the hippocampus with neocortical areas, enabling the acquisition of spatial and episodic memories. Especially the hippocampal trisynaptic circuit [entorhinal cortex layer II → dentate gyrus (DG) → cornu ammonis (CA)-3 → CA1] was studied in great detail because of its seemingly simple connectivity and characteristic structures that are experimentally well accessible. While numerous researchers focused on functional aspects, obtained from a limited number of cells in distinct hippocampal subregions, little is known about the neuronal network dynamics which drive information across multiple synapses for subsequent long-term storage. Fast voltage-sensitive dye imaging in vitro allows real-time recording of activity patterns in large/meso-scale neuronal networks with high spatial resolution. In this way, we recently found that entorhinal theta-frequency input to the DG most effectively passes filter mechanisms of the trisynaptic circuit network, generating activity waves which propagate across the entire DG-CA axis. These "trisynaptic circuit waves" involve high-frequency firing of CA3 pyramidal neurons, leading to a rapid induction of classical NMDA receptor-dependent long-term potentiation (LTP) at CA3-CA1 synapses (CA1 LTP). CA1 LTP has been substantially evidenced to be essential for some forms of explicit learning in mammals. Here, we review data with particular reference to whole network-level approaches, illustrating how activity propagation can take place within the trisynaptic circuit to drive formation of CA1 LTP.

  11. Alzheimer's Disease Brain-Derived Amyloid-{beta}-Mediated Inhibition of LTP In Vivo Is Prevented by Immunotargeting Cellular Prion Protein.

    LENUS (Irish Health Repository)

    Barry, Andrew E

    2011-05-18

    Synthetic amyloid-β protein (Aβ) oligomers bind with high affinity to cellular prion protein (PrP(C)), but the role of this interaction in mediating the disruption of synaptic plasticity by such soluble Aβ in vitro is controversial. Here we report that intracerebroventricular injection of Aβ-containing aqueous extracts of Alzheimer's disease (AD) brain robustly inhibits long-term potentiation (LTP) without significantly affecting baseline excitatory synaptic transmission in the rat hippocampus in vivo. Moreover, the disruption of LTP was abrogated by immunodepletion of Aβ. Importantly, intracerebroventricular administration of antigen-binding antibody fragment D13, directed to a putative Aβ-binding site on PrP(C), prevented the inhibition of LTP by AD brain-derived Aβ. In contrast, R1, a Fab directed to the C terminus of PrP(C), a region not implicated in binding of Aβ, did not significantly affect the Aβ-mediated inhibition of LTP. These data support the pathophysiological significance of SDS-stable Aβ dimer and the role of PrP(C) in mediating synaptic plasticity disruption by soluble Aβ.

  12. Pharmacological Activators of the NR4A Nuclear Receptors Enhance LTP in a CREB/CBP-Dependent Manner.

    Science.gov (United States)

    Bridi, Morgan S; Hawk, Joshua D; Chatterjee, Snehajyoti; Safe, Stephen; Abel, Ted

    2017-05-01

    Nr4a nuclear receptors contribute to long-term memory formation and are required for long-term memory enhancement by a class of broad-acting drugs known as histone deacetylase (HDAC) inhibitors. Understanding the molecular mechanisms that regulate these genes and identifying ways to increase their activity may provide novel therapeutic approaches for ameliorating cognitive dysfunction. In the present study, we find that Nr4a gene expression after learning requires the cAMP-response element binding (CREB) interaction domain of the histone acetyltransferase CREB-binding protein (CBP). These gene expression deficits emerge at a time after learning marked by promoter histone acetylation in wild-type mice. Further, mutation of the CREB-CBP interaction domain reduces Nr4a promoter acetylation after learning. As memory enhancement by HDAC inhibitors requires CREB-CBP interaction and Nr4a gene function, these data support the notion that the balance of histone acetylation at the Nr4a promoters is critical for memory formation. NR4A ligands have recently been described, but the effect of these drugs on synaptic plasticity or memory has not been investigated. We find that the 'C-DIM' NR4A ligands, para-phenyl substituted di-indolylmethane compounds, enhance long-term contextual fear memory and increase the duration of long-term potentiation (LTP), a form of hippocampal synaptic plasticity. LTP enhancement by these drugs is eliminated in mice expressing a dominant negative form of NR4A and attenuated in mice with mutation of the CREB-CBP interaction domain. These data define the molecular connection between histone acetylation and Nr4a gene expression after learning. In addition, they suggest that NR4A-activating C-DIM compounds may serve as a potent and selective means to enhance memory and synaptic plasticity.

  13. Differential Effects of HRAS Mutation on LTP-Like Activity Induced by Different Protocols of Repetitive Transcranial Magnetic Stimulation.

    Science.gov (United States)

    Dileone, Michele; Ranieri, Federico; Florio, Lucia; Capone, Fioravante; Musumeci, Gabriella; Leoni, Chiara; Mordillo-Mateos, Laura; Tartaglia, Marco; Zampino, Giuseppe; Di Lazzaro, Vincenzo

    2016-01-01

    Costello syndrome (CS) is a rare congenital disorder due to a G12S amino acid substitution in the HRAS proto-oncogene. Previous studies have shown that Paired Associative Stimulation (PAS), a repetitive brain stimulation protocol inducing motor cortex plasticity by coupling peripheral nerve stimulation with brain stimulation, leads to an extremely pronounced motor cortex excitability increase in CS patients. Intermittent Theta Burst Stimulation (iTBS) is a protocol able to induce motor cortex plasticity by trains of stimuli at 50 Hz. In healthy subjects PAS and iTBS produce similar after-effects on motor cortex excitability. Experimental models showed that HRAS-dependent signalling pathways differently affect LTP induced by different patterns of repetitive synaptic stimulation. We aimed to compare iTBS-induced after-effects on motor cortex excitability with those produced by PAS in CS patients and to observe whether HRAS mutation differentially affects two different forms of neuromodulation protocols. We evaluated in vivo after-effects induced by PAS and iTBS applied over the right motor cortex in 4 CS patients and in 21 healthy age-matched controls. Our findings confirmed HRAS-dependent extremely pronounced PAS-induced after-effects and showed for the first time that iTBS induces no change in MEP amplitude in CS patients, whereas both protocols lead to an increase of about 50% in controls. CS patients are characterized by an impairment of iTBS-related LTP-like phenomena alongside enhanced PAS-induced after-effects, suggesting that HRAS-dependent signalling pathways have a differential influence on PAS- and iTBS-induced plasticity in humans.

  14. Introduction to probability

    CERN Document Server

    Freund, John E

    1993-01-01

    Thorough, lucid coverage of permutations and factorials, probabilities and odds, frequency interpretation, mathematical expectation, decision making, postulates of probability, rule of elimination, binomial distribution, geometric distribution, standard deviation, law of large numbers, and much more. Exercises with some solutions. Summary. Bibliography. Includes 42 black-and-white illustrations. 1973 edition.

  15. Probability, Nondeterminism and Concurrency

    DEFF Research Database (Denmark)

    Varacca, Daniele

    Nondeterminism is modelled in domain theory by the notion of a powerdomain, while probability is modelled by that of the probabilistic powerdomain. Some problems arise when we want to combine them in order to model computation in which both nondeterminism and probability are present. In particular...

  16. Janus-faced probability

    CERN Document Server

    Rocchi, Paolo

    2014-01-01

    The problem of probability interpretation was long overlooked before exploding in the 20th century, when the frequentist and subjectivist schools formalized two conflicting conceptions of probability. Beyond the radical followers of the two schools, a circle of pluralist thinkers tends to reconcile the opposing concepts. The author uses two theorems in order to prove that the various interpretations of probability do not come into opposition and can be used in different contexts. The goal here is to clarify the multifold nature of probability by means of a purely mathematical approach and to show how philosophical arguments can only serve to deepen actual intellectual contrasts. The book can be considered as one of the most important contributions in the analysis of probability interpretation in the last 10-15 years.

  17. Probability and Measure

    CERN Document Server

    Billingsley, Patrick

    2012-01-01

    Praise for the Third Edition "It is, as far as I'm concerned, among the best books in math ever written....if you are a mathematician and want to have the top reference in probability, this is it." (Amazon.com, January 2006) A complete and comprehensive classic in probability and measure theory Probability and Measure, Anniversary Edition by Patrick Billingsley celebrates the achievements and advancements that have made this book a classic in its field for the past 35 years. Now re-issued in a new style and format, but with the reliable content that the third edition was revered for, this

  18. The concept of probability

    International Nuclear Information System (INIS)

    Bitsakis, E.I.; Nicolaides, C.A.

    1989-01-01

    The concept of probability is now, and always has been, central to the debate on the interpretation of quantum mechanics. Furthermore, probability permeates all of science, as well as our every day life. The papers included in this volume, written by leading proponents of the ideas expressed, embrace a broad spectrum of thought and results: mathematical, physical epistemological, and experimental, both specific and general. The contributions are arranged in parts under the following headings: Following Schroedinger's thoughts; Probability and quantum mechanics; Aspects of the arguments on nonlocality; Bell's theorem and EPR correlations; Real or Gedanken experiments and their interpretation; Questions about irreversibility and stochasticity; and Epistemology, interpretation and culture. (author). refs.; figs.; tabs

  19. Probability for statisticians

    CERN Document Server

    Shorack, Galen R

    2017-01-01

    This 2nd edition textbook offers a rigorous introduction to measure theoretic probability with particular attention to topics of interest to mathematical statisticians—a textbook for courses in probability for students in mathematical statistics. It is recommended to anyone interested in the probability underlying modern statistics, providing a solid grounding in the probabilistic tools and techniques necessary to do theoretical research in statistics. For the teaching of probability theory to post graduate statistics students, this is one of the most attractive books available. Of particular interest is a presentation of the major central limit theorems via Stein's method either prior to or alternative to a characteristic function presentation. Additionally, there is considerable emphasis placed on the quantile function as well as the distribution function. The bootstrap and trimming are both presented. Martingale coverage includes coverage of censored data martingales. The text includes measure theoretic...

  20. Concepts of probability theory

    CERN Document Server

    Pfeiffer, Paul E

    1979-01-01

    Using the Kolmogorov model, this intermediate-level text discusses random variables, probability distributions, mathematical expectation, random processes, more. For advanced undergraduates students of science, engineering, or math. Includes problems with answers and six appendixes. 1965 edition.

  1. Probability and Bayesian statistics

    CERN Document Server

    1987-01-01

    This book contains selected and refereed contributions to the "International Symposium on Probability and Bayesian Statistics" which was organized to celebrate the 80th birthday of Professor Bruno de Finetti at his birthplace Innsbruck in Austria. Since Professor de Finetti died in 1985 the symposium was dedicated to the memory of Bruno de Finetti and took place at Igls near Innsbruck from 23 to 26 September 1986. Some of the papers are published especially because of their relationship to Bruno de Finetti's scientific work. The evolution of stochastics shows the growing importance of probability as a coherent assessment of numerical values as degrees of belief in certain events. This is the basis for Bayesian inference in the sense of modern statistics. The contributions in this volume cover a broad spectrum ranging from foundations of probability, across psychological aspects of formulating subjective probability statements, abstract measure theoretical considerations, contributions to theoretical statistics an...

  2. Probability and Statistical Inference

    OpenAIRE

    Prosper, Harrison B.

    2006-01-01

    These lectures introduce key concepts in probability and statistical inference at a level suitable for graduate students in particle physics. Our goal is to paint as vivid a picture as possible of the concepts covered.

  3. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  4. Probability an introduction

    CERN Document Server

    Grimmett, Geoffrey

    2014-01-01

    Probability is an area of mathematics of tremendous contemporary importance across all aspects of human endeavour. This book is a compact account of the basic features of probability and random processes at the level of first and second year mathematics undergraduates and Masters' students in cognate fields. It is suitable for a first course in probability, plus a follow-up course in random processes including Markov chains. A special feature is the authors' attention to rigorous mathematics: not everything is rigorous, but the need for rigour is explained at difficult junctures. The text is enriched by simple exercises, together with problems (with very brief hints) many of which are taken from final examinations at Cambridge and Oxford. The first eight chapters form a course in basic probability, being an account of events, random variables, and distributions - discrete and continuous random variables are treated separately - together with simple versions of the law of large numbers and the central limit th...

  5. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton‘s laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world‘s foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their  explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive. 

  6. Probability in quantum mechanics

    Directory of Open Access Journals (Sweden)

    J. G. Gilson

    1982-01-01

    By using a fluid theory which is an alternative to quantum theory but from which the latter can be deduced exactly, the long-standing problem of how quantum mechanics is related to stochastic processes is studied. It can be seen how the Schrödinger probability density has a relationship to time spent on small sections of an orbit, just as the probability density has in some classical contexts.

  7. Quantum computing and probability.

    Science.gov (United States)

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  8. Quantum computing and probability

    International Nuclear Information System (INIS)

    Ferry, David K

    2009-01-01

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction. (viewpoint)

  9. Probability analysis of nuclear power plant hazards

    International Nuclear Information System (INIS)

    Kovacs, Z.

    1985-01-01

    Probabilistic risk analysis, used for quantifying the risk of complex technological systems, especially nuclear power plants, is described. Risk is defined as the product of the probability of the occurrence of a dangerous event and the significance of its consequences. The analysis may be divided into the stage of power plant analysis up to the point of release of harmful material into the environment (reliability analysis) and the stage of analysing the consequences of this release and assessing the risk. The sequence of operations in the individual stages is characterized. The tasks which Czechoslovakia faces in the development of probabilistic risk analysis are listed, and a composition of the work team for coping with the task is recommended. (J.C.)
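The definition in the abstract, risk as the product of an event's probability of occurrence and the significance of its consequences, is easy to state in code. The event classes and all numbers below are invented purely for illustration:

```python
def risk(probability_per_year, consequence):
    """Risk as the product of event probability and consequence magnitude."""
    return probability_per_year * consequence

# Hypothetical event classes; frequencies and severities are made up.
events = {
    "small leak":    (1e-2, 1.0),   # (per-year probability, relative severity)
    "large release": (1e-6, 1e4),
}
total = sum(risk(p, c) for p, c in events.values())
print(round(total, 6))  # 0.01*1 + 1e-6*1e4 = 0.02
```

The example also shows why frequent minor events and rare severe events can contribute comparably to total risk under this definition.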

  10. The perception of probability.

    Science.gov (United States)

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.
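The estimation task the model addresses can be reproduced with a few lines of simulation: a Bernoulli process whose hidden parameter steps between values, tracked by a naive sliding-window estimator. This baseline is not the authors' change-point model; it only illustrates the problem setup, and every name below is ours:

```python
import random

def stepwise_bernoulli(segments, seed=1):
    """Outcomes from a Bernoulli process whose hidden parameter steps:
    segments = [(p, n_outcomes), ...]."""
    rng = random.Random(seed)
    for p, n in segments:
        for _ in range(n):
            yield 1 if rng.random() < p else 0

def sliding_window_estimate(outcomes, window=50):
    """Naive outcome-by-outcome estimate of the hidden p (a baseline the
    paper's compact-encoding model is meant to improve on)."""
    buf, estimates = [], []
    for x in outcomes:
        buf.append(x)
        if len(buf) > window:
            buf.pop(0)
        estimates.append(sum(buf) / len(buf))
    return estimates

est = sliding_window_estimate(stepwise_bernoulli([(0.2, 300), (0.8, 300)]))
print(round(est[299], 2), round(est[599], 2))  # near 0.2, then near 0.8
```

Unlike human subjects in the experiments, this estimator drifts continuously rather than stepping at irregular intervals, which is exactly the behaviour the change-point account explains.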

  11. Irreversibility and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    The mathematical entropy - unlike physical entropy - is simply a measure of uniformity for probability distributions in general. So understood, conditional entropies have the same logical structure as conditional probabilities. If, as is sometimes supposed, conditional probabilities are time-reversible, then so are conditional entropies and, paradoxically, both then share this symmetry with physical equations of motion. The paradox is, of course, that probabilities yield a direction to time both in statistical mechanics and quantum mechanics, while the equations of motion do not. The supposed time-reversibility of both conditionals seems also to involve a form of retrocausality that is related to, but possibly not the same as, that described by Costa de Beauregard. The retrocausality is paradoxically at odds with the generally presumed irreversibility of the quantum mechanical measurement process. Further paradox emerges if the supposed time-reversibility of the conditionals is linked with the idea that the thermodynamic entropy is the same thing as 'missing information', since this confounds the thermodynamic and mathematical entropies. However, it is shown that irreversibility is a formal consequence of conditional entropies and, hence, of conditional probabilities also. 8 refs. (Author)

  12. The pleasures of probability

    CERN Document Server

    Isaac, Richard

    1995-01-01

    The ideas of probability are all around us. Lotteries, casino gambling, the almost non-stop polling which seems to mold public policy more and more: these are a few of the areas where principles of probability impinge in a direct way on the lives and fortunes of the general public. At a more removed level there is modern science which uses probability and its offshoots like statistics and the theory of random processes to build mathematical descriptions of the real world. In fact, twentieth-century physics, in embracing quantum mechanics, has a world view that is at its core probabilistic in nature, contrary to the deterministic one of classical physics. In addition to all this muscular evidence of the importance of probability ideas it should also be said that probability can be lots of fun. It is a subject where you can start thinking about amusing, interesting, and often difficult problems with very little mathematical background. In this book, I wanted to introduce a reader with at least a fairl...

  13. Experimental Probability in Elementary School

    Science.gov (United States)

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.
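The experiment-first approach can be mirrored in a quick simulation: roll a fair die many times and watch the empirical face frequencies approach the theoretical 1/6. The function and parameter names are ours, not the article's:

```python
import random
from collections import Counter

def roll_experiment(n_rolls, seed=42):
    """Empirical probability of each die face after n_rolls throws."""
    rng = random.Random(seed)
    counts = Counter(rng.randint(1, 6) for _ in range(n_rolls))
    return {face: counts[face] / n_rolls for face in range(1, 7)}

# With more rolls, the worst gap to the theoretical 1/6 shrinks.
for n in (60, 60_000):
    freqs = roll_experiment(n)
    worst_gap = max(abs(f - 1 / 6) for f in freqs.values())
    print(n, round(worst_gap, 3))
```

Running the small and large experiments side by side is itself a classroom-ready demonstration of the law of large numbers.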

  14. Improving Ranking Using Quantum Probability

    OpenAIRE

    Melucci, Massimo

    2011-01-01

    The paper shows that ranking information units by quantum probability differs from ranking them by classical probability, provided the same data are used for parameter estimation. As probability of detection (also known as recall or power) and probability of false alarm (also known as fallout or size) measure the quality of ranking, we point out and show that ranking by quantum probability yields a higher probability of detection than ranking by classical probability, provided a given probability of ...

  15. Release the Prisoners Game

    Science.gov (United States)

    Van Hecke, Tanja

    2011-01-01

    This article presents the mathematical approach of the optimal strategy to win the "Release the prisoners" game and the integration of this analysis in a math class. Outline lesson plans at three different levels are given, where simulations are suggested as well as theoretical findings about the probability distribution function and its mean…

  16. Choice probability generating functions

    DEFF Research Database (Denmark)

    Fosgerau, Mogens; McFadden, Daniel; Bierlaire, Michel

    2010-01-01

    This paper establishes that every random utility discrete choice model (RUM) has a representation that can be characterized by a choice-probability generating function (CPGF) with specific properties, and that every function with these specific properties is consistent with a RUM. The choice probabilities from the RUM are obtained from the gradient of the CPGF. Mixtures of RUM are characterized by logarithmic mixtures of their associated CPGF. The paper relates CPGF to multivariate extreme value distributions, and reviews and extends methods for constructing generating functions for applications. The choice probabilities of any ARUM may be approximated by a cross-nested logit model. The results for ARUM are extended to competing risk survival models.
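A concrete instance of the gradient property (standard in the discrete-choice literature, though the code is our sketch rather than anything from the paper) is the multinomial logit model, whose CPGF is the log-sum-exp function: differentiating it with respect to each utility recovers the familiar softmax choice probabilities. A numerical check:

```python
import math

def cpgf_logit(utilities):
    """CPGF of the multinomial logit model: the log-sum-exp of the utilities."""
    m = max(utilities)  # shift by the max for numerical stability
    return m + math.log(sum(math.exp(u - m) for u in utilities))

def choice_probabilities(utilities, eps=1e-6):
    """Gradient of the CPGF, taken numerically: yields the choice probabilities."""
    grads = []
    for i in range(len(utilities)):
        up = list(utilities); up[i] += eps
        dn = list(utilities); dn[i] -= eps
        grads.append((cpgf_logit(up) - cpgf_logit(dn)) / (2 * eps))
    return grads

u = [1.0, 2.0, 0.5]
print([round(p, 4) for p in choice_probabilities(u)])  # sums to 1; equals softmax(u)
```

The same check works for any smooth CPGF, e.g. nested-logit generating functions, since the gradient-equals-probabilities property is what defines the class.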

  17. Probability and stochastic modeling

    CERN Document Server

    Rotar, Vladimir I

    2012-01-01

    Basic Notions; Sample Space and Events; Probabilities; Counting Techniques; Independence and Conditional Probability; Independence; Conditioning; The Borel-Cantelli Theorem; Discrete Random Variables; Random Variables and Vectors; Expected Value; Variance and Other Moments; Inequalities for Deviations; Some Basic Distributions; Convergence of Random Variables; The Law of Large Numbers; Conditional Expectation; Generating Functions; Branching Processes; Random Walk Revisited; Branching Processes Generating Functions; Branching Processes Revisited; More on Random Walk; Markov Chains; Definitions and Examples; Probability Distributions of Markov Chains; The First Step Analysis; Passage Times; Variables Defined on a Markov Chain; Ergodicity and Stationary Distributions; A Classification of States and Ergodicity; Continuous Random Variables; Continuous Distributions; Some Basic Distributions; Continuous Multivariate Distributions; Sums of Independent Random Variables; Conditional Distributions and Expectations; Distributions in the General Case; Simulation; Distribution F...

  18. Estimating Subjective Probabilities

    DEFF Research Database (Denmark)

    Andersen, Steffen; Fountain, John; Harrison, Glenn W.

    2014-01-01

    either construct elicitation mechanisms that control for risk aversion, or construct elicitation mechanisms which undertake 'calibrating adjustments' to elicited reports. We illustrate how the joint estimation of risk attitudes and subjective probabilities can provide the calibration adjustments that theory calls for. We illustrate this approach using data from a controlled experiment with real monetary consequences to the subjects. This allows the observer to make inferences about the latent subjective probability, under virtually any well-specified model of choice under subjective risk, while still...

  19. Introduction to imprecise probabilities

    CERN Document Server

    Augustin, Thomas; de Cooman, Gert; Troffaes, Matthias C M

    2014-01-01

    In recent years, the theory has become widely accepted and has been further developed, but a detailed introduction is needed in order to make the material available and accessible to a wide audience. This will be the first book providing such an introduction, covering core theory and recent developments which can be applied to many application areas. All authors of individual chapters are leading researchers on the specific topics, assuring high quality and up-to-date contents. An Introduction to Imprecise Probabilities provides a comprehensive introduction to imprecise probabilities, includin

  20. Classic Problems of Probability

    CERN Document Server

    Gorroochurn, Prakash

    2012-01-01

    "A great book, one that I will certainly add to my personal library."—Paul J. Nahin, Professor Emeritus of Electrical Engineering, University of New Hampshire Classic Problems of Probability presents a lively account of the most intriguing aspects of statistics. The book features a large collection of more than thirty classic probability problems which have been carefully selected for their interesting history, the way they have shaped the field, and their counterintuitive nature. From Cardano's 1564 Games of Chance to Jacob Bernoulli's 1713 Golden Theorem to Parrondo's 1996 Perplexin

  1. A novel fusion method of improved adaptive LTP and two-directional two-dimensional PCA for face feature extraction

    Science.gov (United States)

    Luo, Yuan; Wang, Bo-yu; Zhang, Yi; Zhao, Li-ming

    2018-03-01

    In this paper, addressing the fact that, under varying illumination and random noise, the local texture features of a face image cannot be completely described because the threshold of the local ternary pattern (LTP) cannot be calculated adaptively, a local three-value model, the improved adaptive local ternary pattern (IALTP), is proposed. Firstly, a difference function between the center pixel and the neighborhood pixel weights is established to obtain the statistical characteristics of the center pixel and its neighborhood. Secondly, an adaptive gradient-descent iterative function is established to calculate the difference coefficient, which is defined to be the threshold of the IALTP operator. Finally, the mean and standard deviation of the pixel weights of the local region are used as the coding mode of IALTP. In order to reflect the overall properties of the face and reduce the dimension of the features, two-directional two-dimensional PCA ((2D)2PCA) is adopted. IALTP is used to extract local texture features of the eye and mouth areas. After combining the global features and local features, the fusion features (IALTP+) are obtained. Experimental results on the Extended Yale B and AR standard face databases indicate that, under different illuminations and random noises, the algorithm proposed in this paper is more robust than others, and the feature dimension is smaller. The shortest running time reaches 0.3296 s, and the highest recognition rate reaches 97.39%.
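    The LTP coding that IALTP builds on can be sketched as follows. The paper's adaptive threshold, pixel weighting, and (2D)2PCA stages are not specified here in enough detail to reproduce, so this minimal sketch implements the classic fixed-threshold LTP; the function name and the default threshold are illustrative:

    ```python
    import numpy as np

    def ltp_codes(img, t=5):
        """Classic local ternary pattern (LTP) over 3x3 neighborhoods.

        Each neighbor is coded +1 if it exceeds the center by more than t,
        -1 if it falls below center - t, and 0 otherwise; the ternary code
        is then split into the usual pair of binary "upper" and "lower"
        maps.  The paper's IALTP replaces the fixed threshold t with one
        computed adaptively, which is not reproduced here.
        """
        img = np.asarray(img, dtype=np.int32)
        c = img[1:-1, 1:-1]                              # center pixels
        offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
                   (1, 1), (1, 0), (1, -1), (0, -1)]
        upper = np.zeros_like(c)
        lower = np.zeros_like(c)
        for k, (dy, dx) in enumerate(offsets):
            n = img[1 + dy:img.shape[0] - 1 + dy,
                    1 + dx:img.shape[1] - 1 + dx]        # shifted neighbor plane
            upper |= (n > c + t).astype(np.int32) << k   # +1 ternary states
            lower |= (n < c - t).astype(np.int32) << k   # -1 ternary states
        return upper, lower
    ```

    The two returned maps can each be histogrammed like ordinary LBP codes, which is the standard way LTP features are turned into descriptors.
    
    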

  2. Counterexamples in probability

    CERN Document Server

    Stoyanov, Jordan M

    2013-01-01

    While most mathematical examples illustrate the truth of a statement, counterexamples demonstrate a statement's falsity. Enjoyable topics of study, counterexamples are valuable tools for teaching and learning. The definitive book on the subject in regards to probability, this third edition features the author's revisions and corrections plus a substantial new appendix.

  3. Epistemology and Probability

    CERN Document Server

    Plotnitsky, Arkady

    2010-01-01

    Offers an exploration of the relationships between epistemology and probability in the work of Niels Bohr, Werner Heisenberg, and Erwin Schrodinger; in quantum mechanics; and in modern physics. This book considers the implications of these relationships and of quantum theory for our understanding of the nature of thinking and knowledge in general

  4. Transition probabilities for atoms

    International Nuclear Information System (INIS)

    Kim, Y.K.

    1980-01-01

    Current status of advanced theoretical methods for transition probabilities for atoms and ions is discussed. An experiment on the f values of the resonance transitions of the Kr and Xe isoelectronic sequences is suggested as a test for the theoretical methods

  5. NMDA receptor subunits in the adult rat hippocampus undergo similar changes after 5 minutes in an open field and after LTP induction.

    Directory of Open Access Journals (Sweden)

    Maria Veronica Baez

    Full Text Available NMDA receptor subunits change during development and their synaptic expression is modified rapidly after synaptic plasticity induction in hippocampal slices. However, there is scarce information on subunit expression after synaptic plasticity induction or memory acquisition, particularly in adults. GluN1, GluN2A and GluN2B NMDA receptor subunits were assessed by western blot in (1) adult rats that had explored an open field (OF) for 5 minutes, a time sufficient to induce habituation, (2) mature rat hippocampal neuron cultures depolarized by KCl, and (3) hippocampal slices from adult rats where long-term potentiation (LTP) was induced by theta-burst stimulation (TBS). GluN1 and GluN2A, though not GluN2B, were significantly higher 70 minutes--but not 30 minutes--after a 5-minute session in an OF. GluN1 and GluN2A total immunofluorescence and puncta in neurites increased in cultures, as evaluated 70 minutes after KCl stimulation. Similar changes were found in hippocampal slices 70 minutes after LTP induction. To start to explore underlying mechanisms, hippocampal slices were treated either with cycloheximide (a translation inhibitor) or actinomycin D (a transcription inhibitor) during electrophysiological assays. It was corroborated that translation was necessary for LTP induction and expression. The rise in GluN1 depends on transcription and translation, while the increase in GluN2A appears to depend mainly on translation, though a contribution of some remaining transcriptional activity during actinomycin D treatment could not be ruled out. Effective LTP induction was required for the subunits to increase. Although in the three models the same subunits underwent modifications in the same direction, within an apparently similar time course, further investigation is required to reveal whether they are related processes and whether they are causally related to synaptic plasticity, learning and memory.

  6. Negative probability in the framework of combined probability

    OpenAIRE

    Burgin, Mark

    2013-01-01

    Negative probability has found diverse applications in theoretical physics. Thus, construction of sound and rigorous mathematical foundations for negative probability is important for physics. There are different axiomatizations of conventional probability. So, it is natural that negative probability also has different axiomatic frameworks. In the previous publications (Burgin, 2009; 2010), negative probability was mathematically formalized and rigorously interpreted in the context of extende...

  7. Contributions to quantum probability

    International Nuclear Information System (INIS)

    Fritz, Tobias

    2010-01-01

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic frameworks. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a finite set can occur as the outcome

  8. Bayesian Probability Theory

    Science.gov (United States)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  9. Contributions to quantum probability

    Energy Technology Data Exchange (ETDEWEB)

    Fritz, Tobias

    2010-06-25

    Chapter 1: On the existence of quantum representations for two dichotomic measurements. Under which conditions do outcome probabilities of measurements possess a quantum-mechanical model? This kind of problem is solved here for the case of two dichotomic von Neumann measurements which can be applied repeatedly to a quantum system with trivial dynamics. The solution uses methods from the theory of operator algebras and the theory of moment problems. The ensuing conditions reveal surprisingly simple relations between certain quantum-mechanical probabilities. It is also shown that none of these relations holds in general probabilistic models. This result might facilitate further experimental discrimination between quantum mechanics and other general probabilistic theories. Chapter 2: Possibilistic Physics. I try to outline a framework for fundamental physics where the concept of probability gets replaced by the concept of possibility. Whereas a probabilistic theory assigns a state-dependent probability value to each outcome of each measurement, a possibilistic theory merely assigns one of the state-dependent labels "possible to occur" or "impossible to occur" to each outcome of each measurement. It is argued that Spekkens' combinatorial toy theory of quantum mechanics is inconsistent in a probabilistic framework, but can be regarded as possibilistic. Then, I introduce the concept of possibilistic local hidden variable models and derive a class of possibilistic Bell inequalities which are violated for the possibilistic Popescu-Rohrlich boxes. The chapter ends with a philosophical discussion on possibilistic vs. probabilistic frameworks. It can be argued that, due to better falsifiability properties, a possibilistic theory has higher predictive power than a probabilistic one. Chapter 3: The quantum region for von Neumann measurements with postselection. It is determined under which conditions a probability distribution on a

  10. Waste Package Misload Probability

    International Nuclear Information System (INIS)

    Knudsen, J.K.

    2001-01-01

    The objective of this calculation is to calculate the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence will be calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a

  11. Probability theory and applications

    CERN Document Server

    Hsu, Elton P

    1999-01-01

    This volume, with contributions by leading experts in the field, is a collection of lecture notes of the six minicourses given at the IAS/Park City Summer Mathematics Institute. It introduces advanced graduates and researchers in probability theory to several of the currently active research areas in the field. Each course is self-contained with references and contains basic materials and recent results. Topics include interacting particle systems, percolation theory, analysis on path and loop spaces, and mathematical finance. The volume gives a balanced overview of the current status of probability theory. An extensive bibliography for further study and research is included. This unique collection presents several important areas of current research and a valuable survey reflecting the diversity of the field.

  12. Paradoxes in probability theory

    CERN Document Server

    Eckhardt, William

    2013-01-01

    Paradoxes provide a vehicle for exposing misinterpretations and misapplications of accepted principles. This book discusses seven paradoxes surrounding probability theory.  Some remain the focus of controversy; others have allegedly been solved, however the accepted solutions are demonstrably incorrect. Each paradox is shown to rest on one or more fallacies.  Instead of the esoteric, idiosyncratic, and untested methods that have been brought to bear on these problems, the book invokes uncontroversial probability principles, acceptable both to frequentists and subjectivists. The philosophical disputation inspired by these paradoxes is shown to be misguided and unnecessary; for instance, startling claims concerning human destiny and the nature of reality are directly related to fallacious reasoning in a betting paradox, and a problem analyzed in philosophy journals is resolved by means of a computer program.

  13. Measurement uncertainty and probability

    CERN Document Server

    Willink, Robin

    2013-01-01

    A measurement result is incomplete without a statement of its 'uncertainty' or 'margin of error'. But what does this statement actually tell us? By examining the practical meaning of probability, this book discusses what is meant by a '95 percent interval of measurement uncertainty', and how such an interval can be calculated. The book argues that the concept of an unknown 'target value' is essential if probability is to be used as a tool for evaluating measurement uncertainty. It uses statistical concepts, such as a conditional confidence interval, to present 'extended' classical methods for evaluating measurement uncertainty. The use of the Monte Carlo principle for the simulation of experiments is described. Useful for researchers and graduate students, the book also discusses other philosophies relating to the evaluation of measurement uncertainty. It employs clear notation and language to avoid the confusion that exists in this controversial field of science.

  14. Model uncertainty and probability

    International Nuclear Information System (INIS)

    Parry, G.W.

    1994-01-01

    This paper discusses the issue of model uncertainty. The use of probability as a measure of an analyst's uncertainty, as well as a means of describing random processes, has caused some confusion, even though the two uses represent different types of uncertainty with respect to modeling a system. The importance of maintaining the distinction between the two types is illustrated with a simple example

  15. Retrocausality and conditional probability

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard has proposed that physical causality be identified with conditional probability. The proposal is shown to be vulnerable on two accounts. The first, though mathematically trivial, seems to be decisive so far as the current formulation of the proposal is concerned. The second lies in a physical inconsistency which seems to have its source in a Copenhagenlike disavowal of realism in quantum mechanics. 6 refs. (Author)

  16. Probability via expectation

    CERN Document Server

    Whittle, Peter

    1992-01-01

    This book is a complete revision of the earlier work Probability which appeared in 1970. While revised so radically and incorporating so much new material as to amount to a new text, it preserves both the aim and the approach of the original. That aim was stated as the provision of a 'first text in probability, demanding a reasonable but not extensive knowledge of mathematics, and taking the reader to what one might describe as a good intermediate level'. In doing so it attempted to break away from stereotyped applications, and consider applications of a more novel and significant character. The particular novelty of the approach was that expectation was taken as the prime concept, and the concept of expectation axiomatized rather than that of a probability measure. In the preface to the original text of 1970 (reproduced below, together with that to the Russian edition of 1982) I listed what I saw as the advantages of the approach in as unlaboured a fashion as I could. I also took the view that the text...

  17. Probability mapping of contaminants

    Energy Technology Data Exchange (ETDEWEB)

    Rautman, C.A.; Kaplan, P.G. [Sandia National Labs., Albuquerque, NM (United States); McGraw, M.A. [Univ. of California, Berkeley, CA (United States); Istok, J.D. [Oregon State Univ., Corvallis, OR (United States); Sigda, J.M. [New Mexico Inst. of Mining and Technology, Socorro, NM (United States)

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).

  18. Probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds)
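    The post-processing step described above, turning a stack of equally likely geostatistical simulations into a probability-of-exceedance map, can be sketched in a few lines. The geostatistical simulation step itself is not reproduced, and the function and parameter names are illustrative:

    ```python
    import numpy as np

    def exceedance_map(realizations, threshold):
        """Probability-of-exceedance map from equally likely simulations.

        `realizations` has shape (n_realizations, ny, nx): a stack of
        geostatistical realizations of contaminant level, each honoring
        the measured sample values.  The returned map gives, cell by
        cell, the fraction of realizations above the specified clean-up
        or personnel-hazard threshold.
        """
        sims = np.asarray(realizations, dtype=float)
        return (sims > threshold).mean(axis=0)
    ```

    The resulting map can feed directly into the cost-based decision models the abstract mentions, e.g. by flagging every cell whose exceedance probability is above an acceptable-risk level for remediation.
    
    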

  19. Probability of causation approach

    International Nuclear Information System (INIS)

    Jose, D.E.

    1988-01-01

    Probability of causation (PC) is sometimes viewed as a great improvement by those persons who are not happy with the present rulings of courts in radiation cases. The author does not share that hope and expects that PC will not play a significant role in these issues for at least the next decade. If it is ever adopted in a legislative compensation scheme, it will be used in a way that is unlikely to please most scientists. Consequently, PC is a false hope for radiation scientists, and its best contribution may well lie in some of the spin-off effects, such as an influence on medical practice

  20. Generalized Probability Functions

    Directory of Open Access Journals (Sweden)

    Alexandre Souto Martinez

    2009-01-01

    Full Text Available From the integration of nonsymmetrical hyperboles, a one-parameter generalization of the logarithmic function is obtained. Inverting this function, one obtains the generalized exponential function. Motivated by the mathematical curiosity, we show that these generalized functions are suitable to generalize some probability density functions (pdfs. A very reliable rank distribution can be conveniently described by the generalized exponential function. Finally, we turn the attention to the generalization of one- and two-tail stretched exponential functions. We obtain, as particular cases, the generalized error function, the Zipf-Mandelbrot pdf, the generalized Gaussian and Laplace pdf. Their cumulative functions and moments were also obtained analytically.

  1. Probability in High Dimension

    Science.gov (United States)

    2014-06-30

    precisely the content of the following result. The price we pay is that the assumption that A is a packing in (F, ‖·‖₁) is too weak to make this happen... Régularité des trajectoires des fonctions aléatoires gaussiennes. In: École d'Été de Probabilités de Saint-Flour, IV-1974, pp. 1–96. Lecture Notes in... Lectures on probability theory and statistics (Saint-Flour, 1994), Lecture Notes in Math., vol. 1648, pp. 165–294. Springer, Berlin (1996). 50. Ledoux

  2. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    2011-05-23

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.  Created: 5/23/2011 by National Center for Emerging Zoonotic and Infectious Diseases (NCEZID).   Date Released: 5/25/2011.

  3. Probability of brittle failure

    Science.gov (United States)

    Kim, A.; Bosnyak, C. P.; Chudnovsky, A.

    1991-01-01

    A methodology was developed for collecting statistically representative data on crack initiation and arrest from a small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol extended diglycyl ether, and cured with diethylene triamine) is selected as a model material. A compact tension specimen with displacement-controlled loading is used to observe multiple crack initiations and arrests. The energy release rate at crack initiation is significantly higher than that at a crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (scale effect), and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of the defects which control the fracture process. Triangular ripples (deltoids) are formed on the fracture surface during slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces, which excludes any inelastic deformation from consideration. The presence of defects is also suggested by the observed scale effect. However, there are no defects at the deltoid apexes detectable down to the 0.1 micron level.

  4. Probable maximum flood control

    International Nuclear Information System (INIS)

    DeGabriele, C.E.; Wu, C.L.

    1991-11-01

    This study proposes preliminary design concepts to protect the waste-handling facilities and all shaft and ramp entries to the underground from the probable maximum flood (PMF) in the current design configuration for the proposed Nevada Nuclear Waste Storage Investigation (NNWSI) repository. Protection provisions were furnished by the United States Bureau of Reclamation (USBR) or developed from USBR data. Proposed flood protection provisions include site grading, drainage channels, and diversion dikes. Figures are provided to show these proposed flood protection provisions at each area investigated. These areas are the central surface facilities (including the waste-handling building and waste treatment building), tuff ramp portal, waste ramp portal, men-and-materials shaft, emplacement exhaust shaft, and exploratory shafts facility

  5. Long-term fluoxetine treatment induces input-specific LTP and LTD impairment and structural plasticity in the CA1 hippocampal subfield.

    Directory of Open Access Journals (Sweden)

    Francisco J Rubio

    2013-05-01

    Full Text Available Antidepressant drugs are usually administered for long periods for the treatment of major depressive disorder. However, they are also prescribed in several additional psychiatric conditions as well as during long-term maintenance treatments. Antidepressants induce adaptive changes in several forebrain structures which include modifications at glutamatergic synapses. We recently found that repetitive administration of the selective serotonin reuptake inhibitor fluoxetine to naïve adult male rats induced an increase of mature, mushroom-type dendritic spines in several forebrain regions. This was associated with an increase of GluA2-containing α-amino-3-hydroxy-5-methylisoxazole-4-propionate receptors (AMPA-Rs) in telencephalic postsynaptic densities. To unravel the functional significance of such a synaptic re-arrangement, we focused on glutamate neurotransmission in the hippocampus. We evaluated the effect of four weeks of treatment with 0.7 mg/kg of fluoxetine on long-term potentiation (LTP) and long-term depression (LTD) in the Schaffer collateral-CA1 synapses and the perforant path-CA1 synapses. Recordings in hippocampal slices revealed profound deficits in LTP and LTD at Schaffer collateral-CA1 synapses, associated with increased spine density and an enhanced presence of mushroom-type spines, as revealed by Golgi staining. However, the same treatment had neither an effect on spine morphology, nor on LTP and LTD at perforant path-CA1 synapses. Cobalt staining experiments revealed decreased AMPA-R Ca2+ permeability in the stratum radiatum together with increased GluA2-containing, Ca2+-impermeable AMPA-Rs. Therefore, 4 weeks of fluoxetine treatment promoted structural and functional adaptations in CA1 neurons in a pathway-specific manner that were selectively associated with impairment of activity-dependent plasticity at Schaffer collateral-CA1 synapses.

  6. Method for assessing the probability of accumulated doses from an intermittent source using the convolution technique

    International Nuclear Information System (INIS)

    Coleman, J.H.

    1980-10-01

    A technique is discussed for computing the probability distribution of the accumulated dose received by an arbitrary receptor resulting from several single releases from an intermittent source. The probability density of the accumulated dose is the convolution of the probability densities of the doses from the intermittent releases. Emissions are not assumed to be constant over the brief release period. The fast Fourier transform is used in the calculation of the convolution
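    The FFT-based convolution of dose densities described in this record can be sketched numerically. The function name and the discretization choices are illustrative, not taken from the original report; the key point is that convolving the per-release densities in the Fourier domain yields the density of the accumulated dose:

    ```python
    import numpy as np

    def accumulated_dose_density(densities, dx):
        """Density of the accumulated dose via FFT-based convolution.

        Each entry of `densities` is one release's dose density sampled
        on a common grid with spacing dx.  The density of a sum of
        independent doses is the convolution of the individual densities;
        zero-padding to `nfft` avoids circular wrap-around.
        """
        n = sum(len(p) for p in densities) - len(densities) + 1  # full linear length
        nfft = 1 << (n - 1).bit_length()                         # next power of two
        acc = np.ones(nfft, dtype=complex)
        for p in densities:
            acc *= np.fft.fft(p, nfft) * dx    # each convolution factor carries one dx
        return np.fft.ifft(acc).real[:n] / dx  # one dx belongs to the output grid
    ```

    As a sanity check, convolving two unit-exponential densities should approximate the Gamma(2, 1) density x·e^(−x), the density of a sum of two independent Exp(1) variables.
    
    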

  7. Probability and rational choice

    Directory of Open Access Journals (Sweden)

    David Botting

    2014-05-01

    Full Text Available http://dx.doi.org/10.5007/1808-1711.2014v18n1p1 In this paper I will discuss the rationality of reasoning about the future. There are two things that we might like to know about the future: which hypotheses are true and what will happen next. To put it in philosophical language, I aim to show that there are methods by which inferring to a generalization (selecting a hypothesis) and inferring to the next instance (singular predictive inference) can be shown to be normative and the method itself shown to be rational, where this is due in part to being based on evidence (although not in the same way) and in part on a prior rational choice. I will also argue that these two inferences have been confused, being distinct not only conceptually (as nobody disputes) but also in their results (the value given to the probability of the hypothesis being not in general that given to the next instance), and that methods that are adequate for one are not by themselves adequate for the other. A number of debates over method founder on this confusion and do not show what the debaters think they show.

  8. COVAL, Compound Probability Distribution for Function of Probability Distribution

    International Nuclear Information System (INIS)

    Astolfi, M.; Elbaz, J.

    1979-01-01

    1 - Nature of the physical problem solved: Computation of the probability distribution of a function of variables, given the probability distribution of the variables themselves. 'COVAL' has been applied to reliability analysis of a structure subject to random loads. 2 - Method of solution: Numerical transformation of probability distributions
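
A Monte Carlo stand-in for what COVAL does by numerical transformation: estimating the distribution of a function of random variables from the distributions of the variables themselves. The load/strength reliability example and every number below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical reliability example: strength S and load L are random;
# we want the distribution of the margin M = S - L and P(failure) = P(M < 0).
n = 200_000
S = rng.lognormal(mean=np.log(50), sigma=0.2, size=n)   # structural strength
L = rng.normal(loc=30, scale=5, size=n)                 # random load
margin = S - L                           # the function whose law we want

p_fail = float(np.mean(margin < 0))      # P(strength < load)
q05 = float(np.quantile(margin, 0.05))   # 5th percentile of the margin
```

COVAL itself transforms the densities numerically rather than sampling; the Monte Carlo version trades accuracy in the far tails for simplicity.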

  9. Establishment probability in newly founded populations

    Directory of Open Access Journals (Sweden)

    Gusset Markus

    2012-06-01

    Full Text Available Abstract Background Establishment success in newly founded populations relies on reaching the established phase, which is defined by characteristic fluctuations of the population’s state variables. Stochastic population models can be used to quantify the establishment probability of newly founded populations; however, so far no simple but robust method for doing so existed. To determine a critical initial number of individuals that need to be released to reach the established phase, we used a novel application of the “Wissel plot”, where –ln(1 – P0(t)) is plotted against time t. This plot is based on the equation P0(t) = 1 – c1e^(–ω1t), which relates the probability of extinction by time t, P0(t), to two constants: c1 describes the probability of a newly founded population to reach the established phase, whereas ω1 describes the population’s probability of extinction per short time interval once established. Results For illustration, we applied the method to a previously developed stochastic population model of the endangered African wild dog (Lycaon pictus). A newly founded population reaches the established phase if the intercept of the (extrapolated) linear parts of the “Wissel plot” with the y-axis, which is –ln(c1), is negative. For wild dogs in our model, this is the case if a critical initial number of four packs, consisting of eight individuals each, is released. Conclusions The method we present to quantify the establishment probability of newly founded populations is generic and inferences thus are transferable to other systems across the field of conservation biology. In contrast to other methods, our approach disaggregates the components of a population’s viability by distinguishing establishment from persistence.
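
The fitting step behind the "Wissel plot" can be sketched as follows. The constants c1_true and w1_true are invented, and the extinction curve is generated from the equation itself rather than from a stochastic population model, so the fit recovers them exactly.

```python
import numpy as np

# Toy Wissel-plot analysis: P0(t) = 1 - c1*exp(-w1*t), so
# -ln(1 - P0(t)) = -ln(c1) + w1*t is linear with intercept -ln(c1).
c1_true, w1_true = 0.7, 0.02        # invented constants
t = np.arange(1, 201, dtype=float)
p0 = 1.0 - c1_true * np.exp(-w1_true * t)   # P0(t): extinct by time t

y = -np.log(1.0 - p0)                       # Wissel-plot ordinate
# With simulated data one would fit only the late, linear part of the
# curve; here the whole curve is linear by construction.
slope, intercept = np.polyfit(t, y, 1)

established = bool(intercept < 0)   # establishment criterion from the paper
```

With these invented constants the intercept is –ln(0.7) > 0, so the criterion reports that the established phase is not reached.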

  10. A Tale of Two Probabilities

    Science.gov (United States)

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  11. Introduction to probability with R

    CERN Document Server

    Baclawski, Kenneth

    2008-01-01

    FOREWORD PREFACE Sets, Events, and Probability The Algebra of Sets The Bernoulli Sample Space The Algebra of Multisets The Concept of Probability Properties of Probability Measures Independent Events The Bernoulli Process The R Language Finite Processes The Basic Models Counting Rules Computing Factorials The Second Rule of Counting Computing Probabilities Discrete Random Variables The Bernoulli Process: Tossing a Coin The Bernoulli Process: Random Walk Independence and Joint Distributions Expectations The Inclusion-Exclusion Principle General Random Variable

  12. Electromagnetic field effect or simply stress? Effects of UMTS exposure on hippocampal longterm plasticity in the context of procedure related hormone release.

    Directory of Open Access Journals (Sweden)

    Nora Prochnow

    Full Text Available Harmful effects of electromagnetic fields (EMF) on cognitive and behavioural features of humans and rodents have been controversially discussed and have raised persistent concern about adverse effects of EMF on general brain functions. In the present study we applied radio-frequency (RF) signals of the Universal Mobile Telecommunications System (UMTS) to full-brain-exposed male Wistar rats in order to elaborate putative influences on stress hormone release (corticosterone, CORT, and adrenocorticotropic hormone, ACTH) and on hippocampus-derived synaptic long-term plasticity (LTP) and depression (LTD) as electrophysiological hallmarks of memory storage and memory consolidation. Exposure was computer controlled, providing blind conditions. Nominal brain-averaged specific absorption rates (SAR), as a measure of applied mass-related dissipated RF power, were 0, 2, and 10 W/kg over a period of 120 min. Comparison of cage-exposed animals revealed, regardless of EMF exposure, significantly increased CORT and ACTH levels, which corresponded with generally decreased field potential slopes and amplitudes in hippocampal LTP and LTD. Animals following SAR exposure of 2 W/kg (averaged over the whole brain of 2.3 g tissue mass) did not differ from the sham-exposed group in LTP and LTD experiments. In contrast, a significant reduction in LTP and LTD was observed at the high power rate of SAR (10 W/kg). The results demonstrate that a rate of 2 W/kg has no adverse impact on LTP and LTD, while 10 W/kg leads to significant effects on the electrophysiological parameters, which can be clearly distinguished from the stress-derived background. Our findings suggest that UMTS exposure with SAR in the range of 2 W/kg is not harmful to critical markers for memory storage and memory consolidation; however, an influence of UMTS at high energy absorption rates (10 W/kg) cannot be excluded.

  13. A first course in probability

    CERN Document Server

    Ross, Sheldon

    2014-01-01

    A First Course in Probability, Ninth Edition, features clear and intuitive explanations of the mathematics of probability theory, outstanding problem sets, and a variety of diverse examples and applications. This book is ideal for an upper-level undergraduate or graduate level introduction to probability for math, science, engineering and business students. It assumes a background in elementary calculus.

  14. A brief introduction to probability.

    Science.gov (United States)

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in 1600, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", which is a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied to statistical analysis.

  15. Propensity, Probability, and Quantum Theory

    Science.gov (United States)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  16. Prediction and probability in sciences

    International Nuclear Information System (INIS)

    Klein, E.; Sacquin, Y.

    1998-01-01

    This book reports the 7 presentations made at the third meeting 'physics and fundamental questions' whose theme was probability and prediction. The concept of probability that was invented to apprehend random phenomena has become an important branch of mathematics and its application range spreads from radioactivity to species evolution via cosmology or the management of very weak risks. The notion of probability is the basis of quantum mechanics and then is bound to the very nature of matter. The 7 topics are: - radioactivity and probability, - statistical and quantum fluctuations, - quantum mechanics as a generalized probability theory, - probability and the irrational efficiency of mathematics, - can we foresee the future of the universe?, - chance, eventuality and necessity in biology, - how to manage weak risks? (A.C.)

  17. Applied probability and stochastic processes

    CERN Document Server

    Sumita, Ushio

    1999-01-01

    Applied Probability and Stochastic Processes is an edited work written in honor of Julien Keilson. This volume has attracted a host of scholars in applied probability, who have made major contributions to the field, and have written survey and state-of-the-art papers on a variety of applied probability topics, including, but not limited to: perturbation method, time reversible Markov chains, Poisson processes, Brownian techniques, Bayesian probability, optimal quality control, Markov decision processes, random matrices, queueing theory and a variety of applications of stochastic processes. The book has a mixture of theoretical, algorithmic, and application chapters providing examples of the cutting-edge work that Professor Keilson has done or influenced over the course of his highly-productive and energetic career in applied probability and stochastic processes. The book will be of interest to academic researchers, students, and industrial practitioners who seek to use the mathematics of applied probability i...

  18. Poisson Processes in Free Probability

    OpenAIRE

    An, Guimei; Gao, Mingchu

    2015-01-01

    We prove a multidimensional Poisson limit theorem in free probability, and define joint free Poisson distributions in a non-commutative probability space. We define (compound) free Poisson processes explicitly, similar to the definitions of (compound) Poisson processes in classical probability. We prove that the sum of finitely many freely independent compound free Poisson processes is a compound free Poisson process. We give a step by step procedure for constructing a (compound) free Poisso...

  19. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    Science.gov (United States)

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  20. Probability inequalities for decomposition integrals

    Czech Academy of Sciences Publication Activity Database

    Agahi, H.; Mesiar, Radko

    2017-01-01

    Roč. 315, č. 1 (2017), s. 240-248 ISSN 0377-0427 Institutional support: RVO:67985556 Keywords : Decomposition integral * Superdecomposition integral * Probability inequalities Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 1.357, year: 2016 http://library.utia.cas.cz/separaty/2017/E/mesiar-0470959.pdf

  1. Expected utility with lower probabilities

    DEFF Research Database (Denmark)

    Hendon, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1994-01-01

    An uncertain and not just risky situation may be modeled using so-called belief functions assigning lower probabilities to subsets of outcomes. In this article we extend the von Neumann-Morgenstern expected utility theory from probability measures to belief functions. We use this theory...

  2. The antimicrobial efficacy of sustained release silver–carbene complex-loaded l-tyrosine polyphosphate nanoparticles: Characterization, in vitro and in vivo studies

    Science.gov (United States)

    Hindi, Khadijah M.; Ditto, Andrew J.; Panzner, Matthew J.; Medvetz, Douglas A.; Han, Daniel S.; Hovis, Christine E.; Hilliard, Julia K.; Taylor, Jane B.; Yun, Yang H.; Cannon, Carolyn L.; Youngs, Wiley J.

    2009-01-01

    The pressing need to treat multi-drug resistant bacteria in the chronically infected lungs of cystic fibrosis (CF) patients has given rise to novel nebulized antimicrobials. We have synthesized a silver–carbene complex (SCC10) active against a variety of bacterial strains associated with CF and chronic lung infections. Our studies have demonstrated that SCC10 loaded into l-tyrosine polyphosphate nanoparticles (LTP NPs) exhibits excellent antimicrobial activity in vitro and in vivo against the CF-relevant bacterium Pseudomonas aeruginosa. Encapsulation of SCC10 in LTP NPs provides sustained release of the antimicrobial over the course of several days, translating into efficacious results in vivo with only two administered doses over a 72 h period. PMID:19395021

  3. Invariant probabilities of transition functions

    CERN Document Server

    Zaharopol, Radu

    2014-01-01

    The structure of the set of all the invariant probabilities and the structure of various types of individual invariant probabilities of a transition function are two topics of significant interest in the theory of transition functions, and are studied in this book. The results obtained are useful in ergodic theory and the theory of dynamical systems, which, in turn, can be applied in various other areas (like number theory). They are illustrated using transition functions defined by flows, semiflows, and one-parameter convolution semigroups of probability measures. In this book, all results on transition probabilities that have been published by the author between 2004 and 2008 are extended to transition functions. The proofs of the results obtained are new. For transition functions that satisfy very general conditions the book describes an ergodic decomposition that provides relevant information on the structure of the corresponding set of invariant probabilities. Ergodic decomposition means a splitting of t...

  4. Introduction to probability with Mathematica

    CERN Document Server

    Hastings, Kevin J

    2009-01-01

    Discrete Probability The Cast of Characters Properties of Probability Simulation Random Sampling Conditional Probability Independence Discrete Distributions Discrete Random Variables, Distributions, and Expectations Bernoulli and Binomial Random Variables Geometric and Negative Binomial Random Variables Poisson Distribution Joint, Marginal, and Conditional Distributions More on Expectation Continuous Probability From the Finite to the (Very) Infinite Continuous Random Variables and Distributions Continuous Expectation Continuous Distributions The Normal Distribution Bivariate Normal Distribution New Random Variables from Old Order Statistics Gamma Distributions Chi-Square, Student's t, and F-Distributions Transformations of Normal Random Variables Asymptotic Theory Strong and Weak Laws of Large Numbers Central Limit Theorem Stochastic Processes and Applications Markov Chains Poisson Processes Queues Brownian Motion Financial Mathematics Appendix Introduction to Mathematica Glossary of Mathematica Commands for Probability Short Answers...

  5. Linear positivity and virtual probability

    International Nuclear Information System (INIS)

    Hartle, James B.

    2004-01-01

    We investigate the quantum theory of closed systems based on the linear positivity decoherence condition of Goldstein and Page. The objective of any quantum theory of a closed system, most generally the universe, is the prediction of probabilities for the individual members of sets of alternative coarse-grained histories of the system. Quantum interference between members of a set of alternative histories is an obstacle to assigning probabilities that are consistent with the rules of probability theory. A quantum theory of closed systems therefore requires two elements: (1) a condition specifying which sets of histories may be assigned probabilities and (2) a rule for those probabilities. The linear positivity condition of Goldstein and Page is the weakest of the general conditions proposed so far. Its general properties relating to exact probability sum rules, time neutrality, and conservation laws are explored. Its inconsistency with the usual notion of independent subsystems in quantum mechanics is reviewed. Its relation to the stronger condition of medium decoherence necessary for classicality is discussed. The linear positivity of histories in a number of simple model systems is investigated with the aim of exhibiting linearly positive sets of histories that are not decoherent. The utility of extending the notion of probability to include values outside the range of 0-1 is described. Alternatives with such virtual probabilities cannot be measured or recorded, but can be used in the intermediate steps of calculations of real probabilities. Extended probabilities give a simple and general way of formulating quantum theory. The various decoherence conditions are compared in terms of their utility for characterizing classicality and the role they might play in further generalizations of quantum mechanics

  6. Probability Machines: Consistent Probability Estimation Using Nonparametric Learning Machines

    Science.gov (United States)

    Malley, J. D.; Kruppa, J.; Dasgupta, A.; Malley, K. G.; Ziegler, A.

    2011-01-01

    Summary Background Most machine learning approaches only provide a classification for binary responses. However, probabilities are required for risk estimation using individual patient characteristics. It has been shown recently that every statistical learning machine known to be consistent for a nonparametric regression problem is a probability machine that is provably consistent for this estimation problem. Objectives The aim of this paper is to show how random forests and nearest neighbors can be used for consistent estimation of individual probabilities. Methods Two random forest algorithms and two nearest neighbor algorithms are described in detail for estimation of individual probabilities. We discuss the consistency of random forests, nearest neighbors and other learning machines in detail. We conduct a simulation study to illustrate the validity of the methods. We exemplify the algorithms by analyzing two well-known data sets on the diagnosis of appendicitis and the diagnosis of diabetes in Pima Indians. Results Simulations demonstrate the validity of the method. With the real data application, we show the accuracy and practicality of this approach. We provide sample code from R packages in which the probability estimation is already available. This means that all calculations can be performed using existing software. Conclusions Random forest algorithms as well as nearest neighbor approaches are valid machine learning methods for estimating individual probabilities for binary responses. Freely available implementations are available in R and may be used for applications. PMID:21915433
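
The nearest-neighbour variant of a probability machine is easy to sketch without the R packages the paper points to. This is a minimal NumPy illustration, not the paper's code; the synthetic data, the sigmoid ground truth, and the choice k = 100 are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(2)

# Minimal k-nearest-neighbour "probability machine": the estimated
# probability at a query point is the fraction of positive responses
# among its k nearest training points.
def knn_probability(X_train, y_train, X_query, k=25):
    d = np.linalg.norm(X_train[None, :, :] - X_query[:, None, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]   # k nearest training indices
    return y_train[nearest].mean(axis=1)     # fraction of 1s ≈ Pr(y=1 | x)

# Synthetic binary data with known truth: Pr(y=1 | x) = sigmoid(2x).
X = rng.normal(size=(2000, 1))
p_true = 1.0 / (1.0 + np.exp(-2.0 * X[:, 0]))
y = (rng.random(2000) < p_true).astype(float)

X_query = np.array([[-2.0], [0.0], [2.0]])
p_hat = knn_probability(X, y, X_query, k=100)   # ≈ sigmoid(2x) at each query
```

Because the local label average is a consistent regression estimate of E[y | x] = Pr(y = 1 | x), the estimates approach the true probabilities as the sample grows and k grows suitably with it.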

  7. Probable Inference and Quantum Mechanics

    International Nuclear Information System (INIS)

    Grandy, W. T. Jr.

    2009-01-01

    In its current very successful interpretation the quantum theory is fundamentally statistical in nature. Although commonly viewed as a probability amplitude whose (complex) square is a probability, the wavefunction or state vector continues to defy consensus as to its exact meaning, primarily because it is not a physical observable. Rather than approach this problem directly, it is suggested that it is first necessary to clarify the precise role of probability theory in quantum mechanics, either as applied to, or as an intrinsic part of the quantum theory. When all is said and done the unsurprising conclusion is that quantum mechanics does not constitute a logic and probability unto itself, but adheres to the long-established rules of classical probability theory while providing a means within itself for calculating the relevant probabilities. In addition, the wavefunction is seen to be a description of the quantum state assigned by an observer based on definite information, such that the same state must be assigned by any other observer based on the same information, in much the same way that probabilities are assigned.

  8. Prediction of accident sequence probabilities in a nuclear power plant due to earthquake events

    International Nuclear Information System (INIS)

    Hudson, J.M.; Collins, J.D.

    1980-01-01

    This paper presents a methodology to predict accident probabilities in nuclear power plants subject to earthquakes. The resulting computer program accesses response data to compute component failure probabilities using fragility functions. Using logical failure definitions for systems, and the calculated component failure probabilities, initiating event and safety system failure probabilities are synthesized. The incorporation of accident sequence expressions allows the calculation of terminal event probabilities. Accident sequences, with their occurrence probabilities, are finally coupled to a specific release category. A unique aspect of the methodology is an analytical procedure for calculating top event probabilities based on the correlated failure of primary events
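
A toy version of the synthesis described above, assuming lognormal fragility curves (a standard seismic-PRA form), independence between component failures, and made-up medians and β values; the paper's methodology additionally handles correlated failures, which this sketch does not.

```python
from math import erf, log, sqrt

def fragility(a, median, beta):
    """P(component fails | ground-motion level a), lognormal fragility."""
    z = log(a / median) / beta
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

a = 0.5   # hypothetical earthquake level (g)
p_pump   = fragility(a, median=1.2, beta=0.4)
p_valve  = fragility(a, median=0.9, beta=0.3)
p_diesel = fragility(a, median=0.7, beta=0.5)

# Logical failure definition: the system fails if the pump fails OR both
# redundant trains (valve AND diesel) fail, all assumed independent.
p_trains = p_valve * p_diesel                        # AND gate
p_system = 1.0 - (1.0 - p_pump) * (1.0 - p_trains)   # OR gate

# Conditional on this earthquake occurring, the accident-sequence
# probability here is just the safety-system failure probability; the
# full methodology then couples sequences to release categories.
p_sequence = p_system
```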

  9. Failure probability under parameter uncertainty.

    Science.gov (United States)

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
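
The inflation effect is easy to reproduce by simulation. The log-normal risk factor, the sample size of 20, and the nominal 1% level below are illustrative assumptions; the article computes this exceedance probability exactly for location-scale families, whereas this sketch only estimates it by Monte Carlo.

```python
import numpy as np

rng = np.random.default_rng(3)

# Set the threshold at the *estimated* 99th percentile from a small
# sample and measure how often the next loss exceeds it: the expected
# failure frequency comes out above the nominal 1%.
mu, sigma, n_obs = 0.0, 1.0, 20   # true parameters; small estimation sample
n_trials = 20_000
failures = 0
for _ in range(n_trials):
    sample = rng.lognormal(mu, sigma, n_obs)
    logs = np.log(sample)
    mu_hat, sig_hat = logs.mean(), logs.std(ddof=1)
    threshold = np.exp(mu_hat + 2.326 * sig_hat)   # estimated 99% quantile
    loss = rng.lognormal(mu, sigma)                # next-period realization
    failures += loss > threshold

freq = failures / n_trials   # noticeably above the nominal 0.01
```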

  10. Probability with applications and R

    CERN Document Server

    Dobrow, Robert P

    2013-01-01

    An introduction to probability at the undergraduate level Chance and randomness are encountered on a daily basis. Authored by a highly qualified professor in the field, Probability: With Applications and R delves into the theories and applications essential to obtaining a thorough understanding of probability. With real-life examples and thoughtful exercises from fields as diverse as biology, computer science, cryptology, ecology, public health, and sports, the book is accessible for a variety of readers. The book's emphasis on simulation through the use of the popular R software language c

  11. A philosophical essay on probabilities

    CERN Document Server

    Laplace, Marquis de

    1996-01-01

    A classic of science, this famous essay by "the Newton of France" introduces lay readers to the concepts and uses of probability theory. It is of especial interest today as an application of mathematical techniques to problems in social and biological sciences. Generally recognized as the founder of the modern phase of probability theory, Laplace here applies the principles and general results of his theory "to the most important questions of life, which are, in effect, for the most part, problems in probability." Thus, without the use of higher mathematics, he demonstrates the application

  12. Methane release

    International Nuclear Information System (INIS)

    Seifert, M.

    1999-01-01

    The Swiss Gas Industry has carried out a systematic, technical estimate of methane release from the complete supply chain from production to consumption for the years 1992/1993. The result of this survey provided a conservative value, amounting to 0.9% of the Swiss domestic output. A continuation of the study taking into account new findings with regard to emission factors and the effect of the climate is now available, which provides a value of 0.8% for the target year of 1996. These results show that the renovation of the network has brought about lower losses in the local gas supplies, particularly for the grey cast iron pipelines. (author)

  13. Logic, probability, and human reasoning.

    Science.gov (United States)

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  14. Free probability and random matrices

    CERN Document Server

    Mingo, James A

    2017-01-01

    This volume opens the world of free probability to a wide variety of readers. From its roots in the theory of operator algebras, free probability has intertwined with non-crossing partitions, random matrices, applications in wireless communications, representation theory of large groups, quantum groups, the invariant subspace problem, large deviations, subfactors, and beyond. This book puts a special emphasis on the relation of free probability to random matrices, but also touches upon the operator algebraic, combinatorial, and analytic aspects of the theory. The book serves as a combination textbook/research monograph, with self-contained chapters, exercises scattered throughout the text, and coverage of important ongoing progress of the theory. It will appeal to graduate students and all mathematicians interested in random matrices and free probability from the point of view of operator algebras, combinatorics, analytic functions, or applications in engineering and statistical physics.

  15. Introduction to probability and measure

    CERN Document Server

    Parthasarathy, K R

    2005-01-01

    According to a remark attributed to Mark Kac 'Probability Theory is a measure theory with a soul'. This book with its choice of proofs, remarks, examples and exercises has been prepared taking both these aesthetic and practical aspects into account.

  16. Real-time monitoring of extracellular l-glutamate levels released by high-frequency stimulation at region CA1 of hippocampal slices with a glass capillary-based l-glutamate sensor

    Directory of Open Access Journals (Sweden)

    Yuki Ikegami

    2014-12-01

    Full Text Available Real-time monitoring of l-glutamate released by high-frequency stimulation in region CA1 of mouse hippocampal slices was performed with a glass capillary-based sensor, in combination with the recording of field excitatory postsynaptic potentials (fEPSPs). A method for extracting l-glutamate currents from the recorded ones was described and applied to determining the level of extracellular l-glutamate released by 100 Hz stimulation. Recording of an l-glutamate current with a current sampling interval of 1 Hz was found to be useful for acquiring a Faradaic current that reflects the l-glutamate level released by the high-frequency stimulation of 7 trains, each of 20 stimuli at 100 Hz with an inter-train interval of 3 s. The l-glutamate level was obtained as 15 ± 6 μM (n = 8) for the persistent enhancement of fEPSPs, i.e., the induction of long-term potentiation (LTP), and 3 ± 1 μM (n = 5) for the case of no LTP induction. Based on these observations, the level of extracellular l-glutamate was shown to play a crucial role in the induction of LTP.

  17. Joint probabilities and quantum cognition

    International Nuclear Information System (INIS)

    Acacio de Barros, J.

    2012-01-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  18. Joint probabilities and quantum cognition

    Energy Technology Data Exchange (ETDEWEB)

    Acacio de Barros, J. [Liberal Studies, 1600 Holloway Ave., San Francisco State University, San Francisco, CA 94132 (United States)

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  19. Default probabilities and default correlations

    OpenAIRE

    Erlenmaier, Ulrich; Gersbach, Hans

    2001-01-01

    Starting from the Merton framework for firm defaults, we provide the analytics and robustness of the relationship between default probabilities and default correlations. We show that loans with higher default probabilities will not only have higher variances but also higher correlations with other loans. As a consequence, portfolio standard deviation can increase substantially when loan default probabilities rise. This result has two important implications. First, relative prices of loans with different default probabili...
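The claimed link between default probabilities and default correlations can be illustrated in a one-factor Gaussian (Merton-style) model. The function below is a sketch, not the paper's analytics, and the asset correlation of 0.3 is an arbitrary illustrative value:

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def default_correlation(p, asset_rho):
    """Correlation of two default indicators in a Gaussian (Merton-style)
    model: a firm defaults when its standardized asset value falls below
    the threshold c = Phi^{-1}(p)."""
    c = norm.ppf(p)
    joint = multivariate_normal(mean=[0.0, 0.0],
                                cov=[[1.0, asset_rho],
                                     [asset_rho, 1.0]]).cdf([c, c])
    return (joint - p * p) / (p * (1.0 - p))

# Holding asset correlation fixed, a higher default probability yields a
# higher default correlation, consistent with the abstract's claim.
low_pd = default_correlation(0.01, 0.3)
high_pd = default_correlation(0.05, 0.3)
```

Note that the default-indicator correlation stays well below the asset correlation for small default probabilities, which is why portfolio risk is so sensitive to rising PDs.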

  20. The Probabilities of Unique Events

    Science.gov (United States)

    2012-08-30

    Washington, DC USA Max Lotstein and Phil Johnson-Laird Department of Psychology Princeton University Princeton, NJ USA August 30th 2012...social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as...retorted that such a flagrant violation of the probability calculus was a result of a psychological experiment that obscured the rationality of the

  1. Probability Matching, Fast and Slow

    OpenAIRE

    Koehler, Derek J.; James, Greta

    2014-01-01

    A prominent point of contention among researchers regarding the interpretation of probability-matching behavior is whether it represents a cognitively sophisticated, adaptive response to the inherent uncertainty of the tasks or settings in which it is observed, or whether instead it represents a fundamental shortcoming in the heuristics that support and guide human decision making. Put crudely, researchers disagree on whether probability matching is "smart" or "dumb." Here, we consider eviden...

  2. Probably not future prediction using probability and statistical inference

    CERN Document Server

    Dworsky, Lawrence N

    2008-01-01

    An engaging, entertaining, and informative introduction to probability and prediction in our everyday lives Although Probably Not deals with probability and statistics, it is not heavily mathematical and is not filled with complex derivations, proofs, and theoretical problem sets. This book unveils the world of statistics through questions such as what is known based upon the information at hand and what can be expected to happen. While learning essential concepts including "the confidence factor" and "random walks," readers will be entertained and intrigued as they move from chapter to chapter. Moreover, the author provides a foundation of basic principles to guide decision making in almost all facets of life including playing games, developing winning business strategies, and managing personal finances. Much of the book is organized around easy-to-follow examples that address common, everyday issues such as: How travel time is affected by congestion, driving speed, and traffic lights Why different gambling ...

  3. Normal probability plots with confidence.

    Science.gov (United States)

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
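A rough version of the idea can be sketched by simulating the order statistics of standard normal samples and taking Bonferroni-adjusted per-point quantiles, so that the simultaneous non-coverage is at most α. This is a simplified stand-in for the paper's exact construction:

```python
import numpy as np
from scipy.stats import norm

def npp_envelope(n, alpha=0.05, sims=4000, seed=0):
    """Simultaneous (Bonferroni-style) intervals for the n order statistics
    of a standard normal sample: per-point level alpha/n keeps the overall
    non-coverage probability at most alpha."""
    rng = np.random.default_rng(seed)
    order_stats = np.sort(rng.standard_normal((sims, n)), axis=1)
    lo = np.quantile(order_stats, alpha / (2 * n), axis=0)
    hi = np.quantile(order_stats, 1 - alpha / (2 * n), axis=0)
    return lo, hi

n = 20
lo, hi = npp_envelope(n)
# Blom's plotting positions give the reference line for the plot; a normal
# sample's ordered values should fall inside [lo, hi] simultaneously.
expected = norm.ppf((np.arange(1, n + 1) - 0.375) / (n + 0.25))
inside = bool(np.all(lo < expected) and np.all(expected < hi))
```

The Bonferroni adjustment is conservative; the paper's simultaneous intervals are calibrated exactly, which is what makes the graphical test competitive with Anderson-Darling and Shapiro-Wilk.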

  4. Probability theory a foundational course

    CERN Document Server

    Pakshirajan, R P

    2013-01-01

    This book shares the dictum of J. L. Doob in treating Probability Theory as a branch of Measure Theory and establishes this relation early. Probability measures in product spaces are introduced right at the start by way of laying the groundwork to later claim the existence of stochastic processes with prescribed finite dimensional distributions. Other topics analysed in the book include supports of probability measures, zero-one laws in product measure spaces, Erdos-Kac invariance principle, functional central limit theorem and functional law of the iterated logarithm for independent variables, Skorohod embedding, and the use of analytic functions of a complex variable in the study of geometric ergodicity in Markov chains. This book is offered as a text book for students pursuing graduate programs in Mathematics and/or Statistics. The book aims to help the teacher present the theory with ease, and to help the student sustain his interest and joy in learning the subject.

  5. VIBRATION ISOLATION SYSTEM PROBABILITY ANALYSIS

    Directory of Open Access Journals (Sweden)

    Smirnov Vladimir Alexandrovich

    2012-10-01

    The article deals with the probability analysis for a vibration isolation system of high-precision equipment, which is extremely sensitive to low-frequency oscillations even of submicron amplitude. The external sources of low-frequency vibrations may include the natural city background or internal low-frequency sources inside buildings (pedestrian activity, HVAC). Assuming a Gaussian distribution, the author estimates the probability that the relative displacement of the isolated mass remains below the vibration criteria. This problem is solved in the three-dimensional space spanned by the system parameters, including damping and natural frequency. According to this probability distribution, the chance of exceeding the vibration criteria for a vibration isolation system is evaluated. Optimal system parameters, damping and natural frequency, are derived so that the probability of exceeding vibration criteria VC-E and VC-D is less than 0.04.
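The exceedance computation the abstract describes can be sketched for a single-degree-of-freedom isolator under white-noise base excitation, using the standard random-vibration variance formula σ² = πS₀/(2ζωₙ³) (one common PSD convention) and a zero-mean Gaussian response. The numeric values are illustrative, not from the article:

```python
import math
from scipy.stats import norm

def exceedance_probability(limit, psd, zeta, omega_n):
    """P(|relative displacement| > limit) for a SDOF isolator driven by
    white-noise base acceleration with PSD `psd`, assuming a zero-mean
    Gaussian response with variance pi * psd / (2 * zeta * omega_n**3)."""
    sigma = math.sqrt(math.pi * psd / (2.0 * zeta * omega_n ** 3))
    return 2.0 * (1.0 - norm.cdf(limit / sigma))

# More damping -> smaller response variance -> smaller chance of
# exceeding the vibration criterion (all inputs are hypothetical).
p_light = exceedance_probability(1e-6, 1e-10, 0.02, 2 * math.pi * 5)
p_heavy = exceedance_probability(1e-6, 1e-10, 0.10, 2 * math.pi * 5)
```

Sweeping `zeta` and `omega_n` over a grid and thresholding the result at 0.04 reproduces, in caricature, the parameter-optimization step the abstract mentions.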

  6. Approximation methods in probability theory

    CERN Document Server

    Čekanavičius, Vydas

    2016-01-01

    This book presents a wide range of well-known and less common methods used for estimating the accuracy of probabilistic approximations, including the Esseen type inversion formulas, the Stein method as well as the methods of convolutions and triangle function. Emphasising the correct usage of the methods presented, each step required for the proofs is examined in detail. As a result, this textbook provides valuable tools for proving approximation theorems. While Approximation Methods in Probability Theory will appeal to everyone interested in limit theorems of probability theory, the book is particularly aimed at graduate students who have completed a standard intermediate course in probability theory. Furthermore, experienced researchers wanting to enlarge their toolkit will also find this book useful.

  7. Model uncertainty: Probabilities for models?

    International Nuclear Information System (INIS)

    Winkler, R.L.

    1994-01-01

    Like any other type of uncertainty, model uncertainty should be treated in terms of probabilities. The question is how to do this. The most commonly-used approach has a drawback related to the interpretation of the probabilities assigned to the models. If we step back and look at the big picture, asking what the appropriate focus of the model uncertainty question should be in the context of risk and decision analysis, we see that a different probabilistic approach makes more sense, although it raises some implementation questions. Current work that is underway to address these questions looks very promising
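One concrete way to treat model uncertainty in terms of probabilities is Bayesian model comparison, where each model receives a posterior probability proportional to its prior times its marginal likelihood. A toy sketch with two models of a coin (invented for illustration, not from the paper):

```python
from math import comb

def posterior_model_probs(heads, n, prior=(0.5, 0.5)):
    """Posterior probabilities of two models for coin-flip data.
    M1: p = 0.5 exactly.  M2: p uniform on [0, 1], whose marginal
    likelihood for any head count integrates to 1/(n+1)."""
    like_m1 = comb(n, heads) * 0.5 ** n
    like_m2 = 1.0 / (n + 1)  # Beta integral of the binomial likelihood
    num1 = prior[0] * like_m1
    num2 = prior[1] * like_m2
    z = num1 + num2
    return num1 / z, num2 / z

balanced = posterior_model_probs(10, 20)  # data near 50/50 favor M1
skewed = posterior_model_probs(18, 20)    # skewed data favor M2
```

The interpretive drawback the abstract alludes to shows up even here: the posterior model probabilities only make sense relative to the candidate set, since neither model may be "true".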

  8. Knowledge typology for imprecise probabilities.

    Energy Technology Data Exchange (ETDEWEB)

    Wilson, G. D. (Gregory D.); Zucker, L. J. (Lauren J.)

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.
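As a concrete example of a non-probabilistic formalism mentioned in the abstract, Dempster-Shafer mass functions can be fused with Dempster's rule of combination. The two "expert" assessments below are invented for illustration:

```python
def dempster_combine(m1, m2):
    """Combine two mass functions (dicts keyed by frozenset focal elements)
    with Dempster's rule: intersect focal elements, collect the mass that
    lands on the empty set as conflict, and renormalize the rest."""
    combined = {}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
    k = 1.0 - conflict
    return {s: w / k for s, w in combined.items()}, conflict

# Hypothetical elicitation over {fail, ok}; the second expert is partly
# ignorant, putting mass on the whole frame rather than on one outcome.
m1 = {frozenset({"fail"}): 0.7, frozenset({"fail", "ok"}): 0.3}
m2 = {frozenset({"ok"}): 0.4, frozenset({"fail", "ok"}): 0.6}
fused, conflict = dempster_combine(m1, m2)
```

The ability to put mass on the whole frame, rather than splitting it between outcomes, is exactly the kind of "ignorance" that standard probability elicitation struggles to capture.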

  9. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2011-01-01

    A mathematical and intuitive approach to probability, statistics, and stochastic processes This textbook provides a unique, balanced approach to probability, statistics, and stochastic processes. Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area. This text combines a rigorous, calculus-based development of theory with a more intuitive approach that appeals to readers' sense of reason and logic, an approach developed through the author's many years of classroom experience. The text begins with three chapters that d

  10. Statistical probability tables CALENDF program

    International Nuclear Information System (INIS)

    Ribon, P.

    1989-01-01

    The purpose of the probability tables is: - to obtain dense data representation - to calculate integrals by quadratures. They are mainly used in the USA for calculations by Monte Carlo and in the USSR and Europe for self-shielding calculations by the sub-group method. The moment probability tables, in addition to providing a more substantial mathematical basis and calculation methods, are adapted for condensation and mixture calculations, which are the crucial operations for reactor physics specialists. However, their extension is limited by the statistical hypothesis they imply. Efforts are being made to remove this obstacle, at the cost, it must be said, of greater complexity

  11. Probability, statistics, and queueing theory

    CERN Document Server

    Allen, Arnold O

    1990-01-01

    This is a textbook on applied probability and statistics with computer science applications for students at the upper undergraduate level. It may also be used as a self study book for the practicing computer science professional. The successful first edition of this book proved extremely useful to students who need to use probability, statistics and queueing theory to solve problems in other fields, such as engineering, physics, operations research, and management science. The book has also been successfully used for courses in queueing theory for operations research students. This second edit

  12. Probability and Statistics: 5 Questions

    DEFF Research Database (Denmark)

    Probability and Statistics: 5 Questions is a collection of short interviews based on 5 questions presented to some of the most influential and prominent scholars in probability and statistics. We hear their views on the fields, aims, scopes, the future direction of research and how their work fits ... in these respects. Interviews with Nick Bingham, Luc Bovens, Terrence L. Fine, Haim Gaifman, Donald Gillies, James Hawthorne, Carl Hoefer, James M. Joyce, Joseph B. Kadane Isaac Levi, D.H. Mellor, Patrick Suppes, Jan von Plato, Carl Wagner, Sandy Zabell...

  13. Group I mGluR antagonist rescues the deficit of D1-induced LTP in a mouse model of fragile X syndrome

    Directory of Open Access Journals (Sweden)

    Xu Zhao-Hui

    2012-05-01

    Abstract Background Fragile X syndrome (FXS) is caused by the absence of the mRNA-binding protein Fragile X mental retardation protein (FMRP), encoded by the Fmr1 gene. Overactive signaling by group 1 metabotropic glutamate receptor (Grp1 mGluR) could contribute to slowed synaptic development and other symptoms of FXS. Our previous study has identified that facilitation of synaptic long-term potentiation (LTP) by D1 receptor is impaired in Fmr1 knockout (KO) mice. However, the contribution of Grp1 mGluR to the facilitation of synaptic plasticity by D1 receptor stimulation in the prefrontal cortex has been less extensively studied. Results Here we demonstrated that DL-AP3, a Grp1 mGluR antagonist, rescued LTP facilitation by D1 receptor agonist SKF81297 in Fmr1 KO mice. Grp1 mGluR inhibition restored the GluR1-subtype AMPA receptor surface insertion by D1 activation in the cultured Fmr1 KO neurons. Simultaneous treatment of Grp1 mGluR antagonist with D1 agonist recovered the D1 receptor signaling by reversing the subcellular redistribution of G protein-coupled receptor kinase 2 (GRK2) in the Fmr1 KO neurons. Treatment of SKF81297 alone failed to increase the phosphorylation of NR2B-containing N-methyl D-aspartate receptors (NMDARs) at Tyr-1472 (p-NR2B-Tyr1472) in the cultures from KO mice. However, simultaneous treatment of DL-AP3 could rescue the level of p-NR2B-Tyr1472 by SKF81297 in the cultures from KO mice. Furthermore, behavioral tests indicated that simultaneous treatment of Grp1 mGluR antagonist with D1 agonist inhibited hyperactivity and improved the learning ability in the Fmr1 KO mice. Conclusion The findings demonstrate that mGluR1 inhibition is a useful strategy to recover D1 receptor signaling in the Fmr1 KO mice, and combination of Grp1 mGluR antagonist and D1 agonist is a potential drug therapy for the FXS.

  14. Does charge transfer correlate with ignition probability?

    International Nuclear Information System (INIS)

    Holdstock, Paul

    2008-01-01

    Flammable or explosive atmospheres exist in many industrial environments. The risk of ignition caused by electrostatic discharges is very real and there has been extensive study of the incendiary nature of sparks and brush discharges. It is clear that in order to ignite a gas, an amount of energy needs to be delivered to a certain volume of gas within a comparatively short time. It is difficult to measure the energy released in an electrostatic discharge directly, but it is possible to approximate the energy in a spark generated from a well defined electrical circuit. The spark energy required to ignite a gas, vapour or dust cloud can be determined by passing such sparks through them. There is a relationship between energy and charge in a capacitive circuit and so it is possible to predict whether or not a spark discharge will cause an ignition by measuring the charge transferred in the spark. Brush discharges are in many ways less well defined than sparks. Nevertheless, some work has been done that has established a relationship between charge transferred in brush discharges and the probability of igniting a flammable atmosphere. The question posed by this paper concerns whether such a relationship holds true in all circumstances and if there is a universal correlation between charge transfer and ignition probability. Data is presented on discharges from textile materials that go some way to answering this question.

  15. Dynamic SEP event probability forecasts

    Science.gov (United States)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
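The decay of an event probability with elapsed time since the flare can be sketched as multiplication by the empirical survival function of historical flare-to-onset delay times. The delay sample below is hypothetical, and this is a simplification of the paper's algorithm:

```python
import numpy as np

def dynamic_probability(p0, delays_hours, t_hours):
    """Decay an initial SEP event probability p0 as time passes with no
    event observed: multiply by the empirical fraction of historical
    flare-to-onset delays exceeding t."""
    delays = np.asarray(delays_hours, dtype=float)
    survival = np.mean(delays > t_hours)
    return p0 * survival

# Hypothetical delay-time sample (hours from X-ray peak to 10 pfu onset).
delays = [2, 3, 4, 6, 8, 12, 18, 24, 36, 48]
forecasts = [dynamic_probability(0.6, delays, t) for t in (0, 6, 24)]
```

In the paper the delays are taken from the NOAA event list and binned by solar source longitude; the sketch above collapses that dependence into a single sample.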

  16. Conditional Independence in Applied Probability.

    Science.gov (United States)

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  17. Stretching Probability Explorations with Geoboards

    Science.gov (United States)

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  18. GPS: Geometry, Probability, and Statistics

    Science.gov (United States)

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  19. Swedish earthquakes and acceleration probabilities

    International Nuclear Information System (INIS)

    Slunga, R.

    1979-03-01

    A method to assign probabilities to ground accelerations for Swedish sites is described. As hardly any near-field instrumental data are available, we are left with the problem of interpreting macroseismic data in terms of acceleration. By theoretical wave-propagation computations, the relations between the seismic strength of the earthquake, focal depth, distance and ground acceleration are calculated. We found that the largest earthquake of the area, the 1904 earthquake 100 km south of Oslo, is an exception and probably had a focal depth exceeding 25 km. For the nuclear power plant sites an annual probability of 10 -5 has been proposed as interesting. This probability gives ground accelerations in the range 5-20 % g for the sites. This acceleration is for a free bedrock site; for consistency, all acceleration results in this study are given for bedrock sites. When applying our model to the 1904 earthquake and assuming the focal zone to be in the lower crust, we get an epicentral acceleration for this earthquake of 5-15 % g. The results above are based on an analysis of macroseismic data, as relevant instrumental data are lacking. However, the macroseismic acceleration model deduced in this study gives epicentral ground accelerations of small Swedish earthquakes in agreement with existing distant instrumental data. (author)

  20. DECOFF Probabilities of Failed Operations

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2015-01-01

    A statistical procedure of estimation of Probabilities of Failed Operations is described and exemplified using ECMWF weather forecasts and SIMO output from Rotor Lift test case models. Also safety factor influence is investigated. DECOFF statistical method is benchmarked against standard Alpha-factor...

  1. Risk estimation using probability machines

    Science.gov (United States)

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
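The spirit of a probability machine, consistent nonparametric estimation of P(Y=1 | x), can be illustrated with a k-nearest-neighbor estimate on synthetic logistic data. This is a stand-in for the random forests used in the paper, kept dependency-light on purpose:

```python
import numpy as np

def knn_probability(x_train, y_train, x_query, k=100):
    """Nonparametric conditional probability estimate P(Y=1 | x): the mean
    label among the k nearest training points (a simple stand-in for the
    random-forest probability machines discussed in the abstract)."""
    x_train = np.asarray(x_train)
    order = np.argsort(np.abs(x_train - x_query))
    return float(np.mean(np.asarray(y_train)[order[:k]]))

# Synthetic data whose true model IS logistic, so we can compare the
# nonparametric estimate against the known conditional probability.
rng = np.random.default_rng(1)
x = rng.uniform(-2, 2, size=2000)
p_true = 1.0 / (1.0 + np.exp(-2.0 * x))
y = rng.binomial(1, p_true)

grid = np.linspace(-1.5, 1.5, 7)
est = np.array([knn_probability(x, y, g) for g in grid])
err = np.max(np.abs(est - 1.0 / (1.0 + np.exp(-2.0 * grid))))
```

The point of the paper is stronger than this sketch: the learning-machine estimates remain consistent even when the true model is not logistic, which is exactly where a misspecified logistic regression would be biased.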

  2. Probability and statistics: A reminder

    International Nuclear Information System (INIS)

    Clement, B.

    2013-01-01

    The main purpose of these lectures is to provide the reader with the tools needed for data analysis in the framework of physics experiments. Basic concepts are introduced together with examples of application in experimental physics. The lecture is divided into two parts: probability and statistics. It is built on the introduction from 'data analysis in experimental sciences' given in [1]. (authors)

  3. Nash equilibrium with lower probabilities

    DEFF Research Database (Denmark)

    Groes, Ebbe; Jacobsen, Hans Jørgen; Sloth, Birgitte

    1998-01-01

    We generalize the concept of Nash equilibrium in mixed strategies for strategic form games to allow for ambiguity in the players' expectations. In contrast to other contributions, we model ambiguity by means of so-called lower probability measures or belief functions, which makes it possible...

  4. On probability-possibility transformations

    Science.gov (United States)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.
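One widely used probability-to-possibility transformation (attributed to Dubois and Prade) sets each possibility degree to the sum of all probabilities not exceeding that element's probability. The sketch below shows this one transformation only, whereas the paper compares several:

```python
def prob_to_poss(p):
    """Dubois-Prade transformation: pi_i = sum of all p_j with p_j <= p_i.
    The result is a possibility distribution that is consistent with p:
    pi_i >= p_i for every i, and max(pi) = 1."""
    return [sum(q for q in p if q <= pi) for pi in p]

p = [0.5, 0.3, 0.2]
poss = prob_to_poss(p)  # the most probable element gets possibility 1
```

Comparing how well such transformations preserve second-order properties like noninteraction and projections, as the abstract describes, amounts to applying them to joint and marginal distributions and measuring the discrepancy.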

  5. Large deviations and idempotent probability

    CERN Document Server

    Puhalskii, Anatolii

    2001-01-01

    In the view of many probabilists, author Anatolii Puhalskii's research results stand among the most significant achievements in the modern theory of large deviations. In fact, his work marked a turning point in the depth of our understanding of the connections between the large deviation principle (LDP) and well-known methods for establishing weak convergence results. Large Deviations and Idempotent Probability expounds upon the recent methodology of building large deviation theory along the lines of weak convergence theory. The author develops an idempotent (or maxitive) probability theory, introduces idempotent analogues of martingales (maxingales), Wiener and Poisson processes, and Ito differential equations, and studies their properties. The large deviation principle for stochastic processes is formulated as a certain type of convergence of stochastic processes to idempotent processes. The author calls this large deviation convergence. The approach to establishing large deviation convergence uses novel com...
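For contrast with the idempotent formulation, the classical large deviation principle behind the book's subject can be checked numerically for coin flips: P(Sₙ/n ≥ a) decays like exp(-n·I(a)), where I is the Bernoulli rate (relative entropy) function, and the Chernoff bound makes exp(-n·I(a)) an exact upper bound:

```python
import math

def rate_function(a, p=0.5):
    """Cramer rate for Bernoulli(p): I(a) = KL(Bernoulli(a) || Bernoulli(p))."""
    return a * math.log(a / p) + (1 - a) * math.log((1 - a) / (1 - p))

def binomial_tail(n, k):
    """Exact P(S_n >= k) for a fair coin, via integer arithmetic."""
    return sum(math.comb(n, j) for j in range(k, n + 1)) / 2 ** n

n, a = 200, 0.7
tail = binomial_tail(n, int(n * a))
chernoff = math.exp(-n * rate_function(a))
# LDP: log P(S_n/n >= a) ~ -n I(a); Chernoff: tail <= exp(-n I(a)) exactly.
```

The idempotent ("maxitive") viewpoint the book develops replaces summation of probabilities with maximization, so that exponents like -I(a) play the role ordinary probabilities play in weak convergence.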

  6. Probability biases as Bayesian inference

    Directory of Open Access Journals (Sweden)

    Andre; C. R. Martins

    2006-11-01

    In this article, I will show how several observed biases in human probabilistic reasoning can be partially explained as good heuristics for making inferences in an environment where probabilities have uncertainties associated with them. Previous results show that the weight functions and the observed violations of coalescing and stochastic dominance can be understood from a Bayesian point of view. We will review those results and see that Bayesian methods should also be used as part of the explanation behind other known biases. That means that, although the observed errors are still errors, they can be understood as adaptations to the solution of real-life problems. Heuristics that allow fast evaluations and mimic a Bayesian inference would be an evolutionary advantage, since they would give us an efficient way of making decisions. In that sense, it should be no surprise that humans reason with probability as it has been observed.

  7. Probability matching and strategy availability.

    Science.gov (United States)

    Koehler, Derek J; James, Greta

    2010-09-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought to their attention, more participants subsequently engage in maximizing. Third, matchers are more likely than maximizers to base decisions in other tasks on their initial intuitions, suggesting that they are more inclined to use a choice strategy that comes to mind quickly. These results indicate that a substantial subset of probability matchers are victims of "underthinking" rather than "overthinking": They fail to engage in sufficient deliberation to generate a superior alternative to the matching strategy that comes so readily to mind.
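Why maximizing is the superior strategy is a one-line calculation: if one outcome occurs with probability p > 1/2, matching is correct with expected probability p² + (1-p)², which is always less than p. A minimal sketch:

```python
def expected_accuracy(p, strategy):
    """Expected prediction accuracy over repeated trials when one outcome
    occurs with probability p. 'matching' guesses that outcome with
    probability p; 'maximizing' always guesses the more likely outcome."""
    if strategy == "matching":
        return p * p + (1 - p) * (1 - p)
    if strategy == "maximizing":
        return max(p, 1 - p)
    raise ValueError(strategy)

match_acc = expected_accuracy(0.7, "matching")    # ~0.58
max_acc = expected_accuracy(0.7, "maximizing")    # 0.70
```

The gap (0.70 versus 0.58 at p = 0.7) is what makes the persistence of matching behavior theoretically interesting.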

  8. Probability as a Physical Motive

    Directory of Open Access Journals (Sweden)

    Peter Martin

    2007-04-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  9. Logic, Probability, and Human Reasoning

    Science.gov (United States)

    2015-01-01

    accordingly suggest a way to integrate probability and deduction. The nature of deductive reasoning: to be rational is to be able to make deductions ... [3–6] and they underlie mathematics, science, and technology [7–10]. Plato claimed that emotions upset reasoning. However, individuals in the grip ... fundamental to human rationality. So, if counterexamples to its principal predictions occur, the theory will at least explain its own refutation

  10. Probability Measures on Groups IX

    CERN Document Server

    1989-01-01

    The latest in this series of Oberwolfach conferences focussed on the interplay between structural probability theory and various other areas of pure and applied mathematics such as Tauberian theory, infinite-dimensional rotation groups, central limit theorems, harmonizable processes, and spherical data. Thus it was attended by mathematicians whose research interests range from number theory to quantum physics in conjunction with structural properties of probabilistic phenomena. This volume contains 5 survey articles submitted on special invitation and 25 original research papers.

  11. Probability matching and strategy availability

    OpenAIRE

    J. Koehler, Derek; Koehler, Derek J.; James, Greta

    2010-01-01

    Findings from two experiments indicate that probability matching in sequential choice arises from an asymmetry in strategy availability: The matching strategy comes readily to mind, whereas a superior alternative strategy, maximizing, does not. First, compared with the minority who spontaneously engage in maximizing, the majority of participants endorse maximizing as superior to matching in a direct comparison when both strategies are described. Second, when the maximizing strategy is brought...

  12. [Biometric bases: basic concepts of probability calculation].

    Science.gov (United States)

    Dinya, E

    1998-04-26

    The author gives an outline of the basic concepts of probability theory. The basics of event algebra, the definition of probability, the classical probability model and the random variable are presented.

  13. Probability for Weather and Climate

    Science.gov (United States)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  14. Probability, Statistics, and Stochastic Processes

    CERN Document Server

    Olofsson, Peter

    2012-01-01

    This book provides a unique and balanced approach to probability, statistics, and stochastic processes.   Readers gain a solid foundation in all three fields that serves as a stepping stone to more advanced investigations into each area.  The Second Edition features new coverage of analysis of variance (ANOVA), consistency and efficiency of estimators, asymptotic theory for maximum likelihood estimators, empirical distribution function and the Kolmogorov-Smirnov test, general linear models, multiple comparisons, Markov chain Monte Carlo (MCMC), Brownian motion, martingales, and

  15. Probability, statistics, and computational science.

    Science.gov (United States)

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  16. Sensitivity analysis using probability bounding

    International Nuclear Information System (INIS)

    Ferson, Scott; Troy Tucker, W.

    2006-01-01

    Probability bounds analysis (PBA) provides analysts a convenient means to characterize the neighborhood of possible results that would be obtained from plausible alternative inputs in probabilistic calculations. We show the relationship between PBA and the methods of interval analysis and probabilistic uncertainty analysis from which it is jointly derived, and indicate how the method can be used to assess the quality of probabilistic models such as those developed in Monte Carlo simulations for risk analyses. We also illustrate how a sensitivity analysis can be conducted within a PBA by pinching inputs to precise distributions or real values
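
A minimal numerical sketch of the pinching idea (crude intervals stand in for full p-boxes; the risk function, input ranges, and pinched value are illustrative assumptions, not taken from the paper):

```python
# Pinching sensitivity analysis sketch: compute output bounds with interval
# inputs, pinch one input to a precise value, and measure how much the
# output interval narrows. All numbers are illustrative.

def interval_mul(a, b):
    """Product of two intervals [a_lo, a_hi] x [b_lo, b_hi]."""
    products = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(products), max(products))

def width(iv):
    return iv[1] - iv[0]

# Risk = frequency * consequence, both uncertain.
freq = (0.01, 0.05)     # events per year (assumed interval)
cons = (100.0, 400.0)   # cost per event (assumed interval)

base = interval_mul(freq, cons)

# "Pinch" the frequency to a precise value and recompute.
pinched = interval_mul((0.03, 0.03), cons)

# The reduction in output width measures how much of the overall
# uncertainty is attributable to the pinched input.
reduction = 1.0 - width(pinched) / width(base)
print(base, pinched, round(reduction, 3))
```

The fraction by which the output interval narrows is one common way to rank inputs for further data collection.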

  17. Lectures on probability and statistics

    International Nuclear Information System (INIS)

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another
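
Both directions described in the lectures, a priori probability and inverse (statistical) inference, can be made concrete with dice (the observed rolls below are hypothetical data, not from the notes):

```python
from fractions import Fraction
from itertools import product

# A priori direction: chance that two fair dice sum to 7, computed by
# enumerating the equally likely outcomes.
outcomes = list(product(range(1, 7), repeat=2))
p_seven = Fraction(sum(1 for a, b in outcomes if a + b == 7), len(outcomes))
print(p_seven)  # 1/6

# Inverse (statistical) direction: given observed rolls, estimate the
# probability of each face by its relative frequency -- the maximum
# likelihood estimate for a categorical distribution.
rolls = [3, 6, 2, 6, 1, 6, 4, 6, 5, 6]   # hypothetical data
est = {face: rolls.count(face) / len(rolls) for face in range(1, 7)}
print(est[6])  # 0.5
```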

  18. iTBS-induced LTP-like plasticity parallels oscillatory activity changes in the primary sensory and motor areas of macaque monkeys.

    Directory of Open Access Journals (Sweden)

    Odysseas Papazachariadis

    Full Text Available Recently, neuromodulation techniques based on the use of repetitive transcranial magnetic stimulation (rTMS) have been proposed as a non-invasive and efficient method to induce in vivo long-term potentiation (LTP)-like aftereffects. However, the exact impact of rTMS-induced perturbations on the dynamics of neuronal population activity is not well understood. Here, in two monkeys, we examine changes in the oscillatory activity of the sensorimotor cortex following an intermittent theta burst stimulation (iTBS) protocol. We first probed iTBS modulatory effects by testing the iTBS-induced facilitation of somatosensory evoked potentials (SEP). Then, we examined the frequency information of the electrocorticographic signal, obtained using a custom-made miniaturised multi-electrode array for electrocorticography, after real or sham iTBS. We observed that iTBS induced facilitation of SEPs and influenced spectral components of the signal, in both animals. The latter effect was more prominent in the θ band (4-8 Hz) and the high γ band (55-90 Hz), which were de-potentiated and potentiated, respectively. We additionally found that the multi-electrode array uniformity of the β (13-26 Hz) and high γ bands was also affected by iTBS. Our study suggests that the enhanced cortical excitability promoted by iTBS parallels a dynamic reorganisation of the neural network involved. The effect in the γ band suggests a transient local modulation, possibly at the level of synaptic strength in interneurons. The effect in the θ band suggests the disruption of temporal coordination on larger spatial scales.

  19. iTBS-induced LTP-like plasticity parallels oscillatory activity changes in the primary sensory and motor areas of macaque monkeys.

    Science.gov (United States)

    Papazachariadis, Odysseas; Dante, Vittorio; Verschure, Paul F M J; Del Giudice, Paolo; Ferraina, Stefano

    2014-01-01

    Recently, neuromodulation techniques based on the use of repetitive transcranial magnetic stimulation (rTMS) have been proposed as a non-invasive and efficient method to induce in vivo long-term potentiation (LTP)-like aftereffects. However, the exact impact of rTMS-induced perturbations on the dynamics of neuronal population activity is not well understood. Here, in two monkeys, we examine changes in the oscillatory activity of the sensorimotor cortex following an intermittent theta burst stimulation (iTBS) protocol. We first probed iTBS modulatory effects by testing the iTBS-induced facilitation of somatosensory evoked potentials (SEP). Then, we examined the frequency information of the electrocorticographic signal, obtained using a custom-made miniaturised multi-electrode array for electrocorticography, after real or sham iTBS. We observed that iTBS induced facilitation of SEPs and influenced spectral components of the signal, in both animals. The latter effect was more prominent in the θ band (4-8 Hz) and the high γ band (55-90 Hz), which were de-potentiated and potentiated, respectively. We additionally found that the multi-electrode array uniformity of the β (13-26 Hz) and high γ bands was also affected by iTBS. Our study suggests that the enhanced cortical excitability promoted by iTBS parallels a dynamic reorganisation of the neural network involved. The effect in the γ band suggests a transient local modulation, possibly at the level of synaptic strength in interneurons. The effect in the θ band suggests the disruption of temporal coordination on larger spatial scales.

  20. News/Press Releases

    Data.gov (United States)

    Office of Personnel Management — A press release, news release, media release, or press statement is a written communication directed at members of the news media for the purpose of announcing programs...

  1. Probability theory a comprehensive course

    CERN Document Server

    Klenke, Achim

    2014-01-01

    This second edition of the popular textbook contains a comprehensive course in modern probability theory. Overall, probabilistic concepts play an increasingly important role in mathematics, physics, biology, financial engineering and computer science. They help us in understanding magnetism, amorphous media, genetic diversity and the perils of random developments at financial markets, and they guide us in constructing more efficient algorithms.   To address these concepts, the title covers a wide variety of topics, many of which are not usually found in introductory textbooks, such as:   • limit theorems for sums of random variables • martingales • percolation • Markov chains and electrical networks • construction of stochastic processes • Poisson point process and infinite divisibility • large deviation principles and statistical physics • Brownian motion • stochastic integral and stochastic differential equations. The theory is developed rigorously and in a self-contained way, with the c...

  2. Excluding joint probabilities from quantum theory

    Science.gov (United States)

    Allahverdyan, Armen E.; Danageozian, Arshag

    2018-03-01

    Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert space with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.

  3. Probability theory and mathematical statistics for engineers

    CERN Document Server

    Pugachev, V S

    1984-01-01

    Probability Theory and Mathematical Statistics for Engineers focuses on the concepts of probability theory and mathematical statistics for finite-dimensional random variables.The publication first underscores the probabilities of events, random variables, and numerical characteristics of random variables. Discussions focus on canonical expansions of random vectors, second-order moments of random vectors, generalization of the density concept, entropy of a distribution, direct evaluation of probabilities, and conditional probabilities. The text then examines projections of random vector

  4. Introduction to probability theory with contemporary applications

    CERN Document Server

    Helms, Lester L

    2010-01-01

    This introduction to probability theory transforms a highly abstract subject into a series of coherent concepts. Its extensive discussions and clear examples, written in plain language, expose students to the rules and methods of probability. Suitable for an introductory probability course, this volume requires abstract and conceptual thinking skills and a background in calculus.Topics include classical probability, set theory, axioms, probability functions, random and independent random variables, expected values, and covariance and correlations. Additional subjects include stochastic process

  5. K-forbidden transition probabilities

    International Nuclear Information System (INIS)

    Saitoh, T.R.; Sletten, G.; Bark, R.A.; Hagemann, G.B.; Herskind, B.; Saitoh-Hashimoto, N.; Tsukuba Univ., Ibaraki

    2000-01-01

    Reduced hindrance factors of K-forbidden transitions are compiled for nuclei with A≈180 where γ-vibrational states are observed. Correlations between these reduced hindrance factors and Coriolis forces, statistical level mixing and γ-softness have been studied. It is demonstrated that the K-forbidden transition probabilities are related to γ-softness. The decay of the high-K bandheads has been studied by means of the two-state mixing, which would be induced by the γ-softness, with the use of a number of K-forbidden transitions compiled in the present work, where high-K bandheads are depopulated by both E2 and ΔI=1 transitions. The validity of the two-state mixing scheme has been examined by using the proposed identity of the B(M1)/B(E2) ratios of transitions depopulating high-K bandheads and levels of low-K bands. A breakdown of the identity might indicate that other levels would mediate transitions between high- and low-K states. (orig.)

  6. Direct probability mapping of contaminants

    International Nuclear Information System (INIS)

    Rautman, C.A.

    1993-01-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration
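
The post-processing step described above reduces to a threshold-and-average over equally likely realizations. A toy version follows; the grid size, threshold, and lognormal random field are illustrative stand-ins for the conditional geostatistical simulations of the record:

```python
import numpy as np

# Post-process a stack of simulated contamination "images" into a map of
# the probability of exceeding an action level at each grid cell.
rng = np.random.default_rng(0)
n_real, ny, nx = 200, 20, 20
# Stand-in for n_real equally likely geostatistical realizations:
realizations = rng.lognormal(mean=3.0, sigma=1.0, size=(n_real, ny, nx))

threshold = 35.0  # assumed action level (e.g. pCi/g of uranium)
# Fraction of realizations exceeding the threshold at each cell:
p_exceed = (realizations > threshold).mean(axis=0)

assert p_exceed.shape == (ny, nx)
assert np.all((p_exceed >= 0) & (p_exceed <= 1))
```

Cells with high `p_exceed` are candidates for remediation; the same stack can also yield expected-magnitude maps by averaging the realizations directly.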

  7. Psychophysics of the probability weighting function

    Science.gov (United States)

    Takahashi, Taiki

    2011-03-01

    A probability weighting function w(p) for an objective probability p in decision under risk plays a pivotal role in Kahneman-Tversky prospect theory. Although recent studies in econophysics and neuroeconomics widely utilized probability weighting functions, the psychophysical foundations of probability weighting functions have been unknown. Notably, the behavioral economist Prelec (1998) [4] axiomatically derived the probability weighting function w(p)=exp(-(-ln p)^α) (0 < α < 1), with w(0)=0, w(1/e)=1/e, and w(1)=1, which has extensively been studied in behavioral neuroeconomics. The present study utilizes psychophysical theory to derive Prelec's probability weighting function from psychophysical laws of perceived waiting time in probabilistic choices. Also, the relations between the parameters in the probability weighting function and the probability discounting function in behavioral psychology are derived. Future directions in the application of the psychophysical theory of the probability weighting function in econophysics and neuroeconomics are discussed.
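
Prelec's one-parameter function is easy to verify numerically; the value α = 0.65 below is a typical fitted value from the behavioral literature, not a parameter from this paper:

```python
import math

# Prelec's (1998) probability weighting function,
#   w(p) = exp(-(-ln p)**alpha),  0 < alpha < 1.
def prelec_w(p, alpha=0.65):
    return math.exp(-((-math.log(p)) ** alpha))

# Fixed point at p = 1/e, for any alpha:
assert abs(prelec_w(1 / math.e) - 1 / math.e) < 1e-12
# Characteristic inverse-S shape:
assert prelec_w(0.01) > 0.01   # small probabilities are overweighted
assert prelec_w(0.99) < 0.99   # large probabilities are underweighted
```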

  8. THE BLACK HOLE FORMATION PROBABILITY

    Energy Technology Data Exchange (ETDEWEB)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D., E-mail: dclausen@tapir.caltech.edu [TAPIR, Walter Burke Institute for Theoretical Physics, California Institute of Technology, Mailcode 350-17, Pasadena, CA 91125 (United States)

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  9. THE BLACK HOLE FORMATION PROBABILITY

    International Nuclear Information System (INIS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment

  10. The Black Hole Formation Probability

    Science.gov (United States)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
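
The probabilistic framework can be illustrated with a toy P_BH(M_ZAMS); the logistic form and its parameters below are invented purely for illustration and are not the constraints derived in the paper:

```python
import math
import random

# Toy probabilistic BH-formation model: instead of a sharp ZAMS-mass
# cutoff, each star collapses to a BH with probability P_BH(M_ZAMS).
def p_bh(m_zams, m_half=22.0, width=3.0):
    """Hypothetical logistic P_BH: 0.5 at m_half solar masses."""
    return 1.0 / (1.0 + math.exp(-(m_zams - m_half) / width))

# Draw a population of massive-star ZAMS masses (uniform, for simplicity)
# and sample the BH/NS outcome of each star.
random.seed(1)
masses = [random.uniform(8.0, 40.0) for _ in range(10000)]
bh_fraction = sum(random.random() < p_bh(m) for m in masses) / len(masses)
print(round(bh_fraction, 2))  # fraction of this toy population forming BHs
```

This is the kind of input a population synthesis code could consume in place of a deterministic mass threshold.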

  11. Foundations of the theory of probability

    CERN Document Server

    Kolmogorov, AN

    2018-01-01

    This famous little book remains a foundational text for the understanding of probability theory, important both to students beginning a serious study of probability and to historians of modern mathematics. 1956 second edition.

  12. The Probability Distribution for a Biased Spinner

    Science.gov (United States)

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
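
The core calculation connects probability to circle geometry: a side's landing probability is its arc angle divided by 360°. The angles and payoff below are illustrative, not the article's figures:

```python
# A spinner is "biased" by giving its sides unequal arc angles.
arcs = {"A": 150, "B": 120, "C": 60, "D": 30}   # degrees, must sum to 360
assert sum(arcs.values()) == 360

probs = {side: arc / 360 for side, arc in arcs.items()}
print(probs["A"])          # 150/360 ~ 0.4167

# Connection to other mathematics: expected winnings if side A pays
# 3 points and the other sides pay nothing.
expected = 3 * probs["A"]
print(round(expected, 3))  # 1.25
```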

  13. Conditional Probability Modulates Visual Search Efficiency

    Directory of Open Access Journals (Sweden)

    Bryan eCort

    2013-10-01

    Full Text Available We investigated the effects of probability on visual search. Previous work has shown that people can utilize spatial and sequential probability information to improve target detection. We hypothesized that performance improvements from probability information would extend to the efficiency of visual search. Our task was a simple visual search in which the target was always present among a field of distractors, and could take one of two colors. The absolute probability of the target being either color was 0.5; however, the conditional probability – the likelihood of a particular color given a particular combination of two cues – varied from 0.1 to 0.9. We found that participants searched more efficiently for high conditional probability targets and less efficiently for low conditional probability targets, but only when they were explicitly informed of the probability relationship between cues and target color.
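
A minimal numeric version of that design (the specific values are assumed for illustration): the target color's marginal probability stays at 0.5 while a cue combination makes one color far more likely.

```python
from fractions import Fraction

F = Fraction
# Two equally likely cue combinations, with assumed conditional
# probabilities of the target being red given each combination.
p_cue = {"c1": F(1, 2), "c2": F(1, 2)}
p_red_given = {"c1": F(9, 10), "c2": F(1, 10)}

# Law of total probability: the absolute P(red) is still 0.5 ...
p_red = sum(p_cue[c] * p_red_given[c] for c in p_cue)
assert p_red == F(1, 2)

# ... while the conditional probability a searcher can exploit is 0.9.
print(p_red_given["c1"])  # 9/10
```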

  14. Analytic Neutrino Oscillation Probabilities in Matter: Revisited

    Energy Technology Data Exchange (ETDEWEB)

    Parke, Stephen J. [Fermilab; Denton, Peter B. [Copenhagen U.; Minakata, Hisakazu [Madrid, IFT

    2018-01-02

    We summarize our recent paper on neutrino oscillation probabilities in matter, explaining the importance, relevance and need for simple, highly accurate approximations to the neutrino oscillation probabilities in matter.
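
For orientation, the standard two-flavor vacuum formula is the baseline that matter-effect approximations refine; the paper itself concerns accurate approximations in matter, so the code below is only that vacuum starting point, with assumed example parameters:

```python
import math

# Two-flavor vacuum oscillation probability,
#   P = sin^2(2 theta) * sin^2(1.267 * dm2 * L / E),
# with dm2 in eV^2, L in km, E in GeV (1.267 carries the unit conversion).
def p_osc(theta, dm2, L_km, E_GeV):
    return math.sin(2 * theta) ** 2 * math.sin(1.267 * dm2 * L_km / E_GeV) ** 2

# Assumed, roughly long-baseline-experiment-like numbers:
p = p_osc(theta=0.15, dm2=2.5e-3, L_km=295.0, E_GeV=0.6)
assert 0.0 <= p <= 1.0
print(round(p, 3))  # 0.087
```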

  15. Void probability scaling in hadron nucleus interactions

    International Nuclear Information System (INIS)

    Ghosh, Dipak; Deb, Argha; Bhattacharyya, Swarnapratim; Ghosh, Jayita; Bandyopadhyay, Prabhat; Das, Rupa; Mukherjee, Sima

    2002-01-01

    Hegyi, while investigating the rapidity gap probability (which measures the chance of finding no particle in the pseudo-rapidity interval Δη), found that the scaling behavior of the rapidity gap probability has a close correspondence with the scaling of the void probability in galaxy correlation studies. The main aim of this paper is to study the scaling behavior of the rapidity gap probability
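
Two textbook baselines make the void probability concrete; the mean multiplicity and clustering parameter below are assumed example values, not fits from the paper:

```python
import math

# The rapidity-gap (void) probability is P(0): the chance of finding no
# particle in a pseudo-rapidity window with mean multiplicity n_mean.
n_mean = 4.0

# Independent (Poisson) emission:
p0_poisson = math.exp(-n_mean)

# Clustered emission (negative binomial with parameter k); clustering
# raises the void probability above the Poisson value.
k = 2.0
p0_nbd = (1.0 + n_mean / k) ** (-k)

assert p0_nbd > p0_poisson
print(round(p0_poisson, 4), round(p0_nbd, 4))  # 0.0183 0.1111
```

Void-scaling studies plot a reduced version of -ln P(0) against a scaling variable and ask whether points for different windows collapse onto one curve.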

  16. Pre-Service Teachers' Conceptions of Probability

    Science.gov (United States)

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  17. Using Playing Cards to Differentiate Probability Interpretations

    Science.gov (United States)

    López Puga, Jorge

    2014-01-01

    The aprioristic (classical, naïve and symmetric) and frequentist interpretations of probability are commonly known. Bayesian or subjective interpretation of probability is receiving increasing attention. This paper describes an activity to help students differentiate between the three types of probability interpretations.

  18. Dependent Human Error Probability Assessment

    International Nuclear Information System (INIS)

    Simic, Z.; Mikulicic, V.; Vukovic, I.

    2006-01-01

    This paper presents an assessment of the dependence between dynamic operator actions modeled in a Nuclear Power Plant (NPP) PRA and an estimate of the associated impact on core damage frequency (CDF). This assessment was done to improve the implementation of HEP dependencies inside the existing PRA. All of the dynamic operator actions modeled in the NPP PRA are included in this assessment. Determining the level of HEP dependence and the associated influence on CDF are the major steps of this assessment. A decision on how to apply the results, i.e., whether permanent HEP model changes should be made, is based on the resulting relative CDF increase. A CDF increase threshold was selected based on the NPP base CDF value and acceptance guidelines from Regulatory Guide 1.174. HEP dependences resulting in a CDF increase of > 5E-07 would be considered potential candidates for specific incorporation into the baseline model. The approach used to judge the level of dependence between operator actions is based on dependency-level categories and conditional probabilities developed in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications, NUREG/CR-1278. To simplify the process, NUREG/CR-1278 identifies five levels of dependence: ZD (zero dependence), LD (low dependence), MD (moderate dependence), HD (high dependence), and CD (complete dependence). NUREG/CR-1278 also identifies several qualitative factors that could be involved in determining the level of dependence. Based on the NUREG/CR-1278 information, the Time, Function, and Spatial attributes were judged to be the most important considerations when determining the level of dependence between operator actions within an accident sequence. These attributes were used to develop qualitative criteria (rules) for judging the level of dependence (CD, HD, MD, LD, ZD) between the operator actions. After the level of dependence between the various HEPs is judged, quantitative values associated with the
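
The five dependence levels map to conditional error probabilities through the standard NUREG/CR-1278 (THERP) equations; the base HEP used below is an assumed example value:

```python
# NUREG/CR-1278 dependence equations: conditional HEP of a second action
# given failure of the first, for each dependence level, where p is the
# second action's independent (basic) HEP.
def conditional_hep(p, level):
    return {
        "ZD": p,                    # zero dependence
        "LD": (1 + 19 * p) / 20,    # low dependence
        "MD": (1 + 6 * p) / 7,      # moderate dependence
        "HD": (1 + p) / 2,          # high dependence
        "CD": 1.0,                  # complete dependence
    }[level]

p = 1e-3   # assumed basic HEP
for level in ("ZD", "LD", "MD", "HD", "CD"):
    print(level, round(conditional_hep(p, level), 4))
```

Note how even "low" dependence floors the conditional HEP near 0.05, which is why the judged level can dominate a sequence's CDF contribution.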

  19. Atmospheric dispersion models of radioactivity releases

    International Nuclear Information System (INIS)

    Oza, R.B.

    2016-01-01

    In view of the rapid industrialization in recent times, atmospheric dispersion models have become indispensable 'tools' to ensure that the effects of releases are well within the acceptable limits set by the regulatory authority. In the case of radioactive releases from a nuclear facility, though negligible in quantity and often not even measurable, it is required to demonstrate compliance of these releases with the regulatory limits by carrying out a radiological impact assessment. During routine operations of a nuclear facility, the releases are so low that the environmental impact is usually assessed with the help of atmospheric dispersion models, as it is difficult to distinguish the negligible contribution of the nuclear facility from the relatively high natural background radiation. Accidental releases from a nuclear facility, though with negligible probability of occurrence, cannot be ruled out. In such cases, atmospheric dispersion models are of great help to emergency planners for deciding the intervention actions to minimize the consequences in the public domain and also for working out strategies for the management of the situation. Under accidental conditions, atmospheric dispersion models are also utilized for the estimation of the probable quantities of radionuclides which might have been released to the atmosphere. Thus, atmospheric dispersion models are an essential tool for a nuclear facility during routine operation as well as under accidental conditions
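
A minimal sketch of the kind of model involved, using the standard Gaussian plume formula for a continuous elevated release over flat terrain with total ground reflection; the release rate, wind speed, and dispersion parameters are assumed example values, not figures from this record:

```python
import math

# Gaussian plume concentration at a downwind receptor.
def plume_conc(Q, u, sigma_y, sigma_z, H, y=0.0, z=0.0):
    """Air concentration (Bq/m^3) for release rate Q (Bq/s), wind speed u
    (m/s), plume spreads sigma_y, sigma_z (m), and stack height H (m),
    at crosswind offset y and height z (m)."""
    lateral = math.exp(-y**2 / (2 * sigma_y**2))
    vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2))
                + math.exp(-(z + H)**2 / (2 * sigma_z**2)))  # ground reflection
    return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

# Assumed example: 1 MBq/s release, 3 m/s wind, spreads for some downwind
# distance and stability class, 60 m stack; ground-level centerline value.
c = plume_conc(Q=1.0e6, u=3.0, sigma_y=80.0, sigma_z=40.0, H=60.0)
assert c > 0
print(f"{c:.3e}")
```

In practice the spreads sigma_y and sigma_z are read from stability-class correlations as functions of downwind distance, and the concentration is multiplied by dose factors for the impact assessment.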

  20. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2014-01-01

    The long-awaited revision of Fundamentals of Applied Probability and Random Processes expands on the central components that made the first edition a classic. The title is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability t

  1. Probability of Failure in Random Vibration

    DEFF Research Database (Denmark)

    Nielsen, Søren R.K.; Sørensen, John Dalsgaard

    1988-01-01

    Close approximations to the first-passage probability of failure in random vibration can be obtained by integral equation methods. A simple relation exists between the first-passage probability density function and the distribution function for the time interval spent below a barrier before out-crossing. An integral equation for the probability density function of the time interval is formulated, and adequate approximations for the kernel are suggested. The kernel approximation results in approximate solutions for the probability density function of the time interval and thus for the first-passage probability...
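
A brute-force Monte Carlo estimate gives a reference point for such approximations; the white-noise-driven oscillator, barrier level, and parameters below are illustrative, and the paper's integral-equation method is an analytical alternative to exactly this kind of simulation:

```python
import math
import random

# Monte Carlo first-passage probability: the chance that the response of a
# randomly excited linear oscillator up-crosses barrier b within time T.
random.seed(0)

def first_passage(b=0.3, T=10.0, dt=0.01, n_paths=500):
    failures = 0
    omega, zeta = 2 * math.pi, 0.05   # natural frequency (rad/s), damping
    for _ in range(n_paths):
        x = v = 0.0
        for _ in range(round(T / dt)):
            # Euler-Maruyama step for a white-noise-driven SDOF oscillator
            a = (-2 * zeta * omega * v - omega**2 * x
                 + random.gauss(0, 1) / math.sqrt(dt))
            v += a * dt
            x += v * dt
            if x > b:            # first up-crossing of the barrier
                failures += 1
                break
    return failures / n_paths

p_f = first_passage()
print(round(p_f, 3))
```

The integral-equation approach aims to reproduce this probability (and the underlying interval density) without the sampling noise and cost of the simulation.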

  2. An Objective Theory of Probability (Routledge Revivals)

    CERN Document Server

    Gillies, Donald

    2012-01-01

    This reissue of D. A. Gillies highly influential work, first published in 1973, is a philosophical theory of probability which seeks to develop von Mises' views on the subject. In agreement with von Mises, the author regards probability theory as a mathematical science like mechanics or electrodynamics, and probability as an objective, measurable concept like force, mass or charge. On the other hand, Dr Gillies rejects von Mises' definition of probability in terms of limiting frequency and claims that probability should be taken as a primitive or undefined term in accordance with modern axioma

  3. Paraconsistent Probabilities: Consistency, Contradictions and Bayes’ Theorem

    Directory of Open Access Journals (Sweden)

    Juliana Bueno-Soler

    2016-09-01

    Full Text Available This paper represents the first steps towards constructing a paraconsistent theory of probability based on the Logics of Formal Inconsistency (LFIs. We show that LFIs encode very naturally an extension of the notion of probability able to express sophisticated probabilistic reasoning under contradictions employing appropriate notions of conditional probability and paraconsistent updating, via a version of Bayes’ theorem for conditionalization. We argue that the dissimilarity between the notions of inconsistency and contradiction, one of the pillars of LFIs, plays a central role in our extended notion of probability. Some critical historical and conceptual points about probability theory are also reviewed.
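
For reference, the classical (consistent) Bayes conditionalization that the paraconsistent version generalizes, with assumed example numbers (a diagnostic test with 95% sensitivity, 90% specificity, 1% prevalence):

```python
from fractions import Fraction

F = Fraction
p_d = F(1, 100)        # prior P(disease)
p_pos_d = F(95, 100)   # P(+ | disease)
p_pos_nd = F(10, 100)  # P(+ | no disease), i.e. the false-positive rate

# Total probability of a positive result:
p_pos = p_pos_d * p_d + p_pos_nd * (1 - p_d)

# Bayes' theorem: P(disease | +) = P(+ | disease) P(disease) / P(+).
posterior = p_pos_d * p_d / p_pos
print(posterior)  # 19/217, about 0.088
```

The paraconsistent setting replaces this single, consistent probability measure with one that can absorb contradictory evidence without trivializing the update.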

  4. Nuclear energy release from fragmentation

    Energy Technology Data Exchange (ETDEWEB)

    Li, Cheng [The Key Laboratory of Beam Technology and Material Modification of Ministry of Education, College of Nuclear Science and Technology, Beijing Normal University, Beijing 100875 (China); Beijing Radiation Center, Beijing 100875 (China); Souza, S.R. [Instituto de Física, Universidade Federal do Rio de Janeiro Cidade Universitária, Caixa Postal 68528, 21945-970 Rio de Janeiro (Brazil); Tsang, M.B. [The Key Laboratory of Beam Technology and Material Modification of Ministry of Education, College of Nuclear Science and Technology, Beijing Normal University, Beijing 100875 (China); Beijing Radiation Center, Beijing 100875 (China); National Superconducting Cyclotron Laboratory and Physics and Astronomy Department, Michigan State University, East Lansing, MI 48824 (United States); Zhang, Feng-Shou, E-mail: fszhang@bnu.edu.cn [The Key Laboratory of Beam Technology and Material Modification of Ministry of Education, College of Nuclear Science and Technology, Beijing Normal University, Beijing 100875 (China); Beijing Radiation Center, Beijing 100875 (China); Center of Theoretical Nuclear Physics, National Laboratory of Heavy Ion Accelerator of Lanzhou, Lanzhou 730000 (China)

    2016-08-15

    It is well known that binary fission occurs with positive energy gain. In this article we examine the energetics of splitting uranium and thorium isotopes into various numbers of fragments (from two to eight) with nearly equal size. We find that the energy released by splitting {sup 230,232}Th and {sup 235,238}U into three equal size fragments is largest. The statistical multifragmentation model (SMM) is applied to calculate the probability of different breakup channels for excited nuclei. By weighting the probability distributions of fragment multiplicity at different excitation energies, we find the peaks of energy release for {sup 230,232}Th and {sup 235,238}U are around 0.7–0.75 MeV/u at excitation energy between 1.2 and 2 MeV/u in the primary breakup process. Taking into account the secondary de-excitation processes of primary fragments with the GEMINI code, these energy peaks fall to about 0.45 MeV/u.

  5. Toxics Release Inventory (TRI)

    Data.gov (United States)

    U.S. Environmental Protection Agency — The Toxics Release Inventory (TRI) is a dataset compiled by the U.S. Environmental Protection Agency (EPA). It contains information on the release and waste...

  6. Probability concepts in quality risk management.

    Science.gov (United States)

    Claycamp, H Gregg

    2012-01-01

    Essentially any concept of risk is built on fundamental concepts of chance, likelihood, or probability. Although risk is generally a probability of loss of something of value, given that a risk-generating event will occur or has occurred, it is ironic that the quality risk management literature and guidelines on quality risk management tools are relatively silent on the meaning and uses of "probability." The probability concept is typically applied by risk managers as a combination of frequency-based calculation and a "degree of belief" meaning of probability. Probability as a concept that is crucial for understanding and managing risk is discussed through examples from the most general, scenario-defining and ranking tools that use probability implicitly to more specific probabilistic tools in risk management. A rich history of probability in risk management applied to other fields suggests that high-quality risk management decisions benefit from the implementation of more thoughtful probability concepts in both risk modeling and risk management. Pharmaceutical manufacturers are expanding their use of quality risk management to identify and manage risks to the patient that might occur in phases of the pharmaceutical life cycle from drug development to manufacture, marketing to product discontinuation.

  7. Transition probability spaces in loop quantum gravity

    Science.gov (United States)

    Guo, Xiao-Kan

    2018-03-01

    We study the (generalized) transition probability spaces, in the sense of Mielnik and Cantoni, for spacetime quantum states in loop quantum gravity. First, we show that loop quantum gravity admits the structures of transition probability spaces. This is exemplified by first checking such structures in covariant quantum mechanics and then identifying the transition probability spaces in spin foam models via a simplified version of general boundary formulation. The transition probability space thus defined gives a simple way to reconstruct the discrete analog of the Hilbert space of the canonical theory and the relevant quantum logical structures. Second, we show that the transition probability space and in particular the spin foam model are 2-categories. Then we discuss how to realize in spin foam models two proposals by Crane about the mathematical structures of quantum gravity, namely, the quantum topos and causal sites. We conclude that transition probability spaces provide us with an alternative framework to understand various foundational questions of loop quantum gravity.

  8. Towards a Categorical Account of Conditional Probability

    Directory of Open Access Journals (Sweden)

    Robert Furber

    2015-11-01

    Full Text Available This paper presents a categorical account of conditional probability, covering both the classical and the quantum case. Classical conditional probabilities are expressed as a certain "triangle-fill-in" condition, connecting marginal and joint probabilities, in the Kleisli category of the distribution monad. The conditional probabilities are induced by a map together with a predicate (the condition). The latter is a predicate in the logic of effect modules on this Kleisli category. This same approach can be transferred to the category of C*-algebras (with positive unital maps), whose predicate logic is also expressed in terms of effect modules. Conditional probabilities can again be expressed via a triangle-fill-in property. In the literature, there are several proposals for what quantum conditional probability should be, and also there are extra difficulties not present in the classical case. At this stage, we only describe quantum systems with classical parametrization.

  9. UT Biomedical Informatics Lab (BMIL) probability wheel

    Science.gov (United States)

    Huang, Sheng-Cheng; Lee, Sara; Wang, Allen; Cantor, Scott B.; Sun, Clement; Fan, Kaili; Reece, Gregory P.; Kim, Min Soon; Markey, Mia K.

    A probability wheel app is intended to facilitate communication between two people, an "investigator" and a "participant", about uncertainties inherent in decision-making. Traditionally, a probability wheel is a mechanical prop with two colored slices. A user adjusts the sizes of the slices to indicate the relative value of the probabilities assigned to them. A probability wheel can improve the adjustment process and attenuate the effect of anchoring bias when it is used to estimate or communicate probabilities of outcomes. The goal of this work was to develop a mobile application of the probability wheel that is portable, easily available, and more versatile. We provide a motivating example from medical decision-making, but the tool is widely applicable for researchers in the decision sciences.
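The adjustment the app digitizes is simple: a probability p for the event of interest maps to a slice of 360·p degrees, and its complement to the rest of the wheel. A minimal sketch of that mapping (not the actual app code):

```python
def wheel_slices(p):
    """Convert a probability p for the event of interest into the two
    slice angles (in degrees) of a two-slice probability wheel."""
    if not 0.0 <= p <= 1.0:
        raise ValueError("probability must lie in [0, 1]")
    return (360.0 * p, 360.0 * (1.0 - p))
```

The point of the physical or on-screen wheel is that the participant adjusts the slice visually and reads the implied probability off afterwards, rather than anchoring on a stated number.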

  10. A probability space for quantum models

    Science.gov (United States)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
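The maximum entropy assignment under average constraints can be illustrated with the classic constrained-die example: fixing the mean of a die roll and maximizing entropy yields a Gibbs-form distribution, with the Lagrange multiplier found numerically. A sketch under that assumption (the function name and bisection bounds are mine):

```python
import math

def maxent_die(target_mean, outcomes=range(1, 7), tol=1e-10):
    """Maximum-entropy distribution over `outcomes` subject to a fixed
    mean: the Gibbs form p_k proportional to exp(lam * k), with lam
    found by bisection (the classic Brandeis dice problem)."""
    def mean(lam):
        w = [math.exp(lam * k) for k in outcomes]
        z = sum(w)
        return sum(k * wk for k, wk in zip(outcomes, w)) / z

    lo, hi = -50.0, 50.0          # mean(lam) is increasing in lam
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if mean(mid) < target_mean:
            lo = mid
        else:
            hi = mid
    lam = 0.5 * (lo + hi)
    w = [math.exp(lam * k) for k in outcomes]
    z = sum(w)
    return [wk / z for wk in w]
```

With the unconstrained mean 3.5 the multiplier is zero and the uniform distribution is recovered; any other mean tilts the distribution exponentially.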

  11. Fundamentals of applied probability and random processes

    CERN Document Server

    Ibe, Oliver

    2005-01-01

    This book is based on the premise that engineers use probability as a modeling tool, and that probability can be applied to the solution of engineering problems. Engineers and students studying probability and random processes also need to analyze data, and thus need some knowledge of statistics. This book is designed to provide students with a thorough grounding in probability and stochastic processes, demonstrate their applicability to real-world problems, and introduce the basics of statistics. The book's clear writing style and homework problems make it ideal for the classroom or for self-study.* Good and solid introduction to probability theory and stochastic processes * Logically organized; writing is presented in a clear manner * Choice of topics is comprehensive within the area of probability * Ample homework problems are organized into chapter sections

  12. Striatal activity is modulated by target probability.

    Science.gov (United States)

    Hon, Nicholas

    2017-06-14

    Target probability has well-known neural effects. In the brain, target probability is known to affect frontal activity, with lower probability targets producing more prefrontal activation than those that occur with higher probability. Although the effect of target probability on cortical activity is well specified, its effect on subcortical structures such as the striatum is less well understood. Here, I examined this issue and found that the striatum was highly responsive to target probability. This is consistent with its hypothesized role in the gating of salient information into higher-order task representations. The current data are interpreted in light of the fact that different components of the striatum are sensitive to different types of task-relevant information.

  13. Defining Probability in Sex Offender Risk Assessment.

    Science.gov (United States)

    Elwood, Richard W

    2016-12-01

    There is ongoing debate and confusion over using actuarial scales to predict individuals' risk of sexual recidivism. Much of the debate comes from not distinguishing Frequentist from Bayesian definitions of probability. Much of the confusion comes from applying Frequentist probability to individuals' risk. By definition, only Bayesian probability can be applied to the single case. The Bayesian concept of probability resolves most of the confusion and much of the debate in sex offender risk assessment. Although Bayesian probability is well accepted in risk assessment generally, it has not been widely used to assess the risk of sex offenders. I review the two concepts of probability and show how the Bayesian view alone provides a coherent scheme to conceptualize individuals' risk of sexual recidivism.

  14. Spatial probability aids visual stimulus discrimination

    Directory of Open Access Journals (Sweden)

    Michael Druker

    2010-08-01

    Full Text Available We investigated whether the statistical predictability of a target's location would influence how quickly and accurately it was classified. Recent results have suggested that spatial probability can be a cue for the allocation of attention in visual search. One explanation for probability cuing is spatial repetition priming. In our two experiments we used probability distributions that were continuous across the display rather than relying on a few arbitrary screen locations. This produced fewer spatial repeats and allowed us to dissociate the effect of a high probability location from that of short-term spatial repetition. The task required participants to quickly judge the color of a single dot presented on a computer screen. In Experiment 1, targets were more probable in an off-center hotspot of high probability that gradually declined to a background rate. Targets garnered faster responses if they were near earlier target locations (priming) and if they were near the high probability hotspot (probability cuing). In Experiment 2, target locations were chosen on three concentric circles around fixation. One circle contained 80% of targets. The value of this ring distribution is that it allowed for a spatially restricted high probability zone in which sequentially repeated trials were not likely to be physically close. Participant performance was sensitive to the high-probability circle in addition to the expected effects of eccentricity and the distance to recent targets. These two experiments suggest that inhomogeneities in spatial probability can be learned and used by participants on-line and without prompting as an aid for visual stimulus discrimination, and that spatial repetition priming is not a sufficient explanation for this effect. Future models of attention should consider explicitly incorporating the probabilities of target locations and features.

  15. Is probability of frequency too narrow?

    International Nuclear Information System (INIS)

    Martz, H.F.

    1993-01-01

    Modern methods of statistical data analysis, such as empirical and hierarchical Bayesian methods, should find increasing use in future Probabilistic Risk Assessment (PRA) applications. In addition, there will be a more formalized use of expert judgment in future PRAs. These methods require an extension of the probabilistic framework of PRA, in particular, the popular notion of probability of frequency, to consideration of frequency of frequency, frequency of probability, and probability of probability. The genesis, interpretation, and examples of these three extended notions are discussed

  16. Pacific Northwest geomorphology and hydrology: rates and probabilities of selected processes and events

    International Nuclear Information System (INIS)

    Tubbs, D.W.

    1979-01-01

    This report presents results of one of the geomorphological and hydrological studies that have been conducted for the release scenario analysis of the Waste Isolation Safety Assessment Program (WISAP). Three general topics are considered: (1) determination of rates of denudation, (2) estimation of the probability of flooding due to each of several causes, and (3) evaluation of other surface processes that should be considered in the release scenario analysis. The third general topic was ultimately narrowed to the possible effects of landsliding. Rates of erosion are expressed as centimeters per 100 years, except that the original units are retained in figures taken from other sources. Probabilities are also expressed per 100 years

  17. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents are developed. This implies that probabilities as well as inherent consequences can be analysed and assessed. The present paper outlines a method for evaluation of the probability of ship...

  18. Probability of Grounding and Collision Events

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup

    1996-01-01

    To quantify the risks involved in ship traffic, rational criteria for collision and grounding accidents have to be developed. This implies that probabilities as well as inherent consequences have to be analyzed and assessed. The present notes outline a method for evaluation of the probability...

  19. Introducing Disjoint and Independent Events in Probability.

    Science.gov (United States)

    Kelly, I. W.; Zwiers, F. W.

    Two central concepts in probability theory are those of independence and mutually exclusive events. This document is intended to provide suggestions to teachers that can be used to equip students with an intuitive, comprehensive understanding of these basic concepts in probability. The first section of the paper delineates mutually exclusive and…

  20. Selected papers on probability and statistics

    CERN Document Server

    2009-01-01

    This volume contains translations of papers that originally appeared in the Japanese journal Sūgaku. The papers range over a variety of topics in probability theory, statistics, and applications. This volume is suitable for graduate students and research mathematicians interested in probability and statistics.

  1. Collective probabilities algorithm for surface hopping calculations

    International Nuclear Information System (INIS)

    Bastida, Adolfo; Cruz, Carlos; Zuniga, Jose; Requena, Alberto

    2003-01-01

    General equations are derived that the transition probabilities of hopping algorithms in surface hopping calculations must obey to ensure equality between the average quantum and classical populations. These equations are solved for two particular cases. In the first, it is assumed that the probabilities are the same for all trajectories and that the number of hops is kept to a minimum. These assumptions specify the collective probabilities (CP) algorithm, for which the transition probabilities depend on the average populations over all trajectories. In the second case, the probabilities for each trajectory are assumed to be completely independent of the results from the other trajectories. There is then a unique solution of the general equations ensuring that the transition probabilities are equal to the quantum population of the target state, which is referred to as the independent probabilities (IP) algorithm. The fewest switches (FS) algorithm developed by Tully is accordingly understood as an approximate hopping algorithm that takes elements from the accurate CP and IP solutions. A numerical test of all these hopping algorithms is carried out for a one-dimensional two-state problem with two avoided crossings, which shows the accuracy and computational efficiency of the proposed collective probabilities algorithm, the limitations of the FS algorithm and the similarity between the results offered by the IP algorithm and those obtained with the Ehrenfest method

  2. Examples of Neutrosophic Probability in Physics

    Directory of Open Access Journals (Sweden)

    Fu Yuhua

    2015-01-01

    Full Text Available This paper re-discusses the problems of the so-called “law of nonconservation of parity” and “accelerating expansion of the universe”, and presents the examples of determining Neutrosophic Probability of the experiment of Chien-Shiung Wu et al in 1957, and determining Neutrosophic Probability of accelerating expansion of the partial universe.

  3. Eliciting Subjective Probabilities with Binary Lotteries

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    objective probabilities. Drawing a sample from the same subject population, we find evidence that the binary lottery procedure induces linear utility in a subjective probability elicitation task using the Quadratic Scoring Rule. We also show that the binary lottery procedure can induce direct revelation...

  4. Probability Issues in without Replacement Sampling

    Science.gov (United States)

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
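The conditional-probability calculations the article reviews follow the chain rule: each successive draw conditions on the shrinking population. A minimal sketch using exact rational arithmetic (the card-deck numbers in the usage are illustrative):

```python
from fractions import Fraction

def prob_all_successes(successes, total, draws):
    """P(all `draws` items are 'successes') when sampling without
    replacement, via the chain rule of conditional probability:
    each factor conditions on the previous draws having succeeded."""
    p = Fraction(1)
    for i in range(draws):
        p *= Fraction(successes - i, total - i)
    return p
```

For example, the probability that two cards drawn from a standard deck are both red is (26/52) * (25/51) = 25/102, where the second factor is the conditional probability given the first draw.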

  5. Some open problems in noncommutative probability

    International Nuclear Information System (INIS)

    Kruszynski, P.

    1981-01-01

    A generalization of probability measures to non-Boolean structures is discussed. The starting point of the theory is the Gleason theorem about the form of measures on closed subspaces of a Hilbert space. The problems are formulated in terms of probability on lattices of projections in arbitrary von Neumann algebras. (Auth.)

  6. Probability: A Matter of Life and Death

    Science.gov (United States)

    Hassani, Mehdi; Kippen, Rebecca; Mills, Terence

    2016-01-01

    Life tables are mathematical tables that document probabilities of dying and life expectancies at different ages in a society. Thus, the life table contains some essential features of the health of a population. Probability is often regarded as a difficult branch of mathematics. Life tables provide an interesting approach to introducing concepts…
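The life-table probabilities the record refers to can be computed directly from one-year death probabilities q_x. A minimal sketch (the input data in the usage are hypothetical; real tables run to high ages and handle the open-ended final interval):

```python
def life_table(qx):
    """Build the survivorship column l_x (probability of surviving to
    age x, with l_0 = 1) and the curtate life expectancy at birth from
    a list of one-year death probabilities q_x for x = 0, 1, ..."""
    lx = [1.0]
    for q in qx:
        lx.append(lx[-1] * (1.0 - q))
    # curtate expectation of life at birth: sum of l_x for x >= 1
    e0 = sum(lx[1:])
    return lx, e0
```

Each l_x is a product of survival probabilities (1 - q_0)...(1 - q_{x-1}), which is the chain-rule structure that makes life tables a natural classroom entry point to probability.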

  7. Teaching Probability: A Socio-Constructivist Perspective

    Science.gov (United States)

    Sharma, Sashi

    2015-01-01

    There is a considerable and rich literature on students' misconceptions in probability. However, less attention has been paid to the development of students' probabilistic thinking in the classroom. This paper offers a sequence, grounded in socio-constructivist perspective for teaching probability.

  8. Stimulus Probability Effects in Absolute Identification

    Science.gov (United States)

    Kent, Christopher; Lamberts, Koen

    2016-01-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of…

  9. 47 CFR 1.1623 - Probability calculation.

    Science.gov (United States)

    2010-10-01

    ... 47 Telecommunication, Pt. 1 (2010-10-01 edition). FEDERAL COMMUNICATIONS COMMISSION, GENERAL PRACTICE AND PROCEDURE, Random Selection Procedures for Mass Media Services, General Procedures. § 1.1623 Probability calculation. (a) All calculations shall be...

  10. Simulations of Probabilities for Quantum Computing

    Science.gov (United States)

    Zak, M.

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular, a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without utilization of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities and their link to quantum computations are discussed.

  11. Against All Odds: When Logic Meets Probability

    NARCIS (Netherlands)

    van Benthem, J.; Katoen, J.-P.; Langerak, R.; Rensink, A.

    2017-01-01

    This paper is a light walk along interfaces between logic and probability, triggered by a chance encounter with Ed Brinksma. It is not a research paper, or a literature survey, but a pointer to issues. I discuss both direct combinations of logic and probability and structured ways in which logic can

  12. An introduction to probability and stochastic processes

    CERN Document Server

    Melsa, James L

    2013-01-01

    Geared toward college seniors and first-year graduate students, this text is designed for a one-semester course in probability and stochastic processes. Topics covered in detail include probability theory, random variables and their functions, stochastic processes, linear system response to stochastic processes, Gaussian and Markov processes, and stochastic differential equations. 1973 edition.

  13. The probability of the false vacuum decay

    International Nuclear Information System (INIS)

    Kiselev, V.; Selivanov, K.

    1983-01-01

    A closed expression for the probability of false vacuum decay in (1+1) dimensions is given. The probability of false vacuum decay is expressed as the product of an exponential quasiclassical factor and a functional determinant of the given form. A method for calculation of this determinant is developed and a complete answer for (1+1) dimensions is given

  14. Probability elements of the mathematical theory

    CERN Document Server

    Heathcote, C R

    2000-01-01

    Designed for students studying mathematical statistics and probability after completing a course in calculus and real variables, this text deals with basic notions of probability spaces, random variables, distribution functions and generating functions, as well as joint distributions and the convergence properties of sequences of random variables. Includes worked examples and over 250 exercises with solutions.

  15. The transition probabilities of the reciprocity model

    NARCIS (Netherlands)

    Snijders, T.A.B.

    1999-01-01

    The reciprocity model is a continuous-time Markov chain model used for modeling longitudinal network data. A new explicit expression is derived for its transition probability matrix. This expression can be checked relatively easily. Some properties of the transition probabilities are given, as well
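The record derives an explicit transition probability matrix for a continuous-time Markov chain. For the simplest two-state chain, the matrix has a well-known closed form that shows what such an expression looks like; this is the generic two-state result, not the reciprocity model's own matrix:

```python
import math

def two_state_ctmc(alpha, beta, t):
    """Transition probability matrix P(t) for a two-state
    continuous-time Markov chain with rate alpha for 0 -> 1 and
    beta for 1 -> 0, from solving the Kolmogorov forward equations."""
    s = alpha + beta
    e = math.exp(-s * t)
    pi0, pi1 = beta / s, alpha / s   # stationary distribution
    return [[pi0 + pi1 * e, pi1 * (1 - e)],
            [pi0 * (1 - e), pi1 + pi0 * e]]
```

At t = 0 the matrix is the identity, and as t grows each row converges to the stationary distribution, two properties that any correct transition-probability expression can be checked against.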

  16. Probability numeracy and health insurance purchase

    NARCIS (Netherlands)

    Dillingh, Rik; Kooreman, Peter; Potters, Jan

    2016-01-01

    This paper provides new field evidence on the role of probability numeracy in health insurance purchase. Our regression results, based on rich survey panel data, indicate that the expenditure on two out of three measures of health insurance first rises with probability numeracy and then falls again.

  17. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes is possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  18. Optimizing Probability of Detection Point Estimate Demonstration

    Science.gov (United States)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. Real flaws such as cracks and crack-like flaws need to be reliably detected by these NDE methods, and a reliably detectable crack size is required for safe-life analysis of fracture critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible.
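The binomial logic behind the point estimate design can be sketched directly: the probability of passing a demonstration is a binomial tail sum, and the common 29-of-29 design demonstrates 90% POD at 95% confidence because a system whose true POD is exactly 0.90 would pass with probability 0.9^29, which is about 0.047, below 0.05. A sketch (function names are mine, not from the handbook):

```python
from math import comb

def prob_pass_demo(pod, n=29, allowed_misses=0):
    """Probability of passing a binomial POD demonstration: detect at
    least n - allowed_misses of n flaws, each flaw detected
    independently with probability `pod`."""
    return sum(comb(n, k) * pod**k * (1 - pod)**(n - k)
               for k in range(n - allowed_misses, n + 1))
```

The optimization the paper describes trades off this passing probability for a good system against the false-call risk, while pushing the demonstration flaw size down.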

  19. Alternative probability theories for cognitive psychology.

    Science.gov (United States)

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling. Copyright © 2013 Cognitive Science Society, Inc.

  20. Evaluation of nuclear power plant component failure probability and core damage probability using simplified PSA model

    International Nuclear Information System (INIS)

    Shimada, Yoshio

    2000-01-01

    It is anticipated that changing the frequency of surveillance tests, preventive maintenance, or parts replacement of safety-related components may change component failure probabilities and, as a result, the core damage probability, and that the size of the change depends on the initiating event frequency and the component type. This study assessed the change in core damage probability using a simplified PSA model capable of calculating core damage probability in a short time, developed by the US NRC to process accident sequence precursors, when the failure probability of various components is varied between 0 and 1, or when Japanese or American initiating event frequency data are used. The analysis showed: (1) The frequency of surveillance tests, preventive maintenance, or parts replacement of motor-driven pumps (high pressure injection pumps, residual heat removal pumps, auxiliary feedwater pumps) should be changed carefully, since the change in core damage probability is large when the base failure probability increases. (2) Core damage probability is insensitive to changes in surveillance test frequency for motor-operated valves and the turbine-driven auxiliary feedwater pump, since the change in core damage probability is small even when their failure probabilities change by about one order of magnitude. (3) The change in core damage probability is small when Japanese failure probability data are applied to the emergency diesel generator, even if the failure probability changes by one order of magnitude from the base value. On the other hand, when American failure probability data are applied, the increase in core damage probability is large when the failure probability increases. Therefore, when Japanese failure probability data are applied, core damage probability is insensitive to changes in surveillance test frequency, etc. (author)
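The sensitivity study described above rests on the standard PSA identity: core damage frequency is a sum over accident sequences of the initiating event frequency times the product of the failure probabilities of the components that must all fail. A toy sketch of that identity, not the NRC model itself (all numbers in the usage are illustrative):

```python
def core_damage_frequency(sequences):
    """Toy PSA: each accident sequence is a pair (initiating-event
    frequency, list of failure probabilities of the components that
    must all fail). Core damage frequency is the sum over sequences
    of frequency times the product of the failure probabilities,
    assuming independent failures."""
    cdf = 0.0
    for freq, failure_probs in sequences:
        p = freq
        for q in failure_probs:
            p *= q
        cdf += p
    return cdf
```

Varying one component's failure probability between 0 and 1, as the study does, scales only the sequences containing that component, which is why the overall sensitivity depends on both the component type and the initiating event frequencies.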

  1. Assessing the clinical probability of pulmonary embolism

    International Nuclear Information System (INIS)

    Miniati, M.; Pistolesi, M.

    2001-01-01

    Clinical assessment is a cornerstone of the recently validated diagnostic strategies for pulmonary embolism (PE). Although the diagnostic yield of individual symptoms, signs, and common laboratory tests is limited, the combination of these variables, either by empirical assessment or by a prediction rule, can be used to express a clinical probability of PE. The latter may serve as pretest probability to predict the probability of PE after further objective testing (posterior or post-test probability). Over the last few years, attempts have been made to develop structured prediction models for PE. In a Canadian multicenter prospective study, the clinical probability of PE was rated as low, intermediate, or high according to a model which included assessment of presenting symptoms and signs, risk factors, and presence or absence of an alternative diagnosis at least as likely as PE. Recently, a simple clinical score was developed to stratify outpatients with suspected PE into groups with low, intermediate, or high clinical probability. Logistic regression was used to identify parameters associated with PE. A score ≤ 4 identified patients with low probability, of whom 10% had PE. The prevalence of PE in patients with intermediate (score 5-8) and high probability (score ≥ 9) was 38% and 81%, respectively. As opposed to the Canadian model, this clinical score is standardized. The predictor variables identified in the model, however, were derived from a database of emergency ward patients. This model may, therefore, not be valid in assessing the clinical probability of PE in inpatients. In the PISA-PED study, a clinical diagnostic algorithm was developed which rests on the identification of three relevant clinical symptoms and on their association with electrocardiographic and/or radiographic abnormalities specific for PE. Among patients who, according to the model, had been rated as having a high clinical probability, the prevalence of proven PE was 97%, while it was 3%.
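    The score-based stratification quoted above can be written directly as code (the cut-offs and prevalences are those reported in the abstract; the function name is ours):

```python
# Score stratification for suspected PE, as quoted in the abstract:
# score <= 4 -> low (10% prevalence), 5-8 -> intermediate (38%),
# score >= 9 -> high (81%).

def clinical_probability(score):
    """Map a clinical score to (probability class, reported PE prevalence)."""
    if score <= 4:
        return ("low", 0.10)
    elif score <= 8:
        return ("intermediate", 0.38)
    return ("high", 0.81)

print(clinical_probability(3))
print(clinical_probability(7))
print(clinical_probability(10))
```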

  2. Domestic wells have high probability of pumping septic tank leachate

    Science.gov (United States)

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).
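    The intersection probability has a simple Poisson baseline that a stochastic simulation can check (the disc-shaped source area, the density, and all figures below are hypothetical stand-ins, not the paper's flow-and-transport model):

```python
import math, random

random.seed(42)

def p_overlap(density_per_km2, source_area_km2):
    # Poisson model: drainfields as a spatial Poisson point process,
    # so the probability that at least one falls inside the well's
    # source area is 1 - exp(-density * area).
    return 1.0 - math.exp(-density_per_km2 * source_area_km2)

def p_overlap_mc(density_per_km2, source_area_km2, trials=2000):
    # Monte Carlo check: scatter drainfield points uniformly over a
    # 5 km x 5 km region and test whether any lands inside a
    # disc-shaped source area centred in the region.
    region = 5.0
    n = round(density_per_km2 * region * region)
    r2 = source_area_km2 / math.pi            # squared disc radius
    cx = cy = region / 2.0
    hits = 0
    for _ in range(trials):
        if any((random.uniform(0.0, region) - cx) ** 2 +
               (random.uniform(0.0, region) - cy) ** 2 <= r2
               for _ in range(n)):
            hits += 1
    return hits / trials

# 40 systems per km^2 (small lots) and a 0.02 km^2 source area:
analytic = p_overlap(40, 0.02)
mc = p_overlap_mc(40, 0.02)
print(round(analytic, 3), round(mc, 3))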
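    The intersection probability has a simple Poisson baseline that a stochastic simulation can check (the disc-shaped source area, the density, and all figures below are hypothetical stand-ins, not the paper's flow-and-transport model):

```python
import math, random

random.seed(42)

def p_overlap(density_per_km2, source_area_km2):
    # Poisson model: drainfields as a spatial Poisson point process,
    # so the probability that at least one falls inside the well's
    # source area is 1 - exp(-density * area).
    return 1.0 - math.exp(-density_per_km2 * source_area_km2)

def p_overlap_mc(density_per_km2, source_area_km2, trials=2000):
    # Monte Carlo check: scatter drainfield points uniformly over a
    # 5 km x 5 km region and test whether any lands inside a
    # disc-shaped source area centred in the region.
    region = 5.0
    n = round(density_per_km2 * region * region)
    r2 = source_area_km2 / math.pi            # squared disc radius
    cx = cy = region / 2.0
    hits = 0
    for _ in range(trials):
        if any((random.uniform(0.0, region) - cx) ** 2 +
               (random.uniform(0.0, region) - cy) ** 2 <= r2
               for _ in range(n)):
            hits += 1
    return hits / trials

# 40 systems per km^2 (small lots) and a 0.02 km^2 source area:
analytic = p_overlap(40, 0.02)
mc = p_overlap_mc(40, 0.02)
print(round(analytic, 3), round(mc, 3))
```

    At this density the well has better than even odds of capturing at least one drainfield, illustrating how quickly the overlap probability grows with septic system density.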

  4. Upgrading Probability via Fractions of Events

    Directory of Open Access Journals (Sweden)

    Frič Roman

    2016-08-01

    The influence of "Grundbegriffe" by A. N. Kolmogorov (published in 1933) on education in the area of probability and its impact on research in stochastics cannot be overestimated. We would like to point out three aspects of the classical probability theory "calling for" an upgrade: (i) classical random events are black-and-white (Boolean); (ii) classical random variables do not model quantum phenomena; (iii) basic maps (probability measures and observables, the dual maps to random variables) have very different "mathematical nature". Accordingly, we propose an upgraded probability theory based on Łukasiewicz operations (multivalued logic) on events and elementary category theory, covering the classical probability theory as a special case. The upgrade can be compared to replacing calculations with integers by calculations with rational (and real) numbers. Namely, to avoid the three objections, we embed the classical (Boolean) random events (represented by the {0, 1}-valued indicator functions of sets) into upgraded random events (represented by measurable [0, 1]-valued functions), the minimal domain of probability containing "fractions" of classical random events, and we upgrade the notions of probability measure and random variable.
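    The Łukasiewicz operations that the upgrade is built on can be illustrated concretely (a minimal sketch; the function names and the three-outcome example are ours, not the paper's):

```python
# Lukasiewicz (multivalued-logic) operations on [0, 1]-valued events.
# Classical Boolean events, i.e. {0, 1}-valued indicator functions,
# embed as the special case where every value is 0 or 1.

def luk_or(a, b):   # truncated sum
    return min(1.0, a + b)

def luk_and(a, b):  # Lukasiewicz t-norm (truncated product)
    return max(0.0, a + b - 1.0)

def luk_not(a):
    return 1.0 - a

# On Boolean arguments the operations reduce to classical logic:
assert luk_or(0, 1) == 1 and luk_and(1, 1) == 1 and luk_not(0) == 1

# A "fraction" of a classical event: an event holding to degree 0.5
# at each of three equally likely outcomes.  Probability upgrades to
# the expectation of the [0, 1]-valued function.
half_event = [0.5, 0.5, 0.5]
prob = sum(half_event) / len(half_event)
print(prob)
```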

  5. Failure probability analysis of optical grid

    Science.gov (United States)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, an integrated computing environment based on optical networks, is expected to be an efficient infrastructure for advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems are widely applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid. The failure probability of the entire application can then be quantified, and the effectiveness of different backup strategies in reducing it can be compared, so that the different requirements of different clients can be satisfied. When an application modeled as a DAG (directed acyclic graph) is executed under different backup strategies, both the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability while improving network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.
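    As a toy illustration of how a backup strategy lowers the application failure probability (a chain of independent tasks, which is a special case of a DAG; this is not the paper's MDSA algorithm, and the numbers are invented):

```python
# Toy model: an application is a chain of tasks, each failing
# independently with probability p.  A "backup" replicates a task on
# a second resource, so the task fails only if both replicas fail.

def app_failure_probability(task_fail_probs, backed_up=()):
    p_success = 1.0
    for i, p in enumerate(task_fail_probs):
        p_task_fail = p * p if i in backed_up else p
        p_success *= 1.0 - p_task_fail
    return 1.0 - p_success

tasks = [0.05, 0.10, 0.02]
no_backup = app_failure_probability(tasks)
with_backup = app_failure_probability(tasks, backed_up={1})
print(round(no_backup, 4), round(with_backup, 4))
```

    Backing up only the least reliable task already halves the toy application's failure probability, which is the kind of trade-off (reliability gained per extra resource) a scheduling algorithm must weigh.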

  6. Uncertainty about probability: a decision analysis perspective

    International Nuclear Information System (INIS)

    Howard, R.A.

    1988-01-01

    The issue of how to think about uncertainty about probability is framed and analyzed from the viewpoint of a decision analyst. The failure of nuclear power plants is used as an example. The key idea is to think of probability as describing a state of information on an uncertain event, and to pose the issue of uncertainty in this quantity as uncertainty about a number that would be definitive: it has the property that you would assign it as the probability if you knew it. Logical consistency requires that the probability to assign to a single occurrence in the absence of further information be the mean of the distribution of this definitive number, not the median as is sometimes suggested. Any decision that must be made without the benefit of further information must also be made using the mean of the definitive number's distribution. With this formulation, we find further that the probability of r occurrences in n exchangeable trials will depend on the first n moments of the definitive number's distribution. In making decisions, the expected value of clairvoyance on the occurrence of the event must be at least as great as that on the definitive number. If one of the events in question occurs, then the increase in probability of another such event is readily computed. This means, in terms of coin tossing, that unless one is absolutely sure of the fairness of a coin, seeing a head must increase the probability of heads, contrary to what is usually thought. A numerical example for nuclear power shows that the failure of one plant of a group with a low probability of failure can significantly increase the probability that must be assigned to failure of a second plant in the group
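    The coin-tossing claim is easy to verify numerically with a discrete prior over the "definitive number" (the three-point prior below is invented for illustration):

```python
# The "definitive number" view in miniature: a discrete prior over a
# coin's true head-probability theta.  The probability of heads on one
# toss is the prior mean; after observing a head, the posterior mean
# (and hence the probability of a second head) increases, unless the
# prior is a point mass.

prior = {0.3: 0.25, 0.5: 0.50, 0.7: 0.25}   # illustrative prior

p_head = sum(theta * w for theta, w in prior.items())   # prior mean

# Bayes' rule after observing one head:
posterior = {theta: theta * w / p_head for theta, w in prior.items()}
p_head_after_head = sum(theta * w for theta, w in posterior.items())

print(p_head, p_head_after_head)
```

    Here the prior mean is 0.5, yet after one head the probability of a second head rises to 0.54, exactly the effect described in the abstract.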

  7. Radioactive releases into the environment under accidental conditions

    International Nuclear Information System (INIS)

    Beninson, D.

    1976-01-01

    Although accidents involving the release of radioactive materials and the unplanned exposure of people can occur at any stage of the nuclear fuel cycle, most attention has been focused on reactor accidents. Although no power reactor accidents involving exposure of the public have yet occurred, it should be recognized that the probability of such accidental releases cannot be reduced to zero. Since the inventory of radioactive materials in power reactors is very large, it is usual to postulate, for safety assessments, that a release of fission products takes place in spite of all protective measures. This postulated release is of importance for reactor siting and for preparing emergency plans. (HP) [de

  8. Probability an introduction with statistical applications

    CERN Document Server

    Kinney, John J

    2014-01-01

    Praise for the First Edition: "This is a well-written and impressively presented introduction to probability and statistics. The text throughout is highly readable, and the author makes liberal use of graphs and diagrams to clarify the theory." - The Statistician. Thoroughly updated, Probability: An Introduction with Statistical Applications, Second Edition features a comprehensive exploration of statistical data analysis as an application of probability. The new edition provides an introduction to statistics with accessible coverage of reliability, acceptance sampling, confidence intervals, h

  9. Dependency models and probability of joint events

    International Nuclear Information System (INIS)

    Oerjasaeter, O.

    1982-08-01

    Probabilistic dependencies between components/systems are discussed with reference to a broad classification of potential failure mechanisms. Further, a generalized time-dependency model, based on conditional probabilities for estimation of the probability of joint events and event sequences is described. The applicability of this model is clarified/demonstrated by various examples. It is concluded that the described model of dependency is a useful tool for solving a variety of practical problems concerning the probability of joint events and event sequences where common cause and time-dependent failure mechanisms are involved. (Auth.)

  10. Handbook of probability theory and applications

    CERN Document Server

    Rudas, Tamas

    2008-01-01

    "This is a valuable reference guide for readers interested in gaining a basic understanding of probability theory or its applications in problem solving in the other disciplines." - CHOICE. Providing cutting-edge perspectives and real-world insights into the greater utility of probability and its applications, the Handbook of Probability offers an equal balance of theory and direct applications in a non-technical, yet comprehensive, format. Editor Tamás Rudas and the internationally-known contributors present the material in a manner so that researchers of vari

  11. Probabilities on Streams and Reflexive Games

    Directory of Open Access Journals (Sweden)

    Andrew Schumann

    2014-01-01

    Probability measures on streams (e.g., on hypernumbers and p-adic numbers) have been defined. It was shown that these probabilities can be used for simulations of reflexive games. In particular, it can be proved that Aumann's agreement theorem does not hold for these probabilities. Instead of this theorem, there is a statement that is called the reflexion disagreement theorem. Based on this theorem, probabilistic and knowledge conditions can be defined for reflexive games at various reflexion levels up to the infinite level. (original abstract)

  12. Concept of probability in statistical physics

    CERN Document Server

    Guttmann, Y M

    1999-01-01

    Foundational issues in statistical mechanics and the more general question of how probability is to be understood in the context of physical theories are both areas that have been neglected by philosophers of physics. This book fills an important gap in the literature by providing a most systematic study of how to interpret probabilistic assertions in the context of statistical mechanics. The book explores both subjectivist and objectivist accounts of probability, and takes full measure of work in the foundations of probability theory, in statistical mechanics, and in mathematical theory. It will be of particular interest to philosophers of science, physicists and mathematicians interested in foundational issues, and also to historians of science.

  13. Computation of the Complex Probability Function

    Energy Technology Data Exchange (ETDEWEB)

    Trainer, Amelia Jo [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Ledwith, Patrick John [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2017-08-22

    The complex probability function is important in many areas of physics, and many techniques have been developed in an attempt to compute it for some z quickly and efficiently. Most prominent are the methods that use Gauss-Hermite quadrature, which uses the roots of the nth degree Hermite polynomial and corresponding weights to approximate the complex probability function. This document serves as an overview and discussion of the use, shortcomings, and potential improvements of the Gauss-Hermite quadrature for the complex probability function.
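    A minimal sketch of the quadrature idea, using only the analytic two-point rule (nodes ±1/√2, weights √π/2) and checked against a known value on the imaginary axis; production codes use far higher orders:

```python
import math

# Gauss-Hermite sketch for the complex probability (Faddeeva) function
#   w(z) = (i / pi) * Integral[ exp(-t^2) / (z - t) dt ],  Im(z) > 0,
# using the analytic 2-point rule: nodes +-1/sqrt(2), weights sqrt(pi)/2.

NODES = [-1.0 / math.sqrt(2.0), 1.0 / math.sqrt(2.0)]
WEIGHTS = [math.sqrt(math.pi) / 2.0] * 2

def w_approx(z):
    return (1j / math.pi) * sum(w / (z - t) for w, t in zip(WEIGHTS, NODES))

# Check on the imaginary axis, where w(iy) = exp(y^2) * erfc(y):
exact = math.exp(4.0) * math.erfc(2.0)      # w(2i), approximately 0.2554
approx = w_approx(2j)
print(approx.real, exact)
```

    Even the two-point rule lands within about 2% of the exact value here; the shortcomings discussed in the report appear for z close to the real axis, where the integrand's pole degrades the quadrature.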

  14. Pre-aggregation for Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to analyze complex uncertain multidimensional data (e.g., in order to optimize and personalize location-based services), this paper proposes novel types of probabilistic OLAP queries that operate on aggregate values that are probability distributions, and the techniques to process these queries. The paper also presents the methods for computing the probability distributions, which enables pre-aggregation, and for using the pre-aggregated distributions for further aggregation. In order to achieve good time and space efficiency, the methods perform approximate processing of the multidimensional data analysis that is considered in this paper (i.e., approximate processing of probabilistic OLAP queries over probability distributions).
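    A sketch of the kind of operation such pre-aggregation needs (illustrative, not the paper's algorithms): the SUM aggregate of two independent uncertain measures is the convolution of their distributions, and an approximation step keeps the distributions small:

```python
from collections import defaultdict

# An uncertain measure is a discrete distribution {value: probability}.
# The SUM aggregate of two independent uncertain measures is the
# convolution of their distributions.

def convolve(d1, d2):
    out = defaultdict(float)
    for v1, p1 in d1.items():
        for v2, p2 in d2.items():
            out[v1 + v2] += p1 * p2
    return dict(out)

def approximate(d, threshold=0.05):
    # One way to keep distributions small: drop low-probability
    # outcomes and renormalize (a stand-in for the approximation step).
    kept = {v: p for v, p in d.items() if p >= threshold}
    total = sum(kept.values())
    return {v: p / total for v, p in kept.items()}

sales_a = {10: 0.8, 20: 0.2}   # hypothetical uncertain measures
sales_b = {1: 0.5, 2: 0.5}
agg = convolve(sales_a, sales_b)
small = approximate({1: 0.9, 2: 0.06, 3: 0.04})
print(agg)
print(small)
```

    Convolution makes the support grow multiplicatively as aggregates are combined, which is why approximation is essential for pre-aggregation at scale.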

  15. Comparing linear probability model coefficients across groups

    DEFF Research Database (Denmark)

    Holm, Anders; Ejrnæs, Mette; Karlson, Kristian Bernt

    2015-01-01

    This article offers a formal identification analysis of the problem in comparing coefficients from linear probability models between groups. We show that differences in coefficients from these models can result not only from genuine differences in effects, but also from differences in one or more of the following three components: outcome truncation, scale parameters, and distributional shape of the predictor variable. These results point to limitations in using linear probability model coefficients for group comparisons. We also provide Monte Carlo simulations and real examples to illustrate these limitations, and we suggest a restricted approach to using linear probability model coefficients in group comparisons.

  16. Modeling experiments using quantum and Kolmogorov probability

    International Nuclear Information System (INIS)

    Hess, Karl

    2008-01-01

    Criteria are presented that permit a straightforward partition of experiments into sets that can be modeled using both quantum probability and the classical probability framework of Kolmogorov. These new criteria concentrate on the operational aspects of the experiments and lead beyond the commonly appreciated partition by relating experiments to commuting and non-commuting quantum operators as well as non-entangled and entangled wavefunctions. In other words the space of experiments that can be understood using classical probability is larger than usually assumed. This knowledge provides advantages for areas such as nanoscience and engineering or quantum computation.

  17. The probability outcome correpondence principle : a dispositional view of the interpretation of probability statements

    NARCIS (Netherlands)

    Keren, G.; Teigen, K.H.

    2001-01-01

    This article presents a framework for lay people's internal representations of probabilities, which supposedly reflect the strength of underlying dispositions, or propensities, associated with the predicted event. From this framework, we derive the probability-outcome correspondence principle, which

  18. Tritium release from neutron irradiated beryllium pebbles

    Energy Technology Data Exchange (ETDEWEB)

    Scaffidi-Argentina, F.; Werle, H. [Forschungszentrum Karlsruhe GmbH Technik und Umwelt (Germany). Inst. fuer Neutronenphysik und Reactortechnik

    1998-01-01

    One of the most important open issues related to beryllium for fusion applications refers to the kinetics of the tritium release as a function of neutron fluence and temperature. The EXOTIC-7 as well as the "Beryllium" experiments carried out in the HFR reactor in Petten are considered as the most detailed and significant tests for investigating the beryllium response under neutron irradiation. This paper reviews the present status of beryllium post-irradiation examinations performed at the Forschungszentrum Karlsruhe with samples from the above mentioned irradiation experiments, trying to elucidate the tritium release controlling processes. In agreement with previous studies it has been found that release starts at about 500-550°C and achieves a maximum at about 700-750°C. The observed release at about 500-550°C is probably due to tritium escaping from chemical traps, while the maximum release at about 700-750°C is due to tritium escaping from physical traps. The consequences of a direct contact between beryllium and ceramics during irradiation, causing tritium implanting in a surface layer of beryllium up to a depth of about 40 mm and leading to an additional inventory which is usually several times larger than the neutron-produced one, are also presented and the effects on the tritium release are discussed. (author)

  19. Large scientific releases

    International Nuclear Information System (INIS)

    Pongratz, M.B.

    1981-01-01

    The motivation for active experiments in space is considered, taking into account the use of active techniques to obtain a better understanding of the natural space environment, the utilization of the advantages of space as a laboratory to study fundamental plasma physics, and the employment of active techniques to determine the magnitude, degree, and consequences of artificial modification of the space environment. It is pointed out that mass-injection experiments in space plasmas began about twenty years ago with the Project Firefly releases. Attention is given to mass-release techniques and diagnostics, operational aspects of mass release active experiments, the active observation of mass release experiments, active perturbation mass release experiments, simulating an artificial modification of the space environment, and active experiments to study fundamental plasma physics

  20. Modelling the probability of building fires

    Directory of Open Access Journals (Sweden)

    Vojtěch Barták

    2014-12-01

    Systematic spatial risk analysis plays a crucial role in preventing emergencies. In the Czech Republic, risk mapping is currently based on the risk accumulation principle, area vulnerability, and preparedness levels of Integrated Rescue System components. Expert estimates are used to determine risk levels for individual hazard types, while statistical modelling based on data from actual incidents and their possible causes is not used. Our model study, conducted in cooperation with the Fire Rescue Service of the Czech Republic within the Liberec and Hradec Králové regions, presents an analytical procedure leading to the creation of building fire probability maps based on recent incidents in the studied areas and on building parameters. In order to estimate the probability of building fires, a prediction model based on logistic regression was used. The probability of fire calculated by means of model parameters and attributes of specific buildings can subsequently be visualized in probability maps.
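    A minimal sketch of the modelling step (pure-Python logistic regression on synthetic building attributes; the study's real predictors and incident data are not reproduced here):

```python
import math

# Logistic-regression sketch for building-fire probability.  The two
# standardized features (say, building age and floor area) and the
# labels are synthetic.

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def fit_logistic(X, y, lr=0.5, epochs=2000):
    w = [0.0] * (len(X[0]) + 1)            # intercept + one weight/feature
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi))
            err = sigmoid(z) - yi           # gradient of the log-loss
            w[0] -= lr * err
            for j, xj in enumerate(xi):
                w[j + 1] -= lr * err * xj
    return w

def predict(w, xi):
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# Synthetic data: buildings with larger (standardized) age/area burned.
X = [[-1.0, -0.5], [-0.8, -1.2], [-0.2, 0.1],
     [0.9, 0.7], [1.1, 0.4], [0.6, 1.3]]
y = [0, 0, 0, 1, 1, 1]
w = fit_logistic(X, y)
print(round(predict(w, [1.0, 1.0]), 3), round(predict(w, [-1.0, -1.0]), 3))
```

    The fitted probabilities for individual buildings are exactly what gets rasterized into the probability maps described above.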

  1. Encounter Probability of Individual Wave Height

    DEFF Research Database (Denmark)

    Liu, Z.; Burcharth, H. F.

    1998-01-01

    The design wave height is commonly taken as the significant wave height corresponding to a certain exceedence probability within a structure lifetime (encounter probability), based on the statistical analysis of long-term extreme significant wave height. The design individual wave height is then calculated as the expected maximum individual wave height associated with the design significant wave height, under the assumption that the individual wave heights follow the Rayleigh distribution. However, the exceedence probability of such a design individual wave height within the structure lifetime is unknown. The paper presents a method for the determination of the design individual wave height corresponding to an exceedence probability within the structure lifetime, given the long-term extreme significant wave height. The method can also be applied for estimation of the number of relatively large waves for fatigue analysis of constructions.
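    Under the Rayleigh assumption mentioned above, the key quantities can be computed directly (the significant wave height, wave count, and lifetime below are illustrative, not the paper's data):

```python
import math

# Rayleigh sketch: the exceedence probability of an individual wave
# height h, given significant wave height Hs, is
#     P(H > h) = exp(-2 * (h / Hs)**2),
# and the most probable maximum of N waves follows from N * P = 1.

def exceedence(h, hs):
    return math.exp(-2.0 * (h / hs) ** 2)

def probable_max(n_waves, hs):
    return hs * math.sqrt(math.log(n_waves) / 2.0)

hs = 8.0        # design significant wave height [m], illustrative
n = 1000        # number of waves in the design sea state
h_max = probable_max(n, hs)

# Encounter probability: chance that h_max is exceeded at least once
# in one sea state, and in a lifetime of 50 such sea states.
p_state = 1.0 - (1.0 - exceedence(h_max, hs)) ** n
p_life = 1.0 - (1.0 - p_state) ** 50
print(round(h_max, 2), round(p_state, 3), round(p_life, 3))
```

    The point of the paper shows up clearly here: the "most probable maximum" wave is exceeded with probability near one over the lifetime, so the design wave must be chosen from the lifetime exceedence probability rather than from a single sea state.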

  2. Predicting binary choices from probability phrase meanings.

    Science.gov (United States)

    Wallsten, Thomas S; Jang, Yoonhee

    2008-08-01

    The issues of how individuals decide which of two events is more likely and of how they understand probability phrases both involve judging relative likelihoods. In this study, we investigated whether derived scales representing probability phrase meanings could be used within a choice model to predict independently observed binary choices. If they can, this simultaneously provides support for our model and suggests that the phrase meanings are measured meaningfully. The model assumes that, when deciding which of two events is more likely, judges take a single sample from memory regarding each event and respond accordingly. The model predicts choice probabilities by using the scaled meanings of individually selected probability phrases as proxies for confidence distributions associated with sampling from memory. Predictions are sustained for 34 of 41 participants but, nevertheless, are biased slightly low. Sequential sampling models improve the fit. The results have both theoretical and applied implications.

  3. Certainties and probabilities of the IPCC

    International Nuclear Information System (INIS)

    2004-01-01

    Based on an analysis of information about the climate evolution, simulations of a global warming and the snow coverage monitoring of Meteo-France, the IPCC presented its certainties and probabilities concerning the greenhouse effect. (A.L.B.)

  4. The probability factor in establishing causation

    International Nuclear Information System (INIS)

    Hebert, J.

    1988-01-01

    This paper discusses the possibilities and limitations of methods using the probability factor in establishing the causal link between bodily injury, whether immediate or delayed, and the nuclear incident presumed to have caused it (NEA) [fr

  5. Bayesian optimization for computationally extensive probability distributions.

    Science.gov (United States)

    Tamura, Ryo; Hukushima, Koji

    2018-01-01

    An efficient method for finding a better maximizer of computationally extensive probability distributions is proposed on the basis of a Bayesian optimization technique. A key idea of the proposed method is to use extreme values of acquisition functions by Gaussian processes for the next training phase, which should be located near a local maximum or a global maximum of the probability distribution. Our Bayesian optimization technique is applied to the posterior distribution in effective physical model estimation, which is a computationally extensive probability distribution. Even when the number of sampling points on the posterior distribution is fixed to be small, the Bayesian optimization provides a better maximizer of the posterior distribution than the random search method, the steepest descent method, or the Monte Carlo method. Furthermore, the Bayesian optimization improves the results efficiently when combined with the steepest descent method, and it is thus a powerful tool for searching for a better maximizer of computationally extensive probability distributions.
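    A minimal, self-contained sketch of the approach (the target function, RBF kernel length scale, UCB acquisition, grid, and iteration budget are our illustrative choices, not the paper's settings):

```python
import math

# Bayesian-optimization sketch: locate a maximizer of an expensive
# "posterior-like" function with a Gaussian-process surrogate and an
# upper-confidence-bound (UCB) acquisition.

def target(x):  # stand-in for a computationally extensive posterior
    return math.exp(-(x - 2.0) ** 2) + 0.8 * math.exp(-2.0 * (x + 1.0) ** 2)

def rbf(a, b, length=0.7):
    return math.exp(-((a - b) ** 2) / (2.0 * length ** 2))

def invert(A):
    # Gauss-Jordan inversion with partial pivoting (small matrices).
    n = len(A)
    M = [row[:] + [1.0 if i == j else 0.0 for j in range(n)]
         for i, row in enumerate(A)]
    for c in range(n):
        p = max(range(c, n), key=lambda r: abs(M[r][c]))
        M[c], M[p] = M[p], M[c]
        piv = M[c][c]
        M[c] = [v / piv for v in M[c]]
        for r in range(n):
            if r != c:
                f = M[r][c]
                M[r] = [vr - f * vc for vr, vc in zip(M[r], M[c])]
    return [row[n:] for row in M]

def gp_fit(xs, ys, noise=1e-6):
    K = [[rbf(a, b) + (noise if i == j else 0.0)
          for j, b in enumerate(xs)] for i, a in enumerate(xs)]
    Kinv = invert(K)
    alpha = [sum(kij * yj for kij, yj in zip(row, ys)) for row in Kinv]
    return Kinv, alpha

def gp_predict(xs, Kinv, alpha, xq):
    k = [rbf(x, xq) for x in xs]
    mean = sum(ki * ai for ki, ai in zip(k, alpha))
    kk = [sum(rj * kj for rj, kj in zip(row, k)) for row in Kinv]
    var = max(1e-12, 1.0 - sum(ki * vi for ki, vi in zip(k, kk)))
    return mean, var

xs = [-3.0, 0.0, 3.0]                 # initial design points
ys = [target(x) for x in xs]
grid = [-4.0 + 0.05 * i for i in range(161)]
for _ in range(15):
    Kinv, alpha = gp_fit(xs, ys)
    def ucb(x):
        m, v = gp_predict(xs, Kinv, alpha, x)
        return m + 2.0 * math.sqrt(v)
    x_next = max(grid, key=ucb)       # next, most promising, point
    xs.append(x_next)
    ys.append(target(x_next))

best = xs[ys.index(max(ys))]
print(round(best, 2), round(max(ys), 3))
```

    With only 18 evaluations in total, the acquisition concentrates samples near the high-probability region, which is the advantage over random search when each evaluation is expensive.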

  6. Characteristic length of the knotting probability revisited

    International Nuclear Information System (INIS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-01-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(−N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA. (paper)

  7. Probability of Survival Decision Aid (PSDA)

    National Research Council Canada - National Science Library

    Xu, Xiaojiang; Amin, Mitesh; Santee, William R

    2008-01-01

    A Probability of Survival Decision Aid (PSDA) is developed to predict survival time for hypothermia and dehydration during prolonged exposure at sea in both air and water for a wide range of environmental conditions...

  8. Probability and statistics with integrated software routines

    CERN Document Server

    Deep, Ronald

    2005-01-01

    Probability & Statistics with Integrated Software Routines is a calculus-based treatment of probability concurrent with and integrated with statistics through interactive, tailored software applications designed to enhance the phenomena of probability and statistics. The software programs make the book unique. The book comes with a CD containing the interactive software leading to the Statistical Genie. The student can issue commands repeatedly while making parameter changes to observe the effects. Computer programming is an excellent skill for problem solvers, involving design, prototyping, data gathering, testing, redesign, validating, etc., all wrapped up in the scientific method. See also: CD to accompany Probability and Stats with Integrated Software Routines (0123694698). * Incorporates more than 1,000 engaging problems with answers * Includes more than 300 solved examples * Uses varied problem solving methods

  9. Pre-Aggregation with Probability Distributions

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2006-01-01

    Motivated by the increasing need to analyze complex, uncertain multidimensional data, this paper proposes probabilistic OLAP queries that are computed using probability distributions rather than atomic values. The paper describes how to create probability distributions from base data, and how the distributions can be subsequently used in pre-aggregation. Since the probability distributions can become large, we show how to achieve good time and space efficiency by approximating the distributions. We present the results of several experiments that demonstrate the effectiveness of our methods. The work is motivated with a real-world case study, based on our collaboration with a leading Danish vendor of location-based services. This paper is the first to consider the approximate processing of probabilistic OLAP queries over probability distributions.

  10. Probability of spent fuel transportation accidents

    International Nuclear Information System (INIS)

    McClure, J.D.

    1981-07-01

    The transported volume of spent fuel, incident/accident experience, and accident environment probabilities were reviewed in order to provide an estimate of spent fuel accident probabilities. In particular, the accident review assessed the accident experience for large casks of the type that could transport spent (irradiated) nuclear fuel. This review determined that since 1971, the beginning of official US Department of Transportation record keeping for accidents/incidents, there has been one spent fuel transportation accident. This information, coupled with estimated annual shipping volumes for spent fuel, indicated an estimated probability of a spent fuel transport accident of 5 × 10⁻⁷ spent fuel accidents per mile. This is consistent with ordinary truck accident rates. A comparison of accident environments and regulatory test environments suggests that the probability of truck accidents exceeding the regulatory test conditions for impact is approximately 10⁻⁹/mile.
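    Such per-mile rates translate into campaign-level probabilities under a Poisson model (the annual mileage figure below is hypothetical, not from the report):

```python
import math

# Poisson sketch: a per-mile accident rate implies the probability of
# at least one accident over a shipping campaign.  The rate is the
# abstract's estimate; the annual mileage is hypothetical.

rate_per_mile = 5e-7
miles_per_year = 2e5                       # hypothetical shipment miles

expected = rate_per_mile * miles_per_year  # expected accidents per year
p_at_least_one = 1.0 - math.exp(-expected)
print(expected, round(p_at_least_one, 4))
```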

  11. Sampling, Probability Models and Statistical Reasoning: Statistical Inference

    Indian Academy of Sciences (India)

    Sampling, Probability Models and Statistical Reasoning: Statistical Inference. Mohan Delampady and V R Padmawar. General Article, Resonance – Journal of Science Education, Volume 1, Issue 5, May 1996, pp. 49-58.

  12. Imprecise Probability Methods for Weapons UQ

    Energy Technology Data Exchange (ETDEWEB)

    Picard, Richard Roy [Los Alamos National Lab. (LANL), Los Alamos, NM (United States); Vander Wiel, Scott Alan [Los Alamos National Lab. (LANL), Los Alamos, NM (United States)

    2016-05-13

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporate uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  13. Escape and transmission probabilities in cylindrical geometry

    International Nuclear Information System (INIS)

    Bjerke, M.A.

    1980-01-01

    An improved technique for the generation of escape and transmission probabilities in cylindrical geometry was applied to the existing resonance cross section processing code ROLAIDS. The algorithm of Hwang and Toppel, [ANL-FRA-TM-118] (with modifications) was employed. The probabilities generated were found to be as accurate as those given by the method previously applied in ROLAIDS, while requiring much less computer core storage and CPU time.

  14. Probability and statistics for computer science

    CERN Document Server

    Johnson, James L

    2011-01-01

    Comprehensive and thorough development of both probability and statistics for serious computer scientists; goal-oriented: "to present the mathematical analysis underlying probability results". Special emphases on simulation and discrete decision theory. Mathematically rich but self-contained text, at a gentle pace. Review of calculus and linear algebra in an appendix. Mathematical interludes (in each chapter) examine mathematical techniques in the context of probabilistic or statistical importance. Numerous section exercises, summaries, historical notes, and Further Readings for reinforcement.

  15. Collision Probabilities for Finite Cylinders and Cuboids

    Energy Technology Data Exchange (ETDEWEB)

    Carlvik, I

    1967-05-15

    Analytical formulae have been derived for the collision probabilities of homogeneous finite cylinders and cuboids. The formula for the finite cylinder contains double integrals, and the formula for the cuboid only single integrals. Collision probabilities have been calculated by means of the formulae and compared with values obtained by other authors. It was found that the calculations using the analytical formulae are much quicker and give higher accuracy than Monte Carlo calculations.

  16. High throughput nonparametric probability density estimation.

    Science.gov (United States)

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and over-fitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  17. Genefer: Programs for Finding Large Probable Generalized Fermat Primes

    Directory of Open Access Journals (Sweden)

    Iain Arthur Bethune

    2015-11-01

    Genefer is a suite of programs for performing Probable Primality (PRP) tests of Generalised Fermat numbers b^(2^n)+1 (GFNs) using a Fermat test. Optimised implementations are available for modern CPUs using single instruction, multiple data (SIMD) instructions, as well as for GPUs using CUDA or OpenCL. Genefer has been extensively used by PrimeGrid, a volunteer computing project searching for large prime numbers of various kinds, including GFNs. Genefer's architecture separates high-level logic such as checkpointing and the user interface from the architecture-specific, performance-critical parts of the implementation, which are suitable for re-use. Genefer is released under the MIT license. Source and binaries are available from www.assembla.com/spaces/genefer.

  18. Causal inference, probability theory, and graphical insights.

    Science.gov (United States)

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design. Published 2013. This article is a US Government work and is in the public domain in the USA.
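
    The BK-Plot mentioned above is a graphical device for understanding Simpson's paradox with a binary confounder. The numbers below (the standard textbook kidney-stone-style figures, used purely as an illustration and not drawn from this paper) show the reversal such a plot is designed to expose.

    ```python
    # Toy illustration of Simpson's paradox with a binary confounder (severity).
    # Counts are the classic textbook example: (successes, total) per cell.
    data = {
        ("treated", "mild"):   (81, 87),
        ("treated", "severe"): (192, 263),
        ("control", "mild"):   (234, 270),
        ("control", "severe"): (55, 80),
    }

    def rate(successes, total):
        return successes / total

    # Treated beats control within each stratum...
    for stratum in ("mild", "severe"):
        rt = rate(*data[("treated", stratum)])
        rc = rate(*data[("control", stratum)])
        print(f"{stratum:6s}: treated {rt:.3f} vs control {rc:.3f}")

    # ...yet loses in the aggregate, because severity is unevenly distributed
    # across the two groups (the confounder drives the reversal).
    agg = {}
    for group in ("treated", "control"):
        s = sum(data[(group, st)][0] for st in ("mild", "severe"))
        t = sum(data[(group, st)][1] for st in ("mild", "severe"))
        agg[group] = s / t
    print(f"overall: treated {agg['treated']:.3f} vs control {agg['control']:.3f}")
    ```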

  19. The 2017 Release of Cloudy

    Science.gov (United States)

    Ferland, G. J.; Chatzikos, M.; Guzmán, F.; Lykins, M. L.; van Hoof, P. A. M.; Williams, R. J. R.; Abel, N. P.; Badnell, N. R.; Keenan, F. P.; Porter, R. L.; Stancil, P. C.

    2017-10-01

    We describe the 2017 release of the spectral synthesis code Cloudy, summarizing the many improvements to the scope and accuracy of the physics which have been made since the previous release. Exporting the atomic data into external data files has enabled many new large datasets to be incorporated into the code. The use of the complete datasets is not realistic for most calculations, so we describe the limited subset of data used by default, which predicts significantly more lines than the previous release of Cloudy. This version is nevertheless faster than the previous release, as a result of code optimizations. We give examples of the accuracy limits using small models, and the performance requirements of large complete models. We summarize several advances in the H- and He-like iso-electronic sequences and use our complete collisional-radiative models to establish the densities where the coronal and local thermodynamic equilibrium approximations work.

  20. EIA new releases

    International Nuclear Information System (INIS)

    1994-09-01

    This report is a compilation of news releases from the Energy Information Administration. The September-October report includes articles on energy conservation, energy consumption in commercial buildings, and a short-term energy model for a personal computer.

  1. On the Possibility of Assigning Probabilities to Singular Cases, or: Probability Is Subjective Too!

    Directory of Open Access Journals (Sweden)

    Mark R. Crovelli

    2009-06-01

    Both Ludwig von Mises and Richard von Mises claimed that numerical probability could not be legitimately applied to singular cases. This paper challenges this aspect of the von Mises brothers' theory of probability. It is argued that their denial that numerical probability could be applied to singular cases was based solely upon Richard von Mises' exceptionally restrictive definition of probability. This paper challenges Richard von Mises' definition of probability by arguing that the definition of probability necessarily depends upon whether the world is governed by time-invariant causal laws. It is argued that if the world is governed by time-invariant causal laws, a subjective definition of probability must be adopted. It is further argued that the nature of human action and the relative frequency method for calculating numerical probabilities both presuppose that the world is indeed governed by time-invariant causal laws. It is finally argued that the subjective definition of probability undercuts the von Mises claim that numerical probability cannot legitimately be applied to singular, non-replicable cases.

  2. Sellafield (release of radioactivity)

    Energy Technology Data Exchange (ETDEWEB)

    Cunningham, J; Goodlad, A; Morris, M

    1986-02-06

    A government statement is reported, about the release of plutonium nitrate at the Sellafield site of British Nuclear Fuels plc on 5 February 1986. Matters raised included: details of accident; personnel monitoring; whether radioactive material was released from the site; need for public acceptance of BNFL activities; whether plant should be closed; need to reduce level of radioactive effluent; number of incidents at the plant.

  3. Uncertainty relation and probability. Numerical illustration

    International Nuclear Information System (INIS)

    Fujikawa, Kazuo; Umetsu, Koichiro

    2011-01-01

    The uncertainty relation and the probability interpretation of quantum mechanics are intrinsically connected, as is evidenced by the evaluation of standard deviations. It is thus natural to ask if one can associate a very small uncertainty product of suitably sampled events with a very small probability. We have shown elsewhere that some examples of the evasion of the uncertainty relation noted in the past are in fact understood in this way. We here numerically illustrate that a very small uncertainty product is realized if one performs a suitable sampling of measured data that occur with a very small probability. We introduce a notion of cyclic measurements. It is also shown that our analysis is consistent with the Landau-Pollak-type uncertainty relation. It is suggested that the present analysis may help reconcile the contradicting views about the 'standard quantum limit' in the detection of gravitational waves. (author)

  4. Scoring Rules for Subjective Probability Distributions

    DEFF Research Database (Denmark)

    Harrison, Glenn W.; Martínez-Correa, Jimmy; Swarthout, J. Todd

    The theoretical literature has a rich characterization of scoring rules for eliciting the subjective beliefs that an individual has for continuous events, but under the restrictive assumption of risk neutrality. It is well known that risk aversion can dramatically affect the incentives to correctly report the true subjective probability of a binary event, even under Subjective Expected Utility. To address this, one can "calibrate" inferences about true subjective probabilities from elicited subjective probabilities over binary events, recognizing the incentives that risk averse agents have to distort reports. We characterize the comparable implications of the general case of a risk averse agent when facing a popular scoring rule over continuous events, and find that these concerns do not apply with anything like the same force. For empirically plausible levels of risk aversion, one can…

  5. Comparing coefficients of nested nonlinear probability models

    DEFF Research Database (Denmark)

    Kohler, Ulrich; Karlson, Kristian Bernt; Holm, Anders

    2011-01-01

    In a series of recent articles, Karlson, Holm and Breen have developed a method for comparing the estimated coefficients of two nested nonlinear probability models. This article describes this method and the user-written program khb that implements it. The KHB method is a general decomposition method that is unaffected by the rescaling or attenuation bias that arises in cross-model comparisons in nonlinear models. It recovers the degree to which a control variable, Z, mediates or explains the relationship between X and a latent outcome variable, Y*, underlying the nonlinear probability model.

  6. A basic course in probability theory

    CERN Document Server

    Bhattacharya, Rabi

    2016-01-01

    This text develops the necessary background in probability theory underlying diverse treatments of stochastic processes and their wide-ranging applications. In this second edition, the text has been reorganized for didactic purposes, new exercises have been added and basic theory has been expanded. General Markov dependent sequences and their convergence to equilibrium is the subject of an entirely new chapter. The introduction of conditional expectation and conditional probability very early in the text maintains the pedagogic innovation of the first edition; conditional expectation is illustrated in detail in the context of an expanded treatment of martingales, the Markov property, and the strong Markov property. Weak convergence of probabilities on metric spaces and Brownian motion are two topics to highlight. A selection of large deviation and/or concentration inequalities ranging from those of Chebyshev, Cramer–Chernoff, Bahadur–Rao, to Hoeffding have been added, with illustrative comparisons of their…

  7. Ignition probabilities for Compact Ignition Tokamak designs

    International Nuclear Information System (INIS)

    Stotler, D.P.; Goldston, R.J.

    1989-09-01

    A global power balance code employing Monte Carlo techniques has been developed to study the "probability of ignition" and has been applied to several different configurations of the Compact Ignition Tokamak (CIT). Probability distributions for the critical physics parameters in the code were estimated using existing experimental data. This included a statistical evaluation of the uncertainty in extrapolating the energy confinement time. A substantial probability of ignition is predicted for CIT if peaked density profiles can be achieved or if one of the two higher plasma current configurations is employed. In other cases, values of the energy multiplication factor Q of order 10 are generally obtained. The Ignitor-U and ARIES designs are also examined briefly. Comparisons of our empirically based confinement assumptions with two theory-based transport models yield conflicting results. 41 refs., 11 figs.

  8. Independent events in elementary probability theory

    Science.gov (United States)

    Csenki, Attila

    2011-07-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): if the n events E1, E2, …, En are jointly independent, then any two events A and B built in finitely many steps from two disjoint subsets of E1, E2, …, En are also independent. The operations 'union', 'intersection' and 'complementation' are permitted only when forming the events A and B. Here we examine this statement from the point of view of elementary probability theory. The approach described here is accessible also to users of probability theory and is believed to be novel.

  9. Pointwise probability reinforcements for robust statistical inference.

    Science.gov (United States)

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. Uncertainty the soul of modeling, probability & statistics

    CERN Document Server

    Briggs, William

    2016-01-01

    This book presents a philosophical approach to probability and probabilistic thinking, considering the underpinnings of probabilistic reasoning and modeling, which effectively underlie everything in data science. The ultimate goal is to call into question many standard tenets and lay the philosophical and probabilistic groundwork and infrastructure for statistical modeling. It is the first book devoted to the philosophy of data aimed at working scientists and calls for a new consideration in the practice of probability and statistics to eliminate what has been referred to as the "Cult of Statistical Significance". The book explains the philosophy of these ideas and not the mathematics, though there are a handful of mathematical examples. The topics are logically laid out, starting with basic philosophy as related to probability, statistics, and science, and stepping through the key probabilistic ideas and concepts, and ending with statistical models. Its jargon-free approach asserts that standard methods, such as…

  11. Introduction to probability with statistical applications

    CERN Document Server

    Schay, Géza

    2016-01-01

    Now in its second edition, this textbook serves as an introduction to probability and statistics for non-mathematics majors who do not need the exhaustive detail and mathematical depth provided in more comprehensive treatments of the subject. The presentation covers the mathematical laws of random phenomena, including discrete and continuous random variables, expectation and variance, and common probability distributions such as the binomial, Poisson, and normal distributions. More classical examples such as Montmort's problem, the ballot problem, and Bertrand’s paradox are now included, along with applications such as the Maxwell-Boltzmann and Bose-Einstein distributions in physics. Key features in new edition: * 35 new exercises * Expanded section on the algebra of sets * Expanded chapters on probabilities to include more classical examples * New section on regression * Online instructors' manual containing solutions to all exercises

  12. Python for probability, statistics, and machine learning

    CERN Document Server

    Unpingco, José

    2016-01-01

    This book covers the key ideas that link probability, statistics, and machine learning illustrated using Python modules in these areas. The entire text, including all the figures and numerical results, is reproducible using the Python codes and their associated Jupyter/IPython notebooks, which are provided as supplementary downloads. The author develops key intuitions in machine learning by working meaningful examples using multiple analytical methods and Python codes, thereby connecting theoretical concepts to concrete implementations. Modern Python modules like Pandas, Sympy, and Scikit-learn are applied to simulate and visualize important machine learning concepts like the bias/variance trade-off, cross-validation, and regularization. Many abstract mathematical ideas, such as convergence in probability theory, are developed and illustrated with numerical examples. This book is suitable for anyone with an undergraduate-level exposure to probability, statistics, or machine learning and with rudimentary knowledge…

  13. EARLY HISTORY OF GEOMETRIC PROBABILITY AND STEREOLOGY

    Directory of Open Access Journals (Sweden)

    Magdalena Hykšová

    2012-03-01

    The paper provides an account of the history of geometric probability and stereology from the time of Newton to the early 20th century. It depicts the development along two parallel paths: on the one hand, the theory of geometric probability was formed with minor attention paid to applications other than those concerning spatial chance games. On the other hand, practical rules for the estimation of area or volume fraction and other characteristics, easily deducible from geometric probability theory, were proposed without knowledge of this branch. Special attention is paid to the paper of J.-É. Barbier published in 1860, which contained the fundamental stereological formulas but remained almost unnoticed by both mathematicians and practitioners.

  14. Correlations and Non-Linear Probability Models

    DEFF Research Database (Denmark)

    Breen, Richard; Holm, Anders; Karlson, Kristian Bernt

    2014-01-01

    Although the parameters of logit and probit and other non-linear probability models are often explained and interpreted in relation to the regression coefficients of an underlying linear latent variable model, we argue that they may also be usefully interpreted in terms of the correlations between the dependent variable of the latent variable model and its predictor variables. We show how this correlation can be derived from the parameters of non-linear probability models, develop tests for the statistical significance of the derived correlation, and illustrate its usefulness in two applications. Under certain circumstances, which we explain, the derived correlation provides a way of overcoming the problems inherent in cross-sample comparisons of the parameters of non-linear probability models.

  15. Geometric modeling in probability and statistics

    CERN Document Server

    Calin, Ovidiu

    2014-01-01

    This book covers topics of Informational Geometry, a field which deals with the differential geometric study of the manifold probability density functions. This is a field that is increasingly attracting the interest of researchers from many different areas of science, including mathematics, statistics, geometry, computer science, signal processing, physics and neuroscience. It is the authors’ hope that the present book will be a valuable reference for researchers and graduate students in one of the aforementioned fields. This textbook is a unified presentation of differential geometry and probability theory, and constitutes a text for a course directed at graduate or advanced undergraduate students interested in applications of differential geometry in probability and statistics. The book contains over 100 proposed exercises meant to help students deepen their understanding, and it is accompanied by software that is able to provide numerical computations of several information geometric objects. The reader...

  16. Fixation probability on clique-based graphs

    Science.gov (United States)

    Choi, Jeong-Ok; Yu, Unjong

    2018-02-01

    The fixation probability of a mutant in the evolutionary dynamics of Moran process is calculated by the Monte-Carlo method on a few families of clique-based graphs. It is shown that the complete suppression of fixation can be realized with the generalized clique-wheel graph in the limit of small wheel-clique ratio and infinite size. The family of clique-star is an amplifier, and clique-arms graph changes from amplifier to suppressor as the fitness of the mutant increases. We demonstrate that the overall structure of a graph can be more important to determine the fixation probability than the degree or the heat heterogeneity. The dependence of the fixation probability on the position of the first mutant is discussed.
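
    The Monte Carlo approach described above can be sketched in its simplest setting: the Moran process on a complete graph (the simplest clique-based graph), where the estimate can be checked against the standard closed form ρ = (1 − 1/r)/(1 − r^(−N)). The population size and fitness below are illustrative choices, not parameters from the paper.

    ```python
    import random

    def fixation_probability_mc(N, r, trials=10_000, seed=1):
        """Monte Carlo estimate of the fixation probability of a single mutant
        of fitness r among N-1 residents of fitness 1, for the Moran process
        on a complete graph (every individual is every other's neighbour)."""
        rng = random.Random(seed)
        fixed = 0
        for _ in range(trials):
            m = 1                                   # current number of mutants
            while 0 < m < N:
                # Pick a reproducer proportionally to fitness...
                if rng.random() < m * r / (m * r + (N - m)):
                    # ...a mutant reproduces; offspring replaces a random
                    # one of the other N-1 individuals
                    if rng.random() < (N - m) / (N - 1):
                        m += 1                      # a resident was replaced
                else:
                    # ...a resident reproduces
                    if rng.random() < m / (N - 1):
                        m -= 1                      # a mutant was replaced
            fixed += (m == N)
        return fixed / trials

    N, r = 10, 1.5
    analytic = (1 - 1 / r) / (1 - r ** (-N))        # standard Moran result
    estimate = fixation_probability_mc(N, r)
    print(f"analytic {analytic:.3f}, Monte Carlo {estimate:.3f}")
    ```

    The complete graph is isothermal, so the simulation should agree with the well-mixed Moran formula; on the clique-wheel, clique-star and clique-arms families studied in the record, the estimate deviates from this baseline in the amplifier/suppressor directions described.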

  17. Duelling idiots and other probability puzzlers

    CERN Document Server

    Nahin, Paul J

    2002-01-01

    What are your chances of dying on your next flight, being called for jury duty, or winning the lottery? We all encounter probability problems in our everyday lives. In this collection of twenty-one puzzles, Paul Nahin challenges us to think creatively about the laws of probability as they apply in playful, sometimes deceptive, ways to a fascinating array of speculative situations. Games of Russian roulette, problems involving the accumulation of insects on flypaper, and strategies for determining the odds of the underdog winning the World Series all reveal intriguing dimensions to the workings…

  18. Proposal for Modified Damage Probability Distribution Functions

    DEFF Research Database (Denmark)

    Pedersen, Preben Terndrup; Hansen, Peter Friis

    1996-01-01

    Immediately following the Estonia disaster, the Nordic countries established a project entitled "Safety of Passenger/RoRo Vessels". As part of this project, the present proposal for modified damage stability probability distribution functions has been developed and submitted to the "Sub-committee on St…"

  19. Probability densities and Lévy densities

    DEFF Research Database (Denmark)

    Barndorff-Nielsen, Ole Eiler

    For positive Lévy processes (i.e. subordinators) formulae are derived that express the probability density or the distribution function in terms of power series in time t. The applicability of the results to finance and to turbulence is briefly indicated.

  20. Probabilities from entanglement, Born's rule from envariance

    International Nuclear Information System (INIS)

    Zurek, W.

    2005-01-01

    Full text: I shall discuss consequences of envariance (environment-assisted invariance), a symmetry exhibited by entangled quantum states. I shall focus on the implications of envariance for the understanding of the origins and nature of ignorance, and, hence, for the origin of probabilities in physics. While the derivation of Born's rule for probabilities (p_k = |ψ_k|^2) is the principal accomplishment of this research, I shall explore the possibility that several other symptoms of the quantum-classical transition that are a consequence of decoherence can be justified directly by envariance, i.e., without invoking Born's rule. (author)

  1. Risk Probability Estimating Based on Clustering

    DEFF Research Database (Denmark)

    Chen, Yong; Jensen, Christian D.; Gray, Elizabeth

    2003-01-01

    of prior experiences, recommendations from a trusted entity or the reputation of the other entity. In this paper we propose a dynamic mechanism for estimating the risk probability of a certain interaction in a given environment using hybrid neural networks. We argue that traditional risk assessment models from the insurance industry do not directly apply to ubiquitous computing environments. Instead, we propose a dynamic mechanism for risk assessment, which is based on pattern matching, classification and prediction procedures. This mechanism uses an estimator of risk probability, which is based…

  2. Fifty challenging problems in probability with solutions

    CERN Document Server

    Mosteller, Frederick

    1987-01-01

    Can you solve the problem of "The Unfair Subway"? Marvin gets off work at random times between 3 and 5 p.m. His mother lives uptown, his girlfriend downtown. He takes the first subway that comes in either direction and eats dinner with the one he is delivered to. His mother complains that he never comes to see her, but he says she has a 50-50 chance. He has had dinner with her twice in the last 20 working days. Explain. Marvin's adventures in probability are one of the fifty intriguing puzzles that illustrate both elementary and advanced aspects of probability, each problem designed to challenge…
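
    One standard resolution of the subway puzzle is that the schedule, not the direction, is biased: both trains run every 10 minutes, but the uptown train arrives one minute after the downtown one, so a uniformly random arrival catches uptown only 1/10 of the time. The simulation below illustrates that resolution; the specific schedule is an assumption for illustration, not quoted from the book.

    ```python
    import random

    # Assumed schedule: both trains run on a 10-minute cycle, with the uptown
    # train (to his mother) arriving exactly 1 minute after the downtown train
    # (to his girlfriend). Arriving at a uniformly random time, Marvin boards
    # uptown only if he lands in that 1-minute window -- probability 1/10.
    def simulate_days(days=100_000, seed=0):
        rng = random.Random(seed)
        uptown = 0
        for _ in range(days):
            t = rng.uniform(0.0, 10.0)  # arrival time within a 10-minute cycle
            # downtown train passes at t = 0 (and 10), uptown train at t = 1,
            # so the next train to arrive is uptown exactly when 0 <= t < 1
            uptown += (t < 1.0)
        return uptown / days

    p = simulate_days()
    print(f"fraction of evenings with mother: {p:.3f}")            # close to 0.1
    print(f"expected dinners with mother in 20 days: {20 * p:.1f}")  # about 2
    ```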

  3. Path probabilities of continuous time random walks

    International Nuclear Information System (INIS)

    Eule, Stephan; Friedrich, Rudolf

    2014-01-01

    Employing the path integral formulation of a broad class of anomalous diffusion processes, we derive the exact relations for the path probability densities of these processes. In particular, we obtain a closed analytical solution for the path probability distribution of a Continuous Time Random Walk (CTRW) process. This solution is given in terms of its waiting time distribution and short time propagator of the corresponding random walk as a solution of a Dyson equation. Applying our analytical solution we derive generalized Feynman–Kac formulae. (paper)

  4. Probable Gastrointestinal Toxicity of Kombucha Tea

    Science.gov (United States)

    Srinivasan, Radhika; Smolinske, Susan; Greenbaum, David

    1997-01-01

    Kombucha tea is a health beverage made by incubating the Kombucha “mushroom” in tea and sugar. Although therapeutic benefits have been attributed to the drink, neither its beneficial effects nor adverse side effects have been reported widely in the scientific literature. Side effects probably related to consumption of Kombucha tea are reported in four patients. Two presented with symptoms of allergic reaction, the third with jaundice, and the fourth with nausea, vomiting, and head and neck pain. In all four, use of Kombucha tea in proximity to onset of symptoms and symptom resolution on cessation of tea drinking suggest a probable etiologic association. PMID:9346462

  5. Quantum probability and quantum decision-making.

    Science.gov (United States)

    Yukalov, V I; Sornette, D

    2016-01-13

    A rigorous general definition of quantum probability is given, which is valid not only for elementary events but also for composite events, for operationally testable measurements as well as for inconclusive measurements, and also for non-commuting observables in addition to commutative observables. Our proposed definition of quantum probability makes it possible to describe quantum measurements and quantum decision-making on the same common mathematical footing. Conditions are formulated for the case when quantum decision theory reduces to its classical counterpart and for the situation where the use of quantum decision theory is necessary. © 2015 The Author(s).

  6. Lady luck the theory of probability

    CERN Document Server

    Weaver, Warren

    1982-01-01

    "Should I take my umbrella?" "Should I buy insurance?" "Which horse should I bet on?" Every day ― in business, in love affairs, in forecasting the weather or the stock market ― questions arise which cannot be answered by a simple "yes" or "no." Many of these questions involve probability. Probabilistic thinking is as crucially important in ordinary affairs as it is in the most abstruse realms of science. This book is the best nontechnical introduction to probability ever written. Its author, the late Dr. Warren Weaver, was a professor of mathematics, active in the Rockefeller and Sloa

  7. Bayesian estimation of core-melt probability

    International Nuclear Information System (INIS)

    Lewis, H.W.

    1984-01-01

    A very simple application of the canonical Bayesian algorithm is made to the problem of estimating the probability of core melt in a commercial power reactor. An approximation to the results of the Rasmussen study on reactor safety is used as the prior distribution, and the observation that there has been no core melt yet is used as the single experiment. The result is a substantial decrease in the mean probability of core melt--factors of 2 to 4 for reasonable choices of parameters. The purpose is to illustrate the procedure, not to argue for the decrease.
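The update described can be reproduced with a conjugate Beta-binomial model: observing melt-free reactor-years simply shifts the Beta prior's second parameter. The prior parameters and exposure below are illustrative placeholders, not the Rasmussen-derived prior or the data of the paper.

```python
# Bayesian update for a rare-event probability: Beta(a, b) prior,
# binomial likelihood, and the single observation "no core melt in
# `reactor_years` reactor-years". Numbers are illustrative placeholders.

def posterior_mean_core_melt(a, b, reactor_years, melts=0):
    """Posterior mean of the per-year melt probability under a
    Beta(a, b) prior after `melts` events in `reactor_years` trials."""
    return (a + melts) / (a + b + reactor_years)

prior_mean = 1.0 / (1.0 + 9999.0)            # Beta(1, 9999): mean 1e-4 per year
post_mean = posterior_mean_core_melt(1.0, 9999.0, 3000.0)  # 3000 melt-free years
```

With a sharper (more informative) prior the decrease is smaller; with a diffuse prior it is larger, which is the sensitivity to "reasonable choices of parameters" the abstract mentions.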

  8. Minimizing employee exposure to toxic chemical releases

    International Nuclear Information System (INIS)

    Plummer, R.W.; Stobbe, T.J.; Mogensen, J.E.; Jeram, L.K.

    1987-01-01

    This book describes procedures for minimizing employee exposure to toxic chemical releases and suggested personal protective equipment (PPE) to be used in the event of such a release. How individuals, employees, supervisors, or companies perceive the risks of chemical exposure (risk meaning both the probability of exposure and the effect of exposure) determines to a great extent what precautions are taken to avoid risk. In Part I, the authors develop an approach that divides the project into three phases: the kinds of procedures currently being used; the types of toxic chemical release accidents and injuries that occur; and, finally, the integration of this information into a set of recommended procedures which should decrease the likelihood of a toxic chemical release and, if one does occur, will minimize the exposure and its severity to employees. Part II covers the use of personal protective equipment. It addresses the questions: what personal protective equipment ensembles are used in industry in situations where the release of a toxic or dangerous chemical may occur or has occurred; and what personal protective equipment ensembles should be used in these situations.

  9. Quantum probabilities as Dempster-Shafer probabilities in the lattice of subspaces

    International Nuclear Information System (INIS)

    Vourdas, A.

    2014-01-01

    The orthocomplemented modular lattice of subspaces L[H(d)], of a quantum system with d-dimensional Hilbert space H(d), is considered. A generalized additivity relation which holds for Kolmogorov probabilities is violated by quantum probabilities in the full lattice L[H(d)] (it is only valid within the Boolean subalgebras of L[H(d)]). This suggests the use of more general (than Kolmogorov) probability theories, and here the Dempster-Shafer probability theory is adopted. An operator D(H 1 ,H 2 ), which quantifies deviations from Kolmogorov probability theory is introduced, and it is shown to be intimately related to the commutator of the projectors P(H 1 ),P(H 2 ), to the subspaces H 1 , H 2 . As an application, it is shown that the proof of the inequalities of Clauser, Horne, Shimony, and Holt for a system of two spin 1/2 particles is valid for Kolmogorov probabilities, but it is not valid for Dempster-Shafer probabilities. The violation of these inequalities in experiments supports the interpretation of quantum probabilities as Dempster-Shafer probabilities

  10. Radionuclide release calculations for SAR-08

    International Nuclear Information System (INIS)

    Thomson, Gavin; Miller, Alex; Smith, Graham; Jackson, Duncan

    2008-04-01

    Following a review by the Swedish regulatory authorities of the post-closure safety assessment of the SFR 1 disposal facility for low and intermediate level waste (L/ILW), SAFE, SKB has prepared an updated assessment called SAR-08. This report describes the radionuclide release calculations that have been undertaken as part of SAR-08. The information, assumptions and data used in the calculations are reported and the results are presented. The calculations address issues raised in the regulatory review, but also take account of new information, including revised inventory data. The scenarios considered include the main case of expected behaviour of the system, with variants; low probability releases; and so-called residual scenarios. Apart from these scenario uncertainties, data uncertainties have been examined using a probabilistic approach. Calculations have been made using the AMBER software, which allows all the component features of the assessment model to be included in one place. AMBER has previously been used to reproduce the results of the corresponding calculations in the SAFE assessment. It has also been used in demonstrating the IAEA's near surface disposal assessment methodology ISAM, has been subject to very substantial verification tests, and has been used in verifying other assessment codes. Results are presented as a function of time for the release of radionuclides from the near field, and then from the far field into the biosphere. Radiological impacts of the releases are reported elsewhere. Consideration is given to each radionuclide and to each component part of the repository. The releases from the entire repository are also presented. The peak release rates are, for most scenarios, due to organic C-14. Other radionuclides which contribute to peak release rates include inorganic C-14, Ni-59 and Ni-63. (author)
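A compartment model of the general kind AMBER implements can be sketched as a chain of first-order transfers, here near field to far field to biosphere. The rate constants, time step, and initial inventory below are arbitrary illustrative values, not data from SAR-08.

```python
# Minimal compartment-chain sketch: first-order transfer
# near field -> far field -> biosphere, explicit Euler integration.
# All parameters are illustrative, not SAR-08 data.

def simulate_release(m0=1.0, k_nf=0.01, k_ff=0.005, dt=0.1, t_end=1000.0):
    near, far, bio = m0, 0.0, 0.0
    t = 0.0
    while t < t_end:
        flux_nf = k_nf * near * dt   # near field -> far field
        flux_ff = k_ff * far * dt    # far field -> biosphere
        near -= flux_nf
        far += flux_nf - flux_ff
        bio += flux_ff
        t += dt
    return near, far, bio

near, far, bio = simulate_release()
```

Recording `far`'s outflow at each step gives the release-rate-versus-time curves of the kind the report presents; decay and branching would add further terms per radionuclide.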

  11. Radionuclide release calculations for SAR-08

    Energy Technology Data Exchange (ETDEWEB)

    Thomson, Gavin; Miller, Alex; Smith, Graham; Jackson, Duncan (Enviros Consulting Ltd, Wolverhampton (United Kingdom))

    2008-04-15

    Following a review by the Swedish regulatory authorities of the post-closure safety assessment of the SFR 1 disposal facility for low and intermediate level waste (L/ILW), SAFE, SKB has prepared an updated assessment called SAR-08. This report describes the radionuclide release calculations that have been undertaken as part of SAR-08. The information, assumptions and data used in the calculations are reported and the results are presented. The calculations address issues raised in the regulatory review, but also take account of new information, including revised inventory data. The scenarios considered include the main case of expected behaviour of the system, with variants; low probability releases; and so-called residual scenarios. Apart from these scenario uncertainties, data uncertainties have been examined using a probabilistic approach. Calculations have been made using the AMBER software, which allows all the component features of the assessment model to be included in one place. AMBER has previously been used to reproduce the results of the corresponding calculations in the SAFE assessment. It has also been used in demonstrating the IAEA's near surface disposal assessment methodology ISAM, has been subject to very substantial verification tests, and has been used in verifying other assessment codes. Results are presented as a function of time for the release of radionuclides from the near field, and then from the far field into the biosphere. Radiological impacts of the releases are reported elsewhere. Consideration is given to each radionuclide and to each component part of the repository. The releases from the entire repository are also presented. The peak release rates are, for most scenarios, due to organic C-14. Other radionuclides which contribute to peak release rates include inorganic C-14, Ni-59 and Ni-63. (author)

  12. Tropical Cyclone Wind Probability Forecasting (WINDP).

    Science.gov (United States)

    1981-04-01

    The accuracy for small probabilities (below 10%) is limited by the number of significant digits given; therefore it should be regarded as being…

  13. The Probability Heuristics Model of Syllogistic Reasoning.

    Science.gov (United States)

    Chater, Nick; Oaksford, Mike

    1999-01-01

    Proposes a probability heuristic model for syllogistic reasoning and confirms the rationality of this heuristic by an analysis of the probabilistic validity of syllogistic reasoning that treats logical inference as a limiting case of probabilistic inference. Meta-analysis and two experiments involving 40 adult participants and using generalized…

  14. Probability & Perception: The Representativeness Heuristic in Action

    Science.gov (United States)

    Lu, Yun; Vasko, Francis J.; Drummond, Trevor J.; Vasko, Lisa E.

    2014-01-01

    If the prospective students of probability lack a background in mathematical proofs, hands-on classroom activities may work well to help them to learn to analyze problems correctly. For example, students may physically roll a die twice to count and compare the frequency of the sequences. Tools such as graphing calculators or Microsoft Excel®…

  15. Critique of `Elements of Quantum Probability'

    NARCIS (Netherlands)

    Gill, R.D.

    1998-01-01

    We analyse the thesis of Kummerer and Maassen that classical probability is unable to model the stochastic nature of the Aspect experiment, in which violation of Bell's inequality was experimentally demonstrated. According to these authors, the experiment shows the need to introduce the extension

  16. Independent Events in Elementary Probability Theory

    Science.gov (United States)

    Csenki, Attila

    2011-01-01

    In Probability and Statistics taught to mathematicians as a first introduction or to a non-mathematical audience, joint independence of events is introduced by requiring that the multiplication rule is satisfied. The following statement is usually tacitly assumed to hold (and, at best, intuitively motivated): If the n events E[subscript 1],…

  17. Probable Unusual Transmission of Zika Virus

    Centers for Disease Control (CDC) Podcasts

    This podcast discusses a study about the probable unusual transmission of Zika Virus Infection from a scientist to his wife, published in the May 2011 issue of Emerging Infectious Diseases. Dr. Brian Foy, Associate Professor at Colorado State University, shares details of this event.

  18. Error probabilities in default Bayesian hypothesis testing

    NARCIS (Netherlands)

    Gu, Xin; Hoijtink, Herbert; Mulder, J.

    2016-01-01

    This paper investigates the classical type I and type II error probabilities of default Bayes factors for a Bayesian t test. Default Bayes factors quantify the relative evidence between the null hypothesis and the unrestricted alternative hypothesis without needing to specify prior distributions for

  19. Spatial Probability Cuing and Right Hemisphere Damage

    Science.gov (United States)

    Shaqiri, Albulena; Anderson, Britt

    2012-01-01

    In this experiment we studied statistical learning, inter-trial priming, and visual attention. We assessed healthy controls and right brain damaged (RBD) patients with and without neglect, on a simple visual discrimination task designed to measure priming effects and probability learning. All participants showed a preserved priming effect for item…

  20. Sampling, Probability Models and Statistical Reasoning -RE ...

    Indian Academy of Sciences (India)

    random sampling allows data to be modelled with the help of probability ... based on different trials to get an estimate of the experimental error. ... research interests lie in the ... if e is indeed the true value of the proportion of defectives in the.

  1. Virus isolation: Specimen type and probable transmission

    Indian Academy of Sciences (India)

    Virus isolation: Specimen type and probable transmission. Over 500 CHIK virus isolations were made: 4 from male Ae. aegypti (?TOT), 6 from CSF (neurological involvement), 1 from a 4-day-old child (transplacental transmission).

  2. Estimating the Probability of Negative Events

    Science.gov (United States)

    Harris, Adam J. L.; Corner, Adam; Hahn, Ulrike

    2009-01-01

    How well we are attuned to the statistics of our environment is a fundamental question in understanding human behaviour. It seems particularly important to be able to provide accurate assessments of the probability with which negative events occur so as to guide rational choice of preventative actions. One question that arises here is whether or…

  3. Concurrency meets probability: theory and practice (abstract)

    NARCIS (Netherlands)

    Katoen, Joost P.

    Treating random phenomena in concurrency theory has a long tradition. Petri nets [18, 10] and process algebras [14] have been extended with probabilities. The same applies to behavioural semantics such as strong and weak (bi)simulation [1], and testing pre-orders [5]. Beautiful connections between

  4. Confusion between Odds and Probability, a Pandemic?

    Science.gov (United States)

    Fulton, Lawrence V.; Mendez, Francis A.; Bastian, Nathaniel D.; Musal, R. Muzaffer

    2012-01-01

    This manuscript discusses the common confusion between the terms probability and odds. To emphasize the importance and responsibility of being meticulous in the dissemination of information and knowledge, this manuscript reveals five cases of sources of inaccurate statistical language imbedded in the dissemination of information to the general…

  5. Probability in Action: The Red Traffic Light

    Science.gov (United States)

    Shanks, John A.

    2007-01-01

    Emphasis on problem solving in mathematics has gained considerable attention in recent years. While statistics teaching has always been problem driven, the same cannot be said for the teaching of probability where discrete examples involving coins and playing cards are often the norm. This article describes an application of simple probability…

  6. Probability & Statistics: Modular Learning Exercises. Teacher Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The modules also introduce students to real world math concepts and problems that property and casualty actuaries come across in their work. They are designed to be used by teachers and…

  7. Probability & Statistics: Modular Learning Exercises. Student Edition

    Science.gov (United States)

    Actuarial Foundation, 2012

    2012-01-01

    The purpose of these modules is to provide an introduction to the world of probability and statistics to accelerated mathematics students at the high school level. The materials are centered on the fictional town of Happy Shores, a coastal community which is at risk for hurricanes. Actuaries at an insurance company figure out the risks and…

  8. Conditional probability on MV-algebras

    Czech Academy of Sciences Publication Activity Database

    Kroupa, Tomáš

    2005-01-01

    Roč. 149, č. 2 (2005), s. 369-381 ISSN 0165-0114 R&D Projects: GA AV ČR IAA2075302 Institutional research plan: CEZ:AV0Z10750506 Keywords : conditional probability * tribe * MV-algebra Subject RIV: BA - General Mathematics Impact factor: 1.039, year: 2005

  9. Investigating Probability with the NBA Draft Lottery.

    Science.gov (United States)

    Quinn, Robert J.

    1997-01-01

    Investigates an interesting application of probability in the world of sports. Considers the role of permutations in the lottery system used by the National Basketball Association (NBA) in the United States to determine the order in which nonplayoff teams select players from the college ranks. Presents a lesson on this topic in which students work…
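As a rough illustration of the weighted, permutation-based draw such a lottery involves, the sketch below draws a full selection order by repeated weighted sampling without replacement. The team weights are made up for illustration, not the NBA's actual odds.

```python
import random

def draw_order(weights, rng):
    """Draw a full pick order: each remaining team wins the next pick
    with probability proportional to its weight (illustrative scheme,
    not the league's exact procedure)."""
    teams = list(weights)
    order = []
    while teams:
        pick = rng.choices(teams, weights=[weights[t] for t in teams])[0]
        order.append(pick)
        teams.remove(pick)
    return order

rng = random.Random(0)
weights = {"A": 0.25, "B": 0.20, "C": 0.15, "D": 0.40}
order = draw_order(weights, rng)
```

Enumerating all permutations with their product weights gives the exact pick-position probabilities students can compare against a simulation like this one.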

  10. Probability from a Socio-Cultural Perspective

    Science.gov (United States)

    Sharma, Sashi

    2016-01-01

    There exists considerable and rich literature on students' misconceptions about probability; less attention has been paid to the development of students' probabilistic thinking in the classroom. Grounded in an analysis of the literature, this article offers a lesson sequence for developing students' probabilistic understanding. In particular, a…

  11. Neutrosophic Probability, Set, And Logic (first version)

    OpenAIRE

    Smarandache, Florentin

    2000-01-01

    This project is a part of a National Science Foundation interdisciplinary project proposal. Starting from a new viewpoint in philosophy, the neutrosophy, one extends the classical "probability theory", "fuzzy set", and "fuzzy logic" to "neutrosophic probability", "neutrosophic set", and "neutrosophic logic", respectively. They are useful in artificial intelligence, neural networks, evolutionary programming, neutrosophic dynamic systems, and quantum mechanics.

  12. Pade approximant calculations for neutron escape probability

    International Nuclear Information System (INIS)

    El Wakil, S.A.; Saad, E.A.; Hendi, A.A.

    1984-07-01

    The neutron escape probability from a non-multiplying slab containing internal source is defined in terms of a functional relation for the scattering function for the diffuse reflection problem. The Pade approximant technique is used to get numerical results which compare with exact results. (author)

  13. On a paradox of probability theory

    International Nuclear Information System (INIS)

    Stuart, C.I.J.M.

    1989-01-01

    Costa de Beauregard's proposal concerning physical retrocausality has been shown to fail on two crucial points. However, it is argued that his proposal still merits serious attention. The argument arises from showing that his proposal reveals a paradox involving relations between conditional probabilities, statistical correlations and reciprocal causalities of the type exhibited by cooperative dynamics in physical systems. 4 refs. (Author)

  14. Escape probabilities for fluorescent x-rays

    International Nuclear Information System (INIS)

    Dance, D.R.; Day, G.J.

    1985-01-01

    Computation of the energy absorption efficiency of an x-ray photon detector involves consideration of the histories of the secondary particles produced in any initial or secondary interaction which may occur within the detector. In particular, the K or higher shell fluorescent x-rays which may be emitted following a photoelectric interaction can carry away a large fraction of the energy of the incident photon, especially if this energy is just above an absorption edge. The effects of such photons cannot be ignored and a correction term, depending upon the probability that the fluorescent x-rays will escape from the detector, must be applied to the energy absorption efficiency. For detectors such as x-ray intensifying screens, it has been usual to calculate this probability by numerical integration. In this note analytic expressions are derived for the escape probability of fluorescent photons from planar detectors in terms of exponential integral functions. Rational approximations for these functions are readily available and these analytic expressions therefore facilitate the computation of photon absorption efficiencies. A table is presented which should obviate the need for calculating the escape probability for most cases of interest. (author)
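Such rational approximations are indeed standard: one classical pair is Abramowitz & Stegun 5.1.53 (for 0 < x <= 1) and 5.1.56 (for x >= 1) for the exponential integral E1, with absolute error around 2e-7. The sketch below implements that generic pair; it is not the authors' own expressions for the escape probability.

```python
import math

def exp_int_e1(x):
    """Exponential integral E1(x) for x > 0 via the Abramowitz & Stegun
    rational fits 5.1.53 (0 < x <= 1) and 5.1.56 (x >= 1)."""
    if x <= 0:
        raise ValueError("E1 is implemented here for x > 0 only")
    if x <= 1.0:
        # Polynomial fit for E1(x) + ln(x), A&S 5.1.53
        a = [-0.57721566, 0.99999193, -0.24991055,
             0.05519968, -0.00976004, 0.00107857]
        poly = sum(c * x**i for i, c in enumerate(a))
        return poly - math.log(x)
    # Rational fit for x * exp(x) * E1(x), A&S 5.1.56
    num = x**4 + 8.5733287401*x**3 + 18.0590169730*x**2 \
        + 8.6347608925*x + 0.2677737343
    den = x**4 + 9.5733223454*x**3 + 25.6329561486*x**2 \
        + 21.0996530827*x + 3.9584969228
    return math.exp(-x) / x * (num / den)
```

With E1 (and the related E2, E3 obtained by recurrence) available in closed form like this, the escape-probability expressions can be evaluated without numerical integration, which is the computational point of the note.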

  15. Sequential Probability Ratio Tests: Conservative and Robust

    NARCIS (Netherlands)

    Kleijnen, J.P.C.; Shi, Wen

    2017-01-01

    In practice, most computers generate simulation outputs sequentially, so it is attractive to analyze these outputs through sequential statistical methods such as sequential probability ratio tests (SPRTs). We investigate several SPRTs for choosing between two hypothesized values for the mean output

  16. Applied probability models with optimization applications

    CERN Document Server

    Ross, Sheldon M

    1992-01-01

    Concise advanced-level introduction to stochastic processes that frequently arise in applied probability. Largely self-contained text covers Poisson process, renewal theory, Markov chains, inventory theory, Brownian motion and continuous time optimization models, much more. Problems and references at chapter ends. "Excellent introduction." - Journal of the American Statistical Association. Bibliography. 1970 edition.

  17. Quantum probability and conceptual combination in conjunctions.

    Science.gov (United States)

    Hampton, James A

    2013-06-01

    I consider the general problem of category conjunctions in the light of Pothos & Busemeyer (P&B)'s quantum probability (QP) account of the conjunction fallacy. I argue that their account as presented cannot capture the "guppy effect" - the case in which a class is a better member of a conjunction A^B than it is of either A or B alone.

  18. Monte Carlo methods to calculate impact probabilities

    Science.gov (United States)

    Rickman, H.; Wiśniowski, T.; Wajer, P.; Gabryszewski, R.; Valsecchi, G. B.

    2014-09-01

    Context. Unraveling the events that took place in the solar system during the period known as the late heavy bombardment requires the interpretation of the cratered surfaces of the Moon and terrestrial planets. This, in turn, requires good estimates of the statistical impact probabilities for different source populations of projectiles, a subject that has received relatively little attention, since the works of Öpik (1951, Proc. R. Irish Acad. Sect. A, 54, 165) and Wetherill (1967, J. Geophys. Res., 72, 2429). Aims: We aim to work around the limitations of the Öpik and Wetherill formulae, which are caused by singularities due to zero denominators under special circumstances. Using modern computers, it is possible to make good estimates of impact probabilities by means of Monte Carlo simulations, and in this work, we explore the available options. Methods: We describe three basic methods to derive the average impact probability for a projectile with a given semi-major axis, eccentricity, and inclination with respect to a target planet on an elliptic orbit. One is a numerical averaging of the Wetherill formula; the next is a Monte Carlo super-sizing method using the target's Hill sphere. The third uses extensive minimum orbit intersection distance (MOID) calculations for a Monte Carlo sampling of potentially impacting orbits, along with calculations of the relevant interval for the timing of the encounter allowing collision. Numerical experiments are carried out for an intercomparison of the methods and to scrutinize their behavior near the singularities (zero relative inclination and equal perihelion distances). Results: We find an excellent agreement between all methods in the general case, while there appear large differences in the immediate vicinity of the singularities. With respect to the MOID method, which is the only one that does not involve simplifying assumptions and approximations, the Wetherill averaging impact probability departs by diverging toward

  19. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    Fragola, J.R.; Shooman, M.L.

    1991-01-01

    The investigation of the failure in systems where failure is a rare event makes the continual comparisons between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10 -3 per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but continual reassessment of the analysis assumptions, and a bounding of the analysis predictions by historical reality. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality and the associated impact on PSA probabilistic estimates

  20. The probability and severity of decompression sickness

    Science.gov (United States)

    Hada, Ethan A.; Vann, Richard D.; Denoble, Petar J.

    2017-01-01

    Decompression sickness (DCS), which is caused by inert gas bubbles in tissues, is an injury of concern for scuba divers, compressed air workers, astronauts, and aviators. Case reports for 3322 air and N2-O2 dives, resulting in 190 DCS events, were retrospectively analyzed and the outcomes were scored as (1) serious neurological, (2) cardiopulmonary, (3) mild neurological, (4) pain, (5) lymphatic or skin, and (6) constitutional or nonspecific manifestations. Following standard U.S. Navy medical definitions, the data were grouped into mild—Type I (manifestations 4–6)–and serious–Type II (manifestations 1–3). Additionally, we considered an alternative grouping of mild–Type A (manifestations 3–6)–and serious–Type B (manifestations 1 and 2). The current U.S. Navy guidance allows for a 2% probability of mild DCS and a 0.1% probability of serious DCS. We developed a hierarchical trinomial (3-state) probabilistic DCS model that simultaneously predicts the probability of mild and serious DCS given a dive exposure. Both the Type I/II and Type A/B discriminations of mild and serious DCS resulted in a highly significant (p probability of ‘mild’ DCS resulted in a longer allowable bottom time for the same 2% limit. However, for the 0.1% serious DCS limit, we found a vastly decreased allowable bottom dive time for all dive depths. If the Type A/B scoring was assigned to outcome severity, the no decompression limits (NDL) for air dives were still controlled by the acceptable serious DCS risk limit rather than the acceptable mild DCS risk limit. However, in this case, longer NDL limits were allowed than with the Type I/II scoring. The trinomial model mild and serious probabilities agree reasonably well with the current air NDL only with the Type A/B scoring and when 0.2% risk of serious DCS is allowed. PMID:28296928

  1. ATP Release Channels

    Directory of Open Access Journals (Sweden)

    Akiyuki Taruno

    2018-03-01

    Adenosine triphosphate (ATP) has been well established as an important extracellular ligand of autocrine signaling, intercellular communication, and neurotransmission with numerous physiological and pathophysiological roles. In addition to the classical exocytosis, non-vesicular mechanisms of cellular ATP release have been demonstrated in many cell types. Although large and negatively charged ATP molecules cannot diffuse across the lipid bilayer of the plasma membrane, conductive ATP release from the cytosol into the extracellular space is possible through ATP-permeable channels. Such channels must possess two minimum qualifications for ATP permeation: anion permeability and a large ion-conducting pore. Currently, five groups of channels are acknowledged as ATP-release channels: connexin hemichannels, pannexin 1, calcium homeostasis modulator 1 (CALHM1), volume-regulated anion channels (VRACs), also known as volume-sensitive outwardly rectifying (VSOR) anion channels, and maxi-anion channels (MACs). Recently, major breakthroughs have been made in the field by molecular identification of CALHM1 as the action potential-dependent ATP-release channel in taste bud cells, LRRC8s as components of VRACs, and SLCO2A1 as a core subunit of MACs. Here, the function and physiological roles of these five groups of ATP-release channels are summarized, along with a discussion on the future implications of understanding these channels.

  2. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    Science.gov (United States)

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
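Whatever derivation route is taken (truth tables or otherwise), the numerical content of Bayes' rule is compact. A minimal sketch follows; the disease-test numbers are illustrative, not from the paper.

```python
# Bayes' rule: posterior P(H|E) from prior P(H), likelihood P(E|H),
# and P(E|not H) via the law of total probability.
# The screening-test numbers below are illustrative assumptions.

def bayes_posterior(prior, like_h, like_not_h):
    """P(H | E) = P(E | H) P(H) / [P(E | H) P(H) + P(E | not H) P(not H)]."""
    evidence = like_h * prior + like_not_h * (1.0 - prior)
    return like_h * prior / evidence

# P(disease) = 0.01, P(+ | disease) = 0.95, P(+ | healthy) = 0.05
post = bayes_posterior(0.01, 0.95, 0.05)
```

Even with a 95%-sensitive test, the posterior here is only about 0.16, the classic illustration of how a low prior dominates.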

  3. A Comprehensive Probability Project for the Upper Division One-Semester Probability Course Using Yahtzee

    Science.gov (United States)

    Wilson, Jason; Lawman, Joshua; Murphy, Rachael; Nelson, Marissa

    2011-01-01

    This article describes a probability project used in an upper division, one-semester probability course with third-semester calculus and linear algebra prerequisites. The student learning outcome focused on developing the skills necessary for approaching project-sized math/stat application problems. These skills include appropriately defining…

  4. RAVEN Beta Release

    International Nuclear Information System (INIS)

    Rabiti, Cristian; Alfonsi, Andrea; Cogliati, Joshua Joseph; Mandelli, Diego; Kinoshita, Robert Arthur; Wang, Congjian; Maljovec, Daniel Patrick; Talbot, Paul William

    2016-01-01

    This documents the release of the Risk Analysis Virtual Environment (RAVEN) code. A description of the RAVEN code is provided, along with a discussion of the release process for the M2LW-16IN0704045 milestone. The RAVEN code is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is capable of investigating the system response as well as the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength is focused toward system feature discovery, such as limit surfaces separating regions of the input space leading to system failure, using dynamic supervised learning techniques. RAVEN has now matured sufficiently for the Beta 1.0 release.
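As a generic illustration of one of the sampling schemes named above, the sketch below is a minimal Latin hypercube sampler: one point per equal-width stratum in each dimension, paired randomly across dimensions. This is an assumption-level sketch, not RAVEN's implementation.

```python
import random

def latin_hypercube(n_samples, n_dims, rng):
    """Return n_samples points in [0, 1)^n_dims with exactly one sample
    per equal-width stratum in every dimension (Latin hypercube)."""
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        # one uniform point inside each of the n strata ...
        strata = [(i + rng.random()) / n_samples for i in range(n_samples)]
        rng.shuffle(strata)  # ... paired randomly across dimensions
        for i in range(n_samples):
            samples[i][d] = strata[i]
    return samples

pts = latin_hypercube(10, 3, random.Random(1))
```

Compared with plain Monte Carlo, this stratification guarantees marginal coverage of each input dimension at the same sample count, which is why frameworks like RAVEN offer it alongside Monte Carlo and Grid sampling.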

  5. RAVEN Beta Release

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cogliati, Joshua Joseph [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert Arthur [Idaho National Lab. (INL), Idaho Falls, ID (United States); Wang, Congjian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Maljovec, Daniel Patrick [Idaho National Lab. (INL), Idaho Falls, ID (United States); Talbot, Paul William [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-02-01

    This documents the release of the Risk Analysis Virtual Environment (RAVEN) code. A description of the RAVEN code is provided, along with a discussion of the release process for the M2LW-16IN0704045 milestone. The RAVEN code is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is capable of investigating the system response as well as the input space using Monte Carlo, Grid, or Latin Hypercube sampling schemes, but its strength lies in system feature discovery, such as limit surfaces separating regions of the input space that lead to system failure, using dynamic supervised learning techniques. RAVEN has now matured enough for the Beta 1.0 release.

  6. Estimating the population size and colony boundary of subterranean termites by using the density functions of directionally averaged capture probability.

    Science.gov (United States)

    Su, Nan-Yao; Lee, Sang-Hee

    2008-04-01

    Marked termites were released in a linear-connected foraging arena, and the spatial heterogeneity of their capture probabilities was averaged for both directions at distance r from the release point to obtain a symmetrical distribution, from which the density function of directionally averaged capture probability P(x) was derived. We hypothesized that as marked termites move into the population and given sufficient time, the directionally averaged capture probability may reach an equilibrium P(e) over the distance r and thus satisfy the equal mixing assumption of the mark-recapture protocol. The equilibrium capture probability P(e) was used to estimate the population size N. The hypothesis was tested in a 50-m extended foraging arena to simulate the distance factor of field colonies of subterranean termites. Over the 42-d test period, the density functions of directionally averaged capture probability P(x) exhibited four phases: exponential decline phase, linear decline phase, equilibrium phase, and postequilibrium phase. The equilibrium capture probability P(e), derived as the intercept of the linear regression during the equilibrium phase, correctly projected N estimates that were not significantly different from the known number of workers in the arena. Because the area beneath the probability density function is a constant (50% in this study), preequilibrium regression parameters and P(e) were used to estimate the population boundary distance l, which is the distance between the release point and the boundary beyond which the population is absent.
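
    The estimator in this record relies on the equal-mixing assumption of the mark-recapture protocol. The paper's own estimator works from the equilibrium capture probability P(e); the sketch below instead shows the classical Lincoln-Petersen estimator, which rests on the same equal-mixing assumption, with purely illustrative numbers:

```python
# Lincoln-Petersen mark-recapture sketch: once marked individuals mix
# evenly with the population, the fraction of marked animals in a new
# capture equals the fraction marked in the whole population, so
# N ≈ M * C / R. Numbers below are illustrative, not from the paper.

def lincoln_petersen(marked_released, total_captured, marked_recaptured):
    """Estimate population size N under the equal-mixing assumption."""
    return marked_released * total_captured / marked_recaptured

N_hat = lincoln_petersen(marked_released=500, total_captured=800,
                         marked_recaptured=40)
print(N_hat)  # → 10000.0
```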

  7. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    International Nuclear Information System (INIS)

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-01-01

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  8. Joint survival probability via truncated invariant copula

    International Nuclear Information System (INIS)

    Kim, Jeong-Hoon; Ma, Yong-Ki; Park, Chan Yeol

    2016-01-01

    Highlights: • We have studied an issue of dependence structure between default intensities. • We use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. • We obtain the joint survival probability of the integrated intensities by using a copula. • We apply our theoretical result to pricing basket default swap spread. - Abstract: Given an intensity-based credit risk model, this paper studies dependence structure between default intensities. To model this structure, we use a multivariate shot noise intensity process, where jumps occur simultaneously and their sizes are correlated. Through very lengthy algebra, we obtain explicitly the joint survival probability of the integrated intensities by using the truncated invariant Farlie–Gumbel–Morgenstern copula with exponential marginal distributions. We also apply our theoretical result to pricing basket default swap spreads. This result can provide a useful guide for credit risk management.
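
    The paper's truncated invariant FGM copula is derived through lengthy algebra and is not reproduced here; the sketch below only illustrates the basic construction named in the abstract, the plain (untruncated) Farlie-Gumbel-Morgenstern copula applied to exponential marginal survival functions, with illustrative parameters:

```python
import math

# Joint survival probability under a plain FGM survival copula with
# exponential marginals: S(t1, t2) = S1*S2*(1 + theta*(1-S1)*(1-S2)),
# valid for |theta| <= 1. This is a sketch of the construction, not the
# paper's truncated invariant variant.

def fgm_joint_survival(t1, t2, lam1, lam2, theta):
    """P(T1 > t1, T2 > t2) under an FGM survival copula."""
    s1 = math.exp(-lam1 * t1)  # exponential marginal survival functions
    s2 = math.exp(-lam2 * t2)
    return s1 * s2 * (1.0 + theta * (1.0 - s1) * (1.0 - s2))

# theta = 0 reduces to independence: S(t1, t2) = S1(t1) * S2(t2).
print(round(fgm_joint_survival(1.0, 2.0, 0.5, 0.25, 0.0), 6))  # → 0.367879
```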

  9. Probabilities, causes and propensities in physics

    CERN Document Server

    Suárez, Mauricio

    2010-01-01

    This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics - particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded upon examples from actual physics, thus exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneering effort in bringing out the connections between probabilistic, causal and dispositional aspects of the quantum domain. This book will appeal to specialists in philosophy and foundations of physics, philosophy of science in general, metaphysics, ontology of physics theories, and philosophy of probability.

  10. Foundations of quantization for probability distributions

    CERN Document Server

    Graf, Siegfried

    2000-01-01

    Due to the rapidly increasing need for methods of data compression, quantization has become a flourishing field in signal and image processing and information theory. The same techniques are also used in statistics (cluster analysis), pattern recognition, and operations research (optimal location of service centers). The book gives the first mathematically rigorous account of the fundamental theory underlying these applications. The emphasis is on the asymptotics of quantization errors for absolutely continuous and special classes of singular probabilities (surface measures, self-similar measures) presenting some new results for the first time. Written for researchers and graduate students in probability theory the monograph is of potential interest to all people working in the disciplines mentioned above.

  11. Measurement of the resonance escape probability

    International Nuclear Information System (INIS)

    Anthony, J.P.; Bacher, P.; Lheureux, L.; Moreau, J.; Schmitt, A.P.

    1957-01-01

    The average cadmium ratio in natural uranium rods has been measured, using equal-diameter natural uranium disks. These values, correlated with independent measurements of the lattice buckling, enabled us to calculate values of the resonance escape probability for the G1 reactor with one or the other of two definitions. Measurements were performed on 26 mm and 32 mm rods, giving the following values for the resonance escape probability p: 0.8976 ± 0.005 and 0.912 ± 0.006 (d. 26 mm), 0.8627 ± 0.009 and 0.884 ± 0.01 (d. 32 mm). The influence of either definition on the lattice parameters is discussed, leading to values of the effective integral. Similar experiments have been performed with thorium rods. (author) [fr

  12. Approaches to Evaluating Probability of Collision Uncertainty

    Science.gov (United States)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
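
    The two-dimensional Pc point estimate this record starts from can be sketched by Monte Carlo: draw the relative position at closest approach from the miss-distance covariance and count the fraction of draws falling inside the combined hard-body radius. The diagonal covariance and all numbers below are assumptions for illustration; resampling the covariance inputs, as the abstract proposes, would then yield a distribution of Pc values rather than one number.

```python
import math
import random

# Monte Carlo sketch of a 2D probability-of-collision estimate: the
# relative position is drawn from an (assumed diagonal) Gaussian around
# the nominal miss distance; Pc is the fraction of draws within the
# combined hard-body radius. All parameter values are illustrative.

def pc_monte_carlo(miss_x, miss_y, sigma_x, sigma_y, radius,
                   n=200_000, seed=1):
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):
        dx = rng.gauss(miss_x, sigma_x)
        dy = rng.gauss(miss_y, sigma_y)
        if math.hypot(dx, dy) <= radius:
            hits += 1
    return hits / n

pc = pc_monte_carlo(miss_x=200.0, miss_y=0.0, sigma_x=100.0, sigma_y=50.0,
                    radius=20.0)
print(f"Pc ≈ {pc:.4f}")
```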

  13. Multiple model cardinalized probability hypothesis density filter

    Science.gov (United States)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  14. Conditional probabilities in Ponzano-Regge minisuperspace

    International Nuclear Information System (INIS)

    Petryk, Roman; Schleich, Kristin

    2003-01-01

    We examine the Hartle-Hawking no-boundary initial state for the Ponzano-Regge formulation of gravity in three dimensions. We consider the behavior of conditional probabilities and expectation values for geometrical quantities in this initial state for a simple minisuperspace model consisting of a two-parameter set of anisotropic geometries on a 2-sphere boundary. We find dependence on the cutoff used in the construction of Ponzano-Regge amplitudes for expectation values of edge lengths. However, these expectation values are cutoff independent when computed in certain, but not all, conditional probability distributions. Conditions that yield cutoff independent expectation values are those that constrain the boundary geometry to a finite range of edge lengths. We argue that such conditions have a correspondence to fixing a range of local time, as classically associated with the area of a surface for spatially closed cosmologies. Thus these results may hint at how classical spacetime emerges from quantum amplitudes

  15. Bayesian Prior Probability Distributions for Internal Dosimetry

    Energy Technology Data Exchange (ETDEWEB)

    Miller, G.; Inkret, W.C.; Little, T.T.; Martz, H.F.; Schillaci, M.E

    2001-07-01

    The problem of choosing a prior distribution for the Bayesian interpretation of measurements (specifically internal dosimetry measurements) is considered using a theoretical analysis and by examining historical tritium and plutonium urine bioassay data from Los Alamos. Two models for the prior probability distribution are proposed: (1) the log-normal distribution, when there is some additional information to determine the scale of the true result, and (2) the 'alpha' distribution (a simplified variant of the gamma distribution) when there is not. These models have been incorporated into version 3 of the Bayesian internal dosimetric code in use at Los Alamos (downloadable from our web site). Plutonium internal dosimetry at Los Alamos is now being done using prior probability distribution parameters determined self-consistently from population averages of Los Alamos data. (author)
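
    The first model proposed in this record, a log-normal prior combined with a measurement likelihood, can be illustrated numerically. The sketch below uses a simple grid integration with a Gaussian likelihood and illustrative parameter values; it is not the Los Alamos code's actual algorithm or its default parameters.

```python
import math

# Numerical sketch of a Bayesian bioassay interpretation: log-normal
# prior on the (positive) true result, Gaussian measurement likelihood,
# posterior mean computed on a grid. All parameters are illustrative.

def posterior_mean(measured, meas_sigma, prior_mu, prior_sigma, n=2000):
    # Grid over plausible positive true values, up to 5x the measurement.
    grid = [i * 5.0 * measured / n + 1e-9 for i in range(1, n + 1)]
    weights = []
    for t in grid:
        # Log-normal prior density (up to a constant factor).
        prior = math.exp(-((math.log(t) - prior_mu) ** 2)
                         / (2.0 * prior_sigma ** 2)) / t
        # Gaussian likelihood of the observed measurement given truth t.
        like = math.exp(-((measured - t) ** 2) / (2.0 * meas_sigma ** 2))
        weights.append(prior * like)
    z = sum(weights)
    return sum(t * w for t, w in zip(grid, weights)) / z

# The prior (median 1) shrinks the measurement of 2.0 slightly downward.
print(round(posterior_mean(2.0, 0.5, prior_mu=0.0, prior_sigma=1.0), 3))
```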

  16. Probability and statistics for particle physics

    CERN Document Server

    Mana, Carlos

    2017-01-01

    This book comprehensively presents the basic concepts of probability and Bayesian inference with sufficient generality to make them applicable to current problems in scientific research. The first chapter provides the fundamentals of probability theory that are essential for the analysis of random phenomena. The second chapter includes a full and pragmatic review of the Bayesian methods that constitute a natural and coherent framework with enough freedom to analyze all the information available from experimental data in a conceptually simple manner. The third chapter presents the basic Monte Carlo techniques used in scientific research, allowing a large variety of problems that are difficult to tackle by other procedures to be handled. The author also introduces a basic algorithm, which enables readers to simulate samples from simple distributions, and describes useful cases for researchers in particle physics. The final chapter is devoted to the basic ideas of Information Theory, which are important in the Bayesian me...

  17. Probability Weighting as Evolutionary Second-best

    OpenAIRE

    Herold, Florian; Netzer, Nick

    2011-01-01

    The economic concept of the second-best involves the idea that multiple simultaneous deviations from a hypothetical first-best optimum may be optimal once the first-best itself can no longer be achieved, since one distortion may partially compensate for another. Within an evolutionary framework, we translate this concept to behavior under uncertainty. We argue that the two main components of prospect theory, the value function and the probability weighting function, are complements in the sec...

  18. Bayesian probability theory and inverse problems

    International Nuclear Information System (INIS)

    Kopec, S.

    1994-01-01

    Bayesian probability theory is applied to the approximate solution of inverse problems. In order to solve the moment problem with noisy data, the entropic prior is used. Expressions for the solution and its error bounds are presented. When the noise level tends to zero, the Bayesian solution tends to the classic maximum entropy solution in the L² norm. The use of a spline prior is also shown. (author)

  19. Probability and Statistics in Aerospace Engineering

    Science.gov (United States)

    Rheinfurth, M. H.; Howell, L. W.

    1998-01-01

    This monograph was prepared to give the practicing engineer a clear understanding of probability and statistics with special consideration to problems frequently encountered in aerospace engineering. It is conceived to be both a desktop reference and a refresher for aerospace engineers in government and industry. It could also be used as a supplement to standard texts for in-house training courses on the subject.

  20. Statistical models based on conditional probability distributions

    International Nuclear Information System (INIS)

    Narayanan, R.S.

    1991-10-01

    We present a formulation of statistical mechanics models based on conditional probability distributions rather than a Hamiltonian. We show that it is possible to realize critical phenomena through this procedure. Closely linked with this formulation is a Monte Carlo algorithm, in which a generated configuration is guaranteed to be statistically independent of any other configuration for all values of the parameters, in particular near the critical point. (orig.)

  1. Marrakesh International Conference on Probability and Statistics

    CERN Document Server

    Ouassou, Idir; Rachdi, Mustapha

    2015-01-01

    This volume, which highlights recent advances in statistical methodology and applications, is divided into two main parts. The first part presents theoretical results on estimation techniques in functional statistics, while the second examines three key areas of application: estimation problems in queuing theory, an application in signal processing, and the copula approach to epidemiologic modelling. The book’s peer-reviewed contributions are based on papers originally presented at the Marrakesh International Conference on Probability and Statistics held in December 2013.

  2. Selected papers on analysis, probability, and statistics

    CERN Document Server

    Nomizu, Katsumi

    1994-01-01

    This book presents papers that originally appeared in the Japanese journal Sugaku. The papers fall into the general area of mathematical analysis as it pertains to probability and statistics, dynamical systems, differential equations and analytic function theory. Among the topics discussed are: stochastic differential equations, spectra of the Laplacian and Schrödinger operators, nonlinear partial differential equations which generate dissipative dynamical systems, fractal analysis on self-similar sets and the global structure of analytic functions.

  3. Clan structure analysis and rapidity gap probability

    International Nuclear Information System (INIS)

    Lupia, S.; Giovannini, A.; Ugoccioni, R.

    1995-01-01

    Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link of generalized clan structure analysis with correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal interesting new features, which can be useful in discussing data on rapidity gap probability at TEVATRON and HERA. (orig.)
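
    The clan picture underlying this record can be made concrete: the negative binomial distribution (NBD) is a compound Poisson distribution in which a Poisson number of clans is produced independently and the particles within each clan follow a logarithmic distribution. The sketch below samples an NBD multiplicity through that two-stage construction, using the standard clan-structure relations; the parameter values in the demo are illustrative.

```python
import math
import random

# Sample NBD(n_bar, k) multiplicities via clan structure: a Poisson
# number of clans, each contributing a logarithmically distributed
# number of particles. Standard relations: b = n_bar/(n_bar + k) and
# mean clans = k * ln(1 + n_bar/k).

def sample_poisson(lam, rng):
    """Knuth's method (fine for small lambda)."""
    threshold = math.exp(-lam)
    k, p = 0, rng.random()
    while p > threshold:
        k += 1
        p *= rng.random()
    return k

def sample_log_series(b, rng):
    """Logarithmic distribution P(j) = -b^j / (j ln(1-b)), j >= 1."""
    u = rng.random()
    j, p = 1, -b / math.log(1.0 - b)
    cdf = p
    while u > cdf:
        j += 1
        p *= b * (j - 1) / j  # ratio P(j)/P(j-1) = b (j-1)/j
        cdf += p
    return j

def sample_nbd_via_clans(n_bar, k, rng):
    """One multiplicity drawn from NBD(n_bar, k) via its clans."""
    b = n_bar / (n_bar + k)
    mean_clans = k * math.log(1.0 + n_bar / k)
    return sum(sample_log_series(b, rng)
               for _ in range(sample_poisson(mean_clans, rng)))

rng = random.Random(0)
mean = sum(sample_nbd_via_clans(10.0, 2.0, rng) for _ in range(20000)) / 20000
print(round(mean, 2))  # sample mean close to n_bar = 10
```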

  4. Clan structure analysis and rapidity gap probability

    Energy Technology Data Exchange (ETDEWEB)

    Lupia, S. [Turin Univ. (Italy). Ist. di Fisica Teorica; Istituto Nazionale di Fisica Nucleare, Turin (Italy)]; Giovannini, A. [Turin Univ. (Italy). Ist. di Fisica Teorica; Istituto Nazionale di Fisica Nucleare, Turin (Italy)]; Ugoccioni, R. [Turin Univ. (Italy). Ist. di Fisica Teorica; Istituto Nazionale di Fisica Nucleare, Turin (Italy)]

    1995-03-01

    Clan structure analysis in rapidity intervals is generalized from the negative binomial multiplicity distribution to the wide class of compound Poisson distributions. The link of generalized clan structure analysis with correlation functions is also established. These theoretical results are then applied to minimum bias events and reveal interesting new features, which can be useful in discussing data on rapidity gap probability at TEVATRON and HERA. (orig.)

  5. Introduction to tensorial resistivity probability tomography

    OpenAIRE

    Mauriello, Paolo; Patella, Domenico

    2005-01-01

    The probability tomography approach developed for the scalar resistivity method is here extended to the 2D tensorial apparent resistivity acquisition mode. The rotational invariant derived from the trace of the apparent resistivity tensor is considered, since it gives on the datum plane anomalies confined above the buried objects. Firstly, a departure function is introduced as the difference between the tensorial invariant measured over the real structure and that computed for a reference uni...

  6. Interaction probability value calculi for some scintillators

    International Nuclear Information System (INIS)

    Garcia-Torano Martinez, E.; Grau Malonda, A.

    1989-01-01

    Interaction probabilities for 17 gamma-ray energies between 1 and 1000 keV have been computed and tabulated. The tables may be applied to the case of cylindrical vials with radius 1.25 cm and volumes of 5, 10 and 15 ml. Toluene, Toluene/Alcohol, Dioxane-Naftalen, PCS, INSTAGEL and HISAFE II scintillators are considered. Graphical results for 10 ml are also given. (Author) 11 refs

  7. Probability of collective excited state decay

    International Nuclear Information System (INIS)

    Manykin, Eh.A.; Ozhovan, M.I.; Poluehktov, P.P.

    1987-01-01

    Decay mechanisms of the condensed excited state formed from highly excited (Rydberg) atoms are considered, i.e., the stability of the so-called Rydberg substance is analyzed. It is shown that Auger recombination and radiative transitions are the basic processes. The corresponding probabilities are calculated and compared. It is ascertained that the "Rydberg substance" possesses a macroscopic lifetime (several seconds) and is in a sense metastable

  8. SureTrak Probability of Impact Display

    Science.gov (United States)

    Elliott, John

    2012-01-01

    The SureTrak Probability of Impact Display software was developed for use during rocket launch operations. The software displays probability of impact information for each ship near the hazardous area during the time immediately preceding the launch of an unguided vehicle. Wallops range safety officers need to be sure that the risk to humans is below a certain threshold during each use of the Wallops Flight Facility Launch Range. Under the variable conditions that can exist at launch time, the decision to launch must be made in a timely manner to ensure a successful mission while not exceeding those risk criteria. Range safety officers need a tool that can give them the needed probability of impact information quickly, and in a format that is clearly understandable. This application is meant to fill that need. The software is a reuse of part of software developed for an earlier project: Ship Surveillance Software System (S4). The S4 project was written in C++ using Microsoft Visual Studio 6. The data structures and dialog templates from it were copied into a new application that calls the implementation of the algorithms from S4 and displays the results as needed. In the S4 software, the list of ships in the area was received from one local radar interface and from operators who entered the ship information manually. The SureTrak Probability of Impact Display application receives ship data from two local radars as well as the SureTrak system, eliminating the need for manual data entry.

  9. On the universality of knot probability ratios

    Energy Technology Data Exchange (ETDEWEB)

    Janse van Rensburg, E J [Department of Mathematics and Statistics, York University, Toronto, Ontario M3J 1P3 (Canada); Rechnitzer, A, E-mail: rensburg@yorku.ca, E-mail: andrewr@math.ubc.ca [Department of Mathematics, University of British Columbia, 1984 Mathematics Road, Vancouver, BC V6T 1Z2 (Canada)

    2011-04-22

    Let p_n denote the number of self-avoiding polygons of length n on a regular three-dimensional lattice, and let p_n(K) be the number which have knot type K. The probability that a random polygon of length n has knot type K is p_n(K)/p_n and is known to decay exponentially with length (Sumners and Whittington 1988 J. Phys. A: Math. Gen. 21 1689-94, Pippenger 1989 Discrete Appl. Math. 25 273-8). Little is known rigorously about the asymptotics of p_n(K), but there is substantial numerical evidence. It is believed that the entropic exponent, α, is universal, while the exponential growth rate is independent of the knot type but varies with the lattice. The amplitude, C_K, depends on both the lattice and the knot type. The above asymptotic form determines the relative probability of a random polygon of length n having prime knot type K over prime knot type L. In the thermodynamic limit this probability ratio becomes an amplitude ratio; it should be universal and depend only on the knot types K and L. In this communication we examine the universality of these probability ratios for polygons in the simple cubic, face-centred cubic and body-centred cubic lattices. Our results support the hypothesis that these are universal quantities. For example, we estimate that a long random polygon is approximately 28 times more likely to be a trefoil than a figure-eight, independent of the underlying lattice, giving an estimate of the intrinsic entropy associated with knot types in closed curves. (fast track communication)

  10. Calculating Cumulative Binomial-Distribution Probabilities

    Science.gov (United States)

    Scheuer, Ernest M.; Bowerman, Paul N.

    1989-01-01

    Cumulative-binomial computer program, CUMBIN, one of a set of three programs, calculates cumulative binomial probability distributions for arbitrary inputs. CUMBIN, NEWTONP (NPO-17556), and CROSSER (NPO-17557) can be used independently of one another. Reliabilities and availabilities of k-out-of-n systems analyzed. Used by statisticians and users of statistical procedures, test planners, designers, and numerical analysts for calculations of reliability and availability. Program written in C.
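
    The cumulative binomial quantity CUMBIN computes is the standard reliability of a k-out-of-n system. A minimal sketch, in Python rather than the program's original C:

```python
from math import comb

# Cumulative binomial tail: P(X >= k) for X ~ Binomial(n, p). For a
# k-out-of-n system of independent components, each working with
# probability p, this tail is exactly the system reliability.

def k_out_of_n_reliability(k, n, p):
    """Probability that at least k of n independent components work."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Example: a 2-out-of-3 system of components with 90% reliability.
print(round(k_out_of_n_reliability(2, 3, 0.9), 4))  # → 0.972
```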

  11. PSA, subjective probability and decision making

    International Nuclear Information System (INIS)

    Clarotti, C.A.

    1989-01-01

    PSA is the natural way of making decisions in the face of uncertainty about potentially dangerous plants; subjective probability, subjective utility and Bayes statistics are the ideal tools for carrying out a PSA. To support this statement, the various stages of the PSA procedure are examined in detail, demonstrating step by step the superiority of Bayesian techniques over sampling-theory machinery.

  12. Box-particle probability hypothesis density filtering

    OpenAIRE

    Schikora, M.; Gning, A.; Mihaylova, L.; Cremers, D.; Koch, W.

    2014-01-01

    This paper develops a novel approach for multitarget tracking, called box-particle probability hypothesis density filter (box-PHD filter). The approach is able to track multiple targets and estimates the unknown number of targets. Furthermore, it is capable of dealing with three sources of uncertainty: stochastic, set-theoretic, and data association uncertainty. The box-PHD filter reduces the number of particles significantly, which improves the runtime considerably. The small number of box-p...

  13. Probability and statistics in particle physics

    International Nuclear Information System (INIS)

    Frodesen, A.G.; Skjeggestad, O.

    1979-01-01

    Probability theory is introduced at an elementary level and given a simple and detailed exposition. The material on statistics has been organised with an eye to the experimental physicist's practical needs, which are likely to be statistical methods for estimation or decision-making. The book is intended for graduate students and research workers in experimental high energy and elementary particle physics, and numerous examples from these fields are presented. (JIW)

  14. Heart sounds analysis using probability assessment

    Czech Academy of Sciences Publication Activity Database

    Plešinger, Filip; Viščor, Ivo; Halámek, Josef; Jurčo, Juraj; Jurák, Pavel

    2017-01-01

    Vol. 38, No. 8 (2017), pp. 1685-1700. ISSN 0967-3334. R&D Projects: GA ČR GAP102/12/2034; GA MŠk(CZ) LO1212; GA MŠk ED0017/01/01. Institutional support: RVO:68081731. Keywords: heart sounds * FFT * machine learning * signal averaging * probability assessment. Subject RIV: FS - Medical Facilities; Equipment. OBOR OECD: Medical engineering. Impact factor: 2.058, year: 2016

  15. Liquid-Liquid Extraction/Low-Temperature Purification (LLE/LTP) Followed by Dispersive Solid-Phase Extraction (d-SPE) Cleanup for Multiresidue Analysis in Palm Oil by LC-QTOF-MS

    Directory of Open Access Journals (Sweden)

    Elham Sobhanzadeh

    2013-01-01

    An evaluation of the extraction of multiresidue pesticides from palm oil by liquid-liquid extraction/low-temperature purification (LLE/LTP) coupled with dispersive solid-phase extraction (d-SPE) as the cleanup procedure, with determination by liquid chromatography mass spectrometry using electrospray as the ionization source (LC-ESI-MS), was carried out. Optimization of the d-SPE step was studied to select the type and mass of adsorbents giving the highest recovery yield of pesticides and the lowest coextracted fat residues in the final extract. The optimal d-SPE conditions were obtained using 3 g of palm oil, 4 g anhydrous MgSO4, 150 mg of PSA, and 50 mg of GCB (PSA:GCB, 3:1 w/w). A recovery study was performed at three concentration levels (25, 50, and 100 ng kg⁻¹), yielding recovery rates between 71.8 and 112.4% (except for diuron) with relative standard deviations of 3.2-15.1%. Detection and quantification limits were lower than 2.7 and 8.2 ng kg⁻¹, respectively. The proposed method was successfully applied to the analysis of market-purchased palm oil samples from two different brands collected in Kuala Lumpur, showing its potential applicability and revealing the presence of some of the target species in the ng g⁻¹ range.

  16. Classical probabilities for Majorana and Weyl spinors

    International Nuclear Information System (INIS)

    Wetterich, C.

    2011-01-01

    Highlights: • Map of classical statistical Ising model to fermionic quantum field theory. • Lattice-regularized real Grassmann functional integral for single Weyl spinor. • Emerging complex structure characteristic for quantum physics. • A classical statistical ensemble describes a quantum theory. - Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model obtains as p_τ(t) = q_τ²(t). The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  17. A quantum probability model of causal reasoning

    Directory of Open Access Journals (Sweden)

    Jennifer S Trueblood

    2012-05-01

People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results, and describes a quantum inference model, based on the axiomatic principles of quantum probability theory, that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account of all three causal reasoning effects, making it a viable new candidate for modeling human judgment.
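The order effect mentioned in the abstract can be mimicked with a minimal two-dimensional sketch (the state and projectors below are invented for illustration, not the authors' fitted model): when the projectors representing two judgments do not commute, the probability of answering "yes" to A then B differs from answering "yes" to B then A.

```python
import numpy as np

# Two non-commuting "yes" projectors in a 2-dimensional belief space.
ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)

P_A = np.outer(ket0, ket0)   # projector for answering "yes" to question A
P_B = np.outer(ketp, ketp)   # projector for answering "yes" to question B

psi = np.array([0.6, 0.8])   # an initial belief state with unit norm

# Sequential judgment probabilities: project in one order, then the other.
p_AB = np.linalg.norm(P_B @ P_A @ psi) ** 2   # "yes to A, then yes to B"
p_BA = np.linalg.norm(P_A @ P_B @ psi) ** 2   # "yes to B, then yes to A"
print(p_AB, p_BA)                             # the two orders disagree
```

Because P_A and P_B do not commute, the two sequential probabilities differ, which is the mechanism quantum models use to capture order effects in human judgment.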

  18. Failure probability analysis on mercury target vessel

    International Nuclear Information System (INIS)

    Ishikura, Syuichi; Futakawa, Masatoshi; Kogawa, Hiroyuki; Sato, Hiroshi; Haga, Katsuhiro; Ikeda, Yujiro

    2005-03-01

Failure probability analysis was carried out to estimate the lifetime of the mercury target which will be installed in the JSNS (Japan Spallation Neutron Source) in J-PARC (Japan Proton Accelerator Research Complex). The lifetime was estimated taking loading conditions and materials degradation into account. The loads imposed on the target vessel were the static stresses due to thermal expansion and static pre-pressure on the He gas and mercury, and the dynamic stresses due to the thermally shocked pressure waves generated repeatedly at 25 Hz. Materials used in the target vessel will be degraded by fatigue, neutron and proton irradiation, mercury immersion, pitting damage, etc. The imposed stresses were evaluated through static and dynamic structural analyses, and the material degradation was deduced from published experimental data. As a result, it was quantitatively confirmed that the failure probability of the safety hull over the design lifetime is very low, about 10⁻¹¹, meaning that it will hardly fail during the design lifetime. On the other hand, the beam window of the mercury vessel, which is subjected to high-pressure waves, exhibits a failure probability of 12%. It was concluded, therefore, that mercury leaked from a failed area at the beam window would be adequately contained in the space between the safety hull and the mercury vessel, monitored by means of mercury-leakage sensors. (author)

  19. Converting dose distributions into tumour control probability

    International Nuclear Information System (INIS)

    Nahum, A.E.

    1996-01-01

The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP), and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP is shown, as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs

  20. Converting dose distributions into tumour control probability

    Energy Technology Data Exchange (ETDEWEB)

    Nahum, A E [The Royal Marsden Hospital, London (United Kingdom). Joint Dept. of Physics

    1996-08-01

The endpoints in radiotherapy that are truly of relevance are not dose distributions but the probability of local control, sometimes known as the Tumour Control Probability (TCP), and the Probability of Normal Tissue Complications (NTCP). A model for the estimation of TCP based on simple radiobiological considerations is described. It is shown that incorporation of inter-patient heterogeneity into the radiosensitivity parameter a through s_a can result in a clinically realistic slope for the dose-response curve. The model is applied to inhomogeneous target dose distributions in order to demonstrate the relationship between dose uniformity and s_a. The consequences of varying clonogenic density are also explored. Finally the model is applied to the target-volume DVHs for patients in a clinical trial of conformal pelvic radiotherapy; the effect of dose inhomogeneities on distributions of TCP is shown, as well as the potential benefits of customizing the target dose according to normal-tissue DVHs. (author). 37 refs, 9 figs.
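A minimal numerical sketch of a Poisson-statistics TCP model with inter-patient heterogeneity, in the spirit of the two records above (all parameter values below are invented for illustration; this is not Nahum's exact formulation):

```python
import numpy as np

# TCP(a) = exp(-N * exp(-a * D)) for N clonogens after dose D (Gy), then
# averaged over a normal inter-patient spread s_a in the radiosensitivity a.
def tcp_population(D, N=1e7, a_mean=0.3, s_a=0.08, n_samples=20000):
    rng = np.random.default_rng(1)
    a = rng.normal(a_mean, s_a, n_samples)
    a = a[a > 0]                                   # radiosensitivity is positive
    return float(np.mean(np.exp(-N * np.exp(-a * D))))

# Heterogeneity in a shallows the dose-response curve: the population TCP
# rises more gradually with dose than any single patient's curve would.
for D in (50.0, 60.0, 70.0):
    print(D, tcp_population(D))
```

Setting s_a close to zero recovers the steep single-patient sigmoid, which is how the heterogeneity term produces the "clinically realistic slope" discussed above.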

  1. Consistent probabilities in loop quantum cosmology

    International Nuclear Information System (INIS)

    Craig, David A; Singh, Parampreet

    2013-01-01

    A fundamental issue for any quantum cosmological theory is to specify how probabilities can be assigned to various quantum events or sequences of events such as the occurrence of singularities or bounces. In previous work, we have demonstrated how this issue can be successfully addressed within the consistent histories approach to quantum theory for Wheeler–DeWitt-quantized cosmological models. In this work, we generalize that analysis to the exactly solvable loop quantization of a spatially flat, homogeneous and isotropic cosmology sourced with a massless, minimally coupled scalar field known as sLQC. We provide an explicit, rigorous and complete decoherent-histories formulation for this model and compute the probabilities for the occurrence of a quantum bounce versus a singularity. Using the scalar field as an emergent internal time, we show for generic states that the probability for a singularity to occur in this model is zero, and that of a bounce is unity, complementing earlier studies of the expectation values of the volume and matter density in this theory. We also show from the consistent histories point of view that all states in this model, whether quantum or classical, achieve arbitrarily large volume in the limit of infinite ‘past’ or ‘future’ scalar ‘time’, in the sense that the wave function evaluated at any arbitrary fixed value of the volume vanishes in that limit. Finally, we briefly discuss certain misconceptions concerning the utility of the consistent histories approach in these models. (paper)

  2. Probability and containment of turbine missiles

    International Nuclear Information System (INIS)

    Yeh, G.C.K.

    1976-01-01

    With the trend toward ever larger power generating plants with large high-speed turbines, an important plant design consideration is the potential for and consequences of mechanical failure of turbine rotors. Such rotor failure could result in high-velocity disc fragments (turbine missiles) perforating the turbine casing and jeopardizing vital plant systems. The designer must first estimate the probability of any turbine missile damaging any safety-related plant component for his turbine and his plant arrangement. If the probability is not low enough to be acceptable to the regulatory agency, he must design a shield to contain the postulated turbine missiles. Alternatively, the shield could be designed to retard (to reduce the velocity of) the missiles such that they would not damage any vital plant system. In this paper, some of the presently available references that can be used to evaluate the probability, containment and retardation of turbine missiles are reviewed; various alternative methods are compared; and subjects for future research are recommended. (Auth.)

  3. Pipe failure probability - the Thomas paper revisited

    International Nuclear Information System (INIS)

    Lydell, B.O.Y.

    2000-01-01

    Almost twenty years ago, in Volume 2 of Reliability Engineering (the predecessor of Reliability Engineering and System Safety), a paper by H. M. Thomas of Rolls Royce and Associates Ltd. presented a generalized approach to the estimation of piping and vessel failure probability. The 'Thomas-approach' used insights from actual failure statistics to calculate the probability of leakage and conditional probability of rupture given leakage. It was intended for practitioners without access to data on the service experience with piping and piping system components. This article revisits the Thomas paper by drawing on insights from development of a new database on piping failures in commercial nuclear power plants worldwide (SKI-PIPE). Partially sponsored by the Swedish Nuclear Power Inspectorate (SKI), the R and D leading up to this note was performed during 1994-1999. Motivated by data requirements of reliability analysis and probabilistic safety assessment (PSA), the new database supports statistical analysis of piping failure data. Against the background of this database development program, the article reviews the applicability of the 'Thomas approach' in applied risk and reliability analysis. It addresses the question whether a new and expanded database on the service experience with piping systems would alter the original piping reliability correlation as suggested by H. M. Thomas

  4. Nuclear data uncertainties: I, Basic concepts of probability

    Energy Technology Data Exchange (ETDEWEB)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs.

  5. Nuclear data uncertainties: I, Basic concepts of probability

    International Nuclear Information System (INIS)

    Smith, D.L.

    1988-12-01

    Some basic concepts of probability theory are presented from a nuclear-data perspective, in order to provide a foundation for thorough understanding of the role of uncertainties in nuclear data research. Topics included in this report are: events, event spaces, calculus of events, randomness, random variables, random-variable distributions, intuitive and axiomatic probability, calculus of probability, conditional probability and independence, probability distributions, binomial and multinomial probability, Poisson and interval probability, normal probability, the relationships existing between these probability laws, and Bayes' theorem. This treatment emphasizes the practical application of basic mathematical concepts to nuclear data research, and it includes numerous simple examples. 34 refs
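The topic list above ends with Bayes' theorem; a toy numerical example in the nuclear-data spirit (all numbers invented, purely illustrative):

```python
# A detector flags a contaminated sample 99% of the time (hit rate) and a
# clean sample 5% of the time (false-positive rate); 2% of samples are
# contaminated.  Bayes' theorem gives the probability that a flagged
# sample is actually contaminated.
p_contam = 0.02
p_flag_given_contam = 0.99
p_flag_given_clean = 0.05

# Total probability of a flag, then Bayes' theorem.
p_flag = p_flag_given_contam * p_contam + p_flag_given_clean * (1 - p_contam)
p_contam_given_flag = p_flag_given_contam * p_contam / p_flag

print(round(p_contam_given_flag, 3))   # -> 0.288: most flags are false alarms
```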

  6. Failures probability calculation of the energy supply of the Angra-1 reactor rods assembly

    International Nuclear Information System (INIS)

    Borba, P.R.

    1978-01-01

This work analyses the electric power system of the Angra I PWR plant. It is demonstrated that this system is closely coupled with the safety engineering features, i.e. the equipment provided to prevent, limit, or mitigate the release of radioactive material and to permit safe reactor shutdown. Event trees are used to analyse the operation of those systems which can lead to the release of radioactivity following a specified initiating event. The fault tree technique is used to calculate the failure probability of the on-site electric power system [pt
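For independent basic events, the fault-tree arithmetic referred to above reduces to products (AND gates) and complements (OR gates). A hedged sketch with made-up failure probabilities (not the Angra-1 values):

```python
# AND gate: all inputs must fail; OR gate: at least one input fails.
def and_gate(*probs):
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(*probs):
    out = 1.0
    for p in probs:
        out *= (1.0 - p)
    return 1.0 - out

# Invented example: loss of AC power requires loss of offsite power AND
# failure of both (independent) diesel generators.
p_offsite = 1e-2
p_diesel = 3e-2
p_both_diesels = and_gate(p_diesel, p_diesel)
p_station_blackout = and_gate(p_offsite, p_both_diesels)

# Top event: station blackout OR an (invented) common breaker fault.
p_top = or_gate(p_station_blackout, 2e-6)
print(p_top)
```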

  7. Hydraulic release oil tool

    International Nuclear Information System (INIS)

    Mims, M.G.; Mueller, M.D.; Ehlinger, J.C.

    1992-01-01

This patent describes a hydraulic release tool. It comprises a setting assembly; a coupling member for coupling to drill string or petroleum production components, the coupling member having a plurality of sockets for receiving the dogs in the extended position and attaching the coupling member to the setting assembly, whereby the setting assembly couples to the coupling member by engagement of the dogs in the sockets and releases from and disengages the coupling member upon movement of the piston from its setting position to its release position in response to a pressure in the body exceeding the predetermined pressure; and a relief port from outside the body into its bore, with means to prevent communication between the relief port and the bore of the body axially of the piston when the piston is in the setting position and to establish such communication upon movement of the piston from the setting position to the release position, reducing the pressure in the body bore axially of the piston, whereby the reduction of the pressure signals that the tool has released the coupling member

  8. APASS Data Release 10

    Science.gov (United States)

    Henden, Arne A.; Levine, Stephen; Terrell, Dirk; Welch, Douglas L.; Munari, Ulisse; Kloppenborg, Brian K.

    2018-06-01

The AAVSO Photometric All-Sky Survey (APASS) has been underway since 2010. This survey covers the entire sky from 7.5 [...] knowledge of the optical train distortions. With these changes, DR10 includes many more stars than prior releases. We describe the survey, its remaining limitations, and prospects for the future, including a very-bright-star extension.

  9. Calculating the Probability of Returning a Loan with Binary Probability Models

    Directory of Open Access Journals (Sweden)

    Julian Vasilev

    2014-12-01

The purpose of this article is to give a new approach to calculating the probability of returning a loan, a value affected by many factors. Using statistical and econometric models, several influencing factors are identified. The main approach applies probit and logit models in loan management institutions, giving a new aspect to credit risk analysis. Calculating the probability of returning a loan is a difficult task. We assume that data fields concerning the contract (month of signing, year of signing, given sum) and data fields concerning the borrower (month of birth, year of birth (age), gender, region where he/she lives) may serve as independent variables in a binary logistic model with the dependent variable "the probability of returning a loan". It is shown that the month of signing, the year of signing, and the gender and age of the loan owner do not affect the probability of returning a loan, whereas the sum of the contract, the remoteness of the loan owner, and the month of birth do: the probability of returning a loan increases with the increase of the given sum, decreases with the proximity of the customer, increases for people born in the beginning of the year, and decreases for people born at the end of the year.
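The binary logit model described above can be sketched as follows. The coefficients are invented (the article's fitted values are not given); only their signs mirror the article's findings:

```python
import math

# Hypothetical logit model: probability of returning a loan from the loan
# sum, borrower remoteness, and birth month.  Coefficients b0..b_month are
# illustrative assumptions, not estimates from the article.
def prob_return(loan_sum, remoteness_km, birth_month,
                b0=-1.0, b_sum=0.00002, b_remote=0.01, b_month=-0.05):
    z = b0 + b_sum * loan_sum + b_remote * remoteness_km + b_month * birth_month
    return 1.0 / (1.0 + math.exp(-z))   # logistic link

# Probability rises with the given sum and remoteness, and falls for
# borrowers born later in the year, as the article reports.
print(prob_return(50000, 120, 1))
print(prob_return(50000, 120, 12))
```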

  10. School and conference on probability theory

    International Nuclear Information System (INIS)

    Lawler, G.F.

    2004-01-01

This volume includes expanded lecture notes from the School and Conference in Probability Theory held at ICTP in May, 2001. Probability theory is a very large area, too large for a single school and conference. The organizers, G. Lawler, C. Newman, and S. Varadhan chose to focus on a number of active research areas that have their roots in statistical physics. The pervasive theme in these lectures is trying to find the large time or large space behaviour of models defined on discrete lattices. Usually the definition of the model is relatively simple: either assigning a particular weight to each possible configuration (equilibrium statistical mechanics) or specifying the rules under which the system evolves (nonequilibrium statistical mechanics). Interacting particle systems is the area of probability that studies the evolution of particles (either finite or infinite in number) under random motions. The evolution of particles depends on the positions of the other particles; often one assumes that it depends only on the particles that are close to the particular particle. Thomas Liggett's lectures give an introduction to this very large area. Claudio Landim's lectures follow up by discussing hydrodynamic limits of particle systems. The goal of this area is to describe the long time, large system size dynamics in terms of partial differential equations. The area of random media is concerned with the properties of materials or environments that are not homogeneous. Percolation theory studies one of the simplest stated models for impurities - taking a lattice and removing some of the vertices or bonds. Luiz Renato G. Fontes and Vladas Sidoravicius give a detailed introduction to this area. Random walk in random environment combines two sources of randomness - a particle performing stochastic motion in which the transition probabilities depend on position and have been chosen from some probability distribution. Alain-Sol Sznitman gives a survey of recent developments in this

  11. Impact of controlling the sum of error probability in the sequential probability ratio test

    Directory of Open Access Journals (Sweden)

    Bijoy Kumarr Pradhan

    2013-05-01

A generalized modified method is proposed to control the sum of error probabilities in the sequential probability ratio test. The aim is to minimize the weighted average of the two average sample numbers under a simple null hypothesis and a simple alternative hypothesis, with the restriction that the sum of the error probabilities is a pre-assigned constant, and thus to find the optimal sample size. Finally, a comparison is made with the optimal sample size obtained from the fixed-sample-size procedure. The results are applied to the cases where the random variate follows a normal law as well as a Bernoullian law.
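For context, Wald's classical sequential probability ratio test can be sketched for Bernoulli data as below. The thresholds use Wald's standard approximations A ≈ (1−β)/α and B ≈ β/(1−α); the paper's modification, which constrains the sum α + β instead, is not reproduced here.

```python
import math
import random

# SPRT for H0: p = p0 versus H1: p = p1 on a stream of Bernoulli observations.
def sprt_bernoulli(data, p0=0.3, p1=0.6, alpha=0.05, beta=0.05):
    upper = math.log((1 - beta) / alpha)   # Wald's log-threshold A
    lower = math.log(beta / (1 - alpha))   # Wald's log-threshold B
    llr = 0.0
    for n, x in enumerate(data, start=1):
        # accumulate the log-likelihood ratio for this observation
        llr += math.log((p1 if x else 1 - p1) / (p0 if x else 1 - p0))
        if llr >= upper:
            return "accept H1", n
        if llr <= lower:
            return "accept H0", n
    return "undecided", len(data)

random.seed(2)
data = [1 if random.random() < 0.6 else 0 for _ in range(200)]
print(sprt_bernoulli(data))   # typically stops well before 200 samples
```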

  12. A tandem queue with delayed server release

    OpenAIRE

    Nawijn, W.M.

    1997-01-01

We consider a tandem queue with two stations. The first station is an s-server queue with Poisson arrivals and exponential service times. After terminating his service in the first station, a customer enters the second station to require service at an exponential single server, while in the meantime he is blocking his server in station 1 until he completes service in station 2, whereupon the server in station 1 is released. An analysis of the generating function of the simultaneous probability di...

  13. The Goiania accident: release from hospital criterion

    International Nuclear Information System (INIS)

    Falcao, R.C.; Hunt, J.

    1990-01-01

On the thirteenth of September 1987, a 1357 Ci cesium source was removed from the 'Instituto de Radiologia de Goiania'. Probably two or three days later the source was opened, causing the internal and external contamination of 247 people and of part of the city of Goiania. This paper describes the release-from-hospital criterion for the contaminated patients, based on radiation protection principles developed for this case. The estimate of the biological half-life of cesium is also described. (author) [pt
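The retention arithmetic behind such a release criterion can be sketched with a single-exponential clearance model. The 70-day biological half-life below is a commonly quoted illustrative adult value for cesium, not the paper's estimate:

```python
import math

# Fraction of an internal cesium intake still retained after a given time,
# assuming single-exponential biological clearance with half-life t_bio (days).
def retained_fraction(days, t_bio=70.0):
    return math.exp(-math.log(2) * days / t_bio)

# Retained fraction after 30, 70, and 140 days.
for t in (30, 70, 140):
    print(t, round(retained_fraction(t), 3))
```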

  14. Probability functions in the context of signed involutive meadows

    NARCIS (Netherlands)

    Bergstra, J.A.; Ponse, A.

    2016-01-01

    The Kolmogorov axioms for probability functions are placed in the context of signed meadows. A completeness theorem is stated and proven for the resulting equational theory of probability calculus. Elementary definitions of probability theory are restated in this framework.

  15. Probability sampling in legal cases: Kansas cellphone users

    Science.gov (United States)

    Kadane, Joseph B.

    2012-10-01

    Probability sampling is a standard statistical technique. This article introduces the basic ideas of probability sampling, and shows in detail how probability sampling was used in a particular legal case.

  16. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-03-01

A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, 𝒮_st, p_st) for stochastic uncertainty, a probability space (S_su, 𝒮_su, p_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, 𝒮_st, p_st) and (S_su, 𝒮_su, p_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the U.S. Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems

  17. Probability, conditional probability and complementary cumulative distribution functions in performance assessment for radioactive waste disposal

    International Nuclear Information System (INIS)

    Helton, J.C.

    1996-01-01

A formal description of the structure of several recent performance assessments (PAs) for the Waste Isolation Pilot Plant (WIPP) is given in terms of the following three components: a probability space (S_st, L_st, P_st) for stochastic uncertainty, a probability space (S_su, L_su, P_su) for subjective uncertainty, and a function (i.e., a random variable) defined on the product space associated with (S_st, L_st, P_st) and (S_su, L_su, P_su). The explicit recognition of the existence of these three components allows a careful description of the use of probability, conditional probability and complementary cumulative distribution functions within the WIPP PA. This usage is illustrated in the context of the US Environmental Protection Agency's standard for the geologic disposal of radioactive waste (40 CFR 191, Subpart B). The paradigm described in this presentation can also be used to impose a logically consistent structure on PAs for other complex systems
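The two-probability-space structure described above can be sketched as a nested Monte Carlo loop: an outer loop over subjective (epistemic) uncertainty and an inner loop over stochastic (aleatory) futures, each inner loop yielding one complementary cumulative distribution function (CCDF) of a release measure. All distributions and numbers below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)
release_levels = np.logspace(-3, 1, 50)          # normalized release R

ccdfs = []
for _ in range(20):                              # samples of epistemic parameters
    scale = rng.lognormal(mean=-2.0, sigma=1.0)  # an uncertain model parameter
    futures = rng.exponential(scale, size=1000)  # stochastic futures given it
    # CCDF: probability that the release exceeds each level R
    ccdfs.append([(futures > r).mean() for r in release_levels])

ccdfs = np.array(ccdfs)                          # one CCDF per epistemic sample
mean_ccdf = ccdfs.mean(axis=0)                   # "mean" CCDF over the family
print(mean_ccdf[0], mean_ccdf[-1])
```

Regulatory comparisons of the 40 CFR 191 kind are then made against the family of CCDFs (or summaries of it such as the mean CCDF), keeping the two kinds of uncertainty distinct.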

  18. Market-implied risk-neutral probabilities, actual probabilities, credit risk and news

    Directory of Open Access Journals (Sweden)

    Shashidhar Murthy

    2011-09-01

Motivated by the credit crisis, this paper investigates links between risk-neutral probabilities of default implied by markets (e.g. from yield spreads) and their actual counterparts (e.g. from ratings). It discusses differences between the two and clarifies the underlying economic intuition using simple representations of credit risk pricing. Observed large differences across bonds in the ratio of the two probabilities are shown to imply that apparently safer securities can be more sensitive to news.
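The link between a yield spread and a risk-neutral default probability can be sketched with a textbook one-period approximation (not the paper's model): spread ≈ q·(1−R), so q ≈ spread/(1−R), where R is the recovery rate. All numbers below are invented:

```python
# Risk-neutral one-year default probability implied by a credit spread,
# under the simple approximation spread = q * (1 - recovery).
def risk_neutral_pd(spread, recovery=0.4):
    return spread / (1.0 - recovery)

q_safe = risk_neutral_pd(0.006)   # a 60 bp spread implies q = 1.0%
q_junk = risk_neutral_pd(0.050)   # a 500 bp spread implies q ~ 8.3%

# Invented "actual" default probabilities, as might come from ratings.
p_safe, p_junk = 0.0005, 0.02

# The ratio q/p is much larger for the apparently safer bond, the kind of
# cross-sectional difference the paper links to sensitivity to news.
print(q_safe / p_safe, q_junk / p_junk)
```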

  19. Qubit-qutrit separability-probability ratios

    International Nuclear Information System (INIS)

    Slater, Paul B.

    2005-01-01

Paralleling our recent computationally intensive (quasi-Monte Carlo) work for the case N=4 (e-print quant-ph/0308037), we undertake the task for N=6 of computing to high numerical accuracy, the formulas of Sommers and Zyczkowski (e-print quant-ph/0304041) for the (N²−1)-dimensional volume and (N²−2)-dimensional hyperarea of the (separable and nonseparable) N×N density matrices, based on the Bures (minimal monotone) metric--and also their analogous formulas (e-print quant-ph/0302197) for the (nonmonotone) flat Hilbert-Schmidt metric. With the same 7 × 10⁹ well-distributed ('low-discrepancy') sample points, we estimate the unknown volumes and hyperareas based on five additional (monotone) metrics of interest, including the Kubo-Mori and Wigner-Yanase. Further, we estimate all of these seven volume and seven hyperarea (unknown) quantities when restricted to the separable density matrices. The ratios of separable volumes (hyperareas) to separable plus nonseparable volumes (hyperareas) yield estimates of the separability probabilities of generically rank-6 (rank-5) density matrices. The (rank-6) separability probabilities obtained based on the 35-dimensional volumes appear to be--independently of the metric (each of the seven inducing Haar measure) employed--twice as large as those (rank-5 ones) based on the 34-dimensional hyperareas. (An additional estimate--33.9982--of the ratio of the rank-6 Hilbert-Schmidt separability probability to the rank-4 one is quite clearly close to integral too.) The doubling relationship also appears to hold for the N=4 case for the Hilbert-Schmidt metric, but not the others. We fit simple exact formulas to our estimates of the Hilbert-Schmidt separable volumes and hyperareas in both the N=4 and N=6 cases
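For the simpler two-qubit case (N=4), a separability probability under the Hilbert-Schmidt measure can be estimated with plain Monte Carlo: draw density matrices via the Ginibre construction (which induces the flat Hilbert-Schmidt measure) and apply the Peres-Horodecki partial-transpose (PPT) test, which is exact for 2×2 systems. This is a simplified stand-in for the paper's quasi-Monte Carlo study, not a reproduction of it:

```python
import numpy as np

rng = np.random.default_rng(4)

def random_density_matrix(n=4):
    # Ginibre construction: G G† normalized to unit trace induces the
    # flat Hilbert-Schmidt measure on n x n density matrices.
    g = rng.normal(size=(n, n)) + 1j * rng.normal(size=(n, n))
    rho = g @ g.conj().T
    return rho / np.trace(rho).real

def is_ppt(rho):
    # Partial transpose on the second qubit of a 2 x 2 system, then check
    # that all eigenvalues are non-negative (exact separability test here).
    r = rho.reshape(2, 2, 2, 2).transpose(0, 3, 2, 1).reshape(4, 4)
    return np.min(np.linalg.eigvalsh(r)) >= 0

trials = 5000
sep = sum(is_ppt(random_density_matrix()) for _ in range(trials))
print(sep / trials)   # separable fraction; the conjectured HS value is 8/33
```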

  20. Greek paideia and terms of probability

    Directory of Open Access Journals (Sweden)

    Fernando Leon Parada

    2016-06-01

This paper addresses three aspects of the conceptual framework for a doctoral dissertation research in progress in the field of Mathematics Education, in particular, in the subfield of teaching and learning basic concepts of Probability Theory at the college level. It intends to contrast, sustain and elucidate the central statement that the meanings of some of these basic terms used in Probability Theory were not formally defined by any specific theory but relate to primordial ideas developed in Western culture from Ancient Greek myths. The first aspect deals with the notion of uncertainty, with which Greek thinkers described several archaic gods and goddesses of Destiny, like the Parcae and the Moirai, often personified in the goddess Tyche (Fortuna for the Romans), as regarded in Werner Jaeger's "Paideia". The second aspect treats the idea of hazard from two different approaches: the first deals with hazard, denoted by Plato with the already demythologized term 'tyche', from the viewpoint of innate knowledge, as Jaeger points out. The second deals with hazard from a perspective that could be called "phenomenological", from which Aristotle attempted to articulate uncertainty with a discourse based on the hypothesis of causality. The term 'causal' was opposed both to 'casual' and to 'spontaneous' (as used in the expression "spontaneous generation"), attributing uncertainty to ignorance of the future, thus respecting causal flow. The third aspect treated in the paper refers to some definitions and etymologies of other modern words that have become technical terms in current Probability Theory, confirming the above-mentioned main proposition of this paper.

  1. Probability of Criticality for MOX SNF

    International Nuclear Information System (INIS)

    P. Gottlieb

    1999-01-01

The purpose of this calculation is to provide a conservative (upper bound) estimate of the probability of criticality for mixed oxide (MOX) spent nuclear fuel (SNF) of the Westinghouse pressurized water reactor (PWR) design that has been proposed for use with the Plutonium Disposition Program (Ref. 1, p. 2). This calculation uses a Monte Carlo technique similar to that used for ordinary commercial SNF (Ref. 2, Sections 2 and 5.2). Several scenarios, covering a range of parameters, are evaluated for criticality. Parameters specifying the loss of fission products and iron oxide from the waste package are particularly important. This calculation is associated with disposal of MOX SNF

  2. Probability Distribution for Flowing Interval Spacing

    International Nuclear Information System (INIS)

    Kuzio, S.

    2001-01-01

    The purpose of this analysis is to develop a probability distribution for flowing interval spacing. A flowing interval is defined as a fractured zone that transmits flow in the Saturated Zone (SZ), as identified through borehole flow meter surveys (Figure 1). This analysis uses the term ''flowing interval spacing'' as opposed to fractured spacing, which is typically used in the literature. The term fracture spacing was not used in this analysis because the data used identify a zone (or a flowing interval) that contains fluid-conducting fractures but does not distinguish how many or which fractures comprise the flowing interval. The flowing interval spacing is measured between the midpoints of each flowing interval. Fracture spacing within the SZ is defined as the spacing between fractures, with no regard to which fractures are carrying flow. The Development Plan associated with this analysis is entitled, ''Probability Distribution for Flowing Interval Spacing'', (CRWMS M and O 2000a). The parameter from this analysis may be used in the TSPA SR/LA Saturated Zone Flow and Transport Work Direction and Planning Documents: (1) ''Abstraction of Matrix Diffusion for SZ Flow and Transport Analyses'' (CRWMS M and O 1999a) and (2) ''Incorporation of Heterogeneity in SZ Flow and Transport Analyses'', (CRWMS M and O 1999b). A limitation of this analysis is that the probability distribution of flowing interval spacing may underestimate the effect of incorporating matrix diffusion processes in the SZ transport model because of the possible overestimation of the flowing interval spacing. Larger flowing interval spacing results in a decrease in the matrix diffusion processes. This analysis may overestimate the flowing interval spacing because the number of fractures that contribute to a flowing interval cannot be determined from the data. 
Because each flowing interval probably has more than one fracture contributing to a flowing interval, the true flowing interval spacing could be
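The spacing definition above (measured between the midpoints of consecutive flowing intervals) can be sketched directly; the borehole depths below are invented for illustration:

```python
import numpy as np

# Flowing intervals identified by a hypothetical borehole flow-meter survey,
# as (top, bottom) depths in meters below the surface.
intervals = [(410.0, 418.0), (431.0, 433.0), (468.0, 482.0), (505.0, 509.0)]

# Flowing interval spacing: distance between midpoints of adjacent intervals.
midpoints = np.array([(top + bottom) / 2.0 for top, bottom in intervals])
spacings = np.diff(midpoints)
print(spacings)   # spacings of 18, 43, and 32 m between midpoints
```

A probability distribution for spacing would then be fitted to many such values pooled across boreholes; note the abstract's caveat that each flowing interval may contain several fractures, so these spacings overestimate true fracture spacing.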

  3. A short walk in quantum probability

    Science.gov (United States)

    Hudson, Robin

    2018-04-01

This is a personal survey of aspects of quantum probability related to the Heisenberg commutation relation for canonical pairs. Using the failure, in general, of non-negativity of the Wigner distribution for canonical pairs to motivate a more satisfactory quantum notion of joint distribution, we visit a central limit theorem for such pairs and a resulting family of quantum planar Brownian motions which deform the classical planar Brownian motion, together with a corresponding family of quantum stochastic areas. This article is part of the themed issue "Hilbert's sixth problem".

  4. Probability and logical structure of statistical theories

    International Nuclear Information System (INIS)

    Hall, M.J.W.

    1988-01-01

A characterization of statistical theories is given which incorporates both classical and quantum mechanics. It is shown that each statistical theory induces an associated logic and joint probability structure, and simple conditions are given for the structure to be of a classical or quantum type. This provides an alternative to the quantum logic approach to axiomatic quantum mechanics. The Bell inequalities may be derived for those statistical theories that have a classical structure and satisfy a locality condition weaker than factorizability. The relation of these inequalities to the issue of hidden variable theories for quantum mechanics is discussed and clarified

  5. Quantum operations, state transformations and probabilities

    International Nuclear Information System (INIS)

    Chefles, Anthony

    2002-01-01

    In quantum operations, probabilities characterize both the degree of the success of a state transformation and, as density operator eigenvalues, the degree of mixedness of the final state. We give a unified treatment of pure→pure state transformations, covering both probabilistic and deterministic cases. We then discuss the role of majorization in describing the dynamics of mixing in quantum operations. The conditions for mixing enhancement for all initial states are derived. We show that mixing is monotonically decreasing for deterministic pure→pure transformations, and discuss the relationship between these transformations and deterministic local operations with classical communication entanglement transformations

  6. An introduction to probability and statistical inference

    CERN Document Server

    Roussas, George G

    2003-01-01

    "The text is wonderfully written and has the most comprehensive range of exercise problems that I have ever seen." - Tapas K. Das, University of South Florida. "The exposition is great; a mixture between conversational tones and formal mathematics; the appropriate combination for a math text at [this] level. In my examination I could find no instance where I could improve the book." - H. Pat Goeters, Auburn University, Alabama. * Contains more than 200 illustrative examples discussed in detail, plus scores of numerical examples and applications* Chapters 1-8 can be used independently for an introductory course in probability* Provides a substantial number of proofs

  7. STRIP: stream learning of influence probabilities

    DEFF Research Database (Denmark)

    Kutzkov, Konstantin

    2013-01-01

    cascades, and developing applications such as viral marketing. Motivated by modern microblogging platforms, such as Twitter, in this paper we study the problem of learning influence probabilities in a data-stream scenario, in which the network topology is relatively stable and the challenge of a learning algorithm is to keep up with a continuous stream of tweets using a small amount of time and memory. Our contribution is a number of randomized approximation algorithms, categorized according to the available space (superlinear, linear, and sublinear in the number of nodes n) and according to different models...

  8. Modulation Based on Probability Density Functions

    Science.gov (United States)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
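
The construction the abstract describes — sorting samples taken over at least half a waveform cycle into a histogram that approximates the waveform's PDF — can be sketched as below; `waveform_pdf` is a hypothetical helper for illustration, not code from the cited work:

```python
import math

def waveform_pdf(samples, bins=16):
    """Normalized histogram of sample values: an empirical PDF of the waveform."""
    lo, hi = min(samples), max(samples)
    width = (hi - lo) / bins or 1.0  # guard against a constant signal
    counts = [0] * bins
    for s in samples:
        counts[min(int((s - lo) / width), bins - 1)] += 1
    return [c / len(samples) for c in counts]

# One full cycle of a unit sinusoid, sampled uniformly in time.
samples = [math.sin(2 * math.pi * k / 1000) for k in range(1000)]
pdf = waveform_pdf(samples)
```

For a sinusoid the resulting histogram is arcsine-shaped, with most of the mass near the peaks; a modulation scheme along these lines would alter the waveform so that different symbols produce distinguishable histogram shapes.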

  9. PWR reactor pressure vessel failure probabilities

    International Nuclear Information System (INIS)

    Dufresne, J.; Lanore, J.M.; Lucia, A.C.; Elbaz, J.; Brunnhuber, R.

    1980-05-01

    To evaluate the rupture probability of a LWR vessel a probabilistic method using the fracture mechanics under probabilistic form has been proposed previously, but it appears that more accurate evaluation is possible. In consequence a joint collaboration agreement signed in 1976 between CEA, EURATOM, JRC Ispra and FRAMATOME set up and started a research program covering three parts: a computer code development, data acquisition and processing, and a support experimental program which aims at clarifying the most important parameters used in the COVASTOL computer code

  10. APPROXIMATION OF PROBABILITY DISTRIBUTIONS IN QUEUEING MODELS

    Directory of Open Access Journals (Sweden)

    T. I. Aliev

    2013-03-01

    For probability distributions with a coefficient of variation not equal to unity, mathematical dependences for approximating the distributions on the basis of the first two moments are derived by making use of multi-exponential distributions. It is proposed to approximate distributions with a coefficient of variation less than unity by the hypoexponential distribution, which makes it possible to generate random variables with a coefficient of variation taking any value in the range (0; 1), as opposed to the Erlang distribution, which has only discrete values of the coefficient of variation.
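
Moment matching for the two-phase case can be sketched as follows. For a two-phase hypoexponential the squared coefficient of variation is restricted to [1/2, 1); covering the rest of (0, 1) requires more phases, which this minimal sketch (with hypothetical function names) does not attempt:

```python
import math
import random

def fit_hypoexp2(mean, cv):
    """Match the first two moments with a two-phase hypoexponential.
    The phase means x, y satisfy x + y = mean and x**2 + y**2 = (cv * mean)**2,
    so they are the roots of t**2 - mean*t + mean**2 * (1 - cv**2) / 2 = 0.
    Returns the two phase rates (1/x, 1/y). Requires cv**2 in [0.5, 1)."""
    c2 = cv * cv
    if not 0.5 <= c2 < 1.0:
        raise ValueError("two-phase fit needs cv**2 in [0.5, 1)")
    half_gap = math.sqrt(mean * mean * (2.0 * c2 - 1.0)) / 2.0
    x = mean / 2.0 + half_gap
    y = mean / 2.0 - half_gap
    return 1.0 / x, 1.0 / y

def sample_hypoexp2(rate1, rate2, rng=random):
    """One draw: the sum of two independent exponential phases."""
    return rng.expovariate(rate1) + rng.expovariate(rate2)
```

By construction the fitted sum of exponentials reproduces the requested mean and coefficient of variation exactly, since the hypoexponential mean is x + y and its variance is x**2 + y**2.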

  11. Stochastics introduction to probability and statistics

    CERN Document Server

    Georgii, Hans-Otto

    2012-01-01

    This second revised and extended edition presents the fundamental ideas and results of both probability theory and statistics, and comprises the material of a one-year course. It is addressed to students with an interest in the mathematical side of stochastics. Stochastic concepts, models and methods are motivated by examples and developed and analysed systematically. Some measure theory is included, but this is done at an elementary level that is in accordance with the introductory character of the book. A large number of problems offer applications and supplements to the text.

  12. Snell Envelope with Small Probability Criteria

    Energy Technology Data Exchange (ETDEWEB)

    Del Moral, Pierre, E-mail: Pierre.Del-Moral@inria.fr; Hu, Peng, E-mail: Peng.Hu@inria.fr [Universite de Bordeaux I, Centre INRIA Bordeaux et Sud-Ouest and Institut de Mathematiques de Bordeaux (France); Oudjane, Nadia, E-mail: Nadia.Oudjane@edf.fr [EDF R and D Clamart (France)

    2012-12-15

    We present a new algorithm to compute the Snell envelope in the specific case where the criterion to optimize is associated with a small probability or a rare event. This new approach combines the Stochastic Mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criterion. The theoretical analysis of this new algorithm provides non-asymptotic convergence estimates. Finally, the numerical tests confirm the practical interest of this approach.

  13. Introduction to probability and measure theories

    International Nuclear Information System (INIS)

    Partasarati, K.

    1983-01-01

    Chapters on probability theory and measure theory are presented. Borel mappings of measure spaces into each other and into separable metric spaces are studied. The Kolmogorov theorem on the extension of probabilities is derived from the theorem on extending measures to projective limits of measure spaces. Integration theory is developed, and measures on products of spaces are studied. The theory of conditional mathematical expectations via projections in Hilbert space is presented. In conclusion, the theory of weak convergence of measures, elements of the theory of characteristic functions, and the theory of invariant and quasi-invariant measures on groups and homogeneous spaces are given

  14. Statistical physics of pairwise probability models

    DEFF Research Database (Denmark)

    Roudi, Yasser; Aurell, Erik; Hertz, John

    2009-01-01

    Statistical models for describing the probability distribution over the states of biological systems are commonly used for dimensional reduction. Among these models, pairwise models are very attractive in part because they can be fit using a reasonable amount of data: knowledge of the means and correlations between pairs of elements in the system is sufficient. Not surprisingly, then, using pairwise models for studying neural data has been the focus of many studies in recent years. In this paper, we describe how tools from statistical physics can be employed for studying...

  15. Representing Uncertainty by Probability and Possibility

    DEFF Research Database (Denmark)

    Uncertain parameters in modeling are usually represented by probability distributions reflecting either the objective uncertainty of the parameters or the subjective belief held by the model builder. This approach is particularly suited for representing the statistical nature or variance of uncertain parameters. Monte Carlo simulation is readily used for practical calculations. However, an alternative approach is offered by possibility theory, making use of possibility distributions such as intervals and fuzzy intervals. This approach is well suited to represent lack of knowledge or imprecision...

  16. The Probability Model of Expectation Disconfirmation Process

    Directory of Open Access Journals (Sweden)

    Hui-Hsin HUANG

    2015-06-01

    This paper proposes a probability model to explore the dynamic process of customer satisfaction. Based on expectation disconfirmation theory, satisfaction is constructed from the customer’s expectation before the buying behavior and the perceived performance after purchase. An experimental method is designed to measure expectation disconfirmation effects, and we also use the collected data to estimate overall satisfaction and calibrate the model. The results show a good fit between the model and the real data. The model has applications in business marketing for managing relationship satisfaction.

  17. A short walk in quantum probability.

    Science.gov (United States)

    Hudson, Robin

    2018-04-28

    This is a personal survey of aspects of quantum probability related to the Heisenberg commutation relation for canonical pairs. Using the failure, in general, of non-negativity of the Wigner distribution for canonical pairs to motivate a more satisfactory quantum notion of joint distribution, we visit a central limit theorem for such pairs and a resulting family of quantum planar Brownian motions which deform the classical planar Brownian motion, together with a corresponding family of quantum stochastic areas. This article is part of the themed issue 'Hilbert's sixth problem'.

  18. Quantum processes: probability fluxes, transition probabilities in unit time and vacuum vibrations

    International Nuclear Information System (INIS)

    Oleinik, V.P.; Arepjev, Ju D.

    1989-01-01

    Transition probabilities in unit time and probability fluxes are compared in studying elementary quantum processes: the decay of a bound state under the action of time-varying and constant electric fields. It is shown that the difference between these quantities may be considerable, so the use of transition probabilities W instead of probability fluxes Π in calculating particle fluxes may lead to serious errors. The quantity W represents the rate of change with time of the population of the energy levels, relating partly to the real states and partly to the virtual ones, and it cannot be directly measured in experiment. The vacuum background is shown to be continuously distorted when a perturbation acts on a system. Because of this, the viewpoint of an observer on the physical properties of real particles continuously varies with time. This fact is not taken into consideration in the conventional theory of quantum transitions based on the notion of the probability amplitude. As a result, the probability amplitudes lose their physical meaning. All the physical information on the quantum dynamics of a system is contained in the mean values of physical quantities. The existence of considerable differences between the quantities W and Π permits one, in principle, to choose the correct theory of quantum transitions on the basis of experimental data. (author)

  19. Decontamination for free release

    Energy Technology Data Exchange (ETDEWEB)

    Simpson, K A; Elder, G R [Bradtec Ltd., Bristol (United Kingdom)

    1997-02-01

    Many countries are seeking to treat radioactive waste in ways which meet the local regulatory requirements yet are cost-effective when all contributing factors are assessed. In some countries there are increasing amounts of waste, arising from nuclear plant decommissioning, which are categorized as low-level waste; however, with suitable treatment a large part of such waste might fall beyond regulatory control and be released as non-radioactive. The benefits and disadvantages of additional treatment before disposal need to be considered. Several processes falling within the overall description of decontamination for free release have been developed and applied, and these are outlined. In one instance the process takes advantage of techniques and equipment used for decontaminating water-reactor circuits intermittently through reactor life. (author). 9 refs, 1 fig., 3 tabs.

  20. Atmospheric Release Advisory Capability

    International Nuclear Information System (INIS)

    Dickerson, M.H.; Gudiksen, P.H.; Sullivan, T.J.

    1983-02-01

    The Atmospheric Release Advisory Capability (ARAC) project is a Department of Energy (DOE) sponsored real-time emergency response service available for use by both federal and state agencies in case of a potential or actual atmospheric release of nuclear material. The project, initiated in 1972, is currently evolving from the research and development phase to full operation. Plans are underway to expand the existing capability to continuous operation by 1984 and to establish a National ARAC Center (NARAC) by 1988. This report describes the ARAC system, its utilization during the past two years, and plans for its expansion during the next five to six years. An integral part of this expansion is due to a very important and crucial effort sponsored by the Defense Nuclear Agency to extend the ARAC service to approximately 45 Department of Defense (DOD) sites throughout the continental US over the next three years