WorldWideScience

Sample records for learning probabilistic models

  1. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data were derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids, together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical Modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible-worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and that models learned from probabilistic examples show a significant decrease in error, together with improved insight from the learned results, compared with PILP models learned from non-probabilistic examples.
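
    The key technical move here is attaching empirical probability labels to examples by contrasting treated with control measurements, instead of using hard positive/negative labels. As a rough illustration of that idea only (the paper's actual labelling procedure is more involved, and the function below is invented, not the authors' code), such a label could be estimated by bootstrapping:

    ```python
    # Hypothetical sketch: deriving a probability label for an example such as
    # "metabolite M is up-regulated under the toxin" from control vs. treated
    # NMR intensities. Names and numbers are illustrative, not from the paper.
    import numpy as np

    def probability_label(control, treated, n_boot=10_000, seed=0):
        """Bootstrap estimate of P(mean(treated) > mean(control))."""
        rng = np.random.default_rng(seed)
        control = np.asarray(control, dtype=float)
        treated = np.asarray(treated, dtype=float)
        hits = 0
        for _ in range(n_boot):
            c = rng.choice(control, size=control.size, replace=True)
            t = rng.choice(treated, size=treated.size, replace=True)
            hits += t.mean() > c.mean()
        return hits / n_boot

    control = [0.92, 1.05, 0.98, 1.10, 0.95]    # made-up control intensities
    treated = [1.30, 1.22, 1.41, 1.18, 1.35]    # made-up treated intensities
    print(probability_label(control, treated))  # near 1.0: a confident label
    ```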

  2. Probabilistic models and machine learning in structural bioinformatics

    DEFF Research Database (Denmark)

    Hamelryck, Thomas

    2009-01-01

    Recently, probabilistic models and machine learning methods based on Bayesian principles are providing efficient and rigorous solutions to challenging problems that were long regarded as intractable. In this review, I will highlight some important recent developments in the prediction, analysis...

  3. Machine learning, computer vision, and probabilistic models in jet physics

    CERN Multimedia

    CERN. Geneva; NACHMAN, Ben

    2015-01-01

    In this talk we present recent developments in the application of machine learning, computer vision, and probabilistic models to the analysis and interpretation of LHC events. First, we will introduce the concept of jet-images and computer vision techniques for jet tagging. Jet images enabled, for the first time, a connection between jet substructure and tagging and the fields of computer vision and image processing, improving the identification of highly boosted W bosons relative to state-of-the-art methods and providing a new way to visualize the discriminant features of different classes of jets, adding a new capability to understand the physics within jets and to design more powerful jet tagging methods. Second, we will present Fuzzy jets: a new paradigm for jet clustering using machine learning methods. Fuzzy jets view jet clustering as an unsupervised learning task and incorporate a probabilistic assignment of particles to jets to learn new features of the jet structure. In particular, we wi...
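
    A jet image in this sense is a fixed-size, pT-weighted two-dimensional histogram of a jet's constituents in the (eta, phi) plane, which can then be treated like any other image. A minimal sketch of the construction (the grid size is invented, and real jet-image pipelines add preprocessing such as translation, rotation, and normalization):

    ```python
    # Minimal sketch of a "jet image": discretize a jet's constituents into an
    # (eta, phi) pixel grid weighted by transverse momentum (pT).
    import numpy as np

    def jet_image(eta, phi, pt, jet_eta, jet_phi, npix=25, half_width=1.0):
        """Return an npix x npix pT-weighted histogram centred on the jet axis."""
        deta = np.asarray(eta) - jet_eta
        dphi = (np.asarray(phi) - jet_phi + np.pi) % (2 * np.pi) - np.pi  # wrap
        edges = np.linspace(-half_width, half_width, npix + 1)
        image, _, _ = np.histogram2d(deta, dphi, bins=[edges, edges], weights=pt)
        return image

    # Toy jet with three constituents.
    img = jet_image(eta=[0.1, -0.2, 0.05], phi=[2.0, 2.3, 1.9],
                    pt=[120.0, 45.0, 30.0], jet_eta=0.0, jet_phi=2.0)
    print(img.shape, img.sum())  # (25, 25), total pT captured in the window
    ```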

  4. Human-Guided Learning for Probabilistic Logic Models

    Directory of Open Access Journals (Sweden)

    Phillip Odom

    2018-06-01

    Advice-giving has long been explored in the artificial intelligence community as a way to build robust learning algorithms when the data is noisy, incorrect, or even insufficient. While logic-based systems were effectively used in building expert systems, the role of the human has been restricted to being a “mere labeler” in recent times. We hypothesize and demonstrate that probabilistic logic can provide an effective and natural way for the expert to specify domain advice. Specifically, we consider different types of advice-giving in relational domains where noise could arise due to systematic errors or class imbalance inherent in the domains. The advice is provided as logical statements or privileged features that are then explicitly considered by an iterative learning algorithm at every update. Our empirical evidence shows that human advice can effectively accelerate learning in noisy, structured domains where so far humans have been merely used as labelers or as designers of the (initial or final) structure of the model.

  5. The Gain-Loss Model: A Probabilistic Skill Multimap Model for Assessing Learning Processes

    Science.gov (United States)

    Robusto, Egidio; Stefanutti, Luca; Anselmi, Pasquale

    2010-01-01

    Within the theoretical framework of knowledge space theory, a probabilistic skill multimap model for assessing learning processes is proposed. The learning process of a student is modeled as a function of the student's knowledge and of an educational intervention on the attainment of specific skills required to solve problems in a knowledge…

  6. Scalable learning of probabilistic latent models for collaborative filtering

    DEFF Research Database (Denmark)

    Langseth, Helge; Nielsen, Thomas Dyhre

    2015-01-01

    … variational Bayes learning and inference algorithm for these types of models. Empirical results show that the proposed algorithm achieves significantly better accuracy results than other straw-men models evaluated on a collection of well-known data sets. We also demonstrate that the algorithm has a highly...

  7. Learning probabilistic models of hydrogen bond stability from molecular dynamics simulation trajectories

    KAUST Repository

    Chikalov, Igor; Yao, Peggy; Moshkov, Mikhail; Latombe, Jean-Claude

    2011-01-01

    The intrinsic strength of an individual H-bond has been studied from an energetic viewpoint, but energy alone may not be a very good predictor. Methods: This paper describes inductive learning methods to train protein-independent probabilistic models of H…

  8. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
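
    The "probabilistic framework" referred to here is conventionally summarized by two equations: Bayes' rule, which updates beliefs about a model or its parameters given data, and the posterior predictive distribution, which averages predictions over that posterior. In standard textbook notation (not taken verbatim from the review):

    ```latex
    P(\theta \mid \mathcal{D}) = \frac{P(\mathcal{D} \mid \theta)\, P(\theta)}{P(\mathcal{D})},
    \qquad
    P(x^{*} \mid \mathcal{D}) = \int P(x^{*} \mid \theta)\, P(\theta \mid \mathcal{D})\, \mathrm{d}\theta .
    ```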

  9. Approaches to probabilistic model learning for mobile manipulation robots

    CERN Document Server

    Sturm, Jürgen

    2013-01-01

    Mobile manipulation robots are envisioned to provide many useful services both in domestic environments and in the industrial context. Examples include domestic service robots that take over large parts of the housework, and versatile industrial assistants that provide automation, transportation, inspection, and monitoring services. The challenge in these applications is that the robots have to function under changing, real-world conditions, be able to deal with considerable amounts of noise and uncertainty, and operate without the supervision of an expert. This book presents novel learning techniques that enable mobile manipulation robots, i.e., mobile platforms with one or more robotic manipulators, to autonomously adapt to new or changing situations. The approaches presented in this book cover the following topics: (1) learning the robot's kinematic structure and properties using actuation and visual feedback, (2) learning about articulated objects in the environment in which the robot is operating,...

  10. Learning Probabilistic Decision Graphs

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Dalgaard, Jens; Silander, Tomi

    2004-01-01

    … efficient representations than Bayesian networks. In this paper we present an algorithm for learning PDGs from data. First experiments show that the algorithm is capable of learning optimal PDG representations in some cases, and that the computational efficiency of PDG models learned from real-life data...

  11. Quantum-Assisted Learning of Hardware-Embedded Probabilistic Graphical Models

    Science.gov (United States)

    Benedetti, Marcello; Realpe-Gómez, John; Biswas, Rupak; Perdomo-Ortiz, Alejandro

    2017-10-01

    Mainstream machine-learning techniques such as deep learning and probabilistic programming rely heavily on sampling from generally intractable probability distributions. There is increasing interest in the potential advantages of using quantum computing technologies as sampling engines to speed up these tasks or to make them more effective. However, some pressing challenges in state-of-the-art quantum annealers have to be overcome before we can assess their actual performance. The sparse connectivity, resulting from the local interaction between quantum bits in physical hardware implementations, is considered the most severe limitation to the quality of constructing powerful generative unsupervised machine-learning models. Here, we use embedding techniques to add redundancy to data sets, allowing us to increase the modeling capacity of quantum annealers. We illustrate our findings by training hardware-embedded graphical models on a binarized data set of handwritten digits and two synthetic data sets in experiments with up to 940 quantum bits. Our model can be trained in quantum hardware without full knowledge of the effective parameters specifying the corresponding quantum Gibbs-like distribution; therefore, this approach avoids the need to infer the effective temperature at each iteration, speeding up learning; it also mitigates the effect of noise in the control parameters, making it robust to deviations from the reference Gibbs distribution. Our approach demonstrates the feasibility of using quantum annealers for implementing generative models, and it provides a suitable framework for benchmarking these quantum technologies on machine-learning-related tasks.

  12. Quantum-Assisted Learning of Hardware-Embedded Probabilistic Graphical Models

    Directory of Open Access Journals (Sweden)

    Marcello Benedetti

    2017-11-01

    Mainstream machine-learning techniques such as deep learning and probabilistic programming rely heavily on sampling from generally intractable probability distributions. There is increasing interest in the potential advantages of using quantum computing technologies as sampling engines to speed up these tasks or to make them more effective. However, some pressing challenges in state-of-the-art quantum annealers have to be overcome before we can assess their actual performance. The sparse connectivity, resulting from the local interaction between quantum bits in physical hardware implementations, is considered the most severe limitation to the quality of constructing powerful generative unsupervised machine-learning models. Here, we use embedding techniques to add redundancy to data sets, allowing us to increase the modeling capacity of quantum annealers. We illustrate our findings by training hardware-embedded graphical models on a binarized data set of handwritten digits and two synthetic data sets in experiments with up to 940 quantum bits. Our model can be trained in quantum hardware without full knowledge of the effective parameters specifying the corresponding quantum Gibbs-like distribution; therefore, this approach avoids the need to infer the effective temperature at each iteration, speeding up learning; it also mitigates the effect of noise in the control parameters, making it robust to deviations from the reference Gibbs distribution. Our approach demonstrates the feasibility of using quantum annealers for implementing generative models, and it provides a suitable framework for benchmarking these quantum technologies on machine-learning-related tasks.

  13. Learning probabilistic models of hydrogen bond stability from molecular dynamics simulation trajectories

    KAUST Repository

    Chikalov, Igor

    2011-02-15

    Background: Hydrogen bonds (H-bonds) play a key role in both the formation and stabilization of protein structures. They form and break while a protein deforms, for instance during the transition from a non-functional to a functional state. The intrinsic strength of an individual H-bond has been studied from an energetic viewpoint, but energy alone may not be a very good predictor. Methods: This paper describes inductive learning methods to train protein-independent probabilistic models of H-bond stability from molecular dynamics (MD) simulation trajectories of various proteins. The training data contains 32 input attributes (predictors) that describe an H-bond and its local environment in a conformation c, and the output attribute is the probability that the H-bond will be present in an arbitrary conformation of this protein achievable from c within a time duration Δt. We model dependence of the output variable on the predictors by a regression tree. Results: Several models are built using 6 MD simulation trajectories containing over 4000 distinct H-bonds (millions of occurrences). Experimental results demonstrate that such models can predict H-bond stability quite well. They perform roughly 20% better than models based on H-bond energy alone. In addition, they can accurately identify a large fraction of the least stable H-bonds in a conformation. In most tests, about 80% of the 10% H-bonds predicted as the least stable are actually among the 10% truly least stable. The important attributes identified during the tree construction are consistent with previous findings. Conclusions: We use inductive learning methods to build protein-independent probabilistic models to study H-bond stability, and demonstrate that the models perform better than H-bond energy alone. © 2011 Chikalov et al; licensee BioMed Central Ltd.
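
    A minimal sketch of this modeling setup with scikit-learn, using synthetic data in place of the 32 MD-derived predictors (the attribute semantics, tree settings, and data below are all placeholders, not the paper's):

    ```python
    # Regression tree mapping H-bond descriptors to a stability probability,
    # mirroring the paper's setup; the data here are synthetic stand-ins.
    import numpy as np
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)
    n_bonds, n_predictors = 5000, 32
    X = rng.normal(size=(n_bonds, n_predictors))  # e.g. geometry, energies, ...
    # Fake ground truth: stability driven by a few predictors, squashed to [0, 1].
    y = 1.0 / (1.0 + np.exp(-(1.5 * X[:, 0] - X[:, 3] + 0.5 * X[:, 7])))

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    tree = DecisionTreeRegressor(max_depth=6, min_samples_leaf=50).fit(X_tr, y_tr)
    print("R^2 on held-out H-bonds:", round(tree.score(X_te, y_te), 3))
    # As in the paper, the tree also reveals which attributes matter most.
    print("Top predictors:", np.argsort(tree.feature_importances_)[::-1][:3])
    ```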

  14. Transitive probabilistic CLIR models.

    NARCIS (Netherlands)

    Kraaij, W.; de Jong, Franciska M.G.

    2004-01-01

    Transitive translation could be a useful technique to enlarge the number of supported language pairs for a cross-language information retrieval (CLIR) system in a cost-effective manner. The paper describes several setups for transitive translation based on probabilistic translation models. The…

  15. Probabilistic Learning by Rodent Grid Cells.

    Science.gov (United States)

    Cheung, Allen

    2016-10-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition, but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, despite this being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed, such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population…

  16. Learning Probabilistic Models of Hydrogen Bond Stability from Molecular Dynamics Simulation Trajectories

    KAUST Repository

    Chikalov, Igor

    2011-04-02

    Hydrogen bonds (H-bonds) play a key role in both the formation and stabilization of protein structures. H-bonds involving atoms from residues that are close to each other in the main-chain sequence stabilize secondary structure elements. H-bonds between atoms from distant residues stabilize a protein’s tertiary structure. However, H-bonds greatly vary in stability. They form and break while a protein deforms. For instance, the transition of a protein from a nonfunctional to a functional state may require some H-bonds to break and others to form. The intrinsic strength of an individual H-bond has been studied from an energetic viewpoint, but energy alone may not be a very good predictor. Other local interactions may reinforce (or weaken) an H-bond. This paper describes inductive learning methods to train a protein-independent probabilistic model of H-bond stability from molecular dynamics (MD) simulation trajectories. The training data describes H-bond occurrences at successive times along these trajectories by the values of attributes called predictors. A trained model is constructed in the form of a regression tree in which each non-leaf node is a Boolean test (split) on a predictor. Each occurrence of an H-bond maps to a path in this tree from the root to a leaf node. Its predicted stability is associated with the leaf node. Experimental results demonstrate that such models can predict H-bond stability quite well. In particular, their performance is roughly 20% better than that of models based on H-bond energy alone. In addition, they can accurately identify a large fraction of the least stable H-bonds in a given conformation. The paper discusses several extensions that may yield further improvements.

  17. Probabilistic Model Development

    Science.gov (United States)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a probabilistic model for the solar energetic particle environment, and develop a tool to provide a reference solar particle radiation environment that (1) will not be exceeded at a user-specified confidence level, and (2) will provide reference environments for (a) peak flux, (b) event-integrated fluence, and (c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium and heavier ions.
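
    An environment that "will not be exceeded at a user-specified confidence level" is simply a quantile of the modeled flux or fluence distribution. Purely for illustration, assuming a lognormal mission-integrated proton fluence (the abstract does not state the distributional form, and the parameters below are invented):

    ```python
    # Quantile-based "reference environment": the fluence not exceeded at a
    # user-specified confidence level, under an assumed lognormal model.
    import numpy as np
    from scipy.stats import lognorm

    mu, sigma = np.log(3e9), 0.8                  # hypothetical log-mean, log-sd
    fluence = lognorm(s=sigma, scale=np.exp(mu))  # fluence in cm^-2
    for conf in (0.50, 0.90, 0.95):
        print(f"{conf:.0%} confidence fluence: {fluence.ppf(conf):.2e} cm^-2")
    ```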

  18. Mastering probabilistic graphical models using Python

    CERN Document Server

    Ankan, Ankur

    2015-01-01

    If you are a researcher or a machine learning enthusiast, or are working in the data science field and have a basic idea of Bayesian learning or probabilistic graphical models, this book will help you to understand the details of graphical models and use them in your data science problems.

  19. Machine learning: a probabilistic perspective

    CERN Document Server

    Murphy, Kevin P

    2012-01-01

    Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic method...

  1. Probabilistic escalation modelling

    Energy Technology Data Exchange (ETDEWEB)

    Korneliussen, G.; Eknes, M.L.; Haugen, K.; Selmer-Olsen, S. [Det Norske Veritas, Oslo (Norway)]

    1997-12-31

    This paper describes how structural reliability methods may successfully be applied within quantitative risk assessment (QRA) as an alternative to traditional event tree analysis. The emphasis is on fire escalation in hydrocarbon production and processing facilities. This choice was made due to potential improvements over current QRA practice associated with both the probabilistic approach and more detailed modelling of the dynamics of escalating events. The physical phenomena important for the events of interest are explicitly modelled as functions of time. Uncertainties are represented through probability distributions. The uncertainty modelling enables the analysis to be simple when possible and detailed when necessary. The methodology features several advantages compared with traditional risk calculations based on event trees. (Author)
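
    The contrast with event-tree analysis is that escalation is decided by comparing uncertain, time-dependent physical quantities rather than by fixed branch probabilities. A toy Monte Carlo version of that idea, with entirely invented distributions for fire heat flux, wall impairment time, and blowdown time:

    ```python
    # Toy probabilistic escalation model: escalation occurs if a fire impairs
    # a vessel wall before its inventory can be blown down. All physics and
    # distribution parameters below are invented for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100_000
    heat_flux = rng.lognormal(mean=np.log(150.0), sigma=0.4, size=n)    # kW/m^2
    t_impair = 1200.0 * (150.0 / heat_flux) * rng.uniform(0.8, 1.2, n)  # seconds
    t_blowdown = rng.normal(900.0, 120.0, size=n)                       # seconds

    print(f"P(escalation) ~ {np.mean(t_impair < t_blowdown):.3f}")
    ```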

  2. Learning a generative probabilistic grammar of experience: a process-level model of language acquisition.

    Science.gov (United States)

    Kolodny, Oren; Lotem, Arnon; Edelman, Shimon

    2015-03-01

    We introduce a set of biologically and computationally motivated design choices for modeling the learning of language, or of other types of sequential, hierarchically structured experience and behavior, and describe an implemented system that conforms to these choices and is capable of unsupervised learning from raw natural-language corpora. Given a stream of linguistic input, our model incrementally learns a grammar that captures its statistical patterns, which can then be used to parse or generate new data. The grammar constructed in this manner takes the form of a directed weighted graph, whose nodes are recursively (hierarchically) defined patterns over the elements of the input stream. We evaluated the model in seventeen experiments, grouped into five studies, which examined, respectively, (a) the generative ability of grammar learned from a corpus of natural language, (b) the characteristics of the learned representation, (c) sequence segmentation and chunking, (d) artificial grammar learning, and (e) certain types of structure dependence. The model's performance largely vindicates our design choices, suggesting that progress in modeling language acquisition can be made on a broad front, ranging from issues of generativity to the replication of human experimental findings, by bringing biological and computational considerations, as well as lessons from prior efforts, to bear on the modeling approach. Copyright © 2014 Cognitive Science Society, Inc.

  3. Error Discounting in Probabilistic Category Learning

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  4. A new computational account of cognitive control over reinforcement-based decision-making: Modeling of a probabilistic learning task.

    Science.gov (United States)

    Zendehrouh, Sareh

    2015-11-01

    Recent work in the decision-making field offers an account of dual-system theory for the decision-making process. This theory holds that the process is conducted by two main controllers: a goal-directed system and a habitual system. In the reinforcement learning (RL) domain, habitual behaviors are connected with model-free methods, in which appropriate actions are learned through trial-and-error experiences. However, goal-directed behaviors are associated with model-based methods of RL, in which actions are selected using a model of the environment. Studies on cognitive control also suggest that during processes like decision-making, some cortical and subcortical structures work in concert to monitor the consequences of decisions and to adjust control according to current task demands. Here a computational model is presented based on dual-system theory and the cognitive control perspective of decision-making. The proposed model is used to simulate human performance on a variant of a probabilistic learning task. The basic proposal is that the brain implements a dual controller, while an accompanying monitoring system detects some kinds of conflict, including a hypothetical cost-conflict one. The simulation results address existing theories about two event-related potentials, namely error-related negativity (ERN) and feedback-related negativity (FRN), and explore the best account of them. Based on the results, some testable predictions are also presented. Copyright © 2015 Elsevier Ltd. All rights reserved.
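
    The "habitual" (model-free) half of such a dual-system account is typically a prediction-error-driven value learner. Below is a minimal sketch of that component alone, a tabular Q-learner with softmax choice on a two-option probabilistic task with a mid-session reversal; the paper's full model additionally includes a model-based controller and the conflict-monitoring machinery, and all parameters here are illustrative:

    ```python
    # Model-free Q-learning on a two-choice probabilistic reversal task.
    import numpy as np

    rng = np.random.default_rng(0)
    alpha, beta = 0.2, 3.0            # learning rate, softmax inverse temperature
    q = np.zeros(2)                   # action values
    p_reward = np.array([0.8, 0.2])   # P(reward | action); reversed halfway

    choices = []
    for t in range(400):
        if t == 200:                  # contingency reversal
            p_reward = p_reward[::-1]
        probs = np.exp(beta * q) / np.exp(beta * q).sum()
        a = rng.choice(2, p=probs)
        r = float(rng.random() < p_reward[a])
        q[a] += alpha * (r - q[a])    # reward prediction error drives learning
        choices.append(a)

    choices = np.array(choices)
    print("best-option rate pre-reversal :", (choices[:200] == 0).mean())
    print("best-option rate post-reversal:", (choices[200:] == 1).mean())
    ```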

  5. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 'Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties...

  6. From exemplar to grammar: a probabilistic analogy-based model of language learning.

    Science.gov (United States)

    Bod, Rens

    2009-07-01

    While rules and exemplars are usually viewed as opposites, this paper argues that they form end points of the same distribution. By representing both rules and exemplars as (partial) trees, we can take into account the fluid middle ground between the two extremes. This insight is the starting point for a new theory of language learning that is based on the following idea: If a language learner does not know which phrase-structure trees should be assigned to initial sentences, s/he allows (implicitly) for all possible trees and lets linguistic experience decide which is the "best" tree for each sentence. The best tree is obtained by maximizing "structural analogy" between a sentence and previous sentences, which is formalized by the most probable shortest combination of subtrees from all trees of previous sentences. Corpus-based experiments with this model on the Penn Treebank and the CHILDES database indicate that it can learn both exemplar-based and rule-based aspects of language, ranging from phrasal verbs to auxiliary fronting. By having learned the syntactic structures of sentences, we have also learned the grammar implicit in these structures, which can in turn be used to produce new sentences. We show that our model mimics children's language development from item-based constructions to abstract constructions, and that the model can simulate some of the errors made by children in producing complex questions. Copyright © 2009 Cognitive Science Society, Inc.
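
    In Data-Oriented Parsing-style models of this kind, the machinery behind "the most probable shortest combination of subtrees" is standard: a derivation is scored by the product of the relative frequencies of the subtrees it combines. In the usual notation (a generic formulation, not copied from the paper):

    ```latex
    P(d) = \prod_{i=1}^{n} P(t_i),
    \qquad
    P(t) = \frac{\operatorname{count}(t)}{\sum_{t' : \operatorname{root}(t') = \operatorname{root}(t)} \operatorname{count}(t')},
    ```

    where the derivation $d = t_1 \circ \cdots \circ t_n$ combines subtrees drawn from the trees of previously seen sentences.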

  7. A comparison of algorithms for inference and learning in probabilistic graphical models.

    Science.gov (United States)

    Frey, Brendan J; Jojic, Nebojsa

    2005-09-01

    Research into methods for reasoning under uncertainty is currently one of the most exciting areas of artificial intelligence, largely because it has recently become possible to record, store, and process large amounts of data. While impressive achievements have been made in pattern classification problems such as handwritten character recognition, face detection, speaker identification, and prediction of gene function, it is even more exciting that researchers are on the verge of introducing systems that can perform large-scale combinatorial analyses of data, decomposing the data into interacting components. For example, computational methods for automatic scene analysis are now emerging in the computer vision community. These methods decompose an input image into its constituent objects, lighting conditions, motion patterns, etc. Two of the main challenges are finding effective representations and models in specific applications and finding efficient algorithms for inference and learning in these models. In this paper, we advocate the use of graph-based probability models and their associated inference and learning algorithms. We review exact techniques and various approximate, computationally efficient techniques, including iterated conditional modes, the expectation maximization (EM) algorithm, Gibbs sampling, the mean field method, variational techniques, structured variational techniques and the sum-product algorithm ("loopy" belief propagation). We describe how each technique can be applied in a vision model of multiple, occluding objects and contrast the behaviors and performances of the techniques using a unifying cost function, free energy.
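
    As a concrete instance of one of the reviewed techniques, here is expectation-maximization (EM) in its simplest setting, a two-component one-dimensional Gaussian mixture (synthetic data; the review applies EM and the other algorithms to much richer vision models):

    ```python
    # EM for a two-component 1-D Gaussian mixture.
    import numpy as np

    rng = np.random.default_rng(3)
    x = np.concatenate([rng.normal(-2, 1.0, 300), rng.normal(3, 1.5, 700)])

    pi = np.array([0.5, 0.5])            # mixture weights
    mu = np.array([-1.0, 1.0])           # component means
    sigma = np.array([1.0, 1.0])         # component standard deviations
    for _ in range(100):
        # E-step: responsibility of each component for each data point.
        dens = (np.exp(-0.5 * ((x[:, None] - mu) / sigma) ** 2)
                / (sigma * np.sqrt(2 * np.pi)))
        resp = pi * dens
        resp /= resp.sum(axis=1, keepdims=True)
        # M-step: re-estimate weights, means, and standard deviations.
        nk = resp.sum(axis=0)
        pi = nk / x.size
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sigma = np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

    print("weights:", pi.round(2), "means:", mu.round(2), "sds:", sigma.round(2))
    ```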

  8. Probabilistic Modeling of Timber Structures

    DEFF Research Database (Denmark)

    Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2005-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for components and connections. The recommended...

  9. Probabilistic dietary exposure models

    NARCIS (Netherlands)

    Boon, Polly E.; Voet, van der H.

    2015-01-01

    Exposure models are used to calculate the amount of potential harmful chemicals ingested by a human population. Examples of harmful chemicals are residues of pesticides, chemicals entering food from the environment (such as dioxins, cadmium, lead, mercury), and chemicals that are generated via…

  10. Perceptual learning as improved probabilistic inference in early sensory areas.

    Science.gov (United States)

    Bejjanki, Vikranth R; Beck, Jeffrey M; Lu, Zhong-Lin; Pouget, Alexandre

    2011-05-01

    Extensive training on simple tasks such as fine orientation discrimination results in large improvements in performance, a form of learning known as perceptual learning. Previous models have argued that perceptual learning is due to either sharpening and amplification of tuning curves in early visual areas or to improved probabilistic inference in later visual areas (at the decision stage). However, early theories are inconsistent with the conclusions of psychophysical experiments manipulating external noise, whereas late theories cannot explain the changes in neural responses that have been reported in cortical areas V1 and V4. Here we show that we can capture both the neurophysiological and behavioral aspects of perceptual learning by altering only the feedforward connectivity in a recurrent network of spiking neurons so as to improve probabilistic inference in early visual areas. The resulting network shows modest changes in tuning curves, in line with neurophysiological reports, along with a marked reduction in the amplitude of pairwise noise correlations.

  11. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  12. A probabilistic graphical model based stochastic input model construction

    International Nuclear Information System (INIS)

    Wan, Jiang; Zabaras, Nicholas

    2014-01-01

    Model reduction techniques have been widely used in modeling of high-dimensional stochastic input in uncertainty quantification tasks. However, the probabilistic modeling of random variables projected into reduced-order spaces presents a number of computational challenges. Due to the curse of dimensionality, the underlying dependence relationships between these random variables are difficult to capture. In this work, a probabilistic graphical model based approach is employed to learn the dependence by running a number of conditional independence tests using observation data. Thus a probabilistic model of the joint PDF is obtained and the PDF is factorized into a set of conditional distributions based on the dependence structure of the variables. The estimation of the joint PDF from data is then transformed to estimating conditional distributions under reduced dimensions. To improve the computational efficiency, a polynomial chaos expansion is further applied to represent the random field in terms of a set of standard random variables. This technique is combined with both linear and nonlinear model reduction methods. Numerical examples are presented to demonstrate the accuracy and efficiency of the probabilistic graphical model based stochastic input models. - Highlights: • Data-driven stochastic input models without the assumption of independence of the reduced random variables. • The problem is transformed to a Bayesian network structure learning problem. • Examples are given in flows in random media
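
    The structure-learning step rests on conditional independence tests between the reduced random variables. A sketch of one common choice for continuous data, the Fisher-z test based on partial correlation (the paper's exact test may differ):

    ```python
    # Fisher-z conditional independence test via partial correlation.
    import numpy as np
    from scipy.stats import norm

    def ci_test(data, i, j, cond, alpha=0.05):
        """True if column i is independent of column j given columns `cond`."""
        cols = [i, j] + list(cond)
        corr = np.corrcoef(data[:, cols], rowvar=False)
        prec = np.linalg.inv(corr)                          # precision matrix
        r = -prec[0, 1] / np.sqrt(prec[0, 0] * prec[1, 1])  # partial correlation
        n = data.shape[0]
        z = 0.5 * np.log((1 + r) / (1 - r)) * np.sqrt(n - len(cond) - 3)
        return 2 * (1 - norm.cdf(abs(z))) > alpha           # p-value > alpha

    rng = np.random.default_rng(0)
    a = rng.normal(size=2000)
    b = a + 0.1 * rng.normal(size=2000)     # b depends on a
    c = b + 0.1 * rng.normal(size=2000)     # c depends on a only through b
    data = np.column_stack([a, b, c])
    print(ci_test(data, 0, 2, []))          # False: a and c are dependent
    print(ci_test(data, 0, 2, [1]))         # True: independent given b
    ```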

  13. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitions converter-generated voltage harmonics into those with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3 MW, Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic...

  14. Probabilistic Survivability Versus Time Modeling

    Science.gov (United States)

    Joyner, James J., Sr.

    2016-01-01

    This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program, carried out to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews. The assessments provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.
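
    In its simplest form, such a curve is S(t) = P(hazard develops later than the egress time t), estimated by sampling the hazard-onset distribution. A toy version with an invented Weibull onset model (the assessments' actual distributions are not given in the abstract):

    ```python
    # Toy "probabilistic survivability versus time" curve.
    import numpy as np

    rng = np.random.default_rng(7)
    hazard_onset = 300.0 * rng.weibull(1.5, size=100_000)   # onset times (s)

    def survivability(egress_time):
        """P(safe location reached before the hazard develops)."""
        return np.mean(hazard_onset > egress_time)

    for t in (30, 60, 120, 240):
        print(f"egress {t:>3d} s -> survivability {survivability(t):.2%}")
    ```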

  15. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.

    Science.gov (United States)

    Pecevski, Dejan; Maass, Wolfgang

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.

  16. Probabilistic learning of nonlinear dynamical systems using sequential Monte Carlo

    Science.gov (United States)

    Schön, Thomas B.; Svensson, Andreas; Murray, Lawrence; Lindsten, Fredrik

    2018-05-01

    Probabilistic modeling provides the capability to represent and manipulate uncertainty in data, models, predictions and decisions. We are concerned with the problem of learning probabilistic models of dynamical systems from measured data. Specifically, we consider learning of probabilistic nonlinear state-space models. There is no closed-form solution available for this problem, implying that we are forced to use approximations. In this tutorial we will provide a self-contained introduction to one of the state-of-the-art methods, the particle Metropolis-Hastings algorithm, which has proven to offer a practical approximation. This is a Monte Carlo based method, where the particle filter is used to guide a Markov chain Monte Carlo method through the parameter space. One of the key merits of the particle Metropolis-Hastings algorithm is that it is guaranteed to converge to the "true solution" under mild assumptions, despite being based on a particle filter with only a finite number of particles. We will also provide a motivating numerical example illustrating the method using a modeling language tailored for sequential Monte Carlo methods. The intention of modeling languages of this kind is to open up the power of sophisticated Monte Carlo methods, including particle Metropolis-Hastings, to a large group of users without requiring them to know all the underlying mathematical details.
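
    A compact sketch of the construction described here: a bootstrap particle filter provides an unbiased estimate of the likelihood p(y | theta), and that noisy estimate drives an ordinary random-walk Metropolis-Hastings chain over the parameter. The scalar linear-Gaussian state-space model and all tuning constants below are invented for illustration (the tutorial's own examples use a dedicated probabilistic modeling language):

    ```python
    # Particle Metropolis-Hastings for x_t = theta*x_{t-1} + v_t, y_t = x_t + e_t.
    import numpy as np

    rng = np.random.default_rng(0)

    # Simulate data from the true model (theta = 0.8).
    T, theta_true = 200, 0.8
    x = np.zeros(T)
    for t in range(1, T):
        x[t] = theta_true * x[t - 1] + rng.normal(0, 0.5)
    y = x + rng.normal(0, 1.0, size=T)

    def log_lik_estimate(theta, y, n_particles=200):
        """Bootstrap particle filter estimate of log p(y | theta)."""
        particles = np.zeros(n_particles)
        ll = 0.0
        for obs in y:
            particles = theta * particles + rng.normal(0, 0.5, n_particles)
            logw = -0.5 * (obs - particles) ** 2 - 0.5 * np.log(2 * np.pi)
            m = logw.max()
            w = np.exp(logw - m)
            ll += m + np.log(w.mean())
            idx = rng.choice(n_particles, n_particles, p=w / w.sum())
            particles = particles[idx]                     # resample
        return ll

    # Random-walk Metropolis-Hastings driven by the noisy likelihood estimate.
    theta, ll = 0.5, log_lik_estimate(0.5, y)
    samples = []
    for _ in range(2000):
        prop = theta + rng.normal(0, 0.05)
        ll_prop = log_lik_estimate(prop, y)
        if np.log(rng.random()) < ll_prop - ll:            # flat prior on theta
            theta, ll = prop, ll_prop
        samples.append(theta)

    print("posterior mean of theta:", round(np.mean(samples[500:]), 3))
    ```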

  17. Probabilistic transport models for fusion

    International Nuclear Information System (INIS)

    Milligen, B.Ph. van; Carreras, B.A.; Lynch, V.E.; Sanchez, R.

    2005-01-01

    A generalization of diffusive (Fickian) transport is considered, in which particle motion is described by probability distributions. We design a simple model that includes a critical mechanism to switch between two transport channels, and show that it exhibits various interesting characteristics, suggesting that the ideas of probabilistic transport might provide a framework for the description of a range of unusual transport phenomena observed in fusion plasmas. The model produces power degradation and profile consistency, as well as a scaling of the confinement time with system size reminiscent of the gyro-Bohm/Bohm scalings observed in fusion plasmas, and rapid propagation of disturbances. In the present work we show how this model may also produce on-axis peaking of the profiles with off-axis fuelling. It is important to note that the fluid limit of a simple model like this, characterized by two transport channels, does not correspond to the usual (Fickian) transport models commonly used for modelling transport in fusion plasmas, and behaves in a fundamentally different way. (author)

  18. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order…

  19. Lessons learned on probabilistic methodology for precursor analyses

    International Nuclear Information System (INIS)

    Babst, Siegfried; Wielenberg, Andreas; Gaenssmantel, Gerhard

    2016-01-01

    Based on its experience in precursor assessment of operating experience from German NPPs and related international activities in the field, GRS has identified areas for enhancing probabilistic methodology. These are related to improving the completeness of PSA models, to insufficiencies in probabilistic assessment approaches, and to enhancements of precursor assessment methods. Three examples from recent practice in precursor assessments illustrating relevant methodological insights are provided and discussed in more detail. Our experience reinforces the importance of having full-scope, current PSA models up to Level 2 PSA, including hazard scenarios, for precursor analysis. Our lessons learned include that PSA models should be regularly updated regarding CCF data and the inclusion of newly discovered CCF mechanisms or groups. Moreover, precursor classification schemes should be extended to degradations and unavailabilities of the containment function. Finally, PSA and precursor assessments should put more emphasis on the consideration of passive provisions for safety, e.g. by sensitivity cases.

  1. Lessons learned on probabilistic methodology for precursor analyses

    Energy Technology Data Exchange (ETDEWEB)

    Babst, Siegfried [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Berlin (Germany)]; Wielenberg, Andreas; Gaenssmantel, Gerhard [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Garching (Germany)]

    2016-11-15

    Based on its experience in precursor assessment of operating experience from German NPPs and related international activities in the field, GRS has identified areas for enhancing probabilistic methodology. These are related to improving the completeness of PSA models, to insufficiencies in probabilistic assessment approaches, and to enhancements of precursor assessment methods. Three examples from recent practice in precursor assessments illustrating relevant methodological insights are provided and discussed in more detail. Our experience reinforces the importance of having full-scope, current PSA models up to Level 2 PSA, including hazard scenarios, for precursor analysis. Our lessons learned include that PSA models should be regularly updated regarding CCF data and the inclusion of newly discovered CCF mechanisms or groups. Moreover, precursor classification schemes should be extended to degradations and unavailabilities of the containment function. Finally, PSA and precursor assessments should put more emphasis on the consideration of passive provisions for safety, e.g. by sensitivity cases.

  2. CAD Parts-Based Assembly Modeling by Probabilistic Reasoning

    KAUST Repository

    Zhang, Kai-Ke; Hu, Kai-Mo; Yin, Li-Cheng; Yan, Dongming; Wang, Bin

    2016-01-01

    Nowadays, an increasing number of parts and sub-assemblies are publicly available and can be used directly for product development instead of being created from scratch. In this paper, we propose an interactive design framework for efficient and smart assembly modeling, in order to improve design efficiency. Our approach is based on probabilistic reasoning. Given a collection of industrial assemblies, we learn a probabilistic graphical model from the relationships between the parts of the assemblies. In the modeling stage, this probabilistic model is used to suggest the parts most likely to be compatible with the current assembly. Finally, the parts are assembled under certain geometric constraints. We demonstrate the effectiveness of our framework through a variety of assembly models produced by our prototype system. © 2015 IEEE.

  3. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  4. The effects of reduced dopamine transporter function and chronic lithium on motivation, probabilistic learning, and neurochemistry in mice: Modeling bipolar mania.

    Science.gov (United States)

    Milienne-Petiot, Morgane; Kesby, James P; Graves, Mary; van Enkhuizen, Jordy; Semenova, Svetlana; Minassian, Arpi; Markou, Athina; Geyer, Mark A; Young, Jared W

    2017-02-01

    Bipolar disorder (BD) mania patients exhibit poor cognition and reward-seeking/hypermotivation, negatively impacting a patient's quality of life. Current treatments (e.g., lithium) do not treat such deficits. Treatment development has been limited due to a poor understanding of the neural mechanisms underlying these behaviors. Here, we investigated putative mechanisms underlying cognition and reward-seeking/motivational changes relevant to BD mania patients using two validated mouse models and neurochemical analyses. The effects of reducing dopamine transporter (DAT) functioning via genetic (knockdown vs. wild-type littermates) or pharmacological (GBR12909- vs. vehicle-treated C57BL/6J mice) means were assessed in the probabilistic reversal learning task (PRLT) and progressive ratio breakpoint (PRB) test, during either water or chronic lithium treatment. These tasks quantify reward learning and effortful motivation, respectively. Neurochemistry was performed on brain samples of DAT mutants ± chronic lithium using high-performance liquid chromatography. Reduced DAT functioning increased reversals in the PRLT, an effect partially attenuated by chronic lithium. Chronic lithium alone slowed PRLT acquisition. Reduced DAT functioning increased motivation (PRB), an effect attenuated by lithium in GBR12909-treated mice. Neurochemical analyses revealed that DAT knockdown mice exhibited elevated homovanillic acid levels, but that lithium had no effect on these elevated levels. Reducing DAT functioning recreates many aspects of BD mania including hypermotivation and improved reversal learning (switching), as well as elevated homovanillic acid levels. Chronic lithium only exerted main effects, impairing learning and elevating norepinephrine and serotonin levels of mice, not specifically treating the underlying mechanisms identified in these models. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. WE-H-BRC-06: A Unified Machine-Learning Based Probabilistic Model for Automated Anomaly Detection in the Treatment Plan Data

    International Nuclear Information System (INIS)

    Chang, X; Liu, S; Kalet, A; Yang, D

    2016-01-01

    Purpose: The purpose of this work was to investigate the ability of a machine-learning based probabilistic approach to detect radiotherapy treatment plan anomalies given initial disease class information. Methods: In total, we obtained 1112 unique treatment plans with five plan parameters and disease information from a Mosaiq treatment management system database for use in the study. The plan parameters include prescription dose, fractions, fields, modality and techniques. The disease information includes disease site, and T, M and N disease stages. A Bayesian network method was employed to model the probabilistic relationships between tumor disease information, plan parameters and an anomaly flag. A Bayesian learning method with a Dirichlet prior was used to learn the joint probabilities between dependent variables in error-free plan data and data with artificially induced anomalies. In the study, we randomly sampled data with anomalies in a specified anomaly space. We tested the approach with three groups of plan anomalies – improper concurrence of values of all five plan parameters, of values of any two out of five parameters, and all single plan parameter value anomalies. In total, 16 types of plan anomalies were covered by the study. For each type, we trained an individual Bayesian network. Results: We found that the true positive rate (recall) and positive predictive value (precision) to detect concurrence anomalies of five plan parameters in new patient cases were 94.45±0.26% and 93.76±0.39%, respectively. To detect the other 15 types of plan anomalies, the average recall and precision were 93.61±2.57% and 93.78±3.54%, respectively. The computation time to detect the plan anomaly of each type in a new plan is ∼0.08 seconds. Conclusion: The proposed method for treatment plan anomaly detection was found effective in the initial tests. The results suggest that this type of model could be applied to develop plan anomaly detection tools to assist manual and…

  6. WE-H-BRC-06: A Unified Machine-Learning Based Probabilistic Model for Automated Anomaly Detection in the Treatment Plan Data

    Energy Technology Data Exchange (ETDEWEB)

    Chang, X; Liu, S [Washington University in St. Louis, St. Louis, MO (United States)]; Kalet, A [University of Washington Medical Center, Seattle, WA (United States)]; Yang, D [Washington University in St. Louis, St. Louis, MO (United States)]

    2016-06-15

    Purpose: The purpose of this work was to investigate the ability of a machine-learning based probabilistic approach to detect radiotherapy treatment plan anomalies given initial disease class information. Methods: In total, we obtained 1112 unique treatment plans with five plan parameters and disease information from a Mosaiq treatment management system database for use in the study. The plan parameters include prescription dose, fractions, fields, modality and techniques. The disease information includes disease site, and T, M and N disease stages. A Bayesian network method was employed to model the probabilistic relationships between tumor disease information, plan parameters and an anomaly flag. A Bayesian learning method with a Dirichlet prior was used to learn the joint probabilities between dependent variables in error-free plan data and data with artificially induced anomalies. In the study, we randomly sampled data with anomalies in a specified anomaly space. We tested the approach with three groups of plan anomalies – improper concurrence of values of all five plan parameters, of values of any two out of five parameters, and all single plan parameter value anomalies. In total, 16 types of plan anomalies were covered by the study. For each type, we trained an individual Bayesian network. Results: We found that the true positive rate (recall) and positive predictive value (precision) to detect concurrence anomalies of five plan parameters in new patient cases were 94.45±0.26% and 93.76±0.39%, respectively. To detect the other 15 types of plan anomalies, the average recall and precision were 93.61±2.57% and 93.78±3.54%, respectively. The computation time to detect the plan anomaly of each type in a new plan is ∼0.08 seconds. Conclusion: The proposed method for treatment plan anomaly detection was found effective in the initial tests. The results suggest that this type of model could be applied to develop plan anomaly detection tools to assist manual and…
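
    Reduced to a single conditional probability table, the idea is: learn smoothed probabilities of plan-parameter values given the disease class from error-free plans, then flag combinations whose probability falls below a threshold. A toy sketch with invented plan data (the actual model is a full Bayesian network over five plan parameters and the disease stages):

    ```python
    # Dirichlet-smoothed conditional table for plan-anomaly flagging.
    from collections import Counter

    training = [("prostate", "VMAT"), ("prostate", "VMAT"), ("prostate", "IMRT"),
                ("breast", "3DCRT"), ("breast", "3DCRT"), ("breast", "IMRT")]
    modalities = sorted({m for _, m in training})
    alpha = 1.0                                   # symmetric Dirichlet prior

    counts = Counter(training)
    site_totals = Counter(site for site, _ in training)

    def p_modality_given_site(modality, site):
        """Posterior-mean estimate of P(modality | site) with smoothing."""
        return ((counts[(site, modality)] + alpha)
                / (site_totals[site] + alpha * len(modalities)))

    def is_anomaly(site, modality, threshold=0.2):
        return p_modality_given_site(modality, site) < threshold

    print(is_anomaly("prostate", "VMAT"))   # False: a common combination
    print(is_anomaly("prostate", "3DCRT"))  # True: unseen, low-probability pair
    ```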

  8. Impairment of probabilistic reward-based learning in schizophrenia.

    Science.gov (United States)

    Weiler, Julia A; Bellebaum, Christian; Brüne, Martin; Juckel, Georg; Daum, Irene

    2009-09-01

    Recent models assume that some symptoms of schizophrenia originate from defective reward processing mechanisms. Understanding the precise nature of reward-based learning impairments might thus make an important contribution to the understanding of schizophrenia and the development of treatment strategies. The present study investigated several features of probabilistic reward-based stimulus association learning, namely the acquisition of initial contingencies, reversal learning, generalization abilities, and the effects of reward magnitude. Compared to healthy controls, individuals with schizophrenia exhibited attenuated overall performance during acquisition, whereas learning rates across blocks were similar to the rates of controls. On the group level, persons with schizophrenia were, however, unable to learn the reversal of the initial reward contingencies. Exploratory analysis of only the subgroup of individuals with schizophrenia who showed significant learning during acquisition yielded deficits in reversal learning with low reward magnitudes only. There was further evidence of a mild generalization impairment of the persons with schizophrenia in an acquired equivalence task. In summary, although there was evidence of intact basic processing of reward magnitudes, individuals with schizophrenia were impaired at using this feedback for the adaptive guidance of behavior.

  9. Reasoning with probabilistic and deterministic graphical models exact algorithms

    CERN Document Server

    Dechter, Rina

    2013-01-01

    Graphical models (e.g., Bayesian and constraint networks, influence diagrams, and Markov decision processes) have become a central paradigm for knowledge representation and reasoning in both artificial intelligence and computer science in general. These models are used to perform many reasoning tasks, such as scheduling, planning and learning, diagnosis and prediction, design, hardware and software verification, and bioinformatics. These problems can be stated as the formal tasks of constraint satisfaction and satisfiability, combinatorial optimization, and probabilistic inference. It is well

  10. Development of probabilistic thinking-oriented learning tools for probability materials at junior high school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Hermanto, Didik

    2017-08-01

    This research is a developmental study of probabilistic thinking-oriented learning tools for probability material at the ninth-grade level. The study aimed to produce a set of good probabilistic thinking-oriented learning tools. The subjects were the IX-A students of MTs Model Bangkalan. The development followed the 4-D model, modified here to three stages: define, design and develop. The teaching and learning tools consist of a lesson plan, student worksheets, teaching media and a student achievement test. The research instruments were a learning-tools validation sheet, a teacher activity sheet, a student activity sheet, a student response questionnaire and the student achievement test. The results from these instruments were analyzed descriptively to answer the research objectives. The outcome was a set of probabilistic thinking-oriented learning tools for teaching probability to ninth-grade students that was judged valid. After the tools were revised based on the validation and tried out in class, the teachers' classroom management was effective, the students' activities were good, the students' responses to the learning tools were positive, and the achievement test met the validity, sensitivity and reliability criteria. In summary, these teaching and learning tools can be used by teachers to teach probability and to develop students' probabilistic thinking.

  11. Teacher learning about probabilistic reasoning in relation to ...

    African Journals Online (AJOL)

    It was, however, the 'genuineness' of teacher learning which was the issue that the findings had to address. Therefore a speculative, hopeful strategy for affecting teacher learning in mathematics teacher education practice is to sustain disequilibrium between dichotomies such as formal and intuitive probabilistic reasoning ...

  12. Financial Markets Analysis by Probabilistic Fuzzy Modelling

    NARCIS (Netherlands)

    J.H. van den Berg (Jan); W.-M. van den Bergh (Willem-Max); U. Kaymak (Uzay)

    2003-01-01

    For successful trading in financial markets, it is important to develop financial models where one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi-Sugeno

  13. Financial markets analysis by probabilistic fuzzy modelling

    NARCIS (Netherlands)

    Berg, van den J.; Kaymak, U.; Bergh, van den W.M.

    2003-01-01

    For successful trading in financial markets, it is important to develop financial models where one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi-Sugeno (TS)

  14. Probabilistic modeling of children's handwriting

    Science.gov (United States)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    There is little work done in the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and", written in cursive style as well as hand-print, was extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook, which, with practice, is the expected result. Bayesian networks were learned from the data to enable answering various probabilistic queries, such as determining which students may continue to produce letter formations as taught during lessons in school, which students will develop different forms and/or variations of those letter formations, and the number of different types of letter formations.

  15. The Sense of Confidence during Probabilistic Learning: A Normative Account.

    Directory of Open Access Journals (Sweden)

    Florent Meyniel

    2015-06-01

    Learning in a stochastic environment consists of estimating a model from a limited amount of noisy data, and is therefore inherently uncertain. However, many classical models reduce the learning process to the updating of parameter estimates and neglect the fact that learning is also frequently accompanied by a variable "feeling of knowing" or confidence. The characteristics and the origin of these subjective confidence estimates thus remain largely unknown. Here we investigate whether, during learning, humans not only infer a model of their environment, but also derive an accurate sense of confidence from their inferences. In our experiment, humans estimated the transition probabilities between two visual or auditory stimuli in a changing environment, and reported their mean estimate and their confidence in this report. To formalize the link between both kinds of estimate and assess their accuracy in comparison to a normative reference, we derive the optimal inference strategy for our task. Our results indicate that subjects accurately track the likelihood that their inferences are correct. Learning and estimating confidence in what has been learned appear to be two intimately related abilities, suggesting that they arise from a single inference process. We show that human performance matches several properties of the optimal probabilistic inference. In particular, subjective confidence is impacted by environmental uncertainty, both at the first level (uncertainty in stimulus occurrence given the inferred stochastic characteristics) and at the second level (uncertainty due to unexpected changes in these stochastic characteristics). Confidence also increases appropriately with the number of observations within stable periods. Our results support the idea that humans possess a quantitative sense of confidence in their inferences about abstract non-sensory parameters of the environment. This ability cannot be reduced to simple heuristics, it seems
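
    A minimal sketch of the normative idea, assuming a fixed (non-changing) environment so that a Beta posterior suffices: the estimate is the posterior mean and "confidence" is read out as posterior precision. The full model in the paper also handles unexpected changes, which this sketch omits; the readout choice is illustrative.

```python
# Track P(stimulus B follows stimulus A) with a Beta posterior.
# Confidence is read out as the precision (inverse variance) of the
# posterior -- one common formalization of a "sense of confidence".
a, b = 1.0, 1.0  # uniform Beta(1,1) prior
observations = [1, 1, 0, 1, 1, 1, 0, 1]  # 1 = transition A->B occurred

for x in observations:
    a += x
    b += 1 - x
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    # Confidence grows with the number of observations, as in the study.
    print(f"estimate={mean:.2f}  confidence={1 / var:.1f}")
```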

  16. Probabilistic learning and inference in schizophrenia

    Science.gov (United States)

    Averbeck, Bruno B.; Evans, Simon; Chouhan, Viraj; Bristow, Eleanor; Shergill, Sukhwinder S.

    2010-01-01

    Patients with schizophrenia make decisions on the basis of less evidence when required to collect information to make an inference, a behavior often called jumping to conclusions. The underlying basis for this behaviour remains controversial. We examined the cognitive processes underpinning this finding by testing subjects on the beads task, which has been used previously to elicit jumping to conclusions behaviour, and a stochastic sequence learning task, with a similar decision theoretic structure. During the sequence learning task, subjects had to learn a sequence of button presses, while receiving noisy feedback on their choices. We fit a Bayesian decision making model to the sequence task and compared model parameters to the choice behavior in the beads task in both patients and healthy subjects. We found that patients did show a jumping to conclusions style; and those who picked early in the beads task tended to learn less from positive feedback in the sequence task. This favours the likelihood of patients selecting early because they have a low threshold for making decisions, and that they make choices on the basis of relatively little evidence. PMID:20810252

  17. Probabilistic learning and inference in schizophrenia.

    Science.gov (United States)

    Averbeck, Bruno B; Evans, Simon; Chouhan, Viraj; Bristow, Eleanor; Shergill, Sukhwinder S

    2011-04-01

    Patients with schizophrenia make decisions on the basis of less evidence when required to collect information to make an inference, a behavior often called jumping to conclusions. The underlying basis for this behavior remains controversial. We examined the cognitive processes underpinning this finding by testing subjects on the beads task, which has been used previously to elicit jumping to conclusions behavior, and a stochastic sequence learning task, with a similar decision theoretic structure. During the sequence learning task, subjects had to learn a sequence of button presses, while receiving noisy feedback on their choices. We fit a Bayesian decision making model to the sequence task and compared model parameters to the choice behavior in the beads task in both patients and healthy subjects. We found that patients did show a jumping to conclusions style; and those who picked early in the beads task tended to learn less from positive feedback in the sequence task. This favours the likelihood of patients selecting early because they have a low threshold for making decisions, and that they make choices on the basis of relatively little evidence. Published by Elsevier B.V.
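
    The beads task logic lends itself to a short worked example. The sketch below, using the conventional 85/15 bead ratio, shows how a lower evidence threshold produces earlier decisions, the "jumping to conclusions" pattern described above; the thresholds are illustrative, not parameters fitted by the study.

```python
import numpy as np

# Beads task: two jars with complementary bead proportions (e.g. 85/15).
# Each draw adds a fixed amount of log-odds evidence for one jar;
# committing at a low evidence threshold means deciding on few beads.
p = 0.85                      # proportion of majority-colour beads
log_lr = np.log(p / (1 - p))  # evidence per draw, in log-odds units

def decision_point(draws, threshold_logodds):
    """Index of the first draw at which |log-odds| crosses the threshold."""
    evidence = np.cumsum([log_lr if d == 1 else -log_lr for d in draws])
    crossed = np.nonzero(np.abs(evidence) >= threshold_logodds)[0]
    return int(crossed[0]) + 1 if crossed.size else None

draws = [1, 1, 0, 1, 1, 1, 1, 1]
print(decision_point(draws, threshold_logodds=1.5))  # low threshold: 1 draw
print(decision_point(draws, threshold_logodds=5.0))  # high threshold: 5 draws
```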

  18. A probabilistic maintenance model for diesel engines

    Science.gov (United States)

    Pathirana, Shan; Abeygunawardane, Saranga Kumudu

    2018-02-01

    In this paper, a probabilistic maintenance model is developed for inspection-based preventive maintenance of diesel engines, based on the practical model concepts discussed in the literature. The developed model is solved using real data obtained from inspection and maintenance histories of diesel engines and experts' views. Reliability indices and costs were calculated for the present maintenance policy of diesel engines. A sensitivity analysis was conducted to observe the effect of inspection-based preventive maintenance on the life cycle cost of diesel engines.

  19. Probabilistic Models for Solar Particle Events

    Science.gov (United States)

    Adams, James H., Jr.; Dietrich, W. F.; Xapsos, M. A.; Welton, A. M.

    2009-01-01

    Probabilistic Models of Solar Particle Events (SPEs) are used in space mission design studies to provide a description of the worst-case radiation environment that the mission must be designed to tolerate. The models determine the worst-case environment using a description of the mission and a user-specified confidence level that the provided environment will not be exceeded. This poster will focus on completing the existing suite of models by developing models for peak flux and event-integrated fluence elemental spectra for the Z>2 elements. It will also discuss methods to take into account uncertainties in the database and the uncertainties resulting from the limited number of solar particle events in the database. These new probabilistic models are based on an extensive survey of SPE measurements of peak and event-integrated elemental differential energy spectra. Attempts are made to fit the measured spectra with eight different published models. The model giving the best fit to each spectrum is chosen and used to represent that spectrum for any energy in the energy range covered by the measurements. The set of all such spectral representations for each element is then used to determine the worst case spectrum as a function of confidence level. The spectral representation that best fits these worst case spectra is found and its dependence on confidence level is parameterized. This procedure creates probabilistic models for the peak and event-integrated spectra.

  20. Probabilistic Modeling of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Rafsanjani, Hesam Mirzaei

    Wind energy is one of several energy sources in the world and a rapidly growing industry in the energy sector. When placed in offshore or onshore locations, wind turbines are exposed to wave excitations, highly dynamic wind loads and/or the wakes from other wind turbines. Therefore, most components in a wind turbine experience highly dynamic and time-varying loads. These components may fail due to wear or fatigue, and this can lead to unplanned shutdown repairs that are very costly. The design by deterministic methods using safety factors is generally unable to account for the many uncertainties. Thus, a reliability assessment should be based on probabilistic methods where stochastic modeling of failures is performed. This thesis focuses on probabilistic models and the stochastic modeling of the fatigue life of the wind turbine drivetrain. Hence, two approaches are considered for stochastic modeling …

  1. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, and even less so for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
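
    A rough sketch of the bagging-decision-tree idea behind this kind of probabilistic loss model, using scikit-learn on synthetic data; the predictors, data and interval choice are placeholders, not the BT-FLEMO model itself.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)

# Synthetic stand-ins for flood loss predictors (water depth, duration,
# building value, ...) and relative loss; real BT-FLEMO data differ.
X = rng.uniform(size=(500, 3))
y = 0.4 * X[:, 0] + 0.2 * X[:, 1] + 0.1 * rng.normal(size=500)

model = BaggingRegressor(DecisionTreeRegressor(max_depth=4),
                         n_estimators=100, random_state=0).fit(X, y)

# One prediction per tree yields a loss *distribution*, not a point value.
x_new = np.array([[0.8, 0.5, 0.3]])
per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
print(f"median loss: {np.median(per_tree):.3f}")
print(f"90% interval: [{np.quantile(per_tree, 0.05):.3f}, "
      f"{np.quantile(per_tree, 0.95):.3f}]")
```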

  2. Probabilistic forecasting of wind power generation using extreme learning machine

    DEFF Research Database (Denmark)

    Wan, Can; Xu, Zhao; Pinson, Pierre

    2014-01-01

    Accurate and reliable forecast of wind power is essential to power system operation and control. However, due to the nonstationarity of wind power series, traditional point forecasting can hardly be accurate, leading to increased uncertainties and risks for system operation. This paper proposes an extreme learning machine (ELM)-based probabilistic forecasting method for wind power generation. To account for the uncertainties in the forecasting results, several bootstrap methods have been compared for modeling the regression uncertainty, based on which the pairs bootstrap method is identified with the best performance. Consequently, a new method for prediction intervals formulation based on the ELM and the pairs bootstrap is developed. Wind power forecasting has been conducted in different seasons using the proposed approach with the historical wind power time series as the inputs alone. The results …
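
    The two ingredients, a random-hidden-layer ELM and the pairs bootstrap, are simple enough to sketch; the toy series, network size and quantile levels below are illustrative assumptions rather than the paper's configuration.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_elm(X, y, n_hidden=30):
    """Extreme learning machine: random hidden layer, least-squares output."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)
    beta, *_ = np.linalg.lstsq(H, y, rcond=None)
    return lambda Xq: np.tanh(Xq @ W + b) @ beta

# Toy wind-power-like series: predict the next value from the previous two.
t = np.arange(400)
power = 0.5 + 0.3 * np.sin(t / 15) + 0.05 * rng.normal(size=t.size)
X = np.column_stack([power[:-2], power[1:-1]])
y = power[2:]

# Pairs bootstrap: refit the ELM on resampled (x, y) pairs and take
# quantiles of the resulting ensemble as a prediction interval.
x_query = X[-1:]
preds = []
for _ in range(200):
    idx = rng.integers(0, len(y), size=len(y))
    preds.append(fit_elm(X[idx], y[idx])(x_query)[0])
lo, hi = np.quantile(preds, [0.05, 0.95])
print(f"90% prediction interval: [{lo:.3f}, {hi:.3f}]")
```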

  3. A probabilistic model of RNA conformational space

    DEFF Research Database (Denmark)

    Frellsen, Jes; Moltke, Ida; Thiim, Martin

    2009-01-01

    The increasing importance of non-coding RNA in biology and medicine has led to a growing interest in the problem of RNA 3-D structure prediction. As is the case for proteins, RNA 3-D structure prediction methods require two key ingredients: an accurate energy function and a conformational sampling … the discrete nature of the fragments necessitates the use of carefully tuned, unphysical energy functions, and their non-probabilistic nature impairs unbiased sampling. We offer a solution to the sampling problem that removes these important limitations: a probabilistic model of RNA structure that allows efficient sampling of RNA conformations in continuous space, and with associated probabilities. We show that the model captures several key features of RNA structure, such as its rotameric nature and the distribution of the helix lengths. Furthermore, the model readily generates native-like 3-D …

  4. Failure probabilistic model of CNC lathes

    International Nuclear Information System (INIS)

    Wang Yiqiang; Jia Yazhou; Yu Junyi; Zheng Yuhua; Yi Shangfeng

    1999-01-01

    A field failure analysis of computerized numerical control (CNC) lathes is described. Field failure data was collected over a period of two years on approximately 80 CNC lathes. A coding system to code failure data was devised and a failure analysis data bank of CNC lathes was established. The failure positions and subsystems, failure modes and causes were analyzed to identify the weak subsystems of a CNC lathe. Also, a probabilistic failure model of CNC lathes was analyzed by fuzzy multicriteria comprehensive evaluation.

  5. Fully probabilistic design of hierarchical Bayesian models

    Czech Academy of Sciences Publication Activity Database

    Quinn, A.; Kárný, Miroslav; Guy, Tatiana Valentine

    2016-01-01

    Roč. 369, č. 1 (2016), s. 532-547 ISSN 0020-0255 R&D Projects: GA ČR GA13-13502S Institutional support: RVO:67985556 Keywords : Fully probabilistic design * Ideal distribution * Minimum cross-entropy principle * Bayesian conditioning * Kullback-Leibler divergence * Bayesian nonparametric modelling Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 4.832, year: 2016 http://library.utia.cas.cz/separaty/2016/AS/karny-0463052.pdf

  6. A Simple Probabilistic Combat Model

    Science.gov (United States)

    2016-06-13

    The Lanchester combat model is a simple way to assess the effects of quantity and quality

  7. Probabilistic Solar Energetic Particle Models

    Science.gov (United States)

    Adams, James H., Jr.; Dietrich, William F.; Xapsos, Michael A.

    2011-01-01

    To plan and design safe and reliable space missions, it is necessary to take into account the effects of the space radiation environment. This is done by setting the goal of achieving safety and reliability with some desired level of confidence. To achieve this goal, a worst-case space radiation environment at the required confidence level must be obtained. Planning and designing then proceeds, taking into account the effects of this worst-case environment. The result will be a mission that is reliable against the effects of the space radiation environment at the desired confidence level. In this paper we will describe progress toward developing a model that provides worst-case space radiation environments at user-specified confidence levels. We will present a model for worst-case event-integrated solar proton environments that provide the worst-case differential proton spectrum. This model is based on data from IMP-8 and GOES spacecraft that provide a data base extending from 1974 to the present. We will discuss extending this work to create worst-case models for peak flux and mission-integrated fluence for protons. We will also describe plans for similar models for helium and heavier ions.

  8. A probabilistic model of RNA conformational space

    DEFF Research Database (Denmark)

    Frellsen, Jes; Moltke, Ida; Thiim, Martin

    2009-01-01

    The increasing importance of non-coding RNA in biology and medicine has led to a growing interest in the problem of RNA 3-D structure prediction. As is the case for proteins, RNA 3-D structure prediction methods require two key ingredients: an accurate energy function and a conformational sampling … the discrete nature of the fragments necessitates the use of carefully tuned, unphysical energy functions, and their non-probabilistic nature impairs unbiased sampling. We offer a solution to the sampling problem that removes these important limitations: a probabilistic model of RNA structure that allows … conformations for 9 out of 10 test structures, solely using coarse-grained base-pairing information. In conclusion, the method provides a theoretical and practical solution for a major bottleneck on the way to routine prediction and simulation of RNA structure and dynamics in atomic detail.

  9. Probabilistically modeling lava flows with MOLASSES

    Science.gov (United States)

    Richardson, J. A.; Connor, L.; Connor, C.; Gallant, E.

    2017-12-01

    Modeling lava flows through Cellular Automata methods enables a computationally inexpensive means to quickly forecast lava flow paths and ultimate areal extents. We have developed a lava flow simulator, MOLASSES, that forecasts lava flow inundation over an elevation model from a point source eruption. This modular code can be implemented in a deterministic fashion with given user inputs that will produce a single lava flow simulation. MOLASSES can also be implemented in a probabilistic fashion where given user inputs define parameter distributions that are randomly sampled to create many lava flow simulations. This probabilistic approach enables uncertainty in input data to be expressed in the model results, and MOLASSES outputs a probability map of inundation instead of a determined lava flow extent. Since the code is comparatively fast, we use it probabilistically to investigate where potential vents are located that may impact specific sites and areas, as well as the unconditional probability of lava flow inundation of sites or areas from any vent. We have validated the MOLASSES code against community-defined benchmark tests and against the real-world lava flows at Tolbachik (2012-2013) and Pico do Fogo (2014-2015). To determine the efficacy of the MOLASSES simulator at accurately and precisely mimicking the inundation area of real flows, we report goodness of fit using both model sensitivity and the Positive Predictive Value, the latter of which is a Bayesian posterior statistic. Model sensitivity is often used in evaluating lava flow simulators, as it describes how much of the lava flow was successfully modeled by the simulation. We argue that the positive predictive value is equally important in determining how good a simulator is, as it describes the percentage of the simulation space that was actually inundated by lava.

  10. Statistical learning and probabilistic prediction in music cognition: mechanisms of stylistic enculturation.

    Science.gov (United States)

    Pearce, Marcus T

    2018-05-11

    Music perception depends on internal psychological models derived through exposure to a musical culture. It is hypothesized that this musical enculturation depends on two cognitive processes: (1) statistical learning, in which listeners acquire internal cognitive models of statistical regularities present in the music to which they are exposed; and (2) probabilistic prediction based on these learned models that enables listeners to organize and process their mental representations of music. To corroborate these hypotheses, I review research that uses a computational model of probabilistic prediction based on statistical learning (the information dynamics of music (IDyOM) model) to simulate data from empirical studies of human listeners. The results show that a broad range of psychological processes involved in music perception (expectation, emotion, memory, similarity, segmentation, and meter) can be understood in terms of a single, underlying process of probabilistic prediction using learned statistical models. Furthermore, IDyOM simulations of listeners from different musical cultures demonstrate that statistical learning can plausibly predict causal effects of differential cultural exposure to musical styles, providing a quantitative model of cultural distance. Understanding the neural basis of musical enculturation will benefit from close coordination between empirical neuroimaging and computational modeling of underlying mechanisms, as outlined here. © 2018 The Authors. Annals of the New York Academy of Sciences published by Wiley Periodicals, Inc. on behalf of New York Academy of Sciences.
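
    A toy reduction of the statistical-learning-plus-prediction idea: a smoothed bigram model over pitch symbols whose negative log probability (information content) quantifies how unexpected each note is. IDyOM itself uses variable-order models over multiple melodic viewpoints; the corpus and smoothing here are illustrative.

```python
import math
from collections import defaultdict

# Learn bigram statistics from a tiny symbolic "corpus" of melodies.
corpus = ["CDEFG", "CDECD", "EFGFE", "CDEFG"]

counts = defaultdict(lambda: defaultdict(int))
for melody in corpus:
    for prev, nxt in zip(melody, melody[1:]):
        counts[prev][nxt] += 1

def prob(prev, nxt, alpha=0.5, alphabet="CDEFG"):
    """Smoothed conditional probability P(nxt | prev)."""
    total = sum(counts[prev].values()) + alpha * len(alphabet)
    return (counts[prev][nxt] + alpha) / total

# Information content (-log2 p) of each note: high values mark
# unexpected events, the quantity linked to expectation and emotion.
melody = "CDEGF"
for prev, nxt in zip(melody, melody[1:]):
    print(f"{prev}->{nxt}: {-math.log2(prob(prev, nxt)):.2f} bits")
```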

  11. Learning probabilistic features for robotic navigation using laser sensors.

    Directory of Open Access Journals (Sweden)

    Fidel Aznar

    SLAM is a popular task used by robots and autonomous vehicles to build a map of an unknown environment and, at the same time, to determine their location within the map. This paper describes a SLAM-based, probabilistic robotic system able to learn the essential features of different parts of its environment. Some previous SLAM implementations had computational complexities ranging from O(N log(N)) to O(N^2), where N is the number of map features. Unlike these methods, our approach reduces the computational complexity to O(N) by using a model to fuse the information from the sensors after applying the Bayesian paradigm. Once the training process is completed, the robot identifies and locates those areas that potentially match the sections that have been previously learned. After the training, the robot navigates and extracts a three-dimensional map of the environment using a single laser sensor. Thus, it perceives different sections of its world. In addition, in order to make our system able to be used in a low-cost robot, low-complexity algorithms that can be easily implemented on embedded processors or microcontrollers are used.

  12. Learning probabilistic features for robotic navigation using laser sensors.

    Science.gov (United States)

    Aznar, Fidel; Pujol, Francisco A; Pujol, Mar; Rizo, Ramón; Pujol, María-José

    2014-01-01

    SLAM is a popular task used by robots and autonomous vehicles to build a map of an unknown environment and, at the same time, to determine their location within the map. This paper describes a SLAM-based, probabilistic robotic system able to learn the essential features of different parts of its environment. Some previous SLAM implementations had computational complexities ranging from O(N log(N)) to O(N^2), where N is the number of map features. Unlike these methods, our approach reduces the computational complexity to O(N) by using a model to fuse the information from the sensors after applying the Bayesian paradigm. Once the training process is completed, the robot identifies and locates those areas that potentially match the sections that have been previously learned. After the training, the robot navigates and extracts a three-dimensional map of the environment using a single laser sensor. Thus, it perceives different sections of its world. In addition, in order to make our system able to be used in a low-cost robot, low-complexity algorithms that can be easily implemented on embedded processors or microcontrollers are used.

  13. Bayesian uncertainty analyses of probabilistic risk models

    International Nuclear Information System (INIS)

    Pulkkinen, U.

    1989-01-01

    Applications of Bayesian principles to the uncertainty analyses are discussed in the paper. A short review of the most important uncertainties and their causes is provided. An application of the principle of maximum entropy to the determination of Bayesian prior distributions is described. An approach based on so-called probabilistic structures is presented in order to develop a method for the quantitative evaluation of modelling uncertainties. The method is applied to a small example case. Ideas for application areas for the proposed method are discussed

  14. Probabilistic model for sterilization of food

    International Nuclear Information System (INIS)

    Chepurko, V.V.; Malinovskij, O.V.

    1986-01-01

    A probabilistic model for radiation sterilization is proposed, based on the following suppositions: (1) the initial contamination m of a volume unit of the sterilized product is described by the probability distribution q(m), (2) inactivation of a population of m microorganisms is approximated by a Bernoulli trial scheme, and (3) contamination of units of the sterilized product is independent. The possibility of approximating q(m) by a Poisson distribution is demonstrated. Diagrams are presented that permit evaluation of the dose which provides the required reliability of sterilization of food for chicken-gnotobionts
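
    Under the Poisson approximation the sterility probability has a closed form, sketched below; the contamination level and survival probability are illustrative numbers, not values from the paper.

```python
import math

# If the initial count per unit is Poisson(lam) and each microorganism
# independently survives the dose with probability s (Bernoulli scheme),
# the survivor count is Poisson(lam * s), so
#   P(sterile) = P(0 survivors) = exp(-lam * s).
lam = 50.0   # mean initial contamination per unit (illustrative)
s = 1e-4     # per-organism survival probability at the chosen dose
p_sterile = math.exp(-lam * s)
print(f"P(unit sterile) = {p_sterile:.6f}")

# Dose selection: the survival probability required for a target
# reliability, obtained by inverting the formula above.
target = 0.999999
print(f"required s <= {-math.log(target) / lam:.2e}")
```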

  15. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  16. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  17. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making under uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors from the disposal of low-level radioactive waste (LLW). Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, propagating these through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A

  18. A probabilistic model for component-based shape synthesis

    KAUST Repository

    Kalogerakis, Evangelos; Chaudhuri, Siddhartha; Koller, Daphne; Koltun, Vladlen

    2012-01-01

    represents probabilistic relationships between properties of shape components, and relates them to learned underlying causes of structural variability within the domain. These causes are treated as latent variables, leading to a compact representation

  19. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    Science.gov (United States)

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-06-24

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In the latter case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches.
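
    A minimal illustration of likelihood-based Behavioral Recognition with PFAs, assuming two hand-written single-state automata as stand-ins for the learned models; the states, actions and probabilities are invented for the example.

```python
import math

# Each toy PFA maps (state, action) -> (next_state, probability).
pfa_tidy = {("s0", "clean"): ("s0", 0.8), ("s0", "move"): ("s0", 0.2)}
pfa_wander = {("s0", "clean"): ("s0", 0.3), ("s0", "move"): ("s0", 0.7)}

def log_likelihood(pfa, trace, start="s0"):
    """Log probability of an observed action trace under a PFA."""
    state, ll = start, 0.0
    for action in trace:
        if (state, action) not in pfa:
            return -math.inf  # trace impossible under this automaton
        state, p = pfa[(state, action)]
        ll += math.log(p)
    return ll

# Behavioral Recognition: pick the strategy with the highest likelihood.
trace = ["clean", "clean", "move", "clean"]
scores = {"tidy": log_likelihood(pfa_tidy, trace),
          "wander": log_likelihood(pfa_wander, trace)}
print(max(scores, key=scores.get))  # -> "tidy"
```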

  20. A probabilistic model for component-based shape synthesis

    KAUST Repository

    Kalogerakis, Evangelos

    2012-07-01

    We present an approach to synthesizing shapes from complex domains, by identifying new plausible combinations of components from existing shapes. Our primary contribution is a new generative model of component-based shape structure. The model represents probabilistic relationships between properties of shape components, and relates them to learned underlying causes of structural variability within the domain. These causes are treated as latent variables, leading to a compact representation that can be effectively learned without supervision from a set of compatibly segmented shapes. We evaluate the model on a number of shape datasets with complex structural variability and demonstrate its application to amplification of shape databases and to interactive shape synthesis. © 2012 ACM 0730-0301/2012/08-ART55.

  1. Probabilistic Modeling of Graded Timber Material Properties

    DEFF Research Database (Denmark)

    Faber, M. H.; Köhler, J.; Sørensen, John Dalsgaard

    2004-01-01

    The probabilistic modeling of timber material characteristics is considered with special emphasis on the modeling of the effect of different quality control and selection procedures used as means for quality grading in the production line. It is shown how statistical models may be established on the basis of the same type of information which is normally collected as a part of the quality control procedures and, furthermore, how the efficiency of different control procedures may be quantified and compared. The tail behavior of the probability distributions of timber material characteristics plays an important role … such that they may readily be applied in structural reliability analysis and their format appears to be appropriate for codification purposes of quality control and selection for grading procedures.

  2. Probabilistic Modelling of Timber Material Properties

    DEFF Research Database (Denmark)

    Nielsen, Michael Havbro Faber; Köhler, Jochen; Sørensen, John Dalsgaard

    2001-01-01

    The probabilistic modeling of timber material characteristics is considered with special emphasis on the modeling of the effect of different quality control and selection procedures used as means for grading of timber in the production line. It is shown how statistical models may be established on the basis of the same type of information which is normally collected as a part of the quality control procedures and, furthermore, how the efficiency of different control procedures may be compared. The tail behavior of the probability distributions of timber material characteristics plays an important role … such that they may readily be applied in structural reliability analysis and the format appears to be appropriate for codification purposes of quality control and selection for grading procedures

  3. A Probabilistic Typhoon Risk Model for Vietnam

    Science.gov (United States)

    Haseemkunju, A.; Smith, D. F.; Brolley, J. M.

    2017-12-01

    Annually, the coastal provinces of Vietnam, from the low-lying Mekong River delta region in the southwest to the Red River Delta region in the north, are exposed to severe wind and flood risk from landfalling typhoons. On average, about two to three tropical cyclones with a maximum sustained wind speed of >=34 knots make landfall along the Vietnam coast. Recently, Typhoon Wutip (2013) crossed Central Vietnam as a category 2 typhoon causing significant damage to properties. As tropical cyclone risk is expected to increase with increases in exposure and population growth along the coastal provinces of Vietnam, insurance/reinsurance and capital markets need a comprehensive probabilistic model to assess typhoon risk in Vietnam. In 2017, CoreLogic expanded the geographical coverage of its basin-wide Western North Pacific probabilistic typhoon risk model to estimate the economic and insured losses from landfalling and by-passing tropical cyclones in Vietnam. The updated model is based on 71 years (1945-2015) of typhoon best-track data and 10,000 years of basin-wide simulated stochastic tracks covering eight countries including Vietnam. The model is capable of estimating damage from wind, storm surge and rainfall flooding using vulnerability models, which relate typhoon hazard to building damageability. The hazard and loss models are validated against past historical typhoons affecting Vietnam. Notable typhoons causing significant damage in Vietnam are Lola (1993), Frankie (1996), Xangsane (2006), and Ketsana (2009). The central and northern coastal provinces of Vietnam are more vulnerable to wind and flood hazard, while typhoon risk in the southern provinces is relatively low.

  4. Fatigue modelling according to the JCSS Probabilistic model code

    NARCIS (Netherlands)

    Vrouwenvelder, A.C.W.M.

    2007-01-01

    The Joint Committee on Structural Safety is working on a Model Code for full probabilistic design. The code consists of three major parts: Basis of Design, Load Models and Models for Material and Structural Properties. The code is intended as the operational counterpart of codes like ISO,

  5. Biological sequence analysis: probabilistic models of proteins and nucleic acids

    National Research Council Canada - National Science Library

    Durbin, Richard

    1998-01-01

    ... analysis methods are now based on principles of probabilistic modelling. Examples of such methods include the use of probabilistically derived score matrices to determine the significance of sequence alignments, the use of hidden Markov models as the basis for profile searches to identify distant members of sequence families, and the inference...

  6. Efficient probabilistic model checking on general purpose graphic processors

    NARCIS (Netherlands)

    Bosnacki, D.; Edelkamp, S.; Sulewski, D.; Pasareanu, C.S.

    2009-01-01

    We present algorithms for parallel probabilistic model checking on general purpose graphic processing units (GPGPUs). For this purpose we exploit the fact that some of the basic algorithms for probabilistic model checking rely on matrix vector multiplication. Since this kind of linear algebraic

  7. PROBABILISTIC MODEL FOR AIRPORT RUNWAY SAFETY AREAS

    Directory of Open Access Journals (Sweden)

    Stanislav SZABO

    2017-06-01

    The Laboratory of Aviation Safety and Security at CTU in Prague has recently started a project aimed at runway protection zones. The probability of exceeding the runway by a certain distance in common incident/accident scenarios (take-off/landing overrun/veer-off, landing undershoot) is being identified relative to the runway for any airport. As a result, the size and position of safety areas around runways are defined for the chosen probability. The basis for the probability calculation is a probabilistic model using statistics from more than 1400 real-world cases in which jet airplanes have been involved over the last few decades. Other scientific studies have contributed to understanding the issue and supported the model’s application to different conditions.
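
    A sketch of the basic exceedance-probability calculation such a model rests on, with synthetic overrun distances standing in for the >1400 real cases; the exponential distribution and 1% target are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic overrun distances (metres past the runway end); the study
# uses real jet incident/accident data, which is not reproduced here.
overruns = rng.exponential(scale=180.0, size=1400)

def p_exceed(d):
    """Empirical exceedance probability P(distance > d)."""
    return np.mean(overruns > d)

# Safety-area sizing: smallest distance d with P(exceed) below target.
target = 0.01
d_grid = np.arange(0, 2000, 10)
d_req = d_grid[np.argmax([p_exceed(d) < target for d in d_grid])]
print(f"area length for {target:.0%} exceedance: {d_req} m")
```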

  8. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates

  9. MODELING PROBABILISTIC CONFLICT OF TECHNOLOGICAL SYSTEMS

    Directory of Open Access Journals (Sweden)

    D. B. Desyatov

    2015-01-01

    Recently, the method of mathematical modeling has been increasingly used for the study of conflict. Its importance stems from the fact that experimental research on such conflicts is rather time-consuming and complex. However, existing approaches to the study of conflict do not take into account the stochastic nature of the systems and suffer from conceptual incompleteness. There is a need to develop models, algorithms and principles in order to assess a conflict and to choose conflict resolutions that avoid the worst conditions. For stochastic technological systems, as a utility function we consider the probability of achieving a given objective. We assume that some system S1 is in conflict with the system S2 (S2 К S1) if q(S1,S2) … a probabilistic conflict of the first kind (A К1 B) if P(A/B) … a probabilistic conflict of the second kind (A К2 B) if P(A/B)

  10. Games people play: How video games improve probabilistic learning.

    Science.gov (United States)

    Schenk, Sabrina; Lech, Robert K; Suchan, Boris

    2017-09-29

    Recent research suggests that video game playing is associated with many cognitive benefits. However, little is known about the neural mechanisms mediating such effects, especially with regard to probabilistic categorization learning, which is a widely unexplored area in gaming research. Therefore, the present study aimed to investigate the neural correlates of probabilistic classification learning in video gamers in comparison to non-gamers. Subjects were scanned in a 3T magnetic resonance imaging (MRI) scanner while performing a modified version of the weather prediction task. Behavioral data yielded evidence for better categorization performance of video gamers, particularly under conditions characterized by stronger uncertainty. Furthermore, a post-experimental questionnaire showed that video gamers had acquired higher declarative knowledge about the card combinations and the related weather outcomes. Functional imaging data revealed stronger activation clusters for video gamers in the hippocampus, the precuneus, the cingulate gyrus and the middle temporal gyrus as well as in occipital visual areas and in areas related to attentional processes. All these areas are connected with each other and represent critical nodes for semantic memory, visual imagery and cognitive control. Apart from this, and in line with previous studies, both groups showed activation in brain areas that are related to attention and executive functions as well as in the basal ganglia and in memory-associated regions of the medial temporal lobe. These results suggest that playing video games might enhance the usage of declarative knowledge as well as hippocampal involvement and enhance overall learning performance during probabilistic learning. In contrast to non-gamers, video gamers showed better categorization performance, independently of the uncertainty of the condition. Copyright © 2017 Elsevier B.V. All rights reserved.

  11. A Probabilistic Asteroid Impact Risk Model

    Science.gov (United States)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
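
    The Monte Carlo structure of a PAIR-style assessment can be sketched compactly; the sampling distributions and the energy-threshold "consequence model" below are crude placeholders for the published frequency and damage models.

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

# Sample uncertain impactor properties (distributions are illustrative
# placeholders, not the paper's calibrated inputs).
diameter = rng.lognormal(mean=np.log(30), sigma=0.8, size=n)  # metres
density = rng.uniform(2000, 4000, size=n)                     # kg/m^3
velocity = rng.uniform(12e3, 25e3, size=n)                    # m/s

# Kinetic energy per scenario, converted to megatons of TNT.
mass = density * (np.pi / 6) * diameter**3
energy_mt = 0.5 * mass * velocity**2 / 4.184e15

# Toy consequence model: flag scenarios exceeding an energy threshold,
# standing in for the blast/thermal ground-damage models.
threshold_mt = 10.0
print(f"P(damaging impact | impact) = {np.mean(energy_mt > threshold_mt):.3f}")
print(f"median energy: {np.median(energy_mt):.2f} Mt")
```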

  12. Relative risk of probabilistic category learning deficits in patients with schizophrenia and their siblings

    Science.gov (United States)

    Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A; Weinberger, Daniel R.

    2010-01-01

    Background While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502

  13. Building probabilistic graphical models with Python

    CERN Document Server

    Karkera, Kiran R

    2014-01-01

    This is a short, practical guide that allows data scientists to understand the concepts of Graphical models and enables them to try them out using small Python code snippets, without being too mathematically complicated. If you are a data scientist who knows about machine learning and want to enhance your knowledge of graphical models, such as Bayes network, in order to use them to solve real-world problems using Python libraries, this book is for you. This book is intended for those who have some Python and machine learning experience, or are exploring the machine learning field.

  14. Learning to Estimate Dynamical State with Probabilistic Population Codes.

    Directory of Open Access Journals (Sweden)

    Joseph G Makin

    2015-11-01

    Tracking moving objects, including one's own body, is a fundamental ability of higher organisms, playing a central role in many perceptual and motor tasks. While it is unknown how the brain learns to follow and predict the dynamics of objects, it is known that this process of state estimation can be learned purely from the statistics of noisy observations. When the dynamics are simply linear with additive Gaussian noise, the optimal solution is the well-known Kalman filter (KF), the parameters of which can be learned via latent-variable density estimation (the EM algorithm). The brain does not, however, directly manipulate matrices and vectors, but instead appears to represent probability distributions with the firing rates of populations of neurons, "probabilistic population codes." We show that a recurrent neural network (a modified form of an exponential family harmonium, EFH) that takes a linear probabilistic population code as input can learn, without supervision, to estimate the state of a linear dynamical system. After observing a series of population responses (spike counts) to the position of a moving object, the network learns to represent the velocity of the object and forms nearly optimal predictions about the position at the next time-step. This result builds on our previous work showing that a similar network can learn to perform multisensory integration and coordinate transformations for static stimuli. The receptive fields of the trained network also make qualitative predictions about the developing and learning brain: tuning gradually emerges for higher-order dynamical states not explicitly present in the inputs, appearing as delayed tuning for the lower-order states.
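
    For reference, the optimal estimator the network is said to approximate, a standard Kalman filter for a linear-Gaussian tracking problem, can be written in a few lines; the dynamics, noise levels and horizon are illustrative.

```python
import numpy as np

dt = 0.1
A = np.array([[1, dt], [0, 1]])   # constant-velocity dynamics
C = np.array([[1.0, 0.0]])        # observe noisy position only
Q = 1e-3 * np.eye(2)              # process noise covariance
R = np.array([[0.05]])            # observation noise covariance

x = np.zeros(2)                   # state estimate [pos, vel]
P = np.eye(2)                     # state covariance

rng = np.random.default_rng(5)
true_x = np.array([0.0, 1.0])
for _ in range(50):
    true_x = A @ true_x
    z = C @ true_x + rng.normal(0, np.sqrt(R[0, 0]))
    # Predict step
    x, P = A @ x, A @ P @ A.T + Q
    # Update step
    K = P @ C.T @ np.linalg.inv(C @ P @ C.T + R)
    x = x + K @ (z - C @ x)
    P = (np.eye(2) - K @ C) @ P

print(f"estimated velocity: {x[1]:.2f} (true 1.00)")
```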

  15. Probabilistic Modeling of Aircraft Trajectories for Dynamic Separation Volumes

    Science.gov (United States)

    Lewis, Timothy A.

    2016-01-01

    With a proliferation of new and unconventional vehicles and operations expected in the future, the ab initio airspace design will require new approaches to trajectory prediction for separation assurance and other air traffic management functions. This paper presents an approach to probabilistic modeling of the trajectory of an aircraft when its intent is unknown. The approach uses a set of feature functions to constrain a maximum entropy probability distribution based on a set of observed aircraft trajectories. This model can be used to sample new aircraft trajectories to form an ensemble reflecting the variability in an aircraft's intent. The model learning process ensures that the variability in this ensemble reflects the behavior observed in the original data set. Computational examples are presented.

  16. Probabilistic Model for Fatigue Crack Growth in Welded Bridge Details

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard; Yalamas, Thierry

    2013-01-01

    In the present paper a probabilistic model for fatigue crack growth in welded steel details in road bridges is presented. The probabilistic model takes the influence of bending stresses in the joints into account. The bending stresses can either be introduced by e.g. misalignment or redistribution... of stresses in the structure. The fatigue stress ranges are estimated from traffic measurements and a generic bridge model. Based on the probabilistic models for the resistance and load, the reliability is estimated for a typical welded steel detail. The results show that large misalignments in the joints can...
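
    A minimal Monte Carlo sketch of probabilistic fatigue crack growth under a Paris-law assumption; the distributions and parameter values below are illustrative, not the paper's calibrated model.

      import numpy as np

      rng = np.random.default_rng(1)
      n_samples = 100_000

      # Paris law da/dN = C * (dK)^m with dK = Y * dS * sqrt(pi * a).
      logC = rng.normal(-25.5, 0.3, n_samples)   # material scatter, ln-scale (assumed)
      m = 3.0
      dS = rng.normal(40.0, 8.0, n_samples)      # stress range [MPa], incl. bending (assumed)
      a0, a_crit = 0.1e-3, 10e-3                 # initial / critical crack depth [m]
      Y = 1.12                                   # geometry factor (assumed constant)

      # Closed-form cycles to failure for constant amplitude (valid for m != 2).
      C = np.exp(logC)
      N = (a_crit**(1 - m / 2) - a0**(1 - m / 2)) / (
          (1 - m / 2) * C * (Y * dS * np.sqrt(np.pi))**m)

      design_life = 1e8                          # cycles
      print("P(failure before design life) ~", np.mean(N < design_life))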

  17. Using Structured Knowledge Representation for Context-Sensitive Probabilistic Modeling

    National Research Council Canada - National Science Library

    Sakhanenko, Nikita A; Luger, George F

    2008-01-01

    We propose a context-sensitive probabilistic modeling system (COSMOS) that reasons about a complex, dynamic environment through a series of applications of smaller, knowledge-focused models representing contextually relevant information...

  18. A methodology for acquiring qualitative knowledge for probabilistic graphical models

    DEFF Research Database (Denmark)

    Kjærulff, Uffe Bro; Madsen, Anders L.

    2004-01-01

    We present a practical and general methodology that simplifies the task of acquiring and formulating qualitative knowledge for constructing probabilistic graphical models (PGMs). The methodology efficiently captures and communicates expert knowledge, and has significantly eased the model...

  19. Probabilistic Model-based Background Subtraction

    DEFF Research Database (Denmark)

    Krüger, Volker; Anderson, Jakob; Prehn, Thomas

    2005-01-01

    ... is the correlation between pixels. In this paper we introduce a model-based background subtraction approach that exploits prior knowledge of pixel correlations for clearer and better results. Model knowledge is learned from good training video data, and the data is stored for fast access in a hierarchical...
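
    A generic sketch of per-pixel probabilistic background subtraction, using a plain running Gaussian per pixel; the paper's actual contribution, modeling correlations between pixels, is not reproduced here.

      import numpy as np

      def make_bg_model(shape):
          # Running per-pixel Gaussian background statistics.
          return {"mean": np.zeros(shape), "var": np.ones(shape)}

      def update_and_segment(model, frame, alpha=0.02, k=2.5):
          # Foreground where a pixel deviates more than k std devs from its mean.
          dist = np.abs(frame - model["mean"])
          fg = dist > k * np.sqrt(model["var"])
          # Update the running statistics on background pixels only.
          bg = ~fg
          model["mean"][bg] = (1 - alpha) * model["mean"][bg] + alpha * frame[bg]
          model["var"][bg] = (1 - alpha) * model["var"][bg] + alpha * dist[bg] ** 2
          return fg

      model = make_bg_model((120, 160))
      for frame in np.random.rand(50, 120, 160):   # stand-in for training video
          mask = update_and_segment(model, frame)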

  20. Feedback-based probabilistic category learning is selectively impaired in attention/hyperactivity deficit disorder.

    Science.gov (United States)

    Gabay, Yafit; Goldfarb, Liat

    2017-07-01

    Although Attention-Deficit Hyperactivity Disorder (ADHD) is closely linked to executive function deficits, it has recently been attributed to procedural learning impairments that are quite distinct from the former. These observations challenge the ability of the executive function framework alone to account for the diverse range of symptoms observed in ADHD. A recent neurocomputational model emphasizes the role of striatal dopamine (DA) in explaining ADHD's broad range of deficits, but the link between this model and procedural learning impairments remains unclear. Significantly, feedback-based procedural learning is hypothesized to be disrupted in ADHD because of the involvement of striatal DA in this type of learning. In order to test this assumption, we employed two variants of a probabilistic category learning task known from the neuropsychological literature. Feedback-based (FB) and paired associate-based (PA) probabilistic category learning were employed in a non-medicated sample of ADHD participants and neurotypical participants. In the FB task, participants learned associations between cues and outcomes initially by guessing and subsequently through feedback indicating the correctness of the response. In the PA learning task, participants viewed the cue and its associated outcome simultaneously without making an overt response or receiving corrective feedback. In both tasks, participants were trained across 150 trials. Learning was assessed in a subsequent test without a presentation of the outcome or corrective feedback. Results revealed an interesting dissociation in which ADHD participants performed as well as control participants in the PA task, but were impaired compared with the controls in the FB task. The learning curve during FB training differed between the two groups. Taken together, these results suggest that the ability to incrementally learn by feedback is selectively disrupted in ADHD participants. These results are discussed in relation to both...

  1. Transitions in a probabilistic interface growth model

    International Nuclear Information System (INIS)

    Alves, S G; Moreira, J G

    2011-01-01

    We study a generalization of the Wolf–Villain (WV) interface growth model based on a probabilistic growth rule. In the WV model, particles are randomly deposited onto a substrate and subsequently move to a position nearby where the binding is strongest. We introduce a growth probability which is proportional to a power of the number n_i of bindings of the site i: p_i ∝ n_i^ν. Through extensive simulations, in (1 + 1) dimensions, we find three behaviors depending on the ν value: (i) if ν is small, a crossover from the Mullins–Herring to the Edwards–Wilkinson (EW) universality class; (ii) for intermediate values of ν, a crossover from the EW to the Kardar–Parisi–Zhang (KPZ) universality class; and, finally, (iii) for large ν values, the system is always in the KPZ class. In (2 + 1) dimensions, we obtain three different behaviors: (i) a crossover from the Villain–Lai–Das Sarma to the EW universality class for small ν values; (ii) the EW class is always present for intermediate ν values; and (iii) a deviation from the EW class is observed for large ν values
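
    A rough (1+1)-dimensional simulation sketch of the probabilistic growth rule p_i ∝ n_i^ν; the binding count used below is a simplified stand-in for the WV binding rule.

      import numpy as np

      rng = np.random.default_rng(2)
      L, steps, nu = 256, 10_000, 2.0
      h = np.zeros(L, dtype=int)                      # interface heights, (1+1) dims

      for _ in range(steps):
          i = rng.integers(L)                         # random deposition site
          # Candidate sites: i and its neighbours; count bindings n_j at each
          # (1 for the surface, plus each higher lateral neighbour; simplified).
          cand = [(i - 1) % L, i, (i + 1) % L]
          n = np.array([1 + (h[(j - 1) % L] > h[j]) + (h[(j + 1) % L] > h[j])
                        for j in cand], dtype=float)
          p = n**nu / (n**nu).sum()                   # growth probability p_j ∝ n_j^nu
          h[cand[rng.choice(3, p=p)]] += 1

      print("interface width:", h.std())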

  2. Online probabilistic learning with an ensemble of forecasts

    Science.gov (United States)

    Thorey, Jean; Mallet, Vivien; Chaussin, Christophe

    2016-04-01

    Our objective is to produce a calibrated weighted ensemble to forecast a univariate time series. In addition to a meteorological ensemble of forecasts, we rely on observations or analyses of the target variable. The celebrated Continuous Ranked Probability Score (CRPS) is used to evaluate the probabilistic forecasts. However, applying the CRPS to weighted empirical distribution functions (deriving from the weighted ensemble) may introduce a bias, so that minimizing the CRPS does not produce the optimal weights. Thus we propose an unbiased version of the CRPS which relies on clusters of members and is strictly proper. We adapt online learning methods for the minimization of the CRPS. These methods generate the weights associated with the members in the forecast empirical distribution function. The weights are updated before each forecast step using only past observations and forecasts. Our learning algorithms provide the theoretical guarantee that, in the long run, the CRPS of the weighted forecasts is at least as good as the CRPS of any weighted ensemble with weights constant in time. In particular, the performance of our forecast is better than that of any subset ensemble with uniform weights. A noteworthy advantage of our algorithm is that it does not require any assumption on the distributions of the observations and forecasts, either for the application or for the theoretical guarantee to hold. As an application example on meteorological forecasts for photovoltaic production integration, we show that our algorithm generates a calibrated probabilistic forecast, with significant performance improvements on probabilistic diagnostic tools (the CRPS, the reliability diagram and the rank histogram).
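
    The CRPS of a weighted empirical distribution, whose minimization the record discusses, can be written down directly; a NumPy sketch follows (the authors' unbiased, cluster-based variant is not reproduced here).

      import numpy as np

      def crps_weighted(members, weights, y):
          # CRPS of a weighted empirical distribution via the kernel form:
          # E|X - y| - 0.5 * E|X - X'|  (the second term is the source of the
          # bias discussed in the record when the weights are being learned).
          w = np.asarray(weights) / np.sum(weights)
          x = np.asarray(members)
          term1 = np.sum(w * np.abs(x - y))
          term2 = 0.5 * np.sum(w[:, None] * w[None, :]
                               * np.abs(x[:, None] - x[None, :]))
          return term1 - term2

      members = np.array([1.0, 1.5, 2.2, 3.0])   # ensemble forecasts (toy values)
      weights = np.array([0.1, 0.4, 0.4, 0.1])
      print(crps_weighted(members, weights, 1.8))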

  3. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    Science.gov (United States)

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie
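
    A sketch of the keyword-boosting idea inside a BM25-style scorer; the boost dictionary, toy document frequencies and parameter values are assumptions for illustration, not the authors' system.

      import math
      from collections import Counter

      def score(query_terms, boosts, doc_tokens, df, n_docs,
                k1=1.2, b=0.75, avgdl=100):
          # BM25-style scoring with per-term boost weights; boosting the
          # "important keywords" is the idea the record reports as most effective.
          tf = Counter(doc_tokens)
          s = 0.0
          for t in query_terms:
              if t not in tf:
                  continue
              idf = math.log((n_docs - df.get(t, 0) + 0.5) / (df.get(t, 0) + 0.5) + 1)
              norm = tf[t] * (k1 + 1) / (
                  tf[t] + k1 * (1 - b + b * len(doc_tokens) / avgdl))
              s += boosts.get(t, 1.0) * idf * norm
          return s

      df = {"rna": 120, "seq": 300, "human": 900}    # toy document frequencies
      doc = "rna seq study of human tissue".split()
      print(score(["rna", "seq", "human"], {"rna": 2.0}, doc, df, n_docs=1000))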

  4. Probabilistic electricity price forecasting with variational heteroscedastic Gaussian process and active learning

    International Nuclear Information System (INIS)

    Kou, Peng; Liang, Deliang; Gao, Lin; Lou, Jianyong

    2015-01-01

    Highlights:
    • A novel active learning model for probabilistic electricity price forecasting.
    • A heteroscedastic Gaussian process that captures the local volatility of the electricity price.
    • Variational Bayesian learning that avoids over-fitting.
    • An active learning algorithm that reduces the computational effort.

    Abstract: Electricity price forecasting is essential for market participants in their decision making. Nevertheless, the accuracy of such forecasting cannot be guaranteed due to the high variability of the price data. For this reason, in many cases, rather than merely point forecasting results, market participants are more interested in probabilistic price forecasting results, i.e., the prediction intervals of the electricity price. Focusing on this issue, this paper proposes a new model for probabilistic electricity price forecasting. This model is based on the active learning technique and the variational heteroscedastic Gaussian process (VHGP). It provides heteroscedastic Gaussian prediction intervals, which effectively quantify the heteroscedastic uncertainties associated with the price data. Because the high computational effort of VHGP hinders its application to large-scale electricity price forecasting tasks, we design an active learning algorithm to select a most informative training subset from the whole available training set. By constructing the forecasting model on this smaller subset, the computational effort can be significantly reduced. In this way, the practical applicability of the proposed model is enhanced. The forecasting performance and the computational time of the proposed model are evaluated using real-world electricity price data obtained from the ANEM, PJM, and New England ISO.
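
    A rough sketch of the variance-driven active learning loop around a Gaussian process, using scikit-learn's (homoscedastic) GP as a stand-in for the VHGP, which it does not provide; the query rule and the synthetic data are illustrative assumptions.

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(3)
      X_pool = np.linspace(0, 10, 400)[:, None]       # candidate training inputs
      y_pool = np.sin(X_pool).ravel() + 0.1 * rng.normal(size=400)

      idx = list(rng.choice(400, size=5, replace=False))
      gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel(), normalize_y=True)

      for _ in range(20):                             # active learning loop
          gp.fit(X_pool[idx], y_pool[idx])
          _, std = gp.predict(X_pool, return_std=True)
          idx.append(int(np.argmax(std)))             # query the most informative point

      mean, std = gp.predict(X_pool, return_std=True)
      lower, upper = mean - 1.96 * std, mean + 1.96 * std   # prediction intervals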

  5. Causal Learning from Probabilistic Events in 24-Month-Olds: An Action Measure

    Science.gov (United States)

    Waismeyer, Anna; Meltzoff, Andrew N.; Gopnik, Alison

    2015-01-01

    How do young children learn about causal structure in an uncertain and variable world? We tested whether they can use observed probabilistic information to solve causal learning problems. In two experiments, 24-month-olds observed an adult produce a probabilistic pattern of causal evidence. The toddlers then were given an opportunity to design…

  6. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    DEFF Research Database (Denmark)

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

    In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurately assessing business failure prediction, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful... studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point-of-view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing it against the Support Vector Machines (SVM) and the Logistic Regression (LR)... Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...

  7. A Comprehensive Probabilistic Framework to Learn Air Data from Surface Pressure Measurements

    Directory of Open Access Journals (Sweden)

    Ankur Srivastava

    2015-01-01

    Use of probabilistic techniques has been demonstrated to learn air data parameters from surface pressure measurements. Integration of numerical models with wind tunnel data and sequential experiment design of wind tunnel runs has been demonstrated in the calibration of a flush air data sensing (FADS) anemometer system. The development and implementation of a metamodeling method, Sequential Function Approximation (SFA), which lies at the core of the discussed probabilistic framework, are presented. SFA is presented as a tool capable of nonlinear statistical inference, uncertainty reduction by fusion of data with physical models of variable fidelity, and sequential experiment design. This work presents the development and application of these tools in the calibration of FADS for a Runway Assisted Landing Site (RALS) control tower. The multidisciplinary approach is, however, general in nature and is potentially applicable to a variety of mechanical and aerospace engineering problems.

  8. A probabilistic model for snow avalanche occurrence

    Science.gov (United States)

    Perona, P.; Miescher, A.; Porporato, A.

    2009-04-01

    Avalanche hazard forecasting is an important issue in relation to the protection of urbanized environments, ski resorts and ski-touring alpinists. A critical point is to predict the conditions that trigger the snow mass instability determining the onset and the size of avalanches. On steep terrain the risk of avalanches is known to be related to preceding consistent snowfall events and to subsequent changes in the local climatic conditions. Regression analysis has shown that avalanche occurrence indeed correlates with the amount of snow fallen in three consecutive snowing days and with the state of the settled snow at the ground. Moreover, since different types of avalanches may occur as a result of the interactions of different factors, the process of snow avalanche formation is inherently complex and has some degree of unpredictability. For this reason, although several models assess the risk of avalanche by accounting for all the involved processes in great detail, a high margin of uncertainty invariably remains. In this work, we explicitly describe such unpredictable behaviour with an intrinsic noise affecting the processes leading to snow instability. Eventually, this sets the basis for a minimalist stochastic model, which allows us to investigate the avalanche dynamics and its statistical properties. We employ a continuous-time process with stochastic jumps (snowfalls), deterministic decay (snowmelt and compaction) and state-dependent avalanche occurrence (renewals) as a minimalist model for the determination of avalanche size and the related inter-occurrence times. The physics leading to avalanches is simplified to the extent that only meteorological data and terrain data are necessary to estimate avalanche danger. We explore the analytical formulation of the process and the properties of the probability density function of the avalanche process variables. We also discuss the probabilistic link between avalanche size and the preceding snowfall event and...
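
    A minimal simulation sketch of the described jump process: stochastic snowfall jumps, deterministic decay, and a state-dependent avalanche renewal. All parameter values are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(4)
      dt, days = 1.0, 3650
      depth, thresh = 0.0, 1.5                 # snow depth and instability threshold [m]
      p_snow, mean_fall, decay = 0.15, 0.25, 0.02   # illustrative parameters
      sizes, intertimes, last = [], [], 0

      for t in range(days):
          if rng.random() < p_snow:            # stochastic jump: snowfall
              depth += rng.exponential(mean_fall)
          depth *= np.exp(-decay * dt)         # deterministic decay: melt/compaction
          if depth > thresh:                   # state-dependent renewal: avalanche
              sizes.append(depth)
              intertimes.append(t - last)
              last = t
              depth = 0.0

      print("mean size:", np.mean(sizes), "mean inter-time:", np.mean(intertimes))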

  9. Probabilistic modeling of discourse-aware sentence processing.

    Science.gov (United States)

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  10. Improved probabilistic inference as a general learning mechanism with action video games.

    Science.gov (United States)

    Green, C Shawn; Pouget, Alexandre; Bavelier, Daphne

    2010-09-14

    Action video game play benefits performance in an array of sensory, perceptual, and attentional tasks that go well beyond the specifics of game play [1-9]. That a training regimen may induce improvements in so many different skills is notable because the majority of studies on training-induced learning report improvements on the trained task but limited transfer to other, even closely related, tasks ([10], but see also [11-13]). Here we ask whether improved probabilistic inference may explain such broad transfer. By using a visual perceptual decision making task [14, 15], the present study shows for the first time that action video game experience does indeed improve probabilistic inference. A neural model of this task [16] establishes how changing a single parameter, namely the strength of the connections between the neural layer providing the momentary evidence and the layer integrating the evidence over time, captures improvements in action-gamers' behavior. These results were established in a visual, but also in a novel auditory, task, indicating generalization across modalities. Thus, improved probabilistic inference provides a general mechanism for why action video game playing enhances performance in a wide variety of tasks. In addition, this mechanism may serve as a signature of training regimens that are likely to produce transfer of learning. Copyright © 2010 Elsevier Ltd. All rights reserved.

  11. Probabilistic language models in cognitive neuroscience: Promises and pitfalls.

    Science.gov (United States)

    Armeni, Kristijan; Willems, Roel M; Frank, Stefan L

    2017-12-01

    Cognitive neuroscientists of language comprehension study how neural computations relate to cognitive computations during comprehension. On the cognitive part of the equation, it is important that the computations and processing complexity are explicitly defined. Probabilistic language models can be used to give a computationally explicit account of language complexity during comprehension. Whereas such models have so far predominantly been evaluated against behavioral data, only recently have the models been used to explain neurobiological signals. Measures obtained from these models emphasize the probabilistic, information-processing view of language understanding and provide a set of tools that can be used for testing neural hypotheses about language comprehension. Here, we provide a cursory review of the theoretical foundations and example neuroimaging studies employing probabilistic language models. We highlight the advantages and potential pitfalls of this approach and indicate avenues for future research. Copyright © 2017 Elsevier Ltd. All rights reserved.
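
    One standard complexity measure derived from such models is word surprisal; a toy bigram sketch follows (the studies reviewed use far larger corpora and stronger language models).

      import math
      from collections import Counter

      corpus = "the dog chased the cat the cat chased the mouse".split()

      bigrams = Counter(zip(corpus, corpus[1:]))
      unigrams = Counter(corpus)
      vocab = len(unigrams)

      def surprisal(prev, word):
          # Surprisal -log2 P(word | prev) with add-one smoothing; word-by-word
          # values like these are regressed against neural signals.
          p = (bigrams[(prev, word)] + 1) / (unigrams[prev] + vocab)
          return -math.log2(p)

      print(surprisal("the", "cat"), surprisal("the", "mouse"))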

  12. Model-based machine learning.

    Science.gov (United States)

    Bishop, Christopher M

    2013-02-13

    Several decades of research in the field of machine learning have resulted in a multitude of different algorithms for solving a broad range of problems. To tackle a new application, a researcher typically tries to map their problem onto one of these existing methods, often influenced by their familiarity with specific algorithms and by the availability of corresponding software implementations. In this study, we describe an alternative methodology for applying machine learning, in which a bespoke solution is formulated for each new application. The solution is expressed through a compact modelling language, and the corresponding custom machine learning code is then generated automatically. This model-based approach offers several major advantages, including the opportunity to create highly tailored models for specific scenarios, as well as rapid prototyping and comparison of a range of alternative models. Furthermore, newcomers to the field of machine learning do not have to learn about the huge range of traditional methods, but instead can focus their attention on understanding a single modelling environment. In this study, we show how probabilistic graphical models, coupled with efficient inference algorithms, provide a very flexible foundation for model-based machine learning, and we outline a large-scale commercial application of this framework involving tens of millions of users. We also describe the concept of probabilistic programming as a powerful software environment for model-based machine learning, and we discuss a specific probabilistic programming language called Infer.NET, which has been widely used in practical applications.

  13. HMM_Model-Checker for probabilistic verification (HMM_Model-Checker pour la vérification probabiliste) ...

    African Journals Online (AJOL)

    ASSIA

    ... probabilistic verification – Hubble Telescope. Abstract: Probabilistic verification for embedded systems continues to attract more and more followers in the research community. Given a probabilistic model, a formula of temporal logic describing a property of a system, and an exploration algorithm to check whether the property is satisfied ...

  14. Learning Markov Decision Processes for Model Checking

    DEFF Research Database (Denmark)

    Mao, Hua; Chen, Yingke; Jaeger, Manfred

    2012-01-01

    Constructing an accurate system model for formal model verification can be both resource demanding and time-consuming. To alleviate this shortcoming, algorithms have been proposed for automatically learning system models based on observed system behaviors. In this paper we extend the algorithm... on learning probabilistic automata to reactive systems, where the observed system behavior is in the form of alternating sequences of inputs and outputs. We propose an algorithm for automatically learning a deterministic labeled Markov decision process model from the observed behavior of a reactive system... The proposed learning algorithm is adapted from algorithms for learning deterministic probabilistic finite automata, and extended to include both probabilistic and nondeterministic transitions. The algorithm is empirically analyzed and evaluated by learning system models of slot machines. The evaluation...

  15. Probabilistic Electricity Price Forecasting Models by Aggregation of Competitive Predictors

    Directory of Open Access Journals (Sweden)

    Claudio Monteiro

    2018-04-01

    This article presents original probabilistic price forecasting meta-models (PPFMCP models), by aggregation of competitive predictors, for day-ahead hourly probabilistic price forecasting. The best twenty predictors of the EEM2016 EPF competition are used to create ensembles of hourly spot price forecasts. For each hour, the parameter values of the probability density function (PDF) of a Beta distribution for the output variable (hourly price) can be directly obtained from the expected value and variance associated with the ensemble for that hour, using three aggregation strategies of predictor forecasts corresponding to three PPFMCP models. A Reliability Indicator (RI) and a Loss function Indicator (LI) are also introduced to give a measure of uncertainty of probabilistic price forecasts. The three PPFMCP models were satisfactorily applied to the real-world case study of the Iberian Electricity Market (MIBEL). Results from PPFMCP models showed that PPFMCP model 2, which uses aggregation by weight values according to daily ranks of predictors, was the best probabilistic meta-model from the point of view of mean absolute error, as well as of RI and LI. PPFMCP model 1, which uses the averaging of predictor forecasts, was the second best meta-model. PPFMCP models allow risk decisions based on the price to be evaluated.
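
    The Beta-parameter construction from an ensemble's expected value and variance can be done by the method of moments; in the sketch below the price rescaling (an assumed cap of 100 EUR/MWh) and the weights are illustrative assumptions.

      import numpy as np

      def beta_params_from_ensemble(forecasts, weights=None):
          # Method-of-moments Beta(a, b) from the ensemble mean and variance of
          # prices rescaled to [0, 1]; mirrors the PDF construction in the record.
          mu = np.average(forecasts, weights=weights)
          var = np.average((forecasts - mu) ** 2, weights=weights)
          common = mu * (1 - mu) / var - 1.0
          return mu * common, (1 - mu) * common

      prices = np.array([42.0, 45.5, 47.0, 50.2, 44.1])   # EUR/MWh, one hour (toy)
      scaled = prices / 100.0                             # assumed price cap of 100
      a, b = beta_params_from_ensemble(scaled, weights=[1, 2, 2, 1, 1])
      print(a, b)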

  16. Undecidability of model-checking branching-time properties of stateless probabilistic pushdown process

    OpenAIRE

    Lin, T.

    2014-01-01

    In this paper, we settle a problem in probabilistic verification of infinite-state processes (specifically, probabilistic pushdown processes). We show that model checking stateless probabilistic pushdown processes (pBPA) against probabilistic computational tree logic (PCTL) is undecidable.

  17. Generative probabilistic models extend the scope of inferential structure determination

    DEFF Research Database (Denmark)

    Olsson, Simon; Boomsma, Wouter; Frellsen, Jes

    2011-01-01

    Conventional methods for protein structure determination from NMR data rely on the ad hoc combination of physical forcefields and experimental data, along with heuristic determination of free parameters such as the weight of experimental data relative to a physical forcefield. Recently, a theoretically... We demonstrate that the use of generative probabilistic models instead of physical forcefields in the Bayesian formalism is not only conceptually attractive, but also improves precision and efficiency. Our results open new vistas for the use of sophisticated probabilistic models of biomolecular structure...

  18. A generative, probabilistic model of local protein structure

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Mardia, Kanti V.; Taylor, Charles C.

    2008-01-01

    Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative... conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state...

  19. Approximating methods for intractable probabilistic models: Applications in neuroscience

    DEFF Research Database (Denmark)

    Højen-Sørensen, Pedro

    2002-01-01

    This thesis investigates various methods for carrying out approximate inference in intractable probabilistic models. By capturing the relationships between random variables, the framework of graphical models hints at which sets of random variables pose a problem to the inferential step. The appro...

  20. Probabilistic Load Models for Simulating the Impact of Load Management

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Bak-Jensen, Birgitte; Chen, Zhe

    2009-01-01

    This paper analyzes a distribution system load time series through the autocorrelation coefficient, power spectral density, probabilistic distribution and quantile values. Two probabilistic load models, i.e. the joint-normal model and the autoregressive model of order 12 (AR(12)), are proposed... to simulate the impact of load management. The joint-normal model is superior in modeling the tail region of the hourly load distribution and implementing the change of hourly standard deviation, whereas the AR(12) model requires far fewer parameters and is superior in modeling the autocorrelation... It is concluded that the AR(12) model is favored with limited measurement data and that the joint-normal model may provide better results with a large data set. Both models can be applied in general to model load time series and used in time-sequential simulation of distribution system planning.
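
    An AR(12) load model of the kind compared here can be fitted in a few lines; a sketch on synthetic hourly load data, assuming the statsmodels library.

      import numpy as np
      from statsmodels.tsa.ar_model import AutoReg

      rng = np.random.default_rng(5)
      t = np.arange(24 * 365)
      # Synthetic hourly load with daily periodicity, a stand-in for measurements.
      load = 100 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 3, t.size)

      model = AutoReg(load, lags=12).fit()      # AR(12) load model
      sim = model.predict(start=len(load), end=len(load) + 48)  # 2-day simulation
      print(model.params[:3], sim[:5])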

  1. Probabilistic Motor Sequence Yields Greater Offline and Less Online Learning than Fixed Sequence.

    Science.gov (United States)

    Du, Yue; Prashad, Shikha; Schoenbrun, Ilana; Clark, Jane E

    2016-01-01

    It is well acknowledged that motor sequences can be learned quickly through online learning. Subsequently, the initial acquisition of a motor sequence is boosted or consolidated by offline learning. However, little is known about whether offline learning can drive the fast learning of motor sequences (i.e., initial sequence learning in the first training session). To examine offline learning in the fast learning stage, we asked four groups of young adults to perform the serial reaction time (SRT) task with either a fixed or probabilistic sequence and with or without preliminary knowledge (PK) of the presence of a sequence. The sequence and PK were manipulated to emphasize either procedural (probabilistic sequence; no preliminary knowledge (NPK)) or declarative (fixed sequence; with PK) memory, which were found to either facilitate or inhibit offline learning. In the SRT task, there were six learning blocks with a 2 min break between each consecutive block. Throughout the session, stimuli followed the same fixed or probabilistic pattern except in Block 5, in which stimuli appeared in a random order. We found that PK facilitated the learning of a fixed sequence, but not a probabilistic sequence. In addition to overall learning measured by the mean reaction time (RT), we examined the progressive changes in RT within and between blocks (i.e., online and offline learning, respectively). It was found that the two groups who performed the fixed sequence, regardless of PK, showed greater online learning than the other two groups who performed the probabilistic sequence. The groups who performed the probabilistic sequence, regardless of PK, did not display online learning, as indicated by a decline in performance within the learning blocks. However, they did demonstrate remarkably greater offline improvement in RT, which suggests that they were learning the probabilistic sequence offline. These results suggest that in the SRT task, the fast acquisition of a motor sequence is driven...

  2. Review of probabilistic models of the strength of composite materials

    International Nuclear Information System (INIS)

    Sutherland, L.S.; Guedes Soares, C.

    1997-01-01

    The available literature concerning probabilistic models describing the strength of composite materials has been reviewed to highlight the important aspects of this behaviour which will be of interest to the modelling and analysis of a complex system. The success with which these theories have been used to predict experimental results has been discussed. Since the brittle reinforcement phase largely controls the strength of composites, the probabilistic theories used to describe the strength of brittle materials, fibres and bundles of fibres have been detailed. The use of these theories to predict the strength of composite materials has been considered, along with further developments incorporating the damage accumulation observed in the failure of such materials. Probabilistic theories of the strength of short-fibre composites have been outlined. Emphasis has been placed throughout on straightforward engineering explanations of these theories and how they may be used, rather than providing comprehensive statistical descriptions

  3. Probabilistic Modelling of Robustness and Resilience of Power Grid Systems

    DEFF Research Database (Denmark)

    Qin, Jianjun; Sansavini, Giovanni; Nielsen, Michael Havbro Faber

    2017-01-01

    The present paper proposes a framework for the modeling and analysis of the resilience of networked power grid systems. A probabilistic systems model is proposed based on the JCSS Probabilistic Model Code (JCSS, 2001) and deterministic engineering systems modeling techniques such as the DC flow model... cascading failure event scenarios (Nan and Sansavini, 2017). The concept of direct and indirect consequences proposed by the Joint Committee on Structural Safety (JCSS, 2008) is utilized to model the associated consequences. To facilitate a holistic modeling of robustness and resilience, and to identify how these characteristics may be optimized, the power grid system is finally interlinked with its fundamental interdependent systems, i.e. a societal model, a regulatory system and control feedback loops. The proposed framework is exemplified with reference to optimal decision support for resilience...

  4. On the logical specification of probabilistic transition models

    CSIR Research Space (South Africa)

    Rens, G

    2013-05-01

    We investigate the requirements for specifying the behaviors of actions in a stochastic domain. That is, we propose how to write sentences in a logical language to capture a model of probabilistic transitions due to the execution of actions of some...

  5. Probabilistic predictive modelling of carbon nanocomposites for medical implants design.

    Science.gov (United States)

    Chua, Matthew; Chui, Chee-Kong

    2015-04-01

    Modelling of the mechanical properties of carbon nanocomposites based on input variables like percentage weight of Carbon Nanotubes (CNT) inclusions is important for the design of medical implants and other structural scaffolds. Current constitutive models for the mechanical properties of nanocomposites may not predict well due to differences in conditions, fabrication techniques and inconsistencies in reagents properties used across industries and laboratories. Furthermore, the mechanical properties of the designed products are not deterministic, but exist as a probabilistic range. A predictive model based on a modified probabilistic surface response algorithm is proposed in this paper to address this issue. Tensile testing of three groups of different CNT weight fractions of carbon nanocomposite samples displays scattered stress-strain curves, with the instantaneous stresses assumed to vary according to a normal distribution at a specific strain. From the probabilistic density function of the experimental data, a two factors Central Composite Design (CCD) experimental matrix based on strain and CNT weight fraction input with their corresponding stress distribution was established. Monte Carlo simulation was carried out on this design matrix to generate a predictive probabilistic polynomial equation. The equation and method was subsequently validated with more tensile experiments and Finite Element (FE) studies. The method was subsequently demonstrated in the design of an artificial tracheal implant. Our algorithm provides an effective way to accurately model the mechanical properties in implants of various compositions based on experimental data of samples. Copyright © 2015 Elsevier Ltd. All rights reserved.

  6. ISSUES ASSOCIATED WITH PROBABILISTIC FAILURE MODELING OF DIGITAL SYSTEMS

    International Nuclear Information System (INIS)

    CHU, T.L.; MARTINEZ-GURIDI, G.; LIHNER, J.; OVERLAND, D.

    2004-01-01

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process of instrumentation and control (I and C) systems is based on deterministic requirements, e.g., single failure criteria, and defense in depth and diversity. Probabilistic considerations can be used as supplements to the deterministic process. The National Research Council has recommended development of methods for estimating failure probabilities of digital systems, including commercial off-the-shelf (COTS) equipment, for use in probabilistic risk assessment (PRA). NRC staff has developed informal qualitative and quantitative requirements for PRA modeling of digital systems. Brookhaven National Laboratory (BNL) has performed a review of the-state-of-the-art of the methods and tools that can potentially be used to model digital systems. The objectives of this paper are to summarize the review, discuss the issues associated with probabilistic modeling of digital systems, and identify potential areas of research that would enhance the state of the art toward a satisfactory modeling method that could be integrated with a typical probabilistic risk assessment

  7. Probabilistic finite element modeling of waste rollover

    International Nuclear Information System (INIS)

    Khaleel, M.A.; Cofer, W.F.; Al-fouqaha, A.A.

    1995-09-01

    Stratification of the wastes in many Hanford storage tanks has resulted in sludge layers which are capable of retaining gases formed by chemical and/or radiolytic reactions. As the gas is produced, the mechanisms of gas storage evolve until the resulting buoyancy in the sludge leads to instability, at which point the sludge "rolls over" and a significant volume of gas is suddenly released. Because the releases may contain flammable gases, these episodes of release are potentially hazardous. Mitigation techniques are desirable for more controlled releases at more frequent intervals. To aid the mitigation efforts, a methodology for predicting sludge rollover at specific times is desired. This methodology would then provide a rational basis for the development of a schedule for the mitigation procedures. In addition, a knowledge of the sensitivity of the sludge rollovers to various physical and chemical properties within the tanks would provide direction for efforts to reduce the frequency and severity of these events. In this report, the use of probabilistic finite element analyses for computing the probability of rollover and the sensitivity of rollover probability to various parameters is described

  8. Brain networks for confidence weighting and hierarchical inference during probabilistic learning.

    Science.gov (United States)

    Meyniel, Florent; Dehaene, Stanislas

    2017-05-09

    Learning is difficult when the world fluctuates randomly and ceaselessly. Classical learning algorithms, such as the delta rule with constant learning rate, are not optimal. Mathematically, the optimal learning rule requires weighting prior knowledge and incoming evidence according to their respective reliabilities. This "confidence weighting" implies the maintenance of an accurate estimate of the reliability of what has been learned. Here, using fMRI and an ideal-observer analysis, we demonstrate that the brain's learning algorithm relies on confidence weighting. While in the fMRI scanner, human adults attempted to learn the transition probabilities underlying an auditory or visual sequence, and reported their confidence in those estimates. They knew that these transition probabilities could change simultaneously at unpredicted moments, and therefore that the learning problem was inherently hierarchical. Subjective confidence reports tightly followed the predictions derived from the ideal observer. In particular, subjects managed to attach distinct levels of confidence to each learned transition probability, as required by Bayes-optimal inference. Distinct brain areas tracked the likelihood of new observations given current predictions, and the confidence in those predictions. Both signals were combined in the right inferior frontal gyrus, where they operated in agreement with the confidence-weighting model. This brain region also presented signatures of a hierarchical process that disentangles distinct sources of uncertainty. Together, our results provide evidence that the sense of confidence is an essential ingredient of probabilistic learning in the human brain, and that the right inferior frontal gyrus hosts a confidence-based statistical learning algorithm for auditory and visual sequences.

  9. Brain networks for confidence weighting and hierarchical inference during probabilistic learning

    Science.gov (United States)

    Meyniel, Florent; Dehaene, Stanislas

    2017-01-01

    Learning is difficult when the world fluctuates randomly and ceaselessly. Classical learning algorithms, such as the delta rule with constant learning rate, are not optimal. Mathematically, the optimal learning rule requires weighting prior knowledge and incoming evidence according to their respective reliabilities. This “confidence weighting” implies the maintenance of an accurate estimate of the reliability of what has been learned. Here, using fMRI and an ideal-observer analysis, we demonstrate that the brain’s learning algorithm relies on confidence weighting. While in the fMRI scanner, human adults attempted to learn the transition probabilities underlying an auditory or visual sequence, and reported their confidence in those estimates. They knew that these transition probabilities could change simultaneously at unpredicted moments, and therefore that the learning problem was inherently hierarchical. Subjective confidence reports tightly followed the predictions derived from the ideal observer. In particular, subjects managed to attach distinct levels of confidence to each learned transition probability, as required by Bayes-optimal inference. Distinct brain areas tracked the likelihood of new observations given current predictions, and the confidence in those predictions. Both signals were combined in the right inferior frontal gyrus, where they operated in agreement with the confidence-weighting model. This brain region also presented signatures of a hierarchical process that disentangles distinct sources of uncertainty. Together, our results provide evidence that the sense of confidence is an essential ingredient of probabilistic learning in the human brain, and that the right inferior frontal gyrus hosts a confidence-based statistical learning algorithm for auditory and visual sequences. PMID:28439014

  10. A Probabilistic Model of Social Working Memory for Information Retrieval in Social Interactions.

    Science.gov (United States)

    Li, Liyuan; Xu, Qianli; Gan, Tian; Tan, Cheston; Lim, Joo-Hwee

    2018-05-01

    Social working memory (SWM) plays an important role in navigating social interactions. Inspired by studies in psychology, neuroscience, cognitive science, and machine learning, we propose a probabilistic model of SWM to mimic human social intelligence for personal information retrieval (IR) in social interactions. First, we establish a semantic hierarchy as social long-term memory to encode personal information. Next, we propose a semantic Bayesian network as the SWM, which integrates the cognitive functions of accessibility and self-regulation. One subgraphical model implements the accessibility function to learn the social consensus about IR based on social information concepts, clustering, social context, and similarity between persons. Beyond accessibility, one more layer is added to simulate the function of self-regulation, performing personal adaptation to the consensus based on human personality. Two learning algorithms are proposed to train the probabilistic SWM model on a raw dataset of high uncertainty and incompleteness. One is an efficient learning algorithm based on Newton's method, and the other is a genetic algorithm. Systematic evaluations show that the proposed SWM model is able to learn human social intelligence effectively and outperforms the baseline Bayesian cognitive model. Toward real-world applications, we implement our model on Google Glass as a wearable assistant for social interaction.

  11. Modelling fog in probabilistic consequence assessment

    International Nuclear Information System (INIS)

    Underwood, B.Y.

    1993-02-01

    Earlier work examined the potential influence of foggy weather conditions on the probabilistic assessment of the consequences of accidental releases of radioactive material to the atmosphere (PCA), in particular the impact of a fraction of the released aerosol becoming incorporated into droplets. A major uncertainty emerging from the initial scoping study concerned estimation of the fraction of the released material that would be taken up into droplets. An objective is to construct a method for handling, in a PCA context, the effect of fog on deposition, basing the method on the experience gained from prior investigations. There are two aspects to explicitly including the effect of fog in PCA: estimating the probability of occurrence of various types of foggy condition and calculating the impact on the conventional end-points of consequence assessment. For the first, a brief outline is given of the use of meteorological data by PCA computer codes, followed by a discussion of some routinely-recorded meteorological parameters that are pertinent to fog, such as the present-weather code and horizontal visibility. Four stylized scenarios are defined to cover a wide range of situations in which particle growth by uptake of water may have an important impact on deposition. A description is then given of the way in which routine meteorological data could be used to flag the presence of each of these conditions in the meteorological data file used by the PCA code. The approach developed to calculate the impact on deposition is pitched at a level of complexity appropriate to the PCA context, reflects the physical constraints of the system and accounts for the specific characteristics of the released aerosol. (Author)

  12. An Individual-based Probabilistic Model for Fish Stock Simulation

    Directory of Open Access Journals (Sweden)

    Federico Buti

    2010-08-01

    We define an individual-based probabilistic model of sole (Solea solea) behaviour. The individual model is given in terms of an Extended Probabilistic Discrete Timed Automaton (EPDTA), a new formalism that is introduced in the paper and that is shown to be interpretable as a Markov decision process. A given EPDTA model can be probabilistically model-checked by giving a suitable translation into syntax accepted by existing model-checkers. In order to simulate the dynamics of a given population of soles in different environmental scenarios, an agent-based simulation environment is defined in which each agent implements the behaviour of the given EPDTA model. By varying the probabilities and the characteristic functions embedded in the EPDTA model it is possible to represent different scenarios and to tune the model itself by comparing the results of the simulations with real data about the sole stock in the North Adriatic sea, available from the recent project SoleMon. The simulator is presented and made available for adaptation to other species.

  13. Formulation of probabilistic models of protein structure in atomic detail using the reference ratio method

    DEFF Research Database (Denmark)

    Valentin, Jan B.; Andreetta, Christian; Boomsma, Wouter

    2014-01-01

    We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length s... The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications. © 2013 Wiley Periodicals, Inc.

  14. Systems analysis approach to probabilistic modeling of fault trees

    International Nuclear Information System (INIS)

    Bartholomew, R.J.; Qualls, C.R.

    1985-01-01

    A method of probabilistic modeling of fault tree logic combined with stochastic process theory (Markov modeling) has been developed. Systems are then quantitatively analyzed probabilistically in terms of their failure mechanisms including common cause/common mode effects and time dependent failure and/or repair rate effects that include synergistic and propagational mechanisms. The modeling procedure results in a state vector set of first order, linear, inhomogeneous, differential equations describing the time dependent probabilities of failure described by the fault tree. The solutions of this Failure Mode State Variable (FMSV) model are cumulative probability distribution functions of the system. A method of appropriate synthesis of subsystems to form larger systems is developed and applied to practical nuclear power safety systems
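
    The state-vector formulation described here, first-order linear ODEs for time-dependent failure probabilities, can be sketched for a single repairable component; the rates below are illustrative assumptions.

      import numpy as np
      from scipy.integrate import solve_ivp

      lam, mu = 1e-3, 1e-1     # failure and repair rates [1/h], illustrative
      # States: 0 = operating, 1 = failed. Markov generator Q (rows sum to zero).
      Q = np.array([[-lam, lam],
                    [mu, -mu]])

      # Solve dP/dt = Q^T P for the state probability vector P(t).
      sol = solve_ivp(lambda t, p: Q.T @ p, (0, 1000), [1.0, 0.0], t_eval=[1000])
      print("P(failed at t=1000 h) =", sol.y[1, -1])   # tends to lam / (lam + mu)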

  15. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000s, it is only in the last few years that proba...
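
    A minimal example of the idea in Python, assuming the PyMC library (the talk itself is not tied to any one language): a Beta-Bernoulli model specified as a program, with generic MCMC answering the posterior query.

      import numpy as np
      import pymc as pm

      data = np.array([1, 0, 1, 1, 0, 1, 1, 1])   # coin-flip style observations

      with pm.Model():
          theta = pm.Beta("theta", alpha=1, beta=1)        # prior
          pm.Bernoulli("obs", p=theta, observed=data)      # likelihood
          idata = pm.sample(1000, progressbar=False)       # generic MCMC inference

      print(idata.posterior["theta"].mean())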

  16. Probabilistic Compositional Models: solution of an equivalence problem

    Czech Academy of Sciences Publication Activity Database

    Kratochvíl, Václav

    2013-01-01

    Roč. 54, č. 5 (2013), s. 590-601 ISSN 0888-613X R&D Projects: GA ČR GA13-20012S Institutional support: RVO:67985556 Keywords : Probabilistic model * Compositional model * Independence * Equivalence Subject RIV: BA - General Mathematics Impact factor: 1.977, year: 2013 http://library.utia.cas.cz/separaty/2013/MTR/kratochvil-0391079.pdf

  17. Probabilistic evaluation of process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2016-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to...

  18. Cerebellar tDCS does not improve performance in probabilistic classification learning

    NARCIS (Netherlands)

    N. Seyed Majidi; M.C. Verhage (Claire); O. Donchin (Opher); P.J. Holland (Peter); M.A. Frens (Maarten); J.N. van der Geest (Jos)

    2016-01-01

    In this study, the role of the cerebellum in a cognitive learning task using transcranial direct current stimulation (tDCS) was investigated. Using a weather prediction task, subjects had to learn the probabilistic associations between a stimulus (a combination of cards) and an outcome...

  19. Distinct Roles of Dopamine and Subthalamic Nucleus in Learning and Probabilistic Decision Making

    Science.gov (United States)

    Coulthard, Elizabeth J.; Bogacz, Rafal; Javed, Shazia; Mooney, Lucy K.; Murphy, Gillian; Keeley, Sophie; Whone, Alan L.

    2012-01-01

    Even simple behaviour requires us to make decisions based on combining multiple pieces of learned and new information. Making such decisions requires both learning the optimal response to each given stimulus as well as combining probabilistic information from multiple stimuli before selecting a response. Computational theories of decision making…

  20. Learning System of Web Navigation Patterns through Hypertext Probabilistic Grammars

    Directory of Open Access Journals (Sweden)

    Augusto Cortez Vasquez

    2015-01-01

    One issue of real interest in the area of web data mining is to capture users' activities during connection and extract behavior patterns that help define their preferences, in order to improve the design of future pages by adapting website interfaces to individual users. This research is intended to provide, first, a presentation of the methodological foundations of the use of probabilistic languages to identify the relevant or most visited websites. Second, web sessions are represented by graphs and probabilistic context-free grammars, so that the sessions with the highest probabilities are considered the most visited and most preferred, and therefore the most important in relation to a particular topic. The aim is to develop a tool for processing web sessions obtained from a log server, represented by probabilistic context-free grammars.
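
    A sketch of the underlying idea with a first-order Markov model of page transitions (a simpler stand-in for the grammar-based representation), estimated from toy sessions.

      from collections import Counter, defaultdict

      sessions = [["home", "products", "cart"],
                  ["home", "blog", "products", "cart"],
                  ["home", "products", "products", "cart"]]

      # Estimate transition counts between pages from the observed sessions.
      counts = defaultdict(Counter)
      for s in sessions:
          for a, b in zip(s, s[1:]):
              counts[a][b] += 1

      def session_probability(session):
          # Probability of a navigation path under the estimated model;
          # higher-probability paths are taken as the preferred ones.
          p = 1.0
          for a, b in zip(session, session[1:]):
              total = sum(counts[a].values())
              p *= counts[a][b] / total if total else 0.0
          return p

      print(session_probability(["home", "products", "cart"]))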

  1. Using ELM-based weighted probabilistic model in the classification of synchronous EEG BCI.

    Science.gov (United States)

    Tan, Ping; Tan, Guan-Zheng; Cai, Zi-Xing; Sa, Wei-Ping; Zou, Yi-Qun

    2017-01-01

    Extreme learning machine (ELM) is an effective machine learning technique with simple theory and fast implementation, which has recently gained increasing interest from various research fields. A new method that combines ELM with a probabilistic model method is proposed in this paper to classify electroencephalography (EEG) signals in a synchronous brain-computer interface (BCI) system. In the proposed method, the softmax function is used to convert the ELM output to classification probabilities. The Chernoff error bound, deduced from the Bayesian probabilistic model in the training process, is adopted as the weight in the discriminant process. Since the proposed method makes use of the knowledge from all preceding training datasets, its discriminating performance improves cumulatively. In the test experiments based on the datasets from BCI competitions, the proposed method is compared with other classification methods, including the linear discriminant analysis, support vector machine, ELM and weighted probabilistic model methods. For comparison, the mutual information, classification accuracy and information transfer rate are considered as the evaluation indicators for these classifiers. The results demonstrate that our method shows competitive performance against other methods.
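
    A compact sketch of an ELM whose outputs are passed through a softmax to obtain classification probabilities, as described; the data, architecture and regularization below are illustrative assumptions (the Chernoff-bound weighting is not reproduced).

      import numpy as np

      rng = np.random.default_rng(6)

      def elm_fit(X, Y, n_hidden=100, reg=1e-3):
          # Extreme learning machine: random hidden layer, ridge-regressed output.
          W = rng.normal(size=(X.shape[1], n_hidden))
          b = rng.normal(size=n_hidden)
          H = np.tanh(X @ W + b)
          beta = np.linalg.solve(H.T @ H + reg * np.eye(n_hidden), H.T @ Y)
          return W, b, beta

      def elm_predict_proba(X, W, b, beta):
          # Softmax converts raw ELM outputs to class probabilities.
          out = np.tanh(X @ W + b) @ beta
          e = np.exp(out - out.max(axis=1, keepdims=True))
          return e / e.sum(axis=1, keepdims=True)

      X = rng.normal(size=(200, 8))                    # stand-in for EEG features
      y = (X[:, 0] + X[:, 1] > 0).astype(int)
      Y = np.eye(2)[y]                                 # one-hot targets
      params = elm_fit(X, Y)
      print(elm_predict_proba(X[:5], *params))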

  2. Optimization and evaluation of probabilistic-logic sequence models

    DEFF Research Database (Denmark)

    Christiansen, Henning; Lassen, Ole Torp

    Analysis of biological sequence data demands more and more sophisticated and fine-grained models, but these in turn introduce hard computational problems. A class of probabilistic-logic models is considered, which increases the expressibility from HMMs' and SCFGs' regular and context-free languages to, in principle, Turing-complete languages. In general, such models are computationally far too complex for direct use, so optimization by pruning and approximation is needed. The first steps are made towards a methodology for optimizing such models by approximations using auxiliary models...

  3. Probabilistic Modeling and Risk Assessment of Cable Icing

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee

    This dissertation addresses the issues related to icing of structures, with special emphasis on bridge cables. Cable-supported bridges in cold climates suffer from ice accreting on the cables, which poses three different undesirable situations. Firstly, the changed shape of the cable due to ice... preliminary framework is modified for assessing the probability of occurrence of in-cloud and precipitation icing and its duration. Different probabilistic models are utilized for the representation of the meteorological variables, and their appropriateness is evaluated through goodness-of-fit tests... are influencing the two icing mechanisms and their duration. The model is found to be more sensitive to changes in the discretization levels of the input variables. Thirdly, the developed operational probabilistic framework for the assessment of the expected number of occurrences of ice/snow accretion on bridge...

  4. Probabilistic Modeling of Intracranial Pressure Effects on Optic Nerve Biomechanics

    Science.gov (United States)

    Ethier, C. R.; Feola, Andrew J.; Raykin, Julia; Myers, Jerry G.; Nelson, Emily S.; Samuels, Brian C.

    2016-01-01

    Altered intracranial pressure (ICP) is implicated in several ocular conditions: papilledema, glaucoma and Visual Impairment and Intracranial Pressure (VIIP) syndrome. The biomechanical effects of altered ICP on optic nerve head (ONH) tissues in these conditions are uncertain but likely important. We have quantified ICP-induced deformations of ONH tissues, using finite element (FE) and probabilistic modeling (Latin Hypercube Simulations (LHS)) to consider a range of tissue properties and relevant pressures.
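
    Latin Hypercube Sampling is a standard stratified design and can be sketched with scipy's quasi-Monte Carlo module; the three input parameters and their ranges below are purely hypothetical stand-ins for the tissue properties and pressures the study varies:

      import numpy as np
      from scipy.stats import qmc  # requires scipy >= 1.7

      # Three hypothetical inputs: tissue stiffness (MPa), ICP (mmHg), IOP (mmHg).
      lower = [0.1,  5.0, 10.0]
      upper = [1.0, 25.0, 21.0]

      sampler = qmc.LatinHypercube(d=3, seed=42)
      unit_samples = sampler.random(n=200)             # stratified points in [0,1)^3
      inputs = qmc.scale(unit_samples, lower, upper)   # rescale to physical ranges

      # Each row of `inputs` would parameterize one FE run; collecting the
      # outputs yields an empirical distribution of ONH tissue deformations.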

  5. Up-gradient transport in a probabilistic transport model

    DEFF Research Database (Denmark)

    Gavnholt, J.; Juul Rasmussen, J.; Garcia, O.E.

    2005-01-01

    The transport of particles or heat against the driving gradient is studied by employing a probabilistic transport model with a characteristic particle step length that depends on the local concentration or heat gradient. When this gradient is larger than a prescribed critical value, the standard....... These results supplement recent works by van Milligen [Phys. Plasmas 11, 3787 (2004)], which applied Lévy-distributed step sizes in the case of supercritical gradients to obtain up-gradient transport. (c) 2005 American Institute of Physics....
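
    A toy, one-dimensional reading of such a model can be sketched in a few lines: particles take larger random steps wherever the local gradient of the (histogrammed) density exceeds a critical value. All numerical values are arbitrary toy choices, not the paper's:

      import numpy as np

      rng = np.random.default_rng(0)
      n_particles, n_steps = 5000, 500
      x = rng.uniform(0.0, 10.0, n_particles)      # particle positions on [0, 10]
      g_crit, l_small, l_large = 5.0, 0.05, 0.5    # critical gradient, step lengths

      for _ in range(n_steps):
          hist, edges = np.histogram(x, bins=50, range=(0.0, 10.0))
          grad = np.abs(np.gradient(hist.astype(float)))    # local density gradient
          bins = np.clip(np.digitize(x, edges) - 1, 0, 49)  # bin index per particle
          # Supercritical gradient -> long random steps; otherwise short steps.
          step = np.where(grad[bins] > g_crit, l_large, l_small)
          x = np.clip(x + step * rng.choice([-1.0, 1.0], n_particles), 0.0, 10.0)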

  6. A Probabilistic Model for Uncertain Problem Solving

    National Research Council Canada - National Science Library

    Farley, Arthur M

    1981-01-01

    ... and provide pragmatic focusing. Search methods are generalized to produce tree-structured plans incorporating the use of such operators. Several application domains for the model also are discussed.

  7. Probabilistic mixture-based image modelling

    Czech Academy of Sciences Publication Activity Database

    Haindl, Michal; Havlíček, Vojtěch; Grim, Jiří

    2011-01-01

    Roč. 47, č. 3 (2011), s. 482-500 ISSN 0023-5954 R&D Projects: GA MŠk 1M0572; GA ČR GA102/08/0593 Grant - others:CESNET(CZ) 387/2010; GA MŠk(CZ) 2C06019; GA ČR(CZ) GA103/11/0335 Institutional research plan: CEZ:AV0Z10750506 Keywords : BTF texture modelling * discrete distribution mixtures * Bernoulli mixture * Gaussian mixture * multi-spectral texture modelling Subject RIV: BD - Theory of Information Impact factor: 0.454, year: 2011 http://library.utia.cas.cz/separaty/2011/RO/haindl-0360244.pdf

  8. Probabilistically-Cued Patterns Trump Perfect Cues in Statistical Language Learning.

    Science.gov (United States)

    Lany, Jill; Gómez, Rebecca L

    2013-01-01

    Probabilistically-cued co-occurrence relationships between word categories are common in natural languages but difficult to acquire. For example, in English, determiner-noun and auxiliary-verb dependencies both involve co-occurrence relationships, but determiner-noun relationships are more reliably marked by correlated distributional and phonological cues, and appear to be learned more readily. We tested whether experience with co-occurrence relationships that are more reliable promotes learning those that are less reliable using an artificial language paradigm. Prior experience with deterministically-cued contingencies did not promote learning of less reliably-cued structure, nor did prior experience with relationships instantiated in the same vocabulary. In contrast, prior experience with probabilistically-cued co-occurrence relationships instantiated in different vocabulary did enhance learning. Thus, experience with co-occurrence relationships sharing underlying structure but not vocabulary may be an important factor in learning grammatical patterns. Furthermore, experience with probabilistically-cued co-occurrence relationships, despite their difficulty for naïve learners, lays an important foundation for learning novel probabilistic structure.

  9. A Probabilistic Model of Cross-Categorization

    Science.gov (United States)

    Shafto, Patrick; Kemp, Charles; Mansinghka, Vikash; Tenenbaum, Joshua B.

    2011-01-01

    Most natural domains can be represented in multiple ways: we can categorize foods in terms of their nutritional content or social role, animals in terms of their taxonomic groupings or their ecological niches, and musical instruments in terms of their taxonomic categories or social uses. Previous approaches to modeling human categorization have…

  10. Probabilistic Reachability for Parametric Markov Models

    DEFF Research Database (Denmark)

    Hahn, Ernst Moritz; Hermanns, Holger; Zhang, Lijun

    2011-01-01

    Given a parametric Markov model, we consider the problem of computing the rational function expressing the probability of reaching a given set of states. To attack this principal problem, Daws has suggested to first convert the Markov chain into a finite automaton, from which a regular expression...

  11. Real-time probabilistic covariance tracking with efficient model update.

    Science.gov (United States)

    Wu, Yi; Cheng, Jian; Wang, Jinqiao; Lu, Hanqing; Wang, Jun; Ling, Haibin; Blasch, Erik; Bai, Li

    2012-05-01

    The recently proposed covariance region descriptor has been proven robust and versatile at a modest computational cost. The covariance matrix enables efficient fusion of different types of features, where the spatial and statistical properties, as well as their correlation, are characterized. The similarity between two covariance descriptors is measured on Riemannian manifolds. Based on the same metric but with a probabilistic framework, we propose a novel tracking approach on Riemannian manifolds with a novel incremental covariance tensor learning (ICTL). To address appearance variations, ICTL incrementally learns a low-dimensional covariance tensor representation and efficiently adapts online to appearance changes of the target with only O(1) computational complexity, resulting in real-time performance. The covariance-based representation and the ICTL are then combined with the particle filter framework to allow better handling of background clutter, as well as temporary occlusions. We test the proposed probabilistic ICTL tracker on numerous benchmark sequences involving different types of challenges, including occlusions and variations in illumination, scale, and pose. The proposed approach demonstrates excellent real-time performance, both qualitatively and quantitatively, in comparison with several previously proposed trackers.
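
    The two building blocks named here -- the covariance region descriptor and a Riemannian distance between descriptors -- can be sketched as follows. The log-Euclidean metric is used below as one common choice; the paper's exact metric and the ICTL update itself are not reproduced:

      import numpy as np
      from scipy.linalg import logm

      def covariance_descriptor(features):
          # features: (n_pixels, d) array of per-pixel feature vectors
          # (e.g. x, y, intensity, gradients); returns the d x d covariance.
          return np.cov(features, rowvar=False)

      def log_euclidean_distance(c1, c2):
          # One common Riemannian metric on SPD matrices:
          # ||logm(C1) - logm(C2)||_F.
          return np.linalg.norm(logm(c1) - logm(c2), ord='fro')

      rng = np.random.default_rng(1)
      patch_a = rng.normal(size=(400, 5))          # hypothetical feature stacks
      patch_b = rng.normal(size=(400, 5)) + 0.3
      d = log_euclidean_distance(covariance_descriptor(patch_a),
                                 covariance_descriptor(patch_b))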

  12. Towards port sustainability through probabilistic models: Bayesian networks

    Directory of Open Access Journals (Sweden)

    B. Molina

    2018-04-01

    Full Text Available It is necessary that the manager of an infrastructure knows the relations between its variables. Using Bayesian networks, variables can be classified, predicted and diagnosed, and the posterior probability of unknown variables can be estimated from known ones. The proposed methodology has generated a database of port variables, classified as economic, social, environmental and institutional, following the smart-port studies carried out across the Spanish port system. The network has been built as a directed acyclic graph, which expresses the relationships in terms of parents and children. In probabilistic terms, it can be concluded from the constructed network that the most decisive variables for port sustainability are those belonging to the institutional dimension. It is also concluded that Bayesian networks allow modeling uncertainty probabilistically even when the number of variables is high, as occurs in port planning and exploitation.
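
    As an illustration of the kind of network described, here is a minimal sketch using the pgmpy library (one possible tool, not necessarily the one used in the study), with hypothetical port variables and invented probabilities:

      from pgmpy.models import BayesianNetwork
      from pgmpy.factors.discrete import TabularCPD
      from pgmpy.inference import VariableElimination

      # Two hypothetical parents of a binary "Sustainability" node.
      model = BayesianNetwork([("Governance", "Sustainability"),
                               ("Emissions", "Sustainability")])

      cpd_gov = TabularCPD("Governance", 2, [[0.6], [0.4]])
      cpd_emi = TabularCPD("Emissions", 2, [[0.7], [0.3]])
      cpd_sus = TabularCPD("Sustainability", 2,
                           [[0.9, 0.6, 0.5, 0.1],    # P(S=0 | G, E)
                            [0.1, 0.4, 0.5, 0.9]],   # P(S=1 | G, E)
                           evidence=["Governance", "Emissions"],
                           evidence_card=[2, 2])
      model.add_cpds(cpd_gov, cpd_emi, cpd_sus)
      assert model.check_model()

      # Posterior of the unknown node given an observed one.
      posterior = VariableElimination(model).query(
          ["Sustainability"], evidence={"Governance": 1})
      print(posterior)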

  13. Design optimization under uncertainties of a mesoscale implant in biological tissues using a probabilistic learning algorithm

    Science.gov (United States)

    Soize, C.

    2017-11-01

    This paper deals with the optimal design of a titanium mesoscale implant in a cortical bone for which the apparent elasticity tensor is modeled by a non-Gaussian random field at mesoscale, which has been experimentally identified. The external applied forces are also random. The design parameters are geometrical dimensions related to the geometry of the implant. The stochastic elastostatic boundary value problem is discretized by the finite element method. The objective function and the constraints are related to normal, shear, and von Mises stresses inside the cortical bone. The constrained nonconvex optimization problem in presence of uncertainties is solved by using a probabilistic learning algorithm that allows for considerably reducing the numerical cost with respect to the classical approaches.

  14. Probabilistic flood damage modelling at the meso-scale

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2014-05-01

    Decisions on flood risk management and adaptation are usually based on risk analyses. Such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention in recent years, they are still not standard practice for flood risk assessments. Most damage models have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood damage models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we show how the model BT-FLEMO (Bagging decision Tree based Flood Loss Estimation MOdel) can be applied on the meso-scale, namely on the basis of ATKIS land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany. The application of BT-FLEMO provides a probability distribution of estimated damage to residential buildings per municipality. Validation is undertaken on the one hand via a comparison with eight other damage models, including stage-damage functions as well as multi-variate models, and on the other hand against official damage data provided by the Saxon Relief Bank (SAB). The results show that uncertainties of damage estimation remain high. Thus, a significant advantage of the probabilistic flood loss estimation model BT-FLEMO is that it inherently provides quantitative information about the uncertainty of the prediction. Reference: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64.
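
    The core idea of BT-FLEMO -- a bagged ensemble of decision trees whose per-tree predictions form a loss distribution -- can be sketched with scikit-learn. The features, data, and parameter values below are invented for illustration (recent scikit-learn versions use `estimator=`; older ones use `base_estimator=`):

      import numpy as np
      from sklearn.ensemble import BaggingRegressor
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(2)
      # Hypothetical training data: water depth (m), duration (h), building
      # value (kEUR) against observed relative loss in [0, 1].
      X = rng.uniform([0.1, 1, 50], [3.0, 120, 500], size=(300, 3))
      y = np.clip(0.2 * X[:, 0] + 0.001 * X[:, 1]
                  + rng.normal(0, 0.05, 300), 0, 1)

      model = BaggingRegressor(estimator=DecisionTreeRegressor(),
                               n_estimators=100, random_state=0).fit(X, y)

      x_new = np.array([[1.5, 48, 200]])
      # The spread over individual trees gives a loss *distribution*, not a
      # single point estimate -- the key advantage stressed in the abstract.
      tree_preds = np.array([t.predict(x_new)[0] for t in model.estimators_])
      print(tree_preds.mean(), np.percentile(tree_preds, [5, 95]))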

  15. Probabilistic error bounds for reduced order modeling

    Energy Technology Data Exchange (ETDEWEB)

    Abdo, M.G.; Wang, C.; Abdel-Khalik, H.S., E-mail: abdo@purdue.edu, E-mail: wang1730@purdue.edu, E-mail: abdelkhalik@purdue.edu [Purdue Univ., School of Nuclear Engineering, West Lafayette, IN (United States)

    2015-07-01

    Reduced order modeling has proven to be an effective tool when repeated execution of reactor analysis codes is required. ROM operates on the assumption that the intrinsic dimensionality of the associated reactor physics models is sufficiently small when compared to the nominal dimensionality of the input and output data streams. By employing a truncation technique with roots in linear algebra matrix decomposition theory, ROM effectively discards all components of the input and output data that have negligible impact on reactor attributes of interest. This manuscript introduces a mathematical approach to quantify the errors resulting from the discarded ROM components. As supported by numerical experiments, the introduced analysis proves that the contribution of the discarded components could be upper-bounded with an overwhelmingly high probability. The reverse of this statement implies that the ROM algorithm can self-adapt to determine the level of the reduction needed such that the maximum resulting reduction error is below a given tolerance limit that is set by the user. (author)
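
    A minimal numerical illustration of the truncation idea, assuming an SVD-based reduction (the manuscript's exact decomposition and bound are not reproduced): random probe vectors never see a reduction error above the largest discarded singular value:

      import numpy as np

      rng = np.random.default_rng(3)
      A = rng.normal(size=(200, 50)) @ rng.normal(size=(50, 200))  # rank-50 map

      U, s, Vt = np.linalg.svd(A, full_matrices=False)
      r = 20                                   # modes retained by the reduction
      A_rom = (U[:, :r] * s[:r]) @ Vt[:r, :]   # reduced-order surrogate

      # Random probes: the sampled reduction error stays below the largest
      # discarded singular value s[r], illustrating the bounding idea.
      x = rng.normal(size=(200, 1000))
      err = np.linalg.norm(A @ x - A_rom @ x, axis=0) / np.linalg.norm(x, axis=0)
      print(err.max(), "<=", s[r])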

  16. Probabilistic Modeling of the Renal Stone Formation Module

    Science.gov (United States)

    Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.

    2013-01-01

    The Integrated Medical Model (IMM) is a probabilistic tool, used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the inflight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment), astronauts are at a considerably elevated risk of developing renal calculi (nephrolithiasis) while in space. Lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions to the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs to the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, continuously

  17. A probabilistic model of brittle crack formation

    Science.gov (United States)

    Chudnovsky, A.; Kunin, B.

    1987-01-01

    Probability of a brittle crack formation in an elastic solid with fluctuating strength is considered. A set Omega of all possible crack trajectories reflecting the fluctuation of the strength field is introduced. The probability P(X) that crack penetration depth exceeds X is expressed as a functional integral over Omega of a conditional probability of the same event taking place along a particular path. Various techniques are considered to evaluate the integral. Under rather nonrestrictive assumptions, the integral is reduced to solving a diffusion-type equation. A new characteristic of fracture process, 'crack diffusion coefficient', is introduced. An illustrative example is then considered where the integration is reduced to solving an ordinary differential equation. The effect of the crack diffusion coefficient and of the magnitude of strength fluctuations on probability density of crack penetration depth is presented. Practical implications of the proposed model are discussed.

  18. Can model weighting improve probabilistic projections of climate change?

    Energy Technology Data Exchange (ETDEWEB)

    Raeisaenen, Jouni; Ylhaeisi, Jussi S. [Department of Physics, P.O. Box 48, University of Helsinki (Finland)

    2012-10-15

    Recently, Raeisaenen and co-authors proposed a weighting scheme in which the relationship between observable climate and climate change within a multi-model ensemble determines to what extent agreement with observations affects model weights in climate change projection. Within the Third Coupled Model Intercomparison Project (CMIP3) dataset, this scheme slightly improved the cross-validated accuracy of deterministic projections of temperature change. Here the same scheme is applied to probabilistic temperature change projection, under the strong limiting assumption that the CMIP3 ensemble spans the actual modeling uncertainty. Cross-validation suggests that probabilistic temperature change projections may also be improved by this weighting scheme. However, the improvement relative to uniform weighting is smaller in the tail-sensitive logarithmic score than in the continuous ranked probability score. The impact of the weighting on projection of real-world twenty-first century temperature change is modest in most parts of the world. However, in some areas mainly over the high-latitude oceans, the mean of the distribution is substantially changed and/or the distribution is considerably narrowed. The weights of individual models vary strongly with location, so that a model that receives nearly zero weight in some area may still get a large weight elsewhere. Although the details of this variation are method-specific, it suggests that the relative strengths of different models may be difficult to harness by weighting schemes that use spatially uniform model weights. (orig.)
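
    The continuous ranked probability score mentioned above has a simple closed form for a weighted ensemble, which makes the uniform-versus-skill-weighted comparison easy to sketch; the projection values and weights below are invented:

      import numpy as np

      def crps_ensemble(members, obs, weights=None):
          # CRPS for a weighted ensemble forecast:
          #   sum_i w_i |x_i - y| - 0.5 * sum_ij w_i w_j |x_i - x_j|
          members = np.asarray(members, dtype=float)
          w = (np.full(members.size, 1.0 / members.size)
               if weights is None else np.asarray(weights, dtype=float))
          term1 = np.sum(w * np.abs(members - obs))
          term2 = 0.5 * np.sum(np.outer(w, w)
                               * np.abs(members[:, None] - members[None, :]))
          return term1 - term2

      # Hypothetical CMIP-style temperature-change projections (K) and a
      # pseudo-observation, as in cross-validation.
      proj = np.array([1.8, 2.1, 2.4, 2.9, 3.3])
      print(crps_ensemble(proj, 2.5))                              # uniform
      print(crps_ensemble(proj, 2.5, [0.1, 0.2, 0.4, 0.2, 0.1]))  # weighted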

  19. Pipe fracture evaluations for leak-rate detection: Probabilistic models

    International Nuclear Information System (INIS)

    Rahman, S.; Wilkowski, G.; Ghadiali, N.

    1993-01-01

    This is the second in series of three papers generated from studies on nuclear pipe fracture evaluations for leak-rate detection. This paper focuses on the development of novel probabilistic models for stochastic performance evaluation of degraded nuclear piping systems. It was accomplished here in three distinct stages. First, a statistical analysis was conducted to characterize various input variables for thermo-hydraulic analysis and elastic-plastic fracture mechanics, such as material properties of pipe, crack morphology variables, and location of cracks found in nuclear piping. Second, a new stochastic model was developed to evaluate performance of degraded piping systems. It is based on accurate deterministic models for thermo-hydraulic and fracture mechanics analyses described in the first paper, statistical characterization of various input variables, and state-of-the-art methods of modem structural reliability theory. From this model. the conditional probability of failure as a function of leak-rate detection capability of the piping systems can be predicted. Third, a numerical example was presented to illustrate the proposed model for piping reliability analyses. Results clearly showed that the model provides satisfactory estimates of conditional failure probability with much less computational effort when compared with those obtained from Monte Carlo simulation. The probabilistic model developed in this paper will be applied to various piping in boiling water reactor and pressurized water reactor plants for leak-rate detection applications

  20. Probabilistic wind power forecasting with online model selection and warped gaussian process

    International Nuclear Information System (INIS)

    Kou, Peng; Liang, Deliang; Gao, Feng; Gao, Lin

    2014-01-01

    Highlights: • A new online ensemble model for the probabilistic wind power forecasting. • Quantifying the non-Gaussian uncertainties in wind power. • Online model selection that tracks the time-varying characteristic of wind generation. • Dynamically altering the input features. • Recursive update of base models. - Abstract: Based on the online model selection and the warped Gaussian process (WGP), this paper presents an ensemble model for the probabilistic wind power forecasting. This model provides the non-Gaussian predictive distributions, which quantify the non-Gaussian uncertainties associated with wind power. In order to follow the time-varying characteristics of wind generation, multiple time dependent base forecasting models and an online model selection strategy are established, thus adaptively selecting the most probable base model for each prediction. WGP is employed as the base model, which handles the non-Gaussian uncertainties in wind power series. Furthermore, a regime switch strategy is designed to modify the input feature set dynamically, thereby enhancing the adaptiveness of the model. In an online learning framework, the base models should also be time adaptive. To achieve this, a recursive algorithm is introduced, thus permitting the online updating of WGP base models. The proposed model has been tested on the actual data collected from both single and aggregated wind farms
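
    A full warped Gaussian process learns its warping function from data; as a rough stand-in, the sketch below applies a fixed log warp before a standard GP so that the skewed wind-power series better matches the GP's Gaussian assumption. Data and kernel choices are illustrative only:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF, WhiteKernel

      rng = np.random.default_rng(4)
      X = rng.uniform(0, 10, (200, 1))            # e.g. forecast wind speed
      power = np.clip(0.1 * X.ravel() ** 1.5
                      + rng.gamma(2, 0.2, 200), 0.01, None)

      # Fit the GP to log-warped targets; invert with exp() when predicting.
      gp = GaussianProcessRegressor(kernel=RBF() + WhiteKernel()).fit(
          X, np.log(power))
      mu, sd = gp.predict(np.array([[6.0]]), return_std=True)
      # Back-transformed quantiles give an asymmetric, non-Gaussian interval.
      print(np.exp(mu), np.exp(mu - 1.96 * sd), np.exp(mu + 1.96 * sd))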

  1. A Probabilistic Model of Meter Perception: Simulating Enculturation

    Directory of Open Access Journals (Sweden)

    Bastiaan van der Weij

    2017-05-01

    Full Text Available Enculturation is known to shape the perception of meter in music but this is not explicitly accounted for by current cognitive models of meter perception. We hypothesize that the induction of meter is a result of predictive coding: interpreting onsets in a rhythm relative to a periodic meter facilitates prediction of future onsets. Such prediction, we hypothesize, is based on previous exposure to rhythms. As such, predictive coding provides a possible explanation for the way meter perception is shaped by the cultural environment. Based on this hypothesis, we present a probabilistic model of meter perception that uses statistical properties of the relation between rhythm and meter to infer meter from quantized rhythms. We show that our model can successfully predict annotated time signatures from quantized rhythmic patterns derived from folk melodies. Furthermore, we show that by inferring meter, our model improves prediction of the onsets of future events compared to a similar probabilistic model that does not infer meter. Finally, as a proof of concept, we demonstrate how our model can be used in a simulation of enculturation. From the results of this simulation, we derive a class of rhythms that are likely to be interpreted differently by enculturated listeners with different histories of exposure to rhythms.

  2. Comparison of plant-specific probabilistic safety assessments and lessons learned

    International Nuclear Information System (INIS)

    Balfanz, H.P.; Berg, H.P.; Steininger, U.

    2001-01-01

    Probabilistic safety assessments (PSA) have been performed for all German nuclear power plants in operation. These assessments are mainly based on the recent German PSA guide and an earlier draft, respectively. However, comparison of these PSA shows differences in the results, which are discussed in this paper. Lessons learned from this comparison and further development of the PSA methodology are described. (orig.)

  3. Aging and a genetic KIBRA polymorphism interactively affect feedback- and observation-based probabilistic classification learning.

    Science.gov (United States)

    Schuck, Nicolas W; Petok, Jessica R; Meeter, Martijn; Schjeide, Brit-Maren M; Schröder, Julia; Bertram, Lars; Gluck, Mark A; Li, Shu-Chen

    2018-01-01

    Probabilistic category learning involves complex interactions between the hippocampus and striatum that may depend on whether acquisition occurs via feedback or observation. Little is known about how healthy aging affects these processes. We tested whether age-related behavioral differences in probabilistic category learning from feedback or observation depend on a genetic factor known to influence individual differences in hippocampal function, the KIBRA gene (single nucleotide polymorphism rs17070145). Results showed comparable age-related performance impairments in observational as well as feedback-based learning. Moreover, genetic analyses indicated an age-related interactive effect of KIBRA on learning: among older adults, the beneficial T-allele was positively associated with learning from feedback, but negatively with learning from observation. In younger adults, no effects of KIBRA were found. Our results add behavioral genetic evidence to emerging data showing age-related differences in how neural resources relate to memory functions, namely that hippocampal and striatal contributions to probabilistic category learning may vary with age. Our findings highlight the effects genetic factors can have on differential age-related decline of different memory functions. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. Probabilistic seismic hazard analysis - lessons learned: A regulator's perspective

    International Nuclear Information System (INIS)

    Reiter, L.

    1990-01-01

    Probabilistic seismic hazard analysis is a powerful, rational and attractive tool for decision-making. It is capable of absorbing and integrating a wide range of information and judgement and their associated uncertainties into a flexible framework that permits the application of societal goals and priorities. Unfortunately, its highly integrative nature can obscure those elements which drive the results, its highly quantitative nature can lead to false impressions of accuracy, and its open embrace of uncertainty can make decision-making difficult. Addressing these problems can only help to increase its use and make it more palatable to those who need to assess seismic hazard and utilize the results. (orig.)

  5. Improved detection of chemical substances from colorimetric sensor data using probabilistic machine learning

    DEFF Research Database (Denmark)

    Mølgaard, Lasse Lohilahti; Buus, Ole Thomsen; Larsen, Jan

    2017-01-01

    We present a data-driven machine learning approach to detect drug- and explosives-precursors using colorimetric sensor technology for air sampling. The sensing technology has been developed in the context of the CRIM-TRACK project. At present, a fully-integrated portable prototype for air sampling...... of the highly multi-variate data produced from the colorimetric chip, a number of machine learning techniques are employed to provide reliable classification of target analytes from confounders found in the air streams. We demonstrate that a data-driven machine learning method using dimensionality reduction...... in combination with a probabilistic classifier makes it possible to produce informative features and a high detection rate of analytes. Furthermore, the probabilistic machine learning approach provides a means of automatically identifying unreliable measurements that could produce false predictions......

  6. A deterministic-probabilistic model for contaminant transport. User manual

    Energy Technology Data Exchange (ETDEWEB)

    Schwartz, F W; Crowe, A

    1980-08-01

    This manual describes a deterministic-probabilistic contaminant transport (DPCT) computer model designed to simulate mass transfer by ground-water movement in a vertical section of the earth's crust. The model can account for convection, dispersion, radioactive decay, and cation exchange for a single component. A velocity is calculated from the convective transport of the ground water for each reference particle in the modeled region; dispersion is accounted for in the particle motion by adding a random component to the deterministic motion. The model is sufficiently general to enable the user to specify virtually any type of water table or geologic configuration, and a variety of boundary conditions. A major emphasis in the model development has been placed on making the model simple to use, and information provided in the User Manual will permit changes to the computer code to be made relatively easily where required for specific applications. (author)
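
    The deterministic-probabilistic particle scheme described (convection as deterministic motion, dispersion as a random component, decay as probabilistic removal) can be sketched in one dimension as follows; cation exchange is omitted and all parameter values are invented:

      import numpy as np

      rng = np.random.default_rng(5)
      n, dt, n_steps = 10000, 0.1, 500
      x = np.zeros(n)                  # reference particles start at the source
      v, D, lam = 1.0, 0.05, 0.002     # velocity, dispersion coeff., decay rate

      alive = np.ones(n, dtype=bool)
      for _ in range(n_steps):
          # Deterministic convective displacement plus a random dispersive kick.
          x[alive] += v * dt + rng.normal(0.0, np.sqrt(2 * D * dt), alive.sum())
          # First-order radioactive decay removes particles probabilistically.
          alive &= rng.random(n) > lam * dt

      # Histogram of surviving particle positions ~ concentration profile.
      conc, edges = np.histogram(x[alive], bins=100)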

  7. A Probabilistic Genome-Wide Gene Reading Frame Sequence Model

    DEFF Research Database (Denmark)

    Have, Christian Theil; Mørk, Søren

    We introduce a new type of probabilistic sequence model that models the sequential composition of reading frames of genes in a genome. Our approach extends gene finders with a model of the sequential composition of genes at the genome level -- effectively producing a sequential genome annotation...... as output. The model can be used to obtain the most probable genome annotation based on a combination of (i) a gene finder score for each gene candidate and (ii) the sequence of the reading frames of gene candidates through a genome. The model --- as well as a higher-order variant --- is developed and tested...... and are evaluated by the effect on prediction performance. Since bacterial gene finding is to a large extent a solved problem, it forms an ideal proving ground for evaluating the explicit modeling of larger-scale gene sequence composition of genomes. We conclude that the sequential composition of gene reading frames......

  8. Convex models and probabilistic approach of nonlinear fatigue failure

    International Nuclear Information System (INIS)

    Qiu Zhiping; Lin Qiang; Wang Xiaojun

    2008-01-01

    This paper is concerned with the nonlinear fatigue failure problem with uncertainties in structural systems. In the present study, in order to solve the nonlinear problem by convex models, the theory of ellipsoidal algebra, aided by ideas from interval analysis, is applied. In terms of the inclusion-monotonic property of ellipsoidal functions, the nonlinear fatigue failure problem with uncertainties can be solved. A numerical example of a 25-bar truss structure is given to illustrate the efficiency of the presented method in comparison with the probabilistic approach.

  9. Probabilistic Fatigue Model for Reinforced Concrete Onshore Wind Turbine Foundations

    DEFF Research Database (Denmark)

    Marquez-Dominguez, Sergio; Sørensen, John Dalsgaard

    2013-01-01

    Reinforced Concrete Slab Foundation (RCSF) is the most common onshore wind turbine foundation type installed by the wind industry around the world. Fatigue cracks in a RCSF are an important issue to be considered by the designers. Causes and consequences of the cracks due to fatigue damage in RCSFs...... are discussed in this paper. A probabilistic fatigue model for a RCSF is established which makes a rational treatment of the uncertainties involved in the complex interaction between fatigue cyclic loads and reinforced concrete. Design and limit state equations are established considering concrete shear...

  10. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.

  11. Strategies in probabilistic feedback learning in Parkinson patients OFF medication.

    Science.gov (United States)

    Bellebaum, C; Kobza, S; Ferrea, S; Schnitzler, A; Pollok, B; Südmeyer, M

    2016-04-21

    Studies on classification learning suggested that altered dopamine function in Parkinson's Disease (PD) specifically affects learning from feedback. In patients OFF medication, enhanced learning from negative feedback has been described. This learning bias was not seen in observational learning from feedback, indicating different neural mechanisms for this type of learning. The present study aimed to compare the acquisition of stimulus-response-outcome associations in PD patients OFF medication and healthy control subjects in active and observational learning. 16 PD patients OFF medication and 16 controls were examined with three parallel learning tasks each, two feedback-based (active and observational) and one non-feedback-based paired associates task. No acquisition deficit was seen in the patients for any of the tasks. More detailed analyses on the learning strategies did, however, reveal that the patients showed more lose-shift responses during active feedback learning than controls, and that lose-shift and win-stay responses more strongly determined performance accuracy in patients than controls. For observational feedback learning, the performance of both groups correlated similarly with the performance in non-feedback-based paired associates learning and with the accuracy of observed performance. Also, patients and controls showed comparable evidence of feedback processing in observational learning. In active feedback learning, PD patients use alternative learning strategies than healthy controls. Analyses on observational learning did not yield differences between patients and controls, adding to recent evidence of a differential role of the human striatum in active and observational learning from feedback. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Promoting Probabilistic Programming System (PPS) Development in Probabilistic Programming for Advancing Machine Learning (PPAML)

    Science.gov (United States)

    2018-03-01

    invested in the future developments of PPSs. 3.0 METHODS, ASSUMPTIONS, AND PROCEDURES. Section 3 describes the methods for each of the primary areas of... approaches for solving machine learning problems of interest to defense, science, and the economy. Within DoD, there are different needs for... Datasets include social network data and vaccination statistics. Those data have different characteristics (e.g., percentages for CDC regional

  13. Incorporating organizational factors into probabilistic safety assessment of nuclear power plants through canonical probabilistic models

    Energy Technology Data Exchange (ETDEWEB)

    Galan, S.F. [Dpto. de Inteligencia Artificial, E.T.S.I. Informatica (UNED), Juan del Rosal, 16, 28040 Madrid (Spain)]. E-mail: seve@dia.uned.es; Mosleh, A. [2100A Marie Mount Hall, Materials and Nuclear Engineering Department, University of Maryland, College Park, MD 20742 (United States)]. E-mail: mosleh@umd.edu; Izquierdo, J.M. [Area de Modelado y Simulacion, Consejo de Seguridad Nuclear, Justo Dorado, 11, 28040 Madrid (Spain)]. E-mail: jmir@csn.es

    2007-08-15

    The ω-factor approach is a method that explicitly incorporates organizational factors into Probabilistic safety assessment of nuclear power plants. Bayesian networks (BNs) are the underlying formalism used in this approach. They have a structural part formed by a graph whose nodes represent organizational variables, and a parametric part that consists of conditional probabilities, each of them quantifying organizational influences between one variable and its parents in the graph. The aim of this paper is twofold. First, we discuss some important limitations of current procedures in the ω-factor approach for either assessing conditional probabilities from experts or estimating them from data. We illustrate the discussion with an example that uses data from Licensee Events Reports of nuclear power plants for the estimation task. Second, we introduce significant improvements in the way BNs for the ω-factor approach can be constructed, so that parameter acquisition becomes easier and more intuitive. The improvements are based on the use of noisy-OR gates as model of multicausal interaction between each BN node and its parents.

  14. Incorporating organizational factors into probabilistic safety assessment of nuclear power plants through canonical probabilistic models

    International Nuclear Information System (INIS)

    Galan, S.F.; Mosleh, A.; Izquierdo, J.M.

    2007-01-01

    The ω-factor approach is a method that explicitly incorporates organizational factors into Probabilistic safety assessment of nuclear power plants. Bayesian networks (BNs) are the underlying formalism used in this approach. They have a structural part formed by a graph whose nodes represent organizational variables, and a parametric part that consists of conditional probabilities, each of them quantifying organizational influences between one variable and its parents in the graph. The aim of this paper is twofold. First, we discuss some important limitations of current procedures in the ω-factor approach for either assessing conditional probabilities from experts or estimating them from data. We illustrate the discussion with an example that uses data from Licensee Events Reports of nuclear power plants for the estimation task. Second, we introduce significant improvements in the way BNs for the ω-factor approach can be constructed, so that parameter acquisition becomes easier and more intuitive. The improvements are based on the use of noisy-OR gates as model of multicausal interaction between each BN node and its parents
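
    The noisy-OR gate that both records advocate reduces each parent's influence to a single link probability instead of a full conditional-probability-table row. A minimal sketch, with hypothetical organizational variables:

      import numpy as np

      def noisy_or(parent_states, link_probs, leak=0.0):
          # P(child = 1 | parents) under a noisy-OR gate: each active parent i
          # independently "causes" the child with probability link_probs[i];
          # `leak` covers causes not modeled explicitly.
          fail = 1.0 - leak
          for on, p in zip(parent_states, link_probs):
              if on:
                  fail *= 1.0 - p
          return 1.0 - fail

      # Hypothetical organizational parents of "procedure violation":
      # poor safety culture (link 0.7) and time pressure (link 0.4).
      print(noisy_or([1, 0], [0.7, 0.4], leak=0.05))  # only culture active
      print(noisy_or([1, 1], [0.7, 0.4], leak=0.05))  # both active

    With n binary parents, the gate needs n link probabilities (plus a leak) rather than the 2^n entries of a general conditional probability table, which is why it eases parameter acquisition.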

  15. Probabilistic models of population evolution scaling limits, genealogies and interactions

    CERN Document Server

    Pardoux, Étienne

    2016-01-01

    This expository book presents the mathematical description of evolutionary models of populations subject to interactions (e.g. competition) within the population. The author includes both models of finite populations, and limiting models as the size of the population tends to infinity. The size of the population is described as a random function of time and of the initial population (the ancestors at time 0). The genealogical tree of such a population is given. Most models imply that the population is bound to go extinct in finite time. It is explained when the interaction is strong enough so that the extinction time remains finite, when the ancestral population at time 0 goes to infinity. The material could be used for teaching stochastic processes, together with their applications. Étienne Pardoux is Professor at Aix-Marseille University, working in the field of Stochastic Analysis, stochastic partial differential equations, and probabilistic models in evolutionary biology and population genetics. He obtai...

  16. A Practical Probabilistic Graphical Modeling Tool for Weighing ...

    Science.gov (United States)

    Past weight-of-evidence frameworks for adverse ecological effects have provided soft-scoring procedures for judgments based on the quality and measured attributes of evidence. Here, we provide a flexible probabilistic structure for weighing and integrating lines of evidence for ecological risk determinations. Probabilistic approaches can provide both a quantitative weighing of lines of evidence and methods for evaluating risk and uncertainty. The current modeling structure was developed for propagating uncertainties in measured endpoints and their influence on the plausibility of adverse effects. To illustrate the approach, we apply the model framework to the sediment quality triad, using example lines of evidence for sediment chemistry measurements, bioassay results, and in situ infauna diversity of benthic communities in a simplified hypothetical case study. We then combine the three lines of evidence, evaluate sensitivity to the input parameters, and show how uncertainties are propagated and how additional information can be incorporated to rapidly update the probability of impacts. The developed network model can be expanded to accommodate additional lines of evidence, variables and states of importance, and different types of uncertainties in the lines of evidence, including spatial and temporal variability as well as measurement errors. We provide a flexible Bayesian network structure for weighing and integrating lines of evidence for ecological risk determinations.

  17. Probabilistic modeling of crack networks in thermal fatigue

    International Nuclear Information System (INIS)

    Malesys, N.

    2007-11-01

    Thermal superficial crack networks have been detected in the mixing zones of cooling systems in nuclear power plants. Numerous experimental works have already been carried out to characterize the initiation and propagation of these cracks. The random aspect of initiation led us to propose a probabilistic model for the formation and propagation of crack networks in thermal fatigue. In the first part, uniaxial mechanical tests were performed on smooth and slightly notched specimens in order to characterize the initiation of multiple cracks, their arrest due to obscuration, and the coalescence phenomenon by recovery of stress-amplification zones. In the second part, the probabilistic model was established under two assumptions: continuous crack initiation on the surface, described by a Poisson point process law with a threshold, and the shielding phenomenon, which prohibits the initiation or propagation of a crack if it lies in the stress-relaxation zone of another existing crack. Crack propagation is assumed to follow a Paris law based on the computation of stress intensity factors at the top and bottom of the crack. The evolution of multiaxial cracks on the surface can be followed thanks to three quantities: the shielding probability, comparable to a damage variable of the structure; the initiated crack density, representing the total number of cracks per unit surface, which can be compared to experimental observations; and the propagating crack density, representing the number of active cracks per unit surface in the network. The crack-size distribution is also computed by the model, allowing an easier comparison with experimental results. (author)
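
    Two ingredients of the model -- Poisson-distributed crack initiation and Paris-law propagation -- can be sketched as follows; the material constants and load amplitude are illustrative, and the shielding rule is only indicated in a comment:

      import numpy as np

      rng = np.random.default_rng(6)

      # Poisson point process for initiation: an expected density rho
      # (cracks/mm^2) over area A gives a Poisson-distributed crack count.
      rho, area = 0.02, 400.0
      n_cracks = rng.poisson(rho * area)

      # Paris-law propagation da/dN = C * dK^m with dK ~ dS * sqrt(pi * a).
      C, m, dS = 1e-11, 3.0, 120.0      # illustrative material/load values
      a = np.full(n_cracks, 5e-5)       # initial crack depths (m)
      for _ in range(100_000):          # load cycles
          dK = dS * np.sqrt(np.pi * a)
          a += C * dK ** m
      # `a` now holds the crack-depth distribution after N cycles; a shielding
      # rule would additionally deactivate cracks lying in the stress-relaxation
      # zone of deeper neighbors before each growth increment.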

  18. A probabilistic model for US nuclear power construction times

    International Nuclear Information System (INIS)

    Shash, A.A.H.

    1988-01-01

    Construction time for nuclear power plants is an important element in planning for resources to meet future load demands. Analysis of actual versus estimated construction times for past US nuclear power plants indicates that utilities have continuously underestimated their power plants' construction durations. The analysis also indicates that the actual average construction time has been trending upward, and that the actual durations of power plants permitted for construction in the same year varied substantially. This study presents two probabilistic models of nuclear power plant construction time for use by the nuclear industry as estimating tools. The study also presents a detailed explanation of the factors that are responsible for increasing and varying nuclear power construction times. Observations on 91 completed nuclear units were involved in three interdependent analyses in the process of explaining and deriving the probabilistic models. The historical data were first utilized in a data envelopment analysis (DEA) for the purpose of obtaining frontier index measures of project management achievement in building nuclear power plants.

  19. A Probabilistic Graphical Model to Detect Chromosomal Domains

    Science.gov (United States)

    Heermann, Dieter; Hofmann, Andreas; Weber, Eva

    To understand the nature of a cell, one needs to understand the structure of its genome. For this purpose, experimental techniques such as Hi-C, which detects chromosomal contacts, are used to probe the three-dimensional genomic structure. These experiments yield topological information, consistently showing a hierarchical subdivision of the genome into self-interacting domains across many organisms. Current methods for detecting these domains using the Hi-C contact matrix, i.e. a doubly-stochastic matrix, are mostly based on the assumption that the domains are distinct and thus non-overlapping. To overcome this simplification and to be able to unravel a possible nested domain structure, we developed a probabilistic graphical model that makes no a priori assumptions on the domain structure. Within this approach, the Hi-C contact matrix is analyzed using an Ising-like probabilistic graphical model whose coupling constants are proportional to the corresponding lattice points (entries in the contact matrix). The results show clear boundaries between identified domains and the background. These domain boundaries depend on the coupling constant, so that one matrix yields several clusters of different sizes, which show the self-interaction of the genome on different scales. This work was supported by a Grant from the International Human Frontier Science Program Organization (RGP0014/2014).

  20. Procedural learning in Tourette syndrome, ADHD, and comorbid Tourette-ADHD: Evidence from a probabilistic sequence learning task.

    Science.gov (United States)

    Takács, Ádám; Shilon, Yuval; Janacsek, Karolina; Kóbor, Andrea; Tremblay, Antoine; Németh, Dezső; Ullman, Michael T

    2017-10-01

    Procedural memory, which is rooted in the basal ganglia, plays an important role in the implicit learning of motor and cognitive skills. Few studies have examined procedural learning in either Tourette syndrome (TS) or Attention Deficit Hyperactivity Disorder (ADHD), despite basal ganglia abnormalities in both of these neurodevelopmental disorders. We aimed to assess procedural learning in children with TS (n=13), ADHD (n=22), and comorbid TS-ADHD (n=20), as well as in typically developing children (n=21). Procedural learning was measured with a well-studied implicit probabilistic sequence learning task, the alternating serial reaction time task. All four groups showed evidence of sequence learning, and moreover did not differ from each other in sequence learning. This result, from the first study to examine procedural memory across TS, ADHD and comorbid TS-ADHD, is consistent with previous findings of intact procedural learning of sequences in both TS and ADHD. In contrast, some studies have found impaired procedural learning of non-sequential probabilistic categories in TS. This suggests that sequence learning may be spared in TS and ADHD, while at least some other forms of learning in procedural memory are impaired, at least in TS. Our findings indicate that disorders associated with basal ganglia abnormalities do not necessarily show procedural learning deficits, and provide a possible path for more effective diagnostic tools, and educational and training programs. Copyright © 2017 Elsevier Inc. All rights reserved.

  1. Probabilistic delay differential equation modeling of event-related potentials.

    Science.gov (United States)

    Ostwald, Dirk; Starke, Ludger

    2016-08-01

    "Dynamic causal models" (DCMs) are a promising approach in the analysis of functional neuroimaging data due to their biophysical interpretability and their consolidation of functional-segregative and functional-integrative propositions. In this theoretical note we are concerned with the DCM framework for electroencephalographically recorded event-related potentials (ERP-DCM). Intuitively, ERP-DCM combines deterministic dynamical neural mass models with dipole-based EEG forward models to describe the event-related scalp potential time-series over the entire electrode space. Since its inception, ERP-DCM has been successfully employed to capture the neural underpinnings of a wide range of neurocognitive phenomena. However, in spite of its empirical popularity, the technical literature on ERP-DCM remains somewhat patchy. A number of previous communications have detailed certain aspects of the approach, but no unified and coherent documentation exists. With this technical note, we aim to close this gap and to increase the technical accessibility of ERP-DCM. Specifically, this note makes the following novel contributions: firstly, we provide a unified and coherent review of the mathematical machinery of the latent and forward models constituting ERP-DCM by formulating the approach as a probabilistic latent delay differential equation model. Secondly, we emphasize the probabilistic nature of the model and its variational Bayesian inversion scheme by explicitly deriving the variational free energy function in terms of both the likelihood expectation and variance parameters. Thirdly, we detail and validate the estimation of the model with a special focus on the explicit form of the variational free energy function and introduce a conventional nonlinear optimization scheme for its maximization. Finally, we identify and discuss a number of computational issues which may be addressed in the future development of the approach. Copyright © 2016 Elsevier Inc. All rights reserved.

  2. Applying Probabilistic Decision Models to Clinical Trial Design

    Science.gov (United States)

    Smith, Wade P; Phillips, Mark H

    2018-01-01

    Clinical trial design most often focuses on a single or several related outcomes with corresponding calculations of statistical power. We consider a clinical trial to be a decision problem, often with competing outcomes. Using a current controversy in the treatment of HPV-positive head and neck cancer, we apply several different probabilistic methods to help define the range of outcomes given different possible trial designs. Our model incorporates the uncertainties in the disease process and treatment response and the inhomogeneities in the patient population. Instead of expected utility, we have used a Markov model to calculate quality adjusted life expectancy as a maximization objective. Monte Carlo simulations over realistic ranges of parameters are used to explore different trial scenarios given the possible ranges of parameters. This modeling approach can be used to better inform the initial trial design so that it will more likely achieve clinical relevance.
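
    The Markov-model-with-Monte-Carlo approach described can be sketched with a toy three-state cohort; the states, transition probabilities, and quality weights below are invented for illustration:

      import numpy as np

      rng = np.random.default_rng(7)

      # Hypothetical 3-state model: 0 = disease-free, 1 = recurrence, 2 = dead.
      P = np.array([[0.92, 0.05, 0.03],     # annual transition probabilities
                    [0.00, 0.85, 0.15],
                    [0.00, 0.00, 1.00]])
      utility = np.array([0.9, 0.6, 0.0])   # quality weight per state-year

      def simulate_qale(n_patients=2000, horizon=30):
          qale = np.zeros(n_patients)
          for i in range(n_patients):
              state = 0
              for _ in range(horizon):
                  qale[i] += utility[state]
                  state = rng.choice(3, p=P[state])
          return qale

      q = simulate_qale()
      print(q.mean(), np.percentile(q, [5, 95]))  # QALE distribution

    Rerunning such a simulation over realistic ranges of the transition and utility parameters, as the abstract describes, maps out the range of outcomes for each candidate trial design.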

  3. Does a more sophisticated storm erosion model improve probabilistic erosion estimates?

    NARCIS (Netherlands)

    Ranasinghe, R.W.M.R.J.B.; Callaghan, D.; Roelvink, D.

    2013-01-01

    The dependency between the accuracy/uncertainty of storm erosion exceedance estimates obtained via a probabilistic model and the level of sophistication of the structural function (storm erosion model) embedded in the probabilistic model is assessed via the application of Callaghan et al.'s (2008)

  4. Medication Impairs Probabilistic Classification Learning in Parkinson's Disease

    Science.gov (United States)

    Jahanshahi, Marjan; Wilkinson, Leonora; Gahir, Harpreet; Dharminda, Angeline; Lagnado, David A.

    2010-01-01

    In Parkinson's disease (PD), it is possible that tonic increase of dopamine associated with levodopa medication overshadows phasic release of dopamine, which is essential for learning. Thus while the motor symptoms of PD are improved with levodopa medication, learning would be disrupted. To test this hypothesis, we investigated the effect of…

  5. Systems control with generalized probabilistic fuzzy-reinforcement learning

    NARCIS (Netherlands)

    Hinojosa, J.; Nefti, S.; Kaymak, U.

    2011-01-01

    Reinforcement learning (RL) is a valuable learning method when the systems require a selection of control actions whose consequences emerge over long periods for which input-output data are not available. In most combinations of fuzzy systems and RL, the environment is considered to be

  6. Probabilistic modeling of caprock leakage from seismic reflection data

    DEFF Research Database (Denmark)

    Zunino, Andrea; Hansen, Thomas Mejer; Bergjofd-Kitterød, Ingjerd

    We illustrate a methodology which helps to perform a leakage risk analysis for a CO2 reservoir based on a consistent, probabilistic approach to geophysical and geostatistical inversion. Generally, risk assessments of storage complexes are based on geological models and simulations of CO2 movement...... within the storage complexes. The geological models are built on top of geophysical data such as seismic surveys, geological information and well logs from the reservoir or nearby regions. The risk assessment of CO2 storage requires a careful analysis which accounts for all sources of uncertainty....... However, at present, no well-defined and consistent method for mapping the true uncertainty related to the geophysical data and how that uncertainty affects the overall risk assessment for the potential storage site is available. To properly quantify the uncertainties and to avoid unrealistic...

  7. Semi-supervised learning based probabilistic latent semantic analysis for automatic image annotation

    Institute of Scientific and Technical Information of China (English)

    Tian Dongping

    2017-01-01

    In recent years, the multimedia annotation problem has been attracting significant research attention in the multimedia and computer vision areas, especially automatic image annotation, whose purpose is to provide an efficient and effective searching environment for users to query their images more easily. In this paper, a semi-supervised learning based probabilistic latent semantic analysis (PLSA) model for automatic image annotation is presented. Since it is often hard to obtain or create labeled images in large quantities while unlabeled ones are easier to collect, a transductive support vector machine (TSVM) is exploited to enhance the quality of the training image data. Then, different image features with different magnitudes will result in different performance for automatic image annotation. To this end, a Gaussian normalization method is utilized to normalize different features extracted from effective image regions segmented by the normalized-cuts algorithm, so as to preserve the intrinsic content of images as completely as possible. Finally, a PLSA model with asymmetric modalities is constructed based on the expectation maximization (EM) algorithm to predict a candidate set of annotations with confidence scores. Extensive experiments on the general-purpose Corel5k dataset demonstrate that the proposed model can significantly improve the performance of traditional PLSA for the task of automatic image annotation.

  8. Modeling and control of an unstable system using probabilistic fuzzy inference system

    Directory of Open Access Journals (Sweden)

    Sozhamadevi N.

    2015-09-01

    Full Text Available A new type of fuzzy inference system is proposed: a probabilistic fuzzy inference system which models and minimizes the effects of statistical uncertainties. The blend of two different concepts, degree of truth and probability of truth, in a unique framework leads to this new concept. This combination is carried out both in fuzzy sets and fuzzy rules, which gives rise to Probabilistic Fuzzy Sets and Probabilistic Fuzzy Rules. Introducing these probabilistic elements, a distinctive probabilistic fuzzy inference system is developed, involving fuzzification, inference and output processing. This integrated approach accounts for all of the uncertainty present in the system, such as rule uncertainties and measurement uncertainties, and leads to a design which performs optimally after training. In this paper, a probabilistic fuzzy inference system is applied to the modeling and control of a highly nonlinear, unstable system, and its effectiveness is demonstrated.

  9. Learning Additional Languages as Hierarchical Probabilistic Inference: Insights From First Language Processing.

    Science.gov (United States)

    Pajak, Bozena; Fine, Alex B; Kleinschmidt, Dave F; Jaeger, T Florian

    2016-12-01

    We present a framework of second and additional language (L2/Ln) acquisition motivated by recent work on socio-indexical knowledge in first language (L1) processing. The distribution of linguistic categories covaries with socio-indexical variables (e.g., talker identity, gender, dialects). We summarize evidence that implicit probabilistic knowledge of this covariance is critical to L1 processing, and propose that L2/Ln learning uses the same type of socio-indexical information to probabilistically infer latent hierarchical structure over previously learned and new languages. This structure guides the acquisition of new languages based on their inferred place within that hierarchy, and is itself continuously revised based on new input from any language. This proposal unifies L1 processing and L2/Ln acquisition as probabilistic inference under uncertainty over socio-indexical structure. It also offers a new perspective on crosslinguistic influences during L2/Ln learning, accommodating gradient and continued transfer (both negative and positive) from previously learned to novel languages, and vice versa.

  10. Teacher learning about probabilistic reasoning in relation to teaching it in an Advanced Certificate in Education (ACE) programme

    Directory of Open Access Journals (Sweden)

    Faaiz Gierdien

    2008-02-01

    Full Text Available I report on what teachers in an Advanced Certificate in Education (ACE) in-service programme learned about probabilistic reasoning in relation to teaching it. I worked 'on the inside', using my practice as a site for studying teaching and learning. The teachers were from three different towns in the Northern Cape province and had limited teaching contact time, as is the nature of ACE programmes. Findings revealed a complicated picture, where some teachers were prepared to consider the influences of their intuitive probabilistic reasoning on formal probabilistic reasoning when it came to teaching. It was, however, the 'genuineness' of teacher learning that the findings had to address. A speculative, hopeful strategy for effecting teacher learning in mathematics teacher education practice is therefore to sustain disequilibrium between dichotomies such as formal and intuitive probabilistic reasoning, which have analogies in content and pedagogy, and in subject matter and method.

  11. Application of a probabilistic model of rainfall-induced shallow landslides to complex hollows

    NARCIS (Netherlands)

    Talebi, A.; Uijlenhoet, R.; Troch, P.A.

    2008-01-01

    Recently, D'Odorico and Fagherazzi (2003) proposed "A probabilistic model of rainfall-triggered shallow landslides in hollows" (Water Resour. Res., 39, 2003). Their model describes the long-term evolution of colluvial deposits through a probabilistic soil mass balance at a point. Further building

  12. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  13. Reconstructing Constructivism: Causal Models, Bayesian Learning Mechanisms, and the Theory Theory

    Science.gov (United States)

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework…

  14. Frontostriatal development and probabilistic reinforcement learning during adolescence.

    Science.gov (United States)

    DePasque, Samantha; Galván, Adriana

    2017-09-01

    Adolescence has traditionally been viewed as a period of vulnerability to increased risk-taking and adverse outcomes, which have been linked to neurobiological maturation of the frontostriatal reward system. However, growing research on the role of developmental changes in the adolescent frontostriatal system in facilitating learning will provide a more nuanced view of adolescence. In this review, we discuss the implications of existing research on this topic for learning during adolescence, and suggest that the very neural changes that render adolescents vulnerable to social pressure and risky decision making may also stand to play a role in scaffolding the ability to learn from rewards and from performance-related feedback. Copyright © 2017 Elsevier Inc. All rights reserved.

  15. Risk Management Technologies With Logic and Probabilistic Models

    CERN Document Server

    Solozhentsev, E D

    2012-01-01

    This book presents intellectual, innovative, information technologies (I3-technologies) based on logical and probabilistic (LP) risk models. The technologies presented here consider such models for structurally complex systems and processes with logical links and with random events in economics and technology. The volume describes the following components of risk management technologies: LP-calculus; classes of LP-models of risk and efficiency; procedures for different classes; special software for different classes; examples of applications; methods for the estimation of probabilities of events based on expert information. Also described are a variety of training courses in these topics. The classes of risk models treated here are: LP-modeling, LP-classification, LP-efficiency, and LP-forecasting. Particular attention is paid to LP-models of risk of failure to resolve difficult economic and technical problems. Amongst the discussed procedures of I3-technologies are the construction of LP-models,...

  16. Probabilistic Modeling of the Fatigue Crack Growth Rate for Ni-base Alloy X-750

    International Nuclear Information System (INIS)

    Yoon, J.Y.; Nam, H.O.; Hwang, I.S.; Lee, T.H.

    2012-01-01

    Extending the operating life of existing nuclear power plants (NPPs) beyond 60 years raises many aging problems of passive components, such as PWSCC, IASCC, FAC and corrosion fatigue. Safety analysis combines deterministic analysis with probabilistic analysis, and general probabilistic analyses such as probabilistic safety assessment (PSA) carry many uncertainties in their parameters and relationships. Bayesian inference decreases these uncertainties by updating the unknown parameters. The work aims at ensuring the reliability of passive components (e.g. pipes) as well as active components (e.g. valves, pumps) in NPPs, developing a probabilistic model for failures, and updating the fatigue crack growth rate (FCGR).
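
    As a sketch of the "decreasing uncertainties by updating an unknown parameter" step, the following toy example updates a grid posterior over the Paris-law coefficient from a handful of hypothetical (ΔK, da/dN) measurements, assuming lognormal scatter; all numbers are illustrative, not from the paper.

```python
import numpy as np

# Paris law: da/dN = C * (dK)^m. Treat log10(C) as the uncertain parameter
# (m fixed) and update its grid posterior from hypothetical measurements.
m = 3.0
logC_grid = np.linspace(-12.0, -9.0, 601)
prior = np.exp(-0.5 * ((logC_grid + 10.5) / 0.5) ** 2)   # broad Gaussian prior
prior /= prior.sum()

dK = np.array([8.0, 10.0, 12.0, 15.0])                   # MPa*sqrt(m)
dadN = np.array([2.1e-7, 4.4e-7, 7.2e-7, 1.5e-6])        # m/cycle
sigma = 0.15                                             # scatter, log10 units

log_like = np.array([
    -0.5 * np.sum(((np.log10(dadN) - (lc + m * np.log10(dK))) / sigma) ** 2)
    for lc in logC_grid
])
post = prior * np.exp(log_like - log_like.max())
post /= post.sum()

mean = np.sum(logC_grid * post)
sd = np.sqrt(np.sum((logC_grid - mean) ** 2 * post))
print(f"posterior log10(C): {mean:.3f} +/- {sd:.3f}")    # narrower than the prior
```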

  17. Probabilistic models for reactive behaviour in heterogeneous condensed phase media

    Science.gov (United States)

    Baer, M. R.; Gartling, D. K.; DesJardin, P. E.

    2012-02-01

    This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.

  18. Surrogate reservoir models for CSI well probabilistic production forecast

    Directory of Open Access Journals (Sweden)

    Saúl Buitrago

    2017-09-01

    Full Text Available The aim of this work is to present the construction and use of surrogate reservoir models capable of accurately predicting cumulative oil production for every well stimulated with cyclic steam injection at any given time in a heavy oil reservoir in Mexico, considering uncertain variables. The central composite experimental design technique was selected to capture the maximum amount of information from the model response with a minimum number of reservoir model simulations. Four uncertain input variables (the dead oil viscosity with temperature, the reservoir pressure, the reservoir permeability and the oil sand thickness hydraulically connected to the well) were selected as the ones with most impact on the initial hot oil production rate according to an analytical production prediction model. Twenty-five runs were designed and performed with the STARS simulator for each well type on the reservoir model. The results show that the use of surrogate reservoir models is a fast, viable alternative for performing probabilistic production forecasting of the reservoir.
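
    The 25 runs are consistent with a full central composite design in four factors (2^4 corner points + 2·4 axial points + 1 center point = 25). A minimal sketch of generating such a design in coded units (not the authors' code):

```python
from itertools import product
import numpy as np

def central_composite(k, alpha=None):
    """Coded-unit central composite design: 2^k factorial corners,
    2k axial points at +/-alpha, and one center point."""
    if alpha is None:
        alpha = (2 ** k) ** 0.25          # rotatable design choice
    corners = np.array(list(product([-1.0, 1.0], repeat=k)))
    axial = np.zeros((2 * k, k))
    for i in range(k):
        axial[2 * i, i] = -alpha
        axial[2 * i + 1, i] = alpha
    center = np.zeros((1, k))
    return np.vstack([corners, axial, center])

# Four uncertain factors (viscosity, pressure, permeability, net sand).
design = central_composite(4)
print(design.shape)   # (25, 4) -> the 25 simulator runs per well type
```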

  19. Learning on probabilistic manifolds in massive fusion databases: Application to confinement regime identification

    International Nuclear Information System (INIS)

    Verdoolaege, Geert; Van Oost, Guido

    2012-01-01

    Highlights: ► We present an integrated framework for pattern recognition in fusion data. ► We model measurement uncertainty through an appropriate probability distribution. ► We use the geodesic distance on probabilistic manifolds as a similarity measure. ► We apply the framework to confinement mode classification. ► The classification accuracy benefits from uncertainty information and its geometry. - Abstract: We present an integrated framework for (real-time) pattern recognition in fusion data. The main premise is the inherent probabilistic nature of measurements of plasma quantities. We propose the geodesic distance on probabilistic manifolds as a similarity measure between data points. Substructure induced by data dependencies may further reduce the dimensionality and redundancy of the data set. We present an application to confinement mode classification, showing the distinct advantage obtained by considering the measurement uncertainty and its geometry.
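
    For univariate Gaussian measurement models, the geodesic distance under the Fisher information metric has a closed form, which conveys the flavor of the similarity measure proposed here. A sketch, assuming the standard Fisher-Rao result for normals (the paper's actual implementation will differ):

```python
import numpy as np

def fisher_rao_gauss(mu1, s1, mu2, s2):
    """Geodesic (Fisher-Rao) distance between N(mu1, s1^2) and N(mu2, s2^2).

    The Fisher metric makes the (mu, sigma) half-plane hyperbolic, giving
    the closed form below (a standard result for univariate normals).
    """
    num = (mu1 - mu2) ** 2 / 2.0 + (s1 - s2) ** 2
    return np.sqrt(2.0) * np.arccosh(1.0 + num / (2.0 * s1 * s2))

# Two plasma measurements with error bars, as (mean, standard deviation):
print(fisher_rao_gauss(1.0, 0.1, 1.2, 0.1))   # similar signals
print(fisher_rao_gauss(1.0, 0.1, 1.2, 0.5))   # same means, different noise
```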

  20. E-Area LLWF Vadose Zone Model: Probabilistic Model for Estimating Subsided-Area Infiltration Rates

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Flach, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-12-12

    A probabilistic model employing a Monte Carlo sampling technique was developed in Python to generate statistical distributions of the upslope-intact-area to subsided-area ratio (AreaUAi/AreaSAi) for closure cap subsidence scenarios that differ in assumed percent subsidence and the total number of intact plus subsided compartments. The plan is to use this model as a component in the probabilistic system model for the E-Area Performance Assessment (PA), contributing uncertainty in infiltration estimates.
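
    The report's compartment geometry is not reproduced here, but the following stand-in shows the style of computation: a Monte Carlo draw over which of N equal-area compartments subside, yielding a statistical distribution of the intact-to-subsided area ratio. All parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)

def area_ratio_samples(n_compartments=20, pct_subsided=10.0, n_draws=100_000):
    """Toy Monte Carlo stand-in: each of n_compartments equal-area cells
    subsides independently with probability pct_subsided/100; return the
    intact-to-subsided area ratio for draws with at least one subsided cell."""
    subsided = rng.random((n_draws, n_compartments)) < pct_subsided / 100.0
    k = subsided.sum(axis=1)
    k = k[k > 0]                       # ratio undefined with no subsidence
    return (n_compartments - k) / k

samples = area_ratio_samples()
print(np.percentile(samples, [5, 50, 95]))   # distribution of the area ratio
```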

  1. Age-related difference in the effective neural connectivity associated with probabilistic category learning

    International Nuclear Information System (INIS)

    Yoon, Eun Jin; Cho, Sang Soo; Kim, Hee Jung; Bang, Seong Ae; Park, Hyun Soo; Kim, Yu Kyeong; Kim, Sang Eun

    2007-01-01

    Although it is well known that explicit memory is affected by deleterious changes in the brain with aging, the effect of aging on implicit memory such as probabilistic category learning (PCL) is not clear. To identify the effect of aging on the neural interactions underlying successful PCL, we investigated the neural substrates of PCL and the age-related changes of the neural network between these brain regions. 23 young (age, 25±2 y; 11 males) and 14 elderly (67±3 y; 7 males) healthy subjects underwent FDG PET during a resting state and a 150-trial weather prediction (WP) task. Correlations between the WP hit rates and regional glucose metabolism were assessed using SPM2 (P<0.05 uncorrected). In the path analysis, the functional network for the constructed model proved to be significantly different between groups (χ²_diff(37) = 142.47, P<0.005). Systematic comparisons of each path revealed that the frontal crosscallosal and the frontal-to-parahippocampal connections were most responsible for the model differences (P<0.05). For successful PCL, the elderly recruit the basal ganglia implicit memory system, but MTL recruitment differs from the young. The inadequate MTL correlation pattern in the elderly may be caused by changes in the neural pathways related to explicit memory. These neural changes can explain the decreased performance of PCL in elderly subjects.

  2. From sub-source to source: Interpreting results of biological trace investigations using probabilistic models

    NARCIS (Netherlands)

    Oosterman, W.T.; Kokshoorn, B.; Maaskant-van Wijk, P.A.; de Zoete, J.

    2015-01-01

    The current method of reporting a putative cell type is based on a non-probabilistic assessment of test results by the forensic practitioner. Additionally, the association between donor and cell type in mixed DNA profiles can be exceedingly complex. We present a probabilistic model for

  3. Performance analysis of chi models using discrete-time probabilistic reward graphs

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Markovski, J.; Andova, S.; Vink, de E.P.

    2008-01-01

    We propose the model of discrete-time probabilistic reward graphs (DTPRGs) for performance analysis of systems exhibiting discrete deterministic time delays and probabilistic behavior, via their interpretation as discrete-time Markov reward chains, a full-fledged platform for qualitative and

  4. A Probabilistic Model for Exteriors of Residential Buildings

    KAUST Repository

    Fan, Lubin

    2016-07-29

    We propose a new framework to model the exterior of residential buildings. The main goal of our work is to design a model that can be learned from data that is observable from the outside of a building and that can be trained with widely available data such as aerial images and street-view images. First, we propose a parametric model to describe the exterior of a building (with a varying number of parameters) and propose a set of attributes as a building representation with fixed dimensionality. Second, we propose a hierarchical graphical model with hidden variables to encode the relationships between building attributes and learn both the structure and parameters of the model from the database. Third, we propose optimization algorithms to generate three-dimensional models based on building attributes sampled from the graphical model. Finally, we demonstrate our framework by synthesizing new building models and completing partially observed building models from photographs.

  5. Bayesian statistical methods and their application in probabilistic simulation models

    Directory of Open Access Journals (Sweden)

    Sergio Iannazzo

    2007-03-01

    Full Text Available Bayesian statistical methods are facing a rapidly growing level of interest and acceptance in the field of health economics. The reasons for this success are probably to be found in the theoretical foundations of the discipline, which make these techniques more appealing for decision analysis. To this should be added modern IT progress, which has produced several flexible and powerful statistical software frameworks. Among them, probably one of the most notable is the BUGS language project and its standalone application for MS Windows, WinBUGS. The scope of this paper is to introduce the subject and to show some interesting applications of WinBUGS in developing complex economic models based on Markov chains. The advantages of this approach reside in the elegance of the code produced and in its capability to easily develop probabilistic simulations. Moreover, an example of the integration of Bayesian inference models in a Markov model is shown. This last feature lets the analyst conduct statistical analyses on the available sources of evidence and exploit them directly as inputs to the economic model.
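
    Without reproducing WinBUGS code, the gist of a probabilistic Markov simulation can be sketched in a few lines: draw each transition row of a three-state cohort model from a Dirichlet whose pseudo-counts stand in for the available evidence, and propagate the parameter uncertainty through the cohort trace. Everything below is an illustrative toy, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_row(alpha):
    """Dirichlet draw that tolerates structural zeros in the pseudo-counts."""
    row = np.zeros_like(alpha)
    nz = alpha > 0
    row[nz] = rng.dirichlet(alpha[nz])
    return row

def run_cohort(P, n_cycles=40, start=(1.0, 0.0, 0.0)):
    """Trace a cohort through a 3-state (Well, Sick, Dead) Markov model."""
    trace = [np.array(start)]
    for _ in range(n_cycles):
        trace.append(trace[-1] @ P)
    return np.array(trace)

# Hypothetical pseudo-counts encoding the evidence behind each transition row.
alphas = [np.array([850.0, 100.0, 50.0]),   # from Well
          np.array([0.0, 700.0, 300.0]),    # from Sick (no recovery assumed)
          np.array([0.0, 0.0, 1.0])]        # Dead is absorbing

# Probabilistic simulation: propagate parameter uncertainty through the model.
alive_cycles = []
for _ in range(5000):
    P = np.array([sample_row(a) for a in alphas])
    trace = run_cohort(P)
    alive_cycles.append(trace[:, :2].sum())   # crude life-years-type output
print(np.percentile(alive_cycles, [2.5, 50, 97.5]))
```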

  6. Evaluation of seismic reliability of steel moment resisting frames rehabilitated by concentric braces with probabilistic models

    Directory of Open Access Journals (Sweden)

    Fateme Rezaei

    2017-08-01

    Full Text Available The probability of failure of a structure designed by "deterministic methods" can be higher than that of a structure designed in a similar situation using probabilistic methods and models that consider "uncertainties". The main purpose of this research was to evaluate the seismic reliability of steel moment resisting frames rehabilitated with concentric braces using probabilistic models. To do so, three-story and nine-story steel moment resisting frames were designed based on the resistance criteria of the Iranian code and then rehabilitated based on controlling drift limitations with concentric braces. The probability of frame failure was evaluated using probabilistic models of earthquake magnitude and location, ground shaking intensity in the area of the structure, a probabilistic model of building response (based on maximum lateral roof displacement) and probabilistic methods. These frames were analyzed under a subcrustal source by the sampling probabilistic method "Risk Tools" (RT). Comparing the exceedance probability curves of building response (or selected points on them) of the three-story and nine-story model frames before and after rehabilitation shows that the seismic response of the rehabilitated frames was reduced and their reliability improved. The main variables affecting the probability of frame failure were also determined using sensitivity analysis with the FORM probabilistic method. The most effective variables reducing the probability of frame failure are the magnitude model, the ground shaking intensity model error and the magnitude model error.

  7. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting pa...

  8. A Probabilistic Palimpsest Model of Visual Short-term Memory

    Science.gov (United States)

    Matthey, Loic; Bays, Paul M.; Dayan, Peter

    2015-01-01

    Working memory plays a key role in cognition, and yet its mechanisms remain much debated. Human performance on memory tasks is severely limited; however, the two major classes of theory explaining the limits leave open questions about key issues such as how multiple simultaneously-represented items can be distinguished. We propose a palimpsest model, with the occurrent activity of a single population of neurons coding for several multi-featured items. Using a probabilistic approach to storage and recall, we show how this model can account for many qualitative aspects of existing experimental data. In our account, the underlying nature of a memory item depends entirely on the characteristics of the population representation, and we provide analytical and numerical insights into critical issues such as multiplicity and binding. We consider representations in which information about individual feature values is partially separate from the information about binding that creates single items out of multiple features. An appropriate balance between these two types of information is required to capture fully the different types of error seen in human experimental data. Our model provides the first principled account of misbinding errors. We also suggest a specific set of stimuli designed to elucidate the representations that subjects actually employ. PMID:25611204

  9. Probabilistic consequence model of accidental or intentional chemical releases.

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y.-S.; Samsa, M. E.; Folga, S. M.; Hartmann, H. M.

    2008-06-02

    In this work, general methodologies for evaluating the impacts of large-scale toxic chemical releases are proposed. The potential numbers of injuries and fatalities, the numbers of hospital beds, and the geographical areas rendered unusable during and some time after the occurrence and passage of a toxic plume are estimated on a probabilistic basis. To arrive at these estimates, historical accidental release data, maximum stored volumes, and meteorological data were used as inputs into the SLAB accidental chemical release model. Toxic gas footprints from the model were overlaid onto detailed population and hospital distribution data for a given region to estimate potential impacts. Output results are in the form of a generic statistical distribution of injuries and fatalities associated with specific toxic chemicals and regions of the United States. In addition, indoor hazards were estimated, so the model can provide contingency plans for either shelter-in-place or evacuation when an accident occurs. The stochastic distributions of injuries and fatalities are being used in a U.S. Department of Homeland Security-sponsored decision support system as source terms for a Monte Carlo simulation that evaluates potential measures for mitigating terrorist threats. This information can also be used to support the formulation of evacuation plans and to estimate damage and cleanup costs.

  10. Adaptive high learning rate probabilistic disruption predictors from scratch for the next generation of tokamaks

    International Nuclear Information System (INIS)

    Vega, J.; Moreno, R.; Pereira, A.; Acero, A.; Murari, A.; Dormido-Canto, S.

    2014-01-01

    The development of accurate real-time disruption predictors is a pre-requisite to any mitigation action. Present theoretical models of disruptions do not reliably cope with the disruption issues. This article deals with data-driven predictors and a review of existing machine learning techniques, from both physics and engineering points of view, is provided. All these methods need large training datasets to develop successful predictors. However, ITER or DEMO cannot wait for hundreds of disruptions to have a reliable predictor. So far, the attempts to extrapolate predictors between different tokamaks have not shown satisfactory results. In addition, it is not clear how valid this approach can be between present devices and ITER/DEMO, due to the differences in their respective scales and possibly underlying physics. Therefore, this article analyses the requirements to create adaptive predictors from scratch to learn from the data of an individual machine from the beginning of operation. A particular algorithm based on probabilistic classifiers has been developed and it has been applied to the database of the three first ITER-like wall campaigns of JET (1036 non-disruptive and 201 disruptive discharges). The predictions start from the first disruption and only 12 re-trainings have been necessary as a consequence of missing 12 disruptions only. Almost 10 000 different predictors have been developed (they differ in their features) and after the chronological analysis of the 1237 discharges, the predictors recognize 94% of all disruptions with an average warning time (AWT) of 654 ms. This percentage corresponds to the sum of tardy detections (11%), valid alarms (76%) and premature alarms (7%). The false alarm rate is 4%. If only valid alarms are considered, the AWT is 244 ms and the standard deviation is 205 ms. The average probability interval about the reliability and accuracy of all the individual predictions is 0.811 ± 0.189. (paper)
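
    The particular probabilistic classifier used at JET is not reproduced in this record, but the adaptive learn-from-scratch loop the abstract describes can be sketched generically: start with no model, train at the first disruption, and retrain whenever a disruption is missed. The classifier, features and discharge stream below are stand-ins.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)

# Hypothetical chronological stream of discharges: label 1 = disruptive.
y = (rng.random(1200) < 0.16).astype(int)
X = rng.normal(size=(1200, 2)) + y[:, None] * 1.8    # loosely separable features

# From-scratch loop: no model until the first disruption is seen; afterwards,
# retrain only when a disruption is missed (false alarms ignored in this toy).
train_X, train_y = [], []
model, missed, retrains = None, 0, 0
for x, label in zip(X, y):
    alarm = model is not None and model.predict_proba(x[None, :])[0, 1] > 0.5
    if label == 1 and not alarm:
        missed += 1
        train_X.append(x); train_y.append(label)
        if len(set(train_y)) == 2:                   # need both classes to fit
            model = LogisticRegression().fit(np.array(train_X), np.array(train_y))
            retrains += 1
    elif model is None:
        train_X.append(x); train_y.append(label)     # pre-model history
print(f"missed disruptions: {missed}, (re-)trainings: {retrains}")
```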

  11. Adaptive high learning rate probabilistic disruption predictors from scratch for the next generation of tokamaks

    Science.gov (United States)

    Vega, J.; Murari, A.; Dormido-Canto, S.; Moreno, R.; Pereira, A.; Acero, A.; Contributors, JET-EFDA

    2014-12-01

    The development of accurate real-time disruption predictors is a pre-requisite to any mitigation action. Present theoretical models of disruptions do not reliably cope with the disruption issues. This article deals with data-driven predictors and a review of existing machine learning techniques, from both physics and engineering points of view, is provided. All these methods need large training datasets to develop successful predictors. However, ITER or DEMO cannot wait for hundreds of disruptions to have a reliable predictor. So far, the attempts to extrapolate predictors between different tokamaks have not shown satisfactory results. In addition, it is not clear how valid this approach can be between present devices and ITER/DEMO, due to the differences in their respective scales and possibly underlying physics. Therefore, this article analyses the requirements to create adaptive predictors from scratch to learn from the data of an individual machine from the beginning of operation. A particular algorithm based on probabilistic classifiers has been developed and it has been applied to the database of the three first ITER-like wall campaigns of JET (1036 non-disruptive and 201 disruptive discharges). The predictions start from the first disruption and only 12 re-trainings have been necessary as a consequence of missing 12 disruptions only. Almost 10 000 different predictors have been developed (they differ in their features) and after the chronological analysis of the 1237 discharges, the predictors recognize 94% of all disruptions with an average warning time (AWT) of 654 ms. This percentage corresponds to the sum of tardy detections (11%), valid alarms (76%) and premature alarms (7%). The false alarm rate is 4%. If only valid alarms are considered, the AWT is 244 ms and the standard deviation is 205 ms. The average probability interval about the reliability and accuracy of all the individual predictions is 0.811 ± 0.189.

  12. A Probabilistic Model for Exteriors of Residential Buildings

    KAUST Repository

    Fan, Lubin; Wonka, Peter

    2016-01-01

    We propose a new framework to model the exterior of residential buildings. The main goal of our work is to design a model that can be learned from data that is observable from the outside of a building and that can be trained with widely available

  13. Predicting coastal cliff erosion using a Bayesian probabilistic model

    Science.gov (United States)

    Hapke, Cheryl J.; Plant, Nathaniel G.

    2010-01-01

    Regional coastal cliff retreat is difficult to model due to the episodic nature of failures and the along-shore variability of retreat events. There is a growing demand, however, for predictive models that can be used to forecast areas vulnerable to coastal erosion hazards. Increasingly, probabilistic models are being employed that require data sets of high temporal density to define the joint probability density function that relates forcing variables (e.g. wave conditions) and initial conditions (e.g. cliff geometry) to erosion events. In this study we use a multi-parameter Bayesian network to investigate correlations between key variables that control and influence variations in cliff retreat processes. The network uses Bayesian statistical methods to estimate event probabilities using existing observations. Within this framework, we forecast the spatial distribution of cliff retreat along two stretches of cliffed coast in Southern California. The input parameters are the height and slope of the cliff, a descriptor of material strength based on the dominant cliff-forming lithology, and the long-term cliff erosion rate that represents prior behavior. The model is forced using predicted wave impact hours. Results demonstrate that the Bayesian approach is well-suited to the forward modeling of coastal cliff retreat, with the correct outcomes forecast in 70–90% of the modeled transects. The model also performs well in identifying specific locations of high cliff erosion, thus providing a foundation for hazard mapping. This approach can be employed to predict cliff erosion at time-scales ranging from storm events to the impacts of sea-level rise at the century-scale.
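
    As a toy stand-in for the Bayesian network, the sketch below estimates P(high retreat | cliff height, material strength, wave impact hours) directly by counting over discretized bins with a Laplace prior; a real Bayesian network would factorize this joint table, and all data here is synthetic.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)

# Hypothetical transects: discretized (height bin, strength bin, wave-hours bin),
# with a noisy synthetic rule generating the retreat class (1 = high retreat).
X = [(int(rng.integers(3)), int(rng.integers(2)), int(rng.integers(3)))
     for _ in range(500)]
y = [int((h + 2 * (1 - s) + w) >= 4) for h, s, w in X]
y = [lab if rng.random() > 0.1 else 1 - lab for lab in y]

# Conditional probability table by counting, with a +1 Laplace prior.
counts = Counter(zip(X, y))

def p_high_retreat(x):
    hi = counts[(x, 1)] + 1
    lo = counts[(x, 0)] + 1
    return hi / (hi + lo)

print(p_high_retreat((2, 0, 2)))   # tall, weak cliff, many wave-impact hours
print(p_high_retreat((0, 1, 0)))   # short, strong cliff, few impact hours
```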

  14. A study of probabilistic fatigue crack propagation models in Mg Al Zn alloys under different specimen thickness conditions by using the residual of a random variable

    International Nuclear Information System (INIS)

    Choi, Seon Soon

    2012-01-01

    The primary aim of this paper was to evaluate several probabilistic fatigue crack propagation models using the residual of a random variable, and to present the model fit for probabilistic fatigue behavior in Mg Al Zn alloys. The proposed probabilistic models are the probabilistic Paris-Erdogan model, probabilistic Walker model, probabilistic Forman model, and probabilistic modified Forman model. These models were prepared by applying a random variable to the empirical fatigue crack propagation models of the same names. The best models for describing fatigue crack propagation behavior in Mg Al Zn alloys were generally the probabilistic Paris-Erdogan and probabilistic Walker models. The probabilistic Forman model was a good model only for a specimen with a thickness of 9.45 mm.
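
    The flavor of a probabilistic Paris-Erdogan model can be conveyed by treating the Paris coefficient C as a random variable and propagating it through the crack-growth integration, yielding a distribution of fatigue life. The constants below are illustrative placeholders, not the paper's fitted values.

```python
import numpy as np

rng = np.random.default_rng(11)

def cycles_to_failure(C, m=3.0, a0=1e-3, ac=10e-3, Y=1.12, dS=80e6, da=1e-5):
    """Integrate da/dN = C (dK)^m from a0 to ac, with dK = Y dS sqrt(pi a)."""
    N, a = 0.0, a0
    while a < ac:
        dK = Y * dS * np.sqrt(np.pi * a)
        N += da / (C * dK ** m)
        a += da
    return N

# Probabilistic Paris-Erdogan: coefficient C as a lognormal random variable
# (SI units; parameters chosen only for illustration).
C_samples = np.exp(rng.normal(np.log(3e-29), 0.3, size=500))
lives = np.array([cycles_to_failure(C) for C in C_samples])
print(np.percentile(lives, [5, 50, 95]))   # scatter in predicted fatigue life
```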

  15. From equilibrium spin models to probabilistic cellular automata

    International Nuclear Information System (INIS)

    Georges, A.; Le Doussal, P.

    1989-01-01

    The general equivalence between D-dimensional probabilistic cellular automata (PCA) and (D + 1)-dimensional equilibrium spin models satisfying a disorder condition is first described in a pedagogical way and then used to analyze the phase diagrams, the critical behavior, and the universality classes of some automata. Diagrammatic representations of time-dependent correlation functions of PCA are introduced. Two important classes of PCA are singled out for which these correlation functions simplify: (1) quasi-Hamiltonian automata, which have a current-carrying steady state and for which some correlation functions are those of a D-dimensional static model; PCA satisfying the detailed balance condition appear as a particular case of these rules, for which the current vanishes. (2) Linear (and more generally affine) PCA, for which the diagrammatics reduces to a random walk problem closely related to (D + 1)-dimensional directed SAWs: both problems display critical behavior with mean-field exponents in any dimension. The correlation length and effective velocity of propagation of excitations can be calculated for affine PCA, as is shown on an explicit D = 1 example. The authors conclude with some remarks on nonlinear PCA, for which the diagrammatics is related to reaction-diffusion processes, and which belong in some cases to the universality class of Reggeon field theory.

  16. Age-related difference in the effective neural connectivity associated with probabilistic category learning

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, Eun Jin; Cho, Sang Soo; Kim, Hee Jung; Bang, Seong Ae; Park, Hyun Soo; Kim, Yu Kyeong; Kim, Sang Eun [Seoul National Univ. College of Medicine, Seoul (Korea, Republic of)

    2007-07-01

    Although it is well known that explicit memory is affected by deleterious changes in the brain with aging, the effect of aging on implicit memory such as probabilistic category learning (PCL) is not clear. To identify the effect of aging on the neural interactions underlying successful PCL, we investigated the neural substrates of PCL and the age-related changes of the neural network between these brain regions. 23 young (age, 25±2 y; 11 males) and 14 elderly (67±3 y; 7 males) healthy subjects underwent FDG PET during a resting state and a 150-trial weather prediction (WP) task. Correlations between the WP hit rates and regional glucose metabolism were assessed using SPM2 (P<0.05 uncorrected). For path analysis, seven brain regions (bilateral middle frontal gyri and putamen, left fusiform gyrus, anterior cingulate and right parahippocampal gyri) were selected based on the results of the correlation analysis. Model construction and path analysis were done with AMOS 5.0. The elderly had significantly lower total hit rates than the young (P<0.005). In the correlation analysis, both groups showed similar metabolic correlations in frontal and striatal areas, but correlations in the medial temporal lobe (MTL) differed by group. In the path analysis, the functional network for the constructed model was accepted (χ² = 0.80, P = 0.67) and proved to be significantly different between groups (χ²_diff(37) = 142.47, P<0.005). Systematic comparisons of each path revealed that the frontal crosscallosal and the frontal-to-parahippocampal connections were most responsible for the model differences (P<0.05). For successful PCL, the elderly recruit the basal ganglia implicit memory system, but MTL recruitment differs from the young. The inadequate MTL correlation pattern in the elderly may be caused by changes in the neural pathways related to explicit memory. These neural changes can explain the decreased performance of PCL in elderly subjects.

  17. Probabilistic Category Learning in Developmental Dyslexia: Evidence from Feedback and Paired-Associate Weather Prediction Tasks

    Science.gov (United States)

    Gabay, Yafit; Vakil, Eli; Schiff, Rachel; Holt, Lori L.

    2015-01-01

    Objective Developmental dyslexia is presumed to arise from specific phonological impairments. However, an emerging theoretical framework suggests that phonological impairments may be symptoms stemming from an underlying dysfunction of procedural learning. Method We tested procedural learning in adults with dyslexia (n=15) and matched-controls (n=15) using two versions of the Weather Prediction Task: Feedback (FB) and Paired-associate (PA). In the FB-based task, participants learned associations between cues and outcomes initially by guessing and subsequently through feedback indicating the correctness of response. In the PA-based learning task, participants viewed the cue and its associated outcome simultaneously without overt response or feedback. In both versions, participants trained across 150 trials. Learning was assessed in a subsequent test without presentation of the outcome, or corrective feedback. Results The Dyslexia group exhibited impaired learning compared with the Control group on both the FB and PA versions of the weather prediction task. Conclusions The results indicate that the ability to learn by feedback is not selectively impaired in dyslexia. Rather it seems that the probabilistic nature of the task, shared by the FB and PA versions of the weather prediction task, hampers learning in those with dyslexia. Results are discussed in light of procedural learning impairments among participants with dyslexia. PMID:25730732

  18. A probabilistic model for cell population phenotyping using HCS data.

    Directory of Open Access Journals (Sweden)

    Edouard Pauwels

    Full Text Available High Content Screening (HCS) platforms allow screening living cells under a wide range of experimental conditions and give access to a whole panel of cellular responses to a specific treatment. The outcome is a series of cell population images. Within these images, the heterogeneity of cellular response to the same treatment leads to a whole range of observed values for the recorded cellular features. Consequently, it is difficult to compare and interpret experiments. Moreover, the definition of phenotypic classes at a cell population level remains an open question, although this would ease the analysis of experiments. In the present work, we tackle these two questions. The input of the method is a series of cell population images for which segmentation and cellular phenotype classification have already been performed. We propose a probabilistic model to represent and later compare cell populations. The model is able to fully exploit the HCS-specific information: the dependence structure of population descriptors and the within-population variability. The experiments we carried out illustrate how our model accounts for this specific information and show that the model benefits from considering it. We underline that these features allow richer HCS data analysis than simpler methods based on single cellular feature values averaged over each well. We validate an HCS data analysis method based on control experiments. It accounts for HCS specificities that were not taken into account by previous methods but have a sound biological meaning. Biological validation of previously unknown outputs of the method constitutes a future line of work.

  19. A probabilistic model of ecosystem response to climate change

    International Nuclear Information System (INIS)

    Shevliakova, E.; Dowlatabadi, H.

    1994-01-01

    Anthropogenic activities are leading to rapid changes in land cover and emissions of greenhouse gases into the atmosphere. These changes can bring about climate change typified by average global temperatures rising by 1-5 °C over the next century. Climate change of this magnitude is likely to alter the distribution of terrestrial ecosystems on a large scale. Options available for dealing with such change are abatement of emissions, adaptation, and geoengineering. The integrated assessment of climate change demands that frameworks be developed where all the elements of the climate problem are present (from economic activity to climate change and its impacts on market and non-market goods and services). Integrated climate assessment requires multiple impact metrics and multi-attribute utility functions to simulate the response of different key actors/decision-makers to the actual physical impacts (rather than a dollar value) of the climate-damage vs. policy-cost debate. This necessitates direct modeling of ecosystem impacts of climate change. The authors have developed a probabilistic model of ecosystem response to global change. This model differs from previous efforts in that it is statistically estimated using actual ecosystem and climate data, yielding a joint multivariate probability of prevalence for each ecosystem, given climatic conditions. The authors expect this approach to permit simulation of inertia and competition, which have, so far, been absent in transfer models of continental-scale ecosystem response to global change. Thus, although the probability of one ecotype will dominate others at a given point, others would have the possibility of establishing an early foothold

  20. Statistical physics of medical diagnostics: Study of a probabilistic model.

    Science.gov (United States)

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.

  1. A Probabilistic Recommendation Method Inspired by Latent Dirichlet Allocation Model

    Directory of Open Access Journals (Sweden)

    WenBo Xie

    2014-01-01

    Full Text Available The recent decade has witnessed an increasing popularity of recommendation systems, which help users acquire relevant knowledge, commodities, and services from an overwhelming information ocean on the Internet. Latent Dirichlet Allocation (LDA), originally presented as a graphical model for text topic discovery, has now found its application in many other disciplines. In this paper, we propose an LDA-inspired probabilistic recommendation method by taking the user-item collecting behavior as a two-step process: every user first becomes a member of one latent user-group with a certain probability, and each user-group will then collect various items with different probabilities. Gibbs sampling is employed to approximate all the probabilities in the two-step process. The experiment results on three real-world data sets MovieLens, Netflix, and Last.fm show that our method exhibits a competitive performance on precision, coverage, and diversity in comparison with the other four typical recommendation methods. Moreover, we present an approximate strategy to reduce the computing complexity of our method with a slight degradation of the performance.
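
    The two-step generative story is easy to state in code. Below, the group-per-user and item-per-group distributions that Gibbs sampling would estimate are simply drawn from Dirichlet priors for illustration; recommendation then ranks items by p(item | user) marginalized over latent groups.

```python
import numpy as np

rng = np.random.default_rng(5)

n_users, n_groups, n_items = 100, 4, 50

# Stand-ins for the learned parameters (drawn from Dirichlet priors here).
theta = rng.dirichlet(np.ones(n_groups), size=n_users)   # p(group | user)
phi = rng.dirichlet(np.ones(n_items), size=n_groups)     # p(item | group)

def recommend(user, top_k=5):
    """Two-step process: user -> latent group -> item, marginalized over groups."""
    p_item = theta[user] @ phi          # p(item | user)
    return np.argsort(p_item)[::-1][:top_k]

def sample_collection(user):
    """Or sample the generative story directly: pick a group, then an item."""
    z = rng.choice(n_groups, p=theta[user])
    return rng.choice(n_items, p=phi[z])

print(recommend(user=0))
```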

  2. Statistical physics of medical diagnostics: Study of a probabilistic model

    Science.gov (United States)

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.

  3. Comprehensive probabilistic modelling of environmental emissions of engineered nanomaterials.

    Science.gov (United States)

    Sun, Tian Yin; Gottschalk, Fadri; Hungerbühler, Konrad; Nowack, Bernd

    2014-02-01

    Concerns about the environmental risks of engineered nanomaterials (ENM) are growing, however, currently very little is known about their concentrations in the environment. Here, we calculate the concentrations of five ENM (nano-TiO2, nano-ZnO, nano-Ag, CNT and fullerenes) in environmental and technical compartments using probabilistic material-flow modelling. We apply the newest data on ENM production volumes, their allocation to and subsequent release from different product categories, and their flows into and within those compartments. Further, we compare newly predicted ENM concentrations to estimates from 2009 and to corresponding measured concentrations of their conventional materials, e.g. TiO2, Zn and Ag. We show that the production volume and the compounds' inertness are crucial factors determining final concentrations. ENM production estimates are generally higher than a few years ago. In most cases, the environmental concentrations of corresponding conventional materials are between one and seven orders of magnitude higher than those for ENM. Copyright © 2013 Elsevier Ltd. All rights reserved.
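
    A minimal sketch of the probabilistic material-flow idea: sample uncertain production volumes and transfer coefficients, and propagate them to a predicted concentration in one environmental compartment. All distributions and values below are hypothetical placeholders, not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(9)
n = 100_000   # Monte Carlo draws

production = rng.triangular(50, 100, 200, n)         # ENM production, t/year
f_to_wastewater = rng.uniform(0.01, 0.10, n)         # fraction released to sewage
wwtp_removal = rng.uniform(0.90, 0.99, n)            # fraction removed in treatment
receiving_water = 2.0e11                             # m3/year of receiving water

# Mass reaching surface water (ug/year), then concentration in ug/L.
mass_ug = production * 1e12 * f_to_wastewater * (1.0 - wwtp_removal)
conc_ug_per_L = mass_ug / (receiving_water * 1e3)
print(np.percentile(conc_ug_per_L, [15, 50, 85]))    # spread of predicted levels
```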

  4. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    Science.gov (United States)

    One of the major recommendations of the National Academy of Science to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  5. PACE: Probabilistic Assessment for Contributor Estimation- A machine learning-based assessment of the number of contributors in DNA mixtures.

    Science.gov (United States)

    Marciano, Michael A; Adelman, Jonathan D

    2017-03-01

    The deconvolution of DNA mixtures remains one of the most critical challenges in the field of forensic DNA analysis. In addition, of all the data features required to perform such deconvolution, the number of contributors in the sample is widely considered the most important, and, if incorrectly chosen, the most likely to negatively influence the mixture interpretation of a DNA profile. Unfortunately, most current approaches to mixture deconvolution require the assumption that the number of contributors is known by the analyst, an assumption that can prove to be especially faulty when faced with increasingly complex mixtures of 3 or more contributors. In this study, we propose a probabilistic approach for estimating the number of contributors in a DNA mixture that leverages the strengths of machine learning. To assess this approach, we compare classification performances of six machine learning algorithms and evaluate the model from the top-performing algorithm against the current state of the art in the field of contributor number classification. Overall results show over 98% accuracy in identifying the number of contributors in a DNA mixture of up to 4 contributors. Comparative results showed 3-person mixtures had a classification accuracy improvement of over 6% compared to the current best-in-field methodology, and that 4-person mixtures had a classification accuracy improvement of over 20%. The Probabilistic Assessment for Contributor Estimation (PACE) also accomplishes classification of mixtures of up to 4 contributors in less than 1s using a standard laptop or desktop computer. Considering the high classification accuracy rates, as well as the significant time commitment required by the current state of the art model versus seconds required by a machine learning-derived model, the approach described herein provides a promising means of estimating the number of contributors and, subsequently, will lead to improved DNA mixture interpretation. Copyright © 2016
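
    The record does not disclose PACE's feature set, but the classifier-comparison step can be sketched with standard tooling: cross-validated accuracy of several algorithms on a synthetic stand-in for per-profile mixture features, with classes representing 1-4 contributors.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

# Synthetic stand-in for mixture features; 4 classes = 1-4 contributors.
X, y = make_classification(n_samples=2000, n_features=30, n_informative=12,
                           n_classes=4, n_clusters_per_class=1, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=2000),
    "rbf-svm": SVC(),
    "knn": KNeighborsClassifier(),
    "random-forest": RandomForestClassifier(n_estimators=200, random_state=0),
}
for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5)
    print(f"{name:14s} accuracy = {scores.mean():.3f} +/- {scores.std():.3f}")
```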

  6. Formulation of probabilistic models of protein structure in atomic detail using the reference ratio method.

    Science.gov (United States)

    Valentin, Jan B; Andreetta, Christian; Boomsma, Wouter; Bottaro, Sandro; Ferkinghoff-Borg, Jesper; Frellsen, Jes; Mardia, Kanti V; Tian, Pengfei; Hamelryck, Thomas

    2014-02-01

    We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length scale, which concern the dihedral angles in main chain and side chains, respectively. Conceptually, this constitutes a probabilistic and continuous alternative to the use of discrete fragment and rotamer libraries. The local model is combined with a nonlocal model that involves a small number of energy terms according to a physical force field, and some information on the overall secondary structure content. In this initial study we focus on the formulation of the joint model and the evaluation of the use of an energy vector as a descriptor of a protein's nonlocal structure; hence, we derive the parameters of the nonlocal model from the native structure without loss of generality. The local and nonlocal models are combined using the reference ratio method, which is a well-justified probabilistic construction. For evaluation, we use the resulting joint models to predict the structure of four proteins. The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications. Copyright © 2013 Wiley Periodicals, Inc.
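
    The reference ratio construction combines a local model q(x) with a target distribution p_L(y) over a coarse-grained descriptor y = f(x), via p(x) ∝ q(x)·p_L(f(x))/q_L(f(x)). A toy sketch using importance reweighting, with Gaussians standing in for the protein-specific ingredients:

```python
import numpy as np

rng = np.random.default_rng(13)

# Local model q(x): 10 i.i.d. standard normal "angles" per sample
# (a stand-in for a local structural model).
x = rng.normal(0.0, 1.0, size=(100_000, 10))

# Coarse-grained descriptor y = f(x): the mean coordinate, standing in for
# e.g. an energy term or secondary-structure content.
y = x.mean(axis=1)

def log_p_L(y):
    """Target distribution for the descriptor, assumed known: N(0.5, 0.2^2)."""
    return -0.5 * ((y - 0.5) / 0.2) ** 2

def log_q_L(y):
    """Descriptor distribution under q: N(0, 1/10) analytically here; in
    general it would be estimated from samples (histogram/KDE)."""
    return -0.5 * y ** 2 / (1.0 / 10.0)

# Reference ratio reweighting: p(x) proportional to q(x) * p_L(y) / q_L(y).
w = np.exp(log_p_L(y) - log_q_L(y))
w /= w.sum()
resampled = x[rng.choice(len(x), size=10_000, p=w)]
print(resampled.mean(axis=1).mean())   # descriptor now concentrates near 0.5
```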

  7. Probabilistic Analysis of Passive Safety System Reliability in Advanced Small Modular Reactors: Methodologies and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia; Grelle, Austin

    2015-06-28

    Many advanced small modular reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize with a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper describes the most promising options: mechanistic techniques, which share qualities with conventional probabilistic methods, and simulation-based techniques, which explicitly account for time-dependent processes. The primary intention of this paper is to describe the strengths and weaknesses of each methodology and to highlight the lessons learned while applying the two techniques, along with high-level results. This includes the global benefits and deficiencies of the methods and the practical problems encountered during the implementation of each technique.

  8. Understanding onsets of rainfall in Southern Africa using temporal probabilistic modelling

    CSIR Research Space (South Africa)

    Cheruiyot, D

    2010-12-01

    Full Text Available This research investigates an alternative approach to automatically evolve the hidden temporal distribution of onset of rainfall directly from multivariate time series (MTS) data in the absence of domain experts. Temporal probabilistic modelling...

  9. Non-probabilistic defect assessment for structures with cracks based on interval model

    International Nuclear Information System (INIS)

    Dai, Qiao; Zhou, Changyu; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-01-01

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on the clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced to the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on the interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) will vary in a certain range, which is defined as IFAC. And the assessment point will vary within a rectangle zone which is defined as an assessment rectangle. Based on the interval model, the establishment of IFAC and the determination of the assessment rectangle are presented. Then according to the interval possibility degree method, the non-probabilistic reliability degree of IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example for the assessment of a pipe with crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve the practical problem with interval variables
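
    A toy sketch of the interval idea: propagate interval-valued inputs to intervals for the load ratio Lr and fracture ratio Kr (together spanning the "assessment rectangle"), then check the rectangle's corners against a failure assessment curve. The R6 Option-1-style curve below is used only as an example; the paper's IFAC construction differs, and all inputs are hypothetical.

```python
import numpy as np

# Interval arithmetic on positive quantities, as (lo, hi) pairs.
def idiv(a, b):
    return (a[0] / b[1], a[1] / b[0])

# Interval inputs (hypothetical).
K_I   = (40.0, 55.0)    # applied stress intensity, MPa*sqrt(m)
K_mat = (90.0, 110.0)   # material toughness, MPa*sqrt(m)
s_ref = (150.0, 190.0)  # reference stress, MPa
s_y   = (250.0, 280.0)  # yield stress, MPa

Kr = idiv(K_I, K_mat)   # interval of the fracture ratio
Lr = idiv(s_ref, s_y)   # interval of the load ratio

def fac(lr):
    """R6 Option-1 style failure assessment curve (example FAC only)."""
    return (1 - 0.14 * lr ** 2) * (0.3 + 0.7 * np.exp(-0.65 * lr ** 6))

# Worst and best corners of the assessment rectangle vs. the curve.
entirely_safe = Kr[1] <= fac(Lr[1])
entirely_unsafe = Kr[0] > fac(Lr[0])
print(f"rectangle entirely safe: {entirely_safe}; entirely unsafe: {entirely_unsafe}")
```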

  10. Non-probabilistic defect assessment for structures with cracks based on interval model

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Qiao; Zhou, Changyu, E-mail: changyu_zhou@163.com; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-09-15

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on the clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced to the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on the interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) will vary in a certain range, which is defined as IFAC. And the assessment point will vary within a rectangle zone which is defined as an assessment rectangle. Based on the interval model, the establishment of IFAC and the determination of the assessment rectangle are presented. Then according to the interval possibility degree method, the non-probabilistic reliability degree of IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example for the assessment of a pipe with crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve the practical problem with interval variables.

  11. Probabilistic Mobility Models for Mobile and Wireless Networks

    DEFF Research Database (Denmark)

    Song, Lei; Godskesen, Jens Christian

    2010-01-01

    In this paper we present a probabilistic broadcast calculus for mobile and wireless networks whose connections are unreliable. In our calculus broadcasted messages can be lost with a certain probability, and due to mobility the connection probabilities may change. If a network broadcasts a message...... from a location it will evolve to a network distribution depending on whether nodes at other locations receive the message or not. Mobility of locations is not arbitrary but guarded by a probabilistic mobility function (PMF) and we also define the notion of a weak bisimulation given a PMF...

  12. BN-FLEMOps pluvial - A probabilistic multi-variable loss estimation model for pluvial floods

    Science.gov (United States)

    Roezer, V.; Kreibich, H.; Schroeter, K.; Doss-Gollin, J.; Lall, U.; Merz, B.

    2017-12-01

    Pluvial flood events, such as in Copenhagen (Denmark) in 2011, Beijing (China) in 2012 or Houston (USA) in 2016, have caused severe losses to urban dwellings in recent years. These floods are caused by storm events with high rainfall rates well above the design levels of urban drainage systems, which lead to inundation of streets and buildings. A projected increase in the frequency and intensity of heavy rainfall events in many areas and ongoing urbanization may increase pluvial flood losses in the future. For efficient risk assessment and adaptation to pluvial floods, a quantification of the flood risk is needed. Few loss models have been developed particularly for pluvial floods. These models usually use simple water-level or rainfall loss functions and come with very high uncertainties. To account for these uncertainties and improve the loss estimation, we present a probabilistic multi-variable loss estimation model for pluvial floods based on empirical data. The model was developed in a two-step process using a machine learning approach and a comprehensive database comprising 783 records of direct building and content damage of private households. The data were gathered through surveys after four different pluvial flood events in Germany between 2005 and 2014. In a first step, linear and non-linear machine learning algorithms, such as tree-based and penalized regression models, were used to identify the most important loss influencing factors among a set of 55 candidate variables. These variables comprise hydrological and hydraulic aspects, early warning, precaution, building characteristics and the socio-economic status of the household. In a second step, the most important loss influencing variables were used to derive a probabilistic multi-variable pluvial flood loss estimation model based on Bayesian networks. Two different networks were tested: a score-based network learned from the data and a network based on expert knowledge. Loss predictions are made
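
    As a minimal sketch of the first modeling step described above, a penalized (lasso) regression can rank candidate loss-influencing variables; the synthetic matrix below merely stands in for the 783 survey records and 55 candidate variables.

        # Step-1 sketch: lasso-based ranking of candidate loss drivers.
        import numpy as np
        from sklearn.linear_model import LassoCV

        rng = np.random.default_rng(0)
        X = rng.normal(size=(783, 55))                 # candidate predictors
        # Hypothetical relative loss driven by two of the candidates.
        y = 0.8 * X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=783)

        lasso = LassoCV(cv=5).fit(X, y)
        top10 = np.argsort(-np.abs(lasso.coef_))[:10]
        print("highest-ranked candidate variables:", top10)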

  13. Individual model evaluation and probabilistic weighting of models

    International Nuclear Information System (INIS)

    Atwood, C.L.

    1994-01-01

    This note stresses the importance of trying to assess the accuracy of each model individually. Putting a Bayesian probability distribution on a population of models faces conceptual and practical complications, and apparently can come only after the work of evaluating the individual models. Moreover, the primary issue is "How good is this model?" Therefore, the individual evaluations are first in both chronology and importance. They are not easy, but some ideas are given here on how to perform them.

  14. Probabilistic Failure Analysis of Bone Using a Finite Element Model of Mineral-Collagen Composites

    OpenAIRE

    Dong, X. Neil; Guda, Teja; Millwater, Harry R.; Wang, Xiaodu

    2008-01-01

    Microdamage accumulation is a major pathway for energy dissipation during the post-yield deformation of bone. In this study, a two-dimensional probabilistic finite element model of a mineral-collagen composite was developed to investigate the influence of the tissue and ultrastructural properties of bone on the evolution of microdamage from an initial defect in tension. The probabilistic failure analyses indicated that the microdamage progression would be along the plane of the initial defect...

  15. Improved detection of chemical substances from colorimetric sensor data using probabilistic machine learning

    Science.gov (United States)

    Mølgaard, Lasse L.; Buus, Ole T.; Larsen, Jan; Babamoradi, Hamid; Thygesen, Ida L.; Laustsen, Milan; Munk, Jens Kristian; Dossi, Eleftheria; O'Keeffe, Caroline; Lässig, Lina; Tatlow, Sol; Sandström, Lars; Jakobsen, Mogens H.

    2017-05-01

    We present a data-driven machine learning approach to detect drug- and explosives-precursors using colorimetric sensor technology for air-sampling. The sensing technology has been developed in the context of the CRIM-TRACK project. At present a fully integrated portable prototype for air sampling with disposable sensing chips and automated data acquisition has been developed. The prototype allows for fast, user-friendly sampling, which has made it possible to produce large datasets of colorimetric data for different target analytes in laboratory and simulated real-world application scenarios. To make use of the highly multi-variate data produced from the colorimetric chip, a number of machine learning techniques are employed to provide reliable classification of target analytes from confounders found in the air streams. We demonstrate that a data-driven machine learning method using dimensionality reduction in combination with a probabilistic classifier makes it possible to produce informative features and a high detection rate of analytes. Furthermore, the probabilistic machine learning approach provides a means of automatically identifying unreliable measurements that could produce false predictions. The robustness of the colorimetric sensor has been evaluated in a series of experiments focusing on the amphetamine precursor phenylacetone as well as the improvised-explosives precursor hydrogen peroxide. The analysis demonstrates that the system is able to detect analytes in clean air and mixed with substances that occur naturally in real-world sampling scenarios. The technology under development in CRIM-TRACK thus has potential as an effective tool in the control of drug trafficking, in explosives detection, and in other law enforcement applications.
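
    A minimal sketch of that pipeline -- dimensionality reduction followed by a probabilistic classifier whose posterior also flags unreliable measurements; the data shapes and the 0.9 acceptance threshold are assumptions, not CRIM-TRACK values.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.linear_model import LogisticRegression
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(0)
        X_train = rng.normal(size=(200, 300))    # colorimetric chip features
        y_train = rng.integers(0, 2, size=200)   # analyte vs. confounder label

        clf = make_pipeline(PCA(n_components=10),
                            LogisticRegression(max_iter=1000))
        clf.fit(X_train, y_train)

        X_new = rng.normal(size=(5, 300))
        proba = clf.predict_proba(X_new)
        # Measurements whose maximum class probability is low are rejected
        # instead of being allowed to produce false predictions.
        reliable = proba.max(axis=1) >= 0.9
        print(proba.round(2), reliable)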

  16. Convolution product construction of interactions in probabilistic physical models

    International Nuclear Information System (INIS)

    Ratsimbarison, H.M.; Raboanary, R.

    2007-01-01

    This paper aims to give a probabilistic construction of interactions which may be relevant for building physical theories such as interacting quantum field theories. We start with the path integral definition of the partition function in quantum field theory, which reminds us of the probabilistic nature of this physical theory. From a Gaussian law considered as the free theory, an interacting theory is constructed by a nontrivial convolution product between the free theory and an interacting term which is also a probability law. The resulting theory, again a probability law, exhibits two properties already present in today's theories of interactions such as gauge theory: the interaction term does not depend on the free term, and two different free theories can be implemented with the same interaction.
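
    Schematically, in LaTeX form (notation ours, not the paper's): if the free theory is a Gaussian law and the interaction term is an independent probability law, the interacting theory is their convolution,

        % density of the interacting theory as a convolution of two laws
        p_{\mathrm{interacting}}(z)
          = (p_{\mathrm{free}} * p_{\mathrm{int}})(z)
          = \int p_{\mathrm{free}}(z - y)\, p_{\mathrm{int}}(y)\, \mathrm{d}y,
        \qquad
        p_{\mathrm{free}}(x)
          = \frac{1}{\sqrt{2\pi}\,\sigma}\exp\!\Big(-\frac{x^{2}}{2\sigma^{2}}\Big),

    which is again a probability law, and in which the free factor is left untouched by the choice of interaction term.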

  17. Quantification of Wave Model Uncertainties Used for Probabilistic Reliability Assessments of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2015-01-01

    Wave models used for site assessments are subjected to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Four different wave models are considered, and validation...... data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, this paper presents how the quantified...... uncertainties can be implemented in probabilistic reliability assessments....
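
    A minimal sketch of the three validation statistics named above, computed here for significant wave height; the observed and modelled arrays are illustrative placeholders.

        import numpy as np

        hs_obs = np.array([1.2, 2.1, 0.9, 3.4, 2.8])   # measured Hs [m]
        hs_mod = np.array([1.1, 2.4, 1.0, 3.1, 3.0])   # wave-model Hs [m]

        bias = np.mean(hs_mod - hs_obs)
        rmse = np.sqrt(np.mean((hs_mod - hs_obs) ** 2))
        si = rmse / np.mean(hs_obs)   # one common scatter-index definition

        print(f"bias = {bias:.3f} m, RMSE = {rmse:.3f} m, SI = {si:.3f}")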

  18. Determination of Wave Model Uncertainties used for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on determination of wave model uncertainties. Considered are four different wave models and validation...... data is collected from published scientific research. The bias, the root-mean-square error as well as the scatter index are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example it is shown how the estimated uncertainties can...... be implemented in probabilistic reliability assessments....

  19. Probability Modeling and Thinking: What Can We Learn from Practice?

    Science.gov (United States)

    Pfannkuch, Maxine; Budgett, Stephanie; Fewster, Rachel; Fitch, Marie; Pattenwise, Simeon; Wild, Chris; Ziedins, Ilze

    2016-01-01

    Because new learning technologies are enabling students to build and explore probability models, we believe that there is a need to determine the big enduring ideas that underpin probabilistic thinking and modeling. By uncovering the elements of the thinking modes of expert users of probability models we aim to provide a base for the setting of…

  20. An Empirical Study of Efficiency and Accuracy of Probabilistic Graphical Models

    DEFF Research Database (Denmark)

    Nielsen, Jens Dalgaard; Jaeger, Manfred

    2006-01-01

    In this paper we compare Naïve Bayes (NB) models, general Bayes Net (BN) models and Probabilistic Decision Graph (PDG) models w.r.t. accuracy and efficiency. As the basis for our analysis we use graphs of size vs. likelihood that show the theoretical capabilities of the models. We also measure...

  1. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    Energy Technology Data Exchange (ETDEWEB)

    Cetiner, Mustafa Sacit; none,; Flanagan, George F. [ORNL; Poore III, Willis P. [ORNL; Muhlheim, Michael David [ORNL

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster-than-real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C++, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  2. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    Science.gov (United States)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
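
    A minimal sketch of the surrogate idea, with a Gaussian-process regressor standing in for the trained surrogate and a toy function standing in for the 3D finite element analysis; both are illustrative assumptions.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def expensive_fem_model(x):
            # placeholder for a high-fidelity crack-growth evaluation
            return np.sin(3 * x) + 0.5 * x

        X_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)  # e.g. crack sizes
        y_train = expensive_fem_model(X_train).ravel()

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-6)
        gp.fit(X_train, y_train)

        # The cheap surrogate can now be evaluated thousands of times inside
        # an inverse uncertainty-quantification / remaining-useful-life loop.
        mean, std = gp.predict(np.array([[1.3]]), return_std=True)
        print(mean, std)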

  3. Brain function during probabilistic learning in relation to IQ and level of education.

    Science.gov (United States)

    van den Bos, Wouter; Crone, Eveline A; Güroğlu, Berna

    2012-02-15

    Knowing how to adapt your behavior based on feedback lies at the core of successful learning. We investigated the relation between brain function, grey matter volume, educational level and IQ in a Dutch adolescent sample. In total 45 healthy volunteers between ages 13 and 16 were recruited from schools for pre-vocational and pre-university education. For each individual, IQ was estimated using two subtests from the WISC-III-R (similarities and block design). While in the magnetic resonance imaging (MRI) scanner, participants performed a probabilistic learning task. Behavioral comparisons showed that participants with higher IQ used a more adaptive learning strategy after receiving positive feedback. Analysis of neural activation revealed that higher IQ was associated with increased activation in DLPFC and dACC when receiving positive feedback, specifically for rules with low reward probability (i.e., unexpected positive feedback). Furthermore, VBM analyses revealed that IQ correlated positively with grey matter volume within these regions. These results provide support for IQ-related individual differences in the developmental time courses of neural circuitry supporting feedback-based learning. Current findings are interpreted in terms of a prolonged window of flexibility and opportunity for adolescents with higher IQ scores. Copyright © 2011 Elsevier Ltd. All rights reserved.

  4. A comprehensive probabilistic analysis model of oil pipelines network based on Bayesian network

    Science.gov (United States)

    Zhang, C.; Qin, T. X.; Jiang, B.; Huang, C.

    2018-02-01

    The oil pipelines network is one of the most important facilities for energy transportation, but an oil pipelines network accident may result in serious disasters. Analysis models for these accidents have been established mainly on the basis of three methods: event trees, accident simulation and Bayesian networks. Among these methods, the Bayesian network is suitable for probabilistic analysis, but not all the important influencing factors have been considered, and no deployment rule for the factors has been established. This paper proposes a probabilistic analysis model of oil pipelines networks based on a Bayesian network. Most of the important influencing factors, including the key environmental conditions and emergency response, are considered in this model. Moreover, the paper also introduces a deployment rule for these factors. The model can be used for probabilistic analysis and sensitivity analysis of oil pipelines network accidents.
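
    A minimal sketch of such a model in the pgmpy library, with two invented binary influencing factors and invented probabilities, purely to illustrate the probabilistic and sensitivity analyses the paper describes.

        from pgmpy.models import BayesianNetwork
        from pgmpy.factors.discrete import TabularCPD
        from pgmpy.inference import VariableElimination

        # Two illustrative factors both influencing pipeline failure.
        model = BayesianNetwork([("Corrosion", "Failure"),
                                 ("ThirdPartyDamage", "Failure")])

        cpd_cor = TabularCPD("Corrosion", 2, [[0.95], [0.05]])
        cpd_tpd = TabularCPD("ThirdPartyDamage", 2, [[0.98], [0.02]])
        cpd_fail = TabularCPD(
            "Failure", 2,
            [[0.999, 0.90, 0.85, 0.50],   # P(no failure | parent states)
             [0.001, 0.10, 0.15, 0.50]],  # P(failure | parent states)
            evidence=["Corrosion", "ThirdPartyDamage"], evidence_card=[2, 2])
        model.add_cpds(cpd_cor, cpd_tpd, cpd_fail)

        infer = VariableElimination(model)
        # Sensitivity-style query: failure probability given active corrosion.
        print(infer.query(["Failure"], evidence={"Corrosion": 1}))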

  5. Probabilistic approaches to recommendations

    CERN Document Server

    Barbieri, Nicola; Ritacco, Ettore

    2014-01-01

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robus

  6. Assessing uncertainties in global cropland futures using a conditional probabilistic modelling framework

    NARCIS (Netherlands)

    Engström, Kerstin; Olin, Stefan; Rounsevell, Mark D A; Brogaard, Sara; Van Vuuren, Detlef P.; Alexander, Peter; Murray-Rust, Dave; Arneth, Almut

    2016-01-01

    We present a modelling framework to simulate probabilistic futures of global cropland areas that are conditional on the SSP (shared socio-economic pathway) scenarios. Simulations are based on the Parsimonious Land Use Model (PLUM) linked with the global dynamic vegetation model LPJ-GUESS

  7. Modelling probabilistic fatigue crack propagation rates for a mild structural steel

    Directory of Open Access Journals (Sweden)

    J.A.F.O. Correia

    2015-01-01

    A class of fatigue crack growth models based on elastic–plastic stress–strain histories at the crack tip region and local strain-life damage models has been proposed in the literature. The fatigue crack growth is regarded as a process of continuous crack initiations over successive elementary material blocks, which may be governed by smooth strain-life damage data. Some approaches account for the residual stresses developing at the crack tip in the actual crack driving force assessment, allowing mean stresses and load-sequence effects to be modelled. An extension of the fatigue crack propagation model originally proposed by Noroozi et al. (2005) to derive probabilistic fatigue crack propagation data is proposed, in particular concerning the derivation of probabilistic da/dN-ΔK-R fields. The elastic-plastic stresses at the vicinity of the crack tip, computed using simplified formulae, are compared with the stresses computed using elastic-plastic finite element analyses for the specimens considered in the experimental program proposed to derive the fatigue crack propagation data. Using probabilistic strain-life data available for the S355 structural mild steel, probabilistic crack propagation fields are generated for several stress ratios and compared with experimental fatigue crack propagation data. A satisfactory agreement between the predicted probabilistic fields and the experimental data is observed.
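
    The paper's derivation follows the model of Noroozi et al., but purely as a schematic illustration of what a probabilistic da/dN-ΔK-R field means, one can picture a Paris-type relation whose coefficient is a random variable, so that each probability level p traces its own crack growth curve at a given stress ratio R (the lognormal choice below is our assumption, not the paper's formulation):

        % schematic probabilistic crack propagation field
        \frac{\mathrm{d}a}{\mathrm{d}N} = C_{p}\,(\Delta K)^{m},
        \qquad
        C_{p} \sim \mathrm{Lognormal}\big(\mu_{C}(R),\, \sigma_{C}^{2}\big).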

  8. Comparison of plant-specific probabilistic safety assessments and lessons learned

    Energy Technology Data Exchange (ETDEWEB)

    Balfanz, H.P. [TUeV Nord, Hamburg (Germany); Berg, H.P. [Bundesamt fuer Strahlenschutz, Salzgitter (Germany); Steininger, U. [TUeV Energie- und Systemtechnik GmbH, Unternehmensgruppe TUeV Sueddeutschland, Muenchen (Germany)

    2001-11-01

    Probabilistic safety assessments (PSA) have been performed for all German nuclear power plants in operation. These assessments are mainly based on the recent German PSA guide and an earlier draft, respectively. However, a comparison of these PSA shows differences in the results, which are discussed in this paper. Lessons learned from this comparison and further developments of the PSA methodology are described. (orig.)

  9. Integrating probabilistic models of perception and interactive neural networks: a historical and tutorial review.

    Science.gov (United States)

    McClelland, James L

    2013-01-01

    This article seeks to establish a rapprochement between explicitly Bayesian models of contextual effects in perception and neural network models of such effects, particularly the connectionist interactive activation (IA) model of perception. The article is in part an historical review and in part a tutorial, reviewing the probabilistic Bayesian approach to understanding perception and how it may be shaped by context, and also reviewing ideas about how such probabilistic computations may be carried out in neural networks, focusing on the role of context in interactive neural networks, in which both bottom-up and top-down signals affect the interpretation of sensory inputs. It is pointed out that connectionist units that use the logistic or softmax activation functions can exactly compute Bayesian posterior probabilities when the bias terms and connection weights affecting such units are set to the logarithms of appropriate probabilistic quantities. Bayesian concepts such as the prior, likelihood, (joint and marginal) posterior, probability matching and maximizing, and calculating vs. sampling from the posterior are all reviewed and linked to neural network computations. Probabilistic and neural network models are explicitly linked to the concept of a probabilistic generative model that describes the relationship between the underlying target of perception (e.g., the word intended by a speaker or other source of sensory stimuli) and the sensory input that reaches the perceiver for use in inferring the underlying target. It is shown how a new version of the IA model called the multinomial interactive activation (MIA) model can sample correctly from the joint posterior of a proposed generative model for perception of letters in words, indicating that interactive processing is fully consistent with principled probabilistic computation. Ways in which these computations might be realized in real neural systems are also considered.
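
    A minimal numerical check of the claim above: a softmax unit whose bias terms are log priors and whose net inputs add log likelihoods reproduces the Bayesian posterior exactly. The two-hypothesis numbers are illustrative.

        import numpy as np

        prior = np.array([0.7, 0.3])        # P(h) for two hypotheses
        likelihood = np.array([0.2, 0.6])   # P(data | h)

        # Bayes rule directly.
        posterior = prior * likelihood / np.sum(prior * likelihood)

        # Softmax over net input = log prior (bias) + log likelihood (input).
        net = np.log(prior) + np.log(likelihood)
        softmax = np.exp(net) / np.sum(np.exp(net))

        print(posterior, softmax)   # identical up to floating-point error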

  10. A probabilistic model for estimating the waiting time until the simultaneous collapse of two contingencies

    International Nuclear Information System (INIS)

    Barnett, C.S.

    1991-01-01

    The Double Contingency Principle (DCP) is widely applied to criticality safety practice in the United States. Most practitioners base their application of the principle on qualitative, intuitive assessments. The recent trend toward probabilistic safety assessments provides a motive to search for a quantitative, probabilistic foundation for the DCP. A Markov model is tractable and leads to relatively simple results. The model yields estimates of mean time to simultaneous collapse of two contingencies as a function of estimates of mean failure times and mean recovery times of two independent contingencies. The model is a tool that can be used to supplement the qualitative methods now used to assess the effectiveness of the DCP. (Author)
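
    A minimal sketch of such a Markov model: two independent contingencies with constant failure and recovery rates, and the mean time to simultaneous collapse obtained as the mean hitting time of the "both failed" state. All rates are illustrative.

        import numpy as np

        l1, l2 = 0.1, 0.2     # failure rates [1/year]
        m1, m2 = 50.0, 50.0   # recovery rates [1/year] (fast restoration)

        # Transient states: 0 = both intact, 1 = only #1 failed,
        # 2 = only #2 failed; "both failed" is absorbing.
        Q = np.array([
            [-(l1 + l2),  l1,          l2        ],
            [ m1,        -(m1 + l2),   0.0       ],
            [ m2,         0.0,        -(m2 + l1) ],
        ])
        # Expected hitting times t of the absorbing state solve Q t = -1.
        t = np.linalg.solve(Q, -np.ones(3))
        print(f"mean time to simultaneous collapse: {t[0]:.0f} years")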

  11. Probabilistic safety assessment model in consideration of human factors based on object-oriented bayesian networks

    International Nuclear Information System (INIS)

    Zhou Zhongbao; Zhou Jinglun; Sun Quan

    2007-01-01

    The effect of human factors on system safety is increasingly serious, yet it is often ignored in traditional probabilistic safety assessment methods. A new probabilistic safety assessment model based on object-oriented Bayesian networks is proposed in this paper. Human factors are integrated into the existing event sequence diagrams. Then the classes of the object-oriented Bayesian networks are constructed, which are converted to latent Bayesian networks for inference. Finally, the inference results are integrated into event sequence diagrams for probabilistic safety assessment. The new method is applied to a loss-of-coolant accident in a nuclear power plant. The results show that the model is applicable not only to real-time situation assessment but also to assessment based on a limited amount of information. The modeling complexity is kept down, and the object-oriented approach makes the new method appropriate for large, complex systems. (authors)

  12. Mathematical Models of Elementary Mathematics Learning and Performance. Final Report.

    Science.gov (United States)

    Suppes, Patrick

    This project was concerned with the development of mathematical models of elementary mathematics learning and performance. Probabilistic finite automata and register machines with a finite number of registers were developed as models and extensively tested with data arising from the elementary-mathematics strand curriculum developed by the…

  13. Learning Graphical Models With Hubs.

    Science.gov (United States)

    Tan, Kean Ming; London, Palma; Mohan, Karthik; Lee, Su-In; Fazel, Maryam; Witten, Daniela

    2014-10-01

    We consider the problem of learning a high-dimensional graphical model in which there are a few hub nodes that are densely connected to many other nodes. Many authors have studied the use of an ℓ1 penalty in order to learn a sparse graph in the high-dimensional setting. However, the ℓ1 penalty implicitly assumes that each edge is equally likely and independent of all other edges. We propose a general framework to accommodate more realistic networks with hub nodes, using a convex formulation that involves a row-column overlap norm penalty. We apply this general framework to three widely-used probabilistic graphical models: the Gaussian graphical model, the covariance graph model, and the binary Ising model. An alternating direction method of multipliers algorithm is used to solve the corresponding convex optimization problems. On synthetic data, we demonstrate that our proposed framework outperforms competitors that do not explicitly model hub nodes. We illustrate our proposal on a webpage data set and a gene expression data set.
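
    The row-column overlap norm itself is not in standard libraries, but a minimal sketch of the plain ℓ1-penalized baseline it generalizes -- the graphical lasso estimate of a sparse Gaussian graphical model -- looks as follows, on synthetic data.

        import numpy as np
        from sklearn.covariance import GraphicalLasso

        rng = np.random.default_rng(0)
        X = rng.normal(size=(100, 8))     # n = 100 samples, p = 8 variables

        gl = GraphicalLasso(alpha=0.2).fit(X)
        # Nonzero off-diagonal precision entries are the estimated edges.
        edges = (np.abs(gl.precision_) > 1e-4).astype(int)
        print(edges)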

  14. Probabilistic Models and Generative Neural Networks: Towards an Unified Framework for Modeling Normal and Impaired Neurocognitive Functions.

    Science.gov (United States)

    Testolin, Alberto; Zorzi, Marco

    2016-01-01

    Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage.

  15. Probabilistic risk models for multiple disturbances: an example of forest insects and wildfires

    Science.gov (United States)

    Haiganoush K. Preisler; Alan A. Ager; Jane L. Hayes

    2010-01-01

    Building probabilistic risk models for highly random forest disturbances like wildfire and forest insect outbreaks is challenging. Modeling the interactions among natural disturbances is even more difficult. In the case of wildfire and forest insects, we looked at the probability of a large fire given an insect outbreak and also the incidence of insect outbreaks...

  16. Probabilistic Modelling of Fatigue Life of Composite Laminates Using Bayesian Inference

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov; Kiureghian, Armen Der

    2014-01-01

    A probabilistic model for estimating the fatigue life of laminated composite plates subjected to constant-amplitude or variable-amplitude loading is developed. The model is based on lamina-level input data, making it possible to predict fatigue properties for a wide range of laminate configuratio...

  17. Probabilistic Data Modeling and Querying for Location-Based Data Warehouses

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    Motivated by the increasing need to handle complex, dynamic, uncertain multidimensional data in location-based warehouses, this paper proposes a novel probabilistic data model that can address the complexities of such data. The model provides a foundation for handling complex hierarchical and unc...

  18. Probabilistic Data Modeling and Querying for Location-Based Data Warehouses

    DEFF Research Database (Denmark)

    Timko, Igor; Dyreson, Curtis E.; Pedersen, Torben Bach

    2005-01-01

    Motivated by the increasing need to handle complex, dynamic, uncertain multidimensional data in location-based warehouses, this paper proposes a novel probabilistic data model that can address the complexities of such data. The model provides a foundation for handling complex hierarchical and unc...

  19. Probabilistic model for fatigue crack growth and fracture of welded joints in civil engineering structures

    NARCIS (Netherlands)

    Maljaars, J.; Steenbergen, H.M.G.M.; Vrouwenvelder, A.C.W.M.

    2012-01-01

    This paper presents a probabilistic assessment model for linear elastic fracture mechanics (LEFM). The model allows the determination of the failure probability of a structure subjected to fatigue loading. The distributions of the random variables for civil engineering structures are provided, and

  20. Decomposing biodiversity data using the Latent Dirichlet Allocation model, a probabilistic multivariate statistical method

    Science.gov (United States)

    Denis Valle; Benjamin Baiser; Christopher W. Woodall; Robin Chazdon; Jerome. Chave

    2014-01-01

    We propose a novel multivariate method to analyse biodiversity data based on the Latent Dirichlet Allocation (LDA) model. LDA, a probabilistic model, reduces assemblages to sets of distinct component communities. It produces easily interpretable results, can represent abrupt and gradual changes in composition, accommodates missing data and allows for coherent estimates...
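
    A minimal sketch of this decomposition, factoring a sites-by-species abundance matrix into component communities; the Poisson counts are randomly generated purely for illustration.

        import numpy as np
        from sklearn.decomposition import LatentDirichletAllocation

        rng = np.random.default_rng(0)
        counts = rng.poisson(2.0, size=(30, 50))   # 30 sites x 50 species

        lda = LatentDirichletAllocation(n_components=3, random_state=0)
        site_mix = lda.fit_transform(counts)   # per-site community proportions
        profiles = lda.components_             # per-community species weights
        print(site_mix[:5].round(2))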

  1. Using Probabilistic Models to Appraise and Decide on Sovereign Disaster Risk Financing and Insurance

    OpenAIRE

    Ley-Borrás, Roberto; Fox, Benjamin D.

    2015-01-01

    This paper presents an overview of the structure of probabilistic catastrophe risk models, discusses their importance for appraising sovereign disaster risk financing and insurance instruments and strategy, and puts forward a model and a process for improving decision making on the linked disaster risk management strategy and sovereign disaster risk financing and insurance strategy. The pa...

  2. Probabilistic inference in general graphical models through sampling in stochastic networks of spiking neurons.

    Science.gov (United States)

    Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang

    2011-12-01

    An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows ("explaining away") and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons.

  3. Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory

    OpenAIRE

    Gopnik, Alison; Wellman, Henry M.

    2012-01-01

    We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but ...

  4. Probabilistic models for access strategies to dynamic information elements

    DEFF Research Database (Denmark)

    Hansen, Martin Bøgsted; Olsen, Rasmus L.; Schwefel, Hans-Peter

    In various network services (e.g., routing and instances of context-sensitive networking) remote access to dynamically changing information elements is a required functionality. Three fundamentally different strategies for such access are investigated in this paper: (1) a reactive approach...... initiated by the requesting entity, and two versions of proactive approaches in which the entity that contains the information element actively propagates its changes to potential requesters, either (2) periodically or (3) triggered by changes of the information element. This paper develops probabilistic...... for information elements spread over a large number of network nodes are provided, which allow to draw conclusions on scalability properties. The impact of different distribution types for the network delays as well as for the time between changes of the information element on the mismatch probability...

  5. Constrained dictionary learning and probabilistic hypergraph ranking for person re-identification

    Science.gov (United States)

    He, You; Wu, Song; Pu, Nan; Qian, Li; Xiao, Guoqiang

    2018-04-01

    Person re-identification is a fundamental and indispensable task in public security. In this paper, we propose a novel framework to improve the performance of this task. First, two different types of descriptors are extracted to represent a pedestrian: (1) appearance-based superpixel features, which are constituted mainly by conventional color features and extracted from superpixels rather than the whole picture, and (2) because the discriminative power of appearance features is limited, deep features extracted by a feature fusion network are also used. Second, a view-invariant subspace is learned by dictionary learning constrained by the minimum negative sample (termed DL-cMN) to reduce the noise in the appearance-based superpixel feature domain. Then, we use the deep features and the sparse codes transformed from the appearance-based features to establish the hyperedges respectively by k-nearest neighbors, rather than simply joining the different features. Finally, a final ranking is performed by a probabilistic hypergraph ranking algorithm. Extensive experiments on three challenging datasets (VIPeR, PRID450S and CUHK01) demonstrate the advantages and effectiveness of our proposed algorithm.

  6. Support Vector Machines with Manifold Learning and Probabilistic Space Projection for Tourist Expenditure Analysis

    Directory of Open Access Journals (Sweden)

    Xin Xu

    2009-03-01

    The significant economic contributions of the tourism industry in recent years have created an unprecedented demand for data mining and machine learning methods to analyze tourism data. The intrinsic problems of raw data in tourism are largely related to the complexity, noise and nonlinearity in the data, which may introduce many challenges for existing data mining techniques such as rough sets and neural networks. In this paper, a novel method using SVM-based classification with two nonlinear feature projection techniques is proposed for tourism data analysis. The first feature projection method is based on ISOMAP (Isometric Feature Mapping), which is a class of manifold learning approaches for dimension reduction. By making use of ISOMAP, part of the noisy data can be identified, and the classification accuracy of SVMs can be improved by appropriately discarding the noisy training data. The second feature projection method is a probabilistic space mapping technique for scale transformation. Experimental results on expenditure data of business travelers show that the proposed method can improve prediction performance both in terms of testing accuracy and statistical coincidence. In addition, both of the feature projection methods are helpful in reducing the training time of SVMs.
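
    A minimal sketch of the first projection route described above -- ISOMAP for nonlinear dimension reduction feeding an SVM classifier; the synthetic matrix stands in for the business-traveler expenditure records.

        import numpy as np
        from sklearn.manifold import Isomap
        from sklearn.pipeline import make_pipeline
        from sklearn.svm import SVC

        rng = np.random.default_rng(0)
        X = rng.normal(size=(150, 20))      # survey features
        y = rng.integers(0, 2, size=150)    # e.g. high vs. low expenditure

        clf = make_pipeline(Isomap(n_neighbors=10, n_components=5),
                            SVC(kernel="rbf"))
        clf.fit(X, y)
        print(clf.score(X, y))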

  7. Model checking optimal finite-horizon control for probabilistic gene regulatory networks.

    Science.gov (United States)

    Wei, Ou; Guo, Zonghao; Niu, Yun; Liao, Wenyuan

    2017-12-14

    Probabilistic Boolean networks (PBNs) have been proposed for analyzing external control in gene regulatory networks with incorporation of uncertainty. A context-sensitive PBN with perturbation (CS-PBNp), extending a PBN with context-sensitivity to reflect inherent biological stability and with random perturbations to express the impact of external stimuli, is considered to be more suitable for modeling small biological systems intervened by conditions from the outside. In this paper, we apply probabilistic model checking, a formal verification technique, to optimal control for a CS-PBNp that minimizes the expected cost over a finite control horizon. We first describe a procedure for modeling a CS-PBNp in the language provided by the widely used probabilistic model checker PRISM. We then analyze the reward-based temporal properties and the computation in probabilistic model checking; based on the analysis, we provide a method to formulate the optimal control problem as minimum reachability reward properties. Furthermore, we incorporate control and state cost information into the PRISM code of a CS-PBNp such that automated model checking of a minimum reachability reward property on the code gives the solution to the optimal control problem. We conduct experiments on two examples, an apoptosis network and a WNT5A network. Preliminary experimental results show the feasibility and effectiveness of our approach. The approach based on probabilistic model checking for optimal control avoids explicit computation of the large state transition relations associated with PBNs. It enables a natural depiction of the dynamics of gene regulatory networks, and provides a canonical form in which optimal control problems are formulated as temporal properties that can be solved automatically by leveraging the analysis power of the underlying model checking engines. This work will be helpful for the further utilization of advances in formal verification techniques in systems biology.

  8. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    Science.gov (United States)

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.

  9. Approach to modeling of human performance for purposes of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1983-01-01

    This paper describes the general approach taken in NUREG/CR-1278 to model human performance in sufficient detail to permit probabilistic risk assessments of nuclear power plant operations. To show the basis for the more specific models in the above NUREG, a simplified model of the human component in man-machine systems is presented, the role of performance shaping factors is discussed, and special problems in modeling the cognitive aspects of behavior are described.

  10. Probabilistic estimation of residential air exchange rates for population-based human exposure modeling

    Science.gov (United States)

    Residential air exchange rates (AERs) are a key determinant in the infiltration of ambient air pollution indoors. Population-based human exposure models using probabilistic approaches to estimate personal exposure to air pollutants have relied on input distributions from AER meas...

  11. Probabilistic modeling of dietary intake of substances - The risk management question governs the method

    NARCIS (Netherlands)

    Pieters MN; Ossendorp BC; Bakker MI; Slob W; SIR

    2005-01-01

    In this report the discussion on the use of probabilistic modeling in relation to pesticide use in food crops is analyzed. Due to different policy questions the current discussion is complex and considers safety of an MRL as well as probability of a health risk. The question regarding the use of

  12. A Probabilistic Model of the LMAC Protocol for Concurrent Wireless Sensor Networks

    DEFF Research Database (Denmark)

    Esparza, Luz Judith R; Zeng, Kebin; Nielsen, Bo Friis

    2011-01-01

    We present a probabilistic model for the network setup phase of the Lightweight Medium Access Protocol (LMAC) for concurrent Wireless Sensor Networks. In the network setup phase, time slots are allocated to the individual sensors through resolution of successive collisions. The setup phase...

  13. Psychological Plausibility of the Theory of Probabilistic Mental Models and the Fast and Frugal Heuristics

    Science.gov (United States)

    Dougherty, Michael R.; Franco-Watkins, Ana M.; Thomas, Rick

    2008-01-01

    The theory of probabilistic mental models (PMM; G. Gigerenzer, U. Hoffrage, & H. Kleinbolting, 1991) has had a major influence on the field of judgment and decision making, with the most recent important modifications to PMM theory being the identification of several fast and frugal heuristics (G. Gigerenzer & D. G. Goldstein, 1996). These…

  14. A Probabilistic Model for Diagnosing Misconceptions by a Pattern Classification Approach.

    Science.gov (United States)

    Tatsuoka, Kikumi K.

    A probabilistic approach is introduced to classify and diagnose erroneous rules of operation resulting from a variety of misconceptions ("bugs") in a procedural domain of arithmetic. The model is contrasted with the deterministic approach which has commonly been used in the field of artificial intelligence, and the advantage of treating the…

  15. Developing probabilistic models to predict amphibian site occupancy in a patchy landscape

    Science.gov (United States)

    R. A. Knapp; K.R. Matthews; H. K. Preisler; R. Jellison

    2003-01-01

    Abstract. Human-caused fragmentation of habitats is threatening an increasing number of animal and plant species, making an understanding of the factors influencing patch occupancy ever more important. The overall goal of the current study was to develop probabilistic models of patch occupancy for the mountain yellow-legged frog (Rana muscosa). This once-common species...

  16. Toward a Probabilistic Phenological Model for Wheat Growing Degree Days (GDD)

    Science.gov (United States)

    Rahmani, E.; Hense, A.

    2017-12-01

    Are there deterministic relations between phenological and climate parameters? The answer is surely `No'. This answer motivated us to address the problem through probabilistic methods. Thus, we developed a probabilistic phenological model which has the advantage of giving additional information in terms of uncertainty. To that aim, we turned to a statistical method known as survival analysis. Survival analysis deals with death in biological organisms and failure in mechanical systems. In the survival analysis literature, death or failure is considered an event. By event, in this research we mean the ripening date of wheat, and we will assume only one event in this special case. By time, we mean the growing duration from sowing to ripening, the lifetime of the wheat, which is a function of GDD. More precisely, we will attempt a probabilistic forecast of wheat ripening. The probability value will range between 0 and 1. Here, the survivor function gives the probability that the not-yet-ripened wheat survives longer than a specific time or will survive to the end of its lifetime as a ripened crop. The survival function at each station is determined by fitting a normal distribution to the GDD as a function of growth duration. Verification of the models obtained is done using the CRPS skill score (CRPSS). The positive values of CRPSS indicate the large superiority of the probabilistic phenological survival model over the deterministic models. These results demonstrate that considering uncertainties in modeling is beneficial, meaningful and necessary. We believe that probabilistic phenological models have the potential to help reduce the vulnerability of agricultural production systems to climate change, thereby increasing food security.
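
    A minimal sketch of that survivor function: fit a normal distribution to observed GDD totals at ripening and read off the probability that ripening has not yet occurred; the station sample is an illustrative placeholder.

        import numpy as np
        from scipy import stats

        gdd_at_ripening = np.array([2100, 2250, 2180, 2300, 2220, 2150])
        mu, sigma = stats.norm.fit(gdd_at_ripening)

        def survivor(gdd):
            """P(wheat not yet ripe after accumulating `gdd` degree days)."""
            return stats.norm.sf(gdd, loc=mu, scale=sigma)

        print(survivor(2200))   # value between 0 and 1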

  17. Probabilistic Seismic Performance Model for Tunnel Form Concrete Building Structures

    Directory of Open Access Journals (Sweden)

    S. Bahram Beheshti Aval

    2016-12-01

    Despite the widespread construction of mass-production houses with the tunnel form structural system across the world, unfortunately no special seismic code has been published for the design of this type of construction, and a literature survey turns up only a few studies on the seismic behavior of this structural system. Thus, reliable numerical results on the seismic performance of structures constructed with this technique, considering the factors that affect structural behavior, are highly valuable for a seismic code development process. In addition, because this system is relatively new, because of the damage observed in past earthquakes, and especially because of the random nature of future earthquakes, a probabilistic approach and the development of fragility curves within a next-generation Performance-Based Earthquake Engineering (PBEE) framework are important. In this study, the seismic behavior of 2-, 5- and 10-story tunnel form structures with a regular plan is examined. First, the performance levels of these structures under the design earthquake (return period of 475 years) are assessed with time history analysis and the pushover method, and then, through incremental dynamic analysis, fragility curves are extracted for different levels of damage in walls and spandrels. The results indicate that the case study structures have high capacity and strength and show appropriate seismic performance; all three structures remained within the immediate occupancy performance level.
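
    A minimal sketch of turning incremental dynamic analysis results into a fragility curve: a lognormal CDF fitted by maximum likelihood to counts of damage-state exceedances at each intensity level. All numbers are illustrative, not from the tunnel form case study.

        import numpy as np
        from scipy import stats
        from scipy.optimize import minimize

        im = np.array([0.1, 0.2, 0.3, 0.4, 0.6, 0.8])   # intensity [g]
        n_rec = np.array([20, 20, 20, 20, 20, 20])      # analyses per level
        n_exc = np.array([0, 2, 6, 11, 17, 20])         # exceedance counts

        def neg_log_like(params):
            theta, beta = np.abs(params)   # keep parameters positive
            p = stats.norm.cdf(np.log(im / theta) / beta)
            p = np.clip(p, 1e-9, 1 - 1e-9)
            return -np.sum(stats.binom.logpmf(n_exc, n_rec, p))

        res = minimize(neg_log_like, x0=[0.4, 0.4], method="Nelder-Mead")
        theta, beta = np.abs(res.x)
        print(f"median capacity = {theta:.2f} g, dispersion = {beta:.2f}")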

  18. Conditional Density Models Integrating Fuzzy and Probabilistic Representations of Uncertainty

    NARCIS (Netherlands)

    R.J. Almeida e Santos Nogueira (Rui Jorge)

    2014-01-01

    Abstract: Conditional density estimation is an important problem in a variety of areas such as system identification, machine learning, artificial intelligence, empirical economics, macroeconomic analysis, quantitative finance and risk management. This work considers the

  19. Probabilistic model of random uncertainties in structural dynamics for mis-tuned bladed disks; Modele probabiliste des incertitudes en dynamique des structures pour le desaccordage des roues aubagees

    Energy Technology Data Exchange (ETDEWEB)

    Capiez-Lernout, E.; Soize, Ch. [Universite de Marne la Vallee, Lab. de Mecanique, 77 (France)

    2003-10-01

    The mis-tuning of blades is frequently the cause of spatial localizations of the dynamic forced response in the turbomachinery industry. The random character of mis-tuning requires the construction of probabilistic models of random uncertainties. A usual parametric probabilistic description considers the mis-tuning through the Young modulus of each blade. This model consists of mis-tuning the blade eigenfrequencies while assuming the blade modal shapes unchanged. Recently a new approach, known as the non-parametric model of random uncertainties, has been introduced for modelling random uncertainties in elasto-dynamics. This paper proposes the construction of a non-parametric model which is coherent with all the uncertainties which characterize mis-tuning. As mis-tuning is a phenomenon which is independent from one blade to another, the structure is considered as an assemblage of substructures. The mean reduced matrix model required by the non-parametric approach is thus constructed by dynamic sub-structuring. A comparative study is also performed to assess the influence of the non-parametric approach relative to a usual parametric model adapted to mis-tuning. A numerical example is presented. (authors)

  20. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha; Kalogerakis, Evangelos; Guibas, Leonidas; Koltun, Vladlen

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling

  1. Use of probabilistic relational model (PRM) for dependability analysis of complex systems

    OpenAIRE

    Medina-Oliva , Gabriela; Weber , Philippe; Levrat , Eric; Iung , Benoît

    2010-01-01

    This paper proposes a methodology to develop an aided decision-making tool for assessing the dependability and performance (i.e. reliability) of an industrial system. This tool is built on a model based on a new formalism, called the probabilistic relational model (PRM), which is adapted to deal with large and complex systems. The model is formalized from functional, dysfunctional and informational studies of the technical industrial systems. An application of this meth...

  2. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    Science.gov (United States)

    Gromek, Katherine Emily

    A novel computational and inference framework for physics-of-failure (PoF) reliability modeling of complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated in artificial intelligence (AI) as a leading intelligent computational inference approach for modeling multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of PoF-based system reliability modeling, new approaches to the learning and autonomy properties of the intelligent agents, and the modeling of interacting failure mechanisms within a dynamic engineering system. The autonomous property of intelligent agents is defined as an agent's ability to self-activate, deactivate or completely redefine its role in the analysis. This property of agents, together with the ability to model interacting failure mechanisms of the system elements, makes agent autonomy fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  3. Structural and functional properties of a probabilistic model of neuronal connectivity in a simple locomotor network

    Science.gov (United States)

    Merrison-Hort, Robert; Soffe, Stephen R; Borisyuk, Roman

    2018-01-01

    Although, in most animals, brain connectivity varies between individuals, behaviour is often similar across a species. What fundamental structural properties are shared across individual networks that define this behaviour? We describe a probabilistic model of connectivity in the hatchling Xenopus tadpole spinal cord which, when combined with a spiking model, reliably produces rhythmic activity corresponding to swimming. The probabilistic model allows calculation of structural characteristics that reflect common network properties, independent of individual network realisations. We use the structural characteristics to study examples of neuronal dynamics, in the complete network and various sub-networks, and this allows us to explain the basis for key experimental findings, and make predictions for experiments. We also study how structural and functional features differ between detailed anatomical connectomes and those generated by our new, simpler, model (meta-model). PMID:29589828

  4. A probabilistic model of the electron transport in films of nanocrystals arranged in a cubic lattice

    Energy Technology Data Exchange (ETDEWEB)

    Kriegel, Ilka [Department of Nanochemistry, Istituto Italiano di Tecnologia (IIT), via Morego, 30, 16163 Genova (Italy); Scotognella, Francesco, E-mail: francesco.scotognella@polimi.it [Dipartimento di Fisica, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Center for Nano Science and Technology@PoliMi, Istituto Italiano di Tecnologia, Via Giovanni Pascoli, 70/3, 20133 Milan (Italy)

    2016-08-01

    The fabrication of nanocrystal (NC) films, starting from colloidal dispersions, is a very attractive topic in the condensed matter physics community. NC films can be employed for transistors, light emitting diodes, lasers, and solar cells. For this reason, understanding film conductivity is of major importance. In this paper we describe a probabilistic model that allows the prediction of the conductivity of NC films, in this case a cubic lattice of lead selenide or cadmium selenide NCs. The model is based on the hopping probability between NCs. The results are compared to experimental data reported in the literature. - Highlights: • Colloidal nanocrystal (NC) film conductivity is a topic of major importance. • We present a probabilistic model to predict the electron conductivity in NC films. • The model is based on the hopping probability between NCs. • We found a good agreement between the model and data reported in the literature.
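
    Schematically, a nearest-neighbour hopping probability of the kind such a model can be built on combines a tunnelling factor and a thermal activation factor (the exact expression used in the paper may differ):

        % schematic hopping probability between adjacent nanocrystals
        P_{\mathrm{hop}} \propto
          \exp(-2\beta d)\,
          \exp\!\Big(-\frac{E_{a}}{k_{B}T}\Big),

    where d is the inter-particle distance, β the tunnelling decay constant, and E_a an activation (e.g. charging) energy.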

  5. Unified Probabilistic Models for Face Recognition from a Single Example Image per Person

    Institute of Scientific and Technical Information of China (English)

    Pin Liao; Li Shen

    2004-01-01

    This paper presents a new technique of unified probabilistic models for face recognition from only a single example image per person. The unified models, trained on a training set with multiple samples per person, are used to recognize facial images from another disjoint database with a single sample per person. Variations between facial images are modeled as two unified probabilistic models: within-class variations and between-class variations. Gaussian Mixture Models are used to approximate the distributions of the two variations, and a classifier combination method is exploited to improve performance. Extensive experimental results on the ORL face database and the authors' database (the ICT-JDL database), together comprising 1,750 facial images of 350 individuals, demonstrate that the proposed technique, compared with the traditional eigenface method and some other well-known algorithms, is a significantly more effective and robust approach for face recognition.
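
    The two-variation idea in this record can be sketched with scikit-learn: fit one Gaussian mixture to within-class (same person) difference vectors and one to between-class differences, then classify a pair by likelihood ratio. The toy feature vectors below are synthetic stand-ins, not face data.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
dim = 8

# Synthetic difference vectors: same-person differences are small,
# different-person differences are larger.
within = rng.normal(0.0, 0.5, size=(500, dim))
between = rng.normal(0.0, 2.0, size=(500, dim))

# One mixture model per variation distribution, as in the record.
gmm_w = GaussianMixture(n_components=3, random_state=0).fit(within)
gmm_b = GaussianMixture(n_components=3, random_state=0).fit(between)

def same_person(x, y):
    """Classify a pair of feature vectors via the likelihood ratio of the
    within-class and between-class variation models."""
    d = (x - y)[None, :]
    return gmm_w.score_samples(d)[0] > gmm_b.score_samples(d)[0]

probe = rng.normal(size=dim)
print(same_person(probe, probe + rng.normal(0, 0.3, dim)))  # likely True
print(same_person(probe, rng.normal(0, 2.0, size=dim)))     # likely False
```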

  6. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  7. Associations between polygenic risk for schizophrenia and brain function during probabilistic learning in healthy individuals.

    Science.gov (United States)

    Lancaster, Thomas M; Ihssen, Niklas; Brindley, Lisa M; Tansey, Katherine E; Mantripragada, Kiran; O'Donovan, Michael C; Owen, Michael J; Linden, David E J

    2016-02-01

    A substantial proportion of schizophrenia liability can be explained by additive genetic factors. Risk profile scores (RPS) directly index risk using a summated total of common risk variants weighted by their effect. Previous studies suggest that schizophrenia RPS predict alterations to neural networks that support working memory and verbal fluency. In this study, we apply schizophrenia RPS to fMRI data to elucidate the effects of polygenic risk on functional brain networks during a probabilistic-learning neuroimaging paradigm. The neural networks recruited during this paradigm have previously been shown to be altered in unmedicated schizophrenia patients and relatives of schizophrenia patients, which may reflect genetic susceptibility. We created schizophrenia RPS using summary data from the Psychiatric Genetic Consortium (Schizophrenia Working Group) for 83 healthy individuals and explore associations between schizophrenia RPS and blood oxygen level dependency (BOLD) during periods of choice behavior (switch-stay) and reflection upon choice outcome (reward-punishment). We show that schizophrenia RPS is associated with alterations in the frontal pole (P(whole-brain corrected) = 0.048) and the ventral striatum (P(ROI corrected) = 0.036) during choice behavior, but not choice outcome. We suggest that the common risk variants that increase susceptibility to schizophrenia can be associated with alterations in the neural circuitry that support the processing of changing reward contingencies. Hum Brain Mapp 37:491-500, 2016. © 2015 Wiley Periodicals, Inc. © 2015 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  8. Modeling and analysis of cell membrane systems with probabilistic model checking

    Science.gov (United States)

    2011-01-01

    Background Recently there has been a growing interest in the application of Probabilistic Model Checking (PMC) for the formal specification of biological systems. PMC is able to exhaustively explore all states of a stochastic model and can provide valuable insights into its behavior that are more difficult to obtain using only traditional methods for system analysis such as deterministic and stochastic simulation. In this work we propose a stochastic model for the description and analysis of the sodium-potassium exchange pump. The sodium-potassium pump is a membrane transport system present in all animal cells and capable of moving sodium and potassium ions against their concentration gradients. Results We present a quantitative formal specification of the pump mechanism in the PRISM language, taking into consideration a discrete chemistry approach and the Law of Mass Action. We also present an analysis of the system using quantitative properties in order to verify the pump's reversibility and understand the pump behavior using trend labels for the transition rates of the pump reactions. Conclusions Probabilistic model checking can be used along with other well-established approaches such as simulation and differential equations to better understand pump behavior. Using PMC we can determine whether specific events happen, such as the potassium outside the cell running out in all model traces. We can also obtain a more detailed perspective on its behavior, such as determining its reversibility and why its normal operation becomes slow over time. This knowledge can be used to direct experimental research and make it more efficient, leading to faster and more accurate scientific discoveries. PMID:22369714
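
    The record's analysis is done in the PRISM model checker; as a rough Python analogue of the same discrete-chemistry view, a Gillespie simulation of a heavily stylized two-reaction pump cycle is sketched below. The species counts and rate constants are illustrative assumptions, not the paper's PRISM parameters, and simulation only samples traces rather than exhaustively checking them as PMC does.

```python
import random

# Stylized mass-action reactions: the pump exports internal Na+ and imports
# external K+; propensities are proportional to the remaining ion counts.
state = {"Na_in": 150, "K_out": 100}
rates = {"export_Na": 0.02, "import_K": 0.03}

def gillespie(state, t_end=400.0):
    t, trace = 0.0, []
    while t < t_end:
        a1 = rates["export_Na"] * state["Na_in"]  # reaction propensities
        a2 = rates["import_K"] * state["K_out"]
        a0 = a1 + a2
        if a0 == 0:  # both ion pools exhausted; nothing can fire
            break
        t += random.expovariate(a0)  # time to next reaction event
        if random.random() < a1 / a0:
            state["Na_in"] -= 1
        else:
            state["K_out"] -= 1
        trace.append((t, state["Na_in"], state["K_out"]))
    return trace

random.seed(3)
trace = gillespie(dict(state))
# Analogue of the PMC query "does the potassium outside the cell run out?",
# answered here only for this one sampled trace.
print("K_out depleted in this trace:", trace[-1][2] == 0)
```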

  9. Probabilistic disaggregation model with application to natural hazard risk assessment of portfolios

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

    In natural hazard risk assessment, a resolution mismatch between hazard data and aggregated exposure data is often observed. A possible solution to this issue is the disaggregation of exposure data to match the spatial resolution of hazard data. Disaggregation models available in the literature are usually deterministic and make use of an auxiliary indicator, such as land cover, to spatially distribute exposures. As the dependence between the auxiliary indicator and the disaggregated number of exposures is generally imperfect, uncertainty arises in disaggregation. This paper therefore proposes a probabilistic disaggregation model that considers the uncertainty in the disaggregation, taking basis in the scaled Dirichlet distribution. The proposed probabilistic disaggregation model is applied to a portfolio of residential buildings in the Canton Bern, Switzerland, subject to flood risk. Thereby, the model is verified...
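
    A toy version of probabilistic disaggregation can be written in a few lines of numpy. The sketch below uses a standard Dirichlet rather than the paper's scaled Dirichlet, and the building counts, land-cover shares, and concentration parameter are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(7)

# 120 buildings known only at the aggregated level must be distributed over
# 4 hazard cells; land-cover shares act as the auxiliary indicator.
total_buildings = 120
land_cover_share = np.array([0.4, 0.3, 0.2, 0.1])
concentration = 50.0  # larger = more trust in the auxiliary indicator

# Probabilistic disaggregation: sample cell weights around the indicator,
# then distribute the exposure multinomially, so uncertainty is retained.
weights = rng.dirichlet(concentration * land_cover_share, size=1000)
counts = np.array([rng.multinomial(total_buildings, w) for w in weights])

print("mean buildings per cell:", counts.mean(axis=0).round(1))
print("90% interval for cell 0:", np.percentile(counts[:, 0], [5, 95]))
```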

  10. Trait-Dependent Biogeography: (Re)Integrating Biology into Probabilistic Historical Biogeographical Models.

    Science.gov (United States)

    Sukumaran, Jeet; Knowles, L Lacey

    2018-04-20

    The development of process-based probabilistic models for historical biogeography has transformed the field by grounding it in modern statistical hypothesis testing. However, most of these models abstract away biological differences, reducing species to interchangeable lineages. We present here the case for reintegration of biology into probabilistic historical biogeographical models, allowing a broader range of questions about biogeographical processes beyond ancestral range estimation or simple correlation between a trait and a distribution pattern, as well as allowing us to assess how inferences about ancestral ranges themselves might be impacted by differential biological traits. We show how new approaches to inference might cope with the computational challenges resulting from the increased complexity of these trait-based historical biogeographical models. Copyright © 2018 Elsevier Ltd. All rights reserved.

  11. Probabilistic Modelling of Information Propagation in Wireless Mobile Ad-Hoc Network

    DEFF Research Database (Denmark)

    Schiøler, Henrik; Hansen, Martin Bøgsted; Schwefel, Hans-Peter

    2005-01-01

    In this paper the dynamics of broadcasting wireless ad-hoc networks is studied through probabilistic modelling. A randomized transmission discipline is assumed in accordance with existing MAC definitions such as WLAN with Decentralized Coordination or IEEE-802.15.4. Message reception is assumed to be governed by node power-down policies and is equivalently assumed to be randomized. Altogether, randomization facilitates a probabilistic model in the shape of an integro-differential equation governing the propagation of information, where Brownian node mobility may be accounted for by including an extra diffusion term. The established model is analyzed for transient behaviour and a travelling wave solution facilitates expressions for propagation speed as well as parametrized analysis of network reliability and node power consumption. Applications of the developed models for node localization and network...

  12. Cognitive modeling and dynamic probabilistic simulation of operating crew response to complex system accidents

    International Nuclear Information System (INIS)

    Chang, Y.H.J.; Mosleh, A.

    2007-01-01

    This is the last in a series of five papers that discuss the Information Decision and Action in Crew (IDAC) model for human reliability analysis (HRA) and an example application. The model is developed to probabilistically predict the responses of the control room operating crew in nuclear power plants during an accident, for use in probabilistic risk assessments (PRA). The operator response spectrum includes cognitive, emotional, and physical activities during the course of an accident. This paper describes a dynamic PRA computer simulation program, the accident dynamics simulator (ADS), developed in part to implement the IDAC model. This paper also provides a detailed example of implementing a simpler version of IDAC, compared with the IDAC model discussed in the first four papers of this series, to demonstrate the practicality of integrating a detailed cognitive HRA model within a dynamic PRA framework

  13. A note on probabilistic models over strings: the linear algebra approach.

    Science.gov (United States)

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.
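
    The flavour of the linear algebra view can be shown on a toy weighted automaton over strings: the normalization (the total weight of all finite strings) is a geometric series of transfer matrices with the closed form start^T (I - T)^(-1) stop. The two-state automaton below is invented for illustration and is much simpler than the TKF91 transducers analysed in the paper.

```python
import numpy as np

# T[i, j] is the total weight, summed over the emission alphabet, of moving
# from hidden state i to j while emitting one character.
T = np.array([[0.3, 0.2],
              [0.1, 0.4]])
start = np.array([1.0, 0.0])  # initial state weights
stop = np.array([0.5, 0.5])   # termination weight in each state

# Normalization: start^T (I + T + T^2 + ...) stop = start^T (I - T)^(-1) stop,
# valid whenever the spectral radius of T is below 1.
assert max(abs(np.linalg.eigvals(T))) < 1
Z = start @ np.linalg.solve(np.eye(2) - T, stop)
print("normalization constant Z =", round(Z, 4))
```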

  14. Generalized outcome-based strategy classification: comparing deterministic and probabilistic choice models.

    Science.gov (United States)

    Hilbig, Benjamin E; Moshagen, Morten

    2014-12-01

    Model comparisons are a vital tool for disentangling which of several strategies a decision maker may have used--that is, which cognitive processes may have governed observable choice behavior. However, previous methodological approaches have been limited to models (i.e., decision strategies) with deterministic choice rules. As such, psychologically plausible choice models--such as evidence-accumulation and connectionist models--that entail probabilistic choice predictions could not be considered appropriately. To overcome this limitation, we propose a generalization of Bröder and Schiffer's (Journal of Behavioral Decision Making, 19, 361-380, 2003) choice-based classification method, relying on (1) parametric order constraints in the multinomial processing tree framework to implement probabilistic models and (2) minimum description length for model comparison. The advantages of the generalized approach are demonstrated through recovery simulations and an experiment. In explaining previous methods and our generalization, we maintain a nontechnical focus--so as to provide a practical guide for comparing both deterministic and probabilistic choice models.

  15. A Stochastic Lagrangian Basis for a Probabilistic Parameterization of Moisture Condensation in Eulerian Models

    OpenAIRE

    Tsang, Yue-Kin; Vallis, Geoffrey K.

    2018-01-01

    In this paper we describe the construction of an efficient probabilistic parameterization that could be used in a coarse-resolution numerical model in which the variation of moisture is not properly resolved. An Eulerian model using a coarse-grained field on a grid cannot properly resolve regions of saturation---in which condensation occurs---that are smaller than the grid boxes. Thus, in the absence of a parameterization scheme, either the grid box must become saturated or condensation will ...

  16. Precise Quantitative Analysis of Probabilistic Business Process Model and Notation Workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    We present a framework for modeling and analysis of real-world business workflows. We formalize a core subset of the business process modeling and notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...... the entire BPMN language, allow for more complex annotations and ultimately to automatically synthesize workflows by composing predefined subprocesses, in order to achieve a configuration that is optimal for parameters of interest....

  17. A probabilistic evaluation procedure for process model matching techniques

    NARCIS (Netherlands)

    Kuss, Elena; Leopold, Henrik; van der Aa, Han; Stuckenschmidt, Heiner; Reijers, Hajo A.

    2018-01-01

    Process model matching refers to the automatic identification of corresponding activities between two process models. It represents the basis for many advanced process model analysis techniques such as the identification of similar process parts or process model search. A central problem is how to

  18. Probabilistic modelling of human exposure to intense sweeteners in Italian teenagers: validation and sensitivity analysis of a probabilistic model including indicators of market share and brand loyalty.

    Science.gov (United States)

    Arcella, D; Soggiu, M E; Leclercq, C

    2003-10-01

    For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, to get a more realistic view of exposure to food chemicals, risk managers have become more interested in the probabilistic approach. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model with respect to the intake of saccharin from table-top sweeteners and cyclamate from soft drinks by Italian teenagers with the use of the software, and to evaluate the impact of the inclusion/exclusion of indicators of market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table-top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the concentrations of intense sweeteners in their sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments. Only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria.
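
    The structure of such a stochastic exposure model is easy to illustrate: per simulated consumer, intake of the additive is consumption times a brand-presence indicator times concentration. All distributions and parameter values below are invented placeholders, not the project's Italian survey data.

```python
import numpy as np

rng = np.random.default_rng(1)
n_consumers = 10_000

# Illustrative inputs: daily soft-drink consumption (g), probability that the
# chosen brand contains the sweetener (a market-share indicator), and the
# sweetener concentration (mg/kg) in products that do contain it.
intake_g = rng.lognormal(mean=5.0, sigma=0.8, size=n_consumers)
brand_has_additive = rng.random(n_consumers) < 0.35
concentration = rng.lognormal(mean=5.5, sigma=0.4, size=n_consumers)

# Exposure (mg/day); consumers of additive-free brands contribute zero, which
# is what separates this from a deterministic worst-case point estimate.
exposure = intake_g / 1000.0 * concentration * brand_has_additive

print("P95 exposure (mg/day):", round(float(np.percentile(exposure, 95)), 2))
print("worst-case point estimate:",
      round(float(intake_g.max() / 1000.0 * concentration.max()), 2))
```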

  19. Probabilistic Inference: Task Dependency and Individual Differences of Probability Weighting Revealed by Hierarchical Bayesian Modeling.

    Science.gov (United States)

    Boos, Moritz; Seer, Caroline; Lange, Florian; Kopp, Bruno

    2016-01-01

    Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modeling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behavior. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modeling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modeling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.
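
    One common member of the inverted-S family the abstract refers to is the Tversky-Kahneman weighting function, sketched below; in a hierarchical Bayesian treatment each individual's gamma would be drawn from a weakly informative group-level prior, which is not shown here.

```python
import numpy as np

def weight(p, gamma):
    """Tversky-Kahneman inverted-S probability weighting function:
    small probabilities are overweighted, large ones underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

ps = np.array([0.01, 0.1, 0.5, 0.9, 0.99])
for gamma in (0.4, 0.7, 1.0):  # gamma = 1 recovers undistorted probabilities
    print(f"gamma={gamma}:", weight(ps, gamma).round(3))
```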

  20. Predicting inpatient clinical order patterns with probabilistic topic models vs conventional order sets.

    Science.gov (United States)

    Chen, Jonathan H; Goldstein, Mary K; Asch, Steven M; Mackey, Lester; Altman, Russ B

    2017-05-01

    Build probabilistic topic model representations of hospital admissions processes and compare the ability of such models to predict clinical order patterns against preconstructed order sets. The authors evaluated the first 24 hours of structured electronic health record data for > 10 K inpatients. Drawing an analogy between structured items (e.g., clinical orders) and words in a text document, the authors performed latent Dirichlet allocation probabilistic topic modeling. These topic models use initial clinical information to predict clinical orders for a separate validation set of > 4 K patients. The authors evaluated these topic model-based predictions vs existing human-authored order sets by area under the receiver operating characteristic curve, precision, and recall for subsequent clinical orders. Existing order sets predict clinical orders used within 24 hours with area under the receiver operating characteristic curve 0.81, precision 16%, and recall 35%. This can be improved to 0.90, 24%, and 47% with the topic models. Existing order sets tend to provide nonspecific, process-oriented aid, with usability limitations impairing more precise, patient-focused support. Algorithmic summarization has the potential to breach this usability barrier by automatically inferring patient context, but with potential tradeoffs in interpretability. Probabilistic topic modeling provides an automated approach to detect thematic trends in patient care and generate decision support content. A potential use case finds related clinical orders for decision support. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association.
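
    The document-word analogy in this record maps directly onto off-the-shelf latent Dirichlet allocation. A minimal sketch with scikit-learn follows; the admission "documents" and order codes are fabricated examples, not data from the study.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation

# Each "document" is one admission's first-24-hour structured orders,
# treated as words (hypothetical order codes).
admissions = [
    "cbc bmp troponin ecg aspirin heparin",
    "cbc bmp troponin ecg aspirin statin",
    "cbc chest_xray albuterol prednisone oxygen",
    "cbc chest_xray albuterol azithromycin oxygen",
]

X = CountVectorizer().fit_transform(admissions)
lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

# The per-admission topic mixture can then rank candidate orders for a new
# patient from the topic-order affinities (lda.components_).
print(lda.transform(X).round(2))
```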

  1. Developing Pavement Distress Deterioration Models for Pavement Management System Using Markovian Probabilistic Process

    Directory of Open Access Journals (Sweden)

    Promothes Saha

    2017-01-01

    Full Text Available In the state of Colorado, the Colorado Department of Transportation (CDOT) utilizes their pavement management system (PMS) to manage approximately 9,100 miles of interstate, highways, and low-volume roads. Three types of deterioration models are currently being used in the existing PMS: site-specific, family, and expert opinion curves. These curves are developed using deterministic techniques. In the deterministic technique, the uncertainties of pavement deterioration related to traffic and weather are not considered. Probabilistic models that take into account the uncertainties result in more accurate curves. In this study, probabilistic models using the discrete-time Markov process were developed for five distress indices: transverse, longitudinal, fatigue, rut, and ride indices, as a case study on low-volume roads. Regression techniques were used to develop the deterioration paths using the predicted distribution of indices estimated from the Markov process. Results indicated that longitudinal, fatigue, and rut indices had very slow deterioration over time, whereas transverse and ride indices showed faster deterioration. The developed deterioration models had coefficients of determination (R²) above 0.84. As probabilistic models provide more accurate results, it is recommended that these models be used as the family curves in the CDOT PMS for low-volume roads.
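
    A discrete-time Markov deterioration model of the kind described here reduces to repeated multiplication by a transition matrix. The five-state matrix below is invented for illustration; its upper-triangular form encodes that untreated pavement never improves.

```python
import numpy as np

# Illustrative yearly transition matrix over five condition states
# (1 = best, 5 = worst); each row sums to 1.
P = np.array([
    [0.85, 0.15, 0.00, 0.00, 0.00],
    [0.00, 0.80, 0.20, 0.00, 0.00],
    [0.00, 0.00, 0.75, 0.25, 0.00],
    [0.00, 0.00, 0.00, 0.70, 0.30],
    [0.00, 0.00, 0.00, 0.00, 1.00],
])

state = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # brand-new pavement
for year in range(1, 11):
    state = state @ P  # propagate the condition distribution one year
    expected_condition = state @ np.arange(1, 6)
    print(year, round(float(expected_condition), 2))
```

    A deterioration curve for a distress index could then be fitted by regression on these predicted distributions, as the record describes.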

  2. A probabilistic model for estimating the waiting time until the simultaneous collapse of two contingencies

    International Nuclear Information System (INIS)

    Barnett, C.S.

    1991-06-01

    The Double Contingency Principle (DCP) is widely applied to criticality safety practice in the United States. Most practitioners base their application of the principle on qualitative, intuitive assessments. The recent trend toward probabilistic safety assessments provides a motive to search for a quantitative, probabilistic foundation for the DCP. A Markov model is tractable and leads to relatively simple results. The model yields estimates of mean time to simultaneous collapse of two contingencies as a function of estimates of mean failure times and mean recovery times of two independent contingencies. The model is a tool that can be used to supplement the qualitative methods now used to assess effectiveness of the DCP. 3 refs., 1 fig
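
    The Markov model sketched in this record has a compact numerical form: with two identical, independent contingencies (failure rate lam, recovery rate mu), the mean time until both are failed simultaneously is the first-passage time of a three-state chain. The rate values below are illustrative, not from the report.

```python
import numpy as np

lam, mu = 0.5, 50.0  # assumed failure and recovery rates (1/year)

# Transient states: 0 = both contingencies intact, 1 = exactly one failed;
# "both failed" is absorbing and is dropped from the generator.
Q = np.array([
    [-2 * lam,        2 * lam],
    [      mu, -(mu + lam)],
])

# Expected hitting times of the absorbing state solve Q t = -1.
t = np.linalg.solve(Q, -np.ones(2))
print("mean years until simultaneous collapse:", round(float(t[0]), 1))

# Analytic check for this small chain: t0 = (3*lam + mu) / (2 * lam**2)
print("closed form:", round((3 * lam + mu) / (2 * lam ** 2), 1))
```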

  3. Multiple sequential failure model: A probabilistic approach to quantifying human error dependency

    International Nuclear Information System (INIS)

    Samanta

    1985-01-01

    This paper presents a probabilistic approach to quantifying human error dependency when multiple tasks are performed. Dependent human failures are dominant contributors to risks from nuclear power plants. An overview of the Multiple Sequential Failure (MSF) model developed and its use in probabilistic risk assessments (PRAs), depending on the available data, is discussed. A small-scale psychological experiment was conducted on the nature of human dependency, and the interpretation of the experimental data by the MSF model shows remarkable accommodation of the dependent failure data. The model, which provides a unique method for quantification of dependent failures in human reliability analysis, can be used in conjunction with any of the general methods currently used for performing the human reliability aspect of PRAs

  4. A probabilistic model for estimating the waiting time until the simultaneous collapse of two contingencies

    International Nuclear Information System (INIS)

    Barnett, C.S.

    1992-01-01

    The double contingency principle (DCP) is widely applied to criticality safety practice in the United States. Most practitioners base their application of the principle on qualitative and intuitive assessments. The recent trend toward probabilistic safety assessments provides a motive for a search for a quantitative and probabilistic foundation for the DCP. A Markov model is tractable and leads to relatively simple results. The model yields estimates of mean time to simultaneous collapse of two contingencies, as functions of estimates of mean failure times and mean recovery times of two independent contingencies. The model is a tool that can be used to supplement the qualitative methods now used to assess the effectiveness of the DCP. (Author)

  5. HIV-specific probabilistic models of protein evolution.

    Directory of Open Access Journals (Sweden)

    David C Nickle

    2007-06-01

    Full Text Available Comparative sequence analyses, including such fundamental bioinformatics techniques as similarity searching, sequence alignment and phylogenetic inference, have become a mainstay for researchers studying type 1 Human Immunodeficiency Virus (HIV-1) genome structure and evolution. Implicit in comparative analyses is an underlying model of evolution, and the chosen model can significantly affect the results. In general, evolutionary models describe the probabilities of replacing one amino acid character with another over a period of time. Most widely used evolutionary models for protein sequences have been derived from curated alignments of hundreds of proteins, usually based on mammalian genomes. It is unclear to what extent these empirical models are generalizable to a very different organism, such as HIV-1, the most extensively sequenced organism in existence. We developed a maximum likelihood model fitting procedure for a collection of HIV-1 alignments sampled from different viral genes, and inferred two empirical substitution models, suitable for describing between- and within-host evolution. Our procedure pools the information from multiple sequence alignments, and the provided software implementation can be run efficiently in parallel on a computer cluster. We describe how the inferred substitution models can be used to generate scoring matrices suitable for alignment and similarity searches. Our models had a consistently superior fit relative to the best existing models and to parameter-rich data-driven models when benchmarked on independent HIV-1 alignments, demonstrating evolutionary biases in amino-acid substitution that are unique to HIV and that are not captured by the existing models. The scoring matrices derived from the models showed a marked difference from common amino-acid scoring matrices. The use of an appropriate evolutionary model recovered a known viral transmission history, whereas a poorly chosen model introduced phylogenetic

  6. Metabolic level recognition of progesterone in dairy Holstein cows using probabilistic models

    Directory of Open Access Journals (Sweden)

    Ludmila N. Turino

    2014-05-01

    Full Text Available Administration of exogenous progesterone is widely used in hormonal protocols for estrous (re)synchronization of dairy cattle without regard to pharmacological issues in dose calculation. This happens because it is difficult to estimate the metabolic level of progesterone for each individual cow before administration. In the present contribution, progesterone pharmacokinetics has been determined in lactating Holstein cows with different milk production yields. A Bayesian approach has been implemented to build two probabilistic progesterone pharmacokinetic models for high- and low-yield dairy cows. Such models are based on a one-compartment Hill structure. Posterior probabilistic models have been structurally set up and parametric probability density functions have been empirically estimated. Moreover, a global sensitivity analysis has been done to know the sensitivity profile of each model. Finally, the posterior probabilistic models adequately recognized a cow's progesterone metabolic level in a validation set when Kullback-Leibler based indices were used. These results suggest that milk yield may be a good index for estimating the pharmacokinetic level of progesterone.

  7. Probabilistic inference: Task dependency and individual differences of probability weighting revealed by hierarchical Bayesian modelling

    Directory of Open Access Journals (Sweden)

    Moritz eBoos

    2016-05-01

    Full Text Available Cognitive determinants of probabilistic inference were examined using hierarchical Bayesian modelling techniques. A classic urn-ball paradigm served as experimental strategy, involving a factorial two (prior probabilities) by two (likelihoods) design. Five computational models of cognitive processes were compared with the observed behaviour. Parameter-free Bayesian posterior probabilities and parameter-free base rate neglect provided inadequate models of probabilistic inference. The introduction of distorted subjective probabilities yielded more robust and generalizable results. A general class of (inverted) S-shaped probability weighting functions had been proposed; however, the possibility of large differences in probability distortions not only across experimental conditions, but also across individuals, seems critical for the model's success. It also seems advantageous to consider individual differences in parameters of probability weighting as being sampled from weakly informative prior distributions of individual parameter values. Thus, the results from hierarchical Bayesian modelling converge with previous results in revealing that probability weighting parameters show considerable task dependency and individual differences. Methodologically, this work exemplifies the usefulness of hierarchical Bayesian modelling techniques for cognitive psychology. Theoretically, human probabilistic inference might be best described as the application of individualized strategic policies for Bayesian belief revision.

  8. The management of subsurface uncertainty using probabilistic modeling of life cycle production forecasts and cash flows

    International Nuclear Information System (INIS)

    Olatunbosun, O. O.

    1998-01-01

    The subject pertains to the implementation of the full range of subsurface uncertainties in life cycle probabilistic forecasting and its extension to project cash flows using probabilistic methods. A new tool has been developed in the probabilistic application of Crystal Ball which can model reservoir volumetrics, life cycle production forecasts and project cash flows in a single environment. The tool is modular, such that the volumetrics and cash flow modules are optional. Production forecasts are often generated by applying a decline equation to single best-estimate values of input parameters such as initial potential, decline rate, abandonment rate, etc., or sometimes from the results of reservoir simulation. This new tool provides a means of implementing the full range of uncertainties and interdependencies of the input parameters in the production forecasts by defining the input parameters as probability density functions (PDFs) and performing several iterations to generate an expectation curve forecast. The abandonment rate is implemented in each iteration via a link to an OPEX model. The expectation curve forecast is input into a cash flow model to generate a probabilistic NPV. Base case and sensitivity runs from reservoir simulation can likewise form the basis for a probabilistic production forecast from which a probabilistic cash flow can be generated. A good illustration of the application of this tool is the modelling of the production forecast for a well that encounters its target reservoirs in an OUT/ODT situation and thus has significant uncertainties. The uncertainty in the presence and size (if present) of a gas cap and the dependency between ultimate recovery and initial potential, amongst other uncertainties, can be easily implemented in the production forecast with this tool. From the expectation curve forecast, a probabilistic NPV can be easily generated. Possible applications of this tool include: i. estimation of range of actual recoverable volumes based
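
    The workflow this record describes (input PDFs, iterated decline-curve forecasts with an abandonment cutoff, and a resulting probabilistic NPV) can be emulated in plain Python as below. All distributions, prices, and rates are invented placeholders rather than values from the paper, and the Crystal Ball tool itself is not reproduced.

```python
import numpy as np

rng = np.random.default_rng(11)
n_iterations = 5000
oil_price, discount = 60.0, 0.10  # $/bbl and discount rate (assumed)

# Input parameters defined as PDFs, as in the tool described above.
qi = rng.lognormal(mean=np.log(1000.0), sigma=0.3, size=n_iterations)  # bbl/d
decline = rng.uniform(0.10, 0.30, size=n_iterations)  # yearly decline fraction

npvs = []
for q0, d in zip(qi, decline):
    q, npv = q0, 0.0
    for year in range(1, 16):
        q *= 1.0 - d
        if q < 50.0:  # abandonment cutoff, a stand-in for the OPEX-model link
            break
        npv += 365.0 * q * oil_price / (1.0 + discount) ** year
    npvs.append(npv)

# Expectation-curve style summary of the probabilistic NPV.
print("P90/P50/P10 NPV ($):", np.percentile(npvs, [10, 50, 90]).round(-5))
```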

  9. Probabilistic model for the simulation of secondary electron emission

    Directory of Open Access Journals (Sweden)

    M. A. Furman

    2002-12-01

    Full Text Available We provide a detailed description of a model and its computational algorithm for the secondary electron emission process. The model is based on a broad phenomenological fit to data for the secondary-emission yield and the emitted-energy spectrum. We provide two sets of values for the parameters by fitting our model to two particular data sets, one for copper and the other one for stainless steel.

  10. Chiefly Symmetric: Results on the Scalability of Probabilistic Model Checking for Operating-System Code

    Directory of Open Access Journals (Sweden)

    Marcus Völp

    2012-11-01

    Full Text Available Reliability in terms of functional properties from the safety-liveness spectrum is an indispensable requirement of low-level operating-system (OS) code. However, with ever more complex and thus less predictable hardware, quantitative and probabilistic guarantees become more and more important. Probabilistic model checking is one technique to automatically obtain these guarantees. First experiences with the automated quantitative analysis of low-level operating-system code confirm the expectation that the naive probabilistic model checking approach rapidly reaches its limits when increasing the number of processes. This paper reports on our work in progress to tackle the state explosion problem for low-level OS code caused by the exponential blow-up of the model size when the number of processes grows. We studied the symmetry reduction approach and carried out our experiments with a simple test-and-test-and-set lock case study as a representative example for a wide range of protocols with natural inter-process dependencies and long-run properties. We quickly see a state-space explosion for scenarios where inter-process dependencies are insignificant. However, once inter-process dependencies dominate the picture, models with a hundred or more processes can be constructed and analysed.

  11. Probabilistic modelling of security of supply in gas networks and evaluation of new infrastructure

    International Nuclear Information System (INIS)

    Praks, Pavel; Kopustinskas, Vytis; Masera, Marcelo

    2015-01-01

    The paper presents a probabilistic model to study security of supply in a gas network. The model is based on Monte-Carlo simulations with graph theory, and is implemented in the software tool ProGasNet. The software allows studying gas networks in various aspects including identification of weakest links and nodes, vulnerability analysis, bottleneck analysis, evaluation of new infrastructure etc. In this paper ProGasNet is applied to a benchmark network based on a real EU gas transmission network of several countries with the purpose of evaluating the security of supply effects of new infrastructure, either under construction, recently completed or under planning. The probabilistic model enables quantitative evaluations by comparing the reliability of gas supply in each consuming node of the network. - Highlights: • A Monte-Carlo algorithm for stochastic flow networks is presented. • Network elements can fail according to a given probabilistic model. • Priority supply pattern of gas transmission networks is assumed. • A real-world EU gas transmission network is presented and analyzed. • A risk ratio is used for security of supply quantification of a new infrastructure.
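
    The Monte Carlo plus graph theory combination in ProGasNet can be illustrated with networkx on a toy network: in each trial, pipes fail independently and supply is checked by connectivity. The topology and failure probabilities are invented; the real tool also models capacities and priority supply, which this sketch omits.

```python
import random
import networkx as nx

# Toy transmission network: (from, to, failure probability) per trial.
edges = [("src", "a", 0.02), ("a", "b", 0.05), ("b", "sink", 0.05),
         ("src", "c", 0.02), ("c", "sink", 0.10), ("a", "c", 0.05)]

def supplied(trial_rng):
    g = nx.Graph()
    g.add_nodes_from(["src", "a", "b", "c", "sink"])
    for u, v, p_fail in edges:
        if trial_rng.random() > p_fail:  # the pipe survives this trial
            g.add_edge(u, v)
    return nx.has_path(g, "src", "sink")

rng = random.Random(0)
trials = 20_000
print("security of supply:", sum(supplied(rng) for _ in range(trials)) / trials)

# A new infrastructure candidate is evaluated by re-running the simulation
# with the extra edge added and comparing the per-node supply reliability.
```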

  12. Financial and Real Sector Leading Indicators of Recessions in Brazil Using Probabilistic Models

    Directory of Open Access Journals (Sweden)

    Fernando Nascimento de Oliveira

    Full Text Available We examine the usefulness of various financial and real sector variables to forecast recessions in Brazil between one and eight quarters ahead. We estimate probabilistic models of recession and select models based on their out-of-sample forecasts, using the Receiver Operating Characteristic (ROC) function. We find that the predictive out-of-sample ability of several models varies depending on the number of quarters ahead to forecast and on the number of regressors used in the model specification. The models selected seem to be relevant for giving early warnings of recessions in Brazil.
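
    The selection procedure described here, estimating a probabilistic recession model and ranking specifications by out-of-sample ROC, looks roughly like the following sketch, which substitutes a logistic regression for the paper's probabilistic model and uses synthetic indicator data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)

# Synthetic stand-ins for leading indicators observed k quarters ahead
# (e.g. term spread, credit growth) and the recession flag they drive.
n = 200
X = rng.normal(size=(n, 2))
logit = -1.0 - 1.5 * X[:, 0] + 0.8 * X[:, 1]
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

train, test = slice(0, 150), slice(150, None)
model = LogisticRegression().fit(X[train], y[train])

# The out-of-sample ROC area is the model-selection criterion, as in the paper.
auc = roc_auc_score(y[test], model.predict_proba(X[test])[:, 1])
print("out-of-sample ROC AUC:", round(float(auc), 3))
```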

  13. Proposal of a probabilistic dose-response model

    International Nuclear Information System (INIS)

    Barrachina, M.

    1997-01-01

    A biologically updated dose-response model is presented as an alternative to the linear-quadratic model currently in use for cancer risk assessment. The new model is based on the probability functions for misrepair and/or non-repair of DNA lesions, in terms of the radiation damage production rate in the cell (supposedly, a stem cell) and its repair-rate constant. The model makes use of the ''dose and dose-rate effectiveness factor'' of ICRP, interpreting it on the basis of misrepair probabilities, and provides a way for continuous extrapolation between the high and low dose-rate regions, ratifying the ''linear non-threshold hypothesis'' as the main option. Nevertheless, the model casts some doubt on the additive property of dose. (author)

  14. Probabilistic Modeling of Seismic Risk Based Design for a Dual System Structure

    OpenAIRE

    Sidi, Indra Djati

    2017-01-01

    The dual system structure concept has gained popularity in the construction of high-rise buildings over the last decades. Meanwhile, earthquake engineering design provisions for buildings have moved from the uniform hazard concept to the uniform risk concept upon recognizing the uncertainties involved in the earthquake resistance of concrete structures. In this study, a probabilistic model for the evaluation of such risk is proposed for a dual system structure consisting of shear walls or cor...

  15. An integrated dynamic model for probabilistic risk assessments

    International Nuclear Information System (INIS)

    Hsueh, K.-S.; Wang Kong

    2004-01-01

    The purpose of this dissertation is to develop a simulation-based accident sequence analysis program (ADS) for large-scale dynamic accident sequence simulation. Human operators, front-line and support systems, as well as plant thermal-hydraulic behavior are explicitly modeled as integrated active parts in the development of accident scenarios. To keep the model size manageable, the proposed methodology employs several techniques, including the use of an 'initial state vector' which decouples time-dependent and time-independent factors, and a depth-first integration method in which the computation memory demand increases in linear order. The computer implementation of the method is capable of simulating up to 500 branch points in sequence development, models system failure during operation, allows for recovery from operator errors and hardware failures, and implements a simple model for operator-system interactions. (author)

  16. Probabilistic Modeling and Simulation of Metal Fatigue Life Prediction

    National Research Council Canada - National Science Library

    Heffern, Thomas

    2002-01-01

    ...% FLE The work of this thesis was to investigate the probability distributions of test data taken for aluminum 7050-T7451, and to attempt to develop a probability-based model from the variation...

  17. A Probabilistic Cost Estimation Model for Unexploded Ordnance Removal

    National Research Council Canada - National Science Library

    Poppe, Peter

    1999-01-01

    ...) contaminated sites that the services must decontaminate. Existing models for estimating the cost of UXO removal often require a high level of expertise and provide only a point estimate for the costs...

  18. Economic Dispatch for Microgrid Containing Electric Vehicles via Probabilistic Modeling: Preprint

    Energy Technology Data Exchange (ETDEWEB)

    Yao, Yin; Gao, Wenzhong; Momoh, James; Muljadi, Eduard

    2016-02-11

    In this paper, an economic dispatch model with probabilistic modeling is developed for a microgrid. The electric power supply in a microgrid consists of conventional power plants and renewable energy power plants, such as wind and solar power plants. Because of the fluctuation in the output of solar and wind power plants, an empirical probabilistic model is developed to predict their hourly output. According to different characteristics of wind and solar power plants, the parameters for probabilistic distribution are further adjusted individually for both. On the other hand, with the growing trend in plug-in electric vehicles (PHEVs), an integrated microgrid system must also consider the impact of PHEVs. The charging loads from PHEVs as well as the discharging output via the vehicle-to-grid (V2G) method can greatly affect the economic dispatch for all of the micro energy sources in a microgrid. This paper presents an optimization method for economic dispatch in a microgrid considering conventional power plants, renewable power plants, and PHEVs. The simulation results reveal that PHEVs with V2G capability can be an indispensable supplement in a modern microgrid.

  19. A probabilistic model for the evolution of RNA structure

    Directory of Open Access Journals (Sweden)

    Holmes Ian

    2004-10-01

    Full Text Available Abstract Background For the purposes of finding and aligning noncoding RNA gene- and cis-regulatory elements in multiple-genome datasets, it is useful to be able to derive multi-sequence stochastic grammars (and hence multiple alignment algorithms) systematically, starting from hypotheses about the various kinds of random mutation event and their rates. Results Here, we consider a highly simplified evolutionary model for RNA, called "The TKF91 Structure Tree" (following Thorne, Kishino and Felsenstein's 1991 model of sequence evolution with indels), which we have implemented for pairwise alignment as proof of principle for such an approach. The model, its strengths and its weaknesses are discussed with reference to four examples of functional ncRNA sequences: a riboswitch (guanine), a zipcode (nanos), a splicing factor (U4) and a ribozyme (RNase P). As shown by our visualisations of posterior probability matrices, the selected examples illustrate three different signatures of natural selection that are highly characteristic of ncRNA: (i) co-ordinated basepair substitutions, (ii) co-ordinated basepair indels and (iii) whole-stem indels. Conclusions Although all three types of mutation "event" are built into our model, events of types (i) and (ii) are found to be better modeled than events of type (iii). Nevertheless, we hypothesise from the model's performance on pairwise alignments that it would form an adequate basis for a prototype multiple alignment and genefinding tool.

  20. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra...... of certain behaviour to the expected average cost of products. This is supported by a Maude implementation of QFLan, integrated with the SMT solver Z3 and the distributed statistical model checker MultiVeStA. Our approach is illustrated with a bikes product line case study....

  1. Competing probabilistic models for catch-effort relationships in wildlife censuses

    Energy Technology Data Exchange (ETDEWEB)

    Skalski, J.R.; Robson, D.S.; Matsuzaki, C.L.

    1983-01-01

    Two probabilistic models are presented for describing the chance that an animal is captured during a wildlife census, as a function of trapping effort. The models in turn are used to propose relationships between sampling intensity and catch-per-unit-effort (C.P.U.E.) that were field tested on small mammal populations. Capture data suggest a model of diminishing C.P.U.E. with increasing levels of trapping intensity. The catch-effort model is used to illustrate optimization procedures in the design of mark-recapture experiments for censusing wild populations. 14 references, 2 tables.

  2. Dependence in probabilistic modeling Dempster-Shafer theory and probability bounds analysis

    Energy Technology Data Exchange (ETDEWEB)

    Ferson, Scott [Applied Biomathematics, Setauket, NY (United States); Nelsen, Roger B. [Lewis & Clark College, Portland OR (United States); Hajagos, Janos [Applied Biomathematics, Setauket, NY (United States); Berleant, Daniel J. [Iowa State Univ., Ames, IA (United States); Zhang, Jianzhong [Iowa State Univ., Ames, IA (United States); Tucker, W. Troy [Applied Biomathematics, Setauket, NY (United States); Ginzburg, Lev R. [Applied Biomathematics, Setauket, NY (United States); Oberkampf, William L. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2015-05-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.

  3. Probabilistic Model for Integrated Assessment of the Behavior at the T.D.P. Version 2

    International Nuclear Information System (INIS)

    Hurtado, A.; Eguilior, S.; Recreo, F

    2015-01-01

    This report documents the completion of the first phase of the implementation of the ABACO2G (Bayes Application to Geological Storage of CO2) methodology and the final version of the ABACO2G probabilistic model for the injection phase, before its future validation in the experimental field of the Technology Development Plant in Hontomín (Burgos). The model, which determines the probabilistic risk component of a geological storage of CO2 using the formalism of Bayesian networks and Monte Carlo simulation, yields quantitative probability functions of the total CO2 storage system and of each of its subsystems (the storage subsystem and primary seal; the secondary containment subsystem; and the dispersion, or tertiary, subsystem). It includes the stochastic time evolution of the CO2 plume during the injection period, the stochastic time evolution of the drying front, the probabilistic evolution of the pressure front, decoupled from the CO2 plume progress front, and submodels and leakage probability functions for the major leakage risk elements (fractures/faults and wells/deep boreholes), which together define the space of events for estimating the risks associated with the CO2 geological storage system. The activities included in this report replace the qualitative estimation submodels of the former ABACO2G version, developed during Phase I of project ALM-10-017, with analytical, semi-analytical or numerical submodels for the main risk elements (wells and fractures), to obtain an integrated probabilistic model of a CO2 storage complex in carbonate formations that meets the needs of the integrated behavior evaluation of the Technology Development Plant in Hontomín

  4. A novel soft tissue prediction methodology for orthognathic surgery based on probabilistic finite element modelling.

    Science.gov (United States)

    Knoops, Paul G M; Borghi, Alessandro; Ruggiero, Federica; Badiali, Giovanni; Bianchi, Alberto; Marchetti, Claudio; Rodriguez-Florez, Naiara; Breakey, Richard W F; Jeelani, Owase; Dunaway, David J; Schievano, Silvia

    2018-01-01

    Repositioning of the maxilla in orthognathic surgery is carried out for functional and aesthetic purposes. Pre-surgical planning tools can predict 3D facial appearance by computing the response of the soft tissue to changes in the underlying skeleton. The clinical use of commercial prediction software remains controversial, likely due to the deterministic nature of these computational predictions. A novel probabilistic finite element model (FEM) for the prediction of postoperative facial soft tissues is proposed in this paper. A probabilistic FEM was developed and validated on a cohort of eight patients who underwent maxillary repositioning and had pre- and postoperative cone beam computed tomography (CBCT) scans taken. Firstly, a variable correlation analysis assessed various modelling parameters. Secondly, a design of experiments (DOE) provided a range of potential outcomes based on uniformly distributed input parameters, followed by an optimisation. Lastly, the second DOE iteration provided optimised predictions with a probability range. A range of 3D predictions was obtained using the probabilistic FEM and validated using reconstructed soft tissue surfaces from the postoperative CBCT data. The predictions in the nose and upper lip areas accurately include the true postoperative position, whereas the prediction underestimates the position of the cheeks and lower lip. A probabilistic FEM has been developed and validated for the prediction of the facial appearance following orthognathic surgery. This method shows how inaccuracies in the modelling and uncertainties in executing surgical planning influence the soft tissue prediction, and it provides a range of predictions including a minimum and maximum, which may be helpful for patients in understanding the impact of surgery on the face.

  5. A Coupled Probabilistic Wake Vortex and Aircraft Response Prediction Model

    Science.gov (United States)

    Gloudemans, Thijs; Van Lochem, Sander; Ras, Eelco; Malissa, Joel; Ahmad, Nashat N.; Lewis, Timothy A.

    2016-01-01

    Wake vortex spacing standards along with weather and runway occupancy time, restrict terminal area throughput and impose major constraints on the overall capacity and efficiency of the National Airspace System (NAS). For more than two decades, the National Aeronautics and Space Administration (NASA) has been conducting research on characterizing wake vortex behavior in order to develop fast-time wake transport and decay prediction models. It is expected that the models can be used in the systems level design of advanced air traffic management (ATM) concepts that safely increase the capacity of the NAS. It is also envisioned that at a later stage of maturity, these models could potentially be used operationally, in groundbased spacing and scheduling systems as well as on the flight deck.

  6. Dynamic probabilistic models and social structure essays on socioeconomic continuity

    CERN Document Server

    Gómez M , Guillermo L

    1992-01-01

    Mathematical models have been very successful in the study of the physical world. Galilei and Newton introduced point particles moving without friction under the action of simple forces as the basis for the description of concrete motions like those of the planets. This approach was sustained by appropriate mathematical methods, namely infinitesimal calculus, which was being developed at that time. In this way classical analytical mechanics was able to establish some general results, gaining insight through explicit solution of some simple cases and developing various methods of approximation for handling more complicated ones. Special relativity theory can be seen as an extension of this kind of modelling. In the study of electromagnetic phenomena and in general relativity another mathematical model is used, in which the concept of classical field plays the fundamental role. The equations of motion here are partial differential equations, and the methods of study used involve further developments of cl...

  7. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  8. Validation of a probabilistic post-fire erosion model

    Science.gov (United States)

    Pete Robichaud; William J. Elliot; Sarah A. Lewis; Mary Ellen Miller

    2016-01-01

    Post-fire increases of runoff and erosion often occur and land managers need tools to be able to project the increased risk. The Erosion Risk Management Tool (ERMiT) uses the Water Erosion Prediction Project (WEPP) model as the underlying processor. ERMiT predicts the probability of a given amount of hillslope sediment delivery from a single rainfall or...

  9. Development of a perfect prognosis probabilistic model for ...

    Indian Academy of Sciences (India)

    A prediction model based on the perfect prognosis method was developed to predict the probability of lightning and probable time of its occurrence over the south-east Indian region. In the perfect prognosis method, statistical relationships are established using past observed data. For real time applications, the predictors ...

  10. Computational models for probabilistic neutronic calculation in TADSEA

    International Nuclear Information System (INIS)

    Garcia, Jesus A.R.; Curbelo, Jesus P.; Hernandez, Carlos R.G.; Oliva, Amaury M.; Lira, Carlos A.B.O.

    2013-01-01

    The Very High Temperature Reactor is one of the main candidates for the next generation of nuclear power plants. In pebble bed reactors, the fuel is contained within graphite pebbles in the form of TRISO particles, which form a randomly packed bed inside a graphite-walled cylindrical cavity. In previous studies, the conceptual design of a Transmutation Advanced Device for Sustainable Energy Applications (TADSEA) has been made. The TADSEA is a pebble-bed ADS cooled by helium and moderated by graphite. In order to simulate the TADSEA correctly, the double heterogeneity of the system must be considered. It consists of randomly located pebbles in the core and randomly located TRISO particles inside the fuel pebbles. These features are often neglected due to the difficulty of modeling them with the MCNP code, the main reason being the limited number of cells and surfaces that can be defined. In this paper a computational tool that produces a new geometrical model of the fuel pebble for neutronic calculations with MCNPX is presented. The heterogeneity of the system is considered, including the randomly located TRISO particles inside the pebble. Several neutronic computational models of TADSEA's fuel pebbles are also compared in order to study heterogeneity effects. On the other hand, the boundary effect given by the intersection between the pebble surface and the TRISO particles could be significant for the multiplicative properties. A model to study this effect is also presented. (author)

  11. A Probabilistic Model of Meter Perception: Simulating Enculturation

    NARCIS (Netherlands)

    van der Weij, B.; Pearce, M.T.; Honing, H.

    Enculturation is known to shape the perception of meter in music but this is not explicitly accounted for by current cognitive models of meter perception. We hypothesize that the induction of meter is a result of predictive coding: interpreting onsets in a rhythm relative to a periodic meter

  12. Using statistical compatibility to derive advanced probabilistic fatigue models

    Czech Academy of Sciences Publication Activity Database

    Fernández-Canteli, A.; Castillo, E.; López-Aenlle, M.; Seitl, Stanislav

    2010-01-01

    Roč. 2, č. 1 (2010), s. 1131-1140 E-ISSN 1877-7058. [Fatigue 2010. Praha, 06.06.2010-11.06.2010] Institutional research plan: CEZ:AV0Z20410507 Keywords : Fatigue models * Statistical compatibility * Functional equations Subject RIV: JL - Materials Fatigue, Friction Mechanics

  13. Probabilistic forecasting of the solar irradiance with recursive ARMA and GARCH models

    DEFF Research Database (Denmark)

    David, M.; Ramahatana, F.; Trombe, Pierre-Julien

    2016-01-01

    Forecasting of the solar irradiance is a key feature in order to increase the penetration rate of solar energy into the energy grids. Indeed, the anticipation of the fluctuations of the solar renewables allows a better management of the production means of electricity and a better operation...... sky index show some similarities with that of financial time series. The aim of this paper is to assess the performances of a commonly used combination of two linear models (ARMA and GARCH) in econometrics in order to provide probabilistic forecasts of solar irradiance. In addition, a recursive...... regarding the statistical distribution of the error, the reliability of the probabilistic forecasts stands in the same order of magnitude as other works done in the field of solar forecasting....

  14. Validation analysis of probabilistic models of dietary exposure to food additives.

    Science.gov (United States)

    Gilsenan, M B; Thompson, R L; Lambe, J; Gibney, M J

    2003-10-01

    The validity of a range of simple conceptual models designed specifically for the estimation of food additive intakes using probabilistic analysis was assessed. Modelled intake estimates that fell below traditional conservative point estimates of intake and above 'true' additive intakes (calculated from a reference database at brand level) were considered to be in a valid region. Models were developed for 10 food additives by combining food intake data, the probability of an additive being present in a food group and additive concentration data. Food intake and additive concentration data were entered as raw data or as a lognormal distribution, and the probability of an additive being present was entered based on the percentage of brands or the percentage of eating occasions within a food group that contained an additive. Since the three model components each allowed two possible modes of input, the validity of eight (2³) model combinations was assessed. All model inputs were derived from the reference database. An iterative approach was employed in which the validity of individual model components was assessed first, followed by validation of full conceptual models. While the distribution of intake estimates from models fell below conservative intakes, which assume that the additive is present at maximum permitted levels (MPLs) in all foods in which it is permitted, intake estimates were not consistently above 'true' intakes. These analyses indicate the need for more complex models for the estimation of food additive intakes using probabilistic analysis. Such models should incorporate information on market share and/or brand loyalty.
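
    To make the record's modelling approach concrete, the following is a minimal Monte Carlo sketch of a probabilistic intake model of the kind described: food intake and additive concentration are drawn from lognormal distributions and combined with a presence probability. All distribution parameters here are invented for illustration and are not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 100_000  # simulated person-days

        # Hypothetical model inputs (illustrative values only):
        food_intake = rng.lognormal(mean=3.0, sigma=0.8, size=n)    # g/day of the food group
        p_present = 0.35                                            # share of brands containing the additive
        concentration = rng.lognormal(mean=1.5, sigma=0.5, size=n)  # mg additive per kg food

        # The additive contributes to intake only when the consumed brand contains it
        present = rng.random(n) < p_present
        intake_mg = np.where(present, food_intake / 1000.0 * concentration, 0.0)

        print(f"mean intake: {intake_mg.mean():.3f} mg/day")
        print(f"97.5th percentile: {np.percentile(intake_mg, 97.5):.3f} mg/day")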

  15. Probabilistic Inference in General Graphical Models through Sampling in Stochastic Networks of Spiking Neurons

    Science.gov (United States)

    Pecevski, Dejan; Buesing, Lars; Maass, Wolfgang

    2011-01-01

    An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows (“explaining away”) and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons. PMID:22219717
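
    The "explaining away" effect that the paper shows spiking networks can compute is easy to reproduce in conventional software. The sketch below builds a two-cause Bayesian network and approximates the posterior by rejection sampling; the network structure and all probabilities are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 200_000

        # A classic "explaining away" network: two independent causes, one common effect.
        rain = rng.random(n) < 0.2
        sprinkler = rng.random(n) < 0.3
        # P(wet | rain, sprinkler) from a small conditional probability table
        p_wet = np.where(rain & sprinkler, 0.99,
                 np.where(rain, 0.9, np.where(sprinkler, 0.8, 0.01)))
        wet = rng.random(n) < p_wet

        # Condition on the evidence wet=True by rejection (keep only matching samples)
        post_rain = rain[wet].mean()
        post_rain_given_sprinkler = rain[wet & sprinkler].mean()
        print(f"P(rain | wet)            = {post_rain:.3f}")
        print(f"P(rain | wet, sprinkler) = {post_rain_given_sprinkler:.3f}  # lower: explained away")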

  16. Probabilistic Price Forecasting for Day-Ahead and Intraday Markets: Beyond the Statistical Model

    Directory of Open Access Journals (Sweden)

    José R. Andrade

    2017-10-01

    Full Text Available Forecasting the hourly spot price of day-ahead and intraday markets is particularly challenging in electric power systems characterized by a high installed capacity of renewable energy technologies. In particular, periods with low and high price levels are difficult to predict due to a limited number of representative cases in the historical dataset, which leads to forecast bias problems and wide forecast intervals. Moreover, these markets also require the inclusion of multiple explanatory variables, which increases the complexity of the model without guaranteeing a forecasting skill improvement. This paper explores information from daily futures contract trading and forecasts of the daily average spot price to correct point and probabilistic forecasting bias. It also shows that an adequate choice of explanatory variables and the use of simple models like linear quantile regression can lead to highly accurate spot price point and probabilistic forecasts. In terms of point forecast, the mean absolute error was 3.03 €/MWh for the day-ahead market and a maximum value of 2.53 €/MWh was obtained for intraday session 6. The probabilistic forecast results show sharp forecast intervals and deviations from perfect calibration below 7% for all market sessions.

  17. Event-Based Media Enrichment Using an Adaptive Probabilistic Hypergraph Model.

    Science.gov (United States)

    Liu, Xueliang; Wang, Meng; Yin, Bao-Cai; Huet, Benoit; Li, Xuelong

    2015-11-01

    Nowadays, with the continual development of digital capture technologies and social media services, a vast number of media documents are captured and shared online to help attendees record their experience during events. In this paper, we present a method combining semantic inference and multimodal analysis for automatically finding media content to illustrate events using an adaptive probabilistic hypergraph model. In this model, media items are taken as vertices in the weighted hypergraph and the task of enriching media to illustrate events is formulated as a ranking problem. In our method, each hyperedge is constructed using the K-nearest neighbors of a given media document. We also employ a probabilistic representation, which assigns each vertex to a hyperedge in a probabilistic way, to further exploit the correlation among media data. Furthermore, we optimize the hypergraph weights in a regularization framework, which is solved as a second-order cone problem. The approach is initiated by seed media and then used to rank the media documents using a transductive inference process. The results obtained from validating the approach on an event dataset collected from EventMedia demonstrate the effectiveness of the proposed approach.

  18. Probabilistic inference in general graphical models through sampling in stochastic networks of spiking neurons.

    Directory of Open Access Journals (Sweden)

    Dejan Pecevski

    2011-12-01

    Full Text Available An important open problem of computational neuroscience is the generic organization of computations in networks of neurons in the brain. We show here through rigorous theoretical analysis that inherent stochastic features of spiking neurons, in combination with simple nonlinear computational operations in specific network motifs and dendritic arbors, enable networks of spiking neurons to carry out probabilistic inference through sampling in general graphical models. In particular, it enables them to carry out probabilistic inference in Bayesian networks with converging arrows ("explaining away") and with undirected loops, that occur in many real-world tasks. Ubiquitous stochastic features of networks of spiking neurons, such as trial-to-trial variability and spontaneous activity, are necessary ingredients of the underlying computational organization. We demonstrate through computer simulations that this approach can be scaled up to neural emulations of probabilistic inference in fairly large graphical models, yielding some of the most complex computations that have been carried out so far in networks of spiking neurons.

  19. Probabilistic model of ligaments and tendons: Quasistatic linear stretching

    Science.gov (United States)

    Bontempi, M.

    2009-03-01

    Ligaments and tendons have a significant role in the musculoskeletal system and are frequently subjected to injury. This study presents a model of collagen fibers, based on the study of a statistical distribution of fibers when they are subjected to quasistatic linear stretching. With respect to other methodologies, this model is able to describe the behavior of the bundle using fewer ad hoc hypotheses and is able to describe all the quasistatic stretch-load responses of the bundle, including the yield and failure regions described in the literature. It has two other important results: the first is that it is able to correlate the mechanical behavior of the bundle with its internal structure, and it suggests a methodology for deducing the fiber population distribution directly from tensile-test data. The second is that it can follow the evolution of the fiber structure during stretching, making it possible to study the internal adaptation of fibers in physiological and pathological conditions.

  20. A probabilistic model for x-ray PHA data

    International Nuclear Information System (INIS)

    Diesso, M.; Hill, K.

    1986-01-01

    In this paper, a mathematical model of the data produced by a single-arm x-ray pulse height analyzer (PHA) system is developed. Given an assumption on the electron temperature and density profiles, a maximum likelihood technique is applied to calculate the peak electron temperature and enhancement factor of the plasma. This method is currently being used in the analysis of x-ray data from the tokamak fusion test reactor (TFTR); sample results are presented

  1. The implicit possibility of dualism in quantum probabilistic cognitive modeling.

    Science.gov (United States)

    Mender, Donald

    2013-06-01

    Pothos & Busemeyer (P&B) argue convincingly that quantum probability offers an improvement over classical Bayesian probability in modeling the empirical data of cognitive science. However, a weakness related to restrictions on the dimensionality of incompatible physical observables flows from the authors' "agnosticism" regarding quantum processes in neural substrates underlying cognition. Addressing this problem will require either future research findings validating quantum neurophysics or theoretical expansion of the uncertainty principle as a new, neurocognitively contextualized, "local" symmetry.

  2. Probabilistic object and viewpoint models for active object recognition

    CSIR Research Space (South Africa)

    Govender, N

    2013-09-01

    Full Text Available [Full-text extraction fragment: the experiments use the active object recognition dataset introduced by [12]; the training data consists of everyday objects such as cereal boxes, ornaments and spice bottles, with images captured every 20 degrees. The remainder of the fragment is Table I, a partially recovered confusion matrix for the binary model over objects including cereal, battery, curry box, elephant, handbag, MrMin, salad bottle, spice bottle and spray can.]

  3. Probabilistic image processing by means of the Bethe approximation for the Q-Ising model

    International Nuclear Information System (INIS)

    Tanaka, Kazuyuki; Inoue, Jun-ichi; Titterington, D M

    2003-01-01

    The framework of Bayesian image restoration for multi-valued images by means of the Q-Ising model with nearest-neighbour interactions is presented. Hyperparameters in the probabilistic model are determined so as to maximize the marginal likelihood. A practical algorithm is described for multi-valued image restoration based on the Bethe approximation. The algorithm corresponds to loopy belief propagation in artificial intelligence. We conclude that, for real-world grey-level images, the Q-Ising model can give good results.

  4. Effects of varying the step particle distribution on a probabilistic transport model

    International Nuclear Information System (INIS)

    Bouzat, S.; Farengo, R.

    2005-01-01

    The consequences of varying the step particle distribution on a probabilistic transport model, which captures the basic features of transport in plasmas and was recently introduced in Ref. 1 [B. Ph. van Milligen et al., Phys. Plasmas 11, 2272 (2004)], are studied. Different superdiffusive transport mechanisms generated by a family of distributions with algebraic decays (Tsallis distributions) are considered. It is observed that the possibility of changing the superdiffusive transport mechanism improves the flexibility of the model for describing different situations. The use of the model to describe the low (L) and high (H) confinement modes is also analyzed
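
    A minimal sketch of the kind of step-distribution experiment described above: random walks whose step lengths follow an algebraically decaying law, so that heavier tails yield stronger superdiffusion. NumPy's Pareto sampler is used here as a stand-in for the Tsallis family, and the tail indices and walk sizes are arbitrary illustrative choices, not the paper's settings.

        import numpy as np

        rng = np.random.default_rng(1)

        def walk(alpha, n_steps=10_000, n_walkers=2_000):
            """Random walk with algebraically decaying (Pareto-tailed) step lengths.
            Smaller alpha -> heavier tail -> stronger superdiffusion."""
            signs = rng.choice([-1.0, 1.0], size=(n_walkers, n_steps))
            steps = signs * rng.pareto(alpha, size=(n_walkers, n_steps))
            return np.cumsum(steps, axis=1)

        for alpha in (1.5, 2.5, 3.5):
            x = walk(alpha)
            # For heavy tails the sample MSD grows far faster than in normal diffusion
            msd = (x[:, -1] ** 2).mean()
            print(f"alpha={alpha}: sample mean-square displacement after 10k steps = {msd:.3g}")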

  5. Tractable approximations for probabilistic models: The adaptive Thouless-Anderson-Palmer mean field approach

    DEFF Research Database (Denmark)

    Opper, Manfred; Winther, Ole

    2001-01-01

    We develop an advanced mean held method for approximating averages in probabilistic data models that is based on the Thouless-Anderson-Palmer (TAP) approach of disorder physics. In contrast to conventional TAP. where the knowledge of the distribution of couplings between the random variables...... is required. our method adapts to the concrete couplings. We demonstrate the validity of our approach, which is so far restricted to models with nonglassy behavior? by replica calculations for a wide class of models as well as by simulations for a real data set....

  6. Comparison of Microscopic Drivers' Probabilistic Lane-changing Models With Real Traffic Microscopic Data

    Directory of Open Access Journals (Sweden)

    Seyyed Mohammad Sadat Hoseini

    2011-07-01

    Full Text Available The difficulties of microscopic-level simulation models to accurately reproduce real traffic phenomena stem not only from the complexity of calibration and validation operations, but also from the structural inadequacies of the sub-models themselves. Both of these drawbacks originate from the scant information available on real phenomena because of the difficulty in gathering accurate field data. This paper studies the traffic behaviour of individual drivers utilizing vehicle trajectory data extracted from digital images collected from freeways in Iran. These data are used to evaluate the four proposed microscopic traffic models. One of the models is based on the traffic regulations in Iran and the three others are probabilistic models that use a decision factor for calculating the probability of choosing a position on the freeway by a driver. The decision factors for three probabilistic models are increasing speed, decreasing risk of collision, and increasing speed combined with decreasing risk of collision. The models are simulated by a cellular automata simulator and compared with the real data. It is shown that the model based on driving regulations is not valid, but that other models appear useful for predicting the driver’s behaviour on freeway segments in Iran during noncongested conditions.
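
    As a hedged illustration of the probabilistic lane-changing idea, the sketch below turns two decision factors (speed gain and collision risk) into a probability of choosing each candidate position via a softmax over a weighted utility. The functional form and all weights are hypothetical; the paper's models may combine the factors differently.

        import numpy as np

        def position_probabilities(speed_gain, collision_risk, w_speed=1.0, w_risk=2.0):
            """Probability of a driver choosing each candidate position, combining
            increased speed and decreased collision risk (hypothetical form)."""
            utility = w_speed * np.asarray(speed_gain) - w_risk * np.asarray(collision_risk)
            expu = np.exp(utility - utility.max())  # numerically stable softmax
            return expu / expu.sum()

        # Three candidate cells: stay, left lane, right lane (illustrative numbers)
        p = position_probabilities(speed_gain=[0.0, 2.0, 1.0],
                                   collision_risk=[0.1, 0.6, 0.2])
        print(dict(zip(["stay", "left", "right"], np.round(p, 3))))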

  7. Probabilistic Modeling Of Ocular Biomechanics In VIIP: Risk Stratification

    Science.gov (United States)

    Feola, A.; Myers, J. G.; Raykin, J.; Nelson, E. S.; Mulugeta, L.; Samuels, B.; Ethier, C. R.

    2016-01-01

    Visual Impairment and Intracranial Pressure (VIIP) syndrome is a major health concern for long-duration space missions. Currently, it is thought that a cephalad fluid shift in microgravity causes elevated intracranial pressure (ICP) that is transmitted along the optic nerve sheath (ONS). We hypothesize that this in turn leads to alteration and remodeling of connective tissue in the posterior eye which impacts vision. Finite element (FE) analysis is a powerful tool for examining the effects of mechanical loads in complex geometries. Our goal is to build an FE analysis framework to understand the response of the lamina cribrosa and optic nerve head to elevations in ICP in VIIP. To simulate the effects of different pressures on tissues in the posterior eye, we developed a geometric model of the posterior eye and optic nerve sheath and used a Latin hypercube sampling-partial rank correlation coefficient (LHS-PRCC) approach to assess the influence of uncertainty in our input parameters (i.e. pressures and material properties) on the peak strains within the retina, lamina cribrosa and optic nerve. The LHS-PRCC approach was repeated for three relevant ICP ranges, corresponding to upright and supine posture on earth, and microgravity [1]. At each ICP condition we used intraocular pressure (IOP) and mean arterial pressure (MAP) measurements of in-flight astronauts provided by the Lifetime Surveillance of Astronaut Health Program, NASA Johnson Space Center. The lamina cribrosa, optic nerve, retinal vessel and retina were modeled as linear-elastic materials, while other tissues were modeled as a Mooney-Rivlin solid (representing ground substance, stiffness parameter c1) with embedded collagen fibers (stiffness parameters c3, c4 and c5). Geometry creation and mesh generation were done in Gmsh [2], while FEBio was used for all FE simulations [3]. The LHS-PRCC approach yielded correlation coefficients in the range of ±1. To assess the relative influence of the uncertainty in an input parameter on
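
    A minimal sketch of the LHS-PRCC sensitivity workflow referred to above, using a cheap stand-in function in place of the finite element model; the parameter names, ranges and output function are all invented for illustration.

        import numpy as np
        from scipy.stats import qmc, rankdata

        rng = np.random.default_rng(7)
        n, d = 500, 3  # samples, input parameters (pressure-like stand-ins)

        # Latin hypercube sample of the inputs, scaled to hypothetical ranges
        lhs = qmc.LatinHypercube(d=d, seed=7).random(n)
        X = qmc.scale(lhs, l_bounds=[5, 10, 70], u_bounds=[25, 25, 110])

        # Stand-in model output (in the paper this would be an FE-computed peak strain)
        y = 0.02 * X[:, 0] + 0.005 * X[:, 1] ** 2 - 0.001 * X[:, 2] + rng.normal(0, 0.1, n)

        def prcc(X, y):
            """Partial rank correlation coefficient of each column of X with y."""
            R = np.column_stack([rankdata(c) for c in X.T])
            ry = rankdata(y)
            out = []
            for j in range(X.shape[1]):
                # Regress out the other (ranked) inputs, then correlate the residuals
                others = np.column_stack([np.ones(len(y)), np.delete(R, j, axis=1)])
                res_x = R[:, j] - others @ np.linalg.lstsq(others, R[:, j], rcond=None)[0]
                res_y = ry - others @ np.linalg.lstsq(others, ry, rcond=None)[0]
                out.append(np.corrcoef(res_x, res_y)[0, 1])
            return out

        print([f"{c:+.2f}" for c in prcc(X, y)])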

  8. Fatigue crack propagation: Probabilistic models and experimental evidence

    International Nuclear Information System (INIS)

    Lucia, A.C.; Jovanovic, A.

    1987-01-01

    The central aim of the LWR Primary Circuit Component Life Prediction Project, under way at JRC-Ispra, is to develop and check a 'procedure' (encompassing monitoring and inspection, data collection and analysis, prediction) allowing the quantitative estimation of the accumulation of structural damage and of the residual lifetime. The ongoing activity matches theoretical development with experimentation, the latter at present essentially based on a test-rig for room-temperature fatigue cycling of 1:5 scaled models of pressure vessels. During Phase I of fatigue testing of vessel R2, different pieces of information coming from material characterization, non-destructive inspection, continuous monitoring and stress analysis have been merged and used to infer the future behaviour of the structure. The prediction of residual lifetime (cycles to failure), based on the outcomes of the ultrasonic continuous monitoring and made by means of the COVASTOL code, was in quite good agreement with experimental evidence. (orig./HP)

  9. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, systems biology to performance...... modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata....

  10. Probabilistic Multi-Factor Interaction Model for Complex Material Behavior

    Science.gov (United States)

    Abumeri, Galib H.; Chamis, Christos C.

    2010-01-01

    Complex material behavior is represented by a single equation of product form to account for interaction among the various factors. The factors are selected by the physics of the problem and the environment that the model is to represent. For example, different factors will be required to represent temperature, moisture, erosion, corrosion, etc. It is important that the equation represent the physics of the behavior in its entirety accurately. The Multi-Factor Interaction Model (MFIM) is used to evaluate the divot weight (foam weight ejected) from the external launch tanks. The multi-factor has sufficient degrees of freedom to evaluate a large number of factors that may contribute to the divot ejection. It also accommodates all interactions by its product form. Each factor has an exponent that satisfies only two points - the initial and final points. The exponent describes a monotonic path from the initial condition to the final. The exponent values are selected so that the described path makes sense in the absence of experimental data. In the present investigation, the data used were obtained by testing simulated specimens in launching conditions. Results show that the MFIM is an effective method of describing the divot weight ejected under the conditions investigated. The problem lies in how to represent the divot weight with a single equation. A unique solution to this problem is a multi-factor equation of product form. Each factor is of the form (1 - x_i/x_f)^e_i, where x_i is the initial value, usually at ambient conditions, x_f the final value, and e_i the exponent that makes the represented curve unimodal while meeting the initial and final values. The exponents are either evaluated from test data or by technical judgment. A minor disadvantage may be the selection of exponents in the absence of any empirical data. This form has been used successfully in describing the foam ejected in simulated space environmental conditions. Seven factors were required
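
    The product form quoted above is straightforward to evaluate. The sketch below implements the MFIM factor product as given in the abstract; the factor values and exponents are hypothetical placeholders, not values from the investigation.

        import numpy as np

        def mfim(x, x_final, exponents):
            """Multi-Factor Interaction Model in product form:
            behaviour ratio = prod_i (1 - x_i / x_f,i) ** e_i  (form from the abstract)."""
            x, x_final, exponents = map(np.asarray, (x, x_final, exponents))
            return np.prod((1.0 - x / x_final) ** exponents)

        # Hypothetical factors: temperature, moisture fraction, load cycle count
        current = [300.0, 0.2, 1e4]
        final = [600.0, 1.0, 1e6]
        e = [0.5, 0.25, 0.4]
        print(f"behaviour ratio: {mfim(current, final, e):.3f}")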

  11. Comparison of probabilistic models of the distribution of counts

    International Nuclear Information System (INIS)

    Salma, I.; Zemplen-Papp, E.

    1992-01-01

    The binomial, Poisson and modified Poisson models for describing the statistical nature of the distribution of counts are compared theoretically, and conclusions for application are proposed. The validity of the Poisson and the modified Poisson distribution for observing k events in a short time interval is investigated experimentally for various measuring times. The experiments to measure the influence of significant radioactive decay were performed with 89mY (T1/2 = 16.06 s), using a multichannel analyser (4096 channels) in the multiscaling mode. According to the results, the Poisson distribution describes the counting experiment for short measuring times (up to T = 0.5 T1/2) and its application is recommended. However, the analysis of the data demonstrated that for long measurements (T ≥ 1 T1/2) the Poisson distribution is not valid and the modified Poisson distribution is preferable. The practical implications in calculating uncertainties and in optimizing the measuring time are discussed. (author) 20 refs.; 7 figs.; 1 tab
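
    The decay effect discussed above can be illustrated numerically: when the source decays appreciably during the counting window, the count distribution is binomial rather than Poisson, so the variance-to-mean ratio drops below one as the window grows. In the sketch below only the half-life matches the abstract; the source size and detection efficiency are invented for illustration.

        import numpy as np

        rng = np.random.default_rng(3)
        half_life = 16.06           # s, as for 89mY in the abstract
        lam = np.log(2) / half_life
        n_atoms, eff = 5_000, 0.3   # hypothetical source size and detection efficiency

        def simulate_counts(T, n_rep=100_000):
            """Counts in a window T: each atom independently decays in (0, T] with
            probability 1 - exp(-lam*T) and is detected with probability eff,
            giving a binomial law that only approximates a Poisson one."""
            p = (1.0 - np.exp(-lam * T)) * eff
            return rng.binomial(n_atoms, p, size=n_rep)

        for T in (0.5 * half_life, 2.0 * half_life):
            c = simulate_counts(T)
            # For a Poisson law the variance equals the mean; deviation signals decay effects
            print(f"T={T:6.2f}s  mean={c.mean():8.1f}  var/mean={c.var() / c.mean():.3f}")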

  12. A probabilistic approach to the drag-based model

    Science.gov (United States)

    Napoletano, Gianluca; Forte, Roberta; Moro, Dario Del; Pietropaolo, Ermanno; Giovannelli, Luca; Berrilli, Francesco

    2018-02-01

    The forecast of the time of arrival (ToA) of a coronal mass ejection (CME) to Earth is of critical importance for our high-technology society and for any future manned exploration of the Solar System. As critical as the forecast accuracy is the knowledge of its precision, i.e. the error associated with the estimate. We propose a statistical approach for the computation of the ToA using the drag-based model by introducing probability distributions, rather than exact values, as input parameters, thus allowing the evaluation of the uncertainty on the forecast. We test this approach using a set of CMEs whose transit times are known, and obtain extremely promising results: the average value of the absolute differences between measurement and forecast is 9.1 h, and half of these residuals are within the estimated errors. These results suggest that this approach deserves further investigation. We are working to realize a real-time implementation which ingests the outputs of automated CME tracking algorithms as inputs to create a database of events useful for a further validation of the approach.
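
    A minimal sketch of the probabilistic drag-based approach: the drag-based model dv/dt = -gamma (v - w)|v - w| is integrated out to 1 AU for parameter values drawn from input distributions, giving a ToA distribution rather than a point estimate. The distributions and starting radius below are illustrative assumptions, not the paper's inputs.

        import numpy as np

        rng = np.random.default_rng(11)
        AU = 1.496e8          # km
        n = 1_000             # Monte Carlo draws

        # Input distributions instead of point values (illustrative parameters):
        v0 = rng.normal(800.0, 50.0, n)              # km/s initial CME speed
        w = rng.normal(400.0, 50.0, n)               # km/s ambient solar wind speed
        gamma = rng.lognormal(np.log(2e-7), 0.5, n)  # km^-1 drag parameter

        def time_of_arrival(v0, w, gamma, r0=20.0 * 6.96e5, dt=300.0):
            """Integrate dv/dt = -gamma (v-w)|v-w| from ~20 solar radii to 1 AU."""
            r, v, t = r0, v0, 0.0
            while r < AU:
                v -= gamma * (v - w) * abs(v - w) * dt
                r += v * dt
                t += dt
            return t / 3600.0  # hours

        toa = np.array([time_of_arrival(*p) for p in zip(v0, w, gamma)])
        print(f"ToA: mean {toa.mean():.1f} h, "
              f"68% interval {np.percentile(toa, 16):.1f}-{np.percentile(toa, 84):.1f} h")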

  13. Probabilistic graphical models to deal with age estimation of living persons.

    Science.gov (United States)

    Sironi, Emanuele; Gallidabino, Matteo; Weyermann, Céline; Taroni, Franco

    2016-03-01

    Due to the rise of criminal, civil and administrative judicial situations involving people lacking valid identity documents, age estimation of living persons has become an important operational procedure for numerous forensic and medicolegal services worldwide. The chronological age of a given person is generally estimated from the observed degree of maturity of some selected physical attributes by means of statistical methods. However, their application in the forensic framework suffers from some conceptual and practical drawbacks, as recently claimed in the specialised literature. The aim of this paper is therefore to offer an alternative solution for overcoming these limits, by reiterating the utility of a probabilistic Bayesian approach for age estimation. This approach allows one to deal in a transparent way with the uncertainty surrounding the age estimation process and to produce all the relevant information in the form of posterior probability distribution about the chronological age of the person under investigation. Furthermore, this probability distribution can also be used for evaluating in a coherent way the possibility that the examined individual is younger or older than a given legal age threshold having a particular legal interest. The main novelty introduced by this work is the development of a probabilistic graphical model, i.e. a Bayesian network, for dealing with the problem at hand. The use of this kind of probabilistic tool can significantly facilitate the application of the proposed methodology: examples are presented based on data related to the ossification status of the medial clavicular epiphysis. The reliability and the advantages of this probabilistic tool are presented and discussed.

  14. Probabilistic modelling and analysis of stand-alone hybrid power systems

    International Nuclear Information System (INIS)

    Lujano-Rojas, Juan M.; Dufo-López, Rodolfo; Bernal-Agustín, José L.

    2013-01-01

    As a part of the Hybrid Intelligent Algorithm, a model based on an ANN (artificial neural network) has been proposed in this paper to represent hybrid system behaviour considering the uncertainty related to wind speed and solar radiation, battery bank lifetime, and fuel prices. The Hybrid Intelligent Algorithm suggests a combination of probabilistic analysis based on a Monte Carlo simulation approach and artificial neural network training embedded in a genetic algorithm optimisation model. The installation of a typical hybrid system was analysed. Probabilistic analysis was used to generate an input–output dataset of 519 samples that was later used to train the ANNs to reduce the computational effort required. The generalisation ability of the ANNs was measured in terms of RMSE (Root Mean Square Error), MBE (Mean Bias Error), MAE (Mean Absolute Error), and R-squared estimators using another data group of 200 samples. The results obtained from the estimation of the expected energy not supplied, the probability of a determined reliability level, and the estimation of expected value of net present cost show that the presented model is able to represent the main characteristics of a typical hybrid power system under uncertain operating conditions. - Highlights: • This paper presents a probabilistic model for stand-alone hybrid power system. • The model considers the main sources of uncertainty related to renewable resources. • The Hybrid Intelligent Algorithm has been applied to represent hybrid system behaviour. • The installation of a typical hybrid system was analysed. • The results obtained from the case study validate the presented model

  15. Learning abstract visual concepts via probabilistic program induction in a Language of Thought.

    Science.gov (United States)

    Overlan, Matthew C; Jacobs, Robert A; Piantadosi, Steven T

    2017-11-01

    The ability to learn abstract concepts is a powerful component of human cognition. It has been argued that variable binding is the key element enabling this ability, but the computational aspects of variable binding remain poorly understood. Here, we address this shortcoming by formalizing the Hierarchical Language of Thought (HLOT) model of rule learning. Given a set of data items, the model uses Bayesian inference to infer a probability distribution over stochastic programs that implement variable binding. Because the model makes use of symbolic variables as well as Bayesian inference and programs with stochastic primitives, it combines many of the advantages of both symbolic and statistical approaches to cognitive modeling. To evaluate the model, we conducted an experiment in which human subjects viewed training items and then judged which test items belong to the same concept as the training items. We found that the HLOT model provides a close match to human generalization patterns, significantly outperforming two variants of the Generalized Context Model, one variant based on string similarity and the other based on visual similarity using features from a deep convolutional neural network. Additional results suggest that variable binding happens automatically, implying that binding operations do not add complexity to people's hypothesized rules. Overall, this work demonstrates that a cognitive model combining symbolic variables with Bayesian inference and stochastic program primitives provides a new perspective for understanding people's patterns of generalization. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Component supply model for repair activities network under conditions of probabilistic indefiniteness.

    Directory of Open Access Journals (Sweden)

    Victor Yurievich Stroganov

    2017-02-01

    Full Text Available This article systematizes the major production functions of a repair activities network and lists the planning and control functions, which are described in the form of business processes (BP). A simulation model for analysing the effectiveness of component delivery under conditions of probabilistic uncertainty is proposed. It has been shown that a significant portion of the total number of business processes is represented by the management and planning of the movement of parts and components. The construction of experimental design techniques on the simulation model under non-stationary conditions is also considered.

  17. An application of probabilistic safety assessment methods to model aircraft systems and accidents

    Energy Technology Data Exchange (ETDEWEB)

    Martinez-Guridi, G.; Hall, R.E.; Fullwood, R.R.

    1998-08-01

    A case study modeling the thrust reverser system (TRS) in the context of the fatal accident of a Boeing 767 is presented to illustrate the application of Probabilistic Safety Assessment methods. A simplified risk model consisting of an event tree with supporting fault trees was developed to represent the progression of the accident, taking into account the interaction between the TRS and the operating crew during the accident, and the findings of the accident investigation. A feasible sequence of events leading to the fatal accident was identified. Several insights about the TRS and the accident were obtained by applying PSA methods. Changes proposed for the TRS also are discussed.
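
    The event tree quantification used in such a PSA reduces, at its simplest, to multiplying branch probabilities along each accident sequence. The toy tree below is a hypothetical stand-in; its branches and probabilities are not those of the actual TRS analysis.

        from itertools import product

        # A toy event tree: initiating event followed by two safety branches.
        # All numbers are illustrative, not from the accident analysis.
        p_initiator = 1e-6          # in-flight thrust reverser deployment, per flight hour
        branches = {
            "crew_recovers":   0.90,   # crew regains control
            "system_isolates": 0.95,   # TRS isolates/re-stows
        }

        # Enumerate all branch outcomes and accumulate sequence frequencies
        for outcome in product([True, False], repeat=len(branches)):
            p = p_initiator
            labels = []
            for (name, p_success), ok in zip(branches.items(), outcome):
                p *= p_success if ok else (1.0 - p_success)
                labels.append(name if ok else f"not {name}")
            print(f"{' & '.join(labels):45s} {p:.2e} per flight hour")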

  18. Abstract probabilistic CNOT gate model based on double encoding: study of the errors and physical realizability

    Science.gov (United States)

    Gueddana, Amor; Attia, Moez; Chatta, Rihab

    2015-03-01

    In this work, we study the error sources behind the imperfect linear-optical quantum components composing a non-deterministic quantum CNOT gate model, which performs the CNOT function with a success probability of 4/27 and uses a double-encoding technique to represent photonic qubits at the control and the target. We generalize this model to an abstract probabilistic CNOT version and determine the realizability limits depending on a realistic range of the errors. Finally, we discuss physical constraints allowing the implementation of the Asymmetric Partially Polarizing Beam Splitter (APPBS), which is at the heart of correctly realizing the CNOT function.

  19. Learning classification models with soft-label information.

    Science.gov (United States)

    Nguyen, Quang; Valizadegan, Hamed; Hauskrecht, Milos

    2014-01-01

    Learning of classification models in medicine often relies on data labeled by a human expert. Since labeling of clinical data may be time-consuming, finding ways of alleviating the labeling costs is critical for our ability to automatically learn such models. In this paper we propose a new machine learning approach that is able to learn improved binary classification models more efficiently by refining the binary class information in the training phase with soft labels that reflect how strongly the human expert feels about the original class labels. Two types of methods that can learn improved binary classification models from soft labels are proposed. The first relies on probabilistic/numeric labels, the other on ordinal categorical labels. We study and demonstrate the benefits of these methods for learning an alerting model for heparin induced thrombocytopenia. The experiments are conducted on the data of 377 patient instances labeled by three different human experts. The methods are compared using the area under the receiver operating characteristic curve (AUC) score. Our AUC results show that the new approach is capable of learning classification models more efficiently compared to traditional learning methods. The improvement in AUC is most remarkable when the number of examples we learn from is small. A new classification learning framework that lets us learn from auxiliary soft-label information provided by a human expert is a promising new direction for learning classification models from expert labels, reducing the time and cost needed to label data.
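
    One simple way to exploit probabilistic soft labels (not necessarily the authors' exact method) is to duplicate each training instance with both class labels, weighted by the expert's confidence. The sketch below does this with scikit-learn on synthetic data shaped loosely like the 377-instance setting described above.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)
        n, d = 377, 10                      # mirrors the cohort size in the abstract
        X = rng.normal(size=(n, d))
        true_w = rng.normal(size=d)
        p = 1 / (1 + np.exp(-X @ true_w))   # latent "how strongly the expert feels"
        soft = np.clip(p + rng.normal(0, 0.1, n), 0, 1)  # noisy probabilistic labels

        # Duplicate every instance with both classes, weighted by the soft label:
        X2 = np.vstack([X, X])
        y2 = np.concatenate([np.ones(n), np.zeros(n)])
        w2 = np.concatenate([soft, 1.0 - soft])

        clf = LogisticRegression(max_iter=1000).fit(X2, y2, sample_weight=w2)
        hard = LogisticRegression(max_iter=1000).fit(X, (soft > 0.5).astype(int))
        print("agreement of soft- and hard-label models:",
              (clf.predict(X) == hard.predict(X)).mean())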

  20. Probabilistic topic modeling for the analysis and classification of genomic sequences

    Science.gov (United States)

    2015-01-01

    Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies are focusing on the so-called barcode genes, representing a well defined region of the whole genome. Recently, alignment-free techniques are gaining more importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequences clustering and classification is proposed. The method is based on k-mers representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied on DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method reaches very similar results to RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra short sequences and it exhibits a smooth decrease of performance, at every taxonomic level, when the sequence length is decreased. PMID:25916734
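
    The k-mer/topic-model pipeline described above can be sketched in a few lines: each sequence becomes a "document" of overlapping k-mers, a term-count matrix is built, and LDA infers per-sequence topic mixtures. The toy sequences below are illustrative; real use would train on barcode data such as RDP.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        # Toy 16S-like sequences (real work would use RDP barcode data)
        seqs = ["ACGTACGTGACC", "ACGTACGTGACG", "TTGACCGGAAGT", "TTGACCGGAAGC"]

        def kmers(seq, k=4):
            """Represent a DNA sequence as its bag of overlapping k-mers ('words')."""
            return " ".join(seq[i:i + k] for i in range(len(seq) - k + 1))

        docs = [kmers(s) for s in seqs]
        X = CountVectorizer().fit_transform(docs)

        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
        print(lda.transform(X).round(2))  # per-sequence topic mixtures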

  1. Learning and Model-checking Networks of I/O Automata

    DEFF Research Database (Denmark)

    Mao, Hua; Jaeger, Manfred

    2012-01-01

    We introduce a new statistical relational learning (SRL) approach in which models for structured data, especially network data, are constructed as networks of communicating finite probabilistic automata. Leveraging existing automata learning methods from the area of grammatical inference, we can...... learn generic models for network entities in the form of automata templates. As is characteristic for SRL techniques, the abstraction level afforded by learning generic templates enables one to apply the learned model to new domains. A main benefit of learning models based on finite automata lies in the fact...

  2. Integrating statistical and process-based models to produce probabilistic landslide hazard at regional scale

    Science.gov (United States)

    Strauch, R. L.; Istanbulluoglu, E.

    2017-12-01

    We develop a landslide hazard modeling approach that integrates a data-driven statistical model and a probabilistic process-based shallow landslide model for mapping probability of landslide initiation, transport, and deposition at regional scales. The empirical model integrates the influence of seven site attribute (SA) classes: elevation, slope, curvature, aspect, land use-land cover, lithology, and topographic wetness index, on over 1,600 observed landslides using a frequency ratio (FR) approach. A susceptibility index is calculated by adding FRs for each SA on a grid-cell basis. Using landslide observations we relate susceptibility index to an empirically-derived probability of landslide impact. This probability is combined with results from a physically-based model to produce an integrated probabilistic map. Slope was key in landslide initiation while deposition was linked to lithology and elevation. Vegetation transition from forest to alpine vegetation and barren land cover with lower root cohesion leads to higher frequency of initiation. Aspect effects are likely linked to differences in root cohesion and moisture controlled by solar insolation and snow. We demonstrate the model in the North Cascades of Washington, USA and identify locations of high and low probability of landslide impacts that can be used by land managers in their design, planning, and maintenance.
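
    A minimal sketch of the frequency ratio (FR) computation at the heart of the statistical component: for each class of a site attribute, FR is the share of landslide cells in the class divided by the share of all cells in the class, and a cell's susceptibility index sums the FRs of its classes. The raster below is synthetic.

        import numpy as np

        def frequency_ratio(attr_class, landslide):
            """FR per class: (share of landslide cells in class) / (share of all cells in class)."""
            attr_class, landslide = np.asarray(attr_class), np.asarray(landslide, bool)
            fr = {}
            for c in np.unique(attr_class):
                in_c = attr_class == c
                fr[c] = (landslide[in_c].sum() / landslide.sum()) / in_c.mean()
            return fr

        # Toy raster: slope classes 0-2 with landslides concentrated in class 2
        rng = np.random.default_rng(2)
        slope_class = rng.integers(0, 3, 10_000)
        p_slide = np.where(slope_class == 2, 0.05, 0.005)
        slides = rng.random(10_000) < p_slide

        # A cell's susceptibility index would sum FRs over all its attribute classes
        print({int(k): round(v, 2) for k, v in frequency_ratio(slope_class, slides).items()})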

  3. Fear of negative evaluation biases social evaluation inference: evidence from a probabilistic learning task.

    Science.gov (United States)

    Button, Katherine S; Kounali, Daphne; Stapinski, Lexine; Rapee, Ronald M; Lewis, Glyn; Munafò, Marcus R

    2015-01-01

    Fear of negative evaluation (FNE) defines social anxiety yet the process of inferring social evaluation, and its potential role in maintaining social anxiety, is poorly understood. We developed an instrumental learning task to model social evaluation learning, predicting that FNE would specifically bias learning about the self but not others. During six test blocks (3 self-referential, 3 other-referential), participants (n = 100) met six personas and selected a word from a positive/negative pair to finish their social evaluation sentences "I think [you are / George is]…". Feedback contingencies corresponded to 3 rules, liked, neutral and disliked, with P[positive word correct] = 0.8, 0.5 and 0.2, respectively. As FNE increased participants selected fewer positive words (β = -0.4, 95% CI -0.7, -0.2, p = 0.001), which was strongest in the self-referential condition (FNE × condition 0.28, 95% CI 0.01, 0.54, p = 0.04), and the neutral and dislike rules (FNE × condition × rule, p = 0.07). At low FNE the proportion of positive words selected for self-neutral and self-disliked greatly exceeded the feedback contingency, indicating poor learning, which improved as FNE increased. FNE is associated with differences in processing social-evaluative information specifically about the self. At low FNE this manifests as insensitivity to learning negative self-referential evaluation. High FNE individuals are equally sensitive to learning positive or negative evaluation, which although objectively more accurate, may have detrimental effects on mental health.

  4. Fear of Negative Evaluation Biases Social Evaluation Inference: Evidence from a Probabilistic Learning Task

    Science.gov (United States)

    Button, Katherine S.; Kounali, Daphne; Stapinski, Lexine; Rapee, Ronald M.; Lewis, Glyn; Munafò, Marcus R.

    2015-01-01

    Background Fear of negative evaluation (FNE) defines social anxiety yet the process of inferring social evaluation, and its potential role in maintaining social anxiety, is poorly understood. We developed an instrumental learning task to model social evaluation learning, predicting that FNE would specifically bias learning about the self but not others. Methods During six test blocks (3 self-referential, 3 other-referential), participants (n = 100) met six personas and selected a word from a positive/negative pair to finish their social evaluation sentences “I think [you are / George is]…”. Feedback contingencies corresponded to 3 rules, liked, neutral and disliked, with P[positive word correct] = 0.8, 0.5 and 0.2, respectively. Results As FNE increased participants selected fewer positive words (β = −0.4, 95% CI −0.7, −0.2, p = 0.001), which was strongest in the self-referential condition (FNE × condition 0.28, 95% CI 0.01, 0.54, p = 0.04), and the neutral and dislike rules (FNE × condition × rule, p = 0.07). At low FNE the proportion of positive words selected for self-neutral and self-disliked greatly exceeded the feedback contingency, indicating poor learning, which improved as FNE increased. Conclusions FNE is associated with differences in processing social-evaluative information specifically about the self. At low FNE this manifests as insensitivity to learning negative self-referential evaluation. High FNE individuals are equally sensitive to learning positive or negative evaluation, which although objectively more accurate, may have detrimental effects on mental health. PMID:25853835

  5. Probabilistic modelling of the high-pressure arc cathode spot displacement dynamic

    International Nuclear Information System (INIS)

    Coulombe, Sylvain

    2003-01-01

    A probabilistic modelling approach for the study of the cathode spot displacement dynamic in high-pressure arc systems is developed in an attempt to interpret the observed voltage fluctuations. The general framework of the model allows one to define simple probabilistic displacement rules, the so-called cathode spot dynamic rules, for various possible surface states (un-arced metal, arced, contaminated) and to study the resulting dynamic of the cathode spot displacements over one or several arc passages. The displacements of the type-A cathode spot (macro-spot) in a magnetically rotating arc using concentric electrodes made up of either clean or contaminated metal surfaces are considered. Experimental observations for this system revealed a 1/f^~1 signature in the frequency power spectrum (FPS) of the arc voltage for anchoring arc conditions on the cathode (e.g. a clean metal surface), while it shows a 'white noise' signature for conditions favouring a smooth movement (e.g. an oxide-contaminated cathode surface). Through an appropriate choice of the local probabilistic displacement rules, the model is able to correctly represent the dynamic behaviours of the type-A cathode spot, including the FPS for the arc elongation (i.e. voltage) and the formation of the arc erosion trace. The model illustrates that the cathode spot displacements between re-strikes can be seen as a diffusion process with a diffusion constant that depends on the surface structure. A physical interpretation for the jumping probability associated with the re-strike event is given in terms of the electron emission processes across dielectric contaminants present on the cathode surface.

  6. Deterministic and probabilistic interval prediction for short-term wind power generation based on variational mode decomposition and machine learning methods

    International Nuclear Information System (INIS)

    Zhang, Yachao; Liu, Kaipei; Qin, Liang; An, Xueli

    2016-01-01

    Highlights: • Variational mode decomposition is adopted to process original wind power series. • A novel combined model based on machine learning methods is established. • An improved differential evolution algorithm is proposed for weight adjustment. • Probabilistic interval prediction is performed by quantile regression averaging. - Abstract: Due to the increasingly significant energy crisis nowadays, the exploitation and utilization of new clean energy gains more and more attention. As an important category of renewable energy, wind power generation has become the most rapidly growing renewable energy in China. However, the intermittency and volatility of wind power have restricted the large-scale integration of wind turbines into power systems. High-precision wind power forecasting is an effective measure to alleviate the negative influence of wind power generation on power systems. In this paper, a novel combined model is proposed to improve the prediction performance for short-term wind power forecasting. Variational mode decomposition is firstly adopted to handle the instability of the raw wind power series, and the subseries can be reconstructed by measuring sample entropy of the decomposed modes. Then the base models can be established for each subseries respectively. On this basis, the combined model is developed based on the optimal virtual prediction scheme, the weight matrix of which is dynamically adjusted by a self-adaptive multi-strategy differential evolution algorithm. Besides, a probabilistic interval prediction model based on quantile regression averaging and variational mode decomposition-based hybrid models is presented to quantify the potential risks of the wind power series. The simulation results indicate that: (1) the normalized mean absolute errors of the proposed combined model from one-step to three-step forecasting are 4.34%, 6.49% and 7.76%, respectively, which are much lower than those of the base models and the hybrid
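
    The probabilistic step of the paper, quantile regression averaging, can be sketched independently of the VMD-based point forecasters: observed values are regressed on a set of point forecasts, one quantile regression per nominal level. The data and forecasters below are synthetic stand-ins.

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        rng = np.random.default_rng(9)
        n = 1_000
        truth = 50 + 10 * np.sin(np.linspace(0, 20, n)) + rng.normal(0, 3, n)

        # Two imperfect point forecasters (stand-ins for the VMD-based hybrid models)
        df = pd.DataFrame({
            "y": truth,
            "f1": truth + rng.normal(1.0, 2.0, n),
            "f2": truth + rng.normal(-0.5, 4.0, n),
        })

        # Quantile regression averaging: one quantile regression per nominal level
        for q in (0.1, 0.5, 0.9):
            fit = smf.quantreg("y ~ f1 + f2", df).fit(q=q)
            cover = (df["y"] <= fit.predict(df)).mean()
            print(f"q={q}: empirical coverage {cover:.2f}, coefs {fit.params.round(2).to_dict()}")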

  7. Probabilistic Elastic Part Model: A Pose-Invariant Representation for Real-World Face Verification.

    Science.gov (United States)

    Li, Haoxiang; Hua, Gang

    2018-04-01

    Pose variation remains a major challenge for real-world face recognition. We approach this problem through a probabilistic elastic part model. We extract local descriptors (e.g., LBP or SIFT) from densely sampled multi-scale image patches. By augmenting each descriptor with its location, a Gaussian mixture model (GMM) is trained to capture the spatial-appearance distribution of the face parts of all face images in the training corpus, namely the probabilistic elastic part (PEP) model. Each mixture component of the GMM is confined to be a spherical Gaussian to balance the influence of the appearance and the location terms, which naturally defines a part. Given one or multiple face images of the same subject, the PEP model builds its PEP representation by sequentially concatenating descriptors identified by each Gaussian component in a maximum likelihood sense. We further propose a joint Bayesian adaptation algorithm to adapt the universally trained GMM to better model the pose variations between the target pair of faces/face tracks, which consistently improves face verification accuracy. Our experiments show that we achieve state-of-the-art face verification accuracy with the proposed representations on the Labeled Face in the Wild (LFW) dataset, the YouTube video face database, and the CMU MultiPIE dataset.

  8. Probabilistic models for steel corrosion loss and pitting of marine infrastructure

    International Nuclear Information System (INIS)

    Melchers, R.E.; Jeffrey, R.J.

    2008-01-01

    With the increasing emphasis on retaining ageing infrastructure in service, models for the description and prediction of corrosion losses and of maximum pit depth are of increasing interest. In most cases assessment and prediction will be done in a probabilistic risk assessment framework, and this then requires probabilistic corrosion models. Recently, novel models for corrosion loss and maximum pit depth under marine immersion conditions have been developed. The models show that both corrosion loss and pit depth progress in a non-linear fashion with increased exposure time, and do so in a non-monotonic manner as a result of the controlling corrosion process changing from oxidation to being influenced by bacterial action. For engineers the importance of this lies in the fact that conventional 'corrosion rates' have no validity, particularly for the long-term corrosion effects relevant to deteriorated infrastructure. The models are consistent with corrosion science principles as well as with current understanding of the considerable influence of bacterial processes on corrosion loss and pitting. The considerable practical implications of this are described.

  9. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    Energy Technology Data Exchange (ETDEWEB)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results. Exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink software environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost-efficient and easy-to-use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the use of numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, the biosphere modules for dose assessments used in the earlier safety assessment project SR 97 were implemented. Acquired results for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the

  10. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    Science.gov (United States)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
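
    The MPN estimate itself is a small maximum likelihood problem: under the standard assumption that a tube is non-sterile whenever it receives at least one organism, a tube at volume v is positive with probability 1 - exp(-c·v), and the concentration c is chosen to maximize the binomial likelihood across dilutions. The sketch below uses a conventional three-dilution, five-tube design with invented tube counts.

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Standard 3-dilution MPN design: volumes (mL) and tubes per dilution
        volumes = np.array([10.0, 1.0, 0.1])
        n_tubes = np.array([5, 5, 5])
        positives = np.array([5, 3, 1])  # observed non-sterile tubes (illustrative)

        def neg_log_lik(log_c):
            """Binomial likelihood: a tube is positive iff >=1 organism, P = 1-exp(-c*v)."""
            c = np.exp(log_c)
            p = np.clip(1.0 - np.exp(-c * volumes), 1e-12, 1 - 1e-12)
            return -np.sum(positives * np.log(p) + (n_tubes - positives) * np.log(1 - p))

        res = minimize_scalar(neg_log_lik, bounds=(-8, 8), method="bounded")
        print(f"MPN (MLE) = {np.exp(res.x):.2f} organisms per mL")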

  11. Tensit - a novel probabilistic simulation tool for safety assessments. Tests and verifications using biosphere models

    International Nuclear Information System (INIS)

    Jones, Jakob; Vahlund, Fredrik; Kautsky, Ulrik

    2004-06-01

    This report documents the verification of a new simulation tool for dose assessment put together in a package under the name Tensit (Technical Nuclide Simulation Tool). The tool is developed to solve differential equation systems describing transport and decay of radionuclides. It is capable of handling both deterministic and probabilistic simulations. The verifications undertaken show good results. Exceptions exist only where the reference results are unclear. Tensit utilises and connects two separate commercial software packages. The equation solving capability is derived from the Matlab/Simulink software environment, to which Tensit adds a library of interconnectable building blocks. Probabilistic simulations are provided through a statistical software package named @Risk that communicates with Matlab/Simulink. More information about these packages can be found at www.palisade.com and www.mathworks.com. The underlying intention of developing this new tool has been to make available a cost-efficient and easy-to-use means for advanced dose assessment simulations. The mentioned benefits are gained both through the graphical user interface provided by Simulink and @Risk, and through the use of numerical equation solving routines in Matlab. To verify Tensit's numerical correctness, the biosphere modules for dose assessments used in the earlier safety assessment project SR 97 were implemented. Acquired results for deterministic as well as probabilistic simulations have been compared with documented values. Additional verification has been made both with another simulation tool named AMBER and against the international test case from PSACOIN named Level 1B. This report documents the models used for verification with equations and parameter values so that the results can be recreated. For a background and a more detailed description of the underlying processes in the models, the reader is referred to the original references. Finally, in the perspective of

  12. Probabilistic representation in syllogistic reasoning: A theory to integrate mental models and heuristics.

    Science.gov (United States)

    Hattori, Masasi

    2016-12-01

    This paper presents a new theory of syllogistic reasoning. The proposed model assumes there are probabilistic representations of given signature situations. Instead of conducting an exhaustive search, the model constructs an individual-based "logical" mental representation that expresses the most probable state of affairs, and derives a necessary conclusion that is not inconsistent with the model using heuristics based on informativeness. The model is a unification of previous influential models. Its descriptive validity has been evaluated against existing empirical data and two new experiments, and by qualitative analyses based on previous empirical findings, all of which supported the theory. The model's behavior is also consistent with findings in other areas, including working memory capacity. The results indicate that people assume the probabilities of all target events mentioned in a syllogism to be almost equal, which suggests links between syllogistic reasoning and other areas of cognition. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.

  13. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    Science.gov (United States)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting

  15. Resolution and Probabilistic Models of Components in CryoEM Maps of Mature P22 Bacteriophage

    Science.gov (United States)

    Pintilie, Grigore; Chen, Dong-Hua; Haase-Pettingell, Cameron A.; King, Jonathan A.; Chiu, Wah

    2016-01-01

    CryoEM continues to produce density maps of larger and more complex assemblies with multiple protein components of mixed symmetries. Resolution is not always uniform throughout a cryoEM map, and it can be useful to estimate the resolution in specific molecular components of a large assembly. In this study, we present procedures to 1) estimate the resolution in subcomponents by gold-standard Fourier shell correlation (FSC); 2) validate modeling procedures, particularly at medium resolutions, which can include loop modeling and flexible fitting; and 3) build probabilistic models that combine high-accuracy priors (such as crystallographic structures) with medium-resolution cryoEM densities. As an example, we apply these methods to new cryoEM maps of the mature bacteriophage P22, reconstructed without imposing icosahedral symmetry. Resolution estimates based on gold-standard FSC show the highest resolution in the coat region (7.6 Å), whereas other components are at slightly lower resolutions: portal (9.2 Å), hub (8.5 Å), tailspike (10.9 Å), and needle (10.5 Å). These differences are indicative of inherent structural heterogeneity and/or reconstruction accuracy in different subcomponents of the map. Probabilistic models for these subcomponents provide, to our knowledge, new insights and structural information when uncertainty arising from the limitations of the observed density is taken into account. PMID:26743049

  16. From Cyclone Tracks to the Costs of European Winter Storms: A Probabilistic Loss Assessment Model

    Science.gov (United States)

    Orwig, K.; Renggli, D.; Corti, T.; Reese, S.; Wueest, M.; Viktor, E.; Zimmerli, P.

    2014-12-01

    European winter storms cause billions of dollars of insured losses every year. Therefore, it is essential to understand potential impacts of future events, and the role reinsurance can play to mitigate the losses. The authors will present an overview of natural catastrophe risk assessment modeling in the reinsurance industry, and the development of an innovative new approach for modeling the risk associated with European winter storms. The new approach includes the development of physically meaningful probabilistic (i.e. simulated) events for European winter storm loss assessment. The meteorological hazard component of the new model is based on cyclone and windstorm tracks identified in the 20th Century Reanalysis data. Knowledge of the evolution of winter storms both in time and space allows the physically meaningful perturbation of historical event properties (e.g. track, intensity, etc.). The perturbation includes a random element but also takes the local climatology and the evolution of the historical event into account. The low-resolution wind footprints taken from the 20th Century Reanalysis are processed by a statistical-dynamical downscaling to generate high-resolution footprints for both the simulated and historical events. Downscaling transfer functions are generated using ENSEMBLES regional climate model data. The result is a set of reliable probabilistic events representing thousands of years. The event set is then combined with country- and site-specific vulnerability functions and detailed market- or client-specific information to compute annual expected losses.

  17. Identifiability of tree-child phylogenetic networks under a probabilistic recombination-mutation model of evolution.

    Science.gov (United States)

    Francis, Andrew; Moulton, Vincent

    2018-06-07

    Phylogenetic networks are an extension of phylogenetic trees which are used to represent evolutionary histories in which reticulation events (such as recombination and hybridization) have occurred. A central question for such networks is that of identifiability, which essentially asks under what circumstances can we reliably identify the phylogenetic network that gave rise to the observed data? Recently, identifiability results have appeared for networks relative to a model of sequence evolution that generalizes the standard Markov models used for phylogenetic trees. However, these results are quite limited in terms of the complexity of the networks that are considered. In this paper, by introducing an alternative probabilistic model for evolution along a network that is based on some ground-breaking work by Thatte for pedigrees, we are able to obtain an identifiability result for a much larger class of phylogenetic networks (essentially the class of so-called tree-child networks). To prove our main theorem, we derive some new results for identifying tree-child networks combinatorially, and then adapt some techniques developed by Thatte for pedigrees to show that our combinatorial results imply identifiability in the probabilistic setting. We hope that the introduction of our new model for networks could lead to new approaches to reliably construct phylogenetic networks. Copyright © 2018 Elsevier Ltd. All rights reserved.

  18. Multi-Objective Demand Response Model Considering the Probabilistic Characteristic of Price Elastic Load

    Directory of Open Access Journals (Sweden)

    Shengchun Yang

    2016-01-01

    Demand response (DR) programs provide an effective approach for dealing with the challenge of wind power output fluctuations. Given that uncertain DR, such as price elastic load (PEL), plays an important role, the uncertainty of demand response behavior must be studied. In this paper, a multi-objective stochastic optimization problem for PEL is proposed on the basis of an analysis of the relationship between price elasticity and the probabilistic characteristics of stochastic demand models for consumer loads. The analysis aims to improve the capability of accommodating wind output uncertainty. In our approach, the relationship between the amount of demand response and interaction efficiency is developed by actively participating in power grid interaction. The probabilistic representation and uncertainty range of the PEL demand response amount are formulated differently compared with those of previous research. Based on the aforementioned findings, a stochastic optimization model with the combined uncertainties from the wind power output and the demand response scenario is proposed. The proposed model analyzes the demand response behavior of PEL by maximizing the electricity consumption satisfaction and interaction benefit satisfaction of PEL. Finally, a case simulation on a provincial power grid with a 151-bus system verifies the effectiveness and feasibility of the proposed mechanism and models.

  19. Probabilistic flood inundation mapping at ungauged streams due to roughness coefficient uncertainty in hydraulic modelling

    Science.gov (United States)

    Papaioannou, George; Vasiliades, Lampros; Loukas, Athanasios; Aronica, Giuseppe T.

    2017-04-01

    Probabilistic flood inundation mapping is performed and analysed at the ungauged Xerias stream reach, Volos, Greece. The study evaluates the uncertainty introduced by the roughness coefficient values of hydraulic models in flood inundation modelling and mapping. The well-established one-dimensional (1-D) hydraulic model HEC-RAS is selected and linked to Monte Carlo simulations of hydraulic roughness. Terrestrial Laser Scanner data have been used to produce a high-quality DEM to minimise input data uncertainty and to improve the accuracy of the stream channel topography required by the hydraulic model. Initial Manning's n roughness coefficient values are based on pebble count field surveys and empirical formulas. Various theoretical probability distributions are fitted and evaluated on their accuracy to represent the estimated roughness values. Finally, Latin Hypercube Sampling has been used to generate different sets of Manning roughness values, and flood inundation probability maps have been created with the use of Monte Carlo simulations. Historical flood extent data, from an extreme historical flash flood event, are used for validation of the method. The calibration process is based on binary wet-dry reasoning with the use of the Median Absolute Percentage Error evaluation metric. The results show that the proposed procedure supports probabilistic flood hazard mapping at ungauged rivers and provides water resources managers with valuable information for planning and implementing flood risk mitigation strategies.
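
    The sampling step described above can be sketched as follows (a hedged illustration, not the study's code: the lognormal parameters for Manning's n are placeholders, and the hydraulic-model runs themselves are only indicated in a comment):

    # Latin Hypercube Sampling of Manning's n to drive Monte Carlo runs of a
    # hydraulic model (HEC-RAS itself is not scripted here).
    import numpy as np
    from scipy.stats import qmc, lognorm

    n_samples = 1000
    sampler = qmc.LatinHypercube(d=1, seed=42)
    u = sampler.random(n=n_samples).ravel()     # stratified uniforms in (0, 1)

    # Map uniforms through the inverse CDF of the fitted roughness distribution.
    dist = lognorm(s=0.3, scale=0.045)          # assumed Manning's n distribution
    manning_n = dist.ppf(u)

    # Each sample would parameterise one hydraulic-model run; the inundation
    # probability of a cell is then the fraction of runs that wet the cell.
    print(manning_n.min(), np.median(manning_n), manning_n.max())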

  20. Ignorability in Statistical and Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed...

  1. Learning topic models by belief propagation.

    Science.gov (United States)

    Zeng, Jia; Cheung, William K; Liu, Jiming

    2013-05-01

    Latent Dirichlet allocation (LDA) is an important hierarchical Bayesian model for probabilistic topic modeling, which attracts worldwide interest and touches on many important applications in text mining, computer vision and computational biology. This paper represents collapsed LDA as a factor graph, which enables the classic loopy belief propagation (BP) algorithm to be used for approximate inference and parameter estimation. Although the two commonly used approximate inference methods, variational Bayes (VB) and collapsed Gibbs sampling (GS), have achieved great success in learning LDA, the proposed BP is competitive in both speed and accuracy, as validated by encouraging experimental results on four large-scale document datasets. Furthermore, the BP algorithm has the potential to become a generic scheme for learning variants of LDA-based topic models in the collapsed space. To this end, we show how to learn two typical variants of LDA-based topic models, the author-topic model (ATM) and the relational topic model (RTM), using BP based on the factor graph representations.
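
    A toy synchronous variant of the message passing described above might look as follows (a simplified sketch, not the paper's algorithm or schedule; the corpus, hyperparameters, and iteration count are arbitrary):

    # Toy loopy BP for collapsed LDA: each word token carries a message mu[i, k]
    # over K topics; updates exclude the token's own message from the counts.
    import numpy as np

    rng = np.random.default_rng(0)
    D, W, K = 20, 50, 5                         # docs, vocabulary size, topics
    alpha, beta = 0.1, 0.01                     # Dirichlet hyperparameters (assumed)
    docs = rng.integers(0, W, size=(D, 30))     # 30 word tokens per document

    doc_id = np.repeat(np.arange(D), 30)        # flatten the tokens
    word_id = docs.ravel()
    mu = rng.dirichlet(np.ones(K), size=len(word_id))   # per-token topic messages

    for _ in range(50):                         # synchronous message updates
        theta = np.zeros((D, K)); np.add.at(theta, doc_id, mu)   # doc-topic counts
        phi = np.zeros((W, K)); np.add.at(phi, word_id, mu)      # word-topic counts
        doc_part = theta[doc_id] - mu + alpha
        word_part = (phi[word_id] - mu + beta) / (phi.sum(axis=0) - mu + W * beta)
        mu = doc_part * word_part
        mu /= mu.sum(axis=1, keepdims=True)     # normalise messages over topics

    phi = np.zeros((W, K)); np.add.at(phi, word_id, mu)
    print("top words of topic 0:", np.argsort(-phi[:, 0])[:5])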

  2. A probabilistic model-based soft sensor to monitor lactic acid bacteria fermentations

    DEFF Research Database (Denmark)

    Spann, Robert; Roca, Christophe; Kold, David

    2018-01-01

    A probabilistic soft sensor based on a mechanistic model was designed to monitor S. thermophilus fermentations, and validated with experimental lab-scale data. It considered uncertainties in the initial conditions, on-line measurements, and model parameters by performing Monte Carlo simulations within the monitoring system. It predicted, therefore, the probability distributions of the unmeasured states, such as biomass, lactose, and lactic acid concentrations. To this end, a mechanistic model was developed first, and a statistical parameter estimation was performed in order to assess the uncertainty of the model parameters that were then used as input to the mechanistic model. The soft sensor predicted both the current state variables, as well as the future course of the fermentation, e.g. with a relative mean error of the biomass concentration of 8 %. This successful implementation of a process...
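
    The Monte Carlo idea can be illustrated with a generic fermentation model (a minimal sketch under assumed Monod kinetics, not the authors' S. thermophilus model; all parameter values and uncertainties are hypothetical):

    # Propagate uncertainty in initial conditions and kinetic parameters
    # through a simple fermentation model to get state distributions.
    import numpy as np
    from scipy.integrate import solve_ivp

    def monod(t, y, mu_max, Ks, Yxs):
        X, S = y                                  # biomass, substrate (g/L)
        mu = mu_max * S / (Ks + S)
        return [mu * X, -mu * X / Yxs]

    rng = np.random.default_rng(1)
    finals = []
    for _ in range(300):
        mu_max = rng.normal(0.8, 0.05)            # 1/h, assumed
        Ks = rng.normal(0.5, 0.05)                # g/L, assumed
        Yxs = rng.normal(0.4, 0.02)               # g/g, assumed
        X0 = rng.normal(0.1, 0.01)                # uncertain inoculum
        sol = solve_ivp(monod, (0, 10), [X0, 20.0], args=(mu_max, Ks, Yxs))
        finals.append(sol.y[0, -1])

    print("predicted biomass at t=10 h: mean %.2f, std %.2f g/L"
          % (np.mean(finals), np.std(finals)))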

  3. The Implementation of Vendor Managed Inventory In the Supply Chain with Simple Probabilistic Inventory Model

    Directory of Open Access Journals (Sweden)

    Anna Ika Deefi

    2016-01-01

    Numerous studies show that the implementation of Vendor Managed Inventory (VMI) benefits all members of the supply chain. This research develops a model to prove analytically the benefits obtained from implementing VMI in a supplier-buyer partnership. The model considers a two-level supply chain consisting of a single supplier and a single buyer. The analytical model is developed for supply chain inventory with probabilistic demand that follows a normal distribution. The model also incorporates lead time as a decision variable and investigates the impacts of inventory management before and after the implementation of VMI. The result shows that the analytical model has the ability to reduce the supply chain expected cost, improve the service level and increase inventory replenishment. Numerical examples are given to illustrate these results.

  4. Probabilistic, multi-variate flood damage modelling using random forests and Bayesian networks

    Science.gov (United States)

    Kreibich, Heidi; Schröter, Kai

    2015-04-01

    Decisions on flood risk management and adaptation are increasingly based on risk analyses. Such analyses are associated with considerable uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention recently, they are hardly applied in flood damage assessments. Most of the damage models usually applied in standard practice have in common that complex damaging processes are described by simple, deterministic approaches like stage-damage functions. This presentation will show approaches for probabilistic, multi-variate flood damage modelling on the micro- and meso-scale and discuss their potential and limitations. References: Merz, B., Kreibich, H., Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Schröter, K., Kreibich, H., Vogel, K., Riggelsen, C., Scherbaum, F., Merz, B. (2014): How useful are complex flood damage models? Water Resources Research, 50(4), 3378-3395.

  5. Using Bayesian Model Averaging (BMA) to calibrate probabilistic surface temperature forecasts over Iran

    Energy Technology Data Exchange (ETDEWEB)

    Soltanzadeh, I. [Tehran Univ. (Iran, Islamic Republic of). Inst. of Geophysics; Azadi, M.; Vakili, G.A. [Atmospheric Science and Meteorological Research Center (ASMERC), Teheran (Iran, Islamic Republic of)

    2011-07-01

    Using Bayesian Model Averaging (BMA), an attempt was made to obtain calibrated probabilistic numerical forecasts of 2-m temperature over Iran. The ensemble employs three limited area models (WRF, MM5 and HRM), with WRF used with five different configurations. Initial and boundary conditions for MM5 and WRF are obtained from the National Centers for Environmental Prediction (NCEP) Global Forecast System (GFS) and for HRM the initial and boundary conditions come from analysis of Global Model Europe (GME) of the German Weather Service. The resulting ensemble of seven members was run for a period of 6 months (from December 2008 to May 2009) over Iran. The 48-h raw ensemble outputs were calibrated using the BMA technique for 120 days, using a 40-day training sample of forecasts and the corresponding verification data. The calibrated probabilistic forecasts were assessed using rank histogram and attribute diagrams. Results showed that application of BMA improved the reliability of the raw ensemble. Using the weighted ensemble mean forecast as a deterministic forecast, it was found that the deterministic-style BMA forecasts usually performed better than the best member's deterministic forecast. (orig.)
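
    The calibration step can be sketched with the standard EM scheme for Gaussian BMA (a simplification of Raftery et al.'s formulation; the training data below are synthetic stand-ins for the 40-day sample):

    # EM estimation of BMA member weights and a common Gaussian spread from
    # training forecasts f[i, k] and verifying observations y[i].
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(7)
    n, K = 200, 7                       # training cases, ensemble members
    truth = rng.normal(15, 5, n)
    f = truth[:, None] + rng.normal(0, 2, (n, K)) + rng.normal(0, 1, K)  # biased members
    y = truth + rng.normal(0, 0.5, n)

    w = np.full(K, 1.0 / K)
    sigma = 2.0
    for _ in range(100):                              # EM iterations
        dens = w * norm.pdf(y[:, None], loc=f, scale=sigma)
        z = dens / dens.sum(axis=1, keepdims=True)    # E-step: membership probabilities
        w = z.mean(axis=0)                            # M-step: member weights
        sigma = np.sqrt((z * (y[:, None] - f) ** 2).sum() / n)

    print("BMA weights:", np.round(w, 3), "sigma:", round(sigma, 2))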

  7. Probabilistic Design and Management of Sustainable Concrete Infrastructure Using Multi-Physics Service Life Models

    DEFF Research Database (Denmark)

    Lepech, Michael; Geiker, Mette; Michel, Alexander

    This paper looks to address the grand challenge of integrating construction materials engineering research within a multi-scale, inter-disciplinary research and management framework for sustainable concrete infrastructure. The ultimate goal is to drive sustainability-focused innovation and adoption cycles in the broader architecture, engineering, construction (AEC) industry. Specifically, a probabilistic design framework for sustainable concrete infrastructure and a multi-physics service life model for reinforced concrete are presented as important points of integration for innovation. The probabilistic design framework consists of concrete service life models and life cycle assessment (LCA) models. Both types of models (service life and LCA) are formulated stochastically so that the service life and time(s) to repair, as well as total sustainability impact, are described by a probability distribution. A central...

  8. Infrared maritime target detection using a probabilistic single Gaussian model of sea clutter in Fourier domain

    Science.gov (United States)

    Zhou, Anran; Xie, Weixin; Pei, Jihong; Chen, Yapei

    2018-02-01

    For ship target detection in cluttered infrared image sequences, a robust detection method based on a probabilistic single Gaussian model of the sea background in the Fourier domain is put forward. The amplitude spectrum sequences at each frequency point of pure seawater images in the Fourier domain, being more stable than the gray value sequences of each background pixel in the spatial domain, are modeled as a Gaussian. Next, a probability weighted matrix is built based on the stability of the pure seawater's total energy spectrum in the row direction, to make the Gaussian model more accurate. Then, the foreground frequency points are separated from the background frequency points by the model. Finally, false-alarm points are removed utilizing the ships' shape features. The performance of the proposed method is tested by visual and quantitative comparisons with other methods.
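
    The detection idea reduces to a per-frequency Gaussian background model, as in this sketch (illustrative only; the frames are synthetic, and the probability-weighting and shape-feature steps are omitted):

    # Fit a Gaussian to the amplitude spectrum at each frequency point of
    # clutter-only frames, then flag strongly deviating points in a test frame.
    import numpy as np

    rng = np.random.default_rng(3)
    frames = rng.normal(0, 1, (50, 64, 64))           # clutter-only training frames
    amp = np.abs(np.fft.fft2(frames, axes=(1, 2)))    # amplitude spectra

    mu = amp.mean(axis=0)                             # per-frequency Gaussian mean
    sd = amp.std(axis=0) + 1e-9                       # per-frequency spread

    test = rng.normal(0, 1, (64, 64))
    test[20:24, 30:34] += 3.0                         # a weak synthetic "target"
    test_amp = np.abs(np.fft.fft2(test))

    zscore = (test_amp - mu) / sd                     # deviation from the background model
    foreground = zscore > 4.0                         # candidate foreground frequency points
    print("flagged frequency points:", int(foreground.sum()))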

  9. Uniform and localized corrosion modelling by means of probabilistic cellular automata

    International Nuclear Information System (INIS)

    Perez-Brokate, Cristian

    2016-01-01

    Numerical modelling is a complementary tool for corrosion prediction. The objective of this work is to develop a corrosion model by means of a probabilistic cellular automata approach at a mesoscopic scale. In this work, we study the morphological evolution and kinetics of corrosion. The model couples electrochemical oxidation and reduction reactions. Regarding kinetics, cellular automata models are able to describe current as a function of the applied potential for a redox reaction on an inert electrode. The inclusion of probabilities allows the description of the stochastic nature of anodic and cathodic reactions. Corrosion morphology has been studied in different contexts: generalised corrosion, pitting corrosion and corrosion in an occluded environment. A general tendency of two regimes is found: a first regime of uniform corrosion, where the anodic and cathodic reactions occur homogeneously over the surface, and a second regime of localized corrosion, when there is a spatial separation of anodic and cathodic zones with an increase of the anodic reaction rate. (author)
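
    A toy probabilistic cellular automaton in this spirit (not the thesis code; the neighbourhood rule and event probabilities are invented for illustration) can be written as:

    # 2-D lattice in which sites corrode with a probability that is higher
    # next to already-corroded sites, mimicking localisation of attack.
    import numpy as np

    rng = np.random.default_rng(5)
    N, steps = 100, 200
    corroded = np.zeros((N, N), dtype=bool)
    corroded[0, N // 2] = True                     # initial defect at the surface

    p_base, p_neighbor = 0.001, 0.15               # assumed event probabilities
    for _ in range(steps):
        # count corroded neighbours (4-neighbourhood, non-periodic edges)
        nb = np.zeros((N, N), dtype=int)
        nb[1:, :] += corroded[:-1, :]
        nb[:-1, :] += corroded[1:, :]
        nb[:, 1:] += corroded[:, :-1]
        nb[:, :-1] += corroded[:, 1:]
        p = np.where(nb > 0, p_neighbor, p_base)
        corroded |= rng.random((N, N)) < p

    print("corroded fraction:", corroded.mean())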

  10. Probabilistic modeling of the flows and environmental risks of nano-silica

    International Nuclear Information System (INIS)

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-01-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053–3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg·y in the EU (0.19–12 mg/kg·y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data. - Highlights: • We quantify the exposure of nano-silica to technical systems and the environment. • The median concentration in surface waters is predicted to be 0.12 μg/L in the EU. • Probabilistic species sensitivity distributions were computed for surface waters. • The risk assessment suggests that nano-silica poses no risk to aquatic organisms.

  11. A probabilistic topic model for clinical risk stratification from electronic health records.

    Science.gov (United States)

    Huang, Zhengxing; Dong, Wei; Duan, Huilong

    2015-12-01

    Risk stratification aims to provide physicians with an accurate assessment of a patient's clinical risk such that an individualized prevention or management strategy can be developed and delivered. Existing risk stratification techniques mainly focus on predicting the overall risk of an individual patient in a supervised manner, and, at the cohort level, often offer little insight beyond a flat score-based segmentation from the labeled clinical dataset. To this end, in this paper, we propose a new approach for risk stratification by exploring a large volume of electronic health records (EHRs) in an unsupervised fashion. Along this line, this paper proposes a novel probabilistic topic modeling framework called the probabilistic risk stratification model (PRSM), based on Latent Dirichlet Allocation (LDA). The proposed PRSM recognizes a patient clinical state as a probabilistic combination of latent sub-profiles, and generates sub-profile-specific risk tiers of patients from their EHRs in a fully unsupervised fashion. The achieved stratification results can be easily recognized as high-, medium- and low-risk, respectively. In addition, we present an extension of PRSM, called weakly supervised PRSM (WS-PRSM), which incorporates minimum prior information into the model in order to improve the risk stratification accuracy and to make our models highly portable to risk stratification tasks of various diseases. We verify the effectiveness of the proposed approach on a clinical dataset containing 3463 coronary heart disease (CHD) patient instances. Both PRSM and WS-PRSM were compared with two established supervised risk stratification algorithms, i.e., logistic regression and support vector machine, and the results showed the effectiveness of our models in risk stratification of CHD in terms of Area Under the receiver operating characteristic Curve (AUC) analysis. Moreover, in comparison with PRSM, WS-PRSM achieves an over 2% performance gain on the experimental dataset, demonstrating that

  12. Probabilistic modelling of the high-pressure arc cathode spot displacement dynamic

    CERN Document Server

    Coulombe, S

    2003-01-01

    A probabilistic modelling approach for the study of the cathode spot displacement dynamic in high-pressure arc systems is developed in an attempt to interpret the observed voltage fluctuations. The general framework of the model allows one to define simple probabilistic displacement rules, the so-called cathode spot dynamic rules, for various possible surface states (un-arced metal, arced, contaminated) and to study the resulting dynamic of the cathode spot displacements over one or several arc passages. The displacements of the type-A cathode spot (macro-spot) in a magnetically rotating arc using concentric electrodes made up of either clean or contaminated metal surfaces are considered. Experimental observations for this system revealed a 1/f^(~1) signature in the frequency power spectrum (FPS) of the arc voltage for anchoring arc conditions on the cathode (e.g. clean metal surface), while it shows a 'white noise' signature for conditions favouring a smooth movement (e.g. ox...

  13. Effects of shipping on marine acoustic habitats in Canadian Arctic estimated via probabilistic modeling and mapping.

    Science.gov (United States)

    Aulanier, Florian; Simard, Yvan; Roy, Nathalie; Gervaise, Cédric; Bandet, Marion

    2017-12-15

    Canadian Arctic and Subarctic regions experience a rapid decrease of sea ice accompanied with increasing shipping traffic. The resulting time-space changes in shipping noise are studied for four key regions of this pristine environment, for 2013 traffic conditions and a hypothetical tenfold traffic increase. A probabilistic modeling and mapping framework, called Ramdam, which integrates the intrinsic variability and uncertainties of shipping noise and its effects on marine habitats, is developed and applied. A substantial transformation of soundscapes is observed in areas where shipping noise changes from a present occasional-transient contributor to a dominant noise source. Examination of impacts on low-frequency mammals within ecologically and biologically significant areas reveals that shipping noise has the potential to trigger behavioral responses and masking in the future, although no risk of temporary or permanent hearing threshold shifts is noted. Such probabilistic modeling and mapping is strategic in marine spatial planning around this emerging noise issue. Crown Copyright © 2017. Published by Elsevier Ltd. All rights reserved.

  14. A Probabilistic Model to Evaluate the Optimal Density of Stations Measuring Snowfall.

    Science.gov (United States)

    Schneebeli, Martin; Laternser, Martin

    2004-05-01

    Daily new snow measurements are very important for avalanche forecasting and tourism. A dense network of manual or automatic stations measuring snowfall is necessary to have spatially reliable data. Snow stations in Switzerland were built at partially subjective locations. A probabilistic model based on the frequency and spatial extent of areas covered by heavy snowfalls was developed to quantify the probability that snowfall events are measured by the stations. Area probability relations were calculated for different thresholds of daily accumulated snowfall. A probabilistic model, including autocorrelation, was used to calculate the optimal spacing of stations based on simulated triangular grids and to compare the capture probability of different networks and snowfall thresholds. The Swiss operational snow-stations network captured snowfall events with high probability, but the distribution of the stations could be optimized. The spatial variability increased with higher thresholds of daily accumulated snowfall, and the capture probability decreased with increasing thresholds. The method can be used for other areas where the area probability relation for threshold values of snow or rain can be calculated.
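
    The core question, how the capture probability depends on station spacing and event extent, can be sketched by direct Monte Carlo (a simplification that replaces real snowfall footprints with circles of fixed radius; all distances are illustrative):

    # Fraction of circular snowfall events that contain at least one station
    # of a triangular grid with a given spacing.
    import numpy as np

    def capture_probability(spacing, radius, n_events=20000, seed=11):
        rng = np.random.default_rng(seed)
        # stations on a triangular lattice covering a 100 x 100 km box
        xs = np.arange(0, 100, spacing)
        ys = np.arange(0, 100, spacing * np.sqrt(3) / 2)
        gx, gy = np.meshgrid(xs, ys)
        gx[1::2] += spacing / 2                      # offset every other row
        stations = np.column_stack([gx.ravel(), gy.ravel()])

        centers = rng.uniform(20, 80, (n_events, 2)) # event centres away from edges
        hits = 0
        for c in centers:
            d2 = ((stations - c) ** 2).sum(axis=1)
            hits += (d2.min() <= radius ** 2)
        return hits / n_events

    print(capture_probability(spacing=25.0, radius=15.0))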

  15. PBDE exposure from food in Ireland: optimising data exploitation in probabilistic exposure modelling.

    Science.gov (United States)

    Trudel, David; Tlustos, Christina; Von Goetz, Natalie; Scheringer, Martin; Hungerbühler, Konrad

    2011-01-01

    Polybrominated diphenyl ethers (PBDEs) are a class of brominated flame retardants added to plastics, polyurethane foam, electronics, textiles, and other products. These products release PBDEs into the indoor and outdoor environment, thus causing human exposure through food and dust. This study models PBDE dose distributions from ingestion of food for Irish adults on a congener basis, using two probabilistic and one semi-deterministic method. One of the probabilistic methods was newly developed and is based on summary statistics of food consumption combined with a model generating realistic daily energy supply from food. Median (intermediate) doses of total PBDEs are in the range of 0.4-0.6 ng/kg(bw)/day for Irish adults. The 97.5th percentiles of total PBDE doses lie in a range of 1.7-2.2 ng/kg(bw)/day, which is comparable to doses derived for Belgian and Dutch adults. BDE-47 and BDE-99 were identified as the congeners contributing most to estimated intakes, accounting for more than half of the total doses. The most influential food groups contributing to this intake are lean fish and salmon, which together account for about 22-25% of the total doses.

  16. Incorporating networks in a probabilistic graphical model to find drivers for complex human diseases.

    Science.gov (United States)

    Mezlini, Aziz M; Goldenberg, Anna

    2017-10-01

    Discovering genetic mechanisms driving complex diseases is a hard problem. Existing methods often lack power to identify the set of responsible genes. Protein-protein interaction networks have been shown to boost power when detecting gene-disease associations. We introduce a Bayesian framework, Conflux, to find disease associated genes from exome sequencing data using networks as a prior. There are two main advantages to using networks within a probabilistic graphical model. First, networks are noisy and incomplete, a substantial impediment to gene discovery. Incorporating networks into the structure of a probabilistic model for gene inference has less impact on the solution than relying on the noisy network structure directly. Second, using a Bayesian framework we can keep track of the uncertainty of each gene being associated with the phenotype rather than returning a fixed list of genes. We first show that using networks clearly improves gene detection compared to individual gene testing. We then show consistently improved performance of Conflux compared to the state-of-the-art diffusion network-based method Hotnet2 and a variety of other network and variant aggregation methods, using randomly generated and literature-reported gene sets. We test Hotnet2 and Conflux on several network configurations to reveal biases and patterns of false positives and false negatives in each case. Our experiments show that our novel Bayesian framework Conflux incorporates many of the advantages of the current state-of-the-art methods, while offering more flexibility and improved power in many gene-disease association scenarios.

  17. PROBABILISTIC MODEL OF LASER RANGE FINDER FOR THREE DIMENSIONAL GRID CELL IN CLOSE RANGE ENVIRONMENT

    Directory of Open Access Journals (Sweden)

    Hafiz b Iman

    2016-04-01

    The probabilistic model of a laser scanner is an important aspect of simultaneous localization and mapping (SLAM). However, the characteristic of the beam of the laser range finder under extreme incident angles approaching 90° has not been thoroughly investigated. This research paper reports the characteristic of the density of the range values coming from a laser range finder under close-range circumstances where the laser is imposed with a high incident angle. The laser was placed in a controlled environment consisting of walls at close range, and 1000 iterations of scans were collected. The assumption of normal density of the metrical data collapses when the beam traverses across sharp edges in this environment. The data collected also show multimodal densities at instances where the range has a discontinuity. The standard deviation of the laser range finder averages 10.54 mm, with an accuracy of 0.96. This suggests that under extreme incident angles, a laser range finder reading behaves differently from a normal distribution. The use of this information is crucial for SLAM activity in enclosed environments such as inside piping grids or other cluttered environments. KEYWORDS: Hokuyo UTM-30LX; kernel density estimation; probabilistic model
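
    The kernel density estimation named in the keywords can be reproduced on synthetic readings (a sketch assuming a mixture of two surfaces seen past a sharp edge; the 10.54 mm figure above is reused as the per-surface spread):

    # KDE of repeated range readings at one bearing, revealing the
    # multimodal density that appears near sharp edges.
    import numpy as np
    from scipy.stats import gaussian_kde

    rng = np.random.default_rng(9)
    near = rng.normal(1.000, 0.0105, 700)   # wall at 1 m, spread ~10.5 mm
    far = rng.normal(1.300, 0.0105, 300)    # surface seen past the edge
    ranges = np.concatenate([near, far])    # 1000 repeated readings

    kde = gaussian_kde(ranges)              # kernel density estimate
    grid = np.linspace(0.9, 1.4, 500)
    d = kde(grid)
    is_peak = (d[1:-1] > d[:-2]) & (d[1:-1] > d[2:])
    print("estimated modes (m):", grid[1:-1][is_peak])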

  18. Adaptive and self-averaging Thouless-Anderson-Palmer mean-field theory for probabilistic modeling

    DEFF Research Database (Denmark)

    Opper, Manfred; Winther, Ole

    2001-01-01

    We develop a generalization of the Thouless-Anderson-Palmer (TAP) mean-field approach of disorder physics, which makes the method applicable to the computation of approximate averages in probabilistic models for real data. In contrast to the conventional TAP approach, where knowledge of the distribution of couplings between the random variables is required, our method adapts to the concrete set of couplings. We show the significance of the approach in two ways: our approach reproduces replica symmetric results for a wide class of toy models (assuming a nonglassy phase) with given disorder distributions in the thermodynamic limit. On the other hand, simulations on a real data model demonstrate that the method achieves more accurate predictions as compared to conventional TAP approaches.

  19. A probabilistic degradation model for the estimation of the remaining life distribution of feeders

    International Nuclear Information System (INIS)

    Yuan, X.-X.; Pandey, M.D.; Bickel, G.A.

    2006-01-01

    Wall thinning due to flow accelerated corrosion (FAC) is a pervasive form of degradation in the outlet feeder pipes of the primary heat transport system of CANDU reactors. The prediction of the end-of-life of a feeder from wall thickness measurement data is confounded by the sampling and temporal uncertainties associated with the FAC degradation phenomenon. Traditional regression-based statistical methods deal with only the sampling uncertainties, leaving the temporal uncertainties unresolved. This paper presents an advanced probabilistic model, which is able to integrate the temporal uncertainties into the prediction of lifetime. In particular, a random gamma process model is proposed to model the FAC process and it is calibrated with a set of wall thickness measurements using the method of maximum likelihood. This information can be used to establish an optimum strategy for inspection and replacement of feeders. (author)
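
    A minimal sketch of the model class named above (a stationary gamma process for wall thinning; the shape, scale, and thickness limits are hypothetical stand-ins for the paper's likelihood-calibrated values) is:

    # Simulate gamma-process wall loss and estimate the distribution of the
    # time to reach a minimum allowable wall thickness.
    import numpy as np

    rng = np.random.default_rng(13)
    wall0, wall_min = 6.0, 4.2               # initial and minimum thickness, mm (assumed)
    shape_rate = 2.0                          # gamma shape accumulated per year (assumed)
    scale = 0.05                              # mm per unit shape (assumed)

    n_paths, horizon, dt = 10000, 40, 0.5
    times = np.arange(dt, horizon + dt, dt)
    # gamma process: independent gamma-distributed increments per interval
    incr = rng.gamma(shape_rate * dt, scale, (n_paths, len(times)))
    loss = incr.cumsum(axis=1)

    first = np.argmax(wall0 - loss <= wall_min, axis=1)   # first crossing index
    failed = (wall0 - loss[:, -1]) <= wall_min
    life = np.where(failed, times[first], np.inf)
    print("median remaining life (y):", np.median(life[np.isfinite(life)]))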

  20. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  1. Probabilistic failure analysis of bone using a finite element model of mineral-collagen composites.

    Science.gov (United States)

    Dong, X Neil; Guda, Teja; Millwater, Harry R; Wang, Xiaodu

    2009-02-09

    Microdamage accumulation is a major pathway for energy dissipation during the post-yield deformation of bone. In this study, a two-dimensional probabilistic finite element model of a mineral-collagen composite was developed to investigate the influence of the tissue and ultrastructural properties of bone on the evolution of microdamage from an initial defect in tension. The probabilistic failure analyses indicated that the microdamage progression would be along the plane of the initial defect when the debonding at mineral-collagen interfaces was either absent or limited in the vicinity of the defect. In this case, the formation of a linear microcrack would be facilitated. However, the microdamage progression would be scattered away from the initial defect plane if interfacial debonding takes place at a large scale. This would suggest the possible formation of diffuse damage. In addition to interfacial debonding, the sensitivity analyses indicated that the microdamage progression was also dependent on the other material and ultrastructural properties of bone. The intensity of stress concentration accompanied with microdamage progression was more sensitive to the elastic modulus of the mineral phase and the nonlinearity of the collagen phase, whereas the scattering of failure location was largely dependent on the mineral to collagen ratio and the nonlinearity of the collagen phase. The findings of this study may help in understanding the post-yield behavior of bone at the ultrastructural level and shed light on the underlying mechanisms of bone fracture.

  2. A Hybrid Probabilistic Model for Unified Collaborative and Content-Based Image Tagging.

    Science.gov (United States)

    Zhou, Ning; Cheung, William K; Qiu, Guoping; Xue, Xiangyang

    2011-07-01

    The increasing availability of large quantities of user contributed images with labels has provided opportunities to develop automatic tools to tag images to facilitate image search and retrieval. In this paper, we present a novel hybrid probabilistic model (HPM) which integrates low-level image features and high-level user provided tags to automatically tag images. For images without any tags, HPM predicts new tags based solely on the low-level image features. For images with user provided tags, HPM jointly exploits both the image features and the tags in a unified probabilistic framework to recommend additional tags to label the images. The HPM framework makes use of the tag-image association matrix (TIAM). However, since the number of images is usually very large and user-provided tags are diverse, TIAM is very sparse, thus making it difficult to reliably estimate tag-to-tag co-occurrence probabilities. We developed a collaborative filtering method based on nonnegative matrix factorization (NMF) for tackling this data sparsity issue. Also, an L1 norm kernel method is used to estimate the correlations between image features and semantic concepts. The effectiveness of the proposed approach has been evaluated using three databases containing 5,000 images with 371 tags, 31,695 images with 5,587 tags, and 269,648 images with 5,018 tags, respectively.
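
    The NMF-based smoothing of the sparse TIAM can be sketched as follows (a hedged illustration using scikit-learn; the matrix is a tiny synthetic stand-in, and the L1 norm kernel step is not shown):

    # Factorise a sparse tag-image association matrix and use the low-rank
    # reconstruction to estimate tag-to-tag co-occurrence probabilities.
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(6)
    tiam = (rng.random((200, 50)) < 0.03).astype(float)   # images x tags, sparse

    model = NMF(n_components=10, init="nndsvda", max_iter=500, random_state=0)
    W = model.fit_transform(tiam)                 # image factors
    H = model.components_                         # tag factors
    dense = W @ H                                 # smoothed associations

    cooc = dense.T @ dense                        # tag-to-tag co-occurrence estimate
    probs = cooc / cooc.sum(axis=1, keepdims=True)
    print("row-normalised co-occurrence of tag 0:", np.round(probs[0, :5], 3))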

  3. Application of probabilistic seismic hazard models with special calculation for the waste storage sites in Egypt

    International Nuclear Information System (INIS)

    Othman, A.A.; El-Hemamy, S.T.

    2000-01-01

    Probabilistic strong motion maps of Egypt are derived by applying Gumbel models and the likelihood method to 8 earthquake source zones in Egypt and adjacent regions. Peak horizontal acceleration is mapped. Seismic data are collected from the Helwan Catalog (1900-1997), the regional catalog of earthquakes from the International Seismological Center (ISC, 1910-1993) and earthquake data reports of the US Department of International Geological Survey (USCGS, 1900-1994). Iso-seismic maps are also available for some events that occurred in Egypt. Some earthquake source zones are well defined on the basis of both tectonics and average seismicity rates, but a lack of understanding of the near field effects of the large earthquakes prohibits accurate estimates of ground motion in their vicinity. Some source zones have no large-scale crustal features or zones of weakness that can explain the seismicity and must, therefore, be defined simply as concentrations of seismic activity with no geological or geophysical controls on the boundaries. Other source zones lack information on low-magnitude seismicity that would be representative of longer periods of time. Comparisons of the new probabilistic ground motion estimates in Egypt with equivalent estimates made in 1990 have been made. The new ground motion estimates are used to produce a new peak ground acceleration map to replace the 1990 peak acceleration zoning maps in the Building Code of Egypt. (author)

  5. Probabilistic risk assessment model for allergens in food: sensitivity analysis of the minimum eliciting dose and food consumption

    NARCIS (Netherlands)

    Kruizinga, A.G.; Briggs, D.; Crevel, R.W.R.; Knulst, A.C.; Bosch, L.M.C.v.d.; Houben, G.F.

    2008-01-01

    Previously, TNO developed a probabilistic model to predict the likelihood of an allergic reaction, resulting in a quantitative assessment of the risk associated with unintended exposure to food allergens. The likelihood is estimated by including in the model the proportion of the population who is

  6. Alignment and prediction of cis-regulatory modules based on a probabilistic model of evolution.

    Directory of Open Access Journals (Sweden)

    Xin He

    2009-03-01

    Cross-species comparison has emerged as a powerful paradigm for predicting cis-regulatory modules (CRMs) and understanding their evolution. The comparison requires reliable sequence alignment, which remains a challenging task for less conserved noncoding sequences. Furthermore, the existing models of DNA sequence evolution generally do not explicitly treat the special properties of CRM sequences. To address these limitations, we propose a model of CRM evolution that captures different modes of evolution of functional transcription factor binding sites (TFBSs) and the background sequences. A particularly novel aspect of our work is a probabilistic model of gains and losses of TFBSs, a process being recognized as an important part of regulatory sequence evolution. We present a computational framework that uses this model to solve the problems of CRM alignment and prediction. Our alignment method is similar to existing methods of statistical alignment but uses the conserved binding sites to improve alignment. Our CRM prediction method deals with the inherent uncertainties of binding site annotations and sequence alignment in a probabilistic framework. In simulated as well as real data, we demonstrate that our program is able to improve both alignment and prediction of CRM sequences over several state-of-the-art methods. Finally, we used alignments produced by our program to study binding site conservation in genome-wide binding data of key transcription factors in the Drosophila blastoderm, with two intriguing results: (i) the factor-bound sequences are under strong evolutionary constraints even if their neighboring genes are not expressed in the blastoderm, and (ii) binding sites in distal bound sequences (relative to transcription start sites) tend to be more conserved than those in proximal regions. Our approach is implemented as software, EMMA (Evolutionary Model-based cis-regulatory Module Analysis), ready to be applied in a broad biological context.

  7. The Performance of Structure-Controller Coupled Systems Analysis Using Probabilistic Evaluation and Identification Model Approach

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    This study evaluates the performance of a passively controlled steel frame building under dynamic loads using time series analysis. A novel application is utilized for time- and frequency-domain evaluation to analyze the behavior of the controlling systems. In addition, autoregressive moving average (ARMA) neural networks are employed to identify the performance of the controller system. Three passive vibration control devices are utilized in this study, namely, the tuned mass damper (TMD), tuned liquid damper (TLD), and tuned liquid column damper (TLCD). The results show that the TMD control system is a more reliable controller than the TLD and TLCD systems in terms of vibration mitigation. The probabilistic evaluation and identification model showed that the probability analysis and ARMA neural network model are suitable for evaluating and predicting the response of coupled building-controller systems.

  8. Multiscale probabilistic modeling of a crack bridge in glass fiber reinforced concrete

    Directory of Open Access Journals (Sweden)

    Rypla R.

    2017-06-01

    The present paper introduces a probabilistic approach to simulating the crack bridging effects of chopped glass strands in cement-based matrices and compares it to a discrete rigid body spring network model with semi-discrete representation of the chopped strands. The glass strands exhibit random features at various scales, which are taken into account by both models. Fiber strength and interface stress are considered as random variables at the scale of a single fiber bundle, while the orientation and position of individual bundles with respect to a crack plane are considered as random variables at the crack bridge scale. At the scale of the whole composite domain, the distribution of fibers and the resulting number of crack-bridging fibers is considered. All the above random effects contribute to the variability of the crack bridge performance and result in size-dependent behavior of a multiply cracked composite.

  9. Probabilistic Forecasts of Wind Power Generation by Stochastic Differential Equation Models

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Zugno, Marco; Madsen, Henrik

    2016-01-01

    The increasing penetration of wind power has resulted in larger shares of volatile sources of supply in power systems worldwide. In order to operate such systems efficiently, methods for reliable probabilistic forecasts of future wind power production are essential. It is well known that the conditional density of wind power production is highly dependent on the level of predicted wind power and prediction horizon. This paper describes a new approach for wind power forecasting based on logistic-type stochastic differential equations (SDEs). The SDE formulation allows us to calculate both state-dependent conditional uncertainties as well as correlation structures. Model estimation is performed by maximizing the likelihood of a multidimensional random vector while accounting for the correlation structure defined by the SDE formulation. We use non-parametric modelling to explore conditional correlation...
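
    A logistic-type SDE of this kind can be simulated with a simple Euler-Maruyama scheme (a sketch of the model class, not the paper's fitted model; the coefficients are illustrative and the diffusion term is one common choice that vanishes at the bounds):

    # Euler-Maruyama simulation of an SDE whose state-dependent diffusion
    # keeps normalised wind power p_t inside (0, 1).
    import numpy as np

    rng = np.random.default_rng(2)
    theta, mu_level, sigma = 1.5, 0.6, 0.8     # mean reversion, level, noise scale (assumed)
    dt, n_steps, n_paths = 1 / 24, 48, 2000    # hourly steps over two days

    p = np.full(n_paths, 0.4)
    paths = [p.copy()]
    for _ in range(n_steps):
        drift = theta * (mu_level - p)
        diff = sigma * p * (1 - p)             # vanishes at the bounds 0 and 1
        p = p + drift * dt + diff * np.sqrt(dt) * rng.standard_normal(n_paths)
        p = np.clip(p, 1e-6, 1 - 1e-6)         # numerical safeguard
        paths.append(p.copy())

    quantiles = np.percentile(np.array(paths), [5, 50, 95], axis=1)
    print("5/50/95% bands at the final step:", quantiles[:, -1])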

  10. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  12. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  13. A probabilistic quantitative risk assessment model for the long-term work zone crashes.

    Science.gov (United States)

    Meng, Qiang; Weng, Jinxian; Qu, Xiaobo

    2010-11-01

    Work zones, especially long-term work zones, increase traffic conflicts and cause safety problems. Proper casualty risk assessment for a work zone is important for both traffic safety engineers and travelers. This paper develops a novel probabilistic quantitative risk assessment (QRA) model to evaluate casualty risk by combining the frequency and consequence of all accident scenarios triggered by long-term work zone crashes. The casualty risk is measured by individual risk and societal risk. The individual risk can be interpreted as the frequency of a driver/passenger being killed or injured, while the societal risk describes the relation between frequency and the number of casualties. The proposed probabilistic QRA model consists of the estimation of work zone crash frequency, an event tree, and consequence estimation models. There are seven intermediate events--age (A), crash unit (CU), vehicle type (VT), alcohol (AL), light condition (LC), crash type (CT) and severity (S)--in the event tree. Since the estimated probability of some intermediate events may carry large uncertainty, that uncertainty is characterized by treating the probability as a random variable. The consequence estimation model takes into account the combined effects of speed and emergency medical service response time (ERT) on the consequence of a work zone crash. Finally, a numerical example based on Southeast Michigan work zone crash data is carried out. The numerical results show that there will be a 62% decrease in individual fatality risk and a 44% reduction in individual injury risk if the mean travel speed is reduced by 20%. In addition, there will be a 5% reduction in individual fatality risk and a 0.05% reduction in individual injury risk if ERT is reduced by 20%. In other words, reducing speed is more effective than reducing ERT in mitigating casualty risk. 2010 Elsevier Ltd. All rights reserved.
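
    A stripped-down sketch of propagating uncertain event probabilities through such a risk computation is given below; it collapses the seven-event tree to a few branches, and all distributions, the exposure factor and the speed-severity power law are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 20000  # Monte Carlo draws over the uncertain parameters

# Uncertain inputs (all distributions and values are illustrative assumptions)
freq = rng.gamma(shape=4.0, scale=25.0, size=n)    # crashes per year in the work zone
p_fatal = rng.beta(2, 198, size=n)                 # P(fatality | crash)
p_injury = rng.beta(30, 170, size=n)               # P(injury | crash)
exposure = 1e-4                                    # assumed share of crashes involving a given traveler

fatality_risk = freq * p_fatal * exposure          # individual fatality risk [1/year]
injury_risk = freq * p_injury * exposure

print(f"individual fatality risk: median {np.median(fatality_risk):.2e}, "
      f"90% CI [{np.quantile(fatality_risk, 0.05):.2e}, "
      f"{np.quantile(fatality_risk, 0.95):.2e}]")

# Effect of a 20% mean-speed reduction under an assumed severity-speed power law
p_fatal_slow = p_fatal * 0.8**4
reduction = 1 - np.median(freq * p_fatal_slow * exposure) / np.median(fatality_risk)
print(f"relative fatality-risk reduction: {reduction:.0%}")
```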

  14. Improving probabilistic prediction of daily streamflow by identifying Pareto optimal approaches for modelling heteroscedastic residual errors

    Science.gov (United States)

    McInerney, David; Thyer, Mark; Kavetski, Dmitri; Kuczera, George

    2017-04-01

    This study provides guidance that enables hydrological researchers to provide probabilistic predictions of daily streamflow with the best reliability and precision for different catchment types (e.g. high/low degree of ephemerality). Reliable and precise probabilistic prediction of daily catchment-scale streamflow requires statistical characterization of the residual errors of hydrological models. It is well known that hydrological model residual errors are heteroscedastic, i.e. there is a pattern of larger errors in higher streamflow predictions. Although multiple approaches exist for representing this heteroscedasticity, few studies have undertaken a comprehensive evaluation and comparison of these approaches. This study fills this research gap by evaluating 8 common residual error schemes, including standard and weighted least squares, the Box-Cox transformation (with fixed and calibrated power parameter, lambda) and the log-sinh transformation. Case studies include 17 perennial and 6 ephemeral catchments in Australia and the USA, and two lumped hydrological models. We find the choice of heteroscedastic error modelling approach significantly impacts predictive performance, though no single scheme simultaneously optimizes all performance metrics. The set of Pareto optimal schemes, reflecting performance trade-offs, comprises Box-Cox schemes with lambda of 0.2 and 0.5, and the log scheme (lambda=0, perennial catchments only). These schemes significantly outperform even the average-performing remaining schemes (e.g., across ephemeral catchments, median precision tightens from 105% to 40% of observed streamflow, and median biases decrease from 25% to 4%). Theoretical interpretations of the empirical results highlight the importance of capturing the skew/kurtosis of the raw residuals and reproducing zero flows. Recommendations for researchers and practitioners seeking robust residual error schemes for practical work are provided.
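
    The following sketch shows, on synthetic data, how a Box-Cox residual error scheme of the kind compared in the study can be evaluated for reliability (coverage) and precision (interval width). The data generation, the Gaussian assumption on transformed residuals and all numbers are illustrative assumptions, not the study's setup.

```python
import numpy as np

rng = np.random.default_rng(7)

def boxcox(q, lam):
    return np.log(q) if lam == 0 else (q**lam - 1.0) / lam

def inv_boxcox(z, lam):
    return np.exp(z) if lam == 0 else np.maximum(lam * z + 1.0, 1e-12) ** (1.0 / lam)

# Synthetic stand-ins for simulated and observed daily flows [mm/day]
sim = rng.gamma(2.0, 1.5, size=2000) + 0.1
obs = sim * np.exp(rng.normal(0.0, 0.3, size=sim.size))  # heteroscedastic errors

for lam in (0.0, 0.2, 0.5):
    eta = boxcox(obs, lam) - boxcox(sim, lam)   # residuals in transformed space
    mu, s = eta.mean(), eta.std(ddof=1)
    # 90% predictive limits back-transformed to flow space
    lo = inv_boxcox(boxcox(sim, lam) + mu - 1.645 * s, lam)
    hi = inv_boxcox(boxcox(sim, lam) + mu + 1.645 * s, lam)
    coverage = np.mean((obs >= lo) & (obs <= hi))   # reliability: should be near 0.90
    width = np.median((hi - lo) / obs)              # precision: relative interval width
    print(f"lambda = {lam:3.1f}   coverage = {coverage:.3f}   median rel. width = {width:.2f}")
```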

  15. Probabilistic risk assessment modeling of digital instrumentation and control systems using two dynamic methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Aldemir, T., E-mail: aldemir.1@osu.ed [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Guarro, S. [ASCA, Inc., 1720 S. Catalina Avenue, Suite 220, Redondo Beach, CA 90277-5501 (United States); Mandelli, D. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Kirschenbaum, J. [Ohio State University, Department of Computer Science and Engineering, Columbus, OH 43210 (United States); Mangan, L.A. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Bucci, P. [Ohio State University, Department of Computer Science and Engineering, Columbus, OH 43210 (United States); Yau, M. [ASCA, Inc., 1720 S. Catalina Avenue, Suite 220, Redondo Beach, CA 90277-5501 (United States); Ekici, E. [Ohio State University, Department of Electrical and Computer Engineering, Columbus, OH 43210 (United States); Miller, D.W.; Sun, X. [Ohio State University, Nuclear Engineering Program, Columbus, OH 43210 (United States); Arndt, S.A. [U.S. Nuclear Regulatory Commission, Washington, DC 20555-0001 (United States)

    2010-10-15

    The Markov/cell-to-cell mapping technique (CCMT) and the dynamic flowgraph methodology (DFM) are two system logic modeling methodologies that have been proposed to address the dynamic characteristics of digital instrumentation and control (I and C) systems and provide risk-analytical capabilities that supplement those provided by traditional probabilistic risk assessment (PRA) techniques for nuclear power plants. Both methodologies utilize a discrete-state, multi-valued logic representation of the digital I and C system. For probabilistic quantification purposes, both techniques require the estimation of the probabilities of basic system failure modes, including digital I and C software failure modes, that appear in the prime implicants identified as contributors to a given system event of interest. As in any other system modeling process, the accuracy and predictive value of the models produced by the two techniques depend not only on the intrinsic features of the modeling paradigm, but also, to a considerable extent, on the information and knowledge available to the analyst concerning the system behavior and operation rules under normal and off-nominal conditions, and the associated controlled/monitored process dynamics. The application of the two methodologies is illustrated using a digital feedwater control system (DFWCS) similar to that of an operating pressurized water reactor. This application was carried out to demonstrate how the use of either technique, or both, can facilitate the updating of an existing nuclear power plant PRA model following an upgrade of the instrumentation and control system from analog to digital. Because of scope limitations, the focus of the demonstration of the methodologies was intentionally limited to aspects of digital I and C system behavior for which probabilistic data was on hand or could be generated within the existing project bounds of time and resources. The data used in the probabilistic quantification portion of the

  16. Improving predictive power of physically based rainfall-induced shallow landslide models: a probabilistic approach

    Directory of Open Access Journals (Sweden)

    S. Raia

    2014-03-01

    Full Text Available Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are based on deterministic laws. These models extend spatially the static stability models adopted in geotechnical engineering, and adopt an infinite-slope geometry to balance the resisting and the driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the operation of the existing models lies in the difficulty of obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is not generally available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of rainfall-induced shallow landslides. For this purpose, we have modified the transient rainfall infiltration and grid-based regional slope-stability analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a probabilistic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent to the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. The outputs
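
    A minimal sketch of the cell-level probabilistic computation is given below: soil parameters are sampled from assumed distributions and an infinite-slope factor of safety is evaluated for each draw, yielding a probability of failure. The pressure head, distributions and parameter values are placeholders, not TRIGRS-P output.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 50000

beta = np.radians(35.0)   # slope angle
z = 2.0                   # depth of the potential failure surface [m]
psi = 0.5                 # pressure head at depth z [m], e.g. from an infiltration model
gamma_w = 9.81            # unit weight of water [kN/m^3]

# Sample uncertain soil parameters from assumed distributions (TRIGRS-P style)
c = rng.normal(8.0, 2.0, n).clip(min=0.1)      # effective cohesion [kPa]
phi = np.radians(rng.normal(32.0, 3.0, n))     # effective friction angle
gamma_s = rng.normal(19.0, 1.0, n)             # soil unit weight [kN/m^3]

# Infinite-slope factor of safety for each random draw
fs = (np.tan(phi) / np.tan(beta)
      + (c - psi * gamma_w * np.tan(phi)) / (gamma_s * z * np.sin(beta) * np.cos(beta)))

print("P(FS < 1) =", np.mean(fs < 1.0).round(3))
print("FS quantiles (5/50/95%):", np.quantile(fs, [0.05, 0.5, 0.95]).round(2))
```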

  17. Probabilistic, Multivariable Flood Loss Modeling on the Mesoscale with BT-FLEMO.

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Merz, Bruno; Schröter, Kai

    2017-04-01

    Flood loss modeling is an important component for risk analyses and decision support in flood risk management. Commonly, flood loss models describe complex damaging processes by simple, deterministic approaches like depth-damage functions and are associated with large uncertainty. To improve flood loss estimation and to provide quantitative information about the uncertainty associated with loss modeling, a probabilistic, multivariable Bagging decision Tree Flood Loss Estimation MOdel (BT-FLEMO) for residential buildings was developed. The application of BT-FLEMO provides a probability distribution of estimated losses to residential buildings per municipality. BT-FLEMO was applied and validated at the mesoscale in 19 municipalities that were affected during the 2002 flood by the River Mulde in Saxony, Germany. Validation was undertaken on the one hand via a comparison with six deterministic loss models, including both depth-damage functions and multivariable models. On the other hand, the results were compared with official loss data. BT-FLEMO outperforms deterministic, univariable, and multivariable models with regard to model accuracy, although the prediction uncertainty remains high. An important advantage of BT-FLEMO is the quantification of prediction uncertainty. The probability distribution of loss estimates by BT-FLEMO well represents the variation range of loss estimates of the other models in the case study. © 2016 Society for Risk Analysis.
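
    The bagging idea behind BT-FLEMO can be sketched as follows: each bootstrap-resampled decision tree gives one loss estimate, and the ensemble of estimates forms the predicted loss distribution. The synthetic predictors, the toy loss relation and the tree settings below are assumptions for illustration (using scikit-learn's DecisionTreeRegressor), not the BT-FLEMO implementation.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(5)

# Synthetic stand-ins: water depth [m], building area [m^2], precaution score
X = np.column_stack([rng.uniform(0, 3, 500),
                     rng.uniform(80, 300, 500),
                     rng.integers(0, 3, 500)])
loss = 0.10 * X[:, 0] + 0.0005 * X[:, 1] - 0.03 * X[:, 2]   # toy relative-loss relation
loss = np.clip(loss + rng.normal(0, 0.05, 500), 0, 1)

# Bagging: each bootstrap-trained tree yields one estimate; the ensemble is a distribution
trees = []
for _ in range(200):
    idx = rng.integers(0, len(loss), len(loss))   # bootstrap resample
    trees.append(DecisionTreeRegressor(max_depth=6).fit(X[idx], loss[idx]))

x_new = np.array([[1.5, 150.0, 1]])               # one building to assess
preds = np.array([tree.predict(x_new)[0] for tree in trees])
print(f"relative loss: median {np.median(preds):.3f}, "
      f"90% interval [{np.quantile(preds, 0.05):.3f}, {np.quantile(preds, 0.95):.3f}]")
```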

  18. ToPS: a framework to manipulate probabilistic models of sequence data.

    Directory of Open Access Journals (Sweden)

    André Yoshiaki Kashiwabara

    Full Text Available Discrete Markovian models can be used to characterize patterns in sequences of values and have many applications in biological sequence analysis, including gene prediction, CpG island detection, alignment, and protein profiling. We present ToPS, a computational framework that can be used to implement different applications in bioinformatics analysis by combining eight kinds of models: (i) independent and identically distributed process; (ii) variable-length Markov chain; (iii) inhomogeneous Markov chain; (iv) hidden Markov model; (v) profile hidden Markov model; (vi) pair hidden Markov model; (vii) generalized hidden Markov model; and (viii) similarity-based sequence weighting. The framework includes functionality for training, simulation and decoding of the models. Additionally, it provides two methods to help with parameter setting: the Akaike and Bayesian information criteria (AIC and BIC). The models can be used stand-alone, combined in Bayesian classifiers, or included in more complex, multi-model, probabilistic architectures using GHMMs. In particular, the framework provides a novel, flexible implementation of decoding in GHMMs that detects when the architecture can be traversed efficiently.
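
    As an illustration of the decoding functionality such a framework provides, the sketch below implements Viterbi decoding for a toy two-state hidden Markov model over DNA symbols; the states, transition and emission probabilities are invented for illustration and are unrelated to ToPS itself.

```python
import numpy as np

# Toy 2-state HMM over DNA symbols (parameters are illustrative only)
states = ["background", "CpG"]
symbols = {c: i for i, c in enumerate("ACGT")}
log_pi = np.log([0.9, 0.1])
log_A = np.log([[0.95, 0.05],
                [0.10, 0.90]])
log_B = np.log([[0.30, 0.20, 0.20, 0.30],    # background emissions
                [0.15, 0.35, 0.35, 0.15]])   # CpG-rich emissions

def viterbi(seq):
    """Most probable state path for an observed symbol sequence (log-space DP)."""
    obs = [symbols[c] for c in seq]
    n, k = len(obs), len(states)
    dp = np.full((n, k), -np.inf)
    back = np.zeros((n, k), dtype=int)
    dp[0] = log_pi + log_B[:, obs[0]]
    for t in range(1, n):
        scores = dp[t - 1][:, None] + log_A      # k x k predecessor scores
        back[t] = scores.argmax(axis=0)
        dp[t] = scores.max(axis=0) + log_B[:, obs[t]]
    path = [int(dp[-1].argmax())]
    for t in range(n - 1, 0, -1):                # backtrack
        path.append(back[t, path[-1]])
    return [states[s] for s in reversed(path)]

print(viterbi("ATTACGCGCGCGTTA"))
```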

  19. Application of a probabilistic model of rainfall-induced shallow landslides to complex hollows

    Directory of Open Access Journals (Sweden)

    A. Talebi

    2008-07-01

    Full Text Available Recently, D'Odorico and Fagherazzi (2003) proposed "A probabilistic model of rainfall-triggered shallow landslides in hollows" (Water Resour. Res., 39, 2003). Their model describes the long-term evolution of colluvial deposits through a probabilistic soil mass balance at a point. Further building blocks of the model are: an infinite-slope stability analysis; a steady-state kinematic wave model (KW) of hollow groundwater hydrology; and a statistical model relating intensity, duration, and frequency of extreme precipitation. Here we extend the work of D'Odorico and Fagherazzi (2003) by incorporating a more realistic description of hollow hydrology (the hillslope storage Boussinesq model, HSB) such that the model can also be applied to more gentle slopes and hollows with different plan shapes. We show that results obtained using the KW and HSB models are significantly different, as the KW model ignores the diffusion term. We generalize our results by examining the stability of several hollow types with different plan shapes (different degrees of convergence). For each hollow type, the minimum value of the landslide-triggering saturated depth corresponding to the triggering precipitation (the critical recharge rate) is computed for steep and gentle hollows. Long-term analysis of shallow landslides with the presented model illustrates that all hollows show quite different behavior from the stability viewpoint. In hollows with more convergence, landslide occurrence is limited by the supply of deposits (supply limited regime) or rainfall events (event limited regime), while hollows with a low degree of convergence are unconditionally stable regardless of soil thickness or rainfall intensity. Overall, our results show that in addition to the effect of slope angle, plan shape (degree of convergence) also controls the subsurface flow, and this process affects the probability distribution of landslide occurrence in different hollows. Finally, we conclude that

  20. Probabilistic Graphical Models for the Analysis and Synthesis of Musical Audio

    Science.gov (United States)

    Hoffmann, Matthew Douglas

    Content-based Music Information Retrieval (MIR) systems seek to automatically extract meaningful information from musical audio signals. This thesis applies new and existing generative probabilistic models to several content-based MIR tasks: timbral similarity estimation, semantic annotation and retrieval, and latent source discovery and separation. In order to estimate how similar two songs sound to one another, we employ a Hierarchical Dirichlet Process (HDP) mixture model to discover a shared representation of the distribution of timbres in each song. Comparing songs under this shared representation yields better query-by-example retrieval quality and scalability than previous approaches. To predict what tags are likely to apply to a song (e.g., "rap," "happy," or "driving music"), we develop the Codeword Bernoulli Average (CBA) model, a simple and fast mixture-of-experts model. Despite its simplicity, CBA performs at least as well as state-of-the-art approaches at automatically annotating songs and finding to what songs in a database a given tag most applies. Finally, we address the problem of latent source discovery and separation by developing two Bayesian nonparametric models, the Shift-Invariant HDP and Gamma Process NMF. These models allow us to discover what sounds (e.g. bass drums, guitar chords, etc.) are present in a song or set of songs and to isolate or suppress individual sources. These models' ability to decide how many latent sources are necessary to model the data is particularly valuable in this application, since it is impossible to guess a priori how many sounds will appear in a given song or set of songs. Once they have been fit to data, probabilistic models can also be used to drive the synthesis of new musical audio, both for creative purposes and to qualitatively diagnose what information a model does and does not capture. We also adapt the SIHDP model to create new versions of input audio with arbitrary sample sets, for example, to create

  1. Erratum: Probabilistic application of a fugacity model to predict triclosan fate during wastewater treatment.

    Science.gov (United States)

    Bock, Michael; Lyndall, Jennifer; Barber, Timothy; Fuchsman, Phyllis; Perruchon, Elyse; Capdevielle, Marie

    2010-10-01

    The fate and partitioning of the antimicrobial compound, triclosan, in wastewater treatment plants (WWTPs) is evaluated using a probabilistic fugacity model to predict the range of triclosan concentrations in effluent and secondary biosolids. The WWTP model predicts 84% to 92% triclosan removal, which is within the range of measured removal efficiencies (typically 70% to 98%). Triclosan is predominantly removed by sorption and subsequent settling of organic particulates during primary treatment and by aerobic biodegradation during secondary treatment. Median modeled removal efficiency due to sorption is 40% for all treatment phases and 31% in the primary treatment phase. Median modeled removal efficiency due to biodegradation is 48% for all treatment phases and 44% in the secondary treatment phase. Important factors contributing to variation in predicted triclosan concentrations in effluent and biosolids include influent concentrations, solids concentrations in settling tanks, and factors related to solids retention time. Measured triclosan concentrations in biosolids and non-United States (US) effluent are consistent with model predictions. However, median concentrations in US effluent are over-predicted with this model, suggesting that differences in some aspect of treatment practices not incorporated in the model (e.g., disinfection methods) may affect triclosan removal from effluent. Model applications include predicting changes in environmental loadings associated with new triclosan applications and supporting risk analyses for biosolids-amended land and effluent receiving waters. © 2010 SETAC.

  2. Probabilistic application of a fugacity model to predict triclosan fate during wastewater treatment.

    Science.gov (United States)

    Bock, Michael; Lyndall, Jennifer; Barber, Timothy; Fuchsman, Phyllis; Perruchon, Elyse; Capdevielle, Marie

    2010-07-01

    The fate and partitioning of the antimicrobial compound, triclosan, in wastewater treatment plants (WWTPs) is evaluated using a probabilistic fugacity model to predict the range of triclosan concentrations in effluent and secondary biosolids. The WWTP model predicts 84% to 92% triclosan removal, which is within the range of measured removal efficiencies (typically 70% to 98%). Triclosan is predominantly removed by sorption and subsequent settling of organic particulates during primary treatment and by aerobic biodegradation during secondary treatment. Median modeled removal efficiency due to sorption is 40% for all treatment phases and 31% in the primary treatment phase. Median modeled removal efficiency due to biodegradation is 48% for all treatment phases and 44% in the secondary treatment phase. Important factors contributing to variation in predicted triclosan concentrations in effluent and biosolids include influent concentrations, solids concentrations in settling tanks, and factors related to solids retention time. Measured triclosan concentrations in biosolids and non-United States (US) effluent are consistent with model predictions. However, median concentrations in US effluent are over-predicted with this model, suggesting that differences in some aspect of treatment practices not incorporated in the model (e.g., disinfection methods) may affect triclosan removal from effluent. Model applications include predicting changes in environmental loadings associated with new triclosan applications and supporting risk analyses for biosolids-amended land and effluent receiving waters. © 2010 SETAC.

  3. An individual-based probabilistic model for simulating fisheries population dynamics

    Directory of Open Access Journals (Sweden)

    Jie Cao

    2016-12-01

    Full Text Available The purpose of stock assessment is to support managers in making informed decisions regarding removals from fish populations. Errors in assessment models may have devastating impacts on population fitness and negative impacts on the economy of the resource users. Thus, accurate estimates of population size and growth rates are critical for success. Evaluating and testing the behavior and performance of stock assessment models, and assessing the consequences of model mis-specification and the impact of management strategies, requires an operating model that accurately describes the dynamics of the target species and can resolve spatial and seasonal changes. In addition, the most thorough evaluations of assessment models use an operating model that takes a different form than the assessment model. This paper presents an individual-based probabilistic model used to simulate the complex dynamics of populations and their associated fisheries. Various components of population dynamics are expressed as random Bernoulli trials in the model, and detailed life and fishery histories of each individual are tracked over their life span. The simulation model is designed to be flexible so it can be used for different species and fisheries. It can simulate mixing among multiple stocks and link stock-recruit relationships to environmental factors. Furthermore, the model allows for flexibility in sub-models (e.g., growth and recruitment) and model assumptions (e.g., age- or size-dependent selectivity). This model enables the user to conduct various simulation studies, including testing the performance of assessment models under different assumptions, assessing the impacts of model mis-specification and evaluating management strategies.
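
    The Bernoulli-trial structure of such an individual-based operating model can be sketched as follows: each fish undergoes annual capture and natural-death trials, survivors grow and age, and recruits enter the population. All rates, the growth and selectivity curves and the recruitment rule are illustrative assumptions, not the paper's parameterization.

```python
import numpy as np

rng = np.random.default_rng(11)

# Illustrative annual rates and growth parameters (not calibrated to any stock)
M, F = 0.2, 0.3                      # natural and fishing mortality rates
p_nat = 1 - np.exp(-M)               # annual natural-death probability
Linf, k_growth = 80.0, 0.25          # von Bertalanffy parameters [cm, 1/yr]

ages = rng.integers(1, 8, size=2000).astype(float)   # initial population
catches = []
for year in range(20):
    length = Linf * (1 - np.exp(-k_growth * ages))
    select = 1 / (1 + np.exp(-(length - 40.0) / 5.0))           # size-dependent selectivity
    caught = rng.random(ages.size) < (1 - np.exp(-F)) * select  # Bernoulli trial per fish
    died = rng.random(ages.size) < p_nat                        # Bernoulli trial per fish
    catches.append(int(caught.sum()))
    ages = ages[~caught & ~died] + 1.0               # survivors age one year
    recruits = rng.poisson(0.4 * ages.size)          # stochastic recruitment
    ages = np.concatenate([ages, np.ones(recruits)])

print("annual catches:", catches[:5], "...", "final population size:", ages.size)
```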

  4. Dynamic modeling of physical phenomena for probabilistic assessment of spent fuel accidents

    International Nuclear Information System (INIS)

    Benjamin, A.S.

    1997-01-01

    If there should be an accident involving drainage of all the water from a spent fuel pool, the fuel elements will heat up until the heat produced by radioactive decay is balanced by that removed by natural convection to air, thermal radiation, and other means. If the temperatures become high enough for the cladding or other materials to ignite due to rapid oxidation, then some of the fuel might melt, leading to an undesirable release of radioactive materials. The amount of melting is dependent upon the fuel loading configuration and its age, the oxidation and melting characteristics of the materials, and the potential effectiveness of recovery actions. The authors have developed methods for modeling the pertinent physical phenomena and integrating the results with a probabilistic treatment of the uncertainty distributions. The net result is a set of complementary cumulative distribution functions for the amount of fuel melted

  5. A Model-Based Probabilistic Inversion Framework for Wire Fault Detection Using TDR

    Science.gov (United States)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2010-01-01

    Time-domain reflectometry (TDR) is one of the standard methods for diagnosing faults in electrical wiring and interconnect systems, with a long-standing history focused mainly on hardware development of both high-fidelity systems for laboratory use and portable hand-held devices for field deployment. While these devices can easily assess distance to hard faults such as sustained opens or shorts, their ability to assess subtle but important degradation such as chafing remains an open question. This paper presents a unified framework for TDR-based chafing fault detection in lossy coaxial cables by combining an S-parameter based forward modeling approach with a probabilistic (Bayesian) inference algorithm. Results are presented for the estimation of nominal and faulty cable parameters from laboratory data.
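
    A minimal sketch of the probabilistic (Bayesian) inversion step is given below: a toy forward model maps a fault distance and reflection amplitude to a TDR-like trace, and a random-walk Metropolis sampler recovers the posterior over those parameters from a noisy synthetic trace. The forward model, priors, noise level and all values are assumptions for illustration, not the paper's S-parameter model.

```python
import numpy as np

rng = np.random.default_rng(2)

t = np.linspace(0.0, 100.0, 400)          # time axis [ns]

def forward(d, a, width=2.0, v=0.2):
    """Toy reflection of a chafe at distance d [m] with amplitude a (v in m/ns)."""
    return a * np.exp(-0.5 * ((t - 2.0 * d / v) / width) ** 2)

# Synthetic "measured" trace: true fault at 6 m, small amplitude, plus noise
y_obs = forward(6.0, 0.05) + rng.normal(0.0, 0.01, t.size)

def log_post(theta):
    d, a = theta
    if not (0.0 < d < 10.0 and 0.0 <= a < 1.0):  # flat prior on a physical range
        return -np.inf
    resid = y_obs - forward(d, a)
    return -0.5 * np.sum((resid / 0.01) ** 2)    # Gaussian likelihood

# Random-walk Metropolis sampler over (distance, amplitude)
theta = np.array([5.0, 0.1])
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, [0.05, 0.005])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())
samples = np.array(samples[5000:])               # discard burn-in
print("fault distance [m]:", samples[:, 0].mean().round(3), "+/-", samples[:, 0].std().round(3))
print("amplitude:", samples[:, 1].mean().round(4), "+/-", samples[:, 1].std().round(4))
```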

  6. Regional probabilistic nuclear risk and vulnerability assessment by integration of mathematical modelling land GIS-analysis

    International Nuclear Information System (INIS)

    Rigina, O.; Baklanov, A.

    2002-01-01

    The Kola Peninsula, Russian Arctic exceeds all other regions in the world in the number of nuclear reactors. The study was aimed at estimating possible radiation risks to the population in the Nordic countries in case of a severe accident in the Kola Peninsula. A new approach based on probabilistic analysis of modelled possible pathways of radionuclide transport and precipitation was developed. For the general population, Finland is at most risk with respect to the Kola NPP, because of: high population density or proximity to the radiation-risk sites and relatively high probability of an airflow trajectory there, and precipitation. After considering the critical group, northern counties in Norway, Finland and Sweden appear to be most vulnerable. (au)

  7. Novel Complete Probabilistic Models of Random Variation in High Frequency Performance of Nanoscale MOSFET

    Directory of Open Access Journals (Sweden)

    Rawid Banchuin

    2013-01-01

    Full Text Available The novel probabilistic models of the random variations in nanoscale MOSFET's high frequency performance defined in terms of gate capacitance and transition frequency have been proposed. As the transition frequency variation has also been considered, the proposed models are considered as complete unlike the previous one which take only the gate capacitance variation into account. The proposed models have been found to be both analytic and physical level oriented as they are the precise mathematical expressions in terms of physical parameters. Since the up-to-date model of variation in MOSFET's characteristic induced by physical level fluctuation has been used, part of the proposed models for gate capacitance is more accurate and physical level oriented than its predecessor. The proposed models have been verified based on the 65 nm CMOS technology by using the Monte-Carlo SPICE simulations of benchmark circuits and Kolmogorov-Smirnov tests as highly accurate since they fit the Monte-Carlo-based analysis results with 99% confidence. Hence, these novel models have been found to be versatile for the statistical/variability aware analysis/design of nanoscale MOSFET-based analog/mixed signal circuits and systems.

  8. A probabilistic model for the persistence of early planar fabrics in polydeformed pelitic schists

    Science.gov (United States)

    Ferguson, C.C.

    1984-01-01

    Although early planar fabrics are commonly preserved within microlithons in low-grade pelites, in higher-grade (amphibolite facies) pelitic schists fabric regeneration often appears complete. Evidence for early fabrics may be preserved within porphyroblasts but, within the matrix, later deformation often appears to totally obliterate or reorient earlier fabrics. However, examination of several hundred Dalradian pelites from Connemara, western Ireland, reveals that preservation of early fabrics is by no means uncommon; relict matrix domains, although volumetrically insignificant, are remarkably persistent even when inferred later strains are very large and fabric regeneration appears, at first sight, complete. Deterministic plasticity theories are ill-suited to the analysis of such an inhomogeneous material response, and a probabilistic model is proposed instead. It assumes that ductile polycrystal deformation is controlled by elementary flow units which can be activated once their associated stress barrier is overcome. Bulk flow propensity is related to the proportion of simultaneous activations, and a measure of this is derived from the probabilistic interaction between a stress-barrier spectrum and an internal stress spectrum (the latter determined by the external loading and the details of internal stress transfer). The spectra are modelled as Gaussian distributions although the treatment is very general and could be adapted for other distributions. Using the time rate of change of activation probability it is predicted that, initially, fabric development will be rapid but will then slow down dramatically even though stress increases at a constant rate. This highly non-linear response suggests that early fabrics persist because they comprise unfavourable distributions of stress-barriers which remain unregenerated at the time bulk stress is stabilized by steady-state flow. Relict domains will, however, bear the highest stress and are potential upper
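
    The core probabilistic interaction can be sketched numerically: for Gaussian stress-barrier and internal-stress spectra, the activation probability is the probability that the internal stress exceeds its barrier, which has a closed form for two independent Gaussians. The numbers below are illustrative assumptions.

```python
from math import erf, sqrt

def activation_probability(mu_int, sd_int, mu_bar, sd_bar):
    """P(internal stress > stress barrier) for two independent Gaussian spectra."""
    z = (mu_int - mu_bar) / sqrt(sd_int**2 + sd_bar**2)
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

# Illustrative numbers [MPa]: internal stress rises while the barrier spectrum is fixed
mu_bar, sd_bar, sd_int = 120.0, 15.0, 10.0
for mu_int in (60.0, 80.0, 100.0, 120.0, 140.0):
    p = activation_probability(mu_int, sd_int, mu_bar, sd_bar)
    print(f"mean internal stress {mu_int:5.0f} MPa -> activation probability {p:.3f}")
```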

  9. Development and Application of a Probabilistic Risk-Benefit Assessment Model for Infant Feeding Integrating Microbiological, Nutritional, and Chemical Components.

    Science.gov (United States)

    Boué, Géraldine; Cummins, Enda; Guillou, Sandrine; Antignac, Jean-Philippe; Le Bizec, Bruno; Membré, Jeanne-Marie

    2017-12-01

    A probabilistic and interdisciplinary risk-benefit assessment (RBA) model integrating microbiological, nutritional, and chemical components was developed for infant milk, with the objective of predicting the health impact of different consumption scenarios. Infant feeding is a particular area of interest for RBA, as breast milk and powdered infant formula have both been associated with risks and benefits related to chemicals, bacteria, and nutrients; hence the model considers these three facets. Cronobacter sakazakii, dioxin-like polychlorinated biphenyls (dl-PCB), and docosahexaenoic acid (DHA) were the three risk/benefit factors selected as key issues in microbiology, chemistry, and nutrition, respectively. The model was probabilistic, with variability and uncertainty separated using a second-order Monte Carlo simulation process. In this study, the advantages and limitations of undertaking probabilistic and interdisciplinary RBA are discussed. In particular, the probabilistic technique was found to be powerful in dealing with missing data and in translating assumptions into quantitative inputs while taking uncertainty into account. In addition, the separation of variability and uncertainty strengthened the interpretation of the model outputs by enabling better consideration and distinction of natural heterogeneity from lack of knowledge. Interdisciplinary RBA is necessary to give more structured conclusions and avoid contradictory messages to policymakers and also to consumers, leading to more decisive food recommendations. This assessment provides a conceptual development of the RBA methodology and is a robust basis on which to build. © 2017 Society for Risk Analysis.
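
    The separation of variability and uncertainty via second-order (nested) Monte Carlo can be sketched as below: an outer loop draws uncertain population-level parameters, an inner loop draws individual infants, and the output is an uncertainty interval around a variability statistic. All distributions, the exposure equation and the guidance value are invented placeholders, not the study's inputs.

```python
import numpy as np

rng = np.random.default_rng(9)
n_outer, n_inner = 200, 5000   # uncertainty draws, variability draws

exceed_fracs = []
for _ in range(n_outer):
    # Outer loop (uncertainty): imperfectly known population-level parameters
    mu_conc = rng.normal(np.log(0.5), 0.2)        # log contaminant conc. [pg/g]
    sd_conc = rng.uniform(0.3, 0.6)
    # Inner loop (variability): infant-to-infant differences
    conc = rng.lognormal(mu_conc, sd_conc, n_inner)
    intake = rng.lognormal(np.log(750.0), 0.25, n_inner)   # milk intake [g/day]
    bw = rng.normal(6.0, 0.8, n_inner).clip(min=3.0)       # body weight [kg]
    exposure = conc * intake / bw                          # [pg/kg bw/day]
    exceed_fracs.append(np.mean(exposure > 70.0))          # assumed guidance value

exceed_fracs = np.array(exceed_fracs)
print(f"fraction of infants above the threshold: median {np.median(exceed_fracs):.3f}, "
      f"95% uncertainty interval [{np.quantile(exceed_fracs, 0.025):.3f}, "
      f"{np.quantile(exceed_fracs, 0.975):.3f}]")
```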

  10. Comparison of Four Probabilistic Models (CARES, Calendex, ConsEspo, SHEDS) to Estimate Aggregate Residential Exposures to Pesticides

    Science.gov (United States)

    Two deterministic models (US EPA’s Office of Pesticide Programs Residential Standard Operating Procedures (OPP Residential SOPs) and Draft Protocol for Measuring Children’s Non-Occupational Exposure to Pesticides by all Relevant Pathways (Draft Protocol)) and four probabilistic mo...

  11. Application of the methodology of safety probabilistic analysis to the modelling the emergency feedwater system of Juragua nuclear power plant

    International Nuclear Information System (INIS)

    Troncoso, M.; Oliva, G.

    1993-01-01

    This work illustrates the application of the methodology developed in the framework of the national plan of safety probabilistic analysis (APS) to modelling the emergency feedwater system of the Juragua nuclear power plant, for small-LOCA failures and loss of the external electrical supply. The facilities provided by the ARCON code for modelling the systems, and its documentation, are also described.

  12. Student Modeling and Machine Learning

    OpenAIRE

    Sison , Raymund; Shimura , Masamichi

    1998-01-01

    After identifying essential student modeling issues and machine learning approaches, this paper examines how machine learning techniques have been used to automate the construction of student models as well as the background knowledge necessary for student modeling. In the process, the paper sheds light on the difficulty, suitability and potential of using machine learning for student modeling processes, and, to a lesser extent, the potential of using student modeling techniques in machine le...

  13. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  14. The MCRA model for probabilistic single-compound and cumulative risk assessment of pesticides.

    Science.gov (United States)

    van der Voet, Hilko; de Boer, Waldo J; Kruisselbrink, Johannes W; Goedhart, Paul W; van der Heijden, Gerie W A M; Kennedy, Marc C; Boon, Polly E; van Klaveren, Jacob D

    2015-05-01

    Pesticide risk assessment is hampered by worst-case assumptions leading to overly pessimistic assessments. On the other hand, cumulative health effects of similar pesticides are often not taken into account. This paper describes models and a web-based software system developed in the European research project ACROPOLIS. The models are appropriate for both acute and chronic exposure assessments of single compounds and of multiple compounds in cumulative assessment groups. The software system MCRA (Monte Carlo Risk Assessment) is available for stakeholders in pesticide risk assessment at mcra.rivm.nl. We describe the MCRA implementation of the methods as advised in the 2012 EFSA Guidance on probabilistic modelling, as well as more refined methods developed in the ACROPOLIS project. The emphasis is on cumulative assessments. Two approaches, sample-based and compound-based, are contrasted. It is shown that additional data on agricultural use of pesticides may give more realistic risk assessments. Examples are given of model and software validation of acute and chronic assessments, using both simulated data and comparisons against the previous release of MCRA and against the standard software DEEM-FCID used by the Environmental Protection Agency in the USA. It is shown that the EFSA Guidance pessimistic model may not always model exposure appropriately. Crown Copyright © 2014. Published by Elsevier Ltd. All rights reserved.

  15. A Probabilistic Short-Term Water Demand Forecasting Model Based on the Markov Chain

    Directory of Open Access Journals (Sweden)

    Francesca Gagliardi

    2017-07-01

    Full Text Available This paper proposes a short-term water demand forecasting method based on the use of the Markov chain. This method provides estimates of future demands by calculating probabilities that the future demand value will fall within pre-assigned intervals covering the expected total variability. More specifically, two models based on homogeneous and non-homogeneous Markov chains were developed and presented. These models, together with two benchmark models (based on artificial neural network and naïve methods), were applied to three real-life case studies for the purpose of forecasting the respective water demands from 1 to 24 h ahead. The results obtained show that the model based on a homogeneous Markov chain provides more accurate short-term forecasts than the one based on a non-homogeneous Markov chain, and is in line with the artificial neural network model. Both Markov chain models enable probabilistic information regarding the stochastic demand forecast to be easily obtained.
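
    A homogeneous Markov chain forecaster of the kind described can be sketched in a few lines: demands are discretized into pre-assigned intervals, a transition matrix is estimated by counting, and the state probability vector is propagated to the desired horizon. The synthetic demand series and binning choices are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic hourly demand series standing in for observations [L/s]
t = np.arange(24 * 60)
demand = 50 + 20 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 4, t.size)

# Discretize demand into pre-assigned intervals (the chain's states)
edges = np.quantile(demand, np.linspace(0, 1, 9)[1:-1])  # 7 cut points -> 8 states
states = np.digitize(demand, edges)
k = edges.size + 1

# Homogeneous Markov chain: estimate the transition matrix by counting
P = np.zeros((k, k))
for a, b in zip(states[:-1], states[1:]):
    P[a, b] += 1
P /= np.clip(P.sum(axis=1, keepdims=True), 1, None)

# Probabilistic forecast: propagate the state distribution 1-24 h ahead
p = np.zeros(k)
p[states[-1]] = 1.0
for h in range(1, 25):
    p = p @ P
    if h in (1, 6, 24):
        print(f"h = {h:2d}  P(demand interval) = {np.round(p, 2)}")
```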

  16. Multi-model approach to petroleum resource appraisal using analytic methodologies for probabilistic systems

    Science.gov (United States)

    Crovelli, R.A.

    1988-01-01

    The geologic appraisal model that is selected for a petroleum resource assessment depends upon purpose of the assessment, basic geologic assumptions of the area, type of available data, time available before deadlines, available human and financial resources, available computer facilities, and, most importantly, the available quantitative methodology with corresponding computer software and any new quantitative methodology that would have to be developed. Therefore, different resource assessment projects usually require different geologic models. Also, more than one geologic model might be needed in a single project for assessing different regions of the study or for cross-checking resource estimates of the area. Some geologic analyses used in the past for petroleum resource appraisal involved play analysis. The corresponding quantitative methodologies of these analyses usually consisted of Monte Carlo simulation techniques. A probabilistic system of petroleum resource appraisal for play analysis has been designed to meet the following requirements: (1) includes a variety of geologic models, (2) uses an analytic methodology instead of Monte Carlo simulation, (3) possesses the capacity to aggregate estimates from many areas that have been assessed by different geologic models, and (4) runs quickly on a microcomputer. Geologic models consist of four basic types: reservoir engineering, volumetric yield, field size, and direct assessment. Several case histories and present studies by the U.S. Geological Survey are discussed. ?? 1988 International Association for Mathematical Geology.

  17. Probabilistic model for untargeted peak detection in LC-MS using Bayesian statistics

    NARCIS (Netherlands)

    Woldegebriel, M.; Vivó-Truyols, G.

    2015-01-01

    We introduce a novel Bayesian probabilistic peak detection algorithm for liquid chromatography mass spectroscopy (LC-MS). The final probabilistic result allows the user to make a final decision about which points in a 2D chromatogram are affected by a chromatographic peak and which ones are only

  18. Application of Fragment Ion Information as Further Evidence in Probabilistic Compound Screening Using Bayesian Statistics and Machine Learning: A Leap Toward Automation.

    Science.gov (United States)

    Woldegebriel, Michael; Zomer, Paul; Mol, Hans G J; Vivó-Truyols, Gabriel

    2016-08-02

    In this work, we introduce an automated, efficient, and elegant model to combine all pieces of evidence (e.g., expected retention times, peak shapes, isotope distributions, fragment-to-parent ratio) obtained from liquid chromatography-tandem mass spectrometry (LC-MS/MS) data for screening purposes. Combining all these pieces of evidence requires a careful assessment of the uncertainties in the analytical system as well as all possible outcomes. To date, the majority of existing algorithms are highly dependent on user input parameters, and the screening process is tackled as a deterministic problem. In this work we present a Bayesian framework to deal with the combination of all these pieces of evidence. Contrary to conventional algorithms, the information is treated in a probabilistic way, and a final probability assessment of the presence/absence of a compound feature is computed. Additionally, all the necessary parameters except the chromatographic band broadening are learned from the data in a training and learning phase of the algorithm, avoiding the introduction of a large number of user-defined parameters. The proposed method was validated with a large data set and has shown improved sensitivity and specificity in comparison to a threshold-based commercial software package.
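
    The evidence-combination step can be sketched as a Bayesian update under conditionally independent evidence: each observable contributes a likelihood ratio between the "compound present" and "compound absent" hypotheses, and the posterior odds are their product times the prior odds. The evidence models and all numbers below are illustrative assumptions, not the published likelihoods.

```python
import numpy as np
from scipy.stats import norm

def posterior_presence(rt_obs, rt_expected, frag_ratio_obs, prior=0.5):
    """Combine evidence via likelihood ratios under H1 (present) vs. H0 (absent)."""
    # Retention time: tight Gaussian around the expected value under H1, broad under H0
    l1 = norm.pdf(rt_obs, rt_expected, 0.05)
    l0 = norm.pdf(rt_obs, rt_expected, 2.0)
    # Fragment-to-parent ratio: centered on an assumed library value under H1
    l1 *= norm.pdf(frag_ratio_obs, 0.30, 0.05)
    l0 *= 1.0   # uniform on [0, 1] under H0
    odds = (prior / (1.0 - prior)) * (l1 / l0)
    return odds / (1.0 + odds)

# Matching retention time and fragment ratio -> high posterior probability
print(posterior_presence(rt_obs=12.02, rt_expected=12.00, frag_ratio_obs=0.28))
# Shifted retention time and wrong ratio -> posterior probability near zero
print(posterior_presence(rt_obs=12.80, rt_expected=12.00, frag_ratio_obs=0.10))
```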

  19. Using the Rasch model as an objective and probabilistic technique to integrate different soil properties

    Science.gov (United States)

    Rebollo, Francisco J.; Jesús Moral García, Francisco

    2016-04-01

    Soil apparent electrical conductivity (ECa) is one of the simplest, least expensive soil measurements that integrates many soil properties affecting crop productivity, including, for instance, soil texture, water content, and cation exchange capacity. The ECa measurements obtained with a Veris 3100 sensor, operating in both shallow (0-30 cm), ECs, and deep (0-90 cm), ECd, mode, can be used as additional and essential information to be included in a probabilistic model, the Rasch model, with the aim of quantifying the overall soil fertility potential in an agricultural field. This quantification should integrate the main soil physical and chemical properties, which have different units. In this work, the formulation of the Rasch model integrates 11 soil properties (clay, silt and sand content, organic matter -OM-, pH, total nitrogen -TN-, available phosphorus -AP- and potassium -AK-, cation exchange capacity -CEC-, ECd, and ECs) measured at 70 locations in a field. The main outputs of the model include a ranking of all soil samples according to their relative fertility potential and the identification of unexpected behaviors of some soil samples and properties. In the case study, the considered soil variables fit the model reasonably well and have an important influence on soil fertility, except pH, probably due to its homogeneity in the field. Moreover, ECd and ECs are the most influential properties on soil fertility, while AP and AK are the least influential. The use of the Rasch model to estimate soil fertility potential (always in a relative way, taking into account the characteristics of the studied soil) constitutes a new application of great practical importance, making it possible to rationally determine locations in a field where high soil fertility potential exists and to identify the soil samples or properties that behave anomalously; this information can be necessary to conduct site-specific treatments, leading to a more cost-effective and sustainable field
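
    For reference, the dichotomous Rasch model gives the probability that a sample with latent "fertility" theta passes an item of difficulty delta as exp(theta - delta)/(1 + exp(theta - delta)). The sketch below estimates theta for one sample by Newton's method; the item difficulties and responses are invented for illustration.

```python
import numpy as np

def rasch_p(theta, delta):
    """Rasch model: P(sample with fertility theta 'passes' an item of difficulty delta)."""
    return 1.0 / (1.0 + np.exp(-(theta - delta)))

# Assumed difficulties for 5 dichotomized soil properties (invented values)
delta = np.array([-1.0, -0.3, 0.0, 0.6, 1.2])
x = np.array([1, 1, 1, 0, 0])        # one sample's dichotomized responses

# Maximum-likelihood estimate of the sample's latent fertility by Newton's method
theta = 0.0
for _ in range(25):
    p = rasch_p(theta, delta)
    grad = np.sum(x - p)             # d log-likelihood / d theta
    hess = -np.sum(p * (1.0 - p))    # second derivative (always negative)
    theta -= grad / hess
print("estimated relative fertility potential (logits):", round(theta, 3))
```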

  20. A physical probabilistic model to predict failure rates in buried PVC pipelines

    International Nuclear Information System (INIS)

    Davis, P.; Burn, S.; Moglia, M.; Gould, S.

    2007-01-01

    For older water pipeline materials such as cast iron and asbestos cement, future pipe failure rates can be extrapolated from large volumes of existing historical failure data held by water utilities. However, for newer pipeline materials such as polyvinyl chloride (PVC), only limited failure data exists and confident forecasts of future pipe failures cannot be made from historical data alone. To solve this problem, this paper presents a physical probabilistic model, which has been developed to estimate failure rates in buried PVC pipelines as they age. The model assumes that under in-service operating conditions, crack initiation can occur from inherent defects located in the pipe wall. Linear elastic fracture mechanics theory is used to predict the time to brittle fracture for pipes with internal defects subjected to combined internal pressure and soil deflection loading together with through-wall residual stress. To include uncertainty in the failure process, inherent defect size is treated as a stochastic variable, and modelled with an appropriate probability distribution. Microscopic examination of fracture surfaces from field failures in Australian PVC pipes suggests that the 2-parameter Weibull distribution can be applied. Monte Carlo simulation is then used to estimate lifetime probability distributions for pipes with internal defects, subjected to typical operating conditions. As with inherent defect size, the 2-parameter Weibull distribution is shown to be appropriate to model uncertainty in predicted pipe lifetime. The Weibull hazard function for pipe lifetime is then used to estimate the expected failure rate (per pipe length/per year) as a function of pipe age. To validate the model, predicted failure rates are compared to aggregated failure data from 17 UK water utilities obtained from the United Kingdom Water Industry Research (UKWIR) National Mains Failure Database. In the absence of actual operating pressure data in the UKWIR database, typical
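
    The final step, turning a Weibull lifetime distribution into an age-dependent failure rate, can be sketched via the Weibull hazard function h(t) = (beta/eta)(t/eta)^(beta-1), as below; the shape and scale values and the segment density are illustrative assumptions, not the calibrated values from the UKWIR comparison.

```python
import numpy as np

def weibull_hazard(t, shape, scale):
    """Hazard (conditional failure rate) of a 2-parameter Weibull lifetime."""
    return (shape / scale) * (t / scale) ** (shape - 1)

# Assumed lifetime parameters for a PVC pipe segment (illustrative only)
shape, scale = 2.5, 120.0      # dimensionless shape, scale in years
segments_per_km = 10           # pipe discretized into defect-bearing segments

ages = np.array([10, 20, 40, 60, 80])
rate = weibull_hazard(ages, shape, scale) * segments_per_km * 100  # per 100 km per year
for age, r in zip(ages, rate):
    print(f"pipe age {age:3d} yr -> expected failures ~ {r:.2f} per 100 km per year")
```

    With shape > 1, the hazard increases with age, reproducing the qualitative pattern of rising failure rates as buried pipelines deteriorate.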

  1. A state-based probabilistic model for tumor respiratory motion prediction

    International Nuclear Information System (INIS)

    Kalet, Alan; Sandison, George; Schmitz, Ruth; Wu Huanmei

    2010-01-01

    This work proposes a new probabilistic mathematical model for predicting tumor motion and position based on a finite state representation using the natural breathing states of exhale, inhale and end of exhale. Tumor motion was broken down into linear breathing states and sequences of states. Breathing state sequences and the observables representing those sequences were analyzed using a hidden Markov model (HMM) to predict the future sequences and new observables. Velocities and other parameters were clustered using a k-means clustering algorithm to associate each state with a set of observables such that a prediction of state also enables a prediction of tumor velocity. A time average model with predictions based on average past state lengths was also computed. State sequences which are known a priori to fit the data were fed into the HMM algorithm to set a theoretical limit of the predictive power of the model. The effectiveness of the presented probabilistic model has been evaluated for gated radiation therapy based on previously tracked tumor motion in four lung cancer patients. Positional prediction accuracy is compared with actual position in terms of the overall RMS errors. Various system delays, ranging from 33 to 1000 ms, were tested. Previous studies have shown duty cycles for latencies of 33 and 200 ms at around 90% and 80%, respectively, for linear, no prediction, Kalman filter and ANN methods as averaged over multiple patients. At 1000 ms, the previously reported duty cycles range from approximately 62% (ANN) down to 34% (no prediction). Average duty cycle for the HMM method was found to be 100% and 91 ± 3% for 33 and 200 ms latency and around 40% for 1000 ms latency in three out of four breathing motion traces. RMS errors were found to be lower than linear and no prediction methods at latencies of 1000 ms. The results show that for system latencies longer than 400 ms, the time average HMM prediction outperforms linear, no prediction, and the more

  2. Photometric redshift estimation via deep learning. Generalized and pre-classification-less, image based, fully probabilistic redshifts

    Science.gov (United States)

    D'Isanto, A.; Polsterer, K. L.

    2018-01-01

    Context. The need to analyze the available large synoptic multi-band surveys drives the development of new data-analysis methods. Photometric redshift estimation is one field of application where such new methods have improved the results substantially. Up to now, the vast majority of applied redshift estimation methods have utilized photometric features. Aims: We aim to develop a method to derive probabilistic photometric redshifts directly from multi-band imaging data, rendering pre-classification of objects and feature extraction obsolete. Methods: A modified version of a deep convolutional network was combined with a mixture density network. The estimates are expressed as Gaussian mixture models representing the probability density functions (PDFs) in redshift space. In addition to the traditional scores, the continuous ranked probability score (CRPS) and the probability integral transform (PIT) were applied as performance criteria. We adopted a feature-based random forest and a plain mixture density network to compare performances on experiments with data from SDSS (DR9). Results: We show that the proposed method is able to predict redshift PDFs independently of the type of source, for example galaxies, quasars or stars. The prediction performance is better than that of both presented reference methods and is comparable to results from the literature. Conclusions: The presented method is extremely general and allows us to solve any kind of probabilistic regression problem based on imaging data, for example estimating the metallicity or star formation rate of galaxies. This kind of methodology is tremendously important for the next generation of surveys.
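
    For reference, the CRPS used here as a performance criterion can be estimated from samples of the predicted PDF as CRPS(F, y) being approximately E|X - y| - 0.5 E|X - X'| with X, X' drawn from F. The sketch below scores a toy Gaussian mixture redshift PDF; the mixture parameters are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)

def sample_gmm(weights, means, sds, n):
    comp = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(np.asarray(means)[comp], np.asarray(sds)[comp])

def crps_from_samples(x, y):
    """Sample estimator: CRPS(F, y) ~ E|X - y| - 0.5 E|X - X'| with X, X' ~ F."""
    return np.mean(np.abs(x - y)) - 0.5 * np.mean(np.abs(x - rng.permutation(x)))

# A predicted redshift PDF as a 3-component Gaussian mixture (invented numbers)
w, mu, sd = [0.6, 0.3, 0.1], [0.42, 0.48, 0.70], [0.02, 0.04, 0.05]
x = sample_gmm(w, mu, sd, 100000)

print("CRPS if the true z is 0.45:", round(crps_from_samples(x, 0.45), 4))
print("CRPS if the true z is 0.90:", round(crps_from_samples(x, 0.90), 4))
# PIT value F(z_true); over many objects these should be uniform if PDFs are calibrated
print("PIT at z = 0.45:", np.mean(x <= 0.45).round(3))
```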

  3. COMPOSER: A Probabilistic Solution to the Utility Problem in Speed-up Learning.

    Science.gov (United States)

    Gratch, Jonathan; DeJong, Gerald

    In machine learning there is considerable interest in techniques which improve planning ability. Initial investigations have identified a wide variety of techniques to address this issue. Progress has been hampered by the utility problem, a basic tradeoff between the benefit of learned knowledge and the cost to locate and apply relevant knowledge.…

  4. Face Recognition for Access Control Systems Combining Image-Difference Features Based on a Probabilistic Model

    Science.gov (United States)

    Miwa, Shotaro; Kage, Hiroshi; Hirai, Takashi; Sumi, Kazuhiko

    We propose a probabilistic face recognition algorithm for Access Control Systems (ACSs). Compared with existing ACSs that use low-cost IC cards, face recognition offers advantages in usability and security: it does not require people to hold cards over scanners, and it rejects impostors carrying authorized cards. Face recognition therefore attracts more interest in security markets than IC cards. But in markets where low-cost ACSs exist, price competition is important, and there is a limitation on the quality of available cameras and image control. ACSs using face recognition are therefore required to handle much lower-quality images, such as defocused and poorly gain-controlled images, than high-security systems such as immigration control. To tackle these image-quality problems, we developed a face recognition algorithm based on a probabilistic model which combines a variety of image-difference features trained by Real AdaBoost with their prior probability distributions. It evaluates and utilizes only the reliable features among the trained ones during each authentication, achieving high recognition performance rates. A field evaluation using a pseudo access control system installed in our office shows that the proposed system achieves a constant, high recognition performance rate independent of face image quality, with about four times lower EER (Equal Error Rate) under a variety of image conditions than a system without prior probability distributions. Using image-difference features without prior probabilities, by contrast, is sensitive to image quality. We also evaluated PCA, which has worse but constant performance rates because of its general optimization over all data. Compared with PCA, Real AdaBoost without prior distributions performs twice as well under good image conditions, but degrades to PCA-level performance under poor image conditions.

  5. Building a high-resolution T2-weighted MR-based probabilistic model of tumor occurrence in the prostate.

    Science.gov (United States)

    Nagarajan, Mahesh B; Raman, Steven S; Lo, Pechin; Lin, Wei-Chan; Khoshnoodi, Pooria; Sayre, James W; Ramakrishna, Bharath; Ahuja, Preeti; Huang, Jiaoti; Margolis, Daniel J A; Lu, David S K; Reiter, Robert E; Goldin, Jonathan G; Brown, Matthew S; Enzmann, Dieter R

    2018-02-19

    We present a method for generating a T2 MR-based probabilistic model of tumor occurrence in the prostate to guide the selection of anatomical sites for targeted biopsies and serve as a diagnostic tool to aid radiological evaluation of prostate cancer. In our study, the prostate and any radiological findings within it were segmented retrospectively on 3D T2-weighted MR images of 266 subjects who underwent radical prostatectomy. Subsequent histopathological analysis determined both the ground truth and the Gleason grade of the tumors. A randomly chosen subset of 19 subjects was used to generate a multi-subject-derived prostate template. Subsequently, a cascading registration algorithm involving both affine and non-rigid B-spline transforms was used to register the prostate of every subject to the template. Corresponding transformation of radiological findings yielded a population-based probabilistic model of tumor occurrence. The quality of our probabilistic model building approach was statistically evaluated by measuring the proportion of correct placements of tumors in the prostate template, i.e., the number of tumors that maintained their anatomical location within the prostate after their transformation into the prostate template space. The probabilistic model built with tumors deemed clinically significant demonstrated a heterogeneous distribution of tumors, with higher likelihood of tumor occurrence at the mid-gland anterior transition zone and the base-to-mid-gland posterior peripheral zones. Of the 250 MR lesions analyzed, 248 maintained their original anatomical location with respect to the prostate zones after transformation into the prostate template space. We present a robust method for generating a probabilistic model of tumor occurrence in the prostate that could aid clinical decision making, such as the selection of anatomical sites for MR-guided prostate biopsies.

  6. Probabilistic modelling of the damage of geological barriers of the nuclear waste deep storage - ENDOSTON project, final report

    International Nuclear Information System (INIS)

    2010-01-01

    The corrosion of the metallic casings of radioactive waste storage packages releases hydrogen under pressure, and the resulting overpressure disturbs the stress fields in the host rock. The authors therefore report the development of methodologies and numerical simulation tools aimed at a better understanding of the mechanisms by which crack networks develop and propagate in the geological barrier under this overpressure. They present a probabilistic model of the formation of crack networks in rocks, based on probabilistic post-processing of a finite element calculation. They describe the modelling of crack propagation and damage in quasi-brittle materials. They present the ENDO-HETEROGENE model for the formation and propagation of cracks in heterogeneous media, describe the integration of the model into the Aster code, and report the model validation (calculation of the stress intensity factor, grid dependence). They finally report a test case of the ENDO-HETEROGENE model.

  7. Towards a probabilistic model for predicting ship besetting in ice in Arctic waters

    International Nuclear Information System (INIS)

    Fu, Shanshan; Zhang, Di; Montewka, Jakub; Yan, Xinping; Zio, Enrico

    2016-01-01

    Recently, the melting of sea ice due to global warming has made it possible for merchant ships to navigate through Arctic waters. However, the Arctic Marine Transportation System remains a very demanding, dynamic and complex system, owing to challenging hydro-meteorological conditions, poorly charted waters and the remoteness of the area, which results in a lack of appropriate response capacity in case of emergency. In order to ensure a proper safety level for operations such as ship transit within the area, a risk analysis should be carried out, in which the relevant factors pertaining to a given operation are defined and organized in a model. Such a model can assist onshore managers or ships' crews in planning and conducting an actual sea passage through Arctic waters. However, research in this domain is scarce, mainly due to lack of data. In this paper, we demonstrate the use of a dataset and expert judgment to determine the risk influencing factors and develop a probabilistic model of a ship besetting in ice along the Northeast Passage. For that purpose, we adopt Bayesian Belief Networks (BBNs), owing to their predominant feature of reasoning under uncertainty and their ability to accommodate data from various sources. The obtained BBN model has been validated, showing good agreement with available state-of-the-art models and providing a good understanding of the analyzed phenomena.

  8. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    Science.gov (United States)

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault tree based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.
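
    The fault-tree logic behind such a model reduces, at its simplest, to multiplying conditional probabilities along an AND-gated chain of events. A minimal Python sketch with hypothetical numbers, not the paper's calibrated values:

        # Top event (collision) requires a fish to encounter the rotor AND fail
        # to avoid it AND be struck by a blade; all probabilities are illustrative.
        p_encounter = 0.05   # fish enters the rotor-swept area during passage
        p_no_avoid = 0.30    # avoidance fails (e.g. low visibility at a large turbine)
        p_strike = 0.40      # blade strike given a non-avoiding passage

        p_collision = p_encounter * p_no_avoid * p_strike  # AND-gate: multiply
        print(f"per-passage collision probability: {p_collision:.4f}")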

  9. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    Directory of Open Access Journals (Sweden)

    Linus Hammar

    Full Text Available A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault tree based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.

  10. Probabilistic models for neural populations that naturally capture global coupling and criticality.

    Science.gov (United States)

    Humplik, Jan; Tkačik, Gašper

    2017-09-01

    Advances in multi-unit recordings pave the way for statistical modeling of activity patterns in large neural populations. Recent studies have shown that the summed activity of all neurons strongly shapes the population response. A separate recent finding has been that neural populations also exhibit criticality, an anomalously large dynamic range for the probabilities of different population activity patterns. Motivated by these two observations, we introduce a class of probabilistic models which takes into account the prior knowledge that the neural population could be globally coupled and close to critical. These models consist of an energy function which parametrizes interactions between small groups of neurons, and an arbitrary positive, strictly increasing, and twice differentiable function which maps the energy of a population pattern to its probability. We show that: 1) augmenting a pairwise Ising model with a nonlinearity yields an accurate description of the activity of retinal ganglion cells which outperforms previous models based on the summed activity of neurons; 2) prior knowledge that the population is critical translates to prior expectations about the shape of the nonlinearity; 3) the nonlinearity admits an interpretation in terms of a continuous latent variable globally coupling the system whose distribution we can infer from data. Our method is independent of the underlying system's state space; hence, it can be applied to other systems such as natural scenes or amino acid sequences of proteins which are also known to exhibit criticality.
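
    The model class is easy to state concretely: an energy function over binary patterns plus a monotone nonlinearity mapping energy to log-probability. A small Python sketch enumerating all patterns of a toy population; the couplings, biases, and quadratic nonlinearity are illustrative, and the nonlinearity is assumed increasing over the energy range realized here:

        import itertools
        import numpy as np

        rng = np.random.default_rng(0)
        N = 5
        J = np.triu(rng.normal(0, 0.3, (N, N)), 1)   # pairwise couplings
        h = rng.normal(0, 0.5, N)                    # biases

        def energy(x):
            return -(x @ J @ x + h @ x)

        V = lambda E: E + 0.15 * E**2   # nonlinearity; assumed increasing on the
                                        # energies realized by this toy model

        states = np.array(list(itertools.product([0, 1], repeat=N)))
        logp = np.array([-V(energy(x)) for x in states])
        p = np.exp(logp - logp.max()); p /= p.sum()  # normalize over all 2^N states
        print(p.max(), states[p.argmax()])           # most probable pattern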

  11. Probabilistic and technology-specific modeling of emissions from municipal solid-waste incineration.

    Science.gov (United States)

    Koehler, Annette; Peyer, Fabio; Salzmann, Christoph; Saner, Dominik

    2011-04-15

    European legislation increasingly directs waste streams that cannot be recycled toward thermal treatment. Models are therefore needed that help to quantify emissions from waste incineration and thus reveal potential risks and mitigation needs. This study presents a probabilistic model which computes emissions as a function of waste composition and the technological layout of grate incineration plants and their pollution-control equipment. In contrast to previous waste-incineration models, this tool is based on a broader empirical database and allows uncertainties in emission loads to be quantified. Comparison to monitoring data of 83 actual European plants showed no significant difference between modeled emissions and measured data. An inventory of all European grate incineration plants, including technical characteristics and plant capacities, was established, and waste material mixtures were determined for different European countries, including generic elemental waste-material compositions. The model thus allows for the calculation of country-specific and material-dependent emission factors and enables identification and tracking of emission sources. It thereby helps to develop strategies to decrease plant emissions by reducing or redirecting problematic waste fractions to other treatment options or adapting the technological equipment of waste incinerators.

  12. Probabilistic Unawareness

    Directory of Open Access Journals (Sweden)

    Mikaël Cozic

    2016-11-01

    Full Text Available The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness) theorem that reunites the two strands—modal and probabilistic—of doxastic logic.

  13. Probabilistic Material Strength Degradation Model for Inconel 718 Components Subjected to High Temperature, Mechanical Fatigue, Creep and Thermal Fatigue Effects

    Science.gov (United States)

    Bast, Callie Corinne Scheidt

    1994-01-01

    This thesis presents the on-going development of methodology for a probabilistic material strength degradation model. The probabilistic model, in the form of a postulated randomized multifactor equation, provides for quantification of uncertainty in the lifetime material strength of aerospace propulsion system components subjected to a number of diverse random effects. This model is embodied in the computer program entitled PROMISS, which can include up to eighteen different effects. Presently, the model includes four effects that typically reduce lifetime strength: high temperature, mechanical fatigue, creep, and thermal fatigue. Statistical analysis was conducted on experimental Inconel 718 data obtained from the open literature. This analysis provided regression parameters for use as the model's empirical material constants, thus calibrating the model specifically for Inconel 718. Model calibration was carried out for four variables, namely, high temperature, mechanical fatigue, creep, and thermal fatigue. Methodology to estimate standard deviations of these material constants for input into the probabilistic material strength model was developed. Using the current version of PROMISS, entitled PROMISS93, a sensitivity study for the combined effects of mechanical fatigue, creep, and thermal fatigue was performed. Results, in the form of cumulative distribution functions, illustrated the sensitivity of lifetime strength to any current value of an effect. In addition, verification studies comparing a combination of mechanical fatigue and high temperature effects by model to the combination by experiment were conducted. Thus, for Inconel 718, the basic model assumption of independence between effects was evaluated. Results from this limited verification study strongly supported this assumption.
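
    A Monte Carlo evaluation of such a randomized multifactor equation is straightforward to sketch in Python. The product form below is stated as an assumption (a commonly cited multifactor shape, not necessarily PROMISS's exact equation), and all parameter values and distributions are hypothetical:

        import numpy as np

        rng = np.random.default_rng(1)

        # Assumed multifactor strength-degradation form (illustrative only):
        # S/S0 = prod_i ((A_u_i - A_i) / (A_u_i - A_0_i)) ** a_i,
        # where A_i is the current value of effect i, A_u_i its ultimate value,
        # A_0_i a reference value, and a_i an empirical (random) exponent.
        def strength_ratio(A, A_u, A_0, a):
            return np.prod(((A_u - A) / (A_u - A_0)) ** a)

        n = 10_000
        samples = np.empty(n)
        for k in range(n):
            a = rng.normal([0.2, 0.1], [0.02, 0.01])  # randomized material constants
            samples[k] = strength_ratio(A=np.array([900.0, 1e5]),   # temp (K), cycles
                                        A_u=np.array([1600.0, 1e8]),
                                        A_0=np.array([300.0, 1e3]),
                                        a=a)
        # empirical CDF summary of the lifetime strength ratio
        print(np.percentile(samples, [5, 50, 95]))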

  14. Size Evolution and Stochastic Models: Explaining Ostracod Size through Probabilistic Distributions

    Science.gov (United States)

    Krawczyk, M.; Decker, S.; Heim, N. A.; Payne, J.

    2014-12-01

    The biovolume of animals has functioned as an important benchmark for measuring evolution throughout geologic time. In our project, we examined the observed average body size of ostracods over time in order to understand the mechanism of size evolution in these marine organisms. The body size of ostracods has varied since the beginning of the Ordovician, when the first true ostracods appeared. We created a stochastic branching model to create possible evolutionary trees of ostracod size. Using stratigraphic ranges for ostracods compiled from over 750 genera in the Treatise on Invertebrate Paleontology, we calculated overall speciation and extinction rates for our model. At each timestep in our model, new lineages can evolve or existing lineages can become extinct. Newly evolved lineages are assigned sizes based on their parent genera. We parameterized our model to generate neutral and directional changes in ostracod size to compare with the observed data. New sizes were chosen via a normal distribution: the neutral model selected size differentials centered on zero, allowing an equal chance of larger or smaller ostracods at each speciation, whereas the directional model centered the distribution on a negative value, giving a larger chance of smaller ostracods. Our data strongly suggest that ostracod evolution has followed a model that directionally pushes mean ostracod size down rather than a neutral model. Our model matched the magnitude of the size decrease, but it produced a constant linear decline while the actual data showed a much more rapid initial decrease followed by a roughly constant size. This nuance in the observed trends ultimately suggests a more complex mechanism of size evolution. In conclusion, probabilistic methods can provide valuable insight into possible evolutionary mechanisms determining size evolution in ostracods.
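
    The branching process described above can be sketched in a few lines of Python; the rates, drift, and dispersion below are hypothetical, with drift = 0 giving the neutral model and drift < 0 the directional one:

        import numpy as np

        rng = np.random.default_rng(42)

        def simulate(n_steps=100, p_speciate=0.2, p_extinct=0.18, drift=-0.05, sd=0.3):
            # Toy branching model of log-size evolution (hypothetical rates).
            sizes = [0.0]                       # log biovolume of founding lineage
            means = []
            for _ in range(n_steps):
                nxt = []
                for s in sizes:
                    if rng.random() < p_extinct:        # lineage goes extinct
                        continue
                    nxt.append(s)
                    if rng.random() < p_speciate:       # lineage splits
                        nxt.append(s + rng.normal(drift, sd))  # child size differential
                sizes = nxt if nxt else [0.0]   # re-seed to keep the toy run alive
                means.append(np.mean(sizes))
            return np.array(means)

        print(simulate()[-5:])   # trajectory of mean log size under directional drift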

  15. Visual Perceptual Learning and Models.

    Science.gov (United States)

    Dosher, Barbara; Lu, Zhong-Lin

    2017-09-15

    Visual perceptual learning through practice or training can significantly improve performance on visual tasks. Originally seen as a manifestation of plasticity in the primary visual cortex, perceptual learning is more readily understood as improvements in the function of brain networks that integrate processes, including sensory representations, decision, attention, and reward, and balance plasticity with system stability. This review considers the primary phenomena of perceptual learning, theories of perceptual learning, and perceptual learning's effect on signal and noise in visual processing and decision. Models, especially computational models, play a key role in behavioral and physiological investigations of the mechanisms of perceptual learning and for understanding, predicting, and optimizing human perceptual processes, learning, and performance. Performance improvements resulting from reweighting or readout of sensory inputs to decision provide a strong theoretical framework for interpreting perceptual learning and transfer that may prove useful in optimizing learning in real-world applications.

  16. Probabilistic multi-scale models and measurements of self-heating under multiaxial high cycle fatigue

    International Nuclear Information System (INIS)

    Poncelet, M.; Hild, F.; Doudard, C.; Calloch, S.; Weber, B.

    2010-01-01

    Different approaches have been proposed to link high cycle fatigue properties to thermal measurements under cyclic loadings, usually referred to as 'self-heating tests'. This paper focuses on two models whose parameters are tuned by resorting to self-heating tests and then used to predict high cycle fatigue properties. The first model is based upon a yield surface approach to account for stress multi-axiality at a microscopic scale, whereas the second one relies on a probabilistic modelling of micro-plasticity at the scale of slip-planes. Both model identifications are cost effective, relying mainly on quickly obtained temperature data in self-heating tests. They both describe the influence of the stress heterogeneity, the volume effect and the hydrostatic stress on fatigue limits. The thermal effects and mean fatigue limit predictions are in good agreement with experimental results for in-phase and out-of-phase tension-torsion loadings. In the case of fatigue under non-proportional loading paths, the mean fatigue limit prediction error of the critical shear stress approach is three times less than with the yield surface approach. (authors)

  17. Probabilistic multi-scale models and measurements of self-heating under multiaxial high cycle fatigue

    Energy Technology Data Exchange (ETDEWEB)

    Poncelet, M.; Hild, F. [Univ Paris 11, PRES, Univ Paris 06, LMT Cachan, ENS Cachan, CNRS, F-94235 Cachan (France); Doudard, C.; Calloch, S. [Univ Brest, ENIB, ENSIETA, LBMS EA 4325, F-29806 Brest, (France); Weber, B. [ArcelorMittal Maizieres Res Voie Romaine, F-57283 Maizieres Les Metz (France)

    2010-07-01

    Different approaches have been proposed to link high cycle fatigue properties to thermal measurements under cyclic loadings, usually referred to as 'self-heating tests'. This paper focuses on two models whose parameters are tuned by resorting to self-heating tests and then used to predict high cycle fatigue properties. The first model is based upon a yield surface approach to account for stress multi-axiality at a microscopic scale, whereas the second one relies on a probabilistic modelling of micro-plasticity at the scale of slip-planes. Both model identifications are cost effective, relying mainly on quickly obtained temperature data in self-heating tests. They both describe the influence of the stress heterogeneity, the volume effect and the hydrostatic stress on fatigue limits. The thermal effects and mean fatigue limit predictions are in good agreement with experimental results for in-phase and out-of-phase tension-torsion loadings. In the case of fatigue under non-proportional loading paths, the mean fatigue limit prediction error of the critical shear stress approach is three times less than with the yield surface approach. (authors)

  18. Development of System Model for Level 1 Probabilistic Safety Assessment of TRIGA PUSPATI Reactor

    International Nuclear Information System (INIS)

    Tom, P.P; Mazleha Maskin; Ahmad Hassan Sallehudin Mohd Sarif; Faizal Mohamed; Mohd Fazli Zakaria; Shaharum Ramli; Muhamad Puad Abu

    2014-01-01

    Nuclear safety is a major issue worldwide. As a consequence of the accident at Fukushima, Japan, the safety of most reactors in the world, including research reactors, has been reviewed. To develop the Level 1 Probabilistic Safety Assessment (PSA) of the TRIGA PUSPATI Reactor (RTP), three organizations are involved: Nuclear Malaysia, AELB and UKM. PSA methodology is a logical, deductive technique which specifies an undesired top event and uses fault trees and event trees to model the various parallel and sequential combinations of failures that might lead to that undesired event. Fault tree (FT) methodology is used in developing the system models. At the lowest level, the Basic Events (BE) of the fault trees (component failures and human errors) are assigned probability distributions. In this study, the Risk Spectrum software was used to construct the fault trees and analyze the system models. The results of the system model analysis, such as core damage frequency (CDF), minimal cut sets (MCS) and common cause failures (CCF), are used to support decision making for upgrading or modification of the RTP's safety system. (author)

  19. Business Value of Information Technology Service Quality Based on Probabilistic Business-Driven Model

    Directory of Open Access Journals (Sweden)

    Jaka Sembiring

    2015-08-01

    Full Text Available The business value of information technology (IT) services is often difficult to assess, especially from the point of view of a non-IT manager. This condition could severely impact organizational IT strategic decisions. Various approaches have been proposed to quantify the business value, but some are trapped in technical complexity while others misguide managers into directly and subjectively judging some technical entities outside their domain of expertise. This paper describes a method for properly capturing both perspectives based on a probabilistic business-driven model. The proposed model presents a procedure to calculate the business value of IT services. The model also covers IT security services and their business value as an important aspect of IT services that is not covered in previously published research. The impact of changes in the quality of IT services on business value will also be discussed. A simulation and a case illustration are provided to show the possible application of the proposed model for a simple business process in an enterprise.

  20. Cognitive modeling and dynamic probabilistic simulation of operating crew response to complex system accidents

    International Nuclear Information System (INIS)

    Chang, Y.H.J.; Mosleh, A.

    2007-01-01

    This is the third in a series of five papers describing the IDAC (Information, Decision, and Action in Crew context) model for human reliability analysis. An example application of this modeling technique is also discussed in this series. The model is developed to probabilistically predict the responses of the nuclear power plant control room operating crew in accident conditions. The operator response spectrum includes cognitive, emotional, and physical activities during the course of an accident. This paper discusses the modeling components and their process rules. An operator's problem-solving process is divided into three types: information pre-processing (I), diagnosis and decision-making (D), and action execution (A). Explicit and context-dependent behavior rules for each type of operator are developed in the form of tables and logical or mathematical relations. These regulate the process and activities of each of the three types of response. The behavior rules are developed for three generic types of operator: Decision Maker, Action Taker, and Consultant. This paper also provides a simple approach to calculating normalized probabilities of alternative behaviors given a context.
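
    One simple, widely used way to turn context-dependent preference scores for alternative behaviors into normalized probabilities is a softmax; the sketch below is an illustrative assumption, not necessarily IDAC's exact scheme:

        import numpy as np

        def behavior_probabilities(scores):
            # Normalize preference scores for alternative behaviors into
            # probabilities via a softmax (illustrative assumption).
            z = np.exp(scores - np.max(scores))   # subtract max for numerical stability
            return z / z.sum()

        # hypothetical scores for three candidate crew responses in a given context
        print(behavior_probabilities(np.array([2.0, 1.0, 0.5])))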

  1. Identifying biological concepts from a protein-related corpus with a probabilistic topic model

    Directory of Open Access Journals (Sweden)

    Lu Xinghua

    2006-02-01

    Full Text Available Abstract Background Biomedical literature, e.g., MEDLINE, contains a wealth of knowledge regarding functions of proteins. Major recurring biological concepts within such text corpora represent the domains of this body of knowledge. The goal of this research is to identify the major biological topics/concepts from a corpus of protein-related MEDLINE© titles and abstracts by applying a probabilistic topic model. Results The latent Dirichlet allocation (LDA) model was applied to the corpus. Based on Bayesian model selection, 300 major topics were extracted from the corpus. The majority of identified topics/concepts were found to be semantically coherent and most represented biological objects or concepts. The identified topics/concepts were further mapped to the controlled vocabulary of Gene Ontology (GO) terms based on mutual information. Conclusion The major and recurring biological concepts within a collection of MEDLINE documents can be extracted by the LDA model. The identified topics/concepts provide a parsimonious and semantically enriched representation of the texts in a semantic space with reduced dimensionality and can be used to index text.
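
    A compact illustration of LDA topic extraction in Python, assuming scikit-learn; the four-document corpus is a toy stand-in for the protein-related MEDLINE titles and abstracts:

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        # Tiny stand-in corpus; the study used protein-related MEDLINE text.
        docs = [
            "kinase phosphorylation signal transduction pathway",
            "receptor binding ligand membrane signal",
            "dna repair damage replication polymerase",
            "polymerase transcription dna promoter binding",
        ]
        vec = CountVectorizer()
        X = vec.fit_transform(docs)                      # document-term counts
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

        vocab = vec.get_feature_names_out()
        for k, topic in enumerate(lda.components_):
            top = topic.argsort()[-4:][::-1]             # top words per topic
            print(f"topic {k}:", [vocab[i] for i in top])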

  2. The EBR-II Probabilistic Risk Assessment: lessons learned regarding passive safety

    International Nuclear Information System (INIS)

    Hill, D.J.; Ragland, W.A.; Roglans, J.

    1998-01-01

    This paper summarizes the results from the EBR-II Probabilistic Risk Assessment (PRA) and provides an analysis of the source of risk of the operation of EBR-II from both internal and external initiating events. The EBR-II PRA explicitly accounts for the role of reactivity feedbacks in reducing fuel damage. The results show that the expected core damage frequency from internal initiating events at EBR-II is very low, 1.6×10^-6 yr^-1, even with a wide definition of core damage (essentially that of exceeding Technical Specification limits). The annual frequency of damage, primarily due to liquid metal fires, from externally initiated events (excluding earthquakes) is 3.6×10^-6 yr^-1 and the contribution of seismic events is 1.7×10^-5 yr^-1. Overall these results are considerably better than results for other research reactors and the nuclear industry in general and stem from three main sources: low likelihood of loss of coolant due to low system pressure and top entry double vessels; low likelihood of loss of decay heat removal due to reliance on passive means; and low likelihood of power/flow mismatch due to both passive feedbacks and reliability of rod scram capability.

  3. The EBR-II Probabilistic Risk Assessment: lessons learned regarding passive safety

    Energy Technology Data Exchange (ETDEWEB)

    Hill, D J; Ragland, W A; Roglans, J

    1998-11-01

    This paper summarizes the results from the EBR-II Probabilistic Risk Assessment (PRA) and provides an analysis of the source of risk of the operation of EBR-II from both internal and external initiating events. The EBR-II PRA explicitly accounts for the role of reactivity feedbacks in reducing fuel damage. The results show that the expected core damage frequency from internal initiating events at EBR-II is very low, 1.6×10^-6 yr^-1, even with a wide definition of core damage (essentially that of exceeding Technical Specification limits). The annual frequency of damage, primarily due to liquid metal fires, from externally initiated events (excluding earthquakes) is 3.6×10^-6 yr^-1 and the contribution of seismic events is 1.7×10^-5 yr^-1. Overall these results are considerably better than results for other research reactors and the nuclear industry in general and stem from three main sources: low likelihood of loss of coolant due to low system pressure and top entry double vessels; low likelihood of loss of decay heat removal due to reliance on passive means; and low likelihood of power/flow mismatch due to both passive feedbacks and reliability of rod scram capability.

  4. The EBR-II probabilistic risk assessment lessons learned regarding passive safety

    International Nuclear Information System (INIS)

    Hill, D.J.; Ragland, W.A.; Roglans, J.

    1994-01-01

    This paper summarizes the results from the recently completed EBR-II Probabilistic Risk Assessment (PRA) and provides an analysis of the source of risk of the operation of EBR-II from both internal and external initiating events. The EBR-II PRA explicitly accounts for the role of reactivity feedbacks in reducing fuel damage. The results show that the expected core damage frequency from internal initiating events at EBR-II is very low, 1.6×10^-6 yr^-1, even with a wide definition of core damage (essentially that of exceeding Technical Specification limits). The annual frequency of damage, primarily due to liquid metal fires, from externally initiated events (excluding earthquakes) is 3.6×10^-6 yr^-1 and the contribution of seismic events is 1.7×10^-5 yr^-1. Overall these results are considerably better than results for other research reactors and the nuclear industry in general and stem from three main sources: low likelihood of loss of coolant due to low system pressure and top entry double vessels; low likelihood of loss of decay heat removal due to reliance on passive means; and low likelihood of power/flow mismatch due to both passive feedbacks and reliability of rod scram capability.

  5. Real-Time Probabilistic Structural Health Management Using Machine Learning and GPU Computing Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The proposed project seeks to deliver an ultra-efficient, high-fidelity structural health management (SHM) framework using machine learning and graphics processing...

  6. A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

    Directory of Open Access Journals (Sweden)

    S. Zhang

    2018-03-01

    Full Text Available Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle which are affected by a high degree of uncertainty, especially at a regional scale, resulting in unacceptable uncertainties in Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations, which are integrated into a single parameter. This parameter links the landslide probability to the uncertainties of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfall on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides at 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. These testing results indicate that the new model can be operated in a highly efficient way and shows more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can potentially be packaged into a forecasting system for shallow landslides, providing technological support for the mitigation of these disasters at regional scale.
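
    The Monte Carlo test of Fs < 1 can be sketched directly for a single pixel using an infinite-slope safety factor; the parameter ranges below are illustrative, not the paper's values:

        import numpy as np

        rng = np.random.default_rng(0)

        def landslide_probability(n=100_000):
            # Monte Carlo estimate of P(Fs < 1) for one pixel with an
            # infinite-slope safety factor and uniform parameter uncertainty.
            c = rng.uniform(5e3, 15e3, n)                          # cohesion [Pa]
            phi = rng.uniform(np.radians(20), np.radians(35), n)   # friction angle
            gamma, depth, slope = 18e3, 2.0, np.radians(30)        # unit wt, depth, slope
            tau = gamma * depth * np.sin(slope) * np.cos(slope)    # driving stress
            sigma = gamma * depth * np.cos(slope) ** 2             # normal stress
            fs = (c + sigma * np.tan(phi)) / tau                   # safety factor
            return np.mean(fs < 1.0)                               # landslide probability

        print(landslide_probability())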

  7. A hierarchical probabilistic model for rapid object categorization in natural scenes.

    Directory of Open Access Journals (Sweden)

    Xiaofu He

    Full Text Available Humans can categorize objects in complex natural scenes within 100-150 ms. This amazing ability of rapid categorization has motivated many computational models. Most of these models require extensive training to obtain a decision boundary in a very high dimensional (e.g., ∼6,000 in a leading model) feature space and often categorize objects in natural scenes by categorizing the context that co-occurs with objects when objects do not occupy large portions of the scenes. It is thus unclear how humans achieve rapid scene categorization. To address this issue, we developed a hierarchical probabilistic model for rapid object categorization in natural scenes. In this model, a natural object category is represented by a coarse hierarchical probability distribution (PD), which includes PDs of object geometry and spatial configuration of object parts. Object parts are encoded by PDs of a set of natural object structures, each of which is a concatenation of local object features. Rapid categorization is performed as statistical inference. Since the model uses a very small number (∼100) of structures for even complex object categories such as animals and cars, it requires little training and is robust in the presence of large variations within object categories and in their occurrences in natural scenes. Remarkably, we found that the model categorized animals in natural scenes and cars in street scenes with near human-level performance. We also found that the model located animals and cars in natural scenes, thus overcoming a flaw in many other models, which is to categorize objects in natural context by categorizing contextual features. These results suggest that coarse PDs of object categories based on natural object structures and statistical operations on these PDs may underlie the human ability to rapidly categorize scenes.

  8. A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

    Science.gov (United States)

    Zhang, Shaojie; Zhao, Luqiang; Delgado-Tellez, Ricardo; Bao, Hongjun

    2018-03-01

    Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle which are affected by a high degree of uncertainty, especially at a regional scale, resulting in unacceptable uncertainties in Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations, which are integrated into a single parameter. This parameter links the landslide probability to the uncertainties of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfall on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides at 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. These testing results indicate that the new model can be operated in a highly efficient way and shows more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can potentially be packaged into a forecasting system for shallow landslides, providing technological support for the mitigation of these disasters at regional scale.

  9. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  10. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  11. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  12. Study on quantification method based on Monte Carlo sampling for multiunit probabilistic safety assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Kye Min [KHNP Central Research Institute, Daejeon (Korea, Republic of); Han, Sang Hoon; Park, Jin Hee; Lim, Ho Gon; Yang, Joon Yang [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of)

    2017-06-15

    In Korea, many nuclear power plants operate at a single site based on geographical characteristics, but the population density near the sites is higher than that in other countries. Thus, multiunit accidents are a more important consideration than in other countries and should be addressed appropriately. Currently, there are many issues related to multiunit probabilistic safety assessment (PSA). One of them is the quantification of a multiunit PSA model. A traditional PSA uses Boolean manipulation of the fault tree in terms of minimal cut sets. However, such methods have some limitations when rare event approximations cannot be used effectively or when a very small truncation limit must be applied to identify accident sequence combinations for a multiunit site. In particular, it is well known that seismic risk in terms of core damage frequency can be overestimated because there are many events that have a high failure probability. In this study, we propose a quantification method based on a Monte Carlo approach for a multiunit PSA model. This method can consider all possible accident sequence combinations at a multiunit site and calculate a more exact value for events that have a high failure probability. An example model for six identical units at a site was also developed and quantified to confirm the applicability of the proposed method.
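
    The motivation for Monte Carlo quantification is easy to demonstrate: when basic events have high failure probabilities, the rare-event (sum-of-cut-sets) approximation overestimates the top event probability. A toy Python sketch with hypothetical cut sets and probabilities:

        import numpy as np

        rng = np.random.default_rng(7)

        # Three minimal cut sets over basic events A..D with (hypothetical)
        # high failure probabilities, where the rare-event approximation degrades.
        p = {"A": 0.3, "B": 0.4, "C": 0.25, "D": 0.5}
        cut_sets = [("A", "B"), ("B", "C"), ("C", "D")]

        # rare-event approximation: sum of cut-set probabilities (can exceed truth)
        ree = sum(np.prod([p[e] for e in cs]) for cs in cut_sets)

        # Monte Carlo: sample basic events, test whether any cut set is fully failed
        n = 200_000
        states = {e: rng.random(n) < q for e, q in p.items()}
        top = np.zeros(n, dtype=bool)
        for cs in cut_sets:
            top |= np.logical_and.reduce([states[e] for e in cs])
        print(f"rare-event approx: {ree:.4f}, Monte Carlo: {top.mean():.4f}")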

  13. Probabilistic model for fluences and peak fluxes of solar energetic particles

    International Nuclear Information System (INIS)

    Nymmik, R.A.

    1999-01-01

    The model is intended for calculating the probability for solar energetic particles (SEP), i.e., protons and Z = 2-28 ions, to have an effect on hardware and on biological and other objects in space. The model describes the probability for the ≥10 MeV/nucleon SEP fluences and peak fluxes to occur in the near-Earth space beyond the Earth's magnetosphere under varying solar activity. The physical prerequisites of the model are as follows. The occurrence of SEP events is a probabilistic process. The mean SEP occurrence frequency is a power-law function of solar activity (sunspot number). The SEP size (taken to be the ≥30 MeV proton fluence size) distribution is a power-law function within a 10^5-10^11 proton/cm^2 range. The SEP event particle energy spectra are described by a common function whose parameters are distributed log-normally. The SEP mean composition is energy-dependent and suffers fluctuations described by log-normal functions in separate events.

  14. A Bimodal Hybrid Model for Time-Dependent Probabilistic Seismic Hazard Analysis

    Science.gov (United States)

    Yaghmaei-Sabegh, Saman; Shoaeifar, Nasser; Shoaeifar, Parva

    2018-03-01

    The evaluation of evidence provided by geological studies and historical catalogs indicates that in some seismic regions and faults, multiple large earthquakes occur in clusters. Such clusters are followed by periods of quiescence in which only small-to-moderate earthquakes take place. Clustering of large earthquakes is the most distinguishable departure from the assumption of constant hazard of random earthquake occurrence in conventional seismic hazard analysis. In the present study, a time-dependent recurrence model is proposed to consider series of large earthquakes that occur in clusters. The model is flexible enough to better reflect the quasi-periodic behavior of large earthquakes with long-term clustering, and it can be used in time-dependent probabilistic seismic hazard analysis for engineering purposes. In this model, the time-dependent hazard is estimated by a hazard function which comprises three parts: a decreasing hazard associated with the last large-earthquake cluster, an increasing hazard associated with the next large-earthquake cluster, and a constant hazard representing the random occurrence of small-to-moderate earthquakes. In the final part of the paper, the time-dependent seismic hazard of the New Madrid Seismic Zone at different time intervals has been calculated for illustrative purposes.
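
    The three-part hazard function lends itself to a direct sketch; the functional forms (exponential decay plus linear growth plus a constant) and all parameters below are hypothetical illustrations, not the paper's calibrated model:

        import numpy as np

        def hazard(t, lam0=0.02, a=0.1, b=0.004, c=0.03):
            # Illustrative three-part hazard: a decaying term from the last
            # large-earthquake cluster, a rising term for the next cluster,
            # and a constant background rate for small-to-moderate events.
            decreasing = a * np.exp(-c * t)   # fading influence of last cluster
            increasing = b * t                # slowly building next cluster
            return decreasing + increasing + lam0

        t = np.linspace(0, 200, 5)   # years since the last cluster
        print(hazard(t))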

  15. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model

    Directory of Open Access Journals (Sweden)

    Rajkumar Rajavel

    2015-01-01

    Full Text Available Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the pre-request context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision using the stochastic decision tree scenario can maximize the revenue among the participants available in the cloud service negotiation framework.

  16. Optimizing Negotiation Conflict in the Cloud Service Negotiation Framework Using Probabilistic Decision Making Model.

    Science.gov (United States)

    Rajavel, Rajkumar; Thangarathinam, Mala

    2015-01-01

    Optimization of negotiation conflict in the cloud service negotiation framework is identified as one of the major challenging issues. This negotiation conflict occurs during the bilateral negotiation process between the participants due to misperception, aggressive behavior, and uncertain preferences and goals about their opponents. Existing research work focuses on the pre-request context of negotiation conflict optimization by grouping similar negotiation pairs using distance, binary, context-dependent, and fuzzy similarity approaches. To some extent, these approaches can maximize the success rate and minimize the communication overhead among the participants. To further optimize the success rate and communication overhead, the proposed research work introduces a novel probabilistic decision making model for optimizing negotiation conflict in the long-term negotiation context. This decision model formulates the problem of managing the different types of negotiation conflict that occur during the negotiation process as a multistage Markov decision problem. At each stage of the negotiation process, the proposed decision model generates a heuristic decision based on past negotiation state information without causing any break-off among the participants. In addition, this heuristic decision using the stochastic decision tree scenario can maximize the revenue among the participants available in the cloud service negotiation framework.

  17. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission’s (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research whose objective is to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA and, in turn, the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  18. Probabilistic-Stochastic Model of Distribution of Physical and Mechanical Properties of Soft Mineral Rocks

    Directory of Open Access Journals (Sweden)

    O.O. Sdvizhkova

    2017-12-01

    Full Text Available The physical and mechanical characteristics of soils and soft rocks obtained as a result of laboratory tests are important initial parameters for assessing the stability of natural and artificial slopes. Rock properties such as cohesion and the angle of internal friction are shaped by a number of natural and technogenic factors. From the set of factors influencing the stability of the slope, the most significant ones are singled out, as these largely determine the properties of the rocks. The more factors are taken into account in the geotechnical model, the more closely the properties of the rocks are studied, which increases the accuracy of the scientific forecast of the landslide danger of the slope. On the other hand, an increase in the number of factors involved in the model complicates it and causes a decrease in the reliability of geotechnical calculations. The aim of the work is to construct the statistical distribution of the studied physical and mechanical properties of soft rocks and to substantiate a probabilistic-statistical model. Based on the results of laboratory tests of rocks, the statistical distributions of the quantitative traits studied, the angle of internal friction φ and the cohesion, were constructed. It was established that the statistical distribution of the physical and mechanical properties of the rocks is close to a uniform law.
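
    Checking how close laboratory values are to a uniform law can be done with a goodness-of-fit test; a brief sketch assuming SciPy, with simulated stand-in data (note that estimating the range from the sample itself makes the test approximate):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        phi = rng.uniform(18.0, 26.0, 40)   # stand-in lab values of friction angle [deg]

        # Kolmogorov-Smirnov test against a uniform law on the observed range
        res = stats.kstest(phi, stats.uniform(loc=phi.min(), scale=np.ptp(phi)).cdf)
        print(res.statistic, res.pvalue)    # large p-value: uniform law not rejected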

  19. The PSACOIN level 1B exercise: A probabilistic code intercomparison involving a four compartment biosphere model

    International Nuclear Information System (INIS)

    Klos, R.A.; Sinclair, J.E.; Torres, C.; Mobbs, S.F.; Galson, D.A.

    1991-01-01

    The Probabilistic Systems Assessment Code (PSAC) User Group of the OECD Nuclear Energy Agency has organised a series of code intercomparison studies of relevance to the performance assessment of underground repositories for radioactive wastes, known collectively by the name PSACOIN. The latest of these to be undertaken is designated PSACOIN Level 1b, and the case specification provides a complete assessment model of the behaviour of radionuclides following release into the biosphere. PSACOIN Level 1b differs from other biosphere-oriented intercomparison exercises in that individual dose is the end point of the calculations, as opposed to any other intermediate quantity. The PSACOIN Level 1b case specification describes a simple source term which is used to simulate the release of activity to the biosphere from certain types of near-surface waste repository, the transport of radionuclides through the biosphere, and their eventual uptake by humankind. The biosphere sub-model comprises four compartments representing top and deep soil layers, river water and river sediment. The transport of radionuclides between the physical compartments is described by ten transfer coefficients, and doses to humankind arise from the simultaneous consumption of water, fish, meat, milk, and grain as well as from dust inhalation and external γ-irradiation. The parameters of the exposure pathway sub-model are chosen to be representative of an individual living in a small agrarian community. (13 refs., 3 figs., 2 tabs.)
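
    A compartment model of this kind is a linear system of ODEs, dC/dt = KC - λC, with the transfer coefficients collected in a matrix K. A brief Python sketch assuming SciPy; the four compartments follow the specification above, but the coefficient values are hypothetical:

        import numpy as np
        from scipy.integrate import solve_ivp

        # Hypothetical transfer coefficients [1/yr] between the four compartments:
        # 0 top soil, 1 deep soil, 2 river water, 3 river sediment.
        K = np.array([
            [-0.30,  0.05,  0.02,  0.00],
            [ 0.20, -0.10,  0.00,  0.00],
            [ 0.10,  0.05, -0.52,  0.10],
            [ 0.00,  0.00,  0.50, -0.10],
        ])  # column j: outflows/inflows of compartment j; columns sum to zero
            # here (transfers conserve activity), decay is applied separately

        decay = 0.01  # radioactive decay constant [1/yr]

        def dCdt(t, C):
            return K @ C - decay * C

        sol = solve_ivp(dCdt, (0.0, 100.0), [1.0, 0.0, 0.0, 0.0], t_eval=[100.0])
        print(sol.y[:, -1])   # compartment inventories after 100 years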

  20. Towards using the chordal graph polytope in learning decomposable models

    Czech Academy of Sciences Publication Activity Database

    Studený, Milan; Cussens, J.

    2017-01-01

    Roč. 88, č. 1 (2017), s. 259-281 ISSN 0888-613X. [8th International Conference of Probabilistic Graphical Models. Lugano, 06.09.2016-09.09.2016] R&D Projects: GA ČR(CZ) GA16-12010S Institutional support: RVO:67985556 Keywords : learning decomposable models * integer linear programming * characteristic imset * chordal graph polytope * clutter inequalities * separation problem Subject RIV: BA - General Mathematics OBOR OECD: Statistics and probability Impact factor: 2.845, year: 2016 http://library.utia.cas.cz/separaty/2017/MTR/studeny-0475614.pdf

  1. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  2. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery Using a Probabilistic Learning Framework

    Science.gov (United States)

    Basu, Saikat; Ganguly, Sangram; Michaelis, Andrew; Votava, Petr; Roy, Anshuman; Mukhopadhyay, Supratik; Nemani, Ramakrishna

    2015-01-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets, which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) imagery for tree-cover delineation for the whole of the continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and a reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.

  3. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery using a Probabilistic Learning Framework

    Science.gov (United States)

    Basu, S.; Ganguly, S.; Michaelis, A.; Votava, P.; Roy, A.; Mukhopadhyay, S.; Nemani, R. R.

    2015-12-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets, which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) imagery for tree-cover delineation for the whole of the continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and a reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.

  4. Probabilistic models for 2D active shape recognition using Fourier descriptors and mutual information

    CSIR Research Space (South Africa)

    Govender, N

    2014-08-01

    Full Text Available information to improve the initial shape recognition results. We propose an initial system which performs shape recognition using the Euclidean distances of Fourier descriptors. To improve upon these results we build multinomial and Gaussian probabilistic...
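
    Fourier descriptors and their Euclidean comparison are compact to illustrate; a standard construction is sketched below in Python (not necessarily the authors' exact normalization):

        import numpy as np

        def fourier_descriptors(contour, n_keep=8):
            # Standard invariant descriptors of a closed 2D contour given as
            # complex points x + iy.
            z = contour - contour.mean()          # translation invariance
            mags = np.abs(np.fft.fft(z))          # dropping phase: rotation /
                                                  # start-point invariance
            mags = mags / mags[1]                 # scale invariance
            return np.concatenate([mags[2:2 + n_keep], mags[-n_keep:]])

        theta = np.linspace(0, 2 * np.pi, 64, endpoint=False)
        circle = np.exp(1j * theta)
        ellipse = 2 * np.cos(theta) + 1j * np.sin(theta)
        d = np.linalg.norm(fourier_descriptors(circle) - fourier_descriptors(ellipse))
        print(f"Euclidean distance between descriptors: {d:.3f}")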

  5. Analytical probabilistic modeling of RBE-weighted dose for ion therapy

    Science.gov (United States)

    Wieser, H. P.; Hennig, P.; Wahl, N.; Bangert, M.

    2017-12-01

    Particle therapy is especially prone to uncertainties. This issue is usually addressed with uncertainty quantification and minimization techniques based on scenario sampling. For proton therapy, however, it was recently shown that it is also possible to use closed-form computations based on analytical probabilistic modeling (APM) for this purpose. APM yields unique features compared to sampling-based approaches, motivating further research in this context. This paper demonstrates the application of APM for intensity-modulated carbon ion therapy to quantify the influence of setup and range uncertainties on the RBE-weighted dose. In particular, we derive analytical forms for the nonlinear computations of the expectation value and variance of the RBE-weighted dose by propagating linearly correlated Gaussian input uncertainties through a pencil beam dose calculation algorithm. Both exact and approximation formulas are presented for the expectation value and variance of the RBE-weighted dose and are subsequently studied in depth for a one-dimensional carbon ion spread-out Bragg peak. With V and B being the number of voxels and pencil beams, respectively, the proposed approximations induce only a marginal loss of accuracy while lowering the computational complexity from order O(V × B^2) to O(V × B) for the expectation value and from O(V × B^4) to O(V × B^2) for the variance of the RBE-weighted dose. Moreover, we evaluated the approximated calculation of the expectation value and standard deviation of the RBE-weighted dose in combination with a probabilistic effect-based optimization on three patient cases considering carbon ions as radiation modality against sampled references. The resulting global γ-pass rates (2 mm, 2%) are > 99.15% for the expectation value and > 94.95% for the standard deviation of the RBE-weighted dose, respectively. We applied the derived analytical model to carbon ion treatment planning, although the concept is in general applicable to other
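
    The linear-Gaussian identity underlying such closed-form moment propagation is worth spelling out: if the dose is a linear map d = Ax of Gaussian inputs x ~ N(mu, Sigma), then E[d] = A mu and Cov[d] = A Sigma A^T, with no sampling required. A toy Python sketch (A, mu, Sigma are stand-ins, not a pencil-beam algorithm, and the paper's RBE-weighted case adds nonlinearity on top of this building block):

        import numpy as np

        rng = np.random.default_rng(3)
        A = rng.random((4, 3))                 # 4 voxels, 3 correlated pencil beams
        mu = np.array([1.0, 0.8, 1.2])         # mean beam weights/inputs
        Sigma = 0.01 * np.array([[1.0, 0.5, 0.0],
                                 [0.5, 1.0, 0.5],
                                 [0.0, 0.5, 1.0]])   # correlated input uncertainty

        exp_d = A @ mu                         # closed-form expected dose
        cov_d = A @ Sigma @ A.T                # closed-form dose covariance
        print(exp_d, np.sqrt(np.diag(cov_d)))  # expected dose and per-voxel std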

  6. Combining exposure and effect modeling into an integrated probabilistic environmental risk assessment for nanoparticles.

    Science.gov (United States)

    Jacobs, Rianne; Meesters, Johannes A J; Ter Braak, Cajo J F; van de Meent, Dik; van der Voet, Hilko

    2016-12-01

    There is a growing need for good environmental risk assessment of engineered nanoparticles (ENPs). Environmental risk assessment of ENPs has been hampered by lack of data and knowledge about ENPs, their environmental fate, and their toxicity. This leads to uncertainty in the risk assessment. To deal with uncertainty in the risk assessment effectively, probabilistic methods are advantageous. In the present study, the authors developed a method to model both the variability and the uncertainty in environmental risk assessment of ENPs. This method is based on the concentration ratio and the ratio of the exposure concentration to the critical effect concentration, both considered to be random. In this method, variability and uncertainty are modeled separately so as to allow the user to see which part of the total variation in the concentration ratio is attributable to uncertainty and which part is attributable to variability. The authors illustrate the use of the method with a simplified aquatic risk assessment of nano-titanium dioxide. The authors' method allows a more transparent risk assessment and can also direct further environmental and toxicological research to the areas in which it is most needed. Environ Toxicol Chem 2016;35:2958-2967. © 2016 The Authors. Environmental Toxicology and Chemistry published by Wiley Periodicals, Inc. on behalf of SETAC.
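
    Separating variability from uncertainty is commonly done with a two-dimensional (nested) Monte Carlo: an outer loop samples uncertain parameters, an inner loop samples variability, and the output is an uncertainty distribution over the variable fraction at risk. A Python sketch with entirely hypothetical distributions (not the paper's nano-TiO2 assessment):

        import numpy as np

        rng = np.random.default_rng(11)

        n_unc, n_var = 200, 1000
        frac_at_risk = np.empty(n_unc)
        for i in range(n_unc):
            mu_exp = rng.normal(-1.0, 0.3)   # uncertain mean log-exposure
            mu_eff = rng.normal(0.5, 0.2)    # uncertain mean log-effect threshold
            log_exp = rng.normal(mu_exp, 0.5, n_var)   # variability across waters
            log_eff = rng.normal(mu_eff, 0.4, n_var)   # variability across species
            frac_at_risk[i] = np.mean(log_exp > log_eff)  # exposure exceeds effect

        # uncertainty distribution over the variable fraction at risk
        print(np.percentile(frac_at_risk, [5, 50, 95]))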

  7. A hybrid deterministic-probabilistic approach to model the mechanical response of helically arranged hierarchical strands

    Science.gov (United States)

    Fraldi, M.; Perrella, G.; Ciervo, M.; Bosia, F.; Pugno, N. M.

    2017-09-01

    Very recently, a Weibull-based probabilistic strategy was successfully applied to bundles of wires to determine their overall stress-strain behaviour, also capturing previously unpredicted nonlinear and post-elastic features of hierarchical strands. This approach is based on the so-called "Equal Load Sharing (ELS)" hypothesis, by virtue of which, when a wire breaks, the load acting on the strand is homogeneously redistributed among the surviving wires. Despite the overall effectiveness of the method, discrepancies between theoretical predictions and in silico Finite Element-based simulations or experimental findings may arise when more complex structures, e.g. helically arranged bundles, are analysed. To overcome these limitations, an enhanced hybrid approach is proposed in which the probability of rupture is combined with a deterministic mechanical model of a strand constituted by helically arranged and hierarchically organized wires. The analytical model is validated by comparing its predictions with both Finite Element simulations and experimental tests. The results show that generalized stress-strain responses incorporating tension/torsion coupling are naturally obtained. Once one or more elements break, the competition between the geometry and the mechanics of the strand microstructure, i.e. the different cross sections and helical angles of the wires in the different hierarchical levels of the strand, determines a no longer homogeneous stress redistribution among the surviving wires, whose fate is hence governed by a "Hierarchical Load Sharing" criterion.
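
    The Weibull/ELS starting point of the paper can be illustrated in a few lines: wires with Weibull-distributed failure strains share the load equally, so the bundle stress-strain curve is the wire stress weighted by the Weibull survival function. The sketch below compares a direct simulation against that analytic curve; parameters are illustrative, and the helical, hierarchical mechanics of the full hybrid model are not reproduced.

```python
# Equal Load Sharing (ELS) fiber-bundle sketch with Weibull strengths.
import numpy as np

rng = np.random.default_rng(2)
E, m, eps0 = 200e9, 5.0, 0.02   # wire modulus [Pa], Weibull shape, scale
n_wires = 10_000

# Random failure strains for each wire (Weibull distributed).
eps_fail = eps0 * rng.weibull(m, n_wires)

eps = np.linspace(0.0, 0.06, 300)
# ELS: at strain eps, only wires with eps_fail > eps still carry load.
surviving = (eps_fail[None, :] > eps[:, None]).mean(axis=1)
sigma_sim = E * eps * surviving

# Analytic ELS curve: sigma(eps) = E * eps * exp(-(eps/eps0)^m).
sigma_th = E * eps * np.exp(-(eps / eps0) ** m)

print("peak bundle stress (sim) : %.3g Pa" % sigma_sim.max())
print("peak bundle stress (theo): %.3g Pa" % sigma_th.max())
```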

  8. Probabilistic common cause failure modeling for auxiliary feedwater system after the introduction of flood barriers

    International Nuclear Information System (INIS)

    Zheng, Xiaoyu; Yamaguchi, Akira; Takata, Takashi

    2013-01-01

    Causal inference is capable of assessing common cause failure (CCF) events from the viewpoint of the causes' risk significance. The authors previously proposed the alpha decomposition method for probabilistic CCF analysis, in which the classical alpha factor model and causal inference are integrated to quantitatively assess the CCF risk significance of causes. The alpha decomposition method includes a hybrid Bayesian network for revealing the relationship between component failures and potential causes, and a regression model in which CCF parameters (global alpha factors) are expressed through explanatory variables (causes' occurrence frequencies) and parameters (decomposed alpha factors). This article applies the method and the associated databases to predict CCF parameters of the auxiliary feedwater (AFW) system when defense barriers against internal flood are introduced. Operating data for functionally modified safety systems are scarce, and the uncertainty introduced by using generic CCF databases is unknown. The alpha decomposition method has the potential to analyze the CCF risk of a modified AFW system reasonably on the basis of generic CCF databases; moreover, the sources of uncertainty in parameter estimation can be studied. An example is presented to demonstrate the process of applying Bayesian inference in the alpha decomposition process. The results show that system-specific posterior distributions for the CCF parameters can be predicted. (author)
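
    For orientation, the sketch below shows the classical alpha-factor quantification that the decomposition method builds on: converting alpha factors into basic-event probabilities Q_k for k-of-m common cause failures (staggered-testing convention), with a Dirichlet sample to illustrate parameter uncertainty. The alpha values, pseudo-counts, and Q_t are invented, not the article's AFW data.

```python
# Classical alpha-factor CCF quantification (staggered-testing form):
#   Q_k = k / C(m-1, k-1) * (alpha_k / alpha_t) * Q_t,
# with alpha_t = sum_k k * alpha_k. Values below are illustrative.
from math import comb
import numpy as np

m = 4                       # common cause group size (e.g., 4 AFW pumps)
alpha = np.array([0.95, 0.03, 0.015, 0.005])   # alpha_1 .. alpha_m
Q_t = 1e-3                  # total failure probability per component

alpha_t = sum((k + 1) * a for k, a in enumerate(alpha))
Q = [(k / comb(m - 1, k - 1)) * (alpha[k - 1] / alpha_t) * Q_t
     for k in range(1, m + 1)]
for k, q in enumerate(Q, start=1):
    print(f"Q_{k} (failure of exactly {k} components): {q:.3e}")

# Parameter uncertainty: treat alpha as Dirichlet-distributed (a common
# Bayesian choice) and look at the spread of the rare Q_4 term.
rng = np.random.default_rng(3)
samples = rng.dirichlet(alpha * 1000, size=5000)      # pseudo-counts
a_t = (samples * np.arange(1, m + 1)).sum(axis=1)
Q4 = (4 / comb(3, 3)) * samples[:, 3] / a_t * Q_t
print("Q_4 90%% interval: (%.2e, %.2e)" % tuple(np.percentile(Q4, [5, 95])))
```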

  9. Stochastic network interdiction optimization via capacitated network reliability modeling and probabilistic solution discovery

    International Nuclear Information System (INIS)

    Ramirez-Marquez, Jose Emmanuel; Rocco S, Claudio M.

    2009-01-01

    This paper introduces an evolutionary optimization approach that can be readily applied to solve stochastic network interdiction problems (SNIP). The network interdiction problem solved considers the minimization of the cost associated with an interdiction strategy such that the maximum flow that can be transmitted between a source node and a sink node for a fixed network design is greater than or equal to a given reliability requirement. Furthermore, the model assumes that the nominal capacity of each network link and the cost associated with its interdiction can change from link to link, and that such interdiction has a probability of being successful. This version of the SNIP is for the first time modeled as a capacitated network reliability problem, allowing for the implementation of computation and solution techniques previously unavailable. The solution process is based on an evolutionary algorithm that implements: (1) Monte Carlo simulation, to generate potential network interdiction strategies; (2) capacitated network reliability techniques, to analyze strategies' source-sink flow reliability; and (3) an evolutionary optimization technique, to define, in probabilistic terms, how likely a link is to appear in the final interdiction strategy. Examples for different sizes of networks are used throughout the paper to illustrate the approach.
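
    The Monte Carlo building block of the approach can be sketched as follows: given an interdiction strategy, sample which links survive, compute the source-sink max flow, and estimate the probability that a demand is met. The toy network, capacities, and survival probabilities below are invented, and the evolutionary search over strategies is omitted; networkx supplies the max-flow computation.

```python
# Monte Carlo estimate of capacitated source-sink flow reliability
# under probabilistic link interdiction. Toy data only.
import networkx as nx
import numpy as np

rng = np.random.default_rng(4)

edges = [  # (u, v, capacity, survival prob. under the chosen interdiction)
    ("s", "a", 10, 0.9), ("s", "b", 8, 0.7),
    ("a", "b", 5, 0.95), ("a", "t", 7, 0.8), ("b", "t", 10, 0.85),
]
demand, n_samples = 12, 5000

hits = 0
for _ in range(n_samples):
    G = nx.DiGraph()
    for u, v, cap, p in edges:
        # A link contributes its capacity only if it survives.
        G.add_edge(u, v, capacity=cap if rng.random() < p else 0)
    flow_value, _ = nx.maximum_flow(G, "s", "t")
    hits += flow_value >= demand

print("P(max flow >= %d) ≈ %.3f" % (demand, hits / n_samples))
```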

  10. Variability of concrete properties: experimental characterisation and probabilistic modelling for calcium leaching

    International Nuclear Information System (INIS)

    De Larrard, Th.

    2010-09-01

    Evaluating the durability of structures requires taking into account the variability of material properties. The thesis has two main aspects: on the one hand, an experimental campaign aimed at quantifying the variability of many indicators of concrete behaviour; on the other hand, a simple numerical model for calcium leaching, developed in order to implement probabilistic methods for estimating the lifetime of structures such as those related to radioactive waste disposal. The experimental campaign consisted of following two real construction sites, quantifying the variability of these indicators, studying their correlations, and characterising the random-field variability of the considered variables (especially the correlation length). To draw conclusions from the accelerated leaching tests with ammonium nitrate while overcoming the effects of temperature, an inverse analysis tool based on artificial neural networks was developed. Simple numerical tools are presented to investigate the propagation of variability in durability issues, to quantify the influence of this variability on the lifespan of structures, and to relate the variability of the input parameters of the numerical model to that of the physically measurable quantities of the material. (author)
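
    One ingredient of such probabilistic durability studies, generating a spatially correlated random field of a material property with a prescribed correlation length, can be sketched directly. The example below uses an exponential covariance and a Cholesky factorization; the property, mean, and correlation length are illustrative, not the thesis' measured statistics.

```python
# 1D stationary Gaussian random field of a concrete property (here,
# a notional porosity) with exponential correlation, via Cholesky.
import numpy as np

rng = np.random.default_rng(5)

x = np.linspace(0.0, 10.0, 200)         # positions along the structure [m]
mean, sd, corr_len = 0.12, 0.015, 1.5   # mean, SD, correlation length [m]

# Exponential covariance: C(r) = sd^2 * exp(-|r| / corr_len).
r = np.abs(x[:, None] - x[None, :])
C = sd ** 2 * np.exp(-r / corr_len) + 1e-12 * np.eye(len(x))  # jitter

L = np.linalg.cholesky(C)
field = mean + L @ rng.standard_normal(len(x))

print("field mean %.4f, sd %.4f (targets %.4f, %.4f)"
      % (field.mean(), field.std(), mean, sd))
```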

  11. Co-occurrence of medical conditions: Exposing patterns through probabilistic topic modeling of SNOMED codes.

    Science.gov (United States)

    Bhattacharya, Moumita; Jurkovitz, Claudine; Shatkay, Hagit

    2018-04-12

    Patients with multiple co-occurring health conditions often face aggravated complications and less favorable outcomes. Co-occurring conditions are especially prevalent among individuals suffering from kidney disease, an increasingly widespread condition affecting 13% of the general population in the US. This study aims to identify and characterize patterns of co-occurring medical conditions in patients using a probabilistic framework. Specifically, we apply topic modeling in a non-traditional way to find associations across SNOMED-CT codes assigned and recorded in the EHRs of >13,000 patients diagnosed with kidney disease. Unlike most prior work on topic modeling, we apply the method to codes rather than to natural language. Moreover, we quantitatively evaluate the topics, assessing their tightness and distinctiveness, and also assess the medical validity of our results. Our experiments show that each topic is succinctly characterized by a few highly probable and unique disease codes, indicating that the topics are tight. Furthermore, the inter-topic distance between each pair of topics is typically high, illustrating distinctiveness. Last, most coded conditions grouped together within a topic are indeed reported to co-occur in the medical literature. Notably, our results uncover a few indirect associations among conditions that have hitherto not been reported as correlated in the medical literature. Copyright © 2018. Published by Elsevier Inc.
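
    A minimal sketch of the paper's central move, topic modeling over codes instead of words: each patient becomes a "document" whose "words" are SNOMED-CT codes, and LDA groups co-occurring codes into topics. The code vocabulary and count matrix below are tiny synthetic stand-ins; scikit-learn's LatentDirichletAllocation does the fitting.

```python
# LDA over diagnosis codes: patients are documents, codes are words.
# Synthetic data with two built-in co-occurrence patterns.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(6)
codes = ["CKD_stage3", "hypertension", "diabetes_t2", "anemia",
         "heart_failure", "hyperlipidemia"]

# Patients-by-codes count matrix (renal/metabolic vs. cardiac patterns).
renal = rng.poisson([3, 2, 2, 1, 0, 1], size=(40, 6))
cardiac = rng.poisson([0, 2, 0, 0, 3, 2], size=(40, 6))
X = np.vstack([renal, cardiac])

lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
for t, comp in enumerate(lda.components_):
    top = np.argsort(comp)[::-1][:3]          # highest-weight codes per topic
    print(f"topic {t}:", ", ".join(codes[i] for i in top))
```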

  12. Active Learning with Statistical Models.

    Science.gov (United States)

    1995-01-01

    Cohn, David A.; Ghahramani, Zoubin; Jordan, Michael I. MIT Artificial Intelligence Laboratory, A.I. Memo No. 1522; Center for Biological and Computational Learning, Department of Brain and Cognitive Sciences, C.B.C.L. Paper No. 110, January 1995. Supported under NSF grants ASC-9217041 and CDA-9309300. Keywords: active learning, queries, locally weighted regression, LOESS, mixtures of Gaussians.
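
    In the spirit of the memo, which selects queries so as to reduce the learner's expected variance, the sketch below runs a variance-driven active learning loop: a bootstrap ensemble of polynomial fits estimates predictive variance over a candidate pool, and the most uncertain input is queried next. The ensemble stands in for the memo's closed-form variance expressions for LOESS and mixtures of Gaussians; all functions and constants are invented.

```python
# Variance-based active learning with a bootstrap ensemble of
# polynomial least-squares fits. Toy target function and noise level.
import numpy as np

rng = np.random.default_rng(7)
true_f = lambda x: np.sin(3 * x)
noise = 0.1
pool = np.linspace(-1, 1, 200)                 # candidate query inputs

X = list(rng.uniform(-1, 1, 6))                # small initial design
y = [true_f(x) + noise * rng.standard_normal() for x in X]

def ensemble_predictions(X, y, n_boot=30, deg=3):
    # Bootstrap ensemble: each member fits a resampled dataset.
    X, y = np.asarray(X), np.asarray(y)
    preds = []
    for _ in range(n_boot):
        idx = rng.integers(0, len(X), len(X))
        preds.append(np.polyval(np.polyfit(X[idx], y[idx], deg), pool))
    return np.array(preds)

for _ in range(10):
    var = ensemble_predictions(X, y).var(axis=0)
    query = pool[var.argmax()]                 # most uncertain input
    X.append(query)                            # "ask" for its label
    y.append(true_f(query) + noise * rng.standard_normal())

print("queried inputs:", np.round(np.sort(np.array(X[6:])), 2))
```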

  13. Growing hierarchical probabilistic self-organizing graphs.

    Science.gov (United States)

    López-Rubio, Ezequiel; Palomo, Esteban José

    2011-07-01

    Since the introduction of the growing hierarchical self-organizing map, much work has been done on self-organizing neural models with a dynamic structure. These models allow adjusting the layers of the model to the features of the input dataset. Here we propose a new self-organizing model which is based on a probabilistic mixture of multivariate Gaussian components. The learning rule is derived from the stochastic approximation framework, and a probabilistic criterion is used to control the growth of the model. Moreover, the model is able to adapt to the topology of each layer, so that a hierarchy of dynamic graphs is built. This overcomes the limitations of the self-organizing maps with a fixed topology, and gives rise to a faithful visualization method for high-dimensional data.
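
    The stochastic-approximation learning rule for a probabilistic mixture can be sketched independently of the growth and graph machinery: each sample softly updates the Gaussian components' weights, means, and variances with a decaying step size. The sketch below is a generic online mixture update under that framework, not the paper's exact rule.

```python
# Online (stochastic approximation) update of an isotropic Gaussian
# mixture: a generic sketch of the kind of rule such models derive.
import numpy as np

rng = np.random.default_rng(12)
data = np.vstack([rng.normal(-2, 0.5, (500, 2)),
                  rng.normal(2, 0.5, (500, 2))])
rng.shuffle(data)

K = 2
mu = rng.normal(0, 1, (K, 2))      # component means
var = np.ones(K)                   # isotropic variances
pi = np.full(K, 1.0 / K)           # mixing weights

for n, x in enumerate(data, start=1):
    # Responsibilities under isotropic 2D Gaussians (up to a constant).
    d2 = ((mu - x) ** 2).sum(axis=1)
    r = pi * np.exp(-0.5 * d2 / var) / var
    r /= r.sum()
    eta = 1.0 / (10 + n)           # decaying step size
    pi += eta * (r - pi)
    mu += eta * r[:, None] * (x - mu)
    var += eta * r * (d2 / 2 - var)   # E[d2] = 2*var in 2D

print("learned means:\n", np.round(mu, 2))
```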

  14. Comparison of Control Approaches in Genetic Regulatory Networks by Using Stochastic Master Equation Models, Probabilistic Boolean Network Models and Differential Equation Models, and Estimated Error Analyses

    Science.gov (United States)

    Caglar, Mehmet Umut; Pal, Ranadip

    2011-03-01

    The central dogma of molecular biology states that "information cannot be transferred back from protein to either protein or nucleic acid". However, this assumption is not exactly correct in most cases: there are many feedback loops and interactions between different levels of a system. These interactions are hard to analyze due to the lack of cell-level data and the probabilistic, nonlinear nature of the interactions. Several model classes are widely used to analyze and simulate such nonlinear interactions. Stochastic Master Equation (SME) models capture the probabilistic nature of the interactions in a detailed manner, at a high computational cost. Probabilistic Boolean Network (PBN) models give a coarse-scale picture of the stochastic processes, at a lower computational cost. Differential Equation (DE) models give the time evolution of the mean values of the processes in a highly cost-effective way. Understanding the relations between the predictions of these models is important for judging the reliability of simulations of genetic regulatory networks. In this work, the success of the mapping between SME, PBN and DE models is analyzed, and the accuracy and effectiveness of the control policies generated using the PBN and DE models are compared.
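
    A toy comparison in that spirit, for a single birth-death gene expression process with production rate k and degradation rate g: an exact Gillespie-style stochastic simulation (the sampling counterpart of the SME) against the differential equation for the mean, dm/dt = k - g·m. Rates are invented.

```python
# Gillespie SSA vs. ODE mean for a birth-death process.
import numpy as np

rng = np.random.default_rng(8)
k, g, T = 10.0, 1.0, 8.0

def gillespie(m0=0):
    # Exact stochastic simulation of birth (rate k) and death (rate g*m).
    t, m = 0.0, m0
    while True:
        total = k + g * m
        t += rng.exponential(1.0 / total)
        if t >= T:
            return m
        m += 1 if rng.random() < k / total else -1

final = [gillespie() for _ in range(2000)]
m_ode = (k / g) * (1.0 - np.exp(-g * T))   # ODE solution from m(0) = 0
print("SSA mean %.2f  vs  ODE mean %.2f" % (np.mean(final), m_ode))
```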

  15. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    Science.gov (United States)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, thereby increasing the potential for a catastrophic collision during the rendezvous and the docking or berthing of a spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempts of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper describes the methodology used for developing a visiting vehicle risk model.
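
    As a hedged back-of-envelope illustration of why added traffic raises station risk: with independent docking events, the probability of at least one collision is 1 minus the product of the per-event survival probabilities. The per-event probabilities and visit counts below are invented placeholders, not NASA's PRA outputs.

```python
# Aggregating per-docking collision risk across visiting-vehicle traffic:
# P(at least one collision) = 1 - prod_i (1 - p_i)^n_i. Toy numbers only.
visits_per_year = {"Soyuz": 4, "Progress": 4, "ATV": 1, "HTV": 1}
p_collision = {"Soyuz": 1e-4, "Progress": 1e-4, "ATV": 2e-4, "HTV": 2e-4}

p_ok = 1.0
for vehicle, n in visits_per_year.items():
    p_ok *= (1.0 - p_collision[vehicle]) ** n

print(f"P(>=1 collision in a year) ≈ {1.0 - p_ok:.2e}")
```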

  16. Statistical and Probabilistic Extensions to Ground Operations' Discrete Event Simulation Modeling

    Science.gov (United States)

    Trocine, Linda; Cummings, Nicholas H.; Bazzana, Ashley M.; Rychlik, Nathan; LeCroy, Kenneth L.; Cates, Grant R.

    2010-01-01

    NASA's human exploration initiatives will invest in technologies, public/private partnerships, and infrastructure, paving the way for the expansion of human civilization into the solar system and beyond. As it has been for the past half century, the Kennedy Space Center will be the embarkation point for humankind's journey into the cosmos. Functioning as a next-generation space launch complex, Kennedy's launch pads, integration facilities, processing areas, and launch and recovery ranges will bustle with the activities of the world's space transportation providers. In developing this complex, KSC teams work through the potential operational scenarios: conducting trade studies, planning and budgeting for expensive and limited resources, and simulating alternative operational schemes. Numerous tools, among them discrete event simulation (DES), were matured during the Constellation Program to conduct such analyses with the purpose of optimizing the launch complex for maximum efficiency, safety, and flexibility while minimizing life cycle costs. Discrete event simulation is a computer-based modeling technique for complex and dynamic systems where the state of the system changes at discrete points in time and whose inputs may include random variables. DES is used to assess timelines and throughput, and to support operability studies and contingency analyses. It is applicable to any space launch campaign and informs decision-makers of the effects of varying the number of expensive resources and of the impact of off-nominal scenarios on measures of performance. In order to develop representative DES models, methods were adopted, exploited, or created to extend traditional uses of DES. The Delphi method was adopted and utilized for task duration estimation; DES software was exploited for probabilistic event variation; and a roll-up process was developed to reuse models and model elements in other, less detailed models. The DES team continues to innovate and expand.
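
    One of the extensions named above, probabilistic task durations from Delphi-style three-point estimates, can be sketched with a Monte Carlo roll-up through a small precedence network. Tasks, estimates, and the network are invented, and a full DES engine with resources and queues is beyond this sketch.

```python
# Monte Carlo roll-up of triangular (min / most likely / max) task
# durations through a small precedence network. Toy tasks and numbers.
import random

random.seed(9)

# task: ((min, mode, max) duration in days, predecessors)
tasks = {
    "stack_vehicle":     ((8, 10, 15), []),
    "integrate_payload": ((5, 6, 10), []),
    "pad_rollout":       ((1, 2, 4), ["stack_vehicle", "integrate_payload"]),
    "launch_countdown":  ((1, 1, 3), ["pad_rollout"]),
}

def campaign_length():
    finish = {}
    for name, ((lo, mode, hi), preds) in tasks.items():  # topological order
        start = max((finish[p] for p in preds), default=0.0)
        finish[name] = start + random.triangular(lo, hi, mode)
    return max(finish.values())

runs = sorted(campaign_length() for _ in range(10_000))
for p in (50, 80, 95):
    print(f"P{p} campaign duration: {runs[int(p / 100 * len(runs))]:.1f} days")
```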

  17. Probabilistic migration modelling focused on functional barrier efficiency and low migration concepts in support of risk assessment.

    Science.gov (United States)

    Brandsch, Rainer

    2017-10-01

    Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants, based on mass-transfer parameters such as diffusion and partition coefficients of the individual materials. In most cases, mass-transfer parameters are not readily available from the literature and are therefore estimated with a given uncertainty. Historically, this uncertainty was accounted for by upper-limit concepts, which turned out to be of limited applicability because they grossly overestimate migration. Probabilistic migration modelling makes it possible to consider the uncertainty of the mass-transfer parameters as well as of other model inputs. With respect to a functional barrier, the most important parameters are, among others, the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and applies Monte Carlo methods, i.e. random sampling from the input distributions of the relevant parameters (diffusion coefficient and layer thickness), predicts migration results with the related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented through three case studies: (1) sensitivity analysis, (2) functional barrier efficiency, and (3) validation by experimental testing. Based on the migration predicted by probabilistic modelling and the related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts becomes possible, and associated migration risks and potential safety concerns can be identified at an early stage of packaging development. Furthermore, dedicated selection of materials exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling, through a minimum of dedicated experimental testing, is strongly recommended.
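
    A minimal sketch of the Monte Carlo core described here, assuming a simplified diffusion-controlled barrier where the breakthrough (lag) time is approximated by t_lag ≈ L²/(6D): sampling an uncertain diffusion coefficient and barrier thickness yields a probability of breakthrough within a target shelf life. Distributions and the shelf life are illustrative, not a validated migration model.

```python
# Monte Carlo over uncertain diffusion coefficient D and barrier
# thickness L, pushed through the lag-time estimate t_lag = L^2/(6D).
import numpy as np

rng = np.random.default_rng(10)
n = 100_000

# Diffusion coefficient: log-normal around 1e-13 cm^2/s (order-of-magnitude
# uncertainty is typical when D is estimated rather than measured).
D = 10 ** rng.normal(-13.0, 0.5, n)          # cm^2/s
L = rng.normal(25e-4, 3e-4, n).clip(5e-4)    # barrier thickness, cm (~25 um)

t_lag_days = L ** 2 / (6 * D) / 86400.0
shelf_life_days = 180.0

print("median lag time: %.0f days" % np.median(t_lag_days))
print("P(breakthrough before %d days) = %.3f"
      % (shelf_life_days, np.mean(t_lag_days < shelf_life_days)))
```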

  18. Predicting carcinogenicity of diverse chemicals using probabilistic neural network modeling approaches

    Energy Technology Data Exchange (ETDEWEB)

    Singh, Kunwar P., E-mail: kpsingh_52@yahoo.com [Academy of Scientific and Innovative Research, Council of Scientific and Industrial Research, New Delhi (India); Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001 (India); Gupta, Shikha; Rai, Premanjali [Academy of Scientific and Innovative Research, Council of Scientific and Industrial Research, New Delhi (India); Environmental Chemistry Division, CSIR-Indian Institute of Toxicology Research, Post Box 80, Mahatma Gandhi Marg, Lucknow 226 001 (India)

    2013-10-15

    Robust global models capable of discriminating positive and non-positive carcinogens and of predicting the carcinogenic potency of chemicals in rodents were developed. A dataset of 834 structurally diverse chemicals extracted from the Carcinogenic Potency Database (CPDB), containing 466 positive and 368 non-positive carcinogens, was used. Twelve non-quantum-mechanical molecular descriptors were derived. The structural diversity of the chemicals and the nonlinearity in the data were evaluated using the Tanimoto similarity index and Brock–Dechert–Scheinkman statistics. Probabilistic neural network (PNN) and generalized regression neural network (GRNN) models were constructed for the classification and function optimization problems using the carcinogenicity end point in rat. Validation of the models was performed using internal and external procedures employing a wide series of statistical checks. A PNN constructed using five descriptors rendered a classification accuracy of 92.09% on the complete rat data, and classification accuracies of 91.77%, 80.70% and 92.08% on the mouse, hamster and pesticide data, respectively. The GRNN constructed with nine descriptors yielded a correlation coefficient of 0.896 between the measured and predicted carcinogenic potency, with a mean squared error (MSE) of 0.44 on the complete rat data. The rat carcinogenicity model (GRNN) applied to the mouse and hamster data yielded correlation coefficients and MSE of 0.758, 0.71 and 0.760, 0.46, respectively. The results suggest wide applicability of the inter-species models in predicting the carcinogenic potency of chemicals. Both the PNN and GRNN (inter-species) models constructed here can be useful tools in predicting the carcinogenicity of new chemicals for regulatory purposes. - Graphical abstract: (a) classification accuracies (positive and non-positive carcinogens) in rat, mouse, hamster, and pesticide data yielded by the optimal PNN model; (b) generalization and predictive performance.
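
    A PNN in its classic Parzen-window form is compact enough to sketch: each class score is the average Gaussian kernel between the query and that class's training points, i.e. a kernel density estimate used as a Bayes classifier. The two-dimensional toy descriptors below stand in for the study's molecular descriptors.

```python
# Parzen-window probabilistic neural network (PNN) classifier sketch.
import numpy as np

rng = np.random.default_rng(11)

# Two synthetic descriptor clouds: "positive" and "non-positive".
X_pos = rng.normal([0, 0], 1.0, size=(60, 2))
X_neg = rng.normal([3, 3], 1.0, size=(60, 2))

def pnn_score(x, X_class, sigma=0.8):
    # Parzen estimate of p(x | class) with isotropic Gaussian kernels.
    d2 = ((X_class - x) ** 2).sum(axis=1)
    return np.exp(-d2 / (2 * sigma ** 2)).mean()

def classify(x):
    return ("positive" if pnn_score(x, X_pos) > pnn_score(x, X_neg)
            else "non-positive")

test = np.array([[0.5, -0.2], [2.8, 3.1], [1.5, 1.5]])
for x in test:
    print(x, "->", classify(x))
```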

  19. The Game Enhanced Learning Model

    DEFF Research Database (Denmark)

    Reng, Lars; Schoenau-Fog, Henrik

    2016-01-01

    In this paper, we introduce the Game Enhanced Learning Model (GEM), which describes a range of game-oriented learning activities. The model is intended to give an overview of the possibilities of game-based learning in general, all the way up to purposive game productions. We describe the levels of the model, which is based on our experience in teaching professional game development at university level; furthermore, we have been using the model to inspire numerous educators to improve their students' motivation and skills. The model presents various game-based learning activities and depicts their required planning and expected outcome through eight levels. At its lower levels, the model covers the possibilities of using stand-alone analogue and digital games as teachers, utilizing games as facilitators of learning activities, exploiting gamification, and motivating learners.

  20. A probabilistic transmission and population dynamic model to assess tuberculosis infection risk.

    Science.gov (United States)

    Liao, Chung-Min; Cheng, Yi-Hsien; Lin, Yi-Jun; Hsieh, Nan-Hung; Huang, Tang-Luen; Chio, Chia-Pin; Chen, Szu-Chieh; Ling, Min-Pei

    2012-08-01

    The purpose of this study was to examine tuberculosis (TB) population dynamics and to assess the potential infection risk in Taiwan. A well-established mathematical model of TB transmission, built on previous models, was adopted to study the potential impact of TB transmission. A probabilistic risk model was also developed to estimate site-specific risks of developing disease soon after recent primary infection, through exogenous reinfection, or through endogenous reactivation (latently infected TB) among regions of Taiwan. We show that the proportion of endogenous reactivation (53-67%) was larger than that of exogenous reinfection (32-47%). Our simulations showed that, as the epidemic reaches a steady state, the age distribution of cases finally shifts toward older age groups, dominated by latently infected TB cases as a result of endogenous reactivation. A comparison of age-weighted TB incidence data with our model simulation output, with 95% credible intervals, revealed that the predictions were in apparent agreement with the observed data. The median value of the overall basic reproduction number (R₀) in eastern Taiwan ranged from 1.65 to 1.72, whereas northern Taiwan had the lowest R₀ estimate of 1.50. We found that total TB incidence in eastern Taiwan had a 25-27% probability that the total proportion of the infected population exceeds 90%, and a 36-66% probability that the proportion attributable to latently infected TB exceeds 20%. We suggest that our Taiwan-based analysis can be extended to the context of developing countries, where TB remains a substantial cause of morbidity and mortality in the elderly. © 2012 Society for Risk Analysis.
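
    The compartmental structure described, fast progression after recent infection, endogenous reactivation of latent infection, and exogenous reinfection, can be sketched with a simple Euler integration; a crude closed-form R₀ for this structure is printed alongside. Parameter values are invented illustrations, not the fitted Taiwan estimates.

```python
# Compartmental TB sketch: fast progression (p), reactivation (nu),
# exogenous reinfection (x * lambda acting on latents). Euler scheme.
import numpy as np

beta, p = 8.0, 0.05      # transmission rate /yr, fraction fast-progressing
nu, mu, gamma = 0.0005, 0.014, 0.5   # reactivation, mortality, recovery /yr
x = 0.35                 # partial-protection factor against reinfection

S, L, I = 0.99, 0.0, 0.01            # susceptible, latent, active (fractions)
dt, T = 0.01, 200.0
for _ in range(int(T / dt)):
    lam = beta * I                   # force of infection
    new_inf = lam * S
    reinf = x * lam * L              # exogenous reinfection of latents
    dS = mu - lam * S - mu * S       # births balance deaths
    dL = (1 - p) * (new_inf + reinf) + gamma * I - (nu + x * lam + mu) * L
    dI = p * (new_inf + reinf) + nu * L - (gamma + mu) * I
    S, L, I = S + dt * dS, L + dt * dL, I + dt * dI

# Crude R0: fast-progression branch plus slow reactivation branch.
R0 = beta * (p / (gamma + mu)
             + (1 - p) * nu / ((nu + mu) * (gamma + mu)))
print(f"endemic prevalence ≈ {I:.4f}, crude R0 ≈ {R0:.2f}")
```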