WorldWideScience

Sample records for physically inspired probabilistic

  1. A Probabilistic Recommendation Method Inspired by Latent Dirichlet Allocation Model

    Directory of Open Access Journals (Sweden)

    WenBo Xie

    2014-01-01

    The recent decade has witnessed an increasing popularity of recommendation systems, which help users acquire relevant knowledge, commodities, and services from an overwhelming information ocean on the Internet. Latent Dirichlet Allocation (LDA), originally presented as a graphical model for text topic discovery, has now found application in many other disciplines. In this paper, we propose an LDA-inspired probabilistic recommendation method by taking the user-item collecting behavior as a two-step process: every user first becomes a member of one latent user-group at a certain probability, and each user-group then collects various items with different probabilities. Gibbs sampling is employed to approximate all the probabilities in the two-step process. The experimental results on three real-world data sets, MovieLens, Netflix, and Last.fm, show that our method exhibits a competitive performance on precision, coverage, and diversity in comparison with four other typical recommendation methods. Moreover, we present an approximate strategy to reduce the computing complexity of our method with a slight degradation of the performance.
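
    The two-step view above lends itself to a compact illustration. Below is a minimal sketch, assuming small hand-made probability tables (in the paper these would be estimated from user-item data by Gibbs sampling); all names and numbers are hypothetical.

    ```python
    import numpy as np

    # Hypothetical probability tables (illustrative, not estimated from real data):
    # P(group | user): every user belongs to each latent user-group with some probability.
    p_group_given_user = np.array([
        [0.8, 0.2],   # user 0
        [0.1, 0.9],   # user 1
    ])
    # P(item | group): each user-group collects the items with different probabilities.
    p_item_given_group = np.array([
        [0.5, 0.3, 0.2],   # group 0
        [0.1, 0.2, 0.7],   # group 1
    ])

    # Two-step process: P(item | user) = sum_g P(group g | user) * P(item | group g).
    p_item_given_user = p_group_given_user @ p_item_given_group

    def recommend(user, collected, top_k=2):
        """Rank items the user has not collected yet by P(item | user)."""
        scores = p_item_given_user[user].copy()
        scores[list(collected)] = -np.inf
        return np.argsort(scores)[::-1][:top_k]

    print(recommend(user=0, collected={0}))   # -> [2 1]
    ```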

  2. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    Energy Technology Data Exchange (ETDEWEB)

    Cetiner, Mustafa Sacit; Flanagan, George F. [ORNL]; Poore III, Willis P. [ORNL]; Muhlheim, Michael David [ORNL]

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  3. Convolution product construction of interactions in probabilistic physical models

    International Nuclear Information System (INIS)

    Ratsimbarison, H.M.; Raboanary, R.

    2007-01-01

    This paper aims to give a probabilistic construction of interactions which may be relevant for building physical theories such as interacting quantum field theories. We start with the path integral definition of the partition function in quantum field theory, which reminds us of the probabilistic nature of this physical theory. Starting from a Gaussian law regarded as the free theory, an interacting theory is constructed by a nontrivial convolution product between the free theory and an interaction term which is also a probability law. The resulting theory, again a probability law, exhibits two properties already present in present-day theories of interactions such as gauge theory: the interaction term does not depend on the free term, and two different free theories can be implemented with the same interaction.
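
    Spelled out as a formula, the construction amounts to convolving the two probability densities; a sketch of the defining relation, with symbols chosen here for illustration (p_0 for the free Gaussian law, p_I for the interaction term):

    ```latex
    p_{\mathrm{int}}(x) \;=\; (p_0 * p_I)(x) \;=\; \int p_0(x - y)\, p_I(y)\, \mathrm{d}y .
    ```

    Being a convolution of two probability laws, p_int is again a probability law, which is the closure property the abstract emphasizes.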

  4. INSPIRE - Premission. [Interactive NASA Space Physics Ionosphere Radio Experiment

    Science.gov (United States)

    Taylor, William W. L.; Mideke, Michael; Pine, William E.; Ericson, James D.

    1992-01-01

    The Interactive NASA Space Physics Ionosphere Radio Experiment (INSPIRE), designed to assist in a Space Experiments with Particle Accelerators (SEPAC) project, is discussed. INSPIRE is aimed at recording data from a large number of receivers on the ground to determine the exact propagation paths and absorption of radio waves at frequencies between 50 Hz and 7 kHz. It is indicated how to participate in the experiment, which will involve high school classes, colleges, and amateur radio operators.

  5. INSPIRE: Interactive NASA Space Physics Ionosphere Radio Experiment

    Science.gov (United States)

    Franzen, K. A.; Garcia, L. N.; Webb, P. A.; Green, J. L.

    2007-12-01

    The INSPIRE Project is a non-profit scientific and educational corporation whose objective is to bring the excitement of observing very low frequency (VLF) natural radio waves to high school students. Underlying this objective is the conviction that science and technology are the underpinnings of our modern society, and that only with an understanding of these disciplines can people make correct decisions in their lives. Since 1989, the INSPIRE Project has provided specially designed radio receiver kits to over 2,500 students and other groups to make observations of signals in the VLF frequency range. These kits provide an innovative and unique opportunity for students to actively gather data that can be used in a basic research project. Natural VLF emissions that can be studied with the INSPIRE receiver kits include sferics, tweeks, whistlers, and chorus, which originate from phenomena such as lightning. These emissions can come either from the local atmospheric environment within a few tens of kilometers of the receiver or from outer space thousands of kilometers from the Earth. VLF emissions are at such low frequencies that they can be received, amplified, and turned into sound that we can hear, with each emission producing a distinctive sound. In 2006 INSPIRE was re-branded and its mission was expanded to developing new partnerships with multiple science projects. Links to magnetospheric physics, astronomy, and meteorology are being identified. This presentation will introduce the INSPIRE project, display the INSPIRE receiver kits, show examples of the types of VLF emissions that can be collected, and provide information on scholarship programs being offered.

  6. Writing Inspired

    Science.gov (United States)

    Tischhauser, Karen

    2015-01-01

    Students need inspiration to write. Assigning is not teaching. In order to inspire students to write fiction worth reading, teachers must take them through the process of writing. Physical objects inspire good writing with depth. In this article, the reader will be taken through the process of inspiring young writers through the use of boxes.…

  7. From biologically-inspired physics to physics-inspired biology

    Science.gov (United States)

    Kornyshev, Alexei A.

    2010-10-01

    The conference 'From DNA-Inspired Physics to Physics-Inspired Biology' (1-5 June 2009, International Center for Theoretical Physics, Trieste, Italy), which I organized together with two former presidents of the American Biophysical Society, Wilma Olson (Rutgers University) and Adrian Parsegian (NIH), with the support of an ICTP team (Ralf Gebauer, Local Organizer, and Doreen Sauleek, Conference Secretary), was intended to establish stronger links between the biology and physics communities on the DNA front. The relationships between them were never easy. In 1997, Adrian published a paper in Physics Today ('Harness the Hubris') summarizing his thoughts about the main obstacles to a successful collaboration. The bottom line of that article was that physicists must seriously learn biology before exploring it, and that even having an interpreter, a friend or co-worker who cooperates with you and translates the problems of biology into a physical language, may not be enough. He started his story with a joke about a physicist asking a biologist: 'I want to study the brain. Tell me something about it!' Biologist: 'First, the brain consists of two parts, and..' Physicist: 'Stop. You have told me too much.' Adrian listed a few direct avenues where physicists' contributions may be particularly welcome. This gentle and elegantly written paper caused, however, a stormy reaction from Bob Austin (Princeton), published together with Adrian's notes, accusing Adrian of forbidding physicists to attack big questions in biology straightaway. Twelve years have passed and many new developments have taken place in the biologist-physicist interaction. This was something I addressed in my opening conference speech, with my position lying somewhere in between Parsegian's and Austin's, which is briefly outlined here. I will first recall certain precepts or 'dogmas' that fly in the air like Valkyries, poisoning those relationships. Since the early seventies when I was a first year Ph

  8. INSPIRE: Managing Metadata in a Global Digital Library for High-Energy Physics

    OpenAIRE

    Martin Montull, Javier

    2011-01-01

    Four leading laboratories in the High-Energy Physics (HEP) field are collaborating to roll out the next-generation scientific information portal: INSPIRE. The goal of this project is to replace the popular 40-year-old SPIRES database. INSPIRE already provides access to about 1 million records and includes services such as fulltext search, automatic keyword assignment, ingestion and automatic display of LaTeX, citation analysis, automatic author disambiguation, metadata harvesting, extraction ...

  9. Physics of collapses. Probabilistic occurrence of ELMs and crashes

    International Nuclear Information System (INIS)

    Itoh, S.-I.; Toda, S.; Yagi, M.; Itoh, K.; Fukuyama, A.

    1997-01-01

    A statistical picture of the collapse is proposed. The physics picture of the crash phenomena, which is based on the turbulence-turbulence transition, is extended to include the statistical variance of observables. The dynamics of the plasma gradient and the turbulence level are studied, taking into account the hysteresis nature of the flux-gradient relation. A probabilistic excitation is predicted. The critical condition is described by a statistical probability. (author)

  10. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

    This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. The method is inspired by genetic algorithms (Russell and Norvig, 2002) in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones: the path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method is able to generate varied, high-quality paths, which is desirable in games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method to generate strategic team AI pathfinding plans.
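
    A minimal sketch of the generate-and-eliminate idea described above, assuming a hypothetical grid map, start/goal positions, and fitness terms (none of which are taken from the paper): candidate paths are drawn by a goal-biased random walk and the low-fitness ones are discarded.

    ```python
    import random

    GRID = 8                       # hypothetical 8x8 tile map
    START, GOAL = (0, 0), (7, 7)
    THREAT = (4, 4)                # a position the team plan should keep clear of

    def random_path(max_steps=40):
        """Probabilistic pathfinding: a random walk biased toward the goal."""
        pos, path = START, [START]
        for _ in range(max_steps):
            if pos == GOAL:
                break
            moves = [(1, 0), (0, 1), (-1, 0), (0, -1)]
            dist = lambda p: abs(p[0] - GOAL[0]) + abs(p[1] - GOAL[1])
            # Moves that reduce the distance to the goal are drawn more often.
            weights = [3 if dist((pos[0] + dx, pos[1] + dy)) < dist(pos) else 1
                       for dx, dy in moves]
            dx, dy = random.choices(moves, weights)[0]
            pos = (min(max(pos[0] + dx, 0), GRID - 1),
                   min(max(pos[1] + dy, 0), GRID - 1))
            path.append(pos)
        return path

    def fitness(path):
        """Higher is better: reach the goal, stay short, keep clear of the threat."""
        reached = 100 if path[-1] == GOAL else 0
        clearance = min(abs(x - THREAT[0]) + abs(y - THREAT[1]) for x, y in path)
        return reached - len(path) + 2 * clearance

    # Generate many candidate plans, then eliminate the low-fitness ones.
    candidates = [random_path() for _ in range(200)]
    best_plans = sorted(candidates, key=fitness, reverse=True)[:5]
    print(fitness(best_plans[0]), best_plans[0])
    ```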

  11. Statistical physics inspired energy-efficient coded-modulation for optical communications.

    Science.gov (United States)

    Djordjevic, Ivan B; Xu, Lei; Wang, Ting

    2012-04-15

    Because Shannon's entropy can be obtained by Stirling's approximation of the thermodynamic entropy, statistical physics energy minimization methods are directly applicable to signal constellation design. We demonstrate that statistical physics inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose the discrete-time implementation of a D-dimensional transceiver and the corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America

  12. Machine learning, computer vision, and probabilistic models in jet physics

    CERN Multimedia

    CERN. Geneva; NACHMAN, Ben

    2015-01-01

    In this talk we present recent developments in the application of machine learning, computer vision, and probabilistic models to the analysis and interpretation of LHC events. First, we will introduce the concept of jet-images and computer vision techniques for jet tagging. Jet images enabled the connection between jet substructure and tagging and the fields of computer vision and image processing for the first time, improving the performance in identifying highly boosted W bosons with respect to state-of-the-art methods, and providing a new way to visualize the discriminant features of different classes of jets, adding a new capability to understand the physics within jets and to design more powerful jet tagging methods. Second, we will present Fuzzy jets: a new paradigm for jet clustering using machine learning methods. Fuzzy jets view jet clustering as an unsupervised learning task and incorporate a probabilistic assignment of particles to jets to learn new features of the jet structure. In particular, we wi...

  13. Physicists get INSPIREd

    CERN Multimedia

    CERN Bulletin

    2010-01-01

    Particle physicists thrive on information. They first create information by performing experiments or elaborating theoretical conjectures, and then they share it through publications and various web tools. The INSPIRE service, just released, will bring state-of-the-art information retrieval to the fingertips of researchers.   Keeping track of the information shared within the particle physics community has long been the task of libraries at the larger labs, such as CERN, DESY, Fermilab and SLAC, as well as the focus of indispensable services like arXiv and those of the Particle Data Group. In 2007, many providers of information in the field came together for a summit at SLAC to see how physics information resources could be enhanced, and the INSPIRE project emerged from that meeting. The vision behind INSPIRE was shaped by a survey launched by the four labs to evaluate the real needs of the community. INSPIRE responds to these directives from the community by combining the most successful aspe...

  14. A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.

    Science.gov (United States)

    Revell, Christopher; Somveille, Marius

    2017-08-29

    In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom-up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables predictions of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model was able to capture remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other birds and other highly mobile species, improving our understanding of the relative importance of various factors driving migration and making predictions that could be useful for conservation.
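
    The decision rule drawn from statistical mechanics can be illustrated with a toy simulation: an individual on a one-dimensional environmental potential moves to a neighbouring cell with Boltzmann-weighted probability. The potential values and temperature below are invented for illustration and are not the model or parameters used in the paper.

    ```python
    import math
    import random

    # Hypothetical 1-D environmental potential: low values mark favourable habitat.
    potential = [5.0, 4.0, 3.5, 3.8, 2.0, 1.0, 1.5, 0.5, 2.5, 3.0]
    T = 1.0    # "temperature": how strictly the bird follows the potential gradient

    def step(pos):
        """Move to a neighbouring cell with Boltzmann-weighted probability."""
        neighbours = [p for p in (pos - 1, pos + 1) if 0 <= p < len(potential)]
        weights = [math.exp(-(potential[p] - potential[pos]) / T) for p in neighbours]
        return random.choices(neighbours, weights)[0]

    # Simulate one individual's track starting from the breeding site (cell 0).
    pos, track = 0, [0]
    for _ in range(30):
        pos = step(pos)
        track.append(pos)
    print(track)    # the track tends to drift toward the low-potential cells
    ```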

  15. Inspiring a generation

    CERN Multimedia

    2012-01-01

    The motto of the 2012 Olympic and Paralympic Games is ‘Inspire a generation’ so it was particularly pleasing to see science, the LHC and Higgs bosons featuring so strongly in the opening ceremony of the Paralympics last week.   It’s a sign of just how far our field has come that such a high-profile event featured particle physics so strongly, and we can certainly add our support to that motto. If the legacy of London 2012 is a generation inspired by science as well as sport, then the games will have more than fulfilled their mission. Particle physics has truly inspiring stories to tell, going well beyond Higgs and the LHC, and the entire community has played its part in bringing the excitement of frontier research in particle physics to a wide audience. Nevertheless, we cannot rest on our laurels: maintaining the kind of enthusiasm for science we witnessed at the Paralympic opening ceremony will require constant vigilance, and creative thinking about ways to rea...

  16. Accelerating Inspire

    CERN Document Server

    AUTHOR|(CDS)2266999

    2017-01-01

    CERN has been involved in the dissemination of scientific results since its early days and has continuously updated the distribution channels. Currently, Inspire hosts catalogues of articles, authors, institutions, conferences, jobs, experiments, journals and more. Successful orientation among this amount of data requires comprehensive linking between the content. Inspire has lacked a system for linking experiments and articles together based on which accelerator they were conducted at. The purpose of this project has been to create such a system. Records for 156 accelerators were created and all 2913 experiments on Inspire were given corresponding MARC tags. Records of 18404 accelerator physics related bibliographic entries were also tagged with corresponding accelerator tags. Finally, as a part of the endeavour to broaden CERN's presence on Wikipedia, existing Wikipedia articles of accelerators were updated with short descriptions and links to Inspire. In total, 86 Wikipedia articles were updated. This repo...

  17. Perceptually-Inspired Computing

    Directory of Open Access Journals (Sweden)

    Ming Lin

    2015-08-01

    Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and to develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multi-modal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experiences. I will conclude by discussing possible future research directions.

  18. Effective Practices for Training and Inspiring High School Physics Teachers

    Science.gov (United States)

    Magee-Sauer, Karen

    It is well-documented that there is a nationwide shortage of highly qualified high school physics teachers. Not surprisingly, institutions of higher education report that the most common number of physics teacher graduates is zero, with the majority of institutions graduating fewer than two physics teachers per year. With these statistics in mind, it is critical that institutions take a careful look at how they recruit, train, and contribute to the retention of high school physics teachers. PhysTEC is a partnership between the APS and AAPT that is dedicated to improving and promoting the education of high school physics teachers. Primarily funded by the NSF and its partnering organizations, PhysTEC has identified key components that are common to successful physics teacher preparation programs. While creating a successful training program in physics, it is also important that students have the opportunity for a 'do-able' path to certification that does not add further financial debt. This talk will present an overview of 'what works' in creating a path for physics majors to a high school physics teaching career, actions and activities that help train and inspire pre-service physics teachers, and frameworks that provide the support for in-service teachers. Obstacles to certification and the importance of a strong partnership with colleges of education will be discussed. Several examples of successful high school physics teacher preparation programs will be presented. This material is part of the Physics Teacher Education Coalition project, which is based upon work supported by the National Science Foundation under Grant Nos. 0808790, 0108787, and 0833210.

  19. INSPIRE: Managing Metadata in a Global Digital Library for High-Energy Physics

    CERN Document Server

    Martin Montull, Javier

    2011-01-01

    Four leading laboratories in the High-Energy Physics (HEP) field are collaborating to roll out the next-generation scientific information portal: INSPIRE. The goal of this project is to replace the popular 40-year-old SPIRES database. INSPIRE already provides access to about 1 million records and includes services such as fulltext search, automatic keyword assignment, ingestion and automatic display of LaTeX, citation analysis, automatic author disambiguation, metadata harvesting, extraction of figures from fulltext and search in figure captions. In order to achieve high quality metadata, both automatic processing and manual curation are needed. The different tools available in the system use modern web technologies to provide the curators with maximum efficiency, while dealing with the MARC standard format. The project is under heavy development in order to provide new features including semantic analysis, crowdsourcing of metadata curation, user tagging, recommender systems, integration of OAIS standards a...

  20. Probability versus Representativeness in Infancy: Can Infants Use Naïve Physics to Adjust Population Base Rates in Probabilistic Inference?

    Science.gov (United States)

    Denison, Stephanie; Trikutam, Pallavi; Xu, Fei

    2014-01-01

    A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar…

  1. INSPIRE: Realizing the dream of a global digital library in High-Energy Physics

    CERN Document Server

    Holtkamp, Annette; Simko, Tibor; Smith, Tim

    2010-01-01

    High-Energy Physics (HEP) has a long tradition of pioneering infrastructures for scholarly communication, and four leading laboratories are now rolling out the next-generation digital library for the field: INSPIRE. This is an evolution of the extraordinarily successful, 40-year-old SPIRES database. Based on the Invenio software, INSPIRE already provides seamless access to almost 1 million records, which will be expanded to cover multimedia, data, software, and wikis. Services offered include citation analysis, fulltext search, extraction of figures from fulltext and search in figure captions, automatic keyword assignment, metadata harvesting, retrodigitization, ingestion and automatic display of LaTeX, and storage of supplementary materials like Mathematica notebooks. New services are in different phases of design or implementation, in strategic partnerships with all other information providers in the field and neighbouring disciplines, including automatic author disambiguation, user tagging, crowdsourcing of m...

  2. Probabilistic methods for physics

    International Nuclear Information System (INIS)

    Cirier, G

    2013-01-01

    We present an asymptotic method giving the probability of presence of the iterated spots of R^d under a polynomial function f. We use the well-known Perron-Frobenius (PF) operator, which leaves certain sets and measures invariant under f. Probabilistic solutions can exist for the deterministic iteration. Although the theoretical result is already known, here we quantify these probabilities. This approach seems useful for computational situations in which deterministic methods do not work. Among the examined applications are asymptotic solutions of the Lorenz, Navier-Stokes and Hamilton equations. In this approach, linearity induces many difficult problems, not all of which we have yet resolved.

  3. A Tony Thomas-Inspired Guide to INSPIRE

    Energy Technology Data Exchange (ETDEWEB)

    O'Connell, Heath B. [Fermilab]

    2010-04-01

    The SPIRES database was created in the late 1960s to catalogue the high energy physics preprints received by the SLAC Library. In the early 1990s it became the first database on the web and the first website outside of Europe. Although indispensable to the HEP community, its aging software infrastructure is becoming a serious liability. In a joint project involving CERN, DESY, Fermilab and SLAC, a new database, INSPIRE, is being created to replace SPIRES using CERN's modern, open-source Invenio database software. INSPIRE will maintain the content and functionality of SPIRES plus many new features. I describe this evolution from the birth of SPIRES to the current day, noting that the career of Tony Thomas spans this timeline.

  4. A Tony Thomas-Inspired Guide to INSPIRE

    International Nuclear Information System (INIS)

    O'Connell, Heath B.

    2010-01-01

    The SPIRES database was created in the late 1960s to catalogue the high energy physics preprints received by the SLAC Library. In the early 1990s it became the first database on the web and the first website outside of Europe. Although indispensable to the HEP community, its aging software infrastructure is becoming a serious liability. In a joint project involving CERN, DESY, Fermilab and SLAC, a new database, INSPIRE, is being created to replace SPIRES using CERN's modern, open-source Invenio database software. INSPIRE will maintain the content and functionality of SPIRES plus many new features. I describe this evolution from the birth of SPIRES to the current day, noting that the career of Tony Thomas spans this timeline.

  5. Physicists Get INSPIREd: INSPIRE Project and Grid Applications

    International Nuclear Information System (INIS)

    Klem, Jukka; Iwaszkiewicz, Jan

    2011-01-01

    INSPIRE is the new high-energy physics scientific information system developed by CERN, DESY, Fermilab and SLAC. INSPIRE combines the curated and trusted contents of SPIRES database with Invenio digital library technology. INSPIRE contains the entire HEP literature with about one million records and in addition to becoming the reference HEP scientific information platform, it aims to provide new kinds of data mining services and metrics to assess the impact of articles and authors. Grid and cloud computing provide new opportunities to offer better services in areas that require large CPU and storage resources including document Optical Character Recognition (OCR) processing, full-text indexing of articles and improved metrics. D4Science-II is a European project that develops and operates an e-Infrastructure supporting Virtual Research Environments (VREs). It develops an enabling technology (gCube) which implements a mechanism for facilitating the interoperation of its e-Infrastructure with other autonomously running data e-Infrastructures. As a result, this creates the core of an e-Infrastructure ecosystem. INSPIRE is one of the e-Infrastructures participating in D4Science-II project. In the context of the D4Science-II project, the INSPIRE e-Infrastructure makes available some of its resources and services to other members of the resulting ecosystem. Moreover, it benefits from the ecosystem via a dedicated Virtual Organization giving access to an array of resources ranging from computing and storage resources of grid infrastructures to data and services.

  6. Probability versus representativeness in infancy: can infants use naïve physics to adjust population base rates in probabilistic inference?

    Science.gov (United States)

    Denison, Stephanie; Trikutam, Pallavi; Xu, Fei

    2014-08-01

    A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar contexts with different outcomes. Can infants rapidly acquire probabilistic physical knowledge, such as that some leaves fall and some glasses break, simply by observing the statistical regularity with which objects behave, and then apply that knowledge in subsequent reasoning? We taught 11-month-old infants physical constraints on objects and asked them to reason about the probability of different outcomes when objects were drawn from a large distribution. Infants could have reasoned either by using the perceptual similarity between the samples and larger distributions or by applying physical rules to adjust base rates and estimate the probabilities. Infants learned the physical constraints quickly and used them to estimate probabilities, rather than relying on similarity, a version of the representativeness heuristic. These results indicate that infants can rapidly and flexibly acquire physical knowledge about objects following very brief exposure and apply it in subsequent reasoning. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  7. Obstacle traversal and self-righting of bio-inspired robots reveal the physics of multi-modal locomotion

    Science.gov (United States)

    Li, Chen; Fearing, Ronald; Full, Robert

    Most animals move in nature in a variety of locomotor modes. For example, to traverse obstacles like dense vegetation, cockroaches can climb over, push across, reorient their bodies to maneuver through slits, or even transition among these modes forming diverse locomotor pathways; if flipped over, they can also self-right using wings or legs to generate body pitch or roll. By contrast, most locomotion studies have focused on a single mode such as running, walking, or jumping, and robots are still far from capable of life-like, robust, multi-modal locomotion in the real world. Here, we present two recent studies using bio-inspired robots, together with new locomotion energy landscapes derived from locomotor-environment interaction physics, to begin to understand the physics of multi-modal locomotion. (1) Our experiment of a cockroach-inspired legged robot traversing grass-like beam obstacles reveals that, with a terradynamically 'streamlined' rounded body like that of the insect, robot traversal becomes more probable by accessing locomotor pathways that overcome lower potential energy barriers. (2) Our experiment of a cockroach-inspired self-righting robot further suggests that body vibrations are crucial for exploring locomotion energy landscapes and reaching lower barrier pathways. Finally, we posit that our new framework of locomotion energy landscapes holds promise to better understand and predict multi-modal biological and robotic movement.

  8. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  9. Perspective of an Artist Inspired by Physics

    Science.gov (United States)

    Sanborn, Jim

    2010-02-01

    Using digital images and video I will be presenting thirty years of my science-based artwork. Beginning in the late 1970s my gallery and museum installations used lodestones and suspended compasses to reveal the earth's magnetic field. Through the 1980s my work included these compass installations and geologically inspired tableaux that had one thing in common: they were designed to expose the invisible forces of nature. Tectonics, the Coriolis force, and magnetism were among the subjects of study. In 1988, on the basis of my work with invisible forces, I was selected for a commission from the General Services Administration for the new Central Intelligence Agency headquarters in Langley, Virginia. This work, titled Kryptos, included a large cryptographic component that remains undeciphered twenty years after its installation. In the 1990s Kryptos inspired several of my museum and gallery installations using cryptography and secrecy as their main themes. From 1995 to 1998 I completed a series of large format projections on the landscape in the western US and Ireland. These projections and the resulting series of photographs emulated the 19th century cartographers hired by the United States Government to map the western landscape. In 1998 I began my project titled Atomic Time. This installation, shown for the first time in 2004 at the Corcoran Gallery in Washington DC and then again at the Gwangju Biennale in South Korea, was a recreation of the 1944 Manhattan Project laboratory that built the first atomic bomb. This installation used original equipment and prototypes from the Los Alamos Lab and was an extremely accurate representation of the laboratory and the first nuclear bomb, called the 'Trinity Device.' I began my current project, Terrestrial Physics, in 2005. This installation, to be shown in June 2010 at the Museum of Contemporary Art in Denver, is a recreation of the large particle accelerator and the experiment that fissioned uranium in 1939 at the Carnegie

  10. Probabilities in physics

    CERN Document Server

    Hartmann, Stephan

    2011-01-01

    Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...

  11. Practices of Waldorf-Inspired Schools. Research Brief

    Science.gov (United States)

    Friedlaender, Diane; Beckham, Kyle; Zheng, Xinhua; Darling-Hammond, Linda

    2015-01-01

    "Growing a Waldorf-Inspired Approach in a Public School District" documents the practices and outcomes of Alice Birney, a Waldorf-Inspired School in Sacramento City Unified School District (SCUSD). This study highlights how such a school addresses students' academic, social, emotional, physical, and creative development. The study also…

  12. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

    IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address respective sources of uncertainties, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  13. Probabilistic graphs as a conceptual and computational tool in hydrology and water management

    Science.gov (United States)

    Schoups, Gerrit

    2014-05-01

    Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
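
    As a toy illustration of the three components listed above, the sketch below builds a small probabilistic graph over three binary hydrological variables, defines the joint distribution as a product of local factors, and computes a posterior by direct enumeration. The variables, factor values, and brute-force inference are invented for illustration; real applications would use the efficient distributed algorithms mentioned in the abstract.

    ```python
    import itertools

    # Toy probabilistic graph over three binary variables:
    # rain -> wet soil -> runoff, with one local factor per link.
    def f_rain(r):
        return 0.3 if r else 0.7                      # prior factor on rain

    def f_wet(r, w):
        return (0.9 if w else 0.1) if r else (0.2 if w else 0.8)

    def f_runoff(w, q):
        return (0.7 if q else 0.3) if w else (0.05 if q else 0.95)

    def joint(r, w, q):
        # The joint distribution is the product of the local factors.
        return f_rain(r) * f_wet(r, w) * f_runoff(w, q)

    # Posterior probability of rain given that runoff is observed (q = 1),
    # obtained by summing the joint over the unobserved variable "wet".
    num = sum(joint(1, w, 1) for w in (0, 1))
    den = sum(joint(r, w, 1) for r, w in itertools.product((0, 1), repeat=2))
    print("P(rain | runoff observed) =", num / den)
    ```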

  14. Generative probabilistic models extend the scope of inferential structure determination

    DEFF Research Database (Denmark)

    Olsson, Simon; Boomsma, Wouter; Frellsen, Jes

    2011-01-01

    Conventional methods for protein structure determination from NMR data rely on the ad hoc combination of physical forcefields and experimental data, along with heuristic determination of free parameters such as the weight of experimental data relative to a physical forcefield. Recently, a theoretically ... We demonstrate that the use of generative probabilistic models instead of physical forcefields in the Bayesian formalism is not only conceptually attractive, but also improves precision and efficiency. Our results open new vistas for the use of sophisticated probabilistic models of biomolecular structure ...

  15. Probabilistic Counterfactuals: Semantics, Computation, and Applications

    National Research Council Canada - National Science Library

    Balke, Alexander

    1997-01-01

    ... handled within the framework of standard probability theory. Starting with functional description of physical mechanisms, we were able to derive the standard probabilistic properties of Bayesian networks and to show: (1...

  16. A physical probabilistic model to predict failure rates in buried PVC pipelines

    International Nuclear Information System (INIS)

    Davis, P.; Burn, S.; Moglia, M.; Gould, S.

    2007-01-01

    For older water pipeline materials such as cast iron and asbestos cement, future pipe failure rates can be extrapolated from large volumes of existing historical failure data held by water utilities. However, for newer pipeline materials such as polyvinyl chloride (PVC), only limited failure data exists and confident forecasts of future pipe failures cannot be made from historical data alone. To solve this problem, this paper presents a physical probabilistic model, which has been developed to estimate failure rates in buried PVC pipelines as they age. The model assumes that under in-service operating conditions, crack initiation can occur from inherent defects located in the pipe wall. Linear elastic fracture mechanics theory is used to predict the time to brittle fracture for pipes with internal defects subjected to combined internal pressure and soil deflection loading together with through-wall residual stress. To include uncertainty in the failure process, inherent defect size is treated as a stochastic variable, and modelled with an appropriate probability distribution. Microscopic examination of fracture surfaces from field failures in Australian PVC pipes suggests that the 2-parameter Weibull distribution can be applied. Monte Carlo simulation is then used to estimate lifetime probability distributions for pipes with internal defects, subjected to typical operating conditions. As with inherent defect size, the 2-parameter Weibull distribution is shown to be appropriate to model uncertainty in predicted pipe lifetime. The Weibull hazard function for pipe lifetime is then used to estimate the expected failure rate (per pipe length/per year) as a function of pipe age. To validate the model, predicted failure rates are compared to aggregated failure data from 17 UK water utilities obtained from the United Kingdom Water Industry Research (UKWIR) National Mains Failure Database. In the absence of actual operating pressure data in the UKWIR database, typical
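
    A heavily simplified sketch of the Monte Carlo idea described above: inherent defect sizes are sampled from a 2-parameter Weibull distribution and propagated through a lifetime model, and an empirical failure rate per pipe-year is read off as a function of age. The Weibull parameters and the power-law lifetime expression below are invented stand-ins for the paper's linear elastic fracture mechanics calculation.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Inherent defect size a0 (mm), modelled with a 2-parameter Weibull distribution
    # (shape and scale here are invented, not fitted to fracture-surface data).
    shape, scale_mm = 1.8, 0.4
    a0 = scale_mm * rng.weibull(shape, size=100_000)

    # Placeholder lifetime model standing in for the LEFM calculation in the paper:
    # pipes with larger inherent defects reach brittle fracture sooner.
    lifetime_years = 80.0 * (0.1 / np.maximum(a0, 1e-3)) ** 1.5

    # Empirical hazard: expected failure rate (per pipe, per year) at a given age.
    for age in (10, 30, 50):
        at_risk = np.sum(lifetime_years > age)
        failing = np.sum((lifetime_years > age) & (lifetime_years <= age + 1))
        print(f"age {age:2d} y: failure rate ~ {failing / at_risk:.4f} per pipe-year")
    ```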

  17. Probabilistic analysis of strength and thermal-physic WWER fuel rod characteristics using START-3 code

    International Nuclear Information System (INIS)

    Medvedev, A.; Bogatyr, S.; Khramtsov; Sokolov, F.

    2001-01-01

    In recent years, probabilistic methods for evaluating the influence of fuel geometry and technology parameters on fuel operational reliability have come into wide use. In the present work the START-3 procedure is used to calculate the thermal-physics and strength characteristics of WWER fuel rod behavior. The procedure is based on the Monte-Carlo method with the application of Sobol quasi-random sequences. This technique allows the fuel rod technological and operating parameters, as well as its strength and thermal-physics characteristics, to be treated as random variables. The work deals with a series of WWER-1000 fuel rod statistical tests and verification based on the PIE results. Preliminary calculations are also carried out to determine the design schema parameters; this should ensure the accuracy of the assessment of the distribution parameters of the WWER fuel rod characteristics. The probabilistic characteristics of fuel rod strength and thermal physics are assessed via statistical analysis of the results of the probabilistic calculations.

  18. Probabilistic Physics of Failure-based framework for fatigue life prediction of aircraft gas turbine discs under uncertainty

    International Nuclear Information System (INIS)

    Zhu, Shun-Peng; Huang, Hong-Zhong; Peng, Weiwen; Wang, Hai-Kun; Mahadevan, Sankaran

    2016-01-01

    A probabilistic Physics of Failure-based framework for fatigue life prediction of aircraft gas turbine discs operating under uncertainty is developed. The framework incorporates the overall uncertainties appearing in a structural integrity assessment. A comprehensive uncertainty quantification (UQ) procedure is presented to quantify multiple types of uncertainty using multiplicative and additive UQ methods. In addition, the factors that contribute the most to the resulting output uncertainty are investigated and identified for uncertainty reduction in decision-making. A high prediction accuracy of the proposed framework is validated through a comparison of model predictions to the experimental results of GH4133 superalloy and full-scale tests of aero engine high-pressure turbine discs. - Highlights: • A probabilistic PoF-based framework for fatigue life prediction is proposed. • A comprehensive procedure for quantifying multiple types of uncertainty is presented. • The factors that contribute most to the resulting output uncertainty are identified. • The proposed framework demonstrates high prediction accuracy by full-scale tests.

  19. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  20. Brain-inspired Stochastic Models and Implementations

    KAUST Repository

    Al-Shedivat, Maruan

    2015-05-12

    One of the approaches to building artificial intelligence (AI) is to decipher the principles of brain function and to employ similar mechanisms for solving cognitive tasks, such as visual perception or natural language understanding, using machines. The recent breakthrough, named deep learning, demonstrated that large multi-layer networks of artificial neural-like computing units attain remarkable performance on some of these tasks. Nevertheless, such artificial networks remain very loosely inspired by the brain, whose rich structures and mechanisms may further suggest new algorithms or even new paradigms of computation. In this thesis, we explore brain-inspired probabilistic mechanisms, such as neural and synaptic stochasticity, in the context of generative models. The two questions we ask here are: (i) what kind of models can describe a neural learning system built of stochastic components? and (ii) how can we implement such systems efficiently? To give specific answers, we consider two well known models and the corresponding neural architectures: the Naive Bayes model implemented with a winner-take-all spiking neural network and the Boltzmann machine implemented in a spiking or non-spiking fashion. We propose and analyze an efficient neuromorphic implementation of the stochastic neural firing mechanism and study the effects of synaptic unreliability on learning generative energy-based models implemented with neural networks.

  1. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  2. When science inspires art

    CERN Multimedia

    Anaïs Vernède

    2011-01-01

    On Tuesday 18 January 2011, artist Pipilotti Rist came to CERN to find out how science could provide her with a source of inspiration for her art and perhaps to get ideas for future work. Pipilotti, who is an eclectic artist always on the lookout for an original source of inspiration, is almost as passionate about physics as she is about art.   Ever Is Over All, 1997, audio video installation by Pipilotti Rist.  View of the installation at the National Museum for Foreign Art, Sofia, Bulgaria. © Pipilotti Rist. Courtesy the artist and Hauser & Wirth. Photo by Angel Tzvetanov. Swiss video-maker Pipilotti Rist (her real name is Elisabeth Charlotte Rist), who is well-known in the international art world for her highly colourful videos and creations, visited CERN for the first time on Tuesday 18 January 2011.  Her visit represented a trip down memory lane, since she originally studied physics before becoming interested in pursuing a career as an artist and going on to de...

  3. Probabilistic Design of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Toft, H.S.

    2010-01-01

    Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability. ... It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated ...

  4. Dynamic modeling of physical phenomena for probabilistic assessment of spent fuel accidents

    International Nuclear Information System (INIS)

    Benjamin, A.S.

    1997-01-01

    If there should be an accident involving drainage of all the water from a spent fuel pool, the fuel elements will heat up until the heat produced by radioactive decay is balanced by that removed by natural convection to air, thermal radiation, and other means. If the temperatures become high enough for the cladding or other materials to ignite due to rapid oxidation, then some of the fuel might melt, leading to an undesirable release of radioactive materials. The amount of melting is dependent upon the fuel loading configuration and its age, the oxidation and melting characteristics of the materials, and the potential effectiveness of recovery actions. The authors have developed methods for modeling the pertinent physical phenomena and integrating the results with a probabilistic treatment of the uncertainty distributions. The net result is a set of complementary cumulative distribution functions for the amount of fuel melted
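
    The heat balance described above can be sketched in a few lines: find the cladding temperature at which decay heat equals the heat removed by natural convection to air plus thermal radiation. All coefficients and areas below are illustrative assumptions, not values from the assessment.

    ```python
    # Minimal heat-balance sketch: equilibrium cladding temperature for a drained pool.
    SIGMA = 5.67e-8        # Stefan-Boltzmann constant, W/m^2/K^4
    EMISSIVITY = 0.8       # assumed cladding emissivity
    H_CONV = 5.0           # natural-convection coefficient, W/m^2/K (assumed)
    AREA = 0.1             # effective heat-transfer area per element, m^2 (assumed)
    T_AIR = 300.0          # ambient air temperature, K

    def net_heat(T, decay_power_w):
        """Decay heat minus heat removed at cladding temperature T (K)."""
        q_conv = H_CONV * AREA * (T - T_AIR)
        q_rad = EMISSIVITY * SIGMA * AREA * (T**4 - T_AIR**4)
        return decay_power_w - q_conv - q_rad

    def equilibrium_temperature(decay_power_w, lo=300.0, hi=3000.0):
        """Bisection on net_heat: positive below equilibrium, negative above."""
        for _ in range(60):
            mid = 0.5 * (lo + hi)
            if net_heat(mid, decay_power_w) > 0:
                lo = mid
            else:
                hi = mid
        return 0.5 * (lo + hi)

    # A freshly discharged element runs much hotter than an aged one.
    print(equilibrium_temperature(decay_power_w=500.0))   # K, recently discharged
    print(equilibrium_temperature(decay_power_w=50.0))    # K, older fuel
    ```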

  5. Abstract probabilistic CNOT gate model based on double encoding: study of the errors and physical realizability

    Science.gov (United States)

    Gueddana, Amor; Attia, Moez; Chatta, Rihab

    2015-03-01

    In this work, we study the error sources behind the imperfect linear optical quantum components composing a non-deterministic quantum CNOT gate model, which performs the CNOT function with a success probability of 4/27 and uses a double encoding technique to represent photonic qubits at the control and the target. We generalize this model to an abstract probabilistic CNOT version and determine the realizability limits depending on a realistic range of the errors. Finally, we discuss physical constraints allowing the implementation of the Asymmetric Partially Polarizing Beam Splitter (APPBS), which is at the heart of correctly realizing the CNOT function.

  6. Probabilistic Design and Management of Sustainable Concrete Infrastructure Using Multi-Physics Service Life Models

    DEFF Research Database (Denmark)

    Lepech, Michael; Geiker, Mette; Michel, Alexander

    This paper looks to address the grand challenge of integrating construction materials engineering research within a multi-scale, inter-disciplinary research and management framework for sustainable concrete infrastructure. The ultimate goal is to drive sustainability-focused innovation and adoption cycles in the broader architecture, engineering, construction (AEC) industry. Specifically, a probabilistic design framework for sustainable concrete infrastructure and a multi-physics service life model for reinforced concrete are presented as important points of integration for innovation between ... design, consists of concrete service life models and life cycle assessment (LCA) models. Both types of models (service life and LCA) are formulated stochastically so that the service life and time(s) to repair, as well as total sustainability impact, are described by a probability distribution. A central...

  7. Combining Bio-inspired Sensing with Bio-inspired Locomotion

    DEFF Research Database (Denmark)

    Shaikh, Danish; Hallam, John; Christensen-Dalsgaard, Jakob

    In this paper we present a preliminary Braitenberg vehicle–like approach to combine bio-inspired audition with bio-inspired quadruped locomotion in simulation. Locomotion gaits of the salamander–like robot Salamandra robotica are modified by a lizard's peripheral auditory system model that modula...

  8. Formulation of probabilistic models of protein structure in atomic detail using the reference ratio method

    DEFF Research Database (Denmark)

    Valentin, Jan B.; Andreetta, Christian; Boomsma, Wouter

    2014-01-01

    We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length s... The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications. © 2013 Wiley Periodicals, Inc.

  9. Mathematical simulation of cascade-probabilistic functions for charged particles

    International Nuclear Information System (INIS)

    Kupchishin, A.A.; Kupchishin, A.I.; Smygaleva, T.A.

    1998-01-01

    Analytical expressions for cascade-probabilistic functions (CPFs) for electrons, protons, α-particles and ions, taking energy losses into account, are obtained. A mathematical analysis of these functions is carried out and their main properties are determined. Algorithms for the CPFs are developed and computer calculations are carried out. Regularities in the behavior of the functions in dependence on the initial particle energy, atomic number and registration depth are established. The book is intended for specialists in mathematical simulation of radiation defects, solid state physics, elementary particle physics and applied mathematics. There are 3 chapters in the book: 1. Cascade-probabilistic functions for electrons; 2. CPFs for protons and α-particles; 3. CPFs taking into account the energy losses of ions. (author)

  10. Dynamic modeling of physical phenomena for probabilistic risk assessments using artificial neural networks

    International Nuclear Information System (INIS)

    Benjamin, A.S.; Paez, T.L.; Brown, N.N.

    1998-01-01

    In most probabilistic risk assessments, there is a subset of accident scenarios that involves physical challenges to the system, such as high heat rates and/or accelerations. The system's responses to these challenges may be complicated, and their prediction may require the use of long-running computer codes. To deal with the many scenarios demanded by a risk assessment, the authors have been investigating the use of artificial neural networks (ANNs) as a fast-running estimation tool. They have developed a multivariate linear spline algorithm by extending previous ANN methods that use radial basis functions. They have applied the algorithm to problems involving fires, shocks, and vibrations. They have found that within the parameter range for which it is trained, the algorithm can simulate the nonlinear responses of complex systems with high accuracy. Running times per case are less than one second
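
    As a rough illustration of the surrogate idea summarized above (training a fast approximator on a limited number of runs of a long-running physics code, then querying it across the many scenarios a risk assessment demands), the following sketch fits a plain linear-kernel radial-basis-function interpolant. It is not the authors' multivariate linear spline algorithm; the "thermal code" and its parameter ranges are hypothetical.

```python
import numpy as np

def expensive_thermal_code(x):
    # Stand-in for a long-running physics code: peak temperature as a
    # nonlinear function of (heat rate, exposure time). Hypothetical.
    q, t = x
    return 300.0 + 40.0 * np.sqrt(q * t) + 5.0 * np.sin(q)

# Training set: a modest number of runs of the slow code.
rng = np.random.default_rng(0)
X_train = rng.uniform([1.0, 10.0], [10.0, 100.0], size=(40, 2))
y_train = np.array([expensive_thermal_code(x) for x in X_train])

def fit_linear_rbf(X, y):
    """Fit an interpolant y(x) = sum_i w_i * ||x - x_i|| (linear RBF kernel)."""
    r = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    return np.linalg.solve(r + 1e-9 * np.eye(len(X)), y)

def predict_linear_rbf(X, w, X_query):
    r = np.linalg.norm(X_query[:, None, :] - X[None, :, :], axis=-1)
    return r @ w

w = fit_linear_rbf(X_train, y_train)

# Fast surrogate evaluation over the many scenarios a PRA demands.
X_query = rng.uniform([1.0, 10.0], [10.0, 100.0], size=(5, 2))
approx = predict_linear_rbf(X_train, w, X_query)
exact = np.array([expensive_thermal_code(x) for x in X_query])
print(np.c_[exact, approx])
```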

  11. INSPIRE: A new scientific information system for HEP

    International Nuclear Information System (INIS)

    Ivanov, R; Raae, L

    2010-01-01

    The status of high-energy physics (HEP) information systems has been jointly analyzed by the libraries of CERN, DESY, Fermilab and SLAC. As a result, the four laboratories have started the INSPIRE project - a new platform built by moving the successful SPIRES features and content, curated at DESY, Fermilab and SLAC, into the open-source CDS Invenio digital library software that was developed at CERN. INSPIRE will integrate current acquisition workflows and databases to host the entire body of the HEP literature (about one million records), aiming to become the reference HEP scientific information platform worldwide. It will provide users with fast access to full text journal articles and preprints, but also material such as conference slides and multimedia. INSPIRE will empower scientists with new tools to discover and access the results most relevant to their research, enable novel text- and data-mining applications, and deploy new metrics to assess the impact of articles and authors. In addition, it will introduce the 'Web 2.0' paradigm of user-enriched content in the domain of sciences, with community-based approaches to scientific publishing. INSPIRE represents a natural evolution of scholarly communication built on successful community-based information systems, and it provides a vision for information management in other fields of science. Inspired by the needs of HEP, we hope that the INSPIRE project will be inspiring for other communities.

  12. Statistical physics of medical diagnostics: Study of a probabilistic model.

    Science.gov (United States)

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.
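
    The sketch below illustrates the kind of posterior the record targets: disease marginals given a subset of observed signs in a small sign–disease model. For clarity it enumerates all disease configurations exactly (the mean-field machinery of the paper approximates this computation when the model is too large to enumerate); the noisy-OR parameterization and all numbers are invented for illustration.

```python
import itertools
import numpy as np

# Toy sign-disease model (hypothetical numbers, not from the paper).
# Diseases are independent a priori; each observed sign is produced by a
# noisy-OR over the diseases that can cause it.
prior = np.array([0.05, 0.10, 0.02])          # P(disease_j = 1)
link = np.array([[0.8, 0.0, 0.3],             # P(sign_i caused by disease_j)
                 [0.1, 0.7, 0.0],
                 [0.0, 0.2, 0.6]])
leak = 0.01                                    # background probability of each sign
observed = {0: 1, 2: 0}                        # sign 0 present, sign 2 absent, sign 1 unseen

def sign_prob(d):
    """P(sign_i = 1 | diseases d) under a noisy-OR model."""
    return 1.0 - (1.0 - leak) * np.prod(1.0 - link * d, axis=1)

# Exact posterior by enumerating all disease configurations.
post = np.zeros(len(prior))
Z = 0.0
for d in itertools.product([0, 1], repeat=len(prior)):
    d = np.array(d)
    p = np.prod(np.where(d == 1, prior, 1.0 - prior))
    s = sign_prob(d)
    for i, v in observed.items():
        p *= s[i] if v == 1 else 1.0 - s[i]
    Z += p
    post += p * d
print("posterior disease probabilities:", post / Z)
```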

  13. Statistical physics of medical diagnostics: Study of a probabilistic model

    Science.gov (United States)

    Mashaghi, Alireza; Ramezanpour, Abolfazl

    2018-03-01

    We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.

  14. A survey of snake-inspired robot designs

    International Nuclear Information System (INIS)

    Hopkins, James K; Spranklin, Brent W; Gupta, Satyandra K

    2009-01-01

    Body undulation used by snakes and the physical architecture of a snake body may offer significant benefits over typical legged or wheeled locomotion designs in certain types of scenarios. A large number of research groups have developed snake-inspired robots to exploit these benefits. The purpose of this review is to report different types of snake-inspired robot designs and categorize them based on their main characteristics. For each category, we discuss their relative advantages and disadvantages. This review will assist in familiarizing a newcomer to the field with the existing designs and their distinguishing features. We hope that by studying existing robots, future designers will be able to create new designs by adopting features from successful robots. The review also summarizes the design challenges associated with the further advancement of the field and deploying snake-inspired robots in practice. (topical review)

  15. An Enhanced Artificial Bee Colony Algorithm with Solution Acceptance Rule and Probabilistic Multisearch.

    Science.gov (United States)

    Yurtkuran, Alkın; Emel, Erdal

    2016-01-01

    The artificial bee colony (ABC) algorithm is a popular swarm-based technique inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA) to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between old solution and new candidate solution, worse candidate solutions have a probability to be accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process adaptively. Moreover, in order to improve the performance of the ABC and balance the intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinct characteristics are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.
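
    The following sketch shows only the solution acceptance idea described above, in which a worse candidate is accepted with a probability that decays nonlinearly over the search, wrapped around a bare-bones single-solution search on a benchmark function. It is not the full ABC-SA algorithm (no employed/onlooker/scout phases, no multisearch), and the quadratic decay schedule and p0 value are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):
    return float(np.sum(x**2))  # benchmark objective

def accept(f_new, f_old, it, max_it, p0=0.3):
    """Accept better solutions always; accept worse ones with a probability
    that decays nonlinearly (quadratically here, an assumed schedule)."""
    if f_new <= f_old:
        return True
    p_worse = p0 * (1.0 - it / max_it) ** 2
    return rng.random() < p_worse

dim, max_it = 5, 2000
x = rng.uniform(-5, 5, dim)
fx = sphere(x)
for it in range(max_it):
    # Perturb one randomly chosen coordinate with a random step
    # (a single solution is kept for brevity; the full ABC uses a
    # population of food sources and partner solutions).
    j = rng.integers(dim)
    candidate = x.copy()
    candidate[j] += rng.uniform(-1, 1) * rng.uniform(-5, 5)
    fc = sphere(candidate)
    if accept(fc, fx, it, max_it):
        x, fx = candidate, fc
print("best objective:", fx)
```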

  16. An Enhanced Artificial Bee Colony Algorithm with Solution Acceptance Rule and Probabilistic Multisearch

    Directory of Open Access Journals (Sweden)

    Alkın Yurtkuran

    2016-01-01

    The artificial bee colony (ABC) algorithm is a popular swarm-based technique inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA) to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between old solution and new candidate solution, worse candidate solutions have a probability to be accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process adaptively. Moreover, in order to improve the performance of the ABC and balance the intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinct characteristics are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.

  17. Inspired Responses

    Science.gov (United States)

    Steele, Carol Frederick

    2011-01-01

    In terms of teacher quality, Steele believes the best teachers have reached a stage she terms inspired, and that teachers move progressively through the stages of unaware, aware, and capable until the most reflective teachers finally reach the inspired level. Inspired teachers have a wide repertoire of teaching and class management techniques and…

  18. The enigma of probability and physics

    International Nuclear Information System (INIS)

    Mayants, L.

    1984-01-01

    This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)

  19. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  20. Biology-inspired AMO physics

    Science.gov (United States)

    Mathur, Deepak

    2015-01-01

    This Topical Review presents an overview of increasingly robust interconnects that are being established between atomic, molecular and optical (AMO) physics and the life sciences. AMO physics, outgrowing its historical role as a facilitator—a provider of optical methodologies, for instance—now seeks to partner biology in its quest to link systems-level descriptions of biological entities to insights based on molecular processes. Of course, perspectives differ when AMO physicists and biologists consider various processes. For instance, while AMO physicists link molecular properties and dynamics to potential energy surfaces, these have to give way to energy landscapes in considerations of protein dynamics. But there are similarities also: tunnelling and non-adiabatic transitions occur both in protein dynamics and in molecular dynamics. We bring to the fore some such differences and similarities; we consider imaging techniques based on AMO concepts, like 4D fluorescence microscopy which allows access to the dynamics of cellular processes, multiphoton microscopy which offers a built-in confocality, and microscopy with femtosecond laser beams to saturate the suppression of fluorescence in spatially controlled fashion so as to circumvent the diffraction limit. Beyond imaging, AMO physics contributes with optical traps that probe the mechanical and dynamical properties of single ‘live’ cells, highlighting differences between healthy and diseased cells. Trap methodologies have also begun to probe the dynamics governing neural stem cells adhering to each other to form neurospheres and, with squeezed light, to probe sub-diffusive motion of yeast cells. Strong field science contributes not only by providing a source of energetic electrons and γ-rays via laser-plasma acceleration schemes, but also via filamentation and supercontinuum generation, bringing mainstream collision physics into play in diverse processes like DNA damage induced by low-energy collisions to ...

  1. Wind effects on long-span bridges: Probabilistic wind data format for buffeting and VIV load assessments

    Science.gov (United States)

    Hoffmann, K.; Srouji, R. G.; Hansen, S. O.

    2017-12-01

    The technology development within the structural design of long-span bridges in Norwegian fjords has created a need for reformulating the calculation format and the physical quantities used to describe the properties of wind and the associated wind-induced effects on bridge decks. Parts of a new probabilistic format describing the incoming, undisturbed wind are presented. It is expected that a fixed probabilistic format will facilitate a more physically consistent and precise description of the wind conditions, which in turn increases the accuracy and considerably reduces uncertainties in wind load assessments. Because the format is probabilistic, a quantification of the level of safety and uncertainty in predicted wind loads is readily accessible. A simple buffeting response calculation demonstrates the use of probabilistic wind data in the assessment of wind loads and responses. Furthermore, vortex-induced fatigue damage is discussed in relation to probabilistic wind turbulence data and response measurements from wind tunnel tests.

  2. Probabilistic-Stochastic Model of Distribution of Physical and Mechanical Properties of Soft Mineral Rocks

    Directory of Open Access Journals (Sweden)

    O.O. Sdvizhkova

    2017-12-01

    The physical and mechanical characteristics of soils and soft rocks obtained as a result of laboratory tests are important initial parameters for assessing the stability of natural and artificial slopes. Such properties of rocks as cohesion and the angle of internal friction are due to the influence of a number of natural and technogenic factors. At the same time, from the set of factors influencing the stability of the slope, the most significant ones are singled out, which to a greater extent determine the properties of the rocks. The more factors are taken into account in the geotechnical model, the more closely the properties of the rocks are studied, which increases the accuracy of the scientific forecast of the landslide danger of the slope. On the other hand, an increase in the number of factors involved in the model complicates it and causes a decrease in the reliability of geotechnical calculations. The aim of the work is to construct a statistical distribution of the studied physical and mechanical properties of soft rocks and to substantiate a probabilistic statistical model. Based on the results of laboratory tests of rocks, the statistical distributions of the quantitative traits studied, the angle of internal friction φ and the cohesion, were constructed. It was established that the statistical distribution of the physical and mechanical properties of rocks is close to a uniform law.
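
    A minimal sketch of checking a sample of laboratory values against a uniform law, here with a Kolmogorov–Smirnov test on synthetic friction-angle data; the numbers and ranges are invented and not taken from the paper.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Synthetic laboratory results for the internal friction angle (degrees);
# the range is hypothetical, not taken from the paper.
phi = rng.uniform(14.0, 24.0, size=60) + rng.normal(0, 0.3, size=60)

# Fit a uniform law on [loc, loc + scale] and test the fit with a KS test.
loc, scale = phi.min(), phi.max() - phi.min()
stat, p_value = stats.kstest(phi, "uniform", args=(loc, scale))
print(f"KS statistic = {stat:.3f}, p-value = {p_value:.3f}")
# A large p-value means the uniform-law hypothesis is not rejected.
```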

  3. Biology-inspired AMO physics

    International Nuclear Information System (INIS)

    Mathur, Deepak

    2015-01-01

    This Topical Review presents an overview of increasingly robust interconnects that are being established between atomic, molecular and optical (AMO) physics and the life sciences. AMO physics, outgrowing its historical role as a facilitator—a provider of optical methodologies, for instance—now seeks to partner biology in its quest to link systems-level descriptions of biological entities to insights based on molecular processes. Of course, perspectives differ when AMO physicists and biologists consider various processes. For instance, while AMO physicists link molecular properties and dynamics to potential energy surfaces, these have to give way to energy landscapes in considerations of protein dynamics. But there are similarities also: tunnelling and non-adiabatic transitions occur both in protein dynamics and in molecular dynamics. We bring to the fore some such differences and similarities; we consider imaging techniques based on AMO concepts, like 4D fluorescence microscopy which allows access to the dynamics of cellular processes, multiphoton microscopy which offers a built-in confocality, and microscopy with femtosecond laser beams to saturate the suppression of fluorescence in spatially controlled fashion so as to circumvent the diffraction limit. Beyond imaging, AMO physics contributes with optical traps that probe the mechanical and dynamical properties of single ‘live’ cells, highlighting differences between healthy and diseased cells. Trap methodologies have also begun to probe the dynamics governing of neural stem cells adhering to each other to form neurospheres and, with squeezed light to probe sub-diffusive motion of yeast cells. Strong field science contributes not only by providing a source of energetic electrons and γ-rays via laser-plasma accelerations schemes, but also via filamentation and supercontinuum generation, enabling mainstream collision physics into play in diverse processes like DNA damage induced by low-energy collisions to

  4. A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

    Science.gov (United States)

    Zhang, Shaojie; Zhao, Luqiang; Delgado-Tellez, Ricardo; Bao, Hongjun

    2018-03-01

    Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle which are affected by a high degree of uncertainty especially at a regional scale, resulting in unacceptable uncertainties of Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations which are integrated in a unique parameter. This parameter links the landslide probability to the uncertainties of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfalls on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides in 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. Such testing results indicate that the new model can be operated in a highly efficient way and show more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can be potentially packaged into a forecasting system for shallow landslides providing technological support for the mitigation of these disasters at regional scale.
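
    A minimal sketch of the Monte Carlo idea described above: sample cohesion and friction angle from their uncertainty intervals, evaluate a safety factor for each draw, and report the fraction of draws with Fs < 1 as the landslide probability for a pixel. A standard infinite-slope formula is used as a stand-in for the paper's slope-stability model, and all parameter values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

def infinite_slope_fs(c, phi_deg, slope_deg, gamma=19.0, depth=2.0, m=0.5,
                      gamma_w=9.81):
    """Standard infinite-slope safety factor (a common stand-in, not
    necessarily the exact formulation used in the paper)."""
    beta = np.radians(slope_deg)
    phi = np.radians(phi_deg)
    shear = gamma * depth * np.sin(beta) * np.cos(beta)
    resisting = (gamma - m * gamma_w) * depth * np.cos(beta) ** 2 * np.tan(phi)
    return (c + resisting) / shear

def landslide_probability(slope_deg, c_range, phi_range, n=10_000):
    """P(Fs < 1) for one pixel, with soil parameters sampled uniformly
    inside their uncertainty intervals (Monte Carlo)."""
    c = rng.uniform(*c_range, n)
    phi = rng.uniform(*phi_range, n)
    fs = infinite_slope_fs(c, phi, slope_deg)
    return np.mean(fs < 1.0)

# Hypothetical pixel: 35 degree slope, cohesion 5-12 kPa, friction angle 20-30 degrees.
print("landslide probability:", landslide_probability(35.0, (5.0, 12.0), (20.0, 30.0)))
```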

  5. EAP artificial muscle actuators for bio-inspired intelligent social robotics (Conference Presentation)

    Science.gov (United States)

    Hanson, David F.

    2017-04-01

    Bio-inspired intelligent robots are coming of age in both research and industry, propelling market growth for robots and A.I. However, conventional motors limit bio-inspired robotics. EAP actuators and sensors could improve simplicity, compliance, and physical scaling, and offer bio-inspired advantages in robotic locomotion, grasping and manipulation, and social expressions. For EAP actuators to realize their transformative potential, further innovations are needed: the actuators must be robust, fast, powerful, manufacturable, and affordable. This presentation surveys progress, opportunities, and challenges in the author's latest work in social robots and EAP actuators, and proposes a roadmap for EAP actuators in bio-inspired intelligent robotics.

  6. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    Science.gov (United States)

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task specific operations.

  7. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are expected to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material properties perturbations, different loadings and a probabilistic combination of these loadings. Influences of these probabilistic variables are planned to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special purpose code which uses the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have significant influence on the response.

  8. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  9. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  10. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  11. Retina-Inspired Filter.

    Science.gov (United States)

    Doutsi, Effrosyni; Fillatre, Lionel; Antonini, Marc; Gaulmin, Julien

    2018-07-01

    This paper introduces a novel filter, which is inspired by the human retina. The human retina consists of three different layers: the outer plexiform layer (OPL), the inner plexiform layer, and the ganglionic layer. Our inspiration is the linear transform which takes place in the OPL and has been mathematically described by the neuroscientific model "virtual retina." This model is the cornerstone to derive the non-separable spatio-temporal OPL retina-inspired filter, briefly renamed retina-inspired filter, studied in this paper. This filter is connected to the dynamic behavior of the retina, which enables the retina to increase the sharpness of the visual stimulus during filtering before its transmission to the brain. We establish that this retina-inspired transform forms a group of spatio-temporal Weighted Difference of Gaussian (WDoG) filters when it is applied to a still image visible for a given time. We analyze the spatial frequency bandwidth of the retina-inspired filter with respect to time. It is shown that the WDoG spectrum varies from a lowpass filter to a bandpass filter. Therefore, while time increases, the retina-inspired filter enables the extraction of different kinds of information from the input image. Finally, we discuss the benefits of using the retina-inspired filter in image processing applications such as edge detection and compression.
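
    The sketch below captures the weighted-difference-of-Gaussians behaviour described above in a deliberately simplified form: a centre and a surround Gaussian whose weights rise at different rates with time, so the output drifts from lowpass-like to bandpass-like. The weight law and time constants are assumptions, much simpler than the full virtual-retina OPL model.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def wdog_filter(image, t, sigma_c=1.0, sigma_s=3.0, tau_c=0.1, tau_s=0.4):
    """Weighted Difference of Gaussians: centre minus surround, with weights
    that grow at different rates over time t (an assumed parameterization,
    simpler than the full virtual-retina OPL model)."""
    w_center = 1.0 - np.exp(-t / tau_c)     # fast-rising centre weight
    w_surround = 1.0 - np.exp(-t / tau_s)   # slow-rising surround weight
    center = gaussian_filter(image, sigma_c)
    surround = gaussian_filter(image, sigma_s)
    return w_center * center - w_surround * surround

rng = np.random.default_rng(4)
img = rng.random((64, 64))

early = wdog_filter(img, t=0.05)   # surround barely active: lowpass-like output
late = wdog_filter(img, t=1.0)     # both active: bandpass-like (edge-enhancing) output
print(early.std(), late.std())
```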

  12. Agent autonomy approach to probabilistic physics-of-failure modeling of complex dynamic systems with interacting failure mechanisms

    Science.gov (United States)

    Gromek, Katherine Emily

    A novel computational and inference framework of the physics-of-failure (PoF) reliability modeling for complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that the system level reliability modeling would constitute inferences from checking the status of component level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. Contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of the PoF-based system reliability modeling, new approaches to the learning and the autonomy properties of the intelligent agents, and modeling interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as the agents' ability to self-activate, deactivate or completely redefine their role in the analysis. This property of agents and the ability to model interacting failure mechanisms of the system elements make the agent autonomy approach fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.

  13. Formulation of probabilistic models of protein structure in atomic detail using the reference ratio method.

    Science.gov (United States)

    Valentin, Jan B; Andreetta, Christian; Boomsma, Wouter; Bottaro, Sandro; Ferkinghoff-Borg, Jesper; Frellsen, Jes; Mardia, Kanti V; Tian, Pengfei; Hamelryck, Thomas

    2014-02-01

    We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length scale, which concern the dihedral angles in main chain and side chains, respectively. Conceptually, this constitutes a probabilistic and continuous alternative to the use of discrete fragment and rotamer libraries. The local model is combined with a nonlocal model that involves a small number of energy terms according to a physical force field, and some information on the overall secondary structure content. In this initial study we focus on the formulation of the joint model and the evaluation of the use of an energy vector as a descriptor of a protein's nonlocal structure; hence, we derive the parameters of the nonlocal model from the native structure without loss of generality. The local and nonlocal models are combined using the reference ratio method, which is a well-justified probabilistic construction. For evaluation, we use the resulting joint models to predict the structure of four proteins. The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications. Copyright © 2013 Wiley Periodicals, Inc.
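
    In the form usually given for the reference ratio method, a fine-grained local model p_L(x) is corrected by a model q(y) of a coarse-grained, nonlocal descriptor y(x) (here, the energy terms and secondary structure content); the notation below is ours, not the authors'.

```latex
% Reference ratio construction (notation ours, not the authors'):
%   p_L(x)  -- local model over atomic detail x
%   q(y)    -- model of the coarse-grained, nonlocal descriptor y(x)
%   p_L(y)  -- the local model's own (implied) distribution over y
p(x) \;=\; \frac{q\bigl(y(x)\bigr)}{p_L\bigl(y(x)\bigr)}\, p_L(x),
\qquad\text{so that}\qquad
\int_{\{x \,:\, y(x)=y\}} p(x)\,\mathrm{d}x \;=\; q(y).
```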

  14. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a
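
    A toy Monte Carlo version of such an integrated assessment samples, for each simulated individual, both an exposure and an individual critical effect dose, and reports the probability that exposure exceeds that dose; all distributions and numbers below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000

# Hypothetical distributions (illustrative only):
# exposure of a random individual (mg/kg bw/day), lognormal across the population
exposure = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=n)
# individual critical effect dose from probabilistic hazard characterization,
# combining a benchmark-dose uncertainty and inter-individual sensitivity
bmd = rng.lognormal(mean=np.log(5.0), sigma=0.4, size=n)        # population-average critical dose
sensitivity = rng.lognormal(mean=0.0, sigma=0.6, size=n)        # individual deviation factor
critical_dose = bmd / sensitivity

# Integrated probabilistic risk: fraction of individuals whose exposure
# exceeds their own critical effect dose.
risk = np.mean(exposure > critical_dose)
print(f"P(exposure > individual critical dose) = {risk:.2e}")
```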

  15. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  16. The physical world an inspirational tour of fundamental physics

    CERN Document Server

    Manton, Nicholas

    2017-01-01

    The Physical World offers a grand vision of the essential unity of physics that will enable the reader to see the world through the eyes of a physicist and understand their thinking. The text follows Einstein's dictum that 'explanations should be made as simple as possible, but no simpler', to give an honest account of how modern physicists understand their subject, including the shortcomings of current theory. The result is an up-to-date and engaging portrait of physics that contains concise derivations of the important results in a style where every step in a derivation is clearly explained, so that anyone with the appropriate mathematical skills will find the text easy to digest. It is over half a century since The Feynman Lectures in Physics were published. A new authoritative account of fundamental physics covering all branches of the subject is now well overdue. The Physical World has been written to satisfy this need. The book concentrates on the conceptual principles of each branch of physics and sho...

  17. Applying a probabilistic seismic-petrophysical inversion and two different rock-physics models for reservoir characterization in offshore Nile Delta

    Science.gov (United States)

    Aleardi, Mattia

    2018-01-01

    We apply a two-step probabilistic seismic-petrophysical inversion for the characterization of a clastic, gas-saturated, reservoir located in offshore Nile Delta. In particular, we discuss and compare the results obtained when two different rock-physics models (RPMs) are employed in the inversion. The first RPM is an empirical, linear model directly derived from the available well log data by means of an optimization procedure. The second RPM is a theoretical, non-linear model based on the Hertz-Mindlin contact theory. The first step of the inversion procedure is a Bayesian linearized amplitude versus angle (AVA) inversion in which the elastic properties, and the associated uncertainties, are inferred from pre-stack seismic data. The estimated elastic properties constitute the input to the second step that is a probabilistic petrophysical inversion in which we account for the noise contaminating the recorded seismic data and the uncertainties affecting both the derived rock-physics models and the estimated elastic parameters. In particular, a Gaussian mixture a-priori distribution is used to properly take into account the facies-dependent behavior of petrophysical properties, related to the different fluid and rock properties of the different litho-fluid classes. In the synthetic and in the field data tests, the very minor differences between the results obtained by employing the two RPMs, and the good match between the estimated properties and well log information, confirm the applicability of the inversion approach and the suitability of the two different RPMs for reservoir characterization in the investigated area.

  18. Towards quantum gravity: a framework for probabilistic theories with non-fixed causal structure

    International Nuclear Information System (INIS)

    Hardy, Lucien

    2007-01-01

    General relativity is a deterministic theory with non-fixed causal structure. Quantum theory is a probabilistic theory with fixed causal structure. In this paper, we build a framework for probabilistic theories with non-fixed causal structure. This combines the radical elements of general relativity and quantum theory. We adopt an operational methodology for the purposes of theory construction (though without committing to operationalism as a fundamental philosophy). The key idea in the construction is physical compression. A physical theory relates quantities. Thus, if we specify a sufficiently large set of quantities (this is the compressed set), we can calculate all the others. We apply three levels of physical compression. First, we apply it locally to quantities (actually probabilities) that might be measured in a particular region of spacetime. Then we consider composite regions. We find that there is a second level of physical compression for a composite region over and above the first level physical compression for the component regions. Each application of first and second level physical compression is quantified by a matrix. We find that these matrices themselves are related by the physical theory and can therefore be subject to compression. This is the third level of physical compression. The third level of physical compression gives rise to a new mathematical object which we call the causaloid. From the causaloid for a particular physical theory we can calculate everything the physical theory can calculate. This approach allows us to set up a framework for calculating probabilistic correlations in data without imposing a fixed causal structure (such as a background time). We show how to put quantum theory in this framework (thus providing a new formulation of this theory). We indicate how general relativity might be put into this framework and how the framework might be used to construct a theory of quantum gravity

  19. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  20. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  1. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
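
    The sketch below walks through the core arithmetic in Fellegi–Sunter style: per-field agreement weights log2(m/u) are summed into a total match weight and converted to a posterior match probability with Bayes' theorem. The m- and u-probabilities and the prior are hypothetical.

```python
import numpy as np

# Hypothetical m- and u-probabilities for three identifiers:
# m = P(fields agree | records truly match), u = P(agree | non-match).
fields = {
    "surname":       {"m": 0.95, "u": 0.01},
    "date_of_birth": {"m": 0.98, "u": 0.003},
    "postcode":      {"m": 0.90, "u": 0.05},
}

def match_weight(agreement):
    """Sum of log2 likelihood ratios over fields (Fellegi-Sunter style)."""
    w = 0.0
    for name, p in fields.items():
        if agreement[name]:
            w += np.log2(p["m"] / p["u"])
        else:
            w += np.log2((1 - p["m"]) / (1 - p["u"]))
    return w

def posterior_match_probability(weight, prior=1e-4):
    """Convert a total match weight to P(match | evidence) via Bayes' theorem."""
    prior_odds = prior / (1 - prior)
    posterior_odds = prior_odds * 2.0 ** weight
    return posterior_odds / (1 + posterior_odds)

# Candidate pair agreeing on surname and date of birth but not postcode.
agreement = {"surname": True, "date_of_birth": True, "postcode": False}
w = match_weight(agreement)
print(f"weight = {w:.2f}, P(match) = {posterior_match_probability(w):.4f}")
```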

  2. Probabilistic sharing solves the problem of costly punishment

    Science.gov (United States)

    Chen, Xiaojie; Szolnoki, Attila; Perc, Matjaž

    2014-08-01

    Cooperators that refuse to participate in sanctioning defectors create the second-order free-rider problem. Such cooperators will not be punished because they contribute to the public good, but they also eschew the costs associated with punishing defectors. Altruistic punishers—those that cooperate and punish—are at a disadvantage, and it is puzzling how such behaviour has evolved. We show that sharing the responsibility to sanction defectors rather than relying on certain individuals to do so permanently can solve the problem of costly punishment. Inspired by the fact that humans have strong but also emotional tendencies for fair play, we consider probabilistic sanctioning as the simplest way of distributing the duty. In well-mixed populations the public goods game is transformed into a coordination game with full cooperation and defection as the two stable equilibria, while in structured populations pattern formation supports additional counterintuitive solutions that are reminiscent of Parrondo's paradox.

  3. Probabilistic interpretation of the reduction criterion for entanglement

    International Nuclear Information System (INIS)

    Zhang, Zhengmin; Luo, Shunlong

    2007-01-01

    Inspired by the idea of conditional probabilities, we introduce a variant of conditional density operators. But unlike the conditional probabilities which are bounded by 1, the conditional density operators may have eigenvalues exceeding 1 for entangled states. This has the consequence that although any bivariate classical probability distribution has a natural separable decomposition in terms of conditional probabilities, we do not have a quantum analogue of this separable decomposition in general. The 'nonclassical' eigenvalues of conditional density operators are indications of entanglement. The resulting separability criterion turns out to be equivalent to the reduction criterion introduced by Horodecki [Phys. Rev. A 59, 4206 (1999)] and Cerf et al. [Phys. Rev. A 60, 898 (1999)]. This supplies an intuitive probabilistic interpretation for the reduction criterion. The conditional density operators are also used to define a form of quantum conditional entropy which provides an alternative mechanism to reveal quantum discord
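
    The equivalence with the reduction criterion can be checked numerically: for a two-qubit state, entanglement is flagged whenever rho_A (x) I - rho_AB has a negative eigenvalue. The sketch below applies this check to a Werner state, which should be flagged exactly for mixing parameter p > 1/3; this is a generic check, not code from the paper.

```python
import numpy as np

def werner_state(p):
    """Two-qubit Werner state: p |psi-><psi-| + (1 - p) I/4."""
    psi_minus = np.array([0, 1, -1, 0]) / np.sqrt(2)
    return p * np.outer(psi_minus, psi_minus) + (1 - p) * np.eye(4) / 4

def violates_reduction_criterion(rho):
    """Entanglement flag: rho_A (x) I - rho has a negative eigenvalue."""
    rho_a = rho.reshape(2, 2, 2, 2).trace(axis1=1, axis2=3)  # partial trace over B
    operator = np.kron(rho_a, np.eye(2)) - rho
    return np.min(np.linalg.eigvalsh(operator)) < -1e-12

for p in (0.2, 0.5, 0.9):
    print(p, violates_reduction_criterion(werner_state(p)))
```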

  4. Evolving Understanding of Antarctic Ice-Sheet Physics and Ambiguity in Probabilistic Sea-Level Projections

    Science.gov (United States)

    Kopp, Robert E.; DeConto, Robert M.; Bader, Daniel A.; Hay, Carling C.; Horton, Radley M.; Kulp, Scott; Oppenheimer, Michael; Pollard, David; Strauss, Benjamin H.

    2017-12-01

    Mechanisms such as ice-shelf hydrofracturing and ice-cliff collapse may rapidly increase discharge from marine-based ice sheets. Here, we link a probabilistic framework for sea-level projections to a small ensemble of Antarctic ice-sheet (AIS) simulations incorporating these physical processes to explore their influence on global-mean sea-level (GMSL) and relative sea-level (RSL). We compare the new projections to past results using expert assessment and structured expert elicitation about AIS changes. Under high greenhouse gas emissions (Representative Concentration Pathway [RCP] 8.5), median projected 21st century GMSL rise increases from 79 to 146 cm. Without protective measures, revised median RSL projections would by 2100 submerge land currently home to 153 million people, an increase of 44 million. The use of a physical model, rather than simple parameterizations assuming constant acceleration of ice loss, increases forcing sensitivity: overlap between the central 90% of simulations for 2100 for RCP 8.5 (93-243 cm) and RCP 2.6 (26-98 cm) is minimal. By 2300, the gap between median GMSL estimates for RCP 8.5 and RCP 2.6 reaches >10 m, with median RSL projections for RCP 8.5 jeopardizing land now occupied by 950 million people (versus 167 million for RCP 2.6). The minimal correlation between the contribution of AIS to GMSL by 2050 and that in 2100 and beyond implies current sea-level observations cannot exclude future extreme outcomes. The sensitivity of post-2050 projections to deeply uncertain physics highlights the need for robust decision and adaptive management frameworks.

  5. Evolving Understanding of Antarctic Ice-Sheet Physics and Ambiguity in Probabilistic Sea-Level Projections

    Science.gov (United States)

    Kopp, Robert E.; DeConto, Robert M.; Bader, Daniel A.; Hay, Carling C.; Horton, Radley M.; Kulp, Scott; Oppenheimer, Michael; Pollard, David; Strauss, Benjamin

    2017-01-01

    Mechanisms such as ice-shelf hydrofracturing and ice-cliff collapse may rapidly increase discharge from marine-based ice sheets. Here, we link a probabilistic framework for sea-level projections to a small ensemble of Antarctic ice-sheet (AIS) simulations incorporating these physical processes to explore their influence on global-mean sea-level (GMSL) and relative sea-level (RSL). We compare the new projections to past results using expert assessment and structured expert elicitation about AIS changes. Under high greenhouse gas emissions (Representative Concentration Pathway [RCP] 8.5), median projected 21st century GMSL rise increases from 79 to 146 cm. Without protective measures, revised median RSL projections would by 2100 submerge land currently home to 153 million people, an increase of 44 million. The use of a physical model, rather than simple parameterizations assuming constant acceleration of ice loss, increases forcing sensitivity: overlap between the central 90% of simulations for 2100 for RCP 8.5 (93-243 cm) and RCP 2.6 (26-98 cm) is minimal. By 2300, the gap between median GMSL estimates for RCP 8.5 and RCP 2.6 reaches >10 m, with median RSL projections for RCP 8.5 jeopardizing land now occupied by 950 million people (versus 167 million for RCP 2.6). The minimal correlation between the contribution of AIS to GMSL by 2050 and that in 2100 and beyond implies current sea-level observations cannot exclude future extreme outcomes. The sensitivity of post-2050 projections to deeply uncertain physics highlights the need for robust decision and adaptive management frameworks.

  6. INSPIRE: a new scientific information system for HEP

    CERN Document Server

    Ivanov, R; CERN. Geneva. IT Department

    2010-01-01

    The status of high-energy physics (HEP) information systems has been jointly analyzed by the libraries of CERN, DESY, Fermilab and SLAC. As a result, the four laboratories have started the INSPIRE project – a new platform built by moving the successful SPIRES features and content, curated at DESY, Fermilab and SLAC, into the open-source CDS Invenio digital library software that was developed at CERN. INSPIRE will integrate current acquisition workflows and databases to host the entire body of the HEP literature (about one million records), aiming to become the reference HEP scientific information platform worldwide. It will provide users with fast access to full text journal articles and preprints, but also material such as conference slides and multimedia. INSPIRE will empower scientists with new tools to discover and access the results most relevant to their research, enable novel text- and data-mining applications, and deploy new metrics to assess the impact of articles and authors. In addition, it will ...

  7. INSPIRE: a new scientific information system for HEP

    CERN Multimedia

    Ivanov, R

    2009-01-01

    The status of high-energy physics (HEP) information systems has been jointly analyzed by the libraries of CERN, DESY, Fermilab and SLAC. As a result, the four laboratories have started the INSPIRE project – a new platform built by moving the successful SPIRES features and content, curated at DESY, Fermilab and SLAC, into the open-source CDS Invenio digital library software that was developed at CERN. INSPIRE will integrate present acquisition workflows and databases to host the entire body of the HEP literature (about one million records), aiming to become the reference HEP scientific information platform worldwide. It will provide users with fast access to full-text journal articles and preprints, but also material such as conference slides and multimedia. INSPIRE will empower scientists with new tools to discover and access the results most relevant to their research, enable novel text- and data-mining applications, and deploy new metrics to assess the impact of articles and authors. In addition, it will ...

  8. A physics-based probabilistic forecasting model for rainfall-induced shallow landslides at regional scale

    Directory of Open Access Journals (Sweden)

    S. Zhang

    2018-03-01

    Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle which are affected by a high degree of uncertainty especially at a regional scale, resulting in unacceptable uncertainties of Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations which are integrated in a unique parameter. This parameter links the landslide probability to the uncertainties of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfalls on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides in 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. Such testing results indicate that the new model can be operated in a highly efficient way and show more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can be potentially packaged into a forecasting system for shallow landslides providing technological support for the mitigation of these disasters at regional scale.

  9. Cascade probabilistic function and the Markov's processes. Chapter 1

    International Nuclear Information System (INIS)

    2002-01-01

    In Chapter 1 the physical and mathematical descriptions of radiation processes are presented. The relation of the cascade-probabilistic functions (CPF) for electrons, protons, alpha-particles and ions to Markov chains is shown. The algorithms for CPF calculation taking energy losses into account are given.

  10. Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems

    OpenAIRE

    Bernardo, Marco; Miculan, Marino

    2016-01-01

    Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still being not necessary. To this e...

  11. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifting methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  12. Searching for inspiration during idea generation : Pictures or words?

    NARCIS (Netherlands)

    Coimbra Cardoso, C.M.; Guerreiro Goncalves, M.; Badke-Schaub, P.G.

    2012-01-01

    People from different professional arenas search for inspiration in a number of sources, be it in memories from past experiences or in the physical environment that surrounds them. Purposefully or unconsciously, scientists, artists, writers and different types of designers for instance, come across

  13. Probabilistic escalation modelling

    Energy Technology Data Exchange (ETDEWEB)

    Korneliussen, G.; Eknes, M.L.; Haugen, K.; Selmer-Olsen, S. [Det Norske Veritas, Oslo (Norway)

    1997-12-31

    This paper describes how structural reliability methods may successfully be applied within quantitative risk assessment (QRA) as an alternative to traditional event tree analysis. The emphasis is on fire escalation in hydrocarbon production and processing facilities. This choice was made due to potential improvements over current QRA practice associated with both the probabilistic approach and more detailed modelling of the dynamics of escalating events. The physical phenomena important for the events of interest are explicitly modelled as functions of time. Uncertainties are represented through probability distributions. The uncertainty modelling enables the analysis to be simple when possible and detailed when necessary. The methodology features several advantages compared with traditional risk calculations based on event trees. (Author)

  14. Prediction of Intention during Interaction with iCub with Probabilistic Movement Primitives

    Directory of Open Access Journals (Sweden)

    Oriane Dermy

    2017-10-01

    This article describes our open-source software for predicting the intention of a user physically interacting with the humanoid robot iCub. Our goal is to allow the robot to infer the intention of the human partner during collaboration, by predicting the future intended trajectory: this capability is critical to design anticipatory behaviors that are crucial in human–robot collaborative scenarios, such as in co-manipulation, cooperative assembly, or transportation. We propose an approach to endow the iCub with basic capabilities of intention recognition, based on Probabilistic Movement Primitives (ProMPs), a versatile method for representing, generalizing, and reproducing complex motor skills. The robot learns a set of motion primitives from several demonstrations, provided by the human via physical interaction. During training, we model the collaborative scenario using human demonstrations. During the reproduction of the collaborative task, we use the acquired knowledge to recognize the intention of the human partner. Using a few early observations of the state of the robot, we can not only infer the intention of the partner but also complete the movement, even if the user breaks the physical interaction with the robot. We evaluate our approach in simulation and on the real iCub. In simulation, the iCub is driven by the user using the Geomagic Touch haptic device. In the real robot experiment, we directly interact with the iCub by grabbing and manually guiding the robot’s arm. We realize two experiments on the real robot: one with simple reaching trajectories, and one inspired by collaborative object sorting. The software implementing our approach is open source and available on the GitHub platform. In addition, we provide tutorials and videos.
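
    A minimal one-dimensional ProMP sketch of the conditioning step described above: trajectories are represented as weights on radial basis functions, a Gaussian over weights is fitted from demonstrations, and Gaussian conditioning on a few early observations yields the predicted completion of the movement. This is the generic ProMP machinery applied to toy data, not the iCub software from the record.

```python
import numpy as np

rng = np.random.default_rng(6)
T = 100
t = np.linspace(0, 1, T)

# Radial basis functions over normalized time.
centers = np.linspace(0, 1, 10)
Phi = np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * 0.05 ** 2))

# Demonstrations: noisy reaching trajectories with a random amplitude (toy data).
demos = np.stack([(0.8 + 0.3 * rng.random()) * np.sin(np.pi * t)
                  + 0.01 * rng.normal(size=T) for _ in range(20)])

# Fit a Gaussian over ProMP weights from the demonstrations (regularized least squares).
W = demos @ Phi @ np.linalg.inv(Phi.T @ Phi + 1e-6 * np.eye(len(centers)))
mu_w, Sigma_w = W.mean(axis=0), np.cov(W.T)

# Condition on a few early observations of a new trajectory (Gaussian conditioning).
obs_idx = np.arange(15)                       # first 15 samples observed
y_obs = 1.05 * np.sin(np.pi * t[obs_idx])     # partner starts a wide reach
H, sigma_y = Phi[obs_idx], 0.01
S = H @ Sigma_w @ H.T + sigma_y ** 2 * np.eye(len(obs_idx))
K = Sigma_w @ H.T @ np.linalg.solve(S, np.eye(len(obs_idx)))
mu_post = mu_w + K @ (y_obs - H @ mu_w)

# Predicted completion of the movement from the early observations.
prediction = Phi @ mu_post
print("predicted peak:", prediction.max())
```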

  15. Progress in methodology for probabilistic assessment of accidents: timing of accident sequences

    International Nuclear Information System (INIS)

    Lanore, J.M.; Villeroux, C.; Bouscatie, F.; Maigret, N.

    1981-09-01

    Current event tree techniques pose an important problem for probabilistic studies of accident sequences: they do not take into account the time dependence of real accident scenarios, which involve the random behaviour of the systems (lack of or delay in intervention, partial failures, repair, operator actions ...) and the correlated evolution of the physical parameters. A powerful method for the probabilistic treatment of these complex sequences (dynamic evolution of systems and the associated physics) is Monte Carlo simulation, with very rare events treated by means of suitable weighting and biasing techniques. As a practical example, the accident sequences related to the loss of the residual heat removal system in a fast breeder reactor have been treated with this method.
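
    The weighting and biasing idea mentioned for very rare events can be sketched with a one-variable importance-sampling example. This is purely illustrative and not the fast breeder study itself; the repair-time model, rates, and grace period are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative rare event: cooling is lost if a repair takes longer than a grace period.
LAMBDA_TRUE = 1.0 / 10.0      # assumed true repair rate (mean repair time 10 h)
GRACE = 60.0                  # hours before the physical limit is reached
LAMBDA_BIAS = 1.0 / 60.0      # biased (slower) repair rate used for sampling

n = 100_000
repair_time = rng.exponential(1.0 / LAMBDA_BIAS, size=n)   # sample from the biased law

# Likelihood ratio correcting the bias: f_true(x) / f_bias(x)
w = (LAMBDA_TRUE * np.exp(-LAMBDA_TRUE * repair_time)) / \
    (LAMBDA_BIAS * np.exp(-LAMBDA_BIAS * repair_time))

failure = repair_time > GRACE
p_hat = np.mean(w * failure)   # unbiased estimate of the rare-event probability
print(f"P(repair not completed in time) ~ {p_hat:.3e}")
```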

  16. Probabilistic short-term forecasting of eruption rate at Kīlauea Volcano using a physics-based model

    Science.gov (United States)

    Anderson, K. R.

    2016-12-01

    Deterministic models of volcanic eruptions yield predictions of future activity conditioned on uncertainty in the current state of the system. Physics-based eruption models are well-suited for deterministic forecasting as they can relate magma physics with a wide range of observations. Yet, physics-based eruption forecasting is strongly limited by an inadequate understanding of volcanic systems, and the need for eruption models to be computationally tractable. At Kīlauea Volcano, Hawaii, episodic depressurization-pressurization cycles of the magma system generate correlated, quasi-exponential variations in ground deformation and surface height of the active summit lava lake. Deflations are associated with reductions in eruption rate, or even brief eruptive pauses, and thus partly control lava flow advance rates and associated hazard. Because of the relatively well-understood nature of Kīlauea's shallow magma plumbing system, and because more than 600 of these events have been recorded to date, they offer a unique opportunity to refine a physics-based effusive eruption forecasting approach and apply it to lava eruption rates over short (hours to days) time periods. A simple physical model of the volcano ascribes observed data to temporary reductions in magma supply to an elastic reservoir filled with compressible magma. This model can be used to predict the evolution of an ongoing event, but because the mechanism that triggers events is unknown, event durations are modeled stochastically from previous observations. A Bayesian approach incorporates diverse data sets and prior information to simultaneously estimate uncertain model parameters and future states of the system. Forecasts take the form of probability distributions for eruption rate or cumulative erupted volume at some future time. Results demonstrate the significant uncertainties that still remain even for short-term eruption forecasting at a well-monitored volcano - but also the value of a physics
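
    A much-reduced sketch of the forecasting logic (a deterministic within-event model plus a stochastically sampled event duration, propagated by Monte Carlo to a distribution of cumulative erupted volume) is given below. The supply-reduction model, recovery time constant, and duration statistics are invented placeholders, not the calibrated Kīlauea model.

```python
import numpy as np

rng = np.random.default_rng(3)

Q0 = 3.0            # assumed background eruption rate, m^3/s
TAU = 8.0 * 3600    # assumed reservoir recovery time constant, s
DT = 60.0           # time step, s
HORIZON = 24 * 3600 # forecast horizon, s

# Durations of past deflation events (synthetic, standing in for the event catalog)
past_durations = rng.lognormal(mean=np.log(10 * 3600), sigma=0.5, size=600)

def simulate_volume(duration):
    """Cumulative erupted volume over the horizon for one sampled event duration."""
    t = np.arange(0.0, HORIZON, DT)
    rate = np.where(t < duration,
                    Q0 * np.exp(-t / TAU),   # reduced supply while the event lasts
                    Q0)                       # recovery to background afterwards
    return float(rate.sum() * DT)

volumes = np.array([simulate_volume(d) for d in rng.choice(past_durations, size=2000)])
print("10/50/90% cumulative volume (m^3):", np.percentile(volumes, [10, 50, 90]).round(0))
```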

  17. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  18. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  19. Probabilistic Fatigue Design of Composite Material for Wind Turbine Blades

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard

    2011-01-01

    In the present paper a probabilistic approach to fatigue design of wind turbine blades is presented. The physical uncertainty on the fatigue strength of the composite material is estimated using publicly available fatigue tests. Further, the model uncertainty on Miner's rule for damage accumulation...
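
    For reference, the Miner rule referred to in the truncated abstract is the linear damage sum below; in a probabilistic setting the critical damage sum is commonly treated as a random variable Delta rather than the deterministic value 1. This is a hedged sketch of the usual formulation, not the paper's specific model.

```latex
% Miner's linear damage accumulation and its common probabilistic form,
% with the critical damage sum \Delta treated as a random variable.
D = \sum_i \frac{n_i}{N_i}, \qquad
P_f = P\!\left(\sum_i \frac{n_i}{N_i} \ge \Delta\right),
\qquad \Delta \ \text{random (often modelled as lognormal with median close to 1)}.
```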

  20. Smart Nacre-inspired Nanocomposites.

    Science.gov (United States)

    Peng, Jingsong; Cheng, Qunfeng

    2018-03-15

    Nacre-inspired nanocomposites with excellent mechanical properties have achieved remarkable attention in the past decades. The high performance of nacre-inspired nanocomposites is a good basis for the further application of smart devices. Recently, some smart nanocomposites inspired by nacre have demonstrated good mechanical properties as well as effective and stable stimuli-responsive functions. In this Concept, we summarize the recent development of smart nacre-inspired nanocomposites, including 1D fibers, 2D films and 3D bulk nanocomposites, in response to temperature, moisture, light, strain, and so on. We show that diverse smart nanocomposites could be designed by combining various conventional fabrication methods of nacre-inspired nanocomposites with responsive building blocks and interface interactions. The nacre-inspired strategy is versatile for different kinds of smart nanocomposites in extensive applications, such as strain sensors, displays, artificial muscles, robotics, and so on, and may act as an effective roadmap for designing smart nanocomposites in the future. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  1. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  2. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  3. Simulation on a computer the cascade probabilistic functions and theirs relation with Markov's processes

    International Nuclear Information System (INIS)

    Kupchishin, A.A.; Kupchishin, A.I.; Shmygaleva, T.A.

    2002-01-01

    Within the framework of the cascade-probabilistic (CP) method, radiation and physical processes are studied and their relation to Markov processes is established. The conclusion is drawn that the CP-functions for electrons, protons, alpha particles and ions are described by an inhomogeneous Markov chain. Algorithms are developed, and CP-function calculations for charged particles and for the concentration of radiation defects in solids under ion irradiation are carried out. Tables are given for different CP-function parameters and for radiation defect concentrations arising from charged-particle interaction with solids. The book consists of an introduction and two chapters: (1) Cascade probabilistic functions and Markov processes; (2) Radiation defect formation in solids as a Markov process. The book is intended for specialists in the mathematical simulation of radiation defects, solid state physics, elementary particle physics and applied mathematics.

  4. Probabilistic fuel rod analyses using the TRANSURANUS code

    Energy Technology Data Exchange (ETDEWEB)

    Lassmann, K.; O'Carroll, C.; Van de Laar, J. [CEC Joint Research Centre, Karlsruhe (Germany)

    1997-08-01

    After more than 25 years of fuel rod modelling research, the basic concepts are well established and the limitations of the specific approaches are known. However, the widely used mechanistic approach leads in many cases to discrepancies between theoretical predictions and experimental evidence indicating that models are not exact and that some of the physical processes encountered are of stochastic nature. To better understand uncertainties and their consequences, the mechanistic approach must therefore be augmented by statistical analyses. In the present paper the basic probabilistic methods are briefly discussed. Two such probabilistic approaches are included in the fuel rod performance code TRANSURANUS: the Monte Carlo method and the Numerical Noise Analysis. These two techniques are compared and their capabilities are demonstrated. (author). 12 refs, 4 figs, 2 tabs.

  5. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  6. Imprint of accretion disk-induced migration on gravitational waves from extreme mass ratio inspirals.

    Science.gov (United States)

    Yunes, Nicolás; Kocsis, Bence; Loeb, Abraham; Haiman, Zoltán

    2011-10-21

    We study the effects of a thin gaseous accretion disk on the inspiral of a stellar-mass black hole into a supermassive black hole. We construct a phenomenological angular momentum transport equation that reproduces known disk effects. Disk torques modify the gravitational wave phase evolution to detectable levels with LISA for reasonable disk parameters. The Fourier transform of disk-modified waveforms acquires a correction with a different frequency trend than post-Newtonian vacuum terms. Such inspirals could be used to detect accretion disks with LISA and to probe their physical parameters. © 2011 American Physical Society

  7. Probabilistic approach to manipulator kinematics and dynamics

    International Nuclear Information System (INIS)

    Rao, S.S.; Bhatti, P.K.

    2001-01-01

    A high performance, high speed robotic arm must be able to manipulate objects with a high degree of accuracy and repeatability. As with any other physical system, there are a number of factors causing uncertainties in the behavior of a robotic manipulator. These factors include manufacturing and assembling tolerances, and errors in the joint actuators and controllers. In order to study the effect of these uncertainties on the robotic end-effector and to obtain a better insight into the manipulator behavior, the manipulator kinematics and dynamics are modeled using a probabilistic approach. Based on the probabilistic model, kinematic and dynamic performance criteria are defined to provide measures of the behavior of the robotic end-effector. Techniques are presented to compute the kinematic and dynamic reliabilities of the manipulator. The effects of tolerances associated with the various manipulator parameters on the reliabilities are studied. Numerical examples are presented to illustrate the procedures
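
    A minimal Monte Carlo version of the kinematic-reliability idea is sketched below for a two-link planar arm: tolerances on link lengths and joint angles are propagated through the forward kinematics, and the reliability is the probability that the end-effector position error stays within an accuracy threshold. The link lengths, tolerance magnitudes, and threshold are assumptions for illustration, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

L1_NOM, L2_NOM = 0.40, 0.30     # nominal link lengths, m
Q1_NOM, Q2_NOM = 0.60, -0.40    # commanded joint angles, rad
TOL = 1.0e-3                    # allowed end-effector position error, m

def forward(l1, l2, q1, q2):
    """Forward kinematics of a planar two-link arm."""
    x = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
    y = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
    return x, y

x_nom, y_nom = forward(L1_NOM, L2_NOM, Q1_NOM, Q2_NOM)

n = 100_000
l1 = rng.normal(L1_NOM, 0.2e-3, n)   # assumed manufacturing tolerances on the links
l2 = rng.normal(L2_NOM, 0.2e-3, n)
q1 = rng.normal(Q1_NOM, 0.5e-3, n)   # assumed joint actuator / encoder errors
q2 = rng.normal(Q2_NOM, 0.5e-3, n)

x, y = forward(l1, l2, q1, q2)
err = np.hypot(x - x_nom, y - y_nom)
kinematic_reliability = np.mean(err <= TOL)
print(f"P(position error <= {TOL*1e3:.1f} mm) = {kinematic_reliability:.3f}")
```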

  8. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...

  9. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  10. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing the errors students make while solving them. The research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems, analyzed descriptively using the Miles and Huberman steps. The results show that students' difficulties in solving probabilistic problems fall into three categories. The first relates to difficulties in understanding the probabilistic problem. The second concerns difficulties in choosing and using appropriate strategies for solving the problem. The third concerns difficulties with the computational process in solving the problem. The results indicate that students still have difficulties in solving probabilistic problems, which means they are not yet able to apply their knowledge and ability to such problems. It is therefore important for mathematics teachers to plan probabilistic learning that optimizes students' probabilistic thinking ability.

  11. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide...... a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...

  12. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
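
    For orientation, one common statement of the deterministic Hu-Washizu three-field functional that the PHWVP randomizes is given below (small-strain elasticity, omitting the term enforcing prescribed displacements); in the probabilistic version the constitutive tensor, body force, and boundary data are treated as random variables or fields. This is a standard textbook form, not a formula quoted from the record.

```latex
\Pi_{HW}(\mathbf{u},\boldsymbol{\varepsilon},\boldsymbol{\sigma})
  = \int_{\Omega}\Big[\tfrac{1}{2}\,\boldsymbol{\varepsilon}:\mathbf{C}:\boldsymbol{\varepsilon}
    + \boldsymbol{\sigma}:\left(\nabla^{s}\mathbf{u}-\boldsymbol{\varepsilon}\right)
    - \mathbf{b}\cdot\mathbf{u}\Big]\,\mathrm{d}\Omega
  - \int_{\Gamma_t}\bar{\mathbf{t}}\cdot\mathbf{u}\,\mathrm{d}\Gamma .
```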

  13. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of the Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computations. In those applications, approximate computing provides a perfect fit to optimize energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. Probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from the typical approximate CMOS adders. Furthermore, it allows for high area savings and design flexibility between performance and power saving. To reach a similar performance level as approximate CMOS adders, the memristive adder achieves 60% power savings. An image-compression application is investigated using the memristive probabilistic adders with the performance and energy trade-off.
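
    The behavioral effect of such a probabilistic adder can be mimicked at the bit level by letting each full-adder stage emit a wrong sum bit with some probability. The sketch below models only this stochastic output behavior, not memristor device physics, and the word width and error probability are assumptions.

```python
import random

random.seed(5)

def probabilistic_add(a: int, b: int, width: int = 8, p_err: float = 0.02) -> int:
    """Ripple-carry addition in which each stage's sum bit may flip with probability p_err."""
    carry, result = 0, 0
    for i in range(width):
        bit_a = (a >> i) & 1
        bit_b = (b >> i) & 1
        s = bit_a ^ bit_b ^ carry
        carry = (bit_a & bit_b) | (carry & (bit_a ^ bit_b))
        if random.random() < p_err:   # stochastic, memristor-like bit error
            s ^= 1
        result |= s << i
    return result & ((1 << width) - 1)

# Average absolute error over random operand pairs (quality metric for the approximate adder)
trials = [(random.randrange(256), random.randrange(256)) for _ in range(10_000)]
errs = [abs(probabilistic_add(a, b) - ((a + b) & 0xFF)) for a, b in trials]
print("mean |error| =", sum(errs) / len(errs))
```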

  14. Uses of probabilistic estimates of seismic hazard and nuclear power plants in the US

    International Nuclear Information System (INIS)

    Reiter, L.

    1983-01-01

    The use of probabilistic estimates is playing an increased role in the review of seismic hazard at nuclear power plants. The NRC Geosciences Branch emphasis has been on using these estimates in a relative rather than absolute manner and to gain insight on other approaches. Examples of this use include estimates to determine design levels, to determine equivalent hazard at different sites, to help define more realistic seismotectonic provinces, and to assess implied levels of acceptable risk using deterministic methods. Increased use of probabilistic estimates is expected. Probabilistic estimates of seismic hazard have a potential for misuse, however, and their successful integration into decision making requires they not be divorced from physical insight and scientific intuition

  15. An empirical system for probabilistic seasonal climate prediction

    Science.gov (United States)

    Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma

    2016-04-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
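
    The core of such an empirical system can be sketched as an ordinary least-squares regression on a CO2-equivalent predictor plus one climate index, with the residual spread used to issue a Gaussian probabilistic forecast. The data below are synthetic, and the real system's predictor selection, hindcast cross-validation, and skill scoring are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(6)

# Synthetic "observations": seasonal-mean temperature anomaly (K), 1961-2013
years = np.arange(1961, 2014)
co2eq = 330.0 + 1.9 * (years - 1961)                 # assumed CO2-equivalent proxy (ppm)
enso = rng.standard_normal(years.size)               # assumed ENSO index
temp = 0.01 * (co2eq - co2eq.mean()) + 0.3 * enso + 0.2 * rng.standard_normal(years.size)

# Multiple linear regression (hindcast fit)
X = np.column_stack([np.ones_like(co2eq), co2eq, enso])
beta, *_ = np.linalg.lstsq(X, temp, rcond=None)
resid_sd = np.std(temp - X @ beta, ddof=X.shape[1])

# Probabilistic forecast for a new season: Gaussian with regression mean and residual spread
x_new = np.array([1.0, co2eq[-1] + 1.9, 0.8])        # next year, moderately positive ENSO
mu = x_new @ beta
print(f"forecast anomaly ~ N({mu:.2f} K, sd = {resid_sd:.2f} K)")
```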

  16. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  17. Bio-inspired networking

    CERN Document Server

    Câmara, Daniel

    2015-01-01

    Bio-inspired techniques are based on principles, or models, of biological systems. In general, natural systems present remarkable capabilities of resilience and adaptability. In this book, we explore how bio-inspired methods can solve different problems linked to computer networks. Future networks are expected to be autonomous, scalable and adaptive. During millions of years of evolution, nature has developed a number of different systems that present these and other characteristics required for the next generation networks. Indeed, a series of bio-inspired methods have been successfully used to solve the most diverse problems linked to computer networks. This book presents some of these techniques from a theoretical and practical point of view. Discusses the key concepts of bio-inspired networking to aid you in finding efficient networking solutions Delivers examples of techniques both in theoretical concepts and practical applications Helps you apply nature's dynamic resource and task management to your co...

  18. Use of probabilistic methods for analysis of cost and duration uncertainties in a decision analysis framework

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1995-01-01

    Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple evaluations, these types of probabilistic evaluations can be made using personal computer-based software
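
    A minimal example of the Latin Hypercube treatment of cost and duration uncertainty is sketched below; the marginal distributions, their parameters, and the independence assumption are placeholders rather than values from any programmatic study.

```python
import numpy as np
from scipy.stats import qmc, lognorm, triang

n = 5000

# Latin Hypercube sample of two uniform variates, transformed to the assumed marginals
u = qmc.LatinHypercube(d=2, seed=7).random(n)
annual_cost = lognorm(s=0.3, scale=2.0e6).ppf(u[:, 0])      # $/yr, assumed distribution
duration = triang(c=0.3, loc=5.0, scale=10.0).ppf(u[:, 1])  # years, assumed 5 to 15

total_cost = annual_cost * duration
print("P10/P50/P90 total cost ($M):",
      np.round(np.percentile(total_cost, [10, 50, 90]) / 1e6, 1))
```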

  19. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that arbitrary SU(2) transformations of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can be generalized also for qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)

  20. A Probabilistic Model of Social Working Memory for Information Retrieval in Social Interactions.

    Science.gov (United States)

    Li, Liyuan; Xu, Qianli; Gan, Tian; Tan, Cheston; Lim, Joo-Hwee

    2018-05-01

    Social working memory (SWM) plays an important role in navigating social interactions. Inspired by studies in psychology, neuroscience, cognitive science, and machine learning, we propose a probabilistic model of SWM to mimic human social intelligence for personal information retrieval (IR) in social interactions. First, we establish a semantic hierarchy as social long-term memory to encode personal information. Next, we propose a semantic Bayesian network as the SWM, which integrates the cognitive functions of accessibility and self-regulation. One subgraphical model implements the accessibility function to learn the social consensus about IR based on social information concepts, clustering, social context, and similarity between persons. Beyond accessibility, one more layer is added to simulate the function of self-regulation, performing the personal adaptation to the consensus based on human personality. Two learning algorithms are proposed to train the probabilistic SWM model on a raw dataset of high uncertainty and incompleteness. One is an efficient learning algorithm based on Newton's method, and the other is a genetic algorithm. Systematic evaluations show that the proposed SWM model is able to learn human social intelligence effectively and outperforms the baseline Bayesian cognitive model. Toward real-world applications, we implement our model on Google Glass as a wearable assistant for social interaction.

  1. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  2. Probabilistic safety analysis vs probabilistic fracture mechanics -relation and necessary merging

    International Nuclear Information System (INIS)

    Nilsson, Fred

    1997-01-01

    A comparison is made between some general features of probabilistic fracture mechanics (PFM) and probabilistic safety assessment (PSA) in its standard form. We conclude that: the result from PSA is a numerically expressed level of confidence in the system based on the state of current knowledge, and is thus not an objective measure of risk; it is important to carefully define the precise nature of the probabilistic statement and relate it to a well defined situation; standardisation of PFM methods is necessary; PFM seems to be the only way to obtain estimates of the pipe break probability; service statistics are of doubtful value because of the scarcity of data and statistical inhomogeneity; and collection of service data should be directed towards the occurrence of growing cracks.

  3. Growing a Waldorf-Inspired Approach in a Public School District

    Science.gov (United States)

    Friedlaender, Diane; Beckham, Kyle; Zheng, Xinhua; Darling-Hammond, Linda

    2015-01-01

    This report documents the practices and outcomes of Alice Birney, a public K-8 Waldorf-Inspired School in Sacramento City Unified School District (SCUSD). This study highlights how such a school addresses students' academic, social, emotional, physical, and creative development. Birney students outperform similar students in SCUSD on several…

  4. On synchronous parallel computations with independent probabilistic choice

    International Nuclear Information System (INIS)

    Reif, J.H.

    1984-01-01

    This paper introduces probabilistic choice to synchronous parallel machine models, in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time, space, and processor bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity.

  5. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  6. Hierarchical and Size Dependent Mechanical Properties of Silica and Silicon Nanostructures Inspired by Diatom Algae

    Science.gov (United States)

    2010-09-01

    Thesis by Andre Phillipe Garcia, "Hierarchical and size dependent mechanical properties of silica and silicon nanostructures inspired by diatom algae". The abstract of this record is not recoverable; only report-form field labels and a partial cited reference remain (Chaniotakis, "The physical and mechanical properties of composite cements manufactured with calcareous and clayey Greek diatomite mixtures", Cement and ...).

  7. Bell-Boole Inequality: Nonlocality or Probabilistic Incompatibility of Random Variables?

    Directory of Open Access Journals (Sweden)

    Andrei Khrennikov

    2008-03-01

    Full Text Available The main aim of this report is to inform the quantum information community about investigations on the problem of probabilistic compatibility of a family of random variables: the possibility of realizing such a family on the basis of a single probability measure (constructing a single Kolmogorov probability space). These investigations were started more than a hundred years ago by J. Boole (who invented Boolean algebras). The complete solution of the problem was obtained by the Soviet mathematician Vorobjev in the 1960s. Surprisingly, probabilists and statisticians obtained inequalities for probabilities and correlations among which one can find the famous Bell inequality and its generalizations. Such inequalities appeared simply as constraints for probabilistic compatibility. In this framework one cannot see a priori any link to such problems as nonlocality and the "death of reality" which are typically linked to Bell-type inequalities in the physics literature. We analyze the difference between the positions of mathematicians and quantum physicists. In particular, we find that one of the most reasonable explanations of probabilistic incompatibility is the mixing, in Bell-type inequalities, of statistical data from a number of experiments performed under different experimental contexts.

  8. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  9. Inspiration from britain?

    DEFF Research Database (Denmark)

    Vagnby, Bo

    2008-01-01

    Danish housing policy needs a dose of renewed social concern - and could find new inspiration in Britain's housing and urban planning policies, says Bo Vagnby. Publication date: November.

  10. Development of Probabilistic Flood Inundation Mapping For Flooding Induced by Dam Failure

    Science.gov (United States)

    Tsai, C.; Yeh, J. J. J.

    2017-12-01

    A primary function of flood inundation mapping is to forecast flood hazards and assess potential losses. However, uncertainties limit the reliability of inundation hazard assessments. Major sources of uncertainty should be taken into consideration by an optimal flood management strategy. This study focuses on the 20 km reach downstream of the Shihmen Reservoir in Taiwan. A dam-failure-induced flood provides the upstream boundary conditions for the flood routing herein. The two major sources of uncertainty considered in the hydraulic model and the flood inundation mapping are uncertainty in the dam break model and uncertainty in the roughness coefficient. The perturbance moment method is applied to a dam break model and the hydro system model to develop probabilistic flood inundation mapping. Various numbers of uncertain variables can be considered in these models and the variability of the outputs can be quantified. Probabilistic flood inundation mapping for dam-break-induced floods can be developed, with consideration of the variability of the output, using the commonly used HEC-RAS model. Different probabilistic flood inundation mappings are discussed and compared. Probabilistic flood inundation maps are expected to provide new physical insights in support of the evaluation of areas at risk of reservoir-induced flooding.

  11. Some probabilistic aspects of fracture

    International Nuclear Information System (INIS)

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)

  12. A global empirical system for probabilistic seasonal climate prediction

    Science.gov (United States)

    Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.

    2015-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.

  13. Bio-inspired functional surfaces for advanced applications

    DEFF Research Database (Denmark)

    Malshe, Ajay; Rajurkar, Kamlakar; Samant, Anoop

    2013-01-01

    , are being evolved to a higher state of intelligent functionality. These surfaces became more efficient by using combinations of available materials, along with unique physical and chemical strategies. Noteworthy physical strategies include features such as texturing and structure, and chemical strategies...... such as sensing and actuation. These strategies collectively enable functional surfaces to deliver extraordinary adhesion, hydrophobicity, multispectral response, energy scavenging, thermal regulation, antibiofouling, and other advanced functions. Production industries have been intrigued with such biological...... surface strategies in order to learn clever surface architectures and implement those architectures to impart advanced functionalities into manufactured consumer products. This keynote paper delivers a critical review of such inspiring biological surfaces and their nonbiological product analogs, where...

  14. Probabilistic methods used in NUSS

    International Nuclear Information System (INIS)

    Fischer, J.; Giuliani, P.

    1985-01-01

    Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides, the two areas of design and siting are those where more use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides the review mainly covers the areas of seismic, hydrological and external man-made events analysis, as well as some aspects of meteorological extreme events analysis. Probabilistic methods are recommended in the design guides but they are not made a requirement. There are several reasons for this, mainly the lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given and the concept of design basis as used in NUSS design guides is explained. (author)

  15. Sustaining Physics Teacher Education Coalition Programs in Physics Teacher Education

    Science.gov (United States)

    Scherr, Rachel E.; Plisch, Monica; Goertzen, Renee Michelle

    2017-01-01

    Understanding the mechanisms of increasing the number of physics teachers educated per year at institutions with thriving physics teacher preparation programs may inspire and support other institutions in building thriving programs of their own. The Physics Teacher Education Coalition (PhysTEC), led by the American Physical Society (APS) and the…

  16. An improved, bias-reduced probabilistic functional gene network of baker's yeast, Saccharomyces cerevisiae.

    Directory of Open Access Journals (Sweden)

    Insuk Lee

    2007-10-01

    Full Text Available Probabilistic functional gene networks are powerful theoretical frameworks for integrating heterogeneous functional genomics and proteomics data into objective models of cellular systems. Such networks provide syntheses of millions of discrete experimental observations, spanning DNA microarray experiments, physical protein interactions, genetic interactions, and comparative genomics; the resulting networks can then be easily applied to generate testable hypotheses regarding specific gene functions and associations.We report a significantly improved version (v. 2 of a probabilistic functional gene network of the baker's yeast, Saccharomyces cerevisiae. We describe our optimization methods and illustrate their effects in three major areas: the reduction of functional bias in network training reference sets, the application of a probabilistic model for calculating confidences in pair-wise protein physical or genetic interactions, and the introduction of simple thresholds that eliminate many false positive mRNA co-expression relationships. Using the network, we predict and experimentally verify the function of the yeast RNA binding protein Puf6 in 60S ribosomal subunit biogenesis.YeastNet v. 2, constructed using these optimizations together with additional data, shows significant reduction in bias and improvements in precision and recall, in total covering 102,803 linkages among 5,483 yeast proteins (95% of the validated proteome. YeastNet is available from http://www.yeastnet.org.

  17. Nature-Inspired Structural Materials for Flexible Electronic Devices.

    Science.gov (United States)

    Liu, Yaqing; He, Ke; Chen, Geng; Leow, Wan Ru; Chen, Xiaodong

    2017-10-25

    Exciting advancements have been made in the field of flexible electronic devices in the last two decades and will certainly lead to a revolution in peoples' lives in the future. However, because of the poor sustainability of the active materials in complex stress environments, new requirements have been adopted for the construction of flexible devices. Thus, hierarchical architectures in natural materials, which have developed various environment-adapted structures and materials through natural selection, can serve as guides to solve the limitations of materials and engineering techniques. This review covers the smart designs of structural materials inspired by natural materials and their utility in the construction of flexible devices. First, we summarize structural materials that accommodate mechanical deformations, which is the fundamental requirement for flexible devices to work properly in complex environments. Second, we discuss the functionalities of flexible devices induced by nature-inspired structural materials, including mechanical sensing, energy harvesting, physically interacting, and so on. Finally, we provide a perspective on newly developed structural materials and their potential applications in future flexible devices, as well as frontier strategies for biomimetic functions. These analyses and summaries are valuable for a systematic understanding of structural materials in electronic devices and will serve as inspirations for smart designs in flexible electronics.

  18. Evaluation of Probabilistic Disease Forecasts.

    Science.gov (United States)

    Hughes, Gareth; Burnett, Fiona J

    2017-10-01

    The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast-predictive values-are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.
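
    One concrete strictly proper scoring rule usable in this setting is the Brier score, whose Murphy decomposition separates the interpretable reliability, resolution, and uncertainty components alluded to above. The sketch below uses synthetic forecasts and outcomes, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(8)

# Synthetic forecasts (probability of disease) and binary observed outcomes
p = rng.uniform(0, 1, 2000).round(1)       # forecasts grouped into 11 probability bins
y = rng.binomial(1, 0.8 * p + 0.1)         # outcomes loosely consistent with the forecasts

brier = np.mean((p - y) ** 2)

# Murphy decomposition over forecast bins: BS = reliability - resolution + uncertainty
base_rate = y.mean()
rel = res = 0.0
for pk in np.unique(p):
    mask = p == pk
    obs_freq = y[mask].mean()
    rel += mask.mean() * (pk - obs_freq) ** 2
    res += mask.mean() * (obs_freq - base_rate) ** 2
unc = base_rate * (1 - base_rate)

print(f"Brier={brier:.3f}  reliability={rel:.3f}  resolution={res:.3f}  uncertainty={unc:.3f}")
```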

  19. Efficient network-matrix architecture for general flow transport inspired by natural pinnate leaves.

    Science.gov (United States)

    Hu, Liguo; Zhou, Han; Zhu, Hanxing; Fan, Tongxiang; Zhang, Di

    2014-11-14

    Networks embedded in three dimensional matrices are beneficial to deliver physical flows to the matrices. Leaf architectures, pervasive natural network-matrix architectures, endow leaves with high transpiration rates and low water pressure drops, providing inspiration for efficient network-matrix architectures. In this study, the network-matrix model for general flow transport inspired by natural pinnate leaves is investigated analytically. The results indicate that the optimal network structure inspired by natural pinnate leaves can greatly reduce the maximum potential drop and the total potential drop caused by the flow through the network while maximizing the total flow rate through the matrix. These results can be used to design efficient networks in network-matrix architectures for a variety of practical applications, such as tissue engineering, cell culture, photovoltaic devices and heat transfer.

  20. CERN - Six Decades of Science, Innovation, Cooperation, and Inspiration

    Energy Technology Data Exchange (ETDEWEB)

    Quigg, Chris [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)

    2014-09-01

    The European Laboratory for Particle Physics, which straddles the Swiss-French border northwest of Geneva, celebrates its sixtieth birthday in 2014. CERN is the preeminent particle-physics institution in the world, currently emphasizing the study of collisions of protons and heavy nuclei at very high energies and the exploration of physics on the electroweak scale (energies where electromagnetism and the weak nuclear force merge). With brilliant accomplishments in research, innovation, and education, and a sustained history of cooperation among people from different countries and cultures, CERN ranks as one of the signal achievements of the postwar European Project. For physicists the world over, the laboratory is a source of pride and inspiration.

  1. Probabilistic broadcasting of mixed states

    International Nuclear Information System (INIS)

    Li Lvjun; Li Lvzhou; Wu Lihua; Zou Xiangfu; Qiu Daowen

    2009-01-01

    It is well known that the non-broadcasting theorem proved by Barnum et al is a fundamental principle of quantum communication. As we are aware, optimal broadcasting (OB) is the only method to broadcast noncommuting mixed states approximately. In this paper, motivated by the probabilistic cloning of quantum states proposed by Duan and Guo, we propose a new way for broadcasting noncommuting mixed states-probabilistic broadcasting (PB), and we present a sufficient condition for PB of mixed states. To a certain extent, we generalize the probabilistic cloning theorem from pure states to mixed states, and in particular, we generalize the non-broadcasting theorem, since the case that commuting mixed states can be exactly broadcast can be thought of as a special instance of PB where the success ratio is 1. Moreover, we discuss probabilistic local broadcasting (PLB) of separable bipartite states

  2. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet...... Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 ‘Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions...... and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties...

  3. Rockfall hazard assessment integrating probabilistic physically based rockfall source detection (Norddal municipality, Norway).

    Science.gov (United States)

    Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.

    2012-04-01

    Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations; in either case, a physically based approach is not used for the source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested for a steep, more than 200 m high rock wall located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDFs) for both the dip angle and the dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables, as sketched below. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m
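
    A stochastic kinematic test of this kind can be sketched as follows for planar sliding: the dip and dip direction of a discontinuity set and the friction angle are drawn from fitted distributions, and the fraction of samples satisfying the kinematic conditions gives a failure susceptibility for a given slope cell. The slope orientation, distribution parameters, and lateral limit below are assumptions, not values from the Norddal study.

```python
import numpy as np

rng = np.random.default_rng(9)

SLOPE_DIP, SLOPE_DIR = 70.0, 120.0   # assumed slope face orientation (deg)
LATERAL_LIMIT = 30.0                 # max dip-direction deviation for planar sliding (deg)

n = 50_000
# Discontinuity set parameters drawn from fitted PDFs (placeholder values)
joint_dip = rng.normal(45.0, 8.0, n)
joint_dir = rng.normal(115.0, 15.0, n)
friction = rng.normal(32.0, 3.0, n)          # friction angle treated as random as well

daylights = joint_dip < SLOPE_DIP                         # plane must daylight in the face
steeper_than_friction = joint_dip > friction              # and be steeper than friction
aligned = np.abs(((joint_dir - SLOPE_DIR + 180) % 360) - 180) < LATERAL_LIMIT

p_kinematic = np.mean(daylights & steeper_than_friction & aligned)
print(f"probability of kinematic feasibility for planar sliding ~ {p_kinematic:.2f}")
```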

  4. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    probabilistic code format has not only a strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of magnitude for two...... different, but by and large equally justifiable probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence...... is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...

  5. Probabilistic Modeling of the Renal Stone Formation Module

    Science.gov (United States)

    Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.

    2013-01-01

    The Integrated Medical Model (IMM) is a probabilistic tool used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the in-flight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment), astronauts are at a considerably elevated risk of developing renal calculi (nephrolithiasis) while in space. The lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions into the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs of the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, continuously
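
    The coupling structure of such a probabilistic component can be sketched as below: urine-chemistry inputs are sampled from distributions, pushed through a trivial stand-in for the deterministic growth model, and mapped to a symptom probability by a regression-style surrogate. All distributions and coefficients are invented placeholders, not IMM or RSFM values.

```python
import numpy as np

rng = np.random.default_rng(10)
n = 20_000

# Sampled inputs standing in for the urine chemistry / physiology distributions
calcium = rng.lognormal(np.log(5.0), 0.3, n)      # assumed mmol/day
oxalate = rng.lognormal(np.log(0.35), 0.25, n)    # assumed mmol/day
urine_volume = rng.normal(1.6, 0.4, n).clip(0.5)  # L/day

def stone_size(ca, ox, vol):
    """Placeholder for the deterministic, supersaturation-driven growth model."""
    supersaturation = (ca * ox) / vol
    return 0.8 * supersaturation                   # 'size' in arbitrary mm-like units

size = stone_size(calcium, oxalate, urine_volume)

# Placeholder regression-style surrogate mapping stone size to probability of symptoms
p_sympt = 1.0 / (1.0 + np.exp(-(0.9 * size - 2.5)))
p_event = np.mean(rng.uniform(size=n) < p_sympt)
print(f"simulated likelihood of a symptomatic stone ~ {p_event:.3f}")
```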

  6. Superstring-inspired SO(10) GUT model with intermediate scale

    Science.gov (United States)

    Sasaki, Ken

    1987-12-01

    A new mechanism is proposed for the mixing of Weinberg-Salam Higgs fields in superstring-inspired SO(10) models with no SO(10) singlet fields. The higher-dimensional terms in the superpotential can generate both Higgs field mixing and a small mass for the physical neutrino.

  7. Probabilistic analysis of fires in nuclear plants

    International Nuclear Information System (INIS)

    Unione, A.; Teichmann, T.

    1985-01-01

    The aim of this paper is to describe a multilevel (i.e., staged) probabilistic analysis of fire risks in nuclear plants (as part of a general PRA) which maximizes the benefits of the FRA (fire risk assessment) in a cost-effective way. The approach uses several stages of screening, physical modeling of clearly dominant risk contributors, searches for direct (e.g., equipment dependences) and secondary (e.g., fire-induced internal flooding) interactions, and relies on lessons learned and available data from surrogate FRAs. The general methodology is outlined. 6 figs., 10 tabs

  8. Probabilistic Multi-Hazard Assessment of Dry Cask Structures

    Energy Technology Data Exchange (ETDEWEB)

    Bencturk, Bora [Univ. of Houston, TX (United States); Padgett, Jamie [Rice Univ., Houston, TX (United States); Uddin, Rizwan [Univ. of Illinois, Urbana-Champaign, IL (United States).

    2017-01-10

    systems, the concrete shall not only provide shielding but also ensure stability of the upright canister, facilitate anchoring, allow ventilation, and provide physical protection against theft, severe weather, and natural (seismic) as well as man-made (blast) events. Given the need to remain functional for 40 years or even longer in the case of interim storage, the concrete outerpack and the internal canister components need to be evaluated with regard to their long-term ability to perform their intended design functions. As evidenced by deteriorating concrete bridges, visible degradation of dry storage systems has been reported, especially in highly corrosive maritime environments. The degradation of reinforced concrete is caused by multiple physical and chemical mechanisms, which may be summarized under the heading of environmental aging. The underlying hygro-thermal transport processes are accelerated by irradiation effects; hence creep and shrinkage models need to include the effects of chloride penetration, alkali-aggregate reaction, and corrosion of the reinforcing steel. In light of the above, the two main objectives of this project are to (1) develop a probabilistic multi-hazard assessment framework, and (2) through experimental and numerical research, perform a comprehensive assessment under combined earthquake loads and aging-induced deterioration, which will also provide data for the development and validation of the probabilistic framework.

  9. Confluence Reduction for Probabilistic Systems (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2010-01-01

    This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To support the

  10. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    Science.gov (United States)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
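
    As a rough illustration of the surrogate idea, the sketch below fits a cheap polynomial to a small number of expensive crack-growth evaluations and then uses it inside a Monte Carlo estimate of remaining useful life. The crack-growth function, parameter ranges, and thresholds are invented for the example; the actual work replaces a 3D finite element model and uses damage-diagnosis data.

```python
import numpy as np

rng = np.random.default_rng(1)

def expensive_crack_growth(a0, delta_sigma):
    """Stand-in for a costly high-fidelity model: cycles to reach a critical crack size."""
    C, m, a_crit = 1e-10, 3.0, 0.01
    a, n = a0, 0
    while a < a_crit and n < 2_000_000:
        a += C * (delta_sigma * np.sqrt(np.pi * a)) ** m  # Paris-law-like increment
        n += 1
    return n

# Design of experiments: a few "expensive" runs
a0_train = rng.uniform(1e-3, 5e-3, 30)      # initial crack size [m]
ds_train = rng.uniform(80, 120, 30)         # stress range [MPa]
life_train = np.array([expensive_crack_growth(a, s) for a, s in zip(a0_train, ds_train)])

# Cheap surrogate: quadratic polynomial in the two inputs, fit to log-life
X = np.column_stack([np.ones_like(a0_train), a0_train, ds_train,
                     a0_train**2, ds_train**2, a0_train * ds_train])
coef, *_ = np.linalg.lstsq(X, np.log(life_train), rcond=None)

def surrogate(a0, ds):
    x = np.column_stack([np.ones_like(a0), a0, ds, a0**2, ds**2, a0 * ds])
    return np.exp(x @ coef)

# Probabilistic prognosis: propagate diagnosis uncertainty through the surrogate
a0_samples = rng.normal(3e-3, 5e-4, 50_000).clip(1e-3, 5e-3)
ds_samples = rng.normal(100, 5, 50_000).clip(80, 120)
life_samples = surrogate(a0_samples, ds_samples)
print("5th-percentile remaining life (cycles):", np.percentile(life_samples, 5))
```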

  11. Up-gradient transport in a probabilistic transport model

    DEFF Research Database (Denmark)

    Gavnholt, J.; Juul Rasmussen, J.; Garcia, O.E.

    2005-01-01

    The transport of particles or heat against the driving gradient is studied by employing a probabilistic transport model with a characteristic particle step length that depends on the local concentration or heat gradient. When this gradient is larger than a prescribed critical value, the standard ... These results supplement recent works by van Milligen [Phys. Plasmas 11, 3787 (2004)], which applied Levy distributed step sizes in the case of supercritical gradients to obtain the up-gradient transport. (c) 2005 American Institute of Physics.
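
    The mechanism can be pictured with a toy one-dimensional random walk in which a particle's step length switches to a larger value wherever the magnitude of the local gradient exceeds a critical threshold. Everything below (grid, step sizes, threshold, background profile) is invented for illustration and is not the model of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

nx = 200
profile = np.linspace(1.0, 0.0, nx)          # imposed background concentration
profile[80:120] += 0.8 * np.hanning(40)      # a local bump creating steep gradients

step_small, step_large, g_crit = 1, 4, 0.02  # illustrative parameters
n_particles, n_steps = 5_000, 500

pos = rng.integers(0, nx, size=n_particles)
for _ in range(n_steps):
    grad = np.abs(np.gradient(profile))[pos]                  # local gradient at each particle
    step = np.where(grad > g_crit, step_large, step_small)    # longer steps in steep regions
    pos = np.clip(pos + rng.choice([-1, 1], size=n_particles) * step, 0, nx - 1)

density, _ = np.histogram(pos, bins=nx, range=(0, nx))
print("Particle count in the steep-gradient region:", density[80:120].sum())
```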

  12. Probabilistic finite elements for fracture mechanics

    Science.gov (United States)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic measures, such as the expectation, covariance, and correlation of stress intensity factors, are calculated for random load, random material, and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.
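
    To give a flavour of what statistics of stress intensity factors under random load and crack length look like, the sketch below propagates uncertainty through the elementary closed-form relation K = Y·σ·√(πa) by Monte Carlo. The real PFEM works with a finite element containing the near-tip singular strain field; this toy example does not attempt that, and all distributions are assumed.

```python
import numpy as np

rng = np.random.default_rng(3)
N = 100_000

Y = 1.12                                   # geometry factor (assumed deterministic here)
sigma = rng.normal(150.0, 15.0, N)         # remote stress [MPa], random load
a = rng.lognormal(np.log(0.005), 0.2, N)   # crack length [m], random

K = Y * sigma * np.sqrt(np.pi * a)         # stress intensity factor [MPa*sqrt(m)]

K_Ic = 30.0                                # assumed fracture toughness
print("mean K:", K.mean(), " std K:", K.std())
print("estimated probability of fracture P(K > K_Ic):", (K > K_Ic).mean())
```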

  13. Kinds of inspiration in interaction design

    DEFF Research Database (Denmark)

    Halskov, Kim

    2010-01-01

    In this paper, we explore the role of sources of inspiration in interaction design. We identify four strategies for relating sources of inspiration to emerging ideas: selection; adaptation; translation; and combination. As our starting point, we argue that sources of inspiration are a form of knowledge crucial to creativity. Our research is based on empirical findings arising from the use of Inspiration Card Workshops, which are collaborative design events in which domain and technology insight are combined to create design concepts. In addition to the systematically introduced sources of inspiration that form part of the workshop format, a number of spontaneous sources of inspiration emerged during these workshops.

  14. A probabilistic physics-of-failure model for prognostic health management of structures subject to pitting and corrosion-fatigue

    International Nuclear Information System (INIS)

    Chookah, M.; Nuhi, M.; Modarres, M.

    2011-01-01

    A combined probabilistic physics-of-failure model for the pitting and corrosion-fatigue degradation mechanisms is proposed to estimate the reliability of structures and to perform prognosis and health management. A mechanistic superposition model for the corrosion-fatigue mechanism was used as a benchmark against which the simpler model was proposed. The proposed model describes the degradation of structures as a function of physical and critical environmental stresses, such as the amplitude and frequency of mechanical loads (for example, caused by internal piping pressure) and the concentration of corrosive chemical agents. The parameters of the proposed model are represented by probability density functions and estimated through a Bayesian approach based on data taken from experiments performed as part of this research. For demonstration, the proposed model provides prognostic information about the reliability of aging structures and is helpful in developing inspection and replacement strategies.
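
    As a schematic of the Bayesian estimation step, the sketch below fits the parameters of a generic power-law degradation model to synthetic pit-depth data with a simple Metropolis sampler. The model form, priors, and data are placeholders, not the corrosion-fatigue model of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic degradation data: pit depth d(t) = C * t**m + noise (placeholder model)
t = np.linspace(1, 50, 25)
true_C, true_m, noise_sd = 0.8, 0.45, 0.5
d_obs = true_C * t**true_m + rng.normal(0, noise_sd, t.size)

def log_post(theta):
    """Log-posterior: uniform priors on (C, m) plus a Gaussian likelihood."""
    C, m = theta
    if C <= 0 or not (0 < m < 1.5):
        return -np.inf
    resid = d_obs - C * t**m
    return -0.5 * np.sum(resid**2) / noise_sd**2

# Simple Metropolis sampler
theta = np.array([1.0, 0.5])
lp = log_post(theta)
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0, [0.05, 0.02])
    lp_prop = log_post(prop)
    if np.log(rng.random()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

samples = np.array(samples[5_000:])          # discard burn-in
print("posterior mean of (C, m):", samples.mean(axis=0))
```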

  15. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...

  16. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  17. Clay Bells: Edo Inspiration

    Science.gov (United States)

    Wagner, Tom

    2010-01-01

    The ceremonial copper and iron bells at the Smithsonian's National Museum of African Art were the author's inspiration for an interdisciplinary unit with a focus on the contributions various cultures make toward the richness of a community. The author of this article describes an Edo bell-inspired ceramic project incorporating slab-building…

  18. Bio-inspired computation in telecommunications

    CERN Document Server

    Yang, Xin-She; Ting, TO

    2015-01-01

    Bio-inspired computation, especially those based on swarm intelligence, has become increasingly popular in the last decade. Bio-Inspired Computation in Telecommunications reviews the latest developments in bio-inspired computation from both theory and application as they relate to telecommunications and image processing, providing a complete resource that analyzes and discusses the latest and future trends in research directions. Written by recognized experts, this is a must-have guide for researchers, telecommunication engineers, computer scientists and PhD students.

  19. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In

  20. Sustaining Physics Teacher Education Coalition programs in physics teacher education

    OpenAIRE

    Rachel E. Scherr; Monica Plisch; Renee Michelle Goertzen

    2017-01-01

    Understanding the mechanisms of increasing the number of physics teachers educated per year at institutions with thriving physics teacher preparation programs may inspire and support other institutions in building thriving programs of their own. The Physics Teacher Education Coalition (PhysTEC), led by the American Physical Society (APS) and the American Association of Physics Teachers (AAPT), has supported transformation of physics teacher preparation programs at a number of institutions aro...

  1. INSPIRE 2012 da Istanbul a Firenze

    Directory of Open Access Journals (Sweden)

    Mauro Salvemini

    2012-09-01

    Full Text Available. During the INSPIRE 2012 conference held in Istanbul, the news that most impressed the Italians present, including those in the public administration, was that the next INSPIRE Conference will take place in Florence from 23 to 27 June 2013.

  3. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    Science.gov (United States)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components or sensor tolerances. Uncertainties in these variables cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g. whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output for optimizing the power available for experiments.
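
    The quantification described, i.e. the variation of a response variable (power output) induced by uncertain random variables, is the standard uncertainty-propagation setup. The snippet below shows the pattern with a made-up power model and made-up input distributions; it bears no relation to the actual ISS electrical power system model.

```python
import numpy as np

rng = np.random.default_rng(5)
N = 200_000

# Hypothetical random variables (illustrative distributions only)
solar_flux = rng.normal(1361.0, 15.0, N)           # W/m^2
pointing_error = np.abs(rng.normal(0.0, 3.0, N))   # degrees off-sun
cell_efficiency = rng.normal(0.14, 0.005, N)
panel_area = 300.0                                  # m^2, treated as known

# Toy response model: output drops with the cosine of the pointing error
power = solar_flux * panel_area * cell_efficiency * np.cos(np.radians(pointing_error))

print(f"mean power: {power.mean()/1e3:.1f} kW, std: {power.std()/1e3:.2f} kW")
print(f"5th percentile: {np.percentile(power, 5)/1e3:.1f} kW")
```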

  4. Probabilistic Infinite Secret Sharing

    OpenAIRE

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...

  5. Isgur–Wise function in a QCD-inspired potential model with WKB ...

    Indian Academy of Sciences (India)

    2017-02-28

    DOI 10.1007/s12043-016-1357-9. Isgur–Wise function in a QCD-inspired potential model with WKB approximation. Bhaskar Jyoti Hazarika (Centre for Theoretical Studies, Pandu College, Guwahati 781 012, India) and D. K. Choudhury (Centre for Theoretical Studies, Pandu College; Physics Academy of North East, Gauhati University, ...)

  6. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  7. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity feature of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, the probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
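
    The probability bookkeeping described above is easy to illustrate: if each axiom carries a probability and a derivation uses a multiset of axioms, the probability of the generated string is the product over those uses, and the language can then be cut off at a threshold. The toy code below shows only that arithmetic with invented axioms and derivations; it is not a sticker-system simulator.

```python
from math import prod

# Hypothetical axioms (initial dominoes) with associated probabilities
axiom_prob = {"A": 0.5, "B": 0.3, "C": 0.2}

# Hypothetical derivations: which axioms (with multiplicity) produce each complete string
derivations = {
    "w1": ["A", "A", "B"],   # uses axiom A twice and B once
    "w2": ["B", "C"],
    "w3": ["A", "C", "C"],
}

# Probability of each string = product of the probabilities of the axioms it uses
string_prob = {w: prod(axiom_prob[x] for x in used) for w, used in derivations.items()}

threshold = 0.05  # probabilistic requirement for membership in the language
language = {w for w, p in string_prob.items() if p >= threshold}
print(string_prob)   # w1 ~ 0.075, w2 ~ 0.06, w3 ~ 0.02
print(language)      # only strings meeting the threshold
```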

  8. All-possible-couplings approach to measuring probabilistic context.

    Directory of Open Access Journals (Sweden)

    Ehtibar N Dzhafarov

    Full Text Available. From behavioral sciences to biology to quantum mechanics, one encounters situations where (i) a system outputs several random variables in response to several inputs, (ii) for each of these responses only some of the inputs may "directly" influence them, but (iii) other inputs provide a "context" for this response by influencing its probabilistic relations to other responses. These contextual influences are very different, say, in classical kinetic theory and in the entanglement paradigm of quantum mechanics, which are traditionally interpreted as representing different forms of physical determinism. One can mathematically construct systems with other types of contextuality, whether or not empirically realizable: those that form special cases of the classical type, those that fall between the classical and quantum ones, and those that violate the quantum type. We show how one can quantify and classify all logically possible contextual influences by studying various sets of probabilistic couplings, i.e., sets of joint distributions imposed on random outputs recorded at different (mutually incompatible) values of inputs.

  9. Nostalgia-Evoked Inspiration: Mediating Mechanisms and Motivational Implications.

    Science.gov (United States)

    Stephan, Elena; Sedikides, Constantine; Wildschut, Tim; Cheung, Wing-Yee; Routledge, Clay; Arndt, Jamie

    2015-10-01

    Six studies examined the nostalgia-inspiration link and its motivational implications. In Study 1, nostalgia proneness was positively associated with inspiration frequency and intensity. In Studies 2 and 3, the recollection of nostalgic (vs. ordinary) experiences increased both general inspiration and specific inspiration to engage in exploratory activities. In Study 4, serial mediational analyses supported a model in which nostalgia increases social connectedness, which subsequently fosters self-esteem, which then boosts inspiration. In Study 5, a rigorous evaluation of this serial mediational model (with a novel nostalgia induction controlling for positive affect) reinforced the idea that nostalgia-elicited social connectedness increases self-esteem, which then heightens inspiration. Study 6 extended the serial mediational model by demonstrating that nostalgia-evoked inspiration predicts goal pursuit (intentions to pursue an important goal). Nostalgia spawns inspiration via social connectedness and attendant self-esteem. In turn, nostalgia-evoked inspiration bolsters motivation. © 2015 by the Society for Personality and Social Psychology, Inc.

  10. Inspired by CERN

    CERN Multimedia

    2004-01-01

    Art students inspired by CERN will be returning to show their work 9 to 16 October in Building 500, outside the Auditorium. Seventeen art students from around Europe visited CERN last January for a week of introductions to particle physics and astrophysics, and discussions with CERN scientists about their projects. A CERN scientist "adopted" each artist so they could ask questions during and after the visit. Now the seeds planted during their visit have come to fruition in a show using many media and exploring varied concepts, such as how people experience the online world, the sheer scale of CERN's equipment, and the abstractness of the entities scientists are looking for. "The work is so varied, people are going to love some pieces and detest others," says Andrew Charalambous, the project coordinator from University College London who is also curating the exhibition. "It's contemporary modern art, and that's sometimes difficult to take in." For more information on this thought-provoking show, see: htt...

  11. Probabilistic numerical discrimination in mice.

    Science.gov (United States)

    Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-03-01

    Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.

  12. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the cord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
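
    The response surface method mentioned here has a simple core: run the deterministic model at a modest number of sampled inputs, fit an inexpensive polynomial to those responses, and then exercise the polynomial heavily for the probabilistic analysis. The sketch below shows that pattern with an invented two-variable stress function standing in for the FEA model and invented input distributions.

```python
import numpy as np

rng = np.random.default_rng(6)

def fea_stress(cord_length, load):
    """Stand-in for a deterministic FEA run returning peak blade stress (invented)."""
    return (120.0 + 40.0 * (load - 1.0) - 25.0 * (cord_length - 0.1) / 0.1
            + 10.0 * (load - 1.0) * (cord_length - 0.1) / 0.1)

# Step 1: sample the design space and run the "expensive" model a few times
cord = rng.uniform(0.08, 0.12, 40)
load = rng.uniform(0.9, 1.1, 40)
stress = fea_stress(cord, load)

# Step 2: fit a response surface (linear terms plus an interaction term)
X = np.column_stack([np.ones_like(cord), cord, load, cord * load])
coef, *_ = np.linalg.lstsq(X, stress, rcond=None)

# Step 3: probabilistic analysis on the cheap surface
cord_mc = rng.normal(0.10, 0.005, 500_000)
load_mc = rng.normal(1.0, 0.03, 500_000)
Xmc = np.column_stack([np.ones_like(cord_mc), cord_mc, load_mc, cord_mc * load_mc])
stress_mc = Xmc @ coef

allowable = 122.0
print("P(stress > allowable) ~", (stress_mc > allowable).mean())
```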

  13. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  14. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  15. Effect of inspiration on airway dimensions measured in maximal inspiration CT images of subjects without airflow limitation

    DEFF Research Database (Denmark)

    Petersen, Jens; Wille, Mathilde M.W.; Raket, Lars Lau

    2014-01-01

    OBJECTIVES: To study the effect of inspiration on airway dimensions measured in voluntary inspiration breath-hold examinations. METHODS: 961 subjects with normal spirometry were selected from the Danish Lung Cancer Screening Trial. Subjects were examined annually for five years with low-dose CT. Automated software was utilized to segment lungs and airways, identify segmental bronchi, and match airway branches in all images of the same subject. Inspiration level was defined as segmented total lung volume (TLV) divided by predicted total lung capacity (pTLC). Mixed-effects models were used to predict ... • The effect of inspiration is greater in higher-generation (more peripheral) airways • Airways of generation 5 and beyond are as distensible as lung parenchyma • Airway dimensions measured from CT should be adjusted for inspiration level.
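
    A minimal sketch of the kind of mixed-effects analysis described (an airway dimension regressed on inspiration level with a random effect per subject) is given below, using synthetic data and the statsmodels library; the variable names and effect sizes are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)

n_subjects, n_scans = 100, 5
rows = []
for subj in range(n_subjects):
    subj_offset = rng.normal(0, 0.3)          # random intercept per subject
    for _ in range(n_scans):
        tlv = rng.normal(6.0, 0.8)            # segmented total lung volume [L]
        ptlc = 6.5                            # predicted total lung capacity [L]
        insp_level = tlv / ptlc               # inspiration level = TLV / pTLC
        diameter = 8.0 + 2.0 * insp_level + subj_offset + rng.normal(0, 0.2)
        rows.append({"subject": subj, "insp_level": insp_level, "diameter": diameter})

df = pd.DataFrame(rows)
model = smf.mixedlm("diameter ~ insp_level", df, groups=df["subject"])
result = model.fit()
print(result.params["insp_level"])   # estimated effect of inspiration level on diameter
```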

  16. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000, it is only in the last few years that proba...

  17. Physics Teachers Programme or how to bring modern physics to school

    CERN Multimedia

    2002-01-01

    A new programme for teachers took place last weekend at CERN. Fifty high school teachers sacrificed their weekend and plunged into CERN physics to find new inspiration for exciting physics lessons.   The fifty participants in the Physics Teachers programme in the Microcosm garden. High school students are often convinced that physics is a boring and useless subject, and physics teachers have a hard time presenting interesting lessons. To help them, CERN has inaugurated the Physics Teachers Programme, whose goal is to present CERN and its activities, so that teachers can get an idea of what kind of physics research is going on at the frontier of science. 'Our philosophy is to show them today what they will read in textbooks of the future', says Antonella Del Rosso, responsible for this course, 'so that they can inspire their pupils'. This programme can be considered the younger brother of the High School Teachers Programme, since it has a similar aim, but instead of being three weeks long, it las...

  18. Bayesian networks for identifying incorrect probabilistic intuitions in a climate trend uncertainty quantification context

    NARCIS (Netherlands)

    Hanea, A.M.; Nane, G.F.; Wielicki, B.A.; Cooke, R.M.

    2018-01-01

    Probabilistic thinking can often be unintuitive. This is the case even for simple problems, let alone the more complex ones arising in climate modelling, where disparate information sources need to be combined. The physical models, the natural variability of systems, the measurement errors and

  19. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitioned converter generated voltage harmonics into those with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3MW, Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic...
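
    The distinction between harmonics with deterministic phase and harmonics with probabilistic phase matters when summing contributions from many turbines: deterministic-phase components add coherently, while random-phase components partially cancel. The snippet below illustrates only that summation effect with invented magnitudes; it is not the sequence-domain model of the paper.

```python
import numpy as np

rng = np.random.default_rng(8)
n_turbines, n_trials = 10, 50_000

# Invented per-turbine harmonic magnitudes (arbitrary units)
mag_det = 0.02    # harmonic with deterministic (common) phase
mag_rand = 0.02   # harmonic with uniformly random phase

# Deterministic-phase harmonic: phasors align, magnitudes add linearly
total_det = n_turbines * mag_det

# Probabilistic-phase harmonic: sum of randomly rotated phasors, by Monte Carlo
phases = rng.uniform(0, 2 * np.pi, size=(n_trials, n_turbines))
total_rand = np.abs((mag_rand * np.exp(1j * phases)).sum(axis=1))

print("coherent sum:", total_det)
print("random-phase sum: mean", total_rand.mean().round(4),
      "95th percentile", np.percentile(total_rand, 95).round(4))
```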

  20. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  1. A logic for inductive probabilistic reasoning

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework...
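
    To make the inference patterns concrete: direct inference assigns P(B|a) = 0.7 from the statistic "70% of As are Bs", and Jeffrey's rule generalizes this to updating on an uncertain partition. The snippet below works through Jeffrey's rule on an invented two-by-two distribution.

```python
# Prior joint distribution over (A, B): invented numbers for illustration
prior = {
    ("A", "B"): 0.35, ("A", "not_B"): 0.15,
    ("not_A", "B"): 0.20, ("not_A", "not_B"): 0.30,
}

# New (uncertain) evidence about the A-partition, e.g. from statistical background info
new_marginal = {"A": 0.7, "not_A": 0.3}

# Jeffrey's rule: P_new(A_i and B_j) = P_new(A_i) * P_old(B_j | A_i)
prior_marginal = {a: sum(p for (aa, _), p in prior.items() if aa == a) for a in new_marginal}
posterior = {(a, b): new_marginal[a] * p / prior_marginal[a] for (a, b), p in prior.items()}

p_B = sum(p for (_, b), p in posterior.items() if b == "B")
print(round(p_B, 4))  # updated probability of B after redistributing belief over A
```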

  2. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  3. A bio-inspired spatial patterning circuit.

    Science.gov (United States)

    Chen, Kai-Yuan; Joe, Danial J; Shealy, James B; Land, Bruce R; Shen, Xiling

    2014-01-01

    Lateral Inhibition (LI) is a widely conserved patterning mechanism in biological systems across species. Distinct from better-known Turing patterns, LI depends on cell-cell contact rather than diffusion. We built an in silico genetic circuit model to analyze the dynamic properties of LI. The model revealed that LI amplifies differences between neighboring cells to push them into opposite states, hence forming stable 2-D patterns. Inspired by this insight, we designed and implemented an electronic circuit that recapitulates LI patterning dynamics. This biomimetic system serves as a physical model to elucidate the design principle of generating robust patterning through spatial feedback, regardless of whether the underlying devices are biological or electrical.

  4. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    Fragola, J.R.; Shooman, M.L.

    1991-01-01

    Investigating failure in systems where failure is a rare event makes continual comparison between the derived probabilities and empirical evidence difficult. The comparison of the predictions of rare-event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10^-3 per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that the analysis assumptions require continual reassessment and the analysis predictions must be bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light; attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions; and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality and the associated impact on PSA probabilistic estimates.
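
    A worked version of the bounding argument: if a structure has survived T years with no observed failure, a classical upper confidence bound on a constant annual event rate is -ln(alpha)/T (roughly 3/T at 95% confidence), assuming a Poisson process. The snippet below applies that textbook formula to the two-millennia aqueduct example; the numbers are illustrative, not a seismic hazard analysis.

```python
import math

def rate_upper_bound(years_observed: float, confidence: float = 0.95) -> float:
    """Upper confidence bound on an annual event rate given zero observed events,
    assuming a Poisson process (the 'rule of three' at 95% confidence)."""
    alpha = 1.0 - confidence
    return -math.log(alpha) / years_observed

# Roman aqueduct standing ~2000 years with no visible earthquake damage
print(f"{rate_upper_bound(2000):.2e} per year")   # about 1.5e-3 per year at 95% confidence
```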

  5. Arbitrage and Hedging in a non probabilistic framework

    OpenAIRE

    Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo

    2011-01-01

    The paper studies the concepts of hedging and arbitrage in a non probabilistic framework. It provides conditions for non probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate the non probabilistic arbitrage as well perfect replication of options under continuous and discontinuous trajectories, the results can then be applied in probabilistic models path by path. The approach is r...

  6. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricite de France has carried out probabilistic assessment of accident sequences for nuclear power plants. In the framework of this program many methods were developed. As the interest in these studies was increasing and as adapted methods were developed, Electricite de France has undertaken a probabilistic safety assessment of a nuclear power plant [fr

  7. Convex sets in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Aghajani, Asadollah; Nourouzi, Kourosh

    2008-01-01

    In this paper we obtain some results on convexity in a probabilistic normed space. We also investigate the concept of CSN-closedness and CSN-compactness in a probabilistic normed space and generalize the corresponding results of normed spaces

  8. Probabilistic finite elements

    Science.gov (United States)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probabilistic density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings with the yield stress on the random field are given.
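
    The second-order perturbation idea can be shown on a scalar toy problem: expand the response about the mean inputs and use the input variances to obtain the response mean and variance without sampling. Below is a first/second-order moment propagation for an invented bar-displacement response; the real PFEM applies this machinery to full finite element systems.

```python
import numpy as np

# Toy response: tip displacement of a bar, u = P * L / (E * A)
L, A = 2.0, 1e-3                            # deterministic geometry [m, m^2]
mu = {"P": 10e3, "E": 200e9}                # mean load [N] and Young's modulus [Pa]
var = {"P": (1e3) ** 2, "E": (10e9) ** 2}   # assumed variances

def u(P, E):
    return P * L / (E * A)

# Finite-difference sensitivities at the mean point
h = 1e-4
dP = (u(mu["P"] * (1 + h), mu["E"]) - u(mu["P"] * (1 - h), mu["E"])) / (2 * h * mu["P"])
dE = (u(mu["P"], mu["E"] * (1 + h)) - u(mu["P"], mu["E"] * (1 - h))) / (2 * h * mu["E"])
d2E = (u(mu["P"], mu["E"] * (1 + h)) - 2 * u(mu["P"], mu["E"])
       + u(mu["P"], mu["E"] * (1 - h))) / (h * mu["E"]) ** 2

mean_u = u(mu["P"], mu["E"]) + 0.5 * d2E * var["E"]   # second-order mean correction
var_u = dP**2 * var["P"] + dE**2 * var["E"]           # first-order variance estimate

print(f"mean displacement ~ {mean_u:.3e} m, std ~ {np.sqrt(var_u):.3e} m")
```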

  9. Probabilistic Dynamics for Integrated Analysis of Accident Sequences considering Uncertain Events

    Directory of Open Access Journals (Sweden)

    Robertas Alzbutas

    2015-01-01

    Full Text Available The analytical/deterministic modelling and simulation/probabilistic methods are, as a rule, used separately to analyse physical processes and random or uncertain events. However, this is an issue in currently used probabilistic safety assessment: the lack of treatment of dynamic interactions between the physical processes on the one hand and random events on the other limits the assessment. In general, there are many mathematical modelling theories which can be used separately or integrated in order to extend the possibilities of modelling and analysis. The Theory of Probabilistic Dynamics (TPD) and its augmented version based on the concept of stimulus and delay are introduced for dynamic reliability modelling and the simulation of accidents in hybrid (continuous-discrete) systems considering uncertain events. An approach of non-Markovian simulation and uncertainty analysis is discussed in order to adapt the Stimulus-Driven TPD for practical applications. The developed approach and related methods are used as a basis for a test case simulation in view of various methods' applications for severe accident scenario simulation and uncertainty analysis. For this, and for wider analysis of accident sequences, the initial test case specification is then extended and discussed. Finally, it is concluded that enhancing the modelling of stimulated dynamics with uncertainty and sensitivity analysis allows the detailed simulation of complex system characteristics and representation of their uncertainty. The developed approach of accident modelling and analysis can be efficiently used to estimate the reliability of hybrid systems and at the same time to analyze and possibly decrease the uncertainty of this estimate.

  10. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order

  11. La maturità di INSPIRE

    Directory of Open Access Journals (Sweden)

    Mauro Salvemini

    2010-03-01

    Full Text Available INSPIRE's maturity. The INSPIRE Conference 2010 took place from 23 to 25 June 2010 in Kraków, Poland. Pre-conference workshops were organized on 22 June. The theme of this year's edition was "INSPIRE as a Framework for Cooperation". The INSPIRE Conference was organised as a series of plenary sessions addressing common policy issues, parallel sessions focusing in particular on applications and implementations of SDIs, research issues, and new and evolving technologies and applications, and poster presentations.

  12. Application of quercetin and its bio-inspired nanoparticles as anti-adhesive agents against Bacillus subtilis attachment to surface

    Energy Technology Data Exchange (ETDEWEB)

    Raie, Diana S., E-mail: raiediana@yahoo.com [Process Design and Development Department, Egyptian Petroleum Research Institute (EPRI), Nasr City, Cairo (Egypt); Mhatre, Eisha [Terrestrial Biofilms Group, Institute of Microbiology, Friedrich Schiller University Jena (FSU), Jena (Germany); Thiele, Matthias [Nanobiophotonic Department, Leibniz Institute of Photonic Technology Jena (IPHT), Jena (Germany); Labena, A. [Process Design and Development Department, Egyptian Petroleum Research Institute (EPRI), Nasr City, Cairo (Egypt); El-Ghannam, Gamal [National Institute of Laser Enhanced Sciences (NILES), Cairo University, Giza (Egypt); Farahat, Laila A. [Process Design and Development Department, Egyptian Petroleum Research Institute (EPRI), Nasr City, Cairo (Egypt); Youssef, Tareq [National Institute of Laser Enhanced Sciences (NILES), Cairo University, Giza (Egypt); Fritzsche, Wolfgang [Nanobiophotonic Department, Leibniz Institute of Photonic Technology Jena (IPHT), Jena (Germany); Kovács, Ákos T., E-mail: akos-tibor.kovacs@uni-jena.de [Terrestrial Biofilms Group, Institute of Microbiology, Friedrich Schiller University Jena (FSU), Jena (Germany)

    2017-01-01

    The aim of this study was to reveal the repulsive effect of glass slides coated with quercetin and its bio-inspired titanium oxide and tungsten oxide nanoparticles on the physical surface attachment of Bacillus subtilis, the ab-initio step of biofilm formation. Nanoparticles were successfully synthesized using sol–gel and acid precipitation methods for titanium oxide and tungsten oxide, respectively (in the absence or presence of quercetin). The anti-adhesive impact of the coated slides was tested through the physical attachment of B. subtilis after 24 h using Confocal Laser Scanning Microscopy (CLSM). Quercetin is presented here as a bio-route for the synthesis of tungsten mixed-oxide nano-plates at room temperature. In addition, quercetin had an impact on the zeta potential and adsorption capacity of both bio-inspired amorphous titanium oxide and tungsten oxide nano-plates. Interestingly, our experiments indicated an anti-adhesive effect of quercetin contrary to that previously reported, whereas its bio-inspired metal oxides proved repulsively efficient. In addition, quercetin-mediated nano-tungsten and quercetin-mediated amorphous titanium showed anti-adhesive activity against B. subtilis biofilm. - Highlights: • Novel quercetin-mediated nanoparticles were tested for anti-adhesion against attachment of cells forming biofilms. • Quercetin showed a low-grade protection level against bacterial attachment. • Bio-inspired nano-anatase showed a lower efficiency than amorphous titanium. • Thermally treated bio-inspired nano-tungsten has improved anti-adhesive activity.

  13. Consideration of aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Titina, B.; Cepin, M.

    2007-01-01

    Probabilistic safety assessment is a standardised tool for assessing the safety of nuclear power plants and a complement to the safety analyses. Standard probabilistic models of safety equipment assume a constant component failure rate. Ageing of systems, structures and components can theoretically be included in a new, age-dependent probabilistic safety assessment, which generally makes the failure rate a function of age. New age-dependent probabilistic safety assessment models, which offer explicit calculation of the ageing effects, are developed. Several groups of components are considered, each requiring its own model: e.g. operating components and stand-by components. The developed component-level models are inserted into the models of the probabilistic safety assessment so that the ageing effects are evaluated for complete systems. The preliminary results show that the lack of data necessary for consideration of ageing makes the models, and consequently the results, highly uncertain. (author)

  14. Implications of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Cullingford, M.C.; Shah, S.M.; Gittus, J.H.

    1987-01-01

    Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessments for specific reactor types or components and specific risks (e.g. aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.)

  15. Probabilistic graphical models to deal with age estimation of living persons.

    Science.gov (United States)

    Sironi, Emanuele; Gallidabino, Matteo; Weyermann, Céline; Taroni, Franco

    2016-03-01

    Due to the rise of criminal, civil and administrative judicial situations involving people lacking valid identity documents, age estimation of living persons has become an important operational procedure for numerous forensic and medicolegal services worldwide. The chronological age of a given person is generally estimated from the observed degree of maturity of some selected physical attributes by means of statistical methods. However, their application in the forensic framework suffers from some conceptual and practical drawbacks, as recently claimed in the specialised literature. The aim of this paper is therefore to offer an alternative solution for overcoming these limits, by reiterating the utility of a probabilistic Bayesian approach for age estimation. This approach allows one to deal in a transparent way with the uncertainty surrounding the age estimation process and to produce all the relevant information in the form of posterior probability distribution about the chronological age of the person under investigation. Furthermore, this probability distribution can also be used for evaluating in a coherent way the possibility that the examined individual is younger or older than a given legal age threshold having a particular legal interest. The main novelty introduced by this work is the development of a probabilistic graphical model, i.e. a Bayesian network, for dealing with the problem at hand. The use of this kind of probabilistic tool can significantly facilitate the application of the proposed methodology: examples are presented based on data related to the ossification status of the medial clavicular epiphysis. The reliability and the advantages of this probabilistic tool are presented and discussed.
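
    The core of the Bayesian approach described above is a posterior distribution over chronological age given an observed maturity stage, from which quantities such as P(age >= 18) follow directly. The sketch below uses a discretized age prior and entirely invented stage likelihoods to show that computation; it is not the clavicular-epiphysis model of the paper.

```python
import numpy as np

ages = np.arange(10, 31)                      # discretized chronological ages
prior = np.ones_like(ages, dtype=float)       # flat prior over the age range
prior /= prior.sum()

def likelihood(stage: int, age: np.ndarray) -> np.ndarray:
    """Invented P(observed ossification stage | age): later stages become likely with age."""
    centers = {1: 14.0, 2: 18.0, 3: 23.0}
    return np.exp(-0.5 * ((age - centers[stage]) / 2.5) ** 2)

observed_stage = 3
posterior = prior * likelihood(observed_stage, ages.astype(float))
posterior /= posterior.sum()

p_adult = posterior[ages >= 18].sum()
print(f"P(age >= 18 | stage {observed_stage}) = {p_adult:.3f}")
```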

  16. Nature-inspired optimization algorithms

    CERN Document Server

    Yang, Xin-She

    2014-01-01

    Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning

  17. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Aldini, A.; Baier, C.

    2008-01-01

    The notion of branching bisimulation for the alternating model of probabilistic systems is not a congruence with respect to parallel composition. In this paper we first define another branching bisimulation in the more general model allowing consecutive probabilistic transitions, and we prove that

  18. A probabilistic approach for the computation of non-linear vibrations of tubes under cross-flow

    International Nuclear Information System (INIS)

    Payen, Th.; Langre, E. de.

    1996-01-01

    For the predictive analysis of flow-induced vibration and wear of tube bundles, a probabilistic method is proposed that takes into account the uncertainties of the physical parameters. Monte Carlo simulations are performed to estimate the probability density function of the wear work rate, and a sensitivity analysis is performed on the physical parameters influencing wear for the case of a loosely supported tube under cross-flow. (authors). 8 refs., 8 figs
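
    Schematically, the procedure is: sample the uncertain physical parameters, evaluate a wear work-rate model for each sample, and examine the resulting distribution and parameter sensitivities. The snippet below does this with a made-up work-rate expression and made-up distributions, purely to show the Monte Carlo pattern.

```python
import numpy as np

rng = np.random.default_rng(9)
N = 100_000

# Hypothetical uncertain physical parameters (illustrative distributions)
gap = rng.uniform(0.1e-3, 0.5e-3, N)            # tube-support clearance [m]
velocity = rng.normal(2.0, 0.2, N)              # cross-flow velocity [m/s]
damping = rng.lognormal(np.log(0.01), 0.3, N)   # damping ratio

# Made-up wear work-rate expression, for illustration only
work_rate = 1e-3 * velocity**3 * gap / damping

print("median work rate:", np.median(work_rate))
print("95th percentile :", np.percentile(work_rate, 95))

def rank(x):
    """Return the rank of each element (0 = smallest)."""
    return np.argsort(np.argsort(x))

# Crude sensitivity measure: Spearman (rank) correlation of each input with the output
for name, x in [("gap", gap), ("velocity", velocity), ("damping", damping)]:
    rho = np.corrcoef(rank(x), rank(work_rate))[0, 1]
    print(f"rank correlation with {name}: {rho:+.2f}")
```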

  19. A bio-inspired approach for the reduction of left ventricular workload.

    Directory of Open Access Journals (Sweden)

    Niema M Pahlevan

    Full Text Available Previous studies have demonstrated the existence of optimization criteria in the design and development of mammalian cardiovascular systems. Similarities in mammalian arterial wave reflection suggest there are certain design criteria for the optimization of arterial wave dynamics. Inspired by these natural optimization criteria, we investigated the feasibility of optimizing the aortic waves by modifying wave reflection sites. A hydraulic model that has physical and dynamical properties similar to a human aorta and left ventricle was used for a series of in-vitro experiments. The results indicate that placing an artificial reflection site (a ring) at a specific location along the aorta may create a constructive wave dynamic that could reduce LV pulsatile workload. This simple bio-inspired approach may have important implications for future treatment strategies for the diseased aorta.

  20. Geo-inspired model: Agents vectors naturals inspired by the environmental management (AVNG) of water tributaries

    Directory of Open Access Journals (Sweden)

    Edwin Eduardo Millán Rojas

    2018-02-01

    Full Text Available Context: Management to care for the environment and the Earth (geo) can be a source of inspiration for developing models that address complexity issues; the objective of this research was to develop an additional aspect of such inspired models. The geo-inspired model has two features: the first covers aspects related to environmental management and the behavior of natural resources, and the second has a spatial-location component associated with existing objects on the Earth's surface. Method: The approach developed in the research is descriptive and its main objective is the representation or characterization of a case study within a particular context. Results: The result was the design of a model to emulate the natural behavior of the water tributaries of the Amazon foothills, in order to extend the application of inspired models and allow the use of elements such as geo-referencing and environmental management. The proposed geo-inspired model is called "agents vectors naturals inspired by environmental management". Conclusions: The agents vectors naturals inspired by environmental management are polyform elements that can assume the behavior of environmental entities, which makes it possible to achieve progress in other fields of environmental management (use of soil, climate, flora, fauna) and to link environmental issues with the structure of the proposed model.

  1. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  2. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  3. Probabilistic machine learning and artificial intelligence

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  4. Use and Communication of Probabilistic Forecasts.

    Science.gov (United States)

    Raftery, Adrian E

    2016-12-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.
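
    As a small illustration of the recommended summaries, the snippet below reduces an ensemble (probabilistic) forecast to a single percentile and to the probability of exceeding an adverse-event threshold; the ensemble values and the threshold are invented.

```python
import numpy as np

rng = np.random.default_rng(10)

# Hypothetical ensemble forecast of tomorrow's river level [m]
ensemble = rng.normal(loc=3.2, scale=0.4, size=500)

flood_threshold = 4.0
p_adverse = (ensemble > flood_threshold).mean()   # probability of adverse event
q90 = np.percentile(ensemble, 90)                 # single-percentile summary

print(f"P(level > {flood_threshold} m) = {p_adverse:.2f}")
print(f"90th percentile forecast level = {q90:.2f} m")
```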

  5. Use and Communication of Probabilistic Forecasts

    Science.gov (United States)

    Raftery, Adrian E.

    2015-01-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don’t need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications. PMID:28446941
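
    A hedged illustration of the summaries recommended above (a single percentile and the probability of an adverse event), computed from a probabilistic forecast represented as an ensemble of equally likely samples. The sample values and the flood threshold are invented for illustration and are not taken from the problems reviewed in the record.

    ```python
    import numpy as np

    # Hypothetical ensemble of 1,000 equally likely forecast samples
    # (say, river discharge in m^3/s); the numbers are made up.
    rng = np.random.default_rng(0)
    forecast_samples = rng.lognormal(mean=3.0, sigma=0.4, size=1000)

    # A single percentile of the predictive distribution (here the 90th),
    # appropriate when cognitive load should be kept low.
    p90 = np.percentile(forecast_samples, 90)

    # Probability of an adverse event: exceeding a hypothetical flood threshold.
    threshold = 40.0
    p_exceed = np.mean(forecast_samples > threshold)

    print(f"90th percentile forecast: {p90:.1f}")
    print(f"P(discharge > {threshold}): {p_exceed:.2f}")
    ```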

  6. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
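
    As a minimal sketch of the general principle that a numerical routine can report an uncertainty alongside its result, the following uses plain Monte Carlo integration and returns the standard error of the estimate; it is only an illustration of the idea, not one of the probabilistic numerical algorithms discussed in the record.

    ```python
    import numpy as np

    def mc_integrate(f, a, b, n=10_000, seed=0):
        """Estimate the integral of f over [a, b]; return (estimate, standard error).

        The standard error plays the role of the uncertainty induced by carrying
        out the calculation with a limited number of samples.
        """
        rng = np.random.default_rng(seed)
        x = rng.uniform(a, b, size=n)
        vals = (b - a) * f(x)
        return vals.mean(), vals.std(ddof=1) / np.sqrt(n)

    est, err = mc_integrate(np.sin, 0.0, np.pi)  # exact value is 2
    print(f"integral estimate {est:.4f} +/- {err:.4f}")
    ```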

  7. Artificial heartbeat: design and fabrication of a biologically inspired pump

    International Nuclear Information System (INIS)

    Walters, Peter; Stephenson, Robert; Lewis, Amy; Stinchcombe, Andrew; Ieropoulos, Ioannis

    2013-01-01

    We present a biologically inspired actuator exhibiting a novel pumping action. The design of the ‘artificial heartbeat’ actuator is inspired by physical principles derived from the structure and function of the human heart. The actuator employs NiTi artificial muscles and is powered by electrical energy generated by microbial fuel cells (MFCs). We describe the design and fabrication of the actuator and report the results of tests conducted to characterize its performance. This is the first artificial muscle-driven pump to be powered by MFCs fed on human urine. Results are presented in terms of the peak pumping pressure generated by the actuator, as well as for the volume of fluid transferred, when the actuator was powered by energy stored in a capacitor bank, which was charged by 24 MFCs fed on urine. The results demonstrate the potential for the artificial heartbeat actuator to be employed as a fluid circulation pump in future generations of MFC-powered robots (‘EcoBots’) that extract energy from organic waste. We also envisage that the actuator could in the future form part of a bio-robotic artwork or ‘bio-automaton’ that could help increase public awareness of research in robotics, bio-energy and biologically inspired design. (paper)

  8. Probabilistic evaluation of fire protection features found in nuclear power plants

    International Nuclear Information System (INIS)

    Azarm, M.A.; Boccio, J.L.; Ruger, C.

    1985-01-01

    This paper describes a method which can be used to evaluate, on a relative basis, the NRC Fire Protection (FP) guidelines as found in Section 9.5.1 (Fire Protection) of the Standard Review Plan (SRP). The approach, a hybrid of existing physical models for fire propagation determinations and probabilistic models for fire-mitigation system reliability, can potentially be used as an adjunct to the present fire safety review process

  9. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  10. Radioactivity, a pragmatic pillar of probabilistic conceptions

    International Nuclear Information System (INIS)

    Amaldi, E.

    1979-01-01

    The author expresses his opinion that by looking at the problem of the repudiation of causality in physics from the most general and distant point of view, one can be brought to over-estimate the extrinsic influences and overlook intrinsic arguments inherent in two parallel, almost independent developments. The first one starts from the kinetic theory of gases and passes through statistical mechanics, Planck's original definition of the quantum, photons conceived as particles, and the relations between the emission and absorption of photons by atoms. The other path, also intrinsic to physics, starts with the accidental discovery of radioactive substances, passes through the experimental recognition of their decay properties and quickly finds its natural settlement in a probabilistic conception which can be accused of being acritical but certainly has a sound pragmatic ground, uncorrelated or at most extremely loosely correlated with contemporary or pre-existing philosophical lines of thought. (Auth.)

  11. Norsk inspiration til uddannelse og job

    DEFF Research Database (Denmark)

    Skovhus, Randi Boelskifte; Thomsen, Rie; Buhl, Rita

    2017-01-01

    Review of a book about the Norwegian school subject Utdanningsvalg (educational choice) - inspiration for work with education and jobs.

  12. Feeling Is Believing: Inspiration Encourages Belief in God.

    Science.gov (United States)

    Critcher, Clayton R; Lee, Chan Jean

    2018-05-01

    Even without direct evidence of God's existence, about half of the world's population believes in God. Although previous research has found that people arrive at such beliefs intuitively instead of analytically, relatively little research has aimed to understand what experiences encourage or legitimate theistic belief systems. Using cross-cultural correlational and experimental methods, we investigated whether the experience of inspiration encourages a belief in God. Participants who dispositionally experience more inspiration, were randomly assigned to relive or have an inspirational experience, or reported such experiences to be more inspirational all showed stronger belief in God. These effects were specific to inspiration (instead of adjacent affective experiences) and a belief in God (instead of other empirically unverifiable claims). Being inspired by someone or something (but not inspired to do something) offers a spiritually transcendent experience that elevates belief in God, in part because it makes people feel connected to something beyond themselves.

  13. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    Science.gov (United States)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problems in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.

  14. Why do probabilistic finite element analysis ?

    CERN Document Server

    Thacker, Ben H

    2008-01-01

    The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.

  15. Error Discounting in Probabilistic Category Learning

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  16. Probabilistic Space Weather Forecasting: a Bayesian Perspective

    Science.gov (United States)

    Camporeale, E.; Chandorkar, M.; Borovsky, J.; Care', A.

    2017-12-01

    Most of the Space Weather forecasts, both at operational and research level, are not probabilistic in nature. Unfortunately, a prediction that does not provide a confidence level is not very useful in a decision-making scenario. Nowadays, forecast models range from purely data-driven, machine learning algorithms, to physics-based approximation of first-principle equations (and everything that sits in between). Uncertainties pervade all such models, at every level: from the raw data to finite-precision implementation of numerical methods. The most rigorous way of quantifying the propagation of uncertainties is by embracing a Bayesian probabilistic approach. One of the simplest and most robust machine learning techniques in the Bayesian framework is Gaussian Process regression and classification. Here, we present the application of Gaussian Processes to the problems of the DST geomagnetic index forecast, the solar wind type classification, and the estimation of diffusion parameters in radiation belt modeling. In each of these very diverse problems, the GP approach rigorously provides forecasts in the form of predictive distributions. In turn, these distributions can be used as input for ensemble simulations in order to quantify the amplification of uncertainties. We show that we have achieved excellent results in all of the standard metrics to evaluate our models, with very modest computational cost.
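
    A minimal Gaussian-process regression sketch in the spirit of the abstract, written with scikit-learn on synthetic data; the kernel choice and the toy time series are stand-ins, not the actual Dst-index inputs or the models used in the record.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF, WhiteKernel

    # Synthetic stand-in for a geomagnetic index time series (hypothetical data).
    rng = np.random.default_rng(1)
    t_train = np.linspace(0.0, 10.0, 40).reshape(-1, 1)
    y_train = np.sin(t_train).ravel() + 0.2 * rng.standard_normal(40)

    # RBF kernel plus a white-noise term; hyperparameters are fitted by maximum likelihood.
    kernel = 1.0 * RBF(length_scale=1.0) + WhiteKernel(noise_level=0.1)
    gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True)
    gp.fit(t_train, y_train)

    # The forecast is a predictive distribution: a mean and a standard deviation
    # at every query time rather than a single point value.
    t_query = np.linspace(0.0, 12.0, 50).reshape(-1, 1)
    mean, std = gp.predict(t_query, return_std=True)
    print(f"forecast at t=12: {mean[-1]:.2f} +/- {std[-1]:.2f}")
    ```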

  17. Probabilistic programming in Python using PyMC3

    Directory of Open Access Journals (Sweden)

    John Salvatier

    2016-04-01

    Full Text Available Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information which is often not readily available. PyMC3 is a new open source probabilistic programming framework written in Python that uses Theano to compute gradients via automatic differentiation as well as compile probabilistic programs on-the-fly to C for increased speed. Contrary to other probabilistic programming languages, PyMC3 allows model specification directly in Python code. The lack of a domain specific language allows for great flexibility and direct interaction with the model. This paper is a tutorial-style introduction to this software package.
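
    A minimal PyMC3 model in the package's style of specifying models directly in Python; the data are synthetic and the model (inferring a single unknown mean) is deliberately trivial, so it only illustrates the workflow described in the paper.

    ```python
    import numpy as np
    import pymc3 as pm

    # Synthetic observations drawn from a normal distribution with unknown mean.
    data = np.random.default_rng(2).normal(loc=1.5, scale=1.0, size=100)

    with pm.Model() as model:
        mu = pm.Normal("mu", mu=0.0, sd=10.0)                  # prior on the mean
        obs = pm.Normal("obs", mu=mu, sd=1.0, observed=data)   # likelihood
        # NUTS, a Hamiltonian Monte Carlo variant, uses gradients that PyMC3
        # obtains from Theano via automatic differentiation.
        trace = pm.sample(1000, tune=1000, chains=2, progressbar=False)

    print(pm.summary(trace))
    ```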

  18. Comparing Categorical and Probabilistic Fingerprint Evidence.

    Science.gov (United States)

    Garrett, Brandon; Mitchell, Gregory; Scurich, Nicholas

    2018-04-23

    Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury-eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities. © 2018 American Academy of Forensic Sciences.

  19. xLPR - a probabilistic approach to piping integrity analysis

    International Nuclear Information System (INIS)

    Harrington, C.; Rudland, D.; Fyfitch, S.

    2015-01-01

    The xLPR Code is a probabilistic fracture mechanics (PFM) computational tool that can be used to quantitatively determine a best-estimate probability of failure with well characterized uncertainties for reactor coolant system components, beginning with the piping systems and including the effects of relevant active degradation mechanisms. The initial application planned for xLPR is somewhat narrowly focused on validating LBB (leak-before-break) compliance in PWSCC-susceptible systems such as coolant systems of PWRs. The xLPR code incorporates a set of deterministic models that represent the full range of physical phenomena necessary to evaluate both fatigue and PWSCC degradation modes from crack initiation through failure. These models are each implemented in a modular form and linked together by a probabilistic framework that contains the logic for xLPR execution, exercises the individual modules as required, and performs necessary administrative and bookkeeping functions. The completion of the first production version of the xLPR code in a fully documented, releasable condition is presently planned for spring 2015

  20. Global/local methods for probabilistic structural analysis

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a more refined local model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate a significant computer savings with minimal loss in accuracy.
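
    The method above leans on the most probable point (MPP) from first-order reliability analysis. As a hedged aside, the sketch below computes the textbook MPP and failure probability for a linear limit state with two normal variables; it is not the advanced mean value or global/local algorithm itself, only the underlying concept, and the capacity/demand statistics are invented.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Limit state g = R - S (capacity minus demand), both normally distributed.
    mu_R, sigma_R = 100.0, 10.0   # hypothetical capacity statistics
    mu_S, sigma_S = 70.0, 15.0    # hypothetical demand statistics

    # Reliability index: distance from the origin to the limit-state surface
    # g = 0 in standard normal (u) space.
    h = np.hypot(sigma_R, sigma_S)
    beta = (mu_R - mu_S) / h

    # The MPP is the point on g = 0 closest to the origin in u-space.
    alpha = np.array([-sigma_R, sigma_S]) / h
    u_mpp = beta * alpha
    x_mpp = np.array([mu_R, mu_S]) + u_mpp * np.array([sigma_R, sigma_S])

    p_failure = norm.cdf(-beta)
    print(f"beta = {beta:.2f}, P(failure) = {p_failure:.4f}, MPP (R, S) = {x_mpp}")
    ```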

  1. Probabilistic Simulation of Multi-Scale Composite Behavior

    Science.gov (United States)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.

  2. Probabilistic Cue Combination: Less Is More

    Science.gov (United States)

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  3. Paradigms for biologically inspired design

    DEFF Research Database (Denmark)

    Lenau, T. A.; Metzea, A.-L.; Hesselberg, T.

    2018-01-01

    Biologically inspired design is attracting increasing interest since it offers access to a huge biological repository of well-proven design principles that can be used for developing new and innovative products. Biological phenomena can inspire product innovation in as diverse areas as mechanical engineering, medical engineering, nanotechnology, photonics, environmental protection and agriculture. However, a major obstacle for the wider use of biologically inspired design is the knowledge barrier that exists between the application engineers that have insight into how to design suitable products and the biologists with detailed knowledge and experience in understanding how biological organisms function in their environment. The biologically inspired design process can therefore be approached using different design paradigms depending on the dominant opportunities, challenges and knowledge characteristics...

  4. Delineating probabilistic species pools in ecology and biogeography

    OpenAIRE

    Karger, Dirk Nikolaus; Cord, Anna F; Kessler, Michael; Kreft, Holger; Kühn, Ingolf; Pompe, Sven; Sandel, Brody; Sarmento Cabral, Juliano; Smith, Adam B; Svenning, Jens-Christian; Tuomisto, Hanna; Weigelt, Patrick; Wesche, Karsten

    2016-01-01

    Aim To provide a mechanistic and probabilistic framework for defining the species pool based on species-specific probabilities of dispersal, environmental suitability and biotic interactions within a specific temporal extent, and to show how probabilistic species pools can help disentangle the geographical structure of different community assembly processes. Innovation Probabilistic species pools provide an improved species pool definition based on probabilities in conjuncti...
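
    A small numerical sketch of the definition above, taking the probability that a species belongs to a site's pool as the product of its dispersal, environmental-suitability and biotic-interaction probabilities; the species names, the values and the use of a simple product are all assumptions made for illustration.

    ```python
    import numpy as np

    species = ["sp_A", "sp_B", "sp_C"]

    # Hypothetical species-specific probabilities for one focal site.
    p_dispersal   = np.array([0.90, 0.40, 0.05])   # can the species reach the site?
    p_environment = np.array([0.80, 0.95, 0.60])   # is the environment suitable?
    p_biotic      = np.array([0.70, 0.50, 0.90])   # do biotic interactions allow it?

    # Probabilistic species pool: membership is a probability, not a yes/no call.
    p_pool = p_dispersal * p_environment * p_biotic
    for name, p in zip(species, p_pool):
        print(f"{name}: pool membership probability = {p:.2f}")

    # Expected species-pool size for this site.
    print("expected pool size:", round(p_pool.sum(), 2))
    ```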

  5. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...

  6. Probabilistic Learning by Rodent Grid Cells.

    Science.gov (United States)

    Cheung, Allen

    2016-10-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, despite this being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments is reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  7. Guard Cell and Tropomyosin Inspired Chemical Sensor

    Directory of Open Access Journals (Sweden)

    Jacquelyn K.S. Nagel

    2013-10-01

    Full Text Available Sensors are an integral part of many engineered products and systems. Biological inspiration has the potential to improve current sensor designs as well as inspire innovative ones. This paper presents the design of an innovative, biologically-inspired chemical sensor that performs “up-front” processing through mechanical means. Inspiration from the physiology (function) of the guard cell coupled with the morphology (form) and physiology of tropomyosin resulted in two concept variants for the chemical sensor. Applications of the sensor design include environmental monitoring of harmful gases, and a non-invasive approach to detect illnesses including diabetes, liver disease, and cancer on the breath.

  8. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preferences composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into the evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes, effects analysis and productivity analysis – together with explanations about the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching classes of graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.

  9. Biologically Inspired Micro-Flight Research

    Science.gov (United States)

    Raney, David L.; Waszak, Martin R.

    2003-01-01

    Natural fliers demonstrate a diverse array of flight capabilities, many of which are poorly understood. NASA has established a research project to explore and exploit flight technologies inspired by biological systems. One part of this project focuses on dynamic modeling and control of micro aerial vehicles that incorporate flexible wing structures inspired by natural fliers such as insects, hummingbirds and bats. With a vast number of potential civil and military applications, micro aerial vehicles represent an emerging sector of the aerospace market. This paper describes an ongoing research activity in which mechanization and control concepts for biologically inspired micro aerial vehicles are being explored. Research activities focusing on a flexible fixed- wing micro aerial vehicle design and a flapping-based micro aerial vehicle concept are presented.

  10. Effect of inspiration on airway dimensions measured in maximal inspiration CT images of subjects without airflow limitation

    Energy Technology Data Exchange (ETDEWEB)

    Petersen, Jens; Raket, Lars Lau; Nielsen, Mads [University of Copenhagen, Department of Computer Science, Copenhagen (Denmark); Wille, Mathilde M.W.; Dirksen, Asger [University of Copenhagen, Department of Respiratory Medicine, Gentofte Hospital, Hellerup (Denmark); Feragen, Aasa [University of Copenhagen, Department of Computer Science, Copenhagen (Denmark); Max Planck Institute for Intelligent Systems and Max Planck Institute for Developmental Biology, Tuebingen (Germany); Pedersen, Jesper H. [Rigshospitalet, University Hospital of Copenhagen, Department of Cardio-Thoracic Surgery RT, Copenhagen (Denmark); Bruijne, Marleen de [University of Copenhagen, Department of Computer Science, Copenhagen (Denmark); Erasmus MC Rotterdam, Departments of Medical Informatics and Radiology, Rotterdam (Netherlands)

    2014-09-15

    To study the effect of inspiration on airway dimensions measured in voluntary inspiration breath-hold examinations. 961 subjects with normal spirometry were selected from the Danish Lung Cancer Screening Trial. Subjects were examined annually for five years with low-dose CT. Automated software was utilized to segment lungs and airways, identify segmental bronchi, and match airway branches in all images of the same subject. Inspiration level was defined as segmented total lung volume (TLV) divided by predicted total lung capacity (pTLC). Mixed-effects models were used to predict relative change in lumen diameter (ALD) and wall thickness (AWT) in airways of generation 0 (trachea) to 7 and segmental bronchi (R1-R10 and L1-L10) from relative changes in inspiration level. Relative changes in ALD were related to relative changes in TLV/pTLC, and this distensibility increased with generation (p < 0.001). Relative changes in AWT were inversely related to relative changes in TLV/pTLC in generation 3-7 (p < 0.001). Segmental bronchi were widely dispersed in terms of ALD (5.7 ± 0.7 mm), AWT (0.86 ± 0.07 mm), and distensibility (23.5 ± 7.7 %). Subjects who inspire more deeply prior to imaging have larger ALD and smaller AWT. This effect is more pronounced in higher-generation airways. Therefore, adjustment of inspiration level is necessary to accurately assess airway dimensions. (orig.)

  11. A Physics-Inspired Introduction to Political Science

    Science.gov (United States)

    Taagepera, Rein

    1976-01-01

    This paper analyzes what is involved in patterning part of an introduction to politics along the lines of physical sciences, and it presents contents and results of a course in which the author did this. (Author/ND)

  12. Probabilistic Reversible Automata and Quantum Automata

    OpenAIRE

    Golovkins, Marats; Kravtsev, Maksim

    2002-01-01

    To study the relationship between quantum finite automata and probabilistic finite automata, we introduce a notion of probabilistic reversible automata (PRA, or doubly stochastic automata). We find that there is a strong relationship between different possible models of PRA and corresponding models of quantum finite automata. We also propose a classification of reversible finite 1-way automata.

  13. Probabilistic uniformities of uniform spaces

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Lopez, J.; Romaguera, S.; Sanchis, M.

    2017-07-01

    The theory of metric spaces in the fuzzy context has shown to be an interesting area of study not only from a theoretical point of view but also for its applications. Nevertheless, it is usual to consider these spaces as classical topological or uniform spaces, and there are not too many results about constructing fuzzy topological structures starting from a fuzzy metric. Höhle was perhaps the first to show how to construct a probabilistic uniformity and a Lowen uniformity from a probabilistic pseudometric [Hohle78, Hohle82a]. His method can be directly translated to the context of fuzzy metrics and allows one to characterize the categories of probabilistic uniform spaces or Lowen uniform spaces by means of certain families of fuzzy pseudometrics [RL]. On the other hand, other different fuzzy uniformities can be constructed in a fuzzy metric space: a Hutton [0,1]-quasi-uniformity [GGPV06], a fuzzifying uniformity [YueShi10], etc. The paper [GGRLRo] gives a study of several methods of endowing a fuzzy pseudometric space with a probabilistic uniformity and a Hutton [0,1]-quasi-uniformity. In 2010, J. Gutiérrez García, S. Romaguera and M. Sanchis [GGRoSanchis10] proved that the category of uniform spaces is isomorphic to a category formed by sets endowed with a fuzzy uniform structure, i.e. a family of fuzzy pseudometrics satisfying certain conditions. We will show here that, by means of this isomorphism, we can obtain several methods to endow a uniform space with a probabilistic uniformity. Furthermore, these constructions allow us to obtain a factorization of some functors introduced in [GGRoSanchis10]. (Author)

  14. Bisimulations meet PCTL equivalences for probabilistic automata

    DEFF Research Database (Denmark)

    Song, Lei; Zhang, Lijun; Godskesen, Jens Chr.

    2013-01-01

    Probabilistic automata (PAs) have been successfully applied in formal verification of concurrent and stochastic systems. Efficient model checking algorithms have been studied, where the most often used logics for expressing properties are based on probabilistic computation tree logic (PCTL) and its...

  15. Biomimetic and bio-inspired uses of mollusc shells.

    Science.gov (United States)

    Morris, J P; Wang, Y; Backeljau, T; Chapelle, G

    2016-06-01

    Climate change and ocean acidification are likely to have a profound effect on marine molluscs, which are of great ecological and economic importance. One process particularly sensitive to climate change is the formation of biominerals in mollusc shells. Fundamental research is broadening our understanding of the biomineralization process, as well as providing more informed predictions on the effects of climate change on marine molluscs. Such studies are important in their own right, but their value also extends to applied sciences. Biominerals, organic/inorganic hybrid materials with many remarkable physical and chemical properties, have been studied for decades, and the possibilities for future improved use of such materials for society are widely recognised. This article highlights the potential use of our understanding of the shell biomineralization process in novel bio-inspired and biomimetic applications. It also highlights the potential for the valorisation of shells produced as a by-product of the aquaculture industry. Studying shells and the formation of biominerals will inspire novel functional hybrid materials. It may also provide sustainable, ecologically- and economically-viable solutions to some of the problems created by current human resource exploitation. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Nanofluidics in two-dimensional layered materials: inspirations from nature.

    Science.gov (United States)

    Gao, Jun; Feng, Yaping; Guo, Wei; Jiang, Lei

    2017-08-29

    With the advance of chemistry, materials science, and nanotechnology, significant progress has been achieved in the design and application of synthetic nanofluidic devices and materials, mimicking the gating, rectifying, and adaptive functions of biological ion channels. Fundamental physics and chemistry behind these novel transport phenomena on the nanoscale have been explored in depth on single-pore platforms. However, toward real-world applications, one major challenge is to extrapolate these single-pore devices into macroscopic materials. Recently, inspired partially by the layered microstructure of nacre, the material design and large-scale integration of artificial nanofluidic devices have stepped into a completely new stage, termed 2D nanofluidics. Unique advantages of the 2D layered materials have been found, such as facile and scalable fabrication, high flux, efficient chemical modification, tunable channel size, etc. These features enable wide applications in, for example, biomimetic ion transport manipulation, molecular sieving, water treatment, and nanofluidic energy conversion and storage. This review highlights the recent progress, current challenges, and future perspectives in this emerging research field of "2D nanofluidics", with emphasis on the thought of bio-inspiration.

  17. Probabilistic safety analysis procedures guide

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.

    1984-01-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study.

  18. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    Science.gov (United States)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor to provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national/international standing, the mission of the G-11's Probabilistic Methods Committee is to enable/facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries by better, faster, greener, smarter, affordable and reliable product development.

  19. Inspiration fra NY-times

    DEFF Research Database (Denmark)

    Ejersbo, Lisser Rye

    2015-01-01

    The NY Times has a weekly column with good advice. A few weeks ago, the week's inspiration was addressed to teachers and dealt with how to create speaking time for everyone without having favourites or overlooking the quieter students.

  20. Multiobjective optimal allocation problem with probabilistic non ...

    African Journals Online (AJOL)

    This paper considers the optimum compromise allocation in multivariate stratified sampling with a non-linear objective function and a probabilistic non-linear cost constraint. The probabilistic non-linear cost constraint is converted into an equivalent deterministic one by using Chance Constrained programming. A numerical ...

  1. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of a network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum

  2. Inspiration til undervisning på museer

    DEFF Research Database (Denmark)

    Hyllested, Trine Elisabeth

    2015-01-01

    Collection and arrangement of knowledge meant to give a general view of, to inspire and to develop teaching at museums in Denmark.

  3. Probabilistic Geoacoustic Inversion in Complex Environments

    Science.gov (United States)

    2015-09-30

    Probabilistic Geoacoustic Inversion in Complex Environments. Jan Dettmer, School of Earth and Ocean Sciences, University of Victoria, Victoria BC. ... long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must ... project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is

  4. Optimization (Alara) and probabilistic exposures: the application of optimization criteria to the control of risks due to exposures of a probabilistic nature

    International Nuclear Information System (INIS)

    Gonzalez, A.J.

    1989-01-01

    The paper describes the application of the principles of optimization recommended by the International Commission on Radiological Protection (ICRP) to the restraint of radiation risks due to exposures that may or may not be incurred and to which a probability of occurrence can be assigned. After describing the concept of probabilistic exposures, it proposes a basis for a converging policy of control for both certain and probabilistic exposures, namely the dose-risk relationship adopted for radiation protection purposes. On that basis some coherent approaches for dealing with probabilistic exposures, such as the limitation of individual risks, are discussed. The optimization of safety for reducing all risks from probabilistic exposures to as-low-as-reasonably-achievable (ALARA) levels is reviewed in full. The principles of optimization of protection are used as a basic framework and the relevant factors to be taken into account when moving to probabilistic exposures are presented. The paper also reviews the decision-aiding techniques suitable for performing optimization, with particular emphasis on the multi-attribute utility-analysis technique. Finally, there is a discussion of some practical applications of decision-aiding multi-attribute utility analysis to probabilistic exposures, including the use of probabilistic utilities. In its final outlook, the paper emphasizes the need for standardization and solutions to generic problems if optimization of safety is to be successful.

  5. Data specifications for INSPIRE

    Science.gov (United States)

    Portele, Clemens; Woolf, Andrew; Cox, Simon

    2010-05-01

    In Europe a major recent development has been the entry into force of the INSPIRE Directive in May 2007, establishing an infrastructure for spatial information in Europe to support Community environmental policies, and policies or activities which may have an impact on the environment. INSPIRE is based on the infrastructures for spatial information established and operated by the 27 Member States of the European Union. The Directive addresses 34 spatial data themes needed for environmental applications, with key components specified through technical implementing rules. This makes INSPIRE a unique example of a legislative "regional" approach. One of the requirements of the INSPIRE Directive is to make existing spatial data sets with relevance for one of the spatial data themes available in an interoperable way, i.e. where the spatial data from different sources in Europe can be combined to a coherent result. Since INSPIRE covers a wide range of spatial data themes, the first step has been the development of a modelling framework that provides a common foundation for all themes. This framework is largely based on the ISO 19100 series of standards. The use of common generic spatial modelling concepts across all themes is an important enabler for interoperability. As a second step, data specifications for the first set of themes have been developed based on the modelling framework. The themes include addresses, transport networks, protected sites, hydrography, administrative areas and others. The data specifications were developed by selected experts nominated by stakeholders from all over Europe. For each theme a working group was established in early 2008 working on their specific theme and collaborating with the other working groups on cross-theme issues. After a public review of the draft specifications starting in December 2008, an open testing process and thorough comment resolution process, the draft technical implementing rules for these themes have been

  6. INSPIRE from the JRC Point of View

    Directory of Open Access Journals (Sweden)

    Vlado Cetl

    2012-12-01

    Full Text Available This paper summarises some recent developments in INSPIRE implementation from the JRC (Joint Research Centre) point of view. The INSPIRE process started around 11 years ago and today, clear results and benefits can be seen. Spatial data are more accessible and shared more frequently between countries and at the European level. In addition to this, efficient, unified coordination and collaboration between different stakeholders and participants has been achieved, which is another great success. The JRC, as a scientific think-tank of the European Commission, has played a very important role in this process from the very beginning. This role is in line with its mission, which is to provide customer-driven scientific and technical support for the conception, development, implementation and monitoring of European Union (EU) policies. The JRC acts as the overall technical coordinator of INSPIRE, but it also carries out the activities necessary to support the coherent implementation of INSPIRE, by helping member states in the implementation process. Experiences drawn from collaboration and negotiation in each country and at the European level will be of great importance in the revision of the INSPIRE Directive, which is envisaged for 2014. Keywords: spatial data infrastructure (SDI); INSPIRE; development; Joint Research Centre (JRC)

  7. Nature-inspired computation in engineering

    CERN Document Server

    2016-01-01

    This timely review book summarizes the state-of-the-art developments in nature-inspired optimization algorithms and their applications in engineering. Algorithms and topics include the overview and history of nature-inspired algorithms, discrete firefly algorithm, discrete cuckoo search, plant propagation algorithm, parameter-free bat algorithm, gravitational search, biogeography-based algorithm, differential evolution, particle swarm optimization and others. Applications include vehicle routing, swarming robots, discrete and combinatorial optimization, clustering of wireless sensor networks, cell formation, economic load dispatch, metamodeling, surrogate-assisted cooperative co-evolution, data fitting and reverse engineering as well as other case studies in engineering. This book will be an ideal reference for researchers, lecturers, graduates and engineers who are interested in nature-inspired computation, artificial intelligence and computational intelligence. It can also serve as a reference for relevant...

  8. A Comprehensive Probabilistic Framework to Learn Air Data from Surface Pressure Measurements

    Directory of Open Access Journals (Sweden)

    Ankur Srivastava

    2015-01-01

    Full Text Available Use of probabilistic techniques has been demonstrated to learn air data parameters from surface pressure measurements. Integration of numerical models with wind tunnel data and sequential experiment design of wind tunnel runs has been demonstrated in the calibration of a flush air data sensing anemometer system. Development and implementation of a metamodeling method, Sequential Function Approximation (SFA), are presented, which lies at the core of the discussed probabilistic framework. SFA is presented as a tool capable of nonlinear statistical inference, uncertainty reduction by fusion of data with physical models of variable fidelity, and sequential experiment design. This work presents the development and application of these tools in the calibration of FADS for a Runway Assisted Landing Site (RALS) control tower. However, this multidisciplinary work is general in nature and is potentially applicable to a variety of mechanical and aerospace engineering problems.

  9. Probabilistic structural analysis of aerospace components using NESSUS

    Science.gov (United States)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis are conducted assuming different failure models.
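
    A minimal Monte Carlo sketch of the general idea of propagating input uncertainty to a response distribution; the closed-form response used here is a made-up stand-in for the NESSUS finite element model, so this is only an illustration of the concept, not of the NESSUS methodology.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    n = 50_000

    # Hypothetical uncertain inputs (stand-ins for geometry, material and loading).
    E = rng.normal(200e9, 10e9, n)                  # Young's modulus [Pa]
    thickness = rng.normal(5e-3, 2e-4, n)           # wall thickness [m]
    pressure = rng.lognormal(np.log(2e6), 0.1, n)   # surface pressure [Pa]

    # Made-up response standing in for a finite element solve: a stress-like
    # quantity that grows with load and shrinks with stiffness and thickness.
    response = pressure / (thickness * np.sqrt(E / 200e9))

    # Probabilistic characteristics of the response.
    print("mean response:", response.mean())
    print("5th / 95th percentiles:", np.percentile(response, [5, 95]))
    print("P(response > 5e8):", np.mean(response > 5e8))
    ```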

  10. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach. The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians, and by researchers in artificial intelligence. The necessary elementary mathematical notions are recalled in an appendix.

  11. Biologically inspired control of humanoid robot arms robust and adaptive approaches

    CERN Document Server

    Spiers, Adam; Herrmann, Guido

    2016-01-01

    This book investigates a biologically inspired method of robot arm control, developed with the objective of synthesising human-like motion dynamically, using nonlinear, robust and adaptive control techniques in practical robot systems. The control method caters to a rising interest in humanoid robots and the need for appropriate control schemes to match these systems. Unlike the classic kinematic schemes used in industrial manipulators, the dynamic approaches proposed here promote human-like motion with better exploitation of the robot’s physical structure. This also benefits human-robot interaction. The control schemes proposed in this book are inspired by a wealth of human-motion literature that indicates the drivers of motion to be dynamic, model-based and optimal. Such considerations lend themselves nicely to achievement via nonlinear control techniques without the necessity for extensive and complex biological models. The operational-space method of robot control forms the basis of many of the techniqu...

  12. Probabilistic assessment of nuclear safety and safeguards

    International Nuclear Information System (INIS)

    Higson, D.J.

    1987-01-01

    Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic methods have become standard tools used in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)

  13. bayesPop: Probabilistic Population Projections

    Directory of Open Access Journals (Sweden)

    Hana Ševčíková

    2016-12-01

    Full Text Available We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.

  14. bayesPop: Probabilistic Population Projections

    Science.gov (United States)

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933
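
    A hedged Python sketch of the underlying idea (not of the bayesPop R interface): sample trajectories of an uncertain growth rate, project the population forward, and summarize the predictive distribution with percentiles. All numbers are invented, and the single growth rate is a crude stand-in for the age- and sex-specific fertility and mortality projections the package actually uses.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n_traj, horizon = 2000, 35        # number of trajectories, projection years
    pop0 = 5.0e6                      # hypothetical current population

    # Each trajectory draws its own long-run growth rate, plus year-to-year noise.
    mean_rate = rng.normal(0.005, 0.003, size=(n_traj, 1))
    yearly_rate = mean_rate + rng.normal(0.0, 0.002, size=(n_traj, horizon))

    trajectories = pop0 * np.cumprod(1.0 + yearly_rate, axis=1)

    # Probabilistic projection summarized as percentiles in the final year.
    lo, med, hi = np.percentile(trajectories[:, -1], [10, 50, 90])
    print(f"population in {horizon} years: median {med:,.0f} "
          f"(80% interval {lo:,.0f} to {hi:,.0f})")
    ```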

  15. Impact of Game-Inspired Infographics on User Engagement and Information Processing in an eHealth Program.

    Science.gov (United States)

    Comello, Maria Leonora G; Qian, Xiaokun; Deal, Allison M; Ribisl, Kurt M; Linnan, Laura A; Tate, Deborah F

    2016-09-22

    Online interventions providing individual health behavior assessment should deliver feedback in a way that is both understandable and engaging. This study focused on the potential for infographics inspired by the aesthetics of game design to contribute to these goals. We conducted formative research to test game-inspired infographics against more traditional displays (eg, text-only, column chart) for conveying a behavioral goal and an individual's behavior relative to the goal. We explored the extent to which the display type would influence levels of engagement and information processing. Between-participants experiments compared game-inspired infographics with traditional formats in terms of outcomes related to information processing (eg, comprehension, cognitive load) and engagement (eg, attitudes toward the information, emotional tone). We randomly assigned participants (N=1162) to an experiment in 1 of 6 modules (tobacco use, alcohol use, vegetable consumption, fruit consumption, physical activity, and weight management). In the tobacco module, a game-inspired format (scorecard) was compared with text-only; there were no differences in attitudes and emotional tone, but the scorecard outperformed text-only on comprehension (P=.004) and decreased cognitive load (P=.006). For the other behaviors, we tested 2 game-inspired formats (scorecard, progress bar) and a traditional column chart; there were no differences in comprehension, but the progress bar outperformed the other formats on attitudes and emotional tone. Game-inspired infographics showed potential to outperform a traditional format for some study outcomes while not underperforming on other outcomes. Overall, findings support the use of game-inspired infographics in behavioral assessment feedback to enhance comprehension and engagement, which may lead to greater behavior change.

  16. A Markov Chain Approach to Probabilistic Swarm Guidance

    Science.gov (United States)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, that ultimately guide the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: Once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
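
    A minimal simulation of the decentralized idea described above: every agent independently applies the same Markov transition matrix, built here with a simple Metropolis-style rule so that its stationary distribution equals the desired swarm density over a handful of bins. The bin layout, target density and agent count are hypothetical, and this construction is only one straightforward way to realize the guidance law sketched in the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Desired swarm density over 4 regions (bins) of the configuration space.
    target = np.array([0.1, 0.2, 0.3, 0.4])
    k = len(target)

    # Metropolis-style transition matrix on a chain of bins (agents only move to
    # adjacent bins); its stationary distribution is the target density.
    M = np.zeros((k, k))
    for i in range(k):
        for j in (i - 1, i + 1):
            if 0 <= j < k:
                M[i, j] = 0.5 * min(1.0, target[j] / target[i])
        M[i, i] = 1.0 - M[i].sum()

    # Each agent makes independent probabilistic decisions using only its own bin.
    n_agents, n_steps = 5000, 200
    bins = np.zeros(n_agents, dtype=int)      # all agents start in bin 0
    for _ in range(n_steps):
        u = rng.random(n_agents)
        new_bins = bins.copy()
        for i in range(k):
            idx = bins == i
            cum = np.cumsum(M[i])
            cum[-1] = 1.0                     # guard against round-off
            new_bins[idx] = np.searchsorted(cum, u[idx])
        bins = new_bins

    print("target density:  ", target)
    print("achieved density:", np.bincount(bins, minlength=k) / n_agents)
    ```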

  17. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  18. Probabilistic structural analysis methods for select space propulsion system components

    Science.gov (United States)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library are present.

  19. Probabilistic reasoning in data analysis.

    Science.gov (United States)

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
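
    A small simulation sketch (rate and window length are invented) of the Markovian-arrival idea emphasized above: exponential waiting times between history-independent arrivals produce Poisson-distributed counts:

```python
import numpy as np

# Arrivals independent of prior history: exponential waiting times with
# rate lam imply Poisson-distributed counts over a fixed window of length T.
rng = np.random.default_rng(2)
lam, T, n_runs = 3.0, 1.0, 20_000

counts = []
for _ in range(n_runs):
    t, n = 0.0, 0
    while True:
        t += rng.exponential(1.0 / lam)   # waiting time to the next arrival
        if t > T:
            break
        n += 1
    counts.append(n)

counts = np.array(counts)
print("simulated mean/var:", counts.mean(), counts.var())          # both ~ lam*T
print("P(N=0) simulated:", (counts == 0).mean(), " theory:", np.exp(-lam * T))
```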

  20. Do probabilistic forecasts lead to better decisions?

    Directory of Open Access Journals (Sweden)

    M. H. Ramos

    2013-06-01

    Full Text Available The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  1. A linear process-algebraic format for probabilistic systems with data

    NARCIS (Netherlands)

    Katoen, Joost P.; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette; Timmer, Mark; Gomes, L.; Khomenko, V.; Fernandes, J.M.

    This paper presents a novel linear process algebraic format for probabilistic automata. The key ingredient is a symbolic transformation of probabilistic process algebra terms that incorporate data into this linear format while preserving strong probabilistic bisimulation. This generalises similar

  2. Skin-Inspired Electronics: An Emerging Paradigm.

    Science.gov (United States)

    Wang, Sihong; Oh, Jin Young; Xu, Jie; Tran, Helen; Bao, Zhenan

    2018-05-15

    Future electronics will take on more important roles in people's lives. They need to allow more intimate contact with human beings to enable advanced health monitoring, disease detection, medical therapies, and human-machine interfacing. However, current electronics are rigid, nondegradable and cannot self-repair, while the human body is soft, dynamic, stretchable, biodegradable, and self-healing. Therefore, it is critical to develop a new class of electronic materials that incorporate skinlike properties, including stretchability for conformable integration, minimal discomfort and suppressed invasive reactions; self-healing for long-term durability under harsh mechanical conditions; and biodegradability for reducing environmental impact and obviating the need for secondary device removal for medical implants. These demands have fueled the development of a new generation of electronic materials, primarily composed of polymers and polymer composites with both high electrical performance and skinlike properties, and consequently led to a new paradigm of electronics, termed "skin-inspired electronics". This Account covers recent important advances in skin-inspired electronics, from basic material developments to device components and proof-of-concept demonstrations for integrated bioelectronics applications. To date, stretchability has been the most prominent focus in this field. In contrast to strain-engineering approaches that extrinsically impart stretchability into inorganic electronics, intrinsically stretchable materials provide a direct route to achieve higher mechanical robustness, higher device density, and scalable fabrication. The key is the introduction of strain-dissipation mechanisms into the material design, which has been realized through molecular engineering (e.g., soft molecular segments, dynamic bonds) and physical engineering (e.g., nanoconfinement effect, geometric design). The material design concepts have led to the successful demonstrations of

  3. Kids Inspire Kids for STEAM

    OpenAIRE

    Fenyvesi, Kristof; Houghton, Tony; Diego-Mantecón, José Manuel; Crilly, Elizabeth; Oldknow, Adrian; Lavicza, Zsolt; Blanco, Teresa F.

    2017-01-01

    Abstract The goal of the Kids Inspiring Kids in STEAM (KIKS) project was to raise students' awareness towards the multi- and transdisciplinary connections between the STEAM subjects (Science, Technology, Engineering, Arts & Mathematics), and make the learning about topics and phenomena from these fields more enjoyable. In order to achieve these goals, the KIKS project has popularized the STEAM concept through projects based on the students-inspiring-other-students approach and by utilizing new tec...

  4. Probabilistic thread algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution

  5. The cascade probabilistic functions and the Markov's processes. Chapter 1

    International Nuclear Information System (INIS)

    2003-01-01

    In Chapter 1, the physical and mathematical descriptions of radiation processes are given. The relation of the cascade probabilistic functions (CPFs) to Markov chains is shown. The calculation of the CPFs for electrons, taking energy losses into account, is presented, and the CPFs were computed numerically. The contribution of energy losses to the CPFs and to the radiation defect concentration is estimated. In addition, primary knock-on atoms and radiation defects under electron irradiation are calculated using the CPFs with energy losses taken into account.

  6. On probabilistic forecasting of wind power time-series

    DEFF Research Database (Denmark)

    Pinson, Pierre

    … power dynamics. In both cases, the model parameters are adaptively and recursively estimated, with time-adaptivity resulting from exponential forgetting of past observations. The probabilistic forecasting methodology is applied at the Horns Rev wind farm in Denmark for 10-minute-ahead probabilistic forecasting of wind power generation. Probabilistic forecasts generated from the proposed methodology clearly have higher skill than those obtained from a classical Gaussian assumption about wind power predictive densities. Corresponding point forecasts also exhibit significantly lower error criteria.

  7. Review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.; Davis, P.R.; Satterwhite, D.G.; Gilmore, W.E.; Gregg, R.E.

    1989-11-01

    A review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment was conducted with the objective of confirming the safety perspectives brought to light by the probabilistic risk assessment. The scope of the review included the entire Level I probabilistic risk assessment including external events. This is consistent with the scope of the probabilistic risk assessment. The review included an assessment of the assumptions, methods, models, and data used in the study. 47 refs., 14 figs., 15 tabs

  8. Deliverable D74.2. Probabilistic analysis methods for support structures

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2018-01-01

    Relevant Description: Report describing the probabilistic analysis for offshore substructures and results attained. This includes comparison with experimental data and with conventional design. Specific targets: 1) Estimate current reliability level of support structures. 2) Development of basis for probabilistic calculations and evaluation of reliability for offshore support structures (substructures). 3) Development of a probabilistic model for stiffness and strength of soil parameters and for modeling geotechnical load bearing capacity. 4) Comparison between probabilistic analysis and deterministic...

  9. A common fixed point for operators in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Ghaemi, M.B.; Lafuerza-Guillen, Bernardo; Razani, A.

    2009-01-01

    Probabilistic metric spaces were introduced by Karl Menger. Alsina, Schweizer and Sklar gave a general definition of probabilistic normed space based on the definition of Menger [Alsina C, Schweizer B, Sklar A. On the definition of a probabilistic normed space. Aequationes Math 1993;46:91-8]. Here, we consider the equicontinuity of a class of linear operators in probabilistic normed spaces and, finally, a common fixed point theorem is proved. An application to quantum mechanics is considered.

  10. A History of Probabilistic Inductive Logic Programming

    Directory of Open Access Journals (Sweden)

    Fabrizio eRiguzzi

    2014-09-01

    Full Text Available The field of Probabilistic Logic Programming (PLP) has seen significant advances in the last 20 years, with many proposals for languages that combine probability with logic programming. Since the start, the problem of learning probabilistic logic programs has been the focus of much attention. Learning these programs represents a whole subfield of Inductive Logic Programming (ILP). In Probabilistic ILP (PILP), two problems are considered: learning the parameters of a program given the structure (the rules), and learning both the structure and the parameters. Usually structure learning systems use parameter learning as a subroutine. In this article we present an overview of PILP and discuss the main results.

  11. Probabilistic assessment of faults

    International Nuclear Information System (INIS)

    Foden, R.W.

    1987-01-01

    Probabilistic safety analysis (PSA) is the process by which the probability (or frequency of occurrence) of reactor fault conditions which could lead to unacceptable consequences is assessed. The basic objective of a PSA is to allow a judgement to be made as to whether or not the principal probabilistic requirement is satisfied. It also gives insights into the reliability of the plant which can be used to identify possible improvements. This is explained in the article. The scope of a PSA and the PSA performed by the National Nuclear Corporation (NNC) for the Heysham II and Torness AGRs and Sizewell-B PWR are discussed. The NNC methods for hazards, common cause failure and operator error are mentioned. (UK)

  12. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  13. Probabilistic coding of quantum states

    International Nuclear Information System (INIS)

    Grudka, Andrzej; Wojcik, Antoni; Czechlewski, Mikolaj

    2006-01-01

    We discuss the properties of probabilistic coding of two qubits to one qutrit and generalize the scheme to higher dimensions. We show that the protocol preserves the entanglement between the qubits to be encoded and the environment and can also be applied to mixed states. We present a protocol that enables encoding of n qudits to one qudit of dimension smaller than the Hilbert space of the original system and then allows probabilistic but error-free decoding of any subset of k qudits. We give a formula for the probability of successful decoding

  14. Probabilistic Modeling of Timber Structures

    DEFF Research Database (Denmark)

    Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2005-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for components and connections. The recommended...

  15. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    2014-02-14

    Application of probabilistic precipitation forecasts from a deterministic model … The aim of this paper is to investigate the increase in the lead-time of flash flood warnings of the SAFFG using probabilistic precipitation forecasts … The procedure is applied to a real flash flood event and the ensemble-based …

  16. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.

  17. Ships - inspiring objects in architecture

    Science.gov (United States)

    Marczak, Elzbieta

    2017-10-01

    Sea-going vessels have for centuries fascinated people, not only those who happen to work at sea, but first and foremost, those who have never set foot aboard a ship. The environment in which ships operate is reminiscent of freedom and countless adventures, but also of hard and interesting maritime working life. The famous words of Pompey: “Navigare necesse est, vivere non est necesse” (sailing is necessary, living is not necessary), which he pronounced on a stormy sea voyage, arouse curiosity and excitement, inviting one to test the truth of this saying personally. It is often the case, however, that sea-faring remains within the realm of dreams, while the fascination with ships demonstrates itself through a transposition of naval features onto land constructions. In such cases, ship-inspired motifs bring alive dreams and yearnings as well as reflect tastes. Tourism is one of the indicators of people’s standard of living and a measure of a society’s civilisation. Maritime tourism has been developing rapidly in recent decades. A sea cruise offers an insight into life at sea. Still, most people derive their knowledge of passenger vessels and their furnishings from the mass media. Passenger vessels, also known as “floating cities,” are described as majestic and grand, while their on-board facilities as luxurious, comfortable, exclusive and inaccessible to common people on land. Freight vessels, on the other hand, are described as enormous objects which dwarf the human being into insignificance. This article presents the results of research intended to answer the following questions: what makes ships a source of inspiration for land architecture? To what extent and by what means do architects draw on ships in their design work? In what places can we find structures inspired by ships? What ships inspire architects? This article presents examples of buildings, whose design was inspired by the architecture and structural details of sea vessels. An analysis of

  18. Is Probabilistic Evidence a Source of Knowledge?

    Science.gov (United States)

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  19. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages of implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use a full tsunami waveform computation in lieu of attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
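
    A schematic sketch of the Green's function summation step (the subfault waveforms below are synthetic stand-ins, and all slips and rates are invented): precomputed unit-slip waveforms at a coastal point are scaled by scenario slip, summed, and the resulting peak heights feed an annual exceedance-rate calculation:

```python
import numpy as np

t = np.linspace(0, 3600, 721)   # 1 h of waveform at a coastal site, 5 s step

# Precomputed unit-slip tsunami waveforms for 4 subfaults (here just damped
# sinusoids standing in for real hydrodynamic simulations).
subfault_gf = np.array([np.exp(-t / 1500) * np.sin(2 * np.pi * t / p + s)
                        for p, s in [(600, 0), (700, 1), (800, 2), (900, 3)]])

def peak_height(slip):
    """Scenario waveform = slip-weighted sum of subfault Green's functions."""
    return np.abs(slip @ subfault_gf).max()

# A handful of rupture scenarios: slip per subfault (m) and annual rate.
scenarios = [(np.array([2.0, 3.0, 1.0, 0.5]), 1e-3),
             (np.array([0.5, 1.0, 4.0, 2.0]), 5e-4),
             (np.array([1.0, 1.0, 1.0, 1.0]), 2e-3)]

# Annual exceedance rate curve at this site, as in PTHA.
for h in (0.5, 1.0, 2.0):
    rate = sum(r for slip, r in scenarios if peak_height(slip) > h)
    print(f"rate of exceeding {h} m: {rate:.1e} /yr")
```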

  20. Undecidability of model-checking branching-time properties of stateless probabilistic pushdown process

    OpenAIRE

    Lin, T.

    2014-01-01

    In this paper, we settle a problem in probabilistic verification of infinite-state processes (specifically, probabilistic pushdown processes). We show that model checking stateless probabilistic pushdown processes (pBPA) against probabilistic computational tree logic (PCTL) is undecidable.

  1. Probabilistic safety assessment as a standpoint for decision making

    International Nuclear Information System (INIS)

    Cepin, M.

    2001-01-01

    This paper focuses on the role of probabilistic safety assessment in decision-making. The prerequisites for use of the results of probabilistic safety assessment and the criteria for decision-making based on probabilistic safety assessment are discussed. The decision-making process is described. It provides a risk evaluation of the impact of the issue under investigation. Selected examples are discussed, which highlight the described process. (authors)

  2. Probabilistic inversion for chicken processing lines

    International Nuclear Information System (INIS)

    Cooke, Roger M.; Nauta, Maarten; Havelaar, Arie H.; Fels, Ine van der

    2006-01-01

    We discuss an application of probabilistic inversion techniques to a model of campylobacter transmission in chicken processing lines. Such techniques are indicated when we wish to quantify a model which is new and perhaps unfamiliar to the expert community. In this case there are no measurements for estimating model parameters, and experts are typically unable to give a considered judgment. In such cases, experts are asked to quantify their uncertainty regarding variables which can be predicted by the model. The experts' distributions (after combination) are then pulled back onto the parameter space of the model, a process termed 'probabilistic inversion'. This study illustrates two such techniques, iterative proportional fitting (IPF) and PARmeter fitting for uncertain models (PARFUM). In addition, we illustrate how expert judgement on predicted observable quantities in combination with probabilistic inversion may be used for model validation and/or model criticism
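
    A minimal numerical sketch of one of the two techniques named above, iterative proportional fitting (IPF), shown here in its textbook form: an invented starting joint distribution is rescaled until its margins match prescribed target margins:

```python
import numpy as np

# Iterative proportional fitting: adjust a starting joint distribution so
# that its row and column margins match prescribed target margins.
joint = np.array([[0.20, 0.10, 0.10],
                  [0.15, 0.25, 0.20]])          # starting joint (sums to 1)
row_target = np.array([0.55, 0.45])             # target margin, variable 1
col_target = np.array([0.30, 0.40, 0.30])       # target margin, variable 2

for _ in range(100):
    joint *= (row_target / joint.sum(axis=1))[:, None]   # fit rows
    joint *= (col_target / joint.sum(axis=0))[None, :]   # fit columns

print(joint.round(4))
print("row margins:", joint.sum(axis=1), "col margins:", joint.sum(axis=0))
```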

  3. Analysis of truncation limit in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, Marko

    2005-01-01

    A truncation limit defines the boundaries of what is considered in the probabilistic safety assessment and what is neglected. The truncation limit that is the focus here is the truncation limit on the size of the minimal cut set contribution at which to cut off. A new method was developed, which defines the truncation limit in probabilistic safety assessment. The method specifies truncation limits more stringently than existing documents dealing with truncation criteria in probabilistic safety assessment do. The results of this paper indicate that the truncation limits for more complex probabilistic safety assessments, which consist of a larger number of basic events, should be more severe than presently recommended in existing documents if more accuracy is desired. The truncation limits defined by the new method reduce the relative errors of importance measures and produce more accurate results for probabilistic safety assessment applications. The reduced relative errors of importance measures can prevent situations where the acceptability of a change to the equipment under investigation according to RG 1.174 would be shifted from the region where changes can be accepted to the region where changes cannot be accepted, if the results were calculated with a smaller truncation limit.
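
    An illustrative sketch of what a truncation limit does, with made-up basic events and minimal cut sets: cut sets whose probability falls below the limit are dropped from the rare-event approximation of the top-event probability, and the relative error grows as the limit is relaxed:

```python
# Minimal cut sets of a hypothetical fault tree: basic-event probabilities.
basic = {"A": 1e-2, "B": 5e-3, "C": 2e-3, "D": 1e-4, "E": 5e-5}
cut_sets = [("A", "B"), ("A", "C"), ("B", "C"), ("A", "D"), ("C", "E"), ("D", "E")]

def top_event_probability(truncation_limit=0.0):
    """Rare-event approximation: sum of retained cut-set probabilities."""
    total = 0.0
    for cs in cut_sets:
        p = 1.0
        for event in cs:
            p *= basic[event]
        if p >= truncation_limit:           # cut sets below the limit are neglected
            total += p
    return total

exact = top_event_probability(0.0)
for limit in (1e-6, 1e-7, 1e-9):
    approx = top_event_probability(limit)
    print(f"limit {limit:.0e}: P_top {approx:.3e}, relative error {(exact - approx) / exact:.2%}")
```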

  4. Limited probabilistic risk assessment applications in plant backfitting

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    1987-01-01

    Plant backfitting programs are defined on the basis of deterministic (e.g. Systematic Evaluation Program) or probabilistic (e.g. Probabilistic Risk Assessment) approaches. Each approach provides valuable assets in defining the program and has its own advantages and disadvantages. Ideally one should combine the strong points of each approach. This chapter summarizes actual experience gained from combinations of deterministic and probabilistic approaches to define and implement PWR backfitting programs. Such combinations relate to limited applications of probabilistic techniques and are illustrated for upgrading fluid systems. These evaluations allow sound and rational optimization of system upgrades. However, the boundaries of the reliability analysis need to be clearly defined and system reliability may have to go beyond classical boundaries (e.g. identification of weak links in support systems). Also, the implementation of upgrades on a system-per-system basis is not necessarily cost-effective. (author)

  5. Bio-inspired computation in unmanned aerial vehicles

    CERN Document Server

    Duan, Haibin

    2014-01-01

    Bio-inspired Computation in Unmanned Aerial Vehicles focuses on the aspects of path planning, formation control, heterogeneous cooperative control and vision-based surveillance and navigation in Unmanned Aerial Vehicles (UAVs) from the perspective of bio-inspired computation. It helps readers to gain a comprehensive understanding of control-related problems in UAVs, presenting the latest advances in bio-inspired computation. By combining bio-inspired computation and UAV control problems, key questions are explored in depth, and each piece is content-rich while remaining accessible. With abundant illustrations of simulation work, this book links theory, algorithms and implementation procedures, demonstrating the simulation results with graphics that are intuitive without sacrificing academic rigor. Further, it pays due attention to both the conceptual framework and the implementation procedures. The book offers a valuable resource for scientists, researchers and graduate students in the field of Control, Aeros...

  6. Biologically Inspired Technology Using Electroactive Polymers (EAP)

    Science.gov (United States)

    Bar-Cohen, Yoseph

    2006-01-01

    Evolution allowed nature to introduce highly effective biological mechanisms that are an incredible inspiration for innovation. Humans have always made efforts to imitate nature's inventions, and we are increasingly making advances that make it significantly easier to imitate, copy, and adapt biological methods, processes and systems. This has brought us to the ability to create technology that is far beyond the simple mimicking of nature. Having better tools to understand and to implement nature's principles, we are now equipped like never before to be inspired by nature and to employ our tools in far superior ways. Effectively, through bio-inspiration we can gain a better view and appreciation of nature's capability while studying its models to learn what can be extracted, copied or adapted. Using electroactive polymers (EAP) as artificial muscles is adding an important element to the development of biologically inspired technologies.

  7. Probabilistic analysis of a materially nonlinear structure

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
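
    The sketch below is not NESSUS or the AMV procedure; it is only a brute-force Monte Carlo stand-in, using an elastic (Lamé) first-yield check instead of the materially nonlinear analysis, with all geometry and distribution parameters invented, to illustrate how a CDF and a probability estimate follow from a Weibull yield stress and a normally distributed pressure:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 200_000

# Random inputs (illustrative parameters only).
yield_stress = 250.0 * rng.weibull(8.0, n)         # MPa, Weibull-distributed
pressure = rng.normal(60.0, 6.0, n)                 # MPa, normally distributed

# Thick-walled cylinder, inner radius a, outer radius b (elastic Lamé solution).
a, b = 0.10, 0.15                                   # m
stress_intensity = pressure * 2 * b**2 / (b**2 - a**2)   # sigma_theta - sigma_r at r = a

# Probability of first yield (Tresca criterion) and one point of the empirical CDF.
print("P(first yield) ~", np.mean(stress_intensity > yield_stress))
print("P(stress intensity < 200 MPa) ~", np.mean(stress_intensity < 200.0))
```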

  8. Probabilistic dual heuristic programming-based adaptive critic

    Science.gov (United States)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. In contrast to current approaches, the proposed probabilistic (DHP) AC method takes uncertainties of the forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. Full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  9. Probabilistic Load Flow

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2008-01-01

    This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...
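
    A toy Monte Carlo probabilistic load flow sketch (DC approximation on an invented 3-bus network) showing how random nodal injections propagate to a distribution of line flows:

```python
import numpy as np

rng = np.random.default_rng(5)

# 3-bus DC power flow: bus 0 is the slack; lines (0-1), (1-2), (0-2).
lines = [(0, 1, 10.0), (1, 2, 8.0), (0, 2, 5.0)]      # (from, to, susceptance)

# Reduced susceptance matrix for buses 1 and 2 (slack bus removed).
B = np.zeros((2, 2))
for i, j, b in lines:
    for k in (i, j):
        if k > 0:
            B[k - 1, k - 1] += b
    if i > 0 and j > 0:
        B[i - 1, j - 1] -= b
        B[j - 1, i - 1] -= b

# Random net injections at buses 1 and 2 (generation minus load, per unit).
P = np.column_stack([rng.normal(0.6, 0.1, 10_000),     # wind-like generation
                     rng.normal(-0.8, 0.15, 10_000)])  # uncertain load

theta = P @ np.linalg.inv(B).T                          # bus voltage angles
flow_12 = 8.0 * (theta[:, 0] - theta[:, 1])             # flow on line 1-2

print(f"line 1-2 flow: mean {flow_12.mean():.3f} p.u., "
      f"95% quantile {np.quantile(flow_12, 0.95):.3f} p.u.")
```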

  10. Stable oscillations of a predator-prey probabilistic cellular automaton: a mean-field approach

    International Nuclear Information System (INIS)

    Tome, Tania; Carvalho, Kelly C de

    2007-01-01

    We analyze a probabilistic cellular automaton describing the dynamics of coexistence of a predator-prey system. The individuals of each species are localized over the sites of a lattice and the local stochastic updating rules are inspired by the processes of the Lotka-Volterra model. Two levels of mean-field approximations are set up. The simple approximation is equivalent to an extended patch model, a simple metapopulation model with patches colonized by prey, patches colonized by predators and empty patches. This approximation is capable of describing the limited available space for species occupancy. The pair approximation is moreover able to describe two types of coexistence of prey and predators: one where population densities are constant in time and another displaying self-sustained time oscillations of the population densities. The oscillations are associated with limit cycles and arise through a Hopf bifurcation. They are stable against changes in the initial conditions and, in this sense, they differ from the Lotka-Volterra cycles which depend on initial conditions. In this respect, the present model is biologically more realistic than the Lotka-Volterra model
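
    A quick numerical sketch of the simple (one-site) mean-field level described above, treated as a patch model with prey density x, predator density y and empty fraction 1 - x - y; the rate constants are invented and the pair approximation responsible for the sustained oscillations is not included:

```python
# Simple mean-field (patch-model) approximation of the predator-prey
# probabilistic cellular automaton: prey colonize empty sites, predators
# replace the prey they consume, and predators die leaving empty sites.
a, b, c = 1.0, 2.0, 0.5        # birth, predation and death rates (illustrative)
x, y = 0.4, 0.1                # initial prey and predator densities
dt = 0.01

for step in range(20_001):
    if step % 5_000 == 0:
        print(f"t={step * dt:6.1f}  prey={x:.3f}  predator={y:.3f}")
    empty = 1.0 - x - y
    dx = a * x * empty - b * x * y
    dy = b * x * y - c * y
    x, y = x + dt * dx, y + dt * dy
```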

  11. Probabilistic inversion in priority setting of emerging zoonoses.

    NARCIS (Netherlands)

    Kurowicka, D.; Bucura, C.; Cooke, R.; Havelaar, A.H.

    2010-01-01

    This article presents methodology of applying probabilistic inversion in combination with expert judgment in priority setting problem. Experts rank scenarios according to severity. A linear multi-criteria analysis model underlying the expert preferences is posited. Using probabilistic inversion, a

  12. Cognitive modeling and dynamic probabilistic simulation of operating crew response to complex system accidents

    International Nuclear Information System (INIS)

    Chang, Y.H.J.; Mosleh, A.

    2007-01-01

    This is the last in a series of five papers that discuss the Information Decision and Action in Crew (IDAC) context for human reliability analysis (HRA) and example application. The model is developed to probabilistically predict the responses of the control room operating crew in nuclear power plants during an accident, for use in probabilistic risk assessments (PRA). The operator response spectrum includes cognitive, emotional, and physical activities during the course of an accident. This paper describes a dynamic PRA computer simulation program, accident dynamics simulator (ADS), developed in part to implement the IDAC model. This paper also provides a detailed example of implementing a simpler version of IDAC, compared with the IDAC model discussed in the first four papers of this series, to demonstrate the practicality of integrating a detailed cognitive HRA model within a dynamic PRA framework

  13. Learning from nature: Nature-inspired algorithms

    DEFF Research Database (Denmark)

    Albeanu, Grigore; Madsen, Henrik; Popentiu-Vladicescu, Florin

    2016-01-01

    .), genetic and evolutionary strategies, artificial immune systems etc. Well-known examples of applications include: aircraft wing design, wind turbine design, bionic car, bullet train, optimal decisions related to traffic, appropriate strategies to survive under a well-adapted immune system etc. Based......During last decade, the nature has inspired researchers to develop new algorithms. The largest collection of nature-inspired algorithms is biology-inspired: swarm intelligence (particle swarm optimization, ant colony optimization, cuckoo search, bees' algorithm, bat algorithm, firefly algorithm etc...... on collective social behaviour of organisms, researchers have developed optimization strategies taking into account not only the individuals, but also groups and environment. However, learning from nature, new classes of approaches can be identified, tested and compared against already available algorithms...

  14. Physically based probabilistic seismic hazard analysis using broadband ground motion simulation: a case study for the Prince Islands Fault, Marmara Sea

    Science.gov (United States)

    Mert, Aydin; Fahjan, Yasin M.; Hutchings, Lawrence J.; Pınar, Ali

    2016-08-01

    The main motivation for this study was the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in Istanbul. This study provides the results of a physically based probabilistic seismic hazard analysis (PSHA) methodology, using broadband strong ground motion simulations, for sites within the Marmara region, Turkey, that may be vulnerable to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We included the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real, small-magnitude earthquakes recorded by a local seismic array were used as empirical Green's functions. For the frequencies below 0.5 Hz, the simulations were obtained by using synthetic Green's functions, which are synthetic seismograms calculated by an explicit 2D /3D elastic finite difference wave propagation routine. By using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we produced a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here followed the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, and this approach utilizes the full rupture of earthquakes along faults. Furthermore, conventional PSHA predicts ground motion parameters by using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground motion parameters. PSHA results were produced for 2, 10, and 50 % hazards for all sites studied in the Marmara region.
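
    A schematic sketch (magnitudes, rates and peak ground accelerations are all invented) of the hazard-integration step shared by conventional and physically based PSHA; in the physically based variant the ground-motion value for each rupture scenario would come from a simulated broadband seismogram rather than from an attenuation relationship:

```python
import numpy as np

# Hypothetical rupture scenarios on a fault: annual rate and the peak ground
# acceleration (g) obtained at one site from a simulated seismogram.
scenarios = [
    {"Mw": 6.8, "rate": 1 / 300.0,  "pga": 0.18},
    {"Mw": 7.0, "rate": 1 / 500.0,  "pga": 0.25},
    {"Mw": 7.2, "rate": 1 / 900.0,  "pga": 0.34},
    {"Mw": 7.4, "rate": 1 / 1500.0, "pga": 0.45},
]

def exceedance_rate(level):
    """Annual rate of exceeding a ground-motion level at the site."""
    return sum(s["rate"] for s in scenarios if s["pga"] > level)

# Probability of exceedance in 50 years under a Poisson occurrence model.
for level in (0.2, 0.3, 0.4):
    lam = exceedance_rate(level)
    p50 = 1.0 - np.exp(-lam * 50.0)
    print(f"PGA > {level:.1f} g: rate {lam:.2e}/yr, P(50 yr) = {p50:.1%}")
```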

  15. Physically-Based Probabilistic Seismic Hazard Analysis Using Broad-Band Ground Motion Simulation: a Case Study for Prince Islands Fault, Marmara Sea

    Science.gov (United States)

    Mert, A.

    2016-12-01

    The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically-based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically-based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We include the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGFs). For frequencies below 0.5 Hz, the simulations are obtained using Synthetic Green's Functions (SGFs), which are synthetic seismograms calculated by an explicit 2D/3D elastic finite difference wave propagation routine. Using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we provide a hazard calculation for frequencies of 0.1-20 Hz. The physically-based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes the full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for earthquakes of all magnitudes to obtain ground-motion parameters. PSHA results are produced for 2%, 10% and 50% hazards for all studied sites in the Marmara region.

  16. Probabilistic cloning of three symmetric states

    International Nuclear Information System (INIS)

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-01-01

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.

  17. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  18. Probabilistic Design of Wave Energy Devices

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.

    2011-01-01

    Wave energy has a large potential for contributing significantly to the production of renewable energy. However, the wave energy sector is still not able to deliver cost-competitive and reliable solutions. But the sector has already demonstrated several proofs of concept. The design of wave energy devices is a new and expanding technical area where there is no tradition for probabilistic design—in fact, very few full-scale devices have been built to date, so it can be said that no design tradition really exists in this area. For this reason it is considered to be of great importance to develop and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy...

  19. Physical Education as "Means without Ends:" Towards a New Concept of Physical Education

    Science.gov (United States)

    Vlieghe, Joris

    2013-01-01

    This article is concerned with the educational value of raising the human body at school. Drawing inspiration from the work of Giorgio Agamben, I develop a new perspective that explores the possibility of taking the concept of physical education in a literal sense. This is to say that the specific educational content of physical education (in…

  20. Some thoughts on the future of probabilistic structural design of nuclear components

    International Nuclear Information System (INIS)

    Stancampiano, P.A.

    1978-01-01

    This paper presents some views on the future role of probabilistic methods in the structural design of nuclear components. The existing deterministic design approach is discussed and compared to the probabilistic approach. Some of the objections to both deterministic and probabilistic design are listed. Extensive research and development activities are required to mature the probabilistic approach sufficiently to make it cost-effective and competitive with current deterministic design practices. The required research activities deal with probabilistic methods development, more realistic causal failure mode models development, and statistical data models development. A quasi-probabilistic structural design approach is recommended which accounts for the random error in the design models. (Auth.)

  1. Cooperation in an evolutionary prisoner’s dilemma game with probabilistic strategies

    International Nuclear Information System (INIS)

    Li Haihong; Dai Qionglin; Cheng Hongyan; Yang Junzhong

    2012-01-01

    Highlights: ► Introducing probabilistic strategies instead of the pure C/D in the PDG. ► The strategy patterns depend on interaction structures and updating rules. ► There exists an optimal increment of the probabilistic strategy. - Abstract: In this work, we investigate an evolutionary prisoner’s dilemma game in structured populations with probabilistic strategies instead of the pure strategies of cooperation and defection. We explore the model in detail by considering different strategy update rules and different population structures. We find that the distribution of probabilistic strategy patterns is dependent on both the interaction structures and the updating rules. We also find that, when an individual updates her strategy by increasing or decreasing her probabilistic strategy by a certain amount towards that of her opponent, there exists an optimal increment of the probabilistic strategy at which the cooperator frequency reaches its maximum.
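
    A compact simulation sketch, on a ring lattice with invented payoff and increment values, of the kind of model described: each agent holds a cooperation probability rather than a pure C/D strategy and nudges it by a fixed increment towards the strategy of a better-performing neighbour:

```python
import numpy as np

rng = np.random.default_rng(6)
N, b, delta, steps = 200, 1.2, 0.05, 2000   # agents, temptation, increment, rounds
p = rng.random(N)                            # probabilistic strategies in [0, 1]

def payoff(i):
    """One realisation of agent i's PD payoff against its two ring neighbours
    (weak PD: R=1, T=b, S=P=0)."""
    total = 0.0
    for j in ((i - 1) % N, (i + 1) % N):
        ci, cj = rng.random() < p[i], rng.random() < p[j]
        total += 1.0 if (ci and cj) else (b if (not ci) and cj else 0.0)
    return total

for _ in range(steps):
    i = rng.integers(N)
    j = (i + rng.choice([-1, 1])) % N        # a random neighbour
    if payoff(j) > payoff(i):                # move towards the better neighbour
        p[i] = np.clip(p[i] + delta * np.sign(p[j] - p[i]), 0.0, 1.0)

print("mean cooperation probability:", p.mean())
```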

  2. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability of being covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.

  3. Probabilistic safety assessment goals in Canada

    International Nuclear Information System (INIS)

    Snell, V.G.

    1986-01-01

    CANDU safety philosophy, both in design and in licensing, has always had a strong bias towards quantitative probabilistically-based goals derived from comparative safety. Formal probabilistic safety assessment began in Canada as a design tool. The influence of this carried over later on into the definition of the deterministic safety guidelines used in CANDU licensing. Design goals were further developed which extended the consequence/frequency spectrum of 'acceptable' events, from the two points defined by the deterministic single/dual failure analysis, to a line passing through lower and higher frequencies. Since these were design tools, a complete risk summation was not necessary, allowing a cutoff at low event frequencies while preserving the identification of the most significant safety-related events. These goals gave a logical framework for making decisions on implementing design changes proposed as a result of the Probabilistic Safety Analysis. Performing this analysis became a regulatory requirement, and the design goals remained the framework under which this was submitted. Recently, there have been initiatives to incorporate more detailed probabilistic safety goals into the regulatory process in Canada. These range from far-reaching safety optimization across society, to initiatives aimed at the nuclear industry only. The effectiveness of the latter is minor at very low and very high event frequencies; at medium frequencies, a justification against expenditures per life saved in other industries should be part of the goal setting

  4. Probabilistic Design of Coastal Flood Defences in Vietnam

    NARCIS (Netherlands)

    Mai Van, C.

    2010-01-01

    This study further develops the method of probabilistic design to address a knowledge gap in its application regarding safety and reliability, risk assessment and risk evaluation in the field of flood defences. The thesis discusses: - a generic probabilistic design framework for assessing flood

  5. Probabilistic Teleportation via Quantum Channel with Partial Information

    Directory of Open Access Journals (Sweden)

    Desheng Liu

    2015-06-01

    Full Text Available Two novel schemes are proposed to teleport an unknown two-level quantum state probabilistically when the sender and the receiver only have partial information about the quantum channel, respectively. This is distinct from the fact that either the sender or the receiver has entire information about the quantum channel in previous schemes for probabilistic teleportation. Theoretical analysis proves that these schemes are straightforward, efficient and cost-saving. The concrete realization procedures of our schemes are presented in detail, and the result shows that our proposals could extend the application range of probabilistic teleportation.

  6. Probabilistic assessment of pressure vessel and piping reliability

    International Nuclear Information System (INIS)

    Sundararajan, C.

    1986-01-01

    The paper presents a critical review of the state-of-the-art in probabilistic assessment of pressure vessel and piping reliability. First the differences in assessing the reliability directly from historical failure data and indirectly by a probabilistic analysis of the failure phenomenon are discussed and the advantages and disadvantages are pointed out. The rest of the paper deals with the latter approach of reliability assessment. Methods of probabilistic reliability assessment are described and major projects where these methods are applied for pressure vessel and piping problems are discussed. An extensive list of references is provided at the end of the paper

  7. Expanding Earth and Space Science through the Initiating New Science Partnerships In Rural Education (INSPIRE)

    Science.gov (United States)

    Radencic, S.; McNeal, K. S.; Pierce, D.; Hare, D.

    2010-12-01

    The INSPIRE program at Mississippi State University (MSU), funded by the NSF Graduate STEM Fellows in K-12 Education (GK12) program, focuses on Earth and Space science education and has partnered ten graduate students from MSU with five teachers from local, rural school districts. For the next five years the project will serve to increase inquiry and technology experiences in science and math while enhancing graduate students' communication skills. Graduate students from the disciplines of Geosciences, Physics, and Engineering are partnered with Chemistry, Physical Science, Physics, Geometry and Middle school science classrooms and will create engaging inquiry activities that incorporate elements of their research, and integrate various forms of technology. The generated lesson plans that are implemented in the classroom are published on the INSPIRE home page (www.gk12.msstate.edu) so that other classroom instructors can utilize this free resource. Local 7th-12th grade students will attend GIS day later this fall at MSU to increase their understanding and interest in Earth and Space sciences. Selected graduate students and teachers will visit one of four international university partners located in Poland, Australia, England, or The Bahamas to engage in research abroad. Upon return they will incorporate their global experiences into their local classrooms. Planning for the project included many factors important to the success of the partnerships. The need for the program was evident in Mississippi K-12 schools based on low performance on high stakes assessments and lack of curriculum in the Earth and Space sciences. Meeting with administrators to determine what needs they would like addressed by the project and recognizing the individual differences among the schools were integral components to tailoring project goals and to meeting the unique needs of each school partner. Time for training and team building of INSPIRE teachers and graduate students before the

  8. Probabilistic calculation of dose commitment from uranium mill tailings

    International Nuclear Information System (INIS)

    1983-10-01

    The report discusses in a general way considerations of uncertainty in relation to probabilistic modelling. An example of a probabilistic calculation applied to the behaviour of uranium mill tailings is given

  9. Foundation plate on the elastic half-space, deterministic and probabilistic approach

    Directory of Open Access Journals (Sweden)

    Tvrdá Katarína

    2017-01-01

    Full Text Available Interaction between the foundation plate and the subgrade can be described by different mathematical-physical models. An elastic foundation can be modelled by different types of models, e.g. a one-parametric model, a two-parametric model and a comprehensive model; here the Boussinesq model (elastic half-space) has been used. The article deals with deterministic and probabilistic analysis of the deflection of a foundation plate on the elastic half-space. Contact between the foundation plate and the subsoil was modelled using node-node contact elements. At the end the obtained results are presented.

  10. Probabilistic safety goals. Phase 3 - Status report

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT (Finland)); Knochenhauer, M. (Relcon Scandpower AB, Sundbyberg (Sweden))

    2009-07-15

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  11. Probabilistic safety goals. Phase 3 - Status report

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Knochenhauer, M.

    2009-07-01

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  12. Probabilistic Linguistic Power Aggregation Operators for Multi-Criteria Group Decision Making

    Directory of Open Access Journals (Sweden)

    Agbodah Kobina

    2017-12-01

    Full Text Available As an effective aggregation tool, the power average (PA) allows the input arguments being aggregated to support and reinforce each other, which provides more versatility in the information aggregation process. Under the probabilistic linguistic term environment, we investigate new power aggregation (PA) operators for fusing probabilistic linguistic term sets (PLTSs). In this paper, we first develop the probabilistic linguistic power average (PLPA) and the weighted probabilistic linguistic power average (WPLPA) operators, as well as the probabilistic linguistic power geometric (PLPG) and the weighted probabilistic linguistic power geometric (WPLPG) operators. At the same time, we carefully analyze the properties of these new aggregation operators. With the aid of the WPLPA and WPLPG operators, we further design approaches for the application of multi-criteria group decision-making (MCGDM) with PLTSs. Finally, we use an illustrative example to expound our proposed methods and verify their performance.
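
    To make the aggregation mechanism concrete, the following sketch implements the classical (crisp) power average operator that the PLPA/WPLPA operators generalise, using the common support function Sup(a_i, a_j) = 1 - |a_i - a_j| on arguments normalised to [0, 1]. Both the support function and the numeric arguments are assumptions for illustration; the probabilistic-linguistic versions in the paper operate on PLTSs rather than crisp numbers.

    # Minimal sketch of the crisp power average (PA) operator that the PLPA/WPLPA
    # operators generalise. Support is taken as Sup(a_i, a_j) = 1 - |a_i - a_j|
    # for arguments normalised to [0, 1]; this choice is an assumption for
    # illustration, not the paper's probabilistic-linguistic formulation.
    def power_average(args):
        # T(a_i): total support that a_i receives from all other arguments
        t = [sum(1.0 - abs(a_i - a_j) for j, a_j in enumerate(args) if j != i)
             for i, a_i in enumerate(args)]
        weights = [1.0 + t_i for t_i in t]
        return sum(w * a for w, a in zip(weights, args)) / sum(weights)

    # Arguments that support each other dominate; the outlier 0.95 is down-weighted.
    print(power_average([0.30, 0.32, 0.35, 0.95]))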

  13. Visualizing Probabilistic Proof

    OpenAIRE

    Guerra-Pujol, Enrique

    2015-01-01

    The author revisits the Blue Bus Problem, a famous thought-experiment in law involving probabilistic proof, and presents simple Bayesian solutions to different versions of the blue bus hypothetical. In addition, the author expresses his solutions in standard and visual formats, i.e. in terms of probabilities and natural frequencies.
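
    As a hedged worked example of the kind of Bayesian solution referred to above (with invented figures: an 80% market-share prior and a 75% reliable witness, not necessarily the author's numbers), Bayes' rule and the equivalent natural-frequency reading can be written out directly:

    # Bayes' rule for a Blue Bus-style hypothetical with assumed numbers:
    # prior P(Blue) = 0.8 (market share), and a witness who identifies the
    # colour correctly with probability 0.75. The figures are illustrative.
    p_blue = 0.80                    # prior: share of buses run by the Blue company
    p_say_blue_given_blue = 0.75     # witness says "blue" when the bus was blue
    p_say_blue_given_other = 0.25    # witness says "blue" when it was not

    p_say_blue = (p_say_blue_given_blue * p_blue
                  + p_say_blue_given_other * (1.0 - p_blue))
    posterior = p_say_blue_given_blue * p_blue / p_say_blue
    print(f"P(Blue bus | witness says blue) = {posterior:.3f}")   # ~0.923

    # The same result in natural frequencies: out of 1000 accidents, 800 involve
    # blue buses (600 correctly reported blue) and 200 involve other buses
    # (50 wrongly reported blue): 600 / (600 + 50) = 0.923.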

  14. Probabilistic Volcanic Multi-Hazard Assessment at Somma-Vesuvius (Italy): coupling Bayesian Belief Networks with a physical model for lahar propagation

    Science.gov (United States)

    Tierz, Pablo; Woodhouse, Mark; Phillips, Jeremy; Sandri, Laura; Selva, Jacopo; Marzocchi, Warner; Odbert, Henry

    2017-04-01

    Volcanoes are extremely complex physico-chemical systems where magma formed at depth breaks through the planet's surface, resulting in major hazards from local to global scales. Volcano physics is dominated by non-linearities and complicated spatio-temporal interrelationships, which make volcanic hazards stochastic (i.e. not deterministic) by nature. In this context, probabilistic assessments are required to quantify the large uncertainties related to volcanic hazards. Moreover, volcanoes are typically multi-hazard environments where different hazardous processes can occur either simultaneously or in succession. In particular, explosive volcanoes are able to deposit, through tephra fallout and Pyroclastic Density Currents (PDCs), large amounts of pyroclastic material in the drainage basins surrounding the volcano. This addition of fresh particulate material alters the local/regional hydrogeological equilibrium and increases the frequency and magnitude of sediment-rich aqueous flows, commonly known as lahars. The initiation and volume of rain-triggered lahars may depend on: rainfall intensity and duration; antecedent rainfall; terrain slope; thickness, permeability and hydraulic diffusivity of the tephra deposit; etc. Quantifying these complex interrelationships (and their uncertainties), in a tractable manner, requires a structured but flexible probabilistic approach. A Bayesian Belief Network (BBN) is a directed acyclic graph that allows the representation of the joint probability distribution for a set of uncertain variables in a compact and efficient way, by exploiting unconditional and conditional independences between these variables. Once constructed and parametrized, the BBN uses Bayesian inference to perform causal (e.g. forecast) and/or evidential reasoning (e.g. explanation) about query variables, given some evidence. In this work, we illustrate how BBNs can be used to model the influence of several variables on the generation of rain-triggered lahars
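
    The following toy sketch illustrates the BBN mechanics described above with a three-node network (Rainfall and TephraThickness jointly conditioning Lahar) and brute-force enumeration for inference. The variables, structure and probabilities are invented for illustration and are not those of the Somma-Vesuvius study.

    # Toy Bayesian Belief Network sketch: Rainfall and TephraThickness jointly
    # condition Lahar occurrence. All probabilities are invented for illustration.
    from itertools import product

    p_rain = {"light": 0.7, "heavy": 0.3}
    p_tephra = {"thin": 0.6, "thick": 0.4}
    # P(lahar = yes | rainfall, tephra)
    p_lahar_yes = {
        ("light", "thin"): 0.02, ("light", "thick"): 0.10,
        ("heavy", "thin"): 0.15, ("heavy", "thick"): 0.60,
    }

    def joint(rain, tephra, lahar):
        p = p_rain[rain] * p_tephra[tephra]
        p_yes = p_lahar_yes[(rain, tephra)]
        return p * (p_yes if lahar == "yes" else 1.0 - p_yes)

    def posterior(target, target_value, evidence):
        """P(target = target_value | evidence) by enumeration over the full joint."""
        num = den = 0.0
        for rain, tephra, lahar in product(p_rain, p_tephra, ["yes", "no"]):
            state = {"rainfall": rain, "tephra": tephra, "lahar": lahar}
            if any(state[k] != v for k, v in evidence.items()):
                continue
            p = joint(rain, tephra, lahar)
            den += p
            if state[target] == target_value:
                num += p
        return num / den

    # Causal reasoning (forecast): how likely is a lahar after heavy rainfall?
    print(posterior("lahar", "yes", {"rainfall": "heavy"}))
    # Evidential reasoning (explanation): given a lahar, was rainfall heavy?
    print(posterior("rainfall", "heavy", {"lahar": "yes"}))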

  15. Probabilistic modelling of the high-pressure arc cathode spot displacement dynamic

    International Nuclear Information System (INIS)

    Coulombe, Sylvain

    2003-01-01

    A probabilistic modelling approach for the study of the cathode spot displacement dynamic in high-pressure arc systems is developed in an attempt to interpret the observed voltage fluctuations. The general framework of the model allows the definition of simple, probabilistic displacement rules, the so-called cathode spot dynamic rules, for various possible surface states (un-arced metal, arced, contaminated) and the study of the resulting dynamic of the cathode spot displacements over one or several arc passages. The displacements of the type-A cathode spot (macro-spot) in a magnetically rotating arc using concentric electrodes made up of either clean or contaminated metal surfaces are considered. Experimental observations for this system revealed a 1/f^(~1) signature in the frequency power spectrum (FPS) of the arc voltage for anchoring arc conditions on the cathode (e.g. clean metal surface), while it shows a 'white noise' signature for conditions favouring a smooth movement (e.g. oxide-contaminated cathode surface). Through an appropriate choice of the local probabilistic displacement rules, the model is able to correctly represent the dynamic behaviours of the type-A cathode spot, including the FPS for the arc elongation (i.e. voltage) and the arc erosion trace formation. The model illustrates that the cathode spot displacements between re-strikes can be seen as a diffusion process with a diffusion constant which depends on the surface structure. A physical interpretation for the jumping probability associated with the re-strike event is given in terms of the electron emission processes across dielectric contaminants present on the cathode surface

  16. Allothetic and idiothetic sensor fusion in rat-inspired robot localization

    Science.gov (United States)

    Weitzenfeld, Alfredo; Fellous, Jean-Marc; Barrera, Alejandra; Tejera, Gonzalo

    2012-06-01

    We describe a spatial cognition model based on the rat's brain neurophysiology as a basis for new robotic navigation architectures. The model integrates allothetic (external visual landmarks) and idiothetic (internal kinesthetic information) cues to train either rat or robot to learn a path enabling it to reach a goal from multiple starting positions. It stands in contrast to most robotic architectures based on SLAM, where a map of the environment is built to provide probabilistic localization information computed from robot odometry and landmark perception. Allothetic cues suffer in general from perceptual ambiguity when trying to distinguish between places with equivalent visual patterns, while idiothetic cues suffer from imprecise motions and limited memory recalls. We experiment with both types of cues in different maze configurations by training rats and robots to find the goal starting from a fixed location, and then testing them to reach the same target from new starting locations. We show that the robot, after having pre-explored a maze, can find a goal with improved efficiency, and is able to (1) learn the correct route to reach the goal, (2) recognize places already visited, and (3) exploit allothetic and idiothetic cues to improve on its performance. We finally contrast our biologically-inspired approach to more traditional robotic approaches and discuss current work in progress.

  17. Documentation design for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Parkinson, W.J.; von Herrmann, J.L.

    1985-01-01

    This paper describes a framework for documentation design of probabilistic risk assessment (PRA) and is based on the EPRI document NP-3470, "Documentation Design for Probabilistic Risk Assessment". The goals for PRA documentation are stated. Four audiences which PRA documentation must satisfy are identified, and the documentation products consistent with the needs of the various audiences are discussed, i.e., the Summary Report, the Executive Summary, the Main Report, and Appendices. The authors recommend the documentation specifications discussed herein as guides rather than rigid definitions

  18. The probabilistic approach in the licensing process and the development of probabilistic risk assessment methodology in Japan

    International Nuclear Information System (INIS)

    Togo, Y.; Sato, K.

    1981-01-01

    The probabilistic approach has long seemed to be one of the most comprehensive methods for evaluating the safety of nuclear plants. So far, most of the guidelines and criteria for licensing are based on the deterministic concept. However, there have been a few examples to which the probabilistic approach was directly applied, such as the evaluation of aircraft crashes and turbine missiles. One may find other examples of such applications. However, a much more important role is now to be played by this concept, in implementing the 52 recommendations from the lessons learned from the TMI accident. To develop the probabilistic risk assessment methodology most relevant to Japanese situations, a five-year programme plan has been adopted and is to be conducted by the Japan Atomic Energy Research Institute from fiscal 1980. Various problems have been identified and are to be solved through this programme plan. The current status of developments is described together with activities outside the government programme. (author)

  19. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  20. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    DEFF Research Database (Denmark)

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

    In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurate assessment of business failure prediction, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful studies on bankruptcy detection, seldom have probabilistic approaches been carried out. In this paper we assume a probabilistic point-of-view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing it against the Support Vector Machines (SVM) and the Logistic Regression (LR). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...
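
    A minimal sketch of the comparison described above, using scikit-learn on synthetic stand-in data (the study used real bankruptcy data and its own model settings); the point is simply that the GP, like the other classifiers shown here, returns class probabilities via predict_proba rather than hard labels.

    # Sketch of a GP vs. SVM vs. LR comparison on synthetic data; it only
    # illustrates the probabilistic output, not the study's actual results.
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.gaussian_process import GaussianProcessClassifier
    from sklearn.gaussian_process.kernels import RBF
    from sklearn.svm import SVC
    from sklearn.linear_model import LogisticRegression

    X, y = make_classification(n_samples=600, n_features=10, random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    models = {
        "GP":  GaussianProcessClassifier(kernel=1.0 * RBF(1.0), random_state=0),
        "SVM": SVC(probability=True, random_state=0),
        "LR":  LogisticRegression(max_iter=1000),
    }
    for name, model in models.items():
        model.fit(X_tr, y_tr)
        acc = model.score(X_te, y_te)
        # predict_proba gives the probabilistic "failure" assessment per firm
        p_fail = model.predict_proba(X_te[:3])[:, 1]
        print(f"{name:3s} accuracy = {acc:.3f}, example P(bankrupt) = {p_fail.round(2)}")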

  1. Sensitivity analysis in multi-parameter probabilistic systems

    International Nuclear Information System (INIS)

    Walker, J.R.

    1987-01-01

    Probabilistic methods involving the use of multi-parameter Monte Carlo analysis can be applied to a wide range of engineering systems. The output from the Monte Carlo analysis is a probabilistic estimate of the system consequence, which can vary spatially and temporally. Sensitivity analysis aims to examine how the output consequence is influenced by the input parameter values. Sensitivity analysis provides the necessary information so that the engineering properties of the system can be optimized. This report details a package of sensitivity analysis techniques that together form an integrated methodology for the sensitivity analysis of probabilistic systems. The techniques have known confidence limits and can be applied to a wide range of engineering problems. The sensitivity analysis methodology is illustrated by performing the sensitivity analysis of the MCROC rock microcracking model
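
    As a minimal sketch of the kind of input-output sensitivity screening described above (not the MCROC model or the report's specific techniques), one can sample the uncertain inputs, run a toy consequence model, and rank the inputs by their correlation with the output:

    # Minimal Monte Carlo sensitivity sketch: rank uncertain inputs by their
    # correlation with a toy consequence model's output. Distributions and the
    # model itself are assumed for illustration only.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 20_000

    x1 = rng.normal(1.0, 0.10, n)      # strong influence
    x2 = rng.normal(1.0, 0.10, n)      # weak influence
    x3 = rng.uniform(0.5, 1.5, n)      # moderate influence

    # Toy consequence model
    y = 5.0 * x1 + 0.5 * x2 + 2.0 * x3 + rng.normal(0.0, 0.2, n)

    for name, x in [("x1", x1), ("x2", x2), ("x3", x3)]:
        r = np.corrcoef(x, y)[0, 1]
        print(f"{name}: correlation with consequence = {r:+.2f}")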

  2. Business Inspiration: Small Business Leadership in Recovery?

    Science.gov (United States)

    Rae, David; Price, Liz; Bosworth, Gary; Parkinson, Paul

    2012-01-01

    Business Inspiration was a short, action-centred leadership and innovation development programme designed for owners and managers of smaller firms to address business survival and repositioning needs arising from the UK's economic downturn. The article examines the design and delivery of Business Inspiration and the impact of the programme on…

  3. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity are of crucial importance when constructing such probabilistic forecasts. Thereby we focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults applied to Californian and European data. Our model is independent from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a

  4. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to prevent catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distribution of seismicity are of crucial importance when constructing such probabilistic forecasts. Thereby we focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of these current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults applied to Californian and European data. Our model is independent from biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events. We propose a
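
    The Gutenberg-Richter step mentioned above can be illustrated with a small sketch: estimate the b-value from a catalogue using Aki's maximum-likelihood formula and extrapolate the rates of larger, hazard-relevant events. The catalogue here is synthetic, and the thesis' kernel-smoothed spatial model and fault slip-rate ingredients are not reproduced.

    # Gutenberg-Richter sketch: maximum-likelihood b-value (Aki) from a synthetic
    # catalogue, then extrapolated annual rates of larger events.
    import numpy as np

    rng = np.random.default_rng(42)
    m_c = 2.0            # completeness magnitude (assumed)
    b_true = 1.0
    years = 30.0

    # Magnitudes above m_c follow an exponential distribution under G-R
    mags = m_c + rng.exponential(scale=np.log10(np.e) / b_true, size=3000)

    # Aki (1965) maximum-likelihood estimator of b
    b_hat = np.log10(np.e) / (mags.mean() - m_c)
    rate_mc = len(mags) / years                      # annual rate of M >= m_c

    for m in (4.0, 5.0, 6.0):
        rate_m = rate_mc * 10.0 ** (-b_hat * (m - m_c))
        print(f"M >= {m:.1f}: {rate_m:.3f} events/year (~{1.0 / rate_m:.0f}-yr recurrence)")
    print(f"estimated b-value: {b_hat:.2f}")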

  5. HMM_Model-Checker for probabilistic verification ...

    African Journals Online (AJOL)

    ASSIA

    Probabilistic verification for embedded systems continues to attract more and more followers in the research community. Given a probabilistic model, a formula of temporal logic describing a property of a system, and an exploration algorithm to check whether the property is satisfied ...

  6. Wireless synapses in bio-inspired neural networks

    Science.gov (United States)

    Jannson, Tomasz; Forrester, Thomas; Degrood, Kevin

    2009-05-01

    Wireless (virtual) synapses represent a novel approach to bio-inspired neural networks that follow the infrastructure of the biological brain, except that biological (physical) synapses are replaced by virtual ones based on cellular telephony modeling. Such synapses are of two types: intracluster synapses are based on IR wireless ones, while intercluster synapses are based on RF wireless ones. Such synapses have three unique features, atypical of conventional artificial ones: very high parallelism (close to that of the human brain), very high reconfigurability (easy to kill and to create), and very high plasticity (easy to modify or upgrade). In this paper we analyze the general concept of wireless synapses with special emphasis on RF wireless synapses. Also, biological mammalian (vertebrate) neural models are discussed for comparison, and a novel neural lensing effect is discussed in detail.

  7. Delivery of Functionality in Complex Food Systems: Physically inspired approaches from nanoscale to microscale, Paris 14 to 17 July, 2015.

    Science.gov (United States)

    Relkin, Perla

    2016-10-01

    The 6th international symposium in the series "Delivery of Functionality in Complex Food Systems: Physically inspired approaches from nanoscale to microscale" was held in the heart of Paris from 14 to 17 July, 2015. It brought together PhD students, academic food researchers and industrial participants from diverse food sectors. The scientific sessions of this meeting were built around important topics dealing with 1) Engineering of tailor-made structures in bio-based systems; 2) Complexity and emergent phenomena in integrative food science; 3) Investigation of nano- and microstructures in the bulk and at interfaces; 4) Modeling approaches from bio-molecules and matrix structures to functionality; 5) Tuning binding & release of bioactive compounds by matrix modulation; and finally 6) Tuning the delivery of functionality to the body. These topics were selected to cover different scientific fields and to show the contribution of food physical structures to the development of health- and pleasure-supporting food functions. The oral communications were all introduced by keynote speakers and illustrated by outstanding, high-quality short communications. One of the most original features of this symposium was the increasing number of presentations using multiscale and modeling approaches, illustrating the concept of complexity and emergent phenomena in integrative food science. These highlighted the importance of studies on interactions between the structural properties of engineered delivery systems and the human body (sensory properties, digestion, release, bioavailability and bioaccessibility). Copyright © 2016 Elsevier Ltd. All rights reserved.

  8. Nature as inspiration for leisure education

    OpenAIRE

    ŠPIRHANZLOVÁ, Andrea

    2017-01-01

    The thesis deals with the organization of leisure activities in which the main tool and inspiration is nature. The theoretical part defines basic concepts of leisure-time pedagogy and points to the possibility of using nature as an inspiration not only for creating the content of leisure activities, but also as the environment in which the pedagogical-educational process takes place. The practical part contains a specific pedagogical-educational activity whose essence is b...

  9. Evaluation of Probabilistic Reasoning Evidence from Seventh-Graders

    Science.gov (United States)

    Erdem, Emrullah; Gürbüz, Ramazan

    2016-01-01

    The purpose of this study was to evaluate probabilistic reasoning of seventh-grade students (N=167) studying at randomly selected three middle schools that served low and middle socioeconomic areas in a city of Turkey. "Probabilistic Reasoning Test (PRT)" was developed and used as a data collection tool. In analyzing the data,…

  10. When catalysis is useful for probabilistic entanglement transformation

    International Nuclear Information System (INIS)

    Feng Yuan; Duan Runyao; Ying Mingsheng

    2004-01-01

    We determine all 2x2 quantum states that can serve as useful catalysts for a given probabilistic entanglement transformation, in the sense that they can increase the maximal transformation probability. When higher-dimensional catalysts are considered, a sufficient and necessary condition is derived under which a certain probabilistic transformation has useful catalysts

  11. Probabilistic Damage Stability Calculations for Ships

    DEFF Research Database (Denmark)

    Jensen, Jørgen Juncher

    1996-01-01

    The aim of these notes is to provide background material for the present probabilistic damage stability rules for dry cargo ships. The formulas for the damage statistics are derived and shortcomings as well as possible improvements are discussed. The advantage of the definition of fictitious compartments in the formulation of a computer-based general procedure for probabilistic damaged stability assessment is shown. Some comments are given on the current state of knowledge on ship survivability in damaged conditions. Finally, problems regarding proper account of water ingress through openings...

  12. Quantum logic networks for probabilistic teleportation

    Institute of Scientific and Technical Information of China (English)

    刘金明; 张永生; 等

    2003-01-01

    By means of primitive operations consisting of single-qubit gates, two-qubit controlled-NOT gates, von Neumann measurements and classically controlled operations, we construct efficient quantum logic networks for implementing probabilistic teleportation of a single qubit, a two-particle entangled state, and an N-particle entanglement. Based on the quantum networks, we show that after the partially entangled states are concentrated into maximal entanglement, the above three kinds of probabilistic teleportation are the same as the standard teleportation using the corresponding maximally entangled states as the quantum channels.

  13. Probabilistic pathway construction.

    Science.gov (United States)

    Yousofshahi, Mona; Lee, Kyongbum; Hassoun, Soha

    2011-07-01

    Expression of novel synthesis pathways in host organisms amenable to genetic manipulations has emerged as an attractive metabolic engineering strategy to overproduce natural products, biofuels, biopolymers and other commercially useful metabolites. We present a pathway construction algorithm for identifying viable synthesis pathways compatible with balanced cell growth. Rather than exhaustive exploration, we investigate probabilistic selection of reactions to construct the pathways. Three different selection schemes are investigated for the selection of reactions: high metabolite connectivity, low connectivity and uniformly random. For all case studies, which involved a diverse set of target metabolites, the uniformly random selection scheme resulted in the highest average maximum yield. When compared to an exhaustive search enumerating all possible reaction routes, our probabilistic algorithm returned nearly identical distributions of yields, while requiring far less computing time (minutes vs. years). The pathways identified by our algorithm have previously been confirmed in the literature as viable, high-yield synthesis routes. Prospectively, our algorithm could facilitate the design of novel, non-native synthesis routes by efficiently exploring the diversity of biochemical transformations in nature. Copyright © 2011 Elsevier Inc. All rights reserved.
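
    A toy sketch of the probabilistic selection idea described above: starting from the target metabolite, a producing reaction is chosen at random under one of the three weighting schemes (high connectivity, low connectivity, uniform). The reaction network is invented, and the growth/yield check used in the actual algorithm is omitted.

    # Toy probabilistic pathway construction: walk backwards from a target
    # metabolite, selecting a producing reaction under one of three schemes.
    import random

    # reaction -> (substrate, product); connectivity counts reactions per metabolite
    reactions = {
        "r1": ("glucose", "A"), "r2": ("A", "B"), "r3": ("A", "C"),
        "r4": ("B", "target"), "r5": ("C", "target"), "r6": ("glucose", "C"),
    }
    connectivity = {}
    for s, p in reactions.values():
        connectivity[s] = connectivity.get(s, 0) + 1
        connectivity[p] = connectivity.get(p, 0) + 1

    def build_pathway(scheme, start="glucose", target="target"):
        path, metabolite = [], target
        while metabolite != start:
            candidates = [r for r, (s, p) in reactions.items() if p == metabolite]
            if scheme == "high":        # favour highly connected substrates
                weights = [connectivity[reactions[r][0]] for r in candidates]
            elif scheme == "low":       # favour poorly connected substrates
                weights = [1.0 / connectivity[reactions[r][0]] for r in candidates]
            else:                       # uniformly random
                weights = [1.0] * len(candidates)
            chosen = random.choices(candidates, weights=weights, k=1)[0]
            path.append(chosen)
            metabolite = reactions[chosen][0]
        return list(reversed(path))

    random.seed(3)
    for scheme in ("high", "low", "uniform"):
        print(scheme, build_pathway(scheme))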

  14. Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones

    Science.gov (United States)

    Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto

    2015-04-01

    Probabilistic seismic hazard assessment (PSHA), usually adopted in the framework of seismic code drafting, is based on a Poissonian description of the temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with log-normal distribution of PGA or response spectrum. The main positive aspect of this approach stems from the fact that it is presently a standard for the majority of countries, but there are weak points, in particular regarding the physical description of the earthquake phenomenon. Factors like site effects and source characteristics such as duration of the strong motion and directivity, which could significantly influence the expected motion at the site, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their prediction reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for a magnitude of less than 6. We focus on sites prone to liquefaction because of their soil mechanical parameters and water table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions. The looser the soil and the higher the liquefaction potential, the more suitable is the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of the approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billions
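
    The probabilistic ingredients listed above (Poissonian occurrence in time, exponential magnitude distribution, log-normal attenuation scatter) can be sketched numerically as follows; the attenuation coefficients, source rate and distance are invented for illustration and do not correspond to the Emilia-Romagna study.

    # PSHA ingredients sketch: Gutenberg-Richter magnitudes, a toy log-normal
    # attenuation relation, and Poissonian occurrence in time. All coefficients
    # and source parameters are assumed values.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 200_000

    rate = 0.05          # mean annual rate of M >= 5 events on the source (assumed)
    b = 1.0              # Gutenberg-Richter b-value (assumed)
    m_min, m_max = 5.0, 7.5
    distance_km = 20.0   # source-to-site distance (assumed)

    # Sample magnitudes from a truncated exponential (Gutenberg-Richter) distribution
    beta = b * np.log(10.0)
    u = rng.random(n)
    mags = m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

    # Toy attenuation relation with log-normal scatter (coefficients are assumed)
    ln_pga = -4.0 + 1.0 * mags - 1.3 * np.log(distance_km + 10.0) + rng.normal(0.0, 0.6, n)
    pga = np.exp(ln_pga)          # in g

    for a in (0.1, 0.2, 0.4):
        p_exceed_given_event = np.mean(pga >= a)
        lam = rate * p_exceed_given_event              # annual exceedance rate
        p_50yr = 1.0 - np.exp(-lam * 50.0)             # Poisson assumption
        print(f"PGA >= {a:.1f} g: annual rate {lam:.2e}, P(exceedance in 50 yr) = {p_50yr:.3f}")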

  15. Volume 2. Probabilistic analysis of HTGR application studies. Supporting data

    International Nuclear Information System (INIS)

    1980-09-01

    Volume II, Probabilistic Analysis of HTGR Application Studies - Supporting Data, gives the detailed data, both deterministic and probabilistic, employed in the calculations presented in Volume I. The HTGR plants and the fossil plants considered in the study are listed. GCRA provided the technical experts from whom the data were obtained by MAC personnel. The names of the technical experts (interviewees) and the analysts (interviewers) are given for the probabilistic data

  16. On the progress towards probabilistic basis for deterministic codes

    International Nuclear Information System (INIS)

    Ellyin, F.

    1975-01-01

    Fundamental arguments for a probabilistic basis of codes are presented. A class of code formats is outlined in which explicit statistical measures of uncertainty of design variables are incorporated. The format looks very much like present (deterministic) codes except for having a probabilistic background. An example is provided whereby the design factors are plotted against the safety index, the probability of failure, and the risk of mortality. The safety level of the present codes is also indicated. A decision regarding the new probabilistically based code parameters could thus be made with full knowledge of the implied consequences

  17. A Probabilistic Analysis of the Sacco and Vanzetti Evidence

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    A Probabilistic Analysis of the Sacco and Vanzetti Evidence is a Bayesian analysis of the trial and post-trial evidence in the Sacco and Vanzetti case, based on subjectively determined probabilities and assumed relationships among evidential events. It applies the ideas of charting evidence and probabilistic assessment to this case, which is perhaps the ranking cause celebre in all of American legal history. Modern computation methods applied to inference networks are used to show how the inferential force of evidence in a complicated case can be graded. The authors employ probabilistic assess

  18. Inspiration and the Texts of the Bible

    Directory of Open Access Journals (Sweden)

    Dirk Buchner

    1997-12-01

    Full Text Available This article seeks to explore what the inspired text of the Old Testament was as it existed for the New Testament authors, particularly for the author of the book of Hebrews. A quick look at the facts makes it clear that there was, at the time, more than one 'inspired' text; among these were the Septuagint and the Masoretic Text, to name but two. The latter eventually gained ascendancy, which is why it forms the basis of our translated Old Testament today. Yet we have to ask: what do we make of that other text that was the inspired Bible to the early Church, especially to the writer of the book of Hebrews, who ignored the Masoretic text? This article will take a brief look at some suggestions for a doctrine of inspiration that keeps up with the facts of Scripture. Allied to this, the article is something of a bibliographical study of recent developments in textual research following the discovery of the Dead Sea scrolls.

  19. Influence of probabilistic safety analysis on design and operation of PWR plants

    International Nuclear Information System (INIS)

    Bastl, W.; Hoertner, H.; Kafka, P.

    1978-01-01

    This paper gives a comprehensive presentation of the connections and influences of probabilistic safety analysis on design and operation of PWR plants. In this context a short historical retrospective view concerning probabilistic reliability analysis is given. In the main part of this paper some examples are presented in detail, showing special outcomes of such probabilistic investigations. Additional paragraphs illustrate some activities and issues in the field of probabilistic safety analysis

  20. Creative design inspired by biological knowledge: Technologies and methods

    Science.gov (United States)

    Tan, Runhua; Liu, Wei; Cao, Guozhong; Shi, Yuan

    2018-05-01

    Biological knowledge is becoming an important source of inspiration for developing creative solutions to engineering design problems and even has a huge potential in formulating ideas that can help firms compete successfully in a dynamic market. To identify the technologies and methods that can facilitate the development of biologically inspired creative designs, this research briefly reviews the existing biological-knowledge-based theories and methods and examines the application of biological-knowledge-inspired designs in various fields. Afterward, this research thoroughly examines the four dimensions of key technologies that underlie the biologically inspired design (BID) process. This research then discusses the future development trends of the BID process before presenting the conclusions.

  1. Effectiveness of Securities with Fuzzy Probabilistic Return

    Directory of Open Access Journals (Sweden)

    Krzysztof Piasecki

    2011-01-01

    Full Text Available The generalized fuzzy present value of a security is defined here as fuzzy valued utility of cash flow. The generalized fuzzy present value cannot depend on the value of future cash flow. There exists such a generalized fuzzy present value which is not a fuzzy present value in the sense given by some authors. If the present value is a fuzzy number and the future value is a random one, then the return rate is given as a probabilistic fuzzy subset on a real line. This kind of return rate is called a fuzzy probabilistic return. The main goal of this paper is to derive the family of effective securities with fuzzy probabilistic return. Achieving this goal requires the study of the basic parameters characterizing fuzzy probabilistic return. Therefore, fuzzy expected value and variance are determined for this case of return. These results are a starting point for constructing a three-dimensional image. The set of effective securities is introduced as the Pareto optimal set determined by the maximization of the expected return rate and minimization of the variance. Finally, the set of effective securities is distinguished as a fuzzy set. These results are obtained without the assumption that the distribution of future values is Gaussian. (original abstract)

  2. Application of the probabilistic method at the E.D.F

    International Nuclear Information System (INIS)

    Gachot, Bernard

    1976-01-01

    Having first evoked the problems arising from the definition of a so-called 'acceptable risk', the author describes the probabilistic safety study programme carried out at the E.D.F. The different aspects of the probabilistic estimation of a hazard are presented, as well as the different steps which characterize the probabilistic study of safety problems, i.e. collecting the information and carrying out a qualitative and quantitative analysis. The problem of determining equipment reliability data is considered, noting as a conclusion that, in spite of the limited accuracy of the present data, probabilistic methods already appear to be a highly valuable tool favouring a homogeneous and coherent approach to nuclear plant safety [fr

  3. Probabilistic modeling of discourse-aware sentence processing.

    Science.gov (United States)

    Dubey, Amit; Keller, Frank; Sturt, Patrick

    2013-07-01

    Probabilistic models of sentence comprehension are increasingly relevant to questions concerning human language processing. However, such models are often limited to syntactic factors. This restriction is unrealistic in light of experimental results suggesting interactions between syntax and other forms of linguistic information in human sentence processing. To address this limitation, this article introduces two sentence processing models that augment a syntactic component with information about discourse co-reference. The novel combination of probabilistic syntactic components with co-reference classifiers permits them to more closely mimic human behavior than existing models. The first model uses a deep model of linguistics, based in part on probabilistic logic, allowing it to make qualitative predictions on experimental data; the second model uses shallow processing to make quantitative predictions on a broad-coverage reading-time corpus. Copyright © 2013 Cognitive Science Society, Inc.

  4. Probabilistic risk assessment in nuclear power plant regulation

    Energy Technology Data Exchange (ETDEWEB)

    Wall, J B

    1980-09-01

    A specific program is recommended to utilize probabilistic risk assessment more effectively in nuclear power plant regulation. It is based upon the engineering insights from the Reactor Safety Study (WASH-1400) and some follow-on risk assessment research by the USNRC. The Three Mile Island accident is briefly discussed from a risk viewpoint to illustrate a weakness in current practice. The development of a probabilistic safety goal is recommended, with some suggestions on underlying principles. Some ongoing work on risk perception and the draft probabilistic safety goal being reviewed in Canada is described. Some suggestions are offered on further risk assessment research. Finally, some recent U.S. Nuclear Regulatory Commission actions are described.

  5. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2006-12-01

    The development of a program to compute probabilistic seismic hazard, based on a graphical user interface (GUI), has been completed. The main program consists of three parts: the data input processes, the probabilistic seismic hazard analysis, and the result output processes. The probabilistic seismic hazard analysis needs various input data which represent attenuation formulae, the seismic zoning map, and the earthquake event catalog. The input procedures of previous programs, based on a text interface, took much time to prepare the data, and the data could not be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically in order to minimize human error as far as possible

  6. Non-probabilistic defect assessment for structures with cracks based on interval model

    International Nuclear Information System (INIS)

    Dai, Qiao; Zhou, Changyu; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-01-01

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on the clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced to the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on the interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) will vary in a certain range, which is defined as IFAC. And the assessment point will vary within a rectangle zone which is defined as an assessment rectangle. Based on the interval model, the establishment of IFAC and the determination of the assessment rectangle are presented. Then according to the interval possibility degree method, the non-probabilistic reliability degree of IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example for the assessment of a pipe with crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve the practical problem with interval variables

  7. Non-probabilistic defect assessment for structures with cracks based on interval model

    Energy Technology Data Exchange (ETDEWEB)

    Dai, Qiao; Zhou, Changyu, E-mail: changyu_zhou@163.com; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-09-15

    Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on the clear understanding of detailed statistical information of parameters. In this paper, the non-probabilistic approach is introduced to the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on the interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) will vary in a certain range, which is defined as IFAC. And the assessment point will vary within a rectangle zone which is defined as an assessment rectangle. Based on the interval model, the establishment of IFAC and the determination of the assessment rectangle are presented. Then according to the interval possibility degree method, the non-probabilistic reliability degree of IFAC can be determined. Meanwhile, in order to clearly introduce the non-probabilistic defect assessment method, a numerical example for the assessment of a pipe with crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve the practical problem with interval variables.
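
    The assessment-rectangle idea described above can be sketched with plain interval arithmetic: the load ratio Lr and the toughness ratio Kr are given as intervals, and the rectangle they span is compared against a failure assessment curve. An R6 Option 1-type curve and the numeric bounds are assumed here for illustration; the paper's IFAC construction and non-probabilistic reliability degree are not reproduced.

    # Interval-valued assessment point vs. an assumed R6 Option 1-type failure
    # assessment curve; a simple corner check stands in for the paper's method.
    import math

    def fac(lr):
        """Assumed R6 Option 1-type failure assessment curve."""
        return (1.0 - 0.14 * lr**2) * (0.3 + 0.7 * math.exp(-0.65 * lr**6))

    # Interval-valued assessment point (assumed bounds reflecting crack size,
    # load and material uncertainty): rectangle [lr_lo, lr_hi] x [kr_lo, kr_hi].
    lr_lo, lr_hi = 0.55, 0.75
    kr_lo, kr_hi = 0.45, 0.70

    # Worst case is the upper-right corner; best case is the lower-left corner.
    if kr_hi <= fac(lr_hi):
        verdict = "acceptable for all parameter combinations"
    elif kr_lo > fac(lr_lo):
        verdict = "unacceptable for all parameter combinations"
    else:
        verdict = "verdict depends on where the true point falls in the rectangle"
    print(verdict)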

  8. New probabilistic interest measures for association rules

    OpenAIRE

    Hahsler, Michael; Hornik, Kurt

    2008-01-01

    Mining association rules is an important technique for discovering meaningful patterns in transaction databases. Many different measures of interestingness have been proposed for association rules. However, these measures fail to take the probabilistic properties of the mined data into account. In this paper, we start with presenting a simple probabilistic framework for transaction data which can be used to simulate transaction data when no associations are present. We use such data and a rea...

  9. Invariant and semi-invariant probabilistic normed spaces

    Energy Technology Data Exchange (ETDEWEB)

    Ghaemi, M.B. [School of Mathematics Iran, University of Science and Technology, Narmak, Tehran (Iran, Islamic Republic of)], E-mail: mghaemi@iust.ac.ir; Lafuerza-Guillen, B. [Departamento de Estadistica y Matematica Aplicada, Universidad de Almeria, Almeria E-04120 (Spain)], E-mail: blafuerz@ual.es; Saiedinezhad, S. [School of Mathematics Iran, University of Science and Technology, Narmak, Tehran (Iran, Islamic Republic of)], E-mail: ssaiedinezhad@yahoo.com

    2009-10-15

    Probabilistic metric spaces were introduced by Karl Menger. Alsina, Schweizer and Sklar gave a general definition of probabilistic normed space based on the definition of Menger. We introduce the concept of semi-invariance among the PN spaces. In this paper we will find a sufficient condition for some PN spaces to be semi-invariant. We will show that PN spaces are normal spaces; Urysohn's lemma and the Tietze extension theorem are proved for them.

  10. A toolkit for integrated deterministic and probabilistic assessment for hydrogen infrastructure.

    Energy Technology Data Exchange (ETDEWEB)

    Groth, Katrina M.; Tchouvelev, Andrei V.

    2014-03-01

    There has been increasing interest in using Quantitative Risk Assessment [QRA] to help improve the safety of hydrogen infrastructure and applications. Hydrogen infrastructure for transportation (e.g. fueling fuel cell vehicles) or stationary (e.g. back-up power) applications is a relatively new area for application of QRA vs. traditional industrial production and use, and as a result there are few tools designed to enable QRA for this emerging sector. There are few existing QRA tools containing models that have been developed and validated for use in small-scale hydrogen applications. However, in the past several years, there has been significant progress in developing and validating deterministic physical and engineering models for hydrogen dispersion, ignition, and flame behavior. In parallel, there has been progress in developing defensible probabilistic models for the occurrence of events such as hydrogen release and ignition. While models and data are available, using this information is difficult due to a lack of readily available tools for integrating deterministic and probabilistic components into a single analysis framework. This paper discusses the first steps in building an integrated toolkit for performing QRA on hydrogen transportation technologies and suggests directions for extending the toolkit.

  11. A linear process-algebraic format for probabilistic systems with data (extended version)

    NARCIS (Netherlands)

    Katoen, Joost P.; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette; Timmer, Mark

    2010-01-01

    This paper presents a novel linear process-algebraic format for probabilistic automata. The key ingredient is a symbolic transformation of probabilistic process algebra terms that incorporate data into this linear format while preserving strong probabilistic bisimulation. This generalises similar

  12. Error Immune Logic for Low-Power Probabilistic Computing

    Directory of Open Access Journals (Sweden)

    Bo Marr

    2010-01-01

    design for the maximum amount of energy savings for a given error rate. SPICE simulation results using a commercially available and well-tested 0.25 μm technology are given, verifying the ultra-low-power probabilistic full-adder designs. Further, close to 6X energy savings is achieved for a probabilistic full-adder over the deterministic case.

  13. The Role of Language in Building Probabilistic Thinking

    Science.gov (United States)

    Nacarato, Adair Mendes; Grando, Regina Célia

    2014-01-01

    This paper is based on research that investigated the development of probabilistic language and thinking by students 10-12 years old. The focus was on the adequate use of probabilistic terms in social practice. A series of tasks was developed for the investigation and completed by the students working in groups. The discussions were video recorded…

  14. PRECIS -- A probabilistic risk assessment system

    International Nuclear Information System (INIS)

    Peterson, D.M.; Knowlton, R.G. Jr.

    1996-01-01

    A series of computer tools has been developed to conduct the exposure assessment and risk characterization phases of human health risk assessments within a probabilistic framework. The tools are collectively referred to as the Probabilistic Risk Evaluation and Characterization Investigation System (PRECIS). With this system, a risk assessor can calculate the doses and risks associated with multiple environmental and exposure pathways, for both chemicals and radioactive contaminants. Exposure assessment models in the system account for transport of contaminants to receptor points from a source zone originating in unsaturated soils above the water table. In addition to performing calculations of dose and risk based on initial concentrations, PRECIS can also be used in an inverse manner to compute soil concentrations in the source area that must not be exceeded if prescribed limits on dose or risk are to be met. Such soil contaminant levels, referred to as soil guidelines, are computed for both single contaminants and chemical mixtures and can be used as action levels or cleanup levels. Probabilistic estimates of risk, dose and soil guidelines are derived using Monte Carlo techniques

  15. Nature-Inspired Design : Strategies for Sustainable Product Development

    NARCIS (Netherlands)

    De Pauw, I.C.

    2015-01-01

    Product designers can apply different strategies, methods, and tools for sustainable product development. Nature-Inspired Design Strategies (NIDS) offer designers a distinct class of strategies that use ‘nature’ as a guiding source of knowledge and inspiration for addressing sustainability.

  16. The scientific study of inspiration in the creative process: Challenges and opportunities

    Directory of Open Access Journals (Sweden)

    Victoria C. Oleynick

    2014-06-01

    Full Text Available Inspiration is a motivational state that compels individuals to bring ideas to fruition. Creators have long argued that inspiration is important to the creative process, but until recently, scientists have not investigated this claim. In this article, we review challenges to the study of creative inspiration, as well as solutions to these challenges afforded by theoretical and empirical work on inspiration over the past decade. First, we discuss the problem of definitional ambiguity, which has been addressed through an integrative process of construct conceptualization. Second, we discuss the challenge of how to operationalize inspiration. This challenge has been overcome by the development and validation of the Inspiration Scale, which may be used to assess trait or state inspiration. Third, we address ambiguity regarding how inspiration differs from related concepts (creativity, insight, positive affect) by discussing discriminant validity. Next, we discuss the preconception that inspiration is less important than perspiration (effort), and we review empirical evidence that inspiration and effort both play important—but different—roles in the creative process. Finally, with many challenges overcome, we argue that the foundation is now set for a new generation of research focused on neural underpinnings. We discuss potential challenges to and opportunities for the neuroscientific study of inspiration. A better understanding of the biological basis of inspiration will illuminate the process through which creative ideas fire the soul, such that individuals are compelled to transform ideas into products and solutions that may benefit society.

  17. Efficient probabilistic model checking on general purpose graphic processors

    NARCIS (Netherlands)

    Bosnacki, D.; Edelkamp, S.; Sulewski, D.; Pasareanu, C.S.

    2009-01-01

    We present algorithms for parallel probabilistic model checking on general purpose graphic processing units (GPGPUs). For this purpose we exploit the fact that some of the basic algorithms for probabilistic model checking rely on matrix vector multiplication. Since this kind of linear algebraic

  18. A Probabilistic Framework for Security Scenarios with Dependent Actions

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweizer, Patrick; Albert, Elvira; Sekereinsk, Emil

    2014-01-01

    This work addresses the growing need of performing meaningful probabilistic analysis of security. We propose a framework that integrates the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. This allows us to perform

  19. Transitive probabilistic CLIR models.

    NARCIS (Netherlands)

    Kraaij, W.; de Jong, Franciska M.G.

    2004-01-01

    Transitive translation could be a useful technique to enlarge the number of supported language pairs for a cross-language information retrieval (CLIR) system in a cost-effective manner. The paper describes several setups for transitive translation based on probabilistic translation models. The

  20. Probabilistic analysis of extreme wind events

    Energy Technology Data Exchange (ETDEWEB)

    Chaviaropoulos, P.K. [Center for Renewable Energy Sources (CRES), Pikermi Attikis (Greece)

    1997-12-31

    A vital task in wind engineering and meteorology is to understand, measure, analyse and forecast extreme wind conditions, due to their significant effects on human activities and installations like buildings, bridges or wind turbines. The latest version of the IEC standard (1996) pays particular attention to the extreme wind events that have to be taken into account when designing or certifying a wind generator. Actually, the extreme wind events within a 50 year period are those which determine the "static" design of most of the wind turbine components. The extremes which are important for the safety of wind generators are those associated with the so-called "survival wind speed", the extreme operating gusts and the extreme wind direction changes. A probabilistic approach for the analysis of these events is proposed in this paper. Emphasis is put on establishing the relation between extreme values and physically meaningful "site calibration" parameters, like the probability distribution of the annual wind speed, turbulence intensity and power spectra properties. (Author)
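
    As a minimal illustration of a probabilistic treatment of extreme winds (one common approach, not necessarily the author's), the sketch below fits a Gumbel distribution to synthetic annual-maximum wind speeds by the method of moments and reads off 10- and 50-year return levels:

    # Gumbel fit to annual-maximum wind speeds (method of moments) and return
    # levels; the data are synthetic stand-ins for measured annual maxima.
    import numpy as np

    rng = np.random.default_rng(11)
    annual_maxima = rng.gumbel(loc=28.0, scale=4.0, size=40)   # 40 years of maxima [m/s]

    scale = annual_maxima.std(ddof=1) * np.sqrt(6.0) / np.pi
    loc = annual_maxima.mean() - 0.5772 * scale                # Euler-Mascheroni constant

    for T in (10, 50):
        # Return level: exceeded on average once every T years
        x_T = loc - scale * np.log(-np.log(1.0 - 1.0 / T))
        print(f"{T}-year extreme wind speed ~ {x_T:.1f} m/s")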

  1. Non-unitary probabilistic quantum computing

    Science.gov (United States)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  2. Probabilistic seismic hazards: Guidelines and constraints in evaluating results

    International Nuclear Information System (INIS)

    Sadigh, R.K.; Power, M.S.

    1989-01-01

    In conducting probabilistic seismic hazard analyses, consideration of the dispersion as well as the upper bounds on ground motion is of great significance. In particular, the truncation of ground motion levels at some upper limit would have a major influence on the computed hazard at the low-to-very-low probability levels. Additionally, other deterministic guidelines and constraints should be considered in evaluating the probabilistic seismic hazard results. In contrast to probabilistic seismic hazard evaluations, mean plus one standard deviation ground motions are typically used for deterministic estimates of ground motions from maximum events that may affect a structure. To be consistent with standard deterministic maximum estimates of ground motions, these values should be the highest level considered for the site. These maximum values should be associated with the largest possible event occurring at the site. Furthermore, the relationships between the ground motion level and probability of exceedance should reflect a transition from purely probabilistic assessments of ground motion at high probability levels, where there are multiple chances for events, to a deterministic upper bound ground motion at very low probability levels, where there is very limited opportunity for maximum events to occur. In Interplate Regions, where the seismic sources may be characterized by a high-to-very-high rate of activity, the deterministic bounds will be approached or exceeded by the computed probabilistic hazard values at annual probability of exceedance levels typically as high as 10^-2 to 10^-3. Thus, at these or lower probability levels, probabilistically computed hazard values could be readily interpreted in the light of the deterministic constraints

  3. Physical Tools for Creativity with Textile Materials

    DEFF Research Database (Denmark)

    Heimdal, Elisabeth Jacobsen

    2010-01-01

    This paper seeks to develop a better understanding of how physical objects can stimulate creativity, studying the case of textile material samples employed to inspire textile designers to use new responsive materials and technologies in their designs. I show: 1) how physical objects can act both...

  4. Extended probabilistic system assessment calculations within the SKI project-90

    International Nuclear Information System (INIS)

    Pereira, A.

    1993-03-01

    The probabilistic system assessment calculations reported in the SKI Project-90 final documents were restricted to the following nuclides: C-14, I-129, Cs-135, Np-237 and Pu-240. In this report we have extended those calculations to another five nuclides: Se-79, Am-243, Pu-240, Zr-93 and Tc-99. The execution of probabilistic assessment calculations, integrated in the context of SKI's first safety analysis exercise of a hypothetical final repository for high-level nuclear waste in Sweden, was a learning experience of relevance for the conduct of probabilistic safety assessments in future exercises. Some major conclusions and viewpoints on future needs related to probabilistic assessment were drawn from this work and are presented in our report

  5. Modeling and control of an unstable system using probabilistic fuzzy inference system

    Directory of Open Access Journals (Sweden)

    Sozhamadevi N.

    2015-09-01

    A new type of Fuzzy Inference System is proposed: a Probabilistic Fuzzy Inference System which models and minimizes the effects of statistical uncertainties. The blend of two different concepts, degree of truth and probability of truth, in a unique framework leads to this new concept. This combination is carried out in both fuzzy sets and fuzzy rules, which gives rise to Probabilistic Fuzzy Sets and Probabilistic Fuzzy Rules. Introducing these probabilistic elements, a distinctive probabilistic fuzzy inference system is developed, involving fuzzification, inference and output processing. This integrated approach accounts for the uncertainties present in the system, such as rule uncertainties and measurement uncertainties, and leads to a design which performs optimally after training. In this paper the Probabilistic Fuzzy Inference System is applied to the modeling and control of a highly nonlinear, unstable system, and its effectiveness is demonstrated.
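
    As a rough illustration of attaching probabilities of truth to fuzzy rules, the sketch below combines rule firing strengths (degrees of truth) with probability-weighted consequents; the membership functions, rule probabilities and consequents are invented, and this is not the authors' system.

      # Minimal sketch of a probabilistic fuzzy rule base: each rule has a fuzzy
      # antecedent (degree of truth) and several possible crisp consequents, each
      # with a probability. The crisp output is the firing-strength-weighted,
      # probability-weighted average of the consequents.
      import numpy as np

      def tri(x, a, b, c):
          """Triangular membership function with peak at b."""
          return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      # Hypothetical rules for a single input x in [0, 10]:
      # (antecedent membership function, [(probability, consequent), ...])
      rules = [
          (lambda x: tri(x, 0, 2, 5),  [(0.8, -1.0), (0.2, -0.5)]),   # "x is LOW"
          (lambda x: tri(x, 2, 5, 8),  [(0.6,  0.0), (0.4,  0.3)]),   # "x is MEDIUM"
          (lambda x: tri(x, 5, 8, 10), [(0.7,  1.0), (0.3,  0.6)]),   # "x is HIGH"
      ]

      def infer(x):
          num, den = 0.0, 0.0
          for antecedent, consequents in rules:
              w = antecedent(x)                                  # degree of truth
              expected_c = sum(p * c for p, c in consequents)    # probability-weighted consequent
              num += w * expected_c
              den += w
          return num / den if den > 0 else 0.0

      print(infer(2.0), infer(7.5))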

  6. On revision of partially specified convex probabilistic belief bases

    CSIR Research Space (South Africa)

    Rens, G

    2016-08-01

    We propose a method for an agent to revise its incomplete probabilistic beliefs when a new piece of propositional information is observed. In this work, an agent’s beliefs are represented by a set of probabilistic formulae – a belief base...

  7. Semantics of probabilistic processes an operational approach

    CERN Document Server

    Deng, Yuxin

    2015-01-01

    This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, and is intended to be accessible to postgraduate students in Computer Science and Mathematics. It can also be us...

  8. Role of Cultural Inspiration with Different Types in Cultural Product Design Activities

    Science.gov (United States)

    Luo, Shi-Jian; Dong, Ye-Nan

    2017-01-01

    Inspiration plays an important role in the design activities and design education. This paper describes "ancient cultural artefacts" as "cultural inspiration," consisting of two types called "cultural-pictorial inspiration" (CPI) and "cultural-textual inspiration" (CTI). This study aims to test the important…

  9. Valid Probabilistic Predictions for Ginseng with Venn Machines Using Electronic Nose

    Directory of Open Access Journals (Sweden)

    You Wang

    2016-07-01

    In the application of electronic noses (E-noses), probabilistic prediction is a good way to estimate how confident we are about our prediction. In this work, a homemade E-nose system embedded with 16 metal-oxide semi-conductive gas sensors was used to discriminate nine kinds of ginsengs of different species or production places. A flexible machine learning framework, the Venn machine (VM), was introduced to attach a probabilistic estimate to each prediction. Three Venn predictors were developed based on three classical probabilistic prediction methods (Platt’s method, Softmax regression and Naive Bayes). The three Venn predictors and the three classical methods were compared in terms of classification rate and, especially, the validity of the estimated probability. A best classification rate of 88.57% was achieved with Platt’s method in offline mode, and the classification rate of VM-SVM (Venn machine based on Support Vector Machine) was 86.35%, just 2.22% lower. The validity of the Venn predictors was better than that of the corresponding classical probabilistic prediction methods, and the validity of VM-SVM was superior to the other methods. The results demonstrated that the Venn machine is a flexible tool for making precise and valid probabilistic predictions in E-nose applications, and VM-SVM achieved the best performance for the probabilistic prediction of ginseng samples.
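
    Platt's method, one of the classical baselines mentioned above, can be sketched in a few lines: fit a logistic model to the decision scores of an SVM on held-out data. The synthetic 16-feature data below merely stands in for E-nose measurements; the Venn-machine construction itself is not reproduced.

      # Minimal sketch of Platt-style probability calibration for an SVM classifier
      # on synthetic data standing in for 16-sensor E-nose features.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.linear_model import LogisticRegression
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      X = rng.normal(size=(300, 16))                 # 16 "sensor" features
      y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)  # synthetic binary labels

      X_tr, X_cal, y_tr, y_cal = train_test_split(X, y, test_size=0.3, random_state=0)

      svm = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)

      # Platt scaling: fit a logistic model on the SVM decision scores
      scores_cal = svm.decision_function(X_cal).reshape(-1, 1)
      platt = LogisticRegression().fit(scores_cal, y_cal)

      # Calibrated P(class = 1) for a few held-out samples
      scores_new = svm.decision_function(X_cal[:5]).reshape(-1, 1)
      print(platt.predict_proba(scores_new)[:, 1])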

  10. Probabilistic Design of Offshore Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    1988-01-01

    Probabilistic design of structural systems is considered in this paper. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements satisfies given requirements or such that the systems reliability satisfies a given requirement. Based on a sensitivity analysis, optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi-analytical derivatives. Finally an example of probabilistic design of an offshore structure is considered.

  11. Probabilistic Design of Offshore Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    Probabilistic design of structural systems is considered in this paper. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements satisfies given requirements or such that the systems reliability satisfies a given requirement. Based on a sensitivity analysis, optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi-analytical derivatives. Finally an example of probabilistic design of an offshore structure is considered.

  12. CAD Parts-Based Assembly Modeling by Probabilistic Reasoning

    KAUST Repository

    Zhang, Kai-Ke

    2016-04-11

    Nowadays, an increasing number of parts and sub-assemblies are publicly available and can be used directly for product development instead of being created from scratch. In this paper, we propose an interactive design framework for efficient and smart assembly modeling, in order to improve design efficiency. Our approach is based on probabilistic reasoning. Given a collection of industrial assemblies, we learn a probabilistic graphical model from the relationships between the parts of the assemblies. In the modeling stage, this probabilistic model is used to suggest the most likely parts compatible with the current assembly. Finally, the parts are assembled under certain geometric constraints. We demonstrate the effectiveness of our framework through a variety of assembly models produced by our prototype system. © 2015 IEEE.
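
    The flavour of the suggestion step can be conveyed with a toy model that replaces the learned graphical model by simple pairwise co-occurrence counts over a collection of assemblies; the part names are invented.

      # Minimal sketch of suggesting the next part from co-occurrence statistics
      # learned from a collection of assemblies. The paper uses a richer
      # probabilistic graphical model; this toy version only counts pairwise
      # co-occurrences.
      from collections import Counter
      from itertools import combinations

      assemblies = [
          {"bolt", "nut", "bracket"},
          {"bolt", "nut", "plate"},
          {"bracket", "plate", "bolt"},
          {"gear", "shaft", "bearing"},
      ]

      pair_counts = Counter()
      for parts in assemblies:
          for a, b in combinations(sorted(parts), 2):
              pair_counts[(a, b)] += 1
              pair_counts[(b, a)] += 1

      def suggest(current_parts, top_k=3):
          """Rank candidate parts by how often they co-occur with the current ones."""
          scores = Counter()
          for part in current_parts:
              for (a, b), c in pair_counts.items():
                  if a == part and b not in current_parts:
                      scores[b] += c
          return scores.most_common(top_k)

      print(suggest({"bolt"}))   # e.g. [('nut', 2), ('bracket', 2), ('plate', 2)]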

  13. CAD Parts-Based Assembly Modeling by Probabilistic Reasoning

    KAUST Repository

    Zhang, Kai-Ke; Hu, Kai-Mo; Yin, Li-Cheng; Yan, Dongming; Wang, Bin

    2016-01-01

    Nowadays, increasing amount of parts and sub-assemblies are publicly available, which can be used directly for product development instead of creating from scratch. In this paper, we propose an interactive design framework for efficient and smart assembly modeling, in order to improve the design efficiency. Our approach is based on a probabilistic reasoning. Given a collection of industrial assemblies, we learn a probabilistic graphical model from the relationships between the parts of assemblies. Then in the modeling stage, this probabilistic model is used to suggest the most likely used parts compatible with the current assembly. Finally, the parts are assembled under certain geometric constraints. We demonstrate the effectiveness of our framework through a variety of assembly models produced by our prototype system. © 2015 IEEE.

  14. Systematic evaluations of probabilistic floor response spectrum generation

    International Nuclear Information System (INIS)

    Lilhanand, K.; Wing, D.W.; Tseng, W.S.

    1985-01-01

    The relative merits of the current methods for direct generation of probabilistic floor response spectra (FRS) from the prescribed design response spectra (DRS) are evaluated. The explicit probabilistic methods, which explicitly use the relationship between the power spectral density function (PSDF) and response spectra (RS), i.e., the PSDF-RS relationship, are found to have advantages for practical applications over the implicit methods. To evaluate the accuracy of the explicit methods, the root-mean-square (rms) response and the peak factor contained in the PSDF-RS relationship are systematically evaluated, especially for the narrow-band floor spectral response, by comparing the analytical results with simulation results. Based on the evaluation results, a method is recommended for practical use for the direct generation of probabilistic FRS. (orig.)
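
    The two quantities named above can be sketched numerically: the rms response follows from the area under a response PSD, and a peak factor converts the rms value into an expected peak. The Davenport-type peak factor and the narrow-band PSD shape used here are common textbook choices, not necessarily the relationship evaluated in the paper.

      # Minimal sketch: rms response from the area under a (made-up) narrow-band
      # response PSD, and a Davenport-type peak factor converting rms to peak.
      import numpy as np

      f = np.linspace(0.1, 50.0, 2000)                  # frequency (Hz)
      f0, zeta = 5.0, 0.05                              # assumed oscillator properties
      psd = 1.0 / ((1 - (f / f0) ** 2) ** 2 + (2 * zeta * f / f0) ** 2)  # response PSD shape

      lam0 = np.trapz(psd, f)                           # zeroth spectral moment -> variance
      lam2 = np.trapz((2 * np.pi * f) ** 2 * psd, f)
      sigma = np.sqrt(lam0)                             # rms response
      nu = np.sqrt(lam2 / lam0) / (2 * np.pi)           # mean zero-crossing rate (Hz)

      T = 20.0                                          # assumed strong-motion duration (s)
      arg = np.sqrt(2 * np.log(nu * T))
      peak_factor = arg + 0.5772 / arg                  # Davenport's approximation

      print(f"rms = {sigma:.3f}, peak factor = {peak_factor:.2f}, "
            f"expected peak = {peak_factor * sigma:.3f}")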

  15. Adaptive predictors based on probabilistic SVM for real time disruption mitigation on JET

    Science.gov (United States)

    Murari, A.; Lungaroni, M.; Peluso, E.; Gaudio, P.; Vega, J.; Dormido-Canto, S.; Baruzzo, M.; Gelfusa, M.; Contributors, JET

    2018-05-01

    Detecting disruptions with sufficient anticipation time is essential to undertake any form of remedial strategy, mitigation or avoidance. Traditional predictors based on machine learning techniques can perform very well if properly optimised, but they do not provide a natural estimate of the quality of their outputs and they typically age very quickly. In this paper a new set of tools, based on probabilistic extensions of support vector machines (SVM), is introduced and applied for the first time to JET data. The probabilistic output constitutes a natural qualification of the prediction quality and provides additional flexibility. An adaptive training strategy ‘from scratch’ has also been devised, which allows the performance to be preserved even when the experimental conditions change significantly. Large JET databases of disruptions, covering entire campaigns and thousands of discharges, have been analysed, both for the graphite wall and for the ITER-Like Wall. Performance significantly better than that of any previous predictor using adaptive training has been achieved, satisfying even the requirements of the next generation of devices. The adaptive approach to training has also provided unique information about the evolution of the operational space. The fact that the developed tools give the probability of disruption improves the interpretability of the results, provides an estimate of the predictor quality and gives new insights into the physics. Moreover, the probabilistic treatment makes it easier to insert these classifiers into general decision support and control systems.

  16. Ant- and Ant-Colony-Inspired ALife Visual Art.

    Science.gov (United States)

    Greenfield, Gary; Machado, Penousal

    2015-01-01

    Ant- and ant-colony-inspired ALife art is characterized by the artistic exploration of the emerging collective behavior of computational agents, developed using ants as a metaphor. We present a chronology that documents the emergence and history of such visual art, contextualize ant- and ant-colony-inspired art within generative art practices, and consider how it relates to other ALife art. We survey many of the algorithms that artists have used in this genre, address some of their aims, and explore the relationships between ant- and ant-colony-inspired art and research on ant and ant colony behavior.

  17. Probabilistic description of traffic flow

    International Nuclear Information System (INIS)

    Mahnke, R.; Kaupuzs, J.; Lubashevsky, I.

    2005-01-01

    A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As generalization we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given
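
    The one-step (birth-death) process for a single car cluster can be sketched with a Gillespie-style simulation; the attachment and detachment rates below are simple illustrative choices, not the empirically motivated ansatz discussed in the abstract.

      # Minimal sketch of the one-step (birth-death) process for the size n of a
      # single car cluster on a ring of N cars: cars attach at a rate growing with
      # the number of freely moving cars and detach at a constant rate.
      import numpy as np

      rng = np.random.default_rng(1)
      N = 60               # total number of cars on the ring
      w_out = 0.5          # detachment rate (cars leaving the jam per unit time)
      w_in_per_car = 0.02  # attachment rate per freely moving car

      def simulate(t_end=500.0, n0=1):
          t, n, sizes = 0.0, n0, []
          while t < t_end:
              w_in = w_in_per_car * (N - n)
              w_tot = w_in + (w_out if n > 0 else 0.0)
              t += rng.exponential(1.0 / w_tot)   # Gillespie time step
              if rng.random() < w_in / w_tot:
                  n = min(n + 1, N)               # a car attaches to the jam
              else:
                  n = max(n - 1, 0)               # a car escapes the jam
              sizes.append(n)
          return sizes

      sizes = simulate()
      print("mean cluster size over the run:", np.mean(sizes))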

  18. Bruno, Galileo, Einstein: The Value of Myths in Physics

    Science.gov (United States)

    Martinez, Alberto

    2015-03-01

    Usually, historical myths are portrayed as something to be avoided in a physics classroom. Instead, I will discuss the positive function of myths and how they can be used to improve physics education. First, on the basis of historical research from primary sources and significant new findings about the Catholic Inquisition, I will discuss how to use the inspirational story of Giordano Bruno when discussing cosmology. Next, I will discuss the recurring story about Galileo and the Leaning Tower of Pisa. Finally, I will discuss how neglected stories about the young Albert Einstein can help to inspire students.

  19. Probabilistic G-Metric space and some fixed point results

    Directory of Open Access Journals (Sweden)

    A. R. Janfada

    2013-01-01

    In this note we introduce the notions of generalized probabilistic metric spaces and generalized Menger probabilistic metric spaces. After making some elementary observations and proving basic properties of these spaces, we prove some fixed point results in these spaces.

  20. ISSUES ASSOCIATED WITH PROBABILISTIC FAILURE MODELING OF DIGITAL SYSTEMS

    International Nuclear Information System (INIS)

    CHU, T.L.; MARTINEZ-GURIDI, G.; LIHNER, J.; OVERLAND, D.

    2004-01-01

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process for instrumentation and control (I and C) systems is based on deterministic requirements, e.g., single failure criteria and defense in depth and diversity. Probabilistic considerations can be used as supplements to the deterministic process. The National Research Council has recommended development of methods for estimating failure probabilities of digital systems, including commercial off-the-shelf (COTS) equipment, for use in probabilistic risk assessment (PRA). NRC staff has developed informal qualitative and quantitative requirements for PRA modeling of digital systems. Brookhaven National Laboratory (BNL) has performed a review of the state of the art of the methods and tools that can potentially be used to model digital systems. The objectives of this paper are to summarize the review, discuss the issues associated with probabilistic modeling of digital systems, and identify potential areas of research that would enhance the state of the art toward a satisfactory modeling method that could be integrated with a typical probabilistic risk assessment.

  1. Relative Gains, Losses, and Reference Points in Probabilistic Choice in Rats

    Science.gov (United States)

    Marshall, Andrew T.; Kirkpatrick, Kimberly

    2015-01-01

    Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior. PMID:25658448

  2. Relative gains, losses, and reference points in probabilistic choice in rats.

    Directory of Open Access Journals (Sweden)

    Andrew T Marshall

    Theoretical reference points have been proposed to differentiate probabilistic gains from probabilistic losses in humans, but such a phenomenon in non-human animals has yet to be thoroughly elucidated. Three experiments evaluated the effect of reward magnitude on probabilistic choice in rats, seeking to determine reference point use by examining the effect of previous outcome magnitude(s) on subsequent choice behavior. Rats were trained to choose between an outcome that always delivered reward (low-uncertainty choice) and one that probabilistically delivered reward (high-uncertainty). The probability of high-uncertainty outcome receipt and the magnitudes of low-uncertainty and high-uncertainty outcomes were manipulated within and between experiments. Both the low- and high-uncertainty outcomes involved variable reward magnitudes, so that either a smaller or larger magnitude was probabilistically delivered, as well as reward omission following high-uncertainty choices. In Experiments 1 and 2, the between-groups factor was the magnitude of the high-uncertainty-smaller (H-S) and high-uncertainty-larger (H-L) outcome, respectively. The H-S magnitude manipulation differentiated the groups, while the H-L magnitude manipulation did not. Experiment 3 showed that manipulating the probability of differential losses as well as the expected value of the low-uncertainty choice produced systematic effects on choice behavior. The results suggest that the reference point for probabilistic gains and losses was the expected value of the low-uncertainty choice. Current theories of probabilistic choice behavior have difficulty accounting for the present results, so an integrated theoretical framework is proposed. Overall, the present results have implications for understanding individual differences and corresponding underlying mechanisms of probabilistic choice behavior.

  3. Japanese round robin analysis for probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Yagawa, G.; Yoshimura, S.; Handa, N.

    1991-01-01

    Recently, attention has focused on probabilistic fracture mechanics, a branch of fracture mechanics that uses probability theory as a rational means to assess the strength of components and structures. In particular, probabilistic fracture mechanics is recognized as a powerful means for quantitative investigation of the significance of factors and for rational evaluation of life in problems involving a number of uncertainties, such as degradation of material strength and the accuracy and frequency of inspection. Comparisons with reference experiments are generally employed to assure analytical accuracy. However, the accuracy and reliability of analytical methods in probabilistic fracture mechanics can hardly be verified by experiments. Therefore, it is strongly needed to verify probabilistic fracture mechanics through round robin analysis. This paper describes results from the round robin analysis of a flat plate with semi-elliptic surface cracks, conducted by the PFM Working Group of the LE Subcommittee of the Japan Welding Society under contract to the Japan Atomic Energy Research Institute, with participation from Tokyo University, Yokohama National University, the Power Reactor and Nuclear Fuel Corporation, Tokyo Electric Power Co., the Central Research Institute of Electric Power Industry, Toshiba Corporation, Kawasaki Heavy Industry Co. and Mitsubishi Heavy Industry Co. (author)

  4. Nature as Inspiration

    Science.gov (United States)

    Tank, Kristina; Moore, Tamara; Strnat, Meg

    2015-01-01

    This article describes the final lesson within a seven-day STEM and literacy unit that is part of the Picture STEM curriculum (pictureSTEM.org) and uses engineering to integrate science and mathematics learning in a meaningful way (Tank and Moore 2013). For this engineering challenge, students used nature as a source of inspiration for designs to…

  5. Ndebele Inspired Houses

    Science.gov (United States)

    Rice, Nicole

    2012-01-01

    The house paintings of the South African Ndebele people are more than just an attempt to improve the aesthetics of a community; they are a source of identity and significance for Ndebele women. In this article, the author describes an art project wherein students use the tradition of Ndebele house painting as inspiration for creating their own…

  6. Probabilistic simulation of fermion paths

    International Nuclear Information System (INIS)

    Zhirov, O.V.

    1989-01-01

    The permutation symmetry of the fermion path integral allows (while spin degrees of freedom are ignored) any probabilistic algorithm, such as Metropolis, heat bath, etc., to be used in its simulation. 6 refs., 2 tabs

  7. Lectora Inspire Interactive Learning Media as a Learning Innovation

    Directory of Open Access Journals (Sweden)

    Norma Dewi Shalikhah

    2017-06-01

    The utilization of information and communication technology in the education sector produces tremendous results. Support from ICT is expected to become an innovation in learning, with many information technology components involved. In the era of globalization, the education sector cannot remain outside its reach, and involving the relevant technology can produce a better system of education. This paper discusses interactive learning media built with educational technology using the Lectora Inspire application. Lectora Inspire is designed specifically for beginners, with the aim of being user friendly for creating learning media, and it can also be used to create test or evaluation materials. The development of interactive learning media with Lectora Inspire was carried out by providing training to elementary school teachers. The method proceeded in phases: gathering information, planning tools, implementing, presenting and reflecting. The participants in this training were MIM Jagalan and MIM Jumoyo in Salam sub-district, Magelang regency. Keywords: Interactive Learning Media, Lectora Inspire, Learning Innovation

  8. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using limited information obtained from a specific site. This book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in practical implementation of these approaches. In addition, this book develops efficient Monte Carlo simulation approaches for slope stability analysis and implements these approaches in a commonly available spreadsheet environment. These approaches and the software package are readily available to geotechnical practitioners and spare them from implementing the reliability computation algorithms. Readers will find useful information for a non-specialist to determine project-specific statistics of geotechnical properties and to perform probabilistic analysis of slope stability.
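
    The spreadsheet-style Monte Carlo analysis promoted by the book can be sketched with an infinite-slope factor-of-safety model; the limit state and all parameter distributions below are illustrative assumptions, not the book's examples.

      # Minimal sketch of Monte Carlo slope reliability: sample soil properties,
      # evaluate an infinite-slope factor of safety, and estimate P(FS < 1).
      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      phi = np.radians(rng.normal(30.0, 3.0, n))   # friction angle (deg -> rad)
      c = rng.lognormal(np.log(8.0), 0.3, n)       # cohesion (kPa)
      gamma, H = 18.0, 5.0                         # unit weight (kN/m3), depth (m)
      beta = np.radians(30.0)                      # slope angle

      fs = (c + gamma * H * np.cos(beta) ** 2 * np.tan(phi)) / (
          gamma * H * np.sin(beta) * np.cos(beta))

      pf = np.mean(fs < 1.0)
      print(f"probability of failure ~ {pf:.4f}, mean FS = {fs.mean():.2f}")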

  9. Probabilistic Sophistication, Second Order Stochastic Dominance, and Uncertainty Aversion

    OpenAIRE

    Simone Cerreia-Vioglio; Fabio Maccheroni; Massimo Marinacci; Luigi Montrucchio

    2010-01-01

    We study the interplay of probabilistic sophistication, second order stochastic dominance, and uncertainty aversion, three fundamental notions in choice under uncertainty. In particular, our main result, Theorem 2, characterizes uncertainty averse preferences that satisfy second order stochastic dominance, as well as uncertainty averse preferences that are probabilistically sophisticated.

  10. Probabilistic cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge remains the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case, connecting the probability of a configuration in the stationary distribution to its number of zero-one borders, the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
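
    A minimal sketch of an asynchronous probabilistic cellular automaton on a ring, with transition probabilities chosen so that the all-zeros and all-ones configurations are absorbing, is given below; the particular probabilities are invented for illustration.

      # Minimal sketch of an asynchronous probabilistic cellular automaton: at each
      # step one random cell is updated, and the probability that it becomes 1
      # depends on the number of ones among its two neighbours and itself.
      import numpy as np

      rng = np.random.default_rng(7)

      def run(n_cells=20, p_one=(0.0, 0.3, 0.7, 1.0), max_steps=20_000):
          """Asynchronous updates until absorption in all-zeros or all-ones."""
          state = rng.integers(0, 2, n_cells)
          for _ in range(max_steps):
              i = rng.integers(n_cells)
              ones = state[(i - 1) % n_cells] + state[i] + state[(i + 1) % n_cells]
              state[i] = int(rng.random() < p_one[ones])
              s = state.sum()
              if s == 0 or s == n_cells:      # absorbing configurations
                  break
          return state.mean()

      densities = [run() for _ in range(30)]
      print("fraction of runs ending mostly ones:", np.mean([d > 0.5 for d in densities]))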

  11. Probabilistic population aging

    Science.gov (United States)

    2017-01-01

    We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675

  12. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a probabilistic risk study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the system designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)

  13. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a probabilistic risk study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the system designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)

  14. Process for computing geometric perturbations for probabilistic analysis

    Science.gov (United States)

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
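
    The perturbation step can be sketched as follows: each node of the region of interest carries a precomputed displacement vector, and a geometry realization is obtained by scaling those vectors with a random amplitude. The nodes and vectors below are made up, and the mean-value coordinate computation itself is not shown.

      # Minimal sketch of the perturbation step: nominal node coordinates plus a
      # random scaling of per-node displacement vectors gives one geometry sample.
      import numpy as np

      rng = np.random.default_rng(3)

      nominal_nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
      disp_vectors = np.array([[0.0, 0.0], [0.0, 0.0], [0.7, 0.7], [0.0, 1.0]])  # per-node direction

      def realization(sigma=0.05):
          """One perturbed geometry: nominal coordinates + random scaling of the vectors."""
          scale = rng.normal(0.0, sigma)   # a single random perturbation amplitude
          return nominal_nodes + scale * disp_vectors

      for _ in range(3):
          print(realization())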

  15. Biologically inspired technologies in NASA's morphing project

    Science.gov (United States)

    McGowan, Anna-Maria R.; Cox, David E.; Lazos, Barry S.; Waszak, Martin R.; Raney, David L.; Siochi, Emilie J.; Pao, S. Paul

    2003-07-01

    For centuries, biology has provided fertile ground for hypothesis, discovery, and inspiration. Time-tested methods used in nature are being used as a basis for several research studies conducted at the NASA Langley Research Center as a part of Morphing Project, which develops and assesses breakthrough vehicle technologies. These studies range from low drag airfoil design guided by marine and avian morphologies to soaring techniques inspired by birds and the study of small flexible wing vehicles. Biology often suggests unconventional yet effective approaches such as non-planar wings, dynamic soaring, exploiting aeroelastic effects, collaborative control, flapping, and fibrous active materials. These approaches and other novel technologies for future flight vehicles are being studied in NASA's Morphing Project. This paper will discuss recent findings in the aeronautics-based, biologically-inspired research in the project.

  16. Collision of an object in the transition from adiabatic inspiral to plunge around a Kerr black hole

    International Nuclear Information System (INIS)

    Harada, Tomohiro; Kimura, Masashi

    2011-01-01

    An inspiraling object of mass μ around a Kerr black hole of mass M(>>μ) experiences a continuous transition near the innermost stable circular orbit from adiabatic inspiral to plunge into the horizon as gravitational radiation extracts its energy and angular momentum. We investigate the collision of such an object with a generic counterpart around a Kerr black hole. We find that the angular momentum of the object is fine-tuned through gravitational radiation and that the high-velocity collision of the object with a generic counterpart naturally occurs around a nearly maximally rotating black hole. We also find that the center-of-mass energy can be far beyond the Planck energy for dark matter particles colliding around a stellar mass black hole and as high as 10 58 erg for stellar mass compact objects colliding around a supermassive black hole, where the present transition formalism is well justified. Therefore, rapidly rotating black holes can accelerate objects inspiraling around them to energy high enough to be of great physical interest.

  17. Probabilistic Graph Layout for Uncertain Network Visualization.

    Science.gov (United States)

    Schulz, Christoph; Nocaj, Arlind; Goertler, Jochen; Deussen, Oliver; Brandes, Ulrik; Weiskopf, Daniel

    2017-01-01

    We present a novel uncertain network visualization technique based on node-link diagrams. Nodes expand spatially in our probabilistic graph layout, depending on the underlying probability distributions of edges. The visualization is created by computing a two-dimensional graph embedding that combines samples from the probabilistic graph. A Monte Carlo process is used to decompose a probabilistic graph into its possible instances and to continue with our graph layout technique. Splatting and edge bundling are used to visualize point clouds and network topology. The results provide insights into probability distributions for the entire network-not only for individual nodes and edges. We validate our approach using three data sets that represent a wide range of network types: synthetic data, protein-protein interactions from the STRING database, and travel times extracted from Google Maps. Our approach reveals general limitations of the force-directed layout and allows the user to recognize that some nodes of the graph are at a specific position just by chance.
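
    The Monte Carlo decomposition step can be sketched with networkx: sample concrete graph instances according to independent edge probabilities and compute a force-directed layout for each sample, so that every node accumulates a point cloud of positions. The splatting and edge-bundling rendering is not reproduced, and the edge probabilities are invented.

      # Minimal sketch of the Monte Carlo step for a probabilistic graph layout.
      import random
      import networkx as nx

      random.seed(0)
      prob_edges = {("a", "b"): 0.9, ("b", "c"): 0.5, ("a", "c"): 0.2, ("c", "d"): 0.8}
      nodes = ["a", "b", "c", "d"]

      layouts = []
      for _ in range(100):
          g = nx.Graph()
          g.add_nodes_from(nodes)
          g.add_edges_from(e for e, p in prob_edges.items() if random.random() < p)
          layouts.append(nx.spring_layout(g, seed=0))   # one layout per sampled instance

      # Point cloud of sampled positions for node "c"
      cloud_c = [pos["c"] for pos in layouts]
      print(cloud_c[:3])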

  18. Exploring Creativity in the Bio-Inspired Design Process

    DEFF Research Database (Denmark)

    Anggakara, K.; Aksdal, T.; Onarheim, Balder

    2015-01-01

    The growing interest in the field of bio-inspired design has been driven by the acknowledgement that inspiration from nature can serve as a valuable source of innovation. As an emerging approach, there has been a focus on building a principled methodology to address the challenges that arise...

  19. Probabilistic tsunami hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D., E-mail: dmullin@nbpower.com [New Brunswick Power Corporation, Point Lepreau Generating Station, Point Lepreau (Canada); Alcinov, T.; Roussel, P.; Lavine, A.; Arcos, M.E.M.; Hanson, K.; Youngs, R., E-mail: trajce.alcinov@amecfw.com, E-mail: patrick.roussel@amecfw.com [AMEC Foster Wheeler Environment & Infrastructure, Dartmouth, NS (Canada)

    2015-07-01

    In 2012 the Geological Survey of Canada published a preliminary probabilistic tsunami hazard assessment in Open File 7201 that presents the most up-to-date information on all potential tsunami sources in a probabilistic framework on a national level, thus providing the underlying basis for conducting site-specific tsunami hazard assessments. However, the assessment identified a poorly constrained hazard for the Atlantic coastline and recommended further evaluation. As a result, NB Power has embarked on performing a Probabilistic Tsunami Hazard Assessment (PTHA) for Point Lepreau Generating Station. This paper provides the methodology and the progress of the hazard evaluation for Point Lepreau G.S. (author)

  20. Applications of probabilistic techniques at NRC

    International Nuclear Information System (INIS)

    Thadani, A.; Rowsome, F.; Speis, T.

    1984-01-01

    The NRC is currently making extensive use of probabilistic safety assessment in reactor regulation. Most of these applications have been introduced into regulatory activities in the past few years. Plant probabilistic safety studies are being utilized as a design tool for applications for standard designs and for assessment of plants located in regions of particularly high population density. There is considerable motivation for licensees to perform plant-specific probabilistic studies for many, if not all, of the existing operating nuclear power plants, as a tool for prioritizing the implementation of the many outstanding licensing actions at these plants, as well as for recommending the elimination of a number of issues which are judged to be insignificant in terms of their contribution to safety and risk. Risk assessment perspectives are being used in the prioritization of generic safety issues, development of technical resolutions of unresolved safety issues, assessing the safety significance of proposed new regulatory requirements, assessment of the safety significance of some of the occurrences at operating facilities, and in environmental impact analyses of license applicants as required by the National Environmental Policy Act. (orig.)

  1. GUI program to compute probabilistic seismic hazard analysis

    International Nuclear Information System (INIS)

    Shin, Jin Soo; Chi, H. C.; Cho, J. C.; Park, J. H.; Kim, K. G.; Im, I. S.

    2005-12-01

    The first stage of development of a program to compute probabilistic seismic hazard has been completed, based on a Graphical User Interface (GUI). The main program consists of three parts: data input processing, probabilistic seismic hazard analysis, and result output processing. The first part has been developed and the others are now under development. The probabilistic seismic hazard analysis needs various input data representing attenuation formulae, the seismic zoning map, and the earthquake event catalog. The input procedures of previous programs, based on a text interface, take much time to prepare the data, and in those existing methods the data cannot be checked directly on screen to prevent erroneous input. The new program simplifies the input process and enables the data to be checked graphically, in order to minimize artificial errors as far as possible.

  2. The probabilistic approach and the deterministic licensing procedure

    International Nuclear Information System (INIS)

    Fabian, H.; Feigel, A.; Gremm, O.

    1984-01-01

    If safety goals are given, the creativity of the engineers is necessary to transform the goals into actual safety measures. That is, safety goals are not sufficient for the derivation of a safety concept; the licensing process asks ''What does a safe plant look like?'' The answer cannot be given by a probabilistic procedure, but needs definite deterministic statements; the conclusion is that the licensing process needs a deterministic approach. The probabilistic approach should be used in a complementary role in cases where deterministic criteria are not complete, not detailed enough or not consistent, and where additional arguments for decision making in connection with the adequacy of a specific measure are necessary. But also in these cases the probabilistic answer has to be transformed into a clear deterministic statement. (orig.)

  3. Inspiration in the Act of Reading

    DEFF Research Database (Denmark)

    Zeller, Kinga

    2016-01-01

    In German-language theology, Professor Ulrich H. J. Körtner’s theory of inspiration, as it relates to the Bible reader’s perspective, is well known. His attempt to gain fruitful insights from contemporary literary hermeneutics while linking them to theological concerns makes his approach a valued...... yet not uncontroversial example of a reception-aesthetics twist on the Lutheran sola Scriptura. This article presents Körtner’s hermeneutical considerations with special regard to inspiration related to the Bible reader’s perspective and shows how this approach may be related to some aspects...

  4. Probabilistic Analysis of Passive Safety System Reliability in Advanced Small Modular Reactors: Methodologies and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia; Grelle, Austin

    2015-06-28

    Many advanced small modular reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize with a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper describes the most promising options: mechanistic techniques, which share qualities with conventional probabilistic methods, and simulation-based techniques, which explicitly account for time-dependent processes. The primary intention of this paper is to describe the strengths and weaknesses of each methodology and highlight the lessons learned while applying the two techniques while providing high-level results. This includes the global benefits and deficiencies of the methods and practical problems encountered during the implementation of each technique.

  5. Supporting STEM Teachers to Inspire through Everyday Innovation

    Science.gov (United States)

    Bienkowski, Marie; Shechtman, Nicole; Remold, Julie; Knudsen, Jennifer

    2014-01-01

    Science teachers inspire in part by their constant adaptation to the learning needs of their students and to evolving content, curriculum, technology, and student populations. Innovation--bringing novel things to a situation to confer a benefit--is an integral part of teaching overall, and in especially inspired science teaching. While innovation…

  6. Probabilistic costing of transmission services

    International Nuclear Information System (INIS)

    Wijayatunga, P.D.C.

    1992-01-01

    Costing of transmission services of electrical utilities is required for transactions involving the transport of energy over a power network. The calculation of these costs based on Short Run Marginal Costing (SRMC) is preferred over other methods proposed in the literature due to its economic efficiency. In the research work discussed here, the concept of probabilistic costing of use-of-system based on SRMC which emerges as a consequence of the uncertainties in a power system is introduced using two different approaches. The first approach, based on the Monte Carlo method, generates a large number of possible system states by simulating random variables in the system using pseudo random number generators. A second approach to probabilistic use-of-system costing is proposed based on numerical convolution and multi-area representation of the transmission network. (UK)

  7. Volume 1. Probabilistic analysis of HTGR application studies. Technical discussion

    International Nuclear Information System (INIS)

    May, J.; Perry, L.

    1980-01-01

    The HTGR Program encompasses a number of decisions facing both industry and government which are being evaluated under the HTGR application studies being conducted by the GCRA. This report is in support of these application studies, specifically by developing comparative probabilistic energy costs of the alternative HTGR plant types under study at this time and of competitive PWR and coal-fired plants. Management decision analytic methodology was used as the basis for the development of the comparative probabilistic data. This study covers the probabilistic comparison of various HTGR plant types at a commercial development stage with comparative PWR and coal-fired plants. Subsequent studies are needed to address the sequencing of HTGR plants from the lead plant to the commercial plants and to integrate the R and D program into the plant construction sequence. The probabilistic results cover the comparison of the 15-year levelized energy costs for commercial plants, all with 1995 startup dates. For comparison with the HTGR plants, PWR and fossil-fired plants have been included in the probabilistic analysis, both as steam electric plants and as combined steam electric and process heat plants

  8. Comparative study of probabilistic methodologies for small signal stability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rueda, J.L.; Colome, D.G. [Universidad Nacional de San Juan (IEE-UNSJ), San Juan (Argentina). Inst. de Energia Electrica], Emails: joseluisrt@iee.unsj.edu.ar, colome@iee.unsj.edu.ar

    2009-07-01

    Traditional deterministic approaches for small signal stability assessment (SSSA) are unable to properly reflect the existing uncertainties in real power systems. Hence, the probabilistic analysis of small signal stability (SSS) is attracting more attention from power system engineers. This paper discusses and compares two probabilistic methodologies for SSSA, based on the two-point estimation method and the so-called Monte Carlo method, respectively. The comparisons are based on the results obtained for several power systems of different sizes and with different SSS performance. It is demonstrated that although the amount of computation of probabilistic SSSA can be reduced with an analytical approach, the different degrees of approximation that are adopted lead to deceptive results. Conversely, Monte Carlo based probabilistic SSSA can be carried out with reasonable computational effort while maintaining satisfactory estimation precision. (author)
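
    The contrast between the two methodologies can be sketched with a Rosenblueth-type point-estimate scheme (2^n evaluations at the means plus/minus one standard deviation, valid for uncorrelated, symmetric inputs) against plain Monte Carlo; the toy response function and input statistics below merely stand in for a small-signal stability computation.

      # Minimal sketch: point-estimate method vs. Monte Carlo for the first two
      # moments of a generic nonlinear response of uncertain inputs.
      import itertools
      import numpy as np

      mu = np.array([1.0, 2.0])      # means of the uncertain inputs
      sigma = np.array([0.1, 0.3])   # standard deviations

      def response(x):
          """Toy nonlinear response standing in for a damping ratio, eigenvalue, etc."""
          return np.sin(x[0]) * x[1] ** 2

      # Point-estimate method: 2^n evaluations at mu_i +/- sigma_i, equal weights
      points = [mu + np.array(s) * sigma
                for s in itertools.product([-1, 1], repeat=len(mu))]
      vals = np.array([response(p) for p in points])
      pem_mean = vals.mean()
      pem_std = np.sqrt((vals ** 2).mean() - pem_mean ** 2)

      # Monte Carlo reference (response() is written so it also works on samples.T)
      rng = np.random.default_rng(0)
      samples = rng.normal(mu, sigma, size=(200_000, len(mu)))
      mc_vals = response(samples.T)

      print(f"PEM: mean={pem_mean:.4f}  std={pem_std:.4f}")
      print(f"MC:  mean={mc_vals.mean():.4f}  std={mc_vals.std():.4f}")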

  9. Advances in bio-inspired computing for combinatorial optimization problems

    CERN Document Server

    Pintea, Camelia-Mihaela

    2013-01-01

    'Advances in Bio-inspired Combinatorial Optimization Problems' illustrates several recent bio-inspired efficient algorithms for solving NP-hard problems. Theoretical bio-inspired concepts and models, in particular for agents, ants and virtual robots, are described. Large-scale optimization problems, for example the Generalized Traveling Salesman Problem and the Railway Traveling Salesman Problem, are solved and their results are discussed. Some of the main concepts and models described in this book are: inner rule to guide ant search - a recent model in ant optimization, heterogeneous sensitive a

  10. Training mechanical engineering students to utilize biological inspiration during product development.

    Science.gov (United States)

    Bruck, Hugh A; Gershon, Alan L; Golden, Ira; Gupta, Satyandra K; Gyger, Lawrence S; Magrab, Edward B; Spranklin, Brent W

    2007-12-01

    The use of bio-inspiration for the development of new products and devices requires new educational tools for students, consisting of appropriate design and manufacturing technologies as well as curriculum. At the University of Maryland, new educational tools have been developed that introduce bio-inspired product realization to undergraduate mechanical engineering students. These tools include the development of a bio-inspired design repository, a concurrent fabrication and assembly manufacturing technology, a series of undergraduate curriculum modules and a new senior elective in the bio-inspired robotics area. This paper first presents an overview of the two new design and manufacturing technologies that enable students to realize bio-inspired products, and describes how these technologies are integrated into the undergraduate educational experience. Then, the undergraduate curriculum modules are presented, which provide students with the fundamental design and manufacturing principles needed to support bio-inspired product and device development. Finally, an elective bio-inspired robotics project course is presented, which provides undergraduates with the opportunity to demonstrate the application of the knowledge acquired through the curriculum modules in their senior year using the new design and manufacturing technologies.

  11. Probabilistic Remaining Useful Life Prediction of Composite Aircraft Components, Phase II

    Data.gov (United States)

    National Aeronautics and Space Administration — A Probabilistic Fatigue Damage Assessment Network (PFDAN) toolkit for Abaqus will be developed for probabilistic life management of a laminated composite structure...

  12. Drawing inspiration from biological optical systems

    Science.gov (United States)

    Wolpert, H. D.

    2009-08-01

    Bio-Mimicking/Bio-Inspiration: How can we not be inspired by Nature? Life has evolved on earth over the last 3.5 to 4 billion years. Materials formed during this time were not toxic; they were created at low temperatures and low pressures, unlike many of the materials developed today. The natural materials formed are self-assembled, multifunctional, nonlinear, complex, adaptive, self-repairing and biodegradable. The designs that failed are fossils. Those that survived are the success stories. Natural materials are mostly formed from organics, inorganic crystals and amorphous phases. The materials make economic sense by optimizing the design of the structures or systems to meet multiple needs. We constantly "see" many similar strategies in approaches, between man and nature, but we seldom look at the details of natures approaches. The power of image processing, in many of natures creatures, is a detail that is often overlooked. Seldom does the engineer interact with the biologist and learn what nature has to teach us. The variety and complexity of biological materials and the optical systems formed should inspire us.

  13. Probabilistic Criterion for the Economical Assessment of Nuclear Reactors

    International Nuclear Information System (INIS)

    Juanico, L; Florido, Pablo; Bergallo, Juan

    2000-01-01

    In this paper a Monte Carlo probabilistic model for the economic evaluation of nuclear power plants is presented. The probabilistic results have shown a wide spread in the economic performance due to the schedule complexity and the coupling of tasks. This spread increases with the discount rate and hence becomes more important for developing countries.

  14. Solving stochastic multiobjective vehicle routing problem using probabilistic metaheuristic

    Directory of Open Access Journals (Sweden)

    Gannouni Asmae

    2017-01-01

    … closed form expression. This novel approach is based on combinatorial probability and can be incorporated in a multiobjective evolutionary algorithm. (ii) Provide probabilistic approaches to elitism and diversification in multiobjective evolutionary algorithms. Finally, the behavior of the resulting Probabilistic Multi-objective Evolutionary Algorithms (PrMOEAs) is empirically investigated on the multi-objective stochastic VRP problem.

  15. A probabilistic model of the electron transport in films of nanocrystals arranged in a cubic lattice

    Energy Technology Data Exchange (ETDEWEB)

    Kriegel, Ilka [Department of Nanochemistry, Istituto Italiano di Tecnologia (IIT), via Morego, 30, 16163 Genova (Italy); Scotognella, Francesco, E-mail: francesco.scotognella@polimi.it [Dipartimento di Fisica, Politecnico di Milano, Piazza Leonardo da Vinci 32, 20133 Milano (Italy); Center for Nano Science and Technology@PoliMi, Istituto Italiano di Tecnologia, Via Giovanni Pascoli, 70/3, 20133 Milan (Italy)

    2016-08-01

    The fabrication of nanocrystal (NC) films, starting from colloidal dispersions, is a very attractive topic in the condensed matter physics community. NC films can be employed in transistors, light emitting diodes, lasers, and solar cells. For this reason, understanding the film conductivity is of major importance. In this paper we describe a probabilistic model that allows prediction of the conductivity of NC films, in this case for a cubic lattice of Lead Selenide or Cadmium Selenide NCs. The model is based on the hopping probability between NCs. The results are compared to experimental data reported in the literature. - Highlights: • Colloidal nanocrystal (NC) film conductivity is a topic of major importance. • We present a probabilistic model to predict the electron conductivity in NC films. • The model is based on the hopping probability between NCs. • We found good agreement between the model and data reported in the literature.

  16. Epistemic and aleatory uncertainties in integrated deterministic and probabilistic safety assessment: Tradeoff between accuracy and accident simulations

    International Nuclear Information System (INIS)

    Karanki, D.R.; Rahman, S.; Dang, V.N.; Zerkak, O.

    2017-01-01

    The coupling of plant simulation models and stochastic models representing failure events in Dynamic Event Trees (DET) is a framework used to model the dynamic interactions among physical processes, equipment failures, and operator responses. The integration of physical and stochastic models may additionally enhance the treatment of uncertainties. Probabilistic Safety Assessments as currently implemented propagate the (epistemic) uncertainties in failure probabilities, rates, and frequencies; while the uncertainties in the physical model (parameters) are not propagated. The coupling of deterministic (physical) and probabilistic models in integrated simulations such as DET allows both types of uncertainties to be considered. However, integrated accident simulations with epistemic uncertainties will challenge even today's high performance computing infrastructure, especially for simulations of inherently complex nuclear or chemical plants. Conversely, intentionally limiting computations for practical reasons would compromise accuracy of results. This work investigates how to tradeoff accuracy and computations to quantify risk in light of both uncertainties and accident dynamics. A simple depleting tank problem that can be solved analytically is considered to examine the adequacy of a discrete DET approach. The results show that optimal allocation of computational resources between epistemic and aleatory calculations by means of convergence studies ensures accuracy within a limited budget. - Highlights: • Accident simulations considering uncertainties require intensive computations. • Tradeoff between accuracy and accident simulations is a challenge. • Optimal allocation between epistemic & aleatory computations ensures the tradeoff. • Online convergence gives an early indication of computational requirements. • Uncertainty propagation in DDET is examined on a tank problem solved analytically.
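
    The separation of the two kinds of uncertainty can be sketched with a two-loop Monte Carlo on a toy depleting tank: an epistemically uncertain drain coefficient in the outer loop and an aleatory valve-failure time in the inner loop. The model and all numbers are invented and are not the benchmark problem of the paper.

      # Minimal sketch of a two-loop (epistemic outer, aleatory inner) treatment
      # of uncertainties for a toy depleting tank; "failure" means the tank
      # empties before the mission time.
      import numpy as np

      rng = np.random.default_rng(5)
      V0, T_mission = 100.0, 24.0   # initial volume, mission time (h)
      makeup = 2.0                  # make-up flow while the valve still works

      def empties_before_mission(drain, t_fail):
          # Net outflow is (drain - makeup) while the valve works, drain afterwards.
          v = V0 - max(drain - makeup, 0.0) * min(t_fail, T_mission)
          if t_fail < T_mission:
              v -= drain * (T_mission - t_fail)
          return v <= 0.0

      n_epi, n_ale = 200, 2000
      pf = []
      for _ in range(n_epi):
          drain = rng.uniform(2.5, 6.0)              # epistemic: drain coefficient
          t_fail = rng.exponential(12.0, n_ale)      # aleatory: valve failure times
          pf.append(np.mean([empties_before_mission(drain, t) for t in t_fail]))

      pf = np.array(pf)
      print(f"failure probability: mean={pf.mean():.3f}, "
            f"90% epistemic band=({np.percentile(pf, 5):.3f}, {np.percentile(pf, 95):.3f})")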

  17. Tractable approximations for probabilistic models: The adaptive Thouless-Anderson-Palmer mean field approach

    DEFF Research Database (Denmark)

    Opper, Manfred; Winther, Ole

    2001-01-01

    We develop an advanced mean field method for approximating averages in probabilistic data models that is based on the Thouless-Anderson-Palmer (TAP) approach of disorder physics. In contrast to conventional TAP, where knowledge of the distribution of couplings between the random variables is required, our method adapts to the concrete couplings. We demonstrate the validity of our approach, which is so far restricted to models with non-glassy behavior, by replica calculations for a wide class of models as well as by simulations for a real data set.

  18. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.

    Science.gov (United States)

    Pecevski, Dejan; Maass, Wolfgang

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p (*) that generates the examples it receives. This holds even if p (*) contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.

  19. Probabilistic Model Development

    Science.gov (United States)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) will not be exceeded at a user-specified confidence level; and 2) will provide reference environments for a) peak flux, b) event-integrated fluence, and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium and heavier ions.

  20. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    Science.gov (United States)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project
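
    As a rough illustration of the PDA workflow described above (physics-based limit state, random input variables, Monte Carlo failure probability, simple sensitivity measures), the sketch below uses an invented thin-shell burst-pressure limit state; it is not one of the Ares I models, and every distribution and constant is a placeholder.

```python
import numpy as np

rng = np.random.default_rng(1)

# Generic PDA-style sketch: a physics-based limit state g = capacity - demand with
# random inputs, evaluated by Monte Carlo. The thin-shell burst-pressure form and
# all distributions are hypothetical stand-ins, not the Ares I models.

n = 200_000
thickness = rng.normal(4.0, 0.1, n)              # wall thickness, mm
strength  = rng.normal(900.0, 40.0, n)           # material strength, MPa
radius    = rng.normal(250.0, 2.0, n)            # tank radius, mm
pressure  = rng.lognormal(np.log(11.0), 0.1, n)  # operating pressure, MPa

capacity = strength * thickness / radius         # simple burst-pressure estimate
margin = capacity - pressure                     # failure when margin < 0
print(f"estimated failure probability: {np.mean(margin < 0.0):.2e}")

# Crude sensitivity: correlation of each random input with the safety margin.
for name, x in [("thickness", thickness), ("strength", strength),
                ("radius", radius), ("pressure", pressure)]:
    print(f"corr(margin, {name:9s}) = {np.corrcoef(margin, x)[0, 1]:+.2f}")
```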

  1. Probabilistic assessment of dry transport with burnup credit

    International Nuclear Information System (INIS)

    Lake, W.H.

    2003-01-01

    The general concept of probabilistic analysis and its application to the use of burnup credit in spent fuel transport is explored. A discussion of the probabilistic analysis method is presented. The concepts of risk and its perception are introduced, and models are suggested for performing probability and risk estimates. The general probabilistic models are used for evaluating the application of burnup credit for dry spent nuclear fuel transport. Two basic cases are considered. The first addresses the question of the relative likelihood of exceeding an established criticality safety limit with and without burnup credit. The second examines the effect of using burnup credit on the overall risk for dry spent fuel transport. Using reasoned arguments and related failure probability and consequence data, analysis is performed to estimate the risks of using burnup credit for dry transport of spent nuclear fuel. (author)

  2. Software R&D for Next Generation of HEP Experiments, Inspired by Theano

    CERN Multimedia

    CERN. Geneva

    2015-01-01

    In the next decade, the frontiers of High Energy Physics (HEP) will be explored by three machines: the High Luminosity Large Hadron Collider (HL-LHC) in Europe, the Long Base Neutrino Facility (LBNF) in the US, and the International Linear Collider (ILC) in Japan. These next generation experiments must address two fundamental problems in the current generation of HEP experimental software: the inability to take advantage of and adapt to the rapidly evolving processor landscape, and the difficulty in developing and maintaining increasingly complex software systems by physicists. I will propose a strategy, inspired by the automatic optimization and code generation in Theano, to simultaneously address both problems. I will describe three R&D projects with short-term physics deliverables aimed at developing this strategy. The first project is to develop a maximally sensitive General Search for New Physics at the LHC by applying the Matrix Element Method running on the GPUs of HPCs. The second is to classify and reconstru...

  3. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    The formulation of mathematical learning goals is now oriented not only towards cognitive products but also towards cognitive processes, one of which is probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at higher levels. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected by administering a test of mathematical ability and choosing students with high mathematical ability. The subjects were given probability tasks covering sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion. Credibility of the data was established by time triangulation. The results showed that the boy's level of probabilistic thinking in completing the probability tasks was multistructural, while the girl's level was unistructural; that is, the boy's level of probabilistic thinking was higher than the girl's. The results could contribute to curriculum developers in formulating probability learning goals for elementary school students. Indeed, teachers could teach probability with regard to gender differences.

  4. OCA-P, PWR Vessel Probabilistic Fracture Mechanics

    International Nuclear Information System (INIS)

    Cheverton, R.D.; Ball, D.G.

    2001-01-01

    1 - Description of program or function: OCA-P is a probabilistic fracture-mechanics code prepared specifically for evaluating the integrity of pressurized-water reactor vessels subjected to overcooling-accident loading conditions. Based on linear-elastic fracture mechanics, it has two- and limited three-dimensional flaw capability, and can treat cladding as a discrete region. Both deterministic and probabilistic analyses can be performed. For deterministic analysis, it is possible to conduct a search for critical values of the fluence and the nil-ductility reference temperature corresponding to incipient initiation of the initial flaw. The probabilistic portion of OCA-P is based on Monte Carlo techniques, and simulated parameters include fluence, flaw depth, fracture toughness, nil-ductility reference temperature, and concentrations of copper, nickel, and phosphorous. Plotting capabilities include the construction of critical-crack-depth diagrams (deterministic analysis) and a variety of histograms (probabilistic analysis). 2 - Method of solution: OCA-P accepts as input the reactor primary-system pressure and the reactor pressure-vessel downcomer coolant temperature, as functions of time in the specified transient. Then, the wall temperatures and stresses are calculated as a function of time and radial position in the wall, and the fracture-mechanics analysis is performed to obtain the stress intensity factors as a function of crack depth and time in the transient. In a deterministic analysis, values of the static crack initiation toughness and the crack arrest toughness are also calculated for all crack depths and times in the transient. A comparison of these values permits an evaluation of flaw behavior. For a probabilistic analysis, OCA-P generates a large number of reactor pressure vessels, each with a different combination of the various values of the parameters involved in the analysis of flaw behavior. For each of these vessels, a deterministic fracture
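
    The following toy Monte Carlo conveys the probabilistic portion of such an analysis in miniature: many vessels are simulated by sampling fluence, initial nil-ductility reference temperature and flaw depth, and an applied stress intensity factor is compared with a sampled toughness. The embrittlement shift, toughness curve and loading coefficient are invented stand-ins, not the correlations implemented in OCA-P.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy Monte Carlo in the spirit of a probabilistic fracture-mechanics vessel
# analysis: sample vessel-to-vessel parameters and compare an applied stress
# intensity factor with a sampled initiation toughness. Every correlation and
# number below is a placeholder, not the embrittlement or toughness model of OCA-P.

n = 100_000
fluence    = rng.lognormal(np.log(2.0), 0.3, n)   # fast fluence, 1e19 n/cm^2 (assumed)
rt_ndt0    = rng.normal(-10.0, 10.0, n)           # unirradiated RT_NDT, degC (assumed)
rt_ndt     = rt_ndt0 + 30.0 * np.sqrt(fluence)    # assumed embrittlement shift
flaw_depth = rng.exponential(5.0, n)              # flaw depth, mm (assumed)

T_WALL = 10.0                                     # wall temperature during transient, degC
k_applied = 8.0 * np.sqrt(flaw_depth)             # applied K_I, MPa*sqrt(m) (assumed loading)
k_ic = 36.5 + 22.8 * np.exp(0.036 * (T_WALL - rt_ndt))   # assumed toughness trend

print(f"conditional probability of crack initiation ≈ {np.mean(k_applied > k_ic):.3e}")
```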

  5. Development Of Dynamic Probabilistic Safety Assessment: The Accident Dynamic Simulator (ADS) Tool

    International Nuclear Information System (INIS)

    Chang, Y.H.; Mosleh, A.; Dang, V.N.

    2003-01-01

    The development of a dynamic methodology for Probabilistic Safety Assessment (PSA) addresses the complex interactions between the behaviour of technical systems and personnel response in the evolution of accident scenarios. This paper introduces the discrete dynamic event tree, a framework for dynamic PSA, and its implementation in the Accident Dynamic Simulator (ADS) tool. Dynamic event tree tools generate and quantify accident scenarios through coupled simulation models of the plant physical processes, its automatic systems, the equipment reliability, and the human response. The current research on the framework, the ADS tool, and on Human Reliability Analysis issues within dynamic PSA, is discussed. (author)

  6. Development Of Dynamic Probabilistic Safety Assessment: The Accident Dynamic Simulator (ADS) Tool

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y.H.; Mosleh, A.; Dang, V.N

    2003-03-01

    The development of a dynamic methodology for Probabilistic Safety Assessment (PSA) addresses the complex interactions between the behaviour of technical systems and personnel response in the evolution of accident scenarios. This paper introduces the discrete dynamic event tree, a framework for dynamic PSA, and its implementation in the Accident Dynamic Simulator (ADS) tool. Dynamic event tree tools generate and quantify accident scenarios through coupled simulation models of the plant physical processes, its automatic systems, the equipment reliability, and the human response. The current research on the framework, the ADS tool, and on Human Reliability Analysis issues within dynamic PSA, is discussed. (author)
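
    A discrete dynamic event tree couples a deterministic process model with branching on equipment (and operator) states at discrete times; the sketch below enumerates the branches of a two-component toy system driving a trivial tank-level model and accumulates the probability of each end state. The plant model, branching times and failure probabilities are invented for illustration and are unrelated to the ADS tool.

```python
import itertools

# Minimal discrete dynamic event tree sketch: at fixed branching times each
# component either fails or keeps working, and a trivial "physics" model (a tank
# level balance) decides the end state of every branch. The plant model, times
# and probabilities are invented for illustration only.

BRANCH_TIMES = [2.0, 4.0]                         # h, branching points (assumed)
P_FAIL = {"pump": 0.05, "valve": 0.02}            # per-branching failure prob. (assumed)

def branch_prob(component, fail_time):
    """Probability of failing exactly at a given branching point, or never."""
    pf = P_FAIL[component]
    if fail_time is None:
        return (1.0 - pf) ** len(BRANCH_TIMES)
    return (1.0 - pf) ** BRANCH_TIMES.index(fail_time) * pf

def simulate(pump_fail_at, valve_fail_at, t_end=8.0, dt=0.1):
    """Tank level: constant demand, makeup flow only while pump AND valve work."""
    level, t = 1.0, 0.0
    while t < t_end:
        feeding = (pump_fail_at is None or t < pump_fail_at) and \
                  (valve_fail_at is None or t < valve_fail_at)
        level += dt * ((0.30 if feeding else 0.0) - 0.25)   # inflow - demand
        if level <= 0.0:
            return "tank_empty"
        t += dt
    return "ok"

end_states = {}
for pump_t, valve_t in itertools.product(BRANCH_TIMES + [None], repeat=2):
    p = branch_prob("pump", pump_t) * branch_prob("valve", valve_t)
    outcome = simulate(pump_t, valve_t)
    end_states[outcome] = end_states.get(outcome, 0.0) + p

print(end_states)   # scenario probabilities aggregated by end state
```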

  7. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments and even more for flood loss modelling. State of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provide a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models including stage-damage functions as well as multi-variate models. On the other hand the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show, that uncertainties of loss estimation remain high. Thus, the significant advantage of this probabilistic flood loss estimation approach is that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
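
    The essence of the bagging-decision-tree approach can be sketched with an off-the-shelf ensemble: each tree in the bag yields one loss estimate, and the spread of the per-tree predictions serves as a simple predictive distribution. The example below uses synthetic data and invented predictors; it is not BT-FLEMO or its variable set.

```python
import numpy as np
from sklearn.ensemble import BaggingRegressor
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(3)

# Sketch of the "bagging decision trees" idea behind probabilistic flood loss
# models: the spread of the individual trees' predictions serves as a simple
# predictive distribution. Data and features are synthetic, not FLEMO inputs.

n = 1500
water_depth  = rng.uniform(0.0, 3.0, n)        # m
building_val = rng.uniform(50, 500, n)         # k EUR
precaution   = rng.integers(0, 2, n)           # precaution indicator
X = np.column_stack([water_depth, building_val, precaution])
loss = building_val * (0.1 * water_depth) * (1 - 0.3 * precaution) \
       * rng.lognormal(0.0, 0.3, n)            # synthetic loss in k EUR

model = BaggingRegressor(DecisionTreeRegressor(), n_estimators=200, random_state=0)
model.fit(X, loss)

x_new = np.array([[1.5, 300.0, 0]])            # one affected building
per_tree = np.array([tree.predict(x_new)[0] for tree in model.estimators_])
print(f"median loss  {np.median(per_tree):6.1f} k EUR")
print(f"90% interval {np.percentile(per_tree, [5, 95])}")
```

    Summing such per-object predictive distributions over all affected objects in a land-use unit gives the municipality-level loss distribution the abstract refers to.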

  8. Bio-inspired Edible Superhydrophobic Interface for Reducing Residual Liquid Food.

    Science.gov (United States)

    Li, Yao; Bi, Jingran; Wang, Siqi; Zhang, Tan; Xu, Xiaomeng; Wang, Haitao; Cheng, Shasha; Zhu, Bei-Wei; Tan, Mingqian

    2018-03-07

    Significant wastage of residual liquid food, such as milk, yogurt, and honey, in food containers has attracted great attention. In this work, a bio-inspired edible superhydrophobic interface was fabricated using U.S. Food and Drug Administration-approved and edible honeycomb wax, arabic gum, and gelatin by a simple and low-cost method. The bio-inspired edible superhydrophobic interface showed multiscale structures, which were similar to that of a lotus leaf surface. This bio-inspired edible superhydrophobic interface displayed high contact angles for a variety of liquid foods, and the residue of liquid foods could be effectively reduced using the bio-inspired interface. To improve the adhesive force of the superhydrophobic interface, a flexible edible elastic film was fabricated between the interface and substrate material. After repeated folding and flushing for a long time, the interface still maintained excellent superhydrophobic property. The bio-inspired edible superhydrophobic interface showed good biocompatibility, which may have potential applications as a functional packaging interface material.

  9. Probabilistic safety assessment in nuclear power plant management

    International Nuclear Information System (INIS)

    Holloway, N.J.

    1989-06-01

    Probabilistic Safety Assessment (PSA) techniques have been widely used over the past few years to assist in understanding how engineered systems respond to abnormal conditions, particularly during a severe accident. The use of PSAs in the design and operation of such systems thus contributes to the safety of nuclear power plants. Probabilistic safety assessments can be maintained to provide a continuous up-to-date assessment (Living PSA), supporting the management of plant operations and modifications

  10. Exploiting Tensor Rank-One Decomposition in Probabilistic Inference

    Czech Academy of Sciences Publication Activity Database

    Savický, Petr; Vomlel, Jiří

    2007-01-01

    Roč. 43, č. 5 (2007), s. 747-764 ISSN 0023-5954 R&D Projects: GA MŠk 1M0545; GA MŠk 1M0572; GA ČR GA201/04/0393 Institutional research plan: CEZ:AV0Z10300504; CEZ:AV0Z10750506 Keywords : graphical probabilistic models * probabilistic inference * tensor rank Subject RIV: BD - Theory of Information Impact factor: 0.552, year: 2007 http://dml.cz/handle/10338.dmlcz/135810

  11. Probabilities, causes and propensities in physics

    CERN Document Server

    Suárez, Mauricio

    2010-01-01

    This volume defends a novel approach to the philosophy of physics: it is the first book devoted to a comparative study of probability, causality, and propensity, and their various interrelations, within the context of contemporary physics - particularly quantum and statistical physics. The philosophical debates and distinctions are firmly grounded upon examples from actual physics, thus exemplifying a robustly empiricist approach. The essays, by both prominent scholars in the field and promising young researchers, constitute a pioneer effort in bringing out the connections between probabilistic, causal and dispositional aspects of the quantum domain. This book will appeal to specialists in philosophy and foundations of physics, philosophy of science in general, metaphysics, ontology of physics theories, and philosophy of probability.

  12. How quantum physics came to Cambridge

    International Nuclear Information System (INIS)

    McCrea, William

    1985-01-01

    The paper traces the early stages of quantum physics, in Cambridge, in the 1920's. The mathematicians who inspired a generation of quantum physicists are briefly described, as well as the work of Dirac on quantum mechanics. The author's own contribution to quantum mechanics is outlined, along with other work in physics carried out at that time in Cambridge. (U.K.)

  13. InSpiRe - Intelligent Spine Rehabilitation

    DEFF Research Database (Denmark)

    Bøg, Kasper Hafstrøm; Helms, Niels Henrik; Kjær, Per

    Report on the InSpiRe project: InSpiRe is a national network that aims to promote the possibilities for intelligent rehabilitation of back disorders. In the network, researchers, companies, chiropractors and physiotherapists meet to develop new rehabilitation and/or treatment technologies.

  14. LEGO-inspired drug design

    DEFF Research Database (Denmark)

    Thanh Tung, Truong; Dao, Trong Tuan; Grifell Junyent, Marta

    2018-01-01

    The fungal plasma membrane H+-ATPase (Pma1p) is a potential target for the discovery of new antifungal agents. Surprisingly, no structure-activity relationship studies for small molecules targeting Pma1p have been reported. Herein, we disclose a LEGO-inspired fragment assembly strategy for design...

  15. No Space for Girliness in Physics: Understanding and Overcoming the Masculinity of Physics

    Science.gov (United States)

    Götschel, Helene

    2014-01-01

    Allison Gonsalves' article on "women doctoral students' positioning around discourses of gender and competence in physics" explores narratives of Canadian women physicists concerning their strategies to gain recognition as physicists. In my response to her rewarding and inspiring analysis I will reflect on her findings and arguments and…

  16. Probabilistic information on object weight shapes force dynamics in a grip-lift task.

    Science.gov (United States)

    Trampenau, Leif; Kuhtz-Buschbeck, Johann P; van Eimeren, Thilo

    2015-06-01

    Advance information, such as object weight, size and texture, modifies predictive scaling of grip forces in a grip-lift task. Here, we examined the influence of probabilistic advance information about object weight. Fifteen healthy volunteers repeatedly grasped and lifted an object equipped with a force transducer between their thumb and index finger. Three clearly distinguishable object weights were used. Prior to each lift, the probabilities for the three object weights were given by a visual cue. We examined the effect of probabilistic pre-cues on grip and lift force dynamics. We expected predictive scaling of grip force parameters to follow predicted values calculated according to probabilistic contingencies of the cues. We observed that probabilistic cues systematically influenced peak grip and load force rates, as an index of predictive motor scaling. However, the effects of probabilistic cues on force rates were nonlinear, and anticipatory adaptations of the motor output generally seemed to overestimate high probabilities and underestimate low probabilities. These findings support the suggestion that anticipatory adaptations and force scaling of the motor system can integrate probabilistic information. However, probabilistic information seems to influence motor programs in a nonlinear fashion.

  17. Inspirational Catalogue of Master Thesis Proposals 2015

    DEFF Research Database (Denmark)

    Thorndahl, Søren

    2015-01-01

    This catalogue presents different topics for master thesis projects. It is important to emphasize that the project descriptions only serve as inspiration and that you can always discuss the specific contents of a project with the potential supervisors.

  18. Probabilistic Model for Fatigue Crack Growth in Welded Bridge Details

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard; Yalamas, Thierry

    2013-01-01

    In the present paper a probabilistic model for fatigue crack growth in welded steel details in road bridges is presented. The probabilistic model takes the influence of bending stresses in the joints into account. The bending stresses can either be introduced by e.g. misalignment or redistribution of stresses in the structure. The fatigue stress ranges are estimated from traffic measurements and a generic bridge model. Based on the probabilistic models for the resistance and load, the reliability is estimated for a typical welded steel detail. The results show that large misalignments in the joints can...

  19. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  20. Nature-inspired design of hybrid intelligent systems

    CERN Document Server

    Castillo, Oscar; Kacprzyk, Janusz

    2017-01-01

    This book highlights recent advances in the design of hybrid intelligent systems based on nature-inspired optimization and their application in areas such as intelligent control and robotics, pattern recognition, time series prediction, and optimization of complex problems. The book is divided into seven main parts, the first of which addresses theoretical aspects of and new concepts and algorithms based on type-2 and intuitionistic fuzzy logic systems. The second part focuses on neural network theory, and explores the applications of neural networks in diverse areas, such as time series prediction and pattern recognition. The book’s third part presents enhancements to meta-heuristics based on fuzzy logic techniques and describes new nature-inspired optimization algorithms that employ fuzzy dynamic adaptation of parameters, while the fourth part presents diverse applications of nature-inspired optimization algorithms. In turn, the fifth part investigates applications of fuzzy logic in diverse areas, such as...

  1. Probabilistic safety analysis and interpretation thereof

    International Nuclear Information System (INIS)

    Steininger, U.; Sacher, H.

    1999-01-01

    Increasing use is being made of PSA instruments in Germany for quantitative technical safety assessment, for example with regard to reportable incidents and the forwarding of information, especially in the case of modifications to nuclear plants. The Commission for Nuclear Reactor Safety recommends the regular execution of PSA on a ten-year cycle. According to the PSA guidance instructions, probabilistic analyses serve to assess the degree of safety of the entire plant, expressed as the expectation value for the frequency of endangering conditions. The authors describe the method, sequence of steps and evaluation of probabilistic safety analyses. The limits of probabilistic safety analyses become apparent in practical implementation. Normally the guidance instructions for PSA are confined to the safety systems, so that in practice they are at best suitable for operational optimisation only to a limited extent. The present restriction of the analyses has a similar effect on power output operation of the plant. This seriously degrades the utilitarian value of these analyses for the plant operators. In order to further develop PSA as a supervisory and operational optimisation instrument, both authors consider it appropriate to bring together the specific know-how of analysts, manufacturers, plant operators and experts. (orig.) [de

  2. Traceability investigation in Computed Tomography using industry-inspired workpieces

    DEFF Research Database (Denmark)

    Kraemer, Alexandra; Stolfi, Alessandro; Schneider, Timm

    2017-01-01

    This paper concerns an investigation of the accuracy of Computed Tomography (CT) measurements using four industry-inspired workpieces. A total of 16 measurands were selected and calibrated using CMMs. CT measurements on industry-inspired workpieces were carried out using two CTs having different...

  3. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    Science.gov (United States)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  4. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    Science.gov (United States)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
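
    A minimal stand-in for the kind of output such packages produce is a direct Monte Carlo propagation of random load, material and geometry variables through a simple response model, from which the cumulative probability of exceedance can be tabulated. The cantilever-stress formula and all distributions below are illustrative assumptions, not PSAM's PFEM/PAAM/PBEM machinery.

```python
import numpy as np

rng = np.random.default_rng(4)

# Minimal stand-in for PSAM-style output: propagate random load, material and
# geometry variables through a simple structural response model and report the
# cumulative probability of exceedance of the response. The cantilever-stress
# formula and distributions are illustrative assumptions.

n = 100_000
force  = rng.normal(10_000.0, 1_500.0, n)      # tip load, N
length = rng.normal(0.50, 0.005, n)            # beam length, m
width  = rng.normal(0.040, 0.001, n)           # section width, m
height = rng.normal(0.060, 0.001, n)           # section height, m

# Maximum bending stress of a rectangular cantilever: sigma = 6 F L / (b h^2)
sigma = 6.0 * force * length / (width * height**2) / 1e6   # MPa

for s in (150.0, 200.0, 250.0, 300.0):
    print(f"P(max stress > {s:5.1f} MPa) = {np.mean(sigma > s):.4f}")
```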

  5. Understanding solid state physics

    CERN Document Server

    Holgate, Sharon Ann

    2009-01-01

    Where Sharon Ann Holgate has succeeded in this book is in packing it with examples of the application of solid state physics to technology. … All the basic elements of solid state physics are covered … . The range of materials is good, including as it does polymers and glasses as well as crystalline solids. In general, the style makes for easy reading. … Overall this book succeeds in showing the relevance of solid state physics to the modern world … .-Contemporary Physics, Vol. 52, No. 2, 2011I was indeed amused and inspired by the wonderful images throughout the book, carefully selected by th

  6. Augmenting Probabilistic Risk Assesment with Malevolent Initiators

    International Nuclear Information System (INIS)

    Smith, Curtis; Schwieder, David

    2011-01-01

    As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA, but are considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.

  7. Probabilistically-Cued Patterns Trump Perfect Cues in Statistical Language Learning.

    Science.gov (United States)

    Lany, Jill; Gómez, Rebecca L

    2013-01-01

    Probabilistically-cued co-occurrence relationships between word categories are common in natural languages but difficult to acquire. For example, in English, determiner-noun and auxiliary-verb dependencies both involve co-occurrence relationships, but determiner-noun relationships are more reliably marked by correlated distributional and phonological cues, and appear to be learned more readily. We tested whether experience with co-occurrence relationships that are more reliable promotes learning those that are less reliable using an artificial language paradigm. Prior experience with deterministically-cued contingencies did not promote learning of less reliably-cued structure, nor did prior experience with relationships instantiated in the same vocabulary. In contrast, prior experience with probabilistically-cued co-occurrence relationships instantiated in different vocabulary did enhance learning. Thus, experience with co-occurrence relationships sharing underlying structure but not vocabulary may be an important factor in learning grammatical patterns. Furthermore, experience with probabilistically-cued co-occurrence relationships, despite their difficulty for naïve learners, lays an important foundation for learning novel probabilistic structure.

  8. Crickets as bio-inspiration for MEMS-based flow-sensing

    NARCIS (Netherlands)

    Krijnen, Gijsbertus J.M.; Droogendijk, H.; Dagamseh, A.M.K.; Jaganatharaja, R.K.; Casas, Jerome

    2014-01-01

    MEMS offers exciting possibilities for the fabrication of bio-inspired mechanosensors. Over the last few years, we have been working on cricket- inspired hair-sensor arrays for spatio-temporal flow-field observations (i.e. flow camera) and source localisation. Whereas making flow-sensors as energy

  9. Mediating between the muse and the masses: inspiration and the actualization of creative ideas.

    Science.gov (United States)

    Thrash, Todd M; Maruskin, Laura A; Cassidy, Scott E; Fryer, James W; Ryan, Richard M

    2010-03-01

    Within the creativity domain, inspiration is a motivational state posited to energize the actualization of creative ideas. The authors examined the construct validity, predictive utility, and function of inspiration in the writing process. Study 1, a cross-lagged panel study, showed that getting creative ideas and being inspired are distinct and that the former precedes the latter. In Study 2, inspiration, at the between-person level, predicted the creativity of scientific writing, whereas effort predicted technical merit. Within persons, peaks in inspiration predicted peaks in creativity and troughs in technical merit. In Study 3, inspiration predicted the creativity of poetry. Consistent with its posited transmission function, inspiration mediated between creativity of the idea and creativity of the product, whereas effort, positive affect, and awe did not. Study 4 extended the Study 3 findings to fiction writing. Openness to aesthetics and positive affect predicted creativity of the idea, whereas approach temperament moderated the relation between creativity of the idea and inspiration. Inspiration predicted efficiency, productivity, and use of shorter words, indicating that inspiration not only transmits creativity but does so economically.

  10. Quantitative probabilistic functional diffusion mapping in newly diagnosed glioblastoma treated with radiochemotherapy.

    Science.gov (United States)

    Ellingson, Benjamin M; Cloughesy, Timothy F; Lai, Albert; Nghiemphu, Phioanh L; Liau, Linda M; Pope, Whitney B

    2013-03-01

    Functional diffusion mapping (fDM) is a cancer imaging technique that uses voxel-wise changes in apparent diffusion coefficients (ADC) to evaluate response to treatment. Despite promising initial results, uncertainty in image registration remains the largest barrier to widespread clinical application. The current study introduces a probabilistic approach to fDM quantification to overcome some of these limitations. A total of 143 patients with newly diagnosed glioblastoma who were undergoing standard radiochemotherapy were enrolled in this retrospective study. Traditional and probabilistic fDMs were calculated using ADC maps acquired before and after therapy. Probabilistic fDMs were calculated by applying random, finite translational, and rotational perturbations to both pre-and posttherapy ADC maps, then repeating calculation of fDMs reflecting changes after treatment, resulting in probabilistic fDMs showing the voxel-wise probability of fDM classification. Probabilistic fDMs were then compared with traditional fDMs in their ability to predict progression-free survival (PFS) and overall survival (OS). Probabilistic fDMs applied to patients with newly diagnosed glioblastoma treated with radiochemotherapy demonstrated shortened PFS and OS among patients with a large volume of tumor with decreasing ADC evaluated at the posttreatment time with respect to the baseline scans. Alternatively, patients with a large volume of tumor with increasing ADC evaluated at the posttreatment time with respect to baseline scans were more likely to progress later and live longer. Probabilistic fDMs performed better than traditional fDMs at predicting 12-month PFS and 24-month OS with use of receiver-operator characteristic analysis. Univariate log-rank analysis on Kaplan-Meier data also revealed that probabilistic fDMs could better separate patients on the basis of PFS and OS, compared with traditional fDMs. Results suggest that probabilistic fDMs are a more predictive biomarker in
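
    The core idea, repeating the voxel-wise ADC-change classification under many small random mis-registrations and reporting per-voxel classification probabilities, can be sketched on synthetic data as follows. The example uses integer-pixel translations only (no rotations) and an assumed ADC-change threshold; it is not the registration or thresholding pipeline of the cited study.

```python
import numpy as np

rng = np.random.default_rng(5)

# Sketch of probabilistic fDM: repeat the voxel-wise ADC-change classification
# under many small random mis-registrations and keep, per voxel, the fraction of
# repetitions in which it was classified as "decreasing ADC". Synthetic 2-D maps,
# integer-pixel shifts only (no rotations), and an assumed change threshold.

shape = (64, 64)
adc_pre  = rng.normal(1.2, 0.1, shape)              # baseline ADC map (arbitrary units)
adc_post = adc_pre.copy()
adc_post[20:35, 20:35] -= 0.5                       # synthetic responding region

THRESH = 0.4                                        # assumed fDM change threshold
n_perturb = 200
count_dec = np.zeros(shape)

for _ in range(n_perturb):
    dy, dx = rng.integers(-2, 3, size=2)            # random rigid translation (pixels)
    shifted_post = np.roll(adc_post, (dy, dx), axis=(0, 1))
    count_dec += (adc_pre - shifted_post) > THRESH  # voxel classified "ADC decreased"

prob_dec = count_dec / n_perturb                    # voxel-wise probability map
print("voxels with P(decreasing ADC) > 0.95:", int(np.sum(prob_dec > 0.95)))
```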

  11. Probabilistic Durability Analysis in Advanced Engineering Design

    Directory of Open Access Journals (Sweden)

    A. Kudzys

    2000-01-01

    Expedience of probabilistic durability concepts and approaches in advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for calculation of reliability indices are given. The analysis can be used for probabilistic durability assessment of carrying and enclosure metal, reinforced concrete, wood, plastic and masonry structures, both homogeneous and sandwich or composite, and some kinds of equipment. The analysis models can be applied in other engineering fields.

  12. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue towards any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before an appropriate use for different purposes - such as engineering design, insurance, and emergency management. Quantitative assessment of maps performances is an essential step also in scientific process of their revision and possible improvement. Cross-checking of probabilistic models with available observations and independent physics based models is recognized as major validation procedure. The existing maps from the classical probabilistic seismic hazard analysis (PSHA), as well as those from the neo-deterministic analysis (NDSHA), which have been already developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of the cross-comparative analysis in spotting out limits and advantages of different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assess performances of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveforms modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  13. Nature inspires sensors to do more with less.

    Science.gov (United States)

    Mulvaney, Shawn P; Sheehan, Paul E

    2014-10-28

    The world is filled with widely varying chemical, physical, and biological stimuli. Over millennia, organisms have refined their senses to cope with these diverse stimuli, becoming virtuosos in differentiating closely related antigens, handling extremes in concentration, resetting the spent sensing mechanisms, and processing the multiple data streams being generated. Nature successfully deals with both repeating and new stimuli, demonstrating great adaptability when confronted with the latter. Interestingly, nature accomplishes these feats using a fairly simple toolbox. The sensors community continues to draw inspiration from nature's example: just look at the antibodies used as biosensor capture agents or the neural networks that process multivariate data streams. Indeed, many successful sensors have been built by simply mimicking natural systems. However, some of the most exciting breakthroughs occur when the community moves beyond mimicking nature and learns to use nature's tools in innovative ways.

  14. Origami-Inspired Folding of Thick, Rigid Panels

    Science.gov (United States)

    Trease, Brian P.; Thomson, Mark W.; Sigel, Deborah A.; Walkemeyer, Phillip E.; Zirbel, Shannon; Howell, Larry; Lang, Robert

    2014-01-01

    To achieve power of 250 kW or greater, a large compression ratio of stowed-to-deployed area is needed. Origami folding patterns were used to inspire the folding of a solar array to achieve synchronous deployment; however, origami models are generally created for near-zero-thickness material. Panel thickness is one of the main challenges of origami-inspired design. Three origami-inspired folding techniques (flasher, square twist, and map fold) were created with rigid panels and hinges. Hinge components are added to the model to enable folding of thick, rigid materials. Origami models are created assuming zero (or near zero) thickness. When a material with finite thickness is used, the panels are required to bend around an increasingly thick fold as they move away from the center of the model. The two approaches for dealing with material thickness are to use membrane hinges to connect the panels, or to add panel hinges, or hinges of the same thickness, at an appropriate width to enable folding.

  15. Pramana – Journal of Physics | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Bhaskar Jyoti Hazarika. Articles written in Pramana – Journal of Physics. Volume 75 Issue 3 September 2010 pp 423-438 Research Articles. Slope and curvature of Isgur–Wise function using variationally improved perturbation theory in a quantum chromodynamics inspired ...

  16. The effects of 12 weeks Pilates-inspired exercise training on functional performance in older women: A randomized clinical trial.

    Science.gov (United States)

    Vieira, Natália Donzeli; Testa, Daniela; Ruas, Paula Cristine; Salvini, Tânia de Fátima; Catai, Aparecida Maria; Melo, Ruth Caldeira

    2017-04-01

    Recent scientific evidence supports the benefits of Pilates exercises on postural balance and muscle strength of older persons. However, their effects on other aspects of physical fitness, which are also important for independent living in older age, are still unknown. To investigate the effects of a 12-week Pilates-inspired exercise program on the functional performance of community-dwelling older women. Forty community-dwelling older women were randomly enrolled in a Pilates-inspired exercise training (2 times/week, 60 min/session) (PG, n = 21, 66.0 ± 1.4 yrs) or kept in the control group (CG; n = 19, 63.3 ± 0.9 yrs). The Pilates exercises were conducted in small groups and performed on mats (using accessories such as exercise rubber bands, Swiss balls and exercise balls). The functional performance on one-leg stance (OLS), timed up and go (TUG), five-times-sit-to-stand (STS) and 6-min walk (6 MW) tests was evaluated before and after the 12-week Pilates training or control follow-up period. After 12 weeks, time effects were observed for the STS (p = 0.03) and 6 MW tests. The Pilates-inspired exercises improved dynamic balance, lower-extremity strength and aerobic resistance in community-dwelling older women. Therefore, it may be a potentially effective exercise regimen to maintain physical fitness in old age. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. "Kind and Grateful": A Context-Sensitive Smartphone App Utilizing Inspirational Content to Promote Gratitude.

    Science.gov (United States)

    Ghandeharioun, Asma; Azaria, Asaph; Taylor, Sara; Picard, Rosalind W

    Previous research has shown that gratitude positively influences psychological wellbeing and physical health. Grateful people are reported to feel more optimistic and happy, to better mitigate aversive experiences, and to have stronger interpersonal bonds. Gratitude interventions have been shown to result in improved sleep, more frequent exercise and stronger cardiovascular and immune systems. These findings call for the development of technologies that would inspire gratitude. This paper presents a novel system designed toward this end. We leverage pervasive technologies to naturally embed inspiration to express gratitude in everyday life. Novel to this work, mobile sensor data is utilized to infer optimal moments for stimulating contextually relevant thankfulness and appreciation. Sporadic mood measurements are inventively obtained through the smartphone lock screen, investigating their interplay with grateful expressions. Both momentary thankful emotion and dispositional gratitude are measured. To evaluate our system, we ran two rounds of randomized control trials (RCT), including a pilot study (N = 15, 2 weeks) and a main study (N = 27, 5 weeks). Studies' participants were provided with a newly developed smartphone app through which they were asked to express gratitude; the app displayed inspirational content to only the intervention group, while measuring contextual cues for all users. In both rounds of the RCT, the intervention was associated with improved thankful behavior. Significant increase was observed in multiple facets of practicing gratitude in the intervention groups. The average frequency of practicing thankfulness increased by more than 120 %, comparing the baseline weeks with the intervention weeks of the main study. In contrast, the control group of the same study exhibited a decrease of 90 % in the frequency of thankful expressions. In the course of the study's 5 weeks, increases in dispositional gratitude and in psychological wellbeing were

  18. Site-specific Probabilistic Analysis of DCGLs Using RESRAD Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeongju; Yoon, Suk Bon; Sohn, Wook [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    In general, DCGLs can be conservative (screening DCGLs) if they do not take into account site-specific factors. Use of such conservative DCGLs can lead to additional remediation that would not be required if the effort were made to develop site-specific DCGLs. Therefore, the objective of this work is to provide an example of the use of the RESRAD 6.0 probabilistic (site-specific) dose analysis to compare with the screening DCGL. Site release regulations state that a site will be considered acceptable for unrestricted use if the residual radioactivity that is distinguishable from background radiation results in a Total Effective Dose Equivalent (TEDE) to an average member of the critical group of less than the site release criterion, for example 0.25 mSv per year in the U.S. Utilities use computer dose modeling codes to establish an acceptable level of contamination, the derived concentration guideline level (DCGL), that will meet this regulatory limit. Since the DCGL value is the principal measure of residual radioactivity, it is critical to understand the technical basis of these dose modeling codes. The objective of this work was to provide an example of nuclear power plant decommissioning dose analysis in a probabilistic analysis framework. The focus was on the demonstration of regulatory compliance for surface soil contamination using the RESRAD 6.0 code. Both the screening and site-specific probabilistic dose analysis methodologies were examined. Example analyses performed with the screening probabilistic dose analysis confirmed the conservatism of the NRC screening values and indicated the effectiveness of probabilistic dose analysis in reducing the conservatism in DCGL derivation.

  19. Inspired gas humidity and temperature during mechanical ventilation with the Stephanie ventilator.

    Science.gov (United States)

    Preo, Bianca L; Shadbolt, Bruce; Todd, David A

    2013-11-01

    To measure inspired gas humidity and temperature delivered by a Stephanie neonatal ventilator with variations in (i) circuit length; (ii) circuit insulation; (iii) proximal airway temperature probe (pATP) position; (iv) inspiratory temperature (offset); and (v) incubator temperatures. Using the Stephanie neonatal ventilator, inspired gas humidity and temperature were measured during mechanical ventilation at the distal inspiratory limb and 3 cm down the endotracheal tube. Measurements were made with a long or short circuit; with or without insulation of the inspiratory limb; proximal ATP (pATP) either within or external to the incubator; at two different inspiratory temperature (offset) of 37(-0.5) and 39(-2.0)°C; and at three different incubator temperatures of 32, 34.5, and 37°C. Long circuits produced significantly higher inspired humidity than short circuits at all incubator settings, while only at 32°C was the inspired temperature higher. In the long circuits, insulation further improved the inspired humidity especially at 39(-2.0)°C, while only at incubator temperatures of 32 and 37°C did insulation significantly improve inspired temperature. Positioning the pATP outside the incubator did not result in higher inspired humidity but did significantly improve inspired temperature. An inspiratory temperature (offset) of 39(-2.0)°C delivered significantly higher inspired humidity and temperature than the 37(-0.5)°C especially when insulated. Long insulated Stephanie circuits should be used for neonatal ventilation when the infant is nursed in an incubator. The recommended inspiratory temperature (offset) of 37(-0.5)°C produced inspired humidity and temperature below international standards, and we suggest an increase to 39(-2.0)°C. © 2013 John Wiley & Sons Ltd.

  20. Electronic and optoelectronic materials and devices inspired by nature

    Science.gov (United States)

    Meredith, P.; Bettinger, C. J.; Irimia-Vladu, M.; Mostert, A. B.; Schwenn, P. E.

    2013-03-01

    Inorganic semiconductors permeate virtually every sphere of modern human existence. Micro-fabricated memory elements, processors, sensors, circuit elements, lasers, displays, detectors, etc are ubiquitous. However, the dawn of the 21st century has brought with it immense new challenges, and indeed opportunities—some of which require a paradigm shift in the way we think about resource use and disposal, which in turn directly impacts our ongoing relationship with inorganic semiconductors such as silicon and gallium arsenide. Furthermore, advances in fields such as nano-medicine and bioelectronics, and the impending revolution of the ‘ubiquitous sensor network’, all require new functional materials which are bio-compatible, cheap, have minimal embedded manufacturing energy plus extremely low power consumption, and are mechanically robust and flexible for integration with tissues, building structures, fabrics and all manner of hosts. In this short review article we summarize current progress in creating materials with such properties. We focus primarily on organic and bio-organic electronic and optoelectronic systems derived from or inspired by nature, and outline the complex charge transport and photo-physics which control their behaviour. We also introduce the concept of electrical devices based upon ion or proton flow (‘ionics and protonics’) and focus particularly on their role as a signal interface with biological systems. Finally, we highlight recent advances in creating working devices, some of which have bio-inspired architectures, and summarize the current issues, challenges and potential solutions. This is a rich new playground for the modern materials physicist.

  1. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    Science.gov (United States)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed Generation (DG) integrated into the network can cause harmonic pollution, which damages electrical devices and affects the normal operation of the power system. On the other hand, due to the randomness of wind and solar irradiation, the output of DG is also random, which leads to uncertainty in the harmonics generated by the DG. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of the harmonic voltages and the harmonic distortion in the distribution network after distributed photovoltaic (DPV) integration under different weather conditions, namely sunny, cloudy, rainy and snowy days. The probabilistic distribution function of the DPV output power in the different typical weather conditions was obtained via parameter identification by maximum likelihood estimation. The Monte-Carlo simulation method was adopted to calculate the probabilistic distribution of the harmonic voltage content at different harmonic orders, as well as the total harmonic distortion (THD), in the typical weather conditions. The case study was based on the IEEE33 system, and the resulting probabilistic distributions of harmonic voltage content and THD in the typical weather conditions were compared.
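
    The Monte Carlo step of such a study can be sketched as follows: sample the per-unit DPV output from a weather-dependent distribution, convert it into harmonic current injections and evaluate the distribution of the voltage THD at the connection bus. The Beta parameters, harmonic spectrum and network impedance below are assumptions for illustration, not the fitted IEEE33 case.

```python
import numpy as np

rng = np.random.default_rng(6)

# Sketch of the Monte-Carlo harmonic study: sample the DPV output from a
# weather-dependent distribution, turn it into harmonic current injections and
# compute the distribution of the voltage THD at the connection bus. The Beta
# parameters, harmonic spectrum and network impedance are all assumptions.

n = 20_000
p_pu = rng.beta(5.0, 2.0, n)                      # per-unit DPV output, "sunny day"

orders   = np.array([3, 5, 7, 11, 13])            # harmonic orders considered
spectrum = np.array([0.03, 0.025, 0.02, 0.01, 0.008])   # I_h / I_1 at rated output
z1 = 0.05                                         # p.u. fundamental system reactance

I1  = p_pu                                        # fundamental current ~ output
Ih  = np.outer(I1, spectrum)                      # harmonic currents per sample
Vh  = Ih * (orders * z1)                          # V_h = I_h * h * X1 (inductive)
thd = np.sqrt((Vh ** 2).sum(axis=1)) / 1.0 * 100  # THD in %, V1 taken as 1 p.u.

print(f"mean THD        {thd.mean():.3f} %")
print(f"95th percentile {np.percentile(thd, 95):.3f} %")
```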

  2. Automatic Probabilistic Program Verification through Random Variable Abstraction

    Directory of Open Access Journals (Sweden)

    Damián Barsotti

    2010-06-01

    The weakest pre-expectation calculus has been proved to be a mature theory for analyzing quantitative properties of probabilistic and nondeterministic programs. We present an automatic method for proving quantitative linear properties on any denumerable state space using iterative backwards fixed point calculation in the general framework of abstract interpretation. In order to accomplish this task we present the technique of random variable abstraction (RVA) and we also postulate a sufficient condition to achieve exact fixed point computation in the abstract domain. The feasibility of our approach is shown with two examples, one obtaining the expected running time of a probabilistic program, and the other the expected gain of a gambling strategy. Our method works on general guarded probabilistic and nondeterministic transition systems instead of plain pGCL programs, allowing us to easily model a wide range of systems, including distributed ones and unstructured programs. We present the operational and weakest precondition semantics for these programs and prove their equivalence.
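
    For the simplest probabilistic loops, the backward fixed-point computation can be carried out by a hand-rolled iteration of the expectation transformer; the sketch below recovers the expected number of iterations of a geometric loop and checks it against the closed form. It only illustrates the fixed-point idea, not the RVA abstraction of the paper.

```python
# Sketch: approximating a weakest pre-expectation by iterative fixed-point
# computation, here for the expected number of iterations of the loop
#   while flip(P): n := n + 1
# The exact answer is P / (1 - P); the program and numbers are illustrative.

P = 0.75            # probability that the loop continues (assumed)

def iterate_wp(num_steps=200):
    """Backward fixed-point iteration of the expectation transformer."""
    expectation = 0.0          # 0-th approximation of E[remaining iterations]
    for _ in range(num_steps):
        # One unrolling of the loop: with prob P we pay 1 and continue,
        # with prob 1-P we stop and pay nothing.
        expectation = P * (1.0 + expectation)
    return expectation

print(f"iterated value : {iterate_wp():.6f}")
print(f"closed form    : {P / (1.0 - P):.6f}")
```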

  3. Probabilistic Approaches to Video Retrieval

    NARCIS (Netherlands)

    Ianeva, Tzvetanka; Boldareva, L.; Westerveld, T.H.W.; Cornacchia, Roberto; Hiemstra, Djoerd; de Vries, A.P.

    Our experiments for TRECVID 2004 further investigate the applicability of the so-called “Generative Probabilistic Models to video retrieval”. TRECVID 2003 results demonstrated that mixture models computed from video shot sequences improve the precision of “query by examples” results when

  4. Performance analysis of chi models using discrete-time probabilistic reward graphs

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Markovski, J.; Andova, S.; Vink, de E.P.

    2008-01-01

    We propose the model of discrete-time probabilistic reward graphs (DTPRGs) for performance analysis of systems exhibiting discrete deterministic time delays and probabilistic behavior, via their interpretation as discrete-time Markov reward chains, a full-fledged platform for qualitative and

  5. Biological sequence analysis: probabilistic models of proteins and nucleic acids

    National Research Council Canada - National Science Library

    Durbin, Richard

    1998-01-01

    ... analysis methods are now based on principles of probabilistic modelling. Examples of such methods include the use of probabilistically derived score matrices to determine the significance of sequence alignments, the use of hidden Markov models as the basis for profile searches to identify distant members of sequence families, and the inference...

  6. Bio-inspired passive actuator simulating an abalone shell mechanism for structural control

    International Nuclear Information System (INIS)

    Yang, Henry T Y; Lin, Chun-Hung; Bridges, Daniel; Randall, Connor J; Hansma, Paul K

    2010-01-01

    An energy dispersion mechanism called 'sacrificial bonds and hidden length', which is found in some biological systems, such as abalone shells and bones, is the inspiration for new strategies for structural control. Sacrificial bonds and hidden length can substantially increase the stiffness and enhance energy dissipation in the constituent molecules of abalone shells and bone. Having been inspired by the usefulness and effectiveness of such a mechanism, which has evolved over millions of years and countless cycles of evolution, the authors employ the conceptual underpinnings of this mechanism to develop a bio-inspired passive actuator. This paper presents a fundamental method for optimally designing such bio-inspired passive actuators for structural control. To optimize the bio-inspired passive actuator, a simple method utilizing the force–displacement–velocity (FDV) plots based on LQR control is proposed. A linear regression approach is adopted in this research to find the initial values of the desired parameters for the bio-inspired passive actuator. The illustrative examples, conducted by numerical simulation with experimental validation, suggest that the bio-inspired passive actuator based on sacrificial bonds and hidden length may be comparable in performance to state-of-the-art semi-active actuators.
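
    As a brief, hedged sketch of the LQR baseline from which force-displacement-velocity targets of the kind described above could be derived, the snippet below computes an optimal state-feedback gain for a generic single-degree-of-freedom structure; the mass, damping, stiffness and weighting matrices are illustrative assumptions, not parameters from the paper.

      import numpy as np
      from scipy.linalg import solve_continuous_are

      # Generic single-degree-of-freedom structure m*x'' + c*x' + k*x = u
      # (mass, damping and stiffness values are illustrative assumptions).
      m, c, k = 1.0e3, 2.0e2, 4.0e5
      A = np.array([[0.0, 1.0], [-k / m, -c / m]])
      B = np.array([[0.0], [1.0 / m]])

      Q = np.diag([1.0e6, 1.0e2])        # state weights (displacement, velocity), assumed
      R = np.array([[1.0e-3]])           # control-effort weight, assumed

      P = solve_continuous_are(A, B, Q, R)
      K_lqr = np.linalg.solve(R, B.T @ P)      # optimal gain, u = -K_lqr @ [x, x_dot]

      # A force-displacement-velocity surface is then the LQR force demand evaluated
      # over a grid of displacement/velocity states (a single sample point here).
      state = np.array([0.01, -0.05])          # 1 cm displacement, -5 cm/s velocity
      print("LQR force demand: %.1f N" % float(-K_lqr @ state))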

  7. Bio-inspired passive actuator simulating an abalone shell mechanism for structural control

    Science.gov (United States)

    Yang, Henry T. Y.; Lin, Chun-Hung; Bridges, Daniel; Randall, Connor J.; Hansma, Paul K.

    2010-10-01

    An energy dispersion mechanism called 'sacrificial bonds and hidden length', which is found in some biological systems, such as abalone shells and bones, is the inspiration for new strategies for structural control. Sacrificial bonds and hidden length can substantially increase the stiffness and enhance energy dissipation in the constituent molecules of abalone shells and bone. Having been inspired by the usefulness and effectiveness of such a mechanism, which has evolved over millions of years and countless cycles of evolution, the authors employ the conceptual underpinnings of this mechanism to develop a bio-inspired passive actuator. This paper presents a fundamental method for optimally designing such bio-inspired passive actuators for structural control. To optimize the bio-inspired passive actuator, a simple method utilizing the force-displacement-velocity (FDV) plots based on LQR control is proposed. A linear regression approach is adopted in this research to find the initial values of the desired parameters for the bio-inspired passive actuator. The illustrative examples, conducted by numerical simulation with experimental validation, suggest that the bio-inspired passive actuator based on sacrificial bonds and hidden length may be comparable in performance to state-of-the-art semi-active actuators.

  8. Safety-specific benefit of the probabilistic evaluation of older nuclear power plants

    International Nuclear Information System (INIS)

    Hoertner, H.; Koeberlein, K.

    1991-01-01

    The report summarizes the experience of the GRS obtained within the framework of a probabilistic evaluation of older nuclear power plants and the German risk study. The applied methodology and the problems involved are explained first. After a brief summary of probabilistic analyses carried out for German nuclear power plants, reliability analyses for older systems are discussed in detail. The findings from the probabilistic safety analyses and the conclusions drawn are presented. (orig.) [de

  9. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
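
    A minimal sketch of the sampling idea behind such fingerprinting (this is not the pfff tool itself): hash the file size together with a fixed number of pseudo-randomly placed blocks instead of the whole file. The block size, sample count and seed below are illustrative assumptions.

      import hashlib
      import os
      import random

      def sampled_fingerprint(path, samples=64, block=64, seed=1234):
          """Hash the file size plus `samples` pseudo-randomly placed blocks."""
          size = os.path.getsize(path)
          h = hashlib.sha256(str(size).encode())
          rng = random.Random(seed)        # fixed seed: same offsets for equal-sized files
          with open(path, "rb") as f:
              for _ in range(samples):
                  offset = rng.randrange(max(size - block, 1))
                  f.seek(offset)
                  h.update(f.read(block))
          return h.hexdigest()

      # Cost grows with samples * block rather than with the file size, e.g.:
      # print(sampled_fingerprint("big_sequence_file.fasta"))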

  10. A study on the methodology of probabilistic safety assessment for KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Jeong, Kwan Seong; Kwon, Young Min; Lee, Yong Bum; Jeong, Hae Yong; Yang, Joon Eon; Ha, Kyu Suk; Hahn, Do Hee [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    Existing Probabilistic Safety Assessment (PSA) methods were developed for Light Water Reactors or Pressurized Heavy Water Reactors. Because KALIMER differs from these reactors, a new PSA methodology needs to be developed. In this paper, the PSA of the Power Reactor Inherently Safe Module (PRISM) is analyzed, and initiating-event identification approaches such as experiential assessment, logical assessment and Failure Mode and Effects Analysis (FMEA) are reviewed. A pipe damage frequency method is also suggested for KALIMER. In addition, the reliability physics method for the passive system, which is the chief safety system of KALIMER, is reviewed and its applicability is investigated. Finally, for the preliminary PSA of KALIMER, the Intermediate Heat Transfer System is analyzed. 23 refs., 10 figs., 13 tabs. (Author)

  11. Microflyers: inspiration from nature

    Science.gov (United States)

    Sirohi, Jayant

    2013-04-01

    Over the past decade, there has been considerable interest in miniaturizing aircraft to create a class of extremely small, robotic vehicles with a gross mass on the order of tens of grams and a dimension on the order of tens of centimeters. These are collectively referred to as micro aerial vehicles (MAVs) or microflyers. Because the size of microflyers is on the same order as that of small birds and large insects, engineers are turning to nature for inspiration. Bioinspired concepts make use of structural or aerodynamic mechanisms that are observed in insects and birds, such as elastic energy storage and unsteady aerodynamics. Biomimetic concepts attempt to replicate the form and function of natural flyers, such as flapping-wing propulsion and external appearance. This paper reviews recent developments in the area of man-made microflyers. The design space for microflyers will be described, along with fundamental physical limits to miniaturization. Key aerodynamic phenomena at the scale of microflyers will be highlighted. Because the focus is on bioinspiration and biomimetics, scaled-down versions of conventional aircraft, such as fixed wing micro air vehicles and microhelicopters will not be addressed. A few representative bioinspired and biomimetic microflyer concepts developed by researchers will be described in detail. Finally, some of the sensing mechanisms used by natural flyers that are being implemented in man-made microflyers will be discussed.

  12. A generative, probabilistic model of local protein structure

    DEFF Research Database (Denmark)

    Boomsma, Wouter; Mardia, Kanti V.; Taylor, Charles C.

    2008-01-01

    Despite significant progress in recent years, protein structure prediction maintains its status as one of the prime unsolved problems in computational biology. One of the key remaining challenges is an efficient probabilistic exploration of the structural space that correctly reflects the relative conformational stabilities. Here, we present a fully probabilistic, continuous model of local protein structure in atomic detail. The generative model makes efficient conformational sampling possible and provides a framework for the rigorous analysis of local sequence-structure correlations in the native state...

  13. Advances in probabilistic databases for uncertain information management

    CERN Document Server

    Yan, Li

    2013-01-01

    This book covers a fast-growing topic in great depth and focuses on the technologies and applications of probabilistic data management. It aims to provide a single account of current studies in probabilistic data management. The objective of the book is to provide state-of-the-art information to researchers, practitioners, and graduate students in information technology and intelligent information processing, while at the same time serving information technology professionals faced with non-traditional applications that make the use of conventional approaches difficult or impossible.

  14. Advanced Test Reactor probabilistic risk assessment

    International Nuclear Information System (INIS)

    Atkinson, S.A.; Eide, S.A.; Khericha, S.T.; Thatcher, T.A.

    1993-01-01

    This report discusses a Level 1 probabilistic risk assessment (PRA), incorporating a full-scope external events analysis, which has been completed for the Advanced Test Reactor (ATR) located at the Idaho National Engineering Laboratory.

  15. Guidance for the definition and application of probabilistic safety criteria

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Knochenhauer, M.

    2011-05-01

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  16. Guidance for the definition and application of probabilistic safety criteria

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT Technical Research Centre of Finland (Finland)); Knochenhauer, M. (Scandpower AB (Sweden))

    2011-05-15

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  17. Probabilistic inductive inference: a survey

    OpenAIRE

    Ambainis, Andris

    2001-01-01

    Inductive inference is a recursion-theoretic theory of learning, first developed by E. M. Gold (1967). This paper surveys developments in probabilistic inductive inference. We mainly focus on finite inference of recursive functions, since this simple paradigm has produced the most interesting (and most complex) results.

  18. Making Probabilistic Relational Categories Learnable

    Science.gov (United States)

    Jung, Wookyoung; Hummel, John E.

    2015-01-01

    Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…

  19. Fully probabilistic design: the way for optimizing of concrete structures

    Directory of Open Access Journals (Sweden)

    I. Laníková

    Full Text Available Some standards for the design of concrete structures (e.g. EC2 and the original ČSN 73 1201-86) allow a structure to be designed by several methods. This contribution documents the fact that even if a structure does not comply with the partial reliability factor method according to EC2, it can satisfy the conditions during the application of the fully probabilistic approach when using the same standard. From an example of the reliability of a prestressed spun concrete pole designed by the partial factor method and by the fully probabilistic approach according to the Eurocode, it is evident that an expert should apply a more precise (though unfortunately more complicated) method in the limiting cases. The Monte Carlo method, modified by the Latin Hypercube Sampling (LHS) method, has been used for the calculation of reliability. Ultimate and serviceability limit states were checked for the partial factor method and fully probabilistic design. As a result of fully probabilistic design it is possible to obtain a more efficient design for a structure.
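
    A compact sketch of the Latin Hypercube step used in such fully probabilistic design, for a generic limit state g = R - E with one resistance and one load-effect variable; the distributions, their parameters and the units are illustrative assumptions, not the spun-pole data from the paper.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      n = 10_000

      def lhs_uniform(n, rng):
          """One-dimensional Latin Hypercube sample: one point per probability stratum."""
          return (rng.permutation(n) + rng.random(n)) / n

      # Illustrative resistance and load-effect variables (not the spun-pole data).
      R = stats.lognorm(s=0.10, scale=400.0).ppf(lhs_uniform(n, rng))   # resistance, kNm
      E = stats.norm(loc=250.0, scale=40.0).ppf(lhs_uniform(n, rng))    # load effect, kNm

      pf = np.mean(R - E < 0.0)                       # estimated failure probability
      beta = -stats.norm.ppf(max(pf, 1.0 / n))        # corresponding reliability index
      print(f"pf ~ {pf:.2e}, beta ~ {beta:.2f}")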

  20. True-slime-mould-inspired hydrostatically coupled oscillator system exhibiting versatile behaviours

    International Nuclear Information System (INIS)

    Umedachi, Takuya; Ito, Kentaro; Idei, Ryo; Ishiguro, Akio

    2013-01-01

    Behavioural diversity is an indispensable attribute of living systems, which makes them intrinsically adaptive and responsive to the demands of a dynamically changing environment. In contrast, conventional engineering approaches struggle to suppress behavioural diversity in artificial systems to reach optimal performance in given environments for desired tasks. The goals of this research include understanding the essential mechanism that endows living systems with behavioural diversity and implementing the mechanism in robots to exhibit adaptive behaviours. For this purpose, we have focused on an amoeba-like unicellular organism: the plasmodium of true slime mould. Despite the absence of a central nervous system, the plasmodium exhibits versatile spatiotemporal oscillatory patterns and switches spontaneously among these patterns. By exploiting this behavioural diversity, it is able to exhibit adaptive behaviour according to the situation encountered. Inspired by this organism, we built a real physical robot using hydrostatically coupled oscillators that produce versatile oscillatory patterns and spontaneous transitions among the patterns. The experimental results show that exploiting physical hydrostatic interplay—the physical dynamics of the robot—allows simple phase oscillators to promote versatile behaviours. The results can contribute to an understanding of how a living system generates versatile and adaptive behaviours with physical interplays among body parts. (paper)

  1. Compression of Probabilistic XML Documents

    Science.gov (United States)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

    Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such UDBMSs can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained with a combination of a PXML-specific technique and a rather simple generic DAG-compression technique.

  2. Advances in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Hardung von Hardung, H.

    1982-01-01

    Probabilistic risk analysis can now look back upon almost a quarter century of intensive development. The early studies, whose methods and results are still referred to occasionally, however, only permitted rough estimates to be made of the probabilities of recognizable accident scenarios, failing to provide a method which could have served as a reference base in calculating the overall risk associated with nuclear power plants. The first truly solid attempt was the Rasmussen Study and, partly based on it, the German Risk Study. In those studies, probabilistic risk analysis has been given a much more precise basis. However, new methodologies have been developed in the meantime, which allow much more informative risk studies to be carried out. They have been found to be valuable tools for management decisions with respect to backfitting, reinforcement and risk limitation. Today they are mainly applied by specialized private consultants and have already found widespread application especially in the USA. (orig.) [de

  3. Probabilistic structural analysis using a general purpose finite element program

    Science.gov (United States)

    Riha, D. S.; Millwater, H. R.; Thacker, B. H.

    1992-07-01

    This paper presents an accurate and efficient method to predict the probabilistic response for structural response quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis and fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.
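
    As a hedged illustration of the kind of verification described above (Monte Carlo against an analytical solution), the sketch below estimates an exceedance probability for the tip deflection of a cantilever with lognormal load and modulus and checks it against the closed-form lognormal result; the geometry, distributions and limit are illustrative assumptions, not the plate examples from the paper.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      n = 200_000

      # Cantilever tip deflection delta = P * L**3 / (3 * E * I); the load P and the
      # modulus E are lognormal with assumed (illustrative) parameters.
      L, I = 2.0, 8.0e-6                                            # m, m**4
      P = rng.lognormal(mean=np.log(1.0e3), sigma=0.15, size=n)     # N
      E = rng.lognormal(mean=np.log(70.0e9), sigma=0.05, size=n)    # Pa
      delta = P * L**3 / (3.0 * E * I)

      limit = 7.0e-3                                  # m, assumed serviceability limit
      p_mc = (delta > limit).mean()

      # delta is a product/quotient of lognormals, hence lognormal itself, so the
      # exceedance probability also has a closed form to verify the simulation against.
      mu = np.log(1.0e3) + 3.0 * np.log(L) - np.log(3.0) - np.log(70.0e9) - np.log(I)
      sigma = np.hypot(0.15, 0.05)
      p_exact = stats.norm.sf((np.log(limit) - mu) / sigma)

      print(f"Monte Carlo {p_mc:.4f}   closed form {p_exact:.4f}")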

  4. Probabilistic quantum cloning of a subset of linearly dependent states

    Science.gov (United States)

    Rui, Pinshu; Zhang, Wen; Liao, Yanlin; Zhang, Ziyun

    2018-02-01

    It is well known that a quantum state, secretly chosen from a certain set, can be probabilistically cloned with positive cloning efficiencies if and only if all the states in the set are linearly independent. In this paper, we focus on probabilistic quantum cloning of a subset of linearly dependent states. We show that a linearly independent subset of linearly dependent quantum states {|Ψ1⟩, |Ψ2⟩, …, |Ψn⟩} can be probabilistically cloned if and only if no state in the subset can be expressed as a linear superposition of the other states in the set {|Ψ1⟩, |Ψ2⟩, …, |Ψn⟩}. The optimal cloning efficiencies are also investigated.

  5. Probability in physics

    CERN Document Server

    Hemmo, Meir

    2012-01-01

    What is the role and meaning of probability in physical theory, in particular in two of the most successful theories of our age, quantum physics and statistical mechanics? Laws once conceived as universal and deterministic, such as Newton's laws of motion, or the second law of thermodynamics, are replaced in these theories by inherently probabilistic laws. This collection of essays by some of the world's foremost experts presents an in-depth analysis of the meaning of probability in contemporary physics. Among the questions addressed are: How are probabilities defined? Are they objective or subjective? What is their explanatory value? What are the differences between quantum and classical probabilities? The result is an informative and thought-provoking book for the scientifically inquisitive.

  6. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity

    Science.gov (United States)

    Pecevski, Dejan

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such a neural network could acquire the necessary information from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference. PMID:27419214

  7. The dialectical thinking about deterministic and probabilistic safety analysis

    International Nuclear Information System (INIS)

    Qian Yongbai; Tong Jiejuan; Zhang Zuoyi; He Xuhong

    2005-01-01

    There are two methods for designing and analysing the safety performance of a nuclear power plant: the traditional deterministic method and the probabilistic method. To date, the design of nuclear power plants has been based on the deterministic method. It has been proved in practice that the deterministic method is effective for current nuclear power plants. However, the probabilistic method (Probabilistic Safety Assessment - PSA) considers a much wider range of faults, takes an integrated look at the plant as a whole, and uses realistic criteria for the performance of the systems and structures of the plant. PSA can be seen, in principle, to provide a broader and more realistic perspective on safety issues than the deterministic approaches. In this paper, the historical origins and development trends of the above two methods are reviewed and summarized in brief. Based on the discussion of two application cases - one concerning changes to specific design provisions of the general design criteria (GDC) and the other the risk-informed categorization of structures, systems and components - it can be concluded that the deterministic and probabilistic methods are dialectical and unified, that they are gradually being merged into each other, and that they are being used in coordination. (authors)

  8. Probabilistic biological network alignment.

    Science.gov (United States)

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant at the expense of explosive growth in the number of alternative topologies that may arise from different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method showing that, without sacrificing the running time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature as well as the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.

  9. Growing hierarchical probabilistic self-organizing graphs.

    Science.gov (United States)

    López-Rubio, Ezequiel; Palomo, Esteban José

    2011-07-01

    Since the introduction of the growing hierarchical self-organizing map, much work has been done on self-organizing neural models with a dynamic structure. These models allow adjusting the layers of the model to the features of the input dataset. Here we propose a new self-organizing model which is based on a probabilistic mixture of multivariate Gaussian components. The learning rule is derived from the stochastic approximation framework, and a probabilistic criterion is used to control the growth of the model. Moreover, the model is able to adapt to the topology of each layer, so that a hierarchy of dynamic graphs is built. This overcomes the limitations of the self-organizing maps with a fixed topology, and gives rise to a faithful visualization method for high-dimensional data.

  10. When to conduct probabilistic linkage vs. deterministic linkage? A simulation study.

    Science.gov (United States)

    Zhu, Ying; Matsuyama, Yutaka; Ohashi, Yasuo; Setoguchi, Soko

    2015-08-01

    When unique identifiers are unavailable, successful record linkage depends greatly on data quality and the types of variables available. While probabilistic linkage theoretically captures more true matches than deterministic linkage by allowing imperfection in identifiers, studies have shown inconclusive results, likely due to variations in data quality, implementation of the linkage methodology and validation method. This simulation study aimed to understand the data characteristics that affect the performance of probabilistic vs. deterministic linkage. We created ninety-six scenarios that represent real-life situations using non-unique identifiers. We systematically introduced a range of discriminative power, rates of missingness and error, and file sizes to increase linkage patterns and difficulties. We assessed the performance difference of the linkage methods using standard validity measures and computation time. Across scenarios, deterministic linkage showed an advantage in PPV while probabilistic linkage showed an advantage in sensitivity. Probabilistic linkage uniformly outperformed deterministic linkage, as the former generated linkages with a better trade-off between sensitivity and PPV regardless of data quality. However, with low rates of missingness and error in the data, deterministic linkage did not perform significantly worse. The implementation of deterministic linkage in SAS took less than 1 min, and probabilistic linkage took 2 min to 2 h depending on file size. Our simulation study demonstrated that the intrinsic rate of missingness and error in the linkage variables is key to choosing between linkage methods. In general, probabilistic linkage was the better choice, but for exceptionally good quality data (<5% error), deterministic linkage was a more resource-efficient choice. Copyright © 2015 Elsevier Inc. All rights reserved.
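
    A small sketch contrasting the two strategies on non-unique identifiers: deterministic linkage requires exact agreement on every field, while a Fellegi-Sunter style probabilistic score tolerates a single noisy field. The field names, the match/non-match agreement probabilities (m, u) and the decision threshold are illustrative assumptions, not the values used in the simulation study.

      import math

      # Assumed agreement probabilities per field: m = P(agree | true match),
      # u = P(agree | non-match). Values are illustrative only.
      FIELDS = {"last_name": (0.95, 0.01), "birth_year": (0.98, 0.05), "zip": (0.90, 0.10)}

      def deterministic_link(a, b):
          """Link only if every identifier agrees exactly."""
          return all(a[f] == b[f] for f in FIELDS)

      def probabilistic_score(a, b):
          """Fellegi-Sunter style log2 likelihood-ratio weight."""
          w = 0.0
          for f, (m, u) in FIELDS.items():
              if a[f] == b[f]:
                  w += math.log2(m / u)
              else:
                  w += math.log2((1 - m) / (1 - u))
          return w

      rec1 = {"last_name": "smith", "birth_year": 1980, "zip": "10001"}
      rec2 = {"last_name": "smith", "birth_year": 1980, "zip": "10002"}   # one noisy field

      print(deterministic_link(rec1, rec2))         # False: a single error breaks the link
      print(probabilistic_score(rec1, rec2) > 3.0)  # True under an assumed threshold of 3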

  11. Fully probabilistic control for stochastic nonlinear control systems with input dependent noise.

    Science.gov (United States)

    Herzallah, Randa

    2015-03-01

    Robust controllers for nonlinear stochastic systems with functional uncertainties can be consistently designed using probabilistic control methods. In this paper a generalised probabilistic controller design for the minimisation of the Kullback-Leibler divergence between the actual joint probability density function (pdf) of the closed loop control system, and an ideal joint pdf is presented emphasising how the uncertainty can be systematically incorporated in the absence of reliable systems models. To achieve this objective all probabilistic models of the system are estimated from process data using mixture density networks (MDNs) where all the parameters of the estimated pdfs are taken to be state and control input dependent. Based on this dependency of the density parameters on the input values, explicit formulations to the construction of optimal generalised probabilistic controllers are obtained through the techniques of dynamic programming and adaptive critic methods. Using the proposed generalised probabilistic controller, the conditional joint pdfs can be made to follow the ideal ones. A simulation example is used to demonstrate the implementation of the algorithm and encouraging results are obtained. Copyright © 2014 Elsevier Ltd. All rights reserved.
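
    For reference, the objective described above is usually written (in the standard fully probabilistic design notation, assumed here rather than taken from the paper) as the minimisation of

        D_{\mathrm{KL}}\!\left(f \,\middle\|\, f^{I}\right) \;=\; \int f(x_{1:T}, u_{1:T}) \,\ln \frac{f(x_{1:T}, u_{1:T})}{f^{I}(x_{1:T}, u_{1:T})} \,\mathrm{d}x_{1:T}\,\mathrm{d}u_{1:T},

    where f is the joint pdf of the closed-loop state and control trajectories x_{1:T}, u_{1:T} induced by the randomised controller and f^{I} is the ideal joint pdf encoding the control objective.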

  12. Bio-inspired nanotechnology from surface analysis to applications

    CERN Document Server

    Walsh, Tiffany

    2014-01-01

    This book focuses on the use of bio-inspired and biomimetic methods for the fabrication and activation of nanomaterials. This includes studies concerning the binding of the biomolecules to the surface of inorganic structures, structure/function relationships of the final materials, and extensive discussions on the final applications of such biomimetic materials in unique applications including energy harvesting/storage, biomedical diagnostics, and materials assembly. This book also covers the sustainable features of bio-inspired nanotechnology and includes studies on the unique applications of biomimetic materials, such as energy harvesting and biomedical diagnostics. Bio-Inspired Nanotechnology: From Surface Analysis to Applications is an ideal book for researchers, students, nanomaterials engineers, bioengineers, chemists, biologists, physicists, and medical researchers.

  13. Pramana – Journal of Physics | Indian Academy of Sciences

    Indian Academy of Sciences (India)

    Articles written in Pramana – Journal of Physics: Volume 84, Issue 2, February 2015, pp 295-308. Matrix models with Penner interaction inspired by interacting ribonucleic acid, by Pradeep Bhadola and N Deo. The Penner ...

  14. Biologically inspired coupled antenna beampattern design

    Energy Technology Data Exchange (ETDEWEB)

    Akcakaya, Murat; Nehorai, Arye, E-mail: makcak2@ese.wustl.ed, E-mail: nehorai@ese.wustl.ed [Department of Electrical and Systems Engineering, Washington University in St Louis, St Louis, MO 63130 (United States)

    2010-12-15

    We propose to design a small-size transmission-coupled antenna array, and corresponding radiation pattern, having high performance inspired by the female Ormia ochracea's coupled ears. For reproduction purposes, the female Ormia is able to locate male crickets' call accurately despite the small distance between its ears compared with the incoming wavelength. This phenomenon has been explained by the mechanical coupling between the Ormia's ears, which has been modeled by a pair of differential equations. In this paper, we first solve these differential equations governing the Ormia ochracea's ear response, and convert the response to the pre-specified radio frequencies. We then apply the converted response of the biological coupling in the array factor of a uniform linear array composed of finite-length dipole antennas, and also include the undesired electromagnetic coupling due to the proximity of the elements. Moreover, we propose an algorithm to optimally choose the biologically inspired coupling for maximum array performance. In our numerical examples, we compute the radiation intensity of the designed system for binomial and uniform ordinary end-fire arrays, and demonstrate the improvement in the half-power beamwidth, sidelobe suppression and directivity of the radiation pattern due to the biologically inspired coupling.
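
    For orientation, the mechanically coupled two-receiver model referred to above is commonly written as a two-degree-of-freedom system of the form (parameter names and sign conventions here are generic assumptions, not the exact formulation used by the authors)

        M\ddot{x}(t) + C\dot{x}(t) + Kx(t) = f(t), \qquad x(t) = \begin{bmatrix} x_1(t) \\ x_2(t) \end{bmatrix},

    where x_1 and x_2 are the responses of the two coupled receivers, f collects the incident pressures, and the off-diagonal entries of the damping and stiffness matrices C and K represent the intertympanal coupling; solving this system in the frequency domain gives the coupled response that is then mapped to the radio-frequency array factor.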

  15. Probabilistic and sensitivity analysis of Botlek Bridge structures

    Directory of Open Access Journals (Sweden)

    Králik Juraj

    2017-01-01

    Full Text Available This paper deals with the probabilistic and sensitivity analysis of the largest movable lift bridge in the world. The bridge system consists of six reinforced concrete pylons and two steel decks, each weighing 4000 tons, connected through ropes with counterweights. The paper focuses on the probabilistic and sensitivity analysis as the basis of the dynamic study in the design process of the bridge. The results were of high importance for the practical application and design of the bridge. The model and resistance uncertainties were taken into account in the LHS simulation method.

  16. Evaluation of Nonparametric Probabilistic Forecasts of Wind Power

    DEFF Research Database (Denmark)

    Pinson, Pierre; Møller, Jan Kloppenborg; Nielsen, Henrik Aalborg

    Predictions of wind power production for horizons up to 48-72 hours ahead comprise a highly valuable input to the methods for the daily management or trading of wind generation. Today, users of wind power predictions are not only provided with point predictions, which are estimates of the most likely outcome for each look-ahead time, but also with uncertainty estimates given by probabilistic forecasts. In order to avoid assumptions on the shape of predictive distributions, these probabilistic predictions are produced from nonparametric methods, and then take the form of a single or a set...
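
    A hedged illustration of one standard score for nonparametric, quantile-based probabilistic forecasts of this kind, the pinball (quantile) loss; the predictive quantiles and the observation below are made-up numbers, not results from the paper.

      def pinball_loss(observation, quantile_forecast, tau):
          """Pinball loss of a single predictive quantile at nominal level tau."""
          diff = observation - quantile_forecast
          return tau * diff if diff >= 0 else (tau - 1.0) * diff

      # Assumed predictive quantiles (MW) for one look-ahead time, and the outcome.
      quantiles = {0.1: 12.0, 0.5: 20.0, 0.9: 31.0}
      observed = 24.0

      score = sum(pinball_loss(observed, q, tau) for tau, q in quantiles.items()) / len(quantiles)
      print(f"average pinball loss: {score:.2f} MW")   # lower is better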

  17. Foundations of the probabilistic mechanics of discrete media

    CERN Document Server

    Axelrad, D R

    1984-01-01

    This latest volume in the Foundations & Philosophy of Science & Technology series provides an account of probabilistic functional analysis and shows its applicability in the formulation of the behaviour of discrete media with the inclusion of microstructural effects. Although quantum mechanics have long been recognized as a stochastic theory, the introduction of probabilistic concepts and principles to classical mechanics has in general not been attempted. In this study the author takes the view that the significant field quantities of a discrete medium are random variables or functions of s

  18. Probabilistic Mobility Models for Mobile and Wireless Networks

    DEFF Research Database (Denmark)

    Song, Lei; Godskesen, Jens Christian

    2010-01-01

    In this paper we present a probabilistic broadcast calculus for mobile and wireless networks whose connections are unreliable. In our calculus, broadcasted messages can be lost with a certain probability, and due to mobility the connection probabilities may change. If a network broadcasts a message from a location, it will evolve to a network distribution depending on whether nodes at other locations receive the message or not. Mobility of locations is not arbitrary but guarded by a probabilistic mobility function (PMF), and we also define the notion of a weak bisimulation given a PMF...

  19. Probabilistic safety assessment for seismic events

    International Nuclear Information System (INIS)

    1993-10-01

    This Technical Document on Probabilistic Safety Assessment for Seismic Events is mainly associated with the Safety Practice on Treatment of External Hazards in PSA and discusses in detail one specific external hazard, i.e. earthquakes

  20. Does a more sophisticated storm erosion model improve probabilistic erosion estimates?

    NARCIS (Netherlands)

    Ranasinghe, R.W.M.R.J.B.; Callaghan, D.; Roelvink, D.

    2013-01-01

    The dependency between the accuracy/uncertainty of storm erosion exceedance estimates obtained via a probabilistic model and the level of sophistication of the structural function (storm erosion model) embedded in the probabilistic model is assessed via the application of Callaghan et al.'s (2008)