Probabilistic methods for physics
International Nuclear Information System (INIS)
Cirier, G
2013-01-01
We present an asymptotic method giving the probability of presence of the iterated points of R^d under a polynomial function f. We use the well-known Perron-Frobenius (PF) operator, which leaves certain sets and measures invariant under f. Probabilistic solutions can exist for the deterministic iteration. While the theoretical result is already known, here we quantify these probabilities. This approach seems useful for computations in situations where deterministic methods fail. Among the examined applications are asymptotic solutions of the Lorenz, Navier-Stokes and Hamilton equations. In this approach, linearity induces many difficult problems, not all of which have yet been resolved.
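As an illustration of the idea that a deterministic iteration can carry a probabilistic description, the sketch below histograms the orbit of a polynomial map to approximate a density left invariant by its Perron-Frobenius operator. The logistic map is our illustrative choice, not the paper's example:

```python
import numpy as np

def invariant_density(f, x0=0.2, n_burn=1000, n_iter=200000, bins=50):
    # Iterate the map, discard the transient, and histogram the orbit:
    # the normalized histogram approximates a density that the
    # Perron-Frobenius operator of f leaves invariant.
    x = x0
    for _ in range(n_burn):
        x = f(x)
    orbit = np.empty(n_iter)
    for i in range(n_iter):
        x = f(x)
        orbit[i] = x
    hist, edges = np.histogram(orbit, bins=bins, range=(0.0, 1.0), density=True)
    return hist, edges

# Logistic map f(x) = 4x(1-x): its known invariant density,
# 1/(pi*sqrt(x(1-x))), peaks at the ends of [0, 1].
hist, edges = invariant_density(lambda x: 4.0 * x * (1.0 - x))
```

The histogram concentrates near 0 and 1, matching the analytic invariant density of this map.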
A Probabilistic Recommendation Method Inspired by Latent Dirichlet Allocation Model
Directory of Open Access Journals (Sweden)
WenBo Xie
2014-01-01
The recent decade has witnessed the increasing popularity of recommendation systems, which help users acquire relevant knowledge, commodities, and services from the overwhelming ocean of information on the Internet. Latent Dirichlet Allocation (LDA), originally presented as a graphical model for text topic discovery, has since found application in many other disciplines. In this paper, we propose an LDA-inspired probabilistic recommendation method that treats the user-item collecting behavior as a two-step process: every user first becomes a member of one latent user-group with a certain probability, and each user-group then collects various items with different probabilities. Gibbs sampling is employed to approximate all the probabilities in the two-step process. Experimental results on three real-world data sets, MovieLens, Netflix, and Last.fm, show that our method exhibits competitive performance on precision, coverage, and diversity in comparison with four other typical recommendation methods. Moreover, we present an approximation strategy to reduce the computational complexity of our method with only a slight degradation in performance.
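The two-step scoring rule described above can be sketched as follows; the group and item probabilities here are hypothetical stand-ins for the quantities the paper estimates with Gibbs sampling:

```python
import numpy as np

rng = np.random.default_rng(0)
n_users, n_groups, n_items = 4, 2, 5

# Hypothetical parameters; in the paper these are estimated by Gibbs sampling.
# theta[u, g] = P(user u belongs to latent group g)
# phi[g, i]   = P(group g collects item i)
theta = rng.dirichlet(np.ones(n_groups), size=n_users)
phi = rng.dirichlet(np.ones(n_items), size=n_groups)

# Two-step process: P(item i | user u) = sum_g theta[u, g] * phi[g, i].
scores = theta @ phi

# Recommend each user the item with the highest score.
top_items = scores.argmax(axis=1)
```

Because each row of `theta` and `phi` is a probability distribution, each row of `scores` is one as well.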
Kornyshev, Alexei A.
2010-10-01
The conference 'From DNA-Inspired Physics to Physics-Inspired Biology' (1-5 June 2009, International Center for Theoretical Physics, Trieste, Italy), which I organized together with two former presidents of the American Biophysical Society, Wilma Olson (Rutgers University) and Adrian Parsegian (NIH), with the support of an ICTP team (Ralf Gebauer, local organizer, and Doreen Sauleek, conference secretary), was intended to establish stronger links between the biology and physics communities on the DNA front. The relationship between them was never easy. In 1997, Adrian published a paper in Physics Today ('Harness the Hubris') summarizing his thoughts about the main obstacles to successful collaboration. The bottom line of that article was that physicists must seriously learn biology before exploring it, and that even having an interpreter, a friend or co-worker who cooperates with you and translates the problems of biology into physical language, may not be enough. He started his story with a joke about a physicist asking a biologist: 'I want to study the brain. Tell me something about it!' Biologist: 'First, the brain consists of two parts, and...' Physicist: 'Stop. You have told me too much.' Adrian listed a few direct avenues where physicists' contributions might be particularly welcome. This gentle and elegantly written paper caused, however, a stormy reaction from Bob Austin (Princeton), published together with Adrian's notes, accusing Adrian of forbidding physicists to attack big questions in biology straightaway. Twelve years have passed and many new developments have taken place in the biologist-physicist interaction. This was something I addressed in my opening conference speech, with my position lying somewhere in between Parsegian's and Austin's, and it is briefly outlined here. I will first recall certain precepts or 'dogmas' that fly in the air like Valkyries, poisoning those relationships. Since the early seventies when I was a first year Ph
Mathur, Deepak
2015-01-01
This Topical Review presents an overview of the increasingly robust interconnects being established between atomic, molecular and optical (AMO) physics and the life sciences. AMO physics, outgrowing its historical role as a facilitator (a provider of optical methodologies, for instance), now seeks to partner biology in its quest to link systems-level descriptions of biological entities to insights based on molecular processes. Of course, perspectives differ when AMO physicists and biologists consider various processes. For instance, while AMO physicists link molecular properties and dynamics to potential energy surfaces, these give way to energy landscapes in considerations of protein dynamics. But there are also similarities: tunnelling and non-adiabatic transitions occur both in protein dynamics and in molecular dynamics. We bring to the fore some such differences and similarities; we consider imaging techniques based on AMO concepts, like 4D fluorescence microscopy, which allows access to the dynamics of cellular processes; multiphoton microscopy, which offers built-in confocality; and microscopy with femtosecond laser beams that saturates the suppression of fluorescence in spatially controlled fashion so as to circumvent the diffraction limit. Beyond imaging, AMO physics contributes optical traps that probe the mechanical and dynamical properties of single ‘live’ cells, highlighting differences between healthy and diseased cells. Trap methodologies have also begun to probe the dynamics governing neural stem cells adhering to each other to form neurospheres and, with squeezed light, to probe sub-diffusive motion of yeast cells. Strong-field science contributes not only by providing a source of energetic electrons and γ-rays via laser-plasma acceleration schemes, but also via filamentation and supercontinuum generation, bringing mainstream collision physics into play in diverse processes like DNA damage induced by low-energy collisions to
Perspective of an Artist Inspired by Physics
Sanborn, Jim
2010-02-01
Using digital images and video, I will be presenting thirty years of my science-based artwork. Beginning in the late 1970s, my gallery and museum installations used lodestones and suspended compasses to reveal the Earth's magnetic field. Through the 1980s my work included these compass installations and geologically inspired tableaux that had one thing in common: they were designed to expose the invisible forces of nature. Tectonics, the Coriolis force, and magnetism were among the subjects of study. In 1988, on the basis of my work with invisible forces, I was selected for a commission from the General Services Administration for the new Central Intelligence Agency headquarters in Langley, Virginia. This work, titled Kryptos, included a large cryptographic component that remains undeciphered twenty years after its installation. In the 1990s, Kryptos inspired several of my museum and gallery installations using cryptography and secrecy as their main themes. From 1995 to 1998 I completed a series of large-format projections on the landscape in the western US and Ireland. These projections and the resulting series of photographs emulated the 19th-century cartographers hired by the United States Government to map the western landscape. In 1998 I began my project titled Atomic Time. This installation, shown for the first time in 2004 at the Corcoran Gallery in Washington DC and then again at the Gwangju Biennale in South Korea, was a recreation of the 1944 Manhattan Project laboratory that built the first atomic bomb. It used original equipment and prototypes from the Los Alamos Lab and was an extremely accurate representation of the laboratory and the first nuclear bomb, called the ``Trinity Device.'' I began my current project, Terrestrial Physics, in 2005. This installation, to be shown in June 2010 at the Museum of Contemporary Art in Denver, is a recreation of the large particle accelerator and the experiment that fissioned uranium in 1939 at the Carnegie
INSPIRE - Premission [Interactive NASA Space Physics Ionosphere Radio Experiment]
Taylor, William W. L.; Mideke, Michael; Pine, William E.; Ericson, James D.
1992-01-01
The Interactive NASA Space Physics Ionosphere Radio Experiment (INSPIRE), designed to assist in a Space Experiments with Particle Accelerators (SEPAC) project, is discussed. INSPIRE aims to record data from a large number of receivers on the ground to determine the exact propagation paths and absorption of radio waves at frequencies between 50 Hz and 7 kHz. Information is given on how to participate in the experiment, which will involve high school classes, colleges, and amateur radio operators.
Chain Experiment competition inspires learning of physics
Dziob, Daniel; Górska, Urszula; Kołodziej, Tomasz
2017-05-01
The Chain Experiment is an annual competition which originated in Slovenia in 2005 and expanded to Poland in 2013. For the event, each participating team designs and builds a contraption that transports a small steel ball from one end to the other. The constructed machine must at the same time employ a number of interesting phenomena and physics laws. In the competition’s finale, all contraptions are connected to each other to form a long chain transporting steel balls. They are evaluated for qualities such as creativity, the sophistication of the underlying physics, and the reliability with which the machine works without human help. In this article, we present contraptions developed by students taking part in the competition, demonstrating the participants’ strong theoretical grounding together with their creativity in design and outstanding engineering skills. Furthermore, we situate the Chain Experiment in the context of other group competitions, demonstrating that, besides activating numerous group-work skills, it also improves the ability to think critically and present one’s knowledge to a broader audience. We discuss it in the context of problem-based learning, gamification and collaborative testing.
INSPIRE: Interactive NASA Space Physics Ionosphere Radio Experiment
Franzen, K. A.; Garcia, L. N.; Webb, P. A.; Green, J. L.
2007-12-01
The INSPIRE Project is a non-profit scientific and educational corporation whose objective is to bring the excitement of observing very low frequency (VLF) natural radio waves to high school students. Underlying this objective is the conviction that science and technology are the underpinnings of our modern society, and that only with an understanding of these disciplines can people make informed decisions in their lives. Since 1989, the INSPIRE Project has provided specially designed radio receiver kits to over 2,500 students and other groups to make observations of signals in the VLF frequency range. These kits provide an innovative and unique opportunity for students to actively gather data that can be used in a basic research project. Natural VLF emissions that can be studied with the INSPIRE receiver kits include sferics, tweeks, whistlers, and chorus, which originate from phenomena such as lightning. These emissions can come either from the local atmospheric environment within a few tens of kilometers of the receiver or from outer space thousands of kilometers from the Earth. VLF emissions are at such low frequencies that they can be received, amplified and turned into sound that we can hear, with each emission producing a distinctive sound. In 2006 INSPIRE was re-branded, and its mission has expanded to developing new partnerships with multiple science projects. Links to magnetospheric physics, astronomy, and meteorology are being identified. This presentation will introduce the INSPIRE project, display the INSPIRE receiver kits, show examples of the types of VLF emissions that can be collected, and provide information on scholarship programs being offered.
Convolution product construction of interactions in probabilistic physical models
International Nuclear Information System (INIS)
Ratsimbarison, H.M.; Raboanary, R.
2007-01-01
This paper aims to give a probabilistic construction of interactions which may be relevant for building physical theories such as interacting quantum field theories. We start with the path-integral definition of the partition function in quantum field theory, which reminds us of the probabilistic nature of this physical theory. From a Gaussian law regarded as the free theory, an interacting theory is constructed by a nontrivial convolution product between the free theory and an interaction term which is also a probability law. The resulting theory, again a probability law, exhibits two properties already present in current theories of interactions such as gauge theory: the interaction term does not depend on the free term, and two different free theories can be implemented with the same interaction.
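The construction can be sketched numerically: convolve a Gaussian "free" law with a second probability law standing in for the interaction term. The Laplace density below is our illustrative choice, not the paper's:

```python
import numpy as np

dx = 0.01
x = np.arange(-10.0, 10.0, dx)

# Free theory: a Gaussian probability law.
free = np.exp(-x**2 / 2.0) / np.sqrt(2.0 * np.pi)

# Interaction term, itself a probability law (an illustrative Laplace density).
inter = 0.5 * np.exp(-np.abs(x))

# Nontrivial convolution product of the two laws: the density of the sum of
# independent draws, which is again a probability law.
full = np.convolve(free, inter, mode="same") * dx
```

Up to discretization and tail truncation, the result integrates to one, illustrating that the convolution of two probability laws is again a probability law.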
Physics of collapses. Probabilistic occurrence of ELMs and crashes
International Nuclear Information System (INIS)
Itoh, S.-I.; Toda, S.; Yagi, M.; Itoh, K.; Fukuyama, A.
1997-01-01
A statistical picture of the collapse is proposed. The physical picture of crash phenomena, which is based on the turbulence-turbulence transition, is extended to include the statistical variance of observables. The dynamics of the plasma gradient and the turbulence level is studied, taking into account the hysteresis in the flux-gradient relation. Probabilistic excitation is predicted, and the critical condition is described by a statistical probability. (author)
The physical world an inspirational tour of fundamental physics
Manton, Nicholas
2017-01-01
The Physical World offers a grand vision of the essential unity of physics that will enable the reader to see the world through the eyes of a physicist and understand their way of thinking. The text follows Einstein's dictum that 'explanations should be made as simple as possible, but no simpler' to give an honest account of how modern physicists understand their subject, including the shortcomings of current theory. The result is an up-to-date and engaging portrait of physics that contains concise derivations of the important results, in a style where every step in a derivation is clearly explained, so that anyone with the appropriate mathematical skills will find the text easy to digest. It is over half a century since The Feynman Lectures on Physics were published. A new authoritative account of fundamental physics covering all branches of the subject is now well overdue. The Physical World has been written to satisfy this need. The book concentrates on the conceptual principles of each branch of physics and sho...
Effective Practices for Training and Inspiring High School Physics Teachers
Magee-Sauer, Karen
It is well-documented that there is a nationwide shortage of highly qualified high school physics teachers. Not surprising, institutions of higher education report that the most common number of physics teacher graduates is zero with the majority of institutions graduating less than two physics teachers per year. With these statistics in mind, it is critical that institutions take a careful look at how they recruit, train, and contribute to the retention of high school physics teachers. PhysTEC is a partnership between the APS and AAPT that is dedicated to improving and promoting the education of high school physics teachers. Primarily funded by the NSF and its partnering organizations, PhysTEC has identified key components that are common to successful physics teacher preparation programs. While creating a successful training program in physics, it is also important that students have the opportunity for a ``do-able'' path to certification that does not add further financial debt. This talk will present an overview of ``what works'' in creating a path for physics majors to a high school physics teaching career, actions and activities that help train and inspire pre-service physics teachers, and frameworks that provide the support for in-service teachers. Obstacles to certification and the importance of a strong partnership with colleges of education will be discussed. Several examples of successful physics high school teacher preparation programs will be presented. This material is part of the Physics Teacher Education Coalition project, which is based upon work supported by the National Science Foundation under Grant Nos. 0808790, 0108787, and 0833210.
Machine learning, computer vision, and probabilistic models in jet physics
CERN. Geneva; NACHMAN, Ben
2015-01-01
In this talk we present recent developments in the application of machine learning, computer vision, and probabilistic models to the analysis and interpretation of LHC events. First, we will introduce the concept of jet-images and computer vision techniques for jet tagging. Jet images enabled the connection between jet substructure and tagging with the fields of computer vision and image processing for the first time, improving the performance to identify highly boosted W bosons with respect to state-of-the-art methods, and providing a new way to visualize the discriminant features of different classes of jets, adding a new capability to understand the physics within jets and to design more powerful jet tagging methods. Second, we will present Fuzzy jets: a new paradigm for jet clustering using machine learning methods. Fuzzy jets view jet clustering as an unsupervised learning task and incorporate a probabilistic assignment of particles to jets to learn new features of the jet structure. In particular, we wi...
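The soft (probabilistic) particle-to-jet assignment behind fuzzy jets can be sketched on toy data. The EM-style update with a fixed-width Gaussian kernel below is our simplified stand-in for the talk's actual algorithm, and the "event" is synthetic:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy "event": particles scattered around two jet axes in the (eta, phi)
# plane (synthetic data, not an LHC event).
axes_true = np.array([[0.0, 0.0], [2.0, 2.0]])
particles = np.vstack([c + 0.1 * rng.normal(size=(30, 2)) for c in axes_true])

# Soft clustering in the spirit of fuzzy jets: each particle receives a
# probabilistic membership in each jet instead of a hard assignment.
mu = particles[[0, 30]].copy()   # one seed taken from each cluster
sigma2 = 0.04                    # fixed jet "width" (illustrative)
for _ in range(50):
    d2 = ((particles[:, None, :] - mu[None, :, :]) ** 2).sum(-1)
    r = np.exp(-d2 / (2.0 * sigma2))
    r /= r.sum(axis=1, keepdims=True)                 # soft memberships
    mu = (r.T @ particles) / r.sum(axis=0)[:, None]   # update jet axes
```

The memberships `r` are the "fuzzy" quantity: a particle between two jets contributes partially to both axes rather than being forced into one.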
Tischhauser, Karen
2015-01-01
Students need inspiration to write. Assigning is not teaching. In order to inspire students to write fiction worth reading, teachers must take them through the process of writing. Physical objects inspire good writing with depth. In this article, the reader will be taken through the process of inspiring young writers through the use of boxes.…
Statistical physics of medical diagnostics: Study of a probabilistic model.
Mashaghi, Alireza; Ramezanpour, Abolfazl
2018-03-01
We study a diagnostic strategy which is based on the anticipation of the diagnostic process by simulation of the dynamical process starting from the initial findings. We show that such a strategy could result in more accurate diagnoses compared to a strategy that is solely based on the direct implications of the initial observations. We demonstrate this by employing the mean-field approximation of statistical physics to compute the posterior disease probabilities for a given subset of observed signs (symptoms) in a probabilistic model of signs and diseases. A Monte Carlo optimization algorithm is then used to maximize an objective function of the sequence of observations, which favors the more decisive observations resulting in more polarized disease probabilities. We see how the observed signs change the nature of the macroscopic (Gibbs) states of the sign and disease probability distributions. The structure of these macroscopic states in the configuration space of the variables affects the quality of any approximate inference algorithm (so the diagnostic performance) which tries to estimate the sign-disease marginal probabilities. In particular, we find that the simulation (or extrapolation) of the diagnostic process is helpful when the disease landscape is not trivial and the system undergoes a phase transition to an ordered phase.
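The posterior disease probabilities the abstract refers to can be illustrated on a toy sign-disease model. The numbers and the noisy-OR likelihood below are our illustrative assumptions, and exact enumeration stands in for the paper's mean-field machinery, which becomes necessary when the model is large:

```python
import itertools
import numpy as np

# Toy sign-disease network with hypothetical numbers (not from the paper):
# prior[d] = P(disease d present); q[d, s] = P(disease d fails to cause sign s).
prior = np.array([0.1, 0.3])
q = np.array([[0.2, 0.9, 0.5],
              [0.8, 0.3, 0.4]])

def p_sign(d, s):
    # Noisy-OR likelihood: a sign appears unless every present disease
    # independently fails to produce it.
    miss = 1.0
    for k, present in enumerate(d):
        if present:
            miss *= q[k, s]
    return 1.0 - miss

observed = {0: 1, 1: 0}   # sign 0 observed present, sign 1 observed absent

# Exact posterior by enumerating the four disease configurations.
post, Z = np.zeros(2), 0.0
for d in itertools.product([0, 1], repeat=2):
    w = np.prod([prior[k] if d[k] else 1.0 - prior[k] for k in range(2)])
    for s, val in observed.items():
        ps = p_sign(d, s)
        w *= ps if val else 1.0 - ps
    Z += w
    post += w * np.array(d)
post /= Z
```

Observing sign 0 (which disease 0 produces with high probability) raises the posterior for disease 0 well above its prior, the kind of belief update the diagnostic process chains together.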
A Physics-Inspired Introduction to Political Science
Taagepera, Rein
1976-01-01
This paper analyzes what is involved in patterning part of an introduction to politics along the lines of physical sciences, and it presents contents and results of a course in which the author did this. (Author/ND)
Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models
Energy Technology Data Exchange (ETDEWEB)
Cetiner, Mustafa Sacit; none,; Flanagan, George F. [ORNL; Poore III, Willis P. [ORNL; Muhlheim, Michael David [ORNL
2014-07-30
An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C++, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
Statistical physics inspired energy-efficient coded-modulation for optical communications.
Djordjevic, Ivan B; Xu, Lei; Wang, Ting
2012-04-15
Because Shannon's entropy can be obtained by Stirling's approximation of the thermodynamic entropy, statistical-physics energy-minimization methods are directly applicable to signal constellation design. We demonstrate that statistical-physics-inspired energy-efficient (EE) signal constellation designs, in combination with large-girth low-density parity-check (LDPC) codes, significantly outperform conventional LDPC-coded polarization-division multiplexed quadrature amplitude modulation schemes. We also describe an EE signal constellation design algorithm. Finally, we propose a discrete-time implementation of the D-dimensional transceiver and the corresponding EE polarization-division multiplexed system. © 2012 Optical Society of America
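A minimal sketch of what statistical-physics energy minimization looks like for a constellation: anneal a set of points toward low pairwise "repulsion" energy under a unit-average-power constraint. The energy function and annealing schedule are illustrative choices, not the authors' algorithm:

```python
import numpy as np

rng = np.random.default_rng(1)
M = 8   # constellation size (illustrative)

def energy(p):
    # Pairwise "repulsion" energy: low when points are well separated,
    # which at fixed average power is what an EE constellation wants.
    d = np.sqrt(((p[:, None, :] - p[None, :, :]) ** 2).sum(-1))
    iu = np.triu_indices(len(p), 1)
    return (1.0 / d[iu]).sum()

def normalize(p):
    # Enforce the unit-average-power design constraint.
    return p / np.sqrt((p ** 2).sum(axis=1).mean())

pts = normalize(rng.normal(size=(M, 2)))
e0 = energy(pts)

# Simulated annealing with the Metropolis acceptance rule.
T = 1.0
for _ in range(5000):
    cand = normalize(pts + 0.05 * rng.normal(size=pts.shape))
    dE = energy(cand) - energy(pts)
    if dE < 0 or rng.random() < np.exp(-dE / T):
        pts = cand
    T *= 0.999   # cooling schedule
```

The annealed points spread out over the power shell, lowering the energy relative to the random start.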
INSPIRE: Managing Metadata in a Global Digital Library for High-Energy Physics
Martin Montull, Javier
2011-01-01
Four leading laboratories in the High-Energy Physics (HEP) field are collaborating to roll out the next-generation scientific information portal: INSPIRE. The goal of this project is to replace the popular 40-year-old SPIRES database. INSPIRE already provides access to about 1 million records and includes services such as fulltext search, automatic keyword assignment, ingestion and automatic display of LaTeX, citation analysis, automatic author disambiguation, metadata harvesting, extraction of figures from fulltext and search in figure captions. In order to achieve high-quality metadata, both automatic processing and manual curation are needed. The different tools available in the system use modern web technologies to provide curators with maximum efficiency while dealing with the MARC standard format. The project is under heavy development in order to provide new features including semantic analysis, crowdsourcing of metadata curation, user tagging, recommender systems, integration of OAIS standards a...
INSPIRE: Realizing the dream of a global digital library in High-Energy Physics
Holtkamp, Annette; Simko, Tibor; Smith, Tim
2010-01-01
High-Energy Physics (HEP) has a long tradition of pioneering infrastructures for scholarly communication, and four leading laboratories are now rolling out the next-generation digital library for the field: INSPIRE. This is an evolution of the extraordinarily successful, 40-year-old SPIRES database. Based on the Invenio software, INSPIRE already provides seamless access to almost 1 million records, which will be expanded to cover multimedia, data, software and wikis. Services offered include citation analysis, fulltext search, extraction of figures from fulltext and search in figure captions, automatic keyword assignment, metadata harvesting, retrodigitization, ingestion and automatic display of LaTeX, and storage of supplementary materials like Mathematica notebooks. New services are in different phases of design or implementation, in strategic partnerships with all other information providers in the field and neighbouring disciplines, including automatic author disambiguation, user tagging, crowdsourcing of m...
A Physics-Inspired Mechanistic Model of Migratory Movement Patterns in Birds.
Revell, Christopher; Somveille, Marius
2017-08-29
In this paper, we introduce a mechanistic model of migratory movement patterns in birds, inspired by ideas and methods from physics. Previous studies have shed light on the factors influencing bird migration but have mainly relied on statistical correlative analysis of tracking data. Our novel method offers a bottom-up explanation of population-level migratory movement patterns. It differs from previous mechanistic models of animal migration and enables prediction of pathways and destinations from a given starting location. We define an environmental potential landscape from environmental data and simulate bird movement within this landscape based on simple decision rules drawn from statistical mechanics. We explore the capacity of the model by qualitatively comparing simulation results to the non-breeding migration patterns of a seabird species, the Black-browed Albatross (Thalassarche melanophris). This minimal, two-parameter model captured remarkably well the previously documented migration patterns of the Black-browed Albatross, with the best combination of parameter values conserved across multiple geographically separate populations. Our physics-inspired mechanistic model could be applied to other bird species and other highly mobile species, improving our understanding of the relative importance of the various factors driving migration and yielding predictions that could be useful for conservation.
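Decision rules "drawn from statistical mechanics" can be illustrated on a one-dimensional toy landscape: at each step the bird chooses among neighbouring sites with Boltzmann-weighted probabilities. The potential values and temperature below are hypothetical, not fitted to albatross data:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 1-D environmental potential: lower is more favourable
# (the paper builds its landscape from real environmental data).
U = np.array([5.0, 4.0, 3.0, 2.0, 1.0, 0.0, 1.0, 2.0])
T = 0.5   # "temperature": how noisily the bird follows the gradient

def step(i):
    # Candidate moves: stay put or hop to a neighbouring site.
    cands = [j for j in (i - 1, i, i + 1) if 0 <= j < len(U)]
    w = np.exp(-(U[cands] - U[i]) / T)   # Boltzmann weights
    return rng.choice(cands, p=w / w.sum())

pos = 0
for _ in range(200):
    pos = step(pos)
```

With low temperature the walker drifts into the potential well; raising `T` makes the trajectory noisier, which is the role the model's temperature-like parameter plays.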
Gueddana, Amor; Attia, Moez; Chatta, Rihab
2015-03-01
In this work, we study the error sources behind the imperfect linear-optical quantum components composing a non-deterministic quantum CNOT gate model, which performs the CNOT function with a success probability of 4/27 and uses a double-encoding technique to represent photonic qubits at the control and the target. We generalize this model to an abstract probabilistic CNOT version and determine the limits of realizability for a realistic range of the errors. Finally, we discuss physical constraints on implementing the Asymmetric Partially Polarizing Beam Splitter (APPBS), which is at the heart of correctly realizing the CNOT function.
AUTHOR|(CDS)2266999
2017-01-01
CERN has been involved in the dissemination of scientific results since its early days and has continuously updated its distribution channels. Currently, Inspire hosts catalogues of articles, authors, institutions, conferences, jobs, experiments, journals and more. Successful orientation among this amount of data requires comprehensive linking between the content. Inspire previously lacked a system for linking experiments and articles together based on the accelerator at which they were conducted. The purpose of this project has been to create such a system. Records for 156 accelerators were created, and all 2913 experiments on Inspire were given corresponding MARC tags. Records of 18404 accelerator-physics-related bibliographic entries were also tagged with corresponding accelerator tags. Finally, as part of the endeavour to broaden CERN's presence on Wikipedia, existing Wikipedia articles on accelerators were updated with short descriptions and links to Inspire. In total, 86 Wikipedia articles were updated. This repo...
Dynamic modeling of physical phenomena for probabilistic assessment of spent fuel accidents
International Nuclear Information System (INIS)
Benjamin, A.S.
1997-01-01
If there should be an accident involving drainage of all the water from a spent fuel pool, the fuel elements will heat up until the heat produced by radioactive decay is balanced by that removed by natural convection to air, thermal radiation, and other means. If the temperatures become high enough for the cladding or other materials to ignite due to rapid oxidation, then some of the fuel might melt, leading to an undesirable release of radioactive materials. The amount of melting is dependent upon the fuel loading configuration and its age, the oxidation and melting characteristics of the materials, and the potential effectiveness of recovery actions. The authors have developed methods for modeling the pertinent physical phenomena and integrating the results with a probabilistic treatment of the uncertainty distributions. The net result is a set of complementary cumulative distribution functions for the amount of fuel melted
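The steady-state heat balance described above, decay heat equated with removal by natural convection and thermal radiation, can be sketched numerically. This is a minimal illustration, not the authors' model: the convection coefficient, emissivity, surface area, and decay power below are all hypothetical values chosen only to show the calculation.

```python
import math


def equilibrium_temp(q_decay_w, area_m2, h_conv=5.0, emissivity=0.8, t_air=300.0):
    """Solve q_decay = h*A*(T - T_air) + eps*sigma*A*(T^4 - T_air^4) for T by bisection.

    All coefficients are illustrative assumptions, not values from the study.
    """
    sigma = 5.670e-8  # Stefan-Boltzmann constant, W/m^2/K^4

    def residual(t):
        # Heat removed minus heat produced; monotonically increasing in t
        return (h_conv * area_m2 * (t - t_air)
                + emissivity * sigma * area_m2 * (t ** 4 - t_air ** 4)
                - q_decay_w)

    lo, hi = t_air, 3000.0  # bracket: residual < 0 at ambient, > 0 at the cap
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if residual(mid) > 0:
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)


# Hypothetical 5 kW assembly with 4 m^2 of heat-transfer surface
T = equilibrium_temp(5000.0, 4.0)
```

Raising the decay power or degrading convection pushes the equilibrium temperature up, which is the regime where cladding oxidation and ignition become the concern discussed in the abstract.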
DEFF Research Database (Denmark)
Lepech, Michael; Geiker, Mette; Michel, Alexander
This paper looks to address the grand challenge of integrating construction materials engineering research within a multi-scale, inter-disciplinary research and management framework for sustainable concrete infrastructure. The ultimate goal is to drive sustainability-focused innovation and adoption...... cycles in the broader architecture, engineering, construction (AEC) industry. Specifically, a probabilistic design framework for sustainable concrete infrastructure and a multi-physics service life model for reinforced concrete are presented as important points of integration for innovation between...... design, consists of concrete service life models and life cycle assessment (LCA) models. Both types of models (service life and LCA) are formulated stochastically so that the service life and time(s) to repair, as well as total sustainability impact, are described by a probability distribution. A central...
International Nuclear Information System (INIS)
Benjamin, A.S.; Paez, T.L.; Brown, N.N.
1998-01-01
In most probabilistic risk assessments, there is a subset of accident scenarios that involves physical challenges to the system, such as high heat rates and/or accelerations. The system's responses to these challenges may be complicated, and their prediction may require the use of long-running computer codes. To deal with the many scenarios demanded by a risk assessment, the authors have been investigating the use of artificial neural networks (ANNs) as a fast-running estimation tool. They have developed a multivariate linear spline algorithm by extending previous ANN methods that use radial basis functions. They have applied the algorithm to problems involving fires, shocks, and vibrations. They have found that within the parameter range for which it is trained, the algorithm can simulate the nonlinear responses of complex systems with high accuracy. Running times per case are less than one second
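The authors' multivariate linear spline algorithm is not reproduced in the abstract. As a hedged illustration of the underlying idea, a fast surrogate trained on a handful of expensive code runs, here using Gaussian radial basis functions (the method the authors extended) rather than their linear splines, one might write:

```python
import math


def rbf_fit(xs, ys, width=1.0):
    """Fit a Gaussian radial-basis-function interpolant to training pairs (xs, ys)."""
    n = len(xs)
    # Interpolation matrix: A[i][j] = exp(-((x_i - x_j)/width)^2)
    A = [[math.exp(-((xs[i] - xs[j]) / width) ** 2) for j in range(n)] for i in range(n)]
    b = list(ys)
    # Solve A w = b by Gaussian elimination with partial pivoting
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    w = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(A[r][c] * w[c] for c in range(r + 1, n))
        w[r] = (b[r] - s) / A[r][r]
    # The surrogate evaluates in microseconds, unlike the long-running code it mimics
    return lambda x: sum(wi * math.exp(-((x - xi) / width) ** 2) for wi, xi in zip(w, xs))


# Stand-in for a slow simulation: train on a few of its outputs, then query cheaply
xs = [0.0, 1.0, 2.0, 3.0]
ys = [math.sin(x) for x in xs]
surrogate = rbf_fit(xs, ys)
```

Within the training range the interpolant reproduces the training points exactly and approximates responses in between, which is the property the abstract relies on for fast scenario screening.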
International Nuclear Information System (INIS)
Medvedev, A.; Bogatyr, S.; Khramtsov; Sokolov, F.
2001-01-01
In recent years, probabilistic methods for evaluating the influence of fuel geometry and technology parameters on fuel operational reliability have come into wide use. In the present work the START-3 procedure is used to calculate the thermal-physics and strength characteristics of WWER fuel rod behavior. The procedure is based on the Monte Carlo method with the application of Sobol quasi-random sequences. This technique makes it possible to treat the fuel rod's technological and operating parameters, as well as its strength and thermal-physics characteristics, as random variables. The work deals with a series of WWER-1000 fuel rod statistical tests and their verification based on PIE results. Preliminary calculations are also performed to determine the design schema parameters, which should ensure the accuracy of the assessment of the distribution of WWER fuel rod characteristics. The probabilistic characteristics of fuel rod strength and thermal physics are assessed via statistical analysis of the results of the probabilistic calculations.
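The statistical-testing idea, treating fabrication parameters as random variables and propagating them through to a derived quantity, can be sketched as follows. This sketch uses plain pseudo-random sampling rather than Sobol quasi-random sequences, and every tolerance in it is invented for illustration, not taken from START-3 or any WWER specification.

```python
import random
import statistics

random.seed(0)


def sample_gap_um():
    """Sample a diametral pellet-cladding gap from hypothetical Gaussian tolerances.

    All dimensions (mm) are illustrative assumptions, not real WWER fuel data.
    """
    pellet = random.gauss(7.57, 0.01)        # pellet outer diameter, mm
    clad_inner = random.gauss(7.72, 0.02)    # cladding inner diameter, mm
    return (clad_inner - pellet) * 1000.0    # gap in micrometres


# Propagate the tolerances through 20 000 Monte Carlo trials
gaps = [sample_gap_um() for _ in range(20000)]
mean_gap = statistics.mean(gaps)
# Probability that the gap falls below an (assumed) 100 um acceptance limit
p_closed = sum(g < 100.0 for g in gaps) / len(gaps)
```

Replacing the pseudo-random draws with a quasi-random (Sobol) sequence, as in the abstract, improves the convergence rate of such estimates without changing the structure of the calculation.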
Directory of Open Access Journals (Sweden)
O.O. Sdvizhkova
2017-12-01
Full Text Available The physical and mechanical characteristics of soils and soft rocks obtained from laboratory tests are important initial parameters for assessing the stability of natural and artificial slopes. Rock properties such as cohesion and the angle of internal friction are shaped by a number of natural and technogenic factors. At the same time, from the set of factors influencing the stability of a slope, the most significant are singled out, as these largely determine the properties of the rocks. The more factors are taken into account in the geotechnical model, the more closely the properties of the rocks are described, which increases the accuracy of the scientific forecast of the landslide hazard of the slope. On the other hand, increasing the number of factors in the model complicates it and reduces the reliability of geotechnical calculations. The aim of the work is to construct statistical distributions of the studied physical and mechanical properties of soft rocks and to substantiate a probabilistic statistical model. Based on the results of laboratory tests of rocks, statistical distributions of the quantitative characteristics studied, the angle of internal friction φ and the cohesion, were constructed. It was established that the statistical distribution of the physical and mechanical properties of the rocks is close to a uniform law.
Directory of Open Access Journals (Sweden)
S. Zhang
2018-03-01
Full Text Available Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle which are affected by a high degree of uncertainty especially at a regional scale, resulting in unacceptable uncertainties of Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations which are integrated in a unique parameter. This parameter links the landslide probability to the uncertainties of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfalls on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides in 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. Such testing results indicate that the new model can be operated in a highly efficient way and show more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can be potentially packaged into a forecasting system for shallow landslides providing technological support for the mitigation of these disasters at regional scale.
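The core computation described above, testing Fs &lt; 1 over n Monte Carlo draws of the soil parameters and reporting the failing fraction as a landslide probability, can be sketched for a single pixel using the standard infinite-slope safety factor. The parameter intervals below are illustrative assumptions, not the paper's calibrated values for the Wenchuan region.

```python
import math
import random

random.seed(1)


def failure_probability(slope_deg, depth_m=2.0, unit_weight=19.0, n=10000):
    """Estimate P(Fs < 1) for an infinite slope with uncertain soil strength.

    Cohesion (kPa) and friction-angle (deg) intervals are illustrative assumptions.
    Pore pressure is neglected in this dry-slope sketch.
    """
    theta = math.radians(slope_deg)
    failures = 0
    for _ in range(n):
        c = random.uniform(2.0, 8.0)                      # cohesion, kPa
        phi = math.radians(random.uniform(20.0, 35.0))    # internal friction angle
        driving = unit_weight * depth_m * math.sin(theta) * math.cos(theta)
        resisting = c + unit_weight * depth_m * math.cos(theta) ** 2 * math.tan(phi)
        if resisting / driving < 1.0:   # the Fs < 1 test from the abstract
            failures += 1
    return failures / n


p_gentle = failure_probability(20.0)   # gentle slope: rarely (or never) fails
p_steep = failure_probability(45.0)    # steep slope: fails in most draws
```

In the paper this per-pixel fraction is the single parameter that converts deterministic Fs maps into a probabilistic forecast; rainfall infiltration would additionally modulate the resisting term.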
Zhang, Shaojie; Zhao, Luqiang; Delgado-Tellez, Ricardo; Bao, Hongjun
2018-03-01
Conventional outputs of physics-based landslide forecasting models are presented as deterministic warnings by calculating the safety factor (Fs) of potentially dangerous slopes. However, these models are highly dependent on variables such as cohesion force and internal friction angle which are affected by a high degree of uncertainty especially at a regional scale, resulting in unacceptable uncertainties of Fs. Under such circumstances, the outputs of physical models are more suitable if presented in the form of landslide probability values. In order to develop such models, a method to link the uncertainty of soil parameter values with landslide probability is devised. This paper proposes the use of Monte Carlo methods to quantitatively express uncertainty by assigning random values to physical variables inside a defined interval. The inequality Fs < 1 is tested for each pixel in n simulations which are integrated in a unique parameter. This parameter links the landslide probability to the uncertainties of soil mechanical parameters and is used to create a physics-based probabilistic forecasting model for rainfall-induced shallow landslides. The prediction ability of this model was tested in a case study, in which simulated forecasting of landslide disasters associated with heavy rainfalls on 9 July 2013 in the Wenchuan earthquake region of Sichuan province, China, was performed. The proposed model successfully forecasted landslides in 159 of the 176 disaster points registered by the geo-environmental monitoring station of Sichuan province. Such testing results indicate that the new model can be operated in a highly efficient way and show more reliable results, attributable to its high prediction accuracy. Accordingly, the new model can be potentially packaged into a forecasting system for shallow landslides providing technological support for the mitigation of these disasters at regional scale.
A physical probabilistic model to predict failure rates in buried PVC pipelines
International Nuclear Information System (INIS)
Davis, P.; Burn, S.; Moglia, M.; Gould, S.
2007-01-01
For older water pipeline materials such as cast iron and asbestos cement, future pipe failure rates can be extrapolated from large volumes of existing historical failure data held by water utilities. However, for newer pipeline materials such as polyvinyl chloride (PVC), only limited failure data exists and confident forecasts of future pipe failures cannot be made from historical data alone. To solve this problem, this paper presents a physical probabilistic model, which has been developed to estimate failure rates in buried PVC pipelines as they age. The model assumes that under in-service operating conditions, crack initiation can occur from inherent defects located in the pipe wall. Linear elastic fracture mechanics theory is used to predict the time to brittle fracture for pipes with internal defects subjected to combined internal pressure and soil deflection loading together with through-wall residual stress. To include uncertainty in the failure process, inherent defect size is treated as a stochastic variable, and modelled with an appropriate probability distribution. Microscopic examination of fracture surfaces from field failures in Australian PVC pipes suggests that the 2-parameter Weibull distribution can be applied. Monte Carlo simulation is then used to estimate lifetime probability distributions for pipes with internal defects, subjected to typical operating conditions. As with inherent defect size, the 2-parameter Weibull distribution is shown to be appropriate to model uncertainty in predicted pipe lifetime. The Weibull hazard function for pipe lifetime is then used to estimate the expected failure rate (per pipe length/per year) as a function of pipe age. To validate the model, predicted failure rates are compared to aggregated failure data from 17 UK water utilities obtained from the United Kingdom Water Industry Research (UKWIR) National Mains Failure Database. In the absence of actual operating pressure data in the UKWIR database, typical
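The failure-rate estimate at the end of this abstract rests on the 2-parameter Weibull hazard function h(t) = (β/η)(t/η)^(β−1), which for shape β &gt; 1 yields a failure rate that rises with pipe age. A small sketch follows; the shape and scale values are hypothetical, not the values fitted to the Australian field failures or the UKWIR data.

```python
def weibull_hazard(t, shape, scale):
    """Weibull hazard function h(t) = (shape/scale) * (t/scale)**(shape - 1).

    For shape > 1 the hazard (expected failures per unit time, given survival
    to age t) increases with age, matching wear-out behaviour.
    """
    return (shape / scale) * (t / scale) ** (shape - 1)


# Hypothetical 2-parameter Weibull fit for PVC pipe lifetime, in years
shape, scale = 1.8, 120.0
# Expected failure rate (per pipe, per year) at three pipe ages
rates = {age: weibull_hazard(age, shape, scale) for age in (10, 30, 50)}
```

Multiplying such a per-pipe hazard by the installed length in each age cohort gives the failures-per-length-per-year figure that the paper compares against the UKWIR database.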
Kopp, Robert E.; DeConto, Robert M.; Bader, Daniel A.; Hay, Carling C.; Horton, Radley M.; Kulp, Scott; Oppenheimer, Michael; Pollard, David; Strauss, Benjamin H.
2017-12-01
Mechanisms such as ice-shelf hydrofracturing and ice-cliff collapse may rapidly increase discharge from marine-based ice sheets. Here, we link a probabilistic framework for sea-level projections to a small ensemble of Antarctic ice-sheet (AIS) simulations incorporating these physical processes to explore their influence on global-mean sea-level (GMSL) and relative sea-level (RSL). We compare the new projections to past results using expert assessment and structured expert elicitation about AIS changes. Under high greenhouse gas emissions (Representative Concentration Pathway [RCP] 8.5), median projected 21st century GMSL rise increases from 79 to 146 cm. Without protective measures, revised median RSL projections would by 2100 submerge land currently home to 153 million people, an increase of 44 million. The use of a physical model, rather than simple parameterizations assuming constant acceleration of ice loss, increases forcing sensitivity: overlap between the central 90% of simulations for 2100 for RCP 8.5 (93-243 cm) and RCP 2.6 (26-98 cm) is minimal. By 2300, the gap between median GMSL estimates for RCP 8.5 and RCP 2.6 reaches >10 m, with median RSL projections for RCP 8.5 jeopardizing land now occupied by 950 million people (versus 167 million for RCP 2.6). The minimal correlation between the contribution of AIS to GMSL by 2050 and that in 2100 and beyond implies current sea-level observations cannot exclude future extreme outcomes. The sensitivity of post-2050 projections to deeply uncertain physics highlights the need for robust decision and adaptive management frameworks.
Li, Chen; Fearing, Ronald; Full, Robert
Most animals move in nature in a variety of locomotor modes. For example, to traverse obstacles like dense vegetation, cockroaches can climb over, push across, reorient their bodies to maneuver through slits, or even transition among these modes forming diverse locomotor pathways; if flipped over, they can also self-right using wings or legs to generate body pitch or roll. By contrast, most locomotion studies have focused on a single mode such as running, walking, or jumping, and robots are still far from capable of life-like, robust, multi-modal locomotion in the real world. Here, we present two recent studies using bio-inspired robots, together with new locomotion energy landscapes derived from locomotor-environment interaction physics, to begin to understand the physics of multi-modal locomotion. (1) Our experiment of a cockroach-inspired legged robot traversing grass-like beam obstacles reveals that, with a terradynamically ``streamlined'' rounded body like that of the insect, robot traversal becomes more probable by accessing locomotor pathways that overcome lower potential energy barriers. (2) Our experiment of a cockroach-inspired self-righting robot further suggests that body vibrations are crucial for exploring locomotion energy landscapes and reaching lower barrier pathways. Finally, we posit that our new framework of locomotion energy landscapes holds promise to better understand and predict multi-modal biological and robotic movement.
Yugsi Molina, F. X.; Oppikofer, T.; Fischer, L.; Hermanns, R. L.; Taurisano, A.
2012-04-01
Traditional techniques to assess rockfall hazard are partially based on probabilistic analysis. Stochastic methods have been used for run-out analysis of rock blocks to estimate the trajectories that a detached block will follow during its fall until it stops due to kinetic energy loss. However, the selection of rockfall source areas is usually defined either by multivariate analysis or by field observations. In either case, a physically based approach is not used for source area detection. We present an example of rockfall hazard assessment that integrates a probabilistic rockfall run-out analysis with a stochastic assessment of the rockfall source areas using kinematic stability analysis in a GIS environment. The method has been tested on a steep, more than 200 m high rock wall located in the municipality of Norddal (Møre og Romsdal county, Norway), where a large number of people are exposed to snow avalanches, rockfalls, or debris flows. The area was selected following the recently published hazard mapping plan of Norway. The cliff is formed by medium- to coarse-grained quartz-dioritic to granitic gneisses of Proterozoic age. Scree deposits, the product of recent rockfall activity, are found at the bottom of the rock wall. Large blocks can be found several tens of meters away from the cliff in Sylte, the main locality in the Norddal municipality. Structural characterization of the rock wall was done using terrestrial laser scanning (TLS) point clouds in the software Coltop3D (www.terranum.ch), and the results were validated with field data. Orientation data sets from the structural characterization were analyzed separately to assess best-fit probability density functions (PDF) for both the dip angle and the dip direction of each discontinuity set. A GIS-based stochastic kinematic analysis was then carried out using the discontinuity set orientations and the friction angle as random variables. An airborne laser scanning digital elevation model (ALS-DEM) with 1 m
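A stochastic kinematic test of the kind described, sampling discontinuity orientation and friction angle from probability distributions and counting the fraction of draws that permit planar sliding, can be sketched for one slope cell as follows. The distributions and the ±20° lateral-limit rule are illustrative assumptions, not the PDFs fitted from the Coltop3D analysis.

```python
import random

random.seed(2)


def planar_sliding_probability(slope_dip, slope_dir, n=20000):
    """Monte Carlo kinematic test for planar sliding on a single slope cell.

    A sampled joint is kinematically free to slide when it dips within +/-20 deg
    of the slope direction, shallower than the slope face (daylighting) but
    steeper than the friction angle. All distributions are illustrative.
    """
    count = 0
    for _ in range(n):
        dip = random.gauss(55.0, 8.0)             # joint dip angle, deg
        dip_dir = random.gauss(slope_dir, 15.0)   # joint dip direction, deg
        friction = random.gauss(30.0, 3.0)        # friction angle, deg
        if abs(dip_dir - slope_dir) <= 20.0 and friction < dip < slope_dip:
            count += 1
    return count / n


# Steep east-facing wall (dip 70 deg, dip direction 180 deg): probability that a
# randomly drawn discontinuity satisfies the planar-sliding criterion
p = planar_sliding_probability(slope_dip=70.0, slope_dir=180.0)
```

Run per GIS cell with slope geometry from the ALS-DEM, this fraction plays the role of the stochastic source-area susceptibility that feeds the run-out analysis.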
International Nuclear Information System (INIS)
Zhu, Shun-Peng; Huang, Hong-Zhong; Peng, Weiwen; Wang, Hai-Kun; Mahadevan, Sankaran
2016-01-01
A probabilistic Physics of Failure-based framework for fatigue life prediction of aircraft gas turbine discs operating under uncertainty is developed. The framework incorporates the overall uncertainties appearing in a structural integrity assessment. A comprehensive uncertainty quantification (UQ) procedure is presented to quantify multiple types of uncertainty using multiplicative and additive UQ methods. In addition, the factors that contribute the most to the resulting output uncertainty are investigated and identified for uncertainty reduction in decision-making. A high prediction accuracy of the proposed framework is validated through a comparison of model predictions to the experimental results of GH4133 superalloy and full-scale tests of aero engine high-pressure turbine discs. - Highlights: • A probabilistic PoF-based framework for fatigue life prediction is proposed. • A comprehensive procedure for quantifying multiple types of uncertainty is presented. • The factors that contribute most to the resulting output uncertainty are identified. • The proposed framework demonstrates high prediction accuracy by full-scale tests.
Denison, Stephanie; Trikutam, Pallavi; Xu, Fei
2014-01-01
A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar…
Probabilistic logics and probabilistic networks
Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill
2014-01-01
Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.
Steele, Carol Frederick
2011-01-01
In terms of teacher quality, Steele believes the best teachers have reached a stage she terms inspired, and that teachers move progressively through the stages of unaware, aware, and capable until the most reflective teachers finally reach the inspired level. Inspired teachers have a wide repertoire of teaching and class management techniques and…
From the Outside In: Getting Physical with Exercises Inspired by Stella Adler and Uta Hagen.
Miller, Bruce
2002-01-01
Proposes that teaching students to find and play appropriate actions helps them tell the story of a play and create character better than if they focused on emotions. Discusses Stella Adler and Uta Hagen, two acting teachers who advocated this physical approach. Presents two exercises: "justify and connect," and "enter a room." (PM)
The effect of shape on drag: a physics exercise inspired by biology
Fingerut, Jonathan; Johnson, Nicholas; Mongeau, Eric; Habdas, Piotr
2017-07-01
As part of a biomechanics course aimed at upper-division biology and physics majors, but applicable to a range of student learning levels, this laboratory exercise provides insight into the effect of shape on hydrodynamic performance, as well as an introduction to computer-aided design (CAD) and 3D printing. Students use hydrodynamic modeling software and simple CAD programs to design a shape with the least amount of drag, based on strategies gleaned from the study of natural forms. Students then print the shapes using a 3D printer and test them against their classmates' designs in a friendly competition. From this exercise, students gain a more intuitive sense of the challenges that organisms face when moving through fluid environments and of the physical phenomena involved in moving through fluids at high Reynolds numbers, and they observe how and why certain morphologies, such as streamlining, are common answers to the challenge of swimming at high speeds.
Physically Inspired Models for the Synthesis of Stiff Strings with Dispersive Waveguides
Directory of Open Access Journals (Sweden)
Testa I
2004-01-01
Full Text Available We review the derivation and design of digital waveguides from physical models of stiff systems, useful for the synthesis of sounds from strings, rods, and similar objects. A transform method approach is proposed to solve the classic fourth-order equations of stiff systems in order to reduce them to two second-order equations. By introducing scattering boundary matrices, the eigenfrequencies are determined and their dependency is discussed for the clamped, hinged, and intermediate cases. On the basis of the frequency-domain physical model, the numerical discretization is carried out, showing how the insertion of an all-pass delay line generalizes the Karplus-Strong algorithm for the synthesis of ideally flexible vibrating strings. Knowing the physical parameters, the synthesis can proceed using the generalized structure. Another point of view is offered by Laguerre expansions and frequency warping, which are introduced in order to show that a stiff system can be treated as a nonstiff one, provided that the solutions are warped. A method to compute the all-pass chain coefficients and the optimum warping curves from sound samples is discussed. Once the optimum warping characteristic is found, the length of the dispersive delay line to be employed in the simulation is simply determined from the requirement of matching the desired fundamental frequency. The regularization of the dispersion curves by means of optimum unwarping is experimentally evaluated.
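The Karplus-Strong algorithm that the paper generalizes can be sketched in its basic, ideal (non-stiff) string form: a noise-filled delay line whose length sets the pitch, with a two-point averaging lowpass in the feedback loop. The dispersive all-pass delay line that models stiffness is the paper's extension and is deliberately omitted from this minimal sketch.

```python
import random


def karplus_strong(freq_hz, sample_rate=44100, duration_s=0.5, seed=0):
    """Basic Karplus-Strong plucked-string synthesis.

    A delay line of length sample_rate/freq_hz is filled with noise (the pluck);
    each pass through the two-point averaging filter damps high frequencies,
    producing the characteristic decaying string tone.
    """
    rng = random.Random(seed)
    n = int(sample_rate / freq_hz)          # delay-line length sets the pitch
    line = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    out = []
    for _ in range(int(sample_rate * duration_s)):
        first = line[0]
        avg = 0.5 * (line[0] + line[1])     # averaging (lowpass) causes the decay
        out.append(first)
        line = line[1:] + [avg]             # recirculate the filtered sample
    return out


# Half a second of a 440 Hz pluck
samples = karplus_strong(440.0)
```

In the paper's generalized structure, the plain delay `line` would be replaced by an all-pass chain so that higher partials travel at different speeds, reproducing the inharmonic spectra of stiff strings.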
Gromek, Katherine Emily
A novel computational and inference framework of the physics-of-failure (PoF) reliability modeling for complex dynamic systems has been established in this research. The PoF-based reliability models are used to perform a real-time simulation of system failure processes, so that the system-level reliability modeling constitutes inferences from checking the status of component-level reliability at any given time. The "agent autonomy" concept is applied as a solution method for the system-level probabilistic PoF-based (i.e. PPoF-based) modeling. This concept originated from artificial intelligence (AI) as a leading intelligent computational inference in modeling of multi-agent systems (MAS). The concept of agent autonomy in the context of reliability modeling was first proposed by M. Azarkhail [1], where a fundamentally new idea of system representation by autonomous intelligent agents for the purpose of reliability modeling was introduced. The contribution of the current work lies in the further development of the agent autonomy concept, particularly the refined agent classification within the scope of the PoF-based system reliability modeling, new approaches to the learning and the autonomy properties of the intelligent agents, and modeling interacting failure mechanisms within the dynamic engineering system. The autonomous property of intelligent agents is defined as an agent's ability to self-activate, deactivate, or completely redefine its role in the analysis. This property of agents, together with the ability to model interacting failure mechanisms of the system elements, makes the agent autonomy approach fundamentally different from all existing methods of probabilistic PoF-based reliability modeling. 1. Azarkhail, M., "Agent Autonomy Approach to Physics-Based Reliability Modeling of Structures and Mechanical Systems", PhD thesis, University of Maryland, College Park, 2007.
Relkin, Perla
2016-10-01
The 6th international symposium in the series "Delivery of Functionality in Complex Food Systems: Physically Inspired Approaches from Nanoscale to Microscale" was held in the heart of Paris from 14 to 17 July, 2015. It brought together PhD students, academic food researchers and industrial researchers from diverse food sectors. The scientific sessions of this meeting were constructed around important topics dealing with 1) Engineering of tailor-made structures in bio-based systems; 2) Complexity and emergent phenomena in integrative food science; 3) Investigation of nano- and microstructures in the bulk and at interfaces; 4) Modeling approaches from bio-molecules and matrix structures to functionality; 5) Tuning binding and release of bioactive compounds by matrix modulation; and finally 6) Tuning the delivery of functionality to the body. These topics were selected to cover different scientific fields and to show the contribution of food physical structures to the development of health- and pleasure-supporting food functions. The oral communications were all introduced by keynote speakers and illustrated by outstanding, high-quality short communications. One of the most original features of this symposium was the increasing number of presentations using multiscale and modeling approaches, illustrating the concept of complexity and emergent phenomena in integrative food science. These highlighted the importance of studies on interactions between the structural properties of engineered delivery systems and the human body (sensory properties, digestion, release, bioavailability and bioaccessibility). Copyright © 2016 Elsevier Ltd. All rights reserved.
Directory of Open Access Journals (Sweden)
Nikesh S. Dattani
2012-03-01
Full Text Available One of the most successful methods for calculating reduced density operator dynamics in open quantum systems, one that can give numerically exact results, uses Feynman integrals. However, when simulating the dynamics for a given amount of time, the number of time steps that can realistically be used with this method is always limited, and therefore one often obtains an approximation of the reduced density operator at a sparse grid of points in time. Instead of relying only on ad hoc interpolation methods (such as splines) to estimate the system density operator in between these points, I propose a method that uses physical information to assist with this interpolation. This method is tested on a physically significant system, on which its use allows important qualitative features of the density operator dynamics to be captured with as little as two time steps in the Feynman integral. This method allows for an enormous reduction in the amount of memory and CPU time required for approximating density operator dynamics within a desired accuracy. Since this method does not change the way the Feynman integral itself is calculated, the value of the density operator approximation at the points in time used to discretize the Feynman integral will be the same whether or not this method is used, but its approximation in between these points in time is considerably improved by this method. A list of ways in which this proposed method can be further improved is presented in the last section of the article.
Directory of Open Access Journals (Sweden)
S. Raia
2014-03-01
Full Text Available Distributed models to forecast the spatial and temporal occurrence of rainfall-induced shallow landslides are based on deterministic laws. These models extend spatially the static stability models adopted in geotechnical engineering, and adopt an infinite-slope geometry to balance the resisting and the driving forces acting on the sliding mass. An infiltration model is used to determine how rainfall changes pore-water conditions, modulating the local stability/instability conditions. A problem with the operation of the existing models lies in the difficulty of obtaining accurate values for the several variables that describe the material properties of the slopes. The problem is particularly severe when the models are applied over large areas, for which sufficient information on the geotechnical and hydrological conditions of the slopes is not generally available. To help solve the problem, we propose a probabilistic Monte Carlo approach to the distributed modeling of rainfall-induced shallow landslides. For this purpose, we have modified the transient rainfall infiltration and grid-based regional slope-stability analysis (TRIGRS) code. The new code (TRIGRS-P) adopts a probabilistic approach to compute, on a cell-by-cell basis, transient pore-pressure changes and related changes in the factor of safety due to rainfall infiltration. Infiltration is modeled using analytical solutions of partial differential equations describing one-dimensional vertical flow in isotropic, homogeneous materials. Both saturated and unsaturated soil conditions can be considered. TRIGRS-P copes with the natural variability inherent to the mechanical and hydrological properties of the slope materials by allowing values of the TRIGRS model input parameters to be sampled randomly from a given probability distribution. The range of variation and the mean value of the parameters can be determined by the usual methods used for preparing the TRIGRS input parameters. The outputs
Probabilistic short-term forecasting of eruption rate at Kīlauea Volcano using a physics-based model
Anderson, K. R.
2016-12-01
Deterministic models of volcanic eruptions yield predictions of future activity conditioned on uncertainty in the current state of the system. Physics-based eruption models are well-suited for deterministic forecasting as they can relate magma physics with a wide range of observations. Yet, physics-based eruption forecasting is strongly limited by an inadequate understanding of volcanic systems, and the need for eruption models to be computationally tractable. At Kīlauea Volcano, Hawaii, episodic depressurization-pressurization cycles of the magma system generate correlated, quasi-exponential variations in ground deformation and surface height of the active summit lava lake. Deflations are associated with reductions in eruption rate, or even brief eruptive pauses, and thus partly control lava flow advance rates and associated hazard. Because of the relatively well-understood nature of Kīlauea's shallow magma plumbing system, and because more than 600 of these events have been recorded to date, they offer a unique opportunity to refine a physics-based effusive eruption forecasting approach and apply it to lava eruption rates over short (hours to days) time periods. A simple physical model of the volcano ascribes observed data to temporary reductions in magma supply to an elastic reservoir filled with compressible magma. This model can be used to predict the evolution of an ongoing event, but because the mechanism that triggers events is unknown, event durations are modeled stochastically from previous observations. A Bayesian approach incorporates diverse data sets and prior information to simultaneously estimate uncertain model parameters and future states of the system. Forecasts take the form of probability distributions for eruption rate or cumulative erupted volume at some future time. Results demonstrate the significant uncertainties that still remain even for short-term eruption forecasting at a well-monitored volcano - but also the value of a physics
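The forecasting idea described above, a deterministic event model combined with stochastically sampled event durations, can be sketched as follows. The linear volume model, rates, and past durations are hypothetical stand-ins for the paper's physical reservoir model; forecasts come out as quantiles of a Monte Carlo ensemble.

```python
import random

def forecast_volume(q0=2.0, drop=0.5, past_durations=(6, 8, 12, 9, 7),
                    horizon=24.0, n=5000, seed=0):
    """Monte Carlo forecast of cumulative erupted volume over `horizon` hours.
    During a deflation event the eruption rate q0 is reduced by the fraction
    `drop`; the event duration is drawn from previously observed durations
    (all numbers here are invented for illustration)."""
    rng = random.Random(seed)
    volumes = []
    for _ in range(n):
        # sample a duration from past events, with some jitter
        d = min(rng.choice(past_durations) * rng.uniform(0.8, 1.2), horizon)
        volumes.append(q0 * (1 - drop) * d + q0 * (horizon - d))
    volumes.sort()
    # return (median, 5th percentile, 95th percentile)
    return volumes[n // 2], volumes[int(0.05 * n)], volumes[int(0.95 * n)]
```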
Practicum in adapted physical activity: a Dewey-inspired action research project.
Standal, Øyvind; Rugseth, Gro
2014-07-01
The purpose of this study was to investigate what adapted physical activity (APA) students learn from their practicum experiences. One cohort of APA students participated, and data were generated from an action research project that included observations, reflective journals, and a focus group interview. The theoretical framework for the study was Dewey's and Wackerhausen's theories of reflection. The findings show the objects of students' reflections, the kinds of conceptual resources they draw on while reflecting, and their knowledge interests. In addition, two paradoxes are identified: the tension between reflecting from and on one's own values, and how practicum, as a valued experience of reality, can become too difficult to handle. In conclusion, we reflect on how practicum learning can be facilitated.
Bussotti, Paolo
2015-01-01
This book presents new insights into Leibniz’s research on planetary theory and his system of pre-established harmony. Although some aspects of this theory have been explored in the literature, others are less well known. In particular, the book offers new contributions on the connection between the planetary theory and the theory of gravitation. It also provides an in-depth discussion of Kepler’s influence on Leibniz’s planetary theory and, more generally, on Leibniz’s concept of pre-established harmony. Three initial chapters presenting the mathematical and physical details of Leibniz’s works provide a frame of reference. The book then goes on to discuss research on Leibniz’s conception of gravity and the connection between Leibniz and Kepler.
CERN Bulletin
2010-01-01
Particle physicists thrive on information. They first create information by performing experiments or elaborating theoretical conjectures and then they share it through publications and various web tools. The INSPIRE service, just released, will bring state of the art information retrieval to the fingertips of researchers. Keeping track of the information shared within the particle physics community has long been the task of libraries at the larger labs, such as CERN, DESY, Fermilab and SLAC, as well as the focus of indispensible services like arXiv and those of the Particle Data Group. In 2007, many providers of information in the field came together for a summit at SLAC to see how physics information resources could be enhanced, and the INSPIRE project emerged from that meeting. The vision behind INSPIRE was built by a survey launched by the four labs to evaluate the real needs of the community. INSPIRE responds to these directives from the community by combining the most successful aspe...
International Nuclear Information System (INIS)
Chookah, M.; Nuhi, M.; Modarres, M.
2011-01-01
A combined probabilistic physics-of-failure-based model for pitting and corrosion-fatigue degradation mechanisms is proposed to estimate the reliability of structures and to perform prognosis and health management. A mechanistic superposition model for the corrosion-fatigue mechanism was used as a benchmark model to propose the simple model. The proposed model describes the degradation of the structures as a function of physical and critical environmental stresses, such as the amplitude and frequency of mechanical loads (for example, caused by the internal piping pressure) and the concentration of corrosive chemical agents. The parameters of the proposed model are represented by probability density functions and estimated through a Bayesian approach based on data taken from the experiments performed as part of this research. As a demonstration of applications, the proposed model provides prognostic information about the reliability of aging structures and is helpful in developing inspection and replacement strategies.
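A minimal sketch of the probabilistic physics-of-failure idea: propagate uncertain degradation-law parameters through a growth law to get a reliability estimate. The Paris-type growth law and the lognormal/normal parameter distributions below are assumed for illustration, not the authors' fitted model.

```python
import math
import random

def time_to_failure(C, m, dsigma=100.0, a0=0.1, a_crit=5.0):
    """Cycles to grow a pit/crack from a0 to the critical size a_crit under
    an illustrative Paris-type law da/dN = C * (dsigma * sqrt(a))**m."""
    a, n = a0, 0
    while a < a_crit and n < 10**6:
        a += C * (dsigma * math.sqrt(a)) ** m
        n += 1
    return n

def reliability_at(n_cycles, trials=500, seed=3):
    """Fraction of sampled parameter sets that survive past n_cycles.
    The parameter distributions stand in for a Bayesian posterior."""
    rng = random.Random(seed)
    survived = 0
    for _ in range(trials):
        C = rng.lognormvariate(math.log(1e-7), 0.3)  # assumed posterior for C
        m = rng.gauss(2.0, 0.1)                       # assumed posterior for m
        if time_to_failure(C, m) > n_cycles:
            survived += 1
    return survived / trials
```

The resulting reliability-versus-cycles curve is the kind of prognostic output used to schedule inspection and replacement.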
Wakker, P.P.; Thaler, R.H.; Tversky, A.
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...
Denison, Stephanie; Trikutam, Pallavi; Xu, Fei
2014-08-01
A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar contexts with different outcomes. Can infants rapidly acquire probabilistic physical knowledge, such as that some leaves fall and some glasses break, simply by observing the statistical regularity with which objects behave, and apply that knowledge in subsequent reasoning? We taught 11-month-old infants physical constraints on objects and asked them to reason about the probability of different outcomes when objects were drawn from a large distribution. Infants could have reasoned either by using the perceptual similarity between the samples and the larger distributions or by applying physical rules to adjust base rates and estimate the probabilities. Infants learned the physical constraints quickly and used them to estimate probabilities, rather than relying on similarity, a version of the representativeness heuristic. These results indicate that infants can rapidly and flexibly acquire physical knowledge about objects following very brief exposure and apply it in subsequent reasoning. PsycINFO Database Record (c) 2014 APA, all rights reserved.
DEFF Research Database (Denmark)
Jensen, Finn Verner; Lauritzen, Steffen Lilholt
2001-01-01
This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.
Wakker, P.P.; Thaler, R.H.; Tversky, A.
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be
P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)
1997-01-01
Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these
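The expected-utility tension described in these abstracts is easy to reproduce numerically. The sketch below assumes a square-root utility and hypothetical wealth/loss numbers; under these assumptions the premium reduction an expected-utility maximizer requires for a 1% default risk comes out near 1%, far below the ~20% survey respondents demand.

```python
import math

def reservation_premium(u, w, L, p, r=0.0):
    """Highest premium y at which insurance that fails to pay with
    probability r is still weakly preferred to no insurance, found by
    bisection on expected utility.
    w: initial wealth, L: loss, p: loss probability."""
    no_insurance = p * u(w - L) + (1 - p) * u(w)
    lo, hi = 0.0, L
    for _ in range(200):
        y = (lo + hi) / 2
        # with prob p*r the loss occurs and the insurer defaults
        eu = p * r * u(w - y - L) + (1 - p * r) * u(w - y)
        if eu > no_insurance:
            lo = y
        else:
            hi = y
    return (lo + hi) / 2

u = math.sqrt  # a concave (risk-averse) utility, assumed for the example
y_full = reservation_premium(u, w=100.0, L=50.0, p=0.10)
y_prob = reservation_premium(u, w=100.0, L=50.0, p=0.10, r=0.01)
reduction = (y_full - y_prob) / y_full  # EU-consistent premium discount
```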
Energy Technology Data Exchange (ETDEWEB)
Avelino, P.P., E-mail: ppavelin@fc.up.pt [Centro de Astrofísica da Universidade do Porto, Rua das Estrelas, 4150-762 Porto (Portugal)
2012-11-01
In this paper we investigate the classical non-relativistic limit of the Eddington-inspired Born-Infeld theory of gravity. We show that strong bounds on the value of the only additional parameter of the theory κ, with respect to general relativity, may be obtained by requiring that gravity plays a subdominant role compared to electromagnetic interactions inside atomic nuclei. We also discuss the validity of the continuous fluid approximation used in this and other astrophysical and cosmological studies. We argue that although the continuous fluid approximation is expected to be valid in the case of sufficiently smooth density distributions, its use should eventually be validated at a quantum level.
International Nuclear Information System (INIS)
Avelino, P.P.
2012-01-01
In this paper we investigate the classical non-relativistic limit of the Eddington-inspired Born-Infeld theory of gravity. We show that strong bounds on the value of the only additional parameter of the theory κ, with respect to general relativity, may be obtained by requiring that gravity plays a subdominant role compared to electromagnetic interactions inside atomic nuclei. We also discuss the validity of the continuous fluid approximation used in this and other astrophysical and cosmological studies. We argue that although the continuous fluid approximation is expected to be valid in the case of sufficiently smooth density distributions, its use should eventually be validated at a quantum level.
Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.
2004-01-01
This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components or sensor tolerances. Uncertainties in these variables, cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g. whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output for optimizing the power available for experiments.
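The uncertainty-propagation idea can be sketched with a toy array model: sample the uncertain inputs (the random variables), push each sample through the deterministic power model, and summarize the response variable. The model form and every number below are illustrative, not the ISS electrical power system model.

```python
import math
import random
import statistics

def panel_power(solar_flux=1367.0, area=30.0, eff=0.14, off_sun_deg=0.0):
    """Simplified solar-array output (W): flux * area * efficiency * cos(pointing error).
    All parameter values are assumptions for the example."""
    return solar_flux * area * eff * math.cos(math.radians(off_sun_deg))

def power_distribution(n=10000, seed=7):
    """Monte Carlo propagation of input uncertainty to the power output."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        eff = rng.gauss(0.14, 0.005)   # cell efficiency uncertainty (assumed)
        off = abs(rng.gauss(0.0, 2.0)) # sun-tracking pointing error, deg (assumed)
        samples.append(panel_power(eff=eff, off_sun_deg=off))
    return statistics.mean(samples), statistics.stdev(samples)
```

The spread of the output distribution quantifies how much power margin must be held back from experiments.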
Bod, R.; Heine, B.; Narrog, H.
2010-01-01
Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Burcharth, H. F.
This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...
2012-01-01
The motto of the 2012 Olympic and Paralympic Games is ‘Inspire a generation’ so it was particularly pleasing to see science, the LHC and Higgs bosons featuring so strongly in the opening ceremony of the Paralympics last week. It’s a sign of just how far our field has come that such a high-profile event featured particle physics so strongly, and we can certainly add our support to that motto. If the legacy of London 2012 is a generation inspired by science as well as sport, then the games will have more than fulfilled their mission. Particle physics has truly inspiring stories to tell, going well beyond Higgs and the LHC, and the entire community has played its part in bringing the excitement of frontier research in particle physics to a wide audience. Nevertheless, we cannot rest on our laurels: maintaining the kind of enthusiasm for science we witnessed at the Paralympic opening ceremony will require constant vigilance, and creative thinking about ways to rea...
Probabilistic Logic and Probabilistic Networks
Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.
2009-01-01
While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches
Perceptually-Inspired Computing
Directory of Open Access Journals (Sweden)
Ming Lin
2015-08-01
Full Text Available Human sensory systems allow individuals to see, hear, touch, and interact with the surrounding physical environment. Understanding human perception and its limits enables us to better exploit the psychophysics of human perceptual systems to design more efficient, adaptive algorithms and develop perceptually-inspired computational models. In this talk, I will survey some recent efforts on perceptually-inspired computing with applications to crowd simulation and multimodal interaction. In particular, I will present data-driven personality modeling based on the results of user studies, example-guided physics-based sound synthesis using auditory perception, as well as perceptually-inspired simplification for multimodal interaction. These perceptually guided principles can be used to accelerate multi-modal interaction and visual computing, thereby creating more natural human-computer interaction and providing more immersive experiences. I will also present their use in interactive applications for entertainment, such as video games, computer animation, and shared social experience. I will conclude by discussing possible future research directions.
Strategic Team AI Path Plans: Probabilistic Pathfinding
Directory of Open Access Journals (Sweden)
Tng C. H. John
2008-01-01
Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. This method is inspired by genetic algorithms (Russell and Norvig, 2002), in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones. The path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method can generate a variety of different high-quality paths, which is desirable in games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method to generate strategic team AI pathfinding plans.
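A minimal sketch of the method's two ingredients, probabilistic path generation plus a fitness test, under an assumed grid, bias, and fitness definition (not the authors' actual implementation):

```python
import random

def random_path(start, goal, rng, max_steps=50):
    """Probabilistic pathfinding: a biased random walk on a grid that mostly
    steps toward the goal but occasionally detours, producing path variety."""
    x, y = start
    path = [(x, y)]
    for _ in range(max_steps):
        if (x, y) == goal:
            break
        gx, gy = goal
        dx = (gx > x) - (gx < x)  # sign of the remaining x distance
        dy = (gy > y) - (gy < y)
        if rng.random() < 0.8:    # mostly move toward the goal
            x, y = x + dx, y + dy
        else:                     # occasional random detour
            x, y = x + rng.choice((-1, 0, 1)), y + rng.choice((-1, 0, 1))
        path.append((x, y))
    return path

def fitness(path, goal):
    """Paths that reach the goal score high; shorter paths score higher."""
    return (1000 if path[-1] == goal else 0) - len(path)

def best_plans(n=200, k=5, start=(0, 0), goal=(8, 8), seed=42):
    """Generate n probabilistic paths and keep the k fittest."""
    rng = random.Random(seed)
    plans = [random_path(start, goal, rng) for _ in range(n)]
    plans.sort(key=lambda p: fitness(p, goal), reverse=True)
    return plans[:k]
```

Because generation is stochastic, repeated runs with different seeds yield different high-quality plans, which is the replay-value property the paper emphasizes.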
Aleardi, Mattia
2018-01-01
We apply a two-step probabilistic seismic-petrophysical inversion for the characterization of a clastic, gas-saturated, reservoir located in offshore Nile Delta. In particular, we discuss and compare the results obtained when two different rock-physics models (RPMs) are employed in the inversion. The first RPM is an empirical, linear model directly derived from the available well log data by means of an optimization procedure. The second RPM is a theoretical, non-linear model based on the Hertz-Mindlin contact theory. The first step of the inversion procedure is a Bayesian linearized amplitude versus angle (AVA) inversion in which the elastic properties, and the associated uncertainties, are inferred from pre-stack seismic data. The estimated elastic properties constitute the input to the second step that is a probabilistic petrophysical inversion in which we account for the noise contaminating the recorded seismic data and the uncertainties affecting both the derived rock-physics models and the estimated elastic parameters. In particular, a Gaussian mixture a-priori distribution is used to properly take into account the facies-dependent behavior of petrophysical properties, related to the different fluid and rock properties of the different litho-fluid classes. In the synthetic and in the field data tests, the very minor differences between the results obtained by employing the two RPMs, and the good match between the estimated properties and well log information, confirm the applicability of the inversion approach and the suitability of the two different RPMs for reservoir characterization in the investigated area.
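The Gaussian-mixture step can be illustrated in isolation: given an inverted elastic property at a grid point, the posterior probability of each litho-fluid class follows from Bayes' rule over the mixture components. The class weights, means, and standard deviations below are hypothetical, not values from the Nile Delta study.

```python
import math

def gaussian(x, mu, sigma):
    """Univariate normal density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def facies_posterior(vp, classes):
    """Posterior facies probabilities for an inverted elastic property,
    under a Gaussian mixture prior (one component per litho-fluid class)."""
    joint = {name: w * gaussian(vp, mu, s) for name, (w, mu, s) in classes.items()}
    z = sum(joint.values())
    return {name: v / z for name, v in joint.items()}

# Hypothetical per-facies P-velocity distributions: (weight, mean km/s, std)
classes = {"gas sand": (0.3, 2.6, 0.15),
           "brine sand": (0.4, 3.0, 0.15),
           "shale": (0.3, 3.3, 0.20)}
post = facies_posterior(2.65, classes)  # low Vp favors the gas-sand class
```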
Mahtani, Kamal Ram; Protheroe, Joanne; Slight, Sarah Patricia; Demarzo, Marcelo Marcos Piva; Blakeman, Thomas; Barton, Christopher A; Brijnath, Bianca; Roberts, Nia
2013-01-07
To examine if there is an increased participation in physical or sporting activities following an Olympic or Paralympic games. Overview of systematic reviews. We searched the Medline, Embase, Cochrane, DARE, SportDISCUS and Web of Knowledge databases. In addition, we searched for 'grey literature' in Google, Google scholar and on the International Olympic Committee websites. We restricted our search to those reviews published in English. We used the AMSTAR tool to assess the methodological quality of those systematic reviews included. The primary outcome was evidence for an increased participation in physical or sporting activities. Secondary outcomes included public perceptions of sport during and after an Olympic games, barriers to increased sports participation and any other non-sporting health benefits. Our systematic search revealed 844 citations, of which only two matched our inclusion criteria. The quality of these two reviews was assessed by three independent reviewers as 'good' using the AMSTAR tool for quality appraisal. Both reviews reported little evidence of an increased uptake of sporting activity following an Olympic Games event. Other effects on health, for example, changes in hospital admissions, suicide rates and drug use, were cited although there was insufficient evidence to see an overall effect. There is a paucity of evidence to support the notion that hosting an Olympic games leads to an increased participation in physical or sporting activities for host countries. We also found little evidence to suggest other health benefits. We conclude that the true success of these and future games should be evaluated by high-quality, evidence-based studies that have been commissioned before, during and following the completion of the event. Only then can the true success and legacy of the games be established.
AUTHOR|(CDS)2079501; Hecker, Bernard Louis; Holtkamp, Annette; Mele, Salvatore; O'Connell, Heath; Sachs, Kirsten; Simko, Tibor; Schwander, Thorsten
2013-01-01
Public calls, agency mandates and scientist demand for Open Science are by now a reality with different nuances across diverse research communities. A complex “ecosystem” of services and tools, mostly community-driven, will underpin this revolution in science. Repositories stand to accelerate this process, as “openness” evolves beyond text, in lockstep with scholarly communication. We present a case study of a global discipline, High-Energy Physics (HEP), where most of these transitions have already taken place in a “social laboratory” of multiple global information services interlinked in a complex, but successful, ecosystem at the service of scientists. We discuss our first-hand experience, at a technical and organizational level, of leveraging partnership across repositories and with the user community in support of Open Science, along threads relevant to the OR2013 community.
Directory of Open Access Journals (Sweden)
Mikaël Cozic
2016-11-01
Full Text Available The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness) theorem that reunites the two strands—modal and probabilistic—of doxastic logic.
Anaïs Vernède
2011-01-01
On Tuesday 18 January 2011, artist Pipilotti Rist came to CERN to find out how science could provide her with a source of inspiration for her art and perhaps to get ideas for future work. Pipilotti, who is an eclectic artist always on the lookout for an original source of inspiration, is almost as passionate about physics as she is about art. Ever Is Over All, 1997, audio video installation by Pipilotti Rist. View of the installation at the National Museum for Foreign Art, Sofia, Bulgaria. © Pipilotti Rist. Courtesy the artist and Hauser & Wirth. Photo by Angel Tzvetanov. Swiss video-maker Pipilotti Rist (her real name is Elisabeth Charlotte Rist), who is well-known in the international art world for her highly colourful videos and creations, visited CERN for the first time on Tuesday 18 January 2011. Her visit represented a trip down memory lane, since she originally studied physics before becoming interested in pursuing a career as an artist and going on to de...
Mert, Aydin; Fahjan, Yasin M.; Hutchings, Lawrence J.; Pınar, Ali
2016-08-01
The main motivation for this study was the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in Istanbul. This study provides the results of a physically based probabilistic seismic hazard analysis (PSHA) methodology, using broadband strong ground motion simulations, for sites within the Marmara region, Turkey, that may be vulnerable to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We included the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real, small-magnitude earthquakes recorded by a local seismic array were used as empirical Green's functions. For the frequencies below 0.5 Hz, the simulations were obtained by using synthetic Green's functions, which are synthetic seismograms calculated by an explicit 2D /3D elastic finite difference wave propagation routine. By using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we produced a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here followed the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, and this approach utilizes the full rupture of earthquakes along faults. Furthermore, conventional PSHA predicts ground motion parameters by using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for all magnitudes of earthquakes to obtain ground motion parameters. PSHA results were produced for 2, 10, and 50 % hazards for all sites studied in the Marmara region.
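The final hazard-integration step of a physically based PSHA can be sketched as follows: each rupture scenario contributes its annual rate to the exceedance rate of any ground-motion level its synthetic seismogram exceeds, and a Poisson assumption converts rates to probabilities over a time window. The scenario rates and PGA values here are invented for illustration.

```python
import math

def exceedance_rate(scenarios, pga_level):
    """Annual rate of exceeding pga_level, summed over rupture scenarios.
    Each scenario: (annual_rate, simulated_pga) from a synthetic seismogram."""
    return sum(rate for rate, pga in scenarios if pga >= pga_level)

def prob_exceedance(scenarios, pga_level, years=50.0):
    """Poisson probability of at least one exceedance in `years`."""
    lam = exceedance_rate(scenarios, pga_level)
    return 1.0 - math.exp(-lam * years)

# Hypothetical scenario set: (annual rate, peak ground acceleration in g)
scenarios = [(0.010, 0.35), (0.002, 0.55), (0.020, 0.15), (0.005, 0.45)]
p50 = prob_exceedance(scenarios, 0.3)  # P(PGA >= 0.3 g within 50 years)
```

Sweeping the PGA level traces out the hazard curve from which the 2, 10, and 50% hazard maps are read.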
Mert, A.
2016-12-01
The main motivation of this study is the impending occurrence of a catastrophic earthquake along the Prince Island Fault (PIF) in the Marmara Sea and the disaster risk around the Marmara region, especially in İstanbul. This study provides the results of a physically-based Probabilistic Seismic Hazard Analysis (PSHA) methodology, using broad-band strong ground motion simulations, for sites within the Marmara region, Turkey, due to possible large earthquakes throughout the PIF segments in the Marmara Sea. The methodology is called physically-based because it depends on the physical processes of earthquake rupture and wave propagation to simulate earthquake ground motion time histories. We include the effects of all considerable-magnitude earthquakes. To generate the high-frequency (0.5-20 Hz) part of the broadband earthquake simulation, real small-magnitude earthquakes recorded by a local seismic array are used as Empirical Green's Functions (EGF). For frequencies below 0.5 Hz the simulations are obtained by using Synthetic Green's Functions (SGF), which are synthetic seismograms calculated by an explicit 2D/3D elastic finite difference wave propagation routine. Using a range of rupture scenarios for all considerable-magnitude earthquakes throughout the PIF segments, we provide a hazard calculation for frequencies of 0.1-20 Hz. The physically based PSHA used here follows the same procedure as conventional PSHA, except that conventional PSHA utilizes point sources or a series of point sources to represent earthquakes, whereas this approach utilizes the full rupture of earthquakes along faults. Further, conventional PSHA predicts ground-motion parameters using empirical attenuation relationships, whereas this approach calculates synthetic seismograms for earthquakes of all magnitudes to obtain ground-motion parameters. PSHA results are produced for 2%, 10% and 50% hazards for all studied sites in the Marmara region.
Physicists Get INSPIREd: INSPIRE Project and Grid Applications
International Nuclear Information System (INIS)
Klem, Jukka; Iwaszkiewicz, Jan
2011-01-01
INSPIRE is the new high-energy physics scientific information system developed by CERN, DESY, Fermilab and SLAC. INSPIRE combines the curated and trusted contents of SPIRES database with Invenio digital library technology. INSPIRE contains the entire HEP literature with about one million records and in addition to becoming the reference HEP scientific information platform, it aims to provide new kinds of data mining services and metrics to assess the impact of articles and authors. Grid and cloud computing provide new opportunities to offer better services in areas that require large CPU and storage resources including document Optical Character Recognition (OCR) processing, full-text indexing of articles and improved metrics. D4Science-II is a European project that develops and operates an e-Infrastructure supporting Virtual Research Environments (VREs). It develops an enabling technology (gCube) which implements a mechanism for facilitating the interoperation of its e-Infrastructure with other autonomously running data e-Infrastructures. As a result, this creates the core of an e-Infrastructure ecosystem. INSPIRE is one of the e-Infrastructures participating in D4Science-II project. In the context of the D4Science-II project, the INSPIRE e-Infrastructure makes available some of its resources and services to other members of the resulting ecosystem. Moreover, it benefits from the ecosystem via a dedicated Virtual Organization giving access to an array of resources ranging from computing and storage resources of grid infrastructures to data and services.
van Lier-Walqui, M.; Morrison, H.; Kumjian, M. R.; Prat, O. P.
2016-12-01
Microphysical parameterization schemes have reached an impressive level of sophistication: numerous prognostic hydrometeor categories, and either size-resolved (bin) particle size distributions, or multiple prognostic moments of the size distribution. Yet, uncertainty in model representation of microphysical processes and the effects of microphysics on numerical simulation of weather has not shown an improvement commensurate with the sophistication of these schemes. We posit that this may be caused by unconstrained assumptions of these schemes, such as ad-hoc parameter value choices and structural uncertainties (e.g. choice of a particular form for the size distribution). We present work on development and observational constraint of a novel microphysical parameterization approach, the Bayesian Observationally-constrained Statistical-physical Scheme (BOSS), which seeks to address these sources of uncertainty. Our framework avoids unnecessary a priori assumptions, and instead relies on observations to provide probabilistic constraint of the scheme structure and sensitivities to environmental and microphysical conditions. We harness the rich microphysical information content of polarimetric radar observations to develop and constrain BOSS within a Bayesian inference framework using a Markov Chain Monte Carlo sampler (see Kumjian et al., this meeting, for details on development of an associated polarimetric forward operator). Our work shows how knowledge of microphysical processes is provided by polarimetric radar observations of diverse weather conditions, and which processes remain highly uncertain, even after considering observations.
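The Bayesian-constraint machinery can be illustrated with a one-parameter toy process standing in for a real microphysical scheme: a Metropolis sampler recovers the posterior of a rate parameter from synthetic observations. Everything here (the exponential process, noise level, proposal step) is assumed for the example and is far simpler than BOSS.

```python
import math
import random

def log_likelihood(theta, obs, sigma=0.1):
    """Gaussian log-likelihood of observations (t, y) under the toy model
    y = exp(-theta * t), a stand-in for a microphysical rate process."""
    return -sum((y - math.exp(-theta * t)) ** 2 for t, y in obs) / (2 * sigma ** 2)

def metropolis(obs, n=5000, step=0.05, seed=11):
    """Random-walk Metropolis sampler for the posterior of theta (flat prior)."""
    rng = random.Random(seed)
    theta = 1.0                       # deliberately poor starting guess
    ll = log_likelihood(theta, obs)
    samples = []
    for _ in range(n):
        prop = theta + rng.gauss(0.0, step)
        if prop > 0:                  # rates must stay positive
            ll_prop = log_likelihood(prop, obs)
            if math.log(rng.random()) < ll_prop - ll:
                theta, ll = prop, ll_prop
        samples.append(theta)
    return samples

truth = 0.5
obs = [(t, math.exp(-truth * t)) for t in range(1, 8)]  # synthetic "radar" data
post = metropolis(obs)[1000:]         # discard burn-in
est = sum(post) / len(post)           # posterior mean
```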
Probabilistic Counterfactuals: Semantics, Computation, and Applications
National Research Council Canada - National Science Library
Balke, Alexander
1997-01-01
... handled within the framework of standard probability theory. Starting with a functional description of physical mechanisms, we were able to derive the standard probabilistic properties of Bayesian networks and to show: (1
International Nuclear Information System (INIS)
Karimpouli, Sadegh; Hassani, Hossein; Nabi-Bidhendi, Majid; Khoshdel, Hossein; Malehmir, Alireza
2013-01-01
In this study, a carbonate field from Iran was examined. Estimation of rock properties such as porosity and permeability is much more challenging in carbonate rocks than sandstone rocks because of their strong heterogeneity. The frame flexibility factor (γ) is a rock physics parameter which is related not only to pore structure variation but also to solid/pore connectivity and rock texture in carbonate reservoirs. We used porosity, frame flexibility factor and bulk modulus of fluid as the proper parameters to study this gas carbonate reservoir. According to rock physics parameters, three facies were defined: favourable and unfavourable facies and then a transition facies located between these two end members. To capture both the inversion solution and associated uncertainty, a complete implementation of the Bayesian inversion of the facies from pre-stack seismic data was applied to well data and validated with data from another well. Finally, this method was applied on a 2D seismic section and, in addition to inversion of petrophysical parameters, the high probability distribution of favourable facies was also obtained. (paper)
Ha, Taesung
A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequence identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin Hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The human error probability (HEP) for the core relocation was estimated from these two competing quantities: phenomenological time and operators' performance time. The sensitivity of each probability distribution in human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and the parameters in that model. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability and its potential
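The competing-random-variables idea in this abstract reduces to estimating P(performance time > phenomenological time) by Monte Carlo. The sketch below assumes lognormal distributions with made-up parameters purely for illustration; the MNR study's actual distributions came from simulation and operator interviews:

```python
import random

random.seed(42)

def estimate_hep(n_trials=100_000):
    """Monte Carlo estimate of the HEP as the probability that the
    operators' performance time exceeds the phenomenological time
    available before core relocation. Parameters are hypothetical."""
    failures = 0
    for _ in range(n_trials):
        # Phenomenological time: window until core damage (minutes).
        t_phen = random.lognormvariate(3.4, 0.3)
        # Performance time: time operators need to act (minutes).
        t_perf = random.lognormvariate(2.9, 0.5)
        if t_perf > t_phen:
            failures += 1
    return failures / n_trials

hep = estimate_hep()
```

Sensitivity to the assumed distributions (as the abstract investigates) can then be probed by re-running the estimate with alternative distribution families or parameters.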
International Nuclear Information System (INIS)
Sharirli, M.; Butner, J.M.; Rand, J.L.; Macek, R.J.; McKinney, S.J.; Roush, M.L.
1992-01-01
This paper presents results from a Los Alamos National Laboratory Engineering and Safety Analysis Group assessment of the worst-case design-basis accident associated with the Clinton P. Anderson Meson Physics Facility (LAMPF)/Weapons Neutron Research (WNR) Facility. The primary goal of the analysis was to quantify the accident sequences that result in personnel radiation exposure in the WNR Experimental Hall following the worst-case design-basis accident, a complete spill of the LAMPF accelerator 1L beam. This study also provides information regarding the roles of hardware systems and operators in these sequences, and insights regarding the areas where improvements can increase facility-operation safety. Results also include confidence ranges to incorporate combined effects of uncertainties in probability estimates and importance measures to determine how variations in individual events affect the frequencies in accident sequences.
Schweizer, B
2005-01-01
Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.
International Nuclear Information System (INIS)
Hall, P.L.; Strutt, J.E.
2003-01-01
In reliability engineering, component failures are generally classified in one of three ways: (1) early life failures; (2) failures having random onset times; and (3) late life or 'wear out' failures. When the time-distribution of failures of a population of components is analysed in terms of a Weibull distribution, these failure types may be associated with shape parameters β having values β < 1, β ≈ 1 and β > 1, respectively. Early life failures are frequently attributed to poor design (e.g. poor materials selection) or problems associated with manufacturing or assembly processes. We describe a methodology for the implementation of physics-of-failure models of component lifetimes in the presence of parameter and model uncertainties. This treats uncertain parameters as random variables described by some appropriate statistical distribution, which may be sampled using Monte Carlo methods. The number of simulations required depends upon the desired accuracy of the predicted lifetime. Provided that the number of sampled variables is relatively small, an accuracy of 1-2% can be obtained using typically 1000 simulations. The resulting collection of times-to-failure is then sorted into ascending order and fitted to a Weibull distribution to obtain a shape factor β and a characteristic life-time η. Examples are given of the results obtained using three different models: (1) the Eyring-Peck (EP) model for corrosion of printed circuit boards; (2) a power-law corrosion growth (PCG) model which represents the progressive deterioration of oil and gas pipelines; and (3) a random shock-loading model of mechanical failure. It is shown that for any specific model the values of the Weibull shape parameters obtained may be strongly dependent on the degree of uncertainty of the underlying input parameters. Both the EP and PCG models can yield a wide range of values of β, from β>1, characteristic of wear-out behaviour, to β<1, characteristic of early-life failure, depending on the degree of
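The sample-sort-fit procedure described above can be sketched end to end. The physics-of-failure model below is a hypothetical one-parameter stand-in (not the EP, PCG, or shock-loading models of the paper), and the Weibull fit uses median-rank regression on the linearized CDF:

```python
import math
import random

random.seed(0)

def time_to_failure():
    """Hypothetical physics-of-failure model: lifetime driven by one
    uncertain input parameter sampled by Monte Carlo."""
    activation = random.gauss(0.7, 0.07)          # uncertain parameter
    return 1000.0 * math.exp(5.0 * (activation - 0.7))

# Step 1: simulate, Step 2: sort into ascending order.
n = 1000
t = sorted(time_to_failure() for _ in range(n))

# Step 3: fit a two-parameter Weibull by median-rank regression.
# Linearized CDF: ln(-ln(1 - F)) = beta * ln(t) - beta * ln(eta)
xs = [math.log(ti) for ti in t]
ys = [math.log(-math.log(1.0 - (i - 0.3) / (n + 0.4))) for i in range(1, n + 1)]
mx, my = sum(xs) / n, sum(ys) / n
beta = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
       sum((x - mx) ** 2 for x in xs)             # slope = shape factor
eta = math.exp(mx - my / beta)                    # characteristic life
```

Widening the input uncertainty (the 0.07 above) and re-running shows the paper's central point: the fitted β can shift markedly with the degree of parameter uncertainty.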
Tierz, Pablo; Woodhouse, Mark; Phillips, Jeremy; Sandri, Laura; Selva, Jacopo; Marzocchi, Warner; Odbert, Henry
2017-04-01
Volcanoes are extremely complex physico-chemical systems where magma formed at depth breaks into the planet's surface resulting in major hazards from local to global scales. Volcano physics is dominated by non-linearities and complicated spatio-temporal interrelationships which make volcanic hazards stochastic (i.e. not deterministic) by nature. In this context, probabilistic assessments are required to quantify the large uncertainties related to volcanic hazards. Moreover, volcanoes are typically multi-hazard environments where different hazardous processes can occur either simultaneously or in succession. In particular, explosive volcanoes are able to accumulate, through tephra fallout and Pyroclastic Density Currents (PDCs), large amounts of pyroclastic material into the drainage basins surrounding the volcano. This addition of fresh particulate material alters the local/regional hydrogeological equilibrium and increases the frequency and magnitude of sediment-rich aqueous flows, commonly known as lahars. The initiation and volume of rain-triggered lahars may depend on: rainfall intensity and duration; antecedent rainfall; terrain slope; thickness, permeability and hydraulic diffusivity of the tephra deposit; etc. Quantifying these complex interrelationships (and their uncertainties), in a tractable manner, requires a structured but flexible probabilistic approach. A Bayesian Belief Network (BBN) is a directed acyclic graph that allows the representation of the joint probability distribution for a set of uncertain variables in a compact and efficient way, by exploiting unconditional and conditional independences between these variables. Once constructed and parametrized, the BBN uses Bayesian inference to perform causal (e.g. forecast) and/or evidential reasoning (e.g. explanation) about query variables, given some evidence. In this work, we illustrate how BBNs can be used to model the influence of several variables on the generation of rain-triggered lahars
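A toy version of such a BBN makes the causal/evidential distinction concrete. The network below has two binary parents (heavy rainfall, thick tephra deposit) and one child (lahar); all probabilities are illustrative, not calibrated volcanological values:

```python
# Hypothetical two-parent BBN: rainfall and tephra thickness -> lahar.
p_rain_heavy = 0.2                      # P(rainfall = heavy)
p_tephra_thick = 0.3                    # P(tephra deposit = thick)
p_lahar = {                             # CPT: P(lahar | rain, tephra)
    (True, True): 0.80,
    (True, False): 0.30,
    (False, True): 0.10,
    (False, False): 0.01,
}

def p_parent(r, t):
    """Joint probability of the parent configuration (independent parents)."""
    return ((p_rain_heavy if r else 1 - p_rain_heavy)
            * (p_tephra_thick if t else 1 - p_tephra_thick))

# Causal (forecast) query: marginal probability of a lahar.
p_l = sum(p_lahar[(r, t)] * p_parent(r, t)
          for r in (True, False) for t in (True, False))

# Evidential (explanation) query via Bayes' rule:
# P(rainfall = heavy | lahar observed).
p_rain_given_lahar = sum(p_lahar[(True, t)] * p_parent(True, t)
                         for t in (True, False)) / p_l
```

Observing a lahar raises the probability of heavy rainfall well above its 0.2 prior, which is exactly the evidential reasoning the abstract describes; real lahar BBNs simply have more nodes and data-driven conditional probability tables.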
Probabilistic escalation modelling
Energy Technology Data Exchange (ETDEWEB)
Korneliussen, G.; Eknes, M.L.; Haugen, K.; Selmer-Olsen, S. [Det Norske Veritas, Oslo (Norway)
1997-12-31
This paper describes how structural reliability methods may successfully be applied within quantitative risk assessment (QRA) as an alternative to traditional event tree analysis. The emphasis is on fire escalation in hydrocarbon production and processing facilities. This choice was made due to potential improvements over current QRA practice associated with both the probabilistic approach and more detailed modelling of the dynamics of escalating events. The physical phenomena important for the events of interest are explicitly modelled as functions of time. Uncertainties are represented through probability distributions. The uncertainty modelling enables the analysis to be simple when possible and detailed when necessary. The methodology features several advantages compared with traditional risk calculations based on event trees. (Author)
Integrated Deterministic-Probabilistic Safety Assessment Methodologies
Energy Technology Data Exchange (ETDEWEB)
Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.
2014-02-01
IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address respective sources of uncertainties, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)
A Tony Thomas-Inspired Guide to INSPIRE
Energy Technology Data Exchange (ETDEWEB)
O'Connell, Heath B.; /Fermilab
2010-04-01
The SPIRES database was created in the late 1960s to catalogue the high energy physics preprints received by the SLAC Library. In the early 1990s it became the first database on the web and the first website outside of Europe. Although indispensable to the HEP community, its aging software infrastructure is becoming a serious liability. In a joint project involving CERN, DESY, Fermilab and SLAC, a new database, INSPIRE, is being created to replace SPIRES using CERN's modern, open-source Invenio database software. INSPIRE will maintain the content and functionality of SPIRES plus many new features. I describe this evolution from the birth of SPIRES to the current day, noting that the career of Tony Thomas spans this timeline.
A Tony Thomas-Inspired Guide to INSPIRE
International Nuclear Information System (INIS)
O'Connell, Heath B.
2010-01-01
The SPIRES database was created in the late 1960s to catalogue the high energy physics preprints received by the SLAC Library. In the early 1990s it became the first database on the web and the first website outside of Europe. Although indispensable to the HEP community, its aging software infrastructure is becoming a serious liability. In a joint project involving CERN, DESY, Fermilab and SLAC, a new database, INSPIRE, is being created to replace SPIRES using CERN's modern, open-source Invenio database software. INSPIRE will maintain the content and functionality of SPIRES plus many new features. I describe this evolution from the birth of SPIRES to the current day, noting that the career of Tony Thomas spans this timeline.
Doutsi, Effrosyni; Fillatre, Lionel; Antonini, Marc; Gaulmin, Julien
2018-07-01
This paper introduces a novel filter, which is inspired by the human retina. The human retina consists of three different layers: the outer plexiform layer (OPL), the inner plexiform layer, and the ganglionic layer. Our inspiration is the linear transform which takes place in the OPL and has been mathematically described by the neuroscientific model "virtual retina." This model is the cornerstone to derive the non-separable spatio-temporal OPL retina-inspired filter, referred to hereafter as the retina-inspired filter, studied in this paper. This filter is connected to the dynamic behavior of the retina, which enables the retina to increase the sharpness of the visual stimulus during filtering before its transmission to the brain. We establish that this retina-inspired transform forms a group of spatio-temporal Weighted Difference of Gaussian (WDoG) filters when it is applied to a still image visible for a given time. We analyze the spatial frequency bandwidth of the retina-inspired filter with respect to time. It is shown that the WDoG spectrum varies from a lowpass filter to a bandpass filter. Therefore, while time increases, the retina-inspired filter enables the extraction of different kinds of information from the input image. Finally, we discuss the benefits of using the retina-inspired filter in image processing applications such as edge detection and compression.
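The lowpass-to-bandpass transition can be demonstrated with a minimal 1D WDoG sketch. The exponential time-weighting of the surround is a simplified stand-in for the actual OPL temporal dynamics of "virtual retina," and the parameter values are arbitrary:

```python
import numpy as np

def gaussian_kernel(sigma, size=21):
    """Normalized 1D Gaussian kernel (sums to 1)."""
    x = np.arange(size) - size // 2
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()

def wdog_kernel(t, sigma_c=1.0, sigma_s=3.0, tau=1.0):
    """Weighted Difference of Gaussians at time t: a narrow centre minus
    a broad surround whose weight ramps up over time (illustrative model)."""
    w_s = 1.0 - np.exp(-t / tau)        # surround weight grows with time
    return gaussian_kernel(sigma_c) - w_s * gaussian_kernel(sigma_s)

# DC gain = kernel sum: ~1 at early times (lowpass, image-like response),
# ~0 at late times (bandpass, edge-like response).
dc_early = wdog_kernel(0.01).sum()
dc_late = wdog_kernel(10.0).sum()
```

Since both Gaussians are normalized, the DC gain is exactly 1 minus the surround weight, so the spectrum slides from lowpass toward bandpass as time increases, mirroring the behavior the abstract describes.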
Wagner, Tom
2010-01-01
The ceremonial copper and iron bells at the Smithsonian's National Museum of African Art were the author's inspiration for an interdisciplinary unit with a focus on the contributions various cultures make toward the richness of a community. The author of this article describes an Edo bell-inspired ceramic project incorporating slab-building…
DEFF Research Database (Denmark)
Vagnby, Bo
2008-01-01
Danish housing policy needs a dose of renewed social concern - and could find new inspiration in Britain's housing and urban planning policies, says Bo Vagnby. Publication date: November.
Learning Probabilistic Logic Models from Probabilistic Examples.
Chen, Jianzhong; Muggleton, Stephen; Santos, José
2008-10-01
We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.
Microflyers: inspiration from nature
Sirohi, Jayant
2013-04-01
Over the past decade, there has been considerable interest in miniaturizing aircraft to create a class of extremely small, robotic vehicles with a gross mass on the order of tens of grams and a dimension on the order of tens of centimeters. These are collectively referred to as micro aerial vehicles (MAVs) or microflyers. Because the size of microflyers is on the same order as that of small birds and large insects, engineers are turning to nature for inspiration. Bioinspired concepts make use of structural or aerodynamic mechanisms that are observed in insects and birds, such as elastic energy storage and unsteady aerodynamics. Biomimetic concepts attempt to replicate the form and function of natural flyers, such as flapping-wing propulsion and external appearance. This paper reviews recent developments in the area of man-made microflyers. The design space for microflyers will be described, along with fundamental physical limits to miniaturization. Key aerodynamic phenomena at the scale of microflyers will be highlighted. Because the focus is on bioinspiration and biomimetics, scaled-down versions of conventional aircraft, such as fixed wing micro air vehicles and microhelicopters will not be addressed. A few representative bioinspired and biomimetic microflyer concepts developed by researchers will be described in detail. Finally, some of the sensing mechanisms used by natural flyers that are being implemented in man-made microflyers will be discussed.
2004-01-01
Art students inspired by CERN will be returning to show their work 9 to 16 October in Building 500, outside the Auditorium. Seventeen art students from around Europe visited CERN last January for a week of introductions to particle physics and astrophysics, and discussions with CERN scientists about their projects. A CERN scientist "adopted" each artist so they could ask questions during and after the visit. Now the seeds planted during their visit have come to fruition in a show using many media and exploring varied concepts, such as how people experience the online world, the sheer scale of CERN's equipment, and the abstractness of the entities scientists are looking for. "The work is so varied, people are going to love some pieces and detest others," says Andrew Charalambous, the project coordinator from University College London who is also curating the exhibition. "It's contemporary modern art, and that's sometimes difficult to take in." For more information on this thought-provoking show, see: htt...
Probabilistic Logical Characterization
DEFF Research Database (Denmark)
Hermanns, Holger; Parma, Augusto; Segala, Roberto
2011-01-01
Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.
Conditional Probabilistic Population Forecasting
Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.
2003-01-01
Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...
Conditional probabilistic population forecasting
Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang
2003-01-01
Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...
Conditional Probabilistic Population Forecasting
Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang
2004-01-01
Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...
Probabilistic Design of Wind Turbines
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Toft, H.S.
2010-01-01
Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability. It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated.
Duplicate Detection in Probabilistic Data
Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert
2009-01-01
Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused
Câmara, Daniel
2015-01-01
Bio-inspired techniques are based on principles, or models, of biological systems. In general, natural systems present remarkable capabilities of resilience and adaptability. In this book, we explore how bio-inspired methods can solve different problems linked to computer networks. Future networks are expected to be autonomous, scalable and adaptive. During millions of years of evolution, nature has developed a number of different systems that present these and other characteristics required for the next generation networks. Indeed, a series of bio-inspired methods have been successfully used to solve the most diverse problems linked to computer networks. This book presents some of these techniques from a theoretical and practical point of view. Discusses the key concepts of bio-inspired networking to aid you in finding efficient networking solutions Delivers examples of techniques both in theoretical concepts and practical applications Helps you apply nature's dynamic resource and task management to your co...
Probabilistic Structural Analysis Program
Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.
2010-01-01
NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
Probabilistic programmable quantum processors
International Nuclear Information System (INIS)
Buzek, V.; Ziman, M.; Hillery, M.
2004-01-01
We analyze how to improve performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that arbitrary SU(2) transformations of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can be generalized also for qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)
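The enhancement from conditional loops can be illustrated with the simplest possible model: if each run succeeds independently with probability p and a failed run can be corrected and retried, k loop iterations succeed with probability 1 - (1 - p)^k. This geometric model is a hedged simplification; the actual gain in the paper depends on the specific error-correction scheme:

```python
def success_after_loops(p, k):
    """Overall success probability after k independent conditional-loop
    iterations, each succeeding with probability p (illustrative model)."""
    return 1.0 - (1.0 - p) ** k

# Example: a processor with 25% single-run success probability.
p_single = 0.25
p_two_loops = success_after_loops(p_single, 2)    # 0.4375
p_ten_loops = success_after_loops(p_single, 10)
```

Even a modest single-run probability approaches certainty after enough loops, which is the qualitative behavior the abstract attributes to systematic error correction.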
Practices of Waldorf-Inspired Schools. Research Brief
Friedlaender, Diane; Beckham, Kyle; Zheng, Xinhua; Darling-Hammond, Linda
2015-01-01
"Growing a Waldorf-Inspired Approach in a Public School District" documents the practices and outcomes of Alice Birney, a Waldorf-Inspired School in Sacramento City Unified School District (SCUSD). This study highlights how such a school addresses students' academic, social, emotional, physical, and creative development. The study also…
Probabilistic Infinite Secret Sharing
Csirmaz, László
2013-01-01
The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...
Probabilistic Programming (Invited Talk)
Yang, Hongseok
2017-01-01
Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000s, it is only in the last few years that proba...
Inspirations in medical genetics.
Asadollahi, Reza
2016-02-01
There are abundant instances in the history of genetics and medical genetics to illustrate how curiosity, charisma of mentors, nature, art, the saving of lives and many other matters have inspired great discoveries. These achievements from deciphering genetic concepts to characterizing genetic disorders have been crucial for management of the patients. There remains, however, a long pathway ahead. © The Author(s) 2014.
Tank, Kristina; Moore, Tamara; Strnat, Meg
2015-01-01
This article describes the final lesson within a seven-day STEM and literacy unit that is part of the Picture STEM curriculum (pictureSTEM. org) and uses engineering to integrate science and mathematics learning in a meaningful way (Tank and Moore 2013). For this engineering challenge, students used nature as a source of inspiration for designs to…
Rice, Nicole
2012-01-01
The house paintings of the South African Ndebele people are more than just an attempt to improve the aesthetics of a community; they are a source of identity and significance for Ndebele women. In this article, the author describes an art project wherein students use the tradition of Ndebele house painting as inspiration for creating their own…
DEFF Research Database (Denmark)
Valentin, Jan B.; Andreetta, Christian; Boomsma, Wouter
2014-01-01
We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length scale. The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications. © 2013 Wiley Periodicals, Inc.
Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona
2016-06-01
Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting match weights and how to convert match weights into posterior probabilities of a match using Bayes' theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
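The weight-to-posterior conversion described here follows the standard Fellegi-Sunter framework and can be sketched for one candidate record pair. The m-probabilities, u-probabilities, field names, and prior below are all illustrative, not values from the article:

```python
import math

# For each comparison field: m = P(field agrees | true match),
#                            u = P(field agrees | non-match),
# and whether the field agrees for this candidate pair (all hypothetical).
fields = {
    "surname":    {"m": 0.95, "u": 0.01, "agree": True},
    "birth_year": {"m": 0.98, "u": 0.05, "agree": True},
    "postcode":   {"m": 0.90, "u": 0.02, "agree": False},
}

# Match weight: sum of log2 likelihood ratios over the comparison fields.
total_weight = 0.0
for f in fields.values():
    if f["agree"]:
        total_weight += math.log2(f["m"] / f["u"])          # agreement weight
    else:
        total_weight += math.log2((1 - f["m"]) / (1 - f["u"]))  # disagreement

# Bayes' theorem converts the weight to a posterior match probability,
# given a prior probability that a candidate pair is a true match.
prior = 0.01
likelihood_ratio = 2.0 ** total_weight
posterior = prior * likelihood_ratio / (prior * likelihood_ratio + (1 - prior))
```

Note how the disagreeing field contributes a negative weight, yet two strongly agreeing fields still push a 1% prior to a posterior around two-thirds; thresholding such posteriors is how pairs are classified as links or non-links.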
Data specifications for INSPIRE
Portele, Clemens; Woolf, Andrew; Cox, Simon
2010-05-01
In Europe a major recent development has been the entry into force of the INSPIRE Directive in May 2007, establishing an infrastructure for spatial information in Europe to support Community environmental policies, and policies or activities which may have an impact on the environment. INSPIRE is based on the infrastructures for spatial information established and operated by the 27 Member States of the European Union. The Directive addresses 34 spatial data themes needed for environmental applications, with key components specified through technical implementing rules. This makes INSPIRE a unique example of a legislative "regional" approach. One of the requirements of the INSPIRE Directive is to make existing spatial data sets with relevance for one of the spatial data themes available in an interoperable way, i.e. where the spatial data from different sources in Europe can be combined to a coherent result. Since INSPIRE covers a wide range of spatial data themes, the first step has been the development of a modelling framework that provides a common foundation for all themes. This framework is largely based on the ISO 19100 series of standards. The use of common generic spatial modelling concepts across all themes is an important enabler for interoperability. As a second step, data specifications for the first set of themes have been developed based on the modelling framework. The themes include addresses, transport networks, protected sites, hydrography, administrative areas and others. The data specifications were developed by selected experts nominated by stakeholders from all over Europe. For each theme a working group was established in early 2008 working on their specific theme and collaborating with the other working groups on cross-theme issues. After a public review of the draft specifications starting in December 2008, an open testing process and thorough comment resolution process, the draft technical implementing rules for these themes have been
Formalizing Probabilistic Safety Claims
Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.
2011-01-01
A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.
Brain-inspired Stochastic Models and Implementations
Al-Shedivat, Maruan
2015-05-12
One of the approaches to building artificial intelligence (AI) is to decipher the principles of brain function and to employ similar mechanisms for solving cognitive tasks, such as visual perception or natural language understanding, using machines. The recent breakthrough, named deep learning, demonstrated that large multi-layer networks of artificial neural-like computing units attain remarkable performance on some of these tasks. Nevertheless, such artificial networks remain only loosely inspired by the brain, whose rich structures and mechanisms may further suggest new algorithms or even new paradigms of computation. In this thesis, we explore brain-inspired probabilistic mechanisms, such as neural and synaptic stochasticity, in the context of generative models. The two questions we ask here are: (i) what kind of models can describe a neural learning system built of stochastic components? and (ii) how can we implement such systems efficiently? To give specific answers, we consider two well-known models and the corresponding neural architectures: the Naive Bayes model implemented with a winner-take-all spiking neural network and the Boltzmann machine implemented in a spiking or non-spiking fashion. We propose and analyze an efficient neuromorphic implementation of the stochastic neural firing mechanism and study the effects of synaptic unreliability on learning generative energy-based models implemented with neural networks.
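The kind of stochastic component studied in the thesis can be caricatured in a few lines: a neuron whose synapses transmit unreliably and whose firing is itself probabilistic. This is a minimal sketch of the idea only; the weights, release probability and input pattern are illustrative and not taken from the thesis.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def stochastic_unit(weights, inputs, release_p=0.8):
    # Each synapse transmits its input only with probability `release_p`
    # (synaptic unreliability); the unit then fires with a sigmoid
    # probability of the net input it actually received.
    net = sum(w * x for w, x in zip(weights, inputs)
              if random.random() < release_p)
    return 1 if random.random() < sigmoid(net) else 0

# Empirical firing rate for a fixed input pattern
rate = sum(stochastic_unit([2.0, -1.0, 0.5], [1, 1, 1])
           for _ in range(10000)) / 10000
print(rate)
```

Averaging over many trials shows how synaptic unreliability smears the firing probability relative to a deterministic unit with the same weights.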
2014-05-01
UK public libraries offer walk-in access to research
Atoms for Peace? The Atomic Weapons Establishment and UK universities
Students present their research to academics: CERN@school
Science in a suitcase: Marvin and Milo visit Ethiopia
Inspiring telescopes
A day for everyone teaching physics 2014
Forthcoming Events
DEFF Research Database (Denmark)
Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian
2016-01-01
We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good...... metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...
Probabilistic conditional independence structures
Studeny, Milan
2005-01-01
Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach.The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets.Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given.In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians, and by researchers in artificial intelligence.The necessary elementary mathematical notions are recalled in an appendix.
Probabilistic approach to mechanisms
Sandler, BZ
1984-01-01
This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.
Probabilistic systems coalgebraically: A survey
Sokolova, Ana
2011-01-01
We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490
Mahtani, Kamal Ram; Protheroe, Joanne; Slight, Sarah Patricia; Demarzo, Marcelo Marcos Piva; Blakeman, Thomas; Barton, Christopher A; Brijnath, Bianca; Roberts, Nia
2013-01-01
Objective To examine if there is an increased participation in physical or sporting activities following an Olympic or Paralympic games. Design Overview of systematic reviews. Methods We searched the Medline, Embase, Cochrane, DARE, SportDISCUS and Web of Knowledge databases. In addition, we searched for ‘grey literature’ in Google, Google scholar and on the International Olympic Committee websites. We restricted our search to those reviews published in English. We used the AMSTAR tool to assess the methodological quality of those systematic reviews included. Primary and secondary outcome measures The primary outcome was evidence for an increased participation in physical or sporting activities. Secondary outcomes included public perceptions of sport during and after an Olympic games, barriers to increased sports participation and any other non-sporting health benefits. Results Our systematic search revealed 844 citations, of which only two matched our inclusion criteria. The quality of these two reviews was assessed by three independent reviewers as ‘good’ using the AMSTAR tool for quality appraisal. Both reviews reported little evidence of an increased uptake of sporting activity following an Olympic Games event. Other effects on health, for example, changes in hospital admissions, suicide rates and drug use, were cited although there was insufficient evidence to see an overall effect. Conclusion There is a paucity of evidence to support the notion that hosting an Olympic games leads to an increased participation in physical or sporting activities for host countries. We also found little evidence to suggest other health benefits. We conclude that the true success of these and future games should be evaluated by high-quality, evidence-based studies that have been commissioned before, during and following the completion of the event. Only then can the true success and legacy of the games be established. PMID:23299112
Confluence reduction for probabilistic systems
Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette
In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To
Bergstra, J.A.; Middelburg, C.A.
2015-01-01
We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution
Probabilistic simple sticker systems
Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod
2017-04-01
A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper "DNA computing, sticker systems and universality" (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings are selected for the language according to some probabilistic requirement. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
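The probability computation described in the record is a simple product over axiom occurrences, optionally followed by a threshold requirement. A minimal sketch, with made-up axiom names and probabilities:

```python
from functools import reduce

# Illustrative axiom probabilities (hypothetical names and values)
axiom_p = {"ax1": 0.5, "ax2": 0.3, "ax3": 0.2}

def string_probability(axiom_occurrences):
    # Probability of a generated string: the product of the probabilities
    # of all occurrences of initial strings (axioms) used in its computation.
    return reduce(lambda acc, a: acc * axiom_p[a], axiom_occurrences, 1.0)

def in_language(axiom_occurrences, cutoff=0.05):
    # One possible probabilistic requirement: a cutoff threshold.
    return string_probability(axiom_occurrences) >= cutoff

print(string_probability(["ax1", "ax2", "ax1"]))  # 0.5 * 0.3 * 0.5 = 0.075
print(in_language(["ax1", "ax2", "ax1"]))
```

The cutoff mechanism is what lets the probabilistic enhancement carve non-regular languages out of the regular language generated by the underlying simple sticker system.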
Visualizing Probabilistic Proof
Guerra-Pujol, Enrique
2015-01-01
The author revisits the Blue Bus Problem, a famous thought-experiment in law involving probabilistic proof, and presents simple Bayesian solutions to different versions of the blue bus hypothetical. In addition, the author expresses his solutions in standard and visual formats, i.e. in terms of probabilities and natural frequencies.
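The Bayesian core of such a solution fits in a few lines. The numbers below are hypothetical (a base rate of blue buses plus a witness of given accuracy), not the figures from the article:

```python
def posterior_blue(prior_blue, witness_accuracy):
    # Bayes' rule: P(blue | witness says blue)
    p_says_blue = (witness_accuracy * prior_blue
                   + (1 - witness_accuracy) * (1 - prior_blue))
    return witness_accuracy * prior_blue / p_says_blue

# Hypothetical numbers: 80% of the town's buses are blue and the
# witness identifies bus colors correctly 90% of the time.
print(posterior_blue(0.8, 0.9))
```

Expressing the same computation in natural frequencies (e.g. "of 100 buses, 80 are blue; the witness correctly tags 72 of them...") is the visual format the author advocates.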
Memristive Probabilistic Computing
Alahmadi, Hamzah
2017-10-01
In the era of the Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computations. In those applications, approximate computing provides a perfect fit to optimize energy efficiency while compromising on accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from typical approximate CMOS adders. Furthermore, it allows for high area savings and design flexibility between performance and power saving. To reach a similar performance level as approximate CMOS adders, the memristive adder achieves 60% power savings. An image-compression application is investigated using the memristive probabilistic adders, exploring the performance-energy trade-off.
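A probabilistic adder can be caricatured in software as a ripple-carry adder whose per-bit outputs occasionally flip. This is an illustrative behavioral sketch only; the correctness probability stands in for the stochastic memristor switching and is not a figure from the work.

```python
import random

random.seed(1)

def prob_full_adder(a, b, cin, p_correct=0.95):
    # Deterministic full-adder logic whose outputs each flip with
    # probability 1 - p_correct, mimicking stochastic switching.
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    if random.random() > p_correct:
        s ^= 1
    if random.random() > p_correct:
        cout ^= 1
    return s, cout

def prob_ripple_add(x, y, bits=8, p_correct=0.95):
    cin, out = 0, 0
    for i in range(bits):
        s, cin = prob_full_adder((x >> i) & 1, (y >> i) & 1, cin, p_correct)
        out |= s << i
    return out

print(prob_ripple_add(100, 27))  # usually near 127; errors in low bits cost little
```

Error-tolerant workloads such as image compression absorb occasional low-order bit flips, which is exactly the accuracy-for-energy bargain approximate computing makes.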
DEFF Research Database (Denmark)
Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte
2008-01-01
This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...
Transitive probabilistic CLIR models.
Kraaij, W.; de Jong, Franciska M.G.
2004-01-01
Transitive translation could be a useful technique to enlarge the number of supported language pairs for a cross-language information retrieval (CLIR) system in a cost-effective manner. The paper describes several setups for transitive translation based on probabilistic translation models. The
DEFF Research Database (Denmark)
Meier, Ninna
2016-01-01
What academics or books have inspired you in your writing and research, or helped to make sense of the world around you? In this feature essay, Ninna Meier returns to her experience of reading Hannah Arendt as she sought to understand work and how it relates to value production in capitalist economies. Meier recounts how Arendt's book On Revolution (1963) forged connective threads between the 'smallest parts' and the 'largest wholes' and showed how academic work is never fully relegated to the past, but can return in new iterations across time.
Combining Bio-inspired Sensing with Bio-inspired Locomotion
DEFF Research Database (Denmark)
Shaikh, Danish; Hallam, John; Christensen-Dalsgaard, Jakob
In this paper we present a preliminary Braitenberg vehicle–like approach to combine bio-inspired audition with bio-inspired quadruped locomotion in simulation. Locomotion gaits of the salamander–like robot Salamandra robotica are modified by a lizard's peripheral auditory system model.
Hartmann, Stephan
2011-01-01
Many results of modern physics--those of quantum mechanics, for instance--come in a probabilistic guise. But what do probabilistic statements in physics mean? Are probabilities matters of objective fact and part of the furniture of the world, as objectivists think? Or do they only express ignorance or belief, as Bayesians suggest? And how are probabilistic hypotheses justified and supported by empirical evidence? Finally, what does the probabilistic nature of physics imply for our understanding of the world? This volume is the first to provide a philosophical appraisal of probabilities in all of physics. Its main aim is to make sense of probabilistic statements as they occur in the various physical theories and models and to provide a plausible epistemology and metaphysics of probabilities. The essays collected here consider statistical physics, probabilistic modelling, and quantum mechanics, and critically assess the merits and disadvantages of objectivist and subjectivist views of probabilities in these fie...
Probabilistic assessment of faults
International Nuclear Information System (INIS)
Foden, R.W.
1987-01-01
Probabilistic safety analysis (PSA) is the process by which the probability (or frequency of occurrence) of reactor fault conditions which could lead to unacceptable consequences is assessed. The basic objective of a PSA is to allow a judgement to be made as to whether or not the principal probabilistic requirement is satisfied. It also gives insights into the reliability of the plant which can be used to identify possible improvements. This is explained in the article. The scope of a PSA and the PSA performed by the National Nuclear Corporation (NNC) for the Heysham II and Torness AGRs and Sizewell-B PWR are discussed. The NNC methods for hazards, common cause failure and operator error are mentioned. (UK)
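The frequency of a fault condition in a PSA is typically computed by combining basic-event probabilities through fault-tree gates. A minimal sketch with hypothetical event probabilities (not from the NNC analyses):

```python
def and_gate(probs):
    # AND gate: all independent basic events must occur.
    p = 1.0
    for x in probs:
        p *= x
    return p

def or_gate(probs):
    # OR gate: at least one independent basic event occurs.
    q = 1.0
    for x in probs:
        q *= 1.0 - x
    return 1.0 - q

# Hypothetical top event: pump failure AND (valve failure OR operator error)
p_top = and_gate([1e-3, or_gate([5e-4, 2e-3])])
print(p_top)
```

Common-cause failures break the independence assumption baked into these gates, which is why PSA methods treat them separately, as the article notes.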
Probabilistic Model Development
Adam, James H., Jr.
2010-01-01
Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) Will not be exceeded at a user-specified confidence level; 2) Will provide reference environments for: a) Peak flux; b) Event-integrated fluence; and c) Mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium and heavier ions.
Geothermal probabilistic cost study
Energy Technology Data Exchange (ETDEWEB)
Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.
1981-08-01
A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)
Probabilistic approaches to recommendations
Barbieri, Nicola; Ritacco, Ettore
2014-01-01
The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robust
Probabilistic liver atlas construction.
Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E
2017-01-13
Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability to be covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration in the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.
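The "simple estimation only involving the data at each spatial location" that the paper improves upon is just the voxel-wise coverage fraction over coregistered segmentations. A toy 2D sketch (illustrative masks, not liver data):

```python
def probabilistic_atlas(masks):
    # Baseline voxel-wise estimator: the probability that a voxel belongs
    # to the organ is the fraction of coregistered binary segmentations
    # covering it (the estimator the paper's GLM approach refines).
    n = len(masks)
    rows, cols = len(masks[0]), len(masks[0][0])
    return [[sum(m[i][j] for m in masks) / n for j in range(cols)]
            for i in range(rows)]

masks = [  # three toy 2x3 segmentations, assumed already coregistered
    [[1, 1, 0], [0, 1, 0]],
    [[1, 0, 0], [0, 1, 1]],
    [[1, 1, 0], [0, 0, 1]],
]
print(probabilistic_atlas(masks))
```

Because the estimate is computed after coregistration, any bias the registration introduces propagates directly into these probabilities, which motivates the paper's study of the registration step.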
Belytschko, Ted; Wing, Kam Liu
1987-01-01
In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probabilistic density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings with the yield stress on the random field are given.
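The mean-and-variance propagation at the heart of the perturbation approach can be sketched for a scalar response; the cantilever-deflection example and all numbers are illustrative, not from the report.

```python
def perturbation_moments(f, mu, var):
    # Taylor (perturbation) estimates of the response moments:
    # second-order for the mean, first-order for the variance.
    h = 1e-6 * abs(mu)
    f0 = f(mu)
    d1 = (f(mu + h) - f(mu - h)) / (2.0 * h)
    d2 = (f(mu + h) - 2.0 * f0 + f(mu - h)) / h ** 2
    return f0 + 0.5 * d2 * var, d1 ** 2 * var

# Tip deflection of a cantilever, P*L^3 / (3*E*I), with a random
# Young's modulus (mean 200 GPa, st. dev. 10 GPa; illustrative values)
span, inertia, load = 2.0, 1e-6, 1000.0
deflection = lambda E: load * span ** 3 / (3.0 * E * inertia)
mean, variance = perturbation_moments(deflection, 200e9, (10e9) ** 2)
print(mean, variance ** 0.5)
```

The caveats in the record show up directly here: the Taylor expansion is only trustworthy when the coefficient of variation is small and the input densities have decaying tails.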
Probabilistic Tsunami Hazard Analysis
Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.
2006-12-01
The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings and the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full waveform tsunami computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
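The Green's function summation described above is linear superposition: weight each precomputed unit-slip subfault waveform by its slip and sum sample by sample. A toy sketch with illustrative numbers:

```python
def synthesize_tsunami(unit_waveforms, slips):
    # Green's-function summation: slip-weighted sum of precomputed
    # unit-slip subfault waveforms at one coastal point.
    n_samples = len(unit_waveforms[0])
    return [sum(slip * wf[t] for slip, wf in zip(slips, unit_waveforms))
            for t in range(n_samples)]

# Toy precomputed waveforms at one coastal point (illustrative numbers)
unit = [
    [0.0, 0.1, 0.3, 0.2],  # subfault 1, unit slip
    [0.0, 0.0, 0.2, 0.4],  # subfault 2, unit slip
]
print(synthesize_tsunami(unit, [2.0, 1.5]))  # slips of 2 m and 1.5 m
```

Because the expensive wave propagation is done once per subfault, evaluating thousands of candidate slip distributions for the probabilistic calculation reduces to cheap weighted sums.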
Generative probabilistic models extend the scope of inferential structure determination
DEFF Research Database (Denmark)
Olsson, Simon; Boomsma, Wouter; Frellsen, Jes
2011-01-01
demonstrate that the use of generative probabilistic models instead of physical forcefields in the Bayesian formalism is not only conceptually attractive, but also improves precision and efficiency. Our results open new vistas for the use of sophisticated probabilistic models of biomolecular structure......Conventional methods for protein structure determination from NMR data rely on the ad hoc combination of physical forcefields and experimental data, along with heuristic determination of free parameters such as weight of experimental data relative to a physical forcefield. Recently, a theoretically...
Some probabilistic aspects of fracture
International Nuclear Information System (INIS)
Thomas, J.M.
1982-01-01
Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)
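A Monte Carlo probabilistic fracture mechanics analysis of the kind mentioned can be sketched as sampling crack size, stress and toughness and counting how often the applied stress intensity exceeds the toughness. All distributions and values below are illustrative, not from the article.

```python
import math
import random

random.seed(42)

def failure_probability(n=50000):
    # Monte Carlo estimate of P(K > K_Ic) with K = Y * sigma * sqrt(pi * a).
    failures = 0
    for _ in range(n):
        a = random.lognormvariate(math.log(0.002), 0.3)  # crack depth (m)
        sigma = random.gauss(300.0, 30.0)                # stress (MPa)
        k_ic = random.gauss(60.0, 6.0)                   # toughness (MPa*m^0.5)
        k = 1.12 * sigma * math.sqrt(math.pi * a)        # Y = 1.12 surface crack
        failures += k > k_ic
    return failures / n

print(failure_probability())
```

Inspection uncertainty enters such a model through the crack-size distribution: a probability-of-detection curve truncates or reweights the sampled crack depths.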
Vandromme, Rosalie; Thiéry, Yannick; Sedan, Olivier; Bernardie, Séverine
2016-04-01
Landslide hazard assessment is the estimation of a target area where landslides of a particular type, volume, runout and intensity may occur within a given period. The first step in analyzing landslide hazard consists of assessing the spatial and temporal failure probability (when the information is available, i.e. susceptibility assessment). Two types of approach are generally recommended to achieve this goal: (i) qualitative approaches (inventory-based methods and knowledge-driven methods) and (ii) quantitative approaches (data-driven methods or deterministic physically based methods). Among quantitative approaches, deterministic physically based methods (PBM) are generally used at local and/or site-specific scales (1:5,000-1:25,000 and >1:5,000, respectively). The main advantage of these methods is the calculation of the probability of failure (safety factor) under specific environmental conditions. For some models it is possible to integrate land use and climate change. In contrast, the major drawbacks are the large amounts of reliable and detailed data required (especially material types, their thickness and the heterogeneity of geotechnical parameters over a large area) and the fact that only shallow landslides are taken into account. This is why they are often used at site-specific scales (>1:5,000). Thus, to take into account (i) material heterogeneity, (ii) spatial variation of physical parameters and (iii) different landslide types, the French Geological Survey (BRGM) has developed a physically based model (PBM) implemented in a GIS environment. This PBM couples a global hydrological model (GARDENIA®), including a transient unsaturated/saturated hydrological component, with a physically based model computing the stability of slopes (ALICE®, Assessment of Landslides Induced by Climatic Events) based on the Morgenstern-Price method for any slip surface. The variability of mechanical parameters is handled by a Monte Carlo approach. The
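Handling mechanical-parameter variability by Monte Carlo can be sketched on the simplest slope model, the infinite slope, where the failure probability is the fraction of sampled parameter sets with a factor of safety below one. Soil values below are illustrative, not BRGM's.

```python
import math
import random

random.seed(7)

def p_failure(n=20000):
    # Monte Carlo on an infinite-slope factor of safety: draw cohesion
    # and friction angle, count realizations with FS < 1.
    slope = math.radians(30.0)
    depth, unit_weight = 2.0, 19.0  # slip depth (m), soil unit weight (kN/m^3)
    failures = 0
    for _ in range(n):
        c = random.gauss(5.0, 1.5)                   # cohesion (kPa)
        phi = math.radians(random.gauss(32.0, 3.0))  # friction angle
        resisting = c + unit_weight * depth * math.cos(slope) ** 2 * math.tan(phi)
        driving = unit_weight * depth * math.sin(slope) * math.cos(slope)
        failures += resisting / driving < 1.0
    return failures / n

print(p_failure())
```

ALICE® applies the same principle with the Morgenstern-Price method on arbitrary slip surfaces and transient pore pressures from the hydrological model, but the probabilistic logic is the one shown here.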
Energy Technology Data Exchange (ETDEWEB)
Alpuche Aviles, Jorge E.; VanBeek, Timothy [CancerCare Manitoba, Winnipeg (Canada); Sasaki, David; Rivest, Ryan; Akra, Mohamed [CancerCare Manitoba, Winnipeg (Canada); University of Manitoba, Winnipeg (Canada)
2016-08-15
Purpose: This work presents an algorithm used to quantify intra-fraction motion for patients treated using deep inspiration breath hold (DIBH). The algorithm quantifies the position of the chest wall in breast tangent fields using electronic portal images. Methods: The algorithm assumes that image profiles, taken along a direction perpendicular to the medial border of the field, follow a monotonic, smoothly decreasing function. This assumption is invalid in the presence of lung, and its violation can be used to calculate the chest wall position. The algorithm was validated by determining the position of the chest wall for varying field edge positions in portal images of a thoracic phantom. The algorithm was used to quantify intra-fraction motion in cine images for 7 patients treated with DIBH. Results: Phantom results show that changes in the distance between chest wall and field edge were accurate within 0.1 mm on average. For a fixed field edge, the algorithm calculates the position of the chest wall with a 0.2 mm standard deviation. Intra-fraction motion for DIBH patients was within 1 mm 91.4% of the time and within 1.5 mm 97.9% of the time. The maximum intra-fraction motion was 3.0 mm. Conclusions: A physics based algorithm was developed and can be used to quantify the position of chest wall irradiated in tangent portal images with an accuracy of 0.1 mm and precision of 0.6 mm. Intra-fraction motion for patients treated with DIBH at our clinic is less than 3 mm.
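The monotonicity assumption suggests a very simple detection rule: scan the profile and report the first index where it increases. This is a simplified sketch of the idea only, with a toy profile, not the clinical implementation.

```python
def chest_wall_index(profile):
    # The profile is assumed monotonically decreasing through tissue;
    # lung breaks the monotonicity, so the first increase marks the
    # chest wall position along the profile.
    for i in range(1, len(profile)):
        if profile[i] > profile[i - 1]:
            return i
    return None  # assumption holds everywhere: no lung in this profile

# Toy intensity profile perpendicular to the medial field border
profile = [100, 90, 80, 70, 70, 70, 70, 70, 75, 85, 90]
print(chest_wall_index(profile))  # first violation is at index 8
```

Tracking this index frame by frame in the cine images yields the intra-fraction motion trace reported in the results.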
Inspiration, anyone? (Editorial)
Directory of Open Access Journals (Sweden)
Lindsay Glynn
2006-09-01
Full Text Available I have to admit that writing an editorial for this issue was a struggle. Trying to sit down and write when the sun was shining outside and most of my colleagues were on vacation was, to say the least, difficult. Add to that research projects and conferences…let's just say that I found myself less than inspired. A pitiful plea for ideas to a colleague resulted in the reintroduction to a few recent evidence-based papers and resources, which inspired further searching and reading. Though I generally find myself surrounded (more like buried) in research papers and EBLIP literature, somehow I had missed the great strides that have been made of late in the world of evidence-based library and information practice. I realize now that I am inspired by the researchers, authors and innovators who are putting EBLIP on the proverbial map. My biggest beef with library literature in general has been the plethora of articles highlighting what we should be doing. Take a close look at the evidence-based practitioners in the information professions: these are some of the people who are actively practicing what has been preached for the past few years. Take, for example, the about-to-be-released Libraries Using Evidence Toolkit by Northern Sydney Central Coast Health and The University of Newcastle, Australia (see their announcement in this issue). An impressive advisory group is responsible for maintaining the currency and relevancy of the site as well as promoting the site and acting as a steering committee for related projects. This group is certainly doing more than "talking the talk": they took their experience at the 3rd International Evidence Based Librarianship Conference and did something with the information they obtained by implementing solutions that worked in their environment. The result? The creation of a collection of tools for all of us to use. This toolkit is just what EBLIP needs: a portal to resources aimed at supporting the information
INSPIRE: A new scientific information system for HEP
International Nuclear Information System (INIS)
Ivanov, R; Raae, L
2010-01-01
The status of high-energy physics (HEP) information systems has been jointly analyzed by the libraries of CERN, DESY, Fermilab and SLAC. As a result, the four laboratories have started the INSPIRE project - a new platform built by moving the successful SPIRES features and content, curated at DESY, Fermilab and SLAC, into the open-source CDS Invenio digital library software that was developed at CERN. INSPIRE will integrate current acquisition workflows and databases to host the entire body of the HEP literature (about one million records), aiming to become the reference HEP scientific information platform worldwide. It will provide users with fast access to full text journal articles and preprints, but also material such as conference slides and multimedia. INSPIRE will empower scientists with new tools to discover and access the results most relevant to their research, enable novel text- and data-mining applications, and deploy new metrics to assess the impact of articles and authors. In addition, it will introduce the 'Web 2.0' paradigm of user-enriched content in the domain of sciences, with community-based approaches to scientific publishing. INSPIRE represents a natural evolution of scholarly communication built on successful community-based information systems, and it provides a vision for information management in other fields of science. Inspired by the needs of HEP, we hope that the INSPIRE project will be inspiring for other communities.
Probabilistic safety assessment
International Nuclear Information System (INIS)
Hoertner, H.; Schuetz, B.
1982-09-01
For the purpose of assessing the applicability and informativeness of risk-analysis methods in licensing procedures under atomic law, the choice of instruments for probabilistic analysis, the problems and experience gained in their application, and the discussion of safety goals with respect to such instruments are of paramount significance. Naturally, such a complex field can only be dealt with step by step, making contributions on specific problems. The report at hand presents the essentials of a 'stocktaking' of system reliability studies in the licensing procedure under atomic law and of an American report (NUREG-0739) on 'Quantitative Safety Goals'. (orig.) [de]
Quantum probability for probabilists
Meyer, Paul-André
1993-01-01
In recent years, the classical theory of stochastic integration and stochastic differential equations has been extended to a non-commutative setup to develop models for quantum noises. The author, a specialist in classical stochastic calculus and martingale theory, provides an introduction to this rapidly expanding field in a way which should be accessible to probabilists familiar with the Ito integral. It can also, on the other hand, provide a means of access to the methods of stochastic calculus for physicists familiar with Fock space analysis.
Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization
Voet, van der H.; Slob, W.
2007-01-01
A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a
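The integrated idea (exposure and hazard both treated probabilistically) can be sketched with a toy Monte Carlo estimate: sample an individual's exposure and that individual's critical-effect dose, and count exceedances. The lognormal forms and all parameter values below are hypothetical illustrations, not values from the paper.

```python
import random
import math

random.seed(1)

def sample_exposure():
    # Hypothetical lognormal exposure distribution (exposure side)
    return math.exp(random.gauss(0.0, 1.0))

def sample_individual_threshold():
    # Hypothetical lognormal critical-effect dose, spread over individuals
    # (hazard-characterization side)
    return math.exp(random.gauss(2.0, 0.5))

# Probability that a random individual's exposure exceeds their own threshold
n = 100_000
exceed = sum(sample_exposure() > sample_individual_threshold() for _ in range(n))
print(f"P(exposure > individual critical dose) = {exceed / n:.4f}")
```

With these assumed parameters the analytic answer is Phi(-2/sqrt(1.25)), roughly 0.037, so the simulation mainly illustrates the mechanics of combining the two distributions.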
Probabilistic description of traffic flow
International Nuclear Information System (INIS)
Mahnke, R.; Kaupuzs, J.; Lubashevsky, I.
2005-01-01
A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As generalization we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, congested mode as stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given.
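The one-step growth/shrinkage process for a car cluster can be sketched as a simple stochastic simulation. The transition-rate forms and constants below are assumed purely for illustration; they are not the physically motivated ansatz the abstract refers to.

```python
import random

random.seed(0)

N_CARS = 60      # cars on the closed ring (assumed)
W_ATTACH = 0.9   # base attachment rate (assumed)
W_DETACH = 0.5   # detachment rate of the cluster's leading car (assumed)

def step(n):
    """One-step process: a cluster of size n grows or shrinks by one car."""
    w_plus = W_ATTACH * (N_CARS - n) / N_CARS   # more free cars -> faster growth
    w_minus = W_DETACH if n > 0 else 0.0
    total = w_plus + w_minus
    return n + 1 if random.random() < w_plus / total else max(n - 1, 0)

n = 0
sizes = []
for _ in range(20_000):
    n = step(n)
    sizes.append(n)

print("mean cluster size:", sum(sizes) / len(sizes))
```

The chain settles around the size where attachment and detachment rates balance (here near n = 27); a full treatment would instead derive the stationary distribution of this birth-death master equation.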
Probabilistic approach to manipulator kinematics and dynamics
International Nuclear Information System (INIS)
Rao, S.S.; Bhatti, P.K.
2001-01-01
A high performance, high speed robotic arm must be able to manipulate objects with a high degree of accuracy and repeatability. As with any other physical system, there are a number of factors causing uncertainties in the behavior of a robotic manipulator. These factors include manufacturing and assembling tolerances, and errors in the joint actuators and controllers. In order to study the effect of these uncertainties on the robotic end-effector and to obtain a better insight into the manipulator behavior, the manipulator kinematics and dynamics are modeled using a probabilistic approach. Based on the probabilistic model, kinematic and dynamic performance criteria are defined to provide measures of the behavior of the robotic end-effector. Techniques are presented to compute the kinematic and dynamic reliabilities of the manipulator. The effects of tolerances associated with the various manipulator parameters on the reliabilities are studied. Numerical examples are presented to illustrate the procedures
Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects
Nagpal, V. K.
1985-01-01
A probabilistic study was initiated to evaluate the effects of the geometric and material properties tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are conceived to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material properties perturbations, different loadings and a probabilistic combination of these loadings. Influences of these probabilistic variables are planned to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.
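How a geometric tolerance propagates into response variance can be illustrated with a minimal Monte Carlo sketch on a flat plate, where bending deflection scales as the inverse cube of thickness. The nominal dimensions, tolerance, and load constant below are assumed for illustration only.

```python
import random
import statistics

random.seed(42)

# Nominal flat-plate thickness and manufacturing tolerance (assumed values)
T_NOM = 2.0e-3   # m
T_SD = 0.05e-3   # m, standard deviation of the thickness perturbation

def deflection(t, load=100.0):
    # Plate bending deflection scales as load / t**3 (stiffness ~ t**3);
    # the proportionality constant is arbitrary in this sketch.
    return load / (t ** 3)

samples = [deflection(random.gauss(T_NOM, T_SD)) for _ in range(50_000)]
mean = statistics.mean(samples)
cv = statistics.stdev(samples) / mean
print(f"deflection CV = {cv:.3f}  (thickness CV = {T_SD / T_NOM:.3f})")
```

Because of the cubic dependence, a 2.5% thickness scatter produces roughly a 7.5% scatter in deflection, which is the kind of amplification that makes variances of geometric perturbations significant for the response.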
A General Framework for Probabilistic Characterizing Formulae
DEFF Research Database (Denmark)
Sack, Joshua; Zhang, Lijun
2012-01-01
Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...
International Nuclear Information System (INIS)
Posch, C
2012-01-01
Nature still outperforms the most powerful computers in routine functions involving perception, sensing and actuation like vision, audition, and motion control, and is, most strikingly, orders of magnitude more energy-efficient than its artificial competitors. The reasons for the superior performance of biological systems are subject to diverse investigations, but it is clear that the form of hardware and the style of computation in nervous systems are fundamentally different from what is used in artificial synchronous information processing systems. Very generally speaking, biological neural systems rely on a large number of relatively simple, slow and unreliable processing elements and obtain performance and robustness from a massively parallel principle of operation and a high level of redundancy, where the failure of single elements usually does not induce any observable system performance degradation. In the late 1980s, Carver Mead demonstrated that silicon VLSI technology can be employed in implementing 'neuromorphic' circuits that mimic neural functions and in fabricating building blocks that work like their biological role models. Neuromorphic systems, like the biological systems they model, are adaptive, fault-tolerant and scalable, and process information using energy-efficient, asynchronous, event-driven methods. In this paper, some basics of neuromorphic electronic engineering and its impact on recent developments in optical sensing and artificial vision are presented. It is demonstrated that bio-inspired vision systems have the potential to outperform conventional, frame-based vision acquisition and processing systems in many application fields and to establish new benchmarks in terms of redundancy suppression/data compression, dynamic range, temporal resolution and power efficiency to realize advanced functionality like 3D vision, object tracking, motor control, visual feedback loops, etc. in real-time. It is argued that future artificial vision systems
Probabilistic pathway construction.
Yousofshahi, Mona; Lee, Kyongbum; Hassoun, Soha
2011-07-01
Expression of novel synthesis pathways in host organisms amenable to genetic manipulations has emerged as an attractive metabolic engineering strategy to overproduce natural products, biofuels, biopolymers and other commercially useful metabolites. We present a pathway construction algorithm for identifying viable synthesis pathways compatible with balanced cell growth. Rather than exhaustive exploration, we investigate probabilistic selection of reactions to construct the pathways. Three different selection schemes are investigated for the selection of reactions: high metabolite connectivity, low connectivity and uniformly random. For all case studies, which involved a diverse set of target metabolites, the uniformly random selection scheme resulted in the highest average maximum yield. When compared to an exhaustive search enumerating all possible reaction routes, our probabilistic algorithm returned nearly identical distributions of yields, while requiring far less computing time (minutes vs. years). The pathways identified by our algorithm have previously been confirmed in the literature as viable, high-yield synthesis routes. Prospectively, our algorithm could facilitate the design of novel, non-native synthesis routes by efficiently exploring the diversity of biochemical transformations in nature. Copyright © 2011 Elsevier Inc. All rights reserved.
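The probabilistic construction idea, repeatedly sampling reactions instead of enumerating all routes, can be sketched on a toy network. The network, metabolite names, and uniform backward selection below are hypothetical illustrations; the actual algorithm works on genome-scale reaction databases and checks compatibility with balanced growth.

```python
import random

random.seed(7)

# Toy reaction network: product -> list of (reaction, substrates). Hypothetical.
REACTIONS = {
    "target": [("r1", ["B"]), ("r2", ["C"])],
    "B": [("r3", ["A"]), ("r4", ["C"])],
    "C": [("r5", ["A"])],
}
HOST_METABOLITES = {"A"}   # natively available in the host organism

def construct_pathway(target, max_steps=50):
    """Uniformly random backward selection of producing reactions."""
    pathway, frontier = [], [target]
    for _ in range(max_steps):
        if not frontier:
            return pathway            # every precursor is a host metabolite
        met = frontier.pop()
        if met in HOST_METABOLITES:
            continue
        rxn, substrates = random.choice(REACTIONS[met])   # probabilistic step
        pathway.append(rxn)
        frontier.extend(substrates)
    return None                       # failed within the step budget

print(construct_pathway("target"))
```

Repeating the construction many times and scoring each candidate pathway (e.g., by flux-balance yield) approximates the distribution that the exhaustive search would produce, at a fraction of the cost.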
Probabilistic risk assessment methodology
International Nuclear Information System (INIS)
Shinaishin, M.A.
1988-06-01
The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways for treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing vendors' system designs at the bidding stage, and performing grid reliability and human performance analysis using nation-specific data. (author)
Probabilistic population aging
2017-01-01
We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675
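The contrast between conventional and prospective aging measures comes down to where the "old age" boundary sits. A minimal numeric sketch, with made-up population figures and a made-up prospective threshold, shows the mechanics:

```python
# Toy population by broad age group (millions) -- illustrative numbers only.
pop = {"20-64": 50.0, "65-69": 5.0, "70+": 10.0}

# Conventional old-age dependency ratio: everyone 65+ counts as "old".
oadr = (pop["65-69"] + pop["70+"]) / pop["20-64"]

# Prospective OADR: "old age" starts where remaining life expectancy falls
# below a fixed threshold (e.g., 15 years) -- assumed here to be age 70,
# so the 65-69 group counts with the working-age population instead.
poadr = pop["70+"] / (pop["20-64"] + pop["65-69"])

print(f"conventional OADR = {oadr:.3f}, prospective OADR = {poadr:.3f}")
```

Because remaining life expectancies rise over time, the prospective boundary shifts upward in forecast years, which is why the prospective indicators change more slowly and vary less across probabilistic forecast trajectories than the conventional ones.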
Probabilistic cellular automata.
Agapie, Alexandru; Andreica, Anca; Giuclea, Marius
2014-09-01
Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata; otherwise they are called deterministic. Whichever type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case (connecting the probability of a configuration in the stationary distribution to its number of zero-one borders), the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
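A synchronous probabilistic cellular automaton of this kind can be simulated in a few lines. The noisy-majority rule and its parameters below are assumed for illustration; they are not the specific rule analyzed in the article.

```python
import random

random.seed(3)

N = 64                    # ring of N binary cells
P_MAJORITY = 0.9          # probability a cell adopts its neighborhood majority

def sync_step(cells):
    """Synchronous update: every cell is re-evaluated from the OLD state."""
    new = []
    for i, c in enumerate(cells):
        ones = cells[i - 1] + c + cells[(i + 1) % len(cells)]
        majority = 1 if ones >= 2 else 0
        # With probability 1 - P_MAJORITY the cell flips against the majority,
        # which is what makes the automaton probabilistic.
        new.append(majority if random.random() < P_MAJORITY else 1 - majority)
    return new

cells = [random.randint(0, 1) for _ in range(N)]
for _ in range(200):
    cells = sync_step(cells)
print("fraction of ones after 200 sweeps:", sum(cells) / N)
```

Because every sweep maps configurations to configurations with fixed probabilities, the process is exactly the finite homogeneous Markov chain described in the abstract; long runs sample its stationary distribution.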
Probabilistic biological network alignment.
Todor, Andrei; Dobra, Alin; Kahveci, Tamer
2013-01-01
Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant at the expense of explosive growth in the number of alternative topologies that may arise from different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method showing that, without sacrificing the running time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature as well as the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.
Quantum probabilistic logic programming
Balu, Radhakrishnan
2015-05-01
We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case, H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.
Topics in Probabilistic Judgment Aggregation
Wang, Guanchun
2011-01-01
This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…
Probabilistic studies of accident sequences
International Nuclear Information System (INIS)
Villemeur, A.; Berger, J.P.
1986-01-01
For several years, Electricite de France has carried out probabilistic assessments of accident sequences for nuclear power plants. Many methods were developed in the framework of this program. As interest in these studies increased and suitable methods became available, Electricite de France undertook a probabilistic safety assessment of a nuclear power plant. [fr
Compression of Probabilistic XML documents
Veldman, Irma
2009-01-01
Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In
Kerner, Boris S; Klenov, Sergey L; Schreckenberg, Michael
2014-05-01
Physical features of induced phase transitions in a metastable free flow at an on-ramp bottleneck in three-phase and two-phase cellular automaton (CA) traffic-flow models have been revealed. It turns out that at given flow rates at the bottleneck, to induce a moving jam (F → J transition) in the metastable free flow through the application of a time-limited on-ramp inflow impulse, in both two-phase and three-phase CA models the same critical amplitude of the impulse is required. If a smaller impulse than this critical one is applied, neither F → J transition nor other phase transitions can occur in the two-phase CA model. We have found that in contrast with the two-phase CA model, in the three-phase CA model, if the same smaller impulse is applied, then a phase transition from free flow to synchronized flow (F → S transition) can be induced at the bottleneck. This explains why rather than the F → J transition, in the three-phase theory traffic breakdown at a highway bottleneck is governed by an F → S transition, as observed in real measured traffic data. None of two-phase traffic-flow theories incorporates an F → S transition in a metastable free flow at the bottleneck that is the main feature of the three-phase theory. On the one hand, this shows the incommensurability of three-phase and two-phase traffic-flow theories. On the other hand, this clarifies why none of the two-phase traffic-flow theories can explain the set of fundamental empirical features of traffic breakdown at highway bottlenecks.
Probabilistic Structural Analysis Theory Development
Burnside, O. H.
1985-01-01
The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle main engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.
Nature-inspired optimization algorithms
Yang, Xin-She
2014-01-01
Nature-Inspired Optimization Algorithms provides a systematic introduction to all major nature-inspired algorithms for optimization. The book's unified approach, balancing algorithm introduction, theoretical background and practical implementation, complements extensive literature with well-chosen case studies to illustrate how these algorithms work. Topics include particle swarm optimization, ant and bee algorithms, simulated annealing, cuckoo search, firefly algorithm, bat algorithm, flower algorithm, harmony search, algorithm analysis, constraint handling, hybrid methods, parameter tuning
Fenyvesi, Kristof; Houghton, Tony; Diego-Mantecón, José Manuel; Crilly, Elizabeth; Oldknow, Adrian; Lavicza, Zsolt; Blanco, Teresa F.
2017-01-01
The goal of the Kids Inspiring Kids in STEAM (KIKS) project was to raise students' awareness of the multi- and transdisciplinary connections between the STEAM subjects (Science, Technology, Engineering, Arts & Mathematics) and to make learning about topics and phenomena from these fields more enjoyable. To achieve these goals, the KIKS project popularized the STEAM concept through projects based on the students-inspiring-other-students approach and by utilizing new tec...
Smart Nacre-inspired Nanocomposites.
Peng, Jingsong; Cheng, Qunfeng
2018-03-15
Nacre-inspired nanocomposites with excellent mechanical properties have achieved remarkable attention in the past decades. The high performance of nacre-inspired nanocomposites is a good basis for the further application of smart devices. Recently, some smart nanocomposites inspired by nacre have demonstrated good mechanical properties as well as effective and stable stimuli-responsive functions. In this Concept, we summarize the recent development of smart nacre-inspired nanocomposites, including 1D fibers, 2D films and 3D bulk nanocomposites, in response to temperature, moisture, light, strain, and so on. We show that diverse smart nanocomposites could be designed by combining various conventional fabrication methods of nacre-inspired nanocomposites with responsive building blocks and interface interactions. The nacre-inspired strategy is versatile for different kinds of smart nanocomposites in extensive applications, such as strain sensors, displays, artificial muscles, robotics, and so on, and may act as an effective roadmap for designing smart nanocomposites in the future. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
A bio-inspired spatial patterning circuit.
Chen, Kai-Yuan; Joe, Danial J; Shealy, James B; Land, Bruce R; Shen, Xiling
2014-01-01
Lateral Inhibition (LI) is a widely conserved patterning mechanism in biological systems across species. Distinct from the better-known Turing patterns, LI depends on cell-cell contact rather than diffusion. We built an in silico genetic circuit model to analyze the dynamic properties of LI. The model revealed that LI amplifies differences between neighboring cells to push them into opposite states, hence forming stable 2-D patterns. Inspired by this insight, we designed and implemented an electronic circuit that recapitulates LI patterning dynamics. This biomimetic system serves as a physical model to elucidate the design principle of generating robust patterning through spatial feedback, regardless of whether the underlying devices are biological or electrical.
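The core LI dynamic, neighbors repressing each other until small differences are amplified into opposite states, can be sketched as a discrete-time model on a ring of cells. The Hill-type inhibition function and all parameters below are assumptions chosen so the instability appears; they are not the circuit from the paper.

```python
import random
import statistics

random.seed(5)

N = 10           # ring of cells (even, so a clean alternating pattern fits)
K, H = 0.5, 4    # Hill threshold and steepness of the inhibition (assumed)
DT = 0.5         # relaxation rate per step (assumed)

def li_step(x):
    """Each cell is repressed by the mean activity of its two neighbors."""
    new = []
    for i in range(N):
        nbr = (x[i - 1] + x[(i + 1) % N]) / 2.0
        target = 1.0 / (1.0 + (nbr / K) ** H)   # high neighbors -> low cell
        new.append((1 - DT) * x[i] + DT * target)
    return new

# Start near-uniform; lateral inhibition amplifies the tiny differences.
x = [0.5 + random.uniform(-0.01, 0.01) for _ in range(N)]
for _ in range(400):
    x = li_step(x)
print([round(v, 2) for v in x])
```

With these parameters the uniform state is unstable to alternating perturbations, so the ring tends to settle into a stable high/low pattern of neighboring cells in opposite states, the discrete analogue of the 2-D patterns described in the abstract.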
Probabilistic fracture finite elements
Liu, W. K.; Belytschko, T.; Lua, Y. J.
1991-05-01
The Probabilistic Fracture Mechanics (PFM) is a promising method for estimating the fatigue life and inspection cycles for mechanical and structural components. The Probability Finite Element Method (PFEM), which is based on second moment analysis, has proved to be a promising, practical approach to handle problems with uncertainties. As the PFEM provides a powerful computational tool to determine first and second moment of random parameters, the second moment reliability method can be easily combined with PFEM to obtain measures of the reliability of the structural system. The method is also being applied to fatigue crack growth. Uncertainties in the material properties of advanced materials such as polycrystalline alloys, ceramics, and composites are commonly observed from experimental tests. This is mainly attributed to intrinsic microcracks, which are randomly distributed as a result of the applied load and the residual stress.
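The second-moment reliability idea the abstract builds on can be shown in its simplest form: propagate means and standard deviations through a linear limit state g = R - S and read off a reliability index. The strength and stress statistics below are hypothetical numbers, and real PFEM analyses compute the moments of g from finite element response rather than assuming them.

```python
import math

# First-order second-moment (FOSM) sketch for the limit state g = R - S:
# R = component resistance, S = load effect (assumed independent normals).
mu_R, sd_R = 500.0, 40.0   # MPa, hypothetical strength statistics
mu_S, sd_S = 350.0, 30.0   # MPa, hypothetical stress statistics

mu_g = mu_R - mu_S
sd_g = math.hypot(sd_R, sd_S)   # sqrt(sd_R**2 + sd_S**2)
beta = mu_g / sd_g              # reliability (safety) index

# Failure probability P(g < 0), using the standard normal CDF via erf
p_f = 0.5 * (1.0 + math.erf(-beta / math.sqrt(2.0)))
print(f"beta = {beta:.2f}, P_f = {p_f:.2e}")
```

Here beta = 3.0 and the failure probability is about 1.3e-3; in the PFEM setting the same two moments are obtained for each random structural response quantity and fed into exactly this kind of reliability measure.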
Probabilistic retinal vessel segmentation
Wu, Chang-Hua; Agam, Gady
2007-03-01
Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.
Probabilistic sensory recoding.
Jazayeri, Mehrdad
2008-08-01
A hallmark of higher brain functions is the ability to contemplate the world rather than to respond reflexively to it. To do so, the nervous system makes use of a modular architecture in which sensory representations are dissociated from areas that control actions. This flexibility, however, necessitates a recoding scheme that would put sensory information to use in the control of behavior. Sensory recoding faces two important challenges. First, recoding must take into account the inherent variability of sensory responses. Second, it must be flexible enough to satisfy the requirements of different perceptual goals. Recent progress in theory, psychophysics, and neurophysiology indicates that cortical circuitry might meet these challenges by evaluating sensory signals probabilistically.
Probabilistic brains: knowns and unknowns
Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E
2015-01-01
There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561
Directory of Open Access Journals (Sweden)
Valéria Barbosa Gomes
2001-08-01
Full Text Available This study evaluated physical activity in a probabilistic sample of 4,331 individuals 12 years of age and older residing in the city of Rio de Janeiro, who participated in a household survey in 1996. Occupation and leisure activity were grouped according to categories of energy expenditure. Hours spent watching television or using a computer or video game were also evaluated. Only 3.6% of men and 0.3% of women reported heavy occupational work. Among men, 59.8% reported never performing leisure-time physical activity; among women this proportion was 77.8%, with a marked increase in prevalence with age, especially for men. Women engaged in leisure activities of lower energy expenditure than men, and of shorter median duration. Mean daily hours watching television/video/computer were higher for women than for men. The higher the educational level, the higher the frequency of leisure-time physical activity in both sexes. Taken together, these data show the low energy expenditure on physical activity in the population of the city of Rio de Janeiro, with women, middle-aged and elderly groups, and those with low schooling at greater risk of not performing leisure-time physical activity.
Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference
DEFF Research Database (Denmark)
Jaeger, Manfred
2004-01-01
We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...
Probabilistic analysis of fires in nuclear plants
International Nuclear Information System (INIS)
Unione, A.; Teichmann, T.
1985-01-01
The aim of this paper is to describe a multilevel (i.e., staged) probabilistic analysis of fire risks in nuclear plants (as part of a general PRA) that maximizes the benefits of the FRA (fire risk assessment) in a cost-effective way. The approach uses several stages of screening, physical modeling of clearly dominant risk contributors, searches for direct (e.g., equipment dependences) and secondary (e.g., fire-induced internal flooding) interactions, and relies on lessons learned and available data from surrogate FRAs. The general methodology is outlined. 6 figs., 10 tabs
Probabilistic Open Set Recognition
Jain, Lalit Prithviraj
Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize that the cause is weak ad hoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary
Paradigms for biologically inspired design
DEFF Research Database (Denmark)
Lenau, T. A.; Metzea, A.-L.; Hesselberg, T.
2018-01-01
Biologically inspired design is attracting increasing interest since it offers access to a huge biological repository of well-proven design principles that can be used for developing new and innovative products. Biological phenomena can inspire product innovation in areas as diverse as mechanical engineering, medical engineering, nanotechnology, photonics, environmental protection and agriculture. However, a major obstacle to the wider use of biologically inspired design is the knowledge barrier that exists between the application engineers, who have insight into how to design suitable products, and the biologists, who have detailed knowledge and experience in understanding how biological organisms function in their environment. The biologically inspired design process can therefore be approached using different design paradigms depending on the dominant opportunities, challenges and knowledge characteristics...
Probabilistic broadcasting of mixed states
International Nuclear Information System (INIS)
Li Lvjun; Li Lvzhou; Wu Lihua; Zou Xiangfu; Qiu Daowen
2009-01-01
It is well known that the non-broadcasting theorem proved by Barnum et al is a fundamental principle of quantum communication. As we are aware, optimal broadcasting (OB) is the only method to broadcast noncommuting mixed states approximately. In this paper, motivated by the probabilistic cloning of quantum states proposed by Duan and Guo, we propose a new way for broadcasting noncommuting mixed states: probabilistic broadcasting (PB), and we present a sufficient condition for PB of mixed states. To a certain extent, we generalize the probabilistic cloning theorem from pure states to mixed states, and in particular, we generalize the non-broadcasting theorem, since the case that commuting mixed states can be exactly broadcast can be thought of as a special instance of PB where the success ratio is 1. Moreover, we discuss probabilistic local broadcasting (PLB) of separable bipartite states.
Evaluation of Probabilistic Disease Forecasts.
Hughes, Gareth; Burnett, Fiona J
2017-10-01
The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast (predictive values) are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.
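The decomposition of scoring rules into interpretable components, as mentioned in the abstract above, can be illustrated with the classic Murphy decomposition of the Brier score into reliability, resolution, and uncertainty. The sketch below is an illustrative implementation under the assumption that forecasts take a small number of discrete probability values (so each distinct value forms its own bin); the function name and binning scheme are our own, not from the paper.

```python
import numpy as np

def brier_decomposition(forecasts, outcomes):
    """Murphy decomposition of the Brier score:
    Brier = reliability - resolution + uncertainty.
    Groups identical forecast probabilities into bins."""
    forecasts = np.asarray(forecasts, dtype=float)
    outcomes = np.asarray(outcomes, dtype=float)
    base_rate = outcomes.mean()
    uncertainty = base_rate * (1 - base_rate)
    reliability = resolution = 0.0
    for p in np.unique(forecasts):
        mask = forecasts == p
        obs_freq = outcomes[mask].mean()   # observed disease frequency in this bin
        weight = mask.mean()               # fraction of cases in this bin
        reliability += weight * (p - obs_freq) ** 2
        resolution += weight * (obs_freq - base_rate) ** 2
    brier = np.mean((forecasts - outcomes) ** 2)
    return brier, reliability, resolution, uncertainty
```

A well-calibrated forecaster drives reliability toward zero; a useful one drives resolution toward the uncertainty term.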
14th International Probabilistic Workshop
Taerwe, Luc; Proske, Dirk
2017-01-01
This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.
Cumulative Dominance and Probabilistic Sophistication
Wakker, P.P.; Sarin, R.H.
2000-01-01
Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general
Probabilistic simulation of fermion paths
International Nuclear Information System (INIS)
Zhirov, O.V.
1989-01-01
The permutation symmetry of the fermion path integral allows (when spin degrees of freedom are ignored) the use of any probabilistic algorithm in its simulation, such as Metropolis, heat bath, etc. 6 refs., 2 tabs
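As a minimal sketch of the kind of probabilistic algorithm the abstract refers to, the following implements Metropolis sampling of a discretized imaginary-time path integral for a 1D harmonic oscillator (spin ignored, as in the abstract). All parameter values and the function name are illustrative assumptions, not taken from the paper.

```python
import math
import random

def metropolis_path(n_slices=20, n_sweeps=2000, step=0.5, beta=4.0, seed=1):
    """Metropolis sampling of a discretized 1D harmonic-oscillator
    path integral with periodic imaginary-time boundary conditions.
    Returns the final path and the acceptance rate."""
    rng = random.Random(seed)
    dt = beta / n_slices
    x = [0.0] * n_slices  # start from the trivial path

    def local_action(i, xi):
        # Only the terms of the action that involve slice i.
        xp, xm = x[(i + 1) % n_slices], x[(i - 1) % n_slices]
        kinetic = ((xi - xm) ** 2 + (xp - xi) ** 2) / (2 * dt)
        potential = dt * 0.5 * xi ** 2
        return kinetic + potential

    accepted = 0
    for _ in range(n_sweeps):
        for i in range(n_slices):
            trial = x[i] + rng.uniform(-step, step)
            # Accept with probability min(1, exp(-delta_S)).
            if rng.random() < math.exp(local_action(i, x[i]) - local_action(i, trial)):
                x[i] = trial
                accepted += 1
    return x, accepted / (n_sweeps * n_slices)
```

The same sweep structure carries over to heat-bath updates; only the single-site proposal/acceptance rule changes.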
Probabilistic modeling of timber structures
DEFF Research Database (Denmark)
Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro
2007-01-01
The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 'Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties...
DEFF Research Database (Denmark)
Ejersbo, Lisser Rye
2015-01-01
The NY Times has a weekly column with good advice. A few weeks ago the week's inspiration was addressed to teachers and concerned how to create speaking time for everyone without having favorites or overlooking the quieter students.
Hoch, Michael
2017-01-01
Andy Charalambous (art@andycharalambous.com), artist and trained engineer based in London, UK; HEP Artist in Residence, Astronomy Artist in Residence and Honorary Research Fellow, Physics and Astronomy, University College London (http://www.andycharalambous.com). art@CMS_sciARTbooklet web page: http://artcms.web.cern.ch/artcms/ A tool to support students with their research on various scientific topics, encourage an understanding of the relevance of expression through the arts, a manual to recreate the artwork, and enable students to define and develop their own artistic inquiry in the creation of new artworks. The art@CMS sciART booklet series is directed by Dr. Michael Hoch (michael.hoch@cern.ch), scientist and artist at CERN, in cooperation with the HST 2017 participants (S. Bellefontaine, S. Chaiwan, A. Djune Tchinda, R. O'Keeffe, G. Shumanova)
CERN Bulletin
2011-01-01
From 8 December 2011 to 17 February 2012, Geneva University's physics faculty will be holding an exhibition called "L'Origine – un voyage entre la Science et l'Art". Thirty artists from Europe and Africa will be exhibiting their work. The aim of the exhibition is to take the visitor on an imaginary journey to the origins of mankind and to show how science and art approach the same theme from different angles. The works on display will include pieces of Makonde art, a traditional art form native to Mozambique, created by artists of the Nairucu Arts centre. The cultural programme that will run alongside the exhibition will include lectures on contemporary scientific themes aimed at the general public. Visitors will also have the opportunity to discover "L'Origine", a book of poetry by Beatrice Bressan (Ed. Loreleo, Geneva, 2010), which was awarded the third prize in the "Poeti nella società...
De la Sen, M.
2015-01-01
In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.
Searching for inspiration during idea generation : Pictures or words?
Coimbra Cardoso, C.M.; Guerreiro Goncalves, M.; Badke-Schaub, P.G.
2012-01-01
People from different professional arenas search for inspiration in a number of sources, be it in memories from past experiences or in the physical environment that surrounds them. Purposefully or unconsciously, scientists, artists, writers and different types of designers for instance, come across
Probabilistic Fatigue Design of Composite Material for Wind Turbine Blades
DEFF Research Database (Denmark)
Toft, Henrik Stensgaard; Sørensen, John Dalsgaard
2011-01-01
In the present paper a probabilistic design approach to fatigue design of wind turbine blades is presented. The physical uncertainty on the fatigue strength for composite material is estimated using publicly available fatigue tests. Further, the model uncertainty on Miner's rule for damage accumulation...
Cascade probabilistic function and the Markov's processes. Chapter 1
International Nuclear Information System (INIS)
2002-01-01
In Chapter 1 the physical and mathematical descriptions of radiation processes are presented. The relation of the cascade probabilistic functions (CPF) for electrons, protons, alpha particles and ions to Markov chains is shown. The algorithms for CPF calculation, accounting for energy losses, are given
Hanson, David F.
2017-04-01
Bio-inspired intelligent robots are coming of age in both research and industry, propelling market growth for robots and A.I. However, conventional motors limit bio-inspired robotics. EAP actuators and sensors could improve the simplicity, compliance, physical scaling, and offer bio-inspired advantages in robotic locomotion, grasping and manipulation, and social expressions. For EAP actuators to realize their transformative potential, further innovations are needed: the actuators must be robust, fast, powerful, manufacturable, and affordable. This presentation surveys progress, opportunities, and challenges in the author's latest work in social robots and EAP actuators, and proposes a roadmap for EAP actuators in bio-inspired intelligent robotics.
In Search of Scientific Inspiration.
2017-01-12
In the ever-expanding sea of scientific advances, how do you find inspiration for your own study? Cell editor Jiaying Tan talked with Mark Lemmon and Joseph (Yossi) Schlessinger about the importance of fueling your research creativity with the conceptual excitement and technical advance from the broad scientific field. An excerpt of the conversation appears below. Copyright © 2017. Published by Elsevier Inc.
INSPIRED High School Computing Academies
Doerschuk, Peggy; Liu, Jiangjiang; Mann, Judith
2011-01-01
If we are to attract more women and minorities to computing we must engage students at an early age. As part of its mission to increase participation of women and underrepresented minorities in computing, the Increasing Student Participation in Research Development Program (INSPIRED) conducts computing academies for high school students. The…
Inspiration: One Percent and Rising
Walling, Donovan R.
2009-01-01
Inventor Thomas Edison once famously declared, "Genius is one percent inspiration and ninety-nine percent perspiration." If that's the case, then the students the author witnessed at the International Student Media Festival (ISMF) last November in Orlando, Florida, are geniuses and more. The students in the ISMF pre-conference workshop…
DEFF Research Database (Denmark)
Thanh Tung, Truong; Dao, Trong Tuan; Grifell Junyent, Marta
2018-01-01
The fungal plasma membrane H+-ATPase (Pma1p) is a potential target for the discovery of new antifungal agents. Surprisingly, no structure-activity relationship studies for small molecules targeting Pma1p have been reported. Herein, we disclose a LEGO-inspired fragment assembly strategy for design...
Inspiration til fremtidens naturfaglige uddannelser
DEFF Research Database (Denmark)
Busch, Henrik; Troelsen, Rie; Horst, Sebastian
education levels • that the science education culture is strengthened • that teacher competences are strengthened. Volume 2 of the report, the separate publication 'Inspiration til fremtidens naturfaglige uddannelser • En antologi', contains a number of essays on important issues for the science subjects. Previously issued...
A survey of snake-inspired robot designs
International Nuclear Information System (INIS)
Hopkins, James K; Spranklin, Brent W; Gupta, Satyandra K
2009-01-01
Body undulation used by snakes and the physical architecture of a snake body may offer significant benefits over typical legged or wheeled locomotion designs in certain types of scenarios. A large number of research groups have developed snake-inspired robots to exploit these benefits. The purpose of this review is to report different types of snake-inspired robot designs and categorize them based on their main characteristics. For each category, we discuss their relative advantages and disadvantages. This review will assist in familiarizing a newcomer to the field with the existing designs and their distinguishing features. We hope that by studying existing robots, future designers will be able to create new designs by adopting features from successful robots. The review also summarizes the design challenges associated with the further advancement of the field and deploying snake-inspired robots in practice. (topical review)
Probabilistic numerical discrimination in mice.
Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat
2016-03-01
Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.
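The optimality models the abstract alludes to can be sketched as a posterior-odds decision rule: a noisy count percept with scalar variability (standard deviation proportional to the true count) is combined with the experienced relative frequency of trial types. The function name, the Gaussian approximation, and the Weber fraction value below are illustrative assumptions, not the authors' model.

```python
import math

def choose_category(perceived_count, p_few=0.5, few=10, many=20, weber=0.15):
    """Decide between a 'few' (10-press) and a 'many' (20-press) trial
    type given a noisy count percept.  Scalar variability: the percept
    is roughly Gaussian around the true count with sd = weber * count.
    p_few is the experienced relative frequency of 'few' trials."""
    def likelihood(n, mean):
        sd = weber * mean
        return math.exp(-0.5 * ((n - mean) / sd) ** 2) / sd
    post_few = p_few * likelihood(perceived_count, few)
    post_many = (1 - p_few) * likelihood(perceived_count, many)
    return "few" if post_few > post_many else "many"
```

With equal trial frequencies an intermediate percept of 14 presses favors "many" (its wider likelihood reaches further), but when "few" trials dominate the session the same percept flips to "few": the prior shifts the categorization boundary, which is the adaptive modulation the study tests for.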
Probabilistic Design and Analysis Framework
Strack, William C.; Nagpal, Vinod K.
2010-01-01
PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using newly developed software packages such as System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
Probabilistic methods used in NUSS
International Nuclear Information System (INIS)
Fischer, J.; Giuliani, P.
1985-01-01
Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides, design and siting are the two areas where most use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or where probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides the review mainly covers the analysis of seismic, hydrological and external man-made events, as well as some aspects of extreme meteorological events analysis. Probabilistic methods are recommended in the design guides but they are not made a requirement. There are several reasons for this, mainly the lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given and the concept of design basis as used in the NUSS design guides is explained. (author)
Norsk inspiration til uddannelse og job
DEFF Research Database (Denmark)
Skovhus, Randi Boelskifte; Thomsen, Rie; Buhl, Rita
2017-01-01
Review of a book on the Norwegian subject Utdanningsvalg: inspiration for work with education and jobs.
Skin-Inspired Electronics: An Emerging Paradigm.
Wang, Sihong; Oh, Jin Young; Xu, Jie; Tran, Helen; Bao, Zhenan
2018-05-15
Future electronics will take on more important roles in people's lives. They need to allow more intimate contact with human beings to enable advanced health monitoring, disease detection, medical therapies, and human-machine interfacing. However, current electronics are rigid, nondegradable and cannot self-repair, while the human body is soft, dynamic, stretchable, biodegradable, and self-healing. Therefore, it is critical to develop a new class of electronic materials that incorporate skinlike properties, including stretchability for conformable integration, minimal discomfort and suppressed invasive reactions; self-healing for long-term durability under harsh mechanical conditions; and biodegradability for reducing environmental impact and obviating the need for secondary device removal for medical implants. These demands have fueled the development of a new generation of electronic materials, primarily composed of polymers and polymer composites with both high electrical performance and skinlike properties, and consequently led to a new paradigm of electronics, termed "skin-inspired electronics". This Account covers recent important advances in skin-inspired electronics, from basic material developments to device components and proof-of-concept demonstrations for integrated bioelectronics applications. To date, stretchability has been the most prominent focus in this field. In contrast to strain-engineering approaches that extrinsically impart stretchability into inorganic electronics, intrinsically stretchable materials provide a direct route to achieve higher mechanical robustness, higher device density, and scalable fabrication. The key is the introduction of strain-dissipation mechanisms into the material design, which has been realized through molecular engineering (e.g., soft molecular segments, dynamic bonds) and physical engineering (e.g., nanoconfinement effect, geometric design). The material design concepts have led to the successful demonstrations of
Ships - inspiring objects in architecture
Marczak, Elzbieta
2017-10-01
Sea-going vessels have for centuries fascinated people, not only those who happen to work at sea, but first and foremost, those who have never set foot aboard a ship. The environment in which ships operate is reminiscent of freedom and countless adventures, but also of hard and interesting maritime working life. The famous words of Pompey, "Navigare necesse est, vivere non est necesse" (sailing is necessary, living is not), which he pronounced on a stormy sea voyage, arouse curiosity and excitement, inviting one to test the truth of this saying personally. It is often the case, however, that sea-faring remains within the realm of dreams, while the fascination with ships demonstrates itself through a transposition of naval features onto land constructions. In such cases, ship-inspired motifs bring alive dreams and yearnings as well as reflect tastes. Tourism is one of the indicators of people's standard of living and a measure of a society's civilisation. Maritime tourism has been developing rapidly in recent decades. A sea cruise offers an insight into life at sea. Still, most people derive their knowledge of passenger vessels and their furnishings from the mass media. Passenger vessels, also known as "floating cities," are described as majestic and grand, while their on-board facilities as luxurious, comfortable, exclusive and inaccessible to common people on land. Freight vessels, on the other hand, are described as enormous objects which dwarf the human being into insignificance. This article presents the results of research intended to answer the following questions: what makes ships a source of inspiration for land architecture? To what extent and by what means do architects draw on ships in their design work? In what places can we find structures inspired by ships? What ships inspire architects? This article presents examples of buildings whose design was inspired by the architecture and structural details of sea vessels. An analysis of
Radioactivity, a pragmatic pillar of probabilistic conceptions
International Nuclear Information System (INIS)
Amaldi, E.
1979-01-01
The author expresses his opinion that, by looking at the problem of the repudiation of causality in physics from the most general point of view, one can be led to over-estimate the extrinsic influences and overlook intrinsic arguments inherent in two parallel, almost independent developments. The first starts from the kinetic theory of gases and passes through statistical mechanics, Planck's original definition of the quantum, photons conceived as particles, and the relations between emission and absorption of photons by atoms. The other path, also intrinsic to physics, starts with the accidental discovery of radioactive substances, passes through the experimental recognition of their decay properties, and quickly finds its natural settlement in a probabilistic conception which may be criticized as acritical but certainly has a sound pragmatic grounding, uncorrelated or at most extremely loosely correlated to contemporary or pre-existing philosophical lines of thought. (Auth.)
Implications of probabilistic risk assessment
International Nuclear Information System (INIS)
Cullingford, M.C.; Shah, S.M.; Gittus, J.H.
1987-01-01
Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessments for specific reactor types or components and specific risks (e.g. aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.)
Bounding probabilistic safety assessment probabilities by reality
International Nuclear Information System (INIS)
Fragola, J.R.; Shooman, M.L.
1991-01-01
The investigation of failure in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10^-3 per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that the analysis assumptions require continual reassessment and the analysis predictions must be bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality and the associated impact on PSA probabilistic estimates
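The aqueduct-style argument above (a long observation window with no recorded failures) can be turned into a numeric bound with the standard zero-failure Poisson upper limit, often called the "rule of three". The sketch below is an illustration of that generic statistical bound under an assumed 2000-year window, not a reproduction of the paper's calculation.

```python
import math

def upper_bound_rate(observation_years, confidence=0.95):
    """One-sided upper confidence bound on a Poisson event rate
    when ZERO events were observed over observation_years.
    Solve exp(-rate * T) = 1 - confidence for rate, giving
    rate <= -ln(1 - confidence) / T  (about 3/T at 95%)."""
    return -math.log(1 - confidence) / observation_years

# Illustrative: ~2000 years of standing Roman structures with no
# visible earthquake damage bounds the rate of damaging earthquakes.
bound = upper_bound_rate(2000.0)  # roughly 1.5e-3 per year at 95%
```

This shows why millennia-scale evidence can bound rates near the 10^-3 per year level quoted in the abstract, but not much lower.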
Prediction of Intention during Interaction with iCub with Probabilistic Movement Primitives
Directory of Open Access Journals (Sweden)
Oriane Dermy
2017-10-01
Full Text Available This article describes our open-source software for predicting the intention of a user physically interacting with the humanoid robot iCub. Our goal is to allow the robot to infer the intention of the human partner during collaboration, by predicting the future intended trajectory: this capability is critical to design anticipatory behaviors that are crucial in human–robot collaborative scenarios, such as in co-manipulation, cooperative assembly, or transportation. We propose an approach to endow the iCub with basic capabilities of intention recognition, based on Probabilistic Movement Primitives (ProMPs), a versatile method for representing, generalizing, and reproducing complex motor skills. The robot learns a set of motion primitives from several demonstrations, provided by the human via physical interaction. During training, we model the collaborative scenario using human demonstrations. During the reproduction of the collaborative task, we use the acquired knowledge to recognize the intention of the human partner. Using a few early observations of the state of the robot, we can not only infer the intention of the partner but also complete the movement, even if the user breaks the physical interaction with the robot. We evaluate our approach in simulation and on the real iCub. In simulation, the iCub is driven by the user using the Geomagic Touch haptic device. In the real robot experiment, we directly interact with the iCub by grabbing and manually guiding the robot's arm. We realize two experiments on the real robot: one with simple reaching trajectories, and one inspired by collaborative object sorting. The software implementing our approach is open source and available on the GitHub platform. In addition, we provide tutorials and videos.
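The core ProMP mechanism the abstract describes (learn a weight distribution over basis functions from demonstrations, then condition on a few early observations to predict the rest of the trajectory) can be sketched in a few lines of standard Gaussian algebra. Everything below (function names, RBF width, regularization constants) is an illustrative assumption, not the authors' released software.

```python
import numpy as np

def rbf_features(t, n_basis=8, width=0.02):
    """Normalized Gaussian basis functions over phase t in [0, 1]."""
    centers = np.linspace(0, 1, n_basis)
    phi = np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2 * width))
    return phi / phi.sum(axis=1, keepdims=True)

def fit_promp(demos, n_basis=8):
    """Fit each demonstration with least-squares RBF weights, then
    estimate the mean and covariance of the weight distribution."""
    t = np.linspace(0, 1, demos.shape[1])
    phi = rbf_features(t, n_basis)
    W = np.array([np.linalg.lstsq(phi, d, rcond=None)[0] for d in demos])
    return W.mean(axis=0), np.cov(W.T) + 1e-6 * np.eye(n_basis)

def condition(mu_w, cov_w, t_obs, y_obs, noise=1e-4):
    """Gaussian conditioning of the weight distribution on early
    observations; returns the predicted full mean trajectory."""
    n_basis = len(mu_w)
    phi_o = rbf_features(np.asarray(t_obs, dtype=float), n_basis)
    S = phi_o @ cov_w @ phi_o.T + noise * np.eye(len(t_obs))
    K = cov_w @ phi_o.T @ np.linalg.inv(S)      # Kalman-style gain
    mu_new = mu_w + K @ (np.asarray(y_obs) - phi_o @ mu_w)
    t_full = np.linspace(0, 1, 100)
    return rbf_features(t_full, n_basis) @ mu_new
```

Given demonstrations that vary in amplitude, conditioning on the first few observed points recovers the amplitude of the current execution and completes the movement, which is exactly the "infer the intention and finish the motion" capability described above.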
Social insects inspire human design
Holbrook, C. Tate; Clark, Rebecca M.; Moore, Dani; Overson, Rick P.; Penick, Clint A.; Smith, Adrian A.
2010-01-01
The international conference ‘Social Biomimicry: Insect Societies and Human Design’, hosted by Arizona State University, USA, 18–20 February 2010, explored how the collective behaviour and nest architecture of social insects can inspire innovative and effective solutions to human design challenges. It brought together biologists, designers, engineers, computer scientists, architects and businesspeople, with the dual aims of enriching biology and advancing biomimetic design. PMID:20392721
Biology-Inspired Autonomous Control
2011-08-31
insect brain, allow these animals to fly with damaged wings, carry order-of-body-mass payloads (e.g., foraging bees with a load of pollen, blood-satiated... The research focus addressed two broad, complementary research areas: autonomous systems concepts inspired by the behavior and neurobiology...
Probabilistic coding of quantum states
International Nuclear Information System (INIS)
Grudka, Andrzej; Wojcik, Antoni; Czechlewski, Mikolaj
2006-01-01
We discuss the properties of probabilistic coding of two qubits to one qutrit and generalize the scheme to higher dimensions. We show that the protocol preserves the entanglement between the qubits to be encoded and the environment and can also be applied to mixed states. We present a protocol that enables encoding of n qudits to one qudit of dimension smaller than the Hilbert space of the original system and then allows probabilistic but error-free decoding of any subset of k qudits. We give a formula for the probability of successful decoding
Probabilistic methods in combinatorial analysis
Sachkov, Vladimir N
2014-01-01
This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist
Probabilistic reasoning in data analysis.
Sirovich, Lawrence
2011-09-20
This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
Probabilistic Modeling of Timber Structures
DEFF Research Database (Denmark)
Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro
2005-01-01
The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for components and connections. The recommended ...
Convex sets in probabilistic normed spaces
International Nuclear Information System (INIS)
Aghajani, Asadollah; Nourouzi, Kourosh
2008-01-01
In this paper we obtain some results on convexity in a probabilistic normed space. We also investigate the concept of CSN-closedness and CSN-compactness in a probabilistic normed space and generalize the corresponding results of normed spaces
International Nuclear Information System (INIS)
Boak, D.M.; Painton, L.
1995-01-01
Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple evaluations, these types of probabilistic evaluations can be made using personal computer-based software
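The sampling techniques named above can be illustrated with a minimal, self-contained sketch (the toy transport model, parameter ranges, and seed below are hypothetical, chosen only to contrast plain Monte Carlo with Latin Hypercube stratification):

```python
import random
import statistics

def model(k, v):
    # Toy transport surrogate: release fraction depends on permeability k and velocity v.
    return k * v / (1.0 + k * v)

def monte_carlo(n, rng):
    # Plain Monte Carlo: draw each uncertain parameter independently and uniformly.
    return [model(rng.uniform(0.1, 1.0), rng.uniform(0.5, 2.0)) for _ in range(n)]

def latin_hypercube(n, rng):
    # Latin Hypercube: stratify each parameter's range into n equal-probability
    # bins, draw one sample per bin, then pair the two parameters at random.
    def strata(lo, hi):
        w = (hi - lo) / n
        vals = [lo + (i + rng.random()) * w for i in range(n)]
        rng.shuffle(vals)
        return vals
    ks = strata(0.1, 1.0)
    vs = strata(0.5, 2.0)
    return [model(k, v) for k, v in zip(ks, vs)]

rng = random.Random(42)
mc = monte_carlo(1000, rng)
lhs = latin_hypercube(1000, rng)
print(f"MC  mean={statistics.mean(mc):.3f}")
print(f"LHS mean={statistics.mean(lhs):.3f}")
```

The stratified design covers each marginal range evenly, which is why Latin Hypercube estimates typically stabilize with far fewer model evaluations than plain Monte Carlo.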
Probabilistic Space Weather Forecasting: a Bayesian Perspective
Camporeale, E.; Chandorkar, M.; Borovsky, J.; Caré, A.
2017-12-01
Most of the Space Weather forecasts, both at operational and research level, are not probabilistic in nature. Unfortunately, a prediction that does not provide a confidence level is not very useful in a decision-making scenario. Nowadays, forecast models range from purely data-driven, machine learning algorithms, to physics-based approximations of first-principle equations (and everything that sits in between). Uncertainties pervade all such models, at every level: from the raw data to finite-precision implementation of numerical methods. The most rigorous way of quantifying the propagation of uncertainties is by embracing a Bayesian probabilistic approach. One of the simplest and most robust machine learning techniques in the Bayesian framework is Gaussian Process regression and classification. Here, we present the application of Gaussian Processes to the problems of the Dst geomagnetic index forecast, the solar wind type classification, and the estimation of diffusion parameters in radiation belt modeling. In each of these very diverse problems, the GP approach rigorously provides forecasts in the form of predictive distributions. In turn, these distributions can be used as input for ensemble simulations in order to quantify the amplification of uncertainties. We show that we have achieved excellent results in all of the standard metrics used to evaluate our models, with very modest computational cost.
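To make concrete how Gaussian Process regression turns a point forecast into a predictive distribution, here is a minimal sketch (the training series, kernel hyperparameters, and noise level are hypothetical; this is not the authors' geomagnetic-index model):

```python
import numpy as np

def rbf(x1, x2, length=1.0, sigma=1.0):
    # Squared-exponential kernel matrix between two sets of 1-D inputs.
    d = x1[:, None] - x2[None, :]
    return sigma**2 * np.exp(-0.5 * (d / length) ** 2)

def gp_predict(x_train, y_train, x_test, noise=1e-2):
    # Standard GP regression: posterior mean and pointwise std deviation.
    K = rbf(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf(x_test, x_train)
    Kss = rbf(x_test, x_test)
    alpha = np.linalg.solve(K, y_train)
    mean = Ks @ alpha
    cov = Kss - Ks @ np.linalg.solve(K, Ks.T)
    return mean, np.sqrt(np.clip(np.diag(cov), 0.0, None))

# Toy "index" time series; the forecast comes with a predictive uncertainty.
x = np.linspace(0.0, 6.0, 20)
y = np.sin(x)
x_new = np.array([2.0, 7.0])   # 7.0 extrapolates beyond the data
mean, std = gp_predict(x, y, x_new)
print(mean, std)
```

Note how the predictive standard deviation grows at the extrapolated point: exactly the confidence information the abstract argues a decision-maker needs.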
Confluence Reduction for Probabilistic Systems (extended version)
Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis
2010-01-01
This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To support the
Probabilistic Role Models and the Guarded Fragment
DEFF Research Database (Denmark)
Jaeger, Manfred
2004-01-01
We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.
Probabilistic role models and the guarded fragment
DEFF Research Database (Denmark)
Jaeger, Manfred
2006-01-01
We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.
Making Probabilistic Relational Categories Learnable
Jung, Wookyoung; Hummel, John E.
2015-01-01
Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…
Probabilistic inductive inference: a survey
Ambainis, Andris
2001-01-01
Inductive inference is a recursion-theoretic theory of learning, first developed by E. M. Gold (1967). This paper surveys developments in probabilistic inductive inference. We mainly focus on finite inference of recursive functions, since this simple paradigm has produced the most interesting (and most complex) results.
Probabilistic Approaches to Video Retrieval
Ianeva, Tzvetanka; Boldareva, L.; Westerveld, T.H.W.; Cornacchia, Roberto; Hiemstra, Djoerd; de Vries, A.P.
Our experiments for TRECVID 2004 further investigate the applicability of the so-called “Generative Probabilistic Models to video retrieval”. TRECVID 2003 results demonstrated that mixture models computed from video shot sequences improve the precision of “query by examples” results when
Probabilistic safety analysis procedures guide
International Nuclear Information System (INIS)
Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.
1984-01-01
A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study.
Sound Probabilistic #SAT with Projection
Directory of Open Access Journals (Sweden)
Vladimir Klebanov
2016-10-01
Full Text Available We present an improved method for a sound probabilistic estimation of the model count of a boolean formula under projection. The problem solved can be used to encode a variety of quantitative program analyses, such as those concerning security or resource consumption. We implement the technique and discuss its application to quantifying information flow in programs.
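A naive baseline makes the idea of a probabilistic model-count estimate concrete: sample random assignments and scale the satisfying fraction by 2^n. The sketch below is only that baseline (the formula is a made-up example, and this is not the hashing-based method of the paper):

```python
import random
from itertools import product

def phi(a, b, c):
    # Example boolean formula: (a or b) and (not b or c)
    return (a or b) and ((not b) or c)

def exact_count(f, n):
    # Ground truth by full enumeration of all 2^n assignments.
    return sum(1 for bits in product([False, True], repeat=n) if f(*bits))

def sampled_count(f, n, samples, rng):
    # Monte Carlo #SAT estimate: fraction of satisfying random assignments
    # times 2^n. Unbiased, but high-variance when solutions are rare.
    hits = sum(1 for _ in range(samples)
               if f(*(rng.random() < 0.5 for _ in range(n))))
    return hits / samples * 2**n

rng = random.Random(1)
print(exact_count(phi, 3))    # exact model count
print(sampled_count(phi, 3, 20000, rng))
```

Sound probabilistic counters improve on this baseline by attaching rigorous two-sided bounds that hold with a user-chosen confidence.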
Probabilistic graphs as a conceptual and computational tool in hydrology and water management
Schoups, Gerrit
2014-05-01
Originally developed in the fields of machine learning and artificial intelligence, probabilistic graphs constitute a general framework for modeling complex systems in the presence of uncertainty. The framework consists of three components: 1. Representation of the model as a graph (or network), with nodes depicting random variables in the model (e.g. parameters, states, etc), which are joined together by factors. Factors are local probabilistic or deterministic relations between subsets of variables, which, when multiplied together, yield the joint distribution over all variables. 2. Consistent use of probability theory for quantifying uncertainty, relying on basic rules of probability for assimilating data into the model and expressing unknown variables as a function of observations (via the posterior distribution). 3. Efficient, distributed approximation of the posterior distribution using general-purpose algorithms that exploit model structure encoded in the graph. These attributes make probabilistic graphs potentially useful as a conceptual and computational tool in hydrology and water management (and beyond). Conceptually, they can provide a common framework for existing and new probabilistic modeling approaches (e.g. by drawing inspiration from other fields of application), while computationally they can make probabilistic inference feasible in larger hydrological models. The presentation explores, via examples, some of these benefits.
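The three components above can be made concrete with a toy two-variable graph (the variables and factor values are hypothetical): the joint is the product of local factors, and assimilating an observation is an application of the basic rules of probability.

```python
# Two binary variables: R = "heavy rain", F = "flood observed downstream".
# Local factors (hypothetical numbers): a prior on R and a conditional P(F | R).
prior_R = {True: 0.3, False: 0.7}
cond_F = {(True, True): 0.8, (True, False): 0.2,
          (False, True): 0.1, (False, False): 0.9}

def joint(r, f):
    # The joint distribution over all variables is the product of the factors.
    return prior_R[r] * cond_F[(r, f)]

# Inference: assimilate the observation F=True and compute P(R | F=True).
evidence = True
norm = sum(joint(r, evidence) for r in (True, False))
posterior = {r: joint(r, evidence) / norm for r in (True, False)}
print(posterior)
```

In a real hydrological model the enumeration above becomes infeasible, which is where the graph structure and distributed message-passing algorithms mentioned in component 3 earn their keep.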
Probabilistic uniformities of uniform spaces
Energy Technology Data Exchange (ETDEWEB)
Rodriguez Lopez, J.; Romaguera, S.; Sanchis, M.
2017-07-01
The theory of metric spaces in the fuzzy context has shown to be an interesting area of study, not only from a theoretical point of view but also for its applications. Nevertheless, it is usual to consider these spaces as classical topological or uniform spaces, and there are not many results about constructing fuzzy topological structures starting from a fuzzy metric. Höhle was perhaps the first to show how to construct a probabilistic uniformity and a Lowen uniformity from a probabilistic pseudometric [Hohle78, Hohle82a]. His method can be directly translated to the context of fuzzy metrics and allows one to characterize the categories of probabilistic uniform spaces or Lowen uniform spaces by means of certain families of fuzzy pseudometrics [RL]. On the other hand, other fuzzy uniformities can be constructed in a fuzzy metric space: a Hutton [0,1]-quasi-uniformity [GGPV06], a fuzzifying uniformity [YueShi10], etc. The paper [GGRLRo] gives a study of several methods of endowing a fuzzy pseudometric space with a probabilistic uniformity and a Hutton [0,1]-quasi-uniformity. In 2010, J. Gutiérrez García, S. Romaguera and M. Sanchis [GGRoSanchis10] proved that the category of uniform spaces is isomorphic to a category formed by sets endowed with a fuzzy uniform structure, i.e. a family of fuzzy pseudometrics satisfying certain conditions. We will show here that, by means of this isomorphism, we can obtain several methods to endow a uniform space with a probabilistic uniformity. Furthermore, these constructions allow us to obtain a factorization of some functors introduced in [GGRoSanchis10]. (Author)
A probabilistic Hu-Washizu variational principle
Liu, W. K.; Belytschko, T.; Besterfield, G. H.
1987-01-01
A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
Stancu, Cristina
2017-04-01
Using space as context to inspire science education taps into the excitement of generations discovering the unknown, resulting in unprecedented public participation. Educators are finding exciting and age-appropriate materials for their classes that explore science, technology, engineering and mathematics. Possible misconceptions are highlighted so that teachers may plan lessons to facilitate correct conceptual understanding. With a range of hands-on learning experiences, Web materials and online opportunities for students, educators are invited to take a closer look at actual science missions. This session leverages resources, materials and expertise to address a wide range of traditional and nontraditional audiences while providing consistent messages and information on various space agencies' programs.
Natural photonics for industrial inspiration.
Parker, Andrew R
2009-05-13
There are two considerations for optical biomimetics: the diversity of submicrometre architectures found in the natural world, and the industrial manufacture of these. A review exists on the latter subject, where current engineering methods are considered along with those of the natural cells. Here, on the other hand, I will provide a modern review of the different categories of reflectors and antireflectors found in animals, including their optical characterization. The purpose of this is to inspire designers within the $2 billion annual optics industry.
Neuroscience-Inspired Artificial Intelligence.
Hassabis, Demis; Kumaran, Dharshan; Summerfield, Christopher; Botvinick, Matthew
2017-07-19
The fields of neuroscience and artificial intelligence (AI) have a long and intertwined history. In more recent times, however, communication and collaboration between the two fields has become less commonplace. In this article, we argue that better understanding biological brains could play a vital role in building intelligent machines. We survey historical interactions between the AI and neuroscience fields and emphasize current advances in AI that have been inspired by the study of neural computation in humans and other animals. We conclude by highlighting shared themes that may be key for advancing future research in both fields. Copyright © 2017. Published by Elsevier Inc.
Probabilistic sharing solves the problem of costly punishment
Chen, Xiaojie; Szolnoki, Attila; Perc, Matjaž
2014-08-01
Cooperators that refuse to participate in sanctioning defectors create the second-order free-rider problem. Such cooperators will not be punished because they contribute to the public good, but they also eschew the costs associated with punishing defectors. Altruistic punishers—those that cooperate and punish—are at a disadvantage, and it is puzzling how such behaviour has evolved. We show that sharing the responsibility to sanction defectors rather than relying on certain individuals to do so permanently can solve the problem of costly punishment. Inspired by the fact that humans have strong but also emotional tendencies for fair play, we consider probabilistic sanctioning as the simplest way of distributing the duty. In well-mixed populations the public goods game is transformed into a coordination game with full cooperation and defection as the two stable equilibria, while in structured populations pattern formation supports additional counterintuitive solutions that are reminiscent of Parrondo's paradox.
Probabilistic interpretation of the reduction criterion for entanglement
International Nuclear Information System (INIS)
Zhang, Zhengmin; Luo, Shunlong
2007-01-01
Inspired by the idea of conditional probabilities, we introduce a variant of conditional density operators. But unlike the conditional probabilities which are bounded by 1, the conditional density operators may have eigenvalues exceeding 1 for entangled states. This has the consequence that although any bivariate classical probability distribution has a natural separable decomposition in terms of conditional probabilities, we do not have a quantum analogue of this separable decomposition in general. The 'nonclassical' eigenvalues of conditional density operators are indications of entanglement. The resulting separability criterion turns out to be equivalent to the reduction criterion introduced by Horodecki [Phys. Rev. A 59, 4206 (1999)] and Cerf et al. [Phys. Rev. A 60, 898 (1999)]. This supplies an intuitive probabilistic interpretation for the reduction criterion. The conditional density operators are also used to define a form of quantum conditional entropy which provides an alternative mechanism to reveal quantum discord
Augmenting Probabilistic Risk Assessment with Malevolent Initiators
International Nuclear Information System (INIS)
Smith, Curtis; Schwieder, David
2011-01-01
As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA, but are considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.
Superstring-inspired SO(10) GUT model with intermediate scale
Sasaki, Ken
1987-12-01
A new mechanism is proposed for the mixing of Weinberg-Salam Higgs fields in superstring-inspired SO(10) models with no SO(10) singlet fields. The higher-dimensional terms in the superpotential can generate both Higgs field mixing and a small mass for the physical neutrino. I would like to thank Professor C. Iso for hospitality extended to me at the Tokyo Institute of Technology.
Mathematical simulation of cascade-probabilistic functions for charged particles
International Nuclear Information System (INIS)
Kupchishin, A.A.; Kupchishin, A.I.; Smygaleva, T.A.
1998-01-01
Analytical expressions for cascade-probabilistic functions (CPF) for electrons, protons, α-particles and ions, taking energy losses into account, are obtained. A mathematical analysis of these functions is carried out and their main properties are determined. Algorithms for the CPF are developed and computer calculations were conducted. Regularities in the behavior of the functions are established in dependence on the initial particle energy, atomic number and registration depth. The book is intended for specialists in the mathematical simulation of radiation defects, solid state physics, elementary particle physics and applied mathematics. There are 3 chapters in the book: 1. Cascade-probabilistic functions for electrons; 2. CPF for protons and α-particles; 3. CPF taking into account energy losses of ions. (author)
Probabilistic costing of transmission services
International Nuclear Information System (INIS)
Wijayatunga, P.D.C.
1992-01-01
Costing of transmission services of electrical utilities is required for transactions involving the transport of energy over a power network. The calculation of these costs based on Short Run Marginal Costing (SRMC) is preferred over other methods proposed in the literature due to its economic efficiency. In the research work discussed here, the concept of probabilistic costing of use-of-system based on SRMC which emerges as a consequence of the uncertainties in a power system is introduced using two different approaches. The first approach, based on the Monte Carlo method, generates a large number of possible system states by simulating random variables in the system using pseudo random number generators. A second approach to probabilistic use-of-system costing is proposed based on numerical convolution and multi-area representation of the transmission network. (UK)
Advances in probabilistic risk analysis
International Nuclear Information System (INIS)
Hardung von Hardung, H.
1982-01-01
Probabilistic risk analysis can now look back upon almost a quarter century of intensive development. The early studies, whose methods and results are still referred to occasionally, however, only permitted rough estimates to be made of the probabilities of recognizable accident scenarios, failing to provide a method which could have served as a reference base in calculating the overall risk associated with nuclear power plants. The first truly solid attempt was the Rasmussen Study and, partly based on it, the German Risk Study. In those studies, probabilistic risk analysis was given a much more precise basis. However, new methodologies have been developed in the meantime, which allow much more informative risk studies to be carried out. They have been found to be valuable tools for management decisions with respect to backfitting, reinforcement and risk limitation. Today they are mainly applied by specialized private consultants and have already found widespread application, especially in the USA. (orig.)
Up-gradient transport in a probabilistic transport model
DEFF Research Database (Denmark)
Gavnholt, J.; Juul Rasmussen, J.; Garcia, O.E.
2005-01-01
The transport of particles or heat against the driving gradient is studied by employing a probabilistic transport model with a characteristic particle step length that depends on the local concentration or heat gradient. When this gradient is larger than a prescribed critical value, the standard....... These results supplement recent works by van Milligen [Phys. Plasmas 11, 3787 (2004)], which applied Levy distributed step sizes in the case of supercritical gradients to obtain the up-gradient transport. (c) 2005 American Institute of Physics....
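A minimal sketch of a probabilistic transport model with a gradient-dependent step length might look as follows (grid size, critical gradient, and step rule are hypothetical; it illustrates only the mechanism of concentration-dependent steps, not the cited up-gradient result):

```python
import random

NCELLS, NPART, STEPS = 50, 2000, 200
CRIT = 5.0           # critical gradient (particles per cell)
rng = random.Random(0)

# Start with all particles piled in the centre cells: a steep gradient.
pos = [NCELLS // 2 + rng.choice([-1, 0, 1]) for _ in range(NPART)]

def counts(positions):
    c = [0] * NCELLS
    for p in positions:
        c[p] += 1
    return c

for _ in range(STEPS):
    c = counts(pos)
    new = []
    for p in pos:
        left = c[p - 1] if p > 0 else c[p]
        right = c[p + 1] if p < NCELLS - 1 else c[p]
        grad = abs(right - left) / 2.0
        # Step length grows where the local gradient is supercritical.
        step = 2 if grad > CRIT else 1
        q = min(max(p + rng.choice([-1, 1]) * step, 0), NCELLS - 1)
        new.append(q)
    pos = new

final = counts(pos)
print(max(final), min(final))
```

Replacing the fixed supercritical step of 2 with Levy-distributed step sizes, as in the cited work, is what produces the counterintuitive transport against the gradient.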
Probabilistic risk assessment of HTGRs
International Nuclear Information System (INIS)
Fleming, K.N.; Houghton, W.J.; Hannaman, G.W.; Joksimovic, V.
1980-08-01
Probabilistic Risk Assessment methods have been applied to gas-cooled reactors for more than a decade and to HTGRs for more than six years in the programs sponsored by the US Department of Energy. Significant advancements to the development of PRA methodology in these programs are summarized as are the specific applications of the methods to HTGRs. Emphasis here is on PRA as a tool for evaluating HTGR design options. Current work and future directions are also discussed
Probabilistic methods for rotordynamics analysis
Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.
1991-01-01
This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
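For a single second-order equation, the Routh-Hurwitz test reduces to positivity of the damping and stiffness coefficients, so the probability of instability can be sketched by direct sampling (the coefficient distributions below are hypothetical, and this brute-force sampling stands in for the fast probability integration and adaptive importance sampling of the paper):

```python
import random

def unstable(c, k):
    # Routh-Hurwitz for x'' + c x' + k x = 0: stable iff both coefficients
    # are positive; any non-positive coefficient gives a non-decaying mode.
    return c <= 0 or k <= 0

def p_instability(mu_c, sd_c, mu_k, sd_k, n, rng):
    # Monte Carlo estimate: fraction of sampled systems failing the test.
    hits = sum(unstable(rng.gauss(mu_c, sd_c), rng.gauss(mu_k, sd_k))
               for _ in range(n))
    return hits / n

rng = random.Random(7)
# Damping mean three sigma above zero: instability should be a rare event.
print(p_instability(0.3, 0.1, 2.0, 0.2, 100_000, rng))
```

Because instability is a rare event here, plain sampling wastes most of its draws, which is precisely why the paper's importance-sampling and fast-integration methods matter in practice.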
Probabilistic analysis and related topics
Bharucha-Reid, A T
1983-01-01
Probabilistic Analysis and Related Topics, Volume 3 focuses on the continuity, integrability, and differentiability of random functions, including operator theory, measure theory, and functional and numerical analysis. The selection first offers information on the qualitative theory of stochastic systems and Langevin equations with multiplicative noise. Discussions focus on phase-space evolution via direct integration, phase-space evolution, linear and nonlinear systems, linearization, and generalizations. The text then ponders on the stability theory of stochastic difference systems and Marko
Probabilistic analysis and related topics
Bharucha-Reid, A T
1979-01-01
Probabilistic Analysis and Related Topics, Volume 2 focuses on the integrability, continuity, and differentiability of random functions, as well as functional analysis, measure theory, operator theory, and numerical analysis.The selection first offers information on the optimal control of stochastic systems and Gleason measures. Discussions focus on convergence of Gleason measures, random Gleason measures, orthogonally scattered Gleason measures, existence of optimal controls without feedback, random necessary conditions, and Gleason measures in tensor products. The text then elaborates on an
Probabilistic risk assessment of HTGRs
International Nuclear Information System (INIS)
Fleming, K.N.; Houghton, W.J.; Hannaman, G.W.; Joksimovic, V.
1981-01-01
Probabilistic Risk Assessment methods have been applied to gas-cooled reactors for more than a decade and to HTGRs for more than six years in the programs sponsored by the U.S. Department of Energy. Significant advancements to the development of PRA methodology in these programs are summarized as are the specific applications of the methods to HTGRs. Emphasis here is on PRA as a tool for evaluating HTGR design options. Current work and future directions are also discussed. (author)
Guard Cell and Tropomyosin Inspired Chemical Sensor
Directory of Open Access Journals (Sweden)
Jacquelyn K.S. Nagel
2013-10-01
Full Text Available Sensors are an integral part of many engineered products and systems. Biological inspiration has the potential to improve current sensor designs as well as inspire innovative ones. This paper presents the design of an innovative, biologically-inspired chemical sensor that performs “up-front” processing through mechanical means. Inspiration from the physiology (function) of the guard cell coupled with the morphology (form) and physiology of tropomyosin resulted in two concept variants for the chemical sensor. Applications of the sensor design include environmental monitoring of harmful gases, and a non-invasive approach to detect illnesses, including diabetes, liver disease, and cancer, on the breath.
INSPIRE 2012 da Istanbul a Firenze
Directory of Open Access Journals (Sweden)
Mauro Salvemini
2012-09-01
Full Text Available During the INSPIRE 2012 conference held in Istanbul, the news that most impressed the Italians present, including those from the public administration, was that the next INSPIRE conference will take place in Florence, 23-27 June 2013.
Hoffmann, K.; Srouji, R. G.; Hansen, S. O.
2017-12-01
The technology development within the structural design of long-span bridges in Norwegian fjords has created a need for reformulating the calculation format and the physical quantities used to describe the properties of wind and the associated wind-induced effects on bridge decks. Parts of a new probabilistic format describing the incoming, undisturbed wind are presented. It is expected that a fixed probabilistic format will facilitate a more physically consistent and precise description of the wind conditions, which in turn increases the accuracy and considerably reduces uncertainties in wind load assessments. Because the format is probabilistic, a quantification of the level of safety and uncertainty in predicted wind loads is readily accessible. A simple buffeting response calculation demonstrates the use of probabilistic wind data in the assessment of wind loads and responses. Furthermore, vortex-induced fatigue damage is discussed in relation to probabilistic wind turbulence data and response measurements from wind tunnel tests.
P. Sphicas
There have been three physics meetings since the last CMS week: “physics days” on March 27-29, the Physics/ Trigger week on April 23-27 and the most recent physics days on May 22-24. The main purpose of the March physics days was to finalize the list of “2007 analyses”, i.e. the few topics that the physics groups will concentrate on for the rest of this calendar year. The idea is to carry out a full physics exercise, with CMSSW, for select physics channels which test key features of the physics objects, or represent potential “day 1” physics topics that need to be addressed in advance. The list of these analyses was indeed completed and presented in the plenary meetings. As always, a significant amount of time was also spent in reviewing the status of the physics objects (reconstruction) as well as their usage in the High-Level Trigger (HLT). The major event of the past three months was the first “Physics/Trigger week” in Apri...
[Nikola Tesla: flashes of inspiration].
Villarejo-Galende, Alberto; Herrero-San Martín, Alejandro
2013-01-16
Nikola Tesla (1856-1943) was one of the greatest inventors in history and a key player in the revolution that led to the large-scale use of electricity. He also made important contributions to such diverse fields as x-rays, remote control, radio, the theory of consciousness or electromagnetism. In his honour, the international unit of magnetic induction was named after him. Yet, his fame is scarce in comparison with that of other inventors of the time, such as Edison, with whom he had several heated arguments. He was a rather odd, reserved person who lived for his inventions, the ideas for which came to him in moments of inspiration. In his autobiography he relates these flashes with a number of neuropsychiatric manifestations, which can be seen to include migraine auras, synaesthesiae, obsessions and compulsions.
Collide@CERN: sharing inspiration
Katarina Anthony
2012-01-01
Late last year, Julius von Bismarck was appointed to be CERN's first "artist in residence" after winning the Collide@CERN Digital Arts award. He’ll be spending two months at CERN starting this March but, to get a flavour of what’s in store, he visited the Organization last week for a crash course in its inspiring activities. Julius von Bismarck, taking a closer look... When we arrive to interview German artist Julius von Bismarck, he’s being given a presentation about antiprotons’ ability to kill cancer cells. The whiteboard in the room contains graphs and equations that might easily send a non-scientist running, yet as Julius puts it, “if I weren’t interested, I’d be asleep”. Given his numerous questions, he must have been fascinated. “This ‘introduction’ week has been exhilarating,” says Julius. “I’ve been able to interact ...
Switchable bio-inspired adhesives
Kroner, Elmar
2015-03-01
Geckos have astonishing climbing abilities. They can adhere to almost any surface and can run on walls and even stick to ceilings. The extraordinary adhesion performance is caused by a combination of a complex surface pattern on their toes and the biomechanics of its movement. These biological dry adhesives have been intensely investigated during recent years because of the unique combination of adhesive properties. They provide high adhesion, allow for easy detachment, can be removed residue-free, and have self-cleaning properties. Many aspects have been successfully mimicked, leading to artificial, bio-inspired, patterned dry adhesives which in some respects even outperform the adhesion capabilities of geckos. However, designing artificial patterned adhesion systems with switchable adhesion remains a big challenge; the gecko's adhesion system is based on a complex hierarchical surface structure and on advanced biomechanics, which are both difficult to mimic. In this paper, two approaches are presented to achieve switchable adhesion. The first approach is based on a patterned polydimethylsiloxane (PDMS) polymer, where adhesion can be switched on and off by applying a low and a high compressive preload. The switch in adhesion is caused by a reversible mechanical instability of the adhesive silicone structures. The second approach is based on a composite material consisting of a Nickel-Titanium (NiTi) shape memory alloy and a patterned adhesive PDMS layer. The NiTi alloy is trained to change its surface topography as a function of temperature, which results in a change of the contact area and of alignment of the adhesive pattern towards a substrate, leading to switchable adhesion. These examples show that the unique properties of bio-inspired adhesives can be greatly improved by new concepts such as mechanical instability or by the use of active materials which react to external stimuli.
Hierarchical and size dependent mechanical properties of silica and silicon nanostructures inspired by diatom algae
Garcia, Andre Phillipe
2010-09-01
Probabilistic seismic history matching using binary images
Davolio, Alessandra; Schiozer, Denis Jose
2018-02-01
Currently, the goal of history-matching procedures is not only to provide a model matching any observed data but also to generate multiple matched models to properly handle uncertainties. One such approach is a probabilistic history-matching methodology based on the discrete Latin Hypercube sampling algorithm, proposed in previous works, which was particularly efficient for matching well data (production rates and pressure). 4D seismic (4DS) data have been increasingly included in history-matching procedures. A key issue in seismic history matching (SHM) is to transfer data into a common domain: impedance, amplitude or pressure, and saturation. In any case, seismic inversions and/or modeling are required, which can be time consuming. An alternative that avoids these procedures is using binary images in SHM, as they allow the shape, rather than the physical values, of observed anomalies to be matched. This work presents the incorporation of binary images in SHM within the aforementioned probabilistic history matching. The application was performed with real data from a segment of the Norne benchmark case that presents strong 4D anomalies, including softening signals due to pressure build up. The binary images are used to match the pressurized zones observed in time-lapse data. Three history matchings were conducted using: only well data, well and 4DS data, and only 4DS. The methodology is very flexible and successfully utilized the addition of binary images for seismic objective functions. The results showed good convergence of the method in a few iterations for all three cases. The matched models of the first two cases provided the best results, with similar well matching quality. The second case provided models presenting pore pressure changes according to the expected dynamic behavior (pressurized zones) observed on 4DS data. The use of binary images in SHM is relatively new with few examples in the literature. This work enriches this discussion by presenting a new
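The core idea of matching the shape of a 4D anomaly rather than its physical values can be sketched with a simple binary-image mismatch objective. The function and the toy 4x4 grids below are hypothetical illustrations, not the authors' actual implementation:

```python
import numpy as np

def binary_mismatch(observed, simulated):
    """Jaccard-distance objective between two binary anomaly maps.

    0.0 means the simulated pressurized zone exactly matches the
    observed 4D-seismic anomaly; 1.0 means no overlap at all."""
    obs = np.asarray(observed, dtype=bool)
    sim = np.asarray(simulated, dtype=bool)
    union = np.logical_or(obs, sim).sum()
    if union == 0:                      # both maps empty: perfect match
        return 0.0
    inter = np.logical_and(obs, sim).sum()
    return 1.0 - inter / union

# Toy 4x4 maps: observed pressurized zone vs. one candidate model's prediction
observed = [[0, 1, 1, 0],
            [0, 1, 1, 0],
            [0, 0, 0, 0],
            [0, 0, 0, 0]]
simulated = [[0, 1, 0, 0],
             [0, 1, 1, 0],
             [0, 0, 1, 0],
             [0, 0, 0, 0]]
print(binary_mismatch(observed, simulated))  # 3 common cells, 5 in union -> 0.4
```

In a history-matching loop, such a mismatch value would serve as one seismic objective function to be minimized alongside the well-data objectives.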
Probabilistic finite elements for fracture mechanics
Besterfield, Glen
1988-01-01
The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic characteristics of the stress intensity factors, such as expectation, covariance and correlation, are calculated for random load, random material and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.
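The moment-propagation idea behind PFEM can be illustrated on the closed-form stress intensity factor K = Y·σ·√(πa) using a first-order second-moment approximation for independent random load σ and crack length a. The function name and the numbers are illustrative assumptions, not taken from the paper:

```python
import math

def k_first_order(sigma_mean, sigma_std, a_mean, a_std, Y=1.0):
    """First-order second-moment estimate of the stress intensity factor
    K = Y * sigma * sqrt(pi * a) for independent random load sigma and
    crack length a (a minimal analogue of PFEM moment propagation)."""
    k_mean = Y * sigma_mean * math.sqrt(math.pi * a_mean)
    # partial derivatives of K evaluated at the means
    dK_dsigma = Y * math.sqrt(math.pi * a_mean)
    dK_da = Y * sigma_mean * math.sqrt(math.pi) / (2.0 * math.sqrt(a_mean))
    k_var = (dK_dsigma * sigma_std) ** 2 + (dK_da * a_std) ** 2
    return k_mean, math.sqrt(k_var)

# 100 MPa +/- 10 MPa load, 2 mm +/- 0.2 mm crack
k_mean, k_std = k_first_order(100.0, 10.0, 0.002, 0.0002)
print(f"K = {k_mean:.2f} +/- {k_std:.2f} MPa*sqrt(m)")
```

The full PFEM carries the same expansion through the finite element equations, so that covariances and correlations of stress intensity factors, not just single-variable variances, come out of the analysis.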
Business Inspiration: Small Business Leadership in Recovery?
Rae, David; Price, Liz; Bosworth, Gary; Parkinson, Paul
2012-01-01
Business Inspiration was a short, action-centred leadership and innovation development programme designed for owners and managers of smaller firms to address business survival and repositioning needs arising from the UK's economic downturn. The article examines the design and delivery of Business Inspiration and the impact of the programme on…
Inspiration til undervisning på museer
DEFF Research Database (Denmark)
Hyllested, Trine Elisabeth
2015-01-01
Collection and arrangement of knowledge meant to give a general view of, to inspire and to develop teaching at museums in Denmark.
Probabilistic Harmonic Modeling of Wind Power Plants
DEFF Research Database (Denmark)
Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg
2017-01-01
A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitioned converter generated voltage harmonics into those with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3MW, Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic...
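The practical consequence of the deterministic/probabilistic phase split can be sketched with a small Monte Carlo experiment: harmonics with deterministic (equal) phase add linearly across turbines, while harmonics with uniformly random phase partially cancel. This is a toy illustration with hypothetical amplitudes, not the paper's SD model:

```python
import cmath
import math
import random

def summed_harmonic_magnitude(n_turbines, amplitude, random_phase,
                              trials=2000, seed=1):
    """Monte Carlo estimate of the expected magnitude of one harmonic order
    summed over identical converters.  With deterministic (equal) phase the
    phasors add linearly; with uniformly random phase the expected magnitude
    grows only like sqrt(N)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(trials):
        s = 0j
        for _ in range(n_turbines):
            phi = rng.uniform(0.0, 2.0 * math.pi) if random_phase else 0.0
            s += amplitude * cmath.exp(1j * phi)
        total += abs(s)
    return total / trials

coherent = summed_harmonic_magnitude(10, 1.0, random_phase=False)
incoherent = summed_harmonic_magnitude(10, 1.0, random_phase=True)
print(coherent)    # 10.0: deterministic-phase harmonics add linearly
print(incoherent)  # close to sqrt(pi*10)/2 ~ 2.8, well below 10
```

Distinguishing the two classes of harmonics in the planning phase therefore changes the predicted plant-level emission considerably.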
Students’ difficulties in probabilistic problem-solving
Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.
2018-03-01
Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students’ difficulties in solving probabilistic problems, focusing on analyzing and describing students’ errors during problem solving. This research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students’ probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems. The data were analyzed descriptively using Miles and Huberman’s steps. The results show that students’ difficulties in solving probabilistic problems can be divided into three categories. The first relates to difficulties in understanding the probabilistic problem; the second to difficulties in choosing and using appropriate solution strategies; and the third to difficulties with the computational process. These results suggest that students are not yet able to apply their knowledge and abilities to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that optimizes students’ probabilistic thinking ability.
D. Acosta
2010-01-01
A remarkable amount of progress has been made in Physics since the last CMS Week in June given the exponential growth in the delivered LHC luminosity. The first major milestone was the delivery of a variety of results to the ICHEP international conference held in Paris this July. For this conference, CMS prepared 15 Physics Analysis Summaries on physics objects and 22 Summaries on new and interesting physics measurements that exploited the luminosity recorded by the CMS detector. The challenge was incorporating the largest batch of luminosity that was delivered only days before the conference (300 nb-1 total). The physics covered from this initial running period spanned hadron production measurements, jet production and properties, electroweak vector boson production, and even glimpses of the top quark. Since then, the accumulated integrated luminosity has increased by a factor of more than 100, and all groups have been working tremendously hard on analysing this dataset. The September Physics Week was held ...
J. Incandela
There have been numerous developments in the physics area since the September CMS week. The biggest single event was the Physics/Trigger week in the end of October, whereas in terms of ongoing activities the “2007 analyses” went into high gear. This was in parallel with participation in CSA07 by the physics groups. On the organizational side, the new conveners of the physics groups have been selected, and a new database for managing physics analyses has been deployed. Physics/Trigger week The second Physics-Trigger week of 2007 took place during the week of October 22-26. The first half of the week was dedicated to working group meetings. The plenary Joint Physics-Trigger meeting took place on Wednesday afternoon and focused on the activities of the new Trigger Studies Group (TSG) and trigger monitoring. Both the Physics and Trigger organizations are now focused on readiness for early data-taking. Thus, early trigger tables and preparations for calibr...
P. Sphicas
The CPT project came to an end in December 2006 and its original scope is now shared among three new areas, namely Computing, Offline and Physics. In the physics area the basic change with respect to the previous system (where the PRS groups were charged with detector and physics object reconstruction and physics analysis) was the split of the detector PRS groups (the old ECAL-egamma, HCAL-jetMET, Tracker-btau and Muons) into two groups each: a Detector Performance Group (DPG) and a Physics Object Group. The DPGs are now led by the Commissioning and Run Coordinator deputy (Darin Acosta) and will appear in the corresponding column in CMS bulletins. On the physics side, the physics object groups are charged with the reconstruction of physics objects, the tuning of the simulation (in collaboration with the DPGs) to reproduce the data, the provision of code for the High-Level Trigger, the optimization of the algorithms involved for the different physics analyses (in collaboration with the analysis gr...
Probabilistic Flood Defence Assessment Tools
Directory of Open Access Journals (Sweden)
Slomp Robert
2016-01-01
institutions managing the flood defences, and not by just a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996 probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves (wave height, wave period and direction)) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to the methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection and to evaluate if it is really worthwhile.
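The step from return-period-based boundary conditions to a fully probabilistic analysis can be sketched with a toy exceedance calculation: annual maximum water levels modelled as a Gumbel distribution compared against a dike crest level. The distribution parameters and crest level below are illustrative numbers, not Dutch statutory data:

```python
import math
import random

def annual_failure_probability(crest_level, mu, beta, years=200000, seed=7):
    """Monte Carlo estimate of the annual probability that the maximum
    water level exceeds a dike crest, with annual maxima modelled as a
    Gumbel(mu, beta) distribution."""
    rng = random.Random(seed)
    exceed = 0
    for _ in range(years):
        u = rng.random()
        level = mu - beta * math.log(-math.log(u))  # Gumbel inverse CDF
        if level > crest_level:
            exceed += 1
    return exceed / years

# Gumbel(mu=2.0 m, beta=0.3 m); exact exceedance of a 4.0 m crest:
exact = 1.0 - math.exp(-math.exp(-(4.0 - 2.0) / 0.3))
mc = annual_failure_probability(4.0, 2.0, 0.3)
print(exact)   # ~1.27e-3, i.e. roughly a 1-in-800-year load
print(mc)      # Monte Carlo estimate, close to the exact value
```

A full assessment would replace the fixed crest level by an uncertain resistance and integrate load and strength together, which is precisely what the fully probabilistic third step adds over the semi-probabilistic one.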
Valentin, Jan B; Andreetta, Christian; Boomsma, Wouter; Bottaro, Sandro; Ferkinghoff-Borg, Jesper; Frellsen, Jes; Mardia, Kanti V; Tian, Pengfei; Hamelryck, Thomas
2014-02-01
We propose a method to formulate probabilistic models of protein structure in atomic detail, for a given amino acid sequence, based on Bayesian principles, while retaining a close link to physics. We start from two previously developed probabilistic models of protein structure on a local length scale, which concern the dihedral angles in main chain and side chains, respectively. Conceptually, this constitutes a probabilistic and continuous alternative to the use of discrete fragment and rotamer libraries. The local model is combined with a nonlocal model that involves a small number of energy terms according to a physical force field, and some information on the overall secondary structure content. In this initial study we focus on the formulation of the joint model and the evaluation of the use of an energy vector as a descriptor of a protein's nonlocal structure; hence, we derive the parameters of the nonlocal model from the native structure without loss of generality. The local and nonlocal models are combined using the reference ratio method, which is a well-justified probabilistic construction. For evaluation, we use the resulting joint models to predict the structure of four proteins. The results indicate that the proposed method and the probabilistic models show considerable promise for probabilistic protein structure prediction and related applications. Copyright © 2013 Wiley Periodicals, Inc.
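The reference ratio construction used to combine the local and nonlocal models can be illustrated on a discrete toy state space: a local distribution p_local(x) is reweighted by q(E(x))/p_ref(E(x)), where E maps each state to a coarse descriptor (such as an energy bin), q is the desired nonlocal distribution over descriptors, and p_ref is the descriptor distribution implied by p_local alone. All numbers below are hypothetical:

```python
import numpy as np

# Local structure model over four discrete states (hypothetical values)
p_local = np.array([0.4, 0.3, 0.2, 0.1])
# Descriptor bin of each state, e.g. an energy bin
E = np.array([0, 0, 1, 1])

# Descriptor distribution implied by the local model alone
p_ref = np.array([p_local[E == b].sum() for b in (0, 1)])   # [0.7, 0.3]
# Target nonlocal distribution over descriptor bins
q = np.array([0.5, 0.5])

# Reference ratio: reweight local probabilities, then renormalise
p_joint = p_local * q[E] / p_ref[E]
p_joint /= p_joint.sum()

print(np.round(p_joint, 4))
# The descriptor marginal of the combined model now matches q:
print(np.round([p_joint[E == b].sum() for b in (0, 1)], 4))  # [0.5 0.5]
```

Within each descriptor bin the relative preferences of the local model are preserved, which is what makes the construction a well-justified way of injecting nonlocal information without discarding the local model.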
Aging in probabilistic safety assessment
International Nuclear Information System (INIS)
Jordan Cizelj, R.; Kozuh, M.
1995-01-01
Aging is a phenomenon that influences the unavailability of all components of the plant. The influence of aging on Probabilistic Safety Assessment calculations was estimated for the Electrical Power Supply System. The average increase in system unavailability due to aging of system components was estimated, and components were prioritized according to their influence on the change in system unavailability and the relative increase in their unavailability due to aging. After the analysis of some numerical results, a recommendation for a detailed study of aging phenomena and their influence on system availability is given. (author)
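The effect of component aging on unavailability can be sketched with the standard approximation for a periodically tested standby component, q ≈ λτ/2, combined with a simple linear aging model for the failure rate. The model form and all numbers are illustrative assumptions, not taken from the analysis:

```python
def unavailability(lam, tau):
    """Steady-state unavailability of a periodically tested standby
    component: q ~ lam * tau / 2, valid for lam * tau << 1."""
    return lam * tau / 2.0

def aged_failure_rate(lam0, a, t):
    """Illustrative linear aging model: lam(t) = lam0 * (1 + a*t)."""
    return lam0 * (1.0 + a * t)

lam0 = 1e-5   # fresh failure rate [1/h]
tau = 720.0   # test interval [h]
q_new = unavailability(lam0, tau)
q_aged = unavailability(aged_failure_rate(lam0, 0.05, 10.0), tau)  # 10 yr, 5%/yr
print(q_new)              # 0.0036
print(q_aged / q_new)     # ~1.5: a 50% relative increase due to aging
```

Ranking components by such relative increases is one way to arrive at the prioritization the abstract describes.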
Probabilistic assessment of SGTR management
International Nuclear Information System (INIS)
Champ, M.; Cornille, Y.; Lanore, J.M.
1989-04-01
In the case of a steam generator tube rupture (SGTR) event in France, mitigation of the accident relies on operator intervention through a specific accident procedure. A detailed probabilistic analysis has been conducted, which required assessing the failure probability of the operator actions; for that purpose it was necessary to estimate the time available for the operator to apply the adequate procedure for various sequences. The results indicate that, taking into account the delays and the existence of adequate accident procedures, the risk is reduced to a reasonably low level.
Probabilistic accident sequence recovery analysis
International Nuclear Information System (INIS)
Stutzke, Martin A.; Cooper, Susan E.
2004-01-01
Recovery analysis is a method that considers alternative strategies for preventing accidents in nuclear power plants during probabilistic risk assessment (PRA). Consideration of possible recovery actions in PRAs has been controversial, and there seems to be a widely held belief among PRA practitioners, utility staff, plant operators, and regulators that the results of recovery analysis should be skeptically viewed. This paper provides a framework for discussing recovery strategies, thus lending credibility to the process and enhancing regulatory acceptance of PRA results and conclusions. (author)
Probabilistic risk assessment: Number 219
International Nuclear Information System (INIS)
Bari, R.A.
1985-01-01
This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)
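The event-tree/fault-tree combination mentioned above can be sketched in a few lines: an initiating event is followed by two safety systems whose failure probabilities would, in practice, come from fault-tree analysis. All frequencies and probabilities below are hypothetical:

```python
# Minimal event-tree quantification (illustrative numbers only)
f_initiator = 0.1        # initiating-event frequency [1/yr]
p_sys_a_fails = 1e-2     # fault-tree result for safety system A
p_sys_b_fails = 5e-2     # fault-tree result for safety system B

sequences = {
    "A ok, B ok":       (1 - p_sys_a_fails) * (1 - p_sys_b_fails),
    "A ok, B fails":    (1 - p_sys_a_fails) * p_sys_b_fails,
    "A fails, B ok":    p_sys_a_fails * (1 - p_sys_b_fails),
    "A fails, B fails": p_sys_a_fails * p_sys_b_fails,   # worst sequence
}

for name, p in sequences.items():
    print(f"{name}: frequency = {f_initiator * p:.2e} /yr")

# The sequence probabilities are exhaustive and sum to one:
print(sum(sequences.values()))
```

The worst sequence here has frequency 0.1 x 1e-2 x 5e-2 = 5e-5 per year; a real PRA builds the same product structure over many initiators and far larger trees.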
Axiomatisation of fully probabilistic design
Czech Academy of Sciences Publication Activity Database
Kárný, Miroslav; Kroupa, Tomáš
2012-01-01
Roč. 186, č. 1 (2012), s. 105-113 ISSN 0020-0255 R&D Projects: GA MŠk(CZ) 2C06001; GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bayesian decision making * Fully probabilistic design * Kullback–Leibler divergence * Unified decision making Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.643, year: 2012 http://library.utia.cas.cz/separaty/2011/AS/karny-0367271.pdf
Probabilistic Analysis of Crack Width
Directory of Open Access Journals (Sweden)
J. Marková
2000-01-01
Full Text Available Probabilistic analysis of the crack width of a reinforced concrete element is based on the formulas accepted in Eurocode 2 and European Model Code 90. The obtained values of the reliability index β seem to be satisfactory for a reinforced concrete slab that fulfils the crack-width requirements specified in Eurocode 2. However, the reliability of the slab seems to be insufficient when the European Model Code 90 is considered; the reliability index is less than the recommended value of 1.5 for serviceability limit states indicated in Eurocode 1. Analysis of the sensitivity factors of the basic variables makes it possible to identify the variables that significantly affect the total crack width.
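The reliability index β for a serviceability limit state g = w_limit − w can be estimated by Monte Carlo simulation and the relation β = −Φ⁻¹(p_f). The crack-width distribution below is an illustrative lognormal assumption, not the Eurocode 2 crack-width model itself:

```python
import math
import random

def inverse_erf(y, lo=-6.0, hi=6.0):
    """Bisection inverse of math.erf (accurate enough for this sketch)."""
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if math.erf(mid) < y:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

def reliability_index(w_limit=0.3, trials=100000, seed=3):
    """Monte Carlo reliability index beta for the limit state
    g = w_limit - w, with the crack width w treated as lognormal
    (illustrative parameters: median 0.2 mm, ~25% scatter)."""
    rng = random.Random(seed)
    mu, sigma = math.log(0.2), 0.25
    failures = sum(1 for _ in range(trials)
                   if rng.lognormvariate(mu, sigma) > w_limit)
    pf = failures / trials
    beta = -math.sqrt(2.0) * inverse_erf(2.0 * pf - 1.0)  # beta = -Phi^-1(pf)
    return beta, pf

beta, pf = reliability_index()
print(f"pf ~ {pf:.4f}, beta ~ {beta:.2f}")
```

With these assumed parameters β comes out around 1.6, just above the recommended 1.5; tightening or relaxing the crack-width model shifts β across that threshold, which mirrors the Eurocode 2 versus Model Code 90 contrast in the abstract.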
Probabilistic approach to EMP assessment
International Nuclear Information System (INIS)
Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.
1980-09-01
The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program
Probabilistic risk assessment, Volume I
International Nuclear Information System (INIS)
Anon.
1982-01-01
This book contains 158 papers presented at the International Topical Meeting on Probabilistic Risk Assessment held by the American Nuclear Society (ANS) and the European Nuclear Society (ENS) in Port Chester, New York in 1981. The meeting was the second in a series of three. The main focus of the meeting was on the safety of light water reactors. The papers discuss safety goals and risk assessment. Quantitative safety goals, risk assessment in non-nuclear technologies, and operational experience and data base are also covered. Included is an address by Dr. Chauncey Starr
Probabilistic safety analysis using microcomputer
International Nuclear Information System (INIS)
Futuro Filho, F.L.F.; Mendes, J.E.S.; Santos, M.J.P. dos
1990-01-01
The main steps in performing a Probabilistic Safety Assessment (PSA) are presented in this report: study of the system description, construction of event trees and fault trees, and calculation of the overall unavailability of the systems. The use of a microcomputer to perform some of these tasks is also presented, highlighting the main characteristics software needs to do the job adequately. A sample case of fault tree construction and calculation is presented, using the PSAPACK software distributed by the IAEA (International Atomic Energy Agency) for training purposes. (author)
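The fault-tree unavailability calculation such software performs is, at its core, a minimal-cut-set evaluation. The toy system below (TOP = (A AND B) OR C, cut sets {A,B} and {C}) and its component unavailabilities are hypothetical, not the report's sample case:

```python
# Minimal-cut-set evaluation of top-event unavailability (toy example)
q = {"A": 1e-2, "B": 2e-2, "C": 1e-4}     # component unavailabilities

cut_sets = [("A", "B"), ("C",)]           # TOP = (A AND B) OR C

def cut_set_q(cs):
    """Probability of one minimal cut set (independent components)."""
    p = 1.0
    for comp in cs:
        p *= q[comp]
    return p

# Rare-event approximation: sum of cut-set probabilities (upper bound)
upper_bound = sum(cut_set_q(cs) for cs in cut_sets)
# Exact value for this small tree via inclusion of both cut sets
exact = 1.0 - (1.0 - cut_set_q(("A", "B"))) * (1.0 - q["C"])

print(upper_bound)   # 2e-4 + 1e-4 = 3e-4
print(exact)         # slightly below the upper bound
```

For the small unavailabilities typical of safety systems the rare-event sum is very close to the exact result, which is why fault-tree codes commonly report it.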
Probabilistic fuel rod analyses using the TRANSURANUS code
Energy Technology Data Exchange (ETDEWEB)
Lassmann, K; O'Carroll, C; Laar, J Van De [CEC Joint Research Centre, Karlsruhe (Germany)
1997-08-01
After more than 25 years of fuel rod modelling research, the basic concepts are well established and the limitations of the specific approaches are known. However, the widely used mechanistic approach leads in many cases to discrepancies between theoretical predictions and experimental evidence indicating that models are not exact and that some of the physical processes encountered are of stochastic nature. To better understand uncertainties and their consequences, the mechanistic approach must therefore be augmented by statistical analyses. In the present paper the basic probabilistic methods are briefly discussed. Two such probabilistic approaches are included in the fuel rod performance code TRANSURANUS: the Monte Carlo method and the Numerical Noise Analysis. These two techniques are compared and their capabilities are demonstrated. (author). 12 refs, 4 figs, 2 tabs.
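The Monte Carlo approach mentioned above amounts to sampling uncertain inputs and propagating them through the fuel rod model. The sketch below uses a deliberately simple stand-in model (centreline temperature rise of a cylindrical rod, ΔT = q′/(4πk)) with illustrative input distributions, not TRANSURANUS itself or its parameters:

```python
import math
import random
import statistics

def fuel_temperature(power, conductivity):
    """Toy steady-state centreline temperature rise of a cylindrical rod:
    dT = q' / (4*pi*k).  A stand-in for a full fuel-performance model."""
    return power / (4.0 * math.pi * conductivity)

def monte_carlo(trials=20000, seed=11):
    """Propagate input uncertainty through the model by random sampling
    (illustrative normal distributions for the inputs)."""
    rng = random.Random(seed)
    samples = [fuel_temperature(rng.gauss(20000.0, 1000.0),   # power [W/m]
                                rng.gauss(3.0, 0.15))         # k [W/m/K]
               for _ in range(trials)]
    return statistics.mean(samples), statistics.stdev(samples)

mean_dT, std_dT = monte_carlo()
print(f"dT = {mean_dT:.0f} +/- {std_dT:.0f} K")
```

With 5% relative uncertainty on each input, the output scatter is about 7% (the two contributions add in quadrature), which is the kind of statement the statistical augmentation of a mechanistic code makes possible.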
Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes
Pai, Shantaram S.; Nagpal, Vinod K.
2007-01-01
An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.
Physics Week: plenary meeting on physics groups plans for startup (14–15 May 2008) The Physics Objects (POG) and Physics Analysis (PAG) Groups presented their latest developments at the plenary meeting during the Physics Week. In the presentations particular attention was given to startup plans and readiness for data-taking. Many results based on the recent cosmic run were shown. A special Workshop on SUSY, described in a separate section, took place the day before the plenary. At the meeting, we had also two special DPG presentations on “Tracker and Muon alignment with CRAFT” (Ernesto Migliore) and “Calorimeter studies with CRAFT” (Chiara Rovelli). We had also a report from Offline (Andrea Rizzi) and Computing (Markus Klute) on the San Diego Workshop, described elsewhere in this bulletin. Tracking group (Boris Mangano). The level of sophistication of the tracking software increased significantly over the last few months: V0 (K0 and Λ) reconstr...
Biologically-inspired soft exosuit.
Asbeck, Alan T; Dyer, Robert J; Larusson, Arnar F; Walsh, Conor J
2013-06-01
In this paper, we present the design and evaluation of a novel soft cable-driven exosuit that can apply forces to the body to assist walking. Unlike traditional exoskeletons which contain rigid framing elements, the soft exosuit is worn like clothing, yet can generate moments at the ankle and hip with magnitudes of 18% and 30% of those naturally generated by the body during walking, respectively. Our design uses geared motors to pull on Bowden cables connected to the suit near the ankle. The suit has the advantages over a traditional exoskeleton in that the wearer's joints are unconstrained by external rigid structures, and the worn part of the suit is extremely light, which minimizes the suit's unintentional interference with the body's natural biomechanics. However, a soft suit presents challenges related to actuation force transfer and control, since the body is compliant and cannot support large pressures comfortably. We discuss the design of the suit and actuation system, including principles by which soft suits can transfer force to the body effectively and the biological inspiration for the design. For a soft exosuit, an important design parameter is the combined effective stiffness of the suit and its interface to the wearer. We characterize the exosuit's effective stiffness, and present preliminary results from it generating assistive torques to a subject during walking. We envision such an exosuit having broad applicability for assisting healthy individuals as well as those with muscle weakness.
Deyhle, Hans; Bunk, Oliver; Buser, Stefan; Krastl, Gabriel; Zitzmann, Nicola U.; Ilgenstein, Bernd; Beckmann, Felix; Pfeiffer, Franz; Weiger, Roland; Müller, Bert
2009-08-01
Human teeth are anisotropic composites. Dentin as the core material of the tooth consists of nanometer-sized calcium phosphate crystallites embedded in collagen fiber networks. It shows its anisotropy on the micrometer scale by its well-oriented microtubules. The detailed three-dimensional nanostructure of the hard tissues namely dentin and enamel, however, is not understood, although numerous studies on the anisotropic mechanical properties have been performed and evaluated to explain the tooth function including the enamel-dentin junction acting as effective crack barrier. Small angle X-ray scattering (SAXS) with a spatial resolution in the 10 μm range allows determining the size and orientation of the constituents on the nanometer scale with reasonable precision. So far, only some dental materials, i.e. the fiber reinforced posts exhibit anisotropic properties related to the micrometer-size glass fibers. Dental fillings, composed of nanostructures oriented similar to the natural hard tissues of teeth, however, do not exist at all. The current X-ray-based investigations of extracted human teeth provide evidence for oriented micro- and nanostructures in dentin and enamel. These fundamental quantitative findings result in profound knowledge to develop biologically inspired dental fillings with superior resistance to thermal and mechanical shocks.
Fracture Mechanics: Inspirations from Nature
Directory of Open Access Journals (Sweden)
David Taylor
2014-10-01
Full Text Available In Nature there are many examples of materials performing structural functions. Nature requires materials which are stiff and strong to provide support against various forces, including self-weight, the dynamic forces involved in movement, and external loads such as wind or the actions of a predator. These materials and structures have evolved over millions of years; the science of Biomimetics seeks to understand Nature and, as a result, to find inspiration for the creation of better engineering solutions. There has been relatively little fundamental research work in this area from a fracture mechanics point of view. Natural materials are quite brittle and, as a result, they have evolved several interesting strategies for preventing failure by crack propagation. Fatigue is also a major problem for many animals and plants. In this paper, several examples will be given of recent work in the Bioengineering Research Centre at Trinity College Dublin, investigating fracture and fatigue in such diverse materials as bamboo, the legs and wings of insects, and living cells.
Anaïs Schaeffer
2012-01-01
During the Frankfurt book fair last October, the CERN stand drew quite the crowd. Director-General Rolf Heuer was there to promote CERN’s mission and the "LHC: the Large Hadron Collider" book. He met a lot of visitors and for one of them there was also a nice follow-up… Marcus and his father visiting the LINAC facility. Fifteen year-old Marcus lives in Lauterecken near Frankfurt. The popular book fair last autumn was for him a nice opportunity to get in touch with the CERN environment. Inspired by the stand and what the CERN people were describing, he started to ask more and more questions… So many, that Rolf Heuer decided to invite him to come to CERN and find out some of the answers for himself. A few weeks later, while recovering from an exciting visit to the ATLAS underground cavern and other CERN installations with a cup of tea in Restaurant 1, Marcus shared his enthusiasm about the Organization: “When I was younger, my moth...
Lunabotics Mining Competition: Inspiration Through Accomplishment
Mueller, Robert P.
2011-01-01
NASA's Lunabotics Mining Competition is designed to promote interest in space activities and STEM (Science, Technology, Engineering, and Mathematics) fields. The competition uses excavation, a necessary first step towards extracting resources from the regolith and building bases on the Moon. The unique physical properties of lunar regolith, together with the reduced (1/6th) gravity and vacuum environment, make excavation a difficult technical challenge. Advances in lunar regolith mining have the potential to contribute significantly to our nation's space vision and NASA space exploration operations. The competition is conducted annually by NASA at the Kennedy Space Center Visitor Complex. Teams use telerobotic or autonomous operation to excavate a lunar regolith geotechnical simulant, hereinafter referred to as Black Point-1 (or BP-1); the teams that score the most points (calculated as an average of two separate 10-minute timed competition attempts) earn points towards the Joe Kosmo Award for Excellence, and the scores determine ranking in the on-site mining category of the competition. The minimum excavation requirement is 10.0 kg during each competition attempt, and the robotic excavator, referred to as the "Lunabot", must meet all specifications. This paper reviews the achievements of the Lunabotics Mining Competition in 2010 and 2011, and presents the new rules for 2012. By providing a framework for robotic design and fabrication, which culminates in a live competition event, university students have been able to produce sophisticated, tele-operated lunabots. Multi-disciplinary teams are encouraged, and the extreme sense of accomplishment provides a unique source of inspiration to the participating students, which has been shown to translate into increased interest in STEM careers. Our industrial sponsors (Caterpillar, Newmont Mining, Harris, Honeybee Robotics) have all stated that there is a strong need for skills in the workforce related
Compression of Probabilistic XML Documents
Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice
Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such UDBMSs can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained by combining a PXML-specific technique with a rather simple generic DAG-compression technique.
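The generic DAG-compression technique mentioned in the abstract can be sketched as hash-consing of identical subtrees: each distinct subtree is stored once and referenced by id. The tree shape and tag names below are illustrative, not the PXML model of the paper.

```python
def dag_compress(node, table):
    # node = (tag, (child, child, ...)). Identical subtrees hash to the
    # same key and are stored once in `table`; the return value is the
    # subtree's id, so the tree effectively becomes a DAG of shared nodes.
    key = (node[0], tuple(dag_compress(child, table) for child in node[1]))
    if key not in table:
        table[key] = len(table)
    return table[key]

# A 5-node tree containing three identical ("b", ()) subtrees.
tree = ("a", (("b", ()), ("b", ()), ("c", (("b", ()),))))
table = {}
root_id = dag_compress(tree, table)
# the three ("b", ()) subtrees collapse into a single table entry,
# leaving 3 unique nodes instead of 5
```

In a PXML setting, probability annotations would have to be part of the key (or factored out first by the PXML-specific step) for two subtrees to be shareable.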
Living probabilistic safety assessment (LPSA)
International Nuclear Information System (INIS)
1999-08-01
Over the past few years many nuclear power plant organizations have performed probabilistic safety assessments (PSAs) to identify and understand key plant vulnerabilities. As a result of the availability of these PSA studies, there is a desire to use them to enhance plant safety and to operate the nuclear stations in the most efficient manner. PSA is an effective tool for this purpose as it assists plant management to target resources where the largest benefit to plant safety can be obtained. However, any PSA which is to be used in this way must have a credible and defensible basis. Thus, it is very important to have a high quality 'living PSA' accepted by the plant and the regulator. With this background in mind, the IAEA has prepared this report on Living Probabilistic Safety Assessment (LPSA) which addresses the updating, documentation, quality assurance, and management and organizational requirements for LPSA. Deficiencies in the areas addressed in this report would seriously reduce the adequacy of the LPSA as a tool to support decision making at NPPs. This report was reviewed by a working group during a Technical Committee Meeting on PSA Applications to Improve NPP Safety held in Madrid, Spain, from 23 to 27 February 1998.
Software for Probabilistic Risk Reduction
Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto
2004-01-01
A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.
Energy Technology Data Exchange (ETDEWEB)
Martín, G.
2017-11-01
The International Organization for Medical Physics (IOMP) celebrates the International Day of Medical Physics (IDMP) annually on 7 November, the birthday of Marie Sklodowska-Curie, an exceptional figure in the history of science and a pioneer of Medical Physics. This year, the IDMP is devoted to women in Medical Physics, to honour the 150th anniversary of Marie Curie’s birth. This article briefly outlines her outstanding personality, her fundamental discovery of radioactivity and other scientific achievements for which she was awarded two Nobel Prizes, and her extensive, far less well known collaboration with industry. Finally, a brief review of the fundamental legacy she left humanity in Medicine and Science, and for women scientists, is presented.
Is Probabilistic Evidence a Source of Knowledge?
Friedman, Ori; Turri, John
2015-01-01
We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…
Probabilistic Cue Combination: Less Is More
Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen
2013-01-01
Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…
Multiobjective optimal allocation problem with probabilistic non ...
African Journals Online (AJOL)
This paper considers the optimum compromise allocation in multivariate stratified sampling with non-linear objective function and probabilistic non-linear cost constraint. The probabilistic non-linear cost constraint is converted into equivalent deterministic one by using Chance Constrained programming. A numerical ...
Probabilistic reasoning with graphical security models
Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick
This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order
Probabilistic Geoacoustic Inversion in Complex Environments
2015-09-30
Jan Dettmer, School of Earth and Ocean Sciences, University of Victoria, Victoria BC. … long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must … The project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is
Application of probabilistic precipitation forecasts from a ...
African Journals Online (AJOL)
2014-02-14
Application of probabilistic precipitation forecasts from a deterministic model … The aim of this paper is to investigate the increase in the lead-time of flash flood warnings of the SAFFG using probabilistic precipitation forecasts … The procedure is applied to a real flash flood event and the ensemble-based.
Why do probabilistic finite element analysis?
Thacker, Ben H
2008-01-01
The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.
Branching bisimulation congruence for probabilistic systems
Trcka, N.; Georgievska, S.; Aldini, A.; Baier, C.
2008-01-01
The notion of branching bisimulation for the alternating model of probabilistic systems is not a congruence with respect to parallel composition. In this paper we first define another branching bisimulation in the more general model allowing consecutive probabilistic transitions, and we prove that
Probabilistic Reversible Automata and Quantum Automata
Golovkins, Marats; Kravtsev, Maksim
2002-01-01
To study relationship between quantum finite automata and probabilistic finite automata, we introduce a notion of probabilistic reversible automata (PRA, or doubly stochastic automata). We find that there is a strong relationship between different possible models of PRA and corresponding models of quantum finite automata. We also propose a classification of reversible finite 1-way automata.
Bisimulations meet PCTL equivalences for probabilistic automata
DEFF Research Database (Denmark)
Song, Lei; Zhang, Lijun; Godskesen, Jens Chr.
2013-01-01
Probabilistic automata (PAs) have been successfully applied in formal verification of concurrent and stochastic systems. Efficient model checking algorithms have been studied, where the most often used logics for expressing properties are based on probabilistic computation tree logic (PCTL) and its...
Error Discounting in Probabilistic Category Learning
Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.
2011-01-01
The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…
Bio-inspired computation in telecommunications
Yang, Xin-She; Ting, TO
2015-01-01
Bio-inspired computation, especially those based on swarm intelligence, has become increasingly popular in the last decade. Bio-Inspired Computation in Telecommunications reviews the latest developments in bio-inspired computation from both theory and application as they relate to telecommunications and image processing, providing a complete resource that analyzes and discusses the latest and future trends in research directions. Written by recognized experts, this is a must-have guide for researchers, telecommunication engineers, computer scientists and PhD students.
Directory of Open Access Journals (Sweden)
Mauro Salvemini
2010-03-01
Full Text Available INSPIRE's maturity. The INSPIRE Conference 2010 took place from 23 to 25 June 2010 in Kraków, Poland; pre-conference workshops were organized on 22 June. The theme of this year’s edition was "INSPIRE as a Framework for Cooperation". The INSPIRE Conference was organised as a series of plenary sessions addressing common policy issues, parallel sessions focusing in particular on applications and implementations of SDIs, research issues, and new and evolving technologies and applications, and poster presentations.
Consideration of aging in probabilistic safety assessment
International Nuclear Information System (INIS)
Titina, B.; Cepin, M.
2007-01-01
Probabilistic safety assessment is a standardised tool for assessing the safety of nuclear power plants, complementing the deterministic safety analyses. Standard probabilistic models of safety equipment assume a constant component failure rate. Ageing of systems, structures and components can in principle be included in a new, age-dependent probabilistic safety assessment, in which the failure rate generally becomes a function of age. New age-dependent probabilistic safety assessment models, which offer explicit calculation of ageing effects, are developed. Several groups of components are considered, each requiring its own model: e.g. operating components and stand-by components. The developed component-level models are inserted into the models of the probabilistic safety assessment so that the ageing effects are evaluated for complete systems. The preliminary results show that the lack of data necessary for the consideration of ageing leads to highly uncertain models and, consequently, uncertain results. (author)
Structural reliability codes for probabilistic design
DEFF Research Database (Denmark)
Ditlevsen, Ove Dalager
1997-01-01
probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can differ by several orders of magnitude for two different, but by and large equally justifiable, probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point
D. Futyan
A lot has transpired on the “Physics” front since the last CMS Bulletin. The summer was filled with preparations of new Monte Carlo samples based on CMSSW_3, the finalization of all the 10 TeV physics analyses [in total 50 analyses were approved] and the preparations for the Physics Week in Bologna. A couple weeks later, the “October Exercise” commenced and ran through an intense two-week period. The Physics Days in October were packed with a number of topics that are relevant to data taking, in a number of “mini-workshops”: the luminosity measurement, the determination of the beam spot and the measurement of the missing transverse energy (MET) were the three main topics. Physics Week in Bologna The second physics week in 2009 took place in Bologna, Italy, on the week of Sep 7-11. The aim of the week was to review and establish how ready we are to do physics with the early collisions at the LHC. The agenda of the week was thus pac...
Biologically inspired toys using artificial muscles
Bar-Cohen, Y.
2001-01-01
Recent developments in electroactive polymers, so-called artificial muscles, could one day be used to make bionics possible. Meanwhile, as this technology evolves novel mechanisms are expected to emerge that are biologically inspired.
Innovative Didactics in an International Internship - inspiration
DEFF Research Database (Denmark)
Lembcke, Steen; Skibsted, Else Bengaard; Mølgaard, Niels
An inspiration handbook for the international team from the teacher education programme in VIA. Aimed to assist internship supervisors and students during international internships in regards to innovation, social entrepreneurship and development of the international teacher. Introduces why and how...
Biologically Inspired Technology Using Electroactive Polymers (EAP)
Bar-Cohen, Yoseph
2006-01-01
Evolution has allowed nature to introduce highly effective biological mechanisms that are an incredible inspiration for innovation. Humans have always made efforts to imitate nature's inventions, and we are increasingly making advances that make it significantly easier to imitate, copy, and adapt biological methods, processes and systems. This has brought us to the point of creating technology that goes far beyond the simple mimicking of nature. With better tools to understand and to implement nature's principles, we are now equipped like never before to be inspired by nature and to employ our tools in far superior ways. Effectively, through bio-inspiration we can gain a better view and appreciation of nature's capabilities while studying its models to learn what can be extracted, copied or adapted. Using electroactive polymers (EAP) as artificial muscles adds an important element to the development of biologically inspired technologies.
Nature-Inspired Structural Materials for Flexible Electronic Devices.
Liu, Yaqing; He, Ke; Chen, Geng; Leow, Wan Ru; Chen, Xiaodong
2017-10-25
Exciting advancements have been made in the field of flexible electronic devices in the last two decades and will certainly lead to a revolution in peoples' lives in the future. However, because of the poor sustainability of the active materials in complex stress environments, new requirements have been adopted for the construction of flexible devices. Thus, hierarchical architectures in natural materials, which have developed various environment-adapted structures and materials through natural selection, can serve as guides to solve the limitations of materials and engineering techniques. This review covers the smart designs of structural materials inspired by natural materials and their utility in the construction of flexible devices. First, we summarize structural materials that accommodate mechanical deformations, which is the fundamental requirement for flexible devices to work properly in complex environments. Second, we discuss the functionalities of flexible devices induced by nature-inspired structural materials, including mechanical sensing, energy harvesting, physically interacting, and so on. Finally, we provide a perspective on newly developed structural materials and their potential applications in future flexible devices, as well as frontier strategies for biomimetic functions. These analyses and summaries are valuable for a systematic understanding of structural materials in electronic devices and will serve as inspirations for smart designs in flexible electronics.
Artificial heartbeat: design and fabrication of a biologically inspired pump
International Nuclear Information System (INIS)
Walters, Peter; Stephenson, Robert; Lewis, Amy; Stinchcombe, Andrew; Ieropoulos, Ioannis
2013-01-01
We present a biologically inspired actuator exhibiting a novel pumping action. The design of the ‘artificial heartbeat’ actuator is inspired by physical principles derived from the structure and function of the human heart. The actuator employs NiTi artificial muscles and is powered by electrical energy generated by microbial fuel cells (MFCs). We describe the design and fabrication of the actuator and report the results of tests conducted to characterize its performance. This is the first artificial muscle-driven pump to be powered by MFCs fed on human urine. Results are presented in terms of the peak pumping pressure generated by the actuator, as well as for the volume of fluid transferred, when the actuator was powered by energy stored in a capacitor bank, which was charged by 24 MFCs fed on urine. The results demonstrate the potential for the artificial heartbeat actuator to be employed as a fluid circulation pump in future generations of MFC-powered robots (‘EcoBots’) that extract energy from organic waste. We also envisage that the actuator could in the future form part of a bio-robotic artwork or ‘bio-automaton’ that could help increase public awareness of research in robotics, bio-energy and biologically inspired design. (paper)
INSPIRE: a new scientific information system for HEP
Ivanov, R; CERN. Geneva. IT Department
2010-01-01
The status of high-energy physics (HEP) information systems has been jointly analyzed by the libraries of CERN, DESY, Fermilab and SLAC. As a result, the four laboratories have started the INSPIRE project – a new platform built by moving the successful SPIRES features and content, curated at DESY, Fermilab and SLAC, into the open-source CDS Invenio digital library software that was developed at CERN. INSPIRE will integrate current acquisition workflows and databases to host the entire body of the HEP literature (about one million records), aiming to become the reference HEP scientific information platform worldwide. It will provide users with fast access to full text journal articles and preprints, but also material such as conference slides and multimedia. INSPIRE will empower scientists with new tools to discover and access the results most relevant to their research, enable novel text- and data-mining applications, and deploy new metrics to assess the impact of articles and authors. In addition, it will ...
INSPIRE: a new scientific information system for HEP
Ivanov, R
2009-01-01
The status of high-energy physics (HEP) information systems has been jointly analyzed by the libraries of CERN, DESY, Fermilab and SLAC. As a result, the four laboratories have started the INSPIRE project – a new platform built by moving the successful SPIRES features and content, curated at DESY, Fermilab and SLAC, into the open-source CDS Invenio digital library software that was developed at CERN. INSPIRE will integrate present acquisition workflows and databases to host the entire body of the HEP literature (about one million records), aiming to become the reference HEP scientific information platform worldwide. It will provide users with fast access to full-text journal articles and preprints, but also material such as conference slides and multimedia. INSPIRE will empower scientists with new tools to discover and access the results most relevant to their research, enable novel text- and data-mining applications, and deploy new metrics to assess the impact of articles and authors. In addition, it will ...
J. Incandela
The all-plenary format of the CMS week in Cyprus gave the opportunity to the conveners of the physics groups to present the plans of each physics analysis group for tackling early physics analyses. The presentations were complete, so all are encouraged to browse through them on the Web. There is a wealth of information on what is going on, by whom and on what basis and priority. The CMS week was followed by two CMS “physics events”, the ICHEP08 days and the physics days in July. These were two weeks dedicated to either the approval of all the results that would be presented at ICHEP08, or to the review of all the other Monte-Carlo based analyses that were carried out in the context of our preparations for analysis with the early LHC data (the so-called “2008 analyses”). All this was planned in the context of the beginning of a ramp down of these Monte Carlo efforts, in anticipation of data. The ICHEP days are described below (agenda and talks at: http://indic...
Joe Incandela
There have been two plenary physics meetings since the December CMS week. The year started with two workshops, one on the measurements of the Standard Model necessary for “discovery physics” as well as one on the Physics Analysis Toolkit (PAT). Meanwhile the tail of the “2007 analyses” is going through the last steps of approval. It is expected that by the end of January all analyses will have converted to using the data from CSA07 – which include the effects of miscalibration and misalignment. January Physics Days The first Physics Days of 2008 took place on January 22-24. The first two days were devoted to comprehensive reports from the Detector Performance Groups (DPG) and Physics Objects Groups (POG) on their planning and readiness for early data-taking followed by approvals of several recent studies. Highlights of POG presentations are included below while the activities of the DPGs are covered elsewhere in this bulletin. January 24th was devo...
Inspirational Catalogue of Master Thesis Proposals 2015
DEFF Research Database (Denmark)
Thorndahl, Søren
2015-01-01
This catalog presents different topics for master thesis projects. It is important to emphasize that the project descriptions serve only as inspiration and that you can always discuss the specific contents of a project with the potential supervisors.
Nature as inspiration for leisure education
ŠPIRHANZLOVÁ, Andrea
2017-01-01
The thesis deals with the organization of leisure activities in which nature is the main tool and inspiration. The theoretical part defines basic concepts of leisure-time pedagogy and points to the possibility of using nature as inspiration, not only for creating the content of leisure activities but also as the environment in which the pedagogical-educational process takes place. The practical part contains a specific pedagogical-educational activity whose essence is b...
Cullen, Katherine
2005-01-01
Defined as the scientific study of matter and energy, physics explains how all matter behaves. Separated into modern and classical physics, the study attracts both experimental and theoretical physicists. From the discovery of the process of nuclear fission to an explanation of the nature of light, from the theory of special relativity to advancements made in particle physics, this volume profiles 10 pioneers who overcame tremendous odds to make significant breakthroughs in this heavily studied branch of science. Each chapter contains relevant information on the scientist's childhood, research, discoveries, and lasting contributions to the field and concludes with a chronology and a list of print and Internet references specific to that individual.
Probabilistic Survivability Versus Time Modeling
Joyner, James J., Sr.
2016-01-01
This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews and provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.
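The survivability-versus-time idea described above can be sketched as a small Monte Carlo simulation: draw a hazard-onset time and an egress time from probability distributions and count the fraction of trials in which egress completes first. The lognormal and normal distributions and their parameters below are illustrative assumptions, not the GSDO assessments' actual models.

```python
import random

def survivability(n_trials=50_000, seed=1):
    # Fraction of trials in which egress to a safe location completes
    # before hazard onset, under assumed (illustrative) distributions.
    rng = random.Random(seed)
    survived = 0
    for _ in range(n_trials):
        hazard_t = rng.lognormvariate(1.5, 0.4)  # hazard onset, minutes (assumed)
        egress_t = rng.gauss(3.0, 0.5)           # egress duration, minutes (assumed)
        if egress_t < hazard_t:
            survived += 1
    return survived / n_trials

p = survivability()
```

Evaluating the same count against a sweep of egress-time offsets would produce the survivability-versus-time plots the assessments describe.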
Probabilistic cloning with supplementary information
International Nuclear Information System (INIS)
Azuma, Koji; Shimamura, Junichi; Koashi, Masato; Imoto, Nobuyuki
2005-01-01
We consider probabilistic cloning of a state chosen from a mutually nonorthogonal set of pure states, with the help of a party holding supplementary information in the form of pure states. When the number of states is 2, we show that the best efficiency of producing m copies is always achieved by a two-step protocol in which the helping party first attempts to produce m-1 copies from the supplementary state, and if it fails, then the original state is used to produce m copies. On the other hand, when the number of states exceeds two, the best efficiency is not always achieved by such a protocol. We give examples in which the best efficiency is not achieved even if we allow any amount of one-way classical communication from the helping party
Machine learning a probabilistic perspective
Murphy, Kevin P
2012-01-01
Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic method...
Probabilistic analysis of modernization options
International Nuclear Information System (INIS)
Wunderlich, W.O.; Giles, J.E.
1991-01-01
This paper reports on benefit-cost analysis for hydropower operations, a standard procedure for reaching planning decisions, in which cost overruns and benefit shortfalls are common occurrences. One reason future benefits and costs are difficult to predict is that they usually cannot be represented with sufficient reliability by single values, because of the many uncertainties that enter the analysis through assumptions on inputs and system parameters. Therefore, ranges of variables need to be analyzed instead of single values. As a consequence, the decision criteria, such as net benefit and benefit-cost ratio, also vary over some range. A probabilistic approach will be demonstrated as a tool for assessing the reliability of the results
Probabilistic assessments of fuel performance
International Nuclear Information System (INIS)
Kelppe, S.; Ranta-Puska, K.
1998-01-01
The probabilistic Monte Carlo method, coupled with quasi-random sampling, is applied to fuel performance analyses. By using known distributions of fabrication parameters and real power histories in randomly selected combinations, and by making a large number of ENIGMA code calculations, the state of the fuel across the whole reactor can be estimated. Good statistics require thousands of runs. A sample case representing VVER-440 reactor fuel indicates relatively low fuel temperatures and mainly athermal fission gas release, if any. The rod internal pressure remains typically below 2.5 MPa, which leaves a large margin to the system pressure of 12 MPa. Gap conductance, an essential parameter in accident evaluations, shows no decrease from its start-of-life value. (orig.)
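The sampling scheme this abstract describes can be sketched in a few lines. This is only an illustration of quasi-random (low-discrepancy) Monte Carlo over fabrication-parameter distributions: the Halton points, the parameter ranges, and the `fuel_temperature` response function are hypothetical stand-ins for runs of a real fuel-performance code such as ENIGMA.

```python
import statistics

def halton(i, base):
    # van der Corput radical inverse in the given base: the i-th point of a
    # one-dimensional low-discrepancy (quasi-random) sequence in [0, 1).
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def fuel_temperature(gap_um, power_kw_per_m):
    # Hypothetical stand-in for one fuel-performance code calculation;
    # returns a rough centreline temperature in deg C.
    return 300.0 + 45.0 * power_kw_per_m + 2.0 * gap_um

N = 2048
temps = []
for i in range(1, N + 1):
    u1, u2 = halton(i, 2), halton(i, 3)   # quasi-random point in [0,1)^2
    gap = 40.0 + 60.0 * u1                # fabrication gap: 40-100 um (assumed)
    power = 10.0 + 20.0 * u2              # local power: 10-30 kW/m (assumed)
    temps.append(fuel_temperature(gap, power))

print(round(statistics.mean(temps), 1), round(max(temps), 1))
```

Replacing the Halton points with pseudo-random draws gives the plain Monte Carlo variant; the quasi-random version typically covers the parameter space more evenly for the same number of code runs.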
Probabilistic Fatigue Damage Program (FATIG)
Michalopoulos, Constantine
2012-01-01
FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by two methods: (a) the traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) the classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule, with the integration carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule, computing fatigue life from all stress amplitudes up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
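The "integral version of Miner's rule" has a well-known closed form for a narrow-band process whose stress amplitudes are Rayleigh-distributed with scale equal to the stress rms: with an S-N curve N(S) = C*S^(-m), the damage is D = n * (sqrt(2)*sigma)^m * Gamma(1 + m/2) / C. The sketch below is not the FATIG implementation; it merely cross-checks that closed form against direct numerical integration, with purely illustrative parameter values.

```python
import math

def damage_closed_form(sigma, n_cycles, C, m):
    # Palmgren-Miner damage for Rayleigh-distributed amplitudes (scale = rms):
    # E[S^m] = (sigma*sqrt(2))**m * Gamma(1 + m/2), so D = n * E[S^m] / C.
    return n_cycles * (sigma * math.sqrt(2.0)) ** m * math.gamma(1.0 + m / 2.0) / C

def damage_numeric(sigma, n_cycles, C, m, s_max_sigmas=12.0, steps=200000):
    # Same quantity by direct integration of the Rayleigh pdf over amplitudes.
    ds = s_max_sigmas * sigma / steps
    total = 0.0
    for k in range(1, steps):
        s = k * ds
        pdf = (s / sigma ** 2) * math.exp(-s * s / (2.0 * sigma ** 2))
        total += pdf * s ** m * ds
    return n_cycles * total / C

sigma, n, C, m = 50.0, 1.0e6, 1.0e12, 3.0   # illustrative values only
d_cf = damage_closed_form(sigma, n, C, m)
d_num = damage_numeric(sigma, n, C, m)
print(d_cf, d_num)
```

Truncating the integral at 3*sigma instead of integrating over all amplitudes reproduces the difference between methods (a) and (b) described above.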
Probabilistic cloning of equidistant states
International Nuclear Information System (INIS)
Jimenez, O.; Roa, Luis; Delgado, A.
2010-01-01
We study the probabilistic cloning of equidistant states. These states are such that the inner product between them is a complex constant or its conjugate. Thereby, it is possible to study their cloning in a simple way. In particular, we are interested in the behavior of the cloning probability as a function of the phase of the overlap among the involved states. We show that for certain families of equidistant states Duan and Guo's cloning machine leads to cloning probabilities lower than the optimal unambiguous discrimination probability of equidistant states. We propose an alternative cloning machine whose cloning probability is higher than or equal to the optimal unambiguous discrimination probability for any family of equidistant states. Both machines achieve the same probability for equidistant states whose inner product is a positive real number.
INSPIRE from the JRC Point of View
Directory of Open Access Journals (Sweden)
Vlado Cetl
2012-12-01
Full Text Available This paper summarises some recent developments in INSPIRE implementation from the JRC (Joint Research Centre point of view. The INSPIRE process started around 11 years ago and today, clear results and benefits can be seen. Spatial data are more accessible and shared more frequently between countries and at the European level. In addition to this, efficient, unified coordination and collaboration between different stakeholders and participants has been achieved, which is another great success. The JRC, as a scientific think-tank of the European Commission, has played a very important role in this process from the very beginning. This role is in line with its mission, which is to provide customer-driven scientific and technical support for the conception, development, implementation and monitoring of European Union (EU policies. The JRC acts as the overall technical coordinator of INSPIRE, but it also carries out the activities necessary to support the coherent implementation of INSPIRE, by helping member states in the implementation process. Experiences drawn from collaboration and negotiation in each country and at the European level will be of great importance in the revision of the INSPIRE Directive, which is envisaged for 2014. Keywords: spatial data infrastructure (SDI; INSPIRE; development; Joint Research Centre (JRC
Probabilistic safety assessment - regulatory perspective
International Nuclear Information System (INIS)
Solanki, R.B.; Paul, U.K.; Hajra, P.; Agarwal, S.K.
2002-01-01
Full text: Nuclear power plants (NPPs) have been designed, constructed and operated mainly on the basis of deterministic safety analysis philosophy. In this approach, a substantial amount of safety margin is incorporated in the design and operational requirements. Additional margin is incorporated by applying the highest-quality engineering codes, standards and practices, and the concept of defence-in-depth in design and operating procedures, and by including conservative assumptions and acceptance criteria in the plant response analysis of postulated initiating events (PIEs). However, as the probabilistic approach has been improved and refined over the years, it is possible for the designer, operator and regulator to get a more detailed and realistic picture of the safety importance of plant design features, operating procedures and operational practices by using probabilistic safety assessment (PSA) along with the deterministic methodology. At present, many countries including the USA, UK and France use PSA insights in their decision-making along with the deterministic basis. India has also made substantial progress in the development of methods for carrying out PSA. However, consensus on the use of PSA in regulatory decision-making has not yet been achieved. This paper emphasises the requirements (e.g., level of detail, key modelling assumptions, data, modelling aspects, success criteria, sensitivity and uncertainty analysis) for improving the quality and consistency of the performance and use of PSA, which can facilitate meaningful use of PSA insights in regulatory decision-making in India. It also provides relevant information on the international scenario and various application areas of PSA, along with the progress made in India. The PSA perspective presented in this paper may help in achieving consensus on the use of PSA for regulatory/utility decision-making in the design and operation of NPPs
Guenther Dissertori
The time period between the last CMS week and this June was one of intense activity, with numerous get-togethers targeted at addressing specific issues on the road to data-taking. The two series of workshops, namely the “En route to discoveries” series and the “Vertical Integration” meetings, continued. The first meeting of the “En route to discoveries” sequence (end 2007) had covered the measurements of the Standard Model signals as a necessary prerequisite to any claim of signals beyond the Standard Model. The second meeting took place during the Feb CMS week and concentrated on the commissioning of the Physics Objects, whereas the third occurred during the April Physics Week – and this time the theme was the strategy for key new physics signatures. Both of these workshops are summarized below. The vertical integration meetings also continued, with two DPG-physics get-togethers on jets and missing ET and on electrons and photons. ...
Chris Hill
2012-01-01
The months that have passed since the last CMS Bulletin have been a very busy and exciting time for CMS physics. We have gone from observing the very first 8 TeV collisions produced by the LHC to collecting a dataset of collisions that already exceeds that recorded in all of 2011. All in just a few months! Meanwhile, the analysis of the 2011 dataset and publication of the subsequent results has continued. These results come from all the PAGs in CMS, including: searches for the Higgs boson and other new phenomena, which have set the most stringent limits on an ever-increasing number of models of physics beyond the Standard Model including dark matter, Supersymmetry, and TeV-scale gravity scenarios; top-quark physics, where CMS has overtaken the Tevatron in the precision of some measurements; and bottom-quark physics, where CMS made its first discovery of a new particle, the Ξ*0b baryon (candidate event pictured below in Image 2). At the same time POGs and PAGs...
D. Acosta
2011-01-01
Since the last CMS Week, all physics groups have been extremely active on analyses based on the full 2010 dataset, with most aiming for a preliminary measurement in time for the winter conferences. Nearly 50 analyses were approved in a “marathon” of approval meetings during the first two weeks of March, and the total number of approved analyses reached 90. The diversity of topics is very broad, including precision QCD, Top, and electroweak measurements, the first observation of single Top production at the LHC, the first limits on Higgs production at the LHC including the di-tau final state, and comprehensive searches for new physics in a wide range of topologies (so far all with null results unfortunately). Most of the results are based on the full 2010 pp data sample, which corresponds to 36 pb-1 at √s = 7 TeV. This report can only give a few of the highlights of a very rich physics program, which is listed below by physics group...
Towards New Probabilistic Assumptions in Business Intelligence
Directory of Open Access Journals (Sweden)
Schumann Andrew
2015-01-01
Full Text Available One of the main assumptions of mathematical tools in science is the idea of the measurability and additivity of reality. For discovering the physical universe, additive measures such as mass, force, energy, temperature, etc. are used. Economics and conventional business intelligence try to continue this empiricist tradition, and their statistical and econometric tools appeal only to the measurable aspects of reality. However, many important variables of economic systems are not observable or additive in principle. These variables can be called symbolic values or symbolic meanings and studied within symbolic interactionism, the theory developed by George Herbert Mead and Herbert Blumer. In the statistical and econometric tools of business intelligence we accept only phenomena with causal connections measured by additive measures. In this paper we show that in the social world we deal with symbolic interactions which can be studied by non-additive labels (symbolic meanings or symbolic values). For accepting the variety of such phenomena we should avoid additivity of basic labels and construct a new probabilistic method in business intelligence based on non-Archimedean probabilities.
Probabilistic finite element modeling of waste rollover
International Nuclear Information System (INIS)
Khaleel, M.A.; Cofer, W.F.; Al-fouqaha, A.A.
1995-09-01
Stratification of the wastes in many Hanford storage tanks has resulted in sludge layers which are capable of retaining gases formed by chemical and/or radiolytic reactions. As the gas is produced, the mechanisms of gas storage evolve until the resulting buoyancy in the sludge leads to instability, at which point the sludge "rolls over" and a significant volume of gas is suddenly released. Because the releases may contain flammable gases, these episodes of release are potentially hazardous. Mitigation techniques are desirable for more controlled releases at more frequent intervals. To aid the mitigation efforts, a methodology for predicting sludge rollover at specific times is desired. This methodology would then provide a rational basis for the development of a schedule for the mitigation procedures. In addition, a knowledge of the sensitivity of the sludge rollovers to various physical and chemical properties within the tanks would provide direction for efforts to reduce the frequency and severity of these events. In this report, the use of probabilistic finite element analyses for computing the probability of rollover and the sensitivity of rollover probability to various parameters is described
Probabilistic analysis of extreme wind events
Energy Technology Data Exchange (ETDEWEB)
Chaviaropoulos, P.K. [Center for Renewable Energy Sources (CRES), Pikermi Attikis (Greece)]
1997-12-31
A vital task in wind engineering and meteorology is to understand, measure, analyse and forecast extreme wind conditions, due to their significant effects on human activities and installations like buildings, bridges or wind turbines. The latest version of the IEC standard (1996) pays particular attention to the extreme wind events that have to be taken into account when designing or certifying a wind generator. Actually, the extreme wind events within a 50-year period are those which determine the "static" design of most of the wind turbine components. The extremes which are important for the safety of wind generators are those associated with the so-called "survival wind speed", the extreme operating gusts and the extreme wind direction changes. A probabilistic approach for the analysis of these events is proposed in this paper. Emphasis is put on establishing the relation between extreme values and physically meaningful "site calibration" parameters, like the probability distribution of the annual wind speed, turbulence intensity and power spectral properties. (Author)
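One common concrete form of such an analysis is an extreme-value (Gumbel) fit to annual wind-speed maxima, from which the 50-year "survival wind speed" is read off as a return level. The sketch below uses a simple method-of-moments fit and fabricated site data; it illustrates the idea rather than the paper's specific method.

```python
import math
import random
import statistics

def gumbel_fit_moments(maxima):
    # Method-of-moments Gumbel fit: scale from the sample standard deviation,
    # location from the sample mean and the Euler-Mascheroni constant.
    euler_gamma = 0.5772156649015329
    beta = statistics.stdev(maxima) * math.sqrt(6.0) / math.pi
    mu = statistics.mean(maxima) - euler_gamma * beta
    return mu, beta

def return_level(mu, beta, T_years):
    # Wind speed exceeded on average once every T years (annual-maximum basis).
    return mu - beta * math.log(-math.log(1.0 - 1.0 / T_years))

# Hypothetical annual-maximum wind speeds (m/s) at a candidate site.
random.seed(1)
annual_maxima = [22.0 + random.gauss(0.0, 2.5) for _ in range(30)]

mu, beta = gumbel_fit_moments(annual_maxima)
v50 = return_level(mu, beta, 50.0)
print(round(v50, 1))
```

Maximum-likelihood fitting, or a generalized extreme value distribution, would be the usual refinements when more data are available.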
Yunes, Nicolás; Kocsis, Bence; Loeb, Abraham; Haiman, Zoltán
2011-10-21
We study the effects of a thin gaseous accretion disk on the inspiral of a stellar-mass black hole into a supermassive black hole. We construct a phenomenological angular momentum transport equation that reproduces known disk effects. Disk torques modify the gravitational wave phase evolution to detectable levels with LISA for reasonable disk parameters. The Fourier transform of disk-modified waveforms acquires a correction with a different frequency trend than post-Newtonian vacuum terms. Such inspirals could be used to detect accretion disks with LISA and to probe their physical parameters. © 2011 American Physical Society
CERN - Six Decades of Science, Innovation, Cooperation, and Inspiration
Energy Technology Data Exchange (ETDEWEB)
Quigg, Chris [Fermi National Accelerator Lab. (FNAL), Batavia, IL (United States)
2014-09-01
The European Laboratory for Particle Physics, which straddles the Swiss-French border northwest of Geneva, celebrates its sixtieth birthday in 2014. CERN is the preeminent particle-physics institution in the world, currently emphasizing the study of collisions of protons and heavy nuclei at very high energies and the exploration of physics on the electroweak scale (energies where electromagnetism and the weak nuclear force merge). With brilliant accomplishments in research, innovation, and education, and a sustained history of cooperation among people from different countries and cultures, CERN ranks as one of the signal achievements of the postwar European Project. For physicists the world over, the laboratory is a source of pride and inspiration.
Probabilistic machine learning and artificial intelligence.
Ghahramani, Zoubin
2015-05-28
How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.
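As a minimal illustration of "representing and manipulating uncertainty about models and predictions", here is the textbook conjugate Beta-binomial update; the prior and the data are arbitrary examples, not anything specific to this Review.

```python
def beta_update(alpha, beta, successes, failures):
    # Conjugate Bayesian update: Beta(alpha, beta) prior + binomial data
    # -> Beta(alpha + successes, beta + failures) posterior.
    return alpha + successes, beta + failures

def beta_mean(alpha, beta):
    return alpha / (alpha + beta)

def beta_var(alpha, beta):
    return alpha * beta / ((alpha + beta) ** 2 * (alpha + beta + 1.0))

# Uniform prior over an unknown success rate, then observe 7 successes in 10 trials.
a0, b0 = 1.0, 1.0
a1, b1 = beta_update(a0, b0, 7, 3)

print(round(beta_mean(a1, b1), 3), beta_var(a1, b1) < beta_var(a0, b0))
```

The posterior mean moves toward the observed rate while the variance shrinks, which is exactly the "uncertainty manipulation" the probabilistic framework formalizes at scale.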
Probabilistic assessment of nuclear safety and safeguards
International Nuclear Information System (INIS)
Higson, D.J.
1987-01-01
Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic methods have become standard tools in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)
A History of Probabilistic Inductive Logic Programming
Directory of Open Access Journals (Sweden)
Fabrizio eRiguzzi
2014-09-01
Full Text Available The field of Probabilistic Logic Programming (PLP) has seen significant advances in the last 20 years, with many proposals for languages that combine probability with logic programming. Since the start, the problem of learning probabilistic logic programs has been the focus of much attention. Learning these programs represents a whole subfield of Inductive Logic Programming (ILP). In Probabilistic ILP (PILP) two problems are considered: learning the parameters of a program given the structure (the rules) and learning both the structure and the parameters. Usually structure learning systems use parameter learning as a subroutine. In this article we present an overview of PILP and discuss the main results.
PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS
Tsumagari, Norihiro
2012-01-01
This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.
A convergence theory for probabilistic metric spaces | Jäger ...
African Journals Online (AJOL)
We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...
Darin Acosta
2010-01-01
The collisions last year at 900 GeV and 2.36 TeV provided the long anticipated collider data to the CMS physics groups. Quite a lot has been accomplished in a very short time. Although the delivered luminosity was small, CMS was able to publish its first physics paper (with several more in preparation), and commence the commissioning of physics objects for future analyses. Many new performance results have been approved in advance of this CMS Week. One remarkable outcome has been the amazing agreement between out-of-the-box data and simulation at these low energies so early in the commissioning of the experiment. All of this is testament to the hard work and preparation conducted beforehand by many people in CMS. These analyses could not have happened without the dedicated work of the full collaboration on building and commissioning the detector, computing, and software systems, combined with the tireless work of many to collect, calibrate and understand the data and our detector. To facilitate the efficien...
D. Acosta
2010-01-01
The Physics Groups are actively engaged on analyses of the first data from the LHC at 7 TeV, targeting many results for the ICHEP conference taking place in Paris this summer. The first large batch of physics approvals is scheduled for this CMS Week, to be followed by four more weeks of approvals and analysis updates leading to the start of the conference in July. Several high priority analysis areas were organized into task forces to ensure sufficient coverage from the relevant detector, object, and analysis groups in the preparation of these analyses. Already some results on charged particle correlations and multiplicities in 7 TeV minimum bias collisions have been approved. Only one small detail remains before ICHEP: further integrated luminosity delivered by the LHC! Beyond the Standard Model measurements that can be done with these data, the focus changes to the search for new physics at the TeV scale and for the Higgs boson in the period after ICHEP. Particle Flow The PFT group is focusing on the ...
the PAG conveners
2011-01-01
The delivered LHC integrated luminosity of more than 1 inverse femtobarn by summer and more than 5 by the end of 2011 has been a gold mine for the physics groups. With 2011 data, we have submitted or published 14 papers, 7 others are in collaboration-wide review, and 75 Physics Analysis Summaries have been approved already. They add to the 73 papers already published based on the 2010 and 2009 datasets. Highlights from each physics analysis group are described below. Heavy ions Many important results have been obtained from the first lead-ion collision run in 2010. The published measurements include the first ever indications of Υ excited state suppression (PRL synopsis), long-range correlation in PbPb, and track multiplicity over a wide η range. Preliminary results include the first ever measurement of isolated photons (showing no modification), J/ψ suppression including the separation of the non-prompt component, further study of jet fragmentation, nuclear modification factor...
L. Demortier
Physics-wise, the CMS week in December was dominated by discussions of the analyses that will be carried out in the “next six months”, i.e. while waiting for the first LHC collisions. As presented in December, analysis approvals based on Monte Carlo simulation were re-opened, with the caveat that for this work to be helpful to the goals of CMS, it should be carried out using the new software (CMSSW_2_X) and associated samples. By the end of the week, the goal for the physics groups was set to be the porting of our physics commissioning methods and plans, as well as the early analyses (based an integrated luminosity in the range 10-100pb-1) into this new software. Since December, the large data samples from CMSSW_2_1 were completed. A big effort by the production group gave a significant number of events over the end-of-year break – but also gave out the first samples with the fast simulation. Meanwhile, as mentioned in December, the arrival of 2_2 meant that ...
C. Hill
2012-01-01
2012 has started off as a very busy year for the CMS Physics Groups. Planning for the upcoming higher luminosity/higher energy (8 TeV) operation of the LHC and relatively early Rencontres de Moriond are the high-priority activities for the group at the moment. To be ready for the coming 8-TeV data, CMS has made a concerted effort to perform and publish analyses on the 5 fb−1 dataset recorded in 2011. This has resulted in the submission of 16 papers already, including nine on the search for the Higgs boson. In addition, a number of preliminary results on the 2011 dataset have been released to the public. The Exotica and SUSY groups approved several searches for new physics in January, such as searches for W′ and exotic highly ionising particles. These were highlighted at a CERN seminar given on 24th January. Many more analyses, from all the PAGs, including the newly formed SMP (Standard Model Physics) and FSQ (Forward and Small-x QCD), were approved in February. The ...
C. Hill
2012-01-01
The period since the last CMS Bulletin has been historic for CMS Physics. The pinnacle of our physics programme was an observation of a new particle – a strong candidate for a Higgs boson – which has captured worldwide interest and made a profound impact on the very field of particle physics. At the time of the discovery announcement on 4 July, 2012, prominent signals were observed in the high-resolution H→γγ and H→ZZ(4l) modes. Corroborating excess was observed in the H→W+W– mode as well. The fermionic channel analyses (H→bb, H→ττ), however, yielded less than the Standard Model (SM) expectation. Collectively, the five channels established the signal with a significance of five standard deviations. With the exception of the diphoton channel, these analyses have all been updated in the last months and several new channels have been added. With improved analyses and more than twice the i...
Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems
Bernardo, Marco; Miculan, Marino
2016-01-01
Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still being not necessary. To this e...
On synchronous parallel computations with independent probabilistic choice
International Nuclear Information System (INIS)
Reif, J.H.
1984-01-01
This paper introduces probabilistic choice to synchronous parallel machine models; in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time, space, and processor bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity
The cascade probabilistic functions and the Markov's processes. Chapter 1
International Nuclear Information System (INIS)
2003-01-01
In Chapter 1 the physical and mathematical descriptions of radiation processes are given. The relation of the cascade probabilistic functions (CPF) to Markov chains is shown. CPF calculations for electrons, taking energy losses into account, are presented; the calculations were carried out on a computer. The contributions of energy losses to the CPFs and to the radiation defect concentration are estimated. In addition, calculations of primary knock-on atoms and radiation defects under electron irradiation, using the CPF with energy losses taken into account, are presented
Multiobjective optimal allocation problem with probabilistic non ...
African Journals Online (AJOL)
user
The probabilistic non-linear cost constraint is converted into equivalent deterministic .... Further, in a survey the costs for enumerating a character in various strata are not known exactly, rather these are being ...... Naval Research Logistics, Vol.
Probabilistic Meteorological Characterization for Turbine Loads
DEFF Research Database (Denmark)
Kelly, Mark C.; Larsen, Gunner Chr.; Dimitrov, Nikolay Krasimirov
2014-01-01
Beyond the existing, limited IEC prescription to describe fatigue loads on wind turbines, we look towards probabilistic characterization of the loads via analogous characterization of the atmospheric flow, particularly for today's "taller" turbines with rotors well above the atmospheric surface...
Probabilistic composition of preferences, theory and applications
Parracho Sant'Anna, Annibal
2015-01-01
Putting forward a unified presentation of the features and possible applications of probabilistic preferences composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis – together with explanations about the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.
Advanced Test Reactor probabilistic risk assessment
International Nuclear Information System (INIS)
Atkinson, S.A.; Eide, S.A.; Khericha, S.T.; Thatcher, T.A.
1993-01-01
This report discusses Level 1 probabilistic risk assessment (PRA) incorporating a full-scope external events analysis which has been completed for the Advanced Test Reactor (ATR) located at the Idaho National Engineering Laboratory
Probabilistic safety assessment for seismic events
International Nuclear Information System (INIS)
1993-10-01
This Technical Document on Probabilistic Safety Assessment for Seismic Events is mainly associated with the Safety Practice on Treatment of External Hazards in PSA and discusses in detail one specific external hazard, i.e. earthquakes
Estimating software development project size, using probabilistic ...
African Journals Online (AJOL)
Estimating software development project size, using probabilistic techniques. ... of managing the size of software development projects by Purchasers (Clients) and Vendors (Development ...
Comparing Categorical and Probabilistic Fingerprint Evidence.
Garrett, Brandon; Mitchell, Gregory; Scurich, Nicholas
2018-04-23
Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury-eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities. © 2018 American Academy of Forensic Sciences.
Probabilistic methods in exotic option pricing
Anderluh, J.H.M.
2007-01-01
The thesis presents three ways of calculating the Parisian option price as an illustration of probabilistic methods in exotic option pricing. Moreover, options on commodities are considered, as well as double-sided barrier options in a compound Poisson framework.
Non-unitary probabilistic quantum computing
Gingrich, Robert M.; Williams, Colin P.
2004-01-01
We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.
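The core idea of probabilistic non-unitary computation can be illustrated with a unitary dilation: a non-unitary contraction is embedded in a larger unitary, and a measurement on an ancilla applies it with some success probability. The sketch below is a minimal illustration of that idea, not the paper's circuit construction; the operator M and the input state are invented for the example.

```python
import numpy as np

# Minimal sketch (assumed example, not the paper's construction): embed a
# non-unitary contraction M (||M|| <= 1) into a unitary U on a doubled space.
M = np.diag([1.0, 0.5])            # target non-unitary operation (assumed)
D = np.sqrt(np.eye(2) - M @ M)     # defect operator; simple since M is diagonal

# Unitary dilation U = [[M, D], [D, -M]]; valid here because M and D commute.
U = np.block([[M, D], [D, -M]])

psi = np.array([1.0, 1.0]) / np.sqrt(2)          # input state
state = U @ np.concatenate([psi, np.zeros(2)])   # ancilla starts in |0>

# If the ancilla is measured and found still in |0>, the system collapses to
# M|psi> / ||M|psi>||; the success probability is ||M|psi>||^2.
p_success = np.linalg.norm(state[:2]) ** 2
print(f"success probability = {p_success:.4f}")
```

Measuring the ancilla post-selects on the top block, which is exactly M applied to the input; the analytic success probability and fidelity expressions in the paper refine this picture.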
A logic for inductive probabilistic reasoning
DEFF Research Database (Denmark)
Jaeger, Manfred
2005-01-01
Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework ...
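The direct-inference pattern and its generalisation by Jeffrey's rule fit in a few lines of arithmetic. In this sketch only the 0.7 comes from the abstract's example; the other numbers are invented for illustration.

```python
# Direct inference: from "70% of As are Bs" and "a is an A", assign
# P(B(a)) = 0.7. Jeffrey's rule handles the case where the evidence
# "a is an A" is itself uncertain.
p_b_given_a = 0.7        # statistical background: 70% of As are Bs
p_b_given_not_a = 0.2    # assumed rate among non-As (illustrative)
p_a = 0.9                # subjective probability that a is an A

# Jeffrey conditioning: mix the conditionals by the evidence probability.
p_b = p_a * p_b_given_a + (1 - p_a) * p_b_given_not_a
print(f"P(B(a)) = {p_b:.2f}")  # 0.65
```

When p_a = 1 this reduces to plain direct inference; cross-entropy minimization generalises the same idea to constraints over whole distributions.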
Do probabilistic forecasts lead to better decisions?
Directory of Open Access Journals (Sweden)
M. H. Ramos
2013-06-01
Full Text Available The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.
The enigma of probability and physics
International Nuclear Information System (INIS)
Mayants, L.
1984-01-01
This volume contains a coherent exposition of the elements of two unique sciences: probabilistics (science of probability) and probabilistic physics (application of probabilistics to physics). Proceeding from a key methodological principle, it starts with the disclosure of the true content of probability and the interrelation between probability theory and experimental statistics. This makes it possible to introduce a proper order in all the sciences dealing with probability and, by conceiving the real content of statistical mechanics and quantum mechanics in particular, to construct both as two interconnected domains of probabilistic physics. Consistent theories of kinetics of physical transformations, decay processes, and intramolecular rearrangements are also outlined. The interrelation between the electromagnetic field, photons, and the theoretically discovered subatomic particle 'emon' is considered. Numerous internal imperfections of conventional probability theory, statistical physics, and quantum physics are exposed and removed - quantum physics no longer needs special interpretation. EPR, Bohm, and Bell paradoxes are easily resolved, among others. (Auth.)
Risk assessment using probabilistic standards
International Nuclear Information System (INIS)
Avila, R.
2004-01-01
A core element of risk is uncertainty represented by plural outcomes and their likelihood. No risk exists if the future outcome is uniquely known and hence guaranteed. The probability that we will die some day is equal to 1, so there would be no fatal risk if a sufficiently long time frame is assumed. Equally, rain risk does not exist if there was 100% assurance of rain tomorrow, although there would be other risks induced by the rain. In a formal sense, any risk exists if, and only if, more than one outcome is expected at a future time interval. In any practical risk assessment we have to deal with uncertainties associated with the possible outcomes. One way of dealing with the uncertainties is to be conservative in the assessments. For example, we may compare the maximal exposure to a radionuclide with a conservatively chosen reference value. In this case, if the exposure is below the reference value then it is possible to assure that the risk is low. Since single values are usually compared, this approach is commonly called 'deterministic'. Its main advantage lies in its simplicity and minimal information requirements. However, problems arise when the reference values are actually exceeded or might be exceeded, as in the case of potential exposures, and when the costs for realizing the reference values are high. In those cases, the lack of knowledge on the degree of conservatism involved impairs a rational weighing of the risks against other interests. In this presentation we will outline an approach for dealing with uncertainties that in our opinion is more consistent. We will call it a 'fully probabilistic risk assessment'. The essence of this approach consists in measuring the risk in terms of probabilities, where the latter are obtained from comparison of two probabilistic distributions, one reflecting the uncertainties in the outcomes and one reflecting the uncertainties in the reference value (standard) used for defining adverse outcomes. Our first aim
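The two-distribution comparison described above can be sketched with a short Monte Carlo experiment. The lognormal distributions and all parameter values below are illustrative assumptions, not figures from the assessment.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Assumed distributions: uncertain exposure and an uncertain reference
# standard, both lognormal (parameters invented for illustration).
exposure = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=n)
standard = rng.lognormal(mean=np.log(1.0), sigma=0.2, size=n)

# Fully probabilistic risk: probability that the exposure exceeds the standard.
risk = np.mean(exposure > standard)
print(f"P(exposure > standard) = {risk:.3f}")

# Deterministic counterpart: compare single conservative values instead.
deterministic_ok = np.percentile(exposure, 95) < np.percentile(standard, 5)
```

In this toy setup the conservative single-value screen fails (`deterministic_ok` is False) even though the probabilistic risk is only a few percent, which is exactly the kind of hidden conservatism the abstract argues against.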
New probabilistic interest measures for association rules
Hahsler, Michael; Hornik, Kurt
2008-01-01
Mining association rules is an important technique for discovering meaningful patterns in transaction databases. Many different measures of interestingness have been proposed for association rules. However, these measures fail to take the probabilistic properties of the mined data into account. In this paper, we start with presenting a simple probabilistic framework for transaction data which can be used to simulate transaction data when no associations are present. We use such data and a rea...
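The probabilistic null model sketched in the abstract, transaction data with no true associations, can be simulated directly. The item probabilities and the example rule below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n_transactions, n_items = 50_000, 4

# Null model: items occur independently with fixed marginal probabilities,
# so any apparent association is a sampling artefact.
p_item = np.array([0.4, 0.3, 0.2, 0.1])          # assumed item probabilities
data = rng.random((n_transactions, n_items)) < p_item

# Classic interest measures for the rule {item0} -> {item1}:
supp_a = data[:, 0].mean()
supp_ab = (data[:, 0] & data[:, 1]).mean()
confidence = supp_ab / supp_a
lift = confidence / data[:, 1].mean()
print(f"confidence={confidence:.3f}  lift={lift:.3f}")
```

Under independence the lift concentrates around 1, so the spread of such null-model estimates gives a baseline against which measures computed on real transaction data can be judged.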
Semantics of probabilistic processes an operational approach
Deng, Yuxin
2015-01-01
This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, and is intended to be accessible to postgraduate students in Computer Science and Mathematics. It can also be us
Probabilistic cloning of three symmetric states
International Nuclear Information System (INIS)
Jimenez, O.; Bergou, J.; Delgado, A.
2010-01-01
We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.
Probabilistic Analysis Methods for Hybrid Ventilation
DEFF Research Database (Denmark)
Brohus, Henrik; Frier, Christian; Heiselberg, Per
This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.
J. D'Hondt
The Electroweak and Top Quark Workshop (16-17th of July) A Workshop on Electroweak and Top Quark Physics, dedicated to early measurements, took place on 16th-17th July. We had more than 40 presentations at the Workshop, which was an important milestone for 2007 physics analyses in the EWK and TOP areas. The Standard Model has been tested empirically by many previous experiments. Observables which are nowadays known with high precision will play a major role for data-based CMS calibrations. A typical example is the use of the Z to monitor electron and muon reconstruction in di-lepton inclusive samples. Another example is the use of the W mass as a constraint for di-jets in the kinematic fitting of top-quark events, providing information on the jet energy scale. The predictions of the Standard Model, for what concerns proton collisions at the LHC, are accurate to a level that the production of W/Z and top-quark events can be used as a powerful tool to commission our experiment. On the other hand the measure...
Christopher Hill
2013-01-01
Since the last CMS Bulletin, the CMS Physics Analysis Groups have completed more than 70 new analyses, many of which are based on the complete Run 1 dataset. In parallel the Snowmass whitepaper on projected discovery potential of CMS for HL-LHC has been completed, while the ECFA HL-LHC future physics studies have been summarised in a report and nine published benchmark analyses. Run 1 summary studies on b-tag and jet identification, quark-gluon discrimination and boosted topologies have been documented in BTV-13-001 and JME-13-002/005/006, respectively. The new tracking alignment and performance papers are being prepared for submission as well. The Higgs analysis group produced several new results including the search for ttH with H decaying to ZZ, WW, ττ+bb (HIG-13-019/020) where an excess of ~2.5σ is observed in the like-sign di-muon channel, and new searches for high-mass Higgs bosons (HIG-13-022). Searches for invisible Higgs decays have also been performed, both using the associ...
C. Hill
2013-01-01
In the period since the last CMS Bulletin, the LHC – and CMS – have entered LS1. During this time, CMS Physics Analysis Groups have performed more than 40 new analyses, many of which are based on the complete 8 TeV dataset delivered by the LHC in 2012 (and in some cases on the full Run 1 dataset). These results were shown at, and well received by, several high-profile conferences in the spring of 2013, including the inaugural meeting of the Large Hadron Collider Physics Conference (LHCP) in Barcelona, and the 26th International Symposium on Lepton Photon Interactions at High Energies (LP) in San Francisco. In parallel, there have been significant developments in preparations for Run 2 of the LHC and on “future physics” studies for both Phase 1 and Phase 2 upgrades of the CMS detector. The Higgs analysis group produced five new results for LHCP including a new H-to-bb search in VBF production (HIG-13-011), ttH with H to γ&ga...
C. Hill
2013-01-01
The period since the last CMS bulletin has seen the end of proton collisions at a centre-of-mass energy 8 TeV, a successful proton-lead collision run at 5 TeV/nucleon, as well as a “reference” proton run at 2.76 TeV. With these final LHC Run 1 datasets in hand, CMS Physics Analysis Groups have been busy analysing these data in preparation for the winter conferences. Moreover, despite the fact that the pp run only concluded in mid-December (and there was consequently less time to complete data analyses), CMS again made a strong showing at the Rencontres de Moriond in La Thuile (EW and QCD) where nearly 40 new results were presented. The highlight of these preliminary results was the eagerly anticipated updated studies of the properties of the Higgs boson discovered in July of last year. Meanwhile, preparations for Run 2 and physics performance studies for Phase 1 and Phase 2 upgrade scenarios are ongoing. The Higgs analysis group produced updated analyses on the full Run 1 dataset (~25 f...
Probabilistic causality and radiogenic cancers
International Nuclear Information System (INIS)
Groeer, P.G.
1986-01-01
A review and scrutiny of the literature on probability and probabilistic causality shows that it is possible under certain assumptions to estimate the probability that a certain type of cancer diagnosed in an individual exposed to radiation prior to diagnosis was caused by this exposure. Diagnosis of this causal relationship, like diagnosis of any disease - malignant or not - always requires some subjective judgments by the diagnostician. It is, therefore, illusory to believe that tables based on actuarial data can provide objective estimates of the chance that a cancer diagnosed in an individual is radiogenic. It is argued that such tables can only provide a base from which the diagnostician(s) deviate in one direction or the other according to his (their) individual (consensual) judgment. Acceptance of a physician's diagnostic judgment by patients is commonplace. Similar widespread acceptance of expert judgment by claimants in radiation compensation cases does not presently exist. Judicious use of the present radioepidemiological tables prepared by the Working Group of the National Institutes of Health or of updated future versions of similar tables may improve the situation. 20 references
Dynamical systems probabilistic risk assessment
Energy Technology Data Exchange (ETDEWEB)
Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ames, Arlo Leroy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)
2014-03-01
Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.
Computing Distances between Probabilistic Automata
Directory of Open Access Journals (Sweden)
Mathieu Tracol
2011-07-01
Full Text Available We present relaxed notions of simulation and bisimulation on Probabilistic Automata (PA), that allow some error epsilon. When epsilon is zero we retrieve the usual notions of bisimulation and simulation on PAs. We give logical characterisations of these notions by choosing suitable logics which differ from the elementary ones, L with negation and L without negation, by the modal operator. Using flow networks, we show how to compute the relations in PTIME. This allows the definition of an efficiently computable non-discounted distance between the states of a PA. A natural modification of this distance is introduced, to obtain a discounted distance, which weakens the influence of long term transitions. We compare our notions of distance to others previously defined and illustrate our approach on various examples. We also show that our distance is not expansive with respect to process algebra operators. Although L without negation is a suitable logic to characterise epsilon-(bi)simulation on deterministic PAs, it is not for general PAs; interestingly, we prove that it does characterise weaker notions, called a priori epsilon-(bi)simulation, which we prove to be NP-hard to decide.
Probabilistic modeling of children's handwriting
Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa
2013-12-01
There is little work done in the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and" written in cursive style as well as hand-print were extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining students who may continue to produce letter formations as taught during lessons in school, determining the students who will develop different letter formations or variations of those taught, and counting the number of distinct types of letter formations.
Distribution functions of probabilistic automata
Vatan, F.
2001-01-01
Each probabilistic automaton M over an alphabet A defines a probability measure Prob_M on the set of all finite and infinite words over A. We can identify a k letter alphabet A with the set {0, 1,..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable and the distribution function of M is defined as usual: F(x) := Prob_M{ w : X(w) ≤ x }. We study these distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new, and much easier, method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random composition of these two systems.
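The X(w) construction is easy to illustrate numerically. The sketch below is a toy example, not the paper's machinery: a one-state automaton over {0,1} emitting i.i.d. bits with probability p = 0.5 induces the uniform measure on [0, 1], so its distribution function is F(x) = x.

```python
import numpy as np

rng = np.random.default_rng(2)

def sample_X(p, n_words, depth=40):
    """Sample X(w) for words from a one-state automaton that emits 1 with
    probability p, truncating the binary expansion at `depth` bits."""
    bits = rng.random((n_words, depth)) < p
    weights = 0.5 ** np.arange(1, depth + 1)   # 2^-1, 2^-2, ...
    return bits @ weights                      # X(w) = sum_i w_i * 2^-i

# For p = 0.5 the induced measure is uniform, hence F(0.5) should be 0.5.
x_samples = sample_X(0.5, 200_000)
F_half = np.mean(x_samples <= 0.5)
print(f"F(0.5) ≈ {F_half:.3f}")
```

Biasing p away from 0.5 yields the classical singular (Cantor-like) distribution functions that the paper's continuity characterization addresses.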
Probabilistic transport models for fusion
International Nuclear Information System (INIS)
Milligen, B.Ph. van; Carreras, B.A.; Lynch, V.E.; Sanchez, R.
2005-01-01
A generalization of diffusive (Fickian) transport is considered, in which particle motion is described by probability distributions. We design a simple model that includes a critical mechanism to switch between two transport channels, and show that it exhibits various interesting characteristics, suggesting that the ideas of probabilistic transport might provide a framework for the description of a range of unusual transport phenomena observed in fusion plasmas. The model produces power degradation and profile consistency, as well as a scaling of the confinement time with system size reminiscent of the gyro-Bohm/Bohm scalings observed in fusion plasmas, and rapid propagation of disturbances. In the present work we show how this model may also produce on-axis peaking of the profiles with off-axis fuelling. It is important to note that the fluid limit of a simple model like this, characterized by two transport channels, does not correspond to the usual (Fickian) transport models commonly used for modelling transport in fusion plasmas, and behaves in a fundamentally different way. (author)
Prospects for probabilistic safety assessment
International Nuclear Information System (INIS)
Hirschberg, S.
1992-01-01
This article provides some reflections on future developments of Probabilistic Safety Assessment (PSA) in view of the present state of the art and evaluates current trends in the use of PSA for safety management. The main emphasis is on Level 1 PSA, although Level 2 aspects are also highlighted to some extent. As a starting point, the role of PSA is outlined from a historical perspective, demonstrating the rapid expansion of the uses of PSA. In this context the wide spectrum of PSA applications and the associated benefits to the users are in focus. It should be kept in mind, however, that PSA, in spite of its merits, is not a self-standing safety tool. It complements deterministic analysis and thus improves understanding and facilitates prioritization of safety issues. Significant progress in handling PSA limitations - such as reliability data, common-cause failures, human interactions, external events, accident progression, containment performance, and source-term issues - is described. This forms a background for expected future developments of PSA. Among the most important issues on the agenda for the future are PSA scope extensions, methodological improvements and computer code advancements, and full exploitation of the potential benefits of applications to operational safety management. Many PSA uses, if properly exercised, lead to safety improvements as well as major burden reductions. The article provides, in addition, International Atomic Energy Agency (IAEA) perspective on the topics covered, as reflected in the current PSA programs of the agency. 74 refs., 6 figs., 1 tab
2014-01-01
Just as the sunshine seems to have arrived back at CERN, in other respects summer is coming to a close as we say our farewells to this year’s crop of summer students. This injection of young people – always a welcome feature in July and August at CERN – dates back to the early 1960s, when the Summer Student programme began under one of my predecessors Vicky Weisskopf. The idea was to awaken the interest of undergraduates in CERN's activities by offering them the chance of hands-on experience during their long summer vacation. Around the same time, the CERN School of Physics began. Aimed at young postgraduates, it led to the current European School of High-Energy Physics and related schools in Latin America and the Asia-Pacific region. Over the years, it was joined by CERN schools on accelerator subjects and computing, which have expanded CERN’s training mandate. These days, our efforts begin with young people before they go to university &...
Nature-inspired computation in engineering
2016-01-01
This timely review book summarizes the state-of-the-art developments in nature-inspired optimization algorithms and their applications in engineering. Algorithms and topics include the overview and history of nature-inspired algorithms, discrete firefly algorithm, discrete cuckoo search, plant propagation algorithm, parameter-free bat algorithm, gravitational search, biogeography-based algorithm, differential evolution, particle swarm optimization and others. Applications include vehicle routing, swarming robots, discrete and combinatorial optimization, clustering of wireless sensor networks, cell formation, economic load dispatch, metamodeling, surrogate-assisted cooperative co-evolution, data fitting and reverse engineering as well as other case studies in engineering. This book will be an ideal reference for researchers, lecturers, graduates and engineers who are interested in nature-inspired computation, artificial intelligence and computational intelligence. It can also serve as a reference for relevant...
Biologically Inspired Micro-Flight Research
Raney, David L.; Waszak, Martin R.
2003-01-01
Natural fliers demonstrate a diverse array of flight capabilities, many of which are poorly understood. NASA has established a research project to explore and exploit flight technologies inspired by biological systems. One part of this project focuses on dynamic modeling and control of micro aerial vehicles that incorporate flexible wing structures inspired by natural fliers such as insects, hummingbirds and bats. With a vast number of potential civil and military applications, micro aerial vehicles represent an emerging sector of the aerospace market. This paper describes an ongoing research activity in which mechanization and control concepts for biologically inspired micro aerial vehicles are being explored. Research activities focusing on a flexible fixed- wing micro aerial vehicle design and a flapping-based micro aerial vehicle concept are presented.
Learning from nature: Nature-inspired algorithms
DEFF Research Database (Denmark)
Albeanu, Grigore; Madsen, Henrik; Popentiu-Vladicescu, Florin
2016-01-01
During the last decade, nature has inspired researchers to develop new algorithms. The largest collection of nature-inspired algorithms is biology-inspired: swarm intelligence (particle swarm optimization, ant colony optimization, cuckoo search, bees' algorithm, bat algorithm, firefly algorithm, etc.), genetic and evolutionary strategies, artificial immune systems, etc. Well-known examples of applications include: aircraft wing design, wind turbine design, the bionic car, the bullet train, optimal decisions related to traffic, and appropriate strategies to survive under a well-adapted immune system. Based on the collective social behaviour of organisms, researchers have developed optimization strategies taking into account not only individuals, but also groups and the environment. However, learning from nature, new classes of approaches can be identified, tested and compared against already available algorithms...
Biologically inspired technologies in NASA's morphing project
McGowan, Anna-Maria R.; Cox, David E.; Lazos, Barry S.; Waszak, Martin R.; Raney, David L.; Siochi, Emilie J.; Pao, S. Paul
2003-07-01
For centuries, biology has provided fertile ground for hypothesis, discovery, and inspiration. Time-tested methods used in nature are being used as a basis for several research studies conducted at the NASA Langley Research Center as a part of Morphing Project, which develops and assesses breakthrough vehicle technologies. These studies range from low drag airfoil design guided by marine and avian morphologies to soaring techniques inspired by birds and the study of small flexible wing vehicles. Biology often suggests unconventional yet effective approaches such as non-planar wings, dynamic soaring, exploiting aeroelastic effects, collaborative control, flapping, and fibrous active materials. These approaches and other novel technologies for future flight vehicles are being studied in NASA's Morphing Project. This paper will discuss recent findings in the aeronautics-based, biologically-inspired research in the project.
Hanea, A.M.; Nane, G.F.; Wielicki, B.A.; Cooke, R.M.
2018-01-01
Probabilistic thinking can often be unintuitive. This is the case even for simple problems, let alone the more complex ones arising in climate modelling, where disparate information sources need to be combined. The physical models, the natural variability of systems, the measurement errors and
Directory of Open Access Journals (Sweden)
Alkın Yurtkuran
2016-01-01
Full Text Available The artificial bee colony (ABC) algorithm is a popular swarm-based technique, inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of the ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA), to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between the old solution and the new candidate solution, worse candidate solutions have a probability to be accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process adaptively. Moreover, in order to improve the performance of the ABC and balance the intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinctive characters are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.
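The two ingredients described above can be sketched as small helper functions. The decay schedule, parameter values, and the search-equation probabilities below are assumptions for illustration, not the authors' exact settings.

```python
import numpy as np

rng = np.random.default_rng(3)

def accept(f_old, f_new, t, t_max, p0=0.3, power=2):
    """Solution acceptance rule (sketch): always accept a better candidate
    (minimisation); accept a worse one with a probability that decays
    nonlinearly from p0 toward 0 over the run."""
    if f_new <= f_old:
        return True
    p_worse = p0 * (1 - t / t_max) ** power
    return rng.random() < p_worse

def pick_search_equation(probs=(0.5, 0.3, 0.2)):
    """Probabilistic multisearch (sketch): choose one of three hypothetical
    search equations with predetermined probabilities."""
    return rng.choice(3, p=probs)
```

Early in the search, worse candidates are accepted roughly 30% of the time (diversification); by the final iteration the rule degenerates to greedy selection (intensification).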
An empirical system for probabilistic seasonal climate prediction
Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma
2016-04-01
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
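A toy version of such a regression-based probabilistic hindcast can be written in a few lines. Everything below is synthetic: the predictors, coefficients, and noise level are invented stand-ins for the CO2-equivalent trend and a large-scale mode of variability.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic seasonal temperature anomaly: CO2-driven trend + ENSO-like mode.
years = np.arange(1961, 2014)
co2 = 320 + 1.6 * (years - 1961)        # primary predictor (toy trend)
enso = np.sin(0.9 * years)              # large-scale mode (toy oscillation)
temp = 0.01 * (co2 - 320) + 0.3 * enso + rng.normal(0, 0.1, years.size)

# Multiple linear regression fitted on the early years only.
X = np.column_stack([np.ones(years.size), co2, enso])
train = years < 2000
coef, *_ = np.linalg.lstsq(X[train], temp[train], rcond=None)

# Deterministic hindcast plus a Gaussian spread from training residuals
# gives a simple probabilistic forecast.
pred = X @ coef
sigma = np.std(temp[train] - pred[train])
corr = np.corrcoef(pred[~train], temp[~train])[0, 1]
print(f"held-out hindcast correlation: {corr:.2f}, spread sigma: {sigma:.2f}")
```

The correlation on held-out years plays the role of the deterministic skill metric mentioned above, while the residual spread turns each point forecast into a probabilistic one, in the spirit of the continuous rank probability score verification.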
A global empirical system for probabilistic seasonal climate prediction
Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.
2015-12-01
Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.
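A minimal sketch of a regression-based probabilistic forecast in the spirit of the system described above, assuming a plain ordinary-least-squares fit with a Gaussian residual spread; the predictor layout and function name are hypothetical.

```python
import numpy as np

def fit_forecast(X, y, x_new):
    """Ordinary least squares fit with a Gaussian predictive spread.

    X     : (n, p) predictor matrix -- the first column could hold the
            CO2-equivalent concentration, the remaining columns other
            physically motivated predictors (layout is hypothetical).
    y     : (n,) observed seasonal means (e.g. surface air temperature).
    x_new : (p,) predictor values for the target season.
    Returns the forecast mean and standard deviation, from which any
    percentile of the probabilistic forecast can be read off.
    """
    X1 = np.column_stack([np.ones(len(X)), X])
    beta = np.linalg.lstsq(X1, y, rcond=None)[0]
    resid = y - X1 @ beta
    sigma = resid.std(ddof=X1.shape[1])   # residual spread
    mean = np.concatenate([[1.0], x_new]) @ beta
    return mean, sigma
```

The (mean, sigma) pair defines the Gaussian forecast distribution; hindcast skill would then be scored with the deterministic and probabilistic metrics mentioned in the abstract.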
Inspiration in the Act of Reading
DEFF Research Database (Denmark)
Zeller, Kinga
2016-01-01
In German-language theology, Professor Ulrich H. J. Körtner’s theory of inspiration, as it relates to the Bible reader’s perspective, is well known. His attempt to gain fruitful insights from contemporary literary hermeneutics while linking them to theological concerns makes his approach a valued...... yet not uncontroversial example of a reception-aesthetics twist on the Lutheran sola Scriptura. This article presents Körtner’s hermeneutical considerations with special regard to inspiration related to the Bible reader’s perspective and shows how this approach may be related to some aspects...
Bio-inspired functional surfaces for advanced applications
DEFF Research Database (Denmark)
Malshe, Ajay; Rajurkar, Kamlakar; Samant, Anoop
2013-01-01
, are being evolved to a higher state of intelligent functionality. These surfaces became more efficient by using combinations of available materials, along with unique physical and chemical strategies. Noteworthy physical strategies include features such as texturing and structure, and chemical strategies...... such as sensing and actuation. These strategies collectively enable functional surfaces to deliver extraordinary adhesion, hydrophobicity, multispectral response, energy scavenging, thermal regulation, antibiofouling, and other advanced functions. Production industries have been intrigued with such biological...... surface strategies in order to learn clever surface architectures and implement those architectures to impart advanced functionalities into manufactured consumer products. This keynote paper delivers a critical review of such inspiring biological surfaces and their nonbiological product analogs, where...
Growing a Waldorf-Inspired Approach in a Public School District
Friedlaender, Diane; Beckham, Kyle; Zheng, Xinhua; Darling-Hammond, Linda
2015-01-01
This report documents the practices and outcomes of Alice Birney, a public K-8 Waldorf-Inspired School in Sacramento City Unified School District (SCUSD). This study highlights how such a school addresses students' academic, social, emotional, physical, and creative development. Birney students outperform similar students in SCUSD on several…
Isgur–Wise function in a QCD-inspired potential model with WKB ...
Indian Academy of Sciences (India)
2017-02-28
DOI: 10.1007/s12043-016-1357-9. Isgur–Wise function in a QCD-inspired potential model with WKB approximation. Bhaskar Jyoti Hazarika (1) and D. K. Choudhury (1, 2). (1) Centre for Theoretical Studies, Pandu College, Guwahati 781 012, India. (2) Physics Academy of North East, Gauhati University, ...
Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo
2015-01-01
Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
Orhan, A Emin; Ma, Wei Ji
2017-07-26
Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.
Fatimah, F.; Rosadi, D.; Hakim, R. B. F.
2018-03-01
In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision-making problems in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.
V.Ciulli
2011-01-01
The main programme of the Physics Week held between 16th and 20th May was a series of topology-oriented workshops on di-leptons, di-photons, inclusive W, and all-hadronic final states. The goal of these workshops was to reach a common understanding for the set of objects (ID, cleaning...), the handling of pile-up, calibration, efficiency and purity determination, as well as to revisit critical common issues such as the trigger. Di-lepton workshop Most analysis groups use a di-lepton trigger or a combination of single and di-lepton triggers in 2011. Some groups need to collect leptons with as low PT as possible with strong isolation and identification requirements as for Higgs into WW at low mass, others with intermediate PT values as in Drell-Yan studies, or high PT as in the Exotica group. Electron and muon reconstruction, identification and isolation, was extensively described in the workshop. For electrons, VBTF selection cuts for low PT and HEEP cuts for high PT were discussed, as well as more complex d...
Modelling fog in probabilistic consequence assessment
International Nuclear Information System (INIS)
Underwood, B.Y.
1993-02-01
Earlier work examined the potential influence of foggy weather conditions on the probabilistic assessment of the consequences of accidental releases of radioactive material to the atmosphere (PCA), in particular the impact of a fraction of the released aerosol becoming incorporated into droplets. A major uncertainty emerging from the initial scoping study concerned estimation of the fraction of the released material that would be taken up into droplets. An objective is to construct a method for handling in a PCA context the effect of fog on deposition, basing the method on the experience gained from prior investigations. There are two aspects to explicitly including the effect of fog in PCA: estimating the probability of occurrence of various types of foggy condition and calculating the impact on the conventional end-points of consequence assessment. For the first, a brief outline is given of the use of meteorological data by PCA computer codes, followed by a discussion of some routinely-recorded meteorological parameters that are pertinent to fog, such as the present-weather code and horizontal visibility. Four stylized scenarios are defined to cover a wide range of situations in which particle growth by uptake of water may have an important impact on deposition. A description is then given of the way in which routine meteorological data could be used to flag the presence of each of these conditions in the meteorological data file used by the PCA code. The approach developed to calculate the impact on deposition is pitched at a level of complexity appropriate to the PCA context, reflects the physical constraints of the system, and accounts for the specific characteristics of the released aerosol. (Author)
The probabilistic innovation theoretical framework
Directory of Open Access Journals (Sweden)
Chris W. Callaghan
2017-07-01
Full Text Available Background: Despite technological advances that offer new opportunities for solving societal problems in real time, knowledge management theory development has largely not kept pace with these developments. This article seeks to offer useful insights into how more effective theory development in this area could be enabled. Aim: This article suggests different streams of literature for inclusion into a theoretical framework for an emerging stream of research, termed ‘probabilistic innovation’, which seeks to develop a system of real-time research capability. The objective of this research is therefore to provide a synthesis of a range of diverse literatures, and to provide useful insights into how research enabled by crowdsourced research and development can potentially be used to address serious knowledge problems in real time. Setting: This research suggests that knowledge management theory can provide an anchor for a new stream of research contributing to the development of real-time knowledge problem solving. Methods: This conceptual article seeks to re-conceptualise the problem of real-time research and locate this knowledge problem in relation to a host of rapidly developing streams of literature. In doing so, a novel perspective of societal problem-solving is enabled. Results: An analysis of theory and literature suggests that certain rapidly developing streams of literature might more effectively contribute to societally important real-time research problem solving if these streams are united under a theoretical framework with this goal as its explicit focus. Conclusion: Although the goal of real-time research is as yet not attainable, research that contributes to its attainment may ultimately make an important contribution to society.
Probabilistic Multi-Hazard Assessment of Dry Cask Structures
Energy Technology Data Exchange (ETDEWEB)
Bencturk, Bora [Univ. of Houston, TX (United States); Padgett, Jamie [Rice Univ., Houston, TX (United States); Uddin, Rizwan [Univ. of Illinois, Urbana-Champaign, IL (United States).
2017-01-10
systems the concrete shall not only provide shielding but also ensure stability of the upright canister, facilitate anchoring, allow ventilation, and provide physical protection against theft, severe weather, and natural (seismic) as well as man-made events (blast incidents). Given the need to remain functional for 40 years or even longer in the case of interim storage, the concrete overpack and the internal canister components need to be evaluated with regard to their long-term ability to perform their intended design functions. Just as evidenced by deteriorating concrete bridges, there are reported visible degradation mechanisms of dry storage systems, especially when highly corrosive environments are considered in maritime locations. The degradation of reinforced concrete is caused by multiple physical and chemical mechanisms, which may be summarized under the heading of environmental aging. The underlying hygro-thermal transport processes are accelerated by irradiation effects; hence creep and shrinkage models need to include the effects of chloride penetration and alkali-aggregate reaction as well as corrosion of the reinforcing steel. In light of the above, the two main objectives of this project are to (1) develop a probabilistic multi-hazard assessment framework, and (2) through experimental and numerical research perform a comprehensive assessment under combined earthquake loads and aging-induced deterioration, which will also provide data for the development and validation of the probabilistic framework.
Kinds of inspiration in interaction design
DEFF Research Database (Denmark)
Halskov, Kim
2010-01-01
In this paper, we explore the role of sources of inspiration in interaction design. We identify four strategies for relating sources of inspiration to emerging ideas: selection; adaptation; translation; and combination. As our starting point, we argue that sources of inspiration are a form...... of knowledge crucial to creativity. Our research is based on empirical findings arising from the use of Inspiration Card Workshops, which are collaborative design events in which domain and technology insight are combined to create design concepts. In addition to the systematically introduced sources...... of inspiration that form part of the workshop format, a number of spontaneous sources of inspiration emerged during these workshops....
Buckling Pneumatic Linear Actuators Inspired by Muscle
Yang, Dian; Verma, Mohit Singh; So, Ju-Hee; Mosadegh, Bobak; Keplinger, Christoph; Lee, Benjamin; Khashai, Fatemeh; Lossner, Elton Garret; Suo, Zhigang; Whitesides, George McClelland
2016-01-01
The mechanical features of biological muscles are difficult to reproduce completely in synthetic systems. A new class of soft pneumatic structures (vacuum-actuated muscle-inspired pneumatic structures) is described that combines actuation by negative pressure (vacuum), with cooperative buckling of beams fabricated in a slab of elastomer, to achieve motion and demonstrate many features that are similar to that of mammalian muscle.
Inspiration and the Texts of the Bible
Directory of Open Access Journals (Sweden)
Dirk Buchner
1997-12-01
Full Text Available This article seeks to explore what the inspired text of the Old Testament was as it existed for the New Testament authors, particularly for the author of the book of Hebrews. A quick look at the facts makes it clear that there was, at the time, more than one 'inspired' text; among these were the Septuagint and the Masoretic Text, to name but two. The latter eventually gained ascendancy, which is why it forms the basis of our translated Old Testament today. Yet we have to ask: what do we make of that other text that was the inspired Bible to the early Church, especially to the writer of the book of Hebrews, who ignored the Masoretic text? This article will take a brief look at some suggestions for a doctrine of inspiration that keeps up with the facts of Scripture. Allied to this, the article is something of a bibliographical study of recent developments in textual research following the discovery of the Dead Sea scrolls.
Using Space to Inspire and Engage Children
Clements, Allan
2015-01-01
The European Space Education Resources Office (ESERO-UK) is a project of the European Space Agency (ESA) and national partners including the Department for Education (DfE), The UK Space Agency (UKSA) and the Science and Technology Facilities Council (STFC). The key objective of the project is to promote space as an exciting inspirational context…
Inspired by Athletes, Myths, and Poets
Melvin, Samantha
2010-01-01
Tales of love and hate, of athleticism, heroism, devotion to gods and goddesses that influenced myth and culture are a way of sharing ancient Greece's rich history. In this article, the author describes how her students created their own Greek-inspired clay vessels as artifacts of their study. (Contains 6 online resources.)
Inspirational catalogue of Master Thesis proposals 2014
DEFF Research Database (Denmark)
This catalog presents different topics for master thesis projects. It is important to emphasize that the project descriptions only serves as an inspiration and that you always can discuss with the potential supervisors the specific contents of a project. If you have an idea for a project which...
Water Treatment Technologies Inspire Healthy Beverages
2013-01-01
Mike Johnson, a former technician at Johnson Space Center, drew on his expertise as a wastewater engineer to create a line of kombucha-based probiotic drinks. Unpeeled Inc., based in Minneapolis-St. Paul, Minnesota, employs 12 people and has sold more than 6 million units of its NASA-inspired beverage.
Inspiring a Life Full of Learning
Ludlam, John
2012-01-01
The Secrets and Words films had everything one would expect from a BBC drama--great writing, acting and directing allied with high production values. But the dramas were also powerful learning tools, co-commissioned by BBC Learning and aimed at inspiring people who have difficulty with reading and writing to seek help. The BBC's learning vision is…
Trauma-Inspired Prosocial Leadership Development
Williams, Jenifer Wolf; Allen, Stuart
2015-01-01
Though trauma survivors sometimes emerge as leaders in prosocial causes related to their previous negative or traumatic experiences, little is known about this transition, and limited guidance is available for survivors who hope to make prosocial contributions. To understand what enables trauma-inspired prosocial leadership development, the…
Pop Art--Inspired Self-Portraits
Goodwin, Donna J.
2011-01-01
In this article, the author describes an art lesson that was inspired by Andy Warhol's mass-produced portraits. Warhol began his career as a graphic artist and illustrator. His artwork was a response to the redundancy of the advertising images put in front of the American public. Celebrities and famous people in magazines and newspapers were seen…
Surfacing Authentic Leadership: Inspiration from "After Life"
Billsberry, Jon; North-Samardzic, Andrea
2016-01-01
This paper advocates an innovative approach to help leadership students analyze, capture, and remember the nature of their authentic leadership. This developmental activity was inspired by the Japanese film, "Wandâfuru raifu" ("After Life") (Kore-Eda, Sato, & Shigenobu, 1998), in which the recently deceased are asked to…
Coaching as Inspiration for Dialogue-Based Leadership [Coaching som inspiration til dialogbaseret lederskab]
DEFF Research Database (Denmark)
Stelter, Reinhard
2013-01-01
, where meaning- and value-creating processes are at the centre. The central basic dimensions of this form of coaching dialogue lie in a focus on values, in opportunities for meaning-making, and in the narrative, co-creative perspective. On this basis, third-generation coaching can serve as inspiration in relation to...
Uses of probabilistic estimates of seismic hazard and nuclear power plants in the US
International Nuclear Information System (INIS)
Reiter, L.
1983-01-01
The use of probabilistic estimates is playing an increased role in the review of seismic hazard at nuclear power plants. The NRC Geosciences Branch emphasis has been on using these estimates in a relative rather than an absolute manner and to gain insight into other approaches. Examples of this use include estimates to determine design levels, to determine equivalent hazard at different sites, to help define more realistic seismotectonic provinces, and to assess the levels of acceptable risk implied by deterministic methods. Increased use of probabilistic estimates is expected. Probabilistic estimates of seismic hazard have a potential for misuse, however, and their successful integration into decision making requires that they not be divorced from physical insight and scientific intuition.
Sustaining Physics Teacher Education Coalition Programs in Physics Teacher Education
Scherr, Rachel E.; Plisch, Monica; Goertzen, Renee Michelle
2017-01-01
Understanding the mechanisms of increasing the number of physics teachers educated per year at institutions with thriving physics teacher preparation programs may inspire and support other institutions in building thriving programs of their own. The Physics Teacher Education Coalition (PhysTEC), led by the American Physical Society (APS) and the…
Use and Communication of Probabilistic Forecasts.
Raftery, Adrian E
2016-12-01
Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.
Use and Communication of Probabilistic Forecasts
Raftery, Adrian E.
2015-01-01
Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don’t need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications. PMID:28446941
Probabilistic numerics and uncertainty in computations.
Hennig, Philipp; Osborne, Michael A; Girolami, Mark
2015-07-08
We deliver a call to arms for probabilistic numerical methods: algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data has led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.
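The core idea, numerical routines that report their own uncertainty, can be illustrated with plain Monte Carlo integration, whose standard error is a natural built-in uncertainty estimate; this is an illustration of the idea, not one of the paper's algorithms.

```python
import numpy as np

def mc_integrate(f, a, b, n=10_000, seed=0):
    """Monte Carlo integration that reports its own uncertainty.

    Returns (estimate, standard_error): a point estimate of the
    integral of f over [a, b] together with an error bar, in the
    spirit of numerical methods that return uncertainties.
    """
    rng = np.random.default_rng(seed)
    x = rng.uniform(a, b, n)
    vals = (b - a) * f(x)
    est = vals.mean()
    se = vals.std(ddof=1) / np.sqrt(n)
    return est, se

est, se = mc_integrate(np.sin, 0.0, np.pi)
```

The true value of this integral is 2, and the returned standard error quantifies how far the estimate may plausibly be from it, exactly the kind of self-reported numerical uncertainty the abstract argues for.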
Probabilistic Modeling of the Renal Stone Formation Module
Best, Lauren M.; Myers, Jerry G.; Goodenow, Debra A.; McRae, Michael P.; Jackson, Travis C.
2013-01-01
The Integrated Medical Model (IMM) is a probabilistic tool, used in mission planning decision making and medical systems risk assessments. The IMM project maintains a database of over 80 medical conditions that could occur during a spaceflight, documenting an incidence rate and end case scenarios for each. In some cases, where observational data are insufficient to adequately define the inflight medical risk, the IMM utilizes external probabilistic modules to model and estimate the event likelihoods. One such medical event of interest is an unpassed renal stone. Due to a high salt diet and high concentrations of calcium in the blood (due to bone depletion caused by unloading in the microgravity environment), astronauts are at a considerably elevated risk of developing renal calculi (nephrolithiasis) while in space. The lack of observed incidences of nephrolithiasis has led HRP to initiate the development of the Renal Stone Formation Module (RSFM) to create a probabilistic simulator capable of estimating the likelihood of symptomatic renal stone presentation in astronauts on exploration missions. The model consists of two major parts. The first is the probabilistic component, which utilizes probability distributions to assess the range of urine electrolyte parameters and a multivariate regression to transform estimated crystal density and size distributions into the likelihood of the presentation of nephrolithiasis symptoms. The second is a deterministic physical and chemical model of renal stone growth in the kidney developed by Kassemi et al. The probabilistic component of the renal stone model couples the input probability distributions describing the urine chemistry, astronaut physiology, and system parameters with the physical and chemical outputs and inputs of the deterministic stone growth model. These two parts of the model are necessary to capture the uncertainty in the likelihood estimate. The model will be driven by Monte Carlo simulations, continuously
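The coupling of sampled probabilistic inputs to a deterministic growth model can be sketched as below; the growth law, the input distributions, and the symptomatic-size threshold are all hypothetical stand-ins for the actual RSFM components.

```python
import random

def deterministic_growth(supersaturation, residence_time):
    """Stand-in for the deterministic stone-growth model
    (a hypothetical linear growth law, for illustration only)."""
    return 0.1 * supersaturation * residence_time   # stone size, arbitrary units

def simulate(n_trials=10_000, threshold=2.0, seed=1):
    """Monte Carlo driver: sample urine-chemistry inputs from assumed
    distributions, push each sample through the deterministic model,
    and estimate the probability that the resulting stone exceeds a
    hypothetical symptomatic-size threshold."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        s = rng.lognormvariate(0.0, 0.5)    # supersaturation (assumed)
        t = rng.uniform(1.0, 10.0)          # residence time (assumed)
        if deterministic_growth(s, t) > threshold:
            hits += 1
    return hits / n_trials
```

The returned fraction plays the role of the event likelihood that the module feeds back to the IMM; the real RSFM replaces both stand-ins with the physiology-based distributions and the Kassemi et al. growth model.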
bayesPop: Probabilistic Population Projections
Ševčíková, Hana; Raftery, Adrian E.
2016-01-01
We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933
Probabilistic Modeling and Visualization for Bankruptcy Prediction
DEFF Research Database (Denmark)
Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara
2017-01-01
In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurate assessment of business failure prediction, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful...... studies on bankruptcy detection, probabilistic approaches have seldom been carried out. In this paper we assume a probabilistic point-of-view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing it against the Support Vector Machines (SVM) and the Logistic Regression (LR......). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical...
Probabilistic inversion for chicken processing lines
International Nuclear Information System (INIS)
Cooke, Roger M.; Nauta, Maarten; Havelaar, Arie H.; Fels, Ine van der
2006-01-01
We discuss an application of probabilistic inversion techniques to a model of campylobacter transmission in chicken processing lines. Such techniques are indicated when we wish to quantify a model which is new and perhaps unfamiliar to the expert community. In this case there are no measurements for estimating model parameters, and experts are typically unable to give a considered judgment. In such cases, experts are asked to quantify their uncertainty regarding variables which can be predicted by the model. The experts' distributions (after combination) are then pulled back onto the parameter space of the model, a process termed 'probabilistic inversion'. This study illustrates two such techniques, iterative proportional fitting (IPF) and PARmeter fitting for uncertain models (PARFUM). In addition, we illustrate how expert judgement on predicted observable quantities in combination with probabilistic inversion may be used for model validation and/or model criticism
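Iterative proportional fitting (IPF), one of the two techniques named above, can be illustrated on a small two-way table; in actual probabilistic inversion the table lives over model-parameter samples and the target marginals come from the combined expert distributions.

```python
import numpy as np

def ipf(table, row_targets, col_targets, n_iter=100):
    """Iterative proportional fitting: rescale a 2-D joint
    distribution so its marginals match the given targets.

    Alternately scales rows and columns; for compatible targets the
    iteration converges to the minimum-discrimination-information
    adjustment of the starting table."""
    t = np.asarray(table, dtype=float)
    for _ in range(n_iter):
        t *= (row_targets / t.sum(axis=1))[:, None]   # match row sums
        t *= (col_targets / t.sum(axis=0))[None, :]   # match column sums
    return t

start = np.ones((2, 2))                               # uninformative start
fitted = ipf(start, row_targets=np.array([0.3, 0.7]),
             col_targets=np.array([0.4, 0.6]))
```

After fitting, both marginals of `fitted` agree with the targets while the table stays as close as possible to the starting distribution, which is the sense in which the expert distributions are "pulled back" onto the model.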
Scalable group level probabilistic sparse factor analysis
DEFF Research Database (Denmark)
Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard
2017-01-01
Many data-driven approaches exist to extract neural representations of functional magnetic resonance imaging (fMRI) data, but most of them lack a proper probabilistic formulation. We propose a scalable group level probabilistic sparse factor analysis (psFA) allowing spatially sparse maps, component...... pruning using automatic relevance determination (ARD) and subject specific heteroscedastic spatial noise modeling. For task-based and resting state fMRI, we show that the sparsity constraint gives rise to components similar to those obtained by group independent component analysis. The noise modeling...... shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...
bayesPop: Probabilistic Population Projections
Directory of Open Access Journals (Sweden)
Hana Ševčíková
2016-12-01
Full Text Available We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.
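The sampling idea behind such probabilistic projections can be caricatured in a few lines: draw vital rates from their predictive distributions, step a cohort model forward, and summarize the resulting sample of future counts. This toy Monte Carlo sketch is not the bayesPop API; the age groups and all rates below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy population in three age groups (young, adult, old).
pop0 = np.array([100.0, 80.0, 60.0])

def project_once(pop, steps=5):
    """One trajectory: sample uncertain fertility/survival at each step."""
    for _ in range(steps):
        fert = rng.normal(0.9, 0.05)                     # toy fertility rate
        surv = np.clip(rng.normal([0.99, 0.95], 0.01), 0.0, 1.0)
        births = fert * pop[1]                           # only adults reproduce
        pop = np.array([births, surv[0] * pop[0], surv[1] * pop[1]])
    return pop

# Sample from the (toy) posterior predictive distribution of future counts.
samples = np.array([project_once(pop0) for _ in range(2000)])
totals = samples.sum(axis=1)
median_total = np.median(totals)
lo, hi = np.percentile(totals, [5, 95])                  # 90% interval
```

The package itself works with full age- and sex-specific schedules and Bayesian hierarchical projections of fertility and life expectancy; the sketch only shows why the output is a distribution rather than a single trajectory.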
Probabilistic Design of Wave Energy Devices
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.
2011-01-01
Wave energy has a large potential for contributing significantly to production of renewable energy. However, the wave energy sector is still not able to deliver cost competitive and reliable solutions, although it has already demonstrated several proofs of concept. The design of wave energy devices is a new and expanding technical area with no tradition for probabilistic design; in fact, very few full scale devices have been built to date, so it can be said that no design tradition really exists in this area. For this reason it is considered to be of great importance to develop and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy …
Efficient network-matrix architecture for general flow transport inspired by natural pinnate leaves.
Hu, Liguo; Zhou, Han; Zhu, Hanxing; Fan, Tongxiang; Zhang, Di
2014-11-14
Networks embedded in three dimensional matrices are beneficial to deliver physical flows to the matrices. Leaf architectures, pervasive natural network-matrix architectures, endow leaves with high transpiration rates and low water pressure drops, providing inspiration for efficient network-matrix architectures. In this study, the network-matrix model for general flow transport inspired by natural pinnate leaves is investigated analytically. The results indicate that the optimal network structure inspired by natural pinnate leaves can greatly reduce the maximum potential drop and the total potential drop caused by the flow through the network while maximizing the total flow rate through the matrix. These results can be used to design efficient networks in network-matrix architectures for a variety of practical applications, such as tissue engineering, cell culture, photovoltaic devices and heat transfer.
Biomimetic and bio-inspired uses of mollusc shells.
Morris, J P; Wang, Y; Backeljau, T; Chapelle, G
2016-06-01
Climate change and ocean acidification are likely to have a profound effect on marine molluscs, which are of great ecological and economic importance. One process particularly sensitive to climate change is the formation of biominerals in mollusc shells. Fundamental research is broadening our understanding of the biomineralization process, as well as providing more informed predictions on the effects of climate change on marine molluscs. Such studies are important in their own right, but their value also extends to applied sciences. Biominerals, organic/inorganic hybrid materials with many remarkable physical and chemical properties, have been studied for decades, and the possibilities for future improved use of such materials for society are widely recognised. This article highlights the potential use of our understanding of the shell biomineralization process in novel bio-inspired and biomimetic applications. It also highlights the potential for the valorisation of shells produced as a by-product of the aquaculture industry. Studying shells and the formation of biominerals will inspire novel functional hybrid materials. It may also provide sustainable, ecologically- and economically-viable solutions to some of the problems created by current human resource exploitation. Copyright © 2016 Elsevier B.V. All rights reserved.
Nanofluidics in two-dimensional layered materials: inspirations from nature.
Gao, Jun; Feng, Yaping; Guo, Wei; Jiang, Lei
2017-08-29
With the advance of chemistry, materials science, and nanotechnology, significant progress has been achieved in the design and application of synthetic nanofluidic devices and materials, mimicking the gating, rectifying, and adaptive functions of biological ion channels. Fundamental physics and chemistry behind these novel transport phenomena on the nanoscale have been explored in depth on single-pore platforms. However, toward real-world applications, one major challenge is to extrapolate these single-pore devices into macroscopic materials. Recently, inspired partially by the layered microstructure of nacre, the material design and large-scale integration of artificial nanofluidic devices have stepped into a completely new stage, termed 2D nanofluidics. Unique advantages of the 2D layered materials have been found, such as facile and scalable fabrication, high flux, efficient chemical modification, tunable channel size, etc. These features enable wide applications in, for example, biomimetic ion transport manipulation, molecular sieving, water treatment, and nanofluidic energy conversion and storage. This review highlights the recent progress, current challenges, and future perspectives in this emerging research field of "2D nanofluidics", with emphasis on the thought of bio-inspiration.
Electronic and optoelectronic materials and devices inspired by nature
Meredith, P.; Bettinger, C. J.; Irimia-Vladu, M.; Mostert, A. B.; Schwenn, P. E.
2013-03-01
Inorganic semiconductors permeate virtually every sphere of modern human existence. Micro-fabricated memory elements, processors, sensors, circuit elements, lasers, displays, detectors, etc are ubiquitous. However, the dawn of the 21st century has brought with it immense new challenges, and indeed opportunities—some of which require a paradigm shift in the way we think about resource use and disposal, which in turn directly impacts our ongoing relationship with inorganic semiconductors such as silicon and gallium arsenide. Furthermore, advances in fields such as nano-medicine and bioelectronics, and the impending revolution of the ‘ubiquitous sensor network’, all require new functional materials which are bio-compatible, cheap, have minimal embedded manufacturing energy plus extremely low power consumption, and are mechanically robust and flexible for integration with tissues, building structures, fabrics and all manner of hosts. In this short review article we summarize current progress in creating materials with such properties. We focus primarily on organic and bio-organic electronic and optoelectronic systems derived from or inspired by nature, and outline the complex charge transport and photo-physics which control their behaviour. We also introduce the concept of electrical devices based upon ion or proton flow (‘ionics and protonics’) and focus particularly on their role as a signal interface with biological systems. Finally, we highlight recent advances in creating working devices, some of which have bio-inspired architectures, and summarize the current issues, challenges and potential solutions. This is a rich new playground for the modern materials physicist.
Probabilistic Learning by Rodent Grid Cells.
Cheung, Allen
2016-10-01
Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, despite being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments are reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population
Probabilistic Damage Stability Calculations for Ships
DEFF Research Database (Denmark)
Jensen, Jørgen Juncher
1996-01-01
The aim of these notes is to provide background material for the present probabilistic damage stability rules for dry cargo ships. The formulas for the damage statistics are derived, and shortcomings as well as possible improvements are discussed. The advantage of the definition of fictitious compartments in the formulation of a computer-based general procedure for probabilistic damaged stability assessment is shown. Some comments are given on the current state of knowledge on ship survivability in damaged conditions. Finally, problems regarding proper account of water ingress through openings …
Quantum logic networks for probabilistic teleportation
Institute of Scientific and Technical Information of China (English)
刘金明; 张永生; et al.
2003-01-01
By means of the primitive operations consisting of single-qubit gates, two-qubit controlled-NOT gates, von Neumann measurements and classically controlled operations, we construct efficient quantum logic networks for implementing probabilistic teleportation of a single qubit, a two-particle entangled state, and an N-particle entanglement. Based on the quantum networks, we show that after the partially entangled states are concentrated into maximal entanglement, the above three kinds of probabilistic teleportation are the same as the standard teleportation using the corresponding maximally entangled states as the quantum channels.
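For the single-qubit case, the success probability of such a scheme follows from the Schmidt coefficients of the channel: a partially entangled pair a|00> + b|11> can be concentrated into a maximally entangled pair with probability 2·min(|a|², |b|²), which then supports standard teleportation. A small sketch of this standard result (not the circuit construction of the paper):

```python
import math

def teleport_success_prob(a, b):
    """Maximal success probability of faithful teleportation through the
    partially entangled channel a|00> + b|11>: concentrate to a maximally
    entangled pair first, which succeeds with probability 2*min(|a|^2, |b|^2)."""
    norm = math.hypot(abs(a), abs(b))          # renormalize defensively
    pa = (abs(a) / norm) ** 2
    pb = (abs(b) / norm) ** 2
    return 2.0 * min(pa, pb)

# A channel with |a|^2 = 0.8, |b|^2 = 0.2 succeeds at most 40% of the time.
p = teleport_success_prob(math.sqrt(0.8), math.sqrt(0.2))  # -> 0.4
```

A maximally entangled channel (|a|² = |b|² = 1/2) gives probability 1, recovering deterministic standard teleportation.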
Probabilistic Durability Analysis in Advanced Engineering Design
Directory of Open Access Journals (Sweden)
A. Kudzys
2000-01-01
Full Text Available Expedience of probabilistic durability concepts and approaches in advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for calculation of reliability indices are given. The analysis can be used for probabilistic durability assessment of load-carrying and enclosure metal, reinforced concrete, wood, plastic and masonry structures, both homogeneous and sandwich or composite, and some kinds of equipment. The analysis models can be applied in other engineering fields.
Probabilistic Design of Offshore Structural Systems
DEFF Research Database (Denmark)
Sørensen, John Dalsgaard
1988-01-01
Probabilistic design of structural systems is considered in this paper. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements satisfies given requirements or such that the systems reliability satisfies a given requirement. Based on a sensitivity analysis, optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi-analytical derivatives. Finally, an example of probabilistic design of an offshore structure is considered.
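For a limit state that is linear in standard normal variables, FORM reduces to a closed form: the reliability index β is the distance from the origin to the failure surface, and the failure probability is Φ(−β). A minimal sketch of this textbook special case (illustrative coefficients, not the paper's cost-optimization procedure):

```python
import math

def form_linear(a0, a):
    """FORM for a linear limit state g(U) = a0 + sum(a_i * U_i) with U_i
    independent standard normal: beta = a0 / ||a||, Pf = Phi(-beta)."""
    norm = math.sqrt(sum(ai * ai for ai in a))
    beta = a0 / norm
    pf = 0.5 * math.erfc(beta / math.sqrt(2.0))   # standard normal CDF at -beta
    return beta, pf

# Example limit state g(U) = 3 - U1 - U2 written as a0=3, a=(-1,-1);
# only ||a|| enters, so the sign convention does not change beta.
beta, pf = form_linear(3.0, [1.0, 1.0])
```

For nonlinear limit states, FORM instead searches for the most probable failure point (the design point) in standard normal space and linearizes there; the sensitivity factors from that search are what drive the optimization procedures described above.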
Documentation design for probabilistic risk assessment
International Nuclear Information System (INIS)
Parkinson, W.J.; von Herrmann, J.L.
1985-01-01
This paper describes a framework for documentation design of probabilistic risk assessment (PRA) and is based on the EPRI document NP-3470, ''Documentation Design for Probabilistic Risk Assessment''. The goals for PRA documentation are stated. Four audiences which PRA documentation must satisfy are identified, and the documentation components consistent with the needs of the various audiences are discussed, i.e., the Summary Report, the Executive Summary, the Main Report, and the Appendices. The authors recommend the documentation specifications discussed herein as guides rather than rigid definitions.
Probabilistic calculation of dose commitment from uranium mill tailings
International Nuclear Information System (INIS)
1983-10-01
The report discusses in a general way considerations of uncertainty in relation to probabilistic modelling. An example of a probabilistic calculation applied to the behaviour of uranium mill tailings is given.
Probabilistic inversion in priority setting of emerging zoonoses.
Kurowicka, D.; Bucura, C.; Cooke, R.; Havelaar, A.H.
2010-01-01
This article presents a methodology for applying probabilistic inversion in combination with expert judgment to a priority setting problem. Experts rank scenarios according to severity. A linear multi-criteria analysis model underlying the expert preferences is posited. Using probabilistic inversion, a …
Review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment
International Nuclear Information System (INIS)
Sattison, M.B.; Davis, P.R.; Satterwhite, D.G.; Gilmore, W.E.; Gregg, R.E.
1989-11-01
A review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment was conducted with the objective of confirming the safety perspectives brought to light by the probabilistic risk assessment. The scope of the review included the entire Level I probabilistic risk assessment, including external events, which is consistent with the scope of the probabilistic risk assessment itself. The review included an assessment of the assumptions, methods, models, and data used in the study. 47 refs., 14 figs., 15 tabs
Arbitrage and Hedging in a non probabilistic framework
Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo
2011-01-01
The paper studies the concepts of hedging and arbitrage in a non-probabilistic framework. It provides conditions for non-probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate non-probabilistic arbitrage as well as perfect replication of options under continuous and discontinuous trajectories; the results can then be applied in probabilistic models path by path. The approach is r…
A common fixed point for operators in probabilistic normed spaces
International Nuclear Information System (INIS)
Ghaemi, M.B.; Lafuerza-Guillen, Bernardo; Razani, A.
2009-01-01
Probabilistic metric spaces were introduced by Karl Menger. Alsina, Schweizer and Sklar gave a general definition of probabilistic normed space based on the definition of Menger [Alsina C, Schweizer B, Sklar A. On the definition of a probabilistic normed space. Aequationes Math 1993;46:91-8]. Here, we consider the equicontinuity of a class of linear operators in probabilistic normed spaces and finally, a common fixed point theorem is proved. An application to quantum mechanics is considered.
InSpiRe - Intelligent Spine Rehabilitation
DEFF Research Database (Denmark)
Bøg, Kasper Hafstrøm; Helms, Niels Henrik; Kjær, Per
Report on the InSpiRe project: InSpiRe is a national network intended to advance the possibilities for intelligent rehabilitation of back disorders. In the network, researchers, companies, chiropractors and physiotherapists meet to develop new rehabilitation and/or treatment technologies.
Taxonomic etymology – in search of inspiration
Directory of Open Access Journals (Sweden)
Piotr Jozwiak
2015-07-01
Full Text Available We present a review of the etymology of zoological taxonomic names with emphasis on the most unusual examples. The names were divided into several categories, starting from the most common, given after morphological features, through inspiration from mythology, legends, and classic literature, but also from fictional and nonfictional pop-culture characters (e.g., music, movies or cartoons), science, and politics. A separate category includes zoological names created using word-play and figures of speech such as tautonyms, acronyms, anagrams, and palindromes. Our intention was to give an overview of how and where taxonomists can find inspirations consistent with the ICZN rules, and to prompt more detailed reflection on the naming process itself, the meaningful character of naming, and the recognition and understanding of names.
Biologically inspired water purification through selective transport
International Nuclear Information System (INIS)
Freeman, E C; Soncini, R M; Weiland, L M
2013-01-01
Biologically inspired systems based on cellular mechanics demonstrate the ability to selectively transport ions across a bilayer membrane. These systems may be observed in nature in plant roots, which remove select nutrients from the surrounding soil against significant concentration gradients. Using biomimetic principles in the design of tailored active materials allows for the development of selective membranes for capturing and filtering targeted ions. Combining this biomimetic transport system with a method for reclaiming the captured ions will allow for increased removal potential. To illustrate this concept, a device for removing nutrients from waterways to aid in reducing eutrophication is outlined and discussed. Presented is a feasibility study of various cellular configurations designed for this purpose, focusing on maximizing nutrient uptake. The results enable a better understanding of the benefits and obstacles when developing these cellularly inspired systems. (paper)
International Nuclear Information System (INIS)
Kupchishin, A.A.; Kupchishin, A.I.; Shmygaleva, T.A.
2002-01-01
Within the framework of the cascade-probabilistic (CP) method, radiation and physical processes are studied and their relation to Markov processes is established. The conclusion is drawn that the CP-functions for electrons, protons, alpha-particles and ions are described by an inhomogeneous Markov chain. Algorithms are developed, and calculations of the CP-functions for charged particles and of the concentration of radiation defects in solids under ion irradiation are carried out. Tables for different parameters of the CP-functions and for radiation defect concentrations in charged-particle interaction with solids are given. The book consists of the introduction and two chapters: (1) Cascade-probabilistic functions and Markov processes; (2) Radiation defect formation in solids as a part of Markov processes. The book is intended for specialists in mathematical simulation of radiation defects, solid state physics, elementary particle physics and applied mathematics.
Neurobiologically inspired mobile robot navigation and planning
Directory of Open Access Journals (Sweden)
Mathias Quoy
2007-11-01
Full Text Available After a short review of biologically inspired navigation architectures, mainly relying on modeling the hippocampal anatomy, or at least some of its functions, we present a navigation and planning model for mobile robots. This architecture is based on a model of the hippocampal and prefrontal interactions. In particular, the system relies on the definition of a new cell type “transition cells” that encompasses traditional “place cells”.
Biological Inspiration for Agile Autonomous Air Vehicles
2007-11-01
…half of one wing, bees with legs packed with pollen, butterflies or moths with torn and frayed wings likewise are capable of apparently normal flight… technologies. To appreciate this, consider a not unreasonable extension of a wide area autonomous search (WAAS) munition operational scenario. Here… detect and destroy missile launchers that are operating in the back alleys of urban areas or search… Evers, J.H. (2007) Biological Inspiration for Agile
Humidification of inspired gases during mechanical ventilation.
Gross, J L; Park, G R
2012-04-01
Humidification of inspired gas is mandatory for all mechanically ventilated patients to prevent secretion retention, tracheal tube blockage and adverse changes to the respiratory tract epithelium. However, the debate over "ideal" humidification continues. Several devices are available, including active and passive heat and moisture exchangers and hot water humidifiers. Each has its advantages and disadvantages in mechanically ventilated patients. This review explores each device in turn and defines its role in clinical practice.
xLPR - a probabilistic approach to piping integrity analysis
International Nuclear Information System (INIS)
Harrington, C.; Rudland, D.; Fyfitch, S.
2015-01-01
The xLPR Code is a probabilistic fracture mechanics (PFM) computational tool that can be used to quantitatively determine a best-estimate probability of failure with well characterized uncertainties for reactor coolant system components, beginning with the piping systems and including the effects of relevant active degradation mechanisms. The initial application planned for xLPR is somewhat narrowly focused on validating LBB (leak-before-break) compliance in PWSCC-susceptible systems such as coolant systems of PWRs. The xLPR code incorporates a set of deterministic models that represent the full range of physical phenomena necessary to evaluate both fatigue and PWSCC degradation modes from crack initiation through failure. These models are each implemented in a modular form and linked together by a probabilistic framework that contains the logic for xLPR execution, exercises the individual modules as required, and performs necessary administrative and bookkeeping functions. The completion of the first production version of the xLPR code in a fully documented, releasable condition is presently planned for spring 2015
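The core probability-of-failure idea in probabilistic fracture mechanics can be caricatured with a toy Monte Carlo over uncertain crack growth. This is emphatically not xLPR's model set; every distribution and number below is invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000                         # number of Monte Carlo realizations

# Hypothetical uncertain inputs (toy values, not xLPR's degradation models):
a0 = rng.lognormal(mean=np.log(2.0), sigma=0.3, size=n)    # initial crack depth, mm
rate = rng.lognormal(mean=np.log(0.1), sigma=0.5, size=n)  # growth rate, mm/yr
wall = 25.0                                                # wall thickness, mm
years = 40.0                                               # evaluation period

# Crude linear growth over the evaluation period; "failure" = through-wall.
a_end = a0 + rate * years
p_fail = np.mean(a_end >= wall)     # sampled best-estimate failure probability
```

A production PFM code couples many such sampled inputs through mechanistic crack initiation, growth and stability modules, and separates aleatory from epistemic uncertainty so the failure probability itself carries an uncertainty characterization.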
All-possible-couplings approach to measuring probabilistic context.
Directory of Open Access Journals (Sweden)
Ehtibar N Dzhafarov
Full Text Available From behavioral sciences to biology to quantum mechanics, one encounters situations where (i a system outputs several random variables in response to several inputs, (ii for each of these responses only some of the inputs may "directly" influence them, but (iii other inputs provide a "context" for this response by influencing its probabilistic relations to other responses. These contextual influences are very different, say, in classical kinetic theory and in the entanglement paradigm of quantum mechanics, which are traditionally interpreted as representing different forms of physical determinism. One can mathematically construct systems with other types of contextuality, whether or not empirically realizable: those that form special cases of the classical type, those that fall between the classical and quantum ones, and those that violate the quantum type. We show how one can quantify and classify all logically possible contextual influences by studying various sets of probabilistic couplings, i.e., sets of joint distributions imposed on random outputs recorded at different (mutually incompatible values of inputs.
Drawing inspiration from biological optical systems
Wolpert, H. D.
2009-08-01
Bio-Mimicking/Bio-Inspiration: How can we not be inspired by Nature? Life has evolved on earth over the last 3.5 to 4 billion years. Materials formed during this time were not toxic; they were created at low temperatures and low pressures, unlike many of the materials developed today. The natural materials formed are self-assembled, multifunctional, nonlinear, complex, adaptive, self-repairing and biodegradable. The designs that failed are fossils; those that survived are the success stories. Natural materials are mostly formed from organics, inorganic crystals and amorphous phases. The materials make economic sense by optimizing the design of the structures or systems to meet multiple needs. We constantly "see" similar strategies between man and nature, but we seldom look at the details of nature's approaches. The power of image processing in many of nature's creatures is a detail that is often overlooked. Seldom does the engineer interact with the biologist to learn what nature has to teach us. The variety and complexity of biological materials and the optical systems formed should inspire us.
Biologically inspired coupled antenna beampattern design
Energy Technology Data Exchange (ETDEWEB)
Akcakaya, Murat; Nehorai, Arye, E-mail: makcak2@ese.wustl.ed, E-mail: nehorai@ese.wustl.ed [Department of Electrical and Systems Engineering, Washington University in St Louis, St Louis, MO 63130 (United States)
2010-12-15
We propose to design a small-size transmission-coupled antenna array, and corresponding radiation pattern, having high performance inspired by the female Ormia ochracea's coupled ears. For reproduction purposes, the female Ormia is able to locate male crickets' call accurately despite the small distance between its ears compared with the incoming wavelength. This phenomenon has been explained by the mechanical coupling between the Ormia's ears, which has been modeled by a pair of differential equations. In this paper, we first solve these differential equations governing the Ormia ochracea's ear response, and convert the response to the pre-specified radio frequencies. We then apply the converted response of the biological coupling in the array factor of a uniform linear array composed of finite-length dipole antennas, and also include the undesired electromagnetic coupling due to the proximity of the elements. Moreover, we propose an algorithm to optimally choose the biologically inspired coupling for maximum array performance. In our numerical examples, we compute the radiation intensity of the designed system for binomial and uniform ordinary end-fire arrays, and demonstrate the improvement in the half-power beamwidth, sidelobe suppression and directivity of the radiation pattern due to the biologically inspired coupling.
Delineating probabilistic species pools in ecology and biogeography
Karger, Dirk Nikolaus; Cord, Anna F; Kessler, Michael; Kreft, Holger; Kühn, Ingolf; Pompe, Sven; Sandel, Brody; Sarmento Cabral, Juliano; Smith, Adam B; Svenning, Jens-Christian; Tuomisto, Hanna; Weigelt, Patrick; Wesche, Karsten
2016-01-01
Aim: To provide a mechanistic and probabilistic framework for defining the species pool based on species-specific probabilities of dispersal, environmental suitability and biotic interactions within a specific temporal extent, and to show how probabilistic species pools can help disentangle the geographical structure of different community assembly processes. Innovation: Probabilistic species pools provide an improved species pool definition based on probabilities in conjunction …
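If the three probabilities are treated as independent, a species' probability of belonging to the pool at a site is simply their product, and summing over species gives an expected pool size. A toy sketch with invented numbers (the framework itself need not assume independence):

```python
import numpy as np

# Hypothetical per-species probabilities for one site and one time window.
p_dispersal = np.array([0.90, 0.20, 0.70])    # can the species get there?
p_environment = np.array([0.80, 0.90, 0.10])  # can it tolerate the conditions?
p_biotic = np.array([0.95, 0.50, 0.60])       # do interactions permit it?

# Under independence, pool-membership probability is the product.
p_pool = p_dispersal * p_environment * p_biotic

# Expected number of species in the probabilistic pool at this site.
expected_richness = p_pool.sum()
```

Comparing such maps of p_pool built from each factor alone against the combined product is one way to see which assembly process dominates where.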
Towards quantum gravity: a framework for probabilistic theories with non-fixed causal structure
International Nuclear Information System (INIS)
Hardy, Lucien
2007-01-01
General relativity is a deterministic theory with non-fixed causal structure. Quantum theory is a probabilistic theory with fixed causal structure. In this paper, we build a framework for probabilistic theories with non-fixed causal structure. This combines the radical elements of general relativity and quantum theory. We adopt an operational methodology for the purposes of theory construction (though without committing to operationalism as a fundamental philosophy). The key idea in the construction is physical compression. A physical theory relates quantities. Thus, if we specify a sufficiently large set of quantities (this is the compressed set), we can calculate all the others. We apply three levels of physical compression. First, we apply it locally to quantities (actually probabilities) that might be measured in a particular region of spacetime. Then we consider composite regions. We find that there is a second level of physical compression for a composite region over and above the first level physical compression for the component regions. Each application of first and second level physical compression is quantified by a matrix. We find that these matrices themselves are related by the physical theory and can therefore be subject to compression. This is the third level of physical compression. The third level of physical compression gives rise to a new mathematical object which we call the causaloid. From the causaloid for a particular physical theory we can calculate everything the physical theory can calculate. This approach allows us to set up a framework for calculating probabilistic correlations in data without imposing a fixed causal structure (such as a background time). We show how to put quantum theory in this framework (thus providing a new formulation of this theory). We indicate how general relativity might be put into this framework and how the framework might be used to construct a theory of quantum gravity
Eclipse 2017: Partnering with NASA MSFC to Inspire Students
Fry, Craig "Ghee"; Adams, Mitzi; Gallagher, Dennis; Krause, Linda
2017-01-01
NASA's Marshall Space Flight Center (MSFC) is partnering with the U.S. Space and Rocket Center (USSRC) and Austin Peay State University (APSU) to engage citizen scientists, engineers, and students in science investigations during the 2017 American Solar Eclipse. Investigations will support the Citizen Continental America Telescopic Eclipse (CATE), Ham Radio Science Citizen Investigation (HamSCI), and Interactive NASA Space Physics Ionosphere Radio Experiments (INSPIRE). All planned activities will engage Space Campers and local high school students in the application of the scientific method as they seek to explore a wide range of observations during the eclipse. Where planned experiments touch on current scientific questions, the campers and students will act as citizen scientists, participating with researchers from APSU and MSFC. Participants will test their expectations and, after the eclipse, share their results, experiences, and conclusions with younger Space Campers at the U.S. Space & Rocket Center.
Wireless synapses in bio-inspired neural networks
Jannson, Tomasz; Forrester, Thomas; Degrood, Kevin
2009-05-01
Wireless (virtual) synapses represent a novel approach to bio-inspired neural networks that follow the infrastructure of the biological brain, except that biological (physical) synapses are replaced by virtual ones based on cellular telephony modeling. Such synapses are of two types: intracluster synapses are based on IR wireless ones, while intercluster synapses are based on RF wireless ones. Such synapses have three unique features, atypical of conventional artificial ones: very high parallelism (close to that of the human brain), very high reconfigurability (easy to kill and to create), and very high plasticity (easy to modify or upgrade). In this paper we analyze the general concept of wireless synapses with special emphasis on RF wireless synapses. Also, biological mammalian (vertebrate) neural models are discussed for comparison, and a novel neural lensing effect is discussed in detail.
Nature inspires sensors to do more with less.
Mulvaney, Shawn P; Sheehan, Paul E
2014-10-28
The world is filled with widely varying chemical, physical, and biological stimuli. Over millennia, organisms have refined their senses to cope with these diverse stimuli, becoming virtuosos in differentiating closely related antigens, handling extremes in concentration, resetting the spent sensing mechanisms, and processing the multiple data streams being generated. Nature successfully deals with both repeating and new stimuli, demonstrating great adaptability when confronted with the latter. Interestingly, nature accomplishes these feats using a fairly simple toolbox. The sensors community continues to draw inspiration from nature's example: just look at the antibodies used as biosensor capture agents or the neural networks that process multivariate data streams. Indeed, many successful sensors have been built by simply mimicking natural systems. However, some of the most exciting breakthroughs occur when the community moves beyond mimicking nature and learns to use nature's tools in innovative ways.
Probabilistic analysis of tokamak plasma disruptions
International Nuclear Information System (INIS)
Sanzo, D.L.; Apostolakis, G.E.
1985-01-01
An approximate analytical solution to the heat conduction equations used in modeling component melting and vaporization resulting from plasma disruptions is presented. This solution is then used to propagate uncertainties in the input data characterizing disruptions, namely, energy density and disruption time, to obtain a probabilistic description of the output variables of interest, material melted and vaporized. (orig.)
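The uncertainty-propagation step described here can be imitated with a simple Monte Carlo sketch. The melt model below is a deliberately crude stand-in (the constants and the conduction-loss term are invented for illustration, not the paper's heat-conduction solution); the point is only how randomness in energy density and disruption time yields a distribution of melted material:

```python
import random, statistics

def melt_depth(energy_density, disruption_time):
    """Crude illustrative melt model: energy not lost to conduction goes
    into melting a surface layer. Constants and loss term are hypothetical."""
    rho = 8.0e3                 # density, kg/m^3 (hypothetical)
    c_dt_plus_latent = 1.5e6    # J/kg to heat and melt (hypothetical)
    conduction_loss = 0.3 * energy_density * disruption_time ** 0.5
    absorbed = max(energy_density - conduction_loss, 0.0)
    return absorbed / (rho * c_dt_plus_latent)   # melted thickness, m

random.seed(0)
samples = []
for _ in range(10_000):
    q = random.gauss(2.0e6, 0.4e6)     # energy density J/m^2, assumed normal
    t = random.uniform(5e-3, 20e-3)    # disruption time s, assumed uniform
    samples.append(melt_depth(q, t))

mean_depth = statistics.mean(samples)            # probabilistic output
depth_p95 = sorted(samples)[int(0.95 * len(samples))]
```

The two summary numbers play the role of the "probabilistic description of the output variables" the abstract refers to.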
Strong Ideal Convergence in Probabilistic Metric Spaces
Indian Academy of Sciences (India)
In the present paper we introduce the concepts of strongly ideal convergent sequence and strong ideal Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong ideal limit points and the strong ideal cluster points of a sequence in this ...
Quantum Probabilistic Dyadic Second-Order Logic
Baltag, A.; Bergfeld, J.M.; Kishida, K.; Sack, J.; Smets, S.J.L.; Zhong, S.; Libkin, L.; Kohlenbach, U.; de Queiroz, R.
2013-01-01
We propose an expressive but decidable logic for reasoning about quantum systems. The logic is endowed with tensor operators to capture properties of composite systems, and with probabilistic predication formulas P ≥ r (s), saying that a quantum system in state s will yield the answer ‘yes’ (i.e.
Probabilistic analysis of a materially nonlinear structure
Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.
1990-01-01
A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure, with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
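The CDF computation can be illustrated with a purely elastic Monte Carlo sketch. This is not NESSUS or the AMV procedure: it uses only the Lamé elastic solution (so the plasticity effect the abstract emphasizes is absent), and the geometry and distribution parameters are made up:

```python
import random, bisect

def radial_stress_elastic(p, a, b, r):
    """Lame elastic solution for the radial stress in a thick-walled
    cylinder of inner radius a, outer radius b, under internal pressure p."""
    return (p * a**2 / (b**2 - a**2)) * (1.0 - b**2 / r**2)

random.seed(1)
a, b, r = 0.1, 0.2, 0.15              # geometry in metres (made-up)
# internal pressure in MPa, assumed normally distributed
stresses = sorted(radial_stress_elastic(random.gauss(100.0, 10.0), a, b, r)
                  for _ in range(20_000))

def stress_cdf(x):
    """Empirical CDF of the radial stress (compressive values are negative)."""
    return bisect.bisect_right(stresses, x) / len(stresses)
```

An elastic-plastic analysis would make the stress depend on the (Weibull-distributed) yield stress as well, which is where a code like NESSUS earns its keep.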
Probabilistic Programming : A True Verification Challenge
Katoen, Joost P.; Finkbeiner, Bernd; Pu, Geguang; Zhang, Lijun
2015-01-01
Probabilistic programs [6] are sequential programs, written in languages like C, Java, Scala, or ML, with two added constructs: (1) the ability to draw values at random from probability distributions, and (2) the ability to condition values of variables in a program through observations. For a
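The two added constructs, random draws and conditioning on observations, can be sketched in ordinary Python, with conditioning implemented by rejection sampling (an illustrative toy, not a probabilistic programming language):

```python
import random

def two_coins_given_at_least_one_heads(n=100_000, seed=42):
    """A miniature probabilistic program: draw two fair coins at random,
    then condition on the observation that at least one came up heads.
    Conditioning is implemented by rejecting runs that violate it."""
    rng = random.Random(seed)
    accepted = both = 0
    for _ in range(n):
        c1, c2 = rng.random() < 0.5, rng.random() < 0.5
        if not (c1 or c2):          # observation failed: reject the run
            continue
        accepted += 1
        both += c1 and c2
    return both / accepted          # P(both heads | at least one heads)

estimate = two_coins_given_at_least_one_heads()
```

The exact conditional probability is 1/3, which the estimate approaches; verifying such programs means establishing facts like this without sampling.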
Probabilistic calculation for angular dependence collision
International Nuclear Information System (INIS)
Villarino, E.A.
1990-01-01
This collision probability method is broadly used in cylindrical geometry (in one or two dimensions). It constitutes a powerful tool for the heterogeneous Response Method, where the coupling current is of the cosine type, that is, without dependence on the azimuthal angle θ and proportional to μ (the cosine of the polar angle). (Author) [es]
Probabilistic safety assessment in radioactive waste disposal
International Nuclear Information System (INIS)
Robinson, P.C.
1987-07-01
Probabilistic safety assessment codes are now widely used in radioactive waste disposal assessments. This report gives an overview of the current state of the field. The relationship between the codes and the regulations covering radioactive waste disposal is discussed, and the characteristics of current codes are described. The problems of verification and validation are considered. (author)
Ignorability in Statistical and Probabilistic Inference
DEFF Research Database (Denmark)
Jaeger, Manfred
2005-01-01
When dealing with incomplete data in statistical learning, or incomplete observations in probabilistic inference, one needs to distinguish the fact that a certain event is observed from the fact that the observed event has happened. Since the modeling and computational complexities entailed...
Probabilistic fuzzy systems as additive fuzzy systems
Almeida, R.J.; Verbeek, N.; Kaymak, U.; Costa Sousa, da J.M.; Laurent, A.; Strauss, O.; Bouchon-Meunier, B.; Yager, R.
2014-01-01
Probabilistic fuzzy systems combine a linguistic description of the system behaviour with statistical properties of data. The approach was originally derived from Zadeh's concept of the probability of a fuzzy event. Two possible and equivalent additive reasoning schemes were proposed that lead to the
A Geometric Presentation of Probabilistic Satisfiability
Morales-Luna, Guillermo
2010-01-01
By considering probability distributions over the set of assignments, the expected truth values assigned to propositional variables are extended through linear operators, and the expected truth values of the clauses in any given conjunctive form are likewise extended through linear maps. The probabilistic satisfiability problems are discussed in terms of these linear extensions. The case of multiple truth values is also discussed.
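The linear extension can be made concrete in a few lines: under a distribution over assignments, the expected truth value of a clause is simply the total probability of the assignments that satisfy it. The clause encoding below is a hypothetical choice for illustration, not taken from the paper:

```python
from itertools import product

def expected_clause_values(n_vars, clauses, dist):
    """Extend truth values linearly: the expected value of each clause is
    the total probability mass of assignments satisfying it. A clause is a
    list of literals; literal +i / -i means variable i is true / false."""
    assignments = list(product([False, True], repeat=n_vars))
    def sat(clause, a):
        return any(a[abs(l) - 1] == (l > 0) for l in clause)
    return [sum(p for a, p in zip(assignments, dist) if sat(c, a))
            for c in clauses]

# uniform distribution over the 4 assignments of x1, x2
uniform = [0.25] * 4
vals = expected_clause_values(2, [[1, 2], [1], [-1, -2]], uniform)
```

With the uniform distribution, x1 ∨ x2 gets expected value 0.75, x1 gets 0.5, and ¬x1 ∨ ¬x2 gets 0.75; probabilistic satisfiability asks whether some distribution achieves prescribed expected values.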
Probabilistic studies for a safety assurance program
International Nuclear Information System (INIS)
Iyer, S.S.; Davis, J.F.
1985-01-01
The adequate supply of energy is always a matter of concern for any country. Nuclear power has played, and will continue to play, an important role in supplying this energy. However, safety in nuclear power production is a fundamental prerequisite for fulfilling this role. This paper outlines a program to ensure safe operation of a nuclear power plant utilizing probabilistic safety studies.
Probabilistic safety goals. Phase 3 - Status report
Energy Technology Data Exchange (ETDEWEB)
Holmberg, J.-E. (VTT (Finland)); Knochenhauer, M. (Relcon Scandpower AB, Sundbyberg (Sweden))
2009-07-15
The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)
Safety Verification for Probabilistic Hybrid Systems
DEFF Research Database (Denmark)
Zhang, Lijun; She, Zhikun; Ratschan, Stefan
2010-01-01
The interplay of random phenomena and continuous real-time control deserves increased attention for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification o...... on a number of case studies, tackled using a prototypical implementation....
Ambient Surveillance by Probabilistic-Possibilistic Perception
Bittermann, M.S.; Ciftcioglu, O.
2013-01-01
A method for quantifying ambient surveillance is presented, which is based on probabilistic-possibilistic perception. The human surveillance of a scene through observing camera sensed images on a monitor is modeled in three steps. First immersion of the observer is simulated by modeling perception
HERMES probabilistic risk assessment. Pilot study
International Nuclear Information System (INIS)
Parisot, F.; Munoz, J.
1993-01-01
A study, performed in 1989, of the contribution of probabilistic analysis to the optimal construction of system safety status in the aeronautical and European nuclear industries showed growing trends towards the incorporation of quantitative safety assessment and led to an agreement to undertake a prototype proof study on Hermes. The main steps of the study and its results are presented in the paper.
Some probabilistic properties of fractional point processes
Garra, Roberto; Orsingher, Enzo; Scavino, Marco
2017-01-01
P{T_k(α) < ∞} are explicitly obtained and analyzed. The processes N_f(t) are time-changed Poisson processes N(H_f(t)) with subordinators H_f(t), and here we study N(Σ_{j=1}^n H_{f_j}(t)) and obtain probabilistic features
Strong Statistical Convergence in Probabilistic Metric Spaces
Şençimen, Celaleddin; Pehlivan, Serpil
2008-01-01
In this article, we introduce the concepts of strongly statistically convergent sequence and strong statistically Cauchy sequence in a probabilistic metric (PM) space endowed with the strong topology, and establish some basic facts. Next, we define the strong statistical limit points and the strong statistical cluster points of a sequence in this space and investigate the relations between these concepts.
Effectiveness of Securities with Fuzzy Probabilistic Return
Directory of Open Access Journals (Sweden)
Krzysztof Piasecki
2011-01-01
Full Text Available The generalized fuzzy present value of a security is defined here as fuzzy valued utility of cash flow. The generalized fuzzy present value cannot depend on the value of future cash flow. There exists such a generalized fuzzy present value which is not a fuzzy present value in the sense given by some authors. If the present value is a fuzzy number and the future value is a random one, then the return rate is given as a probabilistic fuzzy subset on a real line. This kind of return rate is called a fuzzy probabilistic return. The main goal of this paper is to derive the family of effective securities with fuzzy probabilistic return. Achieving this goal requires the study of the basic parameters characterizing fuzzy probabilistic return. Therefore, fuzzy expected value and variance are determined for this case of return. These results are a starting point for constructing a three-dimensional image. The set of effective securities is introduced as the Pareto optimal set determined by the maximization of the expected return rate and minimization of the variance. Finally, the set of effective securities is distinguished as a fuzzy set. These results are obtained without the assumption that the distribution of future values is Gaussian. (original abstract)
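The final Pareto step, maximizing expected return while minimizing variance, can be sketched on crisp numbers; the toy universe below is invented, and the fuzzy machinery of the paper is deliberately omitted:

```python
def pareto_efficient(securities):
    """Return the securities not dominated in the (expected return,
    variance) plane: t dominates s if t's mean is >= and variance <=,
    with at least one strict inequality."""
    def dominated(s, t):
        return (t["mean"] >= s["mean"] and t["var"] <= s["var"]
                and (t["mean"] > s["mean"] or t["var"] < s["var"]))
    return [s for s in securities
            if not any(dominated(s, t) for t in securities if t is not s)]

# toy universe with crisp expected returns and variances (made-up values)
universe = [
    {"name": "A", "mean": 0.08, "var": 0.04},
    {"name": "B", "mean": 0.05, "var": 0.01},
    {"name": "C", "mean": 0.05, "var": 0.04},   # dominated by A and B
    {"name": "D", "mean": 0.10, "var": 0.09},
]
efficient = {s["name"] for s in pareto_efficient(universe)}
```

In the paper the means and variances are themselves fuzzy quantities, so the efficient set comes out as a fuzzy set rather than a crisp one as here.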
Dialectical Multivalued Logic and Probabilistic Theory
Directory of Open Access Journals (Sweden)
José Luis Usó Doménech
2017-02-01
Full Text Available There are two probabilistic algebras: one for classical probability and the other for quantum mechanics. Naturally, it is the relation to the object that decides, as in the case of logic, which algebra is to be used. From a paraconsistent multivalued logic therefore, one can derive a probability theory, adding the correspondence between truth value and fortuity.
Revisiting the formal foundation of Probabilistic Databases
Wanders, B.; van Keulen, Maurice
2015-01-01
One of the core problems in soft computing is dealing with uncertainty in data. In this paper, we revisit the formal foundation of a class of probabilistic databases with the purpose to (1) obtain data model independence, (2) separate metadata on uncertainty and probabilities from the raw data, (3)
Probabilistic Resource Analysis by Program Transformation
DEFF Research Database (Denmark)
Kirkeby, Maja Hanne; Rosendahl, Mads
2016-01-01
The aim of a probabilistic resource analysis is to derive a probability distribution of possible resource usage for a program from a probability distribution of its input. We present an automated multi-phase rewriting based method to analyze programs written in a subset of C. It generates...
Application of probabilistic precipitation forecasts from a ...
African Journals Online (AJOL)
Application of probabilistic precipitation forecasts from a deterministic model towards increasing the lead-time of flash flood forecasts in South Africa. ... The procedure is applied to a real flash flood event and the ensemble-based rainfall forecasts are verified against rainfall estimated by the SAFFG system. The approach ...
Probabilistic safety assessment goals in Canada
International Nuclear Information System (INIS)
Snell, V.G.
1986-01-01
CANDU safety philosophy, both in design and in licensing, has always had a strong bias towards quantitative, probabilistically-based goals derived from comparative safety. Formal probabilistic safety assessment began in Canada as a design tool. The influence of this carried over later into the definition of the deterministic safety guidelines used in CANDU licensing. Design goals were further developed which extended the consequence/frequency spectrum of 'acceptable' events, from the two points defined by the deterministic single/dual failure analysis, to a line passing through lower and higher frequencies. Since these were design tools, a complete risk summation was not necessary, allowing a cutoff at low event frequencies while preserving the identification of the most significant safety-related events. These goals gave a logical framework for making decisions on implementing design changes proposed as a result of the Probabilistic Safety Analysis. Performing this analysis became a regulatory requirement, and the design goals remained the framework under which it was submitted. Recently, there have been initiatives to incorporate more detailed probabilistic safety goals into the regulatory process in Canada. These range from far-reaching safety optimization across society to initiatives aimed at the nuclear industry only. The effectiveness of the latter is minor at very low and very high event frequencies; at medium frequencies, a justification against expenditures per life saved in other industries should be part of the goal setting.
Overview of the probabilistic risk assessment approach
International Nuclear Information System (INIS)
Reed, J.W.
1985-01-01
The techniques of probabilistic risk assessment (PRA) are applicable to Department of Energy facilities. The background and techniques of PRA are given with special attention to seismic, wind and flooding external events. A specific application to seismic events is provided to demonstrate the method. However, the PRA framework is applicable also to wind and external flooding. 3 references, 8 figures, 1 table
Probabilistic safety goals. Phase 3 - Status report
International Nuclear Information System (INIS)
Holmberg, J.-E.; Knochenhauer, M.
2009-07-01
The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)
Probabilistic Relational Structures and Their Applications
Domotor, Zoltan
The principal objects of the investigation reported were, first, to study qualitative probability relations on Boolean algebras, and secondly, to describe applications in the theories of probability logic, information, automata, and probabilistic measurement. The main contribution of this work is stated in 10 definitions and 20 theorems. The basic…
Branching bisimulation congruence for probabilistic systems
Andova, S.; Georgievska, S.; Trcka, N.
2012-01-01
A notion of branching bisimilarity for the alternating model of probabilistic systems, compatible with parallel composition, is defined. For a congruence result, an internal transition immediately followed by a non-trivial probability distribution is not considered inert. A weaker definition of
On Probabilistic Automata in Continuous Time
DEFF Research Database (Denmark)
Eisentraut, Christian; Hermanns, Holger; Zhang, Lijun
2010-01-01
We develop a compositional behavioural model that integrates a variation of probabilistic automata into a conservative extension of interactive Markov chains. The model is rich enough to embody the semantics of generalised stochastic Petri nets. We define strong and weak bisimulations and discuss...
Bisimulations Meet PCTL Equivalences for Probabilistic Automata
DEFF Research Database (Denmark)
Song, Lei; Zhang, Lijun; Godskesen, Jens Chr.
2011-01-01
Probabilistic automata (PA) [20] have been successfully applied in the formal verification of concurrent and stochastic systems. Efficient model checking algorithms have been studied, where the most often used logics for expressing properties are based on PCTL [11] and its extension PCTL∗ [4...
Validation of in vitro probabilistic tractography
DEFF Research Database (Denmark)
Dyrby, Tim B.; Sogaard, L.V.; Parker, G.J.
2007-01-01
assessed the anatomical validity and reproducibility of in vitro multi-fiber probabilistic tractography against two invasive tracers: the histochemically detectable biotinylated dextran amine and manganese enhanced magnetic resonance imaging. Post mortern DWI was used to ensure that most of the sources...
Searching Algorithms Implemented on Probabilistic Systolic Arrays
Czech Academy of Sciences Publication Activity Database
Kramosil, Ivan
1996-01-01
Roč. 25, č. 1 (1996), s. 7-45 ISSN 0308-1079 R&D Projects: GA ČR GA201/93/0781 Keywords : searching algorithms * probabilistic algorithms * systolic arrays * parallel algorithms Impact factor: 0.214, year: 1996
Financial Markets Analysis by Probabilistic Fuzzy Modelling
J.H. van den Berg (Jan); W.-M. van den Bergh (Willem-Max); U. Kaymak (Uzay)
2003-01-01
For successful trading in financial markets, it is important to develop financial models in which one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi-Sugeno
Towards decision making via expressive probabilistic ontologies
Acar, Erman; Thorne, Camilo; Stuckenschmidt, Heiner
2015-01-01
We propose a framework for automated multi-attribute decision making, employing the probabilistic non-monotonic description logics proposed by Lukasiewicz in 2008. Using this framework, we can model artificial agents in decision-making
The Probabilistic Nature of Preferential Choice
Rieskamp, Jorg
2008-01-01
Previous research has developed a variety of theories explaining when and why people's decisions under risk deviate from the standard economic view of expected utility maximization. These theories are limited in their predictive accuracy in that they do not explain the probabilistic nature of preferential choice, that is, why an individual makes…
A Probabilistic Framework for Curve Evolution
DEFF Research Database (Denmark)
Dahl, Vedrana Andersen
2017-01-01
approach include ability to handle textured images, simple generalization to multiple regions, and efficiency in computation. We test our probabilistic framework in combination with parametric (snakes) and geometric (level-sets) curves. The experimental results on composed and natural images demonstrate...
Probabilistic Output Analysis by Program Manipulation
DEFF Research Database (Denmark)
Rosendahl, Mads; Kirkeby, Maja Hanne
2015-01-01
The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probability...
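For a deterministic program over a small discrete input domain, the pushforward of the input distribution can be computed exactly by enumeration. The sketch below illustrates the idea only; it is not the paper's transformation-based static analysis:

```python
from collections import defaultdict

def output_distribution(program, input_dist):
    """Push an input probability distribution through a deterministic
    program: each output value accumulates the probability mass of every
    input that maps to it."""
    out = defaultdict(float)
    for x, p in input_dist.items():
        out[program(x)] += p
    return dict(out)

# example program: absolute value, on a small discrete input distribution
dist = output_distribution(abs, {-2: 0.2, -1: 0.3, 0: 0.1, 1: 0.3, 2: 0.1})
```

Static output analysis aims to derive such a distribution symbolically, without enumerating or running the program on every input.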
Improved transformer protection using probabilistic neural network ...
African Journals Online (AJOL)
This article presents a novel technique to distinguish between magnetizing inrush current and internal fault current of power transformer. An algorithm has been developed around the theme of the conventional differential protection method in which parallel combination of Probabilistic Neural Network (PNN) and Power ...
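A PNN in its simplest form is a Parzen-window classifier. The sketch below uses invented two-dimensional current features standing in for inrush versus internal-fault signatures; the article's actual scheme combines the PNN with differential-protection signal processing not shown here:

```python
import math

def pnn_classify(x, training, sigma=0.3):
    """Minimal probabilistic neural network: each class score is the
    average Gaussian kernel between the input and that class's training
    patterns; predict the highest-scoring class."""
    def kernel(a, b):
        d2 = sum((ai - bi) ** 2 for ai, bi in zip(a, b))
        return math.exp(-d2 / (2 * sigma ** 2))
    scores = {label: sum(kernel(x, p) for p in patterns) / len(patterns)
              for label, patterns in training.items()}
    return max(scores, key=scores.get)

# hypothetical feature vectors for the two current types
training = {
    "inrush": [(0.9, 0.1), (0.8, 0.2), (1.0, 0.15)],
    "fault":  [(0.2, 0.9), (0.1, 0.8), (0.25, 1.0)],
}
label = pnn_classify((0.85, 0.12), training)
```

Because the class scores are kernel density estimates, the PNN needs no iterative training, which is part of its appeal in protection relays.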
Financial markets analysis by probabilistic fuzzy modelling
Berg, van den J.; Kaymak, U.; Bergh, van den W.M.
2003-01-01
For successful trading in financial markets, it is important to develop financial models in which one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi-Sugeno (TS)
Probabilistic solution of the Dirac equation
International Nuclear Information System (INIS)
Blanchard, P.; Combe, P.
1985-01-01
Various probabilistic representations of the 2, 3 and 4 dimensional Dirac equation are given in terms of expectation with respect to stochastic jump processes and are used to derive the nonrelativistic limit even in the presence of an external electromagnetic field. (orig.)
Mastering probabilistic graphical models using Python
Ankan, Ankur
2015-01-01
If you are a researcher or a machine learning enthusiast, or are working in the data science field and have a basic idea of Bayesian learning or probabilistic graphical models, this book will help you to understand the details of graphical models and use them in your data science problems.
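As a taste of what such a book covers, here is inference by enumeration on a two-node Bayesian network, implemented directly with no library at all (the probabilities are made up for illustration):

```python
# Two-node Bayesian network Rain -> WetGrass, with made-up probabilities.
p_rain = {True: 0.2, False: 0.8}
p_wet_given_rain = {True: {True: 0.9, False: 0.1},
                    False: {True: 0.2, False: 0.8}}

def joint(rain, wet):
    """Joint probability factorized along the network structure."""
    return p_rain[rain] * p_wet_given_rain[rain][wet]

def p_rain_given_wet(wet=True):
    """Posterior P(Rain | WetGrass = wet) by enumerating the joint."""
    return joint(True, wet) / sum(joint(r, wet) for r in (True, False))

posterior = p_rain_given_wet(True)
```

Observing wet grass raises the probability of rain from the prior 0.2 to about 0.53; graphical-model libraries automate exactly this kind of factorized computation for much larger networks.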
Sustaining Physics Teacher Education Coalition programs in physics teacher education
Rachel E. Scherr; Monica Plisch; Renee Michelle Goertzen
2017-01-01
Understanding the mechanisms of increasing the number of physics teachers educated per year at institutions with thriving physics teacher preparation programs may inspire and support other institutions in building thriving programs of their own. The Physics Teacher Education Coalition (PhysTEC), led by the American Physical Society (APS) and the American Association of Physics Teachers (AAPT), has supported transformation of physics teacher preparation programs at a number of institutions aro...
A probabilistic model for snow avalanche occurrence
Perona, P.; Miescher, A.; Porporato, A.
2009-04-01
Avalanche hazard forecasting is an important issue in relation to the protection of urbanized environments, ski resorts and ski-touring alpinists. A critical point is to predict the conditions that trigger the snow mass instability determining the onset and the size of avalanches. On steep terrain the risk of avalanches is known to be related to preceding consistent snowfall events and to subsequent changes in the local climatic conditions. Regression analysis has shown that avalanche occurrence indeed correlates with the amount of snow fallen over three consecutive snowing days and with the state of the settled snow at the ground. Moreover, since different types of avalanches may occur as a result of the interactions of different factors, the process of snow avalanche formation is inherently complex and has some degree of unpredictability. For this reason, although several models assess the risk of avalanche by accounting for all the involved processes in great detail, a high margin of uncertainty invariably remains. In this work, we explicitly describe such unpredictable behaviour with an intrinsic noise affecting the processes leading to snow instability. This sets the basis for a minimalist stochastic model, which allows us to investigate the avalanche dynamics and its statistical properties. We employ a continuous-time process with stochastic jumps (snowfalls), deterministic decay (snowmelt and compaction) and state-dependent avalanche occurrence (renewals) as a minimalist model for the determination of avalanche size and related inter-occurrence times. The physics leading to avalanches is simplified to the extent where only meteorological data and terrain data are necessary to estimate avalanche danger. We explore the analytical formulation of the process and the properties of the probability density function of the avalanche process variables. We also discuss the probabilistic link between avalanche size and the preceding snowfall event and
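The jump-decay-renewal process described above can be simulated in a few lines. This discrete-time sketch and all of its parameter values are invented for illustration, not the paper's calibrated model:

```python
import random

def simulate_snowpack(days=10_000, p_snow=0.1, decay=0.02,
                      threshold=3.0, seed=3):
    """Minimalist jump process: random snowfall jumps, deterministic
    exponential decay (melt and compaction), and an avalanche 'renewal'
    whenever the snowpack depth exceeds a threshold."""
    rng = random.Random(seed)
    depth, sizes = 0.0, []
    for _ in range(days):
        depth *= (1.0 - decay)               # snowmelt and compaction
        if rng.random() < p_snow:
            depth += rng.expovariate(1.0)    # stochastic snowfall jump
        if depth > threshold:                # instability: avalanche release
            sizes.append(depth)
            depth = 0.0                      # renewal
    return sizes

avalanche_sizes = simulate_snowpack()
```

The recorded sizes and the gaps between releases are the quantities whose probability density functions the paper studies analytically.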
A probabilistic strategy for parametric catastrophe insurance
Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin
2017-04-01
Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex-ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, or at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in said programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. The resulting model is able, for any given day, to issue probabilities of occurrence of loss
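The logistic trigger can be sketched as follows; the coefficients, the loss-index scale, and the payout rule are all hypothetical, not the calibrated model of the study:

```python
import math

def loss_probability(loss_index, w0=-8.0, w1=0.1):
    """Logistic model of the probability that a rainfall-derived loss
    index corresponds to an actual loss event (coefficients invented)."""
    return 1.0 / (1.0 + math.exp(-(w0 + w1 * loss_index)))

def expected_payout(loss_index, limit=1_000_000):
    """Probabilistic trigger: scale the payout by the modelled probability
    of a loss event rather than applying a hard yes/no threshold."""
    return limit * loss_probability(loss_index)
```

Paying in proportion to the modelled probability, instead of all-or-nothing at a fixed parameter threshold, is what reduces basis risk near the trigger boundary.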
Directory of Open Access Journals (Sweden)
Edwin Eduardo Millán Rojas
2018-02-01
Full Text Available Context: Management of care for the environment and the Earth (geo) can be a source of inspiration for developing models that address complexity issues; the objective of this research was to develop an additional aspect of such inspired models. The geoinspired model has two features: the first covers aspects related to environmental management and the behavior of natural resources, and the second has a spatial-location component associated with objects existing on the Earth's surface. Method: The approach developed in the research is descriptive, and its main objective is the representation or characterization of a case study within a particular context. Results: The result was the design of a model to emulate the natural behavior of the water tributaries of the Amazon foothills, in order to extend the application of inspired models and allow the use of elements such as geo-referencing and environmental management. The proposed geoinspired model is called “natural vector agents inspired by environmental management”. Conclusions: The natural vector agents inspired by environmental management are polyform elements that can assume the behavior of environmental entities, which makes it possible to achieve progress in other fields of environmental management (use of soil, climate, flora, fauna), and to link environmental issues with the structure of the proposed model.
Feeling Is Believing: Inspiration Encourages Belief in God.
Critcher, Clayton R; Lee, Chan Jean
2018-05-01
Even without direct evidence of God's existence, about half of the world's population believes in God. Although previous research has found that people arrive at such beliefs intuitively rather than analytically, relatively little research has aimed to understand what experiences encourage or legitimate theistic belief systems. Using cross-cultural correlational and experimental methods, we investigated whether the experience of inspiration encourages belief in God. Participants who dispositionally experience more inspiration, who were randomly assigned to relive or have an inspirational experience, or who reported such experiences to be more inspirational all showed stronger belief in God. These effects were specific to inspiration (rather than adjacent affective experiences) and to belief in God (rather than other empirically unverifiable claims). Being inspired by someone or something (but not inspired to do something) offers a spiritually transcendent experience that elevates belief in God, in part because it makes people feel connected to something beyond themselves.
Probabilistic application of fracture mechanics
International Nuclear Information System (INIS)
Dufresne, J.
1981-04-01
The different methods used to evaluate the rupture probability of a pressure vessel are reviewed. Data collection and processing of all parameters necessary for a fracture mechanics evaluation are presented, with particular attention to the size distribution of defects in actual vessels. The physical process is followed during crack growth and unstable propagation, using LEFM (Linear Elastic Fracture Mechanics) and plastic instability. Results show that the final failure probability for a PWR pressure vessel is 3.5 × 10⁻⁸, and is due essentially to LOCAs for any break size. The weakest point is the internal side of the belt line.
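The failure probability quoted above comes from far more detailed vessel-specific models, but the overall structure of such a calculation can be sketched as a Monte Carlo over a defect-size distribution combined with an LEFM failure criterion. Every numerical value below (stress, mean crack depth, toughness distribution) is invented for illustration and is not taken from the study.

```python
import math
import random

def failure_probability(n_samples=200_000, stress_mpa=300.0,
                        mean_crack_mm=10.0, kic_mean=150.0, kic_sd=15.0):
    """Crude Monte Carlo LEFM sketch: a crack of depth a fails when the
    stress intensity K_I = stress * sqrt(pi * a) exceeds the sampled
    fracture toughness K_IC. All parameters are hypothetical."""
    random.seed(42)
    failures = 0
    for _ in range(n_samples):
        a = random.expovariate(1.0 / mean_crack_mm) / 1000.0  # depth in m
        kic = random.gauss(kic_mean, kic_sd)                  # MPa*sqrt(m)
        k1 = stress_mpa * math.sqrt(math.pi * a)
        if k1 > kic:
            failures += 1
    return failures / n_samples

p_fail = failure_probability()
```

Because only the deep tail of the defect-size distribution can fail, the estimate is small, which mirrors why the study's defect-size statistics for actual vessels matter so much: the answer is dominated by rare large flaws.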
International Nuclear Information System (INIS)
Zio, Enrico
2014-01-01
Highlights: • IDPSA contributes to robust risk-informed decision making in nuclear safety. • IDPSA considers time-dependent interactions among component failures and system processes. • IDPSA also considers time-dependent interactions among control and operator actions. • Computational efficiency is achieved by advanced Monte Carlo and meta-modelling simulations. • Efficient post-processing of IDPSA output by clustering and data mining. - Abstract: Integrated deterministic and probabilistic safety assessment (IDPSA) is conceived as a way to analyze the evolution of accident scenarios in complex dynamic systems, like nuclear, aerospace and process ones, accounting for the mutual interactions between the failure and recovery of system components, the evolving physical processes, the control and operator actions, and the software and firmware. In spite of the potential offered by IDPSA, several challenges need to be effectively addressed for its development and practical deployment. In this paper, we give an overview of these challenges and discuss the related implications in terms of research perspectives.
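The coupling of stochastic failure timing with deterministic process dynamics that IDPSA addresses can be illustrated with a toy Monte Carlo. The tank dynamics, the valve failure rate, the operator-delay distribution, and the damage threshold below are all invented for illustration; a real IDPSA analysis would replace them with plant models and validated reliability data.

```python
import random

def run_scenario(rng, dt=0.1, horizon=50.0):
    """One toy IDPSA-style run: deterministic level dynamics coupled with a
    stochastic valve failure time and a stochastic operator response delay."""
    level = 0.0
    fail_t = rng.expovariate(1.0 / 20.0)   # valve sticks open (mean 20 h)
    repair_delay = rng.uniform(2.0, 10.0)  # operator reaction time, h
    t = 0.0
    while t < horizon:
        # inflow only while the valve is stuck open and not yet repaired
        inflow = 1.0 if fail_t <= t < fail_t + repair_delay else 0.0
        level += (inflow - 0.05 * level) * dt  # leak proportional to level
        if level > 6.0:                        # overflow = system damage
            return True
        t += dt
    return False

rng = random.Random(1)
runs = 5_000
p_damage = sum(run_scenario(rng) for _ in range(runs)) / runs
```

The point of the sketch is that the outcome depends jointly on *when* the failure occurs and *how fast* the operator reacts relative to the process time constant — exactly the time-dependent interaction that static event trees cannot capture.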
Need for a probabilistic fire analysis at nuclear power plants
International Nuclear Information System (INIS)
Calabuig Beneyto, J. L.; Ibanez Aparicio, J.
1993-01-01
Although fire protection standards for nuclear power plants cover a wide scope and are constantly being updated, certain constraints make it difficult to precisely evaluate plant response to the different postulated fires. These constraints involve limitations such as: - physical obstacles which impede the implementation of standards in certain cases; - the absence of general standards covering all the situations which could arise in practice; - possible temporary noncompliance with safety measures owing to unforeseen circumstances; - the fact that a fire protection standard cannot possibly take into account additional damage occurring simultaneously with the fire. Based on the experience of the ASCO NPP PSA developed within the framework of the joint venture INITEC-INYPSA-EMPRESARIOS AGRUPADOS, this paper seeks to justify the need for a probabilistic analysis to overcome the limitations detected in the general application of prevailing standards. (author)
ZERO: Probabilistic Routing for Deploy and Forget Wireless Sensor Networks
Directory of Open Access Journals (Sweden)
Jose Carlos Pacho
2010-09-01
Full Text Available As Wireless Sensor Networks are being adopted by industry and agriculture for large-scale and unattended deployments, the need for reliable and energy-conserving protocols becomes critical. Energy-conservation efforts at the physical and link layers are mostly not considered by routing protocols, which concentrate on maintaining reliability and throughput. Gradient-based routing protocols route data through the most reliable links, aiming to ensure 99% packet delivery. However, they suffer from the so-called "hot spot" problem: the most reliable routes drain their energy quickly, partitioning the network and reducing the area monitored. To cope with this "hot spot" problem we propose ZERO, a combined approach at the network and link layers that increases network lifespan while preserving reliability levels by means of probabilistic load balancing techniques.
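A minimal sketch of the probabilistic load-balancing idea (not the actual ZERO protocol, whose mechanisms are not detailed in this abstract) is to weight each candidate relay by link reliability times residual energy and then sample the next hop proportionally, so the single best link is not drained first. The neighbor table and its values are hypothetical.

```python
import random

def pick_next_hop(neighbors, rng):
    """Sample a relay with probability proportional to
    link reliability * remaining energy (weighted roulette wheel)."""
    weights = [n["reliability"] * n["energy"] for n in neighbors]
    total = sum(weights)
    r = rng.random() * total
    acc = 0.0
    for node, w in zip(neighbors, weights):
        acc += w
        if r <= acc:
            return node["id"]
    return neighbors[-1]["id"]  # numerical safety net

rng = random.Random(7)
neighbors = [
    {"id": "A", "reliability": 0.99, "energy": 0.2},  # best link, low battery
    {"id": "B", "reliability": 0.90, "energy": 0.9},
    {"id": "C", "reliability": 0.80, "energy": 0.8},
]
picks = [pick_next_hop(neighbors, rng) for _ in range(10_000)]
```

A purely gradient-based scheme would always choose node A and exhaust it; the weighted sampling shifts most traffic to the well-charged B and C while still using A occasionally, trading a little per-hop reliability for network lifetime.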
Probabilistic versus deterministic hazard assessment in liquefaction susceptible zones
Daminelli, Rosastella; Gerosa, Daniele; Marcellini, Alberto; Tento, Alberto
2015-04-01
Probabilistic seismic hazard assessment (PSHA), usually adopted in the drafting of seismic codes, is based on a Poissonian description of temporal occurrence, a negative exponential distribution of magnitude, and an attenuation relationship with a log-normal distribution of PGA or response spectrum. The main strength of this approach is that it is presently the standard in the majority of countries, but it has weak points, in particular regarding the physical description of the earthquake phenomenon. Factors that can significantly influence the expected motion at a site, such as site effects and source characteristics like strong-motion duration and directivity, are not taken into account by PSHA. Deterministic models can better evaluate the ground motion at a site from a physical point of view, but their predictive reliability depends on the degree of knowledge of the source, wave propagation and soil parameters. We compare these two approaches at selected sites affected by the May 2012 Emilia-Romagna and Lombardia earthquake, which caused widespread liquefaction phenomena, unusual for a magnitude below 6. We focus on sites that are liquefiable because of their soil mechanical parameters and water-table level. Our analysis shows that the choice between deterministic and probabilistic hazard analysis is strongly dependent on site conditions: the looser the soil and the higher the liquefaction potential, the more suitable the deterministic approach. Source characteristics, in particular the duration of strong ground motion, have long been recognized as relevant to inducing liquefaction; unfortunately, a quantitative prediction of these parameters appears very unlikely, dramatically reducing the possibility of their adoption in hazard assessment. Last but not least, economic factors are relevant in the choice of approach. The case history of the 2012 Emilia-Romagna and Lombardia earthquake, with an officially estimated cost of 6 billions
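For a single site and ground-motion level, the Poissonian occurrence model underlying PSHA reduces to a simple exceedance formula, P = 1 − exp(−λt), where λ is the annual exceedance rate and t the exposure time. The 475-year return period used below is the conventional "10% in 50 years" design level commonly adopted in seismic codes, not a value computed in this study.

```python
import math

def exceedance_probability(annual_rate, years):
    """Poissonian occurrence model of PSHA: probability of at least one
    ground-motion exceedance during an exposure time of `years`."""
    return 1.0 - math.exp(-annual_rate * years)

# A return period of ~475 years corresponds to the familiar
# "10% probability of exceedance in 50 years" code convention.
p = exceedance_probability(1.0 / 475.0, 50)
```

This compactness is both the strength and the weakness the abstract describes: the formula is transparent and standardized, but nothing in it carries site effects, strong-motion duration, or directivity.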