WorldWideScience

Sample records for well-known probabilistic risk

  1. Implications of probabilistic risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Cullingford, M.C.; Shah, S.M.; Gittus, J.H. (eds.)

    1987-01-01

    Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessments for specific reactor types or components and specific risks (e.g., aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.).
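
    To make the quantification concrete, the following minimal event-tree sketch (ours, not the proceedings'; the initiator frequency and branch probabilities are invented placeholders) propagates an initiating-event frequency through success/failure branch probabilities to obtain accident-sequence frequencies.

```python
# Minimal event-tree quantification sketch (illustrative numbers only).
INITIATOR_FREQ = 1.0e-2      # initiating events per plant-year (placeholder)
P_FAIL_COOLING = 1.0e-3      # branch failure probabilities (placeholders)
P_FAIL_CONTAINMENT = 1.0e-2

sequences = {
    # (cooling fails?, containment fails?): outcome label
    (False, False): "ok",
    (True, False): "core damage, contained",
    (True, True): "core damage, release",
}

for (cool_fails, cont_fails), outcome in sequences.items():
    p = P_FAIL_COOLING if cool_fails else 1 - P_FAIL_COOLING
    if cool_fails:  # containment is only challenged after core damage
        p *= P_FAIL_CONTAINMENT if cont_fails else 1 - P_FAIL_CONTAINMENT
    print(f"{outcome:25s} frequency = {INITIATOR_FREQ * p:.2e} /yr")
```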

  2. Review of the chronic exposure pathways models in MACCS (MELCOR Accident Consequence Code System) and several other well-known probabilistic risk assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Tveten, U. (Institutt for Energiteknikk, Kjeller (Norway))

    1990-06-01

    The purpose of this report is to document the results of the work performed by the author in connection with the following task, performed for the US Nuclear Regulatory Commission (USNRC), Office of Nuclear Regulatory Research, Division of Systems Research: MACCS Chronic Exposure Pathway Models: Review the chronic exposure pathway models implemented in the MELCOR Accident Consequence Code System (MACCS) and compare those models to the chronic exposure pathway models implemented in similar codes developed in countries that are members of the OECD. The chronic exposures concerned are via: the terrestrial food pathways, the water pathways, the long-term groundshine pathway, and the inhalation of resuspended radionuclides pathway. The USNRC has indicated during discussions of the task that the major effort should be spent on the terrestrial food pathways. There is one chapter for each of the categories of chronic exposure pathways listed above.

  3. Dynamical systems probabilistic risk assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R.; Ames, Arlo Leroy

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile and the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time-horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.
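
    The slowly developing wear-out effects discussed above are commonly modelled with an increasing hazard rate. The sketch below is our illustration, not the authors' code; the Weibull parameters are placeholders chosen only to show how a wear-out model departs from the constant-rate assumption of a static PRA over a multi-decade horizon.

```python
import math

# Compare a constant-hazard (exponential) model with a wear-out
# (Weibull, shape > 1) model over a multi-decade horizon.
# Parameters are placeholders, not fitted to any plant data.
RATE = 1.0e-3             # exponential failure rate, per year
SHAPE, SCALE = 2.5, 80.0  # Weibull shape/scale (wear-out), years

def p_fail_exponential(t):
    return 1.0 - math.exp(-RATE * t)

def p_fail_weibull(t):
    return 1.0 - math.exp(-((t / SCALE) ** SHAPE))

for years in (1, 10, 20, 40, 60):
    print(f"t={years:3d} yr  exp={p_fail_exponential(years):.4f}  "
          f"weibull={p_fail_weibull(years):.4f}")
```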

  4. Exploration Health Risks: Probabilistic Risk Assessment

    Science.gov (United States)

    Rhatigan, Jennifer; Charles, John; Hayes, Judith; Wren, Kiley

    2006-01-01

    Maintenance of human health on long-duration exploration missions is a primary challenge to mission designers. Indeed, human health risks are currently the largest contributors to the risks of evacuation or loss of the crew on long-duration International Space Station missions. We describe a quantitative assessment of the relative probabilities of occurrence of the individual risks to human safety and efficiency during space flight to augment the qualitative assessments used in this field to date. Quantitative probabilistic risk assessments will allow program managers to focus resources on those human health risks most likely to occur with undesirable consequences. Truly quantitative assessments are common, even expected, in the engineering and actuarial spheres, but that capability is just emerging in some arenas of life sciences research, such as identifying and minimizing the hazards to astronauts during future space exploration missions. Our expectation is that these results can be used to inform NASA mission design trade studies in the near future with the objective of preventing the highest among the human health risks. We identify and discuss statistical techniques to provide this risk quantification based on relevant sets of astronaut biomedical data from short- and long-duration space flights as well as relevant analog populations. We outline critical assumptions made in the calculations and discuss the rationale for these. Our efforts to date have focused on quantifying the probabilities of medical risks that are qualitatively perceived as relatively high: radiation sickness, cardiac dysrhythmias, medically significant renal stone formation due to increased calcium mobilization, decompression sickness as a result of EVA (extravehicular activity), and bone fracture due to loss of bone mineral density. We present these quantitative probabilities in order-of-magnitude comparison format so that relative risk can be gauged. We address the effects of
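
    One simple way to produce such order-of-magnitude comparisons is a Poisson model of per-mission event probabilities. The sketch below is illustrative only; the incidence rates are invented placeholders, not astronaut biomedical data.

```python
import math

# Order-of-magnitude comparison of per-mission medical event
# probabilities from assumed incidence rates (events per person-year).
# All rates below are invented placeholders, NOT astronaut data.
MISSION_YEARS = 0.5   # six-month mission
CREW = 6

incidence = {  # events per person-year (placeholders)
    "renal stone": 1e-2,
    "cardiac dysrhythmia": 3e-3,
    "bone fracture": 1e-3,
    "decompression sickness": 5e-4,
}

for condition, rate in incidence.items():
    # Poisson model: P(at least one event among the whole crew)
    p = 1.0 - math.exp(-rate * MISSION_YEARS * CREW)
    print(f"{condition:25s} ~1e{round(math.log10(p))}  (p={p:.2e})")
```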

  5. Probabilistic risk analysis and terrorism risk.

    Science.gov (United States)

    Ezell, Barry Charles; Bennett, Steven P; von Winterfeldt, Detlof; Sokolowski, John; Collins, Andrew J

    2010-04-01

    Since the terrorist attacks of September 11, 2001, and the subsequent establishment of the U.S. Department of Homeland Security (DHS), considerable efforts have been made to estimate the risks of terrorism and the cost effectiveness of security policies to reduce these risks. DHS, industry, and the academic risk analysis communities have all invested heavily in the development of tools and approaches that can assist decisionmakers in effectively allocating limited resources across the vast array of potential investments that could mitigate risks from terrorism and other threats to the homeland. Decisionmakers demand models, analyses, and decision support that are useful for this task and based on the state of the art. Since terrorism risk analysis is new, no single method is likely to meet this challenge. In this article we explore a number of existing and potential approaches for terrorism risk analysis, focusing particularly on recent discussions regarding the applicability of probabilistic and decision analytic approaches to bioterrorism risks and the Bioterrorism Risk Assessment methodology used by the DHS and criticized by the National Academies and others.

  6. Asteroid Risk Assessment: A Probabilistic Approach.

    Science.gov (United States)

    Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth

    2016-02-01

    Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability-but not the consequences-of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. © 2015 Society for Risk Analysis.

  7. Probabilistic risk assessment of disassembly procedures

    Energy Technology Data Exchange (ETDEWEB)

    O'Brien, D.A.; Bement, T.R.; Letellier, B.C.

    1993-11-01

    The purpose of this report is to describe the use of Probabilistic Risk (Safety) Assessment (PRA or PSA) at a Department of Energy (DOE) facility. PRA is a methodology for (i) identifying combinations of events that, if they occur, lead to accidents, (ii) estimating the frequency of occurrence of each combination of events, and (iii) estimating the consequences of each accident. Specifically, the study focused on evaluating the risks associated with disassembling a hazardous assembly. The PRA for the operation included a detailed evaluation only for those potential accident sequences which could lead to significant off-site consequences and affect public health. The overall purpose of this study was to investigate the feasibility of establishing a risk-consequence goal for DOE operations.

  8. Performing Probabilistic Risk Assessment Through RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. Kinoshita

    2013-06-01

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that provides several functionalities: (1) deriving and actuating the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring and controlling in the phase space; (2) performing both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and (3) facilitating input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  9. An Overview of Well-known Mark Protection in China

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The well-known mark, as a topic that has drawn increasing attention and been frequently covered in the mass media, is a trademark that has relatively high renown and repute, whose role is by no means limited to distinguishing the origin of one's goods and services from those of another, and which has become a sharp edge for market competition.

  10. A Probabilistic Asteroid Impact Risk Model

    Science.gov (United States)

    Mathias, Donovan L.; Wheeler, Lorien F.; Dotson, Jessie L.

    2016-01-01

    Asteroid threat assessment requires the quantification of both the impact likelihood and resulting consequence across the range of possible events. This paper presents a probabilistic asteroid impact risk (PAIR) assessment model developed for this purpose. The model incorporates published impact frequency rates with state-of-the-art consequence assessment tools, applied within a Monte Carlo framework that generates sets of impact scenarios from uncertain parameter distributions. Explicit treatment of atmospheric entry is included to produce energy deposition rates that account for the effects of thermal ablation and object fragmentation. These energy deposition rates are used to model the resulting ground damage, and affected populations are computed for the sampled impact locations. The results for each scenario are aggregated into a distribution of potential outcomes that reflect the range of uncertain impact parameters, population densities, and strike probabilities. As an illustration of the utility of the PAIR model, the results are used to address the question of what minimum size asteroid constitutes a threat to the population. To answer this question, complete distributions of results are combined with a hypothetical risk tolerance posture to provide the minimum size, given sets of initial assumptions. Model outputs demonstrate how such questions can be answered and provide a means for interpreting the effect that input assumptions and uncertainty can have on final risk-based decisions. Model results can be used to prioritize investments to gain knowledge in critical areas or, conversely, to identify areas where additional data has little effect on the metrics of interest.
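
    A toy Monte Carlo in the spirit of the PAIR framework is sketched below; the size-frequency distribution, damage scaling, and population-density inputs are crude stand-ins we invented, not the model's actual inputs.

```python
import random, math

# Toy probabilistic impact-risk Monte Carlo: sample an impactor
# diameter, convert it to a blast-damage radius, and sample a local
# population density. All relations and constants are illustrative
# stand-ins, not the PAIR model's actual distributions.
random.seed(1)
N = 100_000
casualties = []
for _ in range(N):
    diameter_m = random.paretovariate(2.0) * 10.0      # size-frequency stand-in
    damage_radius_km = 0.05 * diameter_m ** 0.8        # placeholder scaling law
    density = random.lognormvariate(2.0, 2.0)          # people per km^2
    casualties.append(math.pi * damage_radius_km**2 * density)

casualties.sort()
for q in (0.50, 0.90, 0.99, 0.999):
    print(f"P{q*100:5.1f} casualties per impact: {casualties[int(q*N)]:,.0f}")
```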

  11. Augmenting Probabilistic Risk Assessment with Malevolent Initiators

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith; David Schwieder

    2011-11-01

    As commonly practiced, the use of probabilistic risk assessment (PRA) in nuclear power plants only considers accident initiators such as natural hazards, equipment failures, and human error. Malevolent initiators are ignored in PRA, but are considered the domain of physical security, which uses vulnerability assessment based on an officially specified threat (design basis threat). This paper explores the implications of augmenting and extending existing PRA models by considering new and modified scenarios resulting from malevolent initiators. Teaming the augmented PRA models with conventional vulnerability assessments can cost-effectively enhance security of a nuclear power plant. This methodology is useful for operating plants, as well as in the design of new plants. For the methodology, we have proposed an approach that builds on and extends the practice of PRA for nuclear power plants for security-related issues. Rather than only considering 'random' failures, we demonstrated a framework that is able to represent and model malevolent initiating events and associated plant impacts.

  12. How probabilistic risk assessment can mislead terrorism risk analysts.

    Science.gov (United States)

    Brown, Gerald G; Cox, Louis Anthony Tony

    2011-02-01

    Traditional probabilistic risk assessment (PRA), of the type originally developed for engineered systems, is still proposed for terrorism risk analysis. We show that such PRA applications are unjustified in general. The capacity of terrorists to seek and use information and to actively research different attack options before deciding what to do raises unique features of terrorism risk assessment that are not adequately addressed by conventional PRA for natural and engineered systems-in part because decisions based on such PRA estimates do not adequately hedge against the different probabilities that attackers may eventually act upon. These probabilities may differ from the defender's (even if the defender's experts are thoroughly trained, well calibrated, unbiased probability assessors) because they may be conditioned on different information. We illustrate the fundamental differences between PRA and terrorism risk analysis, and suggest use of robust decision analysis for risk management when attackers may know more about some attack options than we do.

  13. Probabilistic economic frameworks for disaster risk management

    Science.gov (United States)

    Dulac, Guillaume; Forni, Marc

    2013-04-01

    … range from simple elicitation of data from a subject matter expert to calibrate a probability distribution to more advanced stochastic modelling. This approach amounts to a proficiency in the language of uncertainty rather than modelling per se, in the sense that it allows the flexibility to adapt to a given context. In a real decision-making context, one seldom has either the time or the budget to investigate all of these variables thoroughly, hence the importance of being able to prioritize the level of effort among them. Under the proposed framework, this can be done in an optimised fashion. The point here consists in applying probabilistic sensitivity analysis together with the fundamentals of the economic value of information; the framework as built is well suited to such considerations, and variables can be ranked according to their contribution to risk understanding. Efforts to deal with second-order uncertainties on variables prove to be valuable when dealing with the economic value of sample information.
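
    The economic value of information mentioned above can be illustrated with a minimal expected-value-of-perfect-information (EVPI) calculation; the decision problem and all figures below are invented placeholders.

```python
import random

# Minimal expected value of perfect information (EVPI) sketch.
# Decision: invest in mitigation (fixed cost, cuts losses by 80%)
# or not; the uncertain variable is the disaster loss.
random.seed(0)
N = 100_000
MITIGATION_COST = 40.0
losses = [random.lognormvariate(3.0, 1.0) for _ in range(N)]  # uncertain loss

def expected_cost(mitigate, sample):
    return sum((MITIGATION_COST + 0.2 * l) if mitigate else l
               for l in sample) / len(sample)

# Best single decision under current uncertainty:
cost_now = min(expected_cost(True, losses), expected_cost(False, losses))

# With perfect information we could decide scenario by scenario:
cost_perfect = sum(min(MITIGATION_COST + 0.2 * l, l) for l in losses) / N

print(f"EVPI = {cost_now - cost_perfect:.2f}")  # max worth paying for study
```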

  14. Risk analysis of analytical validations by probabilistic modification of FMEA.

    Science.gov (United States)

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of a traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are re-interpreted by this probabilistic modification. Using this approach, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and for the full analytical procedure.
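
    A minimal sketch of this modification, under our reading of the abstract and with invented failure modes and frequencies, is shown below: occurrence and detection become relative frequencies, so the frequency of an undetected failure is p_occur * (1 - p_detect), while severity keeps its categorical score.

```python
# Probabilistic FMEA sketch (invented example data).
failure_modes = [
    # (name, p_occur per run, p_detect given occurrence, severity 1-10)
    ("wrong sample placement", 0.02, 0.90, 7),
    ("spectral library outdated", 0.005, 0.50, 9),
    ("operator skips blank check", 0.01, 0.99, 4),
]

total_undetected = 0.0
for name, p_occ, p_det, severity in failure_modes:
    p_undetected = p_occ * (1.0 - p_det)
    total_undetected += p_undetected
    print(f"{name:28s} p(undetected)={p_undetected:.2e}  severity={severity}")

# Approximate frequency of any undetected failure in the full procedure
# (rare-event sum; assumes independent failure modes):
print(f"procedure-level p(undetected failure) ~ {total_undetected:.2e}")
```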

  15. Regulating by the Numbers: Probabilistic Risk Assessment and Nuclear Power.

    Science.gov (United States)

    Nichols, Elizabeth; Wildavsky, Aaron

    1988-01-01

    Probabilistic risk assessment has been promoted within the Nuclear Regulatory Commission as a means of judging risk to the public and of determining regulatory measures. Interviews with engineers and other technically trained personnel reveal the difficulties created by expectations that this form of assessment should be applied. (TJH)

  16. 77 FR 58590 - Determining Technical Adequacy of Probabilistic Risk Assessment for Risk-Informed License...

    Science.gov (United States)

    2012-09-21

    ... COMMISSION Determining Technical Adequacy of Probabilistic Risk Assessment for Risk-Informed License... NUREG-0800, ``Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power Plants: LWR Edition,'' Section 19.1, ``Determining the Technical Adequacy of Probabilistic Risk Assessment...

  17. Review of the Diablo Canyon probabilistic risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P. [Sandia National Lab., Albuquerque, NM (United States); Sabek, M.G. [Atomic Energy Authority, Nuclear Regulatory and Safety Center, Cairo (Egypt); Ravindra, M.K.; Johnson, J.J. [EQE Engineering, San Francisco, CA (United States)

    1994-08-01

    This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed by Brookhaven National Laboratory under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC. The DCPRA is a full-scope Level 1 effort, and although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.

  18. Toward Probabilistic Risk Analyses - Development of a Probabilistic Tsunami Hazard Assessment of Crescent City, CA

    Science.gov (United States)

    González, F. I.; Leveque, R. J.; Hatheway, D.; Metzger, N.

    2011-12-01

    Risk is defined in many ways, but most are consistent with Crichton's [1999] definition based on the "risk triangle" concept and the explicit identification of three risk elements: "Risk is the probability of a loss, and this depends on three elements: hazard, vulnerability, and exposure. If any of these three elements in risk increases or decreases, then the risk increases or decreases respectively." The World Meteorological Organization, for example, cites Crichton [1999] and then defines risk as [WMO, 2008] Risk = function (Hazard x Vulnerability x Exposure), while the Asian Disaster Reduction Center adopts the more general expression [ADRC, 2005] Risk = function (Hazard, Vulnerability, Exposure). In practice, probabilistic concepts are invariably invoked, and at least one of the three factors is specified as probabilistic in nature. The Vulnerability and Exposure factors are defined in multiple ways in the relevant literature; but the Hazard factor, which is the focus of our presentation, is generally understood to deal only with the physical aspects of the phenomena and, in particular, the ability of the phenomena to inflict harm [Thywissen, 2006]. A Hazard factor can be estimated by a methodology known as Probabilistic Tsunami Hazard Assessment (PTHA) [González, et al., 2009]. We will describe the PTHA methodology and provide an example -- the results of a previous application to Seaside, OR. We will also present preliminary results for a PTHA of Crescent City, CA -- a pilot project and coastal modeling/mapping effort funded by the Federal Emergency Management Agency (FEMA) Region IX office as part of the new California Coastal Analysis and Mapping Project (CCAMP). CCAMP and the PTHA in Crescent City are being conducted under the nationwide FEMA Risk Mapping, Assessment, and Planning (Risk MAP) Program, which focuses on providing communities with flood information and tools they can use to enhance their mitigation plans and better protect their citizens.
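
    Taking the WMO form literally as a product gives a one-line worked example of the decomposition; all inputs below are invented for illustration.

```python
# Worked example of Risk = f(Hazard, Vulnerability, Exposure), taken
# here as a simple product yielding an expected annual loss.
# All inputs are invented placeholders.
hazard = 0.01        # annual probability of an inundating tsunami
vulnerability = 0.4  # expected damage fraction given inundation
exposure = 250e6     # replacement value of assets in the zone, dollars

expected_annual_loss = hazard * vulnerability * exposure
print(f"Expected annual loss: ${expected_annual_loss:,.0f}")  # $1,000,000
```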

  19. Developing Probabilistic Safety Performance Margins for Unknown and Underappreciated Risks

    Science.gov (United States)

    Benjamin, Allan; Dezfuli, Homayoon; Everett, Chris

    2015-01-01

    Probabilistic safety requirements currently formulated or proposed for space systems, nuclear reactor systems, nuclear weapon systems, and other types of systems that have a low-probability potential for high-consequence accidents depend on showing that the probability of such accidents is below a specified safety threshold or goal. Verification of compliance depends heavily upon synthetic modeling techniques such as PRA. To determine whether or not a system meets its probabilistic requirements, it is necessary to consider whether there are significant risks that are not fully considered in the PRA either because they are not known at the time or because their importance is not fully understood. The ultimate objective is to establish a reasonable margin to account for the difference between known risks and actual risks in attempting to validate compliance with a probabilistic safety threshold or goal. In this paper, we examine data accumulated over the past 60 years from the space program, from nuclear reactor experience, from aircraft systems, and from human reliability experience to formulate guidelines for estimating probabilistic margins to account for risks that are initially unknown or underappreciated. The formulation includes a review of the safety literature to identify the principal causes of such risks.

  1. Implementation of probabilistic risk estimation for VRU safety

    NARCIS (Netherlands)

    Nunen, E. van; Broek, T.H.A. van den; Kwakkernaat, M.R.J.A.E.; Kotiadis, D.

    2011-01-01

    This paper describes the design, implementation and results of a novel probabilistic collision warning system. To obtain reliable results for risk estimation, preprocessing sensor data is essential. The work described herein presents all the necessary preprocessing steps such as filtering, sensor fu

  2. Adolescents' Heightened Risk-Seeking in a Probabilistic Gambling Task

    Science.gov (United States)

    Burnett, Stephanie; Bault, Nadege; Coricelli, Giorgio; Blakemore, Sarah-Jayne

    2010-01-01

    This study investigated adolescent males' decision-making under risk, and the emotional response to decision outcomes, using a probabilistic gambling task designed to evoke counterfactually mediated emotions (relief and regret). Participants were 20 adolescents (aged 9-11), 26 young adolescents (aged 12-15), 20 mid-adolescents (aged 15-18) and 17…

  3. Adolescents’ heightened risk-seeking in a probabilistic gambling task

    NARCIS (Netherlands)

    Burnett, S.; Bault, N.; Coricelli, G.; Blakemore, S.J.

    2010-01-01

    This study investigated adolescent males’ decision-making under risk, and the emotional response to decision outcomes, using a probabilistic gambling task designed to evoke counterfactually mediated emotions (relief and regret). Participants were 20 adolescents (aged 9-11), 26 young adolescents (age

  4. Probabilistic Modeling and Risk Assessment of Cable Icing

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee

    … assessments together with the Bayesian pre-posterior decision analysis and builds upon the quantification of Value of Information (VoI). The consequences are evaluated for different outputs of the probabilistic model to provide a basis for prioritizing risk management decision alternatives. Each step … the bridge cables, which can cause socioeconomically expensive closures of bridges and traffic disruptions. The objective is to develop a simple model that can be used to assess the occurrence probability of ice accretion on bridge cables from readily available meteorological variables. This model is used … The damage assessment is performed using a probabilistic approach, based on a Bayesian Probabilistic Network, where the wind environment, traffic loading, bridge-specific parameters and the mechanisms that induce significant cable vibrations are the main input parameters. It is outlined how information …

  5. Use of Probabilistic Risk Assessment in Shuttle Decision Making Process

    Science.gov (United States)

    Boyer, Roger L.; Hamlin, Teri L.

    2011-01-01

    This slide presentation reviews the use of Probabilistic Risk Assessment (PRA) to assist in decision making for Shuttle design and operation. Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and disciplined approach to identifying and analyzing risk in complex systems and/or processes that seeks answers to three basic questions: What can go wrong? What is the likelihood of it occurring? And what are the consequences that could result if it does occur? The purpose of the Shuttle PRA (SPRA) is to provide a useful risk management tool for the Space Shuttle Program (SSP) to identify strengths and possible weaknesses in the Shuttle design and operation. SPRA was initially developed to support upgrade decisions, but has evolved into a tool that supports Flight Readiness Reviews (FRR) and near real-time flight decisions. Examples of the use of PRA for the Shuttle are reviewed.

  6. Risk analysis of analytical validations by probabilistic modification of FMEA

    DEFF Research Database (Denmark)

    Barends, D.M.; Oldenhof, M.T.; Vredenbregt, M.J.

    2012-01-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and for the full analytical procedure.

  7. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both a broad perspective on data analysis collection and evaluation issues and a narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk-informed decision making environment that is being sought by NASA requirements and procedures such as NPR 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
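
    A representative example of the hands-on Bayesian data analysis such a guide covers is the standard conjugate Beta-Binomial update for a failure-on-demand probability; the prior parameters and data below are illustrative placeholders, not taken from the document.

```python
# Conjugate Beta-Binomial update for a demand failure probability.
# Prior parameters and observed evidence are invented placeholders.
ALPHA0, BETA0 = 0.5, 50.0    # Beta prior, e.g. shaped by generic data
failures, demands = 2, 1000  # plant-specific evidence

alpha = ALPHA0 + failures
beta = BETA0 + demands - failures

posterior_mean = alpha / (alpha + beta)
print(f"posterior mean p(fail on demand) = {posterior_mean:.2e}")
```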

  8. The Constitutive Element of Probabilistic Agency in Risk

    DEFF Research Database (Denmark)

    Merkelsen, Henrik

    2011-01-01

    Defining central concepts with accuracy is crucial to any scientific discipline. A recent debate over risk definitions in this journal illustrates the far-reaching consequences of divergent definitions. Aven and Renn define risk as a social construct while Rosa defines risk as an ontological fact. Both claim that their definition reflects the common usage of the word risk. Through a semantic analysis this paper points to a constitutive element of what is termed probabilistic agency in the risk concept. In this respect, risk is distinct from danger, and because Rosa's main argument is based … to uphold if a risk definition is to be in accordance with the ordinary usage of the word. The paper concludes by arguing that risks are only real within a subjective ontology.

  9. MATILDA: A Military Laser Range Safety Tool Based on Probabilistic Risk Assessment (PRA) Techniques

    Science.gov (United States)

    2014-08-01

    AFRL-RH-FS-TR-2014-0035. MATILDA: A Military Laser Range Safety Tool Based on Probabilistic Risk Assessment (PRA) Techniques. The use of Probabilistic Risk Assessment (PRA) techniques to perform laser safety and hazard analysis for high output lasers in outdoor environments has become …

  10. Cutting costs through detailed probabilistic fire risk analysis

    Energy Technology Data Exchange (ETDEWEB)

    Oliveira, Luiz; Huser, Asmund; Vianna, Savio [Det Norske Veritas PRINCIPIA, Rio de Janeiro, RJ (Brazil)

    2004-07-01

    A new procedure for calculation of fire risks to offshore installations has been developed. The purposes of the procedure are to calculate the escalation and impairment frequencies to be applied in quantitative risk analyses, to optimize the Passive Fire Protection (PFP) arrangement, and to optimize other fire mitigation means. The novelties of the procedure are its use of state-of-the-art Computational Fluid Dynamics (CFD) models to simulate fires and radiation, and its use of a probabilistic approach to decide the dimensioning fire loads. A CFD model of an actual platform was used to investigate the dynamic properties of a large set of jet fires, resulting in detailed knowledge of the important parameters that decide the severity of offshore fires. These results were applied to design the procedure. A potential increase in safety is further obtained for those conditions where simplified tools may have failed to predict abnormal heat loads due to geometrical effects. Using a field example, it is indicated that the probabilistic approach can give significant reductions in PFP coverage with corresponding cost savings, while still keeping the risk at an acceptable level. (author)
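
    The probabilistic choice of a dimensioning fire load can be sketched as taking a high percentile of the simulated heat-load distribution rather than the worst case; in the sketch below a lognormal placeholder stands in for what would come from the CFD fire simulations.

```python
import random

# Choose a dimensioning fire heat load probabilistically: take a high
# percentile of the heat-load distribution over fire scenarios.
# The lognormal stand-in below replaces actual CFD results.
random.seed(42)
heat_loads = sorted(random.lognormvariate(5.0, 0.5) for _ in range(10_000))

for pct in (0.50, 0.90, 0.95, 0.99):
    load = heat_loads[int(pct * len(heat_loads))]
    print(f"{pct:.0%} heat load: {load:.0f} kW/m2")
```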

  11. Applications of nuclear safety probabilistic risk assessment to nuclear security for optimized risk mitigation

    Energy Technology Data Exchange (ETDEWEB)

    Donnelly, S.K.; Harvey, S.B. [Amec Foster Wheeler, Toronto, Ontario (Canada)

    2016-06-15

    Critical infrastructure assets such as nuclear power generating stations are potential targets for malevolent acts. Probabilistic methodologies can be applied to evaluate the real-time security risk based upon intelligence and threat levels. By employing this approach, the application of security forces and other protective measures can be optimized. Existing probabilistic safety analysis (PSA) methodologies and tools employed in the nuclear industry can be adapted to security applications for this purpose. Existing PSA models can also be adapted and enhanced to consider total plant risk, due to nuclear safety risks as well as security risks. By creating a Probabilistic Security Model (PSM), safety and security practitioners can maximize the safety and security of the plant while minimizing the significant costs associated with security upgrades and security forces. (author)

  12. Architecture for Integrated Medical Model Dynamic Probabilistic Risk Assessment

    Science.gov (United States)

    Jaworske, D. A.; Myers, J. G.; Goodenow, D.; Young, M.; Arellano, J. D.

    2016-01-01

    Probabilistic Risk Assessment (PRA) is a modeling tool used to predict potential outcomes of a complex system based on a statistical understanding of many initiating events. Utilizing a Monte Carlo method, thousands of instances of the model are considered and outcomes are collected. PRA is considered static, utilizing probabilities alone to calculate outcomes. Dynamic Probabilistic Risk Assessment (dPRA) is an advanced concept where modeling predicts the outcomes of a complex system based not only on the probabilities of many initiating events, but also on a progression of dependencies brought about by progressing down a time line. Events are placed in a single time line, adding each event to a queue, as managed by a planner. Progression down the time line is guided by rules, as managed by a scheduler. The recently developed Integrated Medical Model (IMM) summarizes astronaut health as governed by the probabilities of medical events and mitigation strategies. Managing the software architecture process provides a systematic means of creating, documenting, and communicating a software design early in the development process. The software architecture process begins with establishing requirements and the design is then derived from the requirements.
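
    The planner/scheduler time-line idea can be sketched with a priority-queue event loop; the events, times, and progression rule below are invented for illustration and are not the IMM's actual design.

```python
import heapq

# Sketch of a dPRA time line: a planner enqueues events, a scheduler
# pops them in time order and applies simple progression rules.
# Event names, times, and the rule are invented examples.
timeline = []  # (time_hours, event) priority queue managed by the planner
for t, event in [(0.0, "symptom onset"), (2.0, "diagnosis"),
                 (2.5, "treatment start"), (48.0, "re-evaluation")]:
    heapq.heappush(timeline, (t, event))

state = {"treated": False}
while timeline:
    t, event = heapq.heappop(timeline)  # scheduler: earliest event first
    if event == "treatment start":
        state["treated"] = True
    if event == "re-evaluation" and state["treated"]:
        heapq.heappush(timeline, (t + 0.1, "return to duty"))  # rule-driven
        state["treated"] = False  # rule fires once
    print(f"t={t:6.1f} h  {event}")
```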

  13. Advanced Seismic Probabilistic Risk Assessment Demonstration Project Plan

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    Idaho National Laboratory (INL) has an ongoing research and development (R&D) project to remove excess conservatism from seismic probabilistic risk assessment (SPRA) calculations. These risk calculations should focus on providing best-estimate results, and associated insights, for evaluation and decision-making. This report presents a plan for improving our current traditional SPRA process using a seismic event recorded at a nuclear power plant site, with known outcomes, to improve the decision-making process. SPRAs are intended to provide best estimates of the various combinations of structural and equipment failures that can lead to a seismically induced core damage event. However, in general this approach has been conservative, and potentially masks other important events (for instance, it was not the seismic motions that caused the Fukushima core melt events, but the tsunami ingress into the facility).

  14. Advanced probabilistic risk analysis using RAVEN and RELAP-7

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cogliati, Joshua [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-06-01

    RAVEN, under the support of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program [1], is advancing its capability to perform statistical analyses of stochastic dynamic systems. This is aligned with its mission to provide the tools needed by the Risk Informed Safety Margin Characterization (RISMC) pathway lead [2] under the Department of Energy (DOE) Light Water Reactor Sustainability program [3]. In particular this task is focused on the synergistic development with the RELAP-7 [4] code to advance the state of the art of the safety analysis of nuclear power plants (NPP). The investigation of the probabilistic evolution of accident scenarios for a complex system such as a nuclear power plant is not a trivial challenge. The complexity of the system to be modeled leads to demanding computational requirements even to simulate one of the many possible evolutions of an accident scenario (tens of CPU-hours). At the same time, the probabilistic analysis requires thousands of runs to investigate outcomes characterized by low probability and severe consequence (tail problem). The milestone reported in June of 2013 [5] described the capability of RAVEN to implement complex control logic and provide adequate support for the exploration of the probabilistic space using a Monte Carlo sampling strategy. Unfortunately the Monte Carlo approach is ineffective with a problem of this complexity. In the following year of development, the RAVEN code has been extended with more sophisticated sampling strategies (grids, Latin Hypercube, and adaptive sampling). This milestone report illustrates the effectiveness of those methodologies in performing the assessment of the probability of core damage following the onset of a Station Black Out (SBO) situation in a boiling water reactor (BWR). The first part of the report provides an overview of the available probabilistic analysis capabilities, ranging from the different types of distributions available, possible sampling
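
    Latin Hypercube sampling, one of the strategies mentioned, stratifies each input dimension so that far fewer runs cover the space than plain Monte Carlo. The sketch below is a generic implementation on the unit cube, not RAVEN's code.

```python
import random

def latin_hypercube(n_samples, n_dims, rng=random.Random(0)):
    """Generic Latin Hypercube sample on the unit cube (not RAVEN's code):
    each dimension is split into n_samples strata, each used exactly once."""
    samples = [[0.0] * n_dims for _ in range(n_samples)]
    for d in range(n_dims):
        strata = list(range(n_samples))
        rng.shuffle(strata)  # pair strata across dimensions at random
        for i, s in enumerate(strata):
            samples[i][d] = (s + rng.random()) / n_samples
    return samples

# Ten stratified points in a 3-parameter uncertain input space:
for point in latin_hypercube(10, 3):
    print(["%.2f" % x for x in point])
```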

  15. Probabilistic Risk Assessment: Piping Fragility due to Earthquake Fault Mechanisms

    Directory of Open Access Journals (Sweden)

    Bu Seog Ju

    2015-01-01

    A lifeline system, serving as an energy-supply system, is an essential component of urban infrastructure. In a hospital, for example, the piping system supplies elements essential for hospital operations, such as water and fire-suppression foam. Such nonstructural components, especially piping systems and their subcomponents, must remain operational and functional during earthquake-induced fires. But the behavior of piping systems as subjected to seismic ground motions is very complex, owing particularly to the nonlinearity affected by the existence of many connections such as T-joints and elbows. The present study carried out a probabilistic risk assessment on a hospital fire-protection piping system’s acceleration-sensitive 2-inch T-joint sprinkler components under seismic ground motions. Specifically, the system’s seismic capacity, using an experimental-test-based nonlinear finite element (FE) model, was evaluated for the probability of failure under different earthquake-fault mechanisms including normal fault, reverse fault, strike-slip fault, and near-source ground motions. It was observed that the probabilistic failure of the T-joint of the fire-protection piping system varied significantly according to the fault mechanisms. The normal-fault mechanism led to a higher probability of system failure at locations 1 and 2. The strike-slip fault mechanism, contrastingly, resulted in the lowest fragility of the piping system at higher PGA levels.
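
    Fragility curves of the kind evaluated here are commonly lognormal in peak ground acceleration (PGA); the sketch below uses that standard form with invented medians and dispersions per fault mechanism, not the paper's fitted values.

```python
import math

# Lognormal fragility: P(failure | PGA) = Phi(ln(pga/median) / beta).
# Medians and betas per fault mechanism are invented placeholders.
def fragility(pga_g, median_g, beta):
    return 0.5 * (1.0 + math.erf(math.log(pga_g / median_g)
                                 / (beta * math.sqrt(2.0))))

mechanisms = {  # (median capacity in g, lognormal dispersion)
    "normal fault": (0.9, 0.45),
    "reverse fault": (1.1, 0.40),
    "strike-slip fault": (1.4, 0.40),
}

for pga in (0.2, 0.5, 1.0):
    row = "  ".join(f"{m}: {fragility(pga, med, b):.3f}"
                    for m, (med, b) in mechanisms.items())
    print(f"PGA={pga:.1f} g  {row}")
```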

  16. RAVEN and Dynamic Probabilistic Risk Assessment: Software overview

    Energy Technology Data Exchange (ETDEWEB)

    Andrea Alfonsi; Cristian Rabiti; Diego Mandelli; Joshua Cogliati; Robert Kinoshita; Antonio Naviglio

    2014-09-01

    RAVEN is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. The initial development was aimed at providing dynamic risk analysis capabilities to the thermal-hydraulic code RELAP-7, currently under development at the Idaho National Laboratory. Although the initial goal has been fully accomplished, RAVEN is now a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating agnostically with any system code. This agnosticism is achieved through Application Programming Interfaces (APIs), which allow RAVEN to interact with any code as long as all the parameters that need to be perturbed are accessible via input files or python interfaces. RAVEN can investigate the system response by exploring the input space with Monte Carlo, grid, or Latin Hypercube sampling schemes, but its strength lies in system feature discovery, such as finding limit surfaces that separate regions of the input space leading to system failure, using dynamic supervised learning techniques. The paper presents an overview of the software capabilities and their implementation schemes, followed by some application examples.

  17. 77 FR 61446 - Proposed Revision Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors

    Science.gov (United States)

    2012-10-09

    ... COMMISSION Proposed Revision Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors... comment on NUREG-0800, ``Standard Review Plan for the Review of Safety Analysis Reports for Nuclear Power..., ``Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors.'' DATES: Submit comments...

  18. 77 FR 66649 - Proposed Revision to Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors

    Science.gov (United States)

    2012-11-06

    ... COMMISSION Proposed Revision to Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors... the Commission), issued a NUREG-0800, ``Standard Review Plan for the Review of Safety Analysis Reports...), Section 19.0 ``Probabilistic Risk Assessment and Severe Accident Evaluation for New Reactors.'' The NRC...

  19. Risk Management Technologies With Logic and Probabilistic Models

    CERN Document Server

    Solozhentsev, E D

    2012-01-01

    This book presents intellectual, innovative, information technologies (I3-technologies) based on logical and probabilistic (LP) risk models. The technologies presented here consider such models for structurally complex systems and processes with logical links and with random events in economics and technology. The volume describes the following components of risk management technologies: LP-calculus; classes of LP-models of risk and efficiency; procedures for different classes; special software for different classes; examples of applications; and methods for the estimation of probabilities of events based on expert information. Also described are a variety of training courses in these topics. The classes of risk models treated here are: LP-modeling, LP-classification, LP-efficiency, and LP-forecasting. Particular attention is paid to LP-models of the risk of failure to resolve difficult economic and technical problems. Amongst the discussed procedures of I3-technologies are the construction of LP-models, …

  20. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.
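
    The core computation behind such a tool is the quantification of minimal cut sets; the sketch below uses the min-cut-set upper bound with invented basic events. This is a generic illustration of the technique, not SAPHIRE's algorithm.

```python
# Quantify a fault-tree top event from minimal cut sets using the
# min-cut-set upper bound: P(top) <= 1 - prod(1 - P(cut set)).
# Basic-event probabilities and cut sets are invented examples.
p_event = {"PUMP-A": 3e-3, "PUMP-B": 3e-3, "VALVE-C": 1e-4, "DG-1": 2e-2}

minimal_cut_sets = [
    {"PUMP-A", "PUMP-B"},   # both redundant pumps fail
    {"VALVE-C"},            # single valve failure defeats the system
    {"PUMP-A", "DG-1"},
]

def cut_set_prob(cs):  # assumes independent basic events
    p = 1.0
    for e in cs:
        p *= p_event[e]
    return p

bound = 1.0
for cs in minimal_cut_sets:
    bound *= 1.0 - cut_set_prob(cs)
print(f"P(top event) <= {1.0 - bound:.3e}")
```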

  1. Probabilistic landslide hazards and risk mapping on Penang Island, Malaysia

    Indian Academy of Sciences (India)

    Saro Lee; Biswajeet Pradhan

    2006-12-01

    This paper deals with landslide hazards and risk analysis of Penang Island, Malaysia using Geographic Information System (GIS) and remote sensing data. Landslide locations in the study area were identified from interpretations of aerial photographs and field surveys. Topographical/ geological data and satellite images were collected and processed using GIS and image processing tools. There are ten landslide inducing parameters which are considered for landslide hazard analysis. These parameters are topographic slope, aspect, curvature and distance from drainage, all derived from the topographic database; geology and distance from lineament, derived from the geologic database; landuse from Landsat satellite images; soil from the soil database; precipitation amount, derived from the rainfall database; and the vegetation index value from SPOT satellite images. Landslide susceptibility was analyzed using landslide-occurrence factors employing the probability–frequency ratio model. The results of the analysis were verified using the landslide location data and compared with the probabilistic model. The accuracy observed was 80.03%. The qualitative landslide hazard analysis was carried out using the frequency ratio model through the map overlay analysis in GIS environment. The accuracy of hazard map was 86.41%. Further, risk analysis was done by studying the landslide hazard map and damageable objects at risk. This information could be used to estimate the risk to population, property and existing infrastructure like transportation network.
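
    The frequency ratio itself has a compact definition: for each class of a causal factor, FR is the share of landslide cells falling in the class divided by the share of all cells in the class; classes with FR > 1 are more landslide-prone. A sketch with invented cell counts:

```python
# Frequency ratio (FR) model sketch:
# FR = (landslide cells in class / all landslide cells)
#      / (cells in class / all cells).
# Counts below are invented placeholders, not the paper's data.
slope_classes = {  # class: (landslide cells, total cells)
    "0-10 deg": (20, 60_000),
    "10-25 deg": (120, 50_000),
    ">25 deg": (160, 20_000),
}

total_slides = sum(ls for ls, _ in slope_classes.values())
total_cells = sum(c for _, c in slope_classes.values())

for cls, (slides, cells) in slope_classes.items():
    fr = (slides / total_slides) / (cells / total_cells)
    print(f"{cls:10s} FR = {fr:.2f}")
# A hazard index is then the sum of FR values over all factor maps.
```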

  2. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Demonstration

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith; Steven Prescott; Tony Koonce

    2014-04-01

    A key area of the Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) strategy is the development of methodologies and tools that will be used to predict the safety, security, safeguards, performance, and deployment viability of SMRs. The goal of the SMR PRA activity will be to develop quantitative methods and tools and the associated analysis framework for assessing a variety of risks. Development and implementation of SMR-focused safety assessment methods may require new analytic methods or adaptation of traditional methods to the advanced design and operational features of SMRs. We will need to move beyond the current limitations such as static, logic-based models in order to provide more integrated, scenario-based models based upon predictive modeling which are tied to causal factors. The development of SMR-specific safety models for margin determination will provide a safety case that describes potential accidents, design options (including postulated controls), and supports licensing activities by providing a technical basis for the safety envelope. This report documents the progress that was made to implement the PRA framework, specifically by way of demonstration of an advanced 3D approach to representing, quantifying and understanding flooding risks to a nuclear power plant.

  3. Framework for probabilistic flood risk assessment in an Alpine region

    Science.gov (United States)

    Schneeberger, Klaus; Huttenlau, Matthias; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann

    2014-05-01

    Flooding is among the natural hazards that regularly cause significant losses to property and human lives. The assessment of flood risk delivers crucial information for all participants involved in flood risk management, and especially for local authorities and insurance companies, in order to estimate possible flood losses. Therefore a framework for assessing flood risk has been developed and is introduced in the present contribution. Flood risk is thereby defined as the combination of the probability of flood events and of potential flood damages. The probability of occurrence is described through the spatial and temporal characterisation of floods. The potential flood damages are determined in the course of vulnerability assessment, whereby the exposure and the vulnerability of the elements at risk are considered. Direct costs caused by flooding, with a focus on residential buildings, are analysed. The innovative part of this contribution lies in the development of a framework which takes the probability of flood events and their spatio-temporal characteristics into account. Usually the probability of flooding is determined by means of recurrence intervals for an entire catchment, without any spatial variation. This may lead to a misinterpretation of the flood risk. Within the presented framework the probabilistic flood risk assessment is based on analysis of a large number of spatially correlated flood events. Since the number of historic flood events is relatively small, additional events have to be generated synthetically. This temporal extrapolation is realised by means of the method proposed by Heffernan and Tawn (2004). It is used to generate a large number of possible spatially correlated flood events within a larger catchment. The approach is based on the modelling of multivariate extremes considering the spatial dependence structure of flood events. The input for this approach are time series derived from river gauging stations. In a next step the

  4. Probabilistic seismic vulnerability and risk assessment of stone masonry structures

    Science.gov (United States)

    Abo El Ezz, Ahmad

    Earthquakes represent major natural hazards that regularly impact the built environment in seismic prone areas worldwide and cause considerable social and economic losses. The high losses incurred following the past destructive earthquakes promoted the need for assessment of the seismic vulnerability and risk of the existing buildings. Many historic buildings in the old urban centers in Eastern Canada such as Old Quebec City are built of stone masonry and represent un-measurable architectural and cultural heritage. These buildings were built to resist gravity loads only and generally offer poor resistance to lateral seismic loads. Seismic vulnerability assessment of stone masonry buildings is therefore the first necessary step in developing seismic retrofitting and pre-disaster mitigation plans. The objective of this study is to develop a set of probability-based analytical tools for efficient seismic vulnerability and uncertainty analysis of stone masonry buildings. A simplified probabilistic analytical methodology for vulnerability modelling of stone masonry building with systematic treatment of uncertainties throughout the modelling process is developed in the first part of this study. Building capacity curves are developed using a simplified mechanical model. A displacement based procedure is used to develop damage state fragility functions in terms of spectral displacement response based on drift thresholds of stone masonry walls. A simplified probabilistic seismic demand analysis is proposed to capture the combined uncertainty in capacity and demand on fragility functions. In the second part, a robust analytical procedure for the development of seismic hazard compatible fragility and vulnerability functions is proposed. The results are given by sets of seismic hazard compatible vulnerability functions in terms of structure-independent intensity measure (e.g. spectral acceleration) that can be used for seismic risk analysis. The procedure is very efficient for

  5. A review of NRC staff uses of probabilistic risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    1994-03-01

    The NRC staff uses probabilistic risk assessment (PRA) and risk management as important elements of its licensing and regulatory processes. In October 1991, the NRC's Executive Director for Operations established the PRA Working Group to address concerns identified by the Advisory Committee on Reactor Safeguards with respect to unevenness and inconsistency in the staff's current uses of PRA. After surveying current staff uses of PRA and identifying needed improvements, the Working Group defined a set of basic principles for staff PRA use and identified three areas for improvements: guidance development, training enhancements, and PRA methods development. For each area of improvement, the Working Group took certain actions and recommended additional work. The Working Group recommended integrating its work with other recent PRA-related activities the staff completed and improving staff interactions with PRA users in the nuclear industry. The Working Group took two key actions by developing general guidance for two uses of PRA within the NRC (that is, screening or prioritizing reactor safety issues and analyzing such issues in detail) and developing guidance on basic terms and methods important to the staff's uses of PRA.

  6. Reliability and Probabilistic Risk Assessment - How They Play Together

    Science.gov (United States)

    Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang

    2015-01-01

    PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of LOM, LOV and LOC for launch vehicles. PRA is a system-scenario-based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict the overall launch vehicle risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA initiated a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for the Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, to meet product reliability requirements or goals. It is a very broad design-support discipline with important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. the metric) is the probability that an item will ...
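
    The scenario logic behind the three PRA questions is commonly captured with event trees, where branch probabilities multiply along each path into consequence bins. The toy sketch below shows only the mechanics; the initiator frequency and system failure probabilities are invented and are not NASA figures.

      # Minimal event tree: an initiating event followed by two mitigation
      # systems; each path ends in a consequence bin. Numbers are illustrative.
      p_initiator = 1e-3        # initiating-event frequency per mission
      p_a_fails = 1e-2          # first mitigation system fails on demand
      p_b_fails = 5e-2          # backup fails, given the first has failed

      scenarios = {
          "nominal (A works)": p_initiator * (1 - p_a_fails),
          "degraded (A fails, B works)": p_initiator * p_a_fails * (1 - p_b_fails),
          "loss of mission (A and B fail)": p_initiator * p_a_fails * p_b_fails,
      }
      for outcome, p in scenarios.items():
          print(f"{outcome:31s} {p:.2e} per mission")
      print(f"{'total initiator frequency':31s} {sum(scenarios.values()):.2e}")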

  7. Validation of seismic probabilistic risk assessments of nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Ellingwood, B. [Johns Hopkins Univ., Baltimore, MD (United States). Dept. of Civil Engineering

    1994-01-01

    A seismic probabilistic risk assessment (PRA) of a nuclear plant requires identification and information regarding the seismic hazard at the plant site, dominant accident sequences leading to core damage, and structure and equipment fragilities. Uncertainties are associated with each of these ingredients of a PRA. Sources of uncertainty due to seismic hazard and assumptions underlying the component fragility modeling may be significant contributors to uncertainty in estimates of core damage probability. Design and construction errors also may be important in some instances. When these uncertainties are propagated through the PRA, the frequency distribution of core damage probability may span three orders of magnitude or more. This large variability brings into question the credibility of PRA methods and the usefulness of insights to be gained from a PRA. The sensitivity of accident sequence probabilities and high-confidence, low probability of failure (HCLPF) plant fragilities to seismic hazard and fragility modeling assumptions was examined for three nuclear power plants. Mean accident sequence probabilities were found to be relatively insensitive (by a factor of two or less) to: uncertainty in the coefficient of variation (logarithmic standard deviation) describing inherent randomness in component fragility; truncation of lower tail of fragility; uncertainty in random (non-seismic) equipment failures (e.g., diesel generators); correlation between component capacities; and functional form of fragility family. On the other hand, the accident sequence probabilities, expressed in the form of a frequency distribution, are affected significantly by the seismic hazard modeling, including slopes of seismic hazard curves and likelihoods assigned to those curves.
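
    The coupling of hazard and fragility that drives these results is the familiar risk integral, P(core damage) = -∫ p_fail(a) dH(a), where H(a) is the annual frequency of exceeding ground acceleration a. A minimal numerical sketch with a made-up power-law hazard curve and a lognormal plant fragility (median 0.9 g, beta 0.45, implying an HCLPF of roughly 0.31 g):

      import numpy as np
      from scipy.integrate import trapezoid
      from scipy.stats import norm

      a = np.linspace(0.01, 3.0, 3000)           # peak ground acceleration, g
      H = 1e-3 * (a / 0.1) ** -2.0               # hypothetical hazard curve
      p_fail = norm.cdf(np.log(a / 0.9) / 0.45)  # lognormal plant fragility

      dH_da = np.gradient(H, a)                  # hazard curve decreases with a
      p_cd = -trapezoid(p_fail * dH_da, a)
      print(f"annual core damage frequency ~ {p_cd:.2e}")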

  8. Probabilistic risk assessment of the modular HTGR plant. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Everline, C.J.; Bellis, E.A.; Vasquez, J.

    1986-06-01

    A preliminary probabilistic risk assessment (PRA) has been performed for the modular HTGR (MHTGR). This PRA is preliminary in the sense that, although it updates the PRA issued earlier to include a wider spectrum of events for Licensing Basis Event (LBE) selection, the final version will not be issued until later. The primary function of the assessment was to assure compliance with the NRC interim safety goals imposed by the top-level regulatory criteria, and with utility/user requirements regarding public evacuation or sheltering. In addition, the assessment provides a basis for designer feedback regarding reliability allocations and barrier retention requirements, as well as a basis for the selection of licensing basis events (LBEs) and the safety classification of structures, systems, and components. The assessment demonstrates that both the NRC interim safety goals and the utility/user-imposed sheltering/evacuation requirements are satisfied. Moreover, it is not anticipated that the design changes introduced will jeopardize compliance with the interim safety goals or utility/user requirements. 61 refs., 48 figs., 24 tabs.

  9. Mitomycin C: new strategies to improve efficacy of a well-known therapy.

    Science.gov (United States)

    Ragonese, Mauro; Racioppi, Marco; Bassi, Pier Francesco; Di Gianfrancesco, Luca; Lenci, Niccolò; Filianoti, Alessio; Recupero, Salvatore M

    2016-10-04

    Mitomycin C (MMC) as an intravesical chemotherapeutic agent is a well-known option for treatment of nonmuscle invasive bladder cancer (NMIBC) recurrence; it is probably the most commonly used agent given its low rate of side effects and its efficacy. Both the American Urological Association (AUA) and the European Association of Urology (EAU) consider MMC a standard treatment for immediate single-dose postoperative treatment and for adjuvant therapy in low- and intermediate-risk NMIBC. Despite the popularity of this agent in the treatment of NMIBCs, many questions regarding the optimal approach to MMC therapy remain unanswered, and the schedule widely used is empirical. Nevertheless, even when the current optimal approaches to MMC administration are used, a large proportion of NMIBCs recur. This apparent treatment resistance might be overcome by an optimization of standard MMC therapy or by a combination of MMC with other agents that have different mechanisms of action. Strategies to enhance passive delivery of MMC have been well studied, and multiple measures are recommended for implementation in routine clinical practice. A modified scheme of instillation seems to be an easy and inexpensive alternative to increase the efficacy of intravesical MMC and to also use this agent with an ablative intent. Enhancing tumor response with a sequential therapy is another option that has been investigated, mostly for chemo-immunotherapy, wherein the different mechanisms of action of Bacillus Calmette-Guérin (BCG) and MMC are combined to achieve a higher response.

  10. Probabilistic risk assessment for six vapour intrusion algorithms

    NARCIS (Netherlands)

    Provoost, J.; Reijnders, L.; Bronders, J.; Van Keer, I.; Govaerts, S.

    2014-01-01

    A probabilistic assessment with sensitivity analysis using Monte Carlo simulation for six vapour intrusion algorithms, used in various regulatory frameworks for contaminated land management, is presented here. In addition a deterministic approach with default parameter sets is evaluated against obse

  13. Probabilistic Risk Assessment for Decision Making During Spacecraft Operations

    Science.gov (United States)

    Meshkat, Leila

    2009-01-01

    Decisions made during the operational phase of a space mission often have significant and immediate consequences. Without the explicit consideration of the risks involved and their representation in a solid model, it is very likely that these risks will not be considered systematically in trade studies. Wrong decisions during the operational phase of a space mission can lead to immediate system failure, whereas correct decisions can help recover the system even from faulty conditions. A problem of special interest is the determination of system fault protection strategies upon the occurrence of faults within the system. Decisions regarding the fault protection strategy also rely heavily on a correct understanding of the state of the system and an integrated risk model that represents the various possible scenarios and their respective likelihoods. Probabilistic Risk Assessment (PRA) modeling is applicable to the full lifecycle of a space mission project, from concept development to preliminary design, detailed design, development and operations. The benefits and utility of the model, however, depend on the phase of the mission for which it is used. This is because of the difference in the key strategic decisions that support each mission phase. The focus of this paper is on describing the particular methods used for PRA modeling during the operational phase of a spacecraft, by gleaning insight from recently conducted case studies on two operational Mars orbiters. During operations, the key decisions relate to the commands sent to the spacecraft for any kind of diagnostics, anomaly resolution, trajectory changes, or planning. Often, faults and failures occur in parts of the spacecraft but are contained or mitigated before they can cause serious damage. The failure behavior of the system during operations provides valuable data for updating and adjusting the related PRA models that are built primarily based on historical failure data. The PRA models, in turn ...

  14. Bayesian probabilistic network approach for managing earthquake risks of cities

    DEFF Research Database (Denmark)

    Bayraktarli, Yahya; Faber, Michael

    2011-01-01

    and geographical information systems. The proposed framework comprises several modules: A module on the probabilistic description of potential future earthquake shaking intensity, a module on the probabilistic assessment of spatial variability of soil liquefaction, a module on damage assessment of buildings...... on an example considering a portfolio of reinforced concrete structures in a city located close to the western part of the North Anatolian Fault in Turkey....

  15. Using Probabilistic Models to Appraise and Decide on Sovereign Disaster Risk Financing and Insurance

    OpenAIRE

    Ley-Borrás, Roberto; Fox, Benjamin D.

    2015-01-01

    This paper presents an overview of the structure of probabilistic catastrophe risk models, discusses their importance for appraising sovereign disaster risk financing and insurance instruments and strategy, and puts forward a model and a process for improving decision making on the linked disaster risk management strategy and sovereign disaster risk financing and insurance strategy. The pa...

  16. Well-known and little-known: miscellaneous notes on Peruvian Orthalicidae (Gastropoda, Stylommatophora)

    NARCIS (Netherlands)

    Breure, A.S.H.; Mogollón Avila, V.

    2010-01-01

    The family Orthalicidae is well represented in Peru but, as in other families, some species are well-known and others have not been reported on since their original descriptions. In this paper we present new records for well-known species and elucidate the status of several lesser known taxa. Four ...

  17. Conditioning of the stationary kriging matrices for some well-known covariance models

    Energy Technology Data Exchange (ETDEWEB)

    Posa, D. (IRMA-CNR, Bari (Italy))

    1989-10-01

    In this paper, the condition number of the stationary kriging matrix is studied for some well-known covariance models. Indeed, the robustness of the kriging weights is strongly affected by this measure. Such an analysis can justify the choice of a covariance function among other admissible models which could fit a given experimental covariance equally well.
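
    The effect the record describes is easy to reproduce: on a regular grid, a Gaussian covariance yields a nearly singular kriging matrix, while an exponential model, or a small nugget, keeps it well conditioned. A small numeric sketch with hypothetical grid and range parameters:

      import numpy as np

      x = np.linspace(0.0, 10.0, 30)            # 1D sample locations
      h = np.abs(x[:, None] - x[None, :])       # lag distances
      r = 3.0                                   # range parameter

      models = {
          "exponential": np.exp(-h / r),
          "gaussian": np.exp(-((h / r) ** 2)),
          "gaussian + nugget": np.exp(-((h / r) ** 2)) + 0.01 * np.eye(x.size),
      }
      for name, C in models.items():
          print(f"{name:18s} condition number = {np.linalg.cond(C):.2e}")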

  18. Hypocitraturia: a common but not well-known cause of nephrolithiasis.

    Science.gov (United States)

    Bos, S; Nap, R R H; Wouters, R S M E; van der Kleij, F G H

    2014-12-01

    Nephrolithiasis is a frequent problem that can cause serious morbidity. When associated with an underlying metabolic disorder, the recurrence rate is higher. Hypocitraturia is estimated to be present in 20-60% of cases. Several secondary causes are known. Potassium citrate is the primary treatment. In the case presented here, we emphasise the need for metabolic screening, focussing on hypocitraturia, a less well-known cause of nephrolithiasis.

  19. Social decisions under risk. Evidence from the probabilistic dictator game

    NARCIS (Netherlands)

    M. Krawczyk; F. Le Lec

    2008-01-01

    This paper reports results of a 'probabilistic dictator game' experiment in which subjects had to allocate chances to win a prize between themselves and a dummy player. We have manipulated (within subjects) two aspects of the game: the relative values of the prizes (being equal for the two players,

  20. Adaptive Sampling Algorithms for Probabilistic Risk Assessment of Nuclear Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Diego Mandelli; Dan Maljovec; Bei Wang; Valerio Pascucci; Peer-Timo Bremer

    2013-09-01

    Nuclear simulations are often computationally expensive, time-consuming, and high-dimensional with respect to the number of input parameters. Thus, exploring the space of all possible simulation outcomes is infeasible using finite computing resources. During simulation-based probabilistic risk analysis, it is important to discover the relationship between a potentially large number of input parameters and the output of a simulation using as few simulation trials as possible. This is a typical context for performing adaptive sampling, where a few observations are obtained from the simulation, a surrogate model is built to represent the simulation space, and new samples are selected based on the model constructed. The surrogate model is then updated based on the simulation results of the sampled points. In this way, we attempt to gain the most information possible with a small number of carefully selected sample points, limiting the number of expensive trials needed to understand features of the simulation space. We analyze the specific use case of identifying the limit surface, i.e., the boundaries in the simulation space between system failure and system success. In this study, we explore several techniques for adaptively sampling the parameter space in order to reconstruct the limit surface. We focus on several adaptive sampling schemes. First, we seek to learn a global model of the entire simulation space using prediction models or neighborhood graphs and extract the limit surface as an iso-surface of the global model. Second, we estimate the limit surface by sampling in the neighborhood of the current estimate based on topological segmentations obtained locally. Our techniques draw inspiration from the topological structure known as the Morse-Smale complex. We highlight the advantages and disadvantages of using a global prediction model versus a local topological view of the simulation space, comparing several different strategies for adaptive sampling in both ...
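
    A stripped-down version of the local sampling strategy can be sketched as follows: a cheap stand-in replaces the expensive simulation, and each new run is placed where the distances to the nearest known failure and the nearest known success are most nearly equal, i.e. near the current limit-surface estimate. The disk-shaped failure region, sample sizes and candidate counts are all invented for illustration; this is not the INL implementation.

      import numpy as np

      rng = np.random.default_rng(0)

      def simulator(x):
          """Stand-in for an expensive code: failure inside a disk."""
          return (x[..., 0] ** 2 + x[..., 1] ** 2 < 0.4).astype(int)

      # Initial samples; two fixed points guarantee both outcome classes.
      X = np.vstack([rng.uniform(-1, 1, size=(20, 2)),
                     [[0.0, 0.0], [0.9, 0.9]]])
      y = simulator(X)

      for _ in range(10):
          cand = rng.uniform(-1, 1, size=(2000, 2))
          d = np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2)
          d_fail = d[:, y == 1].min(axis=1)   # distance to nearest failure
          d_ok = d[:, y == 0].min(axis=1)     # distance to nearest success
          pick = cand[np.argmin(np.abs(d_fail - d_ok))]
          X = np.vstack([X, pick])
          y = np.append(y, simulator(pick))

      print(f"{len(X)} runs, sampled failure fraction: {y.mean():.2f}")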

  1. Integration of Evidence Base into a Probabilistic Risk Assessment

    Science.gov (United States)

    Saile, Lyn; Lopez, Vilma; Bickham, Grandin; Kerstman, Eric; FreiredeCarvalho, Mary; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    INTRODUCTION: A probabilistic decision support model such as the Integrated Medical Model (IMM) utilizes an immense amount of input data, which necessitates a systematic, integrated approach to data collection and management. As a result of this approach, the IMM is able to forecast medical events, resource utilization and crew health during space flight. METHODS: Inflight data is the most desirable input for the Integrated Medical Model. Non-attributable inflight data is collected from the Lifetime Surveillance of Astronaut Health study as well as from the engineers, flight surgeons, and astronauts themselves. When inflight data is unavailable, cohort studies, other models and Bayesian analyses are used, in addition to subject matter experts' input on occasion. To determine the quality of evidence for a medical condition, the data source is categorized and assigned a level of evidence from 1-5; the highest level is one. The collected data reside and are managed in a relational SQL database with a web-based interface for data entry and review. The database is also capable of interfacing with outside applications, which expands capabilities within the database itself. Via the public interface, customers can access a formatted Clinical Findings Form (CLiFF) that outlines the model input and evidence base for each medical condition. Changes to the database are tracked using a documented Configuration Management process. DISCUSSION: This strategic approach provides a comprehensive data management plan for the IMM. The IMM database's structure and architecture have proven to support additional usages, as seen in the analysis of resource utilization across medical conditions. In addition, the IMM database's web-based interface provides a user-friendly format for customers to browse and download the clinical information for medical conditions. It is this type of functionality that will provide Exploratory Medicine Capabilities the evidence base for their medical condition list ...

  2. Application of probabilistic risk assessment in nuclear and environmental licensing processes of nuclear reactors in Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Mata, Jonatas F.C. da; Vasconcelos, Vanderley de; Mesquita, Amir Z., E-mail: jonatasfmata@yahoo.com.br, E-mail: vasconv@cdtn.br, E-mail: amir@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-07-01

    The nuclear accident at Fukushima Daiichi, which occurred in Japan in 2011, prompted reflections worldwide on the management of nuclear and environmental licensing processes for existing nuclear reactors. One of the key lessons learned in this matter is that studies of Probabilistic Safety Assessment and Severe Accidents are becoming essential, even in the early stages of a nuclear development project. In Brazil, the Brazilian Nuclear Energy Commission, CNEN, conducts the nuclear licensing. The organization responsible for environmental licensing is the Brazilian Institute of Environment and Renewable Natural Resources, IBAMA. Within the scope of the licensing processes of these two institutions, the safety analysis is essentially deterministic, complemented by probabilistic studies. The Probabilistic Safety Assessment (PSA) is the study performed to evaluate the behavior of the nuclear reactor in a sequence of events that may lead to the melting of its core. It includes both probability and consequence estimation for these events, which are called Severe Accidents, allowing a risk assessment of the plant to be obtained. Thus, possible shortcomings in the design of systems are identified, providing a basis for safety assessment and improving safety. During environmental licensing, a Quantitative Risk Analysis (QRA), including probabilistic evaluations, is required in order to support the development of the Risk Analysis Study, the Risk Management Program and the Emergency Plan. This article aims to provide an overview of probabilistic risk assessment methodologies and their applications in the nuclear and environmental licensing processes of nuclear reactors in Brazil. (author)

  3. Perspectives on craniosynostosis: sutural biology, some well-known syndromes, and some unusual syndromes.

    Science.gov (United States)

    Cohen, M Michael

    2009-03-01

    Perspectives on craniosynostosis are discussed under the following headings: sutural biology (anatomic and genetic categories of synostosis; sutures, suture systems, and types of craniosynostosis); well-known syndromes (Muenke syndrome and Pfeiffer syndrome); and unusual syndromes (thanatophoric dysplasia, Beare-Stevenson cutis gyrata syndrome, Crouzonodermoskeletal syndrome, Carpenter syndrome, Elejalde syndrome, hypomandibular faciocranial syndrome, and craniorhiny). Five of these syndromes are caused by fibroblast growth factor receptor (FGFR) mutations; one is caused by ras-like in rat brain 23 (RAB23) mutations; and three have Mendelian patterns of inheritance, but their molecular basis remains unknown to date.

  4. Probabilistic risk assessment for six vapour intrusion algorithms

    OpenAIRE

    Provoost, J.; Reijnders, L.; Bronders, J.; Van Keer, I.; Govaerts, S.

    2014-01-01

    A probabilistic assessment with sensitivity analysis using Monte Carlo simulation for six vapour intrusion algorithms, used in various regulatory frameworks for contaminated land management, is presented here. In addition a deterministic approach with default parameter sets is evaluated against observed concentrations for benzene, ethylbenzene and trichloroethylene. The screening-level algorithms are ranked according to accuracy and conservatism in predicting observed soil air and indoor air ...

  5. Probabilistic cumulative risk assessment of anti-androgenic pesticides in food

    DEFF Research Database (Denmark)

    Müller, Anne Kirstine; Nielsen, Elsa

    2008-01-01

    A cumulative risk assessment of three anti-androgenic pesticides (vinclozolin, procymidone and prochloraz) in combination has been carried out using an Integrated Probabilistic Risk Assessment (IPRA) model. In the model, variability in both exposure and sensitivity between individuals was combined...

  6. Application of probabilistic robustness framework: Risk assessment of multi-storey building under extreme loading

    NARCIS (Netherlands)

    Vrouwenvelder, A.C.W.M.; Izzuddin, B.A.; Pereira, M.E.; Kuhlmann, U.; Rölle, L.; Leira, B.J.

    2012-01-01

    Risk assessment is a requirement for robustness design of high consequence class structures, yet very little guidance is offered in practice for performing this type of assessment. This paper demonstrates the application of the probabilistic risk assessment framework arising from COST Action TU0601

  8. Potential for the adaptation of probabilistic risk assessments by end-users and decision-makers

    NARCIS (Netherlands)

    Frewer, L.J.; Fischer, A.R.H.; Brink, van den P.J.; Byrne, P.; Brock, T.C.M.; Brown, C.; Crocker, J.; Goerlitz, G.; Hart, A.; Scholderer, J.; Solomon, K.

    2008-01-01

    In the area of risk assessment associated with ecotoxicological and plant protection products, probabilistic risk assessment (PRA) methodologies have been developed that enable quantification of variability and uncertainty. Despite the potential advantages of these new methodologies, end-user and re

  9. Probabilistic cumulative risk assessment of anti-androgenic pesticides in food.

    NARCIS (Netherlands)

    Müller, A.K.; Bosgra, S.; Boon, P.E.; van der Voet, H.; Nielsen, E.; Ladefoged, O.

    2009-01-01

    In this paper, we present a cumulative risk assessment of three anti-androgenic pesticides (vinclozolin, procymidone and prochloraz) using the relative potency factor (RPF) approach and an integrated probabilistic risk assessment (IPRA) model. RPFs for each substance were estimated for three reprodu
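
    The relative potency factor approach reduces a mixture to equivalents of an index compound before a single comparison against an acceptable level. A minimal Monte Carlo sketch; the RPF values, intake distributions and limit below are invented placeholders, not values from the study.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical RPFs (index compound: vinclozolin = 1.0).
      rpf = {"vinclozolin": 1.0, "procymidone": 0.8, "prochloraz": 0.3}

      # Hypothetical daily intakes (ug/kg bw/day) with lognormal variability.
      n = 100_000
      intake = {c: rng.lognormal(np.log(0.05), 0.8, size=n) for c in rpf}

      # Cumulative exposure in index-compound equivalents.
      cumulative = sum(rpf[c] * intake[c] for c in rpf)

      limit = 1.0   # hypothetical acceptable level, index equivalents
      print(f"median cumulative exposure: {np.median(cumulative):.3f}")
      print(f"P(exceeding the limit): {(cumulative > limit).mean():.4f}")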

  11. Indexing of Iranian Publications in Well-known Endodontic Textbooks: A Scientometric Analysis

    Science.gov (United States)

    Kakooei, Sina; Mostafavi, Mahshid; Parirokh, Masoud; Asgary, Saeed

    2016-01-01

    Introduction: Quoting an article in well-known textbooks is held as a credit for that paper. The number of Iranian publications mentioned in endodontic textbooks has increased in recent years. The aim of this investigation was to evaluate the number of Iranian articles quoted in eminent endodontic textbooks. Methods and Materials: Three well-known textbooks (Ingle’s Endodontics, Seltzer and Bender’s Dental Pulp and Cohen’s Pathways of the Pulp) were chosen, and all editions of these textbooks since 2000 were investigated for quoted Iranian publications. Only Iranian authors with affiliations from a domestic university were included. All references at the end of each chapter were read by hand searching, and the results were noted. The trend and percentage of Iranian publications in different editions of the textbooks were also calculated. The numbers of citations of these publications in the Google Scholar and Scopus databases were also obtained. Results: The number of Iranian publications in all of the well-known textbooks has increased notably since 2000. The number and percentage of Iranian publications in the latest edition of Cohen’s Pathways of the Pulp were higher than in the other textbooks, as well as than in the previous edition of the same text. Conclusion: The number and percentage of Iranian publications in the field of endodontics in all three textbooks have increased remarkably since 2000. PMID:27471523

  12. A novel hybrid meta-heuristic technique applied to the well-known benchmark optimization problems

    Science.gov (United States)

    Abtahi, Amir-Reza; Bijari, Afsane

    2016-09-01

    In this paper, a hybrid meta-heuristic algorithm based on the imperialistic competition algorithm (ICA), harmony search (HS), and simulated annealing (SA) is presented. The body of the proposed hybrid algorithm is based on ICA. The proposed hybrid algorithm inherits the advantages of the harmony-creation process of the HS algorithm to improve the exploitation phase of the ICA algorithm. In addition, the proposed hybrid algorithm uses SA to strike a balance between the exploration and exploitation phases. The proposed hybrid algorithm is compared with several meta-heuristic methods, including the genetic algorithm (GA), HS, and ICA, on several well-known benchmark instances. Comprehensive experiments and statistical analysis on standard benchmark functions certify the superiority of the proposed method over the other algorithms. The efficacy of the proposed hybrid algorithm is promising, and it can be used in several real-life engineering and management problems.
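
    Of the three ingredients named above, the simulated-annealing acceptance rule is the one responsible for balancing exploration against exploitation. The sketch below runs that rule alone on the classic sphere benchmark; the cooling schedule, step size and dimensionality are arbitrary choices, and the full ICA/HS/SA hybrid of the paper is not reproduced here.

      import math
      import random

      random.seed(42)

      def sphere(x):
          """Classic benchmark function: global minimum 0 at the origin."""
          return sum(v * v for v in x)

      x = [random.uniform(-5, 5) for _ in range(10)]
      best, temp = sphere(x), 10.0
      for _ in range(20_000):
          cand = [v + random.gauss(0, 0.3) for v in x]
          delta = sphere(cand) - sphere(x)
          # Metropolis rule: always accept improvements; accept some
          # worsening moves while the temperature is high (exploration).
          if delta < 0 or random.random() < math.exp(-delta / temp):
              x = cand
          temp *= 0.9995            # geometric cooling
          best = min(best, sphere(x))

      print(f"best sphere value found: {best:.4f}")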

  13. An investigation on consumer’s behaviors towards well-known luxury brands

    Directory of Open Access Journals (Sweden)

    Mohammad Javad Ghasemi

    2014-03-01

    This paper presents an empirical investigation of consumers' behavior towards well-known luxury brands in the Iranian market. The study designs a questionnaire on a Likert scale and distributes it among 250 randomly selected people who purchase luxury products. The study investigates the effects of three variables, including perceived value, social normality, and the need for being exclusive, on the perception of a brand as motivating customers to purchase luxury products. In addition, the study tries to find out whether customers' educational backgrounds influence the purchase of luxury products. Cronbach alphas are all well above the minimum acceptable level, which validates the survey. Using structural equation modeling, the study confirms all hypotheses of the survey.

  15. Probabilistic insurance

    OpenAIRE

    Wakker, P. P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  16. Quantitative NDI integration with probabilistic fracture mechanics for the assessment of fracture risk in pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Kurz, Jochen H.; Cioclov, Dragos; Dobmann, Gerd; Boiler, Christian [Fraunhofer Inst. fuer Zerstoerungsfreie Pruefverfahren (IZFP), Saarbruecken (Germany)

    2009-07-01

    In the context of the probabilistic paradigm of fracture risk assessment in structural components, a computer simulation rationale is presented which is based on the integration of quantitative non-destructive inspection and probabilistic fracture mechanics. In this study, failure under static loading is assessed in the format known as the Failure Assessment Diagram (FAD). The key concept in the analysis is the stress intensity factor (SIF), which accounts for the geometry of the component and the size of a pre-existing defect of a crack nature. FAD assessments can be made in a deterministic sense, which yields the end result in the dual terms of fail/not-fail. The fracture risk is evaluated in probabilistic terms. The probabilistic pattern superposed over the deterministic one (in the mean sense) is implemented via Monte Carlo sampling. The probabilistic fracture simulation yields a more informative analysis in terms of probability of failure. An important feature of the PVrisk software is the ability to simulate the influence of the quality and reliability of non-destructive inspection (NDI). This is achieved by algorithmically integrating probabilistic FAD analysis and the probability of detection (POD). The POD information can only be applied in a probabilistic analysis and leads to a refinement of the assessment. By this means, the decrease in probability of failure (increase in reliability) when POD-characterized NDI is applied can be ascertained. Therefore, this procedure can be used as a tool for inspection-based lifetime concepts. In this paper, results of sensitivity analyses of the fracture toughness are presented with the aim of outlining, in terms of non-failure probabilities, the benefits of applying NDI of various qualities in comparison with the situation when NDI is lacking. (orig.)
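
    The benefit of POD-characterized inspection can be shown with a few lines of Monte Carlo. The sketch below replaces the full FAD locus with a plain brittle-fracture criterion, K_I = Y*sigma*sqrt(pi*a) >= K_mat, and every distribution and POD parameter is invented; it is a schematic of the idea, not the PVrisk implementation.

      import numpy as np

      rng = np.random.default_rng(7)
      n = 200_000

      # Hypothetical inputs for a flawed component under static load.
      a = rng.exponential(scale=1.0, size=n)            # crack depth, mm
      k_mat = rng.normal(40.0, 4.0, size=n)             # toughness, MPa*m^0.5
      stress = rng.normal(300.0, 30.0, size=n)          # applied stress, MPa

      Y = 1.12                                          # geometry factor
      k_applied = Y * stress * np.sqrt(np.pi * a / 1000.0)
      fails = k_applied >= k_mat

      def pod(depth_mm, a90=5.0):
          """Illustrative probability-of-detection curve, POD(a90) ~ 0.99."""
          return 1.0 - np.exp(-4.6 * depth_mm / a90)

      detected = rng.uniform(size=n) < pod(a)           # found and repaired
      print(f"P(failure) without NDI: {fails.mean():.2e}")
      print(f"P(failure) with NDI:    {(fails & ~detected).mean():.2e}")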

  17. 78 FR 15746 - Compendium of Analyses To Investigate Select Level 1 Probabilistic Risk Assessment End-State...

    Science.gov (United States)

    2013-03-12

    ... COMMISSION Compendium of Analyses To Investigate Select Level 1 Probabilistic Risk Assessment End-State... document entitled: Compendium of Analyses to Investigate Select Level 1 Probabilistic Risk Assessment End...-415-7000, email: Donald.Helton@nrc.gov. SUPPLEMENTARY INFORMATION: This report, "Compendium of...

  18. Uncertainty analysis of USES 3.0. Improving risk management through probabilistic risk assessment of agricultural pesticides

    NARCIS (Netherlands)

    Rikken MGJ; Wijnen HJ van; Linders JBHJ; Jager DT; CSR; ECO

    2003-01-01

    Risk assessment of pesticides in the Netherlands is carried out using the computerised Uniform System for the Evaluation of Substances (USES). In USES the measure of risk is a single-point or deterministic estimate. Here, it is shown how a probabilistic assessment, incorporating knowledge about unc

  19. Use of limited data to construct Bayesian networks for probabilistic risk assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Groth, Katrina M.; Swiler, Laura Painton

    2013-03-01

    Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation and control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled "Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment". The goal of the work was to establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.
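
    Building a Bayesian-network node from sparse data typically reduces to estimating conditional probability tables with a smoothing prior, so that a handful of observations cannot produce hard 0 or 1 probabilities. A toy sketch with an invented two-node, HRA-flavored structure (stress level influencing operator error) and Laplace smoothing; neither the data nor the structure comes from the report.

      # Sparse observations: (stress_level, operator_error) pairs.
      data = [("high", 1), ("high", 1), ("high", 0), ("low", 0),
              ("low", 0), ("low", 0), ("low", 1), ("high", 1)]

      levels = ["low", "high"]
      alpha = 1.0   # Laplace pseudo-count keeps estimates away from 0 and 1

      # CPT P(error | stress) with additive smoothing.
      cpt = {}
      for s in levels:
          n_err = sum(1 for st, e in data if st == s and e == 1)
          n_tot = sum(1 for st, e in data if st == s)
          cpt[s] = (n_err + alpha) / (n_tot + 2 * alpha)

      # Smoothed prior P(stress) and the marginal P(error).
      prior = {s: (sum(1 for st, _ in data if st == s) + alpha)
                  / (len(data) + len(levels) * alpha) for s in levels}
      p_error = sum(prior[s] * cpt[s] for s in levels)

      print("P(error | stress):", {s: round(p, 3) for s, p in cpt.items()})
      print(f"P(error) = {p_error:.3f}")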

  20. Structural, electronic and optical properties of well-known primary explosive: Mercury fulminate

    Energy Technology Data Exchange (ETDEWEB)

    Yedukondalu, N.; Vaitheeswaran, G., E-mail: gvsp@uohyd.ernet.in [Advanced Centre of Research in High Energy Materials (ACRHEM), University of Hyderabad, Prof. C. R. Rao Road, Gachibowli, Hyderabad, Telangana 500046 (India)

    2015-11-28

    Mercury Fulminate (MF) has been one of the well-known primary explosives since the 17th century and has rendered invaluable service over many years. However, its correct molecular and crystal structures were determined only recently, some 300 years after its discovery. In the present study, we report the pressure-dependent structural, elastic, electronic and optical properties of MF. Non-local correction methods have been employed to capture the weak van der Waals interactions in layered and molecular energetic MF. Among the non-local correction methods tested, the optB88-vdW method works well for the investigated compound. The obtained equilibrium bulk modulus reveals that MF is softer than the well-known primary explosives Silver Fulminate (SF), silver azide and lead azide. MF exhibits anisotropic compressibility (b > a > c) under pressure; consequently, the corresponding elastic moduli decrease in the following order: C₂₂ > C₁₁ > C₃₃. The structural and mechanical properties suggest that MF is more sensitive to detonation along the c-axis (similar to RDX) due to the high compressibility of the Hg⋯O non-bonded interactions along that axis. Electronic structure and optical properties were calculated, including spin-orbit (SO) interactions, using the full potential linearized augmented plane wave method within the recently developed Tran-Blaha modified Becke-Johnson (TB-mBJ) potential. The calculated TB-mBJ electronic structures of SF and MF show that these compounds are indirect bandgap insulators. Also, SO coupling is found to be more pronounced for the 4d and 5d states of the Ag and Hg atoms of SF and MF, respectively. Partial densities of states and electron charge density maps were used to describe the nature of the chemical bonding. The Ag-C bond is more directional than the Hg-C bond, which makes SF more unstable than MF. The effect of SO coupling on the optical properties has also been studied and found to be significant for both compounds (SF and MF).

  1. OFFENSE ELEMENTS ANALYSIS IN BASKETBALL APPLIED BY WELL-KNOWN COACHES

    Directory of Open Access Journals (Sweden)

    Naid Kadušić

    2010-09-01

    Offense tactics can be divided into individual, group and collective tactics. This article deals with man-to-man offense, as well as with the primary and secondary fast break. It considers and analyses the offense tactics of well-known contemporary teams. Their tactics are presented through different simplified patterns. These patterns allow the teams to find more ideas on how to outperform an opponent. The aim of each strategy is to bring about a specific outcome, which may be positive or negative. A game outcome is the final product of a team, and it shows whether any progress has been made. One of the important offense tactics is man-to-man, which is very demanding. There is also the secondary offense, which represents a situation in which the opponent's defense is outnumbered by the offense. These tactics are closely connected with the primary and secondary fast break offense. Another interesting type of offensive play is the pick and roll, one of the most common contemporary plays used in basketball.

  2. Integrated probabilistic risk assessment for nanoparticles: the case of nanosilica in food

    Energy Technology Data Exchange (ETDEWEB)

    Jacobs, Rianne, E-mail: rianne.jacobs@wur.nl; Voet, Hilko van der; Braak, Cajo J. F. ter [Wageningen University and Research Centre, Biometris (Netherlands)

    2015-06-15

    Insight into the risks of nanotechnology and the use of nanoparticles is an essential condition for the social acceptance and safe use of nanotechnology. One of the problems with which the risk assessment of nanoparticles is faced is the lack of data, resulting in uncertainty in the risk assessment. We attempt to quantify some of this uncertainty by expanding a previous deterministic study on nanosilica (5–200 nm) in food into a fully integrated probabilistic risk assessment. We use the integrated probabilistic risk assessment method, in which statistical distributions and bootstrap methods are used to quantify uncertainty and variability in the risk assessment. Due to the large amount of uncertainty present, this probabilistic method, which separates variability from uncertainty, contributed to a more understandable risk assessment. We found that quantifying the uncertainties did not increase the perceived risk relative to the outcome of the deterministic study. We pinpointed particular aspects of the hazard characterization that contributed most to the total uncertainty in the risk assessment, suggesting that further research would benefit most from obtaining more reliable data on those aspects.
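
    The separation of variability from uncertainty in this kind of integrated assessment is usually implemented as a nested (two-dimensional) Monte Carlo: an outer loop draws uncertain parameters, an inner loop draws individuals. A schematic sketch follows; all distributions and the hazard threshold are invented and are not the nanosilica values.

      import numpy as np

      rng = np.random.default_rng(3)
      n_unc, n_var = 200, 5_000   # outer: uncertainty; inner: variability

      frac_at_risk = np.empty(n_unc)
      for i in range(n_unc):
          # Outer loop: uncertain (bootstrap-like) parameter draws.
          mu_log_intake = rng.normal(0.10, 0.03)     # mean log10 intake
          hazard_dose = rng.lognormal(np.log(2.0), 0.5)

          # Inner loop: inter-individual variability in intake.
          intake = 10 ** rng.normal(mu_log_intake, 0.4, size=n_var)
          frac_at_risk[i] = (intake > hazard_dose).mean()

      lo, med, hi = np.quantile(frac_at_risk, [0.05, 0.5, 0.95])
      print(f"population fraction at risk: median {med:.3f}, "
            f"90% uncertainty interval [{lo:.3f}, {hi:.3f}]")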

  4. Potential for the adoption of probabilistic risk assessments by end-users and decision-makers

    DEFF Research Database (Denmark)

    Frewer, Lynn J.; Fischer, Arnout R. H.; van den Brink, Paul J.

    2008-01-01

    -user and regulatory uptake has not been, to date, extensive. A case study, utilizing the Theory of Planned Behavior, was conducted in order to identify potential determinants of end-user adoption of probabilistic risk assessments associated with the ecotoxicological impact of pesticides. Seventy potential end...

  5. The Terrestrial Investigation Model: A probabilistic risk assessment model for birds exposed to pesticides

    Science.gov (United States)

    One of the major recommendations of the National Academy of Sciences to the USEPA, NMFS and USFWS was to utilize probabilistic methods when assessing the risks of pesticides to federally listed endangered and threatened species. The Terrestrial Investigation Model (TIM, version 3....

  6. Optimal Portfolio Allocation under a Probabilistic Risk Constraint and the Incentives for Financial Innovation

    NARCIS (Netherlands)

    J. Daníelsson (Jón); B.N. Jorgensen (Bjørn); C.G. de Vries (Casper); X. Yang (Xiaoguang)

    2001-01-01

    We derive, in a complete markets environment, an investor's optimal portfolio allocation subject to both a budget constraint and a probabilistic risk constraint. We demonstrate that the set of feasible portfolios need not be connected or convex, while the number of local optima increases ...

  9. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  12. Probabilistic disaggregation model with application to natural hazard risk assessment of portfolios

    DEFF Research Database (Denmark)

    Custer, Rocco; Nishijima, Kazuyoshi

    are usually deterministic and make use of an auxiliary indicator, such as land cover, to spatially distribute exposures. As the dependence between the auxiliary indicator and the disaggregated number of exposures is generally imperfect, uncertainty arises in the disaggregation. This paper therefore proposes a probabilistic...... disaggregation model that considers the uncertainty in the disaggregation, based on the scaled Dirichlet distribution. The proposed probabilistic disaggregation model is applied to a portfolio of residential buildings in the Canton of Bern, Switzerland, subject to flood risk. Thereby, the model is verified...
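
    The core idea can be sketched with a standard Dirichlet distribution standing in for the scaled Dirichlet of the paper: the cell shares are random rather than fixed, with mean equal to the auxiliary indicator and a precision parameter expressing confidence in it. The portfolio total, indicator values and precision below are invented.

      import numpy as np

      rng = np.random.default_rng(5)

      total_buildings = 12_000   # known portfolio total for the region

      # Hypothetical auxiliary indicator per grid cell (built-up area share).
      indicator = np.array([0.40, 0.25, 0.20, 0.10, 0.05])
      precision = 50.0           # larger = more trust in the indicator

      shares = rng.dirichlet(indicator * precision, size=1_000)
      counts = shares * total_buildings

      mean = counts.mean(axis=0)
      lo, hi = np.percentile(counts, [5, 95], axis=0)
      for c in range(indicator.size):
          print(f"cell {c}: mean {mean[c]:7.0f}, "
                f"90% interval [{lo[c]:6.0f}, {hi[c]:6.0f}]")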

  13. IgG4-related Hashimoto's thyroiditis - a new variant of a well-known disease.

    Science.gov (United States)

    Luiz, Henrique Vara; Gonçalves, Diogo; Silva, Tiago Nunes da; Nascimento, Isabel; Ribeiro, Ana; Mafra, Manuela; Manita, Isabel; Portugal, Jorge

    2014-11-01

    Hashimoto's thyroiditis (HT) has been characterized for many years as a well-defined clinicopathologic entity, but is now considered a heterogeneous disease. IgG4-related HT is a new subtype characterized by thyroid inflammation rich in IgG4-positive plasma cells and marked fibrosis. It may be part of the systemic IgG4-related disease. We report the case of a 56-year-old Portuguese man who presented with a one-month history of progressive neck swelling and dysphagia. Laboratory testing revealed increased inflammatory parameters, subclinical hypothyroidism and very high levels of thyroid autoantibodies. Cervical ultrasound (US) demonstrated an enlarged and heterogeneous thyroid gland and two hypoechoic nodules. US-guided fine needle aspiration cytology was consistent with lymphocytic thyroiditis. The patient underwent total thyroidectomy, and microscopic examination identified typical findings of HT, marked fibrosis limited to within the thyroid capsule and lymphoplasmacytic infiltration, with >50 IgG4-positive plasma cells per high-power field and an IgG4/IgG ratio of >40%. After surgery, the serum IgG4 concentration was high-normal. Symptom relief and a reduction in laboratory inflammatory parameters were noticed. Thyroid function is controlled with levothyroxine. To our knowledge, we report the first case of IgG4-related HT in a non-Asian patient. We also present a review of the literature regarding IgG4-related disease and IgG4-related HT. Our case highlights this new variant of the well-known HT and helps physicians recognize its main clinical features, allowing for proper diagnosis and treatment.

  14. Time course of EEG oscillations during repeated listening of a well-known aria

    Directory of Open Access Journals (Sweden)

    Lutz Jäncke

    2015-07-01

    While previous studies have analyzed mean neurophysiological responses to musical stimuli, the current study aimed to identify specific time courses of EEG oscillations associated with dynamic changes in the acoustic features of the musical stimulus. In addition, we were interested in whether these time courses change during repeated presentation of the same musical piece. A total of 16 subjects repeatedly listened to the well-known aria Nessun dorma, sung by Paul Potts, while continuous 128-channel EEG and heart rate (HR), as well as electrodermal (EDA) responses, were recorded. The time courses of the EEG oscillations were calculated using a time resolution of 1 second for several frequency bands defined on the basis of individual alpha-peak frequencies (theta, low alpha-1, low alpha-2, upper alpha, and beta). For all frequency bands, we identified a more or less continuous increase in power relative to a baseline period, indicating strong event-related synchronization (ERS) during music listening. The ERS time courses, however, did not correlate strongly with the time courses of the acoustic features of the aria. In addition, we did not observe changes in EEG oscillations after repeated presentation of the same musical piece. Aside from this distinctive feature, we identified a remarkable variability in EEG oscillations, both within and between the repeated presentations of the aria. We interpret the continuous increase in ERS observed in all frequency bands during music listening as an indicator of a particular neurophysiological and psychological state evoked by music listening. We suggest that this state is characterized by increased internal attention (accompanied by reduced external attention), increased inhibition of brain networks not involved in the generation of this internal state, the maintenance of a particular level of general alertness, and a type of brain state that can be described as mind wandering. The overall state can be ...

  15. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework, described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). In this framework, the following areas are considered: probabilistic models to provide information specific to advanced SMRs; representation of specific SMR design issues such as having co-located modules and passive safety features; use of modern open-source and readily available analysis methods; internal and external events resulting in impacts to safety; all-hazards considerations; methods to support the identification of design vulnerabilities; and mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  16. Scenario logic and probabilistic management of risk in business and engineering

    CERN Document Server

    Solojentsev, E D

    2005-01-01

    In this volume the methodological aspects of scenario logic and probabilistic (LP) non-success risk management are considered. The theoretical bases of scenario non-success risk LP-management in business and engineering are also stated. Methods and algorithms for scenario risk LP-management in problems of classification, investment and effectiveness are described. Risk LP-models and results of numerical investigations for credit risks, risk of fraud, security portfolio risk, risk of quality, accuracy, and risk in multi-stage system reliability are given. In addition, a rather large number of new problems of estimation, analysis and management of risk are considered. Software for risk problems based on LP-methods, LP-theory, and GIE is described as well. Audience: This volume is intended for experts and scientists working on risk in business and engineering, in problems of classification, investment and effectiveness, and for post-graduates in those subject areas.

  17. Assessment of possible airborne impact from risk sites: methodology for probabilistic atmospheric studies

    Directory of Open Access Journals (Sweden)

    A. A. Baklanov

    2004-01-01

    The main purpose of this study is to develop a methodology for multidisciplinary nuclear risk and vulnerability assessment, and to test this methodology by estimating the nuclear risk to the population in the Northern European countries in case of a severe accident at a nuclear risk site. For the assessment of probabilistic risk and vulnerability, a combination of social-geophysical factors and probabilities is considered. The main focus of this paper is the description of a methodology for evaluating the atmospheric transport of radioactive releases from the risk site regions, based on long-term trajectory modelling. The suggested methodology is presented from the probabilistic point of view. The main questions posed are: What are the probabilities and times for radionuclide atmospheric transport to different neighbouring countries and territories in case of a hypothetical accidental release at a nuclear risk site? Which geographical territories or countries are at the highest risk from hypothetical accidental releases? To answer these questions we suggest applying the following research tools for probabilistic atmospheric studies. The first tool is atmospheric modelling to calculate multiyear forward trajectories originating over the sites. The second tool is statistical analysis to explore the temporal and spatial structure of the calculated trajectories and to evaluate different probabilistic impact indicators: atmospheric transport pathways, airflow, fast transport, typical transport time, maximum possible impact zone, maximum reaching distance, etc. These indicators are applicable for further GIS analysis and integration to estimate regional risk and vulnerability in case of accidental releases at the risk sites, and for planning emergency response and preparedness systems.
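
    The probabilistic impact indicators listed above reduce, in the simplest case, to counting where the multiyear forward trajectories go and how long they take. A toy sketch with a handful of invented trajectory endpoints (territory reached, transport time in hours):

      from collections import Counter

      # Hypothetical endpoints of forward trajectories from one risk site.
      trajectories = [("Norway", 36), ("Finland", 24), ("Norway", 48),
                      ("Sweden", 60), ("Finland", 30), ("Norway", 40),
                      ("Russia", 18), ("Finland", 28), ("Norway", 52)]

      hits = Counter(t for t, _ in trajectories)
      n = len(trajectories)
      for territory, k in hits.most_common():
          times = [h for t, h in trajectories if t == territory]
          print(f"{territory:8s} P(airflow) = {k / n:.2f}, "
                f"typical transport time = {sum(times) / len(times):.0f} h")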

  18. Probabilistic risk assessment of emerging materials: case study of titanium dioxide nanoparticles.

    Science.gov (United States)

    Tsang, Michael P; Hristozov, Danail; Zabeo, Alex; Koivisto, Antti Joonas; Jensen, Alexander Christian Østerskov; Jensen, Keld Alstrup; Pang, Chengfang; Marcomini, Antonio; Sonnemann, Guido

    2017-05-01

    The development and use of emerging technologies such as nanomaterials can provide both benefits and risks to society. Emerging materials may promise many technological advantages but may not be well characterized in terms of their production volumes, magnitude of emissions, behaviour in the environment and effects on living organisms. This uncertainty can present challenges to scientists developing these materials and to persons responsible for defining and measuring their adverse impacts. Human health risk assessment is a method of identifying the intrinsic hazard of, and quantifying the dose-response relationship and exposure to, a chemical, in order to finally arrive at an estimate of risk. Commonly applied deterministic approaches may not sufficiently estimate and communicate the likelihood of risks from emerging technologies whose uncertainty is large. Probabilistic approaches allow parameters in the risk assessment process to be defined by distributions instead of single deterministic values whose uncertainty could undermine the value of the assessment. A probabilistic approach was applied to the dose-response and exposure assessment of a case study involving the production of nanoparticles of titanium dioxide in seven different exposure scenarios. Only one exposure scenario showed a statistically significant level of risk. This scenario involved dumping high volumes of nano-TiO2 powders into an open vessel with no personal protective equipment. The probabilistic approach provided not only the likelihood of, but also the major contributing factors to, the estimated risk (e.g. emission potential).

  19. A probabilistic assessment of health risks associated with short-term exposure to tropospheric ozone

    Energy Technology Data Exchange (ETDEWEB)

    Whitfield, R.G.; Biller, W.F.; Jusko, M.J.; Keisler, J.M.

    1996-06-01

    The work described in this report is part of a larger risk assessment sponsored by the U.S. Environmental Protection Agency. Earlier efforts developed exposure-response relationships for acute health effects among populations engaged in heavy exertion. Those efforts also developed a probabilistic national ambient air quality standards exposure model and a general methodology for integrating probabilistic exposure-response relationships and exposure estimates to calculate overall risk results. Recently published data make it possible to model additional health endpoints (for exposure at moderate exertion), including hospital admissions. New air quality and exposure estimates for alternative national ambient air quality standards for ozone are combined with exposure-response models to produce the risk results for hospital admissions and acute health effects. Sample results explain the methodology and introduce risk output formats.

  20. Probabilistic risk analysis toward cost-effective 3S (safety, safeguards, security) implementation

    Science.gov (United States)

    Suzuki, Mitsutoshi; Mochiji, Toshiro

    2014-09-01

    Probabilistic Risk Analysis (PRA) has been in use in the safety field for several decades, and advanced nuclear countries have already incorporated this methodology into their regulatory systems. However, PRA has not been developed for safeguards and security so far because of the inherent difficulty of modelling intentional and malicious acts. In this paper, probabilistic proliferation and risk analysis based on a random process is applied to a hypothetical reprocessing process and to the physical protection system of a nuclear reactor, using the Markov model originally developed by the Proliferation Resistance and Physical Protection Working Group (PRPPWG) of the Generation IV International Forum (GIF). Through the challenge of quantifying security risk as a frequency in this model, an integrated risk notion across the 3S areas, aimed at cost-effective installation of the corresponding countermeasures, is discussed in a heroic manner.

  1. Development of fire simulation models for radiative heat transfer and probabilistic risk assessment

    OpenAIRE

    Hostikka, Simo

    2008-01-01

    An essential part of fire risk assessment is the analysis of fire hazards and fire propagation. In this work, models and tools for two different aspects of numerical fire simulation have been developed. The primary objectives have been firstly to investigate the possibility of exploiting state-of-the-art fire models within probabilistic fire risk assessments and secondly to develop a computationally efficient solver of thermal radiation for the Fire Dynamics Simulator (FDS) code. In the f...

  2. A methodology for post-mainshock probabilistic assessment of building collapse risk

    Science.gov (United States)

    Luco, N.; Gerstenberger, M.C.; Uma, S.R.; Ryu, H.; Liel, A.B.; Raghunandan, M.

    2011-01-01

    This paper presents a methodology for post-earthquake probabilistic risk (of damage) assessment that we propose in order to develop a computational tool for automatic or semi-automatic assessment. The methodology utilizes the same so-called risk integral which can be used for pre-earthquake probabilistic assessment. The risk integral couples (i) ground motion hazard information for the location of a structure of interest with (ii) knowledge of the fragility of the structure with respect to potential ground motion intensities. In the proposed post-mainshock methodology, the ground motion hazard component of the risk integral is adapted to account for aftershocks which are deliberately excluded from typical pre-earthquake hazard assessments and which decrease in frequency with the time elapsed since the mainshock. Correspondingly, the structural fragility component is adapted to account for any damage caused by the mainshock, as well as any uncertainty in the extent of this damage. The result of the adapted risk integral is a fully-probabilistic quantification of post-mainshock seismic risk that can inform emergency response mobilization, inspection prioritization, and re-occupancy decisions.
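
    The risk integral the authors refer to is compact enough to sketch numerically: the mean annual frequency of collapse is the fragility curve integrated against the magnitude of the hazard-curve increments. Both curves below are invented toy shapes standing in for the paper's aftershock-adjusted inputs.

```python
# Sketch: lambda_collapse = integral of P(C | im) * |d lambda(im)| over im.
import math

def fragility(im, median=0.6, beta=0.5):
    """P(collapse | IM = im); lognormal CDF, a common assumption."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

def hazard(im):
    """Mean annual frequency of exceeding im (toy power-law hazard curve)."""
    return 1e-4 * im ** -2.5

ims = [0.01 * i for i in range(1, 500)]
lam = 0.0
for a, b in zip(ims, ims[1:]):
    d_haz = hazard(a) - hazard(b)              # frequency of IM landing in [a, b)
    lam += fragility(0.5 * (a + b)) * d_haz    # weight by P(collapse | IM)
print(f"mean annual collapse frequency ~ {lam:.2e}")
```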

  3. Dynamic Positioning System (DPS) Risk Analysis Using Probabilistic Risk Assessment (PRA)

    Science.gov (United States)

    Thigpen, Eric B.; Boyer, Roger L.; Stewart, Michael A.; Fougere, Pete

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Safety & Mission Assurance (S&MA) directorate at the Johnson Space Center (JSC) has applied its knowledge and experience with Probabilistic Risk Assessment (PRA) to projects in industries ranging from spacecraft to nuclear power plants. PRA is a comprehensive and structured process for analyzing risk in complex engineered systems and/or processes. The PRA process enables the user to identify potential risk contributors such as hardware and software failure, human error, and external events. Recent developments in the oil and gas industry have presented opportunities for NASA to lend its PRA expertise to both ongoing and developmental projects within the industry. This paper provides an overview of the PRA process and demonstrates how this process was applied in estimating the probability that a Mobile Offshore Drilling Unit (MODU) operating in the Gulf of Mexico and equipped with a generically configured Dynamic Positioning System (DPS) loses location and needs to initiate an emergency disconnect. The PRA described in this paper is intended to be generic such that the vessel meets the general requirements of an International Maritime Organization (IMO) Maritime Safety Committee (MSC)/Circ. 645 Class 3 dynamically positioned vessel. The results of this analysis are not intended to be applied to any specific drilling vessel, although provisions were made to allow the analysis to be configured to a specific vessel if required.

  4. Site-specific probabilistic ecological risk assessment of a volatile chlorinated hydrocarbon-contaminated tidal estuary.

    Science.gov (United States)

    Hunt, James; Birch, Gavin; Warne, Michael St J

    2010-05-01

    Groundwater contaminated with volatile chlorinated hydrocarbons (VCHs) was identified as discharging to Penrhyn Estuary, an intertidal embayment of Botany Bay, New South Wales, Australia. A screening-level hazard assessment of surface water in Penrhyn Estuary identified an unacceptable hazard to marine organisms posed by VCHs. Given the limitations of hazard assessments, the present study conducted a higher-tier, quantitative probabilistic risk assessment using the joint probability curve (JPC) method that accounted for variability in exposure and toxicity profiles to quantify risk (delta). Risk was assessed for 24 scenarios, including four areas of the estuary based on three exposure scenarios (low tide, high tide, and both low and high tides) and two toxicity scenarios (chronic no-observed-effect concentrations [NOEC] and 50% effect concentrations [EC50]). Risk (delta) was greater at low tide than at high tide and varied throughout the tidal cycle. Spatial distributions of risk in the estuary were similar using both NOEC and EC50 data. The exposure scenario including data combined from both tides was considered the most accurate representation of the ecological risk in the estuary. When assessing risk using data across both tides, the greatest risk was identified in the Springvale tributary (delta = 25%), closest to the source area, followed by the inner estuary (delta = 4%) and the Floodvale tributary (delta = 2%), with the lowest risk in the outer estuary (delta = 0.1%), farthest from the source area. Going from the screening-level ecological risk assessment (ERA) to the probabilistic ERA changed the risk from unacceptable to acceptable in 50% of the exposure scenarios in two of the four areas within the estuary. The probabilistic ERA provided a more realistic assessment of risk than the screening-level hazard assessment.
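
    A minimal sketch of the joint-probability idea, under stated assumptions: sample an assumed exposure distribution, evaluate an assumed lognormal species sensitivity distribution (SSD) at each sample, and average to obtain delta. None of the parameters reproduce the study's values.

```python
# Sketch: delta = expected fraction of species affected under exposure variability.
import math, random

def ssd_fraction_affected(conc, median=50.0, beta=1.0):
    """Fraction of species affected at `conc` (lognormal SSD CDF); toy parameters."""
    if conc <= 0.0:
        return 0.0
    return 0.5 * (1.0 + math.erf(math.log(conc / median) / (beta * math.sqrt(2.0))))

random.seed(7)
N = 200_000
samples = (random.lognormvariate(1.0, 1.2) for _ in range(N))  # exposure, ug/L
delta = sum(ssd_fraction_affected(c) for c in samples) / N
print(f"delta ~ {100.0 * delta:.2f}%")
```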

  5. Industrial metal pollution in water and probabilistic assessment of human health risk.

    Science.gov (United States)

    Saha, Narottam; Rahman, M Safiur; Ahmed, Mohammad Boshir; Zhou, John L; Ngo, Huu Hao; Guo, Wenshan

    2016-10-28

    Concentrations of eight heavy metals in surface water and groundwater around the Dhaka Export Processing Zone (DEPZ) industrial area were investigated, and the health risk posed to local children and adult residents via ingestion and dermal contact was evaluated using deterministic and probabilistic approaches. Metal concentrations (except Cu, Mn, Ni, and Zn) in Bangshi River water were above the drinking water quality guidelines, while those in groundwater were below the recommended limits. Concentrations of metals in surface water decreased as a function of distance. Estimates of non-carcinogenic health risk for surface water revealed that mean hazard index (HI) values of As, Cr, Cu, and Pb for combined pathways (i.e., ingestion and dermal contact) were >1.0 for both age groups. The estimated risk came mainly from the ingestion pathway. The HI values for all the examined metals in groundwater, however, were below 1.0. Although the probabilistically estimated 95th percentile values of total carcinogenic risk (TCR) exceeded the benchmark, mean TCR values were less than 1 × 10(-4). Simulated results showed that 20.13% and 5.43% of TCR values for surface water were >1 × 10(-4) for adults and children, respectively. Deterministic and probabilistic estimates of cancer risk through exposure to groundwater were well below the safety limit. Overall, the population exposed to Bangshi River water faced both carcinogenic and non-carcinogenic health threats, and the risk was higher for adults. Sensitivity analysis identified exposure duration (ED) and ingestion rate (IR) of water as the variables most affecting the outcome of the probabilistic risk estimation model.
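
    The Monte Carlo and sensitivity machinery described here is generic enough to sketch. The snippet below samples invented exposure-factor distributions, computes a carcinogenic risk of the standard chronic-daily-intake times slope-factor form, and ranks inputs by Spearman correlation; no parameter values are taken from the study.

```python
# Sketch: probabilistic cancer-risk estimate with a rank-correlation
# sensitivity analysis. All distributions and constants are illustrative.
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
n = 20_000
conc = rng.lognormal(np.log(0.05), 0.6, n)   # metal concentration, mg/L (assumed)
ir   = rng.normal(2.0, 0.4, n).clip(0.5)     # ingestion rate, L/day (assumed)
ed   = rng.uniform(6, 30, n)                 # exposure duration, years (assumed)
bw   = rng.normal(60, 10, n).clip(30)        # body weight, kg (assumed)
ef, at, sf = 365, 70 * 365, 1.5              # exposure freq, lifetime days, slope factor

tcr = conc * ir * ef * ed * sf / (bw * at)   # total carcinogenic risk
print(f"P(TCR > 1e-4) = {(tcr > 1e-4).mean():.3f}")

# Which uncertain inputs drive the output?
for name, x in [("conc", conc), ("IR", ir), ("ED", ed), ("BW", bw)]:
    rho, _ = spearmanr(x, tcr)
    print(f"  {name:4s} rho = {rho:+.2f}")
```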

  6. Assessing risk: the role of probabilistic risk assessment (PRA) in patient safety improvement.

    Science.gov (United States)

    Wreathall, J; Nemeth, C

    2004-06-01

    Morbidity and mortality due to "medical errors" compel better understanding of health care as a system. Probabilistic risk assessment (PRA) has been used to assess the designs of high hazard, low risk systems such as commercial nuclear power plants and chemical manufacturing plants and is now being studied for its potential in the improvement of patient safety. PRA examines events that contribute to adverse outcomes through the use of event tree analysis and determines the likelihood of event occurrence through fault tree analysis. It complements tools already in use in patient safety such as failure modes and effects analyses (FMEAs) and root cause analyses (RCAs). PRA improves on RCA by taking account of the more complex causal interrelationships that are typical in health care. It also enables the analyst to examine potential solution effectiveness by direct graphical representations. However, PRA simplifies real world complexity by forcing binary conditions on events, and it lacks adequate probability data (although recent developments help to overcome these limitations). Its reliance on expert assessment calls for deep domain knowledge which has to come from research performed at the "sharp end" of acute care.

  7. Probabilistic methodology for estimating radiation-induced cancer risk

    Energy Technology Data Exchange (ETDEWEB)

    Dunning, D.E. Jr.; Leggett, R.W.; Williams, L.R.

    1981-01-01

    The RICRAC computer code was developed at Oak Ridge National Laboratory to provide a versatile and convenient methodology for radiation risk assessment. The code allows as input essentially any dose pattern commonly encountered in risk assessments for either acute or chronic exposures, and it includes consideration of the age structure of the exposed population. Results produced by the analysis include the probability of one or more radiation-induced cancer deaths in a specified population, expected numbers of deaths, and expected years of life lost as a result of premature fatalities. These calculations include consideration of competing risks of death from all other causes. The program also generates a probability frequency distribution of the expected number of cancers in any specified cohort resulting from a given radiation dose. The methods may be applied to any specified population and dose scenario.
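
    The headline outputs described above follow from elementary probability once per-person risks are in hand. A minimal sketch with a hypothetical two-group cohort and an illustrative risk coefficient; competing risks of death, which RICRAC does model, are omitted here for brevity.

```python
# Sketch: P(one or more radiation-induced cancer deaths) and the expected number.
import math

risk_per_sv = 0.05                             # illustrative lifetime risk per Sv
doses_sv = [0.010] * 1_000 + [0.001] * 9_000   # hypothetical cohort doses

p = [min(1.0, d * risk_per_sv) for d in doses_sv]   # per-person death probability
p_any = 1.0 - math.prod(1.0 - pi for pi in p)       # P(at least one death)
expected = sum(p)                                   # expected number of deaths
print(f"E[deaths] = {expected:.2f}, P(>=1 death) = {p_any:.3f}")
```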

  8. Risk-based decision making in water management using probabilistic forecasts: results from a game experiment

    Science.gov (United States)

    Crochemore, Louise; Ramos, Maria-Helena; Pappenberger, Florian; van Andel, Schalk-Jan; Wood, Andy

    2014-05-01

    Probabilistic streamflow forecasts have been increasingly used or requested by practitioners in the operation of multipurpose water reservoirs. They usually integrate hydrologic inflow forecasts into their operational management rules to optimize water allocation or its economic value, to mitigate droughts, and to provide flood and ecological control, among other purposes. In this paper, we present an experiment conducted to investigate the use of probabilistic forecasts to make decisions on water reservoir outflows. The experiment was set up as a risk-based decision-making game. In the game, each participant acted as a water manager. A sequence of probabilistic inflow forecasts was presented, to be used to make a reservoir release decision at a monthly time step, subject to a few constraints. After each decision, the actual inflow was presented and the consequences of the decisions made were discussed. Results from the application of the game to different groups of scientists and operational managers during conferences and meetings in 2013 (a total of about 150 participants) illustrate the different strategies adopted by the players. This game experiment allowed participants to experience firsthand the challenges of probabilistic, quantitative decision-making.

  9. Probabilistic Risk Model for Organ Doses and Acute Health Effects of Astronauts on Lunar Missions

    Science.gov (United States)

    Kim, Myung-Hee Y.; Hu, Shaowen; Nounu, Hatem N.; Cucinotta, Francis A.

    2009-01-01

    Exposure to large solar particle events (SPEs) is a major concern during EVAs on the lunar surface and in Earth-to-Moon transit: up to 15% of crew time may be spent on EVA, with minimal radiation shielding. Therefore, an accurate assessment of SPE occurrence probability is required for mission planning by NASA. We apply probabilistic risk assessment (PRA) to the radiation protection of crews and the optimization of lunar mission planning.

  10. An evaluation of the reliability and usefulness of external-initiator PRA (probabilistic risk analysis) methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Budnitz, R.J.; Lambert, H.E. (Future Resources Associates, Inc., Berkeley, CA (USA))

    1990-01-01

    The discipline of probabilistic risk analysis (PRA) has become so mature in recent years that it is now being used routinely to assist decision-making throughout the nuclear industry. This includes decision-making that affects design, construction, operation, maintenance, and regulation. Unfortunately, not all sub-areas within the larger discipline of PRA are equally "mature," and therefore the many different types of engineering insights from PRA are not all equally reliable. 93 refs., 4 figs., 1 tab.

  11. Probabilistic risk assessment (PRA): status report and guidance for regulatory application. Draft report for comment

    Energy Technology Data Exchange (ETDEWEB)

    None

    1984-02-01

    This document describes the current status of the methodologies used in probabilistic risk assessment (PRA) and provides guidance for the application of the results of PRAs to the nuclear reactor regulatory process. The PRA studies that have been completed or are underway are reviewed. The levels of maturity of the methodologies used in a PRA are discussed. Insights derived from PRAs are listed. The potential uses of PRA results for regulatory purposes are discussed.

  12. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    Science.gov (United States)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner that is entirely compatible with, and integrated into, the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  13. Risk Analysis in Robust Control -- Making the Case for Probabilistic Robust Control

    CERN Document Server

    Chen, Xinjia; Zhou, Kemin

    2007-01-01

    This paper offers a critical view of the "worst-case" approach that is the cornerstone of robust control design. It is our contention that a blind acceptance of worst-case scenarios may lead to designs that are actually more dangerous than designs based on probabilistic techniques with a built-in risk factor. The real issue is one of modeling. If one accepts that no mathematical model of uncertainties is perfect then a probabilistic approach can lead to more reliable control even if it cannot guarantee stability for all possible cases. Our presentation is based on case analysis. We first establish that worst-case is not necessarily "all-encompassing." In fact, we show that for some uncertain control problems to have a conventional robust control solution it is necessary to make assumptions that leave out some feasible cases. Once we establish that point, we argue that it is not uncommon for the risk of unaccounted cases in worst-case design to be greater than that of the accepted risk in a probabilistic appro...

  14. A probabilistic topic model for clinical risk stratification from electronic health records.

    Science.gov (United States)

    Huang, Zhengxing; Dong, Wei; Duan, Huilong

    2015-12-01

    Risk stratification aims to provide physicians with an accurate assessment of a patient's clinical risk such that an individualized prevention or management strategy can be developed and delivered. Existing risk stratification techniques mainly focus on predicting the overall risk of an individual patient in a supervised manner and, at the cohort level, often offer little insight beyond a flat score-based segmentation of the labeled clinical dataset. To this end, this paper proposes a new approach for risk stratification by exploring a large volume of electronic health records (EHRs) in an unsupervised fashion. Along this line, it proposes a novel probabilistic topic modeling framework called the probabilistic risk stratification model (PRSM), based on Latent Dirichlet Allocation (LDA). PRSM represents a patient's clinical state as a probabilistic combination of latent sub-profiles and generates sub-profile-specific risk tiers of patients from their EHRs in a fully unsupervised fashion. The resulting strata can be readily interpreted as high, medium, and low risk. In addition, we present an extension of PRSM, called weakly supervised PRSM (WS-PRSM), which incorporates minimal prior information into the model in order to improve risk stratification accuracy and to make the models highly portable to risk stratification tasks for various diseases. We verify the effectiveness of the proposed approach on a clinical dataset containing 3463 coronary heart disease (CHD) patient instances. Both PRSM and WS-PRSM were compared with two established supervised risk stratification algorithms, logistic regression and support vector machine, and showed the effectiveness of our models in risk stratification of CHD in terms of Area Under the receiver operating characteristic Curve (AUC) analysis. Moreover, in comparison with PRSM, WS-PRSM achieved over 2% performance gain on the experimental dataset, demonstrating that
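
    A minimal sketch of the underlying idea, a patient as a mixture of latent sub-profiles, using scikit-learn's LDA on synthetic count data. The data and the final risk-scoring rule are invented stand-ins; PRSM's actual tiering is learned from EHRs as described above.

```python
# Sketch: unsupervised risk tiers from LDA topic mixtures over toy "EHR" counts.
import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
X = rng.poisson(1.0, size=(200, 30))     # 200 patients x 30 clinical event counts

lda = LatentDirichletAllocation(n_components=3, random_state=0)
theta = lda.fit_transform(X)             # per-patient mixture over 3 sub-profiles

# Invented rule: pretend sub-profile 2 is the clinically "high-risk" one and
# tier patients by its weight.
risk = theta[:, 2]
tiers = np.digitize(risk, [0.33, 0.66])  # 0 = low, 1 = medium, 2 = high
print("patients per tier:", np.bincount(tiers, minlength=3))
```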

  15. MATILDA Version 2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain - Part I

    Science.gov (United States)

    2017-03-13

    AFRL-RH-FS-TR-2017-0009. MATILDA Version-2: Rough Earth TIALD Model for Laser Probabilistic Risk Assessment in Hilly Terrain – Part I. Paul K... Distribution A: Approved for public release; distribution unlimited. PA Case No: TSRL-PA-2017-0169.

  16. Probabilistic Databases

    CERN Document Server

    Suciu, Dan; Koch, Christoph

    2011-01-01

    Probabilistic databases are databases where the value of some attributes or the presence of some records are uncertain and known only with some probability. Applications in many areas such as information extraction, RFID and scientific data management, data cleaning, data integration, and financial risk assessment produce large volumes of uncertain data, which are best modeled and processed by a probabilistic database. This book presents the state of the art in representation formalisms and query processing techniques for probabilistic data. It starts by discussing the basic principles for rep

  17. Probabilistic Approach to Risk Analysis of Chemical Spills at Sea

    Institute of Scientific and Technical Information of China (English)

    Magda Bogalecka; Krzysztof Kolowrocki

    2006-01-01

    Risk analysis of chemical spills at sea and their consequences for the sea environment are discussed. Mutual interactions between the process of sea accident initiating events, the process of sea environment threats, and the process of sea environment degradation are investigated. To describe these three processes, separate semi-Markov models are built. These models are then joined into one general model of the processes' interactions. Moreover, some comments on a method for statistical identification of the considered models are proposed.
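
    A plain discrete-time Markov chain is the simplest stand-in for the linked models described above; a full semi-Markov model would add state-dependent sojourn-time distributions. All states and transition probabilities below are invented for illustration.

```python
# Sketch: chained processes (initiating event -> threat -> degradation) as a
# Markov chain, estimating P(degradation within a year) by simulation.
import random

P = {   # one-step (daily) transition probabilities, invented
    "safe":     {"safe": 0.995, "spill": 0.005},
    "spill":    {"spill": 0.60, "threat": 0.30, "safe": 0.10},
    "threat":   {"threat": 0.50, "degraded": 0.40, "safe": 0.10},
    "degraded": {"degraded": 1.0},   # absorbing within the horizon
}

def simulate(steps=365, seed=None):
    state, rng = "safe", random.Random(seed)
    for _ in range(steps):
        r, cum = rng.random(), 0.0
        for nxt, p in P[state].items():
            cum += p
            if r < cum:
                state = nxt
                break
    return state

runs = 5_000
frac = sum(simulate(seed=i) == "degraded" for i in range(runs)) / runs
print(f"P(degraded within a year) ~ {frac:.3f}")
```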

  18. Multi-Hazard Advanced Seismic Probabilistic Risk Assessment Tools and Applications

    Energy Technology Data Exchange (ETDEWEB)

    Coleman, Justin L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Bolisetti, Chandu [Idaho National Lab. (INL), Idaho Falls, ID (United States); Veeraraghavan, Swetha [Idaho National Lab. (INL), Idaho Falls, ID (United States); Parisi, Carlo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Prescott, Steven R. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Gupta, Abhinav [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    Design of nuclear power plant (NPP) facilities to resist natural hazards has been a part of the regulatory process from the beginning of the NPP industry in the United States (US), but has evolved substantially over time. The original set of approaches and methods was entirely deterministic in nature and focused on a traditional engineering margins-based approach. However, over time probabilistic and risk-informed approaches were also developed and implemented in US Nuclear Regulatory Commission (NRC) guidance and regulation. A defense-in-depth framework has also been incorporated into US regulatory guidance over time. As a result, today, the US regulatory framework incorporates deterministic and probabilistic approaches for a range of different applications and for a range of natural hazard considerations. This framework will continue to evolve as a result of improved knowledge and newly identified regulatory needs and objectives, most notably in response to the NRC activities developed in response to the 2011 Fukushima accident in Japan. Although the US regulatory framework has continued to evolve over time, the tools, methods and data available to the US nuclear industry to meet the changing requirements have not kept pace. Notably, there is significant room for improvement in the tools and methods available for external event probabilistic risk assessment (PRA), which is the principal assessment approach used in risk-informed regulations and risk-informed decision-making applied to natural hazard assessment and design. This is particularly true if PRA is applied to natural hazards other than seismic loading. Development of a new set of tools and methods that incorporate current knowledge, modern best practice, and state-of-the-art computational resources would lead to more reliable assessment of facility risk and risk insights (e.g., the SSCs and accident sequences that are most risk-significant), with less uncertainty and reduced conservatisms.

  19. Ecohydrology of agroecosystems: probabilistic description of yield reduction risk under limited water availability

    Science.gov (United States)

    Vico, Giulia; Porporato, Amilcare

    2013-04-01

    Supplemental irrigation represents one of the main strategies to mitigate the effects of climate variability and stabilize yields. Irrigated agriculture currently provides 40% of food production and its relevance is expected to further increase in the near future, in face of the projected alterations of rainfall patterns and increase in food, fiber, and biofuel demand. Because of the significant investments and water requirements involved in irrigation, strategic choices are needed to preserve productivity and profitability, while maintaining a sustainable water management - a nontrivial task given the unpredictability of the rainfall forcing. To facilitate decision making under uncertainty, a widely applicable probabilistic framework is proposed. The occurrence of rainfall events and irrigation applications are linked probabilistically to crop development during the growing season and yields at harvest. Based on these linkages, the probability density function of yields and corresponding probability density function of required irrigation volumes, as well as the probability density function of yields under the most common case of limited water availability are obtained analytically, as a function of irrigation strategy, climate, soil and crop parameters. The full probabilistic description of the frequency of occurrence of yields and water requirements is a crucial tool for decision making under uncertainty, e.g., via expected utility analysis. Furthermore, the knowledge of the probability density function of yield allows us to quantify the yield reduction hydrologic risk. Two risk indices are defined and quantified: the long-term risk index, suitable for long-term irrigation strategy assessment and investment planning, and the real-time risk index, providing a rigorous probabilistic quantification of the emergence of drought conditions during a single growing season in an agricultural setting. Our approach employs relatively few parameters and is thus easily and

  20. Scenario logic and probabilistic management of risk in business and engineering

    CERN Document Server

    Solojentsev, Evgueni D

    2009-01-01

    The book proposes a uniform logic and probabilistic (LP) approach to risk estimation and analysis in engineering and economics. It covers the methodological and theoretical basis of risk management at the design, test, and operation stages of economic, banking, and engineering systems with groups of incompatible events (GIE). It considers the risk LP-models in classification, investment, management of companies, bribes and corruption, analysis of risk and efficiency of social and economical processes, and management of development. Key features of this Second Edition:
    - Five new chapters
    - Treatment of the basic principles of the modern risk LP theory (the LP-calculus, the LP-methods and the risk LP-theory with GIE) using uniform methodology and terminology with a practical orientation towards both engineering and economics, for the first time in book form
    - Clear definitions and notations, revised sections and chapters, an extended list of references, and a new subject index
    - More than a hundred illustrations a...

  1. Can exposure limitations for well-known contact allergens be simplified? An analysis of dose-response patch test data.

    Science.gov (United States)

    Fischer, Louise Arup; Menné, Torkil; Voelund, Aage; Johansen, Jeanne Duus

    2011-06-01

    Allergic contact dermatitis is triggered by chemicals in the environment. Primary prevention is aimed at minimizing the risk of induction, whereas secondary and tertiary prevention are aimed at reducing elicitation. The aims were to identify the elicitation doses that will elicit an allergic reaction in 10% of allergic individuals under patch test conditions (ED(10) patch test) for different allergens, and to compare the results between allergens and with animal data on sensitizing potency from the literature. The literature was searched for patch test elicitation studies that fulfilled six selected criteria. The elicitation doses were calculated, and fitted dose-response curves were drawn. Sixteen studies with eight different allergens (methylchloroisothiazolinone/methylisothiazolinone, formaldehyde, nickel, cobalt, chromium, isoeugenol, hydroxyisohexyl 3-cyclohexene carboxaldehyde, and methyldibromo glutaronitrile) were selected. The median ED(10) value was 0.835 µg/cm(2). The ED(10) patch test values were all within a factor of 7 from the lowest to the highest value, leaving out three outliers. No obvious patterns between the sensitization and elicitation doses for the allergens were found. We found rather small variation in the ED(10) patch test between the allergens, and no clear relationship between induction potency and elicitation threshold across a range of allergens. This knowledge may stimulate thoughts on introducing a generic approach to limitations in exposure to well-known allergens. © 2011 John Wiley & Sons A/S.
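
    Estimating an ED(10) of this kind amounts to fitting a dose-response curve and inverting it at the 10% level. The sketch below uses invented patch-test data and a log-dose logistic model, not the authors' fitted curves.

```python
# Sketch: fit a log-dose logistic curve and read off ED10 by inversion.
import numpy as np
from scipy.optimize import curve_fit

def logistic(log_dose, ed50_log, slope):
    return 1.0 / (1.0 + np.exp(-slope * (log_dose - ed50_log)))

dose = np.array([0.03, 0.1, 0.3, 1.0, 3.0, 10.0])          # ug/cm^2, invented
frac_pos = np.array([0.02, 0.08, 0.20, 0.45, 0.75, 0.92])  # fraction reacting

(ed50_log, slope), _ = curve_fit(logistic, np.log(dose), frac_pos, p0=[0.0, 1.0])
ed10 = np.exp(ed50_log + np.log(0.10 / 0.90) / slope)      # invert at 10% response
print(f"ED50 ~ {np.exp(ed50_log):.2f} ug/cm^2, ED10 ~ {ed10:.2f} ug/cm^2")
```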

  2. Probabilistic modeling of the flows and environmental risks of nano-silica

    Energy Technology Data Exchange (ETDEWEB)

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd, E-mail: nowack@empa.ch

    2016-03-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053–3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg·y in the EU (0.19–12 mg/kg·y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data. - Highlights: • We quantify the exposure of nano-silica to technical systems and the environment. • The median concentration in surface waters is predicted to be 0.12 μg/L in the EU. • Probabilistic species sensitivity distributions were computed for surface waters. • The risk assessment suggests that nano-silica poses no risk to aquatic organisms.
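
    If PEC and PNEC are both approximated as lognormal, the risk quantification step reduces to a closed-form exceedance probability. In the sketch below only the PEC median (0.12 μg/l) comes from the abstract; both spreads and the PNEC distribution are invented placeholders.

```python
# Sketch: P(PEC > PNEC) for two lognormal distributions, via the normal CDF.
import math

mu_pec, s_pec = math.log(0.12), 1.0      # PEC: median from the abstract, spread assumed
mu_pnec, s_pnec = math.log(1000.0), 1.0  # PNEC distribution: invented placeholder

z = (mu_pec - mu_pnec) / math.hypot(s_pec, s_pnec)
p_risk = 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))
print(f"P(PEC > PNEC) ~ {p_risk:.1e}")   # tiny value -> 'currently no risk'
```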

  3. Notes for a workshop on risk analysis and decision under uncertainty. The practical use of probabilistic and Bayesian methodology in real-life risk assessment and decision problems

    Energy Technology Data Exchange (ETDEWEB)

    1979-01-01

    The use of probabilistic, and especially Bayesian, methods is explained. The concepts of risk and decision, and of probability and frequency, are elucidated. The mechanics of probability and probabilistic calculations are discussed. The use of the method for particular problems, such as the frequency of aircraft crashes at a specified nuclear reactor site, is illustrated. 64 figures, 20 tables. (RWR)
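
    The Bayesian mechanics such notes walk through can be shown in a few lines: a conjugate Gamma-Poisson update of a rare-event frequency, of the kind used for the aircraft-crash example. The prior parameters and the observation record below are invented.

```python
# Sketch: Bayesian update of an event frequency with a Gamma prior and
# Poisson-distributed counts (conjugate pair, so the update is closed-form).
alpha, beta = 0.5, 1000.0     # Gamma prior: mean alpha/beta = 5e-4 events/year
events, years = 0, 40         # hypothetical record: no events in 40 years

alpha_post, beta_post = alpha + events, beta + years
print(f"posterior mean frequency = {alpha_post / beta_post:.2e} per year")
# The prior keeps the estimate finite and sensible despite zero observed events.
```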

  4. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    Science.gov (United States)

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
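
    A toy version of the approach: an OR-gate fault tree spanning source, treatment, distribution, and quality events, with uncertain basic-event probabilities propagated by Monte Carlo and risk expressed in minutes lost per customer and year. The structure and every number below are invented, not taken from the studied system.

```python
# Sketch: integrated fault tree (quantity OR quality failure) under parameter
# uncertainty, with a CML-style risk measure.
import random

random.seed(11)
N = 50_000
cml = []
for _ in range(N):
    p_source = random.uniform(1e-4, 1e-3)   # raw-water source unavailable (per day)
    p_treat  = random.uniform(1e-4, 5e-4)   # treatment outage (per day)
    p_dist   = random.uniform(5e-5, 2e-4)   # distribution failure (per day)
    p_qual   = random.uniform(1e-4, 1e-3)   # water delivered off-spec (per day)

    # OR-gates: any supply event -> quantity failure; either branch -> failure.
    p_quantity = 1 - (1 - p_source) * (1 - p_treat) * (1 - p_dist)
    p_fail = 1 - (1 - p_quantity) * (1 - p_qual)

    downtime_min = random.triangular(30, 720, 120)  # minutes per failure, assumed
    cml.append(p_fail * downtime_min * 365)         # expected minutes lost per year

cml.sort()
print(f"median ~ {cml[N // 2]:.0f} min/customer/year, "
      f"95th percentile ~ {cml[int(0.95 * N)]:.0f}")
```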

  5. Illustrative probabilistic biosphere model for Yucca Mountain individual risk calculations

    Energy Technology Data Exchange (ETDEWEB)

    Wilems, R.E. [Del Mar Consulting, Corpus Christi, TX (United States)

    1994-12-31

    The proposed EPA standards for the disposal of spent fuel, high-level and transuranic radioactive waste prescribe a future biosphere: one in which no sustained human activity occurs inside the controlled zone, yet sustained use of groundwater occurs just outside the controlled zone boundary. Performance assessments have generally assumed a person at this location extracts all his water needs directly from the projected contaminated plume for all of his life. Dose to this maximally exposed individual is too conservative a measure of performance for a nuclear waste repository and does not reflect the isolation characteristics of a site. A better measure is individual risk, which accounts for uncertainties in biosphere characteristics over the longer periods of performance; for a site like Yucca Mountain, only those characteristics associated with well-water scenarios need be prescribed. Such a prescription of the biosphere is appropriate because the goal of the regulations is to provide indicators of future performance so that the regulators can make a responsible decision regarding reasonable assurance of public health and safety.

  6. Probabilistic risk assessment of aircraft impact on a spent nuclear fuel dry storage

    Energy Technology Data Exchange (ETDEWEB)

    Almomani, Belal, E-mail: balmomani@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Lee, Sanghoon, E-mail: shlee1222@kmu.ac.kr [Department of Mechanical and Automotive Engineering, Keimyung University, Dalgubeol-daero 1095, Dalseo-gu, Daegu (Korea, Republic of); Jang, Dongchan, E-mail: dongchan.jang@kaist.ac.kr [Department of Nuclear and Quantum Engineering, Korea Advanced Institute of Science and Technology, 291 Daehak-ro, Yuseong-gu, Daejeon 305-701 (Korea, Republic of); Kang, Hyun Gook, E-mail: kangh6@rpi.edu [Department of Mechanical, Aerospace and Nuclear Engineering, Rensselaer Polytechnic Institute, Troy, NY 12180 (United States)

    2017-01-15

    Highlights: • A new risk assessment framework is proposed for aircraft impact into an interim dry storage facility. • It uses event tree analysis, structural response analysis, consequence analysis, and Monte Carlo simulation. • A case study of the proposed procedure is presented to illustrate the methodology's application. - Abstract: This paper proposes a systematic risk evaluation framework for one of the most significant impact events on an interim dry storage facility, an aircraft crash, using a probabilistic approach. A realistic case study that includes a specific cask model and selected impact conditions is performed to demonstrate the practical applicability of the proposed framework. An event tree for an aircraft crash, defining the set of impact conditions and the storage cask response, is constructed. Monte Carlo simulation is employed for the probabilistic treatment of the sources of uncertainty associated with the impact loads on the internal storage casks. The parameters whose uncertainties are managed probabilistically include the aircraft impact velocity, the compressive strength of the reinforced concrete wall, the missile shape factor, and the facility wall thickness. Failure probabilities of the impacted wall and of a single storage cask under the direct mechanical impact load caused by the aircraft crash are estimated. A finite element analysis is applied to simulate the postulated direct engine impact load on the cask body, and a source term analysis for the associated releases of radioactive materials as well as an off-site consequence analysis are performed. Finally, conditional risk contribution calculations are represented by an event tree model. Case study results indicate that no severe risk is presented, as the radiological consequences do not exceed regulatory exposure limits for the public. This risk model can be used with any other representative detailed parameters and reference design concepts for

  7. Probabilistic Modeling Of Ocular Biomechanics In VIIP: Risk Stratification

    Science.gov (United States)

    Feola, A.; Myers, J. G.; Raykin, J.; Nelson, E. S.; Mulugeta, L.; Samuels, B.; Ethier, C. R.

    2016-01-01

    the peak strains, we ranked and then normalized these coefficients, considering that normalized values ≥ 0.5 implied a substantial influence on the range of the peak strains in the optic nerve head (ONH). IOP and ICP were found to have a major influence on the peak strains in the ONH, as did optic nerve and LC stiffness. Interestingly, the stiffness of the sclera far from the scleral canal did not have a large influence on peak strains in ONH tissues; however, the collagen fiber stiffness in the peripapillary sclera and the annular ring both influenced the peak strains within the ONH. We have created a physiologically relevant model that incorporates collagen fibers to study the effects of elevated ICP. Elevated ICP resulted in strains in the optic nerve that are not predicted to occur on earth in either the upright or supine condition. We found that IOP, ICP, lamina cribrosa stiffness, and optic nerve stiffness had the highest association with these extreme strains in the ONH. These extreme strains may activate mechanosensitive cells that induce tissue remodeling and are a risk factor for the development of VIIP.

  8. Flood risk and adaptation strategies in Indonesia: a probabilistic analysis using globally available data

    Science.gov (United States)

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen; Ward, Philip

    2015-04-01

    In recent years, global flood losses have been increasing due to socio-economic development and climate change, with the largest risk increases in developing countries such as Indonesia. For countries to undertake effective risk management, an accurate understanding of both current and future risk is required. However, detailed information is rarely available, particularly for developing countries. We present a first-of-its-kind country-scale analysis of flood risk using globally available data that combines a global inundation model with a land use change model and more local data on flood damages. To assess the contribution and uncertainty of different drivers of future risk, we integrate thousands of socio-economic and climate projections in a probabilistic way and include multiple adaptation strategies. Indonesia is used as a case study as it is a country that already faces high flood risk and is undergoing rapid urbanization. We developed probabilistic and spatially explicit urban expansion projections from 2000 to 2030 that show that the increase in urban extent ranges from 215% to 357% (5th and 95th percentiles). We project rapidly rising flood risk, both for coastal and river floods. This increase is largely driven by economic growth and urban expansion (i.e., increasing exposure). Whilst sea level rise will amplify this trend, the response of river floods to climate change is uncertain, with the impact of the mean ensemble of 20 climate projections (5 GCMs and 4 RCPs) being close to zero. However, as urban expansion is the main driving force of future risk, we argue that the implementation of adaptation measures is increasingly pressing, regardless of the wide uncertainty in climate projections. Hence, we evaluated the effectiveness of two adaptation measures: spatial planning in flood-prone areas and enhanced flood protection. Both strategies have a large potential to effectively offset the increasing risk trend. The risk reduction is in the range of 22-85% and 53

  9. Towards the development of a global probabilistic tsunami risk assessment methodology

    Science.gov (United States)

    Schaefer, Andreas; Daniell, James; Wenzel, Friedemann

    2017-04-01

    The assessment of tsunami risk is, on many levels, still ambiguous and under discussion. Over the last two decades, various methodologies and models have been developed to quantify tsunami risk, most of them on a local or regional level, with either a deterministic or a probabilistic background. Probabilistic modelling faces significant difficulties, as the underlying tsunami hazard modelling demands an immense amount of computational time and thus limits the assessment substantially, often restricting it to institutes with supercomputing access or forcing modellers to reduce the modelling resolution either quantitatively or qualitatively. Furthermore, data on the vulnerability of infrastructure and buildings are empirically limited to a few disasters of recent years. Thus, a reliable quantification of socio-economic vulnerability is still questionable. Nonetheless, significant improvements have been made recently, on both the methodological and the computational side. This study introduces a methodological framework for a globally uniform probabilistic tsunami risk assessment. Here, the power of recently developed hardware for desktop-based parallel computing plays a crucial role in the calculation of numerical tsunami wave propagation, while large-scale parametric models and paleo-seismological data enhance the return period assessment of tsunamigenic megathrust earthquake events. Adaptation of empirical tsunami vulnerability functions in conjunction with methodologies from flood modelling supports a more reliable vulnerability quantification. In addition, methodologies for exposure modelling in coastal areas are introduced, focusing on the diversity of coastal exposure landscapes and data availability. Overall, this study provides a first overview of how a global tsunami risk modelling framework may be accomplished, covering methodological, computational, and data-driven aspects.

  10. Extravehicular Activity Probabilistic Risk Assessment Overview for Thermal Protection System Repair on the Hubble Space Telescope Servicing Mission

    Science.gov (United States)

    Bigler, Mark; Canga, Michael A.; Duncan, Gary

    2010-01-01

    The Shuttle Program initiated an Extravehicular Activity (EVA) Probabilistic Risk Assessment (PRA) to assess the risks associated with performing a Shuttle Thermal Protection System (TPS) repair during the Space Transportation System (STS)-125 Hubble repair mission as part of risk trades between TPS repair and crew rescue.

  11. Probabilistic modeling of the flows and environmental risks of nano-silica.

    Science.gov (United States)

    Wang, Yan; Kalinina, Anna; Sun, Tianyin; Nowack, Bernd

    2016-03-01

    Nano-silica, the engineered nanomaterial with one of the largest production volumes, has a wide range of applications in consumer products and industry. This study aimed to quantify the exposure of nano-silica to the environment and to assess its risk to surface waters. Concentrations were calculated for four environmental (air, soil, surface water, sediments) and two technical compartments (wastewater, solid waste) for the EU and Switzerland using probabilistic material flow modeling. The corresponding median concentration in surface water is predicted to be 0.12 μg/l in the EU (0.053-3.3 μg/l, 15/85% quantiles). The concentrations in sediments in the complete sedimentation scenario were found to be the largest among all environmental compartments, with a median annual increase of 0.43 mg/kg · y in the EU (0.19-12 mg/kg · y, 15/85% quantiles). Moreover, probabilistic species sensitivity distributions (PSSD) were computed and the risk of nano-silica in surface waters was quantified by comparing the predicted environmental concentration (PEC) with the predicted no-effect concentration (PNEC) distribution, which was derived from the cumulative PSSD. This assessment suggests that nano-silica currently poses no risk to aquatic organisms in surface waters. Further investigations are needed to assess the risk of nano-silica in other environmental compartments, which is currently not possible due to a lack of ecotoxicological data.

  12. Post-Probabilistic Uncertainty Quantification: Discussion of Potential Use in Product Development Risk Management

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2016-01-01

    Uncertainty represents one of the key challenges in product development (PD) projects and can significantly impact a PD project's performance. Risks in PD lead to schedule and cost over-runs and poor product quality [Olechowski et al. 2012]. Risk management is one response for the identification...... if uncertainty is carefully addressed (e.g. [Prelec and Loewenstein 1991], [Riabacke 2006]). In the risk management community there is a strong argument that at least two distinct types of uncertainty have to be taken into account: aleatory and epistemic. Epistemic uncertainty arises due to lack of knowledge...... of the amount and quality of the data on which probability and utility assessments are based. Arguably, the key challenge in PD risk management today is that uncertainty quantification relies solely (or at least heavily) on probabilistic models. While these are appropriate to describe aleatory uncertainty...

  13. Assessing patient safety risk before the injury occurs: an introduction to sociotechnical probabilistic risk modelling in health care.

    Science.gov (United States)

    Marx, D A; Slonim, A D

    2003-12-01

    Since 1 July 2001 the Joint Commission on Accreditation of Healthcare Organizations (JCAHO) has required each accredited hospital to conduct at least one proactive risk assessment annually. Failure modes and effects analysis (FMEA) was recommended as one tool for conducting this task. This paper examines the limitations of FMEA and introduces a second tool used by the aviation and nuclear industries to examine low-frequency, high-impact events in complex systems. The adapted tool, known as sociotechnical probabilistic risk assessment (ST-PRA), provides an alternative for proactively identifying, prioritizing, and mitigating patient safety risk. The uniqueness of ST-PRA is its ability to model combinations of equipment failures, human error, at-risk behavioral norms, and recovery opportunities through the use of fault trees. While ST-PRA is a complex, high-end risk modelling tool, it provides an opportunity to visualize system risk in a manner that is not possible through FMEA.

  14. Probabilistic risk benchmark of the Brazilian electrical system; Risco probabilistico de referencia do sistema eletrico brasileiro

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Neyl Hamilton Martelotta

    2002-05-01

    The main goal of this dissertation is to carry out a first numerical evaluation of the probabilistic risk magnitudes associated with the Brazilian electrical network, considering the subsystems North, Northeast, South, Southeast, and Mid-West. This result is relevant because it can be used as an initial comparative reference for future reliability studies of the Brazilian Basic Grid. As a by-product, the whole set of criteria and procedures used in the work is described in detail; they may also serve as a preliminary basis for future similar evaluations. (author)

  15. Probabilistic modeling of percutaneous absorption for risk-based exposure assessments and transdermal drug delivery.

    Energy Technology Data Exchange (ETDEWEB)

    Ho, Clifford Kuofei

    2004-06-01

    Chemical transport through human skin can play a significant role in human exposure to toxic chemicals in the workplace, as well as to chemical/biological warfare agents in the battlefield. The viability of transdermal drug delivery also relies on chemical transport processes through the skin. Models of percutaneous absorption are needed for risk-based exposure assessments and drug-delivery analyses, but previous mechanistic models have been largely deterministic. A probabilistic, transient, three-phase model of percutaneous absorption of chemicals has been developed to assess the relative importance of uncertain parameters and processes that may be important to risk-based assessments. Penetration routes through the skin that were modeled include the following: (1) intercellular diffusion through the multiphase stratum corneum; (2) aqueous-phase diffusion through sweat ducts; and (3) oil-phase diffusion through hair follicles. Uncertainty distributions were developed for the model parameters, and a Monte Carlo analysis was performed to simulate probability distributions of mass fluxes through each of the routes. Sensitivity analyses using stepwise linear regression were also performed to identify model parameters that were most important to the simulated mass fluxes at different times. This probabilistic analysis of percutaneous absorption (PAPA) method has been developed to improve risk-based exposure assessments and transdermal drug-delivery analyses, where parameters and processes can be highly uncertain.

  16. Probabilistic risk assessment framework for structural systems under multiple hazards using Bayesian statistics

    Energy Technology Data Exchange (ETDEWEB)

    Kwag, Shinyoung [North Carolina State University, Raleigh, NC 27695 (United States); Korea Atomic Energy Research Institute, Daejeon 305-353 (Korea, Republic of); Gupta, Abhinav, E-mail: agupta1@ncsu.edu [North Carolina State University, Raleigh, NC 27695 (United States)

    2017-04-15

    Highlights: • This study presents the development of a Bayesian framework for probabilistic risk assessment (PRA) of structural systems under multiple hazards. • The concepts of Bayesian network and Bayesian inference are combined by mapping the traditionally used fault trees into a Bayesian network. • The proposed mapping allows for consideration of dependencies as well as correlations between events. • Incorporation of Bayesian inference permits a novel way of exploring scenarios that are likely to result in a system-level “vulnerability.” - Abstract: Conventional probabilistic risk assessment (PRA) methodologies (USNRC, 1983; IAEA, 1992; EPRI, 1994; Ellingwood, 2001) conduct risk assessment for different external hazards by considering each hazard separately and independently of the others. The risk metric for a specific hazard is evaluated by a convolution of the fragility and hazard curves. The fragility curve for a basic event is obtained by using empirical, experimental, and/or numerical simulation data for a particular hazard. Treating each hazard independently can be inappropriate in some cases, as certain hazards are statistically correlated or dependent. Examples of such correlated events include, but are not limited to, flooding-induced fire, seismically induced internal or external flooding, and seismically induced fire. In current practice, system-level risk and consequence sequences are typically calculated using logic trees to express the causative relationships between events. In this paper, we present the results from a study on multi-hazard risk assessment conducted using a Bayesian network (BN) with Bayesian inference. The framework can consider statistical dependencies among risks from multiple hazards, allows updating by considering newly available data/information at any level, and provides a novel way to explore alternative failure scenarios that may exist due to vulnerabilities.
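
    The benefit of moving from independent fault trees to a Bayesian network can be seen in miniature: let one hazard's probability depend on another (here, seismically induced flooding) and enumerate the small network. All probabilities below are invented.

```python
# Sketch: two dependent hazards feeding a noisy-OR failure node; exact
# enumeration over the tiny network replaces the independent convolution.
P_QUAKE = 1e-3            # annual probability of a damaging earthquake
P_FLOOD_ALONE = 1e-3      # flood probability absent an earthquake

def p_failure(p_flood_given_quake, p_f_quake=0.5, p_f_flood=0.3):
    """P(system failure) with flood probability conditional on the quake state."""
    total = 0.0
    for quake, pq in ((True, P_QUAKE), (False, 1 - P_QUAKE)):
        p_flood = p_flood_given_quake if quake else P_FLOOD_ALONE
        for flood in (True, False):
            pf = p_flood if flood else 1 - p_flood
            p_sys = 1 - (1 - (p_f_quake if quake else 0.0)) \
                      * (1 - (p_f_flood if flood else 0.0))
            total += pq * pf * p_sys
    return total

print(f"with seismically induced flooding: {p_failure(0.20):.2e}")
print(f"hazards treated as independent:    {p_failure(P_FLOOD_ALONE):.2e}")
```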

  17. Risk-Based Predictive Maintenance for Safety-Critical Systems by Using Probabilistic Inference

    Directory of Open Access Journals (Sweden)

    Tianhua Xu

    2013-01-01

    Risk-based maintenance (RBM) aims to improve maintenance planning and decision making by reducing the probability and consequences of equipment failure. A new predictive maintenance strategy that integrates a dynamic evolution model and risk assessment is proposed, which can be used to calculate the optimal maintenance time with minimal cost under safety constraints. The dynamic evolution model provides qualified risks by using probabilistic inference with bucket elimination and gives the prospective degradation trend of a complex system. Based on the degradation trend, an optimal maintenance time can be determined by minimizing the expected maintenance cost per time unit. The effectiveness of the proposed method is validated and demonstrated with a case study of high-speed train collision with obstacles in the presence of safety and cost constraints.
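
    The cost-minimisation step can be sketched independently of the paper's inference machinery: given any degradation model (here an assumed Weibull wear-out law), pick the preventive-maintenance age that minimises expected cost per time unit. The costs and Weibull parameters are invented.

```python
# Sketch: classic age-replacement optimisation -- minimise
# [Cp*R(T) + Cf*(1 - R(T))] / integral_0^T R(u) du over the age T.
import math

C_PREVENTIVE, C_FAILURE = 1.0, 12.0   # relative costs of planned vs forced work
SHAPE, SCALE = 2.5, 1000.0            # Weibull wear-out parameters (hours)

def reliability(t):
    return math.exp(-((t / SCALE) ** SHAPE))

def cost_rate(t, steps=200):
    """Expected cost per hour when maintaining preventively at age t."""
    expected_cost = C_PREVENTIVE * reliability(t) + C_FAILURE * (1 - reliability(t))
    du = t / steps                     # expected cycle length by trapezoid rule
    cycle = sum(0.5 * (reliability(i * du) + reliability((i + 1) * du)) * du
                for i in range(steps))
    return expected_cost / cycle

best = min(range(50, 2001, 10), key=cost_rate)
print(f"optimal maintenance age ~ {best} h, cost rate {cost_rate(best):.4f}/h")
```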

  18. Comparison of a Traditional Probabilistic Risk Assessment Approach with Advanced Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L; Mandelli, Diego; Zhegang Ma

    2014-11-01

    As part of the Light Water Reactor Sustainability (LWRS) Program [1], the purpose of the Risk Informed Safety Margin Characterization (RISMC) [2] Pathway research and development (R&D) is to support plant decisions for risk-informed margin management, with the aim of improving the economics and reliability, and sustaining the safety, of current NPPs. In this paper, we describe the RISMC analysis process, illustrating how mechanistic and probabilistic approaches are combined in order to estimate a safety margin. We use the scenario of a “station blackout” (SBO), wherein offsite and onsite power are lost, thereby challenging plant safety systems. We describe the RISMC approach, illustrate the station blackout modeling, and contrast this with traditional risk analysis modeling for this type of accident scenario. We also describe the approach we are using to represent advanced flooding analysis.

  19. Regional probabilistic nuclear risk and vulnerability assessment by integration of mathematical modelling and GIS-analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rigina, O. [Univ. of Copenhagen, Inst. of Geography, Copenhagen (Denmark); Baklanov, A. [Danish Meteorological Inst., Copenhagen (Denmark)

    2002-04-01

    The Kola Peninsula in the Russian Arctic exceeds all other regions in the world in its number of nuclear reactors. The study aimed to estimate the possible radiation risks to the population in the Nordic countries in case of a severe accident in the Kola Peninsula. A new approach based on probabilistic analysis of modelled possible pathways of radionuclide transport and precipitation was developed. For the general population, Finland is most at risk with respect to the Kola NPP because of its high population density, its proximity to the radiation-risk sites, the relatively high probability of an airflow trajectory there, and precipitation. When the critical group is considered, the northern counties of Norway, Finland and Sweden appear to be most vulnerable. (au)

  20. Probabilistic Causal Analysis for System Safety Risk Assessments in Commercial Air Transport

    Science.gov (United States)

    Luxhoj, James T.

    2003-01-01

    Aviation is one of the critical modes of our national transportation system. As such, it is essential that new technologies be continually developed to ensure that a safe mode of transportation becomes even safer in the future. The NASA Aviation Safety Program (AvSP) is managing the development of new technologies and interventions aimed at reducing the fatal aviation accident rate by a factor of 5 by year 2007 and by a factor of 10 by year 2022. A portfolio assessment is currently being conducted to determine the projected impact that the new technologies and/or interventions may have on reducing aviation safety system risk. This paper reports on advanced risk analytics that combine the use of a human error taxonomy, probabilistic Bayesian Belief Networks, and case-based scenarios to assess a relative risk intensity metric. A sample case is used for illustrative purposes.

  1. Fork2Farmer: Enabling Success of Small Farms through Partnerships with Well-Known Chefs and the Tourism Sector

    Science.gov (United States)

    Morais, Duarte; Jakes, Susan; Bowen, Becky; Lelekacs, Joanna Massey

    2017-01-01

    A team of economic development, local foods, and tourism specialists from North Carolina Cooperative Extension is pursuing an initiative titled Fork2Farmer. The goal is to increase visits to local farms and diversify farm income by leveraging the high visibility of well-known farm-to-table chefs who support local small farms. To do this, those…

  2. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    Directory of Open Access Journals (Sweden)

    Linus Hammar

    Full Text Available A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.

  3. A probabilistic method for computing quantitative risk indexes from medical injuries compensation claims.

    Science.gov (United States)

    Dalle Carbonare, S; Folli, F; Patrini, E; Giudici, P; Bellazzi, R

    2013-01-01

    The increasing demand for health care services and the complexity of health care delivery require Health Care Organizations (HCOs) to approach clinical risk management with proper methods and tools. An important aspect of risk management is to exploit the analysis of medical injuries compensation claims in order to reduce adverse events and, at the same time, to optimize the costs of health insurance policies. This work provides a probabilistic method to estimate the risk level of an HCO by computing quantitative risk indexes from medical injury compensation claims. Our method is based on the estimation of a loss probability distribution from compensation claims data through parametric and non-parametric modeling and Monte Carlo simulations. The loss distribution can be estimated both on the whole dataset and, thanks to the application of a Bayesian hierarchical model, on stratified data. The approach makes it possible to quantitatively assess the risk structure of the HCO by analyzing the loss distribution and deriving its expected value and percentiles. We applied the proposed method to 206 cases of injuries with compensation requests, collected from 1999 to the first semester of 2007 by the HCO of Lodi, in the northern part of Italy. We computed the risk indexes taking into account the different clinical departments and the different hospitals involved. The approach proved to be useful for understanding the HCO risk structure in terms of frequency, severity, and expected and unexpected loss related to adverse events.
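
    A minimal sketch of the Monte Carlo loss-distribution step, assuming a Poisson claim frequency and lognormal severities with invented parameters (the paper fits parametric and non-parametric models to the actual claims):

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed stand-ins for the claims data: annual claim counts ~ Poisson,
# claim severities ~ lognormal (one common parametric choice).
lam = 25                 # mean number of claims per year, assumed
mu, sigma = 9.0, 1.2     # lognormal parameters of claim severity, assumed

# Monte Carlo estimate of the annual aggregate loss distribution.
n_sim = 20_000
annual_losses = np.array([
    rng.lognormal(mu, sigma, size=rng.poisson(lam)).sum()
    for _ in range(n_sim)
])

print(f"expected annual loss: {annual_losses.mean():,.0f}")
print(f"99.5th percentile (unexpected loss): {np.percentile(annual_losses, 99.5):,.0f}")
```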

  4. A probabilistic model for hydrokinetic turbine collision risks: exploring impacts on fish.

    Science.gov (United States)

    Hammar, Linus; Eggertsen, Linda; Andersson, Sandra; Ehnberg, Jimmy; Arvidsson, Rickard; Gullström, Martin; Molander, Sverker

    2015-01-01

    A variety of hydrokinetic turbines are currently under development for power generation in rivers, tidal straits and ocean currents. Because some of these turbines are large, with rapidly moving rotor blades, the risk of collision with aquatic animals has been brought to attention. The behavior and fate of animals that approach such large hydrokinetic turbines have not yet been monitored in any detail. In this paper, we conduct a synthesis of the current knowledge and understanding of hydrokinetic turbine collision risks. The outcome is a generic fault-tree-based probabilistic model suitable for estimating population-level ecological risks. New video-based data on fish behavior in strong currents are provided and models describing fish avoidance behaviors are presented. The findings indicate low risk for small-sized fish. However, at large turbines (≥5 m), bigger fish seem to have a high probability of collision, mostly because rotor detection and avoidance is difficult in low visibility. Risks can therefore be substantial for vulnerable populations of large-sized fish, which thrive in strong currents. The suggested collision risk model can be applied to different turbine designs and at a variety of locations as a basis for case-specific risk assessments. The structure of the model facilitates successive model validation, refinement and application to other organism groups such as marine mammals.

  5. Fuzzy-probabilistic multi agent system for breast cancer risk assessment and insurance premium assignment.

    Science.gov (United States)

    Tatari, Farzaneh; Akbarzadeh-T, Mohammad-R; Sabahi, Ahmad

    2012-12-01

    In this paper, we present an agent-based system for distributed risk assessment of breast cancer development employing fuzzy and probabilistic computing. The proposed fuzzy multi agent system consists of multiple fuzzy agents that use fuzzy set theory to represent their soft (linguistic) information. Fuzzy risk assessment is quantified by two linguistic variables, high and low. Through fuzzy computations, the multi agent system computes the fuzzy probabilities of breast cancer development based on various risk factors. By ranking the high-risk and low-risk fuzzy probabilities, the multi agent system (MAS) decides whether the risk of breast cancer development is high or low. This information is then fed into an insurance premium adjuster in order to support preventive decision making as well as to make appropriate adjustments of insurance premium and risk. This final step of insurance analysis also provides a numeric measure of the utility of the approach. Furthermore, actual data were gathered from two hospitals in Mashhad over one year. The results were then compared with a fuzzy distributed approach.

  6. Probabilistic Assessment of TTC and Risk in Power Systems Using Multi-Objective Optimization

    Directory of Open Access Journals (Sweden)

    Javad Kafi Kondori

    2013-02-01

    Full Text Available With the increasing demand for electricity and the growing value of electricity consumption in recent decades, the security and continuity of the electric grid have taken on great significance. Transmission networks, as the main element of the power network, play an important role in meeting consumers' needs. Various indices for evaluating the transmission network have been defined; among them, the total transfer capability (TTC) is evaluated to determine the ability of the network under different economic conditions. In this paper, a probabilistic assessment of TTC is performed, and by solving a multi-objective optimization problem, different values of TTC are obtained for different levels of risk. The objectives considered in this optimization are increasing TTC and reducing risk. In the probabilistic assessment of TTC, the uncertainties of generators and transmission lines are considered. To select contingencies, the probability of outage and the amount of TTC are considered. The IEEE reliability test system is used to demonstrate the effectiveness of the approach.
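
    The TTC-versus-risk trade-off produced by such a multi-objective optimization can be illustrated by extracting the Pareto front from a handful of invented candidate operating points:

```python
import numpy as np

# Extract the Pareto front for the two stated objectives: maximize TTC,
# minimize risk. The candidate operating points below are invented.
candidates = np.array([   # columns: TTC (MW), risk index
    [1200, 0.08], [1150, 0.12], [1100, 0.05], [1400, 0.30], [1250, 0.09],
])

def dominated(p, others):
    # p is dominated if another point is at least as good in both objectives
    # and strictly better in one.
    at_least = (others[:, 0] >= p[0]) & (others[:, 1] <= p[1])
    strictly = (others[:, 0] > p[0]) | (others[:, 1] < p[1])
    return bool(np.any(at_least & strictly))

front = np.array([p for i, p in enumerate(candidates)
                  if not dominated(p, np.delete(candidates, i, axis=0))])
print(front)   # the TTC/risk trade-offs an operator can choose among
```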

  7. A probabilistic seismic risk assessment procedure for nuclear power plants: (I) Methodology

    Science.gov (United States)

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    A new procedure for probabilistic seismic risk assessment of nuclear power plants (NPPs) is proposed. This procedure modifies current procedures using tools developed recently for performance-based earthquake engineering of buildings. The proposed procedure uses (a) response-based fragility curves to represent the capacity of structural and nonstructural components of NPPs, (b) nonlinear response-history analysis to characterize the demands on those components, and (c) Monte Carlo simulations to determine the damage state of the components. The use of response- rather than ground-motion-based fragility curves enables the curves to be independent of seismic hazard and closely related to component capacity. The use of the Monte Carlo procedure enables the correlation in the responses of components to be included directly in the risk assessment. An example of the methodology is presented in a companion paper to demonstrate its use and provide the technical basis for aspects of the methodology. © 2011 Published by Elsevier B.V.
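
    A toy version of step (c), assuming lognormal demands and capacities with an assumed demand correlation; it shows how sampling correlated responses changes the joint failure probability relative to multiplying the marginals (all numbers invented):

```python
import numpy as np

rng = np.random.default_rng(1)

# Two components with lognormal demands (correlated, as from a common
# response history) and lognormal capacities. All numbers are invented.
n = 50_000
med_d, beta_d, rho = np.array([1.0, 1.2]), 0.35, 0.7
cov = (beta_d ** 2) * np.array([[1.0, rho], [rho, 1.0]])
demand = np.exp(rng.multivariate_normal(np.log(med_d), cov, size=n))

med_c, beta_c = np.array([1.8, 2.0]), 0.3
capacity = np.exp(rng.normal(np.log(med_c), beta_c, size=(n, 2)))

fail = demand > capacity
print("P(comp 1 fails):      ", round(fail[:, 0].mean(), 4))
print("P(both fail), sampled:", round((fail[:, 0] & fail[:, 1]).mean(), 4))
print("product of marginals: ", round(fail[:, 0].mean() * fail[:, 1].mean(), 4))
```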

  8. Probabilistic Risk Assessment System

    Institute of Scientific and Technical Information of China (English)

    颜兆林; 龚时雨; 周经伦

    2001-01-01

    Probabilistic Risk Assessment (PRA) is an effective way to quantitatively assess the safety risk of complex systems. This paper first addresses the characteristics of PRA, the steps of its implementation, and its outstanding problems; it then introduces the architecture and functions of the computer-aided tool developed for performing PRA; finally, further work is outlined.

  9. Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil.

    Science.gov (United States)

    Lowe, Rachel; Coelho, Caio As; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier

    2016-02-24

    Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics.
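
    For reference, the quoted hit rate is the standard probability-of-detection statistic from a forecast contingency table; a tiny sketch with made-up counts:

```python
# Hit rate (probability of detection) from a forecast contingency table.
# Counts below are made up for illustration, not the study's verification data.
hits, misses, false_alarms = 57, 43, 20

hit_rate = hits / (hits + misses)
false_alarm_ratio = false_alarms / (hits + false_alarms)
print(f"hit rate: {hit_rate:.0%}, false alarm ratio: {false_alarm_ratio:.0%}")
```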

  10. Development of Probabilistic Risk Assessment Procedure of Nuclear Power Plant under Aircraft Impact Loadings

    Energy Technology Data Exchange (ETDEWEB)

    Hahm, Daegi; Shin, Sangshup; Park, Jin Hee; Choi, Inkil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-05-15

    In this paper, the overall technical roadmap and the procedure to assess aircraft impact risk are introduced. In the first year of the research project, 2012, we developed an aircraft impact accident scenario and performed a preliminary fragility analysis of the local failure of the targeted wall under aircraft impact. An aircraft impact event can be characterized by the appropriate load parameters (i.e., aircraft type, mass, velocity, angle of crash, etc.). Therefore, a reference parameter should be selected to represent each load effect in order to evaluate the capacity/fragility of SSCs using deterministic or probabilistic methods. This is similar to the use of the peak ground acceleration (PGA) to represent the ground motion spectrum of an earthquake in the seismic probabilistic risk assessment (SPRA) approach. We developed a methodology to decide on the reference parameter for aircraft impact risk quantification from among some reasonable candidates, which can represent many uncertain loading parameters. To determine the response and the damage of the target structure, the missile-target interaction method and Riera's time-history analysis method have been used primarily in the aircraft impact research area. To define the reference loading parameter, we need to perform repetitive simulations for many analysis cases. Thus, we applied a revised version of Riera's method, which is appropriate for simplified impact simulation. The target NPP used to determine the reference parameter and to evaluate the preliminary assessment of aircraft impact risk was selected from among the typical Korean PWR NPPs. The response has been calculated for pre-stressed concrete containment buildings subjected to aircraft impact loading, and the responses according to each reference parameter have been analyzed. Recently, we also evaluated the floor response spectra for the locations of important components for the estimation of the failure probabilities and fragility functions of

  11. Are engineered nano iron oxide particles safe? an environmental risk assessment by probabilistic exposure, effects and risk modeling.

    Science.gov (United States)

    Wang, Yan; Deng, Lei; Caballero-Guzman, Alejandro; Nowack, Bernd

    2016-12-01

    Nano iron oxide particles are beneficial to our daily lives through their use in paints, construction materials, biomedical imaging and other industrial fields. However, little is known about the possible risks associated with the current exposure level of engineered nano iron oxides (nano-FeOX) to organisms in the environment. The goal of this study was to predict the release of nano-FeOX to the environment and assess their risks for surface waters in the EU and Switzerland. The material flows of nano-FeOX to technical compartments (waste incineration and waste water treatment plants) and to the environment were calculated with a probabilistic modeling approach. The mean value of the predicted environmental concentrations (PECs) of nano-FeOX in surface waters in the EU for a worst-case scenario (no particle sedimentation) was estimated to be 28 ng/L. Using a probabilistic species sensitivity distribution, the predicted no-effect concentration (PNEC) was determined from ecotoxicological data. The risk characterization ratio, calculated by dividing the PEC by the PNEC, was used to characterize the risks. The mean risk characterization ratio was predicted to be several orders of magnitude smaller than 1 (1.4 × 10⁻⁴). Therefore, this modeling effort indicates that only a very limited risk is posed by the current release level of nano-FeOX to organisms in surface waters. However, a better understanding of the hazards of nano-FeOX to the organisms in other ecosystems (such as sediment) needs to be developed to determine the overall risk of these particles to the environment.
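
    A bare-bones sketch of the probabilistic risk characterization: sample PEC and PNEC from assumed lognormal distributions and form the ratio sample by sample. Only the 28 ng/L central value comes from the abstract; the spreads and the PNEC scale are invented:

```python
import numpy as np

rng = np.random.default_rng(7)

# Sample PEC and PNEC from assumed lognormals and form RCR = PEC / PNEC.
n = 100_000
pec = rng.lognormal(np.log(28e-9), 0.8, n)    # g/L, centered near 28 ng/L
pnec = rng.lognormal(np.log(2e-4), 0.6, n)    # g/L, spread and scale assumed

rcr = pec / pnec
print(f"mean RCR: {rcr.mean():.1e}")          # order of 10^-4 with these inputs
print(f"P(RCR > 1): {(rcr > 1).mean():.1e}")
```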

  12. Probabilistic ecological risk assessment of cadmium in the Bohai Sea using native saltwater species

    Institute of Scientific and Technical Information of China (English)

    MU Jingli; WANG Juying; WANG Ying; CONG Yi; ZHANG Zhifeng

    2014-01-01

    Predicted no-effect concentrations (PNECs) are often used in ecological risk assessment to determine low-risk concentrations for chemicals. In the present study, chronic data from native saltwater species were used to calculate PNEC values using four methods: log-normal distribution (ETX 2.0), log-triangular distribution (US EPA's water quality criteria procedure), Burr III distribution (BurrliOZ) and the traditional assessment factor (AF). The PNECs calculated using the four methods ranged from 0.08 μg/L to 1.8 μg/L. Three of the SSD-derived PNECs ranged from 0.94 to 1.8 μg/L, about a factor of two apart. To demonstrate the use of SSD-based PNEC values and comprehensively estimate the regional ecological risk for cadmium in the surface water of the Bohai Sea (the Liaodong Bay, Bohai Bay, and Laizhou Bay, China), dissolved cadmium concentrations were measured, yielding 753 valid data points covering 190 stations from July 2006 to November 2007. Based on three ecological risk assessment approaches, namely the hazard quotient (HQ), the probabilistic risk quotient and the joint probability curve (JPC), the potential ecological risk of cadmium in the surface water of the Liaodong Bay, Bohai Bay, and Laizhou Bay was estimated. Overall, the ecological risk of cadmium to the aquatic ecosystem of the whole Bohai Sea was at an acceptable level, with risk ordered Liaodong Bay > Bohai Bay > Laizhou Bay. However, more attention should be paid to the aquatic ecological risk in the Liaodong Bay, which is home to many steel, metallurgy and petrochemical industries in China.
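
    A minimal log-normal SSD sketch showing how an HC5, and from it a PNEC, can be derived from chronic toxicity data; the NOEC values and the extra assessment factor below are invented:

```python
import numpy as np
from scipy.stats import norm

# Fit a log-normal SSD to chronic NOECs (invented, ug/L) and take the 5th
# percentile (HC5); SSD-based PNECs divide the HC5 by an assessment factor.
noec = np.array([2.1, 3.5, 5.0, 8.2, 12.0, 20.0, 35.0, 60.0])

mu, sigma = np.log(noec).mean(), np.log(noec).std(ddof=1)
hc5 = float(np.exp(norm.ppf(0.05, loc=mu, scale=sigma)))
pnec = hc5 / 2.0   # assessment factor of 2, an assumed regulatory choice
print(f"HC5 = {hc5:.2f} ug/L, PNEC = {pnec:.2f} ug/L")
```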

  13. Climate change risk analysis framework (CCRAF): a probabilistic tool for analyzing climate change uncertainties

    Science.gov (United States)

    Legget, J.; Pepper, W.; Sankovski, A.; Smith, J.; Tol, R.; Wigley, T.

    2003-04-01

    Potential risks of human-induced climate change are subject to a three-fold uncertainty associated with: the extent of future anthropogenic and natural GHG emissions; global and regional climatic responses to emissions; and impacts of climatic changes on economies and the biosphere. Long-term analyses are also subject to uncertainty regarding how humans will respond to actual or perceived changes, through adaptation or mitigation efforts. Explicitly addressing these uncertainties is a high priority in the scientific and policy communities. Probabilistic modeling is gaining momentum as a technique to quantify uncertainties explicitly and to use decision analysis techniques that take advantage of improved risk information. The Climate Change Risk Assessment Framework (CCRAF) presented here is a new integrative tool that combines the probabilistic approaches developed in population, energy and economic sciences with empirical data and probabilistic results of climate and impact models. The main CCRAF objective is to assess global climate change as a risk management challenge and to provide insights regarding robust policies that address the risks, by mitigating greenhouse gas emissions and by adapting to climate change consequences. The CCRAF endogenously simulates, to 2100 or beyond, annual region-specific changes in population; GDP; primary (by fuel) and final energy (by type) use; a wide set of associated GHG emissions; GHG concentrations; global temperature change and sea level rise; economic, health, and biospheric impacts; costs of mitigation and adaptation measures; and residual costs or benefits of climate change. The atmospheric and climate components of CCRAF are formulated based on the latest version of Wigley's and Raper's MAGICC model, and impacts are simulated based on a modified version of Tol's FUND model. The CCRAF is based on a series of log-linear equations with deterministic and random components and is implemented using a Monte-Carlo method with up to 5000

  14. Associations between polygenic risk for schizophrenia and brain function during probabilistic learning in healthy individuals.

    Science.gov (United States)

    Lancaster, Thomas M; Ihssen, Niklas; Brindley, Lisa M; Tansey, Katherine E; Mantripragada, Kiran; O'Donovan, Michael C; Owen, Michael J; Linden, David E J

    2016-02-01

    A substantial proportion of schizophrenia liability can be explained by additive genetic factors. Risk profile scores (RPS) directly index risk using a summated total of common risk variants weighted by their effect. Previous studies suggest that schizophrenia RPS predict alterations to neural networks that support working memory and verbal fluency. In this study, we apply schizophrenia RPS to fMRI data to elucidate the effects of polygenic risk on functional brain networks during a probabilistic-learning neuroimaging paradigm. The neural networks recruited during this paradigm have previously been shown to be altered in unmedicated schizophrenia patients and relatives of schizophrenia patients, which may reflect genetic susceptibility. We created schizophrenia RPS using summary data from the Psychiatric Genomics Consortium (Schizophrenia Working Group) for 83 healthy individuals and explored associations between schizophrenia RPS and the blood oxygen level dependent (BOLD) response during periods of choice behavior (switch-stay) and reflection upon choice outcome (reward-punishment). We show that schizophrenia RPS are associated with alterations in the frontal pole (P(whole-brain-corrected) = 0.048) and the ventral striatum (P(ROI-corrected) = 0.036) during choice behavior, but not choice outcome. We suggest that the common risk variants that increase susceptibility to schizophrenia can be associated with alterations in the neural circuitry that supports the processing of changing reward contingencies. Hum Brain Mapp 37:491-500, 2016. © 2015 Wiley Periodicals, Inc.
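
    The RPS definition quoted above, a weighted sum of risk-allele counts, reduces to a dot product; a toy sketch with invented effect sizes:

```python
import numpy as np

# The RPS definition quoted above: a weighted sum of risk-allele counts,
# with weights taken from GWAS summary statistics (toy numbers below).
log_odds = np.array([0.05, -0.02, 0.08, 0.01])   # per-allele effect sizes, assumed
allele_counts = np.array([1, 2, 0, 1])           # one person's genotype (0/1/2)

rps = float(np.dot(log_odds, allele_counts))
print(f"risk profile score = {rps:.3f}")
```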

  15. Methodology for Developing a Probabilistic Risk Assessment Model of Spacecraft Rendezvous and Dockings

    Science.gov (United States)

    Farnham, Steven J., II; Garza, Joel, Jr.; Castillo, Theresa M.; Lutomski, Michael

    2011-01-01

    In 2007 NASA was preparing to send two new visiting vehicles carrying logistics and propellant to the International Space Station (ISS). These new vehicles were the European Space Agency's (ESA) Automated Transfer Vehicle (ATV), the Jules Verne, and the Japan Aerospace Exploration Agency's (JAXA) H-II Transfer Vehicle (HTV). The ISS Program wanted to quantify the increased risk to the ISS from these visiting vehicles. At the time, only the Shuttle, the Soyuz, and the Progress vehicles rendezvoused and docked to the ISS. The increased risk to the ISS came from an increase in vehicle traffic, thereby increasing the potential for a catastrophic collision during the rendezvous and the docking or berthing of the spacecraft to the ISS. A universal method of evaluating the risk of rendezvous and docking or berthing was created by the ISS's Risk Team to accommodate the increasing number of rendezvous and docking or berthing operations due to the increasing number of different spacecraft, as well as the future arrival of commercial spacecraft. Before the first docking attempt of ESA's ATV and JAXA's HTV to the ISS, a probabilistic risk model was developed to quantitatively calculate the risk of collision of each spacecraft with the ISS. The five rendezvous and docking risk models (Soyuz, Progress, Shuttle, ATV, and HTV) have been used to build and refine the modeling methodology for rendezvous and docking of spacecraft. This risk modeling methodology will be NASA's basis for evaluating the hazards of future ISS visiting spacecraft, including SpaceX's Dragon, Orbital Sciences' Cygnus, and NASA's own Orion spacecraft. This paper will describe the methodology used for developing a visiting vehicle risk model.

  16. Probabilistic risk assessment support of emergency preparedness at the Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    O'Kula, K.R.; Baker, W.H.; Simpkins, A.A.; Taylor, R.P. [Westinghouse Savannah River Co., Aiken, SC (United States); Wagner, K.C.; Amos, C.N. [Science Applications International Corp., Albuquerque, NM (United States)

    1992-12-31

    Integration of the Probabilistic Risk Assessment (PRA) for K Reactor operation into related technical areas at the Savannah River Site (SRS) includes coordination with several onsite organizations responsible for maintaining and upgrading emergency preparedness capabilities. Major functional categories of the PRA application are scenario development and source term algorithm enhancement. Insights and technologies from the SRS PRA have facilitated development of: (1) credible timelines for scenarios; (2) algorithms tied to plant instrumentation to provide best-estimate source terms for dose projection; and (3) expert-system logic models to implement informed counter-measures to assure onsite and offsite safety following accidental releases. The latter methodology, in particular, is readily transferable to other reactor and non-reactor facilities at SRS and represents a distinct advance relative to emergency preparedness capabilities elsewhere in the DOE complex.

  17. Probabilistic risk assessment support of emergency preparedness at the Savannah River Site

    Energy Technology Data Exchange (ETDEWEB)

    O'Kula, K.R.; Baker, W.H.; Simpkins, A.A.; Taylor, R.P. (Westinghouse Savannah River Co., Aiken, SC (United States)); Wagner, K.C.; Amos, C.N. (Science Applications International Corp., Albuquerque, NM (United States))

    1992-01-01

    Integration of the Probabilistic Risk Assessment (PRA) for K Reactor operation into related technical areas at the Savannah River Site (SRS) includes coordination with several onsite organizations responsible for maintaining and upgrading emergency preparedness capabilities. Major functional categories of the PRA application are scenario development and source term algorithm enhancement. Insights and technologies from the SRS PRA have facilitated development of: (1) credible timelines for scenarios; (2) algorithms tied to plant instrumentation to provide best-estimate source terms for dose projection; and (3) expert-system logic models to implement informed counter-measures to assure onsite and offsite safety following accidental releases. The latter methodology, in particular, is readily transferable to other reactor and non-reactor facilities at SRS and represents a distinct advance relative to emergency preparedness capabilities elsewhere in the DOE complex.

  18. Probabilistic assessment of erosion and flooding risk in the northern Gulf of Mexico

    Science.gov (United States)

    Plant, Nathaniel G.; Wahl, Thomas; Long, Joseph W.

    2016-01-01

    We assess erosion and flooding risk in the northern Gulf of Mexico by identifying interdependencies among oceanographic drivers and probabilistically modeling the resulting potential for coastal change. Wave and water level observations are used to determine relationships between six hydrodynamic parameters that influence total water level and therefore erosion and flooding, through consideration of a wide range of univariate distribution functions and multivariate elliptical copulas. Using these relationships, we explore how different our interpretation of the present-day erosion/flooding risk could be if we had seen more or fewer extreme realizations of individual and combinations of parameters in the past by simulating 10,000 physically and statistically consistent sea-storm time series. We find that seasonal total water levels associated with the 100 year return period could be up to 3 m higher in summer and 0.6 m higher in winter relative to our best estimate based on the observational records. Impact hours of collision and overwash—where total water levels exceed the dune toe or dune crest elevations—could be on average 70% (collision) and 100% (overwash) larger than inferred from the observations. Our model accounts for non-stationarity in a straightforward, non-parametric way that can be applied (with minor adjustments) to many other coastlines. The probabilistic model presented here, which accounts for observational uncertainty, can be applied to other coastlines where short record lengths limit the ability to identify the full range of possible wave and water level conditions that coastal managers and planners must consider to develop sustainable management strategies.
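
    The copula idea can be sketched as follows: draw correlated uniforms through a Gaussian copula and map them onto marginal distributions for two of the hydrodynamic parameters. The paper fits elliptical copulas and marginals to observations; everything below is illustrative:

```python
import numpy as np
from scipy.stats import norm, genextreme

rng = np.random.default_rng(3)

# Gaussian copula: correlated normals -> correlated uniforms -> marginals.
# Correlation, GEV marginals and the crude total-water-level proxy are all
# illustrative; the paper fits elliptical copulas to six observed parameters.
rho, n = 0.6, 10_000
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=n)
u = norm.cdf(z)

wave_height = genextreme.ppf(u[:, 0], c=-0.1, loc=2.0, scale=0.5)   # m
surge = genextreme.ppf(u[:, 1], c=-0.1, loc=0.4, scale=0.15)        # m

total_proxy = surge + 0.2 * wave_height   # toy stand-in for total water level
print(f"99.9th percentile of the proxy: {np.percentile(total_proxy, 99.9):.2f} m")
```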

  19. Probabilistic assessment of erosion and flooding risk in the northern Gulf of Mexico

    Science.gov (United States)

    Wahl, Thomas; Plant, Nathaniel G.; Long, Joseph W.

    2016-05-01

    We assess erosion and flooding risk in the northern Gulf of Mexico by identifying interdependencies among oceanographic drivers and probabilistically modeling the resulting potential for coastal change. Wave and water level observations are used to determine relationships between six hydrodynamic parameters that influence total water level and therefore erosion and flooding, through consideration of a wide range of univariate distribution functions and multivariate elliptical copulas. Using these relationships, we explore how different our interpretation of the present-day erosion/flooding risk could be if we had seen more or fewer extreme realizations of individual and combinations of parameters in the past by simulating 10,000 physically and statistically consistent sea-storm time series. We find that seasonal total water levels associated with the 100 year return period could be up to 3 m higher in summer and 0.6 m higher in winter relative to our best estimate based on the observational records. Impact hours of collision and overwash—where total water levels exceed the dune toe or dune crest elevations—could be on average 70% (collision) and 100% (overwash) larger than inferred from the observations. Our model accounts for non-stationarity in a straightforward, non-parametric way that can be applied (with minor adjustments) to many other coastlines. The probabilistic model presented here, which accounts for observational uncertainty, can be applied to other coastlines where short record lengths limit the ability to identify the full range of possible wave and water level conditions that coastal managers and planners must consider to develop sustainable management strategies.

  20. A Preliminary Study of the Application of Probabilistic Risk Assessment Techniques to High-Energy Laser Safety

    Science.gov (United States)

    2001-12-01

    Recovered figure captions from the report: probability that scintillation gain exceeds a given level (g); typical dose-response curve for laser-induced ocular damage; probit transformation of the dose-response curve for laser-induced ocular damage. From the section on protection criteria: an important element in the probabilistic risk assessment is the biological damage model (or dose-response curve). This describes

  1. Probabilistic risk assessment model for allergens in food: sensitivity analysis of the minimum eliciting dose and food consumption

    NARCIS (Netherlands)

    Kruizinga, A.G.; Briggs, D.; Crevel, R.W.R.; Knulst, A.C.; Bosch, L.M.C.v.d.; Houben, G.F.

    2008-01-01

    Previously, TNO developed a probabilistic model to predict the likelihood of an allergic reaction, resulting in a quantitative assessment of the risk associated with unintended exposure to food allergens. The likelihood is estimated by including in the model the proportion of the population who is a

  2. Development of Simplified Probabilistic Risk Assessment Model for Seismic Initiating Event

    Energy Technology Data Exchange (ETDEWEB)

    S. Khericha; R. Buell; S. Sancaktar; M. Gonzalez; F. Ferrante

    2012-06-01

    This paper discusses a simplified method to evaluate seismic risk using a methodology built on dividing the seismic intensity spectrum into multiple discrete bins. The seismic probabilistic risk assessment model uses the Nuclear Regulatory Commission's (NRC's) full-power Standardized Plant Analysis Risk (SPAR) model as the starting point for development. The seismic PRA models are integrated with their respective internal-events at-power SPAR models. This is accomplished by combining the modified system fault trees from the full-power SPAR model with seismic event tree logic. The peak ground acceleration is divided into five bins. The g-value for each bin is estimated using the geometric mean of the lower and upper bounds of that particular bin, and the associated frequency for each bin is estimated by taking the difference between the exceedance frequencies at the bounds of that bin. The component fragilities are calculated for each bin using plant data, if available, or generic values of median peak ground acceleration and uncertainty for the components. For human reliability analysis (HRA), the SPAR HRA (SPAR-H) method is used, which requires analysts to complete relatively straightforward worksheets that include the performance shaping factors (PSFs). The results are then used to estimate the human error probabilities (HEPs) of interest. This work is expected to improve the NRC's ability to include seismic hazards in risk assessments for operational events in support of the reactor oversight program (e.g., the significance determination process).
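
    The binning scheme described above is easy to reproduce; here is a sketch with assumed bin bounds and a hypothetical power-law hazard fit standing in for a site-specific hazard curve:

```python
import numpy as np

# Discretize the seismic hazard curve into five PGA bins, as described:
# the bin g-value is the geometric mean of its bounds, and the bin frequency
# is the difference of exceedance frequencies at those bounds.
pga_bounds = np.array([0.1, 0.2, 0.4, 0.8, 1.6, 3.2])   # g, assumed bounds

def exceedance_freq(a, k0=1e-4, k=2.2):                 # hypothetical hazard fit
    return k0 * a ** (-k)

g_values = np.sqrt(pga_bounds[:-1] * pga_bounds[1:])     # geometric means
bin_freqs = exceedance_freq(pga_bounds[:-1]) - exceedance_freq(pga_bounds[1:])

for g, f in zip(g_values, bin_freqs):
    print(f"bin at {g:.2f} g: annual frequency {f:.2e}")
```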

  3. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are (1) distinct from individual human errors and (2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research whose objective is to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  4. Believe it or not? The challenge of validating large scale probabilistic risk models

    Directory of Open Access Journals (Sweden)

    Sayers Paul

    2016-01-01

    Full Text Available The National Flood Risk Assessment (NaFRA) for England and Wales was initially undertaken in 2002, with frequent updates since. NaFRA has become a key source of information on flood risk, informing policy and investment decisions as well as communicating risk to the public and insurers. To make well-informed decisions based on these data, users rightfully demand to know the confidence they can place in them. The probability of inundation and associated damage, however, cannot be validated in the traditional sense, due to the rare and random nature of damaging floods and the lack of a long (and widespread) stationary observational record (reflecting not only changes in climate but also the significant changes in land use and flood defence infrastructure that are likely to have occurred). To explore the validity of NaFRA, this paper therefore provides a bottom-up qualitative exploration of the potential errors within the supporting methods and data. The paper concludes by underlining the need for further research to understand how to robustly validate probabilistic risk models.

  5. Cystone, a well-known herbal formulation, inhibits struvite crystal growth formation in single diffusion gel growth technique

    Directory of Open Access Journals (Sweden)

    Pralhad S. Patki

    2013-02-01

    Full Text Available Objective: The present study aimed to evaluate the beneficial effect of Cystone® against struvite crystal growth under in vitro conditions. Methods: Various concentrations of Cystone® were prepared in 1 M magnesium acetate solution and evaluated in a crystal growth inhibition assay using the well-known single diffusion gel growth technique in vitro. Results: Cystone®, a well-known polyherbal formulation, at 0.5, 1 and 2% concentrations showed significant, dose-dependent inhibition of struvite crystal growth in vitro by reducing the number, total mass and total volume of the struvite crystals formed, and also caused fragmentation of grown struvite crystals in the gel matrix. Conclusion: The results of the present study indicate that Cystone® significantly retards the formation of struvite stones and also brings about their fragmentation. This could be one of the probable mechanisms behind the beneficial effect offered by Cystone® in the clinical management of urolithiasis and urinary tract infections. [J Exp Integr Med 2013; 3(1): 51-55]

  6. Probabilistic risk assessment for linear alkylbenzene sulfonate (LAS) in sewage sludge used on agricultural soil.

    Science.gov (United States)

    Schowanek, Diederik; David, Helen; Francaviglia, Rosa; Hall, Jeremy; Kirchmann, Holger; Krogh, Paul Henning; Schraepen, Nathalie; Smith, Stephen; Wildemann, Tanja

    2007-12-01

    Deterministic and probabilistic risk assessments were developed for commercial LAS in agricultural soil amended with sewage sludge. The procedure, following ILSI Europe's Conceptual Framework [Schowanek, D., Carr, R., David, H., Douben, P., Hall, J., Kirchmann, H., Patria, L., Sequi, P., Smith, S., Webb, S.F., 2004. A risk-based methodology for deriving quality standards for organic contaminants in sewage sludge for use in agriculture-conceptual framework. Regul. Toxicol. Pharmacol. 40 (3), 227-251], consists of three main steps. First, the most sensitive endpoint was determined; this was found to be the chronic ecotoxicity of LAS to soil invertebrates and plants. Additional endpoints, such as the potential for plant uptake and transfer in the food chain, leaching to groundwater, surface erosion run-off, and human health risk via drinking water, plant consumption and soil ingestion, were also systematically evaluated but were all assessed to be of little toxicological significance. In the second step, a back-calculation was conducted from the Predicted No-Effect Concentration in soil (PNECsoil) to a safe level of LAS in sludge (here called the 'Sludge Quality Standard', SQS). The deterministic approach followed the default agricultural soil exposure scenario in the EU Technical Guidance Document (TGD). The SQS for LAS was calculated as 49 g/kg sludge Dry Matter (DM). In order to assess the potential variability resulting from varying agricultural practices and local environmental conditions, two probabilistic exposure assessment scenarios were also developed. The mean SQS was estimated at 55 and 27.5 g/kg DM for the homogeneous soil mixing and soil injection scenarios, respectively. In the final step, the resulting SQS values were evaluated for consistency and relevance against available information from agricultural experience and field tests. No build-up or adverse impact on soil fertility, agronomic performance, or animal/human health has been reported for agricultural
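
    The back-calculation in the second step can be caricatured in a few lines, assuming TGD-style defaults for the sludge application rate and plough-layer soil mass; the numbers below are invented and ignore degradation and background, so the result differs from the paper's 49 g/kg:

```python
# Back-of-envelope version of the back-calculation from PNEC_soil to a
# Sludge Quality Standard. All numbers are illustrative, not the paper's.
pnec_soil = 35.0           # mg LAS per kg dry soil, assumed
sludge_rate = 5000.0       # kg sludge DM applied per ha per year, assumed
plough_layer_mass = 3.0e6  # kg dry soil per ha in the mixing layer, assumed

# Highest sludge concentration whose application keeps soil at or below PNEC:
sqs = pnec_soil * plough_layer_mass / sludge_rate   # mg per kg sludge DM
print(f"SQS ~ {sqs / 1000:.0f} g/kg DM")            # ~21 g/kg with these inputs
```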

  7. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    Science.gov (United States)

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries.

  8. Using Statistical and Probabilistic Methods to Evaluate Health Risk Assessment: A Case Study

    Directory of Open Access Journals (Sweden)

    Hongjing Wu

    2014-06-01

    Full Text Available Toxic chemicals and heavy metals in wastewater can cause serious adverse impacts on human health. Health risk assessment (HRA) is an effective tool for supporting decision-making and corrective actions in water quality management. HRA can also help people understand water quality and quantify the adverse effects of pollutants on human health. Due to the imprecision of data, measurement error and limited available information, uncertainty is inevitable in the HRA process. The purpose of this study is to integrate statistical and probabilistic methods to deal with censored and limited numbers of input data, improving the reliability of the non-cancer HRA of dermal contact exposure to contaminated river water by considering uncertainty. A case study of the Kelligrews River in St. John's, Canada, was conducted to demonstrate the feasibility and capacity of the proposed approach. Five heavy metals were selected to evaluate the risk level: arsenic, molybdenum, zinc, uranium and manganese. The results showed that the probability of the total hazard index of dermal exposure exceeding 1 is very low, and there is no obvious evidence of risk in the study area.
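
    A compact sketch of a probabilistic hazard-index calculation of the kind described, with invented concentration distributions, reference doses, and a crude dermal dose factor:

```python
import numpy as np

rng = np.random.default_rng(11)

# Five metals from the study; every number below (medians, spreads, reference
# doses, dermal factor) is invented purely to show the mechanics.
metals = ["As", "Mo", "Zn", "U", "Mn"]
median_conc = np.array([1.0, 2.0, 30.0, 0.5, 10.0])   # ug/L, assumed
rfd = np.array([0.3, 5.0, 300.0, 3.0, 100.0])         # ug/kg/day, assumed
dermal_factor = 0.05   # L/kg/day-equivalent contact factor, assumed

n = 100_000
conc = rng.lognormal(np.log(median_conc), 0.5, (n, len(metals)))
hazard_index = (conc * dermal_factor / rfd).sum(axis=1)

print(f"P(HI > 1) = {(hazard_index > 1).mean():.1e}")   # 'very low' here too
```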

  9. Flood hazards and masonry constructions: a probabilistic framework for damage, risk and resilience at urban scale

    Science.gov (United States)

    Mebarki, A.; Valencia, N.; Salagnac, J. L.; Barroca, B.

    2012-05-01

    This paper deals with the failure risk of masonry constructions under the effect of floods. It is developed within a probabilistic framework, with loads and resistances considered as random variables. Two complementary approaches have been investigated for this purpose: a global approach based on the combined effects of several governing parameters, each with an individual weighted contribution (material quality and geometry, presence of and distance between columns, beams, openings, resistance of the soil and its slope, etc.), and a reliability method using the failure mechanism of masonry walls under out-of-plane pressure. The evolution of the failure probability of masonry constructions with the flood water level is analysed. The analysis of different failure probability scenarios for masonry walls is conducted to calibrate the influence of each "vulnerability governing parameter" in the global approach that is widely used in risk assessment at the urban or regional scale. The global methodology is implemented in a GIS that provides the spatial distribution of damage risk for different flood scenarios. A real case, Cheffes sur Sarthe (France), is considered for the simulations, for which the observed river discharge, the hydraulic load according to the Digital Terrain Model, and the structural resistance are treated as random variables. The damage probability values provided by both approaches are compared. Discussion is also devoted to reduction and mitigation of flood disasters at various scales (sets of structures, city, region), as well as to resilience.
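
    The reliability approach reduces to estimating P(S > R) with load S and resistance R as random variables; a Monte Carlo sketch with an assumed Gumbel flood depth and a lognormal wall resistance (not the paper's calibrated values):

```python
import numpy as np

rng = np.random.default_rng(5)

# Failure when the hydrostatic load effect S exceeds the wall resistance R.
# Distributions and parameters are illustrative, not the paper's calibration.
n = 200_000
depth = np.clip(rng.gumbel(0.8, 0.3, n), 0.0, None)   # flood depth (m)
S = 0.5 * 1000.0 * 9.81 * depth ** 2                  # load effect, N per m of wall
R = rng.lognormal(np.log(8000.0), 0.25, n)            # resistance, N per m of wall

print(f"estimated failure probability: {(S > R).mean():.3f}")
```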

  10. A probabilistic approach to quantitatively assess the inhalation risk for airborne endotoxin in cotton textile workers

    Energy Technology Data Exchange (ETDEWEB)

    Liao, Vivian Hsiu-Chuan, E-mail: vivianliao@ntu.edu.tw [Department of Bioenvironmental Systems Engineering, National Taiwan University, 1 Roosevelt Road, Sec. 4, Taipei 106, Taiwan (China); Chou, Wei-Chun; Chio, Chia-Pin; Ju, Yun-Ru; Liao, Chung-Min [Department of Bioenvironmental Systems Engineering, National Taiwan University, 1 Roosevelt Road, Sec. 4, Taipei 106, Taiwan (China)

    2010-05-15

    Endotoxin, a component of gram-negative bacterial cell walls, is a proinflammatory agent that induces local and systemic inflammatory responses in normal subjects, which can contribute to the risk of developing asthma and chronic obstructive lung diseases. A probabilistic approach linking models of exposure, internal dosimetry, and health effects was carried out to quantitatively assess the potential inhalation risk of airborne endotoxin for workers in cotton textile plants. Combining empirical data and modeling results, we show that the half-maximum-effect endotoxin doses (ED50) were estimated to be 3.3 × 10⁵ (95% confidence interval (CI): 1.9-14.7 × 10⁵) endotoxin units (EU) for the blood C-reactive protein (CRP) concentration, 1.1 × 10⁵ (95% CI: 0.6-1.7 × 10⁵) EU for the blood polymorphonuclear neutrophil (PMN) count, and 1.5 × 10⁵ (95% CI: 0.4-2.5 × 10⁵) EU for the sputum PMN count. Our study offers a risk-management framework for discussing the future establishment of limits for respiratory exposure to airborne endotoxin for workers in cotton textile plants.
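
    A sketch of how a half-maximum-effect dose can be estimated by fitting a Hill (log-logistic) dose-response curve; the dose-response points below are synthetic, not the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hill (log-logistic) dose-response curve; ED50 is the half-maximum dose.
def hill(dose, ed50, slope):
    return dose ** slope / (ed50 ** slope + dose ** slope)

doses = np.array([1e4, 3e4, 1e5, 3e5, 1e6, 3e6])            # endotoxin units
response = np.array([0.05, 0.15, 0.40, 0.55, 0.80, 0.95])   # normalized effect

(ed50, slope), _ = curve_fit(hill, doses, response, p0=[1e5, 1.0])
print(f"ED50 ~ {ed50:.2e} EU, Hill slope ~ {slope:.2f}")
```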

  11. Probabilistic risk assessment of veterinary medicines applied to four major aquaculture species produced in Asia.

    Science.gov (United States)

    Rico, Andreu; Van den Brink, Paul J

    2014-01-15

    Aquaculture production constitutes one of the main sources of pollution with veterinary medicines into the environment. About 90% of the global aquaculture production comes from Asia, and the potential environmental risks associated with the use of veterinary medicines in Asian aquaculture have not yet been properly evaluated. In this study we performed a probabilistic risk assessment for eight different aquaculture production scenarios in Asia by combining up-to-date information on the use of veterinary medicines and aquaculture production characteristics. The ERA-AQUA model was used to perform mass balances of veterinary medicinal treatments applied to aquaculture ponds and to characterize risks for primary producers, invertebrates, and fish potentially exposed to chemical residues through aquaculture effluents. The mass balance calculations showed that, on average, about 25% of the drug mass applied to aquaculture ponds is released into the environment, although this percentage varies with the chemical's properties, the mode of application, the cultured species density, and the water exchange rate in the aquaculture pond scenario. In general, the highest potential environmental risks were calculated for parasitic treatments, followed by disinfection and antibiotic treatments. Pangasius catfish production in Vietnam, followed by shrimp production in China, constitute possible hot-spots for environmental pollution due to the intensity of aquaculture production and the considerable discharge of toxic chemical residues into surrounding aquatic ecosystems. A risk-based ranking of compounds is provided for each of the evaluated scenarios, which offers crucial information for conducting further chemical and biological field and laboratory monitoring research. In addition, we discuss general knowledge gaps and research priorities for performing refined risk assessments of aquaculture medicines in the near future.

  12. Shuttle Risk Progression: Use of the Shuttle Probabilistic Risk Assessment (PRA) to Show Reliability Growth

    Science.gov (United States)

    Hamlin, Teri L.

    2011-01-01

    It is important to the Space Shuttle Program (SSP), as well as to future manned spaceflight programs, to understand the early mission risk and the progression of risk as a program gains insights into the integrated vehicle through flight. The risk progression is important to the SSP as part of the documentation of lessons learned, and important to future programs for understanding reliability growth and first-flight risk. This analysis uses the knowledge gained from 30 years of operational flights and the current Shuttle PRA to calculate the risk of Loss of Crew and Vehicle (LOCV) at significant milestones beginning with the first flight. Key flights were evaluated based upon historical events and significant re-designs. The results indicate that Shuttle risk tends to follow a step function, as opposed to a traditional reliability growth pattern in which risk improves exponentially with each flight. In addition, they show that risk can increase due to trading safety margin for increased performance or due to external events. Because the risk drivers were not being addressed, the risk did not improve appreciably during the first 25 flights. It was only after significant events such as Challenger and Columbia, where the risk drivers became apparent, that risk was significantly improved. In addition, this paper shows that the SSP has reduced the risk of LOCV by almost an order of magnitude. It is easy to look back after 30 years and point to risks that are now obvious; however, the key is to use this knowledge to benefit other programs in their infancy. One lesson learned from the SSP is that understanding risk drivers is essential to considerably reducing risk. This enables a new program to focus time and resources on identifying and reducing the significant risks. A comprehensive PRA, similar to the Shuttle PRA, is an effective tool for quantifying risk drivers if support from all of the stakeholders is

  13. The effects of name and religious priming on ratings of a well-known political figure, President Barack Obama.

    Science.gov (United States)

    Williams, Gary A; Guichard, AnaMarie C; An, JungHa

    2017-01-01

    Priming with race-typed names and religious concepts has been shown to activate stereotypes and increase prejudice towards out-groups. We examined the effects of name and religious word priming on views of a specific and well-known person, President Barack Obama. We predicted that politically conservative participants primed with President Obama's middle name (Hussein) would rate him more negatively and be more likely to view him as a Muslim than those not shown his middle name. We also examined whether conservatives primed with concrete religious words would rate President Obama more negatively and be more likely to view him as Muslim than those primed with other word types. Furthermore, we predicted that those who mis-identify President Obama as Muslim would rate him more negatively than would those who view him as Christian. The results provided mixed support for these hypotheses. Conservatives primed with President Obama's middle name rated him significantly more negatively than did those in the control condition. This effect was not found for politically liberal or moderate participants. Name priming did not significantly affect views of President Obama's religious affiliation. Although not statistically significant, conservatives primed with abstract religious words tended to rate President Obama more negatively than did those primed with other word types. Religious word priming significantly influenced views of President Obama's religious affiliation; interestingly, participants primed with abstract religious words were more likely to think President Obama is Muslim than were those primed with religious agent or non-religious words. As predicted, participants who thought President Obama was Muslim rated him significantly more negatively than did those who thought he was Christian. Overall, our results provide some evidence that ethnic name and religious word priming can significantly influence opinions, even of a well-known and specific person.

  14. Well-Known Journal Construction Needs Quality Consciousness

    Institute of Scientific and Technical Information of China (English)

    马光

    2011-01-01

    Well-known journal construction must highlight quality consciousness. A journal should be made a work of quality, not an ordinary product, still less a defective one. A quality journal innovates and breaks new ground in its academic viewpoints and research conclusions; says something substantive rather than echoing others; advances the prosperity of its discipline and deepens research; publishes what has not been published before and enlightens readers; takes a far-sighted view and leads academic trends; and promotes practice and social development. In terms of editorial quality, the text should be nearly free of errors, with an attractive layout, excellent printing and distinctive binding. Whether quality consciousness is embodied in the three stages of topic planning, soliciting contributions, and editing and proofreading bears directly on the success or failure of well-known journal construction.

  15. Use of risk quotient and probabilistic approaches to assess risks of pesticides to birds

    Science.gov (United States)

    When conducting ecological risk assessments for pesticides, the United States Environmental Protection Agency typically relies upon the risk quotient (RQ). This approach is intended to be conservative in nature, making assumptions related to exposure and effects that are intended...

  16. Implementation of a risk assessment tool based on a probabilistic safety assessment developed for radiotherapy practices

    Energy Technology Data Exchange (ETDEWEB)

    Paz, A.; Godinez, V.; Lopez, R., E-mail: abpaz@cnsns.gob.m [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan No. 779, Col. Narvarte, 03020 Mexico D. F. (Mexico)

    2010-10-15

    The present work describes the implementation process and main results of the risk assessment of radiotherapy practices with linear accelerators (Linacs), with cobalt-60, and with brachytherapy. These evaluations were made with the risk assessment tool for radiotherapy practices SEVRRA (risk evaluation system for radiotherapy), developed at the Mexican National Commission on Nuclear Safety and Safeguards from the outcome of the Probabilistic Safety Analysis developed at the Ibero-American Regulators Forum for these radiotherapy facilities. The methodology is based on the risk matrices method, a mathematical tool that estimates the risk to the patient, radiation workers and the public from mechanical failures, miscalibration of the devices, human mistakes, and so on. The initiating events are defined as those undesirable events that, together with other failures, can produce delivery of an over-dose or an under-dose of the medically prescribed dose to the planned target volume, or a significant dose to non-prescribed human organs. The frequency of initiating events and of frequency reducers (actions intended to avoid the accident) is estimated, as well as the robustness of barriers against those events, such as mechanical switches, which detect and prevent the accident from occurring. The spectrum of consequences is parameterized, and the actions performed to reduce the consequences are identified. Based on this analysis, a software tool was developed to simplify the evaluation of radiotherapy installations, and it has been applied as a first step to some Mexican installations as part of a national implementation process; the final goal is the evaluation of all Mexican facilities in the near future. The main targets and benefits of the SEVRRA implementation are presented in this paper. (Author)
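
    The risk-matrix combination described above can be sketched as follows; the level names, the barrier credit and the score-to-risk table are assumptions for illustration, not the actual SEVRRA rules.

        FREQ = ["very low", "low", "medium", "high"]              # initiating event frequency
        CONSEQ = ["minor", "moderate", "severe", "catastrophic"]  # consequence severity

        def risk_level(freq, barriers_robust, conseq):
            """Combine frequency, barrier robustness and consequence into a risk level."""
            f = FREQ.index(freq)
            if barriers_robust and f > 0:
                f -= 1                       # a robust barrier reduces effective frequency
            score = f + CONSEQ.index(conseq)
            return ["low", "low", "medium", "medium", "high", "high", "very high"][score]

        print(risk_level("high", barriers_robust=True, conseq="severe"))  # -> "high"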

  17. Using Probabilistic Seismic Hazard Analysis in Assessing Seismic Risk for Taipei City and New Taipei City

    Science.gov (United States)

    Hsu, Ming-Kai; Wang, Yu-Ju; Cheng, Chin-Tung; Ma, Kuo-Fong; Ke, Siao-Syun

    2016-04-01

    In this study, we evaluate the seismic hazard and risk for Taipei City and New Taipei City, which are important municipalities and the most populous cities in Taiwan. The evaluation of seismic risk combines three main components: a probabilistic seismic hazard model, an exposure model defining the spatial distribution of elements exposed to the hazard, and vulnerability functions capable of describing the distribution of percentage loss for a set of intensity measure levels. Seismic hazard for Taipei City and New Taipei City is presented as hazard maps in terms of ground motion values expected to be exceeded at a 10% probability level in 50 years (return period 475 years) and at a 2% probability level in 50 years (return period 2475 years), according to the Taiwan Earthquake Model (TEM), which assesses two seismic hazard models for Taiwan. The first model adopted the source parameters of 38 seismogenic structures identified by the TEM geologists. The other model considered 33 active faults and was published by the Central Geological Survey (CGS), Taiwan, in 2010. Grid-based building data at 500 m by 500 m resolution were selected for the evaluation, as they are capable of providing detailed information about the location, value and vulnerability classification of the exposed elements. The results of this study were evaluated with the OpenQuake engine, the open-source software for seismic risk and hazard assessment developed within the Global Earthquake Model (GEM) initiative. Our intention is to make a first attempt at modelling seismic risk from hazard in an open platform for Taiwan. An analysis through disaggregation of hazard components will also be made to prioritize the risk for further policy making.
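
    The two return periods quoted above follow from the standard assumption that annual exceedances are independent, so that P(at least one exceedance in t years) = 1 - (1 - 1/T)^t; a quick check:

        def return_period(p_exceed, years):
            """Return period T such that P(>=1 exceedance in `years`) = p_exceed."""
            return 1.0 / (1.0 - (1.0 - p_exceed) ** (1.0 / years))

        print(round(return_period(0.10, 50)))  # ~475 years
        print(round(return_period(0.02, 50)))  # ~2475 years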

  18. Assessing risk by impacts: a probabilistic approach for drought assessment in Europe

    Science.gov (United States)

    Blauhut, Veit; Stahl, Kerstin; Vogt, Jürgen

    2015-04-01

    The risk of natural disasters in a very general sense is a combination of hazard and vulnerability. For drought, the hazard is commonly derived from the statistical analysis of one or a set of drought indicators. Their selection mostly depends on the focus of the study, with the usage of standardized indices experiencing growing popularity. Vulnerability to drought is typically estimated by a subjectively weighted combination of relevant factors describing different aspects of vulnerability, such as exposure, sensitivity and adaptive capacity. This epistemic approach requires explicit information on physical, ecological, institutional and socioeconomic parameters. Even though impacts are known as symptoms of vulnerability, and risk is often defined as the likelihood of impact occurrence (e.g. by the IPCC 2012 SREX report), information on past impacts is only poorly integrated into current drought risk assessment. Only a few approaches have verified their vulnerability index with past impact information. We present a probabilistic approach to estimating drought risk, based on the assumption that a system is vulnerable if it was impacted during a certain hazard. Therefore, information on past drought impacts from the European Drought Impact report Inventory (EDII) can serve as a proxy for vulnerability to drought. Multivariable logistic regression is then applied to find non-subjective combinations of drought indices and vulnerability factors that predict the likelihood of drought impact occurrence. The Combined Drought Indicator (CDI) of the European Drought Observatory, the SPI and the SPEI (accumulation periods 1-36) are considered as drought indices; vulnerability factors are gathered from quantitative and qualitative data in statistical databases (e.g. Eurostat, Aquastat). Thus, sector-specific drought risk maps for selected hazard levels were developed for Europe. This work reconsiders the practice of current research philosophies and highlights the importance of detecting vulnerability by its
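
    A minimal sketch of the multivariable logistic-regression step on synthetic data: a drought index and two invented vulnerability factors are regressed against a binary past-impact indicator of the kind recorded in the EDII (all factor names, coefficients and distributions below are assumptions).

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 500
        spei = rng.normal(size=n)               # drought index (negative = dry)
        irrigated = rng.uniform(0, 1, size=n)   # share of irrigated area (assumed factor)
        water_dep = rng.uniform(0, 1, size=n)   # water dependency (assumed factor)
        logit = -1.5 * spei + 2.0 * water_dep - 1.0 * irrigated - 1.0
        impact = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))  # synthetic impact reports

        X = np.column_stack([spei, irrigated, water_dep])
        model = LogisticRegression().fit(X, impact)
        p = model.predict_proba([[-2.0, 0.1, 0.8]])[0, 1]  # severe drought, vulnerable region
        print(f"predicted likelihood of impact occurrence: {p:.2f}")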

  19. Probabilistic mapping of urban flood risk: Application to extreme events in Surat, India

    Science.gov (United States)

    Ramirez, Jorge; Rajasekar, Umamaheshwaran; Coulthard, Tom; Keiler, Margreth

    2016-04-01

    Surat, India is a coastal city that lies on the banks of the river Tapti, downstream of the Ukai dam. Given Surat's geographic location, its population of five million people is repeatedly exposed to flooding caused by high tide combined with large emergency dam releases into the Tapti river. In 2006 such a flood event occurred, when intense rainfall in the Tapti catchment caused a dam release of nearly 25,000 m3 s-1 that flooded 90% of the city. A first step towards strengthening resilience in Surat requires a robust method for mapping potential flood risk that considers the uncertainty in future dam releases. In this study we develop many combinations of dam release magnitude and duration for the Ukai dam. We then use these dam releases to drive a two-dimensional flood model (CAESAR-Lisflood) of Surat that also considers tidal effects. Our flood model of Surat utilizes fine-spatial-resolution (30 m) topography produced from an extensive differential global positioning system survey and measurements of river cross-sections. Within the city we have modelled scenarios that include extreme conditions with near-maximum dam release levels (e.g. the 1-in-250-year flood) and high tides. Results from all scenarios have been summarized into probabilistic flood risk maps for Surat. These maps are currently being integrated into the city disaster management plan to support both mitigation and adaptation measures for different flooding scenarios.
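
    The summarizing step can be sketched as a weighted vote over the scenario ensemble: each simulated inundation grid is thresholded and combined with its scenario probability to give a per-cell flood probability (the grids, weights and threshold below are synthetic assumptions).

        import numpy as np

        rng = np.random.default_rng(1)
        n_scenarios, ny, nx = 20, 50, 50
        depth = rng.gamma(2.0, 0.3, size=(n_scenarios, ny, nx))  # simulated depths (m)
        weights = rng.dirichlet(np.ones(n_scenarios))            # scenario probabilities

        flooded = (depth > 0.5).astype(float)                    # depth threshold (m)
        flood_probability = np.tensordot(weights, flooded, axes=1)  # (ny, nx) map in [0, 1]
        print(flood_probability.shape, float(flood_probability.max()))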

  20. Evaluating probabilistic dengue risk forecasts from a prototype early warning system for Brazil

    Science.gov (United States)

    Lowe, Rachel; Coelho, Caio AS; Barcellos, Christovam; Carvalho, Marilia Sá; Catão, Rafael De Castro; Coelho, Giovanini E; Ramalho, Walter Massa; Bailey, Trevor C; Stephenson, David B; Rodó, Xavier

    2016-01-01

    Recently, a prototype dengue early warning system was developed to produce probabilistic forecasts of dengue risk three months ahead of the 2014 World Cup in Brazil. Here, we evaluate the categorical dengue forecasts across all microregions in Brazil, using dengue cases reported in June 2014 to validate the model. We also compare the forecast model framework to a null model, based on seasonal averages of previously observed dengue incidence. When considering the ability of the two models to predict high dengue risk across Brazil, the forecast model produced more hits and fewer missed events than the null model, with a hit rate of 57% for the forecast model compared to 33% for the null model. This early warning model framework may be useful to public health services, not only ahead of mass gatherings, but also before the peak dengue season each year, to control potentially explosive dengue epidemics. DOI: http://dx.doi.org/10.7554/eLife.11285.001 PMID:26910315
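
    The 57% versus 33% comparison is a standard 2x2 contingency-table verification; a minimal version, with made-up counts for illustration:

        def hit_rate(hits, misses):
            return hits / (hits + misses)

        # hypothetical counts of observed high-dengue-risk microregions
        forecast_model = {"hits": 57, "misses": 43}
        null_model = {"hits": 33, "misses": 67}

        for name, c in (("forecast", forecast_model), ("null", null_model)):
            print(f"{name} model hit rate: {hit_rate(c['hits'], c['misses']):.0%}")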

  1. Developing a probabilistic fire risk model and its application to fire danger systems

    Science.gov (United States)

    Penman, T.; Bradstock, R.; Caccamo, G.; Price, O.

    2012-04-01

    Wildfires can result in significant economic losses where they encounter human assets. Management agencies have large budgets devoted to both prevention and suppression of fires, but little is known about the extent to which these alter the probability of asset loss. Prediction of the risk of asset loss as a result of wildfire requires an understanding of a number of complex processes spanning ignition, fire growth and impact on assets. These processes need to account for the additive or multiplicative effects of management, weather and the natural environment. Traditional analytical methods can examine only a small subset of these. Bayesian Belief Networks (BBNs) provide a methodology for examining complex environmental problems. Outcomes of a BBN are represented as likelihoods, which can then form the basis for risk analysis and management. Here we combine a range of data sources, including simulation models, empirical statistical analyses and expert opinion, to form a fire management BBN. Various management actions have been incorporated into the model, including landscape and interface prescribed burning, initial attack and fire suppression. Performance of the model has been tested against fire history datasets, with strong correlations found. By adapting the BBN presented here, we can develop a spatial and temporal fire danger rating system. Current Australian fire danger rating systems are based on the weather. Our model accounts for existing fires, as well as the risk of new ignitions, combined with probabilistic weather forecasts to identify those areas which are most at risk of asset loss. Fire growth is modelled with consideration given to management prevention efforts, as well as the suppression resources that are available in each geographic locality. At a 10 km resolution the model will provide a probability of asset loss, which represents a significant step forward in the level of information that can be provided to the general public.
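
    A toy fragment of such a network, evaluated by enumeration: the likelihood of asset loss given ignition, weather-dependent suppression success and a loss probability for unsuppressed fires (the structure and all numbers are invented, far simpler than the BBN described above).

        P_IGNITION = 0.2
        P_SEVERE_WEATHER = 0.3
        P_SUPPRESS = {False: 0.9, True: 0.5}   # P(suppression works | severe weather?)
        P_LOSS = 0.6                           # P(asset loss | fire not suppressed)

        p_loss = 0.0
        for severe in (False, True):
            p_w = P_SEVERE_WEATHER if severe else 1 - P_SEVERE_WEATHER
            p_loss += p_w * P_IGNITION * (1 - P_SUPPRESS[severe]) * P_LOSS
        print(f"P(asset loss) = {p_loss:.4f}")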

  2. Use Of Probabilistic Risk Assessment (PRA) In Expert Systems To Advise Nuclear Plant Operators And Managers

    Science.gov (United States)

    Uhrig, Robert E.

    1988-03-01

    The use of expert systems in nuclear power plants to provide advice to managers, supervisors and/or operators is a concept that is rapidly gaining acceptance. Generally, expert systems rely on the expertise of human experts, or on knowledge that has been codified in publications, books, or regulations, to provide advice under a wide variety of conditions. In this work, a probabilistic risk assessment (PRA) of a nuclear power plant performed previously is used to assess the safety status of nuclear power plants and to make recommendations to the plant personnel. Nuclear power plants have many redundant systems and can continue to operate when one or more of these systems is disabled or removed from service for maintenance or testing. PRAs provide a means of evaluating the risk to the public associated with the operation of nuclear power plants with components or systems out of service. While the choice of the "source term" and methodology in a PRA may influence the absolute probability and consequences of a core melt, the ratio of two PRA calculations for two configurations of the same plant, carried out on a consistent basis, can readily identify the increase in risk associated with going from one configuration to the other. PRISIM, a personal computer program to calculate the ratio of core melt probabilities described above (based on previously performed PRAs), has been developed under the sponsorship of the U.S. Nuclear Regulatory Commission (NRC). When one or several components are removed from service, PRISIM calculates the ratio of the core melt probabilities. The inference engine of the expert system then uses this ratio and a constant-risk criterion, along with information from its knowledge base (which includes information from the PRA), to advise plant personnel as to what action, if any, should be taken.
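
    The risk-ratio idea can be sketched with minimal cut sets under the rare-event approximation: force a component out of service (failure probability 1) and take the ratio of the resulting core melt probabilities. The cut sets and probabilities below are invented for illustration.

        CUT_SETS = [("pumpA", "pumpB"), ("pumpA", "valveC"), ("dieselD",)]
        P_FAIL = {"pumpA": 1e-2, "pumpB": 2e-2, "valveC": 5e-3, "dieselD": 1e-4}

        def core_melt_prob(out_of_service=()):
            p = dict(P_FAIL)
            for comp in out_of_service:
                p[comp] = 1.0              # unavailable component: failed with certainty
            # rare-event approximation: sum of minimal cut-set probabilities
            total = 0.0
            for cut_set in CUT_SETS:
                prod = 1.0
                for comp in cut_set:
                    prod *= p[comp]
                total += prod
            return total

        ratio = core_melt_prob(out_of_service=("pumpB",)) / core_melt_prob()
        print(f"risk increases by a factor of {ratio:.0f} with pumpB out of service")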

  3. Uncertainty analysis of EUSES: Improving risk management through probabilistic risk assessment

    NARCIS (Netherlands)

    Jager T; Rikken MGJ; Poel P van der; ECO

    1997-01-01

    In risk assessment of new and existing substances, it is current practice to characterise risk using a deterministic quotient of the exposure concentration, or the dose, and a no-effect level. Feelings of uncertainty are tackled by introducing worst-case assumptions in the methodology. Since this pr

  4. Novel Threat-risk Index Using Probabilistic Risk Assessment and Human Reliability Analysis - Final Report

    Energy Technology Data Exchange (ETDEWEB)

    George A. Beitel

    2004-02-01

    In support of a national need to improve the current state-of-the-art in alerting decision makers to the risk of terrorist attack, a quantitative approach employing scientific and engineering concepts to develop a threat-risk index was undertaken at the Idaho National Engineering and Environmental Laboratory (INEEL). As a result of this effort, a set of models has been successfully integrated into a single comprehensive model known as Quantitative Threat-Risk Index Model (QTRIM), with the capability of computing a quantitative threat-risk index on a system level, as well as for the major components of the system. Such a threat-risk index could provide a quantitative variant or basis for either prioritizing security upgrades or updating the current qualitative national color-coded terrorist threat alert.

  5. Extensive gaps and biases in our knowledge of a well-known fauna: Implications for integrating biological traits into macroecology

    KAUST Repository

    Tyler, Elizabeth

    2011-12-09

    Aim Ecologists seeking to describe patterns at ever larger scales require compilations of data on the global abundance and distribution of species. Comparable compilations of biological data are needed to elucidate the mechanisms behind these patterns, but have received far less attention. We assess the availability of biological data across an entire assemblage: the well-documented demersal marine fauna of the United Kingdom. We also test whether data availability for a species depends on its taxonomic group, maximum body size, the number of times it has been recorded in a global biogeographic database, or its commercial and conservation importance. Location Seas of the United Kingdom. Methods We defined a demersal marine fauna of 973 species from 15 phyla and 40 classes using five extensive surveys around the British Isles. We then quantified the availability of data on eight key biological traits (termed biological knowledge) for each species from online databases. Relationships between biological knowledge and our predictors were tested with generalized linear models. Results Full data on eight fundamental biological traits exist for only 9% (n = 88) of the UK demersal marine fauna, and 20% of species completely lack data. Clear trends in our knowledge exist: fish (median biological knowledge score = six traits) are much better known than invertebrates (one trait). Biological knowledge increases with biogeographic knowledge and (to a lesser extent) with body size, and is greater in species that are commercially exploited or of conservation concern. Main conclusions Our analysis reveals deep ignorance of the basic biology of a well-studied fauna, highlighting the need for far greater efforts to compile biological trait data. Clear biases in our knowledge, relating to how well sampled or 'important' species are, suggest that caution is required in extrapolating small subsets of biologically well-known species to ecosystem-level studies. © 2011 Blackwell

  6. Lunar-based Ultraviolet Telescope study of the well-known Algol-type binary TW Dra

    Science.gov (United States)

    Liao, Wen-Ping; Qian, Sheng-Bang; Zejda, Miloslav; Zhu, Li-Ying; Li, Lin-Jia

    2016-06-01

    Using observations with the Lunar-based Ultraviolet Telescope (LUT) from 2014 December 2 to December 4, the first near-UV light curve of the well-known Algol-type binary TW Dra is reported, which is analyzed with the 2013 version of the W-D code. Our solutions confirm that TW Dra is a semi-detached binary system in which the secondary component fills its Roche lobe. The mass ratio and a high inclination are obtained (q = 0.47, i = 86.68°). Based on 589 available data points spanning more than one century, the complex period changes are studied. A secular increase and three cyclical changes are found in the corresponding orbital period analysis. The secular increase reveals mass transfer from the secondary component to the primary one at a rate of 6.8 × 10-7 M⊙ yr-1. One large cyclical change of 116.04 yr may be caused by the disturbance of the visual component ADS 9706B orbiting TW Dra (ADS 9706A), while the other two cyclical changes, with shorter periods of 22.47 and 37.27 yr, can be explained as the result of two circumbinary companions orbiting TW Dra, the two companions being in a simple 3:5 orbit-rotation resonance. TW Dra itself is a basic binary in a possible sextuple system with the configuration (1 + 1) + (1 + 1) + (1 + 1), which further suggests that multiplicity may be a fairly common phenomenon in close binary systems.

  7. Inaccuracies in the history of a well-known introduction: a case study of the Australian House Sparrow (Passer domesticus)

    Institute of Scientific and Technical Information of China (English)

    Samuel C. Andrew; Simon C. Griffith

    2016-01-01

    Background: Modern ecosystems contain many invasive species as a result of the activity of acclimatisation societies that operated in the second half of the nineteenth century, and these species provide good opportunities for studying invasion biology. However, to gain insight into the ecological and genetic mechanisms that determine the rate of colonization and adaptation to new environments, we need a good understanding of the history of the introduced species, and a knowledge of the source population, timing, and number of individuals introduced is particularly important. However, any inaccuracies in the history of an introduction will affect subsequent assumptions and conclusions. Methods: Focusing on a single well-known species, the House Sparrow (Passer domesticus), we have documented the introduction into Australia using primary sources (e.g. acclimatisation records and newspaper articles). Results: Our revised history differs in a number of significant ways from previous accounts. Our evidence indicates that the House Sparrow was not solely introduced from source populations in England but also from Germany and, most strikingly, also from India, with the latter birds belonging to a different race. We also clarify the distinction between the number released and the number of founders, due to pre-release captive breeding programs, as well as identifying inaccuracies in a couple of well-cited sources with respect to the range expansion of the introduced populations. Conclusions: Our work suggests that caution is required for those studying introductions using the key sources of historical information; researchers should ideally review original sources of information to verify the accuracy of published accounts.

  8. Ship Detection with Spectral Analysis of Synthetic Aperture Radar: A Comparison of New and Well-Known Algorithms

    Directory of Open Access Journals (Sweden)

    Armando Marino

    2015-04-01

    The surveillance of maritime areas with remote sensing is vital for security reasons, as well as for the protection of the environment. Satellite-borne synthetic aperture radar (SAR) offers large-scale surveillance, which is not reliant on solar illumination and is rather independent of weather conditions. The main feature of vessels in SAR images is a higher backscattering compared to the sea background. This peculiarity has led to the development of several ship detectors focused on identifying anomalies in the intensity of SAR images. More recently, different approaches relying on the information kept in the spectrum of a single-look complex (SLC) SAR image were proposed. This paper is focused on two main issues. Firstly, two recently developed sub-look detectors are applied for the first time to ship detection. Secondly, new and well-known ship detection algorithms are compared in order to understand which has the best performance under certain circumstances and whether sub-look analysis improves ship detection. The comparison is done on real SAR data exploiting diversity in frequency and polarization. Specifically, the employed data consist of six RADARSAT-2 fine quad-pol acquisitions over the North Sea, five TerraSAR-X HH/VV dual-polarimetric data-takes, also over the North Sea, and one ALOS-PALSAR quad-polarimetric dataset over Tokyo Bay. Simultaneously with the SAR images, validation data were collected, including the automatic identification system (AIS) positions of ships and wind speeds. The results of the analysis show that the performance of the different sub-look algorithms considered here is strongly dependent on polarization, frequency and resolution. Interestingly, these sub-look detectors are able to outperform the classical SAR intensity detector when the sea state is particularly high, leading to a strong clutter contribution. It was also observed that there are situations where the performance improvement thanks to the sub
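
    A compact sketch of a sub-look coherence detector of the kind compared in the paper: the azimuth spectrum of an SLC image is split into two halves, and pixels where the two sub-looks remain correlated (deterministic, point-like targets such as ships) stand out against the decorrelating sea clutter. The data below are synthetic, and the window size is an assumption.

        import numpy as np
        from scipy.ndimage import uniform_filter

        rng = np.random.default_rng(0)
        slc = rng.normal(size=(128, 128)) + 1j * rng.normal(size=(128, 128))  # sea clutter
        slc[64, 64] += 40.0                             # a bright, point-like "ship"

        spec = np.fft.fft(slc, axis=0)                  # azimuth spectrum
        half = spec.shape[0] // 2
        low, high = spec.copy(), spec.copy()
        low[half:], high[:half] = 0.0, 0.0
        look1, look2 = np.fft.ifft(low, axis=0), np.fft.ifft(high, axis=0)

        def boxcar(x, size=5):                          # local complex average
            return uniform_filter(x.real, size) + 1j * uniform_filter(x.imag, size)

        cross = boxcar(look1 * np.conj(look2))
        power = uniform_filter(np.abs(look1) ** 2, 5) * uniform_filter(np.abs(look2) ** 2, 5)
        coherence = np.abs(cross) / np.sqrt(power + 1e-12)
        print(f"ship: {coherence[64, 64]:.2f}, sea median: {np.median(coherence):.2f}")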

  9. A probabilistic storm surge risk model for the German North Sea and Baltic Sea coast

    Science.gov (United States)

    Grabbert, Jan-Henrik; Reiner, Andreas; Deepen, Jan; Rodda, Harvey; Mai, Stephan; Pfeifer, Dietmar

    2010-05-01

    The German North Sea coast is highly exposed to storm surges. Due to its concave, bay-like shape, mainly orientated to the north-west, cyclones from western, north-western and northern directions, together with the astronomical tide, cause storm surges that accumulate water in the German Bight. Due to the existence of widespread low-lying areas (below 5 m above mean sea level) behind the defenses, large areas containing large economic values are exposed to coastal flooding, including cities like Hamburg and Bremen. The occurrence of extreme storm surges in the past, such as the 1962 event, which took about 300 lives and caused widespread flooding, and the 1976 event, raised awareness and led to a redesign of the coastal defenses, which provide a good level of protection for today's conditions. Nevertheless, the risk of flooding exists. Moreover, an amplification of storm surge risk can be expected under the influence of climate change. The Baltic Sea coast is also exposed to storm surges, which are caused by other meteorological patterns. The influence of the astronomical tide is quite low; instead, high water levels are induced by strong winds alone. Since the exceptionally extreme event of 1872, storm surge hazard has been more or less forgotten. Although such an event is very unlikely to happen, it is not impossible. Storm surge risk is currently (almost) non-insurable in Germany. The potential risk is difficult to quantify, as there are almost no historical losses available. Premiums are also difficult to assess. Therefore, a new storm surge risk model is being developed to provide a basis for a probabilistic quantification of potential losses from coastal inundation. The model is funded by the GDV (German Insurance Association) and is planned to be used within the German insurance sector. Results might be used for a discussion of insurance cover for storm surge. The model consists of a probabilistic, event-driven hazard module and a vulnerability module, together with an exposure interface and a financial

  10. An overview of the evolution of human reliability analysis in the context of probabilistic risk assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Bley, Dennis C. (Buttonwood Consulting Inc., Oakton, VA); Lois, Erasmia (U.S. Nuclear Regulatory Commission, Washington, DC); Kolaczkowski, Alan M. (Science Applications International Corporation, Eugene, OR); Forester, John Alan; Wreathall, John (John Wreathall and Co., Dublin, OH); Cooper, Susan E. (U.S. Nuclear Regulatory Commission, Washington, DC)

    2009-01-01

    Since the Reactor Safety Study in the early 1970s, human reliability analysis (HRA) has been evolving towards a better ability to account for the factors and conditions that can lead humans to take unsafe actions and thereby provide better estimates of the likelihood of human error for probabilistic risk assessments (PRAs). The purpose of this paper is to provide an overview of recent reviews of operational events and advances in the behavioral sciences that have impacted the evolution of HRA methods and contributed to improvements. The paper discusses the importance of human errors in complex human-technical systems, examines why humans contribute to accidents and unsafe conditions, and discusses how lessons learned over the years have changed the perspective and approach for modeling human behavior in PRAs of complicated domains such as nuclear power plants. It is argued that it has become increasingly more important to understand and model the more cognitive aspects of human performance and to address the broader range of factors that have been shown to influence human performance in complex domains. The paper concludes by addressing the current ability of HRA to adequately predict human failure events and their likelihood.

  11. Flood Risk and Probabilistic Benefit Assessment to Support Management of Flood-Prone Lands: Evidence From Candaba Floodplains, Philippines

    Science.gov (United States)

    Juarez, A. M.; Kibler, K. M.; Sayama, T.; Ohara, M.

    2016-12-01

    Flood management decision-making is often supported by risk assessment, which may overlook the role of coping capacity and the potential benefits derived from direct use of flood-prone land. Alternatively, risk-benefit analysis can support floodplain management to yield maximum socio-ecological benefits for the minimum flood risk. We evaluate flood risk-probabilistic benefit tradeoffs of livelihood practices compatible with direct human use of flood-prone land (agriculture/wild fisheries) and nature conservation (wild fisheries only) in Candaba, Philippines. Located north-west of Metro Manila, the Candaba area is a multi-functional landscape that provides a temporally variable mix of possible land uses, benefits and ecosystem services of local and regional value. To characterize inundation from 1.3- to 100-year recurrence intervals we couple frequency analysis with rainfall-runoff-inundation modelling and remotely sensed data. By combining simulated probabilistic floods with both damage and benefit functions (e.g. fish capture and rice yield with flood intensity) we estimate potential damages and benefits over varying probabilistic flood hazards. We find that although direct human uses of flood-prone land are associated with damages, for all the investigated magnitudes of flood events with different frequencies the probabilistic benefits ($91 million) exceed risks ($33 million) by a large margin. Even considering risk, the probabilistic livelihood benefits of direct human uses far exceed the benefits provided by scenarios that exclude direct "risky" human uses (a difference of $85 million). In addition, we find that individual coping strategies, such as adapting crop planting periods to the flood pulse or fishing rather than cultivating rice in the wet season, minimize flood losses ($6 million) while allowing for valuable livelihood benefits ($125 million) in flood-prone land. Analysis of societal benefits and local capacities to cope with regular floods demonstrates the
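
    The risk-benefit comparison can be sketched as expected annual values: integrate the damage and benefit curves over annual exceedance probability with the trapezoidal rule (the curves below are illustrative, not the Candaba estimates).

        return_periods = [1.3, 2, 5, 10, 25, 50, 100]     # years, frequent to rare
        damage = [0.1, 0.5, 2.0, 5.0, 12.0, 20.0, 33.0]   # million USD per event size
        benefit = [9.0, 9.0, 8.5, 8.0, 7.0, 6.0, 5.0]     # million USD per event size
        p = [1.0 / t for t in return_periods]             # decreasing exceedance probability

        def expected_annual(curve, probs):
            """Trapezoidal integral of a curve over exceedance probability."""
            return sum((probs[i] - probs[i + 1]) * (curve[i] + curve[i + 1]) / 2
                       for i in range(len(probs) - 1))

        print(f"expected annual damage  ~ {expected_annual(damage, p):.1f} M$")
        print(f"expected annual benefit ~ {expected_annual(benefit, p):.1f} M$")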

  12. The Integrated Medical Model: A Probabilistic Simulation Model Predicting In-Flight Medical Risks

    Science.gov (United States)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G., Jr.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting
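
    A toy Monte Carlo in the spirit of the IMM: medical events are sampled per condition from incidence rates over the mission, tallying lost crew time and whether any event triggers an evacuation. The three conditions and all rates below are invented, far simpler than the hundred-condition iMED inputs.

        import math
        import random

        CONDITIONS = {  # name: (events per person-year, mean days lost, P(evac per event))
            "dental crown loss": (0.05, 1.0, 0.00),
            "kidney stone":      (0.01, 3.0, 0.30),
            "minor injury":      (0.30, 0.5, 0.00),
        }

        def poisson(lam):
            """Knuth's algorithm for a Poisson sample."""
            if lam <= 0.0:
                return 0
            threshold, k, p = math.exp(-lam), 0, 1.0
            while p > threshold:
                k += 1
                p *= random.random()
            return k - 1

        def simulate(crew=4, mission_days=180, runs=10_000):
            person_years = crew * mission_days / 365.0
            evac_runs, days_lost = 0, 0.0
            for _ in range(runs):
                evacuated = False
                for rate, lost, p_evac in CONDITIONS.values():
                    for _ in range(poisson(rate * person_years)):
                        days_lost += lost
                        evacuated = evacuated or random.random() < p_evac
                evac_runs += evacuated
            return evac_runs / runs, days_lost / runs

        p_evac, mean_lost = simulate()
        print(f"P(evacuation) ~ {p_evac:.2%}, mean crew-days lost ~ {mean_lost:.2f}")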

  14. Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems

    Energy Technology Data Exchange (ETDEWEB)

    Greg Thoma; John Veil; Fred Limp; Jackson Cothren; Bruce Gorham; Malcolm Williamson; Peter Smith; Bob Sullivan

    2009-05-31

    This report describes work performed during the initial period of the project 'Probabilistic Risk Based Decision Support for Oil and Gas Exploration and Production Facilities in Sensitive Ecosystems.' The region within the scope of this study is the Fayetteville Shale Play, an unconventional, tight-formation natural gas play that currently has approximately 1.5 million acres under lease, primarily to Southwestern Energy Incorporated and Chesapeake Energy Incorporated. The currently active play encompasses a region from approximately Fort Smith, AR, east to Little Rock, AR, and is approximately 50 miles wide from north to south. The initial estimates for this field put it almost on par with the Barnett Shale play in Texas. It is anticipated that thousands of wells will be drilled during the next several years; this will entail installation of a massive support infrastructure of roads and pipelines, as well as drilling fluid disposal pits and infrastructure to handle millions of gallons of fracturing fluids. This project focuses on gas production in Arkansas as the test bed for application of a proactive risk management decision support system for natural gas exploration and production. The activities covered in this report include meetings with representative stakeholders, development of initial content and design for an educational web site, and development and preliminary testing of an interactive mapping utility designed to provide users with information that will allow avoidance of sensitive areas during the development of the Fayetteville Shale Play. These tools have been presented to both regulatory and industrial stakeholder groups, and their feedback has been incorporated into the project.

  15. A Methodology for the Development of a Reliability Database for an Advanced Reactor Probabilistic Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    2016-06-26

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of a reliability database (RDB) methodology to determine applicable reliability data for inclusion in the quantification of the PRA. The RDB method developed during this project seeks to satisfy the requirements of the Data Analysis element of the ASME/ANS Non-LWR PRA standard. The RDB methodology utilizes a relevancy test to examine reliability data and determine whether it is appropriate to include as part of the reliability database for the PRA. The relevancy test compares three component properties to establish the level of similarity to components examined as part of the PRA: the component function, the component failure modes, and the environment/boundary conditions of the component. The relevancy test is used to gauge the quality of data found in a variety of sources, such as advanced reactor-specific databases, non-advanced reactor nuclear databases, and non-nuclear databases. The RDB also establishes the integration of expert judgment or separate reliability analysis with past reliability data. This paper provides details on the RDB methodology and includes an example application for determining the reliability of the intermediate heat exchanger of a sodium fast reactor. The example explores a variety of reliability data sources and assesses their applicability for the PRA of interest through the use of the relevancy test.
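
    A minimal sketch of the three-property relevancy test: a candidate data source is scored against the PRA component on function, failure modes and environment/boundary conditions. The equal weighting and the example records are assumptions, not the actual RDB rules.

        def relevancy(candidate, target):
            """Score a candidate reliability-data source from 0 (irrelevant) to 1."""
            score = float(candidate["function"] == target["function"])
            shared = set(candidate["failure_modes"]) & set(target["failure_modes"])
            score += len(shared) / max(len(target["failure_modes"]), 1)
            score += float(candidate["environment"] == target["environment"])
            return score / 3.0

        ihx_target = {"function": "heat transfer",
                      "failure_modes": ["tube leak", "plugging"],
                      "environment": "sodium"}
        lwr_source = {"function": "heat transfer",
                      "failure_modes": ["tube leak"],
                      "environment": "water"}
        print(f"relevancy = {relevancy(lwr_source, ihx_target):.2f}")  # 0.50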

  16. Jerome Lewis Duggan: A Nuclear Physicist and a Well-Known, Six-Decade Accelerator Application Conference (CAARI) Organizer

    Science.gov (United States)

    Del McDaniel, Floyd; Doyle, Barney L.

    Jerry Duggan was an experimental MeV-accelerator-based nuclear and atomic physicist who, over the past few decades, played a key role in the important transition of this field from basic to applied physics. His fascination for and application of particle accelerators spanned almost 60 years, and led to important discoveries in the following fields: accelerator-based analysis (accelerator mass spectrometry, ion beam techniques, nuclear-based analysis, nuclear microprobes, neutron techniques); accelerator facilities, stewardship, and technology development; accelerator applications (industrial, medical, security and defense, and teaching with accelerators); applied research with accelerators (advanced synthesis and modification, radiation effects, nanosciences and technology); physics research (atomic and molecular physics, and nuclear physics); and many other areas and applications. Here we describe Jerry’s physics education at the University of North Texas (B. S. and M. S.) and Louisiana State University (Ph.D.). We also discuss his research at UNT, LSU, and Oak Ridge National Laboratory, his involvement with the industrial aspects of accelerators, and his impact on many graduate students, colleagues at UNT and other universities, national laboratories, and industry and acquaintances around the world. Along the way, we found it hard not to also talk about his love of family, sports, fishing, and other recreational activities. While these were significant accomplishments in his life, Jerry will be most remembered for his insight in starting and his industry in maintaining and growing what became one of the most diverse accelerator conferences in the world — the International Conference on the Application of Accelerators in Research and Industry, or what we all know as CAARI. Through this conference, which he ran almost single-handed for decades, Jerry came to know, and became well known by, literally thousands of atomic and nuclear physicists, accelerator

  17. Can exposure limitations for well-known contact allergens be simplified? An analysis of dose-response patch test data

    DEFF Research Database (Denmark)

    Neergaard, Louise Arup; Menné, Torkil; Voelund, Aage;

    2011-01-01

    Allergic contact dermatitis is triggered by chemicals in the environment. Primary prevention is aimed at minimizing the risk of induction, whereas secondary and tertiary prevention are aimed at reducing elicitation....

  18. A probabilistic modeling approach to assess human inhalation exposure risks to airborne aflatoxin B1 (AFB1)

    Science.gov (United States)

    Liao, Chung-Min; Chen, Szu-Chieh

    To assess human lung exposure to airborne aflatoxin B1 (AFB1) during on-farm activities including swine feeding, storage bin cleaning, corn harvest, and grain elevator loading/unloading, we present a probabilistic risk model appraised with empirical data. The model integrates probabilistic exposure profiles from a compartmental lung model with dose-response relationships reconstructed from an empirical three-parameter Hill equation model, describing AFB1 cytotoxicity (inhibition response) in human bronchial epithelial cells, to quantitatively estimate the inhalation exposure risks. The risk assessment results indicate that exposure to airborne AFB1 poses no significant risk during corn harvest and grain elevator loading/unloading, yet a relatively high risk during swine feeding and storage bin cleaning. Applying a joint probability function method based on exceedance profiles, we estimate an alarmingly high potential risk during swine feeding activity for the bronchial region (inhibition = 56.69%, 95% confidence interval (CI): 35.05-72.87%) and the bronchiolar region (inhibition = 44.93%, 95% CI: 21.61-66.78%). The parameterized predictive model should encourage a risk-management framework for discussion of carcinogenic risk in occupational settings where inhalation of AFB1-contaminated dust occurs.
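
    The three-parameter Hill model named above has the form inhibition = I_max * C^n / (C50^n + C^n); a sketch with placeholder parameters (not the fitted bronchial-cell values):

        def hill_inhibition(conc, i_max=100.0, c50=50.0, n=1.5):
            """Percent inhibition at airborne concentration `conc` (assumed units)."""
            return i_max * conc ** n / (c50 ** n + conc ** n)

        for c in (10, 60, 200):  # illustrative AFB1 exposure concentrations
            print(f"C = {c:3}: inhibition = {hill_inhibition(c):5.1f}%")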

  19. Probabilistic integrated risk assessment of human exposure risk to environmental bisphenol A pollution sources.

    Science.gov (United States)

    Fu, Keng-Yen; Cheng, Yi-Hsien; Chio, Chia-Pin; Liao, Chung-Min

    2016-10-01

    Environmental bisphenol A (BPA) exposure has been linked to a variety of adverse health effects such as developmental and reproductive issues. However, establishing a clear association between BPA exposure and the likelihood of adverse human health effects is complex and fundamentally uncertain. The purpose of this study was to assess the potential exposure risks from environmental BPA among the Chinese population based on five human health outcomes, namely immune response, uterotrophic assay, cardiovascular disease (CVD), diabetes, and behavior change. We addressed these health concerns by using a stochastic integrated risk assessment approach. The BPA dose-dependent likelihood of effects was reconstructed by a series of Hill models based on animal models or epidemiological data. We developed a physiologically based pharmacokinetic (PBPK) model that allows estimation of urinary BPA concentration from external exposures. We show that the daily average exposure concentrations of BPA and the urinary BPA estimates were consistent with published data. We found that BPA exposures were unlikely to pose significant risks for infants (0-1 year) and adults (male and female >20 years), considering long-term BPA susceptibility and multiple exposure pathways; these findings may inform the public of the negligible magnitude of environmental BPA pollution impacts on human health.

  20. Application of Probabilistic Modeling to Quantify the Reduction Levels of Hepatocellular Carcinoma Risk Attributable to Chronic Aflatoxins Exposure

    DEFF Research Database (Denmark)

    Wambui, Joseph M.; Karuri, Edward G.; Ojiambo, Julia A.

    2017-01-01

    Epidemiological studies show a definite connection between areas of high aflatoxin content and a high occurrence of human hepatocellular carcinoma (HCC). Hepatitis B virus in individuals further increases the risk of HCC. The two risk factors are prevalent in rural Kenya and continuously predispose the rural populations to HCC. A quantitative cancer risk assessment therefore quantified the levels at which potential pre- and postharvest interventions reduce the HCC risk attributable to consumption of contaminated maize and groundnuts. The assessment applied a probabilistic model to derive probability distributions of HCC cases and percentage reduction levels of the risk from secondary data. Contaminated maize and groundnuts contributed 1,847 +/- 514 and 158 +/- 52 HCC cases per annum, respectively. The total contribution of both foods to the risk was additive, as it resulted in 2,000 +/- 518 cases per...

  1. Probabilistic Risk Assessment of Cancer from Exposure Inorganic Arsenic in Duplicate Food by Villagers in Ronphibun, Thailand

    Directory of Open Access Journals (Sweden)

    Piyawat Saipan

    2010-07-01

    Ronphibun is a district in Nakorn Si Thammarat province, within southern Thailand. The district is the site of several former tin mines that were in operation 100 years ago, and arsenic contamination caused by past mining activities remains in the area. This study was conducted to assess cancer risk in people living within Ronphibun district from exposure to inorganic arsenic via duplicate food, using probabilistic risk assessment. One hundred and fifty duplicate food samples were collected from participants. Inorganic arsenic concentrations were determined by hydride generation atomic absorption spectrometry and ranged from 0.16 to 0.42 μg/g dry weight in the duplicate food. The probabilistic carcinogenic risk levels were 6.76 x 10-4 and 1.74 x 10-3 at the 50th and 95th percentiles, respectively. Risk values for people in Ronphibun from exposure to inorganic arsenic remained higher than the acceptable target risk. Sensitivity analysis indicated that exposure duration and concentrations of arsenic in food were the two most influential factors in the cancer risk estimates.
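
    A sketch of the probabilistic calculation behind such percentile risk estimates: Monte Carlo sampling through the standard intake equation risk = (C x IR x EF x ED x SF) / (BW x AT). The concentration range is taken from the study; all other distributions and values are assumptions for illustration.

        import random

        def one_sample():
            C = random.uniform(0.16, 0.42)  # inorganic As in food (ug/g dry wt, study range)
            IR = random.gauss(300, 50)      # food intake rate (g/day, assumed)
            EF, ED = 350, 30                # exposure frequency (d/yr) and duration (yr)
            BW = random.gauss(60, 8)        # body weight (kg, assumed)
            AT = 70 * 365                   # averaging time (days)
            SF = 1.5                        # oral slope factor for arsenic, (mg/kg-day)^-1
            dose = C * IR * EF * ED / (BW * AT) / 1000.0  # mg/kg-day (ug -> mg)
            return dose * SF

        risks = sorted(one_sample() for _ in range(50_000))
        print(f"50th percentile risk: {risks[len(risks) // 2]:.2e}")
        print(f"95th percentile risk: {risks[int(0.95 * len(risks))]:.2e}")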

  2. Development of transient initiating event frequencies for use in probabilistic risk assessments

    Energy Technology Data Exchange (ETDEWEB)

    Mackowiak, D.P.; Gentillon, C.D.; Smith, K.L.

    1985-05-01

    Transient initiating event frequencies are an essential input to the analysis process of a nuclear power plant probabilistic risk assessment. These frequencies describe events causing or requiring scrams. This report documents an effort to validate and update from other sources a computer-based data file developed by the Electric Power Research Institute (EPRI) describing such events at 52 United States commercial nuclear power plants. Operating information from the United States Nuclear Regulatory Commission on 24 additional plants from their date of commercial operation has been combined with the EPRI data, and the entire data base has been updated to add 1980 through 1983 events for all 76 plants. The validity of the EPRI data and data analysis methodology and the adequacy of the EPRI transient categories are examined. New transient initiating event frequencies are derived from the expanded data base using the EPRI transient categories and data display methods. Upper bounds for these frequencies are also provided. Additional analyses explore changes in the dominant transients, changes in transient outage times and their impact on plant operation, and the effects of power level and scheduled scrams on transient event frequencies. A more rigorous data analysis methodology is developed to encourage further refinement of the transient initiating event frequencies derived herein. Updating the transient event data base resulted in approx. 2400 events being added to EPRI's approx. 3000-event data file. The resulting frequency estimates were in most cases lower than those reported by EPRI, but no significant order-of-magnitude changes were noted. The average number of transients per year for the combined data base is 8.5 for pressurized water reactors and 7.4 for boiling water reactors.
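
    The frequency estimate and its upper bound can be sketched with the standard Poisson/chi-square relation: for n events in T reactor-years, the point estimate is n/T and a one-sided 95% upper bound is chi2(0.95, 2(n+1)) / 2T. The counts below are illustrative, not taken from the data base.

        from scipy.stats import chi2

        def transient_frequency(n_events, reactor_years, conf=0.95):
            mle = n_events / reactor_years
            upper = chi2.ppf(conf, 2 * (n_events + 1)) / (2 * reactor_years)
            return mle, upper

        mle, ub = transient_frequency(n_events=85, reactor_years=10.0)
        print(f"frequency = {mle:.1f}/yr, 95% upper bound = {ub:.1f}/yr")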

  3. Seamless Level 2/Level 3 probabilistic risk assessment using dynamic event tree analysis

    Science.gov (United States)

    Osborn, Douglas Matthew

    The current approach to Level 2 and Level 3 probabilistic risk assessment (PRA) using the conventional event-tree/fault-tree methodology requires pre-specification of event order occurrence, which may vary significantly in the presence of uncertainties. Manual preparation of input data to evaluate the possible scenarios arising from these uncertainties may also lead to errors from faulty/incomplete input preparation, and their execution using serial runs may lead to computational challenges. A methodology has been developed for Level 2 analysis using dynamic event trees (DETs) that removes these limitations with systematic and mechanized quantification of the impact of aleatory uncertainties on possible consequences and their likelihoods. The methodology is implemented using the Analysis of Dynamic Accident Progression Trees (ADAPT) software. For the purposes of this work, aleatory uncertainties are defined as those arising from the stochastic nature of the processes under consideration, such as the variability of weather, in which the probability of weather patterns is predictable but the conditions at the time of the accident are a matter of chance. Epistemic uncertainties are regarded as those arising from the uncertainty in the model (system code) input parameters (e.g., friction or heat transfer correlation parameters). This work conducts a seamless Level 2/3 PRA using a DET analysis. The research helps to quantify and potentially reduce the magnitude of the source term uncertainty currently experienced in Level 3 PRA. Current techniques have been demonstrated with aleatory uncertainties for environmental releases of radioactive materials. This research incorporates epistemic and aleatory uncertainties in a phenomenologically consistent manner through the use of DETs. The DETs were determined using the ADAPT framework, linking ADAPT with MELCOR, MELMACCS, and the MELCOR Accident Consequence Code System, Version 2. Aleatory and epistemic uncertainties incorporated

  4. Probabilistic migration modelling focused on functional barrier efficiency and low migration concepts in support of risk assessment.

    Science.gov (United States)

    Brandsch, Rainer

    2017-10-01

    Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters like diffusion and partition coefficients related to individual materials. In most cases, mass-transfer parameters are not readily available from the literature and for this reason are estimated with a given uncertainty. Historically, uncertainty was accounted for by introducing upper-limit concepts, which turned out to be of limited applicability because they highly overestimate migration. Probabilistic migration modelling makes it possible to consider the uncertainty of the mass-transfer parameters as well as of other model inputs. With respect to a functional barrier, the most important parameters, among others, are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and is capable of applying Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (diffusion coefficient and layer thickness), predicts migration results with related uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented through three case studies: (1) sensitivity analysis, (2) functional barrier efficiency and (3) validation by experimental testing. Based on the migration predicted by probabilistic migration modelling and related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts becomes possible, and the associated migration risk and potential safety concerns can be identified at an early stage of packaging development. Furthermore, dedicated selection of materials exhibiting the required functional barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.
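
    A minimal sketch of the Monte Carlo step for a functional barrier, using the classical lag-time approximation t_lag = L^2 / (6 D) as a stand-in mass-transfer model: the barrier "holds" if the sampled lag time exceeds the shelf life. Both input distributions are assumptions for illustration.

        import random

        def barrier_holds(shelf_life_days=365, runs=100_000):
            ok = 0
            for _ in range(runs):
                d = 10.0 ** random.gauss(-13.0, 0.5)   # diffusion coefficient (cm^2/s)
                thickness = random.gauss(20e-4, 3e-4)  # barrier layer (cm), ~20 um
                t_lag = thickness ** 2 / (6.0 * d)     # lag time (s)
                ok += t_lag / 86_400.0 > shelf_life_days
            return ok / runs

        print(f"P(functional barrier effective over shelf life) ~ {barrier_holds():.1%}")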

  5. Probabilistic risk analysis in manufacturing situational operation: application of modelling techniques and causal structure to improve safety performance.

    Directory of Open Access Journals (Sweden)

    Jose Cristiano Pereira

    2015-01-01

    The use of probabilistic risk analysis in the jet engine manufacturing process is essential to prevent failure. The objective of this study is to present a probabilistic risk analysis model to analyze the safety of this process. The standard risk assessment normally conducted is inadequate to address these risks. To remedy this problem, the model presented in this paper considers the effects of human, software and calibration reliability in the process. A Bayesian Belief Network coupled to a Bow Tie diagram is used to identify potential engine failure scenarios. In this context, and to meet this objective, an in-depth literature search was conducted to identify the most appropriate modeling techniques, and interviews were conducted with experts. As a result of this study, this paper presents a model that combines fault tree analysis, event tree analysis and Bayesian Belief Networks into a single model that decision makers can use to identify critical risk factors in order to allocate resources to improve the safety of the system. The model is delivered in the form of a computer-assisted decision tool supported by subject-matter expert estimates.

  6. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    Science.gov (United States)

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5-7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992-2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this restraint, vehicle, and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
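
    The combination step can be sketched as follows: per-rib fracture probabilities (as produced by comparing local strains with an age-adjusted ultimate-strain distribution) are Monte Carlo sampled, and AIS3+ is scored here as three or more fractured ribs. The per-rib probabilities below are invented.

        import random

        p_fracture = [0.02, 0.05, 0.30, 0.40, 0.35, 0.20, 0.10, 0.04]  # per rib

        def p_ais3plus(runs=100_000):
            hits = 0
            for _ in range(runs):
                n_fx = sum(random.random() < p for p in p_fracture)
                hits += n_fx >= 3
            return hits / runs

        print(f"P(AIS3+, scored as >=3 rib fractures) ~ {p_ais3plus():.1%}")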

  7. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    Science.gov (United States)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan de Perez, Erin; Cloke, Hannah Louise; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk Jan; Pappenberger, Florian

    2016-08-01

    Probabilistic hydro-meteorological forecasts have over the last decades been used more frequently to communicate forecast uncertainty. This uncertainty is twofold, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic over deterministic forecasts across the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty in transforming the probability of occurrence of an event into a binary decision. This paper presents the results of a risk-based decision-making game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?". The game was played at several workshops in 2015, which were attended by operational forecasters and academics working in the field of hydro-meteorology. The aim of this game was to better understand the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game show that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers.

  8. Potential for accidents in a nuclear power plant: probabilistic risk assessment, applied statistical decision theory, and implications of such considerations to mathematics education

    Energy Technology Data Exchange (ETDEWEB)

    Dios, R.A.

    1984-01-01

    This dissertation focuses on the field of probabilistic risk assessment and traces its development in nuclear engineering. To provide background, the related areas of population dynamics (demography), epidemiology and actuarial science are studied by presenting information on how risk has been viewed in these areas over the years. A second major problem involves presenting an overview of the mathematical models related to risk analysis to mathematics educators and making recommendations for presenting this theory in classes of probability and statistics for mathematics and engineering majors at the undergraduate and graduate levels.

  9. Improving depiction of benefits and harms: analyses of studies of well-known therapeutics and review of high-impact medical journals.

    Science.gov (United States)

    Sedrakyan, Artyom; Shih, Chuck

    2007-10-01

    The issues of weighing benefits and harms and of shared decision-making have become increasingly important in recent years. There is limited knowledge, and a lack of adequate data, on the most transparent method of communicating this information. In this article we discuss examples of communicating benefits and harms for well-known therapeutics, illustrating that relative risk estimates are not helpful for communicating the chance of experiencing adverse events. In addition, we show that asymmetric presentation of the data for benefits and harms is likely to bias toward showing greater benefits while diminishing the importance of the harms (or vice versa). We also present preliminary results of a brief review of high-impact medical journals that shows the limitations of current systematic reviews. In the review we found that every second published study does not discuss frequency data, and that 1 in 3 studies reporting information on both benefits and harms does not report the information in the same metric. We conclude that consistently depicting benefit and harm information in frequencies can substantially improve the communication of benefits and harms. Investigators should be requested to provide frequency data along with relative risk information in the publication of their scientific findings. Currently, even in the highest-impact medical journals, evidence of benefits and harms is not consistently presented in ways that facilitate accurate interpretation.

  10. Invited commentary: multilevel analysis of individual heterogeneity-a fundamental critique of the current probabilistic risk factor epidemiology.

    Science.gov (United States)

    Merlo, Juan

    2014-07-15

    In this issue of the Journal, Dundas et al. (Am J Epidemiol. 2014;180(2):197-207) apply a hitherto infrequent multilevel analytical approach: multiple membership multiple classification (MMMC) models. Specifically, by adopting a life-course approach, they use a multilevel regression with individuals cross-classified in different contexts (i.e., families, early schools, and neighborhoods) to investigate self-reported health and mental health in adulthood. They provide observational evidence suggesting the relevance of the early family environment for launching public health interventions in childhood in order to improve health in adulthood. In their analyses, the authors distinguish between specific contextual measures (i.e., the association between particular contextual characteristics and individual health) and general contextual measures (i.e., the share of the total interindividual heterogeneity in health that appears at each level). By doing so, they implicitly question traditional probabilistic risk factor epidemiology, including classical "neighborhood effects" studies, which use simple hierarchical structures and disregard the analysis of general contextual measures. The innovative MMMC approach properly responds to the call for a multilevel eco-epidemiology as an alternative to the widespread probabilistic risk factor epidemiology. Risk factor epidemiology is not confined to individual-level analyses; it also embraces many current "multilevel analyses" that focus exclusively on contextual risk factors.

  11. Using Probabilistic-Risky Programming Models in Identifying Optimized Pattern of Cultivation under Risk Conditions (Case Study: Shoshtar Region)

    Directory of Open Access Journals (Sweden)

    Mohammad Kavoosi Kelashemi

    2011-03-01

    Full Text Available Using the Telser and Kataoka models of probabilistic-risky mathematical programming, the present research determines the optimized pattern of cultivating the agricultural products of the Shoshtar region under risky conditions. In order to incorporate risk in the models, the time period of the agricultural years 1996–1997 to 2004–2005 was taken into account. Results from the Telser and Kataoka models showed that, for the accepted risk levels, most of the optimized solutions suggest tomato cultivation during the fall cultivation period and watermelon cultivation during the spring cultivation period. On the basis of the results, by allocating the agricultural lands of Shoshtar to tomato and watermelon cultivation and specializing the farming activity of the province, the gross profit of the agricultural production system can be increased to 6116047000 and 727782272 thousand Rials, respectively. The results of the models under study were investigated under different income scenarios and probabilistic levels of risk acceptance. Correct policy-making that provides suitable instruments for mitigating the effects of uncertainty and adverse climatic conditions in the production of agricultural products would improve the livelihood of the farmers of the Shoshtar region.

  12. Development and application of the dynamic system doctor to nuclear reactor probabilistic risk assessments.

    Energy Technology Data Exchange (ETDEWEB)

    Kunsman, David Marvin; Aldemir, Tunc (Ohio State University); Rutt, Benjamin (Ohio State University); Metzroth, Kyle (Ohio State University); Catalyurek, Umit (Ohio State University); Denning, Richard (Ohio State University); Hakobyan, Aram (Ohio State University); Dunagan, Sean C.

    2008-05-01

    This LDRD project has produced a tool that makes probabilistic risk assessments (PRAs) of nuclear reactors - analyses which are very resource intensive - more efficient. PRAs of nuclear reactors are being increasingly relied on by the United States Nuclear Regulatory Commission (U.S.N.R.C.) for licensing decisions for current and advanced reactors. Yet, PRAs are produced much as they were 20 years ago. The work here applied a modern systems analysis technique to the accident progression analysis portion of the PRA; the technique was a system-independent multi-task computer driver routine. Initially, the objective of the work was to fuse the accident progression event tree (APET) portion of a PRA to the dynamic system doctor (DSD) created by Ohio State University. Instead, during the initial efforts, it was found that the DSD could be linked directly to a detailed accident progression phenomenological simulation code - the type on which APET construction and analysis relies, albeit indirectly - and thereby directly create and analyze the APET. The expanded DSD computational architecture and infrastructure that was created during this effort is called ADAPT (Analysis of Dynamic Accident Progression Trees). ADAPT is a system software infrastructure that supports execution and analysis of multiple dynamic event-tree simulations on distributed environments. A simulator abstraction layer was developed, and a generic driver was implemented for executing simulators on a distributed environment. As a demonstration of the use of the methodological tool, ADAPT was applied to quantify the likelihood of competing accident progression pathways occurring for a particular accident scenario in a particular reactor type using MELCOR, an integrated severe accident analysis code developed at Sandia. (ADAPT was intentionally created with flexibility, however, and is not limited to interacting with only one code. With minor coding changes to input files, ADAPT can be linked to other simulation codes.)

  13. A Strategy to Integrate Probabilistic Risk Assessment into Design and Development Processes for Aerospace Based Upon Mars Exploration Rover Experiences

    Science.gov (United States)

    Nunes, Jeffery; Paulos, Todd; Everline, Chester J.; Dezfuli, Homayoon

    2006-01-01

    This paper will discuss the Probabilistic Risk Assessment (PRA) effort and its involvement with related activities during the development of the Mars Exploration Rover (MER). The Rovers were launched 2003.June.10 (Spirit) and 2003.July.7 (Opportunity), and both have proven very successful. Although designed for a 90-day mission, the Rovers have been operating for over two earth years. This paper will review aspects of how the MER project integrated PRA into the design and development process. A companion paper (Development of the Mars Exploration Rover PRA) will describe the MER PRA and design changes from those results.

  14. On the use of hierarchical probabilistic models for characterizing and managing uncertainty in risk/safety assessment.

    Science.gov (United States)

    Kodell, Ralph L; Chen, James J

    2007-04-01

    A general probabilistically-based approach is proposed for both cancer and noncancer risk/safety assessments. The familiar framework of the original ADI/RfD formulation is used, substituting in the numerator a benchmark dose derived from a hierarchical pharmacokinetic/pharmacodynamic model and in the denominator a unitary uncertainty factor derived from a hierarchical animal/average human/sensitive human model. The empirical probability distributions of the numerator and denominator can be combined to produce an empirical human-equivalent distribution for an animal-derived benchmark dose in external-exposure units.
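
    The numerator/denominator construction lends itself to a direct Monte Carlo sketch. In the example below, both distributions are invented placeholders for the hierarchical model outputs described above; the 5th percentile of the resulting ratio is one natural probabilistic analogue of an RfD-type value.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Hypothetical empirical distributions (placeholders for the hierarchical
# model outputs described in the abstract):
# benchmark dose [mg/kg-day] from the PK/PD model ...
bmd = rng.lognormal(mean=np.log(10.0), sigma=0.4, size=n)
# ... and a unitary uncertainty factor from the animal / average-human /
# sensitive-human hierarchy.
uf = rng.lognormal(mean=np.log(30.0), sigma=0.5, size=n)

# Human-equivalent dose distribution: numerator over denominator.
hed = bmd / uf

print(f"median HED         : {np.median(hed):.3g} mg/kg-day")
print(f"5th percentile HED : {np.percentile(hed, 5):.3g} mg/kg-day")
```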

  15. Health risk assessment of heavy metals through the consumption of food crops fertilized by biosolids: A probabilistic-based analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini Koupaie, E., E-mail: ehssan.hosseini.k@gmail.com; Eskicioglu, C., E-mail: cigdem.eskicioglu@ubc.ca

    2015-12-30

    Highlights: • No potential health risk from land application of the regional biosolids. • More realistic risk assessment via a probabilistic approach than a deterministic one. • Total hazard index increases with increasing fertilizer land-application rate. • Significant effect of long-term biosolids land application on the hazard index. • Greater contribution of rice ingestion than vegetable ingestion to the hazard index. - Abstract: The objective of this study was to perform a probabilistic risk analysis (PRA) to assess the health risk of cadmium (Cd), copper (Cu), and zinc (Zn) through the consumption of food crops grown on farm lands fertilized by biosolids. The risk analysis was conducted using 8 years of historical heavy metal data (2005–2013) for the municipal biosolids generated by a nearby treatment facility, considering one-time and long-term biosolids land-application scenarios for fertilizer application rates of 5–100 t/ha. The 95th percentile of the hazard index (HI) increased from 0.124 to 0.179 when the rate of fertilizer application increased from 5 to 100 t/ha for one-time biosolids land application. The HI for long-term biosolids land application was also found to be 1.3 and 1.9 times greater than that of one-time application at fertilizer application rates of 5 and 100 t/ha, respectively. Rice ingestion contributed more to the HI than vegetable ingestion. Cd and Cu were found to contribute most to the health risks associated with vegetable and rice ingestion, respectively. Results indicated no potential risk to human health, even for the long-term biosolids land-application scenario at a 100 t/ha fertilizer application rate.
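
    A minimal sketch of the hazard-index calculation is shown below. All distributions are illustrative assumptions, and the reference doses are indicative oral RfD-type values rather than the study's inputs; only the pattern of summing per-metal hazard quotients into an HI and reading off the 95th percentile follows the abstract.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

# Illustrative inputs (not the study's data): metal concentrations in food
# crops [mg/kg], daily intake [kg/day], body weight [kg].
conc = {"Cd": rng.lognormal(np.log(0.05), 0.5, n),
        "Cu": rng.lognormal(np.log(3.0), 0.4, n),
        "Zn": rng.lognormal(np.log(10.0), 0.4, n)}
rfd = {"Cd": 1e-3, "Cu": 4e-2, "Zn": 3e-1}  # indicative oral RfDs [mg/kg-day]
intake = rng.normal(0.3, 0.05, n)            # rice + vegetables [kg/day]
bw = rng.normal(70.0, 10.0, n)               # body weight [kg]

# Hazard quotient per metal, summed into a hazard index (HI).
hi = sum(conc[m] * intake / (bw * rfd[m]) for m in conc)
print(f"95th percentile HI: {np.percentile(hi, 95):.3f}")
print(f"P(HI > 1)         : {(hi > 1).mean():.4f}")
```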

  16. Using Probabilistic Methods in Water Scarcity Assessments: A First Step Towards a Water Scarcity Risk Assessment Framework

    Science.gov (United States)

    Veldkamp, Ted; Wada, Yoshihide; Aerts, Jeroen; Ward, Phillip

    2016-01-01

    Water scarcity, driven by climate change, climate variability, and socioeconomic developments, is recognized as one of the most important global risks, both in terms of likelihood and impact. Whilst a wide range of studies have assessed the role of long-term climate change and socioeconomic trends on global water scarcity, the impact of variability is less well understood. Moreover, the interactions between different forcing mechanisms, and their combined effect on changes in water scarcity conditions, are often neglected. Therefore, we provide a first step towards a framework for global water scarcity risk assessments, applying probabilistic methods to estimate water scarcity risks for different return periods under current and future conditions while using multiple climate and socioeconomic scenarios.

  17. Probabilistic Risk Assessment in Medium Scale for Rainfall-Induced Earthflows: Catakli Catchment Area (Cayeli, Rize, Turkey)

    Directory of Open Access Journals (Sweden)

    H. A. Nefeslioglu

    2011-01-01

    Full Text Available The aim of the present study is to introduce a probabilistic approach to determine the components of the risk evaluation for rainfall-induced earthflows at medium scale. The Catakli catchment area (Cayeli, Rize, Turkey) was selected as the application site of this study. The investigations were performed in four different stages: (i) evaluation of the conditioning factors, (ii) calculation of the probability of spatial occurrence, (iii) calculation of the probability of temporal occurrence, and (iv) evaluation of the consequent risk. For this purpose, some basic concepts such as "Risk Cube", "Risk Plane", and "Risk Vector" were defined. Additionally, in order to assign vulnerability to the terrain units studied at medium scale, a new, more robust and more objective equation was proposed. As a result, considering the concrete-type roads in the catchment area, the economic risks were estimated as 3.6×10⁶ € where the failures occur on terrain units including the element at risk, and 12.3×10⁶ € where the risks arise from surrounding terrain units. Risk assessments performed at medium scale using the technique proposed in the present study will provide substantial economic contributions to mitigation planning studies in the region.

  18. Probabilistic risk assessment of dietary exposure to single and multiple pesticide residues or contaminants: summary of the work performed within the SAFE FOODS project.

    Science.gov (United States)

    van Klaveren, Jacob D; Boon, Polly E

    2009-12-01

    This introduction to the journal's supplement on probabilistic risk assessment of single and multiple exposure to pesticide residues or contaminants summarizes the objectives and results of the work performed in work package 3 of the EU-funded project SAFE FOODS. Within this work package, we developed an electronic platform of food consumption and chemical concentration databases harmonised at raw agricultural commodity level. In this platform the databases are connected to probabilistic software to allow probabilistic modelling of dietary exposure in a standardised way. The usefulness of this platform is demonstrated in two papers, which describe the exposure to pesticides and glycoalkaloids in several European countries. Furthermore, an integrated probabilistic risk assessment (IPRA) model was developed: a new tool to integrate exposure and effect modelling, including uncertainty analyses. The use of this model was shown in a paper on the cumulative exposure to anti-androgen pesticides. Combined with a health impact prioritization system, developed within this work package to compare health risks between chemicals, the IPRA tool can also be used to compare health risks between multiple chemicals in complex risk assessment situations such as risk-benefit and risk trade-off analyses. Both the electronic platform of databases and the IPRA model may prove to be powerful tools to tackle the challenges risk managers are or will be faced with in the future.

  19. Risk assessment methods in radiotherapy: Probabilistic safety assessment (PSA); Los metodos de analisis de riesgo en radioterapia: Analisis Probabilistico de seguridad (APS)

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez Vera, M. L.; Perez Mulas, A.; Delgado, J. M.; Barrientos Ontero, M.; Somoano, F.; Alvarez Garcia, C.; Rodriguez Marti, M.

    2011-07-01

    The understanding of accidents that have occurred in radiotherapy and the lessons learned from them are very useful to prevent repetition, but there are other risks that have not been detected to date. With a view to identifying and preventing such risks, proactive methods successfully applied in other fields, such as probabilistic safety assessment (PSA), have been developed. (Author)

  20. Developing EHR-driven heart failure risk prediction models using CPXR(Log) with the probabilistic loss function.

    Science.gov (United States)

    Taslimitehrani, Vahid; Dong, Guozhu; Pereira, Naveen L; Panahiazar, Maryam; Pathak, Jyotishman

    2016-04-01

    Computerized survival prediction in healthcare, identifying the risk of disease mortality, helps healthcare providers to effectively manage their patients by providing appropriate treatment options. In this study, we propose to apply a classification algorithm, Contrast Pattern Aided Logistic Regression (CPXR(Log)) with the probabilistic loss function, to develop and validate prognostic risk models to predict 1-, 2-, and 5-year survival in heart failure (HF) using data from electronic health records (EHRs) at Mayo Clinic. CPXR(Log) constructs a pattern-aided logistic regression model defined by several patterns and corresponding local logistic regression models. One of the models generated by CPXR(Log) achieved an AUC and accuracy of 0.94 and 0.91, respectively, and significantly outperformed prognostic models reported in prior studies. Data extracted from EHRs allowed incorporation of patient co-morbidities into our models, which helped improve the performance of the CPXR(Log) models (15.9% AUC improvement), although it did not improve the accuracy of the models built by other classifiers. We also propose a probabilistic loss function to determine the large-error and small-error instances. The new loss function used in the algorithm outperforms the functions used in previous studies by a 1% improvement in AUC. This study revealed that building prediction models from EHR data can be very challenging with existing classification methods, due to the high dimensionality and complexity of EHR data. The risk models developed by CPXR(Log) also reveal that HF is a highly heterogeneous disease, i.e., different subgroups of HF patients require different types of considerations in their diagnosis and treatment. Our risk models provided two valuable insights for the application of predictive modeling techniques in biomedicine: logistic risk models often make systematic prediction errors, and it is prudent to use subgroup-based prediction models such as those given by CPXR(Log).

  1. Probabilistic safety assessment of WWER440 reactors prediction, quantification and management of the risk

    CERN Document Server

    Kovacs, Zoltan

    2014-01-01

    The aim of this book is to summarize probabilistic safety assessment (PSA) of nuclear power plants with WWER440 reactors and demonstrate that the plants are safe enough for producing energy even in light of the Fukushima accident. The book examines level 1 and 2 full power, low power and shutdown PSA, and summarizes the author's experience gained during the last 35 years in this area. It provides useful examples taken from PSA training courses the author has lectured, organized by the International Atomic Energy Agency. Such training courses were organised in Argonne National Laboratory

  2. Probabilistic authenticated quantum dialogue

    Science.gov (United States)

    Hwang, Tzonelih; Luo, Yi-Ping

    2015-12-01

    This work proposes a probabilistic authenticated quantum dialogue (PAQD) based on Bell states with the following notable features. (1) In our proposed scheme, the dialogue is encoded in a probabilistic way, i.e., the same messages can be encoded into different quantum states, whereas in the state-of-the-art authenticated quantum dialogue (AQD), the dialogue is encoded in a deterministic way; (2) the pre-shared secret key between two communicants can be reused without any security loophole; (3) each dialogue in the proposed PAQD can be exchanged within only one step of quantum communication and one step of classical communication, whereas in the state-of-the-art AQD protocols, both communicants have to run a QKD protocol for each dialogue, and each dialogue requires multiple quantum as well as classical communication steps; (4) nevertheless, the proposed scheme can resist the man-in-the-middle attack, the modification attack, and even other well-known attacks.

  3. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  4. Probabilistic risk assessment of Chinese residents' exposure to fluoride in improved drinking water in endemic fluorosis areas.

    Science.gov (United States)

    Zhang, Li E; Huang, Daizheng; Yang, Jie; Wei, Xiao; Qin, Jian; Ou, Songfeng; Zhang, Zhiyong; Zou, Yunfeng

    2017-03-01

    Studies have yet to evaluate the effects of water improvement on fluoride concentrations in drinking water and the corresponding health risks to Chinese residents in endemic fluorosis areas (EFAs) at a national level. This paper summarized available data in the published literature (2008-2016) on water fluoride from the EFAs in China before and after water quality was improved. Based on these data, a health risk assessment of Chinese residents' exposure to fluoride in improved drinking water was performed by means of a probabilistic approach. The uncertainties in the risk estimates were quantified using Monte Carlo simulation and sensitivity analysis. Our results showed that, in general, the average fluoride levels (0.10-2.24 mg/L) in the improved drinking water in the EFAs of China were lower than the pre-intervention levels (0.30-15.24 mg/L). The highest fluoride levels were detected in North and Southwest China. The mean non-carcinogenic risks associated with consumption of the improved drinking water for Chinese residents were mostly acceptable (hazard quotient below 1). The fluoride concentration in water, the ingestion rate of water, and the exposure time in the shower were the most relevant variables in the model; therefore, efforts should focus mainly on the definition of their probability distributions for a more accurate risk assessment. Copyright © 2016 Elsevier Ltd. All rights reserved.
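
    The combination of Monte Carlo simulation with a rank-correlation sensitivity analysis, as described above, can be sketched as follows; the distributions and the reference dose are placeholders chosen for illustration, not the reviewed data.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 20_000

# Illustrative input distributions (placeholders, not the reviewed data):
c_f = rng.lognormal(np.log(0.8), 0.5, n)  # fluoride in water [mg/L]
ir = rng.normal(1.5, 0.3, n)              # water ingestion rate [L/day]
bw = rng.normal(60.0, 10.0, n)            # body weight [kg]

rfd = 0.06                                # indicative reference dose [mg/kg-day]
hq = c_f * ir / (bw * rfd)                # hazard quotient

print(f"P(HQ > 1) = {(hq > 1).mean():.4f}")

# Sensitivity analysis: Spearman rank correlation between each input and
# the hazard quotient identifies the most influential variables.
for name, x in [("fluoride conc.", c_f), ("ingestion rate", ir), ("body weight", bw)]:
    rho, _ = spearmanr(x, hq)
    print(f"{name:15s} rho = {rho:+.2f}")
```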

  5. Generalized Fragility Relationships with Local Site Conditions for Probabilistic Performance-based Seismic Risk Assessment of Bridge Inventories

    Directory of Open Access Journals (Sweden)

    Sivathayalan S.

    2012-01-01

    Full Text Available The current practice of detailed seismic risk assessment cannot be easily applied to all the bridges in large transportation networks due to limited resources. This paper presents a new approach for seismic risk assessment of large bridge inventories in a city or national bridge network based on the framework of probabilistic performance-based seismic risk assessment. To account for the influence of local site effects, a procedure to generate site-specific hazard curves that includes seismic hazard microzonation information has been developed for seismic risk assessment of bridge inventories. Simulated ground motions compatible with the site-specific seismic hazard are used as input excitations in nonlinear time history analyses of representative bridges for calibration. A normalizing procedure to obtain generalized fragility relationships in terms of structural characteristic parameters (bridge span and size, and longitudinal and transverse reinforcement ratios) is presented. The seismic risk of bridges in a large inventory can then be easily evaluated using the normalized fragility relationships without carrying out detailed nonlinear time history analysis.
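
    The core of such a risk evaluation is the convolution of a fragility curve with a site-specific hazard curve. The sketch below assumes a lognormal fragility and a power-law hazard fit, both with invented parameters, to show the mechanics only.

```python
import numpy as np
from scipy.stats import norm

# Lognormal fragility: P(damage | IM = x) with median theta and
# dispersion beta (hypothetical values for one bridge class).
theta, beta = 0.45, 0.6  # median capacity [g], log-standard deviation

def fragility(x):
    return norm.cdf(np.log(x / theta) / beta)

# Site-specific hazard curve: annual rate of exceeding spectral
# acceleration x, here a simple power-law fit (illustrative only).
k0, k1 = 1e-4, 2.5

def hazard_rate(x):
    return k0 * x ** (-k1)

# Annual damage probability: integrate the fragility against the
# (negative) derivative of the hazard curve over intensity.
x = np.linspace(0.01, 3.0, 3000)
dlam = -np.gradient(hazard_rate(x), x)
p_annual = np.trapz(fragility(x) * dlam, x)
print(f"annual damage probability ~ {p_annual:.2e}")
```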

  6. Probabilistic risk assessment of dietary exposure to single and multiple pesticide residues or contaminants: Summary of the work performed within the SAFE FOODS project

    NARCIS (Netherlands)

    Klaveren, van J.D.; Boon, P.E.

    2009-01-01

    This introduction to the journal's supplement on probabilistic risk assessment of single and multiple exposure to pesticide residues or contaminants summarizes the objectives and results of the work performed in work package 3 of the EU-funded project SAFE FOODS. Within this work package, we developed an electronic platform of food consumption and chemical concentration databases harmonised at raw agricultural commodity level.

  7. Probabilistic assessment of risks of diethylhexyl phthalate (DEHP) in surface waters of China on reproduction of fish.

    Science.gov (United States)

    Liu, Na; Wang, Yeyao; Yang, Qi; Lv, Yibing; Jin, Xiaowei; Giesy, John P; Johnson, Andrew C

    2016-06-01

    Diethylhexyl phthalate (DEHP) is considered to be an endocrine disruptor which, unlike chemicals that have either non-specific (e.g., narcotic) or more generalized reactive modes of action, affects the hypothalamic-pituitary-gonadal (HPG) axis and tends to have specific interactions with particular molecular targets within biochemical pathways. Responding to this challenge, a novel method for deriving a predicted no-effect concentration (PNEC) and a probabilistic ecological risk assessment (PERA) for DEHP, based on long-term exposure of potentially sensitive species with appropriate apical endpoints, was developed for the protection of Chinese surface waters. PNECs based on potencies to cause lesions in reproductive tissues of fishes, which ranged from 0.04 to 0.20 μg DEHP L(-1), were significantly lower than those derived from other endpoints or other taxa, such as invertebrates. An assessment of the risks posed by DEHP to aquatic organisms showed that 88.17% and 78.85% of surface waters in China were predicted to pose risks to the reproductive fitness of fishes, based on protection thresholds for 5% (HC5) and 10% (HC10) of species, respectively. Assessments of risks mediated by the HPG axis should consider chronic, non-lethal endpoints for specific taxa, especially the reproductive fitness of fishes.
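
    The HC5/HC10 derivation behind such thresholds can be illustrated with a small species sensitivity distribution (SSD) fit. The endpoint values and site concentrations below are invented for illustration; only the lognormal-SSD mechanics mirror the approach described above.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical chronic reproduction NOECs for fish species [ug/L]
# (placeholders for the long-term endpoints discussed in the abstract).
noec = np.array([0.5, 1.2, 3.0, 8.0, 20.0, 45.0])

# Fit a lognormal species sensitivity distribution (SSD).
mu, sigma = np.log(noec).mean(), np.log(noec).std(ddof=1)

# HC5 / HC10: concentrations hazardous to 5% / 10% of species,
# i.e., protecting 95% / 90%.
hc5 = np.exp(norm.ppf(0.05, loc=mu, scale=sigma))
hc10 = np.exp(norm.ppf(0.10, loc=mu, scale=sigma))
print(f"HC5  = {hc5:.3g} ug/L")
print(f"HC10 = {hc10:.3g} ug/L")

# Fraction of monitored sites exceeding HC5 (illustrative site data).
sites = np.array([0.2, 0.9, 1.5, 4.0, 0.05, 2.5])
print(f"share of sites above HC5: {(sites > hc5).mean():.2f}")
```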

  8. Primary risk assessment of dimethyldithiocarbamate, a dithiocarbamate fungicide metabolite, based on their probabilistic concentrations in a coastal environment.

    Science.gov (United States)

    Hano, Takeshi; Ito, Katsutoshi; Mochida, Kazuhiko; Ohkubo, Nobuyuki; Kono, Kumiko; Onduka, Toshimitsu; Ito, Mana; Ichihashi, Hideki; Fujii, Kazunori; Tanaka, Hiroyuki

    2015-07-01

    The primary ecological risk of dimethyldithiocarbamate (DMDC), a dithiocarbamate fungicide (DTC) metabolite, was evaluated based on its probabilistic environmental concentration distributions (ECDs) in a coastal environment, Hiroshima Bay, Japan, and its behavior and temporal trends were further considered. This is the first report of the identification of DMDC in environmental seawater and sediment samples. DMDC concentrations in bottom seawater were substantially higher than those in surface seawater, which is associated with leaching from sediments in bottom seawaters and with photodegradation in surface seawaters. Furthermore, seasonal risks are dominated by higher concentrations from April to June, indicating temporal variation in the risk to exposed species. Hierarchical Bayesian analysis yielded DMDC ECD medians and ranges (5th to 95th percentiles) of 0.85 ng L(-1) (0.029, 22), 12 ng L(-1) (3.2, 48) and 110 ng kg dry(-1) (9.5, 1200) in surface seawater, bottom seawater and sediment, respectively. Considering that DMDC and DTCs have similar toxicological potential to aquatic organisms, the occurrence of the compound in water is likely to be of biological relevance. In summary, this work provides the first demonstration that the ecological risk of DMDC and the parent DTCs in Hiroshima Bay is relatively high, and that DTCs should be a high priority for future research on marine contamination, especially in bottom seawaters.

  9. Implications of Two Well-Known Models for Instructional Designers in Distance Education: Dick-Carey versus Morrison-Ross-Kemp

    Science.gov (United States)

    Akbulut, Yavuz

    2007-01-01

    This paper first summarizes, and then compares and contrasts two well-known instructional design models: Dick and Carey Model (DC) and Morrison, Ross and Kemp model (MRK). The target audiences of both models are basically instructional designers. Both models have applications for different instructional design settings. They both see the…

  10. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  11. Cost-Risk Trade-off of Solar Radiation Management and Mitigation under Probabilistic Information on Climate Sensitivity

    Science.gov (United States)

    Khabbazan, Mohammad Mohammadi; Roshan, Elnaz; Held, Hermann

    2017-04-01

    In principle solar radiation management (SRM) offers an option to ameliorate anthropogenic temperature rise. However, we cannot expect it to simultaneously compensate for anthropogenic changes in further climate variables in a perfect manner. Here, we ask to what extent a proponent of the 2°C temperature target would apply SRM in conjunction with mitigation in view of global or regional disparities in precipitation changes. We apply cost-risk analysis (CRA), a decision-analytic framework that trades off the expected welfare loss from climate policy costs against the climate risks from transgressing a climate target. In both global-scale and 'Giorgi'-regional-scale analyses, we evaluate the optimal mixture of SRM and mitigation under probabilistic information about climate sensitivity. To do so, we generalize CRA to include not only temperature risk, but also globally aggregated and regionally disaggregated precipitation risks. Social welfare is maximized for three valuation scenarios: temperature-risk-only, precipitation-risk-only, and equally weighted both-risks. For now, the Giorgi regions are treated with equal weight. We find that for regionally differentiated precipitation targets, the usage of SRM will be comparably more restricted. In the course of time, a cooling of up to 1.3°C can be attributed to SRM for the latter scenario and for a median climate sensitivity of 3°C (for a global target only, this number reduces by 0.5°C). Our results indicate that although SRM would almost completely substitute for mitigation in the globally aggregated analysis, it only saves 70% to 75% of the welfare loss compared to a purely mitigation-based analysis (from economic costs and climate risks, approximately 4% in terms of BGE) when considering regional precipitation risks in the precipitation-risk-only and both-risks scenarios. It remains to be shown how the inclusion of further risks or different regional weights would affect these results.

  12. Risk assessment and food allergy: the probabilistic model applied to allergens

    NARCIS (Netherlands)

    Spanjersberg, M.Q.I.; Kruizinga, A.G.; Rennen, M.A.J.; Houben, G.F.

    2007-01-01

    In order to assess the risk of unintended exposure to food allergens, traditional deterministic risk assessment is usually applied, leading to inconsequential conclusions as 'an allergic reaction cannot be excluded'. TNO therefore developed a quantitative risk assessment model for allergens based on

  14. Willingness-to-pay for a probabilistic flood forecast: a risk-based decision-making game

    Science.gov (United States)

    Arnal, Louise; Ramos, Maria-Helena; Coughlan, Erin; Cloke, Hannah L.; Stephens, Elisabeth; Wetterhall, Fredrik; van Andel, Schalk-Jan; Pappenberger, Florian

    2016-04-01

    Forecast uncertainty is a twofold issue, as it constitutes both an added value and a challenge for the forecaster and the user of the forecasts. Many authors have demonstrated the added (economic) value of probabilistic forecasts over deterministic forecasts for a diversity of activities in the water sector (e.g. flood protection, hydroelectric power management and navigation). However, the richness of the information is also a source of challenges for operational uses, due partially to the difficulty of transforming the probability of occurrence of an event into a binary decision. The setup and the results of a risk-based decision-making experiment, designed as a game on the topic of flood protection mitigation, called "How much are you prepared to pay for a forecast?", will be presented. The game was played at several workshops in 2015, including during this session at the EGU conference in 2015, and a total of 129 worksheets were collected and analysed. The aim of this experiment was to contribute to the understanding of the role of probabilistic forecasts in decision-making processes and their perceived value by decision-makers. Based on the participants' willingness-to-pay for a forecast, the results of the game showed that the value (or the usefulness) of a forecast depends on several factors, including the way users perceive the quality of their forecasts and link it to the perception of their own performances as decision-makers. Balancing avoided costs against the cost (or the benefit) of having forecasts available for making decisions is not straightforward, even in a simplified game situation, and is a topic that deserves more attention from the hydrological forecasting community in the future.

  15. Probabilistic modelling of exposure doses and implications for health risk characterization: glycoalkaloids from potatoes.

    Science.gov (United States)

    Ruprich, J; Rehurkova, I; Boon, P E; Svensson, K; Moussavian, S; Van der Voet, H; Bosgra, S; Van Klaveren, J D; Busk, L

    2009-12-01

    Potatoes are a source of glycoalkaloids (GAs), represented primarily by alpha-solanine and alpha-chaconine (about 95%). The content of GAs in tubers is usually 10-100 mg/kg, and maximum levels do not exceed 200 mg/kg. GAs can be hazardous to human health; poisoning involves gastrointestinal ailments and neurological symptoms. A single intake of >1-3 mg/kg b.w. is considered a critical effect dose (CED). Probabilistic modelling of acute and chronic (usual) exposure to GAs was performed in the Czech Republic, Sweden and The Netherlands. National databases on individual food consumption, data on the concentration of GAs in tubers (439 Czech and Swedish results) and processing factors were used for the modelling. The results indicate that potatoes currently available on the European market may lead to acute intakes >1 mg GAs/kg b.w./day in the upper tail of the intake distribution (0.01% of the population) in all three countries. A limit of 50 mg GAs/kg raw unpeeled tubers ensures that at least 99.99% of the population does not exceed the CED. The estimated chronic (usual) intake in the participating countries was 0.25, 0.29 and 0.56 mg/kg b.w./day (97.5% upper confidence limit). It remains unclear whether the incidence of GA poisoning is underreported or whether the assumptions are worst-case for extremely sensitive persons.
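
    The acute-exposure part of such a probabilistic model can be sketched as follows. The input distributions and the processing factor are illustrative assumptions, not the national survey data; only the critical effect dose of 1 mg/kg b.w. is taken from the abstract.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 1_000_000

# Illustrative input distributions (not the national survey data):
cons = rng.lognormal(np.log(0.15), 0.7, n)  # potato consumption [kg/day]
conc = rng.lognormal(np.log(60.0), 0.6, n)  # GAs in tubers [mg/kg]
bw = rng.normal(70.0, 12.0, n)              # body weight [kg]
pf = 0.7                                    # assumed processing factor

intake = cons * conc * pf / bw              # acute intake [mg/kg b.w./day]

ced = 1.0                                   # critical effect dose [mg/kg b.w.]
print(f"P(intake > CED) = {(intake > ced).mean():.2e}")
print(f"P99.99 intake   = {np.percentile(intake, 99.99):.2f} mg/kg b.w./day")
```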

  16. Lessons from the conviction of the L'Aquila seven: The standard probabilistic earthquake hazard and risk assessment is ineffective

    Science.gov (United States)

    Wyss, Max

    2013-04-01

    An earthquake of M6.3 killed 309 people in L'Aquila, Italy, on 6 April 2009. Subsequently, a judge in L'Aquila convicted seven who had participated in an emergency meeting on March 30, assessing the probability of a major event following the ongoing earthquake swarm. The sentence was six years in prison, a combined fine of 2 million Euros, loss of job, loss of retirement rent, and lawyers' costs. The judge followed the prosecution's accusation that the review by the Commission of Great Risks had conveyed a false sense of security to the population, which consequently did not take its usual precautionary measures before the deadly earthquake. He did not consider the facts that (1) one of the convicted was not a member of the commission and had merely obeyed orders to bring the latest seismological facts to the discussion, (2) another was an engineer who was not required to have any expertise regarding the probability of earthquakes, and (3) two others were seismologists not invited to speak to the public at a TV interview and a press conference. This exaggerated judgment was the consequence of an uproar in the population, who felt misinformed and even misled. Faced with a population worried by an earthquake swarm, the head of the Italian Civil Defense is on record ordering that the population be calmed, and the vice head executed this order in a TV interview one hour before the meeting of the Commission by stating "the scientific community continues to tell me that the situation is favorable and that there is a discharge of energy." The first lesson to be learned is that communications to the public about earthquake hazard and risk must not be left in the hands of someone who has gross misunderstandings about seismology. They must be carefully prepared by experts. The more significant lesson is that the approach of calming the population and the standard probabilistic hazard and risk assessment, as practiced by GSHAP, are misleading. The latter has been criticized as

  17. Probabilistic Risk Analysis and Fault Trees as Tools in Improving the Delineation of Wellhead Protection Areas: An Initial Discussion

    Science.gov (United States)

    Rodak, C. M.; Silliman, S. E.

    2010-12-01

    Delineation of a wellhead protection area (WHPA) is a critical component of managing and protecting the aquifer(s) supplying potable water to a public water-supply well. While a number of previous authors have addressed questions related to uncertainties in advective capture zones, methods for assessing WHPAs in the presence of uncertainty in the chemistry of groundwater contaminants, the relationship between land use and contaminant sources, and the impact on health risk within the receiving population are more limited. Probabilistic risk analysis (PRA) combined with fault trees (FT) addresses this latter challenge by providing a structure whereby four key WHPA issues may be addressed: (i) uncertainty in land-use practices and chemical release, (ii) uncertainty in groundwater flow, (iii) variability in natural attenuation properties (and/or remediation) of the contaminants, and (iv) estimated health risk from contaminant arrival at a well. The potential utility of PRA-FT in this application is considered through a simplified case study involving management decisions related both to regional land-use planning and local land-use zoning regulation. An application-specific fault tree is constructed to visualize and identify the events required for health risk failure at the well, and a Monte Carlo approach is used to create multiple realizations of groundwater flow and chemical transport to a well in a model of a simple, unconfined aquifer. Model parameters allowed to vary during this simplified case study include hydraulic conductivity, probability of a chemical spill (related to land-use variation in space), and natural attenuation through variation in the rate of decay of the contaminant. Numerical results are interpreted in association with multiple land-use management scenarios as well as multiple cancer risk assumptions regarding the contaminant arriving at the well. This case study shows significant variability of health risk at the well; however, general trends were
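
    A toy version of the Monte Carlo fault-tree logic described above might look like the following; every number (spill probabilities, travel times, half-lives, the source concentration and the health-based limit) is hypothetical and serves only to show how land-use, flow and attenuation uncertainties combine into a probability of health-risk failure at the well.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 100_000

# Fault-tree style sketch (all numbers hypothetical): health-risk failure
# at the well requires a chemical release AND insufficient attenuation
# along the flow path.
p_spill = rng.uniform(0.01, 0.05, n)             # land-use dependent release prob.
travel_t = rng.lognormal(np.log(400.0), 0.5, n)  # advective travel time [days]
half_life = rng.lognormal(np.log(200.0), 0.7, n) # contaminant decay [days]

c0 = 100.0                                   # source concentration [ug/L]
c_well = c0 * 0.5 ** (travel_t / half_life)  # first-order decay en route

c_limit = 5.0                                # assumed health-based limit [ug/L]
p_fail = p_spill * (c_well > c_limit)
print(f"P(health-risk failure at well) ~ {p_fail.mean():.2e}")
```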

  18. A Probabilistic Framework for Risk Analysis of Widespread Flood Events: A Proof-of-Concept Study.

    Science.gov (United States)

    Schneeberger, Klaus; Huttenlau, Matthias; Winter, Benjamin; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann

    2017-07-27

    This article presents a flood risk analysis model that considers the spatially heterogeneous nature of flood events. The basic concept of this approach is to generate a large sample of flood events that can be regarded as a temporal extrapolation of flood events. These are combined with cumulative flood impact indicators, such as building damages, to derive time series of damages for risk estimation. To this end, a multivariate modeling procedure that is able to take into account the spatial characteristics of flooding, the regionalization method top-kriging, and three different impact indicators are combined in a model chain. Eventually, the expected annual flood impact (e.g., expected annual damages) and the flood impact associated with a low probability of occurrence are determined for a study area. The risk model has the potential to augment the understanding of flood risk in a region and thereby contribute to enhanced risk management by, for example, risk analysts, policymakers, or insurance companies. The modeling framework was successfully applied in a proof-of-concept exercise in Vorarlberg (Austria). The results of the case study show that risk analysis has to be based on spatially heterogeneous flood events in order to estimate flood risk adequately. © 2017 Society for Risk Analysis.
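
    The final risk-estimation step, deriving the expected annual damage and a low-probability impact from a long synthetic series of damages, is simple to sketch. The event generator below is invented for illustration and stands in for the multivariate event set and impact indicators of the actual model chain.

```python
import numpy as np

rng = np.random.default_rng(21)

# Synthetic "temporal extrapolation": damages [MEUR] for a large sample of
# simulated years (illustrative generator, not the paper's event set).
years = 10_000
n_events = rng.poisson(0.4, years)  # number of flood events per year
annual_damage = np.array([
    rng.lognormal(np.log(2.0), 1.2, k).sum() if k else 0.0
    for k in n_events
])

# Expected annual damage and a low-probability (1-in-100-year) impact.
print(f"expected annual damage : {annual_damage.mean():.2f} MEUR")
print(f"99th percentile damage : {np.percentile(annual_damage, 99):.2f} MEUR")
```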

  19. Probabilistic meta-analysis of risk from the exposure to Hg in artisanal gold mining communities in Colombia.

    Science.gov (United States)

    De Miguel, Eduardo; Clavijo, Diana; Ortega, Marcelo F; Gómez, Amaia

    2014-08-01

    Colombia is one of the largest per capita mercury polluters in the world as a consequence of its artisanal gold mining activities. The severity of this problem in terms of potential health effects was evaluated by means of a probabilistic risk assessment carried out in the twelve departments (or provinces) in Colombia with the largest gold production. The two exposure pathways included in the risk assessment were inhalation of elemental Hg vapors and ingestion of fish contaminated with methyl mercury. Exposure parameters for the adult population (especially rates of fish consumption) were obtained from nation-wide surveys, and concentrations of Hg in air and of methyl mercury in fish were gathered from previous scientific studies. Fish consumption varied between departments and ranged from 0 to 0.3 kg d(-1). Average concentrations of total mercury in fish (70 data points) ranged from 0.026 to 3.3 μg g(-1). A total of 550 individual measurements of Hg in workshop air were gathered from across Colombia.

  20. Treating Uncertainties in a Nuclear Seismic Probabilistic Risk Assessment by Means of the Dempster-Shafer Theory of Evidence

    Energy Technology Data Exchange (ETDEWEB)

    Lo, Chungkung [Chair on Systems Science and the Energetic Challenge, Paris (France); Pedroni, N.; Zio, E. [Politecnico di Milano, Milano (Italy)

    2014-02-15

    The analyses carried out within the Seismic Probabilistic Risk Assessments (SPRAs) of Nuclear Power Plants (NPPs) are affected by significant aleatory and epistemic uncertainties. These uncertainties have to be represented and quantified coherently with the data, information and knowledge available, to provide reasonable assurance that related decisions can be taken robustly and with confidence. The amount of data, information and knowledge available for seismic risk assessment is typically limited, so that the analysis must strongly rely on expert judgments. In this paper, a Dempster-Shafer Theory (DST) framework for handling uncertainties in NPP SPRAs is proposed and applied to an example case study. The main contributions of this paper are two: (i) applying the complete DST framework to SPRA models, showing how to build the Dempster-Shafer structures of the uncertainty parameters based on industry generic data, and (ii) embedding Bayesian updating based on plant specific data into the framework. The results of the application to a case study show that the approach is feasible and effective in (i) describing and jointly propagating aleatory and epistemic uncertainties in SPRA models and (ii) providing 'conservative' bounds on the safety quantities of interest (i.e., Core Damage Frequency, CDF) that reflect the (limited) state of knowledge of the experts about the system of interest.

  1. TREATING UNCERTAINTIES IN A NUCLEAR SEISMIC PROBABILISTIC RISK ASSESSMENT BY MEANS OF THE DEMPSTER-SHAFER THEORY OF EVIDENCE

    Directory of Open Access Journals (Sweden)

    CHUNG-KUNG LO

    2014-02-01

    Full Text Available The analyses carried out within the Seismic Probabilistic Risk Assessments (SPRAs) of Nuclear Power Plants (NPPs) are affected by significant aleatory and epistemic uncertainties. These uncertainties have to be represented and quantified coherently with the data, information and knowledge available, to provide reasonable assurance that related decisions can be taken robustly and with confidence. The amount of data, information and knowledge available for seismic risk assessment is typically limited, so that the analysis must strongly rely on expert judgments. In this paper, a Dempster-Shafer Theory (DST) framework for handling uncertainties in NPP SPRAs is proposed and applied to an example case study. The main contributions of this paper are two: (i) applying the complete DST framework to SPRA models, showing how to build the Dempster-Shafer structures of the uncertainty parameters based on industry generic data, and (ii) embedding Bayesian updating based on plant-specific data into the framework. The results of the application to a case study show that the approach is feasible and effective in (i) describing and jointly propagating aleatory and epistemic uncertainties in SPRA models and (ii) providing 'conservative' bounds on the safety quantities of interest (i.e., Core Damage Frequency, CDF) that reflect the (limited) state of knowledge of the experts about the system of interest.
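
    The basic Dempster-Shafer mechanics referred to above (mass assignments, Dempster's combination rule, and belief/plausibility bounds) can be shown in a few lines. The frame and the two expert mass functions below are invented for illustration.

```python
# Minimal Dempster-Shafer sketch: combine two expert mass assignments over
# the frame {low, high} failure-rate states (illustrative numbers only).
from itertools import product

frame = frozenset({"low", "high"})
# Mass functions map focal sets to mass (the full frame encodes ignorance).
m1 = {frozenset({"low"}): 0.6, frame: 0.4}
m2 = {frozenset({"low"}): 0.3, frozenset({"high"}): 0.2, frame: 0.5}

def dempster(m1, m2):
    """Dempster's rule: intersect focal sets and renormalize by conflict."""
    combined, conflict = {}, 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + pa * pb
        else:
            conflict += pa * pb
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

m12 = dempster(m1, m2)
# Belief sums masses of subsets; plausibility sums masses of intersecting sets.
bel_low = sum(v for s, v in m12.items() if s <= frozenset({"low"}))
pl_low = sum(v for s, v in m12.items() if s & frozenset({"low"}))
print(f"Bel(low) = {bel_low:.3f}, Pl(low) = {pl_low:.3f}")
```

    The interval [Bel, Pl] is what provides the 'conservative' bounds on quantities of interest mentioned in the abstract.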

  2. The use of modelling and probabilistic methods in cumulative risk assessment

    NARCIS (Netherlands)

    Bosgra, S.

    2008-01-01

    This thesis was realized as part of the EU integrated project SAFE FOODS, the overall objective of which was to change the scope of decision-making on food safety from single risks to considering foods as sources of risks, benefits and costs associated with their production and consumption.

  3. Potential for Application of a Probabilistic Catastrophe Risk Modelling Framework to Poverty Outcomes

    OpenAIRE

    2016-01-01

    This paper analyzes the potential to combine catastrophe risk modelling (CAT risk modeling) with economic analysis of vulnerability to poverty using the example of drought hazard impacts on the welfare of rural households in Ethiopia. The aim is to determine the potential for applying a derived set of damage (vulnerability) functions based on realized shocks and household expenditure/consumption...

  4. Probabilistic soil erosion modeling using the Erosion Risk Management Tool (ERMIT) after wildfires

    Science.gov (United States)

    P. R. Robichaud; W. J. Elliot; J. W. Wagenbrenner

    2011-01-01

    The decision of whether or not to apply post-fire hillslope erosion mitigation treatments, and if so, where these treatments are most needed, is a multi-step process. Land managers must assess the risk of damaging runoff and sediment delivery events occurring on the unrecovered burned hillslope. We developed the Erosion Risk Management Tool (ERMiT) to address this need...

  5. Limitations of the entomological operational risk assessment using probabilistic and deterministic analyses.

    Science.gov (United States)

    Schleier, Jerome J; Peterson, Robert K D

    2010-08-01

    The Entomological Operational Risk Assessment (EORA) is used by the U.S. military to estimate risks posed by arthropod-vectored pathogens that produce human diseases. Our analysis demonstrated that the EORA matrix is formatted so that a small change in probability results in a discontinuous jump in risk. In addition, we show the overlap of different risk categories with respect to their probability of occurrence. Our results reveal that the fundamental mathematical problems associated with the EORA process may not provide estimates that are better than random chance. To ameliorate many of the problems associated with the EORA, we suggest more robust methods for performing qualitative and semiquantitative risk assessments when it is difficult to obtain the probability that an adverse event will occur and when the knowledge of experts can aid the process.

  6. Parent of origin, mosaicism, and recurrence risk: probabilistic modeling explains the broken symmetry of transmission genetics.

    Science.gov (United States)

    Campbell, Ian M; Stewart, Jonathan R; James, Regis A; Lupski, James R; Stankiewicz, Paweł; Olofsson, Peter; Shaw, Chad A

    2014-10-02

    Most new mutations are observed to arise in fathers, and increasing paternal age positively correlates with the risk of new variants. Interestingly, new mutations in X-linked recessive disease show elevated familial recurrence rates. In male offspring, these mutations must be inherited from mothers. We previously developed a simulation model to consider parental mosaicism as a source of transmitted mutations. In this paper, we extend and formalize the model to provide analytical results and flexible formulas. The results implicate parent of origin and parental mosaicism as central variables in recurrence risk. Consistent with empirical data, our model predicts that more transmitted mutations arise in fathers and that this tendency increases as fathers age. Notably, the lack of expansion later in the male germline determines relatively lower variance in the proportion of mutants, which decreases with paternal age. Subsequently, observation of a transmitted mutation has less impact on the expected risk for future offspring. Conversely, for the female germline, which arrests after clonal expansion in early development, variance in the mutant proportion is higher, and observation of a transmitted mutation dramatically increases the expected risk of recurrence in another pregnancy. Parental somatic mosaicism considerably elevates risk for both parents. These findings have important implications for genetic counseling and for understanding patterns of recurrence in transmission genetics. We provide a convenient online tool and source code implementing our analytical results. These tools permit varying the underlying parameters that influence recurrence risk and could be useful for analyzing risk in diverse family structures.

  7. Geothermal probabilistic cost study

    Energy Technology Data Exchange (ETDEWEB)

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)

  8. Risk from exposure to trihalomethanes during shower: probabilistic assessment and control.

    Science.gov (United States)

    Chowdhury, Shakhawat; Champagne, Pascale

    2009-02-15

    Exposure to trihalomethanes (THMs) through inhalation and dermal contact during showering and bathing may pose risks to human health. During showering and bathing, warm water (35-45 °C) is generally used. Warming of chlorinated supply water may increase THM formation through enhanced reactions between organics and residual chlorine. Exposure assessment using THM concentrations in cold water may under-predict the possible risks to human health. In this study, THM concentrations in warm water were estimated by developing a THM formation rate model. Using THMs in warm water, cancer and non-cancer risks to human health were predicted for three major cities in Ontario (Canada). The parameters for risk assessment were characterized by statistical distributions. The total cancer risks from exposure to THMs during showering were predicted to be 7.6×10⁻⁶, 6.3×10⁻⁶ and 4.3×10⁻⁶ for Ottawa, Hamilton and Toronto, respectively. The cancer risk exceedance probabilities were estimated to be highest in Ottawa at different risk levels. The risks through inhalation exposure (2.1×10⁻⁶ to 3.7×10⁻⁶) were found to be comparable to those of dermal contact (2.2×10⁻⁶ to 3.9×10⁻⁶) for the cities. This study predicted 36 cancer incidents from exposure to THMs during showering for these three cities, with Toronto contributing the highest number of possible cancer incidents (22), followed by Ottawa (10) and Hamilton (4). The sensitivity analyses showed that health risks could be controlled by varying shower stall volume and/or shower duration following the power-law relationship.

  9. Probabilistic Modeling for Risk Assessment of California Ground Water Contamination by Pesticides

    Science.gov (United States)

    Clayton, M.; Troiano, J.; Spurlock, F.

    2007-12-01

    The California Department of Pesticide Regulation (DPR) is responsible for the registration of pesticides in California. DPR's Environmental Monitoring Branch evaluates the potential for pesticide active ingredients to move to ground water under legal agricultural use conditions. Previous evaluations were primarily based on threshold values for specific persistence and mobility properties of pesticides, as prescribed in the California Pesticide Contamination Prevention Act of 1985. Two limitations identified with that process were its univariate nature, in which interactions of the properties were not accounted for, and its inability to accommodate multiple values of a physical-chemical property. We addressed these limitations by developing a probabilistic modeling method based on prediction of potential well water concentrations. A mechanistic pesticide transport model, LEACHM, is used to simulate sorption, degradation and transport of a candidate pesticide through the root zone. A second, empirical model component then simulates pesticide degradation and transport through the vadose zone to a receiving ground water aquifer. Finally, degradation during transport in the aquifer to the well screen is included in calculating final potential well concentrations. Using Monte Carlo techniques, numerous LEACHM simulations are conducted using random samples of the organic-carbon-normalized soil adsorption coefficients (Koc) and soil dissipation half-life values derived from terrestrial field dissipation (TFD) studies. Koc and TFD values are obtained from gamma distributions fitted to pooled data from agricultural-use pesticides detected in California ground water: atrazine, simazine, diuron, bromacil, hexazinone, and norflurazon. The distribution of predicted well water concentrations for these pesticides is in good agreement with concentrations measured in domestic wells in the coarse, leaching-vulnerable soils of Fresno and Tulare Counties. The leaching potential of a new
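
    A minimal sketch of the input-sampling step follows, assuming hypothetical gamma parameters in place of DPR's fitted distributions:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    n = 1_000  # number of LEACHM-style simulations

    # Hypothetical gamma fits to pooled Koc (mL/g) and soil half-life (days);
    # shape/scale values are placeholders, not DPR's fitted parameters.
    koc = stats.gamma.rvs(a=2.0, scale=60.0, size=n, random_state=rng)
    half_life = stats.gamma.rvs(a=1.5, scale=40.0, size=n, random_state=rng)

    # Each (Koc, half-life) pair would parameterize one transport simulation;
    # here we only summarize the sampled inputs.
    for name, x in {"Koc": koc, "half-life": half_life}.items():
        print(f"{name:9s} median={np.median(x):7.1f} 90th pct={np.percentile(x, 90):7.1f}")
    ```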

  10. A probabilistic seismic risk assessment procedure for nuclear power plants: (II) Application

    Science.gov (United States)

    Huang, Y.-N.; Whittaker, A.S.; Luco, N.

    2011-01-01

    This paper presents the procedures and results of intensity- and time-based seismic risk assessments of a sample nuclear power plant (NPP) to demonstrate the risk-assessment methodology proposed in its companion paper. The intensity-based assessments include three sets of sensitivity studies to identify the impact of the following factors on the seismic vulnerability of the sample NPP: (1) the description of fragility curves for primary and secondary components of NPPs, (2) the number of simulations of NPP response required for risk assessment, and (3) the correlation in responses between NPP components. The time-based assessment is performed as a series of intensity-based assessments. The studies illustrate the utility of response-based fragility curves and of including the correlation in the responses of NPP components directly in the risk computation. © 2011 Published by Elsevier B.V.
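
    Fragility curves of the kind referred to in point (1) are commonly modelled as lognormal CDFs; the sketch below shows that generic form with an assumed median capacity and dispersion (the paper's response-based curves are not reproduced here).

    ```python
    import numpy as np
    from scipy.stats import norm

    def fragility(im: np.ndarray, theta: float, beta: float) -> np.ndarray:
        """Lognormal fragility: P(failure | IM = im) = Phi(ln(im/theta)/beta).
        theta: median capacity; beta: logarithmic standard deviation."""
        return norm.cdf(np.log(im / theta) / beta)

    # Hypothetical curve for a secondary component (theta = 0.8 g, beta = 0.4).
    pga = np.array([0.1, 0.3, 0.5, 0.8, 1.2])  # peak ground acceleration, g
    for im, p in zip(pga, fragility(pga, theta=0.8, beta=0.4)):
        print(f"PGA = {im:.1f} g -> P(failure) = {p:.3f}")
    ```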

  11. Post-Probabilistic Uncertainty Quantification: Discussion of Potential Use in Product Development Risk Management

    DEFF Research Database (Denmark)

    Tegeltija, Miroslava; Oehmen, Josef; Kozin, Igor

    2016-01-01

    Uncertainty represents one of the key challenges in product development (PD) projects and can significantly impact a PD project's performance. Risks in PD lead to schedule and cost over-runs and poor product quality [Olechowski et al. 2012]. Risk management is one response for the identification and management of risks. Acknowledging the increasing societal and business criticality of product development projects, there is a need to more thoroughly explore the various fundamental approaches to describe and quantify various types of uncertainty as part of the overall decision making process. Decisions … if uncertainty is carefully addressed (e.g. [Prelec and Loewenstein 1991], [Riabacke 2006]). In the risk management community there is a strong argument that at least two distinct types of uncertainty have to be taken into account: aleatory and epistemic. Epistemic uncertainty arises due to lack of knowledge…

  12. VRAKA – a probabilistic risk assessment method for potentially polluting shipwrecks

    Directory of Open Access Journals (Sweden)

    Hanna Landquist

    2016-07-01

    Shipwrecks around the world contain unknown volumes of hazardous substances which, if discharged, could harm the marine environment. Shipwrecks can deteriorate for a number of reasons, including corrosion and physical impact from trawling and other activities, and the probability of a leakage increases with time. There are currently few comprehensive methods for assessing shipwrecks with respect to pollution risks before deciding on possible mitigation measures. A holistic method for estimating environmental risks from shipwrecks should be based on well-established risk assessment methods and should take into account both the probability of discharge and the potential consequences. The purpose of this study was therefore to present a holistic risk assessment method for potentially polluting shipwrecks. The focus is on developing a method for estimating the environmental consequences of potential discharges of hazardous substances from shipwrecks and combining it with earlier research on a tool for estimating the probability of discharge of hazardous substances. Risk evaluation should also be included in a full risk assessment and is the subject of further research. The consequence assessment was developed for application in three tiers. In Tier 1, the probability of discharge and the possible amount of discharge are compared to other shipwrecks. In Tier 2, a risk matrix, including a classification of potential consequences, is suggested as a basis for assessment and comparison. The most detailed level, Tier 3, is based on advanced tools for oil spill trajectory modelling and sensitivity mapping of the Swedish coast. To illustrate the method, an example application on two wrecks is presented. Wreck number 1 presents a lower probability of discharge and a lower consequence in the Tier 1 and Tier 3 assessments. For the Tier 2 consequence assessment, the two example wrecks present equal consequence. The tool for estimating the probability of discharge

  13. A probabilistic risk assessment for the vulnerability of the European carbon cycle to weather extremes: the ecosystem perspective

    Science.gov (United States)

    Rolinski, S.; Rammig, A.; Walz, A.; von Bloh, W.; van Oijen, M.; Thonicke, K.

    2015-03-01

    Extreme weather events are likely to occur more often under climate change, and the resulting effects on ecosystems could lead to a further acceleration of climate change. But not all extreme weather events lead to extreme ecosystem responses. Here, we focus on hazardous ecosystem behaviour and identify coinciding weather conditions. We use a simple probabilistic risk assessment based on time series of ecosystem behaviour and climate conditions. Following risk assessment terminology, vulnerability and risk for the previously defined hazard are estimated on the basis of observed hazardous ecosystem behaviour. We apply this approach to extreme responses of terrestrial ecosystems to drought, defining the hazard as a negative net biome productivity over a 12-month period. We show an application for two selected sites using data for 1981-2010 and then apply the method at the pan-European scale for the same period, based on numerical modelling results (LPJmL for ecosystem behaviour; ERA-Interim data for climate). Our site-specific results demonstrate the applicability of the proposed method, using the Standardised Precipitation-Evapotranspiration Index (SPEI) to describe the climate condition. The site in Spain provides an example of vulnerability to drought because the expected value of the SPEI is 0.4 lower for hazardous than for non-hazardous ecosystem behaviour. In northern Germany, on the contrary, the site is not vulnerable to drought because the SPEI expectation values imply wetter conditions in the hazard case than in the non-hazard case. At the pan-European scale, ecosystem vulnerability to drought is identified in the Mediterranean and temperate regions, whereas Scandinavian ecosystems are vulnerable under conditions without water shortages. These first model-based applications indicate the conceptual advantages of the proposed method, which focuses on identifying the critical weather conditions under which hazardous ecosystem behaviour is observed in the analysed data set. Application of the method to empirical time
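
    The core vulnerability test, comparing the expected SPEI under hazardous versus non-hazardous ecosystem behaviour, can be sketched as follows on stand-in time series (the study itself uses LPJmL output and ERA-Interim-based SPEI):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Stand-in monthly series; in the study these come from LPJmL and ERA-Interim.
    spei = rng.normal(0.0, 1.0, size=360)
    nbp_12m = 0.5 * spei + rng.normal(0.0, 1.0, size=360)  # 12-month net biome productivity

    hazard = nbp_12m < 0  # hazard: negative NBP over a 12-month window

    # Vulnerability indicator: hazardous months coincide with drier conditions
    # when E[SPEI | hazard] is clearly below E[SPEI | no hazard].
    delta = spei[~hazard].mean() - spei[hazard].mean()
    print(f"E[SPEI|no hazard] - E[SPEI|hazard] = {delta:.2f}")
    ```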

  14. Probabilistic acute risk assessment of cumulative exposure to organophosphorus and carbamate pesticides from dietary vegetables and fruits in Shanghai populations.

    Science.gov (United States)

    Li, Fan; Yuan, Yaqun; Meng, Pai; Wu, Min; Li, Shuguang; Chen, Bo

    2017-05-01

    Organophosphorus pesticides (OPs) and carbamate pesticides (CPs) are among the most widely used pesticides in China, playing a major role in protecting agricultural commodities. In this study, we determined the cumulative acute exposure of Shanghai residents to OPs and CPs from vegetables and fruits (VFs). The food consumption data were obtained from the Shanghai Food Consumption Survey (SHFCS) of 2012-2014, including a total of 1973 participants aged 2-90 years. The pesticide residue data were obtained from the Shanghai monitoring programme during 2008-2011, with 34 organophosphates and 11 carbamates analysed in a total of 5335 samples of VFs. A probabilistic approach was performed as recommended by the EFSA, using an optimistic model with non-detects set to zero and with processing factors (PFs) applied, and a pessimistic model with non-detects replaced by the limit of detection (LOD) and without PFs. We used the relative potency factor (RPF) method to normalise the various pesticides to the index compound (IC), methamidophos and chlorpyrifos separately. Only in the pessimistic model using methamidophos as the IC was there a small risk of exposure exceeding the ARfD (3 µg kg⁻¹ bw day⁻¹) in the populations of preschool children (0.029%), school-age children (0.022%) and adults (0.002%). There was no risk of exposure exceeding the ARfD of methamidophos in the optimistic model or of chlorpyrifos (100 µg kg⁻¹ bw day⁻¹) in either model in all three populations. Considering the Chinese habits of overwhelmingly eating processed food (vegetables being cooked, and fruits being washed or peeled), we conclude that little acute risk arises from exposure to VF-sourced OPs and CPs in Shanghai.

  15. Exploring probabilistic tools for the development of a platform for Quantitative Risk Assessment (QRA) of hydro-meteorological hazards in Europe

    Science.gov (United States)

    Zumpano, V.; Hussin, H. Y.; Breinl, K.

    2012-04-01

    Mass movements and floods are hydro-meteorological hazards that can have catastrophic effects on communities living in mountainous areas prone to these disastrous events. Environmental, climatic and socio-economic changes are expected to affect the spatio-temporal patterns of hydro-meteorological hazards and the associated risks in Europe. These changes and their effects on the occurrence of future hazards need to be analyzed and modeled using probabilistic hazard and risk assessment methods in order to assist stakeholders in disaster management strategies and policy making. Quantitative Risk Assessment (QRA) using probabilistic methods can further calculate damage and losses for multi-hazards and determine the uncertainties related to all the probabilistic components of the hazard and the vulnerability of the elements at risk. Therefore, in order to develop an effective platform that can quantitatively calculate the risk of mass movements and floods in several European test sites, an extensive inventory and analysis has been carried out of the available tools and software related to the probabilistic risk assessment of single and multi-hazards. The tools were reviewed on whether they are open source and freely available, their required input data, the availability and type of hazard and vulnerability modules, the transparency of the methods used, their validation and calibration techniques, the inclusion of uncertainties, and their state of the art. The analysis also focused specifically on the applicability of the tools to European study areas. The findings showed that assumptions and simplifications are made when assessing and quantifying the hazards. The interaction between multiple hazards, such as cascading effects, is not assessed in most tools, and some consider the hazard and vulnerability as qualitative components rather than quantitative ones. This analysis of hazard and risk assessment tools and software will give future developers and experts a better overview of

  16. A probabilistic risk assessment for dengue fever by a threshold based-quantile regression

    Science.gov (United States)

    Chiu, Chuan-Hung; Tan, Yih-Chi; Wen, Tzai-Hung; Chien, Lung-Chang; Yu, Hwa-Lung

    2014-05-01

    This article introduces the concept of the "return period" to analyze the potential incidence rate of dengue fever by bringing together two models: the quantile regression model and the threshold-based method. The return period provides the frequency of incidence of dengue fever, and risk maps for the potential incidence of dengue fever were established to point out the highest-risk areas. A threshold-based linear quantile regression model was constructed to identify significant main effects and interactions based on a collinearity test and stepwise selection; the performance of the model is reported via pseudo-R². Finally, spatial risk maps for the specified return periods and average incidence rates are given, indicating that places with high population density (e.g., residential areas), water conservancy facilities, and the corresponding interactions have a positive influence on dengue fever. These factors would be the key points for disease protection in the study area.
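
    A threshold-style quantile regression of this kind can be sketched with statsmodels; the covariates and coefficients below are synthetic stand-ins, and the fitted quantile tau maps to a return period of 1/(1-tau).

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500

    # Stand-in covariates: population density and a water-facility indicator.
    pop_density = rng.uniform(0.0, 1.0, size=n)
    water_fac = rng.integers(0, 2, size=n)
    incidence = (2.0 * pop_density + 1.5 * water_fac * pop_density
                 + rng.gamma(shape=2.0, scale=0.5, size=n))

    # Design matrix with a main-effect and interaction structure.
    X = sm.add_constant(np.column_stack([pop_density, water_fac,
                                         pop_density * water_fac]))

    # Fit the upper tail: q = 0.9 corresponds to a 10-period return level.
    res = sm.QuantReg(incidence, X).fit(q=0.9)
    print(res.params)  # order: const, density, facility, interaction
    ```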

  17. Dam overtopping risk using probabilistic concepts – Case study: The Meijaran Dam, Iran

    Directory of Open Access Journals (Sweden)

    Ehsan Goodarzi

    2013-06-01

    Hydrologic risk assessment and uncertainty analysis by mathematical and statistical methods provide useful information for decision makers. This study presents the application of risk and uncertainty analysis to dam overtopping due to various inflows and wind speeds for the Meijaran Dam in the north of Iran. The procedure includes univariate flood and wind speed frequency analyses, reservoir routing, and integration of wind set-up and run-up to calculate the reservoir water elevation. Afterwards, the probability of overtopping was assessed by applying two uncertainty analysis methods (Monte Carlo simulation and Latin hypercube sampling), considering the quantile of flood peak discharge, the initial depth of water in the reservoir, and the spillway discharge coefficient as uncertain variables. The results revealed that the rising water level in the reservoir is the most important factor in overtopping risk analysis and that wind speed also has a considerable impact on reservoirs located in windy areas.
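
    The Latin hypercube step can be sketched as follows; the input distributions, the surrogate routing relation and the crest elevation are all assumed for illustration, not taken from the Meijaran study.

    ```python
    import numpy as np
    from scipy.stats import qmc, gumbel_r, norm, uniform

    # Latin hypercube sample over three uncertain inputs (illustrative distributions).
    sampler = qmc.LatinHypercube(d=3, seed=11)
    u = sampler.random(n=10_000)  # stratified uniform(0,1) samples

    peak_inflow = gumbel_r.ppf(u[:, 0], loc=800.0, scale=200.0)  # m^3/s
    init_level = norm.ppf(u[:, 1], loc=55.0, scale=1.5)          # m
    spill_coeff = uniform.ppf(u[:, 2], loc=1.9, scale=0.4)       # weir coefficient

    # Toy surrogate for reservoir routing: water level rises with inflow and
    # initial level, falls with spillway efficiency (placeholder relation).
    max_level = init_level + 6.0 * (peak_inflow / 1000.0) ** 0.8 / spill_coeff
    p_overtop = np.mean(max_level > 60.0)  # crest elevation 60 m (assumed)
    print(f"P(overtopping) ~ {p_overtop:.4f}")
    ```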

  18. Probabilistic cost-benefit analysis of disaster risk management in a development context.

    Science.gov (United States)

    Kull, Daniel; Mechler, Reinhard; Hochrainer-Stigler, Stefan

    2013-07-01

    Limited studies have shown that disaster risk management (DRM) can be cost-efficient in a development context. Cost-benefit analysis (CBA) is an evaluation tool to analyse economic efficiency. This research introduces quantitative, stochastic CBA frameworks and applies them in case studies of flood and drought risk reduction in India and Pakistan, while also incorporating projected climate change impacts. DRM interventions are shown to be economically efficient, with integrated approaches more cost-effective and robust than singular interventions. The paper highlights that CBA can be a useful tool if certain issues are considered properly, including: complexities in estimating risk; data dependency of results; negative effects of interventions; and distributional aspects. The design and process of CBA must take into account specific objectives, available information, resources, and the perceptions and needs of stakeholders as transparently as possible. Intervention design and uncertainties should be qualified through dialogue, indicating that process is as important as numerical results.

  19. Enhancing Cost Realism through Risk-Driven Contracting: Designing Incentive Fees Based on Probabilistic Cost Estimates

    Science.gov (United States)

    2012-04-01

    …it would award a CPFF contract to what it knew to be the lowest cost contractor to avoid the risk premium of incentive contracts (Samuelson, 1986)… Samuelson, W. (1986). Bidding for contracts. Management Science, 32(12), 1533–1550. Scherer, F. M. (1964). The theory of contractual incentives for cost

  20. A Framework to Expand and Advance Probabilistic Risk Assessment to Support Small Modular Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Curtis Smith; David Schwieder; Robert Nourgaliev; Cherie Phelan; Diego Mandelli; Kellie Kvarfordt; Robert Youngblood

    2012-09-01

    During the early development of nuclear power plants, researchers and engineers focused on many aspects of plant operation, two of which were getting the new technology to work and minimizing the likelihood of perceived accidents through redundancy and diversity. As time and experience have progressed, the realization of plant operational risk/reliability has entered into the design, operation, and regulation of these plants. But, to date, we have only scratched the surface of risk and reliability technologies. For the next generation of small modular reactors (SMRs), it is imperative that these technologies evolve into an accepted, encompassing, validated, and integral part of the plant in order to reduce costs and to demonstrate safe operation. Further, while it is presumed that safety margins are substantial for proposed SMR designs, the depiction and demonstration of these margins need to be better understood in order to optimize the licensing process.

  1. A probabilistic model for silver bioaccumulation in aquatic systems and assessment of human health risks.

    Science.gov (United States)

    Warila, J; Batterman, S; Passino-Reader, D R

    2001-02-01

    Silver (Ag) is discharged in wastewater effluents and is also a component in a proposed secondary water disinfectant. A steady-state model was developed to simulate bioaccumulation in aquatic biota and assess ecological and human health risks. Trophic levels included phytoplankton, invertebrates, brown trout, and common carp. Uptake routes included water, food, or sediment. Based on an extensive review of the literature, distributions were derived for most inputs for use in Monte Carlo simulations. Three scenarios represented ranges of dilution and turbidity. Compared with the limited field data available, median estimates of Ag in carp (0.07-2.1 micrograms/g dry weight) were 0.5 to 9 times measured values, and all measurements were within the predicted interquartile range. Median Ag concentrations in biota were ranked invertebrates > phytoplankton > trout > carp. Biotic concentrations were highest for conditions of low dilution and low turbidity. Critical variables included Ag assimilation efficiency, specific feeding rate, and the phytoplankton bioconcentration factor. Bioaccumulation of Ag seems unlikely to result in toxicity to aquatic biota and humans consuming fish. Although the highest predicted Ag concentrations in water (> 200 ng/L) may pose chronic risks to early survival and development of salmonids and risks of argyria to subsistence fishers, these results occur under highly conservative conditions.

  2. The use of check valve performance data to support new concepts (probabilistic risk assessment, condition monitoring) for check valve program

    Energy Technology Data Exchange (ETDEWEB)

    Hart, K.A.; Gower, D.

    1996-12-01

    The concept of developing an integrated check valve database based on the Nuclear Power Reliability Data System (NPRDS) data was presented at the last Symposium. The Nuclear Industry Check Valve Group (NIC), working in cooperation with the Oak Ridge National Laboratory (ORNL), has completed an operational database of check valve performance from 1984 to the present. NIC has committed to the nuclear industry to periodically update the data and keep this information accessible. As the new concepts of probabilistic risk analysis and condition monitoring are integrated into the American Society of Mechanical Engineers (ASME) Code, a critical element will be performance data. From check valve performance data, feasible failure modes and rates can be established. When a failure rate or frequency of failures can be established based on a large enough population (sampling), a more solid foundation exists for focusing resources and determining appropriate testing frequencies. The presentation will give the updated status of the NIC Check Valve Performance Database, covering (1) the methodology used to combine the original ORNL data; (2) the processes and controls established for continuing update and refinement of the data; (3) a discussion of how these data are being utilized by (a) OM-22 for condition monitoring and (b) the risk-based inservice testing work of the Westinghouse Owners' Group; and (4) results and trends of the data evaluations. At the 1994 Symposium, ORNL provided an update, as of 1991, to their original work of 1984-1990, which they had performed to characterize check valve degradations and failures in the nuclear industry. These characterizations will be updated to 1995 and additional reviews provided to give insight into the current condition and trends of check valve performance.

  3. Probabilistic Health Risk Assessment of Chemical Mixtures: Importance of Travel Times and Connectivity

    Science.gov (United States)

    Henri, Christopher V.; Fernàndez-Garcia, Daniel; de Barros, Felipe P. J.

    2014-05-01

    Subsurface contamination giving rise to groundwater pollution is found extensively in all industrialized countries. Under this pressure, risk assessment methods play an important role in protecting populations by (1) quantifying the potential impact of an aquifer contamination on human health and (2) helping and driving the decisions of groundwater-resource managers. Many reactive compounds such as chlorinated solvents or nitrates potentially undergo attenuation processes under common geochemical conditions. This represents an attractive and extensively used remediation solution but often leads to the production of by-products before a harmless chemical form is reached. This makes mixtures of contaminants a common issue for groundwater-resource managers. In this case, the threat posed by these contaminants to human health at a given sensitive location greatly depends on the competition between reactive and advective-dispersive characteristic times. However, the hydraulic properties of the aquifer are known to be spatially variable, which can lead to the formation of preferential flow channels and fast contamination pathways. Therefore, the uncertainty in the spatial distribution of the aquifer properties controlling the plume travel time may play a particular role in the human health risk assessment of chemical mixtures. We investigate here the risk related to a multispecies system in response to different degrees of heterogeneity of the hydraulic conductivity (K, or Y = ln(K)). This work focuses on a perchloroethylene (PCE) contamination problem followed by the sequential first-order production/biodegradation of its daughter species trichloroethylene (TCE), dichloroethylene (DCE) and vinyl chloride (VC). For this specific case, VC is known to be a highly toxic contaminant. By performing numerical experiments, we evaluate transport through three-dimensional mildly (σY² = 1.0) and highly (σY² = 4.0) heterogeneous aquifers. Uncertainty on the hydraulic
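
    The sequential first-order chain PCE -> TCE -> DCE -> VC mentioned above can be sketched as a small ODE system; the rate constants here are illustrative, not site-calibrated values.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    # First-order sequential degradation PCE -> TCE -> DCE -> VC,
    # with illustrative rate constants in 1/day.
    k = np.array([0.008, 0.004, 0.002, 0.001])

    def chain(t, c):
        pce, tce, dce, vc = c
        return [-k[0] * pce,
                k[0] * pce - k[1] * tce,
                k[1] * tce - k[2] * dce,
                k[2] * dce - k[3] * vc]

    sol = solve_ivp(chain, t_span=(0.0, 2000.0), y0=[100.0, 0.0, 0.0, 0.0],
                    t_eval=np.linspace(0.0, 2000.0, 5))
    for t, c in zip(sol.t, sol.y.T):
        print(f"t={t:6.0f} d  PCE={c[0]:6.1f}  TCE={c[1]:6.1f}  "
              f"DCE={c[2]:6.1f}  VC={c[3]:6.1f}")
    ```

    Coupled to travel-time distributions from heterogeneous flow fields, such a chain determines whether the toxic intermediate VC peaks before or after reaching the sensitive location.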

  4. Probabilistic runoff volume forecasting in risk-based optimization for RTC of urban drainage systems

    DEFF Research Database (Denmark)

    Löwe, Roland; Vezzaro, Luca; Mikkelsen, Peter Steen

    2016-01-01

    This article demonstrates the incorporation of stochastic grey-box models for urban runoff forecasting into a full-scale, system-wide control setup where setpoints are dynamically optimized considering forecast uncertainty and sensitivity of overflow locations in order to reduce combined sewer...... overflow risk. The stochastic control framework and the performance of the runoff forecasting models are tested in a case study in Copenhagen (76 km2 with 6 sub-catchments and 7 control points) using 2-h radar rainfall forecasts and inlet flows to control points computed from a variety of noisy...... smoothing. Simulations demonstrate notable improvements of the control efficiency when considering forecast information and additionally when considering forecast uncertainty, compared with optimization based on current basin fillings only....

  5. Short-term probabilistic earthquake risk assessment considering time-dependent b values

    Science.gov (United States)

    Gulia, Laura; Tormann, Thessa; Wiemer, Stefan; Herrmann, Marcus; Seif, Stefanie

    2016-02-01

    Laboratory experiments highlight a systematic b value decrease during the stress increase period before failure, and some large natural events are known to show a precursory decrease in the b value. However, short-term forecast models currently consider only the generic probability that an event can trigger subsequent seismicity in the near field. While the probability increase over a stationary Poissonian background is substantial, selected case studies have shown through cost-benefit analysis that the absolute main shock probability remains too low to warrant significant mitigation actions. We analyze the probabilities considering both changes in the seismicity rates and temporal changes in the b value. The precursory b value decrease in the 2009 L'Aquila case results in an additional fiftyfold probability increase for a M6.3 event. Translated into time-varying hazard and risk, these changes surpass the cost-benefit threshold for short-term evacuation.
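
    b values of this sort are typically estimated with Aki's maximum-likelihood formula; a sketch on synthetic Gutenberg-Richter catalogs (not the L'Aquila data) follows.

    ```python
    import numpy as np

    def b_value(mags: np.ndarray, mc: float, dm: float = 0.0) -> float:
        """Aki (1965) maximum-likelihood b value; for catalogs binned at dm,
        Utsu's correction replaces Mc by Mc - dm/2."""
        m = mags[mags >= mc]
        return np.log10(np.e) / (m.mean() - (mc - dm / 2.0))

    # Synthetic catalogs: a drop in b from ~1.0 to ~0.7 raises the relative
    # likelihood of large events (illustrative only).
    rng = np.random.default_rng(3)
    for b_true in (1.0, 0.7):
        # Gutenberg-Richter magnitudes are exponential above the cutoff Mc.
        mags = 2.0 + rng.exponential(scale=np.log10(np.e) / b_true, size=5000)
        print(f"true b={b_true:.1f} -> estimated b={b_value(mags, mc=2.0):.2f}")
    ```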

  6. PBPK-Based Probabilistic Risk Assessment for Total Chlorotriazines in Drinking Water.

    Science.gov (United States)

    Breckenridge, Charles B; Campbell, Jerry L; Clewell, Harvey J; Andersen, Melvin E; Valdez-Flores, Ciriaco; Sielken, Robert L

    2016-04-01

    The risk of human exposure to total chlorotriazines (TCT) in drinking water was evaluated using a physiologically based pharmacokinetic (PBPK) model. Daily TCT (atrazine, deethylatrazine, deisopropylatrazine, and diaminochlorotriazine) chemographs were constructed for 17 frequently monitored community water systems (CWSs) using linear interpolation and kriging estimates between observed TCT values. Synthetic chemographs were created using a conservative bias factor of 3 to generate intervening peaks between measured values. Drinking water consumption records from 24-h diaries were used to calculate daily exposure. Plasma TCT concentrations were updated every 30 minutes using the PBPK model output for each simulated calendar year from 2006 to 2010. Margins of exposure (MOEs) were calculated (MOE = [Human Plasma TCT_POD] ÷ [Human Plasma TCT_EXP]) based on the toxicological point of departure (POD) and the drinking water-derived exposure to TCT. MOEs were determined based on 1, 2, 3, 4, 7, 14, 28, or 90 days of rolling average exposures and plasma TCT Cmax, or the area under the curve (AUC). Distributions of MOE were determined and the 99.9th percentile was used for risk assessment. MOEs for all 17 CWSs were >1000 at the 99.9th percentile. The 99.9th percentile of the MOE distribution was 2.8-fold lower when the 3-fold synthetic chemograph bias factor was used. MOEs were insensitive to the interpolation method, the consumer's age, the water consumption database used and the duration of time over which the rolling average plasma TCT was calculated, for up to 90 days. MOEs were sensitive to factors that modified the toxicological point of departure, the no-observed-effect level (NOEL), including rat strain, endpoint used, method of calculating the NOEL, and the pharmacokinetics of elimination, as well as the magnitude of exposure (CWS, calendar year, and use of bias factors).
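
    The MOE metric reduces to a simple ratio evaluated at an extreme percentile; the sketch below assumes a placeholder exposure distribution and POD, not the study's values.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)

    # Hypothetical distribution of rolling-average plasma TCT exposure (ug/L);
    # the POD is likewise a placeholder.
    plasma_exposure = rng.lognormal(mean=np.log(0.002), sigma=1.0, size=200_000)
    plasma_pod = 100.0

    moe = plasma_pod / plasma_exposure

    # Risk metric used in the study: the 99.9th percentile consumer, i.e. the
    # 0.1th percentile of the MOE distribution; acceptable if > 1000.
    moe_999 = np.percentile(moe, 0.1)
    print(f"MOE at 99.9th percentile exposure: {moe_999:.0f}")
    ```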

  7. PROBABILIST ANTIREALISM

    NARCIS (Netherlands)

    Douven, Igor; Horsten, Leon; Romeijn, Jan-Willem

    2010-01-01

    Until now, antirealists have offered sketches of a theory of truth, at best. In this paper, we present a probabilist account of antirealist truth in some formal detail, and we assess its ability to deal with the problems that are standardly taken to beset antirealism.

  8. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...

  9. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  10. Probabilistic Risk Assessment of Cask Drop Accident during On-site Spent Nuclear Fuel Transportation

    Energy Technology Data Exchange (ETDEWEB)

    Ham, Jae Hyun; Christian, Robby; Momani, Belal Al; Kang, Hyun Gook [KAERI, Daejeon (Korea, Republic of)]

    2016-05-15

    There are two ways to transfer SNF from one site to another: land transportation and maritime transportation. Maritime transportation might be used because it follows a safer route, far from populated areas. The whole transportation process can be divided into two parts: transferring the SNF between the SNP and the wharf on the Nuclear Power Plant (NPP) site by truck, and transferring the SNF from that wharf to another wharf by ship. In this research, on-site SNF transportation between the SNP and the wharf was considered. Two kinds of single accidents, impact and fire, can occur during this type of SNF transportation, caused by internal and external events. In this research, a PRA of the cask drop accident during on-site SNF transportation was performed, and the risk to a person (mSv/person) was calculated for a case with specific conditions. In all 11 FEM drop simulation cases, the FDR is 1 even though the fuel assemblies are located inside the cask. This is considerably larger for all cases than the results for similar drop conditions in reports covering PRA of cask storage systems, because, unlike previous reports, the subsequent impact was considered here. As shown in Figure 8, the accelerations used to calculate the FDR reach much higher values in the subsequent impact than in the first impact for all SNF assemblies.

  11. Probabilistic risk assessment of the Space Shuttle. Phase 3: A study of the potential of losing the vehicle during nominal operation. Volume 2: Integrated loss of vehicle model

    Science.gov (United States)

    Fragola, Joseph R.; Maggio, Gaspare; Frank, Michael V.; Gerez, Luis; Mcfadden, Richard H.; Collins, Erin P.; Ballesio, Jorge; Appignani, Peter L.; Karns, James J.

    1995-01-01

    The application of the probabilistic risk assessment methodology to the Space Shuttle environment, particularly to the potential of losing the Shuttle during nominal operation, is addressed. The different related concerns are identified and combined to determine overall program risks. A fault tree model is used to allocate system probabilities to the subsystem level. The loss of the vehicle due to failure to contain energetic gas and debris or to maintain proper propulsion and configuration is analyzed, along with losses due to Orbiter failure, external tank failure, and landing failure or error.
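
    A fault tree of the kind described reduces, once minimal cut sets are known, to a union-probability calculation over basic events; the sketch below uses hypothetical basic-event probabilities and assumes independence, and is not the Shuttle study's model.

    ```python
    from itertools import combinations

    # Hypothetical, independent basic-event probabilities for single-event
    # minimal cut sets feeding an OR gate at the top event.
    p = {"debris_containment": 1e-4, "propulsion": 5e-4,
         "configuration": 2e-4, "landing_error": 3e-4}

    events = list(p.values())
    rare_event = sum(events)  # rare-event approximation: simple sum

    # Exact union probability via inclusion-exclusion.
    exact = 0.0
    for r in range(1, len(events) + 1):
        for combo in combinations(events, r):
            term = 1.0
            for q in combo:
                term *= q
            exact += (-1) ** (r + 1) * term

    print(f"rare-event approximation   : {rare_event:.6e}")
    print(f"exact (inclusion-exclusion): {exact:.6e}")
    ```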

  12. Health status of adults with Short Stature: A comparison with the normal population and one well-known chronic disease (Rheumatoid Arthritis

    Directory of Open Access Journals (Sweden)

    Naess Eva E

    2007-02-01

    Abstract Background: To examine the subjective health status of adults with short stature (ShSt) and compare it with the general population (GP) and one well-known chronic disease, rheumatoid arthritis (RA). In addition, to explore the association between age, gender, height, educational level and different aspects of health status of adults with short stature. Methods: A questionnaire was mailed to 72 subjects with short stature registered in the database of a Norwegian resource centre for rare disorders; response rate 61% (n = 44, age 16-61). Health status was assessed with SF-36 version 2. Comparisons were made with age- and gender-matched samples from the general population in Norway (n = 264) and from subjects with RA (n = 88). Results: The ShSt sample reported statistically significantly impaired health status on all SF-36 subscales compared with the GP sample, most markedly in physical functioning, Mean Difference (MD) 34 (95% Confidence Interval (CI) 25-44). The ShSt sample reported poorer health status in mental health, MD 11 (95% CI 4-18), and social functioning, MD 11 (95% CI 2-20), but better in role physical, MD 13 (95% CI 1-25), than the RA sample. On the other subscales there were minor differences between the ShSt and the RA samples. Within the short stature group there was a significant association between age and all SF-36 physical subscales; height was significantly associated with physical functioning, while level of education was significantly associated with mental health. Conclusion: People with short stature reported impaired health status on all SF-36 subscales, indicating that they have health problems that influence their daily living. Health status seems to decline with increasing age, and earlier than in the general population.

  13. Inaccuracies in the history of a well-known introduction: a case study of the Australian House Sparrow (Passer domesticus)

    Institute of Scientific and Technical Information of China (English)

    Samuel C.Andrew; Simon C.Griffith

    2016-01-01

    Background: Modern ecosystems contain many invasive species as a result of the activity of acclimatisation societies that operated in the second half of the nineteenth century, and these species provide good opportunities for studying invasion biology. However, to gain insight into the ecological and genetic mechanisms that determine the rate of colonization and adaptation to new environments, we need a good understanding of the history of the introduced species, and a knowledge of the source population, timing, and number of individuals introduced is particularly important. Any inaccuracies in the history of an introduction will affect subsequent assumptions and conclusions. Methods: Focusing on a single well-known species, the House Sparrow (Passer domesticus), we have documented the introduction into Australia using primary sources (e.g. acclimatisation records and newspaper articles). Results: Our revised history differs in a number of significant ways from previous accounts. Our evidence indicates that the House Sparrow was not solely introduced from source populations in England but also from Germany and, most strikingly, also from India, with the latter birds belonging to a different race. We also clarify the distinction between the number released and the number of founders, due to pre-release captive breeding programs, as well as identifying inaccuracies in a couple of well-cited sources with respect to the range expansion of the introduced populations. Conclusions: Our work suggests that caution is required for those studying introductions using the key sources of historical information, and that researchers should ideally review original sources of information to verify the accuracy of published accounts.

  14. A workflow for in silico design of hIL-10 and ebvIL-10 inhibitors using well-known miniprotein scaffolds.

    Science.gov (United States)

    Dueñas, Salvador; Aguila, Sergio A; Pimienta, Genaro

    2017-04-01

    The over-expression of immune suppressors such as IL-10 is a crucial landmark in both tumor progression and latent viral and parasite infection. IL-10 is a multifunctional protein. Besides its immune-cell suppressive function, it also promotes B-cell tumorigenesis in lymphomas and melanoma. Human pathogens such as unicellular parasites and viruses that remain latent inside B cells promote the over-expression of hIL-10 upon infection, which inhibits cell-mediated immune surveillance and at the same time mediates B-cell proliferation. The B-cell-specific oncogenic latent virus Epstein-Barr virus (EBV) encodes a viral homologue of hIL-10 (ebvIL-10), expressed during lytic viral proliferation. Once expressed, ebvIL-10 inhibits cell-mediated immune surveillance, assuring EBV re-infection. During long-term latency, EBV-infected B cells over-express hIL-10 to assure B-cell proliferation, occasionally inducing EBV-mediated lymphomas. The amino acid sequences of hIL-10 and ebvIL-10 are more than 80% identical and thus have very similar tridimensional structures. Based on their published crystallographic structures bound to their human receptor IL10R1, we report a structure-based design of hIL-10 and ebvIL-10 inhibitors based on 3 loops from IL10R1 that establish specific hydrogen bonds with the two IL10s. We have grafted these loops onto a permissible loop in three well-known miniprotein scaffolds: the Conus snail toxin MVIIA, the plant-derived trypsin inhibitor EETI, and the human appetite modulator AgRP. Our computational workflow, described in detail below, was strengthened by the negative and positive controls implemented, and therefore paves the way for future in vitro and in vivo validation assays of the IL-10 inhibitors engineered.

  15. Determining the Area of Review (AoR) in Carbon Capture and Storage: A tiered, probabilistic methodology to generate risk map

    Science.gov (United States)

    Cihan, A.; Siirila-Woodburn, E. R.; Birkholzer, J. T.

    2015-12-01

    The effects and related risks to potable aquifers from pressure increases and brine leakage through abandoned wells are a poorly understood phenomenon and a potentially significant contributor to the risk profile in geologic Carbon Capture and Storage. Numerical models are used to investigate the evolution of brine leakage (during and post-injection) through wells located in the region where leakage through plugged and abandoned (P&A) wellbores could occur. This area, termed tier 3, builds on a 3-tier methodology to define the Area of Review (AoR) proposed by Birkholzer et al. (2013). This work, in conjunction with a quantitative assessment of the tier 1 AoR (an area encompassing the CO2 plume) and the tier 2 AoR (an area encompassing the extent where open-wellbore brine leakage could occur), will lead to a quantitative understanding of potential risks and a metric for the complete spatial extent of environmental risk in Carbon Capture and Storage. Here, we develop a probabilistic methodology to generate "risk maps" related to the tier 3 AoR. The risk maps are based on the premise that the two greatest sources of uncertainty in P&A leakage are (1) the location of the unknown well with respect to the injection well and (2) the permeability of the leaky P&A well (which can span several orders of magnitude). The methodology utilizes numerical simulations and probability theory to generate spatial distributions of risk, defined with no-impact or MCL thresholds. Probabilistic risk maps can be used to provide risk-based descriptions of the AoR to inform site selection and monitoring during and post-injection.

  16. A probabilistic risk assessment for the vulnerability of the European carbon cycle to extreme events: the ecosystem perspective

    Directory of Open Access Journals (Sweden)

    S. Rolinski

    2014-06-01

    Extreme meteorological events are likely to occur more often with climate change, leading to a further acceleration of climate change through potentially devastating effects on terrestrial ecosystems. But not all extreme meteorological events lead to extreme ecosystem responses. Unlike most current studies, we therefore focus on pre-defined hazardous ecosystem behaviour and the identification of coinciding meteorological conditions, instead of the expected ecosystem damage for a pre-defined meteorological event. We use a simple probabilistic risk assessment based on time series of ecosystem behaviour and meteorological conditions. Following risk assessment terminology, vulnerability and risk for the previously defined hazard are thus estimated on the basis of observed hazardous ecosystem behaviour. We first adapt this generic approach to extreme responses of terrestrial ecosystems to drought and high temperatures, defining the hazard as a negative net biome productivity over a 12-month period. Further, we show an instructive application for two selected sites using data for 1981-2010, and then apply the method at the pan-European scale, addressing the 1981-2010 period and future projections for 2071-2100, both based on numerical modelling results (LPJmL for ecosystem behaviour; REMO-SRES A1B for climate). Our site-specific results demonstrate the applicability of the proposed method, using the SPEI index to describe the meteorological condition. They also provide examples of their interpretation: vulnerability to drought for Spain, with the expected value of the SPEI being 0.4 lower for hazardous than for non-hazardous ecosystem behaviour, and non-vulnerability for northern Germany, where the expected drought index value for hazard observations relates to wetter conditions than for the non-hazard observations. The pan-European assessment shows that significant results could be obtained for large areas within Europe. For 2071

  17. Evaluation of anionic surfactant concentrations in US effluents and probabilistic determination of their combined ecological risk in mixing zones.

    Science.gov (United States)

    McDonough, Kathleen; Casteel, Kenneth; Itrich, Nina; Menzies, Jennifer; Belanger, Scott; Wehmeyer, Kenneth; Federle, Thomas

    2016-12-01

    Alcohol sulfates (AS), alcohol ethoxysulfates (AES), linear alkyl benzenesulfonates (LAS) and methyl ester sulfonates (MES) are anionic surfactants that are widely used in household detergents and consumer products, resulting in over 1 million tons being disposed of down the drain annually in the US. A monitoring campaign was conducted which collected grab effluent samples from 44 wastewater treatment plants (WWTPs) across the US to generate statistical distributions of effluent concentrations for anionic surfactants. The mean concentrations for AS, AES, LAS and MES were 5.03±4.5, 1.95±0.7, 15.3±19, and 0.35±0.13 μg/L, respectively. Since each of these surfactants consists of multiple homologues that differ in their toxicity, the concentration of each homologue measured in an effluent sample was converted into a toxic unit (TU) by normalizing to the predicted no-effect concentration (PNEC) derived from high-tier effects data (mesocosm studies). The statistical distributions of the combined TUs in the effluents were used in combination with distributions of dilution factors for WWTP mixing zones to conduct a US-wide probabilistic risk assessment for the aquatic environment for each of the surfactants. The 90th percentile levels of TUs for AS, AES, LAS and MES in mixing zones were 1.89×10⁻², 2.73×10⁻³, 2.72×10⁻², and 3.65×10⁻⁵ under 7Q10 (lowest river flow occurring over a 7-day period every 10 years) low flow conditions. Because these surfactants have the same toxicological mode of action, the TUs were summed and the aquatic safety for anionic surfactants as a whole was assessed. At the 90th percentile level under the conservative 7Q10 low flow conditions the forecasted TUs were 4.21×10⁻², which indicates that there is a significant margin of safety for the class of anionic surfactants in US aquatic environments.
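
    The toxic-unit aggregation described above is a short calculation; the concentrations, PNECs and dilution factor below are illustrative stand-ins, not the monitoring data.

    ```python
    # Toxic-unit (TU) sketch for one effluent sample: each homologue
    # concentration is normalized by its PNEC and the TUs are summed
    # (valid because the surfactants share a mode of action).
    sample_ug_per_l = {"C12-LAS": 9.0, "C14-AS": 3.0, "C12-AES-2EO": 1.2}
    pnec_ug_per_l = {"C12-LAS": 270.0, "C14-AS": 110.0, "C12-AES-2EO": 400.0}

    tu_effluent = sum(c / pnec_ug_per_l[h] for h, c in sample_ug_per_l.items())

    dilution_factor = 10.0  # 7Q10 mixing-zone dilution (assumed)
    tu_mixing_zone = tu_effluent / dilution_factor
    print(f"effluent TU = {tu_effluent:.3e}, mixing-zone TU = {tu_mixing_zone:.3e}")
    ```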

  18. Marketing Strategy Analysis of 100 Well-known Drug Brands; 100个知名药品品牌市场营销战略分析

    Institute of Scientific and Technical Information of China (English)

    宿凌; 张灵幸; 黄文龙

    2009-01-01

    OBJECTIVE: To provide a reference for the selection of marketing strategies for post-marketing drugs. METHODS: The χ² test was employed to analyze the number of 4 marketing strategies (flank attack strategy, guerrilla strategy, defensive strategy and attack strategy) used for the 100 well-known drug brands (74 domestic vs. 26 foreign); the regular marketing strategies of domestic vs. foreign drug enterprises and of OTC vs. prescription drugs were also analyzed. RESULTS & CONCLUSIONS: There were significant differences between domestic and foreign pharmaceutical enterprises in the use of the guerrilla strategy and the attack strategy: 20 of the domestic brands (27.03%) used the guerrilla strategy vs. none of the foreign brands (0.00%), while the foreign enterprises favored the attack strategy more than the domestic ones (14 foreign brands (53.85%) vs. 13 domestic brands (17.57%)). There were no significant differences between domestic and foreign brands in the application of the flank attack strategy and the defensive strategy; the same is true between OTC and prescription drugs for the four strategies mentioned above.

  19. Use of the Safety probabilistic analysis for the risk monitor before maintenance; Uso del Analisis probabilistico de seguridad para el monitor de riesgo antes de mantenimiento

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez C, M. [Emersis S.A. de C.V., Tabachines 9-bis, 62589 Temixco, Morelos (Mexico)]. e-mail: cuesta@emersis.com

    2004-07-01

    This work presents the use of the Probabilistic Safety Analysis (PSA) of the Laguna Verde plant to quantify risk before maintenance. Beginning with a description of the nature of the Maintenance Rule and its risk evaluations, the role of the PSA for that purpose is discussed, and a systematic way of establishing the scope of this use of the model is outlined. The work provides some technical details of the methods for implementing the PSA as a risk monitor, including the way systems, trains and components are presented to the user, as well as adjustments to the models and improvements to the platform used. Some of the measures taken to preserve the approved base model, to facilitate periodic updating, and to achieve execution times acceptable for efficient use are also covered. (Author)

  20. Using probabilistic terrorism risk modeling for regulatory benefit-cost analysis: application to the Western hemisphere travel initiative in the land environment.

    Science.gov (United States)

    Willis, Henry H; LaTourrette, Tom

    2008-04-01

    This article presents a framework for using probabilistic terrorism risk modeling in regulatory analysis. We demonstrate the framework with an example application involving a regulation under consideration, the Western Hemisphere Travel Initiative for the Land Environment (WHTI-L). First, we estimate annualized loss from terrorist attacks with the Risk Management Solutions (RMS) Probabilistic Terrorism Model. We then estimate the critical risk reduction, which is the risk-reducing effectiveness of WHTI-L needed for its benefit, in terms of reduced terrorism loss in the United States, to exceed its cost. Our analysis indicates that the critical risk reduction depends strongly not only on uncertainties in the terrorism risk level, but also on uncertainty in the cost of regulation and how casualties are monetized. For a terrorism risk level based on the RMS standard risk estimate, the baseline regulatory cost estimate for WHTI-L, and a range of casualty cost estimates based on the willingness-to-pay approach, our estimate for the expected annualized loss from terrorism ranges from $2.7 billion to $5.2 billion. For this range in annualized loss, the critical risk reduction for WHTI-L ranges from 7% to 13%. Basing results on a lower risk level that results in halving the annualized terrorism loss would double the critical risk reduction (14-26%), and basing the results on a higher risk level that results in a doubling of the annualized terrorism loss would cut the critical risk reduction in half (3.5-6.6%). Ideally, decisions about terrorism security regulations and policies would be informed by true benefit-cost analyses in which the estimated benefits are compared to costs. Such analyses for terrorism security efforts face substantial impediments stemming from the great uncertainty in the terrorist threat and the very low recurrence interval for large attacks. Several approaches can be used to estimate how a terrorism security program or regulation reduces the
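
    The critical risk reduction is simply the ratio of annualized regulatory cost to annualized terrorism loss. The sketch below back-calculates it from the loss range quoted above and an assumed annual cost chosen only to be roughly consistent with the reported 7-13% range; it is not a figure quoted in the article.

    ```python
    # Critical risk reduction: the fractional reduction in annualized terrorism
    # loss at which a regulation's benefit equals its cost.
    annualized_loss = {"low": 2.7e9, "high": 5.2e9}  # USD/yr, from the abstract
    annual_cost = 0.35e9  # USD/yr (assumed for illustration)

    for label, loss in annualized_loss.items():
        critical = annual_cost / loss
        print(f"{label} loss estimate: critical risk reduction = {critical:.1%}")
    ```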

  1. A tiered approach for probabilistic ecological risk assessment of contaminated sites; Un approccio multilivello per l'analisi probabilistica di rischio ecologico di siti contaminati

    Energy Technology Data Exchange (ETDEWEB)

    Zolezzi, M. [Fisia Italimpianti SpA, Genova (Italy); Nicolella, C. [Pisa Univ., Pisa (Italy). Dipartimento di ingegneria chimica, chimica industriale e scienza dei materiali; Tarazona, J.V. [Instituto Nacional de Investigacion y Tecnologia Agraria y Alimentaria, Madrid (Spain). Departamento de Medio Ambiente, Laboratorio de toxicologia

    2005-09-15

    This paper presents a tiered methodology for probabilistic ecological risk assessment. The proposed approach starts from a deterministic comparison (ratio) of a single exposure concentration and a threshold or safe level calculated from a dose-response relationship, goes through the comparison of probabilistic distributions that describe the exposure values and the toxicological responses of organisms to the chemical of concern, and finally determines the so-called distribution-based quotients (DBQs). To illustrate the proposed approach, soil concentrations of 1,2,4-trichlorobenzene (1,2,4-TCB) measured in an industrial contaminated site were used for site-specific probabilistic ecological risk assessment. By using probabilistic distributions, the risk, which exceeds the level of concern for soil organisms under the deterministic approach, is shown to be associated with the presence of hot spots reaching concentrations able to acutely affect more than 50% of the soil species, while the large majority of the area presents 1,2,4-TCB concentrations below those reported as toxic.

  2. Probabilistic Physics-Based Risk Tools Used to Analyze the International Space Station Electrical Power System Output

    Science.gov (United States)

    Patel, Bhogila M.; Hoge, Peter A.; Nagpal, Vinod K.; Hojnicki, Jeffrey S.; Rusick, Jeffrey J.

    2004-01-01

    This paper describes the methods employed to apply probabilistic modeling techniques to the International Space Station (ISS) power system. These techniques were used to quantify the probabilistic variation in the power output, also called the response variable, due to variations (uncertainties) associated with knowledge of the influencing factors called the random variables. These uncertainties can be due to unknown environmental conditions, variation in the performance of electrical power system components or sensor tolerances. Uncertainties in these variables, cause corresponding variations in the power output, but the magnitude of that effect varies with the ISS operating conditions, e.g. whether or not the solar panels are actively tracking the sun. Therefore, it is important to quantify the influence of these uncertainties on the power output for optimizing the power available for experiments.

  3. Regional probabilistic risk assessment of heavy metals in different environmental media and land uses: An urbanization-affected drinking water supply area

    Science.gov (United States)

    Peng, Chi; Cai, Yimin; Wang, Tieyu; Xiao, Rongbo; Chen, Weiping

    2016-11-01

    In this study, we proposed a Regional Probabilistic Risk Assessment (RPRA) to estimate the health risks to residents exposed to heavy metals in different environmental media and land uses. The means and ranges of heavy metal concentrations were measured in water, sediments, soil profiles and surface soils under four land uses along the Shunde Waterway, a drinking water supply area in China. Hazard quotients (HQs) were estimated for various exposure routes and heavy metal species. Riverbank vegetable plots and private vegetable plots had 95th percentiles of total HQs greater than 3 and 1, respectively, indicating high risks from cultivation on the flooded riverbank. Vegetable uptake and leaching to groundwater were the two transfer routes of soil metals causing high health risks. Exposure during outdoor recreation, farming and swimming along the Shunde Waterway is theoretically safe. Arsenic and cadmium were identified as the priority pollutants that contribute the most risk among the heavy metals. Sensitivity analysis showed that the exposure route, variations in exposure parameters, the mobility of heavy metals in soil, and metal concentrations all influenced the risk estimates.

  4. Probabilistic-Numerical assessment of pyroclastic current hazard at Campi Flegrei and Naples city: Multi-VEI scenarios as a tool for full-scale risk management

    CERN Document Server

    Mastrolorenzo, Giuseppe; Pappalardo, Lucia; Rossano, Sergio

    2016-01-01

    The Campi Flegrei volcanic field (Italy) poses very high risk to the highly urbanized Neapolitan area. Its eruptive history was dominated by explosive activity producing pyroclastic density currents (PDCs) ranging in scale from localized base surges to regional flows. Here we apply probabilistic numerical simulation approaches to produce PDC hazard maps, based on a comprehensive spectrum of flow properties and vent locations. These maps provide all probable Volcanic Explosivity Index (VEI) scenarios from different source vents in the caldera, relevant for risk management planning. For each VEI scenario, we report the conditional probability for PDCs (i.e., the probability for a given area to be affected by the passage of PDCs) and the related dynamic pressure. Model results indicate that PDCs from VEI<4 events would be confined within the Campi Flegrei caldera, PDC propagation being impeded by the northern and eastern caldera walls. Conversely, PDCs from VEI 4-5 events could invade a wide...

  5. Probabilistic Unawareness

    Directory of Open Access Journals (Sweden)

    Mikaël Cozic

    2016-11-01

    The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness) theorem that reunites the two strands—modal and probabilistic—of doxastic logic.

  6. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings as well as the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
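
    The linearity exploited by the Green's function summation can be shown in a few lines. The sketch below is a toy illustration under assumed shapes (random stand-in waveforms, a hypothetical synthesize helper), not the authors' code: the waveform at a coastal point for an arbitrary slip distribution is simply the slip-weighted sum of precomputed unit-slip subfault waveforms.

        import numpy as np

        # Hypothetical precomputed tsunami Green's functions: one unit-slip
        # waveform per subfault at a single coastal point.
        n_subfaults, n_times = 12, 2048
        greens = np.random.default_rng(1).standard_normal((n_subfaults, n_times))

        def synthesize(slip):
            """Waveform for an arbitrary slip distribution: the slip-weighted
            sum of the per-subfault unit waveforms (Green's function summation)."""
            return slip @ greens  # (n_subfaults,) @ (n_subfaults, n_times)

        slip = np.full(n_subfaults, 2.5)   # uniform 2.5 m slip scenario
        waveform = synthesize(slip)
        print(f"peak modeled amplitude: {np.abs(waveform).max():.2f} (toy units)")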

  7. A probabilistic risk-based approach for spinning reserve provision using day-ahead demand response program

    Energy Technology Data Exchange (ETDEWEB)

    Shayesteh, E. [Islamic Azad University, Garmsar Branch, Garmsar (Iran); Yousefi, A.; Parsa Moghaddam, M. [Department of Electrical Engineering, Tarbiat Modares University (TMU), Tehran (Iran)

    2010-05-15

    Spinning reserve is one of the ancillary services essential to satisfying system security constraints when the power system faces a contingency. In this paper, the Day Ahead Demand Response Program, one of the incentive-based Demand Response programs, is implemented as a source of spinning reserve. In this regard, a certain number of demands is selected according to a sensitivity analysis and simulated as virtual generation units. The reserve market is cleared for spinning reserve allocation using a probabilistic technique. A comparison is performed between the absence and presence of the Day Ahead Demand Response Program from both economic and reliability viewpoints. Numerical studies based on the IEEE 57-bus test system are conducted to evaluate the proposed method. (author)

  8. Mosquito control insecticides: a probabilistic ecological risk assessment on drift exposures of naled, dichlorvos (naled metabolite) and permethrin to adult butterflies.

    Science.gov (United States)

    Hoang, T C; Rand, G M

    2015-01-01

    A comprehensive probabilistic terrestrial ecological risk assessment (ERA) was conducted to characterize the potential risk of mosquito control insecticide usage (naled, its metabolite dichlorvos, and permethrin) to adult butterflies in south Florida by comparing the probability distributions of environmental exposure concentrations, measured following actual mosquito control applications at labeled rates in ten field monitoring studies, with the probability distributions of butterfly species response (effects) data from our laboratory acute toxicity studies. The overlap of these distributions was used as a measure of risk to butterflies. The long-term viability (survival) of adult butterflies following topical (thorax/wings) exposures was the environmental value we wanted to protect. Laboratory acute toxicity studies (24-h LD50) included topical exposures (thorax and wings) of five adult butterfly species and preparation of species sensitivity distributions (SSDs). The ERA indicated that the assessment endpoint of protection of at least 90% of the species, 90% of the time (i.e., the 10th percentile from the acute SSDs) from acute naled and permethrin exposures is most likely not being met when considering topical exposures to adults. Although the surface areas for adulticide exposures are greater for the wings, exposures to the thorax provide the highest potential for risk (i.e., the SSD 10th percentile is lowest) for adult butterflies. Dichlorvos appeared to present no risk. The results of this ERA can be applied to other areas of the world where these insecticides are used and where butterflies may be exposed. Since there are other sources (e.g., agriculture) of pesticides in the environment where butterfly exposures will occur, the ERA may underestimate the potential risks under real-world conditions. Copyright © 2014 Elsevier B.V. All rights reserved.
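
    The core computation (fitting an SSD to the acute LD50s, taking its 10th percentile, and asking how often the exposure distribution exceeds it) can be sketched as follows; the LD50s and exposure parameters below are placeholders, not the study's data.

        import numpy as np
        from scipy import stats

        # Placeholder 24-h topical LD50s for five butterfly species.
        ld50 = np.array([0.05, 0.12, 0.30, 0.45, 0.90])  # [ug/butterfly]
        ssd = stats.lognorm(*stats.lognorm.fit(ld50, floc=0))
        hc10 = ssd.ppf(0.10)       # dose hazardous to the 10th-percentile species

        # Placeholder field exposure distribution (topical dose per butterfly).
        exposure = stats.lognorm(s=1.0, scale=0.08)
        risk = exposure.sf(hc10)   # P(exposure exceeds HC10): distribution overlap
        print(f"HC10 = {hc10:.3f} ug; P(exposure > HC10) = {risk:.2%}")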

  9. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  10. Probabilistic Concurrent Kleene Algebra

    Directory of Open Access Journals (Sweden)

    Annabelle McIver

    2013-06-01

    We provide an extension of concurrent Kleene algebras to account for probabilistic properties. The algebra yields a unified framework containing nondeterminism, concurrency and probability and is sound with respect to the set of probabilistic automata modulo probabilistic simulation. We use the resulting algebra to generalise the algebraic formulation of a variant of Jones' rely/guarantee calculus.

  11. Probabilistic-Multiobjective Comparison of User-Defined Operating Rules. Case Study: Hydropower Dam in Spain

    Directory of Open Access Journals (Sweden)

    Paola Bianucci

    2015-03-01

    A useful tool is proposed in this paper to assist dam managers in comparing and selecting suitable operating rules. The procedure is based on well-known multiobjective and probabilistic methodologies, which were jointly applied here to assess and compare flood control strategies in hydropower reservoirs. It consists of evaluating the operating rules' performance using a simulation fed by a representative and sufficiently large flood event series. These flood events were obtained from a synthetic rainfall series stochastically generated using the RainSimV3 model coupled with a deterministic hydrological model. The performance of the assessed strategies was characterized using probabilistic variables. Finally, evaluation and comparison were conducted by analyzing objective functions which synthesize different aspects of the rules' performance. These objectives were probabilistically defined in terms of risk and expected values. To assess the applicability and flexibility of the tool, it was implemented for a hydropower dam located in Galicia (Northern Spain). This procedure allowed alternative operating rules to be derived which provided a reasonable trade-off between dam safety, flood control, operability and energy production.

  12. A probabilistic model for simultaneous exposure to multiple compounds from food and its use for risk-benefit assessment

    NARCIS (Netherlands)

    Voet, van der H.; Mul, de A.; Klaveren, van J.D.

    2007-01-01

    A model is presented which makes it possible to quantify the simultaneous distribution of the exposure to two compounds, for example a health-risk and a health-promoting compound. The model considers the total dietary intake, and can be used as a first step to study the effects on the balance between risks and

  13. Impact of droughts on the C-cycle in European vegetation: a probabilistic risk analysis using six vegetation models

    Directory of Open Access Journals (Sweden)

    M. Van Oijen

    2014-06-01

    We analyse how climate change may alter risks posed by droughts to carbon fluxes in European ecosystems. The approach follows a recently proposed framework for risk analysis based on probability theory. In this approach, risk is quantified as the product of hazard probability and ecosystem vulnerability. The probability of a drought hazard is calculated here from the Standardised Precipitation Evapotranspiration Index. Vulnerability is calculated from the response to drought simulated by process-based vegetation models. Here we use six different models: three for generic vegetation (JSBACH, LPJmL, ORCHIDEE) and three for specific ecosystems (Scots pine forests: BASFOR; winter wheat fields: EPIC; grasslands: PASIM). The periods 1971–2000 and 2071–2100 are compared. Climate data are based on observations and on output from the regional climate model REMO using the SRES A1B scenario. The risk analysis is carried out for ∼22 000 grid cells of 0.25° × 0.25° across Europe. For each grid cell, drought vulnerability and risk are quantified for five seasonal variables: net primary and ecosystem productivity (NPP, NEP), heterotrophic respiration (RH), soil water content and evapotranspiration. Climate change is expected to lead to increased drought risks to net primary productivity in the Mediterranean area: five of the models estimate that risk will exceed 15%. The risks will increase mainly because of greater drought probability; ecosystem vulnerability will increase to a lesser extent. Because NPP will be affected more than RH, future C-sequestration (NEP) will also be at risk predominantly in southern Europe, with risks exceeding 0.25 g C m−2 d−1 according to most models, amounting to reductions in carbon sequestration of 20 to 80%.
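
    The risk decomposition used here is simple to state: risk equals hazard probability times vulnerability, with vulnerability defined as the expected difference in the flux between non-hazard and hazard conditions. A minimal sketch for one grid cell, using invented numbers rather than model output:

        import numpy as np

        # Yearly NPP for one grid cell [g C m-2 d-1] and a drought flag
        # per year (illustrative values only).
        npp = np.array([5.1, 4.0, 5.3, 2.9, 5.0, 3.1, 5.2, 2.7])
        drought = np.array([0, 1, 0, 1, 0, 1, 0, 1], dtype=bool)

        p_hazard = drought.mean()                        # drought probability
        vulnerability = npp[~drought].mean() - npp[drought].mean()
        risk = p_hazard * vulnerability                  # expected NPP loss
        print(f"p = {p_hazard:.2f}, V = {vulnerability:.2f}, "
              f"risk = {risk:.2f} g C m-2 d-1")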

  14. Quantitative risk assessment relating to adventitious presence of allergens in food: a probabilistic model applied to peanut in chocolate.

    Science.gov (United States)

    Rimbaud, Loup; Heraud, Fanny; La Vieille, Sébastien; Leblanc, Jean-Charles; Crepet, Amélie

    2010-01-01

    Peanut allergy is a public health concern, owing to its high prevalence in France and the severity of the reactions. Despite peanut-containing product avoidance diets, a risk may exist due to the adventitious presence of peanut allergens in a wide range of food products in which peanut is not mentioned in the ingredients list, although precautionary labeling is often present. A method of quantifying the risk of allergic reactions following the consumption of such products is developed, taking the example of peanut in chocolate tablets. The occurrence of adventitious peanut proteins in chocolate and the dose-response relationship are estimated with a Bayesian approach using available published data. The consumption pattern is described by the French individual consumption survey INCA2. Risk simulations are performed using second-order Monte Carlo simulations, which separately propagate variability and uncertainty of the model input variables. Peanut allergens occur in approximately 36% of the chocolates, leading to a mean exposure level of 0.2 mg of peanut proteins per eating occasion. The estimated risk of reaction averages 0.57% per eating occasion for peanut-allergic adults. Ninety-five percent of the risk values lie between 0 and 3.61%, which illustrates the variability of the risk. The uncertainty, represented by the 95% credible intervals, is concentrated around these risk estimates. Results for children are similar. The conclusion is that adventitious peanut allergens induce a risk of reaction for part of the French peanut-allergic population. The method developed can be generalized to assess the risk due to the consumption of any foodstuff potentially contaminated by allergens.
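
    Second-order (two-dimensional) Monte Carlo separates the two kinds of randomness by nesting loops: an outer loop samples the uncertain parameters once per iteration, and an inner loop samples the variability across eating occasions given those parameters. A schematic sketch, with invented distributions standing in for the fitted Bayesian posteriors:

        import numpy as np

        rng = np.random.default_rng(3)
        n_outer, n_inner = 200, 2000   # uncertainty loop / variability loop

        risks = []
        for _ in range(n_outer):
            # Outer loop: sample *uncertain* parameters (illustrative:
            # contamination prevalence, dose-response slope per mg).
            prevalence = rng.beta(36, 64)
            slope = rng.lognormal(-4.0, 0.5)

            # Inner loop: sample *variable* quantities across occasions.
            contaminated = rng.random(n_inner) < prevalence
            dose = rng.lognormal(-2.0, 1.0, n_inner) * contaminated  # mg protein
            risks.append(np.mean(1.0 - np.exp(-slope * dose)))

        print(f"mean risk per occasion: {np.mean(risks):.4%}")
        print(f"95% interval across uncertainty: {np.percentile(risks, [2.5, 97.5])}")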

  15. A SCOPING STUDY: Development of Probabilistic Risk Assessment Models for Reactivity Insertion Accidents During Shutdown In U.S. Commercial Light Water Reactors

    Energy Technology Data Exchange (ETDEWEB)

    S. Khericha

    2011-06-01

    This report documents a scoping study for developing generic simplified fuel damage risk models for quantitative analysis of inadvertent reactivity insertion events during shutdown (SD) in light water pressurized and boiling water reactors. In the past, nuclear fuel reactivity accidents have been analyzed both deterministically and probabilistically for at-power and SD operations of nuclear power plants (NPPs). Since then, many NPPs have had power up-rates and longer refueling intervals, which resulted in fuel configurations that may potentially respond differently (in an undesirable way) to reactivity accidents. Also, as shown in a recent event, inadvertent operator actions can cause a potential nuclear fuel reactivity insertion accident during SD operations. The set of inadvertent operator actions is likely to be plant- and operation-state specific and could lead to accident sequences. This study is an outcome of the concern that arose after the inadvertent withdrawal of control rods at Dresden Unit 3 in 2008, where operator actions in the plant caused three control rods to be withdrawn from the reactor without the knowledge of the main control room operator. The purpose of this Standardized Plant Analysis Risk (SPAR) Model development project is to develop simplified SPAR Models that can be used by staff analysts to perform risk analyses of operating events and/or conditions occurring during SD operation. These types of accident scenarios are dominated by operator actions (e.g., misalignment of valves, failure to follow procedures, and errors of commission). Human error probabilities specific to this model were assessed using the methodology developed for SPAR model human error evaluations. The event trees, fault trees, basic event data and data sources for the model are provided in the report. The end state is defined as the reactor becoming critical. The scoping study includes a brief literature search/review of historical events, developments of

  16. Prediction of transition from ultra-high risk to first-episode psychosis using a probabilistic model combining history, clinical assessment and fatty-acid biomarkers

    Science.gov (United States)

    Clark, S R; Baune, B T; Schubert, K O; Lavoie, S; Smesny, S; Rice, S M; Schäfer, M R; Benninger, F; Feucht, M; Klier, C M; McGorry, P D; Amminger, G P

    2016-01-01

    Current criteria identifying patients with ultra-high risk of psychosis (UHR) have low specificity, and less than one-third of UHR cases experience transition to psychosis within 3 years of initial assessment. We explored whether a Bayesian probabilistic multimodal model, combining baseline historical and clinical risk factors with biomarkers (oxidative stress, cell membrane fatty acids, resting quantitative electroencephalography (qEEG)), could improve this specificity. We analyzed data of a UHR cohort (n=40) with a 1-year transition rate of 28%. Positive and negative likelihood ratios were calculated for predictor variables with statistically significant receiver operating characteristic curves (ROCs), which excluded oxidative stress markers and qEEG parameters as significant predictors of transition. We clustered significant variables into historical (history of drug use), clinical (Positive and Negative Symptoms Scale positive, negative and general scores and Global Assessment of Function) and biomarker (total omega-3, nervonic acid) groups, and calculated the post-test probability of transition for each group and for group combinations using the odds ratio form of Bayes' rule. Combination of the three variable groups vastly improved the specificity of prediction (area under ROC=0.919, sensitivity=72.73%, specificity=96.43%). In this sample, our model identified over 70% of UHR patients who transitioned within 1 year, compared with 28% identified by standard UHR criteria. The model classified 77% of cases as very high or low risk (P>0.9, <0.1) based on history and clinical assessment, suggesting that a staged approach could be most efficient, reserving fatty-acid markers for 23% of cases remaining at intermediate probability following bedside interview. PMID:27648919
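
    The post-test probability calculation described here is the odds-ratio form of Bayes' rule: convert the prior (the baseline transition rate) to odds, multiply by the likelihood ratio of each predictor group, and convert back. A small sketch; the likelihood ratios below are invented for illustration, not the study's fitted values.

        def post_test_probability(prior, likelihood_ratios):
            """Odds-ratio form of Bayes' rule: posterior odds are the prior
            odds times the likelihood ratio of each (assumed independent)
            predictor group."""
            odds = prior / (1.0 - prior)
            for lr in likelihood_ratios:
                odds *= lr
            return odds / (1.0 + odds)

        # Illustrative numbers: 28% baseline transition rate, positive
        # results in the historical, clinical and biomarker groups.
        print(post_test_probability(0.28, [2.0, 3.5, 4.0]))  # -> ~0.916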

  17. Probabilistic quantitative microbial risk assessment model of farmer exposure to Cryptosporidium spp. in irrigation water within Kumasi Metropolis-Ghana

    DEFF Research Database (Denmark)

    Sampson, Angelina; Owusu-Ansah, Emmanuel de-Graft Johnson; Mills-Robertson, Felix C.

    2017-01-01

    Cryptosporidium is a protozoan parasite which can be transmitted via food and water. Some studies have shown irrigation water to be a route of transmission for Cryptosporidium into the food chain; however, little information is known about Cryptosporidium levels in wastewater used for irrigation...... causing gastroenteritis. The results indicate high positive levels of Cryptosporidium in the irrigation water; however, the levels of Cryptosporidium decrease during the rainfall seasons. Risk assessment results show that farmers face a higher risk of being infected by Cryptosporidium due to frequent

  18. Analytical solutions of linked fault tree probabilistic risk assessments using binary decision diagrams with emphasis on nuclear safety applications [Dissertation 17286]

    Energy Technology Data Exchange (ETDEWEB)

    Nusbaumer, O. P. M

    2007-07-01

    This study is concerned with the quantification of Probabilistic Risk Assessment (PRA) using linked Fault Tree (FT) models. Probabilistic Risk Assessment (PRA) of Nuclear Power Plants (NPPs) complements traditional deterministic analysis; it is widely recognized as a comprehensive and structured approach to identify accident scenarios and to derive numerical estimates of the associated risk levels. PRA models as found in the nuclear industry have evolved rapidly. Increasingly, they have been broadly applied to support numerous applications on various operational and regulatory matters. Regulatory bodies in many countries require that a PRA be performed for licensing purposes. PRA has reached the point where it can considerably influence the design and operation of nuclear power plants. However, most of the tools available for quantifying large PRA models are unable to produce analytically correct results. The algorithms of such quantifiers are designed to neglect sequences when their likelihood decreases below a predefined cutoff limit. In addition, the rare event approximation (e.g. de Moivre's equation) is typically implemented to the first order, ignoring the success paths and the possibility that two or more events can occur simultaneously. This is only justified in assessments where the probabilities of the basic events are low. When the events in question are failures, the first-order rare event approximation is always conservative, resulting in incorrect interpretation of risk importance measures. Advanced NPP PRA models typically include human errors, common cause failure groups, seismic and phenomenological basic events, where the failure probabilities may approach unity, leading to questionable results. It is accepted that current quantification tools have reached their limits, and that new quantification techniques should be investigated. A novel approach using the mathematical concept of Binary Decision Diagram (BDD) is proposed to overcome these
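
    The failure mode of the first-order rare event approximation is easy to demonstrate. The sketch below contrasts exact OR-gate quantification over independent basic events (the result a BDD's Shannon decomposition delivers) with the first-order sum; it illustrates the point only and is not the dissertation's algorithm.

        # Exact gate quantification vs. first-order rare event approximation.
        def exact_or(probs):
            """P(at least one event), independent events: 1 - prod(1 - p)."""
            q = 1.0
            for p in probs:
                q *= (1.0 - p)
            return 1.0 - q

        def rare_event_or(probs):
            """First-order approximation: simple sum of probabilities."""
            return sum(probs)

        low = [1e-4, 2e-4, 5e-5]
        high = [0.9, 0.8]   # e.g. human errors or seismic basic events
        print(exact_or(low), rare_event_or(low))    # nearly identical
        print(exact_or(high), rare_event_or(high))  # 0.98 vs 1.7 (nonsensical)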

  19. Probabilistic Algorithms in Robotics

    OpenAIRE

    Thrun, Sebastian

    2000-01-01

    This article describes a methodology for programming robots known as probabilistic robotics. The probabilistic paradigm pays tribute to the inherent uncertainty in robot perception, relying on explicit representations of uncertainty when determining what to do. This article surveys some of the progress in the field, using in-depth examples to illustrate some of the nuts and bolts of the basic approach. My central conjecture is that the probabilistic approach to robotics scales better to compl...

  20. Probabilistic liver atlas construction

    OpenAIRE

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E.

    2017-01-01

    Background Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. Results A new method for probabilistic atlas con...

  1. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto;

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, systems biology to performance model...... modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata....

  2. The Forms, Characteristics and Preventive Measures of the Self-dilution of Well-known Trademarks

    Institute of Scientific and Technical Information of China (English)

    王晓先

    2012-01-01

    Well-known trademarks are of decisive significance for securing market share, and any diluting behavior will cause damage to a well-known trademark that is difficult to reverse. Self-dilution differs from traditional dilution in both the subject and the form of the dilution. In practice, enterprises are alert to the dilution of their well-known trademarks by others, but do not pay sufficient attention to, or take precautions against, self-dilution. Exploring the forms, characteristics and preventive measures of the self-dilution of well-known trademarks is of vital significance for an enterprise's long-term development.

  3. A model for probabilistic health impact assessment of exposure to food chemicals.

    NARCIS (Netherlands)

    van der Voet, H.; van der Heijden, G.W.; Bos, P.M.J.; Bosgra, S.; Boon, P.E.; Muri, S.D.; Bruschweiler, B.J.

    2010-01-01

    A statistical model is presented extending the integrated probabilistic risk assessment (IPRA) model of van der Voet and Slob [van der Voet, H., Slob, W., 2007. Integration of probabilistic exposure assessment and probabilistic hazard characterisation. Risk Analysis, 27, 351-371]. The aim is to char

  4. A model for probabilistic health impact assessment of exposure to food chemicals

    NARCIS (Netherlands)

    Voet, van der H.; Heijden, van der G.W.A.M.; Bos, P.M.J.; Bosgra, S.; Boon, P.E.; Muri, S.D.; Brüschweiler, B.J.

    2009-01-01

    A statistical model is presented extending the integrated probabilistic risk assessment (IPRA) model of van der Voet and Slob [van der Voet, H., Slob, W., 2007. Integration of probabilistic exposure assessment and probabilistic hazard characterisation. Risk Analysis, 27, 351–371]. The aim is to char

  5. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; Keulen, van Maurice; Keijzer, de Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused o

  6. Do probabilistic forecasts lead to better decisions?

    Science.gov (United States)

    Ramos, M. H.; van Andel, S. J.; Pappenberger, F.

    2013-06-01

    The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  7. Probabilistic Dynamic Epistemic Logic

    NARCIS (Netherlands)

    Kooi, B.P.

    2003-01-01

    In this paper I combine the dynamic epistemic logic of Gerbrandy (1999) with the probabilistic logic of Fagin and Halpern (1999). The result is a new probabilistic dynamic epistemic logic, a logic for reasoning about probability, information, and information change that takes higher order informatio

  8. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good...

  9. Probabilistic analysis of risks to US drinking water intakes from 1,4-dioxane in domestic wastewater treatment plant effluents.

    Science.gov (United States)

    Simonich, Staci Massey; Sun, Ping; Casteel, Ken; Dyer, Scott; Wernery, Dave; Garber, Kevin; Carr, Gregory; Federle, Thomas

    2013-10-01

    The risk posed to downstream drinking water intakes by 1,4-dioxane (dioxane) in the effluents of wastewater treatment plants (WWTPs) receiving primarily domestic wastewater was estimated using distributions of measured dioxane concentrations in effluents from 40 WWTPs and surface water dilution factors for 1323 drinking water intakes across the United States. Effluent samples were spiked with a d8-1,4-dioxane internal standard in the field immediately after sample collection. Dioxane was extracted with ENVI-CARB-Plus solid phase columns and analyzed by GC/MS/MS, with a limit of quantification of 0.30 μg/L. Dilution factors at the drinking water intakes were estimated using the iSTREEM model at mean flow conditions, assuming no in-stream loss of dioxane; they ranged from 2.6 to 48 113, with a mean of 875. The distributions of dilution factors and dioxane concentrations in effluent were then combined using Monte Carlo analysis to estimate dioxane concentrations at drinking water intakes. This analysis showed the probability was negligible (p = 0.0031) that dioxane inputs from upstream WWTPs could result in intake concentrations exceeding the USEPA drinking water advisory concentration of 0.35 μg/L, before any treatment of the water for drinking use.
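
    The Monte Carlo combination step amounts to dividing a sampled effluent concentration by a sampled dilution factor and counting exceedances of the advisory level. A sketch with placeholder lognormal distributions standing in for the measured data:

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000

        # Placeholder distributions (not the measured data): dioxane in
        # WWTP effluent [ug/L] and intake dilution factors [-].
        effluent = rng.lognormal(mean=-0.5, sigma=0.8, size=n)
        dilution = rng.lognormal(mean=4.0, sigma=1.5, size=n)

        intake = effluent / dilution   # assumes no in-stream loss
        advisory = 0.35                # USEPA advisory level [ug/L]
        print(f"P(intake > {advisory} ug/L) = {np.mean(intake > advisory):.4f}")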

  10. Calculation Engine Comparison of the Probabilistic Safety Analysis Programs RiskA and RiskSpectrum

    Institute of Scientific and Technical Information of China (English)

    殷园; 汪进; 陈珊琦; 王芳; 王家群

    2014-01-01

    Using the complete PSA model of the Qinshan Phase III nuclear power plant, calculations were performed with RiskAT, the calculation engine of RiskA, a probabilistic safety analysis program independently developed by the FDS Team of the Chinese Academy of Sciences, and with RSAT, the calculation engine of RiskSpectrum, a widely used program developed by Scandpower of Sweden. The comparison shows that the qualitative and quantitative results of the two engines are identical, and that, in terms of computational performance, RiskA is faster than RiskSpectrum.

  11. Probabilistic risk model for staphylococcal intoxication from pork-based food dishes prepared in food service establishments in Korea.

    Science.gov (United States)

    Kim, Hyun Jung; Griffiths, Mansel W; Fazil, Aamir M; Lammerding, Anna M

    2009-09-01

    Foodborne illness contracted at food service operations is an important public health issue in Korea. In this study, the probabilities for growth of, and enterotoxin production by, Staphylococcus aureus in pork meat-based foods prepared in food service operations were estimated by Monte Carlo simulation. Data on the prevalence and concentration of S. aureus, as well as compliance with guidelines for time and temperature controls during food service operations, were collected. The growth of S. aureus was initially estimated using the U.S. Department of Agriculture's Pathogen Modeling Program. A second model based on raw pork meat was derived to compare cell number predictions. The correlation between toxin level and cell number, as well as the minimum toxin dose obtained from published data, was adopted to quantify the probability of staphylococcal intoxication. Where data gaps were found, assumptions were made based on guidelines for food service practices. A baseline risk model and scenario analyses were performed to indicate possible outcomes of staphylococcal intoxication under the scenarios generated based on these data gaps. Staphylococcal growth was predicted during holding before and after cooking, and the highest estimated concentration (4.59 log CFU/g for the 99.9th percentile value) of S. aureus was observed in raw pork initially contaminated with S. aureus and held before cooking. The estimated probability of staphylococcal intoxication was very low using currently available data. However, scenario analyses revealed an increased possibility of staphylococcal intoxication when increased levels of initial contamination in the raw meat and longer holding times both before and after cooking occurred.

  12. 100 km under ground. Longest known aqueduct tunnel of antiquity in Jordan and Syria; 100 km unter Tage. Laengster bisher bekannter Aquaedukttunnel der Antike in Jordanien und Syrien

    Energy Technology Data Exchange (ETDEWEB)

    Doering, Mathias [Technische Univ. Bergakademie Freiberg (Germany). IWTG

    2010-05-15

    Since 2004, the author of the contribution under consideration has been investigating an ancient tunnel system of previously unknown extent in the border area between Jordan and Syria. It is part of a nearly 170 km long Roman aqueduct which supplied three cities with water. The nearly 106 km long, partly plastered tunnel system was driven from approximately 2,900 stepped construction shafts using counter-heading excavation. Regular cutting traces indicate that not only hammer and chisel but also semi-mechanical excavation equipment was used. The aqueduct may be one of the most extensive aqueducts of Roman antiquity, and the tunnel the longest known tunnel from antiquity.

  13. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    The WTI2017 project is responsible for the development of flood defence assessment tools for the 3600 km of Dutch primary flood defences: dikes/levees, dunes and hydraulic structures. These tools are necessary because, as of January 1st 2017, the new flood risk management policy for the Netherlands will be implemented. Then, the seven-decades-old design practice (the maximum water level methodology of 1958) and two-decades-old safety standards (and maximum hydraulic load methodology of 1996) will formally be replaced by a more risk-based approach for the national policy in flood risk management. The formal flood defence assessment is an important part of this new policy, especially for flood defence managers, since national and regional funding for reinforcement is based on this assessment. This new flood defence policy is based on a maximum allowable probability of flooding. For this, a maximum acceptable individual risk was determined at 1/100 000 per year; this is the probability of loss of life for every protected area in the Netherlands. Safety standards of flood defences were then determined based on this acceptable individual risk. The results were adjusted based on information from cost-benefit analysis, societal risk and large-scale societal disruption due to the failure of critical infrastructure, e.g. power stations. The resulting risk-based flood defence safety standards range from a 300 to a 100 000 year return period for failure. Two policy studies, WV21 (Safety from floods in the 21st century) and VNK-2 (the National Flood Risk in 2010), provided the essential information to determine the new risk-based safety standards for flood defences. The WTI2017 project will provide the safety assessment tools based on these new standards and is thus an essential element for the implementation of this policy change. A major issue to be tackled was the development of user-friendly tools, as the new assessment is to be carried out by personnel of the

  14. Probabilistic structural analysis algorithm development for computational efficiency

    Science.gov (United States)

    Wu, Y.-T.

    1991-01-01

    The PSAM (Probabilistic Structural Analysis Methods) program is developing a probabilistic structural risk assessment capability for the SSME components. An advanced probabilistic structural analysis software system, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), is being developed as part of the PSAM effort to accurately simulate stochastic structures operating under severe random loading conditions. One of the challenges in developing the NESSUS system is the development of the probabilistic algorithms that provide both efficiency and accuracy. The main probability algorithms developed and implemented in the NESSUS system are efficient, but approximate in nature. In the last six years, the algorithms have improved very significantly.

  16. Contextual Risk and Its Relevance in Economics

    CERN Document Server

    Aerts, Diederik

    2011-01-01

    Uncertainty in economics still poses some fundamental problems illustrated, e.g., by the Allais and Ellsberg paradoxes. To overcome these difficulties, economists have introduced an interesting distinction between 'risk' and 'ambiguity' depending on the existence of a (classical Kolmogorovian) probabilistic structure modeling these uncertainty situations. On the other hand, evidence of everyday life suggests that 'context' plays a fundamental role in human decisions under uncertainty. Moreover, it is well known from physics that any probabilistic structure modeling contextual interactions between entities structurally needs a non-Kolmogorovian quantum-like framework. In this paper we introduce the notion of 'contextual risk' with the aim of modeling a substantial part of the situations in which usually only 'ambiguity' is present. More precisely, we firstly introduce the essentials of an operational formalism called 'the hidden measurement approach' in which probability is introduced as a consequence of fluct...

  17. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any such system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using the automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is the probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where the interfacing of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
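
    The derivation at the heart of the stochastic perturbation technique (a Taylor expansion of the response about the mean of a random parameter, followed by taking expectations) can be reproduced in any computer algebra system. The sketch below uses Python's sympy as a stand-in for MAPLE, with an arbitrary illustrative response function:

        import sympy as sp

        b, b0, var_b = sp.symbols('b b0 sigma_b2')
        f = 1 / (b * (1 + b))   # illustrative response function of random input b

        # Second-order stochastic perturbation: expand f about the mean b0
        # and take expectations, so E[f] ~ f(b0) + (1/2) f''(b0) Var(b).
        mean_f = (f.subs(b, b0)
                  + sp.Rational(1, 2) * sp.diff(f, b, 2).subs(b, b0) * var_b)
        print(sp.simplify(mean_f))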

  18. Independent Review of U.S. and Russian Probabilistic Risk Assessments for the International Space Station Mini Research Module #2 Micrometeoroid and Orbital Debris Risk

    Science.gov (United States)

    Squire, Michael D.

    2011-01-01

    The Mini-Research Module-2 (MRM-2), a Russian module on the International Space Station, does not meet its requirements for micrometeoroid and orbital debris probability of no penetration (PNP). To document this condition, the primary Russian Federal Space Agency ISS contractor, S.P. Korolev Rocket and Space Corporation-Energia (RSC-E), submitted an ISS non-compliance report (NCR) which was presented at the 5R Stage Operations Readiness Review (SORR) in October 2009. In the NCR, RSC-E argued for waiving the PNP requirement based on several factors, one of which was that the risk of catastrophic failure was acceptably low at 1 in 11,100. However, NASA independently performed an assessment of the catastrophic risk, resulting in a value of 1 in 1380, and believed that the risk at that level was unacceptable. The NASA Engineering and Safety Center was requested to evaluate the two competing catastrophic risk values and determine which was more accurate. This document contains the outcome of the assessment.

  19. Might gluten traces in wheat substitutes pose a risk in patients with celiac disease? A population-based probabilistic approach to risk estimation

    NARCIS (Netherlands)

    Gibert, A.; Kruizinga, A.G.; Neuhold, S.; Houben, G.F.; Canela, M.A.; Fasano, A.; Catassi, C.

    2013-01-01

    Background: In patients with treated celiac disease (CD), the ingestion of gluten traces contained in gluten-free (GF) wheat substitutes (eg, GF bread, flour, and pasta) could cause persisting intestinal mucosal damage. Objective: The objective was to evaluate the proportion of CD patients at risk

  20. Probabilistic transmission system planning

    CERN Document Server

    Li, Wenyuan

    2011-01-01

    "The book is composed of 12 chapters and three appendices, and can be divided into four parts. The first part includes Chapters 2 to 7, which discuss the concepts, models, methods and data in probabilistic transmission planning. The second part, Chapters 8 to 11, addresses four essential issues in probabilistic transmission planning applications using actual utility systems as examples. Chapter 12, as the third part, focuses on a special issue, i.e. how to deal with uncertainty of data in probabilistic transmission planning. The fourth part consists of three appendices, which provide the basic knowledge in mathematics for probabilistic planning. Please refer to the attached table of contents which is given in a very detailed manner"--

  1. Conditioning Probabilistic Databases

    CERN Document Server

    Koch, Christoph

    2008-01-01

    Past research on probabilistic databases has studied the problem of answering queries on a static database. Application scenarios of probabilistic databases however often involve the conditioning of a database using additional information in the form of new evidence. The conditioning problem is thus to transform a probabilistic database of priors into a posterior probabilistic database which is materialized for subsequent query processing or further refinement. It turns out that the conditioning problem is closely related to the problem of computing exact tuple confidence values. It is known that exact confidence computation is an NP-hard problem. This has led researchers to consider approximation techniques for confidence computation. However, neither conditioning nor exact confidence computation can be solved using such techniques. In this paper we present efficient techniques for both problems. We study several problem decomposition methods and heuristics that are based on the most successful search techn...

  2. A total diet study and probabilistic risk assessment of dietary mercury exposure among First Nations living on-reserve in Ontario, Canada.

    Science.gov (United States)

    Juric, Amanda K; Batal, Malek; David, Will; Sharp, Donald; Schwartz, Harold; Ing, Amy; Fediuk, Karen; Black, Andrew; Tikhonov, Constantine; Chan, Hing Man

    2017-10-01

    Methyl mercury (MeHg) exposure is a global environmental health concern. Indigenous peoples around the world are susceptible to MeHg exposure from often higher fish consumption compared to general populations. The objective of this study was to estimate dietary exposure to methylmercury (MeHg) among First Nations living on-reserve in the province of Ontario, Canada. A total diet study was constructed based on a 24-h recall from the First Nations Food, Nutrition, and Environment Study (FNFNES), and on measured contaminant concentrations from Health Canada for market foods and from FNFNES for traditional foods. A probabilistic assessment of annual and seasonal traditional food consumption was conducted for 1429 adult participants. Results were compared to exposures in the general Canadian population and to reference values from Health Canada for adults and women of childbearing age (ages 19-50). Results indicated traditional foods to be the primary contributor to the dietary total MeHg intake (72%). The average dietary total MeHg exposure in the First Nations population in Ontario (0.039 μg/kg/d) was 1.6 times higher than in the general Canadian population; however, the majority (97.8%) of the population was below the reference values. Mercury concentrations in participants' hair samples (n = 744) ranged from 0.03 to 13.54 µg/g, with an average of 0.64 µg/g (geometric average of 0.27 µg/g). Less than 1% of the population had a hair mercury value above the 6 µg/g level, and 1.3% of women of childbearing age had values greater than 2 µg/g. Fish species contributing to the MeHg intake included pickerel-walleye, pike, perch and trout. Only 7.9% of the population met the recommended fish consumption rate of two 3.5-oz servings per week from the American Heart Association. Therefore, consumption of lower trophic level fish can be promoted to provide the maximum nutritional benefit with minimal risk of MeHg exposure. Copyright © 2017 Elsevier Inc. All rights reserved.

  3. Introduction of Probabilistic Risk Assessment Approach to Analyze the Safety of Space Systems

    Institute of Scientific and Technical Information of China (English)

    顾基发; 赵丽艳

    1999-01-01

    Building on qualitative, quantitative and integrated methods for the safety analysis of space systems, this paper mainly introduces the Probabilistic Risk Assessment (PRA) method, which centres on quantitative risk assessment and has been widely applied at both NASA and ESA. The basic ideas of the method and its concrete implementation process are described in detail, and the feasibility of using the PRA method in the safety analysis of China's space systems is discussed.

  4. A Methodology for Modeling Nuclear Power Plant Passive Component Aging in Probabilistic Risk Assessment under the Impact of Operating Conditions, Surveillance and Maintenance Activities

    Science.gov (United States)

    Guler Yigitoglu, Askin

    In the context of long operation of nuclear power plants (NPPs) (i.e., 60-80 years, and beyond), investigation of the aging of passive systems, structures and components (SSCs) is important to assess safety margins and to decide on reactor life extension, as indicated within the U.S. Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) Program. In the traditional probabilistic risk assessment (PRA) methodology, evaluating the potential significance of aging of passive SSCs for plant risk is challenging. Although passive SSC failure rates can be added as initiating event frequencies or basic event failure rates in the traditional event-tree/fault-tree methodology, these failure rates are generally based on generic plant failure data, which means that the true state of a specific plant with respect to aging effects is not reflected in a realistic manner. Dynamic PRA methodologies have gained attention recently due to their capability to account for the plant state and thus address the difficulties in the traditional PRA modeling of aging effects of passive components using physics-based models (and also in the modeling of digital instrumentation and control systems). Physics-based models can capture the impact of complex aging processes (e.g., fatigue, stress corrosion cracking, flow-accelerated corrosion, etc.) on SSCs and can be utilized to estimate passive SSC failure rates using realistic NPP data from reactor simulation, as well as considering the effects of surveillance and maintenance activities. The objectives of this dissertation are twofold: the development of a methodology for the incorporation of aging models of passive SSCs into a reactor simulation environment, to provide a framework for evaluating their risk contribution in both dynamic and traditional PRA; and the demonstration of the methodology through its application to pressurizer surge line pipe welds and steam generator tubes in commercial nuclear power plants. In the proposed methodology, a

  5. Volcanic risk metrics at Mt Ruapehu, New Zealand: some background to a probabilistic eruption forecasting scheme and a cost/benefit analysis at an open conduit volcano

    Science.gov (United States)

    Jolly, Gill; Sandri, Laura; Lindsay, Jan; Scott, Brad; Sherburn, Steve; Jolly, Art; Fournier, Nico; Keys, Harry; Marzocchi, Warner

    2010-05-01

    The Bayesian Event Tree for Eruption Forecasting software (BET_EF) is a probabilistic model based on an event tree scheme that was created specifically to compute long- and short-term probabilities of different outcomes (volcanic unrest, magmatic unrest, eruption, vent location and eruption size) at long-dormant and routinely monitored volcanoes. It is based on the assumption that upward movement of magma in a closed-conduit volcano will produce detectable changes in the monitored parameters at the surface. In this perspective, the goal of BET_EF is to compute probabilities by merging information from geology, models, past data and present monitoring measurements through a Bayesian inferential method. In the present study, we attempt to apply BET_EF to Mt Ruapehu, a very active and well-monitored volcano exhibiting the typical features of open-conduit volcanoes. Under such conditions, current monitoring at the surface is not necessarily able to detect short-term changes at depth that may occur only seconds to minutes before an eruption. This results in so-called "blue sky eruptions" of Mt Ruapehu (for example in September 2007): volcanic eruptions apparently not preceded by any presently detectable signal in the current monitoring. A further complication at Mt Ruapehu arises from the well-developed hydrothermal system and the permanent crater lake sitting on top of the magmatic conduit. Both the hydrothermal system and the crater lake may act to mask or change the monitoring signals (if present) that magma produces deeper in the edifice. Notwithstanding these potential drawbacks, we think that an attempt to apply BET_EF at Ruapehu is worthwhile, for several reasons. First, with the exception of a few "blue sky" events, monitoring data at Mt Ruapehu can be helpful in forecasting major events, especially if a large amount of magma is intruded into the edifice and becomes available for phreatomagmatic or magmatic eruptions, as for example in 1995-96. Secondly, in

  6. A "well-known" advanced oxidation reaction revisited. The photo-Fenton oxidation of 4-chlorophenol and 2,4-dichlorophenol in a homogeneous and a heterogeneous system

    Energy Technology Data Exchange (ETDEWEB)

    Rios-Enriquez, M.A.; Bossmann, S.H.; Oliveros, E.; Shahin, N.; Braun, A.M. [Lehrstuhl fuer Umweltmesstechnik, am Engler-Bunte Inst. der Univ. Karlsruhe (Germany); Duran-de-Bazua, C. [Facultad de Quimica, Univ. Nacional Autonoma de Mexico (Mexico)

    2003-07-01

    The oxidative degradation of 4-chlorophenol and 2,4-dichlorophenol by the thermal and photochemically enhanced Fenton reactions has served as a model reaction for the comparison of different reaction conditions. The dechlorination of the chlorinated phenols is generally monitored by determining the chloride anion by ion chromatography. Some authors have even proposed the measurement of the released chloride as a convenient measure of the progress of Fenton reactions in reaction mixtures occurring in the environment. Therefore, we revisited these "well-known" reactions and combined mechanistic investigations of the chemical intermediates formed, by GC-mass spectrometry, with the determination of thorough chloride balances. This mechanistic tool was further employed for the comparison of the homogeneous and the heterogeneous photochemically enhanced Fenton degradation of 4-chlorophenol and 2,4-dichlorophenol. A mixture of ferric sulphate and hydrogen peroxide was employed for the homogeneous Fenton reactions, whereas the heterogeneous Fenton experiments were performed using an iron(III)-exchanged zeolite Y photocatalyst. Different reaction pathways for the homogeneous and heterogeneous (photo-)Fenton reactions, and especially for the oxidative degradation of 4-chlorophenol and 2,4-dichlorophenol, were observed. Consequences for the comparison of different operating conditions of (photo-)Fenton processes are discussed. (orig.)

  7. Probabilistic Belief Logic and Its Probabilistic Aumann Semantics

    Institute of Scientific and Technical Information of China (English)

    CAO ZiNing(曹子宁); SHI ChunYi(石纯一)

    2003-01-01

    In this paper, we present a logic system for probabilistic belief named PBL, which expands the language of belief logic by introducing probabilistic belief. Furthermore, we give the probabilistic Aumann semantics of PBL. We also list some valid properties of belief and probabilistic belief, which form the deduction system of PBL. Finally, we prove the soundness and completeness of these properties with respect to probabilistic Aumann semantics.

  8. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  9. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  10. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach.The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets.Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given.In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians, and by researchers in artificial intelligence.The necessary elementary mathematical notions are recalled in an appendix.

  11. Probabilistic Causation without Probability.

    Science.gov (United States)

    Holland, Paul W.

    The failure of Hume's "constant conjunction" to describe apparently causal relations in science and everyday life has led to various "probabilistic" theories of causation of which the study by P. C. Suppes (1970) is an important example. A formal model that was developed for the analysis of comparative agricultural experiments…

  12. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper 'DNA computing, sticker systems and universality' (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings of the language are selected according to some probabilistic requirement. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.
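
    The probability bookkeeping described above is easy to make concrete. The following is a minimal sketch in which the axiom names, probabilities and cut-point threshold are invented for illustration: the probability of a completed string is the product of the probabilities of all axiom occurrences used in its derivation, and strings are selected by a probabilistic requirement such as a cut-point.

      # Minimal sketch of probability bookkeeping in a probabilistic sticker
      # system; axiom names, probabilities and the cut-point are illustrative.
      from math import prod

      axiom_prob = {"ax1": 0.5, "ax2": 0.3, "ax3": 0.2}  # hypothetical axioms

      def derivation_probability(derivation):
          """Product of the probabilities of every axiom occurrence used."""
          return prod(axiom_prob[a] for a in derivation)

      p = derivation_probability(["ax1", "ax1", "ax2"])  # ax1 used twice, ax2 once
      print(p)  # 0.075

      # Cut-point selection: keep only strings whose probability exceeds 0.05.
      accepted = p > 0.05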

  13. Probabilistic parsing strategies

    NARCIS (Netherlands)

    Nederhof, Mark-Jan; Satta, Giorgio

    We present new results on the relation between purely symbolic context-free parsing strategies and their probabilistic counterparts. Such parsing strategies are seen as constructions of push-down devices from grammars. We show that preservation of probability distribution is possible under two

  14. Probabilistic thread algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution

  15. Probabilistic Load Flow

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2008-01-01

    This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...

  16. Probabilistic dynamic belief revision

    NARCIS (Netherlands)

    Baltag, A.; Smets, S.

    2008-01-01

    We investigate the discrete (finite) case of the Popper-Renyi theory of conditional probability, introducing discrete conditional probabilistic models for knowledge and conditional belief, and comparing them with the more standard plausibility models. We also consider a related notion, that of safe

  17. Síndrome de Rett: 50 años de historia de un trastorno aun no bien conocido Rett syndrome: 50 years' history of a still not well known condition

    Directory of Open Access Journals (Sweden)

    Jaime Campos-Castello

    2007-01-01

    Since it was first described by Andreas Rett 50 years ago, Rett syndrome (RS) has been the subject of much research, yet it remains an incompletely understood disorder. We present our own experience and a review of the literature on RS. It is an X-linked dominant neurodevelopmental disorder that almost always affects females, in most cases sporadically. The diagnosis of RS must be made on the basis of clinical observation. Its main features are the appearance of mental retardation, behavioural changes, stereotypies, loss of language and, above all, of purposeful hand use, the development of gait apraxia, breathing disturbances and, frequently, epileptic seizures. The internationally agreed diagnostic criteria are reviewed here. RS is due in most cases to mutations of the MECP2 gene, although a proportion of atypical cases may be caused by mutations of CDKL5, particularly the variant with early epilepsy. However, the molecular pathogenic mechanisms are not well understood, nor is the relationship between MECP2 mutations and other developmental disorders. We also review the neuroimaging, neuropathological and neurobiochemical findings described in RS. Regarding treatment, apart from symptomatic measures, none has proved effective. A recent study opens future therapeutic perspectives by demonstrating, in a mouse model, the reversal of the neurological symptoms through activation of MeCP2 expression.

  18. 美国驰名商标法则、TRIPS协议与香烟平装立法%The U.S. Well-Known Mark Doctrine, TRIPS Agreement and Plain Packaging on Tobacco Products

    Institute of Scientific and Technical Information of China (English)

    Jeff M. Samuels [USA]; CAI Yuanzhen (trans.)

    2014-01-01

    The Well-Known Mark Doctrine, put forth by the Paris Convention and applied internationally for almost a century, nevertheless still faces difficulties in gaining judicial acceptance within the U.S. court system, as a series of trademark cases shows. Given the inevitable considerations of the U.S. legislative process, judicial system and the financial interests of domestic industries, the uncertain applicability of the doctrine is likely to continue, despite the guidance offered by certain widely known cases of the current century, including Grupo v. Dallo, ITC v. Punchgini and Fiat v. ISM. Another widely disputed issue in the development of the international trademark regime is plain-packaging legislation for tobacco products. The Australian government, an aspiring promoter of such legislation, has been confronted by various corporations and nations over the legality of its plain packaging law. In the ongoing disputes, the conflicting interaction between that law and articles 8 and 20 of the TRIPS Agreement is considered the crucial factor, and future decisions of the WTO Dispute Settlement panel will exert major effects on the related legislation of various nations.

  19. Cardiac surgery and hypertension: a dangerous association that must be well known Cirurgia cardíaca e hipertensão: uma associação perigosa que deve ser bem conhecida

    Directory of Open Access Journals (Sweden)

    Shi-Min Yuan

    2011-06-01

    It is well known that hypertension is a very common disease, and severe cerebrovascular accidents may occur if the blood pressure is not properly controlled. However, conditions associated with uncontrolled hypertension may be overlooked and may become critical, eventually requiring urgent surgical intervention. Coronary artery disease, acute aortic syndrome, congenital and valvular heart disease, and arrhythmias are discussed under this topic. Of these, coronary artery disease, including myocardial infarction and especially postinfarction myocardial rupture, and aortic dissection are the major critical situations that physicians may encounter in clinical practice. The role that hypertension plays in these conditions can be complex, involving hemodynamic, electrophysiological and biomolecular factors, of which the latter may prevail in the current era. Coronary artery disease may be associated with reduced nitric oxide synthesis. Transforming growth factor and matrix metalloproteinases have been observed in relation to aortic syndrome. The Wnt, p38 and JNK signaling pathways may be involved in the development of the ventricular hypertrophy responsible for cardiac arrhythmias. Various gene phenotypes may be present in different congenital heart defects. This article presents these conditions and further discusses the possible etiologies and potential treatment strategies, so as to highlight their relevance at a prognostic level.

  20. Probabilistic, meso-scale flood loss modelling

    Science.gov (United States)

    Kreibich, Heidi; Botto, Anna; Schröter, Kai; Merz, Bruno

    2016-04-01

    Flood risk analyses are an important basis for decisions on flood risk management and adaptation. However, such analyses are associated with significant uncertainty, even more so if changes in risk due to global change are expected. Although uncertainty analysis and probabilistic approaches have received increased attention during the last years, they are still not standard practice for flood risk assessments, let alone for flood loss modelling. The state of the art in flood loss modelling is still the use of simple, deterministic approaches like stage-damage functions. Novel probabilistic, multi-variate flood loss models have been developed and validated on the micro-scale using a data-mining approach, namely bagging decision trees (Merz et al. 2013). In this presentation we demonstrate and evaluate the upscaling of the approach to the meso-scale, namely on the basis of land-use units. The model is applied in 19 municipalities which were affected during the 2002 flood by the River Mulde in Saxony, Germany (Botto et al. submitted). The application of bagging decision tree based loss models provides a probability distribution of estimated loss per municipality. Validation is undertaken on the one hand via a comparison with eight deterministic loss models, including stage-damage functions as well as multi-variate models. On the other hand, the results are compared with official loss data provided by the Saxon Relief Bank (SAB). The results show that the uncertainties of loss estimation remain high. The significant advantage of this probabilistic flood loss estimation approach is thus that it inherently provides quantitative information about the uncertainty of the prediction. References: Merz, B.; Kreibich, H.; Lall, U. (2013): Multi-variate flood damage assessment: a tree-based data-mining approach. NHESS, 13(1), 53-64. Botto A, Kreibich H, Merz B, Schröter K (submitted) Probabilistic, multi-variable flood loss modelling on the meso-scale with BT-FLEMO. Risk Analysis.
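
    The following sketch illustrates the core idea of a bagging-decision-tree loss model (it is not the authors' BT-FLEMO code; the predictors and the synthetic damage relation are invented): each tree in the ensemble yields one loss estimate, and the spread of the per-tree estimates forms the predictive distribution.

      # Bagged regression trees as a probabilistic loss model (illustrative).
      import numpy as np
      from sklearn.ensemble import BaggingRegressor

      rng = np.random.default_rng(0)
      # Synthetic predictors: water depth [m], building area [m2], precaution score.
      X = rng.uniform([0.0, 50.0, 0.0], [3.0, 500.0, 1.0], size=(200, 3))
      y = X[:, 0] * X[:, 1] * (1.2 - X[:, 2]) + rng.normal(0.0, 20.0, 200)

      # The default base estimator of BaggingRegressor is a decision tree.
      model = BaggingRegressor(n_estimators=100, random_state=0).fit(X, y)

      x_new = np.array([[1.5, 120.0, 0.4]])
      # One prediction per tree -> an empirical predictive distribution of loss.
      per_tree = np.array([t.predict(x_new)[0] for t in model.estimators_])
      print(per_tree.mean(), np.percentile(per_tree, [5, 95]))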

  1. Probabilistic simulation of fire scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Hostikka, Simo E-mail: simo.hostikka@vtt.fi; Keski-Rahkonen, Olavi

    2003-10-01

    A risk analysis tool is developed for computation of the distributions of fire model output variables. The tool, called Probabilistic Fire Simulator (PFS), combines Monte Carlo simulation and CFAST, a two-zone fire model. In this work, the tool is used to estimate the failure probability of redundant cables in a cable tunnel fire, and the failure and smoke filling probabilities in an electronics room during an electronics cabinet fire. Sensitivity of the output variables to the input variables is calculated in terms of the rank order correlations. The use of the rank order correlations allows the user to identify both modelling parameters and actual facility properties that have the most influence on the results. Various steps of the simulation process, i.e. data collection, generation of the input distributions, modelling assumptions, definition of the output variables and the actual simulation, are described.
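
    The rank-order correlation measure used in PFS can be sketched as follows; the input distributions and the simple analytic response standing in for the zone model are invented (PFS couples the sampling to CFAST instead):

      # Monte Carlo sampling of fire-model inputs followed by Spearman
      # rank-order correlations between inputs and an output (illustrative).
      import numpy as np
      from scipy.stats import spearmanr

      rng = np.random.default_rng(1)
      n = 1000
      heat_release = rng.lognormal(6.0, 0.5, n)   # kW, assumed distribution
      opening_area = rng.uniform(0.5, 2.0, n)     # m2, assumed distribution
      cable_limit = rng.normal(200.0, 20.0, n)    # deg C, assumed distribution

      # Made-up response standing in for the CFAST gas temperature output.
      gas_temp = 20.0 * heat_release ** 0.4 / opening_area ** 0.3

      for name, x in [("heat release", heat_release), ("opening area", opening_area)]:
          rho, _ = spearmanr(x, gas_temp)
          print(f"rank correlation with {name}: {rho:+.2f}")
      print("cable failure probability:", (gas_temp > cable_limit).mean())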

  2. Model exploration and analysis for quantitative safety refinement in probabilistic B

    CERN Document Server

    Ndukwu, Ukachukwu; 10.4204/EPTCS.55.7

    2011-01-01

    The role played by counterexamples in standard system analysis is well known; less common is a notion of counterexample in probabilistic systems refinement. In this paper we extend previous work using counterexamples to inductive invariant properties of probabilistic systems, demonstrating how they can be used to extend the technique of bounded model checking-style analysis for the refinement of quantitative safety specifications in the probabilistic B language. In particular, we show how the method can be adapted to cope with refinements incorporating probabilistic loops. Finally, we demonstrate the technique on pB models summarising a one-step refinement of a randomised algorithm for finding the minimum cut of undirected graphs, and on a model for the dependability analysis of a controller design.

  3. Probabilistic Analysis of Failure Mechanisms of Large Dams

    NARCIS (Netherlands)

    Shams Ghahfarokhi, G.

    2014-01-01

    Risk and reliability analysis is presently being performed in almost all fields of engineering depending upon the specific field and its particular area. Probabilistic risk analysis (PRA), also called quantitative risk analysis (QRA) is a central feature of hydraulic engineering structural design.

  4. Probabilistic Event Categorization

    CERN Document Server

    Wiebe, J; Duan, L; Wiebe, Janyce; Bruce, Rebecca; Duan, Lei

    1997-01-01

    This paper describes the automation of a new text categorization task. The categories assigned in this task are more syntactically, semantically, and contextually complex than those typically assigned by fully automatic systems that process unseen test data. Our system for assigning these categories is a probabilistic classifier, developed with a recent method for formulating a probabilistic model from a predefined set of potential features. This paper focuses on feature selection. It presents a number of fully automatic features. It identifies and evaluates various approaches to organizing collocational properties into features, and presents the results of experiments covarying type of organization and type of property. We find that one organization is not best for all kinds of properties, so this is an experimental parameter worth investigating in NLP systems. In addition, the results suggest a way to take advantage of properties that are low frequency but strongly indicative of a class. The problems of rec...

  5. The Magic of Logical Inference in Probabilistic Programming

    CERN Document Server

    Gutmann, Bernd; Kimmig, Angelika; Bruynooghe, Maurice; De Raedt, Luc; 10.1017/S1471068411000238

    2011-01-01

    Today, many different probabilistic programming languages exist and even more inference mechanisms for these languages. Still, most logic programming based languages use backward reasoning based on SLD resolution for inference. While these methods are typically computationally efficient, they often can neither handle infinite and/or continuous distributions, nor evidence. To overcome these limitations, we introduce distributional clauses, a variation and extension of Sato's distribution semantics. We also contribute a novel approximate inference method that integrates forward reasoning with importance sampling, a well-known technique for probabilistic inference. To achieve efficiency, we integrate two logic programming techniques to direct forward sampling. Magic sets are used to focus on relevant parts of the program, while the integration of backward reasoning allows one to identify and avoid regions of the sample space that are inconsistent with the evidence.
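
    The combination of forward sampling with evidence handling can be illustrated by generic likelihood weighting; this is the textbook technique, not the distributional-clauses engine itself, and the toy network and its numbers are invented:

      # Likelihood weighting: forward-sample the model, weight each sample by
      # the likelihood of the evidence instead of rejecting samples.
      import random

      def forward_sample():
          burglary = random.random() < 0.01
          alarm = random.random() < (0.9 if burglary else 0.05)
          return burglary, alarm

      def p_call_given_alarm(alarm):      # evidence model (invented numbers)
          return 0.8 if alarm else 0.1

      num = den = 0.0
      for _ in range(100_000):
          burglary, alarm = forward_sample()
          w = p_call_given_alarm(alarm)   # importance weight of evidence "call"
          num += w * burglary
          den += w
      print("P(burglary | call) ~", num / den)   # exact value is about 0.052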

  6. Probabilistic approaches to recommendations

    CERN Document Server

    Barbieri, Nicola; Ritacco, Ettore

    2014-01-01

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robus

  7. On probabilistic Mandelbrot maps

    Energy Technology Data Exchange (ETDEWEB)

    Andreadis, Ioannis [International School of The Hague, Wijndaelerduin 1, 2554 BX The Hague (Netherlands)], E-mail: i.andreadis@ish-rijnlandslyceum.nl; Karakasidis, Theodoros E. [Department of Civil Engineering, University of Thessaly, GR-38334 Volos (Greece)], E-mail: thkarak@uth.gr

    2009-11-15

    In this work, we propose a definition for a probabilistic Mandelbrot map in order to extend and support the study initiated by Argyris et al. [Argyris J, Andreadis I, Karakasidis Th. On perturbations of the Mandelbrot map. Chaos, Solitons and Fractals 2000;11:1131-1136.] with regard to the numerical stability of the Mandelbrot and Julia set of the Mandelbrot map when subjected to noise.

  8. Optimization-Based Approaches to Control of Probabilistic Boolean Networks

    Directory of Open Access Journals (Sweden)

    Koichi Kobayashi

    2017-02-01

    Control of gene regulatory networks is one of the fundamental topics in systems biology. In the last decade, control theory of Boolean networks (BNs), which are well known as a model of gene regulatory networks, has been widely studied. In this review paper, our previously proposed methods for optimal control of probabilistic Boolean networks (PBNs) are introduced. First, the outline of PBNs is explained. Next, an optimal control method using polynomial optimization is explained, where the finite-time optimal control problem is reduced to a polynomial optimization problem. Furthermore, another finite-time optimal control problem, which can be reduced to an integer programming problem, is also explained.
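
    A toy version of the finite-time optimal control problem can be solved by plain backward induction, which is tractable for a two-gene network. The candidate Boolean functions, their selection probabilities and the cost below are invented; the reductions to polynomial and integer programming discussed in the paper matter for larger networks where this enumeration is infeasible.

      # Finite-horizon optimal control of a tiny PBN by dynamic programming.
      import itertools

      STATES = list(itertools.product([0, 1], repeat=2))   # (gene1, gene2)
      CONTROLS = [0, 1]

      def transition(x, u):
          """Next-state distribution: gene 1 follows candidate function A
          (prob 0.7) or B (prob 0.3); the control u flips gene 2."""
          a = x[0] | x[1]                      # candidate function A (assumed)
          b = x[0] & x[1]                      # candidate function B (assumed)
          g2 = 1 - x[1] if u else x[1]
          if a == b:
              return {(a, g2): 1.0}
          return {(a, g2): 0.7, (b, g2): 0.3}

      def cost(x, u):
          return x[0] + x[1] + 0.5 * u         # penalize expression and control

      V = {x: 0.0 for x in STATES}             # terminal cost
      for _ in range(5):                       # horizon of 5 steps
          V = {x: min(cost(x, u) + sum(p * V[y] for y, p in transition(x, u).items())
                      for u in CONTROLS)
               for x in STATES}
      print(V)                                 # optimal expected cost-to-go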

  9. Probabilistic Modeling of Timber Structures

    DEFF Research Database (Denmark)

    Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2005-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The recommended probabilistic model for these basic properties is presented and possible refinements are given related to updating of the probabilistic model given new information, modeling of the spatial variation of strength properties and the duration of load effects.

  10. Storing and Querying Probabilistic XML Using a Probabilistic Relational DBMS

    NARCIS (Netherlands)

    Hollander, E.S.; Keulen, van M.

    2010-01-01

    This work explores the feasibility of storing and querying probabilistic XML in a probabilistic relational database. Our approach is to adapt known techniques for mapping XML to relational data such that the possible worlds are preserved. We show that this approach can work for any XML-to-relational

  11. Quantum probability for probabilists

    CERN Document Server

    Meyer, Paul-André

    1993-01-01

    In recent years, the classical theory of stochastic integration and stochastic differential equations has been extended to a non-commutative set-up to develop models for quantum noises. The author, a specialist of classical stochastic calculus and martingale theory, tries to provide an introduction to this rapidly expanding field in a way which should be accessible to probabilists familiar with the Ito integral. It can also, on the other hand, provide a means of access to the methods of stochastic calculus for physicists familiar with Fock space analysis.

  12. Learning Probabilistic Decision Graphs

    DEFF Research Database (Denmark)

    Jaeger, Manfred; Dalgaard, Jens; Silander, Tomi

    2004-01-01

    Probabilistic decision graphs (PDGs) are a representation language for probability distributions based on binary decision diagrams. PDGs can encode (context-specific) independence relations that cannot be captured in a Bayesian network structure, and can sometimes provide computationally more efficient representations than Bayesian networks. In this paper we present an algorithm for learning PDGs from data. First experiments show that the algorithm is capable of learning optimal PDG representations in some cases, and that the computational efficiency of PDG models learned from real-life data...

  13. Probabilistic quantum error correction

    CERN Document Server

    Fern, J; Fern, Jesse; Terilla, John

    2002-01-01

    There are well known necessary and sufficient conditions for a quantum code to correct a set of errors. We study weaker conditions under which a quantum code may correct errors with probabilities that may be less than one. We work with stabilizer codes and as an application study how the nine qubit code, the seven qubit code, and the five qubit code perform when there are errors on more than one qubit. As a second application, we discuss the concept of syndrome quality and use it to suggest a way that quantum error correction can be practically improved.

  14. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward simulations.

  15. Probabilistic ecorisk assessment for fish, mammals, and birds at the Great Dismal Swamp National Wildlife Refuge

    Data.gov (United States)

    US Fish and Wildlife Service, Department of the Interior — The primary objective of this study was to conduct a probabilistic ecological risk assessment for birds and evaluate risk to fish and mammals that may be exposed to...

  16. Probabilistic population aging

    Science.gov (United States)

    2017-01-01

    We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675

  17. Quantum probabilistic logic programming

    Science.gov (United States)

    Balu, Radhakrishnan

    2015-05-01

    We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating the model generations and verifications via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples to combine statistical ensembles and predicates of first order logic to reason with situations involving uncertainty.

  18. Passage Retrieval: A Probabilistic Technique.

    Science.gov (United States)

    Melucci, Massimo

    1998-01-01

    Presents a probabilistic technique to retrieve passages from texts having a large size or heterogeneous semantic content. Results of experiments comparing the probabilistic technique to one based on a text segmentation algorithm revealed that the passage size affects passage retrieval performance; text organization and query generality may have an…

  19. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet publication] and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties is presented and possible refinements are given related to updating of the probabilistic model given new information, modeling of the spatial variation of strength properties and the duration of load effects.

  20. Incorporating psychological influences in probabilistic cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an 'ideal' project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world 'Money Allocated Is Money Spent' (MAIMS principle); cost underruns are rarely available to protect against cost overruns while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates and are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for
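
    The elements of such an analysis are easy to sketch: three-parameter Weibull marginals, a single-parameter correlation structure imposed through a Gaussian copula, and a MAIMS adjustment in which underruns below the allocated budget are not recovered. All cost elements, parameter values and the budget rule below are invented for illustration; the paper elicits them from experts.

      # Monte Carlo cost roll-up with correlated three-parameter Weibull
      # elements and a MAIMS adjustment (all numbers illustrative).
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      n = 50_000
      marginals = [stats.weibull_min(c=2.0, loc=1.0, scale=0.8),   # design
                   stats.weibull_min(c=1.8, loc=2.0, scale=1.5),   # hardware
                   stats.weibull_min(c=2.5, loc=0.5, scale=0.6)]   # integration

      rho = 0.6                                 # assumed common correlation
      corr = np.full((3, 3), rho) + (1 - rho) * np.eye(3)
      z = rng.multivariate_normal(np.zeros(3), corr, size=n)
      u = stats.norm.cdf(z)                     # Gaussian copula -> uniforms
      costs = np.column_stack([m.ppf(u[:, i]) for i, m in enumerate(marginals)])

      total = costs.sum(axis=1)
      # MAIMS: money allocated is money spent; underruns are not recovered.
      budgets = np.array([m.ppf(0.5) for m in marginals])  # median budgets (assumed)
      total_maims = np.maximum(costs, budgets).sum(axis=1)
      print("mean total:", total.mean(), "| mean under MAIMS:", total_maims.mean())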

  1. Probabilistic Forecasts of Wind Power Generation by Stochastic Differential Equation Models

    DEFF Research Database (Denmark)

    Møller, Jan Kloppenborg; Zugno, Marco; Madsen, Henrik

    2016-01-01

    The increasing penetration of wind power has resulted in larger shares of volatile sources of supply in power systems worldwide. In order to operate such systems efficiently, methods for reliable probabilistic forecasts of future wind power production are essential. It is well known that the conditional density of wind power production is highly dependent on the level of predicted wind power and prediction horizon. This paper describes a new approach for wind power forecasting based on logistic-type stochastic differential equations (SDEs). The SDE formulation allows us to calculate both state-dependent conditional uncertainties as well as correlation structures, and skewness of the predictive distributions as a function of explanatory variables.
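
    A logistic-type SDE keeps normalized power in [0, 1] because its diffusion term vanishes at both boundaries. The following Euler-Maruyama sketch shows how an ensemble of such paths yields a probabilistic forecast; the drift and diffusion forms and all parameter values are assumptions for illustration, not the paper's fitted model.

      # Euler-Maruyama simulation of a logistic-type SDE for normalized wind power.
      import numpy as np

      rng = np.random.default_rng(7)
      theta, sigma = 2.0, 0.8      # mean-reversion speed, noise scale (assumed)
      p_hat = 0.6                  # point forecast of normalized power (assumed)
      dt, steps, n_paths = 1.0 / 60.0, 360, 2000   # 6 hours ahead

      x = np.full(n_paths, 0.5)
      for _ in range(steps):
          drift = theta * (p_hat - x)
          diff = sigma * np.sqrt(np.clip(x * (1.0 - x), 0.0, None))  # 0 at bounds
          x = np.clip(x + drift * dt + diff * np.sqrt(dt) * rng.standard_normal(n_paths),
                      0.0, 1.0)

      # The ensemble of terminal states is the probabilistic forecast.
      print(np.percentile(x, [5, 50, 95]))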

  2. A probabilistic assessment of the contribution of wastewater-irrigated lettuce to Escherichia coli O157:H7 infection risk and disease burden in Kumasi, Ghana.

    Science.gov (United States)

    Seidu, Razak; Abubakari, Amina; Dennis, Isaac Amoah; Heistad, Arve; Stenstrom, Thor Axel; Larbi, John A; Abaidoo, Robert C

    2015-03-01

    Wastewater use for vegetable production is widespread across the cities of many developing countries. Studies on the microbial health risks associated with the practice have largely depended on faecal indicator organisms, with potential underestimation or overestimation of the microbial health risks and disease burdens. This study assessed the Escherichia coli O157:H7 infection risk and diarrhoeal disease burden, measured in disability-adjusted life years (DALYs), associated with the consumption of wastewater-irrigated lettuce in Kumasi, Ghana, using data on E. coli O157:H7 in ready-to-harvest, wastewater-irrigated lettuce. Two exposure scenarios - best case and worst case - associated with a single consumption of wastewater-irrigated lettuce were assessed. The assessment revealed that wastewater-irrigated lettuce contributes to the transmission of E. coli O157:H7 in Kumasi, Ghana. The mean E. coli O157:H7 infection risk and DALYs in the wet and dry seasons, irrespective of the exposure scenario, were above the World Health Organization tolerable infection risk of 2.7 × 10⁻⁷ per person per day and tolerable burden of 10⁻⁶ DALYs per person per year. It is recommended that legislation with clear monitoring indicators and penalties be implemented to ensure that farmers and food sellers fully implement risk-mitigating measures.
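
    The computational core of such an assessment is a Monte Carlo loop propagating contamination and consumption through a dose-response model. The sketch below uses a beta-Poisson dose-response; every distribution and parameter value is an assumption for the sketch, not the study's estimate.

      # Monte Carlo QMRA step: exposure dose -> beta-Poisson infection risk.
      import numpy as np

      rng = np.random.default_rng(3)
      n = 100_000
      conc = rng.lognormal(0.0, 1.0, n)              # pathogens per gram of lettuce
      serving = rng.triangular(10.0, 12.0, 15.0, n)  # grams per serving
      dose = conc * serving

      alpha, n50 = 0.16, 9.2e4                       # illustrative parameters
      p_inf = 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

      print("mean infection risk per exposure:", p_inf.mean())
      print("fraction above the 2.7e-7 benchmark:", (p_inf > 2.7e-7).mean())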

  3. Probabilistic quantum multimeters

    CERN Document Server

    Fiurasek, J; Fiurasek, Jaromir; Dusek, Miloslav

    2004-01-01

    We propose quantum devices that can realize probabilistically different projective measurements on a qubit. The desired measurement basis is selected by the quantum state of a program register. First we analyze the phase-covariant multimeters for a large class of program states, then the universal multimeters for a special choice of program. In both cases we start with deterministic but erroneous devices and then proceed to devices that never make a mistake but from time to time they give an inconclusive result. These multimeters are optimized (for a given type of a program) with respect to the minimum probability of inconclusive result. This concept is further generalized to the multimeters that minimize the error rate for a given probability of an inconclusive result (or vice versa). Finally, we propose a generalization for qudits.

  4. Probabilistic retinal vessel segmentation

    Science.gov (United States)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  5. A probabilistic tsunami hazard assessment for Indonesia

    Science.gov (United States)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-11-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision-making regarding risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent probabilistic tsunami hazard assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazards at the coast using data from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and sampling of probability density functions. For short return periods (100 years) the tsunami hazard is highest along the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting the larger maximum magnitudes there. The annual probability of experiencing a tsunami with a height of > 0.5 m at the coast is greater than 10% for Sumatra, Java, the Sunda islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessments.
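
    The Monte Carlo logic of such a PTHA reduces to: synthesize a long catalogue of tsunami heights at a coastal point from the modelled sources, then read off annual exceedance probabilities. The sketch below does this for a single source zone with an invented rate and an invented height-scaling relation including aleatory scatter.

      # Synthetic-catalogue estimate of annual tsunami exceedance probabilities.
      import numpy as np

      rng = np.random.default_rng(11)
      years = 100_000                      # catalogue length
      rate = 0.2                           # events per year for the zone (assumed)
      n_events = rng.poisson(rate * years)

      mag = 7.0 + rng.exponential(0.5, n_events)   # G-R style magnitudes
      mag = mag[mag <= 9.0]                        # truncated at Mmax (assumed)
      # Placeholder height scaling with log-normal aleatory scatter.
      height = 10.0 ** (0.8 * (mag - 7.0) - 0.5 + 0.3 * rng.standard_normal(mag.size))

      for h in (0.5, 3.0):
          print(f"annual P(height > {h} m) ~ {(height > h).sum() / years:.4f}")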

  6. Probabilistic Evaluation of Quality Risk in Input Stage of Design%设计输入阶段的质量风险概率估算

    Institute of Scientific and Technical Information of China (English)

    张旻翔

    2015-01-01

    The importance of risk probability assessment in risk management, and its influence on the quality control of an engineering design company, is first analyzed, and an approach to risk probability assessment suited to the chemical engineering design field is presented. The calculation procedure for quality risk probability in the design input stage is then discussed in detail. A complete assessment procedure is proposed, covering the analysis of risk factors, the setting of baseline values and the establishment of probability models. Finally, the applicable scope and selection criteria of empirical models built by regression and of stochastic models built from probability distributions are discussed.

  7. Accretion onto some well-known regular black holes

    Energy Technology Data Exchange (ETDEWEB)

    Jawad, Abdul; Shahzad, M.U. [COMSATS Institute of Information Technology, Department of Mathematics, Lahore (Pakistan)

    2016-03-15

    In this work, we discuss the accretion onto static spherically symmetric regular black holes for specific choices of the equation of state parameter. The underlying regular black holes are charged regular black holes using the Fermi-Dirac distribution, logistic distribution, nonlinear electrodynamics, respectively, and Kehagias-Sfetsos asymptotically flat regular black holes. We obtain the critical radius, critical speed, and squared sound speed during the accretion process near the regular black holes. We also study the behavior of radial velocity, energy density, and the rate of change of the mass for each of the regular black holes. (orig.)

  8. New Applications for a Well-known Phenomenon

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    The Talbot effect, the self-imaging of a grating when it is illuminated by a monochromatic plane wave, was first discovered by the English scientist Henry Fox Talbot more than a century ago. It is generally described in university textbooks as an example of Fresnel diffraction and is considered one of the fundamental phenomena in optics.

  9. A Well-Known but Still Surprising Generator

    Science.gov (United States)

    Haugland, Ole Anton

    2014-01-01

    The bicycle generator is often mentioned as an example of a method to produce electric energy. It is cheap and easily accessible, so it is a natural example to use in teaching. There are different types, but I prefer the old side-wall dynamo. The most common explanation of its working principle seems to be something like the illustration in Fig.…

  10. Quantum Algorithms for Some Well-Known NP Problems

    Institute of Scientific and Technical Information of China (English)

    GUO Hao; LONG Gui-Lu; LI Feng

    2002-01-01

    It is known that quantum computers are more powerful than classical computers. In this paper we present quantum algorithms for some famous NP problems in graph theory and combinatorics; these quantum algorithms are at least quadratically faster than the classical ones.

  11. Jiang Linjun, A Well-Known Landscape Painter

    Institute of Scientific and Technical Information of China (English)

    2004-01-01

    It is an artistic treat to enjoy the landscape paintings of Jiang Linjun, who not only makes use of traditional skills but also absorbs new aesthetic concepts in creating his works. Through years of creative practice, Jiang has studied in depth the paintings and calligraphy of the Song, Yuan, Ming and Qing dynasties (960-1911) and has kept copying them to learn from the masters' works and enrich his own painting knowledge and skills.

  12. Log-concavity property for some well-known distributions

    Directory of Open Access Journals (Sweden)

    G. R. Mohtashami Borzadaran

    2011-12-01

    Interesting properties and propositions in many branches of science, such as economics, have been obtained using the property that the cumulative distribution function of a random variable is a concave function. Caplin and Nalebuff (1988, 1989), Bagnoli and Khanna (1989) and Bagnoli and Bergstrom (1989, 2005) have discussed the log-concavity property of probability distributions and their applications, especially in economics. Log-concavity concerns a twice differentiable real-valued function g whose domain is an interval on the extended real line. A function g is said to be log-concave on the interval (a,b) if the function ln(g) is a concave function on (a,b). Log-concavity of g on (a,b) is equivalent to g'/g being monotone decreasing on (a,b), or to (ln(g))'' being non-positive there. Earlier authors have obtained log-concavity for distributions such as normal, logistic, extreme-value, exponential, Laplace, Weibull, power function, uniform, gamma, beta, Pareto, log-normal, Student's t, Cauchy and F distributions. We have discussed and introduced the continuous versions of the Pearson family, also found the log-concavity for this family in general cases, and then obtained the log-concavity property for each distribution that is a member of the Pearson family. For the Burr family these cases have been calculated, even for each distribution that belongs to the Burr family. Also, log-concavity results for distributions such as generalized gamma distributions, Feller-Pareto distributions, generalized inverse Gaussian distributions and generalized log-normal distributions have been obtained.
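
    A standard textbook instance of the definition above (not taken from the paper) is the normal density, which is log-concave on the whole real line:

      % Log-concavity of the normal density g(x) = N(mu, sigma^2):
      \ln g(x) = -\frac{(x-\mu)^2}{2\sigma^2} - \ln\!\bigl(\sigma\sqrt{2\pi}\bigr),
      \qquad
      (\ln g)''(x) = -\frac{1}{\sigma^2} < 0 \quad \text{for all } x,
      % so ln(g) is concave and g is log-concave on the entire real line.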

  13. Korean red ginseng,a well-known medicine

    Institute of Scientific and Technical Information of China (English)

    2006-01-01

    The efficacy and applications of ginseng are also described in many other journals of Oriental medicine, which rate ginseng as a master medicine that plays a major role in prescriptions. Dr. I. I. Brekhmann, a Russian scientist

  14. A Well-Known But Still Surprising Generator

    Science.gov (United States)

    Haugland, Ole Anton

    2014-12-01

    The bicycle generator is often mentioned as an example of a method to produce electric energy. It is cheap and easily accessible, so it is a natural example to use in teaching. There are different types, but I prefer the old side-wall dynamo. The most common explanation of its working principle seems to be something like the illustration in Fig. 1. The illustration is taken from a popular textbook in the Norwegian junior high school.1 Typically it is explained as a system of a moving magnet or coils that directly results in a varying magnetic field through the coils. According to Faraday's law a voltage is induced in the coils. Simple and easy! A few times I have had a chance to glimpse into a bicycle generator, and I was somewhat surprised to sense that the magnet rotated parallel to the turns of the coil. How could the flux through the coil change and induce a voltage when the magnet rotated parallel to the turns of the coil? When teaching electromagnetic induction I have showed the students a dismantled generator and asked them how this could work. They naturally found that this was more difficult to understand than the principle illustrated in Fig. 1. Other authors in this journal have discussed even more challenging questions concerning electric generators.2,3

  15. Accretion onto Some Well-Known Regular Black Holes

    CERN Document Server

    Jawad, Abdul

    2016-01-01

    In this work, we discuss the accretion onto static spherically symmetric regular black holes for specific choices of the equation of state parameter. The underlying regular black holes are charged regular black holes using the Fermi-Dirac distribution, logistic distribution, nonlinear electrodynamics, respectively, and the Kehagias-Sfetsos asymptotically flat regular black hole. We obtain the critical radius, critical speed and squared sound speed during the accretion process near the regular black holes. We also study the behavior of radial velocity, energy density and the rate of change of mass for each regular black hole.

  17. Study on quantification method based on Monte Carlo sampling for multiunit probabilistic safety assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Kye Min [KHNP Central Research Institute, Daejeon (Korea, Republic of); Han, Sang Hoon; Park, Jin Hee; Lim, Ho Gon; Yang, Joon Yang [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of)

    2017-06-15

    In Korea, many nuclear power plants operate at a single site based on geographical characteristics, but the population density near the sites is higher than that in other countries. Thus, multiunit accidents are a more important consideration than in other countries and should be addressed appropriately. Currently, there are many issues related to a multiunit probabilistic safety assessment (PSA). One of them is the quantification of a multiunit PSA model. A traditional PSA uses a Boolean manipulation of the fault tree in terms of the minimal cut set. However, such methods have some limitations when rare event approximations cannot be used effectively or a very small truncation limit should be applied to identify accident sequence combinations for a multiunit site. In particular, it is well known that seismic risk in terms of core damage frequency can be overestimated because there are many events that have a high failure probability. In this study, we propose a quantification method based on a Monte Carlo approach for a multiunit PSA model. This method can consider all possible accident sequence combinations in a multiunit site and calculate a more exact value for events that have a high failure probability. An example model for six identical units at a site was also developed and quantified to confirm the applicability of the proposed method.
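
    The overestimation mentioned above is easy to reproduce on a toy fault tree. With high-probability basic events, the rare-event approximation (summing minimal-cut-set probabilities) exceeds the true top-event probability, while direct Monte Carlo sampling does not; the event probabilities and cut sets below are invented.

      # Rare-event approximation vs. Monte Carlo for high-probability events.
      import numpy as np

      rng = np.random.default_rng(5)
      p = {"A": 0.4, "B": 0.5, "C": 0.6}      # high seismic failure probabilities
      cut_sets = [("A", "B"), ("B", "C")]     # minimal cut sets of the site model

      # Rare-event approximation: sum of cut-set probabilities.
      approx = sum(np.prod([p[e] for e in cs]) for cs in cut_sets)

      # Monte Carlo: sample every basic event and evaluate the top event exactly.
      n = 1_000_000
      s = {e: rng.random(n) < pe for e, pe in p.items()}
      top = (s["A"] & s["B"]) | (s["B"] & s["C"])
      print("rare-event approximation:", approx)   # 0.20 + 0.30 = 0.50
      print("Monte Carlo estimate:", top.mean())   # P(AB or BC) = 0.38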

  18. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  20. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  1. Common Difficulties with Probabilistic Reasoning.

    Science.gov (United States)

    Hope, Jack A.; Kelly, Ivan W.

    1983-01-01

    Several common errors reflecting difficulties in probabilistic reasoning are identified, relating to ambiguity, previous outcomes, sampling, unusual events, and estimating. Knowledge of these mistakes and interpretations may help mathematics teachers understand the thought processes of their students. (MNS)

  2. Probabilistic quantitative microbial risk assessment model of norovirus from wastewater irrigated vegetables in Ghana using genome copies and fecal indicator ratio conversion for estimating exposure dose

    DEFF Research Database (Denmark)

    Owusu-Ansah, Emmanuel de-Graft Johnson; Sampson, Angelina; Amponsah, Samuel K.

    2017-01-01

    The need to replace the commonly applied fecal indicator conversion ratio (an assumption of 1:10⁻⁵ virus to fecal indicator organism) in Quantitative Microbial Risk Assessment (QMRA) with models based on quantitative data on the virus of interest has gained prominence due to the different physical and environmental factors that might influence the reliability of using indicator organisms in microbial risk assessment. The challenges facing analytical studies on virus enumeration (genome copies or particles) have contributed to the already existing lack of data in QMRA modelling. This study ... to estimate the norovirus count. In all scenarios of using different water sources, the application of the fecal indicator conversion ratio underestimated the norovirus disease burden, measured by the Disability Adjusted Life Years (DALYs), when compared to results using the genome copies norovirus data...

  3. Multiple-Strain Approach and Probabilistic Modeling of Consumer Habits in Quantitative Microbial Risk Assessment: A Quantitative Assessment of Exposure to Staphylococcal Enterotoxin A in Raw Milk.

    Science.gov (United States)

    Crotta, Matteo; Rizzi, Rita; Varisco, Giorgio; Daminelli, Paolo; Cunico, Elena Cosciani; Luini, Mario; Graber, Hans Ulrich; Paterlini, Franco; Guitian, Javier

    2016-03-01

    Quantitative microbial risk assessment (QMRA) models are extensively applied to inform management of a broad range of food safety risks. Inevitably, QMRA modeling involves an element of simplification of the biological process of interest. Two features that are frequently simplified or disregarded are the pathogenicity of multiple strains of a single pathogen and consumer behavior at the household level. In this study, we developed a QMRA model with a multiple-strain approach and a consumer phase module (CPM) based on uncertainty distributions fitted from field data. We modeled exposure to staphylococcal enterotoxin A in raw milk in Lombardy; a specific enterotoxin production module was thus included. The model is adaptable and could be used to assess the risk related to other pathogens in raw milk as well as other staphylococcal enterotoxins. The multiple-strain approach, implemented as a multinomial process, allowed the inclusion of variability and uncertainty with regard to pathogenicity at the bacterial level. Data from 301 questionnaires submitted to raw milk consumers were used to obtain uncertainty distributions for the CPM. The distributions were modeled to be easily updatable with further data or evidence. The sources of uncertainty due to the multiple-strain approach and the CPM were identified, and their impact on the output was assessed by comparing specific scenarios to the baseline. When the distributions reflecting the uncertainty in consumer behavior were fixed to the 95th percentile, the risk of exposure increased up to 160 times. This reflects the importance of taking into consideration the diversity of consumers' habits at the household level and the impact that the lack of knowledge about variables in the CPM can have on the final QMRA estimates. The multiple-strain approach lends itself to use in other food matrices besides raw milk and allows the model to better capture the complexity of the real world and to be capable of geographical
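
    The multinomial strain step can be sketched in a few lines: the bacterial load of each serving is split across strains by a multinomial draw, and only the enterotoxin-producing strains contribute to the hazard. Strain labels, prevalences and the load distribution below are invented for illustration.

      # Multinomial allocation of bacterial counts across strains (illustrative).
      import numpy as np

      rng = np.random.default_rng(9)
      prevalence = [0.15, 0.40, 0.25, 0.20]    # assumed strain mix; first is sea+
      load = rng.poisson(500, 10_000)          # S. aureus per serving (assumed)

      counts = np.array([rng.multinomial(n, prevalence) for n in load])
      toxigenic = counts[:, 0]                 # cells of the enterotoxin-A strain
      print("servings with > 100 sea+ cells:", (toxigenic > 100).mean())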

  4. Interval probabilistic neural network.

    Science.gov (United States)

    Kowalski, Piotr A; Kulczycki, Piotr

    2017-01-01

    Automated classification systems have allowed for the rapid development of exploratory data analysis. Such systems increase the independence from human intervention in obtaining analysis results, especially when inaccurate information is under consideration. The aim of this paper is to present a novel neural network approach for use in classifying interval information. The presented methodology is a generalization of the probabilistic neural network for interval data processing. The simple structure of this neural classification algorithm makes it applicable for research purposes. The procedure is based on the Bayes approach, ensuring minimal potential losses with regard to classification errors. In this article, the topological structure of the network and the learning process are described in detail. The correctness of the procedure proposed here has been verified by way of numerical tests. These tests include examples of both synthetic data and benchmark instances. The results of numerical verification, carried out for different shapes of data sets, together with a comparative analysis against other methods of similar conditioning, have validated both the concept presented here and its positive features.
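
    For orientation, the classical point-data PNN that the paper generalizes can be written in a few lines: Gaussian kernels centred at the training points estimate each class-conditional density, and the Bayes rule picks the class with the largest estimate. The data and the smoothing parameter below are invented.

      # Minimal classical probabilistic neural network (Parzen/Bayes classifier).
      import numpy as np

      def pnn_classify(X_train, y_train, x, h=0.5):
          """Return the class with the largest kernel density estimate at x."""
          scores = {}
          for c in np.unique(y_train):
              Xc = X_train[y_train == c]
              d2 = ((Xc - x) ** 2).sum(axis=1)
              scores[c] = np.exp(-d2 / (2.0 * h ** 2)).mean()  # p(x|c), up to a constant
          return max(scores, key=scores.get)

      X = np.array([[0.0, 0.0], [0.2, 0.1], [1.0, 1.0], [1.1, 0.9]])
      y = np.array([0, 0, 1, 1])
      print(pnn_classify(X, y, np.array([0.9, 0.8])))   # -> 1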

  5. Probabilistic risk assessment and nuclear waste transportation: A case study of the use of RADTRAN in the 1986 Environmental Assessment for Yucca Mountain

    Energy Technology Data Exchange (ETDEWEB)

    Resnikoff, M. [Radioactive Waste Management Associates, New York, NY (United States)

    1990-12-01

    The analysis of the risks of transporting irradiated nuclear fuel to a federal repository, Appendix A of the DOE Environmental Assessment for Yucca Mountain (DOE84), is based on the RADTRAN model and input parameters. The RADTRAN computer code calculates the radiation exposures and health effects under normal or incident-free transport, and over all credible accident conditions. The RADTRAN model also calculates the economic consequences of transportation accidents, though these costs were not included in the Department's Environmental Assessment for the proposed Yucca Mountain repository.

  6. Guidance for the definition and application of probabilistic safety criteria

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT Technical Research Centre of Finland (Finland)); Knochenhauer, M. (Scandpower AB (Sweden))

    2011-05-15

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  7. Simulation Of Probabilistic Wind Loads On A Building

    Science.gov (United States)

    Chamis, Christos C.; Shah, Ashwin R.

    1994-01-01

    A method of simulating probabilistic wind loads on a building has been developed. Numerical results of the simulation are used to assess the reliability of the building and the risk associated with the tendency of large gusts or high steady winds to cause the building to sway, buckle, and/or overturn. By using the method to analyze a proposed design in an iterative design cycle, a building can be designed for a specified reliability.
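
    A hypothetical Monte Carlo version of this idea: sample wind speeds, convert them to a load through a drag-type relation, and compare with an uncertain capacity. The Weibull wind model, drag coefficient and capacity statistics below are assumptions for illustration only.

```python
# Toy Monte Carlo reliability estimate for a wind-loaded building element.
import random

def simulate(n=200_000):
    failures = 0
    for _ in range(n):
        v = random.weibullvariate(25.0, 2.0)          # wind speed (m/s), assumed Weibull
        load = 0.5 * 1.25 * v ** 2 * 1.2              # q = 0.5*rho*v^2*Cd, per unit area (Pa)
        capacity = random.normalvariate(900.0, 90.0)  # resisting pressure (Pa), assumed
        failures += load > capacity
    return 1 - failures / n

print(f"reliability ~ {simulate():.5f}")
```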

  8. Probabilistic risk analysis of casing drilling operation for an onshore Brazilian well; Analise probabilistica de risco de uma operacao de casing drilling para um poco terrestre no Brasil

    Energy Technology Data Exchange (ETDEWEB)

    Jacinto, Carlos M.C.; Petersen, Flavia C.; Placido, Joao C.R. [PETROBRAS, Rio de Janeiro, RJ (Brazil); Garcia, Pauli A.A. [Universidade Federal Fluminense (UFF), Niteroi, RJ (Brazil)

    2008-07-01

    This paper presents an approach to hazard identification and risk quantification for the use of a retrievable BHA, in a casing drilling system, during the 12 1/4 phase of an onshore well. The approach comprises: execution of a hazard and operability study; prioritization of critical deviations; modeling of critical deviations by means of event sequence diagrams, fault trees and Bayesian networks; modeling and simulation of a dynamic decision tree; and analysis of expert opinion. The results obtained include the time distribution to reach the different end states modeled in the decision tree (sidetrack, operation cancellation, or success); the probabilities of reaching each end state; and recommendations to improve the probability of success. The approach proved efficient in that it yields significant results to support decisions involving casing drilling operations. (author)

  9. Irrigation and Instream Management under Drought Conditions using Probabilistic Constraints

    Science.gov (United States)

    Oviedo-Salcedo, D. M.; Cai, X.; Valocchi, A. J.

    2009-12-01

    It is well known that river-aquifer flux exchange may be an important control on low-flow conditions in a stream. Moreover, the connections between streams and underlying formations can be spatially variable due to geological heterogeneity and landscape topography. For example, during drought seasons, farming activities may induce critical peak pumping rates to supply irrigation water needs for crops, and this leads to increased concerns about reductions in baseflow and adverse impacts upon riverine ecosystems. Quantitative management of subsurface water resources is a key requirement in this human-nature interaction system for evaluating the tradeoffs between irrigation for agriculture and the low-flow requirements of ecosystems. This work presents an optimization scheme built on systems reliability-based design optimization (SRBDO) analysis, which accounts for prescribed probabilistic constraint evaluation. This approach can provide optimal solutions in the presence of uncertainty with a higher level of confidence. In addition, the proposed methodology quantifies and controls the risk of failure. SRBDO has been developed in the aerospace industry and extensively applied in the field of structural engineering, but has seen only limited application in the field of hydrology. SRBDO uses probability theory to model uncertainty and determines the probability of failure by solving a nonlinear mathematical programming problem. Furthermore, reliability-based design optimization provides complete and detailed insight into the relative importance of each random variable involved in the application, in this case the coupled surface water-groundwater system. Importance measures and sensitivity analyses of both the random variables and the probability distribution function parameters are integral components of the system reliability analysis. Therefore, with this methodology it is possible to assess the contribution of each uncertain variable on the total

  10. Probabilistic Projections of Future Sea-Level Change and Their Implications for Flood Risk Management: Insights from the American Climate Prospectus

    Science.gov (United States)

    Kopp, R. E., III; Delgado, M.; Horton, R. M.; Houser, T.; Little, C. M.; Muir-Wood, R.; Oppenheimer, M.; Rasmussen, D. M., Jr.; Strauss, B.; Tebaldi, C.

    2014-12-01

    Global mean sea level (GMSL) rise projections are insufficient for adaptation planning; local decisions require local projections that characterize risk over a range of timeframes and tolerances. We present a global set of local sea level (LSL) projections to inform decisions on timescales ranging from the coming decades through the 22nd century. We present complete probability distributions, informed by a combination of expert community assessment, expert elicitation, and process modeling [1]. We illustrate the application of this framework by estimating the joint distribution of future sea-level change and coastal flooding, and associated economic costs [1,2]. In much of the world in the current century, differences in median LSL projections are due primarily to varying levels of non-climatic uplift or subsidence. In the 22nd century and in the high-end tails, larger ice sheet contributions, particularly from the Antarctic ice sheet (AIS), contribute significantly to site-to-site differences. Uncertainty in GMSL and most LSL projections is dominated by the uncertain AIS component. Sea-level rise dramatically reshapes flood risk. For example, at the New York City (Battery) tide gauge, our projections indicate a likely (67% probability) 21st century LSL rise under RCP 8.5 of 65-129 cm (1-in-20 chance of exceeding 154 cm). Convolving the distribution of projected sea-level rise with the extreme value distribution of flood return periods indicates that this rise will cause the current 1.80 m '1-in-100 year' flood event to occur an expected nine times over the 21st century, equivalent to the expected number of '1-in-11 year' floods in the absence of sea-level change. Projected sea-level rise for 2100 under RCP 8.5 would likely place $80-160 billion of current property in New York below the high tide line, with a 1-in-20 chance of losses >$190 billion. Even without accounting for potential changes in storms themselves, it would likely increase average annual storm
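
    The convolution step described here can be sketched numerically: draw a sea-level-rise trajectory, shift an extreme-value flood-height curve accordingly, and count expected exceedances. The Gumbel parameters and the normal distribution for the 2100 rise below are rough assumptions chosen only to echo the quoted 65-129 cm likely range, not the study's actual fits.

```python
# Hedged sketch: expected exceedances of today's 1-in-100-yr flood level this
# century, combining an assumed SLR distribution with a Gumbel return curve.
import math, random

MU, BETA = 0.80, 0.15      # Gumbel location/scale of annual max flood height (m), assumed
Z100 = MU - BETA * math.log(-math.log(1 - 1 / 100.0))   # today's 1-in-100-yr height

def annual_rate(threshold):
    # P(annual maximum > threshold) under the Gumbel model
    return 1 - math.exp(-math.exp(-(threshold - MU) / BETA))

def expected_floods(n=20_000, years=100):
    total = 0.0
    for _ in range(n):
        slr_2100 = random.normalvariate(0.97, 0.27)     # assumed, ~ the 65-129 cm likely range
        # Linear rise; the effective threshold drops as sea level comes up.
        total += sum(annual_rate(Z100 - slr_2100 * t / years) for t in range(1, years + 1))
    return total / n

print(f"expected exceedances this century ~ {expected_floods():.1f}")
```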

  11. Maintenance risk management using probabilistic safety assessment in nuclear power plants [应用PSA方法进行核电站维修风险管理]

    Institute of Scientific and Technical Information of China (English)

    何旭洪; 童节娟; 薛大知

    2006-01-01

    This paper discusses how probabilistic safety assessment (PSA) methods can be applied to manage maintenance risk at nuclear power plants. Four stages of maintenance risk management are identified, and the risk management involved in short-term maintenance planning is analyzed in detail, including the implementation process, the risk metrics to be evaluated, and the risk management actions that can be taken. The MRM (maintenance risk monitor) system developed for this purpose is introduced through a practical case study. With PSA methods, the plant's various risk sources and the capability of its mitigating systems can be analyzed in an integrated way, yielding quantitative and comprehensive risk information, so that plant maintenance risk can be effectively controlled and managed and high-risk plant configurations avoided.

  12. Risk assessment of CST-7 proposed waste treatment and storage facilities Volume I: Limited-scope probabilistic risk assessment (PRA) of proposed CST-7 waste treatment & storage facilities. Volume II: Preliminary hazards analysis of proposed CST-7 waste storage & treatment facilities

    Energy Technology Data Exchange (ETDEWEB)

    Sasser, K.

    1994-06-01

    In FY 1993, the Los Alamos National Laboratory Waste Management Group [CST-7 (formerly EM-7)] requested the Probabilistic Risk and Hazards Analysis Group [TSA-11 (formerly N-6)] to conduct a study of the hazards associated with several CST-7 facilities. Among these facilities are the Hazardous Waste Treatment Facility (HWTF), the HWTF Drum Storage Building (DSB), and the Mixed Waste Receiving and Storage Facility (MWRSF), which are proposed for construction beginning in 1996. These facilities are needed to upgrade the Laboratory's storage capability for hazardous and mixed wastes and to provide treatment capabilities for wastes in cases where offsite treatment is not available or desirable. These facilities will assist Los Alamos in complying with federal and state regulations.

  13. Probabilistic aspects of Wigner function

    CERN Document Server

    Usenko, C V

    2004-01-01

    The Wigner function of quantum systems is an effective instrument for constructing approximate classical descriptions of systems for which a classical approximation is possible. More recently, the Wigner function formalism has also been applied to seek indications of specifically quantum properties of quantum systems that make the construction of a classical approximation impossible. Most often, the indication used is the existence of negative values of the Wigner function for specific states of the quantum system being studied. The existence of such values itself prejudices the probabilistic interpretation of the Wigner function, although for an arbitrary observable depending jointly on the coordinate and the momentum of the quantum system it is precisely the Wigner function that provides an effective instrument for calculating the average value and other statistical characteristics. In this paper a probabilistic interpretation of the Wigner function based on coordination of the theoretical-probabilistic definition of the ...

  14. Quantum probabilistically cloning and computation

    Institute of Scientific and Technical Information of China (English)

    2008-01-01

    In this article we review the usefulness of probabilistic cloning and present examples of quantum computation tasks for which quantum cloning offers an advantage that cannot be matched by any approach that does not resort to it. In these quantum computations, one needs to distribute quantum information contained in states about which we have some partial information. To perform the quantum computations, one uses a state-dependent probabilistic quantum cloning procedure to distribute quantum information in the middle of a quantum computation. We also discuss the achievable efficiencies and the efficient quantum logic network for probabilistically cloning the quantum states used in implementing quantum computation tasks for which cloning provides an enhancement in performance.

  15. Emergent patterns from probabilistic generalizations of lateral activation and inhibition

    Science.gov (United States)

    Kabla, Alexandre

    2016-01-01

    The combination of laterally activating and inhibiting feedbacks is well known to spontaneously generate spatial organization. It was introduced by Gierer and Meinhardt as an extension of Turing's great insight that two reacting and diffusing chemicals can spontaneously drive spatial morphogenesis per se. In this study, we develop an accessible nonlinear and discrete probabilistic model to study simple generalizations of lateral activation and inhibition. By doing so, we identify a range of modes of morphogenesis beyond the familiar Turing-type modes; notably, beyond stripes, hexagonal nets, pores and labyrinths, we identify labyrinthine highways, Kagome lattices, gyrating labyrinths and multi-colour travelling waves and spirals. The results are discussed within the context of Turing's original motivating interest: the mechanisms which underpin the morphogenesis of living organisms. PMID:27170648

  16. Efficient Quadrature Operator Using Dual-Perspectives-Fusion Probabilistic Weights

    Directory of Open Access Journals (Sweden)

    Ashok Sahai

    2009-08-01

    A new quadrature formula has been proposed which uses weight functions derived using a probabilistic approach and a rather ingenious 'fusion' of two dual perspectives. Unlike the complicatedly structured quadrature formulae of Gauss, Hermite and others of similar type, the proposed quadrature formula only needs the values of the integrand at user-defined equidistant points in the interval of integration. The weights are functions of the variable in the integrand, and are not mere constants. The quadrature formula has been compared empirically with the simple classical method of numerical integration using the well-known Bernstein operator. The percentage absolute relative errors for the proposed quadrature formula and for the Bernstein operator have been computed for certain selected functions and with different numbers of node points in the interval of integration. It has been observed that the proposed quadrature formula produces significantly better results.

  17. On the interpretation, verification and calibration of ternary probabilistic forecasts

    CERN Document Server

    Jupp, Tim E; Coelho, Caio A S; Stephenson, David B

    2011-01-01

    We develop a geometrical interpretation of ternary probabilistic forecasts in which forecasts and observations are regarded as points inside a triangle. Within the triangle, we define a continuous colour palette in which hue and colour saturation are defined with reference to the observed climatology. In contrast to current methods, forecast maps created with this colour scheme convey all of the information present in each ternary forecast. The geometrical interpretation is then extended to verification under quadratic scoring rules (of which the Brier Score and the Ranked Probability Score are well-known examples). Each scoring rule defines an associated triangle in which the square roots of the score, the reliability, the uncertainty and the resolution all have natural interpretations as root-mean-square distances. This leads to our proposal for a Ternary Reliability Diagram in which data relating to verification and calibration can be summarised. We illustrate these id...

  18. HISTORY BASED PROBABILISTIC BACKOFF ALGORITHM

    Directory of Open Access Journals (Sweden)

    Narendran Rajagopalan

    2012-01-01

    Performance of a wireless LAN can be improved at each layer of the protocol stack with respect to energy efficiency. The media access control layer is responsible for key functions such as access control and flow control. During contention, a backoff algorithm is used to gain access to the medium with minimum probability of collision. After studying different variations of backoff algorithms that have been proposed, a new variant called the History-Based Probabilistic Backoff algorithm is proposed. Mathematical analysis and simulation results using NS-2 show that the proposed History-Based Probabilistic Backoff algorithm performs better than the binary exponential backoff algorithm.
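
    As an illustration of the general idea (not the paper's exact algorithm), the sketch below contrasts plain binary exponential backoff with a variant whose window adjustments are biased by a summary of recent channel history; the window limits, collision probability and history heuristic are invented for the example.

```python
# Toy comparison of binary exponential backoff (BEB) with a history-biased
# contention-window update. All parameters are illustrative assumptions.
import random

CW_MIN, CW_MAX = 16, 1024

def beb(cw, collided):
    # Classic BEB: double on collision, reset on success.
    return min(cw * 2, CW_MAX) if collided else CW_MIN

def history_based(cw, collided, success_ratio):
    # success_ratio in [0, 1] summarises recent channel history (assumption).
    if collided:
        return min(int(cw * (2 - success_ratio)), CW_MAX)  # grow less if history is good
    return max(int(cw * (0.5 + 0.5 * (1 - success_ratio))), CW_MIN)

cw, wins = CW_MIN, 0
for _ in range(1000):
    collided = random.random() < 0.3      # assumed collision probability
    wins += not collided
    cw = history_based(cw, collided, wins / 1000)
print("final contention window:", cw)
```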

  19. Probabilistic Design of Wind Turbines

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard

    , new and more refined design methods must be developed. These methods can for instance be developed using probabilistic design where the uncertainties in all phases of the design life are taken into account. The main aim of the present thesis is to develop models for probabilistic design of wind....... The uncertainty related to the existing methods for estimating the loads during operation is assessed by applying these methods to a case where the load response is assumed to be Gaussian. In this case an approximate analytical solution exists for a statistical description of the extreme load response. In general...

  20. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  1. Probabilistic reasoning in data analysis.

    Science.gov (United States)

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.

  2. Probabilistic Approach to Rough Set Theory

    Institute of Scientific and Technical Information of China (English)

    Wojciech Ziarko

    2006-01-01

    The presentation introduces the basic ideas and investigates the probabilistic approach to rough set theory. The major aspects of the probabilistic approach to rough set theory to be explored during the presentation are: the probabilistic view of the approximation space, the probabilistic approximations of sets, as expressed via variable precision and Bayesian rough set models, and probabilistic dependencies between sets and multi-valued attributes, as expressed by the absolute certainty gain and expected certainty gain measures, respectively. The probabilistic dependency measures allow for representation of subtle stochastic associations between attributes. They also allow for more comprehensive evaluation of rules computed from data and for computation of attribute reduct, core and significance factors in probabilistic decision tables. It will be shown that the probabilistic dependency measure-based attribute reduction techniques are also extendible to hierarchies of decision tables. The presentation will include computational examples to illustrate presented concepts and to indicate possible practical applications.

  3. Probabilistic Logic Programming under Answer Sets Semantics

    Institute of Scientific and Technical Information of China (English)

    王洁; 鞠实儿

    2003-01-01

    Although traditional logic programming languages provide powerful tools for knowledge representation, they cannot deal with uncertain information (e.g., probabilistic information). In this paper, we propose a probabilistic logic programming language by introducing probability into a general logic programming language. The work combines 4-valued logic with probability. Conditional probability can be easily represented in a probabilistic logic program. The semantics of such a probabilistic logic program i...

  4. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. The distribution of horizontal peak ground acceleration (PGA) was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground-motion prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years) ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen's seismic hazard are the events from the West Arabian Shield seismic zone.
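
    The exceedance probabilities quoted above map to return periods through the Poisson assumption that is standard in such studies; a few lines make the arithmetic explicit.

```python
# Under a Poisson occurrence model, P(exceedance in t years) = 1 - exp(-t/T),
# so the return period is T = -t / ln(1 - P).
import math

for p, t in [(0.10, 50), (0.50, 50)]:
    T = -t / math.log(1 - p)
    print(f"{p:.0%} in {t} yr -> return period ~ {T:.0f} yr")
# 10% in 50 yr -> ~475 yr; 50% in 50 yr -> ~72 yr
```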

  5. A Probabilistic Ontology Development Methodology

    Science.gov (United States)

    2014-06-01

  6. Probabilistic aspects of ocean waves

    NARCIS (Netherlands)

    Battjes, J.A.

    1977-01-01

    Background material for a special lecture on probabilistic aspects of ocean waves for a seminar in Trondheim. It describes long term statistics and short term statistics. Statistical distributions of waves, directional spectra and frequency spectra. Sea state parameters, response peaks, encounter

  7. Sound Probabilistic #SAT with Projection

    Directory of Open Access Journals (Sweden)

    Vladimir Klebanov

    2016-10-01

    We present an improved method for sound probabilistic estimation of the model count of a boolean formula under projection. The problem solved can be used to encode a variety of quantitative program analyses, such as those concerning security or resource consumption. We implement the technique and discuss its application to quantifying information flow in programs.

  8. Probabilistic localisation in repetitive environments

    NARCIS (Netherlands)

    Vroegindeweij, Bastiaan A.; IJsselmuiden, Joris; Henten, van Eldert J.

    2016-01-01

    One of the problems in loose housing systems for laying hens is the laying of eggs on the floor, which need to be collected manually. In previous work, PoultryBot was presented to assist in this and other tasks. Here, probabilistic localisation with a particle filter is evaluated for use inside p

  9. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  10. Model Checking with Probabilistic Tabled Logic Programming

    CERN Document Server

    Gorlin, Andrey; Smolka, Scott A

    2012-01-01

    We present a formulation of the problem of probabilistic model checking as one of query evaluation over probabilistic logic programs. To the best of our knowledge, our formulation is the first of its kind, and it covers a rich class of probabilistic models and probabilistic temporal logics. The inference algorithms of existing probabilistic logic-programming systems are well defined only for queries with a finite number of explanations. This restriction prohibits the encoding of probabilistic model checkers, where explanations correspond to executions of the system being model checked. To overcome this restriction, we propose a more general inference algorithm that uses finite generative structures (similar to automata) to represent families of explanations. The inference algorithm computes the probability of a possibly infinite set of explanations directly from the finite generative structure. We have implemented our inference algorithm in XSB Prolog, and use this implementation to encode probabilistic model...

  11. Sensitivity analysis and probabilistic re-entry modeling for debris using high dimensional model representation based uncertainty treatment

    Science.gov (United States)

    Mehta, Piyush M.; Kubicek, Martin; Minisci, Edmondo; Vasile, Massimiliano

    2017-01-01

    Well-known tools developed for satellite and debris re-entry perform break-up and trajectory simulations in a deterministic sense and do not perform any uncertainty treatment. The treatment of uncertainties associated with the re-entry of a space object requires a probabilistic approach. A Monte Carlo campaign is the intuitive approach to performing a probabilistic analysis; however, it is computationally very expensive. In this work, we use a recently developed approach based on a new derivation of the high dimensional model representation method for implementing a computationally efficient probabilistic analysis approach for re-entry. Both aleatoric and epistemic uncertainties that affect aerodynamic trajectory and ground impact location are considered. The method is applicable to both controlled and uncontrolled re-entry scenarios. The resulting ground impact distributions are far from the typically used Gaussian or ellipsoid distributions.

  12. A Probabilistic and Observation Based Methodology to Estimate Small Craft Harbor Vulnerability to Tsunami Events

    Science.gov (United States)

    Keen, A. S.; Lynett, P. J.; Ayca, A.

    2016-12-01

    Because of the damage resulting from the 2010 Chile and 2011 Japanese tele-tsunamis, the tsunami risk to the small craft marinas in California has become an important concern. The talk will outline an assessment tool which can be used to assess the tsunami hazard to small craft harbors. The methodology is based on the demand and structural capacity of the floating dock system, composed of floating docks/fingers and moored vessels. The structural demand is determined using a Monte Carlo methodology. Monte Carlo methodology is a probabilistic computational tool where the governing equations might be well known, but the independent variables of the input (demand) as well as the resisting structural components (capacity) may not be completely known. The Monte Carlo approach uses a distribution of each variable, and then uses that random variable within the described parameters to generate a single computation. The process then repeats hundreds or thousands of times. The numerical model "Method of Splitting Tsunamis" (MOST) has been used to determine the inputs for the small craft harbors within California. Hydrodynamic model results of current speed, direction and surface elevation were incorporated via the drag equations to provide the basis of the demand term. To determine the capacities, an inspection program was developed to identify common features of structural components. A total of six harbors have been inspected, ranging from Crescent City in Northern California to Oceanside Harbor in Southern California. Results from the inspection program were used to develop component capacity tables which incorporated the basic specifications of each component (e.g. bolt size and configuration) and a reduction factor (which accounts for the component's reduction in capacity with age) to estimate in situ capacities. Like the demand term, these capacities are added probabilistically into the model. To date the model has been applied to Santa Cruz Harbor as well as Noyo River. Once
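
    A minimal sketch of the Monte Carlo demand/capacity comparison described above follows. The distributions, drag coefficient, wetted area and capacity statistics are illustrative assumptions, not the survey or inspection values.

```python
# Toy Monte Carlo: drag-force demand on a mooring component vs. its capacity.
import random

RHO = 1025.0                    # seawater density, kg/m^3

def one_trial():
    u = random.lognormvariate(0.0, 0.4)          # current speed at the dock (m/s), assumed
    cd = random.uniform(0.8, 1.2)                # drag coefficient, assumed
    area = 8.0                                   # wetted area of vessel + dock (m^2), assumed
    demand = 0.5 * RHO * cd * area * u ** 2      # drag force (N)
    # As-built capacity times an age-related reduction factor (both assumed):
    capacity = random.normalvariate(40_000, 8_000) * random.uniform(0.5, 1.0)
    return demand > capacity

n = 100_000
print(f"P(component failure) ~ {sum(one_trial() for _ in range(n)) / n:.4f}")
```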

  13. Probabilistic Design of Wind Turbines

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard

    , new and more refined design methods must be developed. These methods can for instance be developed using probabilistic design where the uncertainties in all phases of the design life are taken into account. The main aim of the present thesis is to develop models for probabilistic design of wind......, the uncertainty is dependent on the method used for load extrapolation, the number of simulations and the distribution fitted to the extracted peaks. Another approach for estimating the uncertainty on the estimated load effects during operation is to use field measurements. A new method for load extrapolation......, which is based on average conditional exceedence rates, is applied to wind turbine response. The advantage of this method is that it can handle dependence in the response and use exceedence rates instead of extracted peaks which normally are more stable. The results show that the method estimates...

  14. Probabilistic Design of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Toft, H.S.

    2010-01-01

    Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability....... It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal...... reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated....

  15. Probabilistic Design of Wind Turbines

    Directory of Open Access Journals (Sweden)

    Henrik S. Toft

    2010-02-01

    Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability levels; and recommendations for consideration of system aspects. The uncertainties are characterized as aleatoric (physical uncertainty) or epistemic (statistical, measurement and model uncertainties). Methods for uncertainty modeling consistent with methods for estimating the reliability are described. It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated.

  16. Probabilistic Sizing and Verification of Space Ceramic Structures

    Science.gov (United States)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
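
    The probabilistic sizing logic referred to here rests on the weakest-link Weibull model; a short sketch with assumed material parameters (characteristic strength, Weibull modulus, reference volume) shows the basic calculation.

```python
# Two-parameter Weibull (weakest-link) failure probability for a ceramic part.
import math

def p_fail(sigma, volume, sigma0=300.0, m=10.0, v0=1.0):
    # Pf = 1 - exp( -(V/V0) * (sigma/sigma0)^m )
    # sigma0 (MPa) and m come from material characterisation; values assumed here.
    return 1 - math.exp(-(volume / v0) * (sigma / sigma0) ** m)

for stress in (100, 150, 200):                 # applied stress in MPa
    print(stress, "MPa ->", f"{p_fail(stress, volume=2.0):.2e}")
```

    The strong dependence on the Weibull modulus m is what makes flaw-population control and proof testing so effective at reducing the final probability of failure.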

  17. Unified Probabilistic Models for Face Recognition from a Single Example Image per Person

    Institute of Scientific and Technical Information of China (English)

    Pin Liao; Li Shen

    2004-01-01

    This paper presents a new technique of unified probabilistic models for face recognition from only one single example image per person. The unified models, trained on an obtained training set with multiple samples per person, are used to recognize facial images from another disjoint database with a single sample per person. Variations between facial images are modeled as two unified probabilistic models: within-class variations and between-class variations. Gaussian Mixture Models are used to approximate the distributions of the two variations, and a classifier combination method is exploited to improve the performance. Extensive experimental results on the ORL face database and the authors' database (the ICT-JDL database), together comprising 1,750 facial images of 350 individuals, demonstrate that the proposed technique, compared with the traditional eigenface method and some other well-known algorithms, is a significantly more effective and robust approach for face recognition.

  18. Modified Claus process probabilistic model

    Energy Technology Data Exchange (ETDEWEB)

    Larraz Mora, R. [Chemical Engineering Dept., Univ. of La Laguna (Spain)

    2006-03-15

    A model is proposed for the simulation of an industrial Claus unit with a straight-through configuration and two catalytic reactors. Process plant design evaluations based on deterministic calculations do not take into account the uncertainties that are associated with the different input variables. A probabilistic simulation method was applied in the Claus model to obtain an impression of how some of these inaccuracies influence plant performance. (orig.)

  1. Probabilistic Cloning and Quantum Computation

    Institute of Scientific and Technical Information of China (English)

    GAO Ting; YAN Feng-Li; WANG Zhi-Xi

    2004-01-01

    We discuss the usefulness of quantum cloning and present examples of quantum computation tasks for which cloning offers an advantage that cannot be matched by any approach that does not resort to quantum cloning. In these quantum computations, we need to distribute quantum information contained in the states about which we have some partial information. To perform quantum computations, we use a state-dependent probabilistic quantum cloning procedure to distribute quantum information in the middle of a quantum computation.

  2. Probabilistic analysis and related topics

    CERN Document Server

    Bharucha-Reid, A T

    1983-01-01

    Probabilistic Analysis and Related Topics, Volume 3 focuses on the continuity, integrability, and differentiability of random functions, including operator theory, measure theory, and functional and numerical analysis. The selection first offers information on the qualitative theory of stochastic systems and Langevin equations with multiplicative noise. Discussions focus on phase-space evolution via direct integration, phase-space evolution, linear and nonlinear systems, linearization, and generalizations. The text then ponders on the stability theory of stochastic difference systems and Marko

  3. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.

  4. Probabilistic analysis and related topics

    CERN Document Server

    Bharucha-Reid, A T

    1979-01-01

    Probabilistic Analysis and Related Topics, Volume 2 focuses on the integrability, continuity, and differentiability of random functions, as well as functional analysis, measure theory, operator theory, and numerical analysis.The selection first offers information on the optimal control of stochastic systems and Gleason measures. Discussions focus on convergence of Gleason measures, random Gleason measures, orthogonally scattered Gleason measures, existence of optimal controls without feedback, random necessary conditions, and Gleason measures in tensor products. The text then elaborates on an

  5. Probabilistic confidence for decisions based on uncertain reliability estimates

    Science.gov (United States)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.

  6. A framework for probabilistic pluvial flood nowcasting for urban areas

    DEFF Research Database (Denmark)

    Ntegeka, Victor; Murla, Damian; Wang, Lipen;

    2016-01-01

    the potential of nowcasting products for urban inundation applications. One of the most advanced Quantitative Precipitation Forecasting (QPF) techniques is the Short-Term Ensemble Prediction System, which was originally co-developed by the UK Met Office and Australian Bureau of Meteorology. The scheme...... was further tuned to better estimate extreme and moderate events for the Belgian area (STEPS-BE). Against this backdrop, a probabilistic framework has been developed that consists of: (1) rainfall nowcasts; (2) sewer hydraulic model; (3) flood damage estimation; and (4) urban inundation risk mapping. STEPS...... (12.5 – 50 m2) and low flood hazard areas (75 – 300 m2). Functions describing urban flood damage and social consequences were empirically derived based on questionnaires to people in the region that were recently affected by sewer floods. Probabilistic urban flood risk maps were prepared based...

  7. Probabilistic interpretation of resonant states

    Indian Academy of Sciences (India)

    Naomichi Hatano; Tatsuro Kawamoto; Joshua Feinberg

    2009-09-01

    We provide probabilistic interpretation of resonant states. We do this by showing that the integral of the modulus square of resonance wave functions (i.e., the conventional norm) over a properly expanding spatial domain is independent of time, and therefore leads to probability conservation. This is in contrast with the conventional employment of a bi-orthogonal basis that precludes probabilistic interpretation, since wave functions of resonant states diverge exponentially in space. On the other hand, resonant states decay exponentially in time, because momentum leaks out of the central scattering area. This momentum leakage is also the reason for the spatial exponential divergence of resonant state. It is by combining the opposite temporal and spatial behaviours of resonant states that we arrive at our probabilistic interpretation of these states. The physical need to normalize resonant wave functions over an expanding spatial domain arises because particles leak out of the region which contains the potential range and escape to infinity, and one has to include them in the total count of particles.

  8. An expectation transformer approach to predicate abstraction and data independence for probabilistic programs

    CERN Document Server

    Ndukwu, Ukachukwu; 10.4204/EPTCS.28.9

    2010-01-01

    In this paper we revisit the well-known technique of predicate abstraction to characterise performance attributes of system models incorporating probability. We recast the theory using expectation transformers, and identify transformer properties which correspond to abstractions that nevertheless yield exact bounds on the performance of infinite-state probabilistic systems. In addition, we extend the developed technique to the special case of "data independent" programs incorporating probability. Finally, we demonstrate the subtlety of the extended technique by using the PRISM model checking tool to analyse an infinite-state protocol, obtaining exact bounds on its performance.

  9. Proposal of a risk model for vehicular traffic: A Boltzmann-type kinetic approach

    CERN Document Server

    Freguglia, Paolo

    2015-01-01

    This paper deals with a Boltzmann-type kinetic model describing the interplay between vehicle dynamics and safety aspects in vehicular traffic. Sticking to the idea that the macroscopic characteristics of traffic flow, including the distribution of the driving risk along a road, are ultimately generated by one-to-one interactions among drivers, the model links the personal (i.e., individual) risk to the changes of speeds of single vehicles and implements a probabilistic description of such microscopic interactions in a Boltzmann-type collisional operator. By means of suitable statistical moments of the kinetic distribution function, it is finally possible to recover macroscopic relationships between the average risk and the road congestion, which show an interesting and reasonable correlation with the well-known free and congested phases of the flow of vehicles.

  10. Standardized approach for developing probabilistic exposure factor distributions

    Energy Technology Data Exchange (ETDEWEB)

    Maddalena, Randy L.; McKone, Thomas E.; Sohn, Michael D.

    2003-03-01

    The effectiveness of a probabilistic risk assessment (PRA) depends critically on the quality of input information that is available to the risk assessor and specifically on the probabilistic exposure factor distributions that are developed and used in the exposure and risk models. Deriving probabilistic distributions for model inputs can be time consuming and subjective. The absence of a standard approach for developing these distributions can result in PRAs that are inconsistent and difficult to review by regulatory agencies. We present an approach that reduces subjectivity in the distribution development process without limiting the flexibility needed to prepare relevant PRAs. The approach requires two steps. First, we analyze data pooled at a population scale to (1) identify the most robust demographic variables within the population for a given exposure factor, (2) partition the population data into subsets based on these variables, and (3) construct archetypal distributions for each subpopulation. Second, we sample from these archetypal distributions according to site- or scenario-specific conditions to simulate exposure factor values and use these values to construct the scenario-specific input distribution. It is envisaged that the archetypal distributions from step 1 will be generally applicable so risk assessors will not have to repeatedly collect and analyze raw data for each new assessment. We demonstrate the approach for two commonly used exposure factors--body weight (BW) and exposure duration (ED)--using data for the U.S. population. For these factors we provide a first set of subpopulation based archetypal distributions along with methodology for using these distributions to construct relevant scenario-specific probabilistic exposure factor distributions.
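
    A toy version of the two-step procedure: archetypal distributions per subpopulation (step 1), then scenario-specific mixing and sampling (step 2). The subpopulations, parameter values and normal form below are invented for illustration; the paper derives its archetypes from pooled U.S. population data.

```python
# Step 1: archetypal body-weight distributions per subpopulation (assumed).
# Step 2: sample according to a scenario-specific population mix.
import random

ARCHETYPES = {                      # kg; hypothetical (mean, sd) pairs
    "adult_male":   (86.0, 13.0),
    "adult_female": (74.0, 15.0),
    "child":        (28.0, 8.0),
}

def scenario_sample(mix, n=10_000):
    labels, weights = zip(*mix.items())
    out = []
    for _ in range(n):
        who = random.choices(labels, weights=weights)[0]
        mu, sd = ARCHETYPES[who]
        out.append(max(5.0, random.normalvariate(mu, sd)))
    return out

bw = scenario_sample({"adult_male": 0.45, "adult_female": 0.45, "child": 0.10})
print(f"scenario mean body weight ~ {sum(bw) / len(bw):.1f} kg")
```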

  11. Probabilistic fire simulator - Monte Carlo simulation tool for fire scenarios

    Energy Technology Data Exchange (ETDEWEB)

    Hostikka, S.; Keski-Rahkonen, O. [VTT Building and Transport, Espoo (Finland)

    2002-11-01

    A risk analysis tool has been developed for computing the distributions of fire model output variables. The tool, called Probabilistic Fire Simulator, combines Monte Carlo simulation with the CFAST two-zone fire model. In this work, it is used to calculate the failure probability of redundant cables and fire detector activation times in a cable tunnel fire. The sensitivity of the output variables to the input variables is calculated in terms of rank order correlations. (orig.)
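
    The rank-order correlation used here as the sensitivity measure can be illustrated in a few lines; the "fire model" below is a stand-in toy relation between a heat release rate factor and detector activation time, not CFAST.

```python
# Spearman rank correlation between a Monte Carlo input and output (toy model).
import random

def rank(xs):
    order = sorted(range(len(xs)), key=xs.__getitem__)
    r = [0] * len(xs)
    for i, idx in enumerate(order):
        r[idx] = i
    return r

def spearman(x, y):
    # Pearson correlation of the ranks; x and y assumed tie-free here.
    rx, ry = rank(x), rank(y)
    n = len(x)
    m = (n - 1) / 2
    cov = sum((a - m) * (b - m) for a, b in zip(rx, ry))
    var = sum((a - m) ** 2 for a in rx)
    return cov / var

hrr = [random.uniform(0.5, 2.0) for _ in range(2000)]          # heat release rate factor
noise = [random.normalvariate(0, 0.2) for _ in range(2000)]
detect_time = [60 / h + e for h, e in zip(hrr, noise)]          # toy output (s)
print(f"rank correlation ~ {spearman(hrr, detect_time):.2f}")   # strongly negative
```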

  12. Multi-criteria decision analysis with fuzzy probabilistic risk assessment for produced water management[Includes the CSCE forum on professional practice and career development : 1. international engineering mechanics and materials specialty conference : 1. international/3. coastal, estuarine and offshore engineering specialty conference : 2. international/8. construction specialty conference

    Energy Technology Data Exchange (ETDEWEB)

    Mofarrah, A.; Husain, T.; Hawboldt, K. [Memorial Univ. of Newfoundland, St. John' s, NL (Canada). Faculty of Engineering and Applied Science

    2009-07-01

    This paper presented an integrated approach for management of produced water (PW) from oil and gas production. A multi-criteria analysis technique was integrated with risk assessment methodology to enhance the decision making process. As an integral part of overall environmental risk analyses, risk management involves selecting the most appropriate action, and integrating the results of risk assessment with engineering data, social, economic, and political concerns to make an acceptable decision. The risk assessment process involves objectivity, whereas risk management involves preferences and attitudes, which have objective and subjective elements. Choosing an alternative based on risk criteria is challenging for decision makers. Multi-criteria decision making (MCDM) can be used for this purpose. The proposed decision analysis framework integrates fuzzy-probabilistic risk assessment (FPRA) methodology into a fuzzy multi-criteria decision making (FMCDM) analysis. The centroid method was used for defuzzification and ranking alternatives. The efficacy of the integrated technique was demonstrated in an application where 3 types of PW management systems for offshore oil and gas operations were evaluated. 17 refs., 4 tabs., 5 figs.

  13. Probabilistic models of language processing and acquisition.

    Science.gov (United States)

    Chater, Nick; Manning, Christopher D

    2006-07-01

    Probabilistic methods are providing new explanatory approaches to fundamental cognitive science questions of how humans structure, process and acquire language. This review examines probabilistic models defined over traditional symbolic structures. Language comprehension and production involve probabilistic inference in such models; and acquisition involves choosing the best model, given innate constraints and linguistic and other input. Probabilistic models can account for the learning and processing of language, while maintaining the sophistication of symbolic models. A recent burgeoning of theoretical developments and online corpus creation has enabled large models to be tested, revealing probabilistic constraints in processing, undermining acquisition arguments based on a perceived poverty of the stimulus, and suggesting fruitful links with probabilistic theories of categorization and ambiguity resolution in perception.

  14. Integration of an Evidence Base into a Probabilistic Risk Assessment Model. The Integrated Medical Model Database: An Organized Evidence Base for Assessing In-Flight Crew Health Risk and System Design

    Science.gov (United States)

    Saile, Lynn; Lopez, Vilma; Bickham, Grandin; FreiredeCarvalho, Mary; Kerstman, Eric; Byrne, Vicky; Butler, Douglas; Myers, Jerry; Walton, Marlei

    2011-01-01

    This slide presentation reviews the Integrated Medical Model (IMM) database, which is an organized evidence base for assessing in-flight crew health risk. The database is a relational database accessible to many people. The database quantifies the model inputs by a ranking based on the highest value of the data as Level of Evidence (LOE) and the quality of evidence (QOE) score that provides an assessment of the evidence base for each medical condition. The IMM evidence base has already been able to provide invaluable information for designers, and for other uses.

  15. Probabilistic Seismic Hazard Assessment of Babol, Iran

    Directory of Open Access Journals (Sweden)

    Gholamreza Abdollahzadeh

    2011-01-01

    This paper presents a probabilistic seismic hazard assessment of Babol, one of the big cities in the north of Iran. Many destructive earthquakes have happened in Iran in the last centuries, and historical references indicate that Babol has been destroyed by catastrophic earthquakes many times. In this paper, the peak horizontal ground acceleration over the bedrock (PGA) is calculated by a probabilistic seismic hazard assessment (PSHA). For this reason, at first, a catalogue was compiled containing both historical and instrumental events that occurred within a radius of 200 km of Babol city, covering the period from 874 to 2004. Then, the seismic sources are modeled and a recurrence relationship is established. After elimination of the aftershocks and foreshocks, the main earthquakes were taken into consideration to calculate the seismic parameters (SP) by the Kijko method. The calculations were performed using the logic tree method and four weighted attenuation relationships: Ghodrati, 0.35; Khademi, 0.25; Ambraseys and Simpson, 0.2; and Sarma and Srbulov, 0.2. Seismic hazard assessment is then carried out for a grid of 8 horizontal by 7 vertical lines of points using SEISRISK III. Finally, two seismic hazard maps of the studied area, based on peak horizontal ground acceleration (PGA) over bedrock for 2% and 10% probability of exceedance in a life cycle of 50 years, are presented. These calculations have been performed assuming a Poisson distribution for the two hazard levels. The results show that the PGA ranges from 0.32 to 0.33 g for a return period of 475 years and from 0.507 to 0.527 g for a return period of 2475 years. Since the population of Babol is very dense and the vulnerability of its buildings is high, the risk from future earthquakes is very significant.

  16. A Tutorial on Probabilistic Risk Assessment and its Role in Risk-Informed Decision Making

    Science.gov (United States)

    Dezfuli, Homayoon

    2010-01-01

    This slide presentation reviews risk assessment and its role in risk-informed decision making. It includes information on probabilistic risk assessment, the typical risk management process, the origins of the risk matrix, performance measures, performance objectives, and Bayes' theorem.
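
    Bayes' theorem is the engine behind updating risk estimates with operating experience; here is a minimal, purely illustrative update of a two-point prior on a component failure probability after observing hypothetical demand data.

```python
# Bayes update of a discrete prior over candidate failure probabilities.
from math import comb

prior = {1e-4: 0.5, 1e-3: 0.5}     # two candidate failure probabilities per demand
demands, failures = 1000, 2        # hypothetical operating experience

def likelihood(rate):
    # Binomial likelihood of observing `failures` in `demands` trials.
    return comb(demands, failures) * rate**failures * (1 - rate)**(demands - failures)

evidence = sum(likelihood(r) * p for r, p in prior.items())
posterior = {r: likelihood(r) * p / evidence for r, p in prior.items()}
print(posterior)                   # mass shifts strongly toward 1e-3
```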

  17. Probabilistic Planning with Imperfect Sensing Actions Using Hybrid Probabilistic Logic Programs

    Science.gov (United States)

    Saad, Emad

    Effective planning in an uncertain environment is important to agents and multi-agent systems. In this paper, we introduce a new logic-based approach to probabilistic contingent planning (probabilistic planning with imperfect sensing actions) by relating probabilistic contingent planning to normal hybrid probabilistic logic programs with probabilistic answer set semantics [24]. We show that any probabilistic contingent planning problem can be encoded as a normal hybrid probabilistic logic program. We formally prove the correctness of our approach. Moreover, we show that the complexity of finding a probabilistic contingent plan in our approach is NP-complete. In addition, we show that any probabilistic contingent planning problem, PP, can be encoded as a classical normal logic program with answer set semantics, whose answer sets correspond to valid trajectories in PP. We show that probabilistic contingent planning problems can be encoded as SAT problems. We present a new high-level probabilistic action description language that allows the representation of sensing actions with probabilistic outcomes.

  18. Dietary Exposure Assessment of Danish Consumers to Dithiocarbamate Residues in Food: a Comparison of the Deterministic and Probabilistic Approach

    DEFF Research Database (Denmark)

    Jensen, Bodil Hamborg; Andersen, Jens Hinge; Petersen, Annette;

    2008-01-01

    Probabilistic and deterministic estimates of the acute and chronic exposure of the Danish population to dithiocarbamate residues were performed. The Monte Carlo Risk Assessment programme (MCRA 4.0) was used for the probabilistic risk assessment. Food consumption data were obtained from the nationwide dietary survey conducted in 2000-02. Residue data for 5721 samples from the monitoring programme conducted in the period 1998-2003 were used for dithiocarbamates, which had been determined as carbon disulphide. Contributions from 26 commodities were included in the calculations. Using...
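
    The MCRA programme itself is not reproduced here; purely as an illustration of the probabilistic approach it implements, an acute exposure estimate of this kind reduces to Monte Carlo sampling over consumption, residue and body weight distributions (all numbers below are invented):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000  # simulated person-days

        consumption = rng.lognormal(mean=np.log(0.15), sigma=0.6, size=n)   # kg/day
        residue = rng.choice([0.0, 0.05, 0.2, 0.5], size=n,
                             p=[0.7, 0.2, 0.08, 0.02])                      # mg/kg
        bodyweight = rng.normal(70.0, 12.0, size=n).clip(min=30.0)          # kg

        exposure = consumption * residue / bodyweight                       # mg/kg bw/day
        print("P97.5 acute exposure:", np.percentile(exposure, 97.5))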

  19. Application of Dynamic Probabilistic Safety Assessment Approach for Accident Sequence Precursor Analysis: Case Study for Steam Generator Tube Rupture

    National Research Council Canada - National Science Library

    Lee, Hansul; Kim, Taewan; Heo, Gyunyoung

    2017-01-01

    ...) analysis, and to propose a case study using the dynamic-probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios...

  20. D-Boundedness and D-Compactness in Finite Dimensional Probabilistic Normed Spaces

    Indian Academy of Sciences (India)

    Reza Saadati; Massoud Amini

    2005-11-01

    In this paper, we prove that in a finite dimensional probabilistic normed space every two probabilistic norms are equivalent, and we study the notions of D-compactness and D-boundedness in probabilistic normed spaces.

  1. Benaloh's Dense Probabilistic Encryption Revisited

    CERN Document Server

    Fousse, Laurent; Alnuaimi, Mohamed

    2010-01-01

    In 1994, Josh Benaloh proposed a probabilistic homomorphic encryption scheme, enhancing the poor expansion factor provided by Goldwasser and Micali's scheme. Since then, numerous papers have taken advantage of Benaloh's homomorphic encryption function, including voting schemes, non-interactive verifiable secret sharing, online poker... In this paper we show that the original description of the scheme is incorrect, possibly resulting in ambiguous decryption of ciphertexts. We give a corrected description of the scheme, provide a complete proof of correctness and an analysis of the probability of failure in the initial description.

  2. Probabilistic Analysis of Crack Width

    Directory of Open Access Journals (Sweden)

    J. Marková

    2000-01-01

    Full Text Available Probabilistic analysis of the crack width of a reinforced concrete element is based on the formulas accepted in Eurocode 2 and European Model Code 90. The obtained values of the reliability index β seem to be satisfactory for a reinforced concrete slab that fulfils the requirements for the crack width specified in Eurocode 2. However, the reliability of the slab seems to be insufficient when the European Model Code 90 is considered; the reliability index is less than the recommended value of 1.5 for serviceability limit states indicated in Eurocode 1. Analysis of the sensitivity factors of basic variables makes it possible to identify the variables that significantly affect the total crack width.

  3. Savannah River Site K-Reactor Probabilistic Safety Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Brandyberry, M.D.; Bailey, R.T.; Baker, W.H.; Kearnaghan, D.P.; O'Kula, K.R.; Wittman, R.S.; Woody, N.D. [Westinghouse Savannah River Co., Aiken, SC (United States); Amos, C.N.; Weingardt, J.J. [Science Applications International Corp. (United States)

    1992-12-01

    This report gives the results of a Savannah River Site (SRS) K-Reactor Probabilistic Safety Assessment (PSA). Measures of adverse consequences to health and safety resulting from representations of severe accidents in SRS reactors are presented. In addition, the report gives a summary of the methods employed to represent these accidents and to assess the resultant consequences. The report is issued to provide useful information to the U. S. Department of Energy (DOE) on the risk of operation of SRS reactors, for insights into severe accident phenomena that contribute to this risk, and in support of improved bases for other DOE programs in Heavy Water Reactor safety.

  4. Probabilistic earthquake hazard analysis for Cairo, Egypt

    Science.gov (United States)

    Badawy, Ahmed; Korrat, Ibrahim; El-Hadidy, Mahmoud; Gaber, Hanan

    2016-04-01

    Cairo is the capital of Egypt and the largest city in the Arab world and Africa, and the sixteenth largest metropolitan area in the world. It was founded in the tenth century (969 AD) and is 1046 years old. It has long been a center of the region's political and cultural life. Therefore, earthquake risk assessment for Cairo has great importance. The present work aims to analyze the earthquake hazard of Cairo as a key input element for the risk assessment. The regional seismotectonic setting shows that Cairo could be affected by both far- and near-field seismic sources. The seismic hazard of Cairo has been estimated using the probabilistic seismic hazard approach. The logic tree framework was used during the calculations. Epistemic uncertainties were taken into account by using alternative seismotectonic models and alternative ground motion prediction equations. Seismic hazard values have been estimated within a grid of 0.1° × 0.1° spacing for all of Cairo's districts at different spectral periods and four return periods (224, 615, 1230, and 4745 years). Moreover, uniform hazard spectra have been calculated at the same return periods. The pattern of the contour maps shows that the highest values of peak ground acceleration are concentrated in the eastern zone's districts (e.g., El Nozha) and the lowest values in the northern and western zone's districts (e.g., El Sharabiya and El Khalifa).

  5. Probabilistic analysis of tsunami hazards

    Science.gov (United States)

    Geist, E.L.; Parsons, T.

    2006-01-01

    Determining the likelihood of a disaster is a key component of any comprehensive hazard assessment. This is particularly true for tsunamis, even though most tsunami hazard assessments have in the past relied on scenario or deterministic type models. We discuss probabilistic tsunami hazard analysis (PTHA) from the standpoint of integrating computational methods with empirical analysis of past tsunami runup. PTHA is derived from probabilistic seismic hazard analysis (PSHA), with the main difference being that PTHA must account for far-field sources. The computational methods rely on numerical tsunami propagation models rather than empirical attenuation relationships as in PSHA in determining ground motions. Because a number of source parameters affect local tsunami runup height, PTHA can become complex and computationally intensive. Empirical analysis can function in one of two ways, depending on the length and completeness of the tsunami catalog. For site-specific studies where there is sufficient tsunami runup data available, hazard curves can primarily be derived from empirical analysis, with computational methods used to highlight deficiencies in the tsunami catalog. For region-wide analyses and sites where there are little to no tsunami data, a computationally based method such as Monte Carlo simulation is the primary method to establish tsunami hazards. Two case studies that describe how computational and empirical methods can be integrated are presented for Acapulco, Mexico (site-specific) and the U.S. Pacific Northwest coastline (region-wide analysis).

  6. Why do probabilistic finite element analysis?

    CERN Document Server

    Thacker, B H

    2008-01-01

    The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.

  7. Function Approximation Using Probabilistic Fuzzy Systems

    NARCIS (Netherlands)

    J.H. van den Berg (Jan); U. Kaymak (Uzay); R.J. Almeida e Santos Nogueira (Rui Jorge)

    2011-01-01

    We consider function approximation by fuzzy systems. Fuzzy systems are typically used for approximating deterministic functions, in which the stochastic uncertainty is ignored. We propose probabilistic fuzzy systems in which the probabilistic nature of uncertainty is taken into account.

  8. Probabilistic Remaining Useful Life Prediction of Composite Aircraft Components Project

    Data.gov (United States)

    National Aeronautics and Space Administration — A Probabilistic Fatigue Damage Assessment Network (PFDAN) toolkit for Abaqus will be developed for probabilistic life management of a laminated composite structure...

  9. Semantics of sub-probabilistic programs

    Institute of Scientific and Technical Information of China (English)

    Yixing CHEN; Hengyang WU

    2008-01-01

    The aim of this paper is to extend the probabilistic choice in probabilistic programs to sub-probabilistic choice, i.e., of the form (p)P (q)Q where p + q ≤ 1. It means that program P is executed with probability p and program Q is executed with probability q. Then, starting from an initial state, the execution of a sub-probabilistic program results in a sub-probability distribution. This paper presents two equivalent semantics for a sub-probabilistic while-programming language. One of these interprets programs as sub-probabilistic distributions on state spaces via denotational semantics. The other interprets programs as bounded expectation transformers via wp-semantics. This paper proposes an axiomatic system for total logic, and proves its soundness and completeness in a classical pattern on the structure of programs.
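
    As a toy reading of sub-probabilistic choice (our own illustration, not the paper's formalism): programs map an input state to a sub-probability distribution over output states, and the missing mass represents non-termination.

        def choice(p, prog_P, q, prog_Q):
            """Sub-probabilistic choice (p)P (q)Q with p + q <= 1."""
            assert p + q <= 1.0
            def run(state):
                dist = {}
                for branch_prob, prog in ((p, prog_P), (q, prog_Q)):
                    for out, w in prog(state).items():
                        dist[out] = dist.get(out, 0.0) + branch_prob * w
                return dist  # total mass may be < 1
            return run

        inc = lambda s: {s + 1: 1.0}
        dbl = lambda s: {2 * s: 1.0}
        print(choice(0.5, inc, 0.3, dbl)(4))  # {5: 0.5, 8: 0.3}; mass 0.2 is "lost"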

  10. Learning to Estimate Dynamical State with Probabilistic Population Codes.

    Directory of Open Access Journals (Sweden)

    Joseph G Makin

    2015-11-01

    Full Text Available Tracking moving objects, including one's own body, is a fundamental ability of higher organisms, playing a central role in many perceptual and motor tasks. While it is unknown how the brain learns to follow and predict the dynamics of objects, it is known that this process of state estimation can be learned purely from the statistics of noisy observations. When the dynamics are simply linear with additive Gaussian noise, the optimal solution is the well-known Kalman filter (KF), the parameters of which can be learned via latent-variable density estimation (the EM algorithm). The brain does not, however, directly manipulate matrices and vectors, but instead appears to represent probability distributions with the firing rates of populations of neurons: "probabilistic population codes." We show that a recurrent neural network (a modified form of an exponential family harmonium, EFH) that takes a linear probabilistic population code as input can learn, without supervision, to estimate the state of a linear dynamical system. After observing a series of population responses (spike counts) to the position of a moving object, the network learns to represent the velocity of the object and forms nearly optimal predictions about the position at the next time-step. This result builds on our previous work showing that a similar network can learn to perform multisensory integration and coordinate transformations for static stimuli. The receptive fields of the trained network also make qualitative predictions about the developing and learning brain: tuning gradually emerges for higher-order dynamical states not explicitly present in the inputs, appearing as delayed tuning for the lower-order states.
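
    For reference, the well-known Kalman filter that the network is shown to approximate reduces to a two-step predict/update recursion; a minimal sketch for a constant-velocity tracking model (matrices and noise levels are invented, not taken from the paper):

        import numpy as np

        dt = 0.1
        A = np.array([[1.0, dt], [0.0, 1.0]])   # state transition (position, velocity)
        C = np.array([[1.0, 0.0]])              # only position is observed
        Q = 1e-3 * np.eye(2)                    # process noise covariance
        R = np.array([[1e-2]])                  # observation noise covariance

        def kf_step(x, P, y):
            x_pred = A @ x                      # predict
            P_pred = A @ P @ A.T + Q
            S = C @ P_pred @ C.T + R            # update
            K = P_pred @ C.T @ np.linalg.inv(S) # Kalman gain
            x_new = x_pred + K @ (y - C @ x_pred)
            P_new = (np.eye(2) - K @ C) @ P_pred
            return x_new, P_new

        x, P = np.zeros(2), np.eye(2)
        x, P = kf_step(x, P, np.array([0.7]))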

  11. Staged decision making based on probabilistic forecasting

    Science.gov (United States)

    Booister, Nikéh; Verkade, Jan; Werner, Micha; Cranston, Michael; Cumiskey, Lydia; Zevenbergen, Chris

    2016-04-01

    Flood forecasting systems reduce, but cannot eliminate, uncertainty about the future. Probabilistic forecasts explicitly show that uncertainty remains. However, compared to deterministic forecasts, a dimension is added ('probability' or 'likelihood'), and this added dimension makes decision making slightly more complicated. One technique of decision support is the cost-loss approach, which defines whether or not to issue a warning or implement mitigation measures (a risk-based method). With the cost-loss method, a warning is issued when the ratio of the response costs to the damage reduction is less than or equal to the probability of the possible flood event. This cost-loss method is not widely used, because it motivates decisions based only on economic values and is relatively static (no reasoning, a yes/no decision). Nevertheless, it has high potential to improve risk-based decision making based on probabilistic flood forecasting, because no other methods are known that deal with probabilities in decision making. The main aim of this research was to explore ways of making decision making based on probabilities with the cost-loss method better applicable in practice. The exploration began by identifying other situations in which decisions are taken based on uncertain forecasts or predictions. These cases spanned a range of degrees of uncertainty: from known uncertainty to deep uncertainty. Based on the types of uncertainties, concepts for dealing with situations and responses were analysed and applicable concepts were chosen. From this analysis, the concepts of flexibility and robustness appeared to fit the existing method. Instead of taking big decisions with bigger consequences at once, the idea is that actions and decisions are cut up into smaller pieces, and the decision to implement is finally made based on the economic costs of decisions and measures and the reduced effect of flooding. The more lead-time there is in
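
    The cost-loss rule described above fits in a few lines; a sketch with invented numbers:

        def issue_warning(cost: float, loss_reduction: float, p_event: float) -> bool:
            """Risk-based cost-loss rule: warn iff C/L <= p."""
            return (cost / loss_reduction) <= p_event

        # Mitigation costs 20k and prevents 200k of damage: warn once p >= 0.1.
        print(issue_warning(20_000, 200_000, 0.25))  # True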

  12. Probabilistic Fatigue Damage Program (FATIG)

    Science.gov (United States)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) the traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) the classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
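
    FATIG itself is not reproduced here, but method (b), the Gamma-function formula for a narrow-band Gaussian stress process with S-N curve N(S) = A * S**(-m), can be sketched as follows (all parameter values are invented):

        import math

        def rayleigh_fatigue_damage(sigma_rms, n_cycles, m, A):
            """Expected Miner damage, integrated over all Rayleigh-distributed
            stress amplitudes (the integral form of the Palmgren-Miner rule)."""
            return (n_cycles / A) * (math.sqrt(2.0) * sigma_rms) ** m \
                   * math.gamma(1.0 + m / 2.0)

        d = rayleigh_fatigue_damage(sigma_rms=40.0, n_cycles=1e7, m=3.0, A=1e12)
        print("damage:", d, "-> life:", 1e7 / d, "cycles")  # failure at D = 1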

  13. The Complexity of Probabilistic Lobbying

    CERN Document Server

    Erdélyi, Gábor; Goldsmith, Judy; Mattei, Nicholas; Raible, Daniel; Rothe, Jörg

    2009-01-01

    We propose various models for lobbying in a probabilistic environment, in which an actor (called "The Lobby") seeks to influence the voters' preferences of voting for or against multiple issues when the voters' preferences are represented in terms of probabilities. In particular, we provide two evaluation criteria and three bribery methods to formally describe these models, and we consider the resulting forms of lobbying with and without issue weighting. We provide a formal analysis for these problems of lobbying in a stochastic environment, and determine their classical and parameterized complexity depending on the given bribery/evaluation criteria. Specifically, we show that some of these problems can be solved in polynomial time, some are NP-complete but fixed-parameter tractable, and some are W[2]-complete. Finally, we provide (in)approximability results.

  14. Machine learning: a probabilistic perspective

    CERN Document Server

    Murphy, Kevin P

    2012-01-01

    Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic method...

  15. Probabilistic direct counterfactual quantum communication

    Science.gov (United States)

    Zhang, Sheng

    2017-02-01

    It is striking that the quantum Zeno effect can be used to launch a direct counterfactual communication between two spatially separated parties, Alice and Bob. So far, existing protocols of this type only provide a deterministic counterfactual communication service. However, this counterfactuality comes at a price. Firstly, the transmission time is much longer than a classical transmission requires. Secondly, the chained-cycle structure makes such protocols more sensitive to channel noise. Here, we extend the idea of counterfactual communication and present a probabilistic-counterfactual quantum communication protocol, which is proved to have advantages over the deterministic ones. Moreover, the presented protocol can evolve to a deterministic one solely by adjusting the parameters of the beam splitters. Project supported by the National Natural Science Foundation of China (Grant No. 61300203).

  16. Probabilistic cloning with supplementary information

    CERN Document Server

    Azuma, Koji; Shimamura, Junichi; Koashi, Masato; Imoto, Nobuyuki

    2005-01-01

    We consider probabilistic cloning of a state chosen from a mutually nonorthogonal set of pure states, with the help of a party holding supplementary information in the form of pure states. When the number of states is two, we show that the best efficiency of producing m copies is always achieved by a two-step protocol in which the helping party first attempts to produce m-1 copies from the supplementary state, and if it fails, then the original state is used to produce m copies. On the other hand, when the number of states exceeds two, the best efficiency is not always achieved by such a protocol. We give examples in which the best efficiency is not achieved even if we allow any amount of one-way classical communication from the helping party.

  17. A probabilistic tsunami hazard assessment for Indonesia

    Directory of Open Access Journals (Sweden)

    N. Horspool

    2014-05-01

    Full Text Available Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.

  18. A probabilistic tsunami hazard assessment for Indonesia

    Science.gov (United States)

    Horspool, N.; Pranantyo, I.; Griffin, J.; Latief, H.; Natawidjaja, D. H.; Kongko, W.; Cipta, A.; Bustaman, B.; Anugrah, S. D.; Thio, H. K.

    2014-05-01

    Probabilistic hazard assessments are a fundamental tool for assessing the threats posed by hazards to communities and are important for underpinning evidence-based decision making on risk mitigation activities. Indonesia has been the focus of intense tsunami risk mitigation efforts following the 2004 Indian Ocean Tsunami, but this has been largely concentrated on the Sunda Arc, with little attention to other tsunami-prone areas of the country such as eastern Indonesia. We present the first nationally consistent Probabilistic Tsunami Hazard Assessment (PTHA) for Indonesia. This assessment produces time-independent forecasts of tsunami hazard at the coast from tsunamis generated by local, regional and distant earthquake sources. The methodology is based on the established Monte Carlo approach to probabilistic seismic hazard assessment (PSHA) and has been adapted to tsunami. We account for sources of epistemic and aleatory uncertainty in the analysis through the use of logic trees and through sampling probability density functions. For short return periods (100 years) the highest tsunami hazard is the west coast of Sumatra, the south coast of Java and the north coast of Papua. For longer return periods (500-2500 years), the tsunami hazard is highest along the Sunda Arc, reflecting larger maximum magnitudes along the Sunda Arc. The annual probability of experiencing a tsunami with a height at the coast of > 0.5 m is greater than 10% for Sumatra, Java, the Sunda Islands (Bali, Lombok, Flores, Sumba) and north Papua. The annual probability of experiencing a tsunami with a height of > 3.0 m, which would cause significant inundation and fatalities, is 1-10% in Sumatra, Java, Bali, Lombok and north Papua, and 0.1-1% for north Sulawesi, Seram and Flores. The results of this national-scale hazard assessment provide evidence for disaster managers to prioritise regions for risk mitigation activities and/or more detailed hazard or risk assessment.
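
    As an illustration of the Monte Carlo PSHA-style machinery both records describe, a toy hazard curve can be built by sampling magnitudes from a truncated Gutenberg-Richter distribution and pushing them through a wave-height model; the scaling law below is a placeholder, not a real source model:

        import numpy as np

        rng = np.random.default_rng(1)

        rate_M5 = 0.5                 # events/yr with M >= 5 in the zone (invented)
        b, m_min, m_max = 1.0, 5.0, 9.0

        n = 200_000
        beta = b * np.log(10.0)
        u = rng.random(n)             # truncated G-R sampling by inversion
        mags = m_min - np.log(1.0 - u * (1.0 - np.exp(-beta * (m_max - m_min)))) / beta

        height = 10 ** (0.6 * (mags - 7.0)) * np.exp(rng.normal(0.0, 0.5, n))  # toy scaling

        for h in (0.5, 3.0):
            annual_rate = rate_M5 * np.mean(height > h)
            print(f"annual P(tsunami height > {h} m) ~ {1 - np.exp(-annual_rate):.3f}")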

  19. An Enhanced Artificial Bee Colony Algorithm with Solution Acceptance Rule and Probabilistic Multisearch

    Directory of Open Access Journals (Sweden)

    Alkın Yurtkuran

    2016-01-01

    Full Text Available The artificial bee colony (ABC) algorithm is a popular swarm-based technique, which is inspired by the intelligent foraging behavior of honeybee swarms. This paper proposes a new variant of the ABC algorithm, namely, enhanced ABC with solution acceptance rule and probabilistic multisearch (ABC-SA), to address global optimization problems. A new solution acceptance rule is proposed where, instead of greedy selection between the old solution and the new candidate solution, worse candidate solutions have a probability of being accepted. Additionally, the acceptance probability of worse candidates is nonlinearly decreased throughout the search process adaptively. Moreover, in order to improve the performance of the ABC and balance intensification and diversification, a probabilistic multisearch strategy is presented. Three different search equations with distinctive characters are employed using predetermined search probabilities. By implementing a new solution acceptance rule and a probabilistic multisearch approach, the intensification and diversification performance of the ABC algorithm is improved. The proposed algorithm has been tested on well-known benchmark functions of varying dimensions by comparing against novel ABC variants, as well as several recent state-of-the-art algorithms. Computational results show that the proposed ABC-SA outperforms other ABC variants and is superior to state-of-the-art algorithms proposed in the literature.
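
    A sketch of the two ingredients described above, in our own simplified reading (the decay law and probabilities are assumed, not the paper's exact settings):

        import random

        random.seed(0)

        def accept(f_old, f_new, iteration, max_iter, p0=0.3):
            """Keep better solutions; accept worse ones with a nonlinearly
            decaying probability (assumed quadratic decay)."""
            if f_new <= f_old:                                  # minimisation
                return True
            return random.random() < p0 * (1.0 - iteration / max_iter) ** 2

        def pick_search_equation(p1=0.5, p2=0.3, p3=0.2):
            """Probabilistic multisearch: choose one of three search equations
            with predetermined probabilities."""
            r, acc = random.random(), 0.0
            for prob, name in ((p1, "eq1"), (p2, "eq2"), (p3, "eq3")):
                acc += prob
                if r < acc:
                    return name
            return "eq3"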

  20. Integration of Fuzzy and Probabilistic Information in the Description of Hydraulic Conductivity

    Science.gov (United States)

    Druschel, B.; Ozbek, M.; Pinder, G.

    2004-12-01

    Evaluation of the heterogeneity of hydraulic conductivity, K, is a well-known problem in groundwater hydrology. The open question is how to fully represent a given highly heterogeneous K field and its inherent uncertainty at the least cost. Today, most K fields are analyzed using field test data and probability theory. Uncertainty is usually reported in the spatial covariance. In an attempt to develop a more cost-effective method which still provides an accurate approximation of a K field, we propose using an evidence theory framework to merge probabilistic and fuzzy (or possibilistic) information in an effort to improve our ability to fully define a K field. The tool chosen to fuse probabilistic information obtained via experiment and subjective information provided by the groundwater professional is Dempster's Rule of Combination. In using this theory we must create mass assignments for our subject of interest, describing the degree of evidence that supports the presence of our subject in a particular set. These mass assignments can be created directly from the probabilistic information and, in the case of the subjective information, from feedback we obtain from an expert. The fusion of these two types of information provides a better description of uncertainty than would typically be available with probability theory alone.
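
    Dempster's Rule of Combination, the fusion tool named above, is easy to state concretely; a sketch on a two-class frame for K, with invented mass assignments:

        from itertools import product

        LOW, HIGH = frozenset({"low"}), frozenset({"high"})
        BOTH = LOW | HIGH                           # total ignorance

        m_data = {LOW: 0.6, HIGH: 0.3, BOTH: 0.1}   # from field tests (hypothetical)
        m_expert = {LOW: 0.2, HIGH: 0.5, BOTH: 0.3} # from expert judgement (hypothetical)

        combined, conflict = {}, 0.0
        for (a, wa), (b, wb) in product(m_data.items(), m_expert.items()):
            inter = a & b
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb                 # mass on contradictory evidence
        combined = {s: w / (1.0 - conflict) for s, w in combined.items()}
        print(combined, "conflict mass:", conflict)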

  1. A note on probabilistic models over strings: the linear algebra approach.

    Science.gov (United States)

    Bouchard-Côté, Alexandre

    2013-12-01

    Probabilistic models over strings have played a key role in developing methods that take into consideration indels as phylogenetically informative events. There is an extensive literature on using automata and transducers on phylogenies to do inference on these probabilistic models, in which an important theoretical question is the complexity of computing the normalization of a class of string-valued graphical models. This question has been investigated using tools from combinatorics, dynamic programming, and graph theory, and has practical applications in Bayesian phylogenetics. In this work, we revisit this theoretical question from a different point of view, based on linear algebra. The main contribution is a set of results based on this linear algebra view that facilitate the analysis and design of inference algorithms on string-valued graphical models. As an illustration, we use this method to give a new elementary proof of a known result on the complexity of inference on the "TKF91" model, a well-known probabilistic model over strings. Compared to previous work, our proving method is easier to extend to other models, since it relies on a novel weak condition, triangular transducers, which is easy to establish in practice. The linear algebra view provides a concise way of describing transducer algorithms and their compositions, opens the possibility of transferring fast linear algebra libraries (for example, based on GPUs), as well as low rank matrix approximation methods, to string-valued inference problems.

  2. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  3. Probabilistic machine learning and artificial intelligence

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  4. A History of Probabilistic Inductive Logic Programming

    Directory of Open Access Journals (Sweden)

    Fabrizio Riguzzi

    2014-09-01

    Full Text Available The field of Probabilistic Logic Programming (PLP) has seen significant advances in the last 20 years, with many proposals for languages that combine probability with logic programming. Since the start, the problem of learning probabilistic logic programs has been the focus of much attention. Learning these programs represents a whole subfield of Inductive Logic Programming (ILP). In Probabilistic ILP (PILP) two problems are considered: learning the parameters of a program given the structure (the rules), and learning both the structure and the parameters. Usually structure learning systems use parameter learning as a subroutine. In this article we present an overview of PILP and discuss the main results.

  5. Probabilistic Modeling of Graded Timber Material Properties

    DEFF Research Database (Denmark)

    Faber, M. H.; Köhler, J.; Sørensen, John Dalsgaard

    2004-01-01

    The probabilistic modeling of timber material characteristics is considered with special emphasis on the modeling of the effect of different quality control and selection procedures used as means for quality grading in the production line. It is shown how statistical models may be established... The tail behavior plays an important role in the overall probabilistic modeling; therefore a scheme for estimating the parameters of the probability distributions focusing on the tail behavior has been established using a censored Maximum Likelihood estimation technique. The proposed probabilistic models have been formulated...

  6. satisfies probabilistic k-anonymity criterion

    Directory of Open Access Journals (Sweden)

    Anna Oganian

    2017-04-01

    Full Text Available Before releasing databases which contain sensitive information about individuals, data publishers must apply Statistical Disclosure Limitation (SDL) methods to them, in order to avoid disclosure of sensitive information on any identifiable data subject. SDL methods often consist of masking or synthesizing the original data records in such a way as to minimize the risk of disclosure of the sensitive information while providing data users with accurate information about the population of interest. In this paper we propose a new scheme for disclosure limitation, based on the idea of local synthesis of data. Our approach is predicated on model-based clustering. The proposed method satisfies the requirements of k-anonymity; in particular we use a variant of the k-anonymity privacy model, namely probabilistic k-anonymity, by incorporating constraints on cluster cardinality. Regarding data utility, for continuous attributes, we exactly preserve means and covariances of the original data, while approximately preserving higher-order moments and analyses on subdomains (defined by clusters and cluster combinations). For both continuous and categorical data, our experiments with medical data sets show that, from the point of view of data utility, local synthesis compares very favorably with other methods of disclosure limitation, including the sequential regression approach for synthetic data generation.
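
    A crude stand-in for the local synthesis idea (not the authors' model-based clustering; the grouping and sampling are simplified): records are grouped into clusters of at least k and replaced by draws from a normal fitted within each cluster, so means and covariances are preserved in expectation rather than exactly as in the paper.

        import numpy as np

        rng = np.random.default_rng(2)
        k = 10
        X = rng.normal(size=(200, 3))          # stand-in for continuous attributes

        # Order records along the first principal direction and cut into
        # groups of k, so every synthetic record is backed by >= k originals.
        v = np.linalg.svd(X - X.mean(0), full_matrices=False)[2][0]
        order = np.argsort(X @ v)
        X_syn = np.empty_like(X)
        for start in range(0, len(X), k):
            idx = order[start:start + k]
            if len(idx) < k:                   # fold a short tail into the last group
                idx = order[start - k:]
            mu, cov = X[idx].mean(0), np.cov(X[idx].T)
            X_syn[idx] = rng.multivariate_normal(mu, cov, size=len(idx))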

  7. Seismic performance of viaducts with probabilistic method

    Institute of Scientific and Technical Information of China (English)

    ZHU Xi; WANG Jianmin

    2007-01-01

    Due to the uncertainty of both ground motions and structural capacity, it is necessary to assess the seismic performance of viaduct structures using a probabilistic method. The risk is quantified by a procedure based on a numerical determination of the fragility curves. A group of ground motions, the Large Magnitude-Short Distance Bin (LMSR-N), selected specially for its response spectra, accords well with the corresponding spectra of the Chinese code for seismic design. The characteristic values of the curvature ductility factors for the serviceability and the damage control limit states are obtained, and two equations for estimating the characteristic values of the curvature ductility factors are developed through regression analysis. Then, the serviceability and damage control limit states are proposed. Three damage states are constituted according to the results of experiments by the Pacific Earthquake Engineering Research (PEER) Center. The analytical fragility curves are obtained using both the Capacity Spectrum Method (CSM; non-linear static) and the Incremental Dynamic Method (IDM; non-linear dynamic). The fragility curves developed by the CSM make the structural analysis simple and quick, avoiding the implementation of dynamic response history analysis (RHA). Although dynamic RHA requires a lot of complicated analysis of the structure, its results are reliable and accurate. Fragility curves are powerful tools for use in performance-based seismic bridge design.
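
    Fragility curves such as those developed here are commonly summarised in a lognormal parametric form; a sketch (the parametric form is a common convention, and the parameter values are invented):

        import math

        def fragility(im, theta, beta):
            """P(reaching a damage state | intensity measure im), with median
            capacity theta and lognormal dispersion beta."""
            return 0.5 * (1.0 + math.erf(math.log(im / theta) / (beta * math.sqrt(2.0))))

        # Probability of exceeding a serviceability limit state at PGA = 0.4 g,
        # assuming a median capacity of 0.55 g and dispersion of 0.5:
        print(fragility(0.4, 0.55, 0.5))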

  8. Probabilistic Seismic Hazard Assessment for Taiwan

    Directory of Open Access Journals (Sweden)

    Yu-Ju Wang

    2016-06-01

    Full Text Available The Taiwan Earthquake Model (TEM) was established to assess the seismic hazard and risk for Taiwan by considering the social and economic impacts of various components from geology, seismology, and engineering. This paper gives the first version of the TEM probabilistic seismic hazard analysis for Taiwan covering these aspects, named TEM PSHA2015. The model adopts the source parameters of 38 seismogenic structures identified by TEM geologists. In addition to specific fault source-based categorization, seismic activities are categorized as shallow, subduction intraplate, and subduction interplate events. To evaluate the potential ground-shaking resulting from each seismic source, the corresponding ground-motion prediction equations for crustal and subduction earthquakes are adopted. The highest hazard probability is evaluated to be in Southwestern Taiwan and the Longitudinal Valley of Eastern Taiwan. Among the special municipalities in the highly populated Western Taiwan region, Taichung, Tainan, and New Taipei City are evaluated to have the highest hazard. Tainan has the highest seismic hazard for peak ground acceleration in the model based on TEM fault parameters. In terms of pseudo-spectral acceleration, Tainan has higher hazard over short spectral periods, whereas Taichung has higher hazard over long spectral periods. The analysis indicates the importance of earthquake-resistant designs for low-rise buildings in Tainan and high-rise buildings in Taichung.

  9. Probabilistic UML statecharts for specification and verification: a case study

    NARCIS (Netherlands)

    Jansen, D.N.; Jürjens, J.; Cengarle, M.V.; Fernandez, E.B.; Rumpe, B.; Sander, R.

    2002-01-01

    This paper introduces a probabilistic extension of UML statecharts. A requirements-level semantics of statecharts is extended to include probabilistic elements. Desired properties for probabilistic statecharts are expressed in the probabilistic logic PCTL, and verified using the model checker Prism.

  10. Risk taking and risk sharing: does responsibility matter?

    NARCIS (Netherlands)

    Cettolin, Elena; Tausch, Franziska

    Risk sharing arrangements diminish individuals’ vulnerability to probabilistic events that negatively affect their financial situation. This is because risk sharing implies redistribution, as lucky individuals support the unlucky ones. We hypothesize that responsibility for risky choices decreases

  11. Risk taking and risk sharing: does responsibility matter?

    NARCIS (Netherlands)

    Cettolin, E.; Tausch, F.

    2013-01-01

    Risk sharing arrangements diminish individuals’ vulnerability to probabilistic events that negatively affect their financial situation. This is because risk sharing implies redistribution, as lucky individuals support the unlucky ones. We hypothesize that responsibility for risky choices decreases

  12. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecast probabilistically, owing to the uncertainties in the underlying causes of the phenomena. In these forecasts, the probability of the event, over some lead time, is estimated based on model simulations or predictive indicators. By issuing probabilistic forecasts, agencies can communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Because effective risk management requires reliable forecasts, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments are presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and an unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework is presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and it therefore maximizes one's ability to distinguish between a reliable and an unreliable forecast. Application of this verification system is also examined within a real forecasting case study, highlighting the additional statistical power provided by the use of the Poisson-Binomial distribution.
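
    The Poisson-Binomial verification idea above can be sketched directly: under the reliability hypothesis the number of observed events is Poisson-Binomial distributed with the forecast probabilities as parameters (the forecasts, count and two-sided p-value convention below are our own illustration):

        import numpy as np

        def poisson_binomial_pmf(probs):
            """Exact pmf of the number of successes among independent,
            non-identical Bernoulli trials, by dynamic programming."""
            pmf = np.zeros(len(probs) + 1)
            pmf[0] = 1.0
            for p in probs:
                pmf[1:] = pmf[1:] * (1.0 - p) + pmf[:-1] * p
                pmf[0] *= (1.0 - p)
            return pmf

        forecast_p = np.array([0.1, 0.2, 0.8, 0.3, 0.5] * 4)   # 20 forecasts
        observed_count = 11
        pmf = poisson_binomial_pmf(forecast_p)
        p_value = pmf[pmf <= pmf[observed_count]].sum()        # two-sided
        print(p_value)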

  13. Probabilistic analysis of linear elastic cracked structures

    Institute of Scientific and Technical Information of China (English)

    2007-01-01

    This paper presents a probabilistic methodology for linear fracture mechanics analysis of cracked structures. The main focus is on the probabilistic aspects related to the nature of cracks in the material. The methodology involves finite element analysis; statistical models for uncertainty in material properties, crack size, fracture toughness and loads; and standard reliability methods for evaluating the probabilistic characteristics of the linear elastic fracture parameter. The uncertainty in the crack size can have a significant effect on the probability of failure, particularly when the crack size has a large coefficient of variation. A numerical example is presented to show that the probabilistic methodology based on Monte Carlo simulation provides accurate estimates of failure probability for use in linear elastic fracture mechanics.
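
    A minimal Monte Carlo failure-probability estimate of the kind the example refers to, for a crack with stress intensity K_I = Y * S * sqrt(pi * a) checked against toughness K_Ic (all distributions and values are invented):

        import numpy as np

        rng = np.random.default_rng(3)
        n = 1_000_000

        a = rng.lognormal(mean=np.log(3e-3), sigma=0.6, size=n)   # crack size, m
        stress = rng.normal(250e6, 25e6, size=n)                  # Pa
        K_Ic = rng.normal(60e6, 6e6, size=n)                      # Pa*sqrt(m)
        Y = 1.12                                                  # geometry factor

        K_I = Y * stress * np.sqrt(np.pi * a)
        print("P_f ~", np.mean(K_I >= K_Ic))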

  14. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    difficulties of ambiguity and definition show up when attempting to make the transition from a given authorized partial safety factor code to a superior probabilistic code. For any chosen probabilistic code format there is a considerable variation of the reliability level over the set of structures defined... a considerable variation of the reliability measure as defined by a specific probabilistic code format. Decision theoretical principles are applied to get guidance about which of these different reliability levels of existing practice to choose as the target reliability level. Moreover, it is shown that the chosen probabilistic code format has not only a strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can differ by several orders of magnitude for two...

  15. Revising incompletely specified convex probabilistic belief bases

    CSIR Research Space (South Africa)

    Rens, G

    2016-04-01

    Full Text Available Presented at the International Workshop on Non-Monotonic Reasoning (NMR), 22-24 April 2016, Cape Town, South Africa. Gavin Rens, CAIR, University of KwaZulu-Natal, School of Mathematics, Statistics...

  16. A logic for inductive probabilistic reasoning

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework...

  17. Non-unitary probabilistic quantum computing

    Science.gov (United States)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  18. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2010-01-01

    The interplay of random phenomena and continuous real-time control deserves increased attention, for instance in wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variations of systems with hybrid dynamics. In safety verification of classical hybrid systems we are interested in whether a certain set of unsafe system states can be reached from a set of initial states. In the probabilistic setting, we may ask instead whether the probability of reaching unsafe states is below some given threshold. In this paper, we consider probabilistic hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems, without resorting to point...

  19. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan;

    2012-01-01

    The interplay of random phenomena and continuous dynamics deserves increased attention, especially in the context of wireless sensing and control applications. Safety verification for such systems thus needs to consider probabilistic variants of systems with hybrid dynamics. In safety verification of classical hybrid systems, we are interested in whether a certain set of unsafe system states can be reached from a set of initial states. In the probabilistic setting, we may ask instead whether the probability of reaching unsafe states is below some given threshold. In this paper, we consider probabilistic hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems. Moreover, being based...

  20. Improved transformer protection using probabilistic neural network ...

    African Journals Online (AJOL)


    This article presents a novel technique to distinguish between magnetizing inrush ... Keywords: protective relaying, probabilistic neural network, active power relays, power ... Multi-layer Feed-Forward Neural Network (MFFNN) with back-propagation learning technique.

  1. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preferences composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into the evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes, effects analysis and productivity analysis – together with explanations about the application of the concepts involved –this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching classes of graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.

  2. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

    Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. This method is inspired by genetic algorithms (Russell and Norvig, 2002), in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones. The path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method has the ability to generate varied high-quality paths, which is desirable for games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method to generate strategic team AI pathfinding plans.

  3. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    Application of probabilistic precipitation forecasts from a deterministic model towards increasing the lead-time of flash flood forecasts in South Africa. ... An ensemble set of 30 adjacent basins is then identified as ensemble members for each ...

  4. A probabilistic strategy for parametric catastrophe insurance

    Science.gov (United States)

    Figueiredo, Rui; Martina, Mario; Stephenson, David; Youngman, Benjamin

    2017-04-01

    Economic losses due to natural hazards have shown an upward trend since 1980, which is expected to continue. Recent years have seen a growing worldwide commitment towards the reduction of disaster losses. This requires effective management of disaster risk at all levels, a part of which involves reducing financial vulnerability to disasters ex ante, ensuring that necessary resources will be available following such events. One way to achieve this is through risk transfer instruments. These can be based on different types of triggers, which determine the conditions under which payouts are made after an event. This study focuses on parametric triggers, where payouts are determined by the occurrence of an event exceeding specified physical parameters at a given location, at multiple locations, or over a region. This type of product offers a number of important advantages, and its adoption is increasing. The main drawback of parametric triggers is their susceptibility to basis risk, which arises when there is a mismatch between triggered payouts and the occurrence of loss events. This is unavoidable in such programmes, as their calibration is based on models containing a number of different sources of uncertainty. Thus, a deterministic definition of the loss-event triggering parameters appears flawed. However, often for simplicity, this is the way in which most parametric models tend to be developed. This study therefore presents an innovative probabilistic strategy for parametric catastrophe insurance. It is advantageous as it recognizes uncertainties and minimizes basis risk while maintaining a simple and transparent procedure. A logistic regression model is constructed here to represent the occurrence of loss events based on certain loss index variables, obtained through the transformation of input environmental variables. Flood-related losses due to rainfall are studied. The resulting model is able, for any given day, to issue probabilities of occurrence of loss
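
    The logistic trigger model mentioned above has a simple skeleton; the coefficients, loss index value and payout rule here are invented placeholders:

        import math

        def loss_event_probability(loss_index, b0=-4.0, b1=0.08):
            """Logistic model: P(loss event | loss index)."""
            return 1.0 / (1.0 + math.exp(-(b0 + b1 * loss_index)))

        def expected_payout(loss_index, payout=1_000_000):
            # A probabilistic trigger can scale the payout by the modelled
            # probability instead of applying a hard threshold, which is one
            # way to reduce basis risk.
            return loss_event_probability(loss_index) * payout

        print(loss_event_probability(60.0), expected_payout(60.0))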

  5. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed, with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.

  6. PROBABILISTIC METHODOLOGY OF LOW CYCLE FATIGUE ANALYSIS

    Institute of Scientific and Technical Information of China (English)

    Jin Hui; Wang Jinnuo; Wang Libin

    2003-01-01

    The cyclic stress-strain responses (CSSR), Neuber's rule (NR) and the cyclic strain-life relation (CSLR) are treated as probabilistic curves in the local stress and strain method of low cycle fatigue analysis. The randomness of loading and the theory of fatigue damage accumulation (TOFDA) are considered. The probabilistic analyses of local stress, local strain and fatigue life are constructed based on first-order Taylor series expansions. Through the method proposed, fatigue reliability analysis can be accomplished.

  7. DEMPSTER-SHAFER THEORY BY PROBABILISTIC REASONING

    Directory of Open Access Journals (Sweden)

    Chiranjib Mukherjee

    2015-10-01

    Full Text Available Probabilistic reasoning is used when outcomes are unpredictable. We examine methods which use probabilistic representations for all knowledge and which reason by propagating uncertainties from evidence and assertions to conclusions. The uncertainties can arise from an inability to predict outcomes due to unreliable, vague, incomplete or inconsistent knowledge. Such approaches are taken in Artificial Intelligence systems to deal with reasoning under these types of uncertain conditions.

  8. Probabilistic nature in L/H transition

    Energy Technology Data Exchange (ETDEWEB)

    Toda, Shinichiro; Itoh, Sanae-I.; Yagi, Masatoshi [Kyushu Univ., Fukuoka (Japan). Research Inst. for Applied Mechanics; Itoh, Kimitaka; Fukuyama, Atsushi

    1999-11-01

    A statistical picture of the excitation of a plasma transition, which occurs in a strongly turbulent state, is examined. The physical picture of transition phenomena is extended to include statistical variances. The dynamics of the plasma density and the turbulence-driven flux are studied with the hysteresis nature of the flux-density relation. Probabilistic excitation is predicted, and the critical conditions are described by the probabilistic distribution function. The stability of the model equations is also discussed. (author)

  9. Semantics of probabilistic processes an operational approach

    CERN Document Server

    Deng, Yuxin

    2015-01-01

    This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, which is intended to be accessible to postgraduate students in Computer Science and Mathematics. It can also be us

  10. A hybrid approach for probabilistic forecasting of electricity price

    DEFF Research Database (Denmark)

    Wan, Can; Xu, Zhao; Wang, Yelei

    2014-01-01

    The electricity market plays a key role in realizing the economic promise of smart grids. Accurate and reliable electricity market price forecasting is essential to facilitate the various decision making activities of market participants in the future smart grid environment. However, due to... probabilistic interval forecasts can be of great importance for quantifying the uncertainties of potential forecasts, thus effectively supporting decision making activities against the uncertainties and risks ahead. This paper proposes a hybrid approach to construct prediction intervals of MCPs (market clearing prices) with a two-... method for electricity price forecasting. The effectiveness of the proposed hybrid method has been validated through comprehensive tests using real price data from the Australian electricity market.
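
    The paper's hybrid construction is more elaborate, but the basic device of turning point forecasts into prediction intervals can be sketched with empirical error quantiles (all data below are synthetic):

        import numpy as np

        rng = np.random.default_rng(4)

        actual = 50.0 + 10.0 * rng.standard_normal(500)            # past MCPs
        point_forecast = actual + 5.0 * rng.standard_normal(500)   # imperfect forecasts

        errors = actual - point_forecast
        lo, hi = np.quantile(errors, [0.05, 0.95])                 # 90% interval

        new_forecast = 48.0
        print(f"90% PI for MCP: [{new_forecast + lo:.1f}, {new_forecast + hi:.1f}]")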

  11. Nuclear power plant personnel errors in decision-making as an object of probabilistic risk assessment. Methodological extensions on the basis of a differentiated analysis of safety-relevant goals

    Energy Technology Data Exchange (ETDEWEB)

    Reer, B.

    1993-09-01

    The integration of human error analysis (man-machine system analysis (MMSA)) is an essential part of probabilistic risk assessment (PRA). A method is presented for the systematic, comprehensive inclusion in PRA of decision-based errors due to conflicts or similarities. For the error identification procedure, new questioning techniques are developed. These errors are identified by looking at retroactions caused by subordinate goals as components of the overall safety-relevant goal. New quantification methods for estimating situation-specific probabilities are developed. The factors conflict and similarity are operationalized in a way that allows their quantification based on information usually available in a PRA. The quantification procedure uses extrapolations and interpolations based on a sparse set of data related to decision-based errors. Moreover, for passive errors in decision-making a completely new approach is presented in which errors are quantified via a delay in initiating the required action rather than via error probabilities. The practicability of this dynamic approach is demonstrated by a probabilistic analysis of the actions required during the 1985 total loss of feedwater event at the Davis-Besse plant. The extensions of the classical PRA method developed in this work are applied to an MMSA of the decay heat removal (DHR) of the HTR-500. Errors in decision-making, as potential roots of extraneous acts, are taken into account in a comprehensive and systematic manner. Five additional errors are identified. However, the probabilistic quantification results in a nonsignificant increase of the DHR failure probability. (orig.)

  12. Automated Probabilistic System Architecture Analysis in the Multi-Attribute Prediction Language (MAPL): Iteratively Developed using Multiple Case Studies

    Directory of Open Access Journals (Sweden)

    Robert Lagerström

    2017-07-01

    Full Text Available The Multi-Attribute Prediction Language (MAPL), an analysis metamodel for non-functional qualities of system architectures, is introduced. MAPL features automated analysis in five non-functional areas: service cost, service availability, data accuracy, application coupling, and application size. In addition, MAPL explicitly includes utility modeling to make trade-offs between the qualities. The article introduces how each of the five non-functional qualities is modeled and quantitatively analyzed based on the ArchiMate standard for enterprise architecture modeling and the previously published Predictive, Probabilistic Architecture Modeling Framework, building on the well-known UML and OCL formalisms. The main contribution of MAPL lies in the probabilistic use of multi-attribute utility theory for the trade-off analysis of non-functional properties. Additionally, MAPL proposes novel model-based analyses of several non-functional attributes. We also report how MAPL has been iteratively developed using multiple case studies.
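
    The flavour of a probabilistic multi-attribute utility trade-off can be sketched with a small Monte Carlo computation; the attributes, weights and distributions below are invented for illustration and are not taken from MAPL or its case studies:

```python
import numpy as np

# Rank two candidate architectures by expected weighted utility, with
# each quality attribute treated as a random variable (all normalized
# to [0, 1]; cost and coupling are "lower is better").
rng = np.random.default_rng(2)
weights = {"cost": 0.4, "availability": 0.4, "coupling": 0.2}

def expected_utility(cost_mu, avail_a, avail_b, coupling_mu, n=10_000):
    cost = rng.normal(cost_mu, 0.1, n)
    avail = rng.beta(avail_a, avail_b, n)      # fraction of uptime
    coupling = rng.normal(coupling_mu, 0.1, n)
    utility = (weights["cost"] * (1 - cost)
               + weights["availability"] * avail
               + weights["coupling"] * (1 - coupling))
    return utility.mean()

print("A:", round(expected_utility(0.5, 99, 1, 0.6), 3))  # cheap to run, very available
print("B:", round(expected_utility(0.3, 9, 1, 0.8), 3))   # cheaper, less available
```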

  13. Probabilistic Prediction of Lifetimes of Ceramic Parts

    Science.gov (United States)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
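
    The probabilistic-strength part of such an analysis rests on the Weibull model of brittle fracture; a back-of-envelope sketch (with illustrative parameters, not values from CARES/Life) is:

```python
import numpy as np

# Two-parameter Weibull law for the failure probability of a uniaxially
# stressed ceramic component: P_f = 1 - exp(-(sigma/sigma0)**m).
# sigma0 (characteristic strength) and m (Weibull modulus) are invented.
def failure_probability(stress_mpa, sigma0=350.0, m=10.0):
    return 1.0 - np.exp(-(np.asarray(stress_mpa) / sigma0) ** m)

for s in (200, 300, 350, 400):
    print(f"{s} MPa -> P_f = {failure_probability(s):.4f}")
# At sigma = sigma0 the failure probability is 1 - 1/e ~ 0.632.
```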

  14. Probabilistic Choice, Reversibility, Loops, and Miracles

    Science.gov (United States)

    Stoddart, Bill; Bell, Pete

    We consider the addition of probabilistic choice to Abrial's Generalised Substitution Language (GSL) in a form that accommodates the backtracking interpretation of non-deterministic choice. Our formulation is introduced as an extension of the Prospective Values formalism we have developed to describe the results of a backtracking search. Significant features are that probabilistic choice is governed by feasibility, and that non-termination is strict. The former property allows us to use probabilistic choice to generate search heuristics. In this paper we are particularly interested in iteration. By demonstrating sub-conjunctivity and monotonicity properties of expectations, we give the basis for a fixed-point semantics of iterative constructs, and we consider the practical proof treatment of probabilistic loops. We discuss loop invariants, loops with probabilistic behaviour, and probabilistic termination in the context of a formalism in which a small probability of non-termination can dominate our calculations, proposing a method of limits to avoid this problem. The formal programming constructs described have been implemented in a reversible virtual machine (RVM).
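
    The expectation-based reasoning mentioned here can be illustrated with a toy expectation transformer; this is a generic sketch of probabilistic-choice semantics, not the paper's GSL formulation, and the example program and probability are invented:

```python
# A program maps a post-expectation (a function from states to [0, 1])
# to a pre-expectation; probabilistic choice takes the p-weighted average.
def prob_choice(p, prog_left, prog_right):
    return lambda post: lambda s: (p * prog_left(post)(s)
                                   + (1 - p) * prog_right(post)(s))

def assign(update):
    """Assignment: pre-expectation is the post-expectation of the new state."""
    return lambda post: lambda s: post(update(s))

# With probability 0.7 set x to 1, else set x to 0.
prog = prob_choice(0.7,
                   assign(lambda s: {**s, "x": 1}),
                   assign(lambda s: {**s, "x": 0}))
post = lambda s: 1.0 if s["x"] == 1 else 0.0   # expectation of "x == 1"
print(prog(post)({"x": 0}))  # 0.7 = probability of establishing x == 1
```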

  15. Refinement for Probabilistic Systems with Nondeterminism

    Directory of Open Access Journals (Sweden)

    David Streader

    2011-06-01

    Full Text Available Before we combine actions and probabilities, two very obvious questions should be asked. Firstly, what does "the probability of an action" mean? Secondly, how does probability interact with nondeterminism? Neither question has a single universally agreed-upon answer, but by considering these questions at the outset we build a novel and hopefully intuitive probabilistic event-based formalism. In previous work we have characterised refinement via the notion of testing: basically, if one system passes all the tests that another system passes (and maybe more), we say the first system is a refinement of the second. This is, in our view, an important way of characterising refinement, as it answers the question "what sort of refinement should I be using?" We use testing in this paper as the basis for our refinement. We develop tests for probabilistic systems by analogy with the tests developed for non-probabilistic systems. We make sure that our probabilistic tests, when performed on non-probabilistic automata, give us refinement relations which agree with those for non-probabilistic automata. We formalise this property as a vertical refinement.

  16. Probabilistic seasonal Forecasts to deterministic Farm Level Decisions: Innovative Approach

    Science.gov (United States)

    Mwangi, M. W.

    2015-12-01

    Climate change and vulnerability are major challenges in ensuring household food security. Climate information services have the potential to cushion rural households from extreme climate risks. However, the probabilistic nature of climate information products is not easily understood by the majority of smallholder farmers. Despite this, climate information has proved to be a valuable climate risk adaptation strategy at the farm level. This calls for innovative ways to help farmers understand and apply climate information services to inform their farm-level decisions. The study endeavored to co-design and test appropriate innovation systems for the uptake and scale-up of climate information services necessary for achieving climate-resilient development. In addition, it determined the conditions necessary to support the effective performance of the proposed innovation system. Data and information sources included a systematic literature review, secondary sources, government statistics, focus group discussions, household surveys and semi-structured interviews. Data was analyzed using both quantitative and qualitative techniques. Quantitative data was analyzed using the Statistical Package for Social Sciences (SPSS) software. Qualitative data was analyzed by establishing categories, themes, relationships and patterns, and drawing conclusions in line with the study objectives. Sustainable livelihoods, reduced household poverty and climate change resilience were the impacts that resulted from the study.

  17. Computing Distances between Probabilistic Automata

    Directory of Open Access Journals (Sweden)

    Mathieu Tracol

    2011-07-01

    Full Text Available We present relaxed notions of simulation and bisimulation on Probabilistic Automata (PA) that allow some error epsilon. When epsilon is zero we retrieve the usual notions of bisimulation and simulation on PAs. We give logical characterisations of these notions by choosing suitable logics which differ from the elementary ones, L with negation and L without negation, by the modal operator. Using flow networks, we show how to compute the relations in PTIME. This allows the definition of an efficiently computable non-discounted distance between the states of a PA. A natural modification of this distance is introduced to obtain a discounted distance, which weakens the influence of long-term transitions. We compare our notions of distance to others previously defined and illustrate our approach on various examples. We also show that our distance is non-expansive with respect to process algebra operators. Although L without negation is a suitable logic to characterise epsilon-(bi)simulation on deterministic PAs, it is not for general PAs; interestingly, we prove that it does characterise weaker notions, called a priori epsilon-(bi)simulation, which we prove to be NP-hard to decide.
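
    As a minimal illustration of epsilon-relaxed comparison of probabilistic behaviour, the sketch below checks whether two transition distributions are within epsilon in total variation distance; the distributions and epsilon are invented, and the paper's actual algorithm uses flow networks rather than this direct comparison:

```python
import numpy as np

# Total variation distance between two distributions over successor states.
def total_variation(p, q):
    return 0.5 * np.abs(np.asarray(p) - np.asarray(q)).sum()

p = [0.50, 0.30, 0.20]   # transition distribution of state s
q = [0.45, 0.35, 0.20]   # transition distribution of state t
eps = 0.1
d = total_variation(p, q)
print(f"TV(p, q) = {d:.2f}",
      "-> within epsilon" if d <= eps else "-> not within epsilon")
```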

  18. Pearl A Probabilistic Chart Parser

    CERN Document Server

    Magerman, D M; Magerman, David M.; Marcus, Mitchell P.

    1994-01-01

    This paper describes a natural language parsing algorithm for unrestricted text which uses a probability-based scoring function to select the "best" parse of a sentence. The parser, Pearl, is a time-asynchronous bottom-up chart parser with Earley-type top-down prediction which pursues the highest-scoring theory in the chart, where the score of a theory represents the extent to which the context of the sentence predicts that interpretation. This parser differs from previous attempts at stochastic parsers in that it uses a richer form of conditional probabilities based on context to predict likelihood. Pearl also provides a framework for incorporating the results of previous work in part-of-speech assignment, unknown word models, and other probabilistic models of linguistic features into one parsing tool, interleaving these techniques instead of using the traditional pipeline architecture. In preliminary tests, Pearl has been successful at resolving part-of-speech and word (in speech processing) ambiguity, dete...
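
    A stripped-down analogue of probability-scored chart parsing is a probabilistic CKY pass over a toy grammar; the grammar, lexicon and probabilities below are invented, and Pearl itself conditions its scores on richer context than plain rule probabilities:

```python
import math
from collections import defaultdict

# Binary rules in Chomsky normal form: (B, C) -> list of (A, prob).
rules = {
    ("NP", "VP"): [("S", 1.0)],
    ("V", "NP"): [("VP", 1.0)],
}
lexicon = {"time": [("NP", 0.6)], "flies": [("V", 0.4), ("NP", 0.2)],
           "arrows": [("NP", 0.6)]}

def cky(words):
    """Each chart cell keeps the best log-probability per nonterminal."""
    n = len(words)
    chart = defaultdict(dict)  # (i, j) -> {A: best log-prob}
    for i, w in enumerate(words):
        for a, p in lexicon.get(w, []):
            chart[(i, i + 1)][a] = math.log(p)
    for span in range(2, n + 1):
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):
                for b, pb in chart[(i, k)].items():
                    for c, pc in chart[(k, j)].items():
                        for a, p in rules.get((b, c), []):
                            score = pb + pc + math.log(p)
                            if score > chart[(i, j)].get(a, -math.inf):
                                chart[(i, j)][a] = score
    return chart[(0, n)]

# Best log-probability per nonterminal spanning the whole sentence.
print(cky(["time", "flies", "arrows"]))  # {'S': log(0.144)}
```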

  19. Optimal probabilistic dense coding schemes

    Science.gov (United States)

    Kögler, Roger A.; Neves, Leonardo

    2017-04-01

    Dense coding with non-maximally entangled states has been investigated in many different scenarios. We revisit this problem for protocols adopting the standard encoding scheme. In this case, the set of possible classical messages cannot be perfectly distinguished due to the non-orthogonality of the quantum states carrying them. So far, the decoding process has been approached in two ways: (i) The message is always inferred, but with an associated (minimum) error; (ii) the message is inferred without error, but only sometimes; in case of failure, nothing else is done. Here, we generalize on these approaches and propose novel optimal probabilistic decoding schemes. The first uses quantum-state separation to increase the distinguishability of the messages with an optimal success probability. This scheme is shown to include (i) and (ii) as special cases and continuously interpolate between them, which enables the decoder to trade-off between the level of confidence desired to identify the received messages and the success probability for doing so. The second scheme, called multistage decoding, applies only for qudits (d-level quantum systems with d>2) and consists of further attempts in the state identification process in case of failure in the first one. We show that this scheme is advantageous over (ii) as it increases the mutual information between the sender and receiver.
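
    The trade-off in approach (ii) can be made concrete with the textbook bound for unambiguous discrimination of two equiprobable non-orthogonal pure states, whose optimal success probability is 1 - |<psi|phi>|; the states below are illustrative:

```python
import numpy as np

# Two non-orthogonal single-qubit states separated by angle theta.
theta = np.pi / 8
psi = np.array([1.0, 0.0])
phi = np.array([np.cos(theta), np.sin(theta)])

overlap = abs(np.vdot(psi, phi))
print(f"|<psi|phi>| = {overlap:.3f}")
# Ivanovic-Dieks-Peres limit for error-free (unambiguous) identification:
print(f"optimal unambiguous success probability = {1 - overlap:.3f}")
```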

  20. Probabilistic description of traffic flow

    Science.gov (United States)

    Mahnke, R.; Kaupužs, J.; Lubashevsky, I.

    2005-03-01

    A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestions. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As a generalization, we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster, consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, the congested stop-and-go regime, and heavy viscous traffic. Traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first-passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given.
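
    The one-step cluster process described here can be sketched with a Gillespie-style simulation; the attachment and detachment rates below are invented placeholders rather than the physically motivated ansatz the authors fit to data:

```python
import numpy as np

# One-step (birth-death) process for the size of a single car cluster:
# cars attach at a rate proportional to the number of free-flowing cars
# and detach at a constant rate.
rng = np.random.default_rng(3)
n_cars = 60
w_plus, w_minus = 0.5, 10.0   # attachment coefficient, detachment rate

cluster, t = 0, 0.0
while t < 100.0:
    free = n_cars - cluster
    attach = w_plus * free                     # growth rate
    detach = w_minus if cluster > 0 else 0.0   # shrinkage rate
    total = attach + detach
    t += rng.exponential(1.0 / total)          # time to next event
    if rng.random() < attach / total:
        cluster += 1
    else:
        cluster -= 1
print(f"cluster size after t = 100: {cluster} of {n_cars} cars")
```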