WorldWideScience

Sample records for safety probabilistic analysis

  1. Probabilistic safety analysis of external floods. Method and application

    Energy Technology Data Exchange (ETDEWEB)

    Kluegel, J.U. [Kernkraftwerk Goesgen-Daeniken (Switzerland)]

    2013-05-15

    The events of Fukushima amplified scientific interest in improving methods for the probabilistic safety assessment (PSA) of extreme external events. The assessment of the consequences of external floods belongs to this group of events. The paper presents the key steps of a methodology for the probabilistic safety assessment of external floods and a recent application to a nuclear power plant in Switzerland. The presented methodology is an extension of earlier activities and places more focus on the PSA methodology itself, which may also be applicable to other studies. (orig.)

  2. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.
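
    As a rough illustration of the probabilistic composition the abstract describes, the following is a minimal sketch in ordinary Python rather than the paper's theorem-prover formalism. The hazard names, probabilities, independence assumption, and conditional property probabilities are all invented for illustration.

```python
import itertools

# Hypothetical hazardous conditions with assumed occurrence probabilities
hazards = {"sensor_noise": 0.05, "transmission_delay": 0.02}

# Assumed P(safety property holds | set of active hazards)
p_property = {
    frozenset(): 0.9999,
    frozenset({"sensor_noise"}): 0.995,
    frozenset({"transmission_delay"}): 0.99,
    frozenset({"sensor_noise", "transmission_delay"}): 0.95,
}

def p_safe(hazards, p_property):
    """Compose hazards by the law of total probability over all subsets,
    assuming the hazards occur independently of each other."""
    names = list(hazards)
    total = 0.0
    for r in range(len(names) + 1):
        for combo in itertools.combinations(names, r):
            active = frozenset(combo)
            p_scenario = 1.0
            for h in names:
                p_scenario *= hazards[h] if h in active else 1.0 - hazards[h]
            total += p_scenario * p_property[active]
    return total

print(f"P(property holds) = {p_safe(hazards, p_property):.6f}")
```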

  3. The probabilistic studies of safety; Les etudes probabilistes de surete

    Energy Technology Data Exchange (ETDEWEB)

    Lacoste, A.C. [Direction Generale de la Surete Nucleaire et de la Radioprotection, 75 - Paris (France)]; Kalalo, E.; Brenot, D. [Direction Generale de la Surete Nucleaire et de la Radioprotection, 75 - Paris (France)]; Pichereau, F.; Lanore, J.M.; Dupuy, P.; Corenwinder, F. [Institut de Radioprotection et de Surete Nucleaire (IRSN), 92 - Clamart (France)]; Primet, J.; Huyart, Th.; Faidy, C.; Meister, E.; Ardillon, E.; Le Bot, P. [Electricite de France (EDF), 75 - Paris (France)]; Maurin, Th. [Direction Generale de la Surete Nucleaire et de la Radioprotection, 75 - Paris (France)]; Gryffroy, D. [Association Vincotte Nucleaire (Belgium)]; Sandberg, J.; Virolainen, R. [STUK, Helsinki (Finland)]; Merrifield, J.S. [Nuclear Regulatory Commission (United States)]; Hill, T. [National Nuclear Regulator, Cape Town (South Africa)]; Cahen, B. [Ministere de l'Ecologie et du Developpement Durable, 75 - Paris (France)]; Chevalier, B. [Shell France, 92 - Colombes (France)]

    2003-11-15

    The safety of French nuclear reactors is based on deterministic studies. Probabilistic safety studies make it possible to complement the classical deterministic analysis with a particular method of investigation. Probabilistic safety studies are a risk evaluation method based on a systematic investigation of accident scenarios. They consist of a set of technical analyses allowing the risks linked to nuclear installations to be estimated in terms of the frequency of accidental events and of their consequences. (N.C.)

  4. A probabilistic safety analysis of UF₆ handling at the Portsmouth Gaseous Diffusion Plant

    Energy Technology Data Exchange (ETDEWEB)

    Boyd, G.J.; Lewis, S.R.; Summitt, R.L. [Safety and Reliability Optimization Services (SAROS), Inc., Knoxville, TN (United States)]

    1991-12-31

    A probabilistic safety study of UF₆ handling activities at the Portsmouth Gaseous Diffusion Plant has recently been completed. The analysis provides a unique perspective on the safety of UF₆ handling activities. The estimated release frequencies provide an understanding of current risks, and the examination of individual contributors yields a ranking of important plant features and operations. Aside from the probabilistic results, however, there is an even more important benefit derived from a systematic modeling of all operations. The integrated approach employed in the analysis allows the interrelationships among the equipment and the required operations to be explored in depth. This paper summarizes the methods used in the study and provides an overview of some of the technical insights that were obtained. Specific areas of possible improvement in operations are described.

  5. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
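
    As a plain illustration of the probability-of-failure quantity such a program computes, here is a brute-force Monte Carlo sketch; NESSUS itself uses far more efficient algorithms (advanced mean value, adaptive importance sampling), and the stress/strength distributions and parameters below are assumptions.

```python
import math
import random

random.seed(0)

def sample_stress():
    # Assumed lognormal stress, illustrative parameters (MPa)
    return math.exp(random.gauss(5.0, 0.10))

def sample_strength():
    # Assumed normal strength, illustrative parameters (MPa)
    return random.gauss(200.0, 15.0)

# Probability of failure: P(stress > strength), estimated by sampling
N = 200_000
failures = sum(sample_stress() > sample_strength() for _ in range(N))
print(f"Estimated probability of failure: {failures / N:.2e}")
```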

  6. Probabilistic Safety Analysis of High Speed and Conventional Lines Using Bayesian Networks

    Energy Technology Data Exchange (ETDEWEB)

    Grande Andrade, Z.; Castillo Ron, E.; O'Connor, A.; Nogal, M.

    2016-07-01

    A Bayesian network approach is presented for the probabilistic safety analysis (PSA) of railway lines. The idea consists of identifying and reproducing all the elements that the train encounters when circulating along a railway line, such as light and speed limit signals, tunnel or viaduct entries or exits, cuttings and embankments, acoustic signals received in the cabin, curves, switches, etc. In addition, since human error is very relevant for safety evaluation, the automatic train protection (ATP) systems and the driver behaviour and its time evolution are modelled and taken into account to determine the probabilities of human errors. The nodes of the Bayesian network, their links and the associated probability tables are automatically constructed based on the line data, which must be carefully specified. The conditional probability tables are reproduced by closed formulas, which facilitates the modelling and the sensitivity analysis. A sorted list of the most dangerous elements in the line is obtained, which permits making decisions about line safety and scheduling maintenance operations so as to optimize them and substantially reduce maintenance costs. The proposed methodology is illustrated by its application to several cases that include real lines such as the Palencia-Santander and the Dublin-Belfast lines. (Author)
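
    A minimal sketch of one Bayesian-network fragment of the kind described, assuming a tiny two-parent model with invented probabilities; the networks in the paper, their closed-formula conditional probability tables, and the line data are far richer.

```python
# A two-node fragment: P(driver misses a signal | attention, ATP active).
# All probabilities are illustrative assumptions, not values from the paper.

p_attention_low = 0.2          # driver attention degrades over the shift
p_atp_active = 0.95            # automatic train protection availability

# CPT: P(signal missed | attention_low, atp_active)
cpt_missed = {
    (True,  True):  0.001,     # inattentive, but ATP backs the driver up
    (True,  False): 0.050,
    (False, True):  0.0001,
    (False, False): 0.002,
}

# Marginalise over the parent nodes by enumeration
p_missed = 0.0
for attention_low in (True, False):
    for atp in (True, False):
        p_parents = ((p_attention_low if attention_low else 1 - p_attention_low)
                     * (p_atp_active if atp else 1 - p_atp_active))
        p_missed += p_parents * cpt_missed[(attention_low, atp)]

print(f"P(signal missed at this element) = {p_missed:.5f}")
```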

  7. Safety Verification for Probabilistic Hybrid Systems

    DEFF Research Database (Denmark)

    Zhang, Lijun; She, Zhikun; Ratschan, Stefan

    2010-01-01

    ...hybrid systems and develop a general abstraction technique for verifying probabilistic safety problems. This gives rise to the first mechanisable technique that can, in practice, formally verify safety properties of non-trivial continuous-time stochastic hybrid systems, without resorting to point-wise discretisation. Moreover, being based on arbitrary abstractions computed by tools for the analysis of non-probabilistic hybrid systems, improvements in the effectivity of such tools directly carry over to improvements in the effectivity of the technique we describe. We demonstrate the applicability of our approach...

  8. Probabilistic safety goals. Phase 3 - Status report

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT (Finland)); Knochenhauer, M. (Relcon Scandpower AB, Sundbyberg (Sweden))

    2009-07-15

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  9. Common problems in the elicitation and analysis of expert opinion affecting probabilistic safety assessments

    Energy Technology Data Exchange (ETDEWEB)

    Meyer, M.A.; Booker, J.M.

    1990-01-01

    Expert opinion is frequently used in probabilistic safety assessment (PSA), particularly in estimating low probability events. In this paper, we discuss some of the common problems encountered in eliciting and analyzing expert opinion data and offer solutions or recommendations. The problems are as follows. Experts are not naturally Bayesian: people fail to update their existing information to account for new information as it becomes available, as the Bayesian philosophy would predict. Experts cannot be fully calibrated: to calibrate experts, the feedback from the known quantities must be immediate, frequent, and specific to the task. Experts are limited in the number of things that they can mentally juggle at a time to 7 ± 2. Data gatherers and analysts can introduce bias by unintentionally altering the expert's thinking or answers. The level of detail of the data, or granularity, can affect the analyses. Finally, the conditioning effect poses difficulties in the gathering and analysis of expert data: the data that the expert gives can be conditioned on a variety of factors that affect the analysis and the interpretation of the results. 31 refs.
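
    As a contrast to the observation that experts are not naturally Bayesian, the following sketch shows what a consistent Bayesian update of an expert's failure-rate estimate looks like, using a conjugate Gamma-Poisson model; the prior and the operating record are invented numbers.

```python
# Conjugate Gamma-Poisson update of a failure rate (illustrative numbers).
# Prior: expert believes the failure rate is about 1e-3 per hour, weakly held.
alpha_prior, beta_prior = 0.5, 500.0   # prior mean = alpha / beta = 1e-3 per hour

# Evidence: 2 failures observed in 10,000 operating hours (assumed record)
failures, hours = 2, 10_000.0

# Posterior parameters follow directly from conjugacy
alpha_post = alpha_prior + failures
beta_post = beta_prior + hours

print(f"Prior mean rate:     {alpha_prior / beta_prior:.2e} per hour")
print(f"Posterior mean rate: {alpha_post / beta_post:.2e} per hour")
```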

  10. Wind Power in Mexico: Simulation of a Wind Farm and Application of Probabilistic Safety Analysis

    Directory of Open Access Journals (Sweden)

    C. Martín del Campo-Márquez

    2009-10-01

    Full Text Available The most important aspects of wind energy in Mexico, including the potential for generating electricity and the major projects planned, are presented here. In particular, the generation costs are compared to those of other energy sources. The results from the simulation of a 100 MW wind farm in the Tehuantepec Isthmus are also presented, and the environmental impacts related to the wind farm in that zone are analyzed. Finally, some benefits of using Probabilistic Safety Analysis to evaluate the risks associated with events that can occur in wind parks are discussed; the technique is especially useful for the design and maintenance of the parks and of the wind turbines themselves. In particular, an event tree was developed to analyze possible accident sequences that could occur when the wind speed is too great, and fault trees were developed for each mitigating system considered, in order to determine the relative importance of the wind generator components to the failure sequences, to evaluate the yield of suggested improvements, and to optimize maintenance programs.
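
    A minimal sketch of quantifying an event tree of the kind described, with an assumed initiator frequency and invented mitigating-system failure probabilities standing in for the fault-tree results.

```python
# Event tree for an "extreme wind speed" initiator (illustrative values).
# Mitigating-system failure probabilities would come from fault trees.
f_high_wind = 1.0          # initiator frequency, events per year (assumed)
p_pitch_fail = 1e-3        # blade pitch control fails to feather (assumed)
p_brake_fail = 1e-2        # mechanical brake fails given pitch failure (assumed)

sequences = {
    "safe shutdown":            f_high_wind * (1 - p_pitch_fail),
    "brake-protected shutdown": f_high_wind * p_pitch_fail * (1 - p_brake_fail),
    "turbine overspeed damage": f_high_wind * p_pitch_fail * p_brake_fail,
}
for name, freq in sequences.items():
    print(f"{name}: {freq:.2e} per year")
```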

  11. Probabilistic Analysis of Passive Safety System Reliability in Advanced Small Modular Reactors: Methodologies and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia; Grelle, Austin

    2015-06-28

    Many advanced small modular reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize with a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper describes the most promising options: mechanistic techniques, which share qualities with conventional probabilistic methods, and simulation-based techniques, which explicitly account for time-dependent processes. The primary intention of this paper is to describe the strengths and weaknesses of each methodology and to highlight the lessons learned in applying the two techniques, along with high-level results. This includes the global benefits and deficiencies of the methods and the practical problems encountered during the implementation of each technique.

  12. Probabilistic safety analysis and interpretation thereof; Probabilistische Sicherheitsanalysen und wie man sie interpretiert

    Energy Technology Data Exchange (ETDEWEB)

    Steininger, U.; Sacher, H. [TUEV Bau und Betrieb GmbH, Muenchen (Germany). Zentralbereich Risikoanalysen und Komponententechnik

    1999-05-01

    The PSA toolset is increasingly being used in Germany for quantitative safety assessment, for example of reportable events and information notices, and above all of modifications to nuclear plants. The Commission for Nuclear Reactor Safety recommends the regular performance of a PSA on a ten-year cycle. According to the PSA guidance, probabilistic analyses serve to assess the safety level of the entire plant, expressed as the expectation value of the frequency of hazard states. The authors describe the method, procedure and evaluation of probabilistic safety analyses. The limits of probabilistic safety analyses emerge in their practical implementation. Normally the PSA guidance is confined to the safety systems, so that in practice the analyses are at best suitable for operational optimisation to a limited extent. The present restriction of the analyses has a similar effect on the power operation of the plant. This seriously degrades the practical value of these analyses for plant operators. In order to develop PSA further as a supervisory and operational optimisation instrument, the authors consider it appropriate to bring together the specific know-how of analysts, manufacturers, plant operators and experts. (orig.)

  13. Probabilistic Causal Analysis for System Safety Risk Assessments in Commercial Air Transport

    Science.gov (United States)

    Luxhoj, James T.

    2003-01-01

    Aviation is one of the critical modes of our national transportation system. As such, it is essential that new technologies be continually developed to ensure that a safe mode of transportation becomes even safer in the future. The NASA Aviation Safety Program (AvSP) is managing the development of new technologies and interventions aimed at reducing the fatal aviation accident rate by a factor of 5 by year 2007 and by a factor of 10 by year 2022. A portfolio assessment is currently being conducted to determine the projected impact that the new technologies and/or interventions may have on reducing aviation safety system risk. This paper reports on advanced risk analytics that combine the use of a human error taxonomy, probabilistic Bayesian Belief Networks, and case-based scenarios to assess a relative risk intensity metric. A sample case is used for illustrative purposes.

  14. The Probabilistic Safety Analysis during low power and shutdown: a framework to improve safety; El APS a baja potencia en parada, marco para la mejora de la seguridad

    Energy Technology Data Exchange (ETDEWEB)

    Nos, V.

    2014-02-01

    Historically, Probabilistic Safety Analysis (PSA) has focused exclusively on full power operation; nevertheless, operational experience has revealed that events occurring during low power and shutdown can also present threats to the safety of the plant. Although qualitative assessments of shutdown configurations (NUMARC 91-06) have been internationally accepted, the benefits of Low Power and Shutdown PSA have been demonstrated as a fundamental framework of quantitative understanding for improving safety and risk management in these operating conditions of the plant. (Author)

  15. Probabilistic risk analysis in manufacturing situational operation: application of modelling techniques and causal structure to improve safety performance.

    Directory of Open Access Journals (Sweden)

    Jose Cristiano Pereira

    2015-01-01

    Full Text Available The use of probabilistic risk analysis in the jet engine manufacturing process is essential to prevent failure. The objective of this study is to present a probabilistic risk analysis model to analyze the safety of this process. The standard risk assessment normally conducted is inadequate to address the risks. To remedy this problem, the model presented in this paper considers the effects of human, software and calibration reliability in the process. A Bayesian Belief Network coupled to a Bow Tie diagram is used to identify potential engine failure scenarios. In this context, and to meet this objective, an in-depth literature review was conducted to identify the most appropriate modeling techniques, and interviews were conducted with experts. As a result of this study, this paper presents a model that combines fault tree analysis, event tree analysis and Bayesian Belief Networks into a single model that can be used by decision makers to identify critical risk factors in order to allocate resources to improve the safety of the system. The model is delivered in the form of a computer-assisted decision tool supported by subject expert estimates.

  16. Uncertainty and sensitivity analysis in a Probabilistic Safety Analysis level-1; Analisis de incertidumbres y sensibilidad en un APS nivel I

    Energy Technology Data Exchange (ETDEWEB)

    Nunez Mc Leod, Jorge E.; Rivera, Selva S. [Universidad Nacional de Cuyo, Mendoza (Argentina). Facultad de Ingenieria. Centro de Estudios de Ingenieria Asistida por Computadora

    1996-07-01

    A methodology for sensitivity and uncertainty analysis, applicable to a Level 1 Probabilistic Safety Assessment, is presented. The work covers: the correct association of distributions to parameters, the importance and qualification of expert opinions, the generation of samples according to sample sizes, and the study of the relationships among system variables and system responses. A series of statistical-mathematical techniques is recommended along the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author)

  17. Application of Dynamic Probabilistic Safety Assessment Approach for Accident Sequence Precursor Analysis: Case Study for Steam Generator Tube Rupture

    Directory of Open Access Journals (Sweden)

    Hansul Lee

    2017-03-01

    Full Text Available The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis, and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state and the actions of the operator in an accident situation for risk quantification. This approach lends significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional PSA model, the so-called static PSA (S-PSA) model, which is relatively static in comparison. We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective.

  18. Application of dynamic probabilistic safety assessment approach for accident sequence precursor analysis: Case study for steam generator tube rupture

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Han Sul; Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of); Kim, Tae Wan [Incheon National University, Incheon (Korea, Republic of)

    2017-03-15

    The purpose of this research is to introduce the technical standard of accident sequence precursor (ASP) analysis, and to propose a case study using the dynamic probabilistic safety assessment (D-PSA) approach. The D-PSA approach can aid in the determination of high-risk/low-frequency accident scenarios from all potential scenarios. It can also be used to investigate the dynamic interaction between the physical state and the actions of the operator in an accident situation for risk quantification. This approach lends significant potential for safety analysis. Furthermore, the D-PSA approach provides a more realistic risk assessment by minimizing the assumptions used in the conventional PSA model, the so-called static PSA (S-PSA) model, which is relatively static in comparison. We performed risk quantification of a steam generator tube rupture (SGTR) accident using the dynamic event tree (DET) methodology, which is the most widely used methodology in D-PSA. The risk quantification results of D-PSA and S-PSA are compared and evaluated. Suggestions and recommendations for using D-PSA are described in order to provide a technical perspective.
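
    A minimal sketch of the dynamic-event-tree idea, reduced to a toy drain-down model with invented branch times and probabilities; an actual D-PSA couples full thermal-hydraulic simulation to the branching.

```python
# Minimal dynamic-event-tree flavour (illustrative, not the paper's SGTR model):
# a scalar "inventory" drains after a tube rupture; the operator isolates the
# ruptured generator at a stochastic time, branching the scenario tree.

DRAIN_RATE = 1.0      # inventory units per minute while unisolated (assumed)
INVENTORY0 = 100.0    # initial inventory (assumed)
DAMAGE_LIMIT = 20.0   # core uncovery surrogate (assumed)

# Branch points: operator isolates at 30, 60, or 90 minutes, or never
branches = [(30.0, 0.7), (60.0, 0.2), (90.0, 0.08), (float("inf"), 0.02)]

p_damage = 0.0
for t_isolate, p_branch in branches:
    drained = DRAIN_RATE * min(t_isolate, INVENTORY0 / DRAIN_RATE)
    inventory = INVENTORY0 - drained
    outcome = "damage" if inventory <= DAMAGE_LIMIT else "ok"
    if outcome == "damage":
        p_damage += p_branch
    print(f"isolate at {t_isolate:>5} min -> inventory {inventory:6.1f} -> {outcome}")

print(f"P(damage) = {p_damage:.3f}")
```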

  19. Procedure for conducting probabilistic safety assessment: level 1 full power internal event analysis

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dae; Lee, Y. H.; Hwang, M. J. [and others]

    2003-07-01

    This report provides guidance on conducting a Level 1 PSA for internal events in NPPs, based on the method and procedure that were used in the PSA for the design of the Korea Standard Nuclear Plants (KSNPs). The purpose of a Level 1 PSA is to delineate the accident sequences leading to core damage and to estimate their frequencies. It has been directly used for assessing and improving system safety and reliability, and it forms the key foundation of the rest of the PSA. A Level 1 PSA also provides insights into design weaknesses and into ways of preventing core damage, which in most cases is the precursor to major accidents. Level 1 PSA has therefore served as the essential technical basis for risk-informed applications in NPPs. The report covers six major procedural steps for a Level 1 PSA: plant familiarization, initiating event analysis, event tree analysis, system fault tree analysis, reliability data analysis, and accident sequence quantification. The report is intended to assist technical persons performing Level 1 PSAs for NPPs. A particular aim is to promote a standardized framework, terminology and form of documentation for PSAs. The report would also be useful for managers or regulatory personnel involved in risk-informed regulation, and for conducting PSAs in other industries.
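
    A minimal sketch of the final quantification step, with an invented initiating-event frequency, minimal cut sets, and basic-event probabilities, using the rare-event approximation.

```python
import math

# Basic-event probabilities from the reliability data analysis step (assumed)
basic_events = {"pump_A": 3e-3, "pump_B": 3e-3, "dg_1": 5e-2, "ccf_pumps": 1e-4}

# Minimal cut sets of the mitigating-function fault tree (assumed structure)
cut_sets = [("pump_A", "pump_B"), ("ccf_pumps",), ("dg_1", "pump_A")]

f_initiator = 0.1   # loss-of-feedwater frequency per reactor-year (assumed)

# Rare-event approximation: P(top event) ~ sum of cut-set probabilities
p_mitigation_fails = sum(
    math.prod(basic_events[e] for e in cs) for cs in cut_sets
)
cdf = f_initiator * p_mitigation_fails
print(f"Sequence contribution to core damage frequency: {cdf:.2e} per reactor-year")
```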

  20. Probabilistic analysis of safety in industrial irradiation plants; Analisis probabilistico de seguridad en plantas industriales de irradiacion

    Energy Technology Data Exchange (ETDEWEB)

    Alderete, F.; Elechosa, C. [Autoridad Regulatoria Nuclear, Av. del Libertador 8250 - Buenos Aires (Argentina)]. e-mail: falderet@sede.arn.gov.ar

    2006-07-01

    The Argentine Nuclear Regulatory Authority is carrying out the Probabilistic Safety Analysis (PSA) of the two industrial irradiation plants in the country. The objective of this presentation is to show, from the regulatory point of view, the advantages of applying this tool, as well as the difficulties encountered; to this end, a brief description of the facilities, the method and the regulations is given. Both plants are multipurpose facilities classified as 'industrial irradiator category IV' (panoramic irradiator with the source stored in a pool). Basically, the execution of a PSA consists of the following stages: 1. Identification of initiating events. 2. Modeling of accident sequences (event trees). 3. Analysis of systems (fault trees). 4. Quantification of accident sequences. Argentine regulations do not require these facilities to perform a PSA; however, the basic standard of radiological safety establishes that, in the design of this type of facility, in justified cases, it should be ensured that the annual probability of occurrence of an accident sequence, together with the resulting dose to a person, yields a radiological risk below the risk limit adopted as the acceptance criterion. In addition, the design standard for these irradiators demands a maximum failure rate of 10^-2 for components related to the radiological safety systems. In our case, the possible initiating events leading to unwanted situations (exposure of people, radioactive contamination) have been identified. Then, for each of the significant initiating events, the corresponding accident sequences were modeled, and the safety systems intervening in these sequences were analyzed by means of fault trees in order to determine their failure probabilities. At the moment these fault trees are being completed, but the difficulty resides in the impossibility of obtaining real data
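
    A minimal sketch of the acceptance-criteria arithmetic described above; the 10^-2 failure-rate ceiling is the one cited in the abstract, while the sequence frequency, dose, risk limit, and component rates are invented.

```python
# Acceptance-criteria check for one accident sequence (assumed values)
SEQ_FREQUENCY = 2e-4      # per year, from event/fault tree quantification
DOSE_IF_OCCURS = 1.0      # Sv to the exposed person, from consequence analysis
RISK_LIMIT = 1e-4         # Sv per year acceptance criterion (illustrative)

annual_risk = SEQ_FREQUENCY * DOSE_IF_OCCURS
verdict = "acceptable" if annual_risk < RISK_LIMIT else "NOT acceptable"
print(f"Annual risk {annual_risk:.1e} Sv/y: {verdict}")

# Design-standard ceiling on safety-system component failure rates
MAX_RATE = 1e-2
components = {"source_hoist_interlock": 5e-3, "door_interlock": 2e-2}  # per demand
for name, rate in components.items():
    status = "meets standard" if rate <= MAX_RATE else "fails standard"
    print(f"{name}: {rate:.0e} vs ceiling {MAX_RATE:.0e} -> {status}")
```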

  1. Development of a Novel Nuclear Safety Culture Evaluation Method for an Operating Team Using Probabilistic Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Han, Sangmin; Lee, Seung Min; Seong, Poong Hyun [KAIST, Daejeon (Korea, Republic of)

    2015-05-15

    The IAEA defined safety culture as follows: 'Safety culture is that assembly of characteristics and attitudes in organizations and individuals which establishes that, as an overriding priority, nuclear plant safety issues receive the attention warranted by their significance'. The behavioral scientist Cooper, with his internal psychological, situational, and behavioral context model, defined safety culture as 'that observable degree of effort by which all organizational members direct their attention and actions toward improving safety on a daily basis'. Given these various definitions and criteria of safety culture, several safety culture assessment methods have been developed to improve and manage safety culture. To develop a new quantitative safety culture evaluation method for an operating team, we unified and redefined the safety culture assessment items. We then modeled a new safety culture evaluation by adopting the Level 1 PSA concept. Finally, we suggested criteria for obtaining nominal success probabilities of the assessment items by using operational definitions. To validate the suggested evaluation method, we analyzed audio-visual recordings collected from a full-scope main control room simulator of an NPP in Korea.

  2. Probabilistic analysis of the safety margin assured by shear strength models of stirrup reinforced concrete beams

    Directory of Open Access Journals (Sweden)

    Jaskulski Roman

    2017-01-01

    Full Text Available The aim of this study was to assess the safety margin assured by the stirrups of reinforced concrete elements subjected to shear. The safety margin was assessed by means of Monte Carlo simulation, and the impact of steel strength and stirrup spacing on the results was analysed as well. The shear resistance models used were taken from the Polish standards PN-84/B-03264 and PN-EN 1992-1-1:2008. The safety margin, expressed as the logarithm of the probability of not achieving the design value of the ultimate shear capacity, was analysed, as was the impact of different assumptions on the obtained results. An attempt was made to assess the “sensitivity” of the models to changes in the basic parameters of the probability distribution functions of their selected variables.
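
    A minimal sketch of such a Monte Carlo assessment, using a generic Eurocode-style truss expression for stirrup shear capacity with invented statistical parameters for steel strength and as-built stirrup spacing; this is not the paper's exact model.

```python
import random

random.seed(1)

# Eurocode-style stirrup shear capacity: V = (A_sw / s) * z * f_yw * cot(theta)
A_SW = 157.0            # mm^2, two 10 mm stirrup legs (deterministic here)
Z = 450.0               # mm, internal lever arm (deterministic here)
COT_THETA = 1.0         # assumed strut inclination
F_YWD = 435.0           # MPa, design yield strength
S_NOM = 200.0           # mm, nominal stirrup spacing

v_design = (A_SW / S_NOM) * Z * F_YWD * COT_THETA / 1e3   # kN

N, below_design = 100_000, 0
for _ in range(N):
    f_yw = random.gauss(560.0, 30.0)   # actual yield strength, MPa (assumed)
    s = random.gauss(S_NOM, 10.0)      # as-built spacing, mm (assumed)
    v = (A_SW / s) * Z * f_yw * COT_THETA / 1e3
    below_design += v < v_design

print(f"V_design = {v_design:.0f} kN, P(V < V_design) ~ {below_design / N:.1e}")
```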

  3. Scenario Grouping and Classification Methodology for Postprocessing of Data Generated by Integrated Deterministic-Probabilistic Safety Analysis

    Directory of Open Access Journals (Sweden)

    Sergey Galushin

    2015-01-01

    Full Text Available Integrated Deterministic-Probabilistic Safety Assessment (IDPSA) combines a deterministic model of a nuclear power plant with a method for exploration of the uncertainty space. A huge amount of data is generated in the process of such exploration. It is very difficult to “manually” process and extract from such data information that can be used by a decision maker for risk-informed characterization, understanding, and eventually decision making on improvement of system safety and performance. Such understanding requires an approach for interpretation, grouping of similar scenario evolutions, and classification of the principal characteristics of the events that contribute to the risk. In this work, we develop an approach for the classification and characterization of failure domains. The method is based on scenario grouping, clustering, and the application of decision trees for characterization of the influence of the timing and order of events. We demonstrate how the proposed approach is used to classify scenarios that are amenable to treatment with Boolean logic in classical Probabilistic Safety Assessment (PSA) from those where the timing and order of events determine the process evolution and eventual violation of safety criteria. The efficiency of the approach has been verified by application to the SARNET benchmark exercise on the effectiveness of hydrogen management in the containment.

  4. Comparison of the MACCS2 atmospheric transport model with Lagrangian puff models as applied to deterministic and probabilistic safety analysis.

    Science.gov (United States)

    Till, John E; Rood, Arthur S; Garzon, Caroline D; Lagdon, Richard H

    2014-09-01

    The suitability of a new facility in terms of potential impacts from routine and accidental releases is typically evaluated using conservative models and assumptions to assure dose standards are not exceeded. However, overly conservative dose estimates that exceed target doses can result in unnecessary and costly facility design changes. This paper examines one such case involving the U.S. Department of Energy's pretreatment facility of the Waste Treatment and Immobilization Plant (WTP). The MELCOR Accident Consequence Code System Version 2 (MACCS2) was run using conservative parameter values in prescribed guidance to demonstrate that the dose from a postulated airborne release would not exceed the guideline dose of 0.25 Sv. External review of default model parameters identified the deposition velocity of 1.0 cm s-1 as being non-conservative. The deposition velocity calculated using resistance models was in the range of 0.1 to 0.3 cm s-1. A value of 0.1 cm s-1 would result in the dose guideline being exceeded. To test the overall conservatism of the MACCS2 transport model, the 95th percentile hourly average dispersion factor based on one year of meteorological data was compared to dispersion factors generated from two state-of-the-art Lagrangian puff models. The 95th percentile dispersion factor from MACCS2 was a factor of 3 to 6 higher compared to those of the Lagrangian puff models at a distance of 9.3 km and a deposition velocity of 0.1 cm s-1. Thus, the inherent conservatism in MACCS2 more than compensated for the high deposition velocity used in the assessment. Applications of models like MACCS2 with a conservative set of parameters are essentially screening calculations, and failure to meet dose criteria should not trigger facility design changes but prompt a more in-depth analysis using probabilistic methods with a defined margin of safety in the target dose. A sample application of the probabilistic approach is provided.
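
    A minimal sketch of the screening arithmetic involved, chaining a source term, dispersion factor, breathing rate, and dose conversion factor against the 0.25 Sv guideline; every numerical value, including the assumed factor-of-five gap between the dispersion factors, is invented for illustration.

```python
# Screening dose arithmetic (all values illustrative, not from the WTP analysis)
Q = 4.0e13          # Bq released (assumed source term)
BR = 3.3e-4         # m^3/s adult breathing rate
DCF = 5.0e-8        # Sv/Bq inhalation dose conversion factor (assumed nuclide)
GUIDELINE = 0.25    # Sv evaluation guideline cited in the abstract

# Dispersion factors at the receptor: MACCS2-style vs Lagrangian puff
# (assumed values with a factor-of-five gap, in s/m^3)
chi_q = {"MACCS2 95th percentile": 5.0e-4, "Lagrangian puff": 1.0e-4}

for model, x in chi_q.items():
    dose = Q * x * BR * DCF
    status = "exceeds" if dose > GUIDELINE else "meets"
    print(f"{model}: dose = {dose:.3f} Sv ({status} the 0.25 Sv guideline)")
```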

  5. Probabilistic Durability Analysis in Advanced Engineering Design

    Directory of Open Access Journals (Sweden)

    A. Kudzys

    2000-01-01

    Full Text Available The expedience of probabilistic durability concepts and approaches in the advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for the calculation of reliability indices are given. The analysis can be used for the probabilistic durability assessment of load-bearing and enclosure structures of metal, reinforced concrete, wood, plastic and masonry, both homogeneous and sandwich or composite, and of some kinds of equipment. The analysis models can be applied in other engineering fields.

  6. A probabilistic bridge safety evaluation against floods.

    Science.gov (United States)

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a system-level and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experience and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
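
    A minimal sketch of the surrogate-plus-sampling pattern, substituting an ordinary least-squares quadratic response surface for the paper's Bayesian least squares support vector machine; the "expensive model", limit state, and distributions are invented.

```python
import random

random.seed(2)

def expensive_model(x):
    """Stand-in for the hydraulic/structural model: scour depth as f(velocity)."""
    return 0.5 * x + 0.08 * x * x            # hypothetical true response

# 1) A handful of "expensive" runs at design points
xs = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
ys = [expensive_model(x) for x in xs]

def fit_quadratic(xs, ys):
    """Least-squares fit of y ~ a + b*x + c*x^2 via the normal equations."""
    S = [sum(x**k for x in xs) for k in range(5)]
    T = [sum(y * x**k for x, y in zip(xs, ys)) for k in range(3)]
    A = [[S[0], S[1], S[2], T[0]],
         [S[1], S[2], S[3], T[1]],
         [S[2], S[3], S[4], T[2]]]
    for i in range(3):                        # Gauss-Jordan elimination
        pivot = A[i][i]
        A[i] = [v / pivot for v in A[i]]
        for j in range(3):
            if j != i:
                factor = A[j][i]
                A[j] = [vj - factor * vi for vi, vj in zip(A[i], A[j])]
    return [row[3] for row in A]

a, b, c = fit_quadratic(xs, ys)

# 2) Monte Carlo on the cheap surrogate: P(scour depth > limit)
LIMIT = 1.2   # m, assumed safety limit
N = 100_000
fails = sum((a + b * x + c * x * x) > LIMIT
            for x in (random.gauss(1.8, 0.4) for _ in range(N)))
print(f"P(failure) ~ {fails / N:.3f}")
```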

  7. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. Geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries and provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.

  8. Human reliability analysis in low power and shut-down probabilistic safety assessment: Outcomes of an international initiative

    Directory of Open Access Journals (Sweden)

    Manna Giustino

    2012-01-01

    Full Text Available Since the beginning of nuclear power generation, human performance has been a very important factor in all phases of the plant lifecycle: design, commissioning, operation, maintenance, surveillance, modification, and decommissioning. This has been confirmed by operating experience. A workshop on the harmonization of low power and shutdown probabilistic safety assessment for WWER nuclear power plants was organized by the IAEA and the Joint Research Centre of the European Commission. One of the major objectives of the workshop was to provide a comparison of the approaches and results of human reliability analyses for WWER 440 and WWER 1000 plants, and to gain insights for the future application of human reliability analyses in low power and shutdown scenarios. This paper provides the insights and conclusions of the workshop concerning human reliability analyses and human factors.

  9. PROBABILISTIC MODEL FOR AIRPORT RUNWAY SAFETY AREAS

    Directory of Open Access Journals (Sweden)

    Stanislav SZABO

    2017-06-01

    Full Text Available The Laboratory of Aviation Safety and Security at CTU in Prague has recently started a project aimed at runway protection zones. The probability of exceeding a certain distance from the runway in common incident/accident scenarios (take-off/landing overrun/veer-off, landing undershoot) is identified relative to the runway for any airport. As a result, the size and position of safety areas around runways are defined for the chosen probability. The basis for the probability calculation is a probabilistic model using statistics from more than 1,400 real-world cases involving jet airplanes over the last few decades. Other scientific studies have contributed to understanding the issue and supported the model’s application to different conditions.
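
    A minimal sketch of the exceedance-probability idea, fitting an exponential tail to synthetic overrun stop distances; the real model is built from the 1,400+ historical cases, and the distribution choice and all numbers here are assumptions.

```python
import math
import random

random.seed(3)

# Hypothetical overrun stop distances beyond the runway end, in metres
# (synthetic stand-ins for the real-world cases mentioned in the abstract)
distances = [random.expovariate(1 / 120.0) for _ in range(1400)]

# Fit an exponential tail: P(D > d) = exp(-d / mean)
mean_d = sum(distances) / len(distances)

def exceedance(d):
    return math.exp(-d / mean_d)

# Size a runway end safety area for a target exceedance probability
target = 1e-2
d_required = -mean_d * math.log(target)
print(f"P(D > 300 m) = {exceedance(300):.3f}")
print(f"Safety area length for {target:.0e} exceedance: {d_required:.0f} m")
```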

  10. Probabilistic safety assessment for optimum nuclear power plant life management (PLiM) theory and application of reliability analysis methods for major power plant components

    CERN Document Server

    Arkadov, G V; Rodionov, A N

    2012-01-01

    Probabilistic safety assessment methods are used to calculate nuclear power plant durability and resource lifetime. Successful calculation of the reliability and ageing of components is critical for forecasting safety and directing preventative maintenance, and Probabilistic safety assessment for optimum nuclear power plant life management provides a comprehensive review of the theory and application of these methods. Part one reviews probabilistic methods for predicting the reliability of equipment. Following an introduction to key terminology, concepts and definitions, formal-statistical and various physico-statistical approaches are discussed. Approaches based on the use of defect-free models are considered, along with those using binomial distribution and models bas...

  11. Probabilistic safety assessment for the Savannah River Site K reactor

    Energy Technology Data Exchange (ETDEWEB)

    Brandyberry, M.D.; Woody, N.D.; Baker, W.H.; Kearnaghan, D.P.; Wittman, R.S. (Westinghouse Savannah River Co., Aiken, SC (United States))

    1991-01-01

    A probabilistic study of the overall safety of the special materials production reactors located at the US Department of Energy's Savannah River Site (SRS) has been performed. Assessments of the risk that reactor operation poses to the work force at SRS and to the surrounding population are among the results obtained. Safety assessment methodology that has evolved from applications in the commercial nuclear power industry over the past 20 yr, and that has recently been employed in two other major studies, was used for the analysis. The results of the study indicate that risks from severe reactor accidents to individuals in the neighboring populace are within levels that have been found to be acceptable for commercial nuclear power plants. The objectives of the SRS probabilistic safety assessment (PSA) were as follows: (1) to assess the margin of safety of the reactor system design; (2) to calculate risk measures as a means of assessing safety in terms of levels of risk to society; (3) to identify the equipment, human actions, and plant design features that contribute most to the assurance of overall safety by exercising the analytical models that constitute the PSA.

  12. Development of plant-specific data for a probabilistic safety analysis; Desarrollo de datos especificos de planta para un analisis probabilistico de seguridad

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez C, M. [Emersis S.A. de C.V., Tabachines 9-bis, 62589 Temixco, Morelos (Mexico); Nelson E, P. [LAIRN, UNAM, Paseo Cuauhnahuac 8532, Jiutepec, Morelos (Mexico)]. e-mail: cuesta@emersis.com

    2004-07-01

    This work describes the development of plant-specific data for the Probabilistic Safety Analysis (PSA) of the Laguna Verde plant. The description of the methods used concentrates on obtaining the failure rates of equipment and the frequencies of the initiating events modeled in the PSA, mentioning other types of data that also draw on plant-specific sources. The method for obtaining equipment failure rates takes advantage of the information on component failures and system unavailabilities collected in compliance with the Maintenance Rule (10CFR50.65). The method for developing initiator frequencies takes into account the operational experience registered as reportable events. In both cases the plant's own experience is combined with published generic data using Bayesian techniques. Details are provided about the gathering of information and the checks of consistency and adjustment needs, presenting examples of the obtained results. (Author)

  13. Use of the probabilistic safety analysis for the risk monitor before maintenance; Uso del Analisis probabilistico de seguridad para el monitor de riesgo antes de mantenimiento

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez C, M. [Emersis S.A. de C.V., Tabachines 9-bis, 62589 Temixco, Morelos (Mexico)]. e-mail: cuesta@emersis.com

    2004-07-01

    This work presents the use of the Probabilistic Safety Analysis (PSA) of the Laguna Verde plant to quantify risk before maintenance. Beginning with a description of the nature of the Maintenance Rule and its risk evaluations, the role of the PSA for that purpose is discussed, and a systematic way to establish the scope of this use of the model is outlined. The work provides some technical details of the methods for implementing the PSA as a risk monitor, including the way systems, trains and components are presented to the user, as well as the adjustments to the models and the improvements to the platform used. Some of the measures taken to preserve the approved base model, to facilitate periodic updates, and to achieve acceptable execution times for efficient use are also covered. (Author)

  14. Guidance for the definition and application of probabilistic safety criteria

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT Technical Research Centre of Finland (Finland)); Knochenhauer, M. (Scandpower AB (Sweden))

    2011-05-15

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  15. Probabilistic safety assessment in the chemical and nuclear industries

    CERN Document Server

    Fullwood, Ralph R

    2000-01-01

    Probabilistic Safety Analysis (PSA) determines the probability and consequences of accidents, hence, the risk. This subject concerns policy makers, regulators, designers, educators and engineers working to achieve maximum safety with operational efficiency. Risk is analyzed using methods for achieving reliability in the space program. The first major application was to the nuclear power industry, followed by applications to the chemical industry. It has also been applied to space, aviation, defense, ground, and water transportation. This book is unique in its treatment of chemical and nuclear risk. Problems are included at the end of many chapters, and answers are in the back of the book. Computer files are provided (via the internet), containing reliability data, a calculator that determines failure rate and uncertainty based on field experience, pipe break calculator, event tree calculator, FTAP and associated programs for fault tree analysis, and a units conversion code. It contains 540 references and many...

  16. Probabilistic safety analysis of a hydrogen production plant using nuclear energy; Analisis probabilistico de seguridad de una planta de produccion de hidrogeno utilizando energia nuclear

    Energy Technology Data Exchange (ETDEWEB)

    Flores F, A. [Facultad de Ingenieria, UNAM, 04510 Mexico D.F. (Mexico); Nelson E, P.F.; Francois L, J.L. [Facultad de Ingenieria, UNAM, Paseo Cuauhnahuac 8532, Jiutepec, Morelos (Mexico)]. e-mail: alain_fyf@yahoo.com

    2005-07-01

    The present work makes use of Probabilistic Safety Analysis to evaluate and quantify the safety of a hydrogen production plant coupled to a high temperature nuclear reactor of the kind being built in Japan. Available inputs included the description of the systems and devices of the HTTR, the piping and instrumentation diagrams of the plant, and generic failure rates for the plant components. The first step was to carry out a HAZOP (Hazard and Operability) study in order to obtain the initiating events; once these were obtained, an event tree was developed for each initiating event, and a fault tree was developed for each system; the data used to quantify the failure probability of the systems were obtained from several generic sources of information. In each event tree different end states were obtained and, for each one, its frequency of occurrence. The construction and evaluation of the event and fault trees was carried out with the SAPHIRE program. The results show the safety of the shutdown system of the HTTR and allow suggesting modifications to the auxiliary cooling system and to the pressurized helium/water heat exchanger. (Author)

  17. Probabilistic Analysis of Crack Width

    Directory of Open Access Journals (Sweden)

    J. Marková

    2000-01-01

    Full Text Available The probabilistic analysis of the crack width of a reinforced concrete element is based on the formulas accepted in Eurocode 2 and European Model Code 90. The obtained values of the reliability index β seem to be satisfactory for a reinforced concrete slab that fulfils the requirements for the crack width specified in Eurocode 2. However, the reliability of the slab seems to be insufficient when the European Model Code 90 is considered; the reliability index is less than the value of 1.5 recommended for serviceability limit states in Eurocode 1. Analysis of the sensitivity factors of the basic variables makes it possible to identify the variables that significantly affect the total crack width.
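
    A minimal sketch of obtaining a reliability index from a crack-width limit state by simulation, using a grossly simplified random crack-width model with invented parameters rather than the Eurocode 2 or Model Code 90 formulas.

```python
import random
from statistics import NormalDist

random.seed(4)

W_LIM = 0.3     # mm, crack width limit (typical serviceability value)

def crack_width():
    # Grossly simplified random crack-width model (illustrative only):
    # crack spacing times mean strain difference, both treated as random
    s_r = random.gauss(180.0, 25.0)          # crack spacing, mm (assumed)
    eps = random.gauss(1.3e-3, 0.25e-3)      # strain difference (assumed)
    return s_r * eps

N = 200_000
pf = sum(crack_width() > W_LIM for _ in range(N)) / N
beta = -NormalDist().inv_cdf(pf)             # reliability index from P(failure)
print(f"P(w > {W_LIM} mm) = {pf:.4f}, beta = {beta:.2f}")
```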

  18. Development of a Probabilistic Safety Assessment Framework for an Interim Dry Storage Facility Subjected to an Aircraft Crash Using Best-Estimate Structural Analysis

    Directory of Open Access Journals (Sweden)

    Belal Almomani

    2017-03-01

    Full Text Available Using a probabilistic safety assessment, a risk evaluation framework for an aircraft crash into an interim spent fuel storage facility is presented. Damage evaluation of a detailed generic cask model in a simplified building structure under an aircraft impact is discussed through a numerical structural analysis and an analytical fragility assessment. Sequences of the impact scenario are shown in a developed event tree, with uncertainties considered in the impact analysis and failure probabilities calculated. To evaluate the influence of parameters relevant to design safety, risks are estimated for three specification levels of cask and storage facility structures. The proposed assessment procedure includes the determination of the loading parameters, the reference impact scenario, structural response analyses of facility walls, cask containment, and fuel assemblies, and a radiological consequence analysis with dose-risk estimation. The risk results for the proposed scenario in this study are expected to be small relative to those of design basis accidents under best-estimate conservative values. The importance of this framework is seen in its flexibility to evaluate the capability of the facility to withstand an aircraft impact and in its ability to anticipate potential realistic risks; the framework also provides insight into epistemic uncertainty in the available data and into the sensitivity of the design parameters for future research.

  19. Contribution to a probabilistic safety analysis for the dismantling of slender reinforced-concrete structures; Ein Beitrag zur probabilistischen Sicherheitsanalyse von Abbruchvorgaengen turmartiger Bauwerke aus Stahlbeton

    Energy Technology Data Exchange (ETDEWEB)

    Lehnen, D.J.

    1997-12-31

    In the present work, a concept for the probabilistic safety analysis of the dismantling of slender reinforced-concrete structures by toppling in a defined direction of fall is developed. Based on requirements that define a regular dismantling process, models describing characteristic limit states of the building are derived. The combination of these limit states allows rating of the whole process. Uncertainties in the model input are captured by treating the inputs as stochastic variables. Uncertainties in the models themselves are captured by using underestimating and overestimating formulations. With the help of two concluding examples it is shown how the obtained operative probability of failure can be used to make safety considerations more objective. The numerical simulation is based on a Monte Carlo method. (orig.)

  20. Why do probabilistic finite element analysis ?

    CERN Document Server

    Thacker, Ben H

    2008-01-01

    The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.

  1. Methods and data of probabilistic safety analysis for nuclear power plants. Status May 2015; Methoden und Daten zur probabilistischen Sicherheitsanalyse fuer Kernkraftwerke. Stand: Mai 2015

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    2016-09-15

    The supplement to the methodology of probabilistic safety analyses includes modifications, extensions and updates based on recent experience. The chapter on personnel actions has been reorganized and adapted to the state of science and technology; in particular, the identification and evaluation of decision faults is now included. The chapters on floods and earthquakes were revised with respect to current regulatory developments and the new safety requirements. The extended spectrum of PSA methods and data for non-power operation was not revised with respect to the Fukushima experiences. Based on experience with fires during power operation, a new section on fire during non-power operation was included.

  2. Results of the Level 2 probabilistic safety analysis of the CNSNS; Resultados del analisis probabilista de seguridad de nivel 2 de la CNSNS

    Energy Technology Data Exchange (ETDEWEB)

    Lopez M, R.; Godinez S, V. [CNSNS, 03020 Mexico D.F. (Mexico)]. e-mail: rlopezm@cnsns.gob.mx

    2004-07-01

    The National Commission of Nuclear Safety and Safeguards (CNSNS) has concluded the development of its Level 2 Probabilistic Safety Analysis (PSA). The scope of the study considers internal events at full power, and the analysis was developed on the basis of the NUREG-1150 methodology: an Accident Progression Event Tree (APET) was built to analyze the 25 Plant Damage States (PDS) obtained from the CNSNS Level 1 PSA. The APET considers the phenomenology of severe accidents, the performance of mitigation systems and the operator actions that could modify the evolution of a severe accident at the CNLV, as well as the various failure modes of the primary containment, and it identifies the paths for release of radioactive material to the exterior. The conditional probabilities of failure of the primary containment were obtained, and both the time at which the release of radioactive material occurs and the quantity of the released source term were characterized. In addition, representative accident sequences of the various accident types were selected to establish the times and parameters of the accident evolution, and their conditions were simulated by means of the MELCOR computer code. A parametric code of the XSOR type, specific to Laguna Verde, was also developed, with which the source term was estimated for each of the release paths. This work presents the main characteristics and results of the Level 2 PSA developed at the CNSNS and compares them against the model and results of the IPE of the CNLV. (Author)

  3. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  4. Savannah River Site K-Reactor Probabilistic Safety Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Brandyberry, M.D.; Bailey, R.T.; Baker, W.H.; Kearnaghan, D.P.; O'Kula, K.R.; Wittman, R.S.; Woody, N.D. [Westinghouse Savannah River Co., Aiken, SC (United States); Amos, C.N.; Weingardt, J.J. [Science Applications International Corp. (United States)

    1992-12-01

    This report gives the results of a Savannah River Site (SRS) K-Reactor Probabilistic Safety Assessment (PSA). Measures of adverse consequences to health and safety resulting from representations of severe accidents in SRS reactors are presented. In addition, the report gives a summary of the methods employed to represent these accidents and to assess the resultant consequences. The report is issued to provide useful information to the U. S. Department of Energy (DOE) on the risk of operation of SRS reactors, for insights into severe accident phenomena that contribute to this risk, and in support of improved bases for other DOE programs in Heavy Water Reactor safety.

  5. Probabilistic safety goals. Phase 2 - Status report

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E.; Bjoerkman, K.; Rossi, J. (VTT (Finland)); Knochenhauer, M.; Xuhong He; Persson, A.; Gustavsson, H. (Relcon Scandpower AB, Sundbyberg (Sweden))

    2008-07-15

    The second phase of the project, whose outcome is described in this report, has mainly dealt with four issues: 1) consistency in the usage of safety goals; 2) criteria for assessment of results from PSA level 2; 3) an overview of international safety goals and experiences from their use; 4) safety goals related to other man-made risks in society. Consistency in judgement over time has been perceived to be one of the main problems in the usage of safety goals. Safety goals defined in the 1980s were initially met with PSAs performed to the standards of that time, i.e., PSAs that were quite limited in scope and level of detail compared to today's state of the art. This issue was investigated by performing a comparative review of three generations of the same PSA, focusing on the impact of changes over time in component failure data, initiating event (IE) frequency, and modelling of the plant, including plant changes and changes in success criteria. It proved to be very time-consuming and in some cases next to impossible to correctly identify the basic causes of changes in PSA results; a multitude of different sub-causes turned out to be combined and difficult to differentiate. Thus, rigorous book-keeping is needed in order to keep track of how and why PSA results change. This is especially important in order to differentiate 'real' differences due to plant changes and updated component and IE data from differences that are due to general PSA development (scope, level of detail, modelling issues). (au)

  6. A Methodology To Incorporate The Safety Culture Into Probabilistic Safety Assessments

    Energy Technology Data Exchange (ETDEWEB)

    Park, Sunghyun; Kim, Namyeong; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2015-10-15

    In order to incorporate organizational factors into PSA, a methodology needs to be developed. This study introduces such a methodology, using the Analytic Hierarchy Process (AHP) to weigh organizational factors and the Success Likelihood Index Methodology (SLIM) to rate them. Safety issues related to nuclear safety culture have occurred increasingly often, and a quantification tool has to be developed in order to include the organizational factor in Probabilistic Safety Assessments. In this study, the state of the art of organizational evaluation methodologies has been surveyed; the work covers organizational factors, the maintenance process, maintenance process analysis models, and a quantitative methodology based on the AHP and the SLIM. The purpose of this study is to develop a methodology for incorporating safety culture into PSA in order to obtain a more objective estimate of risk than before. The organizational factors comprised in nuclear safety culture might affect the potential risk of human error and hardware failure. The safety culture impact index, which monitors the plant safety culture, can be assessed by applying the developed methodology to a nuclear power plant.

  7. Probabilistic Output Analysis by Program Manipulation

    DEFF Research Database (Denmark)

    Rosendahl, Mads; Kirkeby, Maja Hanne

    2015-01-01

    The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probability...

  8. A Level 1+ Probabilistic Safety Assessment of the High Flux Australian Reactor. Vol 3: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    The third volume of the Probabilistic Safety Assessment contains supporting information for the PSA as follows: Appendix C (continued) with details of the system analysis and reports for the system/top event models; Appendix D with results of the specific engineering analyses of internal initiating events; Appendix E, containing supporting data for the human performance assessment; Appendix F with details of the estimation of the frequency of leaks at HIFAR; and Appendix G, containing the event sequence model and quantification results.

  9. Incorporating psychological influences in probabilistic cost analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kujawski, Edouard; Alvaro, Mariana; Edwards, William

    2004-01-08

    Today's typical probabilistic cost analysis assumes an "ideal" project that is devoid of the human and organizational considerations that heavily influence the success and cost of real-world projects. In the real world "Money Allocated Is Money Spent" (MAIMS principle); cost underruns are rarely available to protect against cost overruns while task overruns are passed on to the total project cost. Realistic cost estimates therefore require a modified probabilistic cost analysis that simultaneously models the cost management strategy including budget allocation. Psychological influences such as overconfidence in assessing uncertainties and dependencies among cost elements and risks are other important considerations that are generally not addressed. It should then be no surprise that actual project costs often exceed the initial estimates, and projects are delivered late and/or with a reduced scope. This paper presents a practical probabilistic cost analysis model that incorporates recent findings in human behavior and judgment under uncertainty, dependencies among cost elements, the MAIMS principle, and project management practices. Uncertain cost elements are elicited from experts using the direct fractile assessment method and fitted with three-parameter Weibull distributions. The full correlation matrix is specified in terms of two parameters that characterize correlations among cost elements in the same and in different subsystems. The analysis is readily implemented using standard Monte Carlo simulation tools such as @Risk and Crystal Ball®. The analysis of a representative design and engineering project substantiates that today's typical probabilistic cost analysis is likely to severely underestimate project cost for probability of success values of importance to contractors and procuring activities. The proposed approach provides a framework for developing a viable cost management strategy for
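
    As a rough sketch of the kind of model described above (three-parameter Weibull cost elements, a correlation structure collapsed here to a single common correlation via a Gaussian copula, and the MAIMS floor on spending), using open tools rather than @Risk or Crystal Ball; all distributions and budgets are invented for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical cost elements, each a three-parameter Weibull
# (shape c, location loc, scale), as if elicited from fractile assessments.
elements = [
    stats.weibull_min(c=1.5, loc=100.0, scale=40.0),
    stats.weibull_min(c=2.0, loc=250.0, scale=80.0),
    stats.weibull_min(c=1.2, loc=60.0,  scale=30.0),
]

# Dependence among elements via a Gaussian copula with a common correlation.
rho = 0.4
corr = np.full((3, 3), rho) + (1 - rho) * np.eye(3)
z = rng.multivariate_normal(np.zeros(3), corr, size=n)
u = stats.norm.cdf(z)                      # correlated uniforms
costs = np.column_stack([d.ppf(u[:, i]) for i, d in enumerate(elements)])

# MAIMS principle: an element never spends less than its allocated budget.
budgets = np.array([d.ppf(0.5) for d in elements])   # budget set at the median
spent = np.maximum(costs, budgets)

total = spent.sum(axis=1)
print(f"mean total cost: {total.mean():.1f}")
print(f"80th percentile: {np.percentile(total, 80):.1f}")
```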

  10. Probabilistic Safety Assessment of Waste from PyroGreen Processes

    Energy Technology Data Exchange (ETDEWEB)

    Ju, Hee Jae; Ham, In hye; Hwang, Il Soon [Seoul National University, Seoul (Korea, Republic of)

    2016-05-15

    The main objective of the PyroGreen processes is decontaminating spent nuclear fuels into intermediate-level waste that meets the U.S. WIPP contact-handled (CH) waste characteristics, in order to achieve long-term radiological safety of waste disposal. In this paper, the radiological impact of PyroGreen waste disposal is probabilistically assessed using domestic input parameters for the safety assessment of disposal. PyroGreen is a pyro-chemical decontamination technology developed by Seoul National University in collaboration with KAERI, Chungnam University, Korea Hydro & Nuclear Power and Yonsei University. The Advanced Korean Reference Disposal System (A-KRS) design for vitrified waste is applied to develop a safety assessment model using the GoldSim software. The simulation results show that PyroGreen vitrified waste is expected to satisfy the regulatory dose limit criterion of 0.1 mSv/yr. With small probability, however, the radiological impact to the public can be higher than the expected value after 2E5 years. Although the result implies a safety margin of a factor of 100 even in that case, further study is needed to assess the sensitivity of the other input parameters which can affect the long-term radiological impact.

  11. PROBABILISTIC SAFETY ASSESSMENT OF OPERATIONAL ACCIDENTS AT THE WASTE ISOLATION PILOT PLANT

    Energy Technology Data Exchange (ETDEWEB)

    Rucker, D.F.

    2000-09-01

    This report presents a probabilistic safety assessment of radioactive doses as consequences from accident scenarios to complement the deterministic assessment presented in the Waste Isolation Pilot Plant (WIPP) Safety Analysis Report (SAR). The International Commission on Radiological Protection (ICRP) recommends that both assessments be conducted to ensure that "an adequate level of safety has been achieved and that no major contributors to risk are overlooked" (ICRP 1993). To that end, the probabilistic assessment for the WIPP accident scenarios addresses the wide range of assumptions, e.g. the range of values representing the radioactive source of an accident, that could possibly have been overlooked by the SAR. Routine releases of radionuclides from the WIPP repository to the environment during the waste emplacement operations are expected to be essentially zero. In contrast, potential accidental releases from postulated accident scenarios during waste handling and emplacement could be substantial, which necessitates radiological air monitoring and confinement barriers (DOE 1999). The WIPP SAR calculated doses from accidental releases to the on-site (at 100 m from the source) and off-site (at the Exclusive Use Boundary and Site Boundary) public by a deterministic approach. This approach, as demonstrated in the SAR, uses single-point values of key parameters to assess the 50-year, whole-body committed effective dose equivalent (CEDE). The basic assumptions used in the SAR to formulate the CEDE are retained for this report's probabilistic assessment. However, for the probabilistic assessment, single-point parameter values were replaced with probability density functions (PDF) and were sampled over an expected range. Monte Carlo simulations were run, in which 10,000 iterations were performed by randomly selecting one value for each parameter and calculating the dose. Statistical information was then derived
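
    A minimal sketch of the substitution described here, with single-point values replaced by assumed PDFs and 10,000 Monte Carlo iterations; every distribution and coefficient below is a hypothetical placeholder, not a WIPP value.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # Monte Carlo iterations, as in the assessment described above

# Hypothetical PDFs standing in for the deterministic point values:
source_ci  = rng.lognormal(mean=np.log(50.0), sigma=0.5, size=n)   # source term (Ci)
release_fr = rng.uniform(1e-4, 1e-2, size=n)                       # airborne release fraction
chi_over_q = rng.lognormal(mean=np.log(1e-4), sigma=0.7, size=n)   # dispersion factor (s/m^3)
dcf        = 3.0e2    # dose conversion factor, rem per Ci inhaled (kept as a point value)
breathing  = 3.3e-4   # breathing rate (m^3/s)

# 50-year whole-body CEDE for each Monte Carlo iteration (rem):
cede = source_ci * release_fr * chi_over_q * breathing * dcf

for q in (50, 95, 99):
    print(f"{q}th percentile CEDE: {np.percentile(cede, q):.2e} rem")
```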

  12. Reachability Analysis of Probabilistic Systems

    DEFF Research Database (Denmark)

    D'Argenio, P. R.; Jeanett, B.; Jensen, Henrik Ejersbo

    2001-01-01

    than the original model, and may safely refute or accept the required property. Otherwise, the abstraction is refined and the process repeated. As the numerical analysis involved in settling the validity of the property is more costly than the refinement process, the method profits from applying...... such numerical analysis on smaller state spaces. The method is significantly enhanced by a number of novel strategies: a strategy for reducing the size of the numerical problems to be analyzed by identification of so-called {essential states}, and heuristic strategies for guiding the refinement process....

  13. Development of the probabilistic exposure modeling in the frame of the radioactive residues final repository long-term safety analysis; Weiterentwicklung der probabilistischen Expositionsmodellierung im Rahmen der Langzeitsicherheitsanalyse von Endlagern fuer radioaktive Reststoffe

    Energy Technology Data Exchange (ETDEWEB)

    Ciecior, Willy

    2017-04-28

    The long-term safety analysis of repositories for radioactive waste is based on modeling the release of nuclides from the waste matrix and their subsequent transport through the near and far field of the repository system to the living part of the environment (biosphere). For the conversion of the nuclide release into a potential hazard (e.g. into an effective dose), a conceptual biosphere model and a mathematical exposure model are used. The parametrization of the mathematical model can be carried out deterministically as well as probabilistically, using distributions and Monte Carlo simulation. To date, however, particularly in the context of probabilistic safety analysis for deep-geological repositories, there is no uniform procedure for deriving the distributions to be used. The distributions used by the analyst are mostly chosen according to personal conviction, often without regard to the underlying nature of the actual model parameter, yet model results are in part very dependent on the type of the selected distributions of the input parameters. Furthermore, few studies are available on the influence of interactions and correlations or other dependencies between the radiological input parameters of the model. Therefore, the impact of different types of distributions (empirical, parametric) for different input parameters, as well as the influence of interactions and correlations between input parameters, on the results of the mathematical exposure modeling was analyzed in the present study. The influence of the type of distribution used to represent the variability of the physical input parameters, as well as of their interactions and dependencies, was identified as less relevant. By means of second-order Monte Carlo simulation, however, the composition of the corresponding samples and the condition of the sample moments used for the construction of parametric distributions were determined to be the essential factors.

  14. The probabilistic safety analysis as seen from the point of view of the Technical Inspectorates TUeV Bavaria and Saxonia; Probabilistische Sicherheitsanalyse aus der Sicht des TUEV Bayern Sachsen

    Energy Technology Data Exchange (ETDEWEB)

    Vinzens, K. [TUEV Bayern Sachsen e.V., Muenchen (Germany); Sacher, H. [TUEV Bayern Sachsen e.V., Muenchen (Germany)

    1994-07-01

    Probabilistic safety analysis (PSA) has developed into a useful tool for the in-depth assessment of the safety of nuclear power plants. PSA methods permit a quantification of engineered plant safety covering all influencing parameters, including in particular the human factors. The goal pursued with PSA is not only a numerical description of plant safety: the systematic approach covering all influencing parameters also makes it possible to reveal design reserves, to uncover weak points linked to certain events, and to minimize the contributions of plant systems to unmanaged events or hazardous states. (orig./HP)

  15. Symbolic Computing in Probabilistic and Stochastic Analysis

    Directory of Open Access Journals (Sweden)

    Kamiński Marcin

    2015-12-01

    Full Text Available The main aim is to present recent developments in applications of symbolic computing in probabilistic and stochastic analysis, and this is done using the example of the well-known MAPLE system. The key theoretical methods discussed are (i) analytical derivations, (ii) the classical Monte-Carlo simulation approach, (iii) the stochastic perturbation technique, as well as (iv) some semi-analytical approaches. It is demonstrated in particular how to engage the basic symbolic tools implemented in any system to derive the basic equations for the stochastic perturbation technique and how to make an efficient implementation of the semi-analytical methods using an automatic differentiation and integration provided by the computer algebra program itself. The second important illustration is probabilistic extension of the finite element and finite difference methods coded in MAPLE, showing how to solve boundary value problems with random parameters in the environment of symbolic computing. The response function method belongs to the third group, where interference of classical deterministic software with the non-linear fitting numerical techniques available in various symbolic environments is displayed. We recover in this context the probabilistic structural response in engineering systems and show how to solve partial differential equations including Gaussian randomness in their coefficients.
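
    A small illustration of the second-order stochastic perturbation derivation mentioned above, using the open-source sympy package in place of MAPLE; the response function g is an arbitrary example.

```python
import sympy as sp

# Response function of a single random parameter X (arbitrary smooth example):
x, mu, sig = sp.symbols("x mu sigma", positive=True)
g = sp.sqrt(x) + sp.sin(x) / x

# Second-order stochastic perturbation about the mean:
#   E[g]   ~ g(mu) + 1/2 * g''(mu) * Var[X]
#   Var[g] ~ (g'(mu))^2 * Var[X]
g1 = sp.diff(g, x)
g2 = sp.diff(g, x, 2)

mean_g = g.subs(x, mu) + sp.Rational(1, 2) * g2.subs(x, mu) * sig**2
var_g = g1.subs(x, mu) ** 2 * sig**2

print("E[g]   ~", sp.simplify(mean_g))
print("Var[g] ~", sp.simplify(var_g))

# Numeric check at mu = 2, sigma = 0.1:
print(sp.N(mean_g.subs({mu: 2, sig: 0.1})), sp.N(var_g.subs({mu: 2, sig: 0.1})))
```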

  16. Quantitative analysis of probabilistic BPMN workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2012-01-01

    We present a framework for modelling and analysis of real-world business workflows. We present a formalised core subset of the Business Process Modelling and Notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations ... of events, reward-based properties and best- and worst-case scenarios. We develop a simple example of a medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover the entire BPMN language, allow...

  17. A combined deterministic and probabilistic procedure for safety assessment of components with cracks - Handbook.

    Energy Technology Data Exchange (ETDEWEB)

    Dillstroem, Peter; Bergman, Mats; Brickstad, Bjoern; Weilin Zang; Sattari-Far, Iradj; Andersson, Peder; Sund, Goeran; Dahlberg, Lars; Nilsson, Fred (Inspecta Technology AB, Stockholm (Sweden))

    2008-07-01

    SSM has supported research work for the further development of a previously developed procedure/handbook (SKI Report 99:49) for the assessment of detected cracks and tolerance analysis of defects. During the operative use of the handbook, needs were identified to update the deterministic part of the procedure, to introduce a new probabilistic flaw evaluation procedure, and to better describe the theoretical basis of the computer program. The principal aim of the project has been to update the deterministic part of the procedure and to introduce a new probabilistic flaw evaluation procedure. Other objectives of the project have been to validate the conservatism of the procedure, to make the procedure well defined and easy to use, and to make the handbook that documents the procedure as complete as possible. The procedure/handbook and the computer program ProSACC, Probabilistic Safety Assessment of Components with Cracks, have been extensively revised within this project. The major differences compared to the last revision are in the following areas: it is now possible to deal with a combination of deterministic and probabilistic data; it is possible to include J-controlled stable crack growth; the appendices on material data to be used for nuclear applications and on residual stresses are revised; a new deterministic safety evaluation system is included; the conservatism in the method for evaluation of the secondary stresses for ductile materials is reduced; and a new geometry, a circular bar with a circumferential surface crack, has been introduced. The results of this project will be of use to SSM in safety assessments of components with cracks and in assessments of the interval between the inspections of components in nuclear power plants.
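
    A toy Monte Carlo version of the kind of probabilistic flaw evaluation ProSACC performs, with failure defined as the applied stress intensity factor reaching the fracture toughness; the distributions and the geometry factor are invented for illustration and are not the handbook's data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical input distributions for a surface-cracked component:
a   = rng.lognormal(np.log(2.0e-3), 0.3, n)   # crack depth (m)
kic = rng.normal(120.0, 15.0, n)              # fracture toughness (MPa*sqrt(m))
s   = rng.normal(150.0, 10.0, n)              # applied stress (MPa)
Y   = 1.12                                    # geometry factor (deterministic here)

# Failure when the applied stress intensity factor reaches the toughness:
k_applied = Y * s * np.sqrt(np.pi * a)
pf = np.mean(k_applied >= kic)
print(f"estimated failure probability: {pf:.2e}")
```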

  18. Incorporating organizational factors into probabilistic safety assessment of nuclear power plants through canonical probabilistic models

    Energy Technology Data Exchange (ETDEWEB)

    Galan, S.F. [Dpto. de Inteligencia Artificial, E.T.S.I. Informatica (UNED), Juan del Rosal, 16, 28040 Madrid (Spain)]. E-mail: seve@dia.uned.es; Mosleh, A. [2100A Marie Mount Hall, Materials and Nuclear Engineering Department, University of Maryland, College Park, MD 20742 (United States)]. E-mail: mosleh@umd.edu; Izquierdo, J.M. [Area de Modelado y Simulacion, Consejo de Seguridad Nuclear, Justo Dorado, 11, 28040 Madrid (Spain)]. E-mail: jmir@csn.es

    2007-08-15

    The ω-factor approach is a method that explicitly incorporates organizational factors into the probabilistic safety assessment of nuclear power plants. Bayesian networks (BNs) are the underlying formalism used in this approach. They have a structural part formed by a graph whose nodes represent organizational variables, and a parametric part that consists of conditional probabilities, each of them quantifying organizational influences between one variable and its parents in the graph. The aim of this paper is twofold. First, we discuss some important limitations of current procedures in the ω-factor approach for either assessing conditional probabilities from experts or estimating them from data. We illustrate the discussion with an example that uses data from Licensee Event Reports of nuclear power plants for the estimation task. Second, we introduce significant improvements in the way BNs for the ω-factor approach can be constructed, so that parameter acquisition becomes easier and more intuitive. The improvements are based on the use of noisy-OR gates as a model of multicausal interaction between each BN node and its parents.
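
    A minimal sketch of the noisy-OR gate mentioned in this record: the conditional probability table of a binary child is generated from one link probability per parent plus an optional leak term (all values hypothetical).

```python
from itertools import product

def noisy_or_cpt(link_probs, leak=0.0):
    """Build the CPT of a binary node under a noisy-OR gate.

    link_probs[i] is the probability that parent i alone (when 'on')
    causes the child; leak covers causes not modelled explicitly.
    """
    n = len(link_probs)
    cpt = {}
    for states in product((0, 1), repeat=n):
        p_not = 1.0 - leak
        for on, p in zip(states, link_probs):
            if on:
                p_not *= 1.0 - p
        cpt[states] = 1.0 - p_not   # P(child = 1 | parent states)
    return cpt

# Two organizational parents with hypothetical link probabilities:
for states, p in noisy_or_cpt([0.3, 0.6], leak=0.05).items():
    print(states, f"{p:.3f}")
```

    The appeal for parameter acquisition is that a node with n parents needs only n link probabilities (plus a leak) instead of a full table over 2^n parent configurations.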

  19. Probabilistic Resource Analysis by Program Transformation

    DEFF Research Database (Denmark)

    Kirkeby, Maja Hanne; Rosendahl, Mads

    2016-01-01

    The aim of a probabilistic resource analysis is to derive a probability distribution of possible resource usage for a program from a probability distribution of its input. We present an automated multi-phase rewriting based method to analyze programs written in a subset of C. It generates a probability distribution of the resource usage as a possibly uncomputable expression and then transforms it into a closed form expression using over-approximations. We present the technique, outline the implementation and show results from experiments with the system.

  20. Probabilistic Analysis of the Quality Calculus

    DEFF Research Database (Denmark)

    Nielson, Hanne Riis; Nielson, Flemming

    2013-01-01

    We consider a fragment of the Quality Calculus, previously introduced for defensive programming of software components such that it becomes natural to plan for default behaviour in case the ideal behaviour fails due to unreliable communication. This paper develops a probabilistically based trust analysis supporting the Quality Calculus. It uses information about the probabilities that expected input will be absent in order to determine the trustworthiness of the data used for controlling the distributed system; the main challenge is to take account of the stochastic dependency between some...

  1. Probabilistic Principal Component Analysis for Metabolomic Data.

    LENUS (Irish Health Repository)

    Nyamundanda, Gift

    2010-11-23

    Background: Data from metabolomic studies are typically complex and high-dimensional. Principal component analysis (PCA) is currently the most widely used statistical technique for analyzing metabolomic data. However, PCA is limited by the fact that it is not based on a statistical model. Results: Here, probabilistic principal component analysis (PPCA), which addresses some of the limitations of PCA, is reviewed and extended. A novel extension of PPCA, called probabilistic principal component and covariates analysis (PPCCA), is introduced, which provides a flexible approach to jointly model metabolomic data and additional covariate information. The use of a mixture of PPCA models for discovering the number of inherent groups in metabolomic data is demonstrated. The jackknife technique is employed to construct confidence intervals for estimated model parameters throughout. The optimal number of principal components is determined through the use of the Bayesian Information Criterion model selection tool, which is modified to address the high dimensionality of the data. Conclusions: The methods presented are illustrated through an application to metabolomic data sets. Jointly modeling metabolomic data and covariates was successfully achieved and has the potential to provide deeper insight into the underlying data structure. Examination of confidence intervals for the model parameters, such as loadings, allows for principled and clear interpretation of the underlying data structure. A software package called MetabolAnalyze, freely available through the R statistical software, has been developed to facilitate implementation of the presented methods in the metabolomics field.
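
    For readers interested in the mechanics rather than the MetabolAnalyze package, the maximum-likelihood PPCA solution has a closed form (Tipping and Bishop, 1999); a compact numpy sketch on synthetic data standing in for a samples-by-metabolites matrix.

```python
import numpy as np

def ppca_ml(X, q):
    """Closed-form maximum-likelihood PPCA (Tipping and Bishop, 1999).

    X: (n, d) data matrix; q: number of latent components.
    Returns loadings W (d, q), isotropic noise variance sigma2, and the mean.
    """
    mu = X.mean(axis=0)
    S = np.cov(X - mu, rowvar=False)
    evals, evecs = np.linalg.eigh(S)
    idx = np.argsort(evals)[::-1]          # sort eigenvalues descending
    evals, evecs = evals[idx], evecs[:, idx]
    sigma2 = evals[q:].mean()              # ML noise variance: discarded eigenvalues
    W = evecs[:, :q] * np.sqrt(np.maximum(evals[:q] - sigma2, 0.0))
    return W, sigma2, mu

# Toy data standing in for a (samples x metabolites) matrix:
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3)) @ rng.normal(size=(3, 20)) + 0.1 * rng.normal(size=(100, 20))
W, sigma2, mu = ppca_ml(X, q=3)
print(W.shape, f"sigma^2 = {sigma2:.4f}")
```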

  2. Development Of Dynamic Probabilistic Safety Assessment: The Accident Dynamic Simulator (ADS) Tool

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Y.H.; Mosleh, A.; Dang, V.N

    2003-03-01

    The development of a dynamic methodology for Probabilistic Safety Assessment (PSA) addresses the complex interactions between the behaviour of technical systems and personnel response in the evolution of accident scenarios. This paper introduces the discrete dynamic event tree, a framework for dynamic PSA, and its implementation in the Accident Dynamic Simulator (ADS) tool. Dynamic event tree tools generate and quantify accident scenarios through coupled simulation models of the plant physical processes, its automatic systems, the equipment reliability, and the human response. The current research on the framework, the ADS tool, and on Human Reliability Analysis issues within dynamic PSA, is discussed. (author)

  3. Probabilistic Seismic Hazard Analysis for Yemen

    Directory of Open Access Journals (Sweden)

    Rakesh Mohindra

    2012-01-01

    Full Text Available A stochastic-event probabilistic seismic hazard model, which can be used further for estimates of seismic loss and seismic risk analysis, has been developed for the territory of Yemen. An updated composite earthquake catalogue has been compiled using the databases from two basic sources and several research publications. The spatial distribution of earthquakes from the catalogue was used to define and characterize the regional earthquake source zones for Yemen. To capture all possible scenarios in the seismic hazard model, a stochastic event set has been created consisting of 15,986 events generated from 1,583 fault segments in the delineated seismic source zones. Distribution of horizontal peak ground acceleration (PGA was calculated for all stochastic events considering epistemic uncertainty in ground-motion modeling using three suitable ground motion-prediction relationships, which were applied with equal weight. The probabilistic seismic hazard maps were created showing PGA and MSK seismic intensity at 10% and 50% probability of exceedance in 50 years, considering local soil site conditions. The resulting PGA for 10% probability of exceedance in 50 years (return period 475 years ranges from 0.2 g to 0.3 g in western Yemen and generally is less than 0.05 g across central and eastern Yemen. The largest contributors to Yemen’s seismic hazard are the events from the West Arabian Shield seismic zone.
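
    A stripped-down sketch of hazard estimation from a stochastic event set, in the spirit of this record: each event carries an annual rate and a site PGA, annual exceedance rates are accumulated, and the 10%-in-50-years value is read off. The event set is randomly generated here, and per-event ground-motion variability is ignored for brevity.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical stochastic event set: annual rate and site PGA (g) per event.
n_events = 15_000
rates = rng.uniform(1e-6, 1e-4, n_events)            # /yr per event
pga   = rng.lognormal(np.log(0.05), 1.0, n_events)   # site ground motion per event

def annual_exceedance(x):
    """Total annual rate of events producing PGA > x at the site."""
    return rates[pga > x].sum()

# Hazard curve and the 475-year (10% in 50 years) design value:
target = -np.log(1 - 0.10) / 50.0                    # ~ 1/475 per year
grid = np.linspace(0.01, 1.0, 500)
lam = np.array([annual_exceedance(x) for x in grid])
design_pga = grid[np.argmin(np.abs(lam - target))]
print(f"PGA at 10% in 50 yr: {design_pga:.2f} g")
```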

  4. Advanced probabilistic risk analysis using RAVEN and RELAP-7

    Energy Technology Data Exchange (ETDEWEB)

    Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Mandelli, Diego [Idaho National Lab. (INL), Idaho Falls, ID (United States); Cogliati, Joshua [Idaho National Lab. (INL), Idaho Falls, ID (United States); Kinoshita, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-06-01

    RAVEN, under the support of the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program [1], is advancing its capability to perform statistical analyses of stochastic dynamic systems. This is aligned with its mission to provide the tools needed by the Risk Informed Safety Margin Characterization (RISMC) path-lead [2] under the Department of Energy (DOE) Light Water Reactor Sustainability program [3]. In particular this task is focused on the synergetic development with the RELAP-7 [4] code to advance the state of the art of the safety analysis of nuclear power plants (NPP). The investigation of the probabilistic evolution of accident scenarios for a complex system such as a nuclear power plant is not a trivial challenge. The complexity of the system to be modeled leads to demanding computational requirements even to simulate one of the many possible evolutions of an accident scenario (tens of CPU-hours). At the same time, the probabilistic analysis requires thousands of runs to investigate outcomes characterized by low probability and severe consequence (tail problem). The milestone reported in June of 2013 [5] described the capability of RAVEN to implement complex control logic and provide adequate support for the exploration of the probabilistic space using a Monte Carlo sampling strategy. Unfortunately the Monte Carlo approach is ineffective for a problem of this complexity. In the following year of development, the RAVEN code has been extended with more sophisticated sampling strategies (grids, Latin Hypercube, and adaptive sampling). This milestone report illustrates the effectiveness of those methodologies in assessing the probability of core damage following the onset of a Station Black Out (SBO) situation in a boiling water reactor (BWR). The first part of the report provides an overview of the available probabilistic analysis capabilities, ranging from the different types of distributions available, possible sampling
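
    Of the sampling strategies listed, Latin Hypercube sampling is the simplest to sketch: each dimension is split into n equal-probability strata, one point is drawn per stratum, and the strata are permuted independently across dimensions. This is a generic sketch, not the RAVEN implementation.

```python
import numpy as np

def latin_hypercube(n, d, rng):
    """n stratified samples in d dimensions: one draw per equal-probability bin."""
    u = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)              # shuffle strata independently per dimension
        u[:, j] = (perm + rng.random(n)) / n   # one uniform draw inside each stratum
    return u

rng = np.random.default_rng(3)
design = latin_hypercube(8, 2, rng)

# Map the uniform design onto a target marginal, e.g. Exponential(rate=0.5),
# through the inverse CDF:
samples = -np.log(1.0 - design) / 0.5
print(design.round(3))
print(samples.round(3))
```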

  5. Probabilistic sensitivity analysis of biochemical reaction systems.

    Science.gov (United States)

    Zhang, Hong-Xuan; Dempsey, William P; Goutsias, John

    2009-09-07

    Sensitivity analysis is an indispensable tool for studying the robustness and fragility properties of biochemical reaction systems as well as for designing optimal approaches for selective perturbation and intervention. Deterministic sensitivity analysis techniques, using derivatives of the system response, have been extensively used in the literature. However, these techniques suffer from several drawbacks, which must be carefully considered before using them in problems of systems biology. We develop here a probabilistic approach to sensitivity analysis of biochemical reaction systems. The proposed technique employs a biophysically derived model for parameter fluctuations and, by using a recently suggested variance-based approach to sensitivity analysis [Saltelli et al., Chem. Rev. (Washington, D.C.) 105, 2811 (2005)], it leads to a powerful sensitivity analysis methodology for biochemical reaction systems. The approach presented in this paper addresses many problems associated with derivative-based sensitivity analysis techniques. Most importantly, it produces thermodynamically consistent sensitivity analysis results, can easily accommodate appreciable parameter variations, and allows for systematic investigation of high-order interaction effects. By employing a computational model of the mitogen-activated protein kinase signaling cascade, we demonstrate that our approach is well suited for sensitivity analysis of biochemical reaction systems and can produce a wealth of information about the sensitivity properties of such systems. The price to be paid, however, is a substantial increase in computational complexity over derivative-based techniques, which must be effectively addressed in order to make the proposed approach to sensitivity analysis more practical.
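
    The variance-based indices referred to here (Saltelli et al.) can be estimated with a pick-freeze scheme; below is a sketch on an invented three-parameter response standing in for a reaction-system model.

```python
import numpy as np

rng = np.random.default_rng(11)
n, k = 100_000, 3

def model(x):
    # Hypothetical response to three rate parameters (stands in for a
    # biochemical reaction-system output):
    return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

A = rng.uniform(0, 1, (n, k))
B = rng.uniform(0, 1, (n, k))
fA, fB = model(A), model(B)
V = np.var(np.concatenate([fA, fB]))

# Pick-freeze estimate of the first-order index S_i (Saltelli et al., 2010):
for i in range(k):
    ABi = A.copy()
    ABi[:, i] = B[:, i]        # freeze column i from B, keep the rest from A
    Si = np.mean(fB * (model(ABi) - fA)) / V
    print(f"S_{i + 1} = {Si:.3f}")
```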

  6. Probabilistic Analysis of Facility Location on Random Shortest Path Metrics

    NARCIS (Netherlands)

    Klootwijk, Stefan; Manthey, Bodo

    The facility location problem is an NP-hard optimization problem. Therefore, approximation algorithms are often used to solve large instances. Probabilistic analysis is a widely used tool to analyze such algorithms. Most research on probabilistic analysis of NP-hard optimization problems involving

  7. Human Reliability in Probabilistic Safety Assessments; Fiabilidad Humana en los Analisis Probabilisticos de Seguridad

    Energy Technology Data Exchange (ETDEWEB)

    Nunez Mendez, J.

    1989-07-01

    Nowadays a growing interest in environmental aspects is detected in our country, implying an assessment of the risk involved in industrial processes and installations in order to determine whether they are within acceptable limits. In these safety assessments, among which the PSA (Probabilistic Safety Assessment) stands out, the role played by the human being in the system is one of the most relevant subjects, a relevance that has been demonstrated by past accidents. However, in Spain there are no manuals specifically dedicated to assessing the human contribution to risk in the frame of PSAs. This report aims to improve this situation by providing: a) a theoretical background to help the reader understand the nature of human error, b) a guide to carrying out a Human Reliability Analysis, and c) a selected overview of the techniques and methodologies currently applied in this area. (Author) 20 refs.

  8. Probabilistic safety assessment of the French PWR 900 MWE series results and insights

    Energy Technology Data Exchange (ETDEWEB)

    Corenwinder, F.; Bertrand, V.; Dupuy, P.; Gomane, C.; Mattei, J.M.; Pichereau, F. [CEA Fontenay aux Roses, Institut de Radioprotection et de Surete Nucleaire, IRSN, 92 (France)

    2002-06-01

    The Institute for Radiological Protection and Nuclear Safety (IRSN) has been updating the Probabilistic Safety Assessment of the standard French 900 MWe PWR in order to include the design and operation modifications introduced since the first version of this study, published by IRSN in 1990. Moreover, the update takes into account new data and new knowledge; for these reasons many sequences have been reassessed and new sequences have been identified. The update has highlighted significant accident sequences (cold overpressurization, interfacing LOCA, loss of ventilation systems and inadvertent dilution) which were not included, or not analyzed in detail, in the first PSA version. Moreover, the update has allowed the verification of the design and operational changes implemented to decrease the frequencies of the dominant accident sequences identified by the 1990 PSA study. The purpose of the paper is to describe the dominant accident sequences and to present the main significant points of the analysis, the main insights and the conclusions. (authors)

  9. Biological sequence analysis: probabilistic models of proteins and nucleic acids

    National Research Council Canada - National Science Library

    Durbin, Richard

    1998-01-01

    ... analysis methods are now based on principles of probabilistic modelling. Examples of such methods include the use of probabilistically derived score matrices to determine the significance of sequence alignments, the use of hidden Markov models as the basis for profile searches to identify distant members of sequence families, and the inference...

  10. Implementation of probabilistic risk estimation for VRU safety

    NARCIS (Netherlands)

    Nunen, E. van; Broek, T.H.A. van den; Kwakkernaat, M.R.J.A.E.; Kotiadis, D.

    2011-01-01

    This paper describes the design, implementation and results of a novel probabilistic collision warning system. To obtain reliable results for risk estimation, preprocessing sensor data is essential. The work described herein presents all the necessary preprocessing steps such as filtering, sensor

  11. A Level 1+ Probabilistic Safety Assessment of the High Flux Australian Reactor. Vol 1

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1998-01-01

    The Department of Industry, Science and Tourism selected PLG, an EQE International Company, to systematically and independently evaluate the safety of the High Flux Australian Reactor (HIFAR), located at Lucas Heights, New South Wales. PLG performed a comprehensive probabilistic safety assessment (PSA) to quantify the risks posed by operation of HIFAR. The PSA identified possible accident scenarios, estimated their likelihood of occurrence, and assigned each scenario to a consequence category; i.e., end state. The accident scenarios developed included the possible release of radioactive material from irradiated nuclear fuel and of tritium releases from reactor coolant. The study team developed a recommended set of safety criteria against which the results of the PSA may be judged. HIFAR was found to exceed one of the two primary safety objectives and two of the five secondary safety objectives. Reactor coolant leaks, earthquakes, and coolant pump trips were the accident initiators that contributed most to scenarios that could result in fuel overheating. Scenarios initiated by earthquakes were the reason the frequency criterion for the one primary safety objective was exceeded. Overall, the plant safety status has been shown to be generally good with no evidence of major safety-related problems from its operation. One design deficiency associated with the emergency core cooling system was identified that should be corrected as soon as possible. Additionally, several analytical issues have been identified that should be investigated further. The results from these additional investigations should be used to determine whether additional plant and procedural changes are required, or if further evaluations of postulated severe accidents are warranted. Supporting information can be found in Appendix A for the seismic analysis and in Appendix B for selected other external events. refs., 139 tabs., 85 figs. Prepared for Department of Industry, Science and Tourism

  12. A Probabilistic Analysis of the Sacco and Vanzetti Evidence

    CERN Document Server

    Kadane, Joseph B

    2011-01-01

    A Probabilistic Analysis of the Sacco and Vanzetti Evidence is a Bayesian analysis of the trial and post-trial evidence in the Sacco and Vanzetti case, based on subjectively determined probabilities and assumed relationships among evidential events. It applies the ideas of charting evidence and probabilistic assessment to this case, which is perhaps the ranking cause celebre in all of American legal history. Modern computation methods applied to inference networks are used to show how the inferential force of evidence in a complicated case can be graded. The authors employ probabilistic assess

  13. Probabilistic evaluation of scenarios in long-term safety analyses. Results of the project ISIBEL; Probabilistische Bewertung von Szenarien in Langzeitsicherheitsanalysen. Ergebnisse des Vorhabens ISIBEL

    Energy Technology Data Exchange (ETDEWEB)

    Buhmann, Dieter; Becker, Dirk-Alexander; Laggiard, Eduardo; Ruebel, Andre; Spiessl, Sabine; Wolf, Jens

    2016-07-15

    In the frame of the project ISIBEL, deterministic analyses of the radiological consequences of several possible developments of the final repository were performed (VSG: preliminary safety analysis of the Gorleben site). The report describes the probabilistic evaluation of the VSG scenarios using uncertainty and sensitivity analyses. It is shown that probabilistic analyses are important for evaluating the influence of uncertainties. The transfer of the selected scenarios into computational cases and the modeling parameters used are discussed.

  14. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using the limited information obtained from a specific site. The book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in the practical implementation of these approaches. In addition, it develops efficient Monte Carlo simulation approaches for slope stability analysis and implements these approaches in a commonly available spreadsheet environment. These approaches and the software package are readily available to geotechnical practitioners and relieve them of the reliability computation algorithms. Readers will find useful information enabling a non-specialist to determine project-specific statistics of geotechnical properties and to perform probabilistic analyses of slope stability.

  15. Financial Markets Analysis by Probabilistic Fuzzy Modelling

    NARCIS (Netherlands)

    J.H. van den Berg (Jan); W.-M. van den Bergh (Willem-Max); U. Kaymak (Uzay)

    2003-01-01

    For successful trading in financial markets, it is important to develop financial models where one can identify different states of the market for modifying one's actions. In this paper, we propose to use probabilistic fuzzy systems for this purpose. We concentrate on Takagi-Sugeno

  16. Landslide susceptibility analysis using Probabilistic Certainty Factor ...

    Indian Academy of Sciences (India)

    This paper reports the use of a GIS-based Probabilistic Certainty Factor method to assess the geo-environmental factors that contribute to landslide susceptibility in the Tevankarai Ar sub-watershed, Kodaikkanal. Landslide occurrences are a common phenomenon in the Tevankarai Ar sub-watershed, Kodaikkanal, owing to ...

  17. Scalable group level probabilistic sparse factor analysis

    DEFF Research Database (Denmark)

    Hinrich, Jesper Løve; Nielsen, Søren Føns Vind; Riis, Nicolai Andre Brogaard

    2017-01-01

    shows that noise is reduced in areas typically associated with activation by the experimental design. The psFA model identifies sparse components and the probabilistic setting provides a natural way to handle parameter uncertainties. The variational Bayesian framework easily extends to more complex...

  18. Why is Probabilistic Seismic Hazard Analysis (PSHA) still used?

    Science.gov (United States)

    Mulargia, Francesco; Stark, Philip B.; Geller, Robert J.

    2017-03-01

    Even though it has never been validated by objective testing, Probabilistic Seismic Hazard Analysis (PSHA) has been widely used for almost 50 years by governments and industry in applications with lives and property hanging in the balance, such as deciding safety criteria for nuclear power plants, making official national hazard maps, developing building code requirements, and determining earthquake insurance rates. PSHA rests on assumptions now known to conflict with earthquake physics; many damaging earthquakes, including the 1988 Spitak, Armenia, event and the 2011 Tohoku, Japan, event, have occurred in regions rated relatively low-risk by PSHA hazard maps. No extant method, including PSHA, produces reliable estimates of seismic hazard. Earthquake hazard mitigation should be recognized to be inherently political, involving a tradeoff between uncertain costs and uncertain risks. Earthquake scientists, engineers, and risk managers can make important contributions to the hard problem of allocating limited resources wisely, but government officials and stakeholders must take responsibility for the risks of accidents due to natural events that exceed the adopted safety criteria.

  19. Use of the t-distribution to construct seismic hazard curves for seismic probabilistic safety assessments

    Energy Technology Data Exchange (ETDEWEB)

    Yee, Eric [KEPCO International Nuclear Graduate School, Dept. of Nuclear Power Plant Engineering, Ulsan (Korea, Republic of)

    2017-03-15

    Seismic probabilistic safety assessments are used to help understand the impact potential seismic events can have on the operation of a nuclear power plant. An important component of a seismic probabilistic safety assessment is the seismic hazard curve, which shows the frequency of seismic events. However, these hazard curves are estimated assuming a normal distribution of the seismic events, which may not be a strong assumption given the number of recorded events at each source-to-site distance. The use of a normal distribution makes the calculations significantly easier but may underestimate or overestimate the rarer events, which are of concern to nuclear power plants. This paper presents a preliminary exploration of the effect of using a distribution that perhaps better represents the distribution of events, such as the t-distribution. The integration of a probability distribution with potentially larger tails basically pushes the hazard curves outward, suggesting a different range of frequencies for use in seismic probabilistic safety assessments. The use of a more realistic distribution therefore results in an increase in the frequency calculations, suggesting that rare events are less rare than previously thought in terms of seismic probabilistic safety assessment. However, the opposite was observed with the ground motion prediction equation considered.
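
    The tail effect discussed in this record can be seen directly by comparing exceedance probabilities of a normal and a Student-t residual model at the same standardized level; sigma and the degrees of freedom below are arbitrary choices.

```python
import numpy as np
from scipy import stats

# Residuals of ground motion about a prediction model, in log units:
sigma = 0.6
x = 2.5 * sigma          # a rare, large ground motion (2.5 sigma above the median)

p_normal = stats.norm.sf(x, scale=sigma)
for df in (5, 10, 30):
    p_t = stats.t.sf(x / sigma, df)      # standardized t with heavier tails
    print(f"df={df:2d}: P(exceed) = {p_t:.4f}  vs normal {p_normal:.4f} "
          f"(ratio {p_t / p_normal:.2f})")
```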

  20. Deterministic and Probabilistic Analysis of NPP Communication Bridge Resistance Due to Extreme Loads

    Directory of Open Access Journals (Sweden)

    Králik Juraj

    2014-12-01

    Full Text Available This paper presents experiences from the deterministic and probabilistic analysis of the reliability of a communication bridge structure resistant to extreme loads: wind and earthquake. The efficiency of the bracing systems is considered using the example of the steel bridge between two NPP buildings. The advantages and disadvantages of the deterministic and probabilistic analyses of structural resistance are discussed, and the advantages of utilizing the LHS method to analyze the safety and reliability of structures are presented.

  1. Assessing risk: the role of probabilistic risk assessment (PRA) in patient safety improvement.

    Science.gov (United States)

    Wreathall, J; Nemeth, C

    2004-06-01

    Morbidity and mortality due to "medical errors" compel better understanding of health care as a system. Probabilistic risk assessment (PRA) has been used to assess the designs of high hazard, low risk systems such as commercial nuclear power plants and chemical manufacturing plants and is now being studied for its potential in the improvement of patient safety. PRA examines events that contribute to adverse outcomes through the use of event tree analysis and determines the likelihood of event occurrence through fault tree analysis. It complements tools already in use in patient safety such as failure modes and effects analyses (FMEAs) and root cause analyses (RCAs). PRA improves on RCA by taking account of the more complex causal interrelationships that are typical in health care. It also enables the analyst to examine potential solution effectiveness by direct graphical representations. However, PRA simplifies real world complexity by forcing binary conditions on events, and it lacks adequate probability data (although recent developments help to overcome these limitations). Its reliance on expert assessment calls for deep domain knowledge which has to come from research performed at the "sharp end" of acute care.

  2. Reliability data update using condition monitoring and prognostics in probabilistic safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeon Min; Lee, Sang Hwan; Park, Jun Seok; Kim, Hyung Dae; Chang, Yoon Suk; Heo, Gyun Young [Dept. of Nuclear Engineering, Kyung Hee University, Yongin (Korea, Republic of)

    2015-03-15

    Probabilistic safety assessment (PSA) has had a significant role in quantitative decision making by finding design and operational vulnerabilities and evaluating cost-benefit in improving such weak points. In particular, it has been widely used as the core methodology for risk-informed applications (RIAs). Even though the nature of PSA seeks realistic results, there are still 'conservative' aspects. One of the sources of this conservatism is the assumptions of safety analysis and the estimation of failure frequency. Surveillance, diagnosis, and prognosis (SDP), utilizing massive databases and information technology, is worth highlighting in terms of its capability for alleviating the conservatism in conventional PSA. This article provides enabling techniques to solidify a method to provide time- and condition-dependent risks by integrating a conventional PSA model with condition monitoring and prognostics techniques. We discuss how to integrate the results with the frequency of initiating events (IEs) and the probability of basic events (BEs). Two illustrative examples are introduced: (1) how the failure probability of a passive system can be evaluated under different plant conditions, and (2) how the IE frequency for a steam generator tube rupture (SGTR) can be updated in terms of operating time. We expect that the proposed model can serve as an annunciator to show the variation of core damage frequency (CDF) depending on operational conditions.
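
    One standard way to realize the reliability-data update described here is a conjugate gamma-Poisson model, in which monitoring evidence (events observed over operating time) updates an initiating-event frequency; the prior and the evidence below are hypothetical, not the paper's values.

```python
from scipy import stats

# Gamma prior on an SGTR initiating-event frequency (per reactor-year),
# hypothetical values: Gamma(alpha, beta) with mean alpha / beta.
alpha0, beta0 = 0.5, 100.0

# Operating evidence accumulated by monitoring: r events in T reactor-years.
r, T = 0, 40.0

alpha, beta = alpha0 + r, beta0 + T   # conjugate gamma-Poisson update
post = stats.gamma(alpha, scale=1.0 / beta)
print(f"posterior mean:   {post.mean():.2e} /yr")
print(f"95th percentile:  {post.ppf(0.95):.2e} /yr")
```

    Here zero observed events over 40 reactor-years pull both the mean and the upper percentile of the frequency downward, which is exactly the time-dependent refinement the record argues for.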

  3. Wind power in Mexico: simulation of a wind farm and application of probabilistic safety analysis; La energia del viento en Mexico: Simulacion de un parque eolico y aplicacion de analisis probabilistica de seguridad

    Energy Technology Data Exchange (ETDEWEB)

    Martin del Campo Marquez, C.; Nelson Edestein, P.F.; Garcia Vazquez, M.A. [Facultad de Ingenieria, Universidad Nacional Autonoma de Mexico (Mexico)]. E-mail: cecilia.martin.del.campo@gmail.com; pnelson_007@yahoo.com; maiki27@yahoo.com

    2009-10-15

    The most important aspects of wind energy in Mexico, including the potential for generating electricity and the major projects planned, are presented here. In particular, the generation costs are compared to those of other energy sources. The results of the simulation, with the WindPro program, of a 100 MW wind farm in the Tehuantepec Isthmus are also presented, and the environmental impacts related to the wind farm in the mentioned zone are analyzed. Finally, some benefits of using Probabilistic Safety Analysis to evaluate the risks associated with events that can occur in wind parks are discussed; such analyses are especially useful for the design and maintenance of the parks and of the wind turbines themselves. In particular, an event tree was developed to analyze possible accident sequences that could occur when the wind speed is too great, and fault trees were developed for each mitigating system considered, in order to determine the relative importance of the wind generator components to the failure sequences, to evaluate the benefit of suggested improvements, and to optimize maintenance programs.

  4. Methodology of containment response analysis for the Probabilistic Safety Assessment -PSA of the CAREM-25 nuclear power plant; Metodologia de analisis de la respuesta de contencion para el APS en la Central CAREM-25

    Energy Technology Data Exchange (ETDEWEB)

    Baron, Jorge [Universidad Nacional de Cuyo, Mendoza (Argentina). Facultad de Ingenieria. Centro de Estudios de Ingenieria Asistida por Computadora

    1996-07-01

    This work is part of the Probabilistic Safety Assessment currently under development for the CAREM-25 Nuclear Power Station, and starts from the accident sequences already obtained and quantified by the Event Tree/Fault Tree techniques. First, the potential containment failure modes for nuclear stations are listed, based on experience. Then the design peculiarities of CAREM-25 are analyzed for their possible influence on containment behavior during severe accidents. Plant Damage States are then defined. Furthermore, Containment Damage States are also defined, and Containment Event Trees are built for each Plant Damage State. Those sequences considered representative by annual probability (those which equal or exceed a probability of 1E-09 per year) are used to quantify the combinations of Plant Damage States/Containment Damage States, based on the estimation of a Vulnerability Matrix. (author)

  5. An overview of engineering concepts and current design algorithms for probabilistic structural analysis

    Science.gov (United States)

    Duffy, S. F.; Hu, J.; Hopkins, D. A.

    1995-01-01

    The article begins by examining the fundamentals of traditional deterministic design philosophy. The initial section outlines the concepts of failure criteria and limit state functions, two traditional notions that are embedded in deterministic design philosophy. This is followed by a discussion regarding safety factors (a possible limit state function) and the common utilization of statistical concepts in deterministic engineering design approaches. Next, the fundamental aspects of a probabilistic failure analysis are explored, and it is shown that the deterministic design concepts mentioned in the initial portion of the article are embedded in probabilistic design methods. For components fabricated from ceramic materials (and other similarly brittle materials), the probabilistic design approach yields the widely used Weibull analysis after suitable assumptions are incorporated. The authors point out that Weibull analysis provides the rare instance where closed-form solutions are available for a probabilistic failure analysis. Since numerical methods are usually required to evaluate component reliabilities, a section on Monte Carlo methods is included to introduce the concept. The article concludes with a presentation of the technical aspects that support the numerical method known as fast probability integration (FPI). This includes a discussion of the Hasofer-Lind and Rackwitz-Fiessler approximations.
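    The Weibull closed form mentioned above is simple enough to verify against a brute-force Monte Carlo estimate; in this sketch the strength parameters and applied stress are arbitrary illustrative values:

```python
# Closed-form Weibull failure probability cross-checked by Monte Carlo.
import numpy as np

m, sigma0 = 10.0, 400.0   # Weibull modulus and characteristic strength (MPa)
applied = 300.0           # applied stress (MPa)

# Closed form: Pf = 1 - exp[-(sigma/sigma0)^m]
pf_closed = 1.0 - np.exp(-((applied / sigma0) ** m))

# Monte Carlo: sample strengths, count failures (strength < applied stress)
rng = np.random.default_rng(0)
strengths = sigma0 * rng.weibull(m, size=1_000_000)
pf_mc = np.mean(strengths < applied)

print(pf_closed, pf_mc)  # the two estimates should agree closely
```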

  6. A Probabilistic Analysis Framework for Malicious Insider Threats

    DEFF Research Database (Denmark)

    Chen, Taolue; Kammuller, Florian; Nemli, Ibrahim

    2015-01-01

    Malicious insider threats are difficult to detect and to mitigate. Many approaches for explaining behaviour exist, but there is little work to relate them to formal approaches to insider threat detection. In this work we present a general formal framework to perform analysis for malicious insider threats, based on probabilistic modelling, verification, and synthesis techniques. The framework first identifies insiders’ intention to perform an inside attack, using Bayesian networks, and in a second phase computes the probability of success for an inside attack by this actor, using probabilistic...

  7. Site-specific Probabilistic Analysis of DCGLs Using RESRAD Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeongju; Yoon, Suk Bon; Sohn, Wook [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    In general, DCGLs can be conservative (screening DCGLs) if they do not take into account site-specific factors. Use of such conservative DCGLs can lead to additional remediation that would not be required if the effort were made to develop site-specific DCGLs. Therefore, the objective of this work is to provide an example of the use of RESRAD 6.0 probabilistic (site-specific) dose analysis for comparison with the screening DCGL. Site release regulations state that a site will be considered acceptable for unrestricted use if the residual radioactivity that is distinguishable from background radiation results in a Total Effective Dose Equivalent (TEDE) to an average member of the critical group of less than the site release criterion, for example 0.25 mSv per year in the U.S. Utilities use computer dose-modeling codes to establish an acceptable level of contamination, the derived concentration guideline level (DCGL), that will meet this regulatory limit. Since the DCGL value is the principal measure of residual radioactivity, it is critical to understand the technical basis of these dose-modeling codes. The objective of this work was to provide an example of nuclear power plant decommissioning dose analysis in a probabilistic analysis framework, focusing on the demonstration of regulatory compliance for surface soil contamination using the RESRAD 6.0 code. Both the screening and the site-specific probabilistic dose analysis methodologies were examined. Example analyses performed with the screening probabilistic dose analysis confirmed the conservatism of the NRC screening values and indicated the effectiveness of probabilistic dose analysis in reducing the conservatism in DCGL derivation.

  8. Implementation of a risk assessment tool based on a probabilistic safety assessment developed for radiotherapy practices

    Energy Technology Data Exchange (ETDEWEB)

    Paz, A.; Godinez, V.; Lopez, R., E-mail: abpaz@cnsns.gob.m [Comision Nacional de Seguridad Nuclear y Salvaguardias, Dr. Barragan No. 779, Col. Narvarte, 03020 Mexico D. F. (Mexico)

    2010-10-15

    The present work describes the implementation process and main results of the risk assessment of radiotherapy practices with linear accelerators (Linac), with cobalt-60, and with brachytherapy. These evaluations were made with the risk assessment tool for radiotherapy practices SEVRRA (risk evaluation system for radiotherapy), developed at the Mexican National Commission on Nuclear Safety and Safeguards from the outcome of the Probabilistic Safety Analysis developed at the Ibero-American Regulators Forum for these radiotherapy facilities. The methodology is supported by the risk matrices method, a mathematical tool that estimates the risk to the patient, radiation workers and the public from mechanical failures, miscalibration of the devices, human mistakes, and so on. The initiating events are defined as those undesirable events that, together with other failures, can produce delivery of an overdose or an underdose of the medically prescribed dose to the planned target volume, or a significant dose to organs outside the prescription. The frequency of initiating events and of frequency reducers (actions intended to avoid the accident) is estimated, as well as the robustness of barriers, such as mechanical switches, which detect and prevent the accident from occurring. The spectrum of consequences is parameterized, and the actions performed to reduce the consequences are identified. Based on this analysis, a software tool was developed to simplify the evaluation of radiotherapy installations, and it has been applied to some Mexican installations as a first step in a national implementation process; the final goal is the evaluation of all Mexican facilities in the near future. The main targets and benefits of the SEVRRA implementation are presented in this paper. (Author)
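    A minimal sketch of a risk-matrix evaluation in the spirit described above; the class labels, the matrix entries, and the one-class-per-barrier frequency reduction rule are illustrative assumptions, not the actual SEVRRA implementation:

```python
# Hedged sketch of the risk-matrix method with frequency-reducing barriers.
FREQ_CLASSES = ["very_low", "low", "medium", "high"]
CONS_CLASSES = ["minor", "serious", "very_serious", "catastrophic"]

# Risk level indexed by [frequency class][consequence class] (placeholder matrix):
MATRIX = [
    ["low",    "low",    "medium", "medium"],
    ["low",    "medium", "medium", "high"],
    ["medium", "medium", "high",   "high"],
    ["medium", "high",   "high",   "very_high"],
]

def risk(freq_class, cons_class, barriers):
    """Each robust barrier credited lowers the effective frequency one class."""
    i = max(FREQ_CLASSES.index(freq_class) - barriers, 0)
    return MATRIX[i][CONS_CLASSES.index(cons_class)]

print(risk("high", "catastrophic", barriers=2))  # -> "high"
```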

  9. Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models

    Energy Technology Data Exchange (ETDEWEB)

    Cetiner, Mustafa Sacit [ORNL]; Flanagan, George F. [ORNL]; Poore III, Willis P. [ORNL]; Muhlheim, Michael David [ORNL]

    2014-07-30

    An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster-than-real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C++, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.

  10. Probabilistic Safety Goals. Phase 1 Status and Experiences in Sweden and Finland

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, Jan-Erik (VTT, FI-02044 VTT (Finland)); Knochenhauer, Michael (Relcon Scandpower AB, SE-172 25 Sundbyberg (Sweden))

    2007-02-15

    The outcome of a probabilistic safety assessment (PSA) for a nuclear power plant is a combination of qualitative and quantitative results. Quantitative results are typically presented as the Core Damage Frequency (CDF) and as the frequency of an unacceptable radioactive release. In order to judge the acceptability of PSA results, criteria for the interpretation of results and the assessment of their acceptability need to be defined. Ultimately, the goals are intended to define an acceptable level of risk from the operation of a nuclear facility. However, safety goals usually have a dual function, i.e., they define an acceptable safety level, but they also have a wider and more general use as decision criteria. The exact levels of the safety goals differ between organisations and between different countries. There are also differences in the definition of the safety goal, and in the formal status of the goals, i.e., whether they are mandatory or not. In this first phase of the project, the aim has been to provide a clear description of the issue of probabilistic safety goals for nuclear power plants, to define and describe important concepts related to the definition and application of safety goals, and to describe experiences in Finland and Sweden. Based on a series of interviews and on literature reviews as well as on a limited international overview, the project has described the history and current status of safety goals in Sweden and Finland, and elaborated on a number of issues, including the following: The status of the safety goals in view of the fact that they have been exceeded for much of the time they have been in use, as well as the possible implications of these exceedances. Safety goals as informal or mandatory limits. Strategies for handling violations of safety goals, including various graded approaches, such as ALARP (As Low As Reasonably Practicable). Relation between safety goals defined on different levels, e.g., for core damage and for

  11. Probabilistic safety goals. Phase 1 - Status and experiences in Sweden and Finland

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.E. [VTT (Finland); Knochenhauer, M. [Relcon Scandpower AB (Sweden)

    2007-03-15

    The outcome of a probabilistic safety assessment (PSA) for a nuclear power plant is a combination of qualitative and quantitative results. Quantitative results are typically presented as the Core Damage Frequency (CDF) and as the frequency of an unacceptable radioactive release. In order to judge the acceptability of PSA results, criteria for the interpretation of results and the assessment of their acceptability need to be defined. Ultimately, the goals are intended to define an acceptable level of risk from the operation of a nuclear facility. However, safety goals usually have a dual function, i.e., they define an acceptable safety level, but they also have a wider and more general use as decision criteria. The exact levels of the safety goals differ between organisations and between different countries. There are also differences in the definition of the safety goal, and in the formal status of the goals, i.e., whether they are mandatory or not. In this first phase of the project, the aim has been to provide a clear description of the issue of probabilistic safety goals for nuclear power plants, to define and describe important concepts related to the definition and application of safety goals, and to describe experiences in Finland and Sweden. Based on a series of interviews and on literature reviews as well as on a limited international overview, the project has described the history and current status of safety goals in Sweden and Finland, and elaborated on a number of issues, including the following: 1) The status of the safety goals in view of the fact that they have been exceeded for much of the time they have been in use, as well as the possible implications of these exceedances. 2) Safety goals as informal or mandatory limits. 3) Strategies for handling violations of safety goals, including various graded approaches, such as ALARP (As Low As Reasonably Practicable). 4) Relation between safety goals defined on different levels, e.g., for core damage

  12. Probabilistic Slow Features for Behavior Analysis

    NARCIS (Netherlands)

    Zafeiriou, Lazaros; Nicolaou, Mihalis A.; Zafeiriou, Stefanos; Nikitidis, Symeon; Pantic, Maja

    A recently introduced latent feature learning technique for time-varying dynamic phenomena analysis is the so-called slow feature analysis (SFA). SFA is a deterministic component analysis technique for multidimensional sequences that, by minimizing the variance of the first-order time derivative

  13. Tools for voltage stability analysis, including a probabilistic approach

    Energy Technology Data Exchange (ETDEWEB)

    Vieira Filho, X.; Martins, N.; Bianco, A.; Pinto, H.J.C.P. [Centro de Pesquisas de Energia Eletrica (CEPEL), Rio de Janeiro, RJ (Brazil); Pereira, M.V.F. [Power System Research (PSR), Inc., Rio de Janeiro, RJ (Brazil); Gomes, P.; Santos, M.G. dos [ELETROBRAS, Rio de Janeiro, RJ (Brazil)

    1994-12-31

    This paper reviews some voltage stability analysis tools that are being used or envisioned for expansion and operational planning studies in the Brazilian system, as well as their applications. The paper also shows that deterministic tools can be linked together in a probabilistic framework, so as to provide complementary help to the analyst in choosing the most adequate operation strategies or the best planning solutions for a given system. (author) 43 refs., 8 figs., 8 tabs.

  14. Risk analysis of analytical validations by probabilistic modification of FMEA.

    Science.gov (United States)

    Barends, D M; Oldenhof, M T; Vredenbregt, M J; Nauta, M J

    2012-05-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling detection not only of technical risks, but also of risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring of severity. In an example, the results of traditional FMEA of a Near Infrared (NIR) analytical procedure used for the screening of suspected counterfeited tablets are reinterpreted by this probabilistic modification of FMEA. Using this probabilistic modification of FMEA, the frequency of occurrence of undetected failure mode(s) can be estimated quantitatively, for each individual failure mode, for a set of failure modes, and for the full analytical procedure. Copyright © 2012 Elsevier B.V. All rights reserved.
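    The modification is straightforward to express in code. In this sketch the failure modes, relative frequencies, and severity scores are invented for illustration; the key point is that occurrence and detection enter as estimated probabilities rather than categorical scores:

```python
# Hedged sketch of probabilistic FMEA: P(undetected) = P(occur) * P(missed).
failure_modes = [
    # (name, p_occurrence per run, p_non_detection, severity score 1-5)
    ("wrong reference spectrum", 1e-3, 0.10, 5),
    ("mislabelled sample",       5e-3, 0.50, 4),
    ("instrument drift",         2e-2, 0.05, 3),
]

for name, p_occ, p_miss, severity in failure_modes:
    p_undetected = p_occ * p_miss   # frequency of an undetected failure
    print(f"{name:>25s}: P(undetected) = {p_undetected:.2e}, severity = {severity}")

# Frequency that at least one failure mode goes undetected in one procedure:
p_none = 1.0
for _, p_occ, p_miss, _ in failure_modes:
    p_none *= 1.0 - p_occ * p_miss
print(f"P(any undetected failure) = {1.0 - p_none:.2e}")
```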

  15. Probabilistic safety assessment of WWER440 reactors: prediction, quantification and management of the risk

    CERN Document Server

    Kovacs, Zoltan

    2014-01-01

    The aim of this book is to summarize probabilistic safety assessment (PSA) of nuclear power plants with WWER440 reactors and to demonstrate that the plants are safe enough to produce energy even in light of the Fukushima accident. The book examines level 1 and level 2 full-power, low-power and shutdown PSA, and summarizes the author's experience gained during the last 35 years in this area. It provides useful examples taken from PSA training courses that the author has lectured, organized by the International Atomic Energy Agency. Such training courses were organised in Argonne National Laboratory (

  16. A framework for the probabilistic analysis of meteotsunamis

    Science.gov (United States)

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling are performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
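    The aggregation scheme lends itself to a compact Monte Carlo sketch. Here the hydrodynamic model is replaced by an invented analytic amplitude function with a Proudman-resonance-like peak, and the disturbance rate and parameter distributions are assumed rather than derived from the observational record:

```python
# Hedged sketch of a Monte Carlo meteotsunami hazard curve.
import numpy as np

rng = np.random.default_rng(1)
RATE = 4.0          # squall-line disturbances per year (assumed)
YEARS = 10_000      # length of the synthetic catalog

n_events = rng.poisson(RATE * YEARS)
speed = rng.normal(25.0, 5.0, n_events)          # disturbance speed (m/s)
dp = rng.lognormal(np.log(2.0), 0.5, n_events)   # pressure jump (hPa)

# Stand-in for the numerical model: amplitude grows near the long-wave
# speed (Proudman resonance) and with the size of the pressure jump.
c_longwave = 22.0  # m/s for the assumed shelf depth
amp = 0.05 * dp / (np.abs(1.0 - (speed / c_longwave) ** 2) + 0.05)

levels = np.linspace(0.1, 2.0, 40)
rate_exceed = [(amp > a).sum() / YEARS for a in levels]  # hazard curve
for a, r in zip(levels[::8], rate_exceed[::8]):
    print(f"amplitude > {a:.2f} m: {r:.3e} /yr")
```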

  17. A Probabilistic Analysis of the Fermi Paradox

    CERN Document Server

    Solomonides, Evan; Terzian, Yervant

    2016-01-01

    The Fermi paradox uses an appeal to the mediocrity principle to make it seem counter-intuitive that humanity has not been contacted by extraterrestrial intelligence. A numerical, statistical analysis was conducted to determine whether this apparent loneliness is, in fact, unexpected. An inequality was derived to relate the frequency of life arising and developing technology on a suitable planet in the galaxy, the average length of time since the first broadcast of such a civilization, and a constant term. An analysis of the sphere reached thus far by human communication was also conducted, considering our local neighborhood and planets of particular interest. We clearly show that human communication has not reached a number of stars and planets adequate to expect an answer. These analyses both conclude that the Fermi paradox is not, in fact, unexpected. By the mediocrity principle and numerical modeling, it is actually unlikely that the Earth would have been reached by extraterrestrial communication at this p...

  18. A Probabilistic Analysis of the Fermi Paradox

    OpenAIRE

    Solomonides, Evan; Terzian, Yervant

    2016-01-01

    The Fermi paradox uses an appeal to the mediocrity principle to make it seem counter-intuitive that humanity has not been contacted by extraterrestrial intelligence. A numerical, statistical analysis was conducted to determine whether this apparent loneliness is, in fact, unexpected. An inequality was derived to relate the frequency of life arising and developing technology on a suitable planet in the galaxy, the average length of time since the first broadcast of such a civilization, and a c...

  19. The importance of Probabilistic Safety Assessment in the careful study of risks involved to new nuclear power plant projects

    Energy Technology Data Exchange (ETDEWEB)

    Mata, Jônatas F.C. da, E-mail: jonatasfmata@yahoo.com.br [Universidade do Estado de Minas Gerais (UEMG), João Monlevade, MG (Brazil); Mesquita, Amir Z., E-mail: amir@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2017-07-01

    The Fukushima Daiichi nuclear accident in Japan in 2011 raised public fears about the actual safety of nuclear power plants in several countries. The response to this concern by government agencies and private companies has been objective and pragmatic, in order to guarantee best practices in the design, construction, operation and decommissioning phases of nuclear reactors. In countries where nuclear electricity generation is consolidated, such as the United States, France and the United Kingdom, the safety assessment is carried out considering both deterministic and probabilistic criteria. In the licensing stages of new projects, it is necessary to analyze and simulate the behavior of the nuclear power plant when subjected to conditions that can lead to accident sequences. Each initiating event is studied and simulated through computational models, which allow the description and estimation of the physical phenomena that may occur in nuclear reactors. Probabilistic Safety Assessment (PSA) is fundamental in this process, as it studies in depth the sequences of events that can lead to melting of the reactor core. Such sequences should be quantified in terms of their probability of occurrence and their possible consequences, and organized through techniques such as Fault Tree Analysis and Event Tree Analysis. For these simulations, specialized computer codes for each type of phenomenon should be used, as well as databases based on experience gained in the operation of similar nuclear reactors. The present work describes, in an objective way, the procedures for performing a PSA and its applicability to assuring the operational reliability of nuclear reactors, as well as a brief comparison between the approaches used in Brazil and in some countries that are traditional users of thermonuclear energy. From this analysis, it can be concluded that nuclear power is increasingly reliable and safe, being able to provide the necessary

  20. Probabilistic Analysis in Management Decision Making

    DEFF Research Database (Denmark)

    Delmar, M. V.; Sørensen, John Dalsgaard

    1992-01-01

    The target group of this paper is people concerned with mathematical economic decision theory. It is shown how the numerically effective First Order Reliability Methods (FORM) can be used in rational management decision making, where some parameters in the applied decision basis are uncertain quantities. The uncertainties are taken into account consistently, and the decision analysis is based on general decision theory in combination with reliability and optimization theory. Examples are shown where the described technique is used, and some general conclusions are stated.
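    For a linear limit state with independent normal variables, the FORM result is available in closed form, which keeps an example short; the capacity/demand interpretation and all numbers below are illustrative assumptions:

```python
# Hedged FORM sketch: Hasofer-Lind reliability index for g = R - S.
from math import sqrt
from scipy.stats import norm

mu_R, sd_R = 12.0, 2.0   # "capacity" side of the decision basis (assumed)
mu_S, sd_S = 8.0, 1.5    # "demand" side (assumed)

beta = (mu_R - mu_S) / sqrt(sd_R**2 + sd_S**2)  # reliability index
pf = norm.cdf(-beta)                            # FORM failure probability

print(f"beta = {beta:.3f}, Pf = {pf:.3e}")
```

    For nonlinear limit states, the same index is found iteratively (e.g., by the Rackwitz-Fiessler algorithm), but the closed-form case already conveys how uncertain parameters enter the decision basis.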

  1. Probabilistic safety goals for nuclear power plants; Phases 2-4. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bengtsson, L.; Knochenhauer, M. (Scandpower AB (Sweden)); Holmberg, J.-E.; Rossi, J. (VTT Technical Research Centre of Finland (Finland))

    2011-05-15

    Safety goals are defined in different ways in different countries and are also used differently. Many countries are presently developing them in connection with the transfer to risk-informed regulation of both operating nuclear power plants (NPP) and new designs. However, it is far from self-evident how probabilistic safety criteria should be defined and used. On one hand, experience indicates that safety goals are valuable tools for the interpretation of results from a probabilistic safety assessment (PSA), and they tend to enhance the realism of a risk assessment. On the other hand, strict use of probabilistic criteria is usually avoided. A major problem is the large number of different uncertainties in a PSA model, which makes it difficult to demonstrate compliance with a probabilistic criterion. Further, it has been seen that PSA results can change considerably over time due to scope extensions, revised operating experience data, method development, changes in system requirements, or increases in level of detail, mostly leading to an increase of the calculated risk frequency. This can cause a problem of consistency in the judgements. This report presents the results from the second, third and fourth phases of the project (2007-2009), which have dealt with providing guidance related to the resolution of some specific problems, such as the problem of consistency in judgement, comparability of safety goals used in different industries, the relationship between criteria on different levels, and relations between criteria for level 2 and 3 PSA. In parallel, additional context information has been provided. This was achieved by extending the international overview by contributing to and benefiting from a survey on PSA safety criteria which was initiated in 2006 within the OECD/NEA Working Group Risk. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by

  2. Seismic fragility analysis of a nuclear building based on probabilistic seismic hazard assessment and soil-structure interaction analysis

    Energy Technology Data Exchange (ETDEWEB)

    Gonzalez, R.; Ni, S.; Chen, R.; Han, X.M. [CANDU Energy Inc, Mississauga, Ontario (Canada); Mullin, D. [New Brunswick Power, Point Lepreau, New Brunswick (Canada)

    2016-09-15

    Seismic fragility analyses are conducted as part of seismic probabilistic safety assessment (SPSA) for nuclear facilities. Probabilistic seismic hazard assessment (PSHA) has been undertaken for a nuclear power plant in eastern Canada. The Uniform Hazard Spectra (UHS) obtained from the PSHA are characterized by high-frequency content which differs from the original plant design-basis earthquake spectral shape. Seismic fragility calculations for the service building of a CANDU 6 nuclear power plant suggest that the high-frequency effects of the UHS can be mitigated through site response analysis with site-specific geological conditions and state-of-the-art soil-structure interaction analysis. In this paper, it is shown that by performing a detailed seismic analysis using the latest technology, the conservatism embedded in the original seismic design can be quantified and the seismic capacity of the building in terms of High Confidence of Low Probability of Failure (HCLPF) can be improved. (author)
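    The HCLPF capacity mentioned above follows from the standard lognormal (separation-of-variables) fragility formulation; this sketch uses assumed capacity parameters, not those of the actual service building:

```python
# Hedged sketch of a lognormal seismic fragility curve and HCLPF capacity.
import numpy as np
from scipy.stats import norm

Am = 0.9                      # median ground-motion capacity (g), assumed
beta_r, beta_u = 0.25, 0.35   # aleatory and epistemic log-std, assumed

# HCLPF: 95% confidence of less than 5% probability of failure.
hclpf = Am * np.exp(-1.645 * (beta_r + beta_u))

def p_fail(a, confidence=0.5):
    """Fragility at acceleration a for a given confidence level."""
    am_conf = Am * np.exp(-norm.ppf(confidence) * beta_u)
    return norm.cdf(np.log(a / am_conf) / beta_r)

print(f"HCLPF = {hclpf:.2f} g, median fragility at 0.3 g = {p_fail(0.3):.3e}")
```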

  3. Probabilistic latent semantic analysis for dynamic textures recognition and localization

    Science.gov (United States)

    Wang, Yong; Hu, Shiqiang

    2014-11-01

    We present a framework for dynamic texture (DT) recognition and localization using a model developed in the text analysis literature: probabilistic latent semantic analysis (pLSA). The novelty lies in three aspects. First, a chaotic feature vector is introduced to characterize each pixel intensity series. Next, the pLSA model is employed to discover the topics using the bag-of-words representation. Finally, the spatial layout of DTs can be found. Experiments are conducted on well-known DT datasets. The results show that the proposed method can successfully build DT models, achieve higher accuracies in DT recognition, and effectively localize DTs.
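    For reference, a compact EM implementation of plain pLSA on a word-document count matrix; the chaotic-feature extraction and DT-specific steps of the paper are omitted, and the toy corpus is invented:

```python
# Hedged sketch of pLSA fitted by EM (asymmetric formulation P(z|d), P(w|z)).
import numpy as np

def plsa(counts, n_topics, n_iter=100, seed=0):
    """counts: (n_docs, n_words) array. Returns P(z|d) and P(w|z)."""
    rng = np.random.default_rng(seed)
    n_docs, n_words = counts.shape
    p_z_d = rng.random((n_docs, n_topics)); p_z_d /= p_z_d.sum(1, keepdims=True)
    p_w_z = rng.random((n_topics, n_words)); p_w_z /= p_w_z.sum(1, keepdims=True)
    for _ in range(n_iter):
        # E-step: responsibilities P(z|d,w), shape (docs, words, topics)
        joint = p_z_d[:, None, :] * p_w_z.T[None, :, :]
        resp = joint / (joint.sum(2, keepdims=True) + 1e-12)
        # M-step: re-estimate both distributions from expected counts
        exp_counts = counts[:, :, None] * resp
        p_z_d = exp_counts.sum(1); p_z_d /= p_z_d.sum(1, keepdims=True) + 1e-12
        p_w_z = exp_counts.sum(0).T; p_w_z /= p_w_z.sum(1, keepdims=True) + 1e-12
    return p_z_d, p_w_z

docs = np.array([[4, 3, 0, 0], [5, 2, 1, 0], [0, 1, 3, 4]])  # toy corpus
p_z_d, p_w_z = plsa(docs, n_topics=2)
print(np.round(p_z_d, 2))  # per-document topic mixtures
```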

  4. Risk analysis of analytical validations by probabilistic modification of FMEA

    DEFF Research Database (Denmark)

    Barends, D.M.; Oldenhof, M.T.; Vredenbregt, M.J.

    2012-01-01

    Risk analysis is a valuable addition to validation of an analytical chemistry process, enabling not only detecting technical risks, but also risks related to human failures. Failure Mode and Effect Analysis (FMEA) can be applied, using a categorical risk scoring of the occurrence, detection and severity of failure modes, and calculating the Risk Priority Number (RPN) to select failure modes for correction. We propose a probabilistic modification of FMEA, replacing the categorical scoring of occurrence and detection by their estimated relative frequency and maintaining the categorical scoring

  5. Outcomes of an international initiative for harmonization of low power and shutdown probabilistic safety assessment

    Directory of Open Access Journals (Sweden)

    Manna Giustino

    2010-01-01

    Many probabilistic safety assessment studies completed to date have demonstrated that the risk associated with low power and shutdown operation of nuclear power plants is often comparable with the risk of at-power operation, and that the main contributors to low power and shutdown risk often involve human factors. Since the beginning of nuclear power generation, human performance has been a very important factor in all phases of the plant lifecycle: design, commissioning, operation, maintenance, surveillance, modification, decommissioning and dismantling. The importance of this aspect has been confirmed by recent operating experience. This paper provides the insights and conclusions of a workshop organized in 2007 by the IAEA and the Joint Research Centre of the European Commission on harmonization of low power and shutdown probabilistic safety assessment for WWER nuclear power plants. The major objective of the workshop was to compare the approaches and results of human reliability analyses and to gain insights into the enhanced handling of human factors.

  6. Risk-Informed Safety Assurance and Probabilistic Assessment of Mission-Critical Software-Intensive Systems

    Science.gov (United States)

    Guarro, Sergio B.

    2010-01-01

    This report validates and documents the detailed features and practical application of the framework for software-intensive digital systems risk assessment and risk-informed safety assurance presented in the NASA PRA Procedures Guide for Managers and Practitioners. This framework, called herein the "Context-based Software Risk Model" (CSRM), enables the assessment of the contribution of software and software-intensive digital systems to overall system risk, in a manner which is entirely compatible and integrated with the format of a "standard" Probabilistic Risk Assessment (PRA), as currently documented and applied for NASA missions and applications. The CSRM also provides a risk-informed path and criteria for conducting organized and systematic digital system and software testing so that, within this risk-informed paradigm, the achievement of a quantitatively defined level of safety and mission success assurance may be targeted and demonstrated. The framework is based on the concept of context-dependent software risk scenarios and on the modeling of such scenarios via the use of traditional PRA techniques - i.e., event trees and fault trees - in combination with more advanced modeling devices such as the Dynamic Flowgraph Methodology (DFM) or other dynamic logic-modeling representations. The scenarios can be synthesized and quantified in a conditional logic and probabilistic formulation. The application of the CSRM method documented in this report refers to the MiniAERCam system designed and developed by the NASA Johnson Space Center.

  7. Risk-Based Predictive Maintenance for Safety-Critical Systems by Using Probabilistic Inference

    Directory of Open Access Journals (Sweden)

    Tianhua Xu

    2013-01-01

    Risk-based maintenance (RBM) aims to improve maintenance planning and decision making by reducing the probability and consequences of equipment failure. A new predictive maintenance strategy that integrates a dynamic evolution model and risk assessment is proposed, which can be used to calculate the optimal maintenance time under minimal cost and safety constraints. The dynamic evolution model provides quantified risks by using probabilistic inference with bucket elimination and gives the prospective degradation trend of a complex system. Based on the degradation trend, an optimal maintenance time can be determined by minimizing the expected maintenance cost per time unit. The effectiveness of the proposed method is validated and demonstrated by a collision accident of high-speed trains with obstacles in the presence of safety and cost constraints.
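    A reduced form of such an optimization, minimizing expected maintenance cost per unit time under an assumed Weibull degradation law (a classic age-replacement sketch, not the paper's bucket-elimination inference; all parameters are placeholders):

```python
# Hedged sketch: preventive maintenance time minimizing cost per unit time.
import numpy as np

beta, eta = 2.5, 1000.0      # Weibull shape / scale (hours), assumed
C_PREV, C_FAIL = 1.0, 12.0   # preventive vs corrective cost ratio, assumed

t = np.linspace(1.0, 3000.0, 3000)
dt = t[1] - t[0]
R = np.exp(-((t / eta) ** beta))   # probability of surviving to time t
mean_cycle = np.cumsum(R) * dt     # expected cycle length ~ integral of R(u)du
cost_rate = (C_PREV * R + C_FAIL * (1.0 - R)) / mean_cycle

t_opt = t[np.argmin(cost_rate)]
print(f"optimal preventive maintenance time ~ {t_opt:.0f} h")
```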

  8. K Basin safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Porten, D.R.; Crowe, R.D.

    1994-12-16

    The purpose of this accident safety analysis is to document in detail analyses whose results were reported in summary form in the K Basins Safety Analysis Report WHC-SD-SNF-SAR-001. The safety analysis addressed the potential for release of radioactive and non-radioactive hazardous material located in the K Basins and their supporting facilities. It covers the hazards associated with normal K Basin fuel storage and handling operations, fuel encapsulation, sludge encapsulation, and canister clean-up and disposal. After a review of the Criticality Safety Evaluation of the K Basin activities, the following postulated events were evaluated: crane failure and casks dropped into the loadout pit; design basis earthquake; hypothetical loss-of-basin-water accident; combustion of uranium fuel following dryout; crane failure and cask dropped onto the floor of the transfer area; spent ion exchange shipment for burial; hydrogen deflagration in ion exchange modules and filters; release of chlorine; power availability and reliability; and ashfall.

  9. Use of probabilistic safety assessment in supporting regulatory authority's work; Todennaekoeisyyspohjaisen turvallisuusanalyysin kaeyttoe viranomaistyoen tukena

    Energy Technology Data Exchange (ETDEWEB)

    Julin, A.

    1995-11-01

    The aim of the study was to examine possibilities of using probabilistic safety assessment (PSA) more effectively in the regulatory control of nuclear power plants. The structure, results and evaluation methods of PSA are introduced, along with the equations and principles that can be used in utilising level 1 PSA results in decision making. The presented examples describe the ways PSA has been utilised abroad and particularly in the Finnish Centre for Radiation and Nuclear Safety (STUK). The examples calculated in the study are based on the SPSA code and the PSA model of the Olkiluoto nuclear power plant (TVO). The examples compare component safety classes versus safety importance, and the risk of continued operation versus the shutdown alternative for residual heat removal system failures. In addition, allowed outage times calculated by PSA were compared to the allowed outage times given in the technical specifications. The last 9 years of operating experience of TVO II were also examined by analysing the risk importance of significant component failures and operational disturbances. The analysis showed that the contribution of component failures and operational disturbances to the overall core damage risk during the studied time period was only 5 per cent. It appeared that the rare, significant initiating events provide the main contribution to the total cumulative risk. (57 refs., 22 figs., 17 tabs.).
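    The risk comparison of continued operation versus shutdown is commonly expressed through the incremental core damage probability (ICDP) accumulated over the outage; a sketch with illustrative frequencies (none taken from the TVO model):

```python
# Hedged sketch: risk-informed allowed outage time from an ICDP criterion.
CDF_BASE = 2.0e-5        # baseline core damage frequency, /yr (assumed)
CDF_DEGRADED = 8.0e-5    # CDF with one RHR train unavailable (assumed)
ICDP_LIMIT = 1.0e-6      # a common screening criterion for a single outage

def max_outage_days(cdf_base, cdf_degraded, limit):
    """Outage duration at which the accumulated risk increment hits the limit."""
    delta = cdf_degraded - cdf_base   # instantaneous CDF increase, /yr
    return limit / delta * 365.0

print(f"risk-informed allowed outage time ~ "
      f"{max_outage_days(CDF_BASE, CDF_DEGRADED, ICDP_LIMIT):.1f} days")
```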

  10. Randomized Probabilistic Latent Semantic Analysis for Scene Recognition

    Science.gov (United States)

    Rodner, Erik; Denzler, Joachim

    The concept of probabilistic Latent Semantic Analysis (pLSA) has gained much interest as a tool for feature transformation in image categorization and scene recognition scenarios. However, a major issue of this technique is overfitting. Therefore, we propose to use an ensemble of pLSA models which are trained using random fractions of the training data. We analyze empirically the influence of the degree of randomization and the size of the ensemble on the overall classification performance of a scene recognition task. A thoughtful evaluation shows the benefits of this approach compared to a single pLSA model.

  11. Probabilistic Analysis of Structural Member from Recycled Aggregate Concrete

    Science.gov (United States)

    Broukalová, I.; Šeps, K.

    2017-09-01

    The paper addresses sustainable building, specifically the recycling of waste rubble concrete from demolition. Considering the demands of maximising recycled aggregate use and minimising cement consumption, a composite made from recycled concrete aggregate was proposed. The objective of the presented investigations was to verify the feasibility of the recycled-aggregate, cement-based, fibre-reinforced composite in a structural member. The reliability of a wall made from the recycled aggregate fibre reinforced composite was assessed in a probabilistic analysis of its load-bearing capacity. The applicability of recycled aggregate fibre reinforced concrete in structural applications was demonstrated. The outcomes highlight the issue of the high scatter of material parameters of recycled aggregate concretes.

  12. PROSA-1: a probabilistic response-surface analysis code. [LMFBR

    Energy Technology Data Exchange (ETDEWEB)

    Vaurio, J. K.; Mueller, C.

    1978-06-01

    Techniques for probabilistic response-surface analysis have been developed to obtain the probability distributions of the consequences of postulated nuclear-reactor accidents. The uncertainties of the consequences are caused by the variability of the system and model input parameters used in the accident analysis. Probability distributions are assigned to the input parameters, and parameter values are systematically chosen from these distributions. These input parameters are then used in deterministic consequence analyses performed by mechanistic accident-analysis codes. The results of these deterministic consequence analyses are used to generate the coefficients for analytical functions that approximate the consequences in terms of the selected input parameters. These approximating functions are used to generate the probability distributions of the consequences with random sampling being used to obtain values for the accident parameters from their distributions. A computer code PROSA has been developed for implementing the probabilistic response-surface technique. Special features of the code generate or treat sensitivities, statistical moments of the input and output variables, regionwise response surfaces, correlated input parameters, and conditional distributions. The code can also be used for calculating important distributions of the input parameters. The use of the code is illustrated in conjunction with the fast-running accident-analysis code SACO to provide probability studies of LMFBR hypothetical core-disruptive accidents. However, the methods and the programming are general and not limited to such applications.
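    A scaled-down version of this response-surface workflow: fit a quadratic surface to a handful of deterministic "code runs", then push cheap Monte Carlo samples through the surface. The consequence function standing in for the accident-analysis code, and all distributions, are invented for the illustration:

```python
# Hedged sketch of the probabilistic response-surface technique.
import numpy as np

rng = np.random.default_rng(2)

def expensive_code(x1, x2):
    # Placeholder for a mechanistic accident-analysis code (e.g. SACO).
    return 3.0 + 1.5 * x1 - 0.8 * x2 + 0.4 * x1 * x2 + 0.2 * x1**2

# 1) Design points and deterministic consequence runs
x = rng.uniform(-2, 2, size=(30, 2))
y = expensive_code(x[:, 0], x[:, 1])

# 2) Fit a full quadratic response surface by least squares
def basis(x):
    x1, x2 = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])

coef, *_ = np.linalg.lstsq(basis(x), y, rcond=None)

# 3) Monte Carlo through the surface using the input-parameter distributions
xs = rng.normal(0.0, 1.0, size=(1_000_000, 2))
ys = basis(xs) @ coef
print(f"P(consequence > 6) ~ {np.mean(ys > 6.0):.4f}")
```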

  13. Linking Safety Analysis to Safety Requirements

    DEFF Research Database (Denmark)

    Hansen, Kirsten Mark

    Software for safety critical systems must deal with the hazards identified by safety analysis techniques: fault trees, event trees, and cause-consequence diagrams can be interpreted as safety requirements and used in the design activity. We propose that the safety analysis and the system design use the same system model and that this model is formalized in a real-time, interval logic, based on a conventional dynamic systems model with a state over time. The three safety analysis techniques are interpreted in this model and it is shown how to derive safety requirements for components of a system.

  14. Safety analysis for 'Fugen'

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-10-01

    Improving safety at nuclear power stations is an important task. It is therefore important that the safety evaluation, too, be carried out comprehensively and systematically throughout the period of use, not only before construction and the start of operation, by drawing on operational experience and new knowledge important for safety. This report describes the results of a safety analysis of 'Fugen' carried out in light of the latest technical knowledge. The results confirmed that the safety of 'Fugen' is secured by its inherent safety features and by the facilities designed for securing safety. The basic approach to the safety analysis, including the guidelines to be conformed to, is described. For abnormal operational transients and accidents, their definitions, the events to be evaluated and the criteria for judgement are reported, and the matters taken into consideration in the analysis are shown. The computation programs used for the analysis were REACT, HEATUP, LAYMON, FATRAC, SENHOR, LOTRAC, FLOOD and CONPOL. The analyses of abnormal operational transients and accidents are reported with respect to causes, countermeasures, protective functions and results. (K.I.)

  15. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in the design of vertical wall breakwaters, and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors

  16. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    Science.gov (United States)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety, or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of a PSHA; although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. By contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of the liquefaction hazard, taking into account the joint probability distribution of PGA and magnitude of the earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method within a performance-based earthquake engineering (PBEE) framework. The results of the procedure are a direct estimate of the return period of liquefaction and liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the Kramer and Mayfield procedure to compute the conditional probability, but there is no professional consensus about its applicability. Therefore we have included not only Cetin's method but also the SPT-based procedure of Idriss and Boulanger (2012) and the CPT-based procedure of Boulanger and Idriss (2014) in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, in Hungary. Its epicenter was located
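    The performance-based calculation reduces to integrating a conditional liquefaction probability over the joint hazard from the PSHA disaggregation. In this sketch both the hazard increments and the logistic conditional-probability model are invented placeholders for the calibrated models named above:

```python
# Hedged sketch of a performance-based liquefaction return period.
import numpy as np

pga = np.array([0.05, 0.1, 0.2, 0.3, 0.4])         # ground motion bins (g)
d_rate = np.array([2e-2, 8e-3, 2e-3, 5e-4, 1e-4])  # rate of each PGA bin, /yr
mags = np.array([5.5, 6.0, 6.5])
w_m = np.array([0.5, 0.3, 0.2])                    # disaggregation weights

def p_liq(a, m):
    """Stand-in conditional probability of liquefaction given PGA and M."""
    return 1.0 / (1.0 + np.exp(-(6.0 * np.log(a) + 1.2 * m - 2.0)))

rate_liq = sum(d_rate[i] * w_m[j] * p_liq(pga[i], mags[j])
               for i in range(len(pga)) for j in range(len(mags)))
print(f"return period of liquefaction ~ {1.0 / rate_liq:.0f} years")
```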

  17. Analysis of Non-Linear Probabilistic Hybrid Systems

    Directory of Open Access Journals (Sweden)

    Joseph Assouramou

    2011-07-01

    This paper shows how to compute, for probabilistic hybrid systems, the clock approximation and the linear phase-portrait approximation that were proposed for non-probabilistic processes by Henzinger et al. The techniques permit the definition of a rectangular probabilistic process from a non-rectangular one, hence allowing the model checking of any class of systems. Clock approximation, which applies under some restrictions, aims at replacing a non-rectangular variable by a clock variable. Linear phase-portrait approximation applies without restriction and yields an approximation that simulates the original process. The conditions that we need for probabilistic processes are the same as those for the classic case.

  18. Use of probabilistic safety analysis for design of emergency mitigation systems in hydrogen producer plant with sulfur-iodine technology, Section II: sulfuric acid decomposition; Uso de analisis probabilistico de seguridad para el diseno de sistemas de mitigacion de emergencia en planta productora de hidrogeno con tecnologia azufre-iodo, Seccion II: descomposicion de acido sulfurico

    Energy Technology Data Exchange (ETDEWEB)

    Mendoza A, A.; Nelson E, P. F.; Francois L, J. L. [Facultad de Ingenieria, Departamento de Sistemas Energeticos, UNAM, Paseo Cuauhnahuac 8532, 62550 Jiutepec, Morelos (Mexico)], e-mail: iqalexmdz@yahoo.com.mx

    2009-10-15

    Over the last decades, the need to reduce emissions of greenhouse gases has prompted the development of technologies for the production of clean fuels through the use of zero-emission primary energy resources, such as the heat from high-temperature nuclear reactors. Among these technologies, one of the most promising is hydrogen production by the sulfur-iodine cycle coupled to a high-temperature reactor, initially proposed by General Atomics. Because these will be large-scale plants, the development of these technologies from their present phase to procurement and construction will have to incorporate emergency mitigation systems in all their parts and interconnections, to prevent undesired events that could threaten the integrity of the plant and the nearby area. For the particular case of the sulfur-iodine thermochemical cycle, most analyses have focused on hydrogen explosions and failures in the primary cooling systems. While these events are the most catastrophic, there are also many other events that, even with less direct consequences, could jeopardize the operation of the plant and the safety of people in nearby communities, and carry economic consequences as well. In this study we analyzed one such event: the formation of a toxic cloud caused by an uncontrolled leak of concentrated sulfuric acid in Section II of the General Atomics sulfur-iodine process. In this section, the sulfuric acid concentration is near 90%, at high temperature and positive pressure. Under these conditions the sulfuric acid and sulfur oxides from the reactor would form a toxic cloud that, on contact with plant personnel, could cause fatalities, and that, on reaching a town, would cause suffocation, respiratory problems and eye irritation. The methodology used for this study is design supported by probabilistic safety analysis. Mitigation systems were postulated based on the isolation of a possible leak, the neutralization of a pond of

  19. A Probabilistic Physics of Failure Approach for Structure Corrosion Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Chaoyang Xie

    2016-01-01

    Corrosion is recognized as one of the most important degradation mechanisms affecting the long-term reliability and integrity of metallic structures. Studying structural reliability with pitting corrosion damage is useful for risk control and safe operation of the corroded structure. This paper proposes a structural corrosion reliability analysis approach based on a physics-of-failure model of pitting corrosion, in which the stages of pit growth, pit-to-crack transition, and crack propagation are included in the failure model. Different probabilistic analysis methods, such as Monte Carlo Simulation (MCS), the First-Order Reliability Method (FORM), the Second-Order Reliability Method (SORM), and the response surface method, are then employed to calculate the reliability. Finally, an example is presented to demonstrate the capability of the proposed structural reliability model and calculation methods for structural corrosion failure analysis.
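    A Monte Carlo sketch of such a staged failure model (pit growth, pit-to-crack transition, then crack growth within the design life); all distributions and rate laws below are invented placeholders, not the paper's calibrated models:

```python
# Hedged MCS sketch of a two-stage pitting-corrosion failure model.
import numpy as np

rng = np.random.default_rng(3)
N, LIFE = 200_000, 30.0   # Monte Carlo samples, design life (years)

# Stage 1: pit depth grows as a power law d = C * t^n (placeholder law)
C = rng.lognormal(np.log(0.08), 0.3, N)   # mm / yr^n
n = rng.normal(0.4, 0.05, N)
d_crit = rng.normal(0.5, 0.05, N)         # pit-to-crack depth, mm
t_pit = (d_crit / C) ** (1.0 / n)         # time to crack initiation

# Stage 2: remaining life consumed by crack propagation (assumed distribution)
t_crack = rng.lognormal(np.log(8.0), 0.4, N)

p_fail = np.mean(t_pit + t_crack < LIFE)
print(f"P(corrosion failure within {LIFE:.0f} yr) ~ {p_fail:.3e}")
```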

  20. OVERVIEW OF THE SAPHIRE PROBABILISTIC RISK ANALYSIS SOFTWARE

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis L.; Wood, Ted; Knudsen, James; Ma, Zhegang

    2016-10-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) is a software application developed for performing a complete probabilistic risk assessment (PRA) using a personal computer (PC) running the Microsoft Windows operating system. SAPHIRE Version 8 is funded by the U.S. Nuclear Regulatory Commission (NRC) and developed by the Idaho National Laboratory (INL). INL's primary role in this project is that of software developer and tester. However, INL also plays an important role in technology transfer by interfacing and supporting SAPHIRE users, who constitute a wide range of PRA practitioners from the NRC, national laboratories, the private sector, and foreign countries. In this paper, we provide an overview of the current technical capabilities found in SAPHIRE Version 8, including the user interface and enhanced solving algorithms.

  1. Probabilistic safety assessment for Hanford high-level waste tank 241-SY-101

    Energy Technology Data Exchange (ETDEWEB)

    MacFarlane, D.R.; Bott, T.F.; Brown, L.F.; Stack, D.W. [Los Alamos National Lab., NM (United States); Kindinger, J.; Deremer, R.K.; Medhekar, S.R.; Mikschl, T.J. [PLG, Inc., Newport Beach, CA (United States)

    1994-05-01

    Los Alamos National Laboratory (Los Alamos) is performing a comprehensive probabilistic safety assessment (PSA), which will include consideration of external events for the 18 tank farms at the Hanford Site. This effort is sponsored by the Department of Energy (DOE/EM, EM-36). Even though the methodology described herein will be applied to the entire tank farm, this report focuses only on the risk from the weapons-production wastes stored in tank number 241-SY-101, commonly known as Tank 101-SY, as configured in December 1992. This tank, which periodically releases ("burps") a gaseous mixture of hydrogen, nitrous oxide, ammonia, and nitrogen, was analyzed first because of public safety concerns associated with the potential for release of radioactive tank contents should this gas mixture be ignited during one of the burps. In an effort to mitigate the burping phenomenon, an experiment is being conducted in which a large pump has been inserted into the tank to determine if pump-induced circulation of the tank contents will promote a slow, controlled release of the gases. At the Hanford Site there are 177 underground tanks in 18 separate tank farms containing accumulated liquid/sludge/salt cake radioactive wastes from 50 yr of weapons materials production activities. The total waste volume is about 60 million gal., which contains approximately 120 million Ci of radioactivity.

  2. Probabilistic performance analysis using the SLEUTH fuel modelling code

    Energy Technology Data Exchange (ETDEWEB)

    Palmer, I.D.

    1986-01-01

    The paper describes the development and sample use of a computer code which automates both the Monte Carlo and response surface approaches to probabilistic fuel performance modelling utilising the SLEUTH-82 deterministic program. A number of the statistical procedures employed, which have been prepared as independent computer codes, are also described. These are of general applicability in many areas of probabilistic assessment.

  3. Design of a cable stayed composite bridge using a probabilistic FE analysis

    NARCIS (Netherlands)

    Boer, A. de; Waarts, P.H.

    1999-01-01

    With the availability of alternative methods of probabilistic analysis and of computers with more CPU power, it is today possible within a design office to analyse structures in a probabilistic way. After a pilot implementation of some alternative methods in an existing FE code, some structures are

  4. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    Formulation of mathematical learning goals is now oriented not only to the cognitive product but also to the cognitive process, namely probabilistic thinking. Probabilistic thinking is needed by students to make decisions, and elementary school students are required to develop it as a foundation for learning probability at a higher level. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, consisting of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of a test of mathematical ability for high math ability. The subjects were given probability tasks covering sample space, probability of an event, and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion, and the credibility of the data was established by time triangulation. The results indicated that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; that is, the boy's level of probabilistic thinking was higher than the girl's. The results could help curriculum developers in setting probability learning goals for elementary school students, and teachers could teach probability with regard to gender differences.

  5. Probabilistic Seismic Hazard Disaggregation Analysis for the South of Portugal

    Science.gov (United States)

    Rodrigues, I.; Sousa, M.; Teves-Costa, P.

    2010-12-01

    Probabilistic seismic hazard disaggregation analysis was performed and seismic scenarios were identified for Southern Mainland Portugal. This region’s seismicity is characterized by small and moderate magnitude events and by the sporadic occurrence of large earthquakes (e.g. the 1755 Lisbon earthquake). Thus, the Portuguese Civil Protection Agency (ANPC) sponsored a collaborative research project for the study of the seismic and tsunami risks in the Algarve (project ERSTA). In the framework of this project, a series of new developments were obtained, namely the revision of the seismic catalogue (IM, 2008), the delineation of new seismogenic zones affecting the Algarve region, which reflects the growing knowledge of this region's seismotectonic context, the derivation of new spectral attenuation laws (Carvalho and Campos Costa, 2008) and the revision of the probabilistic seismic hazard (Sousa et al. 2008). Seismic hazard was disaggregated considering different spaces of random variables, namely, bivariate conditional hazard distributions of X-Y (seismic source latitude and longitude) and multivariate 4D conditional hazard distributions of M-(X-Y)-ɛ (ɛ - deviation of ground motion to the median value predicted by an attenuation model). These procedures were performed for the peak ground acceleration (PGA) and for the 5% damped 1.0 and 2.5 Hz spectral acceleration levels of three return periods: 95, 475 and 975 years. The seismic scenarios controlling the hazard of a given ground motion level, were identified as the modal values of the 4D disaggregation analysis for each of the 84 parishes of the Algarve region. Those scenarios, based on a probabilistic analysis, are meant to be used in the emergency planning as a complement to the historical scenarios that severely affected this region. Seismic scenarios share a few number of geographical locations for all return periods. Moreover, seismic hazard of most Algarve’s parishes is dominated by the seismicity located

  6. Development of a computational database for probabilistic safety assessment of nuclear research reactors

    Energy Technology Data Exchange (ETDEWEB)

    Macedo, Vagner S.; Oliveira, Patricia S. Pagetti de; Andrade, Delvonei Alves de, E-mail: vagner.macedo@usp.br, E-mail: patricia@ipen.br, E-mail: delvonei@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The objective of this work is to describe the database being developed at IPEN - CNEN / SP for application in the Probabilistic Safety Assessment of nuclear research reactors. The database can be accessed by means of a computational program installed on the corporate computer network, named IPEN Intranet, and this access will be allowed only to previously registered professionals. Data updating, editing and searching tasks will be controlled by a system administrator according to IPEN Intranet security rules. The logical model and the physical structure of the database can be represented by an Entity Relationship Model, which is based on the operational routines performed by IPEN - CNEN / SP users. The web application designed for the management of the database is named PSADB; it is being developed with the MySQL database software and the PHP programming language. Data stored in this database are divided into modules that refer to technical specifications, operating history, maintenance history and failure events associated with the main components of the nuclear facilities. (author)

  7. Study on quantification method based on Monte Carlo sampling for multiunit probabilistic safety assessment models

    Energy Technology Data Exchange (ETDEWEB)

    Oh, Kye Min [KHNP Central Research Institute, Daejeon (Korea, Republic of); Han, Sang Hoon; Park, Jin Hee; Lim, Ho Gon; Yang, Joon Yang [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Heo, Gyun Young [Kyung Hee University, Yongin (Korea, Republic of)

    2017-06-15

    In Korea, many nuclear power plants operate at a single site based on geographical characteristics, but the population density near the sites is higher than that in other countries. Thus, multiunit accidents are a more important consideration than in other countries and should be addressed appropriately. Currently, there are many issues related to a multiunit probabilistic safety assessment (PSA). One of them is the quantification of a multiunit PSA model. A traditional PSA uses a Boolean manipulation of the fault tree in terms of the minimal cut set. However, such methods have some limitations when rare event approximations cannot be used effectively or a very small truncation limit should be applied to identify accident sequence combinations for a multiunit site. In particular, it is well known that seismic risk in terms of core damage frequency can be overestimated because there are many events that have a high failure probability. In this study, we propose a quantification method based on a Monte Carlo approach for a multiunit PSA model. This method can consider all possible accident sequence combinations in a multiunit site and calculate a more exact value for events that have a high failure probability. An example model for six identical units at a site was also developed and quantified to confirm the applicability of the proposed method.
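
    A minimal sketch of the Monte Carlo quantification idea, with an invented two-cut-set fault tree shared across six units rather than the paper's model; it shows why sampling gives exact results where the rare-event approximation breaks down for high-probability (e.g., seismic) basic events.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 200_000
n_units = 6

# Hypothetical basic events (not the paper's model): a site-wide seismic
# failure with a high probability, plus unit-specific failures.
p_common = 0.3                       # shared (correlated) seismic basic event
p_unit_a, p_unit_b = 0.2, 0.25       # per-unit basic events

common = rng.random(n_samples) < p_common
a = rng.random((n_samples, n_units)) < p_unit_a
b = rng.random((n_samples, n_units)) < p_unit_b

# Per-unit top event: common AND (a OR b) -- a toy two-cut-set fault tree.
unit_cd = common[:, None] & (a | b)

p_any = unit_cd.any(axis=1).mean()   # at least one unit damaged
p_all = unit_cd.all(axis=1).mean()   # all six units damaged

# Rare-event approximation (sum of cut-set probabilities) overestimates
# when probabilities are high, as with seismic events:
p_ree = n_units * (p_common * p_unit_a + p_common * p_unit_b)

print(f"MC  P(>=1 unit CD)    = {p_any:.4f}")
print(f"MC  P(all units CD)   = {p_all:.4f}")
print(f"REE sum over cut sets = {p_ree:.4f}  (can exceed 1 for high probabilities)")
```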

  8. Study on Quantification Method Based on Monte Carlo Sampling for Multiunit Probabilistic Safety Assessment Models

    Directory of Open Access Journals (Sweden)

    Kyemin Oh

    2017-06-01

    Full Text Available In Korea, many nuclear power plants operate at a single site based on geographical characteristics, but the population density near the sites is higher than that in other countries. Thus, multiunit accidents are a more important consideration than in other countries and should be addressed appropriately. Currently, there are many issues related to a multiunit probabilistic safety assessment (PSA). One of them is the quantification of a multiunit PSA model. A traditional PSA uses a Boolean manipulation of the fault tree in terms of the minimal cut set. However, such methods have some limitations when rare event approximations cannot be used effectively or a very small truncation limit should be applied to identify accident sequence combinations for a multiunit site. In particular, it is well known that seismic risk in terms of core damage frequency can be overestimated because there are many events that have a high failure probability. In this study, we propose a quantification method based on a Monte Carlo approach for a multiunit PSA model. This method can consider all possible accident sequence combinations in a multiunit site and calculate a more exact value for events that have a high failure probability. An example model for six identical units at a site was also developed and quantified to confirm the applicability of the proposed method.

  9. A Sensitivity Study for an Evaluation of Input Parameters Effect on a Preliminary Probabilistic Tsunami Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Rhee, Hyun-Me; Kim, Min Kyu; Choi, In-Kil [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of); Sheen, Dong-Hoon [Chonnam National University, Gwangju (Korea, Republic of)

    2014-10-15

    The tsunami hazard analysis has been based on the seismic hazard analysis. The seismic hazard analysis has been performed using both the deterministic method and the probabilistic method. To consider the uncertainties in the hazard analysis, the probabilistic method has been regarded as an attractive approach. In the probabilistic method, the various parameters and their weights are considered through the logic tree approach. Because various parameters enter the hazard analysis, their uncertainties should be characterized through sensitivity analysis. To apply the probabilistic tsunami hazard analysis, a preliminary study for the Ulchin NPP site had been performed. The information on the fault sources published by the Atomic Energy Society of Japan (AESJ) had been used in the preliminary study. The tsunami propagation was simulated using the TSUNAMI 1.0 code, which was developed by the Japan Nuclear Energy Safety Organization (JNES). The wave parameters have been estimated from the results of the tsunami simulations. In this study, the sensitivity analysis for the fault sources selected in the previous studies has been performed. To analyze the effect of the parameters, a sensitivity analysis for the E3 fault source published by the AESJ was performed. The effects of the recurrence interval, the potential maximum magnitude, and the beta were determined from the sensitivity analysis results. The level of annual exceedance probability is affected by the recurrence interval, while wave heights are influenced by the potential maximum magnitude and the beta. In the future, the sensitivity analysis for all fault sources in the western part of Japan published by the AESJ will be performed.

  10. Probabilistic Structural Analysis of the SRB Aft Skirt External Fitting Modification

    Science.gov (United States)

    Townsend, John S.; Peck, J.; Ayala, S.

    1999-01-01

    NASA has funded several major programs (the PSAM Project is an example) to develop Probabilistic Structural Analysis Methods and tools for engineers to apply in the design and assessment of aerospace hardware. A probabilistic finite element design tool, known as NESSUS, is used to determine the reliability of the Space Shuttle Solid Rocket Booster (SRB) aft skirt critical weld. An external bracket modification to the aft skirt provides a comparison basis for examining the details of the probabilistic analysis and its contributions to the design process.

  11. Probabilistic Design Analysis (PDA) Approach to Determine the Probability of Cross-System Failures for a Space Launch Vehicle

    Science.gov (United States)

    Shih, Ann T.; Lo, Yunnhon; Ward, Natalie C.

    2010-01-01

    Quantifying the probability of significant launch vehicle failure scenarios for a given design, while still in the design process, is critical to mission success and to the safety of the astronauts. Probabilistic risk assessment (PRA) is chosen from many system safety and reliability tools to verify the loss of mission (LOM) and loss of crew (LOC) requirements set by the NASA Program Office. To support the integrated vehicle PRA, probabilistic design analysis (PDA) models are developed by using vehicle design and operation data to better quantify failure probabilities and to better understand the characteristics of a failure and its outcome. This PDA approach uses a physics-based model to describe the system behavior and response for a given failure scenario. Each driving parameter in the model is treated as a random variable with a distribution function. Monte Carlo simulation is used to perform probabilistic calculations to statistically obtain the failure probability. Sensitivity analyses are performed to show how input parameters affect the predicted failure probability, providing insight for potential design improvements to mitigate the risk. The paper discusses the application of the PDA approach in determining the probability of failure for two scenarios from the NASA Ares I project
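
    The following sketch illustrates the PDA pattern under stated assumptions (a generic thin-wall hoop-stress limit state with invented distributions, not the Ares I models): each driving parameter is a random variable, Monte Carlo sampling yields the failure probability, and input-margin correlations give a crude sensitivity ranking.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical physics-based limit state: a casing fails when the hoop
# stress from internal pressure exceeds the material strength.
wall = rng.normal(8.0, 0.25, n)                  # wall thickness [mm]
strength = rng.normal(900.0, 50.0, n)            # material strength [MPa]
pressure = rng.lognormal(np.log(20.0), 0.1, n)   # chamber pressure [MPa]
radius = 250.0                                   # casing radius [mm], fixed

hoop_stress = pressure * radius / wall           # thin-wall hoop stress model
failure = hoop_stress > strength

print(f"P(failure) ~ {failure.mean():.2e}")

# Crude sensitivity: correlation of each input with the limit-state margin.
margin = strength - hoop_stress
for name, x in [("wall", wall), ("strength", strength), ("pressure", pressure)]:
    r = np.corrcoef(x, margin)[0, 1]
    print(f"corr({name}, margin) = {r:+.2f}")
```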

  12. Applicability of Linear Analysis in Probabilistic Estimation of Seismic Building Damage to Reinforced-Concrete Structures

    Science.gov (United States)

    2012-06-01

    [The abstract field contains only report-documentation-page fragments; recoverable details:] Applicability of Linear Analysis in Probabilistic Estimation of Seismic Building Damage to Reinforced-Concrete Structures, by Timothy P. James; grant number N00244-09-G-0014; submitted to the Department of Civil and

  13. Comparative analysis of deterministic and probabilistic fracture mechanical assessment tools

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, Klaus [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Koeln (Germany); Saifi, Qais [VTT Technical Research Centre of Finland, Espoo (Finland)

    2016-11-15

    Uncertainties in material properties, manufacturing processes, loading conditions and damage mechanisms complicate the quantification of structural reliability. Probabilistic structural mechanics codes serve as tools for assessing the leak and break probabilities of nuclear piping components. Probabilistic fracture mechanics tools have been compared in different benchmark activities, usually revealing minor but systematic discrepancies between the results of different codes. In this joint paper, probabilistic fracture mechanics codes are compared. Crack initiation, crack growth and the influence of in-service inspections are analyzed. Example cases for stress corrosion cracking and fatigue under LWR conditions are analyzed. The evolution of annual failure probabilities during simulated operation time is investigated in order to identify the reasons for differences in the results of different codes. The comparison of the tools is used for further improvement of the codes applied by the partners.

  14. Safety of long-distance pipelines. Probabilistic and deterministic aspects; Sicherheit von Rohrfernleitungen. Probabilistik und Deterministik im Vergleich

    Energy Technology Data Exchange (ETDEWEB)

    Hollaender, Robert [Leipzig Univ. (Germany). Inst. fuer Infrastruktur und Ressourcenmanagement

    2013-03-15

    The Committee for Long-Distance Pipelines (Berlin, Federal Republic of Germany) reported on the relation between deterministic and probabilistic approaches in order to contribute to a better understanding of the safety management of long-distance pipelines. The respective strengths and weaknesses as well as the deterministic and probabilistic fundamentals of safety management are described. The comparison covers fundamental aspects, but is essentially shaped by the special character of a long-distance pipeline as a technical facility: it is an infrastructure project embedded in the surrounding land. This special feature leads to special operating conditions and related responsibilities. However, the legal system does not grant long-distance pipelines the same legal position as other infrastructure facilities such as roads and railways. Thus, the question of whether and in what manner the impacts of land use in the vicinity of long-distance pipelines have to be considered is repeatedly the starting point for the discussion of probabilistic and deterministic approaches.

  15. Dynamic Positioning System (DPS) Risk Analysis Using Probabilistic Risk Assessment (PRA)

    Science.gov (United States)

    Thigpen, Eric B.; Boyer, Roger L.; Stewart, Michael A.; Fougere, Pete

    2017-01-01

    The National Aeronautics and Space Administration (NASA) Safety & Mission Assurance (S&MA) directorate at the Johnson Space Center (JSC) has applied its knowledge and experience with Probabilistic Risk Assessment (PRA) to projects in industries ranging from spacecraft to nuclear power plants. PRA is a comprehensive and structured process for analyzing risk in complex engineered systems and/or processes. The PRA process enables the user to identify potential risk contributors such as hardware and software failure, human error, and external events. Recent developments in the oil and gas industry have presented opportunities for NASA to lend its PRA expertise to both ongoing and developmental projects within the industry. This paper provides an overview of the PRA process and demonstrates how this process was applied in estimating the probability that a Mobile Offshore Drilling Unit (MODU) operating in the Gulf of Mexico and equipped with a generically configured Dynamic Positioning System (DPS) loses location and needs to initiate an emergency disconnect. The PRA described in this paper is intended to be generic such that the vessel meets the general requirements of an International Maritime Organization (IMO) Maritime Safety Committee (MSC)/Circ. 645 Class 3 dynamically positioned vessel. The results of this analysis are not intended to be applied to any specific drilling vessel, although provisions were made to allow the analysis to be configured to a specific vessel if required.

  16. Dynamic probabilistic CCA for analysis of affective behaviour

    NARCIS (Netherlands)

    Nicolaou, Mihalis A.; Pavlovic, Vladimir; Pantic, Maja

    2012-01-01

    Fusing multiple continuous expert annotations is a crucial problem in machine learning and computer vision, particularly when dealing with uncertain and subjective tasks related to affective behaviour. Inspired by the concept of inferring shared and individual latent spaces in probabilistic CCA

  17. Probabilistic Anomaly Detection Based On System Calls Analysis

    Directory of Open Access Journals (Sweden)

    Przemysław Maciołek

    2007-01-01

    Full Text Available We present an application of a probabilistic approach to anomaly detection (PAD). By analyzing selected system calls (and their arguments), the chosen applications are monitored in the Linux environment. This allows us to estimate the "(ab)normality" of their behavior (by comparison to previously collected profiles). We have attached results of threat detection in a typical computer environment.
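
    A minimal sketch of the profile-based idea (invented traces, not the PAD implementation): learn call-pair frequencies from normal behaviour and flag traces whose pairs have low probability under the profile.

```python
import math
from collections import Counter

# Learn call-pair counts from a normal trace, then score new traces by the
# mean log-probability of their consecutive call pairs under that profile.
normal_trace = ["open", "read", "read", "close", "open", "read", "close"]
pairs = Counter(zip(normal_trace, normal_trace[1:]))
total = sum(pairs.values())

def score(trace, eps=1e-6):
    """Mean log-probability of the trace's consecutive call pairs."""
    logps = [math.log(pairs.get(p, 0) / total + eps)
             for p in zip(trace, trace[1:])]
    return sum(logps) / len(logps)

print("normal :", round(score(["open", "read", "close"]), 2))
print("unusual:", round(score(["open", "execve", "socket"]), 2))  # low -> anomalous
```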

  18. Probabilistic thermo-chemical analysis of a pultruded composite rod

    DEFF Research Database (Denmark)

    Baran, Ismet; Tutum, Cem Celal; Hattel, Jesper Henri

    2012-01-01

    case, the probabilistic design of the pultrusion process, which has not been considered until now, is performed. The effect of statistical variations in the material (i.e. fiber and resin) and resin kinetic properties, as well as process parameters such as pulling speed and inlet temperature...

  19. Reachability-based Analysis for Probabilistic Roadmap Planners

    NARCIS (Netherlands)

    Geraerts, R.J.; Overmars, M.H.

    2007-01-01

    In the last fifteen years, sampling-based planners like the Probabilistic Roadmap Method (PRM) have proved to be successful in solving complex motion planning problems. While theoretically, the complexity of the motion planning problem is exponential in the number of degrees of freedom,

  20. Uncertainty analysis for probabilistic pipe fracture evaluations in LBB applications

    Energy Technology Data Exchange (ETDEWEB)

    Rahman, S.; Ghadiali, N.; Wilkowski, G.

    1997-04-01

    During the NRC's Short Cracks in Piping and Piping Welds Program at Battelle, a probabilistic methodology was developed to conduct fracture evaluations of circumferentially cracked pipes for application to leak-rate detection. Later, in the IPIRG-2 program, several parameters that may affect leak-before-break and other pipe flaw evaluations were identified. This paper presents new results from several uncertainty analyses to evaluate the effects of normal operating stresses, normal plus safe-shutdown earthquake stresses, off-centered cracks, restraint of pressure-induced bending, and dynamic and cyclic loading rates on the conditional failure probability of piping systems in BWRs and PWRs. For each parameter, the sensitivity of the conditional failure probability, and hence the parameter's importance for probabilistic leak-before-break evaluations, was determined.

  1. Schedulability analysis of dependent probabilistic real-time tasks

    OpenAIRE

    Ben-Amor, Slim; Maxim, Dorin,; Cucu-Grosjean, Liliana

    2016-01-01

    The complexity of modern architectures has increased the timing variability of programs (or tasks). In this context, new approaches based on probabilistic methods have been proposed to decrease the pessimism by associating probabilities with the worst-case execution time values of the programs (tasks). In this paper, we extend the original work of Chetto et al. [7] on precedence-constrained tasks to the case of tasks with worst-case execution times described by probability dis...

  2. Probabilistic Tsunami Hazard Analysis: Multiple Sources and Global Applications

    Science.gov (United States)

    Grezio, Anita; Babeyko, Andrey; Baptista, Maria Ana; Behrens, Jörn; Costa, Antonio; Davies, Gareth; Geist, Eric L.; Glimsdal, Sylfest; González, Frank I.; Griffin, Jonathan; Harbitz, Carl B.; LeVeque, Randall J.; Lorito, Stefano; Løvholt, Finn; Omira, Rachid; Mueller, Christof; Paris, Raphaël.; Parsons, Tom; Polet, Jascha; Power, William; Selva, Jacopo; Sørensen, Mathilde B.; Thio, Hong Kie

    2017-12-01

    Applying probabilistic methods to infrequent but devastating natural events is intrinsically challenging. For tsunami analyses, a suite of geophysical assessments should be in principle evaluated because of the different causes generating tsunamis (earthquakes, landslides, volcanic activity, meteorological events, and asteroid impacts) with varying mean recurrence rates. Probabilistic Tsunami Hazard Analyses (PTHAs) are conducted in different areas of the world at global, regional, and local scales with the aim of understanding tsunami hazard to inform tsunami risk reduction activities. PTHAs enhance knowledge of the potential tsunamigenic threat by estimating the probability of exceeding specific levels of tsunami intensity metrics (e.g., run-up or maximum inundation heights) within a certain period of time (exposure time) at given locations (target sites); these estimates can be summarized in hazard maps or hazard curves. This discussion presents a broad overview of PTHA, including (i) sources and mechanisms of tsunami generation, emphasizing the variety and complexity of the tsunami sources and their generation mechanisms, (ii) developments in modeling the propagation and impact of tsunami waves, and (iii) statistical procedures for tsunami hazard estimates that include the associated epistemic and aleatoric uncertainties. Key elements in understanding the potential tsunami hazard are discussed, in light of the rapid development of PTHA methods during the last decade and the globally distributed applications, including the importance of considering multiple sources, their relative intensities, probabilities of occurrence, and uncertainties in an integrated and consistent probabilistic framework.
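
    For a single Poissonian source model, the hazard-curve arithmetic described here reduces to converting annual exceedance rates into exceedance probabilities over an exposure time; the rates below are invented.

```python
import numpy as np

# Invented annual exceedance rates lam[i] for intensity thresholds heights[i];
# under a Poisson occurrence model, P(exceedance within T years) = 1 - exp(-lam*T).
heights = np.array([0.5, 1.0, 2.0, 4.0, 8.0])   # e.g., maximum inundation height [m]
lam = np.array([1e-2, 4e-3, 1e-3, 2e-4, 2e-5])  # annual exceedance rates [1/yr]

T = 50.0                                        # exposure time [yr]
p_exceed = 1.0 - np.exp(-lam * T)

for h, p in zip(heights, p_exceed):
    print(f"P(intensity > {h:4.1f} m within {T:.0f} yr) = {p:.4f}")
```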

  3. Deep Borehole Disposal Safety Analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Freeze, Geoffrey A. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Stein, Emily [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Price, Laura L. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); MacKinnon, Robert J. [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States); Tillman, Jack Bruce [Sandia National Laboratories (SNL-NM), Albuquerque, NM (United States)

    2016-10-01

    This report presents a preliminary safety analysis for the deep borehole disposal (DBD) concept, using a safety case framework. A safety case is an integrated collection of qualitative and quantitative arguments, evidence, and analyses that substantiate the safety, and the level of confidence in the safety, of a geologic repository. This safety case framework for DBD follows the outline of the elements of a safety case, and identifies the types of information that will be required to satisfy these elements. At this very preliminary phase of development, the DBD safety case focuses on the generic feasibility of the DBD concept. It is based on potential system designs, waste forms, engineering, and geologic conditions; however, no specific site or regulatory framework exists. It will progress to a site-specific safety case as the DBD concept advances into a site-specific phase, progressing through consent-based site selection and site investigation and characterization.

  4. Probabilistic stability analysis: the way forward for stability analysis of sustainable power systems.

    Science.gov (United States)

    Milanović, Jovica V

    2017-08-13

    Future power systems will be significantly different compared with their present states. They will be characterized by an unprecedented mix of a wide range of electricity generation and transmission technologies, as well as responsive and highly flexible demand and storage devices with significant temporal and spatial uncertainty. The importance of probabilistic approaches towards power system stability analysis, as a subsection of power system studies routinely carried out by power system operators, has been highlighted in previous research. However, it may not be feasible (or even possible) to accurately model all of the uncertainties that exist within a power system. This paper describes for the first time an integral approach to probabilistic stability analysis of power systems, including small and large angular stability and frequency stability. It provides guidance for handling uncertainties in power system stability studies and some illustrative examples of the most recent results of probabilistic stability analysis of uncertain power systems. This article is part of the themed issue 'Energy management: flexibility, risk and optimization'. © 2017 The Author(s).

  5. PROBABILISTIC SENSITIVITY AND UNCERTAINTY ANALYSIS WORKSHOP SUMMARY REPORT

    Energy Technology Data Exchange (ETDEWEB)

    Seitz, R

    2008-06-25

    Stochastic or probabilistic modeling approaches are being applied more frequently in the United States and globally to quantify uncertainty and enhance understanding of model response in performance assessments for disposal of radioactive waste. This increased use has resulted in global interest in sharing results of research and applied studies that have been completed to date. This technical report reflects the results of a workshop that was held to share results of research and applied work related to performance assessments conducted at United States Department of Energy sites. Key findings of this research and applied work are discussed and recommendations for future activities are provided.

  6. Human and management factors in probabilistic risk analysis: the SAM approach and observations from recent applications

    Energy Technology Data Exchange (ETDEWEB)

    Elisabeth Pate-Cornell, M.; Murphy, Dean M

    1996-08-01

    Most severe industrial accidents have been shown to involve one or more human errors and these are generally rooted in management problems. The objective of this paper is to draw some conclusions from the experience that we have acquired from three different studies of this phenomenon: (1) the Piper Alpha accident including problems of operations management and fire risks on-board offshore platforms, (2) the management of the heat shield of the NASA space shuttle orbiter, and (3) the roots of patient risks in anaesthesia. This paper describes and illustrates the SAM approach (System-Action-Management) that was developed and used in these studies to link the probabilities of system failures to human and management factors. This SAM model includes: first, a probabilistic risk analysis of the physical system, second, an analysis of the decisions and actions that affect the probabilities of its basic events, and third, a study of the management factors that influence those decisions and actions. In the three initial studies, the analytical links (conditional probabilities) among these three submodels were coarsely quantified based on statistical data whenever available, or most often, on expert opinions. This paper describes some observations that were made across these three studies, for example, the importance of the informal reward system, the difficulties in the communication of uncertainties, the problems of managing resource constraints, and the safety implications of the short cuts that they often imply.
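
    A toy numerical chain in the spirit of the SAM decomposition (all probabilities invented): the management state conditions the probability of an unsafe action, which conditions a basic-event probability, which enters the PRA.

```python
# Level 3 (management) -> level 2 (actions) -> level 1 (physical system),
# chained through conditional probabilities. Numbers are illustrative only.

p_mgmt_poor = 0.3                        # P(management factor in poor state)
p_action = {"poor": 0.10, "good": 0.02}  # P(unsafe action | management state)
p_event = {True: 5e-3, False: 5e-4}      # P(basic event | unsafe action?)

# Level 3 -> level 2: marginal probability of the unsafe action.
p_unsafe = p_mgmt_poor * p_action["poor"] + (1 - p_mgmt_poor) * p_action["good"]

# Level 2 -> level 1: marginal probability of the basic event.
p_basic = p_unsafe * p_event[True] + (1 - p_unsafe) * p_event[False]

# Level 1: suppose the PRA gives P(failure) = p_basic * p_other for a
# two-event minimal cut set with an independent second event.
p_other = 1e-2
print(f"P(unsafe action)  = {p_unsafe:.4f}")
print(f"P(basic event)    = {p_basic:.2e}")
print(f"P(system failure) = {p_basic * p_other:.2e}")
```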

  7. SARGEN-IV Project. Development of new methodologies for safety analysis of Generation IV reactors; Proyecto SARGEN-IV. Desarrollo de nuevas metodologias de analisis de seguridad para reactores de Generacion IV

    Energy Technology Data Exchange (ETDEWEB)

    Queral, C.; Gallego, E.; Jimenez, G.

    2013-07-01

    The main result of this paper is a proposal to add new ingredients to the safety analysis methodologies for Generation IV reactors, integrating the features of probabilistic safety analysis within the deterministic analysis. This ensures a higher degree of integration between the classical deterministic and probabilistic methodologies.

  8. Development of a Probabilistic Flood Hazard Assessment (PFHA) for the nuclear safety

    Science.gov (United States)

    Ben Daoued, Amine; Guimier, Laurent; Hamdi, Yasser; Duluc, Claire-Marie; Rebour, Vincent

    2016-04-01

    The purpose of this study is to lay the basis for a probabilistic flood hazard assessment (PFHA). Probabilistic assessment of external floods has become a current topic of interest to the nuclear scientific community. Probabilistic approaches complement deterministic approaches by exploring a set of scenarios and associating a probability with each of them. These approaches aim to identify all possible failure scenarios and combine their probabilities, in order to cover all possible sources of risk. They are based on the distributions of the initiators and/or of the variables characterizing these initiators. The PFHA can characterize, for example, the water level at a defined point of interest in the nuclear site. This probabilistic flood hazard characterization takes into account all the phenomena that can contribute to the flooding of the site. The main steps of the PFHA are: i) identification of flooding phenomena (rain, sea water level, etc.) and screening of the phenomena relevant to the nuclear site, ii) identification and probabilization of the parameters associated with the selected flooding phenomena, iii) propagation of the probabilized parameters from the source to the point of interest in the site, and iv) derivation of hazard curves and aggregation of the flooding phenomena contributions at the point of interest, taking the uncertainties into account. Within this framework, the methodology of the PFHA has been developed for several flooding phenomena (rain and/or sea water level, etc.) and then implemented and tested with a simplified case study. Following the same logic, our study is still in progress to take into account other flooding phenomena and to carry out more case studies.
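
    A minimal sketch of the aggregation step iv) under an independence assumption between phenomena (invented exceedance curves): the aggregated annual exceedance probability at the point of interest is the complement of the joint non-exceedance.

```python
import numpy as np

# Each phenomenon contributes an annual exceedance probability curve for the
# water level at the site; assuming independent phenomena, the curves are
# aggregated as the complement of the joint non-exceedance. Values invented.
levels = np.array([0.5, 1.0, 1.5, 2.0])        # water level [m]
p_rain = np.array([2e-2, 5e-3, 8e-4, 1e-4])    # rain-driven flooding
p_sea = np.array([1e-2, 2e-3, 5e-4, 5e-5])     # sea-level-driven flooding

p_agg = 1.0 - (1.0 - p_rain) * (1.0 - p_sea)

for z, p in zip(levels, p_agg):
    print(f"annual P(level > {z} m) = {p:.2e}")
```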

  9. Probabilistic FE analysis of a cable stayed composite bridge

    NARCIS (Netherlands)

    Boer, A. de; Waarts, P.H.

    1999-01-01

    This paper describes the design of a new cable stayed composite bridge near Kampen in the Netherlands. In the design process, the safety of bridges is ensured by means of partial safety factors for both strength and load parameters. As a result it is generally accepted that the structure as a whole

  10. An overview of the evolution of human reliability analysis in the context of probabilistic risk assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Bley, Dennis C. (Buttonwood Consulting Inc., Oakton, VA); Lois, Erasmia (U.S. Nuclear Regulatory Commission, Washington, DC); Kolaczkowski, Alan M. (Science Applications International Corporation, Eugene, OR); Forester, John Alan; Wreathall, John (John Wreathall and Co., Dublin, OH); Cooper, Susan E. (U.S. Nuclear Regulatory Commission, Washington, DC)

    2009-01-01

    Since the Reactor Safety Study in the early 1970's, human reliability analysis (HRA) has been evolving towards a better ability to account for the factors and conditions that can lead humans to take unsafe actions and thereby provide better estimates of the likelihood of human error for probabilistic risk assessments (PRAs). The purpose of this paper is to provide an overview of recent reviews of operational events and advances in the behavioral sciences that have impacted the evolution of HRA methods and contributed to improvements. The paper discusses the importance of human errors in complex human-technical systems, examines why humans contribute to accidents and unsafe conditions, and discusses how lessons learned over the years have changed the perspective and approach for modeling human behavior in PRAs of complicated domains such as nuclear power plants. It is argued that it has become increasingly more important to understand and model the more cognitive aspects of human performance and to address the broader range of factors that have been shown to influence human performance in complex domains. The paper concludes by addressing the current ability of HRA to adequately predict human failure events and their likelihood.

  11. Development of a Probabilistic Dynamic Synthesis Method for the Analysis of Nondeterministic Structures

    Science.gov (United States)

    Brown, A. M.

    1998-01-01

    Accounting for the statistical geometric and material variability of structures in analysis has been a topic of considerable research for the last 30 years. The determination of quantifiable measures of statistical probability of a desired response variable, such as natural frequency, maximum displacement, or stress, to replace experience-based "safety factors" has been a primary goal of these studies. There are, however, several problems associated with their satisfactory application to realistic structures, such as bladed disks in turbomachinery. These include the accurate definition of the input random variables (rv's), the large size of the finite element models frequently used to simulate these structures, which makes even a single deterministic analysis expensive, and accurate generation of the cumulative distribution function (CDF) necessary to obtain the probability of the desired response variables. The research presented here applies a methodology called probabilistic dynamic synthesis (PDS) to solve these problems. The PDS method uses dynamic characteristics of substructures measured from modal test as the input rv's, rather than "primitive" rv's such as material or geometric uncertainties. These dynamic characteristics, which are the free-free eigenvalues, eigenvectors, and residual flexibility (RF), are readily measured and for many substructures, a reasonable sample set of these measurements can be obtained. The statistics for these rv's accurately account for the entire random character of the substructure. Using the RF method of component mode synthesis, these dynamic characteristics are used to generate reduced-size sample models of the substructures, which are then coupled to form system models. These sample models are used to obtain the CDF of the response variable by either applying Monte Carlo simulation or by generating data points for use in the response surface reliability method, which can perform the probabilistic analysis with an order of

  12. System Safety Analysis Application Guide. Safety Analysis Report Update Program

    Energy Technology Data Exchange (ETDEWEB)

    1993-05-01

    Martin Marietta Energy Systems, Inc., (Energy Systems) is committed to performing and documenting safety analyses for facilities it manages for the Department of Energy (DOE). Safety analyses are performed to identify hazards and potential accidents; to analyze the adequacy of measures taken to eliminate, control, or mitigate hazards; and to evaluate potential accidents and determine associated risks. Safety Analysis Reports (SARs) are prepared to document the safety analysis to ensure facilities can be operated safely and in accordance with regulations. SARs include Technical Safety Requirements (TSRs), which are specific technical and administrative requirements that prescribe limits and controls to ensure safe operation of DOE facilities. These documented descriptions and analyses contribute to the authorization basis for facility operation. Energy Systems has established a process to perform Unreviewed Safety Question Determinations (USQDs) for planned changes and as-found conditions that are not described and analyzed in existing safety analyses. The process evaluates changes and as-found conditions to determine whether revisions to the authorization basis must be reviewed and approved by DOE. There is an Unreviewed Safety Question (USQ) if a change introduces conditions not bounded by the facility authorization basis. When it is necessary to request DOE approval to revise the authorization basis, preparation of a System Safety Analysis (SSA) is recommended. This application guide describes the process of preparing an SSA and the desired contents of an SSA. Guidance is provided on how to identify items and practices which are important to safety; how to determine the credibility and significance of consequences of proposed accident scenarios; how to evaluate accident prevention and mitigation features of the planned change; and how to establish special requirements to ensure that a change can be implemented with adequate safety.

  13. Probabilistic Structural Analysis of Ship Hull Longitudinal Strength.

    Science.gov (United States)

    1980-12-01

    [The abstract is OCR-garbled at this point; equations (12) and (13), which define the safety index and the central safety factor in terms of the coefficients of variation V_Z and V_S of the strength and load variables, could not be recovered.] ...surface finish, load, heat treatment, direct surface environment, temperature, time, corrosion, etc. The approach given by Equation (18) has been

  14. Classroom Use of Microcomputer Graphics and Probabilistic Sensitivity Analysis to Demonstrate Petroleum Engineering Concepts.

    Science.gov (United States)

    Whitman, David L.; Terry, R. E.

    1984-01-01

    A computer program that performs Monte Carlo simulation (probabilistic sensitivity analysis) has been developed for the Vic-20 microcomputer. The theory of Monte Carlo simulation, program capabilities and operation, and sample calculations are discussed. Student comments on the program are included. (JN)
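
    A sketch of the kind of calculation such a program performs, here a volumetric oil-in-place estimate with triangular input distributions (all values invented); percentiles of the output distribution summarize the sensitivity of the estimate to the uncertain inputs.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000

# Volumetric oil-in-place with triangular (min, mode, max) inputs.
area = rng.triangular(400, 600, 900, n)      # drainage area [acres]
h = rng.triangular(10, 20, 35, n)            # net pay [ft]
phi = rng.triangular(0.10, 0.18, 0.25, n)    # porosity
sw = rng.triangular(0.20, 0.30, 0.45, n)     # water saturation
bo = rng.triangular(1.1, 1.25, 1.4, n)       # formation volume factor

ooip = 7758 * area * h * phi * (1 - sw) / bo  # stock-tank barrels (7758 bbl/acre-ft)

print(f"P10 / P50 / P90 OOIP (MMbbl): "
      f"{np.percentile(ooip, 10)/1e6:.1f} / "
      f"{np.percentile(ooip, 50)/1e6:.1f} / "
      f"{np.percentile(ooip, 90)/1e6:.1f}")
```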

  15. Static probabilistic timing analysis for real-time systems using random replacement caches

    NARCIS (Netherlands)

    Altmeyer, S.; Cucu-Grosjean, L.; Davis, R.I.

    2015-01-01

    In this paper, we investigate static probabilistic timing analysis (SPTA) for single processor real-time systems that use a cache with an evict-on-miss random replacement policy. We show that previously published formulae for the probability of a cache hit can produce results that are optimistic and
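
    For orientation, the simple per-access survival model for an N-way evict-on-miss random replacement cache is shown below; it is the style of formula the paper examines (and shows can be optimistic), not the paper's corrected SPTA.

```python
# A cached block survives one miss with probability (N-1)/N under random
# replacement, so after k intervening misses P(hit) = ((N-1)/N)**k.

def hit_probability(n_ways: int, k_misses: int) -> float:
    """P(block still cached after k_misses evictions elsewhere)."""
    return ((n_ways - 1) / n_ways) ** k_misses

for k in (1, 4, 16, 64):
    print(f"N=8 ways, {k:>2} intervening misses: P(hit) = {hit_probability(8, k):.3f}")
```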

  16. Statistical analysis of probabilistic models of software product lines with quantitative constraints

    DEFF Research Database (Denmark)

    Beek, M.H. ter; Legay, A.; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking for the analysis of probabilistic models of software product lines with complex quantitative constraints and advanced feature installation options. Such models are specified in the feature-oriented language QFLan, a rich process algebra...

  17. Synchronization Analysis of Master-Slave Probabilistic Boolean Networks.

    Science.gov (United States)

    Lu, Jianquan; Zhong, Jie; Li, Lulu; Ho, Daniel W C; Cao, Jinde

    2015-08-28

    In this paper, we analyze the synchronization problem of master-slave probabilistic Boolean networks (PBNs). The master Boolean network (BN) is a deterministic BN, while the slave BN is determined by a series of possible logical functions, each with a certain probability, at each discrete time point. In this paper, we first define the synchronization of master-slave PBNs with probability one, and then we investigate synchronization with probability one. By resorting to a new approach called the semi-tensor product (STP), the master-slave PBNs are expressed in equivalent algebraic forms. Based on the algebraic form, some necessary and sufficient criteria are derived to guarantee synchronization with probability one. Further, we study the synchronization of master-slave PBNs in probability. Synchronization in probability implies that for any initial states, the master BN can be synchronized by the slave BN with a certain probability, while synchronization with probability one implies that the master BN can be synchronized by the slave BN with probability one. Based on the equivalent algebraic form, some efficient conditions are derived to guarantee synchronization in probability. Finally, several numerical examples are presented to show the effectiveness of the main results.
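
    A Monte Carlo toy (one-node networks, invented update rules and probabilities; the paper's actual analysis is algebraic, via the STP) that estimates the probability that slave and master states agree after a horizon.

```python
import random

random.seed(3)

def master_step(x):           # deterministic master BN: negation
    return 1 - x

def slave_step(y, x):         # slave picks one of two rules at random
    if random.random() < 0.8:
        return master_step(x)     # rule 1: copy the master's next state
    return y                      # rule 2: hold the previous state

trials, horizon, synced = 10_000, 50, 0
for _ in range(trials):
    x, y = random.randint(0, 1), random.randint(0, 1)
    for _ in range(horizon):
        x, y = master_step(x), slave_step(y, x)
    synced += (x == y)

print(f"estimated P(synchronized at t={horizon}) = {synced / trials:.3f}")
```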

  18. VIPR: A probabilistic algorithm for analysis of microbial detection microarrays

    Directory of Open Access Journals (Sweden)

    Holbrook Michael R

    2010-07-01

    Full Text Available Abstract Background All infectious disease oriented clinical diagnostic assays in use today focus on detecting the presence of a single, well defined target agent or a set of agents. In recent years, microarray-based diagnostics have been developed that greatly facilitate the highly parallel detection of multiple microbes that may be present in a given clinical specimen. While several algorithms have been described for interpretation of diagnostic microarrays, none of the existing approaches is capable of incorporating training data generated from positive control samples to improve performance. Results To specifically address this issue we have developed a novel interpretive algorithm, VIPR (Viral Identification using a PRobabilistic algorithm), which uses Bayesian inference to capitalize on empirical training data to optimize detection sensitivity. To illustrate this approach, we have focused on the detection of viruses that cause hemorrhagic fever (HF) using a custom HF-virus microarray. VIPR was used to analyze 110 empirical microarray hybridizations generated from 33 distinct virus species. An accuracy of 94% was achieved as measured by leave-one-out cross validation. Conclusions VIPR outperformed previously described algorithms for this dataset. The VIPR algorithm has potential to be broadly applicable to clinical diagnostic settings, wherein positive controls are typically readily available for generation of training data.
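
    A toy detector in the spirit of the Bayesian training-data idea (two hypothetical viruses, three probes, Gaussian intensity models; not VIPR's actual model): training hybridizations fit per-probe distributions, and a new array is scored by posterior probability.

```python
import numpy as np

rng = np.random.default_rng(4)

# Simulated training hybridizations: per-probe log intensities per virus.
train = {
    "virusA": rng.normal([9.0, 3.0, 3.0], 0.8, (20, 3)),
    "virusB": rng.normal([3.0, 8.5, 3.5], 0.8, (20, 3)),
}
models = {v: (x.mean(axis=0), x.std(axis=0)) for v, x in train.items()}

def log_likelihood(obs, mean, std):
    # Gaussian log-likelihood up to an additive constant.
    return float(np.sum(-0.5 * ((obs - mean) / std) ** 2 - np.log(std)))

obs = np.array([8.6, 2.9, 3.4])                  # new hybridization
logl = {v: log_likelihood(obs, m, s) for v, (m, s) in models.items()}
post = np.exp(np.array(list(logl.values())) - max(logl.values()))
post /= post.sum()                               # uniform priors assumed

for v, p in zip(logl, post):
    print(f"P({v} | data) = {p:.3f}")
```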

  19. Risk assessment methods in radiotherapy: Probabilistic safety assessment (PSA); Los metodos de analisis de riesgo en radioterapia: Analisis Probabilistico de seguridad (APS)

    Energy Technology Data Exchange (ETDEWEB)

    Ramirez Vera, M. L.; Perez Mulas, A.; Delgado, J. M.; Barrientos Ontero, M.; Somoano, F.; Alvarez Garcia, C.; Rodriguez Marti, M.

    2011-07-01

    The understanding of accidents that have occurred in radiotherapy and the lessons learned from them are very useful to prevent repetition, but there are other risks that have not been detected to date. With a view to identifying and preventing such risks, proactive methods successfully applied in other fields, such as probabilistic safety assessment (PSA), have been developed. (Author)

  20. BETASCAN: probable beta-amyloids identified by pairwise probabilistic analysis.

    Directory of Open Access Journals (Sweden)

    Allen W Bryan

    2009-03-01

    Full Text Available Amyloids and prion proteins are clinically and biologically important beta-structures, whose supersecondary structures are difficult to determine by standard experimental or computational means. In addition, significant conformational heterogeneity is known or suspected to exist in many amyloid fibrils. Recent work has indicated the utility of pairwise probabilistic statistics in beta-structure prediction. We develop here a new strategy for beta-structure prediction, emphasizing the determination of beta-strands and pairs of beta-strands as fundamental units of beta-structure. Our program, BETASCAN, calculates likelihood scores for potential beta-strands and strand-pairs based on correlations observed in parallel beta-sheets. The program then determines the strands and pairs with the greatest local likelihood for all of the sequence's potential beta-structures. BETASCAN suggests multiple alternate folding patterns and assigns relative a priori probabilities based solely on amino acid sequence, probability tables, and pre-chosen parameters. The algorithm compares favorably with the results of previous algorithms (BETAPRO, PASTA, SALSA, TANGO, and Zyggregator) in beta-structure prediction and amyloid propensity prediction. Accurate prediction is demonstrated for experimentally determined amyloid beta-structures, for a set of known beta-aggregates, and for the parallel beta-strands of beta-helices, amyloid-like globular proteins. BETASCAN is able both to detect beta-strands with higher sensitivity and to detect the edges of beta-strands in a richly beta-like sequence. For two proteins (Abeta and Het-s), there exist multiple sets of experimental data implying contradictory structures; BETASCAN is able to detect each competing structure as a potential structure variant. The ability to correlate multiple alternate beta-structures to experiment opens the possibility of computational investigation of prion strains and structural heterogeneity of amyloid

  1. A probabilistic approach to remote compositional analysis of planetary surfaces

    Science.gov (United States)

    Lapotre, Mathieu G. A.; Ehlmann, Bethany L.; Minson, Sarah E.

    2017-01-01

    Reflected light from planetary surfaces provides information, including mineral/ice compositions and grain sizes, by study of albedo and absorption features as a function of wavelength. However, deconvolving the compositional signal in spectra is complicated by the nonuniqueness of the inverse problem. Trade-offs between mineral abundances and grain sizes in setting reflectance, instrument noise, and systematic errors in the forward model are potential sources of uncertainty, which are often unquantified. Here we adopt a Bayesian implementation of the Hapke model to determine sets of acceptable-fit mineral assemblages, as opposed to single best fit solutions. We quantify errors and uncertainties in mineral abundances and grain sizes that arise from instrument noise, compositional end members, optical constants, and systematic forward model errors for two suites of ternary mixtures (olivine-enstatite-anorthite and olivine-nontronite-basaltic glass) in a series of six experiments in the visible-shortwave infrared (VSWIR) wavelength range. We show that grain sizes are generally poorly constrained from VSWIR spectroscopy. Abundance and grain size trade-offs lead to typical abundance errors of ≤1 wt % (occasionally up to ~5 wt %), while ~3% noise in the data increases errors by up to ~2 wt %. Systematic errors further increase inaccuracies by a factor of 4. Finally, phases with low spectral contrast or inaccurate optical constants can further increase errors. Overall, typical errors in abundance are <10%, but sometimes significantly increase for specific mixtures, prone to abundance/grain-size trade-offs that lead to high unmixing uncertainties. These results highlight the need for probabilistic approaches to remote determination of planetary surface composition.
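
    A minimal Metropolis sketch of the Bayesian inversion idea, with a linear two-endmember mixture standing in for the Hapke model and invented spectra: the output is a posterior set of acceptable abundances rather than a single best fit.

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented endmember reflectance spectra and a noisy mixed observation.
wavelengths = np.linspace(0.4, 2.5, 50)
end_a = 0.3 + 0.2 * np.sin(3 * wavelengths)
end_b = 0.6 - 0.1 * wavelengths

true_f = 0.35
data = true_f * end_a + (1 - true_f) * end_b + rng.normal(0, 0.01, 50)

def log_post(f):
    if not 0.0 <= f <= 1.0:
        return -np.inf                           # uniform prior on [0, 1]
    resid = data - (f * end_a + (1 - f) * end_b)
    return -0.5 * np.sum((resid / 0.01) ** 2)    # Gaussian noise model

f, samples = 0.5, []
for _ in range(20_000):
    prop = f + rng.normal(0, 0.02)               # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(f):
        f = prop
    samples.append(f)

post = np.array(samples[5_000:])                 # discard burn-in
print(f"abundance of endmember A: {post.mean():.3f} +/- {post.std():.3f}")
```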

  2. Investigation of techniques for the development of seismic design basis using the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D.L.; Boissonnade, A.C.; Short, C.M.

    1998-04-01

    The Nuclear Regulatory Commission asked Lawrence Livermore National Laboratory to form a group of experts to assist them in revising the seismic and geologic siting criteria for nuclear power plants, Appendix A to 10 CFR Part 100. This document describes a deterministic approach for determining a Safe Shutdown Earthquake (SSE) Ground Motion for a nuclear power plant site. One disadvantage of this approach is the difficulty of integrating differences of opinions and differing interpretations into seismic hazard characterization. In answer to this, probabilistic seismic hazard assessment methodologies incorporate differences of opinion and interpretations among earth science experts. For this reason, probabilistic hazard methods were selected for determining SSEs for the revised regulation, 10 CFR Part 100.23. However, because these methodologies provide a composite analysis of all possible earthquakes that may occur, they do not provide the familiar link between seismic design loading requirements and engineering design practice. Therefore, approaches used to characterize seismic events (magnitude and distance) which best represent the ground motion level determined with the probabilistic hazard analysis were investigated. This report summarizes investigations conducted at 69 nuclear reactor sites in the central and eastern U.S. for determining SSEs using probabilistic analyses. Alternative techniques are presented along with justification for key choices. 16 refs., 32 figs., 60 tabs.

  3. The European ASAMPSA_E project : towards guidance to model the impact of high amplitude natural hazards in the probabilistic safety assessment of nuclear power plants. Information on the project progress and needs from the geosciences.

    Science.gov (United States)

    Raimond, Emmanuel; Decker, Kurt; Guigueno, Yves; Klug, Joakim; Loeffler, Horst

    2015-04-01

    The Fukushima nuclear accident in Japan resulted from the combination of two correlated extreme external events (earthquake and tsunami). The consequences, in particular flooding, went beyond what was considered in the initial engineering design of nuclear power plants (NPPs). Such situations can in theory be identified using probabilistic safety assessment (PSA) methodology. PSA results may then lead industry (system suppliers and utilities) or safety authorities to take appropriate decisions to reinforce the defence-in-depth of the NPP against low-probability events with high-amplitude consequences. In reality, the development of such PSA remains a challenging task. Definitions of the design basis of NPPs, for example, require data on events with occurrence probabilities not higher than 10⁻⁴ per year. Today, even lower probabilities, down to 10⁻⁸, are expected and typically used for probabilistic safety analyses (PSA) of NPPs and the examination of so-called design extension conditions. Modelling the combinations of natural or man-made hazards that can affect an NPP, and assigning them a meaningful probability of occurrence, seems to be difficult. The European project ASAMPSA_E (www.asampsa.eu) gathers more than 30 organizations (industry, research, safety control) from Europe, the US and Japan and aims at identifying meaningful practices to extend the scope and the quality of the existing probabilistic safety analyses developed for nuclear power plants. It offers a framework to discuss, at a technical level, how "extended PSA" can be developed efficiently and be used to verify whether the robustness of nuclear power plants (NPPs) in their environment is sufficient. The paper will present the objectives of this project and some first lessons, and introduce the type of guidance being developed. It will explain the need for expertise from the geosciences to support nuclear safety assessment in the different areas (seismotectonic, hydrological, meteorological and biological

  4. Fault tree analysis for integrated and probabilistic risk analysis of drinking water systems.

    Science.gov (United States)

    Lindhe, Andreas; Rosén, Lars; Norberg, Tommy; Bergstedt, Olof

    2009-04-01

    Drinking water systems are vulnerable and subject to a wide range of risks. To avoid sub-optimisation of risk-reduction options, risk analyses need to include the entire drinking water system, from source to tap. Such an integrated approach demands tools that are able to model interactions between different events. Fault tree analysis is a risk estimation tool with the ability to model interactions between events. Using fault tree analysis on an integrated level, a probabilistic risk analysis of a large drinking water system in Sweden was carried out. The primary aims of the study were: (1) to develop a method for integrated and probabilistic risk analysis of entire drinking water systems; and (2) to evaluate the applicability of Customer Minutes Lost (CML) as a measure of risk. The analysis included situations where no water is delivered to the consumer (quantity failure) and situations where water is delivered but does not comply with water quality standards (quality failure). Hard data as well as expert judgements were used to estimate probabilities of events and uncertainties in the estimates. The calculations were performed using Monte Carlo simulations. CML is shown to be a useful measure of risks associated with drinking water systems. The method presented provides information on risk levels, probabilities of failure, failure rates and downtimes of the system. This information is available for the entire system as well as its different sub-systems. Furthermore, the method enables comparison of the results with performance targets and acceptable levels of risk. The method thus facilitates integrated risk analysis and consequently helps decision-makers to minimise sub-optimisation of risk-reduction options.
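
    A minimal sketch of the integrated idea (structure and numbers invented, not the Gothenburg system's fault tree): uncertain subsystem failure probabilities are sampled, combined through an OR gate, and summarized in a Customer Minutes Lost measure.

```python
import numpy as np

rng = np.random.default_rng(6)
n = 100_000

# Uncertain daily failure probabilities for three subsystems (invented
# hyperparameters); sampling them propagates parameter uncertainty.
p_source = rng.beta(2, 998, n)        # raw-water source unavailable
p_treat = rng.beta(1, 499, n)         # treatment failure
p_net = rng.beta(3, 1497, n)          # distribution network failure

# Quantity failure if any subsystem fails (OR gate, independence assumed).
p_quantity = 1 - (1 - p_source) * (1 - p_treat) * (1 - p_net)

downtime_min = 8 * 60                 # assumed mean downtime per failure [min]
cml = p_quantity * 365 * downtime_min # expected customer minutes lost per year

print(f"P(quantity failure per day): mean = {p_quantity.mean():.2e}")
print(f"CML per year: median = {np.median(cml):.0f}, "
      f"95th percentile = {np.percentile(cml, 95):.0f}")
```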

  5. A Probabilistic Decision-Making Scoring System for Quality and Safety Management in Aloreña de Málaga Table Olive Processing

    Science.gov (United States)

    Ruiz Bellido, Miguel Á.; Valero, Antonio; Medina Pradas, Eduardo; Romero Gil, Verónica; Rodríguez-Gómez, Francisco; Posada-Izquierdo, Guiomar D.; Rincón, Francisco; Possas, Aricia; García-Gimeno, Rosa M.; Arroyo-López, Francisco N.

    2017-01-01

    Table olives are among the most representative and most consumed fermented vegetables in Mediterranean countries. However, there is an evident lack of standardization of production processes and HACCP systems, implying the need to establish decision-making tools allowing their commercialization and shelf-life extension. The present work aims at developing a decision-making scoring system by means of a probabilistic assessment to standardize the production process of Aloreña de Málaga table olives, based on the identification of potential hazards or deficiencies in hygienic processes for the subsequent implementation of corrective measures. A total of 658 microbiological and physico-chemical data were collected over three consecutive olive campaigns (2014–2016) to measure the variability and relative importance of each elaboration step on total hygienic quality and product safety. Three representative companies were visited to collect samples from food-contact surfaces, olive fruits, brines, air environment, olive dressings, water tanks, and finished/packaged products. A probabilistic assessment was done based on the establishment of Performance Hygiene and Safety Scores (PHSS, 0–100%) through a standardized system for evaluating product acceptability. The mean value of the global PHSS for the Aloreña de Málaga table olives processing (PHSSFTOT) was 64.82% (90th CI: 52.78–76.39%), indicating the high variability among facilities in the evaluated processing steps on final product quality and safety. Washing and cracking, and selection and addition of olive dressings, were detected as the most deficient steps in relation to PHSSFi values (p < 0.05) (mean = 53.02 and 56.62%, respectively). The relative contribution of each processing step was quantified by different experts (n = 25) from the Aloreña de Málaga table olive sector through a weighted PHSS (PHSSw). The mean value of PHSSw was 65.53% (90th CI: 53.12–77.52%). The final processing steps obtained

  6. A Probabilistic Decision-Making Scoring System for Quality and Safety Management in Aloreña de Málaga Table Olive Processing

    Directory of Open Access Journals (Sweden)

    Miguel Á. Ruiz Bellido

    2017-11-01

    Full Text Available Table olives are among the most representative and most consumed fermented vegetables in Mediterranean countries. However, there is an evident lack of standardization of production processes and HACCP systems, implying the need to establish decision-making tools allowing their commercialization and shelf-life extension. The present work aims at developing a decision-making scoring system by means of a probabilistic assessment to standardize the production process of Aloreña de Málaga table olives, based on the identification of potential hazards or deficiencies in hygienic processes for the subsequent implementation of corrective measures. A total of 658 microbiological and physico-chemical data were collected over three consecutive olive campaigns (2014–2016) to measure the variability and relative importance of each elaboration step on total hygienic quality and product safety. Three representative companies were visited to collect samples from food-contact surfaces, olive fruits, brines, air environment, olive dressings, water tanks, and finished/packaged products. A probabilistic assessment was done based on the establishment of Performance Hygiene and Safety Scores (PHSS, 0–100%) through a standardized system for evaluating product acceptability. The mean value of the global PHSS for the Aloreña de Málaga table olives processing (PHSSFTOT) was 64.82% (90th CI: 52.78–76.39%), indicating the high variability among facilities in the evaluated processing steps on final product quality and safety. Washing and cracking, and selection and addition of olive dressings, were detected as the most deficient steps in relation to PHSSFi values (p < 0.05) (mean = 53.02 and 56.62%, respectively). The relative contribution of each processing step was quantified by different experts (n = 25) from the Aloreña de Málaga table olive sector through a weighted PHSS (PHSSw). The mean value of PHSSw was 65.53% (90th CI: 53.12–77.52%). The final processing steps

  7. Probabilistic risk assessment: Number 219

    Energy Technology Data Exchange (ETDEWEB)

    Bari, R.A.

    1985-11-13

    This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)

  8. Probabilistic analysis of PWR and BWR fuel rod performance using the code CASINO-SLEUTH

    Energy Technology Data Exchange (ETDEWEB)

    Bull, A.J.

    1987-05-01

    This paper presents a brief description of the Monte Carlo and response surface techniques used in the code, and a probabilistic analysis of fuel rod performance in PWR and BWR applications. The analysis shows that fission gas release predictions are very sensitive to changes in certain of the code's inputs, identifies the most dominant input parameters and compares their effects in the two cases.

  9. Risk Analysis of Multipurpose Reservoir Real-time Operation based on Probabilistic Hydrologic Forecasting

    Science.gov (United States)

    Liu, P.

    2011-12-01

    Quantitative analysis of the risk for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based probabilistic hydrologic forecasting, which outputs many inflow scenarios or traces, does well in depicting not only the marginal distribution of inflows but also their correlations. This motivates us to analyze the reservoir operating risk by feeding probabilistic hydrologic forecasts into reservoir real-time operation. The proposed procedure involves: (1) based upon Bayesian inference, two alternative techniques, the generalized likelihood uncertainty estimation (GLUE) and Markov chain Monte Carlo (MCMC), are implemented for producing probabilistic hydrologic forecasts; (2) the reservoir risk is defined as the ratio of the number of traces that exceed (or fall below) the critical value to the total number of traces; and (3) a multipurpose reservoir operation model is built to produce Pareto solutions for trade-offs between risks and profits, with the probabilistic hydrologic forecasts as input. In a case study of China's Three Gorges Reservoir, it is found that the reservoir real-time operation risks can be estimated and minimized with the proposed methods, which holds great potential benefit for decision making and for choosing the most realistic solution.
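
    The trace-counting risk in step (2) reduces to a few lines of code once an ensemble of inflow traces is available. The sketch below is a minimal illustration of that idea, not the authors' code: the one-line mass-balance routing, the capacity, and the critical flood level are invented placeholders.

    ```python
    import numpy as np

    def operating_risk(inflow_traces, release, capacity, critical_level):
        """Risk = fraction of ensemble traces whose simulated storage
        exceeds a critical flood level during the horizon."""
        # toy mass balance: cumulative inflow minus planned release
        storage = np.cumsum(inflow_traces - release, axis=1)
        storage = np.clip(storage, 0.0, capacity)
        exceed = storage.max(axis=1) >= critical_level   # per-trace flood flag
        return exceed.mean()                             # ratio of exceeding traces

    # 1000 synthetic forecast traces over a 10-step horizon
    rng = np.random.default_rng(0)
    traces = rng.lognormal(mean=2.0, sigma=0.4, size=(1000, 10))
    risk = operating_risk(traces, release=np.full(10, 7.0),
                          capacity=120.0, critical_level=100.0)
    print(f"estimated flood risk: {risk:.3f}")
    ```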

  10. Exploratory analysis of the safety climate and safety behavior relationship.

    Science.gov (United States)

    Cooper, M D; Phillips, R A

    2004-01-01

    Safety climate refers to the degree to which employees believe true priority is given to organizational safety performance, and its measurement is thought to provide an "early warning" of potential safety system failure(s). However, researchers have struggled over the last 25 years to find empirical evidence to demonstrate actual links between safety climate and safety performance. A safety climate measure was distributed to manufacturing employees at the beginning of a behavioral safety initiative and redistributed one year later. Multiple regression analysis demonstrated that perceptions of the importance of safety training were predictive of actual levels of safety behavior. The results also demonstrate that the magnitude of change in perceptual safety climate scores will not necessarily match actual changes (r=0.56, n.s.) in employees' safety behavior. This study obtained empirical links between safety climate scores and actual safety behavior. Confirming and contradicting findings within the extant safety climate literature, the results strongly suggest that the hypothesized climate-behavior-accident path is not as clear cut as commonly assumed. A statistical link between safety climate perceptions and safety behavior will be obtained when sufficient behavioral data are collected. The study further supports the use of safety climate measures as useful diagnostic tools in ascertaining employees' perceptions of the way that safety is being operationalized.

  11. Study on the Application of Probabilistic Tsunami Hazard Analysis for the Nuclear Power Plant Site in Korean Peninsula

    Science.gov (United States)

    Rhee, H. M.; Kim, M.; Sheen, D. H.; Choi, I. K.

    2014-12-01

    The need to study tsunami hazard assessment for Nuclear Power Plant (NPP) sites has been recognized since the Fukushima event of 2011, and it is emphasized because all of the NPPs on the Korean Peninsula are located in coastal regions. The tsunami hazard is expressed as the annual exceedance probability of wave heights. The methodology for tsunami hazard analysis is based on that for seismic hazard analysis, which has been performed using both deterministic and probabilistic methods. Recently, the probabilistic method has received more attention than the deterministic method because the uncertainties of the hazard analysis can be considered through the logic tree approach. In this study, a probabilistic tsunami hazard analysis for the Uljin NPP site was performed using the information on fault sources published by the Atomic Energy Society of Japan (AESJ). The wave parameter is the parameter that differs most from seismic hazard analysis; it can be estimated from the results of tsunami propagation analysis. TSUNAMI_ver1.0, developed by the Japan Nuclear Energy Safety organization (JNES), was used for the tsunami simulations. Eighty tsunami simulations were performed and the wave parameters were then estimated. To reduce the sensitivity introduced by the location of individual sampling points, the wave parameters were estimated from groups of sampling points. The probability density function of tsunami height was computed using the recurrence intervals and the wave parameters, and the exceedance probability distribution was then calculated from the probability density function. The tsunami hazards for the sampling groups were calculated, and fractile curves showing the uncertainties of the input parameters were estimated from the hazards using the round-robin algorithm. In general, tsunami hazard analysis focuses on the maximum wave heights. But the minimum wave height should be considered
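
    The exceedance-probability step can be illustrated compactly: given a maximum wave height and an annual occurrence rate for each scenario earthquake, the annual exceedance rate at any height is the sum of the rates of the scenarios that exceed it. This is our simplified reading of the procedure, with invented heights and recurrence intervals.

    ```python
    import numpy as np

    # hypothetical scenario results: max wave height (m) at the site and
    # annual occurrence rate (1 / recurrence interval) of each fault source
    heights = np.array([1.2, 2.5, 0.8, 4.1, 3.3, 1.9])
    rates = np.array([1/500, 1/2000, 1/300, 1/10000, 1/5000, 1/1000])

    def annual_exceedance(h_grid, heights, rates):
        """Annual rate of exceeding each height, assuming Poissonian sources."""
        return np.array([rates[heights > h].sum() for h in h_grid])

    h_grid = np.linspace(0.0, 5.0, 11)
    lam = annual_exceedance(h_grid, heights, rates)
    prob = 1.0 - np.exp(-lam)        # annual exceedance probability
    for h, p in zip(h_grid, prob):
        print(f"h > {h:4.1f} m : P = {p:.2e}")
    ```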

  12. State of the art on the probabilistic safety assessment (P.S.A.); Etat de l'art sur les etudes probabilistes de surete (E.P.S.)

    Energy Technology Data Exchange (ETDEWEB)

    Devictor, N.; Bassi, A.; Saignes, P.; Bertrand, F

    2008-07-01

    The use of Probabilistic Safety Assessment (PSA) is increasing internationally as a means of assessing and improving the safety of nuclear and non-nuclear facilities. To support the development of competence in Probabilistic Safety Assessment, a set of state-of-the-art reviews of these tools and their use was produced between 2001 and 2005, in particular on the following topics: - Definition of PSA at levels 1, 2 and 3; - Use of PSA in support of the design and operation of nuclear plants (risk-informed applications); - Applications to non-reactor nuclear facilities. The report compiles these reviews in a single document in order to ensure their broader use; this work was done in the frame of the project 'Reliability and Safety of Nuclear Facility' of the Nuclear Development and Innovation Division of the Nuclear Energy Division. As some of these reviews were made in support of exchanges with international partners and were written in English, a section of this document is in English. This work is now being applied concretely in support of the design of 4th-Generation nuclear systems such as sodium-cooled fast reactors and especially the gas-cooled fast reactor, which have been the subject of communications at the ANS Annual Meeting 2007, PSA'08 and ICCAP'08 conferences and in the journal Science and Technology of Nuclear Installations. (authors)

  13. Probabilistic safety criteria for improvement of Nuclear Power Plant design and operation

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Nam Jin; Chung, Woo Sick; Park, Moon Kyu [Korea Advanced Institute of Science and Technology, Taejon (Korea, Republic of)

    1991-12-15

    The procedure of this study is to: survey the status of IAEA (International Atomic Energy Agency) member states' policies on safety goals; examine the merits and demerits inherent in the existing methodology for reliability allocation; develop an efficient methodology for allocating reliability from top-level safety goals to intermediate and low-level PSC; write a computer code on the basis of the methodology proposed in the study; and apply the methodology to Surry Unit 1, a plant of the PWR type.

  14. Probabilistic Structural Integrity Analysis of Boiling Water Reactor Pressure Vessel under Low Temperature Overpressure Event

    Directory of Open Access Journals (Sweden)

    Hsoung-Wei Chou

    2015-01-01

    Full Text Available The probabilistic structural integrity of a Taiwan domestic boiling water reactor pressure vessel has been evaluated by probabilistic fracture mechanics analysis. First, the analysis model was built for the beltline region of the reactor pressure vessel considering the plant-specific data. Flaw models which comprehensively simulate all kinds of preexisting flaws along the vessel wall were employed. The low temperature overpressure transient, which has been concluded to be the severest accident for a boiling water reactor pressure vessel, was considered as the loading condition. It is indicated that fracture mostly happens near the fusion-line area of axial welds, but with negligible failure risk. The calculated results indicate that the domestic reactor pressure vessel has sufficient structural integrity even up to twice the present end-of-license operating period.

  15. An Integrated, Probabilistic Framework for Requirement Change Impact Analysis

    Directory of Open Access Journals (Sweden)

    Simon Lock

    1999-05-01

    Full Text Available Impact analysis is an essential part of change management. Without adequate analysis it is not possible to confidently determine the extent, complexity and cost of proposed changes to a software system. This diminishes the ability of a developer or maintainer to make informed decisions regarding the inclusion or rejection of proposed changes. The lack of coherent impact analysis can also hinder the process of ensuring that all system components affected by a change are updated. The abstract nature of requirement-level entities has meant that current impact analysis techniques have focused largely on design and code level artifacts. This paper proposes a novel approach which integrates traditional impact analysis with experience-based techniques to extend current approaches to support requirement-level impact analysis. Central to this approach is the use of probability to assist in the combination and presentation of predicted impact propagation paths. An Auto Teller Machine (ATM) example is used to illustrate the approach.

  16. A probabilistic analysis of the dynamic response of monopile foundations: Soil variability and its consequences

    DEFF Research Database (Denmark)

    Damgaard, M.; Andersen, L.V.; Ibsen, L.B.

    2015-01-01

    -analytical impedance functions of a monopile embedded in a stochastic linear viscoelastic soil layer, fully coupled aero-hydro-elastic simulations are conducted in the nonlinear multi-body code Hawc2. The probabilistic analysis accounts for the uncertainty of soil properties (e.g. damping and stiffness) and relies...... properties. Lognormal and Gumbel distributed modal damping and accumulated side-side fatigue damage equivalent moments with a coefficient of variation of 30% and 8%, respectively, are observed....

  17. A method for a categorized and probabilistic analysis of the surface electromyogram in dynamic contractions

    Directory of Open Access Journals (Sweden)

    Sylvie Charlotte Frieda Anneliese von Werder

    2015-02-01

    Full Text Available The human motor system permits a wide variety of complex movements. Both inter-individual variability and the biomechanical aspects of the performed movement itself contribute to the challenge of interpreting sEMG signals in dynamic contractions. A procedure for the systematic analysis of sEMG recordings during dynamic contraction is introduced, which includes categorization of the data in combination with the analysis of frequency distributions of the sEMG with a probabilistic approach. Using the example of elbow flexion and extension, the procedure was evaluated with 10 healthy subjects. The recorded sEMG signals of the brachioradialis were categorized into a combination of constant and variable movement factors, which originate from the performed movement. Subsequently, for each combination of movement factors, cumulative frequency distributions were computed for each subject separately. Finally, the probability of a difference in muscular activation under varying movement conditions was assessed. The probabilistic approach was compared to a deterministic analysis of the same data. Both approaches observed a significant change in muscular activation of the brachioradialis during concentric and eccentric contractions exclusively for flexion and extension angles exceeding 30°. However, with the probabilistic approach, additional information on the likelihood that the tested effect occurs can be provided. Especially for movements under uncontrollable boundary conditions, this information for assessing the confidence of the detected results is of high relevance. Thus, the procedure provides new insights into the quantification and interpretation of muscular activity.

  18. Latent Profile Analysis of Schizotypy and Paranormal Belief: Associations with Probabilistic Reasoning Performance

    Science.gov (United States)

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew

    2018-01-01

    This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4. PMID:29434562

  19. Probabilistic Structural Analysis Methods for select space propulsion system components (PSAM). Volume 3: Literature surveys and technical reports

    Science.gov (United States)

    1992-01-01

    The technical effort and computer code developed during the first year are summarized. Several formulations for Probabilistic Finite Element Analysis (PFEA) are described, with emphasis on the selected formulation. The strategies being implemented in the first-version computer code to perform linear, elastic PFEA are described. The results of a series of select Space Shuttle Main Engine (SSME) component surveys are presented. These results identify the critical components and provide the information necessary for probabilistic structural analysis.

  20. SEISMIC ANALYSIS FOR PRECLOSURE SAFETY

    Energy Technology Data Exchange (ETDEWEB)

    E.N. Lindner

    2004-12-03

    The purpose of this seismic preclosure safety analysis is to identify the potential seismically-initiated event sequences associated with preclosure operations of the repository at Yucca Mountain and assign appropriate design bases to provide assurance of achieving the performance objectives specified in the Code of Federal Regulations (CFR) 10 CFR Part 63 for radiological consequences. This seismic preclosure safety analysis is performed in support of the License Application for the Yucca Mountain Project. In more detail, this analysis identifies the systems, structures, and components (SSCs) that are subject to seismic design bases. This analysis assigns one of two design basis ground motion (DBGM) levels, DBGM-1 or DBGM-2, to SSCs important to safety (ITS) that are credited in the prevention or mitigation of seismically-initiated event sequences. An application of seismic margins approach is also demonstrated for SSCs assigned to DBGM-2 by showing a high confidence of a low probability of failure at a higher ground acceleration value, termed a beyond-design basis ground motion (BDBGM) level. The objective of this analysis is to meet the performance requirements of 10 CFR 63.111(a) and 10 CFR 63.111(b) for offsite and worker doses. The results of this calculation are used as inputs to the following: (1) A classification analysis of SSCs ITS by identifying potential seismically-initiated failures (loss of safety function) that could lead to undesired consequences; (2) An assignment of either DBGM-1 or DBGM-2 to each SSC ITS credited in the prevention or mitigation of a seismically-initiated event sequence; and (3) A nuclear safety design basis report that will state the seismic design requirements that are credited in this analysis. The present analysis reflects the design information available as of October 2004 and is considered preliminary. The evolving design of the repository will be re-evaluated periodically to ensure that seismic hazards are properly

  1. Adapting safety requirements analysis to intrusion detection

    Science.gov (United States)

    Lutz, R.

    2001-01-01

    Several requirements analysis techniques widely used in safety-critical systems are being adapted to support the analysis of secure systems. Perhaps the most relevant system safety technique for Intrusion Detection Systems is hazard analysis.

  2. Probabilistic analysis of the torsional effects on the tall building resistance due to earthquake event

    Science.gov (United States)

    Králik, Juraj; Králik, Juraj

    2017-07-01

    The paper presents results from the deterministic and probabilistic analysis of the accidental torsional effect on reinforced concrete tall buildings due to an earthquake event. The core-column structural system was considered with various configurations in plan. The methodology of the seismic analysis of building structures in Eurocode 8 and JCSS 2000 is discussed. The possibility of utilizing the LHS method to analyze extensive and robust tasks in FEM is presented. The influence of various input parameters (material, geometry, soil, masses and others) is considered. The deterministic and probabilistic analyses of the seismic resistance of the structure were calculated in the ANSYS program.
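
    Since the abstract leans on Latin hypercube sampling (LHS) to keep the number of FEM runs tractable, a minimal sketch of how LHS produces stratified input samples may be useful. The three input distributions below are invented placeholders, and SciPy's qmc module stands in for whatever sampler the authors used.

    ```python
    import numpy as np
    from scipy.stats import norm, qmc

    # three uncertain inputs: concrete E-modulus, soil stiffness, mass factor
    sampler = qmc.LatinHypercube(d=3, seed=1)
    u = sampler.random(n=100)                # 100 stratified points in [0, 1)^3

    e_mod = norm.ppf(u[:, 0], loc=34e9, scale=3.4e9)   # Pa, assumed normal
    k_soil = norm.ppf(u[:, 1], loc=50e6, scale=10e6)   # N/m, assumed normal
    mass = 1.0 + 0.1 * (u[:, 2] - 0.5)                 # +/- 5% mass variation

    # each row would parameterize one deterministic FEM run
    print(np.column_stack([e_mod, k_soil, mass])[:3])
    ```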

  3. Simulated annealing with probabilistic analysis for solving traveling salesman problems

    Science.gov (United States)

    Hong, Pei-Yee; Lim, Yai-Fung; Ramli, Razamin; Khalid, Ruzelan

    2013-09-01

    Simulated Annealing (SA) is a widely used meta-heuristic inspired by the annealing process in the recrystallization of metals; consequently, the efficiency of SA is highly affected by the annealing schedule. In this paper, we present an empirical study to provide a comparable annealing schedule for solving symmetric traveling salesman problems (TSP). A randomized complete block design is also used in this study. The results show that different parameters do affect the efficiency of SA, and we therefore propose the best-found annealing schedule based on the post hoc test. SA was tested on seven selected benchmark instances of the symmetric TSP with the proposed annealing schedule. The performance of SA was evaluated empirically against benchmark solutions, with a simple analysis to validate the quality of solutions. Computational results show that the proposed annealing schedule provides good-quality solutions.
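
    For readers unfamiliar with the method being tuned, a compact version of the annealing loop is sketched below; the geometric cooling rate and initial temperature are illustrative defaults, not the schedule the paper recommends.

    ```python
    import math
    import random

    def tour_length(tour, dist):
        return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

    def simulated_annealing(dist, t0=100.0, cooling=0.995, iters=20000, seed=0):
        rng = random.Random(seed)
        n = len(dist)
        tour = list(range(n))
        rng.shuffle(tour)
        cur_len = tour_length(tour, dist)
        best, best_len = tour[:], cur_len
        t = t0
        for _ in range(iters):
            i, j = sorted(rng.sample(range(n), 2))
            cand = tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]   # 2-opt move
            cand_len = tour_length(cand, dist)
            # always accept downhill; accept uphill with Boltzmann probability
            if cand_len < cur_len or rng.random() < math.exp((cur_len - cand_len) / t):
                tour, cur_len = cand, cand_len
                if cur_len < best_len:
                    best, best_len = tour[:], cur_len
            t *= cooling                                   # geometric schedule
        return best, best_len
    ```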

  4. Reduction of uncertainties in probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Jeong Moon; Choun, Young Sun; Choi, In Kil [Korea Atomic Energy Research Institute, Taejon (Korea)

    1999-02-01

    An integrated research effort aimed at reducing conservatism and uncertainties in PSHA in Korea was performed. The research consisted of five technical task areas as follows; Task 1: Earthquake Catalog Development for PSHA. Task 2: Evaluation of Seismicity and Tectonics of the Korea Region. Task 3: Development of Ground Motion Relationships. Task 4: Improvement of PSHA Modelling Methodology. Task 5: Development of Seismic Source Interpretations for the Region of Korea as Inputs to PSHA. A series of tests on an ancient wooden house and an analysis of a medium-size earthquake in Korea were performed intensively. Significant improvements, especially in the estimation of historical earthquakes, ground motion attenuation, and seismic source interpretations, were made through this study. 314 refs., 180 figs., 54 tabs. (Author)

  5. NESSUS/expert and NESSUS/FPI in the Probabilistic Structural Analysis Methods (PSAM) program

    Science.gov (United States)

    Burnside, O. H.

    1987-01-01

    The Numerical Evaluation of Stochastic Structures under Stress (NESSUS) is the primary computer code being developed in the NASA Probabilistic Structural Analysis Methods (PSAM) project. It consists of four modules: NESSUS/EXPERT, NESSUS/FPI, NESSUS/PRE and NESSUS/FEM. This presentation concentrates on EXPERT and FPI. To provide an effective interface between NESSUS and the user, an expert system module called NESSUS/EXPERT is being developed. That system uses the CLIPS artificial intelligence code developed at NASA-JSC. The code is compatible with FORTRAN, the standard language for codes in PSAM. The user interacts with the CLIPS inference engine, which is linked to the knowledge database. The perturbation database generated by NESSUS/FEM and managed in EXPERT is used to develop the so-called response or performance model in the random variables. Two independent probabilistic methods are available in PSAM for the computation of the probabilistic structural response: the Fast Probability Integration (FPI) method and Monte Carlo simulation. FPI is classified as an advanced reliability method and has been developed over the past ten years by researchers addressing the reliability of civil engineering structures. Monte Carlo is a well-established technique for computing probabilities by conducting a number of deterministic analyses with specified input distributional information.

  6. Computing Expected Value of Partial Sample Information from Probabilistic Sensitivity Analysis Using Linear Regression Metamodeling.

    Science.gov (United States)

    Jalal, Hawre; Goldhaber-Fiebert, Jeremy D; Kuntz, Karen M

    2015-07-01

    Decision makers often desire both guidance on the most cost-effective interventions given current knowledge and also the value of collecting additional information to improve the decisions made (i.e., from value of information [VOI] analysis). Unfortunately, VOI analysis remains underused due to the conceptual, mathematical, and computational challenges of implementing Bayesian decision-theoretic approaches in models of sufficient complexity for real-world decision making. In this study, we propose a novel practical approach for conducting VOI analysis using a combination of probabilistic sensitivity analysis, linear regression metamodeling, and unit normal loss integral function--a parametric approach to VOI analysis. We adopt a linear approximation and leverage a fundamental assumption of VOI analysis, which requires that all sources of prior uncertainties be accurately specified. We provide examples of the approach and show that the assumptions we make do not induce substantial bias but greatly reduce the computational time needed to perform VOI analysis. Our approach avoids the need to analytically solve or approximate joint Bayesian updating, requires only one set of probabilistic sensitivity analysis simulations, and can be applied in models with correlated input parameters. © The Author(s) 2015.
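
    As a rough illustration of the regression-metamodel-plus-loss-integral mechanics, the sketch below computes the simpler single-parameter EVPPI (perfect rather than partial sample information) for a two-strategy comparison under a normal approximation. It is our paraphrase of the idea, not the authors' implementation, and the sample data are synthetic.

    ```python
    import numpy as np
    from scipy.stats import norm

    def evppi_unli(theta, inb):
        """Single-parameter EVPPI from PSA samples via a linear metamodel.

        theta : (n,) PSA samples of the parameter of interest
        inb   : (n,) incremental net benefit samples (strategy B minus A)
        """
        slope, intercept = np.polyfit(theta, inb, 1)   # linear metamodel
        fitted = intercept + slope * theta             # E[INB | theta] estimate
        mu, sigma = fitted.mean(), fitted.std(ddof=1)
        if sigma == 0.0:
            return 0.0
        z = abs(mu) / sigma
        unli = norm.pdf(z) - z * norm.sf(z)            # unit normal loss integral
        return sigma * unli

    rng = np.random.default_rng(2)
    theta = rng.normal(0.7, 0.1, 10_000)               # e.g. treatment effectiveness
    inb = 20_000 * (theta - 0.68) + rng.normal(0, 2_000, 10_000)
    print(f"EVPPI ~ {evppi_unli(theta, inb):.0f} (net-benefit units)")
    ```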

  7. Probabilistic Analysis of the Hard Rock Disintegration Process

    Directory of Open Access Journals (Sweden)

    K. Frydrýšek

    2008-01-01

    Full Text Available This paper focuses on a numerical analysis of the hard rock (ore) disintegration process. The bit moves and sinks into the hard rock (mechanical contact with friction between the ore and the cutting bit) and subsequently disintegrates it. The disintegration (i.e. the stress-strain relationship, contact forces, reaction forces and fracture of the ore) is solved via the FEM (MSC.Marc/Mentat software) and the SBRA (Simulation-Based Reliability Assessment) method (Monte Carlo simulations, Anthill and Mathcad software). The ore is disintegrated by deactivating the finite elements which satisfy the fracture condition. The material of the ore (i.e. yield stress, fracture limit, Young's modulus and Poisson's ratio) is given by bounded histograms (i.e. stochastic inputs which better describe reality). The results (reaction forces in the cutting bit) are also stochastic quantities, and they are compared with experimental measurements. Application of the SBRA method in this area is a modern and innovative trend in mechanics. However, it takes a long time to solve this problem (due to material and structural nonlinearities, the large number of elements, many iteration steps and many Monte Carlo simulations). Parallel computers were therefore used to handle the large computational needs of this problem.
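
    The SBRA side of the analysis, drawing inputs from bounded histograms and counting how often a fracture condition is met, can be mimicked with a short Monte Carlo loop. The histogram bins and the closed-form limit state below are invented stand-ins for the paper's FEM-based fracture condition.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def sample_bounded_hist(edges, weights, n):
        """Draw n samples from a bounded histogram (bin edges + relative weights)."""
        edges = np.asarray(edges, float)
        weights = np.asarray(weights, float)
        bins = rng.choice(len(weights), size=n, p=weights / weights.sum())
        return rng.uniform(edges[bins], edges[bins + 1])

    n = 100_000
    yield_stress = sample_bounded_hist([400, 450, 500, 550], [1, 3, 1], n)   # MPa
    applied_force = sample_bounded_hist([20, 30, 40], [2, 1], n)             # kN

    # hypothetical limit state standing in for the FEM fracture condition
    stress = applied_force * 1e3 / 80.0     # force over a nominal 80 mm^2 area
    p_fracture = np.mean(stress > yield_stress)
    print(f"estimated fracture probability: {p_fracture:.4f}")
    ```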

  8. A probabilistic model of emphysema based on granulometry analysis

    Science.gov (United States)

    Marcos, J. V.; Nava, R.; Cristobal, G.; Munoz-Barrutia, A.; Escalante-Ramírez, B.; Ortiz-de-Solórzano, C.

    2013-11-01

    Emphysema is associated with the destruction of lung parenchyma, resulting in abnormal enlargement of airspaces. Accurate quantification of emphysema is required for a better understanding of the disease as well as for the assessment of drugs and treatments. In the present study, a novel method for emphysema characterization from histological lung images is proposed. Mice with elastase-induced emphysema were used to simulate the effect of the disease on the lungs. A database composed of 50 normal and 50 emphysematous lung patches of size 512 x 512 pixels was used in our experiments. The purpose is to automatically identify those patches containing emphysematous tissue. The proposed approach is based on granulometry analysis, which provides the pattern spectrum describing the distribution of airspaces in the lung region under evaluation. The profile of the spectrum was summarized by a set of statistical features. A logistic regression model was then used to estimate the probability of a patch being emphysematous from this feature set. An accuracy of 87% was achieved by our method in the classification between normal and emphysematous samples. This result shows the utility of our granulometry-based method for quantifying the lesions due to emphysema.

  9. cooccur: Probabilistic Species Co-Occurrence Analysis in R

    Directory of Open Access Journals (Sweden)

    Daniel M. Griffith

    2016-02-01

    Full Text Available The observation that species may be positively or negatively associated with each other is at least as old as the debate surrounding the nature of community structure which began in the early 1900's with Gleason and Clements. Since then investigating species co-occurrence patterns has taken a central role in understanding the causes and consequences of evolution, history, coexistence mechanisms, competition, and environment for community structure and assembly. This is because co-occurrence among species is a measurable metric in community datasets that, in the context of phylogeny, geography, traits, and environment, can sometimes indicate the degree of competition, displacement, and phylogenetic repulsion as weighed against biotic and environmental effects promoting correlated species distributions. Historically, a multitude of different co-occurrence metrics have been developed and most have depended on data randomization procedures to produce null distributions for significance testing. Here we improve upon and present an R implementation of a recently published model that is metric-free, distribution-free, and randomization-free. The R package, cooccur, is highly accessible, easily integrates into common analyses, and handles large datasets with high performance. In the article we develop the package's functionality and demonstrate aspects of co-occurrence analysis using three sample datasets.
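
    The "metric-free, distribution-free, randomization-free" model referred to here computes, combinatorially, how probable the observed number of shared sites is if the two species were distributed independently. cooccur itself is an R package; the sketch below is a Python translation of the underlying hypergeometric idea, with made-up site counts.

    ```python
    from scipy.stats import hypergeom

    def cooccurrence_p(n_sites, n1, n2, observed):
        """Tail probabilities for the observed number of co-occupied sites,
        for species occupying n1 and n2 of n_sites independently."""
        dist = hypergeom(M=n_sites, n=n1, N=n2)   # shared sites ~ hypergeometric
        p_lt = dist.cdf(observed)                 # evidence of negative association
        p_gt = dist.sf(observed - 1)              # evidence of positive association
        return p_lt, p_gt

    # toy example: 30 sites, species present at 18 and 12 sites, 10 shared
    p_lt, p_gt = cooccurrence_p(30, 18, 12, 10)
    print(f"P(shared <= 10) = {p_lt:.3f}, P(shared >= 10) = {p_gt:.3f}")
    ```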

  10. Significance of the results from probabilistic safety assessment at level 2 for off-site consequences

    Energy Technology Data Exchange (ETDEWEB)

    Rossi, J. [VTT Energy, Espoo (Finland)

    2000-07-01

    The procedure was developed to enable STUK (Radiation and Nuclear Safety Authority) to make simplified estimates of off-site consequences based on the existing results of PSA level 2 calculations done by, e.g., power utilities. The method is based on the dose calculated from each nuclide group of the reactor activity inventory when the same release fraction is assumed for each group. This means that a specific new result from PSA level 2 can be categorised to find a representative PSA level 3 result for that case. In addition, a user interface including the procedure was prepared. Secondly, new insights about consequences based on the releases from PSA level 2 are expected to give a better understanding of the risks at prevailing increased reactor power levels. In this case only some early health effects and long-term doses were estimated, without a full-scope PSA level 3 approach. (orig.)

  11. Probabilistic analysis in normal operation of distribution system with distributed generation

    DEFF Research Database (Denmark)

    Villafafila-Robles, R.; Sumper, A.; Bak-Jensen, B.

    2011-01-01

    Nowadays, the incorporation of high levels of small-scale non-dispatchable distributed generation is leading to the transition from the traditional 'vertical' power system structure to a 'horizontally-operated' power system, where the distribution networks contain both stochastic generation...... and load. This fact increases the number of stochastic inputs and dependence structures between them need to be considered. The deterministic analysis is not enough to cope with these issues and a new approach is needed. Probabilistic analysis provides a better approach. Moreover, as distribution systems...

  12. Precise Quantitative Analysis of Probabilistic Business Process Model and Notation Workflows

    DEFF Research Database (Denmark)

    Herbert, Luke Thomas; Sharp, Robin

    2013-01-01

    , occurrence and ordering of events, reward-based properties, and best- and worst- case scenarios. We develop a simple example of medical workflow and demonstrate the utility of this analysis in accurate provisioning of drug stocks. Finally, we suggest a path to building upon these techniques to cover......We present a framework for modeling and analysis of real-world business workflows. We present a formalized core subset of the business process modeling and notation (BPMN) and then proceed to extend this language with probabilistic nondeterministic branching and general-purpose reward annotations...

  13. Consequence modeling for nuclear weapons probabilistic cost/benefit analyses of safety retrofits

    Energy Technology Data Exchange (ETDEWEB)

    Harvey, T.F.; Peters, L.; Serduke, F.J.D.; Hall, C.; Stephens, D.R.

    1998-01-01

    The consequence models used in former studies of costs and benefits of enhanced safety retrofits are considered for (1) fuel fires; (2) non-nuclear detonations; and (3) unintended nuclear detonations. Estimates of consequences were made using a representative accident location, i.e., an assumed mixed suburban-rural site. We have explicitly quantified land-use impacts and human-health effects (e.g., prompt fatalities, prompt injuries, latent cancer fatalities, low levels of radiation exposure, and clean-up areas). Uncertainty in the wind direction is quantified and used in a Monte Carlo calculation to estimate a range of results for a fuel fire with uncertain respirable amounts of released Pu. We define a nuclear source term and discuss damage levels of concern. Ranges of damages are estimated by quantifying health impacts and property damages. We discuss our dispersal and prompt effects models in some detail. The models used to loft the Pu and fission products and their particle sizes are emphasized.

  14. Experimental analysis on classification of unmanned aerial vehicle images using the probabilistic latent semantic analysis

    Science.gov (United States)

    Yi, Wenbin; Tang, Hong

    2009-10-01

    In this paper, we present a novel algorithm to classify UAV images through image annotation, which is a semi-supervised method. During the annotation process, we first divide the whole image into blocks of different sizes and generate suitable visual words, which are the K-means clustering centers or simply pixels in small image blocks. Then, given a set of image blocks for each semantic concept as training data, learning is based on Probabilistic Latent Semantic Analysis (PLSA). The probability distributions of visual words in every document can be learned through the PLSA model. The labeling of every document (image block) is done by computing the similarity of its feature distribution to the distributions of the training documents with the Kullback-Leibler (K-L) divergence. Finally, the classification of the UAV images is done by combining all the image blocks at every block size. The UAV images used in our experiments were acquired during the Sichuan earthquake in 2008. The results show that smaller block sizes yield better classification results.
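
    The final labeling step, comparing a block's visual-word distribution against each class's training distribution with the K-L divergence and assigning the closest class, is easy to show in isolation. This sketch assumes the distributions have already been learned (by PLSA or otherwise); the class names and toy vectors are invented.

    ```python
    import numpy as np

    def kl_divergence(p, q, eps=1e-12):
        """D_KL(p || q) for discrete distributions, smoothed to avoid log(0)."""
        p = np.asarray(p, float) + eps
        q = np.asarray(q, float) + eps
        p, q = p / p.sum(), q / q.sum()
        return float(np.sum(p * np.log(p / q)))

    def classify(block_dist, class_dists):
        """Assign the class whose training distribution is closest in K-L sense."""
        return min(class_dists, key=lambda c: kl_divergence(block_dist, class_dists[c]))

    class_dists = {
        "vegetation": [0.6, 0.3, 0.1],
        "collapsed_building": [0.1, 0.2, 0.7],
    }
    print(classify([0.15, 0.25, 0.60], class_dists))   # -> collapsed_building
    ```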

  15. Dependence in probabilistic modeling, Dempster-Shafer theory, and probability bounds analysis.

    Energy Technology Data Exchange (ETDEWEB)

    Oberkampf, William Louis; Tucker, W. Troy (Applied Biomathematics, Setauket, NY); Zhang, Jianzhong (Iowa State University, Ames, IA); Ginzburg, Lev (Applied Biomathematics, Setauket, NY); Berleant, Daniel J. (Iowa State University, Ames, IA); Ferson, Scott (Applied Biomathematics, Setauket, NY); Hajagos, Janos (Applied Biomathematics, Setauket, NY); Nelsen, Roger B. (Lewis & Clark College, Portland, OR)

    2004-10-01

    This report summarizes methods to incorporate information (or lack of information) about inter-variable dependence into risk assessments that use Dempster-Shafer theory or probability bounds analysis to address epistemic and aleatory uncertainty. The report reviews techniques for simulating correlated variates for a given correlation measure and dependence model, computation of bounds on distribution functions under a specified dependence model, formulation of parametric and empirical dependence models, and bounding approaches that can be used when information about the intervariable dependence is incomplete. The report also reviews several of the most pervasive and dangerous myths among risk analysts about dependence in probabilistic models.

  16. A probabilistic method for leak-before-break analysis of CANDU reactor pressure tubes

    Energy Technology Data Exchange (ETDEWEB)

    Puls, M.P.; Wilkins, B.J.S.; Rigby, G.L. [Whiteshell Labs., Pinawa (Canada)] [and others

    1997-04-01

    A probabilistic code for the prediction of the cumulative probability of pressure tube ruptures in CANDU-type reactors is described. Ruptures are assumed to result from axial crack growth by delayed hydride cracking. The BLOOM code models the major phenomena that affect crack length and critical crack length during the reactor sequence of events following the first indications of leakage. BLOOM can be used to develop unit-specific estimates of the actual probability of pressure tube rupture in operating CANDU reactors and to supplement the existing leak-before-break analysis.

  17. Feature extraction through parallel Probabilistic Principal Component Analysis for heart disease diagnosis

    Science.gov (United States)

    Shah, Syed Muhammad Saqlain; Batool, Safeera; Khan, Imran; Ashraf, Muhammad Usman; Abbas, Syed Hussnain; Hussain, Syed Adnan

    2017-09-01

    Automatic diagnosis of human diseases is mostly achieved through decision support systems. The performance of these systems is mainly dependent on the selection of the most relevant features. This becomes harder when the dataset contains missing values for the different features. Probabilistic Principal Component Analysis (PPCA) has a reputation for dealing with the problem of missing attribute values. This research presents a methodology which uses the results of medical tests as input, extracts a reduced-dimensional feature subset and provides a diagnosis of heart disease. The proposed methodology extracts high-impact features in a new projection by using Probabilistic Principal Component Analysis (PPCA). PPCA extracts projection vectors which contribute the highest covariance, and these projection vectors are used to reduce the feature dimension. The selection of projection vectors is done through Parallel Analysis (PA). The feature subset with the reduced dimension is provided to radial basis function (RBF) kernel-based Support Vector Machines (SVM). The RBF-based SVM serves the purpose of classification into two categories, i.e., Heart Patient (HP) and Normal Subject (NS). The proposed methodology is evaluated through accuracy, specificity and sensitivity over three UCI datasets, i.e., Cleveland, Switzerland and Hungarian. The statistical results achieved through the proposed technique are presented in comparison to the existing research, showing its impact. The proposed technique achieved an accuracy of 82.18%, 85.82% and 91.30% for the Cleveland, Hungarian and Switzerland datasets, respectively.
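
    A minimal end-to-end analogue of the pipeline can be assembled from scikit-learn parts, as sketched below. Note the simplifications: mean imputation plus ordinary PCA stands in for PPCA's handling of missing values, a fixed component count stands in for Parallel Analysis, and the data are synthetic rather than the UCI tables.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.decomposition import PCA
    from sklearn.impute import SimpleImputer
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    # synthetic stand-in for a heart-disease table with missing entries
    X, y = make_classification(n_samples=300, n_features=13, random_state=0)
    rng = np.random.default_rng(0)
    X[rng.random(X.shape) < 0.05] = np.nan      # 5% missing values

    clf = make_pipeline(
        SimpleImputer(strategy="mean"),         # crude stand-in for PPCA's EM step
        StandardScaler(),
        PCA(n_components=5),                    # PA would choose this number
        SVC(kernel="rbf", C=1.0, gamma="scale"),
    )
    scores = cross_val_score(clf, X, y, cv=5)
    print(f"CV accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
    ```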

  18. Airline Safety: A Comparative Analysis.

    Science.gov (United States)

    1987-01-01

    Perhaps because of an airline's understandable sensitivity to public knowledge of its accidents, one has little assurance that each airline... [Remainder of record garbled: an OCR-damaged report documentation page and a table of departures and accident counts for carriers including Royal Air Maroc (Morocco), Royal Nepal, SAA (South Africa) and SAHSA (Honduras).]

  19. Incorporating Site Amplification into Seismic Hazard Analysis: A Fully Probabilistic Approach

    Science.gov (United States)

    Cramer, C. H.

    2001-12-01

    Developing site-specific amplification factors from geological, geophysical, and geotechnical information has been the state-of-practice for the last couple of decades. Now the state-of-the-art is to develop a distribution of possible site-specific amplification factors for a given input rock ground-motion. These state-of-the-art site-amplification distributions account for the uncertainty in soil properties and Vs structure at the site. Applying these site amplification distributions to a site-specific seismic hazard analysis requires a fully probabilistic approach. One such approach is to modify the generic ground-motion attenuation relations used in a probabilistic seismic hazard analysis to site-specific relations using a site amplification distribution developed for that site. The modification of the ground-motion attenuation relation is done prior to calculating probabilistic seismic hazard at the site. This approach has been implemented using the USGS National Seismic Hazard Mapping codes. Standard hazard models and hard-rock ground-motion attenuation relations are input into the modified codes along with a description of the site-specific amplification in the form of a lognormal probability-density-function (pdf). For each hard-rock ground-motion level, the pdf is specified by the median site-amplification factor and its natural-logarithmic standard deviation. The fully probabilistic ground-motion hazard curves are always above the hazard curve derived from multiplying the hard-rock hazard curve by the site's median site-amplification factors. At Paducah, Kentucky the difference is significant for 2%-in-50-year ground-motion estimates (0.7g vs. 0.5g for PGA and 1.3g vs. 0.9g for 1.0 s Sa). At Memphis, Tennessee the differences are less significant and may only be important at long periods (1.0 s and longer) on Mississippi flood-plain (lowlands) deposits (on the uplands deposits: 0.35g vs. 0.30g for PGA and 0.8g vs. 0.7g for 1.0 s Sa; on the lowlands
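
    The key operation, replacing a single median amplification factor with a lognormal distribution when converting a rock hazard curve into a soil hazard curve, can be written as a small numerical integration over rock-motion levels. The hazard curve and amplification parameters below are invented for illustration.

    ```python
    import numpy as np
    from scipy.integrate import trapezoid
    from scipy.stats import norm

    # hypothetical rock hazard: annual exceedance rate vs PGA (g)
    x = np.linspace(0.01, 2.0, 400)              # rock ground-motion levels
    lam_rock = 1e-2 * (x / 0.05) ** -2.0         # toy power-law hazard curve

    med_amp, sigma_ln = 1.8, 0.35                # lognormal site amplification

    def soil_hazard(z):
        """Annual rate of soil motion exceeding z, integrating the
        amplification distribution over all rock-motion levels."""
        occ = -np.gradient(lam_rock, x)          # occurrence density of rock motion
        # P(amp * x > z) with ln(amp) ~ N(ln med_amp, sigma_ln^2)
        p_exc = norm.sf((np.log(z / x) - np.log(med_amp)) / sigma_ln)
        return trapezoid(occ * p_exc, x)

    for z in (0.3, 0.5, 0.7):
        print(f"soil PGA > {z:.1f} g : {soil_hazard(z):.2e} /yr")
    ```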

  20. Propagating Water Quality Analysis Uncertainty Into Resource Management Decisions Through Probabilistic Modeling

    Science.gov (United States)

    Gronewold, A. D.; Wolpert, R. L.; Reckhow, K. H.

    2007-12-01

    Most probable number (MPN) and colony-forming-unit (CFU) are two estimates of fecal coliform bacteria concentration commonly used as measures of water quality in United States shellfish harvesting waters. The MPN is the maximum likelihood estimate (or MLE) of the true fecal coliform concentration based on counts of non-sterile tubes in serial dilution of a sample aliquot, indicating bacterial metabolic activity. The CFU is the MLE of the true fecal coliform concentration based on the number of bacteria colonies emerging on a growth plate after inoculation from a sample aliquot. Each estimating procedure has intrinsic variability and is subject to additional uncertainty arising from minor variations in experimental protocol. Several versions of each procedure (using different sized aliquots or different numbers of tubes, for example) are in common use, each with its own levels of probabilistic and experimental error and uncertainty. It has been observed empirically that the MPN procedure is more variable than the CFU procedure, and that MPN estimates are somewhat higher on average than CFU estimates, on split samples from the same water bodies. We construct a probabilistic model that provides a clear theoretical explanation for the observed variability in, and discrepancy between, MPN and CFU measurements. We then explore how this variability and uncertainty might propagate into shellfish harvesting area management decisions through a two-phased modeling strategy. First, we apply our probabilistic model in a simulation-based analysis of future water quality standard violation frequencies under alternative land use scenarios, such as those evaluated under guidelines of the total maximum daily load (TMDL) program. Second, we apply our model to water quality data from shellfish harvesting areas which at present are closed (either conditionally or permanently) to shellfishing, to determine if alternative laboratory analysis procedures might have led to different
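
    The MPN itself is just a maximum-likelihood estimate and can be reproduced with a one-dimensional likelihood search over concentration. The sketch below assumes the classic dilution model P(tube positive) = 1 - exp(-c*v); the design and tube counts in the example are made up.

    ```python
    import numpy as np
    from scipy.optimize import minimize_scalar

    def mpn(volumes, n_tubes, n_positive):
        """Most probable number (MLE of concentration per mL) from a
        serial-dilution tube assay with P(positive) = 1 - exp(-c * v)."""
        v = np.asarray(volumes, float)
        n = np.asarray(n_tubes, float)
        k = np.asarray(n_positive, float)

        def neg_log_lik(log_c):
            p = 1.0 - np.exp(-np.exp(log_c) * v)
            p = np.clip(p, 1e-12, 1.0 - 1e-12)
            return -np.sum(k * np.log(p) + (n - k) * np.log(1.0 - p))

        res = minimize_scalar(neg_log_lik, bounds=(-10.0, 10.0), method="bounded")
        return np.exp(res.x)

    # 3-dilution, 5-tube design: 10, 1, 0.1 mL aliquots; 5, 3, 1 tubes positive
    print(f"MPN ~ {mpn([10, 1, 0.1], [5, 5, 5], [5, 3, 1]):.2f} per mL")
    ```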

  1. Probabilistic Sensitivity Analysis for Launch Vehicles with Varying Payloads and Adapters for Structural Dynamics and Loads

    Science.gov (United States)

    McGhee, David S.; Peck, Jeff A.; McDonald, Emmett J.

    2012-01-01

    This paper examines Probabilistic Sensitivity Analysis (PSA) methods and tools in an effort to understand their utility in vehicle loads and dynamic analysis. Specifically, this study addresses how these methods may be used to establish limits on payload mass and cg location and requirements on adapter stiffnesses while maintaining vehicle loads and frequencies within established bounds. To this end, PSA methods and tools are applied to a realistic, but manageable, integrated launch vehicle analysis where payload and payload adapter parameters are modeled as random variables. This analysis is used to study both Regional Response PSA (RRPSA) and Global Response PSA (GRPSA) methods, with a primary focus on sampling-based techniques. For contrast, some MPP-based approaches are also examined.

  2. Probabilistic topic modeling for the analysis and classification of genomic sequences

    Science.gov (United States)

    2015-01-01

    Background Studies on genomic sequences for classification and taxonomic identification have a leading role in the biomedical field and in the analysis of biodiversity. These studies focus on the so-called barcode genes, representing a well-defined region of the whole genome. Recently, alignment-free techniques have been gaining importance because they are able to overcome the drawbacks of sequence alignment techniques. In this paper a new alignment-free method for DNA sequence clustering and classification is proposed. The method is based on k-mer representation and text mining techniques. Methods The presented method is based on Probabilistic Topic Modeling, a statistical technique originally proposed for text documents. Probabilistic topic models are able to find in a document corpus the topics (recurrent themes) characterizing classes of documents. This technique, applied to DNA sequences representing the documents, exploits the frequency of fixed-length k-mers and builds a generative model for a training group of sequences. This generative model, obtained through the Latent Dirichlet Allocation (LDA) algorithm, is then used to classify a large set of genomic sequences. Results and conclusions We performed classification of over 7000 16S DNA barcode sequences taken from the Ribosomal Database Project (RDP) repository, training probabilistic topic models. The proposed method is compared to the RDP tool and the Support Vector Machine (SVM) classification algorithm in an extensive set of trials using both complete sequences and short sequence snippets (from 400 bp to 25 bp). Our method achieves very similar results to the RDP classifier and SVM for complete sequences. The most interesting results are obtained when short sequence snippets are considered. In these conditions the proposed method outperforms RDP and SVM with ultra-short sequences, and it exhibits a smooth decrease of performance, at every taxonomic level, as the sequence length is decreased. PMID:25916734
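
    The core pipeline of fixed-length k-mer counts fed into LDA to learn a generative model maps directly onto standard library components. The sketch below is our reconstruction with scikit-learn, toy sequences, and k = 4; it is not the authors' code.

    ```python
    from sklearn.decomposition import LatentDirichletAllocation
    from sklearn.feature_extraction.text import CountVectorizer

    def kmers(seq, k=4):
        """Tokenize a DNA sequence into overlapping k-mers."""
        return [seq[i:i + k] for i in range(len(seq) - k + 1)]

    seqs = ["ACGTACGTGGCA", "ACGTACGTGGAT", "TTGGCCAATTGG", "TTGGCCAATCGG"]
    vec = CountVectorizer(analyzer=kmers)
    X = vec.fit_transform(seqs)                    # k-mer count matrix

    lda = LatentDirichletAllocation(n_components=2, random_state=0)
    theta = lda.fit_transform(X)                   # per-sequence topic mixtures
    print(theta.round(2))   # sequences from the same group share a dominant topic
    ```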

  3. PROBABILISTIC ANALYSIS OF DEPRESSIVE EPISODES: APPLICATION OF RENEWAL THEORY UNDER UNIFORM PROBABILITY LAW.

    Directory of Open Access Journals (Sweden)

    Runjun Phookun

    2013-04-01

    Full Text Available The renewal process has been formulated on the basis of the hazard rate of the time between two consecutive occurrences of depressive episodes. The probabilistic analysis of depressive episodes can be performed under various forms of hazard rate, viz. constant, linear, etc. In this paper we consider a particular form of hazard rate, h(x) = (b - x)^(-1), where b is a constant and x is the time between two consecutive episodes of depression. As a result, the time between two consecutive occurrences of depressive episodes follows a uniform distribution on (a, b). The distribution of the range, i.e. the difference between the longest and the shortest occurrence time to a depressive episode, and the expected number of depressive episodes in a random interval of time are obtained for the distribution under consideration. If a closed-form expression for the waiting time distribution is not available, then the Laplace transform is used for the probabilistic analysis. The hazard rate of occurrence and the expected number of depressive episodes are presented graphically
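
    The step from this hazard rate to the uniform law can be made explicit with the standard survival-analysis identities; the short derivation below is ours, not quoted from the paper.

    ```latex
    % hazard rate h(x) = (b - x)^{-1} on (a, b) implies a uniform waiting time
    \begin{align}
      h(x) &= \frac{f(x)}{1 - F(x)} = (b - x)^{-1}, \qquad a < x < b,\\
      1 - F(x) &= \exp\!\left(-\int_a^x \frac{\mathrm{d}u}{b - u}\right)
                = \frac{b - x}{b - a},\\
      f(x) &= h(x)\bigl(1 - F(x)\bigr) = \frac{1}{b - a},
    \end{align}
    % i.e. the density of the uniform distribution on (a, b).
    ```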

  4. Safety and business benefit analysis of NASA's aviation safety program

    Science.gov (United States)

    2004-09-20

    NASA Aviation Safety Program elements encompass a wide range of products that require both public and private investment. Therefore, two methods of analysis, one relating to the public and the other to the private industry, must be combined to unders...

  5. A Probabilistic Approach for Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    A probabilistic based robustness analysis has been performed for a glulam frame structure supporting the roof over the main court in a Norwegian sports centre. The robustness analysis is based on the framework for robustness analysis introduced in the Danish Code of Practice for the Safety...... of Structures and a probabilistic modelling of the timber material proposed in the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS). Due to the framework in the Danish Code the timber structure has to be evaluated with respect to the following criteria where at least one shall...... be fulfilled: a) demonstrating that those parts of the structure essential for the safety only have little sensitivity with respect to unintentional loads and defects, or b) demonstrating a load case with ‘removal of a limited part of the structure’ in order to document that an extensive failure...

  6. Solid waste burial grounds interim safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Saito, G.H.

    1994-10-01

    This Interim Safety Analysis document supports the authorization basis for the interim operation and restrictions on interim operations for the near-surface land disposal of solid waste in the Solid Waste Burial Grounds. The Solid Waste Burial Grounds Interim Safety Basis supports the upgrade progress for the safety analysis report and the technical safety requirements for the operations in the Solid Waste Burial Grounds. Accident safety analysis scenarios have been analyzed based on the significant events identified in the preliminary hazards analysis. The interim safety analysis provides an evaluation of the operations in the Solid Waste Burial Grounds to determine if the radiological and hazardous material exposures will be acceptable from an overall health and safety standpoint to the worker, the onsite personnel, the public, and the environment.

  7. Application of probabilistic hydrologic forecasting for risk analysis of multipurpose reservoir real-time operation

    Science.gov (United States)

    Liu, P.

    2012-12-01

    Quantitative analysis of the risk for reservoir real-time operation is a hard task owing to the difficulty of accurately describing inflow uncertainties. Ensemble-based probabilistic hydrologic forecasting depicts the inflow not only through the marginal distributions but also through their correlations by producing inflow scenarios. This motivates us to analyze the reservoir real-time operating risk with ensemble-based hydrologic forecasting inputs. The proposed procedure involves: (1) based upon Bayesian inference, Markov Chain Monte Carlo (MCMC) is implemented to produce ensemble-based probabilistic hydrologic forecasts; (2) the reservoir risk is defined as the ratio of the number of scenarios that exceed the critical value to the total number of scenarios; (3) a multipurpose reservoir operation model is built and solved using scenario optimization to produce Pareto solutions for trade-offs between risks and profits. In a case study of China's Three Gorges Reservoir (TGR) for the 2010 and 2012 floods, it is found that the reservoir real-time operation risks can be estimated directly and minimized using the proposed methods, which are easy for reservoir operators to implement.

  8. A probabilistic analysis of cumulative carbon emissions and long-term planetary warming

    Science.gov (United States)

    Fyke, Jeremy; Damon Matthews, H.

    2015-11-01

    Efforts to mitigate and adapt to long-term climate change could benefit greatly from probabilistic estimates of cumulative carbon emissions due to fossil fuel burning and resulting CO2-induced planetary warming. Here we demonstrate the use of a reduced-form model to project these variables. We performed simulations using a large-ensemble framework with parametric uncertainty sampled to produce distributions of future cumulative emissions and consequent planetary warming. A hind-cast ensemble of simulations captured 1980-2012 historical CO2 emissions trends and an ensemble of future projection simulations generated a distribution of emission scenarios that qualitatively resembled the suite of Representative and Extended Concentration Pathways. The resulting cumulative carbon emission and temperature change distributions are characterized by 5-95th percentile ranges of 0.96-4.9 teratonnes C (Tt C) and 1.4 °C-8.5 °C, respectively, with 50th percentiles at 3.1 Tt C and 4.7 °C. Within the wide range of policy-related parameter combinations that produced these distributions, we found that low-emission simulations were characterized by both high carbon prices and low costs of non-fossil fuel energy sources, suggesting the importance of these two policy levers in particular for avoiding dangerous levels of climate warming. With this analysis we demonstrate a probabilistic approach to the challenge of identifying strategies for limiting cumulative carbon emissions and assessing likelihoods of surpassing dangerous temperature thresholds.

  9. Probabilistic finite element analysis of radiofrequency liver ablation using the unscented transform

    Energy Technology Data Exchange (ETDEWEB)

    Dos Santos, Icaro; Da Rocha, Adson Ferreira; Menezes, Leonardo Rax [Department of Electrical Engineering, University of Brasilia, Brasilia, DF 70910-900 (Brazil); Haemmerich, Dieter; Schutt, David [Division of Pediatric Cardiology, Medical University of South Carolina, 165 Ashley Ave., Charleston, SC 29425 (United States)], E-mail: icaro@ieee.org

    2009-02-07

    The main limitation of radiofrequency (RF) ablation numerical simulations reported in the literature is their failure to provide statistical results based on the statistical variability of tissue thermal-electrical parameters. This work developed an efficient probabilistic approach to hepatic RF ablation in order to statistically evaluate the effect of four thermal-electrical properties of liver tissue on the uncertainty of the ablation zone dimensions: thermal conductivity, specific heat, blood perfusion and electrical conductivity. A deterministic thermal-electrical finite element model of a monopolar electrode inserted in the liver was coupled with the unscented transform method in order to obtain coagulation zone confidence intervals, probability and cumulative density functions. The coagulation zone volume, diameter and length were 10.96 cm{sup 3}, 2.17 cm and 4.08 cm, respectively (P < 0.01). Furthermore, a probabilistic sensitivity analysis showed that perfusion and thermal conductivity account for >95% of the variability in coagulation zone volume, diameter and length.
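
    For readers unfamiliar with the unscented transform, its appeal here is that it replaces thousands of Monte Carlo FEM runs with 2n + 1 deterministic runs at chosen sigma points. The generic sketch below uses the standard symmetric weighting and a cheap linear stand-in for the thermal-electrical FEM model; all coefficients are invented.

    ```python
    import numpy as np

    def unscented_transform(f, mean, cov, kappa=1.0):
        """Propagate input mean/covariance through f via 2n+1 sigma points."""
        n = len(mean)
        L = np.linalg.cholesky((n + kappa) * cov)
        pts = [mean] + [mean + L[:, i] for i in range(n)] \
                     + [mean - L[:, i] for i in range(n)]
        w = np.full(2 * n + 1, 1.0 / (2.0 * (n + kappa)))
        w[0] = kappa / (n + kappa)
        y = np.array([f(p) for p in pts])        # one deterministic run per point
        y_mean = np.sum(w * y)
        y_var = np.sum(w * (y - y_mean) ** 2)
        return y_mean, y_var

    # stand-in for the FEM model: coagulation volume vs (perfusion, conductivity)
    f = lambda p: 12.0 - 25.0 * p[0] + 3.0 * p[1]
    mean = np.array([0.02, 0.5])                 # perfusion (1/s), k (W/m/K)
    cov = np.diag([0.004 ** 2, 0.05 ** 2])
    m, v = unscented_transform(f, mean, cov)
    print(f"volume ~ {m:.2f} cm^3, sd {np.sqrt(v):.2f}")
    ```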

  10. Probabilistic sensitivity analysis for decision trees with multiple branches: use of the Dirichlet distribution in a Bayesian framework.

    Science.gov (United States)

    Briggs, Andrew H; Ades, A E; Price, Martin J

    2003-01-01

    In structuring decision models of medical interventions, it is commonly recommended that only 2 branches be used for each chance node to avoid logical inconsistencies that can arise during sensitivity analyses if the branching probabilities do not sum to 1. However, information may be naturally available in an unconditional form, and structuring a tree in conditional form may complicate rather than simplify the sensitivity analysis of the unconditional probabilities. Current guidance emphasizes using probabilistic sensitivity analysis, and a method is required to provide probabilistic probabilities over multiple branches that appropriately represents uncertainty while satisfying the requirement that mutually exclusive event probabilities should sum to 1. The authors argue that the Dirichlet distribution, the multivariate equivalent of the beta distribution, is appropriate for this purpose and illustrate its use for generating a fully probabilistic transition matrix for a Markov model. Furthermore, they demonstrate that by adopting a Bayesian approach, the problem of observing zero counts for transitions of interest can be overcome.
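
    The recommendation is simple to apply in practice: with observed transition counts, a Dirichlet draw per simulation guarantees that the branch probabilities sum to 1, and adding prior pseudo-counts handles branches with zero observed events. A sketch with invented counts:

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # observed annual transitions out of state "Well": to Well, Sick, Dead
    counts = np.array([842, 37, 0])     # zero observed Well -> Dead transitions
    alpha = counts + 1                  # uniform Dirichlet prior (pseudo-counts)

    # one draw per probabilistic-sensitivity-analysis iteration
    draws = rng.dirichlet(alpha, size=5)
    print(draws.round(4))
    print(draws.sum(axis=1))            # each row sums to exactly 1
    ```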

  11. Probabilistic modelling of human exposure to intense sweeteners in Italian teenagers: validation and sensitivity analysis of a probabilistic model including indicators of market share and brand loyalty.

    Science.gov (United States)

    Arcella, D; Soggiu, M E; Leclercq, C

    2003-10-01

    For the assessment of exposure to food-borne chemicals, the most commonly used methods in the European Union follow a deterministic approach based on conservative assumptions. Over the past few years, to get a more realistic view of exposure to food chemicals, risk managers are getting more interested in the probabilistic approach. Within the EU-funded 'Monte Carlo' project, a stochastic model of exposure to chemical substances from the diet and a computer software program were developed. The aim of this paper was to validate the model with respect to the intake of saccharin from table-top sweeteners and cyclamate from soft drinks by Italian teenagers with the use of the software and to evaluate the impact of the inclusion/exclusion of indicators on market share and brand loyalty through a sensitivity analysis. Data on food consumption and the concentration of sweeteners were collected. A food frequency questionnaire aimed at identifying females who were high consumers of sugar-free soft drinks and/or of table top sweeteners was filled in by 3982 teenagers living in the District of Rome. Moreover, 362 subjects participated in a detailed food survey by recording, at brand level, all foods and beverages ingested over 12 days. Producers were asked to provide the intense sweeteners' concentration of sugar-free products. Results showed that consumer behaviour with respect to brands has an impact on exposure assessments. Only probabilistic models that took into account indicators of market share and brand loyalty met the validation criteria.
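
    One way to read the model's use of market-share and brand-loyalty indicators is that each simulated consumer is assigned a brand according to market share and keeps it across eating occasions, so brand-level concentrations rather than a pooled mean drive intake. A toy sketch of that mechanism, with invented consumption and concentration figures:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_consumers, n_days = 10_000, 12

    # hypothetical soft-drink brands: cyclamate concentration (mg/L), market share
    conc = np.array([0.0, 120.0, 250.0])
    share = np.array([0.5, 0.3, 0.2])

    # brand loyalty: each consumer sticks to one brand, chosen by market share
    brand = rng.choice(len(conc), size=n_consumers, p=share)
    daily_l = rng.gamma(shape=2.0, scale=0.15, size=(n_consumers, n_days))

    exposure = daily_l.mean(axis=1) * conc[brand]        # mg/day per consumer
    print(f"mean {exposure.mean():.1f} mg/day, "
          f"97.5th pct {np.percentile(exposure, 97.5):.1f} mg/day")
    ```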

  12. Neo-Deterministic and Probabilistic Seismic Hazard Assessments: a Comparative Analysis

    Science.gov (United States)

    Peresan, Antonella; Magrin, Andrea; Nekrasova, Anastasia; Kossobokov, Vladimir; Panza, Giuliano F.

    2016-04-01

    Objective testing is the key issue for any reliable seismic hazard assessment (SHA). Different earthquake hazard maps must demonstrate their capability in anticipating ground shaking from future strong earthquakes before being put to use for purposes such as engineering design, insurance, and emergency management. Quantitative assessment of map performance is also an essential step in the scientific process of their revision and possible improvement. Cross-checking of probabilistic models against available observations and independent physics-based models is recognized as a major validation procedure. The existing maps from classical probabilistic seismic hazard analysis (PSHA), as well as those from neo-deterministic analysis (NDSHA), which have already been developed for several regions worldwide (including Italy, India and North Africa), are considered to exemplify the possibilities of cross-comparative analysis in spotting the limits and advantages of different methods. Where the data permit, a comparative analysis versus the documented seismic activity observed in reality is carried out, showing how available observations about past earthquakes can contribute to assessing the performance of the different methods. Neo-deterministic refers to a scenario-based approach, which allows for consideration of a wide range of possible earthquake sources as the starting point for scenarios constructed via full waveform modeling. The method does not make use of empirical attenuation models (i.e. Ground Motion Prediction Equations, GMPE) and naturally supplies realistic time series of ground shaking (i.e. complete synthetic seismograms), readily applicable to complete engineering analysis and other mitigation actions. The standard NDSHA maps provide reliable envelope estimates of maximum seismic ground motion from a wide set of possible scenario earthquakes, including the largest deterministically or historically defined credible earthquake. In addition

  13. A formal approach for change impact analysis of long term composed services using Probabilistic Cellular Automata

    Directory of Open Access Journals (Sweden)

    M. Thirumaran

    2016-04-01

    Incorporating changes into the logic of composed services dynamically and successfully is a challenge for sustaining a business's image and profit, especially when the change is expected to be made immediately at low cost. In this paper, we address this challenge by proposing a change impact analysis framework for long term composed services (LCS) which: (i) enables business people to implement the changes by themselves through their analysts; (ii) reduces cost and time by eliminating the dependence on IT developers once the application services are developed and delivered; and (iii) ensures effective incorporation of the changes by using standard methodologies for evaluation: finite state automata for verifying runtime compatibilities and change evaluation, and probabilistic cellular automata for impact analysis and prediction. Through the evaluated probability measures and effective incident matching, the knowledge gained by the analyst over the service logic and the efficiency of incorporating changes are increased.

  14. The Performance of Structure-Controller Coupled Systems Analysis Using Probabilistic Evaluation and Identification Model Approach

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    This study evaluates the performance of a passively controlled steel frame building under dynamic loads using time series analysis. A novel application of time- and frequency-domain evaluation is used to analyze the behavior of the control systems. In addition, autoregressive moving average (ARMA) neural networks are employed to identify the performance of the controller system. Three passive vibration control devices are utilized in this study, namely, the tuned mass damper (TMD), tuned liquid damper (TLD), and tuned liquid column damper (TLCD). The results show that the TMD control system is a more reliable controller than the TLD and TLCD systems in terms of vibration mitigation. The probabilistic evaluation and identification model showed that the probability analysis and ARMA neural network model are suitable to evaluate and predict the response of coupled building-controller systems.
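
    The identification step can be approximated, in a much simplified form, by fitting a plain ARMA model to a response record; the synthetic signal and the ARMA(2,1) order below are assumptions, and the paper's actual identifier is an ARMA-based neural network rather than this linear fit:

    ```python
    import numpy as np
    from statsmodels.tsa.arima.model import ARIMA

    rng = np.random.default_rng(1)

    # Synthetic stand-in for a measured roof-acceleration record:
    # a damped 1.5 Hz oscillation plus measurement noise, sampled at 50 Hz.
    t = np.arange(0.0, 20.0, 0.02)
    y = np.exp(-0.05 * t) * np.sin(2 * np.pi * 1.5 * t) + 0.05 * rng.standard_normal(t.size)

    # AR order 2 captures one vibration mode; the MA term absorbs noise colouring.
    res = ARIMA(y, order=(2, 0, 1)).fit()

    pred = res.predict(start=1, end=len(y) - 1)   # in-sample one-step predictions
    rmse = np.sqrt(np.mean((y[1:] - pred) ** 2))
    print(f"one-step RMSE: {rmse:.4f}")
    ```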

  15. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  16. Probabilistic Fracture Mechanics Analysis of Boiling Water Reactor Vessel for Cool-Down and Low Temperature Over-Pressurization Transients

    Directory of Open Access Journals (Sweden)

    Jeong Soon Park

    2016-04-01

    The failure probabilities of the reactor pressure vessel (RPV) for low temperature over-pressurization (LTOP) and cool-down transients are calculated in this study. For the cool-down transient, a pressure–temperature limit curve is generated in accordance with Section XI, Appendix G of the American Society of Mechanical Engineers (ASME) code, from which safety margin factors are deliberately removed for the probabilistic fracture mechanics analysis. Then, sensitivity analyses are conducted to understand the effects of some input parameters. For the LTOP transient, the failure of the RPV mostly occurs during the period of the abrupt pressure rise. For the cool-down transient, the decrease of the fracture toughness with temperature and time plays the main role in RPV failure at the end of the cool-down process. As expected, the failure probability increases with increasing fluence, Cu and Ni contents, and initial reference temperature-nil ductility transition (RTNDT). The effect of warm prestressing on the vessel failure probability for LTOP is not significant because most of the failures happen before the stress intensity factor reaches its peak value, while its effect reduces the failure probability by more than one order of magnitude for the cool-down transient.
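
    A deliberately simplified Monte Carlo sketch of this kind of probabilistic fracture mechanics calculation: sample chemistry, fluence and initial RTNDT, shift RTNDT with a Regulatory Guide 1.99-style embrittlement form, and compare an assumed applied stress intensity with an ASME-style lower-bound toughness given lognormal scatter. All distributions, the transient temperature and the fixed applied K are invented; the paper's plant-specific transients and flaw model are not reproduced.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000

    # Illustrative input distributions (not plant-specific values).
    fluence = rng.lognormal(np.log(3e19), 0.2, n)        # n/cm^2 (E > 1 MeV)
    cu = np.clip(rng.normal(0.15, 0.02, n), 0.0, None)   # wt% Cu
    ni = np.clip(rng.normal(0.60, 0.05, n), 0.0, None)   # wt% Ni
    rtndt0 = rng.normal(-20.0, 10.0, n)                  # initial RTNDT, degC

    # Embrittlement shift, Reg. Guide 1.99-style form (assumed chemistry factor).
    cf = 200.0 * cu * (1.0 + 0.5 * ni)
    f19 = fluence / 1e19
    shift = cf * f19 ** (0.28 - 0.10 * np.log10(f19))
    rtndt = rtndt0 + shift

    # ASME-style lower-bound KIc at an assumed cool-down temperature of 20 degC,
    # capped at an upper shelf, with lognormal scatter added as an assumption.
    t_rel = 20.0 - rtndt
    kic_mean = np.minimum(36.5 + 3.084 * np.exp(0.036 * (t_rel + 55.6)), 220.0)
    kic = kic_mean * rng.lognormal(0.0, 0.15, n)

    k_applied = 40.0   # MPa*sqrt(m), assumed from a postulated surface flaw
    print(f"conditional failure probability: {np.mean(kic < k_applied):.2e}")
    ```

    The fluence, Cu and Ni sensitivities reported in the abstract fall out directly: raising any of them raises the RTNDT shift, lowers the toughness at a given temperature, and increases the fraction of samples with kic below k_applied.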

  17. Probabilistic Modeling of Timber Structures

    DEFF Research Database (Denmark)

    Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2005-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The present pro...

  18. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    The WTI2017 project is responsible for the development of flood defence assessment tools for the 3600 km of Dutch primary flood defences: dikes/levees, dunes and hydraulic structures. These tools are necessary because, as per January 1st 2017, the new flood risk management policy for the Netherlands is implemented. The seven-decades-old design practice (the maximum water level methodology of 1958) and two-decades-old safety standards (the maximum hydraulic load methodology of 1996) are then formally replaced by a more risk-based approach in national flood risk management policy. The formal flood defence assessment is an important part of this new policy, especially for flood defence managers, since national and regional funding for reinforcement is based on this assessment. The new flood defence policy is based on a maximum allowable probability of flooding. For this, a maximum acceptable individual risk was determined at 1/100,000 per year; this is the probability of loss of life for every protected area in the Netherlands. Safety standards for flood defences were then determined based on this acceptable individual risk. The results were adjusted based on information from cost-benefit analysis, societal risk and large-scale societal disruption due to the failure of critical infrastructure, e.g. power stations. The resulting risk-based flood defence safety standards range from a 300 to a 100,000 year return period for failure. Two policy studies, WV21 (Safety from floods in the 21st century) and VNK-2 (the National Flood Risk in 2010), provided the essential information to determine the new risk-based safety standards for flood defences. The WTI2017 project will provide the safety assessment tools based on these new standards and is thus an essential element for the implementation of this policy change. A major issue to be tackled was the development of user-friendly tools, as the new assessment is to be carried out by personnel of the
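
    The arithmetic linking these return periods to failure probabilities is worth making explicit; a small worked example using only the numbers quoted above:

    ```python
    # Annual failure probability for a return period T is p = 1/T; the chance of
    # at least one failure within an N-year horizon is 1 - (1 - p)**N.
    for T in (300, 3_000, 100_000):   # spans the range of Dutch safety standards
        p = 1.0 / T
        p50 = 1.0 - (1.0 - p) ** 50
        print(f"T = {T:>7} yr: annual p = {p:.1e}, P(>=1 failure in 50 yr) = {p50:.4f}")
    ```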

  19. Manpower analysis in transportation safety. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bauer, C.S.; Bowden, H.M.; Colford, C.A.; DeFilipps, P.J.; Dennis, J.D.; Ehlert, A.K.; Popkin, H.A.; Schrader, G.F.; Smith, Q.N.

    1977-05-01

    The project described provides a manpower review of national, state and local needs for safety skills, and projects future manning levels for transportation safety personnel in both the public and private sectors. Survey information revealed that approximately 121,000 persons are currently employed directly in transportation safety occupations within the air carrier, highway and traffic safety, motor carrier, pipeline, rail carrier, and marine carrier transportation industry groups. The projected need for 1980 is over 145,000, of which over 80 percent will be in highway safety. An analysis of transportation tasks is included, and shows ten general categories around which the majority of safety activities are focused. A skills analysis shows that a generally high level of educational background and several years of experience are required for most transportation safety jobs. An overall review of safety programs in the transportation industry is included, together with chapters on the individual transportation modes.

  20. Automation for System Safety Analysis

    Science.gov (United States)

    Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul

    2009-01-01

    This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.

  1. A Methodology for the Integration of a Mechanistic Source Term Analysis in a Probabilistic Framework for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, Dave; Brunett, Acacia J.; Bucknor, Matthew

    2016-06-26

    GE Hitachi Nuclear Energy (GEH) and Argonne National Laboratory are currently engaged in a joint effort to modernize and develop probabilistic risk assessment (PRA) techniques for advanced non-light water reactors. At a high level, the primary outcome of this project will be the development of next-generation PRA methodologies that will enable risk-informed prioritization of safety- and reliability-focused research and development, while also identifying gaps that may be resolved through additional research. A subset of this effort is the development of PRA methodologies to conduct a mechanistic source term (MST) analysis for event sequences that could result in the release of radionuclides. The MST analysis seeks to realistically model and assess the transport, retention, and release of radionuclides from the reactor to the environment. The MST methods developed during this project seek to satisfy the requirements of the Mechanistic Source Term element of the ASME/ANS Non-LWR PRA standard. The MST methodology consists of separate analysis approaches for risk-significant and non-risk significant event sequences that may result in the release of radionuclides from the reactor. For risk-significant event sequences, the methodology focuses on a detailed assessment, using mechanistic models, of radionuclide release from the fuel, transport through and release from the primary system, transport in the containment, and finally release to the environment. The analysis approach for non-risk significant event sequences examines the possibility of large radionuclide releases due to events such as re-criticality or the complete loss of radionuclide barriers. This paper provides details on the MST methodology, including the interface between the MST analysis and other elements of the PRA, and provides a simplified example MST calculation for a sodium fast reactor.

  2. Modeling and analysis of cell membrane systems with probabilistic model checking.

    Science.gov (United States)

    Crepalde, Mirlaine A; Faria-Campos, Alessandra C; Campos, Sérgio V A

    2011-12-22

    Recently there has been growing interest in the application of Probabilistic Model Checking (PMC) for the formal specification of biological systems. PMC is able to exhaustively explore all states of a stochastic model and can provide valuable insights into its behavior that are more difficult to obtain using only traditional methods for system analysis such as deterministic and stochastic simulation. In this work we propose a stochastic model for the description and analysis of the sodium-potassium exchange pump. The sodium-potassium pump is a membrane transport system present in all animal cells and capable of moving sodium and potassium ions against their concentration gradients. We present a quantitative formal specification of the pump mechanism in the PRISM language, taking into consideration a discrete chemistry approach and the Law of Mass Action. We also present an analysis of the system using quantitative properties in order to verify the pump's reversibility and to understand the pump's behavior using trend labels for the transition rates of the pump reactions. Probabilistic model checking can be used along with other well-established approaches such as simulation and differential equations to better understand pump behavior. Using PMC we can determine whether specific events happen, such as whether the extracellular potassium is depleted in all model traces. We can also gain a more detailed perspective on the pump's behavior, such as determining its reversibility and why its normal operation becomes slow over time. This knowledge can be used to direct experimental research and make it more efficient, leading to faster and more accurate scientific discoveries.
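
    The paper's specification is written in the PRISM language; as a language-neutral sketch of the same discrete-chemistry, mass-action view, here is a minimal Gillespie-style stochastic simulation of a two-reaction pump cycle. The stoichiometry, rate constants and molecule counts are invented placeholders, not the paper's model:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Simplified cycle:  R1: E + 3 Na_in -> Ep + 3 Na_out   (a1 = k1 * E * Na_in)
    #                    R2: Ep + 2 K_out -> E + 2 K_in     (a2 = k2 * Ep * K_out)
    k1, k2 = 1e-4, 2e-4
    E, Ep = 50, 0
    na_in, na_out, k_out, k_in = 900, 100, 600, 100

    t, t_end = 0.0, 200.0
    while t < t_end:
        a1 = k1 * E * na_in if na_in >= 3 else 0.0
        a2 = k2 * Ep * k_out if k_out >= 2 else 0.0
        a0 = a1 + a2
        if a0 == 0.0:                      # a gradient is exhausted: the pump stalls
            break
        t += rng.exponential(1.0 / a0)     # waiting time to the next reaction
        if rng.random() < a1 / a0:         # pick a reaction by its propensity
            E -= 1; Ep += 1; na_in -= 3; na_out += 3
        else:
            Ep -= 1; E += 1; k_out -= 2; k_in += 2

    print(f"t = {t:.1f}: Na_in={na_in} Na_out={na_out} K_out={k_out} K_in={k_in}")
    ```

    As the ion pools deplete, the propensities a1 and a2 shrink and events space out, mirroring the slowdown the authors probe; the difference is that model checking verifies such properties over all traces rather than over sampled ones.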

  3. 14 CFR 33.75 - Safety analysis.

    Science.gov (United States)

    2010-01-01

    Title 14, Aeronautics and Space; Airworthiness Standards: Aircraft Engines; Design and Construction, Turbine Aircraft Engines; § 33.75 Safety analysis. (a) ... consequences of all failures that can reasonably be expected to occur. This analysis will take into account, if...

  4. 14 CFR 35.15 - Safety analysis.

    Science.gov (United States)

    2010-01-01

    Title 14, Aeronautics and Space; Airworthiness Standards: Propellers; Design and Construction; § 35.15 Safety analysis. (a)(1) The applicant must analyze the ... This analysis will take into account, if applicable: (i) The propeller system in a typical installation...

  5. Hazard Analysis and Safety Requirements for Small Drone Operations: To What Extent Do Popular Drones Embed Safety?

    Science.gov (United States)

    Plioutsias, Anastasios; Karanikas, Nektarios; Chatzimihailidou, Maria Mikela

    2017-08-02

    Currently, published risk analyses for drones refer mainly to commercial systems, use data from civil aviation, and are based on probabilistic approaches without suggesting an inclusive list of hazards and respective requirements. Within this context, this article presents: (1) a set of safety requirements generated from the application of the systems theoretic process analysis (STPA) technique on a generic small drone system; (2) a gap analysis between the set of safety requirements and the ones met by 19 popular drone models; (3) the extent of the differences between those models, their manufacturers, and the countries of origin; and (4) the association of drone prices with the extent they meet the requirements derived by STPA. The application of STPA resulted in 70 safety requirements distributed across the authority, manufacturer, end user, or drone automation levels. A gap analysis showed high dissimilarities regarding the extent to which the 19 drones meet the same safety requirements. Statistical results suggested a positive correlation between drone prices and the extent that the 19 drones studied herein met the safety requirements generated by STPA, and significant differences were identified among the manufacturers. This work complements the existing risk assessment frameworks for small drones, and contributes to the establishment of a commonly endorsed international risk analysis framework. Such a framework will support the development of a holistic and methodologically justified standardization scheme for small drone flights. © 2017 Society for Risk Analysis.
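
    The reported price-requirements association is a rank correlation; a minimal sketch with invented per-drone figures (the study's actual scores are not reproduced here):

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    # Invented data: price (USD) and how many of the 70 STPA-derived safety
    # requirements each of ten hypothetical drone models meets.
    price = np.array([450, 1200, 300, 999, 1500, 650, 2900, 800, 550, 1100])
    n_met = np.array([22, 35, 18, 31, 41, 25, 52, 28, 24, 33])

    rho, p_value = spearmanr(price, n_met)
    print(f"Spearman rho = {rho:.2f}, p = {p_value:.4f}")
    ```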

  6. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2007-08-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  7. Modeling of the Sedimentary Interbedded Basalt Stratigraphy for the Idaho National Laboratory Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Suzette Payne

    2006-04-01

    This report summarizes how the effects of the sedimentary interbedded basalt stratigraphy were modeled in the probabilistic seismic hazard analysis (PSHA) of the Idaho National Laboratory (INL). Drill holes indicate the bedrock beneath INL facilities is composed of about 1.1 km of alternating layers of basalt rock and loosely consolidated sediments. Alternating layers of hard rock and “soft” loose sediments tend to attenuate seismic energy more than uniform rock does, due to scattering and damping. The INL PSHA incorporated the effects of the sedimentary interbedded basalt stratigraphy by developing site-specific shear (S) wave velocity profiles. The profiles were used in the PSHA to model the near-surface site response by developing site-specific stochastic attenuation relationships.

  8. Analysis of Population Diversity of Dynamic Probabilistic Particle Swarm Optimization Algorithms

    Directory of Open Access Journals (Sweden)

    Qingjian Ni

    2014-01-01

    In evolutionary algorithms, population diversity is an important factor for solution performance. In this paper, drawing on population diversity analysis methods used in other evolutionary algorithms, three indicators are introduced as measures of population diversity in PSO algorithms: the standard deviation of population fitness values, population entropy, and the Manhattan norm of the standard deviation of population positions. The three measures are used to analyze population diversity in a relatively new PSO variant, Dynamic Probabilistic Particle Swarm Optimization (DPPSO). The results show that the three measures can fully reflect the evolution of population diversity in DPPSO algorithms from different angles, and we also discuss the impact of population diversity on the DPPSO variants. The conclusions about population diversity in DPPSO can be used to analyze, design, and improve DPPSO algorithms, thus improving optimization performance, and could also be beneficial for understanding the working mechanism of DPPSO theoretically.
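
    The three indicators are simple to compute for a swarm snapshot; a minimal sketch, with the histogram binning for the entropy term chosen here as an assumption:

    ```python
    import numpy as np

    def diversity_measures(positions, fitness, bins=20):
        """The paper's three diversity indicators for one swarm snapshot."""
        # 1. Standard deviation of population fitness values.
        fitness_std = fitness.std()
        # 2. Population entropy over a fitness histogram.
        hist, _ = np.histogram(fitness, bins=bins)
        p = hist[hist > 0] / hist.sum()
        entropy = -(p * np.log(p)).sum()
        # 3. Manhattan norm of the per-dimension standard deviation of positions.
        position_spread = positions.std(axis=0).sum()
        return fitness_std, entropy, position_spread

    rng = np.random.default_rng(5)
    positions = rng.normal(0.0, 2.0, size=(40, 10))   # 40 particles, 10 dimensions
    fitness = (positions ** 2).sum(axis=1)            # sphere function as a stand-in
    print(diversity_measures(positions, fitness))
    ```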

  9. 230Th/U ages Supporting Hanford Site-Wide Probabilistic Seismic Hazard Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Paces, James B. [U.S. Geological Survey

    2014-08-31

    This product is a USGS Administrative Report that discusses the samples and methods used to conduct uranium-series isotope analyses, and the resulting ages and initial 234U/238U activity ratios, of pedogenic cements developed in several different middle to late Pleistocene surfaces in the Hanford area. Samples were collected and dated to provide calibration of soil development in surface deposits that are being used in the Hanford Site-Wide probabilistic seismic hazard analysis conducted by AMEC. The report includes descriptions of sample locations and physical characteristics, sample preparation, chemical processing and mass spectrometry, analytical results, and calculated ages for individual sites. Ages of innermost rinds on a number of samples from five sites in eastern Washington are consistent with a range of minimum depositional ages, from 17 ka for cataclysmic flood deposits to greater than 500 ka for alluvium at several sites.

  10. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands); Grupa, J.B. [Netherlands Energy Research Foundation (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models.

  11. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models.

  12. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models.

  13. Uncertainty analysis for probabilistic steam generators tube rupture in LBB applications

    Energy Technology Data Exchange (ETDEWEB)

    Durbec, V.; Pitner, P.; Pages, D. [Electricite de France, 78 - Chatou (France). Research and Development Div.; Riffard, T. [Electricite de France, 69 - Villeurbanne (France). Engineering and Construction Div.; Flesch, B. [Electricite de France, 92 - Paris la Defense (France). Generation and Transmission Div.

    1997-10-01

    Steam generators (SG) of pressurized water reactors have experienced various types of tube degradation worldwide, mainly from stress corrosion cracking; because of this damage, primary-secondary leakage or tube rupture can occur. Safety against the risk of tube rupture is achieved through a combination of periodic in-service inspections (eddy current testing), surveillance of leaks during operation (the leak-before-break concept) and tube plugging. In order to optimize SG tube bundle maintenance, Electricite de France has developed a specific software package named COMPROMIS. The model, based on probabilistic fracture mechanics, makes it possible to quantify the influence of in-service inspections and maintenance work on the risk of a SG tube rupture (SGTR), taking all significant parameters into account as random variables (initial defect size distribution, reliability of non-destructive examinations, crack initiation and propagation, critical sizes, leak before risk of break, etc.). This paper focuses on the leak rate calculation module and presents a sensitivity study of the influence of leak-before-break on the conditional failure probability. (author) 8 refs.

  14. Analysis of vehicle-bicycle interactions at unsignalized crossings: A probabilistic approach and application.

    Science.gov (United States)

    Silvano, Ary P; Koutsopoulos, Haris N; Ma, Xiaoliang

    2016-12-01

    In the last decades, bicycle usage has been increasing in many countries due to the potential environmental and health benefits. Therefore, there is a need to better understand cyclists' interactions with vehicles, and to build models and tools for evaluating multimodal transportation infrastructure with respect to cycling safety, accessibility, and other planning aspects. This paper presents a modeling framework to describe driver-cyclist interactions when they are approaching a conflicting zone. In particular, the car driver yielding behavior is modeled as a function of a number of explanatory variables. A two-level hierarchical, probabilistic framework (based on discrete choice theory) is proposed to capture the driver's yielding decision process when interacting with a cyclist. The first level models the probability of the car driver perceiving a situation with a bicycle as a potential conflict whereas the second models the probability of yielding given that a conflict has been perceived by the driver. The framework also incorporates the randomness of the location of the drivers' decision point. The methodology is applied in a case study using observations at a typical Swedish roundabout. The results show that the conflict probability is affected differently depending on the user (cyclist or driver) who arrives at the interaction zone first. The yielding probability depends on the speed of the vehicle and the proximity of the cyclist. Copyright © 2016 Elsevier Ltd. All rights reserved.
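
    The two-level structure multiplies a conflict-perception probability by a conditional yielding probability; a minimal sketch with logistic link functions and invented coefficients (the paper's estimated parameters are not reproduced):

    ```python
    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    # P(yield) = P(conflict perceived) * P(yield | conflict); coefficients invented.
    def p_driver_yields(speed_kmh, cyclist_dist_m, cyclist_first):
        z_conflict = 1.2 + 1.5 * cyclist_first - 0.25 * cyclist_dist_m
        z_yield = 2.0 - 0.12 * speed_kmh - 0.10 * cyclist_dist_m
        return sigmoid(z_conflict) * sigmoid(z_yield)

    print(f"slow car, cyclist close and first: {p_driver_yields(15, 3, 1):.3f}")
    print(f"fast car, cyclist far and second:  {p_driver_yields(35, 12, 0):.3f}")
    ```

    The signs encode the abstract's findings: arriving first raises the perceived-conflict probability, while higher vehicle speed and a more distant cyclist lower the conditional yielding probability.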

  15. Task D: Hydrogen safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Swain, M.R.; Sievert, B.G. [Univ. of Miami, Coral Gables, FL (United States); Swain, M.N. [Analytical Technologies, Inc., Miami, FL (United States)

    1996-10-01

    This report covers two topics. The first is a review of codes, standards, regulations, recommendations, certifications, and pamphlets which address the safety of gaseous fuels. The second is an experimental investigation of hydrogen flame impingement. Four areas of concern in the conversion of natural gas safety publications to hydrogen safety publications are delineated. Two design criteria for hydrogen vehicle fuel systems are proposed. It is concluded from the experimental work that lightweight, low-cost firewalls to resist hydrogen flame impingement are feasible.

  16. Flood risk and adaptation strategies in Indonesia: a probabilistic analysis using globally available data

    Science.gov (United States)

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen; Ward, Philip

    2015-04-01

    In recent years, global flood losses have been increasing due to socio-economic development and climate change, with the largest risk increases in developing countries such as Indonesia. For countries to undertake effective risk management, an accurate understanding of both current and future risk is required. However, detailed information is rarely available, particularly for developing countries. We present a first-of-its-kind country-scale analysis of flood risk using globally available data that combines a global inundation model with a land use change model and more local data on flood damages. To assess the contribution and uncertainty of different drivers of future risk, we integrate thousands of socio-economic and climate projections in a probabilistic way and include multiple adaptation strategies. Indonesia is used as a case study as it is a country that already faces high flood risk and is undergoing rapid urbanization. We developed probabilistic and spatially explicit urban expansion projections from 2000 to 2030 that show that the increase in urban extent ranges from 215% to 357% (5th and 95th percentile). We project rapidly rising flood risk, both for coastal and river floods. This increase is largely driven by economic growth and urban expansion (i.e. increasing exposure). Whilst sea level rise will amplify this trend, the response of river floods to climate change is uncertain, with the impact of the mean ensemble of 20 climate projections (5 GCMs and 4 RCPs) being close to zero. However, as urban expansion is the main driving force of future risk, we argue that the implementation of adaptation measures is increasingly pressing, regardless of the wide uncertainty in climate projections. Hence, we evaluated the effectiveness of two adaptation measures: spatial planning in flood-prone areas and enhanced flood protection. Both strategies have a large potential to effectively offset the increasing risk trend. The risk reduction is in the range of 22-85% and 53
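
    Mechanically, the probabilistic integration of the projections amounts to reporting percentiles over a very large ensemble; a schematic sketch with an invented ensemble standing in for the thousands of combined socio-economic and climate runs:

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Invented 2030 risk multipliers: a strong exposure-driven factor times a
    # climate factor whose ensemble-mean effect is close to zero.
    growth = rng.lognormal(np.log(2.8), 0.25, size=20_000)
    climate = rng.normal(1.0, 0.15, size=20_000)
    risk_2030 = growth * climate

    p5, p50, p95 = np.percentile(risk_2030, [5, 50, 95])
    print(f"risk multiplier 2030: median {p50:.2f} (5th-95th: {p5:.2f}-{p95:.2f})")
    ```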

  17. Safety analysis, risk assessment, and risk acceptance criteria

    Energy Technology Data Exchange (ETDEWEB)

    Jamali, K. [Dept. of Energy, Germantown, MD (United States). Core Technical Support and Facility Transition; Stack, D.W.; Sullivan, L.H.; Sanzo, D.L. [Los Alamos National Lab., NM (United States)

    1997-08-01

    This paper discusses a number of topics that relate safety analysis as documented in the Department of Energy (DOE) safety analysis reports (SARs), probabilistic risk assessments (PRA) as characterized primarily in the context of the techniques that have assumed some level of formality in commercial nuclear power plant applications, and risk acceptance criteria as an outgrowth of PRA applications. DOE SARs of interest are those that are prepared for DOE facilities under DOE Order 5480.23 and the implementing guidance in DOE STD-3009-94. It must be noted that the primary area of application for DOE STD-3009 is existing DOE facilities and that certain modifications of the STD-3009 approach are necessary in SARs for new facilities. Moreover, it is the hazard analysis (HA) and accident analysis (AA) portions of these SARs that are relevant to the present discussions. Although PRAs can be qualitative in nature, PRA as used in this paper refers more generally to all quantitative risk assessments and their underlying methods. HA as used in this paper refers more generally to all qualitative risk assessments and their underlying methods that have been in use in hazardous facilities other than nuclear power plants. This discussion includes both quantitative and qualitative risk assessment methods. PRA has been used, improved, developed, and refined since the Reactor Safety Study (WASH-1400) was published in 1975 by the Nuclear Regulatory Commission (NRC). Much debate has ensued since WASH-1400 on exactly what the role of PRA should be in plant design, reactor licensing, 'ensuring' plant and process safety, and a large number of other decisions that must be made for potentially hazardous activities. Of particular interest in this area is whether the risks quantified using PRA should be compared with numerical risk acceptance criteria (RACs) to determine whether a facility is 'safe.' Use of RACs requires quantitative estimates of consequence frequency and magnitude.

  18. Probabilistic thinking to support early evaluation of system quality: through requirement analysis

    NARCIS (Netherlands)

    Rajabali Nejad, Mohammadreza; Bonnema, Gerrit Maarten

    2014-01-01

    This paper focuses on coping with system quality in the early phases of design, where there is a lack of knowledge about a system, its functions or its architecture. The paper encourages knowledge-based evaluation of system quality and promotes probabilistic thinking. It states that probabilistic thinking

  19. Hot Cell Facility (HCF) Safety Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    Mitchell, Gerry W.; Longley, Susan W.; Philbin, Jeffrey S.; Mahn, Jeffrey A.; Berry, Donald T.; Schwers, Norman F.; Vanderbeek, Thomas E.; Naegeli, Robert E.

    2000-11-01

    This Safety Analysis Report (SAR) is prepared in compliance with the requirements of DOE Order 5480.23, Nuclear Safety Analysis Reports, and has been written to the format and content guide of DOE-STD-3009-94, Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Safety Analysis Reports. The Hot Cell Facility is a Hazard Category 2 nonreactor nuclear facility and is operated by Sandia National Laboratories for the Department of Energy. This SAR provides a description of the HCF and its operations, and an assessment of the hazards and potential accidents which may occur in the facility. The potential consequences and likelihood of these accidents are analyzed and described. Using the process and criteria described in DOE-STD-3009-94, safety-related structures, systems and components are identified, and the important safety functions of each SSC are described. Additionally, information describing the safety management programs at SNL is provided in ancillary chapters of the SAR.

  20. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  1. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    textabstractProbabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  2. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  3. Development of safety analysis technology for LMR

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, Do Hee; Kwon, Y. M.; Kim, K. D. [and others

    2000-05-01

    The analysis methodologies and the analysis computer code system for the transient, HCDA, and containment performance analyses required for KALIMER safety analyses have been developed. The SSC-K code has been developed from SSC-L, an analysis code for loop-type LMRs, by improving the models necessary for KALIMER system analysis, and additional models have been added to the code. In addition, an HCDA analysis model has been developed and the containment performance analysis code has been improved. The preliminary basis for the safety analysis has been established, and preliminary safety analyses for the key design features have been performed. In addition, a state-of-the-art analysis for LMR PSA and overseas safety and licensing requirements have been reviewed. A design database for the systematic management of design documents and design processes has also been established.

  4. SECOND WASTE PACKAGE PROBABILISTIC CRITICALITY ANALYSIS: GENERATION AND EVALUATION OF INTERNAL CRITICALITY CONFIGURATIONS

    Energy Technology Data Exchange (ETDEWEB)

    P. Gottlieb, J.R. Massari, J.K. McCoy

    1996-03-27

    This analysis is prepared by the Mined Geologic Disposal System (MGDS) Waste Package Development (WPD) department to provide an evaluation of the criticality potential within a waste package having some or all of its contents degraded by corrosion and removal of neutron absorbers. This analysis is also intended to provide an estimate of the consequences of any internal criticality, particularly in terms of any increase in radionuclide inventory. These consequence estimates will be used as part of the WPD input to the Total System Performance Assessment. The ultimate objective of this analysis is to augment the information gained from the Initial Waste Package Probabilistic Criticality Analyses (Ref. 5.8 and 5.9, hereafter referred to as IPA) to a degree which will support preliminary waste package design recommendations intended to reduce the risk of waste package criticality and the risk to total repository system performance posed by the consequences of any criticality. The IPA evaluated the criticality potential under the assumption that the waste package basket retained its structural integrity, so that the assemblies retained their initial separation even when the neutron absorbers had been leached from the basket. This analysis is based on the more realistic condition that removal of the neutron absorbers is a consequence of the corrosion of the steel in which they are contained, which has the additional consequence of reducing the structural support between assemblies. The result is a set of more reactive configurations having a smaller spacing between assemblies, or no inter-assembly spacing at all. Another difference from the IPA is the minimal attention given to probabilistic evaluation in this study. Although the IPA covered a time horizon of 100,000 years, its lack of consideration of basket degradation modes made it primarily applicable to the first 10,000 years. In contrast, this study, by focusing on the degraded modes of the basket, is primarily

  5. A probabilistic framework for the exploration of enzymatic capabilities based on feasible kinetics and control analysis.

    Science.gov (United States)

    Saa, Pedro A; Nielsen, Lars K

    2016-03-01

    Analysis of limiting steps within enzyme-catalyzed reactions is fundamental to understanding their behavior and regulation. Methods capable of unravelling control properties and exploring the kinetic capabilities of enzymatic reactions would be particularly useful for protein and metabolic engineering. While the single-enzyme control analysis formalism has previously been applied to well-studied enzymatic mechanisms, broader application of this formalism is constrained in practice by the scarcity of kinetic data and the difficulty of describing complex allosteric mechanisms. To overcome these limitations, we present here a probabilistic framework enabling control analysis of previously unexplored mechanisms under uncertainty. By combining a thermodynamically consistent parameterization with an efficient Sequential Monte Carlo sampler embedded in a Bayesian setting, this framework yields insights into the capabilities of enzyme-catalyzed reactions with modest kinetic information, provided that the catalytic mechanism and a thermodynamic reference point are defined. The framework was used to unravel the impact of thermodynamic affinity, substrate saturation levels and effector concentrations on the flux control and response coefficients of a diverse set of enzymatic reactions. Our results highlight the importance of the metabolic context in the control analysis of isolated enzymes, as well as the use of statistically sound methods for their interpretation. This framework significantly expands our current capabilities for unravelling the control properties of general reaction kinetics with a limited amount of information, and will be useful for both theoreticians and experimentalists in the field. Crown Copyright © 2015. Published by Elsevier B.V. All rights reserved.

  6. Analysis in Banach spaces volume II probabilistic methods and operator theory

    CERN Document Server

    Hytönen, Tuomas; Veraar, Mark; Weis, Lutz

    2017-01-01

    This second volume of Analysis in Banach Spaces, Probabilistic Methods and Operator Theory, is the successor to Volume I, Martingales and Littlewood-Paley Theory. It presents a thorough study of the fundamental randomisation techniques and the operator-theoretic aspects of the theory. The first two chapters address the relevant classical background from the theory of Banach spaces, including notions like type, cotype, K-convexity and contraction principles. In turn, the next two chapters provide a detailed treatment of the theory of R-boundedness and Banach space valued square functions developed over the last 20 years. In the last chapter, this content is applied to develop the holomorphic functional calculus of sectorial and bi-sectorial operators in Banach spaces. Given its breadth of coverage, this book will be an invaluable reference to graduate students and researchers interested in functional analysis, harmonic analysis, spectral theory, stochastic analysis, and the operator-theoretic approach...

  7. Probabilistic analysis and material characterisation of canister insert for spent nuclear fuel. Summary report

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Claes-Goeran [Swedish Nuclear Fuel and Waste Management Co., Stockholm (Sweden); Andersson, Mats; Erixon, Bo [AaF Industriteknik, Stockholm (Sweden); Bjoerkegren, Lars-Erik [Swedish Foundry Association, Stockholm (Sweden); Dillstroem, Peter [DNV Technology, Stockholm (Sweden); Minnebo, Philip

    2005-11-15

    The KBS-3 canister for geological disposal of spent nuclear fuel in Sweden consists of a ductile cast iron insert and a copper shielding. The canister should inhibit release of radionuclides for at least 100,000 years. The copper protects the canister from corrosion, whereas the ductile cast iron insert provides the mechanical strength. In the repository, the hydrostatic pressure from the groundwater and the swelling pressure from the surrounding bentonite, which in total result in a maximum pressure of 14 MPa, will load the canisters in compression. Over these extreme time scales, ice ages are expected, with a maximum ice thickness of 3,000 m resulting in an additional pressure of 30 MPa. The maximum design pressure for the KBS-3 canisters has therefore been set to 44 MPa. A relatively large number of canisters have been manufactured as part of SKB's development programme. To verify the strength of the canisters at this stage of development, SKB initiated a project in cooperation with the European Commission's Joint Research Centre (JRC) Institute for Energy in Petten in the Netherlands, together with a number of other partners. Three inserts manufactured by different Swedish foundries were used in the project. A large statistical test programme was developed to determine statistical distributions of various material parameters and defect distributions. These data, together with the results from stress and strain finite element analysis, were subsequently used in a probabilistic analysis to determine the probability of plastic collapse caused by high pressure or of fracture by crack growth in regions with tensile stresses. The main conclusions from the probabilistic analysis are: 1. At the design pressure of 44 MPa, the probability of failure is insignificant ({approx}2x10{sup -9}). This is the case even though several conservative assumptions have been made. 2. The stresses in the insert caused by the outer pressure are mainly compressive. The regions with tensile

  8. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    Science.gov (United States)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities of exceeding different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions in the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions drop out, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA, which can be summarized in the following steps: i) perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) apply a filtering procedure which uses cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) perform high resolution numerical simulations only for these representative scenarios and for a subset of near-field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and
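
    Step (ii), the reduction of a huge scenario set to a handful of representatives, can be sketched with an off-the-shelf clustering pass; the scenario features and counts below are invented placeholders for the event-tree output:

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    rng = np.random.default_rng(11)
    # Invented features per scenario: magnitude, depth (km), offshore wave proxy.
    scenarios = np.column_stack([
        rng.uniform(6.0, 8.5, 5000),
        rng.uniform(5.0, 40.0, 5000),
        rng.lognormal(0.0, 0.5, 5000),
    ])

    km = KMeans(n_clusters=25, n_init=10, random_state=0).fit(scenarios)
    # The scenario nearest each centroid becomes the representative that is run
    # through the expensive high-resolution inundation simulation.
    d = np.linalg.norm(scenarios[:, None, :] - km.cluster_centers_[None], axis=2)
    representatives = d.argmin(axis=0)
    print(f"{len(set(representatives))} representative scenarios out of {len(scenarios)}")
    ```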

  9. Two non-probabilistic methods for uncertainty analysis in accident reconstruction.

    Science.gov (United States)

    Zou, Tiefang; Yu, Zhi; Cai, Ming; Liu, Jike

    2010-05-20

    There are many uncertain factors in traffic accidents, so it is necessary to study their influence in order to improve the accuracy and confidence of accident reconstruction results. It is difficult to evaluate the uncertainty of calculation results if the expression of the reconstruction model is implicit and/or the distributions of the independent variables are unknown. Based on interval mathematics, convex models and design of experiments, two non-probabilistic methods are proposed. These two methods are efficient under conditions where existing uncertainty analysis methods can hardly work because the accident reconstruction model is implicit and/or the distributions of independent variables are unknown, and parameter sensitivities can be obtained from them as well. An accident case is investigated with the proposed methods. Results show that the convex models method is the most conservative, and the solution of the interval analysis method is very close to those of the other methods. These two methods are a beneficial supplement to existing uncertainty analysis methods.
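
    When the reconstruction model happens to be explicit and monotone, the interval method reduces to evaluating the model at the interval endpoints; a minimal sketch with an assumed skid-to-stop model and invented bounds:

    ```python
    import math

    # v = sqrt(2 * mu * g * d) is increasing in both mu and d, so the output
    # interval follows directly from the input interval endpoints.
    g = 9.81
    mu = (0.55, 0.75)    # assumed friction coefficient bounds
    d = (18.0, 22.0)     # assumed skid-mark length bounds, m

    v_lo = math.sqrt(2.0 * mu[0] * g * d[0])
    v_hi = math.sqrt(2.0 * mu[1] * g * d[1])
    print(f"initial speed in [{v_lo * 3.6:.1f}, {v_hi * 3.6:.1f}] km/h")
    ```

    For implicit or non-monotone models, which are the paper's actual target, the endpoint trick no longer suffices, and the proposed combination of design of experiments with interval or convex-model bounding takes its place.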

  10. Probabilistic Analysis of a SiC/SiC Ceramic Matrix Composite Turbine Vane

    Science.gov (United States)

    Murthy, Pappu L. N.; Nemeth, Noel N.; Brewer, David N.; Mital, Subodh

    2004-01-01

    To demonstrate the advanced composite materials technology under development within the Ultra-Efficient Engine Technology (UEET) Program, it was planned to fabricate, test, and analyze a turbine vane made entirely of silicon carbide-fiber-reinforced silicon carbide matrix composite (SiC/SiC CMC) material. The objective was to utilize a five-harness satin weave melt-infiltrated (MI) SiC/SiC composite material developed under this program to design and fabricate a stator vane that can endure 1000 hours of engine service conditions. The vane was designed such that the expected maximum stresses were kept within the proportional limit strength of the material; any violation of this design requirement was considered a failure. This report presents the results of a probabilistic analysis and reliability assessment of the vane. The probability of failing to meet the design requirements was computed. In the analysis, material properties, strength, and pressure loading were considered as random variables. The pressure loads were considered normally distributed with a nominal variation. A temperature profile on the vane was obtained by performing a computational fluid dynamics (CFD) analysis and was assumed to be deterministic. The results suggest that for the current vane design, the chance of not meeting the design requirements is about 1.6 percent.
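
    At its core this is a stress-strength interference calculation; a minimal Monte Carlo sketch with invented distributions (the report's actual inputs and response model are not reproduced):

    ```python
    import numpy as np

    rng = np.random.default_rng(2024)
    n = 1_000_000

    strength = rng.normal(160.0, 12.0, n)       # assumed proportional limit, MPa
    stress = 120.0 * rng.normal(1.0, 0.08, n)   # assumed peak vane stress, MPa

    p_f = np.mean(stress > strength)
    print(f"P(stress exceeds proportional limit) = {p_f:.3%}")
    ```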

  11. Probabilistic Threshold Criterion

    Energy Technology Data Exchange (ETDEWEB)

    Gresshoff, M; Hrousis, C A

    2010-03-09

    The Probabilistic Shock Threshold Criterion (PSTC) Project at LLNL develops phenomenological criteria for estimating safety or performance margin on high explosive (HE) initiation in the shock initiation regime, creating tools for safety assessment and design of initiation systems and HE trains in general. Until recently, there has been little foundation for probabilistic assessment of HE initiation scenarios. This work attempts to use probabilistic information that is available from both historic and ongoing tests to develop a basis for such assessment. Current PSTC approaches start with the functional form of the James Initiation Criterion as a backbone, and generalize to include varying areas of initiation and provide a probabilistic response based on test data for 1.8 g/cc (Ultrafine) 1,3,5-triamino-2,4,6-trinitrobenzene (TATB) and LX-17 (92.5% TATB, 7.5% Kel-F 800 binder). Application of the PSTC methodology is presented investigating the safety and performance of a flying plate detonator and the margin of an Ultrafine TATB booster initiating LX-17.

  12. Autoclave nuclear criticality safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    D'Aquila, D.M. [Martin Marietta Energy Systems, Inc., Piketon, OH (United States); Tayloe, R.W. Jr. [Battelle, Columbus, OH (United States)

    1991-12-31

    Steam-heated autoclaves are used in gaseous diffusion uranium enrichment plants to heat large cylinders of UF{sub 6}. Nuclear criticality safety for these autoclaves is evaluated. To enhance criticality safety, systems are incorporated into the design of autoclaves to limit the amount of water present. These safety systems also increase the likelihood that any UF{sub 6} inadvertently released from a cylinder into an autoclave is not released to the environment. Up to 140 pounds of water can be held up in large autoclaves. This mass of water is sufficient to support a nuclear criticality when optimally combined with 125 pounds of UF{sub 6} enriched to 5 percent U{sup 235}. However, water in autoclaves is widely dispersed as condensed droplets and vapor, and is extremely unlikely to form a critical configuration with released UF{sub 6}.

  13. Tolerance Analysis in Straight-Build Mechanical Assemblies Using a Probabilistic Approach-2D Assembly

    Directory of Open Access Journals (Sweden)

    Tanweer Hussain

    2013-04-01

    Product quality in mechanical assemblies is determined by how manufacturing variation disperses as the structure is built up. This paper focuses on straight-build assembly and proposes a probabilistic approach, based on a connective assembly model, to analyze the effect of individual component variations on the eccentricity of the straight-build assembly. The probabilistic approach calculates the pdf (probability density function) of the key assembly variation of rotor assemblies in high-speed rotating machines. Two straight-build scenarios are considered: (i) best build and (ii) direct build, for 2D (two-dimensional) axi-symmetric assemblies. Numerical examples are presented to investigate the efficiency and accuracy of the probabilistic approach in comparison to MCS (Monte Carlo simulation).
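
    The direct-build scenario can be sketched with a simple Monte Carlo stack-up of 2D eccentricity vectors (the tolerances and stage count below are invented for illustration; the paper's connective assembly model also propagates tilt, which is omitted here):

```python
import numpy as np

rng = np.random.default_rng(0)
n_stages, n_mc = 8, 200_000

# Hypothetical per-joint concentricity tolerance (mm, 1-sigma per axis).
sigma_xy = 0.02

# Direct build: each joint's eccentricity vector simply stacks up.
dx = rng.normal(0, sigma_xy, size=(n_mc, n_stages)).sum(axis=1)
dy = rng.normal(0, sigma_xy, size=(n_mc, n_stages)).sum(axis=1)
ecc = np.hypot(dx, dy)

print(f"Mean eccentricity: {ecc.mean():.4f} mm")
print(f"99th percentile : {np.percentile(ecc, 99):.4f} mm")
# With i.i.d. normal axes the eccentricity is Rayleigh distributed, so the
# Monte Carlo result can be checked against the closed-form pdf.
```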

  14. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Suzette Jackson [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Ryan [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Kevin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rodriguez-Marek, Adrian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Falero, Valentina Montaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngs, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Advanced Test Reactor (ATR), and Naval Reactors Facility (NRF) at the Idaho National Laboratory (INL). The PSHA followed the approaches and procedures for a Senior Seismic Hazard Analysis Committee (SSHAC) Level 1 study and included a Participatory Peer Review Panel (PPRP) to provide a confident technical basis and mean-centered estimates of the ground motions. A new risk-informed methodology for evaluating the need for an update of an existing PSHA was developed as part of the Seismic Risk Assessment (SRA) project. To develop and implement the new methodology, the SRA project elected to perform two SSHAC Level 1 PSHAs. The first was for the Fuel Manufacturing Facility (FMF), which is classified as a Seismic Design Category (SDC) 3 nuclear facility. The second was for the ATR Complex, which has facilities classified as SDC-4. The new methodology requires defensible estimates of ground motion levels (mean and full distribution of uncertainty) for its criteria and evaluation process. The INL SSHAC Level 1 PSHA demonstrates the use of the PPRP; evaluation and integration by a small team holding multiple roles and responsibilities (four team members and one specialty contractor); and the feasibility of a short-duration schedule (10 months). Additionally, a SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels for the Spent Fuel Handling Recapitalization Project (SFHP) process facility.

  15. Designing, operating and maintaining artificial recharge pond under uncertainty: a probabilistic risk analysis

    Science.gov (United States)

    Pedretti, D.; Sanchez-Vila, X.; Fernandez-Garcia, D.; Bolster, D.; Tartakovsky, D. M.; Barahona-Palomo, M.

    2011-12-01

    Decision makers require effective long-term hydraulic criteria to optimize the design of artificial recharge ponds. However, uncontrolled multiscale pore-clogging effects in heterogeneous soils introduce uncertainties that must be quantified. One of the most notable effects is the reduction of infiltration capacity over time, which affects the quantity and quality of water recharging the aquifer. We developed a probabilistic (engineering) risk analysis in which pore clogging is modeled as an exponential decay with time and in which the clogging mechanisms are differently sensitive to certain soil properties that are heterogeneously organized in space. We studied both a real case and several synthetic infiltration ponds. The risk is defined as the probability that the infiltration capacity drops below a target value at a specified time after the facility begins operating. The analysis can accommodate a variety of maintenance strategies that target different clogging mechanisms. We find that physical clogging mechanisms induce the greatest uncertainty and that maintenance targeted at them can yield optimal results. Given the fundamental role of spatial variability in the initial properties, we conclude that an adequate initial characterization of surface infiltration ponds is strategically critical for determining the degree of uncertainty of different maintenance solutions, and thus for making cost-effective and reliable decisions.
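
    A minimal sketch of the risk calculation, assuming the exponential-decay clogging model described above with hypothetical lognormal parameter distributions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical model: infiltration capacity decays as I(t) = I0 * exp(-k t),
# with heterogeneity in the initial capacity I0 and the clogging rate k.
I0 = rng.lognormal(mean=np.log(1.0), sigma=0.2, size=n)   # m/day
k = rng.lognormal(mean=np.log(0.01), sigma=0.5, size=n)   # 1/day

t = 180.0          # days of operation
I_target = 0.5     # design/regulatory target, m/day

risk = np.mean(I0 * np.exp(-k * t) < I_target)
print(f"P(infiltration capacity < target at t = {t:.0f} d): {risk:.2%}")
```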

  16. Thermodynamic and Probabilistic Metabolic Control Analysis of Riboflavin (Vitamin B₂) Biosynthesis in Bacteria.

    Science.gov (United States)

    Birkenmeier, Markus; Mack, Matthias; Röder, Thorsten

    2015-10-01

    In this study, we applied a coupled in silico thermodynamic and probabilistic metabolic control analysis methodology to investigate the control mechanisms of the commercially relevant riboflavin biosynthetic pathway in bacteria. Under the investigated steady-state conditions, we found that several enzyme reactions of the pathway operate far from thermodynamic equilibrium (transformed Gibbs energies of reaction below about -17 kJ mol⁻¹). Using the obtained thermodynamic information and applying enzyme elasticity sampling, we calculated the distributions of the scaled concentration control coefficients (CCCs) and scaled flux control coefficients (FCCs). From the statistical analysis of the calculated distributions, we inferred that the control over the riboflavin-producing flux is shared among several enzyme activities and mostly resides in the initial reactions of the pathway. More precisely, the guanosine triphosphate (GTP) cyclohydrolase II activity, and therefore the bifunctional RibA protein of Bacillus subtilis, which catalyzes this activity, appears to mainly control the riboflavin-producing flux (mean FCCs = 0.45 and 0.55, respectively). The GTP cyclohydrolase II activity and RibA also exert a high positive control over the riboflavin concentration (mean CCCs = 2.43 and 2.91, respectively). This prediction is consistent with previous findings for microbial riboflavin-overproducing strains.
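
    The elasticity-sampling idea can be illustrated on a generic two-enzyme linear pathway, for which the control coefficients have closed forms (this toy pathway and the sampled elasticity ranges are illustrative, not the riboflavin network of the paper):

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000

# Illustrative linear pathway X0 -v1-> S -v2-> P (not the riboflavin network).
# Sample scaled elasticities of the two rates with respect to the
# intermediate S: eps1 < 0 (product inhibition), eps2 > 0 (substrate drive).
eps1 = rng.uniform(-1.0, 0.0, size=n)
eps2 = rng.uniform(0.0, 1.0, size=n)

# Closed-form control coefficients for this two-step pathway:
fcc1 = eps2 / (eps2 - eps1)   # flux control of enzyme 1
fcc2 = -eps1 / (eps2 - eps1)  # flux control of enzyme 2 (fcc1 + fcc2 = 1)

print(f"FCC1: mean {fcc1.mean():.2f}, 90% interval "
      f"[{np.percentile(fcc1, 5):.2f}, {np.percentile(fcc1, 95):.2f}]")
```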

  17. Analysis of feature selection with Probabilistic Neural Network (PNN) to classify sources influencing indoor air quality

    Science.gov (United States)

    Saad, S. M.; Shakaff, A. Y. M.; Saad, A. R. M.; Yusof, A. M.; Andrew, A. M.; Zakaria, A.; Adom, A. H.

    2017-03-01

    Various sources influence indoor air quality (IAQ) by emitting pollutants such as carbon monoxide (CO), carbon dioxide (CO2), ozone (O3) and particulate matter. These emissions are usually safe to breathe in small quantities, but if their concentrations exceed safe levels they can be hazardous to human beings, especially children and people with asthma. Therefore, a smart indoor air quality monitoring system (IAQMS) is needed that can tell the occupants which sources trigger the indoor air pollution. In this project, an IAQMS able to classify sources influencing IAQ has been developed. The IAQMS applies a classification method based on a Probabilistic Neural Network (PNN), used to classify the sources of indoor air pollution under five conditions: ambient air, human activity, presence of chemical products, presence of food and beverage, and presence of fragrance. To obtain the best classification accuracy, several feature-selection schemes based on data pre-processing were analyzed for their ability to discriminate among the sources. The output of each data pre-processing method was used as the input to the neural network. The results show that PNN analysis with data pre-processing gives a classification accuracy of 99.89% and classifies the sources influencing IAQ at a high rate.
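
    A basic PNN of the kind referenced above reduces to a Parzen-window Gaussian kernel density per class; the sketch below uses synthetic sensor data and an assumed smoothing parameter:

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.5):
    """Basic Probabilistic Neural Network: a Parzen-window Gaussian kernel
    density per class; predict the class with the largest mean response."""
    classes = np.unique(y_train)
    scores = []
    for c in classes:
        Xc = X_train[y_train == c]
        # pairwise squared distances, shape (n_test, n_c)
        d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(axis=2)
        scores.append(np.exp(-d2 / (2 * sigma ** 2)).mean(axis=1))
    return classes[np.argmax(np.stack(scores, axis=1), axis=1)]

# Toy example: two synthetic "pollution source" classes in a 3-sensor space.
rng = np.random.default_rng(3)
X0 = rng.normal([0.2, 0.1, 0.3], 0.05, size=(50, 3))
X1 = rng.normal([0.6, 0.5, 0.2], 0.05, size=(50, 3))
X = np.vstack([X0, X1]); y = np.array([0] * 50 + [1] * 50)
print(pnn_predict(X, y, np.array([[0.21, 0.12, 0.28], [0.58, 0.52, 0.22]])))
```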

  18. ANOVA-based transformed probabilistic collocation method for Bayesian data-worth analysis

    Science.gov (United States)

    Man, Jun; Liao, Qinzhuo; Zeng, Lingzao; Wu, Laosheng

    2017-12-01

    Bayesian theory provides a coherent framework for quantifying the data worth of measurements and estimating unknown parameters. Nevertheless, one common problem in Bayesian methods is the considerable computational cost, since a large number of model evaluations is required for the likelihood evaluation. To address this issue, a new surrogate modeling method, the ANOVA (analysis of variance)-based transformed probabilistic collocation method (ATPCM), is developed in this work. To cope with the strong nonlinearity, the model responses are transformed to arrival times, which are then approximated with a set of low-order ANOVA components. The validity of the proposed method is demonstrated by synthetic numerical cases involving water and heat transport in the vadose zone. It is shown that the ATPCM is more efficient than the existing surrogate modeling methods (e.g., PCM, ANOVA-based PCM and TPCM). At a very low computational cost, the ATPCM-based Bayesian data-worth analysis provides a quantitative metric for comparing different monitoring plans and helps to improve the parameter estimation. Although flow and heat transport in the vadose zone are considered in this work, the proposed method can be equally applied to other hydrologic problems.

  19. Safety analysis SFR 1. Long-term safety

    Energy Technology Data Exchange (ETDEWEB)

    2008-12-15

    An updated assessment of the long-term safety of SKB's final repository for radioactive operational waste, SFR 1, is presented in this report. The report is included in the safety analysis report for SFR 1. The most recent account of long-term safety was submitted to the regulatory authorities in 2001. The present report has been compiled on SKB's initiative to address the regulatory authorities' viewpoints regarding the preceding account of long-term safety. Besides the new mode of working with safety functions there is another important difference between the 2001 safety assessment and the current assessment: the time horizon in the current assessment has been extended to 100,000 years in order to include the effect of future climate changes. The purpose of this renewed assessment of the long-term safety of SFR 1 is to show with improved data that the repository is capable of protecting human health and the environment against ionizing radiation in a long-term perspective. This is done by showing that calculated risks lie below the risk criteria stipulated by the regulatory authorities. SFR 1 is built to receive, and after closure serve as a passive repository for, low- and intermediate-level radioactive waste. The disposal chambers are situated in rock beneath the sea floor, covered by about 60 metres of rock. The underground part of the facility is reached via two tunnels whose entrances are near the harbour. The repository has been designed so that it can be abandoned after closure without further measures needing to be taken to maintain its function. The waste in SFR 1 is short-lived low- and intermediate-level waste. After 100 years the activity is less than half, and after 1,000 years only about 2% of the original activity remains. The report on long-term safety comprises eleven chapters. Chapter 1 Introduction. The chapter describes the purpose, background, format and contents of SAR-08, applicable regulations and injunctions, and the

  20. HANFORD SAFETY ANALYSIS & RISK ASSESSMENT HANDBOOK (SARAH)

    Energy Technology Data Exchange (ETDEWEB)

    EVANS, C B

    2004-12-21

    The purpose of the Hanford Safety Analysis and Risk Assessment Handbook (SARAH) is to support the development of safety basis documentation for Hazard Category 2 and 3 (HC-2 and 3) U.S. Department of Energy (DOE) nuclear facilities to meet the requirements of 10 CFR 830, "Nuclear Safety Management," Subpart B, "Safety Basis Requirements." Consistent with DOE-STD-3009-94, Change Notice 2, "Preparation Guide for U.S. Department of Energy Nonreactor Nuclear Facility Documented Safety Analyses" (STD-3009), and DOE-STD-3011-2002, "Guidance for Preparation of Basis for Interim Operation (BIO) Documents" (STD-3011), the Hanford SARAH describes methodology for performing a safety analysis leading to development of a Documented Safety Analysis (DSA) and derivation of Technical Safety Requirements (TSR), and provides the information necessary to ensure a consistently rigorous approach that meets DOE expectations. The DSA and TSR documents, together with the DOE-issued Safety Evaluation Report (SER), are the basic components of facility safety basis documentation. For HC-2 or 3 nuclear facilities in long-term surveillance and maintenance (S&M), for decommissioning activities where the source term has been eliminated to the point that only low-level, residual fixed contamination is present, or for environmental remediation activities outside of a facility structure, DOE-STD-1120-98, "Integration of Environment, Safety, and Health into Facility Disposition Activities" (STD-1120), may serve as the basis for the DSA. HC-2 and 3 environmental remediation sites also are subject to the hazard analysis methodologies of this standard.

  1. Software safety analysis practice in installation phase

    Energy Technology Data Exchange (ETDEWEB)

    Huang, H. W.; Chen, M. H.; Shyu, S. S., E-mail: hwhwang@iner.gov.t [Institute of Nuclear Energy Research, No. 1000 Wenhua Road, Chiaan Village, Longtan Township, 32546 Taoyuan County, Taiwan (China)

    2010-10-15

    This work performed a software safety analysis in the installation phase of the Lungmen nuclear power plant in Taiwan, under the cooperation of the Institute of Nuclear Energy Research and TPC. The US Nuclear Regulatory Commission requires licensees to perform software safety analysis and software verification and validation in each phase of the software development life cycle, per Branch Technical Position 7-14. In this work, 37 safety-grade digital instrumentation and control systems were analyzed by failure mode and effects analysis, as suggested by IEEE standard 7-4.3.2-2003. During the installation phase, skew tests for the safety-grade network and point-to-point tests were performed. The failure mode and effects analysis showed that all single failure modes can be resolved by redundancy. Most of the common mode failures can be resolved by operator manual actions. (Author)

  2. Probabilistic accident consequence uncertainty analysis -- Late health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Little, M.P.; Muirhead, C.R. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA late health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the expert panel on late health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  3. Simulation-Based Probabilistic Tsunami Hazard Analysis: Empirical and Robust Hazard Predictions

    Science.gov (United States)

    De Risi, Raffaele; Goda, Katsuichiro

    2017-08-01

    Probabilistic tsunami hazard analysis (PTHA) is the prerequisite for rigorous risk assessment and thus for decision-making regarding risk mitigation strategies. This paper proposes a new simulation-based methodology for tsunami hazard assessment for a specific site of an engineering project along the coast or, more broadly, for a wider tsunami-prone region. The methodology incorporates numerous uncertain parameters that are related to geophysical processes by adopting new scaling relationships for tsunamigenic seismic regions. Through the proposed methodology it is possible to obtain either a tsunami hazard curve for a single location, that is, the representation of a tsunami intensity measure (such as inundation depth) versus its mean annual rate of occurrence, or tsunami hazard maps, representing the expected tsunami intensity measures within a geographical area for a specific probability of occurrence in a given time window. In addition to the conventional tsunami hazard curve that is based on an empirical statistical representation of the simulation-based PTHA results, this study presents a robust tsunami hazard curve, which is based on a Bayesian fitting methodology. The robust approach allows a significant reduction of the number of simulations and, therefore, a reduction of the computational effort. Both methods produce a central estimate of the hazard as well as a confidence interval, facilitating the rigorous quantification of the hazard uncertainties.
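
    The empirical hazard-curve construction can be sketched as follows, using an invented stochastic event set (the scenario rates and simulated depths are placeholders) and a Poisson occurrence model:

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical stochastic event set: each simulated tsunami source has an
# annual occurrence rate and a simulated inundation depth at the site.
n_events = 2_000
rates = np.full(n_events, 1.0e-4)                           # 1/yr per scenario
depths = rng.lognormal(mean=0.0, sigma=0.8, size=n_events)  # metres

# Empirical hazard curve: mean annual rate of exceeding each depth level.
levels = np.linspace(0.1, 8.0, 80)
annual_rate = np.array([rates[depths > h].sum() for h in levels])

# Convert to probability of exceedance in a 50-year window (Poisson model).
p50 = 1.0 - np.exp(-annual_rate * 50.0)
for h, p in zip(levels[::20], p50[::20]):
    print(f"depth > {h:4.1f} m: 50-yr exceedance probability {p:.3f}")
```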

  4. Probabilistic accident consequence uncertainty analysis -- Early health effects uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Haskin, F.E. [Univ. of New Mexico, Albuquerque, NM (United States); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA early health effects models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on early health effects, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  5. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for internal dosimetry. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Harrison, J.D. [National Radiological Protection Board (United Kingdom); Harper, F.T. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1998-04-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA internal dosimetry models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on internal dosimetry, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  6. Wetland Detection from Multi-sources Remote Sensing Images Based on Probabilistic Latent Semantic Analysis

    Directory of Open Access Journals (Sweden)

    XU Kai

    2017-08-01

    A novel wetland detection approach for multi-source remote sensing images is proposed, based on probabilistic latent semantic analysis (pLSA). First, spectral, texture, and subclass features of wetland were extracted from a high-resolution remote sensing image, and land surface temperature and soil moisture of wetland were derived from the corresponding multispectral remote sensing image; together these form the feature space of the wetland scene. The wetland scene is then represented as a combination of several latent semantics using pLSA, and its feature space is further described by the weight vector of the latent semantics. Finally, a support vector machine (SVM) classifier is applied to detect the wetland scene. Experiments indicate that pLSA is able to map the high-dimensional feature space of wetland scenes to a low-dimensional latent semantic space, and that the addition of subclass and quantitative environmental features characterizes the wetland feature space more effectively and improves the detection accuracy significantly.
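
    For reference, the pLSA decomposition itself is a small EM algorithm; the sketch below runs it on a toy scene-by-feature count matrix (a generic implementation, not the paper's remote sensing pipeline):

```python
import numpy as np

def plsa(N, k=2, iters=100, seed=0):
    """Basic pLSA via EM on a scene-by-feature count matrix N (docs x words).
    Returns P(w|z) and P(z|d); a generic sketch of the decomposition."""
    rng = np.random.default_rng(seed)
    D, W = N.shape
    p_w_z = rng.random((k, W)); p_w_z /= p_w_z.sum(axis=1, keepdims=True)
    p_z_d = rng.random((D, k)); p_z_d /= p_z_d.sum(axis=1, keepdims=True)
    for _ in range(iters):
        # E-step: responsibility of each latent semantic for each (d, w).
        joint = p_z_d[:, :, None] * p_w_z[None, :, :]        # (D, k, W)
        post = joint / (joint.sum(axis=1, keepdims=True) + 1e-12)
        # M-step: re-estimate the multinomials from expected counts.
        nk = N[:, None, :] * post                            # (D, k, W)
        p_w_z = nk.sum(axis=0); p_w_z /= p_w_z.sum(axis=1, keepdims=True)
        p_z_d = nk.sum(axis=2); p_z_d /= p_z_d.sum(axis=1, keepdims=True)
    return p_w_z, p_z_d

# Toy scene-feature counts: 4 scenes x 5 quantized features.
N = np.array([[8, 5, 1, 0, 0], [7, 6, 0, 1, 0],
              [0, 1, 6, 7, 3], [1, 0, 5, 8, 4]], float)
p_w_z, p_z_d = plsa(N)
print(np.round(p_z_d, 2))   # low-dimensional latent-semantic description
```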

  7. Weighing Clinical Evidence Using Patient Preferences: An Application of Probabilistic Multi-Criteria Decision Analysis.

    Science.gov (United States)

    Broekhuizen, Henk; IJzerman, Maarten J; Hauber, A Brett; Groothuis-Oudshoorn, Catharina G M

    2017-03-01

    The need for patient engagement has been recognized by regulatory agencies, but there is no consensus about how to operationalize this. One approach is the formal elicitation and use of patient preferences for weighing clinical outcomes. The aim of this study was to demonstrate how patient preferences can be used to weigh clinical outcomes when both preferences and clinical outcomes are uncertain by applying a probabilistic value-based multi-criteria decision analysis (MCDA) method. Probability distributions were used to model random variation and parameter uncertainty in preferences, and parameter uncertainty in clinical outcomes. The posterior value distributions and rank probabilities for each treatment were obtained using Monte-Carlo simulations. The probability of achieving the first rank is the probability that a treatment represents the highest value to patients. We illustrated our methodology for a simplified case on six HIV treatments. Preferences were modeled with normal distributions and clinical outcomes were modeled with beta distributions. The treatment value distributions showed the rank order of treatments according to patients and illustrate the remaining decision uncertainty. This study demonstrated how patient preference data can be used to weigh clinical evidence using MCDA. The model takes into account uncertainty in preferences and clinical outcomes. The model can support decision makers during the aggregation step of the MCDA process and provides a first step toward preference-based personalized medicine, yet requires further testing regarding its appropriate use in real-world settings.
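
    The Monte Carlo aggregation step can be sketched as follows, with invented criteria, preference distributions, and outcome distributions standing in for the paper's HIV case:

```python
import numpy as np

rng = np.random.default_rng(11)
n = 20_000
treatments = ["A", "B", "C"]

# Hypothetical 2-criterion example (efficacy, side-effect burden avoided),
# values in [0, 1]. The preference weight for efficacy is Normal (clipped
# to [0, 1]); clinical outcomes per treatment are Beta distributed.
w = np.clip(rng.normal(0.7, 0.1, size=n), 0, 1)
efficacy = np.stack([rng.beta(8, 2, n), rng.beta(7, 3, n), rng.beta(9, 4, n)], 1)
safety = np.stack([rng.beta(5, 5, n), rng.beta(7, 3, n), rng.beta(6, 4, n)], 1)

value = w[:, None] * efficacy + (1 - w)[:, None] * safety
first_rank = (value.argmax(axis=1)[:, None] == np.arange(3)).mean(axis=0)
for t, p in zip(treatments, first_rank):
    print(f"P({t} has highest value to patients) = {p:.2f}")
```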

  8. Cost-effectiveness analysis of axitinib through a probabilistic decision model.

    Science.gov (United States)

    Petrou, Panagiotis

    2015-06-01

    The oncology field is characterised by a steady increase in demand and a constant launch of innovative and expensive products. Cost-effectiveness analysis can therefore contribute as a significant decision-making tool by elucidating the most economically efficient ways to satisfy compelling health needs. The scope of this study is to estimate the cost-effectiveness of axitinib versus sorafenib for the second-line treatment of renal cell carcinoma. A literature review for evidence synthesis was performed and a probabilistic Markov model was employed to simulate disease progression. The study also assesses the Value of Information. Compared to sorafenib, axitinib resulted in an incremental cost of 87,936 euro per quality-adjusted life year. The probability of axitinib being cost-effective at the willingness-to-pay threshold of 60,000 euro was 13%, while the corresponding probability at the highest recommended willingness-to-pay threshold of 100,000 euro was 69.9%. Uncertainty was primarily attributed to the price of the product, the utility values, the progression-free survival and, to a lesser degree, the overall survival. Axitinib can be considered a cost-effective therapeutic option for second-line treatment of renal cell carcinoma.
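
    A minimal probabilistic sensitivity analysis sketch of the quantities reported above (the incremental cost and QALY distributions are invented placeholders, not the study's model outputs):

```python
import numpy as np

rng = np.random.default_rng(13)
n = 10_000

# Hypothetical probabilistic sensitivity analysis draws:
# incremental cost (euro) and incremental effect (QALYs) of drug A vs B.
d_cost = rng.normal(30_000, 6_000, size=n)   # incremental cost
d_qaly = rng.normal(0.34, 0.12, size=n)      # incremental QALYs

icer = d_cost.mean() / d_qaly.mean()
print(f"ICER ~ {icer:,.0f} euro/QALY")

# Cost-effectiveness acceptability: P(net monetary benefit > 0) at each
# willingness-to-pay (WTP) threshold.
for wtp in (60_000, 100_000):
    p_ce = np.mean(wtp * d_qaly - d_cost > 0)
    print(f"P(cost-effective at WTP {wtp:,}): {p_ce:.1%}")
```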

  9. Probabilistic accident consequence uncertainty analysis -- Uncertainty assessment for deposited material and external doses. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Goossens, L.H.J.; Kraan, B.C.P.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Boardman, J. [AEA Technology (United Kingdom); Jones, J.A. [National Radiological Protection Board (United Kingdom); Harper, F.T.; Young, M.L. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States)

    1997-12-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the consequences of accidental releases of radiological material from hypothesized accidents at nuclear installations. In 1991, the US Nuclear Regulatory Commission and the Commission of the European Communities began cosponsoring a joint uncertainty analysis of the two codes. The ultimate objective of this joint effort was to systematically develop credible and traceable uncertainty distributions for the respective code input variables. A formal expert judgment elicitation and evaluation process was identified as the best technology available for developing a library of uncertainty distributions for these consequence parameters. This report focuses on the results of the study to develop distributions for variables related to the MACCS and COSYMA deposited material and external dose models. This volume contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures, (3) the rationales and results for the panel on deposited material and external doses, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  10. Analysis of time-correlated single photon counting data: a comparative evaluation of deterministic and probabilistic approaches

    Science.gov (United States)

    Smith, Darren A.; McKenzie, Grant; Jones, Anita C.; Smith, Trevor A.

    2017-12-01

    We review various methods for analysing time-resolved fluorescence data acquired using the time-correlated single photon counting method in an attempt to evaluate their benefits and limitations. We have applied these methods to both experimental and simulated data. The relative merits of using deterministic approaches, such as the commonly used iterative reconvolution method, and probabilistic approaches, such as the smoothed exponential series method, the maximum entropy method and recently proposed basis pursuit denoising (compressed sensing) method, are outlined. In particular, we show the value of using multiple methods to arrive at the most appropriate choice of model. We show that the use of probabilistic analysis methods can indicate whether a discrete component or distribution analysis provides the better representation of the data.
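
    The iterative reconvolution approach mentioned above can be sketched in a few lines: the fit model is the instrument response convolved with an exponential decay (synthetic IRF and data; a real analysis would use residual weighting appropriate to Poisson counting statistics):

```python
import numpy as np
from scipy.optimize import curve_fit

# Minimal iterative-reconvolution sketch: the measured decay is modeled as
# the instrument response function (IRF) convolved with an exponential.
t = np.arange(0, 50.0, 0.1)                       # ns time axis
irf = np.exp(-0.5 * ((t - 5.0) / 0.3) ** 2)       # synthetic Gaussian IRF
irf /= irf.sum()

def model(t, a, tau):
    decay = a * np.exp(-t / tau)
    return np.convolve(irf, decay)[: t.size]      # reconvolution

true = model(t, 1000.0, 4.0)
data = np.random.default_rng(2).poisson(true)     # photon-counting noise

popt, _ = curve_fit(model, t, data, p0=[800.0, 2.0])
print(f"fitted amplitude {popt[0]:.0f}, lifetime {popt[1]:.2f} ns (true 4.00)")
```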

  11. 10 CFR 70.62 - Safety program and integrated safety analysis.

    Science.gov (United States)

    2010-01-01

    ... to the technology of the process, and information pertaining to the equipment in the process. (c... have experience in nuclear criticality safety, radiation safety, fire safety, and chemical process... this safety program; namely, process safety information, integrated safety analysis, and management...

  12. MSSV Modeling for Wolsong-1 Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Moon, Bok Ja; Choi, Chul Jin; Kim, Seoung Rae [KEPCO E&C, Daejeon (Korea, Republic of)

    2010-10-15

    The main steam safety valves (MSSVs) are installed on the main steam line to prevent overpressurization of the system. MSSVs are held in the closed position by spring force, and the valves pop open under internal pressure when the main steam pressure rises to the opening set pressure. If the overpressure condition is relieved, the valves begin to close. For the safety analysis of anticipated accident conditions, the safety systems are modeled conservatively so that the simulated accident conditions are more severe. MSSVs are likewise modeled conservatively for the analysis of overpressurization accidents. In this paper, the pressure transient under overpressurization conditions is analyzed to evaluate the conservatism of the MSSV models.
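
    The pop-open/reseat behavior described above is essentially a hysteresis rule; a toy sketch follows (the set pressure, accumulation, and blowdown values are invented, not Wolsong-1 data):

```python
def mssv_fraction_open(pressure, state, p_set=7.0, accumulation=0.03, blowdown=0.06):
    """Illustrative pop-type safety valve logic (placeholder values): the
    valve pops fully open above the set pressure plus accumulation and
    reseats only after pressure falls below the blowdown limit."""
    if state == "closed" and pressure >= p_set * (1 + accumulation):
        return 1.0, "open"
    if state == "open" and pressure <= p_set * (1 - blowdown):
        return 0.0, "closed"
    return (1.0 if state == "open" else 0.0), state

# Sweep a pressure transient (MPa) through the valve model.
state, trace = "closed", []
for p in [6.8, 7.1, 7.25, 7.3, 7.0, 6.7, 6.5]:
    frac, state = mssv_fraction_open(p, state)
    trace.append((p, frac))
print(trace)  # the valve opens at 7.25 and stays open until pressure drops to 6.5
```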

  13. SSHAC Level 1 Probabilistic Seismic Hazard Analysis for the Idaho National Laboratory

    Energy Technology Data Exchange (ETDEWEB)

    Payne, Suzette [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Ryan [Idaho National Lab. (INL), Idaho Falls, ID (United States); Coppersmith, Kevin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rodriguez-Marek, Adrian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Falero, Valentina Montaldo [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngs, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2016-09-01

    A Probabilistic Seismic Hazard Analysis (PSHA) was completed for the Materials and Fuels Complex (MFC), Naval Reactors Facility (NRF), and the Advanced Test Reactor (ATR) at Idaho National Laboratory (INL) (Figure 1-1). The PSHA followed the approaches and procedures appropriate for a Study Level 1 provided in the guidance advanced by the Senior Seismic Hazard Analysis Committee (SSHAC) in U.S. Nuclear Regulatory Commission (NRC) NUREG/CR-6372 and NUREG-2117 (NRC, 1997; 2012a). The SSHAC Level 1 PSHAs for MFC and ATR were conducted as part of the Seismic Risk Assessment (SRA) project (INL Project number 31287) to develop and apply a new risk-informed methodology. The SSHAC Level 1 PSHA was conducted for NRF to provide guidance on the potential use of a design margin above rock hazard levels. The SRA project is developing a new risk-informed methodology that will provide a systematic approach for evaluating the need for an update of an existing PSHA. The new methodology proposes criteria to be employed at specific analysis, decision, or comparison points in its evaluation process. The first four of seven criteria address changes in inputs and results of the PSHA and are given in U.S. Department of Energy (DOE) Standard DOE-STD-1020-2012 (DOE, 2012a) and American National Standards Institute/American Nuclear Society (ANSI/ANS) 2.29 (ANS, 2008a). The last three criteria address evaluation of quantitative hazard and risk-focused information of an existing nuclear facility. The seven criteria and decision points are applied to Seismic Design Categories (SDC) 3, 4, and 5, which are defined in American Society of Civil Engineers/Structural Engineers Institute (ASCE/SEI) 43-05 (ASCE, 2005). The application of the criteria and decision points could lead to an update or could determine that such an update is not necessary.

  14. Space Shuttle Rudder Speed Brake Actuator-A Case Study Probabilistic Fatigue Life and Reliability Analysis

    Science.gov (United States)

    Oswald, Fred B.; Savage, Michael; Zaretsky, Erwin V.

    2015-01-01

    The U.S. Space Shuttle fleet was originally intended to have a life of 100 flights for each vehicle, lasting over a 10-year period, with minimal scheduled maintenance or inspection. The first space shuttle flight was that of the Space Shuttle Columbia (OV-102), launched April 12, 1981. The disaster that destroyed Columbia occurred on its 28th flight, February 1, 2003, nearly 22 years after its first launch. In order to minimize risk of losing another Space Shuttle, a probabilistic life and reliability analysis was conducted for the Space Shuttle rudder/speed brake actuators to determine the number of flights the actuators could sustain. A life and reliability assessment of the actuator gears was performed in two stages: a contact stress fatigue model and a gear tooth bending fatigue model. For the contact stress analysis, the Lundberg-Palmgren bearing life theory was expanded to include gear-surface pitting for the actuator as a system. The mission spectrum of the Space Shuttle rudder/speed brake actuator was combined into equivalent effective hinge moment loads including an actuator input preload for the contact stress fatigue and tooth bending fatigue models. Gear system reliabilities are reported for both models and their combination. Reliability of the actuator bearings was analyzed separately, based on data provided by the actuator manufacturer. As a result of the analysis, the reliability of one half of a single actuator was calculated to be 98.6 percent for 12 flights. Accordingly, each actuator was subsequently limited to 12 flights before removal from service in the Space Shuttle.
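
    The way component fatigue lives combine into a system reliability can be sketched with two-parameter Weibull life models in a strict series system (the L10 lives and slopes below are illustrative placeholders, not the actuator's values):

```python
import numpy as np

# Lundberg-Palmgren-style sketch: component lives follow a two-parameter
# Weibull in accumulated flights; the actuator is treated as a series
# system, so component reliabilities multiply. All numbers are invented.
def weibull_rel(n_flights, l10, slope):
    """Reliability after n_flights for a component with 10% failure
    probability at l10 flights and Weibull slope `slope`."""
    eta = l10 / (-np.log(0.9)) ** (1.0 / slope)   # characteristic life
    return np.exp(-(n_flights / eta) ** slope)

components = [  # (L10 life in flights, Weibull slope)
    (400.0, 1.5),   # gear contact (pitting)
    (900.0, 2.0),   # gear tooth bending
    (600.0, 1.1),   # bearings
]

for n in (12, 28, 100):
    r_sys = np.prod([weibull_rel(n, l10, m) for l10, m in components])
    print(f"{n:3d} flights: system reliability {r_sys:.3f}")
```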

  15. Trade Studies of Space Launch Architectures using Modular Probabilistic Risk Analysis

    Science.gov (United States)

    Mathias, Donovan L.; Go, Susie

    2006-01-01

    A top-down risk assessment in the early phases of space exploration architecture development can provide understanding and intuition of the potential risks associated with new designs and technologies. In this approach, risk analysts draw from their past experience and the heritage of similar existing systems as a source for reliability data. This top-down approach captures the complex interactions of the risk-driving parts of the integrated system without requiring detailed knowledge of the parts themselves, which is often unavailable in the early design stages. Traditional probabilistic risk analysis (PRA) technologies, however, suffer several drawbacks that limit their timely application to complex technology development programs. The most restrictive of these is a dependence on static planning scenarios, expressed through fault and event trees. Fault trees incorporating comprehensive mission scenarios are routinely constructed for complex space systems, and several commercial software products are available for evaluating fault statistics. These static representations cannot capture the dynamic behavior of system failures without substantial modification of the initial tree. Consequently, the development of dynamic models using fault tree analysis has been an active area of research in recent years. This paper discusses the implementation and demonstration of dynamic, modular scenario modeling for integration of subsystem fault evaluation modules using the Space Architecture Failure Evaluation (SAFE) tool. SAFE is a C++ code that was originally developed to support NASA's Space Launch Initiative. It provides a flexible framework for system architecture definition and trade studies. SAFE supports extensible modeling of dynamic, time-dependent risk drivers of the system and functions at the level of fidelity for which design and failure data exists. The approach is scalable, allowing inclusion of additional information as detailed data becomes available. The tool

  16. Probabilistic Harmonic Analysis on Distributed Photovoltaic Integration Considering Typical Weather Scenarios

    Science.gov (United States)

    Bin, Che; Ruoying, Yu; Dongsheng, Dang; Xiangyan, Wang

    2017-05-01

    Distributed generation (DG) integrated into the network can cause harmonic pollution, which damages electrical devices and affects the normal operation of the power system. Moreover, because of the randomness of wind and solar irradiation, the output of DG is itself random, which makes the harmonics generated by the DG uncertain. Thus, probabilistic methods are needed to analyse the impacts of DG integration. In this work we studied the probabilistic distribution of harmonic voltage and the harmonic distortion in a distribution network after distributed photovoltaic (DPV) integration under different weather conditions, namely sunny, cloudy, rainy and snowy days. The probability distribution function of DPV output power in each typical weather condition was obtained via maximum likelihood parameter estimation. The Monte Carlo simulation method was adopted to calculate the probabilistic distribution of harmonic voltage content at different harmonic orders, as well as the total harmonic distortion (THD), in typical weather conditions. The case study was based on the IEEE 33-bus system, and the resulting probabilistic distributions of harmonic voltage content and THD in the typical weather conditions were compared.
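
    The Monte Carlo step can be illustrated with a deliberately simplified single-bus sketch (the per-weather Beta output models, harmonic current fractions, and impedances are all invented placeholders; the paper works on the IEEE 33-bus network):

```python
import numpy as np

rng = np.random.default_rng(10)
n = 20_000

# Hypothetical per-weather PV output models (per-unit of rated power).
weather_models = {
    "sunny":  lambda: rng.beta(5, 2, n),
    "cloudy": lambda: rng.beta(2, 3, n),
    "rainy":  lambda: rng.beta(1, 6, n),
}

# Simplified single-bus model: the h-th harmonic current is a fixed fraction
# of the fundamental; harmonic voltage = current x harmonic impedance.
h_orders = np.array([5, 7, 11, 13])
i_frac = np.array([0.04, 0.03, 0.015, 0.01])     # fraction of fundamental current
z_h = 0.05 * h_orders                            # per-unit impedance ~ h * X

for name, draw in weather_models.items():
    p = draw()                                   # fundamental current ~ output
    v_h = np.outer(p, i_frac * z_h)              # per-unit harmonic voltages
    thd = np.sqrt((v_h ** 2).sum(axis=1)) / 1.0  # vs 1.0 pu fundamental
    print(f"{name:6s}: THD 95th percentile = {100 * np.percentile(thd, 95):.2f}%")
```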

  17. Probabilistic analysis of free ways for maintenance; Analisis probabilista de vias libres para mantenimiento

    Energy Technology Data Exchange (ETDEWEB)

    Torres V, A.; Rivero O, J.J. [Dpto. Ingenieria Nuclear, Instituto Superior de Tecnologias y Ciencias Aplicadas, Ave. Salvador Allende y Luaces, Quinta de los Molinos, Plaza, Ciudad Habana (Cuba)]. e-mail: atorres@fctn.isctn.edu.cu

    2004-07-01

    Safety during maintenance interventions is treated in a limited manner and, in general, independently of maintenance management systems. It is affected by multiple technical and human factors that are often subjective and difficult to quantify, which limits the design of preventive plans. Some factors, however, constitute common points: the isolation configurations established during free ways (known as bank drafts in the oil industry) and the human errors associated with their violation. This characteristic allowed such situations to be analyzed with the fault tree methodology, which cohesively links equipment failures and human errors. The methodology has been automated in the MOSEG Win Ver 1.0 code, which can cover anything from the analysis of a particular free-way situation to that of a complete maintenance strategy from the standpoint of the safety of maintenance personnel. (Author)

  18. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, main report

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others]

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The ultimate objective of the joint effort was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. Experts developed their distributions independently. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. To validate the distributions generated for the dispersion code input variables, samples were taken from the distributions and propagated through the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the first of a three-volume document describing the project.

  19. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment, appendices A and B

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States); Hora, S.C. [Univ. of Hawaii, Hilo, HI (United States); Lui, C.H. [Nuclear Regulatory Commission, Washington, DC (United States); Goossens, L.H.J.; Cooke, R.M. [Delft Univ. of Technology (Netherlands); Paesler-Sauer, J. [Research Center, Karlsruhe (Germany); Helton, J.C. [and others]

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990. These codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters; the experts developed their distributions independently. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model, along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. The resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the second of a three-volume document describing the project and contains two appendices describing the rationales for the dispersion and deposition data, along with short biographies of the 16 experts who participated in the project.

  20. PCAN: Probabilistic correlation analysis of two non-normal data sets.

    Science.gov (United States)

    Zoh, Roger S; Mallick, Bani; Ivanov, Ivan; Baladandayuthapani, Veera; Manyam, Ganiraju; Chapkin, Robert S; Lampe, Johanna W; Carroll, Raymond J

    2016-12-01

    Most cancer research now involves one or more assays profiling various biological molecules, e.g., messenger RNA and micro RNA, in samples collected on the same individuals. The main interest with these genomic data sets lies in the identification of a subset of features that are active in explaining the dependence between platforms. To quantify the strength of the dependency between two variables, correlation is often preferred. However, expression data obtained from next-generation sequencing platforms are integer with very low counts for some important features. In this case, the sample Pearson correlation is not a valid estimate of the true correlation matrix, because the sample correlation estimate between two features/variables with low counts will often be close to zero, even when the natural parameters of the Poisson distribution are, in actuality, highly correlated. We propose a model-based approach to correlation estimation between two non-normal data sets, via a method we call Probabilistic Correlations ANalysis, or PCAN. PCAN takes into consideration the distributional assumption about both data sets and suggests that correlations estimated at the model natural parameter level are more appropriate than correlations estimated directly on the observed data. We demonstrate through a simulation study that PCAN outperforms other standard approaches in estimating the true correlation between the natural parameters. We then apply PCAN to the joint analysis of a microRNA (miRNA) and a messenger RNA (mRNA) expression data set from a squamous cell lung cancer study, finding a large number of negative correlation pairs when compared to the standard approaches. © 2016, The International Biometric Society.
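
    The attenuation problem PCAN addresses is easy to demonstrate: simulate counts whose Poisson natural parameters are strongly correlated and compare the latent and count-level Pearson correlations (a motivating demo, not the PCAN estimator itself):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 500

# Correlated natural parameters (log-means) for two sequencing features,
# placed in a deliberately low-count regime (mean counts well below 1).
rho = 0.8
z = rng.multivariate_normal([0.0, 0.0], [[1, rho], [rho, 1]], size=n)
lam = np.exp(z - 2.0)
counts = rng.poisson(lam)

latent_corr = np.corrcoef(z.T)[0, 1]
count_corr = np.corrcoef(counts.T)[0, 1]
print(f"correlation of natural parameters : {latent_corr:.2f}")
print(f"sample Pearson correlation of counts: {count_corr:.2f}")
# The count-level estimate is strongly attenuated toward zero, which is the
# motivation for model-based estimation at the natural-parameter level.
```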

  1. Have recent earthquakes exposed flaws in or misunderstandings of probabilistic seismic hazard analysis?

    Science.gov (United States)

    Hanks, Thomas C.; Beroza, Gregory C.; Toda, Shinji

    2012-01-01

    In a recent Opinion piece in these pages, Stein et al. (2011) offer a remarkable indictment of the methods, models, and results of probabilistic seismic hazard analysis (PSHA). The principal object of their concern is the PSHA map for Japan released by the Japan Headquarters for Earthquake Research Promotion (HERP), which is reproduced by Stein et al. (2011) as their Figure 1 and also here as our Figure 1. It shows the probability of exceedance (also referred to as the “hazard”) of the Japan Meteorological Agency (JMA) intensity 6–lower (JMA 6–) in Japan for the 30-year period beginning in January 2010. JMA 6– is an earthquake-damage intensity measure that is associated with fairly strong ground motion that can be damaging to well-built structures and is potentially destructive to poor construction (HERP, 2005, appendix 5). Reiterating Geller (2011, p. 408), Stein et al. (2011, p. 623) have this to say about Figure 1: The regions assessed as most dangerous are the zones of three hypothetical “scenario earthquakes” (Tokai, Tonankai, and Nankai; see map). However, since 1979, earthquakes that caused 10 or more fatalities in Japan actually occurred in places assigned a relatively low probability. This discrepancy—the latest in a string of negative results for the characteristic model and its cousin the seismic-gap model—strongly suggest that the hazard map and the methods used to produce it are flawed and should be discarded. Given the central role that PSHA now plays in seismic risk analysis, performance-based engineering, and design-basis ground motions, discarding PSHA would have important consequences. We are not persuaded by the arguments of Geller (2011) and Stein et al. (2011) for doing so because important misunderstandings about PSHA seem to have conditioned them. In the quotation above, for example, they have confused important differences between earthquake-occurrence observations and ground-motion hazard calculations.

  2. Analysis of molecular expression patterns and integration with other knowledge bases using probabilistic Bayesian network models

    Energy Technology Data Exchange (ETDEWEB)

    Moler, Edward J.; Mian, I.S.

    2000-03-01

    How can molecular expression experiments be interpreted with more than 10{sup 4} measurements per chip? How can one get the most quantitative information possible from the experimental data with good confidence? These are important questions whose solutions require an interdisciplinary combination of molecular and cellular biology, computer science, statistics, and complex systems analysis. The explosion of data from microarray techniques presents the problem of interpreting the experiments. The availability of large-scale knowledge bases provides the opportunity to maximize the information extracted from these experiments. We have developed new methods of discovering biological function, metabolic pathways, and regulatory networks from these data and knowledge bases. These techniques are applicable to analyses for biomedical engineering, clinical, and fundamental cell and molecular biology studies. Our approach uses probabilistic, computational methods that give quantitative interpretations of data in a biological context. We have selected Bayesian statistical models with graphical network representations as a framework for our methods. As a first step, we use a naïve Bayesian classifier to identify statistically significant patterns in gene expression data. We have developed methods which allow us to (a) characterize which genes or experiments distinguish each class from the others, (b) cross-index the resulting classes with other databases to assess the biological meaning of the classes, and (c) display a gross overview of cellular dynamics. We have developed a number of visualization tools to convey the results. We report here our methods of classification and our first attempts at integrating the data and other knowledge bases together with new visualization tools. We demonstrate the utility of these methods and tools by analysis of a series of yeast cDNA microarray data and a set of cancerous/normal sample data from colon cancer patients. We discuss

  3. Recommendations for probabilistic seismic hazard analysis: Guidance on uncertainty and use of experts

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-04-01

    Probabilistic Seismic Hazard Analysis (PSHA) is a methodology that estimates the likelihood that various levels of earthquake-caused ground motion will be exceeded at a given location in a given future time period. Due to large uncertainties in all the geosciences data and in their modeling, multiple model interpretations are often possible. This leads to disagreement among experts, which in the past has led to disagreement on the selection of ground motion for design at a given site. In order to review the present state of the art and improve the overall stability of the PSHA process, the U.S. Nuclear Regulatory Commission (NRC), the U.S. Department of Energy (DOE), and the Electric Power Research Institute (EPRI) co-sponsored a project to provide methodological guidance on how to perform a PSHA. The project has been carried out by a seven-member Senior Seismic Hazard Analysis Committee (SSHAC) supported by a large number of other experts. The SSHAC reviewed past studies, including the Lawrence Livermore National Laboratory and the EPRI landmark PSHA studies of the 1980s, and examined ways to improve on the present state of the art. The Committee's most important conclusion is that differences in PSHA results are due to procedural rather than technical differences. Thus, in addition to providing detailed documentation on state-of-the-art elements of a PSHA, this report provides a series of procedural recommendations. The role of experts is analyzed in detail. Two entities are formally defined, the Technical Integrator (TI) and the Technical Facilitator Integrator (TFI), to account for the various levels of complexity in the technical issues and the different levels of effort needed in a given study.

  4. A Probabilistic Physics of Failure Approach for Structure Corrosion Reliability Analysis

    OpenAIRE

    Chaoyang Xie; Hong-Zhong Huang

    2016-01-01

    Corrosion is recognized as one of the most important degradation mechanisms that affect the long-term reliability and integrity of metallic structures. Studying the structural reliability with pitting corrosion damage is useful for risk control and safe operation of the corroded structure. This paper proposed a structure corrosion reliability analysis approach based on a physics-based failure model of pitting corrosion, where the states of pitting growth, pit-to-crack transition, and crack propagation ...

  5. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To more accurately quantify the predictability of water availability, a bivariate probabilistic Budyko approach was developed on the basis of the univariate probabilistic Budyko framework, using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on the climatic setting. In comparison with the simple univariate distribution, the bivariate one produces a narrower interquartile range for the same global dataset, especially in regions with higher NDVI values, highlighting the importance of building the joint distribution so as to take into account the dependence structure of the parameter ω and NDVI, which provides a more accurate probabilistic evaluation of water availability.
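
    The copula construction can be sketched generically: a Gaussian copula ties together assumed marginals for the Budyko parameter and NDVI, after which the conditional uncertainty in the parameter can be examined (the marginals and correlation below are invented, and the Wang-Tang curve itself is omitted):

```python
import numpy as np
from scipy.stats import norm, lognorm, beta

rng = np.random.default_rng(9)
n = 50_000
rho = 0.6   # hypothetical dependence between omega and NDVI

# Gaussian copula: correlated standard normals -> uniforms -> marginals.
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
u = norm.cdf(z)
omega = lognorm.ppf(u[:, 0], s=0.3, scale=2.0)   # Budyko-parameter marginal
ndvi = beta.ppf(u[:, 1], 4, 2)                   # NDVI marginal on (0, 1)

# Conditional uncertainty of omega for grid cells with low vs high NDVI:
for lo, hi in ((0.2, 0.4), (0.7, 0.9)):
    sel = (ndvi > lo) & (ndvi < hi)
    q = np.percentile(omega[sel], [25, 75])
    print(f"NDVI in ({lo}, {hi}): omega IQR = [{q[0]:.2f}, {q[1]:.2f}]")
```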

  6. Probabilistic risk analysis and fault trees: Initial discussion of application to identification of risk at a wellhead

    Science.gov (United States)

    Rodak, C.; Silliman, S.

    2012-02-01

    Wellhead protection is of critical importance for managing groundwater resources. While a number of previous authors have addressed questions related to uncertainties in advective capture zones, methods for addressing wellhead protection in the presence of uncertainty in the chemistry of groundwater contaminants, the relationship between land-use and contaminant sources, and the impact on health of the receiving population are limited. It is herein suggested that probabilistic risk analysis (PRA) combined with fault trees (FT) provides a structure whereby chemical transport can be combined with uncertainties in source, chemistry, and health impact to assess the probability of negative health outcomes in the population. As such, PRA-FT provides a new strategy for the identification of areas of probabilistically high human health risk. Application of this approach is demonstrated through a simplified case study involving flow to a well in an unconfined aquifer with heterogeneity in aquifer properties and contaminant sources.
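
    A minimal fault-tree quantification sketch for a wellhead scenario of the kind discussed (all basic-event probabilities below are invented placeholders):

```python
# Minimal fault-tree sketch for a wellhead contamination scenario.
# Basic-event probabilities are illustrative, not site data.
p_source_present = 0.30   # contaminant source exists in the capture zone
p_release = 0.10          # source actually releases to groundwater
p_transport = 0.50        # plume reaches the well above a health threshold
p_treatment_fails = 0.05  # wellhead treatment barrier fails

# AND gate: all conditions must hold for an adverse health outcome,
# assuming independence of the basic events.
p_top = p_source_present * p_release * p_transport * p_treatment_fails
print(f"P(adverse outcome at the wellhead) = {p_top:.2e}")

# OR gate example: either of two independent sources can start the sequence.
p_src_a, p_src_b = 0.30, 0.15
p_any_source = 1 - (1 - p_src_a) * (1 - p_src_b)
print(f"P(at least one source present) = {p_any_source:.3f}")
```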

  7. Trade-Off Analysis to Solve a Probabilistic Multi-Objective Problem for Passive Filtering System Planning

    Science.gov (United States)

    Carpinelli, Guido; Ferruzzi, Gabriella; Russo, Angela

    2013-06-01

    In recent years, modern distribution networks have rapidly evolved toward complex systems due to the increasing level of penetration of distributed generation units, storage systems, and information and communication technologies. In this framework, power quality disturbances such as waveform distortions should be minimized to guarantee optimal system behavior. This article formulates the planning problem of passive filtering systems in a multi-converter electrical distribution system as a probabilistic multi-objective optimization problem whose input random variables are characterized with probability density functions. A heuristic simplified approach including trade-off analysis issues is applied to solve the planning problem with the aim of optimizing several objectives and meeting proper probabilistic equality and inequality constraints. The approach is able to quickly find solutions on the Pareto frontier that can help the decision-maker select the final planning alternative for practical operation. The proposed approach is applied to a 17-busbar distribution test system to demonstrate its effectiveness.

  8. Resampling methods for evaluating the uncertainty of the nonparametric magnitude distribution estimation in the Probabilistic Seismic Hazard Analysis

    Science.gov (United States)

    Orlecka-Sikora, Beata

    2008-08-01

    The cumulative distribution function (CDF) of the magnitude of seismic events is one of the most important probabilistic characteristics in Probabilistic Seismic Hazard Analysis (PSHA). The magnitude distribution of mining-induced seismicity is complex. Therefore, it is estimated using kernel nonparametric estimators. Because of its model-free character, the nonparametric approach cannot, however, provide confidence interval estimates for the CDF using the classical methods of mathematical statistics. To assess errors in the estimation of seismic event magnitudes, and thereby in the evaluation of seismic hazard parameters in the nonparametric approach, we propose the use of resampling methods. Resampling techniques applied to a single dataset provide many replicas of this sample, which preserve its probabilistic properties. In order to estimate the confidence intervals for the CDF of magnitude, we have developed an algorithm based on the bias-corrected and accelerated method (BCa method). This procedure uses the smoothed bootstrap and second-order bootstrap samples. We refer to this algorithm as the iterated BCa method. The algorithm performance is illustrated through the analysis of Monte Carlo simulated seismic event catalogues and actual data from an underground copper mine in the Legnica-Głogów Copper District in Poland. The studies show that the iterated BCa technique provides satisfactory results regardless of the sample size and the actual shape of the magnitude distribution.
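
    For orientation, the plain (non-iterated) BCa interval for one point of the magnitude CDF can be obtained directly with scipy; the authors' iterated, smoothed-bootstrap variant goes beyond this. The catalogue below is synthetic and the completeness level is an assumption.

      import numpy as np
      from scipy.stats import bootstrap

      rng = np.random.default_rng(2)
      # Stand-in catalogue: exponential magnitudes above an assumed completeness
      # level of 1.0 (the paper uses kernel-smoothed mining catalogues instead).
      mags = 1.0 + rng.exponential(scale=1.0 / 2.3, size=400)

      def cdf_at_2(m, axis=-1):
          """Statistic: empirical CDF of magnitude evaluated at M = 2.0."""
          return np.mean(m <= 2.0, axis=axis)

      res = bootstrap((mags,), cdf_at_2, confidence_level=0.95,
                      n_resamples=2000, method="BCa", random_state=rng)
      print(f"CDF(2.0) = {cdf_at_2(mags):.3f}, 95% BCa CI = "
            f"({res.confidence_interval.low:.3f}, {res.confidence_interval.high:.3f})")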

  9. Probabilistic Fatigue Damage Prognosis Using a Surrogate Model Trained Via 3D Finite Element Analysis

    Science.gov (United States)

    Leser, Patrick E.; Hochhalter, Jacob D.; Newman, John A.; Leser, William P.; Warner, James E.; Wawrzynek, Paul A.; Yuan, Fuh-Gwo

    2015-01-01

    Utilizing inverse uncertainty quantification techniques, structural health monitoring can be integrated with damage progression models to form probabilistic predictions of a structure's remaining useful life. However, damage evolution in realistic structures is physically complex. Accurately representing this behavior requires high-fidelity models which are typically computationally prohibitive. In the present work, a high-fidelity finite element model is represented by a surrogate model, reducing computation times. The new approach is used with damage diagnosis data to form a probabilistic prediction of remaining useful life for a test specimen under mixed-mode conditions.
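
    The surrogate idea is independent of the specific finite element model: train a cheap regressor on a handful of expensive runs, then push the diagnosed-damage uncertainty through the surrogate. A minimal sketch with a Gaussian-process surrogate and an invented stand-in for the high-fidelity crack-growth model:

      import numpy as np
      from sklearn.gaussian_process import GaussianProcessRegressor
      from sklearn.gaussian_process.kernels import RBF

      rng = np.random.default_rng(3)

      def expensive_model(a):
          """Stand-in for the high-fidelity FE model (assumed functional form):
          remaining life in cycles as a function of crack size a."""
          return 1e4 / (1.0 + 50.0 * a**2)

      # Train the surrogate on a handful of "FE runs".
      a_train = np.linspace(0.01, 0.5, 12).reshape(-1, 1)
      y_train = expensive_model(a_train).ravel()
      gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1), normalize_y=True)
      gp.fit(a_train, y_train)

      # Probabilistic prognosis: push diagnosed crack-size uncertainty through
      # the cheap surrogate instead of the FE model.
      a_diag = rng.normal(0.20, 0.03, size=5000).clip(0.01, 0.5).reshape(-1, 1)
      rul = gp.predict(a_diag)
      print(f"RUL median = {np.median(rul):.0f} cycles, "
            f"5th-95th pct = {np.percentile(rul, 5):.0f}-{np.percentile(rul, 95):.0f}")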

  10. Cultural Safety: An Evolutionary Concept Analysis.

    Science.gov (United States)

    Bozorgzad, Parisa; Negarandeh, Reza; Raiesifar, Afsaneh; Poortaghi, Sarieh

    2016-01-01

    Healing occurs in a safe milieu, and patients feel safe when service providers view them as whole persons, recognizing the multiple underlying factors that cause illness. Cultural safety can lead to service delivery in this way, but most nurses have no clear understanding of this concept. This study aimed to clarify cultural safety on the basis of Rodgers' evolutionary concept analysis. Cultural sensitivity and cultural awareness are the antecedents of cultural safety. These concepts include a nurse's flexibility toward his or her patients with different perspectives, creating an atmosphere free from intimidation and judgment of the patients, with an overall promotion of health in multicultural communities.

  11. Method for analysis and assessment of the relation between stress and reliability of knowledge-based actions in the probabilistic safety analysis; Methode fuer die Analyse und Bewertung der Wechselwirkung zwischen Stress und der Zuverlaessigkeit wissensbasierten Handelns in der probabilistischen Sicherheitsanalyse

    Energy Technology Data Exchange (ETDEWEB)

    Fassmann, Werner

    2014-06-15

    According to the current theoretical and empirical state-of-the-art, stress has to be understood as the emotional and cognitive reaction by which humans adapt to situations that imply real or imagined danger, threat, or frustration of important personal goals or needs. The emotional reaction to such situations can be so extreme that rational coping with the situation is precluded. In less extreme cases, changes in the cognitive processes underlying human action will occur, which may systematically affect the reliability of tasks personnel have to perform in a stressful situation. Reliable task performance by personnel of nuclear power plants and other risk technologies is also subject to such effects. The method developed within the research and development project RS1198, sponsored by the German Federal Ministry for Economic Affairs and Energy (BMWi), addresses both aspects of emotional and cognitive coping with stressful situations. The analytical and evaluation steps of the approach provide end users with guidance on how to capture and quantify the contribution of stress-related emotional and cognitive factors to the reliable performance of knowledge-based actions. For this purpose, a suitable guideline has been developed. Further research needs for clarifying open questions have been identified. A case study application illustrates how to use the method. Part of the work performed in this project was dedicated to a review addressing the question of the extent to which Swain's approach to the analysis and evaluation of stress is in line with current scientific knowledge. Suitable suggestions for updates have been developed.

  12. Probabilistic design of fibre concrete structures

    Science.gov (United States)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation is now a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing, and shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness, and ductility are described in the paper. Since the variability of fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and for evaluation of structural performance, reliability, and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty of the material properties, or their randomness obtained from material tests, is accounted for in the random distribution. Furthermore, degradation of reinforced concrete materials, such as carbonation of concrete and corrosion of reinforcement, can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented...
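
    The failure probability / reliability index pair mentioned here comes from a limit state g = R − E evaluated over randomized model runs: Pf = P(g ≤ 0) and β = −Φ⁻¹(Pf). A Monte Carlo sketch with assumed resistance and load distributions standing in for the randomized finite element model:

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(4)
      N = 1_000_000

      # Randomized limit state g = R - E; both distributions are assumed here,
      # standing in for the randomized nonlinear FE model and the load model.
      resistance = rng.lognormal(mean=np.log(420.0), sigma=0.12, size=N)  # kN
      load = rng.gumbel(loc=250.0, scale=30.0, size=N)                    # kN

      pf = np.mean(resistance - load <= 0.0)   # failure probability
      beta = -norm.ppf(pf)                     # corresponding reliability index
      print(f"P_f = {pf:.2e}, reliability index beta = {beta:.2f}")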

  13. A probabilistic analysis of human influence on recent record global mean temperature changes

    Directory of Open Access Journals (Sweden)

    Philip Kokic

    2014-01-01

    December 2013 was the 346th consecutive month in which the global land and ocean average surface temperature exceeded the 20th century monthly average, with February 1985 the last time the mean temperature fell below this value. Even given these and other extraordinary statistics, public acceptance of human-induced climate change and confidence in the supporting science has declined since 2007. The degree of uncertainty as to whether observed climate changes are due to human activity or are part of natural system fluctuations remains a major stumbling block to effective adaptation action and risk management. Previous approaches to attributing change include qualitative expert-assessment approaches, such as those used in IPCC reports, and ‘fingerprinting’ methods based on global climate models. Here we develop an alternative approach which provides a rigorous probabilistic statistical assessment of the link between observed climate changes and human activities in a way that can inform formal climate risk assessment. We construct and validate a time series model of anomalous global temperatures to June 2010, using rates of greenhouse gas (GHG) emissions as well as other causal factors, including solar radiation, volcanic forcing, and the El Niño Southern Oscillation. When the effect of GHGs is removed, bootstrap simulation of the model reveals that there is less than a one in one hundred thousand chance of observing an unbroken sequence of 304 months (our analysis extends to June 2010) with mean surface temperature exceeding the 20th century average. We also show that one would expect a far greater number of short periods of falling global temperatures (as observed since 1998) if climate change were not occurring. This approach to assessing probabilities of human influence on global temperature could be transferred to other climate variables and extremes, allowing enhanced formal risk assessment of climate change.

  14. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    Science.gov (United States)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating subduction and background (crustal) earthquakes separately allows for optimal use of available information and avoids significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

  15. Anonymous non-response analysis in the ABCD cohort study enabled by probabilistic record linkage.

    Science.gov (United States)

    Tromp, M; van Eijsden, M; Ravelli, A C J; Bonsel, G J

    2009-05-01

    Selective non-response is an important threat to study validity as it can lead to selection bias. The Amsterdam Born Children and their Development study (ABCD-study) is a large cohort study addressing the relationship between life style, psychological conditions, nutrition and sociodemographic background of pregnant women and their children's health. Possible selective non-response and selection bias in the ABCD-study were analysed using national perinatal registry data. ABCD-study data were linked with national perinatal registry data by probabilistic medical record linkage techniques. Differences in the prevalence of relevant risk factors (sociodemographic and care-related factors) and birth outcomes between respondents and non-respondents were tested using Pearson chi-squared tests. Selection bias (i.e. bias in the association between risk factors and specific outcomes) was analysed by regression analysis with and without adjustment for participation status. The ABCD non-respondents were significantly younger, more often non-western, and more often multiparae. Non-respondents entered antenatal care later, were more often under supervision of an obstetrician and had a spontaneous delivery more often. Non-response however, was not significantly associated with preterm birth (odds ratio 1.10; 95% CI 0.93, 1.29) or low birthweight (odds ratio 1.16; 95% CI 0.98, 1.37) after adjustment for sociodemographic risk factors. The associations found between risk factors and adverse pregnancy outcomes were similar for respondents and non-respondents. Anonymised record linkage of cohort study data with national registry data indicated that selective non-response was present in the ABCD-study, but selection bias was acceptably low and did not influence the main study questions.

  16. Probabilistic seismic hazard analysis (PSHA) for Ethiopia and the neighboring region

    Science.gov (United States)

    Ayele, Atalay

    2017-10-01

    A seismic hazard calculation is carried out for the Horn of Africa region (0°-20°N and 30°-50°E) based on the probabilistic seismic hazard analysis (PSHA) method. Earthquake catalogue data obtained from different sources were compiled, homogenized to the Mw magnitude scale, and declustered to remove dependent events, as required by the Poisson earthquake source model. The seismotectonic map of the study area available from recent studies is used for area source zonation. For assessing the seismic hazard, the study area was divided into small grid cells of size 0.5° × 0.5°, and the hazard parameters were calculated at the center of each of these grid cells by considering contributions from all seismic sources. Peak Ground Acceleration (PGA) corresponding to 10% and 2% probability of exceedance in 50 years was calculated for all the grid points using a generic rock site with Vs = 760 m/s. The obtained values vary from 0.0 to 0.18 g and from 0.0 to 0.35 g for the 475- and 2475-year return periods, respectively. The corresponding contour maps showing the spatial variation of PGA values for the two return periods are presented here. Uniform hazard response spectra (UHRS) for 10% and 2% probability of exceedance in 50 years and hazard curves for PGA and 0.2 s spectral acceleration (Sa), all at rock site conditions, are developed for the city of Addis Ababa. The hazard map of this study corresponding to the 475-year return period has already been used to update and produce the 3rd generation building code of Ethiopia.

  17. From Safety Analysis to Formal Specification

    DEFF Research Database (Denmark)

    Hansen, Kirsten Mark; Ravn, Anders P.; Stavridou, Victoria

    1998-01-01

    Software for safety critical systems must deal with the hazards identified by safety analysis. This paper investigates how the results of one safety analysis technique, fault trees, are interpreted as software safety requirements to be used in the program design process. We propose that fault tree a...... requirements for software components.

  18. Probabilistic Fatigue Damage analysis of a Shape-Optimized Slot Design

    DEFF Research Database (Denmark)

    Andersen, Michael Rye; Birk-Sørensen, Martin; Hansen, Peter Friis

    1998-01-01

    A conventional VLCC hull design has a large number of complicated connections between the longitudinals and the transverse web frames. The production cost of these joints is relatively high. Thus, new designs suitable for rational welding procedures are of interest. A probabilistic fatigue damage...

  19. Dynamic Probabilistic CCA for Analysis of Affective Behaviour and Fusion of Continuous Annotations

    NARCIS (Netherlands)

    Nicolaou, Mihalis A.; Pavlovic, Vladimir; Pantic, Maja

    Fusing multiple continuous expert annotations is a crucial problem in machine learning and computer vision, particularly when dealing with uncertain and subjective tasks related to affective behavior. Inspired by the concept of inferring shared and individual latent spaces in Probabilistic Canonical

  20. Department of Energy seismic siting and design decisions: Consistent use of probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kimball, J.K.; Chander, H.

    1997-02-01

    The Department of Energy (DOE) requires that all nuclear or non-nuclear facilities shall be designed, constructed, and operated so that the public, the workers, and the environment are protected from the adverse impacts of natural phenomena hazards, including earthquakes. The design and evaluation of DOE facilities to accommodate earthquakes shall be based on an assessment of the likelihood of future earthquake occurrences, commensurate with a graded approach that depends on the potential risk posed by the DOE facility. DOE has developed Standards for site characterization and hazards assessments to ensure that a consistent use of probabilistic seismic hazard is implemented at each DOE site. The criteria included in the DOE Standards are described and compared to the criteria being promoted by the staff of the Nuclear Regulatory Commission (NRC) for commercial nuclear reactors. In addition to a general description of the DOE requirements and criteria, the most recent probabilistic seismic hazard results for a number of DOE sites are presented. Based on the work completed to develop the probabilistic seismic hazard results, a summary of important application issues is given, with recommendations for future improvements in the development and use of probabilistic seismic hazard criteria for the design of DOE facilities.

  1. A Probabilistic Design Methodology for a Turboshaft Engine Overall Performance Analysis

    Directory of Open Access Journals (Sweden)

    Min Chen

    2014-05-01

    In reality, the cumulative effect of the many uncertainties in engine component performance may stack up to affect the engine overall performance. This paper aims to quantify the impact of uncertainty in engine component performance on the overall performance of a turboshaft engine based on a Monte Carlo probabilistic design method. A novel probabilistic model of a turboshaft engine, consisting of a Monte Carlo simulation generator, a traditional nonlinear turboshaft engine model, and a probability statistical model, was implemented to predict this impact. One of the fundamental results shown herein is that uncertainty in component performance has a significant impact on the engine overall performance prediction. This paper also shows that, taking into consideration the uncertainties in component performance, the turbine entry temperature and overall pressure ratio based on the probabilistic design method should increase by 0.76% and 8.33%, respectively, compared with those of the deterministic design method. The comparison shows that the probabilistic approach provides a more credible and reliable way to assign the design space for a target engine overall performance.
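
    The record's pipeline, a sampler feeding a nonlinear engine model feeding summary statistics, is easy to mimic with a toy cycle relation; the efficiency and pressure-ratio scatter below, and the power relation itself, are invented for illustration and are not the paper's engine model.

      import numpy as np

      rng = np.random.default_rng(5)
      N = 200_000

      # Assumed scatter in component performance (means and 1-sigma values invented).
      eta_comp = rng.normal(0.85, 0.01, size=N)   # compressor efficiency
      eta_turb = rng.normal(0.88, 0.01, size=N)   # turbine efficiency
      pr = rng.normal(10.0, 0.15, size=N)         # overall pressure ratio

      def shaft_power(eta_c, eta_t, pr):
          """Toy stand-in for the nonlinear engine model (normalized power)."""
          return eta_c * eta_t * (1.0 - pr**-0.2857)

      p = shaft_power(eta_comp, eta_turb, pr)
      p_nom = shaft_power(0.85, 0.88, 10.0)
      # Probabilistic design question: how much does the deterministic nominal
      # point overstate the power that is met with 3-sigma confidence?
      print(f"nominal = {p_nom:.4f}, 0.13th percentile = {np.percentile(p, 0.13):.4f}")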

  2. Probabilistic analysis for the response of nonlinear base isolation system under the ground excitation induced by high dam flood discharge

    Science.gov (United States)

    Liang, Chao; Zhang, Jinliang; Lian, Jijian; Liu, Fang; Li, Xinyao

    2017-10-01

    According to theoretical analysis, a general characteristic of the ground vibration induced by high dam flood discharge is that the dominant frequency ranges over several narrow frequency bands, which is verified by observations from the Xiangjiaba Hydropower Station. Nonlinear base isolation is used to reduce structural vibration under ground excitation; an advantage of applying isolation here is that, owing to these excitation characteristics, the low-frequency resonance problem does not need to be considered, which significantly simplifies the isolation design. In order to obtain the response probabilistic distribution of a nonlinear system, the state space split technique is modified. As only a few degrees of freedom are subjected to the random noise, the probabilistic distribution of the response without involving stochastic excitation is represented by the δ function. Then, the sampling property of the δ function is employed to reduce the dimension of the Fokker-Planck-Kolmogorov (FPK) equation, and the low-dimensional FPK equation is solvable with existing methods. Numerical results indicate that the proposed approach is effective and accurate. Moreover, the response probabilistic distributions are more reasonable and scientific than the peak responses calculated by conventional time and frequency domain methods.

  3. Physics-based Probabilistic Seismic Hazard Analysis for Seismicity Induced by Fluid Injection

    Science.gov (United States)

    Foxall, W.; Hutchings, L. J.; Johnson, S.; Savy, J. B.

    2011-12-01

    Risk associated with induced seismicity (IS) is a significant factor in the design, permitting, and operation of enhanced geothermal, geological CO2 sequestration, and other fluid injection projects. Whereas conventional probabilistic seismic hazard and risk analysis (PSHA, PSRA) methods provide an overall framework, they require adaptation to address specific characteristics of induced earthquake occurrence and ground motion estimation, and the nature of the resulting risk. The first problem is to predict the earthquake frequency-magnitude distribution of induced events for the PSHA required at the design and permitting stage, before the start of injection, when an appropriate earthquake catalog clearly does not exist. Furthermore, observations and theory show that the occurrence of earthquakes induced by an evolving pore-pressure field is time-dependent, and hence does not conform to the assumption of Poissonian behavior in conventional PSHA. We present an approach to this problem based on generation of an induced seismicity catalog using numerical simulation of pressure-induced shear failure in a model of the geologic structure and stress regime in and surrounding the reservoir. The model is based on available measurements of site-specific in-situ properties as well as generic earthquake source parameters. We also discuss semi-empirical analysis to sequentially update hazard and risk estimates for input to management and mitigation strategies using earthquake data recorded during and after injection. The second important difference from conventional PSRA is that, in addition to potentially damaging ground motions, a significant risk associated with induced seismicity in general is the perceived nuisance caused in nearby communities by small, local felt earthquakes, which in general occur relatively frequently. Including these small, usually shallow earthquakes in the hazard analysis requires extending the ground motion frequency band considered to include the high

  4. A worldwide SPT-based soil liquefaction triggering analysis utilizing gene expression programming and Bayesian probabilistic method

    Directory of Open Access Journals (Sweden)

    Maral Goharzay

    2017-08-01

    In this context, two different approaches to soil liquefaction evaluation using a soft computing technique based on worldwide standard penetration test (SPT) databases have been studied. Gene expression programming (GEP), a gray-box modeling approach, is used to develop different deterministic models in order to evaluate the occurrence of soil liquefaction in terms of a liquefaction field performance indicator (LI) and a factor of safety (Fs), in logistic regression and classification formulations. The comparative plots illustrate that the classification-based models perform better than those based on logistic regression. In the probabilistic approach, a calibrated mapping function is developed in the context of Bayes' theorem in order to capture the failure probabilities (PL) in the absence of knowledge of parameter uncertainty. Consistent results obtained from the proposed probabilistic models, compared to the most well-known models, indicate the robustness of the methodology used in this study. The probability models provide a simple but efficient decision-making tool in engineering design to quantitatively assess liquefaction triggering thresholds.
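
    Calibrated mapping functions of this kind typically take a simple parametric form that converts a deterministic factor of safety into a probability of liquefaction. The sketch below uses the common form PL = 1/(1 + (Fs/a)^b) with illustrative constants, not the paper's calibrated values:

      import numpy as np

      def prob_liquefaction(fs, a=1.0, b=3.3):
          """Mapping function P_L = 1 / (1 + (Fs/a)**b): a common Bayesian-
          calibrated form relating factor of safety to liquefaction probability.
          The constants a and b here are illustrative assumptions."""
          return 1.0 / (1.0 + (fs / a) ** b)

      for fs in (0.8, 1.0, 1.2, 1.5):
          print(f"Fs = {fs:.1f} -> P_L = {prob_liquefaction(fs):.2f}")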

  5. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  6. Application of probabilistic risk assessment in nuclear and environmental licensing processes of nuclear reactors in Brazil

    Energy Technology Data Exchange (ETDEWEB)

    Mata, Jonatas F.C. da; Vasconcelos, Vanderley de; Mesquita, Amir Z., E-mail: jonatasfmata@yahoo.com.br, E-mail: vasconv@cdtn.br, E-mail: amir@cdtn.br [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)

    2015-07-01

    The nuclear accident at Fukushima Daiichi, which occurred in Japan in 2011, prompted worldwide reflection on the management of nuclear and environmental licensing processes for existing nuclear reactors. One of the key lessons learned in this matter is that studies of Probabilistic Safety Assessment and Severe Accidents are becoming essential, even in the early stage of a nuclear development project. In Brazil, the Brazilian Nuclear Energy Commission, CNEN, conducts the nuclear licensing. The organization responsible for the environmental licensing is the Brazilian Institute of Environment and Renewable Natural Resources, IBAMA. In the scope of the licensing processes of these two institutions, the safety analysis is essentially deterministic, complemented by probabilistic studies. The Probabilistic Safety Assessment (PSA) is the study performed to evaluate the behavior of the nuclear reactor in a sequence of events that may lead to the melting of its core. It includes both probability and consequence estimation of these events, which are called Severe Accidents, allowing the risk assessment of the plant to be obtained. Thus, possible shortcomings in the design of systems are identified, providing a basis for safety assessment and improving safety. During the environmental licensing, a Quantitative Risk Analysis (QRA), including probabilistic evaluations, is required in order to support the development of the Risk Analysis Study, the Risk Management Program, and the Emergency Plan. This article aims to provide an overview of probabilistic risk assessment methodologies and their applications in nuclear and environmental licensing processes of nuclear reactors in Brazil. (author)

  7. Waste Isolation Pilot Plant Safety Analysis Report

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1995-11-01

    The following provides a summary of the specific issues addressed in this FY-95 Annual Update as they relate to the CH TRU safety bases: Executive Summary; Site Characteristics; Principal Design and Safety Criteria; Facility Design and Operation; Hazards and Accident Analysis; Derivation of Technical Safety Requirements; Radiological and Hazardous Material Protection; Institutional Programs; Quality Assurance; and Decontamination and Decommissioning. The "System Design Descriptions" (SDDs) for the WIPP were reviewed and incorporated into Chapter 3, Principal Design and Safety Criteria, and Chapter 4, Facility Design and Operation. This provides the most currently available final engineering design information on waste emplacement operations throughout the disposal phase up to the point of permanent closure. Also, the criteria which define the TRU waste to be accepted for disposal at the WIPP facility were summarized in Chapter 3 based on the WAC for the Waste Isolation Pilot Plant. This Safety Analysis Report (SAR) documents the safety analyses that develop and evaluate the adequacy of the Waste Isolation Pilot Plant Contact-Handled Transuranic Waste (WIPP CH TRU) safety bases necessary to ensure the safety of workers, the public, and the environment from the hazards posed by WIPP waste handling and emplacement operations during the disposal phase, and the hazards associated with the decommissioning and decontamination phase. The analyses of the hazards associated with the long-term (10,000 year) disposal of TRU and TRU mixed waste, and demonstration of compliance with the requirements of 40 CFR 191, Subpart B and 40 CFR 268.6, will be addressed in detail in the WIPP Final Certification Application scheduled for submittal in October 1996 (40 CFR 191) and the No-Migration Variance Petition (40 CFR 268.6) scheduled for submittal in June 1996. Section 5.4, Long-Term Waste Isolation Assessment, summarizes the current status of the assessment.

  8. Increasing safety of a robotic system for inner ear surgery using probabilistic error modeling near vital anatomy

    Science.gov (United States)

    Dillon, Neal P.; Siebold, Michael A.; Mitchell, Jason E.; Blachon, Gregoire S.; Balachandran, Ramya; Fitzpatrick, J. Michael; Webster, Robert J.

    2016-03-01

    Safe and effective planning for robotic surgery that involves cutting or ablation of tissue must consider all potential sources of error when determining how close the tool may come to vital anatomy. A pre-operative plan that does not adequately consider potential deviations from ideal system behavior may lead to patient injury. Conversely, a plan that is overly conservative may result in ineffective or incomplete performance of the task. Thus, enforcing simple, uniform-thickness safety margins around vital anatomy is insufficient in the presence of spatially varying, anisotropic error. Prior work has used registration error to determine a variable-thickness safety margin around vital structures that must be approached during mastoidectomy but ultimately preserved. In this paper, these methods are extended to incorporate image distortion and physical robot errors, including kinematic errors and deflections of the robot. These additional sources of error are discussed and stochastic models for a bone-attached robot for otologic surgery are developed. An algorithm for generating appropriate safety margins based on a desired probability of preserving the underlying anatomical structure is presented. Simulations are performed on a CT scan of a cadaver head and safety margins are calculated around several critical structures for planning of a robotic mastoidectomy.
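
    The variable-thickness margin described here falls out of a per-point error budget: combine the spatially varying error standard deviations along each surface normal and take the one-sided Gaussian quantile for the desired preservation probability. All sigma values in the sketch are invented, not the paper's measured errors.

      import numpy as np
      from scipy.stats import norm

      # Per-point margin sizing: along each surface normal, total tool-placement
      # error is modeled as zero-mean Gaussian whose standard deviation combines
      # the error sources named in the record (all values assumed, in mm).
      sigma_registration = np.array([0.30, 0.45, 0.60])  # three sample surface points
      sigma_image = 0.15                                 # image distortion
      sigma_robot = 0.25                                 # kinematics and deflection

      P_PRESERVE = 0.9999   # desired probability of preserving the structure
      sigma_total = np.sqrt(sigma_registration**2 + sigma_image**2 + sigma_robot**2)
      margin = norm.ppf(P_PRESERVE) * sigma_total        # one-sided quantile per point

      for s, m in zip(sigma_total, margin):
          print(f"sigma = {s:.2f} mm -> margin = {m:.2f} mm")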

  9. Towards a probabilistic tsunami hazard analysis for the Gulf of Cadiz

    Science.gov (United States)

    Løvholt, Finn; Urgeles, Roger

    2017-04-01

    Landslides and volcanic flank collapses constitute a significant portion of all known tsunami sources, and they are less constrained geographically than earthquakes as they are not tied to large fault zones. While landslides have mostly produced local tsunamis historically, prehistoric evidence shows that landslides can also produce ocean-wide tsunamis. Because the landslide-induced tsunami probability is more difficult to quantify than that induced by earthquakes, the landslide tsunami hazard is less well understood. To improve our understanding and methodologies to deal with this hazard, we here present results and methods for a preliminary landslide probabilistic tsunami hazard assessment (LPTHA) for the Gulf of Cadiz for submerged landslides. The literature on LPTHA is sparse, and studies have so far been separated into two groups, the first based on observed magnitude-frequency distributions (MFDs), the second based on simplified geotechnical slope stability analysis. We argue that the MFD-based approach is best suited when a sufficient amount of data covering a wide range of volumes is available, although uncertainties in the dating of the landslides often represent a potentially large source of bias. To this end, the relatively rich availability of landslide data in the Gulf of Cadiz makes this area suitable for developing and testing LPTHA models. In the presentation, we will first explore the landslide data and statistics, including different spatial factors such as slope versus volume relationships, faults, etc. Examples of how random realizations can be used to distribute tsunami sources over the study area will be demonstrated. Furthermore, computational strategies for simulating both the landslide and the tsunami generation in a simplified way will be described. To this end, we use a depth-averaged viscoplastic landslide model coupled to the numerical tsunami model to represent a set of idealized tsunami sources, which are in turn

  10. Probabilistic Seismic Hazard Analysis of Injection-Induced Seismicity Utilizing Physics-Based Simulation

    Science.gov (United States)

    Johnson, S.; Foxall, W.; Savy, J. B.; Hutchings, L. J.

    2012-12-01

    Risk associated with induced seismicity is a significant factor in the design, permitting and operation of enhanced geothermal, geological CO2 sequestration, wastewater disposal, and other fluid injection projects. The conventional probabilistic seismic hazard analysis (PSHA) approach provides a framework for estimation of induced seismicity hazard but requires adaptation to address the particular occurrence characteristics of induced earthquakes and to estimation of the ground motions they generate. The assumption often made in conventional PSHA of Poissonian earthquake occurrence in both space and time is clearly violated by seismicity induced by an evolving pore pressure field. Our project focuses on analyzing hazard at the pre-injection design and permitting stage, before an induced earthquake catalog can be recorded. In order to accommodate the commensurate lack of pre-existing data, we have adopted a numerical physics-based approach to synthesizing and estimating earthquake frequency-magnitude distributions. Induced earthquake sequences are generated using the program RSQSIM (Dieterich and Richards-Dinger, PAGEOPH, 2010) augmented to simulate pressure-induced shear failure on faults and fractures embedded in a 3D geological structure under steady-state tectonic shear loading. The model uses available site-specific data on rock properties and in-situ stress, and generic values of frictional properties appropriate to the shallow reservoir depths at which induced events usually occur. The space- and time-evolving pore pressure field is coupled into the simulation from a multi-phase flow model. In addition to potentially damaging ground motions, induced seismicity poses a risk of perceived nuisance in nearby communities caused by relatively frequent, low magnitude earthquakes. Including these shallow local earthquakes in the hazard analysis requires extending the magnitude range considered to as low as M2 and the frequency band to include the short

  11. K West integrated water treatment system subproject safety analysis document

    Energy Technology Data Exchange (ETDEWEB)

    SEMMENS, L.S.

    1999-02-24

    This Accident Analysis evaluates unmitigated accident scenarios, and identifies Safety Significant and Safety Class structures, systems, and components for the K West Integrated Water Treatment System.

  12. Accident Analysis and Highway Safety

    Directory of Open Access Journals (Sweden)

    Omar Noorliyana

    2017-01-01

    Since 2010, Federal Route FT050 (Jalan Batu Pahat-Kluang) has undergone many changes, including the improvement of geometric features (i.e., construction of medians, dedicated U-turns, and additional lanes) and upgrading of the quality of the road surface. Unfortunately, even with these enhancements, accidents continue to occur along this route. This study covers both accident analysis and a blackspot study. Accident point weightage was used to identify blackspot locations. The results reveal hazardous road locations and the blackspot ranking along the route.

  13. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    Directory of Open Access Journals (Sweden)

    Maurice H. ter Beek

    2015-04-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLan with action rates, which specify the likelihood of exhibiting particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLan) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLan semantics based on discrete-time Markov chains. The Maude implementation of PFLan is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average cost of products.

  14. Quantitative Analysis of Probabilistic Models of Software Product Lines with Statistical Model Checking

    DEFF Research Database (Denmark)

    ter Beek, Maurice H.; Legay, Axel; Lluch Lafuente, Alberto

    2015-01-01

    We investigate the suitability of statistical model checking techniques for analysing quantitative properties of software product line models with probabilistic aspects. For this purpose, we enrich the feature-oriented language FLAN with action rates, which specify the likelihood of exhibiting...... particular behaviour or of installing features at a specific moment or in a specific order. The enriched language (called PFLAN) allows us to specify models of software product lines with probabilistic configurations and behaviour, e.g. by considering a PFLAN semantics based on discrete-time Markov chains....... The Maude implementation of PFLAN is combined with the distributed statistical model checker MultiVeStA to perform quantitative analyses of a simple product line case study. The presented analyses include the likelihood of certain behaviour of interest (e.g. product malfunctioning) and the expected average...

  15. Probabilistic seismic hazard analysis for Sumatra, Indonesia and across the Southern Malaysian Peninsula

    Science.gov (United States)

    Petersen, M.D.; Dewey, J.; Hartzell, S.; Mueller, C.; Harmsen, S.; Frankel, A.D.; Rukstales, K.

    2004-01-01

    The ground motion hazard for Sumatra and the Malaysian peninsula is calculated in a probabilistic framework, using procedures developed for the US National Seismic Hazard Maps. We constructed regional earthquake source models and used standard published and modified attenuation equations to calculate peak ground acceleration at 2% and 10% probability of exceedance in 50 years for rock site conditions. We developed or modified earthquake catalogs and declustered these catalogs to include only independent earthquakes. The resulting catalogs were used to define four source zones that characterize earthquakes in four tectonic environments: subduction zone interface earthquakes, subduction zone deep intraslab earthquakes, strike-slip transform earthquakes, and intraplate earthquakes. The recurrence rates and sizes of historical earthquakes on known faults and across zones were also determined from this modified catalog. In addition to the source zones, our seismic source model considers two major faults that are known historically to generate large earthquakes: the Sumatran subduction zone and the Sumatran transform fault. Several published studies were used to describe earthquakes along these faults during historical and pre-historical time, as well as to identify segmentation models of faults. Peak horizontal ground accelerations were calculated using ground motion prediction relations that were developed from seismic data obtained from the crustal interplate environment, crustal intraplate environment, along the subduction zone interface, and from deep intraslab earthquakes. Most of these relations, however, have not been developed for large distances that are needed for calculating the hazard across the Malaysian peninsula, and none were developed for earthquake ground motions generated in an interplate tectonic environment that are propagated into an intraplate tectonic environment. For the interplate and intraplate crustal earthquakes, we have applied ground

  16. Probabilistic Load Flow

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2008-01-01

    This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...
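
    A minimal probabilistic load flow sketch: sample random nodal loads, solve a linearized (DC) power flow for each sample, and read off the distribution of a line flow. The network size, susceptances, load statistics, and overload threshold below are all invented for illustration.

      import numpy as np

      rng = np.random.default_rng(6)
      N = 20_000

      # 3-bus DC power-flow sketch. Bus 0 is the slack bus; lines 0-1, 0-2, and
      # 1-2 have susceptances 15, 10, and 10 p.u. (assumed), giving the reduced
      # susceptance matrix B for buses 1 and 2.
      B = np.array([[ 25.0, -10.0],
                    [-10.0,  20.0]])
      loads = np.column_stack([
          rng.normal(0.8, 0.10, size=N),   # bus 1 load (p.u.), assumed Gaussian
          rng.normal(0.5, 0.08, size=N),   # bus 2 load
      ])

      theta = np.linalg.solve(B, -loads.T).T     # bus voltage angles (rad)
      flow_01 = 15.0 * (0.0 - theta[:, 0])       # flow on line 0-1 (p.u.)

      print(f"mean flow 0-1 = {flow_01.mean():.3f} p.u., "
            f"P(flow > 0.9 p.u.) = {np.mean(flow_01 > 0.9):.3f}")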

  17. Relationships between psychological safety climate facets and safety behavior in the rail industry: a dominance analysis.

    Science.gov (United States)

    Morrow, Stephanie L; McGonagle, Alyssa K; Dove-Steinkamp, Megan L; Walker, Curtis T; Marmet, Matthew; Barnes-Farrell, Janet L

    2010-09-01

    The goals of this study were twofold: (1) to confirm a relationship between employee perceptions of psychological safety climate and safety behavior for a sample of workers in the rail industry and (2) to explore the relative strengths of relationships between specific facets of safety climate and safety behavior. Non-management rail maintenance workers employed by a large North American railroad completed a survey (n=421) regarding workplace safety perceptions and behaviors. Three facets of safety climate (management safety, coworker safety, and work-safety tension) were assessed as relating to individual workers' reported safety behavior. All three facets were significantly associated with safety behavior. Dominance analysis was used to assess the relative importance of each facet as related to the outcome, and work-safety tension evidenced the strongest relationship with safety behavior. Published by Elsevier Ltd.
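
    Dominance analysis, as used in this record, ranks predictors by their average incremental contribution to R² across all submodels. A sketch on simulated stand-ins for the three climate facets; the correlations and effect sizes are invented, not the study's data.

      import itertools
      import numpy as np

      rng = np.random.default_rng(7)
      n = 400
      X = rng.multivariate_normal([0, 0, 0],
                                  [[1.0, 0.4, 0.3],
                                   [0.4, 1.0, 0.3],
                                   [0.3, 0.3, 1.0]], size=n)
      y = 0.2 * X[:, 0] + 0.15 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0, 1, n)
      names = ["management safety", "coworker safety", "work-safety tension"]

      def r2(cols):
          """R-squared of the least-squares fit using the given predictor subset."""
          if not cols:
              return 0.0
          A = np.column_stack([X[:, list(cols)], np.ones(n)])
          resid = y - A @ np.linalg.lstsq(A, y, rcond=None)[0]
          return 1.0 - resid.var() / y.var()

      # General dominance: average increase in R^2 when adding each predictor
      # to every possible subset of the remaining predictors.
      for j, name in enumerate(names):
          others = [k for k in range(3) if k != j]
          gains = [r2(tuple(s) + (j,)) - r2(tuple(s))
                   for r in range(len(others) + 1)
                   for s in itertools.combinations(others, r)]
          print(f"{name}: mean incremental R^2 = {np.mean(gains):.3f}")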

  18. Classification of fault diagnosis in a gear wheel by used probabilistic neural network, fast Fourier transform and principal component analysis

    Directory of Open Access Journals (Sweden)

    Piotr CZECH

    2007-01-01

    This paper presents the results of an experimental application of an artificial neural network as a classifier of the degree of cracking of a tooth root in a gear wheel. The neural classifier was based on an artificial neural network of the Probabilistic Neural Network type (PNN). The input data for the classifier took the form of a matrix composed of statistical measures obtained from the fast Fourier transform (FFT) and principal component analysis (PCA). The identified model of a toothed gear transmission, operating in a circulating power system, was used to generate the training and testing sets for the experiment.
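
    The processing chain named in the record, spectral features via FFT, dimensionality reduction via PCA, and a PNN (equivalently, a Parzen-window class-density comparison) for classification, can be sketched end to end on synthetic vibration signals. The signal model, crack signature, and kernel width below are assumptions, not the paper's test rig.

      import numpy as np
      from sklearn.decomposition import PCA

      rng = np.random.default_rng(8)
      t = np.arange(512) / 512.0

      def make_signals(n, cracked):
          """Synthetic vibration: a mesh tone plus noise; a crack adds a higher
          harmonic (a modeling assumption for the sketch)."""
          s = np.sin(2 * np.pi * 32 * t) + 0.3 * rng.normal(size=(n, t.size))
          if cracked:
              s = s + 0.5 * np.sin(2 * np.pi * 96 * t)
          return s

      # FFT magnitude spectra as features, then PCA for dimensionality reduction.
      sigs = np.vstack([make_signals(40, False), make_signals(40, True)])
      X = np.abs(np.fft.rfft(sigs, axis=1))
      y = np.array([0] * 40 + [1] * 40)
      Xp = PCA(n_components=5).fit_transform(X)
      Xp /= Xp.std(axis=0)

      def pnn_predict(i, sigma=1.0):
          """PNN decision: compare Parzen-window class densities at point i."""
          d2 = ((Xp - Xp[i]) ** 2).sum(axis=1)
          k = np.exp(-d2 / (2 * sigma**2))
          k[i] = 0.0                    # leave-one-out: exclude the query point
          return int(k[y == 1].mean() > k[y == 0].mean())

      preds = np.array([pnn_predict(i) for i in range(len(y))])
      print(f"leave-one-out accuracy = {np.mean(preds == y):.2f}")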

  19. Development of methodology and computer programs for the ground response spectrum and the probabilistic seismic hazard analysis

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Joon Kyoung [Semyung Univ., Research Institute of Industrial Science and Technol , Jecheon (Korea, Republic of)

    1996-12-15

    The objective of this study is to investigate and develop methodologies, and corresponding computer codes, compatible with the domestic seismological and geological environments, for estimating the ground response spectrum and the probabilistic seismic hazard. Using the PSHA computer program, the Cumulative Probability Functions (CPDF) and Probability Functions (PDF) of the annual exceedance have been investigated for the analysis of the uncertainty space of the annual probability at ten seismic hazard levels of interest (0.1 g to 0.99 g). The cumulative probability functions and probability functions of the annual exceedance have also been compared to the results from different input parameter spaces.

  20. RISMC Advanced Safety Analysis Project Plan – FY 2015 - FY 2019

    Energy Technology Data Exchange (ETDEWEB)

    Szilard, Ronaldo H. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Smith, Curtis L. [Idaho National Lab. (INL), Idaho Falls, ID (United States); Youngblood, Robert [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2014-09-01

    In this report, a project plan focused on industry applications is developed, using Risk-Informed Safety Margin Characterization (RISMC) tools and methods applied to realistic, relevant issues of current interest to the operating nuclear fleet. RISMC focuses on the modernization of nuclear power safety analysis (tools, methods, and data); implementing state-of-the-art modeling techniques (which include, for example, enabling incorporation of more detailed physics as they become available); taking advantage of modern computing hardware; and combining probabilistic and mechanistic analyses to enable a risk-informed safety analysis process. The modernized tools will maintain the current high level of safety in our nuclear power plant fleet, while providing an improved understanding of safety margins and the critical parameters that affect them. Thus, the set of tools will provide information to inform decisions on plant modifications, refurbishments, and surveillance programs, while improving economics. This set of tools will also benefit the design of new reactors, enhancing safety per unit cost of a nuclear plant. The proposed plan will focus on application of the RISMC toolkit, in particular on solving realistic problems on issues of current importance to the nuclear industry, in collaboration with plant owners and operators, to demonstrate the usefulness of these tools in decision making.

  1. A Demonstration of Advanced Safety Analysis Tools and Methods Applied to Large Break LOCA and Fuel Analysis for PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Szilard, Ronaldo Henriques [Idaho National Laboratory; Smith, Curtis Lee [Idaho National Laboratory; Martineau, Richard Charles [Idaho National Laboratory

    2016-03-01

    The U.S. Nuclear Regulatory Commission (NRC) is currently proposing a rulemaking designated as 10 CFR 50.46c to revise the loss-of-coolant accident (LOCA)/emergency core cooling system acceptance criteria to include the effects of higher burnup on fuel/cladding performance. We propose a demonstration problem of a representative four-loop PWR plant to study the impact of this new rule in the US nuclear fleet. Within the scope of evaluation for the 10 CFR 50.46c rule, aspects of safety, operations, and economics are considered in the industry application demonstration presented in this paper. An advanced safety analysis approach is used, by integrating the probabilistic element with deterministic methods for LOCA analysis, a novel approach to solving these types of multi-physics, multi-scale problems.

  2. Behavior of engineered nanoparticles in aqueous solutions and porous media: Connecting experimentation to probabilistic analysis

    Science.gov (United States)

    Contreras, Carolina

    2011-12-01

    Engineered nanoparticles have enhanced products and services in the fields of medicine, energy, engineering, communications, personal care, environmental treatment, and many others. The increased use of engineered nanoparticles in consumer products will lead to the release of these materials into natural systems, inevitably becoming a potential source of pollution. The study of the stability and mobility of these materials is fundamental to understanding their behavior in natural systems and predicting possible health and environmental implications. In addition, the use of probabilistic methods, such as sensitivity analysis applied to the parameters controlling their behavior, is useful in supporting a risk assessment. This research investigated the stability and mobility of two types of metal oxide nanoparticles (aluminum oxide and titanium dioxide). The stability studies tested the effects of sand, of pH 4, 7, and 10, and of NaCl at concentrations of 10 mM, 25 mM, 50 mM, and 75 mM. The mobility was tested using saturated quartz sand columns and nanoparticle suspensions at pH 4 and 7 and in the presence of NaCl and CaCl2 at concentrations of 0.1 mM, 1 mM, and 10 mM. Additionally, this work performed a sensitivity analysis of the physical parameters used in the mobility experiments performed for titanium dioxide and in mobility experiments taken from the literature for zero-valent iron nanoparticles and fluorescent colloids, to determine their effect on the value of C/Co by applying qualitative and quantitative methods. The results from the stability studies showed that titanium dioxide (TiO2) nanoparticles could remain suspended in solution for up to seven days at pH 10 and pH 7, even after settling of the sand, while in pH 4 solutions titanium settled along with the sand and after seven days no particles were observed in suspension. Other stability studies showed that the size of aluminum oxide (Al2O3) and titanium dioxide (TiO2) nanoparticles increased with increasing ionic strength (10 to 75

  3. Probabilistic Analysis and Design of a Raked Wing Tip for a Commercial Transport

    Science.gov (United States)

    Mason, Brian H.; Chen, Tzi-Kang; Padula, Sharon L.; Ransom, Jonathan B.; Stroud, W. Jefferson

    2008-01-01

    An approach for conducting reliability-based design and optimization (RBDO) of a Boeing 767 raked wing tip (RWT) is presented. The goal is to evaluate the benefits of RBDO for the design of an aircraft substructure. A finite-element (FE) model that includes eight critical static load cases is used to evaluate the response of the wing tip. Thirteen design variables that describe the thickness of the composite skins and stiffeners are selected to minimize the weight of the wing tip. A strain-based margin of safety is used to evaluate the performance of the structure. Randomness in the load scale factor and in the strain limits is considered. Of the 13 variables, the thickness of the thickest plies in the upper skins primarily controlled the wing-tip design. The report includes an analysis of the optimization results and recommendations for future reliability-based studies.

  4. iTOUGH2-IFC: An Integrated Flow Code in Support of Nagra's Probabilistic Safety Assessment:--User's Guide and Model Description

    Energy Technology Data Exchange (ETDEWEB)

    Finsterle, Stefan A.

    2009-01-02

    This document describes the development and use of the Integrated Flow Code (IFC), a numerical code and related model to be used for the simulation of time-dependent, two-phase flow in the near field and geosphere of a gas-generating nuclear waste repository system located in an initially fully water-saturated claystone (Opalinus Clay) in Switzerland. The development of the code and model was supported by the Swiss National Cooperative for the Disposal of Radioactive Waste (Nagra), Wettingen, Switzerland. Gas generation (mainly H₂, but also CH₄ and CO₂) may affect repository performance by (1) compromising the engineered barriers through excessive pressure build-up, (2) displacing potentially contaminated pore water, (3) releasing radioactive gases (e.g., those containing ¹⁴C and ³H), (4) changing hydrogeologic properties of the engineered barrier system and the host rock, and (5) altering the groundwater flow field and thus radionuclide migration paths. The IFC aims at providing water and gas flow fields as the basis for the subsequent radionuclide transport simulations, which are performed by the radionuclide transport code (RTC). The IFC, RTC, and a waste-dissolution and near-field transport model (STMAN) are part of the Integrated Radionuclide Release Code (IRRC), which integrates all safety-relevant features, events, and processes (FEPs). The IRRC is embedded into a Probabilistic Safety Assessment (PSA) computational tool that (1) evaluates alternative conceptual models, scenarios, and disruptive events, and (2) performs Monte-Carlo sampling to account for parametric uncertainties. The preliminary probabilistic safety assessment concept and the role of the IFC are visualized in Figure 1. The IFC was developed based on Nagra's PSA concept. Specifically, as many phenomena as possible are to be directly simulated using a (simplified) process model, which is at the core of the IRRC model. Uncertainty evaluation (scenario

  5. Flood risk and adaptation strategies under climate change and urban expansion: A probabilistic analysis using global data.

    Science.gov (United States)

    Muis, Sanne; Güneralp, Burak; Jongman, Brenden; Aerts, Jeroen C J H; Ward, Philip J

    2015-12-15

    An accurate understanding of flood risk and its drivers is crucial for effective risk management. Detailed risk projections, including uncertainties, are however rarely available, particularly in developing countries. This paper presents a method that integrates recent advances in global-scale modeling of flood hazard and land change, which enables the probabilistic analysis of future trends in national-scale flood risk. We demonstrate its application to Indonesia. We develop 1000 spatially-explicit projections of urban expansion from 2000 to 2030 that account for uncertainty associated with population and economic growth projections, as well as uncertainty in where urban land change may occur. The projections show that the urban extent increases by 215%-357% (5th and 95th percentiles). Urban expansion is particularly rapid on Java, which accounts for 79% of the national increase. From 2000 to 2030, increases in exposure will elevate flood risk by, on average, 76% and 120% for river and coastal floods. While sea level rise will further increase the exposure-induced trend by 19%-37%, the response of river floods to climate change is highly uncertain. However, as urban expansion is the main driver of future risk, the implementation of adaptation measures is increasingly urgent, regardless of the wide uncertainty in climate projections. Using probabilistic urban projections, we show that spatial planning can be a very effective adaptation strategy. Our study emphasizes that global data can be used successfully for probabilistic risk assessment in data-scarce countries. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Probabilistic properties of injection induced seismicity - implications for the seismic hazard analysis

    Science.gov (United States)

    Lasocki, Stanislaw; Urban, Pawel; Kwiatek, Grzegorz; Martinez-Garzón, Particia

    2017-04-01

    Injection induced seismicity (IIS) is an undesired dynamic rockmass response to massive fluid injections. This includes reactions, among others, to hydro-fracturing for shale gas exploitation. The complexity and changeability of the technological factors that induce IIS may result in significant deviations of the observed distributions of seismic process parameters from the models that perform well for natural, tectonic seismic processes. Classic formulations of probabilistic seismic hazard analysis for natural seismicity assume the seismic marked point process to be a stationary Poisson process, whose marks - magnitudes - are governed by the exponential distribution implied by the Gutenberg-Richter relation. It is well known that the use of an inappropriate earthquake occurrence model and/or an inappropriate magnitude distribution model leads to significant systematic errors of hazard estimates. It is therefore of paramount importance to check whether these assumptions on the seismic process, commonly used for natural seismicity, can be safely carried over to IIS hazard problems. Seismicity accompanying shale gas operations is widely studied in the framework of the project "Shale Gas Exploration and Exploitation Induced Risks" (SHEER). Here we present results of SHEER project investigations of such seismicity from Oklahoma and of a proxy of such seismicity - IIS data from The Geysers geothermal field. We attempt to answer the following questions: • Do IIS earthquakes follow the Gutenberg-Richter distribution law, so that the magnitude distribution can be modelled by an exponential distribution? • Is the occurrence process of IIS earthquakes Poissonian? Is it segmentally Poissonian? If yes, how are these segments linked to cycles of technological operations? Statistical tests indicate that the exponential magnitude distribution model implied by the Gutenberg-Richter relation is, in general, inappropriate. The magnitude distribution can be complex, multimodal, with no ready
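
    As a rough illustration of the first question above, the Aki-Utsu maximum-likelihood b-value and a Kolmogorov-Smirnov test against the fitted exponential model can be computed in a few lines. This is a hedged sketch on synthetic data, not the SHEER project's analysis; the completeness magnitude and the sample are assumptions, and because the tested parameter is fitted from the same sample, the KS p-value is optimistic (a careful study would bootstrap it).

    ```python
    # Sketch: test whether a magnitude sample is consistent with the
    # Gutenberg-Richter (exponential) model. Synthetic data, not SHEER data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    m_min = 2.0                                   # assumed completeness magnitude
    mags = m_min + rng.exponential(scale=1.0 / np.log(10), size=500)  # true b = 1

    # Aki-Utsu estimator: b = log10(e) / (mean(M) - Mmin)
    b_value = np.log10(np.e) / (mags.mean() - m_min)
    beta = b_value * np.log(10)                   # rate of the exponential model

    # One-sample KS test against the fitted exponential distribution
    d_stat, p_value = stats.kstest(mags - m_min, "expon", args=(0.0, 1.0 / beta))
    print(f"b = {b_value:.2f}, KS D = {d_stat:.3f}, p = {p_value:.3f}")
    ```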

  7. Parameter estimation in Probabilistic Seismic Hazard Analysis: current problems and some solutions

    Science.gov (United States)

    Vermeulen, Petrus

    2017-04-01

    A typical Probabilistic Seismic Hazard Analysis (PSHA) comprises identification of seismic source zones, determination of hazard parameters for these zones, selection of an appropriate ground motion prediction equation (GMPE), and integration over probabilities according to the Cornell-McGuire procedure. Determination of hazard parameters often does not receive the attention it deserves, and, therefore, problems therein are often overlooked. Here, many of these problems are identified, and some of them addressed. The parameters that need to be identified are those associated with the frequency-magnitude law, those associated with the earthquake recurrence law in time, and the parameters controlling the GMPE. This study is concerned with the frequency-magnitude law and the temporal distribution of earthquakes, not with GMPEs. The Gutenberg-Richter frequency-magnitude law is usually adopted for the frequency-magnitude law, and a Poisson process for earthquake recurrence in time. Accordingly, the parameters that need to be determined are the slope parameter of the Gutenberg-Richter frequency-magnitude law, i.e. the b-value, the maximum magnitude mmax at which the Gutenberg-Richter law applies, and the mean recurrence frequency, λ, of earthquakes. If, instead of the Cornell-McGuire procedure, the "Parametric-Historic procedure" is used, these parameters do not have to be known before the PSHA computations; they are estimated directly during the PSHA computation. The resulting relation for the frequency of ground motion vibration parameters has a functional form analogous to the frequency-magnitude law, described by the parameter γ (analogous to the b-value of the Gutenberg-Richter law) and the maximum possible ground motion amax (analogous to mmax). Originally, the approach could be applied only to simple GMPEs; recently, however, the method was extended to incorporate more complex forms of GMPEs. With regard to the parameter mmax, there are numerous methods of estimation
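
    The integration step of the Cornell-McGuire procedure can be sketched for a single source as the annual exceedance rate λ(PGA > a) = λ ∫ P(PGA > a | m) f(m) dm. The following minimal Python sketch assumes a truncated Gutenberg-Richter density, a single fixed source-to-site distance, and entirely hypothetical GMPE coefficients; it is an illustration of the structure of the calculation, not any published hazard model.

    ```python
    # Minimal Cornell-McGuire-style hazard integral for one point source.
    # All parameter values and GMPE coefficients are hypothetical.
    import numpy as np
    from scipy.stats import norm

    b, m_min, m_max, lam = 1.0, 4.0, 7.5, 0.2   # b-value, magnitude bounds, rate/yr
    beta = b * np.log(10)

    def f_m(m):
        """Truncated exponential (Gutenberg-Richter) magnitude density."""
        c = beta / (1.0 - np.exp(-beta * (m_max - m_min)))
        return c * np.exp(-beta * (m - m_min))

    def p_exceed(a, m, r=20.0, sigma=0.6):
        """P(PGA > a | m) from a toy lognormal GMPE at fixed distance r [km]."""
        ln_median = -4.0 + 1.0 * m - 1.3 * np.log(r)   # hypothetical coefficients
        return norm.sf((np.log(a) - ln_median) / sigma)

    ms = np.linspace(m_min, m_max, 400)
    for a in (0.05, 0.1, 0.2, 0.4):                    # PGA levels [g]
        y = p_exceed(a, ms) * f_m(ms)
        rate = lam * float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(ms)))  # trapezoid
        print(f"annual rate of PGA > {a:g} g: {rate:.3e}")
    ```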

  8. Lodalen slide: a probabilistic assessment

    National Research Council Canada - National Science Library

    El-Ramly, H; Morgenstern, N R; Cruden, D M

    2006-01-01

    .... A probabilistic slope analysis methodology based on Monte Carlo simulation using Microsoft® Excel and @Risk software is applied to investigate the Lodalen slide that occurred in Norway in 1954...
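
    For readers without access to @Risk, the same style of Monte Carlo reliability calculation can be sketched in a few lines of Python. The infinite-slope factor-of-safety model and every parameter value below are hypothetical illustrations, not the Lodalen geometry or soil data.

    ```python
    # Monte Carlo sketch of slope reliability for an infinite slope:
    # FS = (c + gamma*z*cos(beta)^2*tan(phi)) / (gamma*z*sin(beta)*cos(beta)).
    # All distributions and deterministic values are hypothetical.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000
    c   = rng.normal(10.0, 2.0, n)                 # cohesion [kPa]
    phi = np.radians(rng.normal(30.0, 3.0, n))     # friction angle
    gamma, z, beta = 19.0, 5.0, np.radians(25.0)   # unit weight, depth, slope angle

    fs = (c + gamma * z * np.cos(beta) ** 2 * np.tan(phi)) / (
        gamma * z * np.sin(beta) * np.cos(beta))
    print(f"P(FS < 1) ~ {np.mean(fs < 1.0):.4f}")
    print(f"mean FS = {fs.mean():.2f}, crude reliability index ~ "
          f"{(fs.mean() - 1.0) / fs.std():.2f}")
    ```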

  9. A cost-effectiveness analysis of a proactive management strategy for the Sprint Fidelis recall: a probabilistic decision analysis model.

    Science.gov (United States)

    Bashir, Jamil; Cowan, Simone; Raymakers, Adam; Yamashita, Michael; Danter, Matthew; Krahn, Andrew; Lynd, Larry D

    2013-12-01

    The management of the recall is complicated by the competing risks of lead failure and complications that can occur with lead revision. Many of these patients are currently undergoing an elective generator change, an ideal time to consider lead revision. Our objective was to determine the cost-effectiveness of a proactive management strategy for the Sprint Fidelis recall. We obtained detailed clinical outcomes and costing data from a retrospective analysis of 341 patients who received the Sprint Fidelis lead in British Columbia, where patients younger than 60 years were offered lead extraction when undergoing generator replacement. These population-based data were used to construct and populate a probabilistic Markov model in which a proactive management strategy was compared to a conservative strategy to determine the incremental cost per lead failure avoided. In our population, elective lead revisions were half the cost of emergent revisions and had a lower complication rate. In the model, the incremental cost-effectiveness ratio of proactive lead revision versus a recommended monitoring strategy was $12,779 per lead failure avoided. The proactive strategy resulted in 21 fewer failures per 100 patients treated and reduced the chance of an additional complication from an unexpected surgery. Cost-effectiveness analysis suggests that prospective lead revision should be considered when patients with a Sprint Fidelis lead present for pulse generator change. Elective revision of the lead is justified even when 25% of the population is operated on per year, and in some scenarios, it is both less costly and provides a better outcome. © 2013 Heart Rhythm Society. Published by Heart Rhythm Society. All rights reserved.
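
    As a back-of-envelope consistency check on the quoted figures (not the study's Markov model), the ICER definition ties the reported $12,779 per failure avoided to the 21 failures avoided per 100 patients:

    ```python
    # ICER = (incremental cost) / (incremental effect); here the "effect" is
    # lead failures avoided. Only the two figures quoted in the abstract are
    # used; the per-patient cost is derived, not taken from the paper.
    def icer(delta_cost, delta_effect):
        """Incremental cost per extra unit of effect."""
        return delta_cost / delta_effect

    failures_avoided_per_patient = 21 / 100          # from the abstract
    implied_cost = 12_779 * failures_avoided_per_patient
    print(f"implied incremental cost per patient ~ ${implied_cost:,.0f}")
    print(f"check: ICER = ${icer(implied_cost, failures_avoided_per_patient):,.0f} "
          "per lead failure avoided")
    ```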

  10. Safety of GM crops: compositional analysis.

    Science.gov (United States)

    Brune, Philip D; Culler, Angela Hendrickson; Ridley, William P; Walker, Kate

    2013-09-04

    The compositional analysis of genetically modified (GM) crops has continued to be an important part of the overall evaluation in the safety assessment program for these materials. The variety and complexity of genetically engineered traits and modes of action that will be used in GM crops in the near future, as well as our expanded knowledge of compositional variability and factors that can affect composition, raise questions about compositional analysis and how it should be applied to evaluate the safety of traits. The International Life Sciences Institute (ILSI), a nonprofit foundation whose mission is to provide science that improves public health and well-being by fostering collaboration among experts from academia, government, and industry, convened a workshop in September 2012 to examine these and related questions, and a series of papers has been assembled to describe the outcomes of that meeting.

  11. Risk-Based Explosive Safety Analysis

    Science.gov (United States)

    2016-11-30


  12. Design and analysis of DNA strand displacement devices using probabilistic model checking.

    Science.gov (United States)

    Lakin, Matthew R; Parker, David; Cardelli, Luca; Kwiatkowska, Marta; Phillips, Andrew

    2012-07-07

    Designing correct, robust DNA devices is difficult because of the many possibilities for unwanted interference between molecules in the system. DNA strand displacement has been proposed as a design paradigm for DNA devices, and the DNA strand displacement (DSD) programming language has been developed as a means of formally programming and analysing these devices to check for unwanted interference. We demonstrate, for the first time, the use of probabilistic verification techniques to analyse the correctness, reliability and performance of DNA devices during the design phase. We use the probabilistic model checker PRISM, in combination with the DSD language, to design and debug DNA strand displacement components and to investigate their kinetics. We show how our techniques can be used to identify design flaws and to evaluate the merits of contrasting design decisions, even on devices comprising relatively few inputs. We then demonstrate the use of these components to construct a DNA strand displacement device for approximate majority voting. Finally, we discuss some of the challenges and possible directions for applying these methods to more complex designs.
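
    The abstract's kinetic analyses are performed with the PRISM model checker on DSD-generated models; as a loose, illustrative stand-in, a stochastic (Gillespie-style) simulation of a single toy displacement reaction A + B -> C shows the kind of kinetic quantity (time to completion) such analyses produce. The species counts and rate constant below are made up, and this sketch is not PRISM's numerical model checking.

    ```python
    # Gillespie-style simulation of one bimolecular reaction A + B -> C.
    import numpy as np

    rng = np.random.default_rng(2)

    def completion_time(n_a=20, n_b=20, k=1e-3):
        """Time until species A is fully consumed (arbitrary time units)."""
        t, a, b = 0.0, n_a, n_b
        while a > 0 and b > 0:
            propensity = k * a * b              # reaction rate at current counts
            t += rng.exponential(1.0 / propensity)
            a -= 1
            b -= 1
        return t

    times = [completion_time() for _ in range(2000)]
    print(f"mean completion time = {np.mean(times):.1f}, "
          f"95th percentile = {np.percentile(times, 95):.1f}")
    ```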

  13. Steady-State Analysis of Genetic Regulatory Networks Modelled by Probabilistic Boolean Networks

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2006-04-01

    Probabilistic Boolean networks (PBNs) have recently been introduced as a promising class of models of genetic regulatory networks. The dynamic behaviour of PBNs can be analysed in the context of Markov chains. A key goal is the determination of the steady-state (long-run) behaviour of a PBN by analysing the corresponding Markov chain. This allows one to compute the long-term influence of a gene on another gene or determine the long-term joint probabilistic behaviour of a few selected genes. Because matrix-based methods quickly become prohibitive for large networks, we propose the use of Monte Carlo methods. However, the rate of convergence to the stationary distribution becomes a central issue. We discuss several approaches for determining the number of iterations necessary to achieve convergence of the Markov chain corresponding to a PBN. Using a recently introduced method based on the theory of two-state Markov chains, we illustrate the approach on a sub-network designed from human glioma gene expression data and determine the joint steady-state probabilities for several groups of genes.
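
    A toy version of the Monte Carlo approach can be written directly: simulate a small PBN, discard a burn-in, and estimate the steady-state distribution from visit frequencies. The two-gene network below is hypothetical (not the glioma sub-network), and no formal two-state Markov chain convergence diagnostic is applied here.

    ```python
    # Monte Carlo steady-state estimation for a hypothetical two-gene PBN:
    # each gene has two candidate Boolean functions with selection probabilities.
    import numpy as np

    rng = np.random.default_rng(1)

    f_gene0 = [(lambda x: x[1], 0.7), (lambda x: 1 - x[1], 0.3)]
    f_gene1 = [(lambda x: x[0] & x[1], 0.6), (lambda x: x[0] | x[1], 0.4)]

    def step(state):
        """Synchronously update both genes with randomly selected functions."""
        new = []
        for funcs in (f_gene0, f_gene1):
            fs, ps = zip(*funcs)
            f = fs[rng.choice(len(fs), p=np.array(ps))]
            new.append(f(state))
        return tuple(new)

    state, counts = (0, 1), np.zeros(4)
    burn_in, n_iter = 1_000, 100_000
    for i in range(burn_in + n_iter):
        state = step(state)
        if i >= burn_in:
            counts[state[0] * 2 + state[1]] += 1
    print("steady-state estimate for states 00, 01, 10, 11:", counts / n_iter)
    ```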

  14. Comparative analysis of safety related site characteristics

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, Johan (ed.)

    2010-12-15

    This document presents a comparative analysis of site characteristics related to long-term safety for the two candidate sites for a final repository for spent nuclear fuel in Forsmark (municipality of Oesthammar) and in Laxemar (municipality of Oskarshamn) from the point of view of site selection. The analyses are based on the updated site descriptions of Forsmark /SKB 2008a/ and Laxemar /SKB 2009a/, together with associated updated repository layouts and designs /SKB 2008b and SKB 2009b/. The basis for the comparison is thus two equally and thoroughly assessed sites. However, the analyses presented here are focussed on differences between the sites rather than evaluating them in absolute terms. The document serves as a basis for the site selection, from the perspective of long-term safety, in SKB's application for a final repository. A full evaluation of safety is made for a repository at the selected site in the safety assessment SR-Site /SKB 2011/, referred to as SR-Site main report in the following

  15. The current state analysis of the problem of probabilistic ice interaction with the seabed and underwater pipelines

    Energy Technology Data Exchange (ETDEWEB)

    Bekker, A.T.; Sabodash, O.A.; Seliverstov, V.I. [Far-Eastern National Technical Univ., Vladivostok (Russian Federation)

    2008-07-01

    This presentation addressed the challenge of designing an underwater pipeline route in Arctic regions where complex ice conditions prevail. In particular, it identified the following problems using a probabilistic approach: optimum value of pipeline installation depth; optimum variant of the pipeline layout; reliability target of the pipeline; estimation of hummock and scour parameters; the physical process of hummock penetration into the soil; soil deformation under scours; and actual bottom topography of the water area and soils. The behaviour of pipelines under sea ice was analyzed and recommendations were presented for designing offshore buried structures in Arctic regions. Simulation modeling of scour depths, or hummock penetration into soil, was presented along with actual ice/ground conditions and underwater topography. The analysis revealed that there is a high probability that underwater pipelines and other buried engineered structures in these areas could be damaged by drifting ice formations. The problem of probabilistic description of ice impacts on the sea bottom and underwater pipelines was analyzed. The results may be useful in estimating the normative burial depth of underwater pipelines in Arctic seas. 11 refs., 1 tab., 3 figs.

  16. Code development incorporating environmental, safety, and economic aspects of fusion reactors (FY 89--91)

    Energy Technology Data Exchange (ETDEWEB)

    Ho, S.K.; Fowler, T.K.; Holdren, J.P. (eds.)

    1991-11-01

    This report discusses the following aspects of fusion reactors: Activation Analysis; Tritium Inventory; Environmental and Safety Indices and Their Graphical Representation; Probabilistic Risk Assessment (PRA) and Decision Analysis; Plasma Burn Control -- Application to ITER; and Other Applications.

  17. Wireless capsule endoscopy video segmentation using an unsupervised learning approach based on probabilistic latent semantic analysis with scale invariant features.

    Science.gov (United States)

    Shen, Yao; Guturu, Parthasarathy Partha; Buckles, Bill P

    2012-01-01

    Since wireless capsule endoscopy (WCE) is a novel technology for recording videos of the digestive tract of a patient, the problem of segmenting the WCE video of the digestive tract into subvideos corresponding to the entrance, stomach, small intestine, and large intestine regions is not well addressed in the literature. The select few papers addressing this problem follow supervised learning approaches that presume the availability of a large database of correctly labeled training samples. Considering the difficulties in procuring the sizable WCE training data sets needed for achieving high classification accuracy, we introduce in this paper an unsupervised learning approach that employs the Scale Invariant Feature Transform (SIFT) for extraction of local image features and the probabilistic latent semantic analysis (pLSA) model used in linguistic content analysis for data clustering. Results of experimentation indicate that this method compares well in classification accuracy with the state-of-the-art supervised classification approaches to WCE video segmentation.

  18. On the use of faults and background seismicity in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    Science.gov (United States)

    Selva, Jacopo; Lorito, Stefano; Basili, Roberto; Tonini, Roberto; Tiberti, Mara Monica; Romano, Fabrizio; Perfetti, Paolo; Volpe, Manuela

    2017-04-01

    Most SPTHA studies and applications rely on several working assumptions: i) the - mostly offshore - tsunamigenic faults are sufficiently well known; ii) subduction zone earthquakes dominate the hazard; and iii) their location and geometry are sufficiently well constrained. Hence, a probabilistic model is constructed as regards the magnitude-frequency distribution and sometimes the slip distribution of earthquakes occurring on the assumed known faults. Then, tsunami scenarios are usually constructed for all earthquake locations, sizes, and slip distributions included in the probabilistic model, through deterministic numerical modelling of tsunami generation, propagation and impact on realistic bathymetries. Here, we adopt a different approach (Selva et al., GJI, 2016) that releases some of the above assumptions, considering that i) non-subduction earthquakes may also contribute significantly to SPTHA, depending on the local tectonic context; ii) not all offshore faults are known or sufficiently well constrained; and iii) the faulting mechanism of future earthquakes cannot be considered strictly predictable. This approach uses as much information as possible from known faults which, depending on the amount of available information and on the local tectonic complexity, among other things, are either modelled as Predominant Seismicity (PS) or as Background Seismicity (BS). PS is used when it is possible to assume a sufficiently known geometry and mechanism (e.g. for the main subduction zones). Conversely, within the BS approach, information on faults is merged with that on past seismicity, dominant stress regime, and tectonic characterisation, to determine a probability density function for the faulting mechanism. To illustrate the methodology and its impact on the hazard estimates, we present an application in the NEAM region (Northeast Atlantic, Mediterranean and connected seas), initially designed during the ASTARTE project and now applied for the

  19. Single cell analysis reveals the stochastic phase of reprogramming to pluripotency is an ordered probabilistic process.

    Science.gov (United States)

    Chung, Kyung-Min; Kolling, Frederick W; Gajdosik, Matthew D; Burger, Steven; Russell, Alexander C; Nelson, Craig E

    2014-01-01

    Despite years of research, the reprogramming of human somatic cells to pluripotency remains a slow, inefficient process, and a detailed mechanistic understanding of reprogramming remains elusive. Current models suggest reprogramming to pluripotency occurs in two phases: a prolonged stochastic phase followed by a rapid deterministic phase. In this paradigm, the early stochastic phase is marked by the random and gradual expression of pluripotency genes and is thought to be a major rate-limiting step in the successful generation of induced Pluripotent Stem Cells (iPSCs). Recent evidence suggests that the epigenetic landscape of the somatic cell is gradually reset during the stochastic phase, but it is known neither how this occurs nor what rate-limiting steps control progress through the stochastic phase. A precise understanding of gene expression dynamics in the stochastic phase is required in order to answer these questions. Moreover, a precise model of this complex process will enable the measurement and mechanistic dissection of treatments that enhance the rate or efficiency of reprogramming to pluripotency. Here we use single-cell transcript profiling, FACS and mathematical modeling to show that the stochastic phase is an ordered probabilistic process with independent gene-specific dynamics. We also show that partially reprogrammed cells infected with OSKM follow two trajectories: a productive trajectory toward increasingly ESC-like expression profiles or an alternative trajectory leading away from both the fibroblast and ESC states. These two pathways are distinguished by the coordinated expression of a small group of chromatin modifiers in the productive trajectory, supporting the notion that chromatin remodeling is essential for successful reprogramming. These are the first results to show that the stochastic phase of reprogramming in human fibroblasts is an ordered, probabilistic process with gene-specific dynamics and to provide a precise

  20. Analysis of lesions in patients with unilateral tactile agnosia using cytoarchitectonic probabilistic maps.

    Science.gov (United States)

    Hömke, Lars; Amunts, Katrin; Bönig, Lutz; Fretz, Christian; Binkofski, Ferdinand; Zilles, Karl; Weder, Bruno

    2009-05-01

    We propose a novel methodological approach to lesion analyses involving high-resolution MR images in combination with probabilistic cytoarchitectonic maps. 3D-MR images of the whole brain and the manually segmented lesion mask are spatially normalized to the reference brain of a stereotaxic probabilistic cytoarchitectonic atlas using a multiscale registration algorithm based on an elastic model. The procedure is demonstrated in three patients suffering from apperceptive tactile agnosia of the right hand due to chronic infarction of the left parietal cortex. Patient 1 presents a lesion in areas of the postcentral sulcus, Patient 3 in areas of the superior parietal lobule and adjacent intraparietal sulcus, and Patient 2 lesions in both regions. On the basis of neurobehavioral data, we conjectured degradation of sequential elementary sensory information processing within the postcentral gyrus, impeding texture recognition in Patients 1 and 2, and disturbed kinaesthetic information processing in the posterior parietal lobe, causing degraded shape recognition in Patients 2 and 3. The involvement of Brodmann areas 4a, 4p, 3a, 3b, 1, 2, and areas IP1 and IP2 of the intraparietal sulcus was assessed in terms of the voxel overlap between the spatially transformed lesion masks and the 50%-isocontours of the cytoarchitectonic maps. The disruption of the critical cytoarchitectonic areas and the impaired subfunctions, texture and shape recognition, relate as conjectured above. We conclude that the proposed method represents a promising approach to hypothesis-driven lesion analyses, yielding lesion-function correlates based on a cytoarchitectonic model. Finally, the lesion-function correlates are validated by functional imaging reference data. (c) 2008 Wiley-Liss, Inc.
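
    The overlap measure described, the fraction of lesion voxels falling inside a map's 50% isocontour, reduces to a simple voxel count once both volumes are in the same space. The sketch below uses random synthetic arrays in place of real lesion masks and cytoarchitectonic maps.

    ```python
    # Voxel overlap between a binary lesion mask and the 50% isocontour of a
    # probabilistic map; synthetic volumes stand in for registered MR data.
    import numpy as np

    rng = np.random.default_rng(9)
    prob_map = rng.random((20, 20, 20))           # map values in [0, 1]
    lesion = rng.random((20, 20, 20)) > 0.9       # hypothetical binary lesion mask

    iso50 = prob_map >= 0.5                       # 50%-isocontour region
    overlap = np.logical_and(lesion, iso50).sum() / lesion.sum()
    print(f"fraction of lesion voxels within the 50% isocontour: {overlap:.2f}")
    ```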

  1. Probabilistic analysis of mean-response along-wind induced vibrations on wind turbine towers using wireless network data sensors

    Science.gov (United States)

    Velazquez, Antonio; Swartz, Raymond A.

    2011-04-01

    Wind turbine systems are attracting considerable attention due to concerns regarding global energy consumption as well as sustainability. Advances in wind turbine technology promote the tendency to improve efficiency in the structures that support and produce this renewable power source, tending toward more slender and larger towers, larger gear boxes, and larger, lighter blades. The structural design optimization process must account for uncertainties and nonlinear effects (such as wind-induced vibrations, unmeasured disturbances, and material and geometric variabilities). In this study, a probabilistic monitoring approach is developed that measures the response of the turbine tower to stochastic loading and estimates peak demand and structural resistance (in terms of serviceability). The proposed monitoring system can provide a real-time estimate of the probability of exceedance of design serviceability conditions based on data collected in-situ. Special attention is paid to wind and aerodynamic characteristics that are intrinsically present (although sometimes neglected in health monitoring analysis) and derived from observations or experiments. In particular, little attention has been devoted to buffeting, which is usually non-catastrophic but directly impacts the serviceability of the operating wind turbine. As a result, modal-based analysis methods for the study of flutter instability and buffeting response have been successfully applied to the assessment of the susceptibility of high-rise slender structures, including wind turbine towers. A detailed finite element model has been developed to generate data (calibrated to published experimental and analytical results). Risk assessment is performed for the effects of along-wind forces in a framework of quantitative risk analysis. Both structural resistance and wind load demands were considered probabilistic, with the latter assessed by dynamic analyses.
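
    The probability-of-exceedance estimate at the heart of such a monitoring scheme can be sketched in closed form when demand and resistance are both modelled as independent lognormal variables. The deflection statistics below are hypothetical placeholders, not the paper's finite element results.

    ```python
    # P(demand > resistance) for independent lognormal demand and resistance:
    # ln(D) - ln(R) is normal, so the exceedance probability is Phi(-beta).
    import numpy as np
    from scipy import stats

    mu_d, cov_d = 0.30, 0.35      # mean response demand [m] and its COV
    mu_r, cov_r = 0.50, 0.10      # serviceability limit [m] and its COV

    # Lognormal parameters (log-mean, log-std) from mean and COV
    s_d = np.sqrt(np.log(1 + cov_d**2)); m_d = np.log(mu_d) - s_d**2 / 2
    s_r = np.sqrt(np.log(1 + cov_r**2)); m_r = np.log(mu_r) - s_r**2 / 2

    beta = (m_r - m_d) / np.sqrt(s_d**2 + s_r**2)   # reliability index
    print(f"beta = {beta:.2f}, P(exceedance) = {stats.norm.sf(beta):.3e}")
    ```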

  2. Probabilistic Unawareness

    Directory of Open Access Journals (Sweden)

    Mikaël Cozic

    2016-11-01

    The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness) theorem that reunites the two strands, modal and probabilistic, of doxastic logic.

  3. A decision support system for fusion of hard and soft sensor information based on probabilistic latent semantic analysis technique

    Science.gov (United States)

    Shirkhodaie, Amir; Elangovan, Vinayak; Alkilani, Amjad; Habibi, Mohammad

    2013-05-01

    This paper presents an ongoing effort towards the development of an intelligent Decision-Support System (iDSS) for the fusion of information from multiple sources consisting of data from hard (physical sensor) and soft (textual) sources. Primarily, this paper defines a taxonomy of decision support systems for latent semantic data mining from heterogeneous data sources. A Probabilistic Latent Semantic Analysis (PLSA) approach is proposed for latent semantic concept search from heterogeneous data sources. An architectural model for generating semantic annotation of multi-modality sensors in a modified Transducer Markup Language (TML) is described. A method for TML message fusion is discussed for alignment and integration of spatiotemporally correlated and associated physical sensory observations. Lastly, experimental results which exploit the fusion of soft/hard sensor sources with the support of iDSS are discussed.

  4. Health risk assessment of heavy metals through the consumption of food crops fertilized by biosolids: A probabilistic-based analysis

    Energy Technology Data Exchange (ETDEWEB)

    Hosseini Koupaie, E., E-mail: ehssan.hosseini.k@gmail.com; Eskicioglu, C., E-mail: cigdem.eskicioglu@ubc.ca

    2015-12-30

    Highlights: • No potential health risk from land application of the regional biosolids. • More realistic risk assessment via the probabilistic approach than the deterministic one. • The total hazard index increases with increasing fertilizer land application rate. • Significant effect of long-term biosolids land application on the hazard index. • Greater contribution of rice ingestion than vegetable ingestion to the hazard index. - Abstract: The objective of this study was to perform a probabilistic risk analysis (PRA) to assess the health risk of Cadmium (Cd), Copper (Cu), and Zinc (Zn) through the consumption of food crops grown on farm lands fertilized by biosolids. The risk analysis was conducted using 8 years of historical heavy metal data (2005–2013) for the municipal biosolids generated by a nearby treatment facility, considering one-time and long-term biosolids land application scenarios over a range of 5–100 t/ha fertilizer application rates. The 95th percentile of the hazard index (HI) increased from 0.124 to 0.179 when the rate of fertilizer application increased from 5 to 100 t/ha for one-time biosolids land application. The HI for long-term biosolids land application was also found to be 1.3 and 1.9 times greater than that of one-time land application at fertilizer application rates of 5 and 100 t/ha, respectively. Rice ingestion contributed more to the HI than vegetable ingestion. Cd and Cu were also found to contribute most to the health risk associated with vegetable and rice ingestion, respectively. Results indicated no potential risk to human health even under the long-term biosolids land application scenario at a 100 t/ha fertilizer application rate.
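
    A hedged sketch of the hazard-index calculation that underlies such a PRA: HI is the sum over metals of chronic daily intake divided by the oral reference dose, Monte Carlo sampled over uncertain crop concentrations. The concentration distributions, intake rate and body weight below are hypothetical, not the study's data; the reference doses are standard U.S. EPA oral RfDs.

    ```python
    # Probabilistic hazard index: HI = sum over metals of CDI / RfD,
    # CDI = concentration * intake rate / body weight. Hypothetical inputs.
    import numpy as np

    rng = np.random.default_rng(7)
    n = 50_000
    conc = {"Cd": rng.lognormal(np.log(0.05), 0.4, n),   # crop conc. [mg/kg]
            "Cu": rng.lognormal(np.log(5.0), 0.3, n),
            "Zn": rng.lognormal(np.log(20.0), 0.3, n)}
    rfd = {"Cd": 1e-3, "Cu": 4e-2, "Zn": 3e-1}           # oral RfDs [mg/kg-day]
    ir, bw = 0.3, 70.0                                   # intake [kg/day], weight [kg]

    hi = sum((conc[m] * ir / bw) / rfd[m] for m in conc)
    print(f"median HI = {np.median(hi):.3f}, "
          f"95th percentile = {np.percentile(hi, 95):.3f}")
    ```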

  5. Concept analysis of safety climate in healthcare providers.

    Science.gov (United States)

    Lin, Ying-Siou; Lin, Yen-Chun; Lou, Meei-Fang

    2017-06-01

    To report an analysis of the concept of safety climate in healthcare providers. Compliance with safe work practices is essential to patient safety and care outcomes. Analysing the concept of safety climate from the perspective of healthcare providers could improve understanding of the correlations between safety climate and healthcare provider compliance with safe work practices, thus enhancing the quality of patient care. Concept analysis. The electronic databases of CINAHL, MEDLINE, PubMed and Web of Science were searched for literature published between 1995-2015. Searches used the keywords 'safety climate' or 'safety culture' with 'hospital' or 'healthcare'. The concept analysis method of Walker and Avant was used to analyse safety climate from the perspective of healthcare providers. Three attributes define safety climate from this perspective: (1) the creation of a safe working environment by senior management in healthcare organisations; (2) a shared perception among healthcare providers about the safety of their work environment; and (3) the effective dissemination of safety information. Antecedents included the characteristics of healthcare providers and healthcare organisations as a whole, and the types of work in which they are engaged. Consequences consisted of safety performance and safety outcomes. Most studies developed and assessed survey tools of safety climate or safety culture, with a minority consisting of interventional measures for improving safety climate. More prospective studies are needed to create interventional measures for improving the safety climate of healthcare providers. This study is provided as a reference for use in developing multidimensional safety climate assessment tools and interventional measures. The values healthcare teams emphasise with regard to safety can serve to improve safety performance. Having an understanding of the concept of and interventional measures for safety climate allows healthcare providers to ensure the safety of their

  6. Probabilistic Graphical Models for the Analysis and Synthesis of Musical Audio

    Science.gov (United States)

    Hoffmann, Matthew Douglas

    Content-based Music Information Retrieval (MIR) systems seek to automatically extract meaningful information from musical audio signals. This thesis applies new and existing generative probabilistic models to several content-based MIR tasks: timbral similarity estimation, semantic annotation and retrieval, and latent source discovery and separation. In order to estimate how similar two songs sound to one another, we employ a Hierarchical Dirichlet Process (HDP) mixture model to discover a shared representation of the distribution of timbres in each song. Comparing songs under this shared representation yields better query-by-example retrieval quality and scalability than previous approaches. To predict what tags are likely to apply to a song (e.g., "rap," "happy," or "driving music"), we develop the Codeword Bernoulli Average (CBA) model, a simple and fast mixture-of-experts model. Despite its simplicity, CBA performs at least as well as state-of-the-art approaches at automatically annotating songs and finding the songs in a database to which a given tag most applies. Finally, we address the problem of latent source discovery and separation by developing two Bayesian nonparametric models, the Shift-Invariant HDP and Gamma Process NMF. These models allow us to discover what sounds (e.g. bass drums, guitar chords, etc.) are present in a song or set of songs and to isolate or suppress individual sources. These models' ability to decide how many latent sources are necessary to model the data is particularly valuable in this application, since it is impossible to guess a priori how many sounds will appear in a given song or set of songs. Once they have been fit to data, probabilistic models can also be used to drive the synthesis of new musical audio, both for creative purposes and to qualitatively diagnose what information a model does and does not capture. We also adapt the SIHDP model to create new versions of input audio with arbitrary sample sets, for example, to create

  7. Probabilistic seismic hazard maps from seismicity patterns analysis: the Iberian Peninsula case

    Directory of Open Access Journals (Sweden)

    A. Jiménez

    2004-01-01

    Earthquake prediction is a main topic in seismology. Here, the goal is to know the correlation between the seismicity at a certain place at a given time and the seismicity at the same place at a following interval of time. There is no way to make exact predictions, but one can ask about the causality relations between the seismic characteristics of one time interval and another in a region. In this paper, a new approach to this kind of study is presented. Tools which include cellular automata theory and Shannon's entropy are used. First, the catalogue is divided into time intervals, and the region into cells. The activity or inactivity of each cell at a certain time is described using an energy criterion; thus a pattern which evolves over time is given. The aim is to find the rules of the stochastic cellular automaton which best fits the evolution of the pattern. The neighborhood utilized is the cross template (CT). A grid search is made to choose the best model, with the mutual information between the different times as the function to be maximized. This function depends on the size of the cells, β, and on the interval of time, τ, considered for studying the activity of a cell. With these β and τ, a set of probabilities which characterizes the evolution rules is calculated, giving a probabilistic approach to the spatiotemporal evolution of the region. The sample catalogue for the Iberian Peninsula covers 1970 to 2001. The results point out that the seismic activity must be deduced not only from the past activity at the same region but also from its surrounding activity. The highest temporal and spatial interaction for the catalogue used is around 3.3 years and 290x165 km2, respectively; if a cell is inactive, it will remain inactive with a high probability; an active cell has around a 60% probability of remaining active in the future. The Probabilistic Seismic Hazard Map obtained marks the main seismic active
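
    The grid-search criterion, mutual information between a cell's activity at one time step and the next, can be illustrated on a synthetic binary activity sequence; the persistence probability used to generate the data below is an arbitrary assumption, not a value from the Iberian catalogue.

    ```python
    # Mutual information (in bits) between a binary activity pattern at time t
    # and at time t+1, estimated from empirical joint frequencies.
    import numpy as np

    def mutual_information(x, y):
        """MI between two equal-length binary sequences."""
        mi = 0.0
        for a in (0, 1):
            for b in (0, 1):
                p_ab = np.mean((x == a) & (y == b))
                if p_ab > 0:
                    mi += p_ab * np.log2(p_ab / (np.mean(x == a) * np.mean(y == b)))
        return mi

    rng = np.random.default_rng(5)
    activity = (rng.random(1000) < 0.4).astype(int)
    # a persistent cell: the next state mostly repeats the current one
    next_state = np.where(rng.random(1000) < 0.8, activity, 1 - activity)
    print(f"MI(t, t+1) = {mutual_information(activity, next_state):.3f} bits")
    ```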

  8. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  9. Multilevel analysis in road safety research.

    Science.gov (United States)

    Dupont, Emmanuelle; Papadimitriou, Eleonora; Martensen, Heike; Yannis, George

    2013-11-01

    Hierarchical structures in road safety data are receiving increasing attention in the literature and multilevel (ML) models are proposed for appropriately handling the resulting dependences among the observations. However, so far no empirical synthesis exists of the actual added value of ML modelling techniques as compared to other modelling approaches. This paper summarizes the statistical and conceptual background and motivations for multilevel analyses in road safety research. It then provides a review of several ML analyses applied to aggregate and disaggregate (accident) data. In each case, the relevance of ML modelling techniques is assessed by examining whether ML model formulations (i) allow improving the fit of the model to the data, (ii) allow identifying and explaining random variation at specific levels of the hierarchy considered, and (iii) yield different (more correct) conclusions than single-level model formulations with respect to the significance of the parameter estimates. The evidence reviewed offers different conclusions depending on whether the analysis concerns aggregate data or disaggregate data. In the first case, the application of ML analysis techniques appears straightforward and relevant. The studies based on disaggregate accident data, on the other hand, offer mixed findings: computational problems can be encountered, and ML applications are not systematically necessary. The general recommendation concerning disaggregate accident data is to proceed to a preliminary investigation of the necessity of ML analyses and of the additional information to be expected from their application. Copyright © 2013 Elsevier Ltd. All rights reserved.

  10. 242-A evaporator safety analysis report

    Energy Technology Data Exchange (ETDEWEB)

    CAMPBELL, T.A.

    1999-05-17

    This report provides a revised safety analysis for the upgraded 242-A Evaporator (the Evaporator). This safety analysis report (SAR) supports the operation of the Evaporator following life extension upgrades and other facility and operations upgrades (e.g., Project B-534) that were undertaken to enhance the capabilities of the Evaporator. The Evaporator has been classified as a moderate-hazard facility (Johnson 1990). The information contained in this SAR is based on information provided by 242-A Evaporator Operations, by Westinghouse Hanford Company, the site maintenance and operations contractor from June 1987 to October 1996, and on the policies of the existing operating contractor, Waste Management Hanford (WMH). Where appropriate, a discussion addressing the US Department of Energy (DOE) Orders applicable to a topic is provided. Operation of the facility will be compared to the operating contractor procedures using appropriate audits and appraisals. The following subsections provide introductory and background information, including a general description of the Evaporator facility and process, a description of the scope of this SAR revision, and a description of the basic changes made to the original SAR.

  11. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research with the objective of modeling and quantifying team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  12. Implementing eigenvector methods/probabilistic neural networks for analysis of EEG signals.

    Science.gov (United States)

    Ubeyli, Elif Derya

    2008-11-01

    A new approach based on the implementation of a probabilistic neural network (PNN) is presented for the classification of electroencephalogram (EEG) signals. In practical applications of pattern recognition, there are often diverse features extracted from raw data which need to be recognized. Because of the importance of making the right decision, the present work searches for better classification procedures for EEG signals. Decision making was performed in two stages: feature extraction by eigenvector methods and classification using classifiers trained on the extracted features. The aim of the study is the classification of EEG signals by the combination of eigenvector methods and the PNN. The purpose is to determine an optimum classification scheme for this problem and also to infer clues about the extracted features. The present research demonstrated that the power levels of the power spectral density (PSD) estimates obtained by the eigenvector methods are features which represent the EEG signals well, and that the PNN trained on these features achieved high classification accuracies.
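
    A PNN is essentially a Parzen-window classifier: each class scores a test vector by the average Gaussian kernel response over that class's training vectors, and the highest-scoring class wins. The sketch below uses random toy feature vectors in place of eigenvector-method PSD features, and the kernel width is an arbitrary choice.

    ```python
    # Minimal probabilistic neural network (Parzen-window) classifier.
    import numpy as np

    def pnn_predict(X_train, y_train, X_test, sigma=0.5):
        """Assign each test vector to the class with the largest average
        Gaussian kernel response over that class's training vectors."""
        classes = np.unique(y_train)
        scores = []
        for c in classes:
            Xc = X_train[y_train == c]
            d2 = ((X_test[:, None, :] - Xc[None, :, :]) ** 2).sum(-1)
            scores.append(np.exp(-d2 / (2 * sigma**2)).mean(axis=1))
        return classes[np.argmax(scores, axis=0)]

    rng = np.random.default_rng(3)
    X = np.vstack([rng.normal(0.0, 1.0, (50, 4)),      # toy class-0 features
                   rng.normal(2.0, 1.0, (50, 4))])     # toy class-1 features
    y = np.array([0] * 50 + [1] * 50)
    print(pnn_predict(X, y, rng.normal(2.0, 1.0, (5, 4))))
    ```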

  13. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 1: Main report

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others

    1997-06-01

    This volume is the first of a two-volume document that summarizes a joint project conducted by the US Nuclear Regulatory Commission and the European Commission to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This document reports on an ongoing project to assess uncertainty in the MACCS and COSYMA calculations for the offsite consequences of radionuclide releases by hypothetical nuclear power plant accidents. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain variables that affect calculations of offsite consequences. The expert judgment elicitation procedure and its outcomes are described in these volumes. Other panels were formed to consider uncertainty in other aspects of the codes. Their results are described in companion reports. Volume 1 contains background information and a complete description of the joint consequence uncertainty study. Volume 2 contains appendices that include (1) a summary of the MACCS and COSYMA consequence codes, (2) the elicitation questionnaires and case structures for both panels, (3) the rationales and results for the panels on soil and plant transfer and animal transfer, (4) short biographies of the experts, and (5) the aggregated results of their responses.

  14. Development of safety analysis technology for integral reactor

    Energy Technology Data Exchange (ETDEWEB)

    Sim, Suk K.; Song, J. H.; Chung, Y. J. and others

    1999-03-01

    Inherent safety features and safety system characteristics of the SMART integral reactor are investigated in this study. The performance and safety of the SMART conceptual design have been evaluated and confirmed through performance and safety analyses using safety analysis system codes as well as a preliminary performance and safety analysis methodology. SMART design basis events and their acceptance criteria are identified to develop a preliminary PIRT for the SMART integral reactor. Using the preliminary PIRT, a set of experimental programs for thermal hydraulic separate effect tests and integral effect tests was developed for thermal hydraulic model development and system code validation. Safety characteristics as well as safety issues of the integral reactor have been identified during the study, which will be used to resolve the safety issues and guide the regulatory criteria for the integral reactor. The results of the performance and safety analyses performed during the study were fed back into the SMART conceptual design. The performance and safety analysis code systems as well as the preliminary safety analysis methodology developed in this study will be validated as the SMART design evolves. The performance and safety analysis technology developed during the study will be utilized for the SMART basic design development. (author)

  15. Code development incorporating environmental, safety, and economic aspects of fusion reactors (FY 89--91). Final report

    Energy Technology Data Exchange (ETDEWEB)

    Ho, S.K.; Fowler, T.K.; Holdren, J.P. [eds.]

    1991-11-01

    This report discusses the following aspects of fusion reactors: Activation Analysis; Tritium Inventory; Environmental and Safety Indices and Their Graphical Representation; Probabilistic Risk Assessment (PRA) and Decision Analysis; Plasma Burn Control -- Application to ITER; and Other Applications.

  16. Probabilistic accident consequence uncertainty analysis: Food chain uncertainty assessment. Volume 2: Appendices

    Energy Technology Data Exchange (ETDEWEB)

    Brown, J. [National Radiological Protection Board (United Kingdom); Goossens, L.H.J.; Kraan, B.C.P. [Delft Univ. of Technology (Netherlands)] [and others

    1997-06-01

    This volume is the second of a two-volume document that summarizes a joint project by the US Nuclear Regulatory Commission and the Commission of European Communities to assess uncertainties in the MACCS and COSYMA probabilistic accident consequence codes. These codes were developed primarily for estimating the risks presented by nuclear reactors based on postulated frequencies and magnitudes of potential accidents. This two-volume report, which examines mechanisms and uncertainties of transfer through the food chain, is the first in a series of five such reports. A panel of sixteen experts was formed to compile credible and traceable uncertainty distributions for food chain transfer that affect calculations of offsite radiological consequences. Seven of the experts reported on transfer into the food chain through soil and plants, nine reported on transfer via food products from animals, and two reported on both. The expert judgment elicitation procedure and its outcomes are described in these volumes. This volume contains seven appendices. Appendix A presents a brief discussion of the MACCS and COSYMA model codes. Appendix B is the structure document and elicitation questionnaire for the expert panel on soils and plants. Appendix C presents the rationales and responses of each of the members of the soils and plants expert panel. Appendix D is the structure document and elicitation questionnaire for the expert panel on animal transfer. The rationales and responses of each of the experts on animal transfer are given in Appendix E. Brief biographies of the food chain expert panel members are provided in Appendix F. Aggregated results of expert responses are presented in graph format in Appendix G.

  17. Assessment of climate change impacts on climate variables using probabilistic ensemble modeling and trend analysis

    Science.gov (United States)

    Safavi, Hamid R.; Sajjadi, Sayed Mahdi; Raghibi, Vahid

    2017-10-01

    Water resources in snow-dependent regions have undergone significant changes due to climate change. Snow measurements in these regions have revealed alarming declines in snowfall over the past few years. The Zayandeh-Rud River in central Iran chiefly depends on winter precipitation falling as snow to supply water from the wet, high Zagros Mountains to the downstream, (semi-)arid, low-lying lands. In this study, the historical records (baseline: 1971-2000) of climate variables (temperature and precipitation) in the wet region were used to construct a probabilistic ensemble model from 15 GCMs in order to forecast future trends and changes, while the Long Ashton Research Station Weather Generator (LARS-WG) was utilized to project climate variables under the A2 and B1 scenarios to a future period (2015-2044). Since future snow water equivalent (SWE) forecasts by GCMs were not available for the study area, an artificial neural network (ANN) was implemented to build a relationship between climate variables and snow water equivalent for the baseline period and thereby estimate future snowfall amounts. As a last step, homogeneity and trend tests were performed to evaluate the robustness of the data series, and changes were examined to detect past and future variations. Results indicate different characteristics of the climate variables at upstream stations. A shift is observed in the type of precipitation from snow to rain, as well as in its quantities across the subregions. The key role in these shifts, and in subsequent side effects such as water losses, is played by temperature.
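
    The ANN step can be sketched with a small regression network that maps temperature and precipitation to SWE. The training data below are synthetic stand-ins for the station records and LARS-WG output, and the network size is an arbitrary choice rather than the study's configuration.

    ```python
    # Sketch: fit a small neural network mapping (temperature, precipitation)
    # to snow water equivalent, then query it for a projected climate input.
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(11)
    temp = rng.normal(-2.0, 3.0, 300)        # winter mean temperature [C]
    precip = rng.gamma(4.0, 25.0, 300)       # winter precipitation [mm]
    # synthetic "truth": colder and wetter conditions give more SWE
    swe = np.maximum(0.0, 0.6 * precip - 15.0 * temp + rng.normal(0, 20, 300))

    X = np.column_stack([temp, precip])
    model = MLPRegressor(hidden_layer_sizes=(10,), max_iter=5000,
                         random_state=0).fit(X, swe)
    print("predicted SWE [mm] for (T = -4 C, P = 120 mm):",
          model.predict([[-4.0, 120.0]])[0].round(1))
    ```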

  18. Probabilistic aftershock hazard analysis, two case studies in West and Northwest Iran

    Science.gov (United States)

    Ommi, S.; Zafarani, H.

    2017-09-01

    Aftershock hazard maps contain essential information for the search and rescue process and for re-occupation after a main shock. Accordingly, the main purposes of this article are to study the aftershock decay parameters and to estimate the expected high-frequency ground motions (i.e., Peak Ground Acceleration (PGA)) for recent large earthquakes in the Iranian plateau. For this aim, the Ahar-Varzaghan doublet earthquake (August 11, 2012; M{sub N}=6.5, M{sub N}=6.3) and the Ilam (Murmuri) earthquake (August 18, 2014; M{sub N}=6.2) have been selected. The earthquake catalogue has been compiled based on the Gardner and Knopoff (Bull Seismol Soc Am 64(5), 1363-1367, 1974) temporal and spatial windowing technique. The magnitude of completeness, the seismicity parameters (a, b), and the modified Omori law parameters (p, K, c) have been determined for these two earthquakes for the 14, 30, and 60 days after the mainshocks. Also, the temporal changes of the parameters (a, b, p, K, c) have been studied. The aftershock hazard maps for a probability of exceedance of 33% have been computed for the time periods of 14, 30, and 60 days after the Ahar-Varzaghan and Ilam (Murmuri) earthquakes. For calculating the expected PGA of aftershocks, regional and global ground motion prediction equations have been utilized. An amplification factor based on the site classes has also been applied in the calculation of PGA. These aftershock hazard maps show an agreement between the PGAs of large aftershocks and the forecasted PGAs. Also, the significant role of the b parameter in the Ilam (Murmuri) probabilistic aftershock hazard maps has been investigated.
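
    The decay model at the core of such maps is the modified Omori law, n(t) = K / (t + c)^p. The short sketch below evaluates the rate and its closed-form time integral over the 14-, 30-, and 60-day windows used in the paper, with hypothetical K, c, p values rather than the Ahar-Varzaghan or Ilam estimates.

    ```python
    # Modified Omori law and its time integral (expected aftershock counts).
    # K, c, p below are hypothetical illustration values.
    def omori_rate(t, K=100.0, c=0.1, p=1.1):
        """Aftershock rate n(t) = K / (t + c)^p [events/day], t in days."""
        return K / (t + c) ** p

    def expected_count(t1, t2, K=100.0, c=0.1, p=1.1):
        """Integral of n(t) from t1 to t2, valid for p != 1."""
        return K / (1 - p) * ((t2 + c) ** (1 - p) - (t1 + c) ** (1 - p))

    for window in ((0, 14), (0, 30), (0, 60)):
        print(f"expected aftershocks in days {window}: "
              f"{expected_count(*window):.0f}")
    ```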

  19. Influence of weak hip abductor muscles on joint contact forces during normal walking: probabilistic modeling analysis.

    Science.gov (United States)

    Valente, Giordano; Taddei, Fulvia; Jonkers, Ilse

    2013-09-03

    The weakness of hip abductor muscles is related to lower-limb joint osteoarthritis, and joint overloading may increase the risk of disease progression. The relationship between muscle strength, structural joint deterioration and joint loading makes the latter an important parameter in the study of the onset and follow-up of the disease. Since the relationship between hip abductor weakness and joint loading remains an open question, the purpose of this study was to adopt a probabilistic modeling approach to give insights into how weakness of the hip abductor muscles, to the extent that normal gait could remain unaltered, affects ipsilateral joint contact forces. A generic musculoskeletal model was scaled to each healthy subject included in the study, and the maximum force-generating capacity of each hip abductor muscle in the model was perturbed to evaluate how all physiologically possible configurations of hip abductor weakness affected the joint contact forces during walking. In general, the muscular system was able to compensate for abductor weakness. The reduced force-generating capacity of the abductor muscles affected joint contact forces to a mild extent, with 50th percentile mean differences up to 0.5 BW (maximum 1.7 BW). There were greater increases in the peak knee joint loads than in loads at the hip or ankle. The gluteus medius, particularly the anterior compartment, was the abductor muscle with the most influence on hip and knee loads. Further studies should assess whether these increases in joint loading may affect the initiation and progression of osteoarthritis. Copyright © 2013 Elsevier Ltd. All rights reserved.

  20. Advanced Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Technical Exchange Meeting

    Energy Technology Data Exchange (ETDEWEB)

    Smith, Curtis [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2013-09-01

    During FY13, the INL developed an advanced SMR PRA framework which is described in the report Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Technical Framework Specification, INL/EXT-13-28974 (April 2013). This framework considers the following areas: • Probabilistic models to provide information specific to advanced SMRs • Representation of specific SMR design issues such as having co-located modules and passive safety features • Use of modern open-source and readily available analysis methods • Internal and external events resulting in impacts to safety • All-hazards considerations • Methods to support the identification of design vulnerabilities • Mechanistic and probabilistic data needs to support modeling and tools. In order to describe this framework more fully and obtain feedback on the proposed approaches, the INL hosted a technical exchange meeting during August 2013. This report describes the outcomes of that meeting.

  1. Safety analysis of surface haulage accidents

    Energy Technology Data Exchange (ETDEWEB)

    Randolph, R.F.; Boldt, C.M.K.

    1996-12-31

    Research on improving haulage truck safety, started by the U.S. Bureau of Mines, is being continued by its successors. This paper reports the orientation of the renewed research efforts, beginning with an update on accident data analysis, the role of multiple causes in these accidents, and the search for practical methods for addressing the most important causes. Fatal haulage accidents most often involve loss of control or collisions caused by a variety of factors. Lost-time injuries most often involve sprains or strains to the back or multiple body areas, which can often be attributed to rough roads and the shocks of loading and unloading. Research to reduce these accidents includes improved warning systems, shock isolation for drivers, encouraging seatbelt usage, and general improvements to system and task design.

  2. Analysis of road safety management systems in Europe.

    NARCIS (Netherlands)

    Muhlrad, N. Vallet, G. Butler, I. Gitelman, V. Doveh, E. Dupont, E. Thomas, P. Talbot, R. Papadimitriou, E. Yannis, G. Persia, L. Giustiniani, G. Machata, K. & Bax, C.A.

    2014-01-01

    The objective of this paper is the analysis of road safety management in European countries and the identification of “good practice”. A road safety management investigation model was created, based on several “good practice” criteria. Road safety management systems have been thoroughly investigated

  3. Adversarial safety analysis: borrowing the methods of security vulnerability assessments.

    Science.gov (United States)

    Johnston, Roger G

    2004-01-01

    Safety and security share numerous attributes. The author, who heads the (Security) Vulnerability Assessment Team at Los Alamos National Laboratory, therefore argues that techniques used to optimize security might be useful for optimizing safety. There are three main ways to attempt to improve security: security surveys, risk assessment (or "design basis threat"), and vulnerability assessments. The latter is usually the most effective. Vulnerability assessment techniques used to improve security can be applied to safety analysis--even though safety is not ordinarily viewed as having malicious adversaries (other than hazards involving deliberate sabotage). Thinking like a malicious adversary can nevertheless have benefits in identifying safety vulnerabilities. The attributes of an effective safety vulnerability assessment are discussed, and recommendations are offered for how such an adversarial assessment might work. A safety vulnerability assessment can potentially provide new insights, a fresh and vivid perspective on safety hazards, and increased safety awareness.

  4. SRS BEDROCK PROBABILISTIC SEISMIC HAZARD ANALYSIS (PSHA) DESIGN BASIS JUSTIFICATION (U)

    Energy Technology Data Exchange (ETDEWEB)

    (NOEMAIL), R

    2005-12-14

    This represents an assessment of the available Savannah River Site (SRS) hard-rock probabilistic seismic hazard assessments (PSHAs), including PSHAs recently completed, for incorporation in the SRS seismic hazard update. The prior assessment of the SRS seismic design basis (WSRC, 1997) incorporated the results from two PSHAs that were published in 1988 and 1993. Because of the vintage of these studies, an assessment is necessary to establish the value of these PSHAs considering more recently collected data affecting seismic hazards and the availability of more recent PSHAs. This task is consistent with the Department of Energy (DOE) order DOE O 420.1B and DOE guidance document DOE G 420.1-2. Following DOE guidance, the National Map hazard was reviewed and incorporated in this assessment. In addition to the National Map hazard, alternative ground motion attenuation models (GMAMs) are used with the National Map source model to produce alternate hazard assessments for the SRS. These hazard assessments are the basis for the updated hard-rock hazard recommendation made in this report. The development and comparison of hazard based on the National Map models and PSHAs completed using alternate GMAMs provides increased confidence in this hazard recommendation. The alternate GMAMs are EPRI (2004), USGS (2002) and a region-specific model (Silva et al., 2004). Weights of 0.6, 0.3 and 0.1 are recommended for EPRI (2004), USGS (2002) and Silva et al. (2004), respectively. This weighting gives cluster weights of 0.39, 0.29, 0.15 and 0.17 for the 1-corner, 2-corner, hybrid, and Greens-function models, respectively. This assessment is judged to be conservative as compared to WSRC (1997) and incorporates the range of prevailing expert opinion pertinent to the development of seismic hazard at the SRS. The corresponding SRS hard-rock uniform hazard spectra are greater than the design spectra developed in WSRC (1997), which were based on the LLNL (1993) and EPRI (1988) PSHAs.
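
    To make the logic-tree arithmetic above concrete, the short Python sketch below combines three branch hazard curves with the quoted 0.6/0.3/0.1 weights. All exceedance frequencies here are invented placeholders, not values from the SRS assessments; this is a minimal illustration of weight-averaging, not the report's method in full.

      import numpy as np

      # Hypothetical annual frequencies of exceedance at common spectral
      # acceleration levels (g); numbers are placeholders, not SRS data.
      sa_levels    = np.array([0.05, 0.1, 0.2, 0.4, 0.8])
      hazard_epri  = np.array([2e-3, 8e-4, 2e-4, 4e-5, 5e-6])
      hazard_usgs  = np.array([3e-3, 1e-3, 3e-4, 6e-5, 8e-6])
      hazard_silva = np.array([1e-3, 5e-4, 1e-4, 2e-5, 3e-6])

      weights  = [0.6, 0.3, 0.1]   # EPRI (2004), USGS (2002), Silva et al. (2004)
      branches = [hazard_epri, hazard_usgs, hazard_silva]

      # Mean hazard curve: weight-averaged exceedance frequency at each level.
      mean_hazard = sum(w * h for w, h in zip(weights, branches))
      for sa, freq in zip(sa_levels, mean_hazard):
          print(f"Sa = {sa:4.2f} g : annual exceedance frequency = {freq:.2e}")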

  5. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    … difficulties of ambiguity and definition show up when attempting to make the transition from a given authorized partial safety factor code to a superior probabilistic code. For any chosen probabilistic code format there is a considerable variation of the reliability level over the set of structures defined. … The last problem must be accepted as the state of the matter, and it seems that it can only be solved pragmatically by standardizing a specific code format as the reference format for constant reliability. By an example this paper illustrates that a presently valid partial safety factor code imposes a quite … is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point …

  6. Nonlinear analysis of NPP safety against the aircraft attack

    Energy Technology Data Exchange (ETDEWEB)

    Králik, Juraj, E-mail: juraj.kralik@stuba.sk [Faculty of Civil Engineering, STU in Bratislava, Radlinského 11, 813 68 Bratislava (Slovakia); Králik, Juraj, E-mail: kralik@fa.stuba.sk [Faculty of Architecture, STU in Bratislava, Námestie Slobody 19, 812 45 Bratislava (Slovakia)

    2016-06-08

    The paper presents the nonlinear probabilistic analysis of the reinforced concrete buildings of a nuclear power plant under aircraft attack. The dynamic load is defined in time on the basis of airplane impact simulations considering the real stiffness, masses, direction and velocity of the flight. The dynamic response is calculated in the system ANSYS using the transient nonlinear analysis solution method. The damage of the concrete wall is evaluated in accordance with the NDRC standard considering the spalling, scabbing and perforation effects. The simple and detailed calculations of the wall damage are compared.

  7. Nonlinear analysis of NPP safety against the aircraft attack

    Science.gov (United States)

    Králik, Juraj; Králik, Juraj

    2016-06-01

    The paper presents the nonlinear probabilistic analysis of the reinforced concrete buildings of a nuclear power plant under aircraft attack. The dynamic load is defined in time on the basis of airplane impact simulations considering the real stiffness, masses, direction and velocity of the flight. The dynamic response is calculated in the system ANSYS using the transient nonlinear analysis solution method. The damage of the concrete wall is evaluated in accordance with the NDRC standard considering the spalling, scabbing and perforation effects. The simple and detailed calculations of the wall damage are compared.

  8. Patient safety work in Sweden: quantitative and qualitative analysis of annual patient safety reports.

    Science.gov (United States)

    Ridelberg, Mikaela; Roback, Kerstin; Nilsen, Per; Carlfjord, Siw

    2016-03-21

    There is widespread recognition of the problem of unsafe care and extensive efforts have been made over the last 15 years to improve patient safety. In Sweden, a new patient safety law obliges the 21 county councils to assemble a yearly patient safety report (PSR). The aim of this study was to describe the patient safety work carried out in Sweden by analysing the PSRs with regard to the structure, process and result elements reported, and to investigate the perceived usefulness of the PSRs as a tool to achieve improved patient safety. The study was based on two sources of data: patient safety reports obtained from county councils in Sweden published in 2014 and a survey of health care practitioners with strategic positions in patient safety work, acting as key informants for their county councils. Answers to open-ended questions were analysed using conventional content analysis. A total of 14 structure elements, 31 process elements and 23 outcome elements were identified. The most frequently reported structure elements were groups devoted to working with antibiotics issues and electronic incident reporting systems. The PSRs were perceived to provide a structure for patient safety work, enhance the focus on patient safety and contribute to learning about patient safety. Patient safety work carried out in Sweden, as described in annual PSRs, features a wide range of structure, process and result elements. According to health care practitioners with strategic positions in the county councils' patient safety work, the PSRs are perceived as useful at various system levels.

  9. Probabilistic model checking analysis of palytoxin effects on cell energy reactions of the Na+/K+-ATPase.

    Science.gov (United States)

    Braz, Fernando A F; Cruz, Jader S; Faria-Campos, Alessandra C; Campos, Sérgio V A

    2013-01-01

    Probabilistic model checking (PMC) is a technique used for the specification and analysis of complex systems. It can be applied directly to biological systems which present these characteristics, including cell transport systems. These systems are structures responsible for exchanging ions through the plasma membrane. Their correct behavior is essential for animal cells, since changes in their behavior are responsible for diseases. In this work, PMC is used to model and analyze the effects of the palytoxin toxin (PTX) interactions with one of these systems. Our model suggests that ATP could inhibit PTX action. Therefore, individuals with ATP deficiencies, such as in brain disorders, may be more susceptible to the toxin. We have also used heat maps to enhance the kinetic model, which is used to describe the system reactions. The map reveals unexpected situations, such as a frequent reaction between unlikely pump states, and hot spots such as likely states and reactions. This type of analysis provides a better understanding of how transmembrane ionic transport systems behave and may lead to the discovery and development of new drugs to treat diseases associated with their incorrect behavior.
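
    For readers unfamiliar with the technique, the sketch below shows the numerical core of one simple probabilistic model checking question: the probability of eventually reaching an absorbing state in a small discrete-time Markov chain. The pump states and transition probabilities are invented for illustration and are not the paper's kinetic model, which PMC tools such as PRISM would analyze from a full model description.

      import numpy as np

      # States: 0 = free pump, 1 = PTX-bound, 2 = ATP-protected (absorbing),
      # 3 = blocked (absorbing). Each row of P sums to 1.
      P = np.array([
          [0.70, 0.20, 0.10, 0.00],
          [0.10, 0.60, 0.00, 0.30],
          [0.00, 0.00, 1.00, 0.00],
          [0.00, 0.00, 0.00, 1.00],
      ])

      transient = [0, 1]
      Q = P[np.ix_(transient, transient)]   # transient-to-transient block
      R = P[np.ix_(transient, [3])]         # transient-to-"blocked" block

      # Absorption probabilities: solve (I - Q) x = R.
      x = np.linalg.solve(np.eye(len(transient)) - Q, R)
      print(f"P(eventually blocked | start in free state) = {x[0, 0]:.3f}")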

  10. Monte Carlo probabilistic sensitivity analysis for patient level simulation models: efficient estimation of mean and variance using ANOVA.

    Science.gov (United States)

    O'Hagan, Anthony; Stevenson, Matt; Madan, Jason

    2007-10-01

    Probabilistic sensitivity analysis (PSA) is required to account for uncertainty in cost-effectiveness calculations arising from health economic models. The simplest way to perform PSA in practice is by Monte Carlo methods, which involves running the model many times using randomly sampled values of the model inputs. However, this can be impractical when the economic model takes appreciable amounts of time to run. This situation arises, in particular, for patient-level simulation models (also known as micro-simulation or individual-level simulation models), where a single run of the model simulates the health care of many thousands of individual patients. The large number of patients required in each run to achieve accurate estimation of cost-effectiveness means that only a relatively small number of runs is possible. For this reason, it is often said that PSA is not practical for patient-level models. We develop a way to reduce the computational burden of Monte Carlo PSA for patient-level models, based on the algebra of analysis of variance. Methods are presented to estimate the mean and variance of the model output, with formulae for determining optimal sample sizes. The methods are simple to apply and will typically reduce the computational demand very substantially.
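
    A minimal sketch of the variance-decomposition idea under invented distributions: with N outer draws of the uncertain parameters and n simulated patients per draw, the between-run variance of the run means is corrected for patient-level Monte Carlo noise, in the spirit of the ANOVA algebra described above. This is not the authors' exact estimator.

      import numpy as np

      rng = np.random.default_rng(1)
      N, n = 200, 500                       # outer parameter draws, patients/draw

      theta = rng.normal(1000.0, 100.0, N)  # uncertain mean cost per patient (toy)
      # Patient-level simulation: individual costs scatter widely around theta.
      patients = rng.normal(theta[:, None], 400.0, (N, n))

      run_means   = patients.mean(axis=1)
      within_var  = patients.var(axis=1, ddof=1).mean()   # patient-level noise
      raw_between = run_means.var(ddof=1)                 # inflated by noise / n
      param_var   = raw_between - within_var / n          # ANOVA-style correction

      print(f"overall mean cost       : {run_means.mean():8.1f}")
      print(f"naive between-run var   : {raw_between:8.1f}")
      print(f"noise-corrected PSA var : {param_var:8.1f}  (true value is 100^2)")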

  11. Characterising Seismic Hazard Input for Analysis Risk to Multi-System Infrastructures: Application to Scenario Event-Based Models and extension to Probabilistic Risk

    Science.gov (United States)

    Weatherill, G. A.; Silva, V.

    2011-12-01

    The potential human and economic cost of earthquakes to complex urban infrastructures has been demonstrated in the most emphatic manner by recent large earthquakes such as those of Haiti (January 2010), Christchurch (September 2010 and February 2011) and Tohoku (March 2011). Consideration of seismic risk for a homogeneous portfolio, such as a single building typology or infrastructure, or independent analyses of separate typologies or infrastructures, are insufficient to fully characterise the potential impacts that arise from inter-connected system failure. Individual elements of each infrastructure may be adversely affected by different facets of the ground motion (e.g. short-period acceleration, long-period displacement, cumulative energy input etc.). The accuracy and efficiency of the risk analysis is dependent on the ability to characterise these multiple features of the ground motion over a spatially distributed portfolio of elements. The modelling challenges raised by this extension to multi-system analysis of risk have been a key focus of the European Project "Systemic Seismic Vulnerability and Risk Analysis for Buildings, Lifeline Networks and Infrastructures Safety Gain (SYNER-G)", and are expected to be developed further within the Global Earthquake Model (GEM). Seismic performance of a spatially distributed infrastructure during an earthquake may be assessed by means of Monte Carlo simulation, in order to incorporate the aleatory variability of the ground motion into the network analysis. Methodologies for co-simulating large numbers of spatially cross-correlated ground motion fields are appraised, and their potential impacts on a spatially distributed portfolio of mixed building typologies assessed using idealised case study scenarios from California and Europe. Potential developments to incorporate correlation and uncertainty in site amplification and geotechnical hazard are also explored. Whilst the initial application of the seismic risk analysis is …
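
    One building block of such Monte Carlo analyses can be sketched compactly: sampling spatially correlated ground-motion residuals at distributed sites. The exponential correlation model and its 10 km correlation length below are assumptions chosen for illustration, not the SYNER-G models.

      import numpy as np

      rng = np.random.default_rng(42)
      n_sites, n_fields = 50, 1000
      coords = rng.uniform(0.0, 30.0, (n_sites, 2))        # site positions (km)

      # Exponential spatial correlation of within-event residuals.
      d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
      corr_length_km = 10.0
      C = np.exp(-3.0 * d / corr_length_km)

      L = np.linalg.cholesky(C + 1e-10 * np.eye(n_sites))  # jitter for stability
      fields = (L @ rng.standard_normal((n_sites, n_fields))).T

      # Each row is one realisation of correlated residuals, to be added to the
      # median log ground motion predicted at each site by an attenuation model.
      print(fields.shape, np.corrcoef(fields[:, 0], fields[:, 1])[0, 1])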

  12. Incorporating Traffic Control and Safety Hardware Performance Functions into Risk-based Highway Safety Analysis

    Directory of Open Access Journals (Sweden)

    Zongzhi Li

    2017-04-01

    Traffic control and safety hardware such as traffic signs, lighting, signals, pavement markings, guardrails, barriers, and crash cushions form an important and inseparable part of highway infrastructure affecting safety performance. Significant progress has been made in recent decades to develop safety performance functions and crash modification factors for site-specific crash predictions. However, the existing models and methods lack rigorous treatments of the safety impacts of time-deteriorating conditions of traffic control and safety hardware. This study introduces a refined method for computing the Safety Index (SI) as a means of crash prediction for a highway segment that incorporates traffic control and safety hardware performance functions into the analysis. The proposed method is applied in a computational experiment using five-year data on nearly two hundred rural and urban highway segments. The root-mean-square error (RMSE), Chi-square, Spearman's rank correlation, and Mann-Whitney U tests are employed for validation.

  13. Multi-criteria decision analysis with probabilistic risk assessment for the management of contaminated ground water

    OpenAIRE

    Khadam, I.; Kaluarachchi, J. J.

    2003-01-01

    Traditionally, environmental decision analysis in subsurface contamination scenarios is performed using cost–benefit analysis. In this paper, we discuss some of the limitations associated with cost–benefit analysis, especially its definition of risk, its definition of cost of risk, and its poor ability to communicate risk-related information. This paper presents an integrated approach for management of contaminated ground water resources using health risk assessment and economic analysis thro...

  14. On the Probabilistic Characterization of Robustness and Resilience

    DEFF Research Database (Denmark)

    Faber, Michael Havbro; Qin, J.; Miraglia, Simona

    2017-01-01

    … in the modeling of robustness and resilience in the research areas of natural disaster risk management, socio-ecological systems and social systems, and we propose a generic decision analysis framework for the modeling and analysis of systems across application areas. The proposed framework extends the concept of direct and indirect consequences and associated risks in probabilistic systems modeling formulated by the Joint Committee on Structural Safety (JCSS) to facilitate the modeling and analysis of resilience in addition to robustness and vulnerability. Moreover, based on recent insights in the modeling …

  15. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  16. Quantum probability for probabilists

    CERN Document Server

    Meyer, Paul-André

    1993-01-01

    In recent years, the classical theory of stochastic integration and stochastic differential equations has been extended to a non-commutative set-up to develop models for quantum noises. The author, a specialist of classical stochastic calculus and martingale theory, tries to provide an introduction to this rapidly expanding field in a way which should be accessible to probabilists familiar with the Ito integral. It can also, on the other hand, provide a means of access to the methods of stochastic calculus for physicists familiar with Fock space analysis.

  17. A parametric Probabilistic Context-Free Grammar for food intake analysis based on continuous meal weight measurements.

    Science.gov (United States)

    Papapanagiotou, Vasileios; Diou, Christos; Langlet, Billy; Ioakimidis, Ioannis; Delopoulos, Anastasios

    2015-08-01

    Monitoring and modification of eating behaviour through continuous meal weight measurements has been successfully applied in clinical practice to treat obesity and eating disorders. For this purpose, the Mandometer, a plate scale, along with video recordings of subjects during the course of single meals, has been used to assist clinicians in measuring relevant food intake parameters. In this work, we present a novel algorithm for automatically constructing a subject's food intake curve using only the Mandometer weight measurements. This eliminates the need for direct clinical observation or video recordings, thus significantly reducing the manual effort required for analysis. The proposed algorithm aims at identifying specific meal related events (e.g. bites, food additions, artifacts), by applying an adaptive pre-processing stage using Delta coefficients, followed by event detection based on a parametric Probabilistic Context-Free Grammar on the derivative of the recorded sequence. Experimental results on a dataset of 114 meals from individuals suffering from obesity or eating disorders, as well as from individuals with normal BMI, demonstrate the effectiveness of the proposed approach.
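
    The pre-processing step lends itself to a short sketch: regression-based Delta coefficients of a plate-weight sequence, with a naive threshold standing in for the grammar-based event detection. The synthetic meal curve and the threshold values are invented; this is not the authors' PCFG stage.

      import numpy as np

      def delta_coefficients(x, N=2):
          """Regression deltas: d_t = sum_n n*(x[t+n]-x[t-n]) / (2*sum_n n^2)."""
          pad = np.pad(x, N, mode="edge")
          denom = 2.0 * sum(n * n for n in range(1, N + 1))
          t = np.arange(N, N + len(x))
          return sum(n * (pad[t + n] - pad[t - n]) for n in range(1, N + 1)) / denom

      # Synthetic plate-weight curve: slow stepwise decrease (bites) plus noise.
      rng = np.random.default_rng(0)
      w = 350.0 - np.repeat(np.arange(20), 15) * 8.0 + rng.normal(0, 0.5, 300)

      d = delta_coefficients(w)
      bites     = np.flatnonzero(d < -1.0)   # sharp negative slope: bite-like
      additions = np.flatnonzero(d > 1.0)    # sharp positive slope: food added
      print(f"{len(bites)} bite-like samples, {len(additions)} addition-like samples")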

  18. A probabilistic analysis reveals fundamental limitations with the environmental impact quotient and similar systems for rating pesticide risks.

    Science.gov (United States)

    Peterson, Robert K D; Schleier, Jerome J

    2014-01-01

    Comparing risks among pesticides has substantial utility for decision makers. However, if rating schemes to compare risks are to be used, they must be conceptually and mathematically sound. We address limitations with pesticide risk rating schemes by examining in particular the Environmental Impact Quotient (EIQ) using, for the first time, a probabilistic analytic technique. To demonstrate the consequences of mapping discrete risk ratings to probabilities, adjusted EIQs were calculated for a group of 20 insecticides in four chemical classes. Using Monte Carlo simulation, adjusted EIQs were determined under different hypothetical scenarios by incorporating probability ranges. The analysis revealed that pesticides that have different EIQs, and therefore different putative environmental effects, actually may be no different when incorporating uncertainty. As structured, the EIQ equation cannot take uncertainty into account and provide reliable quotients of pesticide impact. The EIQ also is inconsistent with the accepted notion of risk as a joint probability of toxicity and exposure. Therefore, our results suggest that the EIQ and other similar schemes be discontinued in favor of conceptually sound schemes to estimate risk that rely on proper integration of toxicity and exposure information.
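
    The Monte Carlo idea can be illustrated in a few lines: map discrete ratings to probability ranges, propagate them through a toy toxicity-times-exposure product, and observe that two pesticides with different point ratings may be statistically indistinguishable. The ranges and ratings below are invented, not the published EIQ coefficients.

      import numpy as np

      rng = np.random.default_rng(7)
      rating_to_range = {1: (0.0, 0.2), 3: (0.2, 0.6), 5: (0.6, 1.0)}

      def sample_rating(r, size):
          lo, hi = rating_to_range[r]
          return rng.uniform(lo, hi, size)

      n = 100_000
      # Two hypothetical insecticides with different discrete ratings.
      eiq_a = sample_rating(3, n) * sample_rating(3, n)   # toxicity x exposure
      eiq_b = sample_rating(5, n) * sample_rating(1, n)

      overlap = np.mean(np.abs(eiq_a - eiq_b) < 0.05)
      print(f"A: {eiq_a.mean():.3f}+/-{eiq_a.std():.3f}  "
            f"B: {eiq_b.mean():.3f}+/-{eiq_b.std():.3f}  "
            f"P(|A-B| < 0.05) = {overlap:.2f}")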

  19. Use of fragile geologic structures as indicators of unexceeded ground motions and direct constraints on probabilistic seismic hazard analysis

    Science.gov (United States)

    Baker, J.W.; Whitney, John W.; Hanks, Thomas C.; Abramson, Norman A.; Board, Mark P.

    2013-01-01

    We present a quantitative procedure for constraining probabilistic seismic hazard analysis results at a given site, based on the existence of fragile geologic structures at that site. We illustrate this procedure by analyzing precarious rocks and undamaged lithophysae at Yucca Mountain, Nevada. The key metric is the probability that the feature would have survived to the present day, assuming that the hazard results are correct. If the fragile geologic structure has an extremely low probability of having survived (which would be inconsistent with the observed survival of the structure), then the calculations illustrate how much the hazard would have to be reduced to result in a nonnegligible survival probability. The calculations are able to consider structures the predicted failure probabilities of which are a function of one or more ground‐motion parameters, as well as structures that either rapidly or slowly evolved to their current state over time. These calculations are the only way to validate seismic hazard curves over long periods of time.
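
    A minimal Poissonian version of the key metric: convolve a hazard curve with a fragility function to obtain an annual failure rate, then compute the probability that the feature survived its age. The toy hazard curve, lognormal fragility parameters, and 10,000-year age below are placeholders, and SciPy is assumed to be available.

      import numpy as np
      from scipy.stats import lognorm

      a = np.linspace(0.05, 2.0, 400)                    # PGA levels (g)
      annual_exceedance = 1e-2 * (a / 0.05) ** -2.0      # toy hazard curve

      # Toy fragility: lognormal probability of toppling given PGA.
      p_fail = lognorm.cdf(a, s=0.4, scale=0.5)          # median 0.5 g, beta 0.4

      # Annual failure rate: integral of fragility times hazard density.
      occurrence_density = -np.gradient(annual_exceedance, a)
      lam = np.trapz(p_fail * occurrence_density, a)

      age_years = 10_000.0                               # assumed age of feature
      print(f"annual failure rate = {lam:.2e}")
      print(f"P(survived {age_years:.0f} years) = {np.exp(-lam * age_years):.3f}")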

  20. Site Specific Probabilistic Seismic Hazard and Risk Analysis for Surrounding Communities of The Geysers Geothermal Development Area

    Science.gov (United States)

    Miah, M.; Hutchings, L. J.; Savy, J. B.

    2014-12-01

    We conduct a probabilistic seismic hazard and risk analysis for induced and tectonic earthquakes for a 50 km radius area centered on The Geysers, California, and for the next ten years. We calculate hazard with both a conventional and a physics-based approach, estimate site-specific hazard, convert hazard to risk of nuisance and damage to structures per year, and map the risk. For the conventional PSHA we assume the past ten years is indicative of the hazard for the next ten years from M… The two approaches give similar results, which is surprising since they were calculated by completely independent means. The conventional approach used the actual catalog of the past ten years of earthquakes to estimate the hazard for the next ten years, while the physics-based approach used geotechnical modeling to calculate the catalog for the next ten years. Similarly, for the conventional PSHA we utilized attenuation relations from past earthquakes recorded at The Geysers to translate the ground motion from the source to the site, while for the physics-based approach we calculated ground motion from simulation of actual earthquake rupture. Finally, the source of the earthquakes was the actual source for the conventional PSHA, whereas we assumed random fractures for the physics-based approach. From all this, we consider the calculation of the conventional approach, based on actual data, to validate the physics-based approach.

  1. Probabilistic analysis of three-player symmetric quantum games played using the Einstein-Podolsky-Rosen-Bohm setting

    Energy Technology Data Exchange (ETDEWEB)

    Iqbal, Azhar [School of Electrical and Electronic Engineering, The University of Adelaide, SA 5005 (Australia); Centre for Advanced Mathematics and Physics, National University of Sciences and Technology, Campus of College of Electrical and Mechanical Engineering, Peshawar Road, Rawalpindi (Pakistan)], E-mail: iqbal@eleceng.adelaide.edu.au; Cheon, Taksu [Kochi University of Technology, Tosa Yamada, Kochi 782-8502 (Japan); Abbott, Derek [School of Electrical and Electronic Engineering, The University of Adelaide, SA 5005 (Australia)

    2008-10-27

    This Letter extends our probabilistic framework for two-player quantum games to the multiplayer case, while giving a unified perspective for both classical and quantum games. Considering joint probabilities in the Einstein-Podolsky-Rosen-Bohm (EPR-Bohm) setting for three observers, we use this setting in order to play general three-player noncooperative symmetric games. We analyze how the peculiar non-factorizable joint probabilities provided by the EPR-Bohm setting can change the outcome of a game, while requiring that the quantum game attains a classical interpretation for factorizable joint probabilities. In this framework, our analysis of the three-player generalized Prisoner's Dilemma (PD) shows that the players can indeed escape from the classical outcome of the game, because of non-factorizable joint probabilities that the EPR setting can provide. This surprising result for three-player PD contrasts strikingly with our earlier result for two-player PD, played in the same framework, in which even non-factorizable joint probabilities do not result in escaping from the classical consequence of the game.

  2. Safety Analysis versus Type Inference with Partial Types

    DEFF Research Database (Denmark)

    Schwartzbach, Michael Ignatieff; Palsberg, Jens

    1992-01-01

    Safety analysis is an algorithm for determining if a term in an untyped lambda calculus with constants is safe, i.e., if it does not cause an error during evaluation. This ambition is also shared by algorithms for type inference. Safety analysis and type inference are based on rather different perspectives, however. Safety analysis is global in that it can only analyze a complete program. In contrast, type inference is local in that it can analyze pieces of a program in isolation. In this paper we prove that safety analysis is sound, relative to both a strict and a lazy operational semantics. We also prove that safety analysis accepts strictly more safe lambda terms than does type inference for simple types. The latter result demonstrates that global program analysis can be more precise than local ones.

  3. Compositional Safety Analysis using Barrier Certificates

    DEFF Research Database (Denmark)

    Sloth, Christoffer; Pappas, George J.; Wisniewski, Rafael

    2012-01-01

    This paper proposes a compositional method for verifying the safety of a dynamical system, given as an interconnection of subsystems. The safety verification is conducted by the use of the barrier certificate method; hence, the contribution of this paper is to show how to obtain compositional conditions for safety verification. We show how to formulate the verification problem as a composition of coupled subproblems, each given for one subsystem. Furthermore, we show how to find the compositional barrier certificates via linear and sum-of-squares programming problems. The proposed method makes it possible to verify the safety of higher-dimensional systems than the method for centrally computed barrier certificates. This is demonstrated by verifying the safety of an emergency shutdown of a wind turbine.

  4. Safety Injection Tank Performance Analysis Using CFD

    Energy Technology Data Exchange (ETDEWEB)

    Cho, Jai Oan; Lee, Jeong Ik; Nietiadi Yohanes Setiawan [KAIST, Daejeon (Korea, Republic of); Addad Yacine [KUSTAR, Abu Dhabi (United Arab Emirates); Bang, Young Seok; Yoo, Seung Hun [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2016-10-15

    This may affect the core cooling capability and threaten the fuel integrity during LOCA situations. However, information on the nitrogen flow rate during discharge is very limited due to the associated experimental measurement difficulties, and these phenomena are hardly reflected in current 1D system codes. In the current study, a CFD analysis is presented which hopefully should allow obtaining a more realistic prediction of the SIT performance, which can then be reflected in 1D system codes to simulate various accident scenarios. Computational Fluid Dynamics (CFD) calculations have so far had limited success in predicting the fluid flow accurately. This study aims to find a better CFD prediction and more accurate modeling to predict the system performance during accident scenarios. The safety injection tank with fluidic device was analyzed using commercial CFD. A fine-resolution grid was used to capture the vortex of the fluidic device. The calculation so far has shown good consistency with the experiment. The calculation should be complete by the conference date and will be thoroughly analyzed for discussion. Once the detailed CFD computation is finished, a small-scale experiment will be conducted for the given conditions. Using the experimental results and the CFD model, physical models can be validated to give more reliable results. The data from CFD and experiments will provide a more accurate K-factor of the fluidic device, which can later be applied in system code inputs.

  5. Applying Fuzzy and Probabilistic Uncertainty Concepts to the Material Flow Analysis of Palladium in Austria

    DEFF Research Database (Denmark)

    Laner, David; Rechberger, Helmut; Astrup, Thomas Fruergaard

    2015-01-01

    Material flow analysis (MFA) is a widely applied tool to investigate resource and recycling systems of metals and minerals. Owing to data limitations and restricted system understanding, MFA results are inherently uncertain. To demonstrate the systematic implementation of uncertainty analysis in ...

  6. Probabilistic Risk Assessment of Hydraulic Fracturing in Unconventional Reservoirs by Means of Fault Tree Analysis: An Initial Discussion

    Science.gov (United States)

    Rodak, C. M.; McHugh, R.; Wei, X.

    2016-12-01

    The development and combination of horizontal drilling and hydraulic fracturing has unlocked unconventional hydrocarbon reserves around the globe. These advances have triggered a number of concerns regarding aquifer contamination and over-exploitation, leading to scientific studies investigating potential risks posed by directional hydraulic fracturing activities. These studies, balanced with potential economic benefits of energy production, are a crucial source of information for communities considering the development of unconventional reservoirs. However, probabilistic quantifications of the overall risk posed by hydraulic fracturing at the system level are rare. Here we present the concept of fault tree analysis to determine the overall probability of groundwater contamination or over-exploitation, broadly referred to as the probability of failure. The potential utility of fault tree analysis for the quantification and communication of risks is approached with a general application. However, the fault tree design is robust and can handle various combinations of region-specific data pertaining to relevant spatial scales, geological conditions, and industry practices where available. All available data are grouped into quantity- and quality-based impacts and sub-divided based on the stage of the hydraulic fracturing process in which the data are relevant, as described by the USEPA. Each stage is broken down into the unique basic events required for failure; for example, to quantify the risk of an on-site spill we must consider the likelihood, magnitude, composition, and subsurface transport of the spill. The structure of the fault tree described above can be used to render a highly complex system of variables into a straightforward equation for risk calculation based on Boolean logic. This project shows the utility of fault tree analysis for the visual communication of the potential risks of hydraulic fracturing activities on groundwater resources.
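
    The Boolean-logic reduction mentioned above has a very small computational core. Under an independence assumption, OR-gates combine as 1 - prod(1 - p) and AND-gates as prod(p); the basic events and probabilities below are invented placeholders, not values from the study.

      def OR(*ps):
          # P(at least one event) under independence: 1 - prod(1 - p_i)
          prod = 1.0
          for p in ps:
              prod *= (1.0 - p)
          return 1.0 - prod

      def AND(*ps):
          # P(all events) under independence: prod(p_i)
          prod = 1.0
          for p in ps:
              prod *= p
          return prod

      # Basic events (annual probabilities, invented for illustration):
      p_spill     = AND(0.05, 0.30)   # spill occurs AND reaches subsurface
      p_well_leak = AND(0.02, 0.10)   # casing failure AND aquifer pathway
      p_depletion = 0.01              # over-exploitation of source water

      # Top event: any quality- or quantity-based impact on groundwater.
      p_failure = OR(p_spill, p_well_leak, p_depletion)
      print(f"P(top event: groundwater impact) = {p_failure:.4f}")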

  7. Software project profitability analysis using temporal probabilistic reasoning; an empirical study with the CASSE framework

    CSIR Research Space (South Africa)

    Balikuddembe, JK

    2009-04-01

    Undertaking adequate risk management by understanding project requirements and ensuring that viable estimates are made on software projects requires extensive application of sophisticated techniques of analysis and interpretation. Informative...

  8. Transit safety & security statistics & analysis 2002 annual report (formerly SAMIS)

    Science.gov (United States)

    2004-12-01

    The Transit Safety & Security Statistics & Analysis 2002 Annual Report (formerly SAMIS) is a compilation and analysis of mass transit accident, casualty, and crime statistics reported under the Federal Transit Administration's (FTA's) National Tr...

  9. Transit safety & security statistics & analysis 2003 annual report (formerly SAMIS)

    Science.gov (United States)

    2005-12-01

    The Transit Safety & Security Statistics & Analysis 2003 Annual Report (formerly SAMIS) is a compilation and analysis of mass transit accident, casualty, and crime statistics reported under the Federal Transit Administration's (FTA's) National Tr...

  10. Cost Benefit Analysis of Consumer Product Safety Standards

    Science.gov (United States)

    Smith, Betty F.; Dardis, Rachel

    1977-01-01

    This paper investigates the role of cost-benefit analysis in evaluating consumer product safety standards and applies such analysis to an evaluation of flammability standards for children's sleepwear. (Editor)

  11. Analytical solutions of linked fault tree probabilistic risk assessments using binary decision diagrams with emphasis on nuclear safety applications [Dissertation 17286]

    Energy Technology Data Exchange (ETDEWEB)

    Nusbaumer, O. P. M

    2007-07-01

    This study is concerned with the quantification of Probabilistic Risk Assessment (PRA) using linked Fault Tree (FT) models. Probabilistic risk assessment of Nuclear Power Plants (NPPs) complements traditional deterministic analysis; it is widely recognized as a comprehensive and structured approach to identify accident scenarios and to derive numerical estimates of the associated risk levels. PRA models as found in the nuclear industry have evolved rapidly. Increasingly, they have been broadly applied to support numerous applications on various operational and regulatory matters. Regulatory bodies in many countries require that a PRA be performed for licensing purposes. PRA has reached the point where it can considerably influence the design and operation of nuclear power plants. However, most of the tools available for quantifying large PRA models are unable to produce analytically correct results. The algorithms of such quantifiers are designed to neglect sequences when their likelihood decreases below a predefined cutoff limit. In addition, the rare event approximation (e.g. Moivre's equation) is typically implemented for the first order, ignoring the success paths and the possibility that two or more events can occur simultaneously. This is only justified in assessments where the probabilities of the basic events are low. When the events in question are failures, the first-order rare event approximation is always conservative, resulting in wrong interpretation of risk importance measures. Advanced NPP PRA models typically include human errors, common cause failure groups, and seismic and phenomenological basic events, where the failure probabilities may approach unity, leading to questionable results. It is accepted that current quantification tools have reached their limits, and that new quantification techniques should be investigated. A novel approach using the mathematical concept of Binary Decision Diagrams (BDDs) is proposed to overcome these limitations.

  12. Probabilistic accident consequence uncertainty analysis: Dispersion and deposition uncertainty assessment. Volume 3, Appendices C, D, E, F, and G

    Energy Technology Data Exchange (ETDEWEB)

    Harper, F.T.; Young, M.L.; Miller, L.A. [Sandia National Labs., Albuquerque, NM (United States)] [and others]

    1995-01-01

    The development of two new probabilistic accident consequence codes, MACCS and COSYMA, was completed in 1990; these codes estimate the risks presented by nuclear installations based on postulated frequencies and magnitudes of potential accidents. In 1991, the US Nuclear Regulatory Commission (NRC) and the Commission of the European Communities (CEC) began a joint uncertainty analysis of the two codes. The objective was to develop credible and traceable uncertainty distributions for the input variables of the codes. Expert elicitation, developed independently, was identified as the best technology available for developing a library of uncertainty distributions for the selected consequence parameters. The study was formulated jointly and was limited to the current code models and to physical quantities that could be measured in experiments. To validate the distributions generated for the wet deposition input variables, samples were taken from these distributions and propagated through the wet deposition code model along with the Gaussian plume model (GPM) implemented in the MACCS and COSYMA codes. Resulting distributions closely replicated the aggregated elicited wet deposition distributions. Project teams from the NRC and CEC cooperated successfully to develop and implement a unified process for the elaboration of uncertainty distributions on consequence code input parameters. Formal expert judgment elicitation proved valuable for synthesizing the best available information. Distributions on measurable atmospheric dispersion and deposition parameters were successfully elicited from experts involved in the many phenomenological areas of consequence analysis. This volume is the third of a three-volume document describing the project and contains descriptions of the probability assessment principles; the expert identification and selection process; the weighting methods used; the inverse modeling methods; case structures; and summaries of the consequence codes.

  13. Probabilistic Modelling of Robustness and Resilience of Power Grid Systems

    DEFF Research Database (Denmark)

    Qin, Jianjun; Sansavini, Giovanni; Nielsen, Michael Havbro Faber

    2017-01-01

    The present paper proposes a framework for the modeling and analysis of resilience of networked power grid systems. A probabilistic systems model is proposed based on the JCSS Probabilistic Model Code (JCSS, 2001) and deterministic engineering systems modeling techniques such as the DC flow model … cascading failure event scenarios (Nan and Sansavini, 2017). The concept of direct and indirect consequences proposed by the Joint Committee on Structural Safety (JCSS, 2008) is utilized to model the associated consequences. To facilitate a holistic modeling of robustness and resilience, and to identify how these characteristics may be optimized, the power grid system is finally interlinked with its fundamental interdependent systems, i.e. a societal model, a regulatory system and control feedback loops. The proposed framework is exemplified with reference to optimal decision support for resilience …
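
    As a flavor of the deterministic layer, the sketch below solves a DC power flow on an invented 4-bus network with bus 0 as slack; in a cascading-failure simulation, the resulting line flows would be checked against ratings and overloaded lines removed iteratively. The network, reactances and injections are illustrative assumptions only.

      import numpy as np

      lines = [(0, 1, 0.1), (0, 2, 0.2), (1, 2, 0.25), (1, 3, 0.1), (2, 3, 0.3)]
      n_bus = 4
      injections = np.array([0.0, 0.9, -0.4, -0.5])   # p.u.; bus 0 is slack

      # Assemble the susceptance (B) matrix from line reactances.
      B = np.zeros((n_bus, n_bus))
      for i, j, x in lines:
          b = 1.0 / x
          B[i, i] += b; B[j, j] += b
          B[i, j] -= b; B[j, i] -= b

      # Solve reduced system for bus angles (slack angle fixed at 0).
      theta = np.zeros(n_bus)
      theta[1:] = np.linalg.solve(B[1:, 1:], injections[1:])

      for i, j, x in lines:
          flow = (theta[i] - theta[j]) / x
          print(f"line {i}-{j}: flow = {flow:+.3f} p.u.")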

  14. Analysis expectation of investment projects in the conditions of maintenance of investment security companies: probabilistic approach

    Directory of Open Access Journals (Sweden)

    G.V. Berlyak

    2015-06-01

    Uncertainty in investment and financial decision-making arises both because there are many alternatives from which an investor can select, and because the state of the environment at the time the decision is implemented is beyond the investor's control. One stage of the economic analysis is the study of the volume of investment resources the enterprise currently requires, which can be based on the analysis of investment security indicators. This indicative analysis of the investment security of the enterprise affects the size of the expected value of the investment project's realization. It is proposed to rank the indicators into a given set of blocks and, for each, to assign a factor of influence on the expected value. Each of the proposed investment security indicators is awarded a value, the sum of which, when the established standards are met, equals 1. This means that the investor, as a result of the investment project, will receive the whole amount of the expected value. If at least one of the indicators does not meet the proposed standard, the expected value is reduced in proportion to the coefficient of influence. Based on the study, a technique is proposed for reflecting the financial and economic condition of the enterprise through the analysis of investment security indicators, which allows one, to varying degrees, to judge the financial health of the enterprise, the availability of capital for current and future development, the ability to repay debts and liabilities, and also the possibility of realization of the investment project.

  15. Sparse Probabilistic Parallel Factor Analysis for the Modeling of PET and Task-fMRI Data

    DEFF Research Database (Denmark)

    Beliveau, Vincent; Papoutsakis, Georgios; Hinrich, Jesper Løve

    2017-01-01

    Modern datasets are often multiway in nature and can contain patterns common to a mode of the data (e.g. space, time, and subjects). Multiway decompositions such as parallel factor analysis (PARAFAC) take into account the intrinsic structure of the data, and sparse versions of these methods improve …

  16. A probabilistic framework for image information fusion with an application to mammographic analysis.

    NARCIS (Netherlands)

    Velikova, M.; Lucas, P.J.; Samulski, M.; Karssemeijer, N.

    2012-01-01

    The recent increased interest in information fusion methods for solving complex problems, such as in image analysis, is motivated by the wish to better exploit the multitude of information available from different sources to enhance decision-making. In this paper, we propose a novel method that …

  17. Probabilistic evaluation of riprap failure under future uncertain flood conditions: the case study of river Kleine Emme (Switzerland)

    Science.gov (United States)

    Jafarnejad, Mona; Pfister, Michael; Franca, Mário J.; Schleiss, Anton J.

    2014-05-01

    The potential failure of river bank protection measures is a critical issue to be evaluated in safety and stability assessment. Moreover, uncertainties associated with flood conditions and sediment transport in rivers, as possible results of future climate change, affect the safety level of riverbank protection structures such as ripraps and walls. Bank failure can lead to uncontrolled erosion and flooding with disastrous consequences in residential areas or in critical infrastructures. The probabilistic analysis of failure under different mechanisms due to possible flood events and sediment transport is a principal step in assessing embankment stability under future scenarios. Herein, a probabilistic risk assessment model for the failure risk of river bank ripraps, developed on the basis of Monte Carlo simulation and moment analysis methods, is presented. This probabilistic simulation estimates the resistance of ripraps for varied future flood and sediment transport scenarios. The failure probability of ripraps is assessed by a probabilistic function of the design safety factor. The probability of failure under different mechanisms such as direct block erosion, toe scouring and overtopping is defined by taking into account the bed-load transport modified by a probabilistic function of the design discharge. The evaluation method is applied to a Swiss river located in Canton Lucerne, the Kleine Emme. The results highlight the failure probability of riverbank riprap associated with each mechanism individually. A risk map representing the risk of total failure along a longitudinal profile of the river is proposed.
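
    The Monte Carlo core of such an assessment can be sketched briefly: sample discharge and block size, form a safety factor for direct block erosion, and count realizations with SF < 1. The stage-discharge relation, the Shields-type criterion and all distributions below are simplified placeholders, not the model calibrated for the Kleine Emme.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200_000

      q = rng.gumbel(150.0, 40.0, n).clip(min=1.0)      # design discharge (m^3/s)
      d_block = rng.normal(0.8, 0.1, n).clip(min=0.3)   # riprap block size (m)

      depth = 0.4 * q ** 0.5               # toy stage-discharge relation (m)
      slope = 0.01
      tau = 1000.0 * 9.81 * depth * slope  # bed shear stress (N/m^2)
      tau_crit = 0.047 * (2650.0 - 1000.0) * 9.81 * d_block  # Shields-type limit

      safety_factor = tau_crit / tau
      p_fail = np.mean(safety_factor < 1.0)
      print(f"P(direct block erosion, SF < 1) = {p_fail:.4f}")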

  18. Applications of probabilistic methods in geotechnical engineering. Part 2. Analysis of documented case histories using a stochastic model for seismically generated pore pressure and shear strain potential

    Energy Technology Data Exchange (ETDEWEB)

    Kavazanjian, E. Jr.; Chameau, J.L.; Clough, G.W.; Hadj-Hamou, T.

    1983-09-01

    This report presents the basics of a new stochastic model for seismically-generated pore pressure and shear strain potential and illustrates its use for documented case histories. Model parameters are chosen according to available information on the variability of soil properties, and the model is applied to sites where liquefaction was observed and to sites where no evidence of liquefaction was observed after major seismic events. Results of the analysis are in substantial agreement with observed field behavior, indicating that this model can be used in a predictive capacity if parameters are chosen correctly. An application of the model to a comprehensive risk analysis of seismically induced initial liquefaction is also briefly described. An example using available seismic information for a hypothetical soil site near San Francisco is presented to illustrate the use of this type of model. Two models are applied to documented case histories to demonstrate their applicability and to illustrate how the probabilistic design parameters are chosen. The probabilistic pore pressure model developed by Chameau (1980) and the probabilistic shear strain model developed by Hadj Hamou (1982) are used herein to analyze the behavior of three sites where liquefaction did and did not occur during earthquakes.

  19. The quality/safety medical index: implementation and analysis.

    Science.gov (United States)

    Reiner, Bruce I

    2015-02-01

    Medical analytics relating to quality and safety measures have become particularly timely and of high importance in contemporary medical practice. In medical imaging, the dynamic relationship between medical imaging quality and radiation safety creates challenges in quantifying quality or safety independently. By creating a standardized measurement which simultaneously accounts for quality and safety measures (i.e., quality safety index), one can in theory create a standardized method for combined quality and safety analysis, which in turn can be analyzed in the context of individual patient, exam, and clinical profiles. The derived index measures can be entered into a centralized database, which in turn can be used for comparative performance of individual and institutional service providers. In addition, data analytics can be used to create customizable educational resources for providers and patients, clinical decision support tools, technology performance analysis, and clinical/economic outcomes research.

  20. Experimental analysis of pilot-based equalization for probabilistically shaped WDM systems with 256QAM/1024QAM

    DEFF Research Database (Denmark)

    Yankov, Metodi Plamenov; Porto da Silva, Edson; Da Ros, Francesco

    2017-01-01

    Pilot-based equalization is studied in a 5x10 Gbaud WDM transmission experiment. The equalization is independent of the modulation format and is demonstrated for 256/1024QAM with uniform and probabilistically optimized distributions using an optimized pilot insertion rate of 2-5%.

  1. The role of safety analysis in accident prevention.

    Science.gov (United States)

    Suokas, J

    1988-02-01

    The need for safety analysis has grown in the fields of the nuclear industry, civil and military aviation and space technology, where the potential for accidents with far-reaching consequences for employees, the public and the environment is most apparent. Later the use of safety analysis has spread widely to other industrial branches. General systems theory, accident theories and scientific management represent domains that have influenced the development of safety analysis. These relations are briefly presented, and the common methods employed in safety analysis are described and structured according to the aim of the search and to the search strategy. A framework for the evaluation of the coverage of the search procedures employed in different methods of safety analysis is presented. The framework is then used in a heuristic and in an empirical evaluation of hazard and operability study (HAZOP), work safety analysis (WSA), action error analysis (AEA) and management oversight and risk tree (MORT). Finally, some recommendations on the use of safety analysis for preventing accidents are presented.

  2. Analysis of Stationary Random Responses for Non-Parametric Probabilistic Systems

    Directory of Open Access Journals (Sweden)

    Y. Zhao

    2010-01-01

    The move from conceptual design, through fabrication, to observation and measurement on the resulting physical structure is fraught with uncertainty. This, together with the necessary simplifications inherent when using the finite element technique, makes the development of a predictive model for the physical structure sufficiently approximate that the use of random structural models is often preferred. In this paper, the random uncertainties of the mass, damping and stiffness matrices in a finite element model are replaced by random matrices, and a highly efficient pseudo-excitation method for the dynamic response analysis of non-parametric probabilistic systems subjected to stationary random loads is developed. A numerical example shows that the dynamic responses calculated using a conventional (mean) finite element model may be quite different from those based on a random matrix model. For precise fabrication, the uncertainties of models cannot be ignored, and the proposed method should be useful in the analysis of such problems.

  3. A Probabilistic Approach to Uncertainty Analysis in NTPR Radiation Dose Assessments

    Science.gov (United States)

    2009-11-01

    … analysis. Software Selection (4.1.1) resulted in choosing Mathcad® as the primary tool for Monte Carlo calculations and was supplemented with the … model variability and uncertainty of model parameters generated in Mathcad. The special studies of model parameters, uncertainties, and distributions … Enewetak Island contained in contractor reports to model PFs and test them for a 48-man barracks building and an 8-man tent. Mathcad software was …

  4. The Performance of Structure-Controller Coupled Systems Analysis Using Probabilistic Evaluation and Identification Model Approach

    OpenAIRE

    Mosbeh R. Kaloop; Jong Wan Hu; Yasser Bigdeli

    2017-01-01

    This study evaluates the performance of a passively controlled steel frame building under dynamic loads using time series analysis. A novel application is utilized for the time and frequency domain evaluation to analyze the behavior of the controlling systems. In addition, autoregressive moving average (ARMA) neural networks are employed to identify the performance of the controller system. Three passive vibration control devices are utilized in this study, namely, tuned mass damper (TMD), tun...

  5. A Probabilistic Framework for Risk Analysis of Widespread Flood Events: A Proof-of-Concept Study.

    Science.gov (United States)

    Schneeberger, Klaus; Huttenlau, Matthias; Winter, Benjamin; Steinberger, Thomas; Achleitner, Stefan; Stötter, Johann

    2017-07-27

    This article presents a flood risk analysis model that considers the spatially heterogeneous nature of flood events. The basic concept of this approach is to generate a large sample of flood events that can be regarded as temporal extrapolation of flood events. These are combined with cumulative flood impact indicators, such as building damages, to finally derive time series of damages for risk estimation. Therefore, a multivariate modeling procedure that is able to take into account the spatial characteristics of flooding, the regionalization method top-kriging, and three different impact indicators are combined in a model chain. Eventually, the expected annual flood impact (e.g., expected annual damages) and the flood impact associated with a low probability of occurrence are determined for a study area. The risk model has the potential to augment the understanding of flood risk in a region and thereby contribute to enhanced risk management of, for example, risk analysts and policymakers or insurance companies. The modeling framework was successfully applied in a proof-of-concept exercise in Vorarlberg (Austria). The results of the case study show that risk analysis has to be based on spatially heterogeneous flood events in order to estimate flood risk adequately.
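
    The final reduction step is easy to sketch: given a long synthetic series of annual aggregated damages, the expected annual damage is its mean and the low-probability impact a high quantile. The Poisson occurrence rate and lognormal damage distribution below are invented, not the Vorarlberg event set.

      import numpy as np

      rng = np.random.default_rng(11)
      n_years = 10_000                      # length of the synthetic series

      # Heterogeneous events: Poisson occurrence, heavy-tailed damages (MEUR).
      n_events = rng.poisson(0.8, n_years)
      annual_damage = np.array(
          [rng.lognormal(1.0, 1.2, k).sum() for k in n_events])

      ead  = annual_damage.mean()
      q100 = np.quantile(annual_damage, 1.0 - 1.0 / 100.0)
      print(f"expected annual damage = {ead:6.2f} MEUR")
      print(f"100-year annual damage = {q100:6.2f} MEUR")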

  6. Safety analysis report for the Waste Storage Facility. Revision 2

    Energy Technology Data Exchange (ETDEWEB)

    Bengston, S.J.

    1994-05-01

    This safety analysis report outlines the safety concerns associated with the Waste Storage Facility located in the Radioactive Waste Management Complex at the Idaho National Engineering Laboratory. The three main objectives of the report are to: define and document a safety basis for the Waste Storage Facility activities; demonstrate how the activities will be carried out to adequately protect the workers, public, and environment; and provide a basis for review and acceptance of the identified risk that the managers, operators, and owners will assume.

  7. SNF fuel retrieval sub project safety analysis document

    Energy Technology Data Exchange (ETDEWEB)

    BERGMANN, D.W.

    1999-02-24

    This safety analysis is for the SNF Fuel Retrieval (FRS) Sub Project. The FRS equipment will be added to the K West and K East Basins to facilitate retrieval, cleaning and repackaging of the spent nuclear fuel into Multi-Canister Overpack baskets. The document includes a hazard evaluation, identifies bounding accidents, documents analyses of the accidents and establishes safety-class or safety-significant equipment to mitigate accidents as needed.

  8. Applying importance-performance analysis to patient safety culture.

    Science.gov (United States)

    Lee, Yii-Ching; Wu, Hsin-Hung; Hsieh, Wan-Lin; Weng, Shao-Jen; Hsieh, Liang-Po; Huang, Chih-Hsuan

    2015-01-01

    Sexton et al.'s (2006) safety attitudes questionnaire (SAQ) has been widely used to assess staff attitudes towards patient safety in healthcare organizations. However, to date there have been few studies that discuss the perceptions of patient safety both from hospital staff and upper management. The purpose of this paper is to improve and to develop better strategies regarding patient safety in healthcare organizations. The Chinese version of the SAQ, based on the Taiwan Joint Commission on Hospital Accreditation, is used to evaluate the perceptions of hospital staff. The current study then applies the importance-performance analysis technique to identify the major strengths and weaknesses of the safety culture. The results show that teamwork climate, safety climate, job satisfaction, stress recognition and working conditions are major strengths and should be maintained in order to provide a better patient safety culture. On the contrary, perceptions of management and hospital handoffs and transitions are important weaknesses and should be improved immediately. Research limitations/implications: the research is restricted in generalizability, since the assessment of patient safety culture covered only physicians and registered nurses. It would be interesting to further evaluate other staff's (e.g. technicians, pharmacists and others) opinions regarding patient safety culture in the hospital. Few studies have clearly evaluated the perceptions of healthcare organization management regarding patient safety culture. Healthcare managers are able to take more effective actions to improve the level of patient safety by investigating key characteristics (either strengths or weaknesses) that healthcare organizations should focus on.
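
    Importance-performance analysis itself is computationally trivial: each dimension is placed in a quadrant using the grand means as cut-offs. The scores below are invented stand-ins for the SAQ survey results, chosen only to echo the strengths and weaknesses reported above.

      # (importance, performance) per SAQ dimension; values are hypothetical.
      dimensions = {
          "teamwork climate":           (0.82, 0.78),
          "safety climate":             (0.85, 0.74),
          "job satisfaction":           (0.70, 0.72),
          "stress recognition":         (0.55, 0.66),
          "working conditions":         (0.68, 0.71),
          "perceptions of management":  (0.80, 0.48),
          "handoffs and transitions":   (0.78, 0.52),
      }

      imp_cut  = sum(i for i, _ in dimensions.values()) / len(dimensions)
      perf_cut = sum(p for _, p in dimensions.values()) / len(dimensions)

      for name, (imp, perf) in dimensions.items():
          if imp >= imp_cut and perf < perf_cut:
              quadrant = "concentrate here (weakness)"
          elif imp >= imp_cut:
              quadrant = "keep up the good work (strength)"
          elif perf >= perf_cut:
              quadrant = "possible overkill"
          else:
              quadrant = "low priority"
          print(f"{name:27s}: {quadrant}")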

  9. Probabilistic Social Behavior Analysis by Exploring Body Motion-Based Patterns.

    Science.gov (United States)

    Roudposhti, Kamrad Khoshhal; Nunes, Urbano; Dias, Jorge

    2016-08-01

    Understanding human behavior through nonverbal-based features is interesting in several applications such as surveillance, ambient assisted living and human-robot interaction. In this article, in order to analyze human behaviors in a social context, we propose a new approach which explores interrelations between body part motions in scenarios with people holding a conversation. The novelty of this method is that we analyze body motion-based features in the frequency domain to estimate different human social patterns: Interpersonal Behaviors (IBs) and a Social Role (SR). To analyze the dynamics and interrelations of people's body motions, a human movement descriptor is used to extract discriminative features, and a multi-layer Dynamic Bayesian Network (DBN) technique is proposed to model the existing dependencies. Laban Movement Analysis (LMA) is a well-known human movement descriptor, which provides efficient mid-level information on human body motions. The mid-level information is useful to extract the complex interdependencies. The DBN technique is tested in different scenarios to model the mentioned complex dependencies. The study is applied to obtaining four IBs (Interest, Indicator, Empathy and Emphasis) to estimate one SR (Leading). The obtained results give a good indication of the capabilities of the proposed approach for people interaction analysis, with potential applications in human-robot interaction.

  10. Probabilistic Analysis of Drought Spatiotemporal Characteristics in the Beijing-Tianjin-Hebei Metropolitan Area in China

    Directory of Open Access Journals (Sweden)

    Wanyuan Cai

    2015-03-01

    The temporal and spatial characteristics of meteorological drought have been investigated to provide a framework of methodologies for the analysis of drought in the Beijing-Tianjin-Hebei metropolitan area (BTHMA) in China. Using the Reconnaissance Drought Index (RDI) as an indicator of drought severity, the characteristics of droughts have been examined. The BTHMA was divided into 253 grid cells of 27 × 27 km, and monthly precipitation data for the period of 1960-2010 from 33 meteorological stations were used for global interpolation of precipitation using spatial coordinate data. Drought severity was assessed from the estimated gridded RDI values at multiple time scales. First, the temporal and spatial characteristics of droughts were analyzed, and then drought severity-areal extent-frequency (SAF) annual curves were developed. The analysis indicated that the frequency of moderate and severe droughts was about 9.10% in the BTHMA. Using the SAF curves, the return period of selected severe drought events was assessed. The identification of the temporal and spatial characteristics of droughts in the BTHMA will be useful for the development of a drought preparedness plan in the region.
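
    The indicator at the heart of the analysis is straightforward to compute: the RDI is commonly formed as the standardized logarithm of the ratio of aggregated precipitation to potential evapotranspiration (PET) over the chosen time scale. The monthly series below are synthetic, not the BTHMA data, and the annual time scale is one of several the study uses.

      import numpy as np

      rng = np.random.default_rng(9)
      years, months = 51, 12                        # e.g. 1960-2010
      P   = rng.gamma(2.0, 25.0, (years, months))   # monthly precipitation (mm)
      PET = rng.gamma(9.0, 10.0, (years, months))   # monthly PET (mm)

      k = 12                                        # annual time scale
      alpha = P[:, :k].sum(axis=1) / PET[:, :k].sum(axis=1)

      y = np.log(alpha)
      rdi = (y - y.mean()) / y.std(ddof=1)          # standardized RDI

      moderate_or_worse = np.mean(rdi <= -1.0)      # share of drought years
      print(f"RDI range: {rdi.min():.2f} .. {rdi.max():.2f}; "
            f"P(moderate or severe drought) = {moderate_or_worse:.2%}")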

  11. Estimating the Expected Value of Sample Information Using the Probabilistic Sensitivity Analysis Sample: A Fast, Nonparametric Regression-Based Method.

    Science.gov (United States)

    Strong, Mark; Oakley, Jeremy E; Brennan, Alan; Breeze, Penny

    2015-07-01

    Health economic decision-analytic models are used to estimate the expected net benefits of competing decision options. The true values of the input parameters of such models are rarely known with certainty, and it is often useful to quantify the value to the decision maker of reducing uncertainty through collecting new data. In the context of a particular decision problem, the value of a proposed research design can be quantified by its expected value of sample information (EVSI). EVSI is commonly estimated via a 2-level Monte Carlo procedure in which plausible data sets are generated in an outer loop, and then, conditional on these, the parameters of the decision model are updated via Bayes rule and sampled in an inner loop. At each iteration of the inner loop, the decision model is evaluated. This is computationally demanding and may be difficult if the posterior distribution of the model parameters conditional on sampled data is hard to sample from. We describe a fast nonparametric regression-based method for estimating per-patient EVSI that requires only the probabilistic sensitivity analysis sample (i.e., the set of samples drawn from the joint distribution of the parameters and the corresponding net benefits). The method avoids the need to sample from the posterior distributions of the parameters and avoids the need to rerun the model. The only requirement is that sample data sets can be generated. The method is applicable with a model of any complexity and with any specification of model parameter distribution. We demonstrate in a case study the superior efficiency of the regression method over the 2-level Monte Carlo method.
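
    The regression trick at the heart of the method can be shown in a few lines. The sketch below is a toy two-option decision problem (the net-benefit model, trial design and cubic smoother are all illustrative assumptions; the paper itself uses a flexible nonparametric regression): each option's net benefit from the PSA sample is regressed on a summary statistic of the simulated trial data, and EVSI is the mean of the row-wise maxima of the fitted values minus the maximum of the mean net benefits.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 10_000

        # PSA sample: uncertain effect theta and net benefit of two options
        theta = rng.normal(0.1, 0.05, n)
        nb = np.column_stack([np.zeros(n),              # option 0: status quo
                              20_000 * theta - 1_500])  # option 1: new treatment

        # Proposed study: summarize each simulated data set by its mean effect
        m = 100                                         # patients per arm
        xbar = rng.normal(theta, 0.2 / np.sqrt(m))

        # Regress each option's net benefit on the summary statistic
        fitted = np.column_stack([
            np.polyval(np.polyfit(xbar, nb[:, d], 3), xbar)
            for d in range(nb.shape[1])
        ])

        evsi = fitted.max(axis=1).mean() - nb.mean(axis=0).max()
        print(f"EVSI estimate: {evsi:.1f} (monetary units per person affected)")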

  12. Cloud Extraction from Chinese High Resolution Satellite Imagery by Probabilistic Latent Semantic Analysis and Object-Based Machine Learning

    Directory of Open Access Journals (Sweden)

    Kai Tan

    2016-11-01

    Automatic cloud extraction from satellite imagery is a vital process for many applications in optical remote sensing, since clouds can locally obscure surface features and alter the reflectance. Clouds can be easily distinguished by the human eye in satellite imagery via remarkable regional characteristics, but finding a way to automatically detect various kinds of clouds by computer programs, so as to speed up processing, remains a challenge. This paper introduces a new cloud detection method based on probabilistic latent semantic analysis (PLSA) and object-based machine learning. The method begins by segmenting satellite images into superpixels with the Simple Linear Iterative Clustering (SLIC) algorithm while also extracting spectral, texture, frequency and line segment features. Then, the implicit information in each superpixel is extracted from the feature histogram through the PLSA model, by which the descriptor of each superpixel can be computed to form a feature vector for classification. Thereafter, the cloud mask is extracted by optimal thresholding and by applying the Support Vector Machine (SVM) algorithm at the superpixel level. The GrabCut algorithm is then applied to extract more accurate cloud regions at the pixel level, using the cloud mask as prior knowledge. When compared to different cloud detection methods in the literature, the overall accuracy of the proposed method was up to 90 percent for ZY-3 and GF-1 images, about a 6.8 percent improvement over traditional spectral-based methods. The experimental results show that the proposed method can automatically and accurately detect clouds using the multispectral information of the available four bands.

  13. Visualisation of variable binding pockets on protein surfaces by probabilistic analysis of related structure sets

    Directory of Open Access Journals (Sweden)

    Ashford Paul

    2012-03-01

    Background: Protein structures provide a valuable resource for rational drug design. For a protein with no known ligand, computational tools can predict surface pockets that are of suitable size and shape to accommodate a complementary small-molecule drug. However, pocket prediction against single static structures may miss features of pockets that arise from proteins' dynamic behaviour. In particular, ligand-binding conformations can be observed as transiently populated states of the apo protein, so it is possible to gain insight into ligand-bound forms by considering conformational variation in apo proteins. This variation can be explored by considering sets of related structures: computationally generated conformers, solution NMR ensembles, multiple crystal structures, homologues or homology models. It is non-trivial to compare pockets, either from different programs or across sets of structures. For a single structure, difficulties arise in defining a particular pocket's boundaries. For a set of conformationally distinct structures the challenge is how to make reasonable comparisons between them given that a perfect structural alignment is not possible. Results: We have developed a computational method, Provar, that provides a consistent representation of predicted binding pockets across sets of related protein structures. The outputs are probabilities that each atom or residue of the protein borders a predicted pocket. These probabilities can be readily visualised on a protein using existing molecular graphics software. We show how Provar simplifies comparison of the outputs of different pocket prediction algorithms, of pockets across multiple simulated conformations and between homologous structures. We demonstrate the benefits of using multiple structures for protein-ligand and protein-protein interface analysis on a set of complexes and consider three case studies in detail: (i) analysis of a kinase superfamily highlights the

  14. Techno-economic and Monte Carlo probabilistic analysis of microalgae biofuel production system.

    Science.gov (United States)

    Batan, Liaw Y; Graff, Gregory D; Bradley, Thomas H

    2016-11-01

    This study characterizes the technical and economic feasibility of an enclosed photobioreactor microalgae system with an annual production of 37.85 million liters (10 million gallons) of biofuel. The analysis characterizes and breaks down the capital investment, the operating costs, and the production cost per unit of algal diesel. The economic modelling shows a total production cost of algal raw oil and diesel of $3.46 and $3.69 per liter, respectively. Additionally, the effects of co-product credits and their impact on the economic performance of the algae-to-biofuel system are discussed. The Monte Carlo methodology is used to address price and cost projections and to simulate scenarios with probabilities of financial performance and profits of the analyzed model. Different markets for the allocation of co-products are shown to shift the economic viability of the algal biofuel system significantly.
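
    The Monte Carlo step can be reduced to a few lines of numpy. In the sketch below every figure is an illustrative placeholder (not the paper's cost data): annualized capital, operating cost and co-product credit are sampled from triangular distributions, and the distribution of production cost per liter is summarized by percentiles.

        import numpy as np

        rng = np.random.default_rng(2)
        n = 100_000

        capital_charge = rng.triangular(8e6, 10e6, 13e6, n)   # $/yr, annualized capex
        operating_cost = rng.triangular(15e6, 18e6, 22e6, n)  # $/yr
        coproduct_credit = rng.triangular(1e6, 3e6, 5e6, n)   # $/yr
        output_liters = 37.85e6                               # L/yr of algal diesel

        cost_per_liter = (capital_charge + operating_cost - coproduct_credit) / output_liters
        lo, med, hi = np.percentile(cost_per_liter, [5, 50, 95])
        print(f"production cost: P5=${lo:.2f}  P50=${med:.2f}  P95=${hi:.2f} per liter")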

  15. Probabilistic Cost-Effectiveness Analysis of Vaccination for Mild or Moderate Alzheimer's Disease.

    Science.gov (United States)

    Yang, Kuen-Cheh; Chen, Hsiu-Hsi

    2016-01-01

    Studies on immunotherapy for Alzheimer's disease (AD) have gained increasing attention since the 1990s. However, there are pros (prevention of AD) and cons (incurred cost and side effects) regarding the administration of immunotherapy, and to date economic evaluations of immunotherapy for AD have been lacking. We aimed to perform a cost-effectiveness analysis of vaccination for AD. A meta-analysis of randomized controlled trials, following a systematic review, was conducted to evaluate the efficacy of the vaccine. A Markov decision model was constructed and applied to a 120,000-person Taiwanese cohort aged ≥65 years. Person-years and quality-adjusted life years (QALYs) were computed for the vaccinated and unvaccinated groups. An economic evaluation was performed to calculate the incremental cost-effectiveness ratio (ICER) and the cost-effectiveness acceptability curve (CEAC). The vaccinated group gained an additional 0.84 life years and 0.56 QALYs over 10 years, and an additional 0.35 life years and 0.282 QALYs over 5 years of follow-up. The vaccinated group dominated the unvaccinated group by ICER over 5 years of follow-up. The ICERs at 10-year follow-up for the vaccinated group against the unvaccinated group were $13,850 per QALY and $9,038 per life year gained. Given a willingness-to-pay (WTP) threshold of $20,000, the CEAC showed that the probability of vaccination being cost-effective was 70.7% for QALYs and 92% for life years gained after 10 years of follow-up. The corresponding figures were 87.3% for QALYs and 93.5% for life years gained over 5 years of follow-up. Vaccination for AD was cost-effective in gaining QALYs and life years compared with no vaccination, under a reasonable WTP threshold.
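
    The ICER and CEAC computations follow directly from simulated incremental costs and effects. The sketch below uses invented normal distributions that loosely echo the 10-year results quoted above; it is not the study's Markov model.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 10_000

        d_cost = rng.normal(7_800, 2_000, n)   # incremental cost of vaccination, $
        d_qaly = rng.normal(0.56, 0.20, n)     # incremental QALYs

        icer = d_cost.mean() / d_qaly.mean()
        print(f"ICER ~ ${icer:,.0f} per QALY")

        # CEAC: probability of a positive incremental net benefit at each WTP
        for wtp in (10_000, 20_000, 30_000):
            p_ce = np.mean(wtp * d_qaly - d_cost > 0)
            print(f"P(cost-effective | WTP=${wtp:,}) = {p_ce:.2f}")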

  16. Experiments with ROPAR, an approach for probabilistic analysis of the optimal solutions' robustness

    Science.gov (United States)

    Marquez, Oscar; Solomatine, Dimitri

    2016-04-01

    Robust optimization is defined as the search for solutions and performance results which remain reasonably unchanged when exposed to uncertain conditions such as natural variability in input variables, parameter drifts during operation time, model sensitivities and others [1]. In the present study we follow the approach named ROPAR, a multi-objective robust optimization method allowing for explicit analysis of robustness (see online publication [2]). Its main idea is: a) sampling the vectors of uncertain factors; b) solving the MOO problem for each of them, obtaining multiple Pareto sets; c) analysing the statistical properties (distributions) of the subsets of these Pareto sets corresponding to different conditions (e.g. based on constraints formulated for the objective function values or other system variables); d) selecting the robust solutions. The paper presents the results of experiments with two case studies: 1) the benchmark function ZDT1 (with an uncertain factor), often used in algorithm comparisons, and 2) a problem of drainage network rehabilitation that uses the SWMM hydrodynamic model (the rainfall is assumed to be an uncertain factor). This study is partly supported by the FP7 European Project WeSenseIt Citizen Water Observatory (http://wesenseit.eu/) and by CONACYT (Mexico's National Council of Science and Technology), supporting the PhD study of the first author. References [1] H.G. Beyer and B. Sendhoff. "Robust optimization - A comprehensive survey." Comput. Methods Appl. Mech. Engrg., 2007: 3190-3218. [2] D.P. Solomatine (2012). An approach to multi-objective robust optimization allowing for explicit analysis of robustness (ROPAR). UNESCO-IHE. Online publication. Web: https://www.unesco-ihe.org/sites/default/files/solomatine-ropar.pdf
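
    A minimal sketch of steps a)-d) on a toy problem may clarify the mechanics. Everything below is an illustrative assumption: random candidate decisions stand in for a real multi-objective optimizer, a ZDT1-style objective pair carries the uncertain factor u, and the constraint f1 <= 0.2 plays the role of the condition in step c).

        import numpy as np

        def pareto_mask(F):
            """Rows of objective matrix F (minimized) that no other row dominates."""
            keep = np.ones(len(F), dtype=bool)
            for i in range(len(F)):
                dominated = np.any(np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1))
                keep[i] = not dominated
            return keep

        rng = np.random.default_rng(4)
        X = rng.random((400, 2))                   # candidate decisions
        fronts = []
        for _ in range(200):                       # a) sample the uncertain factor
            u = rng.uniform(0.5, 1.5)
            f1 = X[:, 0]
            g = 1 + 9 * X[:, 1]
            f2 = u * g * (1 - np.sqrt(f1 / g))     # ZDT1-style objective scaled by u
            F = np.column_stack([f1, f2])
            fronts.append(F[pareto_mask(F)])       # b) Pareto set for this sample

        # c) distribution of f2 over Pareto solutions satisfying f1 <= 0.2
        f2 = np.concatenate([Fk[Fk[:, 0] <= 0.2, 1] for Fk in fronts])
        print(f"f2 | f1 <= 0.2: mean={f2.mean():.3f}, 90th pct={np.percentile(f2, 90):.3f}")
        # d) robust solutions are those whose f2 stays acceptable across most fronts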

  17. Probabilistic risk assessment model for allergens in food: sensitivity analysis of the minimum eliciting dose and food consumption

    NARCIS (Netherlands)

    Kruizinga, A.G.; Briggs, D.; Crevel, R.W.R.; Knulst, A.C.; Bosch, L.M.C.v.d.; Houben, G.F.

    2008-01-01

    Previously, TNO developed a probabilistic model to predict the likelihood of an allergic reaction, resulting in a quantitative assessment of the risk associated with unintended exposure to food allergens. The likelihood is estimated by including in the model the proportion of the population who is

  18. Safety analysis of passing maneuvers using extreme value theory

    Directory of Open Access Journals (Sweden)

    Haneen Farah

    2017-04-01

    The results indicate that this is a promising approach for safety evaluation. On-going work by the authors will attempt to generalize the method to other safety measures related to passing maneuvers, to test it for detailed analysis of the effect of demographic factors on passing maneuvers' crash probability, and to assess its usefulness in a traffic simulation environment.

  19. 14 CFR 417.405 - Ground safety analysis.

    Science.gov (United States)

    2010-01-01

    ... unfenced boundary of an entire industrial complex or multi-user launch site. A launch location hazard may... must identify employee hazards and demonstrate that there are no associated public safety issues. (4.... (j) A launch operator must verify all information in a ground safety analysis, including design...

  20. Challenges on innovations of newly-developed safety analysis codes

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Yanhua [Shanghai Jiao Tong Univ. (China). School of Nuclear Science and Engineering; Zhang, Hao [State Nuclear Power Software Development Center, Beijing (China). Beijing Future Science and Technology City

    2016-05-15

    With the development of safety analysis methods, safety analysis codes face new challenges. Three challenges are presented in this paper, concerning the mathematical models, the code design and the user interface. Drawing on the self-reliant safety analysis code COSINE, ways of meeting these challenges are suggested: developing multi-phase, multi-field and multi-dimensional models; adopting an object-oriented code design; and improving modeling, calculation control and data post-processing in the user interface.

  1. Systems Analysis of NASA Aviation Safety Program: Final Report

    Science.gov (United States)

    Jones, Sharon M.; Reveley, Mary S.; Withrow, Colleen A.; Evans, Joni K.; Barr, Lawrence; Leone, Karen

    2013-01-01

    A three-month study (February to April 2010) of the NASA Aviation Safety (AvSafe) program was conducted. This study comprised three components: (1) a statistical analysis of currently available civilian subsonic aircraft data from the National Transportation Safety Board (NTSB), the Federal Aviation Administration (FAA), and the Aviation Safety Information Analysis and Sharing (ASIAS) system to identify any significant or overlooked aviation safety issues; (2) a high-level qualitative identification of future safety risks, with an assessment of the potential impact of the NASA AvSafe research on the National Airspace System (NAS) based on these risks; and (3) a detailed, top-down analysis of the NASA AvSafe program using an established and peer-reviewed systems analysis methodology. The statistical analysis identified the top aviation "tall poles" based on NTSB accident and FAA incident data from 1997 to 2006. A separate examination of medical helicopter accidents in the United States was also conducted. Multiple external sources were used to develop a compilation of ten "tall poles" in future safety issues/risks. The top-down analysis of the AvSafe program was conducted by using a modification of the Gibson methodology. Of the 17 challenging safety issues that were identified, 11 were directly addressed by the AvSafe program research portfolio.

  2. Some Interesting Applications of Probabilistic Techniques in Structural Dynamic Analysis of Rocket Engines

    Science.gov (United States)

    Brown, Andrew M.

    2014-01-01

    Numerical and analytical methods were developed to determine damage accumulation in specific engine components when speed variation is included. The Dither Life Ratio (DLR) was shown to be well over a factor of 2 for a specific example. The steady-state assumption was shown to be accurate for most turbopump cases, allowing rapid calculation of the DLR. If hot-fire speed data are unknown, a Monte Carlo method was developed that uses speed statistics for similar engines. Application of these techniques allows the analyst to reduce both uncertainty and excess conservatism, and high values of DLR could allow a previously unacceptable part to pass HCF criteria without redesign. Given the benefit and ease of implementation, it is recommended that any finite-life turbomachine component analysis adopt these techniques. Probability values were also calculated, compared, and evaluated for several industry-proposed methods for combining random and harmonic loads. Two new Excel macros were written to calculate the combined load for any specified probability level, and closed-form curve fits were generated for the widely used 3(sigma) and 2(sigma) probability levels. For the design of lightweight aerospace components, obtaining an accurate, reproducible, statistically meaningful answer is critical.
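
    The load-combination question in the second half of the abstract is easy to reproduce by simulation. The sketch below (amplitudes and probability levels are illustrative; this is not the closed-form curve fits or Excel macros mentioned above) compares the Monte Carlo combined load at the 2-sigma and 3-sigma exceedance probabilities against the naive "peak harmonic plus n-sigma" sum:

        import numpy as np

        rng = np.random.default_rng(5)
        n = 2_000_000

        A = 1.0      # harmonic amplitude, in units of the random load's RMS
        sigma = 1.0  # RMS of the Gaussian random load

        load = A * np.sin(rng.uniform(0, 2 * np.pi, n)) + rng.normal(0, sigma, n)

        for label, nsig, p_exceed in (("2-sigma", 2.0, 0.02275), ("3-sigma", 3.0, 0.00135)):
            q = np.quantile(load, 1 - p_exceed)     # combined load at that probability
            naive = nsig * sigma + A                # peak-sum rule for comparison
            print(f"{label}: Monte Carlo = {q:.2f}, peak sum = {naive:.2f}")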

  3. Probabilistic modeling and global sensitivity analysis for CO2 storage in geological formations: a spectral approach

    KAUST Repository

    Saad, Bilal Mohammed

    2017-09-18

    This work focuses on the simulation of CO2 storage in deep underground formations under uncertainty and seeks to understand the impact of uncertainties in reservoir properties on CO2 leakage. To simulate the process, a non-isothermal two-phase two-component flow system with equilibrium phase exchange is used. Since model evaluations are computationally intensive, instead of traditional Monte Carlo methods, we rely on polynomial chaos (PC) expansions for representation of the stochastic model response. A non-intrusive approach is used to determine the PC coefficients. We establish the accuracy of the PC representations within a reasonable error threshold through systematic convergence studies. In addition to characterizing the distributions of model observables, we compute probabilities of excess CO2 leakage. Moreover, we consider the injection rate as a design parameter and compute an optimum injection rate that ensures that the risk of excess pressure buildup at the leaky well remains below acceptable levels. We also provide a comprehensive analysis of sensitivities of CO2 leakage, where we compute the contributions of the random parameters, and their interactions, to the variance by computing first, second, and total order Sobol’ indices.
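
    For readers unfamiliar with the Sobol' decomposition, a generic pick-freeze Monte Carlo estimator is sketched below on a cheap toy function (the paper instead evaluates the indices from the polynomial chaos surrogate; the model, input ranges and sample size here are illustrative assumptions):

        import numpy as np

        def sobol_first_order(model, sampler, n=200_000, seed=6):
            """First-order Sobol' indices via the Saltelli pick-freeze estimator."""
            rng = np.random.default_rng(seed)
            A, B = sampler(rng, n), sampler(rng, n)
            fA, fB = model(A), model(B)
            var = fA.var()
            S = []
            for i in range(A.shape[1]):
                ABi = A.copy()
                ABi[:, i] = B[:, i]   # re-sample coordinate i, keep the rest frozen at A
                S.append(np.mean(fB * (model(ABi) - fA)) / var)
            return np.array(S)

        # Toy stand-in for a leakage model: strongly driven by x0, weakly by x2
        model = lambda X: X[:, 0] ** 2 + 0.5 * X[:, 1] + 0.1 * X[:, 2]
        sampler = lambda rng, n: rng.uniform(0, 1, size=(n, 3))
        print("first-order Sobol' indices:", sobol_first_order(model, sampler).round(3))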

  4. An analysis of the traffic safety phenomenon.

    NARCIS (Netherlands)

    Asmussen, E. & Kranenburg, A.

    1982-01-01

    The lack of traffic safety is a combination of the critical coincidence of circumstances in the traffic of incidents (near-accidents) and accidents with unwanted (permanent) consequences, such as fatalities, injured and disabled persons and material damage. This definition covers the whole of the

  5. Safety Analysis of Stochastic Dynamical Systems

    DEFF Research Database (Denmark)

    Sloth, Christoffer; Wisniewski, Rafael

    2015-01-01

    This paper presents a method for verifying the safety of a stochastic system. In particular, we show how to compute the largest set of initial conditions such that a given stochastic system is safe with probability p. To compute the set of initial conditions we rely on the moment method that via...

  6. Using Prospective Risk Analysis Tools to Improve Safety in Pharmacy Settings: A Systematic Review and Critical Appraisal.

    Science.gov (United States)

    Stojkovic, Tatjana; Marinkovic, Valentina; Manser, Tanja

    2017-06-29

    This study aimed to review and critically appraise the published literature on 2 selected prospective risk analysis tools, Failure Mode and Effects Analysis and Socio-Technical Probabilistic Risk Assessment, as applied to the dispensing of medicines in both inpatient and outpatient pharmacy settings. A comprehensive search of electronic databases (PubMed and Scopus) was conducted (January 1990-March 2016), supplemented by hand search of reference lists. Eligible articles were assessed for data sources used for the risk analysis, uniformity of the risk quantification framework, and whether the analysis teams assembled were multidisciplinary. Of 1011 records identified, 11 articles met our inclusion criteria. These studies were mainly focused on dispensing of high-alert medications, and most were conducted in inpatient settings. The main risks identified were transcription, preparation, and selection errors, whereas the most common corrective actions included electronic transmission of prescriptions to the pharmacy, use of barcode, and medication safety training. Significant risk reduction was demonstrated by implementing corrective measures in both inpatient and outpatient pharmacy settings. The main Failure Mode and Effects Analysis limitations were its subjectivity and the lack of common risk quantification criteria. The prospective risk analysis methods included in this review revealed relevant safety issues and hold significant potential for risk reduction. They were deemed suitable for application in both inpatient and outpatient pharmacy settings and should form an integral part of any patient safety improvement strategy.
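
    To make the FMEA scoring concrete, the sketch below ranks a few dispensing failure modes by Risk Priority Number (RPN = severity x occurrence x detectability). The failure modes echo those named above, but the 1-10 ratings are invented for illustration, not taken from the reviewed studies.

        # Illustrative FMEA risk scoring for medicine-dispensing failure modes
        failure_modes = {
            "transcription error": {"S": 8, "O": 6, "D": 5},
            "preparation error":   {"S": 9, "O": 4, "D": 4},
            "selection error":     {"S": 7, "O": 5, "D": 3},
        }

        rpn = lambda r: r["S"] * r["O"] * r["D"]   # Risk Priority Number
        for name, r in sorted(failure_modes.items(), key=lambda kv: rpn(kv[1]), reverse=True):
            print(f"{name:20s} RPN = {rpn(r)}")    # highest-RPN modes get fixed first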

  7. Quantitative Safety and Security Analysis from a Communication Perspective

    DEFF Research Database (Denmark)

    Malinowsky, Boris; Schwefel, Hans-Peter; Jung, Oliver

    2014-01-01

    This paper introduces and exemplifies a trade-off analysis of safety and security properties in distributed systems. The aim is to support analysis for real-time communication and authentication building blocks in a wireless communication scenario. By embedding an authentication scheme into a real-time communication protocol for safety-critical scenarios, we can rely on the protocol’s individual safety and security properties. The resulting communication protocol satisfies selected safety and security properties for deployment in safety-critical use-case scenarios with security requirements. We look at handover situations in an IEEE 802.11 wireless setup between mobile nodes and access points. The trade-offs involve application-layer data goodput, probability of completed handovers, and effect on usable protocol slots, to quantify the impact of security from a lower-layer communication perspective.

  8. Quantitative Safety and Security Analysis from a Communication Perspective

    Directory of Open Access Journals (Sweden)

    Boris Malinowsky

    2015-12-01

    This paper introduces and exemplifies a trade-off analysis of safety and security properties in distributed systems. The aim is to support analysis for real-time communication and authentication building blocks in a wireless communication scenario. By embedding an authentication scheme into a real-time communication protocol for safety-critical scenarios, we can rely on the protocol’s individual safety and security properties. The resulting communication protocol satisfies selected safety and security properties for deployment in safety-critical use-case scenarios with security requirements. We look at handover situations in an IEEE 802.11 wireless setup between mobile nodes and access points. The trade-offs involve application-layer data goodput, probability of completed handovers, and effect on usable protocol slots, to quantify the impact of security from a lower-layer communication perspective on the communication protocols. The results are obtained using the network simulator ns-3.

  9. NKS/SOS-1 seminar on safety analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lauridsen, K. [Risoe National Lab., Roskilde (Denmark); Anderson, K. [Karinta-Konsult (Sweden); Pulkkinen, U. [VTT Automation (Finland)

    2001-05-01

    The report describes presentations and discussions at a seminar held at Risoe on March 22-23, 2000, titled NKS/SOS-1 - Safety Analysis. It dealt with issues relevant to safety analysis across the entire nuclear safety field (notably reactors and nuclear waste repositories), such as: objectives of safety analysis, risk criteria, decision analysis, expert judgement and risk communication. In addition, one talk dealt with criteria for chemical industries in Europe. The seminar clearly showed that the concept of risk is multidimensional, which makes clarity and transparency essential elements in risk communication, and that there are issues of common concern between different applications, such as how to deal with different kinds of uncertainty and expert judgement.

  10. Automation of Safety Analysis with SysML Models Project

    Data.gov (United States)

    National Aeronautics and Space Administration — This project was a small proof-of-concept case study, generating SysML model information as a side effect of safety analysis. A prototype FMEA Assistant was...

  11. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  12. Construction safety and waste management: an economic analysis

    CERN Document Server

    Li, Rita Yi Man

    2015-01-01

    This monograph presents an analysis of construction safety problems and on-site safety measures from an economist’s point of view. The book includes examples from both emerging countries, e.g. China and India, and developed countries, e.g. Australia and Hong Kong. Moreover, the author covers an analysis of construction safety knowledge sharing by means of updatable mobile technology, such as apps on Android and iOS mobile devices. The target audience comprises primarily researchers and experts in the field, but the book may also be beneficial for graduate students.

  13. Probabilistic methods for service life predictions

    NARCIS (Netherlands)

    Siemes, A.J.M.

    1999-01-01

    Nowadays it is commonly accepted that the safety of structures should be expressed in terms of reliability, that is, as the probability of failure. In the literature [1, 2, 3, and 4] the bases have been given for the calculation of the failure probability. Making probabilistic calculations can be

  14. Probabilistic Risk Analysis and Fault Trees as Tools in Improving the Delineation of Wellhead Protection Areas: An Initial Discussion

    Science.gov (United States)

    Rodak, C. M.; Silliman, S. E.

    2010-12-01

    Delineation of a wellhead protection area (WHPA) is a critical component of managing/protecting the aquifer(s) supplying potable water to a public water-supply well. While a number of previous authors have addressed questions related to uncertainties in advective capture zones, methods for assessing WHPAs in the presence of uncertainty in the chemistry of groundwater contaminants, the relationship between land-use and contaminant sources, and the impact on health risk within the receiving population are more limited. Probabilistic risk analysis (PRA) combined with fault trees (FT) addresses this latter challenge by providing a structure whereby four key WHPA issues may be addressed: (i) uncertainty in land-use practices and chemical release, (ii) uncertainty in groundwater flow, (iii) variability in natural attenuation properties (and/or remediation) of the contaminants, and (iv) estimated health risk from contaminant arrival at a well. The potential utility of PRA-FT in this application is considered through a simplified case study involving management decisions related both to regional land-use planning and to local land-use zoning regulation. An application-specific fault tree is constructed to visualize and identify the events required for health risk failure at the well, and a Monte Carlo approach is used to create multiple realizations of groundwater flow and chemical transport to a well in a model of a simple, unconfined aquifer. Model parameters allowed to vary during this simplified case study include hydraulic conductivity, probability of a chemical spill (related to land use variation in space), and natural attenuation through variation in rate of decay of the contaminant. Numerical results are interpreted in association with multiple land-use management scenarios as well as multiple cancer risk assumptions regarding the contaminant arriving at the well. This case study shows significant variability of health risk at the well; however, general trends were
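
    The combination of a fault tree with Monte Carlo sampling can be sketched compactly. Below, the top event "health-risk failure at the well" requires a spill AND sufficient contaminant mass surviving transport; hydraulic conductivity, spill probability and decay rate are sampled, as in the case study. Every number is an illustrative placeholder, not a value from the paper.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 200_000

        spill = rng.random(n) < 0.05                # basic event: chemical release
        K = rng.lognormal(np.log(1e-4), 1.0, n)     # hydraulic conductivity, m/s
        seepage_v = K * 0.01 / 0.3                  # gradient 0.01, porosity 0.3
        travel_yrs = 500.0 / (seepage_v * 3.15e7)   # years to cover 500 m to the well
        decay = rng.uniform(0.05, 0.5, n)           # first-order attenuation, 1/yr
        surviving = np.exp(-decay * travel_yrs)     # fraction of mass reaching the well
        toxic_arrival = surviving > 1e-3            # basic event: harmful mass arrives

        top_event = spill & toxic_arrival           # AND gate of the fault tree
        print(f"P(health-risk failure at the well) ~ {top_event.mean():.2e}")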

  15. Modification of the SAS4A Safety Analysis Code for Integration with the ADAPT Discrete Dynamic Event Tree Framework.

    Energy Technology Data Exchange (ETDEWEB)

    Jankovsky, Zachary Kyle [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2017-05-01

    It is difficult to assess the consequences of a transient in a sodium-cooled fast reactor (SFR) using traditional probabilistic risk assessment (PRA) methods, as numerous safety-related systems have passive characteristics. Often there is significant dependence on the value of continuous stochastic parameters rather than binary success/failure determinations. One form of dynamic PRA uses a system simulator to represent the progression of a transient, tracking events through time in a discrete dynamic event tree (DDET). In order to function in a DDET environment, a simulator must have characteristics that make it amenable to changing physical parameters midway through the analysis. The SAS4A SFR system analysis code did not have these characteristics as received. This report describes the code modifications made to allow dynamic operation as well as the linking to a Sandia DDET driver code. A test case is briefly described to demonstrate the utility of the changes.

  16. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  17. Probabilistic machine learning and artificial intelligence

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  18. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good properties. The proof is innovative in many aspects, combining various techniques from topology and model theory.

  19. Investigation of safety analysis methods using computer vision techniques

    Science.gov (United States)

    Shirazi, Mohammad Shokrolah; Morris, Brendan Tran

    2017-09-01

    This work investigates safety analysis methods using computer vision techniques. A vision-based tracking system is developed to provide the trajectories of road users, including vehicles and pedestrians. Safety analysis methods are developed to estimate time to collision (TTC) and post-encroachment time (PET), two important safety measures. The corresponding algorithms are presented, and their advantages and drawbacks are shown through their success in capturing conflict events in real time. The performance of the tracking system is evaluated first, and probability density estimates of TTC and PET are shown for 1 h of monitoring of a Las Vegas intersection. Finally, the idea of an intersection safety map is introduced, and TTC values for two different intersections are estimated for one day from 8:00 a.m. to 6:00 p.m.
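
    The two surrogate measures reduce to simple kinematics once trajectories are available. A minimal sketch (vehicle length, positions and timestamps invented for illustration; the actual system derives them from the vision-based tracker):

        def time_to_collision(lead_pos, follow_pos, lead_v, follow_v, length=4.5):
            """TTC in seconds for two road users in the same lane.

            Positions in meters along the lane, speeds in m/s; `length` is the
            lead vehicle's length. Infinite when the follower is not closing.
            """
            gap = lead_pos - follow_pos - length
            closing_speed = follow_v - lead_v
            return gap / closing_speed if closing_speed > 0 else float("inf")

        def post_encroachment_time(t_first_leaves, t_second_arrives):
            """PET: time between the first user leaving a conflict point and
            the second user reaching it."""
            return t_second_arrives - t_first_leaves

        print(time_to_collision(50.0, 20.0, 10.0, 15.0))  # (50-20-4.5)/5 = 5.1 s
        print(post_encroachment_time(12.4, 13.1))         # 0.7 s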

  20. Safety analysis report for packaging (onsite) steel drum

    Energy Technology Data Exchange (ETDEWEB)

    McCormick, W.A.

    1998-09-29

    This Safety Analysis Report for Packaging (SARP) provides the analyses and evaluations necessary to demonstrate that the steel drum packaging system meets the transportation safety requirements of HNF-PRO-154, Responsibilities and Procedures for all Hazardous Material Shipments, for an onsite packaging containing Type B quantities of solid and liquid radioactive materials. The basic component of the steel drum packaging system is the 208 L (55-gal) steel drum.

  1. Probabilistic composition of preferences: theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preference composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insight into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis - together with explanations of the application of the concepts involved - this book provides numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also for teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of special interest to researchers in the field of Operational Research.

  2. Safety Analysis Report for the PWR Spent Fuel Canister

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Heui Joo; Choi, Jong Won; Cho, Dong Keun; Chun, Kwan Sik; Lee, Jong Youl; Kim, Seong Ki; Kim, Seong Soo; Lee, Yang

    2005-11-15

    This report outlines the results of the safety assessment of the canisters for the PWR spent fuels to be used in the KRS. All safety analyses, including criticality and radiation shielding analyses, mechanical analyses, thermal analyses, and containment analyses, were performed. The reference PWR spent fuels were of the 17x17 type with a burnup of 45,000 MWD/MTU. The canister consists of a copper outer shell and a nodular cast iron inner structure, with a diameter of 102 cm and a height of 483 cm. Criticality safety was checked for normal and abnormal conditions. For the normal condition it was assumed that the integrity of the engineered barriers is preserved and the canister is saturated with water of 1.0 g/cc. For the abnormal condition the container and bentonite were assumed to disappear, allowing the spent fuel to be surrounded by water in the most reactive configuration. The radiation shielding analysis verified that the absorbed dose at the surface of the canister met the safety limit. The structural analysis was conducted considering three load conditions: normal, extreme, and rock movement. The thermal analysis was carried out for the case in which a canister with four PWR assemblies is deposited in the repository 500 m below the surface, with 40 m tunnel spacing and 6 m deposition hole spacing. The results of the safety assessment showed that the proposed KDC-1 canister met all the safety limits.

  3. Methodological considerations with data uncertainty in road safety analysis.

    Science.gov (United States)

    Schlögl, Matthias; Stütz, Rainer

    2017-02-16

    The analysis of potential influencing factors that affect the likelihood of road accident occurrence has been of major interest to safety researchers throughout recent decades. Even though steady methodological progress was made over the years, several impediments pertaining to the statistical analysis of crash data remain. While issues related to methodological approaches have been subject to constructive discussion, uncertainties inherent in the most fundamental part of any analysis have been widely neglected: the data. This paper scrutinizes data from various sources that are commonly used in road safety studies with respect to their actual suitability for applications in this area. Issues related to spatial and temporal aspects of data uncertainty are pointed out and their implications for road safety analysis are discussed in detail. These general methodological considerations are illustrated with data from Austria, providing suggestions and methods for overcoming these obstacles. Considering these aspects is of major importance for expediting further advances in road safety data analysis and thus for increasing road safety.

  4. Measuring safety climate in acute hospitals: Rasch analysis of the safety attitudes questionnaire.

    Science.gov (United States)

    Soh, Sze-Ee; Barker, Anna; Morello, Renata; Dalton, Megan; Brand, Caroline

    2016-09-20

    The Safety Attitudes Questionnaire (SAQ) is commonly used to assess staff perception of safety climate within their clinical environment. The psychometric properties of the SAQ have previously been explored with confirmatory factor analysis and found to have some issues with construct validity. This study aimed to extend the psychometric evaluations of the SAQ by using Rasch analysis. Assessment of internal construct validity included overall fit to the Rasch model (unidimensionality), response formats, targeting, differential item functioning (DIF) and the person-separation index (PSI). A total of 420 nurses completed the SAQ (response rate 60%). Data showed overall fit to a Rasch model of expected item functioning for interval scale measurement. The questionnaire demonstrated unidimensionality, confirming the appropriateness of summing the items in each domain. Score reliabilities were appropriate (internal consistency PSI 0.6-0.8). However, participants were not using the response options on the SAQ in a consistent manner. All domains demonstrated suboptimal targeting and showed compromised score precision towards higher levels of safety climate (substantial ceiling effects). There was general support for the reliability of the SAQ as a measure of safety climate, although it may not be able to detect small but clinically important changes in safety climate within an organisation. Further refinement of the SAQ is warranted. This may involve changing the response options and including new items to improve the overall targeting of the scale. This study was registered with the Australian New Zealand Clinical Trials Registry, number ACTRN12611000332921 (21 March 2011).

  5. Risk and safety analysis of nuclear systems

    National Research Council Canada - National Science Library

    Lee, John C; McCormick, Norman J

    2011-01-01

    .... The first half of the book covers the principles of risk analysis, the techniques used to develop and update a reliability data base, the reliability of multi-component systems, Markov methods used...

  6. Risk and safety analysis of nuclear systems

    National Research Council Canada - National Science Library

    Lee, John C; McCormick, Norman J

    2011-01-01

    ...), and failure modes of systems. All of this material is general enough that it could be used in non-nuclear applications, although there is an emphasis placed on the analysis of nuclear systems...

  7. Use of Probabilistic Engineering Methods in the Detailed Design and Development Phases of the NASA Ares Launch Vehicle

    Science.gov (United States)

    Fayssal, Safie; Weldon, Danny

    2008-01-01

    The United States National Aeronautics and Space Administration (NASA) is in the midst of a space exploration program called Constellation to send crew and cargo to the International Space Station, to the moon, and beyond. As part of the Constellation program, a new launch vehicle, Ares I, is being developed by NASA Marshall Space Flight Center. Designing a launch vehicle with high reliability and increased safety requires a significant effort in understanding design variability and design uncertainty at the various levels of the design (system, element, subsystem, component, etc.) and throughout the various design phases (conceptual, preliminary design, etc.). In a previous paper [1] we discussed a probabilistic functional failure analysis approach intended mainly to support system requirements definition, system design, and element design during the early design phases. This paper provides an overview of the application of probabilistic engineering methods to support the detailed subsystem/component design and development as part of the "Design for Reliability and Safety" approach for the new Ares I Launch Vehicle. Specifically, the paper discusses probabilistic engineering design analysis cases that had major impact on the design and manufacturing of the Space Shuttle hardware. The cases represent important lessons learned from the Space Shuttle Program and clearly demonstrate the significance of probabilistic engineering analysis in better understanding design deficiencies and identifying potential design improvements for Ares I. The paper also discusses the probabilistic functional failure analysis approach applied during the early design phases of Ares I and the forward plans for probabilistic design analysis in the detailed design and development phases.

  8. Safety analysis and review system (SARS) assessment report

    Energy Technology Data Exchange (ETDEWEB)

    Browne, E.T.

    1981-03-01

    Under DOE Order 5481.1, Safety Analysis and Review System for DOE Operations, safety analyses are required for DOE projects in order to ensure that: (1) potential hazards are systematically identified; (2) potential impacts are analyzed; (3) reasonable measures have been taken to eliminate, control, or mitigate the hazards; and (4) there is documented management authorization of the DOE operation based on an objective assessment of the adequacy of the safety analysis. This report is intended to provide the DOE Office of Plans and Technology Assessment (OPTA) with an independent evaluation of the adequacy of the ongoing safety analysis effort. As part of this effort, a number of site visits and interviews were conducted, and FE SARS documents were reviewed. The latter included SARS Implementation Plans for a number of FE field offices, as well as safety analysis reports completed for certain FE operations. This report summarizes SARS related efforts at the DOE field offices visited and evaluates the extent to which they fulfill the requirements of DOE 5481.1.

  9. Modelling Active Faults in Probabilistic Seismic Hazard Analysis (PSHA) with OpenQuake: Definition, Design and Experience

    Science.gov (United States)

    Weatherill, Graeme; Garcia, Julio; Poggi, Valerio; Chen, Yen-Shin; Pagani, Marco

    2016-04-01

    The Global Earthquake Model (GEM) has, since its inception in 2009, made many contributions to the practice of seismic hazard modeling in different regions of the globe. The OpenQuake-engine (hereafter referred to simply as OpenQuake), GEM's open-source software for calculation of earthquake hazard and risk, has found application in many countries, spanning a diversity of tectonic environments. GEM itself has produced a database of national and regional seismic hazard models, harmonizing into OpenQuake's own definition the varied seismogenic sources found therein. The characterization of active faults in probabilistic seismic hazard analysis (PSHA) is at the centre of this process, motivating many of the developments in OpenQuake and presenting hazard modellers with the challenge of reconciling seismological, geological and geodetic information for the different regions of the world. Faced with these challenges, and from the experience gained in the process of harmonizing existing models of seismic hazard, four critical issues are addressed. The challenge GEM has faced in the development of software is how to define a representation of an active fault (both in terms of geometry and earthquake behaviour) that is sufficiently flexible to adapt to different tectonic conditions and levels of data completeness. By exploring the different fault typologies supported by OpenQuake we illustrate how seismic hazard calculations can, and do, take into account complexities such as geometrical irregularity of faults in the prediction of ground motion, highlighting some of the potential pitfalls and inconsistencies that can arise. This exploration leads to the second main challenge in active fault modeling, what elements of the fault source model impact most upon the hazard at a site, and when does this matter? Through a series of sensitivity studies we show how different configurations of fault geometry, and the corresponding characterisation of near-fault phenomena (including

  10. Safety analysis of the existing 851 Firing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Odell, B.N.

    1986-06-05

    A safety analysis was performed to determine if normal operations and/or potential accidents at the 851 Firing Facility at Site 300 could present undue hazards to the general public, personnel at Site 300, or have an adverse effect on the environment. The normal operations and credible accidents that might have an effect on these facilities or have off-site consequences were considered. It was determined by this analysis that all but two of the hazards were either low or of the type or magnitude routinely encountered and/or accepted by the public. The exceptions were the linear accelerator and explosives, which were classified as moderate hazards per the requirements given in DOE Order 5481.1A. This safety analysis concluded that the operation at this facility will present no undue risk to the health and safety of LLNL employees or the public.

  11. Safety analysis of the existing 850 Firing Facility

    Energy Technology Data Exchange (ETDEWEB)

    Odell, B.N.

    1986-06-05

    A safety analysis was performed to determine if normal operations and/or potential accidents at the 850 Firing Facility at Site 300 could present undue hazards to the general public, personnel at Site 300, or have an adverse effect on the environment. The normal operations and credible accidents that might have an effect on these facilities or have off-site consequences were considered. It was determined by this analysis that all but one of the hazards were either low or of the type or magnitude routinely encountered and/or accepted by the public. The exception was explosives, which was classified as a moderate hazard per the requirements given in DOE Order 5481.1A. This safety analysis concluded that the operation at this facility will present no undue risk to the health and safety of LLNL employees or the public.

  12. System safety analysis of an autonomous mobile robot

    Energy Technology Data Exchange (ETDEWEB)

    Bartos, R.J.

    1994-08-01

    Analysis of the safety of operating and maintaining the Stored Waste Autonomous Mobile Inspector (SWAMI) II in a hazardous environment at the Fernald Environmental Management Project (FEMP) was completed. The SWAMI II is a version of a commercial robot, the HelpMate™ robot produced by the Transitions Research Corporation, which is being updated to incorporate the systems required for inspecting mixed toxic chemical and radioactive waste drums at the FEMP. It also has modified obstacle detection and collision avoidance subsystems. The robot will autonomously travel down the aisles in storage warehouses to record images of containers and collect other data, which are transmitted to an inspector at a remote computer terminal. A previous study showed that the SWAMI II is economically feasible. The SWAMI II will locate radioactive contamination more accurately than human inspectors. This thesis includes a System Safety Hazard Analysis and a quantitative Fault Tree Analysis (FTA). The objectives of the analyses are to prevent potentially serious events and to derive a comprehensive set of safety requirements against which the safety of the SWAMI II and other autonomous mobile robots can be evaluated. The Computer-Aided Fault Tree Analysis (CAFTA©) software is utilized for the FTA. The FTA shows that more than 99% of the safety risk occurs during maintenance, and that when the derived safety requirements are implemented the rate of serious events is reduced to below one event per million operating hours. Training and procedures in SWAMI II operation and maintenance provide an added safety margin. This study will promote the safe use of the SWAMI II and other autonomous mobile robots in the emerging technology of mobile robotic inspection.

  13. Advanced analysis and design for fire safety of steel structures

    CERN Document Server

    Li, Guoqiang

    2013-01-01

    Advanced Analysis and Design for Fire Safety of Steel Structures systematically presents the latest findings on behaviours of steel structural components in a fire, such as the catenary actions of restrained steel beams, the design methods for restrained steel columns, and the membrane actions of concrete floor slabs with steel decks. Using a systematic description of structural fire safety engineering principles, the authors illustrate the important difference between behaviours of an isolated structural element and the restrained component in a complete structure under fire conditions. The book will be an essential resource for structural engineers who wish to improve their understanding of steel buildings exposed to fires. It is also an ideal textbook for introductory courses in fire safety for master’s degree programs in structural engineering, and is excellent reading material for final-year undergraduate students in civil engineering and fire safety engineering. Furthermore, it successfully bridges th...

  14. A probabilistic approach to the dynamic analysis of ducts subjected to multibase harmonic and random excitation [for Space Shuttle Main Engine]

    Science.gov (United States)

    Debchaudhury, Amit; Rajagopal, K. R.; Ho, H.; Newell, J. F.

    1990-01-01

    The dynamic behavior of the discharge duct of the high-pressure oxidizer turbopump of a cryogenic rocket motor is investigated analytically. The probabilistic analysis program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress; Cruse et al., 1988) is used to treat the uncertainties due to random and harmonic excitation (e.g., pump noise, pump-induced harmonics, and combustion noise), variations in engine inlet pressure, and changes in system damping. The load modeling procedure, the variation in power-spectral density in different zones of the engine structure, and the dynamic structural-analysis technique are described, and the numerical results of the NESSUS analysis are presented in extensive tables and graphs and discussed in detail.

  15. Probabilistic sensitivity analysis of two suspension bridges in Istanbul, Turkey to near- and far-fault ground motion

    Directory of Open Access Journals (Sweden)

    Ö. Çavdar

    2012-02-01

    The aim of this paper is to compare the near-fault and far-fault ground motion effects on the probabilistic sensitivity dynamic responses of two suspension bridges in Istanbul. Two different types of suspension bridges are selected to investigate the near-fault (NF) and far-fault (FF) ground motion effects on the bridge sensitivity responses. NF and FF strong ground motion records of the Kocaeli (1999) earthquake, which have approximately identical peak ground accelerations, are selected for the analyses. Displacements and internal forces are determined using the probabilistic sensitivity method (PSM), which is one type of stochastic finite element method. The efficiency and accuracy of the proposed algorithm are validated by comparison with results of the Monte Carlo Simulation (MCS) method. The displacements and internal forces obtained from the analyses of the suspension bridges subjected to each fault effect are compared with each other. It is clearly seen that there is more seismic demand on displacements and internal forces when the suspension bridges are subjected to NF and FF ground motion.

  16. Socioeconomic Considerations in Dam Safety Risk Analysis.

    Science.gov (United States)

    1987-06-01

    techniques are compared. The application of economic principles to the analysis of water projects is, of course, well delineated ... anticipated direct losses of 25 percent or more of the community's stock of residential and commercial capital ... which destroy a sizable percentage of the region's industrial

  17. Solving the Problem of Multiple-Criteria Building Design Decisions with respect to the Fire Safety of Occupants: An Approach Based on Probabilistic Modelling

    Directory of Open Access Journals (Sweden)

    Egidijus Rytas Vaidogas

    2015-01-01

    The design of buildings may include a comparison of alternative architectural and structural solutions, which can be developed at different levels of the design process. The alternative design solutions are compared and ranked by applying methods of multiple-criteria decision-making (MCDM). Each design is characterised by a number of criteria used in an MCDM problem. The paper discusses how to choose MCDM criteria expressing fire safety related to alternative designs. The probability of a successful evacuation of occupants from a building fire and the difference between evacuation time and time to untenable conditions are suggested as the most important criteria related to fire safety. These two criteria are treated as uncertain quantities expressed by probability distributions, and Monte Carlo simulation of fire and evacuation processes is a natural means for estimating these distributions. The presence of uncertain criteria requires applying stochastic MCDM methods for ranking alternative designs. An application of the safety-related criteria is illustrated by an example which analyses three alternative architectural floor plans prepared for the reconstruction of a medical building. An MCDM method based on stochastic simulation is used to solve the example problem.
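
    The two suggested criteria can be estimated by exactly the kind of Monte Carlo simulation the abstract describes. In the sketch below the evacuation-time and untenable-conditions distributions are invented lognormals, standing in for the outputs of fire and evacuation simulators:

        import numpy as np

        rng = np.random.default_rng(8)
        n = 50_000

        evac_time = rng.lognormal(np.log(180), 0.3, n)  # s, time to evacuate (RSET)
        untenable = rng.lognormal(np.log(300), 0.4, n)  # s, time to untenable conditions (ASET)

        margin = untenable - evac_time
        print(f"criterion 1, P(successful evacuation) = {np.mean(margin > 0):.3f}")
        print(f"criterion 2, median time margin = {np.median(margin):.0f} s "
              f"(5th percentile {np.percentile(margin, 5):.0f} s)")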

  18. The geography of patient safety: a topical analysis of sterility.

    Science.gov (United States)

    Mesman, Jessica

    2009-12-01

    Many studies on patient safety are geared towards prevention of adverse events by eliminating causes of error. In this article, I argue that patient safety research needs to widen its analytical scope and include causes of strength as well. This change of focus enables me to ask other questions, like why don't things go wrong more often? Or, what is the significance of time and space for patient safety? The focal point of this article is on the spatial dimension of patient safety. To gain insight into the 'geography' of patient safety and perform a topical analysis, I will focus on one specific kind of space (sterile space), one specific medical procedure (insertion of an intravenous line) and one specific medical ward (neonatology). Based on ethnographic data from research in the Netherlands, I demonstrate how spatial arrangements produce sterility and how sterility work produces spatial orders at the same time. Detailed analysis shows how a sterile line insertion involves the convergence of spatially distributed resources, relocations of the field of activity, an assemblage of an infrastructure of attention, a specific compositional order of materials, and the scaling down of one's degree of mobility. Sterility, I will argue, turns out to be a product of spatial orderings. Simultaneously, sterility work generates particular spatial orders, like open and restricted areas, by producing buffers and boundaries. However, the spatial order of sterility intersects with the spatial order of other lines of activity. Insight into the normative structure of these co-existing spatial orders turns out to be crucial for patient safety. By analyzing processes of spatial fine-tuning in everyday practice, it becomes possible to identify spatial competences and circumstances that enable staff members to provide safe health care. As such, a topical analysis offers an alternative perspective of patient safety, one that takes into account its spatial dimension.

  19. Preliminary safety analysis for key design features of KALIMER-600

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Y. B.; Chang, W. P.; Suk, S. D.; Ha, K. S.; Jeong, H. Y.; Heo, S

    2004-03-01

    KAERI is developing the conceptual design of a Liquid Metal Reactor, KALIMER-600 (Korea Advanced LIquid MEtal Reactor), under the Long-term Nuclear R and D Program. KALIMER-600 addresses key issues regarding future nuclear power plants such as plant safety, economics, proliferation, and waste. In this report, key safety design features are described and safety analysis results for typical ATWS accidents in the KALIMER design with a breakeven core are presented. First, the basic approach to achieve the safety goal is introduced in Chapter 1, and the event categorization and acceptance criteria for the KALIMER-600 safety analysis are described in Chapter 2. In Chapter 3, results of inherent safety evaluations for the KALIMER-600 conceptual design are presented. The KALIMER-600 core and plant system are designed to assure benign performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated Anticipated Transient Without Scram (ATWS) have been performed using the SSC-K code to investigate the KALIMER-600 system response to the events. They are categorized as Bounding Events (BEs) because of their low probability of occurrence. Chapter 4 presents the analysis of flow blockage for KALIMER-600 with the MATRA-LMR-FB code, which has been developed for internal flow blockage in an LMR subassembly. Cases with blockages of 6, 24, and 54 subchannels are analyzed. The performance analysis of the KALIMER-600 containment and some evaluations of the behavior during an HCDA will be performed later.

  20. Improvement of auditing technology of safety analysis through thermal-hydraulic separate effect tests

    Energy Technology Data Exchange (ETDEWEB)

    No, Hee Cheon; Park, Hyun Sik; Kim, Hyougn Tae; Moon, Young Min; Choi, Sung Won; Heo, Sun [Korea Advanced Institute Science and Technology, Taejon (Korea, Republic of)

    1999-04-15

    The loss-of-RHR accident during midloop operation has been shown to be important by probabilistic safety analyses. The condensation models in RELAP5/MOD3 are not adequate for analyzing midloop operation. To audit and improve the models in RELAP5/MOD3.2, several separate effect tests have been performed. Twenty-nine sets of reflux condensation data were obtained, and a correlation was developed from these heat transfer coefficient data. For the experiment on direct contact condensation in the hot leg, the apparatus setup is finished and initial experimental data have been obtained. A non-iterative model is used in place of the model in RELAP5/MOD3.2 for the reflux condensation results and performs better than the present model. The direct contact condensation results in the hot leg are similar to those of the present model. The study of CCF and liquid entrainment in the surge line and pressurizer has been selected as the third separate effect test and is in progress.

  1. Probabilistic transmission system planning

    CERN Document Server

    Li, Wenyuan

    2011-01-01

    "The book is composed of 12 chapters and three appendices, and can be divided into four parts. The first part includes Chapters 2 to 7, which discuss the concepts, models, methods and data in probabilistic transmission planning. The second part, Chapters 8 to 11, addresses four essential issues in probabilistic transmission planning applications using actual utility systems as examples. Chapter 12, as the third part, focuses on a special issue, i.e. how to deal with uncertainty of data in probabilistic transmission planning. The fourth part consists of three appendices, which provide the basic knowledge in mathematics for probabilistic planning. Please refer to the attached table of contents which is given in a very detailed manner"--

  2. Consistent Probabilistic Social Choice

    OpenAIRE

    Brandl, Florian; Brandt, Felix; Seedig, Hans Georg

    2015-01-01

    Two fundamental axioms in social choice theory are consistency with respect to a variable electorate and consistency with respect to components of similar alternatives. In the context of traditional non-probabilistic social choice, these axioms are incompatible with each other. We show that in the context of probabilistic social choice, these axioms uniquely characterize a function proposed by Fishburn (Rev. Econ. Stud., 51(4), 683--692, 1984). Fishburn's function returns so-called maximal lo...

  3. Methodological Development of the Probabilistic Model of the Safety Assessment of Hontomin P.D.T.; Desarrollo Metodologico del Modelo Probabilista de Evaluacion de Seguridad de la P.D.T. de Hontomin

    Energy Technology Data Exchange (ETDEWEB)

    Hurtado, A.; Eguilior, S.; Recreo, F.

    2011-06-07

    In the framework of CO{sub 2} Capture and Geological Storage, Risk Analysis plays an important role, because it is an essential input to the definition and planning of local, national and supranational carbon injection strategies. Each project carries a risk of failure and should, even from the early stages, take into account the possible causes of this risk and propose corrective measures along the process, i.e., manage risk. Proper risk management reduces the negative consequences arising from the project. The main method of reducing or neutralising risk is the identification, measurement and evaluation of it, together with the development of decision rules. This report presents the developed methodology for risk analysis and the results of its application. The risk assessment requires determination of the random variables that will influence the functioning of the system. It is very difficult to set up the probability distribution of a random variable in the classical sense (objective probability) when a particular event has rarely occurred or is still incompletely developed. In this situation, we have to determine a subjective probability, especially at an early stage of a project, when we do not have enough information about the system. This subjective probability is constructed from expert judgement assessing the possibility that certain random events could occur, depending on the geological features of the area of application. The proposed methodology is based on the application of Bayesian Probabilistic Networks for estimating the probability of leakage risk. These probabilistic networks graphically define dependence relations between the variables and represent the joint probability function through a local factorisation of probability functions. (Author) 98 refs.
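
    The local factorisation such a network relies on can be shown with a hand-rolled three-node example; the variables, states and probabilities below are invented placeholders for the expert-elicited values.

```python
# Hand-rolled three-node Bayesian network for a leakage-risk estimate:
# P(leak) = sum over seal S and fault F of P(S) P(F) P(leak | S, F).
# States and probabilities are invented placeholders for expert judgement.
p_seal_poor = 0.10        # P(seal = poor)
p_fault_active = 0.05     # P(fault = active)
p_leak = {                # P(leak | seal, fault)
    ("good", "inactive"): 0.001,
    ("good", "active"):   0.02,
    ("poor", "inactive"): 0.05,
    ("poor", "active"):   0.30,
}

total = 0.0
for seal, p_s in (("poor", p_seal_poor), ("good", 1.0 - p_seal_poor)):
    for fault, p_f in (("active", p_fault_active), ("inactive", 1.0 - p_fault_active)):
        total += p_s * p_f * p_leak[(seal, fault)]   # local factorisation

print(f"marginal leakage probability: {total:.4f}")
```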

  4. A system-of-systems framework of Nuclear Power Plant Probabilistic Seismic Hazard Analysis by Fault Tree analysis and Monte Carlo simulation

    OpenAIRE

    Ferrario, Elisa; Zio, Enrico

    2012-01-01

    We propose a quantitative safety analysis of a critical plant with respect to the occurrence of an earthquake, extending the envelope of the study to the interdependent infrastructures which are connected to it in a "system-of-systems"-like fashion. As a mock-up case study, we consider the impacts produced on a nuclear power plant (the critical plant) embedded in the connected power and water distribution, and transportation networks which support its operation. The ...

  5. Safety Issues with Hydrogen as a Vehicle Fuel

    Energy Technology Data Exchange (ETDEWEB)

    L. C. Cadwallader; J. S. Herring

    1999-09-01

    This report is an initial effort to identify and evaluate safety issues associated with the use of hydrogen as a vehicle fuel in automobiles. Several forms of hydrogen have been considered: gas, liquid, slush, and hydrides. The safety issues have been discussed, beginning with properties of hydrogen and the phenomenology of hydrogen combustion. Safety-related operating experiences with hydrogen vehicles have been summarized to identify concerns that must be addressed in future design activities and to support probabilistic risk assessment. Also, applicable codes, standards, and regulations pertaining to hydrogen usage and refueling have been identified and are briefly discussed. This report serves as a safety foundation for any future hydrogen safety work, such as a safety analysis or a probabilistic risk assessment.


  7. Model Based Safety Analysis with smartIflow †

    Directory of Open Access Journals (Sweden)

    Philipp Hönig

    2017-01-01

    Verification of safety requirements is one important task during the development of safety-critical systems. The increasing complexity of systems makes manual analysis almost impossible. This paper introduces a new methodology for formal verification of technical systems with smartIflow (State Machines for Automation of Reliability-related Tasks using Information FLOWs). smartIflow is a new modeling language that has been especially designed for the purpose of automating the safety analysis process in early product life cycle stages. It builds on experience with existing approaches. As is common practice in current approaches, components are modeled as finite state machines. However, new concepts are introduced to describe component interactions. Events play a major role in internal interactions between components as well as in external (user) interactions. Our approach to the verification of formally specified safety requirements is a two-step method. First, an exhaustive simulation creates knowledge about a great variety of possible behaviors of the system, especially including reactions to suddenly occurring (possibly intermittent) faults. In the second step, safety requirements specified in CTL (Computation Tree Logic) are verified using model checking techniques, and counterexamples are generated if these are not satisfied. The practical applicability of this approach is demonstrated based on a Java implementation using a simple Two-Tank-Pump-Consumer system.
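
    A toy version of the two-step idea (exhaustive exploration with fault injection, then an invariant check with counterexample generation) might look as follows; the pump/tank model and the property are invented and far simpler than smartIflow's component state machines.

```python
# Toy two-step verification: exhaustively explore the state space of a tiny
# pump/tank model (with an injectable pump fault), then check the invariant
# AG(no overflow) as plain reachability, returning a counterexample path.
from collections import deque

def step(state):
    level, pump_ok = state
    if pump_ok:
        return [(min(level + 1, 3), True),   # pump keeps filling the tank
                (level, False)]              # suddenly occurring pump fault
    return [(max(level - 1, 0), False)]      # faulty pump: tank drains

init = (0, True)
bad = lambda s: s[0] >= 3                    # "overflow" violates the requirement

seen, frontier, counterexample = {init}, deque([(init, [init])]), None
while frontier:
    state, path = frontier.popleft()
    if bad(state):
        counterexample = path
        break
    for nxt in step(state):
        if nxt not in seen:
            seen.add(nxt)
            frontier.append((nxt, path + [nxt]))

print("AG(no overflow) holds" if counterexample is None
      else f"violated; counterexample: {counterexample}")
```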

  8. 14 CFR 417.213 - Flight safety limits analysis.

    Science.gov (United States)

    2010-01-01

    14 CFR 417.213, Aeronautics and Space, Commercial Space Transportation, Federal Aviation Administration: Flight safety limits analysis. ... launch vehicle's flight to prevent the hazardous effects of the resulting debris impacts from reaching...

  9. SAFETY ANALYSIS METHODOLOGY FOR AGED CANDU® 6 NUCLEAR REACTORS

    Directory of Open Access Journals (Sweden)

    WOLFGANG HARTMANN

    2013-10-01

    This paper deals with the Safety Analysis for CANDU® 6 nuclear reactors as affected by main Heat Transport System (HTS) aging. Operational and aging-related changes of the HTS throughout its lifetime may lead to restrictions in certain safety system settings and hence some restriction in performance under certain conditions. A step in confirming safe reactor operation is the tracking of relevant data and their corresponding interpretation by the use of appropriate thermalhydraulic analytic models. Safety analyses ranging from the assessment of safety limits associated with the prevention of intermittent fuel sheath dryout for a slow Loss of Regulation (LOR) analysis to fission gas release after a fuel failure are summarized. Specifically for fission gas release, the thermalhydraulic analysis for a fresh core and an 11 Effective Full Power Years (EFPY) aged core is summarized, leading to the most severe stagnation break sizes for the inlet feeder break and the channel failure time. Associated coolant conditions provide the input data for fuel analyses. Based on the thermalhydraulic data, the fission product inventory under normal operating conditions may be calculated for both fresh and aged cores, and the fission gas release may be evaluated during the transient. This analysis plays a major role in determining possible radiation doses to the public after postulated accidents have occurred.

  10. QuantUM: Quantitative Safety Analysis of UML Models

    Directory of Open Access Journals (Sweden)

    Florian Leitner-Fischer

    2011-07-01

    When developing a safety-critical system it is essential to obtain an assessment of different design alternatives. In particular, an early safety assessment of the architectural design of a system is desirable. In spite of the plethora of available formal quantitative analysis methods, it is still difficult for software and system architects to integrate these techniques into their everyday work. This is mainly due to the lack of methods that can be directly applied to architecture-level models, for instance given as UML diagrams. Also, it is necessary that the description methods used do not require a profound knowledge of formal methods. Our approach bridges this gap and improves the integration of quantitative safety analysis methods into the development process. All inputs of the analysis are specified at the level of a UML model. This model is then automatically translated into the analysis model, and the results of the analysis are consequently represented on the level of the UML model. Thus the analysis model and the formal methods used during the analysis are hidden from the user. We illustrate the usefulness of our approach using an industrial strength case study.

  11. PAT-1 safety analysis report addendum.

    Energy Technology Data Exchange (ETDEWEB)

    Weiner, Ruth F.; Schmale, David T.; Kalan, Robert J.; Akin, Lili A.; Miller, David Russell; Knorovsky, Gerald Albert; Yoshimura, Richard Hiroyuki; Lopez, Carlos; Harding, David Cameron; Jones, Perry L.; Morrow, Charles W.

    2010-09-01

    The Plutonium Air Transportable Package, Model PAT-1, is certified under Title 10, Code of Federal Regulations Part 71 by the U.S. Nuclear Regulatory Commission (NRC) per Certificate of Compliance (CoC) USA/0361B(U)F-96 (currently Revision 9). The purpose of this SAR Addendum is to incorporate plutonium (Pu) metal as a new payload for the PAT-1 package. The Pu metal is packed in an inner container (designated the T-Ampoule) that replaces the PC-1 inner container. The documentation and results from analysis contained in this addendum demonstrate that the replacement of the PC-1 and associated packaging material with the T-Ampoule and associated packaging, together with the addition of the plutonium metal content, is not significant with respect to the design, operating characteristics, or safe performance of the containment system and prevention of criticality when the package is subjected to the tests specified in 10 CFR 71.71, 71.73 and 71.74.

  12. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    Science.gov (United States)

    Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA.

  13. Preliminary safety analysis for key design features of KALIMER

    Energy Technology Data Exchange (ETDEWEB)

    Hahn, D. H.; Kwon, Y. M.; Chang, W. P.; Suk, S. D.; Lee, S. O.; Lee, Y. B.; Jeong, K. S

    2000-07-01

    KAERI is currently developing the conceptual design of a liquid metal reactor, KALIMER (Korea Advanced Liquid Metal Reactor), under the long-term nuclear R and D program. In this report, descriptions of the KALIMER safety design features and safety analysis results for selected ATWS accidents are presented. First, the basic approach to achieve the safety goal is introduced in chapter 1, and the safety evaluation procedure for the KALIMER design is described in chapter 2. It includes event selection, event categorization, description of design basis events, and beyond design basis events. In chapter 3, results of inherent safety evaluations for the KALIMER conceptual design are presented. The KALIMER core and plant system are designed to assure design performance during a selected set of events without either reactor control or protection system intervention. Safety analyses for the postulated anticipated transient without scram (ATWS) have been performed to investigate the KALIMER system response to the events. They are categorized as bounding events (BEs) because of their low probability of occurrence. In chapter 4, the design of the KALIMER containment dome and the results of its performance analysis are presented. The designs of the existing LMR containment and the KALIMER containment dome have been compared in this chapter. The procedure of the containment performance analysis and the analysis results are described along with the accident scenario and source terms. Finally, a simple methodology is introduced to investigate the core kinetics and hydraulic behavior during an HCDA in chapter 5. Mathematical formulations have been developed in the framework of the modified Bethe-Tait method, and scoping analyses have been performed for the KALIMER core behavior during super-prompt critical excursions.

  14. Tritium Research Laboratory safety analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Wright, D.A.

    1979-03-01

    Design and operational philosophy has been evolved to keep radiation exposures to personnel and radiation releases to the environment as low as reasonably achievable. Each experiment will be doubly contained in a glove box and will be limited to 10 grams of tritium gas. Specially designed solid-hydride storage beds may be used to store temporarily up to 25 grams of tritium in the form of tritides. To evaluate possible risks to the public or the environment, a review of the Sandia Laboratories Livermore (SLL) site was carried out. Considered were location, population, land use, meteorology, hydrology, geology, and seismology. The risks and the extent of damage to the TRL and vital systems were evaluated for flooding, lightning, severe winds, earthquakes, explosions, and fires. All of the natural phenomena and human error accidents were considered credible, although the extent of potential damage varied. However, rather than address the myriad of specific individual consequences of each accident scenario, a worst-case tritium release caused indirectly by an unspecified natural phenomenon or human error was evaluated. The maximum credible radiological accident is postulated to result from the release of the maximum quantity of gas from one experiment. Thus 10 grams of tritium gas was used in the analysis to conservatively estimate the maximum whole-body dose of 1 rem at the site boundary and a maximum population dose of 600 man-rem. Accidental release of this amount of tritium implies simultaneous failure of two doubly contained systems, an occurrence considered not credible. Nuclear criticality is impossible in this facility. Based upon the analyses performed for this report, we conclude that the Tritium Research Laboratory can be operated without undue risk to employees, the general public, or the environment. (ERB)

  15. Safety analysis report for packaging (onsite) multicanister overpack cask

    Energy Technology Data Exchange (ETDEWEB)

    Edwards, W.S.

    1997-07-14

    This safety analysis report for packaging (SARP) documents the safety of shipments of irradiated fuel elements in the Multicanister Overpack (MCO) and MCO Cask for a highway route controlled quantity, Type B fissile package. This SARP evaluates the package during transfers of (1) water-filled MCOs from the K Basins to the Cold Vacuum Drying Facility (CVDF) and (2) sealed and cold vacuum dried MCOs from the CVDF in the 100 K Area to the Canister Storage Building in the 200 East Area.

  16. Fuel Storage Facility Final Safety Analysis Report. Revision 1

    Energy Technology Data Exchange (ETDEWEB)

    Linderoth, C.E.

    1984-03-01

    The Fuel Storage Facility (FSF) is an integral part of the Fast Flux Test Facility. Its purpose is to provide long-term storage (20-year design life) for spent fuel core elements used to provide the fast flux environment in FFTF, and for test fuel pins, components and subassemblies that have been irradiated in the fast flux environment. This Final Safety Analysis Report (FSAR) and its supporting documentation provides a complete description and safety evaluation of the site, the plant design, operations, and potential accidents.

  17. Worker Safety and Health and Nuclear Safety Quarterly Performance Analysis (January - March 2008)

    Energy Technology Data Exchange (ETDEWEB)

    Kerr, C E

    2009-10-07

    The DOE Office of Enforcement expects LLNL to 'implement comprehensive management and independent assessments that are effective in identifying deficiencies and broader problems in safety and security programs, as well as opportunities for continuous improvement within the organization' and to 'regularly perform assessments to evaluate implementation of the contractor's processes for screening and internal reporting.' LLNL has a self-assessment program, described in ES&H Manual Document 4.1, that includes line, management and independent assessments. LLNL also has in place a process to identify and report deficiencies of nuclear, worker safety and health and security requirements. In addition, the DOE Office of Enforcement expects LLNL to evaluate 'issues management databases to identify adverse trends, dominant problem areas, and potential repetitive events or conditions' (page 14, DOE Enforcement Process Overview, December 2007). LLNL requires that all worker safety and health and nuclear safety noncompliances be tracked as 'deficiencies' in the LLNL Issues Tracking System (ITS). Data from the ITS are analyzed for worker safety and health (WSH) and nuclear safety noncompliances that may meet the threshold for reporting to the DOE Noncompliance Tracking System (NTS). This report meets the expectations defined by the DOE Office of Enforcement to review the assessments conducted by LLNL, analyze the issues and noncompliances found in these assessments, and evaluate the data in the ITS database to identify adverse trends, dominant problem areas, and potential repetitive events or conditions. The report attempts to answer three questions: (1) Is LLNL evaluating its programs and state of compliance? (2) What is LLNL finding? (3) Is LLNL appropriately managing what it finds? The analysis in this report focuses on data from the first quarter of 2008 (January through March).

  18. Software safety analysis activities during software development phases of the Microwave Limb Sounder (MLS)

    Science.gov (United States)

    Shaw, Hui-Yin; Sherif, Joseph S.

    2004-01-01

    This paper describes the MLS software safety analysis (SSA) activities and documents the SSA results. The scope of this software safety effort is consistent with the MLS system safety definition and is concentrated on the software faults and hazards that may have an impact on personnel safety and environmental safety.

  19. SAFETY

    CERN Multimedia

    Niels Dupont

    2013-01-01

    CERN Safety rules and Radiation Protection at CMS The CERN Safety rules are defined by the Occupational Health & Safety and Environmental Protection Unit (HSE Unit), CERN’s institutional authority and central Safety organ attached to the Director General. In particular the Radiation Protection group (DGS-RP1) ensures that personnel on the CERN sites and the public are protected from potentially harmful effects of ionising radiation linked to CERN activities. The RP Group fulfils its mandate in collaboration with the CERN departments owning or operating sources of ionising radiation and having the responsibility for Radiation Safety of these sources. The specific responsibilities concerning "Radiation Safety" and "Radiation Protection" are delegated as follows: Radiation Safety is the responsibility of every CERN Department owning radiation sources or using radiation sources put at its disposition. These Departments are in charge of implementing the requi...

  20. Overview of Key Computer Codes for the PGSFR Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chang, Won-Pyo; Lee, Kwi-Lim; Yoo, Jaewoon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2016-10-15

    The engineering project for licensing and construction of the PGSFR (Prototype Generation-IV Sodium-cooled Fast Reactor) was launched in 2012. Efficient electricity generation as well as a high level of safety is a requirement for the PGSFR design. In this context, safety analysis is a key concern for the PGSFR-specific design. In this regard, the present manuscript is aimed at sharing knowledge on the PGSFR safety analysis with concerned individuals or organizations for mutual understanding and collaboration. It first introduces the overall characteristics of the PGSFR design, and then describes the accident classification with acceptance criteria, highlights of the safety analysis computer codes, and a discussion of the covering ranges and availability of the codes. MARS-LMR has a wide range of applicability to accident analyses for an integrated system. SAS4A/SASSYS-1 also has the capability to model a system, but its models are addressed more to fuel failures during the initiating phase of an HCDA. On the other hand, codes such as MATRA-LMR/FB, SWAAM-II, and CONTAIN-LMR have specific purposes and limited applications, while ORIGEN-2, ISFRA, and MACCS-II are used for PSA purposes. A code which can analyze the molten core progression after assembly duct failure is not available at the present time.

  1. Human reliability analysis in the fire PSA. Application of NUREG-1921; Analisis de Fiabilidad Humana en el APS de Incendios. Aplicacion del NUREG-1921

    Energy Technology Data Exchange (ETDEWEB)

    Perez Torres, J. L.; Celaya Meler, M. A.

    2014-07-01

    A human reliability analysis in a fire probabilistic safety analysis (PSA) aims to identify, describe, analyze and quantify, in a traceable manner, the human actions that can affect the mitigation of an initiating event caused by a fire. (Author)

  2. Evolution of Safety Analysis to Support New Exploration Missions

    Science.gov (United States)

    Thrasher, Chard W.

    2008-01-01

    NASA is currently developing the Ares I launch vehicle as a key component of the Constellation program which will provide safe and reliable transportation to the International Space Station, back to the moon, and later to Mars. The risks and costs of the Ares I must be significantly lowered, as compared to other manned launch vehicles, to enable the continuation of space exploration. It is essential that safety be significantly improved, and cost-effectively incorporated into the design process. This paper justifies early and effective safety analysis of complex space systems. Interactions and dependences between design, logistics, modeling, reliability, and safety engineers will be discussed to illustrate methods to lower cost, reduce design cycles and lessen the likelihood of catastrophic events.

  3. Style, content and format guide for writing safety analysis documents. Volume 1, Safety analysis reports for DOE nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    1994-06-01

    The purpose of Volume 1 of this 4-volume style guide is to furnish guidelines on writing and publishing Safety Analysis Reports (SARs) for DOE nuclear facilities at Sandia National Laboratories. The scope of Volume 1 encompasses not only the general guidelines for writing and publishing, but also the prescribed topics/appendices contents along with examples from typical SARs for DOE nuclear facilities.

  4. Further development of probabilistic analysis method for lifetime determination of piping and vessels. Final report; Weiterentwicklung probabilistischer Analysemethoden zur Lebensdauerbestimmung von Rohrleitungen und Behaeltern. Abschlussbericht

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, K.; Grebner, H.; Sievers, J.

    2013-07-15

    Within the framework of research project RS1196, the computer code PROST (Probabilistic Structure Calculation) for the quantitative evaluation of the structural reliability of pipe components has been further developed. Models were provided and tested for the consideration of the damage mechanism 'stable crack growth' to determine leak and break probabilities in cylindrical structures of ferritic and austenitic reactor steels. These models are now available in addition to the models for the damage mechanisms 'fatigue' and 'corrosion'. Moreover, a crack initiation model has been established to supplement the treatment of initial cracks. Furthermore, the application range of the code was extended to the calculation of the growth of wall-penetrating cracks. This is important for surface cracks growing until the formation of a stable leak. The calculation of the growth of the wall-penetrating crack until break occurs improves the estimation of the break probability. For this purpose, program modules were developed to calculate stress intensity factors and critical crack lengths for wall-penetrating cracks. In the frame of this work, PROST was restructured, including the possibility of combining damage mechanisms during a calculation. Furthermore, several additional fatigue crack growth laws were implemented. The implementation of methods to estimate leak areas and leak rates of wall-penetrating cracks was completed by the inclusion of leak detection boundaries. The improved analysis methods were tested by recalculating cases treated before. Furthermore, comparative analyses have been performed for several tasks within the international activity BENCH-KJ. Altogether, the analyses show that the flexible probabilistic analysis method provided allows the quantitative determination of leak and break probabilities of a crack in a complex structure geometry under thermal-mechanical loading.
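
    To make the flavor of such an analysis concrete, here is a schematic Monte Carlo estimate of a through-wall (leak) probability from fatigue crack growth; the Paris-law constants, stress range, and the stress intensity estimate are illustrative assumptions, not PROST's actual models.

```python
# Schematic Monte Carlo estimate of a leak (through-wall) probability from
# fatigue crack growth: sample initial depth and Paris-law coefficient,
# integrate da/dN over the load cycles, count wall penetrations.
# All values and the K estimate are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(1)
N, wall, cycles = 50_000, 8.0e-3, 1.0e6           # samples, wall thickness [m], load cycles
a = rng.lognormal(np.log(0.5e-3), 0.4, N)         # initial crack depth [m]
C = rng.lognormal(np.log(5e-12), 0.3, N)          # Paris coefficient (m/cycle, dK in MPa*sqrt(m))
m, d_sigma = 3.0, 100e6                           # Paris exponent, stress range [Pa]

blocks = 200
for _ in range(blocks):                           # coarse explicit integration of da/dN
    dK = 1.12 * d_sigma * np.sqrt(np.pi * a)      # simple surface-crack K estimate [Pa*sqrt(m)]
    a = np.minimum(a + C * (dK * 1e-6) ** m * (cycles / blocks), wall)

print(f"Estimated leak probability: {(a >= wall).mean():.2e}")
```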

  5. Dynamic determination of kinetic parameters, computer simulation, and probabilistic analysis of growth of Clostridium perfringens in cooked beef during cooling.

    Science.gov (United States)

    Huang, Lihan

    2015-02-16

    The objective of this research was to develop a new one-step methodology that uses a dynamic approach to directly construct a tertiary model for prediction of the growth of Clostridium perfringens in cooked beef. This methodology was based on simultaneous numerical analysis and optimization of both primary and secondary models using multiple dynamic growth curves obtained under different conditions. Once the models were constructed, the bootstrap method was used to calculate the 95% confidence intervals of the kinetic parameters, and a Monte Carlo simulation method was developed to validate the models using growth curves not previously used in model development. The results showed that the kinetic parameters obtained from this study accurately matched the common characteristics of C. perfringens, with the optimum temperature being 45.3°C. The results also showed that the predicted growth curves matched accurately with the experimental observations used in validation. The mean of the residuals of the predictions is -0.02 log CFU/g, with a standard deviation of only 0.23 log CFU/g. Most residuals are <0.4 log CFU/g, while only 1.5% are >0.8 log CFU/g. In addition, the dynamic model also accurately predicted four isothermal growth curves arbitrarily chosen from the literature. Finally, the Monte Carlo simulation was used to provide the probability of >1 and >2 log CFU/g relative growths at the end of cooling. The results of this study will provide a new and accurate tool for the food industry and regulatory agencies to assess the safety of cooked beef in the event of a cooling deviation. Published by Elsevier B.V.
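
    A minimal sketch of the one-step dynamic idea, with invented data: a Ratkowsky-type square-root secondary model is embedded in a logistic primary model, integrated along the cooling profile, and both kinetic parameters are fitted directly to the dynamic curve.

```python
# One-step dynamic fitting sketch: secondary model mu(T) = (b*(T - Tmin))^2
# embedded in a logistic primary model and integrated along the cooling
# profile T(t); b and Tmin are estimated directly from the dynamic curve.
# The data, profile and parameter values are invented placeholders.
import numpy as np
from scipy.integrate import solve_ivp
from scipy.optimize import least_squares

T = lambda t: 50.0 - 4.0 * t                     # cooling profile [deg C], t in hours
t_obs = np.array([0.0, 2.0, 4.0, 6.0, 8.0])      # observation times [h]
y_obs = np.array([3.0, 3.1, 4.0, 5.2, 5.9])      # log10 CFU/g (made-up data)
ymax = 7.5                                       # stationary-phase ceiling [log10 CFU/g]

def rhs(t, y, b, Tmin):
    mu = (b * max(T(t) - Tmin, 0.0)) ** 2        # secondary model [1/h]
    return mu * (1.0 - y[0] / ymax)              # logistic primary model

def residuals(p):
    sol = solve_ivp(rhs, (t_obs[0], t_obs[-1]), [y_obs[0]],
                    t_eval=t_obs, args=tuple(p), max_step=0.1)
    return sol.y[0] - y_obs

fit = least_squares(residuals, x0=[0.04, 10.0], bounds=([0.0, 0.0], [1.0, 30.0]))
print("b = %.3f, Tmin = %.1f degC" % tuple(fit.x))
```

    The paper's bootstrap confidence intervals would amount to repeating this fit on resampled growth curves.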

  6. Notes for a workshop on risk analysis and decision under uncertainty. The practical use of probabilistic and Bayesian methodology inreal life risk assessment and decision problems

    Energy Technology Data Exchange (ETDEWEB)

    1979-01-01

    The use of probabilistic, and especially Bayesian, methods is explained. The concepts of risk and decision, and of probability and frequency, are elucidated. The mechanics of probability and probabilistic calculations are discussed. The use of the method for particular problems, such as the frequency of aircraft crashes at a specified nuclear reactor site, is illustrated. 64 figures, 20 tables. (RWR)
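
    A worked example in the spirit of these notes is the conjugate Bayesian update of a rare-event frequency; the prior and evidence values below are purely illustrative.

```python
# Conjugate Bayesian update of a rare-event frequency: with a Gamma(a, b)
# prior on the rate [1/yr] and k events observed in t years, the posterior
# is Gamma(a + k, b + t). Prior and evidence are illustrative numbers.
from scipy.stats import gamma

a, b = 0.5, 10.0      # prior: mean a/b = 0.05 events per year
k, t = 0, 40.0        # evidence: no events in 40 years of observation

a_post, b_post = a + k, b + t
print(f"posterior mean rate    : {a_post / b_post:.4f} /yr")
print(f"posterior 95th quantile: {gamma.ppf(0.95, a_post, scale=1.0 / b_post):.4f} /yr")
```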

  7. Probabilistic Tsunami Hazard Analysis of the Pacific Coast of Mexico: Case Study Based on the 1995 Colima Earthquake Tsunami

    Directory of Open Access Journals (Sweden)

    Nobuhito Mori

    2017-06-01

    This study develops a novel computational framework to carry out probabilistic tsunami hazard assessment for the Pacific coast of Mexico. The new approach enables the consideration of stochastic tsunami source scenarios having variable fault geometry and heterogeneous slip that are constrained by an extensive database of rupture models for historical earthquakes around the world. The assessment focuses upon the 1995 Jalisco–Colima Earthquake Tsunami from a retrospective viewpoint. Numerous source scenarios of large subduction earthquakes are generated to assess the sensitivity and variability of tsunami inundation characteristics of the target region. Analyses of nine slip models along the Mexican Pacific coast are performed, and statistical characteristics of slips (e.g., coherent structures of slip spectra) are estimated. The source variability allows exploring a wide range of tsunami scenarios for a moment magnitude (Mw) 8 subduction earthquake in the Mexican Pacific region to conduct thorough sensitivity analyses and to quantify the tsunami height variability. The numerical results indicate a strong sensitivity of maximum tsunami height to major slip locations in the source and indicate major uncertainty at the first peak of tsunami waves.
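
    The end product of propagating many stochastic source scenarios is typically an exceedance statement; the sketch below computes one from placeholder maximum-height samples standing in for the inundation runs.

```python
# From stochastic scenarios to a hazard statement: given maximum coastal
# tsunami heights from N simulated Mw ~8 scenarios (random placeholders
# here instead of real inundation runs), estimate exceedance probabilities.
import numpy as np

rng = np.random.default_rng(7)
max_height = rng.lognormal(np.log(3.0), 0.6, 500)   # max height [m], one per scenario

for h in (2.0, 4.0, 6.0, 8.0):
    print(f"P(max height > {h:.0f} m | Mw 8 event) = {(max_height > h).mean():.3f}")
```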

  8. Probabilistic Load-Flow Analysis of Biomass-Fuelled Gas Engines with Electrical Vehicles in Distribution Systems

    Directory of Open Access Journals (Sweden)

    Francisco J. Ruiz-Rodríguez

    2017-10-01

    Feeding biomass-fueled gas engines (BFGEs) with olive tree pruning residues offers new opportunities to decrease fossil fuel use in road vehicles and electricity generation. BFGEs, coupled to radial distribution systems (RDSs), provide renewable energy and power that can feed electric vehicle (EV) charging stations. However, the combined impact of BFGEs and EVs on RDSs must be assessed to assure the technical constraint fulfilment. Because of the stochastic nature of source/load, it was decided that a probabilistic approach was the most viable option for this assessment. Consequently, this research developed an analytical technique to evaluate the technical constraint fulfilment in RDSs with this combined interaction. The proposed analytical technique (PAT) involved the calculation of cumulants and the linearization of load-flow equations, along with the application of the cumulant method and the Cornish-Fisher expansion. The uncertainties related to biomass stock and its heating value (HV) were important factors that were assessed for the first time. Application of the PAT in a Spanish RDS with BFGEs and EVs confirmed the feasibility of the proposal and its additional benefits. Specifically, BFGEs were found to clearly contribute to the voltage constraint fulfilment. The computational cost of the PAT was lower than that associated with Monte-Carlo simulations (MCSs).
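
    The Cornish-Fisher step at the heart of such a cumulant-based technique fits in a few lines; the "voltage response" below is a toy nonlinear function of random injections, not a real load flow, and the cumulants are taken from samples rather than propagated analytically.

```python
# Cornish-Fisher quantile approximation from the first four cumulants of a
# bus voltage, compared against Monte Carlo. The voltage model is a toy
# nonlinear function of random EV demand and BFGE output (placeholders).
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
p_ev = rng.gamma(4.0, 0.01, 200_000)               # EV demand [p.u.]
p_bfge = rng.normal(0.08, 0.02, 200_000)           # BFGE output [p.u.]
x = p_bfge - p_ev
v = 1.0 + 0.5 * x - 2.0 * x ** 2                   # toy voltage response [p.u.]

k1, k2 = v.mean(), v.var()                         # cumulants 1 and 2
k3 = ((v - k1) ** 3).mean()                        # cumulant 3
k4 = ((v - k1) ** 4).mean() - 3.0 * k2 ** 2        # cumulant 4
g1, g2 = k3 / k2 ** 1.5, k4 / k2 ** 2              # skewness, excess kurtosis

def cornish_fisher_quantile(q):
    z = norm.ppf(q)
    zcf = (z + (z**2 - 1) * g1 / 6 + (z**3 - 3*z) * g2 / 24
           - (2*z**3 - 5*z) * g1**2 / 36)          # third-order CF expansion
    return k1 + np.sqrt(k2) * zcf

for q in (0.05, 0.95):
    print(f"q={q}: CF {cornish_fisher_quantile(q):.4f}  MC {np.quantile(v, q):.4f}")
```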

  9. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach.The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets.Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given.In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians, and by researchers in artificial intelligence.The necessary elementary mathematical notions are recalled in an appendix.

  10. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  11. Use of limited data to construct Bayesian networks for probabilistic risk assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Groth, Katrina M.; Swiler, Laura Painton

    2013-03-01

    Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation & control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled "Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment". The goal of the work was to establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.
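
    One building block for constructing Bayesian networks from sparse data is Dirichlet (pseudo-count) smoothing of conditional probability tables, sketched below with invented HRA-style counts.

```python
# Estimating a conditional probability table with a Dirichlet (Laplace-type)
# prior so that sparse or unseen parent configurations still receive usable,
# non-zero probabilities. Variables and counts are invented.
import numpy as np

states_parent, states_child = ["low", "high"], ["no_error", "error"]
counts = np.array([[14, 1],     # stress = low
                   [ 2, 3]])    # stress = high (sparse!)

alpha = 1.0                                     # Dirichlet pseudo-count
cpt = (counts + alpha) / (counts + alpha).sum(axis=1, keepdims=True)

for i, s in enumerate(states_parent):
    for j, c in enumerate(states_child):
        print(f"P({c} | stress={s}) = {cpt[i, j]:.3f}")
```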

  12. Safety analysis report for packaging (onsite) sample pig transport system

    Energy Technology Data Exchange (ETDEWEB)

    MCCOY, J.C.

    1999-03-16

    This Safety Analysis Report for Packaging (SARP) provides a technical evaluation of the Sample Pig Transport System as compared to the requirements of the U.S. Department of Energy, Richland Operations Office (RL) Order 5480.1, Change 1, Chapter III. The evaluation concludes that the package is acceptable for the onsite transport of Type B, fissile excepted radioactive materials when used in accordance with this document.

  13. C-Bag Consolidation: An Inventory and Safety Stock Analysis

    Science.gov (United States)

    2014-06-13

    Graduate research paper presented to the Faculty, Graduate School of Engineering and Management, Air Force... (i.e., pilots, firefighters) was excluded from this research. Any items found in both the A-bag and C-bag (helmet, web belt, canteen, etc.) were also...

  14. Overview of New Tools to Perform Safety Analysis: BWR Station Black Out Test Case

    Energy Technology Data Exchange (ETDEWEB)

    D. Mandelli; C. Smith; T. Riley; J. Nielsen; J. Schroeder; C. Rabiti; A. Alfonsi; Cogliati; R. Kinoshita; V. Pasucci; B. Wang; D. Maljovec

    2014-06-01

    Dynamic Probabilistic Risk Assessment (DPRA) methodologies couple system simulator codes (e.g., RELAP, MELCOR) with simulation controller codes (e.g., RAVEN, ADAPT). While system simulator codes accurately model system dynamics deterministically, simulation controller codes introduce both deterministic (e.g., system control logic, operating procedures) and stochastic (e.g., component failures, parameter uncertainties) elements into the simulation. Typically, a DPRA is performed by: 1) sampling values of a set of parameters from the uncertainty space of interest (using the simulation controller codes), and 2) simulating the system behavior for that specific set of parameter values (using the system simulator codes). For complex systems, one of the major challenges in using DPRA methodologies is to analyze the large amount of information (i.e., a large number of scenarios) generated, where clustering techniques are typically employed to allow users to better organize and interpret the data. In this paper, we focus on the analysis of a nuclear simulation dataset that is part of the Risk Informed Safety Margin Characterization (RISMC) Boiling Water Reactor (BWR) station blackout (SBO) case study. We apply a software tool that provides the domain experts with an interactive analysis and visualization environment for understanding the structures of such high-dimensional nuclear simulation datasets. Our tool encodes traditional and topology-based clustering techniques, where the latter partitions the data points into clusters based on their uniform gradient flow behavior. We demonstrate through our case study that both types of clustering techniques complement each other in bringing enhanced structural understanding of the data.
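
    The sample-simulate-cluster workflow can be miniaturised as follows; the two-parameter "simulator" is a toy stand-in for RELAP/RAVEN runs, and k-means stands in for the tool's traditional clustering step.

```python
# Minimal DPRA workflow sketch: 1) sample uncertain parameters, 2) run a
# (placeholder) system simulation per sample, 3) cluster the resulting
# scenario trajectories to organise the large scenario set.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(3)
n_runs, t = 300, np.linspace(0.0, 10.0, 50)

def simulate(fail_time, cool_rate):
    """Toy clad-temperature trace: heats up until 'recovery', then cools."""
    temp = 600.0 + 40.0 * np.minimum(t, fail_time)
    recov = t > fail_time
    temp[recov] -= cool_rate * (t[recov] - fail_time) * 40.0
    return temp

samples = np.column_stack([rng.uniform(2.0, 8.0, n_runs),     # recovery time [h]
                           rng.uniform(0.5, 2.0, n_runs)])    # cooling rate [-]
scenarios = np.array([simulate(*s) for s in samples])

labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scenarios)
for c in range(3):
    peak = scenarios[labels == c].max(axis=1).mean()
    print(f"cluster {c}: {np.sum(labels == c)} runs, mean peak temp {peak:.0f} K")
```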

  15. Safety analysis of the existing 804 and 845 firing facilities

    Energy Technology Data Exchange (ETDEWEB)

    Odell, B.N.

    1986-06-05

    A safety analysis was performed to determine if normal operations and/or potential accidents at the 804 and 845 Firing Facilities at Site 300 could present undue hazards to the general public or personnel at Site 300, or have an adverse effect on the environment. The normal operations and credible accidents that might have an effect on these facilities or have off-site consequences were considered. It was determined by this analysis that all but one of the hazards were either low or of the type or magnitude routinely encountered and/or accepted by the public. The exception was explosives. Since this hazard has the potential for causing significant on-site and minimal off-site consequences, Bunkers 804 and 845 have been classified as moderate hazard facilities per DOE Order 5481.1A. This safety analysis concluded that the operation at these facilities will present no undue risk to the health and safety of LLNL employees or the public.

  16. Development and assessment of best estimate integrated safety analysis code

    Energy Technology Data Exchange (ETDEWEB)

    Chung, Bub Dong; Lee, Young Jin; Hwang, Moon Kyu (and others)

    2007-03-15

    Improvement of the integrated safety analysis code MARS3.0 has been carried out and a multi-D safety analysis application system has been established. Iterative matrix solver and parallel processing algorithm have been introduced, and a LINUX version has been generated to enable MARS to run in cluster PCs. MARS variables and sub-routines have been reformed and modularised to simplify code maintenance. Model uncertainty analyses have been performed for THTF, FLECHT, NEPTUN, and LOFT experiments as well as APR1400 plant. Participations in international cooperation research projects such as OECD BEMUSE, SETH, PKL, BFBT, and TMI-2 have been actively pursued as part of code assessment efforts. The assessment, evaluation and experimental data obtained through international cooperation projects have been registered and maintained in the T/H Databank. Multi-D analyses of APR1400 LBLOCA, DVI Break, SLB, and SGTR have been carried out as a part of application efforts in multi-D safety analysis. GUI based 3D input generator has been developed for user convenience. Operation of the MARS Users Group (MUG) was continued and through MUG, the technology has been transferred to 24 organisations. A set of 4 volumes of user manuals has been compiled and the correction reports for the code errors reported during MARS development have been published.

  17. Probabilistic Capacity Assessment of Lattice Transmission Towers under Strong Wind

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2015-10-01

    Serving as one key component of the most important lifeline infrastructure systems, transmission towers are vulnerable to multiple natural hazards, including strong wind, and could pose severe threats to power system security, with possible blackouts under extreme weather conditions such as hurricanes, derechos, or winter storms. For the security and resiliency of the power system, it is important to ensure structural safety with enough capacity for all possible failure modes, such as structural stability. This study develops a probabilistic capacity assessment approach for transmission towers under strong wind loads. Due to the complicated structural details of lattice transmission towers, wind tunnel experiments are carried out to understand the complex interactions of wind with the lattice sections of the transmission tower, and drag coefficients and the dynamic amplification factor for different panels of the tower are obtained. The wind profile is generated, and the wind time histories are simulated as a summation of time-varying mean and fluctuating components. The capacity curve for the transmission towers is obtained from the incremental dynamic analysis (IDA) method. To consider the stochastic nature of the wind field, probabilistic capacity curves are generated by implementing the IDA analysis for different wind yaw angles and different randomly generated wind speed time histories. After building the limit state functions based on the maximum allowable drift-to-height ratio, the probabilities of failure are obtained based on the meteorological data at a given site. As transmission towers serve as key nodes of the power network, the probabilistic capacity curves can be incorporated into the performance-based design of the power transmission network.
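
    As an illustration of how probabilistic capacity curves feed a failure probability, the sketch below fits a lognormal fragility to hypothetical IDA capacity results and integrates it against an assumed annual-maximum wind distribution.

```python
# From probabilistic capacity curves to a failure probability: fit a
# lognormal fragility to (hypothetical) collapse wind speeds from repeated
# IDA runs, then integrate it against an assumed annual-maximum wind pdf.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
capacity = rng.lognormal(np.log(52.0), 0.12, 200)   # collapse wind speed [m/s] per IDA run

theta = np.exp(np.log(capacity).mean())             # median capacity
beta = np.log(capacity).std()                       # lognormal dispersion
fragility = lambda v: norm.cdf(np.log(v / theta) / beta)   # P(failure | wind speed v)

v = np.linspace(1.0, 100.0, 2000)                   # wind speed grid [m/s]
k, c = 2.0, 18.0                                    # assumed Weibull shape/scale [m/s]
pdf = (k / c) * (v / c) ** (k - 1) * np.exp(-(v / c) ** k)
p_annual = np.sum(fragility(v) * pdf) * (v[1] - v[0])   # simple quadrature
print(f"median capacity {theta:.1f} m/s, beta {beta:.2f}, annual P_f = {p_annual:.2e}")
```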

  18. Efficacy Assessment of Endovascular Stenting in Patients with Unilateral Middle Cerebral Artery Stenosis Using Statistical Probabilistic Anatomical Mapping Analysis of Basal/Acetazolamide Brain Perfusion SPECT

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hae Won; Won, Kyoung Sook; Zeon, Seok Kil; Lee, Chang Young [Keimyung University, School of Medicine, Daegu (Korea, Republic of)

    2009-08-15

    The aim of this study was to evaluate the hemodynamic changes after endovascular stenting in patients with unilateral middle cerebral artery (MCA) stenosis using statistical probabilistic anatomical mapping (SPAM) analysis of basal/acetazolamide (ACZ) Tc-99m ECD brain perfusion SPECT. Eight patients (3 men and 5 women, 64.8{+-}10.5 years) who underwent endovascular stenting for unilateral MCA stenosis were enrolled. Basal/ACZ Tc-99m ECD brain perfusion SPECT studies were performed using a one-day protocol before and after stenting. Using SPAM analysis, we compared basal cerebral perfusion (BCP) counts and the cerebrovascular reserve (CVR) index of the MCA territory before stenting with those after stenting. After stenting, no patient had any complications or additional strokes. In the SPAM analysis, 7 of the 8 patients had improved BCP counts of the MCA territory and 7 of the 8 patients had an improved CVR index of the MCA territory after stenting. Before stenting, the mean BCP counts and CVR index in the affected MCA territory were 47.1{+-}2.2 ml/min/100 g and -2.1{+-}2.9%, respectively. After stenting, the mean BCP counts and CVR index in the affected MCA territory improved significantly (48.3{+-}2.9 ml/min/100 g, p=0.025 and 0.1{+-}1.3%, p=0.036). This study revealed that SPAM analysis of basal/ACZ brain perfusion SPECT would be helpful in evaluating the hemodynamic efficacy of endovascular stenting in unilateral MCA stenosis.

  19. Toward Probabilistic Diagnosis and Understanding of Depression Based on Functional MRI Data Analysis with Logistic Group LASSO.

    Directory of Open Access Journals (Sweden)

    Yu Shimizu

    Diagnosis of psychiatric disorders based on brain imaging data is highly desirable in clinical applications. However, a common problem in applying machine learning algorithms is that the number of imaging data dimensions often greatly exceeds the number of available training samples. Furthermore, interpretability of the learned classifier with respect to brain function and anatomy is an important, but non-trivial issue. We propose the use of logistic regression with a least absolute shrinkage and selection operator (LASSO) to capture the most critical input features. In particular, we consider application of group LASSO to select brain areas relevant to diagnosis. An additional advantage of LASSO is its probabilistic output, which allows evaluation of diagnosis certainty. To verify our approach, we obtained semantic and phonological verbal fluency fMRI data from 31 depression patients and 31 control subjects, and compared the performances of group LASSO (gLASSO) and sparse group LASSO (sgLASSO) to those of standard LASSO (sLASSO), Support Vector Machine (SVM), and Random Forest. Over 90% classification accuracy was achieved with gLASSO and sgLASSO, as well as SVM; however, in contrast to SVM, LASSO approaches allow for identification of the most discriminative weights and estimation of prediction reliability. Semantic task data revealed contributions to the classification from left precuneus, left precentral gyrus, left inferior frontal cortex (pars triangularis), and left cerebellum (crus 1). Weights for the phonological task indicated contributions from left inferior frontal operculum, left postcentral gyrus, left insula, left middle frontal cortex, bilateral middle temporal cortices, bilateral precuneus, left inferior frontal cortex (pars triangularis), and left precentral gyrus. The distribution of normalized odds ratios further showed that predictions with absolute odds ratios higher than 0.2 could be regarded as certain.
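
    A compact way to reproduce the group-selection behaviour is proximal gradient descent with a block soft-threshold; the synthetic "region" groups below are stand-ins for the study's brain areas, and this is an illustration of the technique rather than the authors' pipeline.

```python
# Logistic regression with a group LASSO penalty via proximal gradient
# descent: the block soft-threshold drives entire (synthetic) feature
# groups to zero, mimicking brain-region selection.
import numpy as np

rng = np.random.default_rng(11)
n, groups = 120, [np.arange(0, 5), np.arange(5, 10), np.arange(10, 15)]
X = rng.normal(size=(n, 15))
w_true = np.zeros(15)
w_true[:5] = 0.8                                    # only group 0 is informative
y = (rng.random(n) < 1.0 / (1.0 + np.exp(-X @ w_true))).astype(float)

lam, step, w = 0.05, 0.01, np.zeros(15)
for _ in range(3000):
    p = 1.0 / (1.0 + np.exp(-X @ w))
    w -= step * X.T @ (p - y) / n                   # gradient step on the logistic loss
    for g in groups:                                # proximal step: block soft-threshold
        norm_g = np.linalg.norm(w[g])
        if norm_g > 0.0:
            w[g] *= max(0.0, 1.0 - step * lam * np.sqrt(len(g)) / norm_g)

for i, g in enumerate(groups):
    print(f"group {i}: ||w_g|| = {np.linalg.norm(w[g]):.3f}")
# probabilistic outputs, i.e. per-subject diagnosis certainty
print("P(class 1), first 3 subjects:", np.round(1.0 / (1.0 + np.exp(-X[:3] @ w)), 2))
```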

  20. PROBABILISTIC COST ANALYSIS OF LOGIC PROGRAMS

    Directory of Open Access Journals (Sweden)

    Héctor Juan Soza Pollman

    2009-08-01

    Cost analyses of logic programs have been developed which make it possible to obtain automatically lower and upper bounds on the runtime cost of computations. This information is very useful for a variety of purposes, including granularity control, query optimization in databases, and program transformation and synthesis. However, current techniques suffer a loss of accuracy in some cases which are quite representative (e.g., some divide-and-conquer programs such as QuickSort). This paper describes an alternative probabilistic approach which makes it possible to obtain an estimate of the execution cost. One of its advantages is that it requires only a few changes over previously proposed schemes.