WorldWideScience

Sample records for serve probabilistic reliability

  1. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    The probabilistic code format has not only a strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can differ by several orders of magnitude between two different, but by and large equally justifiable, probabilistic code formats. Thus, the consequence is that a code format based on decision-theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...

  2. Measuring reliability under epistemic uncertainty: Review on non-probabilistic reliability metrics

    Directory of Open Access Journals (Sweden)

    Kang Rui

    2016-06-01

    In this paper, a systematic review of non-probabilistic reliability metrics is conducted to assist the selection of appropriate reliability metrics to model the influence of epistemic uncertainty. Five frequently used non-probabilistic reliability metrics are critically reviewed: evidence-theory-based reliability metrics, interval-analysis-based reliability metrics, fuzzy-interval-analysis-based reliability metrics, possibility-theory-based reliability metrics (posbist reliability), and uncertainty-theory-based reliability metrics (belief reliability). It is pointed out that a qualified reliability metric able to consider the effect of epistemic uncertainty needs to (1) compensate for the conservatism in the estimates of component-level reliability metrics caused by epistemic uncertainty, and (2) satisfy the duality axiom; otherwise it might lead to paradoxical and confusing results in engineering applications. The five commonly used non-probabilistic reliability metrics are compared in terms of these two properties, and the comparison can serve as a basis for the selection of the appropriate reliability metric.

  3. Human reliability assessment and probabilistic risk assessment

    International Nuclear Information System (INIS)

    Embrey, D.E.; Lucas, D.A.

    1989-01-01

    Human reliability assessment (HRA) is used within Probabilistic Risk Assessment (PRA) to identify the human errors (both omission and commission) which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. There exists a variety of HRA techniques, and the selection of an appropriate one is often difficult. This paper reviews a number of available HRA techniques and discusses their strengths and weaknesses. The techniques reviewed include: decompositional methods, time-reliability curves and systematic expert judgement techniques. (orig.)

  4. Probabilistic simulation applications to reliability assessments

    International Nuclear Information System (INIS)

    Miller, Ian; Nutt, Mark W.; Hill, Ralph S. III

    2003-01-01

    Probabilistic risk/reliability (PRA) analyses for engineered systems are conventionally based on fault-tree methods. These methods are mature and efficient, and are well suited to systems consisting of interacting components with known, low probabilities of failure. Even complex systems, such as nuclear power plants or aircraft, are modeled by the careful application of these approaches. However, for systems that may evolve in complex and nonlinear ways, and where the performance of components may be a sensitive function of the history of their working environments, fault-tree methods can be very demanding. This paper proposes an alternative method of evaluating such systems, based on probabilistic simulation using intelligent software objects to represent the components of such systems. Using a Monte Carlo approach, simulation models can be constructed from relatively simple interacting objects that capture the essential behavior of the components that they represent. Such models are capable of reflecting the complex behaviors of the systems that they represent in a natural and realistic way. (author)
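
The object-based Monte Carlo approach described above can be sketched in a few lines. The component names, failure probabilities, and history-dependent wear model below are invented for illustration; they are not from the paper.

```python
# Hypothetical sketch: Monte Carlo reliability estimation with simple
# interacting component objects whose failure probability depends on
# their load history. All numbers are illustrative.
import random

class Component:
    """A component whose failure probability grows with accumulated stress."""
    def __init__(self, base_p):
        self.base_p = base_p   # baseline failure probability per mission
        self.wear = 0.0        # extra probability from past working environment

    def stress(self, amount):
        self.wear += amount    # history-dependent degradation

    def fails(self, rng):
        return rng.random() < min(1.0, self.base_p + self.wear)

def mission_survives(rng):
    # Two redundant pumps feeding one valve: the system fails if the
    # valve fails or both pumps fail.
    pump_a, pump_b = Component(0.05), Component(0.05)
    valve = Component(0.01)
    pump_a.stress(0.02)        # pump A has had a harsher duty history
    if valve.fails(rng):
        return False
    return not (pump_a.fails(rng) and pump_b.fails(rng))

def estimate_reliability(n=100_000, seed=1):
    rng = random.Random(seed)
    return sum(mission_survives(rng) for _ in range(n)) / n

print(estimate_reliability())  # close to (1 - 0.01) * (1 - 0.07 * 0.05) ≈ 0.986
```

Because each component carries its own state, nonlinear and history-dependent behaviour that is awkward in a static fault tree falls out naturally from the simulation loop.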

  5. Human reliability in probabilistic safety assessments

    International Nuclear Information System (INIS)

    Nunez Mendez, J.

    1989-01-01

    Nowadays a growing interest in environmental aspects is detected in our country. It implies an assessment of the risk involved in industrial processes and installations in order to determine whether they are within acceptable limits. In these safety assessments, among which Probabilistic Safety Assessments (PSAs) can be pointed out, the role played by the human being in the system is one of the most relevant subjects (a relevance demonstrated by past accidents). However, in Spain there are no manuals specifically dedicated to assessing the human contribution to risk in the frame of PSAs. This report aims to improve this situation by providing: a) a theoretical background to help the reader understand the nature of human error, b) a guide to carrying out a Human Reliability Analysis, and c) a selected overview of the techniques and methodologies currently applied in this area. (Author)

  6. Human Reliability in Probabilistic Safety Assessments

    International Nuclear Information System (INIS)

    Nunez Mendez, J.

    1989-01-01

    Nowadays a growing interest in environmental aspects is detected in our country. It implies an assessment of the risk involved in industrial processes and installations in order to determine whether they are within acceptable limits. In these safety assessments, among which Probabilistic Safety Assessments (PSAs) can be pointed out, the role played by the human being in the system is one of the most relevant subjects (a relevance demonstrated by past accidents). However, in Spain there are no manuals specifically dedicated to assessing the human contribution to risk in the frame of PSAs. This report aims to improve this situation by providing: a) a theoretical background to help the reader understand the nature of human error, b) a guide to carrying out a Human Reliability Analysis, and c) a selected overview of the techniques and methodologies currently applied in this area. (Author) 20 refs

  7. Human reliability. Is probabilistic human reliability assessment possible?

    International Nuclear Information System (INIS)

    Mosneron Dupin, F.

    1996-01-01

    The possibility of carrying out Probabilistic Human Reliability Assessments (PHRA) is often doubted. Drawing on the experience Electricite de France (EDF) has acquired in Probabilistic Safety Assessments for nuclear power plants, we show why the uncertainty of PHRA is very high. We then specify the limits of generic data and models for PHRA: very important factors are often poorly taken into account. To account for them, one needs a proper understanding of the actual context in which operators work. This demands field surveys (at the power plant and on the simulator), all of which must be carried out with behavioural-science skills. The idea of estimating the probabilities of operator failure must not be abandoned, but probabilities must be given less importance, for they are only approximate indications. The qualitative aspects of PHRA should be given greater value (analysis process and qualitative insights). That is why the description (illustrated by case histories) of the main mechanisms of human behaviour, and of their manifestations in the nuclear power plant context (in terms of habits, attitudes, and informal methods and organization in particular), should be an important part of PHRA handbooks. These handbooks should also insist more on methods for gathering information on the actual context of the work of operators. Under these conditions, PHRA should be possible and even desirable as a process for the systematic analysis and assessment of human intervention. (author). 24 refs, 2 figs, 1 tab

  8. Human reliability. Is probabilistic human reliability assessment possible?

    Energy Technology Data Exchange (ETDEWEB)

    Mosneron Dupin, F

    1997-12-31

    The possibility of carrying out Probabilistic Human Reliability Assessments (PHRA) is often doubted. Drawing on the experience Electricite de France (EDF) has acquired in Probabilistic Safety Assessments for nuclear power plants, we show why the uncertainty of PHRA is very high. We then specify the limits of generic data and models for PHRA: very important factors are often poorly taken into account. To account for them, one needs a proper understanding of the actual context in which operators work. This demands field surveys (at the power plant and on the simulator), all of which must be carried out with behavioural-science skills. The idea of estimating the probabilities of operator failure must not be abandoned, but probabilities must be given less importance, for they are only approximate indications. The qualitative aspects of PHRA should be given greater value (analysis process and qualitative insights). That is why the description (illustrated by case histories) of the main mechanisms of human behaviour, and of their manifestations in the nuclear power plant context (in terms of habits, attitudes, and informal methods and organization in particular), should be an important part of PHRA handbooks. These handbooks should also insist more on methods for gathering information on the actual context of the work of operators. Under these conditions, PHRA should be possible and even desirable as a process for the systematic analysis and assessment of human intervention. (author). 24 refs, 2 figs, 1 tab.

  9. Probabilistic assessment of pressure vessel and piping reliability

    International Nuclear Information System (INIS)

    Sundararajan, C.

    1986-01-01

    The paper presents a critical review of the state-of-the-art in probabilistic assessment of pressure vessel and piping reliability. First the differences in assessing the reliability directly from historical failure data and indirectly by a probabilistic analysis of the failure phenomenon are discussed and the advantages and disadvantages are pointed out. The rest of the paper deals with the latter approach of reliability assessment. Methods of probabilistic reliability assessment are described and major projects where these methods are applied for pressure vessel and piping problems are discussed. An extensive list of references is provided at the end of the paper
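
The two assessment routes contrasted above can be placed side by side in a short sketch. The failure counts, exposure, and stress-strength distributions below are assumed numbers for illustration, not data from the review.

```python
# Illustrative contrast between the two assessment routes: direct
# estimation from historical failure data, and indirect probabilistic
# analysis of the failure phenomenon (here a stress-strength model).
import math
import random

# Direct route: 3 failures observed in 12,000 vessel-years of service
# (assumed numbers), with a constant-rate (exponential) model.
failures, exposure = 3, 12_000.0
rate_direct = failures / exposure            # failures per vessel-year
p_fail_direct = 1 - math.exp(-rate_direct)   # per-year failure probability

# Indirect route: failure occurs when load (stress) exceeds capacity
# (strength); both modelled as normal variables and sampled by Monte Carlo.
rng = random.Random(0)
n = 200_000
hits = sum(rng.gauss(300, 40) > rng.gauss(500, 50) for _ in range(n))
p_fail_indirect = hits / n

print(p_fail_direct, p_fail_indirect)
```

The direct route needs nothing but service statistics, while the indirect route exposes which physical variables drive the risk, which is exactly the trade-off the review discusses.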

  10. Probabilistic confidence for decisions based on uncertain reliability estimates

    Science.gov (United States)

    Reid, Stuart G.

    2013-05-01

    Reliability assessments are commonly carried out to provide a rational basis for risk-informed decisions concerning the design or maintenance of engineering systems and structures. However, calculated reliabilities and associated probabilities of failure often have significant uncertainties associated with the possible estimation errors relative to the 'true' failure probabilities. For uncertain probabilities of failure, a measure of 'probabilistic confidence' has been proposed to reflect the concern that uncertainty about the true probability of failure could result in a system or structure that is unsafe and could subsequently fail. The paper describes how the concept of probabilistic confidence can be applied to evaluate and appropriately limit the probabilities of failure attributable to particular uncertainties such as design errors that may critically affect the dependability of risk-acceptance decisions. This approach is illustrated with regard to the dependability of structural design processes based on prototype testing with uncertainties attributable to sampling variability.

  11. Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis

    Science.gov (United States)

    Dezfuli, Homayoon; Kelly, Dana; Smith, Curtis; Vedros, Kurt; Galyean, William

    2009-01-01

    This document, Bayesian Inference for NASA Probabilistic Risk and Reliability Analysis, is intended to provide guidelines for the collection and evaluation of risk and reliability-related data. It is aimed at scientists and engineers familiar with risk and reliability methods and provides a hands-on approach to the investigation and application of a variety of risk and reliability data assessment methods, tools, and techniques. This document provides both: A broad perspective on data analysis collection and evaluation issues. A narrow focus on the methods to implement a comprehensive information repository. The topics addressed herein cover the fundamentals of how data and information are to be used in risk and reliability analysis models and their potential role in decision making. Understanding these topics is essential to attaining a risk informed decision making environment that is being sought by NASA requirements and procedures such as 8000.4 (Agency Risk Management Procedural Requirements), NPR 8705.05 (Probabilistic Risk Assessment Procedures for NASA Programs and Projects), and the System Safety requirements of NPR 8715.3 (NASA General Safety Program Requirements).
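
As a flavour of the data-assessment methods such guidelines cover, a conjugate Beta-Binomial update is the standard textbook starting point. The prior choice and demand counts below are illustrative assumptions, not values taken from the document.

```python
# Minimal sketch of a conjugate Bayesian update for a failure-on-demand
# probability: Beta prior, Binomial likelihood. Numbers are illustrative.
alpha0, beta0 = 0.5, 0.5        # Jeffreys prior Beta(0.5, 0.5)
failures, demands = 2, 187      # assumed observed evidence

alpha = alpha0 + failures       # posterior is Beta(alpha, beta)
beta = beta0 + demands - failures

posterior_mean = alpha / (alpha + beta)
print(posterior_mean)           # ≈ 0.0133
```

The same pattern, prior plus observed evidence giving a posterior, generalizes to the rates, durations, and uncertain parameters treated in risk models.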

  12. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) for a probabilistic safety assessment (PSA) includes identifying human actions that are important from the safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study the reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  13. Probabilistic safety analysis and human reliability analysis. Proceedings. Working material

    International Nuclear Information System (INIS)

    1996-01-01

    An international meeting on Probabilistic Safety Assessment (PSA) and Human Reliability Analysis (HRA) was jointly organized by Electricite de France - Research and Development (EDF DER) and SRI International in co-ordination with the International Atomic Energy Agency. The meeting was held in Paris 21-23 November 1994. A group of international and French specialists in PSA and HRA participated at the meeting and discussed the state of the art and current trends in the following six topics: PSA Methodology; PSA Applications; From PSA to Dependability; Incident Analysis; Safety Indicators; Human Reliability. For each topic a background paper was prepared by EDF/DER and reviewed by the international group of specialists who attended the meeting. The results of this meeting provide a comprehensive overview of the most important questions related to the readiness of PSA for specific uses and areas where further research and development is required. Refs, figs, tabs

  14. Probabilistic safety analysis and human reliability analysis. Proceedings. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    An international meeting on Probabilistic Safety Assessment (PSA) and Human Reliability Analysis (HRA) was jointly organized by Electricite de France - Research and Development (EDF DER) and SRI International in co-ordination with the International Atomic Energy Agency. The meeting was held in Paris 21-23 November 1994. A group of international and French specialists in PSA and HRA participated at the meeting and discussed the state of the art and current trends in the following six topics: PSA Methodology; PSA Applications; From PSA to Dependability; Incident Analysis; Safety Indicators; Human Reliability. For each topic a background paper was prepared by EDF/DER and reviewed by the international group of specialists who attended the meeting. The results of this meeting provide a comprehensive overview of the most important questions related to the readiness of PSA for specific uses and areas where further research and development is required. Refs, figs, tabs.

  15. Component reliability data for use in probabilistic safety assessment

    International Nuclear Information System (INIS)

    1988-10-01

    Generic component reliability data is indispensable in any probabilistic safety analysis. It is not realistic to assume that all possible component failures and failure modes modeled in a PSA would be available from the operating experience of a specific plant in a statistically meaningful way. The degree that generic data is used in PSAs varies from case to case. Some studies are totally based on generic data while others use generic data as prior information to be specialized by plant specific data. Most studies, however, finally use a combination where data for certain components come from generic data sources and others from Bayesian updating. The IAEA effort to compile a generic component reliability data base aimed at facilitating the use of data available in the literature and at highlighting pitfalls which deserve special consideration. It was also intended to complement the fault tree and event tree package (PSAPACK) and to facilitate its use. Moreover, it should be noted, that the IAEA has recently initiated a Coordinated Research Program in Reliability Data Collection, Retrieval and Analysis. In this framework the issues identified as most affecting the quality of existing data bases would be addressed. This report presents the results of a compilation made from the specialized literature and includes reliability data for components usually considered in PSA
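
The Bayesian specialization of generic data by plant-specific evidence described above can be sketched with a conjugate Gamma-Poisson update. The generic mean rate and the plant-specific counts below are invented for illustration; they are not from the IAEA data base.

```python
# Hedged sketch: a generic Gamma prior on a component failure rate,
# specialized with plant-specific operating evidence (Poisson likelihood).
# All numbers are illustrative assumptions.

# Generic source: mean rate 1e-5 per hour, with broad uncertainty
# encoded as Gamma(shape0, rate_param0) so that shape0/rate_param0 = 1e-5.
shape0 = 0.5
rate_param0 = shape0 / 1e-5     # Gamma "rate" parameter (1/scale)

# Plant-specific evidence: 1 failure in 350,000 component-hours.
failures, hours = 1, 350_000.0

shape = shape0 + failures       # posterior is Gamma(shape, rate_param)
rate_param = rate_param0 + hours

posterior_mean = shape / rate_param
print(posterior_mean)           # updated failure rate, per hour
```

With little plant evidence the posterior stays near the generic value; as operating hours accumulate, the plant-specific data dominate, which is the behaviour the report's combined approach relies on.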

  16. Reliability and Probabilistic Risk Assessment - How They Play Together

    Science.gov (United States)

    Safie, Fayssal M.; Stutts, Richard G.; Zhaofeng, Huang

    2015-01-01

    PRA methodology is one of the probabilistic analysis methods that NASA brought from the nuclear industry to assess the risk of LOM, LOV and LOC for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability and statistical data to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: What can go wrong? How likely is it? What is the severity of the degradation? Since 1986, NASA, along with industry partners, has conducted a number of PRA studies to predict the overall launch vehicles risks. Planning Research Corporation conducted the first of these studies in 1988. In 1995, Science Applications International Corporation (SAIC) conducted a comprehensive PRA study. In July 1996, NASA conducted a two-year study (October 1996 - September 1998) to develop a model that provided the overall Space Shuttle risk and estimates of risk changes due to proposed Space Shuttle upgrades. After the Columbia accident, NASA conducted a PRA on the Shuttle External Tank (ET) foam. This study was the most focused and extensive risk assessment that NASA has conducted in recent years. It used a dynamic, physics-based, integrated system analysis approach to understand the integrated system risk due to ET foam loss in flight. Most recently, a PRA for Ares I launch vehicle has been performed in support of the Constellation program. Reliability, on the other hand, addresses the loss of functions. In a broader sense, reliability engineering is a discipline that involves the application of engineering principles to the design and processing of products, both hardware and software, for meeting product reliability requirements or goals. It is a very broad design-support discipline. It has important interfaces with many other engineering disciplines. Reliability as a figure of merit (i.e. the metric) is the probability that an item will

  17. Substation design improvement with a probabilistic reliability approach using the TOPASE program

    Energy Technology Data Exchange (ETDEWEB)

    Bulot, M.; Heroin, G.; Bergerot, J-L.; Le Du, M. [Electricite de France (France)

    1997-12-31

    TOPASE (the French acronym for Probabilistic Tools and Data Processing for the Analysis of Electric Systems), developed by Electricite de France (EDF) to perform reliability studies on transmission substations, is described. TOPASE serves a dual objective: assisting in the automation of HV substation studies, and enabling electrical-systems experts who are not necessarily specialists in reliability studies to perform such studies. The program is capable of quantifying the occurrence rate of undesirable events and of identifying critical equipment and the main incident scenarios. The program can be used to improve an existing substation, to choose an HV structure during the design stage, or to choose a system of protective devices. Data collected during 1996 and 1997 will be analyzed to identify useful experiences and to validate the basic concepts of the program. 4 figs.

  18. Suppression of panel flutter of near-space aircraft based on non-probabilistic reliability theory

    Directory of Open Access Journals (Sweden)

    Ye-Wei Zhang

    2016-03-01

    The active vibration control of composite panels with uncertain parameters in hypersonic flow is studied using non-probabilistic reliability theory. Using piezoelectric patches as active control actuators, the dynamic equations of the panel are established by the finite element method and Hamilton's principle, and a control model of the panel with uncertain parameters is obtained. Based on H∞ robust control theory and non-probabilistic reliability theory, a non-probabilistic reliability performance function is given in terms of the non-probabilistic reliability index. Moreover, the relationships between the robust controller, the H∞ performance index, and reliability are established. Numerical results show that the control method, under the influence of reliability, the H∞ performance index, and approaching velocity, is effective for vibration suppression of the panel over the whole interval of uncertain parameters.

  19. Quantification of human reliability in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hirschberg, S.; Dang, Vinh N.

    1996-01-01

    Human performance may substantially influence the reliability and safety of complex technical systems. For this reason, Human Reliability Analysis (HRA) constitutes an important part of Probabilistic Safety Assessments (PSAs) or Quantitative Risk Analyses (QRAs). The results of these studies, as well as analyses of past accidents and incidents, clearly demonstrate the importance of human interactions. The contribution of human errors to the core damage frequency (CDF), as estimated in the Swedish nuclear PSAs, is between 15 and 88%. A survey of the HRAs in the Swiss PSAs shows that for the Swiss nuclear power plants, too, the estimated human-error contributions are substantial (49% of the CDF due to internal events in the case of Beznau and 70% in the case of Muehleberg; for the total CDF, including external events, 25% and 20%, respectively). Similar results can be extracted from the PSAs carried out for French, German, and US plants. In PSAs or QRAs, the adequate treatment of the human interactions with the system is a key to understanding accident sequences and their relative importance to overall risk. The main objectives of HRA are: first, to ensure that the key human interactions are systematically identified and incorporated into the safety analysis in a traceable manner, and second, to quantify the probabilities of their success and failure. Adopting a structured and systematic approach to the assessment of human performance makes it possible to provide greater confidence that the safety and availability of human-machine systems are not unduly jeopardized by human performance problems. Section 2 discusses the different types of human interactions analysed in PSAs. More generally, the section presents how HRA fits into the overall safety analysis, that is, how the human interactions to be quantified are identified. Section 3 addresses the methods for quantification.
Section 4 concludes the paper by presenting some recommendations and pointing out the limitations of the

  20. Global optimization of maintenance and surveillance testing based on reliability and probabilistic safety assessment. Research project

    International Nuclear Information System (INIS)

    Martorell, S.; Serradell, V.; Munoz, A.; Sanchez, A.

    1997-01-01

    Background, objective, scope, detailed working plan, follow-up, and final product of the project ''Global optimization of maintenance and surveillance testing based on reliability and probabilistic safety assessment'' are described

  1. Evaluation of seismic reliability of steel moment resisting frames rehabilitated by concentric braces with probabilistic models

    Directory of Open Access Journals (Sweden)

    Fateme Rezaei

    2017-08-01

    The probability of failure of a structure designed by deterministic methods can exceed that of one designed in a similar situation using probabilistic methods and models that account for uncertainties. The main purpose of this research was to evaluate the seismic reliability of steel moment-resisting frames rehabilitated with concentric braces using probabilistic models. To do so, three-story and nine-story steel moment-resisting frames were designed based on the strength criteria of the Iranian code and then rehabilitated with concentric braces based on controlling drift limitations. The probability of frame failure was evaluated using probabilistic models of earthquake magnitude and location, of ground-shaking intensity at the site of the structure, and of building response (based on maximum lateral roof displacement), together with probabilistic methods. These frames were analyzed under a subcrustal source by the sampling probabilistic method "Risk Tools" (RT). Comparing the exceedance-probability curves of building response (or selected points on them) for the three-story and nine-story model frames before and after rehabilitation, the seismic response of the rehabilitated frames was reduced and their reliability improved. The main variables affecting the probability of frame failure were also determined using sensitivity analysis with the FORM probabilistic method; the most effective variables in reducing the probability of frame failure are found in the magnitude model: the ground-shaking-intensity model error and the magnitude model error.
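
As a hedged illustration of a FORM-style calculation like the sensitivity analysis mentioned above (though far simpler than the paper's frame models), the reliability index and variance-based sensitivity factors for a linear limit state g = R - S with independent normal variables can be computed directly. All distribution parameters are invented.

```python
# Illustrative FORM computation for a linear limit state g = R - S
# with independent normal capacity R and demand S (assumed parameters).
import math

mu_R, sigma_R = 500.0, 50.0   # capacity (illustrative units)
mu_S, sigma_S = 300.0, 40.0   # demand

# Reliability index for a linear g with normal variables.
beta = (mu_R - mu_S) / math.hypot(sigma_R, sigma_S)

# Sensitivity (importance) factors: each variable's share of the
# variance of g; for a linear g they sum to 1 and rank which
# uncertainty drives the failure probability most.
denom = sigma_R**2 + sigma_S**2
alpha_R = sigma_R**2 / denom
alpha_S = sigma_S**2 / denom

p_fail = 0.5 * math.erfc(beta / math.sqrt(2))   # Phi(-beta)
print(beta, alpha_R, alpha_S, p_fail)
```

Here the capacity uncertainty carries the larger share of the variance, so reducing it would lower the failure probability most, which is the kind of conclusion the paper's sensitivity analysis draws for its earthquake models.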

  2. Quantification of Wave Model Uncertainties Used for Probabilistic Reliability Assessments of Wave Energy Converters

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2015-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, this paper presents how the quantified uncertainties can be implemented in probabilistic reliability assessments.
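
The three validation statistics named above are straightforward to compute. The observed and modelled significant wave heights below are made-up values, not the paper's data.

```python
# Sketch of the wave-model validation statistics: bias, root-mean-square
# error, and scatter index (RMSE normalised by the mean observation).
obs   = [1.2, 2.5, 3.1, 0.9, 1.8]   # measured Hs [m] (illustrative)
model = [1.4, 2.3, 3.5, 1.0, 1.6]   # wave-model Hs [m] (illustrative)

n = len(obs)
bias = sum(m - o for m, o in zip(model, obs)) / n
rmse = (sum((m - o) ** 2 for m, o in zip(model, obs)) / n) ** 0.5
scatter_index = rmse / (sum(obs) / n)

print(bias, rmse, scatter_index)
```

The bias captures systematic over- or under-prediction, while the scatter index gives a dimensionless spread that can be carried into a probabilistic reliability assessment as a model-uncertainty factor.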

  3. Determination of Wave Model Uncertainties used for Probabilistic Reliability Assessments of Wave Energy Devices

    DEFF Research Database (Denmark)

    Ambühl, Simon; Kofoed, Jens Peter; Sørensen, John Dalsgaard

    2014-01-01

    Wave models used for site assessments are subject to model uncertainties, which need to be quantified when using wave model results for probabilistic reliability assessments. This paper focuses on the determination of wave model uncertainties. Four different wave models are considered, and validation data are collected from published scientific research. The bias and the root-mean-square error, as well as the scatter index, are considered for the significant wave height as well as the mean zero-crossing wave period. Based on an illustrative generic example, it is shown how the estimated uncertainties can be implemented in probabilistic reliability assessments.

  4. Reliability Evaluation and Probabilistic Design of Coastal Structures

    DEFF Research Database (Denmark)

    Burcharth, H. F.

    1993-01-01

    Conventional design practice for coastal structures is deterministic in nature and is based on the concept of a design load, which should not exceed the resistance (carrying capacity) of the structure. The design load is usually defined on a probabilistic basis as a characteristic value of the load, e.g. the expectation (mean) value of the 100-year return period event, however often without consideration of the involved uncertainties. The resistance is in most cases defined in terms of the load which causes a certain design impact or damage to the structure and is not given as an ultimate force or deformation. This is because most of the available design formulae only give the relationship between wave characteristics and structural response, e.g. in terms of run-up, overtopping, armour layer damage etc. An example is the Hudson formula for armour layer stability. Almost all such design...

  5. Reliability and Probabilistic Risk Assessment - How They Play Together

    Science.gov (United States)

    Safie, Fayssal M.; Stutts, Richard; Huang, Zhaofeng

    2015-01-01

    The objective of this presentation is to discuss the PRA process and the reliability engineering discipline, their differences and similarities, and how they are used as complementary analyses to support design and flight decisions.

  6. Probabilistic risk assessment course documentation. Volume 5. System reliability and analysis techniques Session D - quantification

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the probabilistic quantification of accident sequences and the link between accident sequences and consequences. Other sessions in this series focus on the quantification of system reliability and the development of event trees and fault trees. This course takes the viewpoint that event tree sequences or combinations of system failures and success are available and that Boolean equations for system fault trees have been developed and are available. 93 figs., 11 tabs
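
A minimal sketch of the quantification step this session covers, assuming the minimal cut sets of a fault tree are already available; the event names and probabilities below are invented for illustration.

```python
# Quantifying a fault-tree top event from minimal cut sets: the
# rare-event approximation (sum of cut-set probabilities) and the
# min-cut upper bound. Events are assumed independent; all numbers
# are illustrative.
p = {"pump_a": 1e-3, "pump_b": 1e-3, "valve": 5e-4, "power": 1e-4}

cut_sets = [{"pump_a", "pump_b"}, {"valve"}, {"power"}]

def cut_prob(cs):
    prob = 1.0
    for event in cs:
        prob *= p[event]
    return prob

# Rare-event approximation: simply sum the cut-set probabilities.
rare_event = sum(cut_prob(cs) for cs in cut_sets)

# Min-cut upper bound: 1 minus the product of cut-set survivals.
survival = 1.0
for cs in cut_sets:
    survival *= 1.0 - cut_prob(cs)
upper_bound = 1.0 - survival

print(rare_event, upper_bound)
```

For small probabilities the two results nearly coincide; the upper bound is always the smaller of the two, and both sit above the exact inclusion-exclusion value.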

  7. Human reliability analysis in Loviisa probabilistic safety analysis

    International Nuclear Information System (INIS)

    Illman, L.; Isaksson, J.; Makkonen, L.; Vaurio, J.K.; Vuorio, U.

    1986-01-01

    The human reliability analysis in the Loviisa PSA project is carried out for three major groups of errors in human actions: (A) errors made before an initiating event, (B) errors that initiate a transient and (C) errors made during transients. Recovery possibilities are also included in each group. The methods used or planned for each group are described. A simplified THERP approach is used for group A, with emphasis on test and maintenance error recovery aspects and dependencies between redundancies. For group B, task analyses and human factors assessments are made for startup, shutdown and operational transients, with emphasis on potential common cause initiators. For group C, both misdiagnosis and slow decision making are analyzed, as well as errors made in carrying out necessary or backup actions. New or advanced features of the methodology are described

  8. Construct validity and reliability of a checklist for volleyball serve analysis

    Directory of Open Access Journals (Sweden)

    Cicero Luciano Alves Costa

    2018-03-01

    Full Text Available This study aims to investigate the construct validity and reliability of a checklist for qualitative analysis of the overhand serve in volleyball. Fifty-five male subjects aged 13-17 years participated in the study. The overhand serve was analyzed using the checklist proposed by Meira Junior (2003), which analyzes the pattern of serve movement in four phases: (I) initial position, (II) ball lifting, (III) ball attacking, and (IV) finalization. Construct validity was analyzed using confirmatory factor analysis and reliability through Cronbach's alpha coefficient. The construct validity was supported by confirmatory factor analysis, with the RMSEA (0.037 [90% confidence interval = 0.020-0.040]), CFI (0.970) and TLI (0.950) results indicating good fit of the model. In relation to reliability, Cronbach's alpha coefficient was 0.661, a value considered acceptable. Among the items on the checklist, ball lifting and ball attacking showed the highest factor loadings, 0.69 and 0.99, respectively. In summary, the checklist of Meira Junior (2003) for the qualitative analysis of the overhand serve can be considered a valid and reliable instrument for use in research in the field of Sports Sciences.
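For reference, the reliability statistic used in this record, Cronbach's alpha, can be computed directly from item-by-subject scores. The binary checklist scores below are invented for illustration and are not the study's data.

```python
# Cronbach's alpha: k/(k-1) * (1 - sum(item variances) / variance of totals).
def cronbach_alpha(scores):
    k = len(scores[0])                      # number of items
    def var(xs):                            # population variance
        m = sum(xs) / len(xs)
        return sum((x - m) ** 2 for x in xs) / len(xs)
    item_vars = [var([row[i] for row in scores]) for i in range(k)]
    total_var = var([sum(row) for row in scores])
    return k / (k - 1) * (1 - sum(item_vars) / total_var)

# Rows = subjects, columns = checklist items (hypothetical pass/fail scores).
data = [
    [1, 1, 1, 1],
    [1, 1, 0, 1],
    [0, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 1, 0],
    [1, 1, 1, 1],
]
print(round(cronbach_alpha(data), 3))
```

Alpha rises when items co-vary (subjects who pass one phase tend to pass the others), which is what an internally consistent checklist should show.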

  9. Practical applications of probabilistic structural reliability analyses to primary pressure systems of nuclear power plants

    International Nuclear Information System (INIS)

    Witt, F.J.

    1980-01-01

    Primary pressure systems of nuclear power plants are built to exacting codes and standards with provisions for inservice inspection and repair if necessary. Analyses and experiments have demonstrated by deterministic means that very large margins exist on safety impacting failures under normal operating and upset conditions. Probabilistic structural reliability analyses provide additional support that failures of significance are very, very remote. They may range in degree of sophistication from very simple calculations to very complex computer analyses involving highly developed mathematical techniques. The end result however should be consistent with the desired usage. In this paper a probabilistic structural reliability analysis is performed as a supplement to in-depth deterministic evaluations with the primary objective to demonstrate an acceptably low probability of failure for the conditions considered. (author)

  10. Assessing reliability of fatigue indicator parameters for small crack growth via a probabilistic framework

    Science.gov (United States)

    Rovinelli, Andrea; Guilhem, Yoann; Proudhon, Henry; Lebensohn, Ricardo A.; Ludwig, Wolfgang; Sangid, Michael D.

    2017-06-01

    Microstructurally small cracks exhibit large variability in their fatigue crack growth rate. It is accepted that the inherent variability in microstructural features is related to the uncertainty in the growth rate. However, due to (i) the lack of cycle-by-cycle experimental data, (ii) the complexity of the short crack growth phenomenon, and (iii) the incomplete physics of constitutive relationships, only empirical damage metrics have been postulated to describe the short crack driving force metric (SCDFM) at the mesoscale level. The identification of the SCDFM of polycrystalline engineering alloys is a critical need, in order to achieve more reliable fatigue life prediction and improve material design. In this work, the first steps in the development of a general probabilistic framework are presented, which uses experimental result as an input, retrieves missing experimental data through crystal plasticity (CP) simulations, and extracts correlations utilizing machine learning and Bayesian networks (BNs). More precisely, experimental results representing cycle-by-cycle data of a short crack growing through a beta-metastable titanium alloy, VST-55531, have been acquired via phase and diffraction contrast tomography. These results serve as an input for FFT-based CP simulations, which provide the micromechanical fields influenced by the presence of the crack, complementing the information available from the experiment. In order to assess the correlation between postulated SCDFM and experimental observations, the data is mined and analyzed utilizing BNs. Results show the ability of the framework to autonomously capture relevant correlations and the equivalence in the prediction capability of different postulated SCDFMs for the high cycle fatigue regime.

  11. An evaluation of the reliability and usefulness of external-initiator PRA [probabilistic risk analysis] methodologies

    International Nuclear Information System (INIS)

    Budnitz, R.J.; Lambert, H.E.

    1990-01-01

    The discipline of probabilistic risk analysis (PRA) has become so mature in recent years that it is now being used routinely to assist decision-making throughout the nuclear industry. This includes decision-making that affects design, construction, operation, maintenance, and regulation. Unfortunately, not all sub-areas within the larger discipline of PRA are equally "mature," and therefore the many different types of engineering insights from PRA are not all equally reliable. 93 refs., 4 figs., 1 tab

  13. Summary of component reliability data for probabilistic safety analysis of Korean standard nuclear power plant

    International Nuclear Information System (INIS)

    Choi, S. Y.; Han, S. H.

    2004-01-01

    Reliability data that reflect plant-specific characteristics are necessary for the PSA of Korean nuclear power plants. We have performed a study to develop a component reliability database and software for component reliability analysis. Based on this system, we collected component operation data and failure/repair data from plant operation through 1998 and 2000 for YGN 3,4 and UCN 3,4, respectively. Recently, we have upgraded the database by collecting additional data through 2002 for Korean standard nuclear power plants and performed the component reliability analysis and Bayesian analysis again. In this paper, we present a summary of the component reliability data for the probabilistic safety analysis of the Korean standard nuclear power plant and describe the plant-specific characteristics compared to the generic data
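The Bayesian analysis mentioned here typically combines a generic prior with plant-specific evidence. A minimal conjugate-update sketch, with invented numbers rather than the Korean plant data:

```python
# Gamma-Poisson conjugate update for a constant failure rate lambda:
# prior gamma(a0, b0) plus k failures observed in T component-hours
# gives posterior gamma(a0 + k, b0 + T).
a0, b0 = 0.5, 2.0e5      # assumed generic prior; prior mean a0/b0 = 2.5e-6 /h
k, T = 2, 4.0e5          # assumed plant-specific evidence: 2 failures in 4e5 h
a1, b1 = a0 + k, b0 + T
posterior_mean = a1 / b1
print(posterior_mean)    # pulled from the prior toward the observed rate k/T = 5e-6 /h
```

The posterior mean lands between the generic prior mean and the raw plant-specific rate, which is exactly the blending of generic and plant data the abstract describes.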

  14. Development of reliability databases and the particular requirements of probabilistic risk analyses

    International Nuclear Information System (INIS)

    Meslin, T.

    1989-01-01

    Nuclear utilities have an increasing need to develop reliability databases for their operating experience. The purposes of these databases are often multiple, including both equipment maintenance aspects and probabilistic risk analyses. EDF has therefore been developing experience feedback databases, including the Reliability Data Recording System (SRDF) and the Event File, as well as the history of numerous operating documents. Furthermore, since the end of 1985, EDF has been preparing a probabilistic safety analysis applied to one 1,300 MWe unit, for which a large amount of data of French origin is necessary. This data concerns both component reliability parameters and initiating event frequencies. The study has thus been an opportunity to try out the performance of the databases for a specific application, as well as to conduct in-depth audits of a number of nuclear sites to validate numerous results. Computer-aided data collection is also on trial in a number of plants. After describing the EDF operating experience feedback files, we discuss the particular requirements of probabilistic risk analyses, and the resources implemented by EDF to satisfy them. (author). 5 refs

  15. Structural system reliability calculation using a probabilistic fault tree analysis method

    Science.gov (United States)

    Torng, T. Y.; Wu, Y.-T.; Millwater, H. R.

    1992-01-01

    The development of a new probabilistic fault tree analysis (PFTA) method for calculating structural system reliability is summarized. The proposed PFTA procedure includes: developing a fault tree to represent the complex structural system, constructing an approximation function for each bottom event, determining a dominant sampling sequence for all bottom events, and calculating the system reliability using an adaptive importance sampling method. PFTA is suitable for complicated structural problems that require computation-intensive calculations. A computer program has been developed to implement the PFTA.
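The importance sampling mentioned above can be illustrated in its simplest (non-adaptive) form: sample from a density shifted toward the failure region and reweight each failing sample by the ratio of the true density to the sampling density. The limit state g(R, S) = R - S and all distribution parameters below are invented for illustration, not taken from the paper.

```python
# Importance-sampling estimate of P[g(R, S) < 0] for g = R - S with
# independent normal R (resistance) and S (load); illustrative numbers.
import random, math

random.seed(0)
mu_r, sd_r = 10.0, 1.0     # assumed resistance distribution
mu_s, sd_s = 6.0, 1.0      # assumed load distribution

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

# Shift the sampling means to the approximate design point (r = s = 8),
# so roughly half the samples land in the failure region.
shift_r, shift_s = 8.0, 8.0
n, acc = 100_000, 0.0
for _ in range(n):
    r = random.gauss(shift_r, sd_r)
    s = random.gauss(shift_s, sd_s)
    if r - s < 0:  # failure event
        w = (norm_pdf(r, mu_r, sd_r) * norm_pdf(s, mu_s, sd_s)) / (
            norm_pdf(r, shift_r, sd_r) * norm_pdf(s, shift_s, sd_s))
        acc += w

est = acc / n
print(est)   # close to the exact Phi(-4/sqrt(2)) ≈ 2.3e-3
```

Plain Monte Carlo would need millions of samples to see this rare event; the shifted density makes failures common and the weights correct the bias.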

  16. Standardization of domestic human reliability analysis and experience of human reliability analysis in probabilistic safety assessment for NPPs under design

    International Nuclear Information System (INIS)

    Kang, D. I.; Jung, W. D.

    2002-01-01

    This paper introduces the background and development activities of the domestic standardization of a procedure and method for Human Reliability Analysis (HRA), intended to minimize, as far as possible, the intervention of the HRA analyst's subjectivity in Probabilistic Safety Assessment (PSA), together with a review of the HRA results for domestic nuclear power plants under design studied by the Korea Atomic Energy Research Institute. We identify the HRA methods used in PSA for domestic NPPs and discuss the subjectivity of the HRA analyst exhibited in performing an HRA. We also introduce the PSA guidelines published in the USA and review the HRA results based on them. We propose a system of standard procedures and methods for HRA to be developed

  17. Reliability of structures of industrial installations. Theory and applications of probabilistic mechanics

    International Nuclear Information System (INIS)

    Procaccia, H.; Morilhat, P.; Carle, R.; Menjon, G.

    1996-01-01

    The management of the service life of mechanical components implies an evaluation of their risk of failure during use. To evaluate this risk the following methods are used: classical frequentist statistics applied to experience feedback data concerning failures observed during operation of active components (pumps, valves, exchangers, circuit breakers, etc.); the Bayesian approach in the case of scarce statistical data, when experts are needed to compensate for the lack of information; and the structural reliability approach when no data are available and a theoretical model of degradation must be used, in particular for passive structures (pressure vessels, pipes, tanks, etc.). The aim of this book is to describe the principles and applications of this third approach to industrial installations. Chapter 1 recalls the historical aspects of the probabilistic approach to the reliability of structures and the existing codes. Chapter 2 presents the level 1 deterministic method applied so far to the design of passive structures. The Cornell reliability index, already used in civil engineering codes, is defined in chapter 3. The Hasofer and Lind reliability index, a generalization of the Cornell index, is defined in chapter 4. Chapter 5 concerns the application of probabilistic approaches to optimization studies, introducing the economic variables linked to the risk and the possible actions to limit this risk (in-service inspection, maintenance, repair, etc.). Chapters 6 and 7 describe the Monte Carlo simulation and approximation methods for failure probability calculations, and recall the fracture mechanics basis and the models of load and degradation of industrial installations. Some applications are given in chapter 9 with the quantification of the safety margins of a cracked pipe and the optimization of the in-service inspection policy of a steam generator. Chapter 10 raises the problem of the coupling between mechanical and reliability
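As a worked reference for the Cornell reliability index mentioned in chapter 3: for independent normal resistance R and load effect S it has the closed form beta = (mu_R - mu_S) / sqrt(sigma_R^2 + sigma_S^2), with failure probability Phi(-beta). The numbers below are illustrative only.

```python
# Cornell reliability index for the limit state g = R - S with
# independent normal R and S (illustrative values, e.g. in MPa).
import math

mu_r, sd_r = 60.0, 6.0     # assumed resistance mean and standard deviation
mu_s, sd_s = 40.0, 8.0     # assumed load-effect mean and standard deviation

beta = (mu_r - mu_s) / math.sqrt(sd_r**2 + sd_s**2)
p_f = 0.5 * math.erfc(beta / math.sqrt(2))   # Phi(-beta), via the error function
print(beta, p_f)   # beta = 2.0, p_f ≈ 0.0228
```

The Hasofer-Lind index of chapter 4 generalizes this: it is the same distance-to-failure idea, but measured in a standardized space so the result no longer depends on how the limit-state function is written.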

  18. Probabilistic risk assessment course documentation. Volume 3. System reliability and analysis techniques, Session A - reliability

    International Nuclear Information System (INIS)

    Lofgren, E.V.

    1985-08-01

    This course in System Reliability and Analysis Techniques focuses on the quantitative estimation of reliability at the systems level. Various methods are reviewed, but the structure provided by the fault tree method is used as the basis for system reliability estimates. The principles of fault tree analysis are briefly reviewed. Contributors to system unreliability and unavailability are reviewed, models are given for quantitative evaluation, and the requirements for both generic and plant-specific data are discussed. Also covered are issues of quantifying component faults that relate to the systems context in which the components are embedded. All reliability terms are carefully defined. 44 figs., 22 tabs

  19. Photovoltaic and Wind Turbine Integration Applying Cuckoo Search for Probabilistic Reliable Optimal Placement

    Directory of Open Access Journals (Sweden)

    R. A. Swief

    2018-01-01

    Full Text Available This paper presents an efficient Cuckoo Search Optimization technique to improve the reliability of electrical power systems. Reliability indices such as Energy Not Supplied, the System Average Interruption Frequency Index, and the System Average Interruption Duration Index are the main indicators of reliability. The Cuckoo Search Optimization (CSO) technique is applied to optimally place protection devices, install distributed generators, and determine the size of distributed generators in radial feeders for reliability improvement. Distributed generators affect reliability, system power losses, and the voltage profile. The volatile behaviour of both photovoltaic cells and wind turbine farms affects the values and the selection of protection devices and the allocation of distributed generators. To improve reliability, reconfiguration takes place before installing both protection devices and distributed generators. Assessment of consumer power system reliability is a vital part of distribution system behaviour and development. The distribution system reliability calculation relies on probabilistic reliability indices, which can predict the interruption profile of a distribution system based on the volatile behaviour of the added generators and the load behaviour. The validity of the proposed algorithm has been tested using a standard IEEE 69-bus system.
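For reference, the reliability indices named in this abstract are simple ratios over yearly outage records. A hedged sketch with invented outage data (not the IEEE 69-bus case):

```python
# Standard distribution reliability indices from hypothetical outage records.
outages = [
    # (customers_interrupted, duration_hours, interrupted_load_kW)
    (120, 2.0, 300.0),
    (40,  0.5, 90.0),
    (200, 4.0, 550.0),
]
total_customers = 1000

saifi = sum(c for c, d, p in outages) / total_customers      # interruptions / customer·yr
saidi = sum(c * d for c, d, p in outages) / total_customers  # hours / customer·yr
caidi = saidi / saifi                                        # hours / interruption
ens = sum(p * d for c, d, p in outages)                      # kWh not supplied

print(saifi, saidi, caidi, ens)   # 0.36, 1.06, ~2.94, 2845.0
```

Placing protection devices or distributed generators changes which customers each fault interrupts and for how long, which is exactly how the optimization in the paper moves these indices.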

  20. Problems and chances for probabilistic fracture mechanics in the analysis of steel pressure boundary reliability

    Energy Technology Data Exchange (ETDEWEB)

    Staat, M [Forschungszentrum Juelich GmbH (Germany). Inst. fuer Sicherheitsforschung und Reaktortechnik]

    1996-12-01

    It is shown that the difficulty for probabilistic fracture mechanics (PFM) is the general problem of the high reliability of a small population. There is no way around the problem as yet. Therefore what PFM can contribute to the reliability of steel pressure boundaries is demonstrated with the example of a typical reactor pressure vessel and critically discussed. Although no method is distinguishable that could give exact failure probabilities, PFM has several additional chances. Upper limits for failure probability may be obtained together with trends for design and operating conditions. Further, PFM can identify the most sensitive parameters, improved control of which would increase reliability. Thus PFM should play a vital role in the analysis of steel pressure boundaries despite all shortcomings. (author). 19 refs, 7 figs, 1 tab.

  2. Probabilistic Analysis of Passive Safety System Reliability in Advanced Small Modular Reactors: Methodologies and Lessons Learned

    Energy Technology Data Exchange (ETDEWEB)

    Grabaskas, David; Bucknor, Matthew; Brunett, Acacia; Grelle, Austin

    2015-06-28

    Many advanced small modular reactor designs rely on passive systems to fulfill safety functions during accident sequences. These systems depend heavily on boundary conditions to induce a motive force, meaning the system can fail to operate as intended due to deviations in boundary conditions, rather than as the result of physical failures. Furthermore, passive systems may operate in intermediate or degraded modes. These factors make passive system operation difficult to characterize with a traditional probabilistic framework that only recognizes discrete operating modes and does not allow for the explicit consideration of time-dependent boundary conditions. Argonne National Laboratory has been examining various methodologies for assessing passive system reliability within a probabilistic risk assessment for a station blackout event at an advanced small modular reactor. This paper describes the most promising options: mechanistic techniques, which share qualities with conventional probabilistic methods, and simulation-based techniques, which explicitly account for time-dependent processes. The primary intention of this paper is to describe the strengths and weaknesses of each methodology and highlight the lessons learned while applying the two techniques while providing high-level results. This includes the global benefits and deficiencies of the methods and practical problems encountered during the implementation of each technique.

  3. Development of advanced methods and related software for human reliability evaluation within probabilistic safety analyses

    International Nuclear Information System (INIS)

    Kosmowski, K.T.; Mertens, J.; Degen, G.; Reer, B.

    1994-06-01

    Human Reliability Analysis (HRA) is an important part of Probabilistic Safety Analysis (PSA). The first part of this report consists of an overview of types of human behaviour and human error, including the effect of significant performance shaping factors on human reliability. Particularly with regard to safety assessments for nuclear power plants, many HRA methods have been developed. The most important of these methods are presented and discussed in the report, together with techniques for incorporating HRA into PSA and with models of operator cognitive behaviour. Based on existing HRA methods, the concept of a software system is described. For the development of this system the utilization of modern programming tools is proposed; the essential goal is the effective application of HRA methods. A possible integration of computer-aided HRA within PSA is discussed. The features of Expert System Technology and examples of applications (PSA, HRA) are presented in four appendices. (orig.) [de

  4. Operator reliability study for Probabilistic Safety Analysis of an operating research reactor

    International Nuclear Information System (INIS)

    Mohamed, F.; Hassan, A.; Yahaya, R.; Rahman, I.; Maskin, M.; Praktom, P.; Charlie, F.

    2015-01-01

    Highlights: • Human Reliability Analysis (HRA) for a Level 1 Probabilistic Safety Analysis (PSA) is performed on a research nuclear reactor. • The implemented qualitative HRA framework is described. • Human Failure Events with significant impact on reactor safety are derived. - Abstract: A Level 1 Probabilistic Safety Analysis (PSA) for the TRIGA Mark II research reactor of the Malaysian Nuclear Agency has been developed to evaluate the potential risk in its operation. In conjunction with this PSA development, a Human Reliability Analysis (HRA) is performed in order to determine the human contribution to the risk. The aim of this study is to qualitatively analyze the human actions (HAs) involved in the operation of this reactor according to the qualitative part of the HRA framework for PSA, namely the identification, qualitative screening and modeling of HAs. By applying this framework, Human Failure Events (HFEs) with significant impact on reactor safety are systematically analyzed and incorporated into the PSA structure. A part of the findings of this study will become the input for the subsequent quantitative part of the HRA framework, i.e. Human Error Probability (HEP) quantification

  5. Stochastic network interdiction optimization via capacitated network reliability modeling and probabilistic solution discovery

    International Nuclear Information System (INIS)

    Ramirez-Marquez, Jose Emmanuel; Rocco S, Claudio M.

    2009-01-01

    This paper introduces an evolutionary optimization approach that can be readily applied to solve stochastic network interdiction problems (SNIP). The network interdiction problem solved considers the minimization of the cost associated with an interdiction strategy such that the maximum flow that can be transmitted between a source node and a sink node for a fixed network design is greater than or equal to a given reliability requirement. Furthermore, the model assumes that the nominal capacity of each network link and the cost associated with their interdiction can change from link to link and that such interdiction has a probability of being successful. This version of the SNIP is for the first time modeled as a capacitated network reliability problem allowing for the implementation of computation and solution techniques previously unavailable. The solution process is based on an evolutionary algorithm that implements: (1) Monte-Carlo simulation, to generate potential network interdiction strategies, (2) capacitated network reliability techniques to analyze strategies' source-sink flow reliability and, (3) an evolutionary optimization technique to define, in probabilistic terms, how likely a link is to appear in the final interdiction strategy. Examples for different sizes of networks are used throughout the paper to illustrate the approach
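The capacitated network reliability idea at the core of this model can be sketched directly: each link is up with some probability, and the network "succeeds" when the maximum source-sink flow meets a demand. The sketch below is an illustrative Monte Carlo estimate on an invented 4-node network, not the paper's evolutionary algorithm.

```python
# Monte Carlo estimate of P[max s-t flow >= demand] when links fail
# independently; all network data and probabilities are invented.
import random
from collections import deque

def max_flow(n, cap, s, t):
    # Edmonds-Karp (BFS augmenting paths) on an n-node capacity matrix.
    flow = 0
    while True:
        parent = [-1] * n
        parent[s] = s
        q = deque([s])
        while q and parent[t] == -1:
            u = q.popleft()
            for v in range(n):
                if parent[v] == -1 and cap[u][v] > 0:
                    parent[v] = u
                    q.append(v)
        if parent[t] == -1:
            return flow
        bott, v = float("inf"), t        # bottleneck along the path
        while v != s:
            bott = min(bott, cap[parent[v]][v])
            v = parent[v]
        v = t
        while v != s:                    # update residual capacities
            cap[parent[v]][v] -= bott
            cap[v][parent[v]] += bott
            v = parent[v]
        flow += bott

# Assumed network: 0 = source, 3 = sink, (u, v, capacity) per link.
links = [(0, 1, 10), (0, 2, 10), (1, 3, 10), (2, 3, 10), (1, 2, 5)]
p_up, demand = 0.9, 10

random.seed(1)
trials, ok = 20_000, 0
for _ in range(trials):
    cap = [[0] * 4 for _ in range(4)]
    for u, v, c in links:
        if random.random() < p_up:       # link survives (or is not interdicted)
            cap[u][v] = c
    if max_flow(4, cap, 0, 3) >= demand:
        ok += 1

rel = ok / trials
print(rel)
```

An interdiction strategy in the paper's sense corresponds to forcing chosen links down (with some success probability) and asking whether this reliability can be pushed below the requirement at minimum cost.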

  6. Development of reliability and probabilistic safety assessment program RiskA

    International Nuclear Information System (INIS)

    Wu, Yican

    2015-01-01

    Highlights: • There are four parts in the structure of RiskA. The user input part lets users enter the PSA model and the necessary data via a GUI or a model transformation tool. The calculation engine part supplies fault tree analysis, event tree analysis, uncertainty analysis, sensitivity analysis, importance analysis and failure mode and effects analysis. The user output part produces the analysis results, user-customized reports and other data. The last part includes the reliability database, some other common tools and help documents. • RiskA has several advanced features. Its extensible framework makes it easy to add new functions, making RiskA a large platform for reliability and probabilistic safety assessment. Fault tree analysis in RiskA is very fast because many advanced algorithmic improvements were made. Many model formats can be imported and exported, so PSA models built in commercial software can easily be transformed to the RiskA platform. Web-based co-modeling lets several users in different places work together whenever they are online. • A comparison between RiskA and other mature PSA codes (e.g. CAFTA, RiskSpectrum, XFTA) has demonstrated that the calculation and analysis of RiskA are correct and efficient. Based on the development of this code package, many applications to the safety and reliability analysis of research reactors and nuclear power plants were performed. The development of RiskA appears to be of realistic and potential value for academic research and for the practical operational safety management of nuclear power plants in China and abroad. - Abstract: PSA (probabilistic safety assessment) software, an indispensable tool in nuclear safety assessment, has been widely used. An integrated reliability and PSA program named RiskA has been developed by the FDS Team. RiskA supplies several standard PSA modules including fault tree analysis, event tree analysis, uncertainty analysis, failure mode and effect analysis and reliability

  7. RELEVANT OBJECTIVES OF ASSURANCE OF RELIABILITY OF FACADE SYSTEMS SERVING THERMAL INSULATION AND FINISHING PURPOSES

    Directory of Open Access Journals (Sweden)

    Yavorskiy Andrey Andreevich

    2012-12-01

    Full Text Available The authors consider up-to-date methods of implementing the requirements stipulated by Federal Law no. 261-FZ, which encompasses the reduction of heat losses through the installation of progressive heat-insulation systems, cement plaster systems (CPS), and ventilated facades (VF). Unresolved problems of their efficient application, caused by the absence of all-Russian regulatory documents capable of controlling the processes of their installation and maintenance, as well as the prediction of their behaviour, are also considered in the article. The authors argue that the professional skills of the designers and construction workers responsible for the design and installation of facade systems influence the quality and reliability of design and construction works. The absence of unified solutions or regulations is the objective reason for the absence of the respective database; therefore, there is an urgent need to perform a set of studies to have a unified database compiled. The authors use thermal insulation cement plaster facade systems as an example in presenting the results of research into the quantitative analysis of system safety. Collected and systematized data on defects that have proven to cause failures, as well as potential methods of their prevention, are also studied. Data from pilot studies of the major factors influencing the reliability of the adhesive bond between the CPS and the wall base are provided.

  8. Human Reliability in Probabilistic Safety Assessments; Fiabilidad Humana en los Analisis Probabilisticos de Seguridad

    Energy Technology Data Exchange (ETDEWEB)

    Nunez Mendez, J

    1989-07-01

    Nowadays a growing interest in environmental aspects is detected in our country. It implies an assessment of the risk involved in industrial processes and installations in order to determine whether they are within acceptable limits. In these safety assessments, among which PSAs (Probabilistic Safety Assessments) stand out, the role played by the human being in the system is one of the most relevant subjects (a relevance demonstrated by past accidents). However, in Spain there are no manuals specifically dedicated to assessing the human contribution to risk in the frame of PSAs. This report aims to improve this situation by providing: a) a theoretical background to help the reader understand the nature of human error, b) a guide to carrying out a Human Reliability Analysis, and c) a selected overview of the techniques and methodologies currently applied in this area. (Author) 20 refs.

  9. Human reliability analysis for probabilistic safety assessments - review of methods and issues

    International Nuclear Information System (INIS)

    Srinivas, G.; Guptan, Rajee; Malhotra, P.K.; Ghadge, S.G.; Chandra, Umesh

    2011-01-01

    It is well known that the two major events in World Nuclear Power Plant Operating history, namely the Three Mile Island and Chernobyl, were Human failure events. Subsequent to these two events, several significant changes have been incorporated in Plant Design, Control Room Design and Operator Training to reduce the possibility of Human errors during plant transients. Still, human error contribution to Risk in Nuclear Power Plant operations has been a topic of continued attention for research, development and analysis. Probabilistic Safety Assessments attempt to capture all potential human errors with a scientifically computed failure probability, through Human Reliability Analysis. Several methods are followed by different countries to quantify the Human error probability. This paper reviews the various popular methods being followed, critically examines them with reference to their criticisms and brings out issues for future research. (author)

  10. A note on the application of probabilistic structural reliability methodology to nuclear power plants

    International Nuclear Information System (INIS)

    Maurer, H.A.

    1978-01-01

    The interest shown in the general prospects of primary energy in European countries prompted a description of the actual European situation. Explanations of the need for the installation of nuclear power plants in most countries of the European Communities are given. The activities of the Commission of the European Communities to initiate a progressive harmonization of already existing European criteria, codes and complementary requirements, in order to improve the structural reliability of components and systems of nuclear power plants, are summarized. Finally, the applicability of a probabilistic safety analysis to facilitate decision-making on safety, by defining acceptable target and limit values coupled with a subjective estimate as applied in the safety analyses performed in most European countries, is demonstrated. (Auth.)

  11. Human Reliability in Probabilistic Safety Assessments; Fiabilidad Humana en los Analisis Probabilisticos de Seguridad

    Energy Technology Data Exchange (ETDEWEB)

    Nunez Mendez, J.

    1989-07-01

    Nowadays a growing interest in environmental aspects is detected in our country. It implies an assessment of the risk involved in industrial processes and installations in order to determine whether they are within acceptable limits. In these safety assessments, among which probabilistic safety assessments (PSAs) can be pointed out, the role played by the human being in the system is one of the most relevant subjects (a relevance demonstrated by the accidents that have occurred). However, in Spain there are no manuals specifically dedicated to assessing the human contribution to risk in the frame of PSAs. This report aims to improve this situation by providing: a) a theoretical background to help the reader understand the nature of human error, b) a guide to carrying out a human reliability analysis, and c) a selected overview of the techniques and methodologies currently applied in this area. (Author) 20 refs.

  12. Reliability calculation of cracked components using probabilistic fracture mechanics and a Markovian approach

    International Nuclear Information System (INIS)

    Schmidt, T.

    1988-01-01

    The numerical reliability calculation of cracked structural components under cyclic fatigue stress can be done with the help of probabilistic fracture mechanics models. An alternative to the Monte Carlo simulation method is examined; it is based on describing the failure process by a Markov process. The Markov method is traced back directly to the stochastic parameters of a two-dimensional fracture mechanics model, with the effects of inspections and repairs also being considered. The probability of failure and the expected failure frequency can be determined as functions of time from the transition and conditional probabilities of the original or derived Markov process. For the actual calculation, an approximating Markov chain is constructed which, under certain conditions, gives a sufficient approximation of the original Markov process and the reliability characteristics determined from it. Application of the MARKOV program code developed for this algorithm shows sufficient agreement with Monte Carlo reference results. The starting point of the investigation was the 'Deutsche Risikostudie B (DWR)' ('German Risk Study B (DWR)'), specifically the reliability of the main coolant line. (orig./HP) [de
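As a rough illustration of the Markov-chain alternative described in this record, the sketch below discretizes crack depth into a handful of states with an absorbing failure state and folds in periodic inspection and repair. The growth probability, state count, and detection probability are invented placeholders, not parameters of the study's two-dimensional fracture mechanics model.

```python
import numpy as np

# Toy Markov chain over discretized crack states 0..3 plus an absorbing
# failure state 4; each load cycle the crack may advance one state.
P_GROW = 0.002  # per-cycle growth probability (assumed)
N_STATES = 5
P = np.zeros((N_STATES, N_STATES))
for s in range(4):
    P[s, s] = 1.0 - P_GROW
    P[s, s + 1] = P_GROW
P[4, 4] = 1.0  # failure is absorbing

def failure_probability(cycles, inspect_every=None, p_detect=0.9):
    """Probability of absorption in the failed state after `cycles` steps.
    An inspection returns detected cracks (states 1-3) to state 0."""
    state = np.zeros(N_STATES)
    state[0] = 1.0
    for t in range(1, cycles + 1):
        state = state @ P  # one cycle of the chain
        if inspect_every and t % inspect_every == 0:
            repaired = p_detect * state[1:4].sum()
            state[1:4] *= 1.0 - p_detect
            state[0] += repaired
    return state[4]

p_no_insp = failure_probability(2000)
p_insp = failure_probability(2000, inspect_every=500)
```

The same machinery extends to the expected failure frequency by accumulating the per-step probability flux into the absorbing state.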

  13. Current activities and future trends in reliability analysis and probabilistic safety assessment in Hungary

    International Nuclear Information System (INIS)

    Hollo, E.; Toth, J.

    1986-01-01

    In Hungary reliability analysis (RA) and probabilistic safety assessment (PSA) of nuclear power plants were initiated 3 years ago. First, computer codes for automatic fault tree analysis (CAT, PREP) and numerical evaluation (REMO, KITT1,2) were adapted. Two main case studies - a detailed availability/reliability calculation of diesel sets and an analysis of safety systems influencing event sequences induced by a large LOCA - were performed. Input failure data were taken from publications; a need for a failure and reliability data bank was revealed. Current and future activities involve: setup of a national data bank for WWER-440 units; a full-scope level-1 PSA of the PAKS NPP in Hungary; and operational safety assessment of particular problems at the PAKS NPP. In the present article the state of RA and PSA activities in Hungary, as well as the main objectives of ongoing work, are described. A need for international cooperation (for unified data collection on WWER-440 units) and for IAEA support (within Interregional Program INT/9/063) is emphasized. (author)

  14. A probabilistic approach to safety/reliability of space nuclear power systems

    International Nuclear Information System (INIS)

    Medford, G.; Williams, K.; Kolaczkowski, A.

    1989-01-01

    An ongoing effort is investigating the feasibility of using probabilistic risk assessment (PRA) modeling techniques to construct a living model of a space nuclear power system. This is being done in conjunction with a traditional reliability and survivability analysis of the SP-100 space nuclear power system. The initial phase of the project consists of three major parts with the overall goal of developing a top-level system model and defining initiating events of interest for the SP-100 system. The three major tasks were performing a traditional survivability analysis, performing a simple system reliability analysis, and constructing a top-level system fault-tree model. Each of these tasks and their interim results are discussed in this paper. Initial results from the study support the conclusion that PRA modeling techniques can provide a valuable design and decision-making tool for space reactors. The ability of the model to rank and calculate relative contributions from various failure modes allows design optimization for maximum safety and reliability. Future efforts in the SP-100 program will see data development and quantification of the model to allow parametric evaluations of the SP-100 system. Current efforts have shown the need for formal data development and test programs within such a modeling framework.

  15. Space Shuttle Rudder Speed Brake Actuator - A Case Study: Probabilistic Fatigue Life and Reliability Analysis

    Science.gov (United States)

    Oswald, Fred B.; Savage, Michael; Zaretsky, Erwin V.

    2015-01-01

    The U.S. Space Shuttle fleet was originally intended to have a life of 100 flights for each vehicle, lasting over a 10-year period, with minimal scheduled maintenance or inspection. The first space shuttle flight was that of the Space Shuttle Columbia (OV-102), launched April 12, 1981. The disaster that destroyed Columbia occurred on its 28th flight, February 1, 2003, nearly 22 years after its first launch. In order to minimize risk of losing another Space Shuttle, a probabilistic life and reliability analysis was conducted for the Space Shuttle rudder/speed brake actuators to determine the number of flights the actuators could sustain. A life and reliability assessment of the actuator gears was performed in two stages: a contact stress fatigue model and a gear tooth bending fatigue model. For the contact stress analysis, the Lundberg-Palmgren bearing life theory was expanded to include gear-surface pitting for the actuator as a system. The mission spectrum of the Space Shuttle rudder/speed brake actuator was combined into equivalent effective hinge moment loads including an actuator input preload for the contact stress fatigue and tooth bending fatigue models. Gear system reliabilities are reported for both models and their combination. Reliability of the actuator bearings was analyzed separately, based on data provided by the actuator manufacturer. As a result of the analysis, the reliability of one half of a single actuator was calculated to be 98.6 percent for 12 flights. Accordingly, each actuator was subsequently limited to 12 flights before removal from service in the Space Shuttle.
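The Lundberg-Palmgren survival relation underlying the contact-stress stage of the analysis can be sketched as follows; the L10 lives and Weibull slopes below are hypothetical stand-ins for the gear pitting and tooth bending modes, not the actuator data from the paper.

```python
# Lundberg-Palmgren-style survival: S(N) = 0.9 ** ((N / L10) ** e), where L10
# is the life at 90 percent survival and e is the Weibull slope.
def survival(n_flights, l10_flights, weibull_slope):
    return 0.9 ** ((n_flights / l10_flights) ** weibull_slope)

def system_survival(n_flights, failure_modes):
    # Independent failure modes combine as a product of survival probabilities.
    s = 1.0
    for l10, slope in failure_modes:
        s *= survival(n_flights, l10, slope)
    return s

# Hypothetical (L10 in flights, Weibull slope) for pitting and tooth bending.
modes = [(60.0, 1.5), (90.0, 2.0)]
r_12 = system_survival(12, modes)
```

With these invented numbers the 12-flight system reliability lands near 0.99; the paper's 98.6 percent figure comes from its own load spectrum and material data.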

  16. Development of Probabilistic Reliability Models of Photovoltaic System Topologies for System Adequacy Evaluation

    Directory of Open Access Journals (Sweden)

    Ahmad Alferidi

    2017-02-01

    The contribution of solar power in electric power systems has been increasing rapidly due to its environmentally friendly nature. Photovoltaic (PV) systems contain solar cell panels, power electronic converters, high power switching and often transformers. These components collectively play an important role in shaping the reliability of PV systems. Moreover, the power output of PV systems is variable, so it cannot be controlled as easily as conventional generation due to the unpredictable nature of weather conditions. Therefore, solar power has a different influence on generating system reliability compared to conventional power sources. Recently, different PV system designs have been constructed to maximize the output power of PV systems. These different designs are commonly adopted based on the scale of a PV system. Large-scale grid-connected PV systems are generally connected in a centralized or a string structure. Central and string PV schemes are different in terms of connecting the inverter to PV arrays. Micro-inverter systems are recognized as a third PV system topology. It is therefore important to evaluate the reliability contribution of PV systems under these topologies. This work utilizes a probabilistic technique to develop a power output model for a PV generation system. A reliability model is then developed for a PV integrated power system in order to assess the reliability and energy contribution of the solar system to meet overall system demand. The developed model is applied to a small isolated power unit to evaluate system adequacy and capacity level of a PV system considering the three topologies.
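A toy Monte Carlo version of the topology comparison might look like the following; the panel count, ratings, inverter availability, and the uniform irradiance stand-in are all assumptions for illustration, not the paper's model.

```python
import random

# Compare expected deliverable power of a central-inverter PV plant against a
# micro-inverter plant, folding inverter availability into the output model.
random.seed(1)
N_PANELS = 100
PANEL_KW = 0.3
A_INVERTER = 0.97  # availability of any single inverter (assumed)

def sample_irradiance():
    # Crude stand-in for a weather model: irradiance fraction in [0, 1].
    return random.random()

def central_output():
    # One central inverter: if it is down, the whole plant is down.
    g = sample_irradiance()
    up = random.random() < A_INVERTER
    return N_PANELS * PANEL_KW * g if up else 0.0

def micro_output():
    # One inverter per panel: failures only remove individual panels.
    g = sample_irradiance()
    n_up = sum(random.random() < A_INVERTER for _ in range(N_PANELS))
    return n_up * PANEL_KW * g

TRIALS = 10_000
e_central = sum(central_output() for _ in range(TRIALS)) / TRIALS
e_micro = sum(micro_output() for _ in range(TRIALS)) / TRIALS
```

Both topologies have the same mean availability here, but the central scheme concentrates outages into total-loss events, which is what drives the adequacy differences the paper evaluates.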

  17. A shortened version of the THERP/Handbook approach to human reliability analysis for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1986-01-01

    The approach to human reliability analysis (HRA) known as THERP/Handbook has been applied to several probabilistic risk assessments (PRAs) of nuclear power plants (NPPs) and other complex systems. The approach is based on a thorough task analysis of the man-machine interfaces, including the interactions among the people involved in the operations being assessed. The idea is to assess fully the underlying performance shaping factors (PSFs) and dependence effects which result in either reliable or unreliable human performance.

  18. Reliability high cycle fatigue design of gas turbine blading system using probabilistic Goodman diagram

    Energy Technology Data Exchange (ETDEWEB)

    Herman Shen, M.-H. [Ohio State Univ., Columbus, OH (United States). Dept. of Aerospace Engineering and Aviation; Nicholas, T. [MLLN, Wright-Patterson AFB, OH (United States). Air Force Research Lab.

    2001-07-01

    A framework for the probabilistic analysis of high cycle fatigue is developed. The framework will be useful to the U.S. Air Force and aeroengine manufacturers in the design against high cycle fatigue of disk or compressor components fabricated from Ti-6Al-4V under a range of loading conditions that might be encountered during service. The main idea of the framework is to characterize vibratory stresses from random input variables due to uncertainties such as crack location, loading, material properties, and manufacturing variability. The characteristics of such vibratory stresses are portrayed graphically as histograms, or probability density functions (PDFs). The probability that the vibratory stress exceeds the material capability is obtained from a failure function g(X), defined as the difference between the Goodman line or surface and the vibratory stress, such that the probability of HCF failure is P{sub f} = P(g(X) < 0). Design can then be based on a go/no-go criterion for an assumed risk. The framework can be used to facilitate the development of design tools for the prediction of inspection schedules and reliability in aeroengine components. Such tools could lead ultimately to improved life extension schemes in aging aircraft, and more reliable methods for the design and inspection of critical components. (orig.)
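The failure function described above can be estimated with a direct Monte Carlo sketch like the one below; the strength values and stress distributions are loosely Ti-6Al-4V-flavoured placeholders, not design data from the framework.

```python
import random

# Monte Carlo estimate of P_f = P(g(X) < 0) with the Goodman allowable
# amplitude as the capability. All numbers are illustrative assumptions.
random.seed(42)
SIGMA_U = 950.0   # ultimate strength, MPa (assumed)
SIGMA_E = 500.0   # fully reversed endurance limit, MPa (assumed)

def g():
    sigma_m = random.gauss(300.0, 30.0)   # mean stress, MPa
    sigma_a = random.gauss(250.0, 40.0)   # vibratory stress amplitude, MPa
    allowable = SIGMA_E * (1.0 - sigma_m / SIGMA_U)  # Goodman line
    return allowable - sigma_a             # failure when g < 0

TRIALS = 100_000
p_f = sum(g() < 0.0 for _ in range(TRIALS)) / TRIALS
```

A go/no-go decision then reduces to comparing `p_f` against the assumed acceptable risk.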

  19. Key attributes of the SAPHIRE risk and reliability analysis software for risk-informed probabilistic applications

    International Nuclear Information System (INIS)

    Smith, Curtis; Knudsen, James; Kvarfordt, Kellie; Wood, Ted

    2008-01-01

    The Idaho National Laboratory is a primary developer of probabilistic risk and reliability analysis (PRRA) tools, dating back over 35 years. Evolving from mainframe-based software, the current state of the practice has led to the creation of the SAPHIRE software. Currently, agencies such as the Nuclear Regulatory Commission, the National Aeronautics and Space Administration, the Department of Energy, and the Department of Defense use version 7 of the SAPHIRE software for many of their risk-informed activities. In order to better understand and appreciate the power of software as part of risk-informed applications, we need to recall that our current analysis and solution methods build upon pioneering work done 30-40 years ago. We contrast this work with the current capabilities of the SAPHIRE analysis package. As part of this discussion, we provide information on both the typical features and the special analysis capabilities which are available. We also present the applications and results typically found with state-of-the-practice PRRA models. By providing both a high-level and a detailed look at the SAPHIRE software, we give a snapshot in time of the current use of software tools in a risk-informed decision arena.

  20. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission’s (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial nuclear power plants (NPPs) in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research with the objective of modeling and quantifying team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  1. An overview of the evolution of human reliability analysis in the context of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Bley, Dennis C.; Lois, Erasmia; Kolaczkowski, Alan M.; Forester, John Alan; Wreathall, John; Cooper, Susan E.

    2009-01-01

    Since the Reactor Safety Study in the early 1970s, human reliability analysis (HRA) has been evolving towards a better ability to account for the factors and conditions that can lead humans to take unsafe actions, and thereby to provide better estimates of the likelihood of human error for probabilistic risk assessments (PRAs). The purpose of this paper is to provide an overview of recent reviews of operational events and advances in the behavioral sciences that have impacted the evolution of HRA methods and contributed to improvements. The paper discusses the importance of human errors in complex human-technical systems, examines why humans contribute to accidents and unsafe conditions, and discusses how lessons learned over the years have changed the perspective and approach for modeling human behavior in PRAs of complicated domains such as nuclear power plants. It is argued that it has become increasingly more important to understand and model the more cognitive aspects of human performance and to address the broader range of factors that have been shown to influence human performance in complex domains. The paper concludes by addressing the current ability of HRA to adequately predict human failure events and their likelihood.

  2. Integration of human reliability analysis into the probabilistic risk assessment process: phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1985-01-01

    The US Nuclear Regulatory Commission and Pacific Northwest Laboratory initiated a research program in 1984 to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three-phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second.

  3. Integration of human reliability analysis into the probabilistic risk assessment process: Phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1984-10-01

    A research program was initiated to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three-phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second

  4. An overview of the evolution of human reliability analysis in the context of probabilistic risk assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Bley, Dennis C. (Buttonwood Consulting Inc., Oakton, VA); Lois, Erasmia (U.S. Nuclear Regulatory Commission, Washington, DC); Kolaczkowski, Alan M. (Science Applications International Corporation, Eugene, OR); Forester, John Alan; Wreathall, John (John Wreathall and Co., Dublin, OH); Cooper, Susan E. (U.S. Nuclear Regulatory Commission, Washington, DC)

    2009-01-01

    Since the Reactor Safety Study in the early 1970s, human reliability analysis (HRA) has been evolving towards a better ability to account for the factors and conditions that can lead humans to take unsafe actions, and thereby to provide better estimates of the likelihood of human error for probabilistic risk assessments (PRAs). The purpose of this paper is to provide an overview of recent reviews of operational events and advances in the behavioral sciences that have impacted the evolution of HRA methods and contributed to improvements. The paper discusses the importance of human errors in complex human-technical systems, examines why humans contribute to accidents and unsafe conditions, and discusses how lessons learned over the years have changed the perspective and approach for modeling human behavior in PRAs of complicated domains such as nuclear power plants. It is argued that it has become increasingly more important to understand and model the more cognitive aspects of human performance and to address the broader range of factors that have been shown to influence human performance in complex domains. The paper concludes by addressing the current ability of HRA to adequately predict human failure events and their likelihood.

  5. Setting reinspection intervals for seam welded piping by use of probabilistic fracture mechanics and target reliability values

    International Nuclear Information System (INIS)

    Harris, D.O.; Dedhia, D.

    1995-01-01

    The purpose of this paper is to describe a procedure for the selection of a reinspection interval for defects found during an inspection. The procedure is based on probabilistic fracture mechanics calculations of the reliability of the component into the future and selection of an inspection time based on maintaining the target reliability value. The selection of a target value based on the risk of everyday activities is discussed. The procedure is applied to high temperature seam welded piping as an example, because the probabilistic fracture mechanics tools are relatively readily available and this is a problem of great current interest. The results obtained in the example problem indicate reinspection intervals much shorter than field experience would suggest. This indicates a conservatism in the fracture mechanics procedures and/or a lack of accurate characterization of the scatter in material properties due to limited data. The general procedure should prove useful in the disposition of detected cracks in a wide variety of situations.
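The reinspection-interval logic can be sketched by growing a population of detected cracks with a Paris-type law and taking the longest interval that keeps the failure probability below the target. All constants below are illustrative assumptions, not the seam-weld data of the paper.

```python
import random

# Grow sampled crack depths with a Paris-type law; the reinspection interval
# is the longest time for which P(failure) stays below the target.
random.seed(7)
C, M = 1e-8, 3.0        # Paris-law constants (assumed units: mm, cycles)
DK_COEF = 20.0          # stress-intensity range per sqrt(crack depth) (assumed)
A_CRIT = 10.0           # critical crack depth, mm (assumed)
TARGET_PF = 1e-2        # allowed failure probability between inspections

def grow(a0, cycles, step=1000):
    # Explicit-Euler integration of da/dN = C * (DK_COEF * sqrt(a)) ** M.
    a = a0
    for _ in range(0, cycles, step):
        dk = DK_COEF * a ** 0.5
        a += C * dk ** M * step
        if a >= A_CRIT:
            return A_CRIT
    return a

cracks = [random.uniform(0.5, 2.0) for _ in range(2000)]  # depths found at inspection

def failure_prob(cycles):
    return sum(grow(a0, cycles) >= A_CRIT for a0 in cracks) / len(cracks)

interval = 0
while failure_prob(interval + 1000) < TARGET_PF:
    interval += 1000
```

Tightening `TARGET_PF` or sampling larger initial cracks shortens the resulting interval, mirroring the conservatism discussed in the abstract.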

  6. Probabilistic safety assessment of Tehran Research Reactor using systems analysis programs for hands-on integrated reliability evaluations

    International Nuclear Information System (INIS)

    Hosseini, M.H.; Nematollahi, M.R.; Sepanloo, K.

    2004-01-01

    Probabilistic safety assessment application is found to be a practical tool for research reactor safety due to the intense involvement of human interactions in an experimental facility. In this document the application of probabilistic safety assessment to the Tehran Research Reactor is presented. The level 1 probabilistic safety assessment application involved: familiarization with the plant, selection of accident initiators, mitigating functions and system definitions, event tree construction and quantification, fault tree construction and quantification, human reliability analysis, component failure database development, and dependent failure analysis. Each of the steps of the analysis given above is discussed with highlights from the selected results. Quantification of the constructed models is done using the Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) software.

  7. A comparative study of the probabilistic fracture mechanics and the stochastic Markovian process approaches for structural reliability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Stavrakakis, G.; Lucia, A.C.; Solomos, G. (Commission of the European Communities, Ispra (Italy). Joint Research Centre)

    1990-01-01

    The two computer codes COVASTOL and RELIEF, developed for the modeling of cumulative damage processes in the framework of probabilistic structural reliability, are compared. They are based respectively on the randomisation of a differential crack growth law and on the theory of discrete Markov processes. The codes are applied for fatigue crack growth predictions using two sets of data of crack propagation curves from specimens. The results are critically analyzed and an extensive discussion follows on the merits and limitations of each code. Their transferability for the reliability assessment of real structures is investigated. (author).

  8. A probabilistic capacity spectrum strategy for the reliability analysis of bridge pile shafts considering soil structure interaction

    Directory of Open Access Journals (Sweden)

    Dookie Kim

    This paper presents a probabilistic capacity spectrum strategy for the reliability analysis of a bridge pile shaft, accounting for uncertainties in design factors and for soil-structure interaction (SSI). The Monte Carlo simulation (MCS) method is adopted to determine the probabilities of failure by comparing the responses with defined limit states. The analysis considers the soil-structure interaction together with the probabilistic application of the capacity spectrum method for different types of limit states. A cast-in-drilled-hole (CIDH) extended reinforced concrete pile shaft of a bridge is analysed using the proposed strategy. The results of the analysis show that the SSI can lead to an increase or decrease of the structure's probability of failure depending on the definition of the limit states.
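The Monte Carlo step of comparing sampled responses against defined limit states can be sketched as follows; the drift thresholds and the lognormal demand model (standing in for the capacity-spectrum response including SSI) are invented for illustration.

```python
import math
import random

# Count how often sampled displacement demand exceeds each limit state.
random.seed(11)
LIMIT_STATES = {"serviceability": 0.05, "damage_control": 0.12, "collapse": 0.25}

def demand_drift():
    # Seismic displacement demand including SSI effects (placeholder model).
    return random.lognormvariate(math.log(0.06), 0.4)

TRIALS = 50_000
samples = [demand_drift() for _ in range(TRIALS)]
p_fail = {name: sum(d > cap for d in samples) / TRIALS
          for name, cap in LIMIT_STATES.items()}
```

Because the limit states are nested, the estimated failure probabilities decrease monotonically from serviceability to collapse.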

  9. Fuzzy sets as extension of probabilistic models for evaluating human reliability

    International Nuclear Information System (INIS)

    Przybylski, F.

    1996-11-01

    On the basis of a survey of established quantification methodologies for evaluating human reliability, a new computerized methodology was developed in which user uncertainties are given differentiated treatment. In this quantification method, FURTHER (FUzzy Sets Related To Human Error Rate Prediction), user uncertainties are quantified separately from model and data uncertainties. Fuzzy sets are applied as tools, which, however, remain hidden from the method's user. In the quantification process the user only chooses an action pattern, performance shaping factors and natural-language expressions. The acknowledged method HEART (Human Error Assessment and Reduction Technique) serves as the foundation of the fuzzy set approach FURTHER. The aspects of fuzzification identified are the selection of a basic task together with its basic error probability, the judgment of how correct the basic task's selection is, the selection of a performance shaping factor, and the judgment of how correct that selection is and how important the performance shaping factor is. This fuzzification is based on data collection and information from the literature as well as on estimates by competent persons. To determine the amount of additional information gained by using fuzzy sets, a benchmark session was carried out in which twelve actions were assessed by five test persons. For the same degree of detail in the action modelling process, the bandwidths of the interpersonal evaluations decrease in FURTHER in comparison with HEART. The uncertainties of the single results could not yet be reduced. The benchmark sessions conducted so far showed plausible results. Further testing of the fuzzy set approach using better-confirmed fuzzy sets can only be achieved in future practical application; adequate procedures, however, are provided. (orig.) [de
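For reference, the crisp HEART calculation that FURTHER fuzzifies multiplies a generic task's nominal error probability by a factor for each error-producing condition (EPC), weighted by its assessed proportion of affect (APOA). The numbers below are a hypothetical assessment, not data from the report.

```python
# HEART quantification: HEP = nominal_HEP * product((EPC_max - 1) * APOA + 1),
# capped at 1.0. Inputs here are invented for illustration.
def heart_hep(nominal_hep, epcs):
    hep = nominal_hep
    for max_effect, apoa in epcs:
        hep *= (max_effect - 1.0) * apoa + 1.0
    return min(hep, 1.0)

# Hypothetical assessment: nominal HEP 0.003 with two EPCs
# (max effect, assessed proportion of affect).
result = heart_hep(0.003, [(11.0, 0.4), (3.0, 0.5)])
```

FURTHER's contribution is to replace these point estimates with fuzzy numbers so that the user's confidence in each selection propagates into the result.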

  10. Photovoltaic and Wind Turbine Integration Applying Cuckoo Search for Probabilistic Reliable Optimal Placement

    OpenAIRE

    R. A. Swief; T. S. Abdel-Salam; Noha H. El-Amary

    2018-01-01

    This paper presents an efficient Cuckoo Search Optimization technique to improve the reliability of electrical power systems. Various reliability objective indices such as Energy Not Supplied, System Average Interruption Frequency Index, and System Average Interruption Duration Index are the main indices indicating reliability. The Cuckoo Search Optimization (CSO) technique is applied to optimally place the protection devices, install the distributed generators, and to determine the size of ...
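The reliability indices named in this abstract are simple ratios over the outage history; the toy records below are invented for illustration.

```python
# SAIFI = total customer interruptions / customers served
# SAIDI = total customer interruption hours / customers served
# ENS   = energy not supplied = sum(interrupted load * outage duration)
outages = [
    # (customers affected, duration in hours, interrupted load in kW) - toy data
    (400, 1.5, 120.0),
    (150, 0.5, 40.0),
    (900, 3.0, 300.0),
]
N_CUSTOMERS = 5000

saifi = sum(c for c, _, _ in outages) / N_CUSTOMERS
saidi = sum(c * h for c, h, _ in outages) / N_CUSTOMERS
ens = sum(kw * h for _, h, kw in outages)
```

An optimizer such as CSO would place devices and distributed generation so as to minimize a weighted combination of these indices.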

  11. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    Science.gov (United States)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors is significantly different from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. The response surface and direct Monte Carlo simulation with Latin hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The human error probability (HEP) for the core relocation was estimated from these two competing quantities: phenomenological time and the operators' performance time. The sensitivity of each probability distribution in the human reliability estimation was investigated. In order to quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected due to its capability of incorporating uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach. Both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability and its potential
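The two-competing-variables idea can be sketched directly: the human error probability is the chance that the crew's performance time exceeds the phenomenological time available. The lognormal distributions below are placeholders, not the study's fitted ones.

```python
import math
import random

# HEP = P(performance time > phenomenological time), by Monte Carlo.
random.seed(3)

def phenomenological_time():
    # Time until core damage becomes unavoidable (lognormal placeholder), minutes.
    return random.lognormvariate(math.log(30.0), 0.3)

def performance_time():
    # Time the crew needs to complete the action (lognormal placeholder), minutes.
    return random.lognormvariate(math.log(15.0), 0.5)

TRIALS = 50_000
hep = sum(performance_time() > phenomenological_time() for _ in range(TRIALS)) / TRIALS
```

Swapping in alternative distributions for either time is exactly the sensitivity exercise the abstract describes.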

  12. Method for assessing reliability of a network considering probabilistic safety assessment

    International Nuclear Information System (INIS)

    Cepin, M.

    2005-01-01

    A method for assessing the reliability of a network is developed which uses the features of fault tree analysis. The method is developed in such a way that growth of the network under consideration does not require a significant increase of the model. The method is applied to small examples of networks consisting of a small number of nodes and a small number of connections. The results give the network reliability. They identify equipment which is to be carefully maintained so that the network reliability is not reduced, and equipment which is a candidate for redundancy, as this would improve network reliability significantly. (author)
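The fault-tree flavour of such a method can be illustrated with a rare-event cut-set evaluation; the components, unavailabilities, and cut sets below are invented, not from the paper.

```python
import math

# Component unavailabilities (assumed) and minimal cut sets: the network
# fails if every component in any one cut set is unavailable.
unavailability = {"line_a": 0.01, "line_b": 0.02, "node_c": 0.005}
cut_sets = [("line_a", "line_b"), ("node_c",)]

# Rare-event approximation: top-event probability ~ sum of cut-set products.
q_top = sum(math.prod(unavailability[c] for c in cs) for cs in cut_sets)

# Fussell-Vesely-style importance: share of q_top carried by cut sets
# containing each component; high values flag maintenance priorities and
# redundancy candidates, as in the abstract.
importance = {
    comp: sum(math.prod(unavailability[c] for c in cs)
              for cs in cut_sets if comp in cs) / q_top
    for comp in unavailability
}
```

Here the single-component cut set dominates, so node_c is the natural redundancy candidate.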

  13. Time-dependent reliability analysis of nuclear reactor operators using probabilistic network models

    International Nuclear Information System (INIS)

    Oka, Y.; Miyata, K.; Kodaira, H.; Murakami, S.; Kondo, S.; Togo, Y.

    1987-01-01

    Human factors are very important for the reliability of a nuclear power plant. Human behavior has an essentially time-dependent nature. The details of thinking and decision-making processes are important for detailed analysis of human reliability. They have, however, not been well considered by conventional methods of human reliability analysis. The present paper describes models for time-dependent and detailed human reliability analysis. Recovery by an operator is taken into account, and two-operator models are also presented.

  14. Probabilistic evaluation of design S-N curve and reliability assessment of ASME code-based evaluation

    International Nuclear Information System (INIS)

    Zhao Yongxiang

    1999-01-01

    A probabilistic evaluation approach for the design S-N curve and a reliability assessment approach for the ASME code-based evaluation are presented on the basis of Langer S-N model-based P-S-N curves. The P-S-N curves are estimated by a so-called general maximum likelihood method. This method can be applied to deal with virtual stress amplitude-crack initiation life data, which have the characteristic of double random variables. Investigation of a set of virtual stress amplitude-crack initiation life (S-N) data of a 1Cr18Ni9Ti austenitic stainless steel welded joint reveals that the P-S-N curves give a good prediction of the scatter regularity of the S-N data. The probabilistic evaluation of the design S-N curve with 0.9999 survival probability considers various uncertainties, besides the scatter of the S-N data, to an appropriate extent. The ASME code-based evaluation with a reduction factor of 20 on the mean life is much more conservative than that with a factor of 2 on the stress amplitude. Evaluation of the latter at a virtual stress amplitude of 666.61 MPa is equivalent to a survival probability of 0.999522, and at 2092.18 MPa to a survival probability of 0.9999999995. This means that the evaluation may be non-conservative at low loading levels and, in contrast, too conservative at high loading levels. The cause is that the reduction factors are constants and cannot take into account the general observation that the scatter of the life data increases as the loading level decreases. This indicates that it is necessary to apply the probabilistic approach to the evaluation of the design S-N curve.
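A stripped-down P-S-N design curve can be sketched by assuming log-life is normally distributed at each stress level and shifting the mean curve down to the required survival probability. The coefficients and scatter below are invented, not the 1Cr18Ni9Ti weld data.

```python
import math
from statistics import NormalDist

# Mean curve: log10(N) = A + B * log10(S); design curve at survival
# probability p shifts log-life by z_p standard deviations.
A, B = 12.0, -3.0     # mean-curve coefficients (assumed), S in MPa
SIGMA_LOG_N = 0.25    # scatter of log10(life) (assumed)

def life_at(stress_mpa, survival_prob):
    z = NormalDist().inv_cdf(1.0 - survival_prob)  # negative for p > 0.5
    return 10 ** (A + B * math.log10(stress_mpa) + z * SIGMA_LOG_N)

mean_life = life_at(400.0, 0.5)        # median curve
design_life = life_at(400.0, 0.9999)   # 0.9999 survival design curve
```

Because the shift scales with the scatter, a stress-dependent `SIGMA_LOG_N` would reproduce the abstract's point that constant reduction factors cannot track scatter that grows as the loading level decreases.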

  15. A study on the dependency evaluation for multiple human actions in human reliability analysis of probabilistic safety assessment

    International Nuclear Information System (INIS)

    Kang, D. I.; Yang, J. E.; Jung, W. D.; Sung, T. Y.; Park, J. H.; Lee, Y. H.; Hwang, M. J.; Kim, K. Y.; Jin, Y. H.; Kim, S. C.

    1997-02-01

    This report describes the results of a study on methods for dependency evaluation and modeling, and on the limiting value of human error probability (HEP), for multiple human actions in accident sequences of a probabilistic safety assessment (PSA). THERP and Parry's method, which have generally been used for dependency evaluation in human reliability analysis (HRA), are introduced and their limitations are discussed. A new dependency evaluation method for HRA is established to make up for the weak points of THERP and Parry's method. The limiting value of HEP is also established, based on a review of several HRA-related documents. The report describes the definition, types, evaluation method, and an evaluation example of dependency to aid the reader's understanding. It is expected that these results will give guidance to HRA analysts in the dependency evaluation of multiple human actions and enable PSA analysts to understand HRA in detail. (author). 23 refs., 3 tabs., 2 figs

  16. Kuhn-Tucker optimization based reliability analysis for probabilistic finite elements

    Science.gov (United States)

    Liu, W. K.; Besterfield, G.; Lawrence, M.; Belytschko, T.

    1988-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for fracture mechanics is considered. Reliability analysis with specific application to fracture mechanics is presented, and computational procedures are discussed. Explicit expressions for the optimization procedure with regard to fracture mechanics are given. The results show that the PFEM is a very powerful tool for determining second-moment statistics. The method can determine the probability of failure or fracture subject to randomness in load, material properties, and crack length, orientation, and location.

  17. Reliability data update using condition monitoring and prognostics in probabilistic safety assessment

    Directory of Open Access Journals (Sweden)

    Hyeonmin Kim

    2015-03-01

    Probabilistic safety assessment (PSA) has had a significant role in quantitative decision-making by finding design and operational vulnerabilities and evaluating cost-benefit in improving such weak points. In particular, it has been widely used as the core methodology for risk-informed applications (RIAs). Even though the nature of PSA seeks realistic results, there are still “conservative” aspects. One of the sources of the conservatism is the assumptions of safety analysis and the estimation of failure frequency. Surveillance, diagnosis, and prognosis (SDP), utilizing massive databases and information technology, is worth highlighting in terms of its capability for alleviating the conservatism in conventional PSA. This article provides enabling techniques to solidify a method to provide time- and condition-dependent risks by integrating a conventional PSA model with condition monitoring and prognostics techniques. We will discuss how to integrate the results with the frequency of initiating events (IEs) and the probability of basic events (BEs). Two illustrative examples will be introduced: (1) how the failure probability of a passive system can be evaluated under different plant conditions and (2) how the IE frequency for a steam generator tube rupture (SGTR) can be updated in terms of operating time. We expect that the proposed model can serve as an annunciator to show the variation of core damage frequency (CDF) depending on operational conditions.

  18. Reliability data update using condition monitoring and prognostics in probabilistic safety assessment

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyeon Min; Lee, Sang Hwan; Park, Jun Seok; Kim, Hyung Dae; Chang, Yoon Suk; Heo, Gyun Young [Dept. of Nuclear Engineering, Kyung Hee University, Yongin (Korea, Republic of)

    2015-03-15

    Probabilistic safety assessment (PSA) has had a significant role in quantitative decision making by finding design and operational vulnerabilities and evaluating cost-benefit in improving such weak points. In particular, it has been widely used as the core methodology for risk-informed applications (RIAs). Even though the nature of PSA seeks realistic results, there are still 'conservative' aspects. One of the sources for the conservatism is the assumptions of safety analysis and the estimation of failure frequency. Surveillance, diagnosis, and prognosis (SDP), utilizing massive databases and information technology, is worth highlighting in terms of its capability for alleviating the conservatism in conventional PSA. This article provides enabling techniques to solidify a method to provide time and condition-dependent risks by integrating a conventional PSA model with condition monitoring and prognostics techniques. We will discuss how to integrate the results with frequency of initiating events (IEs) and probability of basic events (BEs). Two illustrative examples will be introduced: (1) how the failure probability of a passive system can be evaluated under different plant conditions and (2) how the IE frequency for a steam generator tube rupture (SGTR) can be updated in terms of operating time. We expect that the proposed model can take a role of annunciator to show the variation of core damage frequency (CDF) depending on operational conditions.

  19. Application of probabilistic fracture mechanics to the reliability analysis of pressure-bearing reactor components

    International Nuclear Information System (INIS)

    Schmitt, W.; Roehrich, E.; Wellein, R.

    1977-01-01

    Since no failures in the primary reactor components have been reported so far, it is impossible to estimate the failure probability of those components just by means of statistics. Therefore, the way of probabilistic fracture mechanics has been proposed. Here the material properties, the loads, and the crack distributions are treated as statistical variables with certain distributions. From the distributions of these data, probability density functions can be established for the loading of a component (e.g. the stress intensity factor) as well as for the resistance of this component (e.g. the fracture toughness). From these functions the failure probability for a given failure mode (e.g. brittle fracture) is easily obtained, either by the application of direct integration procedures, which are briefly reviewed here, or by the use of Monte Carlo techniques. The most important part of the concept is the collection of a sufficiently large amount of raw data from different sources (departments within the company or external). These data need to be processed so that they can be transformed into probability density functions. The method of data collection and processing, in terms of histograms, plots of probability density functions, etc., is described. The choice of the various types of distribution functions is discussed. As an example, the derivation of the probability density function for cracks of a given size in a component is presented. (Auth.)
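The load/resistance comparison described above can be sketched with a minimal Monte Carlo estimate of the failure probability P(K_applied > K_toughness), treating the applied stress intensity factor and the fracture toughness as lognormal random variables. All distribution parameters below are illustrative, not values from the paper.

```python
# Minimal Monte Carlo sketch of a probabilistic fracture mechanics check:
# brittle fracture occurs when the applied stress intensity factor exceeds
# the fracture toughness; both are sampled from (illustrative) lognormals.
import math
import random

random.seed(0)  # reproducible illustration

def failure_probability(n_samples=200_000):
    failures = 0
    for _ in range(n_samples):
        k_applied = random.lognormvariate(math.log(40.0), 0.25)    # MPa*sqrt(m)
        k_toughness = random.lognormvariate(math.log(120.0), 0.15)  # MPa*sqrt(m)
        if k_applied > k_toughness:
            failures += 1
    return failures / n_samples

pf = failure_probability()
print(f"estimated failure probability: {pf:.2e}")
```

For this simple case the same probability could be obtained by direct integration of the two densities; Monte Carlo becomes the practical choice once crack size, orientation, and location distributions enter the model.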

  20. On the applicability of probabilistic analyses to assess the structural reliability of materials and components for solid-oxide fuel cells

    Energy Technology Data Exchange (ETDEWEB)

    Lara-Curzio, Edgar [ORNL; Radovic, Miladin [Texas A& M University; Luttrell, Claire R [ORNL

    2016-01-01

    The applicability of probabilistic analyses to assess the structural reliability of materials and components for solid-oxide fuel cells (SOFC) is investigated by measuring the failure rate of Ni-YSZ when subjected to a temperature gradient and comparing it with that predicted using the Ceramics Analysis and Reliability Evaluation of Structures (CARES) code. The use of a temperature gradient to induce stresses was chosen because temperature gradients resulting from gas flow patterns generate stresses during SOFC operation that are likely to control the structural reliability of cell components. The magnitude of the predicted failure rate was found to be comparable to that determined experimentally, which suggests that such probabilistic analyses are appropriate for predicting the structural reliability of materials and components for SOFCs. Considerations for performing more comprehensive studies are discussed.

  1. Differential reliability : probabilistic engineering applied to wood members in bending-tension

    Science.gov (United States)

    Stanley K. Suddarth; Frank E. Woeste; William L. Galligan

    1978-01-01

    Reliability analysis is a mathematical technique for appraising the design and materials of engineered structures to provide a quantitative estimate of probability of failure. Two or more cases which are similar in all respects but one may be analyzed by this method; the contrast between the probabilities of failure for these cases allows strong analytical focus on the...

  2. Application of probabilistic fracture mechanics to the reliability analysis of pressure-bearing reactor components

    International Nuclear Information System (INIS)

    Schmitt, W.; Roehrich, E.; Wellein, R.

    1977-01-01

    Since no failures in the primary reactor components have been reported so far, it is impossible to estimate the failure probability of those components just by means of statistics. Therefore, the way of probabilistic fracture mechanics has been proposed. Here the material properties, the loads, and the crack distributions are treated as statistical variables with certain distributions. From the distributions of these data, probability density functions can be established for the loading of a component as well as for the resistance of this component. From these functions the failure probability for a given failure mode is easily obtained, either by the application of direct integration procedures, which are briefly reviewed here, or by the use of Monte Carlo techniques. The most important part of the concept is the collection of a sufficiently large amount of raw data from different sources. These data need to be processed so that they can be transformed into probability density functions. The method of data collection and processing, in terms of histograms, plots of probability density functions, etc., is described. The choice of the various types of distribution functions is discussed. As an example, the derivation of the probability density function for cracks of a given size in a component is presented. Here the raw data, i.e. the ultrasonic results, are transformed into real crack sizes by means of a conservative conversion rule. The true distribution of the indications is obtained by taking into account a detection probability function. The final probability density function is influenced by the fact that indications exceeding certain values need to be re

  3. Investigation on the reliability of expansion joint for piping with probabilistic method

    International Nuclear Information System (INIS)

    Ishii, Y.; Kambe, M.

    1980-01-01

    The reduction of plant size is necessitated as one of the major targets in LMFBR design. Usually, a piping work system is extensively used to absorb thermal expansion between two components. In addition, expansion joints for piping have lately become attractive for the same purpose. This paper describes the significance of an expansion joint with multiple boundaries, and the breakdown probability of the expansion joint assembly, and in part of the bellows, by introducing several hypothetical conditions in connection with piping. The importance of in-service inspection (ISI) for expansion joints is also discussed, using a comparative table and reliability probabilities ranging from partly broken to full penetration. In conclusion, an expansion joint with ISI should be manufactured with excellent reliability in order to cope with the piping work system; several conditions for the practical application to piping systems are suggested. (author)

  4. Investigation on the reliability of expansion joint for piping with probabilistic method

    Energy Technology Data Exchange (ETDEWEB)

    Ishii, Y; Kambe, M

    1980-02-01

    The reduction of plant size is necessitated as one of the major targets in LMFBR design. Usually, a piping work system is extensively used to absorb thermal expansion between two components. In addition, expansion joints for piping have lately become attractive for the same purpose. This paper describes the significance of an expansion joint with multiple boundaries, and the breakdown probability of the expansion joint assembly, and in part of the bellows, by introducing several hypothetical conditions in connection with piping. The importance of in-service inspection (ISI) for expansion joints is also discussed, using a comparative table and reliability probabilities ranging from partly broken to full penetration. In conclusion, an expansion joint with ISI should be manufactured with excellent reliability in order to cope with the piping work system; several conditions for the practical application to piping systems are suggested. (author)

  5. Investigation on the reliability of expansion joint for piping with probabilistic method

    International Nuclear Information System (INIS)

    Ishii, Yoichiro; Kambe, Mitsuru.

    1979-11-01

    The reduction of plant size is necessitated as one of the major targets in LMFBR design. Usually, a piping work system is extensively used to absorb thermal expansion between two components. In addition, expansion joints for piping have lately become attractive for the same purpose. This paper describes the significance of an expansion joint with multiple boundaries, and the breakdown probability of the expansion joint assembly, and in part of the bellows, by introducing several hypothetical conditions in connection with piping. The importance of in-service inspection (ISI) for expansion joints is also discussed, using a comparative table and reliability probabilities ranging from partly broken to full penetration. In conclusion, an expansion joint with ISI should be manufactured with excellent reliability in order to cope with the piping work system, and several conditions for the practical application to piping systems are suggested. (author)

  6. Development of Probabilistic Reliability Models of Photovoltaic System Topologies for System Adequacy Evaluation

    OpenAIRE

    Ahmad Alferidi; Rajesh Karki

    2017-01-01

    The contribution of solar power in electric power systems has been increasing rapidly due to its environmentally friendly nature. Photovoltaic (PV) systems contain solar cell panels, power electronic converters, high power switching and often transformers. These components collectively play an important role in shaping the reliability of PV systems. Moreover, the power output of PV systems is variable, so it cannot be controlled as easily as conventional generation due to the unpredictable na...

  7. Pitting corrosion and structural reliability of corroding RC structures: Experimental data and probabilistic analysis

    International Nuclear Information System (INIS)

    Stewart, Mark G.; Al-Harthy, Ali

    2008-01-01

    A stochastic analysis is developed to assess the temporal and spatial variability of pitting corrosion on the reliability of corroding reinforced concrete (RC) structures. The structure considered herein is a singly reinforced RC beam with Y16 or Y27 reinforcing bars. Experimental data obtained from corrosion tests are used to characterise the probability distribution of pit depth. The RC beam is discretised into a series of small elements and maximum pit depths are generated for each reinforcing steel bar in each element. The loss of cross-sectional area, reduction in yield strength and reduction in flexural resistance are then inferred. The analysis considers various member spans, loading ratios, bar diameters and numbers of bars in a given cross-section, and moment diagrams. It was found that the maximum corrosion loss in a reinforcing bar conditional on beam collapse was no more than 16%. The probabilities of failure considering spatial variability of pitting corrosion were up to 200% higher than probabilities of failure obtained from a non-spatial analysis after 50 years of corrosion. This shows the importance of considering spatial variability in a structural reliability analysis for deteriorating structures, particularly for corroding RC beams in flexure
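The spatial discretisation described above (maximum pit depths generated per element, failure when any element's resistance is exhausted) can be sketched as follows. The Gumbel parameters and critical pit depth are illustrative choices, not the paper's experimental values.

```python
# Sketch of why spatial variability raises the failure probability: each
# element of the discretised beam draws its own maximum pit depth from an
# extreme-value (Gumbel) distribution, and the beam "fails" when any
# element's pit depth exceeds a critical value. Parameters are illustrative.
import math
import random

random.seed(1)  # reproducible illustration

def gumbel(mu, beta):
    """Inverse-transform sample from a Gumbel distribution (pit depth, mm)."""
    u = random.random()
    return mu - beta * math.log(-math.log(u))

def prob_failure(n_elements, d_crit, mu=1.5, beta=0.5, trials=20_000):
    fails = 0
    for _ in range(trials):
        worst = max(gumbel(mu, beta) for _ in range(n_elements))
        if worst > d_crit:
            fails += 1
    return fails / trials

# One element (non-spatial analysis) vs. fifty elements (spatial analysis):
p_single = prob_failure(1, d_crit=4.0)
p_spatial = prob_failure(50, d_crit=4.0)
print(f"non-spatial: {p_single:.4f}, spatial: {p_spatial:.4f}")
```

Because the spatial model takes the worst pit over many elements, its failure probability is necessarily at least as large as the single-element estimate, which is the qualitative effect the abstract reports.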

  8. Reliable Biomass Supply Chain Design under Feedstock Seasonality and Probabilistic Facility Disruptions

    Directory of Open Access Journals (Sweden)

    Zhixue Liu

    2017-11-01

    While biomass has been recognized as an important renewable energy source with a range of positive impacts on the economy, environment, and society, the existence of feedstock seasonality and the risk of service disruptions at collection facilities potentially compromise the efficiency and reliability of the energy supply system. In this paper, we consider reliable supply chain design for biomass collection against feedstock seasonality and time-varying disruption risks. We optimize facility location, inventory, biomass quantity, and shipment decisions in a multi-period planning horizon setting. A real-world case in Hubei, China is studied to offer managerial insights. Our computational results show that: (1) the disruption risk significantly affects both the optimal facility locations and the supply chain cost; (2) no matter how the failure probability changes, setting up backup facilities can significantly decrease the total cost; and (3) the feedstock seasonality does not affect the locations of the collection facilities, but it affects their allocations and brings higher inventory cost for the biomass supply chain.

  9. Living PRAs [probabilistic risk analysis] made easier with IRRAS [Integrated Reliability and Risk Analysis System

    International Nuclear Information System (INIS)

    Russell, K.D.; Sattison, M.B.; Rasmuson, D.M.

    1989-01-01

    The Integrated Reliability and Risk Analysis System (IRRAS) is an integrated PRA software tool that gives the user the ability to create and analyze fault trees and accident sequences using an IBM-compatible microcomputer. This program provides functions that range from graphical fault tree and event tree construction to cut set generation and quantification. IRRAS contains all the capabilities and functions required to create, modify, reduce, and analyze event tree and fault tree models used in the analysis of complex systems and processes. IRRAS uses advanced graphic and analytical techniques to achieve the greatest possible realization of the potential of the microcomputer. When the needs of the user exceed this potential, IRRAS can call upon the power of the mainframe computer. The role of the Idaho National Engineering Laboratory in the IRRAS program is that of software developer and interface to the user community. Version 1.0 of the IRRAS program was released in February 1987 to prove the concept of performing this kind of analysis on microcomputers. This version contained many of the basic features needed for fault tree analysis and was received very well by the PRA community. Since the release of Version 1.0, many user comments and enhancements have been incorporated into the program, providing a much more powerful and user-friendly system. This version is designated ''IRRAS 2.0''. Version 3.0 will contain all of the features required for efficient event tree and fault tree construction and analysis. 5 refs., 26 figs

  10. Human reliability analysis in probabilistic safety assessment for nuclear power plants. A Safety Practice. A publication within the NUSS programme

    International Nuclear Information System (INIS)

    1995-01-01

    Probabilistic safety assessment (PSA) is playing an increasingly important role in the safe operation of nuclear power plants throughout the world. In order to establish a consistent framework for conducting PSA studies, for promoting technology transfer of the state of the art, and for encouraging uniformity in the way PSA is carried out, the IAEA is preparing a set of publications which gives guidance on various aspects of PSA. This document presents a practical approach for incorporating human reliability analysis (HRA) into PSA. It describes the steps needed and the documentation that should be provided both to support the PSA itself and to ensure effective communication of important information arising from the studies. It also describes a framework for analysing those human actions which could affect safety and for relating such human influences to specific parts of a PSA. This Safety Practice also addresses the limitations of PSA in taking account of human factors in relation to safety and risk. Refs, figs and tabs

  11. MERMOS: an EDF project to update the PHRA methodology (Probabilistic Human Reliability Assessment)

    International Nuclear Information System (INIS)

    Le Bot, Pierre; Desmares, E.; Bieder, C.; Cara, F.; Bonnet, J.L.

    1998-01-01

    To account for successive evolutions of nuclear power plant emergency operation, EDF has had to revise its PHRA methodologies several times. This was particularly the case when event-based procedures were abandoned in favor of state-based procedures. A more recent update was necessary to obtain information on the safety of the new N4 unit type. The extent of the changes in operation for this unit type (especially the computerization of both the control room and the procedures) required a deep rethinking of existing PHRA methods. It also seemed necessary to base the design of the methods, more explicitly than in the past, on concepts developed in the human sciences. These are the main ambitions of the project named MERMOS, which started in 1996. The design effort for a new PHRA method is carried out by a multidisciplinary team involving reliability engineers, psychologists, and ergonomists. An independent expert is in charge of the project review. The method, considered as the analysis tool dedicated to PHRA analysts, is one of the two outcomes of the project. The other is the formalization of the design approach for the method, aimed at a good appropriation of the method by the analysts. EDF's specificity in the field of PHRA, and more generally PSA, is that the method is used not by the designers but by analysts. Keeping track of the approach is also meant to guarantee its transposition to other EDF unit types such as the 900 or 1300 MW PWR. The PHRA method is based upon a model of emergency operation called the 'SAD model'. The formalization effort of the design approach led to clarifying and justifying it. The model describes and explains both the functioning and dysfunctioning of emergency operation in PSA scenarios. It combines a systemic approach with what is called distributed cognition in the cognitive sciences. Collective aspects are considered an important feature in explaining the phenomena under study in operational dysfunctioning. The PHRA method is to be operational early next year (1998)

  12. A fuzzy-based reliability approach to evaluate basic events of fault tree analysis for nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Purba, Julwan Hendry

    2014-01-01

    Highlights: • We propose a fuzzy-based reliability approach to evaluate basic event reliabilities. • It implements the concepts of failure possibilities and fuzzy sets. • Experts evaluate basic event failure possibilities using qualitative words. • Triangular fuzzy numbers mathematically represent qualitative failure possibilities. • It is a very good alternative to the conventional reliability approach. - Abstract: Fault tree analysis has been widely utilized as a tool for nuclear power plant probabilistic safety assessment. This analysis can be completed only if all basic events of the system fault tree have quantitative failure rates or failure probabilities. However, it is difficult to obtain such failure data due to insufficient data, changing environments, or new components. This study proposes a fuzzy-based reliability approach to evaluate basic events of system fault trees for which precise probability distributions of time to failure are not available. It applies the concept of failure possibilities to qualitatively evaluate basic events and the concept of fuzzy sets to quantitatively represent the corresponding failure possibilities. To demonstrate the feasibility and effectiveness of the proposed approach, actual basic event failure probabilities collected from operational experience of the Davis–Besse design of the Babcock and Wilcox reactor protection system fault tree are used to benchmark the failure probabilities generated by the proposed approach. The results confirm that the proposed fuzzy-based reliability approach is a suitable alternative to the conventional probabilistic reliability approach when basic events do not have the quantitative historical failure data needed to determine their reliability characteristics. Hence, it overcomes the limitation of conventional fault tree analysis for nuclear power plant probabilistic safety assessment
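The pipeline described above can be sketched in a few lines: a qualitative expert judgment is mapped to a triangular fuzzy number on a [0, 1] failure-possibility scale, defuzzified to a fuzzy possibility score, and converted to a failure probability with Onisawa's logarithmic transformation. The membership functions below and the centroid defuzzification are illustrative choices, not the paper's calibrated set.

```python
# Sketch of a fuzzy-based basic event evaluation: qualitative word ->
# triangular fuzzy number -> defuzzified possibility score -> probability
# via Onisawa's transformation. TERMS values are illustrative assumptions.
TERMS = {  # triangular fuzzy numbers (a, b, c) on a [0, 1] possibility scale
    "very low":  (0.0, 0.10, 0.2),
    "low":       (0.1, 0.25, 0.4),
    "moderate":  (0.3, 0.50, 0.7),
    "high":      (0.6, 0.75, 0.9),
    "very high": (0.8, 0.90, 1.0),
}

def centroid(tfn):
    a, b, c = tfn
    return (a + b + c) / 3.0  # centroid of a triangular membership function

def failure_probability(term):
    fps = centroid(TERMS[term])  # fuzzy possibility score in (0, 1]
    if fps == 0:
        return 0.0
    k = ((1.0 - fps) / fps) ** (1.0 / 3.0) * 2.301  # Onisawa's transformation
    return 10.0 ** (-k)

for term in TERMS:
    print(f"{term:9s} -> {failure_probability(term):.2e}")
```

The transformation is monotone, so a stronger qualitative judgment always maps to a larger failure probability, which is the minimal consistency requirement for benchmarking against historical failure data.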

  13. Personal Publications Lists Serve as a Reliable Calibration Parameter to Compare Coverage in Academic Citation Databases with Scientific Social Media

    Directory of Open Access Journals (Sweden)

    Emma Hughes

    2017-03-01

    A Review of: Hilbert, F., Barth, J., Gremm, J., Gros, D., Haiter, J., Henkel, M., Reinhardt, W., & Stock, W.G. (2015). Coverage of academic citation databases compared with coverage of scientific social media: personal publication lists as calibration parameters. Online Information Review 39(2): 255-264. http://dx.doi.org/10.1108/OIR-07-2014-0159 Objective – The purpose of this study was to explore coverage rates of information science publications in academic citation databases and scientific social media using a new method of personal publication lists as a calibration parameter. The research questions were: How many publications are covered in different databases, which database has the best coverage, what institutions are represented, and what role does the language of the publication play? Design – Bibliometric analysis. Setting – Academic citation databases (Web of Science, Scopus, Google Scholar) and scientific social media (Mendeley, CiteULike, BibSonomy). Subjects – 1,017 library and information science publications produced by 76 information scientists at 5 German-speaking universities in Germany and Austria. Methods – Only documents published between 1 January 2003 and 31 December 2012 were included. In that time the 76 information scientists had produced 1,017 documents. The information scientists confirmed that their publication lists were complete, and these served as the calibration parameter for the study. The citations from the publication lists were searched in three academic databases: Google Scholar, Web of Science (WoS), and Scopus; as well as three social media citation sites: Mendeley, CiteULike, and BibSonomy; and the results were compared. The publications were searched for by author name and words from the title. Main results – None of the databases investigated had 100% coverage. In the academic databases, Google Scholar had the highest amount of coverage with an average of 63%, Scopus an average of 31%, and

  14. Probabilistic modelling of overflow, surcharge and flooding in urban drainage using the first-order reliability method and parameterization of local rain series

    DEFF Research Database (Denmark)

    Thorndahl, Søren; Willems, Patrick

    2007-01-01

    Failure of urban drainage systems may occur due to surcharge or flooding at specific manholes in the system, or due to overflows from combined sewer systems to receiving waters. To quantify the probability or return period of failure, standard approaches make use of the simulation of design storms or long historical rainfall series in a hydrodynamic model of the urban drainage system. In this paper, an alternative probabilistic method is investigated: the First Order Reliability Method (FORM). To apply this method, a long rainfall time series was divided into rain storms (rain events), and each rain
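FORM is iterative for general nonlinear limit states, but for the simplest case of a linear limit state g = R - S with independent normal resistance R and load S, the Hasofer-Lind reliability index has a closed form, which the following sketch illustrates. All parameters are invented for illustration, not taken from the study.

```python
# Minimal FORM sketch for a linear limit state g = R - S with independent
# normal variables: the reliability index beta and the failure probability
# Pf = Phi(-beta) follow in closed form. Parameters are illustrative.
import math
from statistics import NormalDist

def form_linear(mu_r, sig_r, mu_s, sig_s):
    """Hasofer-Lind reliability index and failure probability for g = R - S."""
    beta = (mu_r - mu_s) / math.hypot(sig_r, sig_s)  # reliability index
    pf = NormalDist().cdf(-beta)                     # failure probability
    return beta, pf

# Hypothetical drainage capacity (R) vs. storm load (S), same units:
beta, pf = form_linear(mu_r=100.0, sig_r=10.0, mu_s=60.0, sig_s=15.0)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```

With an annual event rate, such a per-event failure probability converts to a return period, which is the quantity of interest for surcharge and overflow statistics.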

  15. Uses of human reliability analysis probabilistic risk assessment results to resolve personnel performance issues that could affect safety

    International Nuclear Information System (INIS)

    O'Brien, J.N.; Spettell, C.M.

    1985-10-01

    This report is the first in a series which documents research aimed at improving the usefulness of Probabilistic Risk Assessment (PRA) results in addressing human risk issues. This first report describes the results of an assessment of how well currently available PRA data addresses human risk issues of current concern to NRC. Findings indicate that PRA data could be far more useful in addressing human risk issues with modification of the development process and documentation structure of PRAs. In addition, information from non-PRA sources could be integrated with PRA data to address many other issues. 12 tabs

  16. Possible use and limits of probabilistic models in connection with the reliability of large-scale plant

    International Nuclear Information System (INIS)

    Schmitt, W.; Baudendistel, E.; Ockewitz, A.

    1987-03-01

    The OCA-P program enables deterministic and probabilistic safety analysis of reactor pressure vessels. The special routines in OCA-P for calculating stress intensity factors were replaced by in-house developments. These methods permit the treatment of general flaw and vessel geometries. The plotting in OCA-P was converted to the CALCOMP format. Verification was done by calculating examples with the original version and fictitious examples of German plants (HDR). (DG) With 13 refs., 3 tabs., 46 figs [de

  17. Collection and classification of human reliability data for use in probabilistic safety assessments. Final report of a co-ordinated research programme 1995-1998

    International Nuclear Information System (INIS)

    1998-10-01

    One of the most important lessons from abnormal events in NPPs is that they often result from incorrect human action. Awareness of the importance of human factors and human reliability has increased significantly over the last 10-15 years, primarily owing to the fact that some major incidents (nuclear or non-nuclear) have had significant human error contributions. Each of these incidents revealed different types of human errors, some of which were not generally recognized prior to the incident. The analysis of these events led to wide recognition of the fact that more information about human actions and errors is needed to improve the safety and operation of nuclear power plants. At the same time, the need for proper human reliability data was recognized in view of probabilistic safety assessment (PSA). No PSA study can be regarded as complete and accurate without adequate incorporation of human reliability analysis (HRA). In order to support the incorporation of human reliability data into PSA, the IAEA established a coordinated research programme with the objective of developing a common database structure for human errors that might make important contributions to risk in different types of reactors. This report is a product of four years of coordinated research. It describes the data collection and classification schemes currently in use in Member States, as well as an outlook into the future, discussing what types of data might be needed to support the new, improved HRA methods currently under development

  18. Review of cause-based decision tree approach for the development of domestic standard human reliability analysis procedure in low power/shutdown operation probabilistic safety assessment

    International Nuclear Information System (INIS)

    Kang, D. I.; Jung, W. D.

    2003-01-01

    We review the Cause-Based Decision Tree (CBDT) approach to decide whether or not to incorporate it in the development of a domestic standard Human Reliability Analysis (HRA) procedure for low power/shutdown operation Probabilistic Safety Assessment (PSA). In this paper, we introduce the cause-based decision tree approach, quantify human errors using it, and identify its merits and demerits in comparison with the previously used THERP. The review results show that it is difficult to incorporate the CBDT method in the development of a domestic standard HRA procedure for low power/shutdown PSA because the CBDT method, like THERP, requires the subjective judgment of the HRA analyst. However, it is expected that incorporating the CBDT method into the development of the domestic standard HRA procedure, if only for the comparison of quantitative HRA results, will relieve the burden of developing a detailed HRA procedure and will help maintain consistent quantitative HRA results.

  19. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  20. Reliability

    OpenAIRE

    Condon, David; Revelle, William

    2017-01-01

    Separating the signal in a test from the irrelevant noise is a challenge for all measurement. Low test reliability limits test validity, attenuates important relationships, and can lead to regression artifacts. Multiple approaches to the assessment and improvement of reliability are discussed. The advantages and disadvantages of several different approaches to reliability are considered. Practical advice on how to assess reliability using open source software is provided.

  1. Future developments of probabilistic structural reliability to meet the needs of risk analyses of nuclear power plants

    International Nuclear Information System (INIS)

    Schnurer, H.

    1980-01-01

    The methods of structural reliability, knowing their benefits and their limitations, will offer an increasingly important tool in order to make future quality decisions for nuclear safety more rational, objective and balanced. This might make them suitable for licensing and approval decisions of components and structures, offering an alternative to the presently used deterministic practice. (orig./RW)

  2. Interaction of CREDO [Centralized Reliability Data Organization] with the EBR-II [Experimental Breeder Reactor II] PRA [probabilistic risk assessment] development

    International Nuclear Information System (INIS)

    Smith, M.S.; Ragland, W.A.

    1989-01-01

    The National Academy of Sciences review of US Department of Energy (DOE) class 1 reactors recommended that the Experimental Breeder Reactor II (EBR-II), operated by Argonne National Laboratory (ANL), develop a level 1 probabilistic risk assessment (PRA) and make provisions for level 2 and level 3 PRAs based on the results of the level 1 PRA. The PRA analysis group at ANL will utilize the Centralized Reliability Data Organization (CREDO) at Oak Ridge National Laboratory to support the PRA data needs. CREDO contains many years of empirical liquid-metal reactor component data from EBR-II. CREDO is a mutual data- and cost-sharing system sponsored by DOE and the Power Reactor and Nuclear Fuels Development Corporation of Japan. CREDO is a component-based data system; data are collected on components that are liquid-metal specific, associated with a liquid-metal environment, contained in systems that interface with liquid-metal environments, or are safety related, for use in reliability/availability/maintainability (RAM) analyses of advanced reactors. The links between the EBR-II PRA development effort and the CREDO data collection at EBR-II extend beyond the sharing of data. The PRA provides a measure of the relative contribution to risk of the various components. This information can be used to prioritize future CREDO data collection activities at EBR-II and other sites

  3. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and lifing (life-prediction) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.

  4. Probabilistic reliability analyses to detect weak points in secondary-side residual heat removal systems of KWU PWR plants

    International Nuclear Information System (INIS)

    Schilling, R.

    1984-01-01

    Requirements made by Federal German licensing authorities called for the analysis of the secondary-side residual heat removal systems of new PWR plants with regard to availability, possible weak points and the balanced nature of the overall system for different incident sequences. Following a description of the generic concept and the process and safety-related systems for steam generator feed and main steam discharge, the reliability of the latter is analyzed for the small break LOCA and emergency power mode incidents, weak points in the process systems are identified, remedial measures relating to system design and testing strategy are presented, and their contribution to improving system availability is quantified. A comparison with the results of the German Risk Study on Nuclear Power Plants (GRS) shows a distinct reduction in core meltdown frequency. (orig.)

  5. Probabilistic physics-of-failure models for component reliabilities using Monte Carlo simulation and Weibull analysis: a parametric study

    International Nuclear Information System (INIS)

    Hall, P.L.; Strutt, J.E.

    2003-01-01

    In reliability engineering, component failures are generally classified in one of three ways: (1) early life failures; (2) failures having random onset times; and (3) late life or 'wear out' failures. When the time-distribution of failures of a population of components is analysed in terms of a Weibull distribution, these failure types may be associated with shape parameters β having values β < 1, β ≈ 1 and β > 1 respectively. Early life failures are frequently attributed to poor design (e.g. poor materials selection) or problems associated with manufacturing or assembly processes. We describe a methodology for the implementation of physics-of-failure models of component lifetimes in the presence of parameter and model uncertainties. This treats uncertain parameters as random variables described by some appropriate statistical distribution, which may be sampled using Monte Carlo methods. The number of simulations required depends upon the desired accuracy of the predicted lifetime. Provided that the number of sampled variables is relatively small, an accuracy of 1-2% can be obtained using typically 1000 simulations. The resulting collection of times-to-failure is then sorted into ascending order and fitted to a Weibull distribution to obtain a shape factor β and a characteristic life-time η. Examples are given of the results obtained using three different models: (1) the Eyring-Peck (EP) model for corrosion of printed circuit boards; (2) a power-law corrosion growth (PCG) model which represents the progressive deterioration of oil and gas pipelines; and (3) a random shock-loading model of mechanical failure. It is shown that for any specific model the values of the Weibull shape parameters obtained may be strongly dependent on the degree of uncertainty of the underlying input parameters. Both the EP and PCG models can yield a wide range of values of β, from β > 1, characteristic of wear-out behaviour, to β < 1, characteristic of early-life failure, depending on the degree of uncertainty in the input parameters.
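The sample-sort-and-refit procedure described above can be sketched in a few lines. The corrosion model, distributions and parameter values below are illustrative assumptions, not those of the cited study:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n_sim = 1000  # ~1-2% accuracy per the abstract's guidance

# Hypothetical power-law corrosion growth model (names and numbers assumed):
# metal loss grows as k * t**n; failure occurs when the loss reaches the
# wall thickness w0, so the time to failure is t_f = (w0 / k) ** (1 / n).
w0 = rng.normal(10.0, 0.5, n_sim)            # initial wall thickness, mm
k = rng.lognormal(np.log(0.3), 0.2, n_sim)   # corrosion rate coefficient
n = 0.8                                      # growth exponent, fixed here

t_fail = (w0 / k) ** (1.0 / n)               # Monte Carlo times-to-failure

# Fit a two-parameter Weibull (location fixed at zero) to the sampled
# lifetimes to recover a shape factor beta and characteristic life eta.
beta, _, eta = stats.weibull_min.fit(t_fail, floc=0)
```

With only mild input uncertainty, the fitted shape factor comes out well above 1 (wear-out-like); widening the distributions of `w0` and `k` drives the fitted β down, illustrating the paper's point that the apparent failure regime depends on input uncertainty.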

  6. Probabilistic modelling of overflow, surcharge and flooding in urban drainage using the first-order reliability method and parameterization of local rain series.

    Science.gov (United States)

    Thorndahl, S; Willems, P

    2008-01-01

    Failure of urban drainage systems may occur due to surcharge or flooding at specific manholes in the system, or due to overflows from combined sewer systems to receiving waters. To quantify the probability or return period of failure, standard approaches make use of the simulation of design storms or long historical rainfall series in a hydrodynamic model of the urban drainage system. In this paper, an alternative probabilistic method is investigated: the first-order reliability method (FORM). To apply this method, a long rainfall time series was divided into rainstorms (rain events), and each rainstorm was conceptualized as a synthetic rainfall hyetograph of Gaussian shape with the parameters rainstorm depth, duration and peak intensity. Probability distributions were calibrated for these three parameters and used as the basis of the failure probability estimation, together with a hydrodynamic simulation model to determine the failure conditions for each set of parameters. The method takes into account the uncertainties involved in the rainstorm parameterization. Comparison is made between the failure probability results of the FORM method, the standard method using long-term simulations, and alternative methods based on random sampling (Monte Carlo direct sampling and importance sampling). It is concluded that, without crucial influence on the modelling accuracy, FORM is very applicable as an alternative to traditional long-term simulations of urban drainage systems.
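The core FORM computation, finding the design point as the nearest point to the origin on the failure surface in standard-normal space, can be illustrated with a deliberately simplified limit state. The variable names, distributions, capacity value and demand formula below are assumptions for illustration only, not the paper's hydrodynamic model:

```python
import numpy as np
from scipy import optimize, stats

# Map standard normals u back to physical variables: x = mu + sigma * u.
mu = np.array([40.0, 20.0])     # mean rainstorm depth (mm), mean peak intensity (mm/h)
sigma = np.array([10.0, 5.0])   # standard deviations (assumed)
capacity = 90.0                 # assumed drainage capacity in combined units

def g(u):
    """Limit state: failure when g(u) <= 0 (demand exceeds capacity)."""
    depth, intensity = mu + sigma * np.asarray(u)
    return capacity - (depth + 1.5 * intensity)

# FORM: the reliability index beta is the distance from the origin to the
# nearest point on the failure surface g(u) = 0 (the design point).
res = optimize.minimize(lambda u: float(np.dot(u, u)), x0=[1.0, 1.0],
                        constraints={"type": "eq", "fun": g})
beta = float(np.sqrt(res.fun))
pf = stats.norm.cdf(-beta)      # failure probability per rainstorm
```

For this linear limit state FORM is exact and can be checked by hand: beta = 20 / sqrt(10**2 + 7.5**2) = 1.6, i.e. a failure probability of about 5.5% per rainstorm.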

  7. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  8. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given.

  9. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic insurance...

  10. Comparative analysis of deterministic and probabilistic fracture mechanical assessment tools

    Energy Technology Data Exchange (ETDEWEB)

    Heckmann, Klaus [Gesellschaft fuer Anlagen- und Reaktorsicherheit (GRS) gGmbH, Koeln (Germany); Saifi, Qais [VTT Technical Research Centre of Finland, Espoo (Finland)

    2016-11-15

    Uncertainties in material properties, manufacturing processes, loading conditions and damage mechanisms complicate the quantification of structural reliability. Probabilistic structural mechanics codes serve as tools for assessing leak and break probabilities of nuclear piping components. Probabilistic fracture mechanics tools have been compared in various benchmark activities, usually revealing minor but systematic discrepancies between the results of different codes. In this joint paper, probabilistic fracture mechanics codes are compared. Crack initiation, crack growth and the influence of in-service inspections are analyzed. Example cases for stress corrosion cracking and fatigue in LWR conditions are analyzed. The evolution of annual failure probabilities during simulated operation time is investigated in order to identify the reasons for differences between the results of different codes. The comparison of the tools is used for further improvement of the codes applied by the partners.

  11. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  12. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be reconciled with expected utility theory.

  13. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory.

  14. Failure Modes Taxonomy for Reliability Assessment of Digital Instrumentation and Control Systems for Probabilistic Risk Analysis - Failure modes taxonomy for reliability assessment of digital I and C systems for PRA

    International Nuclear Information System (INIS)

    Amri, A.; Blundell, N.; ); Authen, S.; Betancourt, L.; Coyne, K.; Halverson, D.; Li, M.; Taylor, G.; Bjoerkman, K.; Brinkman, H.; Postma, W.; Bruneliere, H.; Chirila, M.; Gheorge, R.; Chu, L.; Yue, M.; Delache, J.; Georgescu, G.; Deleuze, G.; Quatrain, R.; Thuy, N.; Holmberg, J.-E.; Kim, M.C.; Kondo, K.; Mancini, F.; Piljugin, E.; Stiller, J.; Sedlak, J.; Smidts, C.; Sopira, V.

    2015-01-01

    Digital protection and control systems appear as upgrades in older nuclear power plants (NPP) and are commonplace in new NPPs. To assess the risk of NPP operation and to determine the risk impact of digital systems, there is a need to quantitatively assess the reliability of the digital systems in a justifiable manner. Due to the many unique attributes of digital systems (e.g., functions are implemented by software, units of the system interact in a communication network, faults can be identified and handled online), a number of modelling and data collection challenges exist, and international consensus on the reliability modelling has not yet been reached. The objective of the task group called DIGREL has been to develop a taxonomy of failure modes of digital components for the purposes of probabilistic risk analysis (PRA). An activity focused on the development of a common taxonomy of failure modes is seen as an important step towards standardised digital instrumentation and control (I and C) reliability assessment techniques for PRA. Needs from PRA have guided the work, meaning, e.g., that the I and C system and its failures are studied from the point of view of their functional significance. The taxonomy will be the basis of future modelling and quantification efforts. It will also help to define a structure for data collection and to review PRA studies. The proposed failure modes taxonomy has been developed by first collecting examples of taxonomies provided by the task group organisations. This material showed some variety in the handling of I and C hardware failure modes, depending on the context in which the failure modes have been defined. Regarding the software part of I and C, the failure modes defined in NPP PRAs have been simple - typically a software CCF failing identical processing units. The DIGREL task group has defined a new failure modes taxonomy based on a hierarchical definition of five levels of abstraction: 1. system level (complete

  15. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 ‘Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties...

  16. Probabilistic design of fibre concrete structures

    Science.gov (United States)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis enables realistic prediction of structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness and ductility are described in the paper. Since the variability of the fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty or randomness of the material properties, obtained from material tests, is accounted for in the random distributions. Furthermore, degradation of the reinforced concrete materials such as carbonation of concrete, corrosion of reinforcement, etc. can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis.
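As a minimal sketch of the randomization idea, the failure probability and the corresponding reliability index can be estimated by sampling a resistance model against a load effect. The distributions and numbers below are hypothetical stand-ins for the repeated nonlinear finite element runs described above:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical randomized model: resistance R of a fibre concrete member
# (lognormal) versus load effect S (normal). In a real application each R
# sample would come from a nonlinear finite element run with randomized
# material properties; a closed-form stand-in keeps the sketch short.
R = rng.lognormal(mean=np.log(50.0), sigma=0.10, size=n)   # resistance, kNm
S = rng.normal(35.0, 5.0, size=n)                          # load effect, kNm

pf = np.mean(R <= S)            # estimated failure probability
beta = -stats.norm.ppf(pf)      # corresponding reliability index
```

The conversion between the two safety measures is the standard-normal relation beta = -Phi^{-1}(pf), matching the abstract's statement that safety can be characterized by failure probability or by reliability index.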

  17. Determining the theoretical reliability function of thermal power system using simple and complex Weibull distribution

    Directory of Open Access Journals (Sweden)

    Kalaba Dragan V.

    2014-01-01

    The main subject of this paper is the presentation of a probabilistic technique for thermal power system reliability assessment. Exploitation research into the reliability of the fossil fuel power plant system has defined the function, or probabilistic law, according to which the random variable (occurrence of a complete unplanned standstill) behaves. Based on these data, and by applying reliability theory to this particular system using simple and complex Weibull distributions, a hypothesis has been confirmed that the distribution of the observed random variable fully describes the behaviour of such a system in terms of reliability. The comprehensive insight established in the field of probabilistic power system reliability assessment could serve as an input for further research and development in the area of power system planning and operation.
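The reliability functions involved can be sketched directly: a simple two-parameter Weibull, and, for the "complex" case, a weighted mixture of Weibull components. All parameter values below are illustrative assumptions, not the plant data of the study:

```python
import numpy as np

def weibull_reliability(t, beta, eta):
    """Simple two-parameter Weibull reliability: R(t) = exp(-(t/eta)**beta)."""
    return np.exp(-(np.asarray(t, dtype=float) / eta) ** beta)

def mixed_weibull_reliability(t, params, weights):
    """'Complex' (mixed) Weibull: a weighted sum of Weibull reliability terms."""
    return sum(w * weibull_reliability(t, b, e) for (b, e), w in zip(params, weights))

# Illustrative parameters only; real values would come from standstill records.
R_simple = weibull_reliability(2000.0, beta=1.2, eta=4000.0)
R_mixed = mixed_weibull_reliability(
    2000.0, params=[(0.8, 1000.0), (2.5, 6000.0)], weights=[0.3, 0.7])
```

A mixture with one component below β = 1 and one above it can reproduce both the early-failure and wear-out regions of a bathtub-shaped hazard, which is the usual motivation for the complex form.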

  18. Design of robust reliable control for T-S fuzzy Markovian jumping delayed neutral type neural networks with probabilistic actuator faults and leakage delays: An event-triggered communication scheme.

    Science.gov (United States)

    Syed Ali, M; Vadivel, R; Saravanakumar, R

    2018-06-01

    This study examines the problem of robust reliable control for Takagi-Sugeno (T-S) fuzzy Markovian jumping delayed neural networks with probabilistic actuator faults and leakage terms under an event-triggered communication scheme. First, the randomly occurring actuator faults and their failure rates are governed by two sets of unrelated random variables satisfying certain probabilistic failure rates of every actuator, and a new type of distribution-based event-triggered fault model is proposed which utilizes the effect of transmission delay. Second, a Takagi-Sugeno (T-S) fuzzy model is adopted for the neural networks and the randomness of actuator failures is modeled in a Markov jump model framework. Third, to guarantee that the considered closed-loop system is exponentially stable in mean square with a prescribed reliable control performance, a Markov jump event-triggered scheme is designed, which is the main purpose of this study. Fourth, by constructing an appropriate Lyapunov-Krasovskii functional and employing the Newton-Leibniz formulation and integral inequalities, several delay-dependent criteria for the solvability of the addressed problem are derived. The obtained stability criteria are stated in terms of linear matrix inequalities (LMIs), which can be checked numerically using the effective LMI toolbox in MATLAB. Finally, numerical examples are given to illustrate the effectiveness and reduced conservatism of the proposed results over existing ones; one of the examples is supported by a real-life application to the benchmark problem. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.

  19. Structural reliability assessment capability in NESSUS

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.

    1992-07-01

    The principal capabilities of NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), an advanced computer code developed for probabilistic structural response analysis, are reviewed, and its structural reliability assessment capability is illustrated. The code combines flexible structural modeling tools with advanced probabilistic algorithms in order to compute probabilistic structural response and resistance, component reliability and risk, and system reliability and risk. An illustrative numerical example is presented.

  20. Probabilistic finite elements for fracture mechanics

    Science.gov (United States)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic measures, such as the expectation, covariance and correlation of stress intensity factors, are calculated for random load, random material and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.

  1. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability.

  2. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...

  3. Probabilistic assessment of faults

    International Nuclear Information System (INIS)

    Foden, R.W.

    1987-01-01

    Probabilistic safety analysis (PSA) is the process by which the probability (or frequency of occurrence) of reactor fault conditions which could lead to unacceptable consequences is assessed. The basic objective of a PSA is to allow a judgement to be made as to whether or not the principal probabilistic requirement is satisfied. It also gives insights into the reliability of the plant which can be used to identify possible improvements. This is explained in the article. The scope of a PSA and the PSA performed by the National Nuclear Corporation (NNC) for the Heysham II and Torness AGRs and Sizewell-B PWR are discussed. The NNC methods for hazards, common cause failure and operator error are mentioned. (UK)

  4. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  5. Probabilistic Design of Offshore Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    1988-01-01

    Probabilistic design of structural systems is considered in this paper. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements satisfies given requirements or such that the systems reliability satisfies a given requirement. Based on a sensitivity analysis, optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi-analytical derivatives. Finally an example of probabilistic design of an offshore structure is considered.

  6. Probabilistic Design of Offshore Structural Systems

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard

    Probabilistic design of structural systems is considered in this paper. The reliability is estimated using first-order reliability methods (FORM). The design problem is formulated as the optimization problem to minimize a given cost function such that the reliability of the single elements satisfies given requirements or such that the systems reliability satisfies a given requirement. Based on a sensitivity analysis, optimization procedures to solve the optimization problems are presented. Two of these procedures solve the system reliability-based optimization problem sequentially using quasi-analytical derivatives. Finally an example of probabilistic design of an offshore structure is considered.

  7. Probabilistic Design

    DEFF Research Database (Denmark)

    Burcharth, H. F.; Sørensen, John Dalsgaard; Voortman, Hessel

    This report describes the failure modes in the computational reliability program implemented at Aalborg University within PROVERBS.

  8. Reliability

    African Journals Online (AJOL)

    eobe

    Corresponding author, Tel: +234-703. RELIABILITY .... V , , given by the code of practice. However, checks must .... an optimization procedure over the failure domain F corresponding .... of Concrete Members based on Utility Theory,. Technical ...

  9. Human reliability

    International Nuclear Information System (INIS)

    Embrey, D.E.

    1987-01-01

    Concepts and techniques of human reliability have been developed and are used mostly in probabilistic risk assessment. For this, the major application of human reliability assessment has been to identify the human errors which have a significant effect on the overall safety of the system and to quantify the probability of their occurrence. Some of the major issues within human reliability studies are reviewed and it is shown how these are applied to the assessment of human failures in systems. This is done under the following headings: models of human performance used in human reliability assessment, the nature of human error, classification of errors in man-machine systems, practical aspects, human reliability modelling in complex situations, quantification and examination of human reliability, judgement based approaches, holistic techniques and decision analytic approaches. (UK)

  10. Reliability Calculations

    DEFF Research Database (Denmark)

    Petersen, Kurt Erling

    1986-01-01

    Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose to improve the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed based upon numerical integration in several variables. In systems reliability Monte Carlo simulation programs are used especially in analysis of very...
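In structural reliability, the kind of integral such a program evaluates can be illustrated in one dimension with the classical load-resistance interference formula P_f = ∫ F_R(s) f_S(s) ds. The normal distributions below are assumptions chosen so the numerical result can be checked against the closed form:

```python
import numpy as np
from scipy import integrate, stats

# One-dimensional load-resistance interference: Pf = ∫ F_R(s) f_S(s) ds.
# (The program described above integrates in several variables; one
# variable is enough to show the principle.)
R = stats.norm(50.0, 5.0)   # resistance distribution (assumed)
S = stats.norm(35.0, 5.0)   # load-effect distribution (assumed)

pf, _ = integrate.quad(lambda s: R.cdf(s) * S.pdf(s), -np.inf, np.inf)

# Closed-form check for two normal variables:
pf_exact = stats.norm.cdf(-(50.0 - 35.0) / np.sqrt(5.0**2 + 5.0**2))
```

For non-normal or dependent variables no closed form exists, which is exactly when numerical integration (or Monte Carlo simulation, for systems problems) earns its keep.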

  11. Probabilistic Modeling of Timber Structures

    DEFF Research Database (Denmark)

    Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2005-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The present...... proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for components and connections. The recommended...

  12. Probabilistic fracture finite elements

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Lua, Y. J.

    1991-05-01

Probabilistic Fracture Mechanics (PFM) is a promising method for estimating the fatigue life and inspection cycles of mechanical and structural components. The Probability Finite Element Method (PFEM), which is based on second moment analysis, has proved to be a promising, practical approach to handling problems with uncertainties. Since the PFEM provides a powerful computational tool to determine the first and second moments of random parameters, the second moment reliability method can easily be combined with the PFEM to obtain measures of the reliability of a structural system. The method is also being applied to fatigue crack growth. Uncertainties in the material properties of advanced materials such as polycrystalline alloys, ceramics, and composites are commonly observed in experimental tests, mainly owing to intrinsic microcracks, which are randomly distributed as a result of the applied load and the residual stress.
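The second moment reliability method mentioned in the abstract can be sketched in a few lines: given only the means and standard deviations of a resistance R and a load effect S (the first and second moments a PFEM analysis would supply), the Cornell reliability index and the corresponding failure probability follow directly. The numbers below are illustrative placeholders, not values from the paper.

```python
import math

def norm_cdf(x):
    # Standard normal CDF via the error function
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def second_moment_reliability(mu_r, sigma_r, mu_s, sigma_s):
    """Cornell (second-moment) reliability index for the limit state
    g = R - S, assuming independent resistance R and load effect S."""
    beta = (mu_r - mu_s) / math.sqrt(sigma_r ** 2 + sigma_s ** 2)
    return beta, norm_cdf(-beta)  # failure probability under a normal approximation

# Illustrative moments, e.g. as produced by a probabilistic finite element run
beta, pf = second_moment_reliability(mu_r=500.0, sigma_r=50.0,
                                     mu_s=300.0, sigma_s=40.0)
```

Only the two moments of each variable enter the calculation, which is what makes the combination with a moment-propagating method like the PFEM natural.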

  13. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a nuclear power plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using nation-specific data. (author)

  14. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating the data necessary for quantification, ways of treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a nuclear power plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs, it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing the systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using nation-specific data. (author)

  15. Method for analysis and assessment of the relation between stress and reliability of knowledge-based actions in the probabilistic safety analysis

    International Nuclear Information System (INIS)

    Fassmann, Werner

    2014-06-01

According to the current theoretical and empirical state of the art, stress has to be understood as the emotional and cognitive reaction by which humans adapt to situations that imply real or imagined danger, threat, or frustration of important personal goals or needs. The emotional reaction to such situations can be so extreme that rational coping with the situation is precluded. In less extreme cases, changes in the cognitive processes underlying human action will occur, which may systematically affect the reliability of tasks that personnel have to perform in a stressful situation. Reliable task performance by personnel of nuclear power plants and other high-risk technologies is affected by such effects as well. The method developed within the research and development project RS1198, sponsored by the German Federal Ministry for Economic Affairs and Energy (BMWi), addresses both aspects of emotional and cognitive coping with stressful situations. The analytical and evaluation steps of the approach provide guidance to end users on how to capture and quantify the contribution of stress-related emotional and cognitive factors to the reliable performance of knowledge-based actions; a suitable guideline has been developed for this purpose. Further research for clarifying open questions has been identified, and a case study application illustrates how to use the method. Part of the work performed in this project was dedicated to a review addressing the extent to which Swain's approach to the analysis and evaluation of stress is in line with current scientific knowledge. Suitable suggestions for updates have been developed.

  16. Probabilistic Unawareness

    Directory of Open Access Journals (Sweden)

    Mikaël Cozic

    2016-11-01

Full Text Available The modeling of awareness and unawareness is a significant topic in the doxastic logic literature, where it is usually tackled in terms of full belief operators. The present paper aims at a treatment in terms of partial belief operators. It draws upon the modal probabilistic logic that was introduced by Aumann (1999) at the semantic level, and then axiomatized by Heifetz and Mongin (2001). The paper embodies in this framework those properties of unawareness that have been highlighted in the seminal paper by Modica and Rustichini (1999). Their paper deals with full belief, but we argue that the properties in question also apply to partial belief. Our main result is a (soundness and completeness) theorem that reunites the two strands—modal and probabilistic—of doxastic logic.

  17. Development and application of a cost-benefit framework for energy reliability. Using probabilistic methods in network planning and regulation to enhance social welfare. The N-1 rule

    International Nuclear Information System (INIS)

    Nooij, Michiel de; Baarsma, Barbara; Bloemhof, Gabriel; Dijk, Harold; Slootweg, Han

    2010-01-01

    Although electricity is crucial to many activities in developed societies, guaranteeing a maximum reliability of supply to end-users is extremely costly. This situation gives rise to a trade-off between the costs and benefits of reliability. The Dutch government has responded to this trade-off by changing the rule stipulating that electricity networks must be able to maintain supply even if one component fails (known as the N-1 rule), even in maintenance situations. This rule was changed by adding the phrase 'unless the costs exceed the benefits.' We have developed a cost-benefit framework for the implementation and application of this new rule. The framework requires input on failure probability, the cost of supply interruptions to end-users and the cost of investments. A case study of the Dutch grid shows that the method is indeed practicable and that it is highly unlikely that N-1 during maintenance will enhance welfare in the Netherlands. Therefore, including the limitation 'unless the costs exceed the benefits' in the rule has been a sensible policy for the Netherlands, and would also be a sensible policy for other countries. (author)

  18. Probabilistic Analysis of Crack Width

    Directory of Open Access Journals (Sweden)

    J. Marková

    2000-01-01

Full Text Available Probabilistic analysis of the crack width of a reinforced concrete element is based on the formulas accepted in Eurocode 2 and European Model Code 90. The obtained values of the reliability index β seem to be satisfactory for a reinforced concrete slab that fulfils the requirements for the crack width specified in Eurocode 2. However, the reliability of the slab seems to be insufficient when the European Model Code 90 is considered; the reliability index is less than the value of 1.5 recommended for serviceability limit states in Eurocode 1. Analysis of the sensitivity factors of the basic variables makes it possible to identify the variables that significantly affect the total crack width.
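The serviceability check described above can be mimicked with a small Monte Carlo experiment: simulate the crack width, count exceedances of the limit, and convert the failure fraction into a reliability index. The lognormal crack-width model and the 0.3 mm limit below are purely illustrative placeholders, not the Eurocode 2 or Model Code 90 formulas.

```python
import math
import random
from statistics import NormalDist

random.seed(42)

W_LIM = 0.3      # crack width limit [mm], illustrative
N = 100_000

# Hypothetical lognormal crack-width model standing in for the code formulas:
# median width 0.2 mm, logarithmic standard deviation 0.25
failures = sum(
    1 for _ in range(N)
    if 0.2 * math.exp(random.gauss(0.0, 0.25)) > W_LIM
)

pf = failures / N                  # exceedance (failure) probability
beta = -NormalDist().inv_cdf(pf)   # corresponding reliability index
```

With these placeholder numbers the index lands in the vicinity of the 1.5 target for serviceability limit states, which is the kind of comparison the abstract makes.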

  19. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method at the component level and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine their effects on the stress state within each component. The geometric variations include the chord length and height for the blade, and the inner radius, outer radius, and thickness for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program NESTEM to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and open-source deterministic programs, probabilistic programs, or modules.
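The response surface method applied at the component level can be illustrated with a toy one-dimensional case: run the "expensive" deterministic model at a few design points, fit a cheap quadratic surrogate, and do the Monte Carlo sampling on the surrogate instead. The model, distribution, and stress threshold below are invented for illustration and do not come from PRODAF.

```python
import random

random.seed(1)

def expensive_model(x):
    # Stand-in for a deterministic FEA run: stress as a function of one
    # normalized geometric parameter x (purely illustrative)
    return 100.0 + 40.0 * x + 12.0 * x * x

# Exact quadratic fit y = a + b*x + c*x^2 through three runs at x = -1, 0, +1
f_m, f_0, f_p = expensive_model(-1.0), expensive_model(0.0), expensive_model(1.0)
a = f_0
b = (f_p - f_m) / 2.0
c = (f_p + f_m) / 2.0 - f_0

def surrogate(x):
    return a + b * x + c * x * x

# Monte Carlo on the cheap surrogate instead of re-running the FEA model
N = 20_000
exceed = sum(1 for _ in range(N)
             if surrogate(random.gauss(0.0, 0.3)) > 130.0)
p_exceed = exceed / N
```

Three deterministic runs replace twenty thousand, which is the economy the response surface method buys when each run is a full finite element solve.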

  20. Probabilistic methods used in NUSS

    International Nuclear Information System (INIS)

    Fischer, J.; Giuliani, P.

    1985-01-01

Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides, design and siting are the two areas where most use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides the review mainly covers the analysis of seismic, hydrological and external man-made events, as well as some aspects of the analysis of extreme meteorological events. Probabilistic methods are recommended in the design guides but are not made a requirement. There are several reasons for this, mainly the lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given and the concept of the design basis as used in the NUSS design guides is explained. (author)

  1. Probabilistic Design of Wind Turbines

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Toft, H.S.

    2010-01-01

    Probabilistic design of wind turbines requires definition of the structural elements to be included in the probabilistic basis: e.g., blades, tower, foundation; identification of important failure modes; careful stochastic modeling of the uncertain parameters; recommendations for target reliability....... It is described how uncertainties in wind turbine design related to computational models, statistical data from test specimens, results from a few full-scale tests and from prototype wind turbines can be accounted for using the Maximum Likelihood Method and a Bayesian approach. Assessment of the optimal...... reliability level by cost-benefit optimization is illustrated by an offshore wind turbine example. Uncertainty modeling is illustrated by an example where physical, statistical and model uncertainties are estimated....

  2. Probabilistic escalation modelling

    Energy Technology Data Exchange (ETDEWEB)

    Korneliussen, G.; Eknes, M.L.; Haugen, K.; Selmer-Olsen, S. [Det Norske Veritas, Oslo (Norway)

    1997-12-31

    This paper describes how structural reliability methods may successfully be applied within quantitative risk assessment (QRA) as an alternative to traditional event tree analysis. The emphasis is on fire escalation in hydrocarbon production and processing facilities. This choice was made due to potential improvements over current QRA practice associated with both the probabilistic approach and more detailed modelling of the dynamics of escalating events. The physical phenomena important for the events of interest are explicitly modelled as functions of time. Uncertainties are represented through probability distributions. The uncertainty modelling enables the analysis to be simple when possible and detailed when necessary. The methodology features several advantages compared with traditional risk calculations based on event trees. (Author)

  3. Deliverable D74.2. Probabilistic analysis methods for support structures

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2018-01-01

    Relevant Description: Report describing the probabilistic analysis for offshore substructures and results attained. This includes comparison with experimental data and with conventional design. Specific targets: 1) Estimate current reliability level of support structures 2) Development of basis...... for probabilistic calculations and evaluation of reliability for offshore support structures (substructures) 3) Development of a probabilistic model for stiffness and strength of soil parameters and for modeling geotechnical load bearing capacity 4) Comparison between probabilistic analysis and deterministic...

  4. Structural Reliability Methods

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager; Madsen, H. O.

The structural reliability methods quantitatively treat the uncertainty of predicting the behaviour and properties of a structure given the uncertain properties of its geometry, materials, and the actions it is supposed to withstand. This book addresses the probabilistic methods for evaluation...... of structural reliability, including the theoretical basis for these methods. Partial safety factor codes under current practice are briefly introduced and discussed. A probabilistic code format for obtaining a formal reliability evaluation system that catches the most essential features of the nature...... of the uncertainties and their interplay is then developed, step by step. The concepts presented are illustrated by numerous examples throughout the text....

  5. Libraries serving dialogue

    CERN Document Server

    Dupont, Odile

    2014-01-01

This book, based on the experiences of libraries serving interreligious dialogue, presents themes such as library tools serving dialogue between cultures, collections in dialogue, children and young adults dialoguing beyond borders, storytelling as dialogue, and librarians serving interreligious dialogue.

  6. Probabilistic approach to EMP assessment

    International Nuclear Information System (INIS)

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program

  7. Technology of serving

    OpenAIRE

    Taskov, Nako

    2013-01-01

The book “Technology of serving” was prepared according to the curriculum and is intended for students at the Faculty of Tourism and Business Logistics in the Republic of Macedonia. On the subject of the technology of serving, its contents include the following: the rooms for serving, the types of catering establishments in which food and beverages are served, the professional serving staff, equipment and inventory for serving, card selection services in serving, getting to know drin...

  8. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  9. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

Putting forward a unified presentation of the features and possible applications of probabilistic preference composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insight into evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis – together with explanations of the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also for teaching graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.

  10. Evaluation of Probabilistic Reasoning Evidence from Seventh-Graders

    Science.gov (United States)

    Erdem, Emrullah; Gürbüz, Ramazan

    2016-01-01

    The purpose of this study was to evaluate probabilistic reasoning of seventh-grade students (N=167) studying at randomly selected three middle schools that served low and middle socioeconomic areas in a city of Turkey. "Probabilistic Reasoning Test (PRT)" was developed and used as a data collection tool. In analyzing the data,…

  11. When catalysis is useful for probabilistic entanglement transformation

    International Nuclear Information System (INIS)

    Feng Yuan; Duan Runyao; Ying Mingsheng

    2004-01-01

We determine all 2x2 quantum states that can serve as useful catalysts for a given probabilistic entanglement transformation, in the sense that they can increase the maximal transformation probability. When higher-dimensional catalysts are considered, a necessary and sufficient condition is derived under which a certain probabilistic transformation has useful catalysts.

  12. Advances in probabilistic risk analysis

    International Nuclear Information System (INIS)

    Hardung von Hardung, H.

    1982-01-01

Probabilistic risk analysis can now look back upon almost a quarter century of intensive development. The early studies, whose methods and results are still referred to occasionally, only permitted rough estimates of the probabilities of recognizable accident scenarios, failing to provide a method which could have served as a reference base for calculating the overall risk associated with nuclear power plants. The first truly solid attempt was the Rasmussen Study and, partly based on it, the German Risk Study. In those studies, probabilistic risk analysis was given a much more precise basis. New methodologies have been developed in the meantime, however, which allow much more informative risk studies to be carried out. They have been found to be valuable tools for management decisions with respect to backfitting, reinforcement and risk limitation. Today they are mainly applied by specialized private consultants and have already found widespread application, especially in the USA. (orig.)

  13. Recent developments of the NESSUS probabilistic structural analysis computer program

    Science.gov (United States)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  14. Reliability calculations

    International Nuclear Information System (INIS)

    Petersen, K.E.

    1986-03-01

Risk and reliability analysis is increasingly being used in evaluations of plant safety and plant reliability. The analysis can be performed either during the design process or during the operation time, with the purpose of improving the safety or the reliability. Due to plant complexity and safety and availability requirements, sophisticated tools, which are flexible and efficient, are needed. Such tools have been developed in the last 20 years and they have to be continuously refined to meet the growing requirements. Two different areas of application were analysed. In structural reliability, probabilistic approaches have been introduced in some cases for the calculation of the reliability of structures or components. A new computer program has been developed, based upon numerical integration in several variables. In systems reliability, Monte Carlo simulation programs are used, especially in the analysis of very complex systems. In order to increase the applicability of the programs, variance reduction techniques can be applied to speed up the calculation process. Variance reduction techniques have been studied and procedures for the implementation of importance sampling are suggested. (author)
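The variance reduction idea at the end of the abstract can be demonstrated on a textbook rare event: estimating P(X > 4) for a standard normal X, where crude Monte Carlo at this sample size would almost never observe a failure. Importance sampling draws from a proposal shifted into the failure region and corrects each hit with a likelihood-ratio weight. This is a generic sketch, not the program described in the report.

```python
import math
import random

random.seed(0)

def phi(x):
    # Standard normal probability density
    return math.exp(-0.5 * x * x) / math.sqrt(2.0 * math.pi)

THRESHOLD = 4.0   # "failure" when X > 4; true probability is about 3.2e-5
N = 50_000

total = 0.0
for _ in range(N):
    x = random.gauss(THRESHOLD, 1.0)          # sample from shifted proposal N(4, 1)
    if x > THRESHOLD:
        total += phi(x) / phi(x - THRESHOLD)  # likelihood-ratio weight
pf_is = total / N
```

Because nearly every sample lands near the failure boundary, the weighted estimator reaches a relative error of a few percent here, whereas crude sampling would need hundreds of millions of draws for the same accuracy.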

  15. Do probabilistic forecasts lead to better decisions?

    Directory of Open Access Journals (Sweden)

    M. H. Ramos

    2013-06-01

    Full Text Available The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.
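A minimal version of the risk-based decision that the experiment probed is the classical cost-loss rule: take protective action when the forecast probability of the flood exceeds the ratio of the protection cost C to the avoidable loss L. The numbers below are invented for illustration.

```python
def decide(p_flood, cost, loss):
    # Expected loss if acting is `cost`; if waiting, it is `p_flood * loss`.
    # Acting is therefore rational whenever p_flood > cost / loss.
    return "act" if p_flood * loss > cost else "wait"

# With C = 10 and L = 100 the action threshold is a forecast probability of 0.1
decisions = {p: decide(p, cost=10.0, loss=100.0) for p in (0.05, 0.2, 0.5)}
```

The rule only makes sense when a probabilistic forecast is available at all, which is the link between forecast communication and decision quality that the paper investigates.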

  16. Human reliability analysis

    International Nuclear Information System (INIS)

    Dougherty, E.M.; Fragola, J.R.

    1988-01-01

The authors present a treatment of human reliability analysis incorporating an introduction to probabilistic risk assessment for nuclear power generating stations. They treat the subject according to the framework established for general systems theory, drawing upon reliability analysis, psychology, human factors engineering, and statistics, and integrating elements of these fields within a systems framework. The book provides a history of human reliability analysis and includes examples of the application of the systems approach.

  17. Probabilistic Design of Coastal Flood Defences in Vietnam

    NARCIS (Netherlands)

    Mai Van, C.

    2010-01-01

This study further develops the method of probabilistic design to address a knowledge gap in its application regarding safety and reliability, risk assessment and risk evaluation in the field of flood defences. The thesis discusses: - a generic probabilistic design framework for assessing flood

  18. Utilization of probabilistic methods for evaluating the safety of PWRs built in France

    International Nuclear Information System (INIS)

    Queniart, D.; Brisbois, J.; Lanore, J.M.

    1985-01-01

    Firstly, it is recalled that, in France, PWRs are designed on a deterministic basis by studying the consequences of a limited number of conventional incidents whose estimated frequency is specified in order-of-magnitude terms and for which it is shown that the consequences, for each category of frequency, predominate over those of the other situations in the same category. These situations are called dimensioning situations. The paper then describes the use made of probabilistic methods. External attacks and loss of redundant systems are examined in particular. A probabilistic approach is in fact well suited to the evaluation of risks due, among other things, to aircraft crashes and the industrial environment. Analysis of the reliability of redundant systems has shown that, in the light of the overall risk assessment objective, their loss should be examined with a view to instituting counteraction to reduce the risks associated with such loss (particularly the introduction of special control procedures). Probabilistic methods are used to evaluate the effectiveness of the counteraction proposed and such a study has been carried out for total loss of electric power supply. Finally, the probabilistic study of hazard initiated post factum by the French safety authorities for the standardized 900 MW(e) power units is described. The study, which is not yet complete, will serve as the basis for a permanent safety analysis tool taking into account control procedures and the total operating experience acquired using these power units. (author)

  19. Probabilistic risk benchmark of the Brazilian electrical system; Risco probabilistico de referencia do sistema eletrico brasileiro

    Energy Technology Data Exchange (ETDEWEB)

    Soares, Neyl Hamilton Martelotta

    2002-05-01

The main goal of this dissertation is to carry out a first numerical evaluation of the probabilistic risk magnitudes associated with the Brazilian electrical network, considering the North, Northeast, South, Southeast and Mid-West subsystems. This result is relevant because it can be used as an initial comparative reference for future reliability studies of the Brazilian Basic Grid. As a by-product, the whole set of criteria and procedures used in the work is described in detail; they may also serve as a preliminary basis for future similar evaluations. (author)

  20. Probabilistic Design of Wave Energy Devices

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Kofoed, Jens Peter; Ferreira, C.B.

    2011-01-01

Wave energy has a large potential for contributing significantly to the production of renewable energy. However, the wave energy sector is still not able to deliver cost-competitive and reliable solutions, although it has already demonstrated several proofs of concept. The design of wave energy...... devices is a new and expanding technical area where there is no tradition for probabilistic design—in fact, very few full-scale devices have been built to date, so it can be said that no design tradition really exists in this area. For this reason it is considered to be of great importance to develop...... and advocate for a probabilistic design approach, as it is assumed (in other areas this has been demonstrated) that this leads to more economical designs compared to designs based on deterministic methods. In the present paper a general framework for probabilistic design and reliability analysis of wave energy...

  1. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats, using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) - to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible-worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and that the PILP models learned from probabilistic examples lead to a significant decrease in error, accompanied by improved insight, compared with PILP models learned from non-probabilistic examples.

  2. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    Science.gov (United States)

    Singhal, Surendra N.

    2003-01-01

The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for the probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor, providing an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national and international standing, the mission of the G-11's Probabilistic Methods Committee is to enable and facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries through better, faster, greener, smarter, affordable and reliable product development.

  3. Probabilistic design of nuclear structures: a summary of state of the art and research needs

    International Nuclear Information System (INIS)

    Ravindra, M.K.; Walser, A.

    1978-01-01

    This paper provides an overview of ongoing research in probabilistic design of nuclear structures. The main areas of review are (1) loads, (2) load combinations, (3) missiles, (4) design criteria, (5) seismic safety, (6) system reliability, (7) hazard analysis, and (8) probabilistic response. A consistent framework of probabilistic design of nuclear structures is proposed. Areas of further research and data collection are suggested. (Auth.)

  4. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, systems biology to performance...... modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata....

  5. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...

  6. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  7. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  8. Influence of probabilistic safety analysis on design and operation of PWR plants

    International Nuclear Information System (INIS)

    Bastl, W.; Hoertner, H.; Kafka, P.

    1978-01-01

    This paper gives a comprehensive presentation of the connections and influences of probabilistic safety analysis on the design and operation of PWR plants. In this context, a brief historical retrospective on probabilistic reliability analysis is given. In the main part of this paper, some examples are presented in detail, showing particular outcomes of such probabilistic investigations. Additional paragraphs illustrate some activities and issues in the field of probabilistic safety analysis.

  9. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  10. A Helpful Serving

    Science.gov (United States)

    Rockower, David

    2006-01-01

    This article briefly describes how a fifth-grade class collaborated with a downtown diner for several months and then actually ran the restaurant for four hours. Through the Chatters Cafe, a local high school cafe that serves as a culinary arts training ground for high school students, fifth graders had the opportunity to prepare and serve dinner…

  11. Probabilistic Durability Analysis in Advanced Engineering Design

    Directory of Open Access Journals (Sweden)

    A. Kudzys

    2000-01-01

    The expedience of probabilistic durability concepts and approaches in the advanced engineering design of building materials, structural members and systems is considered. Target margin values of structural safety and serviceability indices are analyzed and their draft values are presented. Analytical methods of the cumulative coefficient of correlation and the limit transient action effect for the calculation of reliability indices are given. The analysis can be used for the probabilistic durability assessment of carrying and enclosure structures of metal, reinforced concrete, wood, plastic and masonry, both homogeneous and sandwich or composite, and of some kinds of equipment. The analysis models can be applied in other engineering fields.

  12. Probabilistic analysis of modernization options

    International Nuclear Information System (INIS)

    Wunderlich, W.O.; Giles, J.E.

    1991-01-01

    This paper reports on benefit-cost analysis for hydropower operations, a standard procedure for reaching planning decisions. Cost overruns and benefit shortfalls are also common occurrences. One reason for the difficulty of predicting future benefits and costs is that they usually cannot be represented with sufficient reliability by single, accurate values, because of the many uncertainties that enter the analysis through assumptions on inputs and system parameters. Therefore, ranges of variables need to be analyzed instead of single values. As a consequence, the decision criteria, such as net benefit and benefit-cost ratio, also vary over some range. A probabilistic approach is demonstrated as a tool for assessing the reliability of the results.

  13. Reliability Analysis of Fatigue Fracture of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Berzonskis, Arvydas; Sørensen, John Dalsgaard

    2016-01-01

    in the volume of the cast ductile iron main shaft, on the reliability of the component. The probabilistic reliability analysis conducted is based on fracture mechanics models. Additionally, the utilization of the probabilistic reliability for operation and maintenance planning and quality control is discussed....

  14. Advances in reliability and system engineering

    CERN Document Server

    Davim, J

    2017-01-01

    This book presents original studies describing the latest research and developments in the area of reliability and systems engineering. It helps the reader identify gaps in current knowledge and highlights fruitful areas for further research in the field. Among others, this book covers reliability measures, reliability assessment of multi-state systems, optimization of multi-state systems, continuous multi-state systems, new computational techniques applied to multi-state systems, and probabilistic and non-probabilistic safety assessment.

  15. Deterministic and probabilistic approach to safety analysis

    International Nuclear Information System (INIS)

    Heuser, F.W.

    1980-01-01

    The examples discussed in this paper show that reliability analysis methods can be applied fairly well to interpret deterministic safety criteria in quantitative terms. For a further, improved extension of applied reliability analysis, it has turned out that the influence of operational and control systems and of component protection devices should be considered in detail with the aid of reliability analysis methods. Of course, an extension of probabilistic analysis must be accompanied by further development of the methods and a broadening of the data base. (orig.)

  16. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that arbitrary SU(2) transformations of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can also be generalized to qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)
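    The success enhancement from conditional loops mentioned in the abstract can be illustrated with a simple calculation: if each pass of the processor succeeds with probability p and failures are heralded, then repeating the program up to n times succeeds with probability 1 - (1 - p)^n. This is a generic sketch of that idea; the numbers below are illustrative, not taken from the paper.

    ```python
    def loop_success(p: float, n: int) -> float:
        """Probability that at least one of n heralded attempts succeeds,
        assuming independent passes that each succeed with probability p."""
        return 1 - (1 - p) ** n

    single = loop_success(0.25, 1)   # one pass: 0.25
    looped = loop_success(0.25, 4)   # four conditional retries: 0.68359375
    ```

    The gain is geometric: each additional loop iteration multiplies the residual failure probability by (1 - p).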

  17. Probabilistic Infinite Secret Sharing

    OpenAIRE

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and a possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...

  18. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000s, it is only in the last few years that proba...

  19. Limited probabilistic risk assessment applications in plant backfitting

    International Nuclear Information System (INIS)

    Desaedeleer, G.

    1987-01-01

    Plant backfitting programs are defined on the basis of deterministic (e.g., Systematic Evaluation Program) or probabilistic (e.g., Probabilistic Risk Assessment) approaches. Each approach provides valuable assets in defining the program and has its own advantages and disadvantages. Ideally, one should combine the strong points of each approach. This chapter summarizes actual experience gained from combinations of deterministic and probabilistic approaches to define and implement PWR backfitting programs. Such combinations relate to limited applications of probabilistic techniques and are illustrated for upgrading fluid systems. These evaluations allow sound and rational optimization of system upgrades. However, the boundaries of the reliability analysis need to be clearly defined, and system reliability may have to go beyond classical boundaries (e.g., identification of weak links in support systems). Also, implementing upgrades on a system-by-system basis is not necessarily cost-effective. (author)

  20. Probabilistic approach to manipulator kinematics and dynamics

    International Nuclear Information System (INIS)

    Rao, S.S.; Bhatti, P.K.

    2001-01-01

    A high performance, high speed robotic arm must be able to manipulate objects with a high degree of accuracy and repeatability. As with any other physical system, there are a number of factors causing uncertainties in the behavior of a robotic manipulator. These factors include manufacturing and assembling tolerances, and errors in the joint actuators and controllers. In order to study the effect of these uncertainties on the robotic end-effector and to obtain a better insight into the manipulator behavior, the manipulator kinematics and dynamics are modeled using a probabilistic approach. Based on the probabilistic model, kinematic and dynamic performance criteria are defined to provide measures of the behavior of the robotic end-effector. Techniques are presented to compute the kinematic and dynamic reliabilities of the manipulator. The effects of tolerances associated with the various manipulator parameters on the reliabilities are studied. Numerical examples are presented to illustrate the procedures

  1. Probabilistic Model for Fatigue Crack Growth in Welded Bridge Details

    DEFF Research Database (Denmark)

    Toft, Henrik Stensgaard; Sørensen, John Dalsgaard; Yalamas, Thierry

    2013-01-01

    In the present paper a probabilistic model for fatigue crack growth in welded steel details in road bridges is presented. The probabilistic model takes the influence of bending stresses in the joints into account. The bending stresses can either be introduced by e.g. misalignment or redistribution...... of stresses in the structure. The fatigue stress ranges are estimated from traffic measurements and a generic bridge model. Based on the probabilistic models for the resistance and load the reliability is estimated for a typical welded steel detail. The results show that large misalignments in the joints can...

  2. Reliability and safety engineering

    CERN Document Server

    Verma, Ajit Kumar; Karanki, Durga Rao

    2016-01-01

    Reliability and safety are core issues that must be addressed throughout the life cycle of engineering systems. Reliability and Safety Engineering presents an overview of the basic concepts, together with simple and practical illustrations. The authors present reliability terminology used in various engineering fields, viz., electronics engineering, software engineering, mechanical engineering, structural engineering and power systems engineering. The book describes the latest applications in the area of probabilistic safety assessment, such as technical specification optimization, risk monitoring and risk-informed in-service inspection. Reliability and safety studies must, inevitably, deal with uncertainty, so the book includes uncertainty propagation methods: Monte Carlo simulation, fuzzy arithmetic, Dempster-Shafer theory and probability bounds. Reliability and Safety Engineering also highlights advances in system reliability and safety assessment, including dynamic system modeling and uncertainty management. Cas...

  3. Probabilistic analysis of crack containing structures with the PARIS code

    International Nuclear Information System (INIS)

    Brueckner-Foit, A.

    1987-10-01

    The basic features of the PARIS code which has been developed for the calculation of failure probabilities of crack containing structures are explained. An important issue in the reliability analysis of cracked components is the probabilistic leak-before-break behaviour. Formulae for the leak and break probabilities are derived and it is shown how a leak detection system influences the results. An example taken from nuclear applications illustrates the details of the probabilistic leak-before-break analysis. (orig.) [de

  4. Probabilistic real-time contingency ranking method

    International Nuclear Information System (INIS)

    Mijuskovic, N.A.; Stojnic, D.

    2000-01-01

    This paper describes a real-time contingency ranking method based on a probabilistic index: expected energy not supplied. In this way, it is possible to take into account the stochastic nature of electric power system equipment outages. This approach enables a more comprehensive ranking of contingencies, and it is possible to form reliability cost values that can serve as the basis for hourly spot price calculations. The electric power system of Serbia is used as an example for the proposed method. (author)

  5. A probabilistic maintenance model for diesel engines

    Science.gov (United States)

    Pathirana, Shan; Abeygunawardane, Saranga Kumudu

    2018-02-01

    In this paper, a probabilistic maintenance model is developed for inspection-based preventive maintenance of diesel engines, based on the practical model concepts discussed in the literature. The developed model is solved using real data obtained from inspection and maintenance histories of diesel engines, together with experts' views. Reliability indices and costs are calculated for the present maintenance policy of diesel engines. A sensitivity analysis is conducted to observe the effect of inspection-based preventive maintenance on the life cycle cost of diesel engines.

  6. Probabilistic studies for safety at optimum cost

    International Nuclear Information System (INIS)

    Pitner, P.

    1999-01-01

    By definition, the risk of failure of very reliable components is difficult to evaluate. How can the best strategies for in service inspection and maintenance be defined to limit this risk to an acceptable level at optimum cost? It is not sufficient to design structures with margins, it is also essential to understand how they age. The probabilistic approach has made it possible to develop well proven concepts. (author)

  7. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting match weights and how to convert match weights into posterior probabilities of a match using Bayes' theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
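    The weight calculation and Bayes conversion described in the abstract can be sketched as follows. The m- and u-probabilities, the field names, and the prior are illustrative assumptions, not values from the article: m is the probability a field agrees given the records truly match, u the probability it agrees given they do not.

    ```python
    import math

    # Hypothetical m- and u-probabilities for three linkage fields.
    fields = {
        "surname":    {"m": 0.95, "u": 0.01},
        "birth_year": {"m": 0.99, "u": 0.10},
        "postcode":   {"m": 0.90, "u": 0.05},
    }

    def match_weight(agreements):
        """Total match weight: sum of log2 agreement/disagreement weights."""
        w = 0.0
        for name, p in fields.items():
            if agreements[name]:
                w += math.log2(p["m"] / p["u"])          # agreement weight
            else:
                w += math.log2((1 - p["m"]) / (1 - p["u"]))  # disagreement weight
        return w

    def posterior(weight, prior):
        """Convert a total match weight into P(match | evidence) via Bayes'
        theorem, treating 2**weight as the overall likelihood ratio."""
        prior_odds = prior / (1 - prior)
        post_odds = prior_odds * 2 ** weight
        return post_odds / (1 + post_odds)

    # Two fields agree, one disagrees; 1 in 1000 candidate pairs truly match.
    w = match_weight({"surname": True, "birth_year": True, "postcode": False})
    p = posterior(w, prior=0.001)
    ```

    Because the weights are log-likelihood ratios, they add across independent fields, and the conversion back to a probability only needs the prior odds of a match among candidate pairs.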

  8. Dynamical systems probabilistic risk assessment

    Energy Technology Data Exchange (ETDEWEB)

    Denman, Matthew R. [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States); Ames, Arlo Leroy [Sandia National Lab. (SNL-NM), Albuquerque, NM (United States)

    2014-03-01

    Probabilistic Risk Assessment (PRA) is the primary tool used to risk-inform nuclear power regulatory and licensing activities. Risk-informed regulations are intended to reduce inherent conservatism in regulatory metrics (e.g., allowable operating conditions and technical specifications) which are built into the regulatory framework by quantifying both the total risk profile as well as the change in the risk profile caused by an event or action (e.g., in-service inspection procedures or power uprates). Dynamical Systems (DS) analysis has been used to understand unintended time-dependent feedbacks in both industrial and organizational settings. In dynamical systems analysis, feedback loops can be characterized and studied as a function of time to describe the changes to the reliability of plant Structures, Systems and Components (SSCs). While DS has been used in many subject areas, some even within the PRA community, it has not been applied toward creating long-time horizon, dynamic PRAs (with time scales ranging between days and decades depending upon the analysis). Understanding slowly developing dynamic effects, such as wear-out, on SSC reliabilities may be instrumental in ensuring a safely and reliably operating nuclear fleet. Improving the estimation of a plant's continuously changing risk profile will allow for more meaningful risk insights, greater stakeholder confidence in risk insights, and increased operational flexibility.

  9. Why do they serve?

    DEFF Research Database (Denmark)

    Vincent, Stéphanie; Glad, Ane

    2016-01-01

    that after the mission, peace-keepers are generally more disappointed than peace-enforcers. Our results also show that self-benefit motives are important for younger soldiers with only a high school education, and that this group usually serves as peace-enforcers during their gap year....... the survey both before and after deployment. Soldiers are deployed to different missions under the same circumstances. To conceptualize motives among soldiers, we use factor analysis and find three factors: challenge, self-benefit, and fidelity. Challenge represents an occupational orientation; fidelity...

  10. Drama is Served

    DEFF Research Database (Denmark)

    Svømmekjær, Heidi Frank

    2015-01-01

    This article focuses on how the theme of food is used for making social, gender, and other distinctions in the weekly Danish radio series The Hansen Family (The Danish Broadcasting Corporation, 1929-49) and in relation to other radio programmes from the 1930s and 1940s. These distinctions serve t...... with the wife. To Mrs. Hansen, it is the fruit of hard labour rather than a meal to be enjoyed. On a more general level, food is a limited resource, which often causes social tensions to burst onto the surface of human interaction....

  11. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  12. Probabilistic risk analysis in chemical engineering

    International Nuclear Information System (INIS)

    Schmalz, F.

    1991-01-01

    In risk analysis in the chemical industry, recognising potential risks is considered more important than assessing their quantitative extent. Even in assessing risks, emphasis is not on the probability involved but on the possible extent. Qualitative assessment has proved valuable here. Probabilistic methods are used in individual cases where the wide implications make it essential to be able to assess the reliability of safety precautions. In this case, assessment therefore centres on the reliability of technical systems and not on the extent of a chemical risk. 7 figs

  13. Reliability Analysis and Optimal Design of Monolithic Vertical Wall Breakwaters

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, Hans F.; Christiani, E.

    1994-01-01

    Reliability analysis and reliability-based design of monolithic vertical wall breakwaters are considered. Probabilistic models of the most important failure modes, sliding failure, failure of the foundation and overturning failure are described . Relevant design variables are identified...

  14. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good...... metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...

  15. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach. The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood both by statisticians and by researchers in artificial intelligence. The necessary elementary mathematical notions are recalled in an appendix.

  16. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  17. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  18. Predicting Volleyball Serve-Reception

    NARCIS (Netherlands)

    Paulo, Ana; Zaal, Frank T J M; Fonseca, Sofia; Araujo, Duarte

    2016-01-01

    Serve and serve-reception performance have predicted success in volleyball. Given the impact of serve-reception on the game, we aimed at understanding what it is in the serve and receiver's actions that determines the selection of the type of pass used in serve-reception and its efficacy. Four

  19. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  20. Reliability engineering

    International Nuclear Information System (INIS)

    Lee, Chi Woo; Kim, Sun Jin; Lee, Seung Woo; Jeong, Sang Yeong

    1993-08-01

    This book begins by asking what reliability is, covering the origin of reliability problems, the definition of reliability, and the uses of reliability. It also deals with probability and the calculation of reliability; the reliability function and failure rate; probability distributions in reliability; estimation of MTBF; down time, maintainability and availability; breakdown maintenance and preventive maintenance; reliability design, including design of reliability for prediction and statistics; reliability testing; reliability data; and the design and management of reliability.
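    The reliability function, failure rate and MTBF mentioned above are connected by simple formulas in the constant-failure-rate (exponential) model: R(t) = exp(-λt) and MTBF = 1/λ. A minimal sketch with illustrative numbers, not drawn from the book:

    ```python
    import math

    failure_rate = 2e-5          # λ, failures per hour (illustrative)
    mtbf = 1 / failure_rate      # MTBF = 1/λ -> 50,000 hours

    def reliability(t, lam=failure_rate):
        """Survival probability R(t) = exp(-λ t) under a constant failure rate."""
        return math.exp(-lam * t)

    # Steady-state availability combines MTBF with mean time to repair (MTTR).
    mttr = 8.0
    availability = mtbf / (mtbf + mttr)

    r = reliability(10_000)      # probability of surviving 10,000 hours
    ```

    The exponential model is the simplest case; the same quantities generalize to other failure-time distributions (e.g. Weibull) covered in reliability texts.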

  1. Safety-specific benefit of the probabilistic evaluation of older nuclear power plants

    International Nuclear Information System (INIS)

    Hoertner, H.; Koeberlein, K.

    1991-01-01

    The report summarizes the experience of the GRS obtained within the framework of a probabilistic evaluation of older nuclear power plants and the German risk study. The applied methodology and the problems involved are explained first. After a brief summary of probabilistic analyses carried out for German nuclear power plants, reliability analyses for older systems are discussed in detail. The findings from the probabilistic safety analyses and the conclusions drawn are presented. (orig.) [de

  2. Probabilistic thread algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution

  3. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled 'DNA computing, sticker systems and universality' (Acta Informatica, vol. 35, pp. 401-420, 1998). A sticker system uses the Watson-Crick complementarity of DNA molecules: starting from incomplete double-stranded sequences, sticking operations are applied iteratively until a complete double-stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.

  4. Visualizing Probabilistic Proof

    OpenAIRE

    Guerra-Pujol, Enrique

    2015-01-01

    The author revisits the Blue Bus Problem, a famous thought-experiment in law involving probabilistic proof, and presents simple Bayesian solutions to different versions of the blue bus hypothetical. In addition, the author expresses his solutions in standard and visual formats, i.e. in terms of probabilities and natural frequencies.
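    The Bayesian treatment of the Blue Bus hypothetical can be sketched in both of the formats the abstract mentions, probabilities and natural frequencies. The base rate and witness-accuracy figures below are illustrative assumptions, not the article's numbers:

    ```python
    # Blue Bus hypothetical: a witness identifies a bus as blue, and the Blue
    # Bus company runs most buses in town. How probable is "it was a blue bus"?
    p_blue = 0.80                  # base rate of blue buses (assumed)
    p_say_blue_if_blue = 0.90      # witness says "blue" when the bus is blue
    p_say_blue_if_other = 0.10     # witness says "blue" when it is not

    # Bayes' theorem: P(blue | "blue") = P("blue" | blue) P(blue) / P("blue")
    evidence = (p_say_blue_if_blue * p_blue
                + p_say_blue_if_other * (1 - p_blue))
    posterior_blue = p_say_blue_if_blue * p_blue / evidence

    # Natural-frequency reading of the same calculation: of 1000 buses,
    # 800 are blue and the witness flags 720 of them, plus 20 of the 200
    # others, so 720 of 740 flagged buses are truly blue.
    freq_posterior = 720 / (720 + 20)
    ```

    The natural-frequency version makes the visual format discussed in the paper concrete: the posterior is just the fraction of flagged cases that are true positives.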

  5. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of the Internet of Things and Big Data, unconventional techniques are emerging to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computation. In such applications, approximate computing provides a perfect fit, optimizing energy efficiency at the cost of some accuracy. In this work, we build probabilistic adders based on stochastic memristors. The probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from typical approximate CMOS adders. Furthermore, it allows for large area savings and design flexibility in trading performance against power. To reach a performance level similar to that of approximate CMOS adders, the memristive adder achieves 60% power savings. An image-compression application is investigated using the memristive probabilistic adders to illustrate the performance-energy trade-off.
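    The idea of an adder whose bits occasionally flip can be sketched in software (a behavioral toy, not the paper's memristor circuit; the flip probability is a hypothetical stand-in for the device's stochastic switching):

```python
import random

def probabilistic_full_adder(a, b, cin, p_flip=0.05, rng=random):
    """1-bit full adder whose sum bit flips with probability p_flip,
    loosely mimicking stochastic switching in a memristive device."""
    s = a ^ b ^ cin
    cout = (a & b) | (cin & (a ^ b))
    if rng.random() < p_flip:
        s ^= 1
    return s, cout

def probabilistic_add(x, y, bits=8, p_flip=0.05, rng=random):
    """Ripple-carry addition of two unsigned integers with noisy sum bits."""
    carry, out = 0, 0
    for i in range(bits):
        s, carry = probabilistic_full_adder((x >> i) & 1, (y >> i) & 1,
                                            carry, p_flip, rng)
        out |= s << i
    return out

random.seed(0)
exact = probabilistic_add(100, 27, p_flip=0.0)   # no noise -> exact sum
noisy = probabilistic_add(100, 27, p_flip=0.1)   # occasionally off by a bit
```

    With p_flip = 0 the adder is exact; raising p_flip trades accuracy for the power savings an approximate hardware adder would deliver.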

  6. Probabilistic Load Flow

    DEFF Research Database (Denmark)

    Chen, Peiyuan; Chen, Zhe; Bak-Jensen, Birgitte

    2008-01-01

    This paper reviews the development of the probabilistic load flow (PLF) techniques. Applications of the PLF techniques in different areas of power system steady-state analysis are also discussed. The purpose of the review is to identify different available PLF techniques and their corresponding...

  7. Transitive probabilistic CLIR models.

    NARCIS (Netherlands)

    Kraaij, W.; de Jong, Franciska M.G.

    2004-01-01

    Transitive translation could be a useful technique to enlarge the number of supported language pairs for a cross-language information retrieval (CLIR) system in a cost-effective manner. The paper describes several setups for transitive translation based on probabilistic translation models. The

  8. Redefining reliability

    International Nuclear Information System (INIS)

    Paulson, S.L.

    1995-01-01

    Want to buy some reliability? The question would have been unthinkable in some markets served by the natural gas business even a few years ago, but in the new gas marketplace, industrial, commercial and even some residential customers have the opportunity to choose from among an array of options about the kind of natural gas service they need--and are willing to pay for. The complexities of this brave new world of restructuring and competition have sent the industry scrambling to find ways to educate and inform its customers about the increased responsibility they will have in determining the level of gas reliability they choose. This article discusses the new options and the new responsibilities of customers, the need for continuous education, and MidAmerican Energy Company's experiment in direct marketing of natural gas

  9. Probabilistic forecasting and Bayesian data assimilation

    CERN Document Server

    Reich, Sebastian

    2015-01-01

    In this book the authors describe the principles and methods behind probabilistic forecasting and Bayesian data assimilation. Instead of focusing on particular application areas, the authors adopt a general dynamical systems approach, with a profusion of low-dimensional, discrete-time numerical examples designed to build intuition about the subject. Part I explains the mathematical framework of ensemble-based probabilistic forecasting and uncertainty quantification. Part II is devoted to Bayesian filtering algorithms, from classical data assimilation algorithms such as the Kalman filter, variational techniques, and sequential Monte Carlo methods, through to more recent developments such as the ensemble Kalman filter and ensemble transform filters. The McKean approach to sequential filtering in combination with coupling of measures serves as a unifying mathematical framework throughout Part II. Assuming only some basic familiarity with probability, this book is an ideal introduction for graduate students in ap...

  10. The Accelerator Reliability Forum

    CERN Document Server

    Lüdeke, Andreas; Giachino, R

    2014-01-01

    High reliability is a very important goal for most particle accelerators. The biennial Accelerator Reliability Workshop covers topics related to the design and operation of particle accelerators with high reliability. In order to optimize the overall reliability of an accelerator one needs to gather information on the reliability of many different subsystems. While a biennial workshop can serve as a platform for the exchange of such information, the authors aimed to provide a further channel to allow for more timely communication: the Particle Accelerator Reliability Forum [1]. This contribution describes the forum and advertises its use in the community.

  11. Stochastic Simulation and Forecast of Hydrologic Time Series Based on Probabilistic Chaos Expansion

    Science.gov (United States)

    Li, Z.; Ghaith, M.

    2017-12-01

    Hydrological processes are characterized by many complex features, such as nonlinearity, dynamics and uncertainty. How to quantify and address such complexities and uncertainties has been a challenging task for water engineers and managers for decades. To support robust uncertainty analysis, an innovative approach for the stochastic simulation and forecast of hydrologic time series is developed in this study. Probabilistic Chaos Expansions (PCEs) are established through probabilistic collocation to tackle uncertainties associated with the parameters of traditional hydrological models. The uncertainties are quantified in model outputs as Hermite polynomials with regard to standard normal random variables. Subsequently, multivariate analysis techniques are used to analyze the complex nonlinear relationships between meteorological inputs (e.g., temperature, precipitation, evapotranspiration, etc.) and the coefficients of the Hermite polynomials. With the established relationships between model inputs and PCE coefficients, forecasts of hydrologic time series can be generated and the uncertainties in the future time series can be further tackled. The proposed approach is demonstrated using a case study in China and is compared to a traditional stochastic simulation technique, the Markov-Chain Monte-Carlo (MCMC) method. Results show that the proposed approach can serve as a reliable proxy to complicated hydrological models. It can provide probabilistic forecasting in a more computationally efficient manner, compared to the traditional MCMC method. This work provides technical support for addressing uncertainties associated with hydrological modeling and for enhancing the reliability of hydrological modeling results. Applications of the developed approach can be extended to many other complicated geophysical and environmental modeling systems to support the associated uncertainty quantification and risk analysis.
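    The core mechanics of a Hermite chaos expansion built by collocation can be sketched with NumPy's probabilists' Hermite utilities (the "model" below is a hypothetical stand-in for a hydrological model, not the study's):

```python
import numpy as np
from numpy.polynomial import hermite_e as He

def model(k):
    """Hypothetical stand-in for a model response to an uncertain parameter k."""
    return np.exp(-0.5 * k) + 0.1 * k**2

# Probabilistic collocation: parameter k = mu + sigma * xi, xi ~ N(0, 1).
mu, sigma, degree = 1.0, 0.3, 4
xi = He.hermegauss(degree + 1)[0]            # Gauss-Hermite collocation points
coeffs = He.hermefit(xi, model(mu + sigma * xi), degree)

# The fitted expansion acts as a cheap surrogate for Monte Carlo sampling.
rng = np.random.default_rng(42)
samples = He.hermeval(rng.standard_normal(10_000), coeffs)
pce_mean = samples.mean()                    # close to coeffs[0], the PCE mean
```

    Because E[He_n(xi)] = 0 for n >= 1, the leading coefficient is the surrogate's mean response, which is why PCE statistics come almost for free once the coefficients are fitted.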

  12. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using the limited information obtained from a specific site. The book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in the practical implementation of these approaches. In addition, it develops efficient Monte Carlo simulation approaches for slope stability analysis and implements these approaches in a commonly available spreadsheet environment. The approaches and the software package are readily available to geotechnical practitioners and relieve them of implementing the reliability computation algorithms. Readers will find the information a non-specialist needs to determine project-specific statistics of geotechnical properties and to perform probabilistic analysis of slope stability.

  13. A probabilistic bridge safety evaluation against floods.

    Science.gov (United States)

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
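    A minimal version of the Monte Carlo reliability estimate underlying such a study can be sketched for a single limit state (the resistance/demand distributions below are purely illustrative, not the bridge's):

```python
import math
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

# Hypothetical limit state for one failure mode: pier embedment resistance R
# (m) against local scour demand S (m); failure when R - S < 0.
R = rng.normal(6.0, 0.5, n)
S = rng.normal(4.0, 0.7, n)

pf_mc = np.mean(R - S < 0.0)                 # Monte Carlo failure probability

# For independent normals the exact answer is available as a cross-check:
beta = (6.0 - 4.0) / math.hypot(0.5, 0.7)    # reliability (safety) index
pf_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))
```

    A response surface (as in the paper, via Bayesian least squares support vector machines) would replace the cheap limit-state evaluation here when the true model is an expensive hydraulic simulation.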

  14. Probabilistic Modeling of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Rafsanjani, Hesam Mirzaei

    Wind energy is one of several energy sources in the world and a rapidly growing industry in the energy sector. When placed in offshore or onshore locations, wind turbines are exposed to wave excitations, highly dynamic wind loads and/or the wakes from other wind turbines. Therefore, most components in a wind turbine experience highly dynamic and time-varying loads. These components may fail due to wear or fatigue, and this can lead to unplanned shutdown repairs that are very costly. The design by deterministic methods using safety factors is generally unable to account for the many uncertainties. Thus, a reliability assessment should be based on probabilistic methods where stochastic modeling of failures is performed. This thesis focuses on probabilistic models and the stochastic modeling of the fatigue life of the wind turbine drivetrain. Hence, two approaches are considered for stochastic modeling...

  15. Probabilistic safety assessment for research reactors

    International Nuclear Information System (INIS)

    1986-12-01

    Increasing interest in using Probabilistic Safety Assessment (PSA) methods for research reactor safety is being observed in many countries throughout the world. This is mainly because of the ability of this approach to support safe and reliable operation of research reactors. There is also a need to assist developing countries in applying Probabilistic Safety Assessment to existing nuclear facilities which are simpler and therefore less complicated to analyse than a large Nuclear Power Plant. It may be important, therefore, to develop PSA for research reactors. This might also help to better understand the safety characteristics of the reactor and to base any backfitting on a cost-benefit analysis which would ensure that only necessary changes are made. This document touches on all the key aspects of PSA but places greater emphasis on so-called systems analysis aspects rather than the in-plant or ex-plant consequences

  16. Use of probabilistic design methods for NASA applications. [to be used in design phase of Space Transportation Main Engine

    Science.gov (United States)

    Safie, Fayssal M.

    1992-01-01

    This paper presents a reliability evaluation process designed to improve the reliability of advanced launch systems. The work performed includes the development of a reliability prediction methodology to be used in the design phase of the Space Transportation Main Engine (STME). This includes prediction techniques which use historical data bases as well as deterministic and probabilistic engineering models for predicting design reliability. In summary, this paper describes a probabilistic design approach for the next-generation liquid rocket engine, the STME.

  17. Probabilistic Model Development

    Science.gov (United States)

    Adam, James H., Jr.

    2010-01-01

    Objective: Develop a Probabilistic Model for the Solar Energetic Particle Environment. Develop a tool to provide a reference solar particle radiation environment that: 1) will not be exceeded at a user-specified confidence level; and 2) will provide reference environments for a) peak flux, b) event-integrated fluence, and c) mission-integrated fluence. The reference environments will consist of elemental energy spectra for protons, helium and heavier ions.

  18. Geothermal probabilistic cost study

    Energy Technology Data Exchange (ETDEWEB)

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM), is presented. The GPCM is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of geothermal risk which can shift the risk among different agents are analyzed: the leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance. (MHR)

  19. Probabilistic approaches to recommendations

    CERN Document Server

    Barbieri, Nicola; Ritacco, Ettore

    2014-01-01

    The importance of accurate recommender systems has been widely recognized by academia and industry, and recommendation is rapidly becoming one of the most successful applications of data mining and machine learning. Understanding and predicting the choices and preferences of users is a challenging task: real-world scenarios involve users behaving in complex situations, where prior beliefs, specific tendencies, and reciprocal influences jointly contribute to determining the preferences of users toward huge amounts of information, services, and products. Probabilistic modeling represents a robus

  20. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability to be covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration in the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.
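    The baseline probabilistic-atlas estimate described above (a per-voxel frequency of coverage, which the paper's GLM-based method refines) can be sketched on synthetic co-registered masks; the spherical toy masks below are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(0)
shape = (16, 16, 16)
zz, yy, xx = np.indices(shape)

# Hypothetical sample of co-registered binary organ masks: spheres centered
# at the volume center with subject-to-subject jitter in radius.
masks = []
for _ in range(25):
    r = rng.normal(5.0, 0.6)
    masks.append(((xx - 8)**2 + (yy - 8)**2 + (zz - 8)**2) <= r**2)
masks = np.stack(masks)

# Simple probabilistic atlas: voxel-wise relative frequency of coverage.
atlas = masks.mean(axis=0)
center_prob = atlas[8, 8, 8]     # 1.0: every subject covers the center voxel
```

    Voxels near the prototype boundary end up with intermediate probabilities, which is exactly the information a segmentation prior exploits.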

  1. Probabilistic finite elements

    Science.gov (United States)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probabilistic density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings with the yield stress modeled as a random field are given.
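    The first-order core of such a perturbation scheme (mean response at the mean input, variance from the local sensitivity) can be illustrated on a closed-form stand-in for a finite element response; the cantilever formula and all numbers are hypothetical, not from the paper:

```python
def tip_deflection(E):
    """Cantilever tip deflection P*L^3 / (3*E*I); stands in for an FE solve."""
    P, L, I = 1000.0, 2.0, 1e-6      # load (N), length (m), inertia (m^4)
    return P * L**3 / (3.0 * E * I)

E_mean, E_std = 200e9, 10e9          # Young's modulus mean / std (Pa)

# Sensitivity of the response to the random input, by central difference.
h = 1e6
dudE = (tip_deflection(E_mean + h) - tip_deflection(E_mean - h)) / (2 * h)

u_mean = tip_deflection(E_mean)      # first-order mean response
u_var = (dudE * E_std) ** 2          # first-order variance of the response
```

    Since the deflection is proportional to 1/E, the first-order coefficient of variation of the output equals that of the input (5% here); the second-order terms in PFEM correct this estimate when the response is more strongly nonlinear.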

  2. Probabilistic Modeling of the Fatigue Crack Growth Rate for Ni-base Alloy X-750

    International Nuclear Information System (INIS)

    Yoon, J.Y.; Nam, H.O.; Hwang, I.S.; Lee, T.H.

    2012-01-01

    Extending the operating life of existing nuclear power plants (NPPs) beyond 60 years raises many aging problems in passive components, such as PWSCC, IASCC, FAC and corrosion fatigue. Safety analysis combines deterministic and probabilistic analysis, and a general probabilistic analysis such as probabilistic safety assessment (PSA) involves many uncertainties in parameters and relationships. Bayesian inference decreases these uncertainties by updating unknown parameters, helping to ensure the reliability of passive components (e.g., pipes) as well as active components (e.g., valves, pumps) in NPPs. This work develops a probabilistic model for failures and updates the fatigue crack growth rate (FCGR).

  3. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention to the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic-resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages to implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and is therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of the attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes
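    The Green's-function summation step is linear algebra: the coastal waveform for any slip distribution is a slip-weighted sum of precomputed unit-slip subfault waveforms. A sketch with synthetic waveforms (all shapes and values hypothetical):

```python
import numpy as np

# Synthetic stand-ins for precomputed unit-slip tsunami waveforms at one
# coastal point, for 4 subfaults of a larger fault (values are made up).
t = np.linspace(0.0, 3600.0, 361)            # time (s)
unit_waveforms = np.stack([
    np.sin(2 * np.pi * t / 1800.0 - 0.3 * k) * np.exp(-t / 2400.0)
    for k in range(4)
])

slip = np.array([0.5, 2.0, 1.2, 0.1])        # slip (m) on each subfault

# Superposition: waveform for this slip distribution, no re-simulation needed.
waveform = slip @ unit_waveforms
peak_height = np.abs(waveform).max()
```

    Because the synthesis is a single weighted sum per slip scenario, sweeping thousands of scenarios for a probabilistic hazard integral becomes cheap once the unit waveforms are stored.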

  4. Probabilistic Analysis of a Composite Crew Module

    Science.gov (United States)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first-order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first-ply failure in one region of the CCM at the high-altitude abort load set. The final predicted probability of failure is on the order of 10^-11 due to the conservative nature of the factors of safety on the deterministic loads.

  5. Dynamic Fault Diagnosis for Nuclear Installation Using Probabilistic Approach

    International Nuclear Information System (INIS)

    Djoko Hari Nugroho; Deswandri; Ahmad Abtokhi; Darlis

    2003-01-01

    A probabilistic fault-diagnosis method, which represents the relationship between cause and consequence of events for troubleshooting, is developed in this research based on Bayesian networks. The contribution of on-line data from sensors and of system/component reliability in the cause nodes is expected to increase the belief level of the Bayesian networks. (author)
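    The smallest instance of this idea is a two-node Bayes update: a prior belief in a component fault, revised by the likelihood of a sensor alarm (a sketch of the diagnosis principle, not the paper's network; all probabilities are hypothetical):

```python
# Prior belief that the component is faulty, and the alarm likelihoods.
prior_fault = 0.02
p_alarm_given_fault = 0.95
p_alarm_given_ok = 0.10

# Observing an alarm: Bayes' rule updates the belief in the fault node.
p_alarm = (p_alarm_given_fault * prior_fault
           + p_alarm_given_ok * (1.0 - prior_fault))
posterior_fault = p_alarm_given_fault * prior_fault / p_alarm
```

    The alarm raises the fault belief from 2% to roughly 16%; in a full Bayesian network the same update propagates through many cause and consequence nodes at once.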

  6. Probabilistic Analysis of Failures Mechanisms of Large Dams

    NARCIS (Netherlands)

    Shams Ghahfarokhi, G.

    2014-01-01

    Risk and reliability analysis is presently being performed in almost all fields of engineering, depending upon the specific field and its particular area. Probabilistic risk analysis (PRA), also called quantitative risk analysis (QRA), is a central feature of hydraulic engineering structural design.

  7. Probabilistic safety analysis and interpretation thereof

    International Nuclear Information System (INIS)

    Steininger, U.; Sacher, H.

    1999-01-01

    Increasing use is being made of PSA instruments in Germany for quantitative technical safety assessment, for example with regard to reportable incidents and the forwarding of information, especially in the case of modifications to nuclear plants. The Commission for Nuclear Reactor Safety recommends regular execution of PSA on a ten-year cycle. According to the PSA guidance instructions, probabilistic analyses serve to assess the degree of safety of the entire plant, expressed as the expectation value for the frequency of endangering conditions. The authors describe the method, sequence of steps and evaluation of probabilistic safety analyses. The limits of probabilistic safety analyses arise in the practical implementation. Normally the guidance instructions for PSA are confined to the safety systems, so that in practice the analyses are at best suitable for operational optimisation only to a limited extent. The present restriction of the analyses has a similar effect on power-output operation of the plant. This seriously degrades the utilitarian value of these analyses for the plant operators. In order to further develop PSA as a supervisory and operational optimisation instrument, both authors consider it appropriate to bring together the specific know-how of analysts, manufacturers, plant operators and experts. (orig.) [de

  8. Some probabilistic aspects of fracture

    International Nuclear Information System (INIS)

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)

  9. Probabilistic model for sterilization of food

    International Nuclear Information System (INIS)

    Chepurko, V.V.; Malinovskij, O.V.

    1986-01-01

    The probabilistic model for radiation sterilization is proposed based on the following suppositions: (1) the initial contamination of a volume unit of the sterilized product, m, is described by the probability distribution q(m); (2) inactivation of a population of m microorganisms is approximated by a Bernoulli trial scheme; and (3) contamination of units of the sterilized product is independent. The possibility of approximating q(m) by a Poisson distribution is demonstrated. Diagrams are presented permitting evaluation of the dose which provides the defined reliability of sterilization of food for chicken-gnotobionts
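    Under the Poisson/Bernoulli suppositions above, the sterility probability has a closed form: thinning a Poisson(lambda) initial count by a per-organism survival probability p leaves Poisson(lambda*p) survivors, so P(no survivors) = exp(-lambda*p). A sketch (the numbers are illustrative, not from the paper):

```python
import math

def sterility_probability(lam, survival_p):
    """P(no survivors) when the initial count is Poisson(lam) and each
    microorganism independently survives the dose with prob survival_p."""
    # Poisson thinning: survivors ~ Poisson(lam * survival_p).
    return math.exp(-lam * survival_p)

def required_survival_prob(lam, target):
    """Per-organism survival probability that yields a target sterility
    probability, i.e. the dose requirement in survival terms."""
    return -math.log(target) / lam

# Example: mean contamination of 100 organisms per unit, dose leaving each
# organism a 1-in-1000 chance of survival.
p_sterile = sterility_probability(100.0, 1e-3)   # exp(-0.1) ~ 0.905
```

    Inverting the formula, as required_survival_prob does, is what the paper's diagrams provide graphically: the dose needed for a specified sterilization reliability.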

  10. Probabilistic risk analysis of Angra-1 reactor

    International Nuclear Information System (INIS)

    Spivak, R.C.; Collussi, I.; Silva, M.C. da; Onusic Junior, J.

    1986-01-01

    The first phase of the probabilistic study for safety and operational analysis of the Angra-1 reactor is presented. The objectives and uses of the study are: to support decisions about safety problems; to identify operational and/or design failures; to extend operator qualification tests to include accidents beyond the design basis; to provide information to be used in the development and/or review of emergency operation, test and maintenance procedures; to gain experience in data collection on abnormal occurrences; to use the study results for training operators; and to train CNEN and FURNAS personnel in evaluation and reliability techniques. (M.C.K.) [pt

  11. Probabilistic Simulation of Multi-Scale Composite Behavior

    Science.gov (United States)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties and laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.

  12. Prospects for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hirschberg, S.

    1992-01-01

    This article provides some reflections on future developments of Probabilistic Safety Assessment (PSA) in view of the present state of the art and evaluates current trends in the use of PSA for safety management. The main emphasis is on Level 1 PSA, although Level 2 aspects are also highlighted to some extent. As a starting point, the role of PSA is outlined from a historical perspective, demonstrating the rapid expansion of the uses of PSA. In this context the wide spectrum of PSA applications and the associated benefits to the users are in focus. It should be kept in mind, however, that PSA, in spite of its merits, is not a self-standing safety tool. It complements deterministic analysis and thus improves understanding and facilitating prioritization of safety issues. Significant progress in handling PSA limitations - such as reliability data, common-cause failures, human interactions, external events, accident progression, containment performance, and source-term issues - is described. This forms a background for expected future developments of PSA. Among the most important issues on the agenda for the future are PSA scope extensions, methodological improvements and computer code advancements, and full exploitation of the potential benefits of applications to operational safety management. Many PSA uses, if properly exercised, lead to safety improvements as well as major burden reductions. The article provides, in addition, International Atomic Energy Agency (IAEA) perspective on the topics covered, as reflected in the current PSA programs of the agency. 74 refs., 6 figs., 1 tab

  13. Safety and reliability. V. 1. Proceedings

    International Nuclear Information System (INIS)

    Soares, C.G.

    1997-01-01

    Proceedings of a 1997 conference on industrial safety and reliability are reported. The first volume looks at risk management, probabilistic safety assessment and management styles in various industrial settings, including nuclear power plants. The second volume addresses safety and reliability in the offshore and transport industries, focusing on the role of staff training and appropriate maintenance routines to effectively reduce accidents and outages. (UK)

  14. Probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hoertner, H.; Schuetz, B.

    1982-09-01

    For the purpose of assessing the applicability and informativeness of risk-analysis methods in licensing procedures under atomic law, the choice of instruments for probabilistic analysis, the problems encountered and experience gained in their application, and the discussion of safety goals with respect to such instruments are of paramount significance. Naturally, such a complex field can only be dealt with step by step, making contributions on specific problems. The report at hand presents the essentials of a 'stocktaking' of system reliability studies in the licensing procedure under atomic law and of an American report (NUREG-0739) on 'Quantitative Safety Goals'. (orig.) [de

  15. Probabilistic methods for physics

    International Nuclear Information System (INIS)

    Cirier, G

    2013-01-01

We present an asymptotic method giving a probability of presence of the iterated spots of R^d by a polynomial function f. We use the well-known Perron-Frobenius (PF) operator, which leaves certain sets and measures invariant under f. Probabilistic solutions can exist for the deterministic iteration. While the theoretical result is already known, here we quantify these probabilities. This approach seems useful in computational settings where deterministic methods fail. Among the examined applications are asymptotic solutions of the Lorenz, Navier-Stokes, and Hamilton equations. In this approach, linearity induces many difficult problems, not all of which we have yet resolved.

  16. Quantum probability for probabilists

    CERN Document Server

    Meyer, Paul-André

    1993-01-01

In recent years, the classical theory of stochastic integration and stochastic differential equations has been extended to a non-commutative set-up to develop models for quantum noises. The author, a specialist in classical stochastic calculus and martingale theory, provides an introduction to this rapidly expanding field in a way that should be accessible to probabilists familiar with the Ito integral. It can also, on the other hand, provide a means of access to the methods of stochastic calculus for physicists familiar with Fock space analysis.

  17. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a

  18. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
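
    The core idea of sampling-based fingerprinting can be sketched as follows. This is a simplified illustration of the general technique, not the published PFFF algorithm (whose sampling and hashing scheme may differ); the function name and parameters are hypothetical:

```python
import hashlib
import os
import random
import tempfile

def probabilistic_fingerprint(path, n_samples=1024, chunk=64, seed=0):
    """Fingerprint a file by hashing a fixed number of randomly sampled chunks.

    Cost is O(n_samples * chunk) regardless of file size; offsets come from a
    seeded PRNG so the same file always yields the same digest."""
    size = os.path.getsize(path)
    rng = random.Random(seed)
    h = hashlib.sha256()
    h.update(str(size).encode())            # mix the length into the digest
    with open(path, "rb") as f:
        if size <= n_samples * chunk:       # small file: hash it in full
            h.update(f.read())
        else:
            for _ in range(n_samples):      # sample fixed-size chunks
                f.seek(rng.randrange(size - chunk))
                h.update(f.read(chunk))
    return h.hexdigest()

# Demo: identical files agree; differing files disagree (with high probability).
with tempfile.TemporaryDirectory() as d:
    p1, p2 = os.path.join(d, "a.bin"), os.path.join(d, "b.bin")
    for p in (p1, p2):
        with open(p, "wb") as f:
            f.write(b"\x42" * 100_000)
    same = probabilistic_fingerprint(p1) == probabilistic_fingerprint(p2)
    with open(p2, "wb") as f:
        f.write(b"\x24" * 100_000)
    different = probabilistic_fingerprint(p1) != probabilistic_fingerprint(p2)
```

    Because the offsets are a deterministic function of the seed and file size, the performance is flat in file size, which is the property the abstract highlights.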

  19. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

A probabilistic study was initiated to evaluate the effects of tolerances in geometric and material properties on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are conceived to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material property perturbations, different loadings, and a probabilistic combination of these loadings. The influences of these probabilistic variables are to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a Space Shuttle Main Engine blade geometry using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have a significant influence on the response.

  20. Advances in probabilistic databases for uncertain information management

    CERN Document Server

    Yan, Li

    2013-01-01

This book covers a fast-growing topic in great depth and focuses on the technologies and applications of probabilistic data management. It aims to provide a single account of current studies in probabilistic data management. The objective of the book is to provide state-of-the-art information to researchers, practitioners, and graduate students in intelligent information processing, and at the same time to serve information technology professionals faced with non-traditional applications that make the application of conventional approaches difficult or impossible.

  1. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward...

  2. Probabilistic pathway construction.

    Science.gov (United States)

    Yousofshahi, Mona; Lee, Kyongbum; Hassoun, Soha

    2011-07-01

    Expression of novel synthesis pathways in host organisms amenable to genetic manipulations has emerged as an attractive metabolic engineering strategy to overproduce natural products, biofuels, biopolymers and other commercially useful metabolites. We present a pathway construction algorithm for identifying viable synthesis pathways compatible with balanced cell growth. Rather than exhaustive exploration, we investigate probabilistic selection of reactions to construct the pathways. Three different selection schemes are investigated for the selection of reactions: high metabolite connectivity, low connectivity and uniformly random. For all case studies, which involved a diverse set of target metabolites, the uniformly random selection scheme resulted in the highest average maximum yield. When compared to an exhaustive search enumerating all possible reaction routes, our probabilistic algorithm returned nearly identical distributions of yields, while requiring far less computing time (minutes vs. years). The pathways identified by our algorithm have previously been confirmed in the literature as viable, high-yield synthesis routes. Prospectively, our algorithm could facilitate the design of novel, non-native synthesis routes by efficiently exploring the diversity of biochemical transformations in nature. Copyright © 2011 Elsevier Inc. All rights reserved.
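
    The uniformly random selection scheme described above can be illustrated with a toy backwards-construction sketch. All data structures and names here are hypothetical, and the published algorithm additionally checks compatibility with balanced cell growth, which this sketch omits:

```python
import random

def random_pathway(target, reactions, native_metabolites, rng=None, max_len=20):
    """Grow a synthesis pathway backwards from `target`: at each step pick,
    uniformly at random, a reaction producing a still-unresolved metabolite,
    until every remaining precursor is native to the host."""
    rng = rng or random.Random(0)
    unresolved, pathway = {target}, []
    while unresolved and len(pathway) < max_len:
        metabolite = unresolved.pop()
        producers = [r for r in reactions if metabolite in r["products"]]
        if not producers:
            return None                       # dead end: nothing produces it
        r = rng.choice(producers)             # the uniformly random scheme
        pathway.append(r["name"])
        unresolved |= {s for s in r["substrates"] if s not in native_metabolites}
    return pathway if not unresolved else None

# Toy network: the host natively makes A; we want target T via A -> B -> T.
reactions = [
    {"name": "R1", "substrates": ["A"], "products": ["B"]},
    {"name": "R2", "substrates": ["B"], "products": ["T"]},
]
pathway = random_pathway("T", reactions, native_metabolites={"A"})
```

    Repeating such randomized draws many times and keeping the highest-yield pathways is what lets the probabilistic search approximate an exhaustive enumeration at a fraction of the cost.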

  3. Probabilistic population aging

    Science.gov (United States)

    2017-01-01

    We merge two methodologies, prospective measures of population aging and probabilistic population forecasts. We compare the speed of change and variability in forecasts of the old age dependency ratio and the prospective old age dependency ratio as well as the same comparison for the median age and the prospective median age. While conventional measures of population aging are computed on the basis of the number of years people have already lived, prospective measures are computed also taking account of the expected number of years they have left to live. Those remaining life expectancies change over time and differ from place to place. We compare the probabilistic distributions of the conventional and prospective measures using examples from China, Germany, Iran, and the United States. The changes over time and the variability of the prospective indicators are smaller than those that are observed in the conventional ones. A wide variety of new results emerge from the combination of methodologies. For example, for Germany, Iran, and the United States the likelihood that the prospective median age of the population in 2098 will be lower than it is today is close to 100 percent. PMID:28636675

  4. Probabilistic cellular automata.

    Science.gov (United States)

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case-connecting the probability of a configuration in the stationary distribution to its number of zero-one borders-the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
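
    A minimal synchronous probabilistic cellular automaton on a binary ring, with a totalistic rule that depends only on the number of ones in each 3-cell neighborhood, can be sketched as follows (an illustrative toy, not the authors' model; the rule vector is an assumption):

```python
import random

def step(config, p_rule, rng):
    """One synchronous update of a binary ring: cell i becomes 1 with
    probability p_rule[k], where k is the number of ones among the cell
    and its two neighbours (a totalistic probabilistic rule)."""
    n = len(config)
    return [1 if rng.random() < p_rule[config[(i - 1) % n] + config[i] + config[(i + 1) % n]]
            else 0
            for i in range(n)]

# p_rule[k] = Pr(cell -> 1 | k ones in the 3-cell neighbourhood).
# [0, 0, 1, 1] makes the rule a deterministic majority vote; intermediate
# values such as [0.05, 0.3, 0.7, 0.95] give a genuinely probabilistic CA.
rng = random.Random(1)
config = [0, 1, 1, 1, 0, 0]
for _ in range(10):
    config = step(config, [0.0, 0.0, 1.0, 1.0], rng)
```

    With the deterministic majority rule the configuration above is a fixed point; replacing the 0/1 entries with intermediate probabilities turns the iteration into the finite homogeneous Markov chain the abstract describes.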

  5. Probabilistic biological network alignment.

    Science.gov (United States)

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant at the expense of explosive growth in the number of alternative topologies that may arise from different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method showing that, without sacrificing the running time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature as well as the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.

  6. Quantum probabilistic logic programming

    Science.gov (United States)

    Balu, Radhakrishnan

    2015-05-01

We describe a quantum mechanics based logic programming language that supports Horn clauses, random variables, and covariance matrices to express and solve problems in probabilistic logic. The Horn clauses of the language wrap random variables, including infinite valued ones, to express probability distributions and statistical correlations, a powerful feature to capture relationships between distributions that are not independent. The expressive power of the language is based on a mechanism to implement statistical ensembles and to solve the underlying SAT instances using quantum mechanical machinery. We exploit the fact that classical random variables have quantum decompositions to build the Horn clauses. We establish the semantics of the language in a rigorous fashion by considering an existing probabilistic logic language called PRISM with classical probability measures defined on the Herbrand base and extending it to the quantum context. In the classical case H-interpretations form the sample space and probability measures defined on them lead to a consistent definition of probabilities for well-formed formulae. In the quantum counterpart, we define probability amplitudes on H-interpretations, facilitating model generation and verification via quantum mechanical superpositions and entanglements. We cast the well-formed formulae of the language as quantum mechanical observables, thus providing an elegant interpretation for their probabilities. We discuss several examples combining statistical ensembles and predicates of first order logic to reason about situations involving uncertainty.

  7. Day-Ahead Probabilistic Model for Scheduling the Operation of a Wind Pumped-Storage Hybrid Power Station: Overcoming Forecasting Errors to Ensure Reliability of Supply to the Grid

    Directory of Open Access Journals (Sweden)

    Jakub Jurasz

    2018-06-01

Variable renewable energy sources (VRES), such as solar photovoltaic (PV) and wind turbines (WT), are starting to play a significant role in several energy systems around the globe. To overcome the problem of their non-dispatchable and stochastic nature, several approaches have been proposed so far. This paper describes a novel mathematical model for scheduling the operation of a wind-powered pumped-storage hydroelectricity (PSH) hybrid for 25 to 48 h ahead. The model is based on mathematical programming and wind speed forecasts for the next 1 to 24 h, along with predicted upper reservoir occupancy for the 24th hour ahead. The results indicate that by coupling a 2-MW conventional wind turbine with a PSH of energy storage capacity equal to 54 MWh it is possible to significantly reduce the intraday energy generation coefficient of variation, from 31% for a pure wind turbine to 1.15% for a wind-powered PSH. The scheduling errors calculated based on mean absolute percentage error (MAPE) are significantly smaller for such a coupling than those seen for wind generation forecasts, at 2.39% and 27%, respectively. This is emphasized even more strongly by the fact that the errors for wind generation were calculated for forecasts made for the next 1 to 24 h, while those for scheduled generation were calculated for forecasts made for the next 25 to 48 h. The results clearly show that the proposed scheduling approach ensures high reliability of the WT-PSH energy source.
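
    The two error measures quoted in the abstract, the coefficient of variation and MAPE, are standard statistics and can be computed as follows (a generic sketch, not the paper's code):

```python
def coefficient_of_variation(series):
    """CV = population standard deviation / mean, in percent."""
    n = len(series)
    mean = sum(series) / n
    var = sum((x - mean) ** 2 for x in series) / n
    return 100.0 * var ** 0.5 / mean

def mape(actual, forecast):
    """Mean absolute percentage error, in percent."""
    return 100.0 * sum(abs(a - f) / abs(a)
                       for a, f in zip(actual, forecast)) / len(actual)
```

    In the paper's terms, the CV is computed over intraday generation (31% for the bare turbine vs. 1.15% for the hybrid), while MAPE compares scheduled against delivered generation.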

  8. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  9. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricite de France has carried out probabilistic assessment of accident sequences for nuclear power plants. In the framework of this program many methods were developed. As the interest in these studies was increasing and as adapted methods were developed, Electricite de France has undertaken a probabilistic safety assessment of a nuclear power plant [fr

  10. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In

  11. Basic design of parallel computational program for probabilistic structural analysis

    International Nuclear Information System (INIS)

    Kaji, Yoshiyuki; Arai, Taketoshi; Gu, Wenwei; Nakamura, Hitoshi

    1999-06-01

In our laboratory, as part of the 'development of damage evaluation methods for structural brittle materials by microscopic fracture mechanics and probabilistic theory' (nuclear computational science cross-over research), we examine computational methods for a super parallel computation system coupled with a material strength theory based on microscopic fracture mechanics for latent cracks and a continuum structural model, in order to develop new structural reliability evaluation methods for ceramic structures. This technical report reviews probabilistic structural mechanics theory, basic formulae, and the parallel computation programming methods related to the principal elements in the basic design of the computational mechanics program. (author)

  13. Probabilistic Forecasting of Photovoltaic Generation: An Efficient Statistical Approach

    DEFF Research Database (Denmark)

    Wan, Can; Lin, Jin; Song, Yonghua

    2017-01-01

This letter proposes a novel efficient probabilistic forecasting approach to accurately quantify the variability and uncertainty of the power production from photovoltaic (PV) systems. Distinguished from most existing models, a linear programming based prediction interval construction model for PV power generation is proposed based on extreme learning machine and quantile regression, featuring high reliability and computational efficiency. The proposed approach is validated through numerical studies on PV data from Denmark.
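
    Two of the ingredients named above can be sketched generically: the pinball (quantile) loss that a quantile-regression LP minimizes, and a prediction interval built from estimated quantiles. This is not the authors' extreme-learning-machine model; for illustration the interval is formed from empirical error quantiles:

```python
def pinball_loss(y_true, y_pred, tau):
    """Quantile (pinball) loss: the objective minimised (e.g. as an LP)
    when fitting a tau-quantile regression model."""
    total = sum(tau * (y - q) if y >= q else (tau - 1.0) * (y - q)
                for y, q in zip(y_true, y_pred))
    return total / len(y_true)

def empirical_interval(errors, alpha=0.10):
    """A (1 - alpha) prediction interval from historical forecast errors:
    the empirical alpha/2 and 1 - alpha/2 error quantiles."""
    s = sorted(errors)
    lo = s[round(len(s) * alpha / 2)]
    hi = s[min(round(len(s) * (1 - alpha / 2)), len(s) - 1)]
    return lo, hi
```

    Reliability, in the interval-forecasting sense, then means that close to a fraction (1 - alpha) of observed PV outputs fall inside the constructed intervals.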

  14. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and Space Shuttle Main Engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic analysis of stochastic structures. The goal of the approximate methods effort is to increase problem-solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer-intensive than the finite element approach.

  15. Reliability Engineering

    International Nuclear Information System (INIS)

    Lee, Sang Yong

    1992-07-01

This book is about reliability engineering. It covers the definition and importance of reliability; the development of reliability engineering; the failure rate and failure probability density function and their types; CFR and the exponential distribution; IFR and the normal and Weibull distributions; maintainability and availability; reliability testing and reliability estimation for the exponential, normal, and Weibull distribution types; reliability sampling tests; system reliability; design for reliability; and functional failure analysis by FTA.

  16. Fully probabilistic design: the way for optimizing of concrete structures

    Directory of Open Access Journals (Sweden)

    I. Laníková

Some standards for the design of concrete structures (e.g. EC2 and the original ČSN 73 1201-86) allow a structure to be designed by several methods. This contribution documents that even if a structure does not comply with the partial reliability factor method according to EC2, it can still satisfy the conditions of the fully probabilistic approach using the same standard. From an example of the reliability of a prestressed spun concrete pole designed by both the partial factor method and the fully probabilistic approach according to the Eurocode, it is evident that an expert should apply the more precise (though unfortunately more complicated) method in limiting cases. The Monte Carlo method, modified by Latin Hypercube Sampling (LHS), has been used for the calculation of reliability. Ultimate and serviceability limit states were checked for both the partial factor method and the fully probabilistic design. As a result, fully probabilistic design makes it possible to obtain a more efficient design for a structure.
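
    A minimal sketch of the reliability calculation described above: Monte Carlo with Latin Hypercube Sampling on a simple resistance-minus-load limit state g = R - S. The normal variables and their parameters are illustrative assumptions, not the pole model from the paper:

```python
import random
from statistics import NormalDist

def lhs_normal(n, mean, sd, rng):
    """Latin Hypercube sample of a normal variable: one uniform draw from
    each of n equiprobable strata, shuffled, then mapped through the
    inverse CDF."""
    u = [(i + max(rng.random(), 1e-12)) / n for i in range(n)]
    rng.shuffle(u)                      # decorrelate pairing across variables
    dist = NormalDist(mean, sd)
    return [dist.inv_cdf(p) for p in u]

def failure_probability(n=10_000, seed=0):
    """Pf = P(R - S < 0) for a normal resistance R and normal load S."""
    rng = random.Random(seed)
    R = lhs_normal(n, mean=30.0, sd=3.0, rng=rng)
    S = lhs_normal(n, mean=20.0, sd=4.0, rng=rng)
    return sum(1 for r, s in zip(R, S) if r - s < 0) / n

pf = failure_probability()  # exact value for these numbers is Phi(-2), about 0.023
```

    The stratification is what distinguishes LHS from plain Monte Carlo: each marginal distribution is sampled exactly once per stratum, so far fewer samples are needed for a stable estimate of a small failure probability.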

  17. Structural hybrid reliability index and its convergent solving method based on random–fuzzy–interval reliability model

    OpenAIRE

    Hai An; Ling Zhou; Hui Sun

    2016-01-01

    Aiming to resolve the problems of a variety of uncertainty variables that coexist in the engineering structure reliability analysis, a new hybrid reliability index to evaluate structural hybrid reliability, based on the random–fuzzy–interval model, is proposed in this article. The convergent solving method is also presented. First, the truncated probability reliability model, the fuzzy random reliability model, and the non-probabilistic interval reliability model are introduced. Then, the new...

  18. Non-probabilistic defect assessment for structures with cracks based on interval model

    International Nuclear Information System (INIS)

    Dai, Qiao; Zhou, Changyu; Peng, Jian; Chen, Xiangwei; He, Xiaohua

    2013-01-01

Highlights: • Non-probabilistic approach is introduced to defect assessment. • Definition and establishment of IFAC are put forward. • Determination of assessment rectangle is proposed. • Solution of non-probabilistic reliability index is presented. -- Abstract: Traditional defect assessment methods conservatively treat uncertainty of parameters as safety factors, while the probabilistic method is based on a clear understanding of detailed statistical information about the parameters. In this paper, the non-probabilistic approach is introduced to the failure assessment diagram (FAD) to propose a non-probabilistic defect assessment method for structures with cracks. This novel defect assessment method contains three critical processes: establishment of the interval failure assessment curve (IFAC), determination of the assessment rectangle, and solution of the non-probabilistic reliability degree. Based on interval theory, uncertain parameters such as crack sizes, material properties and loads are considered as interval variables. As a result, the failure assessment curve (FAC) will vary in a certain range, which is defined as the IFAC, and the assessment point will vary within a rectangular zone, which is defined as the assessment rectangle. Based on the interval model, the establishment of the IFAC and the determination of the assessment rectangle are presented. Then, according to the interval possibility degree method, the non-probabilistic reliability degree of the IFAC can be determined. To clearly illustrate the non-probabilistic defect assessment method, a numerical example for the assessment of a pipe with a crack is given. In addition, the assessment result of the proposed method is compared with that of the traditional probabilistic method, which confirms that this non-probabilistic defect assessment can reasonably resolve practical problems with interval variables.
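
    The interval possibility degree used in the final step can be sketched with a common width-normalized formula from the interval-analysis literature. The paper's exact formulation may differ, and the load/resistance intervals below are purely illustrative:

```python
def possibility_degree(a, b):
    """Possibility that interval a = [a_lo, a_hi] is <= interval
    b = [b_lo, b_hi], using a common width-normalised formula,
    clipped to [0, 1]."""
    (a_lo, a_hi), (b_lo, b_hi) = a, b
    width = (a_hi - a_lo) + (b_hi - b_lo)
    if width == 0.0:                      # both degenerate (point) intervals
        return 1.0 if a_lo <= b_lo else 0.0
    return min(1.0, max(0.0, (b_hi - a_lo) / width))

# Non-probabilistic "reliability": possibility that an interval-valued
# load stays below an interval-valued resistance.
eta = possibility_degree((10.0, 14.0), (12.0, 20.0))
```

    A degree of 1 means the intervals are fully separated in the safe direction, 0 means fully separated in the unsafe direction, and intermediate values quantify the overlap.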

  20. Probabilistic retinal vessel segmentation

    Science.gov (United States)

    Wu, Chang-Hua; Agam, Gady

    2007-03-01

    Optic fundus assessment is widely used for diagnosing vascular and non-vascular pathology. Inspection of the retinal vasculature may reveal hypertension, diabetes, arteriosclerosis, cardiovascular disease and stroke. Due to various imaging conditions retinal images may be degraded. Consequently, the enhancement of such images and vessels in them is an important task with direct clinical applications. We propose a novel technique for vessel enhancement in retinal images that is capable of enhancing vessel junctions in addition to linear vessel segments. This is an extension of vessel filters we have previously developed for vessel enhancement in thoracic CT scans. The proposed approach is based on probabilistic models which can discern vessels and junctions. Evaluation shows the proposed filter is better than several known techniques and is comparable to the state of the art when evaluated on a standard dataset. A ridge-based vessel tracking process is applied on the enhanced image to demonstrate the effectiveness of the enhancement filter.

  1. Probabilistic sensory recoding.

    Science.gov (United States)

    Jazayeri, Mehrdad

    2008-08-01

A hallmark of higher brain functions is the ability to contemplate the world rather than to respond reflexively to it. To do so, the nervous system makes use of a modular architecture in which sensory representations are dissociated from areas that control actions. This flexibility, however, necessitates a recoding scheme that puts sensory information to use in the control of behavior. Sensory recoding faces two important challenges. First, recoding must take into account the inherent variability of sensory responses. Second, it must be flexible enough to satisfy the requirements of different perceptual goals. Recent progress in theory, psychophysics, and neurophysiology indicates that cortical circuitry might meet these challenges by evaluating sensory signals probabilistically.

  2. Multidisciplinary System Reliability Analysis

    Science.gov (United States)

    Mahadevan, Sankaran; Han, Song; Chamis, Christos C. (Technical Monitor)

    2001-01-01

The objective of this study is to develop a new methodology for estimating the reliability of engineering systems that encompass multiple disciplines. The methodology is formulated in the context of the NESSUS probabilistic structural analysis code, developed under the leadership of NASA Glenn Research Center. The NESSUS code has been successfully applied to the reliability estimation of a variety of structural engineering systems. This study examines whether the features of NESSUS could be used to investigate the reliability of systems in other disciplines, such as heat transfer, fluid mechanics, and electrical circuits, without considerable programming effort specific to each discipline. To achieve this objective, the mechanical equivalence between system behavior models in different disciplines is investigated. A new methodology is presented for the analysis of heat transfer, fluid flow, and electrical circuit problems using the structural analysis routines within NESSUS, by exploiting the equivalence between the computational quantities in different disciplines. This technique is integrated with the fast probability integration and system reliability techniques within the NESSUS code to successfully compute the system reliability of multidisciplinary systems. Traditional as well as progressive failure analysis methods for system reliability estimation are demonstrated through a numerical example of a heat exchanger system involving failure modes in the structural, heat transfer and fluid flow disciplines.

  3. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at the present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to preventing catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distributions of seismicity is of crucial importance when constructing such probabilistic forecasts. We therefore focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, using Californian and European data. Our model is independent of biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events.
We propose a
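The Gutenberg-Richter extrapolation mentioned above can be sketched briefly. The following is a generic illustration, not the thesis code; the catalog values and completeness magnitude are invented, and the b-value uses the standard Aki (1965) maximum-likelihood estimator:

```python
import math

# The Gutenberg-Richter relation log10 N(>=M) = a - b*M lets the rate of
# rare large earthquakes be extrapolated from abundant small ones.

def b_value(mags, mc):
    """Aki maximum-likelihood b-value for magnitudes at or above completeness mc."""
    above = [m for m in mags if m >= mc]
    return math.log10(math.e) / (sum(above) / len(above) - mc)

def count_above(mags, mc, m_target, b):
    """Extrapolated count of events >= m_target implied by the counts >= mc."""
    n_mc = sum(1 for m in mags if m >= mc)
    return n_mc * 10 ** (-b * (m_target - mc))

catalog = [2.1, 2.4, 2.2, 3.0, 2.6, 2.3, 2.8, 3.5, 2.5, 2.2]  # toy catalog
b = b_value(catalog, mc=2.0)
print(round(b, 2), round(count_above(catalog, 2.0, 5.0, b), 4))
```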

  4. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    The development of probabilistic seismicity forecasts is one of the most important tasks of seismologists at the present time. Such forecasts form the basis of probabilistic seismic hazard assessment, a widely used approach to generate ground motion exceedance maps. These hazard maps guide the development of building codes, and in the absence of the ability to deterministically predict earthquakes, good building and infrastructure planning is key to preventing catastrophes. Probabilistic seismicity forecasts are models that specify the occurrence rate of earthquakes as a function of space, time and magnitude. The models presented in this thesis are time-invariant mainshock occurrence models. Accordingly, the reliable estimation of the spatial and size distributions of seismicity is of crucial importance when constructing such probabilistic forecasts. We therefore focus on data-driven approaches to infer these distributions, circumventing the need for arbitrarily chosen external parameters and subjective expert decisions. Kernel estimation has been shown to appropriately transform discrete earthquake locations into spatially continuous probability distributions. However, we show that neglecting the information from fault networks constitutes a considerable shortcoming and thus limits the skill of current seismicity models. We present a novel earthquake rate forecast that applies the kernel-smoothing method to both past earthquake locations and slip rates on mapped crustal faults, using Californian and European data. Our model is independent of biases caused by commonly used non-objective seismic zonations, which impose artificial borders of activity that are not expected in nature. Studying the spatial variability of the seismicity size distribution is of great importance. The b-value of the well-established empirical Gutenberg-Richter model forecasts the rates of hazard-relevant large earthquakes based on the observed rates of abundant small events.
We propose a

  5. A Probabilistic Approach for Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    A probabilistic based robustness analysis has been performed for a glulam frame structure supporting the roof over the main court in a Norwegian sports centre. The robustness analysis is based on the framework for robustness analysis introduced in the Danish Code of Practice for the Safety of Structures and a probabilistic modelling of the timber material proposed in the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS). Due to the framework in the Danish Code the timber structure has to be evaluated with respect to the following criteria, where at least one shall... to criteria a) and b) the timber frame structure has one column with a reliability index a bit lower than an assumed target level. By removing three columns one by one, no significant extensive failure of the entire structure or of significant parts of it is obtained. Therefore the structure can be considered...

  6. Reliability and Cost Impacts for Attritable Systems

    Science.gov (United States)

    2017-03-23

    on reliability and cost: a probabilistic model. Electric Power Systems Research, 72(3), 213-224. Kalbfleisch, J.D. & Prentice, R.L. (1980). The...copyright protection in the United States. AFIT-ENV-MS-17-M-172 RELIABILITY AND COST IMPACTS FOR ATTRITABLE SYSTEMS THESIS Presented to... power of discrete time Markov chains, whether homogeneous or non-homogeneous, to model the reliability and dependability of repairable systems should

  7. Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) Version 5.0: Data loading manual. Volume 10

    International Nuclear Information System (INIS)

    VanHorn, R.L.; Wolfram, L.M.; Fowler, R.D.; Beck, S.T.; Smith, C.L.

    1995-04-01

    The Systems Analysis Programs for Hands-on Integrated Reliability Evaluations (SAPHIRE) suite of programs can be used to organize and standardize in an electronic format information from probabilistic risk assessments or individual plant examinations. The Models and Results Database (MAR-D) program of the SAPHIRE suite serves as the repository for probabilistic risk assessment and individual plant examination data and information. This report demonstrates by examples the common electronic and manual methods used to load these types of data. It is not a stand-alone document but references documents that contribute information relevant to the data loading process. This document provides a more detailed discussion and instructions for using SAPHIRE 5.0 only when enough information on a specific topic is not provided by another available source.

  8. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  9. Simulation Approach to Mission Risk and Reliability Analysis, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — It is proposed to develop and demonstrate an integrated total-system risk and reliability analysis approach that is based on dynamic, probabilistic simulation. This...

  10. Nuclear plant reliability data system. 1979 annual reports of cumulative system and component reliability

    International Nuclear Information System (INIS)

    1979-01-01

    The primary purposes of the information in these reports are the following: to provide operating statistics of safety-related systems within a unit which may be used to compare and evaluate reliability performance and to provide failure mode and failure rate statistics on components which may be used in failure mode effects analysis, fault hazard analysis, probabilistic reliability analysis, and so forth

  11. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  12. Calculation of the reliability of large complex systems by the relevant path method

    International Nuclear Information System (INIS)

    Richter, G.

    1975-03-01

    In this paper, analytical methods are presented and tested with which the probabilistic reliability data of technical systems can be determined for given fault trees and block diagrams and known reliability data of the components. (orig./AK) [de
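A minimal sketch of path-set-based system reliability in the spirit of this record: given minimal path sets and independent component reliabilities, the system reliability is the probability of the union of the path events, evaluated here by inclusion-exclusion. This is a generic textbook formulation, not the paper's algorithm; the component names and values are invented:

```python
from itertools import combinations

# System reliability from minimal path sets by inclusion-exclusion.
# A path set is a set of components whose joint functioning keeps the
# system up; components are assumed independent.

def system_reliability(path_sets, p):
    """p maps component name -> probability that it works."""
    total = 0.0
    for k in range(1, len(path_sets) + 1):
        for combo in combinations(path_sets, k):
            union = set().union(*combo)          # components in this joint event
            prob = 1.0
            for c in union:
                prob *= p[c]
            total += (-1) ** (k + 1) * prob      # inclusion-exclusion sign
    return total

# Two redundant trains: the system works if {A,B} work or {C,D} work.
p = {"A": 0.9, "B": 0.9, "C": 0.8, "D": 0.8}
paths = [{"A", "B"}, {"C", "D"}]
print(system_reliability(paths, p))  # 0.81 + 0.64 - 0.81*0.64
```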

  13. Human Reliability Analysis in Support of Risk Assessment for Positive Train Control

    Science.gov (United States)

    2003-06-01

    This report describes an approach to evaluating the reliability of human actions that are modeled in a probabilistic risk assessment : (PRA) of train control operations. This approach to human reliability analysis (HRA) has been applied in the case o...

  14. Waste package reliability analysis

    International Nuclear Information System (INIS)

    Pescatore, C.; Sastre, C.

    1983-01-01

    Proof of future performance of a complex system such as a high-level nuclear waste package over a period of hundreds to thousands of years cannot be had in the ordinary sense of the word. The general method of probabilistic reliability analysis could provide an acceptable framework to identify, organize, and convey the information necessary to satisfy the criterion of reasonable assurance of waste package performance according to the regulatory requirements set forth in 10 CFR 60. General principles which may be used to evaluate the qualitative and quantitative reliability of a waste package design are indicated and illustrated with a sample calculation of a repository concept in basalt. 8 references, 1 table

  15. Probabilistic Open Set Recognition

    Science.gov (United States)

    Jain, Lalit Prithviraj

    Real-world tasks in computer vision, pattern recognition and machine learning often touch upon the open set recognition problem: multi-class recognition with incomplete knowledge of the world and many unknown inputs. An obvious way to approach such problems is to develop a recognition system that thresholds probabilities to reject unknown classes. Traditional rejection techniques are not about the unknown; they are about the uncertain boundary and rejection around that boundary. Thus traditional techniques only represent the "known unknowns". However, a proper open set recognition algorithm is needed to reduce the risk from the "unknown unknowns". This dissertation examines this concept and finds existing probabilistic multi-class recognition approaches are ineffective for true open set recognition. We hypothesize the cause is due to weak ad hoc assumptions combined with closed-world assumptions made by existing calibration techniques. Intuitively, if we could accurately model just the positive data for any known class without overfitting, we could reject the large set of unknown classes even under this assumption of incomplete class knowledge. For this, we formulate the problem as one of modeling positive training data by invoking statistical extreme value theory (EVT) near the decision boundary of positive data with respect to negative data. We provide a new algorithm called the PI-SVM for estimating the unnormalized posterior probability of class inclusion. This dissertation also introduces a new open set recognition model called Compact Abating Probability (CAP), where the probability of class membership decreases in value (abates) as points move from known data toward open space. We show that CAP models improve open set recognition for multiple algorithms. 
Leveraging the CAP formulation, we go on to describe the novel Weibull-calibrated SVM (W-SVM) algorithm, which combines the useful properties of statistical EVT for score calibration with one-class and binary

  16. NRPC ServCat priorities

    Data.gov (United States)

    Department of the Interior — This document lists the Natural Resource Program Center’s priority ServCat documents. It is recommended that these documents- which include annual narrative reports,...

  17. Probabilistic broadcasting of mixed states

    International Nuclear Information System (INIS)

    Li Lvjun; Li Lvzhou; Wu Lihua; Zou Xiangfu; Qiu Daowen

    2009-01-01

    It is well known that the non-broadcasting theorem proved by Barnum et al is a fundamental principle of quantum communication. As far as we are aware, optimal broadcasting (OB) is the only method to broadcast noncommuting mixed states approximately. In this paper, motivated by the probabilistic cloning of quantum states proposed by Duan and Guo, we propose a new way of broadcasting noncommuting mixed states: probabilistic broadcasting (PB), and we present a sufficient condition for PB of mixed states. To a certain extent, we generalize the probabilistic cloning theorem from pure states to mixed states, and in particular, we generalize the non-broadcasting theorem, since the case that commuting mixed states can be exactly broadcast can be thought of as a special instance of PB where the success ratio is 1. Moreover, we discuss probabilistic local broadcasting (PLB) of separable bipartite states.

  18. Evaluation of Probabilistic Disease Forecasts.

    Science.gov (United States)

    Hughes, Gareth; Burnett, Fiona J

    2017-10-01

    The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast (predictive values) are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.
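The scoring-rule decomposition mentioned in the abstract can be illustrated with the Brier score and its Murphy decomposition into reliability, resolution and uncertainty components. The implementation and toy data below are mine, not the paper's:

```python
# Brier score for probabilistic forecasts of a binary event, and its Murphy
# decomposition Brier = reliability - resolution + uncertainty, computed by
# binning together identical forecast probabilities.

def brier(forecasts, outcomes):
    n = len(forecasts)
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / n

def murphy_decomposition(forecasts, outcomes):
    n = len(forecasts)
    base = sum(outcomes) / n                      # climatological frequency
    bins = {}
    for f, o in zip(forecasts, outcomes):
        bins.setdefault(f, []).append(o)
    rel = sum(len(os) * (f - sum(os) / len(os)) ** 2 for f, os in bins.items()) / n
    res = sum(len(os) * (sum(os) / len(os) - base) ** 2 for os in bins.values()) / n
    unc = base * (1 - base)
    return rel, res, unc

f = [0.8, 0.8, 0.2, 0.2, 0.6]   # forecast probabilities of disease
o = [1, 1, 0, 1, 0]             # observed outcomes (1 = disease occurred)
rel, res, unc = murphy_decomposition(f, o)
print(brier(f, o), rel, res, unc)   # decomposition is exact: rel - res + unc
```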

  19. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  20. Cumulative Dominance and Probabilistic Sophistication

    NARCIS (Netherlands)

    Wakker, P.P.; Sarin, R.H.

    2000-01-01

    Machina & Schmeidler (Econometrica, 60, 1992) gave preference conditions for probabilistic sophistication, i.e. decision making where uncertainty can be expressed in terms of (subjective) probabilities without commitment to expected utility maximization. This note shows that simpler and more general

  1. Probabilistic simulation of fermion paths

    International Nuclear Information System (INIS)

    Zhirov, O.V.

    1989-01-01

    The permutation symmetry of the fermion path integral allows one (while spin degrees of freedom are ignored) to use any probabilistic algorithm in its simulation, such as the Metropolis algorithm, heat bath, etc. 6 refs., 2 tabs
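A minimal Metropolis sketch in the spirit of this record, for the spinless, distinguishable-particle case: sample discretized Euclidean paths of a harmonic oscillator and estimate the mean squared displacement. The lattice parameters and units are arbitrary illustrative choices:

```python
import math
import random

# Metropolis sampling of discretized paths x_0..x_{N-1} of a harmonic
# oscillator with Euclidean action
#   S = sum_i [ (x_{i+1} - x_i)^2 / (2*dt) + dt * x_i^2 / 2 ].

def action(path, dt):
    s = 0.0
    n = len(path)
    for i in range(n):
        dx = path[(i + 1) % n] - path[i]       # periodic (thermal) boundary
        s += dx * dx / (2.0 * dt) + dt * 0.5 * path[i] ** 2
    return s

def metropolis_sweep(path, dt, step, rng):
    """One update attempt per site; accept with prob min(1, exp(-dS))."""
    for i in range(len(path)):
        old = path[i]
        s_old = action(path, dt)
        path[i] = old + rng.uniform(-step, step)
        if rng.random() >= math.exp(min(0.0, s_old - action(path, dt))):
            path[i] = old                       # reject: restore old value

rng = random.Random(0)
path = [0.0] * 16
for _ in range(200):
    metropolis_sweep(path, dt=0.5, step=1.0, rng=rng)
print(sum(x * x for x in path) / len(path))     # crude estimate of <x^2>
```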

  2. Aeroelastic/Aeroservoelastic Uncertainty and Reliability of Advanced Aerospace Vehicles in Flight and Ground Operations, Phase I

    Data.gov (United States)

    National Aeronautics and Space Administration — ASSURE - Aeroelastic / Aeroservoelastic (AE/ASE) Uncertainty and Reliability Engineering capability - is a set of probabilistic computer programs for isolating...

  3. Reliability data collection and use in risk and availability assessment

    International Nuclear Information System (INIS)

    Colombari, V.

    1989-01-01

    A prevailing objective of EuReDatA is to initiate and support contacts between experts, companies and institutions active in reliability engineering and research. The main topics of this 6th EuReDatA Conference are: reliability data banks; incident data banks; common cause data; sources and propagation of uncertainties; computer aided risk analysis; reliability and incident data acquisition and processing; human reliability; probabilistic safety and availability assessment; feedback of reliability into system design; data fusion; reliability modeling and techniques; structural and mechanical reliability; consequence modeling; software and electronic reliability; reliability tests. Some conference papers are separately indexed in the database. (HP)

  4. Probabilistic approaches to life prediction of nuclear plant structural components

    International Nuclear Information System (INIS)

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel). (authors)

  5. Probabilistic approaches to life prediction of nuclear plant structural components

    International Nuclear Information System (INIS)

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel)

  6. Structural reliability methods: Code development status

    Science.gov (United States)

    Millwater, Harry R.; Thacker, Ben H.; Wu, Y.-T.; Cruse, T. A.

    1991-05-01

    The Probabilistic Structures Analysis Method (PSAM) program integrates state-of-the-art probabilistic algorithms with structural analysis methods in order to quantify the behavior of Space Shuttle Main Engine structures subject to uncertain loadings, boundary conditions, material parameters, and geometric conditions. An advanced, efficient probabilistic structural analysis software program, NESSUS (Numerical Evaluation of Stochastic Structures Under Stress), was developed as a deliverable. NESSUS contains a number of integrated software components to perform probabilistic analysis of complex structures. A nonlinear finite element module, NESSUS/FEM, is used to model the structure and obtain structural sensitivities. Some of the capabilities of NESSUS/FEM are shown. A Fast Probability Integration module, NESSUS/FPI, estimates the probability given the structural sensitivities. A driver module, PFEM, couples the FEM and FPI. NESSUS, version 5.0, addresses component reliability, resistance, and risk.
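The kind of quantity a fast probability integration module estimates can be shown with the simplest case: independent normal resistance R and load S, limit state g = R - S, giving a reliability index beta and failure probability Pf = Phi(-beta). This is a generic first-order illustration with invented numbers, not NESSUS itself:

```python
import math

# First-order failure probability for the linear limit state g = R - S
# with independent normal R (resistance) and S (load).

def normal_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def first_order_pf(mu_r, sd_r, mu_s, sd_s):
    beta = (mu_r - mu_s) / math.hypot(sd_r, sd_s)   # reliability index
    return beta, normal_cdf(-beta)

beta, pf = first_order_pf(mu_r=500.0, sd_r=30.0, mu_s=350.0, sd_s=40.0)
print(beta, pf)   # beta = 3.0, Pf ~ 1.35e-3
```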

  7. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  8. Maximizing Statistical Power When Verifying Probabilistic Forecasts of Hydrometeorological Events

    Science.gov (United States)

    DeChant, C. M.; Moradkhani, H.

    2014-12-01

    Hydrometeorological events (i.e. floods, droughts, precipitation) are increasingly being forecasted probabilistically, owing to the uncertainties in the underlying causes of the phenomenon. In these forecasts, the probability of the event, over some lead time, is estimated based on some model simulations or predictive indicators. By issuing probabilistic forecasts, agencies may communicate the uncertainty in the event occurring. Assuming that the assigned probability of the event is correct, which is referred to as a reliable forecast, the end user may perform some risk management based on the potential damages resulting from the event. Alternatively, an unreliable forecast may give false impressions of the actual risk, leading to improper decision making when protecting resources from extreme events. Given this need for reliable forecasts to support effective risk management, this study takes a renewed look at reliability assessment in event forecasts. Illustrative experiments will be presented, showing deficiencies in the commonly available approaches (Brier Score, Reliability Diagram). Overall, it is shown that the conventional reliability assessment techniques do not maximize the ability to distinguish between a reliable and unreliable forecast. In this regard, a theoretical formulation of the probabilistic event forecast verification framework will be presented. From this analysis, hypothesis testing with the Poisson-Binomial distribution is the most exact model available for the verification framework, and therefore maximizes one's ability to distinguish between a reliable and unreliable forecast. Application of this verification system was also examined within a real forecasting case study, highlighting the additional statistical power provided with the use of the Poisson-Binomial distribution.
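The Poisson-Binomial verification idea can be sketched directly: under a reliable forecast, the count of observed events follows a Poisson-Binomial distribution with the forecast probabilities as parameters, and the tail probability of the observed count serves as a test statistic. The implementation and probabilities below are a generic illustration, not the authors' code:

```python
# Exact Poisson-Binomial pmf by dynamic programming, then an upper-tail
# probability for hypothesis testing of forecast reliability.

def poisson_binomial_pmf(probs):
    """pmf[k] = P(exactly k of the independent events occur)."""
    pmf = [1.0]
    for p in probs:
        new = [0.0] * (len(pmf) + 1)
        for k, mass in enumerate(pmf):
            new[k] += mass * (1 - p)   # this event does not occur
            new[k + 1] += mass * p     # this event occurs
        pmf = new
    return pmf

def upper_tail(probs, observed):
    """P(count >= observed) under the hypothesis that the forecast is reliable."""
    pmf = poisson_binomial_pmf(probs)
    return sum(pmf[observed:])

forecast_probs = [0.1, 0.3, 0.5, 0.7, 0.9]   # one probability per event window
print(upper_tail(forecast_probs, 4))          # small value casts doubt on reliability
```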

  9. Probabilistic methods applied to the safety of nuclear power plant: annual report - 1980. Part. 1: theoretical fundaments

    International Nuclear Information System (INIS)

    Oliveira, L.F.S. de; Hesles, J.B.S.; Milidiu, R.L.; Maciel, C.C.; Gibelli, S.M.O.; Oliveira, L.C.; Fleming, P.V.; Rivera, R.R.J.

    1981-02-01

    The Probabilistic Safety Analysis Group at COPPE was founded in 1980. This first part of the report presents the theoretical foundations used for the reliability analysis of some safety systems of Angra-1 [pt

  10. Trends in probabilistic power system reliability analysis - a survey

    NARCIS (Netherlands)

    Tuinema, B.W.; Gibescu, M.; vd Meijden, M.A.M.M.; Kling, W.L.; Sluis, van der L.

    2011-01-01

    The electric power system is continually in development. Many of the developments will lead to an increase in the stress on the power system. For example, the increasing need for electrical energy, the transition to a sustainable energy supply and the liberalization of the electricity market all put

  11. Military Cultural Competency: Understanding How to Serve Those Who Serve

    Science.gov (United States)

    Bonura, Kimberlee Bethany; Lovald, Nicole

    2015-01-01

    The aim of this essay is to define and describe the different constituents of the military population, and present the challenges this demographic faces when pursuing higher education. The essay also discusses key aspects higher education professionals must understand in order to better serve military populations, such as federal regulations and…

  12. Probabilistic fracture mechanics applied for lbb case study: international benchmark

    International Nuclear Information System (INIS)

    Radu, V.

    2015-01-01

    An application of probabilistic fracture mechanics to evaluate the structural integrity of a case study chosen from the experimental mock-ups of the FP7 STYLE project is described. The reliability model for probabilistic structural integrity, focused on the assessment of a through-wall crack (TWC) in the pipe weld under complex loading (bending moment and residual stress), has been set up. The basic model is the model of fracture for a through-wall cracked pipe under elastic-plastic conditions. The corresponding structural reliability approach is developed with the probabilities of failure associated with the maximum load for crack initiation and net-section collapse, as well as the evaluation of instability loads. The probabilities of failure for a through-wall crack in a pipe subject to pure bending are evaluated by using crude Monte Carlo simulations. The results from the international benchmark are presented for the mentioned case in the context of ageing and lifetime management of pressure boundary/pressure circuit components. (authors)
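Crude Monte Carlo estimation of a failure probability, the technique named in the abstract, can be sketched generically: sample capacity and demand, and count the fraction of samples where the limit state g = capacity - demand is negative. The distributions below are invented placeholders, not the benchmark's inputs:

```python
import random

# Crude Monte Carlo estimate of a failure probability Pf = P(g < 0)
# for the limit state g = capacity - demand.

def crude_monte_carlo_pf(n_samples, seed=1):
    rng = random.Random(seed)
    failures = 0
    for _ in range(n_samples):
        capacity = rng.gauss(120.0, 10.0)   # e.g. collapse moment (placeholder)
        demand = rng.gauss(80.0, 15.0)      # e.g. applied bending moment
        if capacity - demand < 0.0:
            failures += 1
    return failures / n_samples

# For these normals the exact answer is Phi(-40/sqrt(325)) ~ 1.3e-2,
# so the estimate should land nearby, within Monte Carlo noise.
print(crude_monte_carlo_pf(100_000))
```

For rare events the sample count must grow roughly as 10/Pf for a usable estimate, which is why importance-sampling or FORM/SORM methods often replace crude sampling in practice.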

  13. Extending CANTUP code analysis to probabilistic evaluations

    International Nuclear Information System (INIS)

    Florea, S.

    2001-01-01

    Structural analysis with numerical methods based on the finite element method currently plays a central role in the evaluation and prediction of structural systems that require safe and reliable operation in aggressive environmental conditions. This is the case for the CANDU-600 fuel channel, where, besides the corrosive and thermal aggression upon the Zr97.5Nb2.5 pressure tubes, prolonged irradiation has marked consequences for the evolution of the material properties. The result is an unavoidable spread of the material properties over time, affected by high uncertainties. Consequently, deterministic evaluations with computation codes based on the finite element method are supplemented by statistical and probabilistic methods for evaluating the response of structural components. This paper reports work on extending the thermo-mechanical evaluation of the fuel channel components into the framework of probabilistic structural mechanics, based on statistical methods and developed upon deterministic CANTUP code analyses. The CANTUP code was adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station 4.0 platform. To test the statistical evaluation of the creep behaviour of the pressure tube, the longitudinal modulus of elasticity (Young's modulus) was used as a random variable with a normal distribution around the value used in the deterministic analyses. The influence of this random quantity upon the hog and the effective stress developed in the pressure tube was studied for two time values specific to primary and secondary creep. The results obtained after five years of creep, corresponding to secondary creep, are presented

  14. Guidance for the definition and application of probabilistic safety criteria

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Knochenhauer, M.

    2011-05-01

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  15. Guidance for the definition and application of probabilistic safety criteria

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT Technical Research Centre of Finland (Finland)); Knochenhauer, M. (Scandpower AB (Sweden))

    2011-05-15

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  16. ZERO: Probabilistic Routing for Deploy and Forget Wireless Sensor Networks

    Directory of Open Access Journals (Sweden)

    Jose Carlos Pacho

    2010-09-01

    Full Text Available As Wireless Sensor Networks are being adopted by industry and agriculture for large-scale and unattended deployments, the need for reliable and energy-conservative protocols becomes critical. Energy-conservation efforts at the physical and link layers are mostly ignored by routing protocols, which concentrate on maintaining reliability and throughput. Gradient-based routing protocols route data through the most reliable links, aiming to ensure 99% packet delivery. However, they suffer from the so-called "hot spot" problem: the most reliable routes exhaust their energy quickly, partitioning the network and reducing the area monitored. To cope with this "hot spot" problem we propose ZERO, a combined approach at the network and link layers that increases network lifespan while conserving reliability levels by means of probabilistic load balancing techniques.
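
    The probabilistic load balancing idea can be sketched as weighted random next-hop selection. The sketch below weights each neighbor by link reliability times residual energy; this weighting and the function names are illustrative assumptions, not ZERO's actual metric:

```python
import random

def choose_next_hop(neighbors, rng=random):
    """Pick a next hop at random, weighted by link reliability x residual energy.

    neighbors: list of (node_id, link_reliability, residual_energy) tuples.
    Reliable but energy-depleted nodes are chosen less often, spreading load
    away from "hot spot" routes instead of always using the best link.
    """
    weights = [rel * energy for _, rel, energy in neighbors]
    total = sum(weights)
    x = rng.random() * total
    for (node, _, _), w in zip(neighbors, weights):
        x -= w
        if x <= 0:
            return node
    return neighbors[-1][0]  # numerical fallback
```

    A neighbor with zero residual energy is effectively never selected, which is exactly the load-shedding behavior a gradient-based protocol lacks.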

  17. Power transformer reliability modelling

    NARCIS (Netherlands)

    Schijndel, van A.

    2010-01-01

    Problem description: Electrical power grids serve to transport and distribute electrical power with high reliability and availability at acceptable costs and risks. These grids play a crucial, though preferably invisible, role in supplying sufficient power in a convenient form. Today’s society has

  18. Accident simulator development for probabilistic safety analysis

    International Nuclear Information System (INIS)

    Cacciabue, P.C.; Amendola, A.; Mancini, G.

    1985-01-01

    This paper describes the basic features of a new concept of incident simulator, the Response System Analyzer (RSA), which is being developed within the CEC JRC Research Program on Reactor Safety. Focusing on somewhat different aims than conventional simulators, the RSA development extends the field of application of simulators to the area of risk and reliability analysis, in particular to the identification of relevant sequences, the modeling of human behavior and the validation of operating procedures. The fundamental components of the project, i.e. the deterministic transient model of the plant, the automatic probabilistic driver and the modeling of possible human intervention, are discussed in connection with the problem of their dynamic interaction. The analyses performed so far by separately testing RSA on significant study cases have shown encouraging results and have proven the feasibility of the overall program

  19. Probabilistic Modeling of Graded Timber Material Properties

    DEFF Research Database (Denmark)

    Faber, M. H.; Köhler, J.; Sørensen, John Dalsgaard

    2004-01-01

    The probabilistic modeling of timber material characteristics is considered with special emphasis to the modeling of the effect of different quality control and selection procedures used as means for quality grading in the production line. It is shown how statistical models may be established...... on the basis of the same type of information which is normally collected as a part of the quality control procedures and furthermore, how the efficiency of different control procedures may be quantified and compared. The tail behavior of the probability distributions of timber material characteristics plays...... such that they may readily be applied in structural reliability analysis and their format appears to be appropriate for codification purposes of quality control and selection for grading procedures....

  20. Probabilistic Modelling of Timber Material Properties

    DEFF Research Database (Denmark)

    Nielsen, Michael Havbro Faber; Köhler, Jochen; Sørensen, John Dalsgaard

    2001-01-01

    The probabilistic modeling of timber material characteristics is considered with special emphasis to the modeling of the effect of different quality control and selection procedures used as means for grading of timber in the production line. It is shown how statistical models may be established...... on the basis of the same type of information which is normally collected as a part of the quality control procedures and furthermore, how the efficiency of different control procedures may be compared. The tail behavior of the probability distributions of timber material characteristics play an important role...... such that they may readily be applied in structural reliability analysis and the format appears to be appropriate for codification purposes of quality control and selection for grading procedures...

  1. Probabilistic numerical discrimination in mice.

    Science.gov (United States)

    Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-03-01

    Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.

  2. An approach for assessing human decision reliability

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-01-01

    This paper presents a method to study human reliability in decision situations related to nuclear power plant disturbances. Decisions often play a significant role in the handling of emergency situations. The method may be applied to probabilistic safety assessments (PSAs) in cases where decision making is an important dimension of an accident sequence. Such situations are frequent, e.g. in accident management. In this paper, a modelling approach for decision reliability studies is first proposed. Then, a case study with two decision situations with relatively different characteristics is presented. Qualitative and quantitative findings of the study are discussed. In very simple decision cases with time pressure, time-reliability correlation proved to be a feasible reliability modelling method. In all other decision situations, more advanced probabilistic decision models have to be used. Finally, decision probability assessment using simulator run results and expert judgement is presented
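
    A time-reliability correlation of the kind found feasible for simple time-pressured decisions is commonly modelled as a lognormal non-response curve. The sketch below is a generic illustration of that idea; the functional form, parameter values, and function name are assumptions, not taken from the paper:

```python
from math import erf, log, sqrt

def nonresponse_probability(t, median_time, sigma=0.6):
    """Lognormal time-reliability correlation: probability that the crew
    has NOT completed the decision/diagnosis within time t.

    median_time: time by which half of crews are assumed to respond.
    sigma: logarithmic standard deviation (spread of response times).
    """
    if t <= 0:
        return 1.0
    z = (log(t) - log(median_time)) / sigma
    lognormal_cdf = 0.5 * (1.0 + erf(z / sqrt(2.0)))
    return 1.0 - lognormal_cdf
```

    By construction the non-response probability is 0.5 at the median time and decreases monotonically as more time becomes available.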

  3. Implications of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Cullingford, M.C.; Shah, S.M.; Gittus, J.H.

    1987-01-01

    Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessments for specific reactor types or components and specific risks (e.g. aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.)

  4. A Probabilistic Damage Tolerance Concept for Welded Joints

    DEFF Research Database (Denmark)

    Lassen, T.; Sørensen, John Dalsgaard

    2002-01-01

    The first part of this paper presented the statistics and stochastic models required for reliability analysis of the fatigue fracture of welded plate joints. The present Part 2 suggests a probabilistic damage tolerance supplement to the design S–N curves for welded joints. The goal is to provide......) will have the same reliability level for the same FDF. This is true at the end of TSL and at earlier stages, i.e. fractions of TSL. The absolute value of TSL is immaterial for a given FDF. In the case of in-service inspection, the inspection interval is also given without dimensions as a fraction of TSL...

  5. Probabilistic validation of protein NMR chemical shift assignments

    International Nuclear Information System (INIS)

    Dashti, Hesam; Tonelli, Marco; Lee, Woonghee; Westler, William M.; Cornilescu, Gabriel; Ulrich, Eldon L.; Markley, John L.

    2016-01-01

    Data validation plays an important role in ensuring the reliability and reproducibility of studies. NMR investigations of the functional properties, dynamics, chemical kinetics, and structures of proteins depend critically on the correctness of chemical shift assignments. We present a novel probabilistic method named ARECA for validating chemical shift assignments that relies on the nuclear Overhauser effect data. ARECA has been evaluated through its application to 26 case studies and has been shown to be complementary to, and usually more reliable than, approaches based on chemical shift databases. ARECA is available online at http://areca.nmrfam.wisc.edu/

  6. Probabilistic validation of protein NMR chemical shift assignments

    Energy Technology Data Exchange (ETDEWEB)

    Dashti, Hesam [University of Wisconsin-Madison, Graduate Program in Biophysics, Biochemistry Department (United States); Tonelli, Marco; Lee, Woonghee; Westler, William M.; Cornilescu, Gabriel [University of Wisconsin-Madison, Biochemistry Department, National Magnetic Resonance Facility at Madison (United States); Ulrich, Eldon L. [University of Wisconsin-Madison, BioMagResBank, Biochemistry Department (United States); Markley, John L., E-mail: markley@nmrfam.wisc.edu, E-mail: jmarkley@wisc.edu [University of Wisconsin-Madison, Biochemistry Department, National Magnetic Resonance Facility at Madison (United States)

    2016-01-15

    Data validation plays an important role in ensuring the reliability and reproducibility of studies. NMR investigations of the functional properties, dynamics, chemical kinetics, and structures of proteins depend critically on the correctness of chemical shift assignments. We present a novel probabilistic method named ARECA for validating chemical shift assignments that relies on the nuclear Overhauser effect data. ARECA has been evaluated through its application to 26 case studies and has been shown to be complementary to, and usually more reliable than, approaches based on chemical shift databases. ARECA is available online at http://areca.nmrfam.wisc.edu/.

  7. Probabilistic coding of quantum states

    International Nuclear Information System (INIS)

    Grudka, Andrzej; Wojcik, Antoni; Czechlewski, Mikolaj

    2006-01-01

    We discuss the properties of probabilistic coding of two qubits to one qutrit and generalize the scheme to higher dimensions. We show that the protocol preserves the entanglement between the qubits to be encoded and the environment and can also be applied to mixed states. We present a protocol that enables encoding of n qudits to one qudit of dimension smaller than the Hilbert space of the original system and then allows probabilistic but error-free decoding of any subset of k qudits. We give a formula for the probability of successful decoding

  8. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  9. Probabilistic reasoning in data analysis.

    Science.gov (United States)

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
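
    The link described above between memoryless (Markovian) arrivals, exponential waiting times, and Poisson counts can be demonstrated with a short simulation. This is a generic illustration in the spirit of the lecture notes, not material from them; all names and numbers are invented:

```python
import math
import random

def poisson_pmf(k, lam):
    """P(N = k) for a Poisson distribution with mean lam (= rate x window)."""
    return math.exp(-lam) * lam ** k / math.factorial(k)

def fraction_with_zero_arrivals(rate, window, trials, rng):
    """Simulate Markovian arrivals by summing exponential waiting times;
    return the fraction of trials with no arrival inside the window."""
    zero = 0
    for _ in range(trials):
        t, n = 0.0, 0
        while True:
            t += rng.expovariate(rate)  # memoryless waiting time
            if t > window:
                break
            n += 1
        if n == 0:
            zero += 1
    return zero / trials

rng = random.Random(42)
empirical = fraction_with_zero_arrivals(rate=2.0, window=1.0, trials=20000, rng=rng)
theoretical = poisson_pmf(0, 2.0)  # exp(-2), about 0.135
```

    The empirical fraction of windows with no arrival agrees closely with the Poisson prediction, illustrating why the two distributions appear together in the analysis of random arrivals.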

  10. Convex sets in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Aghajani, Asadollah; Nourouzi, Kourosh

    2008-01-01

    In this paper we obtain some results on convexity in a probabilistic normed space. We also investigate the concept of CSN-closedness and CSN-compactness in a probabilistic normed space and generalize the corresponding results of normed spaces

  11. Developing Probabilistic Safety Performance Margins for Unknown and Underappreciated Risks

    Science.gov (United States)

    Benjamin, Allan; Dezfuli, Homayoon; Everett, Chris

    2015-01-01

    Probabilistic safety requirements currently formulated or proposed for space systems, nuclear reactor systems, nuclear weapon systems, and other types of systems that have a low-probability potential for high-consequence accidents depend on showing that the probability of such accidents is below a specified safety threshold or goal. Verification of compliance depends heavily upon synthetic modeling techniques such as PRA. To determine whether or not a system meets its probabilistic requirements, it is necessary to consider whether there are significant risks that are not fully considered in the PRA either because they are not known at the time or because their importance is not fully understood. The ultimate objective is to establish a reasonable margin to account for the difference between known risks and actual risks in attempting to validate compliance with a probabilistic safety threshold or goal. In this paper, we examine data accumulated over the past 60 years from the space program, from nuclear reactor experience, from aircraft systems, and from human reliability experience to formulate guidelines for estimating probabilistic margins to account for risks that are initially unknown or underappreciated. The formulation includes a review of the safety literature to identify the principal causes of such risks.

  12. Japanese round robin analysis for probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Yagawa, G.; Yoshimura, S.; Handa, N.

    1991-01-01

    Recently, attention has focused on probabilistic fracture mechanics, a branch of fracture mechanics that combines probability theory as a rational means to assess the strength of components and structures. In particular, probabilistic fracture mechanics is recognized as a powerful means for quantitatively investigating the significance of factors and rationally evaluating life in problems involving a number of uncertainties, such as degradation of material strength and the accuracy and frequency of inspection. Comparisons with reference experiments are generally employed to assure analytical accuracy. However, the accuracy and reliability of analytical methods in probabilistic fracture mechanics are hardly verifiable by experiments. Therefore, it is strongly needed to verify probabilistic fracture mechanics through round robin analysis. This paper describes results from the round robin analysis of a flat plate with semi-elliptic surface cracks, conducted by the PFM Working Group of the LE Subcommittee of the Japan Welding Society under contract to the Japan Atomic Energy Research Institute, with participation by Tokyo University, Yokohama National University, the Power Reactor and Nuclear Fuel Corporation, Tokyo Electric Power Co., the Central Research Institute of Electric Power Industry, Toshiba Corporation, Kawasaki Heavy Industry Co. and Mitsubishi Heavy Industry Co. (author)

  13. Probabilistic Electricity Price Forecasting Models by Aggregation of Competitive Predictors

    Directory of Open Access Journals (Sweden)

    Claudio Monteiro

    2018-04-01

    Full Text Available This article presents original probabilistic price forecasting meta-models by aggregation of competitive predictors (PPFMCP models) for day-ahead hourly probabilistic price forecasting. The best twenty predictors of the EEM2016 EPF competition are used to create ensembles of hourly spot price forecasts. For each hour, the parameter values of the probability density function (PDF) of a Beta distribution for the output variable (hourly price) can be obtained directly from the expected value and variance associated with the ensemble for that hour, using three aggregation strategies of predictor forecasts corresponding to three PPFMCP models. A Reliability Indicator (RI) and a Loss function Indicator (LI) are also introduced to give a measure of the uncertainty of probabilistic price forecasts. The three PPFMCP models were satisfactorily applied to the real-world case study of the Iberian Electricity Market (MIBEL). Results showed that PPFMCP model 2, which aggregates by weight values according to daily ranks of predictors, was the best probabilistic meta-model in terms of mean absolute errors as well as of RI and LI. PPFMCP model 1, which averages the predictor forecasts, was the second best meta-model. PPFMCP models allow risk decisions based on price to be evaluated.
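
    The step of recovering Beta PDF parameters from an ensemble's expected value and variance can be sketched with the method of moments. The function name and the toy forecast numbers are hypothetical, and prices are assumed rescaled to the unit interval, as a Beta distribution requires:

```python
def beta_params_from_moments(mean, var):
    """Method-of-moments fit of Beta(a, b) from sample moments, inverting
    mean = a / (a + b) and var = a*b / ((a + b)**2 * (a + b + 1))."""
    if not 0.0 < mean < 1.0 or var <= 0.0 or var >= mean * (1.0 - mean):
        raise ValueError("moments incompatible with a Beta distribution")
    common = mean * (1.0 - mean) / var - 1.0  # equals a + b
    return mean * common, (1.0 - mean) * common

# Hypothetical ensemble of rescaled price forecasts for one hour:
forecasts = [0.42, 0.45, 0.40, 0.47, 0.44, 0.43]
m = sum(forecasts) / len(forecasts)
v = sum((x - m) ** 2 for x in forecasts) / (len(forecasts) - 1)
a, b = beta_params_from_moments(m, v)
```

    The fitted Beta density then reproduces the ensemble mean and variance exactly, which is what makes the direct moment-based parameterization attractive for hourly forecasts.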

  14. Confluence Reduction for Probabilistic Systems (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2010-01-01

    This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To support the

  15. Software reliability

    CERN Document Server

    Bendell, A

    1986-01-01

    Software Reliability reviews some fundamental issues of software reliability as well as the techniques, models, and metrics used to predict the reliability of software. Topics covered include fault avoidance, fault removal, and fault tolerance, along with statistical methods for the objective assessment of predictive accuracy. Development cost models and life-cycle cost models are also discussed. This book is divided into eight sections and begins with a chapter on adaptive modeling used to predict software reliability, followed by a discussion on failure rate in software reliability growth mo

  16. Virtual Globes: Serving Science and Society

    Directory of Open Access Journals (Sweden)

    Salman Qureshi

    2012-08-01

    Full Text Available Virtual Globes reached the mass market in 2005. They created multi-million dollar businesses in a very short time by providing novel ways to explore data geographically. We use the term “Virtual Globes” as the common denominator for technologies offering capabilities to annotate, edit and publish geographic information to a world-wide audience and to visualize information provided by the public and private sectors, as well as by citizens who volunteer new data. Unfortunately, but not surprising for a new trend or paradigm, overlapping terms such as “Virtual Globes”, “Digital Earth”, “Geospatial Web”, “Geoportal” or software specific terms are used heterogeneously. We analyze the terminologies and trends in scientific publications and ask whether these developments serve science and society. While usage can be answered quantitatively, the authors reason from the literature studied that these developments serve to educate the masses and may help to democratize geographic information by extending the producer base. We believe that we can contribute to a better distinction between software centered terms and the generic concept as such. The power of the visual, coupled with the potential of spatial analysis and modeling for public and private purposes raises new issues of reliability, standards, privacy and best practice. This is increasingly addressed in scientific literature but the required body of knowledge is still in its infancy.

  17. Making Probabilistic Relational Categories Learnable

    Science.gov (United States)

    Jung, Wookyoung; Hummel, John E.

    2015-01-01

    Theories of relational concept acquisition (e.g., schema induction) based on structured intersection discovery predict that relational concepts with a probabilistic (i.e., family resemblance) structure ought to be extremely difficult to learn. We report four experiments testing this prediction by investigating conditions hypothesized to facilitate…

  18. Probabilistic inductive inference: a survey

    OpenAIRE

    Ambainis, Andris

    2001-01-01

    Inductive inference is a recursion-theoretic theory of learning, first developed by E. M. Gold (1967). This paper surveys developments in probabilistic inductive inference. We mainly focus on finite inference of recursive functions, since this simple paradigm has produced the most interesting (and most complex) results.

  19. Probabilistic Approaches to Video Retrieval

    NARCIS (Netherlands)

    Ianeva, Tzvetanka; Boldareva, L.; Westerveld, T.H.W.; Cornacchia, Roberto; Hiemstra, Djoerd; de Vries, A.P.

    Our experiments for TRECVID 2004 further investigate the applicability of the so-called “Generative Probabilistic Models to video retrieval”. TRECVID 2003 results demonstrated that mixture models computed from video shot sequences improve the precision of “query by examples” results when

  20. Probabilistic safety analysis procedures guide

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.

    1984-01-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of the importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study
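
    A basic building block of such a study, combining minimal cut sets of basic events into a top-event (e.g. core damage) frequency under the rare-event approximation, can be sketched as follows. The event names and numbers are invented for illustration and do not come from the guide:

```python
def top_event_frequency(cut_sets, basic_values):
    """Rare-event approximation: sum over minimal cut sets of the product
    of the basic event probabilities/frequencies in each set."""
    total = 0.0
    for cut_set in cut_sets:
        product = 1.0
        for event in cut_set:
            product *= basic_values[event]
        total += product
    return total

# Hypothetical example: loss of offsite power (per year) combined with
# failure of both emergency diesel generator trains (per demand).
basic = {"LOOP": 0.1, "DG_A": 1e-2, "DG_B": 1e-2}
cdf = top_event_frequency([["LOOP", "DG_A", "DG_B"]], basic)  # 1e-5 per year
```

    Real PSA tools handle success branches, common-cause failures, and uncertainty propagation on top of this arithmetic, but the cut-set sum is the core of the quantification.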

  1. Sound Probabilistic #SAT with Projection

    Directory of Open Access Journals (Sweden)

    Vladimir Klebanov

    2016-10-01

    Full Text Available We present an improved method for a sound probabilistic estimation of the model count of a boolean formula under projection. The problem solved can be used to encode a variety of quantitative program analyses, such as those concerning security or resource consumption. We implement the technique and discuss its application to quantifying information flow in programs.
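
    For intuition about what a probabilistic model-count estimate is, here is a naive Monte Carlo estimator over a CNF formula. This is only an illustration of the problem being estimated; it is not the sound hashing-based method the paper develops:

```python
import random

def mc_model_count(clauses, n_vars, samples=20000, seed=0):
    """Estimate the number of satisfying assignments of a CNF formula.

    clauses: DIMACS-style list of clauses, each a list of non-zero ints
    (positive literal = variable true, negative literal = variable false).
    Estimate = 2**n_vars times the fraction of random assignments that
    satisfy every clause.
    """
    rng = random.Random(seed)
    hits = 0
    for _ in range(samples):
        assign = [rng.random() < 0.5 for _ in range(n_vars)]
        if all(any(assign[abs(lit) - 1] == (lit > 0) for lit in clause)
               for clause in clauses):
            hits += 1
    return 2 ** n_vars * hits / samples

# (x1 or x2) over 2 variables has exactly 3 models; the estimate is close:
estimate = mc_model_count([[1, 2]], n_vars=2)
```

    Naive sampling gives no soundness guarantee and degrades badly when satisfying assignments are rare, which is precisely what motivates the hashing-based approaches with provable error bounds.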

  2. Probabilistic uniformities of uniform spaces

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Lopez, J.; Romaguera, S.; Sanchis, M.

    2017-07-01

    The theory of metric spaces in the fuzzy context has shown to be an interesting area of study, not only from a theoretical point of view but also for its applications. Nevertheless, it is usual to consider these spaces as classical topological or uniform spaces, and there are not too many results about constructing fuzzy topological structures starting from a fuzzy metric. Maybe Höhle was the first to show how to construct a probabilistic uniformity and a Lowen uniformity from a probabilistic pseudometric [Hohle78, Hohle82a]. His method can be directly translated to the context of fuzzy metrics and allows one to characterize the categories of probabilistic uniform spaces or Lowen uniform spaces by means of certain families of fuzzy pseudometrics [RL]. On the other hand, other different fuzzy uniformities can be constructed in a fuzzy metric space: a Hutton [0,1]-quasi-uniformity [GGPV06]; a fuzzifying uniformity [YueShi10], etc. The paper [GGRLRo] gives a study of several methods of endowing a fuzzy pseudometric space with a probabilistic uniformity and a Hutton [0,1]-quasi-uniformity. In 2010, J. Gutiérrez García, S. Romaguera and M. Sanchis [GGRoSanchis10] proved that the category of uniform spaces is isomorphic to a category formed by sets endowed with a fuzzy uniform structure, i.e. a family of fuzzy pseudometrics satisfying certain conditions. We will show here that, by means of this isomorphism, we can obtain several methods to endow a uniform space with a probabilistic uniformity. Furthermore, these constructions allow us to obtain a factorization of some functors introduced in [GGRoSanchis10]. (Author)

  3. Food and drink serving contract

    Directory of Open Access Journals (Sweden)

    Veselinović Janko

    2012-01-01

    Full Text Available Food and drink catering service is almost as old as civilization itself. Even though this vocation is part of the catering industry, Serbian law does not regulate this contract as a named contract. The key legal sources for this kind of contract are business customs. The food and drink serving contract is a mixed-type contract, and its legal nature is very interesting due to its complexity. Specific to this contract is the fact that it is not an ordinary service but an activity that requires a degree of culinary skill, knowledge of the customs of other nations, and other skills. The very category of a good professional in the hospitality industry is very dynamic, as it needs to be evaluated according to all given circumstances, which may be rather unpredictable. By considering the legal nature, as well as the rights and obligations of the contracting parties, we tried to point to the questions that require special attention. Legal sources that indirectly refer to food and drink serving contracts were taken into account. Apart from the Law on Obligatory Relations, we also considered the Law on Tourism, also pointing to comparative law and jurisprudence.

  4. Estimation of the Reliability of Plastic Slabs

    DEFF Research Database (Denmark)

    Pirzada, G. B. : Ph.D.

    In this thesis, work related to fundamental conditions has been extended to the non-fundamental, or general, case of probabilistic analysis. Finally, using the β-unzipping technique a door has been opened to system reliability analysis of plastic slabs. An attempt has been made in this thesis...... to give a probabilistic treatment of plastic slabs which is parallel to the deterministic and systematic treatment of plastic slabs by Nielsen (3). The fundamental reason is that in Nielsen (3) the treatment is based on a deterministic modelling of the basic material properties for the reinforced...

  5. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.
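
    For reference, the deterministic three-field Hu-Washizu functional that the PHWVP randomizes treats displacement, strain, and stress as independent fields. This is the standard textbook form, not the paper's probabilistic extension:

```latex
\Pi_{HW}(u,\varepsilon,\sigma)
  = \int_{\Omega} \Big[ W(\varepsilon)
      + \sigma : \big(\nabla^{s} u - \varepsilon\big) \Big]\, d\Omega
  - \int_{\Omega} b \cdot u \, d\Omega
  - \int_{\Gamma_t} \bar{t} \cdot u \, d\Gamma
```

    Stationarity with respect to the three fields yields the constitutive law, the compatibility condition, and equilibrium with the boundary conditions as Euler equations, which is why the PHWVP can attach probabilistic distributions to each of these aspects independently.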

  6. Reliability tasks from prediction to field use

    International Nuclear Information System (INIS)

    Guyot, Christian.

    1975-01-01

    This tutorial paper is part of a series intended to raise awareness of reliability problems. Reliability, a probabilistic concept, is an important parameter of availability. Reliability prediction is an estimation process for evaluating design progress. It is only by the application of a reliability program that reliability objectives can be attained through the different stages of work: conception, fabrication, and field use. The user is mainly interested in operational reliability. Indications are given on the support and the treatment of data in the case of electronic equipment at C.E.A. Reliability engineering requires a special state of mind which must be formed and developed in a company in the same way as may be done, for example, for safety [fr

  7. Application of the probabilistic method at the E.D.F

    International Nuclear Information System (INIS)

    Gachot, Bernard

    1976-01-01

    Having first evoked the problems arising from the definition of a so-called 'acceptable risk', the probabilistic safety study programme carried out at the E.D.F. is described. The different aspects of the probabilistic estimation of a hazard are presented, as well as the different steps, i.e. collecting the information and carrying out quantitative and qualitative analyses, which characterize the probabilistic study of safety problems. The problem of determining equipment reliability data is considered, noting in conclusion that, in spite of the limited accuracy of the present data, probabilistic methods already appear to be a highly valuable tool favouring a homogeneous and coherent approach to nuclear plant safety [fr

  8. An advanced probabilistic structural analysis method for implicit performance functions

    Science.gov (United States)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
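
    The contrast drawn above between the mean-based second-moment method and distribution-oriented methods can be sketched on a toy problem. Everything below (the performance function, means, and standard deviations) is invented for illustration; the sketch shows a plain first-order mean-value estimate against Monte Carlo, not the authors' AMV algorithm:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy performance function g(X) standing in for an implicit FE response.
    def g(x1, x2):
        return x1**2 * x2 - 5.0

    mu = np.array([2.0, 1.5])      # means of the random inputs (illustrative)
    sig = np.array([0.2, 0.15])    # standard deviations, inputs independent

    # Mean-based second-moment (first-order) estimate: linearize g at the mean.
    eps = 1e-6
    grad = np.array([
        (g(mu[0] + eps, mu[1]) - g(mu[0] - eps, mu[1])) / (2 * eps),
        (g(mu[0], mu[1] + eps) - g(mu[0], mu[1] - eps)) / (2 * eps),
    ])
    g_mean = g(*mu)
    g_std_fo = np.sqrt(np.sum((grad * sig) ** 2))

    # Monte Carlo reference: samples the full distribution, not just two moments.
    samples = g(rng.normal(mu[0], sig[0], 100_000),
                rng.normal(mu[1], sig[1], 100_000))
    print(g_mean, g_std_fo)               # first-order moments
    print(samples.mean(), samples.std())  # sampled moments (will differ slightly)
    ```

    The first-order estimate is exact only for linear g; the gap between the two mean values is the kind of bias that distribution-building corrections such as AMV are designed to remove.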

  9. A probabilistic approach to delineating functional brain regions

    DEFF Research Database (Denmark)

    Kalbitzer, Jan; Svarer, Claus; Frokjaer, Vibe G

    2009-01-01

    The purpose of this study was to develop a reliable observer-independent approach to delineating volumes of interest (VOIs) for functional brain regions that are not identifiable on structural MR images. The case is made for the raphe nuclei, a collection of nuclei situated in the brain stem known...... to be densely packed with serotonin transporters (5-hydroxytryptaminic [5-HTT] system). METHODS: A template set for the raphe nuclei, based on their high content of 5-HTT as visualized in parametric (11)C-labeled 3-amino-4-(2-dimethylaminomethyl-phenylsulfanyl)-benzonitrile PET images, was created for 10...... healthy subjects. The templates were subsequently included in the region sets used in a previously published automatic MRI-based approach to create an observer- and activity-independent probabilistic VOI map. The probabilistic map approach was tested in a different group of 10 subjects and compared...

  10. Towards a multilevel cognitive probabilistic representation of space

    Science.gov (United States)

    Tapus, Adriana; Vasudevan, Shrihari; Siegwart, Roland

    2005-03-01

    This paper addresses the problem of perception and representation of space for a mobile agent. A probabilistic hierarchical framework is suggested as a solution to this problem. The proposed method combines probabilistic belief with "Object Graph Models" (OGM). The world is viewed from a topological perspective, in terms of objects and the relationships between them. The hierarchical representation that we propose permits efficient and reliable modeling of the information that the mobile agent perceives from its environment. The integration of both navigational and interactional capabilities through efficient representation is also addressed. Experiments on a set of images taken from the real world that validate the approach are reported. This framework draws on the general understanding of human cognition and perception and contributes towards the overall effort to build cognitive robot companions.

  11. Probabilistic finite elements for fracture and fatigue analysis

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.

    1989-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The criterion for failure, or performance function, is stated as: the fatigue life of a component must exceed the service life of the component; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in it is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack tip. Performance and accuracy of the method are demonstrated on a classical mode I fatigue problem.

  12. Reliability Based Ship Structural Design

    DEFF Research Database (Denmark)

    Dogliani, M.; Østergaard, C.; Parmentier, G.

    1996-01-01

    This paper deals with the development of different methods that allow the reliability-based design of ship structures to be transferred from the area of research to systematic application in current design. It summarises the achievements of a three-year collaborative research project dealing...... with developments of models of load effects and of structural collapse adopted in reliability formulations which aim at calibrating partial safety factors for ship structural design. New probabilistic models of still-water load effects are developed both for tankers and for containerships. New results are presented...... structure of several tankers and containerships. The results of the reliability analysis were the basis for the definition of a target safety level, which was used to assess the partial safety factors suitable for a new design-rule format to be adopted in modern ship structural design. Finally...

  13. Structural Optimization with Reliability Constraints

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Thoft-Christensen, Palle

    1986-01-01

    During the last 25 years considerable progress has been made in the fields of structural optimization and structural reliability theory. In classical deterministic structural optimization all variables are assumed to be deterministic. Due to the unpredictability of loads and strengths of actual......]. In this paper we consider only structures which can be modelled as systems of elasto-plastic elements, e.g. frame and truss structures. In section 2 a method to evaluate the reliability of such structural systems is presented. Based on a probabilistic point of view a modern structural optimization problem...... is formulated in section 3. The formulation is a natural extension of the commonly used formulations in deterministic structural optimization. The mathematical form of the optimization problem is briefly discussed. In section 4 two new optimization procedures especially designed for the reliability...

  14. NASA Applications and Lessons Learned in Reliability Engineering

    Science.gov (United States)

    Safie, Fayssal M.; Fuller, Raymond P.

    2011-01-01

    Since the Shuttle Challenger accident in 1986, communities across NASA have been developing and extensively using quantitative reliability and risk assessment methods in their decision-making process. This paper discusses several reliability engineering applications that NASA has used over the years to support the design, development, and operation of critical space flight hardware. Specifically, the paper discusses applications in areas such as risk management, inspection policies, component upgrades, reliability growth, integrated failure analysis, and physics-based probabilistic engineering analysis. In each of these areas, the paper provides a brief discussion of a case study to demonstrate the value added and the criticality of reliability engineering in supporting NASA project and program decisions to fly safely. Examples of the case studies discussed are reliability-based life-limit extension of Space Shuttle Main Engine (SSME) hardware, reliability-based inspection policies for the Auxiliary Power Unit (APU) turbine disc, probabilistic structural engineering analysis for reliability prediction of the SSME alternate turbopump development, the impact of ET foam reliability on Space Shuttle system risk, and reliability-based Space Shuttle upgrades for safety. Special attention is given in this paper to the physics-based probabilistic engineering analysis applications and their critical role in evaluating the reliability of NASA development hardware, including their potential use in a research and technology development environment.

  15. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

    The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is that integration between probabilistic and psychological approaches to human reliability should be attempted. This is achieved, first, by adopting methods that adequately reflect the essential features of the process control activity and, secondly, by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context support the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints on activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model gives a tool by which psychological methodology may be interpreted and utilized for reliability analysis

  16. Reliability Assessment and Reliability-Based Inspection and Maintenance of Offshore Wind Turbines

    DEFF Research Database (Denmark)

    Ramírez, José G. Rangel; Sørensen, John Dalsgaard

    2009-01-01

    Probabilistic methodologies represent an important tool to identify the suitable strategy to inspect and deal with the deterioration in structures such as offshore wind turbines (OWT). Reliability based methods such as Risk Based Inspection (RBI) planning may represent a proper methodology to opt...

  17. Uncertainties and reliability theories for reactor safety

    International Nuclear Information System (INIS)

    Veneziano, D.

    1975-01-01

    What makes the safety problem of nuclear reactors particularly challenging is the demand for high levels of reliability and the limitation of statistical information. The latter is an unfortunate circumstance, which forces deductive theories of reliability to use models and parameter values with weak factual support. The uncertainty about probabilistic models and parameters which are inferred from limited statistical evidence can be quantified and incorporated rationally into inductive theories of reliability. In such theories, the starting point is the information actually available, as opposed to an estimated probabilistic model. But, while the necessity of introducing inductive uncertainty into reliability theories has been recognized by many authors, no satisfactory inductive theory is presently available. The paper presents: a classification of uncertainties and of reliability models for reactor safety; a general methodology to include these uncertainties into reliability analysis; a discussion about the relative advantages and the limitations of various reliability theories (specifically, of inductive and deductive, parametric and nonparametric, second-moment and full-distribution theories). For example, it is shown that second-moment theories, which were originally suggested to cope with the scarcity of data, and which have been proposed recently for the safety analysis of secondary containment vessels, are the least capable of incorporating statistical uncertainty. The focus is on reliability models for external threats (seismic accelerations and tornadoes). As an application example, the effect of statistical uncertainty on seismic risk is studied using parametric full-distribution models

  18. Probabilistic costing of transmission services

    International Nuclear Information System (INIS)

    Wijayatunga, P.D.C.

    1992-01-01

    Costing of the transmission services of electrical utilities is required for transactions involving the transport of energy over a power network. Calculation of these costs based on Short Run Marginal Costing (SRMC) is preferred over other methods proposed in the literature due to its economic efficiency. In the research work discussed here, the concept of probabilistic use-of-system costing based on SRMC, which emerges as a consequence of the uncertainties in a power system, is introduced using two different approaches. The first approach, based on the Monte Carlo method, generates a large number of possible system states by simulating the random variables in the system using pseudo-random number generators. A second approach to probabilistic use-of-system costing is proposed based on numerical convolution and a multi-area representation of the transmission network. (UK)
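
    The first, Monte Carlo, approach described above can be caricatured in a few lines: sample random system states, price each state with a marginal-cost rule, and average. Everything here (the two-generator system, the outage rate, the cost figures) is a made-up illustration, not the model from the thesis:

    ```python
    import random

    random.seed(42)

    # Hypothetical system: a cheap remote generator feeds the load over one
    # line; a dear local generator covers the rest. All numbers are invented.
    CHEAP_COST, DEAR_COST = 20.0, 50.0   # marginal costs, $/MWh
    LINE_LIMIT = 80.0                    # MW transfer limit of the line

    def marginal_cost(load_mw, line_available):
        """Short-run marginal cost of one extra MW at the load bus."""
        # Cheap generation is marginal while the line is up and uncongested;
        # otherwise the dear local unit sets the marginal price.
        if line_available and load_mw < LINE_LIMIT:
            return CHEAP_COST
        return DEAR_COST

    n = 100_000
    total = 0.0
    for _ in range(n):
        load = random.gauss(70.0, 15.0)   # random demand state
        line_up = random.random() > 0.02  # 2% line outage probability
        total += marginal_cost(load, line_up)

    expected_srmc = total / n
    print(f"expected use-of-system marginal cost: {expected_srmc:.2f} $/MWh")
    ```

    The expectation over random states is what distinguishes the probabilistic use-of-system cost from a single deterministic SRMC snapshot; the convolution approach computes the same expectation analytically instead of by sampling.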

  19. Probabilistic risk assessment of HTGRs

    International Nuclear Information System (INIS)

    Fleming, K.N.; Houghton, W.J.; Hannaman, G.W.; Joksimovic, V.

    1980-08-01

    Probabilistic Risk Assessment methods have been applied to gas-cooled reactors for more than a decade and to HTGRs for more than six years in the programs sponsored by the US Department of Energy. Significant advancements to the development of PRA methodology in these programs are summarized as are the specific applications of the methods to HTGRs. Emphasis here is on PRA as a tool for evaluating HTGR design options. Current work and future directions are also discussed

  20. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.
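
    A minimal version of the eigenvalue-based instability criterion mentioned above: sample random parameters of a second-order system, convert to first-order state form, and count realizations with an eigenvalue in the right half-plane. This is a single-degree-of-freedom toy with invented parameter distributions, using plain Monte Carlo rather than the paper's fast probability integration or adaptive importance sampling:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def unstable(m, c, k):
        """True if m*x'' + c*x' + k*x = 0 has an eigenvalue with Re > 0."""
        a = np.array([[0.0, 1.0],
                      [-k / m, -c / m]])  # companion (state-space) matrix
        return np.max(np.linalg.eigvals(a).real) > 0.0

    n = 20_000
    n_unstable = 0
    for _ in range(n):
        m = 1.0
        c = rng.normal(0.05, 0.05)  # net damping; can go negative, e.g. due
                                    # to destabilizing cross-coupling forces
        k = rng.normal(4.0, 0.4)    # stiffness, comfortably positive
        if unstable(m, c, k):
            n_unstable += 1

    print(f"probability of instability ~ {n_unstable / n:.3f}")
    ```

    With these invented distributions, instability is driven almost entirely by the chance of negative net damping, so the estimate should sit near the normal tail probability P(c < 0) ≈ 0.16.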

  1. Probabilistic analysis and related topics

    CERN Document Server

    Bharucha-Reid, A T

    1983-01-01

    Probabilistic Analysis and Related Topics, Volume 3 focuses on the continuity, integrability, and differentiability of random functions, including operator theory, measure theory, and functional and numerical analysis. The selection first offers information on the qualitative theory of stochastic systems and Langevin equations with multiplicative noise. Discussions focus on phase-space evolution via direct integration, phase-space evolution, linear and nonlinear systems, linearization, and generalizations. The text then ponders on the stability theory of stochastic difference systems and Marko

  2. Probabilistic analysis and related topics

    CERN Document Server

    Bharucha-Reid, A T

    1979-01-01

    Probabilistic Analysis and Related Topics, Volume 2 focuses on the integrability, continuity, and differentiability of random functions, as well as functional analysis, measure theory, operator theory, and numerical analysis.The selection first offers information on the optimal control of stochastic systems and Gleason measures. Discussions focus on convergence of Gleason measures, random Gleason measures, orthogonally scattered Gleason measures, existence of optimal controls without feedback, random necessary conditions, and Gleason measures in tensor products. The text then elaborates on an

  3. Probabilistic risk assessment of HTGRs

    International Nuclear Information System (INIS)

    Fleming, K.N.; Houghton, W.J.; Hannaman, G.W.; Joksimovic, V.

    1981-01-01

    Probabilistic Risk Assessment methods have been applied to gas-cooled reactors for more than a decade and to HTGRs for more than six years in the programs sponsored by the U.S. Department of Energy. Significant advancements to the development of PRA methodology in these programs are summarized as are the specific applications of the methods to HTGRs. Emphasis here is on PRA as a tool for evaluating HTGR design options. Current work and future directions are also discussed. (author)

  4. Applications of probabilistic risk analysis in nuclear criticality safety design

    International Nuclear Information System (INIS)

    Chang, J.K.

    1992-01-01

    Many documents have been prepared that try to define the scope of the criticality analysis and that suggest adding probabilistic risk analysis (PRA) to the deterministic safety analysis. The US Department of Energy (DOE) report AL 5481.1B suggested that an accident is credible if its occurrence probability is >1 × 10⁻⁶/yr. The draft DOE 5480 safety analysis report suggested that safety analyses should include the application of methods such as deterministic safety analysis, risk assessment, reliability engineering, common-cause failure analysis, human reliability analysis, and human factor safety analysis techniques. The US Nuclear Regulatory Commission (NRC) report NRC SG830.110 suggested that major safety analysis methods should include but not be limited to risk assessment, reliability engineering, and human factor safety analysis. All of these suggestions have recommended including PRA in the traditional criticality analysis

  5. Serving the world's poor, profitably.

    Science.gov (United States)

    Prahalad, C K; Hammond, Allen

    2002-09-01

    By stimulating commerce and development at the bottom of the economic pyramid, multi-nationals could radically improve the lives of billions of people and help create a more stable, less dangerous world. Achieving this goal does not require MNCs to spearhead global social-development initiatives for charitable purposes. They need only act in their own self-interest. How? The authors lay out the business case for entering the world's poorest markets. Fully 65% of the world's population earns less than $2,000 per year--that's 4 billion people. But despite the vastness of this market, it remains largely untapped. The reluctance to invest is easy to understand, but it is, by and large, based on outdated assumptions of the developing world. While individual incomes may be low, the aggregate buying power of poor communities is actually quite large, representing a substantial market in many countries for what some might consider luxury goods like satellite television and phone services. Prices, and margins, are often much higher in poor neighborhoods than in their middle-class counterparts. And new technologies are already steadily reducing the effects of corruption, illiteracy, inadequate infrastructure, and other such barriers. Because these markets are in the earliest stages of economic development, revenue growth for multi-nationals entering them can be extremely rapid. MNCs can also lower costs, not only through low-cost labor but by transferring operating efficiencies and innovations developed to serve their existing operations. Certainly, succeeding in such markets requires MNCs to think creatively. The biggest change, though, has to come from executives: Unless business leaders confront their own preconceptions--particularly about the value of high-volume, low-margin businesses--companies are unlikely to master the challenges or reap the rewards of these developing markets.

  6. Reliability Engineering

    CERN Document Server

    Lazzaroni, Massimo

    2012-01-01

    This book gives a practical guide for designers and users in the Information and Communication Technology context. In particular, in the first Section, the definitions of the fundamental terms according to the international standards are given. Then, some theoretical concepts and reliability models are presented in Chapters 2 and 3: the aim is to evaluate performance for components and systems and reliability growth. Chapter 4, by introducing the laboratory tests, puts in evidence the reliability concept from the experimental point of view. In the ICT context, the failure rate for a given system can be

  7. Reliability training

    Science.gov (United States)

    Lalli, Vincent R. (Editor); Malec, Henry A. (Editor); Dillard, Richard B.; Wong, Kam L.; Barber, Frank J.; Barina, Frank J.

    1992-01-01

    Discussed here is failure physics, the study of how products, hardware, software, and systems fail and what can be done about it. The intent is to impart useful information, to extend the limits of production capability, and to assist in achieving low cost reliable products. A review of reliability for the years 1940 to 2000 is given. Next, a review of mathematics is given as well as a description of what elements contribute to product failures. Basic reliability theory and the disciplines that allow us to control and eliminate failures are elucidated.

  8. Power system reliability analysis using fault trees

    International Nuclear Information System (INIS)

    Volkanovski, A.; Cepin, M.; Mavko, B.

    2006-01-01

    A power system reliability analysis method is developed from the viewpoint of reliable delivery of electrical energy to customers. The method is based on fault tree analysis, which is widely applied in Probabilistic Safety Assessment (PSA), and is adapted to power system reliability analysis. It is developed in such a way that only the basic reliability parameters of the analysed power system are necessary as input for the calculation of the system's reliability indices. The modeling and analysis were performed on an example power system consisting of eight substations. The results include the level of reliability of the current power system configuration, the combinations of component failures resulting in failed power delivery to loads, and the importance factors for components and subsystems. (author)
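
    The basic fault-tree arithmetic such a method builds on is small enough to sketch: basic-event probabilities are combined through AND/OR gates under an independence assumption. This is a generic textbook fragment with invented numbers, not the paper's eight-substation model:

    ```python
    # Basic-event failure probabilities (hypothetical values).
    P_LINE_A = 0.01   # transmission line A unavailable
    P_LINE_B = 0.01   # redundant line B unavailable
    P_BUS    = 0.001  # substation busbar failure

    def p_and(*ps):
        """AND gate: all independent inputs must fail."""
        out = 1.0
        for p in ps:
            out *= p
        return out

    def p_or(*ps):
        """OR gate: any input fails; complement of 'none fail'."""
        ok = 1.0
        for p in ps:
            ok *= (1.0 - p)
        return 1.0 - ok

    # Top event "no power delivered to the load":
    # both redundant lines lost, OR the busbar lost.
    p_top = p_or(p_and(P_LINE_A, P_LINE_B), P_BUS)
    print(f"P(loss of delivery) = {p_top:.6f}")
    ```

    The two arguments of the OR gate here are the minimal cut sets {line A, line B} and {busbar}; cut-set probabilities like these are also the raw material for the importance factors the abstract mentions.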

  9. Comparison of Methods for Dependency Determination between Human Failure Events within Human Reliability Analysis

    International Nuclear Information System (INIS)

    Cepin, M.

    2008-01-01

    The human reliability analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with a focus on dependency determination, comparing the Institute Jozef Stefan human reliability analysis (IJS-HRA) and the standardized plant analysis risk human reliability analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance

  10. Comparison of methods for dependency determination between human failure events within human reliability analysis

    International Nuclear Information System (INIS)

    Cepis, M.

    2007-01-01

    The Human Reliability Analysis (HRA) is a highly subjective evaluation of human performance, which is an input for probabilistic safety assessment, which deals with many parameters of high uncertainty. The objective of this paper is to show that subjectivism can have a large impact on human reliability results and consequently on probabilistic safety assessment results and applications. The objective is to identify the key features which may decrease the subjectivity of human reliability analysis. Human reliability methods are compared with a focus on dependency determination, comparing the Institute Jozef Stefan - Human Reliability Analysis (IJS-HRA) and the Standardized Plant Analysis Risk Human Reliability Analysis (SPAR-H). Results show large differences in the calculated human error probabilities for the same events within the same probabilistic safety assessment, which are the consequence of subjectivity. The subjectivity can be reduced by development of more detailed guidelines for human reliability analysis with many practical examples for all steps of the process of evaluation of human performance. (author)

  11. Probabilistic methods in nuclear power plant component ageing analysis

    International Nuclear Information System (INIS)

    Simola, K.

    1992-03-01

    Nuclear power plant ageing research aims to ensure that plant safety and reliability are maintained at a desired level through the designed, and possibly extended, lifetime. In ageing studies, the reliability of components, systems and structures is evaluated taking into account the possible time-dependent decrease in reliability. The results of the analyses can be used in the evaluation of the remaining lifetime of components and in the development of preventive maintenance, testing and replacement programmes. The report discusses the use of probabilistic models in evaluating the ageing of nuclear power plant components. The principles of nuclear power plant ageing studies are described and examples of ageing management programmes in foreign countries are given. The use of time-dependent probabilistic models to evaluate the ageing of various components and structures is described and the application of the models is demonstrated with two case studies. In the case study of motor-operated closing valves the analyses are based on failure data obtained from a power plant. In the second example, environmentally assisted crack growth is modelled with a computer code developed in the United States, and the applicability of the model is evaluated on the basis of operating experience

  12. Probabilistic Analysis of Space Shuttle Body Flap Actuator Ball Bearings

    Science.gov (United States)

    Oswald, Fred B.; Jett, Timothy R.; Predmore, Roamer E.; Zaretsky, Erwin V.

    2008-01-01

    A probabilistic analysis, using the 2-parameter Weibull-Johnson method, was performed on experimental life test data from space shuttle actuator bearings. Experiments were performed on a test rig under simulated conditions to determine the life and failure mechanism of the grease lubricated bearings that support the input shaft of the space shuttle body flap actuators. The failure mechanism was wear that can cause loss of bearing preload. These tests established life and reliability data for both shuttle flight and ground operation. Test data were used to estimate the failure rate and reliability as a function of the number of shuttle missions flown. The Weibull analysis of the test data for the four actuators on one shuttle, each with a 2-bearing shaft assembly, established a reliability level of 96.9 percent for a life of 12 missions. A probabilistic system analysis for four shuttles, each of which has four actuators, predicts a single bearing failure in one actuator of one shuttle after 22 missions (a total of 88 missions for a 4-shuttle fleet). This prediction is comparable with actual shuttle flight history in which a single actuator bearing was found to have failed by wear at 20 missions.
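
    The reported figures can be connected through the 2-parameter Weibull reliability function R(t) = exp(-(t/η)^β). Taking the 96.9 percent level at 12 missions as a single-bearing reliability and assuming a wear-type shape parameter β = 2 (both the shape value and the single-bearing reading are our assumptions, not taken from the paper), the implied characteristic life and a series-system assembly reliability follow directly:

    ```python
    import math

    beta = 2.0        # assumed Weibull shape for wear-out (illustrative choice)
    r_target = 0.969  # reliability at t = 12 missions, from the test data
    t = 12.0

    # Invert R(t) = exp(-(t/eta)**beta) for the characteristic life eta.
    eta = t / (-math.log(r_target)) ** (1.0 / beta)
    print(f"implied characteristic life: {eta:.1f} missions")

    # Reliability of a 2-bearing shaft assembly, treated here as a series
    # system of independent bearings (an idealization for illustration).
    r_assembly = math.exp(-(t / eta) ** beta) ** 2
    print(f"2-bearing assembly reliability at 12 missions: {r_assembly:.3f}")
    ```

    The same inversion with a different assumed β gives a very different η, which is why the Weibull-Johnson fit to actual failure times, rather than an assumed shape, is what establishes the reported reliability level.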

  13. Deterministic and probabilistic approach to determine seismic risk of nuclear power plants; a practical example

    International Nuclear Information System (INIS)

    Soriano Pena, A.; Lopez Arroyo, A.; Roesset, J.M.

    1976-01-01

    The probabilistic and deterministic approaches for calculating the seismic risk of nuclear power plants are both applied to a particular case in Southern Spain. The results obtained by both methods, when varying the input data, are presented and some conclusions drawn in relation to the applicability of the methods, their reliability and their sensitivity to change

  14. Probabilistically-Cued Patterns Trump Perfect Cues in Statistical Language Learning.

    Science.gov (United States)

    Lany, Jill; Gómez, Rebecca L

    2013-01-01

    Probabilistically-cued co-occurrence relationships between word categories are common in natural languages but difficult to acquire. For example, in English, determiner-noun and auxiliary-verb dependencies both involve co-occurrence relationships, but determiner-noun relationships are more reliably marked by correlated distributional and phonological cues, and appear to be learned more readily. We tested whether experience with co-occurrence relationships that are more reliable promotes learning those that are less reliable, using an artificial language paradigm. Prior experience with deterministically-cued contingencies did not promote learning of less reliably-cued structure, nor did prior experience with relationships instantiated in the same vocabulary. In contrast, prior experience with probabilistically-cued co-occurrence relationships instantiated in different vocabulary did enhance learning. Thus, experience with co-occurrence relationships sharing underlying structure but not vocabulary may be an important factor in learning grammatical patterns. Furthermore, experience with probabilistically-cued co-occurrence relationships, despite their difficulty for naïve learners, lays an important foundation for learning novel probabilistic structure.

  15. Future of structural reliability methodology in nuclear power plant technology

    Energy Technology Data Exchange (ETDEWEB)

    Schueeller, G I [Technische Univ. Muenchen (Germany, F.R.); Kafka, P [Gesellschaft fuer Reaktorsicherheit m.b.H. (GRS), Garching (Germany, F.R.)

    1978-10-01

    This paper presents the authors' personal view as to which areas of structural reliability in nuclear power plant design need most urgently to be advanced. Aspects of simulation modeling, design rules, codification and specification of reliability, system analysis, probabilistic structural dynamics, rare events and particularly the interaction of systems and structural reliability are discussed. As an example, some considerations of the interaction effects between the protective systems and the pressure vessel are stated. The paper concludes with recommendations for further research.

  16. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitioned converter generated voltage harmonics into those...... with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3MW, Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic...

  17. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. This research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data comprise the students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems. These data were analyzed descriptively using the Miles and Huberman steps. The results show that students' difficulties in solving probabilistic problems fall into three categories. The first category relates to difficulties in understanding the probabilistic problem. The second relates to difficulties in choosing and using appropriate strategies for solving the problem. The third concerns difficulties with the computational process in solving the problem. The results suggest that students are not yet able to use their knowledge and abilities to respond to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that optimizes students' probabilistic thinking ability.

  18. Probabilistic Analysis of Structural Member from Recycled Aggregate Concrete

    Science.gov (United States)

    Broukalová, I.; Šeps, K.

    2017-09-01

    The paper addresses the topic of sustainable building, specifically the recycling of waste rubble concrete from demolition. Considering the demands of maximising recycled aggregate use and minimising cement consumption, a composite made from recycled concrete aggregate was proposed. The objective of the presented investigations was to verify the feasibility of the recycled aggregate cement-based fibre-reinforced composite in a structural member. The reliability of a wall made from the recycled aggregate fibre-reinforced composite was assessed in a probabilistic analysis of the load-bearing capacity of the wall. The applicability of recycled aggregate fibre-reinforced concrete in structural applications was demonstrated. The outcomes highlight the issue of the high scatter of the material parameters of recycled aggregate concretes.

  19. Contribution of operating feedback to probabilistic safety studies

    International Nuclear Information System (INIS)

    Guio, J.M. de; Lannoy, A.

    1992-03-01

    This paper presents the method used for PWR unit operation feedback analysis and its contribution to probabilistic safety studies. The targets were as follows: - use of failure data banks to assess reliability parameters, - use of event data banks to identify and quantify main system initiating events, - determination of a standard operating profile. These studies, performed in the context of nuclear power plant safety programs, prove useful not only to safety engineers but also to equipment experts, designers, operators and maintenance specialists. They constitute basic data for studies in all these areas or the departure point for new investigations. (authors). 3 figs., 3 tabs., 3 refs

  20. Probabilistic analyses of failure in reactor coolant piping

    International Nuclear Information System (INIS)

    Holman, G.S.

    1984-01-01

    LLNL is performing probabilistic reliability analyses of PWR and BWR reactor coolant piping for the NRC Office of Nuclear Regulatory Research. Specifically, LLNL is estimating the probability of a double-ended guillotine break (DEGB) in the reactor coolant loop piping in PWR plants, and in the main steam, feedwater, and recirculation piping of BWR plants. In estimating the probability of DEGB, LLNL considers two causes of pipe break: pipe fracture due to the growth of cracks at welded joints (direct DEGB), and pipe rupture indirectly caused by the seismically-induced failure of critical supports or equipment (indirect DEGB)

  1. Systems reliability/structural reliability

    International Nuclear Information System (INIS)

    Green, A.E.

    1980-01-01

    The question of reliability technology using quantified techniques is considered for systems and structures. Systems reliability analysis has progressed to a viable and proven methodology whereas this has yet to be fully achieved for large scale structures. Structural loading variants over the lifetime of the plant are considered to be more difficult to analyse than for systems, even though a relatively crude model may be a necessary starting point. Various reliability characteristics and environmental conditions are considered which enter this problem. The rare event situation is briefly mentioned together with aspects of proof testing and normal and upset loading conditions. (orig.)

  2. Deterministic and Probabilistic Analysis of NPP Communication Bridge Resistance Due to Extreme Loads

    Directory of Open Access Journals (Sweden)

    Králik Juraj

    2014-12-01

    Full Text Available This paper presents experiences from the deterministic and probabilistic analysis of the reliability of a communication bridge structure's resistance to extreme loads - wind and earthquake. The efficiency of the bracing systems is considered using the example of the steel bridge between two NPP buildings. The advantages and disadvantages of the deterministic and probabilistic analyses of the structure's resistance are discussed. The advantages of utilizing the LHS method to analyze the safety and reliability of structures are presented

  3. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    Science.gov (United States)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  4. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    Science.gov (United States)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
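
    The PSAM records above describe the solution output as a cumulative probability of exceedance distribution (CDF) with confidence bounds. A minimal Monte Carlo sketch of that kind of output (not the PSAM code itself; the lognormal response distribution and all sample sizes are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical Monte Carlo samples of a response variable (e.g. maximum stress)
samples = rng.lognormal(mean=5.0, sigma=0.2, size=2000)

def exceedance_cdf(samples, thresholds):
    """P(response > threshold) estimated from Monte Carlo samples."""
    s = np.sort(samples)
    # fraction of samples at or below each threshold, complemented
    return 1.0 - np.searchsorted(s, thresholds, side="right") / s.size

thresholds = np.linspace(samples.min(), samples.max(), 50)
p_exceed = exceedance_cdf(samples, thresholds)

# Bootstrap 95% confidence bounds on the exceedance probabilities
boot = np.array([
    exceedance_cdf(rng.choice(samples, samples.size, replace=True), thresholds)
    for _ in range(200)
])
lower, upper = np.percentile(boot, [2.5, 97.5], axis=0)
```

The bootstrap bounds here only quantify sampling uncertainty of the Monte Carlo estimate itself; the confidence bounds described in the abstract additionally reflect uncertainty in the stochastic input models.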

  5. Concurrent Probabilistic Simulation of High Temperature Composite Structural Response

    Science.gov (United States)

    Abdi, Frank

    1996-01-01

    A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software 'GENOA' is dedicated to parallel and high speed analysis to perform probabilistic evaluation of high temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives which were achieved in performing the development were: (1) Utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of structure, material and processing of high temperature composites affordable; (2) Computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) Implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism, and increasing convergence rates through high- and low-level processor assignment; (4) Creating the framework for a Portable Parallel architecture for machine independent Multi Instruction Multi Data (MIMD), Single Instruction Multi Data (SIMD), hybrid and distributed workstation types of computers; and (5) Market evaluation. The results of the Phase-2 effort provide a good basis for continuation and warrant a Phase-3 government and industry partnership.

  6. Probabilistic forecasting for extreme NO2 pollution episodes

    International Nuclear Information System (INIS)

    Aznarte, José L.

    2017-01-01

    In this study, we investigate the convenience of quantile regression to predict extreme concentrations of NO2. Contrary to the usual point-forecasting, where a single value is forecast for each horizon, probabilistic forecasting through quantile regression allows for the prediction of the full probability distribution, which in turn allows building models specifically fit for the tails of this distribution. Using data from the city of Madrid, including NO2 concentrations as well as meteorological measures, we build models that predict extreme NO2 concentrations, outperforming point-forecasting alternatives, and we prove that the predictions are accurate, reliable and sharp. Besides, we study the relative importance of the independent variables involved, and show how the important variables for the median quantile are different from those important for the upper quantiles. Furthermore, we present a method to compute the probability of exceedance of thresholds, which is a simple and comprehensible manner to present probabilistic forecasts, maximizing their usefulness. - Highlights: • A new probabilistic forecasting system is presented to predict NO2 concentrations. • While predicting the full distribution, it also outperforms other point-forecasting models. • Forecasts show good properties and peak concentrations are properly predicted. • It forecasts the probability of exceedance of thresholds, key to decision makers. • Relative forecasting importance of the variables is obtained as a by-product.
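
    The probability-of-exceedance computation mentioned in the abstract can be sketched by interpolating the predictive CDF implied by a set of forecast quantiles (the quantile levels, values and threshold below are illustrative, not the paper's data):

```python
import numpy as np

# Hypothetical predicted quantiles for NO2 concentration (µg/m³) at one horizon
quantile_levels = np.array([0.05, 0.25, 0.5, 0.75, 0.95, 0.99])
predicted = np.array([18.0, 31.0, 44.0, 61.0, 95.0, 130.0])

def prob_exceedance(threshold, levels, values):
    """P(concentration > threshold), interpolating the predictive CDF
    defined by (value, level) pairs; clipped outside the known quantiles."""
    cdf = np.interp(threshold, values, levels, left=0.0, right=1.0)
    return 1.0 - cdf

p = prob_exceedance(100.0, quantile_levels, predicted)
```

For thresholds beyond the outermost predicted quantiles the estimate is clipped to 0 or 1, so forecasting very extreme tails requires either more extreme quantiles or a parametric tail model.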

  7. Probabilistic Flood Defence Assessment Tools

    Directory of Open Access Journals (Sweden)

    Slomp Robert

    2016-01-01

    institutions managing the flood defences, and not by just a small number of experts in probabilistic assessment. Therefore, data management and use of software are main issues that have been covered in courses and training in 2016 and 2017. All in all, this is the largest change in the assessment of Dutch flood defences since 1996. In 1996 probabilistic techniques were first introduced to determine hydraulic boundary conditions (water levels and waves (wave height, wave period and direction)) for different return periods. To simplify the process, the assessment continues to consist of a three-step approach, moving from simple decision rules, to the methods for semi-probabilistic assessment, and finally to a fully probabilistic analysis to compare the strength of flood defences with the hydraulic loads. The formal assessment results are thus mainly based on the fully probabilistic analysis and the ultimate limit state of the strength of a flood defence. For complex flood defences, additional models and software were developed. The current Hydra software suite (for policy analysis, formal flood defence assessment and design) will be replaced by the model Ringtoets. New stand-alone software has been developed for revetments, geotechnical analysis and slope stability of the foreshore. Design software and policy analysis software, including the Delta model, will be updated in 2018. A fully probabilistic method results in more precise assessments and more transparency in the process of assessment and reconstruction of flood defences. This is of increasing importance, as large-scale infrastructural projects in a highly urbanized environment are increasingly subject to political and societal pressure to add additional features. For this reason, it is of increasing importance to be able to determine which new feature really adds to flood protection, to quantify how much it adds to the level of flood protection and to evaluate if it is really worthwhile.
Please note: The Netherlands

  8. Probabilistic forward model for electroencephalography source analysis

    International Nuclear Information System (INIS)

    Plis, Sergey M; George, John S; Jun, Sung C; Ranken, Doug M; Volegov, Petr L; Schmidt, David M

    2007-01-01

    Source localization by electroencephalography (EEG) requires an accurate model of head geometry and tissue conductivity. The estimation of source time courses from EEG or from EEG in conjunction with magnetoencephalography (MEG) requires a forward model consistent with true activity for the best outcome. Although MRI provides an excellent description of soft tissue anatomy, a high resolution model of the skull (the dominant resistive component of the head) requires CT, which is not justified for routine physiological studies. Although a number of techniques have been employed to estimate tissue conductivity, no present techniques provide the noninvasive 3D tomographic mapping of conductivity that would be desirable. We introduce a formalism for probabilistic forward modeling that allows the propagation of uncertainties in model parameters into possible errors in source localization. We consider uncertainties in the conductivity profile of the skull, but the approach is general and can be extended to other kinds of uncertainties in the forward model. We and others have previously suggested the possibility of extracting conductivity of the skull from measured electroencephalography data by simultaneously optimizing over dipole parameters and the conductivity values required by the forward model. Using Cramer-Rao bounds, we demonstrate that this approach does not improve localization results nor does it produce reliable conductivity estimates. We conclude that the conductivity of the skull has to be either accurately measured by an independent technique, or that the uncertainties in the conductivity values should be reflected in uncertainty in the source location estimates

  9. A probabilistic approach to controlling crevice chemistry

    International Nuclear Information System (INIS)

    Millett, P.J.; Brobst, G.E.; Riddle, J.

    1995-01-01

    It has been generally accepted that the corrosion of steam generator tubing could be reduced if the local pH in regions where impurities concentrate could be controlled. The practice of molar ratio control is based on this assumption. Unfortunately, due to the complexity of the crevice concentration process, efforts to model the crevice chemistry based on bulk water conditions are quite uncertain. In-situ monitoring of the crevice chemistry is desirable, but may not be achievable in the near future. The current methodology for assessing the crevice chemistry is to monitor the hideout return chemistry when the plant shuts down. This approach also has its shortcomings, but may provide sufficient data to evaluate whether the crevice pH is in a desirable range. In this paper, an approach to controlling the crevice chemistry based on a target molar ratio indicator is introduced. The molar ratio indicator is based on what is believed to be the most reliable hideout return data. Probabilistic arguments are then used to show that the crevice pH will most likely be in a desirable range when the target molar ratio is achieved

  10. Aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Jordan Cizelj, R.; Kozuh, M.

    1995-01-01

    Aging is a phenomenon that influences the unavailability of all components of the plant. The influence of aging on Probabilistic Safety Assessment calculations was estimated for the Electrical Power Supply System. The average increase of system unavailability due to aging of system components was estimated, and components were prioritized regarding their influence on the change of system unavailability and the relative increase of their unavailability due to aging. After the analysis of some numerical results, a recommendation for detailed research of aging phenomena and their influence on system availability is given. (author)
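
    The qualitative effect described here can be sketched with a simple aging model for a periodically tested standby component, where the failure rate grows linearly in time (the model form and every parameter value are illustrative assumptions, not the study's data):

```python
# Sketch: effect of aging on the average unavailability of a periodically
# tested standby component. Assumes a linear aging model
# lambda(t) = lam0 * (1 + alpha * t) and the small-probability approximation
# q(t) ≈ lambda(t) * (time since last test).

def average_unavailability(lam0, alpha, test_interval, mission_time):
    """Time-averaged unavailability over the mission, numerically integrated."""
    n_steps = 100_000
    dt = mission_time / n_steps
    total = 0.0
    for i in range(n_steps):
        t = (i + 0.5) * dt
        tau = t % test_interval            # time since last test
        lam = lam0 * (1.0 + alpha * t)     # aged failure rate
        total += lam * tau * dt            # q(t) ≈ lambda(t) * tau
    return total / mission_time

# per-hour failure rate, monthly test interval, one-year mission
q_no_aging = average_unavailability(1e-5, 0.0, 720.0, 8760.0)
q_aging = average_unavailability(1e-5, 1e-4, 720.0, 8760.0)
```

With these illustrative numbers the aging term raises the time-averaged unavailability by roughly 40% over one year, which is the kind of increase the component prioritization in the study is meant to capture.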

  11. Probabilistic assessment of SGTR management

    International Nuclear Information System (INIS)

    Champ, M.; Cornille, Y.; Lanore, J.M.

    1989-04-01

    In case of a steam generator tube rupture (SGTR) event in France, the mitigation of the accident relies on operator intervention, by applying a specific accidental procedure. A detailed probabilistic analysis has been conducted, which required the assessment of the failure probability of the operator actions; for that purpose it was necessary to estimate the time available for the operator to apply the adequate procedure for various sequences. The results indicate that by taking into account the delays and the existence of adequate accidental procedures, the risk is reduced to a reasonably low level

  12. Probabilistic accident sequence recovery analysis

    International Nuclear Information System (INIS)

    Stutzke, Martin A.; Cooper, Susan E.

    2004-01-01

    Recovery analysis is a method that considers alternative strategies for preventing accidents in nuclear power plants during probabilistic risk assessment (PRA). Consideration of possible recovery actions in PRAs has been controversial, and there seems to be a widely held belief among PRA practitioners, utility staff, plant operators, and regulators that the results of recovery analysis should be skeptically viewed. This paper provides a framework for discussing recovery strategies, thus lending credibility to the process and enhancing regulatory acceptance of PRA results and conclusions. (author)

  13. Probabilistic risk assessment: Number 219

    International Nuclear Information System (INIS)

    Bari, R.A.

    1985-01-01

    This report describes a methodology for analyzing the safety of nuclear power plants. A historical overview of plants in the US is provided, and past, present, and future nuclear safety and risk assessment are discussed. A primer on nuclear power plants is provided with a discussion of pressurized water reactors (PWR) and boiling water reactors (BWR) and their operation and containment. Probabilistic Risk Assessment (PRA), utilizing both event-tree and fault-tree analysis, is discussed as a tool in reactor safety, decision making, and communications. (FI)

  14. Axiomatisation of fully probabilistic design

    Czech Academy of Sciences Publication Activity Database

    Kárný, Miroslav; Kroupa, Tomáš

    2012-01-01

    Roč. 186, č. 1 (2012), s. 105-113 ISSN 0020-0255 R&D Projects: GA MŠk(CZ) 2C06001; GA ČR GA102/08/0567 Institutional research plan: CEZ:AV0Z10750506 Keywords : Bayesian decision making * Fully probabilistic design * Kullback–Leibler divergence * Unified decision making Subject RIV: BB - Applied Statistics, Operational Research Impact factor: 3.643, year: 2012 http://library.utia.cas.cz/separaty/2011/AS/karny-0367271.pdf

  15. Probabilistic risk assessment, Volume I

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    This book contains 158 papers presented at the International Topical Meeting on Probabilistic Risk Assessment held by the American Nuclear Society (ANS) and the European Nuclear Society (ENS) in Port Chester, New York in 1981. The meeting was the second in a series of three. The main focus of the meeting was on the safety of light water reactors. The papers discuss safety goals and risk assessment. Quantitative safety goals, risk assessment in non-nuclear technologies, and operational experience and data base are also covered. Included is an address by Dr. Chauncey Starr

  16. Probabilistic safety analysis using microcomputer

    International Nuclear Information System (INIS)

    Futuro Filho, F.L.F.; Mendes, J.E.S.; Santos, M.J.P. dos

    1990-01-01

    The main steps in the execution of a Probabilistic Safety Assessment (PSA) are presented in this report, such as the study of the system description, the construction of event trees and fault trees, and the calculation of the overall unavailability of the systems. The use of microcomputers in performing some of these tasks is also presented, highlighting the main characteristics of software able to perform the job adequately. A sample case of fault tree construction and calculation is presented, using the PSAPACK software, distributed by the IAEA (International Atomic Energy Agency) for training purposes. (author)
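
    For independent basic events, the fault-tree unavailability calculation that tools like PSAPACK automate reduces to simple gate algebra; a minimal sketch (hypothetical system and illustrative numbers, not PSAPACK itself):

```python
# OR gate:  1 - prod(1 - q_i)    AND gate: prod(q_i)
from functools import reduce

def or_gate(*probs):
    """Unavailability of an OR gate over independent inputs."""
    return 1.0 - reduce(lambda acc, q: acc * (1.0 - q), probs, 1.0)

def and_gate(*probs):
    """Unavailability of an AND gate over independent inputs."""
    return reduce(lambda acc, q: acc * q, probs, 1.0)

# Hypothetical top event: power supply fails if both diesels fail OR grid fails
q_diesel_a, q_diesel_b, q_grid = 2e-2, 2e-2, 1e-3
q_top = or_gate(and_gate(q_diesel_a, q_diesel_b), q_grid)
```

Real PSA codes work instead with minimal cut sets and rare-event approximations, but the gate-level arithmetic is the same.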

  17. Reliability technology and nuclear power

    International Nuclear Information System (INIS)

    Garrick, B.J.; Kaplan, S.

    1976-01-01

    This paper reviews some of the history and status of nuclear reliability and the evolution of this subject from art towards science. It shows that probability theory is the appropriate and essential mathematical language of this subject. The authors emphasize that it is more useful to view probability not as a 'frequency', i.e., not as the result of a statistical experiment, but rather as a measure of a state of confidence or a state of knowledge. They also show that the probabilistic, quantitative approach has a considerable history of application in the electric power industry in the area of power system planning. Finally, the authors show that the decision theory notion of utility provides a point of view from which risks, benefits, safety, and reliability can be viewed in a unified way, thus facilitating understanding, comparison, and communication. 29 refs
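
    The "probability as a state of knowledge" view that the authors advocate is naturally expressed as Bayesian updating; a minimal conjugate sketch (the prior and the test record are illustrative assumptions):

```python
# A Beta prior on a component's per-demand failure probability is updated
# with observed evidence; the posterior is the new state of knowledge.

def beta_update(alpha, beta, failures, demands):
    """Conjugate Bayesian update of a Beta(alpha, beta) prior."""
    return alpha + failures, beta + (demands - failures)

def beta_mean(alpha, beta):
    """Mean of a Beta(alpha, beta) distribution."""
    return alpha / (alpha + beta)

# Diffuse (Jeffreys) prior, then 2 failures observed in 1000 demands
a0, b0 = 0.5, 0.5
a1, b1 = beta_update(a0, b0, 2, 1000)
posterior_mean = beta_mean(a1, b1)
```

Each new batch of evidence simply shifts the two Beta parameters, so the current state of knowledge is always summarized by two numbers rather than a single frequentist point estimate.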

  18. Probabilistic aspects of risk analyses for hazardous facilities

    International Nuclear Information System (INIS)

    Morici, A.; Valeri, A.; Zaffiro, C.

    1989-01-01

    The work described in the paper discusses the aspects of risk analysis concerned with the use of probabilistic methodology, in order to see how this approach may affect the risk management of industrial hazardous facilities. To this purpose, reference is made to the Probabilistic Risk Assessment (PRA) of nuclear power plants. The paper points out that even though the public aversion towards nuclear risks is still far from being removed, the probabilistic approach may provide sound support to the decision making and authorization process for any industrial activity implying risk for the environment and public health. It is the opinion of the authors that the probabilistic techniques have been developed to a great level of sophistication in the nuclear industry, which has provided much more experience in this field than other industries. For some particular areas of nuclear applications, such as plant reliability and plant response to accidents, these techniques have reached a sufficient level of maturity, and so some results have been usefully taken as a measure of the safety level of the plant itself. The use of some limited safety goals is regarded as a relevant item of the nuclear licensing process. The paper claims that it is now time for these methods to be applied with equal success to other hazardous facilities, and makes some comparative considerations on the differences between these plants and nuclear power plants in order to understand the effect of these differences on the PRA results and on the use one intends to make of them. (author)

  19. Study of QoS control and reliable routing method for utility communication network. Application of differentiated service to the network and alternative route establishment by the IP routing protocol; Denryokuyo IP network no QoS seigyo to shinraisei kakuho no hoho. DiffServ ni yoru QoS seigyo no koka to IP ni yoru fuku root ka no kento

    Energy Technology Data Exchange (ETDEWEB)

    Oba, E.

    2000-05-01

    QoS control methods which satisfy utility communication network requirements, and alternative route establishment methods for sustaining communication during a failure, are studied. The applicability of DiffServ (Differentiated Services), one of the most promising QoS control methods for IP networks and one being studied energetically in an IETF WG, is examined, and it is found that most applications used in the utility communication network, except for relaying system information, could be accommodated in a DiffServ network. An example of the mapping of the utility communication applications to DiffServ PHBs (Per Hop Behaviors) is shown in this paper. Regarding alternative routes, the usual IP routing protocols cannot establish alternative routes which have no common links or nodes in their paths to a destination. IP address duplication, with some modification of the routing protocol, enables such alternative route establishment. MPLS, the distance vector algorithm and the link state algorithm are evaluated qualitatively; as a result, we found MPLS is a promising way to establish such routes. Quantitative evaluation will be future work. (author)
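
    An application-to-PHB mapping like the one the paper describes can be sketched as a simple table. The PHB names and DSCP code points follow RFCs 2474/2597/3246, but the application assignments below are illustrative assumptions, not the paper's table; relaying (protection) system traffic is deliberately omitted because the paper finds it cannot be accommodated by DiffServ:

```python
# Standard PHB names mapped to their DSCP code points (RFC 2474/2597/3246)
PHB_DSCP = {"EF": 46, "AF41": 34, "AF31": 26, "AF21": 18, "BE": 0}

# Hypothetical utility-communication applications mapped to PHBs
APP_TO_PHB = {
    "voice-dispatch": "EF",       # delay/jitter sensitive
    "scada-telemetry": "AF41",    # high-priority assured forwarding
    "metering-data": "AF21",      # tolerant of moderate delay
    "office-email": "BE",         # best effort
}

def dscp_for(app: str) -> int:
    """Return the DSCP code point to mark packets of a named application."""
    return PHB_DSCP[APP_TO_PHB[app]]
```
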

  20. Human reliability

    International Nuclear Information System (INIS)

    Bubb, H.

    1992-01-01

    This book resulted from the activity of Task Force 4.2 - 'Human Reliability'. This group was established on February 27th, 1986, at the plenary meeting of the Technical Reliability Committee of VDI, within the framework of the joint committee of VDI on industrial systems technology - GIS. It is composed of representatives of industry, representatives of research institutes, of technical control boards and universities, whose job it is to study how man fits into the technical side of the world of work and to optimize this interaction. In a total of 17 sessions, information from the part of ergonomy dealing with human reliability in using technical systems at work was exchanged, and different methods for its evaluation were examined and analyzed. The outcome of this work was systematized and compiled in this book. (orig.) [de

  1. Microelectronics Reliability

    Science.gov (United States)

    2017-01-17

    [Figure captions: inverters connected in a chain; typical graph showing frequency versus square root of ...] The report describes developing an experimental reliability estimating methodology that could both illuminate the lifetime reliability of advanced devices, circuits and ... or FIT of the device. In other words, an accurate estimate of the device lifetime was found, and thus the reliability can be conveniently ...

  2. Reliability analysis under epistemic uncertainty

    International Nuclear Information System (INIS)

    Nannapaneni, Saideep; Mahadevan, Sankaran

    2016-01-01

    This paper proposes a probabilistic framework to include both aleatory and epistemic uncertainty within model-based reliability estimation of engineering systems for individual limit states. Epistemic uncertainty is considered due to both data and model sources. Sparse point and/or interval data regarding the input random variables leads to uncertainty regarding their distribution types, distribution parameters, and correlations; this statistical uncertainty is included in the reliability analysis through a combination of likelihood-based representation, Bayesian hypothesis testing, and Bayesian model averaging techniques. Model errors, which include numerical solution errors and model form errors, are quantified through Gaussian process models and included in the reliability analysis. The probability integral transform is used to develop an auxiliary variable approach that facilitates a single-level representation of both aleatory and epistemic uncertainty. This strategy results in an efficient single-loop implementation of Monte Carlo simulation (MCS) and FORM/SORM techniques for reliability estimation under both aleatory and epistemic uncertainty. Two engineering examples are used to demonstrate the proposed methodology. - Highlights: • Epistemic uncertainty due to data and model included in reliability analysis. • A novel FORM-based approach proposed to include aleatory and epistemic uncertainty. • A single-loop Monte Carlo approach proposed to include both types of uncertainties. • Two engineering examples used for illustration.
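
    The paper's auxiliary-variable construction makes this a single loop; the underlying idea can still be sketched with a plain nested (double-loop) Monte Carlo, where sampling of epistemically uncertain distribution parameters wraps aleatory sampling (the limit state, distributions and all numbers below are illustrative assumptions, not the paper's examples):

```python
import numpy as np

rng = np.random.default_rng(42)

# Limit state g = capacity - demand < 0 defines failure. The mean of the
# demand distribution is itself epistemically uncertain.

def failure_probability(demand_mean, n_aleatory=20_000):
    """Inner (aleatory) loop: crude Monte Carlo estimate of P(failure)."""
    capacity = rng.normal(10.0, 1.0, n_aleatory)       # aleatory capacity
    demand = rng.normal(demand_mean, 1.5, n_aleatory)  # aleatory demand
    return np.mean(capacity - demand < 0.0)

# Outer (epistemic) loop: the demand mean is only known imprecisely
demand_means = rng.normal(6.0, 0.5, 200)
pf_samples = np.array([failure_probability(m) for m in demand_means])

pf_median = np.median(pf_samples)
pf_95 = np.percentile(pf_samples, 95)  # epistemic upper bound on P(failure)
```

The spread of `pf_samples` is purely epistemic: more data on the demand mean would shrink it, while the aleatory scatter inside each inner loop is irreducible. The paper's single-loop formulation avoids the cost of this nesting.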

  3. Probabilistic Capacity Assessment of Lattice Transmission Towers under Strong Wind

    Directory of Open Access Journals (Sweden)

    Wei Zhang

    2015-10-01

    Full Text Available Serving as a key component of the most important lifeline infrastructure system, transmission towers are vulnerable to multiple natural hazards, including strong wind, and their failure could pose severe threats to power system security, with possible blackouts under extreme weather conditions such as hurricanes, derechos, or winter storms. For the security and resiliency of the power system, it is important to ensure structural safety with enough capacity for all possible failure modes, such as structural stability. This study develops a probabilistic capacity assessment approach for transmission towers under strong wind loads. Because of the complicated structural details of lattice transmission towers, wind tunnel experiments are carried out to understand the complex interactions between wind and the lattice sections of the tower, and drag coefficients and dynamic amplification factors for different panels of the tower are obtained. The wind profile is generated and the wind time histories are simulated as a summation of time-varying mean and fluctuating components. The capacity curve for the transmission tower is obtained from the incremental dynamic analysis (IDA) method. To account for the stochastic nature of the wind field, probabilistic capacity curves are generated by implementing IDA for different wind yaw angles and different randomly generated wind speed time histories. After building the limit state functions based on the maximum allowable drift-to-height ratio, the probabilities of failure are obtained from the meteorological data at a given site. As transmission towers serve as the key nodes of the power network, the probabilistic capacity curves can be incorporated into performance-based design of the power transmission network.

  4. Reliability Analysis of Adhesive Bonded Scarf Joints

    DEFF Research Database (Denmark)

    Kimiaeifar, Amin; Toft, Henrik Stensgaard; Lund, Erik

    2012-01-01

    A probabilistic model for the reliability analysis of adhesive bonded scarfed lap joints subjected to static loading is developed. It is representative for the main laminate in a wind turbine blade subjected to flapwise bending. The structural analysis is based on a three dimensional (3D) finite element analysis (FEA). For the reliability analysis a design equation is considered which is related to a deterministic code-based design equation where reliability is secured by partial safety factors together with characteristic values for the material properties and loads. The failure criteria are formulated using a von Mises, a modified von Mises and a maximum stress failure criterion. The reliability level is estimated for the scarfed lap joint and compared with the target reliability level implicitly used in the wind turbine standard IEC 61400-1. A convergence study is performed to validate …

  5. Structural Reliability of Wind Turbine Blades

    DEFF Research Database (Denmark)

    Dimitrov, Nikolay Krasimirov

    … turbine blades. The main purpose is to draw a clear picture of how reliability-based design of wind turbines can be done in practice. The objectives of the thesis are to create methodologies for efficient reliability assessment of composite materials and composite wind turbine blades, and to map the uncertainties in the processes, materials and external conditions that have an effect on the health of a composite structure. The study considers all stages in a reliability analysis, from defining models of structural components to obtaining the reliability index and calibration of partial safety factors by developing new models and standards or carrying out tests. The following aspects are covered in detail: ⋅ The probabilistic aspects of ultimate strength of composite laminates are addressed. Laminated plates are considered as a general structural reliability system where each layer in a laminate is a separate …

  6. Compression of Probabilistic XML Documents

    Science.gov (United States)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

    Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such UDBMSs can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMSs, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push-down mechanism that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained by combining a PXML-specific technique with a rather simple generic DAG-compression technique.

  7. Living probabilistic safety assessment (LPSA)

    International Nuclear Information System (INIS)

    1999-08-01

    Over the past few years many nuclear power plant organizations have performed probabilistic safety assessments (PSAs) to identify and understand key plant vulnerabilities. As a result of the availability of these PSA studies, there is a desire to use them to enhance plant safety and to operate the nuclear stations in the most efficient manner. PSA is an effective tool for this purpose as it assists plant management to target resources where the largest benefit to plant safety can be obtained. However, any PSA which is to be used in this way must have a credible and defensible basis. Thus, it is very important to have a high-quality 'living PSA' accepted by the plant and the regulator. With this background in mind, the IAEA has prepared this report on Living Probabilistic Safety Assessment (LPSA), which addresses the updating, documentation, quality assurance, and management and organizational requirements for LPSA. Deficiencies in the areas addressed in this report would seriously reduce the adequacy of the LPSA as a tool to support decision making at NPPs. This report was reviewed by a working group during a Technical Committee Meeting on PSA Applications to Improve NPP Safety held in Madrid, Spain, from 23 to 27 February 1998.

  8. Software for Probabilistic Risk Reduction

    Science.gov (United States)

    Hensley, Scott; Michel, Thierry; Madsen, Soren; Chapin, Elaine; Rodriguez, Ernesto

    2004-01-01

    A computer program implements a methodology, denoted probabilistic risk reduction, that is intended to aid in planning the development of complex software and/or hardware systems. This methodology integrates two complementary prior methodologies: (1) that of probabilistic risk assessment and (2) a risk-based planning methodology, implemented in a prior computer program known as Defect Detection and Prevention (DDP), in which multiple requirements and the beneficial effects of risk-mitigation actions are taken into account. The present methodology and the software are able to accommodate both process knowledge (notably of the efficacy of development practices) and product knowledge (notably of the logical structure of a system, the development of which one seeks to plan). Estimates of the costs and benefits of a planned development can be derived. Functional and non-functional aspects of software can be taken into account, and trades made among them. It becomes possible to optimize the planning process in the sense that it becomes possible to select the best suite of process steps and design choices to maximize the expectation of success while remaining within budget.

  9. Is Probabilistic Evidence a Source of Knowledge?

    Science.gov (United States)

    Friedman, Ori; Turri, John

    2015-01-01

    We report a series of experiments examining whether people ascribe knowledge for true beliefs based on probabilistic evidence. Participants were less likely to ascribe knowledge for beliefs based on probabilistic evidence than for beliefs based on perceptual evidence (Experiments 1 and 2A) or testimony providing causal information (Experiment 2B).…

  10. Probabilistic Cue Combination: Less Is More

    Science.gov (United States)

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  11. Multiobjective optimal allocation problem with probabilistic non ...

    African Journals Online (AJOL)

    This paper considers the optimum compromise allocation in multivariate stratified sampling with non-linear objective function and probabilistic non-linear cost constraint. The probabilistic non-linear cost constraint is converted into equivalent deterministic one by using Chance Constrained programming. A numerical ...

  12. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order

  13. Probabilistic Geoacoustic Inversion in Complex Environments

    Science.gov (United States)

    2015-09-30

    Probabilistic Geoacoustic Inversion in Complex Environments Jan Dettmer School of Earth and Ocean Sciences, University of Victoria, Victoria BC...long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must...project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is

  14. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    2014-02-14

    Feb 14, 2014 ... Application of probabilistic precipitation forecasts from a deterministic model ... aim of this paper is to investigate the increase in the lead-time of flash flood warnings of the SAFFG using probabilistic precipitation forecasts ... The procedure is applied to a real flash flood event and the ensemble-based.

  15. Why do probabilistic finite element analysis ?

    CERN Document Server

    Thacker, Ben H

    2008-01-01

    The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.

  16. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Aldini, A.; Baier, C.

    2008-01-01

    The notion of branching bisimulation for the alternating model of probabilistic systems is not a congruence with respect to parallel composition. In this paper we first define another branching bisimulation in the more general model allowing consecutive probabilistic transitions, and we prove that

  17. Probabilistic Reversible Automata and Quantum Automata

    OpenAIRE

    Golovkins, Marats; Kravtsev, Maksim

    2002-01-01

    To study relationship between quantum finite automata and probabilistic finite automata, we introduce a notion of probabilistic reversible automata (PRA, or doubly stochastic automata). We find that there is a strong relationship between different possible models of PRA and corresponding models of quantum finite automata. We also propose a classification of reversible finite 1-way automata.

  18. Bisimulations meet PCTL equivalences for probabilistic automata

    DEFF Research Database (Denmark)

    Song, Lei; Zhang, Lijun; Godskesen, Jens Chr.

    2013-01-01

    Probabilistic automata (PAs) have been successfully applied in formal verification of concurrent and stochastic systems. Efficient model checking algorithms have been studied, where the most often used logics for expressing properties are based on probabilistic computation tree logic (PCTL) and its...

  19. Error Discounting in Probabilistic Category Learning

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  20. The importance of probabilistic evaluations in connection with risk analyses according to technical safety laws

    International Nuclear Information System (INIS)

    Mathiak, E.

    1984-01-01

    The nuclear energy sector exemplifies the essential importance of the practical application of probabilistic evaluations (e.g. probabilistic reliability analyses) in the legal risk assessment of technical systems and installations. The study makes use of a triad risk analysis and tries to reconcile the natural-science and legal points of view. Without changing the definitions of 'risk' and 'hazard' in their legal sense, the publication discusses their reconciliation with the laws of natural science, and their interpretation and application in view of the latter. (HSCH) [de

  1. Reliability and Failure in NASA Missions: Blunders, Normal Accidents, High Reliability, Bad Luck

    Science.gov (United States)

    Jones, Harry W.

    2015-01-01

    NASA emphasizes crew safety and system reliability, but several unfortunate failures have occurred. The Apollo 1 fire was not anticipated. After that tragedy, the Apollo program gave much more attention to safety. The Challenger accident revealed that NASA had neglected safety and that management underestimated the high risk of the shuttle. Probabilistic Risk Assessment was adopted to provide more accurate failure probabilities for the shuttle and other missions. NASA's "faster, better, cheaper" initiative and government procurement reform led to the deliberate dismantling of traditional reliability engineering. The Columbia tragedy and Mars mission failures followed. Failures can be attributed to blunders, normal accidents, or bad luck. Achieving high reliability is difficult but possible.

  2. Distributed collaborative probabilistic design of multi-failure structure with fluid-structure interaction using fuzzy neural network of regression

    Science.gov (United States)

    Song, Lu-Kai; Wen, Jie; Fei, Cheng-Wei; Bai, Guang-Chen

    2018-05-01

    To improve the computing efficiency and precision of probabilistic design for multi-failure structures, a distributed collaborative probabilistic design method based on a fuzzy neural network of regression (termed DCFRM) is proposed, integrating the distributed collaborative response surface method with a fuzzy neural network regression model. The mathematical model of DCFRM is established and the probabilistic design idea behind DCFRM is introduced. The probabilistic analysis of a turbine blisk involving multiple failure modes (deformation failure, stress failure and strain failure) was investigated with the proposed method, considering fluid-structure interaction. The distribution characteristics, reliability degree, and sensitivity degree of each failure mode and of the overall failure mode on the turbine blisk are obtained, which provides a useful reference for improving the performance and reliability of aeroengines. A comparison of methods shows that DCFRM improves the computing efficiency of probabilistic analysis for multi-failure structures while keeping acceptable computational precision. Moreover, the proposed method offers useful insight for reliability-based design optimization of multi-failure structures and thereby enriches the theory and method of mechanical reliability design.

  3. Consideration of aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Titina, B.; Cepin, M.

    2007-01-01

    Probabilistic safety assessment is a standardised tool for assessing the safety of nuclear power plants and a complement to the safety analyses. Standard probabilistic models of safety equipment assume a constant component failure rate. Ageing of systems, structures and components can theoretically be included in a new age-dependent probabilistic safety assessment, which in general makes the failure rate a function of age. New age-dependent probabilistic safety assessment models, which offer explicit calculation of the ageing effects, are developed. Several groups of components are considered, each requiring its own model: e.g. operating components and stand-by components. The developed component-level models are inserted into the probabilistic safety assessment models so that the ageing effects are evaluated for complete systems. The preliminary results show that the lack of data needed to consider ageing leads to highly uncertain models and consequently uncertain results. (author)
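The contrast between the standard constant-failure-rate assumption and an age-dependent failure rate can be illustrated with a small sketch. The linear ageing law and all numerical values below are assumptions chosen for illustration, not data from the study:

```python
import math

# Failure probability over mission time t from the cumulative hazard:
# F(t) = 1 - exp(-integral of lam(tau) dtau from 0 to t).

def failure_prob_constant(lam0, t):
    """Constant rate lam(t) = lam0: F(t) = 1 - exp(-lam0 * t)."""
    return 1.0 - math.exp(-lam0 * t)

def failure_prob_ageing(lam0, alpha, t):
    """Assumed linear ageing lam(t) = lam0 * (1 + alpha * t):
    F(t) = 1 - exp(-(lam0 * t + lam0 * alpha * t**2 / 2))."""
    cumulative_hazard = lam0 * t + lam0 * alpha * t * t / 2.0
    return 1.0 - math.exp(-cumulative_hazard)
```

For any positive ageing coefficient the age-dependent model yields a strictly higher failure probability than the constant-rate model, which is the effect the age-dependent PSA models make explicit at system level.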

  4. Uncertainty estimation in nuclear power plant probabilistic safety assessment

    International Nuclear Information System (INIS)

    Guarro, S.B.; Cummings, G.E.

    1989-01-01

    Probabilistic Risk Assessment (PRA) was introduced in the nuclear industry and the nuclear regulatory process in 1975 with the publication of the Reactor Safety Study by the U.S. Nuclear Regulatory Commission. Almost fifteen years later, the state-of-the-art in this field has been expanded and sharpened in many areas, and about thirty-five plant-specific PRAs (Probabilistic Risk Assessments) have been performed by the nuclear utility companies or by the U.S. Nuclear Regulatory Commission. Among the areas where the most evident progress has been made in PRA and PSA (Probabilistic Safety Assessment, as these studies are more commonly referred to in the international community outside the U.S.) is the development of a consistent framework for the identification of sources of uncertainty and the estimation of their magnitude as it impacts various risk measures. Techniques to propagate uncertainty in reliability data through the risk models and display its effect on the top-level risk estimates were developed in the early PRAs. The Seismic Safety Margin Research Program (SSMRP) study was the first major risk study to develop an approach to deal explicitly with uncertainty in risk estimates introduced not only by uncertainty in component reliability data, but by the incomplete state of knowledge of the assessor(s) with regard to basic phenomena that may trigger and drive a severe accident. More recently NUREG-1150, another major study of reactor risk sponsored by the NRC, has expanded risk uncertainty estimation and analysis into the realm of model uncertainty related to the relatively poorly known post-core-melt phenomena which determine the behavior of the molten core and of the reactor containment structures.

  5. Probabilistic finite elements for fatigue and fracture analysis

    Science.gov (United States)

    Belytschko, Ted; Liu, Wing Kam

    1993-04-01

    An overview of the probabilistic finite element method (PFEM) developed by the authors and their colleagues in recent years is presented. The primary focus is placed on the development of PFEM for both structural mechanics problems and fracture mechanics problems. The perturbation techniques are used as major tools for the analytical derivation. The following topics are covered: (1) representation and discretization of random fields; (2) development of PFEM for the general linear transient problem and nonlinear elasticity using Hu-Washizu variational principle; (3) computational aspects; (4) discussions of the application of PFEM to the reliability analysis of both brittle fracture and fatigue; and (5) a stochastic computational tool based on stochastic boundary element (SBEM). Results are obtained for the reliability index and corresponding probability of failure for: (1) fatigue crack growth; (2) defect geometry; (3) fatigue parameters; and (4) applied loads. These results show that initial defect is a critical parameter.

  6. Probabilistic risk assessment and its role in plant modifications

    International Nuclear Information System (INIS)

    Diederich, A.R.; McElroy, W.F.

    1986-01-01

    Electric utilities today have a tool available to improve management's ability to evaluate nuclear power plant modifications (MODs). Probabilistic Risk Assessment (PRA) is a tool of choice since it can be applied to a specific situation such as a MOD request review, bringing the perspectives of reliability, financial risk and consequences to the public, in addition to more rigid requirements like those associated with quality assurance or licensing criteria. The techniques used in the PRA process revolve around the creation and manipulation of fault trees and event trees, which are used to quantify the event sequences and the reliability of plant systems in a logical framework. It is through these methods that chains of sequences, or events, are understood. The degree to which plant systems are modelled in the PRA can vary depending on resources and purpose. Philadelphia Electric Company's PRA modelled ten (10) major systems, but this number may increase during the application and updating process.

  7. Probabilistic safety assessment in the chemical and nuclear industries

    CERN Document Server

    Fullwood, Ralph R

    2000-01-01

    Probabilistic Safety Analysis (PSA) determines the probability and consequences of accidents, hence, the risk. This subject concerns policy makers, regulators, designers, educators and engineers working to achieve maximum safety with operational efficiency. Risk is analyzed using methods for achieving reliability in the space program. The first major application was to the nuclear power industry, followed by applications to the chemical industry. It has also been applied to space, aviation, defense, ground, and water transportation. This book is unique in its treatment of chemical and nuclear risk. Problems are included at the end of many chapters, and answers are in the back of the book. Computer files are provided (via the internet), containing reliability data, a calculator that determines failure rate and uncertainty based on field experience, pipe break calculator, event tree calculator, FTAP and associated programs for fault tree analysis, and a units conversion code. It contains 540 references and many...

  8. Importance of properly treating human performance in probabilistic risk assessments

    International Nuclear Information System (INIS)

    Kukielka, C.A.; Butler, F.G.; Chaiko, M.A.

    1997-01-01

    A critical issue to consider when developing Advanced Reactor Systems (ARS) is the operators' ability to reliably execute Emergency Operating Procedures (EOPs) during accidents. A combined probabilistic and deterministic method for evaluating operator performance is outlined in this paper. Three questions are addressed: (1) does the operator understand the status of the plant? (2) does the operator know what to do? and (3) what are the odds of successful EOP execution? Deterministic methods are used to evaluate questions 1 and 2, and question 3 is addressed by statistical analysis. Simulator exercises are used to develop probability of response as a function of time curves for time limited operator actions. This method has been used to identify and resolve deficiencies in the plant operating procedures and the operator interface. An application is provided to the Anticipated Transient without Scram accident sequences. The results of Human Reliability Analysis are compared with the results of similar BWR analyses. 2 figs., 2 tabs

  9. Application of probabilistic safety assessment for Macedonian electric power system

    International Nuclear Information System (INIS)

    Kancev, D.; Causevski, A.; Cepin, M.; Volkanovski, A.

    2007-01-01

    Due to the complex and integrated nature of a power system, failures in any part of the system can cause interruptions, which range from inconveniencing a small number of local residents to a major and widespread catastrophic disruption of supply known as blackout. The objective of this paper is to show that the methods and tools of probabilistic safety assessment are applicable to the assessment and improvement of real power systems. The method used in this paper is based on fault tree analysis and is adapted for power system reliability analysis. A particular power system, the Macedonian power system, is the object of the analysis. The results show that the method is suitable for application to real systems. The reliability of the Macedonian power system, treated as a static system, is assessed. The components which can significantly impact the power system are identified and analysed in more detail. (author)
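The fault-tree style of analysis used in this kind of power system study can be illustrated with a deliberately tiny example. The gate structure and failure probabilities below are hypothetical, not taken from the Macedonian system model:

```python
# Toy fault tree for a top event "loss of power supply":
# the supply is lost if BOTH the grid connection AND the backup diesel
# fail (AND gate), OR if the common bus bar fails (OR gate).
# Minimal cut sets: {grid, diesel} and {bus}. Independence is assumed.

P_GRID = 1e-2      # grid connection unavailable (illustrative)
P_DIESEL = 5e-2    # diesel generator fails to start/run (illustrative)
P_BUS = 1e-4       # common bus bar failure (illustrative)

def top_event_probability():
    p_and = P_GRID * P_DIESEL                 # AND gate
    # OR gate via inclusion-exclusion (exact for two independent cut sets):
    return p_and + P_BUS - p_and * P_BUS
```

Even this miniature tree shows why cut-set inspection identifies the dominant contributors: here the {grid, diesel} cut set drives the result, so improving the diesel buys more reliability than improving the bus bar.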

  10. Probabilist methods applied to electric source problems in nuclear safety

    International Nuclear Information System (INIS)

    Carnino, A.; Llory, M.

    1979-01-01

    Nuclear safety analysts have frequently been asked to quantify safety margins and evaluate hazards. In order to do so, probabilistic methods have proved to be the most promising. Without completely replacing deterministic safety analysis, they are now commonly used at the reliability or availability stages of systems as well as for determining likely accident sequences. This paper describes an application linked to the problem of electric sources, while indicating the methods used: the calculation of the probable loss of all electric sources of a pressurized water nuclear power station, the evaluation of diesel reliability by failure event trees, and the determination of the accident sequences which could be brought about by the 'total electric source loss' initiator and affect the installation or the environment [fr

  11. New Aspects of Probabilistic Forecast Verification Using Information Theory

    Science.gov (United States)

    Tödter, Julian; Ahrens, Bodo

    2013-04-01

    This work deals with information-theoretical methods in probabilistic forecast verification, particularly concerning ensemble forecasts. Recent findings concerning the "Ignorance Score" are shortly reviewed, then a consistent generalization to continuous forecasts is motivated. For ensemble-generated forecasts, the presented measures can be calculated exactly. The Brier Score (BS) and its generalizations to the multi-categorical Ranked Probability Score (RPS) and to the Continuous Ranked Probability Score (CRPS) are prominent verification measures for probabilistic forecasts. Particularly, their decompositions into measures quantifying the reliability, resolution and uncertainty of the forecasts are attractive. Information theory sets up a natural framework for forecast verification. Recently, it has been shown that the BS is a second-order approximation of the information-based Ignorance Score (IGN), which also contains easily interpretable components and can also be generalized to a ranked version (RIGN). Here, the IGN, its generalizations and decompositions are systematically discussed in analogy to the variants of the BS. Additionally, a Continuous Ranked IGN (CRIGN) is introduced in analogy to the CRPS. The useful properties of the conceptually appealing CRIGN are illustrated, together with an algorithm to evaluate its components reliability, resolution, and uncertainty for ensemble-generated forecasts. This algorithm can also be used to calculate the decomposition of the more traditional CRPS exactly. The applicability of the "new" measures is demonstrated in a small evaluation study of ensemble-based precipitation forecasts.
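The reliability/resolution/uncertainty decomposition discussed above is, for the Brier Score, the classical Murphy decomposition BS = REL − RES + UNC over groups of identical forecast probabilities. A minimal sketch with toy data (binary events, forecasts binned by exact value; not data from the cited study):

```python
# Murphy decomposition of the Brier Score for binary-event forecasts.

def brier_score(forecasts, outcomes):
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

def brier_decomposition(forecasts, outcomes):
    """Return (reliability, resolution, uncertainty) with
    BS = reliability - resolution + uncertainty."""
    n = len(forecasts)
    base_rate = sum(outcomes) / n
    uncertainty = base_rate * (1.0 - base_rate)
    # Group samples by their (discrete) forecast probability.
    bins = {}
    for f, o in zip(forecasts, outcomes):
        bins.setdefault(f, []).append(o)
    reliability = resolution = 0.0
    for f, obs in bins.items():
        w = len(obs) / n                 # bin weight
        obar = sum(obs) / len(obs)       # observed frequency in bin
        reliability += w * (f - obar) ** 2
        resolution += w * (obar - base_rate) ** 2
    return reliability, resolution, uncertainty
```

The same bin-wise bookkeeping is what an ensemble-based algorithm for the IGN/CRIGN components has to perform, with the squared differences replaced by logarithmic terms.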

  12. Probabilistic framework for product design optimization and risk management

    Science.gov (United States)

    Keski-Rahkonen, J. K.

    2018-05-01

    Probabilistic methods have gradually gained ground within engineering practices but currently it is still the industry standard to use deterministic safety margin approaches to dimensioning components and qualitative methods to manage product risks. These methods are suitable for baseline design work but quantitative risk management and product reliability optimization require more advanced predictive approaches. Ample research has been published on how to predict failure probabilities for mechanical components and furthermore to optimize reliability through life cycle cost analysis. This paper reviews the literature for existing methods and tries to harness their best features and simplify the process to be applicable in practical engineering work. Recommended process applies Monte Carlo method on top of load-resistance models to estimate failure probabilities. Furthermore, it adds on existing literature by introducing a practical framework to use probabilistic models in quantitative risk management and product life cycle costs optimization. The main focus is on mechanical failure modes due to the well-developed methods used to predict these types of failures. However, the same framework can be applied on any type of failure mode as long as predictive models can be developed.
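The recommended process, a load-resistance model producing a failure probability that feeds an expected life-cycle cost, can be sketched as follows. For brevity this uses a closed-form normal load-resistance model in place of the Monte Carlo sampling the record recommends; all costs and distribution parameters are invented for illustration:

```python
import math

# Load-resistance reliability feeding a life-cycle cost comparison (sketch).

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def failure_probability(mean_r, sd_r, mean_s, sd_s):
    """P(R - S < 0) for independent normal resistance R and load S."""
    mean_g = mean_r - mean_s
    sd_g = math.hypot(sd_r, sd_s)
    return normal_cdf(-mean_g / sd_g)

def expected_cost(mean_r, cost_per_strength=2.0, cost_of_failure=1e6):
    """Expected life-cycle cost: material cost grows with design strength,
    failure cost is weighted by the failure probability."""
    pf = failure_probability(mean_r, 0.1 * mean_r, 100.0, 20.0)
    return cost_per_strength * mean_r + cost_of_failure * pf

# Pick the design strength that minimizes expected life-cycle cost.
best = min(range(150, 400, 10), key=expected_cost)
```

The optimum sits where the marginal material cost balances the marginal reduction in expected failure cost, which is exactly the trade-off the reviewed framework formalizes; a Monte Carlo version would simply replace `failure_probability` with a sampling estimate.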

  13. Pipe fracture evaluations for leak-rate detection: Probabilistic models

    International Nuclear Information System (INIS)

    Rahman, S.; Wilkowski, G.; Ghadiali, N.

    1993-01-01

    This is the second in a series of three papers generated from studies on nuclear pipe fracture evaluations for leak-rate detection. This paper focuses on the development of novel probabilistic models for stochastic performance evaluation of degraded nuclear piping systems. It was accomplished here in three distinct stages. First, a statistical analysis was conducted to characterize various input variables for thermo-hydraulic analysis and elastic-plastic fracture mechanics, such as material properties of pipe, crack morphology variables, and location of cracks found in nuclear piping. Second, a new stochastic model was developed to evaluate performance of degraded piping systems. It is based on accurate deterministic models for thermo-hydraulic and fracture mechanics analyses described in the first paper, statistical characterization of various input variables, and state-of-the-art methods of modern structural reliability theory. From this model, the conditional probability of failure as a function of the leak-rate detection capability of the piping systems can be predicted. Third, a numerical example was presented to illustrate the proposed model for piping reliability analyses. Results clearly showed that the model provides satisfactory estimates of conditional failure probability with much less computational effort than Monte Carlo simulation. The probabilistic model developed in this paper will be applied to various piping in boiling water reactor and pressurized water reactor plants for leak-rate detection applications.

  14. An overview-probabilistic safety analysis for research reactors

    International Nuclear Information System (INIS)

    Liu Jinlin; Peng Changhong

    2015-01-01

    Over its long history of application, Probabilistic Safety Analysis (PSA) has proved to be a valuable tool for improving the safety and reliability of power reactors. In China, the 'Nuclear safety and radioactive pollution prevention Twelfth Five Year Plan and the 2020 vision' explicitly calls for the development of probabilistic safety analysis and aging evaluation for research reactors. Compared with power reactors, research reactors exhibit some specific features: lower operating power, lower coolant temperature and pressure, etc. However, their core configurations may change very often, and human actions play an important safety role due to specific experimental requirements. As a result, it is necessary to conduct PSA for research reactors. This paper discusses the special characteristics related to their structure and operation and the methods to develop the PSA of research reactors, including initiating event analysis, event tree analysis, fault tree analysis, dependent failure analysis, human reliability analysis and quantification, as well as experimental and external event evaluation, through an investigation of various research reactors and their PSAs at home and abroad, to present the current situation and features of research reactor PSAs. (author)

  15. A Novel TRM Calculation Method by Probabilistic Concept

    Science.gov (United States)

    Audomvongseree, Kulyos; Yokoyama, Akihiko; Verma, Suresh Chand; Nakachi, Yoshiki

    In a new competitive environment, it becomes possible for third parties to access a transmission facility. Under this structure, to efficiently manage the utilization of the transmission network, a new definition of Available Transfer Capability (ATC) has been proposed. According to the North American Electric Reliability Council (NERC)’s definition, ATC depends on several parameters, i.e. Total Transfer Capability (TTC), Transmission Reliability Margin (TRM), and Capacity Benefit Margin (CBM). This paper focuses on the calculation of TRM, which is one of the security margins reserved for uncertainty in system conditions. A TRM calculation by a probabilistic method is proposed in this paper. Based on the modeling of load forecast error and error in transmission line limits, various cases of transmission transfer capability and their related probabilistic nature can be calculated. By applying the proposed concept of risk analysis, the appropriate required amount of TRM can be obtained. The objective of this research is to provide realistic information on the actual ability of the network, which may offer an alternative for system operators to make appropriate decisions in the competitive market. The advantages of the proposed method are illustrated by application to the IEEJ-WEST10 model system.
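
    A minimal sketch of the probabilistic TRM idea described above, assuming zero-mean Gaussian models for the load forecast error and the line-limit error (the standard deviations below are toy values, not the paper's calibrated ones):

```python
import random

def estimate_trm(risk_level=0.05, n=50_000, seed=7):
    """TRM as the (1 - risk_level) quantile of the combined capability
    shortfall caused by load forecast error and line-limit error (MW)."""
    rng = random.Random(seed)
    shortfall = []
    for _ in range(n):
        load_error = rng.gauss(0.0, 50.0)   # MW, load forecast error (illustrative)
        limit_error = rng.gauss(0.0, 30.0)  # MW, transmission limit error (illustrative)
        shortfall.append(load_error - limit_error)
    shortfall.sort()
    return shortfall[int((1.0 - risk_level) * n)]

trm = estimate_trm()
```

    Reserving this margin covers the modelled uncertainty at the chosen risk level; tightening `risk_level` trades transfer capability for security.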

  16. Fault trees and the impact of human variability on probabilistic risk analysis

    International Nuclear Information System (INIS)

    1983-01-01

    It has long been recognized that human reliability is an important factor in probabilistic risk analysis. In the field, this is true in a direct operational sense as well as in the areas of installation and maintenance. The interest in quantification arises from the desire to achieve optimum design in the human factors sense (operability-maintainability) and from the need to include human reliability considerations in probabilistic risk analysis to achieve complete and valid risk evaluation. In order to integrate human reliability into the system analysis, it is necessary to consider two questions. These relate to the way that human functions fit into the existing analytical models and methods, as well as the nature of human failure mechanisms, modes and failure (error) rates.
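
    The way a human-error basic event enters a fault tree can be illustrated with minimal cut sets; the toy top event and basic-event probabilities below are illustrative assumptions, not values from the record:

```python
from itertools import combinations

# Toy top event "loss of cooling": the operator fails to act (human error),
# or both redundant pumps fail. Probabilities are illustrative.
basic = {"HUMAN": 1e-2, "PUMP_A": 1e-3, "PUMP_B": 1e-3}
cut_sets = [{"HUMAN"}, {"PUMP_A", "PUMP_B"}]

def cut_set_prob(events):
    """Probability that every basic event in the set occurs (independence assumed)."""
    p = 1.0
    for e in events:
        p *= basic[e]
    return p

def top_event_probability(cut_sets):
    """Exact top-event probability by inclusion-exclusion over the cut sets."""
    total = 0.0
    for r in range(1, len(cut_sets) + 1):
        for combo in combinations(cut_sets, r):
            union = set().union(*combo)
            total += (-1) ** (r + 1) * cut_set_prob(union)
    return total

p_top = top_event_probability(cut_sets)
```

    Here the single human-error cut set dominates the top-event probability by four orders of magnitude, which is exactly why human failure rates matter so much in the quantification.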

  17. Method of extracting significant trouble information of nuclear power plants using probabilistic analysis technique

    International Nuclear Information System (INIS)

    Shimada, Yoshio; Miyazaki, Takamasa

    2005-01-01

    In order to analyze and evaluate large amounts of trouble information of overseas nuclear power plants, it is necessary to select information that is significant in terms of both safety and reliability. In this research, a method of efficiently and simply classifying degrees of importance of components in terms of safety and reliability while paying attention to root-cause components appearing in the information was developed. Regarding safety, the reactor core damage frequency (CDF), which is used in the probabilistic analysis of a reactor, was used. Regarding reliability, the automatic plant trip probability (APTP), which is used in the probabilistic analysis of automatic reactor trips, was used. These two aspects were reflected in the development of criteria for classifying degrees of importance of components. By applying these criteria, a simple method of extracting significant trouble information of overseas nuclear power plants was developed. (author)

  18. Reliability evaluation of power systems

    CERN Document Server

    Billinton, Roy

    1996-01-01

    The Second Edition of this well-received textbook presents over a decade of new research in power system reliability, while maintaining the general concept, structure, and style of the original volume. This edition features new chapters on the growing areas of Monte Carlo simulation and reliability economics. In addition, chapters cover the latest developments in techniques and their application to real problems. The text also explores the progress occurring in the structure, planning, and operation of real power systems due to changing ownership, regulation, and access. This work serves as a companion volume to Reliability Evaluation of Engineering Systems: Second Edition (1992).

  19. Revealing the burden of maternal mortality: a probabilistic model for determining pregnancy-related causes of death from verbal autopsies

    Directory of Open Access Journals (Sweden)

    Desta Teklay

    2007-02-01

    Full Text Available Abstract Background Substantial reductions in maternal mortality are called for in Millennium Development Goal 5 (MDG-5, thus assuming that maternal mortality is measurable. A key difficulty is attributing causes of death for the many women who die unaided in developing countries. Verbal autopsy (VA can elicit circumstances of death, but data need to be interpreted reliably and consistently to serve as global indicators. Recent developments in probabilistic modelling of VA interpretation are adapted and assessed here for the specific circumstances of pregnancy-related death. Methods A preliminary version of the InterVA-M probabilistic VA interpretation model was developed and refined with adult female VA data from several sources, and then assessed against 258 additional VA interviews from Burkina Faso. Likely causes of death produced by the model were compared with causes previously determined by local physicians. Distinction was made between free-text and closed-question data in the VA interviews, to assess the added value of free-text material on the model's output. Results Following rationalisation between the model and physician interpretations, cause-specific mortality fractions were broadly similar. Case-by-case agreement between the model and any of the reviewing physicians reached approximately 60%, rising to approximately 80% when cases with a discrepancy were reviewed by an additional physician. Cardiovascular disease and malaria showed the largest differences between the methods, and the attribution of infections related to pregnancy also varied. The model estimated 30% of deaths to be pregnancy-related, of which half were due to direct causes. Data derived from free-text made no appreciable difference. Conclusion InterVA-M represents a potentially valuable new tool for measuring maternal mortality in an efficient, consistent and standardised way. Further development, refinement and validation are planned. 
It could become a routine

  20. Probabilistic Survivability Versus Time Modeling

    Science.gov (United States)

    Joyner, James J., Sr.

    2016-01-01

    This presentation documents Kennedy Space Center's Independent Assessment work completed on three assessments for the Ground Systems Development and Operations (GSDO) Program to assist the Chief Safety and Mission Assurance Officer during key programmatic reviews. The assessments provided the GSDO Program with analyses of how egress time affects the likelihood of astronaut and ground worker survival during an emergency. For each assessment, a team developed probability distributions for hazard scenarios to address statistical uncertainty, resulting in survivability plots over time. The first assessment developed a mathematical model of probabilistic survivability versus time to reach a safe location using an ideal Emergency Egress System at Launch Complex 39B (LC-39B); the second used the first model to evaluate and compare various egress systems under consideration at LC-39B. The third used a modified LC-39B model to determine if a specific hazard decreased survivability more rapidly than other events during flight hardware processing in Kennedy's Vehicle Assembly Building.

  1. Probabilistic cloning with supplementary information

    International Nuclear Information System (INIS)

    Azuma, Koji; Shimamura, Junichi; Koashi, Masato; Imoto, Nobuyuki

    2005-01-01

    We consider probabilistic cloning of a state chosen from a mutually nonorthogonal set of pure states, with the help of a party holding supplementary information in the form of pure states. When the number of states is 2, we show that the best efficiency of producing m copies is always achieved by a two-step protocol in which the helping party first attempts to produce m-1 copies from the supplementary state, and if it fails, then the original state is used to produce m copies. On the other hand, when the number of states exceeds two, the best efficiency is not always achieved by such a protocol. We give examples in which the best efficiency is not achieved even if we allow any amount of one-way classical communication from the helping party

  2. Machine learning a probabilistic perspective

    CERN Document Server

    Murphy, Kevin P

    2012-01-01

    Today's Web-enabled deluge of electronic data calls for automated methods of data analysis. Machine learning provides these, developing methods that can automatically detect patterns in data and then use the uncovered patterns to predict future data. This textbook offers a comprehensive and self-contained introduction to the field of machine learning, based on a unified, probabilistic approach. The coverage combines breadth and depth, offering necessary background material on such topics as probability, optimization, and linear algebra as well as discussion of recent developments in the field, including conditional random fields, L1 regularization, and deep learning. The book is written in an informal, accessible style, complete with pseudo-code for the most important algorithms. All topics are copiously illustrated with color images and worked examples drawn from such application domains as biology, text processing, computer vision, and robotics. Rather than providing a cookbook of different heuristic method...

  3. Probabilistic assessments of fuel performance

    International Nuclear Information System (INIS)

    Kelppe, S.; Ranta-Puska, K.

    1998-01-01

    The probabilistic Monte Carlo method, coupled with quasi-random sampling, is applied to the fuel performance analyses. By using known distributions of fabrication parameters and real power histories with their randomly selected combinations, and by making a large number of ENIGMA code calculations, one expects to find out the state of the whole reactor fuel. Good statistics requires thousands of runs. A sample case representing VVER-440 reactor fuel indicates relatively low fuel temperatures and mainly athermal fission gas release, if any. The rod internal pressure remains typically below 2.5 MPa, which leaves a large margin to the system pressure of 12 MPa. Gap conductance, an essential parameter in the accident evaluations, shows no decrease from its start-of-life value. (orig.)
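
    The abstract couples Monte Carlo with quasi-random sampling; a Halton sequence is one common low-discrepancy choice (an assumption here, since the abstract does not name the sequence used):

```python
def halton(index, base):
    """index-th element (1-based) of the Halton low-discrepancy sequence
    in the given integer base; values fill the unit interval evenly."""
    result, f = 0.0, 1.0
    while index > 0:
        f /= base
        result += f * (index % base)
        index //= base
    return result

# 2-D quasi-random points (e.g. two fabrication parameters), bases 2 and 3
points = [(halton(i, 2), halton(i, 3)) for i in range(1, 6)]
```

    Mapping these points through the inverse CDFs of the fabrication-parameter distributions gives input sets that cover the parameter space more uniformly than pseudo-random draws, reducing the number of code runs needed for stable statistics.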

  4. Probabilistic Fatigue Damage Program (FATIG)

    Science.gov (United States)

    Michalopoulos, Constantine

    2012-01-01

    FATIG computes fatigue damage/fatigue life using the stress rms (root mean square) value, the total number of cycles, and S-N curve parameters. The damage is computed by the following methods: (a) traditional method using Miner's rule with stress cycles determined from a Rayleigh distribution up to 3*sigma; and (b) classical fatigue damage formula involving the Gamma function, which is derived from the integral version of Miner's rule. The integration is carried out over all stress amplitudes. This software solves the problem of probabilistic fatigue damage using the integral form of the Palmgren-Miner rule. The software computes fatigue life using an approach involving all stress amplitudes, up to N*sigma, as specified by the user. It can be used in the design of structural components subjected to random dynamic loading, or by any stress analyst with minimal training for fatigue life estimates of structural components.
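
    Method (b), the closed-form damage from the integral version of Miner's rule with Rayleigh-distributed stress amplitudes and an S-N curve N(S) = A*S^-b, can be checked against direct numerical integration (the S-N parameters below are illustrative):

```python
import math

def damage_closed_form(n_cycles, sigma, A, b):
    """Palmgren-Miner damage for Rayleigh-distributed stress amplitudes with
    rms sigma: D = n * (sqrt(2)*sigma)**b * Gamma(1 + b/2) / A."""
    return n_cycles * (math.sqrt(2.0) * sigma) ** b * math.gamma(1.0 + b / 2.0) / A

def damage_numerical(n_cycles, sigma, A, b, s_max_factor=8.0, steps=20_000):
    """Same damage by midpoint integration of the Rayleigh pdf against 1/N(S)."""
    ds = s_max_factor * sigma / steps
    total = 0.0
    for i in range(steps):
        s = (i + 0.5) * ds
        pdf = (s / sigma**2) * math.exp(-s * s / (2.0 * sigma**2))
        total += pdf * (s ** b / A) * ds
    return n_cycles * total

d1 = damage_closed_form(1e6, 40.0, 1e12, 3.0)   # illustrative: sigma=40 MPa, N = 1e12 * S**-3
d2 = damage_numerical(1e6, 40.0, 1e12, 3.0)
```

    Failure is predicted when D reaches 1, so the fatigue life in cycles is n_cycles / D.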

  5. Probabilistic cloning of equidistant states

    International Nuclear Information System (INIS)

    Jimenez, O.; Roa, Luis; Delgado, A.

    2010-01-01

    We study the probabilistic cloning of equidistant states. These states are such that the inner product between them is a complex constant or its conjugate. Thereby, it is possible to study their cloning in a simple way. In particular, we are interested in the behavior of the cloning probability as a function of the phase of the overlap among the involved states. We show that for certain families of equidistant states Duan and Guo's cloning machine leads to cloning probabilities lower than the optimal unambiguous discrimination probability of equidistant states. We propose an alternative cloning machine whose cloning probability is higher than or equal to the optimal unambiguous discrimination probability for any family of equidistant states. Both machines achieve the same probability for equidistant states whose inner product is a positive real number.

  6. Probabilistic optimization of safety coefficients

    International Nuclear Information System (INIS)

    Marques, M.; Devictor, N.; Magistris, F. de

    1999-01-01

    This article describes a reliability-based method for the optimization of safety coefficients defined and used in design codes. The purpose of the optimization is to determine the partial safety coefficients which minimize an objective function for sets of components and loading situations covered by a design rule. This objective function is a sum of distances between the reliability of the components designed using the safety coefficients and a target reliability. The advantage of this method is shown on the examples of the reactor vessel, a vapour pipe and the safety injection circuit. (authors)
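
    A toy version of the optimization described above: a single partial safety coefficient is chosen by grid search to minimize the summed squared distance between each case's reliability index and a target (all case parameters and the target below are illustrative, not the paper's):

```python
import math

BETA_TARGET = 3.8  # illustrative target reliability index

# Toy design cases covered by one rule: (mean load, load CoV, resistance CoV)
CASES = [(100.0, 0.10, 0.08), (150.0, 0.20, 0.08), (80.0, 0.30, 0.10)]

def beta(gamma, mu_s, cov_s, cov_r):
    """Reliability index when the design rule sets mean resistance = gamma * mean load,
    with load and resistance treated as independent normals."""
    mu_r = gamma * mu_s
    return (mu_r - mu_s) / math.hypot(mu_r * cov_r, mu_s * cov_s)

def objective(gamma):
    """Sum of squared distances between achieved and target reliability."""
    return sum((beta(gamma, *case) - BETA_TARGET) ** 2 for case in CASES)

# Coarse grid search over the single partial safety coefficient
best_gamma = min((g / 100.0 for g in range(110, 400)), key=objective)
```

    The resulting coefficient over-protects the low-scatter case and under-protects the high-scatter one, which is exactly the compromise the objective function formalizes.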

  7. Probabilistic safety assessment - regulatory perspective

    International Nuclear Information System (INIS)

    Solanki, R.B.; Paul, U.K.; Hajra, P.; Agarwal, S.K.

    2002-01-01

    Full text: Nuclear power plants (NPPs) have been designed, constructed and operated mainly based on a deterministic safety analysis philosophy. In this approach, a substantial amount of safety margin is incorporated in the design and operational requirements. Additional margin is incorporated by applying the highest quality engineering codes, standards and practices, and the concept of defence-in-depth in design and operating procedures, and by including conservative assumptions and acceptance criteria in plant response analysis of postulated initiating events (PIEs). However, as the probabilistic approach has been improved and refined over the years, it is possible for the designer, operator and regulator to get a more detailed and realistic picture of the safety importance of plant design features, operating procedures and operational practices by using probabilistic safety assessment (PSA) along with the deterministic methodology. At present, many countries including the USA, UK and France are using PSA insights in their decision making along with the deterministic basis. India has also made substantial progress in the development of methods for carrying out PSA. However, consensus on the use of PSA in regulatory decision-making has not yet been achieved. This paper emphasises the requirements (e.g., level of detail, key modelling assumptions, data, modelling aspects, success criteria, sensitivity and uncertainty analysis) for improving the quality and consistency in the performance and use of PSA, which can facilitate meaningful use of PSA insights in regulatory decision-making in India. This paper also provides relevant information on the international scenario and various application areas of PSA, along with the progress made in India. The PSA perspective presented in this paper may help in achieving consensus on the use of PSA for regulatory/utility decision-making in the design and operation of NPPs.

  8. Probabilistic modeling of fatigue crack growth in Ti-6Al-4V

    International Nuclear Information System (INIS)

    Soboyejo, W.O.; Shen, W.; Soboyejo, A.B.O.

    2001-01-01

    This paper presents the results of a combined experimental and analytical study of the probabilistic nature of fatigue crack growth in Ti-6Al-4V. A simple experimental fracture mechanics framework is presented for the determination of statistical fatigue crack growth parameters from two fatigue tests. The experimental studies show that the variabilities in long fatigue crack growth rate data and the Paris coefficient are well described by the log-normal distributions. The variabilities in the Paris exponent are also shown to be well characterized by a normal distribution. The measured statistical distributions are incorporated into a probabilistic fracture mechanics framework for the estimation of material reliability. The implications of the results are discussed for the probabilistic analysis of fatigue crack growth in engineering components and structures. (orig.)

  9. Survey of probabilistic methods in safety and risk assessment for nuclear power plant licensing

    International Nuclear Information System (INIS)

    1984-04-01

    After an overview of the goals and general methods of probabilistic approaches in nuclear safety, the main features of probabilistic safety or risk assessment (PRA) methods are discussed. In most practical applications a full-fledged PRA is not performed; rather, various levels of analysis are used, ranging from unavailability assessment of systems, through the more complex analysis of probable core damage states, up to the assessment of the overall health effects on the total population from a given practice. The various types of application are discussed in relation to their limitations and benefits for different stages of design or operation of nuclear power plants. This gives guidance for licensing staff to judge the usefulness of the various methods for their licensing decisions. Examples of the application of probabilistic methods in several countries are given. Two appendices, on reliability analysis and on containment and consequence analysis, provide more details on these subjects. (author)

  10. Short-term Probabilistic Load Forecasting with the Consideration of Human Body Amenity

    Directory of Open Access Journals (Sweden)

    Ning Lu

    2013-02-01

    Full Text Available Load forecasting is the basis of power system planning and design. It is important for the economic operation and reliability assurance of the power system. However, the results of load forecasting given by most existing methods are deterministic. This study aims at probabilistic load forecasting. First, support vector machine regression is used to acquire the deterministic results of load forecasting with consideration of human body amenity. Then the probabilistic load forecast at a certain confidence level is given after analysis of the error distribution law corresponding to a certain heat index interval. The final simulation shows that this probabilistic forecasting method is easy to implement and can provide more information than deterministic forecasting results, and it is thus helpful for decision-makers to make reasonable decisions.
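
    The second step above, turning a deterministic forecast into a probabilistic one via the error distribution, can be sketched with empirical residual quantiles; the residuals below are illustrative, and a plain point forecast stands in for the support vector regression output:

```python
def probabilistic_forecast(point_forecast, residuals, confidence=0.9):
    """Turn a deterministic forecast into an interval at the given confidence
    level using the empirical distribution of past forecast errors."""
    res = sorted(residuals)
    n = len(res)
    k = round(n * (1.0 - confidence) / 2.0)  # samples trimmed from each tail
    return point_forecast + res[k], point_forecast + res[n - 1 - k]

# Illustrative residuals (MW) from past forecasts made under a similar heat-index interval
residuals = [-12, -8, -5, -3, -1, 0, 1, 2, 4, 6, 7, 9, 11, 14, 18,
             -2, 3, -6, 5, -4]
interval = probabilistic_forecast(500.0, residuals, confidence=0.9)
```

    Conditioning the residual pool on the heat-index interval, as the paper does, simply means selecting which past errors go into `residuals`.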

  11. Probabilistic analysis of deformed mode of engineering constructions’ soil-cement grounds

    Directory of Open Access Journals (Sweden)

    Vynnykov Yuriy

    2017-01-01

    Full Text Available The results of an analysis of probabilistic methods used to assess the deformed state of the foundations of engineering structures are presented. A finite element analysis of the stress-strain state of the “man made soil ground – foundation – structure” system was carried out. A method for probabilistic calculation using the finite element method is proposed. On a real example, the level of reliability of a design decision based on a deterministic calculation is estimated by probabilistic calculation. On the basis of statistical data obtained by simulation modeling, the probability of failure and of no-failure operation of the structure was determined with respect to the absolute value of settlement and the value of tilt, as functions of the reinforcement ratio of the soft soil ground. For the considered reinforcement ratios (15 to 25%), the probability of failure regarding the value of tilt is 0.03 – 0.05.

  12. Residual Heat Removal System qualitative probabilistic safety analysis before and after auto closure interlock removal

    International Nuclear Information System (INIS)

    Mikulicic, V.; Simic, Z.

    1992-01-01

    The analysis evaluates the consequences of the removal of the auto closure interlock (ACI) on the Residual Heat Removal System (RHRS) suction/isolation valves at the nuclear power plant. The deletion of the RHRS ACI is in part based on a probabilistic safety analysis (PSA) which justifies the removal based on a criterion of increased availability and reliability. Three different areas were examined in the PSA: the likelihood of an interfacing system LOCA; RHRS availability and reliability; and low temperature overpressurization control. The paper emphasizes in particular the RHRS unavailability and reliability evaluation, first utilizing the current control circuitry configuration and then the proposed modification to the control circuitry. (author)

  13. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  15. Probabilistic assessment of nuclear safety and safeguards

    International Nuclear Information System (INIS)

    Higson, D.J.

    1987-01-01

    Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic methods have become standard tools used in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)

  16. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

    IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address respective sources of uncertainties, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and the deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  17. A History of Probabilistic Inductive Logic Programming

    Directory of Open Access Journals (Sweden)

    Fabrizio Riguzzi

    2014-09-01

    Full Text Available The field of Probabilistic Logic Programming (PLP) has seen significant advances in the last 20 years, with many proposals for languages that combine probability with logic programming. Since the start, the problem of learning probabilistic logic programs has been the focus of much attention. Learning these programs represents a whole subfield of Inductive Logic Programming (ILP). In Probabilistic ILP (PILP) two problems are considered: learning the parameters of a program given the structure (the rules), and learning both the structure and the parameters. Usually structure learning systems use parameter learning as a subroutine. In this article we present an overview of PILP and discuss the main results.

  18. Minority Serving Institutions Reporting System Database

    Data.gov (United States)

    Social Security Administration — The database will be used to track SSA's contributions to Minority Serving Institutions such as Historically Black Colleges and Universities (HBCU), Tribal Colleges...

  19. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  20. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  1. Application of Reliability in Breakwater Design

    DEFF Research Database (Denmark)

    Christiani, Erik

    ... response, but in one area information has been lacking; bearing capacity has not been treated in depth in a probabilistic manner for breakwaters. Reliability analysis of conventional rubble mound breakwaters and conventional vertical breakwaters is exemplified for the purpose of establishing new ways ... methods to design certain types of breakwaters. Reliability analyses of the main armour and toe berm interaction are exemplified to show the effect of a multiple set of failure mechanisms. First the limit state equations of the main armour and toe interaction are derived from laboratory tests performed by Bologna University. Thereafter a multiple system of failure for the interaction is established. Relevant stochastic parameters are characterized prior to the reliability evaluation. Application of reliability in crown wall design is illustrated by deriving relevant single foundation failure modes ...

  2. Fully probabilistic control for stochastic nonlinear control systems with input dependent noise.

    Science.gov (United States)

    Herzallah, Randa

    2015-03-01

    Robust controllers for nonlinear stochastic systems with functional uncertainties can be consistently designed using probabilistic control methods. In this paper a generalised probabilistic controller design for the minimisation of the Kullback-Leibler divergence between the actual joint probability density function (pdf) of the closed loop control system, and an ideal joint pdf is presented emphasising how the uncertainty can be systematically incorporated in the absence of reliable systems models. To achieve this objective all probabilistic models of the system are estimated from process data using mixture density networks (MDNs) where all the parameters of the estimated pdfs are taken to be state and control input dependent. Based on this dependency of the density parameters on the input values, explicit formulations to the construction of optimal generalised probabilistic controllers are obtained through the techniques of dynamic programming and adaptive critic methods. Using the proposed generalised probabilistic controller, the conditional joint pdfs can be made to follow the ideal ones. A simulation example is used to demonstrate the implementation of the algorithm and encouraging results are obtained. Copyright © 2014 Elsevier Ltd. All rights reserved.
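
    For intuition on the Kullback-Leibler objective used above, the divergence between two Gaussian pdfs has a simple closed form (a standard identity, given here as a worked example rather than the paper's MDN-based estimator):

```python
import math

def kl_gaussian(mu1, s1, mu2, s2):
    """KL divergence KL( N(mu1, s1^2) || N(mu2, s2^2) ) between two
    univariate Gaussian densities."""
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2.0 * s2**2) - 0.5

# Identical densities give zero divergence; a mismatched mean gives a positive one.
d0 = kl_gaussian(0.0, 1.0, 0.0, 1.0)
d1 = kl_gaussian(1.0, 1.0, 0.0, 1.0)
```

    A probabilistic controller of the kind described drives the closed-loop density toward the ideal one by minimizing exactly this kind of divergence, with the densities estimated from data rather than assumed Gaussian.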

  3. Development of probabilistic fatigue curve for asphalt concrete based on viscoelastic continuum damage mechanics

    Directory of Open Access Journals (Sweden)

    Himanshu Sharma

    2016-07-01

    Full Text Available Due to its roots in a fundamental thermodynamic framework, the continuum damage approach is popular for modeling asphalt concrete behavior. Currently used continuum damage models use mixture-averaged values for model parameters and assume a deterministic damage process. On the other hand, significant scatter is found in fatigue data generated even under extremely controlled laboratory testing conditions. Thus, currently used continuum damage models fail to account for the scatter observed in fatigue data. This paper illustrates a novel approach for probabilistic fatigue life prediction based on the viscoelastic continuum damage approach. Several specimens were tested for their viscoelastic properties and damage properties under a uniaxial mode of loading. The data thus generated were analyzed using viscoelastic continuum damage mechanics principles to predict fatigue life. Weibull (2-parameter and 3-parameter) and lognormal distributions were fitted to the fatigue lives predicted using the viscoelastic continuum damage approach. It was observed that fatigue damage could be better described by the Weibull distribution than by the lognormal distribution. Due to its flexibility, the 3-parameter Weibull distribution was found to fit better than the 2-parameter Weibull distribution. Further, significant differences were found between the probabilistic fatigue curves developed in this research and the traditional deterministic fatigue curve. The proposed methodology combines the advantages of continuum damage mechanics as well as probabilistic approaches. These probabilistic fatigue curves can be conveniently used for reliability-based pavement design. Keywords: Probabilistic fatigue curve, Continuum damage mechanics, Weibull distribution, Lognormal distribution
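
    Fitting a 2-parameter Weibull distribution to fatigue lives, as done in the paper, can be sketched with the standard maximum-likelihood fixed-point iteration for the shape parameter (with damping for robustness); the fatigue lives below are illustrative, not the paper's data:

```python
import math

def fit_weibull_mle(data, iters=200):
    """2-parameter Weibull fit: damped fixed-point iteration for the MLE of
    the shape parameter k, then the closed-form scale parameter."""
    n = len(data)
    logs = [math.log(x) for x in data]
    mean_log = sum(logs) / n
    k = 1.0
    for _ in range(iters):
        xk = [x ** k for x in data]
        weighted = sum(xi * li for xi, li in zip(xk, logs)) / sum(xk)
        k = 0.5 * k + 0.5 / (weighted - mean_log)  # damped update
    scale = (sum(x ** k for x in data) / n) ** (1.0 / k)
    return k, scale

# Illustrative fatigue lives (cycles) from repeated strain-controlled tests
lives = [12000, 15500, 9800, 21000, 17300, 13400, 19900, 11200, 16800, 14100]
shape, scale = fit_weibull_mle(lives)
```

    The fitted shape and scale define the probabilistic fatigue curve at one strain level; repeating the fit across strain levels yields curves usable in reliability-based design.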

  4. Durability reliability analysis for corroding concrete structures under uncertainty

    Science.gov (United States)

    Zhang, Hao

    2018-02-01

    This paper presents a durability reliability analysis of reinforced concrete structures subject to the action of marine chloride. The focus is to provide insight into the role of epistemic uncertainties on durability reliability. The corrosion model involves a number of variables whose probabilistic characteristics cannot be fully determined due to the limited availability of supporting data. All sources of uncertainty, both aleatory and epistemic, should be included in the reliability analysis. Two methods are available to formulate the epistemic uncertainty: the imprecise probability-based method and the purely probabilistic method in which the epistemic uncertainties are modeled as random variables. The paper illustrates how the epistemic uncertainties are modeled and propagated in the two methods, and shows how epistemic uncertainties govern the durability reliability.
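The "purely probabilistic method" mentioned above treats epistemic parameters as random variables and propagates them alongside the aleatory ones. The toy Monte Carlo sketch below is not the paper's chloride model; all distributions and values are hypothetical, chosen only to show how adding epistemic uncertainty in a distribution parameter changes the computed failure probability.

```python
import random

random.seed(0)
N = 100_000
R = 1.0  # deterministic threshold (e.g. normalised critical chloride content)

# Aleatory only: chloride action S ~ Normal(0.6, 0.15), with the mean of S
# fixed at its point estimate.
pf_point = sum(random.gauss(0.6, 0.15) > R for _ in range(N)) / N

# Purely probabilistic treatment: the mean of S is itself uncertain because
# of limited data, modelled here as Normal(0.6, 0.10) and propagated.
pf_full = sum(random.gauss(random.gauss(0.6, 0.10), 0.15) > R
              for _ in range(N)) / N
```

Including the epistemic layer widens the effective distribution of S, so `pf_full` exceeds `pf_point`, illustrating how epistemic uncertainties can govern the durability reliability.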

  5. Distributed collaborative probabilistic design for turbine blade-tip radial running clearance using support vector machine of regression

    Science.gov (United States)

    Fei, Cheng-Wei; Bai, Guang-Chen

    2014-12-01

    To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assemblies such as the blade-tip radial running clearance (BTRRC) of a gas turbine, a distributed collaborative probabilistic design method based on support vector regression (called DCSRM) is proposed by integrating the distributed collaborative response surface method and the support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of an aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis yields the optimal static blade-tip clearance of the HPT for designing the BTRRC, improving the performance and reliability of the aeroengine. The comparison of methods shows that DCSRM offers both high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assemblies and enriches mechanical reliability theory and methods.

  6. Reliability analysis of common hazardous waste treatment processes

    International Nuclear Information System (INIS)

    Waters, R.D.

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.

  7. Reliability analysis of common hazardous waste treatment processes

    Energy Technology Data Exchange (ETDEWEB)

    Waters, Robert D. [Vanderbilt Univ., Nashville, TN (United States)

    1993-05-01

    Five hazardous waste treatment processes are analyzed probabilistically using Monte Carlo simulation to elucidate the relationships between process safety factors and reliability levels. The treatment processes evaluated are packed tower aeration, reverse osmosis, activated sludge, upflow anaerobic sludge blanket, and activated carbon adsorption.
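The safety-factor-versus-reliability relationship studied in these two records can be illustrated with a toy Monte Carlo model. This is not Waters' analysis of the five processes; the lognormal variability and its parameters are assumptions for the sketch.

```python
import math
import random

random.seed(42)

def reliability(safety_factor, n=50_000):
    """Monte Carlo estimate of P(effluent <= permit limit) for a toy
    treatment unit whose removal performance varies lognormally about
    its design value (sigma_ln = 0.3, an assumed value)."""
    limit = 1.0
    ok = sum(limit / (safety_factor * math.exp(random.gauss(0.0, 0.3))) <= limit
             for _ in range(n))
    return ok / n

r_sf15 = reliability(1.5)  # reliability delivered by a design safety factor of 1.5
```

Sweeping `safety_factor` and plotting `reliability` against it reproduces the kind of curve such a study elucidates: a factor of 1.0 gives roughly even odds, and reliability climbs steeply with modest extra capacity.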

  8. Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems

    OpenAIRE

    Bernardo, Marco; Miculan, Marino

    2016-01-01

    Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still being not necessary. To this e...

  9. On synchronous parallel computations with independent probabilistic choice

    International Nuclear Information System (INIS)

    Reif, J.H.

    1984-01-01

    This paper introduces probabilistic choice to synchronous parallel machine models, in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time-, space-, and processor-bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time-bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity

  10. Incorporating travel time reliability into the Highway Capacity Manual.

    Science.gov (United States)

    2014-01-01

    This final report documents the activities performed during SHRP 2 Reliability Project L08: Incorporating Travel Time Reliability into the Highway Capacity Manual. It serves as a supplement to the proposed chapters for incorporating travel time relia...

  11. Proceeding of 35th domestic symposium on applications of structural reliability and risk assessment methods to nuclear power plants

    International Nuclear Information System (INIS)

    2005-06-01

    As the 35th domestic symposium of the Atomic Energy Research Committee of the Japan Welding Engineering Society, the symposium was held titled 'Applications of structural reliability/risk assessment methods to nuclear energy'. Six speakers gave lectures titled 'Structural reliability and risk assessment methods', 'Risk-informed regulation of US nuclear energy and the role of probabilistic risk assessment', 'Reliability and risk assessment methods in chemical plants', 'Practical structural design methods based on reliability in architectural and civil areas', 'Maintenance activities based on reliability in thermal power plants' and 'LWR maintenance strategies based on Probabilistic Fracture Mechanics'. (T. Tanaka)

  12. Probabilistic Counterfactuals: Semantics, Computation, and Applications

    National Research Council Canada - National Science Library

    Balke, Alexander

    1997-01-01

    ... handled within the framework of standard probability theory. Starting with functional description of physical mechanisms, we were able to derive the standard probabilistic properties of Bayesian networks and to show: (1...

  13. Multiobjective optimal allocation problem with probabilistic non ...

    African Journals Online (AJOL)

    user

    The probabilistic non-linear cost constraint is converted into equivalent deterministic .... Further, in a survey the costs for enumerating a character in various strata are not known exactly, rather these are being ...... Naval Research Logistics, Vol.

  14. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

    Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. The method is inspired by genetic algorithms (Russell and Norvig, 2002) in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones: path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This generation method is able to produce varied high-quality paths, which is desirable in games to increase replay value. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithms to create a new method for generating strategic team AI pathfinding plans.
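The generate-then-eliminate idea in this abstract can be sketched in a few lines. This is not the authors' algorithm: the grid, the threat cells, and the fitness (penalising paths through threats) are all invented for illustration, and real team AI plans would use richer maps and objectives.

```python
import random

random.seed(7)
SIZE = 8
THREATS = {(3, 3), (3, 4), (4, 3), (5, 2)}   # hypothetical danger zones

def random_path():
    """Monotone lattice path from (0,0) to (SIZE-1,SIZE-1), with each
    step chosen probabilistically (the 'probabilistic pathfinding' part)."""
    x = y = 0
    path = [(0, 0)]
    while (x, y) != (SIZE - 1, SIZE - 1):
        moves = [m for m in ((1, 0), (0, 1))
                 if x + m[0] < SIZE and y + m[1] < SIZE]
        dx, dy = random.choice(moves)
        x, y = x + dx, y + dy
        path.append((x, y))
    return path

def fitness(path):
    # Penalise passing through threat cells; monotone paths share one length.
    return -sum(cell in THREATS for cell in path)

# Generate many candidate plans, eliminate the low-fitness ones, keep a few.
candidates = [random_path() for _ in range(200)]
best = sorted(candidates, key=fitness, reverse=True)[:5]
```

Because generation is random, repeated runs yield different surviving plans of comparable quality, which is the replay-value property the abstract highlights.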

  15. Probabilistic Meteorological Characterization for Turbine Loads

    DEFF Research Database (Denmark)

    Kelly, Mark C.; Larsen, Gunner Chr.; Dimitrov, Nikolay Krasimirov

    2014-01-01

    Beyond the existing, limited IEC prescription to describe fatigue loads on wind turbines, we look towards probabilistic characterization of the loads via analogous characterization of the atmospheric flow, particularly for today's "taller" turbines with rotors well above the atmospheric surface...

  16. Advanced Test Reactor probabilistic risk assessment

    International Nuclear Information System (INIS)

    Atkinson, S.A.; Eide, S.A.; Khericha, S.T.; Thatcher, T.A.

    1993-01-01

    This report discusses Level 1 probabilistic risk assessment (PRA) incorporating a full-scope external events analysis which has been completed for the Advanced Test Reactor (ATR) located at the Idaho National Engineering Laboratory

  17. Probabilistic safety assessment for seismic events

    International Nuclear Information System (INIS)

    1993-10-01

    This Technical Document on Probabilistic Safety Assessment for Seismic Events is mainly associated with the Safety Practice on Treatment of External Hazards in PSA and discusses in detail one specific external hazard, i.e. earthquakes

  18. Estimating software development project size, using probabilistic ...

    African Journals Online (AJOL)

    Estimating software development project size, using probabilistic techniques. ... of managing the size of software development projects by Purchasers (Clients) and Vendors (Development ...

  19. Comparing Categorical and Probabilistic Fingerprint Evidence.

    Science.gov (United States)

    Garrett, Brandon; Mitchell, Gregory; Scurich, Nicholas

    2018-04-23

    Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury-eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities. © 2018 American Academy of Forensic Sciences.

  20. Probabilistic methods in exotic option pricing

    NARCIS (Netherlands)

    Anderluh, J.H.M.

    2007-01-01

    The thesis presents three ways of calculating the Parisian option price as an illustration of probabilistic methods in exotic option pricing. Moreover, options on commodities are considered, as well as double-sided barrier options in a compound Poisson framework.

  1. Non-unitary probabilistic quantum computing

    Science.gov (United States)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  2. A logic for inductive probabilistic reasoning

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from "70% of As are Bs" and "a is an A" infer...... that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have...... to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework...
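The two inference patterns the abstract names, direct inference and Jeffrey's rule, have short numeric forms. The probabilities below are invented for illustration; the point is only that Jeffrey's rule reduces to direct inference when the evidence about the partition is certain.

```python
# Direct inference: "70% of As are Bs" and "a is an A"  ->  P(B(a)) = 0.7.
p_B_given = {"A": 0.7, "not_A": 0.2}   # statistical background information
p_certain = {"A": 1.0, "not_A": 0.0}   # observation: a is an A, with certainty
p_direct = sum(p_B_given[s] * p_certain[s] for s in p_B_given)

# Jeffrey's rule generalises this to uncertain evidence over the partition:
# P'(B) = sum_i P(B | A_i) * P'(A_i), e.g. only 80% sure that a is an A.
p_new = {"A": 0.8, "not_A": 0.2}
p_jeffrey = sum(p_B_given[s] * p_new[s] for s in p_B_given)
```

Here `p_direct` is 0.7, while the softened evidence gives `p_jeffrey` = 0.8 x 0.7 + 0.2 x 0.2 = 0.6.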

  3. Proceedings of the workshop on reliability data collection

    International Nuclear Information System (INIS)

    1999-01-01

    The main purpose of the Workshop was to provide a forum for exchanging information and experience on Reliability Data Collection and analysis to support Living Probabilistic Safety Assessments (LPSA). The Workshop is divided into four sessions which titles are: Session 1: Reliability Data - Database Systems (3 papers), Session 2: Reliability Data Collection for PSA (5 papers), Session 3: NPP Data Collection (3 papers), Session 4: Reliability Data Assessment (Part 1: General - 2 papers; Part 2: CCF - 2 papers; Part 3: Reactor Protection Systems / External Event Data - 2 papers; Part 4: Human Errors - 2 papers)

  4. Neglect Of Parameter Estimation Uncertainty Can Significantly Overestimate Structural Reliability

    Directory of Open Access Journals (Sweden)

    Rózsás Árpád

    2015-12-01

    Full Text Available Parameter estimation uncertainty is often neglected in reliability studies, i.e. point estimates of distribution parameters are used for representative fractiles, and in probabilistic models. A numerical example examines the effect of this uncertainty on structural reliability using Bayesian statistics. The study reveals that the neglect of parameter estimation uncertainty might lead to an order of magnitude underestimation of failure probability.
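The effect described in this record can be reproduced with a small closed-form example. This is not the paper's numerical study: the model below (normal load with known standard deviation, flat prior on the unknown mean, invented numbers) is an assumption chosen so that both the plug-in and the Bayesian predictive failure probabilities are available analytically.

```python
import math

def pf_normal(mean, sd, threshold):
    """P(X > threshold) for X ~ Normal(mean, sd)."""
    z = (threshold - mean) / sd
    return 0.5 * math.erfc(z / math.sqrt(2.0))

n, xbar, sigma, R = 5, 50.0, 5.0, 75.0   # hypothetical sample and resistance

# Point estimate: plug the sample mean in and ignore estimation uncertainty.
pf_plugin = pf_normal(xbar, sigma, R)

# Bayesian predictive (flat prior on the mean, sigma known):
# X_new ~ Normal(xbar, sigma * sqrt(1 + 1/n)); estimation uncertainty
# widens the predictive tail.
pf_pred = pf_normal(xbar, sigma * math.sqrt(1.0 + 1.0 / n), R)
```

With only five observations, `pf_pred` is nearly an order of magnitude larger than `pf_plugin`, matching the abstract's warning that point estimates can badly understate the failure probability.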

  5. Use of COMCAN III in system design and reliability analysis

    International Nuclear Information System (INIS)

    Rasmuson, D.M.; Shepherd, J.C.; Marshall, N.H.; Fitch, L.R.

    1982-03-01

    This manual describes the COMCAN III computer program and its use. COMCAN III is a tool that can be used by the reliability analyst performing a probabilistic risk assessment or by the designer of a system desiring improved performance and efficiency. COMCAN III can be used to determine minimal cut sets of a fault tree, to calculate system reliability characteristics, and to perform qualitative common cause failure analysis

  6. Nuclear reactor component populations, reliability data bases, and their relationship to failure rate estimation and uncertainty analysis

    International Nuclear Information System (INIS)

    Martz, H.F.; Beckman, R.J.

    1981-12-01

    Probabilistic risk analyses are used to assess the risks inherent in the operation of existing and proposed nuclear power reactors. In performing such risk analyses the failure rates of various components which are used in a variety of reactor systems must be estimated. These failure rate estimates serve as input to fault trees and event trees used in the analyses. Component failure rate estimation is often based on relevant field failure data from different reliability data sources such as LERs, NPRDS, and the In-Plant Data Program. Various statistical data analysis and estimation methods have been proposed over the years to provide the required estimates of the component failure rates. This report discusses the basis and extent to which statistical methods can be used to obtain component failure rate estimates. The report is expository in nature and focuses on the general philosophical basis for such statistical methods. Various terms and concepts are defined and illustrated by means of numerous simple examples
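One common statistical method for the failure-rate estimation discussed above is the conjugate gamma-Poisson (Bayesian) update, which combines a generic prior with plant-specific field data. The prior and field numbers below are hypothetical, chosen only to show the mechanics.

```python
# Gamma-Poisson conjugate update for a component failure rate (per hour).
# Prior Gamma(a0, b0) encodes generic industry data; field data contribute
# k observed failures over T component-hours of exposure.
a0, b0 = 0.5, 1.0e5        # hypothetical generic prior (Jeffreys-like shape)
k, T = 3, 2.0e6            # hypothetical field experience

a_post, b_post = a0 + k, b0 + T
rate_mean = a_post / b_post    # posterior mean failure rate
rate_mle = k / T               # classical point estimate, for comparison
```

The posterior mean (`a_post / b_post`) is what would feed the fault trees and event trees; the full gamma posterior also yields the uncertainty bounds the report emphasises.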

  7. Risk assessment using probabilistic standards

    International Nuclear Information System (INIS)

    Avila, R.

    2004-01-01

    A core element of risk is uncertainty represented by plural outcomes and their likelihood. No risk exists if the future outcome is uniquely known and hence guaranteed. The probability that we will die some day is equal to 1, so there would be no fatal risk if sufficiently long time frame is assumed. Equally, rain risk does not exist if there was 100% assurance of rain tomorrow, although there would be other risks induced by the rain. In a formal sense, any risk exists if, and only if, more than one outcome is expected at a future time interval. In any practical risk assessment we have to deal with uncertainties associated with the possible outcomes. One way of dealing with the uncertainties is to be conservative in the assessments. For example, we may compare the maximal exposure to a radionuclide with a conservatively chosen reference value. In this case, if the exposure is below the reference value then it is possible to assure that the risk is low. Since single values are usually compared; this approach is commonly called 'deterministic'. Its main advantage lies in the simplicity and in that it requires minimum information. However, problems arise when the reference values are actually exceeded or might be exceeded, as in the case of potential exposures, and when the costs for realizing the reference values are high. In those cases, the lack of knowledge on the degree of conservatism involved impairs a rational weighing of the risks against other interests. In this presentation we will outline an approach for dealing with uncertainties that in our opinion is more consistent. We will call it a 'fully probabilistic risk assessment'. The essence of this approach consists in measuring the risk in terms of probabilities, where the later are obtained from comparison of two probabilistic distributions, one reflecting the uncertainties in the outcomes and one reflecting the uncertainties in the reference value (standard) used for defining adverse outcomes. 
Our first aim
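The "fully probabilistic" idea sketched in this record, measuring risk by comparing an uncertain outcome distribution against an uncertain standard, can be shown with a toy Monte Carlo calculation. The lognormal distributions and their parameters are assumptions for illustration, not taken from the presentation.

```python
import math
import random

random.seed(3)
N = 100_000

# Risk = P(exposure exceeds the adverse threshold), where BOTH the exposure
# and the threshold carry uncertainty, each modelled as a lognormal here.
risk = sum(
    math.exp(random.gauss(math.log(0.5), 0.5)) >      # exposure sample
    math.exp(random.gauss(math.log(1.0), 0.2))        # threshold sample
    for _ in range(N)
) / N
```

A deterministic assessment would instead compare a single conservative exposure value with a fixed reference value; the probabilistic comparison makes the degree of conservatism explicit as a probability.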

  8. Probabilistic Risk Assessment (PRA): A Practical and Cost Effective Approach

    Science.gov (United States)

    Lee, Lydia L.; Ingegneri, Antonino J.; Djam, Melody

    2006-01-01

    The Lunar Reconnaissance Orbiter (LRO) is the first mission of the Robotic Lunar Exploration Program (RLEP), a space exploration venture to the Moon, Mars and beyond. The LRO mission includes a spacecraft developed by NASA Goddard Space Flight Center (GSFC) and seven instruments built by GSFC, Russia, and contractors across the nation. LRO is defined as a measurement mission, not a science mission. It emphasizes the overall objective of obtaining data to facilitate returning mankind safely to the Moon in preparation for an eventual manned mission to Mars. As the first mission in response to the President's commitment to the journey of exploring the solar system and beyond (returning to the Moon in the next decade, then venturing further into the solar system, and ultimately sending humans to Mars and beyond), LRO has high visibility to the public but limited resources and a tight schedule. This paper demonstrates how NASA's Lunar Reconnaissance Orbiter project office incorporated reliability analyses in assessing risks and performing design tradeoffs to ensure mission success. Risk assessment is performed using NASA Procedural Requirements (NPR) 8705.5 - Probabilistic Risk Assessment (PRA) Procedures for NASA Programs and Projects to formulate the PRA. As required, a limited-scope PRA is being performed for the LRO project. The PRA is used to optimize the mission design within mandated budget, manpower, and schedule constraints. The technique that the LRO project office uses to perform the PRA relies on the application of a component failure database to quantify the potential mission success risks. To ensure mission success in an efficient manner, at low cost and on a tight schedule, traditional reliability analyses, such as reliability predictions, Failure Modes and Effects Analysis (FMEA), and Fault Tree Analysis (FTA), are used to perform the PRA for the large LRO system with more than 14,000 piece parts and over 120 purchased or contractor

  9. New probabilistic interest measures for association rules

    OpenAIRE

    Hahsler, Michael; Hornik, Kurt

    2008-01-01

    Mining association rules is an important technique for discovering meaningful patterns in transaction databases. Many different measures of interestingness have been proposed for association rules. However, these measures fail to take the probabilistic properties of the mined data into account. In this paper, we start with presenting a simple probabilistic framework for transaction data which can be used to simulate transaction data when no associations are present. We use such data and a rea...

  10. Semantics of probabilistic processes an operational approach

    CERN Document Server

    Deng, Yuxin

    2015-01-01

    This book discusses the semantic foundations of concurrent systems with nondeterministic and probabilistic behaviour. Particular attention is given to clarifying the relationship between testing and simulation semantics and characterising bisimulations from metric, logical, and algorithmic perspectives. Besides presenting recent research outcomes in probabilistic concurrency theory, the book exemplifies the use of many mathematical techniques to solve problems in computer science, which is intended to be accessible to postgraduate students in Computer Science and Mathematics. It can also be us

  11. Probabilistic cloning of three symmetric states

    International Nuclear Information System (INIS)

    Jimenez, O.; Bergou, J.; Delgado, A.

    2010-01-01

    We study the probabilistic cloning of three symmetric states. These states are defined by a single complex quantity, the inner product among them. We show that three different probabilistic cloning machines are necessary to optimally clone all possible families of three symmetric states. We also show that the optimal cloning probability of generating M copies out of one original can be cast as the quotient between the success probability of unambiguously discriminating one and M copies of symmetric states.

  12. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application...... of stochastic differential equations is presented comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions....

  13. 78 FR 27113 - Version 5 Critical Infrastructure Protection Reliability Standards

    Science.gov (United States)

    2013-05-09

    ... approve certain reliability standards proposed by the North American Electric Reliability Corporation... Infrastructure Protection Reliability Standards, 143 FERC ¶ 61,055 (2013). This errata notice serves to correct P... Commission 18 CFR Part 40 [Docket No. RM13-5-000] Version 5 Critical Infrastructure Protection Reliability...

  14. Sequential optimization and reliability assessment method for metal forming processes

    International Nuclear Information System (INIS)

    Sahai, Atul; Schramm, Uwe; Buranathiti, Thaweepat; Chen Wei; Cao Jian; Xia, Cedric Z.

    2004-01-01

    Uncertainty is inevitable in any design process. The uncertainty could be due to variations in the geometry of the part, in material properties, or to a lack of knowledge about the phenomena being modeled. Deterministic design optimization does not take uncertainty into account, and worst-case assumptions lead to a vastly overconservative design. Probabilistic design, such as reliability-based design and robust design, offers tools for making robust and reliable decisions in the presence of uncertainty in the design process. Probabilistic design optimization often involves a double-loop procedure of optimization and iterative probabilistic assessment, which results in high computational demand. This demand can be reduced by replacing computationally intensive simulation models with less costly surrogate models and by employing the Sequential Optimization and Reliability Assessment (SORA) method. The SORA method uses a single-loop strategy with a series of cycles of deterministic optimization and reliability assessment; the two are decoupled within each cycle. This leads to quick improvement of the design from one cycle to the next and increased computational efficiency. This paper demonstrates the effectiveness of the SORA method when applied to designing a sheet metal flanging process. Surrogate models are used as less costly approximations to the computationally expensive finite element simulations
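The SORA cycle structure (deterministic optimization, then a decoupled reliability assessment that shifts the constraint) can be sketched on a deliberately trivial problem. This is not the paper's flanging application: the one-variable capacity-versus-normal-load problem and its target failure probability are assumptions, chosen so each cycle is a one-line "optimization".

```python
from statistics import NormalDist

load = NormalDist(mu=100.0, sigma=10.0)   # probabilistic demand on the design
target_pf = 1e-3                          # reliability requirement

# SORA: a series of cycles, each a deterministic optimization followed by a
# decoupled reliability assessment that updates the shifted load value s.
s = load.mean                     # cycle 0 uses the mean value (deterministic)
for _ in range(5):
    d = s                         # deterministic optimum: min capacity d s.t. d >= s
    pf = 1.0 - load.cdf(d)        # reliability assessment of the current design
    if pf <= target_pf * (1.0 + 1e-6):
        break
    # shift the constraint to the percentile matching the reliability target
    s = load.inv_cdf(1.0 - target_pf)
```

On this toy problem one shift suffices, so the loop converges in a single cycle to the (1 - target_pf) percentile of the load; in realistic problems the shift comes from an inverse most-probable-point search and several cycles are needed.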

  15. Online probabilistic learning with an ensemble of forecasts

    Science.gov (United States)

    Thorey, Jean; Mallet, Vivien; Chaussin, Christophe

    2016-04-01

    Our objective is to produce a calibrated weighted ensemble to forecast a univariate time series. In addition to a meteorological ensemble of forecasts, we rely on observations or analyses of the target variable. The celebrated Continuous Ranked Probability Score (CRPS) is used to evaluate the probabilistic forecasts. However applying the CRPS on weighted empirical distribution functions (deriving from the weighted ensemble) may introduce a bias because of which minimizing the CRPS does not produce the optimal weights. Thus we propose an unbiased version of the CRPS which relies on clusters of members and is strictly proper. We adapt online learning methods for the minimization of the CRPS. These methods generate the weights associated to the members in the forecasted empirical distribution function. The weights are updated before each forecast step using only past observations and forecasts. Our learning algorithms provide the theoretical guarantee that, in the long run, the CRPS of the weighted forecasts is at least as good as the CRPS of any weighted ensemble with weights constant in time. In particular, the performance of our forecast is better than that of any subset ensemble with uniform weights. A noteworthy advantage of our algorithm is that it does not require any assumption on the distributions of the observations and forecasts, both for the application and for the theoretical guarantee to hold. As application example on meteorological forecasts for photovoltaic production integration, we show that our algorithm generates a calibrated probabilistic forecast, with significant performance improvements on probabilistic diagnostic tools (the CRPS, the reliability diagram and the rank histogram).
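The CRPS of a finite, equally weighted ensemble, and the bias the authors address, can be made concrete. The sketch below implements the standard empirical CRPS and its common "fair" (unbiased) adjustment; the abstract's cluster-based unbiased version is related but not identical, so this is an illustration rather than their estimator.

```python
def crps_ensemble(members, obs):
    """Empirical CRPS of an equally weighted ensemble:
    mean |x_i - y| - (1/(2 m^2)) * sum_ij |x_i - x_j|."""
    m = len(members)
    t1 = sum(abs(x - obs) for x in members) / m
    t2 = sum(abs(a - b) for a in members for b in members) / (2.0 * m * m)
    return t1 - t2

def crps_fair(members, obs):
    """'Fair' CRPS: divides the spread term by m*(m-1) instead of m^2,
    removing the finite-ensemble bias of the empirical form."""
    m = len(members)
    t1 = sum(abs(x - obs) for x in members) / m
    t2 = sum(abs(a - b) for a in members for b in members) / (2.0 * m * (m - 1))
    return t1 - t2
```

For a two-member ensemble [0, 2] and observation 1, the empirical form gives 0.5 while the fair form gives 0, showing how the bias correction rewards spread that brackets the observation.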

  16. An empirical system for probabilistic seasonal climate prediction

    Science.gov (United States)

    Eden, Jonathan; van Oldenborgh, Geert Jan; Hawkins, Ed; Suckling, Emma

    2016-04-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.

  17. A global empirical system for probabilistic seasonal climate prediction

    Science.gov (United States)

    Eden, J. M.; van Oldenborgh, G. J.; Hawkins, E.; Suckling, E. B.

    2015-12-01

    Preparing for episodes with risks of anomalous weather a month to a year ahead is an important challenge for governments, non-governmental organisations, and private companies and is dependent on the availability of reliable forecasts. The majority of operational seasonal forecasts are made using process-based dynamical models, which are complex, computationally challenging and prone to biases. Empirical forecast approaches built on statistical models to represent physical processes offer an alternative to dynamical systems and can provide either a benchmark for comparison or independent supplementary forecasts. Here, we present a simple empirical system based on multiple linear regression for producing probabilistic forecasts of seasonal surface air temperature and precipitation across the globe. The global CO2-equivalent concentration is taken as the primary predictor; subsequent predictors, including large-scale modes of variability in the climate system and local-scale information, are selected on the basis of their physical relationship with the predictand. The focus given to the climate change signal as a source of skill and the probabilistic nature of the forecasts produced constitute a novel approach to global empirical prediction. Hindcasts for the period 1961-2013 are validated against observations using deterministic (correlation of seasonal means) and probabilistic (continuous rank probability skill scores) metrics. Good skill is found in many regions, particularly for surface air temperature and most notably in much of Europe during the spring and summer seasons. For precipitation, skill is generally limited to regions with known El Niño-Southern Oscillation (ENSO) teleconnections. The system is used in a quasi-operational framework to generate empirical seasonal forecasts on a monthly basis.

  18. A probabilistic approach for representation of interval uncertainty

    International Nuclear Information System (INIS)

    Zaman, Kais; Rangavajhala, Sirisha; McDonald, Mark P.; Mahadevan, Sankaran

    2011-01-01

    In this paper, we propose a probabilistic approach to represent interval data for input variables in reliability and uncertainty analysis problems, using flexible families of continuous Johnson distributions. Such a probabilistic representation of interval data facilitates a unified framework for handling aleatory and epistemic uncertainty. For fitting probability distributions, methods such as moment matching are commonly used in the literature. However, unlike point data where single estimates for the moments of data can be calculated, moments of interval data can only be computed in terms of upper and lower bounds. Finding bounds on the moments of interval data has been generally considered an NP-hard problem because it includes a search among the combinations of multiple values of the variables, including interval endpoints. In this paper, we present efficient algorithms based on continuous optimization to find the bounds on second and higher moments of interval data. With numerical examples, we show that the proposed bounding algorithms are scalable in polynomial time with respect to increasing number of intervals. Using the bounds on moments computed using the proposed approach, we fit a family of Johnson distributions to interval data. Furthermore, using an optimization approach based on percentiles, we find the bounding envelopes of the family of distributions, termed a Johnson p-box. The idea of bounding envelopes for the family of Johnson distributions is analogous to the notion of an empirical p-box in the literature. Several sets of interval data with different numbers of intervals and type of overlap are presented to demonstrate the proposed methods. In contrast to the computationally expensive nested analysis that is typically required in the presence of interval variables, the proposed probabilistic representation enables inexpensive optimization-based strategies to estimate bounds on an output quantity of interest.
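The moment-bounding step described above can be illustrated for the first two moments of a small interval data set. This sketch is not the paper's polynomial-time algorithm: it uses the facts that the mean bounds sit at the interval endpoints, that a convex function (the variance) is maximised at a vertex of the box (brute-forced here, feasible only for small n), and that at the variance minimum every coordinate is a common value clamped into its interval (so a 1-D search suffices). The intervals themselves are hypothetical.

```python
from itertools import product

intervals = [(1.0, 2.0), (1.5, 3.0), (2.5, 4.0), (0.5, 1.5)]  # hypothetical data
n = len(intervals)

# Bounds on the mean are attained at the interval endpoints.
mean_lo = sum(l for l, _ in intervals) / n
mean_hi = sum(u for _, u in intervals) / n

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

# Max variance: convex over a box, hence attained at a vertex; brute-force
# the 2^n endpoint combinations (fine for small n only).
var_hi = max(variance(v) for v in product(*intervals))

def clamp(m, lo, hi):
    return min(max(m, lo), hi)

# Min variance: at the optimum each x_i = clamp(common value, l_i, u_i),
# so a 1-D grid search over that common value suffices.
var_lo = min(
    variance([clamp(m, l, u) for l, u in intervals])
    for m in [mean_lo + k * (mean_hi - mean_lo) / 200 for k in range(201)]
)
```

These four numbers (`mean_lo`, `mean_hi`, `var_lo`, `var_hi`) are exactly the kind of moment bounds to which a Johnson family would then be fitted to obtain the p-box.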

  19. Probabilistic causality and radiogenic cancers

    International Nuclear Information System (INIS)

    Groeer, P.G.

    1986-01-01

    A review and scrutiny of the literature on probability and probabilistic causality shows that it is possible under certain assumptions to estimate the probability that a certain type of cancer diagnosed in an individual exposed to radiation prior to diagnosis was caused by this exposure. Diagnosis of this causal relationship, like diagnosis of any disease - malignant or not - always requires some subjective judgment by the diagnostician. It is, therefore, illusory to believe that tables based on actuarial data can provide objective estimates of the chance that a cancer diagnosed in an individual is radiogenic. It is argued that such tables can only provide a base from which the diagnostician(s) deviate in one direction or the other according to his (their) individual (consensual) judgment. Acceptance of a physician's diagnostic judgment by patients is commonplace. Similar widespread acceptance of expert judgment by claimants in radiation compensation cases does not presently exist. Judicious use of the present radioepidemiological tables prepared by the Working Group of the National Institutes of Health, or of updated future versions of similar tables, may improve the situation. 20 references

  20. Computing Distances between Probabilistic Automata

    Directory of Open Access Journals (Sweden)

    Mathieu Tracol

    2011-07-01

    Full Text Available We present relaxed notions of simulation and bisimulation on Probabilistic Automata (PAs) that allow some error epsilon. When epsilon is zero we retrieve the usual notions of bisimulation and simulation on PAs. We give logical characterisations of these notions by choosing suitable logics which differ from the elementary ones, L with negation and L without negation, by the modal operator. Using flow networks, we show how to compute the relations in PTIME. This allows the definition of an efficiently computable non-discounted distance between the states of a PA. A natural modification of this distance is introduced, to obtain a discounted distance, which weakens the influence of long-term transitions. We compare our notions of distance to others previously defined and illustrate our approach on various examples. We also show that our distance is non-expansive with respect to process algebra operators. Although L without negation is a suitable logic to characterise epsilon-(bi)simulation on deterministic PAs, it is not for general PAs; interestingly, we prove that it does characterise weaker notions, called a priori epsilon-(bi)simulation, which we prove to be NP-hard to decide.
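A toy sketch of the discounted-distance idea follows. The paper computes the exact relations with flow networks (a Kantorovich-style lifting); here one step of that lifting is approximated by the total variation distance between successor distributions, which upper-bounds it, and the automaton is invented:

```python
# Toy sketch of a discounted distance between PA states. The real
# construction uses flow networks; here the lifting is approximated by
# the total variation distance between successor distributions.
LAMBDA = 0.8  # discount factor: long-term transitions weigh less

# Successor distribution of each state: state -> {next_state: probability}.
succ = {
    "s": {"u": 0.5, "v": 0.5},
    "t": {"u": 0.4, "v": 0.6},
    "u": {"u": 1.0},
    "v": {"v": 1.0},
}

def tv(p, q):
    """Total variation distance between two finite distributions."""
    keys = set(p) | set(q)
    return 0.5 * sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)

def dist(a, b):
    """One-step discounted distance between states a and b."""
    return LAMBDA * tv(succ[a], succ[b])
```

With epsilon set to this distance, states "s" and "t" would be epsilon-similar in the relaxed sense even though they are not bisimilar exactly.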

  1. Probabilistic modeling of children's handwriting

    Science.gov (United States)

    Puri, Mukta; Srihari, Sargur N.; Hanson, Lisa

    2013-12-01

    Little work has been done on the analysis of children's handwriting, which can be useful in developing automatic evaluation systems and in quantifying handwriting individuality. We consider the statistical analysis of children's handwriting in early grades. Samples of handwriting of children in Grades 2-4 who were taught the Zaner-Bloser style were considered. The commonly occurring word "and", written in cursive style as well as hand-print, was extracted from extended writing. The samples were assigned feature values by human examiners using a truthing tool. The human examiners looked at how the children constructed letter formations in their writing, looking for similarities and differences from the instructions taught in the handwriting copy book. These similarities and differences were measured using a feature space distance measure. Results indicate that the handwriting develops towards more conformity with the class characteristics of the Zaner-Bloser copybook which, with practice, is the expected result. Bayesian networks were learnt from the data to enable answering various probabilistic queries, such as determining students who may continue to produce letter formations as taught during lessons in school and determining the students who will develop different variations of those letter formations, as well as the number of different types of letter formations.

  2. Probabilistic description of traffic flow

    International Nuclear Information System (INIS)

    Mahnke, R.; Kaupuzs, J.; Lubashevsky, I.

    2005-01-01

    A stochastic description of traffic flow, called probabilistic traffic flow theory, is developed. The general master equation is applied to relatively simple models to describe the formation and dissolution of traffic congestion. Our approach is mainly based on spatially homogeneous systems like periodically closed circular rings without on- and off-ramps. We consider a stochastic one-step process of growth or shrinkage of a car cluster (jam). As a generalization, we discuss the coexistence of several car clusters of different sizes. The basic problem is to find a physically motivated ansatz for the transition rates of the attachment and detachment of individual cars to a car cluster consistent with the empirical observations in real traffic. The emphasis is put on the analogy with first-order phase transitions and nucleation phenomena in physical systems like supersaturated vapour. The results are summarized in the flux-density relation, the so-called fundamental diagram of traffic flow, and compared with empirical data. Different regimes of traffic flow are discussed: free flow, the congested stop-and-go regime, and heavy viscous traffic. The traffic breakdown is studied based on the master equation as well as the Fokker-Planck approximation to calculate mean first passage times or escape rates. Generalizations are developed to allow for on-ramp effects. The calculated flux-density relation and characteristic breakdown times coincide with empirical data measured on highways. Finally, a brief summary of the stochastic cellular automata approach is given.
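The one-step process described above can be sketched via detailed balance. The rate forms below are illustrative placeholders, not the paper's empirically motivated ansatz:

```python
# Sketch: stationary distribution of a one-step (birth-death) master
# equation for the car-cluster size n, obtained from detailed balance.
# The rate forms are illustrative, not the paper's empirical ansatz.
N_MAX = 50

def w_plus(n):
    """Attachment rate of a car to a cluster of size n (constant inflow)."""
    return 2.0

def w_minus(n):
    """Detachment rate from a cluster of size n (grows with cluster size)."""
    return 1.0 + 0.06 * n

# Detailed balance: P(n+1) * w_minus(n+1) = P(n) * w_plus(n).
p = [1.0]
for n in range(N_MAX):
    p.append(p[-1] * w_plus(n) / w_minus(n + 1))
total = sum(p)
p = [x / total for x in p]

n_star = max(range(len(p)), key=p.__getitem__)  # most probable cluster size
```

The stationary distribution peaks where attachment and detachment rates balance, the analogue of the critical nucleus size in nucleation theory.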

  3. Distribution functions of probabilistic automata

    Science.gov (United States)

    Vatan, F.

    2001-01-01

    Each probabilistic automaton M over an alphabet A defines a probability measure Prob sub(M) on the set of all finite and infinite words over A. We can identify a k letter alphabet A with the set {0, 1,..., k-1}, and, hence, we can consider every finite or infinite word w over A as a radix k expansion of a real number X(w) in the interval [0, 1]. This makes X(w) a random variable and the distribution function of M is defined as usual: F(x) := Prob sub(M){ w: X(w) ≤ x }. We study the distribution functions of probabilistic automata in detail. Automata with continuous distribution functions are characterized. By a new and much simpler method, it is shown that the distribution function F(x) is an analytic function if it is a polynomial. Finally, answering a question posed by D. Knuth and A. Yao, we show that a polynomial distribution function F(x) on [0, 1] can be generated by a probabilistic automaton iff all the roots of F'(x) = 0 in this interval, if any, are rational numbers. For this, we define two dynamical systems on the set of polynomial distributions and study attracting fixed points of random compositions of these two systems.
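The construction of X(w) is easy to see empirically. The sketch below samples the random real defined by the radix-2 expansion of words from a trivial one-state "automaton" emitting i.i.d. biased binary digits, and estimates its distribution function; with a fair digit the law is uniform, so F(x) ≈ x. The parameters are illustrative:

```python
# Sketch: sample X(w) = sum of digits b_i * 2^-(i+1) for words emitted by a
# one-state automaton with i.i.d. biased digits, and estimate the
# distribution function F(x) = Prob{ X(w) <= x } empirically.
import bisect
import random

random.seed(0)
P_ONE = 0.5   # probability of emitting digit 1 (fair => uniform law)
DIGITS = 30   # truncation depth of the binary expansion
N = 20000     # Monte Carlo sample size

def sample_x():
    return sum((1.0 if random.random() < P_ONE else 0.0) * 2.0 ** -(i + 1)
               for i in range(DIGITS))

xs = sorted(sample_x() for _ in range(N))

def F(x):
    """Empirical distribution function."""
    return bisect.bisect_right(xs, x) / len(xs)
```

Setting P_ONE away from 0.5 yields the classical singular-continuous distributions, the kind of behaviour the abstract's characterization results address.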

  4. Probabilistic transport models for fusion

    International Nuclear Information System (INIS)

    Milligen, B.Ph. van; Carreras, B.A.; Lynch, V.E.; Sanchez, R.

    2005-01-01

    A generalization of diffusive (Fickian) transport is considered, in which particle motion is described by probability distributions. We design a simple model that includes a critical mechanism to switch between two transport channels, and show that it exhibits various interesting characteristics, suggesting that the ideas of probabilistic transport might provide a framework for the description of a range of unusual transport phenomena observed in fusion plasmas. The model produces power degradation and profile consistency, as well as a scaling of the confinement time with system size reminiscent of the gyro-Bohm/Bohm scalings observed in fusion plasmas, and rapid propagation of disturbances. In the present work we show how this model may also produce on-axis peaking of the profiles with off-axis fuelling. It is important to note that the fluid limit of a simple model like this, characterized by two transport channels, does not correspond to the usual (Fickian) transport models commonly used for modelling transport in fusion plasmas, and behaves in a fundamentally different way. (author)

  5. Analysis of NPP protection structure reliability under impact of a falling aircraft

    International Nuclear Information System (INIS)

    Shul'man, G.S.

    1996-01-01

    A methodology for evaluating the reliability of NPP protection structures under the impact of a falling aircraft is considered. The methodology is based on probabilistic analysis of all potential events. The problem is solved in three stages: determination of the loads on structural units, calculation of the local reliability of protection structures under the assigned loads, and estimation of the overall structure reliability. The proposed methodology may be applied at the NPP design stage and to the determination of the reliability of existing structures.

  6. Human Reliability Analysis: session summary

    International Nuclear Information System (INIS)

    Hall, R.E.

    1985-01-01

    The use of Human Reliability Analysis (HRA) to identify and resolve human factors issues has significantly increased over the past two years. Today, utilities, research institutions, consulting firms, and the regulatory agency have found a common application of HRA tools and Probabilistic Risk Assessment (PRA). The ''1985 IEEE Third Conference on Human Factors and Power Plants'' devoted three sessions to the discussion of these applications and a review of the insights so gained. This paper summarizes the three sessions and presents those common conclusions that were discussed during the meeting. The paper concludes that session participants supported the use of an adequately documented ''living PRA'' to address human factors issues in design and procedural changes, regulatory compliance, and training and that the techniques can produce cost effective qualitative results that are complementary to more classical human factors methods

  7. Interim reliability evaluation program (IREP)

    International Nuclear Information System (INIS)

    Carlson, D.D.; Murphy, J.A.

    1981-01-01

    The Interim Reliability Evaluation Program (IREP), sponsored by the Office of Nuclear Regulatory Research of the US Nuclear Regulatory Commission, is currently applying probabilistic risk analysis techniques to two PWR and two BWR type power plants. Emphasis was placed on the systems analysis portion of the risk assessment, as opposed to accident phenomenology or consequence analysis, since the identification of risk significant plant features was of primary interest. Traditional event tree/fault tree modeling was used for the analysis. However, the study involved a more thorough investigation of transient initiators and of support system faults than studies in the past and substantially improved techniques were used to quantify accident sequence frequencies. This study also attempted to quantify the potential for operator recovery actions in the course of each significant accident

  8. An Introduction To Reliability

    International Nuclear Information System (INIS)

    Park, Kyoung Su

    1993-08-01

    This book introduces reliability, covering the definition of and requirements for reliability; reliability over the system life cycle; reliability and failure rate, including reliability characteristics, chance failures, time-varying failure rates, failure modes, and replacement; reliability in engineering design; reliability testing under failure-rate assumptions; plotting of reliability data; prediction of system reliability; system maintenance; and failure topics, including failure relays and the analysis of system safety.

  9. Reliability Estimates for Undergraduate Grade Point Average

    Science.gov (United States)

    Westrick, Paul A.

    2017-01-01

    Undergraduate grade point average (GPA) is a commonly employed measure in educational research, serving as a criterion or as a predictor depending on the research question. Over the decades, researchers have used a variety of reliability coefficients to estimate the reliability of undergraduate GPA, which suggests that there has been no consensus…

  10. Operator reliability assessment system (OPERAS)

    International Nuclear Information System (INIS)

    Singh, A.; Spurgin, A.J.; Martin, T.; Welsch, J.; Hallam, J.W.

    1991-01-01

    OPERAS is personal-computer (PC) based software to collect and process simulator data on control-room operators' responses during requalification training scenarios. The data collection scheme is based upon the approach developed earlier during the EPRI Operator Reliability Experiments project. The software allows automated data collection from the simulator, thus minimizing the simulator staff time and resources needed to collect, maintain and process data, which can be useful in monitoring, assessing and enhancing crew reliability and effectiveness. The system is designed to provide the data and output information in the form of user-friendly charts, tables and figures for use by plant staff. OPERAS prototype software has been implemented at the Diablo Canyon (PWR) and Millstone (BWR) plants and is currently being used to collect operator response data. Data collected from the simulator include plant-state variables such as reactor pressure and temperature, malfunctions, times at which annunciators are activated, operator actions, and observations of crew behavior by training staff. The data and systematic analytical results provided by the OPERAS system can help utility probabilistic risk analysis (PRA) and training staff monitor and assess the reliability of their crews more objectively.

  11. Very Short-term Nonparametric Probabilistic Forecasting of Renewable Energy Generation - with Application to Solar Energy

    DEFF Research Database (Denmark)

    Golestaneh, Faranak; Pinson, Pierre; Gooi, Hoay Beng

    2016-01-01

    Due to the inherent uncertainty involved in renewable energy forecasting, uncertainty quantification is a key input to maintain acceptable levels of reliability and profitability in power system operation. A proposal is formulated and evaluated here for the case of solar power generation, when only...... approach to generate very short-term predictive densities, i.e., for lead times between a few minutes to one hour ahead, with fast frequency updates. We rely on an Extreme Learning Machine (ELM) as a fast regression model, trained in varied ways to obtain both point and quantile forecasts of solar power...... generation. Four probabilistic methods are implemented as benchmarks. Rival approaches are evaluated based on a number of test cases for two solar power generation sites in different climatic regions, allowing us to show that our approach results in generation of skilful and reliable probabilistic forecasts...
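Quantile forecasts of the kind described above are conventionally scored with the pinball (quantile) loss. The sketch below evaluates an invented set of solar-power quantile forecasts against one observation; the ELM training itself is not reproduced here:

```python
# Sketch: scoring quantile forecasts of solar power with the pinball loss,
# the standard criterion for skilful quantile forecasts. Numbers invented.
def pinball(y, q_hat, tau):
    """Pinball (quantile) loss of forecast q_hat at level tau, outcome y."""
    return tau * (y - q_hat) if y >= q_hat else (1.0 - tau) * (q_hat - y)

taus = [0.1, 0.5, 0.9]
forecast = {0.1: 10.0, 0.5: 25.0, 0.9: 42.0}  # predicted quantiles, kW
observed = 30.0                               # realized solar power, kW

score = sum(pinball(observed, forecast[t], t) for t in taus) / len(taus)
```

Lower average pinball loss across quantile levels and test cases indicates sharper and better-calibrated predictive densities.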

  12. Multiple sequential failure model: A probabilistic approach to quantifying human error dependency

    International Nuclear Information System (INIS)

    Samanta

    1985-01-01

    This paper presents a probabilistic approach to quantifying human error dependency when multiple tasks are performed. Dependent human failures are dominant contributors to risks from nuclear power plants. An overview is given of the Multiple Sequential Failure (MSF) model and its use in probabilistic risk assessments (PRAs), depending on the available data. A small-scale psychological experiment was conducted on the nature of human dependency, and interpretation of the experimental data with the MSF model shows that it accommodates the dependent-failure data remarkably well. The model, which provides a unique method for quantifying dependent failures in human reliability analysis, can be used in conjunction with any of the general methods currently used for the human reliability portion of PRAs.
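Why dependency matters for sequential failures can be seen with a minimal sketch. The conditional-probability model below is a generic illustration of dependence (in the spirit of THERP-style conditioning), not Samanta's exact MSF formulation, and the parameter values are invented:

```python
# Sketch: joint failure probability over sequential tasks, with and
# without dependence. Generic illustration, not the exact MSF model.
P = 1e-2     # nominal per-task human error probability
D = 0.5      # dependence level: extra failure tendency after a failure
TASKS = 3

p_independent = P ** TASKS                 # independence assumption

p_cond = P + (1.0 - P) * D                 # conditional repeat-failure prob.
p_dependent = P * p_cond ** (TASKS - 1)    # first failure, then dependent ones
```

Even moderate dependence inflates the joint failure probability by orders of magnitude over the independence assumption, which is why dependent failures dominate PRA results.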

  13. Probabilistic atlas based labeling of the cerebral vessel tree

    Science.gov (United States)

    Van de Giessen, Martijn; Janssen, Jasper P.; Brouwer, Patrick A.; Reiber, Johan H. C.; Lelieveldt, Boudewijn P. F.; Dijkstra, Jouke

    2015-03-01

    Preoperative imaging of the cerebral vessel tree is essential for planning therapy on intracranial stenoses and aneurysms. Usually, a magnetic resonance angiography (MRA) or computed tomography angiography (CTA) is acquired from which the cerebral vessel tree is segmented. Accurate analysis is helped by the labeling of the cerebral vessels, but labeling is non-trivial due to anatomical topological variability and missing branches due to acquisition issues. In recent literature, labeling the cerebral vasculature around the Circle of Willis has mainly been approached as a graph-based problem. The most successful method, however, requires the definition of all possible permutations of missing vessels, which limits application to subsets of the tree and ignores spatial information about the vessel locations. This research aims to perform labeling using probabilistic atlases that model spatial vessel and label likelihoods. A cerebral vessel tree is aligned to a probabilistic atlas and subsequently each vessel is labeled by computing the maximum label likelihood per segment from label-specific atlases. The proposed method was validated on 25 segmented cerebral vessel trees. Labeling accuracies were close to 100% for large vessels, but dropped to 50-60% for small vessels that were only present in less than 50% of the set. With this work we showed that using solely spatial information of the vessel labels, vessel segments from stable vessels (>50% presence) were reliably classified. This spatial information will form the basis for a future labeling strategy with a very loose topological model.
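The per-segment labeling step described above reduces to an argmax over label-specific atlas likelihoods. The sketch below illustrates that step only; all likelihood values and segment names are invented:

```python
# Sketch: maximum-likelihood labeling of vessel segments against per-label
# spatial likelihood atlases. All values are invented for illustration.
# likelihood[label][segment]: atlas likelihood of that label at the segment.
likelihood = {
    "ICA": {"seg1": 0.80, "seg2": 0.10},
    "MCA": {"seg1": 0.15, "seg2": 0.70},
    "ACA": {"seg1": 0.05, "seg2": 0.20},
}

def label_segment(seg):
    """Assign the label with maximum atlas likelihood for this segment."""
    return max(likelihood, key=lambda lab: likelihood[lab][seg])

labels = {seg: label_segment(seg) for seg in ["seg1", "seg2"]}
```

For stable vessels the spatial likelihood is sharply peaked, which is why this purely spatial rule already classifies them reliably.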

  14. Sex determination using the Probabilistic Sex Diagnosis (DSP: Diagnose Sexuelle Probabiliste) tool in a virtual environment.

    Science.gov (United States)

    Chapman, Tara; Lefevre, Philippe; Semal, Patrick; Moiseev, Fedor; Sholukha, Victor; Louryan, Stéphane; Rooze, Marcel; Van Sint Jan, Serge

    2014-01-01

    The hip bone is one of the most reliable indicators of sex in the human body because it is the most dimorphic bone. Probabilistic Sex Diagnosis (DSP: Diagnose Sexuelle Probabiliste), developed by Murail et al. in 2005, is a sex determination method based on a worldwide hip bone metrical database. Sex is determined by comparing specific measurements taken from each specimen using sliding callipers and computing the probability of specimens being female or male. In forensic science it is sometimes not possible to sex a body due to corpse decay or injury. Skeletalization and dissection of a body are laborious processes that desecrate the body. There were two aims to this study. The first aim was to examine the accuracy of the DSP method in comparison with a current visual sexing method. A further aim was to see if it was possible to utilise the DSP method virtually, on both the hip bone and the pelvic girdle, for use in the forensic sciences. For the first part of the study, forty-nine dry hip bones of unknown sex were obtained from the Body Donation Programme of the Université Libre de Bruxelles (ULB). A comparison was made between DSP analysis and visual sexing on dry bone by two researchers. CT scans of bones were then analysed to obtain three-dimensional (3D) virtual models, and the DSP method was applied virtually by importing the models into a customised software programme called lhpFusionBox, which was developed at ULB. The software enables DSP distances to be measured via virtually-palpated bony landmarks. There was found to be 100% agreement of sex between the manual and virtual DSP methods. The second part of the study aimed to further validate the method by blind analysis of thirty-nine supplementary pelvic girdles of known sex. There was found to be a 100% accuracy rate, further demonstrating that the virtual DSP method is robust. Statistically significant differences were found in the identification of sex

  15. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    Science.gov (United States)

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  16. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    Science.gov (United States)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.
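A choice-value style ranking gives the flavour of such decision procedures. The scoring rule below is an illustrative guess at how positive and negative parameters might be combined, not the authors' exact algorithm, and all data are invented:

```python
# Sketch: ranking alternatives under a probabilistic soft set with positive
# and negative parameters. Illustrative scoring rule; data invented.
positive = ["cheap", "reliable"]   # parameters that favour an alternative
negative = ["noisy"]               # parameters that count against it

# prob[param][alt]: probability that the alternative has the property.
prob = {
    "cheap":    {"a1": 0.9, "a2": 0.4},
    "reliable": {"a1": 0.6, "a2": 0.8},
    "noisy":    {"a1": 0.7, "a2": 0.1},
}

def score(alt):
    """Sum of positive-parameter probabilities minus the negative ones."""
    return (sum(prob[par][alt] for par in positive)
            - sum(prob[par][alt] for par in negative))

best = max(["a1", "a2"], key=score)
```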

  17. The development of a probabilistic approach to forecast coastal change

    Science.gov (United States)

    Lentz, Erika E.; Hapke, Cheryl J.; Rosati, Julie D.; Wang, Ping; Roberts, Tiffany M.

    2011-01-01

    This study demonstrates the applicability of a Bayesian probabilistic model as an effective tool in predicting post-storm beach changes along sandy coastlines. Volume change and net shoreline movement are modeled for two study sites at Fire Island, New York in response to two extratropical storms in 2007 and 2009. Both study areas include modified areas adjacent to unmodified areas in morphologically different segments of coast. Predicted outcomes are evaluated against observed changes to test model accuracy and uncertainty along 163 cross-shore transects. Results show strong agreement in the cross validation of predictions vs. observations, with 70-82% accuracies reported. Although no consistent spatial pattern in inaccurate predictions could be determined, the highest prediction uncertainties appeared in locations that had been recently replenished. Further testing and model refinement are needed; however, these initial results show that Bayesian networks have the potential to serve as important decision-support tools in forecasting coastal change.

  18. 77 FR 13173 - Best Equipped Best Served

    Science.gov (United States)

    2012-03-05

    ... on the best equipped, best performing, best served concept for implementation in the 2012-2014... Advisory Committee (NAC). FAA is seeking stakeholder input on the technical and operational feasibility of...

  19. The probabilistic innovation theoretical framework

    Directory of Open Access Journals (Sweden)

    Chris W. Callaghan

    2017-07-01

    Full Text Available Background: Despite technological advances that offer new opportunities for solving societal problems in real time, knowledge management theory development has largely not kept pace with these developments. This article seeks to offer useful insights into how more effective theory development in this area could be enabled. Aim: This article suggests different streams of literature for inclusion into a theoretical framework for an emerging stream of research, termed ‘probabilistic innovation’, which seeks to develop a system of real-time research capability. The objective of this research is therefore to provide a synthesis of a range of diverse literatures, and to provide useful insights into how research enabled by crowdsourced research and development can potentially be used to address serious knowledge problems in real time. Setting: This research suggests that knowledge management theory can provide an anchor for a new stream of research contributing to the development of real-time knowledge problem solving. Methods: This conceptual article seeks to re-conceptualise the problem of real-time research and locate this knowledge problem in relation to a host of rapidly developing streams of literature. In doing so, a novel perspective of societal problem-solving is enabled. Results: An analysis of theory and literature suggests that certain rapidly developing streams of literature might more effectively contribute to societally important real-time research problem solving if these streams are united under a theoretical framework with this goal as its explicit focus. Conclusion: Although the goal of real-time research is not yet attainable, research that contributes to its attainment may ultimately make an important contribution to society.

  20. Probabilistic causality, selection bias, and the logic of the democratic peace

    OpenAIRE

    Slantchev, Branislav L; Alexandrova, A; Gartzke, E

    2005-01-01

    Rosato (2003) claims to have discredited democratic peace theories. However, the methodological approach adopted by the study cannot reliably generate the conclusions espoused by the author. Rosato seems to misunderstand the probabilistic nature of most arguments about democratic peace and ignores issues that an appropriate research design should account for. Further, the study's use of case studies and data sets without attention to selection-bias produces examples that actually support theo...