WorldWideScience

Sample records for probabilistic performance goals

  1. Probabilistic safety assessment goals in Canada

    International Nuclear Information System (INIS)

    Snell, V.G.

    1986-01-01

    CANDU safety philosphy, both in design and in licensing, has always had a strong bias towards quantitative probabilistically-based goals derived from comparative safety. Formal probabilistic safety assessment began in Canada as a design tool. The influence of this carried over later on into the definition of the deterministic safety guidelines used in CANDU licensing. Design goals were further developed which extended the consequence/frequency spectrum of 'acceptable' events, from the two points defined by the deterministic single/dual failure analysis, to a line passing through lower and higher frequencies. Since these were design tools, a complete risk summation was not necessary, allowing a cutoff at low event frequencies while preserving the identification of the most significant safety-related events. These goals gave a logical framework for making decisions on implementing design changes proposed as a result of the Probabilistic Safety Analysis. Performing this analysis became a regulatory requirement, and the design goals remained the framework under which this was submitted. Recently, there have been initiatives to incorporate more detailed probabilistic safety goals into the regulatory process in Canada. These range from far-reaching safety optimization across society, to initiatives aimed at the nuclear industry only. The effectiveness of the latter is minor at very low and very high event frequencies; at medium frequencies, a justification against expenditures per life saved in other industries should be part of the goal setting

  2. Probabilistic safety goals. Phase 3 - Status report

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT (Finland)); Knochenhauer, M. (Relcon Scandpower AB, Sundbyberg (Sweden))

    2009-07-15

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  3. Probabilistic safety goals. Phase 3 - Status report

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Knochenhauer, M.

    2009-07-01

    The first phase of the project (2006) described the status, concepts and history of probabilistic safety goals for nuclear power plants. The second and third phases (2007-2008) have provided guidance related to the resolution of some of the problems identified, and resulted in a common understanding regarding the definition of safety goals. The basic aim of phase 3 (2009) has been to increase the scope and level of detail of the project, and to start preparations of a guidance document. Based on the conclusions from the previous project phases, the following issues have been covered: 1) Extension of international overview. Analysis of results from the questionnaire performed within the ongoing OECD/NEA WGRISK activity on probabilistic safety criteria, including participation in the preparation of the working report for OECD/NEA/WGRISK (to be finalised in phase 4). 2) Use of subsidiary criteria and relations between these (to be finalised in phase 4). 3) Numerical criteria when using probabilistic analyses in support of deterministic safety analysis (to be finalised in phase 4). 4) Guidance for the formulation, application and interpretation of probabilistic safety criteria (to be finalised in phase 4). (LN)

  4. Probabilistic Safety Goals for Nuclear Power Plants; Phases 2-4 / Final Report

    International Nuclear Information System (INIS)

    Bengtsson, Lisa; Knochenhauer, Michael; Holmberg, Jan-Erik; Rossi, Jukka

    2011-05-01

    The outcome of a probabilistic safety assessment (PSA) for a nuclear power plant is a combination of qualitative and quantitative results. Quantitative results are typically presented as the Core Damage Frequency (CDF) and as the frequency of an unacceptable radioactive release. In order to judge the acceptability of PSA results, criteria for the interpretation of results and the assessment of their acceptability need to be defined. Safety goals are defined in different ways in different countries and also used differently. Many countries are presently developing them in connection to the transfer to risk-informed regulation of both operating nuclear power plants (NPP) and new designs. However, it is far from self-evident how probabilistic safety criteria should be defined and used. On one hand, experience indicates that safety goals are valuable tools for the interpretation of results from a probabilistic safety assessment (PSA), and they tend to enhance the realism of a risk assessment. On the other hand, strict use of probabilistic criteria is usually avoided. A major problem is the large number of different uncertainties in a PSA model, which makes it difficult to demonstrate the compliance with a probabilistic criterion. Further, it has been seen that PSA results can change a lot over time due to scope extensions, revised operating experience data, method development, changes in system requirements, or increases of level of detail, mostly leading to an increase of the frequency of the calculated risk. This can cause a problem of consistency in the judgments. The first phase of the project (2006) provided a general description of the issue of probabilistic safety goals for nuclear power plants, of important concepts related to the definition and application of safety goals, and of experiences in Finland and Sweden. The second, third and fourth phases (2007-2009) have been concerned with providing guidance related to the resolution of some of the problems

  5. Seismic design and evaluation criteria based on target performance goals

    International Nuclear Information System (INIS)

    Murray, R.C.; Nelson, T.A.; Kennedy, R.P.; Short, S.A.

    1994-04-01

    The Department of Energy utilizes deterministic seismic design/evaluation criteria developed to achieve probabilistic performance goals. These seismic design and evaluation criteria are intended to apply equally to the design of new facilities and to the evaluation of existing facilities. In addition, the criteria are intended to cover design and evaluation of buildings, equipment, piping, and other structures. Four separate sets of seismic design/evaluation criteria have been presented each with a different performance goal. In all these criteria, earthquake loading is selected from seismic hazard curves on a probabilistic basis but seismic response evaluation methods and acceptable behavior limits are deterministic approaches with which design engineers are familiar. For analytical evaluations, conservatism has been introduced through the use of conservative inelastic demand-capacity ratios combined with ductile detailing requirements, through the use of minimum specified material strengths and conservative code capacity equations, and through the use of a seismic scale factor. For evaluation by testing or by experience data, conservatism has been introduced through the use of an increase scale factor which is applied to the prescribed design/evaluation input motion

  6. Probabilistic safety goals for nuclear power plants; Phases 2-4. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Bengtsson, L.; Knochenhauer, M. (Scandpower AB (Sweden)); Holmberg, J.-E.; Rossi, J. (VTT Technical Research Centre of Finland (Finland))

    2011-05-15

    Safety goals are defined in different ways in different countries and also used differently. Many countries are presently developing them in connection to the transfer to risk-informed regulation of both operating nuclear power plants (NPP) and new designs. However, it is far from self-evident how probabilistic safety criteria should be defined and used. On one hand, experience indicates that safety goals are valuable tools for the interpretation of results from a probabilistic safety assessment (PSA), and they tend to enhance the realism of a risk assessment. On the other hand, strict use of probabilistic criteria is usually avoided. A major problem is the large number of different uncertainties in a PSA model, which makes it difficult to demonstrate the compliance with a probabilistic criterion. Further, it has been seen that PSA results can change a lot over time due to scope extensions, revised operating experience data, method development, changes in system requirements, or increases of level of detail, mostly leading to an increase of the frequency of the calculated risk. This can cause a problem of consistency in the judgments. This report presents the results from the second, third and fourth phases of the project (2007-2009), which have dealt with providing guidance related to the resolution of some specific problems, such as the problem of consistency in judgement, comparability of safety goals used in different industries, the relationship between criteria on different levels, and relations between criteria for level 2 and 3 PSA. In parallel, additional context information has been provided. This was achieved by extending the international overview by contributing to and benefiting from a survey on PSA safety criteria which was initiated in 2006 within the OECD/NEA Working Group Risk. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by

  7. Probabilistic safety goals for nuclear power plants; Phases 2-4. Final report

    International Nuclear Information System (INIS)

    Bengtsson, L.; Knochenhauer, M.; Holmberg, J.-E.; Rossi, J.

    2011-05-01

    Safety goals are defined in different ways in different countries and also used differently. Many countries are presently developing them in connection to the transfer to risk-informed regulation of both operating nuclear power plants (NPP) and new designs. However, it is far from self-evident how probabilistic safety criteria should be defined and used. On one hand, experience indicates that safety goals are valuable tools for the interpretation of results from a probabilistic safety assessment (PSA), and they tend to enhance the realism of a risk assessment. On the other hand, strict use of probabilistic criteria is usually avoided. A major problem is the large number of different uncertainties in a PSA model, which makes it difficult to demonstrate the compliance with a probabilistic criterion. Further, it has been seen that PSA results can change a lot over time due to scope extensions, revised operating experience data, method development, changes in system requirements, or increases of level of detail, mostly leading to an increase of the frequency of the calculated risk. This can cause a problem of consistency in the judgments. This report presents the results from the second, third and fourth phases of the project (2007-2009), which have dealt with providing guidance related to the resolution of some specific problems, such as the problem of consistency in judgement, comparability of safety goals used in different industries, the relationship between criteria on different levels, and relations between criteria for level 2 and 3 PSA. In parallel, additional context information has been provided. This was achieved by extending the international overview by contributing to and benefiting from a survey on PSA safety criteria which was initiated in 2006 within the OECD/NEA Working Group Risk. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by

  8. Probabilistic Safety Goals. Phase 1 Status and Experiences in Sweden and Finland

    International Nuclear Information System (INIS)

    Holmberg, Jan-Erik; Knochenhauer, Michael

    2007-02-01

    The outcome of a probabilistic safety assessment (PSA) for a nuclear power plant is a combination of qualitative and quantitative results. Quantitative results are typically presented as the Core Damage Frequency (CDF) and as the frequency of an unacceptable radioactive release. In order to judge the acceptability of PSA results, criteria for the interpretation of results and the assessment of their acceptability need to be defined. Ultimately, the goals are intended to define an acceptable level of risk from the operation of a nuclear facility. However, safety goals usually have a dual function, i.e., they define an acceptable safety level, but they also have a wider and more general use as decision criteria. The exact levels of the safety goals differ between organisations and between different countries. There are also differences in the definition of the safety goal, and in the formal status of the goals, i.e., whether they are mandatory or not. In this first phase of the project, the aim has been on providing a clear description of the issue of probabilistic safety goals for nuclear power plants, to define and describe important concepts related to the definition and application of safety goals, and to describe experiences in Finland and Sweden. Based on a series of interviews and on literature reviews as well as on a limited international over-view, the project has described the history and current status of safety goals in Sweden and Finland, and elaborated on a number of issues, including the following: The status of the safety goals in view of the fact that they have been exceeded for much of the time they have been in use, as well as the possible implications of these exceedances. Safety goals as informal or mandatory limits. Strategies for handling violations of safety goals, including various graded approaches, such as ALARP (As Low As Reasonably Practicable). Relation between safety goals defined on different levels, e.g., for core damage and for

  9. Probabilistic safety goals. Phase 1 - Status and experiences in Sweden and Finland

    International Nuclear Information System (INIS)

    Holmberg, J.E.; Knochenhauer, M.

    2007-03-01

    The outcome of a probabilistic safety assessment (PSA) for a nuclear power plant is a combination of qualitative and quantitative results. Quantitative results are typically presented as the Core Damage Frequency (CDF) and as the frequency of an unacceptable radioactive release. In order to judge the acceptability of PSA results, criteria for the interpretation of results and the assessment of their acceptability need to be defined. Ultimately, the goals are intended to define an acceptable level of risk from the operation of a nuclear facility. However, safety goals usually have a dual function, i.e., they define an acceptable safety level, but they also have a wider and more general use as decision criteria. The exact levels of the safety goals differ between organisations and between different countries. There are also differences in the definition of the safety goal, and in the formal status of the goals, i.e., whether they are mandatory or not. In this first phase of the project, the aim has been on providing a clear description of the issue of probabilistic safety goals for nuclear power plants, to define and describe important concepts related to the definition and application of safety goals, and to describe experiences in Finland and Sweden. Based on a series of interviews and on literature reviews as well as on a limited international over-view, the project has described the history and current status of safety goals in Sweden and Finland, and elaborated on a number of issues, including the following: 1) The status of the safety goals in view of the fact that they have been exceeded for much of the time they have been in use, as well as the possible implications of these exceedances. 2) Safety goals as informal or mandatory limits. 3) Strategies for handling violations of safety goals, including various graded approaches, such as ALARP (As Low As Reasonably Practicable). 4) Relation between safety goals defined on different levels, e.g., for core damage

  10. Probabilistic Safety Goals. Phase 1 Status and Experiences in Sweden and Finland

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, Jan-Erik (VTT, FI-02044 VTT (Finland)); Knochenhauer, Michael (Relcon Scandpower AB, SE-172 25 Sundbyberg (Sweden))

    2007-02-15

    The outcome of a probabilistic safety assessment (PSA) for a nuclear power plant is a combination of qualitative and quantitative results. Quantitative results are typically presented as the Core Damage Frequency (CDF) and as the frequency of an unacceptable radioactive release. In order to judge the acceptability of PSA results, criteria for the interpretation of results and the assessment of their acceptability need to be defined. Ultimately, the goals are intended to define an acceptable level of risk from the operation of a nuclear facility. However, safety goals usually have a dual function, i.e., they define an acceptable safety level, but they also have a wider and more general use as decision criteria. The exact levels of the safety goals differ between organisations and between different countries. There are also differences in the definition of the safety goal, and in the formal status of the goals, i.e., whether they are mandatory or not. In this first phase of the project, the aim has been on providing a clear description of the issue of probabilistic safety goals for nuclear power plants, to define and describe important concepts related to the definition and application of safety goals, and to describe experiences in Finland and Sweden. Based on a series of interviews and on literature reviews as well as on a limited international over-view, the project has described the history and current status of safety goals in Sweden and Finland, and elaborated on a number of issues, including the following: The status of the safety goals in view of the fact that they have been exceeded for much of the time they have been in use, as well as the possible implications of these exceedances. Safety goals as informal or mandatory limits. Strategies for handling violations of safety goals, including various graded approaches, such as ALARP (As Low As Reasonably Practicable). Relation between safety goals defined on different levels, e.g., for core damage and for

  11. Probabilistic safety goals. Phase 1 - Status and experiences in Sweden and Finland

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.E. [VTT (Finland); Knochenhauer, M. [Relcon Scandpower AB (Sweden)

    2007-03-15

    The outcome of a probabilistic safety assessment (PSA) for a nuclear power plant is a combination of qualitative and quantitative results. Quantitative results are typically presented as the Core Damage Frequency (CDF) and as the frequency of an unacceptable radioactive release. In order to judge the acceptability of PSA results, criteria for the interpretation of results and the assessment of their acceptability need to be defined. Ultimately, the goals are intended to define an acceptable level of risk from the operation of a nuclear facility. However, safety goals usually have a dual function, i.e., they define an acceptable safety level, but they also have a wider and more general use as decision criteria. The exact levels of the safety goals differ between organisations and between different countries. There are also differences in the definition of the safety goal, and in the formal status of the goals, i.e., whether they are mandatory or not. In this first phase of the project, the aim has been on providing a clear description of the issue of probabilistic safety goals for nuclear power plants, to define and describe important concepts related to the definition and application of safety goals, and to describe experiences in Finland and Sweden. Based on a series of interviews and on literature reviews as well as on a limited international over-view, the project has described the history and current status of safety goals in Sweden and Finland, and elaborated on a number of issues, including the following: 1) The status of the safety goals in view of the fact that they have been exceeded for much of the time they have been in use, as well as the possible implications of these exceedances. 2) Safety goals as informal or mandatory limits. 3) Strategies for handling violations of safety goals, including various graded approaches, such as ALARP (As Low As Reasonably Practicable). 4) Relation between safety goals defined on different levels, e.g., for core damage

  12. Meeting performance goals by the use of experience data

    International Nuclear Information System (INIS)

    Salmon, M.W.; Kennedy, R.P.

    1993-01-01

    DOE Order 5480.28 requires that structures, systems and components (SSCs) be designed and constructed to withstand the effects of natural phenomena hazards. For SSCs to be acceptable, it must be demonstrated that there is a sufficiently low probability of failure of those SSCs consistent with established performance goals. For new design, NPH loads are taken from probabilistic hazard assessments and coupled with response and evaluation methods to control the levels of conservatism required to achieve performance goals. For components qualified by test, performance goals are achieved by specifying a test response spectrum that envelops a required response spectrum coupled with minimal acceptance standards. DOE Standard 1020-92 adapts both of these approaches to ensure that the required performance goals are met for new installations. For existing installations these approaches are generally not applicable. There is a need for a simple approach for use in verifying the performance of existing equipment subject to seismic hazards. The USNRC has adapted such an approach for the resolution of USI A-46 in the Generic Implementation Procedure (GIP). A simple set of screening rules, keyed to a generic bounding spectrum forms the basis of the USNRC approach. A similar approach is being adapted for use in the DOE. The DOE approach, however, must also ensure that appropriate performance goals are met when the general screens are met. This paper summarizes research to date on the topic of meeting performance goals by the use of experience data. The paper presents a review of the background material, a summary of the requirements for existing components, a summary of the approach used in establishing the performance goals associated with experience data approaches, and a summary of results to date. Simplified criteria are proposed

  13. Developing Probabilistic Safety Performance Margins for Unknown and Underappreciated Risks

    Science.gov (United States)

    Benjamin, Allan; Dezfuli, Homayoon; Everett, Chris

    2015-01-01

    Probabilistic safety requirements currently formulated or proposed for space systems, nuclear reactor systems, nuclear weapon systems, and other types of systems that have a low-probability potential for high-consequence accidents depend on showing that the probability of such accidents is below a specified safety threshold or goal. Verification of compliance depends heavily upon synthetic modeling techniques such as PRA. To determine whether or not a system meets its probabilistic requirements, it is necessary to consider whether there are significant risks that are not fully considered in the PRA either because they are not known at the time or because their importance is not fully understood. The ultimate objective is to establish a reasonable margin to account for the difference between known risks and actual risks in attempting to validate compliance with a probabilistic safety threshold or goal. In this paper, we examine data accumulated over the past 60 years from the space program, from nuclear reactor experience, from aircraft systems, and from human reliability experience to formulate guidelines for estimating probabilistic margins to account for risks that are initially unknown or underappreciated. The formulation includes a review of the safety literature to identify the principal causes of such risks.

  14. Probabilistic Structural Analysis Theory Development

    Science.gov (United States)

    Burnside, O. H.

    1985-01-01

    The objective of the Probabilistic Structural Analysis Methods (PSAM) project is to develop analysis techniques and computer programs for predicting the probabilistic response of critical structural components for current and future space propulsion systems. This technology will play a central role in establishing system performance and durability. The first year's technical activity is concentrating on probabilistic finite element formulation strategy and code development. Work is also in progress to survey critical materials and space shuttle mian engine components. The probabilistic finite element computer program NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) is being developed. The final probabilistic code will have, in the general case, the capability of performing nonlinear dynamic of stochastic structures. It is the goal of the approximate methods effort to increase problem solving efficiency relative to finite element methods by using energy methods to generate trial solutions which satisfy the structural boundary conditions. These approximate methods will be less computer intensive relative to the finite element approach.

  15. Development of Probabilistic Performance Evaluation Procedure for Umbilical Lines of Seismically Isolated NPPs

    International Nuclear Information System (INIS)

    Hahm, Daegi; Park, Junhee; Choi, Inkil

    2013-01-01

    In this study, we proposed a procedure to perform the probabilistic performance evaluation of interface piping system for seismically isolated NPPs, and carried out the preliminary performance evaluation of the target example umbilical line. For EDB level earthquakes, the target performance goal cannot be fulfilled, but we also find out that the result can be changed with respect to the variation of the assumed values, i. e., the distribution of response, and the limit state of piping system. Recently, to design the nuclear power plants (NPPs) more efficiently and safely against the strong seismic load, many researchers focus on the seismic isolation system. For the adoption of seismic isolation system to the NPPs, the seismic performance of isolation devices, structures, and components should be guaranteed firstly. Hence, some researches were performed to determine the seismic performance of such items. For the interface piping system between isolated structure and non-isolated structure, the seismic capacity should be carefully estimated since that the required displacement absorption capacity will be increased significantly by the adoption of the seismic isolation system. Nowadays, in NUREG report, the probabilistic performance criteria for isolated NPP structures and components are proposed. Hence, in this study, we developed the probabilistic performance evaluation method and procedure for interface piping system, and applied the method to an example pipe. The detailed procedure and main results are summarized in next section. For the interface piping system, the seismic capacity should be carefully estimated since that the required displacement absorption capacity will be increased significantly by the adoption of the seismic isolation system

  16. Probabilistic safety goals. Phase 2 - Status report

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Bjoerkman, K.; Rossi, J.; Knochenhauer, M.; Xuhong He; Persson, A.; Gustavsson, H.

    2008-07-01

    The second phase of the project, the outcome of which is described in this project report has mainly dealt with four issues: 1) Consistency in the usage of safety goals 2) Criteria for assessment of results from PSA level 2 3) Overview of international safety goals and experiences from their use 4) Safety goals related to other man-made risks in society. Consistency in judgement over time has been perceived to be one of the main problems in the usage of safety goals. Safety goals defined in the 80ies were met in the beginning with PSA:s performed to the standards of that time, i.e., by PSA:s that were quite limited in scope and level of detail compared to today's state of the art. This issue was investigated by performing a comparative review was performed of three generations of the same PSA, focusing on the impact from changes over time in component failure data, IE frequency, and modelling of the plant, including plant changes and changes in success criteria. It proved to be very time-consuming and in some cases next to impossible to correctly identify the basic causes for changes in PSA results. A multitude of different sub-causes turned out to combined and difficult to differentiate. Thus, rigorous book-keeping is needed in order to keep track of how and why PSA results change. This is especially important in order to differentiate 'real' differences due to plant changes and updated component and IE data from differences that are due to general PSA development (scope, level of detail, modelling issues). (au)

  17. Probabilistic safety goals. Phase 2 - Status report

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E.; Bjoerkman, K. Rossi, J. (VTT (Finland)); Knochenhauer, M.; Xuhong He; Persson, A.; Gustavsson, H. (Relcon Scandpower AB, Sundbyberg (Sweden))

    2008-07-15

    The second phase of the project, the outcome of which is described in this project report has mainly dealt with four issues: 1) Consistency in the usage of safety goals 2) Criteria for assessment of results from PSA level 2 3) Overview of international safety goals and experiences from their use 4) Safety goals related to other man-made risks in society. Consistency in judgement over time has been perceived to be one of the main problems in the usage of safety goals. Safety goals defined in the 80ies were met in the beginning with PSA:s performed to the standards of that time, i.e., by PSA:s that were quite limited in scope and level of detail compared to today's state of the art. This issue was investigated by performing a comparative review was performed of three generations of the same PSA, focusing on the impact from changes over time in component failure data, IE frequency, and modelling of the plant, including plant changes and changes in success criteria. It proved to be very time-consuming and in some cases next to impossible to correctly identify the basic causes for changes in PSA results. A multitude of different sub-causes turned out to combined and difficult to differentiate. Thus, rigorous book-keeping is needed in order to keep track of how and why PSA results change. This is especially important in order to differentiate 'real' differences due to plant changes and updated component and IE data from differences that are due to general PSA development (scope, level of detail, modelling issues). (au)

  18. Application of probabilistic safety goals to regulation of nuclear power plants in Canada

    Energy Technology Data Exchange (ETDEWEB)

    Rzentkowski, G.; Akl, Y.; Yalaoui, S. [Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2013-07-01

    In the Canadian nuclear regulatory framework, Safety Goals are formulated in addition to the deterministic design requirements and the dose acceptance criteria so that risk to the public that originates from accidents outside the design basis is considered. In principle, application of the Safety Goals ensures that the likelihood of accidents with serious radiological consequences is extremely low, and the potential radiological consequences from severe accidents are limited as far as practicable. Effectively, the Safety Goals extend the plant design envelope to include not only the capabilities of the plant to successfully cope with various plant states, but also practical measures to halt the progression of severe accidents. This paper describes the general approach to the development of the Safety Goals and their application to the existing nuclear power plants in Canada. This general approach is consistent with the currently accepted international practice and Canadian regulatory experience. The results of probabilistic safety assessments indicate that the Safety Goals meet or exceed international safety objectives due to effective implementation of the defence-in-depth principle in the reactor design and plant operation. At the same time, the application of the Safety Goals reveal that practicable measures exist to further enhance the overall level of reactor safety by focusing on severe accident prevention and mitigation. These measures are being currently implemented through refurbishment projects and feedback on operating experience. (author)

  19. A Probabilistic Design Methodology for a Turboshaft Engine Overall Performance Analysis

    Directory of Open Access Journals (Sweden)

    Min Chen

    2014-05-01

    Full Text Available In reality, the cumulative effect of the many uncertainties in engine component performance may stack up to affect the engine overall performance. This paper aims to quantify the impact of uncertainty in engine component performance on the overall performance of a turboshaft engine based on Monte-Carlo probabilistic design method. A novel probabilistic model of turboshaft engine, consisting of a Monte-Carlo simulation generator, a traditional nonlinear turboshaft engine model, and a probability statistical model, was implemented to predict this impact. One of the fundamental results shown herein is that uncertainty in component performance has a significant impact on the engine overall performance prediction. This paper also shows that, taking into consideration the uncertainties in component performance, the turbine entry temperature and overall pressure ratio based on the probabilistic design method should increase by 0.76% and 8.33%, respectively, compared with the ones of deterministic design method. The comparison shows that the probabilistic approach provides a more credible and reliable way to assign the design space for a target engine overall performance.

  20. Performance-approach and performance-avoidance classroom goals and the adoption of personal achievement goals.

    Science.gov (United States)

    Schwinger, Malte; Stiensmeier-Pelster, Joachim

    2011-12-01

    Students' perceptions of classroom goals influence their adoption of personal goals. To assess different forms of classroom goals, recent studies have favoured an overall measure of performance classroom goals, compared to a two-dimensional assessment of performance-approach and performance-avoidance classroom goals (PAVCG). This paper considered the relationship between students' perceptions of classroom goals and their endorsement of personal achievement goals. We proposed that three (instead of only two) classroom goals need to be distinguished. We aimed to provide evidence for this hypothesis by confirmatory factor analysis (CFA) and also by divergent associations between the respective classroom goal and students' personal goal endorsement. A total of 871 (474 female) 10th grade students from several German high schools participated in this study. Students responded to items assessing their perception of mastery, performance-approach, and performance-avoidance goals in the classroom. Additionally, the students reported how much they personally pursue mastery, performance-approach, and performance-avoidance goals. All items referred to German as a specific school subject. RESULTS.A CFA yielded empirical support for the proposed distinction of three (instead of only two) different kinds of classroom goals. Moreover, in hierarchical linear modelling (HLM) analyses all three classroom goals showed unique associations with students' personal goal adoption. The findings emphasized the need to distinguish performance-approach and PAVCG. Furthermore, our results suggest that multiple classroom goals have interactive effects on students' personal achievement strivings. ©2010 The British Psychological Society.

  1. Probabilistic Analysis of Gas Turbine Field Performance

    Science.gov (United States)

    Gorla, Rama S. R.; Pai, Shantaram S.; Rusick, Jeffrey J.

    2002-01-01

    A gas turbine thermodynamic cycle was computationally simulated and probabilistically evaluated in view of the several uncertainties in the performance parameters, which are indices of gas turbine health. Cumulative distribution functions and sensitivity factors were computed for the overall thermal efficiency and net specific power output due to the thermodynamic random variables. These results can be used to quickly identify the most critical design variables in order to optimize the design, enhance performance, increase system availability and make it cost effective. The analysis leads to the selection of the appropriate measurements to be used in the gas turbine health determination and to the identification of both the most critical measurements and parameters. Probabilistic analysis aims at unifying and improving the control and health monitoring of gas turbine aero-engines by increasing the quality and quantity of information available about the engine's health and performance.

  2. Performance analysis of chi models using discrete-time probabilistic reward graphs

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Markovski, J.; Andova, S.; Vink, de E.P.

    2008-01-01

    We propose the model of discrete-time probabilistic reward graphs (DTPRGs) for performance analysis of systems exhibiting discrete deterministic time delays and probabilistic behavior, via their interpretation as discrete-time Markov reward chains, full-fledged platform for qualitative and

  3. Guidance for the definition and application of probabilistic safety criteria

    International Nuclear Information System (INIS)

    Holmberg, J.-E.; Knochenhauer, M.

    2011-05-01

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  4. Guidance for the definition and application of probabilistic safety criteria

    Energy Technology Data Exchange (ETDEWEB)

    Holmberg, J.-E. (VTT Technical Research Centre of Finland (Finland)); Knochenhauer, M. (Scandpower AB (Sweden))

    2011-05-15

    The project 'The Validity of Safety Goals' has been financed jointly by NKS (Nordic Nuclear Safety Research), SSM (Swedish Radiation Safety Authority) and the Swedish and Finnish nuclear utilities. The national financing went through NPSAG, the Nordic PSA Group (Swedish contributions) and SAFIR2010, the Finnish research programme on NPP safety (Finnish contributions). The project has been performed in four phases during 2006-2010. This guidance document aims at describing, on the basis of the work performed throughout the project, issues to consider when defining, applying and interpreting probabilistic safety criteria. Thus, the basic aim of the document is to serve as a checklist and toolbox for the definition and application of probabilistic safety criteria. The document describes the terminology and concepts involved, the levels of criteria and relations between these, how to define a probabilistic safety criterion, how to apply a probabilistic safety criterion, on what to apply the probabilistic safety criterion, and how to interpret the result of the application. The document specifically deals with what makes up a probabilistic safety criterion, i.e., the risk metric, the frequency criterion, the PSA used for assessing compliance and the application procedure for the criterion. It also discusses the concept of subsidiary criteria, i.e., different levels of safety goals. The results from the project can be used as a platform for discussions at the utilities on how to define and use quantitative safety goals. The results can also be used by safety authorities as a reference for risk-informed regulation. The outcome can have an impact on the requirements on PSA, e.g., regarding quality, scope, level of detail, and documentation. Finally, the results can be expected to support on-going activities concerning risk-informed applications. (Author)

  5. Predicting Subsequent Task Performance From Goal Motivation and Goal Failure

    Directory of Open Access Journals (Sweden)

    Laura Catherine Healy

    2015-07-01

    Full Text Available Recent research has demonstrated that the cognitive processes associated with goal pursuit can continue to interfere with unrelated tasks when a goal is unfulfilled. Drawing from the self-regulation and goal-striving literatures, the present study explored the impact of goal failure on subsequent cognitive and physical task performance. Furthermore, we examined if the autonomous or controlled motivation underpinning goal striving moderates the responses to goal failure. Athletes (75 male, 59 female, Mage = 19.90 years, SDage = 3.50 completed a cycling trial with the goal of covering a given distance in 8 minutes. Prior to the trial, their motivation was primed using a video. During the trial they were provided with manipulated performance feedback, thus creating conditions of goal success or failure. No differences emerged in the responses to goal failure between the primed motivation or performance feedback conditions. We make recommendations for future research into how individuals can deal with failure in goal striving.

  6. Can motto-goals outperform learning and performance goals? Influence of goal setting on performance and affect in a complex problem solving task

    Directory of Open Access Journals (Sweden)

    Miriam S. Rohe

    2016-09-01

    Full Text Available In this paper, we bring together research on complex problem solving with that on motivational psychology about goal setting. Complex problems require motivational effort because of their inherent difficulties. Goal Setting Theory has shown with simple tasks that high, specific performance goals lead to better performance outcome than do-your-best goals. However, in complex tasks, learning goals have proven more effective than performance goals. Based on the Zurich Resource Model (Storch & Krause, 2014, so-called motto-goals (e.g., "I breathe happiness" should activate a person’s resources through positive affect. It was found that motto-goals are effective with unpleasant duties. Therefore, we tested the hypothesis that motto-goals outperform learning and performance goals in the case of complex problems. A total of N = 123 subjects participated in the experiment. In dependence of their goal condition, subjects developed a personal motto, learning, or performance goal. This goal was adapted for the computer-simulated complex scenario Tailorshop, where subjects worked as managers in a small fictional company. Other than expected, there was no main effect of goal condition for the management performance. As hypothesized, motto goals led to higher positive and lower negative affect than the other two goal types. Even though positive affect decreased and negative affect increased in all three groups during Tailorshop completion, participants with motto goals reported the lowest rates of negative affect over time. Exploratory analyses investigated the role of affect in complex problem solving via mediational analyses and the influence of goal type on perceived goal attainment.

  7. A Bayesian Developmental Approach to Robotic Goal-Based Imitation Learning.

    Directory of Open Access Journals (Sweden)

    Michael Jae-Yoon Chung

    Full Text Available A fundamental challenge in robotics today is building robots that can learn new skills by observing humans and imitating human actions. We propose a new Bayesian approach to robotic learning by imitation inspired by the developmental hypothesis that children use self-experience to bootstrap the process of intention recognition and goal-based imitation. Our approach allows an autonomous agent to: (i learn probabilistic models of actions through self-discovery and experience, (ii utilize these learned models for inferring the goals of human actions, and (iii perform goal-based imitation for robotic learning and human-robot collaboration. Such an approach allows a robot to leverage its increasing repertoire of learned behaviors to interpret increasingly complex human actions and use the inferred goals for imitation, even when the robot has very different actuators from humans. We demonstrate our approach using two different scenarios: (i a simulated robot that learns human-like gaze following behavior, and (ii a robot that learns to imitate human actions in a tabletop organization task. In both cases, the agent learns a probabilistic model of its own actions, and uses this model for goal inference and goal-based imitation. We also show that the robotic agent can use its probabilistic model to seek human assistance when it recognizes that its inferred actions are too uncertain, risky, or impossible to perform, thereby opening the door to human-robot collaboration.

  8. Classification of Company Performance using Weighted Probabilistic Neural Network

    Science.gov (United States)

    Yasin, Hasbi; Waridi Basyiruddin Arifin, Adi; Warsito, Budi

    2018-05-01

    Classification of company performance can be judged by looking at its financial status, whether good or bad state. Classification of company performance can be achieved by some approach, either parametric or non-parametric. Neural Network is one of non-parametric methods. One of Artificial Neural Network (ANN) models is Probabilistic Neural Network (PNN). PNN consists of four layers, i.e. input layer, pattern layer, addition layer, and output layer. The distance function used is the euclidean distance and each class share the same values as their weights. In this study used PNN that has been modified on the weighting process between the pattern layer and the addition layer by involving the calculation of the mahalanobis distance. This model is called the Weighted Probabilistic Neural Network (WPNN). The results show that the company's performance modeling with the WPNN model has a very high accuracy that reaches 100%.

  9. User manual for the probabilistic fuel performance code FRP

    International Nuclear Information System (INIS)

    Friis Jensen, J.; Misfeldt, I.

    1980-10-01

    This report describes the use of the probabilistic fuel performance code FRP. Detailed description of both input to and output from the program are given. The use of the program is illustrated by an example. (author)

  10. Use and Communication of Probabilistic Forecasts.

    Science.gov (United States)

    Raftery, Adrian E

    2016-12-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don't need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectatations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications.

  11. Use and Communication of Probabilistic Forecasts

    Science.gov (United States)

    Raftery, Adrian E.

    2015-01-01

    Probabilistic forecasts are becoming more and more available. How should they be used and communicated? What are the obstacles to their use in practice? I review experience with five problems where probabilistic forecasting played an important role. This leads me to identify five types of potential users: Low Stakes Users, who don’t need probabilistic forecasts; General Assessors, who need an overall idea of the uncertainty in the forecast; Change Assessors, who need to know if a change is out of line with expectatations; Risk Avoiders, who wish to limit the risk of an adverse outcome; and Decision Theorists, who quantify their loss function and perform the decision-theoretic calculations. This suggests that it is important to interact with users and to consider their goals. The cognitive research tells us that calibration is important for trust in probability forecasts, and that it is important to match the verbal expression with the task. The cognitive load should be minimized, reducing the probabilistic forecast to a single percentile if appropriate. Probabilities of adverse events and percentiles of the predictive distribution of quantities of interest seem often to be the best way to summarize probabilistic forecasts. Formal decision theory has an important role, but in a limited range of applications. PMID:28446941

  12. Probabilistic Radiological Performance Assessment Modeling and Uncertainty

    Science.gov (United States)

    Tauxe, J.

    2004-12-01

    A generic probabilistic radiological Performance Assessment (PA) model is presented. The model, built using the GoldSim systems simulation software platform, concerns contaminant transport and dose estimation in support of decision making with uncertainty. Both the U.S. Nuclear Regulatory Commission (NRC) and the U.S. Department of Energy (DOE) require assessments of potential future risk to human receptors of disposal of LLW. Commercially operated LLW disposal facilities are licensed by the NRC (or agreement states), and the DOE operates such facilities for disposal of DOE-generated LLW. The type of PA model presented is probabilistic in nature, and hence reflects the current state of knowledge about the site by using probability distributions to capture what is expected (central tendency or average) and the uncertainty (e.g., standard deviation) associated with input parameters, and propagating through the model to arrive at output distributions that reflect expected performance and the overall uncertainty in the system. Estimates of contaminant release rates, concentrations in environmental media, and resulting doses to human receptors well into the future are made by running the model in Monte Carlo fashion, with each realization representing a possible combination of input parameter values. Statistical summaries of the results can be compared to regulatory performance objectives, and decision makers are better informed of the inherently uncertain aspects of the model which supports their decision-making. While this information may make some regulators uncomfortable, they must realize that uncertainties which were hidden in a deterministic analysis are revealed in a probabilistic analysis, and the chance of making a correct decision is now known rather than hoped for. The model includes many typical features and processes that would be part of a PA, but is entirely fictitious. This does not represent any particular site and is meant to be a generic example. A

  13. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    Science.gov (United States)

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules.Behavioural tasks often require probability distributions to be inferred about task specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task specific operations.

  14. The probabilistic approach and the deterministic licensing procedure

    International Nuclear Information System (INIS)

    Fabian, H.; Feigel, A.; Gremm, O.

    1984-01-01

    If safety goals are given, the creativity of the engineers is necessary to transform the goals into actual safety measures. That is, safety goals are not sufficient for the derivation of a safety concept; the licensing process asks ''What does a safe plant look like.'' The answer connot be given by a probabilistic procedure, but need definite deterministic statements; the conclusion is, that the licensing process needs a deterministic approach. The probabilistic approach should be used in a complementary role in cases where deterministic criteria are not complete, not detailed enough or not consistent and additional arguments for decision making in connection with the adequacy of a specific measure are necessary. But also in these cases the probabilistic answer has to be transformed into a clear deterministic statement. (orig.)

  15. Probabilistic assessments of fuel performance

    International Nuclear Information System (INIS)

    Kelppe, S.; Ranta-Puska, K.

    1998-01-01

    The probabilistic Monte Carlo method, coupled with quasi-random sampling, is applied for the fuel performance analyses. By using known distributions of fabrication parameters and real power histories with their randomly selected combinations, and by making a large number of ENIGMA code calculations, one expects to find out the state of the whole reactor fuel. Good statistics require thousands of runs. A sample case representing VVER-440 reactor fuel indicates relatively low fuel temperatures and mainly athermal fission gas release, if any. The rod internal pressure remains typically below 2.5 MPa, which leaves a large margin to the system pressure of 12 MPa. Gap conductance, an essential parameter in the accident evaluations, shows no decrease from its start-of-life value. (orig.)
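
    A sketch of the sampling step described above, using a scrambled Sobol quasi-random sequence to generate input sets for repeated fuel-performance code runs; the parameter names and distributions are invented for illustration and are not ENIGMA or VVER-440 data.

    ```python
    from scipy.stats import qmc, norm

    # Quasi-random (Sobol) sample of three hypothetical fabrication parameters
    sampler = qmc.Sobol(d=3, scramble=True, seed=1)
    u = sampler.random_base2(m=10)            # 2**10 = 1024 points in [0, 1)^3

    pellet_diam  = norm.ppf(u[:, 0], loc=7.57, scale=0.01)   # mm
    clad_thick   = norm.ppf(u[:, 1], loc=0.69, scale=0.02)   # mm
    fuel_density = norm.ppf(u[:, 2], loc=10.4, scale=0.05)   # g/cm^3

    # Each row is one input set for a separate fuel-performance code run
    for x in zip(pellet_diam[:3], clad_thick[:3], fuel_density[:3]):
        print("run with pellet diameter %.3f mm, clad %.3f mm, density %.2f g/cm3" % x)
    ```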

  16. Pengaruh Goal Setting terhadap Performance : Tinjauan Teoritis

    OpenAIRE

    Ginting, Surya Dharma; Ariani, D. Wahyu

    2004-01-01

    This article is a conceptual view of goal setting theory and the effects of goal setting on individual performance. Goal setting is recognized as a major theory of work motivation. Difficult goals have consistently been shown to lead to higher levels of performance than easy goals. If there is no commitment, a goal can have no motivational effect. Goals are central to current treatments of work motivation, and goal commitment is a necessary condition for difficult goals to result in higher...

  17. Predicting race performance in triathlon: the role of perfectionism, achievement goals, and personal goal setting.

    Science.gov (United States)

    Stoeber, Joachim; Uphill, Mark A; Hotham, Sarah

    2009-04-01

    The question of how perfectionism affects performance is highly debated. Because empirical studies examining perfectionism and competitive sport performance are missing, the present research investigated how perfectionism affected race performance and what role athletes' goals played in this relationship in two prospective studies with competitive triathletes (Study 1: N = 112; Study 2: N = 321). Regression analyses showed that perfectionistic personal standards, high performance-approach goals, low performance-avoidance goals, and high personal goals predicted race performance beyond athletes' performance level. Moreover, the contrast between performance-avoidance and performance-approach goals mediated the relationship between perfectionistic personal standards and performance, whereas personal goal setting mediated the relationship between performance-approach goals and performance. The findings indicate that perfectionistic personal standards do not undermine competitive performance, but are associated with goals that help athletes achieve their best possible performance.

  18. Achievement goals, self-handicapping, and performance: a 2 x 2 achievement goal perspective.

    Science.gov (United States)

    Ntoumanis, Nikos; Thøgersen-Ntoumani, Cecilie; Smith, Alison L

    2009-11-01

    Elliot and colleagues (2006) examined the effects of experimentally induced achievement goals, proposed by the trichotomous model, on self-handicapping and performance in physical education. Our study replicated and extended the work of Elliot et al. by experimentally promoting all four goals proposed by the 2 x 2 model (Elliot & McGregor, 2001), measuring the participants' own situational achievement goals, using a relatively novel task, and testing the participants in a group setting. We used a randomized experimental design with four conditions that aimed to induce one of the four goals advanced by the 2 x 2 model. The participants (n = 138) were undergraduates who engaged in a dart-throwing task. The results pertaining to self-handicapping partly replicated Elliot and colleagues' findings by showing that experimentally promoted performance-avoidance goals resulted in less practice. In contrast, the promotion of mastery-avoidance goals did not result in less practice compared with either of the approach goals. Dart-throwing performance did not differ among the four goal conditions. Personal achievement goals did not moderate the effects of experimentally induced goals on self-handicapping and performance. The extent to which mastery-avoidance goals are maladaptive is discussed, as well as the interplay between personal and experimentally induced goals.

  19. Probabilistic risk assessment in nuclear power plant regulation

    Energy Technology Data Exchange (ETDEWEB)

    Wall, J B

    1980-09-01

    A specific program is recommended to utilize probabilistic risk assessment more effectively in nuclear power plant regulation. It is based upon the engineering insights from the Reactor Safety Study (WASH-1400) and some follow-on risk assessment research by the USNRC. The Three Mile Island accident is briefly discussed from a risk viewpoint to illustrate a weakness in current practice. The development of a probabilistic safety goal is recommended with some suggestions on underlying principles. Some ongoing work on risk perception and the draft probabilistic safety goal being reviewed in Canada are described. Some suggestions are offered on further risk assessment research. Finally, some recent U.S. Nuclear Regulatory Commission actions are described.

  20. Performing Probabilistic Risk Assessment Through RAVEN

    Energy Technology Data Exchange (ETDEWEB)

    A. Alfonsi; C. Rabiti; D. Mandelli; J. Cogliati; R. Kinoshita

    2013-06-01

    The Reactor Analysis and Virtual control ENvironment (RAVEN) code is a software tool that acts as the control logic driver and post-processing engine for the newly developed thermal-hydraulic code RELAP-7. RAVEN is now a multi-purpose Probabilistic Risk Assessment (PRA) software framework that allows dispatching different functionalities: derive and actuate the control logic required to simulate the plant control system and operator actions (guided procedures), allowing on-line monitoring/controlling in the phase space; perform both Monte Carlo sampling of randomly distributed events and Dynamic Event Tree based analysis; and facilitate input/output handling through a Graphical User Interface (GUI) and a post-processing data mining module.

  1. Differentiating Performance Approach Goals and Their Unique Effects

    Science.gov (United States)

    Edwards, Ordene V.

    2014-01-01

    The study differentiates between two types of performance approach goals (competence demonstration performance approach goal and normative performance approach goal) by examining their unique effects on self-efficacy, interest, and fear of failure. Seventy-nine students completed questionnaires that measure performance approach goals,…

  2. Latent Profile Analysis of Schizotypy and Paranormal Belief: Associations with Probabilistic Reasoning Performance

    OpenAIRE

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew

    2018-01-01

    This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness...

  3. Application of the performance-goal based approach for establishing the SSE site specific response spectrum for new nuclear power plants in South Africa

    Energy Technology Data Exchange (ETDEWEB)

    Nhleko, Sifiso, E-mail: snhleko@nnr.co.za [National Nuclear Regulator of South Africa (South Africa)

    2013-02-15

    Highlights: ► Criteria for import of performance goals defined in ASCE 43-05 are established. ► Derivation of performance goals from radiological safety criteria is demonstrated. ► Evaluation of mean exceedance frequencies from performance goals is illustrated. ► Simple formulae for the definition of a capable fault are presented. -- Abstract: Nuclear installation license holders in South Africa have become increasingly interested in the performance-goal based approach defined in the American Society of Civil Engineering Standard ASCE/SEI 43-05 for establishing the safe shutdown earthquake (SSE) site specific design response spectrum (SSRS) for new nuclear power plants. This approach has been adopted by the U.S. Nuclear Regulatory Commission (NRC) and has now been followed at more than 20 sites in that country. Quantitative performance goals are required when establishing seismic design basis parameters using the performance-goal based approach. However, the quantitative performance goals recommended in ASCE/SEI 43-05 were established based on country-specific operating experience and seismic probabilistic risk assessment (SPRA) applications conducted for existing plants designed and operated to meet specific safety criteria, set by a specific regulatory body. Whilst ASCE/SEI 43-05 provides enough flexibility for the selection of other user-specified quantitative performance goals, there is no guidance on how quantitative performance goals should be established in the absence of extensive operational experience accompanied by data derived from rigorous SPRA applications. This paper presents two practical approaches that can be used to provide a technical basis and to demonstrate the derivation of quantitative values of target performance goals when no data related to past and present operational experience exist to justify technical specifications.
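
    As a rough illustration of the kind of derivation the paper describes, an allowable mean annual exceedance frequency can be back-calculated from a radiological risk criterion through a chain of conditional probabilities. All numbers below are invented for illustration only and are not taken from ASCE/SEI 43-05 or the South African regulatory framework.

    ```python
    # Illustrative back-calculation of an allowable mean annual exceedance frequency
    # from a radiological safety criterion (all numbers are hypothetical).
    risk_criterion      = 1.0e-6   # allowable annual frequency of exceeding a dose limit
    p_dose_given_damage = 0.1      # conditional probability of exceeding the dose limit given damage
    p_damage_given_sse  = 0.01     # conditional probability of damage given the design ground motion is exceeded

    target_performance_goal = risk_criterion / p_dose_given_damage          # allowable damage frequency
    mean_exceedance_freq    = target_performance_goal / p_damage_given_sse  # allowable SSE exceedance frequency

    print(f"target performance goal  : {target_performance_goal:.1e} /yr")
    print(f"mean exceedance frequency: {mean_exceedance_freq:.1e} /yr")
    ```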

  4. Human performance analysis in the frame of probabilistic safety assessment of research reactors

    International Nuclear Information System (INIS)

    Farcasiu, Mita; Nitoi, Mirela; Apostol, Minodora; Turcu, I.; Florescu, Gh.

    2005-01-01

    Full text: The analysis of operating experience has identified the importance of human performance in the reliability and safety of research reactors. In Probabilistic Safety Assessment (PSA) of nuclear facilities, human performance analysis (HPA) is used in order to estimate the human error contribution to the failure of system components or functions. HPA is a qualitative and quantitative analysis of human actions identified for error-likely or accident-prone situations. Qualitative analysis is used to identify all man-machine interfaces that can lead to an accident, the types of human interactions which may mitigate or exacerbate the accident, the types of human errors and the performance shaping factors. Quantitative analysis is used to develop estimates of human error probability as effects of human performance on reliability and safety. The goal of this paper is to accomplish an HPA in the PSA frame for research reactors. Human error probabilities estimated as results of the analysis of human actions could be included in system event trees and/or system fault trees. The sensitivity analyses performed determine the sensitivity of human performance to systematic variations in both the level of dependence between human actions and the operator stress level. The necessary information was obtained from the operating experience of the TRIGA research reactor at INR Pitesti. The required data were obtained from generic databases. (authors)
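
    The sensitivity of human error probabilities to the assumed dependence between successive actions can be illustrated with a small sketch. The dependence equations below follow the THERP conventions of NUREG/CR-1278; the nominal HEP and the stress multiplier are invented for illustration and are not values from the TRIGA analysis.

    ```python
    # Conditional human error probabilities (HEPs) under THERP-style dependence levels.
    def conditional_hep(hep: float, level: str) -> float:
        return {
            "zero":     hep,
            "low":      (1 + 19 * hep) / 20,
            "moderate": (1 + 6 * hep) / 7,
            "high":     (1 + hep) / 2,
            "complete": 1.0,
        }[level]

    basic_hep = 0.003          # illustrative nominal HEP
    stress_multiplier = 2.0    # illustrative performance shaping factor for elevated stress

    for level in ("zero", "low", "moderate", "high", "complete"):
        chep = conditional_hep(min(basic_hep * stress_multiplier, 1.0), level)
        print(f"{level:9s} dependence -> conditional HEP = {chep:.4f}")
    ```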

  5. The effect of subconscious performance goals on academic performance

    NARCIS (Netherlands)

    Bipp, T.; Kleingeld, P.A.M.; van Mierlo, H.; Kunde, W.

    2017-01-01

    We investigated the impact of subconscious goals on academic performance in two field experiments. We show that unobtrusive priming of goals with regard to achievement motivation by means of a photograph improves performance in different educational contexts. High-school students who were exposed to

  6. Latent Profile Analysis of Schizotypy and Paranormal Belief: Associations with Probabilistic Reasoning Performance

    Directory of Open Access Journals (Sweden)

    Andrew Denovan

    2018-01-01

    Full Text Available This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4.

  7. Latent Profile Analysis of Schizotypy and Paranormal Belief: Associations with Probabilistic Reasoning Performance.

    Science.gov (United States)

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew

    2018-01-01

    This study assessed the extent to which within-individual variation in schizotypy and paranormal belief influenced performance on probabilistic reasoning tasks. A convenience sample of 725 non-clinical adults completed measures assessing schizotypy (Oxford-Liverpool Inventory of Feelings and Experiences; O-Life brief), belief in the paranormal (Revised Paranormal Belief Scale; RPBS) and probabilistic reasoning (perception of randomness, conjunction fallacy, paranormal perception of randomness, and paranormal conjunction fallacy). Latent profile analysis (LPA) identified four distinct groups: class 1, low schizotypy and low paranormal belief (43.9% of sample); class 2, moderate schizotypy and moderate paranormal belief (18.2%); class 3, moderate schizotypy (high cognitive disorganization) and low paranormal belief (29%); and class 4, moderate schizotypy and high paranormal belief (8.9%). Identification of homogeneous classes provided a nuanced understanding of the relative contribution of schizotypy and paranormal belief to differences in probabilistic reasoning performance. Multivariate analysis of covariance revealed that groups with lower levels of paranormal belief (classes 1 and 3) performed significantly better on perception of randomness, but not conjunction problems. Schizotypy had only a negligible effect on performance. Further analysis indicated that framing perception of randomness and conjunction problems in a paranormal context facilitated performance for all groups but class 4.

  8. Probabilistic structural analysis of aerospace components using NESSUS

    Science.gov (United States)

    Shiao, Michael C.; Nagpal, Vinod K.; Chamis, Christos C.

    1988-01-01

    Probabilistic structural analysis of a Space Shuttle main engine turbopump blade is conducted using the computer code NESSUS (numerical evaluation of stochastic structures under stress). The goal of the analysis is to derive probabilistic characteristics of blade response given probabilistic descriptions of uncertainties in blade geometry, material properties, and temperature and pressure distributions. Probability densities are derived for critical blade responses. Risk assessment and failure life analysis is conducted assuming different failure models.

  9. The Effect of Subconscious Performance Goals on Academic Performance

    Science.gov (United States)

    Bipp, Tanja; Kleingeld, Ad; van Mierlo, Heleen; Kunde, Wilfried

    2017-01-01

    We investigated the impact of subconscious goals on academic performance in two field experiments. We show that unobtrusive priming of goals with regard to achievement motivation by means of a photograph improves performance in different educational contexts. High-school students who were exposed to an achievement-related photograph achieved…

  10. An advanced probabilistic structural analysis method for implicit performance functions

    Science.gov (United States)

    Wu, Y.-T.; Millwater, H. R.; Cruse, T. A.

    1989-01-01

    In probabilistic structural analysis, the performance or response functions usually are implicitly defined and must be solved by numerical analysis methods such as finite element methods. In such cases, the most commonly used probabilistic analysis tool is the mean-based, second-moment method which provides only the first two statistical moments. This paper presents a generalized advanced mean value (AMV) method which is capable of establishing the distributions to provide additional information for reliability design. The method requires slightly more computations than the second-moment method but is highly efficient relative to the other alternative methods. In particular, the examples show that the AMV method can be used to solve problems involving non-monotonic functions that result in truncated distributions.
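
    A minimal sketch of the mean-value and AMV quantile estimates for a single hypothetical response function of two normal random variables; the function g, the means and the standard deviations are invented, and a real application would wrap an implicit finite element response instead of the closed-form expression.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Hypothetical performance function (stands in for an implicit finite element response)
    def g(x):
        return x[0] ** 2 + 2.0 * x[1] - 0.5 * x[0] * x[1]

    mu    = np.array([3.0, 2.0])    # means of the random inputs (illustrative)
    sigma = np.array([0.3, 0.4])    # standard deviations (illustrative)

    # Gradient at the mean via finite differences (the function is treated as implicit)
    eps   = 1e-6
    grad  = np.array([(g(mu + eps * np.eye(2)[i]) - g(mu)) / eps for i in range(2)])
    a     = sigma * grad            # sensitivities in standard-normal (u) space
    alpha = a / np.linalg.norm(a)

    for p in (0.01, 0.50, 0.99):
        z = norm.ppf(p)
        # Mean-value (MV) first-order estimate of the p-quantile of g
        mv = g(mu) + np.linalg.norm(a) * z
        # Advanced mean value (AMV): re-evaluate g at the most probable point
        # of the linearized function for this probability level
        amv = g(mu + sigma * alpha * z)
        print(f"p = {p:4.2f}   MV quantile = {mv:7.3f}   AMV quantile = {amv:7.3f}")
    ```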

  11. Probabilistic methods used in NUSS

    International Nuclear Information System (INIS)

    Fischer, J.; Giuliani, P.

    1985-01-01

    Probabilistic considerations are used implicitly or explicitly in all technical areas. In the NUSS codes and guides, the two areas of design and siting are those where more use is made of these concepts. A brief review of the relevant documents in these two areas is made in this paper. It covers the documents where either probabilistic considerations are implied or where probabilistic approaches are recommended in the evaluation of situations and of events. In the siting guides the review mainly covers the areas of seismic, hydrological and external man-made events analysis, as well as some aspects of meteorological extreme events analysis. Probabilistic methods are recommended in the design guides but they are not made a requirement. There are several reasons for this, mainly the lack of reliable data and the absence of quantitative safety limits or goals against which to judge the design analysis. As far as practical, engineering judgement should be backed up by quantitative probabilistic analysis. Examples are given and the concept of design basis as used in NUSS design guides is explained. (author)

  12. The LaSalle probabilistic safety analysis

    International Nuclear Information System (INIS)

    Frederick, L.G.; Massin, H.L.; Crane, G.R.

    1987-01-01

    A probabilistic safety analysis has been performed for LaSalle County Station, a twin-unit General Electric BWR5 Mark II nuclear power plant. A primary objective of this PSA is to provide engineers with a useful and useable tool for making design decisions, performing technical specification optimization, evaluating proposed regulatory changes to equipment and procedures, and as an aid in operator training. Other objectives are to identify the hypothetical accident sequences that would contribute to core damage frequency, and to provide assurance that the total expected frequency of core-damaging accidents is below 10⁻⁴ per reactor-year in response to suggested goals. (orig./HSCH)

  13. A Control Variate Method for Probabilistic Performance Assessment. Improved Estimates for Mean Performance Quantities of Interest

    Energy Technology Data Exchange (ETDEWEB)

    MacKinnon, Robert J.; Kuhlman, Kristopher L

    2016-05-01

    We present a method of control variates for calculating improved estimates for mean performance quantities of interest, E(PQI), computed from Monte Carlo probabilistic simulations. An example of a PQI is the concentration of a contaminant at a particular location in a problem domain computed from simulations of transport in porous media. To simplify the presentation, the method is described in the setting of a one-dimensional elliptical model problem involving a single uncertain parameter represented by a probability distribution. The approach can be easily implemented for more complex problems involving multiple uncertain parameters and in particular for application to probabilistic performance assessment of deep geologic nuclear waste repository systems. Numerical results indicate the method can produce estimates of E(PQI) having superior accuracy on coarser meshes and reduce the required number of simulations needed to achieve an acceptable estimate.
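
    The correction itself is compact. Below is a generic control-variate sketch in which a cheap, correlated quantity with an accurately known mean reduces the variance of a Monte Carlo estimate of a mean PQI; the models and distributions are placeholders rather than the elliptical problem from the report.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 2_000

    # Hypothetical uncertain parameter; "fine" plays the role of the expensive-model PQI,
    # "coarse" is the control variate whose mean is known (or cheaply estimated) to high accuracy.
    theta  = rng.lognormal(mean=0.0, sigma=0.4, size=n)
    fine   = np.exp(-theta) + 0.05 * rng.normal(size=n)
    coarse = np.exp(-theta)
    mu_coarse = np.mean(np.exp(-rng.lognormal(0.0, 0.4, 1_000_000)))  # accurately known mean of the control variate

    # Optimal control-variate coefficient and the corrected estimator
    c = -np.cov(fine, coarse)[0, 1] / np.var(coarse)
    plain_estimate = fine.mean()
    cv_estimate    = fine.mean() + c * (coarse.mean() - mu_coarse)

    print(f"plain Monte Carlo estimate : {plain_estimate:.5f}")
    print(f"control-variate estimate   : {cv_estimate:.5f}")
    ```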

  14. An analysis of budgetary goals impacting organizational performance

    Directory of Open Access Journals (Sweden)

    Cheok MUI YEE

    2016-05-01

    Full Text Available This study proposes a conceptual review of how budgetary goals impact organizational performance. The aim of this study is to get a better understanding of the direct and indirect relationship between the organization's decision-making process and operational performances. Setting the budget particularly influences subordinates' budget goal levels and motivations (i.e., budget goal acceptance and budget goal commitment), which ultimately enhances the organization's performance. To test these relationships, data were collected using the three perspectives approach: budgetary goal, budgetary participation and budgetary evaluation. The study provided evidence that perception of fairness mediates the relation between the levels of budget participation and goal commitment, whereas goal commitment mediates the relation between fairness perceptions and performance. At the end of the article, there are some implications for SME industries and some suggestions for future studies.

  15. Integrated Campaign Probabilistic Cost, Schedule, Performance, and Value for Program Office Support

    Science.gov (United States)

    Cornelius, David; Sasamoto, Washito; Daugherty, Kevin; Deacon, Shaun

    2012-01-01

    This paper describes an integrated assessment tool developed at NASA Langley Research Center that incorporates probabilistic analysis of life cycle cost, schedule, launch performance, on-orbit performance, and value across a series of planned space-based missions, or campaign. Originally designed as an aid in planning the execution of missions to accomplish the National Research Council 2007 Earth Science Decadal Survey, it utilizes Monte Carlo simulation of a series of space missions for assessment of resource requirements and expected return on investment. Interactions between simulated missions are incorporated, such as competition for launch site manifest, to capture unexpected and non-linear system behaviors. A novel value model is utilized to provide an assessment of the probabilistic return on investment. A demonstration case is discussed to illustrate the tool utility.

  16. Achievement goals and interpersonal behaviour: How mastery and performance goals shape information exchange

    NARCIS (Netherlands)

    Poortvliet, P.M.; Janssen, O.; Van Yperen, N.W.; Van de Vliert, E.

    2007-01-01

    The present research examines the impact of achievement goals on task-related information exchange. Studies 1 and 2 reveal that relative to those with mastery goals or no goal, individuals pursuing performance goals were less open in their information giving to exchange partners. Study 2 further

  17. Memristive Probabilistic Computing

    KAUST Repository

    Alahmadi, Hamzah

    2017-10-01

    In the era of Internet of Things and Big Data, unconventional techniques are rising to accommodate the large size of data and the resource constraints. New computing structures are advancing based on non-volatile memory technologies and different processing paradigms. Additionally, the intrinsic resiliency of current applications leads to the development of creative techniques in computations. In those applications, approximate computing provides a perfect fit to optimize the energy efficiency while compromising on the accuracy. In this work, we build probabilistic adders based on stochastic memristors. Probabilistic adders are analyzed with respect to the stochastic behavior of the underlying memristors. Multiple adder implementations are investigated and compared. The memristive probabilistic adder provides a different approach from the typical approximate CMOS adders. Furthermore, it allows for a high area saving and design flexibility between the performance and power saving. To reach a similar performance level as approximate CMOS adders, the memristive adder achieves 60% of power saving. An image-compression application is investigated using the memristive probabilistic adders with the performance and the energy trade-off.

  18. Probabilistic safety analysis : a new nuclear power plants licensing method

    International Nuclear Information System (INIS)

    Oliveira, L.F.S. de.

    1982-04-01

    After a brief retrospect of the application of Probabilistic Safety Analysis in the nuclear field, the basic differences between the deterministic licensing method, currently in use, and the probabilistic method are explained. Next, the two main proposals (by the AIF and the ACRS) concerning the establishment of the so-called quantitative safety goals (or simply 'safety goals') are separately presented and afterwards compared in their most fundamental aspects. Finally, some recent applications and future possibilities are discussed. (Author) [pt

  19. Duplicate Detection in Probabilistic Data

    NARCIS (Netherlands)

    Panse, Fabian; van Keulen, Maurice; de Keijzer, Ander; Ritter, Norbert

    2009-01-01

    Collected data often contains uncertainties. Probabilistic databases have been proposed to manage uncertain data. To combine data from multiple autonomous probabilistic databases, an integration of probabilistic data has to be performed. Until now, however, data integration approaches have focused

  20. Probabilistic Logical Characterization

    DEFF Research Database (Denmark)

    Hermanns, Holger; Parma, Augusto; Segala, Roberto

    2011-01-01

    Probabilistic automata exhibit both probabilistic and non-deterministic choice. They are therefore a powerful semantic foundation for modeling concurrent systems with random phenomena arising in many applications ranging from artificial intelligence, security, and systems biology to performance modeling. Several variations of bisimulation and simulation relations have proved to be useful as means to abstract and compare different automata. This paper develops a taxonomy of logical characterizations of these relations on image-finite and image-infinite probabilistic automata.

  1. Test anxiety, perfectionism, goal orientation, and academic performance.

    Science.gov (United States)

    Eum, KoUn; Rice, Kenneth G

    2011-03-01

    Dimensions of perfectionism and goal orientation have been reported to have differential relationships with test anxiety. However, the degree of inter-relationship between different dimensions of perfectionism, the 2 × 2 model of goal orientations proposed by Elliot and McGregor, cognitive test anxiety, and academic performance indicators is not known. Based on data from 134 university students, we conducted correlation and regression analyses to test associations between adaptive and maladaptive perfectionism, four types of goal orientations, cognitive test anxiety, and two indicators of academic performance: proximal cognitive performance on a word list recall test and distal academic performance in terms of grade point average. Cognitive test anxiety was inversely associated with both performance indicators, and positively associated with maladaptive perfectionism and avoidance goal orientations. Adaptive and maladaptive perfectionism accounted for significant variance in cognitive test anxiety after controlling for approach and avoidance goal orientations. Overall, nearly 50% of the variance in cognitive test anxiety could be attributed to gender, goal orientations, and perfectionism. Results suggested that students who are highly test anxious are likely to be women who endorse avoidance goal orientations and are maladaptively perfectionistic.

  2. Effectiveness of Securities with Fuzzy Probabilistic Return

    Directory of Open Access Journals (Sweden)

    Krzysztof Piasecki

    2011-01-01

    Full Text Available The generalized fuzzy present value of a security is defined here as fuzzy valued utility of cash flow. The generalized fuzzy present value cannot depend on the value of future cash flow. There exists such a generalized fuzzy present value which is not a fuzzy present value in the sense given by some authors. If the present value is a fuzzy number and the future value is a random one, then the return rate is given as a probabilistic fuzzy subset on a real line. This kind of return rate is called a fuzzy probabilistic return. The main goal of this paper is to derive the family of effective securities with fuzzy probabilistic return. Achieving this goal requires the study of the basic parameters characterizing fuzzy probabilistic return. Therefore, fuzzy expected value and variance are determined for this case of return. These results are a starting point for constructing a three-dimensional image. The set of effective securities is introduced as the Pareto optimal set determined by the maximization of the expected return rate and minimization of the variance. Finally, the set of effective securities is distinguished as a fuzzy set. These results are obtained without the assumption that the distribution of future values is Gaussian. (original abstract)

  3. Probabilistic safety assessment

    International Nuclear Information System (INIS)

    Hoertner, H.; Schuetz, B.

    1982-09-01

    For the purpose of assessing the applicability and informativeness of risk-analysis methods in licencing procedures under atomic law, the choice of instruments for probabilistic analysis, the problems in and experience gained in their application, and the discussion of safety goals with respect to such instruments are of paramount significance. Naturally, such a complex field can only be dealt with step by step, making contributions relative to specific problems. The report on hand shows the essentials of a 'stocktaking' of systems reliability studies in the licencing procedure under atomic law and of an American report (NUREG-0739) on 'Quantitative Safety Goals'. (orig.) [de

  4. Documentation design for probabilistic risk assessment

    International Nuclear Information System (INIS)

    Parkinson, W.J.; von Herrmann, J.L.

    1985-01-01

    This paper describes a framework for documentation design of probabilistic risk assessment (PRA) and is based on the EPRI document NP-3470 ''Documentation Design for Probabilistic Risk Assessment''. The goals for PRA documentation are stated. Four audiences are identified which PRA documentation must satisfy, and the documentation consistent with the needs of the various audiences are discussed, i.e., the Summary Report, the Executive Summary, the Main Report, and Appendices. The authors recommend the documentation specifications discussed herein as guides rather than rigid definitions

  5. Probabilistic performance assessment of complex energy process systems - The case of a self-sustained sanitation system.

    Science.gov (United States)

    Kolios, Athanasios; Jiang, Ying; Somorin, Tosin; Sowale, Ayodeji; Anastasopoulou, Aikaterini; Anthony, Edward J; Fidalgo, Beatriz; Parker, Alison; McAdam, Ewan; Williams, Leon; Collins, Matt; Tyrrel, Sean

    2018-05-01

    A probabilistic modelling approach was developed and applied to investigate the energy and environmental performance of an innovative sanitation system, the "Nano-membrane Toilet" (NMT). The system treats human excreta via an advanced energy and water recovery island with the aim of addressing current and future sanitation demands. Due to the complex design and inherent characteristics of the system's input material, there are a number of stochastic variables which may significantly affect the system's performance. The non-intrusive probabilistic approach adopted in this study combines a finite number of deterministic thermodynamic process simulations with an artificial neural network (ANN) approximation model and Monte Carlo simulations (MCS) to assess the effect of system uncertainties on the predicted performance of the NMT system. The joint probability distributions of the process performance indicators suggest a Stirling Engine (SE) power output in the range of 61.5-73 W with a high confidence interval (CI) of 95%. In addition, there is high probability (with 95% CI) that the NMT system can achieve positive net power output between 15.8 and 35 W. A sensitivity study reveals the system power performance is mostly affected by SE heater temperature. Investigation into the environmental performance of the NMT design, including water recovery and CO2/NOx emissions, suggests significant environmental benefits compared to conventional systems. Results of the probabilistic analysis can better inform future improvements on the system design and operational strategy and this probabilistic assessment framework can also be applied to similar complex engineering systems.
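
    A rough sketch of the non-intrusive workflow described above: a finite set of deterministic model runs trains an ANN surrogate, and Monte Carlo simulation is then performed on the cheap surrogate. The stand-in process model, parameter ranges and distributions are invented and do not describe the NMT system.

    ```python
    import numpy as np
    from sklearn.neural_network import MLPRegressor

    rng = np.random.default_rng(5)

    # Stand-in for the deterministic thermodynamic process model: net power as a
    # function of two uncertain inputs (heater temperature, feed moisture). Entirely illustrative.
    def process_model(temp_c, moisture_frac):
        return 0.12 * (temp_c - 400.0) - 60.0 * (moisture_frac - 0.7) + 20.0

    # 1) A finite number of deterministic simulations spanning the input space
    temps     = rng.uniform(400, 600, 200)
    moistures = rng.uniform(0.6, 0.8, 200)
    X_train   = np.column_stack([temps, moistures])
    y_train   = process_model(temps, moistures)

    # 2) ANN approximation (surrogate) of the process model
    surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
    surrogate.fit(X_train, y_train)

    # 3) Monte Carlo simulation on the surrogate
    X_mc  = np.column_stack([rng.normal(550, 25, 50_000), rng.normal(0.7, 0.03, 50_000)])
    power = surrogate.predict(X_mc)
    print(f"P(net power > 0 W) = {(power > 0).mean():.1%}, mean = {power.mean():.1f} W")
    ```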

  6. Goal Setting and Task Performance: 1969-1980

    Science.gov (United States)

    1980-06-01

    self-esteem showed greater performance improvement than individuals with low self-esteem. There was no self-esteem effect when instrumentality was high. When self-esteem was low, typists who perceived high goal instrumentality showed greater performance improvement than those with low goal instrumentality; when self-esteem was high, there was no instrumentality effect.

  7. The impact of goal setting and goal orientation on performance during a clerkship surgical skills training program.

    Science.gov (United States)

    Gardner, Aimee K; Diesen, Diana L; Hogg, Deborah; Huerta, Sergio

    2016-02-01

    The purpose of this study was to integrate relevant goal-setting theory and to identify if trainees' goal orientations have an impact on the assigned goals-performance relationship. Trainees attended 1 of the 3 goal-training activities (do your best, performance, or learning goals) for knot tying (KT) and camera navigation (CN) during the 3rd-year clerkship rotation. Questionnaires and pretests and/or post-tests were completed. One hundred twenty-seven 3rd-year medical students (age: 25 ± 2.6; 54% women) participated in the training program. Pretraining to post-training performance changes were significant for all groups on both tasks, with improvements varying by goals group (do your best: KTΔ = 2.14, CNΔ = 1.69; performance: KTΔ = 2.49, CNΔ = 2.24; learning: KTΔ = 3.04, CNΔ = 2.76). Correlations between goal orientations and improvement were examined, revealing a unique role of goal orientation for performance improvement. These data indicate that goal type and trainee goal orientation must be considered during curriculum development to maximize educational value. Copyright © 2016 Elsevier Inc. All rights reserved.

  8. Probabilistic logics and probabilistic networks

    CERN Document Server

    Haenni, Rolf; Wheeler, Gregory; Williamson, Jon; Andrews, Jill

    2014-01-01

    Probabilistic Logic and Probabilistic Networks presents a groundbreaking framework within which various approaches to probabilistic logic naturally fit. Additionally, the text shows how to develop computationally feasible methods to mesh with this framework.

  9. An empirical examination of negotiated goals and performance-to-goal following the introduction of an incentive bonus plan with participative goal setting

    NARCIS (Netherlands)

    Anderson, S.W.; Dekker, H.C.; Sedatole, K.L.

    2010-01-01

    Prior research documents performance improvements following the implementation of pay-for-performance (PFP) bonus plans. However, bonus plans typically pay for performance relative to a goal, and the manager whose performance is to be evaluated often participates in setting the goal. In these

  10. The analysis of probability task completion; Taxonomy of probabilistic thinking-based across gender in elementary school students

    Science.gov (United States)

    Sari, Dwi Ivayana; Budayasa, I. Ketut; Juniati, Dwi

    2017-08-01

    Formulation of mathematical learning goals now is not only oriented on cognitive products, but also leads to cognitive processes, one of which is probabilistic thinking. Probabilistic thinking is needed by students to make decisions. Elementary school students are required to develop probabilistic thinking as a foundation for learning probability at a higher level. A framework of students' probabilistic thinking had been developed using the SOLO taxonomy, which consists of prestructural, unistructural, multistructural and relational probabilistic thinking. This study aimed to analyze probability task completion based on this taxonomy of probabilistic thinking. The subjects were two fifth-grade students, a boy and a girl, selected on the basis of a test of mathematical ability and then on high mathematical ability. Subjects were given probability tasks consisting of sample space, probability of an event and probability comparison. The data analysis consisted of categorization, reduction, interpretation and conclusion. Credibility of the data was established through time triangulation. The results indicated that the boy's probabilistic thinking in completing the probability tasks was at the multistructural level, while the girl's was at the unistructural level; that is, the boy's level of probabilistic thinking was higher than the girl's. The results could contribute to curriculum developers in formulating probability learning goals for elementary school students. Indeed, teachers could teach probability with regard to gender differences.

  11. A probabilistic Hu-Washizu variational principle

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Besterfield, G. H.

    1987-01-01

    A Probabilistic Hu-Washizu Variational Principle (PHWVP) for the Probabilistic Finite Element Method (PFEM) is presented. This formulation is developed for both linear and nonlinear elasticity. The PHWVP allows incorporation of the probabilistic distributions for the constitutive law, compatibility condition, equilibrium, domain and boundary conditions into the PFEM. Thus, a complete probabilistic analysis can be performed where all aspects of the problem are treated as random variables and/or fields. The Hu-Washizu variational formulation is available in many conventional finite element codes thereby enabling the straightforward inclusion of the probabilistic features into present codes.

  12. Approach to modeling of human performance for purposes of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Swain, A.D.

    1983-01-01

    This paper describes the general approach taken in NUREG/CR-1278 to model human performance in sufficient detail to permit probabilistic risk assessments of nuclear power plant operations. To show the basis for the more specific models in the above NUREG, a simplified model of the human component in man-machine systems is presented, the role of performance shaping factors is discussed, and special problems in modeling the cognitive aspects of behavior are described.

  13. Probabilistic record linkage.

    Science.gov (United States)

    Sayers, Adrian; Ben-Shlomo, Yoav; Blom, Ashley W; Steele, Fiona

    2016-06-01

    Studies involving the use of probabilistic record linkage are becoming increasingly common. However, the methods underpinning probabilistic record linkage are not widely taught or understood, and therefore these studies can appear to be a 'black box' research tool. In this article, we aim to describe the process of probabilistic record linkage through a simple exemplar. We first introduce the concept of deterministic linkage and contrast this with probabilistic linkage. We illustrate each step of the process using a simple exemplar and describe the data structure required to perform a probabilistic linkage. We describe the process of calculating and interpreting matched weights and how to convert matched weights into posterior probabilities of a match using Bayes theorem. We conclude this article with a brief discussion of some of the computational demands of record linkage, how you might assess the quality of your linkage algorithm, and how epidemiologists can maximize the value of their record-linked research using robust record linkage methods. © The Author 2015; Published by Oxford University Press on behalf of the International Epidemiological Association.
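
    A toy sketch of the core calculation: a field-specific match weight built from m- and u-probabilities, and conversion of a total weight into a posterior match probability via Bayes theorem. The m, u and prior values are illustrative, not from the exemplar in the article.

    ```python
    import math

    # Toy probabilistic record linkage for one field comparison (e.g., surname agreement).
    # m = P(field agrees | records are a true match), u = P(field agrees | non-match).
    m, u = 0.95, 0.01

    agreement_weight    = math.log2(m / u)                # weight applied when the field agrees
    disagreement_weight = math.log2((1 - m) / (1 - u))    # weight applied when it disagrees

    # Convert a total match weight into a posterior match probability with Bayes theorem,
    # given a prior probability that a randomly chosen record pair is a true match.
    def posterior(total_weight: float, prior: float) -> float:
        likelihood_ratio = 2.0 ** total_weight
        odds = likelihood_ratio * prior / (1 - prior)
        return odds / (1 + odds)

    print(f"agreement weight    : {agreement_weight:+.2f}")
    print(f"disagreement weight : {disagreement_weight:+.2f}")
    print(f"posterior P(match)  : {posterior(agreement_weight, prior=1e-4):.4f}")
    ```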

  14. Insights from Guideline for Performance of Internal Flooding Probabilistic Risk Assessment (IFPRA)

    International Nuclear Information System (INIS)

    Choi, Sun Yeong; Yang, Joo Eon

    2009-01-01

    An internal flooding (IF) risk assessment refers to the quantitative probabilistic safety assessment (PSA) treatment of flooding as a result of pipe and tank breaks inside the plants, as well as from other recognized flood sources. The industry consensus standard for Internal Events Probabilistic Risk Assessment (ASME-RA-Sb-2005) includes high-level and supporting technical requirements for developing internal flooding probabilistic risk assessment (IFPRA). This industry standard is endorsed in Regulatory Guide 1.200, Revision 1 as an acceptable approach for addressing the risk contribution from IF events for risk informed applications that require U.S. Nuclear Regulatory commission (NRC) approval. In 2006, EPRI published a draft report for IFPRA that addresses the requirements of the ASME PRA consensus standard and have made efforts to refine and update the final EPRI IFPRA guideline. Westinghouse has performed an IFPRA analysis for several nuclear power plants (NPPs), such as Watts Bar and Fort Calhoun, using the draft EPRI guidelines for development of an IFPRA. Proprietary methodologies have been developed to apply the EPRI guidelines. The objectives of the draft report for IFPRA guideline are to: · Provide guidance for PSA practitioners in the performance of the elements of a PRA associated with internal flooding events consistent with the current state of the art for internal flooding PRA · Provide guidance regarding acceptable approaches that is sufficient to meeting the requirements of the ASME PRA Standard associated with internal flooding · Incorporate lessons learned in the performance of internal flooding PRAs including those identified as pilot applications of earlier drafts of this procedures guide The purpose of this paper is to present a vision for domestic nuclear power plants' IFPRA by comparing the method of the draft EPRI guidelines with the existing IFPRA method for domestic NPPs

  15. Insights from Guideline for Performance of Internal Flooding Probabilistic Risk Assessment (IFPRA)

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Sun Yeong; Yang, Joo Eon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2009-10-15

    An internal flooding (IF) risk assessment refers to the quantitative probabilistic safety assessment (PSA) treatment of flooding as a result of pipe and tank breaks inside the plants, as well as from other recognized flood sources. The industry consensus standard for Internal Events Probabilistic Risk Assessment (ASME-RA-Sb-2005) includes high-level and supporting technical requirements for developing internal flooding probabilistic risk assessment (IFPRA). This industry standard is endorsed in Regulatory Guide 1.200, Revision 1 as an acceptable approach for addressing the risk contribution from IF events for risk informed applications that require U.S. Nuclear Regulatory commission (NRC) approval. In 2006, EPRI published a draft report for IFPRA that addresses the requirements of the ASME PRA consensus standard and have made efforts to refine and update the final EPRI IFPRA guideline. Westinghouse has performed an IFPRA analysis for several nuclear power plants (NPPs), such as Watts Bar and Fort Calhoun, using the draft EPRI guidelines for development of an IFPRA. Proprietary methodologies have been developed to apply the EPRI guidelines. The objectives of the draft report for IFPRA guideline are to: · Provide guidance for PSA practitioners in the performance of the elements of a PRA associated with internal flooding events consistent with the current state of the art for internal flooding PRA · Provide guidance regarding acceptable approaches that is sufficient to meeting the requirements of the ASME PRA Standard associated with internal flooding · Incorporate lessons learned in the performance of internal flooding PRAs including those identified as pilot applications of earlier drafts of this procedures guide The purpose of this paper is to present a vision for domestic nuclear power plants' IFPRA by comparing the method of the draft EPRI guidelines with the existing IFPRA method for domestic NPPs.

  16. Use of limited data to construct Bayesian networks for probabilistic risk assessment.

    Energy Technology Data Exchange (ETDEWEB)

    Groth, Katrina M.; Swiler, Laura Painton

    2013-03-01

    Probabilistic Risk Assessment (PRA) is a fundamental part of safety/quality assurance for nuclear power and nuclear weapons. Traditional PRA very effectively models complex hardware system risks using binary probabilistic models. However, traditional PRA models are not flexible enough to accommodate non-binary soft-causal factors, such as digital instrumentation & control, passive components, aging, common cause failure, and human errors. Bayesian Networks offer the opportunity to incorporate these risks into the PRA framework. This report describes the results of an early career LDRD project titled "Use of Limited Data to Construct Bayesian Networks for Probabilistic Risk Assessment". The goal of the work was to establish the capability to develop Bayesian Networks from sparse data, and to demonstrate this capability by producing a data-informed Bayesian Network for use in Human Reliability Analysis (HRA) as part of nuclear power plant Probabilistic Risk Assessment (PRA). This report summarizes the research goal and major products of the research.
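
    One simple way to build conditional probability tables from sparse data, in the spirit described above, is to blend observed counts with Dirichlet pseudo-counts that encode prior or expert judgment. The toy parent-child nodes and counts below are invented and are not the HRA network from the report.

    ```python
    import numpy as np

    # Sparse observations of a binary child node Y for each configuration of a binary
    # parent X: counts[x] = [n_y0, n_y1]. Only a handful of observations are available.
    counts = {0: np.array([3, 1]),    # X = 0 observed 4 times
              1: np.array([0, 2])}    # X = 1 observed 2 times (Y = 0 never seen yet)

    alpha = 1.0   # Dirichlet pseudo-count expressing prior/expert belief

    # Posterior-mean conditional probability table (CPT) with pseudo-counts
    cpt = {x: (c + alpha) / (c + alpha).sum() for x, c in counts.items()}
    for x, probs in cpt.items():
        print(f"P(Y | X={x}) = {probs.round(3)}")
    ```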

  17. Why do probabilistic finite element analysis ?

    CERN Document Server

    Thacker, Ben H

    2008-01-01

    The intention of this book is to provide an introduction to performing probabilistic finite element analysis. As a short guideline, the objective is to inform the reader of the use, benefits and issues associated with performing probabilistic finite element analysis without excessive theory or mathematical detail.

  18. Probabilistic risk assessment, Volume I

    International Nuclear Information System (INIS)

    Anon.

    1982-01-01

    This book contains 158 papers presented at the International Topical Meeting on Probabilistic Risk Assessment held by the American Nuclear Society (ANS) and the European Nuclear Society (ENS) in Port Chester, New York in 1981. The meeting was the second in a series of three. The main focus of the meeting was on the safety of light water reactors. The papers discuss safety goals and risk assessment. Quantitative safety goals, risk assessment in non-nuclear technologies, and operational experience and data base are also covered. Included is an address by Dr. Chauncey Starr

  19. Probabilistic programmable quantum processors

    International Nuclear Information System (INIS)

    Buzek, V.; Ziman, M.; Hillery, M.

    2004-01-01

    We analyze how to improve the performance of probabilistic programmable quantum processors. We show how the probability of success of the probabilistic processor can be enhanced by using the processor in loops. In addition, we show that arbitrary SU(2) transformations of qubits can be encoded in the program state of a universal programmable probabilistic quantum processor. The probability of success of this processor can be enhanced by a systematic correction of errors via conditional loops. Finally, we show that all our results can be generalized also for qudits. (Abstract Copyright [2004], Wiley Periodicals, Inc.)
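
    The loop-enhancement argument reduces to elementary probability: if each pass of the probabilistic processor succeeds independently with probability p, conditional repetition raises the overall success probability. A tiny sketch with an illustrative single-pass probability:

    ```python
    # Overall success probability when a probabilistic program is repeated in a
    # conditional loop of up to n independent passes (p is illustrative).
    p = 0.25
    for n in (1, 2, 4, 8):
        print(f"{n} pass(es): P(success) = {1 - (1 - p) ** n:.3f}")
    ```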

  20. The Effects of Goal Setting on Rugby Performance

    Science.gov (United States)

    Mellalieu, Stephen D.; Hanton, Sheldon; O'Brien, Michael

    2006-01-01

    Goal-setting effects on selected performance behaviors of 5 collegiate rugby players were assessed over an entire competitive season using self-generated targets and goal-attainment scaling. Results suggest that goal setting was effective for enhancing task-specific on-field behavior in rugby union. (Contains 1 figure.)

  1. Probabilistic Fracture Mechanics of Reactor Pressure Vessels with Populations of Flaws

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, Benjamin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Backman, Marie [Univ. of Tennessee, Knoxville, TN (United States); Williams, Paul [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Hoffman, William [Idaho National Lab. (INL), Idaho Falls, ID (United States); Alfonsi, Andrea [Idaho National Lab. (INL), Idaho Falls, ID (United States); Dickson, Terry [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bass, B. Richard [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Klasky, Hilda [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2016-09-01

    This report documents recent progress in developing a tool that uses the Grizzly and RAVEN codes to perform probabilistic fracture mechanics analyses of reactor pressure vessels in light water reactor nuclear power plants. The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. Because of the central role of the reactor pressure vessel (RPV) in a nuclear power plant, particular emphasis is being placed on developing capabilities to model fracture in embrittled RPVs to aid in the process surrounding decision making relating to life extension of existing plants. A typical RPV contains a large population of pre-existing flaws introduced during the manufacturing process. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation at one or more of these flaws during a transient event. This report documents development and initial testing of a capability to perform probabilistic fracture mechanics of large populations of flaws in RPVs using reduced order models to compute fracture parameters. The work documented here builds on prior efforts to perform probabilistic analyses of a single flaw with uncertain parameters, as well as earlier work to develop deterministic capabilities to model the thermo-mechanical response of the RPV under transient events, and compute fracture mechanics parameters at locations of pre-defined flaws. The capabilities developed as part of this work provide a foundation for future work, which will develop a platform that provides the flexibility needed to consider scenarios that cannot be addressed with the tools used in current practice.
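
    A highly simplified sketch of the sampling logic for a population of flaws: each Monte Carlo realization draws a flaw population and checks whether any flaw's applied stress intensity exceeds its sampled toughness. The flaw-size, loading and toughness models below are invented and bear no relation to the Grizzly/RAVEN reduced order models.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_vessels = 20_000          # Monte Carlo realizations of the vessel
    flaws_per_vessel = 50       # size of the pre-existing flaw population (illustrative)

    # Illustrative applied stress intensity and toughness models:
    # K_applied grows with flaw depth; K_Ic is lognormal to represent embrittlement scatter.
    depth_mm  = rng.exponential(scale=2.0, size=(n_vessels, flaws_per_vessel))
    k_applied = 8.0 * np.sqrt(depth_mm)                                   # MPa*sqrt(m), toy model
    k_ic      = rng.lognormal(mean=np.log(40.0), sigma=0.25,
                              size=(n_vessels, flaws_per_vessel))         # fracture toughness

    # A realization "initiates" if any flaw in its population has K_applied >= K_Ic
    vessel_initiates = (k_applied >= k_ic).any(axis=1)
    print(f"conditional probability of crack initiation = {vessel_initiates.mean():.4f}")
    ```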

  2. Clinical skills-related learning goals of senior medical students after performance feedback.

    Science.gov (United States)

    Chang, Anna; Chou, Calvin L; Teherani, Arianne; Hauer, Karen E

    2011-09-01

    Lifelong learning is essential for doctors to maintain competence in clinical skills. With performance feedback, learners should be able to formulate specific and achievable learning goals in areas of need. We aimed to determine: (i) the type and specificity of medical student learning goals after a required clinical performance examination; (ii) differences in goal setting among low, average and high performers, and (iii) whether low performers articulate learning goals that are concordant with their learning needs. We conducted a single-site, multi-year, descriptive comparison study. Senior medical students were given performance benchmarks, individual feedback and guidelines on learning goals; each student was subsequently instructed to write two clinical skills learning goals. Investigators coded the learning goals for specificity, categorised the goals, and performed statistical analyses to determine their concordance with student performance level (low, average or high) in data gathering (history taking and physical examination) or communication skills. All 208 students each wrote two learning goals and most (n=200, 96%) wrote two specific learning goals. Nearly two-thirds of low performers in data gathering wrote at least one learning goal that referred to history taking or physical examination; one-third wrote learning goals pertaining to the organisation of the encounter. High performers in data gathering wrote significantly more patient education goals and significantly fewer history-taking goals than average or low performers. Only 50% of low performers in communication wrote learning goals related to communication skills. Low performers in communication were significantly more likely than average or high performers to identify learning goals related to improving performance in future examinations. The provision of performance benchmarking, individual feedback and brief written guidelines helped most senior medical students in our study to write specific

  3. Advanced nuclear power plant regulation using risk-informed and performance-based methods

    International Nuclear Information System (INIS)

    Modarres, Mohammad

    2009-01-01

    This paper proposes and discusses implications of a largely probabilistic regulatory framework using best-estimate, goal-driven, risk-informed, and performance-based methods. This framework relies on continuous probabilistic assessment of performance of a set of time-dependent, safety-critical systems, structures, components, and procedures that assure attainment of a broad set of overarching technology-neutral protective, mitigative, and preventive goals under all phases of plant operations. In this framework acceptable levels of performance are set through formal apportionment so that they are commensurate with the overarching goals. Regulatory acceptance would be the based on the confidence level with which the plant conforms to these goals and performance objectives. The proposed framework uses the traditional defense-in-depth design and operation regulatory philosophy when uncertainty in conforming to specific goals and objectives is high. Finally, the paper discusses the steps needed to develop a corresponding technology-neutral regulatory approach from the proposed framework

  4. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Directory of Open Access Journals (Sweden)

    Xuefei Guan

    2011-01-01

    Full Text Available In this paper, two probabilistic prognosis updating schemes are compared. One is based on the classical Bayesian approach and the other is based on the newly developed maximum relative entropy (MRE) approach. The algorithm performance of the two models is evaluated using a set of recently developed prognostics-based metrics. Various uncertainties from measurements, modeling, and parameter estimations are integrated into the prognosis framework as random input variables for fatigue damage of materials. Measures of response variables are then used to update the statistical distributions of random variables and the prognosis results are updated using posterior distributions. Markov Chain Monte Carlo (MCMC) technique is employed to provide the posterior samples for model updating in the framework. Experimental data are used to demonstrate the operation of the proposed probabilistic prognosis methodology. A set of prognostics-based metrics are employed to quantitatively evaluate the prognosis performance and compare the proposed entropy method with the classical Bayesian updating algorithm. In particular, model accuracy, precision, robustness and convergence are rigorously evaluated in addition to the qualitative visual comparison. Following this, potential development and improvement for the prognostics-based metrics are discussed in detail.
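
    A compact sketch of the classical Bayesian branch of such an updating scheme: a random-walk Metropolis sampler updates a single damage-model parameter from synthetic measurements. The linear damage model, prior and data are invented for illustration; the MRE variant and the prognostic metrics are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Synthetic damage observations (illustrative), modeled as y = theta * t + noise
    t_obs = np.array([1.0, 2.0, 3.0, 4.0])
    y_obs = np.array([0.9, 2.1, 2.9, 4.2])
    noise_sd = 0.2

    def log_posterior(theta):
        log_prior = -0.5 * ((theta - 1.5) / 1.0) ** 2                       # N(1.5, 1) prior
        log_like  = -0.5 * np.sum(((y_obs - theta * t_obs) / noise_sd) ** 2)
        return log_prior + log_like

    # Random-walk Metropolis to draw posterior samples of the damage parameter
    theta, samples = 1.5, []
    for _ in range(20_000):
        prop = theta + 0.05 * rng.normal()
        if np.log(rng.random()) < log_posterior(prop) - log_posterior(theta):
            theta = prop
        samples.append(theta)

    post = np.array(samples[5_000:])   # discard burn-in
    print(f"posterior mean = {post.mean():.3f}, 95% interval = "
          f"({np.percentile(post, 2.5):.3f}, {np.percentile(post, 97.5):.3f})")
    ```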

  5. Probabilistic safety analysis procedures guide

    International Nuclear Information System (INIS)

    Papazoglou, I.A.; Bari, R.A.; Buslik, A.J.

    1984-01-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This guide addresses the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant and from loss of offsite electric power. The scope includes analyses of problem-solving (cognitive) human errors, a determination of importance of the various core damage accident sequences, and an explicit treatment and display of uncertainties for the key accident sequences. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance) and the risk associated with external accident initiators, as consensus is developed regarding suitable methodologies in these areas. This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are essential for regulatory decision making. Methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study.

  6. Stormwater Tank Performance: Design and Management Criteria for Capture Tanks Using a Continuous Simulation and a Semi-Probabilistic Analytical Approach

    Directory of Open Access Journals (Sweden)

    Flavio De Martino

    2013-10-01

    Full Text Available Stormwater tank performance significantly depends on management practices. This paper proposes a procedure to assess tank efficiency in terms of volume and pollutant concentration using four different capture tank management protocols. The comparison of the efficiency results reveals that, as expected, a combined bypass-stormwater tank system achieves better results than a tank alone. The management practices tested for the tank-only systems provide notably different efficiency results. The practice of immediately emptying after the end of the event exhibits significant levels of efficiency and operational advantages. All other configurations exhibit either significant operational problems or very low performance. The continuous simulation and semi-probabilistic approach for the best tank management practice are compared. The semi-probabilistic approach is based on a Weibull probabilistic model of the main characteristics of the rainfall process. Following this approach, efficiency indexes were established. The comparison with continuous simulations shows the reliability of the probabilistic approach, although the latter is certainly very site-sensitive.
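
    A minimal sketch, under assumed numbers, of the two estimates being compared: an event-based simulation of a capture tank that is emptied immediately after each event (the best-performing protocol above), and a semi-probabilistic estimate built from a Weibull model of event runoff volumes. The Weibull parameters and the tank size are illustrative only.

```python
# Hedged sketch: capture-tank volumetric efficiency from event simulation vs.
# a semi-probabilistic estimate based on a Weibull model of event volumes.
import numpy as np

rng = np.random.default_rng(7)
shape, scale = 0.8, 12.0      # assumed Weibull model of event runoff depth (mm)
storage = 10.0                # tank capacity expressed as runoff depth (mm)

# Event-based simulation: the tank is empty at the start of every event,
# so each event contributes min(event volume, storage) to the captured volume.
events = scale * rng.weibull(shape, size=50_000)
eff_sim = np.minimum(events, storage).sum() / events.sum()

# Semi-probabilistic estimate: E[min(V, S)] / E[V] from the Weibull density.
v = np.linspace(1e-6, 200.0, 200_000)
pdf = (shape / scale) * (v / scale) ** (shape - 1) * np.exp(-(v / scale) ** shape)
dv = v[1] - v[0]
eff_ana = np.sum(np.minimum(v, storage) * pdf) * dv / (np.sum(v * pdf) * dv)

print(f"simulated efficiency  : {eff_sim:.3f}")
print(f"Weibull-based estimate: {eff_ana:.3f}")
```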

  7. 14th International Probabilistic Workshop

    CERN Document Server

    Taerwe, Luc; Proske, Dirk

    2017-01-01

    This book presents the proceedings of the 14th International Probabilistic Workshop that was held in Ghent, Belgium in December 2016. Probabilistic methods are currently of crucial importance for research and developments in the field of engineering, which face challenges presented by new materials and technologies and rapidly changing societal needs and values. Contemporary needs related to, for example, performance-based design, service-life design, life-cycle analysis, product optimization, assessment of existing structures and structural robustness give rise to new developments as well as accurate and practically applicable probabilistic and statistical engineering methods to support these developments. These proceedings are a valuable resource for anyone interested in contemporary developments in the field of probabilistic engineering applications.

  8. Development of a Risk-Based Probabilistic Performance-Assessment Method for Long-Term Cover Systems - 2nd Edition

    International Nuclear Information System (INIS)

    HO, CLIFFORD K.; ARNOLD, BILL W.; COCHRAN, JOHN R.; TAIRA, RANDAL Y.

    2002-01-01

    A probabilistic, risk-based performance-assessment methodology has been developed to assist designers, regulators, and stakeholders in the selection, design, and monitoring of long-term covers for contaminated subsurface sites. This report describes the method, the software tools that were developed, and an example that illustrates the probabilistic performance-assessment method using a repository site in Monticello, Utah. At the Monticello site, a long-term cover system is being used to isolate long-lived uranium mill tailings from the biosphere. Computer models were developed to simulate relevant features, events, and processes that include water flux through the cover, source-term release, vadose-zone transport, saturated-zone transport, gas transport, and exposure pathways. The component models were then integrated into a total-system performance-assessment model, and uncertainty distributions of important input parameters were constructed and sampled in a stochastic Monte Carlo analysis. Multiple realizations were simulated using the integrated model to produce cumulative distribution functions of the performance metrics, which were used to assess cover performance for both present- and long-term future conditions. Performance metrics for this study included the water percolation reaching the uranium mill tailings, radon gas flux at the surface, groundwater concentrations, and dose. Results from uncertainty analyses, sensitivity analyses, and alternative design comparisons are presented for each of the performance metrics. The benefits from this methodology include a quantification of uncertainty, the identification of parameters most important to performance (to prioritize site characterization and monitoring activities), and the ability to compare alternative designs using probabilistic evaluations of performance (for cost savings)
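
    The probabilistic performance-assessment loop described above can be sketched as follows: sample uncertain inputs, run a (here deliberately trivial) total-system model for each realization, and build a cumulative distribution function of a performance metric such as percolation through the cover. The input distributions and the toy cover model are assumptions for illustration, not Monticello site data.

```python
# Hedged sketch of a stochastic Monte Carlo performance assessment: sampled
# inputs -> toy cover model -> empirical CDF of a performance metric.
import numpy as np

rng = np.random.default_rng(1)
n_real = 5000                                        # Monte Carlo realizations

# Uncertain inputs (assumed distributions, not site data).
precip = rng.lognormal(mean=np.log(300.0), sigma=0.3, size=n_real)      # mm/yr
cover_ksat = rng.lognormal(mean=np.log(1e-7), sigma=1.0, size=n_real)   # m/s
et_fraction = rng.uniform(0.5, 0.9, size=n_real)     # fraction lost to evapotranspiration

def percolation(p, ksat, et):
    """Toy cover model: water not evapotranspired, capped by cover conductivity."""
    flux_limit = ksat * 3.15e10                      # convert m/s to mm/yr
    return np.minimum(p * (1.0 - et), flux_limit)

perc = percolation(precip, cover_ksat, et_fraction)

# Empirical CDF of the performance metric across realizations.
perc_sorted = np.sort(perc)
cdf = np.arange(1, n_real + 1) / n_real
print(f"median percolation: {perc_sorted[n_real // 2]:.1f} mm/yr")
print(f"95th percentile   : {perc_sorted[int(0.95 * n_real)]:.1f} mm/yr")
```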

  9. Classroom Environment, Achievement Goals and Maths Performance: Gender Differences

    Science.gov (United States)

    Gherasim, Loredana Ruxandra; Butnaru, Simona; Mairean, Cornelia

    2013-01-01

    This study investigated how gender shapes the relationships between classroom environment, achievement goals and maths performance. Seventh-grade students ("N"?=?498) from five urban secondary schools filled in achievement goal orientations and classroom environment scales at the beginning of the second semester. Maths performance was…

  10. Goal Setting and Expectancy Theory Predictions of Effort and Performance.

    Science.gov (United States)

    Dossett, Dennis L.; Luce, Helen E.

    Neither expectancy (VIE) theory nor goal setting alone are effective determinants of individual effort and task performance. To test the combined ability of VIE and goal setting to predict effort and performance, 44 real estate agents and their managers completed questionnaires. Quarterly income goals predicted managers' ratings of agents' effort,…

  11. Performance improvement program: goals and experience

    International Nuclear Information System (INIS)

    Guglielmi, F.

    2015-01-01

    Following a long 54-month refurbishment outage at Point Lepreau Generating Station, operational performance had fallen below industry standards in a number of areas. Leadership development and succession planning had stalled. Operational focus was low primarily due to the construction focus during refurbishment. The condition of the balance of plant was poor, including several long-standing deficiencies. In order to improve performance, the site implemented a framework based on INPO 12-011: Focus on Improving Behaviours; Set common goals and demonstrate results; Align and engage the organization; Drive to achieve high levels of performance and sustain performance.

  12. Performance improvement program: goals and experience

    Energy Technology Data Exchange (ETDEWEB)

    Guglielmi, F. [Point Lepreau Generating Station, Maces Bay, New Brunswick (Canada)

    2015-07-01

    Following a long 54-month refurbishment outage at Point Lepreau Generating Station, operational performance had fallen below industry standards in a number of areas. Leadership development and succession planning had stalled. Operational focus was low primarily due to the construction focus during refurbishment. The condition of the balance of plant was poor, including several long-standing deficiencies. In order to improve performance, the site implemented a framework based on INPO 12-011: Focus on Improving Behaviours; Set common goals and demonstrate results; Align and engage the organization; Drive to achieve high levels of performance and sustain performance.

  13. Performance management and goal ambiguity: managerial implications in a single payer system.

    Science.gov (United States)

    Calciolari, Stefano; Cantù, Elena; Fattore, Giovanni

    2011-01-01

    Goal ambiguity influences the effectiveness of performance management systems to drive organizations toward enhanced results. The literature analyzes the antecedents of goal ambiguity and shows the influence of goal ambiguity on the performance of U.S. federal agencies. However, no study has analyzed goal ambiguity in other countries or in health care systems. This study has three aims: to test the validity of a measurement instrument for goal ambiguity, to investigate its main antecedents, and to explore the relationship between goal ambiguity and organizational performance in a large, public, Beveridge-type health care system. A nationwide survey of general managers of the Italian national health system was performed. A factor analysis was used to validate the mono-dimensionality of an instrument that measured goal ambiguity. Structural equation modeling was used to test both the antecedents and the influence of goal ambiguity on organizational performance. Data from 135 health care organizations (53% response rate) were available for analysis. The results confirm the mono-dimensionality of the instrument, the existence of two environmental sources of ambiguity (political endorsement and governance commitment), and the negative relationship between goal ambiguity and organizational performance. Goal ambiguity matters because it may hamper organizational performance. Therefore, performance should be fostered by reducing goal ambiguity (e.g., goal-setting model, funding arrangements, and political support). Mutatis mutandis, our results may apply to public health care systems of other countries or other "public interest" sectors, such as social care and education.

  14. Behavioral Modeling Based on Probabilistic Finite Automata: An Empirical Study.

    Science.gov (United States)

    Tîrnăucă, Cristina; Montaña, José L; Ontañón, Santiago; González, Avelino J; Pardo, Luis M

    2016-06-24

    Imagine an agent that performs tasks according to different strategies. The goal of Behavioral Recognition (BR) is to identify which of the available strategies is the one being used by the agent, by simply observing the agent's actions and the environmental conditions during a certain period of time. The goal of Behavioral Cloning (BC) is more ambitious. In this last case, the learner must be able to build a model of the behavior of the agent. In both settings, the only assumption is that the learner has access to a training set that contains instances of observed behavioral traces for each available strategy. This paper studies a machine learning approach based on Probabilistic Finite Automata (PFAs), capable of achieving both the recognition and cloning tasks. We evaluate the performance of PFAs in the context of a simulated learning environment (in this case, a virtual Roomba vacuum cleaner robot), and compare it with a collection of other machine learning approaches.
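
    A minimal sketch of the Behavioral Recognition task: each candidate strategy is represented by a small probabilistic finite automaton, an observed action trace is scored under each automaton, and the highest-likelihood strategy is reported. The two-state automata and the action alphabet are invented for illustration and are not the models learned in the study.

```python
# Hedged sketch: Behavioral Recognition by scoring a trace under one PFA per
# strategy.  Each PFA here has deterministic structure: for a given state and
# symbol there is a single transition with an emission probability.
import numpy as np

# transition[state][symbol] -> (next_state, probability of emitting symbol)
pfa_clean   = {0: {"move": (0, 0.7), "turn": (1, 0.3)},
               1: {"move": (0, 0.6), "turn": (1, 0.4)}}
pfa_explore = {0: {"move": (0, 0.4), "turn": (1, 0.6)},
               1: {"move": (0, 0.5), "turn": (1, 0.5)}}

def log_likelihood(pfa, trace, state=0):
    """Log-probability of an action trace under the automaton."""
    ll = 0.0
    for sym in trace:
        state, p = pfa[state][sym]
        ll += np.log(p)
    return ll

trace = ["move", "move", "turn", "move", "turn", "turn"]
scores = {"clean": log_likelihood(pfa_clean, trace),
          "explore": log_likelihood(pfa_explore, trace)}
print("recognized strategy:", max(scores, key=scores.get), scores)
```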

  15. Probabilistic numerical discrimination in mice.

    Science.gov (United States)

    Berkay, Dilara; Çavdaroğlu, Bilgehan; Balcı, Fuat

    2016-03-01

    Previous studies showed that both human and non-human animals can discriminate between different quantities (i.e., time intervals, numerosities) with a limited level of precision due to their endogenous/representational uncertainty. In addition, other studies have shown that subjects can modulate their temporal categorization responses adaptively by incorporating information gathered regarding probabilistic contingencies into their time-based decisions. Despite the psychophysical similarities between the interval timing and nonverbal counting functions, the sensitivity of count-based decisions to probabilistic information remains an unanswered question. In the current study, we investigated whether exogenous probabilistic information can be integrated into numerosity-based judgments by mice. In the task employed in this study, reward was presented either after few (i.e., 10) or many (i.e., 20) lever presses, the last of which had to be emitted on the lever associated with the corresponding trial type. In order to investigate the effect of probabilistic information on performance in this task, we manipulated the relative frequency of different trial types across different experimental conditions. We evaluated the behavioral performance of the animals under models that differed in terms of their assumptions regarding the cost of responding (e.g., logarithmically increasing vs. no response cost). Our results showed for the first time that mice could adaptively modulate their count-based decisions based on the experienced probabilistic contingencies in directions predicted by optimality.

  16. Probabilistic evaluation of fuel element performance by the combined use of a fast running simplistic and a detailed deterministic fuel performance code

    International Nuclear Information System (INIS)

    Misfeldt, I.

    1980-01-01

    A comprehensive evaluation of fuel element performance requires a probabilistic fuel code supported by a well bench-marked deterministic code. This paper presents an analysis of a SGHWR ramp experiment, where the probabilistic fuel code FRP is utilized in combination with the deterministic fuel models FFRS and SLEUTH/SEER. The statistical methods employed in FRP are Monte Carlo simulation or a low-order Taylor approximation. The fast-running simplistic fuel code FFRS is used for the deterministic simulations, whereas simulations with SLEUTH/SEER are used to verify the predictions of FFRS. The ramp test was performed with a SGHWR fuel element, where 9 of the 36 fuel pins failed. There seemed to be good agreement between the deterministic simulations and the experiment, but the statistical evaluation shows that the uncertainty on the important performance parameters is too large for this "nice" result. The analysis does therefore indicate a discrepancy between the experiment and the deterministic code predictions. Possible explanations for this disagreement are discussed. (author)

  17. Does Manipulating Stereotype Threat Condition Change Performance Goal State

    Science.gov (United States)

    Simmons, Cecil Max

    2010-01-01

    This study tested whether the Stereotype Threat effect is mediated by achievement goals, in particular performance-avoidance goals. Threat level was altered before a difficult math test to observe how the endorsement by females of various achievement goal dimensions was affected. 222 people (96 females) in a pre-calculus class at a Mid-Western…

  18. Assessing performance and validating finite element simulations using probabilistic knowledge

    Energy Technology Data Exchange (ETDEWEB)

    Dolin, Ronald M.; Rodriguez, E. A. (Edward A.)

    2002-01-01

    Two probabilistic approaches for assessing performance are presented. The first approach assesses probability of failure by simultaneously modeling all likely events. The probability each event causes failure along with the event's likelihood of occurrence contribute to the overall probability of failure. The second assessment method is based on stochastic sampling using an influence diagram. Latin-hypercube sampling is used to stochastically assess events. The overall probability of failure is taken as the maximum probability of failure of all the events. The Likelihood of Occurrence simulation suggests failure does not occur while the Stochastic Sampling approach predicts failure. The Likelihood of Occurrence results are used to validate finite element predictions.
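
    The stochastic-sampling assessment described above can be illustrated with a small Latin-hypercube sketch: stratified samples of two uncertain inputs drive a load-versus-capacity limit state, and the fraction of violating samples estimates a probability of failure. The limit state and both distributions are assumed stand-ins for the finite element predictions.

```python
# Hedged sketch: Latin-hypercube sampling of a simple limit state.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

def latin_hypercube(n, dims):
    """Stratified uniform(0,1) samples: one point per equal-probability bin."""
    u = (rng.uniform(size=(n, dims)) + np.arange(n)[:, None]) / n
    for d in range(dims):
        rng.shuffle(u[:, d])                 # decouple the columns
    return u

n = 5000
u = latin_hypercube(n, 2)
load = norm.ppf(u[:, 0], loc=80.0, scale=15.0)        # assumed load (kN)
capacity = norm.ppf(u[:, 1], loc=120.0, scale=10.0)   # assumed capacity (kN)

failed = load > capacity                               # limit-state violation
print(f"estimated probability of failure: {failed.mean():.4f}")
```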

  19. Probabilistic Cue Combination: Less Is More

    Science.gov (United States)

    Yurovsky, Daniel; Boyer, Ty W.; Smith, Linda B.; Yu, Chen

    2013-01-01

    Learning about the structure of the world requires learning probabilistic relationships: rules in which cues do not predict outcomes with certainty. However, in some cases, the ability to track probabilistic relationships is a handicap, leading adults to perform non-normatively in prediction tasks. For example, in the "dilution effect,"…

  20. Probabilistic risk analysis for the NASA space shuttle: a brief history and current work

    International Nuclear Information System (INIS)

    Pate-Cornell, Elisabeth; Dillon, Robin

    2001-01-01

    While NASA managers have always relied on risk analysis tools for the development and maintenance of space projects, quantitative and especially probabilistic techniques have been gaining acceptance in recent years. In some cases, the studies have been required, for example, to launch the Galileo spacecraft with plutonium fuel, but these successful applications have helped to demonstrate the benefits of these tools. This paper reviews the history of probabilistic risk analysis (PRA) by NASA for the space shuttle program and discusses the status of the on-going development of the Quantitative Risk Assessment System (QRAS) software that performs PRA. The goal is to have within NASA a tool that can be used when needed to update previous risk estimates and to assess the benefits of possible upgrades to the system

  1. Probabilistic Harmonic Modeling of Wind Power Plants

    DEFF Research Database (Denmark)

    Guest, Emerson; Jensen, Kim H.; Rasmussen, Tonny Wederberg

    2017-01-01

    A probabilistic sequence domain (SD) harmonic model of a grid-connected voltage-source converter is used to estimate harmonic emissions in a wind power plant (WPP) comprised of Type-IV wind turbines. The SD representation naturally partitioned converter-generated voltage harmonics into those with deterministic phase and those with probabilistic phase. A case study performed on a string of ten 3MW, Type-IV wind turbines implemented in PSCAD was used to verify the probabilistic SD harmonic model. The probabilistic SD harmonic model can be employed in the planning phase of WPP projects to assess harmonic...

  2. Probabilistic brains: knowns and unknowns

    Science.gov (United States)

    Pouget, Alexandre; Beck, Jeffrey M; Ma, Wei Ji; Latham, Peter E

    2015-01-01

    There is strong behavioral and physiological evidence that the brain both represents probability distributions and performs probabilistic inference. Computational neuroscientists have started to shed light on how these probabilistic representations and computations might be implemented in neural circuits. One particularly appealing aspect of these theories is their generality: they can be used to model a wide range of tasks, from sensory processing to high-level cognition. To date, however, these theories have only been applied to very simple tasks. Here we discuss the challenges that will emerge as researchers start focusing their efforts on real-life computations, with a focus on probabilistic learning, structural learning and approximate inference. PMID:23955561

  3. Probabilistic Decision Graphs - Combining Verification and AI Techniques for Probabilistic Inference

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We adopt probabilistic decision graphs developed in the field of automated verification as a tool for probabilistic model representation and inference. We show that probabilistic inference has linear time complexity in the size of the probabilistic decision graph, that the smallest probabilistic ...

  4. Probabilistic numerics and uncertainty in computations.

    Science.gov (United States)

    Hennig, Philipp; Osborne, Michael A; Girolami, Mark

    2015-07-08

    We deliver a call to arms for probabilistic numerical methods : algorithms for numerical tasks, including linear algebra, integration, optimization and solving differential equations, that return uncertainties in their calculations. Such uncertainties, arising from the loss of precision induced by numerical calculation with limited time or hardware, are important for much contemporary science and industry. Within applications such as climate science and astrophysics, the need to make decisions on the basis of computations with large and complex data have led to a renewed focus on the management of numerical uncertainty. We describe how several seminal classic numerical methods can be interpreted naturally as probabilistic inference. We then show that the probabilistic view suggests new algorithms that can flexibly be adapted to suit application specifics, while delivering improved empirical performance. We provide concrete illustrations of the benefits of probabilistic numeric algorithms on real scientific problems from astrometry and astronomical imaging, while highlighting open problems with these new algorithms. Finally, we describe how probabilistic numerical methods provide a coherent framework for identifying the uncertainty in calculations performed with a combination of numerical algorithms (e.g. both numerical optimizers and differential equation solvers), potentially allowing the diagnosis (and control) of error sources in computations.

  5. Urban Latino children's physical activity levels and performance in interactive dance video games: effects of goal difficulty and goal specificity.

    Science.gov (United States)

    Gao, Zan; Podlog, Leslie

    2012-10-01

    To examine the effects of different levels of goal specificity and difficulty on Latino children's performance and physical activity (PA) levels in an after-school program incorporating an interactive dance program (Dance Dance Revolution [DDR]; Konami Corporation). Comparison study. Rose Park Elementary School, Salt Lake City, Utah. Ninety-eight Latino children in the first through sixth grades, aged 7 to 13 years. After the pretest, the participants were randomly assigned into 1 of the following 3 goal-setting conditions: (1) easy, (2) difficult, and (3) best effort (hereinafter referred to as do-your-best goal). Participants' PA levels were measured using piezoelectric pedometers, and steps per minute were used as the outcome variable. Participants' total points for their dance on television screens were retrieved as their performance scores. These outcome variables were assessed again 8 weeks later (posttest score). The multivariate analysis of covariance yielded a significant main effect for the goal-setting condition. Follow-up tests revealed that children who set specific (easy or difficult) goals had significantly greater increased PA levels (mean scores, 10.34 for easy and 22.45 for difficult) and DDR performance (0.011 for easy and 0.67 for difficult) than those in the do-your-best group (0.83 for PA and 0.17 for performance). In addition, children's increased PA levels in the difficult-goal group were significantly higher than those in the easy-goal group. The easy- and difficult-goal groups show a significant improvement on DDR performance. The difficult-goal group also displays the highest improvement on PA levels. Strategies to enhance children's DDR performance and PA levels are discussed in relation to the extant goal-setting literature.

  6. Learning Probabilistic Logic Models from Probabilistic Examples.

    Science.gov (United States)

    Chen, Jianzhong; Muggleton, Stephen; Santos, José

    2008-10-01

    We revisit an application developed originally using abductive Inductive Logic Programming (ILP) for modeling inhibition in metabolic networks. The example data was derived from studies of the effects of toxins on rats using Nuclear Magnetic Resonance (NMR) time-trace analysis of their biofluids together with background knowledge representing a subset of the Kyoto Encyclopedia of Genes and Genomes (KEGG). We now apply two Probabilistic ILP (PILP) approaches - abductive Stochastic Logic Programs (SLPs) and PRogramming In Statistical modeling (PRISM) to the application. Both approaches support abductive learning and probability predictions. Abductive SLPs are a PILP framework that provides possible worlds semantics to SLPs through abduction. Instead of learning logic models from non-probabilistic examples as done in ILP, the PILP approach applied in this paper is based on a general technique for introducing probability labels within a standard scientific experimental setting involving control and treated data. Our results demonstrate that the PILP approach provides a way of learning probabilistic logic models from probabilistic examples, and the PILP models learned from probabilistic examples lead to a significant decrease in error accompanied by improved insight from the learned results compared with the PILP models learned from non-probabilistic examples.

  7. A Model of Statistics Performance Based on Achievement Goal Theory.

    Science.gov (United States)

    Bandalos, Deborah L.; Finney, Sara J.; Geske, Jenenne A.

    2003-01-01

    Tests a model of statistics performance based on achievement goal theory. Both learning and performance goals affected achievement indirectly through study strategies, self-efficacy, and test anxiety. Implications of these findings for teaching and learning statistics are discussed. (Contains 47 references, 3 tables, 3 figures, and 1 appendix.)…

  8. A Probabilistic Framework for Security Scenarios with Dependent Actions

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweizer, Patrick; Albert, Elvira; Sekereinsk, Emil

    2014-01-01

    This work addresses the growing need of performing meaningful probabilistic analysis of security. We propose a framework that integrates the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. This allows us to perform

  9. Engineering aspects of probabilistic risk assessment

    International Nuclear Information System (INIS)

    vonHerrmann, J.L.; Wood, P.J.

    1984-01-01

    Over the last decade, the use of probabilistic risk assessment (PRA) in the nuclear industry has expanded significantly. In these analyses the probabilities of experiencing certain undesired events (for example, a plant accident which results in damage to the nuclear fuel) are estimated and the consequences of these events are evaluated in terms of some common measure. These probabilities and consequences are then combined to form a representation of the risk associated with the plant studied. In the relatively short history of probabilistic risk assessment of nuclear power plants, the primary motivation for these studies has been the quantitative assessment of public risk associated with a single plant or group of plants. Accordingly, the primary product of most PRAs performed to date has been a 'risk curve' in which the probability (or expected frequency) of exceeding a certain consequence level is plotted against that consequence. The most common goal of these assessments has been to demonstrate the 'acceptability' of the calculated risk by comparison of the resultant risk curve to risk curves associated with other plants or with other societal risks. Presented here are brief descriptions of some alternate applications of PRAs, a discussion of how these other applications compare or contrast with the currently popular uses of PRA, and a discussion of the relative benefits of each

  10. Teacher performance goal practices and elementary students' behavioral engagement: a developmental perspective.

    Science.gov (United States)

    Hughes, Jan N; Wu, Wei; West, Stephen G

    2011-02-01

    We investigated growth trajectories for classroom performance goal practices and for student behavioral engagement across grades 2 to 5 for 497 academically at-risk elementary students. This study is the first longitudinal investigation of performance goal practices in the early elementary years. On average, teacher use of performance goal practices increased and students' behavioral engagement declined across the four years. Using autoregressive latent trajectory (ALT) models, we examined the synchronous relations between teacher-reported performance goal practices and teacher-reported student behavioral engagement. As expected, as students move into classrooms with a new teacher with less emphasis on performance goal practices, they become more behaviorally engaged in school. Gender did not moderate these results. Implications for teacher professional development are discussed. Published by Elsevier Ltd.

  11. Review of probabilistic safety assessments: insights and recommendations regarding further developments

    International Nuclear Information System (INIS)

    Spitzer, C.

    1996-01-01

    Probabilistic Safety Assessments (PSAs) performed by utilities in the framework of Periodic Safety Reviews for German nuclear power plants are reviewed by TUeV Suedwest. This paper presents the insights gained and recommendations concerning the necessity and focus of further developments and applications, based on practical requirements for the performance and assessment of PSAs within regulatory procedures. Furthermore, recommendations are made to ensure the validity of PSA results, which is necessary for achieving their goals. Besides some general points of view, the emphasis of the paper is on methodological aspects with respect to evaluation methods, the assessment of common cause failures, and human reliability assessment.

  12. Diversity in goal orientation, team reflexivity, and team performance

    NARCIS (Netherlands)

    Pieterse, Anne Nederveen; van Knippenberg, Daan; van Ginkel, Wendy P.

    Although recent research highlights the role of team member goal orientation in team functioning, research has neglected the effects of diversity in goal orientation. In a laboratory study with groups working on a problem-solving task, we show that diversity in learning and performance orientation

  13. Probabilistic Performance Guarantees for Distributed Self-Assembly

    KAUST Repository

    Fox, Michael J.

    2015-04-01

    In distributed self-assembly, a multitude of agents seek to form copies of a particular structure, modeled here as a labeled graph. In the model, agents encounter each other in spontaneous pairwise interactions and decide whether or not to form or sever edges based on their two labels and a fixed set of local interaction rules described by a graph grammar. The objective is to converge on a graph with a maximum number of copies of a given target graph. Our main result is the introduction of a simple algorithm that achieves an asymptotically maximum yield in a probabilistic sense. Notably, agents do not need to update their labels except when forming or severing edges. This contrasts with certain existing approaches that exploit information propagating rules, effectively addressing the decision problem at the level of subgraphs as opposed to individual vertices. We are able to obey more stringent locality requirements while also providing smaller rule sets. The results can be improved upon if certain requirements on the labels are relaxed. We discuss limits of performance in self-assembly in terms of rule set characteristics and achievable maximum yield.

  14. Evaluation of goal kicking performance in international rugby union matches.

    Science.gov (United States)

    Quarrie, Kenneth L; Hopkins, Will G

    2015-03-01

    Goal kicking is an important element in rugby but has been the subject of minimal research. To develop and apply a method to describe the on-field pattern of goal-kicking and rank the goal kicking performance of players in international rugby union matches. Longitudinal observational study. A generalized linear mixed model was used to analyze goal-kicking performance in a sample of 582 international rugby matches played from 2002 to 2011. The model adjusted for kick distance, kick angle, a rating of the importance of each kick, and venue-related conditions. Overall, 72% of the 6769 kick attempts were successful. Forty-five percent of points scored during the matches resulted from goal kicks, and in 5.7% of the matches the result of the match hinged on the outcome of a kick attempt. There was an extremely large decrease in success with increasing distance (odds ratio for two SD distance 0.06, 90% confidence interval 0.05-0.07) and a small decrease with increasingly acute angle away from the mid-line of the goal posts (odds ratio for 2 SD angle, 0.44, 0.39-0.49). Differences between players were typically small (odds ratio for 2 between-player SD 0.53, 0.45-0.65). The generalized linear mixed model with its random-effect solutions provides a tool for ranking the performance of goal kickers in rugby. This modelling approach could be applied to other performance indicators in rugby and in other sports in which discrete outcomes are measured repeatedly on players or teams. Copyright © 2015. Published by Elsevier Ltd.

  15. Mastery and Performance Goals Predict Epistemic and Relational Conflict Regulation

    Science.gov (United States)

    Darnon, Celine; Muller, Dominique; Schrager, Sheree M.; Pannuzzo, Nelly; Butera, Fabrizio

    2006-01-01

    The present research examines whether mastery and performance goals predict different ways of reacting to a sociocognitive conflict with another person over materials to be learned, an issue not yet addressed by the achievement goal literature. Results from 2 studies showed that mastery goals predicted epistemic conflict regulation (a conflict…

  16. Probabilistic structural analysis methods for select space propulsion system components

    Science.gov (United States)

    Millwater, H. R.; Cruse, T. A.

    1989-01-01

    The Probabilistic Structural Analysis Methods (PSAM) project developed at the Southwest Research Institute integrates state-of-the-art structural analysis techniques with probability theory for the design and analysis of complex large-scale engineering structures. An advanced efficient software system (NESSUS) capable of performing complex probabilistic analysis has been developed. NESSUS contains a number of software components to perform probabilistic analysis of structures. These components include: an expert system, a probabilistic finite element code, a probabilistic boundary element code and a fast probability integrator. The NESSUS software system is shown. An expert system is included to capture and utilize PSAM knowledge and experience. NESSUS/EXPERT is an interactive menu-driven expert system that provides information to assist in the use of the probabilistic finite element code NESSUS/FEM and the fast probability integrator (FPI). The expert system menu structure is summarized. The NESSUS system contains a state-of-the-art nonlinear probabilistic finite element code, NESSUS/FEM, to determine the structural response and sensitivities. A broad range of analysis capabilities and an extensive element library is present.

  17. Goal orientations predict academic performance beyond intelligence and personality

    NARCIS (Netherlands)

    Steinmayr, R.; Bipp, T.; Spinath, B.

    2011-01-01

    Goal orientations are thought to be an important predictor of scholastic achievement. The present paper investigated the joint influence of goal orientations, intelligence, and personality on school performance in a sample of N = 520 11th and 12th graders (303 female; mean age M = 16.94 years).

  18. A Probabilistic Performance Assessment Study of Potential Low-Level Radioactive Waste Disposal Sites in Taiwan

    Science.gov (United States)

    Knowlton, R. G.; Arnold, B. W.; Mattie, P. D.; Kuo, M.; Tien, N.

    2006-12-01

    For several years now, Taiwan has been engaged in a process to select a low-level radioactive waste (LLW) disposal site. Taiwan is generating LLW from operational and decommissioning wastes associated with nuclear power reactors, as well as research, industrial, and medical radioactive wastes. The preliminary selection process has narrowed the search to four potential candidate sites. These sites are to be evaluated in a performance assessment analysis to determine the likelihood of meeting the regulatory criteria for disposal. Sandia National Laboratories and Taiwan's Institute of Nuclear Energy Research have been working together to develop the necessary performance assessment methodology and associated computer models to perform these analyses. The methodology utilizes both deterministic (e.g., single run) and probabilistic (e.g., multiple statistical realizations) analyses to achieve the goals. The probabilistic approach provides a means of quantitatively evaluating uncertainty in the model predictions and a more robust basis for performing sensitivity analyses to better understand what is driving the dose predictions from the models. Two types of disposal configurations are under consideration: a shallow land burial concept and a cavern disposal concept. The shallow land burial option includes a protective cover to limit infiltration potential to the waste. Both conceptual designs call for the disposal of 55 gallon waste drums within concrete lined trenches or tunnels, and backfilled with grout. Waste emplaced in the drums may be solidified. Both types of sites are underlain or placed within saturated fractured bedrock material. These factors have influenced the conceptual model development of each site, as well as the selection of the models to employ for the performance assessment analyses. Several existing codes were integrated in order to facilitate a comprehensive performance assessment methodology to evaluate the potential disposal sites. First, a need

  19. Effects of Self-Esteem and Perceived Goal Difficulty on Goal Setting, Certainty, Task Performance, and Attributions.

    Science.gov (United States)

    Tang, Thomas Li-Ping; Reynolds, David B.

    1993-01-01

    Fifty-two subjects competed on a task against themselves, a difficult competitor, and an easy competitor. Certainty, ability attribution, and task satisfaction for those with low self-esteem were affected by perceived goal difficulty but not for those with high self-esteem. Low self-esteem groups had lower goals, certainty, and task performance.…

  20. Expected Evaluation, Goals, and Performance: Mood as Input.

    Science.gov (United States)

    Sanna, Lawrence J.; And Others

    1996-01-01

    Research indicates effortful performances are reduced when participants cannot be evaluated. It was hypothesized that mood interacts with goals to attenuate such reduction in performance. As predicted, when participants tried to do as much as they could, those in negative moods put forth more effort and persisted longer than those in positive moods,…

  1. Entropy-based Probabilistic Fatigue Damage Prognosis and Algorithmic Performance Comparison

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, a maximum entropy-based general framework for probabilistic fatigue damage prognosis is investigated. The proposed methodology is based on an...

  2. Entropy-based probabilistic fatigue damage prognosis and algorithmic performance comparison

    Data.gov (United States)

    National Aeronautics and Space Administration — In this paper, a maximum entropy-based general framework for probabilistic fatigue damage prognosis is investigated. The proposed methodology is based on an...

  3. Probabilistic Analysis of a Composite Crew Module

    Science.gov (United States)

    Mason, Brian H.; Krishnamurthy, Thiagarajan

    2011-01-01

    An approach for conducting reliability-based analysis (RBA) of a Composite Crew Module (CCM) is presented. The goal is to identify and quantify the benefits of probabilistic design methods for the CCM and future space vehicles. The coarse finite element model from a previous NASA Engineering and Safety Center (NESC) project is used as the baseline deterministic analysis model to evaluate the performance of the CCM using a strength-based failure index. The first step in the probabilistic analysis process is the determination of the uncertainty distributions for key parameters in the model. Analytical data from water landing simulations are used to develop an uncertainty distribution, but such data were unavailable for other load cases. The uncertainty distributions for the other load scale factors and the strength allowables are generated based on assumed coefficients of variation. Probability of first-ply failure is estimated using three methods: the first order reliability method (FORM), Monte Carlo simulation, and conditional sampling. Results for the three methods were consistent. The reliability is shown to be driven by first ply failure in one region of the CCM at the high altitude abort load set. The final predicted probability of failure is on the order of 10⁻¹¹ due to the conservative nature of the factors of safety on the deterministic loads.
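
    A hedged sketch of the Monte Carlo option among the three methods above: sample load scale factors and strength allowables, evaluate a failure index, and count exceedances. The linear failure index and the distributions are illustrative; a real analysis would evaluate the finite element model for each sample, and probabilities near 10⁻¹¹ call for FORM or conditional sampling rather than crude Monte Carlo.

```python
# Hedged sketch: crude Monte Carlo estimate of first-ply failure probability.
# The distributions and the demand/capacity failure index are assumptions.
import numpy as np

rng = np.random.default_rng(3)
n = 200_000

load_scale = rng.normal(1.0, 0.10, size=n)     # assumed 10% CoV on load scale
strength   = rng.normal(1.4, 0.12, size=n)     # assumed strength allowable

# Failure index FI = demand / capacity; first-ply failure when FI >= 1.
fi = load_scale / strength
pf = np.mean(fi >= 1.0)
se = np.sqrt(pf * (1.0 - pf) / n)              # sampling standard error
print(f"P(first-ply failure) ~ {pf:.2e} +/- {se:.1e}")
```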

  4. A probabilistic bridge safety evaluation against floods.

    Science.gov (United States)

    Liao, Kuo-Wei; Muto, Yasunori; Chen, Wei-Lun; Wu, Bang-Ho

    2016-01-01

    To further capture the influences of uncertain factors on river bridge safety evaluation, a probabilistic approach is adopted. Because this is a systematic and nonlinear problem, MPP-based reliability analyses are not suitable. A sampling approach such as a Monte Carlo simulation (MCS) or importance sampling is often adopted. To enhance the efficiency of the sampling approach, this study utilizes Bayesian least squares support vector machines to construct a response surface followed by an MCS, providing a more precise safety index. Although there are several factors impacting the flood-resistant reliability of a bridge, previous experiences and studies show that the reliability of the bridge itself plays a key role. Thus, the goal of this study is to analyze the system reliability of a selected bridge that includes five limit states. The random variables considered here include the water surface elevation, water velocity, local scour depth, soil property and wind load. Because the first three variables are deeply affected by river hydraulics, a probabilistic HEC-RAS-based simulation is performed to capture the uncertainties in those random variables. The accuracy and variation of our solutions are confirmed by a direct MCS to ensure the applicability of the proposed approach. The results of a numerical example indicate that the proposed approach can efficiently provide an accurate bridge safety evaluation and maintain satisfactory variation.
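
    The surrogate-plus-sampling workflow can be sketched with a quadratic response surface standing in for the Bayesian least-squares support vector machine, followed by a Monte Carlo simulation on the cheap surrogate. The scour limit state and the input distributions below are toy assumptions, not the hydraulic model of the paper.

```python
# Hedged sketch: fit a response surface to a few "expensive" limit-state runs,
# then run Monte Carlo on the surrogate.  All numbers are illustrative.
import numpy as np

rng = np.random.default_rng(4)

def expensive_limit_state(velocity, scour_depth):
    """Stand-in for a hydraulic/structural model: safety margin g (fail if g < 0)."""
    return 5.0 - 0.8 * velocity - 0.9 * scour_depth + 0.05 * velocity * scour_depth

# Design of experiments: a small grid of "expensive" evaluations.
v_pts, s_pts = np.meshgrid(np.linspace(0.0, 5.0, 6), np.linspace(0.0, 4.0, 6))
X = np.column_stack([v_pts.ravel(), s_pts.ravel()])
y = expensive_limit_state(X[:, 0], X[:, 1])

def features(x):
    """Quadratic response-surface basis."""
    v, s = x[:, 0], x[:, 1]
    return np.column_stack([np.ones_like(v), v, s, v * s, v**2, s**2])

coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

# Monte Carlo on the surrogate with assumed input distributions.
n = 100_000
v_mc = rng.gumbel(2.0, 0.6, size=n)              # flood velocity (m/s), assumed
s_mc = rng.lognormal(np.log(1.5), 0.4, size=n)   # local scour depth (m), assumed
g = features(np.column_stack([v_mc, s_mc])) @ coef
print(f"P(failure) = P(g < 0) ~ {np.mean(g < 0.0):.4f}")
```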

  5. Applying probabilistic well-performance parameters to assessments of shale-gas resources

    Science.gov (United States)

    Charpentier, Ronald R.; Cook, Troy

    2010-01-01

    In assessing continuous oil and gas resources, such as shale gas, it is important to describe not only the ultimately producible volumes, but also the expected well performance. This description is critical to any cost analysis or production scheduling. A probabilistic approach facilitates (1) the inclusion of variability in well performance within a continuous accumulation, and (2) the use of data from developed accumulations as analogs for the assessment of undeveloped accumulations. In assessing continuous oil and gas resources of the United States, the U.S. Geological Survey analyzed production data from many shale-gas accumulations. Analyses of four of these accumulations (the Barnett, Woodford, Fayetteville, and Haynesville shales) are presented here as examples of the variability of well performance. For example, the distribution of initial monthly production rates for Barnett vertical wells shows a noticeable change with time, first increasing because of improved completion practices, then decreasing from a combination of decreased reservoir pressure (in infill wells) and drilling in less productive areas. Within a partially developed accumulation, historical production data from that accumulation can be used to estimate production characteristics of undrilled areas. An understanding of the probabilistic relations between variables, such as between initial production and decline rates, can improve estimates of ultimate production. Time trends or spatial trends in production data can be clarified by plots and maps. The data can also be divided into subsets depending on well-drilling or well-completion techniques, such as vertical in relation to horizontal wells. For hypothetical or lightly developed accumulations, one can either make comparisons to a specific well-developed accumulation or to the entire range of available developed accumulations. Comparison of the distributions of initial monthly production rates of the four shale-gas accumulations that were
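
    A hedged sketch of describing well performance probabilistically: initial monthly rates and hyperbolic-decline parameters are drawn from assumed distributions, each decline curve is summed over a 30-year producing life, and percentiles of the resulting estimated ultimate recovery (EUR) are reported. None of the numbers are USGS assessment inputs.

```python
# Hedged sketch: probabilistic well-performance (EUR) from sampled hyperbolic
# decline curves.  All distributions and the 30-year cutoff are illustrative.
import numpy as np

rng = np.random.default_rng(6)
n_wells = 10_000

qi = rng.lognormal(mean=np.log(30_000), sigma=0.6, size=n_wells)  # Mcf/month
di = rng.uniform(0.05, 0.25, size=n_wells)    # initial monthly decline rate
b = rng.uniform(0.8, 1.4, size=n_wells)       # hyperbolic exponent

months = np.arange(360)                       # 30-year producing life
# Hyperbolic decline: q(t) = qi / (1 + b * di * t) ** (1 / b)
q = qi[:, None] / (1.0 + b[:, None] * di[:, None] * months) ** (1.0 / b[:, None])
eur = q.sum(axis=1) / 1000.0                  # MMcf per well

for p in (10, 50, 90):
    print(f"P{p:02d} EUR: {np.percentile(eur, p):,.0f} MMcf")
```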

  6. Probabilistic sensory recoding.

    Science.gov (United States)

    Jazayeri, Mehrdad

    2008-08-01

    A hallmark of higher brain functions is the ability to contemplate the world rather than to respond reflexively to it. To do so, the nervous system makes use of a modular architecture in which sensory representations are dissociated from areas that control actions. This flexibility however necessitates a recoding scheme that would put sensory information to use in the control of behavior. Sensory recoding faces two important challenges. First, recoding must take into account the inherent variability of sensory responses. Second, it must be flexible enough to satisfy the requirements of different perceptual goals. Recent progress in theory, psychophysics, and neurophysiology indicate that cortical circuitry might meet these challenges by evaluating sensory signals probabilistically.

  7. Dealing with uncertainty arising out of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Solomon, K.A.; Kastenberg, W.E.; Nelson, P.F.

    1984-03-01

    In addressing the area of safety goal implementation, the question of uncertainty arises. This report suggests that the Nuclear Regulatory Commission (NRC) should examine how other regulatory organizations have addressed the issue. Several examples are given from the chemical industry, and comparisons are made to nuclear power risks. Recommendations are made as to various considerations that the NRC should require in probabilistic risk assessments in order to properly treat uncertainties in the implementation of the safety goal policy. 40 references, 7 figures, 5 tables

  8. Probabilistic risk assessment methodology

    International Nuclear Information System (INIS)

    Shinaishin, M.A.

    1988-06-01

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways for treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)

  9. Probabilistic risk assessment methodology

    Energy Technology Data Exchange (ETDEWEB)

    Shinaishin, M A

    1988-06-15

    The objective of this work is to provide the tools necessary for clear identification of: the purpose of a Probabilistic Risk Study, the bounds and depth of the study, the proper modeling techniques to be used, the failure modes contributing to the analysis, the classical and Bayesian approaches for manipulating data necessary for quantification, ways for treating uncertainties, and available computer codes that may be used in performing such probabilistic analysis. In addition, it provides the means for measuring the importance of a safety feature to maintaining a level of risk at a Nuclear Power Plant and the worth of optimizing a safety system in risk reduction. In applying these techniques so that they accommodate our national resources and needs it was felt that emphasis should be put on the system reliability analysis level of PRA. Objectives of such studies could include: comparing systems' designs of the various vendors in the bidding stage, and performing grid reliability and human performance analysis using national specific data. (author)

  10. Technical Insight of the High Level Safety Goal for the NPPs Built in China’s Thirteenth Five-Year Period (2016-2020)

    Energy Technology Data Exchange (ETDEWEB)

    Shi, G.; Zhan, W.; Mei, Q.; Sun, D., E-mail: shi@snerdi.com.cn [Shanghai Nuclear Engineering Research and Design Institute, SNPTC Shanghai (China)

    2014-10-15

    The “Nuclear Safety Planning”, published in October 2012 in China, stipulates the safety goals for the NPPs that will be built in the future. For the NPPs that will be built in China's Thirteenth Five-Year period (2016-2020) and later, the high level safety goal is described as “the possibility of the large radioactive release should be practically eliminated by design”. A thorough investigation has been performed at SNERDI to explore the technical insights of this high level safety goal by using the MEDP hierarchical safety goal approach. The definition of large release is proposed accordingly, and defence-in-depth (DID) requirements and probabilistic requirements are derived from this high level safety goal. (author)

  11. Establishment of safety goal and its quantification based on risk assessment

    International Nuclear Information System (INIS)

    Miyano, Hiroshi; Muramatsu, Ken

    2017-01-01

    We must clarify the safety objectives sought by society in securing the safety of nuclear reactors and nuclear power plants. For that purpose, it is useful to utilize risk assessment. Quantitative methods including probabilistic risk assessment (PRA) are superior in terms of scientific rationality and quantitative performance compared with conventional deterministic methods, and are able to indicate an objective numerical value for the safety level. Consequently, quantitative methods can enhance the transparency, consistency, compliance, predictability, and explanatory power of regulatory decisions toward business operators and citizens. Business operators can explain the validity of their own safety assurance activities to regulators and citizens. The goal to be secured becomes clear by incorporating the safety goal into the specific performance goals required for the nuclear power plant from the viewpoint of defence in depth, and it becomes easy to evaluate the effectiveness of safety measures. It helps us greatly in judging and selecting appropriate safety measures. It should be noted that satisfying the safety goal in a PRA is a necessary condition, not a sufficient one, in the sense of guaranteeing complete safety. The nuclear power field is an area with large uncertainty, and continuous research and effort to improve accuracy and the validity of evaluations will be required. (A.O.)

  12. Motor planning flexibly optimizes performance under uncertainty about task goals.

    Science.gov (United States)

    Wong, Aaron L; Haith, Adrian M

    2017-03-03

    In an environment full of potential goals, how does the brain determine which movement to execute? Existing theories posit that the motor system prepares for all potential goals by generating several motor plans in parallel. One major line of evidence for such theories is that presenting two competing goals often results in a movement intermediate between them. These intermediate movements are thought to reflect an unintentional averaging of the competing plans. However, normative theories suggest instead that intermediate movements might actually be deliberate, generated because they improve task performance over a random guessing strategy. To test this hypothesis, we vary the benefit of making an intermediate movement by changing movement speed. We find that participants generate intermediate movements only at (slower) speeds where they measurably improve performance. Our findings support the normative view that the motor system selects only a single, flexible motor plan, optimized for uncertain goals.

  13. A probabilistic analysis of PWR and BWR fuel rod performance using the code CASINO-SLEUTH

    International Nuclear Information System (INIS)

    Bull, A.J.

    1987-01-01

    This paper presents a brief description of the Monte Carlo and response surface techniques used in the code, and a probabilistic analysis of fuel rod performance in PWR and BWR applications. The analysis shows that fission gas release predictions are very sensitive to changes in certain of the code's inputs, identifies the most dominant input parameters and compares their effects in the two cases. (orig./HP)

  14. Probabilistic finite elements

    Science.gov (United States)

    Belytschko, Ted; Wing, Kam Liu

    1987-01-01

    In the Probabilistic Finite Element Method (PFEM), finite element methods have been efficiently combined with second-order perturbation techniques to provide an effective method for informing the designer of the range of response which is likely in a given problem. The designer must provide as input the statistical character of the input variables, such as yield strength, load magnitude, and Young's modulus, by specifying their mean values and their variances. The output then consists of the mean response and the variance in the response. Thus the designer is given a much broader picture of the predicted performance than with simply a single response curve. These methods are applicable to a wide class of problems, provided that the scale of randomness is not too large and the probability density functions possess decaying tails. By incorporating the computational techniques we have developed in the past 3 years for efficiency, the probabilistic finite element methods are capable of handling large systems with many sources of uncertainties. Sample results are given for an elastic-plastic ten-bar structure and an elastic-plastic plane continuum with a circular hole subject to cyclic loadings, with the yield stress modeled as a random field.
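
    The perturbation idea behind PFEM can be sketched on a single axial bar: the mean and variance of the random inputs (Young's modulus and load) are propagated through the response via a first-order Taylor expansion instead of sampling. The bar response formula and the coefficients of variation are illustrative assumptions, not the ten-bar example of the paper.

```python
# Hedged sketch: first-order perturbation propagation of input mean/variance
# through a simple structural response, in the spirit of PFEM.
import numpy as np

# Response of a single axial bar: tip displacement u = P * L / (A * E).
L, A = 2.0, 1e-3                       # geometry (m, m^2), deterministic

mu = np.array([200e9, 1e4])            # means of [E (Pa), P (N)]
var = np.array([(0.1 * 200e9) ** 2,    # assumed 10% CoV on E
                (0.2 * 1e4) ** 2])     # assumed 20% CoV on P

def response(x):
    E, P = x
    return P * L / (A * E)

# First-order perturbation: mean at the input means, variance from gradients.
eps = 1e-6
grad = np.array([(response(mu + eps * mu * np.eye(2)[i]) - response(mu))
                 / (eps * mu[i]) for i in range(2)])
mean_u = response(mu)
var_u = np.sum(grad**2 * var)          # inputs assumed independent

print(f"mean displacement {mean_u*1e3:.3f} mm, std {np.sqrt(var_u)*1e3:.3f} mm")
```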

  15. Non-unitary probabilistic quantum computing

    Science.gov (United States)

    Gingrich, Robert M.; Williams, Colin P.

    2004-01-01

    We present a method for designing quantum circuits that perform non-unitary quantum computations on n-qubit states probabilistically, and give analytic expressions for the success probability and fidelity.

  16. Probabilistic Seismic Performance Model for Tunnel Form Concrete Building Structures

    Directory of Open Access Journals (Sweden)

    S. Bahram Beheshti Aval

    2016-12-01

    Full Text Available Despite the widespread construction of mass-production houses with the tunnel form structural system across the world, unfortunately no special seismic code has been published for the design of this type of construction. A literature survey shows that only a few studies address the seismic behavior of this type of structural system. Thus, reasonable numerical results on the seismic performance of structures constructed with this technique, considering the factors that affect structural behavior, are highly valuable in a seismic code development process. In addition, due to the newness of this system, the damage observed in past earthquakes, and especially the random nature of future earthquakes, a probabilistic approach and the development of fragility curves within a next generation Performance Based Earthquake Engineering (PBEE) framework are important. In this study, the seismic behavior of 2, 5 and 10 story tunnel form structures with a regular plan is examined. First, the performance levels of these structures under the design earthquake (return period of 475 years) are assessed with time history analysis and the pushover method, and then, through incremental dynamic analysis, fragility curves are extracted for different levels of damage in walls and spandrels. The results indicated that the case study structures have high capacity and strength and show appropriate seismic performance. Moreover, all three structures remained within the immediate occupancy performance level.
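
    A minimal sketch of turning incremental dynamic analysis (IDA) results into a fragility curve: each ground-motion record yields the intensity at which a damage state is reached, and a lognormal model (median and logarithmic dispersion) is fitted to those intensities. The synthetic collapse intensities below are invented for illustration and are not results for the tunnel form buildings.

```python
# Hedged sketch: lognormal fragility curve fitted to synthetic IDA results.
import math
import numpy as np

rng = np.random.default_rng(5)

# Assume IDA gave one collapse intensity (Sa, in g) per ground-motion record.
collapse_sa = rng.lognormal(mean=np.log(1.2), sigma=0.35, size=30)

# Lognormal fit via moments of the log data (median capacity and dispersion).
theta = np.exp(np.mean(np.log(collapse_sa)))   # median capacity
beta = np.std(np.log(collapse_sa))             # logarithmic dispersion

def fragility(sa):
    """P(damage state reached | Sa = sa) under the fitted lognormal model."""
    return 0.5 * (1.0 + math.erf(math.log(sa / theta) / (beta * math.sqrt(2.0))))

for sa in (0.5, 1.0, 1.5, 2.0):
    print(f"Sa = {sa:.1f} g : P(collapse) = {fragility(sa):.2f}")
```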

  17. Effects of Goal Relations on Self-Regulated Learning in Multiple Goal Pursuits: Performance, the Self-Regulatory Process, and Task Enjoyment

    Science.gov (United States)

    Lee, Hyunjoo

    2012-01-01

    The purpose of this study was to investigate the effects of goal relations on self-regulation in the pursuit of multiple goals, focusing on self-regulated performance, the self-regulatory process, and task enjoyment. The effect of multiple goal relations on self-regulation was explored in a set of three studies. Goal relations were divided into…

  18. Probabilistic insurance

    OpenAIRE

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these preferences are intuitively appealing they are difficult to reconcile with expected utility theory. Under highly plausible assumptions about the utility function, willingness to pay for probabilistic i...

  19. Performance goals on simulators boost resident motivation and skills laboratory attendance.

    Science.gov (United States)

    Stefanidis, Dimitrios; Acker, Christina E; Greene, Frederick L

    2010-01-01

    To assess the impact of setting simulator training goals on resident motivation and skills laboratory attendance. Residents followed a proficiency-based laparoscopic curriculum on the 5 Fundamentals of Laparoscopic Surgery and 9 virtual reality tasks. Training goals consisted of the average expert performance on each task + 2 SD (mandatory) and best expert performance (optional). Residents rated the impact of the training goals on their motivation on a 20-point visual analog scale. Performance and attendance data were analyzed and correlated (Spearman's). Data are reported as medians (range). General Surgery residency program at a regional referral Academic Medical Center. General surgery residents (n = 15). During the first 5 months of the curriculum, weekly attendance rate was 51% (range, 8-96). After 153 (range, 21-412) repetitions, resident speed improved by 97% (range, 18-230), errors improved by 17% (range, 0-24), and motion efficiency by 59% (range, 26-114) compared with their baseline. Nine (60%) residents achieved proficiency in 7 (range, 3-14) and the best goals in 3.5 (range, 1-9) tasks; the other 6 residents had attendance rates motivation as 15 (range, 1-18) and setting a best goal as 13 (range, 1-18). Motivation ratings correlated positively with attendance rates, number of repetitions, performance improvement, and achievement of proficiency and best goals (r = 0.59-0.75; p motivation to participate in a simulator curriculum. While more stringent goals may potentiate this effect, they have a limited impact on senior residents. Further research is needed to investigate ways to improve skills laboratory attendance. Copyright 2010 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  20. Probabilistic Tsunami Hazard Analysis

    Science.gov (United States)

    Thio, H. K.; Ichinose, G. A.; Somerville, P. G.; Polet, J.

    2006-12-01

    The recent tsunami disaster caused by the 2004 Sumatra-Andaman earthquake has focused our attention on the hazard posed by large earthquakes that occur under water, in particular subduction zone earthquakes, and the tsunamis that they generate. Even though these kinds of events are rare, the very large loss of life and material destruction caused by this earthquake warrant a significant effort towards the mitigation of the tsunami hazard. For ground motion hazard, Probabilistic Seismic Hazard Analysis (PSHA) has become a standard practice in the evaluation and mitigation of seismic hazard to populations, in particular with respect to structures, infrastructure and lifelines. Its ability to condense the complexities and variability of seismic activity into a manageable set of parameters greatly facilitates the design of effective seismic resistant buildings but also the planning of infrastructure projects. Probabilistic Tsunami Hazard Analysis (PTHA) achieves the same goal for hazards posed by tsunami. There are great advantages of implementing such a method to evaluate the total risk (seismic and tsunami) to coastal communities. The method that we have developed is based on the traditional PSHA and therefore completely consistent with standard seismic practice. Because of the strong dependence of tsunami wave heights on bathymetry, we use full tsunami waveform computations in lieu of attenuation relations that are common in PSHA. By pre-computing and storing the tsunami waveforms at points along the coast generated for sets of subfaults that comprise larger earthquake faults, we can efficiently synthesize tsunami waveforms for any slip distribution on those faults by summing the individual subfault tsunami waveforms (weighted by their slip). This efficiency makes it feasible to use Green's function summation in lieu of attenuation relations to provide very accurate estimates of tsunami height for probabilistic calculations, where one typically computes

  1. Predicting subjective vitality and performance in sports: the role of passion and achievement goals.

    Science.gov (United States)

    Li, Chiung-Huang

    2010-06-01

    The major purpose of this study was to test the hypothesized paths from dualistic passions through achievement goals to subjective vitality and performance in sports. 645 high school athletes participated. The proposed structural equation model, with relationships between dualistic passions and subjective vitality and sports performance mediated by achievement goals, fit the data well, especially for mastery-approach and performance-approach goals. Harmonious and obsessive passions may lead athletes to high performance via the adoption of mastery-approach goals. However, these passions seem to have two paths influencing personal functioning: direct effects make players feel energetic, and indirect effects on subjective vitality through adoption of mastery-approach and performance-approach goals.

  2. The Role of Achievement Goals in the Development of Interest: Reciprocal Relations between Achievement Goals, Interest, and Performance

    Science.gov (United States)

    Harackiewicz, Judith M.; Durik, Amanda M.; Barron, Kenneth E.; Linnenbrink-Garcia, Lisa; Tauer, John M.

    2008-01-01

    The dynamics of individual and situational interest and academic performance were examined in the college classroom and 7 semesters later in conjunction with achievement goals. At the beginning of an introductory psychology course, participants reported their initial interest in psychology, achievement goals, and situational interest in course…

  3. Development of probabilistic risk analysis library

    International Nuclear Information System (INIS)

    Soga, Shota; Kirimoto, Yukihiro; Kanda, Kenichi

    2015-01-01

    We developed a library that is designed to perform level 1 Probabilistic Risk Analysis using Binary Decision Diagrams (BDD). In particular, our goal is to develop a library that will allow Japanese electric utilities to take advantage of BDD, which can solve Event Tree (ET) and Fault Tree (FT) models analytically. Using BDD, the library supports negation in FTs, which allows more flexible modeling of ET/FT. The library is written in C++ within an object-oriented framework using open source software. The library itself is a header-only library, so that Japanese electric utilities can take advantage of its transparency to speed up development and to build their own software for their specific needs. In this report, the basic capabilities of the library are briefly described. In addition, several applications of the library are demonstrated, including validation of the minimal cut set (MCS) evaluation of a PRA model and evaluation of corrective and preventive maintenance considering common cause failure. (author)
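
    The analytic ET/FT evaluation that BDDs enable can be illustrated on a toy example: the top-event probability follows exactly from Shannon decomposition over the basic events, which is the expansion a BDD stores compactly and which also handles negation naturally. The sketch below is a plain-Python illustration with an invented gate structure and probabilities, not the C++ library described in the record.

        # Toy fault tree (structure and probabilities invented): TOP = (A OR B) AND (B OR C).
        EVENTS = ["A", "B", "C"]
        PROB = {"A": 1.0e-3, "B": 5.0e-4, "C": 2.0e-3}

        def top(state):
            """Structure function: maps basic-event states (True = failed) to the top event."""
            return (state["A"] or state["B"]) and (state["B"] or state["C"])

        def top_probability(i=0, state=None):
            """Exact top-event probability by Shannon decomposition over the basic events --
            the same expansion a BDD encodes compactly."""
            state = {} if state is None else state
            if i == len(EVENTS):
                return 1.0 if top(state) else 0.0
            name, p = EVENTS[i], PROB[EVENTS[i]]
            return (p * top_probability(i + 1, {**state, name: True})
                    + (1.0 - p) * top_probability(i + 1, {**state, name: False}))

        print(f"P(top event) = {top_probability():.6e}")   # ~5.02e-04 for these numbers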

  4. Angra-1 probabilistic safety study-phase B

    International Nuclear Information System (INIS)

    Fernandes Filho, T.L.; Gibelli, S.M.O.

    1988-05-01

    This study represents Phase B of the Angra-1 Probabilistic Safety Study and is the final report prepared for the IAEA under Research Contract No. 3423/R2/RB. The three main items covered in this report are the establishment of interim safety goals, analysis of Angra-1 operational experience and development of emergency procedures to address severe accidents. For the establishment of interim safety goals, a methodology for calculating consequences and risks associated with Angra-1 operation was developed based on the available data and codes. The proposed safety goals refer to the individual risk of early fatality for people living in the vicinity of the plant, the collective risk of cancer fatalities for people living near the plant, the probability of core melt occurrence and the probability of dominant accident sequences. (author) [pt]

  5. Comparing Three Models of Achievement Goals: Goal Orientations, Goal Standards, and Goal Complexes

    Science.gov (United States)

    Senko, Corwin; Tropiano, Katie L.

    2016-01-01

    Achievement goal theory (Dweck, 1986) initially characterized mastery goals and performance goals as opposites in a good-bad dualism of student motivation. A later revision (Harackiewicz, Barron, & Elliot, 1998) contended that both goals can provide benefits and be pursued together. Perhaps both frameworks are correct: Their contrasting views…

  6. Predicting BCI subject performance using probabilistic spatio-temporal filters.

    Directory of Open Access Journals (Sweden)

    Heung-Il Suk

    Recently, spatio-temporal filtering to enhance decoding for Brain-Computer Interfacing (BCI) has become increasingly popular. In this work, we discuss a novel, fully Bayesian, and thereby probabilistic, framework called Bayesian Spatio-Spectral Filter Optimization (BSSFO) and apply it to a large data set of 80 non-invasive EEG-based BCI experiments. Across the full frequency range, the BSSFO framework allows analysis of which spatio-spectral parameters are common and which ones differ across the subject population. As expected, large variability of brain rhythms is observed between subjects. We have clustered subjects according to similarities in their corresponding spectral characteristics from the BSSFO model, which is found to reflect their BCI performances well. In BCI, a considerable percentage of subjects is unable to use a BCI for communication, due to their missing ability to modulate their brain rhythms, a phenomenon sometimes denoted as BCI illiteracy or inability. Predicting individual subjects' performance before the actual, time-consuming BCI experiment enhances the usage of BCIs, e.g., by detecting users with BCI inability. This work additionally contributes by using the novel BSSFO method to predict BCI performance using only 2 minutes and 3 channels of resting-state EEG data recorded before the actual BCI experiment. Specifically, by grouping the individual frequency characteristics we classified subjects into 'prototypes' (like μ- or β-rhythm type subjects, or users without the ability to communicate with a BCI), and then, by further building a linear regression model based on this grouping, we could predict subjects' performance with a maximum correlation coefficient of 0.581 with the performance later seen in the actual BCI session.

  7. Using performance indicators to reduce cost uncertainty of China's CO2 mitigation goals

    International Nuclear Information System (INIS)

    Xu, Yuan

    2013-01-01

    Goals on absolute emissions and intensity play key roles in CO2 mitigation. However, like cap-and-trade policies with price uncertainty, they suffer from significant uncertainty in abatement costs. This article examines whether an indicator could be established to complement CO2 mitigation goals and help reduce cost uncertainty, with a particular focus on China. Performance indicators on CO2 emissions per unit of energy consumption could satisfy three criteria: compared with the mitigation goals, (i) they are more closely associated with active mitigation efforts, (ii) their baselines have more stable projections from historical trajectories, and (iii) their abatement costs are generally higher than those of other mitigation methods, particularly energy efficiency and conservation. Performance indicators could be used in the following way: if a CO2 goal on absolute emissions or intensity is attained, the performance indicator should still reach a lower threshold as a cost floor. If the goal cannot be attained, an upper performance threshold should be achieved as a cost ceiling. The narrower cost uncertainty may encourage wider and greater mitigation efforts. Highlights: CO2 emissions per unit of energy consumption could act as performance indicators; performance indicators are more closely related to active mitigation activities; performance indicators have more stable historical trajectories; abatement costs are higher for performance indicators than for other activities; performance thresholds could reduce the cost uncertainty of CO2 mitigation goals.

  8. Performance improvement on a MIMO radio-over-fiber system by probabilistic shaping

    Science.gov (United States)

    Kong, Miao; Yu, Jianjun

    2018-01-01

    Probabilistic shaping (PS), a typical modulation format optimization technology, has become a promising technique and attracted growing attention because of its higher transmission capacity and low computational complexity. In this paper, we experimentally demonstrate reliable 8 Gbaud delivery of a polarization-multiplexed PS 16-QAM single-carrier signal in a MIMO radio-over-fiber system with a 20-km SMF-28 fiber link and a 2.5-m wireless link at 60 GHz. The BER performance of PS 16-QAM signals at different baud rates is also evaluated. Moreover, PS 16-QAM is experimentally compared with uniform 16-QAM, and it is concluded that PS 16-QAM offers a better compromise between spectral efficiency and reliability, and a higher capacity, than uniform 16-QAM for the radio-over-fiber system.
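
    Probabilistic shaping of a QAM constellation is usually realised by drawing low-energy constellation points more often than high-energy ones, commonly with a Maxwell-Boltzmann distribution. The sketch below illustrates that idea for 16-QAM with an arbitrary shaping parameter; it is a generic illustration, not the experimental setup of the cited work.

        import numpy as np

        # 16-QAM constellation with amplitude levels {-3, -1, 1, 3} on each axis (unnormalised).
        levels = np.array([-3.0, -1.0, 1.0, 3.0])
        points = np.array([complex(i, q) for i in levels for q in levels])
        energy = np.abs(points) ** 2

        # Maxwell-Boltzmann shaping: P(x) proportional to exp(-nu * |x|^2), so low-energy
        # points are sent more often; nu is an arbitrary shaping parameter for the example.
        nu = 0.08
        p = np.exp(-nu * energy)
        p /= p.sum()

        entropy = -np.sum(p * np.log2(p))     # information rate per symbol (< 4 bit)
        mean_energy = np.sum(p * energy)      # lower than the uniform-16-QAM average of 10
        print(f"entropy = {entropy:.3f} bit/symbol, mean symbol energy = {mean_energy:.2f}")

        # Draw a shaped symbol stream, e.g. for a transmission simulation.
        rng = np.random.default_rng(0)
        symbols = rng.choice(points, size=10_000, p=p)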

  9. Probabilistic assessment of faults

    International Nuclear Information System (INIS)

    Foden, R.W.

    1987-01-01

    Probabilistic safety analysis (PSA) is the process by which the probability (or frequency of occurrence) of reactor fault conditions which could lead to unacceptable consequences is assessed. The basic objective of a PSA is to allow a judgement to be made as to whether or not the principal probabilistic requirement is satisfied. It also gives insights into the reliability of the plant which can be used to identify possible improvements. This is explained in the article. The scope of a PSA and the PSA performed by the National Nuclear Corporation (NNC) for the Heysham II and Torness AGRs and Sizewell-B PWR are discussed. The NNC methods for hazards, common cause failure and operator error are mentioned. (UK)

  10. Unintended anchors: Building rating systems and energy performance goals for U.S. buildings

    International Nuclear Information System (INIS)

    Klotz, Leidy; Mack, Daniel; Klapthor, Brent; Tunstall, Casey; Harrison, Jennilee

    2010-01-01

    In the U.S., where buildings account for 40% of energy use, commercial buildings use more energy per unit area than ever before. However, exemplary buildings demonstrate the feasibility of much better energy performance at no additional first cost. This research examines one possible explanation for this inconsistency. The aim is to investigate whether the anchoring bias, which refers to our tendency to gravitate towards a pre-defined standard regardless of its relevance, influences energy performance goals in building design. The scope examines professionals who help set energy performance goals for U.S. buildings. Prior to being asked to set an energy performance goal, these professionals were randomly directed to one of three series of questions. One series set an anchor of 90% energy reduction beyond standard practice, one set a 30% anchor, and one set no anchor. Respondents exposed to the 90% anchor, and respondents exposed to no anchor at all, set higher energy performance goals than respondents exposed to the 30% anchor. These results suggest that building rating systems that only reward incremental energy improvements may inadvertently create anchors, thereby discouraging more advanced energy performance goals and inhibiting energy performance that is technically and economically feasible.

  11. Prediction of intrinsic motivation and sports performance using 2 x 2 achievement goal framework.

    Science.gov (United States)

    Li, Chiung-Huang; Chi, Likang; Yeh, Suh-Ruu; Guo, Kwei-Bin; Ou, Cheng-Tsung; Kao, Chun-Chieh

    2011-04-01

    The purpose of this study was to examine the influence of 2 x 2 achievement goals on intrinsic motivation and performance in handball. Participants were 164 high school athletes. All completed the 2 x 2 Achievement Goals Questionnaire for Sport and the Intrinsic Motivation subscale of the Sport Motivation Scale; the coach for each team rated his athletes' overall sports performance. Using simultaneous-regression analyses, mastery-approach goals positively predicted both intrinsic motivation and performance in sports, whereas performance-avoidance goals negatively predicted sports performance. These results suggest that athletes who pursue task mastery and improvement of their competence perform well and enjoy their participation. In contrast, those who focus on avoiding normative incompetence perform poorly.

  12. Doing better (or worse) than one's parents: Social status, mobility, and performance-avoidance goals.

    Science.gov (United States)

    Jury, Mickaël; Bruno, Alisée; Darnon, Céline

    2018-01-11

    Previous research has shown that, when succeeding in higher education, first-generation (FG) students endorse more performance-avoidance goals (i.e., the fear of performing poorly) than continuing-generation (CG) students. In this study, individual mobility is examined as a predictor of performance-avoidance goal endorsement. It is argued that FG students endorse these goals more than CG students because in higher education, the former (but not the latter) experience upward mobility. In addition, CG students can also be at risk of endorsing these goals when they are confronted with downward mobility. Two studies were conducted with psychology students (N = 143 in Study 1; N = 176 in Study 2). In Study 1, FG and CG students' perceived upward mobility was measured. In Study 2, FG and CG students were provided with feedback that suggested either upward or downward mobility. In both studies, participants reported their level of performance-avoidance goal endorsement. Results from Study 1 supported an indirect effect of status on performance-avoidance goals via a higher perception of upward mobility. Results from Study 2 supported that psychology students who face mobility (i.e., FG students who received better feedback than their usual level of performance, CG students who received worse feedback than their usual level of performance) increased their performance-avoidance goals the most. Taken together, the results of these studies support the idea that one's actual social position and, even more, the social position one is about to reach are reliable predictors of performance-avoidance goals. © 2018 The British Psychological Society.

  13. Probabilistic Structural Analysis of SSME Turbopump Blades: Probabilistic Geometry Effects

    Science.gov (United States)

    Nagpal, V. K.

    1985-01-01

    A probabilistic study was initiated to evaluate the effects of geometric and material property tolerances on the structural response of turbopump blades. To complete this study, a number of important probabilistic variables were identified which are expected to affect the structural response of the blade. In addition, a methodology was developed to statistically quantify the influence of these probabilistic variables in an optimized way. The identified variables include random geometric and material properties perturbations, different loadings and a probabilistic combination of these loadings. Influences of these probabilistic variables are planned to be quantified by evaluating the blade structural response. Studies of the geometric perturbations were conducted for a flat plate geometry as well as for a space shuttle main engine blade geometry using a special-purpose code based on the finite element approach. Analyses indicate that the variances of the perturbations about given mean values have significant influence on the response.

  14. PRA has many faces - can the safety goal be well-posed

    International Nuclear Information System (INIS)

    Bargmann, H.

    1983-01-01

    The question is discussed whether probabilistic reliability problems can, in principle, be well-posed in practical situations. The problem is reduced to the question whether an underlying probabilistic experiment, which is, essentially, the set of outcomes, can be precisely specified such that the solution of the problem is unique. Upon reexamination of a classical paradox due to Bertrand and consideration of a typical problem of structural reliability, we conclude that the possibility of well-posing a reliability problem should be considered illusory, for fundamental reasons which are inherent in practical situations. In particular, it should not be assumed that a quantitative safety goal could be verified. Generally, a probabilistic assessment should be considered as a quantitative method for establishing rational results which should, however, not be viewed as quantitative measures but as qualitative guides.
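
    The Bertrand paradox invoked in the abstract is easy to reproduce numerically: the probability that a 'random chord' of a circle is longer than the side of the inscribed equilateral triangle depends on the sampling convention chosen. The Monte Carlo sketch below shows two of the classical conventions; it is a didactic aside, not part of the cited paper.

        import numpy as np

        rng = np.random.default_rng(1)
        N = 200_000
        side = np.sqrt(3.0)   # side of the equilateral triangle inscribed in the unit circle

        # Convention 1: chord through two independent uniform points on the circle.
        theta = rng.uniform(0.0, 2.0 * np.pi, size=(N, 2))
        chord_endpoints = 2.0 * np.abs(np.sin((theta[:, 0] - theta[:, 1]) / 2.0))

        # Convention 2: chord whose midpoint lies at a uniform random distance from the centre.
        d = rng.uniform(0.0, 1.0, size=N)
        chord_radius = 2.0 * np.sqrt(1.0 - d ** 2)

        print("P(chord > side), random endpoints:", np.mean(chord_endpoints > side))  # ~1/3
        print("P(chord > side), random radius   :", np.mean(chord_radius > side))     # ~1/2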

  15. Uncertainty and sensitivity analysis using probabilistic system assessment code. 1

    International Nuclear Information System (INIS)

    Honma, Toshimitsu; Sasahara, Takashi.

    1993-10-01

    This report presents the results obtained when applying the probabilistic system assessment code under development to the PSACOIN Level 0 intercomparison exercise organized by the Probabilistic System Assessment Code User Group in the Nuclear Energy Agency (NEA) of OECD. This exercise is one of a series designed to compare and verify probabilistic codes in the performance assessment of geological radioactive waste disposal facilities. The computations were performed using the Monte Carlo sampling code PREP and post-processor code USAMO. The submodels in the waste disposal system were described and coded with the specification of the exercise. Besides the results required for the exercise, further additional uncertainty and sensitivity analyses were performed and the details of these are also included. (author)

  16. MOTIVATION: Goals and Goal Setting

    Science.gov (United States)

    Stratton, Richard K.

    2005-01-01

    Goal setting has great impact on a team's performance. Goals enable a team to synchronize their efforts to achieve success. In this article, the author talks about goals and goal setting. This articles complements Domain 5--Teaching and Communication (p.14) and discusses one of the benchmarks listed therein: "Teach the goal setting process and…

  17. Probabilistic assessment of dry transport with burnup credit

    International Nuclear Information System (INIS)

    Lake, W.H.

    2003-01-01

    The general concept of probabilistic analysis and its application to the use of burnup credit in spent fuel transport is explored. Discussion of the probabilistic analysis method is presented. The concepts of risk and its perception are introduced, and models are suggested for performing probability and risk estimates. The general probabilistic models are used for evaluating the application of burnup credit for dry spent nuclear fuel transport. Two basic cases are considered. The first addresses the question of the relative likelihood of exceeding an established criticality safety limit with and without burnup credit. The second examines the effect of using burnup credit on the overall risk for dry spent fuel transport. Using reasoned arguments and related failure-probability and consequence data, an analysis is performed to estimate the risks of using burnup credit for dry transport of spent nuclear fuel. (author)

  18. On Probabilistic Alpha-Fuzzy Fixed Points and Related Convergence Results in Probabilistic Metric and Menger Spaces under Some Pompeiu-Hausdorff-Like Probabilistic Contractive Conditions

    OpenAIRE

    De la Sen, M.

    2015-01-01

    In the framework of complete probabilistic metric spaces and, in particular, in probabilistic Menger spaces, this paper investigates some relevant properties of convergence of sequences to probabilistic α-fuzzy fixed points under some types of probabilistic contractive conditions.

  19. [Job performance in work organizations: the effects of management by group goals and job interdependence].

    Science.gov (United States)

    Ikeda, Hiroshi; Furukawa, Hisataka

    2015-04-01

    This study examined the interactive effect of management by group goals and job interdependence on employees' activities in terms of task and contextual performance. A survey was conducted among 140 Japanese employees. Results indicated that management by group goals was related only to contextual performance. Job interdependence, however, had a direct effect on both task and contextual performance. Moreover, moderated regression analyses revealed that for work groups requiring higher interdependence among employees, management by group goals had a positive relation to contextual performance but not to task performance. When interdependence was not necessarily required, however, management by group goals had no relation to contextual performance and even negatively impacted task performance. These results show that management by group goals affects task and contextual performance, and that this effect is moderated by job interdependence. This provides a theoretical extension as well as a practical application to the setting and management of group goals.

  20. Probabilistic Insurance

    NARCIS (Netherlands)

    Wakker, P.P.; Thaler, R.H.; Tversky, A.

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in premium to compensate for a 1% default risk. These observations cannot be

  1. Probabilistic Insurance

    NARCIS (Netherlands)

    P.P. Wakker (Peter); R.H. Thaler (Richard); A. Tversky (Amos)

    1997-01-01

    Probabilistic insurance is an insurance policy involving a small probability that the consumer will not be reimbursed. Survey data suggest that people dislike probabilistic insurance and demand more than a 20% reduction in the premium to compensate for a 1% default risk. While these

  2. Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners (Second Edition)

    Science.gov (United States)

    Stamatelatos, Michael; Dezfuli, Homayoon; Apostolakis, George; Everline, Chester; Guarro, Sergio; Mathias, Donovan; Mosleh, Ali; Paulos, Todd; Riha, David; Smith, Curtis

    2011-01-01

    Probabilistic Risk Assessment (PRA) is a comprehensive, structured, and logical analysis method aimed at identifying and assessing risks in complex technological systems for the purpose of cost-effectively improving their safety and performance. NASA's objective is to better understand and effectively manage risk, and thus more effectively ensure mission and programmatic success, and to achieve and maintain high safety standards at NASA. NASA intends to use risk assessment in its programs and projects to support optimal management decision making for the improvement of safety and program performance. In addition to using quantitative/probabilistic risk assessment to improve safety and enhance the safety decision process, NASA has incorporated quantitative risk assessment into its system safety assessment process, which until now has relied primarily on a qualitative representation of risk. Also, NASA has recently adopted the Risk-Informed Decision Making (RIDM) process [1-1] as a valuable addition to supplement existing deterministic and experience-based engineering methods and tools. Over the years, NASA has been a leader in most of the technologies it has employed in its programs. One would think that PRA should be no exception. In fact, it would be natural for NASA to be a leader in PRA because, as a technology pioneer, NASA uses risk assessment and management implicitly or explicitly on a daily basis. NASA has probabilistic safety requirements (thresholds and goals) for crew transportation system missions to the International Space Station (ISS) [1-2]. NASA intends to have probabilistic requirements for any new human spaceflight transportation system acquisition. Methods to perform risk and reliability assessment in the early 1960s originated in U.S. aerospace and missile programs. Fault tree analysis (FTA) is an example. It would have been a reasonable extrapolation to expect that NASA would also become the world leader in the application of PRA. That was

  3. The Role of Goal Importance in Predicting University Students' High Academic Performance

    Science.gov (United States)

    Kyle, Vanessa A.; White, Katherine M.; Hyde, Melissa K.; Occhipinti, Stefano

    2014-01-01

    We examined goal importance, focusing on high, but not exclusive priority goals, in the theory of planned behaviour (TPB) to predict students' academic performance. At the beginning of semester, students in a psychology subject (N = 197) completed TPB and goal importance items for achieving a high grade. Regression analyses revealed partial…

  4. Making things happen through challenging goals: leader proactivity, trust, and business-unit performance.

    Science.gov (United States)

    Crossley, Craig D; Cooper, Cecily D; Wernsing, Tara S

    2013-05-01

    Building on decades of research on the proactivity of individual performers, this study integrates research on goal setting and trust in leadership to examine manager proactivity and business unit sales performance in one of the largest sales organizations in the United States. Results of a moderated-mediation model suggest that proactive senior managers establish more challenging goals for their business units (N = 50), which in turn are associated with higher sales performance. We further found that employees' trust in the manager is a critical contingency variable that facilitates the relationship between challenging sales goals and subsequent sales performance. This research contributes to growing literatures on trust in leadership and proactivity by studying their joint effects at a district-unit level of analysis while identifying district managers' tendency to set challenging goals as a process variable that helps translate their proactivity into the collective performance of their units. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  5. Importance of properly treating human performance in probabilistic risk assessments

    International Nuclear Information System (INIS)

    Kukielka, C.A.; Butler, F.G.; Chaiko, M.A.

    1997-01-01

    A critical issue to consider when developing Advanced Reactor Systems (ARS) is the operators' ability to reliably execute Emergency Operating Procedures (EOPs) during accidents. A combined probabilistic and deterministic method for evaluating operator performance is outlined in this paper. Three questions are addressed: (1) does the operator understand the status of the plant? (2) does the operator know what to do? and (3) what are the odds of successful EOP execution? Deterministic methods are used to evaluate questions 1 and 2, and question 3 is addressed by statistical analysis. Simulator exercises are used to develop probability of response as a function of time curves for time limited operator actions. This method has been used to identify and resolve deficiencies in the plant operating procedures and the operator interface. An application is provided to the Anticipated Transient without Scram accident sequences. The results of Human Reliability Analysis are compared with the results of similar BWR analyses. 2 figs., 2 tabs

  6. Probabilistic Criterion for the Economical Assessment of Nuclear Reactors

    International Nuclear Information System (INIS)

    Juanico, L; Florido, Pablo; Bergallo, Juan

    2000-01-01

    In this paper a Monte Carlo probabilistic model for the economic evaluation of nuclear power plants is presented. The probabilistic results show a wide spread in the economic performance due to schedule complexity and the coupling of tasks. This spread increases with the discount rate and hence becomes more important for developing countries.
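
    A minimal illustration of the kind of Monte Carlo cost spread the abstract refers to: task durations are sampled, the coupling of sequential tasks lengthens the schedule, and compounding at the discount rate turns schedule spread into cost spread. All tasks, distributions and the discount rate below are invented for the sketch.

        import numpy as np

        rng = np.random.default_rng(42)
        N = 50_000
        r = 0.10   # assumed annual discount rate

        # Invented project: three sequential (coupled) construction tasks, durations in years
        # sampled from triangular distributions, with fixed annual spending rates in M$/yr.
        durations = rng.triangular([1.5, 2.0, 1.0], [2.0, 3.0, 1.5], [4.0, 6.0, 3.0], size=(N, 3))
        annual_cost = np.array([200.0, 350.0, 150.0])

        schedule = durations.sum(axis=1)
        starts = np.cumsum(np.hstack([np.zeros((N, 1)), durations[:, :-1]]), axis=1)
        midpoints = starts + durations / 2.0

        # Cost at completion: each task's spending compounded at rate r from its midpoint
        # to the end of construction, so longer schedules and higher rates widen the spread.
        cost = np.sum(annual_cost * durations * np.exp(r * (schedule[:, None] - midpoints)), axis=1)

        print(f"schedule: median {np.median(schedule):.1f} y, 95th pct {np.percentile(schedule, 95):.1f} y")
        print(f"cost    : median {np.median(cost):.0f} M$,  95th pct {np.percentile(cost, 95):.0f} M$")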

  7. Initial Probabilistic Evaluation of Reactor Pressure Vessel Fracture with Grizzly and Raven

    Energy Technology Data Exchange (ETDEWEB)

    Spencer, Benjamin [Idaho National Lab. (INL), Idaho Falls, ID (United States); Hoffman, William [Univ. of Idaho, Moscow, ID (United States); Sen, Sonat [Idaho National Lab. (INL), Idaho Falls, ID (United States); Rabiti, Cristian [Idaho National Lab. (INL), Idaho Falls, ID (United States); Dickson, Terry [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States); Bass, Richard [Oak Ridge National Lab. (ORNL), Oak Ridge, TN (United States)

    2015-10-01

    The Grizzly code is being developed with the goal of creating a general tool that can be applied to study a variety of degradation mechanisms in nuclear power plant components. The first application of Grizzly has been to study fracture in embrittled reactor pressure vessels (RPVs). Grizzly can be used to model the thermal/mechanical response of an RPV under transient conditions that would be observed in a pressurized thermal shock (PTS) scenario. The global response of the vessel provides boundary conditions for local models of the material in the vicinity of a flaw. Fracture domain integrals are computed to obtain stress intensity factors, which can in turn be used to assess whether a fracture would initiate at a pre-existing flaw. These capabilities have been demonstrated previously. A typical RPV is likely to contain a large population of pre-existing flaws introduced during the manufacturing process. This flaw population is characterized statistically through probability density functions of the flaw distributions. The use of probabilistic techniques is necessary to assess the likelihood of crack initiation during a transient event. This report documents initial work to perform probabilistic analysis of RPV fracture during a PTS event using a combination of the RAVEN risk analysis code and Grizzly. This work is limited in scope, considering only a single flaw with deterministic geometry, but with uncertainty introduced in the parameters that influence fracture toughness. These results are benchmarked against equivalent models run in the FAVOR code. When fully developed, the RAVEN/Grizzly methodology for modeling probabilistic fracture in RPVs will provide a general capability that can be used to consider a wider variety of vessel and flaw conditions that are difficult to consider with current tools. In addition, this will provide access to advanced probabilistic techniques provided by RAVEN, including adaptive sampling and parallelism, which can dramatically

  8. Valid Probabilistic Predictions for Ginseng with Venn Machines Using Electronic Nose

    Directory of Open Access Journals (Sweden)

    You Wang

    2016-07-01

    In the application of electronic noses (E-noses), probabilistic prediction is a good way to estimate how confident we are about our prediction. In this work, a homemade E-nose system embedded with 16 metal-oxide semiconductor gas sensors was used to discriminate nine kinds of ginseng of different species or production places. A flexible machine learning framework, the Venn machine (VM), was introduced to make probabilistic predictions. Three Venn predictors were developed based on three classical probabilistic prediction methods (Platt's method, Softmax regression and Naive Bayes). The three Venn predictors and the three classical probabilistic prediction methods were compared in terms of classification rate and, especially, the validity of the estimated probabilities. The best classification rate of 88.57% was achieved with Platt's method in offline mode, and the classification rate of VM-SVM (Venn machine based on a Support Vector Machine) was 86.35%, just 2.22% lower. The validity of the Venn predictors was better than that of the corresponding classical probabilistic prediction methods, and the validity of VM-SVM was superior to the other methods. The results demonstrate that the Venn machine is a flexible tool for making precise and valid probabilistic predictions in E-nose applications, and VM-SVM achieved the best performance for the probabilistic prediction of ginseng samples.

  9. Does extrinsic goal framing enhance extrinsic goal-oriented individuals' learning and performance? An experimental test of the match perspective versus self-determination theory

    OpenAIRE

    Vansteenkiste, Maarten; Timmermans, Tinneke; Lens, Willy; Soenens, Bart; Van den Broeck, Anja

    2008-01-01

    Previous work within self-determination theory has shown that experimentally framing a learning activity in terms of extrinsic rather than intrinsic goals results in poorer conceptual learning and performance, presumably because extrinsic goal framing detracts attention from the learning activity and is less directly satisfying of basic psychological needs. According to the match perspective, experimental extrinsic, compared to intrinsic, goal framing should enhance learning and performance f...

  10. Integration of Probabilistic Exposure Assessment and Probabilistic Hazard Characterization

    NARCIS (Netherlands)

    Voet, van der H.; Slob, W.

    2007-01-01

    A method is proposed for integrated probabilistic risk assessment where exposure assessment and hazard characterization are both included in a probabilistic way. The aim is to specify the probability that a random individual from a defined (sub)population will have an exposure high enough to cause a

  11. Probabilistic Networks

    DEFF Research Database (Denmark)

    Jensen, Finn Verner; Lauritzen, Steffen Lilholt

    2001-01-01

    This article describes the basic ideas and algorithms behind specification and inference in probabilistic networks based on directed acyclic graphs, undirected graphs, and chain graphs.

  12. Probabilistic aspects of risk analyses for hazardous facilities

    International Nuclear Information System (INIS)

    Morici, A.; Valeri, A.; Zaffiro, C.

    1989-01-01

    The work described in the paper discusses the aspects of risk analysis concerned with the use of the probabilistic methodology, in order to see how this approach may affect the risk management of industrial hazardous facilities. To this purpose, reference is made to the Probabilistic Risk Assessment (PRA) of nuclear power plants. The paper points out that even though the public aversion towards nuclear risks is still far from being removed, the probabilistic approach may provide sound support to the decision-making and authorization process for any industrial activity implying risk for the environment and public health. It is the opinion of the authors that the probabilistic techniques have been developed to a great level of sophistication in the nuclear industry and have provided much more experience in this field than in others. For some particular areas of nuclear applications, such as plant reliability and plant response to accidents, these techniques have reached a sufficient level of maturity and so some results have been usefully taken as a measure of the safety level of the plant itself. The use of some limited safety goals is regarded as a relevant item of the nuclear licensing process. The paper claims that it is now time for these methods to be applied with equal success to other hazardous facilities, and makes some comparative considerations of the differences between these facilities and nuclear power plants in order to understand the effect of these differences on the PRA results and on the use one intends to make of them. (author)

  13. Dynamic Effects of Performance-Avoidance Goal Orientation on Student Achievement in Language and Mathematics.

    Science.gov (United States)

    Stamovlasis, Dimitrios; Gonida, Sofia-Eleftheria N

    2018-07-01

    The present study used achievement goal theory (AGT) as a theoretical framework and examined the role of mastery and performance goals, both performance-approach and performance-avoidance, on school achievement within the nonlinear dynamical systems (NDS) perspective. A series of cusp catastrophe models were applied to students' achievement in a number of school subjects, such as mathematics and language for elementary school and algebra, geometry, ancient and modern Greek language for high school, using achievement goal orientations as control variables. The participants (N=224) were students attending fifth and eighth grade (aged 11 and 14, respectively) in public schools located in northern Greece. Cusp analysis based on the probability density function was carried out by two procedures, the maximum likelihood and the least squares. The results showed that performance-approach goals had no linear effect on achievement, while the cusp models implementing mastery goals as the asymmetry factor and performance-avoidance as the bifurcation proved superior to their linear alternatives. The results of the study based on NDS support the multiple goal perspective within AGT. Theoretical issues, educational implications and future directions are discussed.

  14. Probabilistic data integration and computational complexity

    Science.gov (United States)

    Hansen, T. M.; Cordua, K. S.; Mosegaard, K.

    2016-12-01

    Inverse problems in Earth Sciences typically refer to the problem of inferring information about properties of the Earth from observations of geophysical data (the result of nature's solution to the 'forward' problem). This problem can be formulated more generally as a problem of 'integration of information'. A probabilistic formulation of data integration is in principle simple: if all information available (from e.g. geology, geophysics, remote sensing, chemistry) can be quantified probabilistically, then different algorithms exist that allow solving the data integration problem either through an analytical description of the combined probability function, or by sampling the probability function. In practice, however, probabilistic data integration may not be easy to apply successfully. This may be related to the use of sampling methods, which are known to be computationally costly. But another source of computational complexity is related to how the individual types of information are quantified. In one case a data integration problem is demonstrated where the goal is to determine the existence of buried channels in Denmark, based on multiple sources of geo-information. Due to one type of information being too informative (and hence conflicting), this leads to a difficult sampling problem with unrealistic uncertainty. Resolving this conflict prior to data integration leads to an easy data integration problem, with no biases. In another case it is demonstrated how imperfections in the description of the geophysical forward model (related to solving the wave equation) can lead to a difficult data integration problem, with severe bias in the results. If the modeling error is accounted for, the data integration problem becomes relatively easy, with no apparent biases. Both examples demonstrate that biased information can have a dramatic effect on the computational efficiency of solving a data integration problem and lead to biased results, and under

  15. Optimization (Alara) and probabilistic exposures: the application of optimization criteria to the control of risks due to exposures of a probabilistic nature

    International Nuclear Information System (INIS)

    Gonzalez, A.J.

    1989-01-01

    The paper describes the application of the principles of optimization recommended by the International Commission on Radiological Protection (ICRP) to the restraint of radiation risks due to exposures that may or may not be incurred and to which a probability of occurrence can be assigned. After describing the concept of probabilistic exposures, it proposes a basis for a converging policy of control for both certain and probabilistic exposures, namely the dose-risk relationship adopted for radiation protection purposes. On that basis some coherent approaches for dealing with probabilistic exposures, such as the limitation of individual risks, are discussed. The optimization of safety for reducing all risks from probabilistic exposures to as-low-as-reasonably-achievable (ALARA) levels is reviewed in full. The principles of optimization of protection are used as a basic framework and the relevant factors to be taken into account when moving to probabilistic exposures are presented. The paper also reviews the decision-aiding techniques suitable for performing optimization, with particular emphasis on the multi-attribute utility-analysis technique. Finally, there is a discussion of some practical applications of decision-aiding multi-attribute utility analysis to probabilistic exposures, including the use of probabilistic utilities. In its final outlook, the paper emphasizes the need for standardization and solutions to generic problems, if optimization of safety is to be successful.

  16. Probabilistic metric spaces

    CERN Document Server

    Schweizer, B

    2005-01-01

    Topics include special classes of probabilistic metric spaces, topologies, and several related structures, such as probabilistic normed and inner-product spaces. 1983 edition, updated with 3 new appendixes. Includes 17 illustrations.

  17. OCA-P, PWR Vessel Probabilistic Fracture Mechanics

    International Nuclear Information System (INIS)

    Cheverton, R.D.; Ball, D.G.

    2001-01-01

    1 - Description of program or function: OCA-P is a probabilistic fracture-mechanics code prepared specifically for evaluating the integrity of pressurized-water reactor vessels subjected to overcooling-accident loading conditions. Based on linear-elastic fracture mechanics, it has two- and limited three-dimensional flaw capability, and can treat cladding as a discrete region. Both deterministic and probabilistic analyses can be performed. For deterministic analysis, it is possible to conduct a search for critical values of the fluence and the nil-ductility reference temperature corresponding to incipient initiation of the initial flaw. The probabilistic portion of OCA-P is based on Monte Carlo techniques, and simulated parameters include fluence, flaw depth, fracture toughness, nil-ductility reference temperature, and concentrations of copper, nickel, and phosphorous. Plotting capabilities include the construction of critical-crack-depth diagrams (deterministic analysis) and a variety of histograms (probabilistic analysis). 2 - Method of solution: OCA-P accepts as input the reactor primary-system pressure and the reactor pressure-vessel downcomer coolant temperature, as functions of time in the specified transient. Then, the wall temperatures and stresses are calculated as a function of time and radial position in the wall, and the fracture-mechanics analysis is performed to obtain the stress intensity factors as a function of crack depth and time in the transient. In a deterministic analysis, values of the static crack initiation toughness and the crack arrest toughness are also calculated for all crack depths and times in the transient. A comparison of these values permits an evaluation of flaw behavior. For a probabilistic analysis, OCA-P generates a large number of reactor pressure vessels, each with a different combination of the various values of the parameters involved in the analysis of flaw behavior. For each of these vessels, a deterministic fracture

  18. Probabilistic analysis of a materially nonlinear structure

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.; Fossum, A. F.

    1990-01-01

    A probabilistic finite element program is used to perform probabilistic analysis of a materially nonlinear structure. The program used in this study is NESSUS (Numerical Evaluation of Stochastic Structure Under Stress), under development at Southwest Research Institute. The cumulative distribution function (CDF) of the radial stress of a thick-walled cylinder under internal pressure is computed and compared with the analytical solution. In addition, sensitivity factors showing the relative importance of the input random variables are calculated. Significant plasticity is present in this problem and has a pronounced effect on the probabilistic results. The random input variables are the material yield stress and internal pressure with Weibull and normal distributions, respectively. The results verify the ability of NESSUS to compute the CDF and sensitivity factors of a materially nonlinear structure. In addition, the ability of the Advanced Mean Value (AMV) procedure to assess the probabilistic behavior of structures which exhibit a highly nonlinear response is shown. Thus, the AMV procedure can be applied with confidence to other structures which exhibit nonlinear behavior.
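
    For orientation only, the cumulative distribution of the elastic radial stress in a thick-walled cylinder under internal pressure can be estimated by plain Monte Carlo from the Lamé solution, as sketched below with invented dimensions and distributions. The cited work goes much further (plasticity, a random yield stress, the AMV procedure and sensitivity factors), none of which the sketch attempts.

        import numpy as np

        rng = np.random.default_rng(7)
        N = 100_000

        a, b, r = 0.10, 0.20, 0.12          # inner radius, outer radius, evaluation radius (m); invented
        p = rng.normal(150.0e6, 15.0e6, N)  # internal pressure (Pa), normally distributed; invented

        # Lamé (elastic) radial stress at radius r; compressive, hence negative.
        sigma_r = p * a**2 / (b**2 - a**2) * (1.0 - b**2 / r**2)

        # Empirical cumulative distribution function of the radial stress.
        x = np.sort(sigma_r)
        cdf = np.arange(1, N + 1) / N
        for q in (0.05, 0.50, 0.95):
            print(f"{q:4.0%} quantile of sigma_r: {np.interp(q, cdf, x) / 1.0e6:8.2f} MPa")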

  19. Probabilistic dual heuristic programming-based adaptive critic

    Science.gov (United States)

    Herzallah, Randa

    2010-02-01

    Adaptive critic (AC) methods have common roots as generalisations of dynamic programming for neural reinforcement learning approaches. Since they approximate the dynamic programming solutions, they are potentially suitable for learning in noisy, non-linear and non-stationary environments. In this study, a novel probabilistic dual heuristic programming (DHP)-based AC controller is proposed. Distinct to current approaches, the proposed probabilistic (DHP) AC method takes uncertainties of forward model and inverse controller into consideration. Therefore, it is suitable for deterministic and stochastic control problems characterised by functional uncertainty. Theoretical development of the proposed method is validated by analytically evaluating the correct value of the cost function which satisfies the Bellman equation in a linear quadratic control problem. The target value of the probabilistic critic network is then calculated and shown to be equal to the analytically derived correct value. Full derivation of the Riccati solution for this non-standard stochastic linear quadratic control problem is also provided. Moreover, the performance of the proposed probabilistic controller is demonstrated on linear and non-linear control examples.

  20. Probabilistic Modeling and Visualization for Bankruptcy Prediction

    DEFF Research Database (Denmark)

    Antunes, Francisco; Ribeiro, Bernardete; Pereira, Francisco Camara

    2017-01-01

    In accounting and finance domains, bankruptcy prediction is of great utility for all of the economic stakeholders. The challenge of accurately predicting business failure, especially under scenarios of financial crisis, is known to be complicated. Although there have been many successful studies on bankruptcy detection, probabilistic approaches have seldom been pursued. In this paper we assume a probabilistic point of view by applying Gaussian Processes (GP) in the context of bankruptcy prediction, comparing them against Support Vector Machines (SVM) and Logistic Regression (LR). Using real-world bankruptcy data, an in-depth analysis is conducted showing that, in addition to a probabilistic interpretation, the GP can effectively improve the bankruptcy prediction performance with high accuracy when compared to the other approaches. We additionally generate a complete graphical
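
    A rough sense of the comparison described above can be obtained with scikit-learn, where a Gaussian Process classifier returns class probabilities that logistic regression and an SVM can be measured against. The snippet below substitutes synthetic data for the real bankruptcy dataset and is only a sketch of such an experimental setup, not the authors' analysis.

        import numpy as np
        from sklearn.datasets import make_classification
        from sklearn.model_selection import train_test_split
        from sklearn.gaussian_process import GaussianProcessClassifier
        from sklearn.gaussian_process.kernels import RBF
        from sklearn.linear_model import LogisticRegression
        from sklearn.svm import SVC
        from sklearn.metrics import accuracy_score, log_loss

        # Synthetic stand-in for financial-ratio features and a (minority) bankruptcy label.
        X, y = make_classification(n_samples=600, n_features=10, n_informative=6,
                                   weights=[0.8, 0.2], random_state=0)
        Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

        models = {
            "GP":  GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0),
            "LR":  LogisticRegression(max_iter=1000),
            "SVM": SVC(probability=True, random_state=0),
        }
        for name, m in models.items():
            m.fit(Xtr, ytr)
            proba = m.predict_proba(Xte)
            print(f"{name}: accuracy={accuracy_score(yte, m.predict(Xte)):.3f}, "
                  f"log-loss={log_loss(yte, proba):.3f}")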

  1. Sensitivity analysis in multi-parameter probabilistic systems

    International Nuclear Information System (INIS)

    Walker, J.R.

    1987-01-01

    Probabilistic methods involving the use of multi-parameter Monte Carlo analysis can be applied to a wide range of engineering systems. The output from the Monte Carlo analysis is a probabilistic estimate of the system consequence, which can vary spatially and temporally. Sensitivity analysis aims to examine how the output consequence is influenced by the input parameter values. Sensitivity analysis provides the necessary information so that the engineering properties of the system can be optimized. This report details a package of sensitivity analysis techniques that together form an integrated methodology for the sensitivity analysis of probabilistic systems. The techniques have known confidence limits and can be applied to a wide range of engineering problems. The sensitivity analysis methodology is illustrated by performing the sensitivity analysis of the MCROC rock microcracking model
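
    One of the simplest members of such a package is a sampling-based sensitivity measure: the linear (Pearson) and rank (Spearman) correlation between each Monte Carlo input and the output consequence. The sketch below uses an invented three-input model and is generic; it is not the MCROC-specific code.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(11)
        N = 20_000

        # Invented model: the consequence depends strongly on x1, weakly on x2,
        # and not at all on x3; noise stands in for unmodelled variability.
        x = rng.normal(size=(N, 3))
        y = 3.0 * x[:, 0] + 0.5 * x[:, 1] + rng.normal(scale=0.5, size=N)

        for i in range(3):
            r_lin, _ = stats.pearsonr(x[:, i], y)    # linear correlation
            r_rank, _ = stats.spearmanr(x[:, i], y)  # rank correlation (monotonic sensitivity)
            print(f"x{i + 1}: Pearson = {r_lin:+.3f}, Spearman = {r_rank:+.3f}")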

  2. Safety goals for commercial nuclear power plants

    International Nuclear Information System (INIS)

    Roe, J.W.

    1988-01-01

    In its official policy statement on safety goals for the operation of nuclear power plants, the Nuclear Regulatory Commission (NRC) set two qualitative goals, supported by two quantitative objectives. These goals are that (1) individual members of the public should be provided a level of protection from the consequences of nuclear power plant operation such that individuals bear no significant additional risk to life and health; and (2) societal risks to life and health from nuclear power plant operation should be comparable to or less than the risks of generating electricity by viable competing technologies and should not be a significant addition to other societal risks. As an alternative, this study proposes four quantitative safety goals for nuclear power plants. It begins with an analysis of the NRC's safety-goal development process, a key portion of which was devoted to delineating criteria for evaluating goal-development methods. Based on this analysis, recommendations for revision of the NRC's basic benchmarks for goal development are proposed. Using the revised criteria, NRC safety goals are evaluated, and the alternative safety goals are proposed. To further support these recommendations, both the NRC's goals and the proposed goals are compared with the results of three major probabilistic risk assessment studies. Finally, the potential impact of these recommendations on nuclear safety is described

  3. Budget goal commitment, clinical managers' use of budget information and performance.

    Science.gov (United States)

    Macinati, Manuela S; Rizzo, Marco G

    2014-08-01

    Despite the importance placed on accounting as a means to influence performance in public healthcare, there is still a lot to be learned about the role of management accounting in clinical managers' work behavior and their link with organizational performance. The article aims at analyzing the motivational role of budgetary participation and the intervening role of individuals' mental states and behaviors in influencing the relationship between budgetary participation and performance. According to the goal-setting theory, SEM technique was used to test the relationships among variables. The data were collected by a survey conducted in an Italian hospital. The results show that: (i) budgetary participation does not directly influence the use of budget information, but the latter is encouraged by the level of budget goal commitment which, as a result, is influenced by the positive motivational consequences of participative budgeting; (ii) budget goal commitment does not directly influence performance, but the relationship is mediated by the use of budget information. This study contributes to health policy and management accounting literature and has significant policy implications. Mainly, the findings prove that the introduction of business-like techniques in the healthcare sector can improve performance if attitudinal and behavioral variables are adequately stimulated. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  4. Probabilistic Linguistic Power Aggregation Operators for Multi-Criteria Group Decision Making

    Directory of Open Access Journals (Sweden)

    Agbodah Kobina

    2017-12-01

    As an effective aggregation tool, the power average (PA) allows the input arguments being aggregated to support and reinforce each other, which provides more versatility in the information aggregation process. Under the probabilistic linguistic term environment, we investigate new power aggregation (PA) operators for fusing probabilistic linguistic term sets (PLTSs). In this paper, we first develop the probabilistic linguistic power average (PLPA) and weighted probabilistic linguistic power average (WPLPA) operators, as well as the probabilistic linguistic power geometric (PLPG) and weighted probabilistic linguistic power geometric (WPLPG) operators. We also carefully analyze the properties of these new aggregation operators. With the aid of the WPLPA and WPLPG operators, we then design approaches for multi-criteria group decision-making (MCGDM) with PLTSs. Finally, we use an illustrative example to demonstrate the proposed methods and verify their performance.
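
    The underlying power-average idea can be made concrete with a small numeric sketch. The Python fragment below implements the classical (crisp) power average, in which each argument's weight grows with the support it receives from the other arguments. The support function 1 - |ai - aj| and the sample data are illustrative assumptions, and the probabilistic-linguistic extensions developed in the paper (PLPA, WPLPA, etc.) add linguistic-term and probability machinery that is not reproduced here.

        import numpy as np

        def power_average(a):
            """Classical power average on [0, 1]-scaled arguments: values that sit
            close to the other arguments receive more support and thus more weight."""
            a = np.asarray(a, dtype=float)
            sup = 1.0 - np.abs(a[:, None] - a[None, :])   # illustrative support measure
            np.fill_diagonal(sup, 0.0)
            t = sup.sum(axis=1)                            # total support T(a_i)
            weights = (1.0 + t) / (1.0 + t).sum()
            return float(np.dot(weights, a))

        data = [0.20, 0.25, 0.30, 0.90]
        print(power_average(data))        # the outlier 0.90 is down-weighted
        print(np.mean(data))              # compare with the plain arithmetic mean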

  5. Artificial neural network model for prediction of safety performance indicators goals in nuclear plants

    Energy Technology Data Exchange (ETDEWEB)

    Souto, Kelling C.; Nunes, Wallace W. [Instituto Federal de Educacao, Ciencia e Tecnologia do Rio de Janeiro, Nilopolis, RJ (Brazil). Lab. de Aplicacoes Computacionais; Machado, Marcelo D., E-mail: dornemd@eletronuclear.gov.b [ELETROBRAS Termonuclear S.A. (ELETRONUCLEAR), Rio de Janeiro, RJ (Brazil). Gerencia de Combustivel Nuclear - GCN.T

    2011-07-01

    Safety performance indicators have been developed to provide a quantitative indication of performance and safety in various industry sectors. These indices can be used to assess aspects ranging from production, design, and human performance to management issues, in accordance with the company's policy, objectives, and goals. The use of safety performance indicators in nuclear power plants around the world is a reality. However, it is necessary to periodically set goal values. Such goals are targets, relating to each of the indicators, to be achieved by the plant over a predetermined period of operation. The current process of defining these goals is carried out by experts in a subjective way, based on actual data from the plant and comparison with global indices. Artificial neural networks are computational techniques based on a mathematical model inspired by the neural structure of intelligent organisms, which acquire knowledge through experience. This paper proposes an artificial neural network model aimed at predicting the goal values to be used in the evaluation of safety performance indicators for nuclear power plants. (author)
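
    To make the proposal concrete, the fragment below trains a small feed-forward network on synthetic data to map past indicator values to a goal value for the next period. It is a hedged sketch only: the input features, network size, and target relationship are invented stand-ins, and scikit-learn's MLPRegressor is used purely for convenience, not because the authors used it.

        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)

        # Synthetic stand-in data: rows = operating periods, columns = past
        # indicator values (e.g., unplanned scrams, safety system failures, ...).
        X = rng.uniform(0.0, 1.0, size=(200, 5))
        y = 0.8 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0.0, 0.02, size=200)  # "goal" value

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

        model = MLPRegressor(hidden_layer_sizes=(16, 8), max_iter=5000, random_state=0)
        model.fit(X_tr, y_tr)
        print("R^2 on held-out periods:", round(model.score(X_te, y_te), 3))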

  6. Artificial neural network model for prediction of safety performance indicators goals in nuclear plants

    International Nuclear Information System (INIS)

    Souto, Kelling C.; Nunes, Wallace W.; Machado, Marcelo D.

    2011-01-01

    Safety performance indicators have been developed to provide a quantitative indication of performance and safety in various industry sectors. These indices can be used to assess aspects ranging from production, design, and human performance to management issues, in accordance with the company's policy, objectives, and goals. The use of safety performance indicators in nuclear power plants around the world is a reality. However, it is necessary to periodically set goal values. Such goals are targets, relating to each of the indicators, to be achieved by the plant over a predetermined period of operation. The current process of defining these goals is carried out by experts in a subjective way, based on actual data from the plant and comparison with global indices. Artificial neural networks are computational techniques based on a mathematical model inspired by the neural structure of intelligent organisms, which acquire knowledge through experience. This paper proposes an artificial neural network model aimed at predicting the goal values to be used in the evaluation of safety performance indicators for nuclear power plants. (author)

  7. Quantitative safety goals for nuclear power plants: critical review and reformulation within a unified theory

    International Nuclear Information System (INIS)

    Munera, H.A.; Yadigaroglu, G.

    1987-01-01

    Most suggestions for the establishment of probabilistic safety goals in the regulatory process of nuclear power plants contain some measure of total risk to the individual and to society, and/or a limit line. There is still some confusion about both formal and informal aspects of the basic ideas. The first part of the chapter critically reviews some of the adopted and/or proposed probabilistic safety goals and criteria in several countries. Some of the difficulties identified are: lack of an adequate delimitation of the scope of the non-deterministic choice problem; consequently, the main components of the problem - probabilities and consequences - are not clearly defined, and as a further consequence there is a conspicuous absence of a unified treatment, including notation and terminology, for concepts like risk, probability, frequency, utility, risk aversion, limit line, etc. In addition, the theoretical justifications of limit lines are not always fully understood, nor are their limitations recognized. In the second part, theoretical methods for comparing probability distributions that exist in other disciplines are mentioned, and a unified methodology for formulating probabilistic safety criteria is described. (author)

  8. Comparison of Two Probabilistic Fatigue Damage Assessment Approaches Using Prognostic Performance Metrics

    Data.gov (United States)

    National Aeronautics and Space Administration — A general framework for probabilistic prognosis using maximum entropy approach, MRE, is proposed in this paper to include all available information and uncertainties...

  9. Relative risk of probabilistic category learning deficits in patients with schizophrenia and their siblings

    Science.gov (United States)

    Weickert, Thomas W.; Goldberg, Terry E.; Egan, Michael F.; Apud, Jose A.; Meeter, Martijn; Myers, Catherine E.; Gluck, Mark A; Weinberger, Daniel R.

    2010-01-01

    Background While patients with schizophrenia display an overall probabilistic category learning performance deficit, the extent to which this deficit occurs in unaffected siblings of patients with schizophrenia is unknown. There are also discrepant findings regarding probabilistic category learning acquisition rate and performance in patients with schizophrenia. Methods A probabilistic category learning test was administered to 108 patients with schizophrenia, 82 unaffected siblings, and 121 healthy participants. Results Patients with schizophrenia displayed significant differences from their unaffected siblings and healthy participants with respect to probabilistic category learning acquisition rates. Although siblings on the whole failed to differ from healthy participants on strategy and quantitative indices of overall performance and learning acquisition, application of a revised learning criterion enabling classification into good and poor learners based on individual learning curves revealed significant differences between percentages of sibling and healthy poor learners: healthy (13.2%), siblings (34.1%), patients (48.1%), yielding a moderate relative risk. Conclusions These results clarify previous discrepant findings pertaining to probabilistic category learning acquisition rate in schizophrenia and provide the first evidence for the relative risk of probabilistic category learning abnormalities in unaffected siblings of patients with schizophrenia, supporting genetic underpinnings of probabilistic category learning deficits in schizophrenia. These findings also raise questions regarding the contribution of antipsychotic medication to the probabilistic category learning deficit in schizophrenia. The distinction between good and poor learning may be used to inform genetic studies designed to detect schizophrenia risk alleles. PMID:20172502

  10. Safety Goal, Multi-unit Risk and PSA Uncertainty

    Energy Technology Data Exchange (ETDEWEB)

    Yang, Joon-Eon [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    The safety goal is each country's answer to the question 'How safe is safe enough?'. Table 1 shows some examples of safety goals. However, many countries, including Korea, have not yet established an official safety goal for NPPs, since setting a safety goal is not just a technical issue but a very complex socio-technical one. In establishing a safety goal for nuclear facilities, various factors must be considered, including not only technical aspects but also social and cultural ones. Korea is currently trying to establish an official safety goal. In this paper, we review the relationship between the safety goal and Probabilistic Safety Assessment (PSA). We also address some important technical issues to be considered in establishing a safety goal for NPPs from the PSA point of view, i.e., multi-unit risk and the uncertainty of PSA. We believe that the safety goal to be established in Korea should take multi-unit risk into account. In addition, the relationship between the safety goal and PSA should also be defined clearly, since PSA is the only way to answer the question 'How safe is safe enough?'.

  11. When to conduct probabilistic linkage vs. deterministic linkage? A simulation study.

    Science.gov (United States)

    Zhu, Ying; Matsuyama, Yutaka; Ohashi, Yasuo; Setoguchi, Soko

    2015-08-01

    When unique identifiers are unavailable, successful record linkage depends greatly on data quality and the types of variables available. While probabilistic linkage theoretically captures more true matches than deterministic linkage by allowing imperfection in identifiers, studies have shown inconclusive results, likely due to variations in data quality, implementation of the linkage methodology, and validation method. This simulation study aimed to understand the data characteristics that affect the performance of probabilistic vs. deterministic linkage. We created ninety-six scenarios that represent real-life situations using non-unique identifiers. We systematically varied discriminative power, rates of missingness and error, and file size to produce a range of linkage patterns and difficulties. We assessed the performance difference between linkage methods using standard validity measures and computation time. Across scenarios, deterministic linkage showed an advantage in PPV while probabilistic linkage showed an advantage in sensitivity. Probabilistic linkage uniformly outperformed deterministic linkage, as the former generated linkages with a better trade-off between sensitivity and PPV regardless of data quality. However, with low rates of missingness and error in the data, deterministic linkage did not perform significantly worse. The implementation of deterministic linkage in SAS took less than 1 min, and probabilistic linkage took 2 min to 2 h depending on file size. Our simulation study demonstrated that the intrinsic rates of missingness and error in the linkage variables were key to choosing between linkage methods. In general, probabilistic linkage was the better choice, but for exceptionally good-quality data (<5% error), deterministic linkage was a more resource-efficient choice. Copyright © 2015 Elsevier Inc. All rights reserved.
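
    The contrast between the two linkage strategies can be illustrated with a toy Python sketch: a deterministic rule that requires exact agreement on every identifier, versus a simple Fellegi-Sunter-style probabilistic score that tolerates disagreement on some fields. The field names, m/u-probabilities, and decision cutoff are illustrative assumptions, not values from the study (which was implemented in SAS).

        import math

        def deterministic_match(a, b):
            # Exact agreement on all identifiers (no tolerance for error or missingness).
            return all(a[k] == b[k] for k in ("last", "first", "dob", "zip"))

        # Illustrative m- and u-probabilities per field (Fellegi-Sunter style).
        M = {"last": 0.95, "first": 0.90, "dob": 0.98, "zip": 0.85}
        U = {"last": 0.01, "first": 0.05, "dob": 0.001, "zip": 0.10}

        def probabilistic_score(a, b):
            score = 0.0
            for k in M:
                if a[k] and b[k]:                       # skip missing values
                    agree = a[k] == b[k]
                    m, u = (M[k], U[k]) if agree else (1 - M[k], 1 - U[k])
                    score += math.log2(m / u)
            return score

        rec1 = {"last": "smith", "first": "jon",  "dob": "1980-05-01", "zip": "10001"}
        rec2 = {"last": "smith", "first": "john", "dob": "1980-05-01", "zip": "10001"}
        print(deterministic_match(rec1, rec2))          # False: one typo breaks the link
        print(probabilistic_score(rec1, rec2) > 5.0)    # True with an illustrative cutoff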

  12. Distinguishing the desire to learn from the desire to perform: The social value of achievement goals.

    Science.gov (United States)

    Cohen, Joanna; Darnon, Céline; Mollaret, Patrick

    2017-01-01

    We sought to distinguish mastery goals (i.e., desire to learn) from performance goals (i.e., desire to achieve more positive evaluations than others) in the light of social judgment research. In a pilot study, we made a conceptual distinction between three types of traits (agency, competence, and effort) that are often undifferentiated. We then tested the relevance of this distinction for understanding how people pursuing either mastery or performance goals are judged. For self-perception, results revealed that effort was predicted by the adoption of mastery goals and agency by performance goals (Study 1). For judgments, results showed that (a) the target pursuing mastery goals was perceived as oriented toward effort, and (b) the target pursuing performance goals was perceived as oriented toward agency (Study 2). Finally, these links were shown again by participants who inferred a target's goals from his traits (Study 3). Results are discussed in terms of the social value of achievement goals at school.

  13. Cerebellar tDCS does not improve performance in probabilistic classification learning

    NARCIS (Netherlands)

    N. Seyed Majidi; M.C. Verhage (Claire); O. Donchin (Opher); P.J. Holland (Peter); M.A. Frens (Maarten); J.N. van der Geest (Jos)

    2016-01-01

    In this study, the role of the cerebellum in a cognitive learning task was investigated using transcranial direct current stimulation (tDCS). Using a weather prediction task, subjects had to learn the probabilistic associations between a stimulus (a combination of cards) and an outcome

  14. Disjunctive Probabilistic Modal Logic is Enough for Bisimilarity on Reactive Probabilistic Systems

    OpenAIRE

    Bernardo, Marco; Miculan, Marino

    2016-01-01

    Larsen and Skou characterized probabilistic bisimilarity over reactive probabilistic systems with a logic including true, negation, conjunction, and a diamond modality decorated with a probabilistic lower bound. Later on, Desharnais, Edalat, and Panangaden showed that negation is not necessary to characterize the same equivalence. In this paper, we prove that the logical characterization holds also when conjunction is replaced by disjunction, with negation still being not necessary. To this e...

  15. Developing probabilistic models to predict amphibian site occupancy in a patchy landscape

    Science.gov (United States)

    R. A. Knapp; K.R. Matthews; H. K. Preisler; R. Jellison

    2003-01-01

    Human-caused fragmentation of habitats is threatening an increasing number of animal and plant species, making an understanding of the factors influencing patch occupancy ever more important. The overall goal of the current study was to develop probabilistic models of patch occupancy for the mountain yellow-legged frog (Rana muscosa). This once-common species...

  16. An approach to handle Real Time and Probabilistic behaviors in e-commerce

    DEFF Research Database (Denmark)

    Diaz, G.; Larsen, Kim Guldstrand; Pardo, J.

    2005-01-01

    In this work we describe an approach to deal with systems having both probabilistic and real-time behaviors. The main goal of the paper is to show the automatic translation from a real-time model based on the UPPAAL tool, which performs automatic verification of Real Time Systems, to the R...

  17. Probabilistic Structural Analysis Program

    Science.gov (United States)

    Pai, Shantaram S.; Chamis, Christos C.; Murthy, Pappu L. N.; Stefko, George L.; Riha, David S.; Thacker, Ben H.; Nagpal, Vinod K.; Mital, Subodh K.

    2010-01-01

    NASA/NESSUS 6.2c is a general-purpose, probabilistic analysis program that computes probability of failure and probabilistic sensitivity measures of engineered systems. Because NASA/NESSUS uses highly computationally efficient and accurate analysis techniques, probabilistic solutions can be obtained even for extremely large and complex models. Once the probabilistic response is quantified, the results can be used to support risk-informed decisions regarding reliability for safety-critical and one-of-a-kind systems, as well as for maintaining a level of quality while reducing manufacturing costs for larger-quantity products. NASA/NESSUS has been successfully applied to a diverse range of problems in aerospace, gas turbine engines, biomechanics, pipelines, defense, weaponry, and infrastructure. This program combines state-of-the-art probabilistic algorithms with general-purpose structural analysis and life-prediction (lifing) methods to compute the probabilistic response and reliability of engineered structures. Uncertainties in load, material properties, geometry, boundary conditions, and initial conditions can be simulated. The structural analysis methods include non-linear finite-element methods, heat-transfer analysis, polymer/ceramic matrix composite analysis, monolithic (conventional metallic) materials life-prediction methodologies, boundary element methods, and user-written subroutines. Several probabilistic algorithms are available, such as the advanced mean value method and the adaptive importance sampling method. NASA/NESSUS 6.2c is structured in a modular format with 15 elements.
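
    For readers unfamiliar with what a "probability of failure" calculation looks like, the sketch below estimates one by plain Monte Carlo for a trivial stress-strength limit state. This is only a hedged illustration of the general idea: the distributions are invented, and NESSUS's own algorithms (e.g., the advanced mean value and adaptive importance sampling methods) are far more efficient than the brute-force sampling shown here.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000

        # Illustrative limit state: failure when stress exceeds strength.
        strength = rng.normal(400.0, 30.0, n)   # MPa
        stress   = rng.normal(300.0, 40.0, n)   # MPa
        g = strength - stress                   # performance function, g < 0 = failure

        pf = np.mean(g < 0.0)
        se = np.sqrt(pf * (1.0 - pf) / n)
        print(f"Probability of failure ~ {pf:.2e} (std. err. {se:.1e})")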

  18. Probabilistic tsunami hazard assessment for Point Lepreau Generating Station

    Energy Technology Data Exchange (ETDEWEB)

    Mullin, D., E-mail: dmullin@nbpower.com [New Brunswick Power Corporation, Point Lepreau Generating Station, Point Lepreau (Canada); Alcinov, T.; Roussel, P.; Lavine, A.; Arcos, M.E.M.; Hanson, K.; Youngs, R., E-mail: trajce.alcinov@amecfw.com, E-mail: patrick.roussel@amecfw.com [AMEC Foster Wheeler Environment & Infrastructure, Dartmouth, NS (Canada)

    2015-07-01

    In 2012 the Geological Survey of Canada published a preliminary probabilistic tsunami hazard assessment in Open File 7201 that presents the most up-to-date information on all potential tsunami sources in a probabilistic framework on a national level, thus providing the underlying basis for conducting site-specific tsunami hazard assessments. However, the assessment identified a poorly constrained hazard for the Atlantic Coastline and recommended further evaluation. As a result, NB Power has embarked on a Probabilistic Tsunami Hazard Assessment (PTHA) for Point Lepreau Generating Station. This paper presents the methodology and the progress of the hazard evaluation for Point Lepreau G.S. (author)

  19. Probabilistic hypergraph based hash codes for social image search

    Institute of Scientific and Technical Information of China (English)

    Yi XIE; Hui-min YU; Roland HU

    2014-01-01

    With the rapid development of the Internet, recent years have seen the explosive growth of social media. This brings great challenges in performing efficient and accurate image retrieval on a large scale. Recent work shows that using hashing methods to embed high-dimensional image features and tag information into Hamming space provides a powerful way to index large collections of social images. By learning hash codes through a spectral graph partitioning algorithm, spectral hashing (SH) has shown promising performance among various hashing approaches. However, modeling the relations among images only with pairwise simple graphs is incomplete, since such graphs ignore higher-order relationships. In this paper, we utilize a probabilistic hypergraph model to learn hash codes for social image retrieval. A probabilistic hypergraph model offers a higher-order representation among social images by connecting more than two images in one hyperedge. Unlike a normal hypergraph model, a probabilistic hypergraph model considers not only the grouping information, but also the similarities between vertices in hyperedges. Experiments on Flickr image datasets verify the performance of our proposed approach.

  20. Application of Deterministic and Probabilistic System Design Methods and Enhancements of Conceptual Design Tools for ERA Project

    Science.gov (United States)

    Mavris, Dimitri N.; Schutte, Jeff S.

    2016-01-01

    This report documents work done by the Aerospace Systems Design Lab (ASDL) at the Georgia Institute of Technology, Daniel Guggenheim School of Aerospace Engineering for the National Aeronautics and Space Administration, Aeronautics Research Mission Directorate, Integrated System Research Program, Environmentally Responsible Aviation (ERA) Project. This report was prepared under contract NNL12AA12C, "Application of Deterministic and Probabilistic System Design Methods and Enhancement of Conceptual Design Tools for ERA Project". The research within this report addressed the Environmentally Responsible Aviation (ERA) project goal stated in the NRA solicitation "to advance vehicle concepts and technologies that can simultaneously reduce fuel burn, noise, and emissions." Identifying technology and vehicle solutions that simultaneously meet these three metrics requires system-level analysis with the appropriate level of fidelity to quantify feasibility, benefits and degradations, and associated risk. In order to perform the system-level analysis, the Environmental Design Space (EDS) [Kirby 2008, Schutte 2012a] environment developed by ASDL was used to model both conventional and unconventional configurations as well as to assess technologies from the ERA and N+2 timeframe portfolios. A well-established system design approach was used to perform aircraft conceptual design studies, including technology trade studies to identify technology portfolios capable of accomplishing the ERA project goal and to obtain accurate tradeoffs between performance, noise, and emissions. The ERA goal, shown in Figure 1, is to simultaneously achieve the N+2 benefits of a cumulative noise margin of 42 EPNdB relative to stage 4, a 75 percent reduction in LTO NOx emissions relative to CAEP 6, and a 50 percent reduction in fuel burn relative to the 2005 best-in-class aircraft. There were five research tasks associated with this research: 1) identify technology collectors, 2) model

  1. To master or perform? Exploring relations between achievement goals and conceptual change learning.

    Science.gov (United States)

    Ranellucci, John; Muis, Krista R; Duffy, Melissa; Wang, Xihui; Sampasivam, Lavanya; Franco, Gina M

    2013-09-01

    Research is needed to explore conceptual change in relation to achievement goal orientations and depth of processing. To address this need, we examined relations between achievement goals, use of deep versus shallow processing strategies, and conceptual change learning using a think-aloud protocol. Seventy-three undergraduate students were assessed on their prior knowledge and misconceptions about Newtonian mechanics, and then reported their achievement goals and participated in think-aloud protocols while reading Newtonian physics texts. A mastery-approach goal orientation positively predicted deep processing strategies, shallow processing strategies, and conceptual change. In contrast, a performance-approach goal orientation did not predict either of the processing strategies, but negatively predicted conceptual change. A performance-avoidance goal orientation negatively predicted deep processing strategies and conceptual change. Moreover, deep and shallow processing strategies positively predicted conceptual change as well as recall. Finally, both deep and shallow processing strategies mediated relations between mastery-approach goals and conceptual change. Results provide some support for Dole and Sinatra's (1998) Cognitive Reconstruction of Knowledge Model of conceptual change but also challenge specific facets with regard to the role of depth of processing in conceptual change. © 2012 The British Psychological Society.

  2. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    International Nuclear Information System (INIS)

    Elkhoraibi, T.; Hashemi, A.; Ostadan, F.

    2014-01-01

    Soil-structure interaction (SSI) is a major step in the seismic design of massive and stiff structures typical of nuclear facilities and civil infrastructure such as tunnels, underground stations, dams and lock head structures. Currently, most SSI analyses are performed deterministically, incorporating a limited range of variation in soil and structural properties and without consideration of ground motion incoherency effects. This often leads to overestimation of the seismic response, particularly the In-Structure Response Spectra (ISRS), imposing significant design and equipment qualification costs, especially in the case of high-frequency sensitive equipment at stiff soil or rock sites. The reluctance to incorporate a more comprehensive probabilistic approach is mainly due to the fact that the computational cost of performing probabilistic SSI analysis, even without incoherency function considerations, has been prohibitive. As such, bounding deterministic approaches have been preferred by the industry and accepted by the regulatory agencies. However, given the recently available and growing computing capabilities, the need for a probabilistic approach to SSI analysis is becoming clear with the advances in performance-based engineering and the utilization of fragility analysis in the decision-making process, whether by the owners or the regulatory agencies. This paper demonstrates the use of both probabilistic and deterministic SSI analysis techniques to identify important engineering demand parameters in the structure. A typical nuclear industry structure is used as an example for this study. The system is analyzed for two different site conditions: rock and deep soil. Both deterministic and probabilistic SSI analysis approaches are performed, using the program SASSI, with and without ground motion incoherency considerations. In both approaches, the analysis begins at the hard rock level using the low frequency and high frequency hard rock

  3. Probabilistic and deterministic soil structure interaction analysis including ground motion incoherency effects

    Energy Technology Data Exchange (ETDEWEB)

    Elkhoraibi, T., E-mail: telkhora@bechtel.com; Hashemi, A.; Ostadan, F.

    2014-04-01

    Soil-structure interaction (SSI) is a major step in the seismic design of massive and stiff structures typical of nuclear facilities and civil infrastructure such as tunnels, underground stations, dams and lock head structures. Currently, most SSI analyses are performed deterministically, incorporating a limited range of variation in soil and structural properties and without consideration of ground motion incoherency effects. This often leads to overestimation of the seismic response, particularly the In-Structure Response Spectra (ISRS), imposing significant design and equipment qualification costs, especially in the case of high-frequency sensitive equipment at stiff soil or rock sites. The reluctance to incorporate a more comprehensive probabilistic approach is mainly due to the fact that the computational cost of performing probabilistic SSI analysis, even without incoherency function considerations, has been prohibitive. As such, bounding deterministic approaches have been preferred by the industry and accepted by the regulatory agencies. However, given the recently available and growing computing capabilities, the need for a probabilistic approach to SSI analysis is becoming clear with the advances in performance-based engineering and the utilization of fragility analysis in the decision-making process, whether by the owners or the regulatory agencies. This paper demonstrates the use of both probabilistic and deterministic SSI analysis techniques to identify important engineering demand parameters in the structure. A typical nuclear industry structure is used as an example for this study. The system is analyzed for two different site conditions: rock and deep soil. Both deterministic and probabilistic SSI analysis approaches are performed, using the program SASSI, with and without ground motion incoherency considerations. In both approaches, the analysis begins at the hard rock level using the low frequency and high frequency hard rock

  4. Site-specific seismic probabilistic tsunami hazard analysis: performances and potential applications

    Science.gov (United States)

    Tonini, Roberto; Volpe, Manuela; Lorito, Stefano; Selva, Jacopo; Orefice, Simone; Graziani, Laura; Brizuela, Beatriz; Smedile, Alessandra; Romano, Fabrizio; De Martini, Paolo Marco; Maramai, Alessandra; Piatanesi, Alessio; Pantosti, Daniela

    2017-04-01

    Seismic Probabilistic Tsunami Hazard Analysis (SPTHA) provides probabilities to exceed different thresholds of tsunami hazard intensity, at a specific site or region and in a given time span, for tsunamis caused by seismic sources. Results obtained by SPTHA (i.e., probabilistic hazard curves and inundation maps) represent a very important input to risk analyses and land use planning. However, the large variability of source parameters implies the definition of a huge number of potential tsunami scenarios, whose omission could lead to a biased analysis. Moreover, tsunami propagation from source to target requires the use of very expensive numerical simulations. At regional scale, the computational cost can be reduced using assumptions on the tsunami modeling (i.e., neglecting non-linear effects, using coarse topo-bathymetric meshes, empirically extrapolating maximum wave heights on the coast). On the other hand, moving to local scale, a much higher resolution is required and such assumptions drop out, since detailed inundation maps require significantly greater computational resources. In this work we apply a multi-step method to perform a site-specific SPTHA, which can be summarized in the following steps: i) perform a regional hazard assessment to account for both the aleatory and epistemic uncertainties of the seismic source, by combining the use of an event tree and an ensemble modeling technique; ii) apply a filtering procedure which uses cluster analysis to define a significantly reduced number of representative scenarios contributing to the hazard of a specific target site; iii) perform high-resolution numerical simulations only for these representative scenarios and for a subset of near-field sources placed in very shallow waters and/or whose coseismic displacements induce ground uplift or subsidence at the target. The method is applied to three target areas in the Mediterranean located around the cities of Milazzo (Italy), Thessaloniki (Greece) and
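
    Step ii), the scenario-filtering step, can be illustrated with a small clustering sketch: group a large synthetic scenario table into a few clusters and keep one representative per cluster, carrying the summed annual rate of the cluster into the expensive high-resolution runs. The feature set, cluster count, and rates below are invented for illustration and are not the parameters or procedure used by the authors.

        import numpy as np
        from sklearn.cluster import KMeans

        rng = np.random.default_rng(7)

        # Illustrative scenario table: magnitude, depth (km), distance to target (km),
        # plus the annual rate each scenario contributes to the regional hazard.
        scenarios = np.column_stack([
            rng.uniform(6.0, 8.5, 5000),     # magnitude
            rng.uniform(5.0, 40.0, 5000),    # depth
            rng.uniform(20.0, 400.0, 5000),  # distance
        ])
        rates = rng.lognormal(-12.0, 1.0, 5000)

        # Standardise features, then group scenarios into a reduced set of clusters.
        z = (scenarios - scenarios.mean(axis=0)) / scenarios.std(axis=0)
        labels = KMeans(n_clusters=50, n_init=10, random_state=0).fit_predict(z)

        reps, weights = [], []
        for c in range(50):
            members = np.where(labels == c)[0]
            reps.append(members[np.argmax(rates[members])])   # highest-rate member
            weights.append(rates[members].sum())              # cluster's total annual rate

        print(len(reps), "representative scenarios replace", len(scenarios), "originals")
        # Each representative would then be run through a high-resolution inundation
        # model, with its result weighted by the corresponding summed rate.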

  5. Probabilistic approaches for geotechnical site characterization and slope stability analysis

    CERN Document Server

    Cao, Zijun; Li, Dianqing

    2017-01-01

    This is the first book to revisit geotechnical site characterization from a probabilistic point of view and provide rational tools to probabilistically characterize geotechnical properties and underground stratigraphy using limited information obtained from a specific site. This book not only provides new probabilistic approaches for geotechnical site characterization and slope stability analysis, but also tackles the difficulties in practical implementation of these approaches. In addition, this book develops efficient Monte Carlo simulation approaches for slope stability analysis and implements these approaches in a commonly available spreadsheet environment. These approaches and the software package are readily available to geotechnical practitioners and relieve them of the burden of the underlying reliability computation algorithms. Readers will find the information a non-specialist needs to determine project-specific statistics of geotechnical properties and to perform probabilistic analysis of slope stability.

  6. Suppression of panel flutter of near-space aircraft based on non-probabilistic reliability theory

    Directory of Open Access Journals (Sweden)

    Ye-Wei Zhang

    2016-03-01

    Active vibration control of composite panels with uncertain parameters in hypersonic flow is studied using non-probabilistic reliability theory. Using piezoelectric patches as active control actuators, the dynamic equations of the panel are established by the finite element method and Hamilton's principle, and the control model of the panel with uncertain parameters is obtained. Based on H∞ robust control theory, non-probabilistic reliability theory, and the non-probabilistic reliability index, a non-probabilistic reliability performance function is given. Moreover, the relationships between the robust controller, the H∞ performance index, and reliability are established. Numerical results show that the control method, under the influence of reliability, the H∞ performance index, and the approaching velocity, is effective for vibration suppression of the panel over the whole interval of uncertain parameters.

  7. Modeling and control of an unstable system using probabilistic fuzzy inference system

    Directory of Open Access Journals (Sweden)

    Sozhamadevi N.

    2015-09-01

    A new type of fuzzy inference system is proposed: a Probabilistic Fuzzy Inference System, which models and minimizes the effects of statistical uncertainties. The blend of two different concepts, degree of truth and probability of truth, in a single framework leads to this new concept. This combination is carried out in both fuzzy sets and fuzzy rules, giving rise to Probabilistic Fuzzy Sets and Probabilistic Fuzzy Rules. Introducing these probabilistic elements, a distinctive probabilistic fuzzy inference system is developed that involves fuzzification, inference, and output processing. This integrated approach accounts for the uncertainties present in the system, such as rule uncertainties and measurement uncertainties, and leads to a design that performs optimally after training. In this paper, a Probabilistic Fuzzy Inference System is applied to the modeling and control of a highly nonlinear, unstable system, and its effectiveness is demonstrated.

  8. Confluence reduction for probabilistic systems

    NARCIS (Netherlands)

    Timmer, Mark; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette

    In this presentation we introduce a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We proved that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To

  9. Probabilistic systems coalgebraically: A survey

    Science.gov (United States)

    Sokolova, Ana

    2011-01-01

    We survey the work on both discrete and continuous-space probabilistic systems as coalgebras, starting with how probabilistic systems are modeled as coalgebras and followed by a discussion of their bisimilarity and behavioral equivalence, mentioning results that follow from the coalgebraic treatment of probabilistic systems. It is interesting to note that, for different reasons, for both discrete and continuous probabilistic systems it may be more convenient to work with behavioral equivalence than with bisimilarity. PMID:21998490

  10. Probabilistic liver atlas construction.

    Science.gov (United States)

    Dura, Esther; Domingo, Juan; Ayala, Guillermo; Marti-Bonmati, Luis; Goceri, E

    2017-01-13

    Anatomical atlases are 3D volumes or shapes representing an organ or structure of the human body. They contain either the prototypical shape of the object of interest together with other shapes representing its statistical variations (statistical atlas) or a probability map of belonging to the object (probabilistic atlas). Probabilistic atlases are mostly built with simple estimations only involving the data at each spatial location. A new method for probabilistic atlas construction that uses a generalized linear model is proposed. This method aims to improve the estimation of the probability that each location is covered by the liver. Furthermore, all methods to build an atlas involve previous coregistration of the sample of shapes available. The influence of the geometrical transformation adopted for registration on the quality of the final atlas has not been sufficiently investigated. The ability of an atlas to adapt to a new case is one of the most important quality criteria that should be taken into account. The presented experiments show that some methods for atlas construction are severely affected by the previous coregistration step. We show the good performance of the new approach. Furthermore, results suggest that extremely flexible registration methods are not always beneficial, since they can reduce the variability of the atlas and hence its ability to give sensible values of probability when used as an aid in segmentation of new cases.

  11. Probabilistic Motor Sequence Yields Greater Offline and Less Online Learning than Fixed Sequence.

    Science.gov (United States)

    Du, Yue; Prashad, Shikha; Schoenbrun, Ilana; Clark, Jane E

    2016-01-01

    It is well acknowledged that motor sequences can be learned quickly through online learning. Subsequently, the initial acquisition of a motor sequence is boosted or consolidated by offline learning. However, little is known whether offline learning can drive the fast learning of motor sequences (i.e., initial sequence learning in the first training session). To examine offline learning in the fast learning stage, we asked four groups of young adults to perform the serial reaction time (SRT) task with either a fixed or probabilistic sequence and with or without preliminary knowledge (PK) of the presence of a sequence. The sequence and PK were manipulated to emphasize either procedural (probabilistic sequence; no preliminary knowledge (NPK)) or declarative (fixed sequence; with PK) memory that were found to either facilitate or inhibit offline learning. In the SRT task, there were six learning blocks with a 2 min break between each consecutive block. Throughout the session, stimuli followed the same fixed or probabilistic pattern except in Block 5, in which stimuli appeared in a random order. We found that PK facilitated the learning of a fixed sequence, but not a probabilistic sequence. In addition to overall learning measured by the mean reaction time (RT), we examined the progressive changes in RT within and between blocks (i.e., online and offline learning, respectively). It was found that the two groups who performed the fixed sequence, regardless of PK, showed greater online learning than the other two groups who performed the probabilistic sequence. The groups who performed the probabilistic sequence, regardless of PK, did not display online learning, as indicated by a decline in performance within the learning blocks. However, they did demonstrate remarkably greater offline improvement in RT, which suggests that they are learning the probabilistic sequence offline. These results suggest that in the SRT task, the fast acquisition of a motor sequence is driven
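
    The within-block versus between-block decomposition of reaction times described above can be sketched in a few lines of Python: online learning as the RT change from the start to the end of a block, offline learning as the change across the break between consecutive blocks. The ten-trial windows, block structure, and toy RT data are illustrative assumptions, not the authors' exact operationalisation.

        import numpy as np

        def online_offline(rt_blocks):
            """rt_blocks: list of 1-D arrays of reaction times, one per learning block.
            Online learning  = RT improvement from start to end within a block.
            Offline learning = RT improvement from the end of one block to the start
            of the next (across the break)."""
            online, offline = [], []
            for i, block in enumerate(rt_blocks):
                online.append(block[:10].mean() - block[-10:].mean())   # positive = faster
                if i + 1 < len(rt_blocks):
                    offline.append(block[-10:].mean() - rt_blocks[i + 1][:10].mean())
            return np.array(online), np.array(offline)

        rng = np.random.default_rng(3)
        blocks = [rng.normal(500 - 15 * b, 20, 100) for b in range(6)]  # toy RT data (ms)
        on, off = online_offline(blocks)
        print("online:", on.round(1), "offline:", off.round(1))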

  12. The dialectical thinking about deterministic and probabilistic safety analysis

    International Nuclear Information System (INIS)

    Qian Yongbai; Tong Jiejuan; Zhang Zuoyi; He Xuhong

    2005-01-01

    There are two methods for designing and analysing the safety performance of a nuclear power plant: the traditional deterministic method and the probabilistic method. To date, the design of nuclear power plants has been based on the deterministic method, and it has been proved in practice that the deterministic method is effective for current nuclear power plants. However, the probabilistic method (Probabilistic Safety Assessment - PSA) considers a much wider range of faults, takes an integrated look at the plant as a whole, and uses realistic criteria for the performance of the plant's systems and structures. PSA can be seen, in principle, to provide a broader and more realistic perspective on safety issues than the deterministic approaches. In this paper, the historical origins and development trends of the two methods are reviewed and summarized in brief. Based on the discussion of two application cases - changes to specific design provisions of the general design criteria (GDC), and the risk-informed categorization of structures, systems and components - it can be concluded that the deterministic and probabilistic methods are dialectical and unified, that they are gradually being merged into each other, and that they are being used in coordination. (authors)

  13. Probabilistic finite elements for fracture and fatigue analysis

    Science.gov (United States)

    Liu, W. K.; Belytschko, T.; Lawrence, M.; Besterfield, G. H.

    1989-01-01

    The fusion of the probabilistic finite element method (PFEM) and reliability analysis for probabilistic fracture mechanics (PFM) is presented. A comprehensive method for determining the probability of fatigue failure for curved crack growth was developed. The failure criterion, or performance function, is stated as: the fatigue life of a component must exceed its service life; otherwise failure will occur. An enriched element that has the near-crack-tip singular strain field embedded in it is used to formulate the equilibrium equation and solve for the stress intensity factors at the crack tip. The performance and accuracy of the method are demonstrated on a classical mode I fatigue problem.
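
    The stated performance function lends itself to a simple Monte Carlo illustration: sample fatigue life and service life, and count the fraction of realisations in which the fatigue life falls short. The sketch below is only a hedged illustration of that criterion with invented lognormal distributions; it does not reproduce the PFEM, the enriched crack-tip elements, or the reliability algorithms of the paper.

        import numpy as np

        rng = np.random.default_rng(11)
        n = 500_000

        # Illustrative distributions for fatigue life and required service life (cycles).
        fatigue_life = rng.lognormal(mean=np.log(2.0e6), sigma=0.35, size=n)
        service_life = rng.lognormal(mean=np.log(1.0e6), sigma=0.10, size=n)

        # Performance function: g > 0 means the component survives its service life.
        g = fatigue_life - service_life
        p_failure = np.mean(g <= 0.0)
        print(f"Estimated probability of fatigue failure: {p_failure:.3e}")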

  14. PRECIS -- A probabilistic risk assessment system

    International Nuclear Information System (INIS)

    Peterson, D.M.; Knowlton, R.G. Jr.

    1996-01-01

    A series of computer tools has been developed to conduct the exposure assessment and risk characterization phases of human health risk assessments within a probabilistic framework. The tools are collectively referred to as the Probabilistic Risk Evaluation and Characterization Investigation System (PRECIS). With this system, a risk assessor can calculate the doses and risks associated with multiple environmental and exposure pathways, for both chemicals and radioactive contaminants. Exposure assessment models in the system account for transport of contaminants to receptor points from a source zone originating in unsaturated soils above the water table. In addition to performing calculations of dose and risk based on initial concentrations, PRECIS can also be used in an inverse manner to compute soil concentrations in the source area that must not be exceeded if prescribed limits on dose or risk are to be met. Such soil contaminant levels, referred to as soil guidelines, are computed for both single contaminants and chemical mixtures and can be used as action levels or cleanup levels. Probabilistic estimates of risk, dose and soil guidelines are derived using Monte Carlo techniques
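
    The forward/inverse use of such a system can be illustrated with a toy Python calculation: a forward dose model with an uncertain, Monte Carlo-sampled pathway factor, inverted to find the soil concentration whose 95th-percentile dose stays below a prescribed limit (a "soil guideline" in the sense described above). All numbers, the single lumped pathway factor, and the percentile criterion are illustrative assumptions, not PRECIS models or regulatory values.

        import numpy as np

        rng = np.random.default_rng(5)
        n = 100_000

        # Forward model (illustrative): dose = soil concentration x pathway factor,
        # where the pathway factor lumps transport, exposure and dose conversion.
        pathway_factor = rng.lognormal(mean=np.log(2.0e-3), sigma=0.5, size=n)  # mSv/yr per mg/kg

        def dose(c_soil):
            return c_soil * pathway_factor             # mSv/yr, one value per realisation

        # Inverse use: largest soil concentration whose 95th-percentile dose stays
        # below a prescribed limit.
        dose_limit = 0.25                              # mSv/yr (illustrative)
        guideline = dose_limit / np.percentile(pathway_factor, 95)
        print(f"Soil guideline ~ {guideline:.1f} mg/kg")
        print(f"Check: 95th-percentile dose at guideline = {np.percentile(dose(guideline), 95):.3f} mSv/yr")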

  15. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, Warren C.; Scherbov, Sergei; O'Neill, Brian C.; Lutz, Wolfgang

    2004-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because...

  16. Probabilistic Output Analysis by Program Manipulation

    DEFF Research Database (Denmark)

    Rosendahl, Mads; Kirkeby, Maja Hanne

    2015-01-01

    The aim of a probabilistic output analysis is to derive a probability distribution of possible output values for a program from a probability distribution of its input. We present a method for performing static output analysis, based on program transformation techniques. It generates a probability...

  17. ISSUES ASSOCIATED WITH PROBABILISTIC FAILURE MODELING OF DIGITAL SYSTEMS

    International Nuclear Information System (INIS)

    CHU, T.L.; MARTINEZ-GURIDI, G.; LIHNER, J.; OVERLAND, D.

    2004-01-01

    The current U.S. Nuclear Regulatory Commission (NRC) licensing process of instrumentation and control (I and C) systems is based on deterministic requirements, e.g., single failure criteria, and defense in depth and diversity. Probabilistic considerations can be used as supplements to the deterministic process. The National Research Council has recommended development of methods for estimating failure probabilities of digital systems, including commercial off-the-shelf (COTS) equipment, for use in probabilistic risk assessment (PRA). NRC staff has developed informal qualitative and quantitative requirements for PRA modeling of digital systems. Brookhaven National Laboratory (BNL) has performed a review of the state of the art of the methods and tools that can potentially be used to model digital systems. The objectives of this paper are to summarize the review, discuss the issues associated with probabilistic modeling of digital systems, and identify potential areas of research that would enhance the state of the art toward a satisfactory modeling method that could be integrated with a typical probabilistic risk assessment

  18. Probabilistic approaches to life prediction of nuclear plant structural components

    International Nuclear Information System (INIS)

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel). (authors)

  19. Probabilistic approaches to life prediction of nuclear plant structural components

    International Nuclear Information System (INIS)

    Villain, B.; Pitner, P.; Procaccia, H.

    1996-01-01

    In the last decade there has been an increasing interest at EDF in developing and applying probabilistic methods for a variety of purposes. In the field of structural integrity and reliability they are used to evaluate the effect of deterioration due to aging mechanisms, mainly on major passive structural components such as steam generators, pressure vessels and piping in nuclear plants. Because there can be numerous uncertainties involved in an assessment of the performance of these structural components, probabilistic methods provide an attractive alternative or supplement to more conventional deterministic methods. The benefits of a probabilistic approach are the clear treatment of uncertainty and the possibility to perform sensitivity studies from which it is possible to identify and quantify the effect of key factors and mitigative actions. They thus provide information to support effective decisions to optimize In-Service Inspection planning and maintenance strategies and for realistic lifetime prediction or reassessment. The purpose of the paper is to discuss and illustrate the methods available at EDF for probabilistic component life prediction. This includes a presentation of software tools in classical, Bayesian and structural reliability, and an application on two case studies (steam generator tube bundle, reactor pressure vessel)

  20. Evaluating bacterial gene-finding HMM structures as probabilistic logic programs.

    Science.gov (United States)

    Mørk, Søren; Holmes, Ian

    2012-03-01

    Probabilistic logic programming offers a powerful way to describe and evaluate structured statistical models. To investigate the practicality of probabilistic logic programming for structure learning in bioinformatics, we undertook a simplified bacterial gene-finding benchmark in PRISM, a probabilistic dialect of Prolog. We evaluate Hidden Markov Model structures for bacterial protein-coding gene potential, including a simple null model structure, three structures based on existing bacterial gene finders and two novel model structures. We test standard versions as well as ADPH length modeling and three-state versions of the five model structures. The models are all represented as probabilistic logic programs and evaluated using the PRISM machine learning system in terms of statistical information criteria and gene-finding prediction accuracy, in two bacterial genomes. Neither of our implementations of the two most widely used model structures is the best performing in terms of statistical information criteria or prediction performance, suggesting that better-fitting models might be achievable. The source code of all PRISM models, data and additional scripts are freely available for download at: http://github.com/somork/codonhmm. Supplementary data are available at Bioinformatics online.
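
    For orientation, the toy Python fragment below scores a DNA string under a two-state (coding / non-coding) hidden Markov model with the forward algorithm. It is a hedged sketch of the general HMM idea only: the states, transition and emission probabilities are invented, and the paper's actual models are expressed as PRISM logic programs with codon-level structure and ADPH length modeling, none of which is reproduced here.

        import numpy as np

        states = ["noncoding", "coding"]
        trans = np.array([[0.95, 0.05],     # P(next state | noncoding)
                          [0.10, 0.90]])    # P(next state | coding)
        start = np.array([0.9, 0.1])

        # Illustrative emission probabilities over nucleotides (coding regions GC-richer).
        emit = {"noncoding": {"A": 0.3, "C": 0.2, "G": 0.2, "T": 0.3},
                "coding":    {"A": 0.2, "C": 0.3, "G": 0.3, "T": 0.2}}

        def log_likelihood(seq):
            """Forward algorithm in probability space (adequate for short sequences)."""
            f = start * np.array([emit[s][seq[0]] for s in states])
            for ch in seq[1:]:
                e = np.array([emit[s][ch] for s in states])
                f = (f @ trans) * e
            return np.log(f.sum())

        print(log_likelihood("ATGGCGCGCATTAA"))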

  1. Probabilistic seismic hazard assessment. Gentilly 2

    International Nuclear Information System (INIS)

    1996-03-01

    Results of this probabilistic seismic hazard assessment were determined using a suite of conservative assumptions. The intent of this study was to perform a limited hazard assessment that incorporated a range of technically defensible input parameters. To best achieve this goal, the input selected for the hazard assessment tended to be conservative with respect to the selection of attenuation models and seismicity parameters. Seismic hazard estimates at Gentilly 2 were most affected by the selection of the attenuation model. Alternative definitions of seismic source zones had a relatively small impact on seismic hazard. A St. Lawrence Rift model including a maximum magnitude of 7.2 mb in the zone containing the site had little effect on the hazard estimate relative to other seismic source zonation models. Mean annual probabilities of exceeding the design peak ground acceleration and the design response spectrum for the Gentilly 2 site were computed to lie in the range of 0.001 to 0.0001. This hazard result falls well within the range determined to be acceptable for nuclear reactor sites located throughout the eastern United States. (author) 34 refs., 6 tabs., 28 figs

  2. Performance on a probabilistic inference task in healthy subjects receiving ketamine compared with patients with schizophrenia

    Science.gov (United States)

    Almahdi, Basil; Sultan, Pervez; Sohanpal, Imrat; Brandner, Brigitta; Collier, Tracey; Shergill, Sukhi S; Cregg, Roman; Averbeck, Bruno B

    2012-01-01

    Evidence suggests that some aspects of schizophrenia can be induced in healthy volunteers through acute administration of the non-competitive NMDA-receptor antagonist, ketamine. In probabilistic inference tasks, patients with schizophrenia have been shown to ‘jump to conclusions’ (JTC) when asked to make a decision. We aimed to test whether healthy participants receiving ketamine would adopt a JTC response pattern resembling that of patients. The paradigmatic task used to investigate JTC has been the ‘urn’ task, where participants are shown a sequence of beads drawn from one of two ‘urns’, each containing coloured beads in different proportions. Participants make a decision when they think they know the urn from which beads are being drawn. We compared performance on the urn task between controls receiving acute ketamine or placebo with that of patients with schizophrenia and another group of controls matched to the patient group. Patients were shown to exhibit a JTC response pattern relative to their matched controls, whereas JTC was not evident in controls receiving ketamine relative to placebo. Ketamine does not appear to promote JTC in healthy controls, suggesting that ketamine does not affect probabilistic inferences. PMID:22389244

  3. Striving for Excellence Sometimes Hinders High Achievers: Performance-Approach Goals Deplete Arithmetical Performance in Students with High Working Memory Capacity

    Science.gov (United States)

    Crouzevialle, Marie; Smeding, Annique; Butera, Fabrizio

    2015-01-01

    We tested whether the goal to attain normative superiority over other students, referred to as performance-approach goals, is particularly distractive for high-Working Memory Capacity (WMC) students—that is, those who are used to being high achievers. Indeed, WMC is positively related to high-order cognitive performance and academic success, a record of success that confers benefits on high-WMC as compared to low-WMC students. We tested whether such benefits may turn out to be a burden under performance-approach goal pursuit. Indeed, for high achievers, aiming to rise above others may represent an opportunity to reaffirm their positive status—a stake susceptible to trigger disruptive outcome concerns that interfere with task processing. Results revealed that with performance-approach goals—as compared to goals with no emphasis on social comparison—the higher the students’ WMC, the lower their performance at a complex arithmetic task (Experiment 1). Crucially, this pattern appeared to be driven by uncertainty regarding the chances to outclass others (Experiment 2). Moreover, an accessibility measure suggested the mediational role played by status-related concerns in the observed disruption of performance. We discuss why high-stake situations can paradoxically lead high-achievers to sub-optimally perform when high-order cognitive performance is at play. PMID:26407097

  4. [Perceptions of classroom goal structures, personal achievement goal orientations, and learning strategies].

    Science.gov (United States)

    Miki, Kaori; Yamauchi, Hirotsugu

    2005-08-01

    We examined the relations among students' perceptions of classroom goal structures (mastery and performance goal structures), students' achievement goal orientations (mastery, performance, and work-avoidance goals), and learning strategies (deep processing, surface processing and self-handicapping strategies). Participants were 323 5th and 6th grade students in elementary schools. The results from structural equation modeling indicated that perceptions of classroom mastery goal structures were associated with students' mastery goal orientations, which were in turn related positively to the deep processing strategies and academic achievement. Perceptions of classroom performance goal structures were associated with work-avoidance goal orientations, which were positively related to the surface processing and self-handicapping strategies. Both types of goal structures had a positive relation with students' performance goal orientations, which had significant positive effects on academic achievement. The results of this study suggest that elementary school students' perceptions of mastery goal structures are related to adaptive patterns of learning more than perceptions of performance goal structures are. The role of perceptions of classroom goal structure in promoting students' goal orientations and learning strategies is discussed.

  5. Comparative study of probabilistic methodologies for small signal stability assessment

    Energy Technology Data Exchange (ETDEWEB)

    Rueda, J.L.; Colome, D.G. [Universidad Nacional de San Juan (IEE-UNSJ), San Juan (Argentina). Inst. de Energia Electrica], Emails: joseluisrt@iee.unsj.edu.ar, colome@iee.unsj.edu.ar

    2009-07-01

    Traditional deterministic approaches for small-signal stability assessment (SSSA) are unable to properly reflect the existing uncertainties in real power systems. Hence, the probabilistic analysis of small-signal stability (SSS) is attracting more attention from power system engineers. This paper discusses and compares two probabilistic methodologies for SSSA, based on the two-point estimation method and the Monte Carlo method, respectively. The comparisons are based on the results obtained for several power systems of different sizes and with different SSS performance. It is demonstrated that although an analytical approach can reduce the amount of computation in probabilistic SSSA, the different degrees of approximation it adopts can lead to misleading results. Conversely, Monte Carlo-based probabilistic SSSA can be carried out with reasonable computational effort while maintaining satisfactory estimation precision. (author)
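
    The basic contrast between the two approaches can be shown on a toy problem: propagate one uncertain input through a nonlinear response function with (a) the two concentration points of a two-point estimate (for a symmetric input these reduce to the mean plus or minus one standard deviation) and (b) a large Monte Carlo sample. The damping-ratio function and load distribution below are invented for illustration and are not a power-system model from the paper.

        import numpy as np

        # Toy "small-signal" response: damping ratio as a nonlinear function of one
        # uncertain load level (purely illustrative, not a power-system model).
        def damping_ratio(load):
            return 0.08 - 0.00002 * (load - 800.0) ** 2 / 100.0

        mu, sigma = 900.0, 50.0     # uncertain load (MW)

        # Two-point estimate (symmetric input: concentration points at mu +/- sigma).
        vals = damping_ratio(np.array([mu - sigma, mu + sigma]))
        print(f"2PE : mean={vals.mean():.5f}  std={vals.std():.5f}")

        # Monte Carlo reference.
        rng = np.random.default_rng(0)
        mc = damping_ratio(rng.normal(mu, sigma, 200_000))
        print(f"MC  : mean={mc.mean():.5f}  std={mc.std():.5f}")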

  6. Conditional Probabilistic Population Forecasting

    OpenAIRE

    Sanderson, W.C.; Scherbov, S.; O'Neill, B.C.; Lutz, W.

    2003-01-01

    Since policy makers often prefer to think in terms of scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy makers because it allows them to answer "what if"...

  7. Conditional probabilistic population forecasting

    OpenAIRE

    Sanderson, Warren; Scherbov, Sergei; O'Neill, Brian; Lutz, Wolfgang

    2003-01-01

    Since policy-makers often prefer to think in terms of alternative scenarios, the question has arisen as to whether it is possible to make conditional population forecasts in a probabilistic context. This paper shows that it is both possible and useful to make these forecasts. We do this with two different kinds of examples. The first is the probabilistic analog of deterministic scenario analysis. Conditional probabilistic scenario analysis is essential for policy-makers because it allows them...

  8. Students’ difficulties in probabilistic problem-solving

    Science.gov (United States)

    Arum, D. P.; Kusmayadi, T. A.; Pramudya, I.

    2018-03-01

    Many errors can be identified when students solve mathematics problems, particularly probabilistic problems. The present study aims to investigate students' difficulties in solving probabilistic problems, focusing on analyzing and describing students' errors during problem solving. The research used a qualitative method with a case study strategy. The subjects were ten 9th-grade students selected by purposive sampling. The data consist of the students' probabilistic problem-solving results and recorded interviews regarding their difficulties in solving the problems; the data were analyzed descriptively using Miles and Huberman's steps. The results show that students' difficulties in solving probabilistic problems fall into three categories: first, difficulties in understanding the probabilistic problem; second, difficulties in choosing and using appropriate strategies for solving the problem; and third, difficulties with the computational process. It appears that students are not yet able to apply their knowledge and ability to probabilistic problems. It is therefore important for mathematics teachers to plan probabilistic learning that optimizes students' probabilistic thinking ability.

  9. Engineering performance indicators in support of corporate goals and objectives

    International Nuclear Information System (INIS)

    Prawlocki, F.C.; Holland, M.B.

    1992-01-01

    In the late 1980s, a new factor was introduced into the equation of rate making: competition. Prior to this time, most utilities only had to prove to the state public service commission (PSC) that a rate increase was justified. Even this had become more difficult in recent years as PSCs implemented prudency audits as a means of determining the efficiency of utility management. Recently, however, the need for performance improvement has been initiated internally by utility management because of the advent of competition in the utility environment and state PSC inquiries. In 1991, TVA began to realign its traditional program of performance indicators to agree with industry standards and provide more extensive indicators of positive and negative trends in performance. The INPO Guideline 88-016, Guidelines for the Conduct of Design Engineering, was used as the basis for most indicators. In addition, indicators were added to highlight specific corporate objectives, problems, or regulatory commitments. The indicators are being initiated in three phases as efficient sources of performance data are identified. Once the current baseline was established, a review was made of the best utilities in the country based on the US Nuclear Regulatory Commission's systematic assessment of licensee's performance and INPO performance indicators to establish performance goals. As total quality management and cycle time reduction programs are implemented, all of the organization's annual goals and objectives are expected to more closely reflect the best of the industry

  10. A conceptual model of nurses' goal orientation, service behavior, and service performance.

    Science.gov (United States)

    Chien, Chun-Cheng; Chou, Hsin-Kai; Hung, Shuo-Tsung

    2008-01-01

    Based on the conceptual framework known as the "service triangle," the authors constructed a model of nurses' goal orientation, service behavior, and service performance to investigate the antecedents and consequences of the medical service behavior provided by nurses. This cross-sectional study collected data from 127 nurses in six hospitals using a mail-in questionnaire. Analysis of the model revealed that the customer-oriented behavior of nurses had a positive influence on organizational citizenship behavior; and both of these behaviors had a significant positive influence on service performance. The results also indicate that a higher learning goal orientation among nurses was associated with the performance of both observable customer-oriented behavior and organizational-citizenship behavior.

  11. A General Framework for Probabilistic Characterizing Formulae

    DEFF Research Database (Denmark)

    Sack, Joshua; Zhang, Lijun

    2012-01-01

    Recently, a general framework on characteristic formulae was proposed by Aceto et al. It offers a simple theory that allows one to easily obtain characteristic formulae of many non-probabilistic behavioral relations. Our paper studies their techniques in a probabilistic setting. We provide … a general method for determining characteristic formulae of behavioral relations for probabilistic automata using fixed-point probability logics. We consider such behavioral relations as simulations and bisimulations, probabilistic bisimulations, probabilistic weak simulations, and probabilistic forward…

  12. Probabilistic risk assessment as an aid to risk management

    International Nuclear Information System (INIS)

    Garrick, B.J.

    1982-01-01

    Probabilistic risk assessments are providing important insights into nuclear power plant safety. Their value is two-fold: first as a means of quantifying nuclear plant risk including contributors to risk, and second as an aid to risk management. A risk assessment provides an analytical plant model that can be the basis for performing meaningful decision analyses for controlling safety. It is the aspect of quantitative risk management that makes probabilistic risk assessment an important technical discipline of the future

  13. Relations between Classroom Goal Structures and Students' Goal Orientations in Mathematics Classes: When Is a Mastery Goal Structure Adaptive?

    Science.gov (United States)

    Skaalvik, Einar M.; Federici, Roger A.

    2016-01-01

    The purpose of this study was to test possible interactions between mastery and performance goal structures in mathematics classrooms when predicting students' goal orientations. More specifically, we tested if the degree of performance goal structure moderated the associations between mastery goal structure and students' goal orientations.…

  14. Fronto-parietal coding of goal-directed actions performed by artificial agents.

    Science.gov (United States)

    Kupferberg, Aleksandra; Iacoboni, Marco; Flanagin, Virginia; Huber, Markus; Kasparbauer, Anna; Baumgartner, Thomas; Hasler, Gregor; Schmidt, Florian; Borst, Christoph; Glasauer, Stefan

    2018-03-01

    With advances in technology, artificial agents such as humanoid robots will soon become a part of our daily lives. For safe and intuitive collaboration, it is important to understand the goals behind their motor actions. In humans, this process is mediated by changes in activity in fronto-parietal brain areas. The extent to which these areas are activated when observing artificial agents indicates the naturalness and easiness of interaction. Previous studies indicated that fronto-parietal activity does not depend on whether the agent is human or artificial. However, it is unknown whether this activity is modulated by observing grasping (self-related action) and pointing actions (other-related action) performed by an artificial agent depending on the action goal. Therefore, we designed an experiment in which subjects observed human and artificial agents perform pointing and grasping actions aimed at two different object categories suggesting different goals. We found a signal increase in the bilateral inferior parietal lobule and the premotor cortex when tool versus food items were pointed to or grasped by both agents, probably reflecting the association of hand actions with the functional use of tools. Our results show that goal attribution engages the fronto-parietal network not only for observing a human but also a robotic agent for both self-related and social actions. The debriefing after the experiment has shown that actions of human-like artificial agents can be perceived as being goal-directed. Therefore, humans will be able to interact with service robots intuitively in various domains such as education, healthcare, public service, and entertainment. © 2017 Wiley Periodicals, Inc.

  15. Fast probabilistic file fingerprinting for big data.

    Science.gov (United States)

    Tretyakov, Konstantin; Laur, Sven; Smant, Geert; Vilo, Jaak; Prins, Pjotr

    2013-01-01

    Biological data acquisition is raising new challenges, both in data analysis and handling. Not only is it proving hard to analyze the data at the rate it is generated today, but simply reading and transferring data files can be prohibitively slow due to their size. This primarily concerns logistics within and between data centers, but is also important for workstation users in the analysis phase. Common usage patterns, such as comparing and transferring files, are proving computationally expensive and are tying down shared resources. We present an efficient method for calculating file uniqueness for large scientific data files that takes less computational effort than existing techniques. This method, called Probabilistic Fast File Fingerprinting (PFFF), exploits the variation present in biological data and computes file fingerprints by sampling randomly from the file instead of reading it in full. Consequently, it has a flat performance characteristic, correlated with data variation rather than file size. We demonstrate that probabilistic fingerprinting can be as reliable as existing hashing techniques, with provably negligible risk of collisions. We measure the performance of the algorithm on a number of data storage and access technologies, identifying its strengths as well as limitations. Probabilistic fingerprinting may significantly reduce the use of computational resources when comparing very large files. Utilisation of probabilistic fingerprinting techniques can increase the speed of common file-related workflows, both in the data center and for workbench analysis. The implementation of the algorithm is available as an open-source tool named pfff, as a command-line tool as well as a C library. The tool can be downloaded from http://biit.cs.ut.ee/pfff.
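
    The sampling idea behind PFFF can be sketched in a few lines of Python (this is not the pfff tool itself): hash a fixed number of pseudo-randomly chosen blocks of the file, with positions derived deterministically from the file size and a seed, so that the cost stays roughly constant regardless of file size. The block count, block size, and use of SHA-256 are illustrative choices, not the parameters of the published algorithm.

        import hashlib
        import os
        import random

        def sampled_fingerprint(path, n_blocks=64, block_size=4096, seed=2013):
            """Fingerprint a file by hashing pseudo-randomly sampled blocks.

            Sample positions depend only on the file size and the seed, so the
            same file always yields the same fingerprint, while the amount of
            data read stays roughly constant regardless of file size.
            """
            size = os.path.getsize(path)
            h = hashlib.sha256()
            h.update(str(size).encode())        # the size itself is part of the fingerprint
            rng = random.Random(seed ^ size)    # deterministic sample positions
            offsets = sorted(rng.randrange(0, max(size - block_size, 1))
                             for _ in range(n_blocks))
            with open(path, "rb") as f:
                for off in offsets:
                    f.seek(off)
                    h.update(f.read(block_size))
            return h.hexdigest()

        # Example usage: print(sampled_fingerprint("huge_dataset.fastq"))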

  16. The Effects Of Leadership Styles On Goal Clarity And Fairness Mediated Used Performance Measure

    Directory of Open Access Journals (Sweden)

    Amris Rusli Tanjung

    2017-04-01

    This paper investigates the effects of superiors' performance evaluation behaviors on subordinates' work-related attitudes, as mediated by the performance measures used. We consider leadership style (initiating structure and consideration) and performance measure use (objective and subjective measures) in relation to managerial work-related attitudes (goal clarity and evaluation fairness). We test our hypotheses using survey data from 56 middle-level managers in 4 service organizations. The results from a structural equation model with PLS show that an initiating-structure leadership style has a significant effect on goal clarity, that the use of objective performance measures mediates the relationship between initiating structure and goal clarity, and that the use of subjective performance measures does not mediate the relationship between a consideration leadership style and fairness in evaluation. Consideration leadership behavior instead has only a direct impact on fairness in evaluation. These findings have important implications for management accounting research on superiors' use of performance measures and provide an explanation of some of the problematic findings in the literature.

  17. Evaluation of Beckman Coulter DxI 800 immunoassay system using clinically oriented performance goals.

    Science.gov (United States)

    Akbas, Neval; Schryver, Patricia G; Algeciras-Schimnich, Alicia; Baumann, Nikola A; Block, Darci R; Budd, Jeffrey R; Gaston, S J Stephen; Klee, George G

    2014-11-01

    We evaluated the analytical performance of 24 immunoassays using the Beckman Coulter DxI 800 immunoassay systems at Mayo Clinic, Rochester, MN for trueness, precision, detection limits, linearity, and consistency (across instruments and reagent lots). Clinically oriented performance goals were defined using the following methods: trueness - published desirable accuracy limits; precision - published desirable biologic variation; detection limits - 0.1 percentile of patient test values; linearity - 50% of total error; and consistency - percentage of test values crossing key decision points. Local data were collected for precision, linearity, and consistency. Data were provided by Beckman Coulter, Inc. for trueness and detection limits. All evaluated assays except total thyroxine were within the proposed goals for trueness. Most of the assays met the proposed goals for precision (86% of intra-assay results and 75% of inter-assay results). Five assays had more than 15% of the test results below the minimum detection limits. Carcinoembryonic antigen, total thyroxine and free triiodothyronine exceeded the proposed goals of ±6.3%, ±5% and ±5.7% for dilution linearity. All evaluated assays were within the proposed goals for instrument consistency. Lot-to-lot consistency results for cortisol, ferritin and total thyroxine exceeded the proposed goals of 3.3%, 11.4% and 7% at one medical decision level, while vitamin B12 exceeded the proposed goals of 5.2% and 3.8% at two decision levels. The Beckman Coulter DxI 800 immunoassay system meets most of these proposed goals, even though these clinically focused performance goals represent relatively stringent limits. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  18. Survey of probabilistic methods in safety and risk assessment for nuclear power plant licensing

    International Nuclear Information System (INIS)

    1984-04-01

    After an overview of the goals and general methods of probabilistic approaches in nuclear safety, the main features of probabilistic safety or risk assessment (PRA) methods are discussed. In most practical applications, a full-fledged PRA is not applied; rather, various levels of analysis are used, leading from unavailability assessment of systems, over the more complex analysis of the probable core damage stages, up to the assessment of the overall health effects on the total population from a certain practice. The various types of application are discussed in relation to their limitations and benefits for different stages of design or operation of nuclear power plants. This gives licensing staff guidance to judge the usefulness of the various methods for their licensing decisions. Examples of the application of probabilistic methods in several countries are given. Two appendices, on reliability analysis and on containment and consequence analysis, provide more details on these subjects. (author)

  19. Goal Orientations of General Chemistry Students via the Achievement Goal Framework

    Science.gov (United States)

    Lewis, Scott E.

    2018-01-01

    The Achievement Goal Framework describes students' goal orientations as: task-based, focusing on the successful completion of the task; self-based, evaluating performance relative to one's own past performance; or other-based, evaluating performance relative to the performance of others. Goal orientations have been used to explain student success…

  20. The Effects of Differential Goal Weights on the Performance of a Complex Financial Task.

    Science.gov (United States)

    Edmister, Robert O.; Locke, Edwin A.

    1987-01-01

    Determined whether people could obtain outcomes on a complex task that would be in line with differential goal weights corresponding to different aspects of the task. Bank lending officers were run through lender-simulation exercises. Five performance goals were weighted. Demonstrated effectiveness of goal setting with complex tasks, using group…

  1. Development of probabilistic fast reactor fuel design method

    International Nuclear Information System (INIS)

    Ozawa, Takayuki

    1997-01-01

    Under the current method of evaluating fuel robustness in FBR fuel rod design, a variety of uncertain quantities including fuel production tolerance and power density are estimated conservatively. In the future, in order to proceed with improvements in the FBR core's performance and optimize the fuel's specifications, a rationalization of fuel design tolerance is required. Among the measures aimed at realizing this rationalization, the introduction of a probabilistic fast reactor fuel design method is currently under consideration. I have developed a probabilistic fast reactor fuel design code named BORNFREE, in order to make use of this method in FBR fuel design. At the same time, I have carried out a trial calculation of the cladding stress using this code and made a study and an evaluation of the possibility of employing tolerance rationalization in fuel rod design. In this paper, I provide an outline description of BORNFREE and report the results of the above study and evaluation. After performing cladding stress trial calculations using the probabilistic method, I was able to confirm that this method promises more rational design evaluation results than the conventional deterministic method. (author)

  2. Probabilistic Logic and Probabilistic Networks

    NARCIS (Netherlands)

    Haenni, R.; Romeijn, J.-W.; Wheeler, G.; Williamson, J.

    2009-01-01

    While in principle probabilistic logics might be applied to solve a range of problems, in practice they are rarely applied at present. This is perhaps because they seem disparate, complicated, and computationally intractable. However, we shall argue in this programmatic paper that several approaches

  3. Goal striving strategies and effort mobilization: When implementation intentions reduce effort-related cardiac activity during task performance.

    Science.gov (United States)

    Freydefont, Laure; Gollwitzer, Peter M; Oettingen, Gabriele

    2016-09-01

    Two experiments investigate the influence of goal and implementation intentions on effort mobilization during task performance. Although numerous studies have demonstrated the beneficial effects of setting goals and making plans on performance, the effects of goals and plans on effort-related cardiac activity and especially the cardiac preejection period (PEP) during goal striving have not yet been addressed. According to the Motivational Intensity Theory, participants should increase effort mobilization proportionally to task difficulty as long as success is possible and justified. Forming goals and making plans should allow for reduced effort mobilization when participants perform an easy task. However, when the task is difficult, goals and plans should differ in their effect on effort mobilization. Participants who set goals should disengage, whereas participants who made if-then plans should stay in the field showing high effort mobilization during task performance. As expected, using an easy task in Experiment 1, we observed a lower cardiac PEP in both the implementation intention and the goal intention condition than in the control condition. In Experiment 2, we varied task difficulty and demonstrated that while participants with a mere goal intention disengaged from difficult tasks, participants with an implementation intention increased effort mobilization proportionally with task difficulty. These findings demonstrate the influence of goal striving strategies (i.e., mere goals vs. if-then plans) on effort mobilization during task performance. Copyright © 2016 Elsevier B.V. All rights reserved.

  4. Multinational Design Evaluation Programme (MDEP) - Safety Goals

    International Nuclear Information System (INIS)

    Vaughan, G.J.

    2011-01-01

    One of the aims of the NEA's Multinational Design Evaluation Programme (MDEP) is to work towards greater harmonisation of regulatory requirements. To achieve this aim, it is necessary that there is a degree of convergence on the safety goals that are required to be met by designers and operators. The term 'safety goals' is defined to cover all health and safety requirements which must be met: these may be deterministic rules and/or probabilistic targets. They should cover the safety of workers, the public and the environment in line with the IAEA's Basic Safety Objective, encompassing safety in normal operation through to severe accidents. MDEP is also interested in how its work can be extended to future reactors, which may use significantly different technology to the almost ubiquitous LWRs used today and in the next generation, building on the close co-operation within MDEP between the regulators who are currently engaged in constructing or carrying out design reviews on new designs. For two designs this work has involved several regulators sharing their safety assessments and in some cases issuing statements on issues that need to be addressed. Work is also progressing towards joint regulatory position statements on specific assessment areas. Harmonisation of safety goals will enhance the cooperation between regulators as further developments in design and technology occur. All regulators have safety goals, but these are expressed in many different ways and exercises in comparing them frequently are done at a very low level, e.g., specific temperatures in the reactor vessel of a specific reactor type. The differences in the requirements from different regulators are difficult to resolve as the goals are derived using different principles and assumptions and are often for a specific technology. Therefore a different approach is being investigated, starting with the top-level safety goals and trying to derive a structure and means of deriving lower tier

  5. Shining lights and bad apples : The effect of goal setting on group performance

    NARCIS (Netherlands)

    Curseu, P.L.; Janssen, S.E.A.; Meeus, M.T.H.

    2014-01-01

    Management education programs increasingly use group work as a tool for developing teamwork knowledge and skills. A critical factor identified in prior research to influence group performance in student groups is goal-setting. We test in a sample of 37 groups the effect of group goal configurations

  6. Probabilistic structural analysis using a general purpose finite element program

    Science.gov (United States)

    Riha, D. S.; Millwater, H. R.; Thacker, B. H.

    1992-07-01

    This paper presents an accurate and efficient method to predict the probabilistic response for structural response quantities, such as stress, displacement, natural frequencies, and buckling loads, by combining the capabilities of MSC/NASTRAN, including design sensitivity analysis and fast probability integration. Two probabilistic structural analysis examples have been performed and verified by comparison with Monte Carlo simulation of the analytical solution. The first example consists of a cantilevered plate with several point loads. The second example is a probabilistic buckling analysis of a simply supported composite plate under in-plane loading. The coupling of MSC/NASTRAN and fast probability integration is shown to be orders of magnitude more efficient than Monte Carlo simulation with excellent accuracy.
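
    The Monte Carlo verification described in this record can be mimicked for a much simpler structure than the NASTRAN examples. The Python sketch below estimates a failure probability for a cantilever beam with a random tip load and a random yield strength; the geometry and the distribution parameters are made up purely for illustration and do not correspond to the plate examples in the paper.

        import numpy as np

        rng = np.random.default_rng(42)
        n = 200_000

        # Illustrative cantilever beam: bending stress at the root, sigma = P*L*c/I.
        L, c, I = 1.0, 0.05, 2.0e-6           # length [m], half-depth [m], second moment of area [m^4]
        P = rng.normal(5_000.0, 500.0, n)     # random tip load [N]
        S_y = rng.normal(160e6, 15e6, n)      # random yield strength [Pa]

        stress = P * L * c / I
        p_fail = np.mean(stress > S_y)        # probability that stress exceeds strength

        print(f"mean stress = {stress.mean() / 1e6:.1f} MPa")
        print(f"estimated failure probability = {p_fail:.2e}")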

  7. The international probabilistic system assessment group. Background and results 1990

    International Nuclear Information System (INIS)

    1991-01-01

    The OECD Nuclear Energy Agency (NEA) devotes considerable effort to the further development of methodologies to assess the performance of radioactive waste disposal systems, and to increase confidence in their application and results. The NEA provides an international forum for the exchange of information and experience among national experts of its twenty-three Member countries and conducts joint studies of issues important for safety assessment. In 1985, the NEA Radioactive Waste Management Committee set up the Probabilistic System Assessment Code User Group (PSAC), in order to help coordinate the development of probabilistic system assessment codes. The activities of the Group include exchange of information, code and experience, discussion of relevant technical issues, and the conduct of code comparison (PSACOIN) exercises designed to build confidence in the correct operation of these tools for safety assessment. The Group is now known simply as the Probabilistic System Assessment Group (PSAG). This report has been prepared to inform interested parties, beyond the group of specialists directly involved, about probabilistic system assessment techniques as used for performance assessment of waste disposal systems, and to give a summary of the objectives and achievements of PSAG. The report is published under the responsibility of the Secretary General of the OECD

  8. Applications of probabilistic techniques at NRC

    International Nuclear Information System (INIS)

    Thadani, A.; Rowsome, F.; Speis, T.

    1984-01-01

    The NRC is currently making extensive use of probabilistic safety assessment in reactor regulation. Most of these applications have been introduced in the regulatory activities in the past few years. Plant Probabilistic Safety Studies are being utilized as a design tool for applications for standard designs and for assessment of plants located in regions of particularly high population density. There is considerable motivation for licensees to perform plant-specific probabilistic studies for many, if not all, of the existing operating nuclear power plants as a tool for prioritizing the implementation of the many outstanding licensing actions of these plants as well as recommending the elimination of a number of these issues which are judged to be insignificant in terms of their contribution to safety and risk. Risk assessment perspectives are being used in the prioritization of generic safety issues, development of technical resolution of unresolved safety issues, assessing safety significance of proposed new regulatory requirements, assessment of safety significance of some of the occurrences at operating facilities and in environmental impact analyses of license applicants as required by the National Environmental Policy Act. (orig.)

  9. Predicting Examination Performance Using an Expanded Integrated Hierarchical Model of Test Emotions and Achievement Goals

    Science.gov (United States)

    Putwain, Dave; Deveney, Carolyn

    2009-01-01

    The aim of this study was to examine an expanded integrative hierarchical model of test emotions and achievement goal orientations in predicting the examination performance of undergraduate students. Achievement goals were theorised as mediating the relationship between test emotions and performance. 120 undergraduate students completed…

  10. CANDU type fuel behavior evaluation - a probabilistic approach

    International Nuclear Information System (INIS)

    Moscalu, D.R.; Horhoianu, G.; Popescu, I.A.; Olteanu, G.

    1995-01-01

    In order to realistically assess the behavior of the fuel elements during in-reactor operation, probabilistic methods have recently been introduced in the analysis of fuel performance. The present paper summarizes the achievements in this field at the Institute for Nuclear Research (INR), pointing out some advantages of the utilized method in the evaluation of CANDU type fuel behavior in steady state conditions. The Response Surface Method (RSM) has been selected for the investigation of the effects of the variability in fuel element computer code inputs on the code outputs (fuel element performance parameters). A new developed version of the probabilistic code APMESRA based on RSM is briefly presented. The examples of application include the analysis of the results of an in-reactor fuel element experiment and the investigation of the calculated performance parameter distribution for a new CANDU type extended burnup fuel element design. (author)
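
    A minimal sketch of the Response Surface Method mentioned in this record, in Python: a quadratic surrogate is fitted to a small factorial set of "code runs" (replaced here by a cheap stand-in function) and the surrogate is then sampled by Monte Carlo. The stand-in response, the design points, and the input statistics are invented for illustration; they are not APMESRA results or CANDU fuel data.

        import numpy as np

        rng = np.random.default_rng(1)

        def fuel_code(gap, power):
            # Stand-in for an expensive fuel-performance code run: a fictitious
            # peak fuel temperature [C] as a function of diametral gap [mm] and
            # linear power [kW/m] (illustrative only).
            return 900.0 + 1500.0 * power / 50.0 - 2000.0 * gap + 30.0 * gap * power

        # 1) Small factorial design of "code runs" around nominal conditions.
        gaps = np.linspace(0.05, 0.15, 4)
        powers = np.linspace(40.0, 60.0, 4)
        X = np.array([[g, p] for g in gaps for p in powers])
        y = fuel_code(X[:, 0], X[:, 1])

        # 2) Fit a quadratic response surface by least squares.
        def features(x):
            g, p = x[:, 0], x[:, 1]
            return np.column_stack([np.ones_like(g), g, p, g * p, g**2, p**2])

        coef, *_ = np.linalg.lstsq(features(X), y, rcond=None)

        # 3) Monte Carlo on the cheap surrogate instead of the expensive code.
        samples = np.column_stack([rng.normal(0.10, 0.02, 100_000),
                                   rng.normal(50.0, 3.0, 100_000)])
        temps = features(samples) @ coef
        print(f"mean = {temps.mean():.0f} C, 99th percentile = {np.percentile(temps, 99):.0f} C")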

  11. Multiple goals and time constraints: perceived impact on physicians' performance of evidence-based behaviours

    Directory of Open Access Journals (Sweden)

    Francis Jillian J

    2009-11-01

    Abstract. Background: Behavioural approaches to knowledge translation inform interventions to improve healthcare. However, such approaches often focus on a single behaviour without considering that health professionals perform multiple behaviours in pursuit of multiple goals in a given clinical context. In resource-limited consultations, performing these other goal-directed behaviours may influence optimal performance of a particular evidence-based behaviour. This study aimed to investigate whether a multiple goal-directed behaviour perspective might inform implementation research beyond single-behaviour approaches. Methods: We conducted theory-based semi-structured interviews with 12 general medical practitioners (GPs) in Scotland on their views regarding two focal clinical behaviours--providing physical activity (PA) advice and prescribing to reduce blood pressure (BP) to… Results: Most GPs reported strong intention to prescribe to reduce BP but expressed reasons why they would not. Intention to provide PA advice was variable. Most GPs reported that time constraints and patient preference detrimentally affected their control over providing PA advice and prescribing to reduce BP, respectively. Most GPs perceived many of their other goal-directed behaviours as interfering with providing PA advice, while fewer GPs reported goal-directed behaviours that interfere with prescribing to reduce BP. Providing PA advice and prescribing to reduce BP were perceived to be facilitated by similar diabetes-related behaviours (e.g., discussing cholesterol). While providing PA advice was perceived to be mainly facilitated by providing other lifestyle-related clinical advice (e.g., talking about weight), BP prescribing was reported as facilitated by pursuing ongoing standard consultation-related goals (e.g., clearly structuring the consultation). Conclusion: GPs readily relate their other goal-directed behaviours with having a facilitating and interfering influence on their

  12. PROBABILISTIC RELATIONAL MODELS OF COMPLETE IL-SEMIRINGS

    OpenAIRE

    Tsumagari, Norihiro

    2012-01-01

    This paper studies basic properties of probabilistic multirelations, which generalize the semantic domain of probabilistic systems, and then provides two probabilistic models of complete IL-semirings using probabilistic multirelations. It is also shown that these models need not be models of complete idempotent semirings.

  13. A probabilistic analysis of rapid boron dilution scenarios

    International Nuclear Information System (INIS)

    Kohut, P.; Diamond, D.J.

    1993-01-01

    A probabilistic and deterministic analysis of a rapid boron dilution scenario related to reactor restart was performed. The event is initiated by a loss of off-site power during the startup dilution process. The automatic restart of the charging pump in such cases may lead to the accumulation of a diluted slug of water in the lower plenum. The restart of the reactor coolant pumps may send the diluted slug through the core, adding sufficient reactivity to overcome the shutdown margin and cause a power excursion. The concern is that the power excursion is sufficient in certain circumstances to cause fuel damage. The estimated core damage frequency based on the scoping analysis is 1.0--3.0E-05/yr for the plants analyzed. These are relatively significant values when compared to desirable goals. The analysis contained assumptions related to plant specific design characteristics which may lead to non-conservative estimates. The most important conservative assumptions were that mixing of the injected diluted water is insignificant and that fuel damage occurs when the slug passes through the core
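
    The scoping estimate quoted in this record is, in essence, an initiating-event frequency multiplied by a chain of conditional probabilities. The Python sketch below shows that arithmetic with purely illustrative placeholder values; the event names and numbers are not the plant-specific figures of the study.

        # Illustrative scoping arithmetic for a core damage frequency (CDF) estimate:
        # CDF = (initiating event frequency) x (product of conditional probabilities).
        # All values below are placeholders, not values from the referenced analysis.

        f_loop_during_dilution = 1.0e-3   # loss of off-site power during startup dilution [/yr]
        p_charging_pump_restart = 0.5     # charging pump restarts and forms a diluted slug
        p_slug_not_mixed = 0.2            # slug reaches the lower plenum insufficiently mixed
        p_rcp_restart = 0.1               # reactor coolant pump restart sweeps the slug into the core
        p_damaging_excursion = 0.5        # power excursion severe enough to damage fuel

        cdf = (f_loop_during_dilution * p_charging_pump_restart *
               p_slug_not_mixed * p_rcp_restart * p_damaging_excursion)

        print(f"illustrative core damage frequency: {cdf:.1e} per reactor-year")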

  14. The Performance of Structure-Controller Coupled Systems Analysis Using Probabilistic Evaluation and Identification Model Approach

    Directory of Open Access Journals (Sweden)

    Mosbeh R. Kaloop

    2017-01-01

    This study evaluates the performance of a passively controlled steel frame building under dynamic loads using time series analysis. A novel application is utilized for the time and frequency domain evaluation to analyze the behavior of the controlling systems. In addition, autoregressive moving average (ARMA) neural networks are employed to identify the performance of the controller system. Three passive vibration control devices are utilized in this study, namely, the tuned mass damper (TMD), tuned liquid damper (TLD), and tuned liquid column damper (TLCD). The results show that the TMD control system is a more reliable controller than the TLD and TLCD systems in terms of vibration mitigation. The probabilistic evaluation and identification model showed that the probability analysis and the ARMA neural network model are suitable for evaluating and predicting the response of coupled building-controller systems.

  15. Probabilistic safety analysis vs probabilistic fracture mechanics -relation and necessary merging

    International Nuclear Information System (INIS)

    Nilsson, Fred

    1997-01-01

    A comparison is made between some general features of probabilistic fracture mechanics (PFM) and probabilistic safety assessment (PSA) in its standard form. We conclude that: the result of a PSA is a numerically expressed level of confidence in the system based on the state of current knowledge, and thus not an objective measure of risk; it is important to carefully define the precise nature of the probabilistic statement and relate it to a well-defined situation; standardisation of PFM methods is necessary; PFM seems to be the only way to obtain estimates of the pipe break probability; service statistics are of doubtful value because of scarcity of data and statistical inhomogeneity; and collection of service data should be directed towards the occurrence of growing cracks.

  16. Achievement Goals and their Underlying Goal Motivation: Does it Matter Why Sport Participants Pursue their Goals?

    Directory of Open Access Journals (Sweden)

    Patrick Gaudreau

    2016-07-01

    This study examined whether the good or bad outcomes associated with mastery-approach (MAP) and performance-approach (PAP) goals depend on the extent to which they are motivated by autonomous or controlled motivation. A sample of 515 undergraduate students who participated in sport completed measures of achievement goals, motivation of achievement goals, perceived goal attainment, sport satisfaction, and both positive and negative affect. Results of moderated regression analyses revealed that the positive relations of both MAP and PAP goals with perceived goal attainment were stronger for athletes pursuing these goals with a high level of autonomous goal motivation. Also, the positive relations between PAP goals and both sport satisfaction and positive affect were stronger at high levels of autonomous goal motivation and controlled goal motivation. The shape of all these significant interactions was consistent with tenets of Self-Determination Theory, as controlled goal motivation was negatively associated with positive affect and sport satisfaction and positively associated with negative affect. Overall, these findings demonstrate the importance of considering goal motivation in order to better understand the conditions under which achievement goals are associated with better experiential and performance outcomes in the lives of sport participants.

  17. Probabilistic approach to manipulator kinematics and dynamics

    International Nuclear Information System (INIS)

    Rao, S.S.; Bhatti, P.K.

    2001-01-01

    A high performance, high speed robotic arm must be able to manipulate objects with a high degree of accuracy and repeatability. As with any other physical system, there are a number of factors causing uncertainties in the behavior of a robotic manipulator. These factors include manufacturing and assembling tolerances, and errors in the joint actuators and controllers. In order to study the effect of these uncertainties on the robotic end-effector and to obtain a better insight into the manipulator behavior, the manipulator kinematics and dynamics are modeled using a probabilistic approach. Based on the probabilistic model, kinematic and dynamic performance criteria are defined to provide measures of the behavior of the robotic end-effector. Techniques are presented to compute the kinematic and dynamic reliabilities of the manipulator. The effects of tolerances associated with the various manipulator parameters on the reliabilities are studied. Numerical examples are presented to illustrate the procedures
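
    A toy version of the probabilistic kinematics idea in this record, assuming a two-link planar arm with normally distributed joint-angle and link-length errors; the tolerance values and the 1 mm accuracy requirement are invented for illustration and are not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 100_000

        # Nominal two-link planar arm.
        l1, l2 = 0.40, 0.30                          # link lengths [m]
        q1, q2 = np.deg2rad(30.0), np.deg2rad(45.0)  # nominal joint angles [rad]

        # Random manufacturing and actuation errors (illustrative tolerances).
        dl1 = rng.normal(0.0, 2e-4, n)               # link length errors [m]
        dl2 = rng.normal(0.0, 2e-4, n)
        dq1 = rng.normal(0.0, np.deg2rad(0.05), n)   # joint angle errors [rad]
        dq2 = rng.normal(0.0, np.deg2rad(0.05), n)

        def tip(l1, l2, q1, q2):
            # Forward kinematics of the planar arm: end-effector position.
            x = l1 * np.cos(q1) + l2 * np.cos(q1 + q2)
            y = l1 * np.sin(q1) + l2 * np.sin(q1 + q2)
            return x, y

        x0, y0 = tip(l1, l2, q1, q2)                          # nominal tip position
        x, y = tip(l1 + dl1, l2 + dl2, q1 + dq1, q2 + dq2)    # perturbed tip positions
        err = np.hypot(x - x0, y - y0)

        # Kinematic reliability: probability the tip stays within 1 mm of nominal.
        print(f"P(position error < 1 mm) = {np.mean(err < 1e-3):.3f}")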

  18. Does Extrinsic Goal Framing Enhance Extrinsic Goal-Oriented Individuals' Learning and Performance? An Experimental Test of the Match Perspective versus Self-Determination Theory

    Science.gov (United States)

    Vansteenkiste, Maarten; Timmermans, Tinneke; Lens, Willy; Soenens, Bart; Van den Broeck, Anja

    2008-01-01

    Previous work within self-determination theory has shown that experimentally framing a learning activity in terms of extrinsic rather than intrinsic goals results in poorer conceptual learning and performance, presumably because extrinsic goal framing detracts attention from the learning activity and is less directly satisfying of basic…

  19. Specifying design conservatism: Worst case versus probabilistic analysis

    Science.gov (United States)

    Miles, Ralph F., Jr.

    1993-01-01

    Design conservatism is the difference between specified and required performance, and is introduced when uncertainty is present. The classical approach of worst-case analysis for specifying design conservatism is presented, along with the modern approach of probabilistic analysis. The appropriate degree of design conservatism is a tradeoff between the required resources and the probability and consequences of a failure. A probabilistic analysis properly models this tradeoff, while a worst-case analysis reveals nothing about the probability of failure, and can significantly overstate the consequences of failure. Two aerospace examples will be presented that illustrate problems that can arise with a worst-case analysis.
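
    The contrast between worst-case and probabilistic treatment of design conservatism can be made concrete with a tolerance stack-up sketch in Python. The part count, tolerances, length limit, and the assumption of independent normal deviations with the tolerance taken as a 3-sigma bound are illustrative, not taken from the paper.

        import numpy as np

        rng = np.random.default_rng(3)

        # Five parts stacked in an assembly; each nominal 10 mm with +/- 0.1 mm tolerance.
        nominal = np.full(5, 10.0)
        tol = np.full(5, 0.1)
        limit = 50.2                      # required maximum assembly length [mm]

        # Worst-case analysis: assume every part is simultaneously at its upper limit.
        worst_case = np.sum(nominal + tol)
        print(f"worst-case length = {worst_case:.2f} mm (limit {limit} mm)")

        # Probabilistic analysis: independent deviations ~ N(0, (tol/3)^2).
        lengths = rng.normal(nominal, tol / 3.0, size=(1_000_000, 5)).sum(axis=1)
        p_exceed = np.mean(lengths > limit)
        print(f"P(length > {limit} mm) = {p_exceed:.1e}")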

  20. Review of the Diablo Canyon probabilistic risk assessment

    International Nuclear Information System (INIS)

    Bozoki, G.E.; Fitzpatrick, R.G.; Bohn, M.P.; Sabek, M.G.; Ravindra, M.K.; Johnson, J.J.

    1994-08-01

    This report details the review of the Diablo Canyon Probabilistic Risk Assessment (DCPRA). The study was performed by Brookhaven National Laboratory under contract from the Probabilistic Risk Analysis Branch, Office of Nuclear Reactor Research, USNRC. The DCPRA is a full scope Level I effort and, although the review touched on all aspects of the PRA, the internal events and seismic events received the vast majority of the review effort. The report includes a number of independent systems analyses, sensitivity studies, and importance analyses, as well as conclusions on the adequacy of the DCPRA for use in the Diablo Canyon Long Term Seismic Program.

  1. Evaluation of replacement tritium facility (RTF) compliance with DOE safety goals using probabilistic consequence assessment methodology

    International Nuclear Information System (INIS)

    O'Kula, K.R.; East, J.M.; Moore, M.L.

    1993-01-01

    The Savannah River Site (SRS), operated by the Westinghouse Savannah River Company (WSRC) for the US Department of Energy (DOE), is a major center for the processing of nuclear materials for national defense, deep-space exploration, and medical treatment applications in the United States. As an integral part of the DOE's effort to modernize facilities, implement improved handling and processing technology, and reduce operational risk to the general public and onsite workers, transition of tritium processing at SRS from the Consolidated Tritium Facility to the Replacement Tritium Facility (RTF) began in 1993. To ensure that operation of new DOE facilities such as RTF present minimum involuntary and voluntary risks to the neighboring public and workers, indices of risk have been established to serve as target levels or safety goals of performance for assessing nuclear safety. These goals are discussed from a historical perspective in the initial part of this paper. Secondly, methodologies to quantify risk indices are briefly described. Lastly, accident, abnormal event, and normal operation source terms from RTF are evaluated for consequence assessment purposes relative to the safety targets

  2. On the quality and value of probabilistic forecasts of wind generation

    DEFF Research Database (Denmark)

    Pinson, Pierre; Juban, Jeremie; Kariniotakis, Georges

    2006-01-01

    … the uncertainty information, can be seen as optimal for the management or trading of wind generation. This paper explores the differences and relations between the quality (i.e. statistical performance) and the operational value of these forecasts. An application is presented on the use of probabilistic … predictions for bidding in a European electricity market. The benefits of a probabilistic view of wind power forecasting are clearly demonstrated.

  3. On synchronous parallel computations with independent probabilistic choice

    International Nuclear Information System (INIS)

    Reif, J.H.

    1984-01-01

    This paper introduces probabilistic choice to synchronous parallel machine models, in particular parallel RAMs. The power of probabilistic choice in parallel computations is illustrated by parallelizing some known probabilistic sequential algorithms. The authors characterize the computational complexity of time, space, and processor bounded probabilistic parallel RAMs in terms of the computational complexity of probabilistic sequential RAMs. They show that parallelism uniformly speeds up time bounded probabilistic sequential RAM computations by nearly a quadratic factor. They also show that probabilistic choice can be eliminated from parallel computations by introducing nonuniformity.

  4. Geothermal probabilistic cost study

    Energy Technology Data Exchange (ETDEWEB)

    Orren, L.H.; Ziman, G.M.; Jones, S.C.; Lee, T.K.; Noll, R.; Wilde, L.; Sadanand, V.

    1981-08-01

    A tool is presented to quantify the risks of geothermal projects, the Geothermal Probabilistic Cost Model (GPCM). The GPCM model is used to evaluate a geothermal reservoir for a binary-cycle electric plant at Heber, California. Three institutional aspects of the geothermal risk which can shift the risk among different agents are analyzed. The leasing of geothermal land, contracting between the producer and the user of the geothermal heat, and insurance against faulty performance are examined. (MHR)

  5. Intrinsic motivation, performance, and the mediating role of mastery goal orientation: a test of self-determination theory.

    Science.gov (United States)

    Cerasoli, Christopher P; Ford, Michael T

    2014-01-01

    Although intrinsic motivation has been linked repeatedly to performance and outcomes, the causal relationship between the two has remained unclear. To explain the link, this study considered the focusing influence of mastery goals. Using a three-wave panel study and hypotheses drawn from self-determination theory and achievement goal theory, the current study sought to clarify the relationships between intrinsic motivation, mastery goal orientation, and performance. Specifically, the current study hypothesized and found that mastery goals mediated (explained) the relationship between intrinsic motivation and performance.

  6. 42 CFR 457.710 - State plan requirements: Strategic objectives and performance goals.

    Science.gov (United States)

    2010-10-01

    42 CFR (Public Health), Vol. 4 (2010-10-01), ALLOTMENTS AND GRANTS TO STATES - Strategic Planning, Reporting, and Evaluation, § 457.710 State plan requirements: Strategic objectives and performance goals. The State's strategic objectives, performance goals, and performance measures must include a common…

  7. Probabilistic cloning and deleting of quantum states

    International Nuclear Information System (INIS)

    Feng Yuan; Zhang Shengyu; Ying Mingsheng

    2002-01-01

    We construct a probabilistic cloning and deleting machine which, taking several copies of an input quantum state, can output a linear superposition of multiple cloning and deleting states. Since the machine can perform cloning and deleting in a single unitary evolution, the probabilistic cloning and other cloning machines proposed in the previous literature can be thought of as special cases of our machine. A sufficient and necessary condition for successful cloning and deleting is presented, and it requires that the copies of an arbitrarily presumed number of the input states are linearly independent. This simply generalizes some results for cloning. We also derive an upper bound for the success probability of the cloning and deleting machine

  8. Risk management for existing energy facilities. A global approach to numerical safety goals

    International Nuclear Information System (INIS)

    Pate-Cornell, M.E.

    1993-01-01

    This paper presents a structured set of numerical safety goals for risk management of existing energy facilities. The rationale behind these safety goals is based on principles of equity and economic efficiency. Some of the issues involved when using probabilistic risk analysis results for safety decisions are discussed. A brief review of existing safety targets and "floating numbers" is presented, and a set of safety goals for industrial risk management is proposed. Relaxation of these standards for existing facilities, the relevance of the lifetime of the plant, the treatment of uncertainties, and problems of failure dependencies are discussed briefly. 17 refs., 1 fig

  9. Achievement Goals and Achievement Emotions: Testing a Model of Their Joint Relations with Academic Performance

    Science.gov (United States)

    Pekrun, Reinhard; Elliot, Andrew J.; Maier, Markus A.

    2009-01-01

    The authors propose a theoretical model linking achievement goals and achievement emotions to academic performance. This model was tested in a prospective study with undergraduates (N = 213), using exam-specific assessments of both goals and emotions as predictors of exam performance in an introductory-level psychology course. The findings were…

  10. Validation of the probabilistic approach for the analysis of PWR transients

    International Nuclear Information System (INIS)

    Amesz, J.; Francocci, G.F.; Clarotti, C.

    1978-01-01

    This paper reviews the pilot study at present being carried out on the validation of probabilistic methodology with real data coming from the operational records of the PWR power station at Obrigheim (KWO, Germany), operating since 1969. The aim of this analysis is to validate the a priori predictions of reactor transients performed by a probabilistic methodology with the a posteriori analysis of transients that actually occurred at the power station. Two levels of validation have been distinguished: (a) validation of the rate of occurrence of initiating events; (b) validation of the transient-parameter amplitude (i.e., overpressure) caused by the above mentioned initiating events. The paper describes the a priori calculations performed using a fault-tree analysis by means of a probabilistic code (SALP 3) and event-trees coupled with a PWR system deterministic computer code (LOOP 7). Finally, the principal results of these analyses are presented and critically reviewed.

  11. Probabilistic design of fibre concrete structures

    Science.gov (United States)

    Pukl, R.; Novák, D.; Sajdlová, T.; Lehký, D.; Červenka, J.; Červenka, V.

    2017-09-01

    Advanced computer simulation has recently become a well-established methodology for evaluating the resistance of concrete engineering structures. Nonlinear finite element analysis makes it possible to realistically predict structural damage, peak load, failure, post-peak response, development of cracks in concrete, yielding of reinforcement, concrete crushing or shear failure. The nonlinear material models can cover various types of concrete and reinforced concrete: ordinary concrete, plain or reinforced, without or with prestressing, fibre concrete, (ultra) high performance concrete, lightweight concrete, etc. Advanced material models taking into account fibre concrete properties such as the shape of the tensile softening branch, high toughness and ductility are described in the paper. Since the variability of the fibre concrete material properties is rather high, probabilistic analysis seems to be the most appropriate format for structural design and evaluation of structural performance, reliability and safety. The presented combination of nonlinear analysis with advanced probabilistic methods allows evaluation of structural safety characterized by failure probability or by reliability index, respectively. The authors offer a methodology and computer tools for realistic safety assessment of concrete structures; the utilized approach is based on randomization of the nonlinear finite element analysis of the structural model. Uncertainty or randomness of the material properties obtained from material tests is accounted for in the random distributions. Furthermore, degradation of the reinforced concrete materials, such as carbonation of concrete, corrosion of reinforcement, etc., can be accounted for in order to analyze life-cycle structural performance and to enable prediction of structural reliability and safety over time. The results can serve as a rational basis for the design of fibre concrete engineering structures based on advanced nonlinear computer analysis. The presented
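
    To make the last step of this record concrete, the short Python sketch below turns randomized resistance results and a random load effect into a failure probability and the corresponding reliability index. The lognormal resistance and normal load-effect parameters are placeholders standing in for randomized nonlinear finite element results; they are not taken from the paper.

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(11)
        n = 1_000_000

        # Stand-ins for randomized structural analyses: resistance R and load effect E.
        R = rng.lognormal(mean=np.log(950.0), sigma=0.10, size=n)   # resistance [kN]
        E = rng.normal(600.0, 90.0, size=n)                          # load effect [kN]

        p_f = np.mean(R - E <= 0.0)      # failure probability
        beta = -norm.ppf(p_f)            # corresponding reliability index

        print(f"failure probability P_f = {p_f:.2e}")
        print(f"reliability index beta = {beta:.2f}")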

  12. Pro-Social Goals in Achievement Situations: Amity Goal Orientation Enhances the Positive Effects of Mastery Goal Orientation.

    Science.gov (United States)

    Levontin, Liat; Bardi, Anat

    2018-04-01

    Research has neglected the utility of pro-social goals within achievement situations. In this article, four studies demonstrate that amity goal orientation, promoting mutual success of oneself together with others, enhances the utility of mastery goal orientation. We demonstrate this in longitudinally predicting performance (Studies 1 and 2) and in maintaining motivation after a disappointing performance (Studies 3 and 4). The studies demonstrate the same interaction effect in academic and in work achievement contexts. Specifically, whereas amity goal orientation did not predict achievement on its own, it enhanced the positive effect of mastery goal orientation. Together, these studies establish the importance of amity goal orientation while also advancing our understanding of the effects of other achievement goal orientations. We suggest future directions in examining the utility of amity goals in other contexts.

  13. Preliminary Study for Application of the New Safety Goal related with the Limitation of Cs-137 release

    Energy Technology Data Exchange (ETDEWEB)

    Seo, Mi Ro; Shin, Tae Young [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    In the New Safety Goal, it is clearly stated that the Probabilistic Safety Assessment (PSA) should be performed with the proper technical appropriateness, detail, and scope in accordance with an integrated risk assessment of accidents at nuclear power plants. This requirement is known to come from the provision for preventing long-term ground contamination due to the release of radioactive material. However, there have been many concerns that this goal is so severe that current designs, even in the case of nuclear power plants under construction, cannot meet the criterion. Especially for operating nuclear power plants, which had no mitigation facilities against severe accidents at the design stage, the application of this new goal is known to be even more severe than for plants under construction, and it is necessary to develop alternative methods to strengthen their safety. The purpose of this study is to review the new safety goal from the viewpoint of severe accident analysis and probabilistic safety assessment, and to find appropriate methods to meet that goal for operating nuclear power plants. In order to strengthen the safety of domestic nuclear power plants, all domestic operating nuclear power plants are required to prepare an Accident Management Plan within 3 years. Also, this Accident Management Plan should meet the New Safety Goal, including the requirement that the sum of the frequencies of accidents in which the release of the radionuclide Cs-137 to the environment exceeds 100 TBq should be less than 1.0E-6/RY. Since operating nuclear power plants were not designed against severe accidents and have limited dedicated mitigation facilities, it is not easy to meet the New Safety Goal, so alternative methods need to be developed. In this study, the amount of Cs-137 released to the
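
    The quantitative part of the criterion discussed in this record amounts to summing the frequencies of all release categories whose Cs-137 release exceeds 100 TBq and comparing the sum with 1.0E-6 per reactor-year. The Python sketch below shows that bookkeeping for a purely hypothetical set of release categories; the category names, frequencies, and source terms are placeholders, not results of the study.

        # Hypothetical Level 2 PSA release categories:
        # (name, frequency [/reactor-year], Cs-137 release [TBq]) -- placeholder values.
        release_categories = [
            ("early containment failure",     2.0e-7,  900.0),
            ("late containment failure",      6.0e-7,   40.0),
            ("containment bypass",            3.0e-8, 2500.0),
            ("filtered / intact containment", 5.0e-6,    1.5),
        ]

        CS137_LIMIT_TBQ = 100.0
        FREQUENCY_GOAL = 1.0e-6   # per reactor-year

        # Sum the frequencies of all categories whose Cs-137 release exceeds the limit.
        exceed_freq = sum(f for _, f, cs in release_categories if cs > CS137_LIMIT_TBQ)

        print(f"frequency of releases > {CS137_LIMIT_TBQ:.0f} TBq: {exceed_freq:.1e} /RY")
        print("meets the goal" if exceed_freq < FREQUENCY_GOAL else "does not meet the goal")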

  14. Preliminary Study for Application of the New Safety Goal related with the Limitation of Cs-137 release

    International Nuclear Information System (INIS)

    Seo, Mi Ro; Shin, Tae Young

    2016-01-01

    In the New Safety Goal, it is clearly stated that the Probabilistic Safety Assessment (PSA) should be performed with the proper technical appropriateness, detail, and scope in accordance with an integrated risk assessment of accidents at nuclear power plants. This requirement is known to come from the provision for preventing long-term ground contamination due to the release of radioactive material. However, there have been many concerns that this goal is so severe that current designs, even in the case of nuclear power plants under construction, cannot meet the criterion. Especially for operating nuclear power plants, which had no mitigation facilities against severe accidents at the design stage, the application of this new goal is known to be even more severe than for plants under construction, and it is necessary to develop alternative methods to strengthen their safety. The purpose of this study is to review the new safety goal from the viewpoint of severe accident analysis and probabilistic safety assessment, and to find appropriate methods to meet that goal for operating nuclear power plants. In order to strengthen the safety of domestic nuclear power plants, all domestic operating nuclear power plants are required to prepare an Accident Management Plan within 3 years. Also, this Accident Management Plan should meet the New Safety Goal, including the requirement that the sum of the frequencies of accidents in which the release of the radionuclide Cs-137 to the environment exceeds 100 TBq should be less than 1.0E-6/RY. Since operating nuclear power plants were not designed against severe accidents and have limited dedicated mitigation facilities, it is not easy to meet the New Safety Goal, so alternative methods need to be developed. In this study, the amount of Cs-137 released to the

  15. Probabilistic structural integrity of reactor vessel under pressurized thermal shock

    International Nuclear Information System (INIS)

    Myung Jo Hhung; Young Hwan Choi; Hho Jung Kim; Changheui Jang

    2005-01-01

    Performed here is a comparative assessment study for the probabilistic fracture mechanics approach to the pressurized thermal shock of the reactor pressure vessel. A round robin consisting of 1 prerequisite study and 5 cases for probabilistic approaches is proposed, and all interested organizations are invited. The problems are solved and their results are compared in order to issue recommendations on best practices in this area and to ensure an understanding of the key parameters of this type of approach, which will be useful in the justification, through a probabilistic approach, of the case of a plant exceeding the screening criteria. Six participants from 3 organizations in Korea responded to the problem and their results are compiled in this study. (authors)

  16. Some probabilistic aspects of fracture

    International Nuclear Information System (INIS)

    Thomas, J.M.

    1982-01-01

    Some probabilistic aspects of fracture in structural and mechanical components are examined. The principles of fracture mechanics, material quality and inspection uncertainty are formulated into a conceptual and analytical framework for prediction of failure probability. The role of probabilistic fracture mechanics in a more global context of risk and optimization of decisions is illustrated. An example, where Monte Carlo simulation was used to implement a probabilistic fracture mechanics analysis, is discussed. (orig.)
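
    The abstract mentions a Monte Carlo implementation of a probabilistic fracture mechanics analysis. A minimal sketch of that idea, under assumed (hypothetical) distributions for crack depth, applied stress and fracture toughness, could look as follows; failure is counted whenever the sampled stress intensity factor exceeds the sampled toughness.

    # Minimal Monte Carlo sketch of a probabilistic fracture check; all values are hypothetical.
    import math
    import random

    random.seed(0)
    N = 200_000
    failures = 0
    for _ in range(N):
        a = random.lognormvariate(math.log(0.01), 0.5)   # crack depth [m], hypothetical
        sigma = random.gauss(250.0, 40.0)                # applied stress [MPa], hypothetical
        k_ic = random.gauss(90.0, 15.0)                  # fracture toughness [MPa*sqrt(m)], hypothetical
        k_i = 1.12 * sigma * math.sqrt(math.pi * a)      # simple surface-crack approximation
        if k_i > k_ic:
            failures += 1

    print(f"Estimated probability of fracture: {failures / N:.2e}")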

  17. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.

    Science.gov (United States)

    Pecevski, Dejan; Maass, Wolfgang

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.

  18. Task complexity and task, goal, and reward interdependence in group performance : a prescriptive model

    NARCIS (Netherlands)

    Vijfeijken, van H.T.G.A.; Kleingeld, P.A.M.; Tuijl, van H.F.J.M.; Algera, J.A.; Thierry, H.

    2002-01-01

    A prescriptive model on how to design effective combinations of goal setting and contingent rewards for group performance management is presented. The model incorporates the constructs task complexity, task interdependence, goal interdependence, and reward interdependence and specifies optimal fit

  19. Probabilistic safety analysis using microcomputer

    International Nuclear Information System (INIS)

    Futuro Filho, F.L.F.; Mendes, J.E.S.; Santos, M.J.P. dos

    1990-01-01

    The main steps in the execution of a Probabilistic Safety Assessment (PSA) are presented in this report, such as the study of the system description, the construction of event trees and fault trees, and the calculation of the overall unavailability of the systems. The use of microcomputers in performing some of these tasks is also presented, highlighting the main characteristics of software needed to perform the job adequately. A sample case of fault tree construction and calculation is presented, using the PSAPACK software, distributed by the IAEA (International Atomic Energy Agency) for training purposes. (author)
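
    To make the fault tree quantification step concrete, here is a minimal, hedged sketch (not PSAPACK itself) that computes a top-event unavailability from hypothetical basic-event unavailabilities, assuming independent events. The gate structure and numbers are invented for illustration.

    # Minimal fault tree quantification sketch; events assumed independent, values hypothetical.
    def or_gate(*probs):
        """Union of independent events: 1 - product of (1 - p)."""
        result = 1.0
        for p in probs:
            result *= (1.0 - p)
        return 1.0 - result

    def and_gate(*probs):
        """Intersection of independent events: product of p."""
        result = 1.0
        for p in probs:
            result *= p
        return result

    pump_a_fails = 1.0e-3
    pump_b_fails = 1.0e-3
    valve_fails = 5.0e-4
    power_lost = 2.0e-4

    # Top event: both redundant pumps fail, OR the common suction valve fails, OR power is lost
    top = or_gate(and_gate(pump_a_fails, pump_b_fails), valve_fails, power_lost)
    print(f"Top event unavailability: {top:.3e}")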

  20. Design for containment of hazardous materials

    International Nuclear Information System (INIS)

    Murray, R.C.; McDonald, J.R.

    1991-03-01

    Department of Energy (DOE) facilities across the United States use wind and tornado design and evaluation criteria based on probabilistic performance goals. In addition, other programs such as Advanced Light Water Reactors, New Production Reactors, and Individual Plant Examinations for External Events for commercial nuclear power plants utilize design and evaluation criteria based on probabilistic performance goals. The use of probabilistic performance goals is a departure from design practice for commercial nuclear power plants, which have traditionally been designed utilizing a conservative specification of wind and tornado loading combined with deterministic response evaluation methods and permissible behavior limits. Approaches which utilize probabilistic wind and tornado hazard curves for specification of loading, together with deterministic response evaluation methods and permissible behavior limits, are discussed in this paper. Through the use of such design/evaluation approaches, it may be demonstrated that there is a high likelihood that probabilistic performance goals can be achieved. 14 refs., 1 fig., 5 tabs
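
    A minimal sketch of the kind of combination described here: a probabilistic wind hazard curve is convolved with a conditional failure probability (fragility), and the resulting annual failure probability is compared against a performance goal. The hazard points, fragility parameters and goal value below are hypothetical illustration values only.

    # Hazard curve x fragility convolution sketch; all numbers are hypothetical.
    import math

    # Annual exceedance probability vs 3-s gust wind speed [m/s]
    hazard = [(40.0, 1.0e-2), (50.0, 2.0e-3), (60.0, 4.0e-4),
              (70.0, 8.0e-5), (80.0, 1.5e-5), (90.0, 3.0e-6)]

    def failure_given_speed(v):
        """Lognormal fragility: median capacity 75 m/s, beta 0.15 (hypothetical)."""
        median, beta = 75.0, 0.15
        z = math.log(v / median) / beta
        return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

    # Discrete convolution: frequency of each wind-speed interval times the
    # failure probability evaluated at the interval midpoint
    annual_failure = 0.0
    for (v_lo, p_lo), (v_hi, p_hi) in zip(hazard, hazard[1:]):
        interval_freq = p_lo - p_hi
        annual_failure += interval_freq * failure_given_speed(0.5 * (v_lo + v_hi))
    annual_failure += hazard[-1][1] * failure_given_speed(hazard[-1][0])  # tail above last point

    GOAL = 1.0e-5  # hypothetical annual performance goal
    print(f"Annual failure probability: {annual_failure:.2e} (goal {GOAL:.0e})")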

  1. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    Science.gov (United States)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  2. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    Science.gov (United States)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.
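
    As a hedged illustration of the "cumulative probability of exceedance" output described in these two records, the sketch below samples uncertain load and geometry, evaluates a simple closed-form response (a stand-in for the PFEM/PAAM/PBEM solvers named in the abstract), and reports exceedance probabilities for the maximum stress. All numbers are hypothetical.

    # Monte Carlo exceedance-probability sketch for a simple stand-in response model.
    import random

    random.seed(1)
    N = 100_000
    stresses = []
    for _ in range(N):
        load = random.gauss(10_000.0, 1_500.0)   # end load [N], hypothetical
        width = random.gauss(0.05, 0.002)        # cross-section width [m], tolerance
        height = random.gauss(0.10, 0.003)       # cross-section height [m], tolerance
        length = 1.2                             # beam length [m], deterministic
        # Max bending stress of a cantilever with end load: 6*F*L / (b*h^2)
        stresses.append(6.0 * load * length / (width * height ** 2))

    for level in (2.0e8, 2.5e8, 3.0e8):          # stress levels [Pa]
        exceed = sum(1 for s in stresses if s > level) / N
        print(f"P(max stress > {level:.1e} Pa) = {exceed:.3f}")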

  3. Secondary Students' Writing Achievement Goals: Assessing the Mediating Effects of Mastery and Performance Goals on Writing Self-Efficacy, Affect, and Writing Achievement

    Science.gov (United States)

    Yilmaz Soylu, Meryem; Zeleny, Mary G.; Zhao, Ruomeng; Bruning, Roger H.; Dempsey, Michael S.; Kauffman, Douglas F.

    2017-01-01

    The two studies reported here explored the factor structure of the newly constructed Writing Achievement Goal Scale (WAGS), and examined relationships among secondary students' writing achievement goals, writing self-efficacy, affect for writing, and writing achievement. In the first study, 697 middle school students completed the WAGS. A confirmatory factor analysis revealed a good fit for this data with a three-factor model that corresponds with mastery, performance approach, and performance avoidance goals. The results of Study 1 were an indication for the researchers to move forward with Study 2, which included 563 high school students. The secondary students completed the WAGS, as well as the Self-efficacy for Writing Scale, and the Liking Writing Scale. Students also self-reported grades for writing and for language arts courses. Approximately 6 weeks later, students completed a statewide writing assessment. We tested a theoretical model representing relationships among Study 2 variables using structural equation modeling including students' responses to the study scales and students' scores on the statewide assessment. Results from Study 2 revealed a good fit between a model depicting proposed relationships among the constructs and the data. Findings are discussed relative to achievement goal theory and writing. PMID:28878707

  4. Probabilistic assessment of the long-term performance of the Panel Mine tailings area

    International Nuclear Information System (INIS)

    Balins, J.K.; Davis, J.B.; Payne, R.A.

    1994-01-01

    Rio Algom's Panel Uranium Mine originally operated between 1958 and 1961. It was reactivated in 1979 and operated continuously until 1990. In all, the mine produced about 14 million tons of potentially acid generating, low level radioactive uranium tailings; about 5% pyrite (by weight) with less than 0.1% U3O8. The tailings area consists of two rock rimmed basins. Topographic lows around the perimeter are closed by a total of six containment dams. To minimize the acid generating potential within the tailings, a decommissioning plan to flood the impounded tailings is being implemented. The anticipated performance of engineered structures (dams, spillways, channels, etc.) and the flooded tailings concept, over time periods in the order of thousands of years, have been addressed using probabilistic methods, based on subjective probability distributions consistent with available site specific information. The probable costs associated with long-term inspection and maintenance of the facility, as well as the probable costs and environmental consequences (e.g. tailings releases) associated with potential dam failures due to disruptive events such as floods, droughts and earthquakes were determined using a probabilistic model which consists of five, essentially independent, sub-models: a Maintenance Model, an Earthquake Response Model, a Flood Response Model, a Drought Model and an Integration Model. The principal conclusion derived from this assessment is that, for a well designed, constructed and maintained facility, there is very little likelihood that water and/or tailings solids will be released as a result of a containment dam failure; annual probability of the order of 10^-6. Failure to maintain the facility over the long-term significantly increases the likelihood of dam failure with resultant release of water and suspended tailings solids.

  5. Compression of Probabilistic XML Documents

    Science.gov (United States)

    Veldman, Irma; de Keijzer, Ander; van Keulen, Maurice

    Database techniques to store, query and manipulate data that contains uncertainty receive increasing research interest. Such UDBMSs can be classified according to their underlying data model: relational, XML, or RDF. We focus on uncertain XML DBMS, with the Probabilistic XML model (PXML) of [10,9] as a representative example. The size of a PXML document is obviously a factor in performance. There are PXML-specific techniques to reduce the size, such as a push down mechanism, that produces equivalent but more compact PXML documents. It can only be applied, however, where possibilities are dependent. For normal XML documents there also exist several techniques for compressing a document. Since Probabilistic XML is (a special form of) normal XML, it might benefit from these methods even more. In this paper, we show that existing compression mechanisms can be combined with PXML-specific compression techniques. We also show that the best compression rates are obtained with a combination of a PXML-specific technique with a rather simple generic DAG-compression technique.

  6. Use of probabilistic methods for analysis of cost and duration uncertainties in a decision analysis framework

    International Nuclear Information System (INIS)

    Boak, D.M.; Painton, L.

    1995-01-01

    Probabilistic forecasting techniques have been used in many risk assessment and performance assessment applications on radioactive waste disposal projects such as Yucca Mountain and the Waste Isolation Pilot Plant (WIPP). Probabilistic techniques such as Monte Carlo and Latin Hypercube sampling methods are routinely used to treat uncertainties in physical parameters important in simulating radionuclide transport in a coupled geohydrologic system and assessing the ability of that system to comply with regulatory release limits. However, the use of probabilistic techniques in the treatment of uncertainties in the cost and duration of programmatic alternatives on risk and performance assessment projects is less common. Where significant uncertainties exist and where programmatic decisions must be made despite existing uncertainties, probabilistic techniques may yield important insights into decision options, especially when used in a decision analysis framework and when properly balanced with deterministic analyses. For relatively simple evaluations, these types of probabilistic evaluations can be made using personal computer-based software
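
    A minimal sketch of the Latin Hypercube treatment of cost uncertainty mentioned in this abstract, applied to hypothetical task-level triangular cost distributions that are summed into a total-cost distribution. Task names and cost ranges are invented for illustration.

    # Latin Hypercube sampling of task costs; all task names and numbers are hypothetical.
    import random

    random.seed(2)

    tasks = {  # (low, most likely, high) cost in M$, hypothetical
        "characterization": (2.0, 3.0, 6.0),
        "design":           (4.0, 5.5, 8.0),
        "construction":     (20.0, 28.0, 45.0),
        "closure":          (3.0, 4.0, 9.0),
    }

    def lhs_uniform(n):
        """One stratified (Latin Hypercube) sample of size n on [0, 1)."""
        strata = [(i + random.random()) / n for i in range(n)]
        random.shuffle(strata)
        return strata

    def triangular_ppf(u, low, mode, high):
        """Inverse CDF of a triangular distribution."""
        cut = (mode - low) / (high - low)
        if u < cut:
            return low + ((high - low) * (mode - low) * u) ** 0.5
        return high - ((high - low) * (high - mode) * (1.0 - u)) ** 0.5

    N = 1000
    samples = {name: lhs_uniform(N) for name in tasks}
    totals = sorted(
        sum(triangular_ppf(samples[name][i], *tasks[name]) for name in tasks)
        for i in range(N)
    )
    print(f"Median total cost: {totals[N // 2]:.1f} M$")
    print(f"90th percentile:   {totals[int(0.9 * N)]:.1f} M$")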

  7. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jeffrey C. Joe; Ronald L. Boring

    2014-06-01

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial NPPs in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research whose objective is to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  8. Probabilistic safety analysis procedures guide, Sections 8-12. Volume 2, Rev. 1

    International Nuclear Information System (INIS)

    McCann, M.; Reed, J.; Ruger, C.; Shiu, K.; Teichmann, T.; Unione, A.; Youngblood, R.

    1985-08-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. It will be revised as comments are received, and as experience is gained from its use. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. The first volume of the guide describes the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant (i.e., intrinsic to plant operation) and from loss of off-site electric power. The scope includes human reliability analysis, a determination of the importance of various core damage accident sequences, and an explicit treatment and display of uncertainties for key accident sequences. This second volume deals with the treatment of the so-called external events including seismic disturbances, fires, floods, etc. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance). This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are valuable for regulatory decision making. For internal events, methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study. For external events, more explicit guidance is given

  9. Evaluation of Probabilistic Disease Forecasts.

    Science.gov (United States)

    Hughes, Gareth; Burnett, Fiona J

    2017-10-01

    The statistical evaluation of probabilistic disease forecasts often involves calculation of metrics defined conditionally on disease status, such as sensitivity and specificity. However, for the purpose of disease management decision making, metrics defined conditionally on the result of the forecast-predictive values-are also important, although less frequently reported. In this context, the application of scoring rules in the evaluation of probabilistic disease forecasts is discussed. An index of separation with application in the evaluation of probabilistic disease forecasts, described in the clinical literature, is also considered and its relation to scoring rules illustrated. Scoring rules provide a principled basis for the evaluation of probabilistic forecasts used in plant disease management. In particular, the decomposition of scoring rules into interpretable components is an advantageous feature of their application in the evaluation of disease forecasts.
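
    To make the scoring-rule idea concrete, the sketch below computes the Brier score for hypothetical probabilistic disease forecasts and its standard Murphy decomposition into reliability, resolution and uncertainty, one common "decomposition into interpretable components". The forecast/outcome pairs are invented.

    # Brier score and Murphy decomposition for hypothetical binary disease forecasts.
    from collections import defaultdict

    # (forecast probability, observed outcome: 1 = disease occurred)
    data = [(0.9, 1), (0.8, 1), (0.8, 0), (0.7, 1), (0.6, 0), (0.4, 0),
            (0.3, 1), (0.3, 0), (0.2, 0), (0.1, 0), (0.1, 0), (0.1, 0)]

    n = len(data)
    brier = sum((f - o) ** 2 for f, o in data) / n
    base_rate = sum(o for _, o in data) / n

    # Group identical forecast values so the decomposition is exact
    bins = defaultdict(list)
    for f, o in data:
        bins[f].append(o)

    reliability = sum(len(v) * (f - sum(v) / len(v)) ** 2 for f, v in bins.items()) / n
    resolution = sum(len(v) * (sum(v) / len(v) - base_rate) ** 2 for f, v in bins.items()) / n
    uncertainty = base_rate * (1.0 - base_rate)

    print(f"Brier score  = {brier:.3f}")
    print(f"Decomposition: {reliability:.3f} - {resolution:.3f} + {uncertainty:.3f} "
          f"= {reliability - resolution + uncertainty:.3f}")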

  10. Validation of a performance model on entrepreneurship based on self-efficacy, personal goal orientation and environment goal orientation using Structural Equation Modeling

    OpenAIRE

    Figueroa Reyes, Rodrigo

    2013-01-01

    Master's thesis in International Hotel and Tourism Management. There are three main contributions that I intend to provide through this research. First, I will combine four theoretical constructs that, to my knowledge, have not been worked through in this way before. I mean that this research is attempting to validate and estimate the existing relationships between Self-efficacy, Perceived Personal Goal Orientation, Perceived Environment Goal Orientation and Perceived Personal Performance. T...

  11. A Probabilistic Approach for Robustness Evaluation of Timber Structures

    DEFF Research Database (Denmark)

    Kirkegaard, Poul Henning; Sørensen, John Dalsgaard

    A probabilistic-based robustness analysis has been performed for a glulam frame structure supporting the roof over the main court in a Norwegian sports centre. The robustness analysis is based on the framework for robustness analysis introduced in the Danish Code of Practice for the Safety of Structures and a probabilistic modelling of the timber material proposed in the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS). Due to the framework in the Danish Code the timber structure has to be evaluated with respect to the following criteria, where at least one shall ... With respect to criteria a) and b), the timber frame structure has one column with a reliability index a bit lower than an assumed target level. By removing three columns one by one, no significant extensive failure of the entire structure or significant parts of it is obtained. Therefore the structure can be considered ...

  12. A Probabilistic Damage Tolerance Concept for Welded Joints

    DEFF Research Database (Denmark)

    Lassen, T.; Sørensen, John Dalsgaard

    2002-01-01

    The first part of this paper presented the required statistics and stochastic models for reliability analysis of the fatigue fracture of welded plate joints. This present Part 2 suggests a probabilistic damage tolerance supplement to the design S–N curves for welded joints. The goal is to provide ... will have the same reliability level for the same FDF. This is true at the end of TSL and at earlier stages, i.e. fractions of TSL. The absolute value of TSL is immaterial for a given FDF. In the case of in-service inspection, the inspection interval is also given without dimensions as a fraction of TSL...

  13. Achievement goals, social goals, and motivational regulations in physical education settings.

    Science.gov (United States)

    Cecchini Estrada, José A; González González-Mesa, Carmen; Méndez-Giménez, Antonio; Fernández-Río, Javier

    2011-02-01

    This study examined the relationship between achievement and social goals, and explored how both goals affect students' level of informed self-determination in Physical Education. Participants were 395 high school students. Three scales were used to assess achievement, social goals, and motivation. Several hierarchical regression analyses revealed that mastery-approach goals were the greatest contributors to the individuals' levels of self-determination. Achievement and social goals were found to be separate predictors of students' levels of self-determination, and this highlights the importance of separating mastery and performance goals into avoidance and approach profiles. Girls reported significantly higher values than boys on responsibility, relationship, and mastery-avoidance goals, whereas boys scored higher on performance-approach goals. Researchers could use achievement and social goals to study students' motivation and achievement in Physical Education settings.

  14. Grip type and task goal modify reach-to-grasp performance in post-stroke hemiparesis

    Science.gov (United States)

    Schaefer, Sydney Y.; DeJong, Stacey L.; Cherry, Kendra M.; Lang, Catherine E.

    2011-01-01

    This study investigated whether grip type and/or task goal influenced reaching and grasping performance in post-stroke hemiparesis. Sixteen adults with post-stroke hemiparesis and twelve healthy adults reached to and grasped a cylindrical object using one of two grip types (3-finger or palmar) to achieve one of two task goals (hold or lift). Performance of the stroke group was characteristic of hemiparetic limb movement during reach-to-grasp, with more curved handpaths and slower velocities compared to the control group. These effects were present regardless of grip type or task goal. Other measures of reaching (reach time and reach velocity at object contact) and grasping (peak thumb-index finger aperture during the reach and peak grip force during the grasp) were differentially affected by grip type, task goal, or both, despite the presence of hemiparesis, providing new evidence that changes in motor patterns after stroke may occur to compensate for stroke-related motor impairment. PMID:22357103

  15. Grip type and task goal modify reach-to-grasp performance in post-stroke hemiparesis.

    Science.gov (United States)

    Schaefer, Sydney Y; DeJong, Stacey L; Cherry, Kendra M; Lang, Catherine E

    2012-04-01

    This study investigated whether grip type and/or task goal influenced reaching and grasping performance in poststroke hemiparesis. Sixteen adults with poststroke hemiparesis and twelve healthy adults reached to and grasped a cylindrical object using one of two grip types (3-finger or palmar) to achieve one of two task goals (hold or lift). Performance of the stroke group was characteristic of hemiparetic limb movement during reach-to-grasp, with more curved handpaths and slower velocities compared with the control group. These effects were present regardless of grip type or task goal. Other measures of reaching (reach time and reach velocity at object contact) and grasping (peak thumb-index finger aperture during the reach and peak grip force during the grasp) were differentially affected by grip type, task goal, or both, despite the presence of hemiparesis, providing new evidence that changes in motor patterns after stroke may occur to compensate for stroke-related motor impairment.

  16. HERMES probabilistic risk assessment. Pilot study

    International Nuclear Information System (INIS)

    Parisot, F.; Munoz, J.

    1993-01-01

    The study, performed in 1989, of the contribution of probabilistic analysis to the optimal construction of system safety status in the aeronautical and European nuclear industries shows the growing trend towards incorporation of quantitative safety assessment and led to an agreement to undertake a prototype proof study on Hermes. The main steps of the study and its results are presented in the paper.

  17. Aspects of human performance as perceived by the members of a joint probabilistic risk assessment working group

    International Nuclear Information System (INIS)

    Gubler, R.; Chakraborty, S.

    1987-01-01

    For the purpose of refining the basis for emergency planning, a partial probabilistic risk assessment has been carried out for a large Swiss pressurized water reactor of German design. During the investigations on system reliability it became apparent that the most sensitive and also the most important subject to deal with in the working group was the quantification of the performance of the plant personnel. The discussions showed clearly that different and sometimes antagonistic aspects of viewing the performance of the plant personnel exist. However, because of the limited data base in the field considered, impartiality is difficult. In order to handle these difficulties, the analysis was carried out with close reliance on previously performed and accessible studies for similar tasks and situations in nuclear power plants. The procedure is illustrated by two examples, the first assessing the reliability of calibrating an instrument channel of the reactor protection system, the second assessing the performance of operators during a small loss of coolant accident. (author)

  18. Secondary Students' Writing Achievement Goals: Assessing the Mediating Effects of Mastery and Performance Goals on Writing Self-Efficacy, Affect, and Writing Achievement

    Directory of Open Access Journals (Sweden)

    Meryem Yilmaz Soylu

    2017-08-01

    The two studies reported here explored the factor structure of the newly constructed Writing Achievement Goal Scale (WAGS), and examined relationships among secondary students' writing achievement goals, writing self-efficacy, affect for writing, and writing achievement. In the first study, 697 middle school students completed the WAGS. A confirmatory factor analysis revealed a good fit for this data with a three-factor model that corresponds with mastery, performance approach, and performance avoidance goals. The results of Study 1 were an indication for the researchers to move forward with Study 2, which included 563 high school students. The secondary students completed the WAGS, as well as the Self-efficacy for Writing Scale, and the Liking Writing Scale. Students also self-reported grades for writing and for language arts courses. Approximately 6 weeks later, students completed a statewide writing assessment. We tested a theoretical model representing relationships among Study 2 variables using structural equation modeling including students' responses to the study scales and students' scores on the statewide assessment. Results from Study 2 revealed a good fit between a model depicting proposed relationships among the constructs and the data. Findings are discussed relative to achievement goal theory and writing.

  19. Probabilistic Analysis of Failures Mechanisms of Large Dams

    NARCIS (Netherlands)

    Shams Ghahfarokhi, G.

    2014-01-01

    Risk and reliability analysis is presently being performed in almost all fields of engineering depending upon the specific field and its particular area. Probabilistic risk analysis (PRA), also called quantitative risk analysis (QRA) is a central feature of hydraulic engineering structural design.

  20. Methods for Probabilistic Fault Diagnosis: An Electrical Power System Case Study

    Science.gov (United States)

    Ricks, Brian W.; Mengshoel, Ole J.

    2009-01-01

    Health management systems that more accurately and quickly diagnose faults that may occur in different technical systems on-board a vehicle will play a key role in the success of future NASA missions. We discuss in this paper the diagnosis of abrupt continuous (or parametric) faults within the context of probabilistic graphical models, more specifically Bayesian networks that are compiled to arithmetic circuits. This paper extends our previous research, within the same probabilistic setting, on diagnosis of abrupt discrete faults. Our approach and diagnostic algorithm ProDiagnose are domain-independent; however we use an electrical power system testbed called ADAPT as a case study. In one set of ADAPT experiments, performed as part of the 2009 Diagnostic Challenge, our system turned out to have the best performance among all competitors. In a second set of experiments, we show how we have recently further significantly improved the performance of the probabilistic model of ADAPT. While these experiments are obtained for an electrical power system testbed, we believe they can easily be transitioned to real-world systems, thus promising to increase the success of future NASA missions.
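
    A minimal, hedged sketch of the underlying probabilistic-diagnosis computation (not ProDiagnose or the ADAPT model): a single fault variable observed through one noisy sensor, updated with Bayes' rule. The prior and likelihood values are hypothetical.

    # Tiny two-node Bayesian model (fault -> sensor reading) queried by enumeration.
    P_FAULT = 0.01                      # prior probability that a relay is stuck open (hypothetical)
    P_LOW_VOLTAGE_GIVEN_FAULT = 0.95    # sensor reads low if the relay is faulty (hypothetical)
    P_LOW_VOLTAGE_GIVEN_OK = 0.02       # false-alarm rate of the sensor (hypothetical)

    def posterior_fault(low_voltage_observed: bool) -> float:
        """P(fault | observation) by direct application of Bayes' rule."""
        if low_voltage_observed:
            likelihood_fault, likelihood_ok = P_LOW_VOLTAGE_GIVEN_FAULT, P_LOW_VOLTAGE_GIVEN_OK
        else:
            likelihood_fault, likelihood_ok = 1 - P_LOW_VOLTAGE_GIVEN_FAULT, 1 - P_LOW_VOLTAGE_GIVEN_OK
        joint_fault = likelihood_fault * P_FAULT
        joint_ok = likelihood_ok * (1 - P_FAULT)
        return joint_fault / (joint_fault + joint_ok)

    print(f"P(fault | low voltage)    = {posterior_fault(True):.3f}")
    print(f"P(fault | normal voltage) = {posterior_fault(False):.4f}")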

  1. Probabilistic broadcasting of mixed states

    International Nuclear Information System (INIS)

    Li Lvjun; Li Lvzhou; Wu Lihua; Zou Xiangfu; Qiu Daowen

    2009-01-01

    It is well known that the non-broadcasting theorem proved by Barnum et al is a fundamental principle of quantum communication. As we are aware, optimal broadcasting (OB) is the only method to broadcast noncommuting mixed states approximately. In this paper, motivated by the probabilistic cloning of quantum states proposed by Duan and Guo, we propose a new way for broadcasting noncommuting mixed states-probabilistic broadcasting (PB), and we present a sufficient condition for PB of mixed states. To a certain extent, we generalize the probabilistic cloning theorem from pure states to mixed states, and in particular, we generalize the non-broadcasting theorem, since the case that commuting mixed states can be exactly broadcast can be thought of as a special instance of PB where the success ratio is 1. Moreover, we discuss probabilistic local broadcasting (PLB) of separable bipartite states

  2. Probabilistic modeling of timber structures

    DEFF Research Database (Denmark)

    Köhler, Jochen; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2007-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) [Joint Committee of Structural Safety. Probabilistic Model Code, Internet Publication: www.jcss.ethz.ch; 2001] and of the COST action E24 'Reliability of Timber Structures' [COST Action E 24, Reliability of timber structures. Several meetings and Publications, Internet Publication: http://www.km.fgg.uni-lj.si/coste24/coste24.htm; 2005]. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for timber components. The recommended probabilistic model for these basic properties...

  3. Structural reliability codes for probabilistic design

    DEFF Research Database (Denmark)

    Ditlevsen, Ove Dalager

    1997-01-01

    probabilistic code format has not only strong influence on the formal reliability measure, but also on the formal cost of failure to be associated if a design made to the target reliability level is considered to be optimal. In fact, the formal cost of failure can be different by several orders of size for two different, but by and large equally justifiable, probabilistic code formats. Thus, the consequence is that a code format based on decision theoretical concepts and formulated as an extension of a probabilistic code format must specify formal values to be used as costs of failure. A principle of prudence is suggested for guiding the choice of the reference probabilistic code format for constant reliability. In the author's opinion there is an urgent need for establishing a standard probabilistic reliability code. This paper presents some considerations that may be debatable, but nevertheless point...

  4. GOAL PROFILES, MENTAL TOUGHNESS AND ITS INFLUENCE ON PERFORMANCE OUTCOMES AMONG WUSHU ATHLETES

    Directory of Open Access Journals (Sweden)

    Garry Kuan

    2007-10-01

    This study examined the association between goal orientations and mental toughness and its influence on performance outcomes in competition. Wushu athletes (n = 40) competing in Intervarsity championships in Malaysia completed the Task and Ego Orientations in Sport Questionnaire (TEOSQ) and the Psychological Performance Inventory (PPI). Using cluster analysis techniques, including hierarchical methods and the non-hierarchical method (k-means cluster), to examine goal profiles, a three-cluster solution emerged, viz. cluster 1 - high task and moderate ego (HT/ME), cluster 2 - moderate task and low ego (MT/LE), and cluster 3 - moderate task and moderate ego (MT/ME). Analysis of the fundamental areas of mental toughness based on goal profiles revealed that athletes in cluster 1 scored significantly higher on negative energy control than athletes in cluster 2. Further, athletes in cluster 1 also scored significantly higher on positive energy control than athletes in cluster 3. Chi-square (χ2) tests revealed no significant differences among athletes with different goal profiles on performance outcomes in the competition. However, significant differences were observed between athletes (medallists and non-medallists) in self-confidence (p = 0.001) and negative energy control (p = 0.042). Medallists scored significantly higher on self-confidence (mean = 21.82 ± 2.72) and negative energy control (mean = 19.59 ± 2.32) than the non-medallists (self-confidence mean = 18.76 ± 2.49; negative energy control mean = 18.14 ± 1.91).

  5. Probabilistic biological network alignment.

    Science.gov (United States)

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Interactions between molecules are probabilistic events. An interaction may or may not happen with some probability, depending on a variety of factors such as the size, abundance, or proximity of the interacting molecules. In this paper, we consider the problem of aligning two biological networks. Unlike existing methods, we allow one of the two networks to contain probabilistic interactions. Allowing interaction probabilities makes the alignment more biologically relevant at the expense of explosive growth in the number of alternative topologies that may arise from different subsets of interactions that take place. We develop a novel method that efficiently and precisely characterizes this massive search space. We represent the topological similarity between pairs of aligned molecules (i.e., proteins) with the help of random variables and compute their expected values. We validate our method showing that, without sacrificing the running time performance, it can produce novel alignments. Our results also demonstrate that our method identifies biologically meaningful mappings under a comprehensive set of criteria used in the literature as well as the statistical coherence measure that we developed to analyze the statistical significance of the similarity of the functions of the aligned protein pairs.

  6. Anticipatory systems using a probabilistic-possibilistic formalism

    International Nuclear Information System (INIS)

    Tsoukalas, L.H.

    1989-01-01

    A methodology for the realization of the Anticipatory Paradigm in the diagnosis and control of complex systems, such as power plants, is developed. The objective is to synthesize engineering systems as analogs of certain biological systems which are capable of modifying their present states on the basis of anticipated future states. These future states are construed to be the output of predictive, numerical, stochastic or symbolic models. The mathematical basis of the implementation is developed on the basis of a formulation coupling probabilistic (random) and possibilistic (fuzzy) data in the form of an Information Granule. Random data are generated from observations and sensor input from the environment. Fuzzy data consist of epistemic information, such as criteria or constraints qualifying the environmental inputs. The approach generates mathematical performance measures upon which diagnostic inferences and control functions are based. Anticipated performance is generated using a fuzzified Bayes formula. Triplex arithmetic is used in the numerical estimation of the performance measures. Representation of the system is based upon a goal-tree within the rule-based paradigm from the field of Applied Artificial Intelligence. The ensuing construction incorporates a coupling of Symbolic and Procedural programming methods. As a demonstration of the possibility of constructing such systems, a model-based system of a nuclear reactor is constructed. A numerical model of the reactor as a damped simple harmonic oscillator is used. The neutronic behavior is described by a point kinetics model with temperature feedback. The resulting system is programmed in OPS5 for the symbolic component and in FORTRAN for the procedural part.

  7. Easy and difficult performance-approach goals : Their moderating effect on the link between task interest and performance attainment

    NARCIS (Netherlands)

    Blaga, Monica; Van Yperen, N.W.

    2008-01-01

    The purpose of this study was to demonstrate that the positive link between task interest and performance attainment can be negatively affected by the pursuit of difficult performance-approach goals. This was tested in a sample of 60 undergraduate students at a Dutch university. In line with

  8. Confluence Reduction for Probabilistic Systems (extended version)

    NARCIS (Netherlands)

    Timmer, Mark; Stoelinga, Mariëlle Ida Antoinette; van de Pol, Jan Cornelis

    2010-01-01

    This paper presents a novel technique for state space reduction of probabilistic specifications, based on a newly developed notion of confluence for probabilistic automata. We prove that this reduction preserves branching probabilistic bisimulation and can be applied on-the-fly. To support the

  9. Optimization of structures subjected to dynamic load: deterministic and probabilistic methods

    Directory of Open Access Journals (Sweden)

    Élcio Cassimiro Alves

    This paper deals with the deterministic and probabilistic optimization of structures against bending when submitted to dynamic loads. The deterministic optimization problem considers the plate submitted to a time-varying load, while the probabilistic one takes into account a random loading defined by a power spectral density function. The correlation between the two problems is made by a Fourier transform. The finite element method is used to model the structures. The sensitivity analysis is performed through the analytical method and the optimization problem is dealt with by the method of interior points. A comparison between the deterministic optimization and the probabilistic one with a power spectral density function compatible with the time-varying load shows very good results.

  10. Next-generation probabilistic seismicity forecasting

    Energy Technology Data Exchange (ETDEWEB)

    Hiemer, S.

    2014-07-01

    novel automated method to investigate the significance of spatial b-value variations. The method incorporates an objective data-driven partitioning scheme, which is based on penalized likelihood estimates. These well-defined criteria avoid the difficult choice of commonly applied spatial mapping parameters, such as grid spacing or size of mapping radii. We construct a seismicity forecast that includes spatial b-value variations and demonstrate our model’s skill and reliability when applied to data from California. All proposed probabilistic seismicity forecasts were subjected to evaluation methods using state of the art algorithms provided by the 'Collaboratory for the Study of Earthquake Predictability' infrastructure. First, we evaluated the statistical agreement between the forecasted and observed rates of target events in terms of number, space and magnitude. Secondly, we assessed the performance of one forecast relative to another. We find that the forecasts presented in this thesis are reliable and show significant skills with respect to established classical forecasts. These next-generation probabilistic seismicity forecasts can thus provide hazard information that are potentially useful in reducing earthquake losses and enhancing community preparedness and resilience. (author)
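
    For readers unfamiliar with b-value estimation, the following sketch applies the standard Aki/Utsu maximum-likelihood estimator to a hypothetical catalogue; it is not the penalized-likelihood partitioning scheme described in the thesis, and the magnitudes and completeness magnitude are invented.

    # Aki/Utsu maximum-likelihood b-value estimate for a hypothetical catalogue.
    import math

    catalogue = [2.1, 2.3, 2.2, 2.8, 3.4, 2.5, 2.0, 2.6, 3.1, 2.2,
                 2.4, 2.9, 2.1, 2.3, 3.8, 2.7, 2.0, 2.2, 2.5, 4.2]   # hypothetical magnitudes
    MC = 2.0           # completeness magnitude (hypothetical)
    BIN_WIDTH = 0.1    # magnitude binning of the catalogue

    above_mc = [m for m in catalogue if m >= MC]
    mean_mag = sum(above_mc) / len(above_mc)

    # Utsu's correction subtracts half a bin width from the completeness magnitude
    b_value = math.log10(math.e) / (mean_mag - (MC - BIN_WIDTH / 2.0))
    print(f"{len(above_mc)} events above Mc={MC}, b-value = {b_value:.2f}")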

  11. Next-generation probabilistic seismicity forecasting

    International Nuclear Information System (INIS)

    Hiemer, S.

    2014-01-01

    novel automated method to investigate the significance of spatial b-value variations. The method incorporates an objective data-driven partitioning scheme, which is based on penalized likelihood estimates. These well-defined criteria avoid the difficult choice of commonly applied spatial mapping parameters, such as grid spacing or size of mapping radii. We construct a seismicity forecast that includes spatial b-value variations and demonstrate our model’s skill and reliability when applied to data from California. All proposed probabilistic seismicity forecasts were subjected to evaluation methods using state of the art algorithms provided by the 'Collaboratory for the Study of Earthquake Predictability' infrastructure. First, we evaluated the statistical agreement between the forecasted and observed rates of target events in terms of number, space and magnitude. Secondly, we assessed the performance of one forecast relative to another. We find that the forecasts presented in this thesis are reliable and show significant skills with respect to established classical forecasts. These next-generation probabilistic seismicity forecasts can thus provide hazard information that are potentially useful in reducing earthquake losses and enhancing community preparedness and resilience. (author)

  12. Achievement goals affect metacognitive judgments

    Science.gov (United States)

    Ikeda, Kenji; Yue, Carole L.; Murayama, Kou; Castel, Alan D.

    2017-01-01

    The present study examined the effect of achievement goals on metacognitive judgments, such as judgments of learning (JOLs) and metacomprehension judgments, and actual recall performance. We conducted five experiments manipulating the instruction of achievement goals. In each experiment, participants were instructed to adopt mastery-approach goals (i.e., develop their own mental ability through a memory task) or performance-approach goals (i.e., demonstrate their strong memory ability through getting a high score on a memory task). The results of Experiments 1 and 2 showed that JOLs of word pairs in the performance-approach goal condition tended to be higher than those in the mastery-approach goal condition. In contrast, cued recall performance did not differ between the two goal conditions. Experiment 3 also demonstrated that metacomprehension judgments of text passages were higher in the performance-approach goal condition than in the mastery-approach goals condition, whereas test performance did not differ between conditions. These findings suggest that achievement motivation affects metacognitive judgments during learning, even when achievement motivation does not influence actual performance. PMID:28983496

  13. Setting Ambitious yet Achievable Targets Using Probabilistic Projections: Meeting Demand for Family Planning.

    Science.gov (United States)

    Kantorová, Vladimíra; New, Jin Rou; Biddlecom, Ann; Alkema, Leontine

    2017-09-01

    In 2015, governments adopted 17 internationally agreed goals to ensure progress and well-being in the economic, social, and environmental dimensions of sustainable development. These new goals present a challenge for countries to set empirical targets that are ambitious yet achievable and that can account for different starting points and rates of progress. We used probabilistic projections of family planning indicators, based on a global data set and Bayesian hierarchical modeling, to generate illustrative targets at the country level. Targets were defined as the percentage of demand for family planning satisfied with modern contraceptive methods where a country has at least a 10 percent chance of reaching the target by 2030. National targets for 2030 ranged from below 50 percent of demand satisfied with modern contraceptives (for three countries in Africa) to above 90 percent (for 41 countries from all major areas of the world). The probabilistic approach also identified countries for which a global fixed target value of 75 percent demand satisfied was either unambitious or has little chance of achievement. We present the web-based Family Planning Estimation Tool (FPET) enabling national decision makers to compute and assess targets for meeting family planning demand. © 2017 The Population Council, Inc.
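
    A hedged sketch of the target logic described in this abstract: given posterior samples of a country's projected indicator for 2030, the "ambitious yet achievable" target is the level that still has at least a 10% chance of being reached, i.e. roughly the 90th percentile of the projection. The samples below are hypothetical and this is an illustration only, not the FPET tool.

    # Deriving an "at least 10% chance" target from hypothetical posterior projection samples.
    import random

    random.seed(3)
    # Hypothetical posterior samples of the percent of demand satisfied with modern methods in 2030
    projection_2030 = sorted(random.gauss(68.0, 6.0) for _ in range(4000))

    def percentile(sorted_samples, q):
        idx = int(q * (len(sorted_samples) - 1))
        return sorted_samples[idx]

    target = percentile(projection_2030, 0.90)   # reached with roughly 10% probability
    median = percentile(projection_2030, 0.50)

    print(f"Median projection for 2030: {median:.1f}% of demand satisfied")
    print(f"Target (>=10% chance of being reached): {target:.1f}%")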

  14. Institutional implications of establishing safety goals for nuclear power plants

    International Nuclear Information System (INIS)

    Morris, F.A.; Hooper, R.L.

    1983-07-01

    The purpose of this project is to anticipate and address institutional problems that may arise from the adoption of NRC's proposed Policy Statement on Safety Goals for Nuclear Power Plants. The report emphasizes one particular category of institutional problems: the possible use of safety goals as a basis for legal challenges to NRC actions, and the resolution of such challenges by the courts. Three types of legal issues are identified and analyzed. These are, first, general legal issues such as access to the legal system, burden of proof, and standard of proof. Second is the particular formulation of goals. Involved here are such questions as sustainable rationale, definitions, avoided issues, vagueness of time and space details, and degree of conservatism. Implementation brings up the third set of issues which include interpretation and application, linkage to probabilistic risk assessment, consequences as compared to events, and the use of results

  15. Probabilistic cost estimating of nuclear power plant construction projects

    International Nuclear Information System (INIS)

    Finch, W.C.; Perry, L.W.; Postula, F.D.

    1978-01-01

    This paper shows how to identify and isolate cost accounts by developing probability trees down to component levels as justified by value and cost uncertainty. Examples are given of the procedure for assessing uncertainty in all areas contributing to cost: design, factory equipment pricing, and field labor and materials. The method of combining these individual uncertainties is presented so that the cost risk can be developed for components, systems and the total plant construction project. Formats which enable management to use the probabilistic cost estimate information for business planning and risk control are illustrated. Topics considered include code estimate performance, cost allocation, uncertainty encoding, probabilistic cost distributions, and interpretation. Effective cost control of nuclear power plant construction projects requires insight into areas of greatest cost uncertainty and a knowledge of the factors which can cause costs to vary from the single value estimates. It is concluded that probabilistic cost estimating can provide the necessary assessment of uncertainties both as to the cause and the consequences
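
    As a minimal illustration of the probability-tree idea described here, the sketch below enumerates discrete outcomes for a few hypothetical cost accounts and derives the distribution of total cost, its expectation, and an overrun probability. The accounts, probabilities, costs and budget figure are invented.

    # Probability-tree enumeration of total cost; all accounts and numbers are hypothetical.
    from itertools import product

    accounts = {                      # outcome: (probability, cost in M$)
        "reactor equipment": [(0.5, 300), (0.3, 340), (0.2, 400)],
        "field labor":       [(0.6, 150), (0.4, 210)],
        "materials":         [(0.7, 80), (0.3, 110)],
    }

    distribution = []                 # (probability of branch, total cost of branch)
    for branch in product(*accounts.values()):
        prob = 1.0
        cost = 0.0
        for p, c in branch:
            prob *= p
            cost += c
        distribution.append((prob, cost))

    expected = sum(p * c for p, c in distribution)
    # Probability that total cost exceeds a hypothetical budget of 600 M$
    p_overrun = sum(p for p, c in distribution if c > 600)
    print(f"Expected total cost: {expected:.1f} M$")
    print(f"P(total cost > 600 M$): {p_overrun:.3f}")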

  16. Experimental POD measurement using ultrasonic phased arrays for incorporating nondestructive tests in probabilistic failure analyses

    International Nuclear Information System (INIS)

    Kurz, Jochen H.; Dobmann, Gerd; Juengert, Anne; Dugan, Sandra; Roos, Eberhard

    2011-01-01

    In nuclear facilities, nondestructive tests are carried out during construction and during inspections. The type and extent of the tests are specified in the KTA rules. All tests must be qualified. In the past, the qualifications were made by extensive performance demonstrations of the test teams and equipment, which were judged by experts. This provided primarily pragmatic information on fault detection performance. In the USA, qualification of EPRI test teams also includes testing of test pieces with hidden (unknown) defects, of which a certain percentage must be detected. There is still a lack of information on the probability of detection (POD), in the form of POD curves, of specific defects in given test situations, using specifically selected testing techniques. Quantification of POD and the integration of relevant data in the probabilistic evaluation chain is one of the key goals of a research project whose first results are presented here. The concept of the project and first results of ultrasonic tests are presented. Defect distributions in the test pieces, experiment planning, and test specifications are discussed in more detail. One of the most important goals is the specification of the residual uncertainty of component failure on the basis of the investigations. An outlook on this is presented.
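
    For context, a probability-of-detection curve of the kind mentioned here is often represented by a log-logistic hit/miss model; the sketch below evaluates such a curve for hypothetical parameters and reports the commonly quoted a90 size (defect size detected with 90% probability). The model form is standard in POD work, but the parameter values are invented and are not results from this project.

    # Log-logistic POD curve: POD(a) = 1 / (1 + exp(-(ln a - mu) / sigma)); parameters hypothetical.
    import math

    MU = math.log(2.0)    # natural log of the 50%-detection size in mm (hypothetical)
    SIGMA = 0.35          # scatter parameter (hypothetical)

    def pod(a_mm: float) -> float:
        return 1.0 / (1.0 + math.exp(-(math.log(a_mm) - MU) / SIGMA))

    # a90: invert the model for POD = 0.9
    a90 = math.exp(MU + SIGMA * math.log(0.9 / 0.1))
    print(f"a90 = {a90:.2f} mm")
    for size in (1.0, 2.0, 3.0, 4.0, 6.0):
        print(f"POD({size:.0f} mm) = {pod(size):.2f}")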

  17. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity123

    Science.gov (United States)

    Pecevski, Dejan

    2016-01-01

    Abstract Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference. PMID:27419214

  18. Probabilistic finite elements for fracture mechanics

    Science.gov (United States)

    Besterfield, Glen

    1988-01-01

    The probabilistic finite element method (PFEM) is developed for probabilistic fracture mechanics (PFM). A finite element which has the near crack-tip singular strain embedded in the element is used. Probabilistic distributions, such as the expectation, covariance and correlation of stress intensity factors, are calculated for random load, random material and random crack length. The method is computationally quite efficient and can be expected to determine the probability of fracture or reliability.

  19. N reactor individual risk comparison to quantitative nuclear safety goals

    International Nuclear Information System (INIS)

    Wang, O.S.; Rainey, T.E.; Zentner, M.D.

    1990-01-01

    A full-scope level III probabilistic risk assessment (PRA) has been completed for N reactor, a US Department of Energy (DOE) production reactor located on the Hanford Reservation in the state of Washington. Sandia National Laboratories (SNL) provided the technical leadership for this work, using the state-of-the-art NUREG-1150 methodology developed for the US Nuclear Regulatory Commission (NRC). The main objectives of this effort were to assess the risks to the public and to the on-site workers posed by the operation of N reactor, to identify changes to the plant that could reduce the overall risk, and to compare those risks to the proposed NRC and DOE quantitative safety goals. This paper presents the methodology adopted by Westinghouse Hanford Company (WHC) and SNL for individual health risk evaluation, its results, and a comparison to the NRC safety objectives and the DOE nuclear safety guidelines. The N reactor results are also compared with the five NUREG-1150 nuclear plants. Only internal events are compared here because external events are not yet reported in the current draft of NUREG-1150. This is the first full-scope level III PRA study with a detailed quantitative safety goal comparison performed for DOE production reactors

  20. Task complexity and task, goal, and reward interdependence in group performance management : A prescriptive model

    NARCIS (Netherlands)

    van Vijfeijken, H.; Kleingeld, A.; van Tuijl, H.; Algera, J.A.; Thierry, Hk.

    2002-01-01

    A prescriptive model on how to design effective combinations of goal setting and contingent rewards for group performance management is presented. The model incorporates the constructs task complexity, task interdependence, goal interdependence, and reward interdependence and specifies optimal fit

  1. Nuclear power plant personnel errors in decision-making as an object of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Reer, B.

    1993-09-01

    The integration of human error - also called man-machine system analysis (MMSA) - is an essential part of probabilistic risk assessment (PRA). A new method is presented which allows for a systematic and comprehensive PRA inclusion of decision-based errors due to conflicts or similarities. For the error identification procedure, new questioning techniques are developed. These errors are shown to be identified by looking at retroactions caused by subordinate goals as components of the overall safety-relevant goal. New quantification methods for estimating situation-specific probabilities are developed. The factors conflict and similarity are operationalized in a way that allows their quantification based on information which is usually available in PRA. The quantification procedure uses extrapolations and interpolations based on a sparse set of data related to decision-based errors. Moreover, for passive errors in decision-making a completely new approach is presented where errors are quantified via a delay initiating the required action rather than via error probabilities. The practicability of this dynamic approach is demonstrated by a probabilistic analysis of the actions required during the total loss of feedwater event at the Davis-Besse plant in 1985. The extensions of the ''classical'' PRA method developed in this work are applied to an MMSA of the decay heat removal (DHR) of the ''HTR-500''. Errors in decision-making - as potential roots of extraneous acts - are taken into account in a comprehensive and systematic manner. Five additional errors are identified. However, the probabilistic quantification results in a nonsignificant increase of the DHR failure probability. (orig.)

  2. Probabilistic Mu-Calculus

    DEFF Research Database (Denmark)

    Larsen, Kim Guldstrand; Mardare, Radu Iulian; Xue, Bingtian

    2016-01-01

    We introduce a version of the probabilistic µ-calculus (PMC) built on top of a probabilistic modal logic that allows encoding n-ary inequational conditions on transition probabilities. PMC extends previously studied calculi and we prove that, despite its expressiveness, it enjoys a series of good...... metaproperties. Firstly, we prove the decidability of satisfiability checking by establishing the small model property. An algorithm for deciding the satisfiability problem is developed. As a second major result, we provide a complete axiomatization for the alternation-free fragment of PMC. The completeness proof...

  3. Concurrent Probabilistic Simulation of High Temperature Composite Structural Response

    Science.gov (United States)

    Abdi, Frank

    1996-01-01

    A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software 'GENOA' is dedicated to parallel and high speed analysis to perform probabilistic evaluation of high temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives which were achieved in performing the development were: (1) Utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of structure, material and processing of high temperature composite affordable; (2) Computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) Implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism, and increasing convergence rates through high- and low-level processor assignment; (4) Creating the framework for Portable Paralleled architecture for the machine independent Multi Instruction Multi Data, (MIMD), Single Instruction Multi Data (SIMD), hybrid and distributed workstation type of computers; and (5) Market evaluation. The results of Phase-2 effort provides a good basis for continuation and warrants Phase-3 government, and industry partnership.

  4. Performance goals in conflictual social interactions: towards the distinction between two modes of relational conflict regulation.

    Science.gov (United States)

    Sommet, Nicolas; Darnon, Céline; Mugny, Gabriel; Quiamzade, Alain; Pulfrey, Caroline; Dompnier, Benoît; Butera, Fabrizio

    2014-03-01

    Socio-cognitive conflict has been defined as a situation of confrontation with a disagreeing other. Previous research suggests that individuals can regulate conflict in a relational way, namely by focusing on social comparison between relative levels of competences. Relational conflict regulation has been described as yielding particularly negative effects on social interactions and learning, but has been understudied. The present research addresses the question of the origin of relational conflict regulation by introducing a fundamental distinction between two types of regulation, one based on the affirmation of one's own point of view and the invalidation of the other's (i.e., 'competitive' regulation), the other corresponding to the protection of self-competence via compliance (i.e., 'protective' regulation). Three studies show that these modes of relational conflict regulation result from the endorsement of distinct performance goals, respectively, performance-approach goals (trying to outperform others) and performance-avoidance goals (avoiding performing more poorly than others). Theoretical implications for the literature on both conflict regulation and achievement goals are discussed. © 2012 The British Psychological Society.

  5. Probabilistic seismic history matching using binary images

    Science.gov (United States)

    Davolio, Alessandra; Schiozer, Denis Jose

    2018-02-01

    Currently, the goal of history-matching procedures is not only to provide a model matching any observed data but also to generate multiple matched models to properly handle uncertainties. One such approach is a probabilistic history-matching methodology based on the discrete Latin Hypercube sampling algorithm, proposed in previous works, which was particularly efficient for matching well data (production rates and pressure). 4D seismic (4DS) data have been increasingly included into history-matching procedures. A key issue in seismic history matching (SHM) is to transfer data into a common domain: impedance, amplitude or pressure, and saturation. In any case, seismic inversions and/or modeling are required, which can be time consuming. An alternative to avoid these procedures is using binary images in SHM as they allow the shape, rather than the physical values, of observed anomalies to be matched. This work presents the incorporation of binary images in SHM within the aforementioned probabilistic history matching. The application was performed with real data from a segment of the Norne benchmark case that presents strong 4D anomalies, including softening signals due to pressure build up. The binary images are used to match the pressurized zones observed in time-lapse data. Three history matchings were conducted using: only well data, well and 4DS data, and only 4DS. The methodology is very flexible and successfully utilized the addition of binary images for seismic objective functions. Results proved the good convergence of the method in few iterations for all three cases. The matched models of the first two cases provided the best results, with similar well matching quality. The second case provided models presenting pore pressure changes according to the expected dynamic behavior (pressurized zones) observed on 4DS data. The use of binary images in SHM is relatively new with few examples in the literature. This work enriches this discussion by presenting a new

  6. Site-specific Probabilistic Analysis of DCGLs Using RESRAD Code

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jeongju; Yoon, Suk Bon; Sohn, Wook [KHNP CRI, Daejeon (Korea, Republic of)

    2016-10-15

    In general, DCGLs can be conservative (screening DCGL) if they do not take into account site specific factors. Use of such conservative DCGLs can lead to additional remediation that would not be required if the effort was made to develop site-specific DCGLs. Therefore, the objective of this work is to provide an example on the use of the RESRAD 6.0 probabilistic (site-specific) dose analysis to compare with the screening DCGL. Site release regulations state that a site will be considered acceptable for unrestricted use if the residual radioactivity that is distinguishable from background radiation results in a Total Effective Dose Equivalent (TEDE) to an average member of the critical group of less than the site release criteria, for example 0.25 mSv per year in U.S. Utilities use computer dose modeling codes to establish an acceptable level of contamination, the derived concentration guideline level (DCGL) that will meet this regulatory limit. Since the DCGL value is the principal measure of residual radioactivity, it is critical to understand the technical basis of these dose modeling codes. The objective this work was to provide example on nuclear power plant decommissioning dose analysis in a probabilistic analysis framework. The focus was on the demonstration of regulatory compliance for surface soil contamination using the RESRAD 6.0 code. Both the screening and site-specific probabilistic dose analysis methodologies were examined. Example analyses performed with the screening probabilistic dose analysis confirmed the conservatism of the NRC screening values and indicated the effectiveness of probabilistic dose analysis in reducing the conservatism in DCGL derivation.

  7. Formalizing Probabilistic Safety Claims

    Science.gov (United States)

    Herencia-Zapana, Heber; Hagen, George E.; Narkawicz, Anthony J.

    2011-01-01

    A safety claim for a system is a statement that the system, which is subject to hazardous conditions, satisfies a given set of properties. Following work by John Rushby and Bev Littlewood, this paper presents a mathematical framework that can be used to state and formally prove probabilistic safety claims. It also enables hazardous conditions, their uncertainties, and their interactions to be integrated into the safety claim. This framework provides a formal description of the probabilistic composition of an arbitrary number of hazardous conditions and their effects on system behavior. An example is given of a probabilistic safety claim for a conflict detection algorithm for aircraft in a 2D airspace. The motivation for developing this mathematical framework is that it can be used in an automated theorem prover to formally verify safety claims.

  8. The Relationship Between Goal Orientation, Social Comparison Responses, Self-Efficacy, and Performance

    NARCIS (Netherlands)

    Carmona, Carmen; Buunk, Abraham P.; Dijkstra, Arie; Peiro, Jose M.

    2008-01-01

    The present study examined whether social comparison responses (identification and contrast in social comparison) mediated the relationship between goal orientation (promotion and prevention) and self-efficacy, and whether self-efficacy was subsequently related with a better performance. As

  9. Compression of Probabilistic XML documents

    NARCIS (Netherlands)

    Veldman, Irma

    2009-01-01

    Probabilistic XML (PXML) files resulting from data integration can become extremely large, which is undesired. For XML there are several techniques available to compress the document and since probabilistic XML is in fact (a special form of) XML, it might benefit from these methods even more. In

  10. Comparison between Canadian probabilistic safety assessment methods formulated by Atomic Energy of Canada limited and probabilistic risk assessment methods

    International Nuclear Information System (INIS)

    Shapiro, H.S.; Smith, J.E.

    1989-01-01

    The procedures used by Atomic Energy of Canada Limited (AECL) to perform probabilistic safety assessments (PRAs) differ somewhat from conventionally accepted probabilistic risk assessment (PRA) procedures used elsewhere. In Canada, PSA is used by AECL as an audit tool for an evolving design. The purpose is to assess the safety of the plant in engineering terms. Thus, the PSA procedures are geared toward providing engineering feedback so that necessary changes can be made to the design at an early stage, input can be made to operating procedures, and test and maintenance programs can be optimized in terms of costs. Most PRAs, by contrast, are performed in plants that are already built. Their main purpose is to establish the core melt frequency and the risk to the public due to core melt. Also, any design modification is very expensive. The differences in purpose and timing between PSA and PRA have resulted in differences in methodology and scope. The PSA procedures are used on all plants being designed by AECL

  11. Probabilistic Teleportation of an Arbitrary Two-Particle State and Its Quantum Circuits

    Institute of Scientific and Technical Information of China (English)

    GUO Zhan-Ying; FANG Jian-Xing; ZHU Shi-Qun; QIAN Xue-Min

    2006-01-01

    Two simple schemes for probabilistic teleportation of an arbitrary unknown two-particle state using a non-maximally entangled EPR pair and a non-maximally entangled GHZ state as quantum channels are proposed.After receiving Alice's Bell state measurement results, Bob performs a collective unitary transformation on his inherent particles without introducing the auxiliary qubit. The original state can be probabilistically teleported. Meanwhile,quantum circuits for realization of successful teleportation are also presented.

  12. Probabilistic Infinite Secret Sharing

    OpenAIRE

    Csirmaz, László

    2013-01-01

    The study of probabilistic secret sharing schemes using arbitrary probability spaces and possibly infinite number of participants lets us investigate abstract properties of such schemes. It highlights important properties, explains why certain definitions work better than others, connects this topic to other branches of mathematics, and might yield new design paradigms. A probabilistic secret sharing scheme is a joint probability distribution of the shares and the secret together with a colle...

  13. Achieving safety/risk goals for less ATR backup power upgrades

    International Nuclear Information System (INIS)

    Atkinson, S.A.

    1995-01-01

    The Advanced Test Reactor probabilistic risk assessment for internal fire and flood events defined a relatively high risk for a total loss of electric power possibly leading to core damage. Backup power sources were disabled due to fire and flooding in the diesel generator area with propagation of the flooding to a common switchgear room. The ATR risk assessment was employed to define options for relocation of backup power system components to achieve needed risk reduction while minimizing costs. The risk evaluations were performed using sensitivity studies and importance measures. The risk-based evaluations of relocation options for backup power systems saved over $3 million from what might have been otherwise considered open-quotes necessaryclose quotes for safety/risk improvement. The ATR experience shows that the advantages of a good risk assessment are to define risk significance, risk specifics, and risk solutions which enable risk goals to be achieved at the lowest cost

  14. Probabilistic simulation applications to reliability assessments

    International Nuclear Information System (INIS)

    Miller, Ian; Nutt, Mark W.; Hill, Ralph S. III

    2003-01-01

    Probabilistic risk/reliability (PRA) analyses for engineered systems are conventionally based on fault-tree methods. These methods are mature and efficient, and are well suited to systems consisting of interacting components with known, low probabilities of failure. Even complex systems, such as nuclear power plants or aircraft, are modeled by the careful application of these approaches. However, for systems that may evolve in complex and nonlinear ways, and where the performance of components may be a sensitive function of the history of their working environments, fault-tree methods can be very demanding. This paper proposes an alternative method of evaluating such systems, based on probabilistic simulation using intelligent software objects to represent the components of such systems. Using a Monte Carlo approach, simulation models can be constructed from relatively simple interacting objects that capture the essential behavior of the components that they represent. Such models are capable of reflecting the complex behaviors of the systems that they represent in a natural and realistic way. (author)

  15. Probabilistic simple sticker systems

    Science.gov (United States)

    Selvarajoo, Mathuri; Heng, Fong Wan; Sarmin, Nor Haniza; Turaev, Sherzod

    2017-04-01

    A model for DNA computing using the recombination behavior of DNA molecules, known as a sticker system, was introduced by by L. Kari, G. Paun, G. Rozenberg, A. Salomaa, and S. Yu in the paper entitled DNA computing, sticker systems and universality from the journal of Acta Informatica vol. 35, pp. 401-420 in the year 1998. A sticker system uses the Watson-Crick complementary feature of DNA molecules: starting from the incomplete double stranded sequences, and iteratively using sticking operations until a complete double stranded sequence is obtained. It is known that sticker systems with finite sets of axioms and sticker rules generate only regular languages. Hence, different types of restrictions have been considered to increase the computational power of sticker systems. Recently, a variant of restricted sticker systems, called probabilistic sticker systems, has been introduced [4]. In this variant, the probabilities are initially associated with the axioms, and the probability of a generated string is computed by multiplying the probabilities of all occurrences of the initial strings in the computation of the string. Strings for the language are selected according to some probabilistic requirements. In this paper, we study fundamental properties of probabilistic simple sticker systems. We prove that the probabilistic enhancement increases the computational power of simple sticker systems.

  16. Probabilistic seismic hazard analysis - lessons learned: A regulator's perspective

    International Nuclear Information System (INIS)

    Reiter, L.

    1990-01-01

    Probabilistic seismic hazard analysis is a powerful, rational and attractive tool for decision-making. It is capable of absorbing and integrating a wide range of information and judgement and their associated uncertainties into a flexible framework that permits the application of societal goals and priorities. Unfortunately, its highly integrative nature can obscure those elements which drive the results, its highly quantitative nature can lead to false impressions of accuracy, and its open embrace of uncertainty can make decision-making difficult. Addressing these problems can only help to increase its use and make it more palatable to those who need to assess seismic hazard and utilize the results. (orig.)

  17. Probabilistic safety criteria on high burnup HWR fuels

    International Nuclear Information System (INIS)

    Marino, A.C.

    2002-01-01

    BACO is a code for the simulation of the thermo-mechanical and fission gas behaviour of a cylindrical fuel rod under operation conditions. Their input parameters and, therefore, output ones may include statistical dispersion. In this paper, experimental CANDU fuel rods irradiated at the NRX reactor together with experimental MOX fuel rods and the IAEA-CRP FUMEX cases are used in order to determine the sensitivity of BACO code predictions. The techniques for sensitivity analysis defined in BACO are: the 'extreme case analysis', the 'parametric analysis' and the 'probabilistic (or statistics) analysis'. We analyse the CARA and CAREM fuel rods relation between predicted performance and statistical dispersion in order of enhanced their original designs taking account probabilistic safety criteria and using the BACO's sensitivity analysis. (author)

  18. Comparing Psychology Undergraduates' Performance in Probabilistic Reasoning under Verbal-Numerical and Graphical-Pictorial Problem Presentation Format: What Is the Role of Individual and Contextual Dimensions?

    Science.gov (United States)

    Agus, Mirian; Peró-Cebollero, Maribel; Penna, Maria Pietronilla; Guàrdia-Olmos, Joan

    2015-01-01

    This study aims to investigate about the existence of a graphical facilitation effect on probabilistic reasoning. Measures of undergraduates' performances on problems presented in both verbal-numerical and graphical-pictorial formats have been related to visuo-spatial and numerical prerequisites, to statistical anxiety, to attitudes towards…

  19. Probabilistic methods for rotordynamics analysis

    Science.gov (United States)

    Wu, Y.-T.; Torng, T. Y.; Millwater, H. R.; Fossum, A. F.; Rheinfurth, M. H.

    1991-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of dynamic systems that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria based upon the eigenvalues or Routh-Hurwitz test functions are investigated. Computational methods based on a fast probability integration concept and an efficient adaptive importance sampling method are proposed to perform efficient probabilistic analysis. A numerical example is provided to demonstrate the methods.

  20. Probabilistic Modeling of Wind Turbine Drivetrain Components

    DEFF Research Database (Denmark)

    Rafsanjani, Hesam Mirzaei

    Wind energy is one of several energy sources in the world and a rapidly growing industry in the energy sector. When placed in offshore or onshore locations, wind turbines are exposed to wave excitations, highly dynamic wind loads and/or the wakes from other wind turbines. Therefore, most components...... in a wind turbine experience highly dynamic and time-varying loads. These components may fail due to wear or fatigue, and this can lead to unplanned shutdown repairs that are very costly. The design by deterministic methods using safety factors is generally unable to account for the many uncertainties. Thus......, a reliability assessment should be based on probabilistic methods where stochastic modeling of failures is performed. This thesis focuses on probabilistic models and the stochastic modeling of the fatigue life of the wind turbine drivetrain. Hence, two approaches are considered for stochastic modeling...

  1. Relating beliefs in writing skill malleability to writing performance: The mediating role of achievement goals and self-efficacy

    Directory of Open Access Journals (Sweden)

    Teresa Limpo

    2017-10-01

    Full Text Available It is well established that students’ beliefs in skill malleability influence their academic performance. Specifically, thinking of ability as an incremental (vs. fixed trait is associated with better outcomes. Though this was shown across many domains, little research exists into these beliefs in the writing domain and into the mechanisms underlying their effects on writing performance. The aim of this study was twofold: to gather evidence on the validity and reliability of instruments to measure beliefs in skill malleability, achievement goals, and self-efficacy in writing; and to test a path-analytic model specifying beliefs in writing skill malleability to influence writing performance, via goals and self-efficacy. For that, 196 Portuguese students in Grades 7-8 filled in the instruments and wrote an opinion essay that was assessed for writing performance. Confirmatory factor analyses supported instruments’ validity and reliability. Path analysis revealed direct effects from beliefs in writing skill malleability to mastery goals (ß = .45; from mastery goals to self-efficacy for conventions, ideation, and self-regulation (ß = .27, .42, and .42, respectively; and from self-efficacy for self-regulation to writing performance (ß = .16; along with indirect effects from beliefs in writing skill malleability to self-efficacy for self-regulation via mastery goals (ß = .19, and from mastery goals to writing performance via self-efficacy for self-regulation (ß = .07. Overall, students’ mastery goals and self-efficacy for self-regulation seem to be key factors underlying the link between beliefs in writing skill malleability and writing performance. These findings highlight the importance of attending to motivation-related components in the teaching of writing.

  2. Motivating learning, performance, and persistence: the synergistic effects of intrinsic goal contents and autonomy-supportive contexts.

    Science.gov (United States)

    Vansteenkiste, Maarten; Simons, Joke; Lens, Willy; Sheldon, Kennon M; Deci, Edward L

    2004-08-01

    Three field experiments with high school and college students tested the self-determination theory hypotheses that intrinsic (vs. extrinsic) goals and autonomy-supportive (vs. controlling) learning climates would improve students' learning, performance, and persistence. The learning of text material or physical exercises was framed in terms of intrinsic (community, personal growth, health) versus extrinsic (money, image) goals, which were presented in an autonomy-supportive versus controlling manner. Analyses of variance confirmed that both experimentally manipulated variables yielded main effects on depth of processing, test performance, and persistence (all ps intrinsic goals and autonomy support were present. Effects were significantly mediated by autonomous motivation.

  3. Which Feedback Is More Effective for Pursuing Multiple Goals of Differing Importance? The Interaction Effects of Goal Importance and Performance Feedback Type on Self-Regulation and Task Achievement

    Science.gov (United States)

    Lee, Hyunjoo

    2016-01-01

    This study examined how performance feedback type (progress vs. distance) affects Korean college students' self-regulation and task achievement according to relative goal importance in the pursuit of multiple goals. For this study, 146 students participated in a computerised task. The results showed the interaction effects of goal importance and…

  4. Probabilistic Design and Analysis Framework

    Science.gov (United States)

    Strack, William C.; Nagpal, Vinod K.

    2010-01-01

    PRODAF is a software package designed to aid analysts and designers in conducting probabilistic analysis of components and systems. PRODAF can integrate multiple analysis programs to ease the tedious process of conducting a complex analysis process that requires the use of multiple software packages. The work uses a commercial finite element analysis (FEA) program with modules from NESSUS to conduct a probabilistic analysis of a hypothetical turbine blade, disk, and shaft model. PRODAF applies the response surface method, at the component level, and extrapolates the component-level responses to the system level. Hypothetical components of a gas turbine engine are first deterministically modeled using FEA. Variations in selected geometrical dimensions and loading conditions are analyzed to determine the effects of the stress state within each component. Geometric variations include the cord length and height for the blade, inner radius, outer radius, and thickness, which are varied for the disk. Probabilistic analysis is carried out using developing software packages like System Uncertainty Analysis (SUA) and PRODAF. PRODAF was used with a commercial deterministic FEA program in conjunction with modules from the probabilistic analysis program, NESTEM, to perturb loads and geometries to provide a reliability and sensitivity analysis. PRODAF simplified the handling of data among the various programs involved, and will work with many commercial and opensource deterministic programs, probabilistic programs, or modules.

  5. Probabilistic Role Models and the Guarded Fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2004-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragm...... fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given....

  6. Probabilistic role models and the guarded fragment

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2006-01-01

    We propose a uniform semantic framework for interpreting probabilistic concept subsumption and probabilistic role quantification through statistical sampling distributions. This general semantic principle serves as the foundation for the development of a probabilistic version of the guarded fragm...... fragment of first-order logic. A characterization of equivalence in that logic in terms of bisimulations is given....

  7. Games people play: How video games improve probabilistic learning.

    Science.gov (United States)

    Schenk, Sabrina; Lech, Robert K; Suchan, Boris

    2017-09-29

    Recent research suggests that video game playing is associated with many cognitive benefits. However, little is known about the neural mechanisms mediating such effects, especially with regard to probabilistic categorization learning, which is a widely unexplored area in gaming research. Therefore, the present study aimed to investigate the neural correlates of probabilistic classification learning in video gamers in comparison to non-gamers. Subjects were scanned in a 3T magnetic resonance imaging (MRI) scanner while performing a modified version of the weather prediction task. Behavioral data yielded evidence for better categorization performance of video gamers, particularly under conditions characterized by stronger uncertainty. Furthermore, a post-experimental questionnaire showed that video gamers had acquired higher declarative knowledge about the card combinations and the related weather outcomes. Functional imaging data revealed for video gamers stronger activation clusters in the hippocampus, the precuneus, the cingulate gyrus and the middle temporal gyrus as well as in occipital visual areas and in areas related to attentional processes. All these areas are connected with each other and represent critical nodes for semantic memory, visual imagery and cognitive control. Apart from this, and in line with previous studies, both groups showed activation in brain areas that are related to attention and executive functions as well as in the basal ganglia and in memory-associated regions of the medial temporal lobe. These results suggest that playing video games might enhance the usage of declarative knowledge as well as hippocampal involvement and enhances overall learning performance during probabilistic learning. In contrast to non-gamers, video gamers showed better categorization performance, independently of the uncertainty of the condition. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Use of a probabilistic safety study in the design of the Italian reference PWR

    International Nuclear Information System (INIS)

    Richardson, D.C.; Russino, G.; Valentini, V.

    1985-01-01

    The intent of this paper is to provide a description of the experience gained in having performed a Probabilistic Safety Study (PSS) on the proposed Italian reference pressurized water reactor. The experience revealed that through careful application of probabilistic techniques, Probabilistic Risk Assessment (PRA) can be used as a tool to develop an optimum plant design in terms of safety and cost. Furthermore, the PSS can also be maintained as a living document and a tool to assess additional regulatory requirements that may be imposed during the construction and operational life of the plant. Through the use of flexible probabilistic techniques, the probabilistic safety model can provide a living safety assessment starting from the conceptual design and continuing through the construction, testing and operational phases. Moreover, the probabilistic safety model can be used during the operational phase of the plant as a method to evaluate the operational experience and identify potential problems before they occur. The experience, overall, provided additional insights into the various aspects of the plants design and operation that would not have been identified through the use of traditional safety evaluation techniques

  9. Probabilistic Programming (Invited Talk)

    OpenAIRE

    Yang, Hongseok

    2017-01-01

    Probabilistic programming refers to the idea of using standard programming constructs for specifying probabilistic models from machine learning and statistics, and employing generic inference algorithms for answering various queries on these models, such as posterior inference and estimation of model evidence. Although this idea itself is not new and was, in fact, explored by several programming-language and statistics researchers in the early 2000, it is only in the last few years that proba...

  10. River Flow Prediction Using the Nearest Neighbor Probabilistic Ensemble Method

    Directory of Open Access Journals (Sweden)

    H. Sanikhani

    2016-02-01

    . Different combinations of recorded data were used as the input pattern to streamflow forecasting. Results and Discussion: Application of the used approaches in ensemble form (in order to choice the optimized parameters improved the model accuracy and robustness in prediction. Different statistical criteria including correlation coefficient (R, root mean squared error (RMSE and Nash–Sutcliffe efficiency coefficient (E were used for evaluating the performance of models. The ranges of parameter values to be covered in the ensemble prediction have been identified by some preliminary tests on the calibration set. Since very small values of k have been found to produce unacceptable results due to the presence of noise, the minimum value is fixed at 100 and trial values are taken up to 10000 (k = 100, 200, 300,500, 1000, 2000, 5000, 10000. The values of mare chosen between 1 and 20 and delay time values γ are tested in the range [1,5]. With increasing the discharge values, the width of confidence band increased and the maximum confidence band is related to maximum river flows. In Dizaj station, for ensemble numbers in the range of 50-100, the variation of RMSE is linear. The variation of RMSE in Mashin station is linear for ensemble members in the range of 100-150. It seems the numbers of ensemble members equals to 100 is suitable for pattern construction. The performance of NNPE model was acceptable for two stations. The number of points excluded 95% confidence interval were equal to 108 and 96 for Dizaj and Mashin stations, respectively. The results showed that the performance of model was better in prediction of minimum and median discharge in comparing maximum values. Conclusion: The results confirmed the performance and reliability of applied methods. The results indicated the better performance and lower uncertainty of ensemble method based on nearest neighbor in comparison with probabilistic nonlinear ensemble method. Nash–Sutcliffe model efficiency coefficient (E for

  11. A functional look at goal orientations : their role for self-estimates of intelligence and performance

    NARCIS (Netherlands)

    Bipp, T.; Steinmayr, R.; Spinath, B.

    2012-01-01

    Building on the notion that motivation energizes and directs resources in achievement situations, we argue that goal orientations affect perceptions of own intelligence and that the effect of goals on performance is partly mediated by self-estimates of intelligence. Studies 1 (n = 89) and 2 (n =

  12. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery Using a Probabilistic Learning Framework

    Science.gov (United States)

    Basu, Saikat; Ganguly, Sangram; Michaelis, Andrew; Votava, Petr; Roy, Anshuman; Mukhopadhyay, Supratik; Nemani, Ramakrishna

    2015-01-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets, which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) for tree-cover delineation for the whole of Continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.

  13. A High Performance Computing Approach to Tree Cover Delineation in 1-m NAIP Imagery using a Probabilistic Learning Framework

    Science.gov (United States)

    Basu, S.; Ganguly, S.; Michaelis, A.; Votava, P.; Roy, A.; Mukhopadhyay, S.; Nemani, R. R.

    2015-12-01

    Tree cover delineation is a useful instrument in deriving Above Ground Biomass (AGB) density estimates from Very High Resolution (VHR) airborne imagery data. Numerous algorithms have been designed to address this problem, but most of them do not scale to these datasets which are of the order of terabytes. In this paper, we present a semi-automated probabilistic framework for the segmentation and classification of 1-m National Agriculture Imagery Program (NAIP) for tree-cover delineation for the whole of Continental United States, using a High Performance Computing Architecture. Classification is performed using a multi-layer Feedforward Backpropagation Neural Network and segmentation is performed using a Statistical Region Merging algorithm. The results from the classification and segmentation algorithms are then consolidated into a structured prediction framework using a discriminative undirected probabilistic graphical model based on Conditional Random Field, which helps in capturing the higher order contextual dependencies between neighboring pixels. Once the final probability maps are generated, the framework is updated and re-trained by relabeling misclassified image patches. This leads to a significant improvement in the true positive rates and reduction in false positive rates. The tree cover maps were generated for the whole state of California, spanning a total of 11,095 NAIP tiles covering a total geographical area of 163,696 sq. miles. The framework produced true positive rates of around 88% for fragmented forests and 74% for urban tree cover areas, with false positive rates lower than 2% for both landscapes. Comparative studies with the National Land Cover Data (NLCD) algorithm and the LiDAR canopy height model (CHM) showed the effectiveness of our framework for generating accurate high-resolution tree-cover maps.

  14. The NUREG-1150 probabilistic risk assessment for the Grand Gulf nuclear station

    International Nuclear Information System (INIS)

    Brown, T.D.; Breeding, R.J.; Jow, H.N.; Higgins, S.J.; Shiver, A.W.; Helton, J.C.

    1992-01-01

    This paper summarizes the findings of the probabilistic risk assessment (PRA) for Unit 1 of the Grand Gulf Nuclear Station performed in support of NUREG-1150. The emphasis is on the 'back-end' analyses, that is, the acident progression, source term, consequence analsyes, and risk results obtained when the results of these analyses are combined with the accident frequency analysis. The offsite risk from internal initiating events was found to be quite low, both with respect to the safety goals and to the other plants analyzed in NUREG-1150. The offsite risk is dominated by short-term station blackout plant damage states. The long-term blackout group and the anticiptated transients without scram (ATWS) group contribute considerably less to risk. Transients in which the power conversion system is unavailable are very minor contributors to risk. The low values for risk can be attributed to low core damage frequency, good emergency response, and plant features that reduce the potential source term. (orig.)

  15. Work Engagement: Antecedents, the Mediating Role of Learning Goal Orientation and Job Performance

    Science.gov (United States)

    Chughtai, Aamir Ali; Buckley, Finian

    2011-01-01

    Purpose: The present paper aims to explore the effects of state (trust in supervisor) and trait (trust propensity) trust on employees' work engagement. Furthermore, it seeks to investigate the mediating role of learning goal orientation in the relationship between work engagement and two forms of performance: in-role job performance and innovative…

  16. Analytical incorporation of fractionation effects in probabilistic treatment planning for intensity-modulated proton therapy.

    Science.gov (United States)

    Wahl, Niklas; Hennig, Philipp; Wieser, Hans-Peter; Bangert, Mark

    2018-04-01

    We show that it is possible to explicitly incorporate fractionation effects into closed-form probabilistic treatment plan analysis and optimization for intensity-modulated proton therapy with analytical probabilistic modeling (APM). We study the impact of different fractionation schemes on the dosimetric uncertainty induced by random and systematic sources of range and setup uncertainty for treatment plans that were optimized with and without consideration of the number of treatment fractions. The APM framework is capable of handling arbitrarily correlated uncertainty models including systematic and random errors in the context of fractionation. On this basis, we construct an analytical dose variance computation pipeline that explicitly considers the number of treatment fractions for uncertainty quantitation and minimization during treatment planning. We evaluate the variance computation model in comparison to random sampling of 100 treatments for conventional and probabilistic treatment plans under different fractionation schemes (1, 5, 30 fractions) for an intracranial, a paraspinal and a prostate case. The impact of neglecting the fractionation scheme during treatment planning is investigated by applying treatment plans that were generated with probabilistic optimization for 1 fraction in a higher number of fractions and comparing them to the probabilistic plans optimized under explicit consideration of the number of fractions. APM enables the construction of an analytical variance computation model for dose uncertainty considering fractionation at negligible computational overhead. It is computationally feasible (a) to simultaneously perform a robustness analysis for all possible fraction numbers and (b) to perform a probabilistic treatment plan optimization for a specific fraction number. The incorporation of fractionation assumptions for robustness analysis exposes a dose to uncertainty trade-off, i.e., the dose in the organs at risk is increased for a

  17. Topics in Probabilistic Judgment Aggregation

    Science.gov (United States)

    Wang, Guanchun

    2011-01-01

    This dissertation is a compilation of several studies that are united by their relevance to probabilistic judgment aggregation. In the face of complex and uncertain events, panels of judges are frequently consulted to provide probabilistic forecasts, and aggregation of such estimates in groups often yield better results than could have been made…

  18. Quantitative probabilistic functional diffusion mapping in newly diagnosed glioblastoma treated with radiochemotherapy.

    Science.gov (United States)

    Ellingson, Benjamin M; Cloughesy, Timothy F; Lai, Albert; Nghiemphu, Phioanh L; Liau, Linda M; Pope, Whitney B

    2013-03-01

    Functional diffusion mapping (fDM) is a cancer imaging technique that uses voxel-wise changes in apparent diffusion coefficients (ADC) to evaluate response to treatment. Despite promising initial results, uncertainty in image registration remains the largest barrier to widespread clinical application. The current study introduces a probabilistic approach to fDM quantification to overcome some of these limitations. A total of 143 patients with newly diagnosed glioblastoma who were undergoing standard radiochemotherapy were enrolled in this retrospective study. Traditional and probabilistic fDMs were calculated using ADC maps acquired before and after therapy. Probabilistic fDMs were calculated by applying random, finite translational, and rotational perturbations to both pre-and posttherapy ADC maps, then repeating calculation of fDMs reflecting changes after treatment, resulting in probabilistic fDMs showing the voxel-wise probability of fDM classification. Probabilistic fDMs were then compared with traditional fDMs in their ability to predict progression-free survival (PFS) and overall survival (OS). Probabilistic fDMs applied to patients with newly diagnosed glioblastoma treated with radiochemotherapy demonstrated shortened PFS and OS among patients with a large volume of tumor with decreasing ADC evaluated at the posttreatment time with respect to the baseline scans. Alternatively, patients with a large volume of tumor with increasing ADC evaluated at the posttreatment time with respect to baseline scans were more likely to progress later and live longer. Probabilistic fDMs performed better than traditional fDMs at predicting 12-month PFS and 24-month OS with use of receiver-operator characteristic analysis. Univariate log-rank analysis on Kaplan-Meier data also revealed that probabilistic fDMs could better separate patients on the basis of PFS and OS, compared with traditional fDMs. Results suggest that probabilistic fDMs are a more predictive biomarker in

  19. A logic for inductive probabilistic reasoning

    DEFF Research Database (Denmark)

    Jaeger, Manfred

    2005-01-01

    Inductive probabilistic reasoning is understood as the application of inference patterns that use statistical background information to assign (subjective) probabilities to single events. The simplest such inference pattern is direct inference: from '70% of As are Bs" and "a is an A" infer...... that a is a B with probability 0.7. Direct inference is generalized by Jeffrey's rule and the principle of cross-entropy minimization. To adequately formalize inductive probabilistic reasoning is an interesting topic for artificial intelligence, as an autonomous system acting in a complex environment may have...... to base its actions on a probabilistic model of its environment, and the probabilities needed to form this model can often be obtained by combining statistical background information with particular observations made, i.e., by inductive probabilistic reasoning. In this paper a formal framework...

  20. A convergence theory for probabilistic metric spaces | Jäger ...

    African Journals Online (AJOL)

    We develop a theory of probabilistic convergence spaces based on Tardiff's neighbourhood systems for probabilistic metric spaces. We show that the resulting category is a topological universe and we characterize a subcategory that is isomorphic to the category of probabilistic metric spaces. Keywords: Probabilistic metric ...

  1. Potential and limits to cluster-state quantum computing using probabilistic gates

    International Nuclear Information System (INIS)

    Gross, D.; Kieling, K.; Eisert, J.

    2006-01-01

    We establish bounds to the necessary resource consumption when building up cluster states for one-way computing using probabilistic gates. Emphasis is put on state preparation with linear optical gates, as the probabilistic character is unavoidable here. We identify rigorous general bounds to the necessary consumption of initially available maximally entangled pairs when building up one-dimensional cluster states with individually acting linear optical quantum gates, entangled pairs, and vacuum modes. As the known linear optics gates have a limited maximum success probability, as we show, this amounts to finding the optimal classical strategy of fusing pieces of linear cluster states. A formal notion of classical configurations and strategies is introduced for probabilistic nonfaulty gates. We study the asymptotic performance of strategies that can be simply described, and prove ultimate bounds to the performance of the globally optimal strategy. The arguments employ methods of random walks and convex optimization. This optimal strategy is also the one that requires the shortest storage time, and necessitates the fewest invocations of probabilistic gates. For two-dimensional cluster states, we find, for any elementary success probability, an essentially deterministic preparation of a cluster state with quadratic, hence optimal, asymptotic scaling in the use of entangled pairs. We also identify a percolation effect in state preparation, in that from a threshold probability on, almost all preparations will be either successful or fail. We outline the implications on linear optical architectures and fault-tolerant computations

  2. a Probabilistic Embedding Clustering Method for Urban Structure Detection

    Science.gov (United States)

    Lin, X.; Li, H.; Zhang, Y.; Gao, L.; Zhao, L.; Deng, M.

    2017-09-01

    Urban structure detection is a basic task in urban geography. Clustering is a core technology to detect the patterns of urban spatial structure, urban functional region, and so on. In big data era, diverse urban sensing datasets recording information like human behaviour and human social activity, suffer from complexity in high dimension and high noise. And unfortunately, the state-of-the-art clustering methods does not handle the problem with high dimension and high noise issues concurrently. In this paper, a probabilistic embedding clustering method is proposed. Firstly, we come up with a Probabilistic Embedding Model (PEM) to find latent features from high dimensional urban sensing data by "learning" via probabilistic model. By latent features, we could catch essential features hidden in high dimensional data known as patterns; with the probabilistic model, we can also reduce uncertainty caused by high noise. Secondly, through tuning the parameters, our model could discover two kinds of urban structure, the homophily and structural equivalence, which means communities with intensive interaction or in the same roles in urban structure. We evaluated the performance of our model by conducting experiments on real-world data and experiments with real data in Shanghai (China) proved that our method could discover two kinds of urban structure, the homophily and structural equivalence, which means clustering community with intensive interaction or under the same roles in urban space.

  3. A PROBABILISTIC EMBEDDING CLUSTERING METHOD FOR URBAN STRUCTURE DETECTION

    Directory of Open Access Journals (Sweden)

    X. Lin

    2017-09-01

    Full Text Available Urban structure detection is a basic task in urban geography. Clustering is a core technology to detect the patterns of urban spatial structure, urban functional region, and so on. In big data era, diverse urban sensing datasets recording information like human behaviour and human social activity, suffer from complexity in high dimension and high noise. And unfortunately, the state-of-the-art clustering methods does not handle the problem with high dimension and high noise issues concurrently. In this paper, a probabilistic embedding clustering method is proposed. Firstly, we come up with a Probabilistic Embedding Model (PEM to find latent features from high dimensional urban sensing data by “learning” via probabilistic model. By latent features, we could catch essential features hidden in high dimensional data known as patterns; with the probabilistic model, we can also reduce uncertainty caused by high noise. Secondly, through tuning the parameters, our model could discover two kinds of urban structure, the homophily and structural equivalence, which means communities with intensive interaction or in the same roles in urban structure. We evaluated the performance of our model by conducting experiments on real-world data and experiments with real data in Shanghai (China proved that our method could discover two kinds of urban structure, the homophily and structural equivalence, which means clustering community with intensive interaction or under the same roles in urban space.

  4. Probabilistic anatomical labeling of brain structures using statistical probabilistic anatomical maps

    International Nuclear Information System (INIS)

    Kim, Jin Su; Lee, Dong Soo; Lee, Byung Il; Lee, Jae Sung; Shin, Hee Won; Chung, June Key; Lee, Myung Chul

    2002-01-01

    The use of statistical parametric mapping (SPM) program has increased for the analysis of brain PET and SPECT images. Montreal neurological institute (MNI) coordinate is used in SPM program as a standard anatomical framework. While the most researchers look up Talairach atlas to report the localization of the activations detected in SPM program, there is significant disparity between MNI templates and Talairach atlas. That disparity between Talairach and MNI coordinates makes the interpretation of SPM result time consuming, subjective and inaccurate. The purpose of this study was to develop a program to provide objective anatomical information of each x-y-z position in ICBM coordinate. Program was designed to provide the anatomical information for the given x-y-z position in MNI coordinate based on the statistical probabilistic anatomical map (SPAM) images of ICBM. When x-y-z position was given to the program, names of the anatomical structures with non-zero probability and the probabilities that the given position belongs to the structures were tabulated. The program was coded using IDL and JAVA language for the easy transplantation to any operating system or platform. Utility of this program was shown by comparing the results of this program to those of SPM program. Preliminary validation study was performed by applying this program to the analysis of PET brain activation study of human memory in which the anatomical information on the activated areas are previously known. Real time retrieval of probabilistic information with 1 mm spatial resolution was archived using the programs. Validation study showed the relevance of this program: probability that the activated area for memory belonged to hippocampal formation was more than 80%. These programs will be useful for the result interpretation of the image analysis performed on MNI coordinate, as done in SPM program

  5. Arbitrage and Hedging in a non probabilistic framework

    OpenAIRE

    Alvarez, Alexander; Ferrando, Sebastian; Olivares, Pablo

    2011-01-01

    The paper studies the concepts of hedging and arbitrage in a non probabilistic framework. It provides conditions for non probabilistic arbitrage based on the topological structure of the trajectory space and makes connections with the usual notion of arbitrage. Several examples illustrate the non probabilistic arbitrage as well perfect replication of options under continuous and discontinuous trajectories, the results can then be applied in probabilistic models path by path. The approach is r...

  6. Probabilistic safety analysis procedures guide. Sections 1-7 and appendices. Volume 1, Revision 1

    International Nuclear Information System (INIS)

    Bari, R.A.; Buslik, A.J.; Cho, N.Z.

    1985-08-01

    A procedures guide for the performance of probabilistic safety assessment has been prepared for interim use in the Nuclear Regulatory Commission programs. It will be revised as comments are received, and as experience is gained from its use. The probabilistic safety assessment studies performed are intended to produce probabilistic predictive models that can be used and extended by the utilities and by NRC to sharpen the focus of inquiries into a range of issues affecting reactor safety. This first volume of the guide describes the determination of the probability (per year) of core damage resulting from accident initiators internal to the plant (i.e., intrinsic to plant operation) and from loss of off-site electric power. The scope includes human reliability analysis, a determination of the importance of various core damage accident sequences, and an explicit treatment and display of uncertainties for key accident sequences. The second volume deals with the treatment of the so-called external events including seismic disturbances, fires, floods, etc. Ultimately, the guide will be augmented to include the plant-specific analysis of in-plant processes (i.e., containment performance). This guide provides the structure of a probabilistic safety study to be performed, and indicates what products of the study are valuable for regulatory decision making. For internal events, methodology is treated in the guide only to the extent necessary to indicate the range of methods which is acceptable; ample reference is given to alternative methodologies which may be utilized in the performance of the study. For external events, more explicit guidance is given

  7. Motivational Goal Bracketing: An Experiment

    DEFF Research Database (Denmark)

    Koch, Alexander; Nafziger, Julia

    We study in an online, real-effort experiment how the bracketing of non-binding goals affects performance in a work-leisure self-control problem. We externally induce the goal bracket - daily goals or a weekly goal - and within that bracket let subjects set goals for how much they want to work over...... a one-week period. Our theoretical model predicts (i) that weekly goals create incentives to compensate for a lower than desired performance today with the promise to work harder tomorrow, whereas daily goals exclude such excuses; (ii) that subjects with daily goals set higher goals in aggregate...... and work harder than those with weekly goals. Our data support these predictions. Surprisingly, however, when goals are combined with an externally enforced commitment that requires subjects to spend less than a minute each day on the task to get started working, performance deteriorates because of high...

  8. Bounding probabilistic safety assessment probabilities by reality

    International Nuclear Information System (INIS)

    Fragola, J.R.; Shooman, M.L.

    1991-01-01

    The investigation of failure in systems where failure is a rare event makes continual comparison between the developed probabilities and empirical evidence difficult. The comparison of the predictions of rare event risk assessments with historical reality is essential to prevent probabilistic safety assessment (PSA) predictions from drifting into fantasy. One approach to performing such comparisons is to search out and assign probabilities to natural events which, while extremely rare, have a basis in the history of natural phenomena or human activities. For example, the Segovian aqueduct and some of the Roman fortresses in Spain have existed for several millennia and in many cases show no physical signs of earthquake damage. This evidence could be used to bound the probability of earthquakes above a certain magnitude to less than 10^-3 per year. On the other hand, there is evidence that some repetitive actions can be performed with extremely low historical probabilities when operators are properly trained and motivated, and sufficient warning indicators are provided. The point is not that low probability estimates are impossible, but that the analysis assumptions must be continually reassessed and the analysis predictions bounded by historical reality. This paper reviews the probabilistic predictions of PSA in this light, attempts to develop, in a general way, the limits which can be historically established and the consequent bounds that these limits place upon the predictions, and illustrates the methodology used in computing such limits. Further, the paper discusses the use of empirical evidence and the requirement for disciplined systematic approaches within the bounds of reality and the associated impact on PSA probabilistic estimates.
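
    As a rough illustration of the kind of bound described, a failure-free historical record of n years limits the annual frequency of the event; under a Poisson occurrence model the 95% upper confidence bound is approximately 3/n (the "rule of three"). The sketch below uses a purely illustrative figure of two millennia without observed damage:

```python
import math


def upper_bound_rate(years_without_event, confidence=0.95):
    """Upper confidence bound on an annual Poisson rate when no event has
    been observed in the given number of years (approximately 3/n at 95%)."""
    return -math.log(1.0 - confidence) / years_without_event


# Roughly two millennia of standing structures with no visible earthquake
# damage (illustrative figure) bound the frequency of such earthquakes to
# the order of 1e-3 per year.
print(f"{upper_bound_rate(2000):.1e} per year")
```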

  9. A goal-based approach for qualification of new technologies: Foundations, tool support, and industrial validation

    International Nuclear Information System (INIS)

    Sabetzadeh, Mehrdad; Falessi, Davide; Briand, Lionel; Di Alesio, Stefano

    2013-01-01

    New technologies typically involve innovative aspects that are not addressed by the existing normative standards and hence are not assessable through common certification procedures. To ensure that new technologies can be implemented in a safe and reliable manner, a specific kind of assessment is performed, which in many industries, e.g., the energy sector, is known as Technology Qualification (TQ). TQ aims at demonstrating with an acceptable level of confidence that a new technology will function within specified limits. Expert opinion plays an important role in TQ, both to identify the safety and reliability evidence that needs to be developed and to interpret the evidence provided. Since there are often multiple experts involved in TQ, it is crucial to apply a structured process for eliciting expert opinions, and to use this information systematically when analyzing the satisfaction of the technology's safety and reliability objectives. In this paper, we present a goal-based approach for TQ. Our approach enables analysts to quantitatively reason about the satisfaction of the technology's overall goals and further to identify the aspects that must be improved to increase goal satisfaction. The approach is founded on three main components: goal models, expert elicitation, and probabilistic simulation. We describe a tool, named Modus, that we have developed in support of our approach. We provide an extensive empirical validation of our approach through two industrial case studies and a survey

  10. A performance goal-based seismic design philosophy for waste repository facilities

    International Nuclear Information System (INIS)

    Hossain, Q.A.

    1994-02-01

    A performance goal-based seismic design philosophy, compatible with DOE's present natural phenomena hazards mitigation and "graded approach" philosophy, has been proposed for high level nuclear waste repository facilities. The rationale, evolution, and the desirable features of this method have been described. Why and how the method should and can be applied to the design of a repository facility are also discussed.

  11. Development of Probabilistic Structural Analysis Integrated with Manufacturing Processes

    Science.gov (United States)

    Pai, Shantaram S.; Nagpal, Vinod K.

    2007-01-01

    An effort has been initiated to integrate manufacturing process simulations with probabilistic structural analyses in order to capture the important impacts of manufacturing uncertainties on component stress levels and life. Two physics-based manufacturing process models (one for powdered metal forging and the other for annular deformation resistance welding) have been linked to the NESSUS structural analysis code. This paper describes the methodology developed to perform this integration including several examples. Although this effort is still underway, particularly for full integration of a probabilistic analysis, the progress to date has been encouraging and a software interface that implements the methodology has been developed. The purpose of this paper is to report this preliminary development.

  12. Probabilistic studies of accident sequences

    International Nuclear Information System (INIS)

    Villemeur, A.; Berger, J.P.

    1986-01-01

    For several years, Electricite de France has carried out probabilistic assessments of accident sequences for nuclear power plants. In the framework of this program many methods were developed. As the interest in these studies was increasing and as adapted methods were developed, Electricite de France has undertaken a probabilistic safety assessment of a nuclear power plant.

  13. Convex sets in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Aghajani, Asadollah; Nourouzi, Kourosh

    2008-01-01

    In this paper we obtain some results on convexity in a probabilistic normed space. We also investigate the concept of CSN-closedness and CSN-compactness in a probabilistic normed space and generalize the corresponding results of normed spaces

  14. Cognitive Orientation to (daily) Occupational Performance (CO-OP) with children with Asperger's syndrome who have motor-based occupational performance goals.

    Science.gov (United States)

    Rodger, Sylvia; Brandenburg, Julia

    2009-02-01

    Motor difficulties associated with Asperger's syndrome (AS) are commonly reported, despite these not being diagnostically significant. Cognitive Orientation to daily Occupational Performance (CO-OP) is a verbal problem-solving intervention developed for use with children with developmental coordination disorder to address their motor-based difficulties. This paper reports on two case studies of children with AS illustrating the outcomes of CO-OP to address motor-based occupational performance goals. A case study approach was used to document how two children with AS engaged in 10 weekly sessions of CO-OP addressing child-chosen motor-based occupational performance goals and the outcomes of this intervention. Pre and post-intervention assessment using the Canadian Occupational Performance Measure, Vineland Adaptive Behaviour Scales and the Performance Quality Rating Scale indicated that both children were able to engage in CO-OP intervention to successfully improve their occupational performance. Further research into the application of CO-OP with children with AS is warranted based on preliminary positive findings regarding the efficacy of this intervention to address motor-based performance difficulties in two children with AS.

  15. The effects of training and competition on achievement goals, motivational responses, and performance in a golf-putting task

    NARCIS (Netherlands)

    Pol, P.K.C. van de; Kavussanu, M.; Ring, C.

    2012-01-01

    This study examined whether (a) training and competition influence achievement goals, effort, enjoyment, tension, and performance; (b) achievement goals mediate the effects of training and competition on effort, enjoyment, tension, and performance; and (c) the context influences the relationships

  16. Probabilistic evaluations for CANTUP computer code analysis improvement

    International Nuclear Information System (INIS)

    Florea, S.; Pavelescu, M.

    2004-01-01

    Structural analysis with the finite element method is today a usual way to evaluate and predict the behavior of structural assemblies subject to harsh conditions, in order to ensure their safety and reliability during operation. A CANDU 600 fuel channel is an example of an assembly working in harsh conditions, in which, besides corrosive and thermal aggression, long-term irradiation interferes, with implicit consequences for the evolution of material properties. This leads inevitably to scatter in the time-dependent material properties, their dynamic evolution being subject to a great degree of uncertainty. These are the reasons for developing, in association with deterministic evaluations with computer codes, probabilistic and statistical methods to predict the structural component response. This work initiates the extension of the deterministic thermomechanical evaluation of fuel channel components to a probabilistic structural mechanics approach, starting from deterministic analyses performed with the CANTUP computer code, a code developed to predict the long term mechanical behavior of the pressure tube - calandria tube assembly. To this purpose the structure of the deterministic CANTUP computer code has been reviewed. The code has been adapted from the LAHEY 77 platform to the Microsoft Developer Studio - Fortran Power Station platform. In order to perform probabilistic evaluations, a part was added to the deterministic code which, using a subroutine from the IMSL library of the Microsoft Developer Studio - Fortran Power Station platform, generates pseudo-random values of a specified variable. A normal distribution around the deterministic value with a 5% standard deviation was simulated for the Young's modulus material property, in order to verify the statistical calculation of the creep behavior. The tube deflection and effective stresses were the properties subject to probabilistic evaluation. All the values of these properties obtained for all the values for
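
    The CANTUP code itself is not reproduced here; the sketch below only illustrates the sampling step the abstract describes - a normal distribution around the deterministic Young's modulus with a 5% standard deviation - propagated through a placeholder deflection model (the nominal modulus, sample size and deflection function are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(42)

E_NOMINAL = 90.0e9    # Pa, illustrative nominal Young's modulus
N_SAMPLES = 10_000


def tube_deflection(young_modulus):
    """Placeholder for the deterministic creep/deflection calculation
    performed by CANTUP; here the deflection simply scales as 1/E."""
    return 1.0e-3 * (E_NOMINAL / young_modulus)   # metres, illustrative


# Normal distribution around the deterministic value, 5% standard deviation.
E_samples = rng.normal(E_NOMINAL, 0.05 * E_NOMINAL, N_SAMPLES)
deflections = tube_deflection(E_samples)

print(f"mean deflection : {deflections.mean():.4e} m")
print(f"95th percentile : {np.percentile(deflections, 95):.4e} m")
```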

  17. Probabilistic reasoning with graphical security models

    NARCIS (Netherlands)

    Kordy, Barbara; Pouly, Marc; Schweitzer, Patrick

    This work provides a computational framework for meaningful probabilistic evaluation of attack–defense scenarios involving dependent actions. We combine the graphical security modeling technique of attack–defense trees with probabilistic information expressed in terms of Bayesian networks. In order

  18. On the Probabilistic Characterization of Robustness and Resilience

    DEFF Research Database (Denmark)

    Faber, Michael Havbro; Qin, J.; Miraglia, Simona

    2017-01-01

    Over the last decade significant research efforts have been devoted to the probabilistic modeling and analysis of system characteristics. Especially performance characteristics of systems subjected to random disturbances, such as robustness and resilience have been in the focus of these efforts...... in the modeling of robustness and resilience in the research areas of natural disaster risk management, socio-ecological systems and social systems and we propose a generic decision analysis framework for the modeling and analysis of systems across application areas. The proposed framework extends the concept...... of direct and indirect consequences and associated risks in probabilistic systems modeling formulated by the Joint Committee on Structural Safety (JCSS) to facilitate the modeling and analysis of resilience in addition to robustness and vulnerability. Moreover, based on recent insights in the modeling...

  19. Probabilistic fuel rod analyses using the TRANSURANUS code

    Energy Technology Data Exchange (ETDEWEB)

    Lassmann, K.; O'Carroll, C.; Van De Laar, J. [CEC Joint Research Centre, Karlsruhe (Germany)]

    1997-08-01

    After more than 25 years of fuel rod modelling research, the basic concepts are well established and the limitations of the specific approaches are known. However, the widely used mechanistic approach leads in many cases to discrepancies between theoretical predictions and experimental evidence indicating that models are not exact and that some of the physical processes encountered are of stochastic nature. To better understand uncertainties and their consequences, the mechanistic approach must therefore be augmented by statistical analyses. In the present paper the basic probabilistic methods are briefly discussed. Two such probabilistic approaches are included in the fuel rod performance code TRANSURANUS: the Monte Carlo method and the Numerical Noise Analysis. These two techniques are compared and their capabilities are demonstrated. (author). 12 refs, 4 figs, 2 tabs.

  20. Effects of goal-setting skills on students' academic performance in English language in Enugu, Nigeria

    Directory of Open Access Journals (Sweden)

    Abe Iyabo Idowu

    2014-07-01

    Full Text Available The study investigated the effectiveness of goal-setting skills on Senior Secondary II students' academic performance in English language in Enugu Metropolis, Enugu State, Nigeria. A quasi-experimental pre-test, post-test control group design was adopted for the study. The initial sample was 147 participants (male and female) Senior Secondary School II students drawn from two public schools in the Enugu zone of Enugu Metropolis. The final sample for the intervention consisted of 80 participants. This sample satisfied the condition for selection from the baseline data. Two research hypotheses were formulated and tested at the 0.05 level of significance. Data generated were analyzed using the mean, standard deviation and the t-test statistical method. The findings showed that performance in English language was enhanced among participants exposed to the goal-setting intervention compared to those in the control group. The study also showed that there is a significant gender difference in students' performance, with female participants recording a higher mean score than males. Parental level of education was also found to be related to performance in English language. Based on the findings, goal-setting intervention was recommended as a strategy for enhancing students' academic performance, particularly in English language.

  1. Consideration of aging in probabilistic safety assessment

    International Nuclear Information System (INIS)

    Titina, B.; Cepin, M.

    2007-01-01

    Probabilistic safety assessment is a standardised tool for the assessment of the safety of nuclear power plants. It is a complement to the safety analyses. Standard probabilistic models of safety equipment assume the component failure rate to be constant. Ageing of systems, structures and components can theoretically be included in a new, age-dependent probabilistic safety assessment, in which the failure rate generally becomes a function of age. New age-dependent probabilistic safety assessment models, which offer explicit calculation of ageing effects, are developed. Several groups of components are considered, each requiring its own model: e.g. operating components and stand-by components. The developed component-level models are inserted into the probabilistic safety assessment models so that the ageing effects are evaluated for complete systems. The preliminary results show that the lack of the data necessary for the consideration of ageing leads to highly uncertain models and, consequently, results. (author)
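
    The models developed in the paper are not reproduced here; the sketch below only contrasts the standard constant-failure-rate assumption with a simple linear-aging rate for a single component, with all parameter values chosen purely for illustration:

```python
import numpy as np

LAMBDA_0 = 1.0e-6   # per hour, constant failure rate (illustrative)
AGING = 2.0e-12     # per hour^2, linear aging coefficient (illustrative)


def failure_prob(t_hours, aging=0.0):
    """P(failure by time t) for lambda(t) = LAMBDA_0 + aging * t,
    i.e. 1 - exp(-integral of lambda from 0 to t)."""
    integral = LAMBDA_0 * t_hours + 0.5 * aging * t_hours**2
    return 1.0 - np.exp(-integral)


for years in (10, 20, 30, 40):
    t = years * 8760.0
    print(f"{years} y  constant: {failure_prob(t):.3e}  "
          f"aging: {failure_prob(t, AGING):.3e}")
```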

  2. Implications of probabilistic risk assessment

    International Nuclear Information System (INIS)

    Cullingford, M.C.; Shah, S.M.; Gittus, J.H.

    1987-01-01

    Probabilistic risk assessment (PRA) is an analytical process that quantifies the likelihoods, consequences and associated uncertainties of the potential outcomes of postulated events. Starting with planned or normal operation, probabilistic risk assessment covers a wide range of potential accidents and considers the whole plant and the interactions of systems and human actions. Probabilistic risk assessment can be applied in safety decisions in design, licensing and operation of industrial facilities, particularly nuclear power plants. The proceedings include a review of PRA procedures, methods and technical issues in treating uncertainties, operating and licensing issues and future trends. Risk assessment for specific reactor types or components and specific risks (eg aircraft crashing onto a reactor) are used to illustrate the points raised. All 52 articles are indexed separately. (U.K.)

  3. Unsteady Probabilistic Analysis of a Gas Turbine System

    Science.gov (United States)

    Brown, Marilyn

    2003-01-01

    In this work, we have considered an annular cascade configuration subjected to unsteady inflow conditions. The unsteady response calculation has been implemented into the time marching CFD code, MSUTURBO. The computed steady state results for the pressure distribution demonstrated good agreement with experimental data. We have computed results for the amplitudes of the unsteady pressure over the blade surfaces. With the increase in gas turbine engine structural complexity and performance over the past 50 years, structural engineers have created an array of safety nets to ensure against component failures in turbine engines. In order to reduce what is now considered to be excessive conservatism and yet maintain the same adequate margins of safety, there is a pressing need to explore methods of incorporating probabilistic design procedures into engine development. Probabilistic methods combine and prioritize the statistical distributions of each design variable, generate an interactive distribution and offer the designer a quantified relationship between robustness, endurance and performance. The designer can therefore iterate between weight reduction, life increase, engine size reduction, speed increase etc.

  4. Characterizing the topology of probabilistic biological networks.

    Science.gov (United States)

    Todor, Andrei; Dobra, Alin; Kahveci, Tamer

    2013-01-01

    Biological interactions are often uncertain events that may or may not take place with some probability. This uncertainty leads to a massive number of alternative interaction topologies for each such network. Existing studies analyze the degree distribution of biological networks by assuming that all the given interactions take place under all circumstances. This strong and often incorrect assumption can lead to misleading results. In this paper, we address this problem and develop a sound mathematical basis to characterize networks in the presence of uncertain interactions. Using our mathematical representation, we develop a method that can accurately describe the degree distribution of such networks. We also take one more step and extend our method to accurately compute the joint-degree distributions of node pairs connected by edges. The number of possible network topologies grows exponentially with the number of uncertain interactions. However, the mathematical model we develop allows us to compute these degree distributions in polynomial time in the number of interactions. Our method works quickly even for entire protein-protein interaction (PPI) networks. It also helps us find an adequate mathematical model using MLE. We perform a comparative study of node-degree and joint-degree distributions in two types of biological networks: the classical deterministic networks and the more flexible probabilistic networks. Our results confirm that power-law and log-normal models best describe degree distributions for both probabilistic and deterministic networks. Moreover, the inverse correlation of degrees of neighboring nodes shows that, in probabilistic networks, nodes with a large number of interactions prefer to interact with those with a small number of interactions more frequently than expected. We also show that probabilistic networks are more robust for node-degree distribution computation than the deterministic ones. all the data sets used, the software
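
    The authors' method is not reproduced here, but the key point - that degree distributions of networks with uncertain edges can be computed exactly in polynomial time - can be illustrated with the Poisson-binomial recursion for a single node whose incident edges exist independently with given probabilities (the probabilities below are illustrative):

```python
import numpy as np


def degree_distribution(edge_probs):
    """Exact distribution of a node's degree when each incident edge exists
    independently with the given probability (Poisson-binomial), computed by
    dynamic programming in O(n^2) rather than by enumerating 2^n topologies."""
    dist = np.array([1.0])             # no edges processed: degree 0 w.p. 1
    for p in edge_probs:
        new = np.zeros(len(dist) + 1)
        new[:-1] += dist * (1.0 - p)   # edge absent: degree unchanged
        new[1:] += dist * p            # edge present: degree increases by 1
        dist = new
    return dist


# A protein with four uncertain interactions (illustrative probabilities).
print(degree_distribution([0.9, 0.5, 0.5, 0.1]))
```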

  5. Branching bisimulation congruence for probabilistic systems

    NARCIS (Netherlands)

    Trcka, N.; Georgievska, S.; Aldini, A.; Baier, C.

    2008-01-01

    The notion of branching bisimulation for the alternating model of probabilistic systems is not a congruence with respect to parallel composition. In this paper we first define another branching bisimulation in the more general model allowing consecutive probabilistic transitions, and we prove that

  6. Online probabilistic learning with an ensemble of forecasts

    Science.gov (United States)

    Thorey, Jean; Mallet, Vivien; Chaussin, Christophe

    2016-04-01

    Our objective is to produce a calibrated weighted ensemble to forecast a univariate time series. In addition to a meteorological ensemble of forecasts, we rely on observations or analyses of the target variable. The celebrated Continuous Ranked Probability Score (CRPS) is used to evaluate the probabilistic forecasts. However, applying the CRPS to weighted empirical distribution functions (deriving from the weighted ensemble) may introduce a bias because of which minimizing the CRPS does not produce the optimal weights. Thus we propose an unbiased version of the CRPS which relies on clusters of members and is strictly proper. We adapt online learning methods for the minimization of the CRPS. These methods generate the weights associated with the members in the forecast empirical distribution function. The weights are updated before each forecast step using only past observations and forecasts. Our learning algorithms provide the theoretical guarantee that, in the long run, the CRPS of the weighted forecasts is at least as good as the CRPS of any weighted ensemble with weights constant in time. In particular, the performance of our forecast is better than that of any subset ensemble with uniform weights. A noteworthy advantage of our algorithm is that it does not require any assumption on the distributions of the observations and forecasts, both for the application and for the theoretical guarantee to hold. As an application example on meteorological forecasts for photovoltaic production integration, we show that our algorithm generates a calibrated probabilistic forecast, with significant performance improvements on probabilistic diagnostic tools (the CRPS, the reliability diagram and the rank histogram).
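
    The unbiased CRPS estimator and the learning algorithms of the paper are not reproduced here; the sketch below shows the standard (biased) CRPS of a weighted ensemble and one exponentiated-gradient style weight update, purely to illustrate the kind of online weighting being discussed (the step size and data are illustrative):

```python
import numpy as np


def crps_weighted(members, weights, obs):
    """CRPS of a weighted ensemble empirical distribution in the standard
    kernel form: sum_i w_i|x_i - y| - 0.5 sum_ij w_i w_j |x_i - x_j|.
    (The paper proposes an unbiased variant; this is the plain estimator.)"""
    w = np.asarray(weights, dtype=float)
    x = np.asarray(members, dtype=float)
    term1 = np.sum(w * np.abs(x - obs))
    term2 = 0.5 * np.sum(w[:, None] * w[None, :] * np.abs(x[:, None] - x[None, :]))
    return term1 - term2


def update_weights(weights, members, obs, eta=0.1):
    """One exponentiated-gradient step on the weighted-ensemble CRPS
    (an illustrative online learner, not the algorithm of the paper)."""
    w = np.asarray(weights, dtype=float)
    x = np.asarray(members, dtype=float)
    grad = np.abs(x - obs) - np.abs(x[:, None] - x[None, :]) @ w
    new = w * np.exp(-eta * grad)
    return new / new.sum()


members, obs = np.array([2.0, 3.5, 5.0]), 3.0
w = np.full(3, 1.0 / 3.0)
print(crps_weighted(members, w, obs))
print(update_weights(w, members, obs))
```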

  7. Target Coverage in Wireless Sensor Networks with Probabilistic Sensors

    Science.gov (United States)

    Shan, Anxing; Xu, Xianghua; Cheng, Zongmao

    2016-01-01

    Sensing coverage is a fundamental problem in wireless sensor networks (WSNs), which has attracted considerable attention. Conventional research on this topic focuses on the 0/1 coverage model, which is only a coarse approximation to the practical sensing model. In this paper, we study the target coverage problem, where the objective is to find the least number of sensor nodes in randomly-deployed WSNs based on the probabilistic sensing model. We analyze the joint detection probability of target with multiple sensors. Based on the theoretical analysis of the detection probability, we formulate the minimum ϵ-detection coverage problem. We prove that the minimum ϵ-detection coverage problem is NP-hard and present an approximation algorithm called the Probabilistic Sensor Coverage Algorithm (PSCA) with provable approximation ratios. To evaluate our design, we analyze the performance of PSCA theoretically and also perform extensive simulations to demonstrate the effectiveness of our proposed algorithm. PMID:27618902
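
    The PSCA algorithm itself is not reproduced here; the sketch below only illustrates the joint detection probability under independent probabilistic sensing and a naive greedy selection of sensors until a required miss probability is reached (the sensor probabilities and threshold are illustrative):

```python
import numpy as np


def joint_detection(probs):
    """Probability that at least one sensor detects the target,
    assuming independent detections: 1 - prod(1 - p_i)."""
    return 1.0 - np.prod(1.0 - np.asarray(probs))


def greedy_cover(probs, epsilon):
    """Greedily pick sensors (highest detection probability first) until the
    miss probability drops to epsilon or below. Illustrative only, not PSCA."""
    chosen, miss = [], 1.0
    for idx in np.argsort(probs)[::-1]:
        chosen.append(int(idx))
        miss *= 1.0 - probs[idx]
        if miss <= epsilon:
            break
    return chosen, 1.0 - miss


probs = np.array([0.35, 0.6, 0.2, 0.5])
print(joint_detection(probs))
print(greedy_cover(probs, epsilon=0.15))
```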

  8. Probabilistic-Multiobjective Comparison of User-Defined Operating Rules. Case Study: Hydropower Dam in Spain

    Directory of Open Access Journals (Sweden)

    Paola Bianucci

    2015-03-01

    Full Text Available A useful tool is proposed in this paper to assist dam managers in comparing and selecting suitable operating rules. This procedure is based on well-known multiobjective and probabilistic methodologies, which were jointly applied here to assess and compare flood control strategies in hydropower reservoirs. The procedure consisted of evaluating the operating rules’ performance using a simulation fed by a representative and sufficiently large flood event series. These flood events were obtained from a synthetic rainfall series stochastically generated by using the RainSimV3 model coupled with a deterministic hydrological model. The performance of the assessed strategies was characterized using probabilistic variables. Finally, evaluation and comparison were conducted by analyzing objective functions which synthesize different aspects of the rules’ performance. These objectives were probabilistically defined in terms of risk and expected values. To assess the applicability and flexibility of the tool, it was implemented in a hydropower dam located in Galicia (Northern Spain). This procedure allowed alternative operating rules to be derived which provided a reasonable trade-off between dam safety, flood control, operability and energy production.

  9. A performance goal-based seismic design philosophy for waste repository facilities

    International Nuclear Information System (INIS)

    Hossain, Q.A.

    1994-01-01

    A performance goal-based seismic design philosophy, compatible with DOE's present natural phenomena hazards mitigation and "graded approach" philosophy, has been proposed for high level nuclear waste repository facilities. The rationale, evolution, and the desirable features of this method have been described. Why and how the method should and can be applied to the design of a repository facility are also discussed.

  10. MULTICRITERIA ANALYSIS OF FOOTBALL MATCH PERFORMANCES: COMPOSITION OF PROBABILISTIC PREFERENCES APPLIED TO THE ENGLISH PREMIER LEAGUE 2015/2016

    Directory of Open Access Journals (Sweden)

    Vitor Principe

    Full Text Available This article aims to analyze the technical performance of football teams in the FA Premier League during the 2015/2016 season. Data for twenty clubs over 38 matches for each club are considered using 23 variables. These variables have been explored in the football literature and address different features of technical performance. The different configuration of the data for teams in detached segments motivated the multi-criteria approach, which enables identification of strong and weak sectors in each segment. The uncertainty as to the outcome of football matches and the imprecision of the measures indicated the use of Composition of Probabilistic Preferences (CPP) to model the problem. “R” software was used in the modeling and computation. The CPP global scores obtained were more consistent with the final classification than those of other methods. CPP scores revealed different performances of particular groups of variables, indicating aspects to be improved and explored.

  11. Probabilistic approach to mechanisms

    CERN Document Server

    Sandler, BZ

    1984-01-01

    This book discusses the application of probabilistics to the investigation of mechanical systems. The book shows, for example, how random function theory can be applied directly to the investigation of random processes in the deflection of cam profiles, pitch or gear teeth, pressure in pipes, etc. The author also deals with some other technical applications of probabilistic theory, including, amongst others, those relating to pneumatic and hydraulic mechanisms and roller bearings. Many of the aspects are illustrated by examples of applications of the techniques under discussion.

  12. Advanced methods for a probabilistic safety analysis of fires. Development of advanced methods for performing as far as possible realistic plant specific fire risk analysis (fire PSA)

    International Nuclear Information System (INIS)

    Hofer, E.; Roewekamp, M.; Tuerschmann, M.

    2003-07-01

    In the framework of the research project RS 1112 'Development of Methods for a Recent Probabilistic Safety Analysis, Particularly Level 2', funded by the German Federal Ministry of Economics and Technology (BMWi), advanced methods were to be developed, in particular for performing plant-specific fire risk analyses (fire PSA) that are as realistic as possible. The present Technical Report gives an overview of the methodologies developed in this context for assessing the fire hazard. In the context of developing advanced methodologies for fire PSA, a probabilistic dynamics analysis with a fire simulation code, including an uncertainty and sensitivity study, has been performed for an exemplary scenario of a cable fire induced by an electric cabinet inside the containment of a modern Konvoi type German nuclear power plant, taking into consideration the effects of fire detection and fire extinguishing means. With the present study it was possible for the first time to determine the probabilities of specified fire effects from a class of fire events by means of probabilistic dynamics supplemented by uncertainty and sensitivity analyses. The analysis applies a deterministic dynamics model, consisting of a dynamic fire simulation code and a model of countermeasures, considering stochastic effects (so-called aleatory uncertainties) as well as uncertainties in the state of knowledge (so-called epistemic uncertainties). By this means, probability assessments including uncertainties are provided to be used within the PSA. (orig.)

  13. Probabilistic machine learning and artificial intelligence.

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-28

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  14. Probabilistic machine learning and artificial intelligence

    Science.gov (United States)

    Ghahramani, Zoubin

    2015-05-01

    How can a machine learn from experience? Probabilistic modelling provides a framework for understanding what learning is, and has therefore emerged as one of the principal theoretical and practical approaches for designing machines that learn from data acquired through experience. The probabilistic framework, which describes how to represent and manipulate uncertainty about models and predictions, has a central role in scientific data analysis, machine learning, robotics, cognitive science and artificial intelligence. This Review provides an introduction to this framework, and discusses some of the state-of-the-art advances in the field, namely, probabilistic programming, Bayesian optimization, data compression and automatic model discovery.

  15. OCA-P, a deterministic and probabilistic fracture-mechanics code for application to pressure vessels

    International Nuclear Information System (INIS)

    Cheverton, R.D.; Ball, D.G.

    1984-05-01

    The OCA-P code is a probabilistic fracture-mechanics code that was prepared specifically for evaluating the integrity of pressurized-water reactor vessels when subjected to overcooling-accident loading conditions. The code has two-dimensional and some three-dimensional flaw capability; it is based on linear-elastic fracture mechanics; and it can treat cladding as a discrete region. Both deterministic and probabilistic analyses can be performed. For the former analysis, it is possible to conduct a search for critical values of the fluence and the nil-ductility reference temperature corresponding to incipient initiation of the initial flaw. The probabilistic portion of OCA-P is based on Monte Carlo techniques, and simulated parameters include fluence, flaw depth, fracture toughness, nil-ductility reference temperature, and concentrations of copper, nickel, and phosphorus. Plotting capabilities include the construction of critical-crack-depth diagrams (deterministic analysis) and various histograms (probabilistic analysis).
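
    OCA-P itself is not reproduced here; the sketch below is only a schematic of the Monte Carlo loop the abstract describes, sampling fluence, flaw depth and nil-ductility reference temperature and counting simulated vessels whose applied stress intensity exceeds the toughness. Every distribution, embrittlement correlation and load model below is a placeholder, not the one used in OCA-P:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000

# Sampled parameters (placeholder distributions).
fluence = rng.normal(3.0e19, 0.5e19, N)       # n/cm^2
flaw_depth = rng.exponential(5.0, N)          # mm
rt_ndt0 = rng.normal(-20.0, 10.0, N)          # degC, unirradiated RT_NDT

# Placeholder embrittlement shift, toughness and applied-load models.
rt_ndt = rt_ndt0 + 15.0 * np.sqrt(fluence / 1.0e19)      # degC after irradiation
T_WALL = 50.0                                            # degC during the transient
k_ic = 36.5 + 3.1 * np.exp(0.036 * (T_WALL - rt_ndt))    # MPa*sqrt(m), toughness
k_applied = 25.0 + 10.0 * np.sqrt(flaw_depth)            # MPa*sqrt(m), applied K_I

p_init = np.mean(k_applied > k_ic)
print(f"fraction of simulated vessels with crack initiation: {p_init:.3e}")
```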

  16. Probabilistic linguistics

    NARCIS (Netherlands)

    Bod, R.; Heine, B.; Narrog, H.

    2010-01-01

    Probabilistic linguistics takes all linguistic evidence as positive evidence and lets statistics decide. It allows for accurate modelling of gradient phenomena in production and perception, and suggests that rule-like behaviour is no more than a side effect of maximizing probability. This chapter

  17. Achievement goals and perfectionism of high school students

    Directory of Open Access Journals (Sweden)

    Milojević Milica

    2009-01-01

    Full Text Available This research investigated one of the most contemporary approaches to achievement motivation - Achievement Goal Theory, which uses the construct of achievement goals. The construct of achievement goals involves three types of achievement goals: mastery goals, performance approach goals and performance avoidance goals. The main goal of the research was to examine the correlation of perfectionism and its aspects with particular types of achievement goals. Also, the goal was to investigate gender differences regarding the achievement goals. The sample consisted of 200 senior-year high school participants. The following instruments were used: the Multi-dimensional scale of perfectionism (MSP) and the Test of achievement goals (TCP). The research results indicate significant positive correlations of: perfectionism with performance approach goals and performance avoidance goals; concern over mistakes and parental expectations with performance approach goals and performance avoidance goals; personal standards and organization with mastery goals and performance approach goals; and parental criticism and doubts about action with performance avoidance goals. A significant negative correlation was found between parental criticism and mastery goals. The results concerning the second goal indicate that the female subjects have higher average scores on mastery goals.

  18. Probabilistic soft sets and dual probabilistic soft sets in decision making with positive and negative parameters

    Science.gov (United States)

    Fatimah, F.; Rosadi, D.; Hakim, R. B. F.

    2018-03-01

    In this paper, we motivate and introduce probabilistic soft sets and dual probabilistic soft sets for handling decision making problem in the presence of positive and negative parameters. We propose several types of algorithms related to this problem. Our procedures are flexible and adaptable. An example on real data is also given.

  19. Bayesian hypothesis testing and the maintenance rule

    International Nuclear Information System (INIS)

    Kelly, D.L.

    1997-01-01

    The Maintenance Rule (10 CFR 50.65) went into effect in the United States in July 1996. It requires commercial nuclear utilities to monitor system performance (system reliability and maintenance unavailability) for systems that are determined by the utility to be important to plant safety. Utilities must set performance goals for such systems and monitor system performance against these goals. In addition, these performance goals are intended to be commensurate with the safety significance of the system, which can be established by a probabilistic safety assessment of the plant. The author examines the frequentist approach to monitoring performance, which is being used by several utilities, and proposes an alternative Bayesian approach. The Bayesian approach makes more complete use of the information in the probabilistic safety assessment, is consistent philosophically with the subjective interpretation given to probability in most probabilistic safety assessments, overcomes several pitfalls in the frequentist approach, provides results which are easily interpretable, and is straightforward to implement using the information in the probabilistic safety assessment.
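
    The paper's procedure is not reproduced here; a minimal Beta-binomial sketch of the idea is to start from a prior on the demand failure probability (e.g. informed by the probabilistic safety assessment), update it with the monitored failures and demands, and report the posterior probability that the performance goal is exceeded. All numbers below are illustrative:

```python
from scipy.stats import beta

# Prior on the demand failure probability, e.g. informed by the PSA
# (illustrative hyperparameters; prior mean is about 0.005).
a0, b0 = 0.5, 99.5

# Monitoring data collected under the Maintenance Rule (illustrative).
failures, demands = 1, 60

a_post, b_post = a0 + failures, b0 + demands - failures

goal = 0.01                                 # performance goal (illustrative)
p_exceed = beta.sf(goal, a_post, b_post)    # posterior P(failure prob > goal)
print(f"posterior probability the goal is exceeded: {p_exceed:.2f}")
```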

  20. Error Discounting in Probabilistic Category Learning

    Science.gov (United States)

    Craig, Stewart; Lewandowsky, Stephan; Little, Daniel R.

    2011-01-01

    The assumption in some current theories of probabilistic categorization is that people gradually attenuate their learning in response to unavoidable error. However, existing evidence for this error discounting is sparse and open to alternative interpretations. We report 2 probabilistic-categorization experiments in which we investigated error…

  1. Probabilistic programming in Python using PyMC3

    Directory of Open Access Journals (Sweden)

    John Salvatier

    2016-04-01

    Full Text Available Probabilistic programming allows for automatic Bayesian inference on user-defined probabilistic models. Recent advances in Markov chain Monte Carlo (MCMC) sampling allow inference on increasingly complex models. This class of MCMC, known as Hamiltonian Monte Carlo, requires gradient information which is often not readily available. PyMC3 is a new open source probabilistic programming framework written in Python that uses Theano to compute gradients via automatic differentiation as well as compile probabilistic programs on-the-fly to C for increased speed. Contrary to other probabilistic programming languages, PyMC3 allows model specification directly in Python code. The lack of a domain specific language allows for great flexibility and direct interaction with the model. This paper is a tutorial-style introduction to this software package.
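
    A minimal model specification in the style the tutorial describes (the data, priors and sampler settings below are illustrative, and the `sd` keyword follows the PyMC3 3.x interface):

```python
import numpy as np
import pymc3 as pm

# Illustrative data: noisy observations of an unknown mean.
data = np.random.normal(loc=1.0, scale=2.0, size=200)

with pm.Model() as model:
    mu = pm.Normal("mu", mu=0.0, sd=10.0)        # prior on the mean
    sigma = pm.HalfNormal("sigma", sd=5.0)       # prior on the noise scale
    obs = pm.Normal("obs", mu=mu, sd=sigma, observed=data)
    trace = pm.sample(1000, tune=1000)           # NUTS (Hamiltonian Monte Carlo)

print(trace["mu"].mean(), trace["sigma"].mean())
```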

  2. Comparing Categorical and Probabilistic Fingerprint Evidence.

    Science.gov (United States)

    Garrett, Brandon; Mitchell, Gregory; Scurich, Nicholas

    2018-04-23

    Fingerprint examiners traditionally express conclusions in categorical terms, opining that impressions do or do not originate from the same source. Recently, probabilistic conclusions have been proposed, with examiners estimating the probability of a match between recovered and known prints. This study presented a nationally representative sample of jury-eligible adults with a hypothetical robbery case in which an examiner opined on the likelihood that a defendant's fingerprints matched latent fingerprints in categorical or probabilistic terms. We studied model language developed by the U.S. Defense Forensic Science Center to summarize results of statistical analysis of the similarity between prints. Participant ratings of the likelihood the defendant left prints at the crime scene and committed the crime were similar when exposed to categorical and strong probabilistic match evidence. Participants reduced these likelihoods when exposed to the weaker probabilistic evidence, but did not otherwise discriminate among the prints assigned different match probabilities. © 2018 American Academy of Forensic Sciences.

  3. Goal Setting in Principal Evaluation: Goal Quality and Predictors of Achievement

    Science.gov (United States)

    Sinnema, Claire E. L.; Robinson, Viviane M. J.

    2012-01-01

    This article draws on goal-setting theory to investigate the goals set by experienced principals during their performance evaluations. While most goals were about teaching and learning, they tended to be vaguely expressed and only partially achieved. Five predictors (commitment, challenge, learning, effort, and support) explained a significant…

  4. Extending hierarchical achievement motivation models: the role of motivational needs for achievement goals and academic performance

    NARCIS (Netherlands)

    Bipp, T.; Dam, van K.

    2014-01-01

    In the current study, we investigated the role of three basic motivational needs (need for power, affiliation, achievement) as antecedents of goals within the 2 × 2 achievement goal framework, and examined their combined predictive validity with regard to academic performance in a sample of 120

  5. A probabilistic approach to safety/reliability of space nuclear power systems

    International Nuclear Information System (INIS)

    Medford, G.; Williams, K.; Kolaczkowski, A.

    1989-01-01

    An ongoing effort is investigating the feasibility of using probabilistic risk assessment (PRA) modeling techniques to construct a living model of a space nuclear power system. This is being done in conjunction with a traditional reliability and survivability analysis of the SP-100 space nuclear power system. The initial phase of the project consists of three major parts with the overall goal of developing a top-level system model and defining initiating events of interest for the SP-100 system. The three major tasks were performing a traditional survivability analysis, performing a simple system reliability analysis, and constructing a top-level system fault-tree model. Each of these tasks and their interim results are discussed in this paper. Initial results from the study support the conclusion that PRA modeling techniques can provide a valuable design and decision-making tool for space reactors. The ability of the model to rank and calculate relative contributions from various failure modes allows design optimization for maximum safety and reliability. Future efforts in the SP-100 program will see data development and quantification of the model to allow parametric evaluations of the SP-100 system. Current efforts have shown the need for formal data development and test programs within such a modeling framework

  6. Global/local methods for probabilistic structural analysis

    Science.gov (United States)

    Millwater, H. R.; Wu, Y.-T.

    1993-04-01

    A probabilistic global/local method is proposed to reduce the computational requirements of probabilistic structural analysis. A coarser global model is used for most of the computations, with a local, more refined model used only at key probabilistic conditions. The global model is used to establish the cumulative distribution function (cdf) and the Most Probable Point (MPP). The local model then uses the predicted MPP to adjust the cdf value. The global/local method is used within the advanced mean value probabilistic algorithm. The local model can be more refined with respect to the global model in terms of finer mesh, smaller time step, tighter tolerances, etc. and can be used with linear or nonlinear models. The basis for this approach is described in terms of the correlation between the global and local models, which can be estimated from the global and local MPPs. A numerical example is presented using the NESSUS probabilistic structural analysis program with the finite element method used for the structural modeling. The results clearly indicate significant computer savings with minimal loss in accuracy.

  7. Data warehouse model for monitoring key performance indicators (KPIs) using goal oriented approach

    Science.gov (United States)

    Abdullah, Mohammed Thajeel; Ta'a, Azman; Bakar, Muhamad Shahbani Abu

    2016-08-01

    The growth and development of universities, just as of other organizations, depend on their ability to strategically plan and implement development blueprints which are in line with their vision and mission statements. The actualization of these statements, which are often designed into goals and sub-goals and linked to their respective actors, is better measured by defining key performance indicators (KPIs) of the university. This paper proposes ReGADaK, an extension of the GRAnD approach, which highlights the facts, dimensions, attributes, measures and KPIs of the organization. The measures from the goal analysis of this unit serve as the basis for developing the related university's KPIs. The proposed data warehouse schema is evaluated through expert review, prototyping and usability evaluation. The findings from the evaluation processes suggest that the proposed data warehouse schema is suitable for monitoring the university's KPIs.

  8. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    International Nuclear Information System (INIS)

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of the comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-11 perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel

  9. The application of probabilistic fracture analysis to residual life evaluation of embrittled reactor vessels

    International Nuclear Information System (INIS)

    Dickson, T.L.; Simonen, F.A.

    1992-01-01

    Probabilistic fracture mechanics analysis is a major element of comprehensive probabilistic methodology on which current NRC regulatory requirements for pressurized water reactor vessel integrity evaluation are based. Computer codes such as OCA-P and VISA-II perform probabilistic fracture analyses to estimate the increase in vessel failure probability that occurs as the vessel material accumulates radiation damage over the operating life of the vessel. The results of such analyses, when compared with limits of acceptable failure probabilities, provide an estimation of the residual life of a vessel. Such codes can be applied to evaluate the potential benefits of plant-specific mitigating actions designed to reduce the probability of failure of a reactor vessel. 10 refs

  10. Design of Probabilistic Random Forests with Applications to Anticancer Drug Sensitivity Prediction.

    Science.gov (United States)

    Rahman, Raziur; Haider, Saad; Ghosh, Souparno; Pal, Ranadip

    2015-01-01

    Random forests consisting of an ensemble of regression trees with equal weights are frequently used for design of predictive models. In this article, we consider an extension of the methodology by representing the regression trees in the form of probabilistic trees and analyzing the nature of heteroscedasticity. The probabilistic tree representation allows for analytical computation of confidence intervals (CIs), and the tree weight optimization is expected to provide stricter CIs with comparable performance in mean error. We approached the ensemble of probabilistic trees' prediction from the perspectives of a mixture distribution and as a weighted sum of correlated random variables. We applied our methodology to the drug sensitivity prediction problem on synthetic and cancer cell line encyclopedia dataset and illustrated that tree weights can be selected to reduce the average length of the CI without increase in mean error.
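
    The authors' implementation is not reproduced here; the sketch below only illustrates the mixture-distribution view mentioned in the abstract, combining per-tree predictive means and variances with tree weights into an ensemble mean, variance and a rough normal-approximation confidence interval (all numbers are illustrative):

```python
import numpy as np


def ensemble_mean_var(means, variances, weights):
    """Mean and variance of a weighted mixture of per-tree predictive
    distributions: E[X^2] = sum_i w_i (v_i + m_i^2), Var = E[X^2] - mean^2."""
    w = np.asarray(weights, dtype=float)
    m = np.asarray(means, dtype=float)
    v = np.asarray(variances, dtype=float)
    mix_mean = np.sum(w * m)
    mix_var = np.sum(w * (v + m**2)) - mix_mean**2
    return mix_mean, mix_var


# Three probabilistic trees predicting drug sensitivity for one cell line
# (illustrative tree outputs and weights).
mean, var = ensemble_mean_var(means=[0.42, 0.55, 0.48],
                              variances=[0.010, 0.020, 0.015],
                              weights=[0.5, 0.2, 0.3])
ci = (mean - 1.96 * np.sqrt(var), mean + 1.96 * np.sqrt(var))  # rough normal CI
print(mean, var, ci)
```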

  11. Bell-Boole Inequality: Nonlocality or Probabilistic Incompatibility of Random Variables?

    Directory of Open Access Journals (Sweden)

    Andrei Khrennikov

    2008-03-01

    Full Text Available The main aim of this report is to inform the quantum information community about investigations on the problem of probabilistic compatibility of a family of random variables: the possibility of realizing such a family on the basis of a single probability measure (to construct a single Kolmogorov probability space). These investigations were started more than a hundred years ago by J. Boole (who invented Boolean algebras). The complete solution of the problem was obtained by the Soviet mathematician Vorobjev in the 1960s. Surprisingly, probabilists and statisticians obtained inequalities for probabilities and correlations among which one can find the famous Bell’s inequality and its generalizations. Such inequalities appeared simply as constraints for probabilistic compatibility. In this framework one cannot see a priori any link to such problems as nonlocality and the “death of reality” which are typically linked to Bell-type inequalities in the physical literature. We analyze the difference between the positions of mathematicians and quantum physicists. In particular, we found that one of the most reasonable explanations of probabilistic incompatibility is the mixing in Bell-type inequalities of statistical data from a number of experiments performed under different experimental contexts.

  12. Probabilistic Assessment of the Occurrence and Duration of Ice Accretion on Cables

    DEFF Research Database (Denmark)

    Roldsgaard, Joan Hee; Georgakis, Christos Thomas; Faber, Michael Havbro

    2015-01-01

    This paper presents an operational framework for assessing the probability of occurrence of in-cloud and precipitation icing and its duration. The framework utilizes the features of Bayesian Probabilistic Networks, and its performance is illustrated through a case study of the cable-stayed...... Oresund Bridge. The Bayesian Probabilistic Network model used for the estimation of the occurrence and duration probabilities is studied and it is found to be robust with respect to changes in the choice of distribution types used to model the meteorological variables that influence the two icing...

  13. A framework for the probabilistic analysis of meteotsunamis

    Science.gov (United States)

    Geist, Eric L.; ten Brink, Uri S.; Gove, Matthew D.

    2014-01-01

    A probabilistic technique is developed to assess the hazard from meteotsunamis. Meteotsunamis are unusual sea-level events, generated when the speed of an atmospheric pressure or wind disturbance is comparable to the phase speed of long waves in the ocean. A general aggregation equation is proposed for the probabilistic analysis, based on previous frameworks established for both tsunamis and storm surges, incorporating different sources and source parameters of meteotsunamis. Parameterization of atmospheric disturbances and numerical modeling is performed for the computation of maximum meteotsunami wave amplitudes near the coast. A historical record of pressure disturbances is used to establish a continuous analytic distribution of each parameter as well as the overall Poisson rate of occurrence. A demonstration study is presented for the northeast U.S. in which only isolated atmospheric pressure disturbances from squall lines and derechos are considered. For this study, Automated Surface Observing System stations are used to determine the historical parameters of squall lines from 2000 to 2013. The probabilistic equations are implemented using a Monte Carlo scheme, where a synthetic catalog of squall lines is compiled by sampling the parameter distributions. For each entry in the catalog, ocean wave amplitudes are computed using a numerical hydrodynamic model. Aggregation of the results from the Monte Carlo scheme results in a meteotsunami hazard curve that plots the annualized rate of exceedance with respect to maximum event amplitude for a particular location along the coast. Results from using multiple synthetic catalogs, resampled from the parent parameter distributions, yield mean and quantile hazard curves. Further refinements and improvements for probabilistic analysis of meteotsunamis are discussed.
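
    The study's hydrodynamic model and fitted parameter distributions are not reproduced here; the sketch below only illustrates the aggregation step - sample a synthetic catalog of disturbances, push each through an amplitude model, and convert exceedance counts into an annualized rate - using a placeholder amplitude function and illustrative distributions and occurrence rate:

```python
import numpy as np

rng = np.random.default_rng(1)

RATE_PER_YEAR = 4.0     # Poisson rate of squall-line events (illustrative)
N_EVENTS = 50_000       # synthetic catalog size


def max_amplitude(pressure_hpa, speed_ms):
    """Placeholder for the hydrodynamic model: coastal amplitude grows with the
    pressure jump and peaks when the disturbance speed is near the long-wave
    speed (taken here as ~30 m/s over the shelf, purely for illustration)."""
    resonance = np.exp(-((speed_ms - 30.0) / 8.0) ** 2)
    return 0.05 * pressure_hpa * (1.0 + 4.0 * resonance)   # metres


# Sample disturbance parameters from assumed distributions.
pressure = rng.gamma(shape=2.0, scale=1.5, size=N_EVENTS)   # hPa pressure jumps
speed = rng.normal(25.0, 7.0, size=N_EVENTS)                # m/s disturbance speeds
amps = max_amplitude(pressure, speed)

# Annualized rate of exceedance: occurrence rate * P(amplitude > threshold).
for a in (0.1, 0.3, 0.5, 1.0):
    print(f"{a:.1f} m : {RATE_PER_YEAR * np.mean(amps > a):.3f} / year")
```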

  14. A meta-analysis of self-reported achievement goals and nonself-report performance across three achievement domains (work, sports, and education).

    Directory of Open Access Journals (Sweden)

    Nico W Van Yperen

    Full Text Available During the past three decades, the achievement goal approach to achievement motivation has emerged as an influential area of research, and is dedicated to understanding the reasons behind the individual's drive to achieve competence and performance. However, the current literature on achievement goals is segmented rather than integrated. That is, citations across the three major and distinct achievement domains (work, education, and sports) are more the exception than the rule and similarities and differences between findings for the different achievement domains have yet to be tested. The purpose of the present study was to examine the relationships between self-reported achievement goals and nonself-report performance through meta-analysis, and the moderating potential of achievement domain. Identifying achievement domain as moderator improves our understanding to which contexts we can (not) generalize conclusions to, it helps to understand seemingly inconsistent findings, and opens avenues for future research on the underlying processes. Because the achievement goal (AG) measure used in a study is partially confounded with achievement domain, we examined the moderating role of this variable as well. Our findings suggest that - overall - approach goals (either mastery or performance) were associated positively with performance attainment, whereas avoidance goals (either mastery or performance) were associated negatively with performance attainment. These relationships were moderated by achievement domain. For example, relative to the education or work domain, in the sports domain, we did not observe negative correlations between avoidance goals and performance. The absence of statistical moderation due to AG measure suggests that the observed moderation of achievement domain cannot be explained by the AG measure utilized. We suggest further steps to integrate the achievement goal literature, and accordingly, to broaden and deepen understanding of

  15. A meta-analysis of self-reported achievement goals and nonself-report performance across three achievement domains (work, sports, and education).

    Science.gov (United States)

    Van Yperen, Nico W; Blaga, Monica; Postmes, Tom

    2014-01-01

    During the past three decades, the achievement goal approach to achievement motivation has emerged as an influential area of research, and is dedicated to understanding the reasons behind the individual's drive to achieve competence and performance. However, the current literature on achievement goals is segmented rather than integrated. That is, citations across the three major and distinct achievement domains (work, education, and sports) are more the exception than the rule and similarities and differences between findings for the different achievement domains have yet to be tested. The purpose of the present study was to examine the relationships between self-reported achievement goals and nonself-report performance through meta-analysis, and the moderating potential of achievement domain. Identifying achievement domain as moderator improves our understanding to which contexts we can (not) generalize conclusions to, it helps to understand seemingly inconsistent findings, and opens avenues for future research on the underlying processes. Because the achievement goal (AG) measure used in a study is partially confounded with achievement domain, we examined the moderating role of this variable as well. Our findings suggest that - overall - approach goals (either mastery or performance) were associated positively with performance attainment, whereas avoidance goals (either mastery or performance) were associated negatively with performance attainment. These relationships were moderated by achievement domain. For example, relative to the education or work domain, in the sports domain, we did not observe negative correlations between avoidance goals and performance. The absence of statistical moderation due to AG measure suggests that the observed moderation of achievement domain cannot be explained by the AG measure utilized. We suggest further steps to integrate the achievement goal literature, and accordingly, to broaden and deepen understanding of performance

  16. The Effects Of Leadership Styles On Goal Clarity And Fairness Mediated Used Performance Measure

    OpenAIRE

    Amris Rusli Tanjung; Yesi Mutia Basri

    2017-01-01

    This paper investigates the effects of superiors' performance evaluation behaviors on subordinates' work-related attitudes, as mediated by the performance measures used. We considered two leadership styles (initiating structure and consideration) and two types of performance measure use (objective and subjective measures) in relation to the managerial work-related attitudes of goal clarity and evaluation fairness. We test our hypotheses using survey data from 56 middle-level managers in 4 service organizations. The results from Structural Equation Mode...

  17. Probabilistic Simulation of Multi-Scale Composite Behavior

    Science.gov (United States)

    Chamis, Christos C.

    2012-01-01

    A methodology is developed to computationally assess the non-deterministic composite response at all composite scales (from micro to structural) due to the uncertainties in the constituent (fiber and matrix) properties, in the fabrication process and in structural variables (primitive variables). The methodology is computationally efficient for simulating the probability distributions of composite behavior, such as material properties, laminate and structural responses. By-products of the methodology are probabilistic sensitivities of the composite primitive variables. The methodology has been implemented into the computer codes PICAN (Probabilistic Integrated Composite ANalyzer) and IPACS (Integrated Probabilistic Assessment of Composite Structures). The accuracy and efficiency of this methodology are demonstrated by simulating the uncertainties in typical composite laminates and comparing the results with the Monte Carlo simulation method. Available experimental data of composite laminate behavior at all scales fall within the scatters predicted by PICAN. Multi-scaling is extended to simulate probabilistic thermo-mechanical fatigue and to simulate the probabilistic design of a composite redome in order to illustrate its versatility. Results show that probabilistic fatigue can be simulated for different temperature amplitudes and for different cyclic stress magnitudes. Results also show that laminate configurations can be selected to increase the redome reliability by several orders of magnitude without increasing the laminate thickness--a unique feature of structural composites. The old reference denotes that nothing fundamental has been done since that time.

  18. Integrated Deterministic-Probabilistic Safety Assessment Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Kudinov, P.; Vorobyev, Y.; Sanchez-Perea, M.; Queral, C.; Jimenez Varas, G.; Rebollo, M. J.; Mena, L.; Gomez-Magin, J.

    2014-02-01

    IDPSA (Integrated Deterministic-Probabilistic Safety Assessment) is a family of methods which use tightly coupled probabilistic and deterministic approaches to address respective sources of uncertainties, enabling risk-informed decision making in a consistent manner. The starting point of the IDPSA framework is that safety justification must be based on the coupling of deterministic (consequences) and probabilistic (frequency) considerations to address the mutual interactions between stochastic disturbances (e.g. failures of the equipment, human actions, stochastic physical phenomena) and deterministic response of the plant (i.e. transients). This paper gives a general overview of some IDPSA methods as well as some possible applications to PWR safety analyses. (Author)

  19. Effects of Goal Line Feedback on Level, Slope, and Stability of Performance within Curriculum-Based Measurement.

    Science.gov (United States)

    Fuchs, Lynn S.; And Others

    1991-01-01

    Nineteen special educators implemented Curriculum-Based Measurement with a total of 36 learning-disabled math pupils in grades 2-8 to examine the effects of goal line feedback. Results indicated comparable levels and slopes of student performance across treatment conditions, although goal line feedback was associated with greater performance…

  20. To Confirm or to Conform? Performance Goals as a Regulator of Conflict with More-Competent Others

    Science.gov (United States)

    Sommet, Nicolas; Darnon, Céline; Butera, Fabrizio

    2015-01-01

    Despite the fact that most competence-relevant settings are "socially" relevant settings, the interpersonal effects of achievement goals have been understudied. This is all the more surprising in the case of performance goals, for which self-competence is assessed using an other-referenced standard. In the present research, performance…

  1. Multiple sequential failure model: A probabilistic approach to quantifying human error dependency

    International Nuclear Information System (INIS)

    Samanta

    1985-01-01

    This paper presents a probabilistic approach to quantifying human error dependency when multiple tasks are performed. Dependent human failures are dominant contributors to risks from nuclear power plants. An overview of the Multiple Sequential Failure (MSF) model developed and its use in probabilistic risk assessments (PRAs) depending on the available data are discussed. A small-scale psychological experiment was conducted on the nature of human dependency, and the interpretation of the experimental data by the MSF model shows remarkable accommodation of the dependent failure data. The model, which provides a unique method for quantification of dependent failures in human reliability analysis, can be used in conjunction with any of the general methods currently used for performing the human reliability aspect in PRAs

  2. Application of laboratory data from small-scale simulators to human performance issues in the nuclear industry

    International Nuclear Information System (INIS)

    Spettell, C.M.

    1986-01-01

    Laboratory analogs of nuclear power plant tasks were simulated on personal computers in two experimental studies. Human performance data were collected during each experimental study. The goal of the first experiment was to validate a quantitative model of dependence among human errors during testing, calibration, and maintenance activities. This model, the Multiple Sequential Failure (MSF) model (NUREG/CR-2211) has been used to quantify dependent human error failure probabilities for human reliability analyses in Probabilistic Risk Assessments (PRAs). The goal of the second experiment was to examine the relationship among psychological and behavioral characteristics of individuals and their performance at controlling a simulated nuclear power plant. These studies demonstrated the usefulness of the experimental psychology approach for validating models of human performance at nuclear power plant tasks

  3. Variate generation for probabilistic fracture mechanics and fitness-for-service studies

    International Nuclear Information System (INIS)

    Walker, J.R.

    1987-01-01

    Atomic Energy of Canada Limited is conducting studies in Probabilistic Fracture Mechanics. These studies are being conducted as part of a fitness-for-service programme in support of CANDU reactors. The Monte Carlo analyses, which form part of the Probabilistic Fracture Mechanics studies, require that variates can be sampled from probability density functions. Accurate pseudo-random numbers are necessary for accurate variate generation. This report details the principles of variate generation, and describes the production and testing of pseudo-random numbers. A new algorithm has been produced for the correct performance of the lattice test for the independence of pseudo-random numbers. Two new pseudo-random number generators have been produced. These generators have excellent randomness properties and can be made fully machine-independent. Versions, in FORTRAN, for VAX and CDC computers are given. Accurate and efficient algorithms for the generation of variates from the specialized probability density functions of Probabilistic Fracture Mechanics are given. 38 refs
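
    As a minimal illustration of the two ingredients described above, the sketch below (in Python rather than the report's FORTRAN, and with illustrative generator constants and an exponential target density that are not taken from the report) shows a linear congruential pseudo-random number generator feeding inverse-transform variate generation.

```python
import math

def lcg(seed, a=1664525, c=1013904223, m=2**32):
    """Linear congruential pseudo-random number generator (illustrative constants)."""
    state = seed
    while True:
        state = (a * state + c) % m
        yield state / m  # uniform variate in [0, 1)

def exponential_variate(u, rate=1.0):
    """Inverse-transform sampling: map a uniform variate to an exponential variate."""
    return -math.log(1.0 - u) / rate

rng = lcg(seed=12345)
samples = [exponential_variate(next(rng), rate=0.5) for _ in range(10_000)]
print(sum(samples) / len(samples))  # should be close to 1/rate = 2.0
```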

  4. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing

    Science.gov (United States)

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This

  5. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing.

    Science.gov (United States)

    Denovan, Andrew; Dagnall, Neil; Drinkwater, Kenneth; Parker, Andrew; Clough, Peter

    2017-01-01

    The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual-processing frameworks. This

  6. Perception of Risk and Terrorism-Related Behavior Change: Dual Influences of Probabilistic Reasoning and Reality Testing

    Directory of Open Access Journals (Sweden)

    Andrew Denovan

    2017-10-01

    Full Text Available The present study assessed the degree to which probabilistic reasoning performance and thinking style influenced perception of risk and self-reported levels of terrorism-related behavior change. A sample of 263 respondents, recruited via convenience sampling, completed a series of measures comprising probabilistic reasoning tasks (perception of randomness, base rate, probability, and conjunction fallacy), the Reality Testing subscale of the Inventory of Personality Organization (IPO-RT), the Domain-Specific Risk-Taking Scale, and a terrorism-related behavior change scale. Structural equation modeling examined three progressive models. Firstly, the Independence Model assumed that probabilistic reasoning, perception of risk and reality testing independently predicted terrorism-related behavior change. Secondly, the Mediation Model supposed that probabilistic reasoning and reality testing correlated, and indirectly predicted terrorism-related behavior change through perception of risk. Lastly, the Dual-Influence Model proposed that probabilistic reasoning indirectly predicted terrorism-related behavior change via perception of risk, independent of reality testing. Results indicated that performance on probabilistic reasoning tasks most strongly predicted perception of risk, and preference for an intuitive thinking style (measured by the IPO-RT) best explained terrorism-related behavior change. The combination of perception of risk with probabilistic reasoning ability in the Dual-Influence Model enhanced the predictive power of the analytical-rational route, with conjunction fallacy having a significant indirect effect on terrorism-related behavior change via perception of risk. The Dual-Influence Model possessed superior fit and reported similar predictive relations between intuitive-experiential and analytical-rational routes and terrorism-related behavior change. The discussion critically examines these findings in relation to dual

  7. Development of RESRAD probabilistic computer codes for NRC decommissioning and license termination applications

    International Nuclear Information System (INIS)

    Chen, S. Y.; Yu, C.; Mo, T.; Trottier, C.

    2000-01-01

    In 1999, the US Nuclear Regulatory Commission (NRC) tasked Argonne National Laboratory to modify the existing RESRAD and RESRAD-BUILD codes to perform probabilistic, site-specific dose analysis for use with the NRC's Standard Review Plan for demonstrating compliance with the license termination rule. The RESRAD codes have been developed by Argonne to support the US Department of Energy's (DOE's) cleanup efforts. Through more than a decade of application, the codes already have established a large user base in the nation and rigorous QA support. The primary objectives of the NRC task are to: (1) extend the codes' capabilities to include probabilistic analysis, and (2) develop parameter distribution functions and perform probabilistic analysis with the codes. The new codes also contain user-friendly features specially designed with a graphical user interface. In October 2000, the revised RESRAD (version 6.0) and RESRAD-BUILD (version 3.0), together with the user's guide and relevant parameter information, have been developed and are made available to the general public via the Internet for use

  8. Using ELM-based weighted probabilistic model in the classification of synchronous EEG BCI.

    Science.gov (United States)

    Tan, Ping; Tan, Guan-Zheng; Cai, Zi-Xing; Sa, Wei-Ping; Zou, Yi-Qun

    2017-01-01

    Extreme learning machine (ELM) is an effective machine learning technique with simple theory and fast implementation, which has gained increasing interest from various research fields recently. A new method that combines ELM with a probabilistic model is proposed in this paper to classify the electroencephalography (EEG) signals in a synchronous brain-computer interface (BCI) system. In the proposed method, the softmax function is used to convert the ELM output to classification probability. The Chernoff error bound, deduced from the Bayesian probabilistic model in the training process, is adopted as the weight in the discriminant process. Since the proposed method makes use of the knowledge from all preceding training datasets, its discriminating performance improves cumulatively. In the test experiments based on the datasets from BCI competitions, the proposed method is compared with other classification methods, including the linear discriminant analysis, support vector machine, ELM and weighted probabilistic model methods. For comparison, the mutual information, classification accuracy and information transfer rate are considered as the evaluation indicators for these classifiers. The results demonstrate that our method shows competitive performance against other methods.
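
    A minimal sketch of the core mechanism described above: an extreme learning machine with random hidden weights whose outputs are turned into class probabilities by a softmax. The Chernoff-bound weighting used in the paper is omitted, and the network size and toy data are illustrative assumptions, not the authors' EEG setup.

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_fit(X, y_onehot, n_hidden=50):
    """Train an ELM: random input weights and biases, output weights by least squares."""
    W = rng.normal(size=(X.shape[1], n_hidden))
    b = rng.normal(size=n_hidden)
    H = np.tanh(X @ W + b)                                  # random hidden-layer features
    beta, *_ = np.linalg.lstsq(H, y_onehot, rcond=None)     # output weights
    return W, b, beta

def elm_predict_proba(X, W, b, beta):
    """Convert ELM outputs to class probabilities with a softmax."""
    scores = np.tanh(X @ W + b) @ beta
    e = np.exp(scores - scores.max(axis=1, keepdims=True))
    return e / e.sum(axis=1, keepdims=True)

# Toy two-class data standing in for EEG feature vectors (illustrative only).
X = rng.normal(size=(200, 8))
y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(int)
Y = np.eye(2)[y]
W, b, beta = elm_fit(X, Y)
print(elm_predict_proba(X[:5], W, b, beta))
```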

  9. Diffusion tensor tractography of the arcuate fasciculus in patients with brain tumors: Comparison between deterministic and probabilistic models.

    Science.gov (United States)

    Li, Zhixi; Peck, Kyung K; Brennan, Nicole P; Jenabi, Mehrnaz; Hsu, Meier; Zhang, Zhigang; Holodny, Andrei I; Young, Robert J

    2013-02-01

    The purpose of this study was to compare the deterministic and probabilistic tracking methods of diffusion tensor white matter fiber tractography in patients with brain tumors. We identified 29 patients with left brain tumors probabilistic method based on an extended Monte Carlo Random Walk algorithm. Tracking was controlled using two ROIs corresponding to Broca's and Wernicke's areas. Tracts in tumor-affected hemispheres were examined for extension between Broca's and Wernicke's areas, anterior-posterior length and volume, and compared with the normal contralateral tracts. Probabilistic tracts displayed more complete anterior extension to Broca's area than did FACT tracts on the tumor-affected and normal sides (p probabilistic tracts than FACT tracts (p probabilistic tracts than FACT tracts (p = 0.01). Probabilistic tractography reconstructs the arcuate fasciculus more completely and performs better through areas of tumor and/or edema. The FACT algorithm tends to underestimate the anterior-most fibers of the arcuate fasciculus, which are crossed by primary motor fibers.

  10. A Probabilistic Safety Assessment of a Pyro-processed Waste Repository

    International Nuclear Information System (INIS)

    Lee, Youn Myoung; Jeong, Jong Tae

    2012-01-01

    A GoldSim template program for a safety assessment of a hybrid-type repository system, called A-KRS, in which two kinds of pyro-processed radioactive wastes, low-level metal wastes and ceramic high-level wastes that arise from the pyro-processing of PWR nuclear spent fuels are disposed of, has been developed. This program is ready for both deterministic and probabilistic total system performance assessment, and is able to evaluate nuclide release from the repository and further transport into the geosphere and biosphere under various normal, disruptive natural and manmade events, and scenarios. The A-KRS has been probabilistically assessed with 9 selected input parameters, each of which has its own statistical distribution, for a normal release and transport scenario associated with nuclide release and transport in and around the repository. Probabilistic dose exposure rates to the farming exposure group have been evaluated. The sensitivity of the 9 selected parameters to the result has also been investigated to see which parameter is more sensitive and important to the exposure rates.

  11. Millennium development goals: Examining Kenya constraints in achieving the eight goals

    Directory of Open Access Journals (Sweden)

    Wambua Leonard Munyao

    2013-06-01

    Full Text Available This paper examines Kenya’s performance in achieving the famous millennium development goals. The paper provides the government and other stakeholders with a proper understanding of the constraints of achieving the millennium development goals as well as reflecting the phase and the passion of the country in achieving this important development goal. The paper further seeks to stress the importance of this goal in reducing poverty in the country. The paper has cited some key factors undermining the achievement of the millennium development goals in Kenya. Major recommendations that can contribute towards achieving the millennium development goals have also been made.

  12. Representation of human behaviour in probabilistic safety analysis

    International Nuclear Information System (INIS)

    Whittingham, R.B.

    1991-01-01

    This paper provides an overview of the representation of human behaviour in probabilistic safety assessment. Human performance problems which may result in errors leading to accidents are considered in terms of methods of identification using task analysis, screening analysis of critical errors, representation and quantification of human errors in fault trees and event trees and error reduction measures. (author) figs., tabs., 43 refs

  13. Delineating probabilistic species pools in ecology and biogeography

    OpenAIRE

    Karger, Dirk Nikolaus; Cord, Anna F; Kessler, Michael; Kreft, Holger; Kühn, Ingolf; Pompe, Sven; Sandel, Brody; Sarmento Cabral, Juliano; Smith, Adam B; Svenning, Jens-Christian; Tuomisto, Hanna; Weigelt, Patrick; Wesche, Karsten

    2016-01-01

    Aim: To provide a mechanistic and probabilistic framework for defining the species pool based on species-specific probabilities of dispersal, environmental suitability and biotic interactions within a specific temporal extent, and to show how probabilistic species pools can help disentangle the geographical structure of different community assembly processes. Innovation: Probabilistic species pools provide an improved species pool definition based on probabilities in conjuncti...

  14. Probabilistic safety assessment for high-level waste tanks at Hanford

    International Nuclear Information System (INIS)

    Sullivan, L.H.; MacFarlane, D.R.; Stack, D.W.

    1996-01-01

    Los Alamos National Laboratory has performed a comprehensive probabilistic safety assessment (PSA), including consideration of external events, for the 18 tank farms at the Hanford Tank Farm (HTF). This work was sponsored by the Department of Energy/Environmental Restoration and Waste Management Division (DOE/EM)

  15. How can leaders foster team learning? Effects of leader-assigned mastery and performance goals and psychological safety.

    Science.gov (United States)

    Ashauer, Shirley A; Macan, Therese

    2013-01-01

    Learning and adapting to change are imperative as teams today face unprecedented change. Yet, an important part of learning involves challenging assumptions and addressing differences of opinion openly within a group--the kind of behaviors that pose the potential for embarrassment or threat. How can leaders foster an environment in which team members feel it is safe to take interpersonal risks in order to learn? In a study of 71 teams, we found that psychological safety and learning behavior were higher for teams given mastery goal instructions than for teams given performance goal instructions or no goal instructions. Team psychological safety mediated the relationship between mastery and performance goal instructions and learning behavior. Findings contribute to our understanding of how leader-assigned goals are related to psychological safety and learning behavior in a team context, and suggest approaches to foster such processes.

  16. Probabilistic Design

    DEFF Research Database (Denmark)

    Sørensen, John Dalsgaard; Burcharth, H. F.

    This chapter describes how partial safety factors can be used in design of vertical wall breakwaters and an example of a code format is presented. The partial safety factors are calibrated on a probabilistic basis. The code calibration process used to calibrate some of the partial safety factors...

  17. Need to use probabilistic risk approach in performance assessment of waste disposal facilities

    International Nuclear Information System (INIS)

    Bonano, E.J.; Gallegos, D.P.

    1991-01-01

    Regulations governing the disposal of radioactive, hazardous, and/or mixed wastes will likely require, either directly or indirectly, that the performance of disposal facilities be assessed quantitatively. Such analyses, commonly called ''performance assessments,'' rely on the use of predictive models to arrive at a quantitative estimate of the potential impact of disposal on the environment and the safety and health of the public. It has been recognized that a suite of uncertainties affect the results of a performance assessment. These uncertainties are conventionally categorized as (1) uncertainty in the future state of the disposal system (facility and surrounding medium), (2) uncertainty in models (including conceptual models, mathematical models, and computer codes), and (3) uncertainty in data and parameters. Decisions regarding the suitability of a waste disposal facility must be made in light of these uncertainties. Hence, an approach is needed that would allow the explicit consideration of these uncertainties so that their impact on the estimated consequences of disposal can be evaluated. While most regulations for waste disposal do not prescribe the consideration of uncertainties, it is proposed that, even in such cases, a meaningful decision regarding the suitability of a waste disposal facility cannot be made without considering the impact of the attendant uncertainties. A probabilistic risk assessment (PRA) approach provides the formalism for considering the uncertainties and the technical basis that the decision makers can use in discharging their duties. A PRA methodology developed and demonstrated for the disposal of high-level radioactive waste provides a general framework for assessing the disposal of all types of wastes (radioactive, hazardous, and mixed). 15 refs., 1 fig., 1 tab

  18. Probabilistic Learning by Rodent Grid Cells.

    Science.gov (United States)

    Cheung, Allen

    2016-10-01

    Mounting evidence shows mammalian brains are probabilistic computers, but the specific cells involved remain elusive. Parallel research suggests that grid cells of the mammalian hippocampal formation are fundamental to spatial cognition but their diverse response properties still defy explanation. No plausible model exists which explains stable grids in darkness for twenty minutes or longer, despite being one of the first results ever published on grid cells. Similarly, no current explanation can tie together grid fragmentation and grid rescaling, which show very different forms of flexibility in grid responses when the environment is varied. Other properties such as attractor dynamics and grid anisotropy seem to be at odds with one another unless additional properties are assumed such as a varying velocity gain. Modelling efforts have largely ignored the breadth of response patterns, while also failing to account for the disastrous effects of sensory noise during spatial learning and recall, especially in darkness. Here, published electrophysiological evidence from a range of experiments are reinterpreted using a novel probabilistic learning model, which shows that grid cell responses are accurately predicted by a probabilistic learning process. Diverse response properties of probabilistic grid cells are statistically indistinguishable from rat grid cells across key manipulations. A simple coherent set of probabilistic computations explains stable grid fields in darkness, partial grid rescaling in resized arenas, low-dimensional attractor grid cell dynamics, and grid fragmentation in hairpin mazes. The same computations also reconcile oscillatory dynamics at the single cell level with attractor dynamics at the cell ensemble level. Additionally, a clear functional role for boundary cells is proposed for spatial learning. These findings provide a parsimonious and unified explanation of grid cell function, and implicate grid cells as an accessible neuronal population

  19. Living probabilistic safety assessment (LPSA)

    International Nuclear Information System (INIS)

    1999-08-01

    Over the past few years many nuclear power plant organizations have performed probabilistic safety assessments (PSAs) to identify and understand key plant vulnerabilities. As a result of the availability of these PSA studies, there is a desire to use them to enhance plant safety and to operate the nuclear stations in the most efficient manner. PSA is an effective tool for this purpose as it assists plant management to target resources where the largest benefit to plant safety can be obtained. However, any PSA which is to be used in this way must have a credible and defensible basis. Thus, it is very important to have a high quality 'living PSA' accepted by the plant and the regulator. With this background in mind, the IAEA has prepared this report on Living Probabilistic Safety Assessment (LPSA) which addresses the updating, documentation, quality assurance, and management and organizational requirements for LPSA. Deficiencies in the areas addressed in this report would seriously reduce the adequacy of the LPSA as a tool to support decision making at NPPs. This report was reviewed by a working group during a Technical Committee Meeting on PSA Applications to Improve NPP Safety held in Madrid, Spain, from 23 to 27 February 1998

  20. Probabilistic approach to EMP assessment

    International Nuclear Information System (INIS)

    Bevensee, R.M.; Cabayan, H.S.; Deadrick, F.J.; Martin, L.C.; Mensing, R.W.

    1980-09-01

    The development of nuclear EMP hardness requirements must account for uncertainties in the environment, in interaction and coupling, and in the susceptibility of subsystems and components. Typical uncertainties of the last two kinds are briefly summarized, and an assessment methodology is outlined, based on a probabilistic approach that encompasses the basic concepts of reliability. It is suggested that statements of survivability be made compatible with system reliability. Validation of the approach taken for simple antenna/circuit systems is performed with experiments and calculations that involve a Transient Electromagnetic Range, numerical antenna modeling, separate device failure data, and a failure analysis computer program

  1. Probabilistic composition of preferences, theory and applications

    CERN Document Server

    Parracho Sant'Anna, Annibal

    2015-01-01

    Putting forward a unified presentation of the features and possible applications of probabilistic preferences composition, and serving as a methodology for decisions employing multiple criteria, this book maximizes reader insights into the evaluation in probabilistic terms and the development of composition approaches that do not depend on assigning weights to the criteria. With key applications in important areas of management such as failure modes and effects analysis and productivity analysis – together with explanations about the application of the concepts involved – this book makes available numerical examples of probabilistic transformation development and probabilistic composition. Useful not only as a reference source for researchers, but also in teaching classes of graduate courses in Production Engineering and Management Science, the key themes of the book will be of especial interest to researchers in the field of Operational Research.

  2. Global assessment of predictability of water availability: A bivariate probabilistic Budyko analysis

    Science.gov (United States)

    Wang, Weiguang; Fu, Jianyu

    2018-02-01

    Estimating continental water availability is of great importance for water resources management, in terms of maintaining ecosystem integrity and sustaining societal development. To more accurately quantify the predictability of water availability, on the basis of the univariate probabilistic Budyko framework, a bivariate probabilistic Budyko approach was developed using a copula-based joint distribution model to account for the dependence between the parameter ω of Wang-Tang's equation and the Normalized Difference Vegetation Index (NDVI), and was applied globally. The results indicate that the predictive performance for global water availability is conditional on the climatic condition. In comparison with the simple univariate distribution, the bivariate one produces a lower interquartile range under the same global dataset, especially in the regions with higher NDVI values, highlighting the importance of developing the joint distribution by taking into account the dependence structure of the parameter ω and NDVI, which can provide a more accurate probabilistic evaluation of water availability.
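
    A hedged sketch of the copula idea referred to above: sampling (ω, NDVI) pairs whose dependence is imposed through a Gaussian copula over assumed lognormal and beta marginals. The copula family, marginals, and correlation value are placeholders, not the distributions fitted in the study.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def sample_gaussian_copula(n, rho, marg_omega, marg_ndvi):
    """Draw (omega, NDVI) pairs whose dependence follows a Gaussian copula."""
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
    u = stats.norm.cdf(z)            # uniform marginals carrying the copula dependence
    omega = marg_omega.ppf(u[:, 0])  # transform to the assumed marginal distributions
    ndvi = marg_ndvi.ppf(u[:, 1])
    return omega, ndvi

# Illustrative marginals (not those fitted in the study).
omega_marginal = stats.lognorm(s=0.4, scale=2.0)
ndvi_marginal = stats.beta(a=4, b=2)
omega, ndvi = sample_gaussian_copula(10_000, 0.6, omega_marginal, ndvi_marginal)
print(np.corrcoef(omega, ndvi)[0, 1])  # rank-like dependence induced by the copula
```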

  3. Probabilistic Reversible Automata and Quantum Automata

    OpenAIRE

    Golovkins, Marats; Kravtsev, Maksim

    2002-01-01

    To study relationship between quantum finite automata and probabilistic finite automata, we introduce a notion of probabilistic reversible automata (PRA, or doubly stochastic automata). We find that there is a strong relationship between different possible models of PRA and corresponding models of quantum finite automata. We also propose a classification of reversible finite 1-way automata.

  4. P-CARES 2.0.0, Probabilistic Computer Analysis for Rapid Evaluation of Structures

    International Nuclear Information System (INIS)

    2008-01-01

    1 - Description of program or function: P-CARES 2.0.0 (Probabilistic Computer Analysis for Rapid Evaluation of Structures) was developed for NRC staff use to determine the validity and accuracy of the analysis methods used by various utilities for structural safety evaluations of nuclear power plants. P-CARES provides the capability to effectively evaluate the probabilistic seismic response using simplified soil and structural models and to quickly check the validity and/or accuracy of the SSI data received from applicants and licensees. The code is organized in a modular format with the basic modules of the system performing static, seismic, and nonlinear analysis. 2 - Methods: P-CARES is an update of the CARES program developed at Brookhaven National Laboratory during the 1980s. A major improvement is the enhanced analysis capability in which a probabilistic algorithm has been implemented to perform the probabilistic site response and soil-structure interaction (SSI) analyses. This is accomplished using several sampling techniques such as the Latin Hypercube sampling (LHC), engineering LHC, the Fekete Point Set method, and also the traditional Monte Carlo simulation. This new feature enhances the site response and SSI analysis such that the effect of uncertainty in local site soil properties can now be quantified. Another major addition to P-CARES is a graphical user interface (GUI) which significantly improves the performance of P-CARES in terms of the inter-relations among different functions of the program, and facilitates the input/output processing and execution management. It also provides many user friendly features that would allow an analyst to quickly develop insights from the analysis results. 3 - Restrictions on the complexity of the problem: None noted
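
    A minimal sketch of one of the sampling techniques named above, Latin Hypercube sampling, mapped onto two illustrative soil-property ranges; the variable names and ranges are placeholders and are not taken from P-CARES.

```python
import numpy as np

rng = np.random.default_rng(42)

def latin_hypercube(n_samples, n_vars):
    """Latin Hypercube sample on the unit hypercube: one point per stratum per variable."""
    u = np.empty((n_samples, n_vars))
    for j in range(n_vars):
        # One uniform draw inside each of n_samples equal-probability strata, then shuffled.
        strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
        u[:, j] = rng.permutation(strata)
    return u

# Map unit-cube samples to illustrative soil-property ranges (placeholders, not P-CARES data).
u = latin_hypercube(100, 2)
shear_wave_velocity = 200.0 + u[:, 0] * (600.0 - 200.0)   # m/s
damping_ratio = 0.01 + u[:, 1] * (0.10 - 0.01)
print(shear_wave_velocity[:5], damping_ratio[:5])
```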

  5. Probabilistic uniformities of uniform spaces

    Energy Technology Data Exchange (ETDEWEB)

    Rodriguez Lopez, J.; Romaguera, S.; Sanchis, M.

    2017-07-01

    The theory of metric spaces in the fuzzy context has been shown to be an interesting area of study not only from a theoretical point of view but also for its applications. Nevertheless, it is usual to consider these spaces as classical topological or uniform spaces, and there are not too many results about constructing fuzzy topological structures starting from a fuzzy metric. Höhle was perhaps the first to show how to construct a probabilistic uniformity and a Lowen uniformity from a probabilistic pseudometric [Hohle78, Hohle82a]. His method can be directly translated to the context of fuzzy metrics and allows one to characterize the categories of probabilistic uniform spaces or Lowen uniform spaces by means of certain families of fuzzy pseudometrics [RL]. On the other hand, other fuzzy uniformities can be constructed in a fuzzy metric space: a Hutton [0,1]-quasi-uniformity [GGPV06], a fuzzifying uniformity [YueShi10], etc. The paper [GGRLRo] gives a study of several methods of endowing a fuzzy pseudometric space with a probabilistic uniformity and a Hutton [0,1]-quasi-uniformity. In 2010, J. Gutiérrez García, S. Romaguera and M. Sanchis [GGRoSanchis10] proved that the category of uniform spaces is isomorphic to a category formed by sets endowed with a fuzzy uniform structure, i.e. a family of fuzzy pseudometrics satisfying certain conditions. We will show here that, by means of this isomorphism, we can obtain several methods to endow a uniform space with a probabilistic uniformity. Furthermore, these constructions allow us to obtain a factorization of some functors introduced in [GGRoSanchis10]. (Author)

  6. RNA-PAIRS: RNA probabilistic assignment of imino resonance shifts

    International Nuclear Information System (INIS)

    Bahrami, Arash; Clos, Lawrence J.; Markley, John L.; Butcher, Samuel E.; Eghbalnia, Hamid R.

    2012-01-01

    The significant biological role of RNA has further highlighted the need for improving the accuracy, efficiency and the reach of methods for investigating RNA structure and function. Nuclear magnetic resonance (NMR) spectroscopy is vital to furthering the goals of RNA structural biology because of its distinctive capabilities. However, the dispersion pattern in the NMR spectra of RNA makes automated resonance assignment, a key step in NMR investigation of biomolecules, remarkably challenging. Herein we present RNA Probabilistic Assignment of Imino Resonance Shifts (RNA-PAIRS), a method for the automated assignment of RNA imino resonances with synchronized verification and correction of predicted secondary structure. RNA-PAIRS represents an advance in modeling the assignment paradigm because it seeds the probabilistic network for assignment with experimental NMR data, and predicted RNA secondary structure, simultaneously and from the start. Subsequently, RNA-PAIRS sets in motion a dynamic network that reverberates between predictions and experimental evidence in order to reconcile and rectify resonance assignments and secondary structure information. The procedure is halted when assignments and base-pairings are deemed to be most consistent with observed crosspeaks. The current implementation of RNA-PAIRS uses an initial peak list derived from proton-nitrogen heteronuclear multiple quantum correlation (1H–15N 2D HMQC) and proton–proton nuclear Overhauser enhancement spectroscopy (1H–1H 2D NOESY) experiments. We have evaluated the performance of RNA-PAIRS by using it to analyze NMR datasets from 26 previously studied RNAs, including a 111-nucleotide complex. For moderately sized RNA molecules, and over a range of comparatively complex structural motifs, the average assignment accuracy exceeds 90%, while the average base pair prediction accuracy exceeded 93%. RNA-PAIRS yielded accurate assignments and base pairings consistent with imino resonances for a

  7. RNA-PAIRS: RNA probabilistic assignment of imino resonance shifts

    Energy Technology Data Exchange (ETDEWEB)

    Bahrami, Arash; Clos, Lawrence J.; Markley, John L.; Butcher, Samuel E. [National Magnetic Resonance Facility at Madison (United States); Eghbalnia, Hamid R., E-mail: eghbalhd@uc.edu [University of Cincinnati, Department of Molecular and Cellular Physiology (United States)

    2012-04-15

    The significant biological role of RNA has further highlighted the need for improving the accuracy, efficiency and the reach of methods for investigating RNA structure and function. Nuclear magnetic resonance (NMR) spectroscopy is vital to furthering the goals of RNA structural biology because of its distinctive capabilities. However, the dispersion pattern in the NMR spectra of RNA makes automated resonance assignment, a key step in NMR investigation of biomolecules, remarkably challenging. Herein we present RNA Probabilistic Assignment of Imino Resonance Shifts (RNA-PAIRS), a method for the automated assignment of RNA imino resonances with synchronized verification and correction of predicted secondary structure. RNA-PAIRS represents an advance in modeling the assignment paradigm because it seeds the probabilistic network for assignment with experimental NMR data, and predicted RNA secondary structure, simultaneously and from the start. Subsequently, RNA-PAIRS sets in motion a dynamic network that reverberates between predictions and experimental evidence in order to reconcile and rectify resonance assignments and secondary structure information. The procedure is halted when assignments and base-pairings are deemed to be most consistent with observed crosspeaks. The current implementation of RNA-PAIRS uses an initial peak list derived from proton-nitrogen heteronuclear multiple quantum correlation (1H-15N 2D HMQC) and proton-proton nuclear Overhauser enhancement spectroscopy (1H-1H 2D NOESY) experiments. We have evaluated the performance of RNA-PAIRS by using it to analyze NMR datasets from 26 previously studied RNAs, including a 111-nucleotide complex. For moderately sized RNA molecules, and over a range of comparatively complex structural motifs, the average assignment accuracy exceeds 90%, while the average base pair prediction accuracy exceeded 93%. RNA-PAIRS yielded accurate assignments and base pairings consistent with imino

  8. Bisimulations meet PCTL equivalences for probabilistic automata

    DEFF Research Database (Denmark)

    Song, Lei; Zhang, Lijun; Godskesen, Jens Chr.

    2013-01-01

    Probabilistic automata (PAs) have been successfully applied in formal verification of concurrent and stochastic systems. Efficient model checking algorithms have been studied, where the most often used logics for expressing properties are based on probabilistic computation tree logic (PCTL) and its...

  9. APROBA-Plus: A probabilistic tool to evaluate and express uncertainty in hazard characterization and exposure assessment of substances.

    Science.gov (United States)

    Bokkers, Bas G H; Mengelers, Marcel J; Bakker, Martine I; Chiu, Weihsueh A; Slob, Wout

    2017-12-01

    To facilitate the application of probabilistic risk assessment, the WHO released the APROBA tool. This tool applies lognormal uncertainty distributions to the different aspects of the hazard characterization, resulting in a probabilistic health-based guidance value. The current paper describes an extension, APROBA-Plus, which combines the output from the probabilistic hazard characterization with the probabilistic exposure to rapidly characterize risk and its uncertainty. The uncertainty in exposure is graphically compared with the uncertainty in the target human dose, i.e. the dose that complies with the specified protection goals. APROBA-Plus is applied to several case studies, resulting in distinct outcomes and illustrating that APROBA-Plus could serve as a standard extension of routine risk assessments. By visualizing the uncertainties, APROBA-Plus provides a more transparent and informative outcome than the more usual deterministic approaches, so that risk managers can make better informed decisions. For example, APROBA-Plus can help in deciding whether risk-reducing measures are warranted or that a refined risk assessment would first be needed. If the latter, the tool can be used to prioritize possible refinements. APROBA-Plus may also be used to rank substances into different risk categories, based on potential health risks without being compromised by different levels of conservatism that may be associated with point estimates of risk. Copyright © 2017 Elsevier Ltd. All rights reserved.
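
    A hedged sketch of the comparison described above: the uncertain exposure and the uncertain target human dose are each represented by a lognormal distribution and overlapped by Monte Carlo sampling. The parameter values are illustrative placeholders, not APROBA defaults.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 100_000

# Illustrative lognormal uncertainty distributions (placeholders, not APROBA defaults).
exposure = rng.lognormal(mean=np.log(0.5), sigma=0.6, size=n)      # mg/kg bw/day
target_dose = rng.lognormal(mean=np.log(2.0), sigma=0.8, size=n)   # dose meeting the protection goals

# Probability that exposure exceeds the target human dose, given both uncertainties.
p_exceed = np.mean(exposure > target_dose)
print(f"P(exposure > target dose) is approximately {p_exceed:.3f}")
```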

  10. Effects of methamphetamine administration on information gathering during probabilistic reasoning in healthy humans.

    Science.gov (United States)

    Ermakova, Anna O; Ramachandra, Pranathi; Corlett, Philip R; Fletcher, Paul C; Murray, Graham K

    2014-01-01

    Jumping to conclusions (JTC) during probabilistic reasoning is a cognitive bias repeatedly demonstrated in people with schizophrenia and shown to be associated with delusions. Little is known about the neurochemical basis of probabilistic reasoning. We tested the hypothesis that catecholamines influence data gathering and probabilistic reasoning by administering intravenous methamphetamine, which is known to cause synaptic release of the catecholamines noradrenaline and dopamine, to healthy humans whilst they undertook a probabilistic inference task. Our study used a randomised, double-blind, cross-over design. Seventeen healthy volunteers on three visits were administered either placebo or methamphetamine or methamphetamine preceded by amisulpride. In all three conditions participants performed the "beads" task in which participants decide how much information to gather before making a probabilistic inference, and which measures the cognitive bias towards jumping to conclusions. Psychotic symptoms triggered by methamphetamine were assessed using Comprehensive Assessment of At-Risk Mental States (CAARMS). Methamphetamine induced mild psychotic symptoms, but there was no effect of drug administration on the number of draws to decision (DTD) on the beads task. DTD was a stable trait that was highly correlated within subjects across visits (intra-class correlation coefficients of 0.86 and 0.91 on two versions of the task). The less information was sampled in the placebo condition, the more psychotic-like symptoms the person had after the methamphetamine plus amisulpride condition (p = 0.028). Our results suggest that information gathering during probabilistic reasoning is a stable trait, not easily modified by dopaminergic or noradrenergic modulation.
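
    A minimal sketch of the probabilistic inference underlying the beads task mentioned above: the posterior for the majority-colour jar is updated after each draw, and the draws-to-decision count is the number of beads seen before the posterior crosses a decision threshold. The 85/15 bead ratio and the threshold are illustrative assumptions, not necessarily the task parameters used in the study.

```python
def draws_to_decision(beads, p_majority=0.85, threshold=0.95):
    """Return how many beads are drawn before the posterior for either jar crosses threshold.

    beads: sequence of 0/1 draws; jar A produces colour 1 with probability p_majority,
    jar B with 1 - p_majority. Uniform prior over the two jars.
    """
    prior_a = 0.5
    for i, bead in enumerate(beads, start=1):
        like_a = p_majority if bead == 1 else 1 - p_majority
        like_b = 1 - p_majority if bead == 1 else p_majority
        post_a = like_a * prior_a / (like_a * prior_a + like_b * (1 - prior_a))
        if post_a >= threshold or post_a <= 1 - threshold:
            return i
        prior_a = post_a
    return len(beads)  # decision threshold never reached

print(draws_to_decision([1, 1, 0, 1, 1, 1, 0, 1]))  # -> 2 with these illustrative settings
```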

  11. Probabilistic liquefaction hazard analysis at liquefied sites of 1956 Dunaharaszti earthquake, in Hungary

    Science.gov (United States)

    Győri, Erzsébet; Gráczer, Zoltán; Tóth, László; Bán, Zoltán; Horváth, Tibor

    2017-04-01

    Liquefaction potential evaluations are generally made to assess the hazard from specific scenario earthquakes. These evaluations may estimate the potential in a binary fashion (yes/no), define a factor of safety or predict the probability of liquefaction given a scenario event. Usually the level of ground shaking is obtained from the results of PSHA. Although it is determined probabilistically, a single level of ground shaking is selected and used within the liquefaction potential evaluation. In contrast, fully probabilistic liquefaction potential assessment methods provide a complete picture of liquefaction hazard, namely by taking into account the joint probability distribution of PGA and magnitude of earthquake scenarios, both of which are key inputs in the stress-based simplified methods. Kramer and Mayfield (2007) developed a fully probabilistic liquefaction potential evaluation method using a performance-based earthquake engineering (PBEE) framework. The results of the procedure are the direct estimate of the return period of liquefaction and the liquefaction hazard curves as a function of depth. The method combines the disaggregation matrices computed for different exceedance frequencies during probabilistic seismic hazard analysis with one of the recent models for the conditional probability of liquefaction. We have developed software for the assessment of performance-based liquefaction triggering on the basis of the Kramer and Mayfield method. Originally, the SPT-based probabilistic method of Cetin et al. (2004) was built into the procedure of Kramer and Mayfield to compute the conditional probability; however, there is no professional consensus about its applicability. Therefore we have included not only Cetin's method but also the Idriss and Boulanger (2012) SPT-based and the Boulanger and Idriss (2014) CPT-based procedures in our computer program. In 1956, a damaging earthquake of magnitude 5.6 occurred in Dunaharaszti, in Hungary. Its epicenter was located
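
    The performance-based calculation sketched above can be summarised, in generic notation (not necessarily that of the software described), as combining the conditional probability of liquefaction with the joint PGA-magnitude hazard from the PSHA disaggregation to obtain the mean annual rate, and hence the return period, of the factor of safety FS_L falling below a threshold fs*:

```latex
\Lambda_{FS_L}(fs^{*}) \;=\; \sum_{i=1}^{N_{PGA}} \sum_{j=1}^{N_{M}}
  P\!\left[\, FS_L < fs^{*} \mid pga_i,\, m_j \,\right]\,
  \Delta\lambda_{pga_i, m_j},
\qquad
T_R(fs^{*}) \;=\; \frac{1}{\Lambda_{FS_L}(fs^{*})}
```

    Here Δλ_{pga_i, m_j} is the incremental mean annual rate of the (PGA, magnitude) pair taken from the hazard disaggregation, and T_R is the return period of liquefaction for the chosen threshold.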

  12. Distributed collaborative probabilistic design for turbine blade-tip radial running clearance using support vector machine of regression

    Science.gov (United States)

    Fei, Cheng-Wei; Bai, Guang-Chen

    2014-12-01

    To improve the computational precision and efficiency of probabilistic design for mechanical dynamic assembly like the blade-tip radial running clearance (BTRRC) of a gas turbine, a distributed collaborative probabilistic design method based on support vector machine regression (called DCSRM) is proposed by integrating the distributed collaborative response surface method and the support vector machine regression model. The mathematical model of DCSRM is established and the probabilistic design idea of DCSRM is introduced. The dynamic assembly probabilistic design of the aeroengine high-pressure turbine (HPT) BTRRC is accomplished to verify the proposed DCSRM. The analysis results reveal that the optimal static blade-tip clearance of the HPT is obtained for the BTRRC design, improving the performance and reliability of the aeroengine. The comparison of methods shows that the DCSRM has high computational accuracy and high computational efficiency in BTRRC probabilistic analysis. The present research offers an effective way for the reliability design of mechanical dynamic assembly and enriches mechanical reliability theory and method.
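
    A hedged sketch of the surrogate idea described above: a support vector regression model is trained on a small set of runs of an "expensive" clearance function and then used inside a cheap Monte Carlo loop to estimate a failure probability. The clearance function, input distributions, and limit value are placeholders, not the high-pressure-turbine model from the paper.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)

def clearance(x):
    """Placeholder 'expensive' model: clearance as a function of two random inputs."""
    return 1.2 - 0.4 * x[:, 0] + 0.25 * x[:, 1] ** 2

# Fit the SVR surrogate on a small design of experiments.
X_train = rng.normal(size=(60, 2))
y_train = clearance(X_train)
surrogate = SVR(kernel="rbf", C=10.0).fit(X_train, y_train)

# Monte Carlo reliability analysis on the cheap surrogate instead of the full model.
X_mc = rng.normal(size=(100_000, 2))
y_mc = surrogate.predict(X_mc)
print("P(clearance < 0.5) is approximately", np.mean(y_mc < 0.5))
```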

  13. Goal orientation and work role performance: predicting adaptive and proactive work role performance through self-leadership strategies.

    Science.gov (United States)

    Marques-Quinteiro, Pedro; Curral, Luís Alberto

    2012-01-01

    This article explores the relationship between goal orientation, self-leadership dimensions, and adaptive and proactive work role performances. The authors hypothesize that learning orientation, in contrast to performance orientation, positively predicts proactive and adaptive work role performances and that this relationship is mediated by self-leadership behavior-focused strategies. It is posited that self-leadership natural reward strategies and thought pattern strategies moderate this relationship. Workers (N = 108) from a software company participated in this study. As expected, learning orientation did predict adaptive and proactive work role performance. Moreover, in the relationship between learning orientation and proactive work role performance through self-leadership behavior-focused strategies, a moderated mediation effect was found for self-leadership natural reward and thought pattern strategies. In the end, the results and their implications are discussed and future research directions are proposed.

  14. Overview of Future of Probabilistic Methods and RMSL Technology and the Probabilistic Methods Education Initiative for the US Army at the SAE G-11 Meeting

    Science.gov (United States)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting, sponsored by the Picatinny Arsenal during March 1-3, 2004 at the Westin Morristown, will report progress on projects for probabilistic assessment of Army systems and launch an initiative for probabilistic education. The meeting features several Army and industry senior executives and an Ivy League professor to provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members including members with national/international standing, the mission of the G-11's Probabilistic Methods Committee is to enable/facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries by better, faster, greener, smarter, affordable and reliable product development.

  15. Multiobjective optimal allocation problem with probabilistic non ...

    African Journals Online (AJOL)

    This paper considers the optimum compromise allocation in multivariate stratified sampling with non-linear objective function and probabilistic non-linear cost constraint. The probabilistic non-linear cost constraint is converted into equivalent deterministic one by using Chance Constrained programming. A numerical ...
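
    As a worked illustration of the conversion mentioned above (in generic notation, not copied from the paper): if the random cost coefficients c_h are independent normal variables with means μ_h and variances σ_h², the probabilistic (chance) constraint on total cost is equivalent to a deterministic constraint involving the standard normal quantile z_p:

```latex
P\!\left( \sum_{h=1}^{L} c_h\, n_h \le C \right) \ge p
\quad\Longleftrightarrow\quad
\sum_{h=1}^{L} \mu_h\, n_h \;+\; z_{p}\,
\sqrt{\sum_{h=1}^{L} \sigma_h^{2}\, n_h^{2}} \;\le\; C
```

    Here n_h denotes the sample size allocated to stratum h, C the total available budget, and z_p the p-quantile of the standard normal distribution; the equivalence holds under the stated normality assumption.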

  16. Pre-processing for Triangulation of Probabilistic Networks

    NARCIS (Netherlands)

    Bodlaender, H.L.; Koster, A.M.C.A.; Eijkhof, F. van den; Gaag, L.C. van der

    2001-01-01

    The currently most efficient algorithm for inference with a probabilistic network builds upon a triangulation of the network's graph. In this paper, we show that pre-processing can help in finding good triangulations for probabilistic networks, that is, triangulations with a minimal maximum

  17. Strategic Team AI Path Plans: Probabilistic Pathfinding

    Directory of Open Access Journals (Sweden)

    Tng C. H. John

    2008-01-01

    Full Text Available This paper proposes a novel method to generate strategic team AI pathfinding plans for computer games and simulations using probabilistic pathfinding. This method is inspired by genetic algorithms (Russell and Norvig, 2002), in that a fitness function is used to test the quality of the path plans. The method generates high-quality path plans by eliminating the low-quality ones. The path plans are generated by probabilistic pathfinding, and the elimination is done by a fitness test of the path plans. This path plan generation method has the ability to generate variation or different high-quality paths, which is desired for games to increase replay values. This work is an extension of our earlier work on team AI: probabilistic pathfinding (John et al., 2006). We explore ways to combine probabilistic pathfinding and genetic algorithm to create a new method to generate strategic team AI pathfinding plans.
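
    A minimal sketch of the generate-and-filter idea described above: candidate paths are produced by a goal-biased random walk (the probabilistic pathfinding step) and a fitness function keeps only the best plans, so different runs yield different high-quality paths. The grid size, bias, and fitness weights are illustrative assumptions, not the parameters of the published method.

```python
import random

random.seed(0)

def random_path(start, goal, width, height, max_steps=200, bias=0.7):
    """Probabilistic pathfinding: a random walk biased toward the goal."""
    path, pos = [start], start
    while pos != goal and len(path) < max_steps:
        x, y = pos
        moves = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        moves = [(mx, my) for mx, my in moves if 0 <= mx < width and 0 <= my < height]
        if random.random() < bias:  # with probability `bias`, step closer to the goal
            pos = min(moves, key=lambda m: abs(m[0] - goal[0]) + abs(m[1] - goal[1]))
        else:
            pos = random.choice(moves)
        path.append(pos)
    return path

def fitness(path, goal):
    """Shorter paths that actually reach the goal score higher."""
    reached = path[-1] == goal
    return (1000 if reached else 0) - len(path)

# Generate many candidate plans and keep the fittest few, giving variety between runs.
candidates = [random_path((0, 0), (9, 9), 10, 10) for _ in range(200)]
best = sorted(candidates, key=lambda p: fitness(p, (9, 9)), reverse=True)[:5]
print([len(p) for p in best])
```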

  18. Results of the Level 1 probabilistic risk assessment (PRA) of internal events for heavy water production reactors

    International Nuclear Information System (INIS)

    Tinnes, S.P.; Cramer, D.S.; Logan, V.E.; Topp, S.V.; Smith, J.A.; Brandyberry, M.D.

    1990-01-01

    A full-scope probabilistic risk assessment (PRA) is being performed for the Savannah River site (SRS) production reactors. The Level 1 PRA for the K Reactor has been completed and includes the assessment of reactor systems response to accidents and estimates of the severe core melt frequency (SCMF). The internal events spectrum includes those events related directly to plant systems and safety functions for which transients or failures may initiate an accident. The SRS PRA has three principal objectives: (1) improved understanding of SRS reactor safety issues through discovery and understanding of the mechanisms involved; (2) improved risk management capability through tools for assessing the safety impact of both current standard operations and proposed revisions; and (3) a quantitative measure of the risks posed by SRS reactor operation to employees and the general public, to allow comparison with declared goals and other societal risks

  19. Probabilistic Geoacoustic Inversion in Complex Environments

    Science.gov (United States)

    2015-09-30

    Jan Dettmer, School of Earth and Ocean Sciences, University of Victoria, Victoria BC. ... long-range inversion methods can fail to provide sufficient resolution. For proper quantitative examination of variability, parameter uncertainty must ... The project aims to advance probabilistic geoacoustic inversion methods for complex ocean environments for a range of geoacoustic data types. The work is

  20. Probabilistic conditional independence structures

    CERN Document Server

    Studeny, Milan

    2005-01-01

    Probabilistic Conditional Independence Structures provides the mathematical description of probabilistic conditional independence structures; the author uses non-graphical methods of their description, and takes an algebraic approach. The monograph presents the methods of structural imsets and supermodular functions, and deals with independence implication and equivalence of structural imsets. Motivation, mathematical foundations and areas of application are included, and a rough overview of graphical methods is also given. In particular, the author has been careful to use suitable terminology, and presents the work so that it will be understood by both statisticians and researchers in artificial intelligence. The necessary elementary mathematical notions are recalled in an appendix.

  1. E-Area LLWF Vadose Zone Model: Probabilistic Model for Estimating Subsided-Area Infiltration Rates

    Energy Technology Data Exchange (ETDEWEB)

    Dyer, J. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL); Flach, G. [Savannah River Site (SRS), Aiken, SC (United States). Savannah River National Lab. (SRNL)

    2017-12-12

    A probabilistic model employing a Monte Carlo sampling technique was developed in Python to generate statistical distributions of the upslope-intact-area to subsided-area ratio (AreaUAi/AreaSAi) for closure cap subsidence scenarios that differ in assumed percent subsidence and the total number of intact plus subsided compartments. The plan is to use this model as a component in the probabilistic system model for the E-Area Performance Assessment (PA), contributing uncertainty in infiltration estimates.
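
    The sampling idea can be illustrated with a short Python sketch (hypothetical one-dimensional cap geometry and parameter values; the SRNL model itself is not reproduced here): compartments are randomly marked as subsided, the intact area draining onto each subsided compartment is counted, and the ratio statistic is accumulated over many Monte Carlo draws.

      import numpy as np

      rng = np.random.default_rng(42)

      def upslope_ratio_samples(n_compartments=20, pct_subsided=10, n_draws=10_000):
          """Monte Carlo distribution of the upslope-intact to subsided area ratio
          for a single row of equal-area cap compartments (illustrative geometry)."""
          n_sub = max(1, round(n_compartments * pct_subsided / 100.0))
          ratios = np.empty(n_draws)
          for k in range(n_draws):
              subsided = np.zeros(n_compartments, dtype=bool)
              subsided[rng.choice(n_compartments, size=n_sub, replace=False)] = True
              per_cell = []
              for i in np.flatnonzero(subsided):
                  upslope, j = 0, i - 1
                  while j >= 0 and not subsided[j]:   # walk upslope to the next subsided cell
                      upslope += 1
                      j -= 1
                  per_cell.append(upslope)            # intact compartments draining onto cell i
              ratios[k] = np.mean(per_cell)
          return ratios

      samples = upslope_ratio_samples()
      print(np.percentile(samples, [5, 50, 95]))      # statistical distribution of the ratio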

  2. Non-unitary probabilistic quantum computing circuit and method

    Science.gov (United States)

    Williams, Colin P. (Inventor); Gingrich, Robert M. (Inventor)

    2009-01-01

    A quantum circuit performing quantum computation in a quantum computer. A chosen transformation of an initial n-qubit state is probabilistically obtained. The circuit comprises a unitary quantum operator obtained from a non-unitary quantum operator, operating on an n-qubit state and an ancilla state. When operation on the ancilla state provides a success condition, computation is stopped. When operation on the ancilla state provides a failure condition, computation is performed again on the ancilla state and the n-qubit state obtained in the previous computation, until a success condition is obtained.
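
    The repeat-until-success structure can be emulated numerically. The sketch below (NumPy only, with an arbitrary illustrative operator; not the patented circuit) embeds a non-unitary single-qubit operator in a unitary acting on the qubit plus an ancilla, measures the ancilla, and retries on the failure branch.

      import numpy as np

      rng = np.random.default_rng(5)

      def psd_sqrt(M):
          """Matrix square root of a positive semi-definite Hermitian matrix."""
          w, V = np.linalg.eigh(M)
          return (V * np.sqrt(np.clip(w, 0.0, None))) @ V.conj().T

      # Illustrative non-unitary operator, rescaled so its largest singular value is < 1
      A = np.array([[1.0, 0.3], [0.0, 0.5]])
      A = 0.9 * A / np.linalg.svd(A, compute_uv=False)[0]

      I2 = np.eye(2)
      U = np.block([[A, psd_sqrt(I2 - A @ A.conj().T)],
                    [psd_sqrt(I2 - A.conj().T @ A), -A.conj().T]])   # unitary dilation

      psi = np.array([1.0, 1.0]) / np.sqrt(2.0)                 # input qubit state
      for attempt in range(1, 51):
          out = U @ np.concatenate([psi, np.zeros(2)])          # ancilla prepared in |0>
          success, failure = out[:2], out[2:]
          p_success = float(np.linalg.norm(success) ** 2)
          if rng.random() < p_success:                          # ancilla measured as |0>
              psi = success / np.sqrt(p_success)                # proportional to A|psi>
              print(f"success after {attempt} attempt(s)")
              break
          psi = failure / np.linalg.norm(failure)               # retry on the failure state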

  3. Probabilistic assessment of nuclear safety and safeguards

    International Nuclear Information System (INIS)

    Higson, D.J.

    1987-01-01

    Nuclear reactor accidents and diversions of materials from the nuclear fuel cycle are perceived by many people as particularly serious threats to society. Probabilistic assessment is a rational approach to the evaluation of both threats, and may provide a basis for decisions on appropriate actions to control them. Probabilistic methods have become standard tools used in the analysis of safety, but there are disagreements on the criteria to be applied when assessing the results of analysis. Probabilistic analysis and assessment of the effectiveness of nuclear material safeguards are still at an early stage of development. (author)

  4. bayesPop: Probabilistic Population Projections

    Directory of Open Access Journals (Sweden)

    Hana Ševčíková

    2016-12-01

    Full Text Available We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects.

  5. bayesPop: Probabilistic Population Projections

    Science.gov (United States)

    Ševčíková, Hana; Raftery, Adrian E.

    2016-01-01

    We describe bayesPop, an R package for producing probabilistic population projections for all countries. This uses probabilistic projections of total fertility and life expectancy generated by Bayesian hierarchical models. It produces a sample from the joint posterior predictive distribution of future age- and sex-specific population counts, fertility rates and mortality rates, as well as future numbers of births and deaths. It provides graphical ways of summarizing this information, including trajectory plots and various kinds of probabilistic population pyramids. An expression language is introduced which allows the user to produce the predictive distribution of a wide variety of derived population quantities, such as the median age or the old age dependency ratio. The package produces aggregated projections for sets of countries, such as UN regions or trading blocs. The methodology has been used by the United Nations to produce their most recent official population projections for all countries, published in the World Population Prospects. PMID:28077933

  6. Academic Self-Handicapping and Achievement Goals: A Further Examination.

    Science.gov (United States)

    Midgley, Carol; Urdan, Tim

    2001-01-01

    This study extends previous research on the relations among students' personal achievement goals, perceptions of the classroom goal structure, and reports of the use of self-handicapping strategies. Surveys, specific to the math domain, were given to 484 7th-grade students in nine middle schools. Personal performance-avoid goals positively predicted handicapping, whereas personal performance-approach goals did not. Personal task goals negatively predicted handicapping. Perceptions of a performance goal structure positively predicted handicapping, and perceptions of a task goal structure negatively predicted handicapping, independent of personal goals. Median splits used to examine multiple goal profiles revealed that students high in performance-avoid goals used handicapping more than did those low in performance-avoid goals regardless of the level of task goals. Students low in performance-avoid goals and high in task goals handicapped less than those low in both goals. Level of performance-approach goals had little effect on the relation between task goals and handicapping. Copyright 2001 Academic Press.

  7. Toward a Probabilistic Phenological Model for Wheat Growing Degree Days (GDD)

    Science.gov (United States)

    Rahmani, E.; Hense, A.

    2017-12-01

    Are there deterministic relations between phenological and climate parameters? The answer is surely 'no'. This answer motivated us to address the problem with probabilistic methods, so we developed a probabilistic phenological model which has the advantage of giving additional information in terms of uncertainty. To that aim, we turned to survival analysis, a statistical framework that deals with death in biological organisms and failure in mechanical systems. In the survival analysis literature, death or failure is considered an event; in this research the event is the ripening date of wheat, and we assume only one event in this special case. By time, we mean the growing duration from sowing to ripening, the lifetime of the wheat, which is a function of GDD. More precisely, we try to perform a probabilistic forecast of wheat ripening, with probability values between 0 and 1. The survivor function gives the probability that not-yet-ripened wheat survives longer than a specific time, or survives to the end of its lifetime as a ripened crop. The survival function at each station is determined by fitting a normal distribution to GDD as a function of growth duration. Verification of the resulting models is done using the CRPS skill score (CRPSS). The positive values of CRPSS indicate the large superiority of the probabilistic phenological survival model over the deterministic models. These results demonstrate that considering uncertainties in modelling is beneficial, meaningful and necessary. We believe that probabilistic phenological models have the potential to help reduce the vulnerability of agricultural production systems to climate change, thereby increasing food security.
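
    As a rough illustration of the survival formulation (hypothetical ripening observations and SciPy's normal fit; not the station data used in the study), the survivor function of a normal distribution fitted to observed ripening GDD directly yields the probabilistic ripening forecast:

      import numpy as np
      from scipy import stats

      # Hypothetical sowing-to-ripening GDD observations at one station
      ripening_gdd = np.array([1480.0, 1520.0, 1555.0, 1600.0, 1610.0, 1650.0, 1700.0, 1725.0])

      mu, sigma = stats.norm.fit(ripening_gdd)     # normal "lifetime" model of GDD at ripening

      def survival(gdd):
          """P(wheat is not yet ripe after accumulating `gdd` growing degree days)."""
          return stats.norm.sf(gdd, loc=mu, scale=sigma)

      def ripening_probability(gdd):
          """Probabilistic forecast (0..1) that ripening has occurred by `gdd`."""
          return 1.0 - survival(gdd)

      print(ripening_probability(1600.0))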

  8. Probabilistic and machine learning-based retrieval approaches for biomedical dataset retrieval

    Science.gov (United States)

    Karisani, Payam; Qin, Zhaohui S; Agichtein, Eugene

    2018-01-01

    The bioCADDIE dataset retrieval challenge brought together different approaches to retrieval of biomedical datasets relevant to a user’s query, expressed as a text description of a needed dataset. We describe experiments in applying a data-driven, machine learning-based approach to biomedical dataset retrieval as part of this challenge. We report on a series of experiments carried out to evaluate the performance of both probabilistic and machine learning-driven techniques from information retrieval, as applied to this challenge. Our experiments with probabilistic information retrieval methods, such as query term weight optimization, automatic query expansion and simulated user relevance feedback, demonstrate that automatically boosting the weights of important keywords in a verbose query is more effective than other methods. We also show that although there is a rich space of potential representations and features available in this domain, machine learning-based re-ranking models are not able to improve on probabilistic information retrieval techniques with the currently available training data. The models and algorithms presented in this paper can serve as a viable implementation of a search engine to provide access to biomedical datasets. The retrieval performance is expected to be further improved by using additional training data that is created by expert annotation, or gathered through usage logs, clicks and other processes during natural operation of the system. Database URL: https://github.com/emory-irlab/biocaddie
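
    The keyword-boosting idea can be sketched as follows (toy corpus and hand-picked constants; not the bioCADDIE collection or the authors' ranking model): terms of a verbose query are weighted by inverse document frequency and the most informative ones are boosted before scoring.

      import math
      from collections import Counter

      docs = [
          "rna seq expression data from human liver tissue",
          "whole genome sequencing of e coli isolates",
          "proteomics mass spectrometry of mouse brain",
      ]

      def idf(term):
          df = sum(term in d.split() for d in docs)
          return math.log((len(docs) + 1) / (df + 1)) + 1.0

      def score(query, doc, boost_top_k=2, boost=3.0):
          """Idf-weighted overlap score with the most informative query terms boosted."""
          weights = {t: idf(t) for t in query.split()}
          for t in sorted(weights, key=weights.get, reverse=True)[:boost_top_k]:
              weights[t] *= boost
          tf = Counter(doc.split())
          return sum(w * tf[t] for t, w in weights.items())

      query = "human liver rna seq expression dataset"
      ranked = sorted(docs, key=lambda d: score(query, d), reverse=True)
      print(ranked[0])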

  9. Failure probability assessment of wall-thinned nuclear pipes using probabilistic fracture mechanics

    International Nuclear Information System (INIS)

    Lee, Sang-Min; Chang, Yoon-Suk; Choi, Jae-Boong; Kim, Young-Jin

    2006-01-01

    The integrity of nuclear piping systems has to be maintained during operation. In order to maintain this integrity, reliable assessment procedures, including fracture mechanics analysis, are required. Up to now, this has been performed using conventional deterministic approaches, even though many uncertainties hinder a rational evaluation. In this respect, probabilistic approaches are considered an appropriate method for piping system evaluation. The objectives of this paper are to estimate the failure probabilities of wall-thinned pipes in nuclear secondary systems and to propose limited operating conditions under different types of loadings. To do this, a probabilistic assessment program using reliability index and simulation techniques was developed and applied to evaluate failure probabilities of wall-thinned pipes subjected to internal pressure, bending moment and their combined loading. The sensitivity analysis results, as well as the prototypical integrity assessment results, showed the promising applicability of the probabilistic assessment program, the necessity of practical evaluation reflecting combined loading conditions, and of operation considering the proposed limited conditions.
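
    A simulation-based failure probability estimate of this kind can be sketched in a few lines (illustrative hoop-stress limit state and made-up distributions; the paper's assessment program and load cases are not reproduced):

      import numpy as np

      rng = np.random.default_rng(1)
      n = 1_000_000

      # Illustrative random variables for a wall-thinned pipe under internal pressure
      flow_stress = rng.normal(300.0, 20.0, n)     # MPa
      wall_thk    = rng.normal(10.0, 0.4, n)       # mm, nominal wall thickness
      thin_depth  = rng.normal(3.5, 0.5, n)        # mm, wall-thinning depth
      pressure    = rng.normal(10.0, 1.0, n)       # MPa
      radius      = 150.0                          # mm, mean radius (deterministic)

      # Hoop stress carried by the thinned ligament; failure when it exceeds flow stress
      ligament = np.clip(wall_thk - thin_depth, 1e-3, None)
      g = flow_stress - pressure * radius / ligament      # limit state: g < 0 is failure
      print(f"estimated failure probability: {np.mean(g < 0.0):.2e}")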

  10. A Markov Chain Approach to Probabilistic Swarm Guidance

    Science.gov (United States)

    Acikmese, Behcet; Bayard, David S.

    2012-01-01

    This paper introduces a probabilistic guidance approach for the coordination of swarms of autonomous agents. The main idea is to drive the swarm to a prescribed density distribution in a prescribed region of the configuration space. In its simplest form, the probabilistic approach is completely decentralized and does not require communication or collaboration between agents. Agents make statistically independent probabilistic decisions based solely on their own state, that ultimately guides the swarm to the desired density distribution in the configuration space. In addition to being completely decentralized, the probabilistic guidance approach has a novel autonomous self-repair property: Once the desired swarm density distribution is attained, the agents automatically repair any damage to the distribution without collaborating and without any knowledge about the damage.
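
    A small sketch of the mechanism (hypothetical five-bin configuration space and a Metropolis-style construction of the transition matrix; not the paper's synthesis method): each agent independently samples its next bin from a Markov matrix whose stationary distribution is the desired swarm density.

      import numpy as np

      rng = np.random.default_rng(0)

      def guidance_matrix(target):
          """Row-stochastic matrix with stationary distribution `target`
          (Metropolis construction with a uniform proposal over bins)."""
          n = len(target)
          M = np.zeros((n, n))
          for i in range(n):
              for j in range(n):
                  if i != j:
                      M[i, j] = (1.0 / n) * min(1.0, target[j] / target[i])
              M[i, i] = 1.0 - M[i].sum()
          return M

      target = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # desired density over 5 bins
      M = guidance_matrix(target)

      # Decentralized propagation: every agent uses only its own current bin.
      agents = rng.integers(0, len(target), size=2000)
      for _ in range(100):
          agents = np.array([rng.choice(len(target), p=M[b]) for b in agents])

      print(np.bincount(agents, minlength=len(target)) / agents.size)   # approaches target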

  11. Probabilistic reasoning for assembly-based 3D modeling

    KAUST Repository

    Chaudhuri, Siddhartha

    2011-01-01

    Assembly-based modeling is a promising approach to broadening the accessibility of 3D modeling. In assembly-based modeling, new models are assembled from shape components extracted from a database. A key challenge in assembly-based modeling is the identification of relevant components to be presented to the user. In this paper, we introduce a probabilistic reasoning approach to this problem. Given a repository of shapes, our approach learns a probabilistic graphical model that encodes semantic and geometric relationships among shape components. The probabilistic model is used to present components that are semantically and stylistically compatible with the 3D model that is being assembled. Our experiments indicate that the probabilistic model increases the relevance of presented components. © 2011 ACM.

  12. Shear-wave velocity-based probabilistic and deterministic assessment of seismic soil liquefaction potential

    Science.gov (United States)

    Kayen, R.; Moss, R.E.S.; Thompson, E.M.; Seed, R.B.; Cetin, K.O.; Der Kiureghian, A.; Tanaka, Y.; Tokimatsu, K.

    2013-01-01

    Shear-wave velocity (Vs) offers a means to determine the seismic resistance of soil to liquefaction by a fundamental soil property. This paper presents the results of an 11-year international project to gather new Vs site data and develop probabilistic correlations for seismic soil liquefaction occurrence. Toward that objective, shear-wave velocity test sites were identified, and measurements made for 301 new liquefaction field case histories in China, Japan, Taiwan, Greece, and the United States over a decade. The majority of these new case histories reoccupy those previously investigated by penetration testing. These new data are combined with previously published case histories to build a global catalog of 422 case histories of Vs liquefaction performance. Bayesian regression and structural reliability methods facilitate a probabilistic treatment of the Vs catalog for performance-based engineering applications. Where possible, uncertainties of the variables comprising both the seismic demand and the soil capacity were estimated and included in the analysis, resulting in greatly reduced overall model uncertainty relative to previous studies. The presented data set and probabilistic analysis also help resolve the ancillary issues of adjustment for soil fines content and magnitude scaling factors.

  13. Goal Definition

    DEFF Research Database (Denmark)

    Bjørn, Anders; Laurent, Alexis; Owsianiak, Mikołaj

    2018-01-01

    The goal definition is the first phase of an LCA and determines the purpose of a study in detail. This chapter teaches how to perform the six aspects of a goal definition: (1) Intended applications of the results, (2) Limitations due to methodological choices, (3) Decision context and reasons for carrying out the study, (4) Target audience, (5) Comparative studies to be disclosed to the public and (6) Commissioner of the study and other influential actors. The instructions address both the conduct and reporting of a goal definition and are largely based on the ILCD guidance document (EC...

  14. Probabilistic Seismic Risk Assessment in Manizales, Colombia:Quantifying Losses for Insurance Purposes

    Institute of Scientific and Technical Information of China (English)

    Mario A.Salgado-Gálvez; Gabriel A.Bernal; Daniela Zuloaga; Mabel C.Marulanda; Omar-Darío Cardona; Sebastián Henao

    2017-01-01

    A fully probabilistic seismic risk assessment was developed in Manizales, Colombia, considering assets of different types. The first type includes elements that are part of the water and sewage network, and the second type includes public and private buildings. This assessment required the development of a probabilistic seismic hazard analysis that accounts for the dynamic soil response, assembling high resolution exposure databases, and the development of damage models for different types of elements. The economic appraisal of the exposed assets was developed together with specialists of the water utilities company of Manizales and the city administration. The risk assessment was performed using several Comprehensive Approach to Probabilistic Risk Assessment modules as well as the R-System, obtaining results in terms of traditional metrics such as the loss exceedance curve, average annual loss, and probable maximum loss. For the case of pipelines, repair rates were also estimated. The results for the water and sewage network were used in activities related to the expansion and maintenance strategies, as well as for the exploration of financial retention and transfer alternatives using insurance schemes based on technical, probabilistic, and prospective damage and loss estimations. In the case of the buildings, the results were used in the update of the technical premium values of the existing collective insurance scheme.
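
    The risk metrics named above have simple estimators once a stochastic event set is available. The sketch below uses an invented five-event set (not the Manizales results) to compute a loss exceedance curve, the average annual loss, and a crude probable maximum loss at a 475-year return period:

      import numpy as np

      # Hypothetical stochastic event set: loss per event and annual occurrence rate
      event_loss = np.array([5e6, 2e7, 8e7, 1.5e8, 4e8])
      event_rate = np.array([0.05, 0.02, 0.008, 0.003, 0.0005])

      def exceedance_rate(threshold):
          """Annual rate of events whose loss equals or exceeds the threshold."""
          return float(event_rate[event_loss >= threshold].sum())

      aal = float(np.sum(event_loss * event_rate))                  # average annual loss
      lec = [(l, exceedance_rate(l)) for l in sorted(event_loss)]   # loss exceedance curve
      pml_475 = max((l for l, r in lec if r >= 1.0 / 475.0), default=0.0)

      print(f"AAL = {aal:,.0f}   PML(475 yr) = {pml_475:,.0f}")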

  15. Probabilistic reasoning in data analysis.

    Science.gov (United States)

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.
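
    The central example, memoryless arrivals, is easy to reproduce numerically (arbitrary rate and horizon): exponential waiting times generate a Poisson process, so event counts per unit interval have mean and variance equal to the rate.

      import numpy as np

      rng = np.random.default_rng(7)
      rate, t_end = 2.0, 1000.0                 # mean arrivals per unit time, horizon

      waits = rng.exponential(1.0 / rate, size=int(2 * rate * t_end))  # waiting times
      arrivals = np.cumsum(waits)
      arrivals = arrivals[arrivals < t_end]

      counts = np.histogram(arrivals, bins=np.arange(0.0, t_end + 1.0))[0]
      print(counts.mean(), counts.var())        # both close to `rate` for a Poisson process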

  16. OVERVIEW OF DEVELOPMENT OF P-CARES: PROBABILISTIC COMPUTER ANALYSIS FOR RAPID EVALUATION OF STRUCTURES

    International Nuclear Information System (INIS)

    NIE, J.; XU, J.; COSTANTINO, C.; THOMAS, V.

    2007-01-01

    Brookhaven National Laboratory (BNL) undertook an effort to revise the CARES (Computer Analysis for Rapid Evaluation of Structures) program under the auspices of the US Nuclear Regulatory Commission (NRC). The CARES program provided the NRC staff a capability to quickly check the validity and/or accuracy of the soil-structure interaction (SSI) models and associated data received from various applicants. The aim of the current revision was to implement various probabilistic simulation algorithms in CARES (referred hereinafter as P-CARES [1]) for performing the probabilistic site response and soil-structure interaction (SSI) analyses. This paper provides an overview of the development process of P-CARES, including the various probabilistic simulation techniques used to incorporate the effect of site soil uncertainties into the seismic site response and SSI analyses and an improved graphical user interface (GUI)

  17. Do probabilistic forecasts lead to better decisions?

    Directory of Open Access Journals (Sweden)

    M. H. Ramos

    2013-06-01

    Full Text Available The last decade has seen growing research in producing probabilistic hydro-meteorological forecasts and increasing their reliability. This followed the promise that, supplied with information about uncertainty, people would take better risk-based decisions. In recent years, therefore, research and operational developments have also started focusing attention on ways of communicating the probabilistic forecasts to decision-makers. Communicating probabilistic forecasts includes preparing tools and products for visualisation, but also requires understanding how decision-makers perceive and use uncertainty information in real time. At the EGU General Assembly 2012, we conducted a laboratory-style experiment in which several cases of flood forecasts and a choice of actions to take were presented as part of a game to participants, who acted as decision-makers. Answers were collected and analysed. In this paper, we present the results of this exercise and discuss if we indeed make better decisions on the basis of probabilistic forecasts.

  18. Management of the Area 5 Radioactive Waste Management Site using Decision-based, Probabilistic Performance Assessment Modeling

    International Nuclear Information System (INIS)

    Carilli, J.; Crowe, B.; Black, P.; Tauxe, J.; Stockton, T.; Catlett, K.; Yucel, V.

    2003-01-01

    Low-level radioactive waste from cleanup activities at the Nevada Test Site and from multiple sites across the U.S. Department of Energy (DOE) complex is disposed at two active Radioactive Waste Management Sites (RWMS) on the Nevada Test Site. These facilities, which are managed by the DOE National Nuclear Security Administration Nevada Site Office, were recently designated as one of two regional disposal centers and yearly volumes of disposed waste now exceed 50,000 m3 (> 2 million ft3). To safely and cost-effectively manage the disposal facilities, the Waste Management Division of Environmental Management has implemented decision-based management practices using flexible and problem-oriented probabilistic performance assessment modeling. Deterministic performance assessments and composite analyses were completed originally for the Area 5 and Area 3 RWMSs located in, respectively, Frenchman Flat and Yucca Flat on the Nevada Test Site. These documents provide the technical bases for issuance of disposal authorization statements for continuing operation of the disposal facilities. Both facilities are now in a maintenance phase that requires testing of conceptual models, reduction of uncertainty, and site monitoring all leading to eventual closure of the facilities and transition to long-term stewardship

  19. A linear process-algebraic format for probabilistic systems with data

    NARCIS (Netherlands)

    Katoen, Joost P.; van de Pol, Jan Cornelis; Stoelinga, Mariëlle Ida Antoinette; Timmer, Mark; Gomes, L.; Khomenko, V.; Fernandes, J.M.

    This paper presents a novel linear process algebraic format for probabilistic automata. The key ingredient is a symbolic transformation of probabilistic process algebra terms that incorporate data into this linear format while preserving strong probabilistic bisimulation. This generalises similar

  20. PROBABILISTIC SEISMIC ASSESSMENT OF BASE-ISOLATED NPPS SUBJECTED TO STRONG GROUND MOTIONS OF TOHOKU EARTHQUAKE

    Directory of Open Access Journals (Sweden)

    AHMER ALI

    2014-10-01

    Full Text Available The probabilistic seismic performance of a standard Korean nuclear power plant (NPP) with an idealized isolation is investigated in the present work. A probabilistic seismic hazard analysis (PSHA) of the Wolsong site on the Korean peninsula is performed by considering peak ground acceleration (PGA) as an earthquake intensity measure. A procedure is reported on the categorization and selection of two sets of ground motions of the Tohoku earthquake, i.e. long-period and common as Set A and Set B respectively, for the nonlinear time history response analysis of the base-isolated NPP. Limit state values as multiples of the displacement responses of the NPP base isolation are considered for the fragility estimation. The seismic risk of the NPP is further assessed by incorporation of the rate of frequency exceedance and conditional failure probability curves. Furthermore, this framework attempts to show the unacceptable performance of the isolated NPP in terms of the probabilistic distribution and annual probability of limit states. The comparative results for long and common ground motions are discussed to contribute to the future safety of nuclear facilities against drastic events like Tohoku.

  1. Probabilistic seismic assessment of base-isolated NPPs subjected to strong ground motions of Tohoku earthquake

    Energy Technology Data Exchange (ETDEWEB)

    Ali, Ahmer; Hayah, Nadin Abu; Kim, Doo Kie [Dept. of Civil and Environmental Engineering, Kunsan National University, Kunsan (Korea, Republic of); Cho, Sung Gook [R and D Center, JACE KOREA Company, Gyeonggido (Korea, Republic of)

    2014-10-15

    The probabilistic seismic performance of a standard Korean nuclear power plant (NPP) with an idealized isolation is investigated in the present work. A probabilistic seismic hazard analysis (PSHA) of the Wolsong site on the Korean peninsula is performed by considering peak ground acceleration (PGA) as an earthquake intensity measure. A procedure is reported on the categorization and selection of two sets of ground motions of the Tohoku earthquake, i.e. long-period and common as Set A and Set B respectively, for the nonlinear time history response analysis of the base-isolated NPP. Limit state values as multiples of the displacement responses of the NPP base isolation are considered for the fragility estimation. The seismic risk of the NPP is further assessed by incorporation of the rate of frequency exceedance and conditional failure probability curves. Furthermore, this framework attempts to show the unacceptable performance of the isolated NPP in terms of the probabilistic distribution and annual probability of limit states. The comparative results for long and common ground motions are discussed to contribute to the future safety of nuclear facilities against drastic events like Tohoku.
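
    The last step of such a framework, combining a hazard curve with a fragility curve to obtain an annual limit-state frequency, can be written compactly (entirely illustrative hazard and fragility parameters; not the Wolsong PSHA or the isolator fragilities):

      import numpy as np
      from scipy import stats

      # Illustrative hazard curve: annual exceedance rate of PGA (g) at the site
      pga = np.linspace(0.05, 2.0, 400)
      haz_rate = 1e-2 * (pga / 0.1) ** -2.2

      # Illustrative lognormal fragility for one displacement limit state
      median_cap, beta = 0.9, 0.45
      fragility = stats.norm.cdf(np.log(pga / median_cap) / beta)

      # Convolve conditional failure probability with hazard occurrence rates
      occ_rate = -np.diff(haz_rate)                        # rate of PGA falling in each bin
      pf_mid = 0.5 * (fragility[1:] + fragility[:-1])
      annual_pf = float(np.sum(pf_mid * occ_rate))
      print(f"annual limit-state exceedance frequency ~ {annual_pf:.2e}")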

  2. Predictive control for stochastic systems based on multi-layer probabilistic sets

    Directory of Open Access Journals (Sweden)

    Huaqing LIANG

    2016-04-01

    Full Text Available Aiming at a class of discrete-time stochastic systems with Markov jump features, the state-feedback predictive control problem under probabilistic constraints on the input variables is studied. On the basis of the concept and method of multi-layer probabilistic sets, a predictive controller design algorithm with soft constraints of different probabilities is presented. Under the control of the multi-step feedback laws, the system state moves to different ellipses with specified probabilities. The stability of the system is guaranteed, the feasible region of the control problem is enlarged, and the system performance is improved. Finally, a simulation example is given to demonstrate the effectiveness of the proposed method.

  3. The NUREG-1150 probabilistic risk assessment for the Sequoyah nuclear plant

    International Nuclear Information System (INIS)

    Gregory, J.J.; Breeding, R.J.; Higgins, S.J.; Shiver, A.W.; Helton, J.C.; Murfin, W.B.

    1992-01-01

    This paper summarizes the findings of the probabilistic risk assessment (PRA) for Unit 1 of the Sequoyah Nuclear Plant performed in support of NUREG-1150. The emphasis is on the 'back-end' analyses, the accident progression, source term, and consequence analyses, and the risk results obtained when the results of these analyses are combined with the accident frequency analysis. The results of this PRA indicate that the offsite risk from internal initiating events at Sequoyah is quite low with respect to the safety goals. The containment appears likely to withstand the loads that might be placed upon it if the reactor vessel fails. A good portion of the risk, in this analysis, comes from initiating events which bypass the containment. These events are estimated to have a relatively low frequency of occurrence, but their consequences are quite large. Other events that contribute to offsite risk involve early containment failures that occur during degradation of the core or near the time of vessel breach. Considerable uncertainty is associated with the risk estimates produced in this analysis. Offsite risk from external initiating events was not included in this analysis. (orig.)

  4. Development of advanced methods and related software for human reliability evaluation within probabilistic safety analyses

    International Nuclear Information System (INIS)

    Kosmowski, K.T.; Mertens, J.; Degen, G.; Reer, B.

    1994-06-01

    Human Reliability Analysis (HRA) is an important part of Probabilistic Safety Analysis (PSA). The first part of this report consists of an overview of types of human behaviour and human error, including the effect of significant performance shaping factors on human reliability. Particularly with regard to safety assessments for nuclear power plants, many HRA methods have been developed. The most important of these methods are presented and discussed in the report, together with techniques for incorporating HRA into PSA and with models of operator cognitive behaviour. Based on existing HRA methods, the concept of a software system is described. For the development of this system the utilization of modern programming tools is proposed; the essential goal is the effective application of HRA methods. A possible integration of computer-aided HRA within PSA is discussed. The features of Expert System Technology and examples of applications (PSA, HRA) are presented in four appendices. (orig.) [de

  5. Probabilistic thread algebra

    NARCIS (Netherlands)

    Bergstra, J.A.; Middelburg, C.A.

    2015-01-01

    We add probabilistic features to basic thread algebra and its extensions with thread-service interaction and strategic interleaving. Here, threads represent the behaviours produced by instruction sequences under execution and services represent the behaviours exhibited by the components of execution

  6. Writing Performance Goals: Strategy and Prototypes. A Manual for Vocational and Technical Educators.

    Science.gov (United States)

    McGraw-Hill Book Co., New York, NY. Gregg Div.

    The result of a cooperative project of the Center for Vocational and Technical Education at the Ohio State University and the McGraw-Hill Book Company, this manual was prepared to develop prototypes of performance goals for use by curriculum specialists and developers of instructional materials in vocational and technical education and to provide…

  7. On probabilistic forecasting of wind power time-series

    DEFF Research Database (Denmark)

    Pinson, Pierre

    power dynamics. In both cases, the model parameters are adaptively and recursively estimated, time-adaptivity being the result of exponential forgetting of past observations. The probabilistic forecasting methodology is applied at the Horns Rev wind farm in Denmark, for 10-minute-ahead probabilistic forecasting of wind power generation. Probabilistic forecasts generated from the proposed methodology clearly have higher skill than those obtained from a classical Gaussian assumption about wind power predictive densities. Corresponding point forecasts also exhibit significantly lower error criteria.

  8. Probabilistic Safety Assessment of ITU TRIGA Mark-II Reactor

    International Nuclear Information System (INIS)

    Ergun, E; Kadiroglu, O.S.

    1999-01-01

    A probabilistic safety assessment for the Istanbul Technical University (ITU) TRIGA Mark-II reactor is performed. Qualitative analysis, which includes fault and event trees, and quantitative analysis, which includes the collection of data for basic events, determination of minimal cut sets, calculation of quantitative values of top events, sensitivity analysis and importance measures, uncertainty analysis, and radiation release from fuel elements, are considered.

  9. US Department of Energy Approach to Probabilistic Evaluation of Long-Term Safety for a Potential Yucca Mountain Repository

    International Nuclear Information System (INIS)

    Dr. R. Dyer; Dr. R. Andrews; Dr. A. Van Luik

    2005-01-01

    Regulatory requirements being addressed in the US geological repository program for spent nuclear fuel and high-level waste disposal specify probabilistically defined mean-value dose limits. These dose limits reflect acceptable levels of risk. The probabilistic approach mandated by regulation calculates a "risk of a dose", a risk of a potential given dose value at a specific time in the future to a hypothetical person. The mean value of the time-dependent performance measure needs to remain below an acceptable level defined by regulation. Because there are uncertain parameters that are important to system performance, the regulation mandates an analysis focused on the mean value of the performance measure, but that also explores the "full range of defensible and reasonable parameter distributions"; system performance evaluations should not be unduly influenced by "extreme physical situations and parameter values". Challenges in this approach lie in defending the scientific basis for the models selected, and the data and distributions sampled. A significant challenge lies in showing that uncertainties are properly identified and evaluated. A single-value parameter has no uncertainty, and where used such values need to be supported by scientific information showing the selected value is appropriate. Uncertainties are inherent in data, but are also introduced by creating parameter distributions from data sets, selecting models from among alternative models, abstracting models for use in probabilistic analysis, and in selecting the range of initiating event probabilities for unlikely events. The goal of the assessment currently in progress is to evaluate the level of risk inherent in moving ahead to the next phase of repository development: construction. During the construction phase, more will be learned to inform a new long-term risk evaluation to support moving to the next phase: accepting waste. Therefore, though there was sufficient confidence of safety

  10. Probabilistic double guarantee kidnapping detection in SLAM.

    Science.gov (United States)

    Tian, Yang; Ma, Shugen

    2016-01-01

    For determining whether kidnapping has happened and which type of kidnapping it is while a robot performs autonomous tasks in an unknown environment, a double guarantee kidnapping detection (DGKD) method has been proposed. The good performance of DGKD in a relatively small environment has been shown. However, a limitation of DGKD was found in a large-scale environment by our recent work. In order to increase the adaptability of DGKD in a large-scale environment, an improved method called probabilistic double guarantee kidnapping detection is proposed in this paper, combining the probabilities of features' positions and the robot's posture. Simulation results demonstrate the validity and accuracy of the proposed method.

  11. Review of the Brunswick Steam Electric Plant Probabilistic Risk Assessment

    International Nuclear Information System (INIS)

    Sattison, M.B.; Davis, P.R.; Satterwhite, D.G.; Gilmore, W.E.; Gregg, R.E.

    1989-11-01

    A review of the Brunswick Steam Electric Plant probabilistic risk assessment was conducted with the objective of confirming the safety perspectives brought to light by the probabilistic risk assessment. The scope of the review included the entire Level 1 probabilistic risk assessment, including external events. This is consistent with the scope of the probabilistic risk assessment. The review included an assessment of the assumptions, methods, models, and data used in the study. 47 refs., 14 figs., 15 tabs

  12. Deliverable D74.2. Probabilistic analysis methods for support structures

    DEFF Research Database (Denmark)

    Gintautas, Tomas

    2018-01-01

    Relevant Description: Report describing the probabilistic analysis for offshore substructures and results attained. This includes comparison with experimental data and with conventional design. Specific targets: 1) Estimate current reliability level of support structures 2) Development of basis for probabilistic calculations and evaluation of reliability for offshore support structures (substructures) 3) Development of a probabilistic model for stiffness and strength of soil parameters and for modeling geotechnical load bearing capacity 4) Comparison between probabilistic analysis and deterministic...

  13. A common fixed point for operators in probabilistic normed spaces

    International Nuclear Information System (INIS)

    Ghaemi, M.B.; Lafuerza-Guillen, Bernardo; Razani, A.

    2009-01-01

    Probabilistic metric spaces were introduced by Karl Menger. Alsina, Schweizer and Sklar gave a general definition of probabilistic normed spaces based on the definition of Menger [Alsina C, Schweizer B, Sklar A. On the definition of a probabilistic normed space. Aequationes Math 1993;46:91-8]. Here, we consider the equicontinuity of a class of linear operators in probabilistic normed spaces and, finally, a common fixed point theorem is proved. An application to quantum mechanics is considered.

  14. International status of application of probabilistic risk analysis

    International Nuclear Information System (INIS)

    Cullingford, M.C.

    1984-01-01

    Probabilistic Risk Assessment (PRA) having been practised for about ten years and with more than twenty studies completed has reached a level of maturity such that the insights and other products derived from specific studies may be assessed. The first full-scale PRA studies were designed to develop the methodology and assess the overall risk from nuclear power. At present PRA is performed mostly for individual plants to identify core damage accident sequences and significant contributors to such sequences. More than 25 countries are utilizing insights from PRA, some from full-scale PRA studies and other countries by performing reliability analyses on safety systems identified as important contributors to one or more core melt sequences. Many Member States of the IAEA fall into one of three groups: those having (a) a large, (b) a medium number of reactor-years of operating experience and (c) those countries in the planning or feasibility study stages of a nuclear power programme. Of the many potential uses of PRA the decision areas of safety improvement by backfitting, development of operating procedures and as the basis of standards are felt to be important by countries of all three groups. The use of PRA in showing compliance with safety goals and for plant availability studies is held to be important only by those countries which have operating experience. The evolution of the PRA methodology has led to increased attention to quantification of uncertainties both in the probabilities and consequences. Although many products from performing a PRA do not rely upon overall risk numbers, increasing emphasis is being placed on the interpretation of uncertainties in risk numbers for use in decisions. International co-operation through exchange of information regarding experience with PRA methodology and its application to nuclear safety decisions will greatly enhance the widespread use of PRA. (author)

  15. A History of Probabilistic Inductive Logic Programming

    Directory of Open Access Journals (Sweden)

    Fabrizio Riguzzi

    2014-09-01

    Full Text Available The field of Probabilistic Logic Programming (PLP) has seen significant advances in the last 20 years, with many proposals for languages that combine probability with logic programming. Since the start, the problem of learning probabilistic logic programs has been the focus of much attention. Learning these programs represents a whole subfield of Inductive Logic Programming (ILP). In Probabilistic ILP (PILP) two problems are considered: learning the parameters of a program given the structure (the rules) and learning both the structure and the parameters. Usually structure learning systems use parameter learning as a subroutine. In this article we present an overview of PILP and discuss the main results.

  16. Probabilistic methods in combinatorial analysis

    CERN Document Server

    Sachkov, Vladimir N

    2014-01-01

    This 1997 work explores the role of probabilistic methods for solving combinatorial problems. These methods not only provide the means of efficiently using such notions as characteristic and generating functions, the moment method and so on but also let us use the powerful technique of limit theorems. The basic objects under investigation are nonnegative matrices, partitions and mappings of finite sets, with special emphasis on permutations and graphs, and equivalence classes specified on sequences of finite length consisting of elements of partially ordered sets; these specify the probabilist

  17. Probabilistic coding of quantum states

    International Nuclear Information System (INIS)

    Grudka, Andrzej; Wojcik, Antoni; Czechlewski, Mikolaj

    2006-01-01

    We discuss the properties of probabilistic coding of two qubits to one qutrit and generalize the scheme to higher dimensions. We show that the protocol preserves the entanglement between the qubits to be encoded and the environment and can also be applied to mixed states. We present a protocol that enables encoding of n qudits to one qudit of dimension smaller than the Hilbert space of the original system and then allows probabilistic but error-free decoding of any subset of k qudits. We give a formula for the probability of successful decoding

  18. Probabilistic Modeling of Timber Structures

    DEFF Research Database (Denmark)

    Köhler, J.D.; Sørensen, John Dalsgaard; Faber, Michael Havbro

    2005-01-01

    The present paper contains a proposal for the probabilistic modeling of timber material properties. It is produced in the context of the Probabilistic Model Code (PMC) of the Joint Committee on Structural Safety (JCSS) and of the COST action E24 'Reliability of Timber Structures'. The present proposal is based on discussions and comments from participants of the COST E24 action and the members of the JCSS. The paper contains a description of the basic reference properties for timber strength parameters and ultimate limit state equations for components and connections. The recommended...

  19. Application of probabilistic precipitation forecasts from a ...

    African Journals Online (AJOL)

    2014-02-14

    Application of probabilistic precipitation forecasts from a deterministic model ... The aim of this paper is to investigate the increase in the lead-time of flash flood warnings of the SAFFG using probabilistic precipitation forecasts ... The procedure is applied to a real flash flood event and the ensemble-based ...

  20. Probabilistic Analysis Methods for Hybrid Ventilation

    DEFF Research Database (Denmark)

    Brohus, Henrik; Frier, Christian; Heiselberg, Per

    This paper discusses a general approach for the application of probabilistic analysis methods in the design of ventilation systems. The aims and scope of probabilistic versus deterministic methods are addressed with special emphasis on hybrid ventilation systems. A preliminary application of stochastic differential equations is presented, comprising a general heat balance for an arbitrary number of loads and zones in a building to determine the thermal behaviour under random conditions.
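
    A single-zone version of such a stochastic heat balance can be integrated with a plain Euler-Maruyama scheme (all parameter values below are invented; the paper treats an arbitrary number of zones and loads):

      import numpy as np

      rng = np.random.default_rng(3)

      # dT = [(T_out - T)/tau + q/C] dt + sigma dW   (one zone, constant internal load)
      C, tau, sigma = 5e6, 3.0 * 3600.0, 0.02    # J/K, s, K/sqrt(s)
      T_out, q = 10.0, 2000.0                    # deg C, W
      dt, hours = 60.0, 48
      n = int(hours * 3600 / dt)

      T = np.empty(n)
      T[0] = 20.0
      for k in range(n - 1):
          drift = (T_out - T[k]) / tau + q / C
          T[k + 1] = T[k] + drift * dt + sigma * np.sqrt(dt) * rng.standard_normal()

      print(T.mean(), T.std())                   # thermal behaviour under random conditions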