WorldWideScience

Sample records for analysis hra methodology

  1. Development of Human Performance Analysis and Advanced HRA Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-15

    The purpose of this project is to build a systematic framework that can evaluate the effect of human factors related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error along with the construction of a technical basis for human reliability analysis. This study produced three main results. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human errors caused by communication failures during event diagnosis. The task complexity (TACOM) measure and the methodology for optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique with K-HRA; it can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results, namely OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  2. Development of Human Performance Analysis and Advanced HRA Methodology

    International Nuclear Information System (INIS)

    Jung, Won Dea; Park, Jin Kyun; Kim, Jae Whan; Kim, Seong Whan; Kim, Man Cheol; Ha, Je Joo

    2007-06-01

    The purpose of this project is to build a systematic framework that can evaluate the effect of human factors related problems on the safety of nuclear power plants (NPPs), as well as to develop technology that can be used to enhance human performance. The research goal of this project is twofold: (1) the development of a human performance database and a framework to enhance human performance, and (2) the analysis of human error along with the construction of a technical basis for human reliability analysis. This study produced three main results. The first is the development of a human performance database, called OPERA-I/II (Operator Performance and Reliability Analysis, Part I and Part II). In addition, a standard communication protocol was developed based on OPERA to reduce human errors caused by communication failures during event diagnosis. The task complexity (TACOM) measure and the methodology for optimizing diagnosis procedures were also finalized during this research phase. The second main result is the development of a software tool, K-HRA, which supports the standard HRA method. Finally, an advanced HRA method named AGAPE-ET was developed by combining the MDTA (misdiagnosis tree analysis) technique with K-HRA; it can be used to analyze errors of commission (EOC) and errors of omission (EOO). These research results, namely OPERA-I/II, TACOM, the standard communication protocol, and the K-HRA and AGAPE-ET methods, will be used to improve the quality of HRA and to enhance human performance in nuclear power plants.

  3. Space Mission Human Reliability Analysis (HRA) Project

    Science.gov (United States)

    Boyer, Roger

    2014-01-01

    The purpose of the Space Mission Human Reliability Analysis (HRA) Project is to extend current ground-based HRA risk prediction techniques to a long-duration, space-based tool. Ground-based HRA methodology has been shown to be a reasonable tool for short-duration space missions, such as Space Shuttle and lunar fly-bys. However, longer-duration deep-space missions, such as asteroid and Mars missions, will require the crew to be in space for 400 to 900 days, with periods of extended autonomy and self-sufficiency. Current indications are that higher risk due to fatigue, physiological effects of extended low-gravity environments, and other factors may impact HRA predictions. For this project, Safety & Mission Assurance (S&MA) will work with Human Health & Performance (HH&P) to establish what is currently used to assess human reliability for human space programs, identify human performance factors that may be sensitive to long-duration space flight, collect available historical data, and update current tools to account for performance shaping factors believed to be important to such missions. This effort will also contribute data to the Human Performance Data Repository and influence the Space Human Factors Engineering research risks and gaps (part of the HRP Program). An accurate risk predictor mitigates Loss of Crew (LOC) and Loss of Mission (LOM). The end result will be an updated HRA model that can effectively predict risk on long-duration missions.

  4. Space Mission Human Reliability Analysis (HRA) Project

    Data.gov (United States)

    National Aeronautics and Space Administration — The purpose of this project is to extend current ground-based Human Reliability Analysis (HRA) techniques to a long-duration, space-based tool to more effectively...

  5. HRA qualitative analysis in a nuclear power plant

    International Nuclear Information System (INIS)

    Dai Licao; Zhang Li; Huang Shudong

    2004-01-01

    Human reliability analysis (HRA) is a very important part of probabilistic safety assessment (PSA) in a nuclear power plant. Qualitative analysis is the basis and starting point of HRA. The purpose, principles, method and procedure of qualitative HRA are introduced, and a steam generator tube rupture (SGTR) scenario at a pressurized water reactor plant is used as an example to illustrate them. (authors)

  6. A perspective on Human Reliability Analysis (HRA) and studies on the application of HRA to Indian Pressurised Heavy Water Reactors

    International Nuclear Information System (INIS)

    Subramaniam, K.; Saraf, R.K.; Sanyasi Rao, V.V.S.; Venkat Raj, V.; Venkatraman, R.

    2000-05-01

    Probabilistic studies of risks show that the human factor contributes significantly to overall risk. The potential for and mechanisms of human error to affect plant risk and safety are evaluated by Human Reliability Analysis (HRA). HRA has quantitative and qualitative aspects, both of which are useful for Human Factors Engineering (HFE), which aims at designing operator interfaces that will minimise operator error and provide for error detection and recovery capability. HRA therefore has to be conducted as an integrated activity in support of PSA and HFE design. The objectives of HRA, therefore, are to assure that potential effects on plant safety and reliability are analysed and that human actions that are important to plant risk are identified so that they can be addressed in both PSA and plant design. This report is in two parts. The first part presents a comprehensive overview of HRA. It attempts to provide an understanding of how human failures are incorporated into PSA models and how HRA is performed. The focus is on the HRA process, frameworks, techniques and models. The second part begins with a discussion on the application of HRA to IPHWRs and then continues with the presentation of three specific HRA case studies. This work was carried out by the working group on HRA constituted by AERB. Part of the work was done under the aegis of the IAEA Coordinated Research Programme (CRP) on collection and classification of human reliability data and use in PSA - Research contract No. 8239/RB. (author)

  7. Implications of an HRA framework for quantifying human acts of commission and dependency: Development of a methodology for conducting an integrated HRA/PRA

    International Nuclear Information System (INIS)

    Barriere, M.T.; Luckas, W.J.; Brown, W.S.; Cooper, S.E.; Wreathall, J.; Bley, D.C.

    1993-01-01

    To support the development of a refined human reliability analysis (HRA) framework that addresses identified HRA user needs and improves HRA modeling, unique aspects of human performance have been identified from an analysis of actual plant-specific events. Through the use of the refined framework, relationships between the following HRA, human factors and probabilistic risk assessment (PRA) elements were described: the PRA model, plant states, plant conditions, PRA basic events, unsafe human actions, error mechanisms, and performance shaping factors (PSFs). The event analyses performed in the context of the refined HRA framework identified the need for new HRA methods that are capable of: evaluating a range of different error mechanisms (e.g., slips as well as mistakes); addressing errors of commission (EOCs) and dependencies between human actions; and incorporating the influence of plant conditions and multiple PSFs on human actions. This report discusses the results of the assessment of user needs and the refinement of the existing HRA framework, as well as the current status on EOCs and human dependencies.

  8. Implications of an HRA framework for quantifying human acts of commission and dependency: Development of a methodology for conducting an integrated HRA/PRA

    International Nuclear Information System (INIS)

    Barriere, M.T.; Luckas, W.J.; Brown, W.S.; Cooper, S.E.; Wreathall, J.; Bley, D.C.

    1994-01-01

    To support the development of a refined human reliability analysis (HRA) framework that addresses identified HRA user needs and improves HRA modeling, unique aspects of human performance have been identified from an analysis of actual plant-specific events. Through the use of the refined framework, relationships between the following HRA, human factors and probabilistic risk assessment (PRA) elements were described: the PRA model, plant states, plant conditions, PRA basic events, unsafe human actions, error mechanisms, and performance shaping factors (PSFs). The event analyses performed in the context of the refined HRA framework identified the need for new HRA methods that are capable of: evaluating a range of different error mechanisms (e.g., slips as well as mistakes); addressing errors of commission (EOCs) and dependencies between human actions; and incorporating the influence of plant conditions and multiple PSFs on human actions. This report discusses the results of the assessment of user needs and the refinement of the existing HRA framework, as well as the current status on EOCs and human dependencies.

  9. Development of a methodology for conducting an integrated HRA/PRA --

    International Nuclear Information System (INIS)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S.; Wreathall, J.; Cooper, S.E.

    1993-01-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  10. Development of a methodology for conducting an integrated HRA/PRA --

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. (Brookhaven National Lab., Upton, NY (United States)); Wreathall, J. (Wreathall (John) and Co., Dublin, OH (United States)); Cooper, S.E. (Science Applications International Corp., McLean, VA (United States))

    1993-01-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  11. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) has become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Capability Category II. The standard method was based on THERP and ASEP, which are widely used conventional HRA methods. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria in order to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.
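
    The abstract does not reproduce the method's quantification rules; as a minimal illustration of the kind of rule such a standardized method formalizes, the sketch below converts a median HEP with a lognormal error factor (the usual THERP/ASEP presentation) into a mean value for use in a PSA model. The numerical values are hypothetical.

      import math

      def mean_hep(median_hep, error_factor):
          # EF is the ratio of the 95th to the 50th percentile of a lognormal HEP,
          # so sigma = ln(EF) / 1.645; the lognormal mean is median * exp(sigma^2 / 2).
          sigma = math.log(error_factor) / 1.645
          return median_hep * math.exp(sigma ** 2 / 2.0)

      # Hypothetical nominal value: median HEP of 3E-3 with EF = 5
      print(mean_hep(3e-3, 5))   # ~4.8E-3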

  12. Development of A Standard Method for Human Reliability Analysis (HRA) of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Kang, Dae Il; Jung, Won Dea; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) has become more important. KAERI started a study to standardize the process and rules of HRA (Human Reliability Analysis), which is known to be a major contributor to the uncertainty of PSA. The study progressed as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME and ANS PSA standards to ensure PSA quality, the standard HRA method was developed to meet the ASME and ANS HRA requirements at Capability Category II. The standard method was based on THERP and ASEP, which are widely used conventional HRA methods. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria in order to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method, and several case studies were undertaken interactively to verify its usability and applicability.

  13. Study on Performance Shaping Factors (PSFs) Quantification Method in Human Reliability Analysis (HRA)

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Jang, Inseok Jang; Seong, Poong Hyun; Park, Jinkyun; Kim, Jong Hyun

    2015-01-01

    The purpose of HRA implementation is 1) to achieve the human factors engineering (HFE) design goal of providing operator interfaces that will minimize personnel errors and 2) to conduct an integrated activity to support probabilistic risk assessment (PRA). For these purposes, various HRA methods have been developed, such as the technique for human error rate prediction (THERP), simplified plant analysis risk human reliability assessment (SPAR-H), the cognitive reliability and error analysis method (CREAM), and so on. In performing HRA, the conditions that influence human performance have been represented via several context factors called performance shaping factors (PSFs). PSFs are aspects of the human's individual characteristics, environment, organization, or task that specifically degrade or improve human performance, thus respectively increasing or decreasing the likelihood of human errors. Most HRA methods evaluate the weightings of PSFs by expert judgment, and explicit guidance for evaluating the weightings is not provided. It has been widely recognized that the performance of the human operator is one of the critical factors determining the safe operation of NPPs. HRA methods have been developed to identify the possibility and mechanism of human errors. In applying HRA methods, the effect of PSFs, which may increase or decrease human error, should be investigated; so far, however, these effects have been estimated by expert judgment. Accordingly, in order to estimate the effect of PSFs objectively, a quantitative framework for estimating PSFs using PSF profiles is introduced in this paper.
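
    The abstract does not give the quantitative framework itself; as a minimal sketch of the multiplicative PSF treatment used by methods such as SPAR-H (one of the methods cited above), a nominal HEP can be scaled by the product of PSF weights and capped at 1.0. The multiplier values below are hypothetical.

      def hep_with_psfs(nominal_hep, psf_multipliers):
          # Multiply the nominal HEP by each PSF weight; weights > 1 degrade
          # performance, weights < 1 improve it.  Cap the result at 1.0.
          composite = 1.0
          for m in psf_multipliers:
              composite *= m
          return min(1.0, nominal_hep * composite)

      # Hypothetical case: nominal diagnosis HEP 1E-2, poor procedures (x5),
      # high stress (x2), good ergonomics (x0.5)
      print(hep_with_psfs(1e-2, [5.0, 2.0, 0.5]))   # 5E-2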

  14. Task Analysis of Emergency Operating Procedures for Generating Quantitative HRA Data

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yochan; Park, Jinkyun; Kim, Seunghwan; Choi, Sun Yeong; Jung, Wondea; Jang, Inseok [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, the analysis results for the emergency tasks in emergency operating procedures (EOPs) that can be observed from simulator data are introduced. The task type, component type, system type, and additional information related to the performance of the operators are described. In addition, a prospective application of the analyzed information to the HEP quantification process is discussed. In the probabilistic safety analysis (PSA) field, various human reliability analyses (HRAs) have been performed to produce estimates of human error probabilities (HEPs) for significant tasks in complex socio-technical systems. To this end, many HRA methods have provided basic or nominal HEPs for typical tasks and the quantitative relations describing how a certain performance context or performance shaping factors (PSFs) affect the HEPs. In the HRA community, however, the necessity of appropriate and sufficient human performance data has recently been indicated, because a wide range of the quantitative estimates in previous HRA methods are not supported by solid empirical bases. Hence, there have been attempts to collect HRA supporting data. For example, KAERI has started to collect information on both unsafe acts of operators and the relevant PSFs. A characteristic of the database being developed at KAERI is that human errors and related PSF surrogates that can be objectively observed are collected from full-scope simulator experiences. In this environment, to produce concretely grounded bases for the HEPs, the traits or attributes of tasks where significant human errors can be observed should be clearly determined, and the determined traits should make it possible to compare the HEPs for those traits with the data in previous HRA methods or databases. In this study, task characteristics in a Westinghouse type of EOPs were analyzed by defining task, component, and system taxonomies.
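
    As a sketch of how task-level counts from such a simulator database might feed HEP quantification (not the authors' actual procedure), the estimate below divides observed error counts by observed task conductions, with a Jeffreys prior so that tasks with no observed errors still receive a non-zero HEP. The counts are hypothetical.

      def empirical_hep(errors, opportunities, a=0.5, b=0.5):
          # Beta(a, b) prior (Jeffreys by default) updated with the observed counts;
          # the returned value is the posterior mean error probability.
          return (errors + a) / (opportunities + a + b)

      # Hypothetical counts for one task type: 2 errors in 150 observed conductions
      print(empirical_hep(2, 150))   # ~1.7E-2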

  15. Bridging Human Reliability Analysis and Psychology, Part 2: A Cognitive Framework to Support HRA

    Energy Technology Data Exchange (ETDEWEB)

    April M. Whaley; Stacey M. L. Hendrickson; Ronald L. Boring; Jing Xing

    2012-06-01

    This is the second of two papers that discuss the literature review conducted as part of the U.S. Nuclear Regulatory Commission (NRC) effort to develop a hybrid human reliability analysis (HRA) method in response to Staff Requirements Memorandum (SRM) SRM-M061020. This review was conducted with the goal of strengthening the technical basis within psychology, cognitive science and human factors for the hybrid HRA method being proposed. An overview of the literature review approach and high-level structure is provided in the first paper, whereas this paper presents the results of the review. The psychological literature review encompassed research spanning the entirety of human cognition and performance, and consequently produced an extensive list of psychological processes, mechanisms, and factors that contribute to human performance. To make sense of this large amount of information, the results of the literature review were organized into a cognitive framework that identifies causes of failure of macrocognition in humans, and connects those proximate causes to psychological mechanisms and performance influencing factors (PIFs) that can lead to the failure. This cognitive framework can serve as a tool to inform HRA. Beyond this, however, the cognitive framework has the potential to also support addressing human performance issues identified in Human Factors applications.

  16. Human Reliability Analysis. Applicability of the HRA-concept in maintenance shutdown; Analys av mänsklig tillförlitlighet. HRA-begreppets tillämpbarhet vid revisionsavställning

    Energy Technology Data Exchange (ETDEWEB)

    Obenius, Aino (MTO Psykologi AB, Stockholm (SE))

    2007-08-15

    Probabilistic Safety Analysis (PSA) is performed for Swedish nuclear power plants in order to make predictions and improvements of system safety. The analysis of the Three Mile Island and Chernobyl accidents contributed to broadening the approach to nuclear power plant safety. A system perspective focusing on the interaction between aspects of Man, Technology and Organization (MTO) emerged, in addition to the development of Human Factors knowledge. To take the human influence on the technical system into consideration when performing PSAs, a Human Reliability Analysis (HRA) is performed. PSA is performed for different stages and plant operating states, and the current state of Swedish analyses is Low Power and Shutdown (LPSD), also called Shutdown PSA (SPSA). The purpose of this master's thesis is to describe methods and basic models used when analysing human reliability for the LPSD state. The following questions are at issue: 1. How can the LPSD state be characterised and defined? 2. What is important to take into consideration when performing an LPSD HRA? 3. How can human behaviour be modelled for an LPSD risk analysis? 4. According to available empirical material, how are the questions above treated in performed analyses of human operation during LPSD? 5. How does the result of the questions above affect the way methods for analysis of LPSD could and/or should be developed? The procedure of this project has mainly consisted of literature studies of available theory for modelling of human behaviour and risk analysis of the LPSD state. This study concerns the analysis of planned outages during which maintenance, fuel change, tests and inspections are performed. The outage period is characterised by planned maintenance activities performed in rotating three-shift work around the clock, and by the fact that many of the persons performing work tasks at the plant are external contractors. The working conditions are characterised by stress due to heat, radiation and physically demanding or monotonous

  17. Development of a methodology for conducting an integrated HRA/PRA --. Task 1, An assessment of human reliability influences during LP&S conditions PWRs

    Energy Technology Data Exchange (ETDEWEB)

    Luckas, W.J.; Barriere, M.T.; Brown, W.S. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States); Cooper, S.E. [Science Applications International Corp., McLean, VA (United States)

    1993-06-01

    During Low Power and Shutdown (LP&S) conditions in a nuclear power plant (i.e., when the reactor is subcritical or at less than 10-15% power), human interactions with the plant's systems will be more frequent and more direct. Control is typically not mediated by automation, and there are fewer protective systems available. Therefore, an assessment of LP&S related risk should include a greater emphasis on human reliability than such an assessment made for power operation conditions. In order to properly account for the increase in human interaction and thus be able to perform a probabilistic risk assessment (PRA) applicable to operations during LP&S, it is important that a comprehensive human reliability assessment (HRA) methodology be developed and integrated into the LP&S PRA. The tasks comprising the comprehensive HRA methodology development are as follows: (1) identification of the human reliability related influences and associated human actions during LP&S, (2) identification of potentially important LP&S related human actions and appropriate HRA framework and quantification methods, and (3) incorporation and coordination of methodology development with other integrated PRA/HRA efforts. This paper describes the first task, i.e., the assessment of human reliability influences and any associated human actions during LP&S conditions for a pressurized water reactor (PWR).

  18. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung, Won Dea [KAERI, Daejeon (Korea, Republic of)

    2016-05-15

    This paper describes the design of a generic-task-based HRA database for the quantitative analysis of human performance, intended to estimate the number of task conductions. Estimating the total number of task conductions by direct counting is not easy to realize, and its data collection framework is difficult to maintain. To resolve this problem, this paper suggests an indirect method and a database structure based on generic tasks that makes it possible to estimate the total number of conductions from the instructions in the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors committed by operators in the power plant should be systematically collected and examined. Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework for a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Counting the number of task conductions for HEP estimation, based on the operating procedures, was enabled through the generic task database and framework. To verify the applicability of the framework, a case study of the simulated experiments was performed and analyzed using the graphic user interfaces developed in this study.
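
    A minimal sketch of the kind of table schema described above, assuming three hypothetical tables (generic tasks, procedures, and the procedure steps mapped onto generic tasks); the actual KAERI schema is not given in the abstract, so all table and column names here are illustrative.

      import sqlite3

      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE generic_task (
          task_id   INTEGER PRIMARY KEY,
          task_name TEXT NOT NULL,      -- e.g. 'verify alarm', 'manipulate valve'
          task_type TEXT                -- e.g. 'checking', 'manipulation'
      );
      CREATE TABLE procedure_list (
          proc_id   INTEGER PRIMARY KEY,
          proc_name TEXT NOT NULL       -- e.g. an EOP or AOP identifier
      );
      CREATE TABLE procedure_step (
          step_id   INTEGER PRIMARY KEY,
          proc_id   INTEGER REFERENCES procedure_list(proc_id),
          step_no   TEXT,
          task_id   INTEGER REFERENCES generic_task(task_id)
      );
      """)

      # Counting the procedure steps mapped onto each generic task gives the
      # indirect estimate of task-conduction opportunities discussed above.
      rows = conn.execute("""
          SELECT g.task_name, COUNT(s.step_id) AS n_steps
          FROM generic_task g
          LEFT JOIN procedure_step s ON s.task_id = g.task_id
          GROUP BY g.task_id
      """).fetchall()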

  19. Structural Design of HRA Database using generic task for Quantitative Analysis of Human Performance

    International Nuclear Information System (INIS)

    Kim, Seung Hwan; Kim, Yo Chan; Choi, Sun Yeong; Park, Jin Kyun; Jung, Won Dea

    2016-01-01

    This paper describes the design of a generic-task-based HRA database for the quantitative analysis of human performance, intended to estimate the number of task conductions. Estimating the total number of task conductions by direct counting is not easy to realize, and its data collection framework is difficult to maintain. To resolve this problem, this paper suggests an indirect method and a database structure based on generic tasks that makes it possible to estimate the total number of conductions from the instructions in the operating procedures of nuclear power plants. In order to reduce human errors, all information on the human errors committed by operators in the power plant should be systematically collected and examined. Korea Atomic Energy Research Institute (KAERI) is carrying out research to develop a data collection framework for a Human Reliability Analysis (HRA) database that could be employed as a technical basis to generate human error probabilities (HEPs) and performance shaping factors (PSFs). As a result of the study, the essential table schema was designed for the generic task database, which stores generic tasks, procedure lists, task tree structures, and other supporting tables. Counting the number of task conductions for HEP estimation, based on the operating procedures, was enabled through the generic task database and framework. To verify the applicability of the framework, a case study of the simulated experiments was performed and analyzed using the graphic user interfaces developed in this study.

  20. Human Reliability Analysis. Applicability of the HRA-concept in maintenance shutdown

    International Nuclear Information System (INIS)

    Obenius, Aino

    2007-08-01

    Probabilistic Safety Analysis (PSA) is performed for Swedish nuclear power plants in order to make predictions and improvements of system safety. The analysis of the Three Mile Island and Chernobyl accidents contributed to broadening the approach to nuclear power plant safety. A system perspective focusing on the interaction between aspects of Man, Technology and Organization (MTO) emerged, in addition to the development of Human Factors knowledge. To take the human influence on the technical system into consideration when performing PSAs, a Human Reliability Analysis (HRA) is performed. PSA is performed for different stages and plant operating states, and the current state of Swedish analyses is Low Power and Shutdown (LPSD), also called Shutdown PSA (SPSA). The purpose of this master's thesis is to describe methods and basic models used when analysing human reliability for the LPSD state. The following questions are at issue: 1. How can the LPSD state be characterised and defined? 2. What is important to take into consideration when performing an LPSD HRA? 3. How can human behaviour be modelled for an LPSD risk analysis? 4. According to available empirical material, how are the questions above treated in performed analyses of human operation during LPSD? 5. How does the result of the questions above affect the way methods for analysis of LPSD could and/or should be developed? The procedure of this project has mainly consisted of literature studies of available theory for modelling of human behaviour and risk analysis of the LPSD state. This study concerns the analysis of planned outages during which maintenance, fuel change, tests and inspections are performed. The outage period is characterised by planned maintenance activities performed in rotating three-shift work around the clock, and by the fact that many of the persons performing work tasks at the plant are external contractors. The working conditions are characterised by stress due to heat, radiation and physically demanding or monotonous

  1. Benchmarking HRA methods against different NPP simulator data

    International Nuclear Information System (INIS)

    Petkov, Gueorgui; Filipov, Kalin; Velev, Vladimir; Grigorov, Alexander; Popov, Dimiter; Lazarov, Lazar; Stoichev, Kosta

    2008-01-01

    The paper presents both international and Bulgarian experience in assessing HRA methods and their underlying models, and approaches for their validation and verification by benchmarking HRA methods against data from different NPP simulators. The organization, status, methodology and outlook of the studies are described.

  2. Consideration on HRA implementation during LPSD operation

    International Nuclear Information System (INIS)

    Kim, Ar Ryum; Kim, Jae Whan; Jang, In Seok; Seong, Poong Hyun

    2014-01-01

    During low power and shutdown (LPSD) operation, it has been pointed out that the importance of human actions increases significantly. Because automatic control may be disabled, most control room annunciator tiles indicate alarm status, and procedures are insufficient, human operators play a much larger role during outages and in low-power states. In this regard, in order to reduce human errors and secure nuclear power plant safety, it is necessary to identify and estimate human errors during LPSD operations. However, many researchers have argued that there is no comprehensive LPSD human reliability analysis (HRA) method so far. In this study, we reviewed and implemented the existing HRA methods for LPSD operations: Korean standard HRA (K-HRA), simplified plant analysis risk HRA (SPAR-H), cause-based decision tree (CBDT), and human cognitive reliability/operator reliability experiments (HCR/ORE). We reviewed the HRA methods based on several reports related to the unique aspects of human operators during LPSD operation and on HRA requirements. In addition, the existing HRA methods were implemented for various plant operating states (POSs) of LPSD operation, with loss of the shutdown cooling system (SCS) selected as the initiating event. As a result of the review and implementation of the HRA methods, we derived some limitations of the existing HRA methods and the related procedures during a loss of SCS event.

  3. Bridging the gap between HRA research and HRA practice: A Bayesian network version of SPAR-H

    International Nuclear Information System (INIS)

    Groth, Katrina M.; Swiler, Laura P.

    2013-01-01

    The shortcomings of Human Reliability Analysis (HRA) have been a topic of discussion for over two decades. Repeated attempts to address these limitations have resulted in over 50 HRA methods, and the HRA research community continues to develop new methods. However, there remains a gap between the methods developed by HRA researchers and those actually used by HRA practitioners. Bayesian Networks (BNs) have become an increasingly popular part of the risk and reliability analysis framework over the past decade. BNs provide a framework for addressing many of the shortcomings of HRA from a researcher perspective and from a practitioner perspective. Several research groups have developed advanced HRA methods based on BNs, but none of these methods has been adopted by HRA practitioners in the U.S. nuclear power industry or at the U.S. Nuclear Regulatory Commission. In this paper we bridge the gap between HRA research and HRA practice by building a BN version of the widely used SPAR-H method. We demonstrate how the SPAR-H BN can be used by HRA practitioners, and we also demonstrate how it can be modified to incorporate data and information from research to advance HRA practice. The SPAR-H BN can be used as a starting point for translating HRA research efforts and advances in scientific understanding into real, timely benefits for HRA practitioners
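
    A minimal sketch of the core idea (not the actual SPAR-H BN from the paper): instead of fixing each PSF at a single level, the analyst supplies a probability distribution over PSF levels, and the HEP is obtained by marginalizing over all level combinations. The PSF names, level probabilities, and multipliers below are hypothetical.

      from itertools import product

      nominal_hep = 1e-3
      psf_levels = {
          # level: (P(level), multiplier)
          "stress":     {"nominal": (0.7, 1.0), "high": (0.3, 2.0)},
          "procedures": {"good":    (0.8, 1.0), "poor": (0.2, 5.0)},
      }

      hep = 0.0
      for combo in product(*(levels.items() for levels in psf_levels.values())):
          prob, mult = 1.0, 1.0
          for _level_name, (p, m) in combo:
              prob *= p
              mult *= m
          hep += prob * min(1.0, nominal_hep * mult)

      print(hep)   # HEP marginalized over the PSF states (~2.3E-3 here)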

  4. Suggestions for an improved HRA method for use in Probabilistic Safety Assessment

    International Nuclear Information System (INIS)

    Parry, Gareth W.

    1995-01-01

    This paper discusses why an improved Human Reliability Analysis (HRA) approach for use in Probabilistic Safety Assessments (PSAs) is needed, and proposes a set of requirements on the improved HRA method. The constraints imposed by the need to embed the approach into the PSA methodology are discussed. One approach to laying the foundation for an improved method, using models from the cognitive psychology and behavioral science disciplines, is outlined

  5. Design of Qualitative HRA Database Structure

    International Nuclear Information System (INIS)

    Kim, Seunghwan; Kim, Yochan; Choi, Sun Yeong; Park, Jinkyun; Jung, Wondea

    2015-01-01

    The purpose of an HRA database (DB) is to collect and store data in database form so that they can be managed and maintained from the perspective of human reliability analysis. All information on the human errors committed by operators in the power plant should be systematically collected and documented. KAERI is developing a simulator-based HRA data handbook. In this study, the information required to store and manage the data necessary to perform an HRA, and to hold the HRA data to be stored in the handbook, is identified and summarized. In particular, this study summarizes the collection and classification of qualitative data as the raw data needed to derive human error probabilities (HEPs), together with the associated DB process. The qualitative HRA DB is a repository of all the sub-information needed to obtain human error probabilities for a PSA. In this study, the requirements for the structural design and implementation of the qualitative HRA DB are summarized. A follow-up study on the implementation of the quantitative HRA DB should follow in order to derive the actual HEPs.

  6. Selecting the seismic HRA approach for Savannah River Plant PRA revision 1

    International Nuclear Information System (INIS)

    Papouchado, K.; Salaymeh, J.

    1993-10-01

    The Westinghouse Savannah River Company (WSRC) has prepared a level I probabilistic risk assessment (PRA), Rev. 0, of reactor operations for externally-initiated events including seismic events. The SRS PRA Rev. 0 Seismic HRA received a critical review that expressed skepticism with the approach used for human reliability analysis because it had not been previously used and accepted in other published PRAs. This report provides a review of published probabilistic risk assessments (PRAs), the associated methodology guidance documents, and the psychological literature to identify parameters important to seismic human reliability analysis (HRA). It also describes a recommended approach for use in the Savannah River Site (SRS) PRA. The SRS seismic event PRA performs HRA to account for the contribution of human errors in the accident sequences. The HRA of human actions during and after a seismic event is an area subject to many uncertainties and involves significant analyst judgment. The approach recommended by this report is based on seismic HRA methods and associated issues and concerns identified from the review of these referenced documents, which represent the current state-of-the-art knowledge and acceptance in the seismic HRA field.

  7. DEPEND-HRA-A method for consideration of dependency in human reliability analysis

    International Nuclear Information System (INIS)

    Cepin, Marko

    2008-01-01

    A consideration of dependencies between human actions is an important issue within human reliability analysis. A method was developed which integrates the features of existing methods and the experience from a full-scope plant simulator. The method is used in a real plant-specific human reliability analysis as a part of the probabilistic safety assessment of a nuclear power plant. The method distinguishes dependency for pre-initiator events from dependency for initiator and post-initiator events. It identifies dependencies based on scenarios in which consecutive human actions are modeled, and based on a list of minimal cut sets, which is obtained by running the minimal cut set analysis with high values of human error probabilities in the evaluation. A large example study, which consisted of a large number of human failure events, demonstrated the applicability of the method. Comparative analyses show that both the selection of the dependency method and the selection of dependency levels within the method largely impact the results of the probabilistic safety assessment. Even if the core damage frequency is not impacted much, the listings of important basic events in terms of risk increase and risk decrease factors may change considerably. More effort is needed on the subject to prepare the background for more detailed guidelines, which will remove the subjectivity from the evaluations as much as possible.
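
    For reference, the dependence treatment that such methods typically build on is the set of THERP conditional-probability equations, which map a basic HEP and an assessed dependence level to the conditional HEP of the dependent action; the sketch below reproduces those standard equations.

      def conditional_hep(basic_hep, dependence):
          # THERP dependence equations (zero, low, moderate, high, complete):
          # conditional probability of failing the dependent action given that
          # the preceding action failed.
          p = basic_hep
          return {
              "ZD": p,
              "LD": (1.0 + 19.0 * p) / 20.0,
              "MD": (1.0 + 6.0 * p) / 7.0,
              "HD": (1.0 + p) / 2.0,
              "CD": 1.0,
          }[dependence]

      print(conditional_hep(1e-3, "HD"))   # ~0.5: high dependence dominates a small HEP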

  8. EXAM-HRA. A comparison between HRA approaches in Nordic countries and in Germany

    International Nuclear Information System (INIS)

    Becker, G.; Hussels, U.; Schubert, B.; Männistö, Ilkka

    2012-01-01

    A joint project called "EXAM-HRA" is performed by German, Swedish and Finnish nuclear power plants. The working group presently consists of Kent Bladh (RAB); Anders Karlsson (FKA); Günter Becker and Marina Proske (RISA); Ilkka Männistö and Jan-Erik Holmberg (VTT); Lasse Tunturivuori (TVO); Christian Bjursten Carlsson and Anders Olsson (Scandpower); and Gunnar Johanson and Lisa Fritzson (ES-konsult). The overall project objective is to provide guidance for a "state of the art" human reliability analysis (HRA) for purposes of probabilistic safety assessment (PSA), to ensure that plant-specific properties are properly taken into consideration in the analysis /1/. This will also provide means to improve plant features based on HRA and PSA results, as well as means to improve Nordic and German HRA application for PSA purposes. In the previous phase 1, a method for comparing existing HRA analyses was developed and applied to tasks which have been modelled for various German and Nordic PSAs. In phase 2, three plant visits were performed, in Brunsbüttel (Germany), Forsmark (Sweden), and Olkiluoto (Finland), to compare actions in more detail and to perform additional analyses using specific variants of HRA methods. Although both the German and the Nordic partners strongly rely on the THERP method, the variants of this HRA method turned out to differ in some aspects. The Swedish and Finnish analyses focus on the diagnosis part of the action. They are based on the time correlation of Swain /2/, but they make use of additional performance shaping factors normally not used in Germany. The German analyses have a less complex model for diagnosis, but a more thorough investigation of the various action steps, especially considering various types of recovery. This will be demonstrated by an example, which will show the different analyses and their influence on the results. (orig.)
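
    For illustration of the "time correlation of Swain" mentioned above: a time-reliability correlation gives the probability that the crew has not yet diagnosed the event as a function of the time elapsed since the first cue, with intermediate times read off by interpolation on a log scale. The anchor points below are placeholders, not the values from NUREG/CR-1278.

      import math

      # Hypothetical anchor points: (minutes since the first cue, P(no diagnosis yet))
      anchors = [(1, 1.0), (10, 1e-1), (20, 1e-2), (60, 1e-3)]

      def p_non_response(t):
          # Log-log interpolation between the anchor points, clamped at the ends.
          if t <= anchors[0][0]:
              return anchors[0][1]
          if t >= anchors[-1][0]:
              return anchors[-1][1]
          for (t0, p0), (t1, p1) in zip(anchors, anchors[1:]):
              if t0 <= t <= t1:
                  frac = (math.log(t) - math.log(t0)) / (math.log(t1) - math.log(t0))
                  return math.exp(math.log(p0) + frac * (math.log(p1) - math.log(p0)))

      print(p_non_response(30))   # interpolated non-response probability at 30 minutes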

  9. Development of new HRA methods based upon operational experience

    International Nuclear Information System (INIS)

    Cooper, S.E.; Luckas, W.J.; Barriere, M.T.; Wreathall, J.

    2004-01-01

    Under the auspices of the US Nuclear Regulatory Commission (NRC), previously unaddressed human reliability issues are being investigated in order to support the development of human reliability analysis (HRA) methods for both low power and shutdown (LP&S) and full-power conditions. Actual operational experience, such as that reported in Licensee Event Reports (LERs), has been used to gain insights and provide a basis for the requirements of new HRA methods. In particular, operational experience has shown that new HRA methods for LP&S must address human-induced initiators, errors of commission, mistakes (vs. slips), dependencies, and the effects of multiple performance shaping factors (PSFs). (author)

  10. The Importance of HRA in Human Space Flight: Understanding the Risks

    Science.gov (United States)

    Hamlin, Teri

    2010-01-01

    Human performance is critical to crew safety during space missions. Humans interact with hardware and software during ground processing, normal flight, and in response to events. Human interactions with hardware and software can cause Loss of Crew and/or Vehicle (LOCV) through improper actions, or may prevent LOCV through recovery and control actions. Humans have the ability to deal with complex situations and system interactions beyond the capability of machines. Human Reliability Analysis (HRA) is a method used to qualitatively and quantitatively assess the occurrence of human failures that affect the availability and reliability of complex systems. Modeling human actions with their corresponding failure probabilities in a Probabilistic Risk Assessment (PRA) provides a more complete picture of system risks and risk contributions. A high-quality HRA can provide valuable information on potential areas for improvement, including training, procedures, human interface design, and the need for automation. Modeling human error has always been a challenge, in part because performance data is not always readily available. For spaceflight, the challenge is amplified not only because of the small number of participants and the limited amount of performance data available, but also due to the lack of definition of the unique factors influencing human performance in space. These factors, called performance shaping factors in HRA terminology, are used in HRA techniques to modify basic human error probabilities in order to capture the context of an analyzed task. Many of the human error modeling techniques were developed within the context of nuclear power plants, and therefore the methodologies do not address spaceflight factors such as the effects of microgravity and longer duration missions. This presentation will describe the types of human error risks which have shown up as risk drivers in the Shuttle PRA that may be applicable to commercial space flight. As with other large PRAs

  11. Application of human reliability analysis methodology of second generation

    International Nuclear Information System (INIS)

    Ruiz S, T. de J.; Nelson E, P. F.

    2009-10-01

    Human reliability analysis (HRA) is a very important part of probabilistic safety analysis. The main contribution of HRA in nuclear power plants is the identification and characterization of the issues that come together for an error to occur in the human tasks performed under normal operating conditions and in those carried out after an abnormal event. Additionally, the analysis of various accidents in history has found that the human component was a contributing factor in their causes. The need to understand the forms and probabilities of human error led, in the 1960s, to the collection of generic data, which resulted in the development of the first generation of HRA methodologies. Methods were subsequently developed that include additional performance shaping factors, and the interactions between them, in their models, so that by the mid-1990s what are considered the second-generation methodologies emerged. Among these is the methodology A Technique for Human Event Analysis (ATHEANA). The application of this method to a generic human failure event is interesting because its modeling includes errors of commission, the quantification of additional deviations from the nominal scenario considered in the accident sequence of the probabilistic safety analysis and, for this event, the evaluation of dependency between actions. That is, the generic human failure event first required the independent evaluation of the two related human failure events. Thus, gathering the new human error probabilities involves the quantification of the nominal scenario and of the cases of significant deviations, considered according to their potential impact on the analyzed human failure events. As in the probabilistic safety analysis, the analysis of the sequences allowed the more specific factors with the highest contribution to the human error probabilities to be extracted. (Author)

  12. Phoenix – A model-based Human Reliability Analysis methodology: Qualitative Analysis Procedure

    International Nuclear Information System (INIS)

    Ekanem, Nsimah J.; Mosleh, Ali; Shen, Song-Hua

    2016-01-01

    The Phoenix method is an attempt to address various issues in the field of Human Reliability Analysis (HRA). Built on a cognitive human response model, Phoenix incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and takes advantage of the best features of existing and emerging HRA methods. Its original framework was introduced in previous publications. This paper reports on the completed methodology, summarizing the steps and techniques of its qualitative analysis phase. The methodology introduces the “Crew Response Tree” which provides a structure for capturing the context associated with Human Failure Events (HFEs), including errors of omission and commission. It also uses a team-centered version of the Information, Decision and Action cognitive model and “macro-cognitive” abstractions of crew behavior, as well as relevant findings from cognitive psychology literature and operating experience, to identify potential causes of failures and influencing factors during procedure-driven and knowledge-supported crew-plant interactions. The result is the set of identified HFEs and likely scenarios leading to each. The methodology itself is generic in the sense that it is compatible with various quantification methods, and can be adapted for use across different environments including nuclear, oil and gas, aerospace, aviation, and healthcare. - Highlights: • Produces a detailed, consistent, traceable, reproducible and properly documented HRA. • Uses “Crew Response Tree” to capture context associated with Human Failure Events. • Models dependencies between Human Failure Events and influencing factors. • Provides a human performance model for relating context to performance. • Provides a framework for relating Crew Failure Modes to their influencing factors.
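
    A minimal sketch of the Crew Response Tree idea (the node names and structure are invented for illustration and are not taken from the Phoenix method itself): each branch point is a crew-plant interaction, and enumerating the paths that end in failure yields the Human Failure Events together with the scenarios leading to each.

      from dataclasses import dataclass, field

      @dataclass
      class CRTNode:
          name: str
          children: list = field(default_factory=list)
          hfe: str = ""          # set on failure end states, empty otherwise

      def failure_scenarios(node, path=()):
          # Return (path, HFE) pairs for every failure end state below this node.
          path = path + (node.name,)
          if not node.children:
              return [(path, node.hfe)] if node.hfe else []
          out = []
          for child in node.children:
              out.extend(failure_scenarios(child, path))
          return out

      crt = CRTNode("cue received", [
          CRTNode("situation assessed", [
              CRTNode("correct procedure entered"),                      # success leaf
              CRTNode("wrong procedure entered", hfe="HFE-1 (commission)"),
          ]),
          CRTNode("cue dismissed", hfe="HFE-2 (omission)"),
      ])

      for path, hfe in failure_scenarios(crt):
          print(" -> ".join(path), "=>", hfe)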

  13. Representing cognitive activities and errors in HRA trees

    International Nuclear Information System (INIS)

    Gertman, D.I.

    1992-01-01

    A graphic representation method is presented herein for adapting an existing technology -- human reliability analysis (HRA) event trees, used to support event sequence logic structures and calculations -- to include a representation of the underlying cognitive activity and corresponding errors associated with human performance. The analyst is presented with three potential means of representing human activity: the NUREG/CR-1278 HRA event tree approach; the skill-, rule- and knowledge-based paradigm; and the slips, lapses, and mistakes paradigm. The above approaches for representing human activity are integrated in order to produce an enriched HRA event tree -- the cognitive event tree system (COGENT) -- which, in turn, can be used to increase the analyst's understanding of the basic behavioral mechanisms underlying human error and the representation of that error in probabilistic risk assessment. Issues pertaining to the implementation of COGENT are also discussed.
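
    To make the quantitative side of an HRA event tree concrete (a generic sketch, not the COGENT representation itself): the total task failure probability is the sum, over the failure paths of the tree, of the product of the branch probabilities along each path. The branch values below are hypothetical.

      import math

      # Hypothetical two-branch HRA event tree: each failure path lists the branch
      # probabilities encountered along it.
      failure_paths = {
          "omit step A, recovery by checker fails":   [5e-3, 0.5],
          "perform step A, then misread indicator B": [1 - 5e-3, 3e-3],
      }

      total_hep = sum(math.prod(probs) for probs in failure_paths.values())
      print(total_hep)   # ~5.5E-3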

  14. Applicability of HRA to support advanced MMI design review

    International Nuclear Information System (INIS)

    Kim, Inn Seock

    2000-01-01

    More than half of all incidents in large complex technological systems, particularly in the nuclear power and aviation industries, were attributable in some way to erroneous human actions. These incidents were largely due to human engineering deficiencies of the man-machine interface (MMI). In the nuclear industry, advanced computer-based MMI designs are emerging as part of new reactor designs. The impact of advanced MMI technology on operator performance and, as a result, on plant safety should be thoroughly evaluated before such technology is actually adopted in nuclear power plants. This paper discusses the applicability of human reliability analysis (HRA) to support the design review process. Both first-generation and second-generation HRA methods are considered, focusing on a couple of promising HRA methods, i.e., ATHEANA and CREAM, with the potential to assist the design review process. (author)

  15. A methodology to incorporate organizational factors into human reliability analysis

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Xiao Dongsheng

    2010-01-01

    A new holistic methodology for Human Reliability Analysis (HRA) is proposed to model the effects of organizational factors on human reliability. Firstly, a conceptual framework is built, which is used to analyze the causal relationships between organizational factors and human reliability. Then, the inference model for human reliability analysis is built by combining the conceptual framework with Bayesian networks; it is used to execute the causal inference and diagnostic inference of human reliability. Finally, a case example is presented to demonstrate the specific application of the proposed methodology. The results show that the proposed methodology of combining the conceptual model with Bayesian networks can not only easily model the causal relationship between organizational factors and human reliability, but also, in a given context, quantitatively measure human operational reliability and identify the most likely root causes, or the prioritization of root causes, of human error. (authors)

  16. Development of a methodology for the application of the analysis of human reliability to individualized temporary storage facility; Desarrollo de una metodología de aplicación del Análisis de Fiabilidad Humana a una instalación de Almacén Temporal Individualizado

    Energy Technology Data Exchange (ETDEWEB)

    Diaz, P.; Dies, J.; Tapia, C.; Blas, A. de

    2014-07-01

    The paper presents the methodology that has been developed for applying human reliability analysis to an individualized temporary storage (ATI) facility without the need for experts during the HRA modelling and quantification process. The developed methodology is based on ATHEANA and relies on the use of other methods for the analysis of human actions and on in-depth analysis. (Author)

  17. A human reliability analysis (HRA) method for identifying and assessing the error of commission (EOC) from a diagnosis failure

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Yun; Kang, Dae Il

    2005-01-01

    The study deals with a method for systematically identifying and assessing the EOC events that might be caused by a diagnosis failure or misdiagnosis of the expected events in accident scenarios of nuclear power plants. The method for EOC identification and assessment consists of three steps: analysis of the potential for a diagnosis failure (or misdiagnosis), identification of the EOC events arising from the diagnosis failure, and quantitative assessment of the identified EOC events. As a tool for analysing a diagnosis failure, the MisDiagnosis Tree Analysis (MDTA) technique is proposed, together with a taxonomy of misdiagnosis causes. Also, guidance on the identification of EOC events and the classification system and data for quantitative assessment are given. As an application of the proposed method, the identification and assessment of EOCs for the Younggwang 3 and 4 plants, and the evaluation of their impact on plant risk, were performed. As a result, six events or event sequences were considered for diagnosis failures, and about 20 new Human Failure Events (HFEs) involving EOCs were identified. According to the assessment of the risk impact of the identified HFEs, they increase the CDF by 11.4% of the current CDF value, which corresponds to 10.2% of the new CDF. The small loss of coolant accident (SLOCA) turned out to be a major contributor to the increase of the CDF, resulting in a 9.2% increase of the current CDF.
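
    The two percentages quoted above are mutually consistent, as the short check below shows (CDF_old denotes the core damage frequency before adding the EOC-related HFEs):

      \[
      \frac{\Delta \mathrm{CDF}}{\mathrm{CDF}_{\mathrm{new}}}
        = \frac{0.114\,\mathrm{CDF}_{\mathrm{old}}}{(1 + 0.114)\,\mathrm{CDF}_{\mathrm{old}}}
        \approx 0.102 = 10.2\%
      \]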

  18. Some insights from recent applications of HRA methods in PSA effort and plant operation feedback in Czech Republic

    International Nuclear Information System (INIS)

    Holy, Jaroslav

    2004-01-01

    The methods of human reliability analysis (HRA) form an integral part of probabilistic safety assessment. These methods are still under development and there is a significant lack of data for their application. As a consequence, no unique formal approach exists for many aspects of plant staff human performance. In this contribution, some examples of fruitful work are given which reflect the needs and possibilities specific to the HRA field. In the first two examples, the methodology, course and results of analyses of concrete matters related to pre-accident errors and shutdown control room crew performance are described. The last example is oriented to one potential source of human-performance data: observation of control room crew full-scope simulator exercises. Some conclusions of general validity are presented in the final part of the contribution

  19. Advances in human reliability analysis in Mexico

    International Nuclear Information System (INIS)

    Nelson, Pamela F.; Gonzalez C, M.; Ruiz S, T.; Guillen M, D.; Contreras V, A.

    2010-10-01

    Human Reliability Analysis (HRA) is a very important part of Probabilistic Risk Analysis (PRA), and constant work is dedicated to improving methods, guidance and data in order to approach realism in the results, as well as to looking for ways to use these to reduce accident frequency at plants. Further, in order to advance in these areas, several HRA studies are being performed globally. Mexico has participated in the International HRA Empirical Study, with the objective of 'benchmarking' HRA methods by comparing HRA predictions to actual crew performance in a simulator, as well as in the empirical study on a US nuclear power plant currently in progress. The focus of the first study was to develop an understanding of how methods are applied by various analysts, and to characterize the methods for their capability to guide the analysts in identifying potential human failures and the associated causes and performance shaping factors. The HRA benchmarking study was performed using the Halden simulator, 14 European crews, and 15 HRA analysis teams (NRC, EPRI, and foreign teams using different HRA methods). This effort in Mexico is reflected in the work being performed to update the Laguna Verde PRA to comply with the ASME PRA standard. In order to be considered an HRA with technical adequacy, that is, capability category II for risk-informed applications, the methodology used for the HRA in the original PRA is not considered sufficiently detailed, and the methodology had to be upgraded. The HCR/CBDT/THERP method was chosen, since it is used in many nuclear plants of similar design. The HRA update includes identification and evaluation of human errors that can occur during testing and maintenance, as well as human errors that can occur during an accident while using the Emergency Operating Procedures. The review of procedures for maintenance, surveillance and operation is a necessary step in HRA and provides insight into the possible

  20. "Trojitá hra" - FTTH

    OpenAIRE

    Bobkovič, Peter

    2008-01-01

    This project deals with the options for implementing last-mile connections to the user by means of FTTH, i.e. fibre optics all the way to the home. The work is divided into two parts: a general overview of FTTH and the author's own measurements. The general overview successively presents the types of FTTH networks, the issues associated with FTTH, the situation around the world, and triple play (pricing, TV offerings, etc.). The second part presents the step-by-step measurement procedure and the results of the three measurements performed (PPM, OTDR, direct method), as well as ...

  1. Regional Shelter Analysis Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Dillon, Michael B. [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Dennison, Deborah [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Kane, Jave [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Walker, Hoyt [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States); Miller, Paul [Lawrence Livermore National Lab. (LLNL), Livermore, CA (United States)

    2015-08-01

    The fallout from a nuclear explosion has the potential to injure or kill 100,000 or more people through exposure to external gamma (fallout) radiation. Existing buildings can reduce radiation exposure by placing material between fallout particles and exposed people. Lawrence Livermore National Laboratory was tasked with developing an operationally feasible methodology that could improve fallout casualty estimates. The methodology, called a Regional Shelter Analysis, combines the fallout protection that existing buildings provide civilian populations with the distribution of people in various locations. The Regional Shelter Analysis method allows the consideration of (a) multiple building types and locations within buildings, (b) country specific estimates, (c) population posture (e.g., unwarned vs. minimally warned), and (d) the time of day (e.g., night vs. day). The protection estimates can be combined with fallout predictions (or measurements) to (a) provide a more accurate assessment of exposure and injury and (b) evaluate the effectiveness of various casualty mitigation strategies. This report describes the Regional Shelter Analysis methodology, highlights key operational aspects (including demonstrating that the methodology is compatible with current tools), illustrates how to implement the methodology, and provides suggestions for future work.
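
    A minimal illustrative computation (not the LLNL implementation) of the core idea: the population-averaged exposure is obtained by dividing the outdoor dose by each shelter category's protection factor and weighting by the fraction of people in that category. The categories, protection factors, and population fractions below are assumptions for illustration only.

        # Assumed illustrative inputs: fraction of the population in each shelter
        # category and the fallout protection factor (PF) of that category.
        population_fraction = {"wood frame, above ground": 0.5,
                               "masonry, above ground": 0.3,
                               "basement": 0.2}
        protection_factor = {"wood frame, above ground": 3.0,
                             "masonry, above ground": 10.0,
                             "basement": 40.0}

        outdoor_dose = 100.0  # arbitrary units, from a fallout prediction or measurement

        # Population-averaged dose = sum over categories of
        # (fraction of people in the category) * (outdoor dose / PF of the category).
        mean_dose = sum(frac * outdoor_dose / protection_factor[cat]
                        for cat, frac in population_fraction.items())
        print(f"population-averaged dose = {mean_dose:.1f} (outdoors: {outdoor_dose:.1f})")

    In the actual methodology the population fractions would further depend on location, posture (e.g., unwarned vs. minimally warned), and time of day, as described above.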

  2. A probabilistic cognitive simulator for HRA studies (PROCOS)

    International Nuclear Information System (INIS)

    Trucco, P.; Leva, M.C.

    2007-01-01

    The paper deals with the development of a simulator for analysing human errors in complex operational frameworks (e.g., plant commissioning). The aim is to integrate the quantification capabilities of the so-called 'first-generation' human reliability assessment (HRA) methods with a cognitive evaluation of the operator. The simulator allows analysing both error prevention and error recovery. It integrates cognitive human error analysis with standard hazard analysis methods (Hazop and event tree) by means of a 'semi-static approach'. The comparison between the results obtained through the proposed approach and those of a traditional HRA method, the human error assessment and reduction technique (HEART), shows the capability of the simulator to provide coherent and accurate analyses

  3. Failures without errors: quantification of context in HRA

    International Nuclear Information System (INIS)

    Fujita, Yushi; Hollnagel, Erik

    2004-01-01

    PSA-cum-human reliability analysis (HRA) has traditionally used individual human actions, hence individual 'human errors', as a meaningful unit of analysis. This is inconsistent with the current understanding of accidents, which points out that the notion of 'human error' is ill defined and that adverse events are more often due to the working conditions than to the people involved. Several HRA approaches, such as ATHEANA and CREAM, have recognised this conflict and proposed ways to deal with it. This paper describes an improvement of the basic screening method in CREAM, whereby a rating of the performance conditions can be used to calculate a Mean Failure Rate directly, without invoking the notion of human error
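
    For background, the sketch below implements the basic CREAM-style screening logic that the abstract refers to: the nine common performance conditions (CPCs) are rated, the counts of 'reduced' and 'improved' ratings select a control mode, and the control mode maps to a failure-probability interval. The interval values are the commonly cited CREAM figures, but the mapping rule here is deliberately simplified and the CPC ratings are invented; the paper's improvement (computing a Mean Failure Rate directly from the ratings) is not reproduced.

        # Commonly cited CREAM control-mode failure-probability intervals (assumed here).
        CONTROL_MODE_INTERVALS = {
            "strategic": (0.5e-5, 1e-2),
            "tactical": (1e-3, 1e-1),
            "opportunistic": (1e-2, 5e-1),
            "scrambled": (1e-1, 1.0),
        }

        def screen(cpc_ratings):
            """cpc_ratings: CPC name -> 'improved' | 'not significant' | 'reduced'."""
            n_reduced = sum(v == "reduced" for v in cpc_ratings.values())
            n_improved = sum(v == "improved" for v in cpc_ratings.values())
            # Simplified rule of thumb (assumed) in place of the CREAM region diagram.
            if n_reduced >= 6:
                mode = "scrambled"
            elif n_reduced >= 3:
                mode = "opportunistic"
            elif n_reduced > n_improved:
                mode = "tactical"
            else:
                mode = "strategic"
            return mode, CONTROL_MODE_INTERVALS[mode]

        ratings = {"adequacy of organisation": "reduced",
                   "working conditions": "not significant",
                   "adequacy of MMI": "reduced",
                   "availability of procedures": "improved",
                   "number of simultaneous goals": "reduced",
                   "available time": "reduced",
                   "time of day": "not significant",
                   "adequacy of training": "improved",
                   "crew collaboration quality": "not significant"}
        mode, (low, high) = screen(ratings)
        print(f"control mode: {mode}; failure probability interval: [{low:g}, {high:g}]")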

  4. SCHEME (Soft Control Human error Evaluation MEthod) for advanced MCR HRA

    International Nuclear Information System (INIS)

    Jang, Inseok; Jung, Wondea; Seong, Poong Hyun

    2015-01-01

    The Technique for Human Error Rate Prediction (THERP), Korean Human Reliability Analysis (K-HRA), Human Error Assessment and Reduction Technique (HEART), A Technique for Human Event Analysis (ATHEANA), Cognitive Reliability and Error Analysis Method (CREAM), and Simplified Plant Analysis Risk Human Reliability Assessment (SPAR-H) have been used for HRA in relation to NPP maintenance and operation. Most of these methods were developed considering the conventional type of Main Control Rooms (MCRs). They are still used for HRA in advanced MCRs even though the operating environment of advanced MCRs in NPPs has been considerably changed by the adoption of new human-system interfaces such as computer-based soft controls. Among the many features of advanced MCRs, soft controls are an important feature because operating actions in NPP advanced MCRs are performed through soft controls. Consequently, those conventional methods may not sufficiently consider the features of soft control execution human errors. To this end, a new framework of an HRA method for evaluating soft control execution human error is suggested by performing a soft control task analysis and literature reviews regarding widely accepted human error taxonomies. In this study, the framework of an HRA method for evaluating soft control execution human error in advanced MCRs is developed. First, the factors which an HRA method for advanced MCRs should encompass are derived based on the literature review and the soft control task analysis. Based on the derived factors, an execution HRA framework for advanced MCRs is developed, mainly focusing on the features of soft control. Moreover, since most current HRA databases deal with operation in the conventional type of MCRs and are not explicitly designed to deal with digital HSIs, an HRA database is developed under lab-scale simulation

  5. MODIFICATION OF THE SPAR-H METHOD TO SUPPORT HRA FOR LEVEL 2 PSA

    Energy Technology Data Exchange (ETDEWEB)

    St. Germain, S.; Boring, R.; Banaseanu, G.; Akl, Y.; Xu, M.

    2016-10-01

    Currently available Human Reliability Analysis (HRA) methods were generally developed to support Level 1 Probabilistic Safety Analysis (PSA) models. There has been an increased emphasis placed on Level 2 PSA in recent years; however, the currently used HRA methods, including the SPAR-H method, are not ideal for this application. Challenges that will likely be present during a severe accident, such as degraded or hazardous operating conditions, a shift in control from the main control room to the technical support center, unavailability of instrumentation, and others, are not routinely considered in Level 1 HRA analysis. These factors combine to create a much more uncertain condition to be accounted for in the HRA analysis. While the SPAR-H shaping factors were established to support Level 1 HRA, previous studies have shown the method may be used for Level 2 HRA analysis as well. The Canadian Nuclear Safety Commission (CNSC) and Idaho National Laboratory (INL), in a joint project, are investigating modifications to the SPAR-H method to create more consistency in applying the performance shaping factors used in the method for Level 2 analysis.
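
    For orientation, the sketch below shows the published SPAR-H quantification scheme that the modification discussed above builds on: a nominal HEP (1E-2 for diagnosis, 1E-3 for action) is multiplied by the PSF multipliers, and the SPAR-H adjustment factor is applied when three or more PSFs are negative so that the result stays below 1.0. The example PSF values are assumptions; the Level 2 specific PSF treatment proposed by CNSC/INL is not modelled here.

        def spar_h_hep(task_type, psf_multipliers):
            """SPAR-H style HEP: nominal HEP times the composite PSF multiplier."""
            nhep = {"diagnosis": 1e-2, "action": 1e-3}[task_type]
            composite = 1.0
            negative = 0
            for m in psf_multipliers:
                composite *= m
                if m > 1.0:
                    negative += 1
            if negative >= 3:
                # SPAR-H adjustment factor for three or more negative PSFs.
                return nhep * composite / (nhep * (composite - 1.0) + 1.0)
            return min(nhep * composite, 1.0)

        # Example (values assumed): diagnosis task under severe-accident-like conditions
        # with barely adequate time (x10), extreme stress (x5), and high complexity (x5).
        print(f"HEP = {spar_h_hep('diagnosis', [10, 5, 5]):.3f}")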

  6. The use of empirical data sources in HRA

    International Nuclear Information System (INIS)

    Hallbert, Bruce; Gertman, David; Lois, Erasmia; Marble, Julie; Blackman, Harold; Byers, James

    2004-01-01

    This paper presents a review of available information related to human performance to support Human Reliability Analysis (HRA) performed for nuclear power plants (NPPs). A number of data sources are identified as potentially useful. These include NPP licensee event reports, augmented inspection team reports, operator requalification data, results from the literature in experimental psychology, and the Aviation Safety Reporting System. The paper discusses how utilizing such information improves our capability to model and quantify human performance. In particular, the paper discusses how information related to performance shaping factors can be extracted from empirical data to determine their effect sizes, their relative effects, as well as their interactions. The paper concludes that appropriate use of existing sources can help address some of the important issues currently faced in HRA

  7. SMART performance analysis methodology

    International Nuclear Information System (INIS)

    Lim, H. S.; Kim, H. C.; Lee, D. J.

    2001-04-01

    To ensure the required and desired operation over the plant lifetime, the performance analysis for the SMART NSSS design is done by means of specified analysis methodologies for the performance-related design basis events (PRDBEs). A PRDBE is an occurrence (event) that shall be accommodated in the design of the plant and whose consequence would be no more severe than the normal service effects on the plant equipment. The performance analysis methodology, which systematizes the methods and procedures to analyze the PRDBEs, is as follows. Based on the operation modes suitable to the characteristics of the SMART NSSS, the corresponding PRDBEs and the allowable ranges of process parameters for these events are deduced. With the control logic developed for each operation mode, the system thermal-hydraulics are analyzed for the chosen PRDBEs using the system analysis code. In particular, because the system characteristics of SMART differ from those of existing commercial nuclear power plants, the operation modes, PRDBEs, control logic, and analysis code should be consistent with the SMART design. This report presents the categories of PRDBEs chosen for each operation mode and the transitions among them, together with the acceptance criteria for each PRDBE. It also includes the analysis methods and procedures for each PRDBE and the concept of the control logic for each operation mode. Therefore this report, in which the overall details of the SMART performance analysis are specified based on the current SMART design, can be utilized as a guide for the detailed performance analysis

  8. HRA Data Collection from the Simulations of Abnormal Situations

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Yo Chan; Park, Jin Kyun; Jung, Won Dea [Korea Atomic Energy Research Institute, Daejeon (Korea, Republic of)

    2014-10-15

    In this study, it was shown that the designed worksheets are feasible for collecting HRA data, especially in abnormal situations, and that the defined taxonomy of UAs is unambiguous enough to distinguish operator actions and quantify their probabilities. It is widely recognized that the reliability of operators is critical to complex socio-technical systems. For this reason, human reliability analysis (HRA), which aims to identify unsafe actions (UAs) that contribute to the risk of such systems and to assess the failure rates of those actions, has been conducted. Although many HRA techniques have been developed and used over many years, many reports have indicated a lack of databases supporting the empirical bases of HRA methods. Thus, there have been recent efforts to collect data about human reliability from plant experience, simulator experiments or qualification, and laboratory experiments. As one of these efforts, KAERI also established a guideline to collect information about human reliability and performance shaping factors from simulators. This guideline presents a set of worksheets that allows objectively observable information to be gathered comprehensively in simulations of emergency situations. This paper reports the process and preliminary results of data collection from simulations of abnormal situations based on the worksheets in the KAERI database guideline. We analyzed operator behaviors in sixteen experiments covering two kinds of abnormal situations: RCP (reactor coolant pump) cyclone filter blockage and CDP (condensate pump) valve stuck. The UAs of operators were identified and quantified. Because the number of simulations was limited and data from various situations will be obtained continuously, it is premature to draw firm conclusions from the resulting probabilities. Based on the worksheets, operator behaviors in many different kinds of scenarios will be analyzed and the relations between human reliability and the observed factors will also be investigated.

  9. Development of a qualitative evaluation framework for performance shaping factors (PSFs) in advanced MCR HRA

    International Nuclear Information System (INIS)

    Lee, Seung Woo; Kim, Ar Ryum; Ha, Jun Su; Seong, Poong Hyun

    2011-01-01

    Highlights: → Context changes in advanced MCR have impact on PSFs. → PSFs in the 1st and 2nd generation HRA methods are reviewed. → We made a qualitative evaluation framework for PSF based on human factor issues. - Abstract: Human reliability analysis (HRA) is performed as part of the probabilistic risk assessment to identify and quantify human actions and the associated impacts on structures, systems, and components of complex facilities. In performing HRA, conditions that influence human performance have been analyzed in terms of several context factors. These context factors, which are called performance shaping factors (PSFs) are used to adjust the basic human error probability (BHEP), and PSFs have been derived in various ways depending on the HRA methods used. As the design of instrumentation and control (I and C) systems for nuclear power plants (NPPs) is rapidly moving toward fully digital I and C, and modern computer techniques have been gradually introduced into the design of advanced main control room (MCR), computer-based human-system interfaces (HSIs), such as CRT-based displays, large display panels (LDPs), advanced information systems, soft control, and computerized procedure system (CPS) will be applied in advanced MCR. Environmental changes in MCR have some implications for PSFs, and they have an influence on when PSFs should be applied in HRA because different situations might induce different internal or external factors which can lead to human errors. In this study, PSFs for advanced MCR HRA are derived, and a new qualitative evaluation framework for these PSFs is suggested. First, PSFs from various HRA methods are collected, and these PSFs are further grouped into PSFs categories to be used in advanced MCR HRA. Second, human factor (HF) issues in advanced MCR are analyzed and derived to be used as an evaluation framework for PSFs.

  10. Development of a HRA method based on Human Factor Issues for advanced NPP

    International Nuclear Information System (INIS)

    Lee, Seung Woo; Seong, Poong Hyun; Ha, Jun Su; Park, Jae Hyuk; Kim, Ja Kyung

    2010-01-01

    A design of instrumentation and control (I and C) systems for various plant systems including nuclear power plants (NPPs) is rapidly moving toward fully digital I and C, and modern computer techniques have been gradually introduced into the design of advanced main control rooms (MCRs). In an advanced MCR, computer-based Human-System Interfaces (HSIs) such as CRT-based displays, large display panels (LDPs), advanced information systems, soft controls and computerized procedure systems (CPSs) are applied. Human operators in an advanced MCR still play an important role. However, various research and experience from NPPs with an advanced MCR show that the characteristics of human operators' tasks change due to the use of unfamiliar HSIs. This has implications for the PSFs (Performance Shaping Factors) in HRA (Human Reliability Analysis). A PSF in HRA is an aspect of the human's individual characteristics, environment, organization, or task that specifically decrements or improves human performance, resulting in an increased or decreased likelihood of human error. These PSFs have been suggested in various ways depending on the HRA methods used. In most HRA methods, however, there is a lack of consistency in the derivation of the PSFs and a lack of consideration of how the changes implemented in advanced MCRs impact operators' tasks. In this study, a framework for the derivation and evaluation of the PSFs to be used in HRA for advanced NPPs is suggested

  11. A new dynamic HRA method and its application

    International Nuclear Information System (INIS)

    Je, Moo Sung; Park, Chang Kyoo

    1995-01-01

    This paper presents a new dynamic HRA (Human Reliability Analysis) method and its application for quantifying the human error probabilities in implementing an accident management action. For comparison of current HRA methods with the new method, the characteristics of THERP, HCR, and SLIM-MAUD, which are the methods most frequently used in PSAs, are discussed. The action associated with implementation of cavity flooding during a station blackout sequence is considered for the application. The method is based on the concept of a quantified correlation between the performance requirement and the performance achievement. The MAAP 3.0B code and the Latin Hypercube sampling technique are used to determine the uncertainty of the performance achievement parameter. Meanwhile, the value of the performance requirement parameter is obtained from interviews. Based on the stochastic distributions obtained, human error probabilities are calculated with respect to various means and variances of the timings. It is shown that this method is very flexible in that it can be applied to any kind of operator action, including actions associated with the implementation of accident management strategies. 1 fig., 3 tabs., 17 refs. (Author)
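
    A minimal sketch of the quantification idea described above: the human error probability is the probability that the time needed to achieve the action (performance achievement) exceeds the time allowed by the performance requirement. The lognormal parameters are invented, and plain Monte Carlo sampling is used here, whereas the paper derives the achievement distribution from MAAP 3.0B runs with Latin Hypercube sampling and the requirement distribution from interviews.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000

        # Illustrative time distributions, in minutes (parameters assumed).
        t_available = rng.lognormal(mean=np.log(60.0), sigma=0.3, size=n)  # requirement
        t_needed = rng.lognormal(mean=np.log(35.0), sigma=0.5, size=n)     # achievement

        # HEP = probability that the action cannot be completed in time.
        hep = np.mean(t_needed > t_available)
        print(f"estimated HEP = {hep:.3e}")

    Repeating the calculation over a range of means and variances for the two timing distributions reproduces the kind of sensitivity study described in the abstract.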

  12. Lessons learned from HRA and human-system modeling efforts

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    Human-system modeling is not unique to the field of Human Reliability Analysis (HRA). Since human factors professionals first began their explorations of human activities, they have done so with the concept of 'system' in mind. Though the two, human and system, are distinct, they can be properly understood only in terms of each other: the system provides a context in which goals and objectives for work are defined, and the human plays either a pre-defined or ad hoc role in meeting these goals. In this sense, every intervention which attempts to evaluate or improve upon some system parameter requires that an understanding of human-system interactions be developed. It is too often the case, however, that somewhere between the inception of a system and its implementation, the human-system relationships are overlooked, misunderstood, or inadequately framed. This results in mismatches between the demands placed on human operators and their capabilities, systems which are difficult to operate, and the obvious end product: human error. The lessons learned from human-system modeling provide a valuable feedback mechanism to the process of HRA, and the technologies which employ this form of modeling

  13. An advanced human reliability analysis methodology focused on the analysis of cognitive errors

    International Nuclear Information System (INIS)

    Kim, J. H.; Jeong, W. D.

    2001-01-01

    The conventional Human Reliability Analysis (HRA) methods such as THERP/ASEP, HCR and SLIM have been criticised for their deficiency in analysing cognitive errors which occur during the operator's decision-making process. In order to overcome the limitations of the conventional methods, an advanced HRA method, the so-called 2nd generation HRA method, including both qualitative analysis and quantitative assessment of cognitive errors, has been developed based on the state-of-the-art theory of cognitive systems engineering and error psychology. The method was developed on the basis of a human decision-making model and the relation between cognitive functions and performance influencing factors. The application of the proposed method to two emergency operation tasks is presented

  14. Trace Chemical Analysis Methodology

    Science.gov (United States)

    1980-04-01

    [The record's abstract is garbled extraction residue from the source document; the recoverable fragments are figure-list entries ('Modified DR/2 spectrophotometer face', 'Colorimetric oil analysis field test kit', 'Pictorial step...') and a passage noting that computer-assisted pattern recognition is perhaps the most promising application of pattern recognition techniques for this research effort.]

  15. An Estimation of Human Error Probability of Filtered Containment Venting System Using Dynamic HRA Method

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Seunghyun; Jae, Moosung [Hanyang University, Seoul (Korea, Republic of)

    2016-10-15

    Human failure events (HFEs) are considered in the development of system fault trees as well as accident sequence event trees as part of Probabilistic Safety Assessment (PSA). Several methods for analyzing human error, such as the Technique for Human Error Rate Prediction (THERP), Human Cognitive Reliability (HCR), and Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H), are used, and new methods for human reliability analysis (HRA) are under development at this time. This paper presents a dynamic HRA method for assessing human failure events, and an estimation of the human error probability for the filtered containment venting system (FCVS) is performed. The action associated with implementation of containment venting during a station blackout sequence is used as an example. In this report, the dynamic HRA method was used to analyze the FCVS-related operator action. The distributions of the required time and the available time were developed with the MAAP code and LHS sampling. Though the numerical calculations given here are only for illustrative purposes, the dynamic HRA method can be a useful tool for human error estimation and it can be applied to any kind of operator action, including the severe accident management strategy.

  16. Fire safety analysis: methodology

    International Nuclear Information System (INIS)

    Kazarians, M.

    1998-01-01

    From a review of the fires that have occurred in nuclear power plants and the results of fire risk studies that have been completed over the last 17 years, we can conclude that internal fires in nuclear power plants can be an important contributor to plant risk. Methods and data are available to quantify the fire risk. These methods and data have been subjected to a series of reviews and detailed scrutiny and have been applied to a large number of plants. There is no doubt that we do not know everything about fire and its impact on a nuclear power plant. However, this lack of knowledge or uncertainty can be quantified and can be used in the decision-making process. In other words, the methods entail uncertainties and limitations that are not insurmountable, and there is little or no basis for claiming that the results of a fire risk analysis fail to support a decision process

  17. METHODOLOGICAL ELEMENTS OF SITUATIONAL ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tetyana KOVALCHUK

    2016-07-01

    Full Text Available The article deals with the investigation of the theoretical and methodological principles of situational analysis. The necessity of situational analysis under modern conditions is demonstrated, and the notion of 'situational analysis' is defined. We have concluded that situational analysis is a continuous system study whose purpose is to identify the signs of a dangerous situation, to evaluate such signs comprehensively as they are influenced by a system of objective and subjective factors, to search for motivated, targeted actions that eliminate the adverse effects of the situation on the system now and in the future, and to develop the managerial actions needed to bring the system back to normal. A methodological approach to situational analysis is developed, its goal is substantiated, and the expediency of diagnostic, evaluative and searching functions in the process of situational analysis is demonstrated. The basic methodological elements of situational analysis are grounded. The substantiation of the principal methodological elements of system analysis will enable the analyst to develop adaptive methods able to take into account the peculiar features of a unique object, namely a situation that has emerged in a complex system, to diagnose such a situation and subject it to systematic and in-depth analysis, to identify risks and opportunities, and to make timely management decisions as required by a particular period.

  18. Haven't a Cue? Mapping the CUE Space as an Aid to HRA Modeling

    Energy Technology Data Exchange (ETDEWEB)

    David I Gertman; Ronald L Boring; Jacques Hugo; William Phoenix

    2012-06-01

    Advances in automation present a new modeling environment for the human reliability analysis (HRA) practitioner. Many, if not most, current-day HRA methods have their origin in characterizing and quantifying human performance in analog environments where mode awareness and system status indications are potentially less comprehensive, but simpler to comprehend at a glance, when compared to advanced presentation systems. The introduction of highly complex automation has the potential to lead to decreased levels of situation awareness caused by the need for increased monitoring, confusion regarding the often non-obvious causes of automation failures, and emergent system dependencies that formerly may have been uncharacterized. Understanding the relation of incoming cues available to operators during plant upset conditions, in conjunction with operating procedures, yields insight into the nature of the expected operator response in this control room environment. Static systems methods such as fault trees do not contain the appropriate temporal information or necessarily specify the relationship among cues leading to operator response. In this paper, we do not attempt to replace the standard performance shaping factors commonly used in HRA nor to offer a new HRA method; existing methods may suffice. Instead, we strive to enhance the current understanding of the basis for operator response through a technique that can be used during the qualitative portion of the HRA analysis process. The CUE map is a means to visualize the relationship among salient cues in the control room that help shape operator response, and to show how the cognitive map of the operator changes as information is gained or lost; it is applicable to existing plants as well as advanced hybrid plants and small modular reactor designs. A brief application involving loss of condensate is presented, and advantages and limitations of the modeling approach and the use of the CUE map are discussed.
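
    A toy illustration (content assumed, not taken from the paper) of what a CUE map captures: each salient cue points to the operator responses or procedure steps it supports, so removing an indication shows how the operator's picture of the situation, and the set of supported responses, changes as information is gained or lost.

        # Illustrative cue map for a loss-of-condensate situation (cues and
        # responses are assumed for the example).
        cue_map = {
            "hotwell level low": {"verify condensate flow", "enter loss-of-condensate procedure"},
            "condensate pump trip alarm": {"start standby condensate pump"},
            "SG level decreasing": {"verify feedwater", "prepare AFW actuation"},
        }

        def supported_responses(available_cues):
            """Union of the responses supported by the cues currently available."""
            if not available_cues:
                return set()
            return set().union(*(cue_map[c] for c in available_cues))

        cues = set(cue_map)                          # all indications initially available
        print(sorted(supported_responses(cues)))

        cues.discard("condensate pump trip alarm")   # indication lost (e.g., sensor failure)
        print(sorted(supported_responses(cues)))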

  19. Proceedings of the international workshop on building the new HRA: errors of commission - from research to application

    International Nuclear Information System (INIS)

    2003-01-01

    The main mission of the Working Group on Risk Assessment (RISK) is to advance the understanding and utilisation of probabilistic safety analysis (PSA) in ensuring continued safety of nuclear installations in Member countries. One of the major criticisms of current PSAs is that they do not adequately address an important class of human system interactions, namely inappropriate actions, particularly those that might occur during the response to a transient or accident, that place the plant in a situation of higher risk. This class of inappropriate actions is often referred to as 'errors of commission'. The principal characteristic of an error of commission in a PSA context is that its consequence is a state of unavailability of a component, system or function. This is in contrast to an error of omission, which is characterised by a lack of action and, therefore, preserves the status quo of a system, component, or function. In the PSA context, the most significant errors of commission are those that, in addition to resulting in failure to perform some function, also fail or make unavailable other equipment or functions needed to mitigate the accident scenario, or otherwise exacerbate the situation. The workshop reported herein is an extension of the work of the Working Group on Risk Assessment (RISK) performed to review errors of commission in probabilistic safety analysis (NEA/CSNI/R(2000)17). The main purpose of the meeting was to provide a forum for exchange of information including lessons learned, identification of gaps in our current understanding and knowledge, data needs, and research needs. This workshop also provides a perspective for another workshop, Building the New HRA: Strengthening the Link Between Experience and HRA, to be held in Munich in January of 2002. Individual speakers present a broad international perspective that summarises technical issues, lessons learned, and experiences gained through applying second-generation human reliability

  20. Methodology of Credit Analysis Development

    Directory of Open Access Journals (Sweden)

    Slađana Neogradi

    2017-12-01

    Full Text Available The research presented in this paper concerns the definition of a methodology for the development of credit analysis in companies and its application to lending operations in the Republic of Serbia. With a developing credit market, there is a growing need for a well-developed risk and loss prevention system. The introduction describes the bank's analysis of the loan applicant, whose aim is to minimize and manage credit risk. The paper then describes the processing of the credit application and the procedure for analyzing financial statements in order to gain insight into the borrower's creditworthiness. In the second part of the paper, the theoretical and methodological framework is presented as applied in a concrete company. In the third part, models are presented which banks should use to protect themselves against risk exposure, i.e. to reduce losses on lending operations in our country and to adjust to market conditions in an optimal way.

  1. Representing cognitive activities and errors in HRA trees

    International Nuclear Information System (INIS)

    Gertman, D.I.

    1992-01-01

    This paper discusses the development of a means by which to present cognitive information in human reliability assessment (HRA) event trees. The descriptions found in probabilistic risk assessments (PRAs) regarding the demands on, and the resulting performance of, nuclear power plant (NPP) crews often make use of the technique for human error rate prediction (THERP), which provides a mechanism, the HRA event tree, for presenting the analyst's conceptualization of the activities underlying performance and the errors associated with that performance. When using THERP, analysts have often omitted the more complex elements of human cognition from these trees. There has yet to be a concerted effort to take theory, principles, and data from cognitive psychology and wed them to the logic structure of the HRA event tree. This paper attempts to do so. The COGENT modeling scheme (cognitively based HRA event trees) adds two taxonomies to the HRA event tree proposed by Swain and Guttman. The first taxonomy, proposed by Norman and Reason, describes the type of error committed and implies something about the underlying cognition as well. The second, the Rasmussen taxonomy, provides a description of the skill-based, rule-based, or knowledge-based behavior underlying the execution of tasks. It is not directly apparent and must be deduced from the pattern of errors exhibited by personnel
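
    As a small numerical illustration of the quantitative side of an HRA event tree (the COGENT cognitive taxonomies themselves are qualitative and are not reproduced here), the sketch below quantifies a two-branch tree in the THERP style: a primary action followed by a checker/recovery step whose conditional HEP is raised by the standard THERP dependence equations. The task HEPs and the chosen dependence level are assumptions.

        def conditional_hep(basic_hep, dependence):
            """THERP conditional HEP of a task given failure of the preceding task."""
            return {
                "zero": basic_hep,
                "low": (1 + 19 * basic_hep) / 20,
                "moderate": (1 + 6 * basic_hep) / 7,
                "high": (1 + basic_hep) / 2,
                "complete": 1.0,
            }[dependence]

        hep_action = 3e-3     # operator fails the primary action (assumed)
        hep_recovery = 1e-3   # nominal HEP of the checker/recovery step (assumed)

        # Failure path of interest: the action fails AND the recovery also fails.
        # The checker shares context with the operator, so the recovery HEP is
        # conditioned on the preceding failure through a dependence level.
        p_independent = hep_action * conditional_hep(hep_recovery, "zero")
        p_moderate = hep_action * conditional_hep(hep_recovery, "moderate")
        print(f"zero dependence:     {p_independent:.2e}")
        print(f"moderate dependence: {p_moderate:.2e}")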

  2. Stakeholder analysis methodologies resource book

    Energy Technology Data Exchange (ETDEWEB)

    Babiuch, W.M.; Farhar, B.C.

    1994-03-01

    Stakeholder analysis allows analysts to identify how parties might be affected by government projects. This process involves identifying the likely impacts of a proposed action and stakeholder groups affected by that action. Additionally, the process involves assessing how these groups might be affected and suggesting measures to mitigate any adverse effects. Evidence suggests that the efficiency and effectiveness of government actions can be increased and adverse social impacts mitigated when officials understand how a proposed action might affect stakeholders. This report discusses how to conduct useful stakeholder analyses for government officials making decisions on energy-efficiency and renewable-energy technologies and their commercialization. It discusses methodological issues that may affect the validity and reliability of findings, including sampling, generalizability, validity, 'uncooperative' stakeholder groups, using social indicators, and the effect of government regulations. The Appendix contains resource directories and a list of specialists in stakeholder analysis and involvement.

  3. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear-reactor-safety research program is described and compared with other methodologies established for performing uncertainty analyses

  4. LOFT uncertainty-analysis methodology

    International Nuclear Information System (INIS)

    Lassahn, G.D.

    1983-01-01

    The methodology used for uncertainty analyses of measurements in the Loss-of-Fluid Test (LOFT) nuclear reactor safety research program is described and compared with other methodologies established for performing uncertainty analyses

  5. Standardization of domestic human reliability analysis and experience of human reliability analysis in probabilistic safety assessment for NPPs under design

    International Nuclear Information System (INIS)

    Kang, D. I.; Jung, W. D.

    2002-01-01

    This paper introduces the background and development activities of the domestic standardization of a procedure and method for Human Reliability Analysis (HRA), intended to avoid, as far as possible, the intervention of the HRA analyst's subjectivity in Probabilistic Safety Assessment (PSA), and reviews the HRA results for domestic nuclear power plants under design studied by the Korea Atomic Energy Research Institute. We identify the HRA methods used for PSA of domestic NPPs and discuss the subjectivity of the HRA analyst shown in performing an HRA. Also, we introduce the PSA guidelines published in the USA and review the HRA results based on them. Finally, we propose the framework of a standard procedure and method for HRA to be developed

  6. Comparison of HRA methods based on WWER-1000 NPP real and simulated accident scenarios

    International Nuclear Information System (INIS)

    Petkov, Gueorgui

    2010-01-01

    Full text: Adequate treatment of human interactions in probabilistic safety analysis (PSA) studies is a key to the understanding of accident sequences and their relative importance in overall risk. Human interactions with machines have long been recognized as important contributors to the safe operation of nuclear power plants (NPPs). Human interactions affect the ordering of dominant accident sequences and hence have a significant effect on the risk of an NPP. By virtue of their ability to combine the treatment of both human and hardware reliability in real accidents, NPP full-scope, multifunctional and computer-based simulators provide a unique way of developing an understanding of the importance of specific human actions for overall plant safety. Context-dependent human reliability assessment (HRA) models, such as the holistic decision tree (HDT) and performance evaluation of teamwork (PET) methods, are so-called second-generation HRA techniques. The HDT model has been used for a number of PSA studies. The PET method reflects promising prospects for dealing with dynamic aspects of human performance. The paper presents a comparison of the two HRA techniques for the calculation of post-accident human error probability in the PSA. The real and simulated event training scenario 'turbine stop after loss of feedwater', based on standard PSA model assumptions, is designed for a WWER-1000 computer simulator, and its detailed boundary conditions are described and analyzed. The error probability of post-accident individual actions will be calculated by means of each investigated technique based on students' computer simulator training archives

  7. A Multi-Methods Approach to HRA and Human Performance Modeling: A Field Assessment

    Energy Technology Data Exchange (ETDEWEB)

    Jacques Hugo; David I Gertman

    2012-06-01

    The Advanced Test Reactor (ATR) is a research reactor at the Idaho National Laboratory that is primarily designed and used to test materials to be used in other, larger-scale and prototype reactors. The reactor offers various specialized systems and allows certain experiments to be run at their own temperature and pressure. The ATR Canal temporarily stores completed experiments and used fuel. It also has facilities to conduct underwater operations such as experiment examination or removal. In reviewing the ATR safety basis, a number of concerns were identified involving the ATR canal. A brief study identified ergonomic issues involving the manual handling of fuel elements in the canal that may increase the probability of human error and possible unwanted acute physical outcomes to the operator. In response to this concern, an analysis was conducted that refined the previous HRA scoping analysis by determining the probability of the inadvertent exposure of a fuel element to the air during fuel movement and inspection. The HRA analysis employed the SPAR-H method and was supplemented by information gained from a detailed analysis of the fuel inspection and transfer tasks. This latter analysis included ergonomics, work cycles, task duration, and the workload imposed by tool and workplace characteristics, personal protective clothing, and operational practices that have the potential to increase physical and mental workload. Part of this analysis consisted of NASA-TLX analyses, combined with operational sequence analysis, computational human performance analysis (CHPA), and 3D graphical modeling to determine task failures and precursors to such failures that have safety implications. Experience in applying multiple analysis techniques in support of HRA methods is discussed.

  8. A Multi-Methods Approach to HRA and Human Performance Modeling: A Field Assessment

    International Nuclear Information System (INIS)

    Hugo, Jacques; Gertman, David I.

    2012-01-01

    The Advanced Test Reactor (ATR) is a research reactor at the Idaho National Laboratory that is primarily designed and used to test materials to be used in other, larger-scale and prototype reactors. The reactor offers various specialized systems and allows certain experiments to be run at their own temperature and pressure. The ATR Canal temporarily stores completed experiments and used fuel. It also has facilities to conduct underwater operations such as experiment examination or removal. In reviewing the ATR safety basis, a number of concerns were identified involving the ATR canal. A brief study identified ergonomic issues involving the manual handling of fuel elements in the canal that may increase the probability of human error and possible unwanted acute physical outcomes to the operator. In response to this concern, an analysis was conducted that refined the previous HRA scoping analysis by determining the probability of the inadvertent exposure of a fuel element to the air during fuel movement and inspection. The HRA analysis employed the SPAR-H method and was supplemented by information gained from a detailed analysis of the fuel inspection and transfer tasks. This latter analysis included ergonomics, work cycles, task duration, and the workload imposed by tool and workplace characteristics, personal protective clothing, and operational practices that have the potential to increase physical and mental workload. Part of this analysis consisted of NASA-TLX analyses, combined with operational sequence analysis, computational human performance analysis (CHPA), and 3D graphical modeling to determine task failures and precursors to such failures that have safety implications. Experience in applying multiple analysis techniques in support of HRA methods is discussed.

  9. Causal Meta-Analysis : Methodology and Applications

    NARCIS (Netherlands)

    Bax, L.J.

    2009-01-01

    Meta-analysis is a statistical method to summarize research data from multiple studies in a quantitative manner. This dissertation addresses a number of methodological topics in causal meta-analysis and reports the development and validation of meta-analysis software. In the first (methodological)

  10. Safety analysis methodology for OPR 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun

    2005-01-01

    Full text: Korea Electric Power Research Institute (KEPRI) has been developing in-house safety analysis methodologies based on the codes available to KEPRI to overcome the problems arising from the currently used vendor-oriented methodologies. For the Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for the Westinghouse 3-loop plants by the Korean regulatory organization, and a project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for the non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  11. Preliminary safety analysis methodology for the SMART

    Energy Technology Data Exchange (ETDEWEB)

    Bae, Kyoo Hwan; Chung, Y. J.; Kim, H. C.; Sim, S. K.; Lee, W. J.; Chung, B. D.; Song, J. H. [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-03-01

    This technical report was prepared for a preliminary safety analysis methodology of the 330MWt SMART (System-integrated Modular Advanced ReacTor) which has been developed by Korea Atomic Energy Research Institute (KAERI) and funded by the Ministry of Science and Technology (MOST) since July 1996. This preliminary safety analysis methodology has been used to identify an envelope for the safety of the SMART conceptual design. As the SMART design evolves, further validated final safety analysis methodology will be developed. Current licensing safety analysis methodology of the Westinghouse and KSNPP PWRs operating and under development in Korea as well as the Russian licensing safety analysis methodology for the integral reactors have been reviewed and compared to develop the preliminary SMART safety analysis methodology. SMART design characteristics and safety systems have been reviewed against licensing practices of the PWRs operating or KNGR (Korean Next Generation Reactor) under construction in Korea. Detailed safety analysis methodology has been developed for the potential SMART limiting events of main steam line break, main feedwater pipe break, loss of reactor coolant flow, CEA withdrawal, primary to secondary pipe break and the small break loss of coolant accident. SMART preliminary safety analysis methodology will be further developed and validated in parallel with the safety analysis codes as the SMART design further evolves. Validated safety analysis methodology will be submitted to MOST as a Topical Report for a review of the SMART licensing safety analysis methodology. Thus, it is recommended for the nuclear regulatory authority to establish regulatory guides and criteria for the integral reactor. 22 refs., 18 figs., 16 tabs. (Author)

  12. Human reliability analysis of Lingao Nuclear Power Station

    International Nuclear Information System (INIS)

    Zhang Li; Huang Shudong; Yang Hong; He Aiwu; Huang Xiangrui; Zheng Tao; Su Shengbing; Xi Haiying

    2001-01-01

    The necessity of a human reliability analysis (HRA) of Lingao Nuclear Power Station is explained, and the method and operational procedure of the HRA are briefly described. One of the human factors events (HFEs) is analyzed in detail and some questions concerning HRA are discussed. The authors present the analytical results for 61 HFEs and briefly describe the contribution of the HRA to Lingao Nuclear Power Station

  13. 3D hra s technologií Leap Motion (3D Game Using Leap Motion Technology)

    OpenAIRE

    Mainuš, Matěj

    2014-01-01

    The aim of this bachelor's thesis was to design and implement a 3D labyrinth game controlled by hand gestures. For motion and gesture recognition the game uses the Leap Motion technology, while the application itself is built in the Unity game engine. The result of the work is a cross-platform 3D game with a custom library that integrates the Leap Motion SDK into Unity and eliminates errors in hand detection.

  14. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    Energy Technology Data Exchange (ETDEWEB)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da [Centro de Desenvolvimento da Tecnologia Nuclear (CDTN/CNEN-MG), Belo Horizonte, MG (Brazil)]. E-mails: vasconv@cdtn.br; reissc@cdtn.br; aclc@cdtn.br; Jordao, Elizabete [Universidade Estadual de Campinas (UNICAMP), SP (Brazil). Faculdade de Engenharia Quimica]. E-mail: bete@feq.unicamp.br

    2008-07-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation of the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the aforementioned factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  15. Development of a systematic methodology to select hazard analysis techniques for nuclear facilities

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Reis, Sergio Carneiro dos; Costa, Antonio Carlos Lopes da; Jordao, Elizabete

    2008-01-01

    In order to comply with the licensing requirements of regulatory bodies, risk assessments of nuclear facilities should be carried out. In Brazil, such assessments are part of the Safety Analysis Reports, required by CNEN (Brazilian Nuclear Energy Commission), and of the Risk Analysis Studies, required by the competent environmental bodies. A risk assessment generally includes the identification of the hazards and accident sequences that can occur, as well as the estimation of the frequencies and effects of these unwanted events on the plant, people, and environment. The hazard identification and analysis are also particularly important when implementing an Integrated Safety, Health, and Environment Management System following the ISO 14001, BS 8800 and OHSAS 18001 standards. Among the myriad of tools that support the hazard analysis process, the following can be highlighted: CCA (Cause-Consequence Analysis); CL (Checklist Analysis); ETA (Event Tree Analysis); FMEA (Failure Mode and Effects Analysis); FMECA (Failure Mode, Effects and Criticality Analysis); FTA (Fault Tree Analysis); HAZOP (Hazard and Operability Study); HRA (Human Reliability Analysis); Pareto Analysis; PHA (Preliminary Hazard Analysis); RR (Relative Ranking); SR (Safety Review); WI (What-If); and WI/CL (What-If/Checklist Analysis). The choice of a particular technique or a combination of techniques depends on many factors, such as the motivation of the analysis, the available data, the complexity of the process being analyzed, the expertise available on hazard analysis, and the initial perception of the involved risks. This paper presents a systematic methodology to select the most suitable set of tools to conduct the hazard analysis, taking into account the aforementioned factors. Considering that non-reactor nuclear facilities are, to a large extent, chemical processing plants, the developed approach can also be applied to the analysis of chemical and petrochemical plants. The selected hazard analysis techniques can support cost

  16. Methodology for Validating Building Energy Analysis Simulations

    Energy Technology Data Exchange (ETDEWEB)

    Judkoff, R.; Wortman, D.; O'Doherty, B.; Burch, J.

    2008-04-01

    The objective of this report was to develop a validation methodology for building energy analysis simulations, collect high-quality, unambiguous empirical data for validation, and apply the validation methodology to the DOE-2.1, BLAST-2MRT, BLAST-3.0, DEROB-3, DEROB-4, and SUNCAT 2.4 computer programs. This report covers background information, literature survey, validation methodology, comparative studies, analytical verification, empirical validation, comparative evaluation of codes, and conclusions.
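
    A minimal sketch of the empirical-validation step described above: simulated output is compared with measured data using simple error statistics. The data and the choice of metrics (mean bias error and root-mean-square error) are illustrative assumptions, not the report's actual test cases.

        import numpy as np

        # Assumed hourly zone temperatures: measured vs. simulated (invented data).
        measured = np.array([21.0, 21.4, 22.1, 23.0, 23.8, 24.1, 23.5, 22.6])
        simulated = np.array([20.7, 21.2, 22.4, 23.5, 24.0, 24.6, 23.3, 22.2])

        residual = simulated - measured
        mbe = residual.mean()                    # mean bias error
        rmse = np.sqrt((residual ** 2).mean())   # root-mean-square error
        print(f"MBE = {mbe:+.2f} K, RMSE = {rmse:.2f} K")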

  17. Probabilistic methodology for turbine missile risk analysis

    International Nuclear Information System (INIS)

    Twisdale, L.A.; Dunn, W.L.; Frank, R.A.

    1984-01-01

    A methodology has been developed for estimation of the probabilities of turbine-generated missile damage to nuclear power plant structures and systems. Mathematical models of the missile generation, transport, and impact events have been developed and sequenced to form an integrated turbine missile simulation methodology. Probabilistic Monte Carlo techniques are used to estimate the plant impact and damage probabilities. The methodology has been coded in the TURMIS computer code to facilitate numerical analysis and plant-specific turbine missile probability assessments. Sensitivity analyses have been performed on both the individual models and the integrated methodology, and probabilities have been estimated for a hypothetical nuclear power plant case study. (orig.)
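
    A toy Monte Carlo sketch of the sequence the methodology quantifies (missile generation, transport/strike, and damage); all numbers and the crude one-dimensional strike model are assumptions and bear no relation to the TURMIS code itself.

        import numpy as np

        rng = np.random.default_rng(1)
        n = 200_000

        p_generation = 1e-4        # assumed missile generation frequency (per year)
        p_damage_given_hit = 0.5   # assumed conditional damage probability

        # Crude transport/impact model (assumed): the ejection azimuth is uniform and
        # the safety-related target subtends 15 degrees as seen from the turbine.
        azimuth = rng.uniform(0.0, 360.0, n)
        strikes = (azimuth > 80.0) & (azimuth < 95.0)
        p_strike = strikes.mean()

        damage_frequency = p_generation * p_strike * p_damage_given_hit
        print(f"P(strike | missile) ~ {p_strike:.3f}; "
              f"damage frequency ~ {damage_frequency:.2e} per year")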

  18. 'Violation' - does HRA need the concept?

    International Nuclear Information System (INIS)

    Dougherty, Ed

    1995-01-01

    Violations are part of a complex matrix of judgmental behavior. The term violation is an indicator of the possibility of recent judgment errors but more so of latent errors in management and/or design. The effects of investigations in this arena indicate a need for an extension of the classic slip/mistake taxonomy of human reliability analysis. This note attempts to initiate that extension

  19. Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Horton, D.G.

    1998-01-01

    The fundamental objective of this topical report is to present the planned risk-informed disposal criticality analysis methodology to the NRC to seek acceptance that the principles of the methodology and the planned approach to validating the methodology are sound. The design parameters and environmental assumptions within which the waste forms will reside are currently not fully established and will vary with the detailed waste package design, engineered barrier design, repository design, and repository layout. Therefore, it is not practical to present the full validation of the methodology in this report, though a limited validation over a parameter range potentially applicable to the repository is presented for approval. If the NRC accepts the methodology as described in this section, the methodology will be fully validated for repository design applications to which it will be applied in the License Application and its references. For certain fuel types (e.g., intact naval fuel), any processes, criteria, codes or methods different from the ones presented in this report will be described in separate addenda. These addenda will employ the principles of the methodology described in this report as a foundation. Departures from the specifics of the methodology presented in this report will be described in the addenda

  1. Multidisciplinary framework for human reliability analysis with an application to errors of commission and dependencies

    International Nuclear Information System (INIS)

    Barriere, M.T.; Luckas, W.J.; Wreathall, J.; Cooper, S.E.; Bley, D.C.; Ramey-Smith, A.

    1995-08-01

    Since the early 1970s, human reliability analysis (HRA) has been considered to be an integral part of probabilistic risk assessments (PRAs). Nuclear power plant (NPP) events, from Three Mile Island through the mid-1980s, showed the importance of human performance to NPP risk. Recent events demonstrate that human performance continues to be a dominant source of risk. In light of these observations, the current limitations of existing HRA approaches become apparent when the role of humans is examined explicitly in the context of real NPP events. The development of new or improved HRA methodologies to more realistically represent human performance is recognized by the Nuclear Regulatory Commission (NRC) as a necessary means to increase the utility of PRAs. To accomplish this objective, an Improved HRA Project, sponsored by the NRC's Office of Nuclear Regulatory Research (RES), was initiated in late February, 1992, at Brookhaven National Laboratory (BNL) to develop an improved method for HRA that more realistically assesses the human contribution to plant risk and can be fully integrated with PRA. This report describes the research efforts including the development of a multidisciplinary HRA framework, the characterization and representation of errors of commission, and an approach for addressing human dependencies. The implications of the research and necessary requirements for further development also are discussed.

  2. Development of a BN framework for human reliability analysis through virtual simulation

    International Nuclear Information System (INIS)

    Garg, Vipul; Santhosh, T.V.; Vinod, Gopika; Antony, P.D.

    2017-01-01

    Humans are an integral part of complex systems such as nuclear power plants and play a significant role in ensuring the safety and reliability of these systems. Failure by the operator to perform the intended task within the stipulated time can challenge the safety of the system. Human reliability analysis (HRA) is a widely practiced methodology to estimate the contribution of operator error towards the overall risk to the facility. HRA methods quantify this contribution in terms of human error probability (HEP), accounting for various psychological and physiological factors that influence the performance of the operator. These factors are referred to as human factors (HF), which enhance or degrade human performance. The paper discusses the use of virtual simulation as a tool to generate the HF data from the virtual model of an in-house experimental facility. This paper also demonstrates the use of multi-attribute utility theory to determine a suitable HRA method amongst several HRA methods to quantify the HEP based on the desired set of HRA attributes. As classical HRA methods generally do not address the interactions among the HFs, the Bayesian network technique has been employed in this study to account for HF interactions. (author)
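
    The use of a Bayesian network to capture interactions among human factors can be illustrated with a small hand-coded example. The two factors, their prior probabilities and the conditional probability table below are invented for illustration and are not the network or data of the study; the sketch only shows how an HEP is obtained by marginalising the error node over the human-factor states.

    ```python
    from itertools import product

    # Hypothetical prior probabilities of two human factors being degraded
    P_STRESS_HIGH = 0.2      # assumed
    P_TRAINING_POOR = 0.1    # assumed

    # Hypothetical CPT P(error | stress, training); the last entry illustrates an
    # interaction: poor training degrades performance more when stress is high.
    P_ERROR = {
        (False, False): 0.001,
        (True, False): 0.01,
        (False, True): 0.005,
        (True, True): 0.08,
    }

    def marginal_hep():
        """Marginalise the error node over the human-factor states."""
        hep = 0.0
        for stress, training in product([False, True], repeat=2):
            p_stress = P_STRESS_HIGH if stress else 1.0 - P_STRESS_HIGH
            p_train = P_TRAINING_POOR if training else 1.0 - P_TRAINING_POOR
            hep += p_stress * p_train * P_ERROR[(stress, training)]
        return hep

    print(f"Marginal HEP: {marginal_hep():.4f}")
    ```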

  3. Multidisciplinary framework for human reliability analysis with an application to errors of commission and dependencies

    Energy Technology Data Exchange (ETDEWEB)

    Barriere, M.T.; Luckas, W.J. [Brookhaven National Lab., Upton, NY (United States); Wreathall, J. [Wreathall (John) and Co., Dublin, OH (United States); Cooper, S.E. [Science Applications International Corp., Reston, VA (United States); Bley, D.C. [PLG, Inc., Newport Beach, CA (United States); Ramey-Smith, A. [Nuclear Regulatory Commission, Washington, DC (United States). Div. of Systems Technology

    1995-08-01

    Since the early 1970s, human reliability analysis (HRA) has been considered to be an integral part of probabilistic risk assessments (PRAs). Nuclear power plant (NPP) events, from Three Mile Island through the mid-1980s, showed the importance of human performance to NPP risk. Recent events demonstrate that human performance continues to be a dominant source of risk. In light of these observations, the current limitations of existing HRA approaches become apparent when the role of humans is examined explicitly in the context of real NPP events. The development of new or improved HRA methodologies to more realistically represent human performance is recognized by the Nuclear Regulatory Commission (NRC) as a necessary means to increase the utility of PRAs. To accomplish this objective, an Improved HRA Project, sponsored by the NRC's Office of Nuclear Regulatory Research (RES), was initiated in late February, 1992, at Brookhaven National Laboratory (BNL) to develop an improved method for HRA that more realistically assesses the human contribution to plant risk and can be fully integrated with PRA. This report describes the research efforts including the development of a multidisciplinary HRA framework, the characterization and representation of errors of commission, and an approach for addressing human dependencies. The implications of the research and necessary requirements for further development also are discussed.

  4. Exploring Participatory Methodologies in Organizational Discourse Analysis

    DEFF Research Database (Denmark)

    Plotnikof, Mie

    2014-01-01

    Recent debates in the field of organizational discourse analysis stress contrasts in approaches such as single-level vs. multi-level, critical vs. participatory, and discursive vs. material methods. They raise methodological issues of combining these to embrace multimodality in order to enable new contributions. Conceptual efforts have been made, but further exploration of methodological combinations and their practical implications is called for. This paper argues 1) to combine methodologies by approaching this as scholarly subjectification processes, and 2) to perform combinations in both...

  5. Comparative analysis of proliferation resistance assessment methodologies

    International Nuclear Information System (INIS)

    Takaki, Naoyuki; Kikuchi, Masahiro; Inoue, Naoko; Osabe, Takeshi

    2005-01-01

    Comparative analysis of the methodologies was performed based on the discussions at the international workshop on 'Assessment Methodology of Proliferation Resistance for Future Nuclear Energy Systems' held in Tokyo in March 2005. Through the workshop and subsequent considerations, it was clarified that proliferation resistance assessment methodologies are affected by the broader nuclear options being pursued and also by the political situation of the state. Even the definition of proliferation resistance, despite the commonality of fundamental issues, derives from the perceived threat and the implementation circumstances inherent to the larger programs. A deeper recognition of these differences among communities would help to advance the discussion towards harmonization. (author)

  6. Malware Analysis Sandbox Testing Methodology

    Directory of Open Access Journals (Sweden)

    Zoltan Balazs

    2016-01-01

    Full Text Available Manual processing of malware samples became impossible years ago. Sandboxes are used to automate the analysis of malware samples to gather information about the dynamic behaviour of the malware, both at AV companies and at enterprises. Some malware samples use known techniques to detect when they run in a sandbox, but most of these sandbox detection techniques can be easily detected and thus flagged as malicious. I invented new approaches to detect these sandboxes. I developed a tool which can collect a lot of interesting information from these sandboxes to create statistics on how the current technologies work. After analysing these results I will demonstrate tricks to detect sandboxes. These tricks can't be easily flagged as malicious. Some sandboxes do not interact with the Internet in order to block data extraction, but with some DNS-fu the information can be extracted from these appliances as well.

  7. Sampling methodology and PCB analysis

    International Nuclear Information System (INIS)

    Dominelli, N.

    1995-01-01

    As a class of compounds, PCBs are extremely stable and resist chemical and biological decomposition. Diluted solutions exposed to a range of environmental conditions will undergo some preferential degradation, and the resulting mixture may differ considerably from the original PCB used as insulating fluid in electrical equipment. The structure of mixtures of PCBs (synthetic compounds prepared by direct chlorination of biphenyl with chlorine gas) is extremely complex and presents a formidable analytical problem, further complicated by the presence of PCBs as contaminants in media ranging from oils to soils to water. This paper provides some guidance on sampling and analytical procedures; it also points out various potential problems encountered during these processes. The guidelines provided deal with sample collection, storage and handling, sample stability, laboratory analysis (usually gas chromatography), determination of PCB concentration, calculation of total PCB content, and quality assurance. 1 fig

  8. Nondestructive assay methodologies in nuclear forensics analysis

    International Nuclear Information System (INIS)

    Tomar, B.S.

    2016-01-01

    In the present chapter, the nondestructive assay (NDA) methodologies used for the analysis of nuclear materials as part of a nuclear forensic investigation are described. These NDA methodologies are based on (i) measurement of the passive gamma rays and neutrons emitted by the radioisotopes present in the nuclear materials, and (ii) measurement of the gamma rays and neutrons emitted after active interrogation of the nuclear materials with a source of X-rays, gamma rays or neutrons.

  9. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  10. Update of Part 61 impacts analysis methodology

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W. (US Nuclear Regulatory Commission, Washington, DC 20555)

    1985-01-01

    The US Nuclear Regulatory Commission is expanding the impacts analysis methodology used during the development of the 10 CFR Part 61 regulation to allow improved consideration of costs and impacts of disposal of waste that exceeds Class C concentrations. The project includes updating the computer codes that comprise the methodology, reviewing and updating data assumptions on waste streams and disposal technologies, and calculation of costs for small as well as large disposal facilities. This paper outlines work done to date on this project

  11. Constructive Analysis : A Study in Epistemological Methodology

    DEFF Research Database (Denmark)

    Ahlström, Kristoffer

    The present study is concerned with the viability of the primary method in contemporary philosophy, i.e., conceptual analysis. Starting out by tracing the roots of this methodology to Platonic philosophy, the study questions whether such a methodology makes sense when divorced from Platonic philosophy, and develops a framework for a kind of analysis that is more in keeping with recent psychological research on categorization. Finally, it is shown that this kind of analysis can be applied to the concept of justification in a manner that furthers the epistemological goal of providing intellectual guidance.

  12. Simplified methodology for Angra 1 containment analysis

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1991-08-01

    A simplified analysis methodology was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The obtained data were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained with this new methodology, such as the small computational time of the simulation, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author)

  13. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    OpenAIRE

    Ludfi Pratiwi Bowo; Wanginingastuti Mutmainnah; Masao Furusho

    2017-01-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) have been developed; those methodologies are classified as first or second generation according to their viewpoints on problem-solving. The accident analysis can be performed using three techniques of analysis: sequen...

  14. A human error analysis methodology, AGAPE-ET, for emergency tasks in nuclear power plants and its application

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Jae Whan; Jung, Won Dea [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2002-03-01

    This report presents a procedurised human reliability analysis (HRA) methodology, AGAPE-ET (A Guidance And Procedure for Human Error Analysis for Emergency Tasks), for both the qualitative error analysis and the quantification of the human error probability (HEP) of emergency tasks in nuclear power plants. AGAPE-ET is based on a simplified cognitive model. For each cognitive function, error causes or error-likely situations have been identified considering the characteristics of the performance of each cognitive function and the influencing mechanism of performance influencing factors (PIFs) on the cognitive function. Error analysis items have then been determined from the identified error causes or error-likely situations to cue or guide the analysts through the overall human error analysis. A human error analysis procedure based on the error analysis items is organised. The basic scheme for the quantification of HEP consists in multiplying the basic HEP (BHEP) assigned to the error analysis item by the weight from the influencing factors decision tree (IFDT) constructed for each cognitive function. The method is characterised by the structured identification of the weak points of the task to be performed and by an efficient analysis process in which the analysts only have to address the necessary cognitive functions. The report also presents the application of AGAPE-ET to 31 nuclear emergency tasks and its results. 42 refs., 7 figs., 36 tabs. (Author)
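
    The quantification scheme summarised above (an HEP obtained by multiplying the BHEP of an error analysis item by a weight read off the influencing factors decision tree) can be sketched in a few lines. The item names, BHEP values and branch weights below are invented placeholders, not values from the AGAPE-ET report.

    ```python
    # Hypothetical basic HEPs per error analysis item (placeholder values only)
    BHEP = {
        "failure_to_notice_alarm": 3e-3,
        "misreading_indicator": 1e-3,
    }

    def ifdt_weight(time_pressure_high: bool, procedure_ambiguous: bool) -> float:
        """Walk a two-level influencing factors decision tree (IFDT).
        The branch structure and weights are illustrative assumptions."""
        if time_pressure_high:
            return 10.0 if procedure_ambiguous else 5.0
        return 2.0 if procedure_ambiguous else 1.0

    def hep(item: str, time_pressure_high: bool, procedure_ambiguous: bool) -> float:
        """HEP = BHEP x IFDT weight, capped at 1.0."""
        return min(1.0, BHEP[item] * ifdt_weight(time_pressure_high, procedure_ambiguous))

    # Example: misreading an indicator under high time pressure with a clear procedure
    print(hep("misreading_indicator", time_pressure_high=True, procedure_ambiguous=False))
    ```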

  15. An analysis of operational experience during low power and shutdown and a plan for addressing human reliability assessment issues

    International Nuclear Information System (INIS)

    Barriere, M.; Luckas, W.; Whitehead, D.; Ramey-Smith, A.

    1994-06-01

    Recent nuclear power plant events (e.g. Chernobyl, Diablo Canyon, and Vogtle) and US Nuclear Regulatory Commission (NRC) reports (e.g. NUREG-1449) have led to concerns regarding human reliability during low power and shutdown (LP&S) conditions and limitations of human reliability analysis (HRA) methodologies in adequately representing the LP&S environment. As a result of these concerns, the NRC initiated two parallel research projects to assess the influence of LP&S conditions on human reliability through an analysis of operational experience at pressurized water reactors (PWRs) and boiling water reactors (BWRs). These research projects, performed by Brookhaven National Laboratory for PWRs and Sandia National Laboratories for BWRs, identified unique aspects of human performance during LP&S conditions and provided a program plan for research and development necessary to improve existing HRA methodologies. This report documents the results of the analysis of LP&S operating experience and describes the improved HRA program plan.

  16. An Evaluation Methodology for Protocol Analysis Systems

    Science.gov (United States)

    2007-03-01

    Only fragments of this record are available: an abbreviation list (NS: Needham-Schroeder; NSL: Needham-Schroeder-Lowe; OCaml: Objective Caml; POSIX: Portable Operating System Interface), a section heading on the protocol analysis field noting its specialized terminology, and a Windows configuration note that ProVerif requires Objective Caml (OCaml), with OCaml version 3.09.3 installed.

  17. Supplement to the Disposal Criticality Analysis Methodology

    International Nuclear Information System (INIS)

    Thomas, D.A.

    1999-01-01

    The methodology for evaluating criticality potential for high-level radioactive waste and spent nuclear fuel after the repository is sealed and permanently closed is described in the Disposal Criticality Analysis Methodology Topical Report (DOE 1998b). The topical report provides a process for validating various models that are contained in the methodology and states that validation will be performed to support License Application. The Supplement to the Disposal Criticality Analysis Methodology provides a summary of data and analyses that will be used for validating these models and will be included in the model validation reports. The supplement also summarizes the process that will be followed in developing the model validation reports. These reports will satisfy commitments made in the topical report, and thus support the use of the methodology for Site Recommendation and License Application. It is concluded that this report meets the objective of presenting additional information along with references that support the methodology presented in the topical report and can be used both in validation reports and in answering requests for additional information received from the Nuclear Regulatory Commission concerning the topical report. The data and analyses summarized in this report and presented in the references are not sufficient to complete a validation report. However, this information will provide a basis for several of the validation reports. Data from several references in this report have been identified with TBV-1349. Release of the TBV governing this data is required prior to its use in quality-affecting activities and for use in analyses affecting procurement, construction, or fabrication. Subsequent to the initiation of TBV-1349, DOE issued a concurrence letter (Mellington 1999) approving the request to identify information taken from the references specified in Section 1.4 as accepted data.

  18. Nuclear methodology development for clinical analysis

    International Nuclear Information System (INIS)

    Oliveira, Laura Cristina de

    2003-01-01

    In the present work, the viability of using neutron activation analysis (NAA) to perform clinical analyses of urine and blood was assessed. The aim of this study is to investigate the biological behavior of animals that have been fed chow doped with natural uranium for a long period. Aiming at time and cost reduction, the absolute method was applied to determine element concentrations in biological samples. The quantitative results for urine sediment using NAA were compared with the conventional clinical analysis and the results were compatible. This methodology was also applied to bone and body organs such as liver and muscle to help the interpretation of possible anomalies. (author)

  19. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-01-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  20. Theoretical and methodological approaches in discourse analysis.

    Science.gov (United States)

    Stevenson, Chris

    2004-10-01

    Discourse analysis (DA) embodies two main approaches: Foucauldian DA and radical social constructionist DA. Both are underpinned by social constructionism to a lesser or greater extent. Social constructionism has contested areas in relation to power, embodiment, and materialism, although Foucauldian DA does focus on the issue of power. Embodiment and materialism may be especially relevant for researchers of nursing where the physical body is prominent. However, the contested nature of social constructionism allows a fusion of theoretical and methodological approaches tailored to a specific research interest. In this paper, Chris Stevenson suggests a framework for working out and declaring the DA approach to be taken in relation to a research area, as well as to aid anticipating methodological critique. Method, validity, reliability and scholarship are discussed from within a discourse analytic frame of reference.

  1. PROOF OF CONCEPT FOR A HUMAN RELIABILITY ANALYSIS METHOD FOR HEURISTIC USABILITY EVALUATION OF SOFTWARE

    International Nuclear Information System (INIS)

    Ronald L. Boring; David I. Gertman; Jeffrey C. Joe; Julie L. Marble

    2005-01-01

    An ongoing issue within human-computer interaction (HCI) is the need for simplified or 'discount' methods. The current economic slowdown has necessitated innovative methods that are results-driven and cost-effective. The myriad methods of design and usability are currently being cost-justified, and new techniques are actively being explored that meet current budgets and needs. Recent efforts in human reliability analysis (HRA) are highlighted by the ten-year development of the Standardized Plant Analysis Risk HRA (SPAR-H) method. The SPAR-H method has been used primarily for determining human-centered risk at nuclear power plants. The SPAR-H method, however, shares task analysis underpinnings with HCI. Despite this methodological overlap, there is currently no HRA approach deployed in heuristic usability evaluation. This paper presents an extension of the existing SPAR-H method to be used as part of heuristic usability evaluation in HCI.
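
    In outline, SPAR-H quantification multiplies a nominal HEP by performance shaping factor (PSF) multipliers and applies a composite adjustment so that the result stays below 1.0 when several degrading PSFs combine. The sketch below follows that general scheme; the multiplier values and the mapping of a usability finding onto a degraded human-machine interface PSF are hypothetical illustrations, not SPAR-H worksheet entries.

    ```python
    def spar_h_style_hep(nominal_hep: float, psf_multipliers: list) -> float:
        """Adjust a nominal HEP by PSF multipliers in the SPAR-H style.

        The composite adjustment below is applied whenever the composite
        multiplier exceeds 1; the actual worksheet applies it under more
        specific conditions, so this is only an approximation.
        """
        composite = 1.0
        for multiplier in psf_multipliers:
            composite *= multiplier
        if composite > 1.0:
            return nominal_hep * composite / (nominal_hep * (composite - 1.0) + 1.0)
        return nominal_hep * composite

    # Hypothetical usability reading: action-type task (assumed nominal HEP of 0.001)
    # with a confusing dialog (HMI x10) and moderate time pressure (x2).
    print(f"{spar_h_style_hep(0.001, [10.0, 2.0]):.4f}")
    ```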

  2. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    International Nuclear Information System (INIS)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves

    2009-01-01

    The THERP methodology is classified as a first-generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step for the development of HRA techniques in the industry. Being a first-generation technique, the THERP quantification tables of human errors are based on a taxonomy that does not take the mechanisms of human error into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule level, the technique can work better using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without modelling any error mechanisms. Another deficiency is that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probabilities) standard scenarios, and the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, a first-generation HRA technique, with CREAM and ATHEANA, which are second-generation HRA techniques. (author)
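
    The time-dependent diagnosis-error curves mentioned above are typically tabulated at a few anchor times and interpolated between them. The sketch below shows one common way of doing this (log-log interpolation); the anchor points are invented placeholders, not the THERP or simulator curves themselves.

    ```python
    import math

    # Hypothetical anchor points of a diagnosis time-reliability curve:
    # (time available in minutes, probability of failing to diagnose in that time).
    ANCHORS = [(1.0, 1.0), (10.0, 0.1), (20.0, 0.01), (60.0, 1e-3)]

    def diagnosis_failure_probability(t_minutes: float) -> float:
        """Log-log interpolation between the anchor points of the curve."""
        if t_minutes <= ANCHORS[0][0]:
            return ANCHORS[0][1]
        if t_minutes >= ANCHORS[-1][0]:
            return ANCHORS[-1][1]
        for (t0, p0), (t1, p1) in zip(ANCHORS, ANCHORS[1:]):
            if t0 <= t_minutes <= t1:
                frac = (math.log(t_minutes) - math.log(t0)) / (math.log(t1) - math.log(t0))
                return math.exp(math.log(p0) + frac * (math.log(p1) - math.log(p0)))
        raise ValueError("time outside tabulated range")

    print(f"P(diagnosis failure | 15 min available): {diagnosis_failure_probability(15.0):.3f}")
    ```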

  3. Comparison of the THERP quantitative tables with the human reliability analysis techniques of second generation

    Energy Technology Data Exchange (ETDEWEB)

    Alvarenga, Marco Antonio Bayout; Fonseca, Renato Alves [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)], e-mail: bayout@cnen.gov.br, e-mail: rfonseca@cnen.gov.br

    2009-07-01

    The THERP methodology is classified as a first-generation Human Reliability Analysis (HRA) technique, and its emergence was an important initial step for the development of HRA techniques in the industry. Being a first-generation technique, the THERP quantification tables of human errors are based on a taxonomy that does not take the mechanisms of human error into account. Concerning the three cognitive levels in the Rasmussen framework for cognitive information processing in human beings, THERP deals in most cases with errors that happen at the perceptual-motor level (stimulus-response). At the rule level, the technique can work better using the time-dependent probability curves of diagnosis errors obtained in nuclear power plant simulators. Nevertheless, this is done without modelling any error mechanisms. Another deficiency is that the performance shaping factors are limited in number. Furthermore, the influences (predictable or not) of the operational context, arising from operational deviations from the most probable (in terms of occurrence probabilities) standard scenarios, and the consequent operational tendencies (operator actions), are not estimated. This work makes a critical analysis of these deficiencies and points out possible solutions for modifying the THERP tables, seeking a realistic quantification that neither underestimates nor overestimates human error probabilities when applying HRA techniques to nuclear power plants. The critical analysis is accomplished through a qualitative comparison of THERP, a first-generation HRA technique, with CREAM and ATHEANA, which are second-generation HRA techniques. (author)

  4. Requirements Analysis in the Value Methodology

    Energy Technology Data Exchange (ETDEWEB)

    Conner, Alison Marie

    2001-05-01

    The Value Methodology (VM) study brings together a multidisciplinary team of people who own the problem and have the expertise to identify and solve it. With the varied backgrounds and experiences the team brings to the study come different perspectives on the problem and the requirements of the project. A requirements analysis step can be added to the Information and Function Analysis Phases of a VM study to validate whether the functions being performed are required, either regulatory or customer-prescribed. This paper will provide insight into the level of rigor applied to a requirements analysis step and give some examples of tools and techniques utilized to ease the management of the requirements and the functions those requirements support for highly complex problems.

  5. Cost analysis methodology of spent fuel storage

    International Nuclear Information System (INIS)

    1994-01-01

    The report deals with the cost analysis of interim spent fuel storage; however, it is not intended either to give a detailed cost analysis or to compare the costs of the different options. This report provides a methodology for calculating the costs of different options for interim storage of the spent fuel produced in the reactor cores. Different technical features and storage options (dry and wet, away from reactor and at reactor) are considered and the factors affecting all options defined. The major cost categories are analysed. Then the net present value of each option is calculated and the levelized cost determined. Finally, a sensitivity analysis is conducted taking into account the uncertainty in the different cost estimates. Examples of current storage practices in some countries are included in the Appendices, with description of the most relevant technical and economic aspects. 16 figs, 14 tabs
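
    The net present value and levelized cost steps mentioned above reduce to two short formulas: costs discounted to a reference year, and discounted costs divided by discounted throughput. The sketch below illustrates them with an entirely hypothetical cash-flow profile, discount rate and fuel throughput; none of the figures come from the report.

    ```python
    def npv(cash_flows, discount_rate):
        """Net present value of yearly cash flows (index 0 = reference year)."""
        return sum(cf / (1.0 + discount_rate) ** year for year, cf in enumerate(cash_flows))

    def levelized_cost(cost_flows, throughput_flows, discount_rate):
        """Levelized unit cost: discounted costs divided by discounted throughput."""
        return npv(cost_flows, discount_rate) / npv(throughput_flows, discount_rate)

    # Hypothetical dry-storage option: capital outlay in year 0, flat O&M for 10 years,
    # 100 tHM of spent fuel loaded per year from year 1 onward (placeholder figures).
    costs = [50e6] + [2e6] * 10       # USD per year
    fuel = [0.0] + [100.0] * 10       # tHM stored per year
    rate = 0.05                       # assumed real discount rate

    print(f"NPV of costs: {npv(costs, rate) / 1e6:.1f} MUSD")
    print(f"Levelized cost: {levelized_cost(costs, fuel, rate):,.0f} USD/tHM")
    ```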

  6. RAMS (Risk Analysis - Modular System) methodology

    Energy Technology Data Exchange (ETDEWEB)

    Stenner, R.D.; Strenge, D.L.; Buck, J.W. [and others]

    1996-10-01

    The Risk Analysis - Modular System (RAMS) was developed to serve as a broad scope risk analysis tool for the Risk Assessment of the Hanford Mission (RAHM) studies. The RAHM element provides risk analysis support for Hanford Strategic Analysis and Mission Planning activities. The RAHM also provides risk analysis support for the Hanford 10-Year Plan development activities. The RAMS tool draws from a collection of specifically designed databases and modular risk analysis methodologies and models. RAMS is a flexible modular system that can be focused on targeted risk analysis needs. It is specifically designed to address risks associated with overall strategy, technical alternative, and 'what if' questions regarding the Hanford cleanup mission. RAMS is set up to address both near-term and long-term risk issues. Consistency is very important for any comparative risk analysis, and RAMS is designed to efficiently and consistently compare risks and produce risk reduction estimates. There is a wide range of output information that can be generated by RAMS. These outputs can be detailed by individual contaminants, waste forms, transport pathways, exposure scenarios, individuals, populations, etc. However, they can also be in rolled-up form to support high-level strategy decisions.

  7. Environmental impact statement analysis: dose methodology

    International Nuclear Information System (INIS)

    Mueller, M.A.; Strenge, D.L.; Napier, B.A.

    1981-01-01

    Standardized sections and methodologies are being developed for use in environmental impact statements (EIS) for activities to be conducted on the Hanford Reservation. Five areas for standardization have been identified: routine operations dose methodologies, accident dose methodology, Hanford Site description, health effects methodology, and socioeconomic environment for Hanford waste management activities

  8. Research review and development trends of human reliability analysis techniques

    International Nuclear Information System (INIS)

    Li Pengcheng; Chen Guohua; Zhang Li; Dai Licao

    2011-01-01

    Human reliability analysis (HRA) methods are reviewed. The theoretical basis of human reliability analysis, the mechanisms of human error, the key elements of HRA methods, and the existing HRA methods are introduced and assessed in turn. Their shortcomings, current research hotspots and difficult problems are identified. Finally, the trends in human reliability analysis methods are examined. (authors)

  9. Studying creativity training programs: A methodological analysis

    DEFF Research Database (Denmark)

    Valgeirsdóttir, Dagný; Onarheim, Balder

    2017-01-01

    Throughout decades of creativity research, a range of creativity training programs have been developed, tested, and analyzed. In 2004 Scott and colleagues published a meta-analysis of all creativity training programs to date, and the review presented here set out to identify and analyze studies published since the seminal 2004 review. Focusing on quantitative studies of creativity training programs for adults, our systematic review resulted in 22 publications. All studies were analyzed, but comparing the reported effectiveness of training across studies proved difficult due to methodological inconsistencies, variations in reporting of results as well as types of measures used. Thus a consensus for future studies is called for to answer the question: Which elements make one creativity training program more effective than another? This is a question of equal relevance to academia and industry...

  10. Methodologies for risk analysis in slope instability

    International Nuclear Information System (INIS)

    Bernabeu Garcia, M.; Diaz Torres, J. A.

    2014-01-01

    This paper reviews the different methodologies used in producing landslide risk maps so that the reader can gain a basic knowledge of how to proceed in their development. Landslide hazard maps are increasingly demanded by governments because, due to climate change, deforestation and the pressure exerted by the growth of urban centers, the damage caused by natural phenomena increases each year, making this a field of study of growing importance. To explain the mapping process, each of its phases is covered: the study of the types of slope movements, the necessary management of geographic information systems (GIS), inventories, and the analysis of landslide susceptibility, threat, vulnerability and risk. (Author)

  11. Human reliability analysis methods for probabilistic safety assessment

    International Nuclear Information System (INIS)

    Pyy, P.

    2000-11-01

    Human reliability analysis (HRA) of a probabilistic safety assessment (PSA) includes identifying human actions from a safety point of view, modelling the most important of them in PSA models, and assessing their probabilities. As manifested by many incidents and studies, human actions may have both positive and negative effects on safety and economy. Human reliability analysis is one of the areas of probabilistic safety assessment (PSA) that has direct applications outside the nuclear industry. The thesis focuses upon developments in human reliability analysis methods and data. The aim is to support PSA by extending the applicability of HRA. The thesis consists of six publications and a summary. The summary includes general considerations and a discussion about human actions in the nuclear power plant (NPP) environment. A condensed discussion about the results of the attached publications is then given, including new developments in methods and data. At the end of the summary part, the contribution of the publications to good practice in HRA is presented. In the publications, studies based on the collection of data on maintenance-related failures, simulator runs and expert judgement are presented in order to extend the human reliability analysis database. Furthermore, methodological frameworks are presented to perform a comprehensive HRA, including shutdown conditions, to study reliability of decision making, and to study the effects of wrong human actions. In the last publication, an interdisciplinary approach to analysing human decision making is presented. The publications also include practical applications of the presented methodological frameworks. (orig.)

  12. ATHEANA: A Technique for Human Error Analysis: An Overview of Its Methodological Basis

    International Nuclear Information System (INIS)

    Wreathall, John; Ramey-Smith, Ann

    1998-01-01

    The U.S. NRC has developed a new human reliability analysis (HRA) method, called A Technique for Human Event Analysis (ATHEANA), to provide a way of modeling the so-called 'errors of commission' - that is, situations in which operators terminate or disable engineered safety features (ESFs) or similar equipment during accident conditions, thereby putting the plant at an increased risk of core damage. In its reviews of operational events, NRC has found that these errors of commission occur with a relatively high frequency (as high as 2 or 3 per year), but are noticeably missing from the scope of most current probabilistic risk assessments (PRAs). This new method was developed through a formalized approach that describes what can occur when operators behave rationally but have inadequate knowledge or poor judgement. In particular, the method is based on models of decision-making and response planning that have been used extensively in the aviation field, and on the analysis of major accidents in both the nuclear and non-nuclear fields. Other papers at this conference present summaries of these event analyses in both the nuclear and non-nuclear fields. This paper presents an overview of ATHEANA and summarizes how the method structures the analysis of operationally significant events, and helps HRA analysts identify and model potentially risk-significant errors of commission in plant PRAs. (authors)

  13. A Review of Citation Analysis Methodologies for Collection Management

    Science.gov (United States)

    Hoffmann, Kristin; Doucette, Lise

    2012-01-01

    While there is a considerable body of literature that presents the results of citation analysis studies, most researchers do not provide enough detail in their methodology to reproduce the study, nor do they provide rationale for methodological decisions. In this paper, we review the methodologies used in 34 recent articles that present a…

  14. Clean Energy Manufacturing Analysis Center Benchmark Report: Framework and Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    Sandor, Debra [National Renewable Energy Lab. (NREL), Golden, CO (United States); Chung, Donald [National Renewable Energy Lab. (NREL), Golden, CO (United States); Keyser, David [National Renewable Energy Lab. (NREL), Golden, CO (United States); Mann, Margaret [National Renewable Energy Lab. (NREL), Golden, CO (United States); Engel-Cox, Jill [National Renewable Energy Lab. (NREL), Golden, CO (United States)

    2017-05-23

    This report documents the CEMAC methodologies for developing and reporting annual global clean energy manufacturing benchmarks. The report reviews previously published manufacturing benchmark reports and foundational data, establishes a framework for benchmarking clean energy technologies, describes the CEMAC benchmark analysis methodologies, and describes the application of the methodologies to the manufacturing of four specific clean energy technologies.

  15. Methodology for Mode Selection in Corridor Analysis of Freight Transportation

    OpenAIRE

    Kanafani, Adib

    1984-01-01

    The purpose of this report is to outline a methodology for the analysis of mode selection in freight transportation. This methodology is intended to be part of transportation corridor analysis, a component of demand analysis that is part of a national transportation process. The methodological framework presented here provides a basis on which specific models and calculation procedures might be developed. It also provides a basis for the development of a data management system suitable for co...

  16. Comparison of the Utility of Two Assessments for Explaining and Predicting Productivity Change: Well-Being Versus an HRA.

    Science.gov (United States)

    Gandy, William M; Coberley, Carter; Pope, James E; Rula, Elizabeth Y

    2016-01-01

    To compare utility of employee well-being to health risk assessment (HRA) as predictors of productivity change. Panel data from 2189 employees who completed surveys 2 years apart were used in hierarchical models comparing the influence of well-being and health risk on longitudinal changes in presenteeism and job performance. Absenteeism change was evaluated in a nonexempt subsample. Change in well-being was the most significant independent predictor of productivity change across all three measures. Comparing hierarchical models, well-being models performed significantly better than HRA models. The HRA added no incremental explanatory power over well-being in combined models. Alone, nonphysical health well-being components outperformed the HRA for all productivity measures. Well-being offers a more comprehensive measure of factors that influence productivity and can be considered preferential to HRA in understanding and addressing suboptimal productivity.
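
    The hierarchical comparison reported above, asking whether the HRA adds explanatory power beyond well-being, amounts to comparing nested regression models by their incremental R-squared. The sketch below uses synthetic data generated only for illustration; the variables, coefficients and sample size bear no relation to the study's panel data.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n = 500

    # Synthetic predictors and outcome (productivity change); coefficients are arbitrary.
    well_being = rng.normal(size=n)
    hra_risk = 0.6 * well_being + rng.normal(scale=0.8, size=n)   # correlated with well-being
    productivity = 0.5 * well_being + rng.normal(scale=1.0, size=n)

    def r_squared(predictors, y):
        """Ordinary least squares R-squared with an intercept column."""
        X = np.column_stack([np.ones(len(y))] + list(predictors))
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        residuals = y - X @ beta
        return 1.0 - residuals.var() / y.var()

    r2_wb = r_squared([well_being], productivity)
    r2_full = r_squared([well_being, hra_risk], productivity)
    print(f"R^2, well-being only: {r2_wb:.3f}")
    print(f"Incremental R^2 from adding the HRA: {r2_full - r2_wb:.4f}")
    ```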

  17. Using a Realist Research Methodology in Policy Analysis

    Science.gov (United States)

    Lourie, Megan; Rata, Elizabeth

    2017-01-01

    The article describes the usefulness of a realist methodology in linking sociological theory to empirically obtained data through the development of a methodological device. Three layers of analysis were integrated: 1. the findings from a case study about Maori language education in New Zealand; 2. the identification and analysis of contradictions…

  18. The methodology of semantic analysis for extracting physical effects

    Science.gov (United States)

    Fomenkova, M. A.; Kamaev, V. A.; Korobkin, D. M.; Fomenkov, S. A.

    2017-01-01

    The paper presents a new methodology of semantic analysis for extracting physical effects. This methodology is based on the Tuzov ontology, which formally describes the Russian language. Semantic patterns are described for extracting structural physical information in the form of physical effects, and a new algorithm of text analysis is described.

  19. Development of analysis methodology on turbulent thermal striping

    Energy Technology Data Exchange (ETDEWEB)

    Yoo, Geun Jong; Jeon, Won Dae; Han, Jin Woo; Gu, Byong Kook [Changwon National University, Changwon(Korea)

    2001-03-01

    For developing the analysis methodology, the important governing factors of thermal striping phenomena are identified as the geometric configuration and flow characteristics such as velocity. For these factors, the performance of turbulence models in the existing analysis methodology is evaluated against experimental data. The status of DNS application is also assessed based on the literature. The evaluation results are reflected in setting up the new analysis methodology. From the evaluation of the existing analysis methodology, the full Reynolds stress (FRS) model is identified as the best among the turbulence models, and LES is found to be able to provide time-dependent turbulence values. Further improvements in the near-wall region and the temperature variance equation are required for FRS, and implementation of new sub-grid scale models is also required for LES. Through these improvements, a new reliable analysis methodology for thermal striping can be developed. 30 refs., 26 figs., 6 tabs. (Author)

  20. Čertovy obrázky a královská hra

    Czech Academy of Sciences Publication Activity Database

    Opavská, Zdeňka

    2017-01-01

    Vol. 96, No. 10 (2017), p. 600. ISSN 0042-4544. Institutional support: RVO:68378092. Keywords: Czech * lexicology * phraseology. Subject RIV: AI - Linguistics. OECD field: Linguistics. https://vesmir.cz/cz/casopis/archiv-casopisu/2017/cislo-10/certovy-obrazky-kralovska-hra.html

  1. Integrated sequence analysis. Final report

    International Nuclear Information System (INIS)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man-machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of BWR, shutdown LOCA of BWR, steam generator tube rupture of a PWR and BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence specific and more general findings. The sequence specific results are discussed together with each sequence description. The general lessons are discussed under a separate chapter by using comparisons of different case studies. These lessons include areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally, a discussion about the project follows and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as...

  2. A Goal based methodology for HAZOP analysis

    DEFF Research Database (Denmark)

    Rossing, Netta Liin; Lind, Morten; Jensen, Niels

    2010-01-01

    ...to nodes with simple functions such as liquid transport, gas transport, liquid storage, gas-liquid contacting etc. From the functions of the nodes the selection of relevant process variables and deviation variables follows directly. The knowledge required to perform the pre-meeting HAZOP task of dividing the plant along functional lines is that of chemical unit operations and transport processes plus some familiarity with the plant at hand. Thus the preparatory work may be performed by a chemical engineer with just an introductory course in risk assessment. The goal based methodology lends itself directly...

  3. Methodology for risk analysis of nuclear installations

    International Nuclear Information System (INIS)

    Vasconcelos, Vanderley de; Senne Junior, Murillo; Jordao, Elizabete

    2002-01-01

    Both the licensing standards for general uses in nuclear facilities and the specific ones require a risk assessment during their licensing processes. The risk assessment is carried out through the estimation of both the probability of occurrence of accidents and their magnitudes. This is a complex task because the great number of potentially hazardous events that can occur in nuclear facilities makes it difficult to define the accident scenarios. There are also many available techniques to identify the potential accidents, estimate their probabilities, and evaluate their magnitudes. This paper presents a new methodology that systematizes the risk assessment process and orders the accomplishment of its several steps. (author)

  4. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    Energy Technology Data Exchange (ETDEWEB)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology.

  5. Update of Part 61 Impacts Analysis Methodology. Methodology report. Volume 1

    International Nuclear Information System (INIS)

    Oztunali, O.I.; Roles, G.W.

    1986-01-01

    Under contract to the US Nuclear Regulatory Commission, the Envirosphere Company has expanded and updated the impacts analysis methodology used during the development of the 10 CFR Part 61 rule to allow improved consideration of the costs and impacts of treatment and disposal of low-level waste that is close to or exceeds Class C concentrations. The modifications described in this report principally include: (1) an update of the low-level radioactive waste source term, (2) consideration of additional alternative disposal technologies, (3) expansion of the methodology used to calculate disposal costs, (4) consideration of an additional exposure pathway involving direct human contact with disposed waste due to a hypothetical drilling scenario, and (5) use of updated health physics analysis procedures (ICRP-30). Volume 1 of this report describes the calculational algorithms of the updated analysis methodology

  6. Shuttle TPS thermal performance and analysis methodology

    Science.gov (United States)

    Neuenschwander, W. E.; Mcbride, D. U.; Armour, G. A.

    1983-01-01

    Thermal performance of the thermal protection system was approximately as predicted. The only extensive anomalies were filler bar scorching and over-predictions in the high Delta p gap heating regions of the orbiter. A technique to predict filler bar scorching has been developed that can aid in defining a solution. Improvement in high Delta p gap heating methodology is still under study. Minor anomalies were also examined for improvements in modeling techniques and prediction capabilities. These include improved definition of low Delta p gap heating, an analytical model for inner mode line convection heat transfer, better modeling of structure, and inclusion of sneak heating. The limited number of problems related to penetration items that presented themselves during orbital flight tests were resolved expeditiously, and designs were changed and proved successful within the time frame of that program.

  7. Methodological analysis of training student basketball teams

    Directory of Open Access Journals (Sweden)

    Kozina Zh.L.

    2011-06-01

    Full Text Available The leading propositions for the preparation of basketball teams in higher education institutions are considered. The system includes the following: reliance on top-quality players in the structure of preparedness; widespread use of visual aids, teaching films and animations with recordings of the execution of various techniques by professional basketball players; and the application of the methods of autogenic and ideomotor training according to our methodology. The study involved 63 students (years 1-5) from various universities of Kharkov, of sports categories 1-2: 32 in the experimental group and 31 in the control group. The developed system of training was used with the student basketball players for 1 year. The results confirm the efficiency of the developed system in the training process of student basketball players.

  8. An economic analysis methodology for project evaluation and programming.

    Science.gov (United States)

    2013-08-01

    Economic analysis is a critical component of a comprehensive project or program evaluation methodology that considers all key quantitative and qualitative impacts of highway investments. It allows highway agencies to identify, quantify, and value t...

  9. Development of Advanced Non-LOCA Analysis Methodology for Licensing

    International Nuclear Information System (INIS)

    Jang, Chansu; Um, Kilsup; Choi, Jaedon

    2008-01-01

    KNF is developing a new design methodology for non-LOCA analysis for licensing purposes. The code chosen is the best-estimate transient analysis code RETRAN, and the OPR1000 is the target plant. For this purpose, KNF prepared a simple nodal scheme appropriate to the licensing analyses and developed the designer-friendly analysis tool ASSIST (Automatic Steady-State Initialization and Safety analysis Tool). To check the validity of the newly developed methodology, the single CEA withdrawal and locked rotor accidents are analyzed using the new methodology and compared with current design results. The comparison shows good agreement, and it is concluded that the new design methodology can be applied to the licensing calculations for OPR1000 non-LOCA events.

  10. Opening Remarks of the Acquisition Path Analysis Methodology Session

    International Nuclear Information System (INIS)

    Renis, T.

    2015-01-01

    An overview of the recent development work that has been done on acquisition path analysis, implementation of the methodologies within the Department of Safeguards, lessons learned and future areas for development will be provided. (author)

  11. Vulnerability and Risk Analysis Program: Overview of Assessment Methodology

    National Research Council Canada - National Science Library

    2001-01-01

    .... Over the last three years, a team of national laboratory experts, working in partnership with the energy industry, has successfully applied the methodology as part of OCIP's Vulnerability and Risk Analysis Program (VRAP...

  12. A methodology for the data energy regional consumption consistency analysis

    International Nuclear Information System (INIS)

    Canavarros, Otacilio Borges; Silva, Ennio Peres da

    1999-01-01

    The article introduces a methodology for the consistency analysis of regional energy consumption data. The work is based on recent studies by several cited authors and addresses the Brazilian energy matrices and the Brazilian regional energy balances. The results are compared and analyzed.

  13. Structured information analysis for human reliability analysis of emergency tasks in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kim, Jae Whan; Park, Jin Kyun; Ha, Jae Joo [Korea Atomic Energy Research Institute, Taejeon (Korea)

    2000-02-01

    More than twenty HRA (Human Reliability Analysis) methodologies have been developed and used for safety analysis in the nuclear field during the past two decades. However, no methodology appears to have been universally accepted, as various limitations have been raised for the more widely used ones. One of the most important limitations of conventional HRA is insufficient analysis of the task structure and problem space. To resolve this problem, we suggest SIA (Structured Information Analysis) for HRA. The proposed SIA consists of three parts. The first part is the scenario analysis, which investigates the contextual information related to the given task on the basis of selected scenarios. The second is the goals-means analysis, which defines the relations between the cognitive goal and the task steps. The third is the cognitive function analysis module, which identifies the cognitive patterns and information flows involved in the task. Through this three-part analysis, a systematic investigation is made possible, from the macroscopic information on the tasks to the microscopic information on the specific cognitive processes. It is expected that analysts can attain a structured set of information that helps to predict the types and possibility of human error in the given task. 48 refs., 12 figs., 11 tabs. (Author)
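
    The three-part structure described above can be sketched as simple data containers, one per analysis part, to show how the outputs of the scenario analysis, the goals-means analysis and the cognitive function analysis might be tied together for one task. The field names and the example task are assumptions made for illustration, not the actual SIA worksheets of the report.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class ScenarioAnalysis:
        """Part 1: contextual information for the task under a selected scenario."""
        scenario: str
        initiating_event: str
        available_time_min: float
        cues: List[str] = field(default_factory=list)

    @dataclass
    class GoalMeansRelation:
        """Part 2: relation between a cognitive goal and the task steps serving it."""
        cognitive_goal: str
        task_steps: List[str] = field(default_factory=list)

    @dataclass
    class CognitiveFunctionAnalysis:
        """Part 3: cognitive patterns and information flows involved in the task."""
        cognitive_functions: List[str] = field(default_factory=list)
        information_flows: List[str] = field(default_factory=list)

    @dataclass
    class StructuredInformationAnalysis:
        task: str
        part1: ScenarioAnalysis
        part2: List[GoalMeansRelation]
        part3: CognitiveFunctionAnalysis

    # Hypothetical emergency task, for illustration only.
    sia = StructuredInformationAnalysis(
        task="Initiate feed-and-bleed cooling",
        part1=ScenarioAnalysis("Loss of feedwater", "Total loss of main and auxiliary feedwater",
                               available_time_min=30.0, cues=["SG level low", "RCS pressure rising"]),
        part2=[GoalMeansRelation("Restore core heat removal", ["Open PORVs", "Start HPSI pumps"])],
        part3=CognitiveFunctionAnalysis(["detection", "diagnosis", "response planning", "execution"],
                                        ["alarm -> operator", "procedure -> operator"]),
    )
    print(sia.task, "-", sia.part2[0].cognitive_goal)
    ```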

  14. Probabilistic safety analysis and human reliability analysis. Proceedings. Working material

    International Nuclear Information System (INIS)

    1996-01-01

    An international meeting on Probabilistic Safety Assessment (PSA) and Human Reliability Analysis (HRA) was jointly organized by Electricite de France - Research and Development (EDF DER) and SRI International in co-ordination with the International Atomic Energy Agency. The meeting was held in Paris 21-23 November 1994. A group of international and French specialists in PSA and HRA participated at the meeting and discussed the state of the art and current trends in the following six topics: PSA Methodology; PSA Applications; From PSA to Dependability; Incident Analysis; Safety Indicators; Human Reliability. For each topic a background paper was prepared by EDF/DER and reviewed by the international group of specialists who attended the meeting. The results of this meeting provide a comprehensive overview of the most important questions related to the readiness of PSA for specific uses and areas where further research and development is required. Refs, figs, tabs

  15. Probabilistic safety analysis and human reliability analysis. Proceedings. Working material

    Energy Technology Data Exchange (ETDEWEB)

    NONE

    1997-12-31

    An international meeting on Probabilistic Safety Assessment (PSA) and Human Reliability Analysis (HRA) was jointly organized by Electricite de France - Research and Development (EDF DER) and SRI International in co-ordination with the International Atomic Energy Agency. The meeting was held in Paris 21-23 November 1994. A group of international and French specialists in PSA and HRA participated at the meeting and discussed the state of the art and current trends in the following six topics: PSA Methodology; PSA Applications; From PSA to Dependability; Incident Analysis; Safety Indicators; Human Reliability. For each topic a background paper was prepared by EDF/DER and reviewed by the international group of specialists who attended the meeting. The results of this meeting provide a comprehensive overview of the most important questions related to the readiness of PSA for specific uses and areas where further research and development is required. Refs, figs, tabs.

  16. Severe accident analysis methodology in support of accident management

    International Nuclear Information System (INIS)

    Boesmans, B.; Auglaire, M.; Snoeck, J.

    1997-01-01

    The author addresses the implementation at BELGATOM of a generic severe accident analysis methodology, which is intended to support strategic decisions and to provide quantitative information in support of severe accident management. The analysis methodology is based on a combination of severe accident code calculations, generic phenomenological information (experimental evidence from various test facilities regarding issues beyond present code capabilities) and detailed plant-specific technical information

  17. Radiochemical Analysis Methodology for uranium Depletion Measurements

    Energy Technology Data Exchange (ETDEWEB)

    Scatena-Wachel DE

    2007-01-09

    This report provides sufficient material for a test sponsor with little or no radiochemistry background to understand and follow physics irradiation test program execution. Most irradiation test programs employ similar techniques and the general details provided here can be applied to the analysis of other irradiated sample types. Aspects of program management directly affecting analysis quality are also provided. This report is not an in-depth treatise on the vast field of radiochemical analysis techniques and related topics such as quality control. Instrumental technology is a very fast growing field and dramatic improvements are made each year, thus the instrumentation described in this report is no longer cutting edge technology. Much of the background material is still applicable and useful for the analysis of older experiments and also for subcontractors who still retain the older instrumentation.

  18. Diversion Path Analysis Handbook. Volume 1. Methodology

    International Nuclear Information System (INIS)

    Goodwin, K.E.; Schleter, J.C.; Maltese, M.D.K.

    1978-11-01

    Diversion Path Analysis (DPA) is a safeguards evaluation tool which is used to determine the vulnerability of the Material Control and Material Accounting (MC and MA) Subsystems to the threat of theft of Special Nuclear Material (SNM) by a knowledgeable Insider. The DPA team should consist of two individuals who have technical backgrounds. The implementation of DPA is divided into five basic steps: Information and Data Gathering, Process Characterization, Analysis of Diversion Paths, Results and Findings, and Documentation

  19. SINGULAR SPECTRUM ANALYSIS: METHODOLOGY AND APPLICATION TO ECONOMICS DATA

    Institute of Scientific and Technical Information of China (English)

    Hossein HASSANI; Anatoly ZHIGLJAVSKY

    2009-01-01

    This paper describes the methodology of singular spectrum analysis (SSA) and demonstrates that it is a powerful method of time series analysis and forecasting, particularly for economic time series. The authors consider the application of SSA to the analysis and forecasting of the Iranian national accounts data as provided by the Central Bank of the Islamic Republic of Iran.
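
    The basic SSA computation the abstract refers to (embedding the series into a trajectory matrix, decomposing it, and reconstructing a smoothed series from the leading components) can be sketched in a few lines. The following NumPy sketch is illustrative only and is not the authors' implementation; the toy series and parameter choices are assumptions.

      # Minimal SSA sketch in NumPy: embed, decompose, group, diagonally average.
      import numpy as np

      def ssa_reconstruct(series, window, n_components):
          """Reconstruct `series` from its first `n_components` SSA components."""
          series = np.asarray(series, dtype=float)
          n = series.size
          k = n - window + 1
          # 1. Embedding: build the trajectory (Hankel) matrix.
          trajectory = np.column_stack([series[i:i + window] for i in range(k)])
          # 2. Decomposition: singular value decomposition.
          u, s, vt = np.linalg.svd(trajectory, full_matrices=False)
          # 3. Grouping: keep the leading components (trend and dominant cycles).
          approx = (u[:, :n_components] * s[:n_components]) @ vt[:n_components, :]
          # 4. Diagonal averaging (Hankelization) back to a one-dimensional series.
          recon = np.zeros(n)
          counts = np.zeros(n)
          for i in range(window):
              for j in range(k):
                  recon[i + j] += approx[i, j]
                  counts[i + j] += 1
          return recon / counts

      # Toy usage: trend + seasonal cycle + noise, reconstructed with 3 components.
      t = np.arange(200)
      y = 0.02 * t + np.sin(2 * np.pi * t / 12) + 0.3 * np.random.randn(200)
      smoothed = ssa_reconstruct(y, window=24, n_components=3)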

  20. Gap analysis methodology for business service engineering

    NARCIS (Netherlands)

    Nguyen, D.K.; van den Heuvel, W.J.A.M.; Papazoglou, M.; de Castro, V.; Marcos, E.; Hofreiter, B.; Werthner, H.

    2009-01-01

    Many of today’s service analysis and design techniques rely on ad-hoc and experience-based identification of value-creating business services and implicitly assume a “green-field” situation focusing on the development of completely new services while offering very limited support for discovering

  1. Discourse analysis: making complex methodology simple

    NARCIS (Netherlands)

    Bondarouk, Tatiana; Ruel, Hubertus Johannes Maria; Leino, T.; Saarinen, T.; Klein, S.

    2004-01-01

    Discourse-based analysis of organizations is not new in the field of interpretive social studies. More recently, information systems (IS) studies have also shown a keen interest in discourse (Wynn et al., 2002). The IS field has grown significantly in its multiplicity, which is echoed in the

  2. Scenario aggregation and analysis via Mean-Shift Methodology

    International Nuclear Information System (INIS)

    Mandelli, D.; Yilmaz, A.; Metzroth, K.; Aldemir, T.; Denning, R.

    2010-01-01

    A new generation of dynamic methodologies is being developed for nuclear reactor probabilistic risk assessment (PRA) which explicitly account for the time element in modeling the probabilistic system evolution and use numerical simulation tools to account for possible dependencies between failure events. The dynamic event tree (DET) approach is one of these methodologies. One challenge with dynamic PRA methodologies is the large amount of data they produce which may be difficult to analyze without appropriate software tools. The concept of 'data mining' is well known in the computer science community and several methodologies have been developed in order to extract useful information from a dataset with a large number of records. Using the dataset generated by the DET analysis of the reactor vessel auxiliary cooling system (RVACS) of an ABR-1000 for an aircraft crash recovery scenario and the Mean-Shift Methodology for data mining, it is shown how clusters of transients with common characteristics can be identified and classified. (authors)
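
    As a rough illustration of the clustering step described above, the sketch below groups hypothetical dynamic-event-tree end states with the mean-shift implementation in scikit-learn. The feature set and the random placeholder data are assumptions, not the RVACS/ABR-1000 dataset analyzed by the authors.

      # Illustrative clustering of dynamic-event-tree end states with mean-shift.
      import numpy as np
      from sklearn.cluster import MeanShift, estimate_bandwidth
      from sklearn.preprocessing import StandardScaler

      # Each row is one DET branch, e.g. [peak_temp, time_to_recovery, min_flow] (placeholder data).
      rng = np.random.default_rng(0)
      scenarios = rng.random((500, 3))

      X = StandardScaler().fit_transform(scenarios)           # common scale for all features
      bandwidth = estimate_bandwidth(X, quantile=0.2)          # kernel width from the data
      labels = MeanShift(bandwidth=bandwidth).fit_predict(X)   # cluster label per transient

      for cluster_id in np.unique(labels):
          members = scenarios[labels == cluster_id]
          print(f"cluster {cluster_id}: {len(members)} transients, "
                f"mean peak value {members[:, 0].mean():.3f}")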

  3. Methodological aspects on drug receptor binding analysis

    International Nuclear Information System (INIS)

    Wahlstroem, A.

    1978-01-01

    Although drug receptors occur in relatively low concentrations, they can be visualized by the use of appropriate radioindicators. In most cases the procedure is rapid and can reach a high degree of accuracy. Specificity of the interaction is studied by competition analysis. The necessity of using several radioindicators to define a receptor population is emphasized. It may be possible to define isoreceptors and drugs with selectivity for one isoreceptor. (Author)

  4. Franchising project Hračky Modré z nebe

    OpenAIRE

    Maňasová, Petra

    2008-01-01

    The diploma thesis describes the situation of franchising in the Czech Republic. The main part is focused on the franchising system and future strategy of the project Hračky Modré z nebe. It presents the company Eltsen a.s., which wants to realize the project in the future, along with its historical development and current situation. The thesis describes the approach the company would like to take in order to realize the whole project.

  5. Analysis of Alternatives for Risk Assessment Methodologies and Tools

    Energy Technology Data Exchange (ETDEWEB)

    Nachtigal, Noel M. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). System Analytics; Fruetel, Julia A. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Gleason, Nathaniel J. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Helms, Jovana [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Imbro, Dennis Raymond [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis; Sumner, Matthew C. [Sandia National Lab. (SNL-CA), Livermore, CA (United States). Systems Research and Analysis

    2013-10-01

    The purpose of this document is to provide a basic overview and understanding of risk assessment methodologies and tools from the literature and to assess the suitability of these methodologies and tools for cyber risk assessment. Sandia National Laboratories (SNL) performed this review in support of risk modeling activities performed for the Stakeholder Engagement and Cyber Infrastructure Resilience (SECIR) division of the Department of Homeland Security (DHS) Office of Cybersecurity and Communications (CS&C). The set of methodologies and tools covered in this document is not intended to be exhaustive; instead, it focuses on those that are commonly used in the risk assessment community. The classification of methodologies and tools was performed by a group of analysts with experience in risk analysis and cybersecurity, and the resulting analysis of alternatives has been tailored to address the needs of a cyber risk assessment.

  6. A process for application of ATHEANA - a new HRA method

    International Nuclear Information System (INIS)

    Parry, G.W.; Bley, D.C.; Cooper, S.E.

    1996-01-01

    This paper describes the analytical process for the application of ATHEANA, a new approach to the performance of human reliability analysis as part of a PRA. This new method, unlike existing methods, is based upon an understanding of the reasons why people make errors, and was developed primarily to address the analysis of errors of commission

  7. Diversion path analysis handbook. Volume I. Methodology

    International Nuclear Information System (INIS)

    Maltese, M.D.K.; Goodwin, K.E.; Schleter, J.C.

    1976-10-01

    Diversion Path Analysis (DPA) is a procedure for analyzing internal controls of a facility in order to identify vulnerabilities to successful diversion of material by an adversary. The internal covert threat is addressed but the results are also applicable to the external overt threat. The diversion paths are identified. Complexity parameters include records alteration or falsification, multiple removals of sub-threshold quantities, collusion, and access authorization of the individual. Indicators, or data elements and information of significance to detection of unprevented theft, are identified by means of DPA. Indicator sensitivity is developed in terms of the threshold quantity, the elapsed time between removal and indication and the degree of localization of facility area and personnel given by the indicator. Evaluation of facility internal controls in light of these sensitivities defines the capability of interrupting identified adversary action sequences related to acquisition of material at fixed sites associated with the identified potential vulnerabilities. Corrective measures can, in many cases, also be prescribed for management consideration and action. DPA theory and concepts have been developing over the last several years, and initial field testing proved both the feasibility and practicality of the procedure. Follow-on implementation testing verified the ability of facility personnel to perform DPA

  8. Advanced Power Plant Development and Analysis Methodologies

    Energy Technology Data Exchange (ETDEWEB)

    A.D. Rao; G.S. Samuelsen; F.L. Robson; B. Washom; S.G. Berenyi

    2006-06-30

    Under the sponsorship of the U.S. Department of Energy/National Energy Technology Laboratory, a multi-disciplinary team led by the Advanced Power and Energy Program of the University of California at Irvine is defining the system engineering issues associated with the integration of key components and subsystems into advanced power plant systems with goals of achieving high efficiency and minimized environmental impact while using fossil fuels. These power plant concepts include 'Zero Emission' power plants and the 'FutureGen' H2 co-production facilities. The study is broken down into three phases. Phase 1 of this study consisted of utilizing advanced technologies that are expected to be available in the 'Vision 21' time frame such as mega scale fuel cell based hybrids. Phase 2 includes current state-of-the-art technologies and those expected to be deployed in the nearer term such as advanced gas turbines and high temperature membranes for separating gas species and advanced gasifier concepts. Phase 3 includes identification of gas turbine based cycles and engine configurations suitable to coal-based gasification applications and the conceptualization of the balance of plant technology, heat integration, and the bottoming cycle for analysis in a future study. Also included in Phase 3 is the task of acquiring/providing turbo-machinery in order to gather turbo-charger performance data that may be used to verify simulation models as well as establishing system design constraints. The results of these various investigations will serve as a guide for the U. S. Department of Energy in identifying the research areas and technologies that warrant further support.

  9. Risk analysis methodologies for the transportation of radioactive materials

    International Nuclear Information System (INIS)

    Geffen, C.A.

    1983-05-01

    Different methodologies have evolved for consideration of each of the many steps required in performing a transportation risk analysis. Although there are techniques that attempt to consider the entire scope of the analysis in depth, most applications of risk assessment to the transportation of nuclear fuel cycle materials develop specific methodologies for only one or two parts of the analysis. The remaining steps are simplified for the analyst by narrowing the scope of the effort (such as evaluating risks for only one material, or a particular set of accident scenarios, or movement over a specific route); performing a qualitative rather than a quantitative analysis (probabilities may be simply ranked as high, medium or low, for instance); or assuming some generic, conservative conditions for potential release fractions and consequences. This paper presents a discussion of the history and present state-of-the-art of transportation risk analysis methodologies. Many reports in this area were reviewed as background for this presentation. The literature review, while not exhaustive, did result in a complete representation of the major methods used today in transportation risk analysis. These methodologies primarily include the use of severity categories based on historical accident data, the analysis of specifically assumed accident sequences for the transportation activity of interest, and the use of fault or event tree analysis. Although the focus of this work has generally been on potential impacts to public groups, some effort has been expended in the estimation of risks to occupational groups in transportation activities

  10. CONTENT ANALYSIS, DISCOURSE ANALYSIS, AND CONVERSATION ANALYSIS: PRELIMINARY STUDY ON CONCEPTUAL AND THEORETICAL METHODOLOGICAL DIFFERENCES

    Directory of Open Access Journals (Sweden)

    Anderson Tiago Peixoto Gonçalves

    2016-08-01

    This theoretical essay aims to reflect on three models of text interpretation used in qualitative research, which are often confused in their concepts and methodologies (Content Analysis, Discourse Analysis, and Conversation Analysis). After presenting the concepts, the essay proposes a preliminary discussion of the conceptual and theoretical methodological differences perceived between them. A review of the literature was performed to support the conceptual and theoretical methodological discussion. It could be verified that the models differ in the type of strategy used in the treatment of texts, the type of approach, and the appropriate theoretical position.

  11. Development of a Long Term Cooling Analysis Methodology Using RELAP5

    International Nuclear Information System (INIS)

    Lee, S. I.; Jeong, J. H.; Ban, C. H.; Oh, S. J.

    2012-01-01

    Since the revision of 10CFR50.46 in 1988, which allowed BE (Best-Estimate) methods in analyzing the safety performance of a nuclear power plant, safety analysis methodologies have been changing continuously from conservative EM (Evaluation Model) approaches to BE ones. In this context, LSC (Long-Term core Cooling) methodologies have been reviewed by the regulatory bodies of the USA and Korea. Some non-conservatisms and improper aspects of the old methodology were identified, and as a result, the USNRC suspended the approval of CENPD-254-P-A, which is the old LSC methodology for CE-designed NPPs. Regulatory bodies requested that the non-conservatisms be removed and that system transient behavior be reflected in all the LSC methodologies used. In the present study, a new LSC methodology using RELAP5 is developed. RELAP5 and a newly developed code, BACON (Boric Acid Concentration Of Nuclear power plant), are used to calculate the transient behavior of the system and the boric acid concentration, respectively. The full range of the break spectrum is considered and the applicability is confirmed through plant demonstration calculations. The results compare well with those of the old methodology; therefore, the methodology could be applied with no significant changes to current LSC plans

  12. Safety analysis and evaluation methodology for fusion systems

    International Nuclear Information System (INIS)

    Fujii-e, Y.; Kozawa, Y.; Namba, C.

    1987-03-01

    Fusion systems, which are under development as future energy systems, have reached a stage at which break-even is expected to be realized in the near future. It is desirable to demonstrate that fusion systems are well accepted by society. There are three crucial viewpoints for measuring acceptability: technological feasibility, economy and safety. These three points are closely interrelated. The safety problem has become more important since the three large-scale tokamaks, JET, TFTR and JT-60, started experiments, and tritium will be introduced into some of them as the fusion fuel. It is desirable to establish a methodology to resolve the safety-related issues in harmony with the technological evolution. The most promising fusion system concept for reactors is not yet settled. The objective of this study is to develop an adequate methodology that promotes the safety design of general fusion systems and to present a basis for proposing R and D themes and establishing the database. A framework of the methodology, the understanding and modeling of fusion systems, the principle of ensuring safety, the safety analysis based on function, and the application of the methodology are discussed. As the result of this study, a methodology for the safety analysis and evaluation of fusion systems was developed. New ideas and approaches were presented in the course of the methodology development. (Kako, I.)

  13. Incorporation of advanced accident analysis methodology into safety analysis reports

    International Nuclear Information System (INIS)

    2003-05-01

    as structural analysis codes and computational fluid dynamics codes (CFD) are applied. The initial code development took place in the sixties and seventies and resulted in a set of quite conservative codes for reactor dynamics, thermal-hydraulics and containment analysis. The most important limitations of these codes came from insufficient knowledge of the physical phenomena and from limited computer memory and speed. Very significant advances have been made in the development of the code systems during the last twenty years in all of the above areas. If the data for the physical models of the code are sufficiently well established and allow a quite realistic analysis, these newer versions are called advanced codes. The assumptions used in deterministic safety analysis vary from very pessimistic to realistic. In accident analysis terminology, it is customary to call the pessimistic assumptions 'conservative' and the realistic assumptions 'best estimate'. The assumptions can refer to the selection of physical models, the introduction of these models into the code, and the initial and boundary conditions, including the performance and failures of the equipment and human actions. The advanced methodology in the present report means the application of advanced (or best estimate) codes, which sometimes represent a combination of various advanced codes for separate stages of the analysis, in some cases in combination with experiments. Safety Analysis Reports are required to be available before and during the operation of the plant in most countries. The contents, scope and stages of the SAR vary among countries. The guide applied in the USA, i.e. Regulatory Guide 1.70, is representative of the way in which SARs are prepared in many countries. During the design phase, a preliminary safety analysis report (PSAR) is requested in many countries, and the final safety analysis report (FSAR) is required for the operating licence. There is

  14. Methodology for flood risk analysis for nuclear power plants

    International Nuclear Information System (INIS)

    Wagner, D.P.; Casada, M.L.; Fussell, J.B.

    1984-01-01

    The methodology for flood risk analysis described here addresses the effects of a flood on nuclear power plant safety systems. Combining the results of this method with the probability of a flood allows the effects of flooding to be included in a probabilistic risk assessment. The five-step methodology includes accident sequence screening to focus the detailed analysis efforts on the accident sequences that are significantly affected by a flood event. The quantitative results include the flood's contribution to system failure probability, accident sequence occurrence frequency and consequence category occurrence frequency. The analysis can be added to existing risk assessments without a significant loss in efficiency. The results of two example applications show the usefulness of the methodology. Both examples rely on the Reactor Safety Study for the required risk assessment inputs and present changes in the Reactor Safety Study results as a function of flood probability
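
    A minimal numerical sketch of the combination step described above (folding the flood frequency and flood-conditional failure probabilities into an accident sequence frequency) is given below. The system names and all numbers are invented for illustration and do not come from the cited study.

      # Folding a flood event into an accident sequence frequency (illustrative numbers).
      flood_frequency = 1.0e-3                             # floods per year at the site (assumed)
      p_fail_given_flood = {"ECCS": 0.2, "AFW": 0.05}      # conditional failure probabilities
      p_fail_no_flood = {"ECCS": 1.0e-3, "AFW": 5.0e-4}
      nonflood_initiator_frequency = 0.1                   # per year (assumed)

      # Sequence requiring failure of both systems, with and without the flood.
      seq_freq_flood = flood_frequency * p_fail_given_flood["ECCS"] * p_fail_given_flood["AFW"]
      seq_freq_base = nonflood_initiator_frequency * p_fail_no_flood["ECCS"] * p_fail_no_flood["AFW"]

      print(f"flood contribution: {seq_freq_flood:.2e}/yr vs baseline {seq_freq_base:.2e}/yr")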

  15. Simplified methodology for analysis of Angra-1 containment

    International Nuclear Information System (INIS)

    Neves Conti, T. das; Souza, A.L. de; Sabundjian, G.

    1988-01-01

    A simplified analysis methodology was developed to simulate a Large Break Loss of Coolant Accident in the Angra 1 Nuclear Power Station. Using the RELAP5/MOD1, RELAP4/MOD5 and CONTEMPT-LT codes, the time variation of pressure and temperature in the containment was analysed. The data obtained were compared with the Angra 1 Final Safety Analysis Report and with those calculated by a detailed model. The results obtained by this new methodology, such as the small computational simulation time, were satisfactory for a preliminary evaluation of the Angra 1 global parameters. (author) [pt

  16. Methodological developments and applications of neutron activation analysis

    International Nuclear Information System (INIS)

    Kucera, J.

    2007-01-01

    The paper reviews the author's experience acquired and achievements made in methodological developments of neutron activation analysis (NAA) of mostly biological materials. These involve epithermal neutron activation analysis, radiochemical neutron activation analysis using both single- and multi-element separation procedures, use of various counting modes, and the development and use of the self-verification principle. The role of NAA in the detection of analytical errors is discussed and examples of applications of the procedures developed are given. (author)

  17. PIXE methodology of rare earth element analysis and its applications

    International Nuclear Information System (INIS)

    Ma Xinpei

    1992-01-01

    The Proton Induced X-ray Emission (PIXE) methodology for rare earth element (REE) analysis is discussed, including the significance of REE analysis, the principle of PIXE applied to REEs, the selection of characteristic X-rays for lanthanide series elements, the deconvolution of highly overlapped PIXE spectra, and the minimum detection limit (MDL) of REEs. Some practical applications are presented, and the special features of PIXE analysis of high-purity REE chemicals are discussed. (author)

  18. Development of seismic risk analysis methodologies at JAERI

    International Nuclear Information System (INIS)

    Tanaka, T.; Abe, K.; Ebisawa, K.; Oikawa, T.

    1988-01-01

    The usefulness of probabilistic safety assessment (PSA) is recognized worldwide for the balanced design and regulation of nuclear power plants. In Japan, the Japan Atomic Energy Research Institute (JAERI) has been engaged in developing the methodologies necessary for carrying out PSA. The research and development program was started in 1980. At that time the effort was devoted only to internal initiator PSA. In 1985 the program was expanded to include external event analysis. Although this expanded program is to cover various external initiators, the current effort is dedicated to seismic risk analysis. There are three levels of seismic PSA, similar to internal initiator PSA: Level 1, evaluation of core damage frequency; Level 2, evaluation of radioactive release frequency and source terms; and Level 3, evaluation of environmental consequences. In JAERI's program, only the methodologies for Level 1 seismic PSA are under development. The methodology development for seismic risk analysis is divided into two phases. The Phase I study is to establish a whole set of simple methodologies based on currently available data. In Phase II, a sensitivity study will be carried out to identify the parameters whose uncertainty may result in large uncertainty in seismic risk, and for such parameters the methodology will be upgraded. The Phase I study has now almost been completed. In this report, outlines of the study and some of its outcomes are described

  19. A Global Sensitivity Analysis Methodology for Multi-physics Applications

    Energy Technology Data Exchange (ETDEWEB)

    Tong, C H; Graziani, F R

    2007-02-02

    Experiments are conducted to draw inferences about an entire ensemble based on a selected number of observations. This applies to both physical experiments as well as computer experiments, the latter of which are performed by running the simulation models at different input configurations and analyzing the output responses. Computer experiments are instrumental in enabling model analyses such as uncertainty quantification and sensitivity analysis. This report focuses on a global sensitivity analysis methodology that relies on a divide-and-conquer strategy and uses intelligent computer experiments. The objective is to assess qualitatively and/or quantitatively how the variabilities of simulation output responses can be accounted for by input variabilities. We address global sensitivity analysis in three aspects: methodology, sampling/analysis strategies, and an implementation framework. The methodology consists of three major steps: (1) construct credible input ranges; (2) perform a parameter screening study; and (3) perform a quantitative sensitivity analysis on a reduced set of parameters. Once identified, research effort should be directed to the most sensitive parameters to reduce their uncertainty bounds. This process is repeated with tightened uncertainty bounds for the sensitive parameters until the output uncertainties become acceptable. To accommodate the needs of multi-physics application, this methodology should be recursively applied to individual physics modules. The methodology is also distinguished by an efficient technique for computing parameter interactions. Details for each step will be given using simple examples. Numerical results on large scale multi-physics applications will be available in another report. Computational techniques targeted for this methodology have been implemented in a software package called PSUADE.
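
    As a hedged illustration of the quantitative step of such a methodology, the sketch below estimates first-order variance-based (Sobol') sensitivity indices with a simple pick-and-freeze Monte Carlo estimator. The stand-in model function and uniform input ranges are assumptions, not the multi-physics application or the PSUADE implementation referred to in the report.

      # First-order Sobol' indices via a pick-and-freeze Monte Carlo estimator.
      import numpy as np

      def model(x):
          # Stand-in for the simulation code's response; replace with the real model.
          return x[:, 0] + 2.0 * x[:, 1] ** 2 + 0.5 * x[:, 0] * x[:, 2]

      def first_order_sobol(model, n_params, n_samples=20000, seed=0):
          rng = np.random.default_rng(seed)
          A = rng.uniform(0.0, 1.0, size=(n_samples, n_params))    # credible input ranges
          B = rng.uniform(0.0, 1.0, size=(n_samples, n_params))
          fA, fB = model(A), model(B)
          total_var = np.var(np.concatenate([fA, fB]))
          indices = []
          for i in range(n_params):
              AB = A.copy()
              AB[:, i] = B[:, i]                                    # swap only the i-th input
              fAB = model(AB)
              indices.append(np.mean(fB * (fAB - fA)) / total_var)  # Saltelli-type estimator
          return np.array(indices)

      print(first_order_sobol(model, n_params=3))   # ranks inputs for the screening step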

  20. Taipower's transient analysis methodology for pressurized water reactors

    International Nuclear Information System (INIS)

    Huang, Pinghue

    1998-01-01

    The methodology presented in this paper is a part of 'Taipower's Reload Design and Transient Analysis Methodologies for Light Water Reactors' developed by the Taiwan Power Company (TPC) and the Institute of Nuclear Energy Research. This methodology utilizes four computer codes developed or sponsored by the Electric Power Research Institute: the system transient analysis code RETRAN-02, the core thermal-hydraulic analysis code COBRAIIIC, the three-dimensional spatial kinetics code ARROTTA, and the fuel rod evaluation code FREY. Each of the computer codes was extensively validated. Analysis methods and modeling techniques were conservatively established for each application using a systematic evaluation with the assistance of sensitivity studies. The qualification results and analysis methods were documented in detail in TPC topical reports. The topical reports for COBRAIIIC, ARROTTA and FREY have been reviewed and approved by the Atomic Energy Council (AEC). TPC's in-house transient methodology has been successfully applied to provide valuable support for many operational issues and plant improvements for TPC's Maanshan Units 1 and 2. Major applications include the removal of the resistance temperature detector bypass system, the relaxation of the hot-full-power moderator temperature coefficient design criteria imposed by the ROC AEC due to a concern about Anticipated Transient Without Scram, the reduction of the boron injection tank concentration and the elimination of the heat tracing, and the reduction of reactor coolant system flow. (author)

  1. A methodology for radiological accidents analysis in industrial gamma radiography

    International Nuclear Information System (INIS)

    Silva, F.C.A. da.

    1990-01-01

    A critical review of 34 published severe radiological accidents in industrial gamma radiography, which happened in 15 countries from 1960 to 1988, was performed. The most frequent causes, consequences and dose estimation methods were analysed, aiming to establish better radiation safety and accident analysis procedures. The objective of this work is to elaborate a radiological accident analysis methodology for industrial gamma radiography. The suggested methodology will enable professionals to determine the true causes of an event and to estimate the dose with good certainty. The technical analytical tree, recommended by the International Atomic Energy Agency for radiation protection and nuclear safety programmes, was adopted in the elaboration of the suggested methodology. The viability of using the Electron Gamma Shower 4 Computer Code System to calculate the absorbed dose in radiological accidents in industrial gamma radiography, mainly in 192Ir radioactive source handling situations, was also studied. (author)

  2. Disposal criticality analysis methodology's principal isotope burnup credit

    International Nuclear Information System (INIS)

    Doering, T.W.; Thomas, D.A.

    2001-01-01

    This paper presents the burnup credit aspects of the United States Department of Energy Yucca Mountain Project's methodology for performing criticality analyses for commercial light-water-reactor fuel. The disposal burnup credit methodology uses a 'principal isotope' model, which takes credit for the reduced reactivity associated with the build-up of the primary principal actinides and fission products in irradiated fuel. Burnup credit is important to the disposal criticality analysis methodology and to the design of commercial fuel waste packages. The burnup credit methodology developed for disposal of irradiated commercial nuclear fuel can also be applied to storage and transportation of irradiated commercial nuclear fuel. For all applications a series of loading curves is developed using a best-estimate methodology and, depending on the application, an additional administrative safety margin may be applied. The burnup credit methodology better represents the 'true' reactivity of the irradiated fuel configuration, and hence the real safety margin, than do evaluations using the 'fresh fuel' assumption. (author)
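
    The loading-curve concept mentioned above can be illustrated with a short sketch: an assembly is acceptable for loading only if its discharge burnup meets or exceeds the minimum burnup required for its initial enrichment. The curve points below are invented for illustration and are not project values.

      # Applying a burnup-credit loading curve: minimum required burnup vs. enrichment.
      import numpy as np

      enrichment_pts = np.array([2.0, 3.0, 4.0, 5.0])      # initial enrichment, wt% U-235
      min_burnup_pts = np.array([0.0, 15.0, 30.0, 45.0])   # required burnup, GWd/MTU

      def assembly_acceptable(enrichment, burnup):
          required = np.interp(enrichment, enrichment_pts, min_burnup_pts)
          return burnup >= required

      print(assembly_acceptable(4.2, 38.0))   # True for these illustrative curve points
      print(assembly_acceptable(4.2, 25.0))   # False: burnup below the curve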

  3. A Local Approach Methodology for the Analysis of Ultimate Strength ...

    African Journals Online (AJOL)

    The local approach methodology, in contrast to classical fracture mechanics, can be used to predict the onset of tearing fracture and the effects of geometry in tubular joints. Finite element analysis of T-joint plate geometries and tubular joints has been performed. The parameters of constraint, equivalent stress, plastic strain and ...

  4. Methodology for reactor core physics analysis - part 2

    International Nuclear Information System (INIS)

    Ponzoni Filho, P.; Fernandes, V.B.; Lima Bezerra, J. de; Santos, T.I.C.

    1992-12-01

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameter involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs

  5. Human Schedule Performance, Protocol Analysis, and the "Silent Dog" Methodology

    Science.gov (United States)

    Cabello, Francisco; Luciano, Carmen; Gomez, Inmaculada; Barnes-Holmes, Dermot

    2004-01-01

    The purpose of the current experiment was to investigate the role of private verbal behavior on the operant performances of human adults, using a protocol analysis procedure with additional methodological controls (the "silent dog" method). Twelve subjects were exposed to fixed ratio 8 and differential reinforcement of low rate 3-s schedules. For…

  6. Science-based HRA: experimental comparison of operator performance to IDAC (Information-Decision-Action Crew) simulations

    Energy Technology Data Exchange (ETDEWEB)

    Shirley, Rachel [The Ohio State Univ., Columbus, OH (United States); Smidts, Carol [The Ohio State Univ., Columbus, OH (United States); Boring, Ronald [Idaho National Lab. (INL), Idaho Falls, ID (United States); Li, Yuandan [Univ. of Maryland, College Park, MD (United States); Mosleh, Ali [Univ. of Maryland, College Park, MD (United States)

    2015-02-01

    Information-Decision-Action Crew (IDAC) operator model simulations of a Steam Generator Tube Rupture are compared to student operator performance in studies conducted in the Ohio State University's Nuclear Power Plant Simulator Facility. This study is presented as a prototype for conducting simulator studies to validate key aspects of Human Reliability Analysis (HRA) methods. Seven student operator crews are compared to simulation results for crews designed to demonstrate three different decision-making strategies. The IDAC model used in the simulations is modified slightly to capture novice behavior rather than that of expert operators. Operator actions and scenario pacing are compared. A preliminary review of available performance shaping factors (PSFs) is presented. After the scenario in the NPP Simulator Facility, student operators review a video of the scenario and evaluate six PSFs at pre-determined points in the scenario. This provides a dynamic record of the PSFs experienced by the OSU student operators. In this preliminary analysis, the Time Constraint Load (TCL) calculated in the IDAC simulations is compared to the TCL reported by student operators. We identify potential modifications to the IDAC model to develop an “IDAC Student Operator Model.” This analysis provides insights into how similar experiments could be conducted using expert operators to improve the fidelity of IDAC simulations.
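
    A comparison of this kind can be reduced to aligning a simulated performance shaping factor trace with operator self-reports at the pre-determined scenario points and computing simple agreement measures. The sketch below is a hypothetical illustration; the time points and TCL values are made up and are not the study's data.

      # Aligning a simulated TCL trace with operator self-reports (made-up data).
      import numpy as np

      sim_time = np.array([0, 60, 120, 300, 600, 900])     # s, simulation output times
      sim_tcl = np.array([0.1, 0.3, 0.7, 0.8, 0.6, 0.4])   # simulated time constraint load

      report_time = np.array([90, 300, 600, 900])           # s, questionnaire points
      reported_tcl = np.array([0.4, 0.9, 0.5, 0.3])         # operator-rated TCL (rescaled)

      sim_at_reports = np.interp(report_time, sim_time, sim_tcl)
      corr = np.corrcoef(sim_at_reports, reported_tcl)[0, 1]
      rmse = np.sqrt(np.mean((sim_at_reports - reported_tcl) ** 2))
      print(f"correlation = {corr:.2f}, RMSE = {rmse:.2f}")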

  7. PROBLEMS AND METHODOLOGY OF THE PETROLOGIC ANALYSIS OF COAL FACIES.

    Science.gov (United States)

    Chao, Edward C.T.

    1983-01-01

    This condensed synthesis gives a broad outline of the methodology of coal facies analysis, procedures for constructing sedimentation and geochemical formation curves, and micro- and macrostratigraphic analysis. The hypothetical coal bed profile has a 3-fold cycle of material characteristics. Based on studies of other similar profiles of the same coal bed, and on field studies of the sedimentary rock types and their facies interpretation, one can assume that the 3-fold subdivision is of regional significance.

  8. Disposal criticality analysis methodology for fissile waste forms

    International Nuclear Information System (INIS)

    Davis, J.W.; Gottlieb, P.

    1998-03-01

    A general methodology has been developed to evaluate the criticality potential of the wide range of waste forms planned for geologic disposal. The range of waste forms includes commercial spent fuel, high level waste, DOE spent fuel (including highly enriched), MOX using weapons-grade plutonium, and immobilized plutonium. The disposal of these waste forms will be in a container with sufficiently thick corrosion-resistant barriers to prevent water penetration for up to 10,000 years. The criticality control for DOE spent fuel is primarily provided by neutron absorber material incorporated into the basket holding the individual assemblies. For the immobilized plutonium, the neutron absorber material is incorporated into the waste form itself. The disposal criticality analysis methodology includes the analysis of geochemical and physical processes that can breach the waste package and affect the waste forms within. The basic purpose of the methodology is to guide the criticality control features of the waste package design, and to demonstrate that the final design meets the criticality control licensing requirements. The methodology can also be extended to the analysis of criticality consequences (primarily increased radionuclide inventory), which will support the total performance assessment for the repository

  9. Human reliability analysis of control room operators

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L.; Carvalho, Paulo Victor R.; Grecco, Claudio H.S. [Instituto de Engenharia Nuclear (IEN), Rio de Janeiro, RJ (Brazil)

    2005-07-01

    Human reliability is the probability that a person correctly performs a system-required action within a required time period and performs no extraneous action that can degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. Significant progress has been made in the HRA field during the last years, mainly in the nuclear area. Some first-generation HRA methods were developed, such as THERP (Technique for Human Error Rate Prediction). Now an array of so-called second-generation methods is emerging as alternatives, for instance ATHEANA (A Technique for Human Event Analysis). The ergonomics approach has as its tool the ergonomic work analysis. It focuses on the study of the operator's activities in their physical and mental forms, considering at the same time the observed characteristics of the operator and the elements of the work environment as they are presented to and perceived by the operators. The aim of this paper is to propose a methodology to analyze the human reliability of the operators of an industrial plant control room, using a framework that includes the approaches used by ATHEANA, THERP and ergonomic work analysis. (author)
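
    One quantification step such a framework would inherit from THERP is the combination of sub-task error probabilities under the standard THERP dependence levels. The sketch below is a generic illustration with placeholder sub-task values, not the methodology proposed by the authors.

      # Combining sub-task HEPs with the standard THERP dependence equations.
      DEPENDENCE = {
          "zero":     lambda p: p,
          "low":      lambda p: (1 + 19 * p) / 20,
          "moderate": lambda p: (1 + 6 * p) / 7,
          "high":     lambda p: (1 + p) / 2,
          "complete": lambda p: 1.0,
      }

      def task_failure_probability(subtasks):
          """subtasks: list of (nominal_hep, dependence_on_previous_subtask)."""
          p_success = 1.0
          for nominal_hep, dependence in subtasks:
              conditional_hep = DEPENDENCE[dependence](nominal_hep)
              p_success *= 1.0 - conditional_hep
          return 1.0 - p_success

      # Example: a diagnosis step followed by two execution steps (placeholder HEPs).
      print(task_failure_probability([(0.01, "zero"), (0.003, "low"), (0.003, "moderate")]))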

  10. Two methodologies for optical analysis of contaminated engine lubricants

    International Nuclear Information System (INIS)

    Aghayan, Hamid; Yang, Jun; Bordatchev, Evgueni

    2012-01-01

    The performance, efficiency and lifetime of modern combustion engines significantly depend on the quality of the engine lubricants. However, contaminants, such as gasoline, moisture, coolant and wear particles, reduce the life of engine mechanical components and lubricant quality. Therefore, direct and indirect measurements of engine lubricant properties, such as physical-mechanical, electro-magnetic, chemical and optical properties, are intensively utilized in engine condition monitoring systems and sensors developed within the last decade. Such sensors for the measurement of engine lubricant properties can be used to detect a functional limit of the in-use lubricant, increase drain interval and reduce the environmental impact. This paper proposes two new methodologies for the quantitative and qualitative analysis of the presence of contaminants in the engine lubricants. The methodologies are based on optical analysis of the distortion effect when an object image is obtained through a thin random optical medium (e.g. engine lubricant). The novelty of the proposed methodologies is in the introduction of an object with a known periodic shape behind a thin film of the contaminated lubricant. In this case, an acquired image represents a combined lubricant–object optical appearance, where an a priori known periodic structure of the object is distorted by a contaminated lubricant. In the object shape-based optical analysis, several parameters of an acquired optical image, such as the gray scale intensity of lubricant and object, shape width at object and lubricant levels, object relative intensity and width non-uniformity coefficient are newly proposed. Variations in the contaminant concentration and use of different contaminants lead to the changes of these parameters measured on-line. In the statistical optical analysis methodology, statistical auto- and cross-characteristics (e.g. auto- and cross-correlation functions, auto- and cross-spectrums, transfer function

  11. Methodology for dimensional variation analysis of ITER integrated systems

    International Nuclear Information System (INIS)

    Fuentes, F. Javier; Trouvé, Vincent; Cordier, Jean-Jacques; Reich, Jens

    2016-01-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  12. Methodology for dimensional variation analysis of ITER integrated systems

    Energy Technology Data Exchange (ETDEWEB)

    Fuentes, F. Javier, E-mail: FranciscoJavier.Fuentes@iter.org [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France); Trouvé, Vincent [Assystem Engineering & Operation Services, rue J-M Jacquard CS 60117, 84120 Pertuis (France); Cordier, Jean-Jacques; Reich, Jens [ITER Organization, Route de Vinon-sur-Verdon—CS 90046, 13067 St Paul-lez-Durance (France)

    2016-11-01

    Highlights: • Tokamak dimensional management methodology, based on 3D variation analysis, is presented. • Dimensional Variation Model implementation workflow is described. • Methodology phases are described in detail. The application of this methodology to the tolerance analysis of ITER Vacuum Vessel is presented. • Dimensional studies are a valuable tool for the assessment of Tokamak PCR (Project Change Requests), DR (Deviation Requests) and NCR (Non-Conformance Reports). - Abstract: The ITER machine consists of a large number of complex systems highly integrated, with critical functional requirements and reduced design clearances to minimize the impact in cost and performances. Tolerances and assembly accuracies in critical areas could have a serious impact in the final performances, compromising the machine assembly and plasma operation. The management of tolerances allocated to part manufacture and assembly processes, as well as the control of potential deviations and early mitigation of non-compliances with the technical requirements, is a critical activity on the project life cycle. A 3D tolerance simulation analysis of ITER Tokamak machine has been developed based on 3DCS dedicated software. This integrated dimensional variation model is representative of Tokamak manufacturing functional tolerances and assembly processes, predicting accurate values for the amount of variation on critical areas. This paper describes the detailed methodology to implement and update the Tokamak Dimensional Variation Model. The model is managed at system level. The methodology phases are illustrated by its application to the Vacuum Vessel (VV), considering the status of maturity of VV dimensional variation model. The following topics are described in this paper: • Model description and constraints. • Model implementation workflow. • Management of input and output data. • Statistical analysis and risk assessment. The management of the integration studies based on

  13. The Development of Marine Accidents Human Reliability Assessment Approach: HEART Methodology and MOP Model

    Directory of Open Access Journals (Sweden)

    Ludfi Pratiwi Bowo

    2017-06-01

    Humans are one of the important factors in the assessment of accidents, particularly marine accidents. Hence, studies are conducted to assess the contribution of human factors to accidents. Two generations of Human Reliability Assessment (HRA) methodologies have been developed; they are classified as first and second generation according to their differing viewpoints on problem-solving. Accident analysis can be carried out with three types of technique: sequential, epidemiological and systemic, with marine accidents falling under the epidemiological technique. This study compares the Human Error Assessment and Reduction Technique (HEART) methodology and the 4M Overturned Pyramid (MOP) model, which are applied to assess marine accidents. The MOP model can effectively describe the relationships of other factors that affect the accidents, whereas the HEART methodology focuses only on human factors.
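
    The HEART quantification referred to above multiplies a generic task unreliability by a factor for each applicable error-producing condition (EPC). The sketch below illustrates that calculation; the numerical values are placeholders rather than figures from the cited study.

      # HEART quantification: generic task unreliability scaled by each EPC's effect.
      def heart_hep(generic_task_unreliability, epcs):
          """epcs: list of (max_effect_multiplier, assessed_proportion_of_affect)."""
          hep = generic_task_unreliability
          for max_effect, proportion in epcs:
              hep *= (max_effect - 1.0) * proportion + 1.0
          return min(hep, 1.0)   # probabilities are capped at 1

      # Example: a routine task (nominal 0.003) under time shortage and inexperience.
      print(f"assessed HEP = {heart_hep(0.003, [(11.0, 0.4), (3.0, 0.5)]):.4f}")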

  14. Development of Audit Calculation Methodology for RIA Safety Analysis

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Joosuk; Kim, Gwanyoung; Woo, Swengwoong [Korea Institute of Nuclear Safety, Daejeon (Korea, Republic of)

    2015-05-15

    The interim criteria contain more stringent limits than the previous ones. For example, pellet-to-cladding mechanical interaction (PCMI) was introduced as a new failure criterion, and both short-term (e.g. fuel-to-coolant interaction, rod burst) and long-term (e.g. fuel rod ballooning, flow blockage) phenomena should be addressed for core coolability assurance. For dose calculations, transient-induced fission gas release additionally has to be accounted for. Traditionally, the approved RIA analysis methodologies for licensing application have been developed based on a conservative approach, but the newly introduced safety criteria tend to reduce the margins to the criteria. Licensees are therefore trying to improve the margins by utilizing a less conservative approach. To cope with this trend, a new audit calculation methodology needs to be developed. In this paper, the new methodology, which is currently under development at KINS, is introduced. For the development of an audit calculation methodology for RIA safety analysis based on the realistic evaluation approach, a preliminary calculation utilizing a best-estimate code has been performed on the initial core of APR1400. The main conclusions follow. - With the assumption of a single full-strength control rod ejection in the HZP condition, rod failure due to PCMI is not predicted. - Coolability can be assured in view of enthalpy and fuel melting. - However, rod failure due to DNBR is expected, and there is also a possibility of fuel failure at rated power conditions.

  15. Absorption, distribution, and excretion of 8-methoxypsoralen in HRA/Skh mice

    International Nuclear Information System (INIS)

    Muni, I.A.; Schneider, F.H.; Olsson, T.A. III; King, M.

    1984-01-01

    The tissue distribution and excretion of [3H]8-methoxypsoralen (8-MOP), a well-accepted therapeutic agent for the treatment of psoriasis, was studied in hairless HRA/Skh female mice. Mice were given single oral doses of 6 mg of [3H]8-MOP or 5-[14C]8-MOP/kg in corn oil. Radiochemical analyses of tissues and excreta were accomplished by liquid scintillation counting. The 8-MOP appeared to be rapidly absorbed through the gastrointestinal tract, where the tritium levels were highest, followed by skin, blood, and liver; levels were lowest in fat (adipose tissue). In female HRA/Skh mice which had not been irradiated with UVA (320-400 nm), 84% of the carbon-14 and 58% of the tritium were recovered in the urine and feces within 24 hours of oral administration of 5-[14C]8-MOP or [3H]8-MOP, respectively. Animals that were exposed to UVA and received [3H]8-MOP excreted approximately 12% less tritium in the urine and feces compared with the animals which received no UVA

  16. Latest developments on safety analysis methodologies at the Juzbado plant

    International Nuclear Information System (INIS)

    Zurron-Cifuentes, Oscar; Ortiz-Trujillo, Diego; Blanco-Fernandez, Luis A.

    2010-01-01

    Over the last few years the Juzbado Plant has developed and implemented several analysis methodologies to cope with specific issues regarding safety management. This paper describes the three most outstanding of them, namely the Integrated Safety Analysis (ISA) project, the adaptation of the MARSSIM methodology for characterization surveys of radioactive contamination spots, and the programme for the Systematic Review of the Operational Conditions of the Safety Systems (SROCSS). Several reasons motivated the decision to implement such methodologies, such as regulatory requirements, operational experience and, of course, the strong commitment of ENUSA to maintaining the highest standards of the nuclear industry in all safety-relevant activities. In this context, since 2004 ENUSA has been undertaking the ISA project, which consists of a systematic examination of the plant's processes, equipment, structures and personnel activities to ensure that all relevant hazards that could result in unacceptable consequences have been adequately evaluated and the appropriate protective measures have been identified. On the other hand, within the framework of a current programme to ensure the absence of radioactive contamination spots in unintended areas, the MARSSIM methodology is being applied as a tool to conduct the radiation surveys and investigation of potentially contaminated areas. Finally, the SROCSS programme was initiated earlier in 2009 to assess the actual operating conditions of all systems with safety relevance, aiming to identify either potential non-conformities or areas for improvement in order to ensure their high performance after years of operation. The following paragraphs describe the key points related to these three methodologies as well as an outline of the results obtained so far. (authors)

  17. Review of cause-based decision tree approach for the development of domestic standard human reliability analysis procedure in low power/shutdown operation probabilistic safety assessment

    International Nuclear Information System (INIS)

    Kang, D. I.; Jung, W. D.

    2003-01-01

    We review the Cause-Based Decision Tree (CBDT) approach to decide whether or not to incorporate it into the development of a domestic standard Human Reliability Analysis (HRA) procedure for low power/shutdown operation Probabilistic Safety Assessment (PSA). In this paper, we introduce the cause-based decision tree approach, quantify human errors using it, and identify its merits and demerits in comparison with the previously used THERP. The review results show that it is difficult to incorporate the CBDT method into the development of the domestic standard HRA procedure for low power/shutdown PSA because the CBDT method, like THERP, requires the subjective judgment of the HRA analyst. However, it is expected that incorporating the CBDT method into the development of the domestic standard HRA procedure, if only for the comparison of quantitative HRA results, will relieve the burden of developing a detailed HRA procedure and will help maintain consistent quantitative HRA results

  18. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej; Pereší ni, Peter; Kostić, Dejan; Canini, Marco

    2018-01-01

    and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a

  19. A human error probability estimate methodology based on fuzzy inference and expert judgment on nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Mesquita, R.N. de

    2009-01-01

    Recent studies point to human error as an important factor in many industrial and nuclear accidents: Three Mile Island (1979), Bhopal (1984), Chernobyl and Challenger (1986) are classical examples. Human contributions to these accidents may be better understood and analyzed by using Human Reliability Analysis (HRA), which has been taken as an essential part of Probabilistic Safety Analysis (PSA) of nuclear plants. Both HRA and PSA depend on Human Error Probabilities (HEP) for quantitative analysis. These probabilities are strongly affected by Performance Shaping Factors (PSF), which have a direct effect on human behavior and thus shape the HEP according to the specific environmental conditions and the personal characteristics of the individuals responsible for the actions. This PSF dependence raises a serious data availability problem, as it makes the scarce existing databases either too generic or too specific. Besides this, most nuclear plants do not keep historical records of human error occurrences. Therefore, in order to overcome this data shortage, a methodology based on fuzzy inference and expert judgment was employed in this paper to determine human error occurrence probabilities and to evaluate the PSFs of actions performed by operators in a nuclear power plant (the IEA-R1 nuclear reactor). The obtained HEP values were compared with reference tabulated data from the current literature in order to show the coherence and validity of the approach. This comparison leads to the conclusion that the results of this work can be employed both in HRA and PSA, enabling efficient identification of potential improvements in plant safety conditions, operational procedures and local working conditions (author)
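
    A minimal sketch of the kind of fuzzy inference the abstract describes is given below: triangular membership functions fuzzify two illustrative PSFs, a small rule base is evaluated, and a weighted average defuzzifies the result into a crisp HEP. The chosen PSFs, rules and output anchors are assumptions, not the paper's model of the IEA-R1 operators.

      # Illustrative fuzzy inference from two PSFs to a crisp HEP estimate.
      def tri(x, a, b, c):
          """Triangular membership function with support [a, c] and peak at b."""
          if x <= a or x >= c:
              return 0.0
          return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

      def fuzzy_hep(time_pressure, experience):
          """Inputs on a 0-10 scale; returns a crisp HEP estimate."""
          # Fuzzification of the two PSFs.
          tp = {"low": tri(time_pressure, -1, 0, 5), "high": tri(time_pressure, 5, 10, 11)}
          ex = {"low": tri(experience, -1, 0, 5), "high": tri(experience, 5, 10, 11)}
          # Rule base: rule strength (min of antecedents) -> representative HEP anchor.
          rules = [
              (min(tp["high"], ex["low"]), 0.1),    # worst case
              (min(tp["high"], ex["high"]), 0.01),
              (min(tp["low"], ex["low"]), 0.01),
              (min(tp["low"], ex["high"]), 0.001),  # best case
          ]
          num = sum(strength * hep for strength, hep in rules)
          den = sum(strength for strength, _ in rules)
          return num / den if den > 0 else 0.001

      print(fuzzy_hep(time_pressure=8, experience=3))   # high stress, low experience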

  20. Snapshot analysis for rhodium fixed incore detector using BEACON methodology

    International Nuclear Information System (INIS)

    Cha, Kyoon Ho; Choi, Yu Sun; Lee, Eun Ki; Park, Moon Ghu; Morita, Toshio; Heibel, Michael D.

    2004-01-01

    The purpose of this report is to process the rhodium detector data of the Yonggwang nuclear unit 4 cycle 5 core for the measured power distribution by using the BEACON methodology. Rhodium snapshots of the YGN 4 cycle 5 have been analyzed by both BEACON/SPINOVA and CECOR to compare the results of both codes. By analyzing a large number of snapshots obtained during normal plant operation. Reviewing the results of this analysis, the BEACON/SPNOVA can be used for the snapshot analysis of Korean Standard Nuclear Power (KSNP) plants

  1. Interpretive Phenomenological Analysis: An Appropriate Methodology for Educational Research?

    Directory of Open Access Journals (Sweden)

    Edward John Noon

    2018-04-01

    Full Text Available Interpretive phenomenological analysis (IPA) is a contemporary qualitative methodology, first developed by psychologist Jonathan Smith (1996). Whilst its roots are in psychology, it is increasingly being drawn upon by scholars in the human, social and health sciences (Charlick, Pincombe, McKellar, & Fielder, 2016). Despite this, IPA has received limited attention across the educationalist literature. Drawing upon my experiences of using IPA to explore the barriers to the use of humour in the teaching of Childhood Studies (Noon, 2017), this paper will discuss its theoretical orientation, sampling, and methods of data collection and analysis, before examining the strengths and weaknesses of IPA's employment in educational research.

  2. A methodology for uncertainty analysis of reference equations of state

    DEFF Research Database (Denmark)

    Cheung, Howard; Frutiger, Jerome; Bell, Ian H.

    We present a detailed methodology for the uncertainty analysis of reference equations of state (EOS) based on Helmholtz energy. In recent years there has been an increased interest in uncertainties of property data and process models of thermal systems. In the literature there are various...... for uncertainty analysis is suggested as a tool for EOS. The uncertainties of the EOS properties are calculated from the experimental values and the EOS model structure through the parameter covariance matrix and subsequent linear error propagation. This allows reporting the uncertainty range (95% confidence...
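
    The core mechanism named in the abstract, linear error propagation of a property through the parameter covariance matrix, can be sketched as follows; the toy property model, parameter values and covariance matrix are assumptions standing in for a real Helmholtz-energy EOS and its regression output.

        # Illustrative first-order (linear) error propagation through a parameter
        # covariance matrix, the generic mechanism described in the abstract.
        # The toy "EOS" below is an assumption; a real Helmholtz-energy EOS would
        # replace property_model(), and cov would come from the data regression.
        import numpy as np

        def property_model(params, T, rho):
            """Hypothetical property (e.g., pressure) as a function of fitted params."""
            a, b, c = params
            return a * rho * T + b * rho**2 + c * rho**3 / T

        def propagate_uncertainty(params, cov, T, rho, h=1e-6):
            """Approximate 95% (2-sigma) uncertainty of the property via J * cov * J^T."""
            params = np.asarray(params, dtype=float)
            jac = np.zeros_like(params)
            for i in range(params.size):            # forward-difference Jacobian
                dp = params.copy()
                dp[i] += h * max(abs(params[i]), 1.0)
                jac[i] = (property_model(dp, T, rho) -
                          property_model(params, T, rho)) / (dp[i] - params[i])
            var = jac @ cov @ jac                    # linear error propagation
            return 2.0 * np.sqrt(var)                # ~95% confidence half-width

        params = np.array([0.287, 1.2e-4, 3.0e-7])   # fitted coefficients (toy values)
        cov = np.diag([1e-6, 1e-10, 1e-16])          # parameter covariance (toy values)
        print(propagate_uncertainty(params, cov, T=300.0, rho=10.0))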

  3. A methodology for strain-based fatigue reliability analysis

    International Nuclear Information System (INIS)

    Zhao, Y.X.

    2000-01-01

    A significant scatter of the cyclic stress-strain (CSS) responses should be noted for a nuclear reactor material, 1Cr18Ni9Ti pipe-weld metal. The existence of this scatter implies that a random cyclic strain history will be applied under any loading mode, even a deterministic loading history. A non-conservative evaluation might be given in practice if the scatter is not considered. A methodology for strain-based fatigue reliability analysis, which takes the scatter into account, is developed. The responses are approximately modeled by probability-based CSS curves of the Ramberg-Osgood relation. The strain-life data are modeled, similarly, by probability-based strain-life curves of the Coffin-Manson law. The reliability assessment is constructed by considering the interference of the random applied fatigue strain and capacity histories. Probability density functions of the applied and capacity histories are given analytically. The methodology can be conveniently extrapolated to the case of a deterministic CSS relation, as the existing methods do. The non-conservatism of the deterministic CSS relation and the availability of the present methodology are demonstrated by an analysis of the material test results
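
    For readers unfamiliar with the two relations named above, the sketch below evaluates the textbook (deterministic) forms of the Ramberg-Osgood and Coffin-Manson equations with assumed coefficients; the paper's probability-based curves fitted for the 1Cr18Ni9Ti weld metal are not reproduced.

        # Textbook forms of the two relations named in the abstract, with assumed
        # (illustrative) coefficients -- not the paper's fitted probabilistic curves.
        import numpy as np
        from scipy.optimize import brentq

        E = 193e3                          # Young's modulus, MPa (assumed)
        K_p, n_p = 1660.0, 0.287           # cyclic strength coeff./exponent (assumed)
        sf, b = 1000.0, -0.114             # fatigue strength coeff./exponent (assumed)
        ef, c = 0.171, -0.402              # fatigue ductility coeff./exponent (assumed)

        def ramberg_osgood_strain(stress_amp):
            """Total strain amplitude for a given stress amplitude (MPa)."""
            return stress_amp / E + (stress_amp / K_p) ** (1.0 / n_p)

        def coffin_manson_life(strain_amp):
            """Reversals to failure 2Nf, solving the strain-life equation numerically."""
            f = lambda two_nf: (sf / E) * two_nf**b + ef * two_nf**c - strain_amp
            return brentq(f, 1.0, 1e12)

        eps = ramberg_osgood_strain(300.0)        # strain amplitude at 300 MPa
        print(eps, coffin_manson_life(eps))       # roughly 0.004 and the matching 2Nf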

  4. ASSESSMENT OF SEISMIC ANALYSIS METHODOLOGIES FOR DEEPLY EMBEDDED NPP STRUCTURES

    International Nuclear Information System (INIS)

    XU, J.; MILLER, C.; COSTANTINO, C.; HOFMAYER, C.; GRAVES, H. NRC.

    2005-01-01

    Several of the new generation nuclear power plant designs have structural configurations which are proposed to be deeply embedded. Since current seismic analysis methodologies have been applied to shallowly embedded structures (e.g., ASCE 4 suggests that simple formulations may be used to model the embedment effect when the depth of embedment is less than 30% of the foundation radius), the US Nuclear Regulatory Commission is sponsoring a program at Brookhaven National Laboratory with the objective of investigating the extent to which procedures acceptable for shallow embedment depths are adequate for larger embedment depths. This paper presents the results of a study comparing the response spectra obtained from two of the more popular analysis methods for structural configurations varying from shallow embedment to complete embedment. A typical safety-related structure embedded in a soil profile representative of a typical nuclear power plant site was utilized in the study, and the depths of burial (DOB) considered range from 25-100% of the height of the structure. Included in the paper are: (1) the description of a simplified analysis and a detailed approach for the SSI analyses of a structure with various DOBs, (2) the comparison of the analysis results for the different DOBs between the two methods, and (3) the performance assessment of the analysis methodologies for SSI analyses of deeply embedded structures. The resulting assessment from this study indicates that simplified methods may be capable of capturing the seismic response for much more deeply embedded structures than would normally be allowed by standard practice

  5. 3-D rod ejection analysis using a conservative methodology

    Energy Technology Data Exchange (ETDEWEB)

    Park, Min Ho; Park, Jin Woo; Park, Guen Tae; Um, Kil Sup; Ryu, Seok Hee; Lee, Jae Il; Choi, Tong Soo [KEPCO, Daejeon (Korea, Republic of)

    2016-05-15

    The point kinetics model, which simplifies the core phenomena and physical specifications, is used for the conventional rod ejection accident analysis. The point kinetics model makes it convenient to assume conservative core parameters, but this simplification loses a large amount of safety margin. The CHASER system couples the three-dimensional core neutron kinetics code ASTRA, the sub-channel analysis code THALES and the fuel performance analysis code FROST. A validation study for the CHASER system is addressed using the NEACRP three-dimensional PWR core transient benchmark problem. A series of conservative rod ejection analyses for the APR1400 type plant is performed for both hot full power (HFP) and hot zero power (HZP) conditions to determine the most limiting cases. The conservative rod ejection analysis methodology is designed to properly consider the important phenomena and physical parameters.
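
    As a contrast to the coupled 3-D approach, the sketch below shows the one-delayed-group point kinetics model that the abstract says is conventionally used; the kinetic parameters and the step reactivity insertion standing in for the ejected rod are assumed round numbers, not APR1400 data.

        # Minimal one-delayed-group point kinetics sketch of a step reactivity
        # insertion (the simplified model the abstract contrasts with 3-D kinetics).
        # beta, lam, Lam and the inserted reactivity are assumed round numbers.
        import numpy as np
        from scipy.integrate import solve_ivp

        beta, lam, Lam = 0.0065, 0.08, 2.0e-5   # delayed fraction, decay const., generation time
        rho_in = 0.5 * beta                     # ejected-rod worth: +0.5 $ (assumption)

        def point_kinetics(t, y):
            n, c = y                            # relative power, precursor concentration
            rho = rho_in if t > 0.1 else 0.0    # step insertion at t = 0.1 s
            dn = (rho - beta) / Lam * n + lam * c
            dc = beta / Lam * n - lam * c
            return [dn, dc]

        y0 = [1.0, beta / (lam * Lam)]          # steady-state initial condition
        sol = solve_ivp(point_kinetics, (0.0, 5.0), y0, max_step=1e-3)
        print(sol.y[0, -1])                     # relative power at the end of the transient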

  6. Methodological challenges in qualitative content analysis: A discussion paper.

    Science.gov (United States)

    Graneheim, Ulla H; Lindgren, Britt-Marie; Lundman, Berit

    2017-09-01

    This discussion paper aims to map content analysis in the qualitative paradigm and explore common methodological challenges. We discuss phenomenological descriptions of manifest content and hermeneutical interpretations of latent content. We demonstrate inductive, deductive, and abductive approaches to qualitative content analysis, and elaborate on the level of abstraction and degree of interpretation used in constructing categories, descriptive themes, and themes of meaning. With increased abstraction and interpretation comes an increased challenge to demonstrate the credibility and authenticity of the analysis. A key issue is to show the logic by which categories and themes are abstracted, interpreted, and connected to the aim and to each other. Qualitative content analysis is an autonomous method and can be used at varying levels of abstraction and interpretation. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Biomass Thermogravimetric Analysis: Uncertainty Determination Methodology and Sampling Maps Generation

    Science.gov (United States)

    Pazó, Jose A.; Granada, Enrique; Saavedra, Ángeles; Eguía, Pablo; Collazo, Joaquín

    2010-01-01

    The objective of this study was to develop a methodology for the determination of the maximum sampling error and confidence intervals of thermal properties obtained from thermogravimetric analysis (TG), including moisture, volatile matter, fixed carbon and ash content. The sampling procedure of the TG analysis was of particular interest and was conducted with care. The results of the present study were compared to those of a prompt analysis, and a correlation between the mean values and maximum sampling errors of the two methods was not observed. In general, low and acceptable levels of uncertainty and error were obtained, demonstrating that the properties evaluated by TG analysis were representative of the overall fuel composition. The accurate determination of the thermal properties of biomass with precise confidence intervals is of particular interest in energetic biomass applications. PMID:20717532
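
    The basic confidence-interval calculation behind a maximum sampling error can be sketched as follows for a single TG property; the replicate values are invented for illustration, and the paper's sampling-map construction is not shown.

        # Basic 95% confidence interval and maximum sampling error for one TG
        # property from replicate determinations; the replicate ash values are
        # invented and the paper's sampling-map generation is not reproduced.
        import numpy as np
        from scipy import stats

        ash_pct = np.array([2.91, 3.05, 2.87, 3.12, 2.98, 3.01])  # replicate ash, wt%

        n = ash_pct.size
        mean = ash_pct.mean()
        sem = ash_pct.std(ddof=1) / np.sqrt(n)          # standard error of the mean
        t_crit = stats.t.ppf(0.975, df=n - 1)           # two-sided 95%, n-1 dof
        max_sampling_error = t_crit * sem

        print(f"ash = {mean:.2f} +/- {max_sampling_error:.2f} wt% (95% CI)")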

  8. Safety analysis methodologies for radioactive waste repositories in shallow ground

    International Nuclear Information System (INIS)

    1984-01-01

    The report is part of the IAEA Safety Series and is addressed to authorities and specialists responsible for or involved in planning, performing and/or reviewing safety assessments of shallow ground radioactive waste repositories. It discusses approaches that are applicable for the safety analysis of a shallow ground repository. The methodologies, analysis techniques and models described are pertinent to the task of predicting the long-term performance of a shallow ground disposal system. They may be used during the processes of selection, confirmation and licensing of new sites and disposal systems, or to evaluate the long-term consequences in the post-sealing phase of existing operating or inactive sites. The analysis may point out the need for remedial action, or provide information to be used in deciding on the duration of surveillance. Safety analyses, both general in nature and specific to a certain repository, site or design concept, are discussed, with emphasis on deterministic and probabilistic studies

  9. Comparative analysis as a basic research orientation: Key methodological problems

    Directory of Open Access Journals (Sweden)

    N P Narbut

    2015-12-01

    Full Text Available To date, the Sociological Laboratory of the Peoples' Friendship University of Russia has accumulated vast experience in the field of cross-cultural studies, reflected in publications based on the results of mass surveys conducted in Moscow, Maikop, Beijing, Guangzhou, Prague, Belgrade, and Pristina. However, these publications mainly focus on comparisons of the empirical data rather than on methodological and technical issues; that is why the aim of this article is to identify the key problems of comparative analysis in cross-cultural studies, which become evident only if you conduct an empirical study yourself - from the first step of setting the problem and having it approved by all the sides (countries) involved, to the last step of interpreting and comparing the data obtained. The authors are sure that no sociologist would ever doubt the necessity and importance of comparative analysis in the broadest sense of the word, but at the same time very few are ready to discuss its key methodological challenges and prefer to ignore them completely. We summarize the problems of comparative analysis in sociology as follows: (1) applying research techniques to the sample in another country - both translating and adapting them to different social realities and worldviews (in particular, the problematic status of standardization and the qualitative approach); (2) choosing the "right" respondents to question and the relevant cases (cultures) to study; (3) designing the research scheme, i.e. justifying the sequence of steps (what should go first - methodology or techniques); (4) accepting procedures that are correct within one country for cross-cultural work (whether or not that is an appropriate choice).

  10. Analysis of gaming community using Soft System Methodology

    OpenAIRE

    Hurych, Jan

    2015-01-01

    This diploma thesis aims to analyse a virtual gaming community and its problems, in the case of the community belonging to the EU server of the game World of Tanks. To solve these problems, the Soft System Methodology (SSM) by P. Checkland is used. The thesis includes an analysis of the significance of gaming communities for the gaming industry as a whole. The gaming community is then defined as a soft system. Three problems are analysed in the practical part of the thesis using a newer version of SSM. One iteration of...

  11. A development of containment performance analysis methodology using GOTHIC code

    Energy Technology Data Exchange (ETDEWEB)

    Lee, B. C.; Yoon, J. I. [Future and Challenge Company, Seoul (Korea, Republic of); Byun, C. S.; Lee, J. Y. [Korea Electric Power Research Institute, Taejon (Korea, Republic of); Lee, J. Y. [Seoul National University, Seoul (Korea, Republic of)

    2003-10-01

    In a situation where the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and the containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses when the inherent conservatism in the CONTEMPT-LT code is taken into account.

  12. A development of containment performance analysis methodology using GOTHIC code

    International Nuclear Information System (INIS)

    Lee, B. C.; Yoon, J. I.; Byun, C. S.; Lee, J. Y.; Lee, J. Y.

    2003-01-01

    In a situation where the well-established containment pressure/temperature analysis code CONTEMPT-LT treats the reactor containment as a single volume, this study introduces, as an alternative, the GOTHIC code for use in multi-compartment containment performance analysis. With the developed GOTHIC methodology, its applicability is verified through a containment performance analysis for Korean Nuclear Unit 1. The GOTHIC model for this plant is simply composed of 3 compartments, including the reactor containment and the RWST. In addition, the containment spray system and the containment recirculation system are simulated. As a result of the GOTHIC calculation, under the same assumptions and conditions as those in CONTEMPT-LT, the GOTHIC prediction shows very good results; the pressure and temperature transients, including their peaks, are nearly the same. It can be concluded that GOTHIC can provide reasonable containment pressure and temperature responses when the inherent conservatism in the CONTEMPT-LT code is taken into account.

  13. METHODOLOGY FOR ANALYSIS OF DECISION MAKING IN AIR NAVIGATION SYSTEM

    Directory of Open Access Journals (Sweden)

    Volodymyr Kharchenko

    2011-03-01

    Full Text Available In the research of the Air Navigation System as a complex socio-technical system, a methodology for the analysis of the human operator's decision-making has been developed. The significance of individual psychological factors, as well as the impact of socio-psychological factors on the professional activities of a human operator during the development of a flight situation from normal to catastrophic, were analyzed. On the basis of the reflexive theory of bipolar choice, the expected risks of decision-making by the Air Navigation System operator influenced by the external environment, previous experience and intentions were identified. Methods for the analysis of decision-making by the human operator of the Air Navigation System using stochastic networks have been developed. Keywords: Air Navigation System, bipolar choice, human operator, decision-making, expected risk, individual psychological factors, methodology of analysis, reflexive model, socio-psychological factors, stochastic network.

  14. Integration of human reliability analysis into the probabilistic risk assessment process: phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1985-01-01

    The US Nuclear Regulatory Commission and Pacific Northwest Laboratory initiated a research program in 1984 to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second

  15. Integration of human reliability analysis into the probabilistic risk assessment process: Phase 1

    International Nuclear Information System (INIS)

    Bell, B.J.; Vickroy, S.C.

    1984-10-01

    A research program was initiated to develop a testable set of analytical procedures for integrating human reliability analysis (HRA) into the probabilistic risk assessment (PRA) process to more adequately assess the overall impact of human performance on risk. In this three-phase program, stand-alone HRA/PRA analytic procedures will be developed and field evaluated to provide improved methods, techniques, and models for applying quantitative and qualitative human error data which systematically integrate HRA principles, techniques, and analyses throughout the entire PRA process. Phase 1 of the program involved analysis of state-of-the-art PRAs to define the structures and processes currently in use in the industry. Phase 2 research will involve developing a new or revised PRA methodology which will enable more efficient regulation of the industry using quantitative or qualitative results of the PRA. Finally, Phase 3 will be to field test those procedures to assure that the results generated by the new methodologies will be usable and acceptable to the NRC. This paper briefly describes the first phase of the program and outlines the second

  16. A powerful methodology for reactor vessel pressurized thermal shock analysis

    International Nuclear Information System (INIS)

    Boucau, J.; Mager, T.

    1994-01-01

    The recent operating experience of the Pressurized Water Reactor (PWR) industry has focused increasing attention on the issue of reactor vessel pressurized thermal shock (PTS). More specifically, the review of the old WWER-type reactors (WWER 440/230) has indicated sensitive behaviour with respect to neutron embrittlement. This has already led to some remedial actions, including safety injection water preheating and vessel annealing. Such measures are usually taken based on the analysis of a selected number of conservative PTS events. Consideration of all postulated cooldown events would draw attention to the impact of operator actions and control system effects on reactor vessel PTS. Westinghouse has developed a methodology which couples event sequence analysis with probabilistic fracture mechanics analyses to identify those events that are of primary concern for reactor vessel integrity. Operating experience is utilized to aid in defining the appropriate event sequences and event frequencies of occurrence for the evaluation. Once the event sequences of concern are identified, detailed deterministic thermal-hydraulic and structural evaluations can be performed to determine the conditions required to minimize the extension of postulated flaws or enhance flaw arrest in the reactor vessel. The results of these analyses can then be used to better define further modifications to vessel and plant system design and to operating procedures. The purpose of the present paper is to describe this methodology and to show its benefits for decision making. (author). 1 ref., 3 figs

  17. Development of a heat exchanger root-cause analysis methodology

    International Nuclear Information System (INIS)

    Jarrel, D.B.

    1989-01-01

    The objective of this work is to determine a generic methodology for approaching the accurate identification of the root cause of component failure. Root-cause determinations are an everyday challenge to plant personnel, but they are handled with widely differing degrees of success owing to differences in approach, level of diagnostic expertise, and documentation. The criterion for success is simple: if the root cause of the failure has truly been determined and corrected, the same causal failure relationship will not be demonstrated again in the future. The approach to root-cause analysis (RCA) element definition was to first selectively choose and constrain a functionally significant component (in this case a component cooling water to service water heat exchanger) that has demonstrated prevalent failures. Then a root-cause-of-failure analysis was performed by a systems engineer on a large number of actual failure scenarios. The analytical process used by the engineer was documented and evaluated to abstract the logic model used to arrive at the root cause. For the case of the heat exchanger, the actual root-cause diagnostic approach is described. A generic methodology for determining the root cause of component failure is demonstrated for this general heat exchanger example

  18. APPROPRIATE ALLOCATION OF CONTINGENCY USING RISK ANALYSIS METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Andi Andi

    2004-01-01

    Full Text Available Many cost overruns in the world of construction are attributable to either unforeseen events or foreseen events for which uncertainty was not appropriately accommodated. It is argued that a significant improvement to project management performance may result from greater attention to the process of analyzing project risks. The objective of this paper is to propose a risk analysis methodology for appropriate allocation of contingency in project cost estimation. In the first step, project risks will be identified. Influence diagramming technique is employed to identify and to show how the risks affect the project cost elements and also the relationships among the risks themselves. The second step is to assess the project costs with regards to the risks under consideration. Using a linguistic approach, the degree of uncertainty of identified project risks is assessed and quantified. The problem of dependency between risks is taken into consideration during this analysis. For the final step, as the main purpose of this paper, a method for allocating appropriate contingency is presented. Two types of contingencies, i.e. project contingency and management reserve are proposed to accommodate the risks. An illustrative example is presented at the end to show the application of the methodology.

  19. Human error probability quantification using fuzzy methodology in nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, Claudio Souza do

    2010-01-01

    This work obtains Human Error Probability (HEP) estimates for operators' actions in response to hypothesized emergency situations at the IEA-R1 Research Reactor at IPEN. An evaluation of Performance Shaping Factors (PSFs) was also carried out in order to classify them according to their level of influence on the operators' actions and to determine the actual states of these PSFs at the plant. Both the HEP estimation and the PSF evaluation were based on expert evaluation using interviews and questionnaires. The expert group was composed of selected IEA-R1 operators. The representation of the experts' knowledge in linguistic variables and the group evaluation values were obtained through Fuzzy Logic and Fuzzy Set Theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative to be used in Human Reliability Analysis (HRA). (author)

  20. Validating analysis methodologies used in burnup credit criticality calculations

    International Nuclear Information System (INIS)

    Brady, M.C.; Napolitano, D.G.

    1992-01-01

    The concept of allowing reactivity credit for the depleted (or burned) state of pressurized water reactor fuel in the licensing of spent fuel facilities introduces a new challenge to members of the nuclear criticality community. The primary difference in this analysis approach is the technical ability to calculate spent fuel compositions (or inventories) and to predict their effect on the system multiplication factor. Isotopic prediction codes are used routinely for in-core physics calculations and the prediction of radiation source terms for both thermal and shielding analyses, but represent an innovation for criticality specialists. This paper discusses two methodologies currently being developed to specifically evaluate isotopic composition and reactivity for the burnup credit concept. A comprehensive approach to benchmarking and validating the methods is also presented. This approach involves the analysis of commercial reactor critical data, fuel storage critical experiments, chemical assay isotopic data, and numerical benchmark calculations

  1. CONTENT ANALYSIS IN PROJECT MANAGEMENT: PROPOSALOF A METHODOLOGICAL FRAMEWORK

    Directory of Open Access Journals (Sweden)

    Alessandro Prudêncio Lukosevicius

    2016-12-01

    Full Text Available Content analysis (CA) is a popular approach among researchers from different areas, but incipient in project management (PM). However, the volume of usage apparently does not translate into application quality. The method receives constant criticism regarding the scientific rigor adopted, especially when led by junior researchers. This article proposes a methodological framework for CA and investigates the use of CA in PM research. To accomplish this goal, a systematic literature review is combined with a CA of 23 articles from the EBSCO database from the last 20 years (1996-2016). The findings showed that the proposed framework can help researchers better apply CA, and suggest that the use of the method, in terms of both quantity and quality, in PM research should be expanded. In addition to the framework, another contribution of this research is an analysis of the use of CA in PM over the last 20 years.

  2. Task analysis and computer aid development for human reliability analysis in nuclear power plants

    Energy Technology Data Exchange (ETDEWEB)

    Yoon, W. C.; Kim, H.; Park, H. S.; Choi, H. H.; Moon, J. M.; Heo, J. Y.; Ham, D. H.; Lee, K. K.; Han, B. T. [Korea Advanced Institute of Science and Technology, Taejeon (Korea)

    2001-04-01

    The importance of human reliability analysis (HRA), which predicts the possibility of error occurrence in quantitative and qualitative manners, is gradually increasing because of the effects of human errors on system safety. HRA needs a task analysis as a prerequisite step, but extant task analysis techniques have the problem that the collection of information about the situation in which the human error occurs depends entirely on the HRA analysts. This problem makes the results of the task analysis inconsistent and unreliable. To remedy this problem, KAERI developed the structural information analysis (SIA) method, which helps to analyze a task's structure and situations systematically. In this study, the SIA method was evaluated by HRA experts, and a prototype computerized supporting system named CASIA (Computer Aid for SIA) was developed for the purpose of supporting HRA performed using the SIA method. Additionally, by applying the SIA method to emergency operating procedures, we derived generic task types used in emergencies and accumulated the analysis results in the database of CASIA. CASIA is expected to help HRA analysts perform the analysis more easily and consistently. If more analyses are performed and more data are accumulated in CASIA's database, HRA analysts can freely share and smoothly spread their analysis experience, and thereby the quality of HRA analyses will be improved. 35 refs., 38 figs., 25 tabs. (Author)

  3. Guidelines for the regulatory review of the human reliability analysis in PSAs

    International Nuclear Information System (INIS)

    Reer, Bernhard; Dang, V.N.; Hirschberg, Stefan; Meyer, Patrick

    2000-01-01

    In the review guidelines recently developed for the Swiss Federal Nuclear Inspectorate, the Human Reliability Analysis (HRA) is reviewed in two stages. The preliminary review is aimed at identifying major shortcomings and potential issues to be examined in the detailed review. The detailed review comprehensively addresses the overall adequacy and transparency of the HRA. For the two review stages, 97 indicators are defined in terms of questions focusing on verifiable features of the methodology, implementation and results. The guidelines provide steps for information gathering and present examples of acceptable practices as well as of potential deficiencies. Both review stages may result in requests for clarification, additional documentation or analyses. The first applications of the guidelines consist of the preliminary reviews of two HRAs. (author)

  4. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.; Carroll, R. J.; Dabney, A. R.

    2012-01-01

    positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon
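
    One hedged way to illustrate the idea of treating non-detections as censored observations is sketched below: intensities are mapped to a pseudo-time scale so that left-censoring becomes right-censoring, and two groups are compared with a log-rank test. The synthetic data, the detection limit, the mapping itself and the lifelines dependency are all assumptions for illustration, not the authors' exact formulation.

        # Illustrative survival-style comparison of left-censored LC-MS intensities.
        # Intensities below an assumed detection limit are treated as censored; the
        # pseudo-time log(ceiling / intensity) turns left-censoring into
        # right-censoring so a standard log-rank test can be applied.
        import numpy as np
        from lifelines.statistics import logrank_test

        rng = np.random.default_rng(42)
        detection_limit = 200.0

        def to_survival(intensities, ceiling=1.0e6):
            """Pseudo-times log(ceiling/I); non-detections are right-censored at the limit."""
            detected = intensities >= detection_limit
            clipped = np.where(detected, intensities, detection_limit)
            return np.log(ceiling / clipped), detected.astype(int)

        # Synthetic protein intensities for control and treatment groups.
        control = rng.lognormal(mean=6.0, sigma=0.8, size=30)
        treated = rng.lognormal(mean=6.6, sigma=0.8, size=30)

        t_con, e_con = to_survival(control)
        t_trt, e_trt = to_survival(treated)

        result = logrank_test(t_con, t_trt, event_observed_A=e_con, event_observed_B=e_trt)
        print(f"log-rank p-value: {result.p_value:.4f}")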

  5. DiapHRaGM: A mnemonic to describe the work of breathing in patients with respiratory failure.

    Directory of Open Access Journals (Sweden)

    Aiman Tulaimat

    Full Text Available The assessment of the work of breathing in the definitions of respiratory failure is vague and variable. The objective was to identify a parsimonious set of signs to describe the work of breathing in hypoxemic, acutely ill patients. We examined consecutive medical ICU patients receiving oxygen with a mask, non-invasive ventilation, or a T-piece. A physician inspected each patient for 10 seconds, rated the level of respiratory distress, and then examined the patient for vital signs and 17 other physical signs. We used the rating of distress as a surrogate for measuring the work of breathing, constructed three multivariate models to identify the one with the smallest number of signs and largest explained variance, and validated it with bootstrap analysis. We performed 402 observations on 240 patients. Respiratory distress was absent in 78, mild in 157, moderate in 107, and severe in 60. Respiratory rate, hypoxia, heart rate, and the frequency of most signs increased as distress increased. Respiratory rate and hypoxia explained 43% of the variance in respiratory distress. Diaphoresis, gasping, and contraction of the sternomastoid explained an additional 28%. Heart rate, blood pressure, alertness, agitation, body posture, nasal flaring, audible breathing, cyanosis, tracheal tug, retractions, paradox, and scalene or abdominal muscle contraction did not increase the explained variance in respiratory distress. Most of the variance in respiratory distress can be explained by five signs summarized by the mnemonic DiapHRaGM (diaphoresis, hypoxia, respiratory rate, gasping, accessory muscle). This set of signs may allow for efficient, standardized assessments of the work of breathing of hypoxic patients.

  6. Proposal of methodology of tsunami accident sequence analysis induced by earthquake using DQFM methodology

    International Nuclear Information System (INIS)

    Muta, Hitoshi; Muramatsu, Ken

    2017-01-01

    Since the Fukushima-Daiichi nuclear power station accident, the Japanese regulatory body has improved and upgraded the regulation of nuclear power plants, and continuous effort is required to enhance risk management in the mid to long term. Earthquakes and tsunamis are considered the most important risks, and the establishment of probabilistic risk assessment (PRA) methodologies for these events is a major issue of current PRA. The Nuclear Regulation Authority (NRA) addressed the PRA methodology for tsunamis induced by earthquakes, which is one of the methodologies that should be enhanced step by step for the improvement and maturity of PRA techniques. The AESJ standard for the procedure of seismic PRA for nuclear power plants, issued in 2015, provides the basic concept of the methodology; however, details of its application to an actual plant PRA model have not been sufficiently provided. This study proposes a detailed PRA methodology for tsunamis induced by earthquakes using the DQFM methodology, which contributes to improving the safety of nuclear power plants. Furthermore, this study also states the issues which need more research. (author)

  7. Application of transient analysis methodology to heat exchanger performance monitoring

    International Nuclear Information System (INIS)

    Rampall, I.; Soler, A.I.; Singh, K.P.; Scott, B.H.

    1994-01-01

    A transient testing technique is developed to evaluate the thermal performance of industrial-scale heat exchangers. A Galerkin-based numerical method, with a choice of spectral basis elements to account for spatial temperature variations in heat exchangers, is developed to solve the transient heat exchanger model equations. Testing a heat exchanger in the transient state may be the only viable alternative where conventional steady-state testing procedures are impossible or infeasible. For example, this methodology is particularly suited to the determination of fouling levels in component cooling water system heat exchangers in nuclear power plants. The heat load on these so-called component coolers under steady-state conditions is too small to permit meaningful testing. An adequate heat load develops immediately after a reactor shutdown, when the exchanger inlet temperatures are highly time-dependent. The application of the analysis methodology is illustrated herein with reference to in-situ transient testing carried out at a nuclear power plant. The method, however, is applicable to any transient testing application

  8. Methodology to analysis of aging processes of containment spray system

    International Nuclear Information System (INIS)

    Borges, D. da Silva; Lava, D.D.; Moreira, M. de L.; Ferreira Guimarães, A.C.; Fernandes da Silva, L.

    2015-01-01

    This paper presents a contribution to the study of the aging process of components in commercial Pressurized Water Reactor (PWR) plants. The motivation for this work emerged from the current nuclear landscape: numerous nuclear power plants worldwide have reached an advanced operating age. This situation calls for a process to ensure the reliability of the operating systems of these plants, and therefore for methodologies capable of estimating the failure probability of components and systems. In addition to the safety factors involved, such methodologies can be used to search for ways to extend the life cycle of nuclear plants, which otherwise must undergo decommissioning after an operating time of 40 years. Decommissioning negatively affects power generation and demands an enormous investment. Thus, this paper aims to present modeling techniques and sensitivity analysis which, together, can generate an estimate of how the components most sensitive to the aging process will behave during the normal operation cycle of a nuclear power plant. (authors)

  9. Multi-Unit Considerations for Human Reliability Analysis

    Energy Technology Data Exchange (ETDEWEB)

    St. Germain, S.; Boring, R.; Banaseanu, G.; Akl, Y.; Chatri, H.

    2017-03-01

    This paper uses the insights from the Standardized Plant Analysis Risk-Human Reliability Analysis (SPAR-H) methodology to help identify human actions currently modeled in the single unit PSA that may need to be modified to account for additional challenges imposed by a multi-unit accident as well as identify possible new human actions that might be modeled to more accurately characterize multi-unit risk. In identifying these potential human action impacts, the use of the SPAR-H strategy to include both errors in diagnosis and errors in action is considered as well as identifying characteristics of a multi-unit accident scenario that may impact the selection of the performance shaping factors (PSFs) used in SPAR-H. The lessons learned from the Fukushima Daiichi reactor accident will be addressed to further help identify areas where improved modeling may be required. While these multi-unit impacts may require modifications to a Level 1 PSA model, it is expected to have much more importance for Level 2 modeling. There is little currently written specifically about multi-unit HRA issues. A review of related published research will be presented. While this paper cannot answer all issues related to multi-unit HRA, it will hopefully serve as a starting point to generate discussion and spark additional ideas towards the proper treatment of HRA in a multi-unit PSA.

  10. A new methodology of spatial cross-correlation analysis.

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran's index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson's correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China's urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes.
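
    A global cross-correlation coefficient in quadratic form, in the spirit of the bivariate Moran-type statistic described above, can be sketched as follows; the inverse-distance weights, the normalization by n and the synthetic urbanization/GDP data are assumptions for illustration and may differ from the paper's exact definitions.

        # Illustrative global spatial cross-correlation in quadratic form:
        # z_x' W z_y / n with z-scores and a row-standardized weight matrix.
        # Coordinates and variables below are invented.
        import numpy as np

        def global_spatial_cross_corr(x, y, coords, bandwidth=1.0):
            """Quadratic-form cross-correlation between variables x and y."""
            x, y, coords = map(np.asarray, (x, y, coords))
            n = x.size
            # Inverse-distance spatial weights with zero diagonal, row-standardized.
            d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
            w = np.where(d > 0, 1.0 / (d + bandwidth), 0.0)
            np.fill_diagonal(w, 0.0)
            w /= w.sum(axis=1, keepdims=True)
            zx = (x - x.mean()) / x.std()
            zy = (y - y.mean()) / y.std()
            # Local coefficients would be zx * (w @ zy); the global value is their mean.
            return float(zx @ w @ zy) / n

        rng = np.random.default_rng(0)
        coords = rng.uniform(0, 10, size=(50, 2))          # 50 hypothetical locations
        urbanization = rng.normal(size=50)
        gdp = 0.7 * urbanization + rng.normal(scale=0.5, size=50)
        print(global_spatial_cross_corr(urbanization, gdp, coords))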

  11. A New Methodology of Spatial Cross-Correlation Analysis

    Science.gov (United States)

    Chen, Yanguang

    2015-01-01

    Spatial correlation modeling comprises both spatial autocorrelation and spatial cross-correlation processes. The spatial autocorrelation theory has been well-developed. It is necessary to advance the method of spatial cross-correlation analysis to supplement the autocorrelation analysis. This paper presents a set of models and analytical procedures for spatial cross-correlation analysis. By analogy with Moran’s index newly expressed in a spatial quadratic form, a theoretical framework is derived for geographical cross-correlation modeling. First, two sets of spatial cross-correlation coefficients are defined, including a global spatial cross-correlation coefficient and local spatial cross-correlation coefficients. Second, a pair of scatterplots of spatial cross-correlation is proposed, and the plots can be used to visually reveal the causality behind spatial systems. Based on the global cross-correlation coefficient, Pearson’s correlation coefficient can be decomposed into two parts: direct correlation (partial correlation) and indirect correlation (spatial cross-correlation). As an example, the methodology is applied to the relationships between China’s urbanization and economic development to illustrate how to model spatial cross-correlation phenomena. This study is an introduction to developing the theory of spatial cross-correlation, and future geographical spatial analysis might benefit from these models and indexes. PMID:25993120

  12. Human Reliability Analysis for Small Modular Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; David I. Gertman

    2012-06-01

    Because no human reliability analysis (HRA) method was specifically developed for small modular reactors (SMRs), the application of any current HRA method to SMRs represents tradeoffs. A first-generation HRA method like THERP provides clearly defined activity types, but these activity types do not map to the human-system interface or concept of operations confronting SMR operators. A second-generation HRA method like ATHEANA is flexible enough to be used for SMR applications, but there is currently insufficient guidance for the analyst, requiring considerably more first-of-a-kind analyses and extensive SMR expertise in order to complete a quality HRA. Although no current HRA method is optimized to SMRs, it is possible to use existing HRA methods to identify errors, incorporate them as human failure events in the probabilistic risk assessment (PRA), and quantify them. In this paper, we provided preliminary guidance to assist the human reliability analyst and reviewer in understanding how to apply current HRA methods to the domain of SMRs. While it is possible to perform a satisfactory HRA using existing HRA methods, ultimately it is desirable to formally incorporate SMR considerations into the methods. This may require the development of new HRA methods. More practicably, existing methods need to be adapted to incorporate SMRs. Such adaptations may take the form of guidance on the complex mapping between conventional light water reactors and small modular reactors. While many behaviors and activities are shared between current plants and SMRs, the methods must adapt if they are to perform a valid and accurate analysis of plant personnel performance in SMRs.

  13. Using of BEPU methodology in a final safety analysis report

    International Nuclear Information System (INIS)

    Menzel, Francine; Sabundjian, Gaiane; D'auria, Francesco; Madeira, Alzira A.

    2015-01-01

    Nuclear Reactor Safety (NRS) has been established since the discovery of nuclear fission, and the occurrence of accidents at nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach, addressing the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper intends to demonstrate the background of the licensing process through the main licensing requirements. (author)

  14. Using of BEPU methodology in a final safety analysis report

    Energy Technology Data Exchange (ETDEWEB)

    Menzel, Francine; Sabundjian, Gaiane, E-mail: fmenzel@ipen.br, E-mail: gdjian@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil); D'Auria, Francesco, E-mail: f.dauria@ing.unipi.it [Universita degli Studi di Pisa, Gruppo di Ricerca Nucleare San Piero a Grado (GRNSPG), Pisa (Italy); Madeira, Alzira A., E-mail: alzira@cnen.gov.br [Comissao Nacional de Energia Nuclear (CNEN), Rio de Janeiro, RJ (Brazil)

    2015-07-01

    Nuclear Reactor Safety (NRS) has been established since the discovery of nuclear fission, and the occurrence of accidents at nuclear power plants worldwide has contributed to its improvement. The Final Safety Analysis Report (FSAR) must contain complete information concerning the safety of the plant and plant site, and must be seen as a compendium of NRS. The FSAR integrates both the licensing requirements and the analytical techniques. The analytical techniques can be applied using a realistic approach, addressing the uncertainties of the results. This work aims to give an overview of the main analytical techniques that can be applied with a Best Estimate Plus Uncertainty (BEPU) methodology, which is 'the best one can do', as well as the ALARA (As Low As Reasonably Achievable) principle. Moreover, the paper intends to demonstrate the background of the licensing process through the main licensing requirements. (author)

  15. Methodology for Modeling and Analysis of Business Processes (MMABP

    Directory of Open Access Journals (Sweden)

    Vaclav Repa

    2015-10-01

    Full Text Available This paper introduces a methodology for modeling business processes. The creation of the methodology is described in terms of the Design Science Method. First, the gap in contemporary Business Process Modeling approaches is identified, and general modeling principles which can fill the gap are discussed. The way in which these principles have been implemented in the main features of the created methodology is then described. The most critical identified points of business process modeling are process states, process hierarchy and the granularity of process description. The methodology has been evaluated by use in a real project. Using examples from this project, the main methodology features are explained together with the significant problems which were met during the project. Drawing on these problems together with the results of the methodology evaluation, the needed future development of the methodology is outlined.

  16. Application of System Dynamics Methodology in Population Analysis

    Directory of Open Access Journals (Sweden)

    August Turina

    2009-09-01

    Full Text Available The goal of this work is to present the application of system dynamics and systems thinking, as well as the advantages and possible defects of this analytic approach, in order to improve the analysis of complex systems such as a population and, thereby, to monitor more effectively the underlying causes of migrations. This methodology has long been present in interdisciplinary scientific circles, but its scientific contribution has not been sufficiently applied in analysis practice in Croatia. Namely, the major part of system analysis is focused on detailed complexity rather than on dynamic complexity. Generally, the science of complexity deals with emergence, innovation, learning and adaptation. Complexity is viewed according to the number of system components, or through the number of combinations that must be continually analyzed in order to understand and consequently provide adequate decisions. Simulations containing thousands of variables and complex arrays of details distract overall attention from the basic causal patterns and key inter-relations emerging and prevailing within the analyzed population. Systems thinking offers a holistic and integral perspective for observation of the world.
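
    A minimal stock-and-flow sketch of the kind of population model such an approach works with is shown below; the single stock, the fractional birth, death and migration rates and the initial value are all invented for illustration.

        # Minimal stock-and-flow sketch of a population model in the system dynamics
        # style the abstract describes: one stock (population) driven by birth, death
        # and net-migration flows. All rates and the initial value are invented.
        def simulate_population(years=50, dt=0.25,
                                population=4.0e6,        # initial stock (hypothetical)
                                birth_rate=0.009,        # fractional flows per year
                                death_rate=0.012,
                                net_migration_rate=-0.002):
            """Euler-integrate the stock over time; returns a list of (t, stock)."""
            trajectory = [(0.0, population)]
            steps = int(years / dt)
            for step in range(1, steps + 1):
                births = birth_rate * population
                deaths = death_rate * population
                migration = net_migration_rate * population
                # Net flow into the stock during this time step.
                population += (births - deaths + migration) * dt
                trajectory.append((step * dt, population))
            return trajectory

        final_t, final_pop = simulate_population()[-1]
        print(f"population after {final_t:.0f} years: {final_pop:,.0f}")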

  17. Transuranium analysis methodologies for biological and environmental samples

    International Nuclear Information System (INIS)

    Wessman, R.A.; Lee, K.D.; Curry, B.; Leventhal, L.

    1978-01-01

    Analytical procedures for the most abundant transuranium nuclides in the environment (i.e., plutonium and, to a lesser extent, americium) are available. There is a lack of procedures for performing sequential analysis of Np, Pu, Am, and Cm in environmental samples, primarily because of the current emphasis on Pu and Am. Reprocessing requirements and waste disposal connected with the fuel cycle indicate that neptunium and curium must be considered in environmental radioactivity assessments. Therefore it was necessary to develop procedures that determine all four of these radionuclides in the environment. The state of the art of transuranium analysis methodology as applied to environmental samples is discussed relative to different sample sources, such as soil, vegetation, air, water, and animals. Isotope-dilution analysis with 243Am (239Np) and 236Pu or 242Pu radionuclide tracers is used. Americium and curium are analyzed as a group, with 243Am as the tracer. Sequential extraction procedures employing bis(2-ethylhexyl)orthophosphoric acid (HDEHP) were found to result in lower yields and higher Am-Cm fractionation than ion-exchange methods

  18. Methodology for the analysis and retirement of assets: Power transformers

    Directory of Open Access Journals (Sweden)

    Gustavo Adolfo Gómez-Ramírez

    2015-09-01

    Full Text Available This article presents the development of a high-voltage engineering methodology for the analysis and retirement of repaired power transformers, based on engineering criteria, in order to establish a correlation between the conditions of the transformer from several points of view: electrical, mechanical, dielectric and thermal. In analyzing the state of the art, two situations of great significance emerge. First, the international procedures are a "guide" for the acceptance of new transformers, so they cannot be applied to the letter to repaired transformers, due to the degradation process the transformer has undergone over the years and all the factors that led to a possible repair. Second, based on the most recent technical literature, articles analyzing the dielectric oil and the insulating paper have been reviewed, in which correlations are established between the quality of the insulating paper and the furan concentrations in the oils. Finally, a large part of the research carried out so far has focused on analyzing the transformer from the condition of the dielectric oil, so in most cases it has not been possible to perform forensic engineering inside an in-service transformer and thus to analyze the design components that can compromise its integrity and operability.

  19. Failure mode effect analysis and fault tree analysis as a combined methodology in risk management

    Science.gov (United States)

    Wessiani, N. A.; Yoshio, F.

    2018-04-01

    Many studies have reported the implementation of Failure Mode Effect Analysis (FMEA) and Fault Tree Analysis (FTA) as methods in risk management. However, most of these studies choose only one of the two methods in their risk management methodology. On the other hand, combining the two methods reduces the drawbacks each has when implemented separately. This paper aims to combine the methodologies of FMEA and FTA in assessing risk. A case study in a metal company illustrates how this combined methodology can be implemented. In the case study, the combined methodology assesses the internal risks that occur in the production process. Those internal risks should then be mitigated based on their level of risk.
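
    A compact sketch of how the two methods can be chained is given below: failure modes are ranked by the FMEA risk priority number, and selected modes are then treated as basic events of a small fault tree evaluated under an independence assumption; the scores, probabilities and tree structure are invented, not the case-study data.

        # Sketch of the combined idea: rank failure modes by FMEA risk priority
        # number (RPN = severity x occurrence x detection), then treat selected
        # modes as basic events of a small fault tree and evaluate the top event
        # assuming independent events. All numbers and the tree are invented.
        failure_modes = {                 # name: (severity, occurrence, detection, prob)
            "furnace overheating":   (9, 3, 4, 0.02),
            "conveyor jam":          (6, 5, 3, 0.05),
            "coolant pump failure":  (8, 2, 6, 0.01),
            "operator mis-setting":  (7, 4, 5, 0.03),
        }

        def rpn(sev, occ, det):
            return sev * occ * det

        ranked = sorted(failure_modes.items(),
                        key=lambda kv: rpn(*kv[1][:3]), reverse=True)
        for name, (sev, occ, det, _) in ranked:
            print(f"{name:22s} RPN = {rpn(sev, occ, det)}")

        def or_gate(probs):               # P(at least one) for independent events
            p = 1.0
            for q in probs:
                p *= (1.0 - q)
            return 1.0 - p

        def and_gate(probs):              # P(all) for independent events
            p = 1.0
            for q in probs:
                p *= q
            return p

        # Hypothetical tree: production stop if the furnace overheats AND the coolant
        # pump fails, OR if either of the two handling-related modes occurs.
        p = failure_modes
        top_event = or_gate([
            and_gate([p["furnace overheating"][3], p["coolant pump failure"][3]]),
            p["conveyor jam"][3],
            p["operator mis-setting"][3],
        ])
        print(f"top event probability ~ {top_event:.4f}")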

  20. Supporting Space Systems Design via Systems Dependency Analysis Methodology

    Science.gov (United States)

    Guariniello, Cesare

    assess the behavior of each system based on its internal status and on the topology of its dependencies on systems connected to it. Designers and decision makers can therefore quickly analyze and explore the behavior of complex systems and evaluate different architectures under various working conditions. The methods support educated decision making both in the design and in the update process of systems architecture, reducing the need to execute extensive simulations. In particular, in the phase of concept generation and selection, the information given by the methods can be used to identify promising architectures to be further tested and improved, while discarding architectures that do not show the required level of global features. The methods, when used in conjunction with appropriate metrics, also allow for improved reliability and risk analysis, as well as for automatic scheduling and re-scheduling based on the features of the dependencies and on the accepted level of risk. This dissertation illustrates the use of the two methods in sample aerospace applications, both in the operational and in the developmental domain. The applications show how to use the developed methodology to evaluate the impact of failures, assess the criticality of systems, quantify metrics of interest, quantify the impact of delays, support informed decision making when scheduling the development of systems and evaluate the achievement of partial capabilities. A larger, well-framed case study illustrates how the Systems Operational Dependency Analysis method and the Systems Developmental Dependency Analysis method can support analysis and decision making, at the mid and high level, in the design process of architectures for the exploration of Mars. The case study also shows how the methods do not replace the classical systems engineering methodologies, but support and improve them.

  1. The analysis of RWAP(Rod Withdrawal at Power) using the KEPRI methodology

    International Nuclear Information System (INIS)

    Yang, C. K.; Kim, Y. H.

    2001-01-01

    KEPRI developed a new methodology based on RASP (Reactor Analysis Support Package). In this paper, the analysis of the RWAP (Rod Withdrawal at Power) accident, which can result in reactivity and power distribution anomalies, was performed using the KEPRI methodology. The calculation describes the RWAP transient and documents the analysis, including the computer code modeling assumptions and input parameters used. To validate the new methodology, the results of the calculation were compared with the FSAR. The results obtained with the KEPRI methodology are similar to those in the FSAR, and the results of the sensitivity study on the postulated parameters were similar to those of the existing methodology

  2. Reliability analysis for power supply system in a reprocessing facility based on GO methodology

    International Nuclear Information System (INIS)

    Wang Renze

    2014-01-01

    The GO methodology was applied to analyze the reliability of the power supply system in a typical reprocessing facility. Based on the fact that tie breakers are installed in the system, a tie breaker operator was defined. GO methodology modeling and quantitative analysis were then performed sequentially, and the minimal cut sets and average unavailability of the system were obtained. A parallel analysis between the GO methodology and the fault tree methodology was also performed. The results showed that the arrangement of tie breakers was rational and necessary, and that, compared with the fault tree methodology, the GO modeling was much easier and the resulting chart much more succinct for analyzing the reliability of the power supply system. (author)
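
    The quantification step mentioned above, going from minimal cut sets to a system unavailability, can be sketched as follows under an independence assumption; the cut sets and component unavailabilities are invented, and the GO chart construction itself is not reproduced.

        # Sketch of the quantification step mentioned in the abstract: system
        # unavailability estimated from minimal cut sets of independent components.
        # The cut sets and component unavailabilities are invented for illustration.
        from itertools import combinations

        unavailability = {            # hypothetical average component unavailabilities
            "offsite_line_A": 2e-3,
            "offsite_line_B": 2e-3,
            "diesel_gen":     5e-2,
            "tie_breaker":    1e-3,
            "bus_feeder":     4e-4,
        }

        minimal_cut_sets = [          # hypothetical minimal cut sets
            {"offsite_line_A", "offsite_line_B", "diesel_gen"},
            {"bus_feeder"},
            {"offsite_line_A", "tie_breaker"},
        ]

        def cut_set_prob(cut):
            p = 1.0
            for comp in cut:
                p *= unavailability[comp]
            return p

        # Rare-event approximation: sum of cut-set probabilities.
        rare_event = sum(cut_set_prob(c) for c in minimal_cut_sets)

        # Inclusion-exclusion over unions of cut sets (exact for independent events).
        exact = 0.0
        for k in range(1, len(minimal_cut_sets) + 1):
            for combo in combinations(minimal_cut_sets, k):
                exact += (-1) ** (k + 1) * cut_set_prob(set().union(*combo))

        print(f"rare-event approx.: {rare_event:.3e}")
        print(f"inclusion-exclusion: {exact:.3e}")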

  3. FRACTAL ANALYSIS OF TRABECULAR BONE: A STANDARDISED METHODOLOGY

    Directory of Open Access Journals (Sweden)

    Ian Parkinson

    2011-05-01

    Full Text Available A standardised methodology for the fractal analysis of histological sections of trabecular bone has been established. A modified box counting method has been developed for use on a PC based image analyser (Quantimet 500MC, Leica Cambridge). The effect of image analyser settings, magnification, image orientation and threshold levels was determined. Also, the range of scale over which trabecular bone is effectively fractal was determined, and a method was formulated to objectively calculate more than one fractal dimension from the modified Richardson plot. The results show that magnification, image orientation and threshold settings have little effect on the estimate of fractal dimension. Trabecular bone has a lower limit below which it is not fractal (λ<25 μm) and the upper limit is 4250 μm. There are three distinct fractal dimensions for trabecular bone (sectional fractals), with magnitudes greater than 1.0 and less than 2.0. It has been shown that trabecular bone is effectively fractal over a defined range of scale. Also, within this range, there is more than one fractal dimension, describing spatial structural entities. Fractal analysis is a model independent method for describing a complex multifaceted structure, which can be adapted for the study of other biological systems. This may be at the cell, tissue or organ level and complements conventional histomorphometric and stereological techniques.
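
    The modified box counting approach described above can be illustrated with a generic sketch: count occupied boxes at several box sizes and take the slope of the log-log relationship as the fractal dimension. The snippet below is a minimal, hypothetical Python example (not the Quantimet/Leica implementation used in the study); the synthetic random image simply stands in for a binarised trabecular section.

```python
import numpy as np

def box_count_dimension(binary_image, box_sizes):
    """Estimate a fractal dimension of a 2-D binary image by box counting.

    For each box size, count the boxes containing at least one foreground
    pixel; the dimension is the slope of log(count) versus log(1/size).
    """
    counts = []
    for size in box_sizes:
        # Trim so the image tiles exactly into size x size boxes.
        h = (binary_image.shape[0] // size) * size
        w = (binary_image.shape[1] // size) * size
        blocks = binary_image[:h, :w].reshape(h // size, size, w // size, size)
        counts.append(blocks.any(axis=(1, 3)).sum())
    # Slope of the modified Richardson-style log-log plot.
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(box_sizes)), np.log(counts), 1)
    return slope

# Synthetic binary pattern standing in for a binarised trabecular section.
rng = np.random.default_rng(0)
section = rng.random((512, 512)) > 0.7
print(box_count_dimension(section, box_sizes=[2, 4, 8, 16, 32, 64]))
```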

  4. Methodological frontier in operational analysis for roundabouts: a review

    Directory of Open Access Journals (Sweden)

    Orazio Giuffre'

    2016-11-01

    Full Text Available Several studies have shown that modern roundabouts are safe and effective as engineering countermeasures for traffic calming, and they are now widely used worldwide. The increasing use of roundabouts and, more recently, of turbo and flower roundabouts has produced a great variety of experience in the fields of intersection design, traffic safety and capacity modelling. As for unsignalized intersections, which represent the starting point for extending operational analysis to roundabouts, capacity estimation is still characterized by the debate between gap acceptance models and empirical regression models. However, capacity modelling must cover both the analytical construction and solution of the model and the representation of driver behavior. Thus, realistic modelling of driver behavior through the parameters included in the models is always of interest for practitioners and analysts in transportation and road infrastructure engineering. Based on these considerations, this paper presents a literature review of the key methodological issues in the operational analysis of modern roundabouts. The focus is on the aspects associated with gap acceptance behavior, the derivation of the analytically based models and the calculation of the parameters included in the capacity equations, as well as steady-state and non-steady-state conditions and uncertainty in entry capacity estimation. Finally, insights into future developments of research in this field are also outlined.

  5. Integrated sequence analysis. Final report

    Energy Technology Data Exchange (ETDEWEB)

    Andersson, K.; Pyy, P.

    1998-02-01

    The NKS/RAK subproject 3 'integrated sequence analysis' (ISA) was formulated with the overall objective to develop and to test integrated methodologies in order to evaluate event sequences with a significant human action contribution. The term 'methodology' denotes not only technical tools but also methods for the integration of different scientific disciplines. In this report, we first discuss the background of ISA and the surveys made to map methods in different application fields, such as man-machine system simulation software, human reliability analysis (HRA) and expert judgement. Specific event sequences were, after the surveys, selected for application and testing of a number of ISA methods. The event sequences discussed in the report were cold overpressure of a BWR, shutdown LOCA of a BWR, steam generator tube rupture of a PWR, and a BWR disturbed signal view in the control room after an external event. Different teams analysed these sequences by using different ISA and HRA methods. Two kinds of results were obtained from the ISA project: sequence specific and more general findings. The sequence specific results are discussed together with each sequence description. The general lessons are discussed in a separate chapter by using comparisons of the different case studies. These lessons cover areas ranging from plant safety management (design, procedures, instrumentation, operations, maintenance and safety practices) to methodological findings (ISA methodology, PSA, HRA, physical analyses, behavioural analyses and uncertainty assessment). Finally follows a discussion about the project and conclusions are presented. An interdisciplinary study of complex phenomena is a natural way to produce valuable and innovative results. This project came up with structured ways to perform ISA and managed to apply them in practice. The project also highlighted some areas where more work is needed. In the HRA work, development is required for the use of simulators and expert judgement as

  6. Study on HRA-based method for assessing digital man-machine interface

    International Nuclear Information System (INIS)

    Li Pengcheng; Dai Licao; Zhang Li; Zhao Ming; Hu Hong

    2014-01-01

    In order to identify the design flaws of a digital man-machine interface (MMI) that may trigger human errors or weaken the performance of operators, an HRA-based method (namely HCR + CREAM + HEC) for assessing digital MMI was established. Firstly, the HCR method was used to identify the scenarios with high human error probability from the perspective of the overall event. Then, for the identified high-risk scenarios, CREAM was adopted to determine the various error modes and their error probabilities, which were then ranked. Finally, a human factors engineering checklist for digital MMI was established according to the characteristics of digital MMI and used to check the digital MMI elements with high error probability in order to identify design flaws, and suggestions for optimization were provided. The results show that the proposed assessment method can quickly and efficiently identify the design flaws of digital MMI which easily trigger human errors, and the operational safety of the digital control system for nuclear power plants can be enhanced by design optimization. (authors)
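
    The HCR step mentioned above screens scenarios by how likely the crew is to respond within the available time. The sketch below shows the general form of the HCR time-reliability correlation (a normalized three-parameter Weibull); the parameter values are purely illustrative and are not taken from this paper, where they would depend on the cognitive behaviour type and performance shaping factors.

```python
import math

def hcr_nonresponse_probability(t_available, t_median, gamma, eta, beta):
    """General form of the HCR time-reliability correlation: probability
    that the crew has NOT responded within the available time, modelled as
    a three-parameter Weibull in normalised time t/T_median."""
    z = (t_available / t_median - gamma) / eta
    if z <= 0:
        return 1.0  # no credit before the location parameter is reached
    return math.exp(-z ** beta)

# Illustrative values only (gamma, eta, beta depend on the behaviour type).
print(hcr_nonresponse_probability(t_available=20.0, t_median=10.0,
                                  gamma=0.6, eta=0.9, beta=1.2))
```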

  7. 50 Years of coastal erosion analysis: A new methodological approach.

    Science.gov (United States)

    Prieto Campos, Antonio; Diaz Cuevas, Pilar; Ojeda Zujar, Jose; Guisado-Pintado, Emilia

    2017-04-01

    Coasts over the world have been subjected to increased anthropogenic pressures which, combined with natural hazard impacts (storm events, rising sea levels), have led to strong erosion problems with negative impacts on the economy and the safety of coastal communities. The Andalusian coast (South Spain) is a renowned global tourist destination. In the past decades a deep transformation in the economic model led to significant land use changes: strong regulation of rivers, urbanisation and occupation of dunes, among others. As a result, irreversible transformations of the coastline, from the aggressive urbanisation undertaken, are now to be faced by local authorities and suffered by locals and visitors. Moreover, the expected impacts derived from climate change, aggravated by anthropic activities, emphasise the need for tools that facilitate decision making for sustainable coastal management. In this contribution a homogeneous methodology (a single proxy and a single photointerpreter) is proposed for the calculation of coastal erosion rates of exposed beaches in Andalusia (640 km) through the use of detailed series (1:2500) of open source orthophotographies for the period 1956-1977-2001-2011. The combination of the traditional DSAS software (Digital Shoreline Analysis System) with a spatial database (PostgreSQL) which integrates the resulting erosion rates with related coastal thematic information (geomorphology, presence of engineering infrastructures, dunes and ecosystems) enhances the capacity of analysis and exploitation. Further, the homogeneity of the method allows the comparison of results among years along a highly diverse coast, with both Mediterranean and Atlantic façades. The novel development and integration of a PostgreSQL/PostGIS database facilitates the exploitation of the results by the user (for instance by relating calculated rates with other thematic information such as the geomorphology of the coast or the presence of a dune field on

  8. Compliance strategy for statistically based neutron overpower protection safety analysis methodology

    International Nuclear Information System (INIS)

    Holliday, E.; Phan, B.; Nainer, O.

    2009-01-01

    The methodology employed in the safety analysis of the slow Loss of Regulation (LOR) event in the OPG and Bruce Power CANDU reactors, referred to as Neutron Overpower Protection (NOP) analysis, is a statistically based methodology. Further enhancement to this methodology includes the use of Extreme Value Statistics (EVS) for the explicit treatment of aleatory and epistemic uncertainties, and probabilistic weighting of the initial core states. A key aspect of this enhanced NOP methodology is to demonstrate adherence, or compliance, with the analysis basis. This paper outlines a compliance strategy capable of accounting for the statistical nature of the enhanced NOP methodology. (author)

  9. A study on safety analysis methodology in spent fuel dry storage facility

    Energy Technology Data Exchange (ETDEWEB)

    Che, M. S.; Ryu, J. H.; Kang, K. M.; Cho, N. C.; Kim, M. S. [Hanyang Univ., Seoul (Korea, Republic of)

    2004-02-15

    Collection and review of the domestic and foreign technology related to spent fuel dry storage facility. Analysis of a reference system. Establishment of a framework for criticality safety analysis. Review of accident analysis methodology. Establishment of accident scenarios. Establishment of scenario analysis methodology.

  10. Modeling methodology for supply chain synthesis and disruption analysis

    Science.gov (United States)

    Wu, Teresa; Blackhurst, Jennifer

    2004-11-01

    The concept of an integrated or synthesized supply chain is a strategy for managing today's globalized and customer-driven supply chains in order to better meet customer demands. Synthesizing individual entities into an integrated supply chain can be a challenging task due to a variety of factors including conflicting objectives, mismatched incentives and constraints of the individual entities. Furthermore, understanding the effects of disruptions occurring at any point in the system is difficult when working toward synthesizing supply chain operations. Therefore, the goal of this research is to present a modeling methodology to manage the synthesis of a supply chain by linking hierarchical levels of the system and to model and analyze disruptions in the integrated supply chain. The contribution of this research is threefold: (1) supply chain systems can be modeled hierarchically; (2) the performance of a synthesized supply chain system can be evaluated quantitatively; (3) reachability analysis is used to evaluate the system performance and verify whether a specific state is reachable, allowing the user to understand the extent of the effects of a disruption.

  11. Formation of the methodological matrix of the strategic analysis of the enterprise

    Directory of Open Access Journals (Sweden)

    N.H. Vygovskaya

    2018-04-01

    Full Text Available The article is devoted to the study of the methodological matrix of the strategic analysis of the enterprise. The aim of this article is to analyze the influence of the methodological changes of the 20th century on the methodology of strategic analysis, and to critically assess and generalize scientific approaches to its methods. The evaluation of scientific works on analysis made it possible to identify several problems in the methodology of strategic analysis: the features of strategic analysis are not taken into account when forming its methods, which often leads to confusion with the methods of financial (economic, thrifty) analysis; the fact that strategic analysis contains, besides the methods of analyzing the internal and external environment, methods of forecast analysis aimed at forming the development strategy of the enterprise is not used; the concepts «image», «reception» and «method» of analysis are conflated; the classification criteria for the methods of strategic analysis are multidirectional and indistinct; and foreign techniques and methods of strategic analysis are copied blindly, without taking into account the specifics of domestic economic conditions. The expediency of using the system approach in forming the methodological design of strategic analysis is proved, which makes it possible to combine methodology as a science of methods (a broad approach to the methods of strategic analysis) with methodology as a set of applied techniques and methods of analysis (a narrow approach to methodology). The use of the system approach allowed three levels of the methodology of strategic analysis to be distinguished. The first and second levels of methodology correspond to the level of science, the third level to practice. When developing the third level of special methods of strategic analysis, an approach is applied that differentiates them depending on the stages of strategic analysis (methods of the stage

  12. A METHODOLOGICAL APPROACH TO THE STRATEGIC ANALYSIS OF FOOD SECURITY

    Directory of Open Access Journals (Sweden)

    Anastasiia Mostova

    2017-12-01

    Full Text Available The objective of the present work is to substantiate the use of tools for strategic analysis in order to develop a strategy for the country's food security under current conditions, and to devise the author's original technique for performing a strategic analysis of food security using a SWOT-analysis. The methodology of the study. The article substantiates the need for strategic planning of food security. The author considers the stages of strategic planning and explains the importance of the stage of strategic analysis of the country's food security. It is proposed to apply a SWOT-analysis when running a strategic analysis of food security. The study is based on a system of indicators and characteristics of the country's economy, agricultural sector, market trends, and material-technical, financial and human resources, which are essential to obtain an objective assessment of the impact of trends and factors on food security, and in order to further develop the procedure for conducting a strategic analysis of the country's food security. Results of the study. The procedure for the strategic analysis of food security is developed based on the SWOT-analysis tool and comprises three stages: a strategic analysis of weaknesses and strengths, opportunities and threats; construction of the matrix of weaknesses, strengths, opportunities and threats (the SWOT-analysis matrix); and formation of the food security strategy based on the SWOT-analysis matrix. A list of characteristics was compiled in order to conduct a strategic analysis of food security and to categorize them as strengths or weaknesses, threats, or opportunities. The characteristics are systemized into strategic groups: production, market, resources, and consumption; this is necessary for objectively establishing strategic directions, responsible performers, allocation of resources, and effective control, for the purpose of the further development and implementation of the strategy. A strategic analysis

  13. Discrete dynamic event tree modeling and analysis of nuclear power plant crews for safety assessment

    International Nuclear Information System (INIS)

    Mercurio, D.

    2011-01-01

    Current Probabilistic Risk Assessment (PRA) and Human Reliability Analysis (HRA) methodologies model the evolution of accident sequences in Nuclear Power Plants (NPPs) mainly based on Logic Trees. The evolution of these sequences is a result of the interactions between the crew and plant; in current PRA methodologies, simplified models of these complex interactions are used. In this study, the Accident Dynamic Simulator (ADS), a modeling framework based on the Discrete Dynamic Event Tree (DDET), has been used for the simulation of crew-plant interactions during potential accident scenarios in NPPs. In addition, an operator/crew model has been developed to treat the response of the crew to the plant. The 'crew model' is made up of three operators whose behavior is guided by a set of rules-of-behavior (which represents the knowledge and training of the operators) coupled with written and mental procedures. In addition, an approach for addressing the crew timing variability in DDETs has been developed and implemented based on a set of HRA data from a simulator study. Finally, grouping techniques were developed and applied to the analysis of the scenarios generated by the crew-plant simulation. These techniques support the post-simulation analysis by grouping similar accident sequences, identifying the key contributing events, and quantifying the conditional probability of the groups. These techniques are used to characterize the context of the crew actions in order to obtain insights for HRA. The model has been applied for the analysis of a Small Loss Of Coolant Accident (SLOCA) event for a Pressurized Water Reactor (PWR). The simulation results support an improved characterization of the performance conditions or context of operator actions, which can be used in an HRA, in the analysis of the reliability of the actions. By providing information on the evolution of system indications, dynamic of cues, crew timing in performing procedure steps, situation

  14. Using functional analysis in archival appraisal: a practical and effective alternative to traditional appraisal methodologies

    CERN Document Server

    Robyns, Marcus C

    2014-01-01

    In an age of scarcity and the challenge of electronic records, can archivists and records managers continue to rely upon traditional methodology essentially unchanged since the early 1950s? Using Functional Analysis in Archival Appraisal: A Practical and Effective Alternative to Traditional Appraisal Methodologies shows how archivists in other countries are already using functional analysis, which offers a better, more effective, and eminently more practical alternative to traditional appraisal methodologies that rely upon an analysis of the records themselves.

  15. Population Analysis: A Methodology for Understanding Populations in COIN Environments

    National Research Council Canada - National Science Library

    Burke, Mark C; Self, Eric C

    2008-01-01

    .... Our methodology provides a heuristic model, called the "3 x 5 P.I.G.S.P.E.E.R. Model," that can be applied in any environment and will help bridge the gap between strategic theory and tactical implementation...

  16. Standardizing the practice of human reliability analysis

    International Nuclear Information System (INIS)

    Hallbert, B.P.

    1993-01-01

    The practice of human reliability analysis (HRA) within the nuclear industry varies greatly in terms of posited mechanisms that shape human performance, methods of characterizing and analytically modeling human behavior, and the techniques that are employed to estimate the frequency with which human error occurs. This variation has been a source of contention among HRA practitioners regarding the validity of results obtained from different HRA methods. It has also resulted in attempts to develop standard methods and procedures for conducting HRAs. For many of the same reasons, the practice of HRA has not been standardized or has been standardized only to the extent that individual analysts have developed heuristics and consistent approaches in their practice of HRA. From the standpoint of consumers and regulators, this has resulted in a lack of clear acceptance criteria for the assumptions, modeling, and quantification of human errors in probabilistic risk assessments

  17. The development of a safety analysis methodology for the optimized power reactor 1000

    International Nuclear Information System (INIS)

    Hwang-Yong, Jun; Yo-Han, Kim

    2005-01-01

    Korea Electric Power Research Institute (KEPRI) has been developing an in-house safety analysis methodology based on the delicate codes available to KEPRI to overcome the problems arising from the currently used vendor-oriented methodologies. For the Loss of Coolant Accident (LOCA) analysis, the KREM (KEPRI Realistic Evaluation Methodology) has been developed based on the RELAP-5 code. The methodology was approved for the Westinghouse 3-loop plants by the Korean regulatory organization, and the project to extend the methodology to the Optimized Power Reactor 1000 (OPR1000) has been ongoing since 2001. Also, for the non-LOCA analysis, the KNAP (Korea Non-LOCA Analysis Package) has been developed using the UNICORN-TM code system. To demonstrate the feasibility of these code systems and methodologies, some typical cases of the design basis accidents described in the final safety analysis report (FSAR) were analyzed. (author)

  18. Improved Methodology of MSLB M/E Release Analysis for OPR1000

    International Nuclear Information System (INIS)

    Park, Seok Jeong; Kim, Cheol Woo; Seo, Jong Tae

    2006-01-01

    A new mass and energy (M/E) release analysis methodology for equipment environmental qualification (EEQ) under loss-of-coolant accident (LOCA) conditions has recently been developed and adopted for small break LOCA EEQ. The new M/E release analysis methodology has been extended to the M/E release analysis for containment design for large break LOCA and the main steam line break (MSLB) accident, and named the KIMERA (KOPEC Improved Mass and Energy Release Analysis) methodology. The computer code system used in this methodology is RELAP5K/CONTEMPT4 (or RELAP5-ME), which couples RELAP5/MOD3.1/K, with an enhanced M/E model and a LOCA long-term model, with CONTEMPT4/MOD5. The KIMERA methodology is applied here to the MSLB M/E release analysis to validate it for MSLB containment design. The results are compared with the OPR1000 FSAR.

  19. Discussion of comments from a peer review of a technique for human event analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Forester, J.A.; Ramey-Smith, A.; Bley, D.C.; Kolaczkowski, A.M.; Cooper, S.E.; Wreathall, J.

    1998-01-01

    In May of 1998, a technical basis and implementation guidelines document for A Technique for Human Event Analysis (ATHEANA) was issued as a draft report for public comment (NUREG-1624). In conjunction with the release of the draft NUREG, a peer review of the method, its documentation, and the results of an initial test of the method was held over a two-day period in Seattle, Washington, in June of 1998. Four internationally known and respected experts in human reliability analysis (HRA) were selected to serve as the peer reviewers and were paid for their services. In addition, approximately 20 other individuals with an interest in HRA and ATHEANA also attended the peer review meeting and were invited to provide comments. The peer review team was asked to comment on any aspect of the method or the report in which improvements could be made and to discuss its strengths and weaknesses. All of the reviewers thought the ATHEANA method had made significant contributions to the field of PRA/HRA, in particular by addressing the most important open questions and issues in HRA, by attempting to develop an integrated approach, and by developing a framework capable of identifying types of unsafe actions that generally have not been considered using existing methods. The reviewers had many concerns about specific aspects of the methodology and made many recommendations for ways to improve and extend the method, and to make its application more cost effective and useful to PRA in general. Details of the reviewers' comments and the ATHEANA team's responses to specific criticisms will be discussed

  20. Accident Sequence Evaluation Program: Human reliability analysis procedure

    Energy Technology Data Exchange (ETDEWEB)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs.
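
    One quantitative ingredient of screening and nominal HRA of this kind is the adjustment of a nominal human error probability (HEP) for dependence between successive tasks. The sketch below is a minimal illustration of the standard THERP-style dependence equations that the ASEP procedure builds on; the nominal HEP value is purely illustrative.

```python
def dependent_hep(nominal_hep, dependence):
    """Conditional human error probability for a task following a failed
    task, using the standard THERP dependence levels that ASEP builds on."""
    n = nominal_hep
    return {
        "zero": n,
        "low": (1 + 19 * n) / 20,
        "moderate": (1 + 6 * n) / 7,
        "high": (1 + n) / 2,
        "complete": 1.0,
    }[dependence]

# Illustrative: a nominal HEP of 3e-3 re-evaluated under high dependence.
print(dependent_hep(3e-3, "high"))
```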

  1. Accident Sequence Evaluation Program: Human reliability analysis procedure

    International Nuclear Information System (INIS)

    Swain, A.D.

    1987-02-01

    This document presents a shortened version of the procedure, models, and data for human reliability analysis (HRA) which are presented in the Handbook of Human Reliability Analysis with Emphasis on Nuclear Power Plant Applications (NUREG/CR-1278, August 1983). This shortened version was prepared and tried out as part of the Accident Sequence Evaluation Program (ASEP) funded by the US Nuclear Regulatory Commission and managed by Sandia National Laboratories. The intent of this new HRA procedure, called the "ASEP HRA Procedure," is to enable systems analysts, with minimal support from experts in human reliability analysis, to make estimates of human error probabilities and other human performance characteristics which are sufficiently accurate for many probabilistic risk assessments. The ASEP HRA Procedure consists of a Pre-Accident Screening HRA, a Pre-Accident Nominal HRA, a Post-Accident Screening HRA, and a Post-Accident Nominal HRA. The procedure in this document includes changes made after tryout and evaluation of the procedure in four nuclear power plants by four different systems analysts and related personnel, including human reliability specialists. The changes consist of some additional explanatory material (including examples), and more detailed definitions of some of the terms. 42 refs

  2. EUROCONTROL-Systemic Occurrence Analysis Methodology (SOAM)-A 'Reason'-based organisational methodology for analysing incidents and accidents

    International Nuclear Information System (INIS)

    Licu, Tony; Cioran, Florin; Hayward, Brent; Lowe, Andrew

    2007-01-01

    The Safety Occurrence Analysis Methodology (SOAM) developed for EUROCONTROL is an accident investigation methodology based on the Reason Model of organisational accidents. The purpose of a SOAM is to broaden the focus of an investigation from human involvement issues, also known as 'active failures of operational personnel' under Reason's original model, to include analysis of the latent conditions deeper within the organisation that set the context for the event. Such an approach is consistent with the tenets of Just Culture in which people are encouraged to provide full and open information about how incidents occurred, and are not penalised for errors. A truly systemic approach is not simply a means of transferring responsibility for a safety occurrence from front-line employees to senior managers. A consistent philosophy must be applied, where the investigation process seeks to correct deficiencies wherever they may be found, without attempting to apportion blame or liability

  3. Methodologies for analysis of patterning in the mouse RPE sheet

    Science.gov (United States)

    Boatright, Jeffrey H.; Dalal, Nupur; Chrenek, Micah A.; Gardner, Christopher; Ziesel, Alison; Jiang, Yi; Grossniklaus, Hans E.

    2015-01-01

    Purpose Our goal was to optimize procedures for assessing shapes, sizes, and other quantitative metrics of retinal pigment epithelium (RPE) cells and contact- and noncontact-mediated cell-to-cell interactions across a large series of flatmount RPE images. Methods The two principal methodological advances of this study were optimization of a mouse RPE flatmount preparation and refinement of open-access software to rapidly analyze large numbers of flatmount images. Mouse eyes were harvested, and extra-orbital fat and muscles were removed. Eyes were fixed for 10 min, and dissected by puncturing the cornea with a sharp needle or a stab knife. Four radial cuts were made with iridectomy scissors from the puncture to near the optic nerve head. The lens, iris, and the neural retina were removed, leaving the RPE sheet exposed. The dissection and outcomes were monitored and evaluated by video recording. The RPE sheet was imaged under fluorescence confocal microscopy after staining for ZO-1 to identify RPE cell boundaries. Photoshop, Java, Perl, and Matlab scripts, as well as CellProfiler, were used to quantify selected parameters. Data were exported into Excel spreadsheets for further analysis. Results A simplified dissection procedure afforded a consistent source of images that could be processed by computer. The dissection and flatmounting techniques were illustrated in a video recording. Almost all of the sheet could be routinely imaged, and substantial fractions of the RPE sheet (usually 20–50% of the sheet) could be analyzed. Several common technical problems were noted and workarounds developed. The software-based analysis merged 25 to 36 images into one and adjusted settings to record an image suitable for large-scale identification of cell-to-cell boundaries, and then obtained quantitative descriptors of the shape of each cell, its neighbors, and interactions beyond direct cell–cell contact in the sheet. To validate the software, human- and computer

  4. GO-FLOW methodology. Basic concept and integrated analysis framework for its applications

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi

    2010-01-01

    GO-FLOW methodology is a success oriented system analysis technique, and is capable of evaluating a large system with complex operational sequences. Recently an integrated analysis framework of the GO-FLOW has been developed for the safety evaluation of elevator systems by the Ministry of Land, Infrastructure, Transport and Tourism, Japanese Government. This paper describes (a) an Overview of the GO-FLOW methodology, (b) Procedure of treating a phased mission problem, (c) Common cause failure analysis, (d) Uncertainty analysis, and (e) Integrated analysis framework. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. (author)
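
    As a toy illustration of the success-oriented viewpoint described above (signals carrying success probabilities through operators that represent components), the sketch below propagates probabilities through a two-pump, one-valve configuration. It is a generic reliability calculation under assumed probabilities, not the GO-FLOW operator set or the elevator framework itself.

```python
# Success probabilities of the individual components (illustrative values).
p_pump_a, p_pump_b, p_valve = 0.95, 0.95, 0.99

# Two pumps in parallel (either can deliver flow) feeding one discharge valve.
p_parallel = 1.0 - (1.0 - p_pump_a) * (1.0 - p_pump_b)
p_system = p_parallel * p_valve

print(f"system success probability: {p_system:.4f}")
```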

  5. Seismic hazard analysis. A methodology for the Eastern United States

    Energy Technology Data Exchange (ETDEWEB)

    Bernreuter, D L

    1980-08-01

    This report presents a probabilistic approach for estimating the seismic hazard in the Central and Eastern United States. The probabilistic model (Uniform Hazard Methodology) systematically incorporates the subjective opinion of several experts in the evaluation of seismic hazard. Subjective input, assumptions and the associated hazard are kept separate for each expert so as to allow review and preserve diversity of opinion. The report is organized into five sections: Introduction, Methodology Comparison, Subjective Input, Uniform Hazard Methodology (UHM), and Uniform Hazard Spectrum. Section 2, Methodology Comparison, briefly describes the present approach and compares it with other available procedures. The remainder of the report focuses on the UHM. Specifically, Section 3 describes the elicitation of subjective input; Section 4 gives details of various mathematical models (earthquake source geometry, magnitude distribution, attenuation relationship) and how these models are combined to calculate seismic hazard. The last section, Uniform Hazard Spectrum, highlights the main features of typical results. Specific results and sensitivity analyses are not presented in this report. (author)
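
    The combination of source, magnitude and attenuation models mentioned above is, in essence, the standard probabilistic seismic hazard integral: the activity rate times the probability of exceeding a ground-motion level, integrated over magnitude and distance. The sketch below shows that calculation with a single toy source zone and a made-up lognormal attenuation relationship; all numbers are illustrative, and none of the report's expert-elicited models are reproduced.

```python
import numpy as np
from scipy.stats import norm

# Toy hazard integral: lambda(A > a) = nu * sum_m sum_r P(A > a | m, r) p(m) p(r)
nu = 0.05                              # source-zone activity rate (events/year)
mags = np.linspace(5.0, 7.5, 26)       # magnitude grid
dists = np.linspace(10.0, 200.0, 20)   # source-to-site distances (km)

# Gutenberg-Richter style (truncated exponential) magnitude distribution.
b = 1.0
mag_pdf = np.exp(-b * np.log(10) * (mags - mags[0]))
mag_pdf /= mag_pdf.sum()
dist_pdf = np.full(dists.size, 1.0 / dists.size)   # uniform over distance

def ln_median_pga(m, r):
    """Made-up attenuation relationship, ln(PGA in g)."""
    return -4.0 + 0.8 * m - 1.1 * np.log(r)

sigma_ln = 0.6      # lognormal ground-motion variability
a_level = 0.2       # ground-motion level of interest (g)

rate = 0.0
for m, wm in zip(mags, mag_pdf):
    for r, wr in zip(dists, dist_pdf):
        p_exceed = 1.0 - norm.cdf((np.log(a_level) - ln_median_pga(m, r)) / sigma_ln)
        rate += wm * wr * p_exceed
rate *= nu

print(f"annual rate of exceeding {a_level} g: {rate:.2e}")
```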

  6. Towards a Methodological Improvement of Narrative Inquiry: A Qualitative Analysis

    Science.gov (United States)

    Abdallah, Mahmoud Mohammad Sayed

    2009-01-01

    The article suggests that although narrative inquiry as a research methodology entails free conversations and personal stories, it should not be totally free and fictional, as it has to conform to recognized standards used for conducting educational research. Hence, a qualitative study conducted by Russ (1999) was explored as an exemplar…

  7. Prototype application of best estimate and uncertainty safety analysis methodology to large LOCA analysis

    International Nuclear Information System (INIS)

    Luxat, J.C.; Huget, R.G.

    2001-01-01

    Development of a methodology to perform best estimate and uncertainty nuclear safety analysis has been underway at Ontario Power Generation for the past two and one half years. A key driver for the methodology development, and one of the major challenges faced, is the need to re-establish demonstrated safety margins that have progressively been undermined through excessive and compounding conservatism in deterministic analyses. The major focus of the prototyping applications was to quantify the safety margins that exist at the probable range of high power operating conditions, rather than the highly improbable operating states associated with Limit of the Envelope (LOE) assumptions. In LOE, all parameters of significance to the consequences of a postulated accident are assumed to simultaneously deviate to their limiting values. Another equally important objective of the prototyping was to demonstrate the feasibility of conducting safety analysis as an incremental analysis activity, as opposed to a major re-analysis activity. The prototype analysis solely employed prior analyses of Bruce B large break LOCA events - no new computer simulations were undertaken. This is a significant and novel feature of the prototyping work. This methodology framework has been applied to a postulated large break LOCA in a Bruce generating unit on a prototype basis. This paper presents results of the application. (author)

  8. Attack Methodology Analysis: Emerging Trends in Computer-Based Attack Methodologies and Their Applicability to Control System Networks

    Energy Technology Data Exchange (ETDEWEB)

    Bri Rolston

    2005-06-01

    Threat characterization is a key component in evaluating the threat faced by control systems. Without a thorough understanding of the threat faced by critical infrastructure networks, adequate resources cannot be allocated or directed effectively to the defense of these systems. Traditional methods of threat analysis focus on identifying the capabilities and motivations of a specific attacker, assessing the value the adversary would place on targeted systems, and deploying defenses according to the threat posed by the potential adversary. Too many effective exploits and tools exist, easily accessible to anyone with an Internet connection, minimal technical skills, and a significantly reduced motivational threshold, for the field of potential adversaries to be narrowed effectively. Understanding how hackers evaluate new IT security research and incorporate significant new ideas into their own tools provides a means of anticipating how IT systems are most likely to be attacked in the future. This research, Attack Methodology Analysis (AMA), could supply pertinent information on how to detect and stop new types of attacks. Since the exploit methodologies and attack vectors developed in the general Information Technology (IT) arena can be converted for use against control system environments, assessing areas in which cutting edge exploit development and remediation techniques are occurring can provide significant intelligence for control system network exploitation and defense, and a means of assessing threat without identifying specific capabilities of individual opponents. Attack Methodology Analysis begins with the study of the exploit technology and attack methodologies being developed in the Information Technology (IT) security research community, within the black and white hat communities. Once a solid understanding of the cutting edge security research is established, emerging trends in attack methodology can be identified and the gap between

  9. Proposed methodology for completion of scenario analysis for the Basalt Waste Isolation Project

    International Nuclear Information System (INIS)

    Roberds, W.J.; Plum, R.J.; Visca, P.J.

    1984-11-01

    This report presents the methodology to complete an assessment of postclosure performance, considering all credible scenarios, including the nominal case, for a proposed repository for high-level nuclear waste at the Hanford Site, Washington State. The methodology consists of defensible techniques for identifying and screening scenarios, and for then assessing the risks associated with each. The results of the scenario analysis are used to comprehensively determine system performance and/or risk for evaluation of compliance with postclosure performance criteria (10 CFR 60 and 40 CFR 191). In addition to describing the proposed methodology, this report reviews available methodologies for scenario analysis, discusses pertinent performance assessment and uncertainty concepts, advises how to implement the methodology (including the organizational requirements and a description of tasks) and recommends how to use the methodology in guiding future site characterization, analysis, and engineered subsystem design work. 36 refs., 24 figs., 1 tab

  10. Common cause failure analysis methodology for complex systems

    International Nuclear Information System (INIS)

    Wagner, D.P.; Cate, C.L.; Fussell, J.B.

    1977-01-01

    Common cause failure analysis, also called common mode failure analysis, is an integral part of a complex system reliability analysis. This paper extends existing methods of computer aided common cause failure analysis by allowing analysis of the complex systems often encountered in practice. The methods presented here aid in identifying potential common cause failures and also address quantitative common cause failure analysis
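
    Quantitative common cause treatment in analyses of this kind is often illustrated with the simple beta-factor model, in which a fraction of each component's failure rate is assumed to act on all redundant trains at once. The sketch below applies that standard model to a two-train system; it is a generic illustration under stated assumptions, not the specific method of the paper, and the numbers are made up.

```python
import math

def two_train_unavailability(failure_rate, beta, mission_time):
    """Beta-factor treatment of common cause failure for two redundant trains:
    a fraction beta of the per-train failure rate fails both trains together.
    Uses the usual rare-event approximation."""
    q_independent = 1.0 - math.exp(-(1.0 - beta) * failure_rate * mission_time)
    q_common = 1.0 - math.exp(-beta * failure_rate * mission_time)
    return q_independent ** 2 + q_common

# Illustrative numbers: failure rate per hour, CCF fraction, 30-day mission.
print(two_train_unavailability(failure_rate=1e-4, beta=0.1, mission_time=720.0))
```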

  11. Discovering the Effects-Endstate Linkage: Using Soft Systems Methodology to Perform EBO Mission Analysis

    National Research Council Canada - National Science Library

    Young, Jr, William E

    2005-01-01

    .... EBO mission analysis is shown to be more problem structuring than problem solving. A new mission analysis process is proposed using a modified version of Soft Systems Methodology to meet these challenges...

  12. Internal fire analysis screening methodology for the Salem Nuclear Generating Station

    International Nuclear Information System (INIS)

    Eide, S.; Bertucio, R.; Quilici, M.; Bearden, R.

    1989-01-01

    This paper reports on an internal fire analysis screening methodology that has been utilized for the Salem Nuclear Generating Station (SNGS) Probabilistic Risk Assessment (PRA). The methodology was first developed and applied in the Brunswick Steam Electric Plant (BSEP) PRA. The SNGS application includes several improvements and extensions to the original methodology. The SNGS approach differs significantly from traditional fire analysis methodologies by providing a much more detailed treatment of transient combustibles. This level of detail results in a model which is more usable for assisting in the management of fire risk at the plant

  13. Development Risk Methodology for Whole Systems Trade Analysis

    Science.gov (United States)

    2016-08-01

    WSTAT). In the early stages of the V&V for development risk, it was discovered that the original risk rating and methodology did not actually... WSTA has opened trade space exploration by allowing the tool to evaluate trillions of potential system configurations to then return a handful of

  14. User's manual of a support system for human reliability analysis

    International Nuclear Information System (INIS)

    Yokobayashi, Masao; Tamura, Kazuo.

    1995-10-01

    Many kinds of human reliability analysis (HRA) methods have been developed. However, users must be skilled to apply them, and complicated work such as drawing event trees (ET) and calculating uncertainty bounds is also required. Moreover, no single method is complete enough on its own to evaluate human reliability. Therefore, a personal computer (PC) based support system for HRA has been developed to execute HRA practically and efficiently. The system consists of two methods, namely a simple method and a detailed one. The former uses ASEP, which is a simplified THERP technique, and a combined method of OAT and HRA-ET/DeBDA is used for the latter. Users can select a suitable method for their purpose. Human error probability (HEP) data were collected, and a database of them was built for use with the support system. This paper describes the outline of the HRA methods, the support functions, and the user's guide of the system. (author)

  15. Tailoring a Human Reliability Analysis to Your Industry Needs

    Science.gov (United States)

    DeMott, D. L.

    2016-01-01

    Accidents caused by human error that result in catastrophic consequences include airline industry mishaps, medical malpractice, medication mistakes, aerospace failures, major oil spills, transportation mishaps, power production failures and manufacturing facility incidents. Human Reliability Assessment (HRA) is used to analyze the inherent risk of human behavior or actions introducing errors into the operation of a system or process. These assessments can be used to identify where errors are most likely to arise and the potential risks involved if they do occur. Using the basic concepts of HRA, an evolving group of methodologies is used to meet various industry needs. Determining which methodology or combination of techniques will provide a quality human reliability assessment is a key element in developing effective strategies for understanding and dealing with risks caused by human errors. There are a number of concerns and difficulties in "tailoring" a Human Reliability Assessment (HRA) for different industries. Although a variety of HRA methodologies are available to analyze human error events, determining the most appropriate tools to provide the most useful results can depend on industry-specific cultures and requirements. Methodology selection may be based on a variety of factors that include: 1) how people act and react in different industries, 2) expectations based on industry standards, 3) factors that influence how the human errors could occur such as tasks, tools, environment, workplace, support, training and procedure, 4) type and availability of data, 5) how the industry views risk & reliability, and 6) types of emergencies, contingencies and routine tasks. Other considerations for methodology selection should be based on what information is needed from the assessment. If the principal concern is determination of the primary risk factors contributing to the potential human error, a more detailed analysis method may be employed

  16. Fully Stochastic Distributed Methodology for Multivariate Flood Frequency Analysis

    Directory of Open Access Journals (Sweden)

    Isabel Flores-Montoya

    2016-05-01

    Full Text Available An adequate estimation of the extreme behavior of basin response is essential both for designing river structures and for evaluating their risk. The aim of this paper is to develop a new methodology to generate extreme hydrograph series of thousands of years using an event-based model. To this end, a spatial-temporal synthetic rainfall generator (RainSimV3) is combined with a distributed, physically-based, event-based rainfall–runoff model (RIBS). The use of an event-based model allows simulating longer hydrograph series with lower computational and data requirements, but requires characterizing the initial basin state, which depends on the initial basin moisture distribution. To overcome this problem, this paper proposes a probabilistic calibration–simulation approach, which considers the initial state and the model parameters as random variables characterized by a probability distribution through a Monte Carlo simulation. This approach is compared with two other approaches, the deterministic and the semi-deterministic approaches. Both approaches use a unique initial state. The deterministic approach also uses a unique value of the model parameters, while the semi-deterministic approach obtains these values from their probability distribution through a Monte Carlo simulation, considering the basin variability. This methodology has been applied to the Corbès and Générargues basins, in the Southeast of France. The results show that the probabilistic approach offers the best fit. That means that the proposed methodology can be successfully used to characterize the extreme behavior of the basin, considering the basin variability and overcoming the basin initial state problem.
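
    The probabilistic calibration–simulation idea sketched above can be illustrated with a toy Monte Carlo loop: sample the initial basin state and a model parameter from distributions for every synthetic event, run the event model, and read return levels off the resulting peak-flow sample. The snippet below is a hypothetical stand-in (the toy_event_model function replaces the distributed RIBS model, and all distributions are invented for illustration).

```python
import numpy as np

rng = np.random.default_rng(42)

def toy_event_model(rain_depth, soil_moisture, runoff_coeff):
    """Stand-in for the distributed event model: peak flow from one storm.
    Wetter initial states convert more of the rainfall into runoff."""
    effective_rain = rain_depth * (0.3 + 0.7 * soil_moisture)
    return runoff_coeff * effective_rain

n_events = 100_000
rain = rng.gumbel(loc=40.0, scale=15.0, size=n_events)          # storm depths (mm)

# Probabilistic approach: the initial basin state and a model parameter are
# sampled per event from distributions instead of being fixed at single values.
soil_moisture = rng.beta(2.0, 3.0, size=n_events)                # wetness in [0, 1]
runoff_coeff = rng.normal(1.8, 0.3, size=n_events).clip(min=0.5)

peaks = toy_event_model(rain, soil_moisture, runoff_coeff)

# Empirical 100-year peak, assuming (illustratively) one event per year.
print("100-year peak estimate:", np.quantile(peaks, 1.0 - 1.0 / 100.0))
```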

  17. Development and application of a deterministic-realistic hybrid methodology for LOCA licensing analysis

    International Nuclear Information System (INIS)

    Liang, Thomas K.S.; Chou, Ling-Yao; Zhang, Zhongwei; Hsueh, Hsiang-Yu; Lee, Min

    2011-01-01

    Highlights: → A new LOCA licensing methodology (DRHM, deterministic-realistic hybrid methodology) was developed. → DRHM involves conservative Appendix K physical models and statistical treatment of plant status uncertainties. → DRHM can generate 50-100 K of PCT margin as compared to a traditional Appendix K methodology. - Abstract: It is well recognized that a realistic LOCA analysis with uncertainty quantification can generate greater safety margin as compared with a classical conservative LOCA analysis using Appendix K evaluation models. The associated margin can be more than 200 K. To quantify uncertainty in BELOCA analysis, generally two kinds of uncertainties are required to be identified and quantified: model uncertainties and plant status uncertainties. In particular, it takes a huge effort to systematically quantify the individual model uncertainties of a best estimate LOCA code, such as RELAP5 and TRAC. Instead of applying a full-range BELOCA methodology to cover both model and plant status uncertainties, a deterministic-realistic hybrid methodology (DRHM) was developed to support LOCA licensing analysis. In the DRHM methodology, Appendix K deterministic evaluation models are adopted to ensure model conservatism, while the CSAU methodology is applied to quantify the effect of plant status uncertainty on the PCT calculation. Generally, the DRHM methodology can generate about 80-100 K of margin on PCT as compared to an Appendix K bounding-state LOCA analysis.
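
    A common way to implement the statistical (CSAU-style) side of such a hybrid approach is a non-parametric tolerance-limit argument: run the code with randomly sampled plant states and take an upper order statistic of the calculated PCT as the 95/95 bound. The sketch below only computes the familiar first-order Wilks sample size; it is a generic illustration of that approach, not a statement about the specific procedure used in this paper.

```python
def wilks_sample_size(coverage=0.95, confidence=0.95):
    """Smallest number of random code runs n such that the largest calculated
    PCT bounds the 'coverage' quantile with the stated confidence
    (one-sided, first-order Wilks criterion: 1 - coverage**n >= confidence)."""
    n = 1
    while 1.0 - coverage ** n < confidence:
        n += 1
    return n

print(wilks_sample_size())   # 59 runs for the usual 95/95 criterion
```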

  18. INTEGRATED METHODOLOGY FOR PRODUCT PLANNING USING MULTI CRITERIA ANALYSIS

    Directory of Open Access Journals (Sweden)

    Tarun Soota

    2016-09-01

    Full Text Available An integrated approach to multi-criteria decision problems is proposed using quality function deployment and the analytic network process. The objective of the work is to rationalize and improve the method of analyzing and interpreting customer needs and technical requirements. The methodology is used to determine and prioritize engineering requirements based on customer needs for the development of the best product. The framework allows the decision maker to decompose a complex problem into a hierarchical structure that shows the relationship between the objective and the criteria. Multi-criteria decision modeling is used to extend the hierarchy process to both dependence and feedback. A case study on bikes is presented for the proposed model.

  19. A methodological comparison of customer service analysis techniques

    Science.gov (United States)

    James Absher; Alan Graefe; Robert Burns

    2003-01-01

    Techniques used to analyze customer service data need to be studied. Two primary analysis protocols, importance-performance analysis (IP) and gap score analysis (GA), are compared in a side-by-side comparison using data from two major customer service research projects. A central concern is what, if any, conclusion might be different due solely to the analysis...
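
    To make the comparison concrete, the sketch below computes both measures for a handful of made-up attribute ratings: a gap score (performance minus importance) and an importance-performance quadrant assignment relative to the scale means. The attribute names and numbers are invented for illustration and are not the data from the two research projects.

```python
import numpy as np

# Invented ratings (1-5 scale) for four service attributes.
attributes = ["cleanliness", "signage", "staff helpfulness", "parking"]
importance = np.array([4.6, 3.9, 4.4, 3.2])
performance = np.array([4.1, 3.0, 4.5, 3.4])

# Gap score analysis: performance minus importance (negative = shortfall).
gaps = performance - importance

# Importance-performance analysis: quadrant relative to the two scale means.
imp_cut, perf_cut = importance.mean(), performance.mean()
for name, imp, perf, gap in zip(attributes, importance, performance, gaps):
    quadrant = ("concentrate here" if imp >= imp_cut and perf < perf_cut
                else "keep up the good work" if imp >= imp_cut
                else "low priority" if perf < perf_cut
                else "possible overkill")
    print(f"{name:18s} gap={gap:+.1f}  IP quadrant: {quadrant}")
```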

  20. Stream habitat analysis using the instream flow incremental methodology

    Science.gov (United States)

    Bovee, Ken D.; Lamb, Berton L.; Bartholow, John M.; Stalnaker, Clair B.; Taylor, Jonathan; Henriksen, Jim

    1998-01-01

    This document describes the Instream Flow Incremental Methodology (IFIM) in its entirety. It also serves as a comprehensive introductory textbook on IFIM for training courses, as it contains the most complete and comprehensive description of IFIM in existence today. It should also serve as an official published guide to IFIM, to counteract the misconceptions about the methodology that have pervaded the professional literature since the mid-1980s, as it describes IFIM as it is envisioned by its developers. The document is aimed at the decisionmakers of management and allocation of natural resources, to provide them an overview, and at those who design and implement studies to inform the decisionmakers. There should be enough background on model concepts, data requirements, calibration techniques, and quality assurance to help the technical user design and implement a cost-effective application of IFIM that will provide policy-relevant information. Some of the chapters deal with the basic organization of IFIM and the procedural sequence of applying IFIM, starting with problem identification, study planning and implementation, and problem resolution.

  1. Using HABIT to Establish the Chemicals Analysis Methodology for Maanshan Nuclear Power Plant

    OpenAIRE

    J. R. Wang; S. W. Chen; Y. Chiang; W. S. Hsu; J. H. Yang; Y. S. Tseng; C. Shih

    2017-01-01

    In this research, the HABIT analysis methodology was established for Maanshan nuclear power plant (NPP). The Final Safety Analysis Report (FSAR), reports, and other data were used in this study. To evaluate the control room habitability under the CO2 storage burst, the HABIT methodology was used to perform this analysis. The HABIT result was below the R.G. 1.78 failure criteria. This indicates that Maanshan NPP habitability can be maintained. Additionally, the sensitivity study of the paramet...

  2. Human reliability analysis data obtainment through fuzzy logic in nuclear plants

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, C.S. do, E-mail: claudio.souza@ctmsp.mar.mil.br [Centro Tecnologico da Marinha em Sao Paulo (CTMSP), Av. Professor Lineu Prestes 2468, 05508-000 Sao Paulo, SP (Brazil); Mesquita, R.N. de, E-mail: rnavarro@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN - SP), Av. Professor Lineu Prestes 2242, 05508-000 Sao Paulo, SP (Brazil)

    2012-09-15

    Highlights: ► Human Error Probability estimates from operator's reactions to emergency situations. ► Human Reliability Analysis input data obtainment through fuzzy logic inference. ► Performance Shaping Factors evaluation influence level onto the operator's actions. - Abstract: Human error has been recognized as an important factor in the occurrence of many industrial and nuclear accidents. Human error data are scarcely available for different reasons, among which lapses in historical database registry methodology are an important one. Human Reliability Analysis (HRA) is a usual tool employed to estimate the probability that an operator will reasonably perform a system-required task in the required time without degrading the system. This meta-analysis requires specific Human Error Probability estimates for most of its procedure. This work obtains Human Error Probability (HEP) estimates from operators' actions in response to postulated emergency situations on the Research Reactor IEA-R1 at IPEN, Brazil. Through this proposed methodology, HRA can be performed even with a shortage of related human error statistical data. A Performance Shaping Factors (PSF) evaluation was also done in order to classify and estimate their level of influence on the operators' actions and to determine their actual state in the plant. Both the HEP estimation and the PSF evaluation were based on expert judgment using interviews and questionnaires. The expert group was established from selected IEA-R1 operators, and their evaluations were put into a knowledge representation system which used linguistic variables and group evaluation values obtained through Fuzzy Logic and Fuzzy Set theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative to be used in HRA.
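
    To illustrate the kind of fuzzy inference described above, the sketch below maps a single expert-rated performance shaping factor onto a human error probability using triangular membership functions, max-min rule aggregation and centroid defuzzification. The membership functions, rules and numbers are all invented for illustration; the paper's actual linguistic variables and expert evaluations are not reproduced.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    x = np.asarray(x, dtype=float)
    return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

# Expert rating of one performance shaping factor (e.g. stress), 0-10 scale.
stress = 7.0
mu_low, mu_mod, mu_high = tri(stress, -2, 0, 5), tri(stress, 2, 5, 8), tri(stress, 5, 10, 12)

# Output universe: log10 of the human error probability, from 1e-4 to 1e-1.
log_hep = np.linspace(-4.0, -1.0, 301)
hep_small = tri(log_hep, -4.5, -4.0, -3.0)
hep_medium = tri(log_hep, -3.5, -2.5, -1.5)
hep_large = tri(log_hep, -2.0, -1.0, -0.5)

# Three illustrative rules: low stress -> small HEP, moderate -> medium, high -> large.
aggregated = np.maximum.reduce([
    np.minimum(mu_low, hep_small),
    np.minimum(mu_mod, hep_medium),
    np.minimum(mu_high, hep_large),
])

# Centroid defuzzification on the uniform grid gives a point HEP estimate.
centroid = (aggregated * log_hep).sum() / aggregated.sum()
print(f"estimated HEP ~ {10 ** centroid:.1e}")
```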

  3. Human reliability analysis data obtainment through fuzzy logic in nuclear plants

    International Nuclear Information System (INIS)

    Nascimento, C.S. do; Mesquita, R.N. de

    2012-01-01

    Highlights: ► Human Error Probability estimates from operator's reactions to emergency situations. ► Human Reliability Analysis input data obtainment through fuzzy logic inference. ► Performance Shaping Factors evaluation influence level onto the operator's actions. - Abstract: Human error has been recognized as an important factor in the occurrence of many industrial and nuclear accidents. Human error data are scarcely available for different reasons, among which lapses in historical database registry methodology are an important one. Human Reliability Analysis (HRA) is a usual tool employed to estimate the probability that an operator will reasonably perform a system-required task in the required time without degrading the system. This meta-analysis requires specific Human Error Probability estimates for most of its procedure. This work obtains Human Error Probability (HEP) estimates from operators' actions in response to postulated emergency situations on the Research Reactor IEA-R1 at IPEN, Brazil. Through this proposed methodology, HRA can be performed even with a shortage of related human error statistical data. A Performance Shaping Factors (PSF) evaluation was also done in order to classify and estimate their level of influence on the operators' actions and to determine their actual state in the plant. Both the HEP estimation and the PSF evaluation were based on expert judgment using interviews and questionnaires. The expert group was established from selected IEA-R1 operators, and their evaluations were put into a knowledge representation system which used linguistic variables and group evaluation values obtained through Fuzzy Logic and Fuzzy Set theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative to be used in HRA.

  4. Establishing Equivalence: Methodological Progress in Group-Matching Design and Analysis

    Science.gov (United States)

    Kover, Sara T.; Atwood, Amy K.

    2013-01-01

    This methodological review draws attention to the challenges faced by intellectual and developmental disabilities researchers in the appropriate design and analysis of group comparison studies. We provide a brief overview of matching methodologies in the field, emphasizing group-matching designs used in behavioral research on cognition and…

  5. The quantitative failure of human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Bennett, C.T.

    1995-07-01

    This philosophical treatise debates the merits of Human Reliability Analysis (HRA) in the context of the nuclear power industry. In essence, the author criticizes historic and current HRA as having failed to inform policy makers who make decisions based on the risk that humans contribute to systems performance. He argues for an HRA based on Bayesian (fact-based) inferential statistics, one that advocates a systems analysis process employing cogent heuristics when using opinion and tempering itself with a rational debate over the weight given to subjective and empirical probabilities.
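
    As a small, hedged illustration of the Bayesian, fact-based inference the author advocates, the sketch below updates a Beta prior on a human error probability with hypothetical simulator observations; the prior parameters and the error/trial counts are invented.

        from scipy import stats

        # Beta prior on the HEP (prior mean about 1E-2), updated with hypothetical
        # observed outcomes: 2 errors in 120 simulator trials of the same task.
        prior_alpha, prior_beta = 0.5, 49.5
        errors, trials = 2, 120

        posterior = stats.beta(prior_alpha + errors, prior_beta + (trials - errors))
        print("posterior mean HEP :", posterior.mean())
        print("90% credible bounds:", posterior.ppf(0.05), posterior.ppf(0.95))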

  6. Literature research of FMEA (Failure Mode and Effects Analysis) methodology

    International Nuclear Information System (INIS)

    Hustak, S.

    1999-01-01

    The potential of the FMEA applications is demonstrated. Some approaches can be used for system analysis or immediately for PSA, in particular, for obtaining background information for fault tree analysis in the area of component modelling and, to a lesser extent, for identification of the initiating events. On the other hand, other FMEA applications, such as criticality analysis, are unusable in PSA. (author)

  7. Interaction between core analysis methodology and nuclear design: some PWR examples

    International Nuclear Information System (INIS)

    Rothleder, B.M.; Eich, W.J.

    1982-01-01

    The interaction between core analysis methodology and nuclear design is exemplified by PSEUDAX, a major improvement related to the Advanced Recycle Methodology Program (ARMP) computer code system, still undergoing development by the Electric Power Research Institute. The mechanism of this interaction is explored by relating several specific nuclear design changes to the demands placed by these changes on the ARMP system, and by examining the meeting of these demands, first within the standard ARMP methodology and then through augmentation of the standard methodology by development of PSEUDAX

  8. Development of design and analysis methodology for composite bolted joints

    Science.gov (United States)

    Grant, Peter; Sawicki, Adam

    1991-05-01

    This paper summarizes work performed to develop composite joint design methodology for use on rotorcraft primary structure, determine joint characteristics which affect joint bearing and bypass strength, and develop analytical methods for predicting the effects of such characteristics in structural joints. Experimental results have shown that bearing-bypass interaction allowables cannot be defined using a single continuous function due to variance of failure modes for different bearing-bypass ratios. Hole wear effects can be significant at moderate stress levels and should be considered in the development of bearing allowables. A computer program has been developed and has successfully predicted bearing-bypass interaction effects for the (0/±45/90) family of laminates using filled hole and unnotched test data.

  9. Adapting Human Reliability Analysis from Nuclear Power to Oil and Gas Applications

    Energy Technology Data Exchange (ETDEWEB)

    Boring, Ronald Laurids [Idaho National Laboratory

    2015-09-01

    ABSTRACT: Human reliability analysis (HRA), as currently used in risk assessments, largely derives its methods and guidance from application in the nuclear energy domain. While there are many similarities between nuclear energy and other safety critical domains such as oil and gas, there remain clear differences. This paper provides an overview of HRA state of the practice in nuclear energy and then describes areas where refinements to the methods may be necessary to capture the operational context of oil and gas. Many key distinctions important to nuclear energy HRA such as Level 1 vs. Level 2 analysis may prove insignificant for oil and gas applications. On the other hand, existing HRA methods may not be sensitive enough to factors like the extensive use of digital controls in oil and gas. This paper provides an overview of these considerations to assist in the adaptation of existing nuclear-centered HRA methods to the petroleum sector.

  10. Adapting Job Analysis Methodology to Improve Evaluation Practice

    Science.gov (United States)

    Jenkins, Susan M.; Curtin, Patrick

    2006-01-01

    This article describes how job analysis, a method commonly used in personnel research and organizational psychology, provides a systematic method for documenting program staffing and service delivery that can improve evaluators' knowledge about program operations. Job analysis data can be used to increase evaluators' insight into how staffs…

  11. Methodological Analysis of Gregarious Behaviour of Agents in the Financial Markets

    OpenAIRE

    Solodukhin Stanislav V.

    2013-01-01

    The article considers methodological approaches to analysis of gregarious behaviour of agents in the financial markets and also studies foundations of the agent modelling of decision making processes with consideration of the gregarious instinct.

  12. Development of an analysis rule of diagnosis error for standard method of human reliability analysis

    International Nuclear Information System (INIS)

    Jeong, W. D.; Kang, D. I.; Jeong, K. S.

    2003-01-01

    This paper presents the status of development of the Korean standard method for Human Reliability Analysis (HRA) and proposes a standard procedure and rules for the evaluation of diagnosis error probability. The quality of the KSNP HRA was evaluated against the requirements of the ASME PRA standard guideline, and the design requirements for the standard HRA method were defined. The analysis procedure and rules developed so far for evaluating diagnosis error probability are suggested as a part of the standard method. A comprehensive application study was also performed to evaluate the suitability of the proposed rules

  13. Montecarlo simulation for a new high resolution elemental analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto [Universidad de La Frontera, Temuco (Chile). Facultad de Ingenieria y Administracion

    1996-12-31

    Full text. Spectra generated by binary, ternary and multielement matrixes when irradiated by a variable energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy is just over the edge associated with each element, because of the emission of characteristic X rays. For a given associated energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample in a 2π solid angle, associating a single multichannel spectrometer channel to each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high resolution spectroscopy methodology, where a synchrotron would be an ideal source due to its high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)
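
    The sketch below is a deliberately simplified, toy version of the monochromatic-scan idea described above (a scatter background plus K fluorescence that switches on at the K absorption edge); it is not PENELOPE physics, and the edge energy, fluorescence yield, cross-section model and photon counts are all invented.

        import numpy as np

        rng = np.random.default_rng(1)
        E_edge, omega_K = 7.11, 0.35   # K-edge energy (keV) and fluorescence yield (assumed)

        def detected_fraction(E, concentration=1.0):
            """Probability that an incident photon produces a counted event (toy model)."""
            scatter = 0.004                                              # flat scattered background
            photo_K = 0.02 * concentration * (E_edge / E) ** 3 if E >= E_edge else 0.0
            return scatter + omega_K * photo_K

        energies = np.linspace(6.5, 8.5, 41)   # incident-energy scan, one MCS channel per energy
        n_photons = 50_000                     # photons simulated per channel
        counts = [rng.binomial(n_photons, detected_fraction(E)) for E in energies]
        for E, c in zip(energies[::8], counts[::8]):
            print(f"E = {E:4.2f} keV   counts = {c}")   # a clear jump appears above the edge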

  14. Recent Methodologies for Creep Deformation Analysis and Its Life Prediction

    International Nuclear Information System (INIS)

    Kim, Woo-Gon; Park, Jae-Young; Iung

    2016-01-01

    To design high-temperature creeping materials, various creep data are needed for codification, as follows: i) stress vs. creep rupture time for base metals and weldments (average and minimum), ii) stress vs. time to 1% total strain (average), iii) stress vs. time to onset of tertiary creep (minimum), iv) constitutive equations for conducting time- and temperature-dependent stress-strain analysis (average), and v) isochronous stress-strain curves (average). Also, elevated temperature components such as those used in modern power generation plants are designed using the allowable stress under creep conditions. The allowable stress is usually estimated on the basis of up to 10⁵ h creep rupture strength at the operating temperature. The master curve of the “sinh” function was found to have wider acceptance, with good flexibility in the low stress ranges beyond the experimental data. The proposed multi-C method in the Larson-Miller (LM) parameter revealed better life prediction than a single-C method. These improved methodologies can be utilized to accurately predict the long-term creep life or strength of Gen-IV nuclear materials which are designed for a life span of 60 years
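
    As a hedged sketch of the Larson-Miller (LM) extrapolation mentioned above, the snippet below inverts LMP = T(C + log10 t_r) against an assumed master curve. The constant C and the master-curve coefficients are invented for illustration; the paper's multi-C approach fits C per data set rather than fixing a single value.

        import numpy as np

        C = 20.0                     # LM constant (assumed single value for this sketch)
        a, b = 27_000.0, 3_000.0     # assumed master curve: LMP(stress) = a - b*log10(stress)

        def lm_parameter(T_kelvin, t_rupture_h):
            return T_kelvin * (C + np.log10(t_rupture_h))

        def rupture_time(T_kelvin, stress_mpa):
            lmp = a - b * np.log10(stress_mpa)
            return 10 ** (lmp / T_kelvin - C)    # invert LMP = T*(C + log10 t_r)

        print(f"predicted rupture life at 873 K, 120 MPa: {rupture_time(873.0, 120.0):.3e} h")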

  15. Montecarlo simulation for a new high resolution elemental analysis methodology

    International Nuclear Information System (INIS)

    Figueroa S, Rodolfo; Brusa, Daniel; Riveros, Alberto

    1996-01-01

    Full text. Spectra generated by binary, ternary and multielement matrixes when irradiated by a variable energy photon beam are simulated by means of a Monte Carlo code. Significant jumps in the counting rate appear when the photon energy is just over the edge associated with each element, because of the emission of characteristic X rays. For a given associated energy, the net height of these jumps depends mainly on the concentration and on the sample absorption coefficient. The spectra were obtained by a monochromatic energy scan considering all the radiation emitted by the sample in a 2π solid angle, associating a single multichannel spectrometer channel to each incident energy (Multichannel Scaling (MCS) mode). The simulated spectra were produced with an adaptation of the Monte Carlo simulation package PENELOPE (Penetration and Energy Loss of Positrons and Electrons in matter). The results show that it is possible to implement a new high resolution spectroscopy methodology, where a synchrotron would be an ideal source due to its high intensity and the ability to control the energy of the incident beam. The high energy resolution would be determined by the monochromating system and not by the detection system, which would basically be a photon counter. (author)

  16. Capturing cognitive causal paths in human reliability analysis with Bayesian network models

    International Nuclear Information System (INIS)

    Zwirglmaier, Kilian; Straub, Daniel; Groth, Katrina M.

    2017-01-01

    In the last decade, Bayesian networks (BNs) have been identified as a powerful tool for human reliability analysis (HRA), with multiple advantages over traditional HRA methods. In this paper we illustrate how BNs can be used to include additional, qualitative causal paths to provide traceability. The proposed framework provides the foundation to resolve several needs frequently expressed by the HRA community. First, the developed extended BN structure reflects the causal paths found in the cognitive psychology literature, thereby addressing the need for causal traceability and a strong scientific basis in HRA. Secondly, the use of node reduction algorithms allows the BN to be condensed to a level of detail at which quantification is as straightforward as the techniques used in existing HRA. We illustrate the framework by developing a BN version of the critical data misperceived crew failure mode in the IDHEAS HRA method, which is currently under development at the US NRC. We illustrate how the model could be quantified with a combination of expert probabilities and information from operator performance databases such as SACADA. This paper lays the foundations necessary to expand the cognitive and quantitative foundations of HRA. - Highlights: • A framework for building traceable BNs for HRA, based on cognitive causal paths. • A qualitative BN structure, directly showing these causal paths, is developed. • Node reduction algorithms are used for making the BN structure quantifiable. • BN quantified through expert estimates and observed data (Bayesian updating). • The framework is illustrated for a crew failure mode of IDHEAS.
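
    A minimal sketch of the kind of causal Bayesian-network fragment described above is given below, evaluated by brute-force enumeration in plain Python. The two parent PSF nodes, their probabilities and the conditional probability table are hypothetical placeholders, not IDHEAS or SACADA values.

        import itertools

        p_hmi_poor = 0.2           # P(HMI quality = poor)   (assumed)
        p_workload_high = 0.3      # P(workload = high)      (assumed)

        # CPT: P(critical data misperceived | HMI quality, workload)   (assumed)
        cpt_fail = {("poor", "high"): 0.12, ("poor", "low"): 0.05,
                    ("good", "high"): 0.02, ("good", "low"): 0.005}

        def p_failure():
            """Marginal failure-mode probability, summed over the parent PSF states."""
            total = 0.0
            for hmi, wl in itertools.product(("poor", "good"), ("high", "low")):
                p_hmi = p_hmi_poor if hmi == "poor" else 1 - p_hmi_poor
                p_wl = p_workload_high if wl == "high" else 1 - p_workload_high
                total += p_hmi * p_wl * cpt_fail[(hmi, wl)]
            return total

        def p_failure_given_hmi(hmi):
            """Failure-mode probability conditioned on an observed HMI state."""
            return sum((p_workload_high if wl == "high" else 1 - p_workload_high)
                       * cpt_fail[(hmi, wl)] for wl in ("high", "low"))

        print("marginal P(misperceived)    :", round(p_failure(), 5))
        print("P(misperceived | HMI = poor):", round(p_failure_given_hmi("poor"), 5))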

  17. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Forester, John A.; Bye, Andreas; Dang, Vinh N.; Lois, Erasmia

    2010-01-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to 'translate' the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  18. Lessons Learned on Benchmarking from the International Human Reliability Analysis Empirical Study

    Energy Technology Data Exchange (ETDEWEB)

    Ronald L. Boring; John A. Forester; Andreas Bye; Vinh N. Dang; Erasmia Lois

    2010-06-01

    The International Human Reliability Analysis (HRA) Empirical Study is a comparative benchmark of the prediction of HRA methods to the performance of nuclear power plant crews in a control room simulator. There are a number of unique aspects to the present study that distinguish it from previous HRA benchmarks, most notably the emphasis on a method-to-data comparison instead of a method-to-method comparison. This paper reviews seven lessons learned about HRA benchmarking from conducting the study: (1) the dual purposes of the study afforded by joining another HRA study; (2) the importance of comparing not only quantitative but also qualitative aspects of HRA; (3) consideration of both negative and positive drivers on crew performance; (4) a relatively large sample size of crews; (5) the use of multiple methods and scenarios to provide a well-rounded view of HRA performance; (6) the importance of clearly defined human failure events; and (7) the use of a common comparison language to “translate” the results of different HRA methods. These seven lessons learned highlight how the present study can serve as a useful template for future benchmarking studies.

  19. Eco-efficiency analysis methodology on the example of the chosen polyolefins production

    OpenAIRE

    K. Czaplicka-Kolarz; D. Burchart-Korol; P. Krawczyk

    2010-01-01

    the chosen polyolefins production. The article also presents the main tools of eco-efficiency analysis: Life Cycle Assessment (LCA) and Net Present Value (NPV). Design/methodology/approach: Eco-efficiency analysis is conducted on the basis of the LCA and NPV of high density polyethylene (HDPE) and low density polyethylene (LDPE) production. Findings: In this article the environmental and economic performance of the chosen polyolefins production is presented. The basic phases of the eco-efficiency methodology...
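
    A hedged sketch of the economic half of such an eco-efficiency assessment is shown below: a Net Present Value calculation combined with an LCA impact score into a simple value-per-impact ratio. The cash flows, discount rate and impact scores are invented and do not come from the article.

        def npv(rate, cash_flows):
            """Discount a list of yearly cash flows (year 0 first)."""
            return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

        # Hypothetical figures per functional unit of product.
        hdpe = {"cash_flows": [-120.0, 35.0, 35.0, 35.0, 35.0], "lca_impact": 1.9}
        ldpe = {"cash_flows": [-100.0, 28.0, 28.0, 28.0, 28.0], "lca_impact": 2.3}

        for name, d in (("HDPE", hdpe), ("LDPE", ldpe)):
            value = npv(0.08, d["cash_flows"])
            ratio = value / d["lca_impact"]   # one common eco-efficiency convention
            print(f"{name}: NPV = {value:6.1f}, eco-efficiency = {ratio:.1f}")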

  20. Uncertainty and sensitivity analysis methodology in a level-I PSA (Probabilistic Safety Assessment)

    International Nuclear Information System (INIS)

    Nunez McLeod, J.E.; Rivera, S.S.

    1997-01-01

    This work presents a methodology for sensitivity and uncertainty analysis applicable to a level-I probabilistic safety assessment. The work contents are: correct association of distributions to parameters, importance and qualification of expert opinions, generation of samples according to sample size, and study of the relationships among system variables and the system response. A series of statistical-mathematical techniques are recommended along the development of the analysis methodology, as well as different graphical visualizations for the control of the study. (author)
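
    The snippet below is a hedged sketch of the sampling and sensitivity steps listed above: parameter distributions are propagated through a toy cut-set expression and the inputs are ranked by rank correlation with the response. The distributions, the response expression and the sample size are all invented.

        import numpy as np
        from scipy.stats import spearmanr

        rng = np.random.default_rng(42)
        n = 5_000
        lam_pump = rng.lognormal(mean=np.log(1e-3), sigma=0.7, size=n)   # failure rate (assumed)
        lam_valve = rng.lognormal(mean=np.log(5e-4), sigma=0.9, size=n)  # failure rate (assumed)
        hep = rng.beta(0.5, 99.5, size=n)                                # operator error (assumed)

        # Toy system response standing in for an accident-sequence cut-set expression.
        response = lam_pump * hep + 0.1 * lam_valve * hep

        print("response mean / 95th percentile:", response.mean(), np.quantile(response, 0.95))
        for name, x in (("pump", lam_pump), ("valve", lam_valve), ("HEP", hep)):
            rho, _ = spearmanr(x, response)
            print(f"rank correlation with response ({name}): {rho:+.2f}")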

  1. Go-flow: a reliability analysis methodology applicable to piping system

    International Nuclear Information System (INIS)

    Matsuoka, T.; Kobayashi, M.

    1985-01-01

    Since the completion of the Reactor Safety Study, the use of probabilistic risk assessment techniques has become more widespread in the nuclear community. Several analytical methods are used for the reliability analysis of nuclear power plants, and the GO methodology is one of these methods. Using the GO methodology, the authors performed a reliability analysis of the emergency decay heat removal system of the nuclear ship Mutsu in order to examine its applicability to piping systems. Through this analysis, the authors identified some disadvantages of the GO methodology. In GO, a signal is either on-to-off or off-to-on; the method therefore finds the time point at which the state of a system changes and cannot treat a system whose state changes as off-on-off. Several computer runs are required to obtain the time-dependent failure probability of a system. In order to overcome these disadvantages, the authors propose a new analytical methodology: GO-FLOW. In GO-FLOW, the modelling method (chart) and the calculation procedure are similar to those in the GO methodology, but the meaning of signal and time point, and the definitions of the operators, are essentially different. In the paper, the GO-FLOW methodology is explained and two examples of analysis by GO-FLOW are given

  2. Reliability analysis and operator modelling

    International Nuclear Information System (INIS)

    Hollnagel, Erik

    1996-01-01

    The paper considers the state of operator modelling in reliability analysis. Operator models are needed in reliability analysis because operators are needed in process control systems. HRA methods must therefore be able to account both for human performance variability and for the dynamics of the interaction. A selected set of first generation HRA approaches is briefly described in terms of the operator model they use, their classification principle, and the actual method they propose. In addition, two examples of second generation methods are also considered. It is concluded that first generation HRA methods generally have very simplistic operator models, either referring to the time-reliability relationship or to elementary information processing concepts. It is argued that second generation HRA methods must recognise that cognition is embedded in a context, and be able to account for that in the way human reliability is analysed and assessed

  3. Toward a computer-aided methodology for discourse analysis ...

    African Journals Online (AJOL)

    aided methods to discourse analysis”. This project aims to develop an e-learning environment dedicated to documenting, evaluating and teaching the use of corpus linguistic tools suitable for interpretative text analysis. Even though its roots are in ...

  4. Proteome analysis of Saccharomyces cerevisiae: a methodological outline

    DEFF Research Database (Denmark)

    Fey, S J; Nawrocki, A; Görg, A

    1997-01-01

    Proteome analysis offers a unique means of identifying important proteins, characterizing their modifications and beginning to describe their function. This is achieved through the combination of two technologies: protein separation and selection by two-dimensional gel electrophoresis, and protei...

  5. Theoretical and methodological analysis of personality theories of leadership

    OpenAIRE

    Оксана Григорівна Гуменюк

    2016-01-01

    The psychological analysis of personality theories of leadership, which is the basis for other conceptual approaches to understanding the nature of leadership, is conducted. The conceptual approach to leadership is analyzed taking into account the priority of personality theories, including the heroic, psychoanalytic, «trait», charismatic and five-factor theories. It is noted that the psychological analysis of personality theories is important in understanding the nature of leadership

  6. Methodology Series Module 6: Systematic Reviews and Meta-analysis.

    Science.gov (United States)

    Setia, Maninder Singh

    2016-01-01

    Systematic reviews and meta-analyses have become an important part of the biomedical literature, and they provide the "highest level of evidence" for various clinical questions. There are a lot of studies - sometimes with contradictory conclusions - on a particular topic in the literature. Hence, as a clinician, which results will you believe? What will you tell your patient? Which drug is better? A systematic review or a meta-analysis may help us answer these questions. In addition, it may also help us understand the quality of the articles in the literature or the type of studies that have been conducted and published (for example, randomized trials or observational studies). The first step is to identify a research question for the systematic review or meta-analysis. The next step is to identify the articles that will be included in the study. This will be done by searching various databases; it is important that the researcher search for articles in more than one database. It will also be useful to form a group of researchers and statisticians that have expertise in conducting systematic reviews and meta-analyses before initiating them. We strongly encourage the readers to register their proposed review/meta-analysis with PROSPERO. Finally, these studies should be reported according to the Preferred Reporting Items for Systematic Reviews and Meta-Analyses checklist.
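
    After the articles have been identified and their effect sizes extracted, one common quantitative core of a meta-analysis is inverse-variance pooling. The sketch below shows a fixed-effect pooling step on invented study results (log risk ratios and standard errors); it is an illustration, not a reanalysis of any published data.

        import numpy as np

        effects = np.array([-0.35, -0.10, -0.22, -0.41])   # study log risk ratios (invented)
        se = np.array([0.15, 0.12, 0.20, 0.25])            # their standard errors (invented)

        weights = 1.0 / se**2                               # inverse-variance weights
        pooled = np.sum(weights * effects) / np.sum(weights)
        pooled_se = np.sqrt(1.0 / np.sum(weights))
        lo, hi = pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se

        print("pooled log RR:", round(pooled, 3))
        print("pooled RR    :", round(float(np.exp(pooled)), 3))
        print("95% CI (RR)  :", round(float(np.exp(lo)), 3), "to", round(float(np.exp(hi)), 3))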

  7. Methodology for risk-based analysis of technical specifications

    International Nuclear Information System (INIS)

    Vesely, W.E.; Gaertner, J.P.; Wagner, D.P.

    1985-01-01

    Part of the effort by EPRI to apply probabilistic risk assessment methods and results to the solution of utility problems involves the investigation of methods for risk-based analysis of technical specifications. The culmination of this investigation is the SOCRATES computer code developed by Battelle's Columbus Laboratories to assist in the evaluation of technical specifications of nuclear power plants. The program is designed to use information found in PRAs to re-evaluate risk for changes in component allowed outage times (AOTs) and surveillance test intervals (STIs). The SOCRATES program is a unique and important tool for technical specification evaluations. The detailed component unavailability model allows a detailed analysis of AOT and STI contributions to risk. Explicit equations allow fast and inexpensive calculations. Because the code is designed to accept ranges of parameters and to save results of calculations that do not change during the analysis, sensitivity studies are efficiently performed and results are clearly displayed
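
    A hedged, textbook-level sketch of how AOT and STI changes can be translated into a risk change is given below; it is not the detailed SOCRATES component unavailability model, and the failure rate, downtime figures and Birnbaum importance are invented.

        def mean_unavailability(failure_rate_per_h, sti_h, outage_freq_per_h, mean_outage_h):
            """Average unavailability of a periodically tested standby component."""
            standby = 0.5 * failure_rate_per_h * sti_h     # undetected failed time between tests
            downtime = outage_freq_per_h * mean_outage_h   # test/maintenance outages (bounded by the AOT)
            return standby + downtime

        base = mean_unavailability(1e-5, 720.0, 1.0 / 8760.0, 12.0)         # monthly test
        longer_sti = mean_unavailability(1e-5, 1440.0, 1.0 / 8760.0, 12.0)  # bimonthly test

        birnbaum = 2e-4   # dCDF/dq for this component, taken from a PRA model (assumed)
        print("delta unavailability      :", longer_sti - base)
        print("approx. delta CDF per year:", birnbaum * (longer_sti - base))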

  8. Uncertainty analysis on probabilistic fracture mechanics assessment methodology

    International Nuclear Information System (INIS)

    Rastogi, Rohit; Vinod, Gopika; Chandra, Vikas; Bhasin, Vivek; Babar, A.K.; Rao, V.V.S.S.; Vaze, K.K.; Kushwaha, H.S.; Venkat-Raj, V.

    1999-01-01

    Fracture Mechanics has found a profound usage in the area of design of components and assessing fitness for purpose/residual life estimation of an operating component. Since defect size and material properties are statistically distributed, various probabilistic approaches have been employed for the computation of fracture probability. Monte Carlo Simulation is one such procedure towards the analysis of fracture probability. This paper deals with uncertainty analysis using the Monte Carlo Simulation methods. These methods were developed based on the R6 failure assessment procedure, which has been widely used in analysing the integrity of structures. The application of this method is illustrated with a case study. (author)
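
    The sketch below illustrates the Monte Carlo estimation of a fracture probability. For brevity it applies a plain LEFM criterion (K_I > K_IC) instead of the full R6 failure assessment diagram used in the paper, and the crack-size and toughness distributions, stress and geometry factor are invented.

        import numpy as np

        rng = np.random.default_rng(7)
        n = 200_000

        a = rng.lognormal(mean=np.log(0.01), sigma=0.6, size=n)   # crack depth [m] (assumed)
        k_ic = rng.normal(loc=100.0, scale=15.0, size=n)          # fracture toughness [MPa*m^0.5] (assumed)
        sigma = 300.0                                             # applied stress [MPa] (assumed)
        Y = 1.12                                                  # geometry factor (assumed)

        k_i = Y * sigma * np.sqrt(np.pi * a)                      # stress intensity factor
        p_fail = np.mean(k_i > k_ic)                              # fraction of sampled states that fail
        print(f"estimated fracture probability: {p_fail:.2e}")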

  9. Full cost accounting in the analysis of separated waste collection efficiency: A methodological proposal.

    Science.gov (United States)

    D'Onza, Giuseppe; Greco, Giulio; Allegrini, Marco

    2016-02-01

    Recycling implies additional costs for separated municipal solid waste (MSW) collection. The aim of the present study is to propose and implement a management tool - the full cost accounting (FCA) method - to calculate the full collection costs of different types of waste. Our analysis aims for a better understanding of the difficulties of putting FCA into practice in the MSW sector. We propose a FCA methodology that uses standard cost and actual quantities to calculate the collection costs of separate and undifferentiated waste. Our methodology allows cost efficiency analysis and benchmarking, overcoming problems related to firm-specific accounting choices, earnings management policies and purchase policies. Our methodology allows benchmarking and variance analysis that can be used to identify the causes of off-standards performance and guide managers to deploy resources more efficiently. Our methodology can be implemented by companies lacking a sophisticated management accounting system.
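
    A minimal sketch of the 'standard cost times actual quantity' step and the resulting variance analysis is shown below; the waste streams, standard costs, tonnages and ledger costs are invented for illustration.

        # Standard unit collection costs (EUR per tonne) and actual data (invented).
        standard_cost_per_t = {"paper": 95.0, "glass": 60.0, "organic": 80.0,
                               "undifferentiated": 45.0}
        actual_tonnes = {"paper": 1200.0, "glass": 800.0, "organic": 1500.0,
                         "undifferentiated": 4000.0}
        actual_cost = {"paper": 123_000.0, "glass": 46_000.0, "organic": 128_000.0,
                       "undifferentiated": 171_000.0}

        for waste, tonnes in actual_tonnes.items():
            standard_full_cost = standard_cost_per_t[waste] * tonnes
            variance = actual_cost[waste] - standard_full_cost   # off-standard spending
            print(f"{waste:17s} standard = {standard_full_cost:9.0f} EUR   "
                  f"variance = {variance:+9.0f} EUR")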

  10. Analysis of market competitive structure: The new methodological approach based in the using

    International Nuclear Information System (INIS)

    Romero de la Fuente, J.; Yague Guillen, M. J.

    2007-01-01

    This paper proposes a new methodological approach to identifying market competitive structure, applying the usage situation concept in positioning analysis. The dimensions used by consumers to classify products are identified using Correspondence Analysis, and competitive groups are formed. Results are validated with Discriminant Analysis. (Author) 23 refs.

  11. BWR stability analysis: methodology of the stability analysis and results of PSI for the NEA/NCR benchmark task

    International Nuclear Information System (INIS)

    Hennig, D.; Nechvatal, L.

    1996-09-01

    The report describes the PSI stability analysis methodology and the validation of this methodology based on the international OECD/NEA BWR stability benchmark task. In the frame of this work, the stability properties of some operation points of the NPP Ringhals 1 have been analysed and compared with the experimental results. (author) figs., tabs., 45 refs

  12. Physical data generation methodology for return-to-power steam line break analysis

    Energy Technology Data Exchange (ETDEWEB)

    Zee, Sung Kyun; Lee, Chung Chan; Lee, Chang Kue [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-02-01

    Current methodology to generate physics data for the steamline break accident analysis of CE-type nuclear plants such as Yonggwang Unit 3 is valid only if the core reactivity does not reach criticality after shutdown. The methodology therefore requires a tremendous amount of net scram worth, especially at the end of the cycle when the moderator temperature coefficient is most negative. A new methodology is thus needed to obtain reasonably conservative physics data when the reactor returns to a power condition. The current methodology uses ROCS, which includes only a closed channel model, but it is well known that the closed channel model estimates the core reactivity as too negative when the core flow rate is low. Therefore, a conservative methodology is presented which utilizes the open channel 3D HERMITE model. Return-to-power reactivity credit is produced to supplement the reactivity table generated by the closed channel model. Other data include the hot channel axial power shape, peaking factor and maximum quality for DNBR analysis, as well as the pin census for radiological consequence analysis. 48 figs., 22 tabs., 18 refs. (Author)

  13. SAFETY ANALYSIS METHODOLOGY FOR AGED CANDU® 6 NUCLEAR REACTORS

    Directory of Open Access Journals (Sweden)

    WOLFGANG HARTMANN

    2013-10-01

    Full Text Available This paper deals with the safety analysis for CANDU® 6 nuclear reactors as affected by main Heat Transport System (HTS) aging. Operational and aging-related changes of the HTS throughout its lifetime may lead to restrictions in certain safety system settings and hence some restriction in performance under certain conditions. A step in confirming safe reactor operation is the tracking of relevant data and their corresponding interpretation by the use of appropriate thermalhydraulic analytic models. Safety analyses ranging from the assessment of safety limits associated with the prevention of intermittent fuel sheath dryout for a slow Loss of Regulation (LOR) analysis to fission gas release after a fuel failure are summarized. Specifically for fission gas release, the thermalhydraulic analysis for a fresh core and an 11 Effective Full Power Years (EFPY) aged core was summarized, leading to the most severe stagnation break sizes for the inlet feeder break and the channel failure time. Associated coolant conditions provide the input data for fuel analyses. Based on the thermalhydraulic data, the fission product inventory under normal operating conditions may be calculated for both fresh and aged cores, and the fission gas release may be evaluated during the transient. This analysis plays a major role in determining possible radiation doses to the public after postulated accidents have occurred.

  14. Methodological aspects in the analysis of spontaneously produced sputum

    NARCIS (Netherlands)

    Out, T. A.; Jansen, H. M.; Lutter, R.

    2001-01-01

    Analysis of sputum as a specimen containing inflammatory indices has gained considerable interest during the last decade with focus on chronic bronchitis (CB) with or without airway obstruction, cystic fibrosis (CF), chronic obstructive pulmonary disease (COPD) and asthma. The nature of the

  15. Recent methodology in the phytochemical analysis of ginseng

    NARCIS (Netherlands)

    Angelova, N.; Kong, H.-W.; Heijden, R. van de; Yang, S.-Y.; Choi, Y.H.; Kim, H.K.; Wang, M.; Hankemeier, T.; Greef, J. van der; Xu, G.; Verpoorte, R.

    2008-01-01

    This review summarises the most recent developments in ginseng analysis, in particular the novel approaches in sample pre-treatment and the use of high-performance liquid-chromatography-mass spectrometry. The review also presents novel data on analysing ginseng extracts by nuclear magnetic resonance

  16. Important Literature in Endocrinology: Citation Analysis and Historical Methodology.

    Science.gov (United States)

    Hurt, C. D.

    1982-01-01

    Results of a study comparing two approaches to the identification of important literature in endocrinology reveal that the association between the rankings of cited items produced by the two methods is not statistically significant and that the use of citation or historical analysis alone will not result in the same set of literature. Forty-two sources are appended. (EJS)

  17. Boolean comparative analysis of qualitative data : a methodological note

    NARCIS (Netherlands)

    Romme, A.G.L.

    1995-01-01

    This paper explores the use of Boolean logic in the analysis of qualitative data, especially on the basis of so-called process theories. Process theories treat independent variables as necessary conditions which are binary rather than variable in nature, while the dependent variable is a final

  18. CANDU safety analysis system establishment; development of trip coverage and multi-dimensional hydrogen analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Choi, Jong Ho; Ohn, M. Y.; Cho, C. H. [KOPEC, Taejon (Korea)

    2002-03-01

    The trip coverage analysis model requires the geometry network for primary and secondary circuit as well as the plant control system to simulate all the possible plant operating conditions throughout the plant life. The model was validated for the power maneuvering and the Wolsong 4 commissioning test. The trip coverage map was produced for the large break loss of coolant accident and the complete loss of class IV power event. The reliable multi-dimensional hydrogen analysis requires the high capability for thermal hydraulic modelling. To acquire such a basic capability and verify the applicability of GOTHIC code, the assessment of heat transfer model, hydrogen mixing and combustion model was performed. Also, the assessment methodology for flame acceleration and deflagration-to-detonation transition is established. 22 refs., 120 figs., 31 tabs. (Author)

  19. Big and complex data analysis methodologies and applications

    CERN Document Server

    2017-01-01

    This volume conveys some of the surprises, puzzles and success stories in high-dimensional and complex data analysis and related fields. Its peer-reviewed contributions showcase recent advances in variable selection, estimation and prediction strategies for a host of useful models, as well as essential new developments in the field. The continued and rapid advancement of modern technology now allows scientists to collect data of increasingly unprecedented size and complexity. Examples include epigenomic data, genomic data, proteomic data, high-resolution image data, high-frequency financial data, functional and longitudinal data, and network data. Simultaneous variable selection and estimation is one of the key statistical problems involved in analyzing such big and complex data. The purpose of this book is to stimulate research and foster interaction between researchers in the area of high-dimensional data analysis. More concretely, its goals are to: 1) highlight and expand the breadth of existing methods in...

  20. Environmental analysis applied to schools. Methodologies for data acquisition

    International Nuclear Information System (INIS)

    Andriola, L.; Ceccacci, R.

    2001-01-01

    The environmental analysis is the basis of environmental management for organizations and is considered the first step in EMAS. It allows organizations to identify and deal with environmental issues and to gain clear knowledge of their environmental performance. Schools can be counted among such organizations. Nevertheless, the complexity of environmental issues and applicable regulations makes it very difficult for a school that wants to implement an environmental management system (EMAS, ISO 14001, etc.) to face this first step. An instrument has therefore been defined that is simple yet complete and consistent with the reference standards, allowing schools to shape their own process for preparing the initial environmental review. This instrument essentially consists of cards that, once completed, facilitate the drafting of the environmental analysis report

  1. Development of analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kim, Cheol Woo; Kwon, Young Min; Kim, Sook Kwan

    1995-04-01

    A study for the development of an analysis methodology for hot leg break mass and energy release is performed. For the blowdown period a modified CEFLASH-4A methodology is suggested. For the post-blowdown period a modified CONTRAST boil-off model is suggested. Using these computer codes, improved mass and energy release data are generated. A RELAP5/MOD3 analysis was also performed, and finally the FLOOD-3 computer code has been modified for use in the analysis of hot leg break. The results of the analysis using the modified FLOOD-3 are reasonable, as expected, and their trends are good. 66 figs., 8 tabs. (Author)

  2. Exploratory market structure analysis. Topology-sensitive methodology.

    OpenAIRE

    Mazanec, Josef

    1999-01-01

    Given the recent abundance of brand choice data from scanner panels, market researchers have neglected the measurement and analysis of perceptions. Heterogeneity of perceptions is still a largely unexplored issue in market structure and segmentation studies. Over the last decade various parametric approaches toward modelling segmented perception-preference structures, such as combined MDS and Latent Class procedures, have been introduced. These methods, however, are not tailored for qualitative ...

  3. On the methodology of the analysis of Moessbauer spectra

    International Nuclear Information System (INIS)

    Vandenberghe, R.E.; Grave, E. de; Bakker, P.M.A. de

    1994-01-01

    A review is presented of the direct fitting procedures which are used in the analysis of Moessbauer spectra. Direct lineshape fitting with alternative profiles as well as shape-dependent, shape-independent and quasi shape-independent distribution fitting methods all can easily be incorporated in one computer program scheme yielding a large versatility for modification and/or extension of the programs according to specific spectra. (orig.)

  4. Methodological issues underlying multiple decrement life table analysis.

    Science.gov (United States)

    Mode, C J; Avery, R C; Littman, G S; Potter, R G

    1977-02-01

    In this paper, the actuarial method of multiple decrement life table analysis of censored, longitudinal data is examined. The discussion is organized in terms of the first segment of usage of an intrauterine device. Weaknesses of the actuarial approach are pointed out, and an alternative approach, based on the classical model of competing risks, is proposed. Finally, the actuarial and the alternative method of analyzing censored data are compared, using data from the Taichung Medical Study on Intrauterine Devices.
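
    The sketch below reproduces the basic actuarial (life-table) calculation that the paper examines: interval-specific probabilities for two competing causes of IUD discontinuation, with censored women contributing half an interval of exposure. All counts are invented.

        # (interval label, N entering, cause-1 events, cause-2 events, censored)
        intervals = [("0-3 mo", 500, 12, 8, 30),
                     ("3-6 mo", 450, 9, 11, 25),
                     ("6-12 mo", 405, 10, 14, 40)]

        continuing = 1.0
        for label, n, d_expulsion, d_removal, censored in intervals:
            exposed = n - 0.5 * censored        # actuarial adjustment for censoring
            q1 = d_expulsion / exposed          # cause-specific interval probability
            q2 = d_removal / exposed
            q_all = (d_expulsion + d_removal) / exposed
            continuing *= 1.0 - q_all
            print(f"{label:8s} q_expulsion = {q1:.3f}  q_removal = {q2:.3f}  "
                  f"cumulative continuation = {continuing:.3f}")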

  5. Applications of the TSUNAMI sensitivity and uncertainty analysis methodology

    International Nuclear Information System (INIS)

    Rearden, Bradley T.; Hopper, Calvin M.; Elam, Karla R.; Goluoglu, Sedat; Parks, Cecil V.

    2003-01-01

    The TSUNAMI sensitivity and uncertainty analysis tools under development for the SCALE code system have recently been applied in four criticality safety studies. TSUNAMI is used to identify applicable benchmark experiments for criticality code validation, assist in the design of new critical experiments for a particular need, reevaluate previously computed computational biases, and assess the validation coverage and propose a penalty for noncoverage for a specific application. (author)

  6. Integrated modeling and analysis methodology for precision pointing applications

    Science.gov (United States)

    Gutierrez, Homero L.

    2002-07-01

    Space-based optical systems that perform tasks such as laser communications, Earth imaging, and astronomical observations require precise line-of-sight (LOS) pointing. A general approach is described for integrated modeling and analysis of these types of systems within the MATLAB/Simulink environment. The approach can be applied during all stages of program development, from early conceptual design studies to hardware implementation phases. The main objective is to predict the dynamic pointing performance subject to anticipated disturbances and noise sources. Secondary objectives include assessing the control stability, levying subsystem requirements, supporting pointing error budgets, and performing trade studies. The integrated model resides in Simulink, and several MATLAB graphical user interfaces (GUIs) allow the user to configure the model, select analysis options, run analyses, and process the results. A convenient parameter naming and storage scheme, as well as model conditioning and reduction tools and run-time enhancements, are incorporated into the framework. This enables the proposed architecture to accommodate models of realistic complexity.

  7. Complexity and Vulnerability Analysis of Critical Infrastructures: A Methodological Approach

    Directory of Open Access Journals (Sweden)

    Yongliang Deng

    2017-01-01

    Full Text Available Vulnerability analysis of network models has been widely adopted to explore the potential impacts of random disturbances, deliberate attacks, and natural disasters. However, almost all these models are based on a fixed topological structure, in which the physical properties of infrastructure components and their interrelationships are not well captured. In this paper, a new research framework is put forward to quantitatively explore and assess the complexity and vulnerability of critical infrastructure systems. Then, a case study is presented to prove the feasibility and validity of the proposed framework. After constructing the metro physical network (MPN), Pajek is employed to analyze its corresponding topological properties, including degree, betweenness, average path length, network diameter, and clustering coefficient. A comprehensive understanding of the complexity of the MPN can help the metro system restrain near-misses or accidents and support decision-making in emergency situations. Moreover, through the analysis of two simulation protocols for system component failure, it is found that the MPN turns out to be vulnerable when high-degree nodes or high-betweenness edges are attacked. These findings will be conducive to offering recommendations and proposals for robust design, risk-based decision-making, and prioritization of risk reduction investment.
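
    The snippet below sketches, on a small invented graph rather than the actual metro physical network, both the topological measures listed above and the 'remove the high-degree nodes' failure protocol; it uses the networkx library instead of Pajek.

        import networkx as nx

        G = nx.Graph([(1, 2), (2, 3), (3, 4), (4, 5), (2, 6), (6, 7), (7, 3),
                      (5, 8), (8, 9), (9, 4)])   # toy network (invented)

        print("degree centrality      :", nx.degree_centrality(G))
        print("betweenness centrality :", nx.betweenness_centrality(G))
        print("average path length    :", nx.average_shortest_path_length(G))
        print("network diameter       :", nx.diameter(G))
        print("clustering coefficient :", nx.average_clustering(G))

        # Targeted-attack protocol: remove nodes in decreasing degree order and
        # track the size of the largest surviving connected component.
        H = G.copy()
        for node, _ in sorted(G.degree, key=lambda kv: kv[1], reverse=True)[:3]:
            H.remove_node(node)
            giant = max(nx.connected_components(H), key=len)
            print(f"removed node {node}: largest component size = {len(giant)}")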

  8. Methodology for Design and Analysis of Reactive Distillation Involving Multielement Systems

    DEFF Research Database (Denmark)

    Jantharasuk, Amnart; Gani, Rafiqul; Górak, Andrzej

    2011-01-01

    A new methodology for design and analysis of reactive distillation has been developed. In this work, the elementbased approach, coupled with a driving force diagram, has been extended and applied to the design of a reactive distillation column involving multielement (multicomponent) systems...... consisting of two components. Based on this methodology, an optimal design configuration is identified using the equivalent binary-element-driving force diagram. Two case studies of methyl acetate (MeOAc) synthesis and methyl-tert-butyl ether (MTBE) synthesis have been considered to demonstrate...... the successful applications of the methodology. Moreover, energy requirements for various column configurations corresponding to different feed locatio...

  9. Landslide risk analysis: a multi-disciplinary methodological approach

    Directory of Open Access Journals (Sweden)

    S. Sterlacchini

    2007-11-01

    Full Text Available This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis.

    A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities. This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect

  10. Landslide risk analysis: a multi-disciplinary methodological approach

    Science.gov (United States)

    Sterlacchini, S.; Frigerio, S.; Giacomelli, P.; Brambilla, M.

    2007-11-01

    This study describes an analysis carried out within the European community project "ALARM" (Assessment of Landslide Risk and Mitigation in Mountain Areas, 2004) on landslide risk assessment in the municipality of Corvara in Badia, Italy. This mountainous area, located in the central Dolomites (Italian Alps), poses a significant landslide hazard to several man-made and natural objects. Three parameters for determining risk were analysed as an aid to preparedness and mitigation planning: event occurrence probability, elements at risk, and the vulnerability of these elements. Initially, a landslide hazard scenario was defined; this step was followed by the identification of the potential vulnerable elements, by the estimation of the expected physical effects, due to the occurrence of a damaging phenomenon, and by the analysis of social and economic features of the area. Finally, a potential risk scenario was defined, where the relationships between the event, its physical effects, and its economic consequences were investigated. People and public administrators with training and experience in local landsliding and slope processes were involved in each step of the analysis. A "cause-effect" correlation was applied, derived from the "dose-response" equation initially used in the biological sciences and then adapted by economists for the assessment of environmental risks. The relationship was analysed from a physical point of view and the cause (the natural event) was correlated to the physical effects, i.e. the aesthetic, functional, and structural damage. An economic evaluation of direct and indirect damage was carried out considering the assets in the affected area (i.e., tourist flows, goods, transport and the effect on other social and economic activities). This study shows the importance of indirect damage, which is as significant as direct damage. The total amount of direct damage was estimated in 8 913 000 €; on the contrary, indirect damage ranged considerably

  11. Mediation analysis in nursing research: a methodological review.

    Science.gov (United States)

    Liu, Jianghong; Ulrich, Connie

    2016-12-01

    Mediation statistical models help clarify the relationship between independent predictor variables and dependent outcomes of interest by assessing the impact of third variables. This type of statistical analysis is applicable for many clinical nursing research questions, yet its use within nursing remains low. Indeed, mediational analyses may help nurse researchers develop more effective and accurate prevention and treatment programs as well as help bridge the gap between scientific knowledge and clinical practice. In addition, this statistical approach allows nurse researchers to ask - and answer - more meaningful and nuanced questions that extend beyond merely determining whether an outcome occurs. Therefore, the goal of this paper is to provide a brief tutorial on the use of mediational analyses in clinical nursing research by briefly introducing the technique and, through selected empirical examples from the nursing literature, demonstrating its applicability in advancing nursing science.
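
    As a hedged sketch of the product-of-coefficients approach to mediation, the code below fits the two regressions on simulated data for a hypothetical chain X (staffing) -> M (missed care) -> Y (outcome); the variable names and effect sizes are invented, and in practice the indirect effect would also be tested (for example by bootstrapping).

        import numpy as np

        rng = np.random.default_rng(0)
        n = 400
        x = rng.normal(size=n)                        # independent variable
        m = 0.5 * x + rng.normal(scale=0.8, size=n)   # mediator
        y = 0.4 * m + 0.1 * x + rng.normal(size=n)    # outcome

        def ols(predictors, target):
            """Least-squares coefficients with an intercept column prepended."""
            X = np.column_stack([np.ones(len(target)), *predictors])
            return np.linalg.lstsq(X, target, rcond=None)[0]

        a = ols([x], m)[1]                  # path a: X -> M
        b, c_prime = ols([m, x], y)[1:3]    # path b: M -> Y, and direct effect c'
        c_total = ols([x], y)[1]            # total effect c

        print(f"indirect effect a*b = {a * b:.3f}")
        print(f"direct effect c'    = {c_prime:.3f}")
        print(f"total effect c      = {c_total:.3f}   (approximately a*b + c')")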

  12. Methodology for analysis and simulation of large multidisciplinary problems

    Science.gov (United States)

    Russell, William C.; Ikeda, Paul J.; Vos, Robert G.

    1989-01-01

    The Integrated Structural Modeling (ISM) program is being developed for the Air Force Weapons Laboratory and will be available for Air Force work. Its goal is to provide a design, analysis, and simulation tool intended primarily for directed energy weapons (DEW), kinetic energy weapons (KEW), and surveillance applications. The code is designed to run on DEC (VMS and UNIX), IRIS, Alliant, and Cray hosts. Several technical disciplines are included in ISM, namely structures, controls, optics, thermal, and dynamics. Four topics from the broad ISM goal are discussed. The first is project configuration management and includes two major areas: the software and database arrangement and the system model control. The second is interdisciplinary data transfer and refers to exchange of data between various disciplines such as structures and thermal. Third is a discussion of the integration of component models into one system model, i.e., multiple discipline model synthesis. Last is a presentation of work on a distributed processing computing environment.

  13. THE MURCHISON WIDEFIELD ARRAY 21 cm POWER SPECTRUM ANALYSIS METHODOLOGY

    Energy Technology Data Exchange (ETDEWEB)

    Jacobs, Daniel C.; Beardsley, A. P.; Bowman, Judd D. [Arizona State University, School of Earth and Space Exploration, Tempe, AZ 85287 (United States); Hazelton, B. J.; Sullivan, I. S.; Barry, N.; Carroll, P. [University of Washington, Department of Physics, Seattle, WA 98195 (United States); Trott, C. M.; Pindor, B.; Briggs, F.; Gaensler, B. M. [ARC Centre of Excellence for All-sky Astrophysics (CAASTRO) (Australia); Dillon, Joshua S.; Oliveira-Costa, A. de; Ewall-Wice, A.; Feng, L. [MIT Kavli Institute for Astrophysics and Space Research, Cambridge, MA 02139 (United States); Pober, J. C. [Brown University, Department of Physics, Providence, RI 02912 (United States); Bernardi, G. [Department of Physics and Electronics, Rhodes University, Grahamstown 6140 (South Africa); Cappallo, R. J.; Corey, B. E. [MIT Haystack Observatory, Westford, MA 01886 (United States); Emrich, D., E-mail: daniel.c.jacobs@asu.edu [International Centre for Radio Astronomy Research, Curtin University, Perth, WA 6845 (Australia); and others

    2016-07-10

    We present the 21 cm power spectrum analysis approach of the Murchison Widefield Array Epoch of Reionization project. In this paper, we compare the outputs of multiple pipelines for the purpose of validating statistical limits on cosmological hydrogen at redshifts between 6 and 12. Multiple independent data calibration and reduction pipelines are used to make power spectrum limits on a fiducial night of data. Comparing the outputs of imaging and power spectrum stages highlights differences in calibration, foreground subtraction, and power spectrum calculation. The power spectra found using these different methods span a space defined by the various tradeoffs between speed, accuracy, and systematic control. Lessons learned from comparing the pipelines range from the algorithmic to the prosaically mundane; all demonstrate the many pitfalls of neglecting reproducibility. We briefly discuss the way these different methods attempt to handle the question of evaluating a significant detection in the presence of foregrounds.

  14. Methodology for global nonlinear analysis of nuclear systems

    International Nuclear Information System (INIS)

    Cacuci, D.G.; Cacuci, G.L.

    1987-01-01

    This paper outlines a general method for globally computing the crucial features of nonlinear problems: bifurcations, limit points, saddle points, extrema (maxima and minima); our method also yields the local sensitivities (i.e., first order derivatives) of the system's state variables (e.g., fluxes, power, temperatures, flows) at any point in the system's phase space. We also present an application of this method to the nonlinear BWR model discussed in Refs. 8 and 11. The most significant novel feature of our method is the recasting of a general mathematical problem, comprising nonlinear constrained optimization and sensitivity analysis, into a fixed point problem of the form F[u(s), λ(s)] = 0 whose global zeros and singular points are related to the special features (i.e., extrema, bifurcations, etc.) of the original problem

  15. New approaches in intelligent image analysis techniques, methodologies and applications

    CERN Document Server

    Nakamatsu, Kazumi

    2016-01-01

    This book presents an Introduction and 11 independent chapters, which are devoted to various new approaches of intelligent image processing and analysis. The book also presents new methods, algorithms and applied systems for intelligent image processing, on the following basic topics: Methods for Hierarchical Image Decomposition; Intelligent Digital Signal Processing and Feature Extraction; Data Clustering and Visualization via Echo State Networks; Clustering of Natural Images in Automatic Image Annotation Systems; Control System for Remote Sensing Image Processing; Tissue Segmentation of MR Brain Images Sequence; Kidney Cysts Segmentation in CT Images; Audio Visual Attention Models in Mobile Robots Navigation; Local Adaptive Image Processing; Learning Techniques for Intelligent Access Control; Resolution Improvement in Acoustic Maps. Each chapter is self-contained with its own references. Some of the chapters are devoted to the theoretical aspects while the others are presenting the practical aspects and the...

  16. A faster reactor transient analysis methodology for PCs

    International Nuclear Information System (INIS)

    Ott, K.O.

    1991-10-01

    The simplified ANL model for LMR transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied, as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes the form of a quadratic equation, the "quadratic dynamics equation." This model forms the basis for LTC, a GW-BASIC program for LMR Transient Calculation, which can effectively be run on a PC. The GW-BASIC version of the LTC program is described in detail in Volume 2 of this report
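
    The toy snippet below illustrates how a quadratic equation for power arises from the prompt jump approximation when a linear power-reactivity feedback is added; it uses a single delayed-neutron group rather than the full set of lumped balance equations of the LTC model, and every parameter value is invented.

        import numpy as np

        beta, lam, Lam = 0.0035, 0.08, 4e-7   # delayed fraction, precursor decay const (1/s), generation time (s)
        alpha = -0.3 * beta                   # linear feedback coefficient per unit relative power (assumed)
        rho_ext = 0.5 * beta                  # external reactivity step of $0.5 (assumed)

        P, P0 = 1.0, 1.0
        C = beta * P0 / (Lam * lam)           # equilibrium precursor concentration
        dt = 1.0
        for step in range(1, 61):
            # Prompt jump: 0 = (rho - beta)/Lam * P + lam*C with rho = rho_ext + alpha*(P - P0),
            # i.e. the quadratic  alpha*P**2 - (beta - rho_ext + alpha*P0)*P + lam*C*Lam = 0.
            coeffs = [alpha, -(beta - rho_ext + alpha * P0), lam * C * Lam]
            roots = np.roots(coeffs)
            P = min((r.real for r in roots if abs(r.imag) < 1e-12 and r.real > 0),
                    key=lambda root: abs(root - P))   # physical, continuous root
            # Advance the precursor balance over dt, holding P constant on the step.
            C = C * np.exp(-lam * dt) + beta * P / (Lam * lam) * (1.0 - np.exp(-lam * dt))
            if step % 15 == 0:
                print(f"t = {step * dt:5.1f} s   P/P0 = {P:6.3f}")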

  17. A study on the dependency evaluation for multiple human actions in human reliability analysis of probabilistic safety assessment

    International Nuclear Information System (INIS)

    Kang, D. I.; Yang, J. E.; Jung, W. D.; Sung, T. Y.; Park, J. H.; Lee, Y. H.; Hwang, M. J.; Kim, K. Y.; Jin, Y. H.; Kim, S. C.

    1997-02-01

    This report describes the study results on the method of dependency evaluation and modeling, and on the limiting value of human error probability (HEP), for multiple human actions in accident sequences of a probabilistic safety assessment (PSA). THERP and Parry's method, which have been generally used in the dependency evaluation of human reliability analysis (HRA), are introduced and their limitations are discussed. A new dependency evaluation method in HRA is established to make up for the weak points of the THERP and Parry methods. The limiting value of HEP is also established based on a review of several HRA-related documents. This report describes the definition, the types, the evaluation method, and an evaluation example of dependency to help the reader's understanding. It is expected that these study results will give guidance to HRA analysts in the dependency evaluation of multiple human actions and enable PSA analysts to understand HRA in detail. (author). 23 refs., 3 tabs., 2 figs
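
    For reference, the THERP conditional-probability formulas that this kind of dependency evaluation starts from are sketched below (zero, low, moderate, high and complete dependence); the basic HEP value used in the example is arbitrary, and the report's own method modifies how the dependence level is selected rather than these equations.

        def conditional_hep(basic_hep, level):
            """THERP conditional HEP of an action, given failure of the preceding action."""
            formulas = {"ZD": lambda n: n,                   # zero dependence
                        "LD": lambda n: (1 + 19 * n) / 20,   # low dependence
                        "MD": lambda n: (1 + 6 * n) / 7,     # moderate dependence
                        "HD": lambda n: (1 + n) / 2,         # high dependence
                        "CD": lambda n: 1.0}                 # complete dependence
            return formulas[level](basic_hep)

        basic = 1e-3   # arbitrary basic HEP for the second of two actions
        for level in ("ZD", "LD", "MD", "HD", "CD"):
            print(f"{level}: conditional HEP = {conditional_hep(basic, level):.3e}")

        # Joint probability that both actions fail, assuming high dependence and
        # the same basic HEP for each action:
        joint = basic * conditional_hep(basic, "HD")
        print(f"joint HEP (HD): {joint:.2e}")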

  18. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis

    OpenAIRE

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-01-01

    In this paper a new methodology for the diagnosing of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis by using filters such as the classic, inverse and k-law nonlinear. The sample images were obtained by a medical specialist and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in can...

  19. Methodologies for the Statistical Analysis of Memory Response to Radiation

    CERN Document Server

    Bosser, Alexandre L; Tsiligiannis, Georgios; Frost, Christopher D; Zadeh, Ali; Jaatinen, Jukka; Javanainen, Arto; Puchner, Helmut; Saigne, Frederic; Virtanen, Ari; Wrobel, Frederic; Dilillo, Luigi

    2016-01-01

    Methodologies are proposed for in-depth statistical analysis of Single Event Upset data. The motivation for using these methodologies is to obtain precise information on the intrinsic defects and weaknesses of the tested devices, and to gain insight on their failure mechanisms, at no additional cost. The case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions. This publication is an extended version of a previous study [1].

  20. Fuzzy Clustering based Methodology for Multidimensional Data Analysis in Computational Forensic Domain

    OpenAIRE

    Kilian Stoffel; Paul Cotofrei; Dong Han

    2012-01-01

    As an interdisciplinary domain requiring advanced and innovative methodologies, computational forensics is characterized by data that are simultaneously large-scale, uncertain, multidimensional and approximate. Forensic domain experts, trained to discover hidden patterns in crime data, are limited in their analysis without the assistance of a computational intelligence approach. In this paper a methodology and an automatic procedure based on fuzzy set theory and designed to infer precis...
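
    The abstract is truncated before the procedure itself is described, so the sketch below only illustrates the generic fuzzy clustering idea it builds on: fuzzy c-means assigns each observation a graded membership in every cluster instead of a hard label, which suits approximate, uncertain forensic data. The data and parameters are synthetic.

    import numpy as np

    def fuzzy_c_means(X, n_clusters=2, m=2.0, n_iter=100, seed=0):
        """Minimal fuzzy c-means: soft memberships instead of hard assignments."""
        rng = np.random.default_rng(seed)
        U = rng.random((len(X), n_clusters))
        U /= U.sum(axis=1, keepdims=True)          # membership rows sum to 1
        for _ in range(n_iter):
            Um = U ** m
            centers = (Um.T @ X) / Um.sum(axis=0)[:, None]
            dist = np.linalg.norm(X[:, None, :] - centers[None, :, :], axis=2) + 1e-12
            U = 1.0 / dist ** (2.0 / (m - 1.0))
            U /= U.sum(axis=1, keepdims=True)
        return centers, U

    # Two noisy, overlapping groups standing in for multidimensional case data
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (50, 3)), rng.normal(4, 1, (50, 3))])
    centers, U = fuzzy_c_means(X)
    print(np.round(centers, 2))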

  1. Methodology of the Integrated Analysis of Company's Financial Status and Its Performance Results

    OpenAIRE

    Mackevičius, Jonas; Valkauskas, Romualdas

    2010-01-01

    Information about company's financial status and its performance results is very important for the objective evaluation of company's position in the market and competitive possibilities in the future. Such information is provided in the financial statement. It is important to apply and investigate this information properly. The methodology of company's financial status and performance results integrated analysis is recommended in this article. This methodology consists of these three elements...

  2. A fast reactor transient analysis methodology for personal computers

    International Nuclear Information System (INIS)

    Ott, K.O.

    1993-01-01

    A simplified model for liquid-metal-cooled reactor (LMR) transient analysis, in which point kinetics as well as lumped descriptions of the heat transfer equations in all components are applied, is converted from a differential into an integral formulation. All 30 differential balance equations are implicitly solved in terms of convolution integrals. The prompt jump approximation is applied as the strong negative feedback effectively keeps the net reactivity well below prompt critical. After implicit finite differencing of the convolution integrals, the kinetics equation assumes a new form, i.e., the quadratic dynamics equation. In this integral formulation, the initial value problem of typical LMR transients can be solved with large time steps (initially 1 s, later up to 256 s). This makes transient problems amenable to treatment on a personal computer. The resulting mathematical model forms the basis for the GW-BASIC LMR transient calculation (LTC) program. The LTC program has also been converted to QuickBASIC. The running time for a 10-h overpower transient is then ∼40 to 10 s, depending on the hardware version (286, 386, or 486 with math coprocessors).

  3. Textbooks in transitional countries: Towards a methodology for comparative analysis

    Directory of Open Access Journals (Sweden)

    Miha Kovač

    2004-01-01

    Full Text Available In its first part, the paper analyses the ambiguous nature of the book as a medium: its physical production and its distribution to the end user take place on a market basis; on the other hand, its content is predominantly consumed in a sector that, at least in continental Europe, was traditionally considered public and non-profit. This ambiguous nature of the book, and with it the impact of the market on the organization of knowledge in book format, remains a dark spot in contemporary book research. On the other hand, textbooks are considered ephemera in both contemporary education and book studies. Research on textbook publishing models can therefore be considered a blind spot of contemporary social studies. As a consequence, in the majority of European countries, textbook publishing and the organization of the textbook market are considered self-evident. Through a comparative analysis of textbook publishing models in small transitional and developed countries, the paper points out that this seemingly self-evident organization of the textbook market is always culturally determined. In its final part, the paper compares different models of textbook publishing and outlines scenarios for the development of the Slovene textbook market.

  4. Energy minimization in medical image analysis: Methodologies and applications.

    Science.gov (United States)

    Zhao, Feng; Xie, Xianghua

    2016-02-01

    Energy minimization is of particular interest in medical image analysis. In the past two decades, a variety of optimization schemes have been developed. In this paper, we present a comprehensive survey of the state-of-the-art optimization approaches. These algorithms are mainly classified into two categories: continuous method and discrete method. The former includes Newton-Raphson method, gradient descent method, conjugate gradient method, proximal gradient method, coordinate descent method, and genetic algorithm-based method, while the latter covers graph cuts method, belief propagation method, tree-reweighted message passing method, linear programming method, maximum margin learning method, simulated annealing method, and iterated conditional modes method. We also discuss the minimal surface method, primal-dual method, and the multi-objective optimization method. In addition, we review several comparative studies that evaluate the performance of different minimization techniques in terms of accuracy, efficiency, or complexity. These optimization techniques are widely used in many medical applications, for example, image segmentation, registration, reconstruction, motion tracking, and compressed sensing. We thus give an overview on those applications as well. Copyright © 2015 John Wiley & Sons, Ltd.
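
    As a concrete instance of the continuous methods listed above, the sketch below minimizes a simple Tikhonov-style denoising energy by plain gradient descent. The energy, step size and image are illustrative choices, not examples taken from the survey.

    import numpy as np

    def denoise_gradient_descent(f, lam=2.0, step=0.1, n_iter=200):
        """Minimize E(u) = 0.5*||u - f||^2 + 0.5*lam*||grad u||^2 by gradient
        descent; the gradient is (u - f) - lam * Laplacian(u) (periodic
        boundaries via np.roll, for brevity)."""
        u = f.copy()
        for _ in range(n_iter):
            lap = (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                   np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)
            u -= step * ((u - f) - lam * lap)
        return u

    rng = np.random.default_rng(0)
    clean = np.zeros((64, 64)); clean[16:48, 16:48] = 1.0
    noisy = clean + 0.3 * rng.standard_normal(clean.shape)
    smooth = denoise_gradient_descent(noisy)
    print("RMS error before/after: %.3f / %.3f"
          % (np.std(noisy - clean), np.std(smooth - clean)))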

  5. Economy As A Phenomenon Of Culture: Theoretical And Methodological Analysis

    Directory of Open Access Journals (Sweden)

    S. N. Ivaskovsky

    2017-01-01

    Full Text Available The article redefines economy as a phenomenon of culture, a product of a historically and socially grounded set of values shared by members of a given society. The research shows that culture is not always identical to social utility, because there are multiple examples where archaic, traditionalist, irrational cultural norms hinder social and economic progress and trap nations in poverty and underdevelopment. One of the reasons for the lack of scholarly attention to the cultural dimension of economy is the triumph of positivism in economics. Mathematics has become the dominant language of economic analysis. This leads to the transformation of economics into a sort of «social physics», accompanied by the loss of the original humanitarian nature shared in the works of all the great economists of the past. The author emphasizes the importance of an interdisciplinary approach to economic research and the incorporation of the achievements of the other social disciplines – history, philosophy, sociology and cultural studies – into the subject matter of economic theory. Substantiating the main thesis of the article, the author shows that there is a profound ontological bond between economy and culture, which primarily consists in the fact that these spheres of human relations are aimed at solving the same problem – the competitive selection of the best ways for people to survive and to satisfy the relevant living needs. In order to overcome the difficulties related to the inclusion of culture in the set of analytical tools used in economic theory, the author suggests using the category of «cultural capital», which re-establishes the earlier and more familiar meaning of capital for economists.

  6. Mathematical modelling methodologies in predictive food microbiology: a SWOT analysis.

    Science.gov (United States)

    Ferrer, Jordi; Prats, Clara; López, Daniel; Vives-Rego, Josep

    2009-08-31

    Predictive microbiology is the area of food microbiology that attempts to forecast the quantitative evolution of microbial populations over time. This is achieved to a great extent through models that include the mechanisms governing population dynamics. Traditionally, the models used in predictive microbiology are whole-system continuous models that describe population dynamics by means of equations applied to extensive or averaged variables of the whole system. Many existing models can be classified by specific criteria. We can distinguish between survival and growth models by seeing whether they tackle mortality or cell duplication. We can distinguish between empirical (phenomenological) models, which mathematically describe specific behaviour, and theoretical (mechanistic) models with a biological basis, which search for the underlying mechanisms driving already observed phenomena. We can also distinguish between primary, secondary and tertiary models, by examining their treatment of the effects of external factors and constraints on the microbial community. Recently, the use of spatially explicit Individual-based Models (IbMs) has spread through predictive microbiology, due to the current technological capacity of performing measurements on single individual cells and thanks to the consolidation of computational modelling. Spatially explicit IbMs are bottom-up approaches to microbial communities that build bridges between the description of micro-organisms at the cell level and macroscopic observations at the population level. They provide greater insight into the mesoscale phenomena that link unicellular and population levels. Every model is built in response to a particular question and with different aims. Even so, in this research we conducted a SWOT (Strength, Weaknesses, Opportunities and Threats) analysis of the different approaches (population continuous modelling and Individual-based Modelling), which we hope will be helpful for current and future
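
    The primary/secondary distinction drawn above can be made concrete with a small sketch: a logistic curve as the primary growth model and a Ratkowsky square-root relation as the secondary model linking the maximum specific growth rate to temperature. The parameter values are illustrative assumptions, not fitted data.

    import numpy as np

    def ratkowsky_mu_max(T, b=0.023, T_min=2.0):
        """Secondary model: sqrt(mu_max) = b * (T - T_min) (Ratkowsky-type)."""
        return (b * np.maximum(T - T_min, 0.0)) ** 2

    def logistic_growth(t, N0=1e3, N_max=1e9, mu_max=0.5):
        """Primary model: logistic growth of the population N(t) over time (h)."""
        return N_max / (1.0 + (N_max / N0 - 1.0) * np.exp(-mu_max * t))

    # Couple the two: predicted population after 24 h of storage at 10 degC
    T = 10.0
    mu = ratkowsky_mu_max(T)                              # 1/h
    N = logistic_growth(np.array([0.0, 24.0]), mu_max=mu)
    print(f"mu_max at {T} degC = {mu:.3f} 1/h, N(24 h) = {N[-1]:.2e} CFU/ml")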

  7. Fukushima Daiichi unit 1 uncertainty analysis--Preliminary selection of uncertain parameters and analysis methodology

    Energy Technology Data Exchange (ETDEWEB)

    Cardoni, Jeffrey N.; Kalinich, Donald A.

    2014-02-01

    Sandia National Laboratories (SNL) plans to conduct uncertainty analyses (UA) on the Fukushima Daiichi unit (1F1) plant with the MELCOR code. The model to be used was developed for a previous accident reconstruction investigation jointly sponsored by the US Department of Energy (DOE) and Nuclear Regulatory Commission (NRC). However, that study only examined a handful of various model inputs and boundary conditions, and the predictions yielded only fair agreement with plant data and current release estimates. The goal of this uncertainty study is to perform a focused evaluation of uncertainty in core melt progression behavior and its effect on key figures-of-merit (e.g., hydrogen production, vessel lower head failure, etc.). In preparation for the SNL Fukushima UA work, a scoping study has been completed to identify important core melt progression parameters for the uncertainty analysis. The study also lays out a preliminary UA methodology.

  8. A Systematic Review of Brief Functional Analysis Methodology with Typically Developing Children

    Science.gov (United States)

    Gardner, Andrew W.; Spencer, Trina D.; Boelter, Eric W.; DuBard, Melanie; Jennett, Heather K.

    2012-01-01

    Brief functional analysis (BFA) is an abbreviated assessment methodology derived from traditional extended functional analysis methods. BFAs are often conducted when time constraints in clinics, schools or homes are of concern. While BFAs have been used extensively to identify the function of problem behavior for children with disabilities, their…

  9. Reactor analysis support package (RASP). Volume 7. PWR set-point methodology. Final report

    International Nuclear Information System (INIS)

    Temple, S.M.; Robbins, T.R.

    1986-09-01

    This report provides an overview of the basis and methodology requirements for determining Pressurized Water Reactor (PWR) technical specification-related setpoints and focuses on development of the methodology for a reload core. Additionally, the report documents the implementation and typical methods of analysis used by PWR vendors during the 1970s to develop Protection System Trip Limits (or Limiting Safety System Settings) and Limiting Conditions for Operation. Descriptions of the typical setpoint methodologies are provided for Nuclear Steam Supply Systems as designed and supplied by Babcock and Wilcox, Combustion Engineering, and Westinghouse. The description of the methods of analysis includes a discussion of the computer codes used in the setpoint methodology. Next, the report addresses the treatment of calculational and measurement uncertainties based on the extent to which such information was available for each of the three types of PWR. Finally, the major features of the setpoint methodologies are compared, and the principal effects of each particular methodology on plant operation are summarized for each of the three types of PWR.

  10. Development of the GO-FLOW reliability analysis methodology for nuclear reactor system

    International Nuclear Information System (INIS)

    Matsuoka, Takeshi; Kobayashi, Michiyuki

    1994-01-01

    Probabilistic Safety Assessment (PSA) is important in the safety analysis of technological systems and processes, such as nuclear plants, chemical and petroleum facilities, and aerospace systems. Event trees and fault trees are the basic analytical tools that have been most frequently used for PSAs. Several system analysis methods can be used in addition to, or in support of, the event- and fault-tree analysis. The need for more advanced methods of system reliability analysis has grown with the increased complexity of engineered systems. The Ship Research Institute has been developing a new reliability analysis methodology, GO-FLOW, which is a success-oriented system analysis technique capable of evaluating a large system with complex operational sequences. The research was supported by the special research fund for Nuclear Technology, Science and Technology Agency, from 1989 to 1994. This paper describes the concept of Probabilistic Safety Assessment (PSA), an overview of various system analysis techniques, an overview of the GO-FLOW methodology, the GO-FLOW analysis support system, the procedure for treating a phased mission problem, the common cause failure analysis function, the uncertainty analysis function, the combined common cause failure and uncertainty analysis function, and the system for printing out the results of a GO-FLOW analysis in the form of figures or tables. The above functions are explained by analyzing sample systems, such as a PWR AFWS and a BWR ECCS. In the appendices, the structure of the GO-FLOW analysis programs and the meaning of the main variables defined in the GO-FLOW programs are described. The GO-FLOW methodology is a valuable and useful tool for system reliability analysis and has a wide range of applications. With the development of the total GO-FLOW system, this methodology has become a powerful tool in a living PSA. (author) 54 refs

  11. Design and analysis of sustainable computer mouse using design for disassembly methodology

    Science.gov (United States)

    Roni Sahroni, Taufik; Fitri Sukarman, Ahmad; Agung Mahardini, Karunia

    2017-12-01

    This paper presents the design and analysis of a computer mouse using the Design for Disassembly methodology. The existing computer mouse model consists of a number of unnecessary parts that increase assembly and disassembly time in production. The objective of this project is to design a new computer mouse based on the Design for Disassembly (DFD) methodology. The main methodology of this paper proceeds from sketch generation through concept selection and concept scoring. Based on the design screening, design concept B was selected for further analysis. A new design of the computer mouse using a fastening system is proposed. Furthermore, three materials (ABS, polycarbonate, and high-density PE) were considered to determine the environmental impact category. Sustainability analysis was conducted using the software SolidWorks. As a result, high-density PE gives the lowest amount in the environmental impact category, with a high maximum stress value.

  12. Fire risk analysis for nuclear power plants: Methodological developments and applications

    International Nuclear Information System (INIS)

    Kazarians, M.; Apostolakis, G.; Siv, N.O.

    1985-01-01

    A methodology to quantify the risk from fires in nuclear power plants is described. This methodology combines engineering judgment, statistical evidence, fire phenomenology, and plant system analysis. It can be divided into two major parts: (1) fire scenario identification and quantification, and (2) analysis of the impact on plant safety. This article primarily concentrates on the first part. Statistical analysis of fire occurrence data is used to establish the likelihood of ignition. The temporal behaviors of the two competing phenomena, fire propagation and fire detection and suppression, are studied and their characteristic times are compared. Severity measures are used to further specialize the frequency of the fire scenario. The methodology is applied to a switchgear room of a nuclear power plant
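
    The competition between fire propagation and detection/suppression described above can be sketched as a comparison of two random characteristic times; the scenario frequency is then the ignition frequency multiplied by the severity fraction and the probability that damage precedes suppression. All distributions and rates below are illustrative assumptions, not values from the article.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000

    ignition_freq = 1.0e-2   # fires per room-year, from statistical evidence (illustrative)
    p_severe = 0.3           # fraction of fires severe enough to threaten the targets

    # Competing characteristic times (minutes): fire growth to target damage
    # versus detection and suppression.
    t_damage = rng.lognormal(mean=np.log(20.0), sigma=0.5, size=n)
    t_suppress = rng.exponential(scale=15.0, size=n)

    p_damage_first = np.mean(t_damage < t_suppress)
    scenario_freq = ignition_freq * p_severe * p_damage_first
    print(f"P(damage before suppression) = {p_damage_first:.2f}")
    print(f"fire-induced damage frequency ~ {scenario_freq:.2e} per year")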

  13. A Critique on the Effectiveness of Current Human Reliability Analysis Approach for the Human-Machine Interface Design in Nuclear Power Plants

    International Nuclear Information System (INIS)

    Lee, Yong Hee

    2010-01-01

    Human Reliability Analysis (HRA), performed in conjunction with PSA, has been conducted to evaluate the safety of a system and the validity of a system design. HRA has been believed to provide a quantitative value of human error potential and of the safety level of a design alternative in Nuclear Power Plants (NPPs). However, it has become doubtful whether current HRA is worth conducting to evaluate the human factors of NPP design, since there have been many critiques of the virtue of HRA. Inevitably, the newer the technology becomes, the greater the effort required for new, suitably adapted methods. This paper describes the limitations and the obsolescence of current HRA, especially for the design evaluation of Human-Machine Interfaces (HMI) utilizing recent digital technologies. An alternative approach to the assessment of the human error potential of HMI designs is proposed.

  14. Different methodologies in neutron activation to approach the full analysis of environmental and nutritional samples

    International Nuclear Information System (INIS)

    Freitas, M.C.; Dionisio, I.; Dung, H.M.

    2008-01-01

    Different methodologies of neutron activation analysis (NAA) are now available at the Technological and Nuclear Institute (Sacavem, Portugal), namely Compton suppression, epithermal activation, replicate and cyclic activation, and low energy photon measurement. Prompt gamma activation analysis (PGAA) will be implemented soon. Results by instrumental NAA and PGAA on environmental and nutritional samples are discussed herein, showing that PGAA - carried out at the Institute of Isotope Research (Budapest, Hungary) - provides an effective input for assessing the relevant elements. Sensitivity enhancement in NAA by Compton suppression is also illustrated. Through a judicious combination of methodologies, practically all elements of interest in pollution and nutrition terms can be determined. (author)

  15. The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology: Capabilities and Applications

    Science.gov (United States)

    Evers, Ken H.; Bachert, Robert F.

    1987-01-01

    The IDEAL (Integrated Design and Engineering Analysis Languages) modeling methodology has been formulated and applied over a five-year period. It has proven to be a unique, integrated approach utilizing a top-down, structured technique to define and document the system of interest; a knowledge engineering technique to collect and organize system descriptive information; a rapid prototyping technique to perform preliminary system performance analysis; and a sophisticated simulation technique to perform in-depth system performance analysis.

  16. Analysis Planning Methodology: For Thesis, Joint Applied Project, & MBA Research Reports

    OpenAIRE

    Naegle, Brad R.

    2010-01-01

    Acquisition Research Handbook Series Purpose: This guide provides the graduate student researcher—you—with techniques and advice on creating an effective analysis plan, and it provides methods for focusing the data-collection effort based on that analysis plan. As a side benefit, this analysis planning methodology will help you to properly scope the research effort and will provide you with insight for changes in that effort. The information presented herein was supported b...

  17. Development of Non-LOCA Safety Analysis Methodology with RETRAN-3D and VIPRE-01/K

    International Nuclear Information System (INIS)

    Kim, Yo-Han; Cheong, Ae-Ju; Yang, Chang-Keun

    2004-01-01

    Korea Electric Power Research Institute has launched a project to develop an in-house non-loss-of-coolant-accident analysis methodology to overcome the hardships caused by the narrow analytical scopes of existing methodologies. Prior to the development, some safety analysis codes were reviewed, and RETRAN-3D and VIPRE-01 were chosen as the base codes. The codes have been modified to improve the analytical capabilities required to analyze the nuclear power plants in Korea. The methodologies of the vendors and the Electric Power Research Institute have been reviewed, and some documents of foreign utilities have been used to compensate for the insufficiencies. For the next step, a draft methodology for pressurized water reactors has been developed and modified to apply to Westinghouse-type plants in Korea. To verify the feasibility of the methodology, some events of Yonggwang Units 1 and 2 have been analyzed from the standpoints of reactor coolant system pressure and the departure from nucleate boiling ratio. The results of the analyses show trends similar to those of the Final Safety Analysis Report

  18. Methodology for object-oriented real-time systems analysis and design: Software engineering

    Science.gov (United States)

    Schoeffler, James D.

    1991-01-01

    Successful application of software engineering methodologies requires an integrated analysis and design life-cycle in which the various phases flow smoothly ('seamlessly') from analysis through design to implementation. Furthermore, different analysis methodologies often lead to different structuring of the system, so that the transition from analysis to design may be awkward depending on the design methodology to be used. This is especially important when object-oriented programming is to be used for implementation and the original specification, and perhaps the high-level design, is non-object-oriented. Two approaches to real-time systems analysis which can lead to an object-oriented design are contrasted: (1) modeling the system using structured analysis with real-time extensions, which emphasizes data and control flows, followed by the abstraction of objects where the operations or methods of the objects correspond to processes in the data flow diagrams, and then design in terms of these objects; and (2) modeling the system from the beginning as a set of naturally occurring concurrent entities (objects), each having its own time-behavior defined by a set of states and state-transition rules, and seamlessly transforming the analysis models into high-level design models. A new concept of a 'real-time systems-analysis object' is introduced and becomes the basic building block of a series of seamlessly connected models which progress from the object-oriented real-time systems analysis logical models through the physical architectural models and the high-level design stages. The methodology is appropriate to the overall specification including hardware and software modules. In software modules, the systems analysis objects are transformed into software objects.
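
    A minimal sketch of the 'real-time systems-analysis object' idea, as described above, is an entity whose time behaviour is given by a set of states and state-transition rules. The class and the valve example below are illustrative inventions, not the notation of the paper.

    # A concurrent analysis entity defined by states and state-transition rules.
    class AnalysisObject:
        def __init__(self, name, initial_state, transitions):
            self.name = name
            self.state = initial_state
            self.transitions = transitions      # {(state, event): next_state}

        def handle(self, event):
            # unknown events leave the state unchanged
            self.state = self.transitions.get((self.state, event), self.state)
            return self.state

    valve = AnalysisObject(
        "isolation_valve", "closed",
        transitions={
            ("closed", "open_cmd"): "opening",
            ("opening", "limit_switch"): "open",
            ("open", "close_cmd"): "closing",
            ("closing", "limit_switch"): "closed",
        },
    )

    for ev in ["open_cmd", "limit_switch", "close_cmd", "limit_switch"]:
        print(ev, "->", valve.handle(ev))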

  19. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    International Nuclear Information System (INIS)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-01

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study made progress as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  20. Development of A Standard Method for Human Reliability Analysis of Nuclear Power Plants

    Energy Technology Data Exchange (ETDEWEB)

    Jung, Won Dea; Kang, Dae Il; Kim, Jae Whan

    2005-12-15

    As the demand for risk-informed regulation and applications increases, the quality and reliability of a probabilistic safety assessment (PSA) have become more important. KAERI started a study to standardize the process and the rules of HRA (Human Reliability Analysis), which is known as a major contributor to the uncertainty of PSA. The study made progress as follows: assessing the level of quality of the HRAs in Korea and identifying their weaknesses, determining the requirements for developing a standard HRA method, and developing the process and rules for quantifying human error probability. Since risk-informed applications use the ASME PSA standard to ensure PSA quality, the standard HRA method was developed to meet the ASME HRA requirements at Category II level. The standard method was based on THERP and ASEP HRA, which are widely used for conventional HRA. However, the method focuses on standardizing and specifying the analysis process, quantification rules and criteria to minimize the deviation of the analysis results caused by different analysts. Several HRA experts from different organizations in Korea participated in developing the standard method. Several case studies were undertaken interactively to verify the usability and applicability of the standard method.

  1. Towards a Multimodal Methodology for the Analysis of Translated/Localised Games

    Directory of Open Access Journals (Sweden)

    Bárbara Resende Coelho

    2016-12-01

    Full Text Available Multimedia materials require research methodologies that are able to comprehend all of their assets. Videogames are the epitome of multimedia, joining image, sound, video, animation, graphics and text with the interactivity factor. A methodology to conduct research into the translation and localisation of videogames should be able to analyse all of their assets and features. This paper sets out to develop a research methodology for games and their translations/localisations that goes beyond the collection and analysis of “screenshots” and includes as many of their assets as possible. Using the fully localised version of the game Watchdogs, this paper shows how tools and technologies allow for transcending the mere analysis of linguistic contents within multimedia materials. Using the software ELAN Language Archive to analyse Portuguese-language dubbed and English-language subtitled excerpts from the videogame, it was possible to identify patterns in both linguistic and audio-visual elements, as well as to correlate them.

  2. Applications of a methodology for the analysis of learning trends in nuclear power plants

    International Nuclear Information System (INIS)

    Cho, Hang Youn; Choi, Sung Nam; Yun, Won Yong

    1995-01-01

    A methodology is applied to identify the learning trend related to the safety and availability of U.S. commercial nuclear power plants. The application is intended to aid in reducing the likelihood of human errors. To ensure that the methodology can be easily adapted to various types of classification schemes of operational data, a data bank classified by the Transient Analysis Classification and Evaluation (TRACE) scheme was selected for the methodology. The significance criteria for human-initiated events affecting the systems and for events caused by human deficiencies were used. Clustering analysis was used to identify the learning trend in multi-dimensional histograms. A computer code based on the K-Means algorithm was developed and applied to find the learning period, in which error rates decrease monotonically with plant age.
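
    The clustering step can be sketched as follows: annual human-error event rates are grouped with K-Means, and the years falling in the higher-rate cluster are read as the learning period. The data below are synthetic and the implementation is a plain NumPy sketch, not the code described in the record.

    import numpy as np

    def kmeans_1d(x, k=2, n_iter=50, seed=0):
        """Plain K-Means on a one-dimensional feature (annual event rate)."""
        rng = np.random.default_rng(seed)
        centers = rng.choice(x, size=k, replace=False)
        for _ in range(n_iter):
            labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
            centers = np.array([x[labels == j].mean() if np.any(labels == j)
                                else centers[j] for j in range(k)])
        return labels, centers

    # Synthetic rates: high and falling early in plant life, flat afterwards
    plant_age = np.arange(1, 16)
    rates = np.r_[np.linspace(8.0, 3.0, 6), np.full(9, 2.0)]
    rates = rates + np.random.default_rng(1).normal(0.0, 0.3, rates.size)

    labels, centers = kmeans_1d(rates)
    learning_cluster = int(np.argmax(centers))
    print("learning period (years):", plant_age[labels == learning_cluster])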

  3. Methodology for Quantitative Analysis of Large Liquid Samples with Prompt Gamma Neutron Activation Analysis using Am-Be Source

    International Nuclear Information System (INIS)

    Idiri, Z.; Mazrou, H.; Beddek, S.; Amokrane, A.

    2009-01-01

    An optimized set-up for prompt gamma neutron activation analysis (PGNAA) with an Am-Be source is described and used for the analysis of large liquid samples. A methodology for quantitative analysis is proposed: it consists of normalizing the prompt gamma count rates with thermal neutron flux measurements carried out with a He-3 detector and with gamma attenuation factors calculated using MCNP-5. Both the relative and the absolute methods are considered. This methodology is then applied to the determination of cadmium in industrial phosphoric acid. The same sample is then analyzed by the inductively coupled plasma (ICP) method. Our results are in good agreement with those obtained with the ICP method.

  4. Human Reliability Analysis For Computerized Procedures

    International Nuclear Information System (INIS)

    Boring, Ronald L.; Gertman, David I.; Le Blanc, Katya

    2011-01-01

    This paper provides a characterization of human reliability analysis (HRA) issues for computerized procedures in nuclear power plant control rooms. It is beyond the scope of this paper to propose a new HRA approach or to recommend specific methods or refinements to those methods. Rather, this paper provides a review of HRA as applied to traditional paper-based procedures, followed by a discussion of what specific factors should additionally be considered in HRAs for computerized procedures. Performance shaping factors and failure modes unique to computerized procedures are highlighted. Since there is no definitive guide to HRA for paper-based procedures, this paper also serves to clarify the existing guidance on paper-based procedures before delving into the unique aspects of computerized procedures.

  5. Sensitivity Analysis on LOCCW of Westinghouse typed Reactors Considering WOG2000 RCP Seal Leakage Model

    International Nuclear Information System (INIS)

    Na, Jang-Hwan; Jeon, Ho-Jun; Hwang, Seok-Won

    2015-01-01

    In this paper, we focus on risk insights for Westinghouse-type reactors. We identified that Reactor Coolant Pump (RCP) seal integrity is the most important contributor to Core Damage Frequency (CDF). As we reflected the latest technical report, WCAP-15603 (Rev. 1-A), 'WOG2000 RCP Seal Leakage Model for Westinghouse PWRs', instead of the old version, RCP seal integrity became even more important for Westinghouse-type reactors. After the Fukushima accident, Korea Hydro and Nuclear Power (KHNP) decided to develop Low Power and Shutdown (LPSD) Probabilistic Safety Assessment (PSA) models and to upgrade the full-power PSA models of all operating Nuclear Power Plants (NPPs). In upgrading the full-power PSA models, we have tried to standardize the methodology of CCF (Common Cause Failure) and HRA (Human Reliability Analysis), which are the most influential factors in the risk measures of NPPs. We have also reviewed and reflected the latest operating experience, reliability data sources and technical methods to improve the quality of the PSA models. KHNP operates various types of reactors: Optimized Power Reactor (OPR) 1000, CANDU, Framatome and Westinghouse. One of the most challenging missions is therefore to keep the balance of risk contributors across all types of reactors. This paper presents the method of the new RCP seal leakage model and the sensitivity analysis results from applying the detailed method to the PSA models of the Westinghouse-type reference reactors. To perform the sensitivity analysis on LOCCW for the reference Westinghouse-type reactors, we reviewed the WOG2000 RCP seal leakage model and developed a detailed event tree of LOCCW considering all scenarios of RCP seal failure. We also performed HRA based on T/H analysis using the leakage rates for each scenario. We found that HRA was a sensitive contributor to CDF, and that the RCP seal failure scenario with a 182 gpm leakage rate was the most important scenario.

  6. Sensitivity Analysis on LOCCW of Westinghouse typed Reactors Considering WOG2000 RCP Seal Leakage Model

    Energy Technology Data Exchange (ETDEWEB)

    Na, Jang-Hwan; Jeon, Ho-Jun; Hwang, Seok-Won [KHNP Central Research Institute, Daejeon (Korea, Republic of)

    2015-10-15

    In this paper, we focus on risk insights for Westinghouse-type reactors. We identified that Reactor Coolant Pump (RCP) seal integrity is the most important contributor to Core Damage Frequency (CDF). As we reflected the latest technical report, WCAP-15603 (Rev. 1-A), 'WOG2000 RCP Seal Leakage Model for Westinghouse PWRs', instead of the old version, RCP seal integrity became even more important for Westinghouse-type reactors. After the Fukushima accident, Korea Hydro and Nuclear Power (KHNP) decided to develop Low Power and Shutdown (LPSD) Probabilistic Safety Assessment (PSA) models and to upgrade the full-power PSA models of all operating Nuclear Power Plants (NPPs). In upgrading the full-power PSA models, we have tried to standardize the methodology of CCF (Common Cause Failure) and HRA (Human Reliability Analysis), which are the most influential factors in the risk measures of NPPs. We have also reviewed and reflected the latest operating experience, reliability data sources and technical methods to improve the quality of the PSA models. KHNP operates various types of reactors: Optimized Power Reactor (OPR) 1000, CANDU, Framatome and Westinghouse. One of the most challenging missions is therefore to keep the balance of risk contributors across all types of reactors. This paper presents the method of the new RCP seal leakage model and the sensitivity analysis results from applying the detailed method to the PSA models of the Westinghouse-type reference reactors. To perform the sensitivity analysis on LOCCW for the reference Westinghouse-type reactors, we reviewed the WOG2000 RCP seal leakage model and developed a detailed event tree of LOCCW considering all scenarios of RCP seal failure. We also performed HRA based on T/H analysis using the leakage rates for each scenario. We found that HRA was a sensitive contributor to CDF, and that the RCP seal failure scenario with a 182 gpm leakage rate was the most important scenario.

  7. Methodologies for uncertainty analysis in the level 2 PSA and their implementation procedures

    International Nuclear Information System (INIS)

    Ahn, Kwang Il; Yang, Joon Eun; Kim, Dong Ha

    2002-04-01

    The main purpose of this report is to present standardized methodologies for uncertainty analysis in the Level 2 Probabilistic Safety Assessment (PSA) and their implementation procedures, based on the results of a critical review of the existing methodologies employed for the analysis of uncertainties in the Level 2 PSA, especially in the Accident Progression Event Tree (APET). The uncertainties employed in the Level 2 PSA are quantitative expressions of the overall knowledge of the analysts and experts participating in the probabilistic quantification of phenomenological accident progressions ranging from core melt to containment failure; their numerical values are directly related to the degree of confidence that the analyst has that a given phenomenological event or accident process will or will not occur, i.e., the analyst's subjective probability of occurrence. The results obtained from the Level 2 PSA uncertainty analysis become an essential contributor to the plant risk, in addition to the Level 1 and Level 3 PSA uncertainties. The uncertainty analysis methodologies and their implementation procedures presented in this report were prepared based on the following criterion: 'the uncertainty quantification process must be logical, scrutable, complete, consistent and in an appropriate level of detail, as mandated by the Level 2 PSA objectives'. For the aforementioned purpose, this report deals mainly with (1) a summary of general and Level 2 PSA-specific uncertainty analysis methodologies, (2) the selection of phenomenological branch events for uncertainty analysis in the APET, the methodology for quantification of APET uncertainty inputs and its implementation procedure, (3) the statistical propagation of uncertainty inputs through the APET and its implementation procedure, and (4) a formal procedure for quantification of APET uncertainties and source term categories (STCs) through the Level 2 PSA quantification codes.
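
    Item (3), the statistical propagation of uncertainty inputs through the APET, can be illustrated with a very small Monte Carlo sketch: each uncertain branch probability is sampled from a distribution expressing the analysts' state of knowledge and pushed through the tree to an end-state probability. The two-question tree and the beta distributions below are illustrative, not an actual APET.

    import numpy as np

    rng = np.random.default_rng(7)
    n = 20_000   # Monte Carlo samples over the uncertain branch probabilities

    # Two illustrative phenomenological questions:
    #   Q1: in-vessel recovery fails;  Q2: early containment failure given Q1.
    p_q1 = rng.beta(2, 8, size=n)    # analyst uncertainty, mean ~0.2
    p_q2 = rng.beta(1, 9, size=n)    # analyst uncertainty, mean ~0.1

    # Propagate through the tree to the "early release" end state
    p_early_release = p_q1 * p_q2

    print("mean = %.3e" % p_early_release.mean())
    print("5th / 95th percentiles = %.3e / %.3e"
          % tuple(np.percentile(p_early_release, [5, 95])))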

  8. A Methodology for the Analysis of Memory Response to Radiation through Bitmap Superposition and Slicing

    CERN Document Server

    Bosser, A.; Tsiligiannis, G.; Ferraro, R.; Frost, C.; Javanainen, A.; Puchner, H.; Rossi, M.; Saigne, F.; Virtanen, A.; Wrobel, F.; Zadeh, A.; Dilillo, L.

    2015-01-01

    A methodology is proposed for the statistical analysis of memory radiation test data, with the aim of identifying trends in the single-event upset (SEU) distribution. The treated case study is a 65 nm SRAM irradiated with neutrons, protons and heavy ions.

  9. A framework for characterizing usability requirements elicitation and analysis methodologies (UREAM)

    NARCIS (Netherlands)

    Trienekens, J.J.M.; Kusters, R.J.; Mannaert, H.

    2012-01-01

    Dedicated methodologies for the elicitation and analysis of usability requirements have been proposed in literature, usually developed by usability experts. The usability of these approaches by non-expert software engineers is not obvious. In this paper, the objective is to support developers and

  10. Cost and Benefit Analysis of an Automated Nursing Administration System: A Methodology*

    OpenAIRE

    Rieder, Karen A.

    1984-01-01

    In order for a nursing service administration to select the appropriate automated system for its requirements, a systematic process of evaluating alternative approaches must be completed. This paper describes a methodology for evaluating and comparing alternative automated systems based upon an economic analysis which includes two major categories of criteria: costs and benefits.

  11. Beyond Needs Analysis: Soft Systems Methodology for Meaningful Collaboration in EAP Course Design

    Science.gov (United States)

    Tajino, Akira; James, Robert; Kijima, Kyoichi

    2005-01-01

    Designing an EAP course requires collaboration among various concerned stakeholders, including students, subject teachers, institutional administrators and EAP teachers themselves. While needs analysis is often considered fundamental to EAP, alternative research methodologies may be required to facilitate meaningful collaboration between these…

  12. Combining soft system methodology and pareto analysis in safety management performance assessment : an aviation case

    NARCIS (Netherlands)

    Karanikas, Nektarios

    2016-01-01

    Although reengineering is strategically advantageous for organisations in order to keep functional and sustainable, safety must remain a priority and respective efforts need to be maintained. This paper suggests the combination of soft system methodology (SSM) and Pareto analysis on the scope of
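
    The Pareto step of such a combined approach can be sketched simply: rank the finding categories produced by the SSM analysis by frequency and keep the "vital few" that account for roughly 80 % of occurrences. The categories and counts below are invented for illustration.

    # Minimal Pareto-analysis sketch over safety-audit finding categories.
    findings = {
        "procedure not followed": 42,
        "documentation gaps": 31,
        "training lapses": 14,
        "tooling issues": 8,
        "communication": 5,
    }

    total = sum(findings.values())
    cumulative = 0.0
    vital_few = []
    for category, count in sorted(findings.items(), key=lambda kv: kv[1], reverse=True):
        cumulative += count / total
        vital_few.append((category, round(cumulative, 2)))
        if cumulative >= 0.8:
            break

    print(vital_few)   # the categories to prioritise in the improvement effort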

  13. Methodology for reactor core physics analysis - part 2; Metodologia de analise fisica do nucleo - etapa 2

    Energy Technology Data Exchange (ETDEWEB)

    Ponzoni Filho, P; Fernandes, V B; Lima Bezerra, J de; Santos, T I.C.

    1992-12-01

    The computer codes used for reactor core physics analysis are described. The modifications introduced in the public codes and the technical basis for the codes developed by the FURNAS utility are justified. An evaluation of the impact of these modifications on the parameter involved in qualifying the methodology is included. (F.E.). 5 ref, 7 figs, 5 tabs.

  14. Understanding information exchange during disaster response: Methodological insights from infocentric analysis

    Science.gov (United States)

    Toddi A. Steelman; Branda Nowell; Deena. Bayoumi; Sarah. McCaffrey

    2014-01-01

    We leverage economic theory, network theory, and social network analytical techniques to bring greater conceptual and methodological rigor to understand how information is exchanged during disasters. We ask, "How can information relationships be evaluated more systematically during a disaster response?" "Infocentric analysis"—a term and...

  15. Methodology for LOCA analysis and its qualification procedures for PWR reload licensing

    International Nuclear Information System (INIS)

    Serrano, M.A.B.

    1986-01-01

    The methodology for LOCA analysis developed by FURNAS and its qualification procedure for PWR reload licensing are presented. Digital computer codes developed by the NRC and published collectively as the WREM package were modified to obtain versions that comply with each requirement of the Brazilian licensing criteria. This methodology is applied to the Angra-1 base case to conclude the qualification process. (Author) [pt

  16. Human error probability quantification using fuzzy methodology in nuclear plants; Aplicacao da metodologia fuzzy na quantificacao da probabilidade de erro humano em instalacoes nucleares

    Energy Technology Data Exchange (ETDEWEB)

    Nascimento, Claudio Souza do

    2010-07-01

    This work obtains Human Error Probability (HEP) estimates for operators' actions in response to hypothetical emergency situations at the IEA-R1 research reactor at IPEN. An evaluation of Performance Shaping Factors (PSF) was also carried out in order to classify them according to their level of influence on the operators' actions and to determine the actual state of these PSFs in the plant. Both the HEP estimation and the PSF evaluation were based on specialist judgment elicited through interviews and questionnaires; the specialist group was composed of selected IEA-R1 operators. The specialists' knowledge was represented as linguistic variables, and group evaluation values were obtained through fuzzy logic and fuzzy set theory. The HEP values obtained show good agreement with data published in the literature, corroborating the proposed methodology as a good alternative for use in Human Reliability Analysis (HRA). (author)
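
    The fuzzy part of such an approach can be sketched as follows: expert opinions expressed as linguistic terms are represented by triangular membership functions over log10(HEP), aggregated, and defuzzified by the centroid method into a single HEP estimate. The terms, supports and weights below are illustrative assumptions, not those of the thesis.

    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function with support [a, c] and peak at b."""
        return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

    x = np.linspace(-5.0, 0.0, 501)            # log10(HEP) axis
    terms = {
        "very low": tri(x, -5.0, -4.0, -3.0),
        "low":      tri(x, -4.0, -3.0, -2.0),
        "high":     tri(x, -3.0, -2.0, -1.0),
    }
    weights = {"very low": 0.2, "low": 0.6, "high": 0.2}   # aggregated expert opinion

    # Clip each term at its weight, aggregate by max, defuzzify by the centroid
    aggregated = np.max([np.minimum(mu, weights[t]) for t, mu in terms.items()], axis=0)
    log_hep = np.sum(x * aggregated) / np.sum(aggregated)
    print(f"defuzzified HEP ~ {10 ** log_hep:.1e}")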

  17. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data.

    Science.gov (United States)

    Tekwe, Carmen D; Carroll, Raymond J; Dabney, Alan R

    2012-08-01

    Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. ctekwe@stat.tamu.edu.

  18. Application of survival analysis methodology to the quantitative analysis of LC-MS proteomics data

    KAUST Repository

    Tekwe, C. D.

    2012-05-24

    MOTIVATION: Protein abundance in quantitative proteomics is often based on observed spectral features derived from liquid chromatography mass spectrometry (LC-MS) or LC-MS/MS experiments. Peak intensities are largely non-normal in distribution. Furthermore, LC-MS-based proteomics data frequently have large proportions of missing peak intensities due to censoring mechanisms on low-abundance spectral features. Recognizing that the observed peak intensities detected with the LC-MS method are all positive, skewed and often left-censored, we propose using survival methodology to carry out differential expression analysis of proteins. Various standard statistical techniques including non-parametric tests such as the Kolmogorov-Smirnov and Wilcoxon-Mann-Whitney rank sum tests, and the parametric survival model and accelerated failure time-model with log-normal, log-logistic and Weibull distributions were used to detect any differentially expressed proteins. The statistical operating characteristics of each method are explored using both real and simulated datasets. RESULTS: Survival methods generally have greater statistical power than standard differential expression methods when the proportion of missing protein level data is 5% or more. In particular, the AFT models we consider consistently achieve greater statistical power than standard testing procedures, with the discrepancy widening with increasing missingness in the proportions. AVAILABILITY: The testing procedures discussed in this article can all be performed using readily available software such as R. The R codes are provided as supplemental materials. CONTACT: ctekwe@stat.tamu.edu.
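
    The censoring idea underlying the survival treatment can be sketched with a toy example: peak intensities are modelled as log-normal, and values that fall below a detection limit enter the likelihood through the cumulative distribution function rather than the density (left-censoring). This is a minimal illustration in Python with synthetic data, not the authors' analysis code.

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import norm

    rng = np.random.default_rng(3)

    # Synthetic log-normal peak intensities with a detection limit
    true_mu, true_sigma, limit = 2.0, 0.8, np.exp(1.2)
    y = rng.lognormal(true_mu, true_sigma, size=200)
    observed = y >= limit                  # intensities below the limit are censored

    def neg_log_likelihood(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)
        ll_obs = norm.logpdf(np.log(y[observed]), mu, sigma) - np.log(y[observed])
        ll_cens = norm.logcdf(np.log(limit), mu, sigma)   # P(intensity < limit)
        return -(ll_obs.sum() + (~observed).sum() * ll_cens)

    fit = minimize(neg_log_likelihood, x0=np.array([0.0, 0.0]), method="Nelder-Mead")
    mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
    print(f"estimated mu, sigma = {mu_hat:.2f}, {sigma_hat:.2f} (true: 2.00, 0.80)")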

  19. Knowledge-base for the new human reliability analysis method, A Technique for Human Error Analysis (ATHEANA)

    International Nuclear Information System (INIS)

    Cooper, S.E.; Wreathall, J.; Thompson, C.M.; Drouin, M.; Bley, D.C.

    1996-01-01

    This paper describes the knowledge base for the application of the new human reliability analysis (HRA) method, ''A Technique for Human Error Analysis'' (ATHEANA). Since application of ATHEANA requires the identification of previously unmodeled human failure events, especially errors of commission, and associated error-forcing contexts (i.e., combinations of plant conditions and performance shaping factors), this knowledge base is an essential aid for the HRA analyst.

  20. Systemic design methodologies for electrical energy systems analysis, synthesis and management

    CERN Document Server

    Roboam, Xavier

    2012-01-01

    This book proposes systemic design methodologies applied to electrical energy systems, in particular analysis and system management, modeling and sizing tools. It includes 8 chapters: after an introduction to the systemic approach (history, basics & fundamental issues, index terms) for designing energy systems, this book presents two different graphical formalisms especially dedicated to multidisciplinary devices modeling, synthesis and analysis: Bond Graph and COG/EMR. Other systemic analysis approaches for quality and stability of systems, as well as for safety and robustness analysis tools are also proposed. One chapter is dedicated to energy management and another is focused on Monte Carlo algorithms for electrical systems and networks sizing. The aim of this book is to summarize design methodologies based in particular on a systemic viewpoint, by considering the system as a whole. These methods and tools are proposed by the most important French research laboratories, which have many scientific partn...

  1. Methodology for diagnosing of skin cancer on images of dermatologic spots by spectral analysis.

    Science.gov (United States)

    Guerra-Rosas, Esperanza; Álvarez-Borrego, Josué

    2015-10-01

    In this paper a new methodology for the diagnosing of skin cancer on images of dermatologic spots using image processing is presented. Currently skin cancer is one of the most frequent diseases in humans. This methodology is based on Fourier spectral analysis by using filters such as the classic, inverse and k-law nonlinear. The sample images were obtained by a medical specialist and a new spectral technique is developed to obtain a quantitative measurement of the complex pattern found in cancerous skin spots. Finally a spectral index is calculated to obtain a range of spectral indices defined for skin cancer. Our results show a confidence level of 95.4%.
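
    A k-law nonlinear filter of the kind mentioned above can be sketched as raising the magnitude of the Fourier spectrum to a power k while preserving the phase, which emphasizes the weaker high-frequency structure of an irregular spot. The scalar index computed below is an illustrative stand-in, not the spectral index defined by the authors.

    import numpy as np

    def k_law_spectrum(image, k=0.3):
        """Raise the spectral magnitude to the power k, keep the phase."""
        F = np.fft.fft2(image)
        return np.abs(F) ** k * np.exp(1j * np.angle(F))

    def spectral_index(image, k=0.3):
        """Illustrative index: share of k-law spectral energy away from the
        zero-frequency term (higher for more complex patterns)."""
        S = np.abs(np.fft.fftshift(k_law_spectrum(image, k))) ** 2
        centre = tuple(n // 2 for n in S.shape)
        return 1.0 - S[centre] / S.sum()

    rng = np.random.default_rng(0)
    smooth_spot = np.outer(np.hanning(128), np.hanning(128))          # regular pattern
    complex_spot = smooth_spot * (1 + 0.5 * rng.standard_normal((128, 128)))
    print(spectral_index(smooth_spot), spectral_index(complex_spot))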

  2. Methodology for national risk analysis and prioritization of toxic industrial chemicals.

    Science.gov (United States)

    Taxell, Piia; Engström, Kerstin; Tuovila, Juha; Söderström, Martin; Kiljunen, Harri; Vanninen, Paula; Santonen, Tiina

    2013-01-01

    The identification of chemicals that pose the greatest threat to human health from incidental releases is a cornerstone in public health preparedness for chemical threats. The present study developed and applied a methodology for the risk analysis and prioritization of industrial chemicals to identify the most significant chemicals that pose a threat to public health in Finland. The prioritization criteria included acute and chronic health hazards, physicochemical and environmental hazards, national production and use quantities, the physicochemical properties of the substances, and the history of substance-related incidents. The presented methodology enabled a systematic review and prioritization of industrial chemicals for the purpose of national public health preparedness for chemical incidents.

  3. Application of a new methodology to the multicycle analysis for the Laguna Verde NPP in Mexico

    International Nuclear Information System (INIS)

    Cortes C, Carlos C.

    1997-01-01

    This paper describes the improvements made to the physical and economic methodologies for multicycle analysis of the Boiling Water Reactors of the Laguna Verde NPP in Mexico, based on commercial codes and in-house developed computational tools. With these changes in our methodology, three feasible scenarios are generated for the operation of Laguna Verde Nuclear Power Plant Unit 2 with cycle lengths of 12, 18 and 24 months. The physical and economic results obtained are shown. Furthermore, the effect of replacement power is included in the economic evaluation. (author). 11 refs., 3 figs., 7 tabs.

  4. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of the strategic financial analysis of capital. The necessity of the strategic financial analysis of capital as a methodological basis for the study of strategies is proved for modern conditions characterized by a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, and the sources of incoming and outgoing information) are justified within the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. A system of indicators is substantiated, based on the needs of strategic financial analysis. A classification of the factors determining the size and structure of a company's capital is grounded. The economic nature of the capital of the company is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated through savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners' prosperity and to achieve a social effect.

  5. An integrated approach to human reliability analysis -- decision analytic dynamic reliability model

    International Nuclear Information System (INIS)

    Holmberg, J.; Hukki, K.; Norros, L.; Pulkkinen, U.; Pyy, P.

    1999-01-01

    The reliability of human operators in process control is sensitive to the context. In many contemporary human reliability analysis (HRA) methods, this is not sufficiently taken into account. The aim of this article is to argue that integration between probabilistic and psychological approaches to human reliability should be attempted. This is achieved, first, by adopting methods that adequately reflect the essential features of the process control activity and, secondly, by carrying out an interactive HRA process. Description of the activity context, probabilistic modeling, and psychological analysis form an iterative interdisciplinary sequence of analysis in which the results of one sub-task may be input to another. The analysis of the context is carried out first with the help of a common set of conceptual tools. The resulting descriptions of the context support the probabilistic modeling, through which new results regarding the probabilistic dynamics can be achieved. These can be incorporated in the context descriptions used as reference in the psychological analysis of actual performance. The results also provide new knowledge of the constraints of the activity, by providing information on the premises of the operator's actions. Finally, the stochastic marked point process model provides a tool by which psychological methodology may be interpreted and utilized for reliability analysis.

  6. An efficient methodology for the analysis of primary frequency control of electric power systems

    Energy Technology Data Exchange (ETDEWEB)

    Popovic, D.P. [Nikola Tesla Institute, Belgrade (Yugoslavia); Mijailovic, S.V. [Electricity Coordinating Center, Belgrade (Yugoslavia)

    2000-06-01

    The paper presents an efficient methodology for the analysis of primary frequency control of electric power systems. This methodology continuously follows the electromechanical transient processes, with durations of up to 30 s, that occur after characteristic disturbances. It covers the period of short-term dynamic processes appearing immediately after the disturbance, in which the dynamics of the individual synchronous machines is dominant, as well as the period with uniform movement of all generators and restoration of their voltages. The characteristics of the developed methodology were determined using the example of the real electric power interconnection formed by the electric power systems of Yugoslavia, part of the Republic of Srpska, Romania, Bulgaria, the former Yugoslav Republic of Macedonia, Greece and Albania (the second UCPTE synchronous zone). (author)
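
    The short-term behaviour that such a methodology follows can be illustrated with an aggregated one-machine sketch of primary (droop) control after a load step: a governor with droop R acting through a first-order lag and a swing equation with inertia H and load damping D. The parameters are illustrative assumptions, not the authors' model of the interconnection.

    H, D, R, T_g = 5.0, 1.0, 0.05, 0.5   # inertia (s), load damping, droop, governor lag (s)
    dP_load = 0.10                       # 10 % load increase at t = 0 (per unit)

    dt, t_end = 0.01, 30.0
    df, dPm = 0.0, 0.0                   # frequency deviation and mechanical power change (pu)
    trace = []
    for _ in range(int(t_end / dt)):
        dPm += dt * (-df / R - dPm) / T_g              # governor with droop, first-order lag
        df += dt * (dPm - dP_load - D * df) / (2 * H)  # swing equation
        trace.append(df)

    print("frequency nadir   : %.4f pu" % min(trace))
    print("quasi-steady dev. : %.4f pu (expected ~ -dP/(1/R + D) = %.4f)"
          % (trace[-1], -dP_load / (1 / R + D)))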

  7. Methodology for Risk Analysis of Dam Gates and Associated Operating Equipment Using Fault Tree Analysis

    National Research Council Canada - National Science Library

    Patev, Robert C; Putcha, Chandra; Foltz, Stuart D

    2005-01-01

    .... This report summarizes research on methodologies to assist in quantifying risks related to dam gates and associated operating equipment, and how those risks relate to overall spillway failure risk...

  8. Best-estimate methodology for analysis of anticipated transients without scram in pressurized water reactors

    International Nuclear Information System (INIS)

    Rebollo, L.

    1993-01-01

    Union Fenosa, a utility company in Spain, has performed research on pressurized water reactor (PWR) safety with respect to the development of a best-estimate methodology for the analysis of anticipated transients without scram (ATWS), i.e., those anticipated transients for which failure of the reactor protection system is postulated. A scientific and technical approach is adopted with respect to the ATWS phenomenon as it affects a PWR, specifically the Zorita nuclear power plant, a single-loop Westinghouse-designed PWR in Spain. In this respect, an ATWS sequence analysis methodology based on published codes that is generically applicable to any PWR is proposed, which covers all the anticipated phenomena and defines the applicable acceptance criteria. The areas contemplated are cell neutron analysis, core thermal hydraulics, and plant dynamics, which are developed, qualified, and validated by comparison with reference calculations and measurements obtained from integral or separate-effects tests

  9. Methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006.

    Science.gov (United States)

    Rodríguez-Ramírez, Sonia; Mundo-Rosas, Verónica; Jiménez-Aguilar, Alejandra; Shamah-Levy, Teresa

    2009-01-01

    To describe the methodology for the analysis of dietary data from the Mexican National Health and Nutrition Survey 2006 (ENSANUT 2006) carried out in Mexico. Dietary data from the population who participated in the ENSANUT 2006 were collected through a 7-day food-frequency questionnaire. Energy and nutrient intake of each food consumed and adequacy percentage by day were also estimated. Intakes and adequacy percentages > 5 SDs from the energy and nutrient general distribution and observations with energy adequacy percentages < 25% were excluded from the analysis. Valid dietary data were obtained from 3552 children aged 1 to 4 years, 8716 children aged 5 to 11 years, 8442 adolescents, 15951 adults, and 3357 older adults. It is important to detail the methodology for the analysis of dietary data to standardize data cleaning criteria and to be able to compare the results of different studies.
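
    The exclusion rules described above (dropping intakes or adequacy percentages more than 5 SDs above the overall distribution, and records with energy adequacy below 25%) translate directly into a data-filtering step. The Python sketch below is only an illustration of those rules under assumed column names (energy_adequacy_pct and a list of nutrient columns); it is not the ENSANUT 2006 processing code.

      import pandas as pd

      def clean_dietary_data(df, nutrient_cols, adequacy_col="energy_adequacy_pct"):
          """Apply the two exclusion rules described for the dietary data:
          drop records more than 5 SD above the distribution of any intake column,
          and records whose energy adequacy percentage is below 25%."""
          mask = df[adequacy_col] >= 25.0                   # exclude energy adequacy < 25%
          for col in list(nutrient_cols) + [adequacy_col]:
              limit = df[col].mean() + 5.0 * df[col].std()  # 5 SD cut-off
              mask &= df[col] <= limit
          return df[mask]

      # Hypothetical usage:
      # clean = clean_dietary_data(raw, ["energy_kcal", "protein_g", "iron_mg"])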

  10. Analysis methodology for the post-trip return to power steam line break event

    Energy Technology Data Exchange (ETDEWEB)

    Lee, Chul Shin; Kim, Chul Woo; You, Hyung Keun [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    1996-06-01

    An analysis of Steam Line Break (SLB) events which result in a Return-to-Power (RTP) condition after reactor trip was performed for postulated Yonggwang Nuclear Power Plant Unit 3 cycle 8 conditions. The analysis methodology for post-trip RTP SLB is quite different from that of non-RTP SLB and is more difficult. Therefore, it is necessary to develop a methodology to analyze the response of the NSSS parameters to post-trip RTP SLB events and the fuel performance after the total reactivity exceeds criticality. In this analysis, the cases with and without offsite power were simulated crediting the 3-D reactivity feedback effect due to a local heatup in the vicinity of the stuck CEA, and compared with the cases without 3-D reactivity feedback with respect to post-trip fuel performance, Departure from Nucleate Boiling Ratio (DNBR) and Linear Heat Generation Rate (LHGR). 36 tabs., 32 figs., 11 refs. (Author)

  11. Accidental safety analysis methodology development in decommission of the nuclear facility

    Energy Technology Data Exchange (ETDEWEB)

    Park, G. H.; Hwang, J. H.; Jae, M. S.; Seong, J. H.; Shin, S. H.; Cheong, S. J.; Pae, J. H.; Ang, G. R.; Lee, J. U. [Seoul National Univ., Seoul (Korea, Republic of)

    2002-03-15

    Decontamination and Decommissioning (D and D) of a nuclear reactor costs about 20% of the construction expense, and the production of nuclear wastes during decommissioning raises environmental issues. Decommissioning of nuclear reactors in Korea is just beginning, and clear standards and regulations for decommissioning are lacking. The accident safety analysis for decommissioning of nuclear facilities presented in this work can provide a solid ground for such standards and regulations. For the source term analysis of the Kori-1 reactor vessel, an MCNP/ORIGEN calculation methodology was applied. The activity of each important nuclide in the vessel was estimated at a time after 2008, the year the Kori-1 plant is supposed to be decommissioned. In addition, a methodology for risk assessment in decommissioning was developed.

  12. More Than Just a Discursive Practice? Conceptual Principles and Methodological Aspects of Dispositif Analysis

    Directory of Open Access Journals (Sweden)

    Andrea D. Bührmann

    2007-05-01

    Full Text Available This article gives an introduction to the conceptual and practical field of dispositif analysis, a field that is of great importance but that is as yet underdeveloped. In order to provide this introduction, we first explain the terms discourse and dispositif. Then we examine the conceptual instruments and methodological procedures of dispositif analysis. In this way, we define the relations between discourse and (a) non-discursive practices, (b) subjectification, (c) everyday orders of knowledge and (d) institutional practices like societal changes as central issues of dispositif analysis. Furthermore, we point out the methodological possibilities and limitations of dispositif analysis. We demonstrate these possibilities and limitations with some practical examples. In general, this article aims to provide an extension of the perspectives of discourse theory and research by stressing the relations between normative orders of knowledge, their effects on interactions, and the individual self-reflections connected with them. URN: urn:nbn:de:0114-fqs0702281

  13. THEORETICAL AND METHODOLOGICAL PRINCIPLES OF THE STRATEGIC FINANCIAL ANALYSIS OF CAPITAL

    Directory of Open Access Journals (Sweden)

    Olha KHUDYK

    2016-07-01

    Full Text Available The article is devoted to the theoretical and methodological principles of strategic financial analysis of capital. The necessity of strategic financial analysis of capital as a methodological basis for studying strategies is proved under modern conditions of a high level of dynamism, uncertainty and risk. The methodological elements of the strategic financial analysis of capital (the object of investigation, the indicators, the factors, the methods of study, the subjects of analysis, the sources of incoming and outgoing information) are justified in the system of financial management, allowing its theoretical foundations to be improved. It is proved that the strategic financial analysis of capital is a continuous process, carried out in an appropriate sequence at each stage of capital circulation. The system of indexes is substantiated, based on the needs of the strategic financial analysis. The classification of factors determining the size and structure of the company’s capital is grounded. The economic nature of the company’s capital is clarified. We consider that capital is a stock of economic resources in the form of cash, tangible and intangible assets accumulated by savings, which is used by its owner as a factor of production and an investment resource in the economic process in order to obtain profit, to ensure the growth of the owners’ prosperity and to achieve a social effect.

  14. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Science.gov (United States)

    Guardia, Gabriela D A; Pires, Luís Ferreira; Vêncio, Ricardo Z N; Malmegrim, Kelen C R; de Farias, Cléver R G

    2015-01-01

    Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNASeq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.

  15. A Methodology for the Development of RESTful Semantic Web Services for Gene Expression Analysis.

    Directory of Open Access Journals (Sweden)

    Gabriela D A Guardia

    Full Text Available Gene expression studies are generally performed through multi-step analysis processes, which require the integrated use of a number of analysis tools. In order to facilitate tool/data integration, an increasing number of analysis tools have been developed as or adapted to semantic web services. In recent years, some approaches have been defined for the development and semantic annotation of web services created from legacy software tools, but these approaches still present many limitations. In addition, to the best of our knowledge, no suitable approach has been defined for the functional genomics domain. Therefore, this paper aims at defining an integrated methodology for the implementation of RESTful semantic web services created from gene expression analysis tools and the semantic annotation of such services. We have applied our methodology to the development of a number of services to support the analysis of different types of gene expression data, including microarray and RNA-Seq. All developed services are publicly available in the Gene Expression Analysis Services (GEAS) Repository at http://dcm.ffclrp.usp.br/lssb/geas. Additionally, we have used a number of the developed services to create different integrated analysis scenarios to reproduce parts of two gene expression studies documented in the literature. The first study involves the analysis of one-color microarray data obtained from multiple sclerosis patients and healthy donors. The second study comprises the analysis of RNA-Seq data obtained from melanoma cells to investigate the role of the remodeller BRG1 in the proliferation and morphology of these cells. Our methodology provides concrete guidelines and technical details in order to facilitate the systematic development of semantic web services. Moreover, it encourages the development and reuse of these services for the creation of semantically integrated solutions for gene expression analysis.
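
    As a purely illustrative sketch of the kind of service the two records above describe, the snippet below wraps a trivial normalization step behind a RESTful, JSON-returning endpoint using Flask. The route, payload format and the normalization itself are assumptions for illustration; they are not part of the GEAS implementation.

      from flask import Flask, jsonify, request

      app = Flask(__name__)

      @app.route("/services/expression/normalize", methods=["POST"])
      def normalize_expression():
          """Hypothetical RESTful wrapper around a gene expression processing step.
          Expects JSON {"values": [...]} and returns the values centred to zero mean."""
          values = request.get_json(force=True).get("values", [])
          if not values:
              return jsonify({"error": "no expression values supplied"}), 400
          mean = sum(values) / len(values)
          return jsonify({"normalized": [v - mean for v in values]})

      if __name__ == "__main__":
          app.run(port=5000)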

  16. Summary of the Supplemental Model Reports Supporting the Disposal Criticality Analysis Methodology Topical Report

    International Nuclear Information System (INIS)

    Brownson, D. A.

    2002-01-01

    The Department of Energy (DOE) Office of Civilian Radioactive Waste Management (OCRWM) has committed to a series of model reports documenting the methodology to be utilized in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000). These model reports detail and provide validation of the methodology to be utilized for criticality analyses related to: (1) Waste form/waste package degradation; (2) Waste package isotopic inventory; (3) Criticality potential of degraded waste form/waste package configurations (effective neutron multiplication factor); (4) Probability of criticality (for each potential critical configuration as well as total event); and (5) Criticality consequences. The purpose of this summary report is to provide a status of the model reports and a schedule for their completion. This report also provides information relative to the model report content and validation. The model reports and their revisions are being generated as a result of: (1) Commitments made in the Disposal Criticality Analysis Methodology Topical Report (YMP 2000); (2) Open Items from the Safety Evaluation Report (Reamer 2000); (3) Key Technical Issue agreements made during the DOE/U.S. Nuclear Regulatory Commission (NRC) Technical Exchange Meeting (Reamer and Williams 2000); and (4) NRC requests for additional information (Schlueter 2002)

  17. Application of NASA Kennedy Space Center system assurance analysis methodology to nuclear power plant systems designs

    International Nuclear Information System (INIS)

    Page, D.W.

    1985-01-01

    The Kennedy Space Center (KSC) entered into an agreement with the Nuclear Regulatory Commission (NRC) to conduct a study to demonstrate the feasibility and practicality of applying the KSC System Assurance Analysis (SAA) methodology to nuclear power plant systems designs. In joint meetings of KSC and Duke Power personnel, an agreement was made to select two CATAWBA systems, the Containment Spray System and the Residual Heat Removal System, for the analyses. Duke Power provided KSC with a full set of Final Safety Analysis Reports as well as schematics for the two systems. During Phase I of the study the reliability analyses of the SAA were performed. During Phase II the hazard analyses were performed. The final product of Phase II is a handbook for implementing the SAA methodology into nuclear power plant systems designs. The purpose of this paper is to describe the SAA methodology as it applies to nuclear power plant systems designs and to discuss the feasibility of its application. The conclusion is drawn that nuclear power plant systems and aerospace ground support systems are similar in complexity and design and share common safety and reliability goals. The SAA methodology is readily adaptable to nuclear power plant designs because of its practical application of existing and well known safety and reliability analytical techniques tied to an effective management information system

  18. Data development technical support document for the aircraft crash risk analysis methodology (ACRAM) standard

    International Nuclear Information System (INIS)

    Kimura, C.Y.; Glaser, R.E.; Mensing, R.W.; Lin, T.; Haley, T.A.; Barto, A.B.; Stutzke, M.A.

    1996-01-01

    The Aircraft Crash Risk Analysis Methodology (ACRAM) Panel has been formed by the US Department of Energy Office of Defense Programs (DOE/DP) for the purpose of developing a standard methodology for determining the risk from aircraft crashes onto DOE ground facilities. In order to accomplish this goal, the ACRAM panel has been divided into four teams, the data development team, the model evaluation team, the structural analysis team, and the consequence team. Each team, consisting of at least one member of the ACRAM plus additional DOE and DOE contractor personnel, specializes in the development of the methodology assigned to that team. This report documents the work performed by the data development team and provides the technical basis for the data used by the ACRAM Standard for determining the aircraft crash frequency. This report should be used to provide the generic data needed to calculate the aircraft crash frequency into the facility under consideration as part of the process for determining the aircraft crash risk to ground facilities as given by the DOE Standard Aircraft Crash Risk Assessment Methodology (ACRAM). Some broad guidance is presented on how to obtain the needed site-specific and facility specific data but this data is not provided by this document

  19. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Directory of Open Access Journals (Sweden)

    I. Escuder-Bueno

    2012-09-01

    Full Text Available Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009–2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.

  20. A quantitative flood risk analysis methodology for urban areas with integration of social research data

    Science.gov (United States)

    Escuder-Bueno, I.; Castillo-Rodríguez, J. T.; Zechner, S.; Jöbstl, C.; Perales-Momparler, S.; Petaccia, G.

    2012-09-01

    Risk analysis has become a top priority for authorities and stakeholders in many European countries, with the aim of reducing flooding risk, considering the population's needs and improving risk awareness. Within this context, two methodological pieces have been developed in the period 2009-2011 within the SUFRI project (Sustainable Strategies of Urban Flood Risk Management with non-structural measures to cope with the residual risk, 2nd ERA-Net CRUE Funding Initiative). First, the "SUFRI Methodology for pluvial and river flooding risk assessment in urban areas to inform decision-making" provides a comprehensive and quantitative tool for flood risk analysis. Second, the "Methodology for investigation of risk awareness of the population concerned" presents the basis to estimate current risk from a social perspective and identify tendencies in the way floods are understood by citizens. Outcomes of both methods are integrated in this paper with the aim of informing decision making on non-structural protection measures. The results of two case studies are shown to illustrate practical applications of this developed approach. The main advantage of applying the methodology herein presented consists in providing a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures. It can be of great interest for decision makers as it provides rational and solid information.
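
    One common way to express "a quantitative estimation of flooding risk before and after investing in non-structural risk mitigation measures" is expected annual damage, obtained by integrating damage over the exceedance-probability curve. The sketch below is a generic illustration with invented numbers; it has no connection to the SUFRI tool itself.

      import numpy as np

      def expected_annual_damage(exceedance_prob, damage):
          """Trapezoidal integration of the damage vs. annual exceedance probability curve."""
          p = np.asarray(exceedance_prob, dtype=float)
          d = np.asarray(damage, dtype=float)
          order = np.argsort(p)
          p, d = p[order], d[order]
          return float(np.sum(0.5 * (d[1:] + d[:-1]) * np.diff(p)))

      # Hypothetical damage curves (probability per year, damage in EUR) before/after measures
      p = [0.001, 0.01, 0.1, 0.5]
      before = [80e6, 20e6, 2e6, 0.0]
      after = [60e6, 12e6, 1e6, 0.0]   # e.g. with early warning and risk communication
      print(expected_annual_damage(p, before) - expected_annual_damage(p, after))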

  1. A Human Performance Analysis on Emergency Tasks of a Nuclear Power Plant

    International Nuclear Information System (INIS)

    Jung, Wondea; Park, Jinkyun; Kim, Jae W.

    2007-01-01

    Considering risk-informed activities that require the probabilistic safety assessment (PSA) quality to be as high as possible, an HRA should be performed by using a systematic method with realistic plant-specific data to meet the requirements for risk-informed applications. In order to obtain more objective HRA results, data extracted from real experiences or simulators is essential. To support HRA activities and research, we have developed a human performance database, OPERA (Operator Performance and Reliability Analysis). This paper introduces a study to analyze operators' performance time, which is the most crucial input for estimating the human error probability of a post-initiating human failure event

  2. Human reliability analysis in the man-machine interface design review

    International Nuclear Information System (INIS)

    Kim, I.S.

    2001-01-01

    Advanced, computer-based man-machine interface (MMI) is emerging as part of the new design of nuclear power plants. The impact of advanced MMI on the operator performance, and as a result, on plant safety should be thoroughly evaluated before such technology is actually adopted in the plants. This paper discusses the applicability of human reliability analysis (HRA) to support the design review process. Both the first-generation and the second-generation HRA methods are considered focusing on a couple of promising HRA methods, i.e. ATHEANA and CREAM, with the potential to assist the design review process

  3. Reporting and methodological quality of survival analysis in articles published in Chinese oncology journals.

    Science.gov (United States)

    Zhu, Xiaoyan; Zhou, Xiaobin; Zhang, Yuan; Sun, Xiao; Liu, Haihua; Zhang, Yingying

    2017-12-01

    Survival analysis methods have gained widespread use in the field of oncology. For the achievement of reliable results, the methodological process and report quality are crucial. This review provides the first examination of the methodological characteristics and reporting quality of survival analysis in articles published in leading Chinese oncology journals. The aims were to examine the methodological and reporting quality of survival analysis, to identify common deficiencies, to describe desirable precautions in the analysis, and to relate advice for authors, readers, and editors. A total of 242 survival analysis articles were included for evaluation from 1492 articles published in 4 leading Chinese oncology journals in 2013. Articles were evaluated according to 16 established items for proper use and reporting of survival analysis. The application rates of Kaplan-Meier, life table, log-rank test, Breslow test, and Cox proportional hazards model (Cox model) were 91.74%, 3.72%, 78.51%, 0.41%, and 46.28%, respectively; no article used a parametric method for survival analysis. A multivariate Cox model was conducted in 112 articles (46.28%). Follow-up rates were mentioned in 155 articles (64.05%), of which 4 articles reported rates under 80% (the lowest being 75.25%) and 55 articles reported 100%. The report rates of all types of survival endpoint were lower than 10%. Eleven of 100 articles which reported a loss to follow-up stated how it was treated in the analysis. One hundred thirty articles (53.72%) did not perform multivariate analysis. One hundred thirty-nine articles (57.44%) did not define the survival time. Violations and omissions of methodological guidelines included no mention of pertinent checks for the proportional hazards assumption, no report of testing for interactions and collinearity between independent variables, and no report of the calculation method for sample size. Thirty-six articles (32.74%) reported the methods of independent variable selection. The above defects could make the results potentially inaccurate
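
    Several of the deficiencies listed above (no check of the proportional hazards assumption, undefined survival time, unreported variable selection) can be avoided with routine checks. The sketch below, assuming a data frame with time, event and two covariate columns, runs a Kaplan-Meier estimate, a log-rank test and a Cox model with a proportional-hazards test using the lifelines package; it is an illustration of such checks, not a reanalysis of the reviewed articles.

      import pandas as pd
      from lifelines import KaplanMeierFitter, CoxPHFitter
      from lifelines.statistics import logrank_test, proportional_hazard_test

      df = pd.read_csv("cohort.csv")   # assumed columns: time, event, age, stage (1 or 2)

      # Kaplan-Meier estimate of the survival function
      kmf = KaplanMeierFitter().fit(df["time"], event_observed=df["event"])

      # Log-rank test between the two hypothetical stage groups
      g1, g2 = df[df["stage"] == 1], df[df["stage"] == 2]
      lr = logrank_test(g1["time"], g2["time"], g1["event"], g2["event"])

      # Multivariate Cox model plus a check of the proportional hazards assumption
      cph = CoxPHFitter().fit(df[["time", "event", "age", "stage"]],
                              duration_col="time", event_col="event")
      ph = proportional_hazard_test(cph, df[["time", "event", "age", "stage"]],
                                    time_transform="rank")
      print(lr.p_value, ph.summary)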

  4. Methodological Choices in Muscle Synergy Analysis Impact Differentiation of Physiological Characteristics Following Stroke

    Directory of Open Access Journals (Sweden)

    Caitlin L. Banks

    2017-08-01

    Full Text Available Muscle synergy analysis (MSA) is a mathematical technique that reduces the dimensionality of electromyographic (EMG) data. Used increasingly in biomechanics research, MSA requires methodological choices at each stage of the analysis. Differences in methodological steps affect the overall outcome, making it difficult to compare results across studies. We applied MSA to EMG data collected from individuals post-stroke identified as either responders (RES) or non-responders (nRES) on the basis of a critical post-treatment increase in walking speed. Importantly, no clinical or functional indicators identified differences between the cohort of RES and nRES at baseline. For this exploratory study, we selected the five highest RES and five lowest nRES available from a larger sample. Our goal was to assess how the methodological choices made before, during, and after MSA affect the ability to differentiate two groups with intrinsic physiologic differences based on MSA results. We investigated 30 variations in MSA methodology to determine which choices allowed differentiation of RES from nRES at baseline. Trial-to-trial variability in time-independent synergy vectors (SVs) and time-varying neural commands (NCs) were measured as a function of: (1) number of synergies computed; (2) EMG normalization method before MSA; (3) whether SVs were held constant across trials or allowed to vary during MSA; and (4) synergy analysis output normalization method after MSA. MSA methodology had a strong effect on our ability to differentiate RES from nRES at baseline. Across all 10 individuals and MSA variations, two synergies were needed to reach an average of 90% variance accounted for (VAF). Based on effect sizes, differences in SV and NC variability between groups were greatest using two synergies with SVs that varied from trial-to-trial. Differences in SV variability were clearest using unit magnitude per trial EMG normalization, while NC variability was less sensitive to EMG
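
    Muscle synergy analysis is usually implemented as a non-negative matrix factorization of the EMG envelope matrix, with variance accounted for (VAF) used to choose the number of synergies. The scikit-learn sketch below, with random data standing in for EMG envelopes, is a generic illustration of that step rather than the authors' pipeline; the 90% VAF threshold mirrors the figure quoted in the abstract.

      import numpy as np
      from sklearn.decomposition import NMF

      rng = np.random.default_rng(0)
      emg = np.abs(rng.normal(size=(8, 1000)))   # 8 muscles x 1000 samples (stand-in for EMG envelopes)

      def vaf(X, W, H):
          """Variance accounted for by the reconstruction W @ H."""
          return 1.0 - np.sum((X - W @ H) ** 2) / np.sum(X ** 2)

      for n_syn in range(1, 6):
          model = NMF(n_components=n_syn, init="nndsvda", max_iter=1000)
          W = model.fit_transform(emg)           # synergy vectors (muscles x synergies)
          H = model.components_                  # neural commands (synergies x time)
          print(n_syn, round(vaf(emg, W, H), 3)) # keep the smallest number exceeding 0.90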

  5. Human Reliability Analysis in Support of Risk Assessment for Positive Train Control

    Science.gov (United States)

    2003-06-01

    This report describes an approach to evaluating the reliability of human actions that are modeled in a probabilistic risk assessment : (PRA) of train control operations. This approach to human reliability analysis (HRA) has been applied in the case o...

  6. Business analysis methodology in telecommunication industry – the research based on the grounded theory

    Directory of Open Access Journals (Sweden)

    Hana Nenickova

    2013-10-01

    Full Text Available The objective of this article is to present the grounded theory used in qualitative research as a basis to build a business analysis methodology for the implementation of information systems in telecommunication enterprises in the Czech Republic. In preparing the methodology I have used the current needs of telecommunications companies, which are characterized mainly by a high dependence on information systems. Besides that, this industry is characterized by high flexibility and competition and by compression of the corporate strategy timeline. The grounded theory of business analysis defines the specifics of the telecommunications industry, focusing on a very specific description of the procedure for collecting the business requirements and following the business strategy.

  7. Application of GO methodology in reliability analysis of offsite power supply of Daya Bay NPP

    International Nuclear Information System (INIS)

    Shen Zupei; Li Xiaodong; Huang Xiangrui

    2003-01-01

    The author applies the GO methodology to the reliability analysis of the offsite power supply system of the Daya Bay NPP. Direct quantitative calculation formulas for the steady-state reliability target of a system with shared signals, and dynamic calculation formulas for the state probability of a unit with two states, are derived. The method for solving the fault event sets of the system is also presented, and all the fault event sets of the offsite power supply system and their failure probabilities are obtained. The resumption reliability of the offsite power supply system after a stability failure of the power grid is also calculated. The result shows that the GO methodology is very simple and useful in the steady-state and dynamic reliability analysis of repairable systems
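
    The record does not reproduce the dynamic state-probability formulas themselves; for a repairable unit with two states, constant failure rate lambda and constant repair rate mu, the standard Markov result gives the point availability A(t) = mu/(lambda+mu) + (lambda/(lambda+mu))*exp(-(lambda+mu)t) when the unit starts in the working state. The sketch below implements that textbook formula as an assumption, not the GO-methodology formulas of the paper.

      import math

      def availability(t, failure_rate, repair_rate, initially_up=True):
          """Point availability of a repairable two-state Markov unit at time t."""
          s = failure_rate + repair_rate
          steady = repair_rate / s                      # long-run availability mu/(lambda+mu)
          transient = (1.0 - steady) if initially_up else -steady
          return steady + transient * math.exp(-s * t)

      # Hypothetical offsite-power related unit: failure and repair rates per hour
      print(availability(t=24.0, failure_rate=1e-4, repair_rate=0.1))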

  8. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    Energy Technology Data Exchange (ETDEWEB)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S., E-mail: dayane.silva@usp.br, E-mail: gdjian@ipen.br, E-mail: aclima@ipen.br [Instituto de Pesquisas Energeticas e Nucleares (IPEN/CNEN-SP), Sao Paulo, SP (Brazil)

    2015-07-01

    The aim of this work is to present the calculation methodology for the Angra 2 reactor containment during accidents of the Loss of Coolant Accident (LOCA) type. This study will make it possible to ensure the safety of the surrounding population in the event of such accidents. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool used for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system in a light-water reactor during an accident. The containment of a PWR plant is a concrete building internally lined with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  9. Methodology of a PWR containment analysis during a thermal-hydraulic accident

    International Nuclear Information System (INIS)

    Silva, Dayane F.; Sabundjian, Gaiane; Lima, Ana Cecilia S.

    2015-01-01

    The aim of this work is to present the calculation methodology for the Angra 2 reactor containment during accidents of the Loss of Coolant Accident (LOCA) type. This study will make it possible to ensure the safety of the surrounding population in the event of such accidents. One of the programs used to analyze the containment of a nuclear plant is CONTAIN. This computer code is an analysis tool used for predicting the physical conditions and distributions of radionuclides inside a containment building following the release of material from the primary system in a light-water reactor during an accident. The containment of a PWR plant is a concrete building internally lined with metallic material and has design pressure limits. The containment analysis methodology must estimate the pressure limits during a LOCA. The boundary conditions for the simulation are obtained from the RELAP5 code. (author)

  10. Effects of methodology and analysis strategy on robustness of pestivirus phylogeny.

    Science.gov (United States)

    Liu, Lihong; Xia, Hongyan; Baule, Claudia; Belák, Sándor; Wahlberg, Niklas

    2010-01-01

    Phylogenetic analysis of pestiviruses is a useful tool for classifying novel pestiviruses and for revealing their phylogenetic relationships. In this study, robustness of pestivirus phylogenies has been compared by analyses of the 5'UTR, and complete N(pro) and E2 gene regions separately and combined, performed by four methods: neighbour-joining (NJ), maximum parsimony (MP), maximum likelihood (ML), and Bayesian inference (BI). The strategy of analysing the combined sequence dataset by BI, ML, and MP methods resulted in a single, well-supported tree topology, indicating a reliable and robust pestivirus phylogeny. By contrast, the single-gene analysis strategy resulted in 12 trees of different topologies, revealing different relationships among pestiviruses. These results indicate that the strategies and methodologies are two vital aspects affecting the robustness of the pestivirus phylogeny. The strategy and methodologies outlined in this paper may have a broader application in inferring phylogeny of other RNA viruses.
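
    As a minimal illustration of one of the four methods compared above, the Biopython sketch below builds a neighbour-joining tree from a combined alignment; the file name and the identity distance model are assumptions, and the MP, ML and Bayesian analyses would be run in dedicated tools.

      from Bio import AlignIO
      from Bio.Phylo.TreeConstruction import DistanceCalculator, DistanceTreeConstructor

      # Assumed input: a concatenated 5'UTR + Npro + E2 alignment in FASTA format
      alignment = AlignIO.read("pestivirus_combined.fasta", "fasta")

      calculator = DistanceCalculator("identity")          # simple identity-based distances
      constructor = DistanceTreeConstructor(calculator, method="nj")
      nj_tree = constructor.build_tree(alignment)

      print(nj_tree)   # topology to be compared with the MP, ML and BI trees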

  11. Analysis of Interbrand, BrandZ and BAV brand valuation methodologies

    Directory of Open Access Journals (Sweden)

    Krstić Bojan

    2011-01-01

    Full Text Available Brand valuation is considered one of the most significant challenges not only for the theory and practice of contemporary marketing, but for other disciplines as well. Namely, the complex nature of this issue implies the need for a multidisciplinary approach and the creation of a methodology which goes beyond the borders of marketing as a discipline and includes knowledge derived from accounting, finance and other areas. However, mostly one-sided approaches, oriented towards determining brand value either based on research results of consumer behavior and attitudes or based on the financial success of the brand, are dominant in the marketing and financial literature. Simultaneously with these theoretical methodologies, agencies for consultancy and marketing and other subjects have been developing their own brand valuation methods and models. Some of them can be assigned to the comprehensive approach to brand valuation, which overcomes the mentioned problem of one-sided analysis of brand value. The comprehensive approach, namely, presumes brand valuation based on the benefits which the brand provides to both customers and the enterprise that owns it, in other words - based on qualitative and quantitative measures respectively reflecting the behavior and attitudes of consumers and the assumed financial value of the brand, or, more precisely, brand value capitalization. According to the defined research subject, this paper is structured as follows: the importance and problem of brand value are reviewed in the Introduction, and the three most well-known brand valuation methodologies developed by consultancy agencies - the Interbrand methodology and the BrandZ and BAV models - are analyzed in the next section. In the further considerations the results of a comparative analysis of these methodologies are presented and implications for adequate brand valuation suggested.

  12. Methodology of demand forecast by market analysis of electric power and load curves

    International Nuclear Information System (INIS)

    Barreiro, C.J.; Atmann, J.L.

    1989-01-01

    A methodology for the demand forecast of consumer classes and their aggregation is presented. An analysis of the actual attended market can be done by appropriate measurements and load curve studies. The suppositions for the future market behaviour by consumer classes (industrial, residential, commercial, others) are shown, and the actions for optimising this market, obtained by load curve modulations, are foreseen. The process of future demand determination is obtained by the appropriate aggregation of these segmented demands. (C.G.C.)
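
    The aggregation step described above amounts to summing the class-level load curves after applying assumed modulation factors that represent the load-management actions for each class. The numpy sketch below uses invented hourly curves and factors purely as a schematic illustration, not the utility's forecasting system.

      import numpy as np

      hours = np.arange(24)
      # Invented hourly load curves per consumer class (MW)
      load_curves = {
          "residential": 300 + 150 * np.sin((hours - 18) / 24 * 2 * np.pi),
          "industrial": np.full(24, 400.0),
          "commercial": 200 + 100 * (np.clip(hours, 8, 20) - 8) / 12,
      }
      # Assumed modulation factors reflecting load-management actions per class
      modulation = {"residential": 0.95, "industrial": 1.00, "commercial": 0.90}

      aggregate = sum(modulation[c] * curve for c, curve in load_curves.items())
      print(f"Forecast peak demand: {aggregate.max():.0f} MW at hour {aggregate.argmax()}")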

  13. Comparing Internet Probing Methodologies Through an Analysis of Large Dynamic Graphs

    Science.gov (United States)

    2014-06-01

    ...including DIMES, IPlane, Ark IPv4 All Prefix /24 and, recently, the NPS probing methodology. The NPS probing methodology is different from the others because it...trace, a history of the forward interface-level path and the time to send and acknowledge are available to analyze. However, traceroute may not return

  14. Methodological Approach to Company Cash Flows Target-Oriented Forecasting Based on Financial Position Analysis

    OpenAIRE

    Sergey Krylov

    2012-01-01

    The article treats a new methodological approach to the company cash flows target-oriented forecasting based on its financial position analysis. The approach is featured to be universal and presumes application of the following techniques developed by the author: financial ratio values correction techniques and correcting cash flows techniques. The financial ratio values correction technique assumes to analyze and forecast company financial position while the correcting cash flows technique i...

  15. Methodological and theoretical issues in the comparative analysis of gender relations in Western Europe

    OpenAIRE

    S Walby

    1994-01-01

    The aim in this paper is to contribute to the development of a research agenda for the comparative analysis of gender relations in Western Europe. Its focus is the clarification of the methodological and theoretical issues involved. Several different indices of gender inequality are assessed. It is argued that it is important to distinguish between the form and degree of patriarchy, rather than assuming that these are closely associated. Data from the EC and Scandinavia are used to illustrate...

  16. Does aggressive play pay off in online poker? (Vyplácí se v on-line pokeru agresivní hra?)

    OpenAIRE

    Rytíř, Miroslav

    2011-01-01

    This thesis examines the impact of making aggressive moves on profit, measured in big blinds won per hundred hands. Using econometric analysis I have estimated two models: the first, general model contains only how often players choose aggressive moves; in the second I have estimated the impact of moves that are based on some advantage in different situations throughout the game. The variables measuring the frequency of aggressive moves had positive coefficients in both models, so I conclude that aggression is profitable.
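
    The two models described read naturally as linear regressions of win rate on aggression statistics. The statsmodels sketch below is only a schematic reconstruction with invented column names (winrate_bb100, aggression_freq, cbet_freq); it is not the thesis' actual specification or data.

      import pandas as pd
      import statsmodels.api as sm

      data = pd.read_csv("player_stats.csv")   # assumed per-player hand-history summary

      # First, general model: win rate explained by overall aggression frequency
      X1 = sm.add_constant(data[["aggression_freq"]])
      general = sm.OLS(data["winrate_bb100"], X1).fit()

      # Second model: add a situational, advantage-based move (e.g. continuation-bet frequency)
      X2 = sm.add_constant(data[["aggression_freq", "cbet_freq"]])
      situational = sm.OLS(data["winrate_bb100"], X2).fit()

      print(general.params, situational.params)   # positive coefficients would support the thesis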

  17. An analysis methodology for hot leg break mass and energy release

    International Nuclear Information System (INIS)

    Song, Jin Ho; Kwon, Young Min; Kim, Taek Mo; Chung, Hae Yong; Lee, Sang Jong

    1996-07-01

    An analysis methodology for the hot leg break mass and energy release is developed. For the blowdown period a modified CEFLASH-4A analysis is suggested. For the post-blowdown period a new computer model named COMET is developed. Unlike the previous post-blowdown analysis model FLOOD3, COMET is capable of analyzing both cold leg and hot leg break cases. The cold leg break model is essentially the same as that of FLOOD3, with some improvements. The analysis results from the newly proposed hot leg break model in COMET show the same trend as those observed in the scaled-down integral experiment. The analysis results for UCN 3 and 4 obtained with COMET are qualitatively and quantitatively in good agreement with those predicted by a best-estimate analysis using RELAP5/MOD3. Therefore, the COMET code is validated and can be used for licensing analysis. 6 tabs., 82 figs., 9 refs. (Author)

  18. Applicability of simplified human reliability analysis methods for severe accidents

    Energy Technology Data Exchange (ETDEWEB)

    Boring, R.; St Germain, S. [Idaho National Lab., Idaho Falls, Idaho (United States); Banaseanu, G.; Chatri, H.; Akl, Y. [Canadian Nuclear Safety Commission, Ottawa, Ontario (Canada)

    2016-03-15

    Most contemporary human reliability analysis (HRA) methods were created to analyse design-basis accidents at nuclear power plants. As part of a comprehensive expansion of risk assessments at many plants internationally, HRAs will begin considering severe accident scenarios. Severe accidents, while extremely rare, constitute high consequence events that significantly challenge successful operations and recovery. Challenges during severe accidents include degraded and hazardous operating conditions at the plant, the shift in control from the main control room to the technical support center, the unavailability of plant instrumentation, and the need to use different types of operating procedures. Such shifts in operations may also test key assumptions in existing HRA methods. This paper discusses key differences between design basis and severe accidents, reviews efforts to date to create customized HRA methods suitable for severe accidents, and recommends practices for adapting existing HRA methods that are already being used for HRAs at the plants. (author)

  19. Significant aspects of the external event analysis methodology of the Jose Cabrera NPP PSA

    International Nuclear Information System (INIS)

    Barquin Duena, A.; Martin Martinez, A.R.; Boneham, P.S.; Ortega Prieto, P.

    1994-01-01

    This paper describes the following advances in the methodology for the Analysis of External Events in the PSA of the Jose Cabrera NPP: In the Fire Analysis, a version of the COMPBRN3 code, modified by Empresarios Agrupados according to the guidelines of Appendix D of NUREG/CR-5088, has been used. Generic cases were modelled and general conclusions obtained, applicable to fire propagation in closed areas. The damage times obtained were appreciably lower than those obtained with the previous version of the code. The Flood Analysis methodology is based on the construction of event trees to represent flood propagation dependent on the condition of the communication paths between areas, and trees showing propagation stages as a function of affected areas and damaged mitigation equipment. To determine the temporal evolution of the flood area level, the CAINZO-EA code has been developed, adapted to specific plant characteristics. In both the Fire and Flood Analyses a quantification methodology has been adopted, which consists of analysing the damages caused at each stage of growth or propagation and identifying, in the Internal Events models, the gates, basic events or headers to which safe failure (probability 1) due to damages is assigned. (Author)

  20. Safety analysis methodology with assessment of the impact of the prediction errors of relevant parameters

    International Nuclear Information System (INIS)

    Galia, A.V.

    2011-01-01

    The best estimate plus uncertainty approach (BEAU) requires the use of extensive resources and therefore it is usually applied to cases in which the available safety margin obtained with a conservative methodology can be questioned. Outside the BEAU methodology, there is no clear approach on how to deal with the issue of considering the uncertainties resulting from prediction errors in the safety analyses performed for licensing submissions. However, the regulatory document RD-310 mentions that the analysis method shall account for uncertainties in the analysis data and models. A possible approach is presented, simple and reasonable and representing just the author's views, to take into account the impact of prediction errors and other uncertainties when performing safety analysis in line with regulatory requirements. The approach proposes taking into account the prediction error of relevant parameters. Relevant parameters would be those plant parameters that are surveyed and are used to initiate the action of a mitigating system, or those that are representative of the most challenging phenomena for the integrity of a fission barrier. Examples of the application of the methodology are presented, involving a comparison between the results with the new approach and a best-estimate calculation during the blowdown phase for two small breaks in a generic CANDU 6 station. The calculations are performed with the CATHENA computer code. (author)
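
    The essence of the proposed approach, biasing the relevant surveyed parameter by its prediction error before it is credited to trigger a mitigating action, can be pictured in a few lines. The sketch below is a schematic illustration with invented numbers, not part of the CATHENA analysis itself.

      def effective_setpoint(nominal, prediction_error, conservative_shift=-1.0):
          """Bias a trip setpoint by the prediction error of the surveyed parameter,
          in whichever direction delays the credited mitigating action
          (shift of -1 for a low-parameter trip, +1 for a high-parameter trip)."""
          return nominal + conservative_shift * prediction_error

      # Hypothetical low-pressure reactor trip: 9.6 MPa nominal, 0.2 MPa prediction error
      print(effective_setpoint(9.6, 0.2, conservative_shift=-1.0))   # credited at 9.4 MPa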

  1. Methodology for the analysis of pollutant emissions from a city bus

    International Nuclear Information System (INIS)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-01-01

    In this work a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As the test circuit, a passenger transportation line in a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel–air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least. (paper)
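
    The "analysis by categories" mentioned above can be reproduced schematically by segmenting the 1 Hz record into idle, acceleration, deceleration and cruise sequences and averaging the emissions within each category. The pandas sketch below uses assumed column names and thresholds and is not the authors' processing chain.

      import numpy as np
      import pandas as pd

      def classify_sequences(df, v_col="speed_kmh", dt=1.0, a_threshold=0.1):
          """Label each 1 Hz sample as idle / acceleration / deceleration / cruise."""
          accel = df[v_col].diff().fillna(0.0) / 3.6 / dt      # m/s^2 from km/h steps
          category = np.where(df[v_col] < 2.0, "idle",
                     np.where(accel > a_threshold, "acceleration",
                     np.where(accel < -a_threshold, "deceleration", "cruise")))
          return df.assign(category=category)

      # Hypothetical usage with mass emission columns recorded by the gas analyzer (g/s):
      # df = pd.read_csv("obs1300_log.csv")
      # print(classify_sequences(df).groupby("category")[["nox_gps", "co2_gps"]].mean())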

  2. A methodology for collection and analysis of human error data based on a cognitive model: IDA

    International Nuclear Information System (INIS)

    Shen, S.-H.; Smidts, C.; Mosleh, A.

    1997-01-01

    This paper presents a model-based human error taxonomy and data collection. The underlying model, IDA (described in two companion papers), is a cognitive model of behavior developed for the analysis of the actions of nuclear power plant operating crews during abnormal situations. The taxonomy is established with reference to three external reference points (i.e. plant status, procedures, and crew) and four reference points internal to the model (i.e. information collected, diagnosis, decision, action). The taxonomy helps the analyst: (1) recognize errors as such; (2) categorize the error in terms of generic characteristics such as 'error in selection of problem solving strategies' and (3) identify the root causes of the error. The data collection methodology is summarized in post-event operator interview and analysis summary forms. The root cause analysis methodology is illustrated using a subset of an actual event. Statistics that extract generic characteristics of error-prone behaviors and error-prone situations are presented. Finally, applications of the human error data collection are reviewed. A primary benefit of this methodology is to define better symptom-based and other auxiliary procedures with associated training to minimize or preclude certain human errors. It also helps in the design of control rooms, and in the assessment of human error probabilities in the probabilistic risk assessment framework. (orig.)

  3. Methodology for the analysis of pollutant emissions from a city bus

    Science.gov (United States)

    Armas, Octavio; Lapuerta, Magín; Mata, Carmen

    2012-04-01

    In this work a methodology is proposed for the measurement and analysis of gaseous emissions and particle size distributions emitted by a diesel city bus during its typical operation under urban driving conditions. As the test circuit, a passenger transportation line in a Spanish city was used. Different ways for data processing and representation were studied and, derived from this work, a new approach is proposed. The methodology was useful to detect the most important uncertainties arising during registration and processing of data derived from a measurement campaign devoted to determining the main pollutant emissions. A HORIBA OBS-1300 gas analyzer and a TSI engine exhaust particle spectrometer were used with 1 Hz frequency data recording. The methodology proposed allows for the comparison of results (in mean values) derived from the analysis of either complete cycles or specific categories (or sequences). The analysis by categories is demonstrated to be a robust and helpful tool to isolate the effect of the main vehicle parameters (relative fuel-air ratio and velocity) on pollutant emissions. It was shown that acceleration sequences have the highest contribution to the total emissions, whereas deceleration sequences have the least.

  4. Development of the fire PSA methodology and the fire analysis computer code system

    International Nuclear Information System (INIS)

    Katsunori, Ogura; Tomomichi, Ito; Tsuyoshi, Uchida; Yusuke, Kasagawa

    2009-01-01

    A fire PSA methodology has been developed and applied to NPPs in Japan for power operation and LPSD states. The CDFs of the preliminary fire PSA for power operation were higher than those of internal events. A fire propagation analysis code system (CFAST/FDS Network) is being developed and verified through the OECD-PRISME Project. Extension of the scope to the LPSD state is planned to figure out the risk level. In order to figure out the fire risk level precisely, enhancement of the methodology is planned: verification and validation of the phenomenological fire propagation analysis code (CFAST/FDS Network) in the context of fire PSA, and application of the 'Electric Circuit Analysis' of NUREG/CR-6850 and related tests in order to quantify the hot-short effect precisely. Development of a seismic-induced fire PSA method, integrating the existing seismic PSA and fire PSA methods, is ongoing. Fire PSA will be applied to review the validity of fire prevention and mitigation measures

  5. How to conduct a qualitative meta-analysis: Tailoring methods to enhance methodological integrity.

    Science.gov (United States)

    Levitt, Heidi M

    2018-05-01

    Although qualitative research has long been of interest in the field of psychology, meta-analyses of qualitative literatures (sometimes called meta-syntheses) are still quite rare. Like quantitative meta-analyses, these methods function to aggregate findings and identify patterns across primary studies, but their aims, procedures, and methodological considerations may vary. This paper explains the function of qualitative meta-analyses and their methodological development. Recommendations have broad relevance but are framed with an eye toward their use in psychotherapy research. Rather than arguing for the adoption of any single meta-method, this paper advocates for considering how procedures can best be selected and adapted to enhance a meta-study's methodological integrity. Throughout the paper, recommendations are provided to help researchers identify procedures that can best serve their studies' specific goals. Meta-analysts are encouraged to consider the methodological integrity of their studies in relation to central research processes, including identifying a set of primary research studies, transforming primary findings into initial units of data for a meta-analysis, developing categories or themes, and communicating findings. The paper provides guidance for researchers who desire to tailor meta-analytic methods to meet their particular goals while enhancing the rigor of their research.

  6. Analysis of Feedback processes in Online Group Interaction: a methodological model

    Directory of Open Access Journals (Sweden)

    Anna Espasa

    2013-06-01

    Full Text Available The aim of this article is to present a methodological model to analyze students' group interaction to improve their essays in online learning environments, based on asynchronous and written communication. In these environments teacher and student scaffolds for discussion are essential to promote interaction. One of these scaffolds can be feedback. Research on feedback processes has predominantly focused on feedback design rather than on how students utilize feedback to improve learning. This methodological model fills this gap, contributing to the analysis of the implementation of feedback processes while students discuss collaboratively in a specific case of writing assignments. A review of different methodological models was carried out to define a framework adjusted to the analysis of the relationship between written and asynchronous group interaction, and students' activity and changes incorporated into the final text. The model proposed includes the following dimensions: (1) student participation, (2) nature of student learning and (3) quality of student learning. The main contribution of this article is to present the methodological model and also to ascertain the model's operativity regarding how students incorporate such feedback into their essays.

  7. Network meta-analysis-highly attractive but more methodological research is needed

    Directory of Open Access Journals (Sweden)

    Singh Sonal

    2011-06-01

    Full Text Available Network meta-analysis, in the context of a systematic review, is a meta-analysis in which multiple treatments (that is, three or more) are being compared using both direct comparisons of interventions within randomized controlled trials and indirect comparisons across trials based on a common comparator. To ensure the validity of findings from network meta-analyses, the systematic review must be designed rigorously and conducted carefully. Aspects of designing and conducting a systematic review for network meta-analysis include defining the review question, specifying eligibility criteria, searching for and selecting studies, assessing risk of bias and quality of evidence, conducting a network meta-analysis, and interpreting and reporting findings. This commentary summarizes the methodologic challenges and research opportunities for network meta-analysis relevant to each aspect of the systematic review process based on discussions at a network meta-analysis methodology meeting we hosted in May 2010 at the Johns Hopkins Bloomberg School of Public Health. Since this commentary reflects the discussion at that meeting, it is not intended to provide an overview of the field.
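
    The arithmetic behind an indirect comparison across a common comparator (the Bucher adjusted indirect comparison) is short: if treatments A and B have each been compared with C on an additive scale, d_AB = d_AC - d_BC with variance var_AC + var_BC. The sketch below shows this on invented log odds ratios; a full network meta-analysis would, of course, use dedicated packages and check consistency between direct and indirect evidence.

      import math

      def bucher_indirect(d_ac, se_ac, d_bc, se_bc):
          """Adjusted indirect comparison of A vs B through common comparator C
          (any additive scale, e.g. log odds ratio or log hazard ratio)."""
          d_ab = d_ac - d_bc
          se_ab = math.sqrt(se_ac ** 2 + se_bc ** 2)
          return d_ab, se_ab, (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)

      # Hypothetical log odds ratios of two drugs, each versus placebo (C)
      print(bucher_indirect(d_ac=-0.40, se_ac=0.15, d_bc=-0.25, se_bc=0.20))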

  8. 3-Dimensional Methodology for the Control Rod Ejection Accident Analysis Using UNICORNTM

    Energy Technology Data Exchange (ETDEWEB)

    Jang, Chan-su; Um, Kil-sup; Ahn, Dawk-hwan [Korea Nuclear Fuel Company, Taejon (Korea, Republic of); Kim, Yo-han; Sung, Chang-kyung [KEPRI, Taejon (Korea, Republic of); Song, Jae-seung [Korea Atomic Energy Research Institute, Taejon (Korea, Republic of)

    2006-07-01

    The control rod ejection accident has been analyzed with STRIKIN-II code using the point kinetics model coupled with conservative factors to address the three dimensional aspects. This may result in a severe transient with very high fuel enthalpy deposition. KNFC, under the support of KEPRI and KAERI, is developing 3-dimensional methodology for the rod ejection accident analysis using UNICORNTM (Unified Code of RETRAN, TORC and MASTER). For this purpose, 3-dimensional MASTER-TORC codes, which have been combined with the dynamic-link library by KAERI, are used in the transient analysis of the core and RETRAN code is used to estimate the enthalpy deposition in the hot rod.

  9. Methodology for repeated load analysis of composite structures with embedded magnetic microwires

    Directory of Open Access Journals (Sweden)

    K. Semrád

    2017-01-01

    Full Text Available The article addresses the issue of the strength of cyclically loaded composite structures with the possibility of contactless stress measurement inside the material. For this purpose a contactless tensile stress sensor using an improved induction principle, based on the magnetic microwires embedded in the composite structure, has been developed. A methodology based on the E-N approach was applied for the analysis of the repeated load of the wing hinge connection, including finite element method (FEM) fatigue strength analysis. The results proved that composites, in comparison with metal structures, offer a significant weight reduction of the small aircraft construction, whereas the required strength, stability and lifetime of the components are retained.
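
    The E-N (strain-life) approach mentioned above relates a local strain amplitude to the number of reversals through the Coffin-Manson-Basquin equation, eps_a = (sigma_f'/E)(2N)^b + eps_f'(2N)^c, and accumulates damage over the load spectrum with Miner's rule. The sketch below uses illustrative material constants, not the hinge-connection or microwire data from the article.

      from scipy.optimize import brentq

      # Illustrative strain-life constants (not the article's material data)
      E, sf, b, ef, c = 72000.0, 900.0, -0.09, 0.35, -0.55   # E and sigma_f' in MPa

      def strain_amplitude(two_n):
          return (sf / E) * two_n ** b + ef * two_n ** c

      def cycles_to_failure(eps_a):
          """Solve the Coffin-Manson-Basquin equation for reversals 2N, return cycles N."""
          two_n = brentq(lambda x: strain_amplitude(x) - eps_a, 1.0, 1e9)
          return two_n / 2.0

      # Miner's rule for a simple repeated-load block: (strain amplitude, applied cycles)
      spectrum = [(0.004, 2000), (0.002, 50000)]
      damage = sum(n / cycles_to_failure(eps) for eps, n in spectrum)
      print(f"Accumulated damage per block: {damage:.3f}")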

  10. 3-Dimensional Methodology for the Control Rod Ejection Accident Analysis Using UNICORNTM

    International Nuclear Information System (INIS)

    Jang, Chan-su; Um, Kil-sup; Ahn, Dawk-hwan; Kim, Yo-han; Sung, Chang-kyung; Song, Jae-seung

    2006-01-01

    The control rod ejection accident has been analyzed with STRIKIN-II code using the point kinetics model coupled with conservative factors to address the three dimensional aspects. This may result in a severe transient with very high fuel enthalpy deposition. KNFC, under the support of KEPRI and KAERI, is developing 3-dimensional methodology for the rod ejection accident analysis using UNICORNTM (Unified Code of RETRAN, TORC and MASTER). For this purpose, 3-dimensional MASTER-TORC codes, which have been combined with the dynamic-link library by KAERI, are used in the transient analysis of the core and RETRAN code is used to estimate the enthalpy deposition in the hot rod

  11. Statistical trend analysis methodology for rare failures in changing technical systems

    International Nuclear Information System (INIS)

    Ott, K.O.; Hoffmann, H.J.

    1983-07-01

    A methodology for a statistical trend analysis (STA) in failure rates is presented. It applies primarily to relatively rare events in changing technologies or components. The formulation is more general and the assumptions are less restrictive than in a previously published version. Relations between the statistical trend analysis and probabilistic risk assessment (PRA) are discussed in terms of the categorization of decisions for action following particular failure events. The significance of tentatively identified trends is explored. In addition to statistical tests for trend significance, a combination of STA and PRA results quantifying the trend complement is proposed. The STA approach is compared with other concepts for trend characterization. (orig.)
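
    A hedged sketch of the kind of trend test involved (not the formulation used in the paper, whose assumptions differ) is a Poisson regression of yearly failure counts on calendar time: a significantly negative slope suggests a decreasing failure rate for the rare event. The data below are invented.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical yearly failure counts for a component population (rare events).
years = np.arange(1970, 1983)
failures = np.array([4, 3, 5, 2, 3, 1, 2, 1, 0, 1, 1, 0, 1])

# Poisson regression of counts on time: log(rate) = b0 + b1 * t
t = years - years[0]
X = sm.add_constant(t)
model = sm.GLM(failures, X, family=sm.families.Poisson()).fit()

print(model.params)    # b1 < 0 indicates a tentative downward trend
print(model.pvalues)   # significance of the tentatively identified trend
```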

  12. A rigorous methodology for development and uncertainty analysis of group contribution based property models

    DEFF Research Database (Denmark)

    Frutiger, Jerome; Abildskov, Jens; Sin, Gürkan

    ... weighted-least-square regression. 3) Initialization of the estimation by use of linear algebra, providing a first guess. 4) Sequential parameter and simultaneous GC parameter estimation using four different minimization algorithms. 5) Thorough uncertainty analysis: a) based on an asymptotic approximation of the parameter covariance matrix, b) based on the bootstrap method, providing 95%-confidence intervals of the parameters and of the predicted property. 6) Performance statistics analysis and model application. The application of the methodology is shown for a new GC model built to predict the lower flammability limit (LFL) for refrigerants ... their credibility and robustness in wider industrial and scientific applications.
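
    A minimal sketch of the bootstrap part of such an uncertainty analysis is given below, assuming a generic linear group-contribution model (property ≈ occurrence matrix times GC parameters). The data, model form, and omission of weighting are placeholders, not the LFL model of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: N compounds, M groups; A[i, j] = occurrences of group j in compound i.
N, M = 60, 5
A = rng.integers(0, 4, size=(N, M)).astype(float)
true_c = np.array([1.2, -0.4, 0.8, 0.1, 2.0])
y = A @ true_c + rng.normal(0.0, 0.3, size=N)   # measured property with noise

# Least-squares estimate of the GC parameters (weights omitted for brevity).
c_hat, *_ = np.linalg.lstsq(A, y, rcond=None)
residuals = y - A @ c_hat

# Residual bootstrap -> 95% confidence intervals of the parameters.
B = 2000
boot = np.empty((B, M))
for b in range(B):
    y_b = A @ c_hat + rng.choice(residuals, size=N, replace=True)
    boot[b], *_ = np.linalg.lstsq(A, y_b, rcond=None)

ci_low, ci_high = np.percentile(boot, [2.5, 97.5], axis=0)
print(np.column_stack([c_hat, ci_low, ci_high]))
```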

  13. Case-Crossover Analysis of Air Pollution Health Effects: A Systematic Review of Methodology and Application

    Science.gov (United States)

    Carracedo-Martínez, Eduardo; Taracido, Margarita; Tobias, Aurelio; Saez, Marc; Figueiras, Adolfo

    2010-01-01

    Background Case-crossover is one of the most used designs for analyzing the health-related effects of air pollution. Nevertheless, no one has reviewed its application and methodology in this context. Objective We conducted a systematic review of case-crossover (CCO) designs used to study the relationship between air pollution and morbidity and mortality, from the standpoint of methodology and application. Data sources and extraction A search was made of the MEDLINE and EMBASE databases. Reports were classified as methodologic or applied. From the latter, the following information was extracted: author, study location, year, type of population (general or patients), dependent variable(s), independent variable(s), type of CCO design, and whether effect modification was analyzed for variables at the individual level. Data synthesis The review covered 105 reports that fulfilled the inclusion criteria. Of these, 24 addressed methodological aspects, and the remainder involved the design’s application. In the methodological reports, the designs that yielded the best results in simulation were symmetric bidirectional CCO and time-stratified CCO. Furthermore, we observed an increase across time in the use of certain CCO designs, mainly symmetric bidirectional and time-stratified CCO. The dependent variables most frequently analyzed were those relating to hospital morbidity; the pollutants most often studied were those linked to particulate matter. Among the CCO-application reports, 13.6% studied effect modification for variables at the individual level. Conclusions The use of CCO designs has undergone considerable growth; the most widely used designs were those that yielded better results in simulation studies: symmetric bidirectional and time-stratified CCO. However, the advantages of CCO as a method of analysis of variables at the individual level are put to little use. PMID:20356818
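
    To make the time-stratified design named above concrete, a hedged sketch of referent selection follows: for each event (case) day, control days are the other days in the same year-month stratum that share the day of week. The data layout is hypothetical and the sketch stops short of the conditional logistic regression step.

```python
import pandas as pd

# Hypothetical daily calendar for one year.
dates = pd.date_range("2009-01-01", "2009-12-31", freq="D")
df = pd.DataFrame({"date": dates})
df["stratum"] = df["date"].dt.to_period("M").astype(str) + "_" + df["date"].dt.dayofweek.astype(str)

def referents(case_day: pd.Timestamp) -> pd.DataFrame:
    """Time-stratified referents: same month and same day of week, excluding the case day."""
    key = f"{case_day.to_period('M')}_{case_day.dayofweek}"
    sel = df[(df["stratum"] == key) & (df["date"] != case_day)]
    return sel[["date"]]

# Referent days for a case occurring on 15 July 2009.
print(referents(pd.Timestamp("2009-07-15")))
```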

  14. Ruling the Commons. Introducing a new methodology for the analysis of historical commons

    Directory of Open Access Journals (Sweden)

    Tine de Moor

    2016-10-01

    Full Text Available Despite significant progress in recent years, the evolution of commons over the long run remains an under-explored area within commons studies. In recent years, an international team of historians has worked under the umbrella of the Common Rules Project in order to design and test a new methodology aimed at advancing our knowledge on the dynamics of institutions for collective action – in particular commons. This project aims to contribute to the current debate on commons on three different fronts. Theoretically, it explicitly draws our attention to issues of change and adaptation in the commons – contrasting with more static analyses. Empirically, it highlights the value of historical records as a rich source of information for longitudinal analysis of the functioning of commons. Methodologically, it develops a systematic way of analyzing and comparing commons’ regulations across regions and time, setting a number of variables that have been defined on the basis of the “most common denominators” in commons regulation across countries and time periods. In this paper we introduce the project, describe our sources and methodology, and present the preliminary results of our analysis.

  15. Application of NASA Kennedy Space Center System Assurance Analysis methodology to nuclear power plant systems designs

    International Nuclear Information System (INIS)

    Page, D.W.

    1985-01-01

    In May of 1982, the Kennedy Space Center (KSC) entered into an agreement with the NRC to conduct a study to demonstrate the feasibility and practicality of applying the KSC System Assurance Analysis (SAA) methodology to nuclear power plant systems designs. North Carolina's Duke Power Company expressed an interest in the study and proposed the nuclear power facility at Catawba as the basis for the study. In joint meetings of KSC and Duke Power personnel, an agreement was made to select two Catawba systems, the Containment Spray System and the Residual Heat Removal System, for the analyses. Duke Power provided KSC with a full set of Final Safety Analysis Reports (FSAR) as well as schematics for the two systems. During Phase I of the study the reliability analyses of the SAA were performed. During Phase II the hazard analyses were performed. The final product of Phase II is a handbook for implementing the SAA methodology into nuclear power plant systems designs. The purpose of this paper is to describe the SAA methodology as it applies to nuclear power plant systems designs and to discuss the feasibility of its application. (orig./HP)

  16. Methodology for thermal-hydraulics analysis of pool type MTR fuel research reactors

    International Nuclear Information System (INIS)

    Umbehaun, Pedro Ernesto

    2000-01-01

    This work presents a methodology developed for the thermal-hydraulic analysis of pool type MTR fuel research reactors. For this methodology a computational program, FLOW, and a model, MTRCR-IEAR1, were developed. FLOW calculates the cooling flow distribution in the fuel elements, control elements, irradiators, and through the channels formed among the fuel elements and among the irradiators and reflectors. This computer program was validated against experimental data for the IEA-R1 research reactor core at IPEN-CNEN/SP. MTRCR-IEAR1 is a model based on the commercial program Engineering Equation Solver (EES). Besides the steady-state thermal-hydraulic analyses of the core accomplished by traditional computational programs like COBRA-3C/RERTR and PARET, this model makes it possible to analyze parallel channels with different cooling flow and/or geometry. Uncertainty factors of the variables from the neutronic and thermal-hydraulic calculations, and also from the fabrication of the fuel element, are introduced in the model. For steady-state analyses MTRCR-IEAR1 showed good agreement with the results of COBRA-3C/RERTR and PARET. The developed methodology was used for the calculation of the cooling flow distribution and the thermal-hydraulic analysis of a typical configuration of the IEA-R1 research reactor core. (author)

  17. Analysis of cold leg LOCA with failed HPSI by means of integrated safety assessment methodology

    International Nuclear Information System (INIS)

    Gonzalez-Cadelo, J.; Queral, C.; Montero-Mayorga, J.

    2014-01-01

    Highlights: • Results of ISA for the considered sequences endorse EOP guidance in an original way. • ISA allows obtaining accurate available times for accident management actions. • RCP-trip adequacy and the available time for beginning depressurization are evaluated. • ISA minimizes the need for expert judgment to perform safety assessment. - Abstract: The integrated safety assessment (ISA) methodology, developed by the Spanish Nuclear Safety Council (CSN), has been applied to a thermal–hydraulic analysis of cold leg LOCA sequences with an unavailable High Pressure Injection System in a Westinghouse 3-loop PWR. This analysis has been performed with the TRACE 5.0 patch 1 code. The ISA methodology allows obtaining the Damage Domain (the region of the space of parameters where a safety limit is exceeded) as a function of uncertain parameters (break area) and operator actuation times, and provides the analyst with useful information about the impact of these uncertain parameters on safety concerns. In this work two main issues have been analyzed: the effect of the reactor coolant pump trip and the available time for the beginning of secondary-side depressurization. The main conclusions are that the present Emergency Operating Procedures (EOPs) are adequate for managing this kind of sequence and that the ISA methodology is able to take into account time delays and parameter uncertainties.

  18. A SAS2H/KENO-V methodology for 3D fuel burnup analysis

    International Nuclear Information System (INIS)

    Milosevic, M.; Greenspan, E.; Vujic, J.

    2002-01-01

    An efficient methodology for the 3D fuel burnup analysis of LWR reactors is described in this paper. This methodology is founded on coupling a Monte Carlo method for the 3D calculation of the node power distribution with a transport method for the depletion calculation in a 1D Wigner-Seitz equivalent cell for each node independently. The proposed fuel burnup modeling, based on the application of the SCALE-4.4a control modules SAS2H and KENO-V.a, is verified for the case of a 2D x-y model of the IRIS 15 x 15 fuel assembly (with reflective boundary conditions) by using two well benchmarked code systems. One is MOCUP, a coupled MCNP-4C and ORIGEN2.1 utility code, and the second is the KENO-V.a/ORIGEN2.1 code system recently developed by the authors of this paper. The proposed SAS2H/KENO-V.a methodology was applied to the 3D burnup analysis of the IRIS-1000 benchmark core. Detailed k-eff and power density evolution with burnup are reported. (author)

  19. Improving process methodology for measuring plutonium burden in human urine using fission track analysis

    International Nuclear Information System (INIS)

    Krahenbuhl, M.P.; Slaughter, D.M.

    1998-01-01

    The aim of this paper is to clearly define the chemical and nuclear principles governing Fission Track Analysis (FTA) to determine environmental levels of 239Pu in urine. The paper also addresses deficiencies in FTA methodology and introduces improvements to make FTA a more reliable research tool. Our refined methodology, described herein, includes a chemically-induced precipitation phase, followed by anion exchange chromatography, and employs a chemical tracer, 236Pu. We have been able to establish an inverse correlation between Pu recovery and sample volume, and our data confirm that increases in sample volume do not result in higher accuracy or lower detection limits. We conclude that in subsequent studies, samples should be limited to approximately two liters. The Pu detection limit for a sample of this volume is 2.8 μBq/l. (author)

  20. [Methodological novelties applied to the anthropology of food: agent-based models and social networks analysis].

    Science.gov (United States)

    Díaz Córdova, Diego

    2016-01-01

    The aim of this article is to introduce two methodological strategies that have not often been utilized in the anthropology of food: agent-based models and social network analysis. In order to illustrate these methods in action, two cases based on materials typical of the anthropology of food are presented. For the first strategy, fieldwork carried out in Quebrada de Humahuaca (province of Jujuy, Argentina) regarding meal recall was used, and for the second, elements of the concept of "domestic consumption strategies" applied by Aguirre were employed. The underlying idea is that, given that eating is recognized as a "total social fact" and, therefore, as a complex phenomenon, the methodological approach must also be characterized by complexity. The greater the number of methods utilized (with the appropriate rigor), the better able we will be to understand the dynamics of feeding in the social environment.

  1. Performance analysis for disposal of mixed low-level waste. 1: Methodology

    International Nuclear Information System (INIS)

    Waters, R.D.; Gruebel, M.M.

    1999-01-01

    A simple methodology has been developed for evaluating the technical capabilities of potential sites for disposal of mixed low-level radioactive waste. The results of the evaluation are expressed as permissible radionuclide concentrations in disposed waste. The methodology includes an analysis of three separate pathways: (1) releases of radionuclides to groundwater; (2) releases of potentially volatile radionuclides to the atmosphere; and (3) the consequences of inadvertent intrusion into a disposal facility. For each radionuclide, its limiting permissible concentration in disposed waste is the lowest of the permissible concentrations determined from each of the three pathways. These permissible concentrations in waste at an evaluated site can be used to assess the capability of the site to dispose of waste streams containing multiple radionuclides
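
    The limiting-concentration logic described above is simple enough to state directly. The sketch below uses invented permissible concentrations and additionally applies a sum-of-fractions check for a waste stream containing several radionuclides; the latter is an assumption about how such limits are commonly combined, not a statement of the authors' exact procedure.

```python
# Hypothetical permissible concentrations (Bq/m3) per radionuclide and pathway.
permissible = {
    "Tc-99": {"groundwater": 4.0e6, "atmospheric": 1.0e9, "intruder": 2.0e7},
    "H-3":   {"groundwater": 9.0e8, "atmospheric": 3.0e7, "intruder": 5.0e9},
}

# Limiting permissible concentration = lowest value over the three pathways.
limits = {nuclide: min(paths.values()) for nuclide, paths in permissible.items()}
print(limits)

# Sum-of-fractions screening for a waste stream with given concentrations (Bq/m3).
waste = {"Tc-99": 1.0e6, "H-3": 1.0e7}
sum_of_fractions = sum(waste[n] / limits[n] for n in waste)
print("acceptable" if sum_of_fractions <= 1.0 else "not acceptable", sum_of_fractions)
```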

  2. Analysis of kyoto university reactor physics critical experiments using NCNSRC calculation methodology

    International Nuclear Information System (INIS)

    Amin, E.; Hathout, A.M.; Shouman, S.

    1997-01-01

    The Kyoto University reactor physics experiments on the university critical assembly are used to validate the NCNSRC calculation methodology. This methodology has two lines, diffusion and Monte Carlo. The diffusion line includes the code WIMSD4 for cell calculations and the two-dimensional diffusion code DIXY2 for core calculations. The transport line uses the MULTIKENO code, VAX version. Analysis is performed for the criticality and the temperature coefficients of reactivity (TCR) of the different light water moderated and reflected cores utilized in the experiments. The results for both the eigenvalue and the TCR approximately reproduced the experimental and theoretical Kyoto results. However, some conclusions are drawn about the adequacy of the standard WIMSD4 library. This paper is an extension of the NCNSRC efforts to assess and validate computer tools and methods for both the ET-RR-1 and ET-MMPR-2 research reactors. 7 figs., 1 tab

  3. A new methodology to study customer electrocardiogram using RFM analysis and clustering

    Directory of Open Access Journals (Sweden)

    Mohammad Reza Gholamian

    2011-04-01

    Full Text Available One of the primary issues in marketing planning is to know the customer's behavioral trends. A customer's purchasing interest may fluctuate for different reasons, and it is important to find the declining or increasing trends whenever they happen. It is important to study these fluctuations to improve customer relationships. There are different methods to increase the customer's willingness, such as planning good promotions, increasing advertisement, etc. This paper proposes a new methodology to measure customers' behavioral trends, called the customer electrocardiogram. The proposed model uses the K-means clustering method with RFM analysis to study customers' fluctuations over different time frames. We also apply the proposed electrocardiogram methodology to a real-world case study from the food industry, and the results are discussed in detail.
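
    A minimal sketch of the RFM-plus-clustering step is given below, assuming a transaction table with customer id, purchase date, and amount; the "electrocardiogram" view would repeat this clustering over successive time frames and track each customer's cluster membership. All data and the choice of k are placeholders.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical transaction log.
tx = pd.DataFrame({
    "customer": ["a", "a", "b", "c", "c", "c"],
    "date": pd.to_datetime(["2011-01-05", "2011-03-20", "2011-02-11",
                            "2011-01-02", "2011-02-15", "2011-03-28"]),
    "amount": [120.0, 80.0, 35.0, 200.0, 150.0, 90.0],
})

snapshot = pd.Timestamp("2011-04-01")
rfm = tx.groupby("customer").agg(
    recency=("date", lambda d: (snapshot - d.max()).days),
    frequency=("date", "count"),
    monetary=("amount", "sum"),
)

# K-means on standardized RFM features (k chosen arbitrarily for the sketch).
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(
    StandardScaler().fit_transform(rfm))
rfm["cluster"] = labels
print(rfm)
```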

  4. Methodology and application of surrogate plant PRA analysis to the Rancho Seco Power Plant: Final report

    International Nuclear Information System (INIS)

    Gore, B.F.; Huenefeld, J.C.

    1987-07-01

    This report presents the development and the first application of generic probabilistic risk assessment (PRA) information for identifying systems and components important to public risk at nuclear power plants lacking plant-specific PRAs. A methodology is presented for using the results of PRAs for similar (surrogate) plants, along with plant-specific information about the plant of interest and the surrogate plants, to infer important failure modes for systems of the plant of interest. This methodology, and the rationale on which it is based, is presented in the context of its application to the Rancho Seco plant. The Rancho Seco plant has been analyzed using PRA information from two surrogate plants. This analysis has been used to guide development of considerable plant-specific information about Rancho Seco systems and components important to minimizing public risk, which is also presented herein

  5. The Nuclear Organization and Management Analysis Concept methodology: Four years later

    International Nuclear Information System (INIS)

    Haber, S.B.; Shurberg, D.A.; Barriere, M.T.; Hall, R.E.

    1992-01-01

    The Nuclear Organization and Management Analysis Concept was first presented at the IEEE Human Factors meeting in Monterey in 1988. In the four years since that paper, the concept and its associated methodology have been demonstrated at two commercial nuclear power plants (NPP) and one fossil power plant. In addition, applications of some of the methods have been utilized in other types of organizations, and products are being developed from the insights obtained using the concept for various organization and management activities. This paper will focus on the insights and results obtained from the two demonstration studies at the commercial NPPs. The results emphasize the utility of the methodology and the comparability of the results from the two organizations.

  6. The GDOR model. A new methodology for the analysis of training needs: The case of Andorra

    Directory of Open Access Journals (Sweden)

    Marc Eguiguren Huerta

    2012-09-01

    Full Text Available Purpose: This article investigates the status and importance of continuing training in companies in the Principality of Andorra and its impact on the economic development of the country. Design/methodology/approach: The analysis is based on GDOR, a methodology developed by the authors and based on the impact of training decisions on economic indicators and ratios. By using GDOR, the authors explore and understand the current situation and the training needs of the main sectors in the Andorran economy. Findings: The findings incorporate a different view of lifelong learning training needs in Andorra, one much more closely related to the development needs of the country. Originality/value: With reference to best practice from different countries, particularly those in Europe, an original and new proposal to address those training needs is presented, including recommendations to the country's authorities on how to manage lifelong learning policies.

  7. Nonpoint source pollution of urban stormwater runoff: a methodology for source analysis.

    Science.gov (United States)

    Petrucci, Guido; Gromaire, Marie-Christine; Shorshani, Masoud Fallah; Chebbo, Ghassan

    2014-09-01

    The characterization and control of runoff pollution from nonpoint sources in urban areas are a major issue for the protection of aquatic environments. We propose a methodology to quantify the sources of pollutants in an urban catchment and to analyze the associated uncertainties. After describing the methodology, we illustrate it through an application to the sources of Cu, Pb, Zn, and polycyclic aromatic hydrocarbons (PAH) from a residential catchment (228 ha) in the Paris region. In this application, we suggest several procedures that can be applied for the analysis of other pollutants in different catchments, including an estimation of the total extent of roof accessories (gutters and downspouts, watertight joints and valleys) in a catchment. These accessories turn out to be the major source of Pb and an important source of Zn in the example catchment, while activity-related sources (traffic, heating) are dominant for Cu (brake pad wear) and PAH (tire wear, atmospheric deposition).

  8. A methodology for automated CPA extraction using liver biopsy image analysis and machine learning techniques.

    Science.gov (United States)

    Tsipouras, Markos G; Giannakeas, Nikolaos; Tzallas, Alexandros T; Tsianou, Zoe E; Manousou, Pinelopi; Hall, Andrew; Tsoulos, Ioannis; Tsianos, Epameinondas

    2017-03-01

    Collagen proportional area (CPA) extraction in liver biopsy images provides the degree of fibrosis expansion in liver tissue, which is the most characteristic histological alteration in hepatitis C virus (HCV) infection. Assessment of the fibrotic tissue is currently based on semiquantitative staging scores such as Ishak and Metavir. Since its introduction as a fibrotic tissue assessment technique, CPA calculation based on image analysis techniques has proven to be more accurate than semiquantitative scores. However, CPA has yet to reach everyday clinical practice, since the lack of standardized and robust methods for computerized image analysis for CPA assessment has proven to be a major limitation. The current work introduces a three-stage fully automated methodology for CPA extraction based on machine learning techniques. Specifically, clustering algorithms have been employed for background-tissue separation, as well as for fibrosis detection in liver tissue regions, in the first and the third stage of the methodology, respectively. Due to the existence of several types of tissue regions in the image (such as blood clots, muscle tissue, structural collagen, etc.), classification algorithms have been employed to identify liver tissue regions and exclude all other non-liver tissue regions from CPA computation. For the evaluation of the methodology, 79 liver biopsy images have been employed, obtaining a 1.31% mean absolute CPA error, with a 0.923 concordance correlation coefficient. The proposed methodology is designed to (i) avoid manual threshold-based and region selection processes, widely used in similar approaches presented in the literature, and (ii) minimize CPA calculation time. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
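
    As a rough illustration of the clustering stages (background/tissue separation, then fibrosis detection within tissue), the sketch below applies k-means to the pixel colors of a biopsy image. It is a schematic stand-in, not the authors' pipeline, which additionally uses a classification stage to exclude non-liver tissue regions.

```python
import numpy as np
from sklearn.cluster import KMeans

def cpa_from_image(rgb: np.ndarray) -> float:
    """Estimate a collagen proportional area from an RGB image array (H x W x 3)."""
    pixels = rgb.reshape(-1, 3).astype(float)

    # Stage 1: separate background (bright) from tissue using 2-cluster k-means.
    km1 = KMeans(n_clusters=2, n_init=10, random_state=0).fit(pixels)
    background = np.argmax(km1.cluster_centers_.sum(axis=1))   # brightest cluster
    tissue = pixels[km1.labels_ != background]

    # Stage 3: within tissue, separate collagen-stained (reddish) pixels from the rest.
    km2 = KMeans(n_clusters=2, n_init=10, random_state=0).fit(tissue)
    redness = km2.cluster_centers_[:, 0] - km2.cluster_centers_[:, 1]
    collagen = np.argmax(redness)

    return float(np.mean(km2.labels_ == collagen))   # CPA = collagen pixels / tissue pixels

# Usage with a random placeholder image standing in for a scanned biopsy section.
print(cpa_from_image(np.random.randint(0, 255, size=(64, 64, 3))))
```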

  9. Review of Recent Methodological Developments in Group-Randomized Trials: Part 2-Analysis.

    Science.gov (United States)

    Turner, Elizabeth L; Prague, Melanie; Gallis, John A; Li, Fan; Murray, David M

    2017-07-01

    In 2004, Murray et al. reviewed methodological developments in the design and analysis of group-randomized trials (GRTs). We have updated that review with developments in analysis of the past 13 years, with a companion article to focus on developments in design. We discuss developments in the topics of the earlier review (e.g., methods for parallel-arm GRTs, individually randomized group-treatment trials, and missing data) and in new topics, including methods to account for multiple-level clustering and alternative estimation methods (e.g., augmented generalized estimating equations, targeted maximum likelihood, and quadratic inference functions). In addition, we describe developments in analysis of alternative group designs (including stepped-wedge GRTs, network-randomized trials, and pseudocluster randomized trials), which require clustering to be accounted for in their design and analysis.

  10. Work Domain Analysis Methodology for Development of Operational Concepts for Advanced Reactors

    Energy Technology Data Exchange (ETDEWEB)

    Hugo, Jacques [Idaho National Lab. (INL), Idaho Falls, ID (United States)

    2015-05-01

    This report describes a methodology to conduct a Work Domain Analysis in preparation for the development of operational concepts for new plants. This method has been adapted from the classical method described in the literature in order to better deal with the uncertainty and incomplete information typical of first-of-a-kind designs. The report outlines the strategy for undertaking a Work Domain Analysis of a new nuclear power plant and the methods to be used in the development of the various phases of the analysis. Basic principles are described to the extent necessary to explain why and how the classical method was adapted to make it suitable as a tool for the preparation of operational concepts for a new nuclear power plant. Practical examples are provided of the systematic application of the method and the various presentation formats in the operational analysis of advanced reactors.

  11. Modeling and analysis of power processing systems: Feasibility investigation and formulation of a methodology

    Science.gov (United States)

    Biess, J. J.; Yu, Y.; Middlebrook, R. D.; Schoenfeld, A. D.

    1974-01-01

    A review is given of future power processing systems planned for the next 20 years, and the state-of-the-art of power processing design modeling and analysis techniques used to optimize power processing systems. A methodology of modeling and analysis of power processing equipment and systems has been formulated to fulfill future tradeoff studies and optimization requirements. Computer techniques were applied to simulate power processor performance and to optimize the design of power processing equipment. A program plan to systematically develop and apply the tools for power processing systems modeling and analysis is presented so that meaningful results can be obtained each year to aid the power processing system engineer and power processing equipment circuit designers in their conceptual and detail design and analysis tasks.

  12. Methodology for thermal hydraulic conceptual design and performance analysis of KALIMER core

    International Nuclear Information System (INIS)

    Young-Gyun Kim; Won-Seok Kim; Young-Jin Kim; Chang-Kue Park

    2000-01-01

    This paper summarizes the methodology for the thermal-hydraulic conceptual design and performance analysis used for the KALIMER core, in particular the preliminary methodology for flow grouping and peak pin temperature calculation. The major technical results of the conceptual design for the KALIMER 98.03 core are shown and compared with those of the KALIMER 97.07 design core. The KALIMER 98.03 design core proved to be better optimized than the 97.07 design core. The number of flow groups is reduced from 16 to 11, and the equalized peak cladding midwall temperature from 654 deg. C to 628 deg. C. This was achieved through the nuclear and thermal-hydraulic design optimization study, i.e. core power flattening and an increase of the radial blanket power fraction. The coolant flow distribution to the assemblies and the core coolant/component temperatures should be determined in the core thermal-hydraulic analysis. Sodium flow is distributed to the core assemblies with the overall goal of equalizing the peak cladding midwall temperatures for the peak temperature pin of each bundle, and thus pin cladding damage accumulation and pin reliability. The flow grouping and the peak pin temperature calculation for the preliminary conceptual design are performed with the modules ORFCE-F60 and ORFCE-T60, respectively. The basic subchannel analysis will be performed with the SLTHEN code, and the detailed subchannel analysis will be done with the MATRA-LMR code, which is under development for the K-Core system. This methodology proved practical for the KALIMER core thermal-hydraulic design in the related benchmark calculation studies, and it is used for the KALIMER core thermal-hydraulic conceptual design. (author)
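
    The flow-grouping idea (allocate assembly flow so that peak cladding midwall temperatures are equalized, then collapse the required flows into a limited number of orifice groups) can be sketched very roughly as below. The powers, temperatures, simple energy-balance flow estimate, and quantile-based grouping rule are all invented stand-ins, not the ORFCE-F60 algorithm.

```python
import numpy as np

T_inlet, T_peak_target = 386.0, 628.0           # deg C, illustrative values only
cp = 1.27e3                                      # J/(kg K), approximate sodium heat capacity
assembly_power = np.linspace(2.0e6, 6.0e6, 54)   # W, hypothetical assembly powers

# Ideal flow per assembly from a crude energy balance to reach the same temperature rise.
ideal_flow = assembly_power / (cp * (T_peak_target - T_inlet))   # kg/s

# Collapse ideal flows into a fixed number of flow groups; each group gets its largest demand.
n_groups = 11
edges = np.quantile(ideal_flow, np.linspace(0.0, 1.0, n_groups + 1))
group = np.clip(np.searchsorted(edges, ideal_flow, side="right") - 1, 0, n_groups - 1)
group_flow = np.array([ideal_flow[group == g].max() for g in range(n_groups)])

print(group_flow)          # orificed flow assigned to each group
print(np.bincount(group))  # number of assemblies per group
```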

  13. Methodological tools for the collection and analysis of participant observation data using grounded theory.

    Science.gov (United States)

    Laitinen, Heleena; Kaunonen, Marja; Astedt-Kurki, Päivi

    2014-11-01

    To give clarity to the analysis of participant observation in nursing when implementing the grounded theory method. Participant observation (PO) is a method of collecting data that reveals the reality of daily life in a specific context. In grounded theory, interviews are the primary method of collecting data but PO gives a distinctive insight, revealing what people are really doing, instead of what they say they are doing. However, more focus is needed on the analysis of PO. An observational study carried out to gain awareness of nursing care and its electronic documentation in four acute care wards in hospitals in Finland. Discussion of using the grounded theory method and PO as a data collection tool. The following methodological tools are discussed: an observational protocol, jotting of notes, microanalysis, the use of questioning, constant comparison, and writing and illustrating. Each tool has specific significance in collecting and analysing data, working in constant interaction. Grounded theory and participant observation supplied rich data and revealed the complexity of the daily reality of acute care. In this study, the methodological tools provided a base for the study at the research sites and outside. The process as a whole was challenging. It was time-consuming and it required rigorous and simultaneous data collection and analysis, including reflective writing. Using these methodological tools helped the researcher stay focused from data collection and analysis to building theory. Using PO as a data collection method in qualitative nursing research provides insights. It is not commonly discussed in nursing research and therefore this study can provide insight, which cannot be seen or revealed by using other data collection methods. Therefore, this paper can produce a useful tool for those who intend to use PO and grounded theory in their nursing research.

  14. METHODOLOGICAL ANALYSIS OF STUDYING THE PROBLEM OF PERCEPTION IN FUTURE MUSIC TEACHERS’ PROFESSIONAL TRAINING

    Directory of Open Access Journals (Sweden)

    Zhang Bo

    2017-04-01

    Full Text Available The article presents a methodological analysis of the problem of perception in future music teachers' professional training. The author analyses the works of outstanding scholars in philosophy, psychology, and art education. The hierarchical system of options for musical perception is revealed. The methodological foundation is supported by consideration of modern research in the specialty, the theory and methodology of musical study, which gives proper shape and thoroughness to the presented material. Studying vocal and choral research in the field of forming future music teachers' valued perception of music art, the author sets the aim of presenting a methodological analysis of the problem of perception in future music teachers' professional training. Applying the system approach to the problem of forming the valued perception of music art by future music teachers, while they are being trained for vocal and choral work with senior pupils, extends their artistic awareness; it contributes to distinguishing art works and phenomena, to seeing their properties, and to providing orientation in the informative content of works of music art. Special attention is paid to revealing the methodological principles of research on the category of perception in the aspect of the valued understanding of images in works of music art. As a result of analysing scientific sources on the issue of voice production, the author finds that perception is closely related to the transformation of external information, conditioning the formation of images and engaging attention, memory, thinking, and emotions. The features of perception in vocal and choral studies and their extrapolation by students to future professional activity with senior pupils are analysed in the aspects of perception and transformation of musical and intonation information, analysis, object perception, and interpretation in accordance with future

  15. Applying rigorous decision analysis methodology to optimization of a tertiary recovery project

    International Nuclear Information System (INIS)

    Wackowski, R.K.; Stevens, C.E.; Masoner, L.O.; Attanucci, V.; Larson, J.L.; Aslesen, K.S.

    1992-01-01

    This paper reports that the intent of this study was to rigorously look at all of the possible expansion, investment, operational, and CO2 purchase/recompression scenarios (over 2500) to yield a strategy that would maximize the net present value of the CO2 project at the Rangely Weber Sand Unit. Traditional methods of project management, which involve analyzing large numbers of single-case economic evaluations, were found to be too cumbersome and inaccurate for an analysis of this scope. The decision analysis methodology utilized a statistical approach which resulted in a range of economic outcomes. Advantages of the decision analysis methodology included: a more organized approach to the classification of decisions and uncertainties; a clear sensitivity method to identify the key uncertainties; an application of probabilistic analysis through the decision tree; and a comprehensive display of the range of possible outcomes for communication to decision makers. This range made it possible to consider the upside and downside potential of the options and to weigh these against the Unit's strategies. Savings in time and manpower required to complete the study were also realized.

  16. Source apportionment and sensitivity analysis: two methodologies with two different purposes

    Science.gov (United States)

    Clappier, Alain; Belis, Claudio A.; Pernigotti, Denise; Thunis, Philippe

    2017-11-01

    This work reviews the existing methodologies for source apportionment and sensitivity analysis to identify key differences and stress their implicit limitations. The emphasis is laid on the differences between source impacts (sensitivity analysis) and contributions (source apportionment) obtained by using four different methodologies: brute-force top-down, brute-force bottom-up, tagged species, and the decoupled direct method (DDM). A simple theoretical example to compare these approaches is used, highlighting differences and potential implications for policy. When the relationships between concentrations and emissions are linear, impacts and contributions are equivalent concepts. In this case, source apportionment and sensitivity analysis may be used indifferently for both air quality planning purposes and quantifying source contributions. However, this study demonstrates that when the relationship between emissions and concentrations is nonlinear, sensitivity approaches are not suitable to retrieve source contributions and source apportionment methods are not appropriate to evaluate the impact of abatement strategies. A quantification of the potential nonlinearities should therefore be the first step prior to source apportionment or planning applications, to prevent any limitations in their use. When nonlinearity is mild, these limitations may, however, be acceptable in the context of the other uncertainties inherent to complex models. Moreover, when using sensitivity analysis for planning, it is important to note that, under nonlinear circumstances, the calculated impacts will only provide information for the exact conditions (e.g. emission reduction share) that are simulated.
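
    The distinction drawn above can be reproduced with a toy nonlinear model; the numbers are arbitrary and only serve to show that, when concentration responds nonlinearly to emissions, brute-force impacts from switching sources off no longer add up to the total and so cannot be read as source contributions.

```python
def concentration(e1: float, e2: float) -> float:
    """Toy nonlinear model: two emission sources with an interaction term."""
    return 2.0 * e1 + 1.0 * e2 + 0.5 * e1 * e2

base = concentration(1.0, 1.0)                      # 3.5

# Brute-force "impacts": concentration drop when each source is switched off in turn.
impact_1 = base - concentration(0.0, 1.0)           # 2.5
impact_2 = base - concentration(1.0, 0.0)           # 1.5

print(base, impact_1, impact_2, impact_1 + impact_2)
# The impacts sum to 4.0, which exceeds the total concentration of 3.5: with a
# nonlinear response they cannot be reinterpreted as source contributions, which
# is exactly the caveat discussed in the abstract.
```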

  17. Probabilistic Safety Assessment: An Effective Tool to Support “Systemic Approach” to Nuclear Safety and Analysis of Human and Organizational Aspects

    International Nuclear Information System (INIS)

    Kuzmina, I.

    2016-01-01

    The Probabilistic Safety Assessment (PSA) represents a comprehensive conceptual and analytical tool for the quantitative evaluation of the risk of undesirable consequences from nuclear facilities and for drawing qualitative insights for nuclear safety. PSA considers various technical, human, and organizational factors in an integral manner, thus explicitly pursuing a true ‘systemic approach’ to safety and enabling holistic insights for further safety improvement. Human Reliability Analysis (HRA) is one of the major tasks within PSA. The poster paper provides an overview of the objectives and scope of PSA and HRA and discusses further needs in the area of HRA. (author)

  18. A study on the operator's errors of commission (EOC) in accident scenarios of nuclear power plants: methodology development and application

    International Nuclear Information System (INIS)

    Kim, Jae Whan; Jung, Won Dea; Park, Jin Kyun; Kang, Da Il

    2003-04-01

    As concern has been raised over the operator's inappropriate interventions, the so-called errors of commission (EOCs), which can exacerbate plant safety, interest in the identification and analysis of EOC events from the risk assessment perspective has increased. Also, one of the items in need of improvement in conventional PSA and HRA, which consider only the system-demanding human actions, is the inclusion of the operator's EOC events into the PSA model. In this study, we propose a methodology for identifying and analysing human errors of commission that might occur from failures in situation assessment and decision making during accident progression given an initiating event. In order to achieve this goal, the following research items have been performed: Firstly, we analysed the error causes or situations that contributed to the occurrence of EOCs in several incidents/accidents at nuclear power plants. Secondly, the limitations of the advanced HRAs in treating EOCs were reviewed, and a requirement for a new methodology for analysing EOCs was established. Thirdly, based on these accomplishments, a methodology for identifying and analysing EOC events inducible from failures in situation assessment and decision making was proposed and applied to all the accident sequences of the YGN 3 and 4 NPP, which resulted in the identification of about 10 EOC situations.

  19. A new methodology for fault detection in rolling element bearings using singular spectrum analysis

    Directory of Open Access Journals (Sweden)

    Bugharbee Hussein Al

    2018-01-01

    Full Text Available This paper proposes a vibration-based methodology for fault detection in rolling element bearings, which is based on pure data analysis via the singular spectrum method. The method suggests building a baseline space from feature vectors made of the signals measured in the healthy/baseline bearing condition. The feature vectors are made using the Euclidean norms of the first three PCs found for the signals measured. Then, the lagged version of any new signal corresponding to a new (possibly faulty) condition is projected onto this baseline feature space in order to assess its similarity to the baseline condition. The category of a new signal vector is determined based on the Mahalanobis distance (MD) of its feature vector to the baseline space. A validation of the methodology is suggested based on the results from an experimental test rig. The results obtained confirm the effective performance of the suggested methodology. It is made of simple steps and is easy to apply, with the prospect of making it automatic and suitable for commercial applications.
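
    A compact sketch of the detection logic is given below, assuming vibration signals are available as 1-D arrays: lagged trajectory matrices are decomposed by SVD (the core operation of singular spectrum analysis), the norms of the first three principal components form the feature vector, and new signals are scored by Mahalanobis distance to the baseline feature cloud. Window length, thresholds, and data are placeholders, not the paper's settings.

```python
import numpy as np

def features(signal: np.ndarray, window: int = 50) -> np.ndarray:
    """Euclidean norms of the first three principal components of the lagged trajectory matrix."""
    n = len(signal) - window + 1
    traj = np.stack([signal[i:i + window] for i in range(n)])       # trajectory matrix
    u, s, _ = np.linalg.svd(traj - traj.mean(axis=0), full_matrices=False)
    pcs = u[:, :3] * s[:3]                                           # first three PCs
    return np.linalg.norm(pcs, axis=0)

def mahalanobis(x: np.ndarray, baseline: np.ndarray) -> float:
    mu = baseline.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(baseline, rowvar=False))
    d = x - mu
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(1)
baseline = np.array([features(rng.normal(size=2000)) for _ in range(20)])   # healthy condition

new_signal = rng.normal(size=2000) + 0.5 * np.sin(np.arange(2000))          # possibly faulty
print(mahalanobis(features(new_signal), baseline))   # large distance -> suspect a fault
```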

  20. Application of fault tree methodology in the risk analysis of complex systems

    International Nuclear Information System (INIS)

    Vasconcelos, V. de.

    1984-01-01

    This study intends to describe the fault tree methodology and apply it to the risk assessment of complex facilities. In the description of the methodology, an attempt has been made to provide all the pertinent basic information, pointing out its more important aspects such as fault tree construction, evaluation techniques, and their use in the risk and reliability assessment of a system. In view of their importance, topics like common mode failures, human errors, data bases used in the calculations, and uncertainty evaluation of the results will be discussed separately, each one in a chapter. For the purpose of applying the methodology, it was necessary to implement computer codes normally used for this kind of analysis. The computer codes PREP, KITT and SAMPLE, written in FORTRAN IV, were chosen due to their availability and to the fact that they have been used in important studies of the nuclear area, like WASH-1400. With these codes, the probability of occurrence of excessive pressure in the main system of the component test loop (CTC) of CDTN was evaluated. (Author)
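
    A small numerical sketch of the evaluation step is shown below: given minimal cut sets and basic-event probabilities (all values invented), the top-event probability is approximated with the rare-event and min-cut upper-bound expressions commonly reported by fault tree codes such as the ones named above.

```python
from math import prod

# Hypothetical basic-event failure probabilities.
p = {"valve_fails": 1.0e-3, "pump_fails": 5.0e-4,
     "operator_error": 2.0e-2, "sensor_fails": 1.0e-3}

# Hypothetical minimal cut sets of the top event "excessive pressure".
cut_sets = [
    {"valve_fails", "operator_error"},
    {"pump_fails", "sensor_fails"},
    {"valve_fails", "pump_fails"},
]

q = [prod(p[e] for e in cs) for cs in cut_sets]        # cut-set probabilities

rare_event = sum(q)                                    # rare-event approximation
min_cut_ub = 1.0 - prod(1.0 - qi for qi in q)          # min-cut upper bound

print(rare_event, min_cut_ub)
```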

  1. Fusion integral experiments and analysis and the determination of design safety factors - I: Methodology

    International Nuclear Information System (INIS)

    Youssef, M.Z.; Kumar, A.; Abdou, M.A.; Oyama, Y.; Maekawa, H.

    1995-01-01

    The role of neutronics experimentation and analysis in fusion neutronics research and development programs is discussed. A new methodology was developed to arrive at estimates of design safety factors based on the experimental and analytical results from design-oriented integral experiments. In this methodology, and for a particular nuclear response, R, a normalized density function (NDF) is constructed from the prediction uncertainties, and their associated standard deviations, as found in the various integral experiments where that response, R, is measured. Important statistical parameters are derived from the NDF, such as the global mean prediction uncertainty and the possible spread around it. The method of deriving safety factors from many possible NDFs based on various calculational and measuring methods (among other variants) is also described. Associated with each safety factor is a confidence level, which designers may choose to have, that the calculated response, R, will not exceed (or will not fall below) the actual measured value. An illustrative example is given on how to construct the NDFs. The methodology is applied in two areas, namely the line-integrated tritium production rate and bulk shielding integral experiments. Conditions under which these factors could be derived and the validity of the method are discussed. 72 refs., 17 figs., 4 tabs
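
    A hedged numerical sketch of the flavor of such a construction: calculated-to-experimental (C/E) ratios from several integral experiments are pooled into an empirical distribution of prediction uncertainties, and a multiplicative safety factor for a chosen confidence level is read from a percentile of that distribution. The C/E values are invented and the percentile rule is only one plausible variant, not the paper's exact prescription.

```python
import numpy as np

# Hypothetical C/E ratios for one response (e.g. tritium production rate) from several experiments.
c_over_e = np.array([0.93, 0.97, 1.02, 0.95, 1.05, 0.91, 0.99, 1.01, 0.96, 0.94])

mean_bias = c_over_e.mean()          # global mean prediction uncertainty
spread = c_over_e.std(ddof=1)        # spread around it

# Safety factor so that, at roughly 95% confidence, the design value is not under-predicted:
# divide the calculated response by the 5th percentile of C/E.
safety_factor = 1.0 / np.percentile(c_over_e, 5)
print(mean_bias, spread, safety_factor)
```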

  2. Methodology, Measurement and Analysis of Flow Table Update Characteristics in Hardware OpenFlow Switches

    KAUST Repository

    Kuźniar, Maciej

    2018-02-15

    Software-Defined Networking (SDN) and OpenFlow are actively being standardized and deployed. These deployments rely on switches that come from various vendors and differ in terms of performance and available features. Understanding these differences and performance characteristics is essential for ensuring successful and safe deployments. We propose a systematic methodology for SDN switch performance analysis and devise a series of experiments based on this methodology. The methodology relies on sending a stream of rule updates while both observing the control plane view as reported by the switch and probing the data plane state, and it determines switch characteristics by comparing these views. We measure, report and explain the performance characteristics of flow table updates in six hardware OpenFlow switches. Our results describing rule update rates can help SDN designers make their controllers efficient. Further, we also highlight differences between the OpenFlow specification and its implementations that, if ignored, pose a serious threat to network security and correctness.

  3. A methodology for the analysis of differential coexpression across the human lifespan.

    Science.gov (United States)

    Gillis, Jesse; Pavlidis, Paul

    2009-09-22

    Differential coexpression is a change in coexpression between genes that may reflect 'rewiring' of transcriptional networks. It has previously been hypothesized that such changes might be occurring over time in the lifespan of an organism. While both coexpression and differential expression of genes have been previously studied in life stage change or aging, differential coexpression has not. Generalizing differential coexpression analysis to many time points presents a methodological challenge. Here we introduce a method for analyzing changes in coexpression across multiple ordered groups (e.g., over time) and extensively test its validity and usefulness. Our method is based on the use of the Haar basis set to efficiently represent changes in coexpression at multiple time scales, and thus represents a principled and generalizable extension of the idea of differential coexpression to life stage data. We used published microarray studies categorized by age to test the methodology. We validated the methodology by testing our ability to reconstruct Gene Ontology (GO) categories using our measure of differential coexpression and compared this result to using coexpression alone. Our method allows significant improvement in characterizing these groups of genes. Further, we examine the statistical properties of our measure of differential coexpression and establish that the results are significant both statistically and by an improvement in semantic similarity. In addition, we found that our method finds more significant changes in gene relationships compared to several other methods of expressing temporal relationships between genes, such as coexpression over time. Differential coexpression over age generates significant and biologically relevant information about the genes producing it. Our Haar basis methodology for determining age-related differential coexpression performs better than other tested methods. The Haar basis set also lends itself to ready interpretation
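
    A compressed sketch of the idea is given below, under simplifying assumptions (the published method works on ranked coexpression and a full Haar basis, neither of which is reproduced here): compute a gene-pair correlation within each ordered age group, then project the resulting sequence onto Haar-style step contrasts, whose coefficients capture changes in coexpression at different time scales. All data are simulated.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical expression data: 8 ordered age groups, 30 samples each, for two genes
# whose coexpression strengthens with age.
n_groups, n_samples = 8, 30
corr = np.empty(n_groups)
for g in range(n_groups):
    shared = rng.normal(size=n_samples)
    weight = g / (n_groups - 1)
    gene1 = shared + rng.normal(scale=1.0, size=n_samples)
    gene2 = weight * shared + rng.normal(scale=1.0, size=n_samples)
    corr[g] = np.corrcoef(gene1, gene2)[0, 1]

def haar_coefficients(x: np.ndarray) -> np.ndarray:
    """Unnormalized Haar-style contrasts: halves, quarters, ... of the ordered series."""
    coeffs = []
    length = len(x)
    scale = length
    while scale >= 2:
        for start in range(0, length, scale):
            block = x[start:start + scale]
            half = scale // 2
            coeffs.append(block[half:].mean() - block[:half].mean())
        scale //= 2
    return np.array(coeffs)

print(corr)
print(haar_coefficients(corr))   # a large first coefficient flags a coarse-scale change in coexpression
```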

  4. Hra pro Android OS

    OpenAIRE

    Salvet, Lukáš

    2017-01-01

    This thesis deals with the creation of a 2D game for devices running the Android operating system. It describes the possibilities for developing games in this environment, in particular the use of the OpenGL ES 2.0 library, which was used for the implementation. The thesis also describes text rendering, the use of Google Play games services for achievements and a leaderboard, testing of the game, and so on. A beta version of the game was released on Google Play.

  5. Analysis of core damage frequency from internal events: Methodology guidelines: Volume 1

    International Nuclear Information System (INIS)

    Drouin, M.T.; Harper, F.T.; Camp, A.L.

    1987-09-01

    NUREG-1150 examines the risk to the public from a selected group of nuclear power plants. This report describes the methodology used to estimate the internal event core damage frequencies of four plants in support of NUREG-1150. In principle, this methodology is similar to methods used in past probabilistic risk assessments; however, based on past studies and using analysts that are experienced in these techniques, the analyses can be focused in certain areas. In this approach, only the most important systems and failure modes are modeled in detail. Further, the data and human reliability analyses are simplified, with emphasis on the most important components and human actions. Using these methods, an analysis can be completed in six to nine months using two to three full-time systems analysts and part-time personnel in other areas, such as data analysis and human reliability analysis. This is significantly faster and less costly than previous analyses and provides most of the insights that are obtained by the more costly studies. 82 refs., 35 figs., 27 tabs

  6. A Probabilistic Analysis Methodology and Its Application to A Spent Fuel Pool System

    Energy Technology Data Exchange (ETDEWEB)

    Kim, Hyowon; Jae, Moosung [Hanyang Univ., Seoul (Korea, Republic of); Ryu, Ho G. [Daedeok R and D Center, Daejeon (Korea, Republic of)

    2013-05-15

    A similar accident occurred at the 2nd unit of the PAKS nuclear power station in Hungary on the 10th of April 2003. Insufficient cooling of spent fuel caused the spent fuel to burn up or partly melt. Many previous studies have been performed for analyzing and measuring the risk of spent fuel damage. In the 1980s, there were changes in conditions, such as the development of high density storage racks and new information concerning the possibility of cladding fires in drained spent fuel pools. The US NRC assessed the spent fuel pool risk under Generic Issue 82. In the 1990s, under US NRC sponsorship, a risk assessment of the spent fuel pool at the Susquehanna Steam Electric Station (SSES) was performed, and the Analysis and Evaluation of Operational Data (AEOD) office was organized for accumulating reliability data. A methodology for assessing the risk associated with the spent fuel pool facility has been developed and is applied to the reference plant. It is shown that the methodology developed in this study might contribute to assessing these kinds of SFP facilities. In this probabilistic risk analysis, the LINV initiating event has the highest frequency of occurrence. The most dominant cut-sets include human errors. The result of this analysis might contribute to identifying the weaknesses of the preventive and mitigating systems in the SFP facility.

  7. Development of a methodology for radionuclide impurity analysis in radiopharmaceuticals using gamma spectrometry

    International Nuclear Information System (INIS)

    Paula, Eduardo Bonfim de; Araujo, Miriam Taina Ferreira de; Delgado, Jose Ubiratan; Poledna, Roberto; Lins, Ronaldo; Leiras, Anderson; Silva, Carlos Jose da; Oliveira, Antonio Eduardo de

    2016-01-01

    The LNMRI has sought to develop a methodology for the identification and accurate detection of gamma-emitting impurities at the metrological level, aiming to meet the recommendations not only of the international pharmacopoeias but also of CNEN and ANVISA, so that quality control can ensure that the doses patients receive from these practices are as low as feasible. As an initial target, it was possible to obtain an efficiency curve with an uncertainty of around 1%, which is necessary to initiate future measurements of interest applied to nuclear medicine and to start the development of the impurity analysis technique. (author)

  8. Sensitivity analysis of source driven subcritical systems by the HGPT methodology

    International Nuclear Information System (INIS)

    Gandini, A.

    1997-01-01

    The heuristically based generalized perturbation theory (HGPT) methodology has been extensively used in recent decades for analysis studies in the nuclear reactor field. Its use leads to fundamental reciprocity relationships from which perturbation, or sensitivity, expressions can be derived, to first and higher order, in terms of simple integration operations on quantities calculated at unperturbed system conditions. Its application to subcritical, source-driven systems, now considered with increasing interest in many laboratories for their potential use as nuclear waste burners and/or safer energy producers, is commented on here, with particular emphasis on problems involving an intensive system control variable. (author)

  9. Performance of neutron activation analysis in the evaluation of bismuth iodide purification methodology

    International Nuclear Information System (INIS)

    Armelin, Maria Jose A.; Ferraz, Caue de Mello; Hamada, Margarida M.

    2015-01-01

    Bismuth tri-iodide (BiI3) is an attractive material for use as a semiconductor. In this paper, BiI3 crystals have been grown by the vertical Bridgman technique using commercially available powder. The impurities were evaluated by instrumental neutron activation analysis (INAA). The results show that INAA is an analytical method appropriate for monitoring the impurities Ag, As, Br, Cr, K, Mo, Na and Sb in the various stages of the BiI3 purification methodology. (author)

  10. Warpage analysis on thin shell part using response surface methodology (RSM)

    Science.gov (United States)

    Zulhasif, Z.; Shayfull, Z.; Nasir, S. M.; Fathullah, M.; Hazwan, M. H. M.

    2017-09-01

    The optimisation of moulding parameters to reduce warpage defects is performed using Autodesk Moldflow Insight (AMI) 2012 software. The product is injection moulded using Acrylonitrile-Butadiene-Styrene (ABS) material. The analysis uses processing parameters that vary in melt temperature, mould temperature, packing pressure and packing time. Design of Experiments (DOE) has been integrated to obtain a polynomial model using Response Surface Methodology (RSM). The Glowworm Swarm Optimisation (GSO) method is then used to predict the best combination of parameters that minimises the warpage defect in order to produce high-quality parts.
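
    The response-surface step can be sketched as a quadratic polynomial fit of warpage to the four processing parameters. The design points and warpage values below are placeholders, and the optimizer is a plain grid search over the parameter window rather than the glowworm swarm algorithm used in the study.

```python
import numpy as np
from itertools import product
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Hypothetical DOE: melt temp (C), mould temp (C), packing pressure (MPa), packing time (s) -> warpage (mm)
X = np.array([[230, 50, 60, 8], [250, 50, 80, 10], [230, 70, 80, 8],
              [250, 70, 60, 10], [240, 60, 70, 9], [235, 55, 65, 9],
              [245, 65, 75, 10], [240, 60, 70, 8], [230, 70, 60, 10], [250, 50, 60, 8]])
y = np.array([0.42, 0.31, 0.35, 0.38, 0.30, 0.33, 0.29, 0.34, 0.40, 0.37])

# Second-order response surface fitted by least squares.
rsm = make_pipeline(PolynomialFeatures(degree=2), LinearRegression()).fit(X, y)

# Simple grid search over the parameter window for the minimum predicted warpage.
grid = np.array(list(product(np.linspace(230, 250, 5), np.linspace(50, 70, 5),
                             np.linspace(60, 80, 5), np.linspace(8, 10, 3))))
best = grid[np.argmin(rsm.predict(grid))]
print(best, rsm.predict([best])[0])
```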

  11. Methodological aspects of fuel performance system analysis at raw hydrocarbon processing plants

    Science.gov (United States)

    Kulbjakina, A. V.; Dolotovskij, I. V.

    2018-01-01

    The article discusses the methodological aspects of fuel performance system analysis at raw hydrocarbon (RH) processing plants. Modern RH processing facilities are major consumers of energy resources (ER) for their own needs. Reducing ER consumption, including fuel consumption, and developing a rational fuel system structure are complex and relevant scientific tasks that can only be accomplished using system analysis and complex system synthesis. In accordance with the principles of system analysis, the hierarchical structure of the fuel system, the block scheme for the synthesis of the most efficient fuel system alternative using mathematical models, and the set of performance criteria have been developed for the main stages of the study. Results from the introduction of specific engineering solutions for developing in-house energy supply sources at RH processing facilities are also provided.

  12. Adaptation of SW-846 methodology for the organic analysis of radioactive mixed wastes

    International Nuclear Information System (INIS)

    Griest, W.H.; Schenley, R.L.; Tomkins, B.A.; Caton, J.E. Jr.; Fleming, G.S.; Harmon, S.H.; Wachter, L.J.; Garcia, M.E.; Edwards, M.D.

    1990-01-01

    Modifications to SW-846 sample preparation methodology permit the organic analysis of radioactive mixed waste with minimum personal radiation exposure and equipment contamination. This paper describes modifications to SW-846 methods 5030 and 3510-3550 for sample preparation in radiation-zoned facilities (hood, glove box, and hot cell) and GC-MS analysis of the decontaminated organic extracts in a conventional laboratory for volatile and semivolatile organics by methods 8240 and 8270 (respectively). Results will be presented from the analysis of nearly 70 nuclear waste storage tank liquids and 17 sludges. Regulatory organics do not account for the organic matter suggested to be present by total organic carbon measurements. 7 refs., 5 tabs

  13. Quantitative Analysis of Uncertainty in Medical Reporting: Creating a Standardized and Objective Methodology.

    Science.gov (United States)

    Reiner, Bruce I

    2018-04-01

    Uncertainty in text-based medical reports has long been recognized as problematic, frequently resulting in misunderstanding and miscommunication. One strategy for addressing the negative clinical ramifications of report uncertainty would be the creation of a standardized methodology for characterizing and quantifying uncertainty language, which could provide both the report author and reader with context related to the perceived level of diagnostic confidence and accuracy. A number of computerized strategies could be employed in the creation of this analysis, including string search, natural language processing and understanding, histogram analysis, topic modeling, and machine learning. The derived uncertainty data offer the potential to objectively analyze report uncertainty in real time and to correlate it with outcomes analysis for the purpose of context- and user-specific decision support at the point of care, where intervention would have the greatest clinical impact.
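
    A trivially simple version of the string-search strategy named above can be sketched as follows. The lexicon of hedging phrases and the weights are invented for illustration; a production system would combine such counts with the NLP, topic-modeling, and machine-learning techniques listed in the abstract.

```python
import re
from collections import Counter

# Hypothetical lexicon of hedging phrases with crude uncertainty weights.
LEXICON = {"cannot be excluded": 3, "suspicious for": 2, "probably": 2,
           "may represent": 2, "likely": 1, "consistent with": 1}

def uncertainty_profile(report: str) -> tuple[Counter, int]:
    """Count hedging phrases and return a weighted uncertainty score for one report."""
    text = report.lower()
    counts = Counter()
    for phrase in LEXICON:
        counts[phrase] = len(re.findall(re.escape(phrase), text))
    score = sum(LEXICON[p] * n for p, n in counts.items())
    return counts, score

report = ("Findings are consistent with pneumonia. A small nodule may represent a "
          "granuloma; malignancy cannot be excluded. Follow-up is probably warranted.")
print(uncertainty_profile(report))
```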

  14. Methodology for reliability allocation based on fault tree analysis and dualistic contrast

    Institute of Scientific and Technical Information of China (English)

    TONG Lili; CAO Xuewu

    2008-01-01

    Reliability allocation is a difficult multi-objective optimization problem. This paper presents a methodology for reliability allocation that can be applied to determine the reliability characteristics of reactor systems or subsystems. The dualistic contrast, known as one of the most powerful tools for optimization problems, is applied to the reliability allocation model of a typical system in this article. The fault tree analysis, deemed to be one of the effective methods of reliability analysis, is also adopted. Thus a failure rate allocation model based on fault tree analysis and dualistic contrast is achieved. An application to the emergency diesel generator in a nuclear power plant is given to illustrate the proposed method.
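
    A hedged sketch of the basic allocation arithmetic behind such a method: for a top event modelled as an OR gate over independent, rare basic events, a target system failure rate can be apportioned to subsystems in proportion to weighting factors. The subsystems, weights and target value are illustrative, and the paper's dualistic-contrast optimisation is not reproduced.

    ```python
    # Allocate a target failure rate (per hour) of a top OR-gate event
    # to subsystems in proportion to illustrative weighting factors.
    target_rate = 1.0e-5  # assumed overall failure-rate goal for the diesel generator

    # Weights might reflect complexity, state of the art or criticality.
    weights = {"fuel supply": 0.4, "starting system": 0.3,
               "cooling": 0.2, "control": 0.1}

    total = sum(weights.values())
    allocated = {name: target_rate * w / total for name, w in weights.items()}

    for name, lam in allocated.items():
        print(f"{name:16s} allocated failure rate = {lam:.2e} /h")

    # For an OR gate over independent, rare events the rates add, so the
    # allocation meets the target by construction.
    assert abs(sum(allocated.values()) - target_rate) < 1e-12
    ```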

  15. Human reliability analysis as an evaluation tool of the emergency evacuation process on industrial installation

    International Nuclear Information System (INIS)

    Santos, Isaac J.A.L. dos; Grecco, Claudio H.S.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Oliveira, Mauro V.; Botelho, Felipe Mury

    2007-01-01

    Human reliability is the probability that a person correctly performs an activity required by the system within a required time period and performs no extraneous activity that could degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. The concept of human error must carry no connotation of guilt or punishment; it has to be treated as a natural consequence that emerges from the mismatch between human capacity and system demand. The majority of human errors are a consequence of the work situation and not of a lack of responsibility of the worker. The anticipation and control of potentially adverse impacts of human actions, or of interactions between humans and the system, are integral parts of process safety, in which the factors that influence human performance must be recognized and managed. The aim of this paper is to propose a methodology to evaluate the emergency evacuation process in industrial installations, including SLIM-MAUD, a first-generation HRA method, and using virtual reality and simulation software to build and simulate the chosen emergency scenes. (author)
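
    To make the SLIM half of SLIM-MAUD concrete, here is a minimal sketch of how a Success Likelihood Index (SLI) is formed from weighted performance shaping factor ratings and calibrated to a human error probability. The factor names, weights, ratings and calibration anchors are illustrative assumptions; the MAUD elicitation and the virtual-reality simulation proposed in the paper are not modelled here.

    ```python
    import math

    # Illustrative performance shaping factors for an evacuation task:
    # weight = relative importance, rating = quality on a 0-1 scale (1 = ideal).
    psfs = {
        "training":       {"weight": 0.30, "rating": 0.6},
        "time pressure":  {"weight": 0.25, "rating": 0.4},
        "signage/layout": {"weight": 0.25, "rating": 0.7},
        "communications": {"weight": 0.20, "rating": 0.5},
    }

    sli = sum(p["weight"] * p["rating"] for p in psfs.values())

    # SLIM calibration log10(HEP) = a*SLI + b, anchored on two tasks with
    # assumed known HEPs: 1e-4 at SLI = 1 (best case), 0.5 at SLI = 0 (worst case).
    a = math.log10(1e-4) - math.log10(0.5)
    b = math.log10(0.5)
    hep = 10 ** (a * sli + b)

    print(f"SLI = {sli:.2f}, estimated HEP = {hep:.3g}")
    ```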

  16. Human reliability analysis as an evaluation tool of the emergency evacuation process on industrial installation

    Energy Technology Data Exchange (ETDEWEB)

    Santos, Isaac J.A.L. dos; Grecco, Claudio H.S.; Mol, Antonio C.A.; Carvalho, Paulo V.R.; Oliveira, Mauro V.; Botelho, Felipe Mury [Instituto de Engenharia Nuclear (IEN/CNEN-RJ), Rio de Janeiro, RJ (Brazil)]. E-mail: luquetti@ien.gov.br; grecco@ien.gov.br; mol@ien.gov.br; paulov@ien.gov.br; mvitor@ien.gov.br; felipemury@superig.com.br

    2007-07-01

    Human reliability is the probability that a person correctly performs an activity required by the system within a required time period and performs no extraneous activity that could degrade the system. Human reliability analysis (HRA) is the analysis, prediction and evaluation of work-oriented human performance using indices such as human error likelihood and probability of task accomplishment. The concept of human error must carry no connotation of guilt or punishment; it has to be treated as a natural consequence that emerges from the mismatch between human capacity and system demand. The majority of human errors are a consequence of the work situation and not of a lack of responsibility of the worker. The anticipation and control of potentially adverse impacts of human actions, or of interactions between humans and the system, are integral parts of process safety, in which the factors that influence human performance must be recognized and managed. The aim of this paper is to propose a methodology to evaluate the emergency evacuation process in industrial installations, including SLIM-MAUD, a first-generation HRA method, and using virtual reality and simulation software to build and simulate the chosen emergency scenes. (author)

  17. Factors and competitiveness analysis in rare earth mining, new methodology: case study from Brazil

    Directory of Open Access Journals (Sweden)

    Gustavo A. Silva

    2018-03-01

    Full Text Available Rare earths are increasingly being applied in high-tech industries, such as green energy (e.g. wind power), hybrid cars, electric cars, permanent high-performance magnets, superconductors, luminophores and many other industrial sectors involved in modern technologies. Given that China dominates this market and imposes restrictions on production and exports whenever opportunities arise, it is becoming more and more challenging to develop business ventures in this sector. Several initiatives were taken to prospect new resources and develop the production chain, including the mining of these mineral assets around the world, but some factors of uncertainty, including current low prices, increased the challenge of transforming the current resources into deposits or productive mines. Thus, analyzing the competitiveness of advanced projects becomes indispensable. This work has the objective of introducing a new methodology of competitiveness analysis, in which some variables are considered as main factors that can strongly contribute to making a rare earth element (REE) mining enterprise unfeasible. With this methodology, which is quite practical and reproducible, it was possible to verify some real facts, such as the fact that the Lynas Mount Weld CLD (AUS) Project is resilient to the uncertainties of the RE sector, at the same time as the Molycorp Project is facing major financial difficulties (under judicial reorganization). It was also possible to verify that the Araxá Project of CBMM in Brazil is one of the most competitive in this country. Thus, we contribute to the existing literature, providing a new methodology for competitiveness analysis in rare earth mining. Keywords: Earth sciences, Business, Economics, Industry

  18. Factors and competitiveness analysis in rare earth mining, new methodology: case study from Brazil.

    Science.gov (United States)

    Silva, Gustavo A; Petter, Carlos O; Albuquerque, Nelson R

    2018-03-01

    Rare earths are increasingly being applied in high-tech industries, such as green energy (e.g. wind power), hybrid cars, electric cars, permanent high-performance magnets, superconductors, luminophores and many other industrial sectors involved in modern technologies. Given that China dominates this market and imposes restrictions on production and exports whenever opportunities arise, it is becoming more and more challenging to develop business ventures in this sector. Several initiatives were taken to prospect new resources and develop the production chain, including the mining of these mineral assets around the world, but some factors of uncertainty, including current low prices, increased the challenge of transforming the current resources into deposits or productive mines. Thus, analyzing the competitiveness of advanced projects becomes indispensable. This work has the objective of introducing a new methodology of competitiveness analysis, in which some variables are considered as main factors that can strongly contribute to making a rare earth element (REE) mining enterprise unfeasible. With this methodology, which is quite practical and reproducible, it was possible to verify some real facts, such as the fact that the Lynas Mount Weld CLD (AUS) Project is resilient to the uncertainties of the RE sector, at the same time as the Molycorp Project is facing major financial difficulties (under judicial reorganization). It was also possible to verify that the Araxá Project of CBMM in Brazil is one of the most competitive in this country. Thus, we contribute to the existing literature, providing a new methodology for competitiveness analysis in rare earth mining.

  19. The API methodology for risk-based inspection (RBI) analysis for the petroleum and petrochemical industry

    International Nuclear Information System (INIS)

    Reynolds, J.T.

    1998-01-01

    Twenty-one petroleum and petrochemical companies are currently sponsoring a project within the American Petroleum Institute (API) to develop risk-based inspection (RBI) methodology for application in the refining and petrochemical industry. This paper describes that particular RBI methodology and provides a summary of the three levels of RBI analysis developed by the project. Also included is a review of the first pilot project to validate the methodology by applying RBI to several existing refining units. The failure of pressure equipment in a process unit can have several undesirable effects. For the purpose of RBI analysis, the API RBI program categorizes these effects into four basic risk outcomes: flammable events, toxic releases, major environmental damage, and business interruption losses. API RBI is a strategic process, both qualitative and quantitative, for understanding and reducing these risks associated with operating pressure equipment. This paper will show how API RBI assesses the potential consequences of a failure of the pressure boundary, as well as assessing the likelihood (probability) of failure. Risk-based inspection also prioritizes risk levels in a systematic manner so that the owner-user can then plan an inspection program that focuses more resources on the higher risk equipment, while possibly saving inspection resources where they are not doing an effective job of reducing risk. At the same time, if consequence of failure is a significant driving force for high-risk equipment items, plant management also has the option of applying consequence mitigation steps to minimize the impact of a hazardous release, should one occur. The target audience for this paper is engineers, inspectors, and managers who want to understand what API Risk-Based Inspection is all about, what the benefits and limitations of RBI are, and how inspection practices can be changed to reduce risks and/or save costs without impacting safety risk. (Author)
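
    A minimal sketch of the risk-ranking arithmetic that RBI prioritisation rests on: each equipment item receives a likelihood and a consequence estimate, risk is their product, and inspection effort is ordered by risk. The items, probabilities and consequence figures below are invented and do not reproduce the API RBI procedure or its qualitative levels.

    ```python
    # risk = probability of failure (per year) x consequence (cost-equivalent units)
    equipment = {
        "crude column overhead drum": {"pof": 1e-3, "cof": 5e6},
        "amine reboiler":             {"pof": 5e-4, "cof": 2e6},
        "flare knockout drum":        {"pof": 1e-4, "cof": 8e6},
        "cooling water exchanger":    {"pof": 2e-3, "cof": 1e5},
    }

    ranked = sorted(equipment.items(),
                    key=lambda kv: kv[1]["pof"] * kv[1]["cof"],
                    reverse=True)

    for name, d in ranked:
        print(f"{name:28s} risk = {d['pof'] * d['cof']:8.1f} per year")
    # Inspection resources are then focused on the items at the top of this list.
    ```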

  20. Optimization of coupled multiphysics methodology for safety analysis of pebble bed modular reactor

    Science.gov (United States)

    Mkhabela, Peter Tshepo

    The research conducted within the framework of this PhD thesis is devoted to the high-fidelity multi-physics (based on neutronics/thermal-hydraulics coupling) analysis of the Pebble Bed Modular Reactor (PBMR), which is a High Temperature Reactor (HTR). The Next Generation Nuclear Plant (NGNP) will be a HTR design. The core design and safety analysis methods are considerably less developed and mature for HTR analysis than those currently used for Light Water Reactors (LWRs). Compared to LWRs, the HTR transient analysis is more demanding since it requires proper treatment of both slow and much longer transients (of time scale in hours and days) and fast and short transients (of time scale in minutes and seconds). There is limited operational and experimental data available for HTRs for validation of coupled multi-physics methodologies. This PhD work developed and verified reliable high-fidelity coupled multi-physics models, subsequently implemented in robust, efficient, and accurate computational tools, to analyse the neutronics and thermal-hydraulic behaviour for design optimization and safety evaluation of the PBMR concept. The study contributed to a greater accuracy of neutronics calculations by including the feedback from the thermal-hydraulics-driven temperature calculation and various multi-physics effects that can influence it. The feedback due to the influence of leakage was taken into account through the development and implementation of improved buckling feedback models. Modifications were made in the calculation procedure to ensure that the xenon depletion models were accurate for proper interpolation from cross-section tables. To achieve this, the NEM/THERMIX coupled code system was developed to create a system that is efficient and stable over the duration of transient calculations that last over several tens of hours. Another achievement of the PhD thesis was the development and demonstration of full-physics, three-dimensional safety analysis

  1. Causality analysis in business performance measurement system using system dynamics methodology

    Science.gov (United States)

    Yusof, Zainuridah; Yusoff, Wan Fadzilah Wan; Maarof, Faridah

    2014-07-01

    One of the main components of the Balanced Scorecard (BSC) that differentiates it from any other performance measurement system (PMS) is the Strategy Map with its unidirectional causality feature. Despite its apparent popularity, criticisms of the causality have been rigorously discussed by earlier researchers. In seeking empirical evidence of causality, propositions based on the service profit chain theory were developed and tested using econometric analysis, namely the Granger causality test, on 45 data points. However, the insufficiency of well-established causality models was found, as only 40% of the causal linkages were supported by the data. Expert knowledge was suggested to be used in situations of insufficient historical data. The Delphi method was selected and conducted to obtain consensus on the existence of causality among 15 selected experts over 3 rounds of questionnaires. The study revealed that only 20% of the propositions were not supported. The existence of bidirectional causality, which demonstrates significant dynamic environmental complexity through interaction among measures, was obtained from both methods. With that, a computer model and simulation using the System Dynamics (SD) methodology was developed as an experimental platform to identify how policies impact business performance in such environments. Reproduction, sensitivity and extreme-condition tests were conducted on the developed SD model to ensure its capability to mimic reality, its robustness and its validity as a causality analysis platform. This study applied a theoretical service management model within the BSC domain to a practical situation using SD methodology, where very limited work has been done.
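
    For readers unfamiliar with the Granger test mentioned above, the sketch below shows one common way to run it in Python with statsmodels on two synthetic series. The data are simulated and the lag choice is arbitrary, so this only illustrates the mechanics of the test, not the study's 45-point Balanced Scorecard dataset.

    ```python
    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    rng = np.random.default_rng(0)

    # Simulate a driver series x and a response y that lags x by one period.
    n = 200
    x = rng.normal(size=n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = 0.6 * x[t - 1] + 0.2 * y[t - 1] + rng.normal(scale=0.5)

    # grangercausalitytests expects a two-column array and tests whether the
    # SECOND column Granger-causes the FIRST.
    results = grangercausalitytests(np.column_stack([y, x]), maxlag=2, verbose=False)
    for lag, res in results.items():
        f_stat, p_val = res[0]["ssr_ftest"][:2]
        print(f"lag {lag}: F = {f_stat:.1f}, p = {p_val:.3g}")
    ```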

  2. Adding value in oil and gas by applying decision analysis methodologies: case history

    Energy Technology Data Exchange (ETDEWEB)

    Marot, Nicolas [Petro Andina Resources Inc., Alberta (Canada); Francese, Gaston [Tandem Decision Solutions, Buenos Aires (Argentina)

    2008-07-01

    Petro Andina Resources Ltd., together with Tandem Decision Solutions, developed a strategic long-range plan applying decision analysis methodology. The objective was to build a robust and fully integrated strategic plan that accomplishes company growth goals and sets the strategic directions for the long range. The stochastic methodology and the Integrated Decision Management (IDM™) staged approach allowed the company to visualize the associated value and risk of the different strategies while achieving organizational alignment, clarity of action and confidence in the path forward. A decision team jointly involving PAR representatives and Tandem consultants was established to carry out this four-month project. Discovery and framing sessions allowed the team to disrupt the status quo, discuss near- and far-reaching ideas and gather the building blocks from which creative strategic alternatives were developed. A comprehensive stochastic valuation model was developed to assess the potential value of each strategy, applying simulation tools, sensitivity analysis tools and contingency planning techniques. The final insights and results have been used to populate the final strategic plan presented to the company board, providing confidence to the team and assuring that the work embodies the best available ideas, data and expertise, and that the proposed strategy was ready to be elaborated into an optimized course of action. (author)

  3. Investigating DMOs through the Lens of Social Network Analysis: Theoretical Gaps, Methodological Challenges and Practitioner Perspectives

    Directory of Open Access Journals (Sweden)

    Dean HRISTOV

    2015-06-01

    Full Text Available The extant literature on networks in tourism management research has traditionally acknowledged destinations as the primary unit of analysis. This paper takes an alternative perspective and positions Destination Management Organisations (DMOs at the forefront of today’s tourism management research agenda. Whilst providing a relatively structured approach to generating enquiry, network research vis-à-vis Social Network Analysis (SNA in DMOs is often surrounded by serious impediments. Embedded in the network literature, this conceptual article aims to provide a practitioner perspective on addressing the obstacles to undertaking network studies in DMO organisations. A simple, three-step methodological framework for investigating DMOs as interorganisational networks of member organisations is proposed in response to complexities in network research. The rationale behind introducing such framework lies in the opportunity to trigger discussions and encourage further academic contributions embedded in both theory and practice. Academic and practitioner contributions are likely to yield insights into the importance of network methodologies applied to DMO organisations.

  4. Core melt progression and consequence analysis methodology development in support of the Savannah River Reactor PSA

    International Nuclear Information System (INIS)

    O'Kula, K.R.; Sharp, D.A.; Amos, C.N.; Wagner, K.C.; Bradley, D.R.

    1992-01-01

    A three-level Probabilistic Safety Assessment (PSA) of production reactor operation has been underway since 1985 at the US Department of Energy's Savannah River Site (SRS). The goals of this analysis are to: Analyze existing margins of safety provided by the heavy-water reactor (HWR) design challenged by postulated severe accidents; Compare measures of risk to the general public and onsite workers to guideline values, as well as to those posed by commercial reactor operation; and Develop the methodology and database necessary to prioritize improvements to engineering safety systems and components, operator training, and engineering projects that contribute significantly to improving plant safety. PSA technical staff from the Westinghouse Savannah River Company (WSRC) and Science Applications International Corporation (SAIC) have performed the assessment despite two obstacles: A variable baseline plant configuration and power level; and a lack of technically applicable code methodology to model the SRS reactor conditions. This paper discusses the detailed effort necessary to modify the requisite codes before accident analysis insights for the risk assessment were obtained

  5. Methodology for the analysis of self-tensioned wooden structural floors

    Directory of Open Access Journals (Sweden)

    F. Suárez-Riestra

    2017-09-01

    Full Text Available A self-tensioning system is described, constituted by a force-multiplying device which, attached to the supports at the ends of the structural element, is able to convert the vertical resultant of the gravitational actions into an effective tensioning action through the movement induced by a set of rods. The self-tensioning system offers high performance thanks to the beneficial effect of the opposite deflection generated by the tensioning, which grows in proportion to the increase of the gravitational action. This allows the design of long-span timber ribbed floors with reduced depths. The complexity of calculation due to the non-linearity of the system can be obviated with the methodology of analysis developed in the article. In order to illustrate the advantages of the self-tensioning system and the methodology of analysis developed here, six cases of ribbed floors have been analysed, with spans of 9, 12 and 15 m and variable live loads of 3.00 kN/m2 and 5.00 kN/m2.

  6. ALARA cost/benefit analysis at Union Electric company using the ARP/AI methodology

    International Nuclear Information System (INIS)

    Williams, M.C.

    1987-01-01

    This paper describes the development of a specific method for justification of expenditures associated with reducing occupational radiation exposure to as low as reasonably achievable (ALARA). This methodology is based on the concepts of the Apparent Reduction Potential (ARP) and Achievability Index (AI) as described in NUREG/CR-0446, Union Electric's corporate planning model, and the EPRI model for dose rate buildup with reactor operating life. The ARP provides a screening test to determine if there is a need for ALARA expenditures based on actual or predicted exposure rates and/or dose experience. The AI is a means of assessing all costs and all benefits, even though they are expressed in different units of measurement such as person-rem and dollars, to determine if ALARA expenditures are justified and what their value is. This method of cost/benefit analysis can be applied by any company or organization utilizing site-specific exposure and dose rate data, and incorporating consideration of administrative exposure controls which may vary from organization to organization. Specific example cases are presented and compared to other methodologies for ALARA cost/benefit analysis
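
    The arithmetic behind such an ALARA justification can be illustrated with a simple cost-per-dose-averted comparison. The dollar value per person-rem, the dose estimate and the modification cost below are illustrative assumptions and do not reproduce the ARP/AI formulation of NUREG/CR-0446.

    ```python
    # Illustrative ALARA screening: compare the cost of a proposed modification
    # with the monetary value assigned to the collective dose it would avert.
    dose_averted_person_rem = 120.0   # assumed, over the remaining plant life
    cost_of_modification = 150_000.0  # assumed installed cost, dollars

    VALUE_PER_PERSON_REM = 2_000.0    # assumed policy value, $/person-rem

    benefit = dose_averted_person_rem * VALUE_PER_PERSON_REM
    cost_per_rem_averted = cost_of_modification / dose_averted_person_rem

    print(f"monetised benefit        = ${benefit:,.0f}")
    print(f"cost per person-rem      = ${cost_per_rem_averted:,.0f}")
    print("justified on cost/benefit grounds:", benefit >= cost_of_modification)
    ```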

  7. Discourse analysis: A useful methodology for health-care system researches.

    Science.gov (United States)

    Yazdannik, Ahmadreza; Yousefy, Alireza; Mohammadi, Sepideh

    2017-01-01

    Discourse analysis (DA) is an interdisciplinary field of inquiry and an increasingly popular research strategy for researchers in various disciplines, yet it has been little employed by health-care researchers. The methodology involves a focus on the sociocultural and political context in which text and talk occur. DA adds a linguistic approach to an understanding of the relationship between language and ideology, exploring the way in which theories of reality and relations of power are encoded in such aspects as the syntax, style, and rhetorical devices used in texts. DA is a useful and productive qualitative methodology but has been underutilized within health-care system research. Without a clear understanding of discourse theory and DA, it is difficult to comprehend important research findings and impossible to use DA as a research strategy. To redress this deficiency, this article presents an introduction to the concepts of discourse and DA, the history of DA, its philosophical background, its types, and its analysis strategy. Finally, we discuss how the ideological dimension of discourse affects the health-care system, health beliefs and intra-disciplinary relationships within the health-care system.

  8. Methodology of analysis sustainable development of Ukraine by using the theory fuzzy logic

    Directory of Open Access Journals (Sweden)

    Methodology of analysis sustainable development of Ukraine by using the theory fuzzy logic

    2016-02-01

    Full Text Available The objective of the article is to analyse the theoretical and methodological aspects of assessing sustainable development in times of crisis. A methodical approach to the analysis of the sustainable development of a territory, taking into account the assessment of the level of economic security, is proposed. The need to develop a complex methodical approach that accounts for uncertainty and multiple criteria in the tasks of ensuring economic security, on the basis of the fuzzy logic (fuzzy set) theory, is demonstrated. The results of using the fuzzy set method to trace the dynamics of sustainable development in Ukraine during the years 2002-2012 are presented.

  9. Patient's radioprotection and analysis of DPC practices and certification of health facilities - Methodological guide

    International Nuclear Information System (INIS)

    Bataillon, Remy; Lafont, Marielle; Rousse, Carole; Vuillez, Jean-Philippe; Ducou Le Pointe, Hubert; Grenier, Nicolas; Lartigau, Eric; Orcel, Philippe; Dujarric, Francis; Beaupin, Alain; Bar, Olivier; Blondet, Emmanuelle; Combe, Valerie; Pages, Frederique

    2012-11-01

    This methodological guide has been published in compliance with French and European regulatory texts to define the modalities of implementation of the assessment of clinical practices resulting in exposure to ionizing radiation in the medical environment (radiotherapy, radio-surgery, interventional radiology, nuclear medicine), to promote clinical audits, and to ease the implementation of programs of continuous professional development in radiotherapy, radiology and nuclear medicine. The guide proposes an analysis of professional practices through analysis sheets which address several aspects: scope, practice data, objectives in terms of improvement of radiation protection, regulatory and institutional references, operational objectives, methods, approaches and tools, follow-up indicators, actions to improve practices, professional target, collective approach, program organisation, and program valorisation in existing arrangements. It also gives 20 program proposals which notably aim at continuous professional development, 5 of them dealing with diagnosis-oriented imagery-based examinations, 9 with radiology and risk management, 4 with radiotherapy, and 2 with nuclear medicine

  10. A SAS2H/KENO-V Methodology for 3D Full Core depletion analysis

    International Nuclear Information System (INIS)

    Milosevic, M.; Greenspan, E.; Vujic, J.; Petrovic, B.

    2003-04-01

    This paper describes the use of a SAS2H/KENO-V methodology for 3D full core depletion analysis and illustrates its capabilities by applying it to burnup analysis of the IRIS core benchmarks. This new SAS2H/KENO-V sequence combines a 3D Monte Carlo full core calculation of node power distribution and a 1D Wigner-Seitz equivalent cell transport method for independent depletion calculation of each of the nodes. This approach reduces by more than an order of magnitude the time required for getting comparable results using the MOCUP code system. The SAS2H/KENO-V results for the asymmetric IRIS core benchmark are in good agreement with the results of the ALPHA/PHOENIX/ANC code system. (author)

  11. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    International Nuclear Information System (INIS)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong; Mahadevan, Sankaran

    2017-01-01

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method evaluates dependence only from a positive (optimistic) aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective
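
    The Dempster-Shafer step of such a method can be illustrated with a small example of Dempster's rule combining two analysts' basic probability assignments over dependence levels. The frame of discernment, the masses and the two analysts are hypothetical, and the AHP weighting of the evidence described in the paper is not reproduced here.

    ```python
    from itertools import product

    # Frame of discernment: dependence level between two successive operator tasks.
    LOW, MOD, HIGH = "low", "moderate", "high"
    THETA = frozenset({LOW, MOD, HIGH})

    # Two illustrative basic probability assignments (masses on subsets of THETA).
    m1 = {frozenset({LOW}): 0.5, frozenset({MOD}): 0.3, THETA: 0.2}
    m2 = {frozenset({MOD}): 0.6, frozenset({HIGH}): 0.1, THETA: 0.3}

    def dempster_combine(ma, mb):
        """Dempster's rule: conjunctive combination with conflict renormalisation."""
        combined, conflict = {}, 0.0
        for (A, wa), (B, wb) in product(ma.items(), mb.items()):
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + wa * wb
            else:
                conflict += wa * wb
        return {A: w / (1.0 - conflict) for A, w in combined.items()}

    m12 = dempster_combine(m1, m2)
    for subset, mass in sorted(m12.items(), key=lambda kv: -kv[1]):
        print(sorted(subset), round(mass, 3))
    ```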

  12. The definitive analysis of the Bendandi's methodology performed with a specific software

    Science.gov (United States)

    Ballabene, Adriano; Pescerelli Lagorio, Paola; Georgiadis, Teodoro

    2015-04-01

    The presentation aims to clarify the 'Bendandi method', supposed in the past to be able to forecast earthquakes and never explicitly passed on to posterity by the geophysicist from Faenza. The geoethics implications of Bendandi's forecasts, and of the speculation about possible earthquakes inferred from supposed 'Bendandian' methodologies, arose in previous years from social alarms over predicted earthquakes that never happened but were widely spread by the media following some 'well informed' non-conventional scientists. The analysis was conducted through an extensive literature search of the 'Raffaele Bendandi' archive at the Geophysical Observatory of Faenza, and the forecasts were analysed using specially developed software, called "Bendandiano Dashboard", that can reproduce the planetary configurations reported in the graphs made by the Italian geophysicist. This analysis should serve to clarify definitively what the basis of Bendandi's calculations is, as well as to prevent future unwarranted warnings issued on the basis of supposed prophecies and illusory legacy documents.

  13. Evidential Analytic Hierarchy Process Dependence Assessment Methodology in Human Reliability Analysis

    Directory of Open Access Journals (Sweden)

    Luyuan Chen

    2017-02-01

    Full Text Available In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster–Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method evaluates dependence only from a positive (optimistic) aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  14. Evidential analytic hierarchy process dependence assessment methodology in human reliability analysis

    Energy Technology Data Exchange (ETDEWEB)

    Chen, Lu Yuan; Zhou, Xinyi; Xiao, Fuyuan; Deng, Yong [School of Computer and Information Science, Southwest University, Chongqing (China); Mahadevan, Sankaran [School of Engineering, Vanderbilt University, Nashville (United States)

    2017-02-15

    In human reliability analysis, dependence assessment is an important issue in risky large complex systems, such as operation of a nuclear power plant. Many existing methods depend on an expert's judgment, which contributes to the subjectivity and restrictions of results. Recently, a computational method, based on the Dempster-Shafer evidence theory and analytic hierarchy process, has been proposed to handle the dependence in human reliability analysis. The model can deal with uncertainty in an analyst's judgment and reduce the subjectivity in the evaluation process. However, the computation is heavy and complicated to some degree. The most important issue is that the existing method evaluates dependence only from a positive (optimistic) aspect, which may cause an underestimation of the risk. In this study, a new evidential analytic hierarchy process dependence assessment methodology, based on the improvement of existing methods, has been proposed, which is expected to be easier and more effective.

  15. Shrinkage Analysis on Thick Plate Part using Response Surface Methodology (RSM)

    Directory of Open Access Journals (Sweden)

    Isafiq M.

    2016-01-01

    Full Text Available The work reported herein is an analysis of the quality (shrinkage) of a thick plate part using Response Surface Methodology (RSM). Previous research showed that the most influential factors affecting the shrinkage of moulded parts are mould and melt temperature. Autodesk Moldflow Insight software was used for the analysis, while specifications of the Nessei NEX 1000 injection moulding machine and P20 mould material were incorporated in this study, on top of Acrylonitrile Butadiene Styrene (ABS) as the moulded thermoplastic material. Mould temperature, melt temperature, packing pressure and packing time were selected as variable parameters. The results show that the shrinkage improved by 42.48% and 14.41% in the parallel and normal directions respectively after the optimisation process.

  16. Analysis of the link between a definition of sustainability and the life cycle methodologies

    DEFF Research Database (Denmark)

    Jørgensen, Andreas; Herrmann, Ivan Tengbjerg; Bjørn, Anders

    2013-01-01

    It has been claimed that in order to assess the sustainability of products, a combination of the results from a life cycle assessment (LCA), social life cycle assessment (SLCA) and life cycle costing (LCC) is needed. Despite the frequent reference to this claim in the literature, very little explicit analysis of the claim has been made. The purpose of this article is to analyse this claim. An interpretation of the goals of sustainability, as outlined in the report Our Common Future (WCED 1987), which is the basis for most literature on sustainability assessment in the LCA community, is presented and detailed to a level enabling an analysis of the relation to the impact categories at midpoint level considered in life cycle (LC) methodologies. The interpretation of the definition of sustainability as outlined in Our Common Future (WCED 1987) suggests that the assessment of a product

  17. A methodological proposal for quantifying environmental compensation through the spatial analysis of vulnerability indicators

    Directory of Open Access Journals (Sweden)

    Fabio Enrique Torresan

    2008-06-01

    Full Text Available The aim of this work was to propose a methodology for quantifying environmental compensation through the spatial analysis of vulnerability indicators. The term environmental compensation refers to the developer's obligation to support the establishment and maintenance of Conservation Units, applicable to enterprises with significant environmental impact, in accordance with Brazilian Law 9.986/2000; the volume of resources to be applied by the developer must be at least 0.5% of the total costs foreseen for the implementation of the enterprise, with the exact percentage set by the competent environmental agency according to the degree of environmental impact. A case study was applied to the analysis of sand extraction enterprises in the region of Descalvado and Analândia, inland of São Paulo State, Brazil. Environmental vulnerability scores were attributed to indicators related to erosion, hydrological resources and biodiversity loss. This methodological proposal allowed the analysis of local alternatives for a given enterprise with the objective of reducing impacts and, at the same time, reducing the costs of environmental compensation. The application of the methodology significantly reduced the degree of subjectivity usually associated with most impact evaluation methodologies.

  18. Extending the input–output energy balance methodology in agriculture through cluster analysis

    International Nuclear Information System (INIS)

    Bojacá, Carlos Ricardo; Casilimas, Héctor Albeiro; Gil, Rodrigo; Schrevens, Eddie

    2012-01-01

    The input–output balance methodology has been applied to characterize the energy balance of agricultural systems. This study proposes to extend this methodology with the inclusion of multivariate analysis to reveal particular patterns in the energy use of a system. The objective was to demonstrate the usefulness of multivariate exploratory techniques to analyze the variability found in a farming system and, establish efficiency categories that can be used to improve the energy balance of the system. To this purpose an input–output analysis was applied to the major greenhouse tomato production area in Colombia. Individual energy profiles were built and the k-means clustering method was applied to the production factors. On average, the production system in the study zone consumes 141.8 GJ ha −1 to produce 96.4 GJ ha −1 , resulting in an energy efficiency of 0.68. With the k-means clustering analysis, three clusters of farmers were identified with energy efficiencies of 0.54, 0.67 and 0.78. The most energy efficient cluster grouped 56.3% of the farmers. It is possible to optimize the production system by improving the management practices of those with the lowest energy use efficiencies. Multivariate analysis techniques demonstrated to be a complementary pathway to improve the energy efficiency of a system. -- Highlights: ► An input–output energy balance was estimated for greenhouse tomatoes in Colombia. ► We used the k-means clustering method to classify growers based on their energy use. ► Three clusters of growers were found with energy efficiencies of 0.54, 0.67 and 0.78. ► Overall system optimization is possible by improving the energy use of the less efficient.
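
    A hedged sketch of the clustering step described above, assuming per-farm energy inputs and outputs have already been tabulated from the input-output balance. The numbers and the scikit-learn usage are illustrative stand-ins, not the study's greenhouse tomato data.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans
    from sklearn.preprocessing import StandardScaler

    # Hypothetical per-farm energy profiles (GJ/ha): fertiliser, fuel,
    # electricity and labour inputs, plus energy output in the last column.
    profiles = np.array([
        [55, 30, 40, 5, 90], [60, 35, 50, 6, 95], [40, 25, 30, 4, 92],
        [70, 45, 55, 7, 88], [45, 28, 35, 4, 96], [65, 40, 52, 6, 85],
    ])

    X = StandardScaler().fit_transform(profiles)  # put factors on a common scale
    labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)

    for k in range(3):
        members = profiles[labels == k]
        efficiency = members[:, -1].sum() / members[:, :-1].sum()
        print(f"cluster {k}: {len(members)} farms, energy efficiency = {efficiency:.2f}")
    ```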

  19. Methodology for cost analysis of film-based and filmless portable chest systems

    Science.gov (United States)

    Melson, David L.; Gauvain, Karen M.; Beardslee, Brian M.; Kraitsik, Michael J.; Burton, Larry; Blaine, G. James; Brink, Gary S.

    1996-05-01

    Many studies analyzing the costs of film-based and filmless radiology have focused on multi-modality, hospital-wide solutions. Yet due to the enormous cost of converting an entire large radiology department or hospital to a filmless environment all at once, institutions often choose to eliminate film one area at a time. Narrowing the focus of cost-analysis may be useful in making such decisions. This presentation will outline a methodology for analyzing the cost per exam of film-based and filmless solutions for providing portable chest exams to Intensive Care Units (ICUs). The methodology, unlike most in the literature, is based on parallel data collection from existing filmless and film-based ICUs, and is currently being utilized at our institution. Direct costs, taken from the perspective of the hospital, for portable computed radiography chest exams in one filmless and two film-based ICUs are identified. The major cost components are labor, equipment, materials, and storage. Methods for gathering and analyzing each of the cost components are discussed, including FTE-based and time-based labor analysis, incorporation of equipment depreciation, lease, and maintenance costs, and estimation of materials costs. Extrapolation of data from three ICUs to model hypothetical, hospital-wide film-based and filmless ICU imaging systems is described. Performance of sensitivity analysis on the filmless model to assess the impact of anticipated reductions in specific labor, equipment, and archiving costs is detailed. A number of indirect costs, which are not explicitly included in the analysis, are identified and discussed.

  20. CONTAINMENT ANALYSIS METHODOLOGY FOR TRANSPORT OF BREACHED CLAD ALUMINUM SPENT FUEL

    Energy Technology Data Exchange (ETDEWEB)

    Vinson, D.

    2010-07-11

    Aluminum-clad, aluminum-based spent nuclear fuel (Al-SNF) from foreign and domestic research reactors (FRR/DRR) is being shipped to the Savannah River Site and placed in interim storage in a water basin. To enter the United States, a cask with loaded fuel must be certified to comply with the requirements in the Title 10 of the U.S. Code of Federal Regulations, Part 71. The requirements include demonstration of containment of the cask with its contents under normal and accident conditions. Many Al-SNF assemblies have suffered corrosion degradation in storage in poor quality water, and many of the fuel assemblies are 'failed' or have through-clad damage. A methodology was developed to evaluate containment of Al-SNF even with severe cladding breaches for transport in standard casks. The containment analysis methodology for Al-SNF is in accordance with the methodology provided in ANSI N14.5 and adopted by the U. S. Nuclear Regulatory Commission in NUREG/CR-6487 to meet the requirements of 10CFR71. The technical bases for the inputs and assumptions are specific to the attributes and characteristics of Al-SNF received from basin and dry storage systems and its subsequent performance under normal and postulated accident shipping conditions. The results of the calculations for a specific case of a cask loaded with breached fuel show that the fuel can be transported in standard shipping casks and maintained within the allowable release rates under normal and accident conditions. A sensitivity analysis has been conducted to evaluate the effects of modifying assumptions and to assess options for fuel at conditions that are not bounded by the present analysis. These options would include one or more of the following: reduce the fuel loading; increase fuel cooling time; reduce the degree of conservatism in the bounding assumptions; or measure the actual leak rate of the cask system. That is, containment analysis for alternative inputs at fuel-specific conditions and

  1. The Spirit of OMERACT: Q Methodology Analysis of Conference Characteristics Valued by Delegates.

    Science.gov (United States)

    Flurey, Caroline A; Kirwan, John R; Hadridge, Phillip; Richards, Pamela; Grosskleg, Shawna; Tugwell, Peter S

    2015-10-01

    To identify the major features of OMERACT meetings as valued by frequent participants and to explore whether there are groups of participants with different opinions. Using Q methodology (a qualitative and quantitative approach to grouping people according to subjective opinion), participants (who attended more than 1 OMERACT conference) sorted 66 statements relating to the "spirit of OMERACT" according to level of agreement across a normal distribution grid. Data were examined using Q factor analysis. Of 226 potential participants, 105 responded (46%). All participants highly ranked the focus on global standardization of methods, outcome measures, data-driven research, methodological discussion, and international collaboration. Four factors describing the "spirit of OMERACT" were identified: "Evidence not eminence" (n = 31) valued the data- and evidence-driven research above personality and status; "Collaboration and collegiality" (n = 19) valued the international and cross-stakeholder collaboration, interaction, and collegiality; "Equal voices, equal votes, common goals" (n = 12) valued equality in discussion and voting, with everyone striving toward the same goal; "principles and product, not process" (n = 8) valued the principles of focusing on outcome measures and the product of guiding clinical trials, but were unsure whether the process is necessary to reach this. The factors did not segregate different stakeholder groups. Delegates value different elements of OMERACT, and thus the "spirit of OMERACT" encompasses evidence-based research, collaboration, and equality, although a small group are unsure whether the process is necessary to achieve the end result. Q methodology may prove useful for conference organizers to identify their delegates' different needs to tailor conference content.

  2. Development of a new methodology for the creation of water temperature scenarios using frequency analysis tool.

    Science.gov (United States)

    Val, Jonatan; Pino, María Rosa; Chinarro, David

    2018-03-15

    Thermal quality in river ecosystems is a fundamental property for the development of biological processes and many of the human activities linked to the aquatic environment. In the future, this property is going to be threatened by global change impacts, and basin managers will need useful tools to evaluate these impacts. Currently, future projections in temperature modelling are based on the historical data for air and water temperatures and on the relationship with past temperature scenarios; however, this represents a problem when evaluating future scenarios with new thermal impacts. Here, we analysed the thermal impacts produced by several human activities and linked them with the degree of decoupling of the thermal transfer mechanism from natural systems, measured with frequency analysis tools (wavelet coherence). Once this relationship had been established, we developed a new methodology for simulating different thermal impact scenarios in order to project them into the future. Finally, we validated this methodology using a site that changed its thermal quality during the studied period due to human impacts. Results showed a high correlation (r2 = 0.84) between the decoupling degree of the thermal transfer mechanisms and the quantified human impacts, yielding 3 thermal impact scenarios. Furthermore, the graphic representation of these thermal scenarios with their wavelet coherence spectra showed the impacts of an extreme drought period and of agricultural management. The inter-conversion between the scenarios gave high morphological similarities in the obtained wavelet coherence spectra, and the validation process clearly showed the high efficiency of the developed model against older methodologies when compared using the Nash-Sutcliffe criterion. Although there is need for further investigation with different climatic and anthropic management conditions, the developed frequency models could be useful in decision-making processes by managers when faced with future global

  3. A Methodology for the Analysis and Selection of Alternative for the Disposition of Surplus Plutonium

    International Nuclear Information System (INIS)

    1999-01-01

    The Department of Energy (DOE) - Office of Fissile Materials Disposition (OFMD) has announced a Record of Decision (ROD) selecting alternatives for disposition of surplus plutonium. A major objective of this decision was to further U.S. efforts to prevent the proliferation of nuclear weapons. Other concerns that were addressed include economic, technical, institutional, schedule, environmental, and health and safety issues. The technical, environmental, and nonproliferation analyses supporting the ROD are documented in three DOE reports (DOE-TSR 96, DOE-PEIS 96, and DOE-NN 97, respectively). At the request of OFMD, a team of analysts from the Amarillo National Resource Center for Plutonium (ANRCP) provided an independent evaluation of the alternatives for plutonium that were considered during the evaluation effort. This report outlines the methodology used by the ANRCP team. This methodology, referred to as multiattribute utility theory (MAU), provides a structure for assembling results of detailed technical, economic, schedule, environment, and nonproliferation analyses for OFMD, DOE policy makers, other stakeholders, and the general public in a systematic way. The MAU methodology has been supported for use in similar situations by the National Research Council, an agency of the National Academy of Sciences [1]. It is important to emphasize that the MAU process does not lead to a computerized model that actually determines the decision for a complex problem. MAU is a management tool that is one component, albeit a key component, of a decision process. We subscribe to the philosophy that the result of using models should be insights, not numbers. The MAU approach consists of four steps: (1) identification of alternatives, objectives, and performance measures, (2) estimation of the performance of the alternatives with respect to the objectives, (3) development of value functions and weights for the objectives, and (4) evaluation of the alternatives and sensitivity
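
    At its simplest, the four MAU steps listed above reduce to a weighted sum of normalised single-attribute utilities, as the sketch below shows. The alternatives, performance measures, utility scores and weights are invented for illustration and are not those of the plutonium disposition evaluation.

    ```python
    # Step 1: alternatives and objectives (performance measures).
    # Step 2: performance already converted, for brevity, into 0-1 utility scores.
    utilities = {  # illustrative scores, 1 = best
        "immobilisation":  {"nonproliferation": 0.8, "cost": 0.6, "schedule": 0.7},
        "MOX fuel":        {"nonproliferation": 0.7, "cost": 0.5, "schedule": 0.5},
        "hybrid approach": {"nonproliferation": 0.9, "cost": 0.4, "schedule": 0.6},
    }

    # Step 3: weights on the objectives (sum to 1).
    weights = {"nonproliferation": 0.5, "cost": 0.3, "schedule": 0.2}

    # Step 4: evaluate and rank; a sensitivity study would vary the weights.
    scores = {alt: sum(weights[k] * u[k] for k in weights) for alt, u in utilities.items()}
    for alt, s in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{alt:16s} overall utility = {s:.2f}")
    ```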

  4. National Waste Repository Novi Han operational safety analysis report. Safety assessment methodology

    International Nuclear Information System (INIS)

    2003-01-01

    The scope of the safety assessment (SA) presented includes: waste management functions (acceptance, conditioning, storage, disposal), inventory (current and expected in the future), hazards (radiological and non-radiological) and normal and accidental modes. The stages in the development of the SA are: criteria selection, information collection, safety analysis and safety assessment documentation. After the review of the facility's functions and the national and international requirements, the criteria for the safety level assessment are set. As a result of the second stage, the actual parameters of the facility necessary for the safety analysis are obtained. The methodology is selected on the basis of the comparability of the results with those of previous safety assessments and with existing standards and requirements. The procedure and requirements for scenario selection are described. A radiological hazard categorisation of the facilities is presented. A qualitative hazards and operability analysis is applied. The resulting list of events is subjected to a prioritization procedure by the method of 'criticality analysis', so that an estimation of the risk is given for each event. The events whose risk falls on the boundary of acceptability, or is unacceptable, are subjected to the next steps of the analysis. As a result, the lists of scenarios for PSA and of possible design scenarios are established. PSA logical modeling and quantitative calculations of accident sequences are presented

  5. Thermodynamic and exergoeconomic analysis of a cement plant: Part I – Methodology

    International Nuclear Information System (INIS)

    Atmaca, Adem; Yumrutaş, Recep

    2014-01-01

    Highlights: • Energy, exergy and exergoeconomic analysis of a complete cement plant has been investigated. • The first and second law efficiencies based on the energy and exergy analysis are defined for the entire cement plant. • The specific energy consumption of all sections of the cement plant has been analyzed. • The specific manufacturing costs of farine, clinker and cement have been determined by the cost analysis. - Abstract: The energy, exergy and exergoeconomic analysis of a cement factory is presented in two parts. This paper is the first part of the study, which includes the thermodynamic and exergoeconomic methodology and formulations developed for such a comprehensive and detailed analysis. The second part of this study is about the application of the developed formulation, which considers an actual cement plant located in Gaziantep, Turkey. The energy consumption by the cement industry is about 5% of the total global industrial energy consumption. It is also one of the world's largest industrial sources of CO2 emissions. In this paper, a cement plant is considered with all main manufacturing units. Mass, energy, and exergy balances are applied to each system. The first and second law efficiencies based on the energy and exergy analysis and performance assessment parameters are defined for the entire cement plant. The formulations for the cost of products, and cost formation and allocation within the system, are developed based on exergoeconomic analysis. In order to obtain the optimal marketing price of cement and to decrease the specific energy consumption of the whole plant, the cost analysis formulated here has substantial importance
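
    As a pointer to the kind of balance equations the paper develops, the sketch below computes first- and second-law efficiencies of a single plant unit from assumed energy and exergy flows. The figures are illustrative and the full exergoeconomic cost-allocation formulation is not reproduced.

    ```python
    # Illustrative balances for one plant unit (for example the rotary kiln), in MW.
    energy_in, energy_out = 95.0, 52.0  # energy supplied vs energy leaving with products
    exergy_in, exergy_out = 88.0, 31.0  # exergy of the fuel vs exergy of the products

    eta_I = energy_out / energy_in      # first-law (energy) efficiency
    eta_II = exergy_out / exergy_in     # second-law (exergy) efficiency
    exergy_destroyed = exergy_in - exergy_out

    print(f"energy efficiency eta_I  = {eta_I:.2f}")
    print(f"exergy efficiency eta_II = {eta_II:.2f}")
    print(f"exergy destruction       = {exergy_destroyed:.1f} MW")
    ```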

  6. METHODOLOGICAL ASPECTS OF CONTENT ANALYSIS OF CONVERGENCE BETWEEN UKRAINIAN GAAP AND INTERNATIONAL FINANCIAL REPORTING STANDARDS

    Directory of Open Access Journals (Sweden)

    R. Kuzina

    2015-06-01

    Full Text Available The objective conditions of Ukraine’s integration into the global business environment the need to strengthen the accounting and financial re-porting. At the stage of attracting investment in the country there is a need in the preparation of financial statements generally accepted basic prin-ciples of which are based on common international financial reporting standards (IFRS . Relevant is the assessment of convergence of national standards and International Financial Reporting Standards. However, before you conduct content analysis necessary to determine compliance with standards of methodological approaches to the selection of key indicators for the assessment of convergence. The article is to define the methodo-logical approaches to the selection and development of indicators IFRSs list of key elements for further evaluation convergence of national and international standards. To assess the convergence was allocated 187 basic key elements measuring the level of convergence to IFRS. Sampling was carried out based on the professional judgment of the author, the key indicators of the standard, based on the evaluation of the usefulness of accounting information. These figures make it possible to calculate the specific level of convergence of international and national standards and determine how statements prepared by domestic standards corresponding to IFRS. In other words, can with some certainty assert that Ukraine has made (“good practices in IFRS implementation” or not? This calculation will assess the regulatory efforts of government agencies (Ministry of Finance on the approximation of Ukrainian standards and IFRS.

  7. A methodology for analysis of impacts of grid integration of renewable energy

    International Nuclear Information System (INIS)

    George, Mel; Banerjee, Rangan

    2011-01-01

    Present electricity grids are predominantly thermal (coal, gas) and hydro based. Conventional power planning involves hydro-thermal scheduling and merit order dispatch. In the future, modern renewables (hydro, solar and biomass) are likely to have a significant share in the power sector. This paper presents a method to analyse the impacts of renewables in the electricity grid. A load duration curve based approach has been developed. Renewable energy sources have been treated as negative loads to obtain a modified load duration curve from which capacity savings in terms of base and peak load generation can be computed. The methodology is illustrated for solar, wind and biomass power for Tamil Nadu (a state in India). The trade-offs and interaction between renewable sources are analysed. The impacts on capacity savings by varying the wind regime have also been shown. Scenarios for 2021-22 have been constructed to illustrate the methodology proposed. This technique can be useful for power planners for an analysis of renewables in future electricity grids. - Research highlights: → A new method to analyse impacts of renewables in the electricity grid. → Effects of wind, solar PV and biomass power on load duration curve and capacity savings are shown. → Illustration of intermittent renewables and their interplay for sites in India and the UK. → Future scenarios constructed for generation expansion planning with higher levels of renewable.
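
    A minimal numerical sketch of the negative-load idea: subtract hourly renewable generation from hourly demand, sort both series into duration curves, and read off the reduction in peak and base capacity. The load and generation profiles are synthetic stand-ins, not the Tamil Nadu data used in the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    hours = 8760
    t = np.arange(hours)

    # Synthetic hourly demand (MW) with a daily cycle, plus wind and solar output.
    load = 9000 + 1500 * np.sin(2 * np.pi * t / 24) + rng.normal(0, 300, hours)
    wind = np.clip(rng.normal(800, 400, hours), 0, None)
    solar = 1200 * np.clip(np.sin(2 * np.pi * (t % 24) / 24), 0, None)

    net_load = load - (wind + solar)   # renewables treated as negative load

    ldc = np.sort(load)[::-1]          # original load duration curve
    nldc = np.sort(net_load)[::-1]     # modified (net) load duration curve

    print(f"peak capacity saving: {ldc[0] - nldc[0]:.0f} MW")
    print(f"base capacity saving: {ldc[-1] - nldc[-1]:.0f} MW")
    ```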

  8. The U-tube sampling methodology and real-time analysis of geofluids

    International Nuclear Information System (INIS)

    Freifeld, Barry; Perkins, Ernie; Underschultz, James; Boreham, Chris

    2009-01-01

    The U-tube geochemical sampling methodology, an extension of the porous cup technique proposed by Wood (1973), provides minimally contaminated aliquots of multiphase fluids from deep reservoirs and allows for accurate determination of dissolved gas composition. The initial deployment of the U-tube during the Frio Brine Pilot CO2 storage experiment, Liberty County, Texas, obtained representative samples of brine and supercritical CO2 from a depth of 1.5 km. A quadrupole mass spectrometer provided real-time analysis of dissolved gas composition. Since the initial demonstration, the U-tube has been deployed for (1) sampling of fluids down gradient of the proposed Yucca Mountain High-Level Waste Repository, Amargosa Valley, Nevada; (2) acquiring fluid samples beneath permafrost in Nunavut Territory, Canada; and (3) a CO2 storage demonstration project within a depleted gas reservoir, Otway Basin, Victoria, Australia. The addition of in-line high-pressure pH and EC sensors allows for continuous monitoring of fluid during sample collection. Difficulties have arisen during U-tube sampling, such as blockage of sample lines by naturally occurring waxes or by freezing conditions; however, workarounds such as solvent flushing or heating have been used to address these problems. The U-tube methodology has proven to be robust and, with careful consideration of the constraints and limitations, can provide high-quality geochemical samples.

  9. Proposal of a Methodology of Stakeholder Analysis for the Brazilian Satellite Space Program

    Directory of Open Access Journals (Sweden)

    Mônica Elizabeth Rocha de Oliveira

    2012-03-01

    Full Text Available To ensure the continuity and growth of space activities in Brazil, it is fundamental to persuade Brazilian society and its representatives in Government of the importance of investments in space activities. It is also important to convince talented professionals to make space activities an object of their interest; the best schools should likewise be convinced to offer courses related to the space sector; finally, innovative companies should be convinced to take part in space sector activities, looking for returns, mainly in terms of market differentiation and qualification, as a path to participation in high-technology, high-complexity projects. On the one hand, this process of convincing or, more importantly, committing these actors to space activities implies a thorough understanding of their expectations and needs, in order to plan how the system/organization can meet them. On the other hand, if stakeholders understand how much they can benefit from this relationship, their consequent commitment will greatly strengthen the action of the system/organization. With this framework in perspective, this paper proposes a methodology of stakeholder analysis for the Brazilian satellite space program. In the exercise developed in the article, stakeholders are identified from a study of the legal framework of the Brazilian space program. Subsequently, the proposed methodology is applied to the planning of actions by a public organization.

  10. A methodology for selection of wind energy system locations using multicriterial analysis

    Energy Technology Data Exchange (ETDEWEB)

    Sansevic, M.; Rabadan, Lj. Pilic [Croatia Univ., Faculty of Electrical Engineering, Mechanical Engineering and Naval Architecture, Split (Croatia)

    1996-12-31

    The effectiveness of a wind turbine generator depends not only on its performance but also on the site's wind resource. Thus the problem of location selection should be approached systematically, by considering a set of relevant parameters, particularly those having a significant economic and ecological impact. This paper presents the methodology used in selecting locations for the operation of wind energy systems. It is based on a multicriterial analysis which enables the comparison and ranking of locations according to a set of different parameters. The principal objectives (criteria) in location selection are: energy-economic, technical-technological, physical planning, and environment and life protection objectives. For the mathematical modelling of this multicriterial problem the PROMETHEE method is chosen, which was developed especially for the solution of rather "poorly" structured problems, thus justifying its application in the preliminary stage of site selection for wind energy systems. The developed methodology is applied to selecting locations on the island of Rhodes using the available database of the Geographic Information System and the wind potential data obtained by means of the AIOLOS program. (Author)
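    As an illustration of the ranking step, a bare-bones PROMETHEE II calculation (net outranking flows with a simple linear preference function) might look like the sketch below; the candidate sites, criteria, weights, and scores are invented for illustration and are not the Rhodes data used in the paper.

```python
import numpy as np

# Invented example: 4 candidate sites scored on 3 criteria.
# Criteria: mean wind speed (maximize), grid-connection cost (minimize),
# environmental impact (minimize).
scores = np.array([
    [7.2, 1.8, 3.0],   # site A
    [6.5, 1.2, 2.0],   # site B
    [8.1, 2.5, 4.5],   # site C
    [5.9, 0.9, 1.5],   # site D
])
weights = np.array([0.5, 0.3, 0.2])
maximize = np.array([True, False, False])
p = np.array([1.5, 1.0, 2.0])      # preference thresholds for a linear preference function

n, k = scores.shape
pref = np.zeros((n, n))            # pref[a, b] = weighted preference of site a over site b
for a in range(n):
    for b in range(n):
        if a == b:
            continue
        d = scores[a] - scores[b]
        d = np.where(maximize, d, -d)     # orient so "larger is better" on every criterion
        pj = np.clip(d / p, 0.0, 1.0)     # linear preference: 0 below 0, 1 above threshold p
        pref[a, b] = np.dot(weights, pj)

phi_plus = pref.sum(axis=1) / (n - 1)     # positive outranking flow
phi_minus = pref.sum(axis=0) / (n - 1)    # negative outranking flow
phi = phi_plus - phi_minus                # PROMETHEE II net flow
ranking = np.argsort(-phi)
print("Net flows:", dict(zip("ABCD", phi.round(3))))
print("Ranking (best first):", [chr(ord("A") + i) for i in ranking])
```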

  11. What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis

    Science.gov (United States)

    Schick-Makaroff, Kara; MacDonald, Marjorie; Plummer, Marilyn; Burgess, Judy; Neander, Wendy

    2016-01-01

    Background When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use ‘research synthesis’ as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. Methods We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. Results We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. Conclusions The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers. PMID:29546155

  12. Field programmable gate array reliability analysis using the dynamic flow graph methodology

    Energy Technology Data Exchange (ETDEWEB)

    McNelles, Phillip; Lu, Lixuan [Faculty of Energy Systems and Nuclear Science, University of Ontario Institute of Technology (UOIT), Ontario (Canada)

    2016-10-15

    Field programmable gate array (FPGA)-based systems are thought to be a practical option to replace certain obsolete instrumentation and control systems in nuclear power plants. An FPGA is a type of integrated circuit, which is programmed after being manufactured. FPGAs have some advantages over other electronic technologies, such as analog circuits, microprocessors, and Programmable Logic Controllers (PLCs), for nuclear instrumentation and control, and safety system applications. However, safety-related issues for FPGA-based systems remain to be verified. Owing to this, modeling FPGA-based systems for safety assessment has now become an important point of research. One potential methodology is the dynamic flowgraph methodology (DFM). It has been used for modeling software/hardware interactions in modern control systems. In this paper, FPGA logic was analyzed using DFM. Four aspects of FPGAs are investigated: the 'IEEE 1164 standard', registers (D flip-flops), configurable logic blocks, and an FPGA-based signal compensator. The ModelSim simulations confirmed that DFM was able to accurately model those four FPGA properties, proving that DFM has the potential to be used in the modeling of FPGA-based systems. Furthermore, advantages of DFM over traditional reliability analysis methods and FPGA simulators are presented, along with a discussion of potential issues with using DFM for FPGA-based system modeling.
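    To give a flavour of how DFM discretizes component behaviour into multi-valued states and decision tables, the sketch below models a D flip-flop with a reduced IEEE-1164-like value set and then backtracks from an undesired output state to its causes. The table, the three-value set, and the backtracking step are simplified assumptions for illustration, not the models used in the paper.

```python
# Illustrative DFM-style decision-table model of a D flip-flop, using a reduced
# IEEE-1164-like value set {'0', '1', 'X'}. Simplified assumption, not the paper's model.

from itertools import product

VALUES = ("0", "1", "X")

def dff_next_q(d: str, rising_edge: str, q_prev: str) -> str:
    """Decision table: next Q state given D, a clock-edge indicator, and previous Q."""
    if rising_edge == "1":               # clocked: Q follows D (unknown D propagates as 'X')
        return d
    if rising_edge == "0":               # no edge: Q holds its previous value
        return q_prev
    return d if d == q_prev else "X"     # uncertain edge: known only if D and Q_prev agree

# Build the full decision table over all discretized input states.
table = {(d, e, q): dff_next_q(d, e, q) for d, e, q in product(VALUES, repeat=3)}

# Deductive step: backtrack from the undesired top event "next Q is 'X'"
# to the input-state combinations that can produce it.
for (d, e, q), out in table.items():
    if out == "X":
        print(f"D={d}, rising_edge={e}, Q_prev={q}  ->  Q_next=X")
```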

  13. In Their Own Words? Methodological Considerations in the Analysis of Terrorist Autobiographies

    Directory of Open Access Journals (Sweden)

    Mary Beth Altier

    2012-01-01

    Full Text Available Despite the growth of terrorism literature in the aftermath of the 9/11 attacks, there remain several methodological challenges to studying certain aspects of terrorism. This is perhaps most evident in attempts to uncover the attitudes, motivations, and intentions of individuals engaged in violent extremism and how they are sometimes expressed in problematic behavior. Such challenges invariably stem from the fact that terrorists and the organizations to which they belong represent clandestine populations engaged in illegal activity. Unsurprisingly, these qualities make it difficult for the researcher to identify and locate willing subjects of study—let alone a representative sample. In this research note, we suggest the systematic analysis of terrorist autobiographies offers a promising means of investigating difficult-to-study areas of terrorism-related phenomena. Investigation of autobiographical accounts not only offers additional data points for the study of individual psychological issues, but also provides valuable perspectives on the internal structures, processes, and dynamics of terrorist organizations more broadly. Moreover, given most autobiographies cover critical events and personal experiences across the life course, they provide a unique lens into how terrorists perceive their world and insight into their decision-making processes. We support our advocacy of this approach by highlighting its methodological strengths and shortcomings.

  14. What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis.

    Directory of Open Access Journals (Sweden)

    Kara Schick-Makaroff

    2016-03-01

    Full Text Available Background: When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use ‘research synthesis’ as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. Methods: We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. Results: We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. Conclusions: The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers.

  15. What Synthesis Methodology Should I Use? A Review and Analysis of Approaches to Research Synthesis.

    Science.gov (United States)

    Schick-Makaroff, Kara; MacDonald, Marjorie; Plummer, Marilyn; Burgess, Judy; Neander, Wendy

    2016-01-01

    When we began this process, we were doctoral students and a faculty member in a research methods course. As students, we were facing a review of the literature for our dissertations. We encountered several different ways of conducting a review but were unable to locate any resources that synthesized all of the various synthesis methodologies. Our purpose is to present a comprehensive overview and assessment of the main approaches to research synthesis. We use 'research synthesis' as a broad overarching term to describe various approaches to combining, integrating, and synthesizing research findings. We conducted an integrative review of the literature to explore the historical, contextual, and evolving nature of research synthesis. We searched five databases, reviewed websites of key organizations, hand-searched several journals, and examined relevant texts from the reference lists of the documents we had already obtained. We identified four broad categories of research synthesis methodology including conventional, quantitative, qualitative, and emerging syntheses. Each of the broad categories was compared to the others on the following: key characteristics, purpose, method, product, context, underlying assumptions, unit of analysis, strengths and limitations, and when to use each approach. The current state of research synthesis reflects significant advancements in emerging synthesis studies that integrate diverse data types and sources. New approaches to research synthesis provide a much broader range of review alternatives available to health and social science students and researchers.

  16. The Application of Best Estimate and Uncertainty Analysis Methodology to Large LOCA Power Pulse in a CANDU 6 Reactor

    International Nuclear Information System (INIS)

    Abdul-Razzak, A.; Zhang, J.; Sills, H.E.; Flatt, L.; Jenkins, D.; Wallace, D.J.; Popov, N.

    2002-01-01

    The paper briefly describes a best estimate plus uncertainty analysis (BE+UA) methodology and presents its prototyping application to the power pulse phase of a limiting large Loss-of-Coolant Accident (LOCA) for a CANDU 6 reactor fuelled with CANFLEX® fuel. The methodology is consistent with and builds on world practice. The analysis is divided into two phases to focus on the dominant parameters for each phase and to allow for the consideration of all identified highly ranked parameters in the statistical analysis and response surface fits for margin parameters. The objective of this analysis is to quantify improvements in predicted safety margins under best estimate conditions. (authors)
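    The statistical core of a BE+UA calculation — sampling the highly ranked parameters and propagating them through a fitted response surface for a margin parameter — can be sketched as below. The parameter list, distributions, response-surface coefficients, and acceptance limit are purely illustrative assumptions and do not come from the CANDU 6 analysis.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # Monte Carlo trials

# Illustrative uncertain inputs (normalized deviations from best-estimate values).
reactivity_coeff = rng.normal(0.0, 1.0, n)    # e.g., void reactivity uncertainty
initial_power = rng.normal(0.0, 1.0, n)       # e.g., initial channel power uncertainty
gap_conductance = rng.uniform(-1.0, 1.0, n)   # e.g., fuel-to-sheath gap conductance

# Illustrative quadratic response surface for a margin parameter
# (say, peak sheath temperature in deg C); all coefficients are invented.
peak_temp = (900.0
             + 35.0 * reactivity_coeff
             + 20.0 * initial_power
             - 15.0 * gap_conductance
             + 5.0 * reactivity_coeff**2
             + 4.0 * reactivity_coeff * initial_power)

limit = 1200.0  # illustrative acceptance limit
p95 = np.percentile(peak_temp, 95)
print(f"95th percentile peak sheath temperature: {p95:.0f} C "
      f"(margin to limit: {limit - p95:.0f} C)")
```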

  17. THE STUDY OF ORGANIZATIONAL CULTURE: METHODOLOGY FOR QUANTITATIVE EVALUATION AND ANALYSIS

    Directory of Open Access Journals (Sweden)

    To Thu Trang

    2014-01-01

    Full Text Available This article discusses the concept and methods of evaluating organizational culture, covering both qualitative and quantitative assessment methodologies, and lists the basic methodologies for assessing organizational culture. Professor Denison’s methodology for assessing organizational culture is described in full.

  18. Applicability of contact angle techniques used in the analysis of contact lenses, part 1: comparative methodologies.

    Science.gov (United States)

    Campbell, Darren; Carnell, Sarah Maria; Eden, Russell John

    2013-05-01

    Contact angle, as a representative measure of surface wettability, is often employed to interpret contact lens surface properties. The literature is often contradictory and can lead to confusion. This literature review is part of a series regarding the analysis of hydrogel contact lenses using contact angle techniques. Here we present an overview of contact angle terminology, methodology, and analysis. Having discussed this background material, subsequent parts of the series will discuss the analysis of contact lens contact angles and evaluate differences in published laboratory results. The concepts of contact angle, wettability and wetting are presented as an introduction. Contact angle hysteresis is outlined, highlighting the advantages of dynamic analytical techniques over static methods. Determination of the surface free energy of a material illustrates how contact angle analysis can provide supplementary surface characterization. Although single values are able to distinguish individual material differences, surface free energy and dynamic methods provide an improved understanding of material behavior. The frequently used sessile drop, captive bubble, and Wilhelmy plate techniques are discussed. Their use as both dynamic and static methods, along with the advantages and disadvantages of each technique, is explained. No single contact angle technique fully characterizes the wettability of a material surface, and the application of complementing methods allows increased characterization. At present, there is no ISO standard method designed for soft materials. It is important that each contact angle technique has a standard protocol, as small protocol differences between laboratories often contribute to a variety of published data that are not easily comparable.
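    One common way contact angles are converted into a surface free energy, as alluded to above, is the Owens-Wendt (OWRK) two-liquid method. The sketch below solves it for a hypothetical lens material; the probe-liquid surface tension components are commonly quoted literature values and the contact angles are invented, so all inputs should be treated as assumptions rather than the article's data.

```python
import numpy as np

# Probe liquids: total, dispersive, and polar surface tension components (mN/m).
# Commonly quoted literature values, used here as assumptions; contact angles invented.
liquids = {
    "water":         {"total": 72.8, "disp": 21.8, "polar": 51.0, "theta_deg": 65.0},
    "diiodomethane": {"total": 50.8, "disp": 50.8, "polar": 0.0,  "theta_deg": 40.0},
}

# Owens-Wendt: gamma_L * (1 + cos(theta)) = 2*sqrt(gS_d*gL_d) + 2*sqrt(gS_p*gL_p)
# Linear in x = sqrt(gS_d), y = sqrt(gS_p): solve a 2x2 system from two liquids.
A, b = [], []
for liq in liquids.values():
    theta = np.radians(liq["theta_deg"])
    A.append([2.0 * np.sqrt(liq["disp"]), 2.0 * np.sqrt(liq["polar"])])
    b.append(liq["total"] * (1.0 + np.cos(theta)))

x, y = np.linalg.solve(np.array(A), np.array(b))
gamma_s_disp, gamma_s_polar = x**2, y**2
print(f"Dispersive component: {gamma_s_disp:.1f} mN/m")
print(f"Polar component:      {gamma_s_polar:.1f} mN/m")
print(f"Total surface energy: {gamma_s_disp + gamma_s_polar:.1f} mN/m")
```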

  19. Demonstration of a software design and statistical analysis methodology with application to patient outcomes data sets.

    Science.gov (United States)

    Mayo, Charles; Conners, Steve; Warren, Christopher; Miller, Robert; Court, Laurence; Popple, Richard

    2013-11-01

    With the emergence of clinical outcomes databases as tools utilized routinely within institutions comes the need for software tools to support automated statistical analysis of these large data sets and inter-institutional exchange from independent federated databases to support data pooling. In this paper, the authors present a design approach and analysis methodology that addresses both issues. A software application was constructed to automate the analysis of patient outcomes data using a wide range of statistical metrics, by combining the use of C#.Net and R code. The accuracy and speed of the code were evaluated using benchmark data sets. The approach provides the data needed to evaluate combinations of statistical measurements for their ability to identify patterns of interest in the data. Through application of the tools to a benchmark data set for dose-response threshold and to SBRT lung data sets, an algorithm was developed that uses receiver operating characteristic curves to identify a threshold value and combines the use of contingency tables, Fisher exact tests, Welch t-tests, and Kolmogorov-Smirnov tests to filter the large data set and identify values demonstrating dose-response. Kullback-Leibler divergences were used to provide additional confirmation. The work demonstrates the viability of the design approach and of the software tool for the analysis of large data sets.
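    A stripped-down version of the screening logic described above — an ROC-derived threshold followed by contingency-table and distribution tests on a candidate dose metric — might look like the sketch below; the synthetic data and the use of SciPy in place of the authors' C#.Net/R implementation are assumptions for illustration.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic example: a dose metric for patients with/without a complication.
dose_no_event = rng.normal(18.0, 4.0, 120)   # Gy, patients without the outcome
dose_event = rng.normal(24.0, 4.0, 30)       # Gy, patients with the outcome

doses = np.concatenate([dose_no_event, dose_event])
outcome = np.concatenate([np.zeros(dose_no_event.size), np.ones(dose_event.size)])

# ROC curve by sweeping candidate thresholds; pick the Youden-optimal one.
thresholds = np.unique(doses)
tpr = np.array([(doses[outcome == 1] >= t).mean() for t in thresholds])
fpr = np.array([(doses[outcome == 0] >= t).mean() for t in thresholds])
best_t = thresholds[np.argmax(tpr - fpr)]

# Contingency table at the chosen threshold, tested with Fisher's exact test.
table = [[(doses[outcome == 1] >= best_t).sum(), (doses[outcome == 1] < best_t).sum()],
         [(doses[outcome == 0] >= best_t).sum(), (doses[outcome == 0] < best_t).sum()]]
_, p_fisher = stats.fisher_exact(table)

# Distribution comparisons: Welch t-test and Kolmogorov-Smirnov test.
_, p_welch = stats.ttest_ind(dose_event, dose_no_event, equal_var=False)
_, p_ks = stats.ks_2samp(dose_event, dose_no_event)

print(f"ROC threshold: {best_t:.1f} Gy")
print(f"Fisher p={p_fisher:.2g}, Welch p={p_welch:.2g}, KS p={p_ks:.2g}")
```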

  20. Preliminary CFD analysis methodology for flow in a LFR fuel assembly

    International Nuclear Information System (INIS)

    Catana, A.; Ioan, M.; Serbanel, M.

    2013-01-01

    In this paper a preliminary Computational Fluid Dynamics (CFD) analysis was performed in order to set up a methodology to be used for more complex coolant flow analyses inside the ALFRED nuclear reactor fuel assembly. The core contains 171 separate fuel assemblies, each consisting of a hexagonal array of 127 fuel rods. Three honeycomb spacer grids are proposed along the fuel rods with the aim of keeping the flow geometry intact during reactor operation. The main goal of this paper is to compute several hydraulic parameters: pressure, velocity, wall shear stress and turbulence parameters, with and without spacer grids. In this analysis we consider an adiabatic case; no heat transfer is included so far, but this paves the way toward more complex thermal-hydraulic analyses for ALFRED (and LFRs in general). The CAELinux CFD distribution was used with its main components: Salome-Meca (for geometry and mesh) and Code_Saturne as the single-phase CFD solver. The ParaView and VisIt post-processors were used for data extraction and graphical displays. (authors)
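    Before (or alongside) a CFD run of this kind, the bundle hydraulics are often sanity-checked with hand formulas. The sketch below estimates the subchannel hydraulic diameter and Reynolds number for an interior subchannel of an infinite triangular rod array, using approximately ALFRED-like pin dimensions and lead properties that are assumptions rather than values from the paper.

```python
import math

# Assumed, approximately ALFRED-like values (not taken from the paper).
pin_diameter = 10.5e-3   # m
pin_pitch = 13.9e-3      # m (triangular lattice)
velocity = 1.4           # m/s, axial coolant velocity
rho_lead = 10500.0       # kg/m^3, molten lead density (~450 degC)
mu_lead = 2.0e-3         # Pa*s, molten lead dynamic viscosity (~450 degC)

# Interior subchannel of an infinite triangular array:
# flow area per rod = (sqrt(3)/2)*P^2 - (pi/4)*D^2, wetted perimeter per rod = pi*D.
flow_area = (math.sqrt(3) / 2.0) * pin_pitch**2 - (math.pi / 4.0) * pin_diameter**2
wetted_perimeter = math.pi * pin_diameter
d_hydraulic = 4.0 * flow_area / wetted_perimeter

reynolds = rho_lead * velocity * d_hydraulic / mu_lead
print(f"Hydraulic diameter: {d_hydraulic * 1000:.2f} mm")
print(f"Reynolds number:    {reynolds:.3g}")
```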